IT · 3 min read

AI Music Generation and Copyright Ethics

When AI-made songs chart — who owns the copyright?

A Song I Heard on YouTube Was AI-Generated

On my commute, YouTube Music recommended a track. Good vibes, so I hit like. Checked the artist name — never heard of them. Searched it. AI-generated. The description had a small note: "Generated with Suno v4."

I was genuinely shocked. Not that it existed, but that I couldn't tell. (In my defense, I mostly listen to melodies and don't pay attention to lyrics.)

When Did AI Music Get This Good?

In 2023, "Heart on My Sleeve," an AI-made fake Drake track, racked up millions of plays across platforms before being pulled. Back then it was just "huh, cool." In 2024, tools like Suno and Udio let anyone create songs from text prompts. Type "sad ballad, rainy day vibe, female vocal" and a 3-minute track appears in under a minute.

Now AI-generated songs are flooding Spotify. Exact numbers aren't public, but industry estimates put it at around 14,000 new AI tracks per day as of September 2025 — hundreds of thousands a month. That might already exceed human-created output.

The problem is training data. Suno hasn't disclosed which songs it learned from. But specify a particular artist's style and the output is eerily similar. That strongly implies their music was in the training set.

The legal debate splits here. One side: "Style isn't copyrightable." The other: "The training data itself is unauthorized reproduction." Lawsuits have been piling up in the US since 2024, but there's no clear precedent yet.

In South Korea, it's even murkier. Copyright law has virtually no AI provisions. There's no legal answer to "who owns the copyright of AI-generated work" while the market grows rapidly. (Korea's copyright framework, like many countries, was built for a pre-AI world.)

I Asked a Musician Friend

I have a friend who makes indie music. Monthly streaming income: about $92. I asked if AI music had cut into their earnings. "I don't know if it went down. It was always small."

But the scarier part, they said: "Companies stopped licensing background music. AI-generated music is zero royalties." Cafe playlists, YouTube video backgrounds, ad music — that market is rapidly going to AI. Major artists might be fine, but people who lived off library and background music are taking a direct hit.

Where I Land on This

Honestly, I'm torn. As a developer, I believe you can't stop technological progress. I use AI code generation tools — calling AI music bad would be a double standard.

But simultaneously, using creators' work as training data without consent feels unfair. At minimum, there should be transparency: "your song was used in training, here's your compensation."

Right now the market is racing ahead without that transparency. If this continues, the path for new musicians to make a living from music just... closes. AI can "make" music, but it can't "make" music culture. There are reasons humans should keep creating music, but those reasons need economic support.

I don't have the answer. But I'm sure of one thing: not having an answer doesn't excuse not thinking about it.
