SoundCloud Updates Terms to Potentially Allow AI Training on User Audio
Recent changes to SoundCloud’s terms of use suggest the platform may now be permitted to use user-uploaded audio to train artificial intelligence models.
Discovery of the Terms Update
The change was first spotted by tech ethicist Ed Newton-Rex. The latest version of SoundCloud’s terms includes a provision allowing the platform to use uploaded content to “inform, train, [or] develop” AI technologies.
Specifically, the terms state that users “explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.” This update took effect on February 7, 2024.
Exceptions and Licensing Agreements
The terms do include a provision for content governed by “separate agreements” with third-party copyright holders, such as record labels. SoundCloud maintains various licensing arrangements with both independent labels and major music publishers, including Universal Music and Warner Music Group.
Opt-Out Options and Platform Response
An explicit option to opt out of AI training was not readily discoverable in the platform’s web settings, and SoundCloud did not immediately respond to a request for comment.
SoundCloud’s Broader AI Integration
Like many prominent creator platforms, SoundCloud is increasingly incorporating AI into its services.
Last year, the platform partnered with roughly a dozen vendors to introduce AI-powered tools for remixing, vocal generation, and creating custom samples. SoundCloud announced in a blog post that these partners would have access to content ID solutions to “ensure rights holders [sic] receive proper credit and compensation.” The company also committed to “uphold ethical and transparent AI practices that respect creators’ rights.”
Industry Trend of AI Training Permissions
Numerous content hosting and social media platforms have revised their policies in recent months to permit both first- and third-party AI training. For example, X (formerly Twitter) updated its privacy policy in October to allow external companies to train AI on user posts. LinkedIn amended its terms in September to enable data scraping for training purposes, and YouTube began allowing third-party AI training on user clips in December.
User Concerns and Backlash
These policy changes have sparked criticism from users who advocate for opt-in AI training policies, as opposed to the current opt-out approach. Many also believe that creators should be credited and compensated for their contributions to AI training datasets.
SoundCloud’s Response to Concerns
A SoundCloud spokesperson provided a statement via email, part of which is reproduced below:
“SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes. We have implemented technical safeguards, including a ‘no AI’ tag on our site to explicitly prohibit unauthorized use.
The February 2024 update to our terms of service was intended to clarify how content may interact with AI technologies within SoundCloud’s own platform. Use cases include personalized recommendations, content organization, fraud detection, and improvements to content identification with the help of AI technologies.
Any future application of AI at SoundCloud will be designed to support human artists, enhancing the tools, capabilities, reach, and opportunities available to them on our platform. Examples include improving music recommendations, generating playlists, organizing content, and detecting fraudulent activity. These efforts are aligned with existing licensing agreements and ethical standards. Tools like [those from our partner] Musiio are strictly used to power artist discovery and content organization, not to train generative AI models.
We understand the concerns raised and remain committed to open dialogue. Artists will continue to have control over their work, and we’ll keep our community informed every step of the way as we explore innovation and apply AI technologies responsibly, especially as legal and commercial frameworks continue to evolve.”