Adobe Responds to AI Fears With Plans for Updated Legal Terms
Customer turmoil over a seemingly routine change in Adobe Systems Inc.’s terms of use agreement spurred a rare internal reckoning over how the company communicates complex legal issues to users and accounts for anxiety around generative AI.
“As technology evolves, we have to evolve,” Dana Rao, Adobe’s general counsel, said in an interview with Bloomberg Law. “The legal terms have to evolve, too. And that’s really the lesson that we’re sort of internalizing here.”
Over the weekend, some Adobe customers revolted on social media, crying foul at updated terms of use they claimed allowed Adobe to seize their intellectual property and use their data to feed AI models.
The Photoshop and Illustrator maker responded with multiple blog posts over several days seeking to reassure users it wasn’t stealing their content, including a pledge to quickly rewrite its user agreement in clearer language. Rao said Tuesday that Adobe will be issuing updated terms of use on June 18 in which it will specifically state the company doesn’t train its Firefly AI models on its cloud content.
The unexpected online storm around the updates is the latest example of how sweeping technological changes—such as the rise of generative AI—have bolstered users’ fears of copyright violations and privacy invasions. That sentiment is part of the landscape the tech industry must navigate to serve a creator community increasingly on edge.
What happened is “more of a lesson in terms of how to present terms of use and roll out updates in a way that can address or alleviate customer concerns, especially in the era of AI and increased concern over privacy,” said Los Angeles-based advertising attorney Robert Freund.
Companies can expect similar tumult going forward, even over seemingly technical changes to their terms of service or harmless phrasing in communications with their customers, attorneys said, because many users—especially in the creative industries—are now wary of any movement toward using their data or work to boost AI.
“Language that we might have considered innocuous in the past might take on new concerns if there’s an implicit permission to index content for general AI purposes,” said Eric Goldman, a law professor at Santa Clara University School of Law.
Online Revolt
Adobe’s February update to its user agreement allowed the company to access and review user content through both “automated and manual methods.” In addition, the company could conduct “manual review” of user content to screen for certain types of illegal content, including child sexual abuse material.
While new customers signed off on the updated terms, existing users were notified of the change on May 23, Rao said. In early June, several Adobe customers took to social media threatening to discontinue their use of its products. At issue were provisions some users said appeared to give Adobe carte blanche to access and use customers’ content—even if it was confidential intellectual property or documentation covered by nondisclosure agreements or attorney-client privilege.
Specifically, the “worldwide royalty-free” license users grant Adobe under the terms of service “to use, reproduce,” and “create derivative works based on” their content fueled much of the nervous backlash on the social media platform X.
The provisions aren’t uncommon, especially among leading cloud-based providers. Microsoft’s Azure OpenAI Service and Google’s services, for example, have similar content moderation and licensing provisions. But for creators, that phrasing is laced with concerns that their content would be plugged into an AI model, much like what has happened with well-known publications and creators who have sued tech giants like OpenAI Inc.
“In the AI era, people have seen their content taken,” Roy Kaufman, managing director of the Copyright Clearance Center, a collective licensing company, said. “People have seen their content used for commercial purposes with some specious arguments as to why they can do it.”
Rao said that language wasn’t new and had been included in Adobe’s end-user agreements prior to the February update.
Must-Have Provisions
Some provisions in cloud-based providers’ terms of use stem from technical necessity—a tool like Amazon’s virtual assistant Alexa, for example, needs to capture and process a user’s voice to function, and the user agreement ensures Amazon has that right, Freund said. For Adobe, the licensing clause is necessary for it to create copies to upload content to the cloud and create thumbnails, among other practical tasks, Rao said.
The added language clarified how the provider inspects some of its customers’ content to flag any illegal activities—a right that the platform has “likely always had,” said Andrew Klungness from Fenwick & West LLP’s technology transactions group.
“There’s a bunch of people that live in the digital apartment building, and the landlord or owner preserves rights to make sure there’s nothing illegal going on, or nobody’s doing damage to the solution, or the apartment building,” Klungness said.
Adobe’s update was a “very modest change” in the current technological age, said Aloke Chakravarty, co-chair of Snell & Wilmer’s cybersecurity, data protection, and privacy practice.
“But the implications of the change, particularly for Adobe, make it different than for the social media purveyors and even for the mega search engines like Google and the like, because this is so, so associated with the creative world,” Chakravarty said.
Rao, Adobe’s executive vice president, general counsel, and chief trust officer, acknowledged that customers’ unique relationship with creative platforms places a greater importance on trust, even if the provisions in the terms of service are ubiquitous.
“We’re a part of people’s lives and livelihoods in a way that’s pretty personal,” Rao said. “Wedding photographers, movie makers, graphic artists, regular artists, like their whole lives are based on Adobe tools and products.”
‘Test Your Message’
User policies have periodically stirred up mayhem among customers, according to Goldman, the Santa Clara Law professor. Just a decade ago, Instagram came under fire for language in its terms of service that was perceived to give it the right to sell user content or have user photos in ads, after which the company revised the terms.
Such “freakouts” often come up when there is a “divergence between how we as lawyers describe legal rights and how lay readers interpret and often misunderstand what we’re saying as lawyers,” Goldman said.
The latest episode shows that changes to terms of service around data collection and use are higher profile than they used to be, said Tod Cohen, partner at Steptoe LLP.
“People are much more paranoid than they were before,” Cohen said. “So, if anything, the lesson is: Make sure you test your message, your changes, with people that are not necessarily just the lawyers.”
Ultimately, terms of use boil down to communications with customers. Rao told Bloomberg Law that Adobe plans to test new language on customers going forward.
“Lots of people are going to read these languages who are not trained in the law and don’t know how to read legal words,” he said. “I do think that we could have thought about that before we put this out there.”