Adobe Photoshop Is Asking for Access to All Content Created Using Software, Even if It’s Confidential

  • Users are strongly opposing this intrusion into their content.

  • Other companies, like Microsoft and Slack, are also moving in this direction.


Adobe’s new Terms of Use, implemented last February, are only now becoming evident to creators, who are realizing the extent of the privacy invasion that the new mandatory policy entails.

Photoshop is requesting access to all of our content. Adobe’s new Terms of Use include Sections 2.2 and 4.1, which allow the company to access all users’ content “through both automated and manual methods.”

As Sam Santala, founder of Songhorn Studios, pointed out, this new policy means that Photoshop will have access to all the content users create, including sensitive material. This could include sketches for a Hollywood movie, a yet-to-be-announced video game, or any other creations that should remain strictly in the hands of the creator and their company.

Additionally, Photoshop, Substance 3D, and other Adobe tools now require subscribers to have an Internet connection. The new Terms of Use force users to accept the company’s access to their content: If they refuse, Adobe blocks their access to the app.

Creators are upset. The recent policy change surprised many creators and Photoshop users. Renowned filmmaker Duncan Jones, known for directing films such as Moon, Source Code, and Warcraft, also shared his disapproval on social media. “We are working on a bloody movie here, and NO, you don’t suddenly have the right to any of the work we are doing on it just because we pay you to use Photoshop,” he wrote on X, formerly known as Twitter.

Yes, there are limitations to this access. Of course there are. The new Terms of Use’s privacy section explains that this access will be “limited” and only as “permitted by law.” In Europe, Adobe directly explains that this access will be subject to the General Data Protection Regulation, although this law doesn't apply in the U.S.

Among the reasons Adobe gives for wanting to access the content are to provide “feedback” and to “detect, prevent, or otherwise address fraud, security, legal, or technical issues.”

Adobe is defending itself. Jérémie Noguer, a Substance 3D product manager, has stated that the company won’t be “accessing or reading Substance users’ projects in any way, shape or form.” 

“I fail to see the point of doing so and every serious company in the industry would drop us immediately if it was the case,” he added in a post on X.

“Reviewing content is limited to very specific cases,” Noguer said. “Every time our team wishes to use an artist’s work for any purpose, we’ve contacted them directly and agreed on licensing.” This last point seems to be the real issue at hand.

AI Training. Another reason for this access is outlined in Section 4.4 (C), which is dedicated to content sharing. As the company has stated, Adobe wants full access to the content users create in Photoshop so that it can analyze it and train its machine learning algorithms, with the aim of enhancing its services.

“Our automated systems may analyze your Content and Creative Cloud Customer Fonts… using techniques such as machine learning in order to improve our Services and Software and the user experience.”

This is how Adobe defines the process of training artificial intelligence systems. The software company wants to access the vast amount of content generated through its tools to continuously improve its algorithms. This practice isn’t unique to Adobe and is becoming more widespread throughout the industry.

It’s set to become a major issue in data protection. There’s already widespread controversy surrounding this kind of access. It’s akin to the case of Windows’ Recall, a feature that continuously captures snapshots of a user’s activity so the system can answer questions about it later. Cybersecurity experts have already raised concerns about its potential risks. The approach is also similar to Slack’s use of private chat messages to train AI, and Meta and Dropbox have adopted comparable practices.

The use of personal content to train algorithms is likely to spark significant debate in the data protection arena in the months ahead. This week alone, the European Center for Digital Rights (NOYB) filed 11 complaints with various data protection agencies to examine the impact of Meta’s access to content and thousands of personal data points. Although the complaints don’t specifically mention Adobe, the potential privacy risks to users are similar. We’ll have to see whether or not this use of content is justified on a case-by-case basis.

Adobe says that it’s not training its AI on user content. In response to the controversy, Adobe has released a statement clarifying that it doesn’t utilize user content to train its Firefly generative AI models. The company stated that the AI is trained using licensed content, such as Adobe Stock, and content from the public domain. Additionally, Adobe emphasizes that it’ll never claim authorship of any creator’s work.

However, Adobe does need access to user content to apply filters, such as AI-based background removal. Additionally, for content uploaded to the cloud, Adobe may use scaling technologies and review the content for illegal materials.

Image | Adobe

Related | Windows’ Photographic Memory Feature, Recall, Wants You to Trust Microsoft More Than Any Other Company
