A recent viral moment highlights just how nervous the artist community is about artificial intelligence (AI). It started earlier this week, when French comic book author Claire Wendling posted a screenshot of a curious passage in Adobe’s privacy and personal data settings to Instagram. It was quickly reposted on Twitter by another artist and campaigner, Jon Lam, where it subsequently spread throughout the artistic community, drawing nearly 2 million views and thousands of retweets. (Neither Wendling nor Lam responded to requests to comment for this story.)
The reaction was predictable: One commenter accused Adobe of having “predatory business practices against artists”; another worried “machine working overlords… steal from you while you work.” Another noted it was “reason number 32405585382281858428 on why you shouldn’t use Adobe products.”
The reality may be more complex. An Adobe spokesperson says the company is not using customer accounts to train AI. “When it comes to Generative AI, Adobe does not use any data stored on customers’ Creative Cloud accounts to train its experimental Generative AI features,” the spokesperson said in a written statement to Fast Company. “We are currently reviewing our policy to better define Generative AI use cases.” Meanwhile, Adobe’s FAQ on its machine-learning content analysis cites examples of how the company may use machine learning-based object recognition to auto-tag photographs of recognizable subjects, such as dogs and cats. “In Photoshop, machine learning can be used to automatically correct the perspective of an image for you,” the company says. Machine learning can also be used to suggest context-aware options: If the company’s apps believe you’re designing a website, they might suggest relevant buttons to include.
“For me, it’s astonishing that a paid service assumes it’s okay to violate users’ privacy at such a scale,” says Andrey Okonetchnikov, a front-end developer and UI and UX designer from Vienna, Austria, who uses Adobe products to sync photographs. “It’s troublesome because companies who offer to store data in the cloud assume that they own the data. It violates intellectual property and privacy of millions of people and it’s assumed to be ‘business as usual’. This must stop now.”
Yet not everyone is quite as concerned. As with many viral moments on social media, some think a legitimate concern has been overhyped and misconstrued. Part of that confusion stems from prior controversies involving Adobe: its plan, announced on December 5, to allow AI-generated images in its stock library, which some saw as directly harmful to stock artists, and an earlier dispute with Pantone that left some users unable to access the Pantone colors they had used in earlier design projects built with Adobe software. “People saw that little checkbox for sharing data used for machine learning and conflated it with all the current AI image generation drama currently underway,” says Daniel Landerman, a Los Angeles-based creative director and illustrator.
To Landerman’s eyes, the data-sharing feature for machine learning has been present in Adobe apps for years, and it only applies to files stored in the Adobe cloud, a practice he says “any professional shouldn’t be doing anyway.” Landerman has long made sure to uncheck any options that share data with app makers, as part of working with clients who often require him to sign non-disclosure agreements.
“Everything is moving so fast with all the AI stuff: Artists trying to get regulations to catch up, AI engineers and NFT bros trying to outpace the artists,” Landerman says. “I’m not surprised some other non-issues get caught up in the turmoil.”
But beyond artists’ concerns, data protection experts say they’re worried about the way Adobe has handled the process. “Under European ePrivacy law, Adobe needs opt-in consent before reading data from individuals’ devices for purposes not necessary for the service the user requested,” says Michael Veale, a University College London professor who specializes in digital rights.
“We give customers full control of their privacy preferences and settings,” an Adobe spokesperson told Fast Company in a statement. “The policy in discussion is not new and has been in place for a decade to help us enhance our products for customers.” The spokesperson directed any customer who prefers their content be excluded from analysis to the options on the privacy page.