How Adobe is managing the AI copyright dilemma, with general counsel Dana Rao
Jan 9, 2024
Dana Rao, General Counsel at Adobe, discusses the impact of generative AI on copyright law and Adobe's failed attempt to acquire Figma. They also talk about the evolving definition of a photo, enforcing mandates on Photoshop, government regulation in the AI industry, and upcoming advancements in creative tools.
Understanding the evolving legal landscape is crucial for the development and use of AI models, especially in terms of copyright law and its implications for human creators.
The introduction of a federal anti-impersonation right aims to protect artists' unique artistic style by allowing them to enforce against individuals who intentionally impersonate that style for commercial gain.
Deep dives
Adobe's General Counsel and Chief Trust Officer discusses the challenges of generative AI and copyright
The conversation between Decoder's Nilay Patel and Dana Rao, General Counsel and Chief Trust Officer at Adobe, delves into the complex issues arising from the intersection of generative AI and copyright law. The discussion highlights the importance of understanding the evolving legal landscape and how it shapes the development and use of AI models. They explore the implications of training AI models on copyrighted works and the potential economic harm to human creators. They also touch on how copyright law varies across regions and the need for clarity and balance in protecting intellectual property while fostering innovation.
The Need for Style Protection in Copyright Law
There is currently no style protection in copyright law, which means that artists cannot protect their unique artistic style. To address this gap, a federal anti-impersonation right has been proposed, which would allow artists to enforce against individuals who intentionally impersonate their style for commercial gain. The goal is to give artists a right similar to copyright, with exceptions for fair use and the ability to pursue statutory damages. The proposed legislation aims to address the economic harm caused by style appropriation and ensure that artists' livelihoods are protected.
The Content Authenticity Initiative and the Importance of Proving Truth
The Content Authenticity Initiative, founded four years ago, focuses on attaching metadata to images to prove their authenticity and give people a way to verify what's true. By attaching metadata to an image, such as who took the picture and when and where it was taken, the standard lets users trace the image's provenance and verify its authenticity. The initiative has gained significant traction, with over 2,000 members, including companies like Sony and Leica. This open standard aims to address the negative implications of deepfakes and ensure that people can trust the images they see, promoting factual discourse and combating disinformation.
Today, I'm talking to Dana Rao, who is General Counsel and Chief Trust Officer at Adobe. Now, if you're a longtime Decoder listener, you know that I have always been fascinated with Adobe, which I think the tech press largely undercovers. If you're interested in how creativity happens, you're kind of necessarily interested in what Adobe's up to. And it is fascinating to consider how Dana's job as Adobe's top lawyer is really at the center of the company's future.
The copyright issues around generative AI are so unsettled and unfolding so fast that they will necessarily shape what kinds of products Adobe can even make in the future, and what people can make with those products. The company also just tried and failed to buy the popular upstart design company Figma, a potentially $20 billion deal that was shut down over antitrust concerns in the European Union. So Dana and I had a lot to talk about.