Time to sue AI

Millions and millions of bits and bytes have been spent writing about AI. The evergreen topics tend to focus on biases, people losing their jobs, or AI becoming sentient. Much less is written about how little we know about what happens inside the algorithms. Similarly, we know less than you may think about how to test AI. And, related to testing, we have issues with how AI systems are trained.

Have you heard about GPT-3 or used Google image search? Remember hearing about Tesla’s soon-to-be self-driving car? All of them have one important thing in common: they were all trained on human-generated content. Google is a great example because humans had to manually label the millions of images you find in its search engine. To ensure safety, Tesla had to collect images of all the objects a car could potentially encounter on the road. OpenAI, the developer of GPT-3, trained it on content from the Internet so that it could become a writing assistant.

The capabilities of AI are getting better and better. To use OpenAI as another example, users can now create images from a description like “dancing pig in a clown’s costume in a field of petunias,” and the same technology can even help you write software.

And that’s where the problem is.

What kind of problem exactly? A class action lawsuit.

OpenAI partnered with Microsoft to offer a new product — an AI-powered coding assistant called Copilot.

Imagine this: you’re a developer who wants to write some code. Wouldn’t it be great if the AI could write it for you from a simple description? Sounds like a dream. How did OpenAI and Microsoft train Copilot? It’s all thanks to GitHub, Microsoft's subsidiary. GitHub is an online repository for source code used by about 83 million developers. It provides '... the distributed version control of Git plus access control, bug tracking, software feature requests, task management, continuous integration, and wikis for every project ...'. Some of the projects are private; some are public or open-source.
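
To make the workflow concrete, here is a small sketch. The developer types only a description, say a function signature plus a docstring, and the assistant proposes a body. The completion below is hand-written and purely illustrative, not actual Copilot output:

```python
# The developer types only a description of the desired code,
# e.g. a signature plus a docstring:
def is_palindrome(s: str) -> bool:
    """Return True if s reads the same forwards and backwards, ignoring case."""
    # A Copilot-style assistant would then propose the body below
    # (a hand-written, illustrative completion, not actual Copilot output):
    normalized = s.lower()
    return normalized == normalized[::-1]


print(is_palindrome("Racecar"))   # True
print(is_palindrome("petunias"))  # False
```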

It is not exactly clear what content on GitHub was used to train Copilot, but developers who contributed to open-source projects are discovering that Copilot reproduces their code without attribution. Although open-source code is free to use, it is distributed under a variety of licenses, many of which require users to give attribution when they use it.
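
What does that attribution look like in practice? Take the MIT license, one of the most common on GitHub: you may copy and reuse the code, but the copyright and permission notice must travel with it. A minimal sketch, with a project and author invented for this example:

```python
# utils.py
#
# The function below is adapted from the hypothetical open-source project
# "tiny-strings" by Jane Doe (both names invented for this example),
# distributed under the MIT License. The MIT License requires that the
# copyright notice and permission notice be included in all copies or
# substantial portions of the software:
#
#   Copyright (c) 2022 Jane Doe
#
#   Permission is hereby granted, free of charge, to any person obtaining
#   a copy of this software ... (the full notice would continue here).

def slugify(title: str) -> str:
    """Turn a title into a URL-friendly slug."""
    return "-".join(title.lower().split())
```

That notice is exactly what Copilot's critics say goes missing: the code shows up in a suggestion, but the attribution stays behind.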

This issue isn’t unique to Microsoft or OpenAI. Stable Diffusion, another open-source project, can also create dancing pigs for you after being trained on copyrighted images scraped from the Internet. The developers behind that project claim that this is allowed under the fair use doctrine.

One of the plaintiffs’ complaints is that Microsoft receives content for free while charging money for the service. In the future, maybe you’ll be able to claim that an AI did something and that you are not responsible for its actions. In 2021, patent applications were filed around the world asking that the AI system DABUS be recognized as an inventor. IP offices rejected the claim; only the Federal Court of Australia, hearing an appeal, found that under the Australian Patents Act an AI could be listed as an inventor.

So what happens when you get hit by a self-driving Tesla whose software was written by AI? Beats me.

Many of these issues are rooted in the (very unfortunate) term AI - artificial intelligence - which somehow creates a parallel between human and machine activities. This anthropomorphism of computer systems leads to nonsensical discussions about how good or bad it is. How biased or sentient is the AI? Is AI responsible for its actions? All nonsense.

Don't sue AI; sue people. A computer does what the programmer tells it to do. It either works or it doesn't. That's the only recurring pattern here.
