Tech Insight: Some Legal Considerations Of Online Writing And Publishing

With ChatGPT’s makers OpenAI facing a possible defamation lawsuit from an Australian mayor, we look at the legal aspects of online writing and publishing. 

Defamed? 

Last November, when ChatGPT was first released for public use, Brian Hood, the mayor of Hepburn Shire, 120km northwest of Melbourne, was told by members of the public that ChatGPT had (falsely) named him as a guilty party in a foreign bribery scandal involving a subsidiary of the Reserve Bank of Australia in the early 2000s. In fact, Mr Hood had been the whistleblower in the scandal, had not been convicted of a crime, and had not served time in prison as claimed in ChatGPT’s output about the scandal. Mr Hood is therefore reported to have instructed lawyers to send a letter of concern to OpenAI, giving the company 28 days to fix the errors about him or face a possible defamation lawsuit.

Can An AI Algorithm Defame Someone? 

Although this is a relatively new area, it could be argued that because ChatGPT is an AI language model with no intent or awareness to defame anyone, the case may not be as clear-cut as it sounds. Defamation requires a statement that is false, is communicated to a third party, and causes harm to an individual’s reputation. Since ChatGPT does not communicate with anyone outside of the conversation it is having with a user, it could be argued that its responses are unlikely to harm an individual’s reputation in a way that would meet the legal standard for defamation.

Could The Third-Party Source Be Liable? 

However, if the information provided by ChatGPT repeats false or defamatory information from a third-party source, that third party may be liable for defamation. In this case, the individual harmed by the false information may (as in the case of Mr Hood) be able to pursue legal action against the original source of the information.

It’s important to remember that the legal standards for defamation can vary depending on the jurisdiction, and it is always advisable to consult with a legal expert for guidance on specific cases. 

What Are The Main UK Laws Relating to Online Publishing? 

Publishing information about people online is, of course, covered by laws that apply to publishers of content of all kinds, and it’s worth being aware that these exist and should be adhered to. In the UK, for example, there are several laws and regulations that relate to online publishing. The main ones are:

– The Defamation Act 2013. This law sets out the rules for defamation, which occurs when a false statement is made that harms someone’s reputation. It applies to online publishing, including social media and websites. 

– The Electronic Commerce (EC Directive) Regulations 2002. This regulation requires website owners to provide certain information to users, such as the website owner’s name and address, and details about any professional body or trade association they belong to. 

– The Copyright, Designs and Patents Act 1988. This law protects original works such as text, images, videos, and music from being copied or used without permission. It applies to online publishing as well as offline publishing. 

– The Computer Misuse Act 1990. This law makes it a criminal offence to access or modify computer systems without authorisation. It is relevant to online publishers where, for example, websites or platforms are targeted by hacking or malware attacks.

– General Data Protection Regulation (GDPR). This EU regulation, retained in UK law as the UK GDPR following Brexit, sets out rules for the collection, use, and storage of personal data. It applies to online publishing, including websites and social media.

– The Obscene Publications Act 1959. This law makes it an offence to publish obscene material, i.e. material whose effect is likely to deprave and corrupt those likely to read, see or hear it. This applies to online publishing, including websites and social media.

What Does This Mean For Your Business? 

Although there has always been a culture of sharing and a degree of free speech on the web, uploading content to a publicly available website or platform should still be regarded as publishing and, therefore, subject to many different publishing laws. ChatGPT itself is an algorithm trained on content gathered from third-party sources online and therefore has no ‘intent’ to defame anyone, even though it has now become a popular and trusted source of information in its own right.

In the case of Mr Hood, it could be argued that, as a public figure such as a mayor, his public reputation is of particular significance, and that the third-party sources from which ChatGPT’s system got the information, together with ChatGPT’s apparent inability to check the accuracy of what it produces, may be central points in the case should it go to court. When businesses use ChatGPT to produce content that they intend to publish online, it is therefore worth conducting some basic fact checks, particularly where specific people and organisations are named. ChatGPT’s existence, however, shouldn’t change the fact that businesses should always be aware that there are many laws to adhere to when publishing online and that compliance is always an important issue.
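By way of illustration only, the short Python sketch below shows one way a business might surface the names that need checking in an AI-generated draft before it goes online. It simply lists the people and organisations mentioned so that a human can fact-check each one; it assumes the third-party spaCy library and its small English model are installed, and the function name and example draft are purely hypothetical.

# Illustrative sketch: flag person and organisation names in AI-generated text
# so they can be fact-checked by a human before publication.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

def names_to_check(generated_text: str) -> set:
    """Return the people and organisations mentioned in the draft text."""
    nlp = spacy.load("en_core_web_sm")  # general-purpose English NER model
    doc = nlp(generated_text)
    return {ent.text for ent in doc.ents if ent.label_ in {"PERSON", "ORG"}}

if __name__ == "__main__":
    draft = "Example draft produced by a chatbot, naming Jane Doe of Acme Ltd."
    for name in sorted(names_to_check(draft)):
        print(f"Fact-check before publishing: {name}")

A list like this is only a prompt for human review; it does not verify any of the claims made about the people or organisations it flags.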