
On 6 January 2023, Koko, a mental health company, announced that it had provided AI-generated counseling to 4,000 people. The announcement raised ethical and legal concerns about how the use of AI is regulated and about the absence of consent from the individuals included in the experiment.

Meet GPT-3, Your AI Co-pilot

Koko is a non-profit mental health service that connects people seeking advice with volunteer peers on platforms such as Telegram and Discord. Normally, a bot forwards messages asking for counseling to anonymous volunteers, who reply in writing.

This time, however, as company co-founder Robert Morris disclosed on Twitter, the volunteers could use an AI co-pilot to compose their answers. GPT-3 is an OpenAI model that can produce a wide range of texts, “from poems to code and provide articulate responses on a variety of topics”, according to Cybernews.
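For readers curious what such a co-pilot flow might look like in practice, below is a minimal, purely illustrative Python sketch using the current OpenAI SDK. The model name, prompt, and draft_reply helper are assumptions for illustration only; Koko has not published its actual integration.

# Minimal, illustrative sketch of an "AI drafts, human edits" co-pilot flow.
# Assumes the official OpenAI Python SDK (pip install openai) and an
# OPENAI_API_KEY set in the environment. The model choice, prompt, and
# draft_reply helper are hypothetical; Koko has not published its integration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_reply(user_message: str) -> str:
    """Return an AI-drafted peer-support reply for a volunteer to review, edit, or discard."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # stand-in for the GPT-3 model mentioned above
        messages=[
            {"role": "system",
             "content": "Draft a short, compassionate peer-support reply for a human volunteer to review."},
            {"role": "user", "content": user_message},
        ],
        max_tokens=200,
    )
    return response.choices[0].message.content

# The human volunteer always sees the draft before anything is sent.
print(draft_reply("I've been feeling really anxious lately and can't sleep."))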

He also stated that, although the AI-assisted messages were rated substantially higher than those authored by humans alone, users were uneasy about a machine’s lack of genuine compassion and empathy.

Concerned Reactions

But some of the details Robert Morris shared led Twitter users to question whether the 4,000 people affected by the experiment had given informed consent.

“Once people learned the messages were co-created by a machine, it didn’t work. Simulated empathy feels weird, empty”, he tweeted.

Critics called the test unethical, arguing that it breached the trust between therapist and patient. In response, Koko’s co-founder clarified that the AI feature was opt-in, that everyone using it knew about it, and that it was live for only a limited number of days.

It is still unclear exactly what users were told before using the company’s services, since a lack of informed consent would make such an experiment illegal in a medical context.

Morris also told Motherboard that the experiment was exempt from informed consent requirements (which would involve signing a document), as Koko did not use any personal information.


On the other hand, online psychological services operate outside a traditional medical context, so they remain in a legal grey area. The experiment was also not reviewed or approved by an Institutional Review Board (IRB).


Author Profile

Andreea Chebac

Digital Content Creator

Andreea is a digital content creator at Heimdal® with a strong belief in the educational power of content. A literature-born cybersecurity enthusiast (through all those SF novels…), she loves to bring her NGO, cultural, and media background to this job.
