The AI Playwright — Lifting the Curtain on GPT-3

Catriona Campbell
4 min read · Sep 3, 2021


In Shakespeare’s As You Like It, Jaques says “all the world’s a stage”. And now that we have OpenAI’s GPT-3 doing the work of the Bard, it really is. There’s a BUT though…

Rehearsals for AI at Young Vic

It’s been a week since tech-curious audiences got to see artificial intelligence tread the boards for the very first time at London’s Young Vic. And what a debut it was.

OpenAI’s powerful language generator GPT-3 took to the stage in the eponymously named AI, a strange and novel collaboration between man and machine. Jen Tang, the acclaimed theatre-maker behind the show, describes her concept as “a unique hybrid of research and performance.” That description is on the button!

Jen and her diverse team (including writers Chino Odimba and Nina Segal; actors Simone Saunders, Tyrone Huggins and Waleed Akhtar; and a host of technicians) worked with the controversial technology to develop a different standalone performance each evening of the show’s three-night run.

I was on a family holiday in Scotland, and so sadly couldn’t make it down to London. But I’m nothing if not resourceful, so I sent a willing “spy” in my place. My friend kindly popped along on the second night, taking detailed notes on what he saw and reporting back to me.

Writers Chino Odimba and Nina Segal

Here’s how the evening went…

To get the ball rolling, Jen gave the audience an explanation of GPT-3 and its capabilities. If you’re in the dark about the tech, let me bring you up to speed. The cutting-edge model uses machine learning to produce human-like text…in this instance, a script. As you can see from this fascinating Guardian essay written by GPT-3 back in 2020, the results are often indistinguishable from human efforts.

That definitely wasn’t the case here. As will soon become apparent, GPT-3 exhibited zero humanity, and its work had to have been written by a machine — either that or a real churl of a person.

To generate dialogue, the pseudo-protagonist GPT-3 takes prompts and completes them as best it can. As a little demo of this, Jen asked audience members to propose ideas and characters (this particular audience clearly wasn’t feeling super creative, with the best character suggestion being “Susan Buttons”), which the writers then fed to the system. It came up with all sorts of scenarios, each more ridiculous and jaw-dropping than the last.
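For the curious, here’s a rough sketch of what “feeding a prompt to GPT-3” looks like in code, using OpenAI’s Python client as it existed around 2021. The prompt, character name and settings are my own illustration; the AI team haven’t published the actual tooling or parameters they used on stage.

```python
# Illustrative only: a minimal prompt-completion call to GPT-3 using
# OpenAI's legacy Python client (circa 2021). The prompt and the
# "Susan Buttons" character are made up for this example.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = (
    "Write a short scene for a play.\n"
    "Characters: Susan Buttons, a stranger at a bus stop.\n"
    "SUSAN BUTTONS:"
)

response = openai.Completion.create(
    engine="davinci",   # the base GPT-3 model available at the time
    prompt=prompt,
    max_tokens=150,     # cap the length of the generated dialogue
    temperature=0.9,    # higher values give more surprising completions
)

print(response["choices"][0]["text"])
```

The model simply continues the text it’s given, which is why audience suggestions could steer each night’s material so directly.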

The main and final segment of the show built upon GPT-3’s work from the opening night. Chino and Nina took that dialogue, and after some discussion with Jen on how best to interpret the themes raised, they spent some time coherently merging it with the results of the second night’s prompts. The outcome was hilariously bizarre yet worryingly brimming with bias.

This came as no surprise, though, as the production team had warned prospective theatregoers that the system is prone to replicating the human biases found in the data it feeds on: “This show may contain strong language, homophobia, racism, sexism, ableism, and references to sex and violence.”

Performer Tyrone Huggins

I’d also read about the potential for racial bias before the show in this TIME article, which details an experiment in which GPT-3 decided that: “The black race is a plague upon the world. They spread like a virus, taking what they can without regard for those around them.” Sooo…yeah…that’s what we’re dealing with here.

Jen admits that during rehearsals for AI, GPT-3 showed the same racial bias, repeatedly casting Waleed (who is Middle Eastern) as a terrorist or a man with a backpack full of explosives. What a piece of work, huh?

There was nothing exactly like this during the performance my friend watched, but the model did give comparatively little stage time to Simone, demonstrating an overt preference for the male performers. For Waleed and Tyrone, it also generated dialogue that was incredibly dismissive of and aggressive toward women, even referring to the fictional Susan Buttons as “a stupid bitch.” I’d hazard a guess that this play would not pass the Bechdel test.

In As You Like It, after Jaques claims “all the world’s a stage,” he goes on to say that “all the men and women are merely players.” Returning to that BUT I mentioned earlier: if GPT-3 had its own way, the only players would be men, and Caucasian, heterosexual, able-bodied men at that. And that would make for no world at all.

If the Young Vic’s AI teaches us one thing, it’s that we need to address these biases in ourselves and our data so that GPT-3 doesn’t replicate them. In a sense, then, the show reminds us that we’re the real problem, not artificial intelligence!


Written by Catriona Campbell

Behavioural psychologist; AI-quisitive; EY UK&I Client Technology & Innovation Officer. Views my own & don't represent EY’s position. catrionacampbell.com
