
From HAL 9000 to Westworld's Dolores: the pop culture robots that influenced smart voice assistants

22 June 2020
Your devices' voices matter more than you realise
Two media scholars analysed how Siri, Alexa and other smart voice assistants got their distinctive voices. Their work reveals the consequences of this on consumers, and ultimately, society at large.

As of January 2019, nearly a third of Australians owned a smart speaker device, such as an Amazon Echo or Apple HomePod, that lets you call on 'Alexa' or 'Siri', respectively. With people remaining homebound due to the COVID-19 pandemic, smart voice assistants may now be playing even bigger roles in people's lives. Yet not everyone embraces them; some find them intrusive and surveillant.

In a paper recently published in New Media & Society, Dr Justine Humphry and Dr Chris Chesher from the Department of Media and Communications trace anxiety about smart voice assistants to a long history of threatening robot voices and narratives in Hollywood.

'Menacing males' and 'monstrous mothers'

Despite smart voice assistants' popularity, largely due to their default naturalistic female voices and helpful personalities, their uncanniness (seeming to be something between human and robotic) deters many people.

"This is underscored by the progression of robot voices in popular culture from the early 20th century onwards," said Dr Humphry, a Lecturer in Digital Cultures. "They went from representing 'marvels of futuristic technology' in the early 20th century to sounding darker and more sinister from around the 1950s onwards. With distinctive sounds that gave the robots a sense of 'otherness', they became associated with narratives of science gone out of control."

Examples of such robots include those in the films Forbidden Planet (1956) and The Colossus of New York (1958), and the infamous computer HAL 9000 in Stanley Kubrick's 2001: A Space Odyssey (1968), whose allegiance to the mission at the cost of the crew becomes murderous.


'Pris', a 'pleasure model' replicant in the film Blade Runner. Credit: Warner Bros.


It wasn't too long, however, before popular culture robots morphed again, this time into humanlike robots, like those in the film Blade Runner (1982), who in speech more closely resemble a Siri, Alexa, Cortana (Microsoft) or Google Assistant (Google Home). "The replicants in Blade Runner are very hard to distinguish from humans," said Dr Chesher, Senior Lecturer in Digital Cultures.

Today, this kind of robot predominates on the small and big screen. "Robots have increasingly tended to become more psychologically complex characters, and this has been accompanied by variations in modality in voice and speech," Dr Chesher continued. "In the 2016 TV series Westworld, for example, as the robots Maeve and Dolores achieve more sentience, their behaviour becomes more naturalistic, and their voices become more inflected, cynical and self-aware."

This can also be seen in the 2015 British TV series Humans, where two groups of anthropomorphic robots, called 'synths', are distinguished by one group's ability to more closely resemble humans. Their speech, for instance, has features of natural conversation, such as more animation and meaningful pauses.

Amidst these 'menacing males' and humanlike robots emerged 'monstrous mother' archetypes: maternal figures with misplaced instincts. In the Disney movie Smart House (1999), the smart home, personified as the voice of PAT, turns into a controlling mother who flies into a rage when the family refuses to cede to her maternal demands. In I, Robot (2004), the computer VIKI and her robot hordes turn against people in a maternal effort to protect humanity from itself.


A promotional poster for the film 'Smart House'. Credit: Disney.

How robots informed the voices of smart assistants

"In all the above examples, the voice is a crucial vehicle with which robots express a persona," Dr Chesher said. "Smart voice assistant developers adopted this concept after recognising its value in getting consumers to identify with their products."

Big tech companies' adoption of the voices of specifically female voice actors was strategic, explained Dr Humphry: "Not only does this leverage the notion of females as naturally sympathetic and helpful; it is the antithesis of the 'menacing male' or 'monstrous mother' cinematic robot archetypes, with their heavily synthesised voices. This could steer would-be consumers away from thinking of them as dangerous surveillant machines."

The smart assistants are also programmed to be culturally competent in their relevant market. For example, the Australian version of Google Assistant knows about pavlova and galahs and uses Australian slang expressions. Gentle humour, too, plays a significant role in humanising the artificial intelligence behind these devices. "When asked 'Alexa, are you dangerous?', she calmly replies, 'No, I am not dangerous'," Dr Humphry said.

With these features combined, smart assistants resemble the humanoid robots of latter-day pop culture, which are sometimes nearly indistinguishable from humans.

Dangerous intimacy

Drs Humphry and Chesher argue that positioning smart assistants as innocuous through their voice characteristics can lull consumers into a false sense of security. "With voices that are apparently natural, transparent and depoliticised, the assistants give only one brief answer to each question and draw these responses from a small range of sources. This gives the tech companies that own them significant soft power in their potential to influence consumers' feelings, thoughts and behaviour," they write.

Of concern to them, too, is the future of smart voice assistants: "Google's experimental technology Duplex allows users to ask the assistant to make phone calls on their behalf to perform tasks such as booking a hair appointment. In this case, the assistant starts a naturalistic conversation with an involuntary user. The test of its success here is the extent to which it/she can pass as 'human'. These developments further risk manipulating consumers and obscuring the implications of surveillance, soft power and global monopoly."

Hero image: Dolores Abernathy, a robot character in the TV series Westworld that becomes self-aware. Credit: HBO.

Loren Smith

Assistant Media Adviser (Humanities)
