Dear All,
Last week one amusing trend on Twitter was about the Apple personal assistant ‘Siri’. This was because if you ask Siri "what is zero divided by zero?", it (she? he?) first explains the logic of the answer through a hypothetical situation of sharing cookies with friends, and then ends on a rather nastier note by concluding that therefore "you have no cookies -- and no friends".
Siri’s rebuke and what people perceived as ‘attitude’ was yet another episode in the ongoing story of human beings’ problematic relationship with artificial intelligence and the pitfalls of trying to create intelligence in their ‘own image’. An early tragedy on this theme was of course Mary Shelley’s Frankenstein (1818), in which a scientist creates a sentient creature but things do not go well as the creature (known generally as the ‘monster’) starts to develop a personality of its own and demands companionship.
Later, science fiction works continued to explore this theme: there is HAL, the evil computer of Kubrick’s film 2001: A Space Odyssey, and the ‘Computer’ addressed so confidently and often by Star Trek’s Captain James T Kirk. Although the character Spock was not an example of artificial intelligence, his propensity to use logic and his lack of emotion proved as fascinating as those of a later Star Trek character, the android Data.
Recent films have continued to explore the theme of robots’ relationship with humans and the dangers posed by losing control of these creations: a new Terminator movie has just been released, and in ‘Her’ Joaquin Phoenix’s character had a romantic relationship with a computer. Around the time the film Transcendence was released last year, the astrophysicist Stephen Hawking published an article (co-authored with Berkeley’s Stuart Russell and MIT’s Frank Wilczek and Max Tegmark) warning about the dangers of Artificial Intelligence (AI).
Siri, Microsoft’s Cortana, Google Now, self-driving cars, and various other intelligent machines have an as yet unknown capacity to harm humanity. The theme is explored yet again in the TV series Humans (styled as HUM∀NS), which "explores the emotional impact of the blurring of the lines between humans and machines".
In the social landscape of this story, ‘Synths’ are human-looking machines that you can acquire to perform a range of functions -- as a nurse, housekeeper, assistant and so on. One family acquires a Synth, ‘Anita’, to look after the household but has tremendous difficulty understanding how to deal with ‘her’ on an emotional level. Another character, an elderly scientist once associated with the Synth project, played by the wonderful William Hurt, is so attached to his long-time Synth ‘Odi’ that he tries to protect ‘him’ from ‘recycling’ (after his malfunction) by hiding him. Yet another character is trying to re-programme and use Synths for criminal purposes, while various plot lines explore whether the Synths have developed what can be called emotional memory.
Humans is a disturbing story but it makes compelling viewing because it is not set in some futuristic dystopia; indeed its setting feels so topical as to be quite mundane. The performances and visual effects are also impressive, particularly Gemma Chan as the Synth Anita. The series is often quite dark and frantic but it is intriguing because of its basic and very relevant preoccupation with how we relate to the forms of Artificial Intelligence we have created -- and are now busy normalising in our lives: Want a housekeeper, chauffeur or editor? Buy a Synth, read the instruction manual and register the guarantee.
If only it were all that simple…
Best wishes