Rage against the machine

By Liubov Vetoshkina

Today I watched the second-season finale of HBO’s Westworld, one of my personal favorite series. It tells the story of a future American Old West theme park populated by androids programmed to fulfill all human desires. Apart from being a truly stunning series with a fascinating plot, it touches upon philosophical and ethical questions regarding technology. One of these issues lies not just in the possible changes and threats technologies may bring us, but in how we, humans, should treat new, “human-like” technologies, like AI and robots.

Westworld may lack scientific feasibility on the issue of consciousness, but it raises important ethical and philosophical questions. The plot, visuals and acting are also stunning. Picture source: https://www.hbo.com/westworld

Recurrently, one can find different trends or questions in discussions of new technologies, and not only modern ones like AI or robotics: the atomic bomb, the assembly line. The wheel, I suppose. These trends often find reflection in the plots of movies, games and literature. Generally, I am rather critical of looking only at general trends. Putting it simply, things are a little bit different in Silicon Valley and in Krasnogor village in central Russia (the place is real, my aunt lives there). Things differ even between separate activities and communities. But general trends nonetheless provide a certain background for concrete activities connected to technologies, and they need to be addressed and discussed.

One of the recurring moral questions, reflected in movies, is “how can we harm other people with a certain technology, or by using a certain technology?” A canonical movie example is the already mentioned atomic bomb, represented, for instance, in Stanley Kubrick’s Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb. A less evident example is the moral choice to use Uber or Amazon, companies with a dark side that exploit their workers.

Another recurring question is the fear of new technology: “how can machines harm us?” The Terminator series is the classic example, with robots rebelling against humans. Discussions about whether AI will replace humans in the workplace fall into the same category.

But Westworld tackles more complex philosophical and ethical questions concerning humanity: “how should we treat AI and robots?”, “are they equal to humans, and what rights and responsibilities do they have?”. These questions have been present in a variety of recent movies, from Ex Machina to Blade Runner 2049.

It is a question of humans being cruel (like those who kick food delivery robots) and exploiting robots. This issue goes beyond the simple question of machine intelligence: are the machines “smart”? It extends to their free will and their nature, their similarity to and difference from humans. It is a question of whether robots and AI should have the same rights and responsibilities as humans. Should a self-driving Uber car be put in jail for killing a pedestrian? Or the human overseeing it (or is “it” even the right word)?

The solutions and answers should be discussed now, before it is too late, as in Westworld (spoiler alert!), where the android hosts, subjects of constant abuse by humans, rebel and kill almost everyone in the amusement park. Humankind is creating something new and exciting; the problem is to avoid abuse, on all sides.
