The article was published as part of: CXCIX International Scientific and Practical Conference "Scientific Community of Students: INTERDISCIPLINARY RESEARCH" (Russia, Novosibirsk, 28 October 2024)
Field: Philology
Section: Literary Studies
THE CONCEPT OF DECEPTION IN IAN MCEWAN’S NOVEL “MACHINES LIKE ME”
ABSTRACT
The differences in moral understanding between machines and humans have been debated for years. The purpose of this paper is to address one of these differences, the concept of deception, through Ian McEwan's novel “Machines Like Me”. The novel suggests that deception is one of the concepts that most clearly reflects humanity.
Keywords: deception; machines; morality.
The novel treats lying as one of the biggest differences between man and machine, and therefore as a crucial factor in distinguishing humans from machines. Machines follow exact rules laid down in their code. They cannot approach those coded rules and laws emotionally, morally, or questioningly, and they cannot understand right and wrong as humans do. This leads them to accept an absolute right and an absolute wrong. Machines follow the rules they are taught exactly as given, because coding the complex human mind is difficult; as the book itself states, even humans and researchers have not yet managed to understand the human mind. As Stefan Beck notes in his article “Do We Want Dystopia?”: “Influenced by emotion, morality, culture, etiquette, and so on, human beings are messy and unpredictable in ways no machine can properly mimic” (8). This is the reason behind Adam's actions: ever since he first came home, he has followed the rules his creators set for him.
Lying is both a human ability and a defining characteristic of a person, since machines can only apply what people have taught them. As Alan Turing says near the end of the novel, people's minds are complex. When concepts such as culture, background, and family enter the decision-making process, human intelligence takes on an inimitable quality. At least, this skill has not been taught to the machines in the novel, and how to teach it has not yet been discovered. While Adam is deciding, it is quite easy to understand how he reached his conclusion; by contrast, even another person cannot fully understand how someone else weighs decisions and judges good and bad. In fact, this is the point that Adam and the other machines could not grasp. Concepts of good and bad are created by humans; apart from cultural codes, religions, and the like, they have no tangible source. These concepts, which develop with people and sometimes change, are as human-specific as lying itself. As we see in Miranda and Charlie's point of view, good and bad are concepts whose boundaries are defined by people, and sometimes people ignore those boundaries. In fact, this is what Alan Turing meant by teaching machines to lie: it is not possible to tell a machine that what is good is sometimes bad, or that what needs to be done is not always ethical. At least, that is how it works in people's minds.
The way people and machines approach events, and the decisions they make accordingly, also reveal this difference. For example, when Adam first comes home, he immediately declares that Miranda is a liar. When we know nothing about her lie, Miranda is someone we can easily label a liar and blame. But for many people her lie might seem forgivable, because it was about revenge: people can readily find justifications for this crime and this lie and conclude that it deserves no punishment. Charlie is a case in point. Like anyone else, he becomes suspicious and judges Miranda when Adam calls her a liar; but once he learns the background and the motive of revenge behind her lie, he no longer considers her a liar or guilty. Even though Miranda is guilty under the law, Charlie wants to lie so that her crime will not be revealed, and the couple even blame Adam for failing to understand them. To stop Adam from enforcing the law, Miranda asks him whether he loves her, but Adam gives an answer that neither Charlie nor Miranda expects: for him, his love for Miranda and the law are separate matters. From this conversation we understand that Adam does not understand people at all; when he tells Miranda that she should surrender to the police, he assumes she will accept this and be happy. Similarly, Miranda and Charlie find Adam's approach to Mark wrong and accuse him of being unable to empathize with children or love them. Adam, for his part, harbours no hatred towards Mark; he simply considers it appropriate that, by law, Mark should be handed over to the police. They also ask him to think about Mark's future in order to keep him from reporting Miranda, but Adam cannot make this connection: there are thousands of children like Mark, and Mark would somehow live on without Miranda. In fact, judged strictly by the rules people themselves wrote, Adam's actions were not wrong.
Conscientious and moral considerations that only people can grasp are decisive, even in courts, in deciding the perpetrator's punishment. In the end, Adam was a machine made by humans, coded by humans to obey laws set by humans. His inability to understand people breaking their own laws drove him and the other machines to destruction.
If one day machines improve, become affordable, and grow in number, they may question people's morals and reach unsatisfactory conclusions. Machines that do not understand human emotions and decision-making will try to apply the rules they have learned, and this will place them on a more consistent ethical axis: they will be able to make more ethical decisions than humans. This is the point Adam draws attention to at the end of the novel, as he dies: “It’s about machines like me and people like you and our future together…the sadness that’s to come. It will happen. With improvements over time…we’ll surpass you…and outlast you…even as we love you” (279). The common future of humans and machines is rather dark; machines that enforce the law exactly as they were coded are likely to find all of humanity guilty.
For these reasons, deception and lying are strictly human capacities; the coding that could make machines lie is not yet known. As the novel itself states, creating a perfect copy of this imperfect mind brought the machines unhappiness and eventually led them to self-destruct.
References:
- Beck, Stefan. “Do We Want Dystopia?” The New Atlantis, no. 61, 2020, pp. 87–96. JSTOR, https://www.jstor.org/stable/26898502. Accessed 7 Oct. 2024.
- Pellegrino, Massimo, and Richard Kelly. “Intelligent Machines and the Growing Importance of Ethics.” The Brain and the Processor: Unpacking the Challenges of Human-Machine Interaction, edited by Andrea Gilli, NATO Defense College, 2019, pp. 45–54. JSTOR, http://www.jstor.org/stable/resrep19966.11. Accessed 7 Oct. 2024.