RoboCop, Peace and Postmodernism

By Armenak Tokmajyan

On the eve of a technological revolution, “… American machines are helping to promote peace abroad.” This is how the new RoboCop movie opens: it is 2028, and gigantic machines are bringing peace to Iran, which is apparently full of “terrorists”. It sounds like a typical Hollywood movie. Then Pat Novak (Samuel L. Jackson), in his TV show, promotes the idea of using machines to achieve peace inside the US, because the experience abroad was so successful. RoboCop, peace and postmodernism might seem hardly connected, but in fact RoboCop is the postmodern soldier. Even though the action takes place in Detroit, countries like Afghanistan, Iraq, Syria and Yemen, where there is a surplus of “terrorists”, are or will be the testing fields of postmodern warfare. Not surprisingly, RoboCop, like the American soldiers in Afghanistan and Iraq, will be there to bring peace in the future.

Who wants peace among the state and non-state actors of today’s world politics? Interestingly, many do. Pat Novak’s TV show in the film claims that the initial objective of the machines is to bring peace. The President of the United States expresses his nation’s willingness to have global peace, the United Nations Secretary-General, Ban Ki-moon, promotes peace, the Syrian authorities speak of peace, and even transnational jihadists around the world claim to fight for justice and peace. It seems that everyone is looking for the same thing: peace as a condition is desirable and will remain so, at least for the near future.

This is also how OmniCorp, the multinational military technology corporation, thinks when it comes to Detroit’s internal security: making it more secure and peaceful. The ideologues of the “war on terror” claim that after the tragic events of 9/11, the nature of the security threat has changed. Now the enemy is not a state, a coalition of states or a military alliance. It is not well identified or located; it is a network rather than a hierarchical institution; it can be anywhere at any time. Our enemy is terrorism. To face this foe, military strategists concluded that there is a need for soldiers who are faster, more efficient, physically invulnerable, easily repairable and easily controllable. In other words, we want perfect, high-tech warriors.

The problem was that the human soldier is slow, not very efficient, stopped by a single bullet, almost irreparable and sometimes very rebellious. All the modern weapons and information channels that the military corporations created were not enough to defeat this invisible enemy. Consequently, we need a postmodern approach to deconstruct this puzzle. The dominant narrative is that these terrorists are in Afghanistan or Iraq, or nowadays in Syria, but the reality is that terrorists can be everywhere, even in Detroit or any other place in the US.

To counter “terrorism” in Afghanistan and Iraq, the US Army used postmodern technology and tactics, including “unmanned” ground and air vehicles. The authors of Cyborg Worlds, Les Levidow and Kevin Robins, argue that there are projects to create autonomous land vehicles, minesweepers, and anti-radar, anti-armor, anti-everything drones. Robots can be incredibly fast, solid, efficient and remotely controllable. No one cares if a robot “dies”. On the contrary, public opinion in democratic countries will embrace this idea because it can save hundreds of soldiers’ lives. This brings us to the argument of OmniCorp’s CEO Raymond Sellars (Michael Keaton): robots not only save human lives but also do the job much better, and therefore this strategy has to be implemented in the US as well.

The movie also raises the core dilemma of the postmodern argument. The robot is not human; it cannot feel, and it can kill or destroy by mistake. In a democracy like the US, where public opinion matters, the “not human” problem can keep such a strategy from being implemented. The Dreyfus Act symbolizes this notion; it was the only obstacle to the robotization of the police forces. Even though humans can be more evil than a robot, for example by killing or raping a child in full consciousness, the fact that the robot did not feel still seemed unacceptable.

At the peak of his frustration and greed, Mr. Sellars finds the solution: why not humanize the robot? Why not put a “useless” human inside a robot? In this way the robot will have emotions and human consciousness, which will please the American public. This is how some perceive postmodern warfare: humanizing the machine so that it feels as we do but is perfect and impenetrable. This constant attempt to perfect an imperfect human being is central to postmodern thinking. The film, however, shows that when the human factor interferes, the machine ceases to be perfect. So there is a dilemma: we cannot extract the human factor, because the public will not accept it, but with it the machine remains imperfect. What is the solution? Dr. Norton (Gary Oldman), in a state of panic when RoboCop loses consciousness right before his first public appearance, finds it: robotizing the human.

As long as there is a human face, the robot is not perceived as a mere machine. Moreover, when this human-machine does the job perfectly, being objective, incorruptible, efficient and fast, we even start loving it. Does this mean that if we want to achieve peace, which is a “perfect” state, we need to create perfect entities such as RoboCop? RoboCop, as the movie showed, made Detroit a better place by reducing crime, yet the problem is that RoboCop is just a means standing between systems and political ends. In this case the system was morally corrupt, while the political ends went hand in hand with previously set agendas. Therefore, to achieve peace, perhaps the systems should be “perfect”, not the means. Here we are not enabling peace; we are advancing our securitization techniques, which so far have proved useless, at least in Afghanistan and Iraq.

This notion makes an observer wonder: what will change in 2028? Most probably we will see nicely designed robots in our streets as a benchmark of the postmodern era. But what will remain modern, or even pre-modern, is the military market. Decades ago, states, non-state actors and military corporations carried out their deals worth billions without caring who would be slaughtered or whose house would be destroyed by those weapons. It seems that even in the postmodern world this kind of business will continue as normal or, in the best case, to a lesser extent. What will differ is that we will sell much more sophisticated weapons, including combat robots, careless of whom they will kill. What is even worse, we will still promote our “securitization” policies as peace policies.

The film also raises questions about morality. On the one hand, there are claims that our world is steadily improving in terms of human rights; on the other hand, we are creating human machines and putting them on the market. Does this mean that we want to restart the “human” trade, but in a nicer way? What is also morally troublesome is the production of these human-machines, which in the movie takes place in China. It seems that such human-rights-problematic “projects” are done in China because we do not really care about its human rights record (do we?); what we care about is keeping our own human rights profile clean at home. Let us just imagine how crushing it would be for the reputation of the United States, or of any other democratic state, to dehumanize and robotize a paralyzed human on its own soil. At the end of the day, we can always blame the Chinese.

The relationship between RoboCop and his family is strong and emotional. When his job interferes with his family, he goes through emotional ups and downs. He begins an inner struggle, and eventually he prioritizes his family over his duty. This symbolizes what happens when our emotions hinder the constant drive for perfection. The huge economic interests behind such gigantic projects are likely to marginalize the emotional, human role to a certain extent. As the movie shows, during the process of dehumanization and then rehumanization of the main character, there is a complex set of variables. It characterizes the battle between the party that cares about the human side and the one that cares merely about the economic benefits. This humanization/dehumanization binary is critical to postmodern thinking. Are we becoming so dependent on machines that some of us would agree to become machines ourselves? Some will agree; some will disagree; some will robotize to make a profit; and some will fight back by rejecting it. In short, there will be a new debate about how much of our humanness we can relinquish and how far we can be robotized.

Is the robot out of control? Dr. Norton created a computer far more complicated than the one I am typing on right now, but the similarity between my computer and RoboCop is that both shut down at the touch of a button. If we remove this control mechanism from my computer, it will start and shut down on its own; yet it will not kill anyone. But what if we remove the remote control option from RoboCop? This is what happens at the end of the movie, when Dr. Norton removes the chip that kept the robot under constant control. As in most Hollywood action movies, RoboCop ends up a hero, but a postmodernist critic might ask: what if the robot were a “bad boy/girl” who turned against its owners and became a source of destruction rather than “peace”? Perhaps one of the core questions we currently face is the control dilemma: who controls whom?
