Questions of Empathy in “Do Androids Dream of Electric Sheep?”

I will continue my thoughts on Blade Runner / …Electric Sheep with this consideration of some of the psychological qualities of androids versus humans, and the moral questions they seem to raise…

A key theme of the book is empathy. Notionally, the means of detecting a rogue android (or ‘andy’) is a test of empathy, the Voigt-Kampff (sounding for all the world like any one of a number of current and past psychological tests of intellect, emotion, personality and so on…), and one which, presuming an essentially reliable narrator, androids don’t pass. The trouble is, aside from their inability to pass the test, and everyone’s acknowledgement of that, they don’t seem to be any more or less lacking in ‘empathy’ than anyone else in the book.

The theme of empathy comes through in a number of ways. Very early on you are introduced to Deckard’s electric sheep and the point and purpose of such an outlandish construct: it is to demonstrate both status and empathy to his neighbours. It would seem that, at some undefined but reasonably recent point in the past, there was a terrible (assumed nuclear) world war, one upshot of which was the sudden and extensive extinction of numerous animal species, and the distinct endangerment of many others. Since then, people have taken to owning animals as a form of status symbol; we see this in Deckard’s clear jealousy of his neighbour’s rare – and apparently fertile – horse. But the animals were never intended as status symbols. Whilst it is never stated, it is heavily implied that a deficit of empathy allowed, or caused, the war in the first place, and that to guard against another, people must develop their empathy. Partly this is done through Mercerism, the unusual world religion that seems to have come about with little fanfare, but which has a near-total grip on all humans (though not androids), and which appears to develop and instil empathy through the enforced, shared experience of a single person’s suffering and failure. But it is also a part of the religion to own animals, so as to develop one’s own empathy.

Yet the whole business seems to have little to do with any genuine empathy. As I said, the animals become status symbols, partly due to their expense, and partly through the requirement to have one at all. It would seem, through Deckard’s entreaties to his neighbour for a chance to purchase his foal, and through his revealing of his own shameful, ersatz animal, that there is a compulsion to provide for those who have none at all. While Deckard’s neighbour believed his animal real, he felt no guilt in one-upping him, only pride. Once he sees it is a fake, he suddenly feels guilty for owning two. In the same way, we might boast of success to someone whose job is perhaps not so good as ours, but it would become despicable to do so to an unemployed person. Most importantly, though, little if any real empathy is shown towards these creatures. They are admired as prizes, and that is all.

Meanwhile, no empathy at all is shown towards “specials” – intellectually backward people, known as “chickenheads” – such as J.R. Isidore, who is of so little interest and worth that he is left to live alone amongst the wreckage (kipple!) of the old world. And then he encounters androids, who do show him some sympathy, who can empathise with him, in a way that even his own boss, who is supposed to be sympathetic, seems not to do. In fact, despite our being repeatedly told that androids have no empathy, and that this makes them dangerous, their behaviour on this matter seems persistently ambiguous. Both androids and humans behave in much the way that humans have always behaved: with compassion, pity, fear, cruelty, ignorance, dignity, pride and many other qualities besides. Some androids seem dangerous and murderous, like Polokov, yet others, such as Garland, seem only defensive and protective. Indeed, except for the testimony of characters (both human and android) and the fact of the various failed (and passed) Voigt-Kampff tests, it seems almost bizarre to claim that androids are lacking in empathy, or in any human quality. It would not be a stretch to read the whole work as a paranoid fantasy, and take all the androids as simply human, though that is not my reading here. What it is most reminiscent of is David Rosenhan’s pseudo-patient experiment: once the characters have been labelled “android”, everything they do is interpreted accordingly. It is merely a matter of perspective. This question is thrown into sharp relief when Deckard is taken to the fake police station, run by androids, and the question of Phil Resch’s humanity – and Deckard’s own – is brought up. Once we have confusion about the matter, and assuming neither their behaviour nor the authorial reporting of it changes, we struggle ever to be sure again of exactly who is or is not an android.

All this leads to one of the most vital themes of the work: the morality of killing an android. It is simply a given that they must be killed. Their lack of empathy is occasionally touched on as a reason, but mostly it serves only as the means of detecting them. No thought is given (by the characters, at least) to the question of what it means if an android can pass an empathy test exactly as a human does. It is merely a larger obstacle to their detection; it does not for one second call into question their inability to empathise, or whether humans truly are empathic. Both are assumed, and never questioned, regardless of the facts of behaviour.
