

© 1987-2016
Scott Larson


Does not compute

If you cared at all for the Fay Wray tribute I wrote last week, then you should know that it was even better the first time I wrote it.

Now it may surprise you that I would have written it more than once. It surprised me too, because usually I am lucky if I write these things even one time. But during the short period between the moment when I finished writing it (the first time) and the moment when I went to upload it to my web server, the hard disk on my usually trusty laptop failed. That will teach me to delay sharing my words the very minute they are first minted. Now, everyone who has experienced this himself or herself is wincing with empathetic pain, but I can’t really feel too sorry for myself. In countless hours of using various computers since, well, since desktop computers have had hard drives, this is the first time that I have had a hard disk fail. My more stubbornly loyal readers will recall that I had a computer catastrophe on this very same computer some 11 months ago, although that wasn’t a hard disk failure. But it had the same results as one, and as a consequence I have been much more faithful about backing up my files ever since. Because of this, happily, I lost very few files, the Fay Wray piece being the unhappy exception.

This got me to thinking about computers in general and the insidious ways they have infiltrated our lives. And what better way to ponder this than to examine that constant mirror on our reality, the movies? Not surprisingly, computers have been “characters” in films for quite a long time. And, if we are a bit loose with the definition of the word “computer,” then they are really ubiquitous, especially when we consider that robots (a sci-fi staple) are essentially computers. For my purposes, however, I am just concentrating on the basic machines themselves.

One thing is clear from watching Hollywood movies: people were afraid of computers, even long before most of us had them on our desks or in our homes. A common theme about them is how they go haywire and put our lives in danger. This fear of technology is way older than computers, of course. It goes all the way back to the Greek myth of Prometheus and how he was punished for daring to bring the new technology fire to mere mortals. This theme has shown up in literature again and again, e.g. Frankenstein, which was subtitled The Modern Prometheus. It reflects mankind’s general unease with new technology, whether it be electricity, horseless carriages or GM foods.

The most famous computer in film history is, without a doubt, the HAL 9000 from Stanley Kubrick’s classic 1968 movie 2001: A Space Odyssey. We could all relate to HAL’s obstinate “Sorry, Dave” even before we (or most of us) used computers ourselves, and we can all relate to it even more now that we do. Kubrick caught brilliantly the frustration of dealing with an artificial intelligence that has too much control over our lives. A lot of HAL’s dramatic effect was in its (his?) annoyingly soothing voice, which was provided by Douglas Rain. The late critic Pauline Kael called HAL the “only amusing character” in the movie and said that he suggested “a rejected homosexual lover.”

The theme of a computer running amok has been reprised again and again. In Donald Cammell’s 1977 film, Demon Seed, based on a Dean Koontz book, Julie Christie is impregnated by an evil computer called Proteus IV, which wants to take over the world and which has the voice of Robert Vaughn. In Rachel Talalay’s 1993 Ghost in the Machine, a mainframe in Cleveland, Ohio, becomes possessed by a serial killer and goes after a hapless Karen Allen. Eventually, it’s just a matter of time before computers take over the world completely and enslave mankind. We know this will happen, thanks to James Cameron’s 1984 classic, The Terminator (followed by two sequels) and to the Wachowski brothers’ 1999 The Matrix (also followed by two sequels).

Less menacing but still a bit creepy (and harking back to 2001’s computer-as-rejected-lover theme) was the plot of Steve Barron’s 1984 Electric Dreams, as Lenny von Dohlen’s new home computer becomes part of a love triangle along with him and Virginia Madsen. Actually, maybe it’s not that creepy after all. In fact, something similar goes on in my own house.

In the course of the latter 20th century, however, there was a subtle change to the computer-as-menace theme. Perhaps reflecting ordinary people’s growing sophistication regarding computers, the villains shifted from the machines themselves to programmers and/or hackers. In John Badham’s 1983 WarGames, it is nerdy teenager Matthew Broderick who is the menace, as he hacks his way into a Defense Department computer, thinking he is playing a new video game, and nearly starts World War III. Steven Spielberg’s 1993 Jurassic Park was, of course, a dinosaur movie more than a computer movie, but it wasn’t lost on computer nerds that the main villain was a computer nerd. All the trouble starts when programmer Wayne Knight shuts down the park’s computer system. (In the end, however, it takes another computer nerd, young Ariana Richards, to save the day.) In Irwin Winkler’s 1995 The Net, Sandra Bullock is a computer programmer who is menaced by bad guy computer programmers (with a major case of identity theft) when she refuses to give them a disk they want. The good-hackers-versus-bad-hackers theme was employed the same year in Iain Softley’s Hackers, starring Jonny Lee Miller and Angelina Jolie.

These days, no action movie is complete without a scene in which the hero hacks into a computer as part of the plot. By 2001 (the year in which Kubrick’s film was titularly set), you could argue, computers had ceased to function as agents or characters in the movies at all. That year, in Peter Howitt’s Antitrust, it is Tim Robbins, as a thinly disguised Bill Gates, who is the villain, offing independent open-source programmers (the old-fashioned way) and stealing their code.

In the meantime, movies had also capitalized on the young male fascination with computer games and presented computers as environments for virtual reality worlds. There have been numerous films of this type, but they all owe their existence to Steven Lisberger’s 1982 film for Disney, Tron. In a Wizard of Oz-like twist, it featured actors playing dual roles as normal people in the real world and as computer programs in the virtual computer world. But the plot and action make it clear that this is not really a movie “about” computers. It is the first of all-too-numerous video game movies.

Another by-product of the ubiquity of computers in the real world has been that the computers in movies have gone from being generic piles of circuits with generic monitors and generic keyboards to being makes and models we all recognize, i.e. product placements. One of the best computer product placements of all time was in Leonard Nimoy’s 1986 Star Trek IV: The Voyage Home. In this film, the crew has traveled back in time to contemporary San Francisco. Obliged to use a 1986 computer, Chief Engineer Scotty tries giving verbal commands to a Macintosh, as he would to the Enterprise’s own computer (which had the voice of Majel Barrett Roddenberry). When he gets no response, someone suggests helpfully that he use the mouse. So, Scotty picks up the mouse and tries talking to that. It was a great gag, and there’s a little story behind it. The filmmakers originally wanted to use an Amiga computer for this scene. The reason is clear. The Amiga was simply the coolest computer of its time. But, at that point, the Amiga was owned by Commodore, which had the worst marketing department in the whole world, that year or any other. Commodore couldn’t get its act together to accommodate the filmmakers, and they eventually gave up and worked with Apple instead. Maybe if things had worked out differently and the Amiga had gotten a big publicity boost from that movie, I might be using an Amiga today, instead of an IBM machine that seems determined to wipe out all my work every year or so.

And with that, I will now stop and upload this column.

Right now.

-S.L., 19 August 2004

If you would like to respond to this commentary or to anything else on this web site, please send a message to Messages sent to this address will be considered for publishing on the Feedback Page without attribution. (That means your name, email address or anything else that might identify you won’t be included.) Messages published will be at my discretion and subject to editing. But I promise not to leave something out just because it’s unflattering.

If you would like to send me a message but not have it considered for publishing, you can send it to
