It is by no means an exaggeration to say that the various generations of the gaming industry have corresponded, usually very clearly and very closely, with the various eras of my life: I literally grew up with the 8-bit, spent most of my adolescence with the 16-bit, came of age with the 32-/64-bit, worked videogame retail during the 128-bit, and have become a gaming journo during the current (which, my development contacts chide me, no longer uses a silly “bit” designation) generation.
It’s partially for this reason that the Nintendo 64 and PlayStation era is my favorite, but only partially; just as I was graduating high school, entering college, and meeting my future wife, the industry was similarly coming into its own, ditching cartridges for CDs, entering the third dimension (the virtual one, not the literal one, naturally), and finally expanding its narratives into fully fledged stories, replete with voice acting and themes and, most excitingly of all, real endings (something that comic books are still working on, sadly). It was an exciting time.
And there were plenty of exciting experiences to be had. Making Mario run around a three-dimensional landscape for the first time was one of the most singularly immersive moments in the history of the medium (as was evidenced by a close buddy who literally couldn’t stop giggling the first time he picked up the controller). The chills of being pursued by zombified dogs in Resident Evil have proven, even 15 years later, to be hard to top, both viscerally and artistically; the thrills of, say, leading Link across Hyrule Field to go fishing at Lake Hylia have easily made Ocarina of Time my favorite game of all time. And I can still feel the electric jolt that Metal Gear Solid’s closing credits sequence produced in me – the first time that I was affected emotionally as well as conceptually by a game.
And then there’s GoldenEye 007. A close group of friends and I played the game so much, literally every day all throughout our senior year of high school (a tradition which mostly continued in my first dorm the semester [or three] thereafter), that they rechristened the “marksmanship award” in my name in honor of how notoriously difficult I was to kill, even when going three-against-one with lasers in the Facility. I can still hear the cries of “No god!” – our nerd-parlance for having no gun, making you safe from the crossfire – and the colorfully worded epithets they flung in my general direction, prodding my parents to pull the occasional Statler and Waldorf from the other room. Split-screen multiplayer was never better.
And, depressingly enough, it’s never been replayable. The sad truth is that, while Super Mario 64 and Silent Hill and Final Fantasy VII remain indelible classics, they also remain nearly unplayable (particularly on a 64-inch HDTV). Much more than most other art forms, videogames are very much born of one moment and expire (or is that transpire?) in the next, leaving behind telltale, but nonetheless ephemeral, traces of their existence. No, not their existence, per se, but their existential relevance, if a slightly different connotation can be ascribed to those two words in at least this particular context. It’s a fact that is both strangely comforting and distressing; it makes games like memories – magical and mortal. It also makes one wonder what the ultimate resonance of, say, Uncharted or BioShock will be, and what form that echo will ultimately take.
Robert Pirsig is famous for having said that the godhead is equally comfortable resting on the petal of a flower as it is residing in the circuits of a computer. Indeed, with transience and emptiness as their twin pillars, videogames are the very embodiment of samsara – a digital manifestation of a very-much-real precept of life on this planet.
Somewhere, beyond a polygonal landscape aided and abetted by force feedback, the godhead is laughing.
Marc N. Kleinhenz has covered gaming for over a dozen publications, including Gamasutra and TotalPlayStation, where he was features editor. He also likes mittens.