Remembering the Moon
Wendy Grossman looks back at what the landing on the moon meant and how our predictions of the technology of the future are never quite right: “Computers are… yesterday's future, and tomorrow will be about something else”.
"I knew my life was going to be lived in space," a 50-something said to me in 2009, on the anniversary of the moon landings, trying to describe the impact they had on him as a 12-year-old. I understood what he meant: on July 20, 1969 - late on a summer Sunday evening in my time zone - I was 15 and allowed to stay up to watch. Awed both by the achievement and by the fact that we could see it live, we took Polaroid pictures (!) of the TV image showing Armstrong stepping onto the Moon's surface.
The science writer Tom Wilkie remarked once that the real impact of those early days of the space program was the image of the Earth from space, that it kicked off a new understanding of the planet as a whole, fragile ecosystem. The first Earth Day was just nine months later. At the time, it didn't seem like that. "We landed on the moon" became a sort of yardstick; how could we put a man on the moon yet be unable to fix a bicycle? That sort of thing.
To those who've grown up always knowing we landed on the moon in ancient times (that is, before they were born), it's hard to convey what a staggering moment of hope and astonishment it was. For one thing, it seemed so improbable and it happened so fast. In 1962, President Kennedy promised to put a man on the moon by the end of the decade - and it happened, even though he was assassinated. For another, it was the science fiction we all read as teens come to life. Surely the next steps would be other planets, and greater access for the rest of us. Wouldn't I, in my lifetime, eventually also be able to look out the window of a vehicle in motion and see the Earth getting smaller?
Probably not. Many years later, I was on the receiving end of a rant from an English friend about the wasteful expense of sending people into space when unmanned spacecraft could do so much more for so much less money. He was, of course, right, and it's not much of a surprise that the death of the first human to set foot on the Moon, Neil Armstrong, so nearly coincided with the success of the Mars rover Curiosity. What Curiosity also reminds us, or should, is that although we admire Armstrong as a hero, the fact is that landing on the Moon wasn't so much his achievement as that of the probably thousands of engineers, programmers, and scientists who developed and built the technology necessary to get him there. As a result, the thing that makes me saddest about Armstrong's death on August 25 is the loss of his human memory of the experience of seeing and touching that off-Earth orbiting body.
The science fiction writer Charlie Stross has a lecture transcript I particularly like about the way the future changes under your feet. The space program - and, in the UK and France, Concorde - seemed like a beginning at the time, but has so far turned out to be an end. Sometime between 1950 and 1970, Stross argues, progress was redefined from being all about the speed of transport to being all about the speed of computers or, more precisely, Moore's Law. In the 1930s, when the moon-walkers were born, the speed of transport was doubling in less than a decade; but it doubled only once in the 40 years from the late 1960s to 2007, when he wrote the talk. The acceleration had slowed dramatically.
Applying this precedent to Moore's Law - Intel founder Gordon Moore's observation that the number of transistors that can fit on an integrated circuit doubles about every 24 months, increasing computing speed and power proportionately - Stross was happy to argue that, despite what we all think today and the obsessive belief among Singularitarians that computers will surpass the computational power of humans oh, any day now, but certainly by 2030, "Computers and microprocessors aren't the future. They're yesterday's future, and tomorrow will be about something else." His suggestion: bandwidth, bringing things like lifelogging and ubiquitous computing so that no one ever gets lost. If we'd had that in 1969, the astronauts would have been sending back first-person total-immersion visual and tactile experiences that would now be in NASA's library for us all to experience as if at first hand, instead of just the external image we all know.
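The scale of the contrast Stross draws is worth making concrete. As a rough back-of-the-envelope sketch (my own, not from his talk), assuming transport speed doubled once over that 40-year span while transistor counts doubled every two years:

```python
# Back-of-envelope comparison of the two doubling rates in the
# paragraph above: transport speed (one doubling over 40 years)
# versus transistor counts under Moore's Law (doubling every ~24
# months). The figures are illustrative, not measured data.

def growth_factor(years: float, doubling_time_years: float) -> float:
    """Total multiplication over `years` given a fixed doubling time."""
    return 2 ** (years / doubling_time_years)

# Over the same 40-year span (late 1960s to 2007):
transport = growth_factor(40, 40)  # one doubling: 2x
moore = growth_factor(40, 2)       # twenty doublings: ~a million-fold

print(f"transport speed: {transport:.0f}x")
print(f"transistor count: {moore:,.0f}x")
```

Twenty doublings multiply by 2^20, a little over a million - which is why, by Stross's redefinition, "progress" came to mean computers rather than vehicles.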
The science fiction I grew up with assumed that computers would remain rare (if huge) expensive items operated by the elite and knowledgeable (except, perhaps, for personal robots). Space flight, and personal transport, on the other hand, would be democratized. Partly, let's face it, that's because space travel and robots make compelling images and stories, particularly for movies, while sitting and typing...not so much. I didn't grow up imagining my life being mediated and expanded by computer use; I, like countless generations before me, grew up imagining the places I might go and the things I might see. Armstrong and the other astronauts were my proxies. One day in the not-too-distant future, we will have no humans left who remember what it was actually like to look up and see the Earth in the sky while standing on a distant rock. There have only ever been, Wikipedia tells me, 12 of them, all born in the 1930s.
ORGZine: the Digital Rights magazine written for and by Open Rights Group supporters and engaged experts expressing their personal views