“Technology has a lot to answer for.”
My mother has a good point. For all the wonderful opportunities that improving technology opens up, humans still have fundamental problems applying it to their world, and things aren’t getting any better.
This particular charge was voiced on Boxing Day, as my family were on our way home from a Christmassy visit to the paternal aunts, uncles and cousins. We had enjoyed (to begin with) a viewing of photos from my cousin’s recent 21st birthday party (which we had attended). These were followed by shots from a day out with friends who had travelled down to Sydney for a couple of days around the party. Not printed photos, of course, but digital photos displayed on a good sized television.
It seems to be a general rule of the universe that a mandatory five or ten minutes are required at any family visual presentation to get the technology working. In our case, the males and twenty-somethings in the room attempted to get the images off my cousin’s camera and onto her laptop, then hook the laptop up to the television and fiddle with the laptop’s monitor resolution and the television’s aspect ratio before the photos were finally presentable. I’m sure that no less time is required today than in days of yore, when slides would need to be loaded into a slide projector (upside-down and back-to-front, of course), or a film threaded painstakingly through the wheels and cogs of a film projector.
We were then treated to more than two hundred photos of the party and surrounding events. Two hundred. Many were repeats, “just one more”-style. Many more were of the same group of people in front of different nondescript backgrounds. Some were great, catching a moment or providing a glimpse of someone we hadn’t seen for a while. But most were not. As a digital camera user I am forever grateful for the flexibility that solid-state memory provides – to take as many photos as I want without a thought for running out of film – but taking the photos is only part of the job. Even die-hard film fanatics don’t keep every photo they shoot, and they certainly don’t show them all to family. But the combined technologies of digital cameras and mass storage seem to have conspired against humanity to subject us to nearly endless streams of humdrum photos, the jewels buried deep within.
Perhaps because taking a shot is now free of cost, many people pay little attention to its composition or the actual photography of it. Just because we no longer need to discard photos to fit the bulging albums on our shelves, it seems that most people think we shouldn’t. The constraints that film and physical storage imposed on us have disappeared, and without them we are drowning in a sea of easy, cheap, second-rate photos. We used to need to put a little labour and money into snapping and printing a picture; now we don’t, but that doesn’t mean we’re home free. Instead, that effort should be transferred to the editing and culling of images after the fact. Such an arrangement has the potential to improve everyone’s photography, but it’s a lot easier to forget about that all-important selection, or to put it off indefinitely. It’s easier instead to subject our friends and family to the raw footage, as it were. Which must be annoying in at least two ways for people like Mum. Photography was a lot more effort in her day, with better results, but now she has to endure the mediocrity that progress has begotten.
Digital photography is just one example of what seems to be a general trend across many areas of technology, at least where it has become commoditised and reached the eager hands of “everyday” people. Technology is providing increasingly complex tools to people who are increasingly incapable of using them properly. Desktop computers themselves are a prime example. Their commodity nature, and the simply unimaginable potential offered by an Internet-connected computer, make them practically a necessity for modern-day life. To be without access to this global network is to be truly disadvantaged. Cheap access to such an important resource should perhaps be classified with food, water and shelter as a basic necessity of life, and OLPC are pursuing a truly worthy goal. But that is not my point. As accessible and necessary as computers are, most people remain incapable of using them safely and correctly. Or perhaps, to be fair, computer designers remain incapable of creating a computer that normal people can use.
A PC can be used for good and evil. Like scissors, a car, or nuclear power, in the right hands a computer is a fantastic tool. But in the hands of someone who doesn’t understand how it works, a computer can be dangerous. To the user, a computer is a danger to their own work: their photos, address book and sensitive personal and financial information. A poorly protected computer is also a danger to others. The PC users of the world collectively wield a weapon powerful enough to bring any organisation to its knees, yet very few of them know how to keep their computers safe from viruses and trojans. Most probably don’t even consider the danger to others, and simply endure the inconvenience to themselves of regular re-installation.
“A computer lets you make more mistakes faster than any invention in human history – with the possible exceptions of handguns and tequila.” – Mitch Ratcliffe
The ubiquity of digital technology has led to its use by many people who, in previous ages, would never have been trusted with it. Were computers not so useful, offering so much potential for productivity and enjoyment, they might be regulated like other complex, easy-to-misuse technologies such as cars, guns and heavy machinery. Such technology requires a license to use, an assurance to others that the user has training and knowledge appropriate to the task. Computers, on the other hand, have become cheap, common and (just) usable enough that they have been handed to a huge group of people who would never have been allowed near anything so complex and powerful. Technology has accelerated away from our ability to teach people how to use it.
“Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning.” – Rich Cook
This is a popular sentiment, especially among software engineers and designers, with whom rests most of the blame that can’t be placed at the feet of the idiots. There will, of course, always be a bigger and better idiot, but that’s not the problem. The problem is that the vast majority of people can’t effectively use technology that has, almost suddenly, become so pervasive that we can no longer do without it. And it’s not because software engineers can’t build good systems. The complexity of the technology being developed today absolutely dwarfs that of any previous technology. We have never before had the ability to build such powerful and sophisticated tools, and never realised that we would approach a complexity threshold beyond which the majority of the population is left behind. As technology continues to become more and more capable, and correspondingly more complex, we simply won’t be able to build interfaces to it simple enough for everyone to use.
If the trend continues then a very real digital divide will form, not between those who have the technology and those who don’t (it will be cheap), but between those who can use it and those who can’t. Knowing how to use information and technology will become a phenomenal advantage in life, equivalent to being one of the best hunters, strongest warriors or most adept craftsmen. As societies once relied on the hunters, then the farmers, then the industrialists, so society will come to rely on the technologists for continued improvement in life. Given the tools available to the technologists, this will place them in a position of rather unique influence.
But it’s still going to take your dad five minutes to set up the holo-visual reality field.