On Computing
This is an iPad post. Tune out if you're sick of this stuff.
I've been watching with some degree of bemusement people's reactions to the iPad and its strictly limited feature set. I've debated the value of this product with people ad nauseam, and I've read a number of opinion pieces on the device. Most of all, though, I'm excited about the potential for this product to let me do my job a lot better.
For a very long time – longer than I care to admit – I was a technologist for the sake of technology. Whenever the latest and greatest bit of technology came out, I'd immerse myself in it wholly, simply because it was new and shiny. My love affair with technology can be traced back to my father, who repaired consumer electronics all the way back in the 80s. He'd come home with new TVs and VCRs, and we'd sit and watch in awe as he fixed these arcane devices. Then we'd act as QA, not only taking advantage of these increasingly sophisticated machines but reporting bugs in the systems themselves.
Once computers came out and we eventually got hold of one of these newfangled devices, my older brother and I took to the machines. We learnt about them from the inside out, and became the masters of their dominion. My father, once the expert on electronics, now had to take a back seat to a pair of punk early tweens. He had the purchasing power, but we held all the keys to the workings of these arcane systems (literally, since our first PC had a keyboard lock).
Fast forward a few years and we're still the masters of this technology. We're upgrading our machines by hand, manually assigning interrupts to peripheral devices, wrangling the most out of extended memory, and logging on to primitive bulletin board systems via this revolutionary device called a modem. Yet some things remained the same. We were still the resident experts. We owned the future.
Eventually, we grew up (ha!). We dutifully shuffled off to university to gain our "education". This is, as the custard maker said to the non-Newtonian liquid, where the plot thickens. Both my brother and I are studying computer engineering. However, instead of vanilla computer engineering, we're studying a double degree with biomedical engineering. It's no longer pure computing we're looking at, but this bizarre field of applied computing. It's here we learn about the wonders of the rest of the world: anatomy, physiology, physics. You don't know wonder till you've opened up the chest of a toad, seen its lungs pop out, and watched the blood coursing through the capillaries in its lungs. It's enough to make you believe in God. At the same time, we're learning about the dark magiks of computing: algorithm development and CPU design. It's involved stuff, and truly knowing how even a single sub-field works would take a lifetime. For a while, the computing becomes my focus. Yet, in the back of my head, there's a slight tingling, an inkling that maybe working on the biomedical stuff is kind of important.
My brother and I split paths after university. He began his PhD in the field of surgical simulation (or close enough), and I started moving closer and closer to molecular biology. University had taught me that computing skills were a commodity; there would always be someone else with the skills to replace me. Yet in molecular biology, I was again the guru. I performed magic with computers. And it was good.
Betraying my past, I became less and less interested in the computing with every passing year. I used Linux on the desktop for many years, and stopped only after I realised it was actually keeping me from what I was most interested in doing: solving problems. I became engrossed in problems of mass spectrometry, of proteomics, of glycomics. I compared these to the problems I solved when I was involved only in computing, and realised that I had simply been playing on computers when I first embarked on this so-called career so many years ago.
This, I believe, is the natural progression of any geek: from proto-geek to whitebeard. The proto-geek is only interested in the technology. The whitebeard has seen so many technologies come and go that only the problems matter anymore. Whitebeards look beyond spec sheets and see the utility of the whole. They understand their place within a larger system that involves a large number of meatspace (human) components. I once had an argument in a bar with two proto-geeks who were comparing which operating system they used. As a user of Mac OS X, I was the subject of ridicule. I wasn't hard enough for them. They had been using Ubuntu for the past year, and that was more real computing than a Mac could ever be. The utility of the tools was never brought up; instead, deviation from 80 columns and 20 rows of characters was the metric of "coolness". I guess, at some point in time, I was that person too.
What we geeks do has been steadily shifting from gatekeeper of technology to custodian. Our value no longer comes from being the family member who "does computers" and knows how to burn a DVD. The role of the geek is to become invisible: to tame and protect technology, and present it to the rest of the world while they do their jobs. This century we're facing a whole bunch of problems, and we're going to need everyone's help to get through them. The last thing we need is technology getting in the way of the problem solving. Before I digress too far, the protection of technology is an important point: we're the only people who can stand up to attacks on freedom like the great firewall of Australia and net (non-)neutrality.
Apple positions itself at the intersection of the liberal arts and technology. It's where I've been for at least the last 10 years, and it's the most effective place to be in computing. I've had a long journey through technology to get here, and I'm not the only person in this space. Almost everyone in social networking belongs here too (to various degrees of effectiveness). Everyone in bioinformatics belongs here too (to various degrees of effectiveness). Pick any field of endeavour, and you'll see the effect of computing on it (to various degrees of effectiveness). Computing is pervasive, and integrated across all of society.
The iPad is the first public and loud death knell for computing as we know it: the first sign of the death of the late-twentieth-century nerd. It will take years before the revolution is fully realised, but it's been underway for a while now, and it's not going to stop. What you're hearing is the sound of inevitability. So, my advice to the people who still cling to their roles as gatekeepers of technology is this: if you want to have a job in the next decade, ask yourself what you're doing to help. If you can't answer how you're directly solving real, human problems, then you're going to find yourself out of place. If only that were true of the financial services sector too. *sigh*
In the future, if you want to know how successful any bit of technology is, don't look at the implementation. Look only at how well it solves a problem, how invisible the tech is (i.e. how easy it is to use), and how accessible it is (which includes price).