What does it mean to be a Practical Nerd?
I mean, being a standard-issue nerd is no longer a challenge. Not like when I grew up, when “nerd” was a term of derision, a reason for even the art and drama geeks to shun you, and a near-guarantee that someone at junior high school graduation would place an overly ripe plum on your chair just before you sat down in your, uh, white suit. (Yes. It’s true.)
But nerds, or at least their label, are now mainstream, thanks to the success of Bill Gates, Steve Jobs and J.J. Abrams. So we now get to watch with glee as once-revered jocks show up bald and fat at high school reunions, while those of us once shunned at the same events get awards such as “Best Preserved.” (Yes. Also true.)
It may be time, though, for nerds to stop being respectable and reclaim their heritage of edgy unpopularity. Because going mainstream has brought with it a decidedly counterproductive side effect: having been allowed to fit in, self-proclaimed nerds are now, counterintuitively, popular, and they are watering down what “nerd” means.
A true nerd, a Practical Nerd, realizes that geeks don’t so much inherit the Earth as sardonically evaluate it and then critique it for having a primitive UI. Practical Nerds don’t chase the next bright shiny object so much as they toil to make sure the playthings cavalierly toyed with by the digerati actually fracking work for normal people.
And we Practical Nerds have our own Manifesto of five core principles, honed from hard-won, real-world experience:
1) Cool is not necessarily useful. I’ve reviewed literally hundreds of gadgets, websites and apps over the years. In most cases, what may seem absolutely fascinating upon demo doesn’t hold up when integrated into everyday routine. (Getting gadgets to review may sound great, but it’s analogous to when I used to review science-fiction novels, causing friends to jealously comment, “You get to read all those free books!” and me to reply, “No, I have to read all those free books – even the bad ones.”)
I’ve been caught in this trap repeatedly myself, discovering that what works fine for a few days when I’m focused on it frequently requires me to modify my habits too much when I try to make it a regular part of my life.
There have been cutting-edge exceptions (the very first TiVo from Philips, with a whole dozen hours of low-definition capacity, comes to mind), but what’s cool on day one is frequently annoying by day 30. If a new app, gadget or other product calls for convoluted changes to your routine that rival advanced yoga, you (and I) will not keep using it. Human biology and psychology simply don’t advance as quickly as technology.
2) Software is never, ever going to be “dead.” Whether you try to dress it up by calling it an app, a cloud service or a virtual environment, it all still runs on code. Code makes up software, and software is what makes the hardware work. I tire of watching first Salesforce.com, then Apple iOS developers and some industry trade associations shy away from using the term “software,” as though the word itself were a transmittable virus that causes accessories to fall off.
If you’re selling a program or service that manipulates data or content, no matter how it’s distributed, embedded or marketed, it is software. Without it, that precious bright shiny gadget is nothing more than an expensive brick.
3) Free isn’t forever. Business models are required to keep products and services around, because someone, in some manner, has to make it worthwhile to maintain, update and support them. The user may pay directly out of pocket, pay with attention (advertising), or pay with time, labor and perhaps donations and foundation support (open source and open content). However a product or service is initially fueled by enthusiasm, aspiration or promise, continuity has a price.
Companies that introduce cool new stuff that appears to be “free” without overtly relying on any of the above are trying to gain short-term market share for a longer-term payoff. Or they’re idiots, and you shouldn’t trust your data or content to them if you ever want to see it again.
4) Features aren’t products in the long run. Did you know there was once a thriving market for software that connected computers to online services over phone lines? Or a huge business in selling incredibly detailed, animated screensavers? Procomm Plus and Berkeley Systems’ flying toasters have long since vanished into personal tech lore. But they are only two of many examples of once-hot and much-imitated products in categories that ultimately turned out to be features of other, longer-lived products and services.
Before becoming fully invested, financially or otherwise, in a fast-rolling new bandwagon that others are jumping on, it helps to think through whether a product or service can survive long-term on its own. Or whether the single ability to, say, send 140-character messages will make more sense eventually as a component of other products.
5) Bubbles happen. Personal, digital technology is cyclical. I’ve worked through three boom-bust cycles: packaged consumer software, multimedia CD-ROM and the dot-com era. In all three cases, estimates of immediately addressable market, available customer dollars and perceived product usefulness and quality were generally, and horribly, overstated, yet tacitly accepted by all involved. Of the three, the dot-com boom and bust did the most damage, because digital technology had by then finally reached the mainstream, so the fallout spread just as widely.
The “new normal” never is. And what some are calling Bubble 2.0 (social media and social networks) is more accurately personal digital tech’s potential Bubble 4.0. It won’t be the last, either. When hype leads to hyperactive froth, expect a lather of bubbles. Prepare to rinse and repeat.
Finally, as this Manifesto implies, Practical Nerds do not care whether we are in step with everyone else or with what’s “popular” in the digital world. A nerd should be, by tradition and by duty, out of step with the mainstream, all while tinkering to see whether we, uniquely, can make it better.