We are all Big Brother

And we are all Winston Smith.

Those, of course (or maybe not of course, if you haven't read the book), are aspects of one of the great novels of all time – 1984, written by George Orwell in 1948 (hence, it's said, the transposed digits of the title) and published in 1949.

I have read it a couple of times – the first in high school, the second after college. I appreciated the second read more than the first. Not sure if that was because I was older, or because it was my choice and not part of a course curriculum. I was fascinated by the hyped idea of predictions, and sadly disappointed when the first read turned up no outright predictions. It wasn't until the second read that I saw the predictions for what they were – amazingly insightful narratives put forth by Mr. Orwell.

I think for a lot of people the power aspect was Room 101, Big Brother and the spin used in selecting words for description – the Ministry of Love (read: torture), Newspeak, etc. Yes, those three main aspects were controlling and all-powerful – in an omnipotent kind of way.

Looking back on the novel from where we are now – 30 years past the time the novel took place, 65 years after it was first published – I see some other, more disturbing predictions. Here are a couple:


The premise in the book of 'erasing' someone from history. No evidence, no existence. Orwell based this aspect on a Stalinist practice, where photos were retouched to remove people who had fallen out of favour (read: killed). For the longest time this was in the hands of experts – people who could do it with skill. Not today: anybody with Photoshop or any online raster-based image editor has access to a massive erase button. And it's not limited to still images. Video is fast becoming the next realm for adding and deleting content.


Winston Smith's neighbour (Parsons) was turned in to the authorities for having thoughts against the state (while he was sleeping, no less) – by his own daughter. He tells Winston that he is relieved to have been caught and to have his thoughts corrected. We don't need people we know to do this anymore. It can be, and is, anyone. We have smartphones and CCTV feeds that allow any of us to collect data (ideas and photos) and distribute it without any inkling of understanding of the situation being reported. Some of it makes its way to group-judgement websites. Now, I'm not defending or condemning these sites – these are just examples:

You park like an asshole.

People of Walmart.

Clients from hell.

We are all in the same boat – we are all Big Brother, judging and being a voyeur (and recorder) of the lives and apparent missteps of others (guilty-pleasure confession – I love the sites listed). And we are a text or photo away from being Winston Smith – unaware of that telescreen behind the painting.

I think it’s time to read 1984 again.

What’s your biggest social media (big brother) fear?



Talkin’ about my generation

Apologies to Pete Townshend for stealing this headline.

Technology adoption age (the comfort age people need to reach in order to adopt a current technology/ideology) is changing at a very rapid pace. There was a time when generational changes in technology took years (if not decades). Some changes are hampered by society's ability to accept the paradigms, others by piggyback technologies – e.g., the internet is what it is today because of other technologies like broadband, server capacity, etc. It's kind of a leapfrog exercise.

In the past 100-150 years I’d say that the length of a technology generation has decreased dramatically.

The introduction of movable type was the biggest turning point in the history of our world (and more specifically, of communication) – hell, TIME® named Mr. Gutenberg the top guy of the millennium that ended in 2000. But in terms of today's technology, I'll point to the telephone as our current starting point. The changes measured over its first 100 years took it from a novelty to a business and household staple.

Major technology shifts happened along the lines of people generations – about 10-15 years. What I mean is that you could go that length of time without upgrading and not fear falling behind. So the adoption age for this – and other technologies – had a good span. If you had used a telephone in 1945, and the next time you used one was in 1960, chances are it wouldn't be a huge challenge. Your ability to adapt would still fall within the technology adoption age.

The times they are a-changin' (same apologies to Bob Dylan)

I would say that current technology generations are no longer measured in decades, or even years. I think a closer measure is mobile devices. It really wasn't all that long ago that people walked around with mini-fridges for cell phones. Then they got smaller, and then they got really small. We added cameras and texting. Our phones became smart; they stopped having physical keypads. You could surf the web, send an e-mail, find a date – finally have that video call that was promised all those years back (think Dick Tracy). The end result is that the technology adoption age keeps shrinking.

It is most evident in dealing with parents or grandparents – my Dad has trouble with the universal remote I bought for him. Twitter, therefore, is beyond comprehension – not just as technology, but as a concept. Parents are sad time machines of ourselves. It'll be us in no time – not understanding, not getting it.

It doesn't have to be that way

Technology need not be the beast that does this. Technology has the ability to be as invisible as the designer/developer wants it to be. A good current artifact that does that is the iPad 2 (offered up by a quaint little fruit company in California). The iPad offers a glimpse of what is possible from a technology standpoint – the ability to use the device without having to tinker. The number of kids and elderly people who love using it should be testament enough.

It allows the user to focus on the concept of things: that they can e-mail, text, surf the net, etc. Are these tablets perfect? Nope, but they are also a work in progress. As more and more app developers create experiences that match those of the iPad, we are going to be adding length to the technology adoption age. We just have to work on getting the concepts across. That's all!

Swiss Army Electronics

To quote David Mamet's Heist, "I never liked the Swiss." I should clarify: I don't like the Swiss Army™ knife. Oh sure, I own and use one. When I do use it, it's for very simple, limited tasks. I might use it to open a resistant package of JuJubes, or the occasional bottle of beer. But I'm not doing anything overly productive with it.

Welcome to the world of current/future technology – the everything device

Electronics have had this odd place in the world of tools – they can be whatever they're molded into, usually (but not always) based on an existing artifact; e.g., DSLRs look the same as film SLRs, but are really nothing but computers that take pictures. Wherever any area of task and electronics overlap, one function is never enough. Why is that?

I blame Mr. Coffee™, the 1970s device that changed the world. Up until that point coffee was made in a percolator, a time-consuming process. Along comes a gentleman (Mr. Coffee), who says look how easy this is now, and ta-da – advancement. Clearly that's not the end of the story: we added a clock, which led to a timer, added water directly, and now we can set it up to have that cup of java ready when we wake. On top of that we've added a lot of extras to this device – the latest incarnation has a barcode scanner and will endeavour to make unique beverages like espresso. While the overall function has stayed the same, the electronic complexity now built into a coffee maker would be the envy of circa-1960 NASA.

The same can't be said for computers, smartphones and now tablets. These devices ultimately have no primary function – sure, we can use any one of them for a focused task, but we prefer when they Swiss-Army-knife it. As stated earlier, they're electronics; they can be whatever we want them to be, and we want them to be everything. The call for more starts before we even have the device in our hands. Whatever the specs of a rumoured, upcoming digital artifact, they're never enough. If it doesn't have a camera, it needs one. It has a camera? Well, that'll need a flash. What, no video? Oh, no HD? Can I upload the video after I shoot; during the shoot; how about before I shoot? I'll need to edit that video, do some post, add some titles, nifty transitions. At what point is enough, enough?

I’m guessing never–it will never be enough. Which is both a good and bad thing.

It's good because the advent of artifact convergence helps streamline things: instead of walking around with a music player, a phone and a day planner, we can travel with a smartphone (which puts a plethora of other tools at our disposal). It's a bad thing because combined tools are never as effective as the single-purpose tool – visit any construction site and see how many Swiss Army knives are being used. And that is especially true in the world of digital. I'm typing this on my iPad – because I can. It affords me the ability to work on this post just about anywhere. The drawback is that it is taking me longer to type, because its keyboard is not as good as the single-focused keyboard I use at the iMac.

Am I saying to get rid of artifact convergence? No. What I am saying is to accept the downside. Multiple-function devices are good in a pinch, but they are no substitute for the real thing.