It is hard to believe that just one year and one week ago, Google uploaded a teaser video to its YouTube channel for a project it had been working on out of its top-secret “moonshot” Google [x] lab, called Project Glass.
The project was ambitious: an augmented-reality layer over your very life, answering questions before you even asked them, and all around simplifying your life.
Two months later, at Google I/O 2012, Google staged a “demonstration” involving a blimp, skydivers, BMX trick bikers, and more to show off the device, which at the time seemed like little more than a network-connected GoPro camera. They then asked developers who were interested in an early look at the technology whether they would be willing to fork over $1,500 for the chance to be among the first non-Google employees with this whole new class of technology.
Two thousand other attendees and I happily stood in a long line to put down our commitment to try it out. Then months of agonizing waiting began: waiting for a future that was so close we could taste it.
Over the rest of the summer, news came slowly, but I and the rest of the Glass Explorers (that is what they call us) waited with great anticipation. Finally, in January, another video was uploaded answering the question that developers at I/O most wanted answered: what sort of API would there be so that developers like me could write applications for Glass? It was called the Mirror API, and it was REST-based, which means that rather than writing software that you install on the device itself, you write web software that runs on a server and communicates with the device through Google’s servers.
This was great news for developers, especially web developers like myself, because REST APIs are inherently language agnostic. I can write in any programming language, instead of the specific one a specific device demands, as is the case with traditional mobile development. I was eager to get my hands dirty and start coding, and fortunately for me, Google gave me that chance.
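To make the “web software talking through Google’s servers” idea concrete, here is a minimal sketch in Python of pushing a text card onto a wearer’s timeline. The endpoint and card shape follow the Mirror API’s documented timeline format, but the access token is a placeholder (a real Glassware app would obtain one through OAuth 2.0), and the exact menu-item behavior shown is an illustrative assumption:

```python
import json
import urllib.request

# Mirror API timeline endpoint; ACCESS_TOKEN is a placeholder -- a real app
# would obtain a token for the user via Google's OAuth 2.0 flow.
MIRROR_TIMELINE_URL = "https://www.googleapis.com/mirror/v1/timeline"
ACCESS_TOKEN = "ya29.placeholder-oauth-token"

def build_timeline_card(text, read_aloud=False):
    """Build the JSON body for a simple text card on the wearer's timeline."""
    card = {"text": text}
    if read_aloud:
        # menuItems let the wearer act on the card, e.g. have Glass speak it.
        card["menuItems"] = [{"action": "READ_ALOUD"}]
    return card

def insert_card(card):
    """POST the card to the user's timeline (requires a valid OAuth token)."""
    req = urllib.request.Request(
        MIRROR_TIMELINE_URL,
        data=json.dumps(card).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    # This would fail without real credentials; shown only for shape.
    return urllib.request.urlopen(req)

card = build_timeline_card("Hello from the Mirror API!", read_aloud=True)
print(json.dumps(card))
```

The point of the design is that none of this code runs on Glass itself: any language that can speak HTTP and JSON can build Glassware.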
Just over a week later, on January 28th at Google’s San Francisco office, Google hosted a two-day event they called the Glass Foundry. I and 200 other developers (and, a week later, 200 more at Google’s New York City office) were invited to take what Glass Developer Advocate Timothy Jordan called “a vacation with Glass.”
For those two glorious days I got to wear Google’s Project Glass, and more importantly, I got to use the Mirror API to write my first Glass application. Together, my teammates Jake Weisz and Nathan Buth and I coded and hacked into the early morning hours, eagerly working with Google’s latest API on a gadget we just knew would change the way we interact with information.
During my time wearing Glass, I was surprised by just how much of the original “One Day” video was already possible. Things that I really thought would simply not be possible yet were not only possible but effortless, and the device quickly began to feel like an extension of me rather than something I was using, in a way that no cell phone or computer ever has.
In just two days, interacting with Glass became such second nature that when I reluctantly returned the device they so graciously lent me, for days afterward I found myself looking up when I wanted to know simple things like the time, or the answer to a question, only to find the answer was no longer right there, ready and waiting for me.
The only thing about the event I didn’t really like (though I understand it completely) is that I was not permitted to talk about anything that happened there, or about Glass itself. Being the talkative guy I am, this was hard: waiting, again, to talk about what I believed to be the most amazing piece of equipment I’ve ever had the privilege of using.
In the meantime, while I waited, the detractors started coming out.
Maybe you’ve seen these people around San Francisco or Mountain View, inevitably staring off into space while swiping the sides of their glasses during conversation, ignoring those around them while surfing the web or scrolling through images they’ve captured with the device. I like to call them “Glassholes.” — TechCrunch
Articles like this bugged me because they fundamentally missed the point. Google Glass is not about continuing the trend of people staring at their phones instead of participating in the world. Google Glass is about giving you the information you need without making you look away from what you are doing. It’s about integrating that information into your life, about taking someone who was looking down at a phone and encouraging them to tilt their head up and look at the world.
The small floating image is purposely in your peripheral vision, not in your line of sight; it is specifically designed to keep your attention on your surroundings. This is why many of the videos and demonstrations Google has shown us don’t focus on the information Glass can give you. They focus on people living their lives, doing their jobs, spending time with their families, and having way more fun than I ever do.
Will having Glass make me take up skydiving, or encourage me to trek to Thailand for authentic Thai food? Probably not. But when I’m in the moment, living my life, and there’s a question I need answered, Google Glass will be there, ready with the answer. When the moment I’m living in becomes that perfect moment, Google Glass will be there, ready to capture it. No fumbling for a phone to look something up, pulling me out of the moment; no digging out a camera and starting it up in hopes that I can get it on in time.
At the Glass Foundry, Timothy Jordan described it in one sentence better than I have in the last several paragraphs: “By bringing technology closer, we can get it out of the way,” a statement he and others have repeated at events since.
So when will I have my Glass? Very soon. Last night, Google sent an email to those of us who joined the Explorer Program back at Google I/O 2012, saying that the first batch of Glass is off the production line and they are ready to start getting units into the hands of Explorers. They don’t have enough for all of us yet, but they have some, and that’s exciting news.
I know several people (many of whom I met at Google I/O or at the Glass Foundry) who have already received the second email, the one indicating that their specific device is ready, and some of them will have their Glass in hand as early as today. Unfortunately, I have not received that second email yet, which is a strong indication that I will not be in this first batch. Hopefully the second batch will follow soon, and I’ll get my Glass shortly after.
Also last night (they’ve been busy), the Glass developer team released the documentation for the Mirror API, so that developers everywhere can start dreaming up applications of their own and begin working on them.
The future of technology looks great, and my future looks great, on the other side of the Glass.