Earlier today, Google revealed mockup images and released a video for Project Glass. For those of you who don’t know, Project Glass is one of Google’s side projects that aims to bridge the gap between the user and the user interface of “smart devices” (smartphones and tablets) using a convenient, simple, and ergonomic form factor.
While Google didn’t release a plethora of information on Project Glass, they made it abundantly clear that the project is far from a finalized product. In their Google+ post, they pointed out that they simply created some mockups to “show what this technology could look like” and released a video to “demonstrate what it [the glasses that could evolve from Project Glass] might enable you to do”:
A group of us from Google[x] started Project Glass to build this kind of technology, one that helps you explore and share your world, putting you back in the moment. We’re sharing this information now because we want to start a conversation and learn from your valuable input. So we took a few design photos to show what this technology could look like and created a video to demonstrate what it might enable you to do.
Please follow along as we share some of our ideas and stories. We’d love to hear yours, too. What would you like to see from Project Glass?
From the looks of the video and the mockups, in conjunction with information from a month ago detailing a proposed UI navigation method, it appears that the glasses have a single “hidden” lens controlled by head gestures and eye movements. In addition, the partially blurry moments in the demonstration video (embedded above), along with the position of the lens in the mockups, indicate that the user is only meant to interact with the glasses when a task needs to be accomplished.
If the information above is accurate, the majority of the time the user wears the glasses will simply be spent viewing the world as they normally would, and the glasses will only be used to complete quick tasks (as demonstrated in the video).
However, the actual lens (on the right side of the glasses) appears to be attached to an arm of its own. This could indicate that the lens will reposition itself when needed, and the real-world application could be as simple as looking at or away from the lens to make it move accordingly.
But, as the Google[x] team working on Project Glass stated, just because they’ve released mockups and a demonstration video doesn’t mean that these concepts embody the future product or reveal its capabilities in any way. While it could be some time before we see a production-ready pair of “Google Glasses” born from Project Glass, stay tuned for full coverage of the Google[x] team as it tries to make these concepts a reality.