Every year in Las Vegas, the world of consumer electronics comes crashing together at CES, a trade show demonstrating the latest in digital technologies.
This year, we at Looking Glass Factory wanted to make something special to show off the new 8K Holographic Display. The things we were eager to share were:
- how beautiful digital holography can look when at absurdly high resolutions
- how physically evocative and truly magical holograms at this scale can feel when properly integrated into an environment
This is what eventually shook out:
The software development was split between myself and Oliver Garcia-Borg, and we got fabrication support from Evan Kahn and sound support from Corey Bertelsen.
Oliver is a developer here at Looking Glass Factory. He’s intimately familiar with the ins and outs of making things look beautiful on the Looking Glass. My understanding is he does this through his familiarity with the render stack, as well as a solid command of lighting and scene composition. (If you’d like a glimpse into some of his insights, you can read our Design Guidelines, which he’s made many contributions to.)
Evan is, among many things, a hardware and electronics tinkerer and software developer. He builds out the low-level stuff that many of our other tools rest upon. Fortunately for this project, he also has a lot of experience with electronics hacking.
Corey is a sound maestro — and what I can say about sound is that it’s one of those things I forget to plan for until I very urgently need it. Corey knows how to deliver amazing effects and is a solid dev as well.
Lastly, my background focuses more on the design of physical 3D spaces and on interaction design. The main thing that fascinates me about working with Looking Glass tech is how well it connects 3D digital spaces with the physical world around them. Here’s a hackathon project that illustrates this fascination by overlapping phone AR and the Looking Glass. And here’s an old post showing an early version of the flashlight effect with the Volume, an older-generation Looking Glass product.
By combining the diverse skills represented in our team, we were able to produce this demo for CES, and I think I can speak for all of us when I say we’re quite happy with the result.
So, how does it work?
When footage of the demo made the rounds on the internet, there was a bit of commentary speculating about how it worked, some of it right and some of it less so. Here’s a technical overview of what was going on to achieve this effect, broken down by its component technology pieces.
Looking Glass 8K Technology (the display)
First off, the Looking Glass 8K is not a TV — it’s a hologram display. Can’t really blame folks for not being familiar with our tech because it’s so new. That said, here’s an overview to help you acquaint yourself if you’re unfamiliar:
This particular piece of technology is the 8K display, our latest iteration of the tech. Though it is new, the underlying principles are the same as in the other sizes of the display, and for those who are curious, our docs site has an in-depth technical explanation of how the displays work.
One of the design challenges of the 8K display is that it comes in the form factor of a TV, so people expect to interact with it only through their eyes, deactivating the rest of their body. We want people to approach Looking Glass content as they would a piece in a museum: walking up to it, appreciating different details from different angles.
So that’s where all this other technology comes into play.
VR Tracking Puck
The Looking Glass presents 3D holograms using passive techniques — meaning it only needs a computer and requires no glasses. It simply presents a hologram to anybody who happens to enter the view cone, without knowing anything about the physical state of the space around it.
That said, just because it doesn’t read 3D data from a room doesn’t mean it can’t.
By borrowing technology components from the world of Virtual Reality, we were able to weave the physical world into the Looking Glass. The technology we borrowed from VR is the Vive Tracker, pictured here:
In a VR context, trackers are used to aid with prop tracking or motion capture (body tracking). The way it works is that you attach one of these to a prop (say, a plastic tennis racket), and the prop can now be used in VR, because the computer has a precise read on the position and orientation of the puck and, by extension, the prop.
With this component we could track a user’s behavior as they stood in front of the Looking Glass. That let us design interactions that required physical engagement with the content, nudging people to stop thinking of the Looking Glass as an advanced TV and start thinking of it as something entirely new.
So now all we needed was a prop to attach this tracking puck to…
We evaluated numerous flashlights based on a handful of criteria, including:
- Size relative to the tracking puck and the Looking Glass
- Compatibility with tracking
- Tactile quality of the button press
- How easy it would be to hack
Eventually we landed on using a Maglite weighted with two D-cell batteries. It was about the right size and weight, and it performed well in our tracking tests. Unfortunately, it was slightly trickier to hack into than cheaper plastic flashlights, but this shortcoming was more than made up for by the huge and prolific Maglite flashlight-hacking community on the internet (who knew!?).
So we drilled a mounting screw into the back of the flashlight and attached the tracking puck. To account for it in the software, I piped the tracking puck’s data into Unity (the game engine this was built in) and attached a spotlight object as a child of the tracker’s position, offset by the length of the flashlight.
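That parenting-plus-offset step amounts to transforming a local offset by the tracker’s orientation. Here’s a minimal Python sketch of the underlying math, for illustration only: in the real project Unity’s transform hierarchy does this automatically, and the flashlight length below is a made-up value.

```python
# Sketch of the pose math: place a virtual spotlight at the tip of the
# physical flashlight, given the tracker puck's position and orientation.
# (Illustrative only; the project itself parents a Unity spotlight to the
# tracked object. FLASHLIGHT_LENGTH is an assumed value, not the real one.)

FLASHLIGHT_LENGTH = 0.25  # meters from puck mount to flashlight tip (assumed)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v); result = v + w*t + cross(q.xyz, t)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

def spotlight_position(tracker_pos, tracker_rot):
    """Offset the tracker's position along its local forward axis."""
    offset = quat_rotate(tracker_rot, (0.0, 0.0, FLASHLIGHT_LENGTH))
    return tuple(p + o for p, o in zip(tracker_pos, offset))
```

With an identity rotation the spotlight simply sits one flashlight-length down the z axis from the puck; as the user waves the flashlight around, the offset rotates with it.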
From the beginning, I knew I wanted the physical button of the flashlight to activate events in the Looking Glass; I just didn’t know how hard it would be. I suspected the circuit could be very simple, but I was worried that the tracking puck’s electronics would require an MCU (like an Arduino) and a handful of resistors, which I wanted to avoid.
So I asked Evan to take a look, and he found that the puck would send events to the computer whenever two readily exposed pins were connected to or disconnected from one another. This simplified everything greatly! All that was required to send events to the computer was connecting the tracker’s pins to the two ends of the flashlight’s switch, which Evan did in a jiffy. No MCUs or resistors required!
Once we got this working, we worked with Corey to integrate some good wholesome click sounds so that the user’s physical action of pressing the button was amplified in the digital environment with the sound of a flashlight clicking.
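On the software side, the resulting logic is essentially a tiny state machine: a pin connect turns the virtual light on with a click, a disconnect turns it off. Here’s a hedged Python sketch of that idea; the event hook, class, and sound-file names are placeholders of mine, not the project’s actual API.

```python
# Toy sketch of the flashlight logic (not the project's real code).
# The Maglite's latching switch closes or opens the circuit, which the
# tracking puck reports to the computer as a pin connect/disconnect event.

class Flashlight:
    def __init__(self, play_sound=print):
        self.on = False
        self.play_sound = play_sound  # placeholder audio hook

    def handle_pin_event(self, connected):
        """Sync the virtual spotlight with the physical switch and play
        the matching click sound on each state change."""
        if connected == self.on:
            return  # ignore duplicate events
        self.on = connected
        self.play_sound("click_on.wav" if connected else "click_off.wav")
```

The click sounds fire only on actual state changes, so the user hears exactly one click per physical press, reinforcing the action in the digital environment.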
Once you put a flashlight into someone’s hand, it turns out they instinctively know what to do with it: turn it on and go exploring, not just with the eyes but with the hands and the whole body. Once that flashlight is in hand, it’s impossible to think of the TV-form-factor thing as a TV anymore. The body begins to feel what it means to engage with an interactive hologram, and the viewer can finally see that this experience is truly unique.
Ambient Smart Lights
The last technology worth noting is the smart light. In some ways this tech takes a back seat because it’s not the main feature in the demo, but I believe it still played an important role in tying together the digital space with the physical space of the room.
In this demo, there are three different scenes: a forest, a cobblestone path, and an astronaut walking in a space station on Mars. Each scene has its own color palette and soundscape, and the demo presenter can press a button to transition between scenes.
To get a sense of this effect, look at how red the Mars scene is in the video at the top of this blog post, and compare that to how blue the walls are in this video:
Along with the ambient sound, the ambient lighting helped tie the digital space to the physical space. The transitions established that not only was the content inside the Looking Glass changing, but the quality of the room itself was changing as the viewer was transported from location to location.
Additionally, when the walls, the viewer’s hands, and the flashlight reflect the colors of the environment, it makes the presence of the flashlight (a hot-yellow light machine) all the more significant.
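Keeping the room in sync with the active scene comes down to a small mapping from scene to ambient color, pushed to the smart lights on each transition. Here’s an illustrative Python sketch; the post doesn’t name the smart-light product, so the palette values and the `set_light_color` hook are stand-ins of mine rather than any real API.

```python
# Sketch of syncing the room's ambient light to the active scene.
# SCENE_PALETTES values and set_light_color() are illustrative stand-ins;
# a real deployment would call the smart light's actual API here.

SCENE_PALETTES = {
    "forest": (40, 120, 60),       # mossy green (assumed values)
    "cobblestone": (70, 90, 140),  # cool blue
    "mars": (180, 60, 30),         # dusty red
}

def on_scene_changed(scene, set_light_color):
    """Push the new scene's ambient color to the room's smart lights."""
    r, g, b = SCENE_PALETTES[scene]
    set_light_color(r, g, b)
```

When the presenter presses the transition button, the same event that swaps the hologram and soundscape fires this handler, so the room changes color in step with the content.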
So there you have it. The smart light provided a transformable ambient space just waiting to be interacted with, the modified flashlight gave the viewer the means to interact with that space, the tracking puck kinesthetically connected the user to the experience, and the Looking Glass anchored everything with a super pretty 3D hologram to look at and interact with.
In addition to all the tech, the team was so inspiring to work with: both the folks mentioned in this post and the many others on the team coordinating all the crazy logistics of bringing pallets of tech to the desert and ensuring the right people got to see it.
The tech and the team were a blast to work with, and altogether they provided the opportunity to create a new and completely jaw-dropping experience.