When it came to launching our creator series – In-Depth – where we dive deep with some of the artists, developers and collaborators we've worked with over the years, one of the first people that came to mind was Mike. As he'll recount below, we first met Mike at Siggraph in 2017, when Mettle, the company he worked at back then, was serendipitously stationed right next to ours. Though I haven't seen Mike in person since that summer four years ago, following him on Twitter keeps me constantly reminded of his enthusiasm and fervor for exploring new technologies. I can't wait for y'all to dive into this interview with him as he recounts a near-lifetime of being obsessed with interactive, creative and holographic technologies. – Nikki
Hey Mike! Can you tell us a little bit about yourself?
Hello, I’m Mike Dopsa from Toronto, Canada! I use digital technologies like VR/AR, Holograms and Depth Cameras for apps, installations and all things interactive. I studied at the University of Toronto and now work at Array of Stars, an Innovation Partner Agency where I focus on Spatial Computing.
How did you get into holograms — was there a particular a-ha moment or was this more gradual?
Ever since I was a teenager, with AVATAR (2009), the RealD revolution and the PS3, I have been obsessed with 3D technology. Growing up loving cameras and videogames, I dreamed of a day when digital worlds could be seen in 3D and real worlds could be captured virtually. 3DTVs and theatres were beautiful, showing how amazing 3D can be when done right, but without parallax it felt limited, and it was clear not everyone was as enthusiastic about it as I was.
In 2015 I was spoiled with a chance to play with the original HoloLens. While I didn’t develop anything for it at the time, I got to experience & learn from a 6DoF, standalone HMD with hand tracking, and brilliantly anchored holograms years before I owned a VR headset.
A few years later I began working with Mettle, traveling to various technical conferences demoing creator tools for 360 Video. While I still love 360 video, I couldn’t help imagining it with 6DoF movement, and started thinking about how capturing the real world into a hologram would be possible.
At the time I didn’t know anything about photogrammetry, so I came up with my own capture method, using four 360 cameras, with known object size and camera positions. (Basically Photogrammetry in reverse). I manually created the 3D mesh, texture projected the images in photoshop and exported my first Hologram captured from real-world images.
A few weeks later at Siggraph 2017, Mettle’s booth was #1211, and our neighbor two booths away at #1207 was none other than Looking Glass Factory, showing off your then-new HoloPlayer One. Walking by their booth for the first time, the hologram on screen literally caught my eye and made me do a double take. Until that moment, I had never seen proper parallax on a virtual object without an HMD. I had never seen a glasses-free, non-eye-tracked 3D display, and I was quite literally dumbstruck.
I stood there, jaw on the floor, head parallax-bobbing left and right for probably a full 30 seconds. It was here that I got to see (AND TOUCH) my homemade hologram for the first time, due to the kindness of the Looking Glass team. I couldn’t believe my eyes, but I instantly believed in the future of holograms and realized they were not merely going to be real in my lifetime, but that I could be a part of creating them & working with them for years to come.
What have you been working on recently — either as a hobby or as part of your work at Array of Stars?
Recently I’ve been working on a few hobby projects that I’ve wanted to make real for a long time, and that are finally becoming practical / possible. One of these is a touch-less, holographic ATM keypad that randomizes the keypad layout for security. If people use hand tracking to enter a passcode or signature in mid-air, it would be very easy for anyone watching to learn your passcode. But if you randomize the keypad layout, it becomes nearly impossible to guess a passcode just by watching hand movements. Bonus: nobody has to change their passcodes - the security is in the system itself.
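The core of the idea can be sketched in a few lines: shuffle the digit layout before each entry, and let the system (which knows the layout it displayed) translate raw key *positions* back into digits. This is a minimal illustrative sketch, not Mike's actual implementation; the function names are hypothetical.

```python
import random

def randomized_keypad(rng=random):
    """Return a keypad layout: index = physical key position,
    value = the digit shown at that position.

    Because the layout is reshuffled for every entry, an observer
    who sees only where the hand moves learns nothing about which
    digits were entered.
    """
    digits = list("0123456789")
    rng.shuffle(digits)
    return digits

def digits_for_positions(layout, positions):
    """Translate the key positions the user touched back into digits."""
    return "".join(layout[p] for p in positions)

# The system displayed `layout`, so it can recover the passcode
# from purely positional hand-tracking input.
rng = random.Random(42)          # seeded only to make the sketch reproducible
layout = randomized_keypad(rng)
positions = [layout.index(d) for d in "1234"]  # user locates and presses their digits
assert digits_for_positions(layout, positions) == "1234"
```

The design point is that the secret never leaves the system: the display and the position-to-digit mapping live on the trusted side, so the hand-tracking data alone is useless to a shoulder-surfer.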
A good half of my hobby ideas are practical like this, and the other half are silly, fun or memes in an effort to not take myself / advanced technologies too seriously. Spreadsheets are critical tools, but flash games are passionate joy - we need both!
At Array of Stars, we’ve been really excited about the accessibility of AR experiences on social media platforms. The pace of development on creator tools in the last few years has taken these platforms from some of the least featured toolsets to some of the most advanced, and they benefit from a huge audience. The more people creating for 3D the better, as each device is merely a new window onto the content. Assets made for a specific social experience can quickly be shown in a Looking Glass or HMD, and they’re generally very light on performance.
What are you most excited to make with Looking Glass Portrait?
I can’t wait to build applications that allow multiple Looking Glass Portraits to work together. Up until this point, I have only used Looking Glass displays for one-off demos or specific installations. But now that my family has Looking Glass Portraits, I can start creating applications or games that they will be able to use practically and even connect with mine, no matter where in the world we are. There are some exciting experiences that feel ripped from high-fantasy magic that I can’t wait to try.
What have you made so far with Looking Glass Portrait?
So far I have been working with the Unreal Engine Looking Glass Plugin to combine virtual production with holograms! UE4 has some incredible Virtual Production tools, including face mocap for 3D characters, which is super valuable to a 3D animator who is creating virtual humans. They just released ‘MetaHumans’, an easy-to-use virtual human creator, and I thought about how useful it might be to see the MetaHuman in 3D while you're animating. Sure, you could use a VR or AR headset, but then your face is covered and you can’t use mocap. You could maybe use a 3DTV & glasses, but you’d still be limited by parallax. But by using a Looking Glass Portrait, you can look at the 3D character in genuine 3D while puppeteering it with your own face.
To me this speaks to the future of 3D creative work, where you can make things inside the medium you’re building for. Today, we can do this with analogue things like Theatre and Dance, where choreography, set and costume are all made and built in 3D, for a 3D audience on a 3D stage. When we do this virtually, we’ll have superpowers like Copy/Paste, scale, and the Internet allowing everyone to work together, from anywhere, in their own role.
How does Looking Glass Portrait differ from other technology you've used so far?
Everyone is used to seeing bright pixels from digital screens like phones & TVs, but they’re always flat. We’re used to seeing 3D depth and parallax in reality, but almost all of that light is reflected. The Looking Glass is both emissive and 3D, something most people haven’t experienced outside a VR headset. Seeing this effect without a special pair of glasses is astounding, and not being limited to a single viewer means you can casually observe a hologram with a friend, with zero setup time!
What's your favorite hologram?
There are so many amazing holograms / scans, and while I love how CGI images look in the Looking Glass, my all-time favorite is this volumetric selfie by Azad Balabanian. Aside from it being an awesome location & image, using a drone + photogrammetry like this is pioneering. In ~20 years, when drones are the go-to travel camera, we will see memories captured in a format that looks amazingly similar to this one.
Your favorite 3D software tool?
Blender. It’s not even the application I spend the most time in or feel the most enabled by, but its dedication to being free, its constant improvement, and its community of learning and extensions are incredible. The value Blender provides is unmatched and essential.
50 years from now, what does the future look like?
The highest quality education is available in the form of a download.
Children use what we now call ‘Immersive Technologies’ to play and learn, with toys built with microprocessors and biometric sensors that respond to their actions, questions and intentions.
Drones have usurped phones as people’s favorite on-the-go camera. The convenience of a drone dramatically outweighs that of a handheld camera: it captures volumetrically, shots can be re-framed at a later time, and unwanted obstacles / pedestrians are instantly removable.
Professional programmers / engineers / artists synchronize their brains with their computers, operating all of the computer’s functions at the speed of thought. They view Kinetic input like Keyboards, Mice and Touchscreens as archaic tools, far too blunt and slow for their needs.
People still watch 2D Movies and shows, still listen to music, and attend live events as they do today (sans quarantine).
What's the day in the life of Mike Dopsa like?
A quarantine day-in-the-life of Mike Dopsa is roughly ~
- Either starts with a run, or a coffee (alternating runs every other day)
- Logon and begin work on (often secret!) Array of Stars project
- Have a quick lunch (usuallllly leftovers)
- Prep for meetings
- Re-approach anything that was stumping me before meetings
- Try to end the work day on an accomplishment / completion! (Not always possible)
- Prep/Cook dinner (alternating days with my partner). Make extra for leftovers
- Depending on the day, watch Overwatch League or a show of some kind (GO Defiant!)
- Open a hobby project, game, or social VR world for the end of the night
What’s on your desk?
- Blue Yeti
- Azure Kinect DK
- Looking Glass Portrait
- Leap Motion Controller
- Playstation 3DTV
- LED strip
- Note 9
- MagSafe / iPhone 12 Pro
Mike Dopsa is an Interactive Technologies Developer at Array of Stars, an Innovation Partner Agency based in Toronto. You can find his work on his website and by following him on Twitter.