Hi! I'm Bryan Brown and I'm a Technical Specialist here at Looking Glass Factory.
One of the most special events in the XR space is the MIT Reality Hack. The Reality Hack is a multi-day hackathon event hosted at MIT by a group of volunteers and the VR/AR club.
We were super excited to attend this year’s Reality Hack, which took place March 23rd-27th at MIT. This was my second Reality Hack; I first attended the 2020 Reality Hack as a mentor for Unreal Engine and Project North Star, and I absolutely knew that Looking Glass Factory had to attend this year!
This was the first time we showed off our Looking Glass 4K Gen2 and Looking Glass 8K Gen2 systems in public and it was so incredible seeing folks interact with our newest generation Looking Glass displays.
Day 1 - Workshops & Inspiration!
The first day was full of excitement as folks arrived at the hackathon, registered, and started attending workshops! We gave workshops on Unreal Engine and on Looking Glass in actual MIT lecture halls!
The truly exciting thing about MIT Reality Hack isn’t just being able to show folks the cool products and projects we’ve been working on but also giving them the opportunity to use Looking Glass displays to learn and create their own projects. We ended the first day of the hackathon with a wonderful opening ceremony alongside introductions from fellow sponsors and presentations from guest speakers, all culminating in the team formation process.
Day 2 and 3 - #HackToTheFuture
Let the Hacking begin!
With workshops completed and teams formed, everyone got straight to work on building their projects. With 60 total teams, folks were using products from our fellow sponsors, including the HoloLens 2, Snapchat Spectacles, Magic Leap, and Looking Glass Portrait – it was a full cross-platform hackathon!
A total of six teams used Looking Glass Portrait or Looking Glass 4K Gen2 to build their projects. Most of these teams used Unity and the latest release of our HoloPlay Unity Plugin (now at version 1.5)! Their projects ranged from a holographic emotional mirror to local news from the future delivered by an avatar using local data from the internet! All of the participating teams were incredibly innovative and energized about bringing their projects to reality.
Oliver and Bryan from Looking Glass Factory assisted the Looking Glass teams as mentors to help them through technical challenges with tools like Unity, Unreal and Blender! No time for errors during a hack!
Missed the hackathon this year? Feel like you were there with this 360° video filmed by Jared Bienz, a member of the HoloLens team at Microsoft and author of ReGlass and Refract for Looking Glass.
Day 4 - Closing Ceremony and Prizes!
As judges and mentors, the hardest part of the Reality Hack is selecting the winning projects. Each and every team developed something truly special at the hackathon, and below are a few of our favorites that used Looking Glass displays as part of their project ↴
Terrariam: The winning team, comprising Ezekiel D’Ascoli, Madeline (Maddy Van) Hulse, Michael Steinberg, and Elizabeth Sheffield, created a unique, holographic experience of a user’s emotional state. By measuring real-time emotional data, Terrariam explores what it means to physicalize an emotional headspace as a three-dimensional landscape, encouraging personal growth, acceptance, and attachment to community.
AI witness news: Kahlil Calavas, Andrew McGregor, and AudreyLaneHalo created an AI-enabled holographic avatar that lets people ask how others in a specific geographic region of the world feel about a current event.
Reach Out and I Will be There: Liz Newton-Tanzer and Lisa Szolovits were drawn to working with the Looking Glass as a headset-free 3D device. "We liked the idea that the Looking Glass could exist as a 'charmed object' that lives in your home and invites low-commitment XR interaction without complicated setup," said one of Reach Out's team members.
In this project, when the user reaches their hand towards a field of dynamic particles floating inside the Looking Glass, the particles converge towards the user’s hand, then re-diverge into a fluid point cloud representation of the user’s Loved One. The user can subtly interact with the particles that make up the ethereal representation of their Loved One, causing visual shifts that create a sense of embodied contact not possible with a traditional photograph, video, or even synchronous video chat.
Read more about their project in their Devpost submission here.
We were so thrilled to attend this event, especially after two whole years of working remotely, and we hope that through this post and some of our videos, we've been able to teleport some of the event's infectious energy to you. (We miss it so much already, can't you tell?)
The MIT Reality Hack wouldn't have been possible without the amazing folks at the MIT VR/AR club and the Reality Hack organization, so make sure to give them a follow @MITRealityHack on Twitter. And if you want to follow along with our journey, follow us at @LKGGlass! See you in 2023!
-Bryan, Arturo, Max, Oliver & the rest of the Looking Glass Factory team 👋🏼