In this post, I’ll be teaching you how to take a photo you’ve taken on an iPhone 7+, 8+, or X and drag and drop it straight into your Looking Glass to view it in 3D.
“What? Impossible!” you exclaim, in disbelief. Oh but it is possible — and I’ll tell you why before I get into how you can (very easily) do it yourself.
“What is it?”
Stereoscopy is a technique that creates an illusion of depth using flat 2D images. Most stereoscopic methods use a pair of images that are offset for the left and right eye of the viewer, which combine in the mind to create a 3D image. You might recognize this practice in the use of red and blue glasses that are used to look at anaglyph images and make them three dimensional.
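As a quick illustration of that pairing idea, a red-cyan anaglyph can be made by taking the red channel from the left-eye image and the green and blue channels from the right-eye image. Here's a minimal Python sketch (the function name and the tiny synthetic test images are mine, just for illustration):

```python
import numpy as np

def make_anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Combine a left/right stereo pair into a red-cyan anaglyph.

    Both inputs are H x W x 3 uint8 RGB arrays of the same shape.
    The red channel comes from the left eye; green and blue from the right.
    """
    anaglyph = right.copy()
    anaglyph[..., 0] = left[..., 0]  # red channel from the left-eye image
    return anaglyph

# Tiny synthetic pair: a solid red "left" image and a solid blue "right" image
left = np.zeros((4, 4, 3), dtype=np.uint8); left[..., 0] = 255
right = np.zeros((4, 4, 3), dtype=np.uint8); right[..., 2] = 255
combo = make_anaglyph(left, right)  # every pixel ends up red + blue
```

Viewed through red/blue glasses, each eye filters out the channel meant for the other eye, and your brain fuses the two into one image with depth.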
“What does it do??”
StereoPhoto Maker allows users to create interlaced media or anaglyphs (images and video) so that they can create their own 3D content. Users can also generate depth maps based on the two images, similar to the depth maps embedded in the metadata of portrait mode photos and used to create Facebook 3D photos.
“Why is this important???”
One of the developers of StereoPhoto Maker, Masuji Suto, created a few HoloPlay.js Looking Glass Web Applications that allow you to pull some cool things into your Looking Glass for immediate viewing. I'm going to focus on the application called “Dual Lens iPhone portrait viewer”, which is the tool you'll use to view your iPhone photos in 3D. Put simply, Mr. Suto has written an application that reads the depth data in the metadata of an iPhone photo taken in portrait mode and outputs it as an image that can be viewed in 3D on the Looking Glass. Sort of like how the left and right images are interlaced to create 3D in traditional stereoscopy, the depth map and RGB images are read together to create 3D in the Looking Glass.
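To give a flavor of how an RGB image and a depth map can be combined to fake new viewpoints, here's a minimal Python sketch of naive depth-image-based rendering. The function name and shift logic are my own illustration, not Mr. Suto's actual code (which does this in a shader):

```python
import numpy as np

def synthesize_view(rgb: np.ndarray, depth: np.ndarray, max_shift: int = 4) -> np.ndarray:
    """Naively shift each pixel horizontally in proportion to its depth
    value to approximate a new viewpoint.

    rgb:   H x W x 3 uint8 image
    depth: H x W float array in [0, 1], where 1.0 = nearest to the camera
    """
    h, w, _ = rgb.shape
    out = np.zeros_like(rgb)
    for y in range(h):
        for x in range(w):
            shift = int(round(depth[y, x] * max_shift))
            nx = x + shift
            if 0 <= nx < w:
                out[y, nx] = rgb[y, x]  # nearer pixels move farther
    return out

# A 1 x 8 strip with one red pixel, uniform mid-range depth
rgb = np.zeros((1, 8, 3), dtype=np.uint8)
rgb[0, 2] = [255, 0, 0]
depth = np.full((1, 8), 0.5)
shifted = synthesize_view(rgb, depth, max_shift=4)  # everything shifts 2 px
```

Repeat that with a range of shift amounts and you get the many slightly offset views a lightfield display needs.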
I’m not going to get too detailed here, but I figure it would be nice to have a basic idea of how Masuji does this, since it’s a bit different than how we at Looking Glass Factory display 3D on the Looking Glass.
The HoloPlay Unity SDK, as an example, first renders all of the individual views of the three-dimensional scene before copying those views to a quilt. From there, the quilt is processed by a shader into the lightfield. Mr. Suto takes the RGB + depth input and outputs it directly into the lightfield. That's why you can drag and drop the image into the application and see it in the Looking Glass almost immediately. If you want to understand how he does this, you can check out his code directly on the website, where he provides a zip of the source project with all of his code.
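For the curious, the quilt step can be sketched in a few lines of Python. This is my own illustration, assuming a 5×9 layout with views ordered left-to-right, bottom-to-top; the SDK actually assembles and processes the quilt on the GPU:

```python
import numpy as np

def make_quilt(views: list, cols: int = 5, rows: int = 9) -> np.ndarray:
    """Tile cols * rows rendered views into a single quilt image.

    views: list of H x W x 3 arrays, all the same shape, ordered
    left-to-right then bottom-to-top (view 0 at the bottom-left).
    """
    h, w, c = views[0].shape
    quilt = np.zeros((rows * h, cols * w, c), dtype=views[0].dtype)
    for i, view in enumerate(views):
        col = i % cols
        row = rows - 1 - (i // cols)  # bottom row holds the first views
        quilt[row * h:(row + 1) * h, col * w:(col + 1) * w] = view
    return quilt

# 45 tiny dummy "views", each a 2x2 solid tile filled with its own index
views = [np.full((2, 2, 3), i, dtype=np.uint8) for i in range(45)]
quilt = make_quilt(views)
```

The lens calibration shader then decides, per screen pixel, which quilt tile each sub-pixel should sample from.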
“Oh that’s rad!”
Yeah, it's cool, and I'm gonna show you how to use it.
Setting up Dual Lens iPhone Portrait Viewer
1. Connect your Looking Glass to your Computer
If this is your first time connecting a Looking Glass to a computer, visit our documentation site now :)
2. Download Google Chrome
All Looking Glass web apps are currently Chrome only. If you don’t use Google Chrome, I’m really sorry but you’ll have to download it to use it now! If you already use Google Chrome, you can forget I said anything.
3. Download our web drivers
Now you need to download the appropriate web drivers for your computer. Some websites natively support the Looking Glass, but you still have to download and install these drivers for them to run properly. You can test your installation with our calibration tester here.
4. Open up Dual Lens iPhone Portrait Viewer
After you've successfully installed the drivers, go to the Looking Glass Web Applications site (using Google Chrome) and click on the “Dual lens iPhone portrait viewer”. Pull that window into your Looking Glass display and expand it into full screen mode. For a quick test, email a photo to yourself that you've taken in portrait mode. Don't AirDrop it or send it to yourself through a communications app like iMessage, Slack, WhatsApp, etc. — I've found that those routes can strip the depth data from the iPhone image. Emailing the image directly from phone to computer preserves it, and Google Drive and Dropbox also work great.
5. Pull in your first Portrait mode image
Once you’ve downloaded your test portrait mode image, simply drag and drop it into the browser window open in your Looking Glass. If you see bold Japanese characters instead of a glorious 3D scene, that means the photo you pulled in doesn’t have depth data for the application to read.
In the words of Aaliyah, “dust yourself off and try again” using a different portrait mode image.
Once you’ve got a 3D photo showing up in the Looking Glass, start messing around with the depth, focal plane, and positioning of the image:
- Change the focal plane with the left and right arrow keys
- Change the depth with the up and down arrow keys
- Re-position the photo with mouse click and drag
- Zoom in or out of the photo with two finger scroll (track-pad) or mouse wheel scroll
Best Practices for taking iPhone photos for the Dual lens iPhone portrait viewer
It is my utmost pleasure to provide you with the best practices for taking incredible iPhone 3D images in the Looking Glass. I emailed every portrait mode image I've ever taken to myself, set up a two-day photo shoot, and took selfies in all sorts of lighting in order to figure out exactly what creates the best 3D image using Masuji Suto's Looking Glass Web Application. This is all for you.
I've created a Google Drive folder with images that you can download and drag and drop alongside this post to see sample images of what I've found to work and not work using the Dual Lens tool.
It's good to first understand that, while you might have taken a legendary photo of some beautiful vista in the mountains, that alone doesn't mean it'll easily translate into a dope 3D photo in the Looking Glass. You'll have to take photos (or search your album for photos) that have a couple of qualities I've found intrinsic to a really good 3D shot for your Looking Glass.
Back-facing vs. front-facing camera
There's been a lot of talk on the internet about whether the front-facing or back-facing camera is better at grabbing depth, with the majority saying that the front-facing wins. The back-facing camera creates depth using a dual-lens camera, while the front-facing (the selfie camera) uses something called TrueDepth, a system that relies on an infrared emitter and camera. The back-facing camera is better at taking photos of objects and reading depth at further distances, while the front-facing camera creates a smoother, more seamless depth map for selfie-style portraits.
Thinking with Depth
When taking a photo to view in the Looking Glass, you’ll need to adjust to the idea that you’re not only shooting for a great shot, but also a great depth map. Great depth maps have clean, smooth edges and follow the depth logic of the shot you took.
This means that if you took a photo of your cat in front of your computer (coincidentally like this example pair), your depth map should have your cat as lighter colors in the depth map gray scale gradient, your computer a bit darker (because it’s further away), and the absolute background of the photo as black. I’ll go into how you can extract the depth map of a photo in another post, but first I’ll chat a bit about the best ways to shoot for a great depth map in your photo.
- Fill up your photo’s scene with content that’s in close spatial distance from each other
- Shoot in consistent lighting
- Aim for a clear shot to complement a clear depth map (this will all be for naught if you have a shaky photograph you’re trying to pull in)
- Shoot photos horizontally for the photo to translate more seamlessly to the Looking Glass
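To make the gray-scale convention above concrete (nearer = lighter, farthest = black), here's a minimal Python sketch of normalizing raw depth values into that kind of map. This is my own illustration, not how the iPhone or the viewer actually computes it:

```python
import numpy as np

def depth_to_grayscale(depth: np.ndarray) -> np.ndarray:
    """Map raw depth values to the blog's gray-scale convention:
    nearest points become white (255), farthest become black (0).

    depth: 2-D float array of distances (larger = farther away).
    """
    near, far = depth.min(), depth.max()
    if far == near:
        return np.full(depth.shape, 255, dtype=np.uint8)
    # Invert so small distances map to bright values
    norm = (far - depth) / (far - near)
    return (norm * 255).round().astype(np.uint8)

# Toy scene: cat at 1 m, computer at 2 m, wall at 3 m
depth = np.array([[1.0, 2.0],
                  [3.0, 3.0]])
gray = depth_to_grayscale(depth)  # cat white, computer gray, wall black
```

A clean, continuous gradient like this is exactly what the bulleted tips above are trying to help you capture.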
Fill up your photo’s scene
When taking a photo of a larger scene, you should
- make sure that there aren’t large distances between your subjects of interest.
- create ascending/descending points of interest in the photograph space.
Images whose content recedes consistently from the front of the shot into the distance work better than images with large gaps between the subjects (and by large, I mean really large: like trying to take a picture of yourself in front of the ocean, or in front of a mountain).
This photo of an Easter egg basket highlights this practice quite beautifully — the focus of the shot is on the front of the basket (the closest point to the camera), but there is a line that follows down the table of Easter egg baskets that creates continuity in the gradient of the depth map, which in turn makes the 3D image on the Looking Glass more seamless. For both close range and wider shots, it’s great to think about giving your depth map the most it can to work with for your shot.
Take this photo of two cosplayers in the Brooklyn Botanic Garden as an example of what not to do. It's a colorfully clear shot, but the distance between the two cosplayers and the trees is so large that it creates two flat points of interest. The color differences between the cosplay pair and the trees are stark in comparison to the continuous gradient in the Easter egg basket photo above. A couple more people between the pair of cosplayers and the trees would have created a wider gradient and more depth in the resulting image.
Consistent, low-contrast lighting
It’s also important that the lighting in your photo
- is consistent
- without high contrast
When I say contrast, I mean light sources that hit the subject with much more or much less intensity than the other light sources in the scene. This might seem extraneous, but it's pretty crucial to making sure the iPhone outputs a depth map that lines up cleanly with the actual depth of the photo.
Here’s an example of a portrait I took of my beautiful friend that looks Frankenstein-ish in the Dual Lens portrait viewer. If you pull it in, you’ll see that the depth map believes that the part of her face that’s shadowed is further back than the rest of her face and body, and that her chest that is under more lighting is further in front.
This is because the blown-out lighting on certain parts of her body, compared with the lower lighting elsewhere, confused the depth map into parsing those contrasts as actual differences in where things sit in real space.
The front-facing camera seems to parse lighting data a wee bit better than the back-facing one, as illustrated in this selfie of me under similar (but a little less dramatic) contrasting lighting. Here you see the natural gradient shift from the closest point (my headscarf) to the furthest point (the wall). Even though there are high-contrast lighting points in the scene (as you can see in the RGB version of the photo in the link), this depth map does a great job of finding the actual depth of my face and body in relation to the background.
Get into the habit of taking shots with your iPhone horizontal rather than vertical.
Obviously you have the creative license to shoot however you want, but this suggestion is primarily for you to have to do as little re-positioning and cropping as possible after you drag and drop the photo in.
I've found that with shots I've taken vertically, I lose more information from the top and bottom of the shot when I frame it neatly in the web tool than when I pull in a shot taken horizontally.
This shot of a meal I ordered in Tokyo fits nearly perfectly, while this photograph on a nature reserve in Peru needs a bit of fussing with scaling and positioning and loses information in the photo as a result.
Selfies follow every rule I mentioned above, except that the rules apply far closer to the subject (which I’m assuming is you).
- Get as much of your face and body in the photo as possible
- Make sure there is a small distance between you and the background
- Low contrast, consistent lighting
- Lower lighting works really well (try taking a selfie and use the ambient phone light as your main source of lighting)
- Tilt or angle your head to keep the face from flattening
- Orient your phone horizontally
And there you have it!
Now you're ready to go out into the world with your iPhone and take some cool 3D photos for your Looking Glass. Stay tuned, because my next post will be about extracting a depth map from your iPhone photos, refining it, and pulling everything back together in StereoPhoto Maker to view in your Looking Glass.
In the meantime, send me any questions you might have as well as cool photos you take for the Looking Glass at firstname.lastname@example.org. I look forward to seeing what you create!
To the motherfucking future ✌✌🏻✌🏼✌🏽✌🏾✌🏿