Lightform turns your beamer into an AR projector — and a 3D Scanner

You might know that with my blog and consultancy service I’m focusing on what I call “3D Scanning Beyond Engineering”. For me, this means that I try to inspire people to use 3D scanning technology for all kinds of new, innovative purposes.

It excites me that recently, this has started to go beyond capturing objects and people. A while back I wrote about Hayo — a virtual assistant-like device that can make a 3D scan of a room and turn any object into a remote control for the internet of things.

Today, I’m writing about another device that wants to bring 3D scanning into your living room, but in a totally different way. It’s called Lightform, and its mission is to make Projection Mapping easy and accessible.



What can Lightform do?

The device itself is a mini computer with a camera that you can connect to a digital projector of your choice. It uses Structured Light Scanning to determine the shape and location of the objects in front of it. After that, you can use a laptop or smartphone to set up what gets projected onto each of the detected surfaces.
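To give a feel for what Structured Light Scanning involves, here’s a minimal sketch of Gray-code pattern generation and decoding in Python. It’s not Lightform’s implementation, just the general principle: project a sequence of stripe patterns, capture each one with the camera, and decode which projector column every camera pixel saw; with calibration data, those correspondences can be triangulated into depth.

```python
# Minimal Gray-code structured light sketch. Assumes an external capture loop
# shows each pattern on the projector and grabs one camera frame per pattern.
import numpy as np

def gray_code_patterns(proj_width, proj_height, bits):
    """Vertical stripe patterns encoding the projector column in Gray code."""
    cols = np.arange(proj_width)
    gray = cols ^ (cols >> 1)                  # binary-reflected Gray code per column
    patterns = []
    for b in range(bits - 1, -1, -1):          # most significant bit first
        stripe = ((gray >> b) & 1).astype(np.uint8) * 255
        patterns.append(np.tile(stripe, (proj_height, 1)))
    return patterns

def decode_columns(captures, threshold=127):
    """Recover the projector column seen by each camera pixel.

    `captures` are grayscale camera frames, one per pattern, in the same
    (MSB-first) order. A real system would also project inverted patterns
    for robust thresholding; a fixed threshold keeps the sketch simple.
    """
    bits = [(img > threshold).astype(np.uint32) for img in captures]
    gray = np.zeros_like(bits[0])
    for b in bits:
        gray = (gray << 1) | b                 # accumulate Gray code, MSB first
    binary = gray.copy()                       # Gray -> binary: b = g ^ (g>>1) ^ (g>>2) ...
    shift = gray >> 1
    while shift.any():
        binary ^= shift
        shift >>= 1
    return binary                              # per-pixel projector column index

# Example: an 11-bit code covers a 1280-pixel-wide projector (2**11 = 2048).
# patterns = gray_code_patterns(1280, 800, bits=11)
# columns  = decode_columns(camera_frames)
```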

If you’ve ever used the HP/DAVID SLS-3 I reviewed recently, you might recognize this behavior: that 3D scanner also uses structured light and projection-maps a checkerboard pattern onto the calibration board to confirm a successful calibration.

Of course, the usefulness of a device like this depends on what you can project onto those surfaces and on how smart the software is at detecting them.

Lightform apparently uses AI to perform its magic, and the current examples show it projecting nifty animated patterns onto everyday objects. That’s funky for decorating hip stores, restaurants and art installations (the purposes projection mapping is mostly used for today), but the practical examples on the product website are far from inspiring.



Projection Problems

I studied Art & Technology and have been working in Motion Design, VFX and Animation for over 10 years, so I’ve come across many projection mapping projects. But they never really attracted me beyond artistic use. That’s mainly because projections aren’t very practical in everyday situations: although the light output of beamers has increased in recent years, they still require a dimmed environment for good visibility.

The coffee shop menu example above is clearly a mockup, and I’m a bit skeptical about how this would look in reality. Of course, that totally depends on the beamer you have: the ones that produce a lot of lumens are also the most expensive. And if you factor in the cost of periodically replacing the lamp, I’m not sure you’d want to keep it running for many hours a day just to project a menu.
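For a rough sense of what “keeping it on all day” costs, here’s a back-of-envelope calculation. Every number in it (lamp price, rated lamp life, power draw, electricity rate, hours of use) is an assumption I picked for illustration, not a figure from Lightform or any specific projector.

```python
# Rough running-cost estimate for keeping a projector on all day.
# All figures are illustrative assumptions, not vendor numbers.
lamp_price_usd = 150.0            # assumed replacement lamp price
lamp_life_hours = 3000.0          # assumed rated lamp life
power_draw_watts = 300.0          # assumed projector power draw
electricity_usd_per_kwh = 0.20    # assumed electricity rate
hours_per_day = 12                # assumed daily use for a menu display

lamp_cost_per_hour = lamp_price_usd / lamp_life_hours
energy_cost_per_hour = (power_draw_watts / 1000.0) * electricity_usd_per_kwh
daily_cost = hours_per_day * (lamp_cost_per_hour + energy_cost_per_hour)

print(f"~${lamp_cost_per_hour + energy_cost_per_hour:.2f}/hour, ~${daily_cost:.2f}/day")
# With these assumptions: ~$0.11/hour, ~$1.32/day, plus a new lamp roughly every 8 months.
```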

Lightform periodically scans your scene for changes and can re-align the projections automatically. I don’t know how often it will do this, but the fast-moving structured light patterns it projects aren’t very pleasing to look at and could even be problematic for people with epilepsy.


Using it for 3D scanning

According to the product website, you can also “connect Lightform with any projector to scan a scene in under a minute. Zero expertise required.” It also states that the device can generate high-resolution meshes and export them to an FBX file.

The usefulness of that depends on the price of the device (which is yet to be announced) and the software included to do this. As I demonstrated when I reviewed the HP Sprout — which also uses Structured Light Scanning — a single 3D “snapshot” can be of great quality but isn’t very useful by itself. You need to be able to take at least a few separate scans from different angles, plus software that can perform global registration and fuse all the data into a single volumetric model — a process that’s very processor-intensive and unlikely to be pulled off by a tiny computer like this.
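To illustrate what that multi-scan workflow looks like in practice, here’s a minimal sketch using the open-source Open3D library (version 0.10 or later). This is not Lightform’s pipeline; the file names and parameters are placeholders, and it’s a simplified stand-in for the full process: a real workflow would usually add a coarse, feature-based global registration step before ICP refinement, and might use volumetric (TSDF) fusion instead of Poisson meshing.

```python
# Sketch of aligning several 3D "snapshots" and fusing them into one mesh,
# using the open-source Open3D library. Paths and parameters are placeholders.
import open3d as o3d
import numpy as np

VOXEL = 0.005  # 5 mm downsampling, assuming the scans are in meters

def load_scan(path):
    pcd = o3d.io.read_point_cloud(path)
    pcd = pcd.voxel_down_sample(VOXEL)
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=VOXEL * 2, max_nn=30))
    return pcd

# A handful of scans taken from different angles (placeholder file names).
scans = [load_scan(f"scan_{i}.ply") for i in range(4)]

# Refine each scan against the growing model with point-to-plane ICP.
# This assumes the scans already overlap roughly; otherwise a global,
# feature-based registration step is needed first.
merged = scans[0]
for scan in scans[1:]:
    result = o3d.pipelines.registration.registration_icp(
        scan, merged, VOXEL * 4, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    scan.transform(result.transformation)
    merged += scan

# Fuse the aligned points into a single mesh and export it.
merged = merged.voxel_down_sample(VOXEL)
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(merged, depth=9)
o3d.io.write_triangle_mesh("fused_model.ply", mesh)
```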



What about interactivity?

Unlike Hayo, Lightform doesn’t appear to contain an infrared depth sensor besides a regular camera. This would mean that the “magical experiences” it can project will not be interactive, which greatly limits the possibilities, especially the functional ones. Then again, Hayo doesn’t feature a projector (and can’t be connected to one) for visual feedback.

For comparison, Sony recently announced the Xperia Touch — a projection device that runs Android (so it instantly has many apps) and does feature an infrared depth sensor that can detect user gestures. Technically, that sensor could also allow the Touch to do projection mapping, although I haven’t seen examples of that yet. And apart from enabling game-like experiences, gesture input can also enable a wide array of business and commercial applications — at least in dimly lit rooms, and for people willing to spend $1,600 on it.


I’m keeping a close eye on Lightform because I’m curious about the price and what it can actually do — both in terms of AI-powered projection mapping and 3D Scanning. Follow @3Dscanexpert on your favorite social network if you want to receive updates about this.


