Fast, automated 3D modeling of building interiors
A plenary talk from SPIE Optics + Photonics 2017.
In a plenary talk at SPIE Optics + Photonics 2017, Avideh Zakhor of University of California, Berkeley presented a mapping and visualization platform for 3D modeling and documentation of indoor environments.
Zakhor has been developing this technology for about 10 years at Berkeley. In 2015, she started Indoor Reality, a Berkeley-based company that specializes in fast "reality capture" of indoor environments, producing auto-generated 3D blueprints, virtual walkthroughs, and maps of buildings much faster than traditional scanning methods.
In her presentation, Zakhor gave a detailed account of the development of the components and techniques used in automated 3D scanning. She explained how 3D scanners determine the distance from the device to a point on a surface based on the time interval between the laser pulse emanating from the device and the return pulse reflected off the surface.
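The time-of-flight principle Zakhor described can be sketched in a few lines of code. This is an illustrative example, not part of her system; the function name and the 20 ns example value are hypothetical, and a real scanner would also correct for electronics delay and pulse-detection timing.

```python
# Sketch of the time-of-flight ranging principle: a laser pulse travels
# from the scanner to a surface and back, so the one-way distance is
# (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance from the device to the surface, given the measured
    round-trip time of the laser pulse in seconds."""
    return C * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to a surface roughly 3 m away.
print(round(tof_distance(20e-9), 3))  # ~2.998 m
```

The nanosecond timescale here is why such scanners need very precise timing electronics: a 1 ns timing error translates to about 15 cm of range error.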
The most prominent early scanning systems used tripod-based scanners. These were time-intensive, requiring multiple scans from various locations throughout a room to capture a single space. To get around this, researchers developed mobile and airborne LIDAR scanning devices suitable for large-scale infrastructure and city mapping. Airborne scanning offers direct views of pavement and building tops, but poor oblique views of vertical surfaces; mobile LIDAR offers good views of pavement, but cannot capture building tops. Both are suitable only for outdoor use.
Around 2007, Zakhor proposed two distinct hardware systems for 3D indoor modeling. The first is an ambulatory backpack system equipped with a suite of sensors, worn by an operator walking at normal speed in and out of rooms in a continuous walkthrough of a building. The second is a handheld system that an operator carries and waves at walls while walking through the building.
Both systems share a common software pipeline that produces 3D point clouds; texture-mapped, surface-reconstructed 3D models; 3D architectural models and floor plans; and web-based virtual navigation with tagging, annotation, and dimension-measurement capability.
"Indoor Reality was launched in order to commercialize some of these technologies," said Zakhor.
Beyond modeling architecture, Zakhor described a visual analytics platform that can automatically recognize energy-relevant assets such as windows, lights, and computers. The same walkthrough that generates the 3D model can also collect building sensor fingerprints, which a mobile app can later use to locate building occupants, for example by first responders in emergency situations.
SPIE Optics + Photonics 2017, 6-10 August in San Diego, CA (USA), featured 3300 technical presentations on light-based technologies in 69 conferences. It was also the venue for a three-day industry exhibition with 180 companies; a two-day Career Center job fair; 34 courses and workshops; and several networking opportunities for professionals and students.
SPIE Optics + Photonics 2018 will run 19-23 August in San Diego.