There are few publications as revered in the real estate community as Architectural Digest, and we are beyond thrilled to announce that this year AD named us one of its top 10 Innovators of 2013. When I sat down with Editor-in-Chief Margaret Russell to show her a demo of our new lighting engine, Luma, I could see her gears turning, the creator’s spark in her eye. She graciously took over the meeting and made my sales pitch for me, describing a future where we communicate about physical spaces digitally in 3D, the way architects & designers dream them up. It will take some time before we can upgrade completely from photography to interactive 3D, but I look forward to the day when I can browse architecturaldigest.com in 3D, a window into the most beautiful spaces in the world.
Below is an animation of the primitive extraction and meshing algorithm. The algorithm looks at a horizontal slice of the point cloud and extracts a large number of candidate rectangles (magenta) that describe data in the cloud.
It then runs a scoring test to see how well the data fits the model and greedily adds good rectangles to the 2D mesh model (in yellow). What we end up with is a nice line-based approximation of the shape of the room, one that also captures nooks and crannies like door and window jambs.
Not visualized here are the several thousand rectangles the algorithm rejects for having a poor fitness score. (There are about 45,000 candidate rectangles in total.)
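To make the scoring-and-greedy-selection idea concrete, here is a minimal sketch in JavaScript. The fitness test (`scoreRect`) is a stand-in of our own devising — the actual scoring used in our pipeline is not described in detail above — so treat the names and the perimeter-support heuristic as illustrative assumptions, not our production code.

```javascript
// Crude fitness score: the fraction of a rectangle's perimeter that lies
// within `tol` of at least one point in the 2D slice of the cloud.
function scoreRect(rect, points, tol) {
  const samples = 64;
  let supported = 0;
  for (let i = 0; i < samples; i++) {
    const p = perimeterPoint(rect, i / samples);
    if (points.some(q => Math.hypot(q.x - p.x, q.y - p.y) <= tol)) supported++;
  }
  return supported / samples;
}

// Walk the perimeter of an axis-aligned rect {x0, y0, x1, y1}, t in [0, 1).
function perimeterPoint(rect, t) {
  const w = rect.x1 - rect.x0, h = rect.y1 - rect.y0;
  const d = t * 2 * (w + h);
  if (d < w) return { x: rect.x0 + d, y: rect.y0 };
  if (d < w + h) return { x: rect.x1, y: rect.y0 + (d - w) };
  if (d < 2 * w + h) return { x: rect.x1 - (d - w - h), y: rect.y1 };
  return { x: rect.x0, y: rect.y1 - (d - 2 * w - h) };
}

// Score every candidate, drop the poor fits, and greedily keep the rest,
// best first — mirroring the accept/reject step animated above.
function greedyMesh(candidates, points, tol, minScore) {
  return candidates
    .map(r => ({ rect: r, score: scoreRect(r, points, tol) }))
    .filter(c => c.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .map(c => c.rect);
}
```

With ~45,000 candidates, the point of the greedy pass is that each rectangle is scored independently and cheaply, so the vast majority can be rejected before any expensive meshing work happens.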
We’ve been working on our own hardware for capturing 3D and 2D data using a combination of a laser scanner and dSLR camera.
We’ve named our first device “Nelson”. To celebrate our progress, we put Nelson on a table and scanned ourselves raising a toast to him and to Dale, our intern and his maker.
It’s very common in computer graphics to represent images in the standard 8-bit-per-channel RGB format. One of the biggest issues with this format is the difficulty of simulating realistic lighting scenarios with a large range of intensities (also known as the dynamic range of a scene). For example, with traditional methods it’s difficult to capture a really bright sun and a dark alleyway in the same image without significant loss of detail.
High dynamic range (HDR) techniques combat this issue and ultimately lead to more physically plausible graphics by representing color in more precise floating point formats, capable of covering the full range of luminance the eye can perceive. Eventually, any HDR rendering pipeline has to compress this larger range back into the standard RGB color space in order to view the results on a normal screen, and this compression process is known as tone mapping.
In basic terms, HDR makes bright things really bright and dark things really dark, while preserving enough contrast in between to discern details. End-to-end HDR support is one of the cornerstones of modern rendering engines, in both real-time games and offline renderers, and at Floored, the Luma rendering engine is no different.
HDR Support in Luma
When you talk about HDR in a 3D engine, it generally refers to one of two things:
Texture support for HDR-specific image formats such as RGBE, OpenEXR, LogLuv, etc. Luma currently supports a custom HDR format for high quality environment mapping and will be adding support for more HDR image formats soon.
Using floating point textures for internal computations to increase precision and remove arbitrary clamping. Luma utilizes floating point textures for its most important internal render targets, including shadow maps, the g-buffer, and deferred lighting calculations.
Luma uses floating-point textures to process all internal lighting calculations and then applies a simple global Reinhard tone mapping operator to efficiently compress the resulting luminance values into an aesthetically pleasing RGB image for display.
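For readers unfamiliar with the operator, here is what global Reinhard tone mapping boils down to, sketched in JavaScript. In Luma this runs per-pixel in a fragment shader, of course; the function names and the chromaticity-preserving scaling below are our illustrative choices, not a dump of Luma’s shader code.

```javascript
// Rec. 709 luminance of a linear HDR RGB triple.
function luminance([r, g, b]) {
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Global Reinhard: Ld = L / (1 + L). Luminance is compressed into [0, 1),
// the color is scaled by the luminance ratio to preserve chromaticity,
// and the result is gamma-encoded for an 8-bit display.
function tonemap(rgb, gamma = 2.2) {
  const L = luminance(rgb);
  const Ld = L / (1 + L);
  const scale = L > 0 ? Ld / L : 0;
  return rgb.map(c => Math.pow(Math.min(c * scale, 1), 1 / gamma));
}
```

The appeal of the global operator is exactly its simplicity: no matter how bright the input luminance gets, the output asymptotically approaches 1 without ever clipping, and the whole thing is a couple of shader instructions.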
Note that WebGL currently exposes floating point texture support as an extension (OES_texture_float), so it is not guaranteed to be available in every WebGL-capable browser (though in practice it’s very well supported on desktop browsers, per webglstats).
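Because the extension can be absent, an engine has to probe for it at startup and fall back gracefully. A minimal sketch of that check, written so the context can be stubbed out (the function name and fallback policy are our assumptions, not Luma internals):

```javascript
// Query OES_texture_float and pick a render-target texel type accordingly:
// gl.FLOAT when available, gl.UNSIGNED_BYTE (plain 8-bit LDR) otherwise.
// `gl` is any WebGL rendering context, or a stub with the same interface.
function pickRenderTargetType(gl) {
  const ext = gl.getExtension('OES_texture_float');
  return ext ? gl.FLOAT : gl.UNSIGNED_BYTE;
}
```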
The Dark Side of HDR
The primary disadvantage of an HDR-oriented pipeline is the increased memory footprint of using larger floating-point textures everywhere, which can be especially constraining on mobile devices with limited memory and/or shared VRAM/RAM. That said, in practice on desktop WebGL, the realism benefits of HDR greatly outweigh these relatively manageable performance costs.
HDR Image Formats
There are a number of HDR image formats available, each with their own pros and cons. Here are some of the most popular HDR formats:
Radiance (.pic, .hdr)
Developed by Greg Ward in 1985, this is the simplest and most common HDR image format at 32bpp. It comes in two flavors, RGBE and XYZE, with the E channel storing a shared exponent for the RGB or XYZ channels, thereby allowing different pixels to cover very different ranges of data. The downside of the Radiance format is that the shared exponent leaves relatively little precision per channel, which can introduce noticeable banding artifacts compared with other HDR formats.
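Decoding an RGBE pixel back to linear floating point is pleasantly simple, which is a big part of the format’s popularity. A sketch (the function name is ours; the math follows Ward’s shared-exponent scheme):

```javascript
// Decode one Radiance RGBE pixel (four bytes) to linear float RGB.
// Each 8-bit mantissa is scaled by 2^(E - 128 - 8): 128 is the exponent
// bias, and the extra 8 normalizes the mantissa bytes into [0, 1).
// E = 0 conventionally encodes pure black.
function decodeRGBE([r, g, b, e]) {
  if (e === 0) return [0, 0, 0];
  const scale = Math.pow(2, e - 136); // 136 = 128 bias + 8 mantissa bits
  return [r * scale, g * scale, b * scale];
}
```

The banding issue mentioned above falls out of this scheme directly: when one channel is much dimmer than the brightest one, the shared exponent forces its mantissa toward zero, quantizing it coarsely.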
OpenEXR (.exr)
Developed by Industrial Light & Magic and supporting both lossless and lossy compression, OpenEXR most commonly stores a 16-bit-per-channel “half” float (48bpp for RGB), a type natively supported by modern Nvidia and ATI GPUs; 32-bit-per-channel full-float images are also supported. See this GPU Gem for a good overview.
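The “half” type is an ordinary IEEE-754-style float shrunk to 16 bits: 1 sign bit, 5 exponent bits (bias 15), and 10 mantissa bits. A sketch of decoding one (the function name is ours, the bit layout is the standard one):

```javascript
// Decode a 16-bit half-float, given as an integer 0..65535, to a JS number.
function decodeHalf(h) {
  const sign = (h & 0x8000) ? -1 : 1;
  const exp = (h >> 10) & 0x1f;   // 5 exponent bits, bias 15
  const mant = h & 0x3ff;         // 10 mantissa bits
  if (exp === 0) return sign * mant * Math.pow(2, -24);     // subnormals
  if (exp === 31) return mant ? NaN : sign * Infinity;      // inf / NaN
  return sign * (1 + mant / 1024) * Math.pow(2, exp - 15);  // normals
}
```

The 5-bit exponent gives half floats a usable range of roughly 2^-24 to 65504, which comfortably covers real-world scene luminances while halving the memory cost of full floats.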
LogLuv (.tiff)
Developed by Greg Ward in 1998, LogLuv comes in 24 and 32bpp flavors and log-encodes luminance for an increased perceptual range. Although LogLuv is particularly interesting compared to other HDR formats in terms of quality vs size tradeoff, it hasn’t seen widespread adoption in the industry.
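The trick that gives LogLuv its favorable quality-versus-size tradeoff is storing luminance on a log scale, so quantization error is a constant *ratio* rather than a constant step. A sketch of the 16-bit log-luminance channel at the heart of the 32bpp variant (constants per Ward’s 1998 design; take the function names as ours):

```javascript
// Encode luminance L as a 16-bit fixed-point value of log2(L) + 64,
// covering roughly 2^-64 .. 2^64 in uniform ~0.27% relative steps.
function encodeLogLuminance(L) {
  const Le = Math.floor(256 * (Math.log2(L) + 64));
  return Math.max(0, Math.min(65535, Le)); // clamp to 16 bits
}

// Invert the encoding; the +0.5 recenters each quantization bucket.
function decodeLogLuminance(Le) {
  return Math.pow(2, (Le + 0.5) / 256 - 64);
}
```

Because the relative error is uniform across the whole range, dim shadows are encoded just as faithfully as bright highlights — exactly where a linear shared-exponent format like RGBE starts to band.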
Note that there are other HDR image formats, including Portable Float Maps (.pfm), scRGB, and Pixar’s log-encoded TIFF, but they don’t see much use in practice.
HDR in WebGL
SpiderGL has a WebGL demo displaying an HDR texture with variable exposure (using the PNGHDR format, which appears to be a PNG-packed version of RGBE).
Three.js has a corresponding demo utilizing variable exposure and basic gamma correction / tone mapping, as well as another useful demo of HDR illumination by Pierre Lepers.