
HXGI – Voxel GI in Unity

Yeah, yeah. I said I would no longer waste blog space on fickle betas and unavailable alpha experiments… But lighting is the single most important aspect of any 3D film: no matter the art style, without highlights and shadows objects have no depth and cannot be visually “anchored” in a scene. Lighting also happens to be the framerate killer for game engines. Each realtime light adds another rendering pass over the geometry it touches, so costs multiply as lights are added. Finding an artistic balance between the fewest possible live scene lights and an approximation of ambient and reflected bounce lighting, or global illumination, is the Holy Grail of realtime 3D… Here’s the latest maybe: HXGI, by the author of Hx Volumetric Lighting.

The typical solution is to bake static scene lights and shadows onto a second texture map, either in another program or internally in the game engine. Most game engines also employ reflection probes and a low-res dynamic shadow map to blend over the baked shadows. With Lightmass in Unreal and Enlighten in Unity, small scenes can be entirely dynamic, with propagated bounce light continuously rebaked on the fly, limited only by the resolution of their lightmaps and the speed of the GPU. Very fine-detail shadows that fall below the resolution of the lightmap are handled with an onscreen image-effect ambient occlusion, and some models may have their own AO maps.

One problem, however, is that animated meshes can’t possibly rebake their maps every frame. A figure walking through the environment, including hair, clothing, and props, can’t update reflective GI maps quickly enough. Static scene objects can be pre-baked or updated every few seconds, but animated figures still require scene lights. A workaround is to use a Light Probe Proxy Volume: a 3D grid of light probes whose interpolated values approximate the local bounce light across a large moving mesh. You can imagine what dozens of extra probes and fill lights approximating ambient bounce lighting will do to your frame rates. Such an elaborately lit scene would be a chore to create by hand, and cannot be generated procedurally with any strategic efficiency.
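The core idea of a probe proxy volume is simple: pre-sample ambient color at points on a regular 3D grid, then let each shaded point read a smoothly interpolated value instead of snapping to the single nearest probe. Here is a toy sketch of that trilinear lookup in Python; the grid layout and function names are illustrative only, not Unity’s actual API.

```python
def lerp(a, b, t):
    """Linearly interpolate between two RGB tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def sample_probe_grid(grid, x, y, z):
    """Trilinearly interpolate an RGB value from grid[ix][iy][iz].
    Coordinates are in grid units (0 .. len-1)."""
    ix, iy, iz = int(x), int(y), int(z)
    fx, fy, fz = x - ix, y - iy, z - iz
    def g(i, j, k):
        return grid[ix + i][iy + j][iz + k]
    # Interpolate along x, then y, then z.
    c00 = lerp(g(0, 0, 0), g(1, 0, 0), fx)
    c10 = lerp(g(0, 1, 0), g(1, 1, 0), fx)
    c01 = lerp(g(0, 0, 1), g(1, 0, 1), fx)
    c11 = lerp(g(0, 1, 1), g(1, 1, 1), fx)
    c0 = lerp(c00, c10, fy)
    c1 = lerp(c01, c11, fy)
    return lerp(c0, c1, fz)
```

A fragment sampling this grid per-pixel, rather than a whole mesh sharing one probe, is exactly the distinction Dostal draws below between Enlighten’s per-object probe and HXGI’s per-fragment sampling.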

voxelized scene in CryEngine

Voxel-Based Global Illumination

In old-fashioned ray-traced rendering, the camera traces a ray to each visible surface and calculates the global illumination for each pixel by following the ray back to the light source(s). Game engines can’t possibly raytrace every pixel in realtime. What’s needed is a way to simplify the process with fewer rays, and to bake lighting values into an accessible “map” of volumetric elements, or voxels. Where a rasterized image is built from regularly spaced pixels, a voxelized scene is represented by regularly spaced cubes.

The automagic voxelization process subdivides a scene into ever-smaller cubes, discarding empty space along the way. The scale and subdivision of the voxels are adjustable, and not at all dependent on the polygon detail of the scene: distant objects like mountain terrains voxelize just as quickly as nearby, detailed models. Finally, each voxel provides a fast look-up of the light in all directions, gathered with similarly reduced low-resolution light “cones” instead of thousands of rays. The engine generates only the voxels the camera can see, and data is interpolated from voxel to voxel, updating as needed. The result is voxel-based global illumination!
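The “subdivide and discard” step above can be sketched in a few lines: recursively split a cube into eight octants, keep only octants that actually contain geometry, and stop at a fixed depth. This is only a toy illustration of the sparse-subdivision idea; a real voxel GI system also bakes lighting into each voxel, and the names here are invented for the example.

```python
def voxelize(center, half, depth, contains_geometry):
    """Return the occupied leaf voxels as (center, half_size) pairs."""
    if not contains_geometry(center, half):
        return []                 # empty space is discarded entirely
    if depth == 0:
        return [(center, half)]   # smallest voxel we keep
    cx, cy, cz = center
    h = half / 2
    leaves = []
    # Recurse into the eight child octants.
    for dx in (-h, h):
        for dy in (-h, h):
            for dz in (-h, h):
                leaves += voxelize((cx + dx, cy + dy, cz + dz),
                                   h, depth - 1, contains_geometry)
    return leaves
```

Because empty octants prune their entire subtree, storage scales with the occupied surface of the scene rather than the full volume, which is what makes distant, sparse geometry cheap to voxelize.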

With physically based rendering (PBR), every surface material is reflective; the difference between a polished mirror and matte skin is only in the brightness and blurriness of their reflections. Realistic blurry reflections are computationally expensive for a traditional raytracer, which must average many scattered rays, but voxel GI approximates them cheaply by sampling the voxel data at a lower resolution: the blurrier the reflection, the coarser the voxels it needs.
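One way to picture the roughness-to-resolution trade is as a cone: a rougher surface reflects a wider cone of directions, and a wider cone can be satisfied by a coarser level of the voxel data, where each successive mip doubles the voxel size. The sketch below shows that mapping with made-up constants; it is an illustration of the principle, not HXGI’s actual formula.

```python
import math

def cone_mip(roughness, distance, voxel_size):
    """Pick a voxel mip level: the cone footprint grows with distance
    and roughness, and each mip doubles the voxel size."""
    half_angle = roughness * math.pi / 4       # mirror -> ~0, matte -> wide
    radius = distance * math.tan(half_angle)   # cone footprint at this distance
    # mip = log2(footprint / base voxel size), clamped to the finest level
    return max(0.0, math.log2(max(radius, voxel_size) / voxel_size))
```

A mirror (roughness 0) stays at the finest mip no matter the distance, while a matte surface jumps to coarse voxels almost immediately, which is why glossy reflections come almost for free in a voxel system.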

Lexie Dostal, creator of HXGI, says: “One of the big issues with Enlighten is that dynamic objects are lit by a single interpolated light probe. This can make large dynamic objects look extremely out of place. The way my system works is each fragment (pixel) samples the GI data independently. This means large dynamic objects will be lit correctly.”

Read more on the Unity Forums…

Unity 2017 Roadmap from Unite Europe 2017

(Annotation to come)

IKinema Orion

IKinema Orion is an animation technology that uses a standard HTC Vive VR headset and trackers to deliver motion capture, as well as live body tracking for VR content. IKinema’s full-body solving software positions the figure according to six tracked points: the head, both hands, the hips, and both feet.


Vive’s “lighthouse” laser tracking system uses base stations that sweep the play area with a vertical and a horizontal laser line 100 times per second. The sensors on the tracker see the laser flashes and send timing data back to the computer, either through the HTC headset or wirelessly via a small USB RF dongle. With two or more base stations in line of sight, the trackers can triangulate their position in space. Remarkably, the system can scale to larger areas with additional base stations and additional trackers.
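The timing trick is worth unpacking: because the laser line sweeps at a known rate, the delay between a sync flash and the laser hitting a sensor converts directly into a bearing angle, and bearings from two stations intersect at the sensor’s position. Here is a simplified 2D sketch of that idea, using the 100-sweeps-per-second figure quoted above; the real system solves in 3D with calibrated station poses, so treat this as an illustration only.

```python
import math

SWEEP_HZ = 100  # sweeps per second, per the figure quoted above

def hit_time_to_angle(dt):
    """Convert the delay between sync pulse and laser hit into a bearing.
    The rotor turns a full circle every 1/SWEEP_HZ seconds."""
    return 2 * math.pi * SWEEP_HZ * dt

def triangulate_2d(p1, a1, p2, a2):
    """Intersect two bearing rays (station position, angle from +x axis)
    to locate the sensor in the plane."""
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel: need line of sight to two stations")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom   # distance along ray 1
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

With stations at (0, 0) and (4, 0) and bearings of 45° and 135°, the rays meet at (2, 2), which is how a pair of cheap photodiodes and a timer substitute for an expensive optical mocap camera.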

The required hardware is somewhat configurable. A headset, hand controllers, and two base stations retail for around $800, and individual trackers can be purchased at around $100 each. IKinema is licensing Orion at $500/yr per seat, making a complete system about $1,600 with the headset and three trackers, or under $1,400 for a six-tracker solution with two base stations and no headset.

Tribeca Immersive’s Virtual Arcade

Tribeca Immersive’s Virtual Arcade, featuring Storyscapes, explores and celebrates the art found in the virtual world. With 30 virtual reality (VR) and innovative interactive projects on display, including 20 world premieres, Tribeca Immersive presents thought-provoking experiences and installations from top creators and emerging artists.

Storyscapes, which celebrates its fifth year this spring, was created in 2013 to bridge filmmaking, technology, and storytelling. At the 2017 Festival, the Storyscapes juried showcase will continue to present new trends and innovative work across mediums that integrate various forms of audience participation, with six VR and interactive installations focusing on emotion and the human experience. The projects tackle topics including an exploration of autobiography in VR, a hunger to connect with the world around us, recounting life in a concentration camp, perception and identity, and the secret lives of strangers.

Tribeca Immersive’s Virtual Arcade will be held April 21 – 29, 2017 on the fifth floor of the Tribeca Festival Hub located at Spring Studios, 50 Varick Street, New York, NY 10013.