
Procedural Animation in Unity3D: video blog


Cutscene Artist: recent posts

Octane for Unity 2017

A handful of assets from OTOY appeared in the Unity Asset Store to announce Octane for Unity, including import of Octane ORBX scenes, path-traced light baking, and an emissive shader. Octane is a GPU renderer that requires Nvidia’s CUDA technology, and more than 25 3D applications can send scenes to OctaneRender. Octane and Unity 2017 are required.

“OctaneRender for Unity is now available in preview, demonstrating physically accurate rendering for the first time inside of the game engine. The release is the first phase of a partnership that will bring a GPU-accelerated rendering pipeline into Unity, delivering cinematic path-tracing to Unity’s new scene composition tools, Timeline and Cinemachine. This powerful storytelling toolkit will enable millions of Unity creators worldwide to compose beautiful, film quality scenes and sequences.”

https://unity.otoy.com/

HXGI – Voxel GI in Unity

Yeah, yeah. I said I would no longer waste blog space on fickle betas and unavailable alpha experiments… But lighting is the single most important aspect of any 3D film – no matter the art style, without highlights and shadows, objects have no depth and cannot be visually “anchored” in a scene. Lighting also happens to be the framerate killer for game engines: each realtime light effectively multiplies the geometry the engine must process, since lit meshes are shaded again for every light that touches them. Finding an artistic balance between the fewest possible live scene lights and an approximation of ambient and reflected bounce lighting, or global illumination, is the Holy Grail of realtime 3D. Here’s the latest maybe, called HXGI, by the author of Hx Volumetric Lighting.

The typical solution is to bake static scene lights and shadows into a second texture map (a lightmap), either in an external program or inside the game engine. Most engines also employ reflection probes and a low-resolution dynamic shadow map that blends over the baked shadows. With systems like Lightmass in Unreal and Enlighten in Unity, small scenes can approach fully dynamic lighting, with propagated bounce light continuously rebaked on the fly, limited only by the resolution of the lightmaps and the speed of the GPU. Very fine shadow detail that falls below the resolution of the lightmap is handled with a screen-space ambient-occlusion image effect, and some models may carry their own baked AO maps.
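
As a concrete Unity example of the probe half of that setup, here is a minimal sketch (the component calls are standard Unity API, but the sizing values are my own assumptions) that spawns a realtime reflection probe and renders it on demand:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: spawn a realtime reflection probe that re-renders on demand -
// the kind of probe engines blend over baked lightmaps.
public class ProbeSetup : MonoBehaviour
{
    void Start()
    {
        var probe = gameObject.AddComponent<ReflectionProbe>();
        probe.mode = ReflectionProbeMode.Realtime;            // not baked
        probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
        probe.size = new Vector3(10f, 5f, 10f);               // assumed room size
        probe.RenderProbe();                                  // render the cubemap now
    }
}
```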

One problem, however, is that animated meshes can’t possibly rebake their maps every frame. A figure walking through the environment – hair, clothing, and props included – can’t update reflective GI maps quickly enough. Static scene objects can be pre-baked or updated every few seconds, but animated figures still require live scene lights. A workaround is Unity’s Light Probe Proxy Volume: a 3D grid of interpolated light probes that tints a large dynamic object with an approximation of the surrounding bounce light. You can imagine what maintaining dozens of probe volumes and extra fill lights to fake ambient bounce will do to your frame rates. Such an elaborately lit scene would be a chore to create by hand, and cannot be generated procedurally with any strategic efficiency.
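
For reference, wiring a Light Probe Proxy Volume to an animated character is only a few lines; this is a minimal sketch using Unity’s standard components (the grid resolutions are placeholder values to tune per scene):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Minimal sketch: attach a Light Probe Proxy Volume to a large animated
// character so each part of the mesh samples interpolated probe lighting
// instead of one blended probe for the whole figure.
public class AttachProxyVolume : MonoBehaviour
{
    void Start()
    {
        var lppv = gameObject.AddComponent<LightProbeProxyVolume>();
        lppv.resolutionMode = LightProbeProxyVolume.ResolutionMode.Custom;
        lppv.gridResolutionX = 4;   // placeholder values; must be powers of two
        lppv.gridResolutionY = 8;
        lppv.gridResolutionZ = 4;

        // Point every renderer on the figure at the proxy volume.
        foreach (var r in GetComponentsInChildren<Renderer>())
        {
            r.lightProbeUsage = LightProbeUsage.UseProxyVolume;
            r.lightProbeProxyVolumeOverride = gameObject;
        }
    }
}
```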

voxelized scene in CryEngine

Voxel-Based Global Illumination

In old-fashioned ray-traced rendering, the camera traces a vector to each visible surface and calculates the global illumination for each pixel by following the ray back to the light source(s). Game engines can’t possibly raytrace every pixel in realtime. What’s needed is a way to simplify the process with fewer rays and bake the lighting values into an accessible “map” of volumetric elements, or voxels. Where a rasterized image map is built from regularly spaced pixels, a voxelized scene is represented by regularly spaced cubes.

The automagic voxelization process subdivides a scene into diminishing cubes, discarding the empty spaces while splitting the occupied ones into smaller and smaller elements. The scale and subdivision of the voxels are adjustable, and not at all dependent on the polygon detail of the scene: distant objects like mountain terrains voxelize just as quickly as nearby, detailed models. Finally, each voxel stores a fast look-up reference of the incoming light in all directions, sampled with similarly reduced, low-resolution light “cones.” The engine generates only the voxels the camera can see, and data is interpolated from voxel to voxel, updating as needed. The result is voxel-based global illumination!
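
HXGI’s GPU implementation isn’t public, but the subdivision idea can be sketched on the CPU. This is a toy illustration under my own assumptions (the names and point-based occupancy test are hypothetical, not HXGI’s code): recursively split a bounding cube into eight children, discard cubes containing no geometry, and stop at a minimum voxel size.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Toy CPU octree voxelizer: subdivide a bounding cube, keep only cells
// that actually contain sample points, down to a minimum voxel size.
public static class ToyVoxelizer
{
    public static void Voxelize(Bounds cell, List<Vector3> points,
                                float minSize, List<Bounds> leaves)
    {
        // Discard empty space: no geometry samples inside this cube.
        bool occupied = false;
        foreach (var p in points)
            if (cell.Contains(p)) { occupied = true; break; }
        if (!occupied) return;

        // Stop subdividing once the cube reaches the target voxel size.
        if (cell.size.x <= minSize) { leaves.Add(cell); return; }

        // Otherwise split into eight half-size children.
        Vector3 half = cell.size * 0.5f;
        for (int i = 0; i < 8; i++)
        {
            Vector3 offset = new Vector3(
                (i & 1) == 0 ? -0.25f : 0.25f,
                (i & 2) == 0 ? -0.25f : 0.25f,
                (i & 4) == 0 ? -0.25f : 0.25f);
            Bounds child = new Bounds(
                cell.center + Vector3.Scale(cell.size, offset), half);
            Voxelize(child, points, minSize, leaves);
        }
    }
}
```

A production voxelizer would rasterize triangles into a sparse structure on the GPU instead of testing points, but the discard-empty-space logic is the same.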

With physically based rendering (PBR), every surface material is reflective; the difference between a polished mirror and matte skin is only in the brightness and blurriness of their reflections. With traditional raytracing, realistic blurry reflections are computationally expensive; voxel GI, however, approximates blurry reflections cheaply by sampling a lower voxel resolution.
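
In voxel cone tracing, that trade-off is usually expressed by widening the sampling cone with surface roughness, which maps to a coarser mip level of the voxel volume. A hedged sketch of the idea (the mapping constants and names are my own, not HXGI’s):

```csharp
using UnityEngine;

// Sketch: map PBR roughness to a voxel-volume mip level. A rougher
// surface uses a wider cone, i.e. a blurrier, lower-resolution slice
// of the voxel data - which is why blurry reflections are cheap here.
public static class ConeMath
{
    public static float RoughnessToMip(float roughness, int maxMip)
    {
        // Wider cone aperture for rougher surfaces (assumed mapping).
        float aperture = Mathf.Lerp(0.05f, 1.0f, roughness * roughness);
        // Coarser mips for wider cones, clamped to the volume's mip chain.
        return Mathf.Clamp(aperture * maxMip, 0f, maxMip);
    }
}
```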

Lexie Dostal, creator of HXGI, says: “One of the big issues with Enlighten is that dynamic objects are lit by a single interpolated light probe. This can make large dynamic objects look extremely out of place. The way my system works is each fragment (pixel) samples the GI data independently. This means large dynamic objects will be lit correctly.”

read more on the Unity Forums…

Script for importing Adobe Fuse character model into Unity (fixes materials)

A common beginner’s complaint in the Fuse forums is confusion over how to set up Unity’s Standard Shader to interpret alpha channels. The Standard Shader has four rendering modes for opacity: Opaque, Cutout, Fade, and Transparent. Unity detects the alpha channel in the Fuse textures and defaults to Transparent – appropriate for glasses, but not for hair or eyelashes.
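
Switching a Standard Shader material to Cutout from script takes more than setting one value; this is the commonly used sequence of keywords and blend states (the class name is a placeholder, and in practice the fix belongs in an editor import script):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Switch a Standard Shader material to Cutout mode from script.
// Hair and eyelash materials from Fuse typically want this instead
// of the Transparent default.
public static class StandardShaderModes
{
    public static void SetCutout(Material m, float cutoff = 0.5f)
    {
        m.SetFloat("_Mode", 1f);                       // 1 = Cutout
        m.SetOverrideTag("RenderType", "TransparentCutout");
        m.SetInt("_SrcBlend", (int)BlendMode.One);
        m.SetInt("_DstBlend", (int)BlendMode.Zero);
        m.SetInt("_ZWrite", 1);
        m.SetFloat("_Cutoff", cutoff);                 // alpha-test threshold
        m.EnableKeyword("_ALPHATEST_ON");
        m.DisableKeyword("_ALPHABLEND_ON");
        m.DisableKeyword("_ALPHAPREMULTIPLY_ON");
        m.renderQueue = (int)RenderQueue.AlphaTest;    // 2450
    }
}
```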

Saulth created a script to simplify the import workflow of Adobe (Mixamo) Fuse CC figures in Unity. In addition to fixing the opacity issues, the script will also:

  • Set shader render modes correctly per material
  • Create new materials for Eyes and Eyelashes
  • Correct the MetallicAndSmoothness map (its alpha channel is inverted – see the sketch after this list)
  • Set texture import settings correctly (alpha channel, normal maps)
  • Set smoothness per material to roughly match visuals in Adobe Fuse CC (beta)
  • Create prefab ready for easy use
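
To illustrate the inverted-alpha item above, here is a minimal sketch of the kind of correction involved (not Saulth’s actual code): read the map’s pixels, flip the alpha channel, and build a corrected texture.

```csharp
using UnityEngine;

// Sketch of an inverted-alpha fix: Fuse exports smoothness flipped,
// so invert the alpha channel of the MetallicAndSmoothness texture.
// (Requires the source texture to be marked readable on import.)
public static class MetallicAlphaFix
{
    public static Texture2D InvertAlpha(Texture2D src)
    {
        var pixels = src.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
            pixels[i].a = 1f - pixels[i].a;   // flip smoothness

        var fixedTex = new Texture2D(src.width, src.height,
                                     TextureFormat.RGBA32, true);
        fixedTex.SetPixels(pixels);
        fixedTex.Apply();
        return fixedTex;
    }
}
```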

Download the script from the Unity Forums:
Script for importing Adobe Fuse character model into Unity (fixes materials)

News April 26, 2017

HTC Vive and IKinema Orion reinvent motion tracking, and Google’s Jump Start will fund 360° video projects.

IKinema Orion

IKinema Orion is an animation technology that uses a standard HTC Vive VR headset and trackers to deliver motion capture, as well as live body tracking for VR content. IKinema’s body-solving software positions the figure according to six tracked points: head, hands, hips, and feet.
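
Orion’s full-body solver is proprietary, but the core idea – posing a limb so its end reaches a tracked target – can be sketched with a classic two-bone, law-of-cosines IK solve. Everything below is an illustrative assumption, not IKinema’s algorithm:

```csharp
using UnityEngine;

// Illustrative two-bone IK: given a shoulder position, bone lengths,
// and a tracked hand target, compute the elbow bend angle with the
// law of cosines. Full-body solvers like Orion generalize this idea
// across the whole skeleton.
public static class TwoBoneIK
{
    public static float ElbowAngle(Vector3 shoulder, Vector3 target,
                                   float upperLen, float lowerLen)
    {
        // Clamp the reach so the target stays solvable.
        float reach = Mathf.Clamp(Vector3.Distance(shoulder, target),
                                  Mathf.Abs(upperLen - lowerLen) + 1e-4f,
                                  upperLen + lowerLen - 1e-4f);

        // Law of cosines: interior angle between upper and lower bone.
        float cos = (upperLen * upperLen + lowerLen * lowerLen
                     - reach * reach) / (2f * upperLen * lowerLen);
        return Mathf.Acos(Mathf.Clamp(cos, -1f, 1f)) * Mathf.Rad2Deg;
    }
}
```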

Vive’s “lighthouse” laser tracking system uses base stations that sweep the play area with a vertical and horizontal laser line 100 times per second. The sensors on the tracker see the laser flashes and send timing data back to the computer, either through the HTC headset or wirelessly via a small USB RF dongle. With two or more base stations in line of sight, the trackers can triangulate their position in space. Remarkably, the system can scale to larger areas with additional base stations and additional trackers.
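
The timing math is simple in principle. Here is a hedged 2D sketch of converting a sweep hit into an angle and intersecting two bearings; the 60 Hz rotor rate and all names are assumptions for illustration, not HTC’s published spec:

```csharp
using UnityEngine;

// Sketch of lighthouse-style tracking math (illustrative values):
// the delay between a base station's sync flash and the laser sweep
// hitting the sensor encodes an angle; two stations' angles fix a
// 2D position by line intersection.
public static class LighthouseMath
{
    const float RotorHz = 60f; // assumed sweep rate, one rotation per sweep

    // Time since sync flash -> sweep angle in radians.
    public static float SweepAngle(float secondsSinceSync) =>
        2f * Mathf.PI * RotorHz * secondsSinceSync;

    // Intersect bearings from two known station positions (2D case,
    // assumes the bearings are not parallel).
    public static Vector2 Triangulate(Vector2 stationA, float angleA,
                                      Vector2 stationB, float angleB)
    {
        Vector2 dirA = new Vector2(Mathf.Cos(angleA), Mathf.Sin(angleA));
        Vector2 dirB = new Vector2(Mathf.Cos(angleB), Mathf.Sin(angleB));

        // Solve stationA + t*dirA = stationB + s*dirB for t.
        float cross = dirA.x * dirB.y - dirA.y * dirB.x;
        Vector2 d = stationB - stationA;
        float t = (d.x * dirB.y - d.y * dirB.x) / cross;
        return stationA + t * dirA;
    }
}
```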

The required hardware is somewhat configurable. A headset, hand controllers, and two base stations retail for around $800, and individual trackers can be purchased for around $100 each. IKinema is licensing Orion at $500/yr per seat, making a complete system about $1,600 with the headset ($800 kit + $500 license + three extra trackers for hips and feet at $300), or under $1,400 for a six-tracker solution with two base stations.

https://ikinema.com/Orion