The processes and techniques used by the artists and engineers at Epic Games and its partners to create realistic, detailed digital characters running in real time in Unreal Engine.
Unity and NVIDIA demonstrate RTXGI ray-traced global illumination arriving in Unity 2019.3.
The nearly automated process allows, for now, a single volume of probes that can be baked into textures for distribution to devices.
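The core idea of baking a probe volume into a texture can be pictured as flattening a 3D grid of per-probe lighting data into a 2D image that devices can sample at runtime. A minimal sketch, assuming a hypothetical layout where the grid's Z-slices are stacked side by side (RTXGI's actual per-probe format is richer than a single RGB value):

```python
# Hypothetical sketch: pack a 3D grid of light-probe RGB irradiance
# values into a 2D texture by placing each Z-slice of the grid at a
# column offset of z * nx. Real probe bakers store more data per probe,
# but the addressing scheme follows the same pattern.

def bake_probe_volume(probes, nx, ny, nz):
    """probes: dict mapping (x, y, z) grid coords -> (r, g, b) irradiance.
    Returns a (nx * nz) wide by ny tall texture as nested lists."""
    width, height = nx * nz, ny
    texture = [[(0.0, 0.0, 0.0)] * width for _ in range(height)]
    for (x, y, z), rgb in probes.items():
        texture[y][z * nx + x] = rgb  # slice z starts at column z * nx
    return texture

def sample_probe(texture, x, y, z, nx):
    """Runtime lookup: invert the same addressing to fetch one probe."""
    return texture[y][z * nx + x]
```

A 2x2x2 volume, for example, bakes to a 4x2 texture, and `sample_probe` recovers each probe by reapplying the slice-offset arithmetic.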
An overview of upcoming features, enhancements, packages, templates and integrations, the different types of solutions Unity offers for film and animation, and a Q&A.
Speaker: Mathieu Muller, Sr. Technical Product Manager for Film (Unity Technologies)
- DCC Mesh Sync: live link to Maya, Max, Blender, and MotionBuilder
- Alembic package
- Unscaled Blendshapes, negative blendshapes
- Multiple scene editing, Nested Prefabs
- Nested Timelines
- HDRP (High Definition Render Pipeline, preview)
- HDRP Shader Graph
- HDRP soft shadows, volumetric lighting, planar reflections, StackLit shader
- Global Illumination: GPU accelerated ray-tracing for Baked Lightmap (preview)
- Cinemachine: Auto-focus, Storyboard, Post-processing, Maya cameras, lens presets
- Post Processing Stack
- VFX Graph (preview)
- Unity Editor on Linux CentOS (preview)
- Licensing for closed networks
- Python Support for Editor (Windows, Mac, Linux)
- Recorder v2 as package
- Post Processing v3
- Fur and Hair material
- Blendshapes: up to 16 bone influences per vertex
According to “Highlights from Unity’s Film and Animation Summit at Unite LA,” Mathieu Muller, Sr. Technical Product Manager for Film and TV, shared the product roadmap and Unity’s plan of intent for animated storytelling. This talk is suspiciously missing from YouTube.
However, many interesting features were teased during the general Product Roadmap, including video streaming and genlock coming in Unity 2019.
The full roadmap presentation is below.
(Annotation to come)
In this GDC 2017 session, members of the Unreal Engine 4 development team demonstrate several exciting upcoming animation and physics features for the engine. These include new workflows for previewing and editing animation, new tools for creating physics simulations with improved quality and performance, and other ways to improve the quality of digital characters. Our goal is to allow teams of all sizes to easily make cutting-edge, engaging experiences!
As part of this showcase, 3Lateral presents Gene Splicer, its novel approach to parametric modeling of avatars embedded in Unreal Engine, which serves as both a production and a runtime solution for democratizing high-end character rigs. Gene Splicer evaluates the appropriate rig for an animated avatar created by the user, and this all happens in milliseconds. The solution is built on a database of scanned people; through this database approach, it generates truly unique characters with facial gestures appropriate to their anatomy, while preserving the ability to use the same animation across the created population. The technology is not just high-quality, but also highly optimized for both high-end cinematics and VR applications.
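The database-driven generation described above can be pictured, in deliberately simplified form, as a weighted blend of scanned example meshes that all share one topology; shared topology is what lets the same animation data apply to any blended character. The vertex data, weights, and blending scheme below are purely illustrative, not 3Lateral's actual rig evaluation:

```python
# Illustrative sketch only: produce a "new" character mesh as a convex
# combination of scanned example meshes. Each mesh is a flat list of
# vertex coordinates; because every mesh shares the same topology, the
# same animation offsets can be applied to any generated blend.

def blend_scans(scans, weights):
    """scans: list of same-length coordinate lists; weights sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    n = len(scans[0])
    return [sum(w * mesh[i] for w, mesh in zip(weights, scans))
            for i in range(n)]
```

For instance, blending two scans with equal weights yields the midpoint of every vertex, and varying the weight vector yields a continuum of unique characters from a small scanned database.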