Morph3D has announced Artist Tools for creating clothing, hair, and props for its figure system. The tools are free to use until June 2017, when a yet-to-be-announced subscription fee kicks in. A number of licensing restrictions also apply, including a revenue share on in-app purchases of add-on game content created with the tools.
Read the full user agreement here
The Morph Character System (MCS) has already impressed the game, application, and VR worlds by rolling out the ability to create realistic custom avatars. That means game designers can build their own unique, high-quality characters, or even let their players do so. It is this very flexibility that has platforms like High Fidelity and VRChat jumping on board.
At the heart of that value proposition is the ability to have add-on content that changes to fit any character type or shape.
Now, with the launch of Morph3D’s Artist Tools, game designers can take characters even further by designing their own content for use with MCS. This lets an artist make content that changes dynamically and works across a huge variety of characters and applications. It also means artists will be able to work with Morph3D to sell their work to the growing number of app and game developers using MCS.
Lighting In A Bottle gives you effortless ray-traced light maps. Watch your lighting render directly into your textures. You can even adjust settings live and watch the estimates update, or just hit render and stop when you are happy with the quality.
Simple, easy-to-use, pure ray-traced light mapping with live feedback.
NOTE: This editor tool currently runs only on 64-bit Windows. A 32-bit Windows version, followed by an OS X version, is planned.
• See your light maps render directly onto your objects, live in the editor window.
• Get sub second feedback on the render progress with samples per second, accurate time estimates and a pixel accurate progress bar.
• Use physically based area lights and adjust their size to get hard or soft shadows.
• Set your bounces and get full global illumination complete with bounce light from your albedo textures.
• Use the integrated subdivision to render with a perfectly smooth version of your geometry, making your real time shading look even better.
• Flexible workflow: renders with the UV set you choose and writes both .exr and .png files. Iterations are automatically versioned with your Unity file.
• Intuitive error messages.
• Separate control over which objects are traced as part of the render and which objects render maps.
• Fully multi-threaded with live control over the number of threads being used.
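The "accurate time estimates" above boil down to simple arithmetic on the measured sampling rate, which is why the estimate can update live as the rate changes. A minimal sketch of that idea in Python (illustrative only, not the tool's actual code):

```python
def estimate_remaining(samples_done, samples_total, elapsed_seconds):
    """Estimate remaining render time from the measured sampling rate.

    Assumes the sampling speed stays roughly constant; re-running this
    each second gives a live-updating estimate as the rate changes.
    """
    if samples_done == 0:
        return float("inf")  # no data yet, cannot estimate
    rate = samples_done / elapsed_seconds          # samples per second
    return (samples_total - samples_done) / rate   # seconds remaining

# 2,000,000 of 8,000,000 samples in 50 s -> 40,000 samples/s,
# so 6,000,000 remaining samples take 150 s.
print(estimate_remaining(2_000_000, 8_000_000, 50.0))  # 150.0
```

The same rate figure drives the samples-per-second readout and the pixel-accurate progress bar (samples_done / samples_total).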
Eye Controller is an easy-to-use script that adds lifelike random eye movement and blinking to your characters.
– Controllable random blinking.
– Supports both random look direction and look at target modes, and can transition between them.
– Support for both blendshapes and bones.
– Supports both 3D meshes and 2D Sprites.
– Auto-target, finds and looks at marked objects when in-range.
– Highly customisable:
– Eye rotation range
– Eye turn speed
– New look direction rate
– Blink rate
– Blink speed
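Controllable random blinking of this kind typically reduces to scheduling the next blink a random interval ahead and checking it each frame. A minimal, engine-agnostic sketch in Python (names and parameters are illustrative, not the Eye Controller API):

```python
import random

class BlinkTimer:
    """Schedules blinks at random intervals, akin to a 'blink rate' setting."""

    def __init__(self, min_interval=2.0, max_interval=6.0,
                 blink_duration=0.15, rng=None):
        self.min_interval = min_interval      # shortest gap between blinks (s)
        self.max_interval = max_interval      # longest gap between blinks (s)
        self.blink_duration = blink_duration  # how long the eyes stay closed (s)
        self.rng = rng or random.Random()
        self.time_to_next = self._next_interval()
        self.blink_time_left = 0.0

    def _next_interval(self):
        # Pick a random delay until the next blink.
        return self.rng.uniform(self.min_interval, self.max_interval)

    def update(self, dt):
        """Advance by dt seconds; return True while the eyes should be closed."""
        if self.blink_time_left > 0.0:
            # A blink is in progress; count it down.
            self.blink_time_left -= dt
            return self.blink_time_left > 0.0
        self.time_to_next -= dt
        if self.time_to_next <= 0.0:
            # Time for a new blink; schedule the one after it.
            self.blink_time_left = self.blink_duration
            self.time_to_next = self._next_interval()
            return True
        return False
```

In an engine, the boolean from `update()` would drive a blendshape weight or eyelid-bone rotation each frame; the same schedule-a-random-deadline pattern also covers the "new look direction rate" setting.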
Offline Render is an easy-to-use, real-time capture plugin for Unity. It lets you capture the game view to a multi-channel OpenEXR file, supporting not just the final output image but also common elements such as depth, per-light shadows, diffuse, AO (if present in the scene as an image effect), and other G-buffers.
Offline Render allows you to render your Unity scenes and take them into a normal post-production pipeline using your favorite compositing software.
- Multi-channel OpenEXR output
- Offscreen rendering
- Settable target framerate
- 9 out-of-the-box render elements (Diffuse, Specular, Emission/Lighting, Reflections, Depth, Velocity, Normals, AO, Motion Vectors)
Limitations: requires the deferred rendering path; the shadow pass works only with directional lights.
Occipital, maker of the Structure Sensor 3D scanner for iPhones and iPads, has announced a headset designed to mount its infrared depth scanner and an iPhone running its Bridge Engine software, combining real-time camera data and room-scale motion tracking with virtual-reality environments.
iPhones are generally viewed as lagging behind the VR trend, with no official VR support from Apple, slower hardware, and relatively lower-density screens. However, Bridge promises to leapfrog the rest of mobile VR with its six-degrees-of-freedom (6DoF) positional tracking (mobile VR typically provides only rotational tracking), which lets the wearer move freely around the room, untethered from a more powerful desktop computer.
Bridge is expected in March of 2017, and comes with or without the Structure Sensor, in case you already own one.
Cinemachine is a unified procedural camera system for AAA games, film pre-visualization, virtual cinematography, and eSports. Originally released last year, Cinemachine was bought by Unity, and the base rig is now free on the Asset Store.
Cinemachine’s CEO, Adam Myhill, has joined Unity as Head of Cinematics.
Cinemachine Base Rig includes these components:
Composer – cinematically tracks and composes whatever target you define, be it an object or a bone in your character. It is a smart camera operator that procedurally films the action based on your direction of where you want the target on screen.
Transposer – mount cameras to objects with real-time offset tuning and per-axis damping controls.
Noise – multi-channel Perlin noise function that lets you create anything from handheld behaviors to speed vibrations and everything in between.
Blender – define how any camera blends from one shot to the next. Easily create huge camera state machine setups for in-game cameras.
Priority – assign a priority to cameras and have the highest-priority shot be used in any given situation.
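Priority-driven shot selection of this kind can be reduced to picking the highest-priority camera that is currently enabled. A minimal sketch in Python (illustrative names, not Cinemachine's actual API):

```python
def select_active_camera(cameras):
    """Pick the enabled virtual camera with the highest priority.

    `cameras` is a list of (name, priority, enabled) tuples. Ties go to
    the camera listed later, mimicking 'most recently activated wins'.
    """
    active = None
    for name, priority, enabled in cameras:
        if enabled and (active is None or priority >= active[1]):
            active = (name, priority)
    return active[0] if active else None

cams = [
    ("follow",   10, True),    # default gameplay shot
    ("cutscene", 100, False),  # high-priority shot, currently disabled
    ("closeup",  50, True),
]
print(select_active_camera(cams))  # closeup
```

Enabling the high-priority "cutscene" camera would immediately take over the view; disabling it falls back to the next-highest enabled shot, which is what makes large camera state-machine setups manageable.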
Cinemachine can be seen in the cinematics of Homeworld: Deserts of Kharak.