Significant Improvements to Flagship Image Generator
On 1 November VT MAK announced the release of VR-Vantage 2.4, the latest version of its flagship image generator, describing it as a major feature release that introduces significant improvements to the product.
This release enhances the rendering performance of OpenFlight and MetaFlight terrains. Terrains built by offline tools now take advantage of rendering techniques – indirect rendering, bindless textures and advanced occlusion culling – that were previously used only in procedurally generated terrains. The upgrade will benefit most those databases that are not already highly optimised for VR-Vantage’s render engine.
While substantial architectural changes were made internally to VR-Vantage, they do not significantly alter the VR-Vantage API or configuration, so the upgrade process should be straightforward.
In addition to the performance improvements above, the release incorporates a number of smaller yet important improvements, including:
Destructible terrain with indirect rendering – While previous releases have used the much faster indirect rendering approach for point features on a streaming terrain, if those point features contained switch nodes, indirect rendering was not possible. This is no longer the case; buildings with switch nodes (dynamic terrain, destructible states, doors, etc.) can now get the same speed-up as any other model by using indirect rendering. This means that dynamic content no longer limits performance in large terrains.
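Indirect rendering moves per-model draw parameters out of individual API calls and into a GPU buffer. As an illustration of how switch states can be folded into that scheme – a general GPU-driven-rendering pattern, not VR-Vantage's actual implementation – the following Python sketch packs OpenGL-style DrawElementsIndirectCommand records and hides a switch state simply by zeroing its instance count, so visibility changes never force the buffer to be rebuilt:

```python
import struct

# OpenGL's DrawElementsIndirectCommand: five 32-bit unsigned ints
# (count, instanceCount, firstIndex, baseVertex, baseInstance).
CMD = struct.Struct("<5I")

def build_indirect_buffer(draws, visible_states):
    """Pack one indirect draw command per model variant. Hidden switch
    states keep their slot but draw zero instances, so the buffer layout
    stays fixed while visibility changes from frame to frame."""
    out = bytearray()
    for d in draws:
        instances = d["instances"] if d["state"] in visible_states else 0
        out += CMD.pack(d["index_count"], instances, d["first_index"],
                        d["base_vertex"], d["base_instance"])
    return bytes(out)

# Two switch states of one destructible building (illustrative numbers).
draws = [
    {"state": "intact",    "index_count": 3000, "instances": 1,
     "first_index": 0,    "base_vertex": 0,   "base_instance": 0},
    {"state": "destroyed", "index_count": 1200, "instances": 1,
     "first_index": 3000, "base_vertex": 900, "base_instance": 0},
]
buf = build_indirect_buffer(draws, visible_states={"intact"})
```

Because only the instance counts change, a single multi-draw indirect call can render every building variant without any per-object CPU work.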
Improved image/detail blending – It is now possible to configure VR-Vantage to smoothly transition from real imagery in the distance to high-resolution land-use-based textures close to the camera.
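The idea behind such a transition can be sketched with a simple distance-driven blend weight. The sketch below is a generic smoothstep ramp between assumed near/far distances, not VR-Vantage's actual shader code:

```python
def smoothstep(edge0, edge1, x):
    # Hermite interpolation clamped to [0, 1], as in GLSL's smoothstep().
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def imagery_weight(distance_m, near=200.0, far=2000.0):
    """Weight of the distant real imagery: 0 near the camera (pure
    land-use detail texture), 1 far away (pure imagery), with a smooth
    ramp between the assumed near/far distances."""
    return smoothstep(near, far, distance_m)

# In a shader this weight would drive: colour = mix(detail, imagery, w).
```

The smooth ramp avoids the visible seam that a hard distance cut-off between the two texture sources would produce.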
DI-Guys in vehicles – Too often 3D models are built without humans in them – cars driving around the scene with no driver, or aircraft flying with no pilot. VR-Vantage can now automatically render DI-Guy characters in the driver/pilot seat without simulating an embarked entity on the network. Users can configure the specific character, appearance, and location of the character for their custom 3D vehicle models. This simple change leads to much more realistic immersive scenes, with high-detail characters where you care most.
CIGI control of articulated parts for humans – You can now directly control DI-Guy joint angles (elbows, knees, etc.) through CIGI. If your host simulation uses hand controllers, a motion-capture suit, or a video-based motion-capture system such as Microsoft Kinect to capture the motion of a real person, you can now drive the DI-Guy characters within VR-Vantage to mimic that person’s motions in real time. This feature brings high-fidelity, fine-grained control of human characters by external data sources to VR-Vantage for some really impressive immersive scenes.
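Conceptually, the host side of this feature maps each captured joint angle onto a per-part CIGI control message. The Python sketch below packs an illustrative binary message for each joint in one motion-capture frame; the field layout and the numeric part ids are stand-ins for illustration – the normative packet format is defined by the CIGI interface control document, and MAK's DI-Guy part ids are not assumed here:

```python
import struct

# Illustrative joint-control message: packet id, size, entity id,
# part id, enable flag, then roll/pitch/yaw as 32-bit floats (degrees).
# This layout is a stand-in, not the normative CIGI packet format.
MSG = struct.Struct(">BBHBB3f")

def joint_message(entity_id, part_id, roll, pitch, yaw):
    # Packet id 6 and the enable flag value are illustrative.
    return MSG.pack(6, MSG.size, entity_id, part_id, 1, roll, pitch, yaw)

# Hypothetical part ids for a DI-Guy skeleton (assumed, not MAK's).
LEFT_ELBOW, RIGHT_KNEE = 12, 23

# One motion-capture frame mapped to per-joint messages for entity 1001.
mocap_frame = {LEFT_ELBOW: (0.0, -85.0, 0.0), RIGHT_KNEE: (0.0, 40.0, 0.0)}
packets = [joint_message(1001, part, *angles)
           for part, angles in mocap_frame.items()]
```

Sending one such message per tracked joint per frame is what lets an external source, such as a motion-capture suit, puppet the character in real time.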
Improvements to the sound system API – The API used to control and produce sound in VR-Vantage has been overhauled to make it easier to extend to meet programme needs.
H.265 video streaming – VR-Vantage can now generate streaming H.265 (HEVC) video using on-board GPU hardware to drive high-performance, high-pixel-count displays.
In conjunction with this release of VR-Vantage, MAK is providing a new data package, which is also compatible with VR-Vantage 2.3.x and VR-Forces 4.6.x. Customers using VR-Forces can reinstall the software with this data build to gain access to the new terrain and model improvements. For customers not wishing to upgrade their VR-Forces data, however, VR-Forces 4.6.x will continue to function well with VR-Vantage and the new data.