Daisyfield

Our flagship rendering technology, Daisyfield, is a neural rendering engine built on a completely new approach to how visual data is represented on screen. Using bleeding-edge deep learning, it traces rays in path space at real-time rates on any consumer hardware capable of tensor acceleration. It specializes in powering digital realities at massive scale (trillions of potentially visible polygons and beyond) and in simulating natural phenomena such as thermodynamics and electromagnetism with unparalleled visual quality.

The engine is written in C++ on top of Vulkan. It is designed entirely around AI and path tracing and can be broken down into five parts:

Deconstruction Rendering (DNR) keeps GPU performance optimal while enabling smart scene sampling and reconstruction from very sparse samples.
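
As a rough illustration of the idea behind DNR (not Daisyfield code), the sketch below traces only one pixel per 2x2 block each frame and fills the remaining pixels from the sampled neighbour; all names (tracePixel, renderSparse, reconstruct) are hypothetical, and the real engine would replace the naive fill with a learned reconstruction running on tensor hardware.

```cpp
// Hypothetical illustration only, not Daisyfield code.
// One pixel per 2x2 block is path traced each frame; the rest are filled from
// the sampled neighbour in the same block. A learned reconstruction network
// would take the place of the naive fill in a real neural renderer.
#include <cstddef>
#include <vector>

struct Pixel { float r = 0, g = 0, b = 0; bool sampled = false; };

// Stand-in for an expensive path-traced pixel evaluation.
Pixel tracePixel(std::size_t x, std::size_t y) {
    return { float(x % 256) / 255.f, float(y % 256) / 255.f, 0.5f, true };
}

void renderSparse(std::vector<Pixel>& frame, std::size_t w, std::size_t h, int frameIndex) {
    // Rotate which pixel of each 2x2 block is traced so all pixels refresh over 4 frames.
    const std::size_t ox = frameIndex % 2, oy = (frameIndex / 2) % 2;
    for (std::size_t y = 0; y < h; ++y)
        for (std::size_t x = 0; x < w; ++x) {
            Pixel& p = frame[y * w + x];
            p.sampled = (x % 2 == ox && y % 2 == oy);
            if (p.sampled) p = tracePixel(x, y);
        }
}

void reconstruct(std::vector<Pixel>& frame, std::size_t w, std::size_t h) {
    // Naive reconstruction: copy colour from the sampled pixel of the same 2x2 block.
    for (std::size_t y = 0; y < h; ++y)
        for (std::size_t x = 0; x < w; ++x) {
            Pixel& p = frame[y * w + x];
            if (p.sampled) continue;
            const std::size_t bx = x - (x % 2), by = y - (y % 2);
            for (std::size_t dy = 0; dy < 2 && by + dy < h; ++dy)
                for (std::size_t dx = 0; dx < 2 && bx + dx < w; ++dx) {
                    const Pixel& q = frame[(by + dy) * w + (bx + dx)];
                    if (q.sampled) { p.r = q.r; p.g = q.g; p.b = q.b; }
                }
        }
}
```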

Smart Geometry (SMG) accelerates ray tracing, light transport and importance sampling, resulting in quicker scene sampling.
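
SMG's internals are not described here; the sketch below only illustrates the kind of importance sampling being referred to, using standard cosine-weighted hemisphere sampling for diffuse bounces, which concentrates rays where their contribution is largest so fewer samples are needed per pixel.

```cpp
// Illustration only, not SMG itself: cosine-weighted hemisphere sampling, a
// standard importance-sampling strategy for diffuse light transport. Directions
// are drawn proportionally to cos(theta), which lowers variance so fewer rays
// are needed for the same image quality.
#include <algorithm>
#include <cmath>

constexpr float kPi = 3.14159265f;

struct Vec3 { float x, y, z; };

// Map two uniform random numbers in [0,1) to a direction on the hemisphere
// around +Z with probability density cos(theta) / pi.
Vec3 cosineSampleHemisphere(float u1, float u2) {
    const float r   = std::sqrt(u1);                        // sin(theta)
    const float phi = 2.f * kPi * u2;
    const float z   = std::sqrt(std::max(0.f, 1.f - u1));   // cos(theta)
    return { r * std::cos(phi), r * std::sin(phi), z };
}

// Matching probability density, needed to weight each sample correctly.
float cosineHemispherePdf(float cosTheta) {
    return cosTheta / kPi;
}
```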

Path Guided Neural Rendering (PGNR) introduces a new baking process of voxelized autoencoders to the development pipeline that denoises sparse path-tracing samples and ensures spatial stability.
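
As a conceptual sketch only (assuming a simple linear autoencoder; the actual network and baking format are not documented here), the example below shows how a small per-voxel autoencoder with baked weights can denoise radiance features: noisy samples are projected into a low-dimensional latent space and decoded back, discarding noise the baked basis cannot represent.

```cpp
// Conceptual sketch only, not PGNR itself: a tiny per-voxel linear autoencoder
// with baked weights. Noisy radiance features gathered in a voxel are projected
// into a low-dimensional latent space and decoded back; noise the baked basis
// cannot represent is discarded. A real baking pass would learn the weights
// offline per voxel; the weights here are placeholders.
#include <array>

constexpr int kFeatures = 8; // e.g. radiance plus auxiliary features per voxel
constexpr int kLatent   = 3; // bottleneck size; smaller means stronger denoising

struct VoxelAutoencoder {
    std::array<std::array<float, kFeatures>, kLatent> encode{}; // latent x features
    std::array<std::array<float, kLatent>, kFeatures> decode{}; // features x latent

    std::array<float, kFeatures> denoise(const std::array<float, kFeatures>& noisy) const {
        std::array<float, kLatent> z{};
        for (int i = 0; i < kLatent; ++i)
            for (int j = 0; j < kFeatures; ++j)
                z[i] += encode[i][j] * noisy[j];   // project onto the baked basis
        std::array<float, kFeatures> clean{};
        for (int j = 0; j < kFeatures; ++j)
            for (int i = 0; i < kLatent; ++i)
                clean[j] += decode[j][i] * z[i];   // reconstruct denoised features
        return clean;
    }
};
```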

Deep Super Resolution Virtual Texturing (DSRVT) provides on-the-fly, high-quality texture upscaling in real time. The result is either higher effective texture resolution, or significant storage savings from shipping lower-resolution textures without degrading quality at the original resolution.
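
DSRVT's upscaler itself is not shown here; the sketch below uses plain bilinear filtering as a stand-in to illustrate the virtual-texturing flow it plugs into: a low-resolution tile is streamed in and enlarged on the fly to the resolution the renderer requests, so only the smaller tile has to be shipped and stored.

```cpp
// Illustration only, not DSRVT itself: a virtual-texture tile stored at low
// resolution is upscaled on demand. Bilinear filtering stands in for the
// learned super-resolution step.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Texel { float r, g, b, a; };

std::vector<Texel> upscaleTile(const std::vector<Texel>& src, std::size_t srcW, std::size_t srcH,
                               std::size_t dstW, std::size_t dstH) {
    std::vector<Texel> dst(dstW * dstH);
    for (std::size_t y = 0; y < dstH; ++y) {
        for (std::size_t x = 0; x < dstW; ++x) {
            // Map the destination texel back into source-tile space.
            const float sx = (x + 0.5f) * srcW / dstW - 0.5f;
            const float sy = (y + 0.5f) * srcH / dstH - 0.5f;
            const std::size_t x0 = std::size_t(std::clamp(sx, 0.f, float(srcW - 1)));
            const std::size_t y0 = std::size_t(std::clamp(sy, 0.f, float(srcH - 1)));
            const std::size_t x1 = std::min(x0 + 1, srcW - 1);
            const std::size_t y1 = std::min(y0 + 1, srcH - 1);
            const float fx = std::clamp(sx - float(x0), 0.f, 1.f);
            const float fy = std::clamp(sy - float(y0), 0.f, 1.f);
            auto lerp = [](const Texel& a, const Texel& b, float t) {
                return Texel{ a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t,
                              a.b + (b.b - a.b) * t, a.a + (b.a - a.a) * t };
            };
            const Texel top = lerp(src[y0 * srcW + x0], src[y0 * srcW + x1], fx);
            const Texel bot = lerp(src[y1 * srcW + x0], src[y1 * srcW + x1], fx);
            dst[y * dstW + x] = lerp(top, bot, fy);
        }
    }
    return dst;
}
```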

Quantum Neural Models (QNM) is designed for storing, compressing and coherently tracing geometry on the tensor pipeline, laying the groundwork for unifying multiple data structures (3D models, animation, materials, etc.) into a single model.
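
QNM's actual representation is not documented here; as an analogy only, the sketch below stores geometry as the weights of a tiny MLP evaluated as a signed-distance field and traces it directly by sphere tracing, one way geometry can live in, and be traced from, a tensor-friendly model (all weights are placeholders).

```cpp
// Analogy only, not QNM itself: geometry stored as the weights of a tiny MLP
// that maps a 3D position to a signed distance, traced directly by sphere
// tracing. The weights here are placeholders; in practice they would be fitted
// to the asset and evaluated on tensor hardware.
#include <algorithm>
#include <array>

constexpr int kHidden = 16;

struct NeuralSDF {
    std::array<std::array<float, 3>, kHidden> w1{}; // position -> hidden
    std::array<float, kHidden> b1{};
    std::array<float, kHidden> w2{};                // hidden -> signed distance
    float b2 = 0.f;

    float distance(const std::array<float, 3>& p) const {
        float d = b2;
        for (int i = 0; i < kHidden; ++i) {
            float h = b1[i];
            for (int j = 0; j < 3; ++j) h += w1[i][j] * p[j];
            d += w2[i] * std::max(h, 0.f);          // ReLU hidden layer
        }
        return d;
    }
};

// Sphere tracing: advance along the ray by the predicted distance until the
// surface (distance ~ 0) is reached or the ray leaves the traced range.
bool traceRay(const NeuralSDF& sdf, const std::array<float, 3>& origin,
              const std::array<float, 3>& dir, float maxT, float& hitT) {
    float t = 0.f;
    for (int step = 0; step < 128 && t < maxT; ++step) {
        const std::array<float, 3> p{ origin[0] + t * dir[0],
                                      origin[1] + t * dir[1],
                                      origin[2] + t * dir[2] };
        const float d = sdf.distance(p);
        if (d < 1e-3f) { hitT = t; return true; }
        t += d;
    }
    return false;
}
```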

For licensing, demos and other enquiries, contact us:

info@lightmass-dynamics.com