Re-creating 3D using machine learning

I started using photogrammetry to replicate artworks and present them in virtual environments. Photogrammetry is the science of making measurements from photographs: geometric information is extracted from overlapping images, especially for the purpose of creating realistic, authentic-looking virtual objects from the original, analogue sources.

360° photos serve as a backdrop for the 3D models, creating a wide variety of media such as visualisation videos, webshop product previews and NFTs. Using engines like Unity or Unreal, it is also possible to experience them close up with a VR headset.

VR environments

NFT Gallery Concept

This gallery was made using a seashell photogrammetry dataset, as a reaction to all the ordinary NFT galleries that turn up when you search for one.

NeRFs – Neural Radiance Fields

I also create NeRFs and Gaussian Splatting models from similar datasets. These are full 3D environments, so it is possible to create VR experiences from them as well.

Gaussian Splatting plant models

Gaussian Splatting stereo-360 render

The process of creating 3D models:

– Preparing the object and the studio lighting.
Lighting is key to a high-resolution model and texture. Apply markers if needed and illuminate the object evenly from all sides so it can be recorded properly. An accurate scan speeds up the rest of the process and produces a sharper texture.

– Recording photos and creating a mesh.
I generally shoot two or three series of 100–150 photos so that the bottom, top and sides are properly recorded and can be combined into one model. The millions of data points generated by the software produce a highly detailed mesh (see the Metashape scripting sketch after this list).

– Polishing and optimising the model.
The model usually needs to be reduced in size to be practical for websites or Instagram filters. Simplifying and retopologising the mesh to reduce file size and the load on hardware is key, and in some cases manual sculpting is needed to remove abnormalities (see the Blender decimation sketch below).
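Below is a minimal sketch of the mesh-building step using Metashape's Python scripting. It assumes Metashape Professional (the Standard edition has no Python API); the folder path and parameter values are illustrative, not my exact settings.

```python
import glob
import Metashape

# Collect the photo series (bottom, top and sides) into one list;
# the folder path is a placeholder.
photos = glob.glob("scans/seashell/*.JPG")

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos(photos)

# Align the cameras: feature matching across the overlapping photos.
chunk.matchPhotos(downscale=1, generic_preselection=True)
chunk.alignCameras()

# Build depth maps, then a dense mesh from them.
chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering)
chunk.buildModel(source_data=Metashape.DepthMapsData)

# Unwrap the mesh and bake a texture from the source photos.
chunk.buildUV(mapping_mode=Metashape.GenericMapping)
chunk.buildTexture(blending_mode=Metashape.MosaicBlending, texture_size=8192)

doc.save("seashell.psx")
```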
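And a sketch of how the optimisation step can be scripted with Blender's Python API; the decimation ratio and export path are placeholders, and in practice I also retopologise and sculpt by hand where needed.

```python
import bpy

# Assumes the photogrammetry mesh has already been imported
# and is the active object, in Object Mode.
obj = bpy.context.active_object

# Add a Decimate modifier to cut the polygon count.
decimate = obj.modifiers.new(name="Decimate", type='DECIMATE')
decimate.ratio = 0.1  # keep roughly 10% of the original faces

# Apply the modifier so the reduction is baked into the mesh data.
bpy.ops.object.modifier_apply(modifier=decimate.name)

# Export a lightweight glTF for web or filter use.
bpy.ops.export_scene.gltf(filepath="seashell_lowpoly.glb")
```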

I use Metashape to create the models and refine them in ZBrush and Blender.
A PICO headset running a Unity build lets viewers walk through the VR environments.

    Contact me