uses the UV texture coordinates to determine how to paint the three-dimensional
surface. Image projection mesh provides a captivating, hyper-realistic texture for the
scanned model.
Color vertex mesh – using the F6 Smart RGB camera, the Echo decoder samples a color
value (RGB) for each point, which is then used to paint the meshed model.
The benefits of using this option are:
– No mapping is necessary: all objects are inherently paintable without the need to
define texture coordinates.
– Without mapping, there are no mapping discontinuities.
– An intrinsic one-to-one correspondence is maintained between the 3D surface and the
color data (no duplicated color data).
– Models can be edited after coloring without resampling.
– No precomputation is required; the procedure is compatible with the current
real-time graphics pipeline.
Since mesh colors can be implemented to satisfy all the above criteria with high
performance and low memory use, the approach is ideal for many high-end applications
like 3D texture painting, and for storing precomputed data, such as ambient occlusion,
light maps, and global illumination, on a 3D surface.
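A minimal sketch of the per-vertex coloring idea (not the F6 Smart implementation
itself): each mesh vertex is painted with the RGB value of the nearest scanned point,
so no UV mapping or texture image is involved. The Open3D library, the file names, and
the nearest-neighbour lookup are assumptions made for illustration.

```python
# Sketch: paint a reconstructed mesh with per-vertex colors taken from the
# nearest point of a colored scan. Assumes "scan.ply" (RGB point cloud) and
# "mesh.ply" (reconstructed mesh) exist.
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.ply")        # colored point cloud from the scanner
mesh = o3d.io.read_triangle_mesh("mesh.ply")     # meshed model to be painted

kdtree = o3d.geometry.KDTreeFlann(pcd)           # spatial index over the scanned points
point_colors = np.asarray(pcd.colors)

vertex_colors = []
for v in np.asarray(mesh.vertices):
    # one-to-one lookup: copy the color of the closest scanned point
    _, idx, _ = kdtree.search_knn_vector_3d(v, 1)
    vertex_colors.append(point_colors[idx[0]])

mesh.vertex_colors = o3d.utility.Vector3dVector(np.asarray(vertex_colors))
o3d.io.write_triangle_mesh("mesh_colored.ply", mesh)
```

Because the color lives directly on the vertices, the model can be edited or re-meshed
later without touching any texture atlas.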
Mesh Quality preset
There are three mesh presets to choose from; each preset predefines the Poisson depth,
Poisson accuracy, trim mesh, and color equalization settings at three different levels.
The main difference between the Low, Medium, and High presets is the Poisson depth
value.
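As an illustration of how a quality preset could bundle the four parameters listed
above, here is a minimal sketch; the class, the field names, and the numeric values are
placeholders for the example, not the actual F6 Smart settings.

```python
# Hypothetical preset table: Low/Medium/High differ mainly in Poisson depth.
from dataclasses import dataclass

@dataclass
class MeshPreset:
    poisson_depth: int        # octree depth used by Poisson reconstruction
    poisson_accuracy: float   # solver accuracy (placeholder scale)
    trim_mesh: bool           # remove low-density border triangles
    color_equalization: bool  # balance color across captured frames

PRESETS = {
    "Low":    MeshPreset(poisson_depth=7,  poisson_accuracy=0.5, trim_mesh=True, color_equalization=False),
    "Medium": MeshPreset(poisson_depth=9,  poisson_accuracy=0.7, trim_mesh=True, color_equalization=True),
    "High":   MeshPreset(poisson_depth=11, poisson_accuracy=0.9, trim_mesh=True, color_equalization=True),
}
```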
The Poisson depth, or octree depth, helps organize the points of a 3D object very
efficiently. The octree is responsible for dividing the point cloud into a voxel grid by
marching through the point cloud and analyzing which points define the isosurface of the
object. By detecting which edges of each voxel are intersected by the model's isosurface,
the algorithm creates the triangles of the mesh.
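The effect of the octree depth can be reproduced with the open-source Poisson
reconstruction in Open3D; this is only an illustration of the algorithm described above,
not the F6 Smart pipeline itself, and the file name and depth values are assumptions.

```python
# Sketch: run Poisson surface reconstruction at increasing octree depths and
# observe how the resulting mesh detail grows.
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.ply")
pcd.estimate_normals()                       # Poisson reconstruction needs oriented normals

for depth in (7, 9, 11):                     # deeper octree -> finer voxel grid -> more detail
    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=depth)

    # Optional trim step: drop low-density vertices, which typically come from
    # noisy outliers far away from the real surface.
    densities = np.asarray(densities)
    mesh.remove_vertices_by_mask(densities < np.quantile(densities, 0.02))

    print(f"depth {depth}: {len(mesh.vertices)} vertices")
```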
The higher the chosen octree depth, the more detailed the results, because the deeper
the marching cubes algorithm goes, the finer the granularity of the voxel grid becomes.
On the other hand, with noisy data (such as scanned point clouds) it keeps vertices in
the generated mesh that are outliers, but the algorithm