Photogrammetry

Photogrammetry tools are at a point where they are accessible to anybody with a decent cellphone camera and an overcast day.  For an object you can walk around, take a picture every 10 degrees in three separate "halos":  one level with the object, one 30 degrees up from that ring, and one more another 30 degrees up from the second.  These three halos, plus two opposing arches that cross over the top of the object the way the sun crosses the sky, should give the software plenty of information to composite a complete textured mesh.  I can get away with a single orientation here, but normally I would also want to repeat the halo process with the object turned upside down.  While I have not used this technique in any landscape projects to date, I can see these meshes generated from photos being really useful when a designer needs to include assets in a 3D scene and constructing a model by hand is not practical or would take too long.
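To make the capture geometry concrete, here is a minimal Python sketch, not part of my actual workflow, that lays out the viewpoints: three halos at 0, 30, and 60 degrees of elevation, every 10 degrees of azimuth, plus two over-the-top arches rotated 90 degrees from each other.  The radius, the arch azimuths, and the helper names are just assumptions for illustration.

```python
import math

def halo(radius, elevation_deg, step_deg=10):
    """Camera positions for one ring ("halo") around an object at the origin."""
    elev = math.radians(elevation_deg)
    positions = []
    for az_deg in range(0, 360, step_deg):
        az = math.radians(az_deg)
        positions.append((radius * math.cos(elev) * math.cos(az),
                          radius * math.cos(elev) * math.sin(az),
                          radius * math.sin(elev)))
    return positions

def arch(radius, azimuth_deg, step_deg=10):
    """Camera positions along one arch: horizon, over the top, down the far side."""
    az = math.radians(azimuth_deg)
    positions = []
    for elev_deg in range(0, 181, step_deg):
        elev = math.radians(elev_deg)
        positions.append((radius * math.cos(elev) * math.cos(az),
                          radius * math.cos(elev) * math.sin(az),
                          radius * math.sin(elev)))
    return positions

# Three halos at 0, 30, and 60 degrees of elevation...
shots = []
for elevation in (0, 30, 60):
    shots += halo(radius=0.5, elevation_deg=elevation)

# ...plus two crossing arches over the top of the object.
for azimuth in (0, 90):
    shots += arch(radius=0.5, azimuth_deg=azimuth)

print(len(shots))  # 3 * 36 + 2 * 19 = 146 viewpoints
```

That works out to roughly 146 viewpoints, which is in the same ballpark as the 143 shots used for the knight below.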

White Knight

The left image panel shows one of the original 143 shots that were composited in RealityScan, which, along with Twinmotion, is owned by Epic.  I had to ensure the light was even on all sides of the knight, so I set up outside when the clouds were cooperative and served as an effective scrim for the sunlight.  I then took the photos as described above.  There are a number of steps before texturing, but I think the final result is convincing.  The textured mesh is now ready to be transferred to a 3D modeler/renderer.

The knight after being transferred to LightWave and lit with the Octane rendering engine...

...and the same piece in Blender, lit with the native Eevee rendering engine.
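For the Blender path, bringing the mesh in can be scripted.  The sketch below is just an illustration, assuming the textured mesh left RealityScan as an OBJ with its texture alongside; the file paths and sun-lamp placement are placeholders, and the exact operator and engine name depend on the Blender version.

```python
# Run inside Blender's Python console or via: blender --python import_knight.py
import bpy

# Import the textured mesh exported from RealityScan (placeholder path).
# Blender 3.2+ uses the built-in OBJ importer below; older versions use
# bpy.ops.import_scene.obj(filepath=...) instead.
bpy.ops.wm.obj_import(filepath="/path/to/knight.obj")

# Render with the native Eevee engine (Blender 4.2+ calls it 'BLENDER_EEVEE_NEXT').
bpy.context.scene.render.engine = 'BLENDER_EEVEE'

# A quick sun lamp so the piece isn't rendered in the dark.
bpy.ops.object.light_add(type='SUN', location=(2.0, -2.0, 3.0))

# Write out a still of the imported piece.
bpy.context.scene.render.filepath = "/tmp/knight_eevee.png"
bpy.ops.render.render(write_still=True)
```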