Photogrammetry

A global platform supporting photogrammetry artists

Depending on who you talk to, photogrammetry can mean a lot of different things. Many new technologies are being used to generate 3D models for architecture, video games, and, of course, the big topic right now: the Metaverse.

Lidar (light detection and ranging) is an imaging system that has long been used for mapping spaces and all kinds of other work. It used to be incredibly expensive, but now even the iPhone has lidar built in, and Apple is starting to let you capture the environment around you and turn it into a 3D world for virtualization. While we're on the subject of lidar, technically speaking it is not photogrammetry: lidar surveys a location by measuring reflected laser light and calculating the time of flight for each bounce, while photogrammetric calculations are based on photographic images. But the techniques are similar in that both yield physically accurate 3D versions of real environments, which can save real headaches in production and post.

We think the camera of the future will be a flat camera, about as thick as an iPad, that absorbs and times light and can calculate depth. We can also see photogrammetry having an impact on editorial, allowing an editor to select not just the best take but also to dictate the precise camera angle.
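
To make that distinction concrete, here is a minimal Python sketch contrasting the two ideas: lidar recovers distance from the round-trip time of a laser pulse, while a simplified photogrammetric (two-camera stereo) setup recovers depth from how far a feature shifts between two photographs taken a known distance apart. The timings and camera parameters are made-up examples, and real photogrammetry pipelines solve for many overlapping photos at once rather than a single stereo pair.

```python
# Minimal sketch (not production code) contrasting lidar time-of-flight
# with a simplified photogrammetric depth calculation. All numbers below
# are hypothetical examples.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(time_of_flight_s: float) -> float:
    """Lidar: distance from the round-trip time of a reflected laser pulse."""
    # Halve the result because the light travels out to the surface and back.
    return SPEED_OF_LIGHT * time_of_flight_s / 2.0

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Photogrammetry (simplified two-view case): depth from how far a feature
    shifts (the disparity) between two photos taken a known distance apart."""
    return focal_length_px * baseline_m / disparity_px

if __name__ == "__main__":
    # A laser pulse that returns after ~66.7 nanoseconds hit something ~10 m away.
    print(f"lidar:          {lidar_distance(66.7e-9):.2f} m")
    # Two cameras 0.5 m apart, 1200 px focal length, feature shifts 60 px between frames.
    print(f"photogrammetry: {stereo_depth(1200.0, 0.5, 60.0):.2f} m")
```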

Examples of photogrammetry in major feature films include The Matrix (1999), Fight Club (1999), Panic Room (2002), and Quantum of Solace (2008). The HBO Max series Station Eleven did photogrammetry of its sets, so that if something came up later, the production could put an actor in front of any of those backgrounds, anywhere in the world, and drop the background in behind them. With virtual production, you can actually put that capture up on a wall in a motion environment: instead of simply showing images on a screen, you can display a full 3D model inside Unreal Engine.

Characters created with technology like Unreal Engine's MetaHuman Creator look great, but they still need to be animated realistically, either by hand or with the help of mocap data. If a character is created through photogrammetry, however, realistic movements based on real physical performances can be baked into the asset.
