I seem to recall a prior Epic technology that had something to do with automatically scaling models down to fewer polygons. I'm not sure whether it is completely automatic or works the way I'm assuming, but I imagine there's probably some amazing synergy with this technology.
> Nanite virtualized micropolygon geometry frees artists to create as much geometric detail as the eye can see. Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine—anything from ZBrush sculpts to photogrammetry scans to CAD data—and it just works. Nanite geometry is streamed and scaled in real time so there are no more polygon count budgets, polygon memory budgets, or draw count budgets; there is no need to bake details to normal maps or manually author LODs; and there is no loss in quality.
Yes, I believe that's what I'm recalling; I remember a cave walkthrough was demonstrated along with it.
I would think a technology that claims "the more detail the better" and promises to handle the optimization for you would pair very well with high-quality scanning tech for making very detailed models based on the real world.
It's generally called "LOD", dynamically changing the level of detail of an object based on the distance from the camera.
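To make that concrete, here's a minimal sketch of traditional distance-based LOD selection. This is purely illustrative C++ (not Unreal Engine's API, and all the names are made up): pick one of several pre-made meshes depending on how far the object is from the camera.

```cpp
#include <cmath>
#include <vector>

struct Mesh { int triangleCount = 0; /* vertex/index data omitted */ };

struct LodLevel {
    float maxDistance;  // use this mesh while the camera is closer than this
    Mesh  mesh;
};

struct Vec3 { float x, y, z; };

float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Levels are assumed sorted from nearest (highest detail) to farthest (lowest).
const Mesh& SelectLod(const std::vector<LodLevel>& levels,
                      const Vec3& cameraPos, const Vec3& objectPos) {
    float d = Distance(cameraPos, objectPos);
    for (const LodLevel& level : levels) {
        if (d < level.maxDistance) return level.mesh;
    }
    return levels.back().mesh;  // fall back to the lowest-detail mesh
}
```

The point is that each entry in `levels` is a separate, usually hand-authored mesh; the engine only chooses between them.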
It has been a pretty widely used optimization technique for a long time, and not specific to Unreal Engine (though I can't find which 3D renderer introduced it first).
The impression I got was that instead of dynamically switching between a few pre-made models for an asset at specific distances, it could automatically scale a very high-poly model down to lower quality (that is, generate the lower-quality model automatically from the high-quality one), either on the fly or maybe JIT, but I'm not sure whether that's accurate or whether that's also old tech.
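Here's a hedged sketch of that idea (again illustrative C++, not how Nanite or Unreal actually works): rather than hand-authored LOD meshes, lower-detail versions are derived automatically from the single high-poly source, generated lazily and cached. `SimplifyMesh` stands in for a hypothetical decimation routine (e.g. iterative edge collapse), and the distance-to-budget mapping is made up.

```cpp
#include <algorithm>
#include <unordered_map>

struct Mesh { int triangleCount = 0; /* geometry data omitted */ };

// Stand-in for a real decimation algorithm (e.g. iterative edge collapse);
// here it only records the requested triangle budget.
Mesh SimplifyMesh(const Mesh& source, int targetTriangles) {
    Mesh m = source;
    m.triangleCount = std::min(source.triangleCount, targetTriangles);
    return m;
}

class AutoLod {
public:
    explicit AutoLod(Mesh highPoly) : source_(highPoly) {}

    // Map camera distance to a triangle budget, then return a mesh simplified
    // to that budget, generating it on first use and caching the result.
    const Mesh& MeshForDistance(float distance) {
        int budget = TriangleBudget(distance);
        auto it = cache_.find(budget);
        if (it == cache_.end()) {
            it = cache_.emplace(budget, SimplifyMesh(source_, budget)).first;
        }
        return it->second;
    }

private:
    static int TriangleBudget(float distance) {
        if (distance < 10.0f)  return 1'000'000;  // near: keep nearly full detail
        if (distance < 100.0f) return 100'000;    // mid-range
        return 10'000;                            // far away
    }

    Mesh source_;
    std::unordered_map<int, Mesh> cache_;
};
```

Whether the real systems do this offline (pre-baked), at load time, or per frame is exactly the part I'm unsure about.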
Edit: See your sibling comment, which came in just after I originally replied here.
I was in the back room of Epic MegaGames in '97 when Tim Sweeney showed me his automatic LOD on the Skaarj model. It was pretty basic compared to the current tech, but I think I remember him saying it didn't require lower-poly meshes/maps to be generated by the animators/artists. I don't know if the LOD was pre-baked or not at the time, but given that we were running the first MMX chips, it probably was.