Oh for f***s sake, someone delete that from YT please. Stupid stupid stupid.
Looks really good though
Oh come on, this is just scanning existing real world locations and converting them to some static dataset that can be cached in real time.
And you don't even see any dynamic lighting or shading, so it's not obvious that the surfaces are super noisy and messy, that there's no clear distinction between objects at large scales, and so on.
Same bullshit, only now they have more varied material to display. It's still impossible to make a good game out of it, especially anything comparable to GTA or even COD.
I'd really rather not see anything more of this but I guess it'll flood the internet once again...
Laa-Yosh's point isn't to undervalue the effort of effective capture, but to point out the massive limitations of the end data. 3D captures are great for historical building preservation. They'll be great for fairly static adventure games with fixed locales to look around (maybe their game is a murder mystery set in a mansion?). But Euclideon are placing themselves as superior to all the other 3D tech companies because they have broken through the photorealistic-detail barrier that no-one else can, ignoring completely that everyone is pursuing solutions that are dynamic.

What good is a cathedral interior if it can only be rendered at one time of day, at one time of the year, in one weather condition? Also, their realtime comparisons were with characters (Avatar), yet they haven't shown characters, only scenery. Let's see what their engine does with Avatar. Or how about picking up one of those cathedral candlesticks and putting it on the floor. Rubbishy old Elder Scrolls can't manage that photorealistically because they're not using Teh Awsomist FuturTech Euclideon!
Just? 3D scanning from multiple views and reconstructing complex environments is not trivial. Assuming they didn't use some third-party software for it or spend eons cleaning shit up, it's quite an achievement.
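To put the "not trivial" claim in perspective: even the simplest building block of multi-view reconstruction, triangulating a single point from two already-calibrated cameras, is a small least-squares problem, and a capture pipeline has to do this (plus calibration, matching and cleanup) for millions of points. A minimal sketch of that one step, assuming known 3x4 projection matrices P1 and P2; everything here is illustrative, not anything Euclideon has actually shown:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coordinates."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous least squares: the right singular vector with the smallest
    # singular value minimises |A X| subject to |X| = 1.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # dehomogenise to (x, y, z)
```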
Euclideon is perfectly entitled to trumpet their own achievements, but they shouldn't hype it unrealistically and basically lie about their tech by leaving questions unanswered and letting people fill in the details themselves.
As a talking point, it could be, with enough scanning and by calculating variances. Imagine a whole load of shots of the same scene at different times of day. As the colours change, you could derive a neutral albedo from the integral of every point, say. Then you can compute the variance from the albedo at a given time and store that as just a delta, with a representation that allows for tweening between different time points. I'm sure there's a lot that could be done with scanning, including geometry, surface properties, and lighting data. It's just nothing like the cure-all it's being represented as. And, of course, it can't handle anything that's not scanned, which is probably more what you're saying (no flaming torches or first-person flashlights happening in any Euclideon scene). And that problem isn't solvable by scanning, really.
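Roughly what that would look like, as a toy sketch and purely my own assumption about the representation (per-point colours averaged into an albedo, per-capture deltas, linear tweening between the two nearest capture times):

```python
import numpy as np

def bake_albedo_and_deltas(captures):
    """captures: dict mapping time-of-day (hours) -> (N, 3) RGB array,
    one colour sample per scanned point, same point order in every capture."""
    times = sorted(captures)
    stack = np.stack([captures[t] for t in times])      # (T, N, 3)
    albedo = stack.mean(axis=0)                         # neutral colour per point
    deltas = {t: captures[t] - albedo for t in times}   # per-time offsets
    return albedo, deltas, times

def shade_at(albedo, deltas, times, t):
    """Tween linearly between the two nearest captured times."""
    t0 = max([x for x in times if x <= t], default=times[0])
    t1 = min([x for x in times if x >= t], default=times[-1])
    if t0 == t1:
        return albedo + deltas[t0]
    w = (t - t0) / (t1 - t0)
    return albedo + (1 - w) * deltas[t0] + w * deltas[t1]
```

Everything here is still baked data, of course: it can interpolate between conditions that were actually scanned, but it can't produce the flaming torch or flashlight that was never captured.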
Photogrammetry's most advanced implementation is the Lightstage developed by Paul Debevec; it's a huge and heavy instrument in a big room - a geodesic sphere with computer-controlled LED lights - so it's not really portable at all.

There was a portable lightstage shown at Siggraph. I don't know how good it is, and it's smaller than the original Lightstage.
I'm beginning to think that this... campaign has very little to do with acquiring video game customers. Looks more like an ego thing, really. I mean any sane developer would immediately see through the bullshit; there's no way to actually sell them the tech, and all the talk and video is aimed at clueless gamers instead.

Or clueless customers that need some 3D engine for visualisation, or game companies with CEOs that are totally detached from the technology.

You would be surprised how many plainly idiotic decisions are made within big companies that aren't driven by engineers.
They are back.

The surfaces look way too static. There are no specular highlights and no reflections. Any modern PBR pipeline produces a better looking lighting result. They have lots of geometry detail, and that's very nice, but otherwise their pipeline is not that impressive. It's only rendering single textured, unlit surfaces; all diffuse lighting is baked offline into the "textures".
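For comparison, here's a toy illustration of the difference being described, with Lambert plus Blinn-Phong standing in for a full PBR BRDF; the function names and parameters are mine, not anything from Euclideon's demo:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade_baked(baked_texel):
    # What the demo appears to do: return the pre-lit scan colour as-is.
    # No response to view direction, light direction, or time of day.
    return baked_texel

def shade_dynamic(albedo, n, l, v, light_col, spec=0.5, shininess=64):
    # A minimal dynamic alternative: Lambert diffuse + Blinn-Phong specular.
    n, l, v = normalize(n), normalize(l), normalize(v)
    h = normalize(l + v)                       # half vector
    diffuse = albedo * max(np.dot(n, l), 0.0)
    specular = spec * max(np.dot(n, h), 0.0) ** shininess
    return (diffuse + specular) * light_col    # changes with light and camera
```

The point of the contrast: the second function's output moves with the light and the camera (highlights, shading falloff), which is exactly what a scan with lighting baked into its colours cannot do.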