Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

Status
Not open for further replies.
I disagree that it's the only way to get a comparable run, but it's certainly the easiest, and imo useless.
Not useless if you're trying to determine where the bottleneck is: CPU or GPU. Unfortunately it's hard to tell. There is a lot going on in the game world all at once, and I can't really differentiate a GPU bottleneck from a CPU bottleneck just by watching, since it's not always clear why the performance is dropping.
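One common way to get at the CPU-vs-GPU question raised above (not something from this thread, just the standard heuristic) is to run the same scene at two resolutions: if frame time barely improves when resolution drops, the GPU wasn't the limiter. A minimal sketch, with invented sample numbers:

```python
# Hedged sketch of the classic bottleneck test: compare frame times at
# native vs. reduced resolution. If dropping resolution speeds things up
# noticeably, the GPU was the bottleneck; if not, something
# resolution-independent (usually the CPU) is holding the frame rate back.
# All sample numbers below are invented for illustration.

def classify_bottleneck(ft_native_ms, ft_half_res_ms, tolerance=0.10):
    """Compare average frame times (ms) at native vs. reduced resolution."""
    avg_native = sum(ft_native_ms) / len(ft_native_ms)
    avg_half = sum(ft_half_res_ms) / len(ft_half_res_ms)
    speedup = (avg_native - avg_half) / avg_native
    return "GPU-bound" if speedup > tolerance else "CPU-bound"

# Big improvement at half resolution -> GPU was the limiter.
print(classify_bottleneck([33.4, 33.1, 34.0], [20.1, 19.8, 20.5]))
# Barely any improvement -> CPU (or engine logic) is the limiter.
print(classify_bottleneck([33.4, 33.1, 34.0], [32.9, 33.0, 33.2]))
```

This is exactly why a resolution-scaling slider on PC makes bottleneck hunting so much easier than watching console footage.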
 
Yeah, it's some data for sure, but it's like those synthetic built-in game benchmarks that were popular years ago: they give you flawless fps and you're happy, then you play the real game and it runs like shit ;d
 
Lol, indeed. But for the sake of science, it would have been worthwhile to get an idea of what's going on under the hood.
 
This is the same discourse I hear whenever a DF analysis doesn't suit a certain crowd, or someone's console of choice isn't leading in a benchmark.

NXGamer = PS fanboy...
Alex = PC troll...
John = PS fanboy...
Rich = MS shill...

And, so on and so on...

That's certainly not an equivalence I would have drawn.

Yes, the guys at DF have their platform preferences (like we all do) and make no secret of it. But that's fine, because the analyses they produce are well balanced and scientific.

Putting any talk of platform bias aside (and NX has never openly expressed his personal preference, as far as I'm aware), I think the issues raised here are more about the accuracy of his technical analysis and the methods used to arrive at his conclusions, which are very different from the standards the DF guys adhere to.
 

No, I'm simply stating that whenever benchmarking practices come into question, it's usually console warriors not liking a particular outcome. Hence my statement about DF and NXGamer regularly being called biased whenever some group's or individual's e-penis isn't measuring up.
 
Not sure if you agree or disagree here. Tessellation is a covering of a surface with tiles. I think one can use it in the context of video games to mean "tessellating = adding more polygons". Based on one of his posts, I believe NXGamer actually uses the word that way.
In the context of computer graphics, tessellation always means subdividing an object, most usually a triangle mesh but also higher-order surfaces, into (more) triangles.
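To make the graphics sense of the word concrete, here is a toy sketch of one tessellation step: midpoint subdivision, where every triangle is split into four smaller triangles (the function and example mesh are invented for illustration, not any engine's API):

```python
# Toy illustration of "tessellation" in the computer-graphics sense:
# subdividing a triangle mesh into more triangles. One midpoint
# subdivision step turns each triangle into four.

def midpoint(a, b):
    return tuple((x + y) / 2 for x, y in zip(a, b))

def subdivide(triangles):
    """One tessellation step: each (a, b, c) triangle -> four triangles."""
    out = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

tri = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]  # a single 2D triangle
once = subdivide(tri)
twice = subdivide(once)
print(len(once), len(twice))  # 4 16
```

Note the covered area and silhouette never change; only the triangle count grows, which is why hardware tessellation is used to add geometric detail on the fly.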
 
If anybody is interested, NXGamer answered a few remarks that were also made here:
" nanite doesn't fit the definition of a proxy (replacing data with another often simpler representation)"
But it does, at any one point in a frame a 1million +polygon model with 2 million or more verts will be possibly 900 Polys and about 1500 verts, this is a reconfiguration and a simplification of the original data. In this case quite literally.

" nor is it tessellation (adding/interprolating data that isn't there"
Again, tessellation is a mathematics term and it means to create or fill a surface with multiple shapes to fill the same space/silhouette. Then what happens with nanite is it constantly 'adapts' the tessellation to create a close as visual view of the base object with the least amount of verts/tris as possible. Adaptively tessellating the inner surface at all times, which you can quite clearly see in the visualiser and even count them as they change.

"Nanite is just sampling data that is there and is the true representation of the object. Not a proxy and not tessellation."
Nanite is sampling from the source model verts/objects and creating a compressed representation of that at all times but using (most likely) a complexity based compression (As in Data science ranges of data can be compressed easier/better the less complex they are and this is likely why it only supports rigid body models). Ala at any one frame to the next it will be a Simpler/reduced/smaller model of the base model unless big enough to require all Verts/Tris i.e. a Proxy representation of the huge source import.

Think of it like this: if it rendered all the polygons of every imported model at all times (the full data, no reductions), it simply would not run at playable frame rates. It has to simplify the model on the fly, and is likely using something like the approach described above to manage the vert count and reduce it with a compression technique.
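The core idea being argued about above can be sketched in a few lines. This is NOT Epic's actual Nanite algorithm (which works on a hierarchy of triangle clusters), just a toy model of the principle: keep several precomputed simplifications of a mesh and, each frame, pick the coarsest one whose geometric error projects to less than about a pixel on screen. All triangle counts, error values, and function names here are invented:

```python
# Toy sketch of error-driven LOD selection, the principle behind the
# "proxy" behaviour described above. NOT Epic's Nanite implementation.
# Each entry is (triangle_count, geometric_error_in_world_units),
# coarsest representation first; the last entry is the full source mesh.
LODS = [(900, 0.50), (12_000, 0.10), (180_000, 0.01), (1_000_000, 0.0)]

def pick_lod(distance, screen_height_px=2160, fov_scale=1.0):
    """Return the triangle count of the coarsest LOD whose simplification
    error projects to under one pixel at the given view distance."""
    for tris, err in LODS:
        projected_px = err * screen_height_px * fov_scale / max(distance, 1e-6)
        if projected_px < 1.0:
            return tris
    return LODS[-1][0]  # fall back to the full-detail mesh

# Far away, a ~900-triangle proxy is visually indistinguishable from the
# million-triangle import; the closer the camera, the more source data
# has to be sampled.
for d in (5000.0, 500.0, 50.0, 5.0):
    print(d, pick_lod(d))
```

The key point for the thread: whether you call the result a proxy, adaptive tessellation, or sampling, the renderer is always drawing a reduced representation whenever the full mesh would be sub-pixel detail anyway.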
 