Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

Exaggerating the effect seems in line, though, with the fact that they had very little time with the actual hardware. Probably it was not their intention.
Huh? They developed it on a Titan V, which surely had DXR drivers, so why would it suddenly change because there's acceleration for it? It's still the same DXR.
 
It's probably best to start a new thread if you want to promote Kyle's self-interests. As Guru3D said:

https://forums.guru3d.com/threads/n...stom-rtx-2080-ti-reviews.422723/#post-5579179

I believe the intent of it is being questioned, and rightfully so. Whether you believe it's his "self-interest" or not, the public is gaining insight into the inner workings of the cogs. As already mentioned, nVidia uses a 5-year NDA. I've not heard of the same from AMD.

Another point is that even if it's true, AMD would be more justified in this approach, as they've been shafted multiple times by reviewers who were biased. Even on this forum, one member brought to light just such an occasion.

https://forum.beyond3d.com/posts/1472023/
NV's review guides kinda frowned on ATI's catalyst AI and said something to the effect that they do not do optimizations like that... so that's why I turned it off.


https://forum.beyond3d.com/posts/1472046/
After getting some constructive criticism at some other hardware communities, I have decided to re-run the HD 5830 benchmarks to create a more apples-to-apples comparison with regards to texture filtering and optimizations. Apparently you can't turn off NV's driver-level optimizations, and when I did the review I turned Catalyst AI off of the ATI card.

I'm re-running the ATI benchmarks with Catalyst AI on Standard, with the texture filtering set the same as NV's: @ Quality, not High Quality.

I will amend the review when I have all the data in.

And that goes back 8 years. So it's a bit obtuse to claim that AMD does it as well, in the same way as nVidia, when we know first-hand that AMD would have needed to do it out of necessity, IMO.
 
I believe the intent of it is being questioned, and rightfully so. Whether you believe it's his "self-interest" or not, the public is gaining insight into the inner workings of the cogs. As already mentioned, nVidia uses a 5-year NDA. I've not heard of the same from AMD.
Kindly open a new thread relevant to the topic you want to discuss.
 
2080 Ti Time Spy GPU score:

[image: Time Spy GPU score chart]


Seems about 20-25% faster than a 1080 Ti, so about what we expected, really.
 
So reviewers are going to end up with very little time to test?
 
They simply shouldn't publish their review until they have had enough time, and should instead put up a simple "Review coming later because Nvidia didn't provide hardware in time" message to act as a sort of de-hyper. I don't know why any of the testers and writers ever tolerate these hard deadlines, especially when they benefit the OEMs more than the review sites.
 
I don't know why any of the testers and writers ever tolerate these hard deadlines, especially when they benefit the OEMs more than the review sites.
What are they supposed to do instead when many of them survive on page views? Aren't they really at the mercy of the hardware vendor, especially with Nvidia basically controlling the market and dictating terms with NDAs?
 
The only way would be for all major reviewers to agree not to publish their reviews until a certain date. But that would be very difficult to achieve.
 
Is it better for them to rush out a review that covers the same ground as everyone else, or to take the time to do it properly and offer a unique perspective?
 
According to Digital Foundry, who spoke to DICE engineers on site:

1-BFV was being developed on a Titan V, so the whole RTX effort lacked proper RT acceleration from the RT cores.
2-DICE only received Turing hardware two weeks before the demo.
3-DICE is also not currently using the Tensor cores for denoising.
4-At this stage, performance is 40-50 fps at 1440p and sub-30 fps at 4K; DICE is amazed the game worked at 4K at all.
5-DICE will make the quality of the ray tracing scalable to work on different GPUs, and will also decouple the RT resolution from the rendering resolution.
6-RT acceleration is currently held back because ray tracing only runs after the G-buffer stage; DICE will change it to run asynchronously alongside rasterization, which should yield a significant FPS boost.
7-DICE will also merge the drawn object instances into the same acceleration structure; they expect this to bring an almost 30% increase in ray-tracing performance (see the sketch after this list).
8-DICE is looking into RT AO and RT particles.
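
Items 5 and 7 map fairly directly onto the public DXR (D3D12) API. Below is a minimal, hypothetical C++ sketch; the function names, the rtScale parameter, and the mesh list are invented for illustration and are not DICE's actual code. It shows rays being dispatched at a fraction of the render resolution, and several geometries being packed into a single bottom-level acceleration structure (BLAS) so the top-level structure has fewer instances to traverse.

```cpp
// Hypothetical illustration only: DispatchScaledRays, BuildMergedBlas and
// rtScale are invented names; these are just the public DXR calls the two
// optimizations above would go through.
#include <d3d12.h>
#include <vector>

// Item 5: decouple ray-tracing resolution from render resolution by
// dispatching fewer rays than there are output pixels; a denoiser or
// upsampler reconstructs the full-resolution result afterwards.
void DispatchScaledRays(ID3D12GraphicsCommandList4* cmdList,
                        D3D12_DISPATCH_RAYS_DESC desc, // shader tables filled in elsewhere
                        UINT renderWidth, UINT renderHeight, float rtScale)
{
    desc.Width  = static_cast<UINT>(renderWidth  * rtScale); // rtScale = 0.5f -> 1/4 the rays
    desc.Height = static_cast<UINT>(renderHeight * rtScale);
    desc.Depth  = 1;
    cmdList->DispatchRays(&desc);
}

// Item 7: merge many drawn objects into ONE bottom-level acceleration
// structure instead of building one BLAS (and one TLAS instance) per draw.
void BuildMergedBlas(ID3D12GraphicsCommandList4* cmdList,
                     const std::vector<D3D12_RAYTRACING_GEOMETRY_DESC>& meshes,
                     D3D12_GPU_VIRTUAL_ADDRESS scratch, // sized via GetRaytracingAccelerationStructurePrebuildInfo
                     D3D12_GPU_VIRTUAL_ADDRESS dest)
{
    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_INPUTS inputs = {};
    inputs.Type           = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_TYPE_BOTTOM_LEVEL;
    inputs.Flags          = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_PREFER_FAST_TRACE;
    inputs.DescsLayout    = D3D12_ELEMENTS_LAYOUT_ARRAY;
    inputs.NumDescs       = static_cast<UINT>(meshes.size()); // all meshes in one structure
    inputs.pGeometryDescs = meshes.data();

    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC build = {};
    build.Inputs                            = inputs;
    build.ScratchAccelerationStructureData  = scratch;
    build.DestAccelerationStructureData     = dest;
    cmdList->BuildRaytracingAccelerationStructure(&build, 0, nullptr);
}
```

The trade-off in the merged-BLAS approach is flexibility: a combined structure can no longer be culled or transformed per object, which is presumably why DICE would only merge instances that are drawn together anyway.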


None of that^ even matters if nVidia's $1,200 graphics card cannot give us stable 75-100+ frames at 4K.

Most of us "advanced gamers" will just stick with our G-Sync 3880 x 1440 and 3440 x 1440 gaming displays and our 1080 Tis (RTX is a bust)... and just wait 3 months until AMD releases their 7nm Vega 128 FreeSync 2 cards for $1,200, then upgrade to a nice 4K display.

Nvidia's proprietary ray tracing doesn't matter to the end gamer when you have DX12, Vulkan, DXR, and other open standards. Nvidia is just using marketing propaganda to sway people away from performance and toward visuals.

Gamers want and demand lower latencies and better sustained frame rates, not higher reflection and lighting fidelity...
 
What are they supposed to do instead when many of them survive on page views?

If the articles are worth reading, [maybe] you get multiple ad impressions per page view.
Except if you think your readership has the attention span of a "resting" hummingbird... by god's design, instead of by media's own vicious-circle design. It's not a feedback loop. The so-called marketing research has no interest in people's true desires and needs, and all the interest in strategically splicing a second here and there from people's already stretched time budget. Giving 100 articles in an hour to a consumer - or was it cattle? - is obviously of higher value than 1 article for 100 hours. Exposure means proof of activity; "reading" a page for a long time can also mean someone had a long workday. ;) Anyway. /OT Sorry. If I say I lament Nvidia's decision to jump on this train themselves... I guess it's not really more on-topic. :)
 