Nvidia giving free GPU samples to reviewers that follow procedure


And "Speaking for their audience" makes this channel a club, huh? Doesnt sound very neutral and fair. What is with new viewers finding their reviews through google and youtube searches?



I linked to the 6800 XT review. HUB has called better ray tracing support and DLSS "questionable features" and "no major selling point".
Why are you so keen on trying to force others to not only like but pretty much worship DLSS and RT?
 
It's an easy enough future to predict.



- DLSS will extend the usability of RT on the current 3000 cards for a couple of years in the games that support both. But they will hit RAM limitations before their RT performance becomes unusable, thus nullifying both the rasterization and RT advantages.

- The 6000 series will have the RAM to sustain their rasterization performance and outlive the 3070/3080 by many years, but their RT is already unusable due to the lack of good upscaling tech.



What do you play? What do you value? How often do you upgrade? These are HU's arguments in a nutshell, and I cannot disagree with any of them.
 
It's an easy enough future to predict.

- DLSS will extend the usability of RT on the current 3000 cards for a couple of years in the games that support both. But they will hit RAM limitations before their RT performance becomes unusable, thus nullifying both the rasterization and RT advantages.

- The 6000 series will have the RAM to sustain their rasterization performance and outlive the 3070/3080 by many years, but their RT is already unusable due to the lack of good upscaling tech.

What do you play? What do you value? How often do you upgrade? These are HU's arguments in a nutshell, and I cannot disagree with any of them.
You forgot the part where AMD is bringing its own upscaling method (supposedly in cooperation with Microsoft) early next year (if the schedule holds).
 
Why are you so keen on trying to force others to not only like but pretty much worship DLSS and RT?

DF called RT a game changer in CP2077. DLSS was praised all over for the performance it brings to the table with minimal impact on visual fidelity. Both MS and Sony advertised their hardware as 'RT capable' many times over, making it a selling point and demonstrating its effects.
Now just about any next-gen game has some form of it. But yeah, I guess they're forcing it upon us.

It's an easy enough future to predict.

Someone has predicted the future, very easily. How many times have we seen that on forums :p
I think the 3000 series of GPUs will see variants with larger VRAM allocations to begin with. Aside from that, Ampere isn't any worse in normal rasterization compared to RDNA2. On top of that, it has next-gen-defining features such as ray tracing and reconstruction tech using its tensor cores. RDNA2's RT is some 200% slower; upscaling can make things faster, but that goes for both NV and AMD.
4K performance is important too, especially if you dislike tech such as DLSS. PC gamers going for a $600 GPU or more aren't going to spend the next three years at native 1440p, I hope.

It seems to me that it's better to wait for RDNA3 or even 4 if you want to go AMD.
 
'The channel where we CUT through the marketing BS'

I assume he's going to rant about RT being useless for about 10 minutes. He looks angry and calls everything stinking and a conspiracy.
That such content (from a flat earther) makes it here is impressive on its own. Do you actually, by sharing this link, consider that you're also attacking your own platform of choice? The PS5 has RT; is it all marketing BS from Sony? Same for MS? Same for Intel? Is the RT part of the hardware going to end up totally unused later on, because it's all BS? And since DF calls it a game changer, they are also part of the conspiracy according to him. (Do they get paid by NV, AMD, Sony and MS?)

I honestly think that in all the hesitation you're forgetting that all platforms/hardware have some sort of RT; it's just that NV has the most performant solution. But by downplaying RT because of that, you nullify it for all hardware vendors and consoles.
Even if you downplay it and bury it in the ground, the games are there, even on consoles, doing RT in the best ways possible for the given hardware. It's not going to disappear.
 
PSman1700, it's really easy to predict because it happened in the past with pixel shaders, etc.

None of the first generations of cards available at the introduction of a new rendering paradigm will be any good at it when it finally becomes standard.

It's a terrible investment if you are buying them for RT, because in two years' time it doesn't matter if the 3080 is 3x faster than the 6800 XT when neither of them can do above 25 fps at 4K. If you go by RAM needs over those same two years, the 3070/3080 are an even worse investment.

Like you said, if you can, then wait for the end of 2021. If you play at 4K and need to buy one now, go for RAM, because consoles dictate the baseline, and now they have 16 GB but very simple RT.
 
Come on. Not every YouTuber is worth watching. These clickbait and hate YouTubers are the next level.

This is straight from Cyberpunk in 4K without upscaling: https://imgsli.com/MzM4MDA
Like HUB, these people use platforms like YouTube to spread FUD and nonsense. HUB is hiding it under the "reviewer umbrella"...

Yes, it's disgusting; it's indeed the next level. It's not even that, it's the fact that it's appearing on an 'adult' forum, which B3D seems (or seemed) to be.

This is straight from Cyberpunk in 4K without upscaling: https://imgsli.com/MzM4MDA

And that comparison is quite 'generous'. Playing around with the settings, in some scenes, especially some indoors with lots of different lighting, it's a huge difference in fidelity. LTT was gobsmacked by the difference. On some occasions, like outside in the daylight, the difference isn't as stark. But the city at night with the fog hanging between the skyscrapers... it's a dramatic difference. Also, it depends on resolution, DLSS mode, what RT level is used, even framerate. With the YouTube link, he could be creating the ideal situation for non-RT. Which isn't unlikely seeing what kind of person you're looking at.

PSman1700, it's really easy to predict because it happened in the past with pixel shaders, etc.

None of the first generations of cards available at the introduction of a new rendering paradigm will be any good at it when it finally becomes standard.

It's a terrible investment if you are buying them for RT, because in two years' time it doesn't matter if the 3080 is 3x faster than the 6800 XT when neither of them can do above 25 fps at 4K. If you go by RAM needs over those same two years, the 3070/3080 are an even worse investment.

Like you said, if you can, then wait for the end of 2021. If you play at 4K and need to buy one now, go for RAM, because consoles dictate the baseline, and now they have 16 GB but very simple RT.

Dunno, predicting the future is one of the hardest tasks out there. But yes, looking back is a possibility. That also means the flexible PS2 hardware didn't have a future at all. It was the same discussion back then (here, search for it). The PS2 was flexible and devs 'could do what they want', as opposed to what Xbox was doing in a much less flexible way (pixel/vertex shaders).

Well, yeah, that makes some sense. I could agree with that to a point, but how about the consoles? They're in an even worse position following your post. At 10/12 TF they're going to be completely useless for RT very quickly. On top of that, they don't have 16 GB of RAM available as video RAM either, aside from bandwidth problems.
I think, to an extent thanks to consoles, RT is not going to be completely useless for either RDNA2 or the RTX 3000 series for some years.

Agreed, I would say wait for 2021 for anyone considering a complete upgrade, but that's mostly due to the price/performance ratio.
Games made for consoles may have very simple RT (see DF's latest on Spider-Man), but that doesn't mean you can't crank it up on PC. Cyberpunk on consoles will have RT, just simple.
Besides, consoles do not have 16 GB of VRAM like the 6800 XT does, for example, which has the entire 16 GB dedicated to the GPU and its bandwidth.

If you're really unsure, get the 6800 XT or something, since that's never going to be left behind, and at double the performance of consoles you're going to enjoy both better RT and normal rendering. (Then again, I don't think a 3080 will be obsolete anytime soon; rather, I believe it will age quite well.)
 
Well, yeah, that makes some sense. I could agree with that to a point, but how about the consoles? They're in an even worse position following your post. At 10/12 TF they're going to be completely useless for RT very quickly. On top of that, they don't have 16 GB of RAM available as video RAM either, aside from bandwidth problems.
I think, to an extent thanks to consoles, RT is not going to be completely useless for either RDNA2 or the RTX 3000 series for some years.

I don't buy the memory argument. At least Sony exclusives can bypass memory limitations with streaming. Oodle compression can get a 1:2 ratio for textures, leading to 11 GB/s of effective bandwidth. The flops/RT performance argument I do buy. It will be limited ray tracing complemented by traditional techniques, including screen-space algorithms, on consoles. It will be the same for PC too, but less limited. CP2077 has 5 different effects done with ray tracing. As seen in the DF optimization guide, some of them can be turned off or set lower without major loss in quality.

I like the Spider-Man approach of performance mode vs. quality mode. It's like curated PC settings to choose from, though of course PC games allow more tweaks for the user. But in essence the PC has to do that, as there is a much wider variety of hardware that gamers have.
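That effective-bandwidth figure is just raw throughput multiplied by the compression ratio. A minimal sketch, assuming ~5.5 GB/s of raw SSD throughput for the PS5 (the numbers are illustrative, not spec-sheet values):

```python
# Back-of-the-envelope: effective streaming bandwidth from a compression ratio.
# The raw throughput and ratio below are assumptions for illustration only.

def effective_bandwidth(raw_gb_per_s: float, compression_ratio: float) -> float:
    """GB/s delivered to memory if assets are stored compressed on the SSD."""
    return raw_gb_per_s * compression_ratio

# ~5.5 GB/s raw PS5 SSD read with a 1:2 Oodle texture ratio
print(effective_bandwidth(5.5, 2.0))  # -> 11.0 GB/s, the figure quoted above
```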
 
I don't buy the memory argument. At least Sony exclusives can bypass memory limitations with streaming. Oodle compression can get a 1:2 ratio for textures, leading to 11 GB/s of effective bandwidth. The flops/RT performance argument I do buy. It will be limited ray tracing complemented by traditional techniques, including screen-space algorithms, on consoles. It will be the same for PC too, but less limited. CP2077 has 5 different effects done with ray tracing. As seen in the DF optimization guide, some of them can be turned off or set lower without major loss in quality.

I like the Spider-Man approach of performance mode vs. quality mode. It's like curated PC settings to choose from, though of course PC games allow more tweaks for the user. But in essence the PC has to do that, as there is a much wider variety of hardware that gamers have.

Yes, but the streaming applies to all platforms. If NV wasn't lying, we might see up to 14 GB/s of bandwidth, with direct access down the line. Xbox has its own fast solution too. That, and I still don't think NVMe drives can or will replace GPU bandwidth, as in Infinity Cache or straight balls-to-the-wall 800 GB/s worth of Ampere bandwidth.
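Putting those numbers side by side shows why streaming can help keep VRAM fed but can't stand in for it; a small sketch using the rough figures from this thread (assumed, not measured):

```python
# Claimed compressed NVMe streaming throughput vs. on-card VRAM bandwidth.
# Both figures are the ballpark numbers mentioned in this thread, not measurements.

claimed_streaming_gb_s = 14.0   # compressed NVMe streaming claim
ampere_vram_gb_s = 800.0        # rough high-end Ampere GDDR6X bandwidth

ratio = ampere_vram_gb_s / claimed_streaming_gb_s
print(f"VRAM is roughly {ratio:.0f}x faster than the claimed streaming path")
# -> ~57x: streaming feeds VRAM, it does not replace VRAM bandwidth
```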

Spider-Man offering different modes is really nice, giving a bit more choice. It's only doing RT for reflections though, at a lower quality if you want performance. CP2077 can provide five different forms of high-quality ray tracing at the same time, if you have the hardware for it. As a user shared here, even Psycho RT mode makes a visual difference, where I think it's enough to warrant having it enabled if possible.

Seems most easily visible on reflections. So those with a GTX 1080 can probably still enjoy ray tracing; a GTX 1660 Super got 20-25 fps with ray-traced reflections rendering at 1080p, so a GTX 1080 and equivalents are probably able to get 30 fps.

Wouldn't recommend going by that video, where he tries his best to diminish RT as much as possible. Watch LTT's or DF's comparisons instead. And yes, 1080 users can have ray tracing, but it's an immersion breaker IMO, especially since you lose a lot of performance due to the lack of HW ray tracing and the inability to use reconstruction tech.
 
Yes, but the streaming applies to all platforms. If NV wasn't lying, we might see up to 14 GB/s of bandwidth, with direct access down the line. Xbox has its own fast solution too. That, and I still don't think NVMe drives can or will replace GPU bandwidth, as in Infinity Cache or straight balls-to-the-wall 800 GB/s worth of Ampere bandwidth.

I didn't want to make assumptions there. PC is so complicated, as there is no good baseline. Unreal 5 is a fully streaming solution though, so perhaps it will drive adoption come 2022+. Hopefully Microsoft will push streaming earlier. I wouldn't be surprised if we didn't need big jumps in GPU memory capacities thanks to streaming. Another angle is that we already have 4K, and it's also unlikely we would need more memory due to increasing resolution. 8K is a pipe dream for a long time to come.

DirectStorage is a major reason why I'm waiting for Zen 4 for my PC upgrade. I hope by that time we'll know exactly what is required, and new CPUs, IO chips, motherboards, SSDs, etc. will be optimized. I tend to keep a PC for a long time, so I hope my next upgrade will last me as well as or better than my current rig from 2013. It wouldn't hurt if Intel were more competitive and parts in general were cheaper. The new Intel PCIe 4 Optane looks insane, though a consumer version hasn't been announced yet. My next PC is going to be a beast.

https://www.anandtech.com/show/16318/intel-announces-new-wave-of-optane-and-3d-nand-ssds
 
I don't buy the memory argument.

What isn't there to buy? You see frame-pacing stutters with the 3070 in some benchmarks due to RAM limits, today.

This is not new information.

We've seen GPUs hitting RAM limits all the time. If you bought one of these two cards 3 years ago - the 1060 6GB vs. the 390 8GB - which one do you think is already stuttering due to RAM limits? I know which. 10 GB vs. 16 GB is night and day longevity-wise.
 
I didn't want to make assumptions there. PC is so complicated, as there is no good baseline.

There never has been on PC; it's the inherent disadvantage of a dynamic platform. Though, it has always been the platform where the newest tech shines first, and where you usually get the best versions of games (at a higher cost). The consoles somewhat suffer from the same problem, with different hardware (mid-gen refreshes, double SKUs), though to a much lesser extent.

Unreal 5 is a fully streaming solution though, so perhaps it will drive adoption come 2022+. Hopefully Microsoft will push streaming earlier. I wouldn't be surprised if we didn't need big jumps in GPU memory capacities thanks to streaming. Another angle is that we already have 4K, and it's also unlikely we would need more memory due to increasing resolution. 8K is a pipe dream for a long time to come.

The UE5 tech demo was supposedly running great(er) on PC hardware too. Whether we will see smaller memory jumps due to NVMe streaming... maybe; I think we need both to improve. The more you're going to stream onscreen, the more processing power you need to have coming from somewhere.

DirectStorage is a major reason why I'm waiting for Zen 4 for my PC upgrade. I hope by that time we'll know exactly what is required, and new CPUs, IO chips, motherboards, SSDs, etc. will be optimized. I tend to keep a PC for a long time, so I hope my next upgrade will last me as well as or better than my current rig from 2013. It wouldn't hurt if Intel were more competitive and parts in general were cheaper. The new Intel PCIe 4 Optane looks insane, though a consumer version hasn't been announced yet. My next PC is going to be a beast.

Yeah, agreed, I see no need to upgrade now either to keep up. The price-to-performance ratio is always terrible at the start of a new generation; it's totally unneeded too. I also tend to keep my hardware as long as possible; that way you get the most value.
Intel not competitive? I think at the top end they're still the performance king regarding CPUs. It's GPUs where they need to enter the market.
Yes, I'm getting the PCIe 4 Optane too whenever the time is ripe. It's insane on all levels. Heck, even their current Optane is, at certain levels.

10 GB vs. 16 GB is night and day longevity-wise.

Consoles are probably closer to the 10 GB cards than the 16 GB ones. Obviously RDNA2 PC GPUs have the huge RAM advantage, which NV will have an answer to soon.
 
You are taking that value from the percentage the PS4 had for games. But assuming the same percentage on the PS5 means the OS would reserve 6 GB, which is enough to run Windows 10, let alone a cut-down version of FreeBSD that could run on a potato.

We will see. The entire argument is based on what's available today. Next year is another story.
 
You are taking that value from the percentage the PS4 had for games. But assuming the same percentage on the PS5 means the OS would reserve 6 GB, which is enough to run Windows 10, let alone a cut-down version of FreeBSD that could run on a potato.

We will see. The entire argument is based on what's available today. Next year is another story.

A wild guess would be around 13 GB allocated to the GPU for the PS5. 3 GB seems feasible for the OS, background game capture, etc.?
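A quick sketch of the two splits being argued here; the PS4 figures are rough, commonly cited numbers and should be treated as assumptions:

```python
# Two ways to guess the PS5's game-available memory split.
# PS4 figures below are rough, commonly cited numbers (assumptions, not specs).

ps4_total_gb, ps4_game_gb = 8.0, 5.0   # assumed PS4 split (~3 GB OS reserve)
ps5_total_gb = 16.0

# 1) Scale the PS4 percentage up to the PS5's pool (the "6 GB OS reserve" reading)
scaled_os_gb = ps5_total_gb * (1 - ps4_game_gb / ps4_total_gb)

# 2) Assume a roughly fixed ~3 GB OS reserve (the "13 GB for games" guess above)
fixed_os_gb = 3.0

print(f"scaled OS reserve: {scaled_os_gb:.1f} GB -> {ps5_total_gb - scaled_os_gb:.1f} GB for games")
print(f"fixed OS reserve:  {fixed_os_gb:.1f} GB -> {ps5_total_gb - fixed_os_gb:.1f} GB for games")
```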
 