Nvidia giving free GPU samples to reviewers that follow procedure

For PS5 I suppose between 13 and 14 GB is available to games, probably the same as Xbox: 13.5 GB for games and 2.5 GB for the OS. They don't need more with an SSD.

Like XSX, I suppose around 10 GB of the RAM is used as VRAM.

I think 10 GB of VRAM is probably enough for a PC GPU.

There never has been on PC; it's the inherent disadvantage of a dynamic platform. Though it always has been the platform where the newest tech shines first, and where you usually get the best versions of games (at a higher cost). The consoles somewhat suffer from the same problem, with different hardware (mid-gen refreshes, double SKUs), though to a much lesser extent.



The UE5 tech demo was supposedly running great(er) on PC hardware too. Will we see fewer improvements due to NVMe streaming? Maybe, but I think we need both to improve. The more you're going to stream on screen, the more processing power you need coming from somewhere.



Yeah, agreed, I see no reason to upgrade now either just to keep up. The price-to-performance ratio is always terrible at the start of a new generation, so it's totally unneeded too. I also tend to keep my hardware as long as possible; that way you get the most value.
Intel not competitive? I think at the top end they're still the performance king regarding CPUs. It's GPUs where they need to enter the market.
Yes, I'm also getting the PCIe 4 Optane whenever the time is ripe. It's insane on all levels. Heck, even their current Optane already is, on some levels.



Consoles are probably closer to the 10 GB cards than the 16 GB ones. Obviously RDNA2 PC GPUs have the huge RAM advantage, to which NV will have an answer soon.

The UE5 tech demo was supposed to run well on a low-speed SSD if you write the data sequentially for the last sequence, if I remember the deleted Chinese blog article correctly. Not a good solution, but I suppose all PC SSDs at 2.4 GB/s or better will be more than enough.
 
What isn't there to buy? You already see frame-pacing stutters with the 3070 on some benchmarks due to RAM limits, today.

This is not new information.

We've seen GPUs hitting RAM limits all the time. If you bought one of these two cards 3 years ago, the 1060 6GB vs the 390 8GB, which one do you think is already stuttering due to RAM limits? I know which. 10GB vs 16GB is night and day longevity-wise.

I was commenting in the PS5 context, for Sony exclusives, which can rely heavily on streaming. The PS5 can pull textures at a speed of 11 GB/s. I doubt the PS5's memory amount will become a limitation. The PC side is a whole other ball game until games that heavily use streaming, and by extension DirectStorage, become mainstream. For this I offer Unreal 5 as an example. I expect that in 2022 we will start to see some Unreal 5 games shipping that require a decently fast SSD. It looks like Unreal 5's streaming speed requirement is heavily tied to the resolution used: the 1080p-to-4K difference could be a 4x difference in needed streaming speed. If one assumes 1080p requires 500 MB/s, then 4K would be around 2 GB/s.
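That 1080p-to-4K scaling can be sketched with a quick back-of-the-envelope calculation (assuming, as in the post above, that required streaming bandwidth scales with pixel count and that 500 MB/s is the 1080p baseline; both numbers are guesses, not official UE5 figures):

```python
# Rough estimate of streaming bandwidth vs. resolution.
# Assumption: required bandwidth scales linearly with pixel count,
# anchored at a guessed 500 MB/s baseline for 1080p.

BASELINE_PIXELS = 1920 * 1080
BASELINE_MBPS = 500  # assumed MB/s needed at 1080p

def required_streaming_mbps(width: int, height: int) -> float:
    """Scale the 1080p baseline by the ratio of pixel counts."""
    return BASELINE_MBPS * (width * height) / BASELINE_PIXELS

print(required_streaming_mbps(1920, 1080))  # 500.0
print(required_streaming_mbps(3840, 2160))  # 2000.0 (~2 GB/s, the 4x figure)
```

Since 4K has exactly four times the pixels of 1080p, the linear-scaling assumption reproduces the 4x figure quoted above.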
 
'The channel where we CUT through the marketing BS'

I assume he's going to rant on RT being useless for about 10 minutes.
How about watching it with an open mind rather than pre-judging based on your own personal biases? That's a lot to ask of you, I know. But at least try. Jesus.

But I guess trashing the video is a lot better than actually admitting that RT is not worth the performance hit at this point... Completely predictable with the exact same users.

Do you realize that, by sharing this link, you're also attacking your own platform of choice? The PS5 has RT; is it all marketing BS from Sony? Same for MS? Same for Intel? Is the RT part of the hardware going to go totally unused later on, because it's all BS? And since DF call it a game changer, they are also part of the conspiracy according to him (do they get paid by NV, AMD, Sony and MS?).
Just FYI, I have a PC, a Nintendo Switch, and an Xbox One. No PS5 or Xbox Series X to speak of. And RT is about the last reason on the list to get one of those consoles.

I honestly think in all the hesitation you're forgetting that all platforms/hardware have some sort of RT; it's just that NV has the most performant solution. But by downplaying RT because of that, you nullify it for all hardware vendors and consoles.
Even if you downplay it and bury it into the ground, the games are there, even on consoles, doing RT in the best ways possible for the given hardware. It's not going to disappear.
Being the most performant solution means very little when, with a $1500 RTX 3090, you can only achieve a measly 4K/30fps in Cyberpunk with DLSS Quality enabled and RT settings turned on...
 
Come on. Not every YouTuber is worth watching. These clickbait and hate YouTubers are on another level.
He has great content. Much better than the mindless spouting that I see on here daily from certain users.

This is straight from Cyberpunk in 4K without upscaling: https://imgsli.com/MzM4MDA
What is that? 1080p?
Honest question... Which one do you think is the RT image? Because I suspect you might be confusing the RT one with the SSR one...

Like HUB, these people use platforms like YouTube to spread FUD and nonsense. HUB just hides it under the "reviewer umbrella"...
This is yet again another one of those instances where your opinion about certain people or groups of people says more about you than about them.
 
How about watching it with an open mind rather than pre-judging based on your own personal biases? That's a lot to ask of you, I know. But at least try. Jesus.

But I guess trashing the video is a lot better than actually admitting that RT is not worth the performance hit at this point... Completely predictable with the exact same users.

The video you're talking about is a flat earther bashing tech that exists in all popular machines, even your PS5.

But I guess trashing the video is a lot better than actually admitting that RT is not worth the performance hit at this point

Say that to Sony, DF, MS, CDPR etc.
 
But I guess trashing the video is a lot better than actually admitting that RT is not worth the performance hit at this point... Completely predictable with the exact same users.

This is a highly subjective topic. For me, with a 3070 FE + DLSS quality mode + RT ultra, it's completely worth it in CP2077 when playing at "1440p" resolution. It's single-player, I play it non-competitively, and I really enjoy the higher-fidelity graphics as part of the gaming experience. In something like Fortnite, CoD, or Destiny, which are competitive shooters, I wouldn't use ray tracing. I might not even use max settings, to make it easier to see enemies.
 
PSman1700, it's really easy to predict because it happened in the past with pixel shaders etc.

None of the first generations of cards available at the introduction of a new rendering paradigm will be any good at it when it finally becomes standard.

Well, you can opt to wait for it to become standard. Not sure what that even means, to be honest.
Others might well prefer to just go ahead and use it today.

It's a terrible investment if you are buying them for RT, because in two years' time it doesn't matter if the 3080 is 3x faster than the 6800 XT when neither of them can do above 25fps in 4K. If you go by RAM necessity over those same 2 years, the 3070/3080 are an even worse investment.

I tend to think of investments as things that appreciate value over time.
Consumer electronics on the other hand I call disposable income.
But I know there are some people that treat them like fine wine. Whatever floats your boat, I guess.

Like you said, if you can, then wait for the end of 2021. If you play at 4K and need to buy one now, go for RAM, because consoles dictate the baseline, and right now they have 16GB but very simple RT.

Or you can, you know, buy the card with the feature that games support today.
And worry about having a card with 16GB at such a time when games benefit from that.
 
Well, you can opt to wait for it to become standard. Not sure what that even means, to be honest.
Others might well prefer to just go ahead and use it today.



I tend to think of investments as things that appreciate value over time.
Consumer electronics on the other hand I call disposable income.
But I know there are some people that treat them like fine wine. Whatever floats your boat, I guess.



Or you can, you know, buy the card with the feature that games support today.
And worry about having a card with 16GB at such a time when games benefit from that.

Your second quote is just embarrassing. I won't argue what investments can be. I will, though, introduce you to the meaning of "context". And the context of everything you quoted from me was people who upgrade every 3-5 years and have to buy one today. The Maxwell/Pascal/Vega audience, if you will.

The argument is not "always buy more RAM". The argument is: "Do you need to upgrade this year? Please don't. If you do, choose RAM." For those who can upgrade yearly to $800 cards, well...
 
Your second quote is just embarrassing. I won't argue what investments can be. I will, though, introduce you to the meaning of "context". And the context of everything you quoted from me was people who upgrade every 3-5 years and have to buy one today. The Maxwell/Pascal/Vega audience, if you will.

The argument is not "always buy more RAM". The argument is: "Do you need to upgrade this year? Please don't. If you do, choose RAM." For those who can upgrade yearly to $800 cards, well...

Yeah. So my take is: if you're getting an upgrade now, get the card that runs everything well AND has good RT. And enjoy the hell out of it.

Unless you anticipate deriving more pleasure from spending the next 3-5 years pointing out that your priorities are better than those of most people.
 
Yeah. So my take is: if you're getting an upgrade now, get the card that runs everything well AND has good RT. And enjoy the hell out of it.

Unless you anticipate deriving more pleasure from spending the next 3-5 years pointing out that your priorities are better than those of most people.

Not much to say, except that if that is how you choose to interpret what I said, by all means go ahead and do your thing.
 
That made my day. A flat earther making videos about tech :D Thanks for letting me know before I started giving that retard one more view.
I'm pretty sure he has no clue whether the YouTuber is flat earther or not, it was just the "worst insult" he could figure out for someone who doesn't share his views on importance of RT.
 
I'm pretty sure he has no clue whether the YouTuber is flat earther or not, it was just the "worst insult" he could figure out for someone who doesn't share his views on importance of RT.
I googled it before making my post. And now, looking closer, I'm not sure that's true anymore :-|

@PSman1700 can you please elaborate? :)
 
I'm pretty sure he has no clue whether the YouTuber is flat earther or not, it was just the "worst insult" he could figure out for someone who doesn't share his views on importance of RT.
I mean... He was trying to make a fool of Hardware Unboxed... Of course he's going to trash the little guy. It's pretty much the same as nVidia's mentality. At least nVidia apologized, which this person is in no way going to do. And that makes him worse than nVidia.
 
Just watch the beginning of that video.
I must be missing something, but in what I heard or saw in the first 4 minutes nothing proves he's a flat earther. I would even go further: what he says looks quite OK to me; it doesn't seem too partisan (and I really tend to like RT, so I could be considered slightly biased, though I like to listen to all arguments from all sides).
 
It's a completely fair comparison. There is nothing he could elaborate on. RT looks better but it's not transformative.
 
It's a completely fair comparison. There is nothing he could elaborate on. RT looks better but it's not transformative.

“Looks better but not transformative” can describe everything since DirectX 9 debuted in 2002. We just got faster hardware since then that’s running the same old vertex and pixel shaders but at higher resolutions with nicer textures.

If RT isn’t transformative then what is?
 
“Looks better but not transformative” can describe everything since DirectX 9 debuted in 2002. We just got faster hardware since then that’s running the same old vertex and pixel shaders but at higher resolutions with nicer textures.

If RT isn’t transformative then what is?

Transform & Lighting maybe?

Sorry I'll let myself out.
 
As far as gaming tech goes, and in terms of how much effect it has on the overall rendering, real-time ray tracing would have to be one of the most significant iterative steps of the last decade or so.
 