Value of Hardware Unboxed benchmarking

This graph can be misleading. 1600x1200 resolution would be something like 4K nowadays; it wasn't common. The most common would be 800x600, 1366x768, 1024x768 or 1280x1024. And that's before getting into the wonder that CRT monitors were, where lowering the resolution did not have a very noticeable impact on image quality. Using AA was more of a requirement for LCD monitors.

56MACcqGSnMWqt8XBYtZ6U-1200-80.gif
farcry.png
These things are all relative of course, but if you were using a high-end GPU from 2002 onward in the pre-LCD era, there is a high chance it was paired with a CRT capable of 1600x1200 and easily 1280x1024. If you had an old 800x600 or 768-line set, then you were probably not running a 9800 Pro, X800 XT, 6800 Ultra or anything like that, but rather one of the lower-end, cheaper models. In which case using 4xMSAA even at 800x600 would be expensive.

Speaking from my experience, both CRTs that I switched from in that era did 1600x1200.
 
Most CRTs did 1600x1200, but it led to lower refresh rates (60 Hz, which is very low on a CRT), so in many cases a lower resolution was preferable.

But this is arguing around the core issue. RT is not some special thing which has never happened previously. Over the years there were numerous graphical features and advances which, when used, resulted in the same if not higher performance loss for lower visual gain, and tended to be unusable on anything but high-end GPUs. Never has this been a problem before, but now suddenly with RT it is.

Also, RT will never run on low-end GPUs as well as on high-end ones. RT is easily scalable, and once we are past the period where your choice is essentially to use it or not, the choice will shift to the amount of RT you're willing to use. This is already happening with Avatar, Outlaws and Indiana Jones. The scaling will likely increase in the future. RT is a bottomless sink of performance: the more you throw at it, the fewer noise-like artifacts there will be.
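On the noise point, here is a minimal sketch (a toy integrand, not any game's actual RT pipeline, just the standard Monte Carlo argument behind path-traced RT): the spread of a random estimate shrinks roughly with the square root of the sample count, so quadrupling the rays buys about half the noise.

```python
# Minimal sketch of why more rays mean less noise (toy integrand only,
# not any specific renderer): Monte Carlo error ~ 1/sqrt(samples).
import random
import statistics

def estimate(samples: int) -> float:
    # Toy "lighting integral": average a random integrand over [0, 1).
    return sum(random.random() ** 2 for _ in range(samples)) / samples

for spp in (1, 4, 16, 64, 256):
    runs = [estimate(spp) for _ in range(2000)]
    print(f"{spp:>3} samples/pixel -> noise ~ {statistics.stdev(runs):.4f}")
# Each 4x increase in samples roughly halves the noise, which is why RT can
# absorb whatever performance budget you are willing to give it.
```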
 
Yep, you need to learn so much about AA and I don't have the time to help you.
This is not constructive discussion. Engage properly.

The point is that in the past, 'high-end graphics' meant AA+AF. Enabling these reduced the framerate. DavidGraham has presented examples of the framerate being significantly affected, on the order of half, by these high-end settings.

Your counter needs to be about the presented data on the topic at hand. Does the data prove or disprove that GPU framerates tanked when running high-end settings? Explain your argument with reference to existing or new evidence.
 
Most CRTs did 1600x1200, but it led to lower refresh rates (60 Hz, which is very low on a CRT), so in many cases a lower resolution was preferable.

But this is arguing around the core issue. RT is not some special thing which has never happened previously. Over the years there were numerous graphical features and advances which, when used, resulted in the same if not higher performance loss for lower visual gain, and tended to be unusable on anything but high-end GPUs. Never has this been a problem before, but now suddenly with RT it is.

Also, RT will never run on low-end GPUs as well as on high-end ones. RT is easily scalable, and once we are past the period where your choice is essentially to use it or not, the choice will shift to the amount of RT you're willing to use. This is already happening with Avatar, Outlaws and Indiana Jones. The scaling will likely increase in the future. RT is a bottomless sink of performance: the more you throw at it, the fewer noise-like artifacts there will be.

I believe the main point, which seems to be ignored, is that the market today is much bigger than it was back then. The PC market nowadays is very similar to that of consoles, but with a little more freedom, as you can modify both the games and the hardware.

With that in mind, it is understandable why the kind of abrupt technological leaps that happened in the past no longer feel as strong today.

Another point that I believe has a big influence is the price of the hardware. In 2007, an 8800 GT was $300. Nowadays, for a similar performance gain, you need to pay 5x more. We could say that what really hinders technological evolution and greater adoption of new technologies is price inflation.

These things are all relative of course, but if you were using a high-end GPU from 2002 onward in the pre-LCD era, there is a high chance it was paired with a CRT capable of 1600x1200 and easily 1280x1024. If you had an old 800x600 or 768-line set, then you were probably not running a 9800 Pro, X800 XT, 6800 Ultra or anything like that, but rather one of the lower-end, cheaper models. In which case using 4xMSAA even at 800x600 would be expensive.

Speaking from my experience, both CRTs that I switched from in that era did 1600x1200.

I think this is a false equivalence. Not every upgrade was complete; it was very common for a certain imbalance to occur, since a more powerful GPU was more impactful than a monitor with a higher resolution. CRT monitors, in addition to being less sensitive to changes in resolution, were also mostly small, so pixel density was another reason the impact wasn't as great. And as mentioned above, many monitors ran at lower refresh rates at high resolutions.
 
1600x1200 resolution would be something like 4K nowadays
That's true, and people evaluate high-end GPUs at 4K resolution.
The most common would be 800x600, 1366x768, 1024x768 or 1280x1024
Just like today: 1080p is the most common resolution by far, yet we evaluate high-end GPU performance by its 1440p or 2160p performance.
Using AA was more of a requirement for LCD monitors.
Back in the day I had a 17-inch CRT that maxed out at 1280x1024, and it still needed AA for most games, as without it the experience was very jaggy.

Also, in the Doom test you posted, the fps is CPU-limited at 1024x768, yet AA still caused massive performance drops even at that low resolution. See here with Doom 3.

3423.png

3425.png
 
I was running Tomb Raider at 320*240 in software, and used the image correction buttons on the CRT to change the size of the image to fill a bigger part of my 640*480 monitor...
Until I spent a whole month's salary from my very first job to buy a Voodoo 3D card.

It's always been that way.
Someone who had an 800*600 CRT monitor had no reason to buy a 6800 Ultra.
 
Freshen up your memory.
?
The large drops you showed are due to high resolution and frame buffer limitations.

EDIT: Look at the 7800 GTX review: https://www.anandtech.com/show/1717/8

As you show, Doom 3 has a big drop with AA. However, as Derek wrote:
"Unlike most of the other games we're looking at, Doom 3 actually places quite a strain on the memory bandwidth of the graphics card. This seems to be the a common occurrence with many of the OpenGL games, though Doom 3 more so than others."

But look at the other games. For example:
Battlefield 2 demo
1600*1200 no AA: 68.4 FPS
1600*1200 AA: 53.4 FPS

The drops for turning on AA were not ~50% for most games and GPUs unless you were running into an obvious VRAM problem.
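For reference, the relative cost of enabling AA in those figures is just 1 − (FPS with AA ÷ FPS without AA); a quick sketch using the numbers quoted above, plus the Doom 3 figures from the same review that are quoted later in the thread:

```python
# Relative frame-rate cost of enabling AA on the 7800 GTX at 1600x1200,
# using the figures quoted from the AnandTech review in this thread.
def aa_cost(fps_no_aa: float, fps_aa: float) -> float:
    """Fraction of performance lost when AA is enabled."""
    return 1.0 - fps_aa / fps_no_aa

results = {
    "Battlefield 2 demo": (68.4, 53.4),
    "Doom 3": (91.0, 54.0),
}
for game, (no_aa, with_aa) in results.items():
    print(f"{game}: {aa_cost(no_aa, with_aa):.0%} slower with AA")
# Battlefield 2 loses ~22%, while bandwidth-heavy Doom 3 loses ~41%.
```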

Steve forms strong opinions early and triples down on them. He personally only wants to play shooters with max frame rates, so he is quick to downplay graphics fidelity tech.

He doesn’t care for RT, HDR or even OLED.

Their active community is very pro-AMD after years of Steve going to bat for Zen repeatedly, even when it was notably inferior for gaming, by leaning on power draw comparisons as his main talking point. So polls and comments on HUB content will be very pro-AMD regardless of reality.

And yet he ignores Reflex, which makes playing "shooters with max frame rates" even more awesome. Obviously this guy likes everything which makes AMD look equal or better than Nvidia.

These simple concepts seem to elude Hardware Unboxed and the other copycat YouTubers; they seem fundamentally incapable of understanding the PC platform itself. I would have respected them if they had gone on a crusade against all taxing visual features, as at least then they would be consistent! But no, they went on a crusade against ray tracing alone, confirming their bias and arbitrary standards.

I don't know why we are even still debating ray tracing when the architect of the PS5 (Mark Cerny) said that raster is no longer good enough, and that ray tracing + machine learning are the future in which we are going to achieve big jumps in quality and performance.
This is hilarious.
Are you watching the same channel as most of us here?

HUB have copped a tonne of flak from the AMD fanbase for slamming AMD’s various fumbles. To suggest that they’re pro-AMD is absurd.
 
The large drops you showed are due to high resolution and frame buffer limitations.
No, it happens even at 1024x768, see above. It happens on every GPU and at every resolution.

Please take the time to read the graphs carefully before posting; above there are tests for Doom 3 and FEAR showing similar large drops.
 
No, it happens even at 1024x768, see above. It happens on every GPU and at every resolution.

Please take the time to read the graphs carefully before posting; above there are tests for Doom 3 and FEAR showing similar large drops.
This is just factually not true, or cherry-picking at best.

Please have a look at the old Anandtech review I posted.
 
An interesting comparison is the visual benefit AA+AF brought compared to ray tracing. I would say RT comes out ahead when given the “full RT” treatment. When looking at the typical RT implementations, I would go with the huge IQ boost of AA+AF.
 
This is just factually not true, or cherry-picking at best.

Please have a look at the old Anandtech review I posted.
From your Anand article of the "brand new" 7800 GTX:
  • Doom 3 at 1600x1200 no AA: 91fps
  • Doom 3 at 1600x1200 with 4xAA: 54fps (decrease of 41%)
  • Everquest at 1600x1200 with no AA: 38fps
  • Everquest at 1600x1200 with 4xAA: 21.5fps (decrease of 44%)
  • Guildwars at 1600x1200 with no AA: 111fps
  • Guildwars at 1600x1200 with 4xAA: 55fps (decrease of 50%)
(for those wondering, this was the lowest resolution tested.)

Yeah, there are a couple of games in there which "only" lost about 20-30%, yet somehow you're accusing someone else of cherry-picking? I even left out the 2048x1536 numbers because they make your attempted counter-examples look even worse.

I think you've made his point for him.
 
This is just factually not true, or cherry-picking at best.

Please have a look at the old Anandtech review I posted.
From that very same review, the 7800 GTX goes:
  • Guild Wars goes from 110fps to 55fps after AA.
  • Splinter Cell Chaos Theory goes from 83fps to 58fps after AA.
  • Knights of the Old Republic 2 goes from 78fps to 52fps after AA.

So it's definitely not a Doom 3 problem. On the last page you can see benchmarks of Far Cry and FEAR exhibiting the same issue. Here they are again!

2848.png
2849.png


fear3.jpg
fear2.jpg
 
From your Anand article of the "brand new" 7800 GTX:
  • Doom 3 at 1600x1200 no AA: 91fps
  • Doom 3 at 1600x1200 with 4xAA: 54fps (decrease of 41%)
  • Everquest at 1600x1200 with no AA: 38fps
  • Everquest at 1600x1200 with 4xAA: 21.5fps (decrease of 44%)
  • Guildwars at 1600x1200 with no AA: 111fps
  • Guildwars at 1600x1200 with 4xAA: 55fps (decrease of 50%)
(for those wondering, this was the lowest resolution tested.)

Yeah, there are a couple of games in there which "only" lost about 20-30%, yet somehow you're accusing someone else of cherry-picking? I even left out the 2048x1536 numbers because they make your attempted counter-examples look even worse.

I think you've made his point for him.
Christ almighty. Did you even read my post or what the reviewer said?

Those remaining on this site who are genuinely interested in discussion and learning are welcome to read the review.
 
An interesting comparison is the visual benefit AA+AF brought compared to ray tracing. I would say RT comes out ahead when given the “full RT” treatment. When looking at the typical RT implementations, I would go with the huge IQ boost of AA+AF.
AF had some insanely high costs initially too, and the cost tended to be highest when using maximum AF levels. With low AF levels the gains often weren't as obvious. It is the same with RT "levels" now.
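To put a rough number on that, here is a simplified texture-sampling model (an assumption for illustration only, ignoring texture caches and the angle-dependent early-outs real GPUs use): each anisotropic probe is itself a bilinear or trilinear fetch, and NxAF can take up to N probes per lookup, so the worst case grows linearly with the AF level.

```python
# Simplified worst-case cost model for anisotropic filtering (illustration
# only; real hardware caches texels and rarely hits the full probe count).
BILINEAR_TAPS = 4                      # one bilinear sample reads 4 texels
TRILINEAR_TAPS = 2 * BILINEAR_TAPS     # two mip levels

def worst_case_texel_reads(af_level: int, trilinear: bool = True) -> int:
    """Upper bound on texel reads per lookup: up to af_level probes,
    each probe being a bilinear or trilinear sample."""
    per_probe = TRILINEAR_TAPS if trilinear else BILINEAR_TAPS
    return max(1, af_level) * per_probe

for af in (1, 2, 4, 8, 16):
    print(f"{af:>2}x AF -> up to {worst_case_texel_reads(af)} texel reads per lookup")
# The ceiling scales linearly with the AF level, which is why 8x/16x hurt so
# much on early hardware, much like heavier RT presets today.
```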
 
From that very same review, the 7800 GTX goes:
  • Guild Wars goes from 110fps to 55fps after AA.
  • Splinter Cell Chaos Theory goes from 83fps to 58fps after AA.
  • Knights of the Old Republic 2 goes from 78fps to 52fps after AA.

So it's definitely not a Doom 3 problem. On the last page you can see benchmarks of Far Cry and FEAR exhibiting the same issue. Here they are again!
Yet look what happens when we get a card with more memory + bandwidth:
1735758763998.png
1735758800018.png
Ta-da! The price of AA drops.

See also the review of R580+:
 
Yet look what happens when we get a card with more memory + bandwidth:
View attachment 12763
View attachment 12764
Ta-da! The price of AA drops.

See also the review of R580+:
Are you saying that it's not a relevant example because newer cards with more memory bandwidth are better at doing AA?
How is this different from newer cards being better at doing RT?
 
Are you saying that it's not a relevant example because newer cards with more memory bandwidth are better at doing AA?
How is this different from newer cards being better at doing RT?
I never made such an argument for/against RT, so can’t comment on that.

I’m simply showing reviews of older cards to show that enabling 4*AA did not lead to an instant -50% performance penalty. You only got crazy drops like that when the GPU was VRAM or bandwidth limited, or with R600’s obvious architectural problems.
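A back-of-envelope illustration of that point (assuming uncompressed 32-bit color plus 32-bit depth/stencil per sample, so ignoring the color/Z compression cards of that era actually had):

```python
# Rough framebuffer footprint at 1600x1200 with and without 4xMSAA.
# Assumes uncompressed 32-bit color + 32-bit depth/stencil per sample;
# real GPUs compressed these, so treat the numbers as an upper bound.
WIDTH, HEIGHT = 1600, 1200
BYTES_COLOR = 4
BYTES_DEPTH = 4

def framebuffer_mib(samples: int) -> float:
    per_pixel = samples * (BYTES_COLOR + BYTES_DEPTH)
    resolve_target = BYTES_COLOR if samples > 1 else 0  # resolved back buffer
    return WIDTH * HEIGHT * (per_pixel + resolve_target) / 2**20

print(f"No AA : {framebuffer_mib(1):5.1f} MiB")
print(f"4xMSAA: {framebuffer_mib(4):5.1f} MiB")
# Roughly 15 MiB vs. 66 MiB before a single texture is loaded: a big slice
# of a 128/256 MB card, and every extra sample also costs ROP bandwidth,
# which is why AA drops ballooned on VRAM- or bandwidth-limited GPUs.
```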
 
I’m simply showing reviews of older cards to show that enabling 4*AA did not lead to an instant -50% performance penalty. You only got crazy drops like that when the GPU was VRAM or bandwidth limited, or with R600’s obvious architectural problems.
So we could have just had the same memory bandwidth on the older cards and there would have been no such performance hit from AA on them?

Also, why do you think the hit is smaller when there is more bandwidth? What happens to the no-AA rendering with that extra memory bandwidth, such that it doesn't see performance gains similar to those the AA-enabled case gets from the same higher bandwidth?
 