Really weird that WCCFTech is covering this stuff before other outlets. Are they trying to be seen as a serious outfit?
Hmmm SAM and DLSS in the same graph. That’s sure to rattle some nerves.
With all the hype about CB2077 recently I'd forgotten how gorgeous a fully maxed out WDL can look.
AMD Smart Access Memory: Zen+ and Zen 2 also handle full VRAM access - ComputerBase

As first benchmarks from the ComputerBase community show, an AMD Ryzen 7 2700X (test), based on the older Zen+ architecture, also benefits from AMD Smart Access Memory (SAM). In Assassin's Creed Valhalla (test), community member "Strugus" achieved about 15 percent more frames per second with an AMD Radeon RX 6800 graphics card and Re-Size BAR enabled than without the feature.
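For anyone wanting to verify on their own box that Resizable BAR / SAM is actually in effect, one common heuristic (my own sketch, not from the article, and the 256 MiB threshold is just the classic BAR window size) is to ask Vulkan whether the GPU exposes a DEVICE_LOCAL memory type that is also HOST_VISIBLE and backed by a heap much larger than that window:

Code:
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Minimal instance, core Vulkan 1.0 is enough to query memory heaps.
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

        // Find the largest heap backing a memory type that is both
        // device-local (VRAM) and host-visible (CPU-mappable).
        const VkMemoryPropertyFlags wanted =
            VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT | VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
        VkDeviceSize largest = 0;
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            if ((mem.memoryTypes[i].propertyFlags & wanted) == wanted) {
                VkDeviceSize heapSize = mem.memoryHeaps[mem.memoryTypes[i].heapIndex].size;
                if (heapSize > largest) largest = heapSize;
            }
        }

        // A CPU-visible VRAM heap much larger than the classic 256 MiB BAR
        // window suggests Resizable BAR / SAM is active.
        printf("%s: largest CPU-visible VRAM heap = %llu MiB%s\n",
               props.deviceName,
               (unsigned long long)(largest >> 20),
               largest > (256ull << 20) ? " (ReBAR/SAM likely enabled)" : "");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}

Tools like GPU-Z or the Radeon software report the same status more directly; the above is just the programmatic view a game or engine would see.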
Given how well Q2RTX still runs on the Radeon RX 6800, I tried it on a GTX 1080 just for giggles. At Q2-esque resolutions it's borderline playable in single-player mode. At default quality and 1024x768 I'm getting around 30-ish fps, which is basically more than what my rig achieved when I first played through Q2 before the turn of the millennium.
Just wish AMD would enable shader-based DXR/VKRT for RX 5700 and Vega as well. Could spark more interest in RT and desire to upgrade for Radeon users.
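To make that concrete: "shader-based" DXR/VKRT just means the BVH traversal and ray/triangle tests run in ordinary compute instead of dedicated RT cores, and the innermost piece is nothing exotic, basically a Möller–Trumbore intersection test. A minimal sketch in C++ for illustration (an actual fallback would run the same math in a compute shader over a BVH; names are mine):

Code:
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Möller–Trumbore ray/triangle test: returns true and the hit distance t
// if the ray (orig + t*dir) crosses the triangle (v0, v1, v2).
bool rayTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2, float& t)
{
    const float eps = 1e-7f;
    Vec3 e1 = sub(v1, v0);
    Vec3 e2 = sub(v2, v0);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < eps) return false;   // ray parallel to triangle plane
    float invDet = 1.0f / det;
    Vec3 s = sub(orig, v0);
    float u = dot(s, p) * invDet;
    if (u < 0.0f || u > 1.0f) return false;   // outside barycentric range
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * invDet;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot(e2, q) * invDet;
    return t > eps;                           // hit in front of the ray origin
}

int main() {
    float t;
    Vec3 o{0, 0, -1}, d{0, 0, 1};
    Vec3 a{-1, -1, 0}, b{1, -1, 0}, c{0, 1, 0};
    if (rayTriangle(o, d, a, b, c, t))
        printf("hit at t = %f\n", t);         // expects t = 1
}

The hard part of a compute fallback isn't this test, it's building and traversing the BVH fast enough without the fixed-function units, which is exactly where the older cards would hurt.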
Heck, that's better than when I first played it. 640x480 on a Voodoo Rush with decent performance. I could do 800x600, but the performance just wasn't there for competitive play. Ah the memories. I was finally happy with performance at 1024x768 once I got a Voodoo 2.
Regards,
SB
You needed 2x Voodoo 2s for 1024x768, didn't you?
Can you connect old D-Sub or component CRTs up to modern graphics cards?

Not without active adapters.
I seem to recall 800x600 was the max resolution for one card (even the 12MB versions), and two enabled 1024x768, but like you say, it was a long time ago.
Same here, was on a Voodoo2 for a while. You also had to think of color depth, as it could impact performance on those (sounds strange these days).
1024x768 was a beastly resolution back then lol.
V2 had a fixed 4MB for the frame buffer. That would fit two 16-bit frames (standard double-buffering) and a 32-bit depth buffer, all at 800x600 resolution.
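Quick sanity check on that (my own back-of-the-envelope, plain double-buffering plus one depth buffer, ignoring whatever tiling/padding the hardware adds):

Code:
#include <cstdio>

// Rough frame buffer footprint: two colour buffers (double-buffering) plus
// one depth buffer. Real boards also round to tile/page granularity, so
// treat these as lower bounds.
unsigned long long fbBytes(unsigned w, unsigned h,
                           unsigned colourBytes, unsigned depthBytes)
{
    return 1ull * w * h * (2 * colourBytes + depthBytes);
}

int main()
{
    const double MB = 1024.0 * 1024.0;
    printf("800x600,  16-bit colour + 16-bit Z: %.2f MB\n", fbBytes(800, 600, 2, 2) / MB);
    printf("800x600,  16-bit colour + 32-bit Z: %.2f MB\n", fbBytes(800, 600, 2, 4) / MB);
    printf("1024x768, 16-bit colour + 16-bit Z: %.2f MB\n", fbBytes(1024, 768, 2, 2) / MB);
}

That comes out to roughly 2.75 MB and 3.66 MB for 800x600, and about 4.5 MB for 1024x768 even with only a 16-bit Z buffer, so 1024x768 blowing past a 4MB frame buffer lines up with needing the second card.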
My issue with it is efficiency. I see no way that it can scale down in the future to mass market prices and power draws, and that doesn't improve as users keep moving to ever more mobile devices. In which case it is reduced to being a cool effect you can switch on to see how it looks. And making lighting function well with RT will be work and effort spent on top of the other path(s).

This is one of the reasons I'm suggesting ditching hw tracing. You can do what you want in compute, forget the api restrictions.
The other is that the baseline requirements aren't Nvidia cards, they aren't even the PS5/XSX. It's the Series S that all high-end titles must include. Core features have to run on there, the console where Watch Dogs Legion looks like a PS2 game thanks to how low-res the raytracing is. Call of Duty doesn't even enable raytracing.
That's why hw raytracing is potentially too costly even for devs that think it's a good idea.
There were some games that didn't use a depth buffer back then, like Jedi Knight. You could get a bit more resolution out of your Voodoo 1/2 then.
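Which also checks out against the 4MB figure above (my arithmetic): drop the Z buffer and 1024x768 double-buffered at 16 bits per pixel is 1024 x 768 x 2 bytes x 2 buffers = 3 MB, back under the limit.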