Digital Foundry Article Technical Discussion [2022]

I think it would be smarter to put an M.2 NVMe drive right on the back of the card. Given the size and speed of NVMe drives, it would be pretty easy to just load a whole game onto the NVMe. That way the card doesn't have to go over PCIe to CPU to RAM; it can just go GPU to NVMe.
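Back-of-envelope, the drive itself would still be the slow link in that chain. A minimal Python sketch of the idea, where all the bandwidth figures are illustrative assumptions rather than measurements:

```python
# Rough transfer-time comparison for a hypothetical on-card NVMe.
# All bandwidth figures below are illustrative assumptions.

asset_gb = 10.0       # assets to stream for a level, in GB

# Conventional path: NVMe -> system RAM -> PCIe -> VRAM (two hops)
nvme_gbps = 7.0       # assumed PCIe 4.0 x4 NVMe sequential read
pcie_gbps = 28.0      # assumed effective PCIe 4.0 x16 host-to-GPU rate

conventional_s = asset_gb / nvme_gbps + asset_gb / pcie_gbps

# Hypothetical on-card path: NVMe -> VRAM directly (one hop)
on_card_s = asset_gb / nvme_gbps

print(f"conventional path: {conventional_s:.2f} s")  # ~1.79 s
print(f"on-card NVMe:      {on_card_s:.2f} s")       # ~1.43 s
```

The saving comes from removing the second hop, but the drive stays the bottleneck either way, which is why approaches like DirectStorage focus on cutting CPU and copy overhead rather than relocating the drive.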

But how do you get the data onto the NVMe on the GPU 👀
 
A little late, but I'd definitely pay a premium for unified memory desktops. PC gaming's standard line of "what if it was a ton more work to develop on PCs in order to try to solve an impossible problem -- also, I am furious whenever textures stream slowly or the game stutters" makes zero sense to me.
 
So we have a third contender for a next-gen console! Could we have an ARM CPU and Intel GPU design, or are we beholden to x64 now? Is the choice more Intel SoC or AMD SoC? Still, it's nice that Intel is competitive and that there are more options and competitive pressure.
Well, I am not sure it is going to happen. I recall when the since-abandoned Larrabee was a possible design for a console chip; I was so hyped back then.

Don't you think that backwards compatibility is a must nowadays? I can't foresee consoles switching to ARM coming from an x64 background. Backwards compatibility works very well on PC, with games still selling across several generations; I recently purchased some Heroes of Might & Magic games from Ubisoft, to name a few.

As for a third contender... in the PC space, sure, we have a new contender now.

This time I am going with Intel: 100% sure. They seem to be solid GPUs, and if they don't work well, it's "just" 350€, so if you ever switch GPUs again, it won't hurt as much.

nVidia is just adding more and more of everything -power, but also power consumption- with a new GPU that consumes between 450W and 600W.

It also doesn't help that nVidia starts this generation by showing a GPU for rich people, which I neither care about nor like. If they had shown a mid-range or upper-mid-range GPU like the 4070 or 4060, it would be more interesting, but after seeing the 4090 at such a price... enough money to almost buy a good second-hand car... my hype for the other Ada GPUs has decreased more than increased.

I am more interested in what AMD offers and in seeing whether this time their RT performance is on par. That said, I just want October 12th to come ASAP so I can get the A770 16GB.
 
Don't you think that backwards compatibility is a must nowadays? I can't foresee consoles switching to ARM coming from an x64 background. Backwards compatibility works very well on PC, with games still selling across several generations; I recently purchased some Heroes of Might & Magic games from Ubisoft, to name a few.
There will be a point where you have to "desupport" BC games and move on. The onus to sell games is on game developers, by either remastering old games to run on future-gen consoles or providing the original gaming experience by only supporting previous-gen consoles. Future-gen consoles don't necessarily have to support BC games forever.
 
There will be a point where you have to "desupport" BC games and move on. The onus to sell games is on game developers, by either remastering old games to run on future-gen consoles or providing the original gaming experience by only supporting previous-gen consoles. Future-gen consoles don't necessarily have to support BC games forever.
I think we've reached a point where many games look good enough to last, especially indie games with a particular artistic expression. Some big games also have good enough graphics and good enough gameplay that they will still be enjoyed. Steam still sells games that by today's standards should be technically irrelevant, but they continue to sell due to their art, atmosphere, or gameplay, or a combination of all three.
 
Nope, buy new hardware to play new games.

And what happens when hardware becomes purely ray traced? BC will have to vanish then.
Why so? Any future graphics card is going to have some kind of rasterizer anyway.

Consoles aren't about raw power, usually. BC just means more earnings, and companies want money; I'd swear BC will stay for a long time.
 
Not too bad a GPU line for anyone looking at 3060 performance across the board with 16GB of VRAM. The price is very good for the performance.
And as DF says, Intel's GPUs also sport the all-critical dedicated RT and ML acceleration, something AMD GPUs do not have yet, performing very well in RT games like Cyberpunk 2077 and Control and surpassing AMD. In raw raster it's not bad either, very competitive with the 3060/6600 XT.
Intel's ray tracing solution is better than Nvidia's in some ways. Battlemage should have the best ray tracing of any GPU; rumours say it will have 16K cores.
 
Intel's ray tracing solution is better than Nvidia's in some ways. Battlemage should have the best ray tracing of any GPU; rumours say it will have 16K cores.
Actually, Intel's RT solution will always differentiate between the XMX and DP4a paths from a performance and quality perspective, so in a sense I can't see Intel's solution being better.
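For anyone wondering what the two paths actually are: XMX refers to Arc's dedicated matrix engines, while DP4a is a plain integer instruction, a four-way dot product of packed signed 8-bit values with a 32-bit accumulate, that XeSS falls back to on hardware without XMX. A scalar Python sketch of DP4a-style semantics, purely illustrative:

```python
# Scalar reference for a DP4a-style operation: dot product of four
# packed signed 8-bit integers with a 32-bit accumulator.
# Purely illustrative; real hardware does this in one instruction.

def dp4a(a: int, b: int, acc: int) -> int:
    def signed_bytes(word: int) -> list[int]:
        bs = [(word >> (8 * i)) & 0xFF for i in range(4)]
        return [v - 256 if v >= 128 else v for v in bs]
    for x, y in zip(signed_bytes(a), signed_bytes(b)):
        acc += x * y
    return acc

# dot([1, 2, 3, 4], [5, 6, 7, 8]) + 10 = 70 + 10 = 80
print(dp4a(0x04030201, 0x08070605, 10))  # -> 80
```

The gap between the two paths is essentially throughput: matrix engines can issue many such dot products per clock, where the fallback path cannot.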
 
There will be a point where you have to "desupport" BC games and move on. The onus to sell games is on game developers, by either remastering old games to run on future-gen consoles or providing the original gaming experience by only supporting previous-gen consoles. Future-gen consoles don't necessarily have to support BC games forever.
And that's more than likely to be determined by revenue and profit, not hardware. MS and Sony run gaming businesses; they don't participate in beauty pageants.

As long as revenue/profits of older titles have a notable positive effect on Sony and MS’s bottom lines, they will probably engineer their hardware to support those titles.

That may show up as BC being streaming-only and served by older-gen hardware. Or it may be akin to what Nvidia and AMD do: iterate their hardware and drivers to maintain BC, but only to a point, so older titles lose support over time. But an outright dismissal of BC just to chase better graphics will probably never happen.
 
There may come a point where they don't.
The future is kind of a mystery if you ask me. 4K 60Hz is a reality nowadays, and we aren't far off from 4K 120Hz becoming a standard. However, to achieve that with RT on, we are going to need one heck of a GPU whose entire design focuses on RT and nothing else. That would leave rasterization abandoned forever, which will eventually happen. That being said, rasterization is going to be necessary for quite a few years to come, and I can only foresee a 100% RT future if the chip can emulate a rasterizer when needed.
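To put rough numbers on that "one heck of a GPU", here's a quick ray budget for fully ray-traced 4K at 120Hz, where the rays-per-pixel count is an assumption rather than a measured figure:

```python
# Back-of-envelope ray budget for fully ray-traced 4K at 120 Hz.
# The rays-per-pixel count is an assumption, not a measured figure.

width, height, hz = 3840, 2160, 120
pixels_per_s = width * height * hz            # ~0.995 billion pixels/s

rays_per_pixel = 20   # assumed: primary + bounce + shadow/GI samples
rays_per_s = pixels_per_s * rays_per_pixel

print(f"pixels/s: {pixels_per_s / 1e9:.2f} G")  # -> 1.00 G
print(f"rays/s:   {rays_per_s / 1e9:.1f} G")    # -> 19.9 G
```

Even before shading, that's a traversal rate well beyond the gigarays-per-second figures today's cards quote, which is why denoisers and upscalers carry so much of the load.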
 
However, to achieve that with RT on, we are going to need one heck of a GPU whose entire design focuses on RT and nothing else.

This is where consoles would have an advantage over PC as it's much easier in my opinion to ditch backwards compatibility with rasterised games on a console than it is to abandon decades and decades of games and software on PC.

I would love to see Nvidia/AMD/Intel make a pure RT GPU as it would be interesting to see what it could do in terms of performance.

I also think memory needs to see some pretty significant gains, and while HBM has dropped off everyone's radar thanks to GDDR6X, I do think at some point it'll start to become a necessity due to its sheer bandwidth, with HBM3 supplying 819 GB/s per stack.

Four stacks and you're at roughly 3.3 TB/s of GPU bandwidth.
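The arithmetic, using the per-stack figure above:

```python
# HBM3 bandwidth from the figures above: 819 GB/s per stack, 4 stacks.
per_stack_gbs, stacks = 819, 4
print(f"{per_stack_gbs * stacks / 1000:.3f} TB/s")  # -> 3.276 TB/s
```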
 