Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
I'd prefer more of what we're currently getting in the PC space tbh, which is ever closer integration between discrete CPUs and GPUs to bring the advantages of both UMA and separate dedicated processors under one umbrella. Things like Resizable BAR, Smart Access Storage and HBCC are great examples of this. The tech just needs to become standardised so it can start being treated as the default by devs.

That's how DF said it too, and I'd imagine 99% of the industry agrees on that. Both performance and flexibility.
 
The ray tracing performance is very impressive. Imagine if they just took the A770 and doubled everything; it would be quite the GPU, up there with the 3080 Ti/3090 in ray tracing.
 
Not too bad a GPU line for anyone looking at 3060 performance across the board with 16GB vram. The price is very good for the performance.
And as DF says, Intel's GPUs also sport the all-critical dedicated RT and ML acceleration, something AMD GPUs do not have yet, performing very well in RT games like 2077 and Control, surpassing AMD. In raw raster it's not bad either, very competitive with the 3060/6600 XT.
 
After watching this video I decided that I am going to purchase the A770 16GB (I am not rich, but I saved some money to get a new GPU). The aspect of the Arc I liked the most, impressive RT performance aside, is that the higher the resolution, the better it performs compared to GPUs of a similar tier.

Recently I got a 4K TV with 120Hz support, and with games that use XeSS plus some expected driver improvements, I think the performance is going to be decent enough (also, you don't need to set every game to Ultra).
 
I have a 3060ti so not looking at changing, but after this video I will definitely be keeping an eye on their GPUs now.
 
So we have a third contender for a next-gen console! Could we have an ARM CPU and Intel GPU design? Or are we beholden to x64 now? Is the choice more Intel SoC or AMD SoC? Still, nice that Intel are competitive and there are more options and competitive pressures.
 
I have no idea if ARM can fit/compete in the power, performance, price trifecta vs x86, but I have been nicely impressed with my M1 MacBook Pro.
 
So we have a third contender for a next-gen console!
Technically yes, but practically not at all because of BC requirements for the entire Xbox ecosystem, and that says nothing of the price margins Intel would likely demand. Maybe when they get drivers to perfectly support older DirectX runtimes and get hungry enough to lower their asking price for an IP licence.
 
It's atm an alternative to AMD since Intel has the ML/RT going for it, and comparable raw raster to AMD/NV. But we don't know where AMD will be in a few years. NV would be too expensive.

Maybe they can turn to Apple; MacBook Air-like performance would fit in the budget.
 
You are very well served with the 3060Ti indeed! Save for a couple of games with RT on at 1440p and 4K where the A770 shines, you aren't gaining anything from switching now.

My case is different. I am used to Intel Graphics solutions on laptops and they usually have a very nice interface to manage the GPU.

Additionally, I am not so fond of certain Nvidia policies (NIS available but no HDR + NIS for those with a Pascal or older GPU, no Resizable BAR except on RTX 3000), so I am not going to miss them this new generation.

Finally, the power budget of the A770 (225W) is quite close to the power budget of my GTX 1080. I have a 550W power supply and that's important for me. The 4060 and 4070 might be power hungry by the looks of it.
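For anyone wanting to sanity-check that power reasoning, here's a rough headroom estimate. Every wattage below other than the A770's 225W board power is my own illustrative assumption, not a measured figure:

```python
# Rough PSU headroom check; all component wattages are assumed ballpark figures.
parts = {
    "A770 (board power)": 225,
    "CPU under gaming load": 125,
    "motherboard, RAM, SSD, fans": 60,
}
total_w = sum(parts.values())
psu_w = 550
print(f"{total_w} W total, {total_w / psu_w:.0%} of a {psu_w} W PSU")
# -> 410 W total, 75% of a 550 W PSU
```

Around 75% sustained load leaves some margin for transient spikes, which is usually considered a comfortable fit.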
 
What do you mean? The 3060Ti always has higher fps than the A770, even with RT enabled.

Didn't DF show a few instances where the A770 matched or slightly edged out the 3060ti with RT enabled?

Even TechPowerUp's review shows that it can match and beat the 3060ti with ray tracing enabled: https://www.techpowerup.com/review/intel-arc-a770/34.html

And the RTX 3060ti is on mature Nvidia drivers, so its RT performance won't really improve. I'd be willing to bet the A770's will as Intel works on the drivers, so the A770 could end up beating it more often than it loses.
 
It happens at 4K when the 8GB VRAM of the 3060Ti runs out, the only exception is Metro Exodus.

There are games where it's in spitting distance at 1080p and 1440p and games like RE8: Village where it's a tie.

For a GPU that's on poor drivers and on its first-generation RT hardware it's a very impressive showing and should improve over time.

Price-wise in the UK the A770 is competing with high-end 3050s and entry-level 3060s, both of which it destroys with RT enabled.

Also funny how the 8GB Nvidia GPUs tank in Far Cry 6 at 4K with RT due to lack of VRAM, yet the 8GB A750 doesn't seem to have that problem.
 
Ampere drivers could and probably will still improve over time too, not as much as Intel's drivers, but Ampere ain't that old yet.
 
Yeah, I assume desktop PCs will continue this path and improving it.

However, I assume huge frame drops when VRAM is overflowing will always be a thing there as a result. I can't see how this is ever going to change as DRAM will always be much slower than GDDR. DDR5-6400 for example only has around 90 GB/s, a far cry from 800 GB/s and more on High End GPUs and M1 Ultra.
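As a quick back-of-the-envelope check on those numbers (theoretical peaks from my own arithmetic, not measured figures; real-world DDR5 throughput lands below the peak, which is where the ~90 GB/s comes from):

```python
# Theoretical peak memory bandwidth: transfers/s x bytes per transfer x channels.
def ddr_bandwidth_gbps(mt_per_s: int, bus_bytes: int, channels: int) -> float:
    """Peak bandwidth in GB/s for a DDR-style memory setup."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

# DDR5-6400, 64-bit (8-byte) bus per DIMM, dual channel
print(ddr_bandwidth_gbps(6400, 8, 2))  # -> 102.4 GB/s

# GDDR6 at 16 Gbps per pin on a 256-bit bus (a typical mid/high-end GPU config)
print(16e9 * 256 / 8 / 1e9)  # -> 512.0 GB/s
```

Even against a mid-range GPU's memory bus the gap is roughly 5x, so spilling from VRAM into system RAM will always hurt.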
I think it'd be smarter to put an M.2 NVMe drive right on the back of the card. With the size and speed of NVMe drives it'd be pretty easy to just load up a whole game on the NVMe. That way the card doesn't have to go over PCIe to CPU to RAM; it can just go GPU to NVMe. A PCIe 5 NVMe drive should top out at ~16 GB/s. That should be fast enough when you compare it to the actual speed you'd get with all the other hops.
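That ~16 GB/s ceiling does follow from the PCIe link math. A quick sketch (theoretical per-direction throughput, counting only the 128b/130b line-encoding overhead used since PCIe 3.0; real drives land somewhat below this):

```python
# Theoretical PCIe throughput per direction, after 128b/130b line encoding.
def pcie_gbps(gt_per_s: float, lanes: int) -> float:
    """GB/s: raw GT/s per lane x lanes x encoding efficiency / 8 bits per byte."""
    return gt_per_s * lanes * (128 / 130) / 8

print(round(pcie_gbps(32, 4), 2))  # PCIe 5.0 x4 (typical NVMe link) -> 15.75
print(round(pcie_gbps(16, 4), 2))  # PCIe 4.0 x4 -> 7.88
```

Of course that's still an order of magnitude below GDDR bandwidth, so an on-card NVMe would be a streaming tier rather than a VRAM substitute.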
 