> Isn't that a 100% improvement?
No, because a 5700 XT has a board power of 220 W:
300/220 * 1.5 = 2.045
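To spell that back-of-the-envelope math out, here is a minimal sketch; the 300 W board power and the 1.5x perf/watt uplift are the rumoured/claimed numbers from this thread, not confirmed specs:

```python
# Rough scaling estimate: relative performance ~= (board power ratio) x (perf/watt uplift).
baseline_power_w = 220      # Radeon RX 5700 XT board power (from the post above)
rumoured_power_w = 300      # rumoured board power of the new card (thread speculation)
perf_per_watt_uplift = 1.5  # AMD's claimed RDNA2 perf-per-watt improvement

relative_performance = (rumoured_power_w / baseline_power_w) * perf_per_watt_uplift
print(f"Estimated performance vs. 5700 XT: {relative_performance:.3f}x")
# -> Estimated performance vs. 5700 XT: 2.045x (a bit more than a 100% improvement)
```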
The biggest question at the moment is whether that USB-C port is the practically dead VirtualLink or something else using USB-C.
This is the new cooler AMD will use. I like it.
I may be the only person on the net that prefers blower style reference coolers. I want that heat dumped outside of my case, damn it!
> I may be the only person on the net that prefers blower style reference coolers. I want that heat dumped outside of my case, damn it!
The heat dumped outside, not inside?
> I may be the only person on the net that prefers blower style reference coolers. I want that heat dumped outside of my case, damn it!
Blowers are inevitably loud, even if they cool well. And most of the time, they can't cool as well as axial designs. There are two main arguments for blower coolers: low-power GPUs and multi-GPU configurations. And the latter is debatable.
> Isn't that just a render?
Well, at the moment I'm pretty sure it's just pixels on my monitor.
> The biggest question at the moment is whether that USB-C port is the practically dead VirtualLink or something else using USB-C.
[Wishful mode=ON]
It's a USB4 port at 40 Gbps that people can use to directly connect external NVMe drives and make use of the embedded hardware decompressor and DirectStorage to get I/O performance similar to the new consoles. The GPU can take advantage of both data coming from the PCIe connector and the dedicated USB4 connection.
I am quite skeptical about both the "Infinity Cache" and the "clock instability" above PS5 clocks, also because in the same RGT video there are screens from the PS5 presentation where it is explicitly stated that the PS5's CUs are different from desktop RDNA2 CUs, and that the latter are quite a bit "beefier" than the PS5's. So, as they are different, they are quite unlikely to have the same issues unless those are process-related.
> I am quite skeptical about both the "Infinity Cache" and the "clock instability" above PS5 clocks, also because in the same RGT video there are screens from the PS5 presentation where it is explicitly stated that the PS5's CUs are different from desktop RDNA2 CUs, and that the latter are quite a bit "beefier" than the PS5's. So, as they are different, they are quite unlikely to have the same issues unless those are process-related.
What screens where now?
> XSX has 76 MB of cache; if you take out the CPU cache, it is around 64 MB for the 2 Shader Engines.
Why are you assuming 12 MB of cache for the CPU? 8 MB L3 + 512 KB L2 per core? Don't the L1 caches count toward that SRAM total?
> At Hot Chips, though, MS never said a thing about where those 64 MB of cache are located or what they are used for.
There's maybe about 40 MB of cache that is more easily accounted for between the CPU and GPU bits, but at least on the die shot you can make out a fair bit of cache-like structures in the (Infinity) Fabric marked areas. There's a bunch of other cache for the multimedia blocks as well.
> Why are you assuming 12 MB of cache for the CPU?
Microsoft said so.
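For what it's worth, here is the cache tally being argued over, a sketch using only the numbers quoted in this exchange (the 76 MB figure from the Hot Chips talk, a Zen 2 CPU with 8 x 512 KB L2 plus 8 MB L3); L1 caches and the multimedia-block caches mentioned above are not included, which is part of why the split stays fuzzy:

```python
# Back-of-the-envelope SRAM accounting for the Series X SoC, using the figures
# quoted in this thread; this is not an official Microsoft breakdown.
total_sram_mb = 76                    # "76 MB of cache" figure from Hot Chips
cpu_l2_mb = 8 * 0.5                   # 8 Zen 2 cores x 512 KB L2 each = 4 MB
cpu_l3_mb = 8                         # shared L3
cpu_cache_mb = cpu_l2_mb + cpu_l3_mb  # = 12 MB, the number being questioned above

gpu_and_other_mb = total_sram_mb - cpu_cache_mb
print(f"CPU cache: {cpu_cache_mb:.0f} MB, left for GPU/other blocks: {gpu_and_other_mb:.0f} MB")
# -> CPU cache: 12 MB, left for GPU/other blocks: 64 MB
```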