Bondrewd
Veteran
> return on investment ?
Yea.
> AMD really needs more performance in ray tracing
Who's gonna pay $1.5k for a 120 CU Radeon?
> Leaves AMD very little room to maneuver
They just drop N32, which is a lot less broken, and voilà.
> Yea.
Does it perform as well as the 4090? Because people seem to be buying a lot of those.
> it would need to hit 3.2GHz to reach 4080 RT performance, and that's probably only in simpler RT games
The entire point is that it was supposed to clock that or higher, instead of what it does now.
> What I don't understand is the power consumption.
The. WGPs. Are. Broken.
> It's a hog, and I was expecting it to be a more efficient architecture.
Sir it's broken.
> However, at least on my 7900XT, the power consumption and voltage cannot sustain gaming in this state; it can only do general-purpose compute work.
> They just drop N32, which is a lot less broken, and voilà.
A properly functioning N32 could come dangerously close to the 7900XT/XTX, so where do they price it without making the 7900 irrelevant?
> If this guy's not faking it nor using LN2, then it looks like AMD have fubared things quite badly.
It's real.
> so where do they price it without making the 7900 irrelevant?
$649, just like the 6800XT, and engage winning.
> The. WGPs. Are. Broken.
How so?
Current N31 is made to become irrelevant once non-FUBAR bits crawl out.
> How so?
Something in the SIMDs or VRF goes utterly sad under heavier loads.
> but if it's close to 7900 XT at that price and a FAR more sane power consumption
It was/is kinda the point.
> Something in the SIMDs or VRF goes utterly sad under heavier loads.
But is it something at the architecture-level design that was missed during simulation, or something related to the manufacturing process? Or a mix of both? It seems like one of those "how could anyone not catch this" situations.
> That was the point.
At 350W? That seems unlikely.
3.5GHz @ 1.1V.
> Or a mix of both?
Pretty sure logic design issues.
> No company will spend money on trial and error.
My man, you don't even know how Intel does stuff to say that.
> At 350W? That seems unlikely.
It was the target.
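The "at 350W?" skepticism can be sanity-checked with the usual first-order dynamic power relation, P ≈ P₀ · (f/f₀) · (V/V₀)². A rough sketch below; the baseline figures (~355W at roughly 2.5GHz and 0.95V, loosely resembling a shipping 7900 XTX) are illustrative assumptions, not measured values:

```python
# First-order CMOS dynamic-power scaling: power scales roughly linearly
# with frequency and quadratically with voltage (P ~ C * f * V^2,
# treating switched capacitance C as constant).
def scaled_power(p0_w, f0_ghz, v0_v, f1_ghz, v1_v):
    """Project power at (f1, v1) from a baseline operating point (p0, f0, v0)."""
    return p0_w * (f1_ghz / f0_ghz) * (v1_v / v0_v) ** 2

# Illustrative baseline (assumed, not measured): ~355 W at 2.5 GHz, 0.95 V.
projected = scaled_power(355, 2.5, 0.95, 3.5, 1.1)
print(f"Projected power at 3.5 GHz / 1.1 V: ~{projected:.0f} W")
```

Under these assumptions the projection lands well above 600W, which illustrates why a 3.5GHz @ 1.1V design target inside a ~350W board-power budget would require substantially better per-clock efficiency than the shipping silicon shows.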