Rootax
Veteran
Maybe to save money?
But there is a cost to designing and validating a new power delivery system too? Well, in the end I guess you're right. I hope GamersNexus or Buildzoid will analyse the PCB and the rest.
Doubt it; the Strix actually has more VRM phases than the reference card.
It's ASUS, they always manage to fuck up at least something in their AMD cards. Their 400-series cards were meh.
JayzTwoCents saw the same thing: the vanilla AMD card clocks a little higher.
I guess the ASUS cooler isn't designed correctly for Vega (bad GPU/HBM contact, bad VRM contact, ...?).
But while I get the need for custom cooling, why did ASUS make a custom VRM/power delivery design too? AMD's stock design is top notch; hell, it's one of the few things about Vega that's nearly perfect, and that's the thing they changed...?
I noticed in pictures of the Guru3D disassembly (I think it was) that the MOSFET from the topmost phase is only half covered by the heatsink and accompanying thermal tape. Thermal camera imagery also shows some 110°C temps in that area; I bet that's from that very same half-covered MOSFET. If they'd just shifted the power phases maybe 4 mm down towards the PCIe connector side of the card, all of the FETs would have been covered. Very half-assed. What are their product design, prototyping, and QC people doing?
It's ASUS.
This is supposed to be a high-end premium product.
What's shocking is that cooling the power delivery apparently wasn't part of the cooler design requirements on a premium part. FETs can run hot, which makes cooling them that much easier. However, it's always possible that the extra FET is powering something different, say HBM2 or board components, and won't have the same draw as the others. Going back to that Hovis design with the Xbox, driving core and memory separately, even within the chip, may be part of the design. Run even the memory controller at a different voltage for some added isolation. That could be the case here.
It appears ASUS is just paper-launching a token product to claim dibs on being first; meanwhile, other AIBs appear uninterested in pursuing custom Vegas. After MSI, now comes GIGABYTE.
https://www.techpowerup.com/237379/...o-release-a-custom-radeon-rx-vega-64#comments
GIGABYTE has stated that there are no current plans to make a custom Radeon RX Vega 64. This might change in the future, but for now, early Vega 64 adopters have no choice but to settle for the reference design or custom cards from other vendors. There is still a faint hope for the Vega 56, though, since GIGABYTE didn't rule out releasing an RX Vega 56 Gaming G1.
I don't think any AIB should ever bother with something that is basically an early access product.
Haven't decided yet, most probably.
Eh, who cares 'bout Gigabyte?
Dead silence from both Sapphire and XFX is weird.
New Vega driver out.
You people with cards already, mind trying it out and reporting back any changes?
Thanks!
We live to serve!
Anything in particular you're thinking of? (I'd love to see a WattMan overhaul.)
Nah, I dunno. Magic driver, the new geometry processing pipeline finally being enabled, anything of the sort... Well, one can always dream, right?
If it's ever going to happen, it's not happening in this driver.
WattMan, yes. Constant issues there; is it finally reliable now, or still wonky with regard to changing settings and/or getting them to stick?
Other than that, I lost 0.5 MH/s mining ETH.
Can somebody explain this?
https://www.computerbase.de/2017-09/forza-7-benchmark/2/#diagramm-forza-7-1920-1080
Primitive Shaders and DSBR now active?
I don't think AMD would be silent if it were. ;-)