Volari Review at Driverheaven - Interesting results

Lecram25 said:
the XG40 is 128bit too. But it seems that by bridging them together, they claim to get a 256bit memory bus, which is what they plan on doing with the XG45, dual chip solution...

That's exactly the problem with AFR, though: it's nowhere near a 256bit bus, or even a 256bit-equivalent one. What's the available bandwidth between the two chips? Somewhat above 2GB/s...

Although AFR and SLI (the latter being in fact a far better idea if further refined) are quite different methods, the constant problems were that each chip always required its own dedicated RAM and that triangle setup was never shared between chips.

In the case of the Volari it's, oversimplified, a Master and a Slave chip where one chip takes on the odd frames and the other the even frames (Master = Frame 1, Slave = Frame 2, Master = Frame 3... etc.); not exactly accurate, but I prefer that simple explanation. In my mind, per frame and per chip there's only (with 350MHz DDR on a 128bit bus) a maximum theoretical bandwidth of 11.2GB/s available.

At 325MHz and 16 total TMUs, the texture bandwidth requirement alone should be around 20.8GB/sec (gee, I wonder why ATI/NV, while increasing the amount of texture ops on future products, have also raised the available memory bandwidth up the wazoo...).
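A quick back-of-the-envelope check of those two figures (the 128bit bus, DDR doubling, and one 32bit texel per TMU per clock are the assumptions stated above; the snippet is only illustrative):

```python
# Rough arithmetic behind the 11.2GB/s and 20.8GB/s figures (illustrative only).

# Available per chip: 128bit bus, 350MHz DDR memory (two transfers per clock).
mem_clock_hz = 350e6
bus_width_bytes = 128 / 8
available_gbs = mem_clock_hz * 2 * bus_width_bytes / 1e9
print(f"available per chip: {available_gbs:.1f} GB/s")  # ~11.2 GB/s

# Required for texturing: 16 TMUs at 325MHz, each fetching a 32bit (4-byte) texel per clock.
core_clock_hz = 325e6
tmus = 16
texel_bytes = 4
required_gbs = core_clock_hz * tmus * texel_bytes / 1e9
print(f"texture demand: {required_gbs:.1f} GB/s")  # ~20.8 GB/s
```

Which is presumably the point: each chip's own memory pool can deliver barely half of what its texture units could consume.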

Now a funky speculation on a dual XG45 config: for PS/VS2.0 and the XG40 they had ~80M transistors per chip * 2 = 160M transistors and two molex connectors. A conservative estimate for PS/VS3.0 could point at ~120M transistors (or more) * 2 = 240M transistors...

That's the exact reason why I'm saying that they need to rethink their design philosophy from scratch and especially get rid of multi-chip approaches. Or at least get rid of that hideous AFR approach and assign viewports to chips instead, so that they can finally share triangle setup and truly double throughput...
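To make the distinction concrete, here is a toy sketch (not real driver code; the chip count and screen bands are made up for the example): with AFR each chip renders whole frames on its own, duplicating memory and setup, while a viewport split lets a single shared setup pass hand each chip only its band of the screen.

```python
# Toy illustration of AFR vs. viewport-split work assignment (not real driver code).

def afr_chip_for_frame(frame_number, num_chips=2):
    """AFR: each chip owns entire frames in turn, so every chip needs its own
    dedicated RAM and performs its own triangle setup for its frames."""
    return frame_number % num_chips

def viewport_chip_for_triangle(screen_y, screen_height, num_chips=2):
    """Viewport split: one shared triangle setup pass, then each chip rasterises
    only the horizontal band of the screen it owns."""
    band_height = screen_height / num_chips
    return min(int(screen_y // band_height), num_chips - 1)

if __name__ == "__main__":
    # Frames 1..4 alternate between the two chips (Master/Slave style).
    print([afr_chip_for_frame(f) for f in range(1, 5)])   # [1, 0, 1, 0]
    # Triangles land on a chip according to where they fall on screen.
    print(viewport_chip_for_triangle(100, 768),            # top band   -> chip 0
          viewport_chip_for_triangle(600, 768))            # lower band -> chip 1
```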
 
Rugor said:
Ahh, but GLide isn't supposed to work on ATI or Nvidia cards, however D3D and OpenGL are supposed to work on XGI.

Do the Dawn and Dusk demos work on ATI cards without any hack or use of a wrapper?
Yet they both use OpenGL, and OpenGL is supposed to work on ATI ;)
 
vnet said:
Rugor said:
Ahh, but GLide isn't supposed to work on ATI or Nvidia cards, however D3D and OpenGL are supposed to work on XGI.

Do the Dawn and Dusk demos work on ATI cards without any hack or use of a wrapper?
Yet they both use OpenGL, and OpenGL is supposed to work on ATI ;)

If you weren't so excited to post this reply (5 times, even), you'd probably remember that those demos use NV proprietary extensions. And if that wasn't enough, not long ago it was uncovered that some EA games detect the ID of your 3D card and, if they find an ATI card, automatically dumb down the quality of the shaders.

The onus, then, is not on ATI to run Dawn, nor is it on nVidia to make it run on ATI cards, since those are nVidia-made tech demos. What's important here is that, currently, the Volari is not an option:

1) It's slower than their competition.
2) It's more expensive than comparable cards.
3) It has errors/can't run some games.

So, while it may become a contender in the future, I'm not holding my breath, and I don't think anyone should buy a V8 card until XGI can show it's worth it.

Remember, as much as we can sympathise with XGI when they say current games are aimed at ATI/nVidia hardware, that doesn't change the fact that gamers want to play games, not argue about which IHV is better (regardless of what you see on certain forums). The Volari can't compete on the three points above, and thus it's worthless.
 