Volari Review at Driverheaven - Interesting results

Oblivious

Veridian's review of XGI/Club3D's Volari Duo V8 is up at Driverheaven. He has some very interesting findings, including the following:

Another disturbing issue was the quality of the image output by the card when compared to the screen captures we took. On several occasions we attempted to capture bugs onscreen (e.g. DVD quality or Unreal Tournament's many issues) and the captured image was significantly improved over the realtime image. Yet again a driver bug which helps skew reviews in favour of the Volari, hiding its true performance from the end user/review reader.

Read the rest of the review here http://www.driverheaven.net/reviews/Volari/index.htm.

Pure speculation on my part, but maybe Gabe Newell was referring to XGI altering the output of screencaps at Shader Day and not Nvidia. What do you think?
 
The Baron said:
did XGI have working hardware by Shader Day? I don't think they did

Good point. Admittedly, I know little about Volari's development. However, given Shader Day took place in September and the Volari supposedly shipped around December (please correct me if I'm wrong), I think it's possible XGI had some early silicon back then.

Either way, it looks like XGI needs to work on their drivers a lot.
 
Based on the reviews, it can be argued that as far as gaming is concerned they don't have working hardware now.

As far as I am concerned a card that borked on every single game the reviewer tested does not count as working hardware. If I can't play any games with it, I don't have a 3D accelerator.
 
Oblivious said:
Pure speculation on my part, but maybe Gabe Newell was referring to XGI altering the output of screencaps at Shader Day and not Nvidia. What do you think?

Considering that, to my knowledge, not one person has ever claimed first-hand observation of NVIDIA altering screencaps, that might be feasible. The fanboys just took that comment and, since the other accusations were true and NVIDIA did do them, ran with the whole lot.
 
Oblivious said:
Pure speculation on my part, but maybe Gabe Newell was referring to XGI altering the output of screencaps at Shader Day and not Nvidia. What do you think?
I don't think there's enough proof that the Volari driver really attempts to cheat at screenshots. The UT2003 problem in particular looks to me like an issue I wouldn't expect to be able to see in screenshots (based on Driverheaven's description of the problem).
And if they really tried to cheat with screenshots, they'd have done an awful job - there are lots of screenshots available proving their other cheats :p.
 
XGI exemplifies some qualities other IHVs have:

ATI's driver quality (before CATALYST)
NVidia's "optimizations"
3dfx's dual chip architecture

Isn't that something?
 
Deathlike2 said:
XGI exemplifies some qualities other IHVs have:

ATI's driver quality (before CATALYST)
NVidia's "optimizations"
3dfx's dual chip architecture

Isn't that something?

Must be they got the leftover employees... :LOL:
 
I don't think there's enough proof that the Volari driver really attempts to cheat at screenshots.

Just think: if the driver really attempted to alter the quality of screenshots, then how did XBitlabs possibly succeed in taking screenshots showing the bugs in games?

Also, Driverheaven managed to take a pic of UT2003's bug, an issue I wouldn't expect to be able to see in screenshots if any alterations were being done.
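
For what it's worth, a typical capture tool just reads the rendered framebuffer back from the driver. Here is a minimal sketch of the idea (OpenGL, purely illustrative; it assumes a current GL context and isn't the code of any particular tool):

// Minimal sketch: grab the rendered frame by reading the framebuffer back
// from the driver and dumping it to a PPM file. Assumes a current OpenGL
// context and a window of the given size.
#include <GL/gl.h>
#include <cstdio>
#include <vector>

void capture_framebuffer(int width, int height, const char* path)
{
    std::vector<unsigned char> pixels(width * height * 3);

    // Read back what the card actually rendered into the framebuffer.
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels.data());

    // Write a simple PPM, flipping rows because GL's origin is bottom-left.
    FILE* f = fopen(path, "wb");
    if (!f) return;
    fprintf(f, "P6\n%d %d\n255\n", width, height);
    for (int y = height - 1; y >= 0; --y)
        fwrite(&pixels[y * width * 3], 1, width * 3, f);
    fclose(f);
}

Anything the driver changes after that point (for example overlay paths or output-stage filtering) would never show up in such a capture, while a rendering bug actually written into the framebuffer would. So a capture looking better than the live screen doesn't automatically mean deliberate screenshot doctoring.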

Last but not least, Gabe Newell couldn't have been referring to companies other than ATI or NV, as he only had chips from those two companies back then; he did not have any DX9 chip from XGI, S3, etc. There was no working silicon back then.
Hence the official Valve benchmarks only mentioned DX9 solutions from ATI and NV.

As far as I am concerned a card that borked on every single game the reviewer tested does not count as working hardware. If I can't play any games with it, I don't have a 3D accelerator.

Not quite: if you tried to run Glide games on your ATI or NV card, the games wouldn't work either, because they were designed for 3dfx cards only. Yet the card still deserves the title '3D accelerator'.
In the same way, all the games out there today are designed with only ATI and NV in mind, not XGI, so that could explain some of the issues.

I've reached this conclusion as I hear the S3 DeltaChrome has exactly the same problems: Far Cry doesn't work at all, exactly the same bug in Prince of Persia (the picture being too bright and all white), etc.

Either way, it looks like XGI needs to work a lot on their drivers.
 
vnet said:
Not quite: if you tried to run Glide games on your ATI or NV card, the games wouldn't work either, because they were designed for 3dfx cards only. Yet the card still deserves the title '3D accelerator'.
In the same way, all the games out there today are designed with only ATI and NV in mind, not XGI, so that could explain some of the issues.
Nonsense. Glide games don't run on ATi or nVidia hardware because neither company supports this API through their drivers.

"All the games" you're referring probably support 3D acceleration through the Direct3D and/or OpenGL API, both of which XGI claims to support with their current drivers. :rolleyes: :rolleyes: :rolleyes:

cu

incurable
 
vnet said:
Just think: if the driver really attempted to alter the quality of screenshots, then how did XBitlabs possibly succeed in taking screenshots showing the bugs in games?

Two completely different things -- unintentional bugs vs. intentional lowering of image quality.
 
I remember running Morrowind for the first time on a Xabre 400. Wow, what a disappointment (pixel shading disabled). Then I ran 3DMark 2001's advanced pixel shader test: another disappointment (white areas where the boat and dock were supposed to be). Then I played Quake 3: yet another disappointment (16-bit textures, banding all over the game).

When I confronted SiS PR with all of these concerns (those were off the top of my head; it's been over a year since I worked on a Xabre 400), their response was rather interesting. To wit, game developers didn't support them. So much so that they asked me if I could provide them with a copy of Morrowind.

I had really high hopes for a good solution with the Xabre 400; unfortunately, it never really worked out. I had high hopes for the Volari Duo, and hope XGI fixes their issues...
 
vnet wrote:

Not quite: if you tried to run Glide games on your ATI or NV card, the games wouldn't work either, because they were designed for 3dfx cards only. Yet the card still deserves the title '3D accelerator'.
In the same way, all the games out there today are designed with only ATI and NV in mind, not XGI, so that could explain some of the issues.

Ahh, but Glide isn't supposed to work on ATI or Nvidia cards; D3D and OpenGL, however, are supposed to work on XGI.
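
To illustrate the point: a Direct3D game doesn't ask whose chip it is running on, it queries generic device capabilities through the API the vendor claims to support. A rough sketch (Direct3D 9, illustrative only, error handling kept to a minimum, link against d3d9.lib):

#include <d3d9.h>
#include <cstdio>

int main()
{
    // Any card claiming Direct3D support has to expose this standard path.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    // Games look at generic capability bits like these, not at the vendor's name.
    printf("Pixel shader version:  %lu.%lu\n",
           (unsigned long)((caps.PixelShaderVersion >> 8) & 0xFF),
           (unsigned long)(caps.PixelShaderVersion & 0xFF));
    printf("Vertex shader version: %lu.%lu\n",
           (unsigned long)((caps.VertexShaderVersion >> 8) & 0xFF),
           (unsigned long)(caps.VertexShaderVersion & 0xFF));
    printf("Max simultaneous textures: %lu\n",
           (unsigned long)caps.MaxSimultaneousTextures);

    d3d->Release();
    return 0;
}

If the driver reports those caps, games will happily use them, which is exactly why "designed with ATI and NV in mind" isn't much of an excuse.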
 
XGI should IMHO start its entire design philosophy from scratch if they really want to make a serious entry into the graphics market. Their idea of aggressively releasing a top-to-bottom product range certainly deserves applause, because that's the only way to confront the competition in the standalone PC graphics market.

For the rest, they seem to constantly repeat the same past mistakes, probably in the hope that people won't care or won't notice. Going a step further, they opted for a highly expensive multi-chip design for the high-end segment, an approach that never saw any real breakthroughs in the PC graphics market and where, on average, the disadvantages always outweighed the advantages (think 3dfx SLI - and AFAIK 3dfx was to abandon multi-chip past Spectre - and ATI's MAXX technology). Granted, multi-chip configurations see a lot of success in the arcade and professional simulator markets, but there cost isn't much of a consideration either.

Volari is neither optimized for a proprietary API like Glide was, nor is it anything other than an IMR, so there is nothing to justify any compatibility excuses. It claims D3D and OpenGL support, and there a card shouldn't need any special attention to work as it should.

I very much doubt they would fail to get all the support they need if they made an honest and straightforward attempt in the future. Finally, last time I checked it's not a developer's responsibility to fix whatever is or might be broken in your hardware or drivers.
 
Ailuros said:
XGI should IMHO start its entire design philosophy from scratch if they really want to make a serious entry into the graphics market.
Agreed. The Volari V8 chip does seem to have enough functional units to perform well, but it seems to have major trouble getting the right data to the right place at the right time, keeping it from performing remotely close to what paper specs suggest. In particular, the vertex and texture caches seem to be seriously misdesigned. It performs rather badly in both synthetic and game tests, suggesting that the hardware rather than the drivers is responsible for the weak performance. Given that the efficiency problems appear to be similar to those of Xabre and XP4, I suspect that the HW design team hasn't been learning from past mistakes, and might well end up cranking out one core after another with the same problems until their investors call it quits.

As for game support, the natural thing would seem to be to optimize the hardware for at least DirectX 8 operation (i.e. fast VS/PS 1.1, multitexturing, multisample AA, some level of aniso), as this would give good performance, decent compatibility and acceptable image quality with most of the games out there. Yet on all those points the Volari V8 Duo currently falls flat on its face.

Hmmm. The Volari was released more than a year after the R300, yet is creamed by R300 derivatives - surely someone at XGI must have seen this coming...?
 
But after all, aren't the V5 & V8 what used to be called Xabre2? So they should be expected to have the same problems...
IMHO we can't expect anything new before fall 2004...
 
Hmmm. The Volari was released more than a year after the R300, yet is creamed by R300 derivatives - surely someone at XGI must have seen this coming...?

A product positioning table from CP tech that I saw right after the announcement placed the XG40 (V8 Duo) above the FX5900/9800 and on the same level as the R360.

But after all, aren't the V5 & V8 what used to be called Xabre2? So they should be expected to have the same problems...
IMHO we can't expect anything new before fall 2004...

The next-generation XG45 is actually forecast to enter mass production in 8/04 according to a CP tech roadmap, with the following notes underneath:

  • DX9.1 (and then people wonder why the dx9.1 rumour constantly re-appears).
  • AGPx8 / PCI-E
  • 128bit DDR/DDR II
  • 256MB DRAM

It's reasonable to assume that by DX9.1 they mean PS/VS 3.0. I honestly hope that this one is a from-the-ground-up design, that they abandon multi-chip approaches, and that the bus width up there is a typo (if they want to keep the same number of texture ops as on the XG40).

There was a news blurb from the Inquirer stating that they will try it out on both 130 and 90 nm; if that's true and they decide on the smaller process, don't expect it sooner than 2005.
 
The XG40 is 128-bit too, but it seems that by bridging two chips together they claim to get a 256-bit memory bus, which is what they plan on doing with the XG45 dual-chip solution as well...
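
Just to put rough numbers on why the bus width matters (back-of-the-envelope only; the memory clock below is made up for illustration, not taken from any roadmap):

#include <cstdio>

// Peak theoretical memory bandwidth in GB/s for a given bus width and
// effective transfer rate (million transfers per second).
double peak_bandwidth_gb(int bus_bits, double mtps)
{
    return (bus_bits / 8.0) * mtps * 1e6 / 1e9;
}

int main()
{
    const double mtps = 1000.0; // hypothetical 500 MHz DDR clock -> 1000 MT/s

    printf("128-bit bus: %.1f GB/s\n", peak_bandwidth_gb(128, mtps)); // 16.0 GB/s
    printf("256-bit bus: %.1f GB/s\n", peak_bandwidth_gb(256, mtps)); // 32.0 GB/s
    return 0;
}

At the same memory clock, a single 128-bit interface gives half the bandwidth of the bridged 256-bit arrangement, which is why one would hope the 128-bit figure on that roadmap is a typo.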
 