BenSkywalker
Regular
Joe-
Because in some cases, it is
Then assume it is.
This is about "pulling it out of the box and it working" right?
This is about having functioning drivers. If I was using, say, an Asus K7M mobo and it couldn't function properly in AGP 2X because the motherboard has known line noise issues, I would be pretty foolish to blame it on the vid board maker, wouldn't I?
So, why don't you ignore ATI PR when they say they support Aniso?
Their claim was backed by review sites and forum posters, but I'll expand on that in a bit.
If nVidia was "forthcoming" about their performance, there wouldn't be any "shader" cheats or "clipping plane" cheats. They are designed to FOOL ME into thinking their performance is something it's not.
And that means nVidia doesn't earn your trust. I am saying that ATi doesn't earn mine.
It's just wonderful how you attribute all the bugs that others run into as "not nVidia problems", "random", etc.
List the game, list the driver. It is that simple. I see posts from people who have issues that others can't reproduce; to me that indicates the problem lies elsewhere. I'll give you an example: install the Catalyst 3.2 drivers and start any HL-powered game running OpenGL. Play for any amount of time and hit Esc. Alternatively, fire up Sacrifice and simply look around, using any ATi Cat drivers from the launch of the R300 core boards up until the Cat 3.2s (at least; I got rid of my board so I haven't tested the latest drivers). Another example: the original No One Lives Forever, Catalyst version 3.1 or 3.2, extreme levels of input latency playing the game no matter what framerate or setting.
List something comparable for nVidia, name the game, name the driver, and I'll check it out. I'm not talking about random issues that only I have come across; I'm talking about issues that impact all the people using the boards/drivers.
Just stick with the "nVidia logo" games, and you'll be fine.
I paid my money for an R300 core board; have you purchased an FX? I'm not talking in a hypothetical BS sense here, I put my money out and then had numerous complaints with the board. I didn't need to listen to second- and third-hand information about supposed issues, I dealt with them firsthand.
And right now, the indication is, unless there are specialized paths / hacks for the NV3x, shader performance is poor compared to R3x0. And if the game gets those hacks, the quality won't be as good.
And in doing so, it is assumed that games which use shaders will be limited by shader performance above all else.
I would word that differently. If I was working at nVidia, I would trust that B3D would try its best to find the real truth, so I would trust that they might not come up with the "best light" kind of review.
What 'real truth' exactly?
So, Ben, which 3D graphics site do YOU PERSONALLY trust most for truthful analysis?
None, and that includes B3D. To cover every single aspect of a graphics card you would need several thousand pages' worth of review at the very least. Because that is not reasonable, sites are forced to cut things down and cover the things they think are most important. To me, those are drivers first and foremost, followed by IQ and performance.
Besides the driver issues, I don't trust their analysis on things such as IQ. That happens to be due to their emphasis on AA over AF (which has been a constant here, btw). Proper texture filtering is a lot more important to me than perfect edge AA. I would take a full, proper 64x AF implementation over 16x MSAA of any type. That does not agree with B3D's preference.
On the driver front, this site focuses on 3D tech and stays away from gaming-related issues. As of right now, the only reason I look at 3D vid card reviews is to see the potential benefits to my gaming.
For performance, they look to the 'generic' performance of the board, testing its theoretical specs, which is the right way for a tech-based review. Take SplinterCell being used as a bench. They level off the playing field to utilize the bench without really getting into nV's superior visuals in the game. In a tech-based sense, the only reason nV has this edge is that the developers exploited features not supported directly by DirectX, due to the game's XBox roots. If I was convinced that SC would be the last XBox port the PC would see, I wouldn't consider this too much, but it isn't. "The way it's meant to be played" is actually viable for this particular game: only running NV hardware can you see the game as it was intended. Focusing on the tech end of the spectrum, this is a non-issue, as it isn't a truly legit utilization of DX and it isn't something that is rampant in the PC space. As a gamer, when one of the best titles released in the last six months looks best on a particular piece of hardware due to additional implementations, that is something I do care about. The same would be the case with ATi products. If there was a title that offered visual enhancements only available on ATi boards, then I would want to know about it. Is B3D going to cover this? No.
For the non-tech sites, most of them don't have a clue what they are doing and horribly screw up everything. I put more faith into B3D's reviews than I do any other site's, but I do not trust that what they find to be an excellent product will be so much as tolerable for myself.
Static acceleration the standard? Where did that come from? Static T&L is used, but to what extent over CPU T&L?
Could you list the games that have come out in the last year that don't use static hard T&L?
And given that nVidia was first out with both static T&L and vertex shading....how is this "against nVidia?"
It was used as an equalizer for the Voodoo5.
What is this "old stance" you are talking about? Again, their stance on "static T&L"? As if that's an nVidia specific feature? Asif nVidia also wasn't the first part out that had DX8 vertex shaders?
Using it to prop up another part is where the issue comes in. They also mistakenly assumed that developers were going to skip static T&L completely and jump from software T&L to pure VS. Nothing was ever given as to why they assumed this would happen except that they didn't like the technology. Is that what you are talking about in terms of 'honesty'?
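For anyone following along who isn't sure what that distinction actually looks like to a developer, here is a minimal sketch, using legacy OpenGL in C purely for illustration (the GL calls are standard; the GLSL shader stands in for the DX8-era vertex programs being argued about, and none of this is specific to either vendor):

#include <GL/gl.h>

/* Fixed-function ("static") hardware T&L: the app sets matrix and light state,
   and the driver/hardware transforms and lights every vertex it is handed. */
void draw_with_fixed_function_tnl(const GLfloat light_pos[4])
{
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_pos);
    glMatrixMode(GL_MODELVIEW);
    /* ...load the modelview matrix and submit geometry as usual... */
}

/* Programmable path: the same per-vertex work is written by the developer as a
   vertex shader, replacing the fixed-function T&L stage entirely. */
static const char *vertex_shader_src =
    "#version 110\n"
    "void main() {\n"
    "    /* the transform the fixed pipeline used to do for you */\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "    gl_FrontColor = gl_Color; /* any lighting math would go here */\n"
    "}\n";

The argument is simply about whether developers would jump straight from CPU T&L to the second path, or keep leaning on the first one for years.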
In the years I've been reading here, B3D has an excellent record of seeing where hardware is going and a fairly horrible record of seeing where games are going. As such, in the past they have weighted the importance of each feature based on where they thought the gaming market was headed, even when they were far off the mark. At least now they are using DirectX as a guideline, even if that in itself fails to truly indicate where games are headed.
You can try and spin it into a conspiracy theory if you want; the history stands that B3D has come out in support of features that failed to be utilized by developers (the T-Buffer, whose only real use was FSAA, despite the nigh-PR article that B3D posted) while denouncing features that are still in use (static T&L). At nearly every chance where there is a rift, nVidia takes one direction and this site stands staunchly behind the industry going in another.
It isn't about honesty. Look at nV's pixel pipe configuration. Yes, I know their PR is full of shit, but forget that for a moment. With Carmack's direction for Doom3, and the prediction that that is where the industry is headed, why so much focus on the PS side of things, when the game utilizes hardly any advanced shaders, while ignoring the stencil fill requirements such a direction would require? If B3D were to focus on the fact that nV has nigh 'free' stencil fill available to it, would it be dishonest? It would be a different focus, one that is friendly to nVidia. If they did do this, and I was working for ATi, I would be upset that they weren't focusing on PS performance more and were relatively ignoring stencil fill issues. It isn't about them calling it as they see it; it is how they see it, looking at where the market is headed.
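To make concrete why 'stencil fill' is the workload in question: a Doom3-style renderer draws every shadow volume twice into the stencil buffer with color and depth writes turned off, so raw stencil fill rate is what gets hammered, not shader math. A minimal sketch of the depth-fail stencil passes in OpenGL/C, assuming a depth pre-pass has already been laid down and with drawShadowVolumes() as a hypothetical helper that submits the volume geometry:

#include <GL/gl.h>

void drawShadowVolumes(void); /* hypothetical: submits the shadow volume geometry */

/* Depth-fail stencil shadow passes: stencil-only fill over every shadow volume,
   drawn twice, with no color or depth writes. */
void stencil_shadow_pass(void)
{
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); /* no color writes   */
    glDepthMask(GL_FALSE);                               /* keep depth intact */
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);
    glEnable(GL_CULL_FACE);

    /* Pass 1: back faces of the volumes, increment stencil where depth fails */
    glCullFace(GL_FRONT);
    glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);
    drawShadowVolumes();

    /* Pass 2: front faces, decrement where depth fails */
    glCullFace(GL_BACK);
    glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
    drawShadowVolumes();

    /* The lighting pass that follows only shades pixels where stencil == 0 */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
}

Whether a review weights that kind of fill work or long pixel shaders more heavily is exactly the editorial choice being argued about here.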
Again, I ask you, what 3D site do YOU trust most for an honest assessment of the hardware, vs. just spitting out marketing crap?
None of them do I trust completely, nor will I. If there was a site like B3D that focused more on gaming and attempted to paint all new features as equally important, then I would probably be the most inclined to trust it. Listening to sites like B3D is the reason why I have gripes with ATi right now. Do I think they were 'dishonest'? No. That doesn't change the fact that the impression I got from reading their reviews was not what I needed to know about the product.