Review of 9700Pro @ ixbt.com

DaveBaumann said:
I think that is John's point. Lots of us have been annoyed at Alexsok's constant NV30 babble and we've told him so - likewise the same should be shown elsewhere.

Agreed.

I have no problems with rumours. But why not post them just once and be done with it?
 
DaveBaumann said:
I think that is John's point. Lots of us have been annoyed at Alexsok's constant NV30 babble and we've told him so - likewise the same should be shown elsewhere.
Do you suppose a rumours-only forum might work? Just a thought. ;)

It seems that as Beyond3D gets more popular this is inevitable. Most come here because they think the forums are 'unbiased', but if you ask me the proof is in the pudding every time someone posts.

I for one come to this site to cut through all the PR cr*p that both Nvidia & ATI serve up and get right down to what the hardware is actually capable of. Sadly, I never thought that cr*p would come in the form of fanboys, especially here. :eek:
 
DaveBaumann said:
Isn't it unfair and disingenuous to hold strong opinions or make arguments about any vapourware - ATI or nvidia?

I think that is John's point. Lots of us have been annoyed at Alexsok's constant NV30 babble and we've told him so - likewise the same should be shown elsewhere.

Nobody is starting any R350 threads here. All people are saying - and it has been stated on many review sites like Anandtech - is that there may be a refresh part coming... and that is all I and certain other members stated.
ATI has six months until Q1 2003; anyone who doesn't think they are doing something as we speak needs their head examined.
So, as said before... people can compare the NV30 to the Radeon 9700 even though it's paper vs. silicon, but in the end the 9700 we see here TODAY will not be what the NV30 ends up competing with.
 
John Reynolds said:
alexsok said:
The real deal will come with NV30! :D

A statement which implies ...

That statement implies the obvious: he's a strong nVIDIA supporter (I know - I can smell these guys from a mile away), and that's why he will never give full recognition to any non-nVIDIA product and will give any nVIDIA creation far more credit than it deserves, no matter how inferior and expensive it is.
 
alexsok said:
Did I say I wanted to stop those comparisons?

Obviously the GF4 will be compared to the R300 no matter what; all I said is that the more "right" comparison would be between the R300 & NV30, since they are both next-gen solutions, that's all!

Neeyik - you're right, but I don't think there is anything wrong with me bringing NV30 info to the forum. I stated countless times that you can take it as you will: rumour, truth, whatever. My purpose was to spark interesting discussions about the NV30 and the technologies within it, and to compare them to the R300 (not in an "Nvidia vs ATI" kind of way).
But yes, I agree, I should have expressed myself more clearly to avoid confusion.
Yes, every time you say the real test will be against an NV30, you're more than implying that you wanted to stop NV25 vs R300 comparos. The fact is, in the real world, comparisons are based on price and availability, not equal generations--just like the R200 has been compared to the NV25 for the past few months.

Don't you understand that we take your consistent and unyielding barrage of "info" as something approaching trolling? Enough already. We all expect the NV30 to be at least slightly better than the R300 when it arrives, and we've pretty much exhausted release date speculation. I'd appreciate it if you'd limit your comments to the topic at hand, and not constantly refer to NV30. You can provide the occasional snippet of unsubstantiated info from your sources,* but there's no need to repeat it ad infinitum--we're a small group, it's hard to miss the advance info we all scour this forum for. A little restraint, please. TIA.

And I hope B3D doesn't resort to a "Rumors and Speculation" forum to shunt all the fanboys to. They can go elsewhere for that; B3D really shouldn't have to pay for bandwidth for that circle jerk of ego stroking and baseless bickering.

* This isn't a derogatory statement, but rather a factual one. You report, we decide. We appreciate your confiding in us. We hope you can understand our initial skepticism. Also understand how constant pimping of a certain company on no rational basis begins to be viewed as a sort of Chinese torture--an ultimately unbearable stream of rumor drips.
 
ixbt seem to have their 6x FSAA broken. All the 6x shots are exactly the same as the 4x ones (I think they commented as much, but I can't read Russian :))
 
Bambers said:
ixbt seem to have their 6x FSAA broken. All the 6x shots are exactly the same as the 4x ones (I think they commented as much, but I can't read Russian :))

They said that there is no visual difference between 6x FSAA & 4x FSAA, thus, it's wiser to use 4x FSAA.
 
They must have broken it, or they took the shots at 1600x1200 and the driver dropped back to 4x FSAA. Other sites certainly show a difference between 4x and 6x.
 
alexsok said:
Bambers said:
ixbt seem to have their 6x FSAA broken. All the 6x shots are exactly the same as the 4x ones (I think they commented as much, but I can't read Russian :))

They said that there is no visual difference between 6x FSAA & 4x FSAA, thus, it's wiser to use 4x FSAA.

Maybe they simply couldn't figure out that a driver bug requires rebooting for settings changes to take effect?
 
Is it just me or did the point of this thread just end up getting flushed down the tubes?

It was on track for about the first 3 messages and then it just erupted into flames. :(
 
OpenGL guy said:
Actually, a mode switch will suffice.

Reminds me of problems that 3dfx cards/drivers had when the app requested a video mode that was already in use. The trick they used to get around the problem was to briefly change the display bit depth to 8 bits, instead of the 16 or 32 that was requested. This forced a mode change and made Windows/DirectX know that a mode change had occurred. This 'hack' of course worked because the cards use a 16/32-bit overlay for the fullscreen framebuffer.
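For what it's worth, the idea boils down to something like the sketch below. This is not 3dfx's actual driver code - the types and helper functions (display_mode, program_hardware_mode, set_display_mode) are purely illustrative stand-ins for whatever the driver really did:

```c
#include <stdio.h>

/* Hypothetical sketch of the "force a mode change" trick described above.
 * None of these names are real driver entry points; they only show the flow. */

typedef struct {
    int width, height, bpp;   /* bpp = bits per pixel */
} display_mode;

static display_mode current_mode = { 1024, 768, 32 };

/* Stub standing in for whatever actually programs the hardware. */
static void program_hardware_mode(display_mode m)
{
    printf("mode change: %dx%d @ %d bpp\n", m.width, m.height, m.bpp);
    current_mode = m;
}

static void set_display_mode(display_mode requested)
{
    /* If the app asks for the mode that is already set, Windows/DirectX may
     * not see a real mode change, so nothing gets reset.  Detour through a
     * throwaway 8-bit mode to force one. */
    if (requested.width  == current_mode.width  &&
        requested.height == current_mode.height &&
        requested.bpp    == current_mode.bpp)
    {
        display_mode temp = requested;
        temp.bpp = 8;
        program_hardware_mode(temp);
    }

    /* The fullscreen framebuffer lives in a 16/32-bit overlay, so the
     * 8-bit detour is never actually visible to the user. */
    program_hardware_mode(requested);
}

int main(void)
{
    display_mode m = { 1024, 768, 32 };
    set_display_mode(m);   /* same mode -> forces 8-bit detour, then 32-bit */
    return 0;
}
```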
 
Bigus Dickus said:
Maybe they simply couldn't figure out that a driver bug requires rebooting for settings changes to take effect?

The SS:SE shot was presumably OpenGL, and I wasn't aware of any problems getting AA to change in that API.
 