R420 Hints from people under NDA!

Re: No indications of valid info on Rage3d

Bouncing Zabaglione Bros. said:
Far Cry PS 2.0 looks very, very similar to PS 3.0. If you listen to the presentation videos, the Crytek developer always says "PS 2.0 or PS 3.0". Nvidia would like us to believe that there is a massive difference between 2.0 and 3.0, but I can tell you that my 9700 Pro looks almost identical in quality to the picture Nvidia showed.

Man, your 9700 must be better than my 9800 Pro, because what they showed looked way better than what I see when I play Far Cry...
 
Re: No indications of valid info on Rage3d

Sxotty said:
Bouncing Zabaglione Bros. said:
Far Cry PS 2.0 looks very, very similar to PS 3.0. If you listen to the presentation videos, the Crytek developer always says "PS 2.0 or PS 3.0". Nvidia would like us to believe that there is a massive difference between 2.0 and 3.0, but I can tell you that my 9700 Pro looks almost identical in quality to the picture Nvidia showed.

Man, your 9700 must be better than my 9800 Pro, because what they showed looked way better than what I see when I play Far Cry...

do you run it at max settings? :oops:
 
I ran it at what I thought was the optimal mix of fps and quality, as I assume everyone does. However, that balance is always different per person.

So is displacement mapping what is unique to the NV40 (and likely R420) over the R3xx? I thought Matrox was already talking about that back with the Parhelia or something. In any case, perhaps that is what it is. Regular bump mapping looks much more "fake" to me; I hope new games at least use displacement mapping if it is available. I would assume it would be very easy to implement in games like D3 and HL2, where the source for the bump maps could simply be put through some calculation to make displacement maps instead, right?
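To illustrate the idea in the post above: the same grayscale height data used to perturb shading normals for bump mapping can instead be used to actually move geometry. This is a minimal, hypothetical sketch (the function name, grid layout, and scale factor are illustrative, not from any engine or the games mentioned):

```python
# Sketch: reuse a grayscale height map (the kind fed to bump mapping) as a
# displacement map by moving grid vertices along their normal (here, +Z).
# All names and values are illustrative assumptions.

def displace_grid(height, scale=1.0):
    """Turn an HxW height map into displaced (x, y, z) vertices for a flat grid."""
    verts = []
    for y, row in enumerate(height):
        for x, h in enumerate(row):
            # Bump mapping would only tilt the shading normal here;
            # displacement mapping moves the actual geometry.
            verts.append((float(x), float(y), h * scale))
    return verts

grid = [[0.0, 0.5],
        [1.0, 0.25]]
print(displace_grid(grid, scale=2.0))
```

The key difference is that displaced geometry changes the silhouette and self-occlusion, which bump mapping cannot, which is likely why it looks less "fake".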
 
I think if certain mods from Rage3D knew anything, it would be in their sig alongside their decree of being a Rage3D mod.
The EGO is a wonderful thing.
 
Sxotty said:
I know, I too wanted to copy some of HB's more ludicrous quotes, but I decided that doing so would be childish.

He is just the polar opposite of these people, but unlike them he will remain around afterwards :)

You make that sound like a positive thing.... ;)

John Reynolds said:
Who cares about Unreal Engine 3 for this generation? NV50 and R500 will be out before a game ships using the engine.

It is amusing, the fuss some people are making about this engine. I mean, Tim Sweeney himself stated he doesn't expect to see games based on it until the late 2005/2006 timeframe. NV50 and R500 will be on their way out by then...
 
DaveBaumann said:
Guden Oden said:
Anyway, I wonder if R420 and derivatives will have anything like Nvidia's programmable video processor; if not, NV may well have an edge there no matter what the performance. Some may think NV4x is fast enough, and that Shaders 3.0 plus a programmable video processor throughout the entire product line tips the balance in favor of NV...

Not that I've read the rest of this thread, but ATI already has a programmable video processor - they use the shaders.

It's not equivalent. How would you do motion estimation in the shaders efficiently? Only part of the MPEG pipeline is amenable to stream operations. The shaders help with removing artifacts, deblocking, potentially even block transforms, but there's a lot more. There would be no need for the Theater chip otherwise.

(Note: this is not a dismissal of ATI hardware. It is simply an assertion of the non-equivalence of the pixel pipelines and a video unit that can read and write data in random-access memory, looping and branching over it, in addition to the framebuffer.)
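To make the motion-estimation point above concrete, here is a toy sketch of classic block matching, the MPEG step under discussion. Each block performs a data-dependent search with scattered reads into the previous frame and an argmin loop, which is exactly the kind of access pattern that is awkward to express as a pure pixel-shader stream operation. Frame sizes, block size, and search radius are toy assumptions:

```python
# Toy block-matching motion estimation: for a block in the current frame,
# exhaustively search a small window of the reference (previous) frame for
# the best-matching position by sum of absolute differences (SAD).

def sad(cur, ref, bx, by, dx, dy, bs):
    """SAD between a current-frame block and a shifted reference block."""
    total = 0
    for y in range(bs):
        for x in range(bs):
            total += abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
    return total

def best_vector(cur, ref, bx, by, bs=2, radius=1):
    """Return the motion vector (dx, dy) with minimal SAD in the search window."""
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # Skip candidates that fall outside the reference frame:
            # this data-dependent branching is part of why the algorithm
            # maps poorly onto a fixed stream of pixel operations.
            if not (0 <= by + dy and by + dy + bs <= len(ref)
                    and 0 <= bx + dx and bx + dx + bs <= len(ref[0])):
                continue
            cost = sad(cur, ref, bx, by, dx, dy, bs)
            if best is None or cost < best[0]:
                best = (cost, (dx, dy))
    return best[1]

# A bright 2x2 block moves from (1,1) in the reference to (2,2) in the
# current frame, so the block at (2,2) should match with vector (-1,-1).
ref = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
cur = [[0, 0, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 9, 9],
       [0, 0, 9, 9]]
print(best_vector(cur, ref, 2, 2))  # (-1, -1)
```

Deblocking or an IDCT, by contrast, reads a fixed neighborhood per output pixel, which is why those stages fit shaders much more naturally.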
 
highpingdrifter said:
I think if certain mods from Rage3D knew anything, it would be in their sig alongside their decree of being a Rage3D mod.
The EGO is a wonderful thing.
Oops... somebody's got some issues. Great second post :rolleyes:
 
highpingdrifter

Um, if ANYONE has seen the R420 in action, it's GI Bro; he hangs out at ATI HQ, as he lives just a little while away from there. Hmm, and Mr. B. Both are personal friends with Terry and have been in on the info for quite a while, so you need to stop talking now. Thanks, now bye.

As for the rest of the thread, I'm not going to comment on performance until the R420 is out.
 
highpingdrifter said:
I think if certain mods from Rage3D knew anything, it would be in their sig alongside their decree of being a Rage3D mod.
The EGO is a wonderful thing.

did YOU used to be a mod over there? ;)
 
I didn't read the whole thread yet since I'm pretty exhausted and on my way to bed, but I did read the first few posts and skimmed some on this page, and just wanted to say that if GI Bro or Kombatant said they know something, then you can bet your last dollar that they KNOW SOMETHING! 8)

I've known 'em both for years, and bumped heads with them a lot and disagreed too... but I have never known either to be untruthful, or to brag about anything serious unless they had something to back it up.

Call it a voice-o'-confidence, but it's my contribution before bedtime. Bro and Komb are good people; their words have weight with me.
 
All of the mods and admins are good people,

and all the beta testers are great people.

The mods and admins that are beta testers are godly people (except Alpha; he has been tainted, nobody wants to touch him anymore).

Have a nice night, Digi. It'll be fun reading your response in the morning.
 
PC-Engine said:
Actually we'll be able to play U3 at 3-5 fps on our NV40 equipped PCs in about a month :LOL:
Not to kill the joke, but I thought the previous gen ran UE3 at 5fps, and NV40/R420 ran it at 10-15fps? A couple more gens and we may just reach playable framerates! ;)
 
Pete said:
PC-Engine said:
Actually we'll be able to play U3 at 3-5 fps on our NV40 equipped PCs in about a month :LOL:
Not to kill the joke, but I thought the previous gen ran UE3 at 5fps, and NV40/R420 ran it at 10-15fps? A couple more gens and we may just reach playable framerates! ;)

I think Sweeney said ".. 6800 .. first card to run at decent framerates" in the Unreal Engine 3 video. Decent is IMO higher than 10-15 fps, but I'm guessing it isn't that much more.
 
Stryyder said:
trinibwoy said:
Must have missed all his bragging...he even says that the best thing for everyone is for both companies to be relatively equal. It's a sad thing when somebody reads another person's opinion on a video card and reacts like they b*tch slapped his mother.

I like mermaids......

Here is his original post. After being challenged, he waffled and did a complete 180. It's nothing but hypocritical BS, and that pisses me off.

I see no indications of any valid information on Rage3D. It appears to me they're trying to downplay the 6800 Ultra and its amazing PS 3.0 support.

I am trying to see if ATI's current cards can do what the Far Cry and Unreal 3 demos are showing... I don't believe so. In fact, I doubt the R420 can do it without PS 3.0 support.

I am curious what ATI can do to compete with Nvidia now... it sounds to me like it's going to be very hard, with so many developers supporting PS 3.0 and so many good reviews.

I think it is the same as the last LOTR movie, i.e. "Return of the King" - in this case I mean Nvidia.

No matter what, the 6800 Ultra definitely shows that Nvidia is sick and tired of the GFX jokes.

Of course, we really don't know what will happen until the ATI R420 is out, but in my opinion ATI is scared, and that's why they are downplaying PS 3.0. In a couple of weeks we will know the truth.
_________________
Exploration of humanity.

Quoted for emphasis.

trinibwoy, are you really trying to convince us that his first post was not trolling to the extreme?
 
Re: No indications of valid info on Rage3d

Bouncing Zabaglione Bros. said:
You probably won't be using NV40 for Unreal Engine 3 games either, because:

- NV40 will be the absolute baseline graphics card for UE3 games. You will have to turn down settings and/or resolution.

- Unreal Engine 3 games aren't due for another two years, i.e. when we'll be looking forward to the refreshes of at least NV50 and R500.

While I agree that you shouldn't put too much emphasis on Unreal Engine 3, I think it's a bit of a stretch to say that the NV40 won't be used for Unreal Engine 3 games. The 9700 will be around two years "old" when Half-Life 2 is released, and I think there'll be a lot of 9700 owners playing Half-Life 2. And not only the 9700, but 9600 Pros, XTs, and so forth.
 
Huh! Ignorance Abounds Here

Flow Control is a good thing?

NOT!

Maybe on a CPU flow control is good, but in what is essentially a super-scalar processor, flow control is bad-bad-bad. Let's pretend each pixel path in a pipeline quad (which is how both nV and ATI hardware is organized) has two ALUs. Now I can use those two ALUs to do a couple of different things, OR I can use those two ALUs to do the same thing going down two paths: in the case of NV40, do my 32Z pixel thing, OR do flow control and execute down both paths of the branch.

To me flow control is at best a half-speed performance decelerator.
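The "both paths of the branch" argument above can be sketched with a toy cost model. This is a hypothetical illustration, not a description of any real chip's scheduler: pixels in a quad run in lockstep, so if the quad disagrees on a branch condition, the hardware executes both sides with inactive pixels masked off, and the cost is roughly the sum of both paths:

```python
# Toy cost model for lockstep (SIMD) execution of an if/else over a quad of
# pixels. Path costs are made-up cycle counts, purely for illustration.

def branch_cost(conditions, then_cost, else_cost):
    """Cycles a lockstep quad spends on an if/else, given each pixel's condition."""
    any_then = any(conditions)          # at least one pixel takes the 'then' path
    any_else = not all(conditions)      # at least one pixel takes the 'else' path
    cost = 0
    if any_then:
        cost += then_cost               # run 'then' with non-taking pixels masked
    if any_else:
        cost += else_cost               # run 'else' with non-taking pixels masked
    return cost

# A coherent quad pays for one path; a divergent quad pays for both.
print(branch_cost([True, True, True, True], 10, 6))    # 10
print(branch_cost([True, False, True, False], 10, 6))  # 16
```

Under this model, flow control only wins when whole quads take the same path, which is consistent with the poster's "half-speed decelerator" worst case for divergent branches.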

I will wager there will be no games with any flow control until at least mid-2005, and very little after that.

Those in the know can tell you SM3 gives you absolutely no visual effects that SM2 doesn't give you. This is totally unlike the comparison between SM2 and SM1.x. The whole SM3 thing is some MS program manager's idea of something cool but pointless. HLSL is so dominant, and will generate SM2 and SM3 code with equal facility, that no ISV will care. After watching their SM3 code run slower on NV40 than their SM2 code runs on NV40, guess which switch nV will pull (hint: the same one they pulled for the last 18 months, when they swapped out SM2 code for SM1.x code because their hardware sucked on SM2 code)?

In the end just about everything will be SM2 until DX10 comes out.


P.S. As for nV being ahead of ATI: they finally have some video capability, only two years late, and at that, at least two reviews I read said the "video processor" is broken. Bah. The whole un-readiness of this product smells of panic and poopy pants. One would have expected nV to do better after two years of suckage.
 
Hellbinder said:
This is great. ;)

Threads like these are going to be soo soo classic in a wee bit more time. :)

I think I should save them all so I can bust out some quick quotes on a few people later, hehe :LOL:

Will they ever top the legendary stuff like this?

On NV40:

It is an 8x2 actually; from some real information I have seen, it *can* be an 8x4, being that their Vertex Shaders can handle 4 textures per cycle vs. the NV30/35's 2 textures.
The NV40 is *NOT* a 16x1. Never has been and never will be.

It is an 8x2 with vertex units that *Can* load up to 4 textures per cycle depending on the operation and needs at the given time.
There will be some serious Crow eaten by some sites out there when they suddenly have to explain why the Nv40 is not Gods Child of Video power and it still has some of the weaknesses of the Nv30 line.
Yes but.. I was also told that their AF will be limited to 8x still. And their new AA mode is not all that great. ATI's new AA mode will wipe the floor with it in quality and performance.
That Nv40 info is totally inaccurate. the Nv40 is an 8 "pixel" design that can Do 16 pixels in some cases.
Of course I could say the sky was blue and you would tell everyone six months from now how wrong i was... so...

It makes no sense at all that the NV40 is a 16-pipeline card. Just like the NV30 and NV35 are simply NOT 8-pipeline cards, even though Nvidia claimed over and over that they were. The NV40 is an extension of the NV35. They may well claim 16 pipelines, but then again they have claimed for the last year that "pipelines" don't count anymore..

The bottom line is that Their Designs now are about Pixels and not Pipelines. So yes I think that on some cases it may well do 16. But not always, and not in the same way you are thinking of it.
As for the NV40, it is more like 16 *pixels* per clock, and not in every case. They will also try to claim 32 *pixels*, which is also garbage, just like they pulled with the NV30, where it will be able to do 32 operations per clock in certain rare situations. Of course, they already have the media eating the lies and intentional exaggerations right out of their hand.

On the source of ATI presentation/slide/comments:

Yet here we are with the latest underhanded deed: getting some internal ATI presentation, then selectively releasing the part that (without the other slides they also have) paints ATI in a bad light. Yet again they have succeeded at misrepresenting the truth. FUD is spread and many people will think ill of ATI now, regardless of any official response they post. Just like in the days of Quack, or the PowerVR doc.

On R360:

R360 = 8x2

On R350:

R350 = .15u @ 400mhz...
Rv350=.13U @ 375mhz...
The R350 will ship with an approx clock speed of 425mhz
R350 will also sport OVER 400 million PPS vertex engine.
R350 will have around 30GB of REAL raw bandwidth.

On NV 35:

I *know* for sure, as in Burn My Entire Family in Hell, that the NV35 is coming around June/July and has a 256-bit memory controller, and is going to be clocked WAY higher than the NV30 Ultra...
Want the truth?

Nv35 is going to beat the R350 in every single benchmark known to man... and thats the bottom line.

BUT, unless something changes. its FSAA is still going to suck.

However, really the R350 is not intended to compete with the NV35 at all anyway. The only reason the NV35 is going to come out a while after the R350 is that if they (Nvidia) don't launch it then, no one on earth would buy it. It's that simple. When the NV35 comes out, everyone will already be thinking excitedly about the R400. By then the rumor mill will be in FULL swing. Regardless of what some people are posting at nvnews etc., the R400 makes the NV35 a moot point. I am already seeing some posts over there where some of the main guys like Uttar (a nice fellow) are making statements like this: *Nvidia does not need the NV40 for the R400; the NV35 is more than enough*..
 