My notes from Las Vegas Launch Event

Joe DeFuria said:
Ummm...Fred...

Guys. We essentially don't know ANYTHING yet...

First of all, considering this is a LAUNCH, the fact that "we don't know anything yet" should raise some flags....
We've already been told it won't be available until February, which is ~90 days away. I don't see any "flag" here. The pooch has already been screwed, so to speak.

However, the fact that they have demo boards running a variety of software eliminates most concerns regarding NV30. And provided that those demo boards go out for independent previewing next month, any remaining uncertainty will vanish. If Anand has been told to expect a 500 MHz (!) part, NVIDIA must be reasonably certain of their yields. Mind you, we don't know much about the availability of 500 MHz DDRII...
 
Man, I wish I had my SanDisk with me; I just looked at my digicam and I have video of the Sweeney tech demo. If Dave can give me hosting space, I will upload them early Wednesday morning when I get back home. The quality won't be good, since I didn't lug along a real miniDV camcorder (my Sony TRV900) and instead recorded the video on my digital camera (PowerShot G1). I think it will end up looking grainy when viewed on a computer monitor, but it should give you the idea.

-DC
 
Oompa Loompa said:
Joe DeFuria said:
Ummm...Fred...

Guys. We essentially don't know ANYTHING yet...

First of all, considering this is a LAUNCH, the fact that "we don't know anything yet" should raise some flags....
We've already been told it won't be available until February, which is ~90 days away. I don't see any "flag" here. The pooch has already been screwed, so to speak.

However, the fact that they have demo boards running a variety of software eliminates most concerns regarding NV30. And provided that those demo boards go out for independent previewing next month, any remaining uncertainty will vanish. If Anand has been told to expect a 500 MHz (!) part, NVIDIA must be reasonably certain of their yields. Mind you, we don't know much about the availability of 500 MHz DDRII...


I still think things are not far enough along for nVidia to have established a good picture of chip yields at this point. Either very good or very poor yields could definitely change things--both would affect price, though in opposite directions. I'm also having trouble reconciling the "effective" 48 GB/s bandwidth figure with nVidia's dependence on 500 MHz DDR-II (the two seem contradictory.)
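(Back-of-envelope, assuming the rumoured 128-bit bus: 500 MHz DDR-II transfers on both clock edges, so 1 GHz effective x 16 bytes per transfer = ~16 GB/s of raw bandwidth; an "effective" 48 GB/s would therefore have to come from something like 3:1 colour/Z compression rather than from the memory itself.)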
 
It doesn't matter what you think about yields. NVidia can take risks and lose as much money as they want based on their marketing strategy, if they think it is utterly important to risk money upfront and deliver by late January or February, and reap better margins later from followups.


IMHO, the A01 silicon was running complex games and demos for over 2 hours at the event; whatever bugs they have, none of them are showstoppers.
 
DemoCoder said:
My overall feeling is: Thank gawd it's not a huge disappointment. It is performance competitive with ATI, so we will soon have two high performance DX9 chips on the market, and hopefully, ATI and NVidia can together force everyone to upgrade over time, and create a large platform for developers to take advantage of these features.
That's what I've been hoping everyone here will focus on, instead of the usual tit-for-tat.

Thanks for your first post, Ray.
 
DemoCoder said:
IMHO, the A01 silicon was running complex games and demos for over 2 hours at the event; whatever bugs they have, none of them are showstoppers.
I'd certainly be interested in learning more about Sweeney's new engine, by the way. Soft shadowing is nice; what else did you observe that would represent an evolution wrt UT2003?
 
DemoCoder said:
And to answer the comments about the Sweeney demo, it was NOT the UT2k3 engine. It was stated to be a research project, and it was more than just soft shadows: it was a unified lighting model like Doom3's, with full 100% shadowing everywhere (including soft shadows) and lots of other volumetric lighting effects. Sweeney said the main character walking around the screen (a knight in armor with a torch) had 1+ million polygons and that the NV30 could comfortably render about 20 of these at once. However, I doubt the full-resolution mesh is actually being used.


Thanks for clarifying this a bit. I only asked about the UT2K3 engine because I'd assumed he'd built it for more longevity--but I guess Sweeney's relying more on the CPU-limited nature of the engine to go the distance.
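(Rough arithmetic on that 20-character claim, assuming it really means 20 full 1-million-polygon meshes on screen: 20 x 1M = 20 million polygons per frame, which works out to hundreds of millions of triangles per second at playable frame rates--probably why you doubt the full-resolution mesh is actually being drawn.)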


And to answer your question about the benchmarks of Unreal et al., they had them running on OEM stations outside the main conference, and you could play around with all the NV30 demos. However, I was not able to get a look at benchmark numbers because the benchmark demos were in a special room for VIPs (probably for OEMs and important ISVs like EA, Epic, etc.).

I must emphasize again: they had real, live, running hardware that anyone could play on, presumably running at 500 MHz (don't know about memory), so none of the demos were canned, faked, etc.

As far as I'm concerned, that was never in doubt...;) (Um, now that you mention it....were the demo boxes sealed?....*chuckle*)

I was told by several NVidia and OEM people that I could buy one in January, so believe it or not, for what it's worth.

Any specific OEMs? This sounds logical to me. Was there any mention of what exactly prevented a pre-Xmas release? I'm getting the impression from you that you felt things were "ready to go," so I'm wondering if this came up (it would surprise me if the subject never came up--but maybe it didn't.)


The beginning of the presentation almost made me faint: a GeForce + 3dfx animation was shown combining into the GeForce FX? Was the NV30 a Gigapixel tiler!? HOLY COW! What a surprise. Reality turned out to be far more mundane, but I was still happy that they managed to get it running at 500 MHz with 1 GHz RAM. Think about it: 4 gigapixels/s, possibly 4-8 giga-shader-ops per second (depending on the dispatch rate), and 6-8X FSAA? That's potentially 32-64 giga-ops/sec.

I'm not too surprised at this, considering it was done on .13 microns. I mean, if some air-cooled, factory .15 micron R300s can do 3.2 gigapixels/sec (400 MHz), I can't see how 500 MHz would elude ATI at .13 microns. I would also think that a lot of the work nVidia & TSMC did with the process was pioneering, and will allow ATI to get to .13 microns much more quickly when it's time.
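(For reference, a back-of-envelope assuming 8 pixel pipelines writing one pixel per clock: 400 MHz x 8 = 3.2 gigapixels/s and 500 MHz x 8 = 4 gigapixels/s; the 32-64 giga-ops figure above is then just the 4-8 giga-shader-ops multiplied by the 8 FSAA samples.)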

I am waiting cautiously to see how many other features are present: displacement maps, gamma-correct AA, video features, dual output (?), how LMA3 works, etc.

There seem to be quite a few holes in the presentation thus far. I can't think of a good reason why they'd keep things like this so vague at this point.

My overall feeling is: Thank gawd it's not a huge disappointment. It is performance competitive with ATI, so we will soon have two high performance DX9 chips on the market, and hopefully, ATI and NVidia can together force everyone to upgrade over time, and create a large platform for developers to take advantage of these features.

I don't know too many who felt it might be disappointing or fail to compete with the 9700 Pro... :) However, I think there will be more than a few who will find that what the nv30 is doesn't match their expectations (which were probably way out there in the first place.) The one great advantage I think ATI had here is that no one expected them to do a product like the R300 9700 Pro. As such, there was not as much fantasy built up around its release as there has been around the nv30. I'm anxious for things to shake out, though, so that I can see this product clearly. Right now it's difficult to separate the marketing from the technically factual.
 
Well, I think I noticed some of that over-bright "glow" lighting like you saw in the Halo video, but I'm not sure. I rewatched the video and Sweeney says apologetically that it's an early demo of some of the things they are experimenting with. It didn't have the "wow" factor of the Doom3 demo, but it was still very cool.

Actually, I found the EverQuest2 demos very interesting. The armor and character customizations are modeled in exquisite detail. Most clothing has realistic cloth physics (a woman's dress flowing around her legs as she walked, and armor with what appear to be individually modeled chain mail rings!)

Hopefully I can get them online for people by Wednesday afternoon.

-DC
 
WaltC said:
Silent running must mean the thing is dang loud when it's running...;) nv30 must be hot as a firecracker at stock speeds.

All the FiringSquad talk of heat pipes and circulating gases made me think of a jet turbine, for some reason. A big jet engine on the back of an F-18, with afterburner engaged... that's what the NV30 will sound like when it's running at full 500 MHz. :p
 
Thanks for the notes, DemoCoder!

To me it looks like Nvidia had to clock their ultra-high-end part a little higher than projected. That is a pretty sophisticated cooling system they're going to use. In the end it doesn't matter much. The high-end $500 card is going to be bought by enthusiasts who are going to try to overclock it anyway.

Am I right in saying ATI hasn't released the 9700 yet? If so, the real battle will probably be between the 9700 (not the 350) and Nvidia's second-fastest chip. It looks like ATI is going to hold off on the 9700 so that they know where to clock and price it relative to Nvidia's solution.

The architectures for these cards seem to have converged; the R300 and NV30 look very similar. My guess is they will start to diverge in the next generation, now that both have completed the major (and probably resource-intensive) vertex and pixel shader advances built up over the last couple of generations. I'm guessing we'll see more radical solutions to bandwidth, such as deferred rendering or tiling, on the R400 and NV40 now that shaders have matured.
What else will those engineers use their larger transistor budgets on now besides attacking bandwidth?
 
DemoCoder said:
It doesn't matter what you think about yields. NVidia can take risks and lose as much money as they want based on their marketing strategy, if they think it is utterly important to risk money upfront and deliver by late January or February, and reap better margins later from followups.


IMHO, the A01 silicon was running complex games and demos for over 2 hours at the event; whatever bugs they have, none of them are showstoppers.

High-end chips are just a waste of money to begin with. They're just a marketing tool for the lesser cards that actually sell. I don't see why they'd stop forking out the cash at this point... it's not like they'll actually make any money off the NV30 itself anyway.

DemoCoder said:
Actually, I found the EverQuest2 demos very interesting. The armor and character customizations are modeled in exquisite detail. Most clothing has realistic cloth physics (a woman's dress flowing around her legs as she walked, and armor with what appear to be individually modeled chain mail rings!)

It would be nice if they spent 1/10th the time on the gameplay that they seem to be spending on the graphics. Actually, even 1/100th would be nice.
 
Laa-Yosh said:
Demo, is this screenshot from the new Unreal engine that you were talking about?

http://www.guru3d.com/tech/geforcefx/img/Games/Unreal_GFFX_01.jpg
For the lazy ones, the text for the pic at Guru3D is as follows:

Unreal Engine GeForce FX Technology Demo - A very highly detailed scene with bump maps, rich textures, and dynamic lighting: the flexibility of the GeForce FX shading allows for elaborate lighting calculations resulting in the soft self shadowing in the folds of statue’s cloak and the subtle shadows in the corner of the room.
 
BoddoZerg,

while I was at Epic they showed us some stuff for an upcoming, yet-to-be-announced title, and it did look very sweet!
 
Wasn't it rumoured that the next-gen Thief or Deus Ex would run on a modified Unreal Engine with unified lighting?

It may well have been some other game, but I thought this was the rumour...

With Regards
Kjetil
 
Kaizer said:
Wasn't it rumoured that the next-gen Thief or Deus Ex would run on a modified Unreal Engine with unified lighting?

It may well have been some other game, but I thought this was the rumour...

With Regards
Kjetil

Not a rumour, it's fact. The T3/DX2 engine has unified lighting, BUT it's all stencil shadows like Doom3, and they're hard-edged. All the enhancements to the Unreal engine used by those games were done in-house at Ion Storm Austin.

-Colourless
 
Democoder,

Are there any URLs/links to the EverQuest2 demos that were shown on the FX card? I know you said Wednesday - I didn't see anything on their site that referenced the FX board or demo...

Have a good one.
-saf

DemoCoder said:
Actually, I found the EverQuest2 demos very interesting. The armor and character customizations are modeled in exquisite detail. Most clothing has realistic cloth physics (a woman's dress flowing around her legs as she walked, and armor with what appear to be individually modeled chain mail rings!)

Hopefully I can get them online for people by Wednesday afternoon.

-DC
 
http://event.on24.com/eventRegistra...&format=&key=8818A278CD4B677B742470856670B9CB

Skip forward to 1:08

-- Daniel, Epic Games Inc.

worm[MadOnion.com] said:
Laa-Yosh said:
Demo, is this screenshot from the new Unreal engine that you were talking about?

http://www.guru3d.com/tech/geforcefx/img/Games/Unreal_GFFX_01.jpg
For the lazy ones, the text for the pic at Guru3D is as follows:

Unreal Engine GeForce FX Technology Demo - A very highly detailed scene with bump maps, rich textures, and dynamic lighting: the flexibility of the GeForce FX shading allows for elaborate lighting calculations resulting in the soft self shadowing in the folds of statue’s cloak and the subtle shadows in the corner of the room.
 