DX10 Hardware out with developers

LeStoffer

Okay, we finally got the word that DX10 hardware is out and working amongst developers like Crytek:

Key quotes from Crytek's CEO and President, Cevat Yerli:

We already have DX10 footage – it was shown at the Microsoft Press Conference. The 360 trailer (shown in the EA booth) – that's DX10 as well. The video is captured from actual DX10 hardware.
GI: How close are you working with Microsoft with Vista and DX10?

Yerli: Very close. Actually we are so close that we get new drivers every day, new hardware, and new builds. You know, our programmers are crazy about it. It's an insult that they go crazy because what happens is that every day potentially a new thing has to be rebuilt. Since it's Vista we are working with Alpha versions of Vista, we are working with drivers that get updated every day with new source code, we work with hardware that are prototypes, everything is just unstable.

http://www.gameinformer.com/News/Story/200608/N06.0830.2058.31148.htm
 
Well, G80 has been back from the fab for a while. And so have G965 and SiS's part. But it's always good to get confirmation from developers that it's actually in their hands already :) Now, if we could get confirmation that it's G80, that'd be really interesting from the point of view that it'd duplicate the G70 advantage of getting hardware into devs' hands first and letting them tune their apps to NVIDIA's hardware.
Let's hope, if that's the case, that R600's optimization guidelines aren't too different from G80's. Otherwise, ATI's in trouble, and that's Bad (TM).

Uttar
 
I took:
Yerli: If there is no hardware, you have to use the software emulation, and because of its performance, it's no pleasure to work with. On the other hand, buggy alpha hardware can be really painful. We cannot develop techniques solely for DX9 or DX10, so we implement, create, and tweak the level with DX9 and adjust the code afterwards for DX10. We cannot comment though under which DX10 development conditions we are working, since it would infringe NDAs.
http://www.extremetech.com/article2/0,1697,1989497,00.asp

to mean that Crytek has been working with buggy alpha hardware for a while now.
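
As an aside, that "implement the level with DX9 and adjust the code afterwards for DX10" approach is the usual dual-path renderer setup. A minimal C++ sketch of the idea is below - every name in it is invented for illustration, nothing to do with CryENGINE's actual code:

// Toy sketch of a dual-path renderer abstraction (all names made up).
// The game/level code only ever talks to IRenderer; levels get built and
// tuned against the stable DX9 path, and the experimental DX10 path is
// swapped in underneath once the alpha hardware and drivers cooperate.

#include <cstdio>
#include <memory>

struct Scene {};  // placeholder for whatever the engine hands the renderer

class IRenderer {
public:
    virtual ~IRenderer() {}
    virtual void DrawFrame(const Scene& scene) = 0;
};

class D3D9Renderer : public IRenderer {
public:
    void DrawFrame(const Scene&) override {
        // a real backend would issue draw calls through IDirect3DDevice9
        std::printf("rendering via the stable DX9 path\n");
    }
};

class D3D10Renderer : public IRenderer {
public:
    void DrawFrame(const Scene&) override {
        // a real backend would go through ID3D10Device on the alpha hardware
        std::printf("rendering via the experimental DX10 path\n");
    }
};

std::unique_ptr<IRenderer> CreateRenderer(bool dx10Available) {
    if (dx10Available)
        return std::unique_ptr<IRenderer>(new D3D10Renderer());
    return std::unique_ptr<IRenderer>(new D3D9Renderer());
}

int main() {
    Scene scene;
    // flip the flag once the DX10 drivers of the day actually work
    std::unique_ptr<IRenderer> renderer = CreateRenderer(false);
    renderer->DrawFrame(scene);
}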

I don't for a second believe it's G80, since Crytek cuddled up to ATI quite a long time ago (for The Project techdemo) and haven't budged. Witness all the Crytek stuff on ATI's site, e.g.:

http://www.ati.com/developer/techpapers.html#acm2006

I wouldn't be surprised if Epic has G80, though.

Earlier thread about the Extremetech article:

http://www.beyond3d.com/forum/showthread.php?t=32039

Jawed
 
I think Crytek are smart enough to not side 100% with any one IHV. I'd bet that it is actually G80, myself ;)

On a side note, good to see you back, Jawed! A few of us wondered in private a week or two ago where you were, have you been on holiday?
 
I think Crytek are smart enough to not side 100% with any one IHV. I'd bet that it is actually G80, myself ;)
Cynically, I'm afraid, I perceive it as Crytek having been burnt by NVidia's devrel. They wasted an awful lot of time experimenting with SM3-specific stuff in the run-up to NV40's release, and all they had to show for it at the end was a set of conclusions on why SM3 was of barely any use to them, plus the FP16 HDR that went into the game (nothing to do with SM3), which took months of futzing to finally get sorted. Arguably they prolly would have wanted to do all that experimentation anyway once NV40 was released - but with Far Cry in release crunch just before NV40 appeared, I bet they were a bit miffed at the timing and at the pressure from NV to put stuff into the game so that NVidia could market the HDR and SM3 goodness of NV40.

From these most recent comments, perhaps you might argue M$ is pressurising Crytek... Anyway, I think the relationship between ATI and Crytek has been fairly amicable for over 18 months now. Live public showings of Crysis at E3 were on CrossFire X1900XTXs...

On a side note, good to see you back, Jawed! A few of us wondered in private a week or two ago where you were, have you been on holiday?
Cheers :!: I've been "between places" since the start of August and just moved somewhere permanent this Monday... All a bit frustrating being constrained to limited-time internet-caff access, but perhaps not a bad time of year...

Jawed
 
Source? Is this public information, or are you claiming this here for the record by yourself? ;) :)

Uttar
Well, in some way or another, it has to be public information, of course. You just need to know where to search for it. ;) :smile:
Not that it helps much anyway, because the details we don't know tend to be the most interesting ones, and it surely won't affect any introduction date - unless further delays pop up, which they always do when you're not expecting them.
 
Main Entry: vague
Pronunciation: 'vAg
Function: adjective
1 : Sunrise's last post

Main Entry: cryp·tic
Pronunciation: 'krip-tik
Function: adjective
1 : See vague

:D
 
Uttar said:
Let's hope, if that's the case, that R600's optimization guidelines aren't too different from G80's. Otherwise, ATI's in trouble, and that's Bad (TM).
Well, with R600 being a USA (unified shader architecture), it shouldn't have any problems in that regard anyway, should it?
 
Well, with R600 being a USA (unified shader architecture), it shouldn't have any problems in that regard anyway, should it?
That's part of NVidia's argument against USA: the way it performs is significantly different (unpredictable). It's only possible to say what the minimum number of cycles for a shader is - I think that's because the overhead a USA suffers when threads are swapped is unpredictable, so the maximum duration of a shader varies with the total workload on the GPU. (I'm including in this stuff I've read about Xenos performance - e.g. clause or loop sizes affect performance if they're not long enough to hide the stall incurred by thread swapping or looping.)
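
To put a rough number on the latency-hiding part, here's a toy model I'm making up purely for illustration (not from any ATI/NVidia document): each thread does K cycles of ALU work per loop iteration, then issues a fetch that takes L cycles, while the core round-robins T threads. A round then costs max(T*K, K+L) cycles for T*K cycles of useful work, so short loop bodies can't cover the stall:

// Toy model of latency hiding on a multithreaded (USA-style) shader core.
// K = ALU cycles per loop iteration, L = fetch latency, T = threads in flight.
// The fetch is fully hidden only when the other threads' work covers it,
// i.e. when T*K >= K+L. All numbers are invented for illustration.

#include <algorithm>
#include <cstdio>

double aluUtilisation(int K, int L, int T) {
    double useful = static_cast<double>(T) * K;                    // ALU cycles issued per round
    double round  = std::max(useful, static_cast<double>(K + L));  // round length in cycles
    return useful / round;
}

int main() {
    const int L = 100;  // pretend fetch latency, in cycles
    const int T = 16;   // threads the core can juggle
    const int loopBodies[] = {2, 4, 8, 16, 32};
    for (int K : loopBodies) {
        std::printf("loop body of %2d ALU cycles -> %3.0f%% ALU utilisation\n",
                    K, 100.0 * aluUtilisation(K, L, T));
    }
}

With those made-up numbers a 2-cycle loop body only keeps the ALUs ~31% busy, while an 8-cycle body already hides the whole fetch - which is the sort of "long enough clause" effect I mean above.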

Slide 21:

http://www.ati.com/developer/brighton/05 Graphics Performance.pdf

But if you note the tenor of the presentation as a whole, you'll see that the concept of "balance" is somewhat misleading in the sense that the traditional pipeline allows devs to balance their code (slide 10).

As a non-dev, though, I really haven't any empirical appreciation for this stuff.

Jawed
 
I don't know about the video they talk about, but as for the hardware... I bet it's Intel GMA X3000 :)

Intel Graphical Mediatoronomo Accelerometer EX-Thirty 3 Thousand Million Mega Onboard edition?

Can't they just call it II3 (Intel Integrated 3) or GMA-3? What makes you think they're using that by the way? :LOL:
 
I doubt the Crytek folks wouldn't want to work with something orders of magnitude better than the software rasterizer, and more closely resembling what the game will actually be played on...

As for generating a video... it doesn't have to be real time :). Have it render with a decoupled tick rate - the simulation step doesn't need to track wall-clock time - or do whatever else you have to do to grab your hi-res frames. To get the audio, run the same demo again with a null renderer.
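
Something along those lines - a rough sketch, where every function (UpdateSimulation, RenderFrame, SaveFramebuffer) is a hypothetical stand-in rather than anything from a real engine or capture tool:

// Non-real-time capture sketch: step the simulation by a fixed tick no matter
// how long each frame takes to render, and dump every frame to disk. The
// frames get assembled into a video afterwards; the audio comes from a second
// pass of the same demo with a null (non-drawing) renderer.

#include <cstdio>

void UpdateSimulation(double /*dt*/)       { /* advance demo state by the fixed tick */ }
void RenderFrame()                         { /* draw the current state, offscreen and hi-res */ }
void SaveFramebuffer(const char* /*path*/) { /* read back and write the frame */ }

int main() {
    const double fps    = 60.0;       // playback rate of the final video
    const double tick   = 1.0 / fps;  // fixed simulation step per captured frame
    const int    frames = 60 * 30;    // 30 seconds' worth of frames

    for (int i = 0; i < frames; ++i) {
        UpdateSimulation(tick);       // wall-clock time is irrelevant here
        RenderFrame();                // can take seconds per frame, doesn't matter
        char path[64];
        std::snprintf(path, sizeof(path), "frame_%05d.png", i);
        SaveFramebuffer(path);
    }
    // second pass: same demo, null renderer, record the audio in real time
    return 0;
}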
 
No doubt, but then again a high-end D3D10 GPU is also worlds apart from a D3D10 IGP, and even more so from the G965.
 