*spin-off* idTech Related Discussion

Is this what Epic is doing in Gears 2 :?: Sounds similar in concept.

I'm not familiar with Gears2 (thanks to MS wanting to revive the PC so much :rolleyes::rolleyes: ). Do you mean WRT texture streaming?

About the slides. Am I reading too much into slide 23 or will we need a tri-core or above for a 60fps game on PCs?
 
Thanks for the link, liolio. The conclusion ties in with a lot of what was being said in the early days of this generation. Cell forces alternative programming models, but the whole world was going that way so may as well make a start now!

And good news for the gameplay - 60fps and intolerance of even a single frame lag!
 
About the slides. Am I reading too much into slide 23 or will we need a tri-core or above for a 60fps game on PCs?
They don't say what speed of core produces those individual job times. I imagine id would want to target dual-core and maybe scale back.
 
I'm not familiar with Gears2 (thanks to MS wanting to revive the PC so much :rolleyes::rolleyes: ). Do you mean WRT texture streaming?

About the slides. Am I reading too much into slide 23 or will we need a tri-core or above for a 60fps game on PCs?

38 msec of processing in 16 msec? That's how you got tri-core or above? They also allow some jobs to complete one frame late, so they don't stall for synchronization. I guess it would still add up to the same 38 msec, even with jobs being deferred. Aren't some of those jobs GPU functions like "rendering"? I think dual-core would probably work, depending on resolution, detail settings etc. Are we talking 60Hz at 720p, 1080p or higher?
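To put numbers on it: 38 ms of work only forces three cores if every job has to retire inside the same 16.6 ms frame (38 / 16.6 ≈ 2.3, round up to 3). If some jobs may land a frame late, a rough sketch of that pattern looks like this (plain C++ with a hypothetical DoJob standing in for AI/physics/etc., not id's actual job system):

#include <cstdio>
#include <future>
#include <utility>
#include <vector>

// Hypothetical job body standing in for AI, physics, mixing, etc.
static int DoJob(int frame, int job) { return frame * 100 + job; }

int main() {
    std::vector<std::future<int>> prev; // jobs kicked off last frame
    for (int frame = 0; frame < 3; ++frame) {
        // Kick this frame's jobs without waiting on them.
        std::vector<std::future<int>> curr;
        for (int j = 0; j < 4; ++j)
            curr.push_back(std::async(std::launch::async, DoJob, frame, j));

        // Synchronize only on *last* frame's results: a job that
        // overruns its slice shows up one frame late instead of
        // stalling every worker at a frame-end barrier.
        for (auto& f : prev)
            std::printf("consumed result %d\n", f.get());

        prev = std::move(curr);
    }
    for (auto& f : prev) f.get(); // drain on shutdown
}

The point being that the sync happens against last frame's futures, so a straggler delays its consumer by one frame rather than stalling the whole pool.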
 
Yeah. I'll see if I can't find a better video of it...

http://www.youtube.com/watch?v=XO-2M43YIrE#t=1m50s

The person sped up the video, but you should be able to pick out the progression of texture lod on the pillar there.

I have no idea what you're trying to show me in that video. :p

WRT DVD reads, there's an interesting paper by J.P.

J.P. said:
Most of the data is read from the two highest detail layers at 6 MP/s (= 275 tiles/s). At an average compression ratio of 14:1 without storing mipmaps per tile this results in 1.25 MegaBytes worth of useful tiles that need to be streamed from the DVD per second. Furthermore, at this compression ratio there are about 30 to 40 tiles per 128 kB block read from the DVD. In practice about half the tiles streamed from DVD are dropped and as such wasted. This results in about 15 to 20 tiles being read per seek.
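Those figures triangulate if you assume roughly 4 bytes per texel before compression (my assumption; the excerpt doesn't say):

128 x 128 texels/tile x 4 B/texel = 64 kB raw per tile
64 kB / 14 ≈ 4.7 kB per compressed tile
275 tiles/s x 4.7 kB ≈ 1.26 MB/s, matching the quoted 1.25 MB/s
128 kB / 4.7 kB ≈ 27 tiles per block, just under the quoted 30 to 40 (14:1 is only an average)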
 
The new Edge article was summarized by Gofreak over at GAF

- the Mac version 'is a first class citizen' in development; two of the programmers use it as their main development system (Willits & Duffy)

- Single textures range up to 128000x128000, weighing in at 120GB each uncompressed. The game currently weighs in at 'probably a couple of terabytes' according to Carmack.

- The sweet spot would be one 50GB Blu-ray and four 8.5GB DVDs, and Doom 4 may wind up as that, but that wasn't an indication of what Rage will ship on, as C&VG had reported; it was just the 'ideal'.

- Every version starts with the same base data and then is compressed to suit each host's medium in a process that takes a couple of days.

- Carmack says they have the 'option' of providing a higher quality dataset with less compression for PS3 if they don't mind doing that process separately for PS3. Similarly they could do a 'super platinum' PC edition with an 'incredibly rich' dataset if someone doesn't mind assigning 60+ GB for it.

- A lot of id Tech 5's lighting effects are prebaked into the textures. It doesn't handle global dynamic lighting. Edge notes differences between the dynamic lighting and shadows of, say, the torch in an NPC's hand and the subtler, higher quality shadows thrown by static objects in the scene, and the irony of the departure from the 'no smoke and mirrors' approach of the Doom3 engine. They don't see it as an issue for Rage itself, but they point out that it might make the engine a less easy fit for other developers.
 
Lights baked into textures make a LOT of sense. They don't need lightmaps with the unique environment textures, and they can bake at a quality level where they'd need weeks to calculate the lighting...
 
Lights baked into textures make a LOT of sense. They don't need lightmaps with the unique environment textures, and they can bake at a quality level where they'd need weeks to calculate the lighting...
In a way RAGE is branching away from the complexity of modern game engines. Where developers are trying to find solutions for faking GI in a fully dynamic world, RAGE is kinda like a 3D-ified version of the old 2D hand-painted adventure games. It's basically a huge painting, with the glorious quality and variety that goes with it.

It's a solution that'll suit some games really well, but won't work for everything. I wonder if the overheads of Megatexturing will prevent it from being used in more complex lighting situations? There isn't really anything to stop a deferred renderer using megatexturing for the albedo colour.
 
Full dynamic lighting is certainly possible, and they may also be able to store some global illumination related information in addition to the color texture. Although I wonder how much fragment processing power a deferred renderer would require, as virtual texturing is already quite fragment heavy.

I guess we'll see with Doom4, although id is pretty quiet about the direction they plan to follow - close-quarters survival horror like Doom3, or a shooting fest like the first two games?
 
- Carmack says they have the 'option' of providing a higher quality dataset with less compression for PS3 if they don't mind doing that process separately for PS3. Similarly they could do a 'super platinum' PC edition with an 'incredibly rich' dataset if someone doesn't mind assigning 60+ GB for it.
So this means:

- Rage will not take advantage of every platform,
- We'll get a cropped version because of X360's storage limitations,

and confirms Tim Willits' year-old comments [ http://www.1up.com/do/newsStory?cId=3169963 ]?
 
Every version starts with the same base data and then is compressed to suit each host's medium in a process that takes a couple of days.

- Carmack says they have the 'option' of providing a higher quality dataset with less compression for PS3 if they don't mind doing that process separately for PS3.

Any plausible reason why they wouldn't use that option?
Seems easy enough.

And I suppose the "super platinum" PC edition megatextures would look even better? With a 60GB install, that's probably less compression than they'd need to fit it on a double-layer BD? Maybe they'll release a BD edition of the PC version.
 
Any plausible reason why they wouldn't use that option?

A larger data set might have performance implications. The data gets spaced further apart on disc, implying longer disc seeks (which could introduce latency issues and hence more visual pop). It also implies using a dual-layer disc, which has a performance hit on a layer change. Presumably a dual-layer Blu-ray costs more as well, which affects their bottom line. The larger data set may also take significantly longer to generate, adding to development time if they have to regenerate data often. More data to load means more data to process CPU/GPU side as well. Etc...


keyn said:
- We'll get a cropped version because of X360's storage limitations

...or I guess we can just assume that the 360 ruined video games for everyone :)


Thanks for the link, liolio. The conclusion ties in with a lot of what was being said in the early days of this generation. Cell forces alternative programming models, but the whole world was going that way so may as well make a start now!

Strictly speaking, multi-core coding forces alternative programming models. Cell forces both alternative programming models and alternative data models. It's the data side that more often than not causes significant grief.
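A concrete example of the data-side grief, as a sketch: on Cell everything has to be carved into chunks that fit an SPE's 256 kB local store and can be DMA'd as one contiguous block, which pushes you toward flat structure-of-arrays layouts instead of pointer-heavy objects. Ordinary C++ below with an illustrative batch size, not actual SPU code:

#include <cstddef>
#include <vector>

// Structure-of-arrays: each field is one flat, contiguous run, so a
// fixed-size slice of it can be moved with a single DMA transfer.
struct Particles {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
};

// Batch sized so position + velocity slices (plus double buffers)
// would fit a 256 kB local store; 2048 here is illustrative.
constexpr std::size_t kBatch = 2048;

void Integrate(Particles& p, float dt) {
    const std::size_t n = p.x.size();
    for (std::size_t base = 0; base < n; base += kBatch) {
        const std::size_t end = (base + kBatch < n) ? base + kBatch : n;
        // On a real SPE this inner loop would run against the
        // DMA'd-in local copy while the next batch streams in.
        for (std::size_t i = base; i < end; ++i) {
            p.x[i] += p.vx[i] * dt;
            p.y[i] += p.vy[i] * dt;
            p.z[i] += p.vz[i] * dt;
        }
    }
}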


Richard said:
JC mentioned how the game runs much slower if they don't bake the light in.

I wonder if they can make Rage+GI work on current console hardware considering how shader heavy it already is. That, along with no day/night cycles and huge storage requirements, might make it tricky to convince people to switch from their own in-house rendering tech, at least for current gen.
 
I wonder if they can make Rage+GI work on current console hardware considering how shader heavy it already is. That, along with no day/night cycles and huge storage requirements, might make it tricky to convince people to switch from their own in-house rendering tech, at least for current gen.

Indeed. The storage issue causes problems for Oblivion, GTA and other free-roam type games that target the Xbox 360. As for GI: right now the 360 seems to have fps to spare, and so will the PS3 once it gets up to 60fps. Drop the render rez to 600p or whatever the latest CoD is supposed to be doing and I think it will be fine. Perhaps that's why the next DOOM is supposed to go back to the 30fps goal? In the first year after id Tech 5 was announced, JC wondered whether the PC version would have GI instead of baked lighting, at least indoors; there's been no news on that front since.

Another problem not many are talking about is that MT kills mods, or at least custom maps. This won't be a huge issue for most licensees of course, but it's another nail. The baked-in static shadows also mean a visible mismatch with the dynamic shadows.

The biggest problem I see is this: while id says that MT doesn't necessarily increase development time for artists, that it makes life easier because there's no more texture budget, and that their id Studio tools are great for streamlining the pipeline, what we/licensees actually see is that RAGE will be released five years after their last game, and that's already taking into account the year they spent on the cancelled Darkness game.

I believe it's telling that back in 2007 they held a few closed-door showings at E3 for potential licensees and said in interviews that they were going to pursue licensing a lot more, while in the interviews they've given this year they tell us that licensing isn't their focus, there are no licensing announcements, etc. etc.
 
So this means:

- Rage will not take advantage of every platform,
- We'll get a cropped version because of X360's storage limitations,

and confirms Tim Willits' year-old comments [ http://www.1up.com/do/newsStory?cId=3169963 ]?
Naah, you'll get a stuttery version because your GPU is pants... :)

Seriously though, I suspect they'll try to take advantage of the strengths of each platform, although if they have texture pop issues reading from DVD, they're going to be even worse reading the less compressed data from BD.
 
Full dynamic lighting is certainly possible, and they may also be able to store some global illumination related information in addition to the color texture. Although I wonder how much fragment processing power a deferred renderer would require, as virtual texturing is already quite fragment heavy.

I guess we'll see with Doom4, although id is pretty quiet about the direction they plan to follow - close-quarters survival horror like Doom3, or a shooting fest like the first two games?
Deferred rendering/lighting should work perfectly, as MT would only be written once to the g-buffer and light accumulation wouldn't have to touch it at all.

This would also explain why Carmack stated that they can do three times as much at 30FPS compared to Rage's 60.
Once the base cost of MT has been paid, there should be a lot more time left to do cheap lights.
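A sketch of that frame structure, assuming a conventional deferred pipeline; every type and function name here is an illustrative stand-in, not id Tech 5 API:

#include <vector>

// All names below are illustrative stand-ins, not id Tech 5 API.
struct GBuffer { int albedo, normal, depth; };      // render-target handles
struct Light   { float pos[3]; float radius; };

void ResolveVirtualTexturePages() {}                // stream/upload needed pages
void DrawSceneToGBuffer(GBuffer&) {}                // the only pass sampling the MT
void AccumulateLight(GBuffer&, const Light&) {}     // reads g-buffer only

void RenderFrame(GBuffer& gbuf, const std::vector<Light>& lights) {
    // Geometry pass: virtual-texture pages are resolved and sampled
    // here, once, and the albedo lands in the g-buffer.
    ResolveVirtualTexturePages();
    DrawSceneToGBuffer(gbuf);

    // Lighting passes never touch the megatexture, so once the fixed
    // MT cost is paid, each extra light is comparatively cheap.
    for (const Light& l : lights)
        AccumulateLight(gbuf, l);
}

The design point is that the virtual-texture cost is paid exactly once when the g-buffer is laid down; every light after that reads only the g-buffer.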
 
The larger data set may also take significantly longer to generate, adding to development time if they have to regenerate data often.

I thought it was down to the level of compression they use. Basically all the versions are compressed from the same source; it's like the different levels of JPEG compression you can play around with.

I wonder if they can make Rage+GI work on current console hardware considering how shader heavy it already is. That, along with no day/night cycles and huge storage requirements, might make it tricky to convince people to switch from their own in-house rendering tech, at least for current gen.

I think they lost current gen to UE3. Next gen will be interesting if they indeed go the voxel route. Better hope those Hologram Discs make it to next gen.

Also, does anyone know: if they were to target 1080p instead of 720p with the same size level, would they need to increase the size of the megatexture to keep the texels dense enough for the higher resolution? See the arithmetic below.
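Back-of-envelope, assuming the 720p texel density was already 'just enough': on-screen texel demand scales with pixel count, so the megatexture's linear resolution scales with the square root of that:

1920 x 1080 / (1280 x 720) = 2.25x the pixels
sqrt(2.25) = 1.5x linear resolution per axis
128k x 128k -> about 192k x 192k, i.e. 2.25x the storage

In practice they could ship the same data and just accept slightly blurrier magnification at 1080p.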
 
So, the PDF indicates that the default block size is 128x128, which goes down with mipmapping. Seems like they had to compromise between VRAM use and the number of pages to grab, and thus seek to, on the optical drive, right? With a hard drive or SSD the block size could go down in next-gen tech, decreasing memory requirements...
Although hardware support would also be required, as adding a 4-texel border for anisotropic filtering to every, say, 32x32 block would result in a considerable waste of storage in the megatexture too... Interesting; can't wait for Carmack's speech to learn more.
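The waste is easy to quantify if the border is stored inside the page (my convention here): a b-texel border on each side of an s x s page leaves an (s - 2b) x (s - 2b) usable interior, so the overhead is 1 - ((s - 2b)/s)^2:

128x128 page, 4-texel border: 1 - (120/128)^2 ≈ 12%
32x32 page, 4-texel border: 1 - (24/32)^2 ≈ 44%

which is presumably why pages stay as large as the seek/VRAM trade-off allows unless the filtering hardware can fetch across page edges.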
 