RAGE: It Deserves its own thread now!

Every version starts with the same base data and then is compressed to suit each host's medium in a process that takes a couple of days.

- Carmack says they have the 'option' of providing a higher-quality dataset with less compression for PS3, if they don't mind doing that process separately for PS3.

Any plausible reason why they wouldn't use that option ?
Seems easy enough.

And I suppose the "super platinum" PC edition megatextures would look even better? With a 60GB install, that's probably less compression than they'd need to use to fit it on a double-layer BD? Maybe they'll release a BD edition of the PC version.
 
Thanks for the link, liolio. The conclusion ties in with a lot of what was being said in the early days of this generation. Cell forces alternative programming models, but the whole world was going that way so may as well make a start now!

Yep! id must have gone through hell breaking their engine into more and more SPUlets. It would be nice if they could explain how they overcame the SPU limits when processing transparency. Managing the high-quality textures between the split memory pools and avoiding optical disc latency should make for very insightful reading too. The slides didn't say anything about duplicating assets on the optical disc to minimize seek time. No mention of HDD use either!

And good news for the gameplay - 60fps and no tolerance for even a single frame of lag!

Wondering what's next for them besides porting to CUDA, OpenCL and LRB.


Ha ha, too early to call. Or rather, we won't know until it's launched.
 
Any plausible reason why they wouldn't use that option ?

A larger data set might have performance implications. The data gets spaced further apart on disc, implying longer disc seeks (which could introduce latency issues and hence more visual pop). It also implies using a dual-layer disc, which has a performance hit when there is a layer change. Presumably a dual-layer Blu-ray costs more as well, which affects their bottom line. The larger data set may also take significantly longer to generate, adding to development time if they have to regenerate data often. More data to load means more data to process CPU/GPU-side too. Etc...
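
As a rough back-of-the-envelope illustration of why seeks matter at 60fps (the seek figures below are generic assumptions, not measurements of either console's drive):

Code:
#include <cstdio>

int main() {
    // Assumed ballpark numbers, not measurements.
    const double frame_ms     = 1000.0 / 60.0; // ~16.7 ms of budget per frame at 60fps
    const double optical_seek = 100.0;         // assume an optical random seek ~100 ms
    const double hdd_seek     = 10.0;          // assume an HDD random seek ~10 ms

    std::printf("frames of budget lost per optical seek: %.1f\n", optical_seek / frame_ms);
    std::printf("frames of budget lost per HDD seek:     %.1f\n", hdd_seek / frame_ms);
    // A texture page that arrives several frames late shows up as visible pop-in,
    // which is why spreading the data further apart on disc can hurt.
    return 0;
}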


keyn said:
- We'll get a cropped version because of X360's storage limitations

...or I guess we can just assume that the 360 ruined video games for everyone :)


Thanks for the link, liolio. The conclusion ties in with a lot of what was being said in the early days of this generation. Cell forces alternative programming models, but the whole world was going that way so may as well make a start now!

Strictly speaking, multi-core coding forces alternative programming models. Cell forces both alternative programming models and alternative data models. It's the data side that more often than not causes significant grief.
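
To illustrate the data-model point with a generic (non-id, non-PS3-specific) sketch: SPUs want to DMA a contiguous block containing exactly the fields a job needs, so the usual pointer-heavy array-of-structures layout tends to get rearranged into flat structure-of-arrays buffers before the code itself is even touched.

Code:
#include <cstddef>
#include <vector>

// Typical PPU/PC-friendly layout: convenient to write, awkward to stream
// piecemeal into an SPU's 256 KB local store.
struct Particle {
    float px, py, pz;  // position
    float vx, vy, vz;  // velocity
    void* material;    // cold data the update job never touches
};

// SPU-friendly rearrangement: hot fields packed into flat, contiguous arrays
// that can be DMA'd in fixed-size chunks and processed without chasing pointers.
struct ParticleSoA {
    std::vector<float> px, py, pz;
    std::vector<float> vx, vy, vz;
};

// The same update written against the flat layout; on Cell each chunk of these
// arrays would be DMA'd into local store, integrated, and DMA'd back out.
void integrate(ParticleSoA& p, float dt) {
    for (std::size_t i = 0; i < p.px.size(); ++i) {
        p.px[i] += p.vx[i] * dt;
        p.py[i] += p.vy[i] * dt;
        p.pz[i] += p.vz[i] * dt;
    }
}

int main() {
    ParticleSoA particles;
    particles.px = {0.f}; particles.py = {0.f}; particles.pz = {0.f};
    particles.vx = {1.f}; particles.vy = {0.f}; particles.vz = {0.f};
    integrate(particles, 1.0f / 60.0f); // one 60fps step
    return 0;
}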


Richard said:
JC mentioned how the game runs much slower if they don't bake the light in.

I wonder if they can make Rage+gi work on current console hardware considering how shader heavy it already is. That along with no day/night cycles and huge storage requirements might make it tricky to convince people to switch from their own in house rendering tech, at least for current gen.
 
I wonder if they can make Rage+gi work on current console hardware considering how shader heavy it already is. That along with no day/night cycles and huge storage requirements might make it tricky to convince people to switch from their own in house rendering tech, at least for current gen.

Indeed. The storage issue causes problems for Oblivion, GTA and other free-roam type games that target the Xbox. As for GI: right now the Xbox seems to have fps to spare, and so will the PS3 once it gets up to 60fps. Drop the render rez to 600p, or whatever the latest CoD is supposed to be doing, and I think it will be fine. Perhaps that's why the next DOOM is supposed to go back to a 30fps goal? The year id Tech 5 was announced, JC wondered whether the PC version would have GI instead of baked lighting, at least indoors; there's been no news on that front since then.

Another problem not many are talking about is that MT kills mods, or at least custom maps. This won't be a huge issue for most licensees of course, but it's another nail in the coffin. The baked-in static shadows also mean a visible mismatch with the dynamic shadows.

The biggest problem I see is that while id says MT doesn't necessarily increase development time (for artists), that it makes life easier because there's no longer a texture budget, and that their id Studio tools are great for streamlining the pipeline, what we/licensees actually see is that RAGE will be released five years after their last game; and that's already taking into account the year they spent on the cancelled Darkness game.

I believe it's telling that back in 2007 they held a few closed-door showings at E3 for potential licensees and said in interviews that they were going to pursue licensing a lot more, yet in the interviews they've given this year they tell us that licensing isn't their focus, there are no licensing announcements, etc. etc.
 
So this means:

- Rage will not take advantage of every platform,
- We'll get a cropped version because of X360's storage limitations,

and confirms Tim Willits' year-old comments [ http://www.1up.com/do/newsStory?cId=3169963 ]?
Naah, you'll get a stuttery version because your GPU is pants... :)

Seriously though, I suspect they'll try to take advantage of the strengths of each platform, although if they have texture pop issues reading from DVD, they're going to be even worse reading a larger, less-compressed data set from BD.
 
Full dynamic lighting is certainly possible; they may also be able to store some global-illumination-related information in addition to the color texture. Although I do wonder how much fragment processing power a deferred renderer would require, as virtual texturing is already quite heavy on the fragment side.

I guess we'll see with Doom 4, although id is pretty quiet about the direction they plan to follow - close-quarters survival-horror style like Doom 3, or a shooting fest like the first two games?
Deferred rendering/lighting should work perfectly, as the MT would only be written once to the g-buffer and the light accumulation wouldn't have to touch it at all.

This would also explain why Carmack stated that they can do three times as much at 30fps compared to Rage's 60.
Once the base cost of the MT has been paid, there should be a lot more time left over to just do cheap lights.
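
A minimal CPU-side sketch of that structure (a generic deferred-renderer outline with made-up names - not id's actual code): the virtual-texture sampling cost is paid once while filling the g-buffer, and the light accumulation passes only ever read the g-buffer afterwards.

Code:
#include <vector>

// Placeholder types and stubs; everything here is illustrative, not an engine API.
struct GBuffer { /* albedo, normal, depth, baked terms, etc. */ };
struct Light   { float x, y, z, radius; };

void FillGBufferWithVirtualTexture(GBuffer&) { /* geometry pass stub */ }
void AccumulateLight(const GBuffer&, const Light&) { /* one light's additive pass stub */ }

void RenderFrame(const std::vector<Light>& lights, GBuffer& gbuf) {
    // 1. Geometry pass: the only place the megatexture / virtual texture is sampled.
    //    Its output is written into the g-buffer exactly once per frame.
    FillGBufferWithVirtualTexture(gbuf);

    // 2. Lighting passes: each light reads the g-buffer and accumulates into the
    //    frame buffer with no further virtual-texture traffic, which is why piling
    //    on lights stays comparatively cheap once the base MT cost is paid.
    for (const Light& l : lights)
        AccumulateLight(gbuf, l);

    // 3. Transparent / forward-rendered surfaces would still need their own lookups.
}

int main() {
    GBuffer gbuf;
    std::vector<Light> lights(3); // a handful of cheap lights
    RenderFrame(lights, gbuf);
    return 0;
}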
 
The larger data set may also take significantly longer to generate, adding to development time if they have to regenerate data often.

I thought it was the level of compression they're using. Basically all of them are compressed from the same source. It's like different levels of JPEG compression that you can play around with.
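
Purely as an illustration of that idea (all names and quality numbers here are made up, not id's pipeline), the per-platform step could be as simple as running the same master page set through a lossy encoder at different quality settings:

Code:
#include <cstdio>

// Hypothetical per-platform targets with JPEG-style quality knobs.
struct Target { const char* name; int quality; };

// Stand-in for the real transcode; just reports what would happen.
void TranscodePage(int page, const Target& t) {
    std::printf("page %d -> %s at quality %d\n", page, t.name, t.quality);
}

int main() {
    const Target targets[] = { {"dvd", 40}, {"bluray", 60}, {"pc_install", 75} };
    const int pageCount = 4; // stand-in for the full megatexture page set
    for (const Target& t : targets)
        for (int p = 0; p < pageCount; ++p)
            TranscodePage(p, t); // same source pages, different bit rate per platform
    return 0;
}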

I wonder if they can make Rage+gi work on current console hardware considering how shader heavy it already is. That along with no day/night cycles and huge storage requirements might make it tricky to convince people to switch from their own in house rendering tech, at least for current gen.

I think they lost current gen to UE3. Next gen will be interesting if they indeed go the voxel route. Better hope those Hologram Discs make it to next gen.

Also, does anyone know whether, if they were to target 1080p instead of 720p for the same size of level, they would need to increase the size of the megatexture to keep the texel density high enough for the resolution increase?
 
http://www.xhardware.com.br/index.p...to-da-siggraph-2009&catid=47:games&Itemid=100
Smoking hot screenshots! I assume these are from the presentation, no? Must say they look drop-dead photorealistic.

Dunno, it looks very disappointing versus their previous shots, and much like HL2-esque graphics with AA and AF to polish it up. IMO the lighting has taken a huge hit, and the contrast between foreground and skybox removes a lot of the sense of depth. Also very like Borderlands' art, or vice versa. And max 4xAF is like a jump back in time... 2002/2003 all over again... sigh.

EDIT: Found the latest images at fullsize.
http://www.nvnews.net/vbulletin/showpost.php?p=2062792&postcount=9
 
Also very like Borderlands' art, or vice versa.

I dunno why I'm saying this, but Borderlands was the first thing that came to mind when I looked at the first screen on that page. The character model strangely looks like it's from Borderlands.
 
Really beautiful art direction (lighting, color palette, level & NPC design) and impressive texture density.
Waiting for some footage, direct feed hopefully.
 
Have to say, those images in the presentation look fantastic. I'm assuming high high high high end PC.

In a recent post on Shacknews, Brian of id had this to say about running the game on a PC:

binaryc said:
A GTX 280 with any recent Core 2 duo should run it just fine.


CNCAddict said:
To me it looks like the skybox is a 2D texture? I sure hope that's not the case.

From the same comments, Brian had this to say...

binaryc said:
Dynamic skies are on the todo list, but they haven't been done yet.
 
So, the PDF indicates that the default block size is 128x128, which goes down with MIP mapping. Seems like they had to make a compromise between VRAM use and the number of pages to grab from, and thus seek to, on the optical drive, right? With a hard drive or SSD this could go down on the next-gen tech, thus decreasing memory requirements...
Although hardware support would probably also be required, as adding a 4-texel border for anisotropic filtering to every, say, 32x32 block would result in considerable wasted storage in the megatexture too... Interesting; can't wait for Carmack's speech to learn more.
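
A quick back-of-the-envelope on that border overhead (assuming the 4-texel border is added on every side of each square page, which is my reading rather than something the PDF states):

Code:
#include <cstdio>

// Extra storage from padding a square page with a filter border on each side.
double borderOverhead(int pageSize, int border) {
    const double padded = double(pageSize + 2 * border);
    return padded * padded / (double(pageSize) * pageSize) - 1.0;
}

int main() {
    std::printf("128x128 page, 4-texel border: %.0f%% extra storage\n",
                borderOverhead(128, 4) * 100.0); // ~13%
    std::printf(" 32x32  page, 4-texel border: %.0f%% extra storage\n",
                borderOverhead(32, 4) * 100.0);  // ~56%
    return 0;
}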
 
Doesn't really seem to be that much new info there, apart from word about letting go of the Naughty Dog programmer and bringing their own people up to speed on PS3 programming... He's probably holding back the more interesting stuff for QuakeCon.
 
Well, Xenos (not including the eDRAM die) isn't too far off from the R600 architecture, is it? There's a disparity in the stream processor cluster scheme, but it's still somewhat comparable?

Far more so than RSX is like G80 (a 3+ year old PC GPU) and beyond. That's why I'm not really getting the comments about id struggling with the PC tech while the console tech is doing fine.

It's more like the modern tech (relatively speaking, in Xenos's terms) is doing fine while the old tech is struggling. Pretty obvious when you think about it!
 
In a recent post on Shacknews, Brian of id had this to say about running the game on a PC:

"A GTX 280 with any recent Core 2 duo should run it just fine."

That kinda goes without saying. The real question should be whether something along the lines of a 9600GT will handle the game at 1080p/60, because that's the level of performance they should be targeting on the PC for that level of hardware.

GTX 280 performance in a 60fps console game should never even be considered. Even at 1080p with bags of AA/AF.
 