Carmack console stuff begins leaking from Quakecon

The stuff they showed is quite good, though it's not like there's a lot going on on screen at any given moment (and I thought I'd never say anything like that about Carmack's work...)
I'd like to see some direct feed..

Quite right, though I'd say they're still far from an optimized state, so they might be able to up the content a bit... But we'd better not expect armies of medieval soldiers from the Far East, right? ;)
 
Looking at the sales charts and market values of the three vendors, we might have to replace "possible" with "necessary"...
That's a completely different matter; for what it's worth, I'll quit the industry and open a restaurant if people stop caring about technology in videogames :LOL:
 
How is that? Anyone disagreeing with "PS3 is teh king" is automatically not a developer, or what?

And good developers can do good things with subpar hardware too. The results don't necessarily speak for the console.


Personally, I find this philosophy of 'limitations make better game art' quite silly.
If poly count X won't let me model five articulated fingers, then the resulting blocky hand won't look better, and neither will an 8-sided wheel on a car. If our texture artists can't use a specular map, then the metal surfaces won't look different enough from non-reflective surfaces either. Would Polyphony Digital's cars look better without HDR reflection maps? How about Gears without normal maps?

If someone can make good looking stuff from less, then it's usually despite the limitations and not because of them. And the one unable to use the higher potential properly is probably just someone not talented/experienced/disciplined enough... Our texture crew has worked on all our cinematics, obviously, and they've done character textures for two AA games already (one is mentioned here on B3D quite often), so there are examples.

Then again, I'm no coder, just an artist... (and I kinda hate the English language for not making a distinction between someone who makes stuff for a living and someone who makes fine art). It's just that I feel Fran's opinion is kinda getting suppressed here.

Now am I right in saying you are not a developer but an artist?
 
Are you kidding? Since when are artists not part of the game development team?

Since when have they? ..........

I kid, I kid. Artists are valued members of any game dev team. I love them so much. I wish game teams only had artists in them. And they smell so much better than those stupid designer types.... Jerks!
 
KZ - there's no way they can run a deferred renderer at 60fps...

Could you clarify this for me? Is there something inherent to a deferred renderer that means it takes at least 16ms (on current hardware) or did you mean just for this game :?:

Is it that hard to give id some very well-deserved credit for this? Especially after all the Carmack bashing, how he's just a whining PC developer who won't survive on consoles... And now, it seems, he's just got rid of texture memory limitations, and on all platforms, once again changing something big in the industry.

From what I can tell, id Tech 5 sounds quite impressive to me. I don't have any experience with UE3, so I can't compare how "easy" multiplatform development is, but I get the impression that it's a much smoother transition with what id has developed. I'd love to hear from other developers who will evaluate id Tech 5 in the future. I'd presume that id is going to try to market this to folks at EA at least...

I really liked the UI demo too; it seemed quite user-friendly for creating maps from what I could see - intuitive and efficient? I wonder if they will include a similar set of tools with their games, as Epic normally does with their PC games for making maps and mods. I'd love to get my hands on it just to try it out. :)
 
Do you have evidence that points to the contrary? I know for certain that you are not a developer, and there are many developers out there who say so. So put yourself in my position: who would you rather believe? The many developers who suggest and show what the PS3 is capable of (Lair, GT5, MGS4, Drake's Fortune, Ratchet, and do I need to mention Killzone 2? And we must all remember these are first-generation PS3 titles), or an artist on a forum?

I don't know, really; maybe your opinion differs from mine and those games I mentioned are not technologically impressive to you, and therefore you draw the conclusion that the PS3 isn't all that much of a technological wonder.

I have to agree with Dantruon here. I mean, since when can a "licensed engine" designed to make porting games easier compare to a game that is specifically coded for a single console?

Further, since when can a "3rd party licensed multi-console porting engine" be any kind of credible indication or proof that the Sony PlayStation 3 console is somehow technologically inferior, and if so, where is the proof?

I'm sure the Cell has quite some muscle to show, but all this hype and talk reminds me of the EE hype and talk, and well...

Console hype has always been around as far as I can remember, ever since NEC called their console the TurboGrafx-16 and Sega's Genesis sported a "High Definition 16-bit Graphics" logo on the console (I won't say anything about SNK); later, Nintendo's SNES just kicked it up a notch with all of the crazy claims about Mode 7...

I do know that some of the Emotion Engine's hype was pretty crazy but not to change the subject...

As a console application, did the EE fail to deliver in its gaming hype?

id Tech 5 has been running on multi-core from day one, which would be at least two years ago if not more (i.e. after Doom 3's release three years ago). UE3 is a PC engine that had to be fitted onto the consoles and multi-core, whereas id Tech 5 has always been designed for cross platform and multi-core. So no, I don't think Epic has ever really been ahead as far as multi-core.

How many Id Software developed games have been released on consoles as of this year 2007 compared to Epic Games?

Although I am aware of Quake 4 being on Xbox 360, we pretty much know that game was never really programmed to require more than one processor (even taking the dual-core patch into account). Staying on that platform, there's last year's Gears of War from Epic Games; even though we don't have any proof other than the developers' word on multi-core usage, or even which Unreal Engine 3 version was actually used in that game, it's safe to say that Epic Games has the lead over id so far.

id Tech 5's presentation, though, could be a major blow to Epic Games, in the sense that as far as we know Gears of War did not actually use the same UE3 code that was sold to the other game devs who licensed the engine, especially those who complained about it (Silicon Knights, anyone?). If id Tech 5 proves attractive to the devs interested in that engine, then Epic will have a major challenger. However, we can agree that both engines are fundamentally different, in that Epic has embraced DirectX and quite possibly multi-core, while id is only starting to embrace multi-core and their engine is based on OpenGL ideologies.

Is it that hard to give id some very well-deserved credit for this? Especially after all the Carmack bashing, how he's just a whining PC developer who won't survive on consoles....

Sorry to have to tell you this, but it is hard to give id and John Carmack any well-deserved credit for making a 3rd party licensed game engine that is able to run on consoles.

id/John Carmack's last hurrahs were with PC games in the '90s, with Quake II and Quake III being the most successful of the PC game engines, and those offered barely average console ports that the console gamer will hardly care to remember.

When you mention id/JC's name, you are talking about first-person shooters, and as much fun as I had playing Doom 3/RoE and Quake 4 on my PC, they are minor game experiences once you remove any fanboyism and realize how stiff the competition got and how little id/JC offered as far as 3D graphics. Things would be different, however, if Doom 3 had come out back in late 2002.

In console gaming, the first-person shooters that have embarrassed PC FPS gaming have been the Rare-developed GoldenEye 007 on N64, Bungie's Halo series and, for a smaller crowd, Rare's Perfect Dark and Free Radical's TimeSplitters. There are other console FPS games, but any attempt made by Epic or id Software to make a memorable console appearance has come in weakly ported games with equally weak sales. It wasn't until recently that Epic Games broke away with Gears of War, but then again GoW is not an FPS.

And now, it seems, he's just got rid of texture memory limitations, and on all platforms, once again changing something big in the industry.

Again, remember his engine is just a 3rd party licensed engine that will greatly depend on a game dev willing to use it for the genre that will benefit from it most. As far as we know, RAGE may end up being a high-resolution-texture, Halo/Half-Life 2-like game, as in most likely another first-person shooter or possibly a third-person game. At this point it's hard to imagine this engine being used for anything else, and if John Carmack is willing to break any alleged NDAs by calling memory limits on the PS3 a problematic issue, it's highly possible that the 3rd party devs using this engine will run into similar problems on that console as well. And by no means did he magically get rid of texture memory limitations, BTW.

The big factor this time is that, because of the multi-core nature of the Xbox 360 and PS3, and because his company had to hire a former Naughty Dog programmer, John Carmack cannot take all the credit for making the game as he has in the past. It's not expected of him either, but this is a fact.

I personally think that id Tech 5 is a great idea for id Software to get their games ported to consoles, but that's about it. If I want any real innovation I will go after a game that is made from scratch for a single console, but that's just me. So far, the devs that have taken that step, on PS3 at least, have a major foot in the door that will turn into leaps later, while a 3rd party engine will stay very much the same or will need major rewriting that takes more time to make something that stands out.

From what I have seen, though, I really hate to think of id Tech 5 as just a level editor.
 
Could you clarify this for me? Is there something inherent to a deferred renderer that means it takes at least 16ms (on current hardware) or did you mean just for this game :?:

They're moving around several times as much data for the G-buffer as a normal renderer does, and they're reading/writing it a lot for the lighting calculations. Read DeanoC's article here on B3D to learn more about it.
So the renderer is probably taxing the memory bandwidth of the PS3 already; doubling this data traffic is definitely out of the question.
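
Just to put rough numbers on it (everything below is an assumption for illustration - made-up buffer sizes and light coverage, not anything Guerrilla has disclosed):

Code:
#include <stdio.h>

int main(void)
{
    /* Assumed figures for illustration only -- not Killzone 2's actual
       G-buffer layout, light count or coverage. */
    const double width = 1280.0, height = 720.0;
    const double msaa = 2.0;            /* 2x multisampling                  */
    const double gbuf_bytes = 16.0;     /* e.g. four RGBA8 render targets    */
    const double lights = 100.0;        /* "up to hundreds of lights"        */
    const double coverage = 0.10;       /* avg. screen fraction per light    */
    const double fps = 30.0;

    /* Geometry pass: fill the G-buffer once. */
    double gbuffer_write = width * height * msaa * gbuf_bytes;

    /* Lighting: each light reads the G-buffer for the samples it covers and
       read-modify-writes an accumulation buffer (assume 8 bytes/sample). */
    double lighting = lights * width * height * msaa * coverage
                      * (gbuf_bytes + 8.0);

    double per_frame = gbuffer_write + lighting;
    printf("~%.0f MB per frame, ~%.1f GB/s at %.0f fps\n",
           per_frame / (1024.0 * 1024.0),
           per_frame * fps / (1024.0 * 1024.0 * 1024.0), fps);
    return 0;
}

With those made-up numbers you land somewhere around 13 GB/s at 30fps for the framebuffer traffic alone; double it for 60fps and you're already brushing against the RSX's ~22 GB/s to its local memory before anything else in the frame gets a byte.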
 
LaaYosh said:
doubling this data traffic is definitely out of the question.
Unless you have some proof that KZ is spending some grotesque amount of time on attribute filling (in which case, I'd have to wonder wtf they were thinking going deferred in the first place), all this obsession over bandwidth is just speculation.

I'll grant you that the % penalty from attribute filling might be more tolerable at 30fps (but that goes regardless of whether that part is bandwidth-limited or not), but without knowing those costs, making generalizations about framerate is reaching.
Now if we were talking about deferred shader on PS2, I'd very much agree with you. :cool:
 
I think it is disingenuous to say he eliminated texture memory limitations. You're still constrained by streaming bandwidth and latency, and the total available texture memory will have obvious impacts on many things, such as view distance modulo LOD, as well as gameplay features like split screen, multi-camera views/spectating, etc.

When you have predictable scene changes, navigating through virtually infinite-sized tiled textures is trivial, because for any given scene, and a small delta, the amount of texels you need to fetch is explicitly bounded. That's why Microsoft's SeaDragon works so smoothly.
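
To put the "explicitly bounded" part in code terms (a toy sketch with invented page sizes, not a claim about id's actual implementation): no matter how huge the virtual texture is, a frame can only ever ask for the pages its visible pixels map to, and a small camera delta only adds a handful of new ones.

Code:
#include <stdio.h>

/* Toy virtual-texture page lookup; every constant here is invented. */
#define PAGE 128  /* texels per page side */

typedef struct { int x, y, mip; } PageId;

/* Given a texel coordinate (u, v in 0..1) in the big virtual texture and the
   mip level the screen-space derivatives ask for, this is the page needed. */
static PageId page_for_texel(double u, double v, int mip,
                             long virtual_w, long virtual_h)
{
    long w = virtual_w >> mip, h = virtual_h >> mip;
    PageId p;
    p.x = (int)(u * w) / PAGE;
    p.y = (int)(v * h) / PAGE;
    p.mip = mip;
    return p;
}

int main(void)
{
    /* A 128K x 128K virtual texture is ~64 GB of texels at 4 bytes each,
       yet one 720p frame can only reference on the order of
       (1280*720)/(128*128) ~= 56 unique pages per mip level it touches. */
    long vw = 131072, vh = 131072;
    PageId p = page_for_texel(0.37, 0.81, 3, vw, vh);
    printf("need page (%d, %d) at mip %d\n", p.x, p.y, p.mip);
    return 0;
}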

But that doesn't mean texture memory budgets can be forgotten about.
 
How many Id Software developed games have been released on consoles as of this year 2007 compared to Epic Games?

What does that matter? As Carmack pointed out a couple of years ago, it's actually easier for them to develop on the console than on a PC because the hardware/drivers are static. But in any case, Carmack worked on several id Software console ports before Epic had even released Unreal, so they're still ahead. ;)

Although I am aware of Quake 4 being on Xbox 360, we pretty much know that game was never really programmed to require more than one processor (even taking the dual-core patch into account). Staying on that platform, there's last year's Gears of War from Epic Games; even though we don't have any proof other than the developers' word on multi-core usage, or even which Unreal Engine 3 version was actually used in that game, it's safe to say that Epic Games has the lead over id so far.

But UE3 was designed for PCs, and some of their fundamental decisions ended up being ill-suited for consoles...that will always haunt them. The developers themselves admitted as much. At the same time at id Software they were working on id Tech 5 with the consoles in mind right from the start.

If id Tech 5 proves attractive to the devs interested in that engine, then Epic will have a major challenger. However, we can agree that both engines are fundamentally different, in that Epic has embraced DirectX and quite possibly multi-core, while id is only starting to embrace multi-core and their engine is based on OpenGL ideologies.

id Tech 5 was originally developed on the 360 with multi-core and DirectX at the forefront. The whole DirectX/OpenGL thing is apparently pretty arbitrary to them at this point, but if you insist on bringing it up, it's a fact that DirectX was the primary API Carmack used originally (it caused quite a stir at the time). As for multi-core, I clearly recall Epic showing Gears on the 360 at E3 2005 running on a single thread. So, a) the engine wasn't designed for multi-core to begin with, and b) they haven't been working on it longer than id Software...if anything, id Tech 5 has probably been running on multi-core longer, since it's been there since day one.


Now, I don't want this to turn into some kind of pissing match, I have a huge amount of respect for Epic...I consider them to be among the very best of developers. However, the assertions people are making here about Epic being far ahead of id Software are absurd...they have done quite a bit of PR on the whole console/multi-core thing, but the fact remains that the engine came into that scene late and actual results haven't been keeping up with the PR. This shouldn't be any surprise.
 
Reading about how their (relative) cross-platform ease is being challenged, I wonder how they are implementing that. You make a game in id Tech 5, pick what platforms you want, and whoosh, there you go (I think that's what Carmack said, admittedly dumbing it down for ease of understanding, but anyhow). Around the same time I read this, I was tipped off about this tech on the Mac called LLVM.

http://en.wikipedia.org/wiki/LLVM

Do they use something similar to give devs the same result on every platform? Or do they use some complicated translator thing?

I'm no coder, but eager to learn. Please spare my life... :oops:
 
Reading about how their (relative) cross-platform ease is being challenged, I wonder how they are implementing that. You make a game in id Tech 5, pick what platforms you want, and whoosh, there you go (I think that's what Carmack said, admittedly dumbing it down for ease of understanding, but anyhow). Around the same time I read this, I was tipped off about this tech on the Mac called LLVM.

http://en.wikipedia.org/wiki/LLVM

Do they use something similar to give devs the same result on every platform?

According to the Wikipedia page you linked, LLVM is just a platform- (and language-) independent optimization infrastructure which does something similar to the optimization transformations GCC applies to its internal language before going to target machine language.

But to get the same results (including performance) you need to optimize your code and data in a platform-dependent way as well, which Tech 5 or any other multiplatform engine needs to do.
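
If it helps, the idea from that article boils down to something like this (the commands are the generic LLVM tools, not anything id has said they use):

Code:
/* dot.c -- a tiny function, just to show where the platform-independent
   part of an LLVM-style pipeline stops. */
float dot3(const float a[3], const float b[3])
{
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

/*
 * Front end plus target-independent optimizations (the same IR everywhere):
 *     llvm-gcc -O2 -S -emit-llvm dot.c -o dot.ll
 *
 * Lowering to a specific target is a separate step:
 *     llc -march=x86 dot.ll -o dot_x86.s   (another -march for another CPU)
 *
 * Everything beyond that point -- SIMD intrinsics, memory layout, feeding
 * the SPUs, and so on -- is still platform-specific work the engine itself
 * has to do.
 */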
Or do they use some complicated translator thing?
Well, I am no game developer and don't know how complicated their "translator" is, but if it indeed removes the burden of architectural limitations from texture artists, it at least needs to convert the arbitrarily high-resolution textures created by artists into a target-platform-friendly form.

But that just helps address VRAM issues, and I don't think they claim to solve, say, vertex shader problems, hence things like geometry may still be difficult to port.
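
If I had to guess at what such a conversion step looks like (page size and layout invented here, purely a sketch), it would be something like chopping whatever resolution the artists painted into fixed-size pages per mip level, so the runtime only ever streams the pages a frame actually requests:

Code:
#include <stdio.h>

/* Sketch of an offline "make it platform-friendly" pass: slice an
   arbitrarily large source texture into fixed-size pages per mip level.
   The 128-texel page size is an invented example. */
#define PAGE 128

static long pages_for_level(long w, long h)
{
    return ((w + PAGE - 1) / PAGE) * ((h + PAGE - 1) / PAGE);
}

int main(void)
{
    long w = 65536, h = 65536;   /* whatever resolution the artists used */
    long total = 0;
    int mip;

    for (mip = 0; (w >> mip) >= 1 && (h >> mip) >= 1; ++mip)
        total += pages_for_level(w >> mip, h >> mip);

    /* VRAM then only has to hold a fixed-size cache of these pages,
       regardless of how big the source was. */
    printf("%ld pages of %dx%d texels across the full mip chain\n",
           total, PAGE, PAGE);
    return 0;
}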
 
Unless you have some proof that KZ is spending some grotesque amount of time on attribute filling (in which case, I'd have to wonder wtf they were thinking going deferred in the first place), all this obsession over bandwidth is just speculation.

Actually I'm quite interested in your opinion about this...
Based on DeanoC's article, a PDF linked in the KZ topic, and what I know about lighting in post for offline CG using compositing packages, I've come to the conclusion that framebuffer traffic for a deferred renderer should be significantly higher than usual. Each light pass reads and writes a lot of information too, and they've been talking about up to hundreds of lights; and then there's 2xMSAA on top of that. Bandwidth on the PS3 is limited, even if not as much as it seemed at first sight. Doubling all this traffic should be more than enough to stall the system, IMHO.
 
Actually I'm quite interested in your opinion about this...
Based on DeanoC's article, a PDF linked in the KZ topic, and what I know about lighting in post for offline CG using compositing packages, I've come to the conclusion that framebuffer traffic for a deferred renderer should be significantly higher than usual. Each light pass reads and writes a lot of information too, and they've been talking about up to hundreds of lights; and then there's 2xMSAA on top of that. Bandwidth on the PS3 is limited, even if not as much as it seemed at first sight. Doubling all this traffic should be more than enough to stall the system, IMHO.
The geometry pass certainly uses a lot of bandwidth, but it should also save a lot of bandwidth in the lighting pass, so even though it's likely that deferred renderers need more bandwidth than forward renderers, this doesn't automatically make them slower.
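
A crude way to see that trade-off (every number below is invented, and this ignores MSAA, overdraw and the geometry resubmission a multipass forward renderer would also pay):

Code:
#include <stdio.h>

int main(void)
{
    /* Invented figures, only to show the shape of the trade-off. */
    const double px = 1280.0 * 720.0;
    const double lights = 100.0;
    const double coverage = 0.10;    /* avg fraction of screen per light  */
    const double mat_bytes = 24.0;   /* material fetches per shaded pixel */
    const double gbuf_bytes = 16.0;  /* G-buffer bytes per pixel          */

    /* Multipass forward: every lit pixel re-fetches its materials once per
       light that touches it. */
    double forward = lights * coverage * px * mat_bytes;

    /* Deferred: write the G-buffer once, then each light only reads it back
       for the pixels it covers. */
    double deferred = px * gbuf_bytes + lights * coverage * px * gbuf_bytes;

    printf("forward ~%.0f MB, deferred ~%.0f MB of traffic per frame\n",
           forward / 1048576.0, deferred / 1048576.0);
    return 0;
}

So depending on how fat the G-buffer is and how many pixels each light actually touches, the up-front geometry-pass cost can be paid back during lighting, which is exactly the point: more raw traffic up front doesn't automatically mean slower overall.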
 