Carmack console stuff begins leaking from QuakeCon

Even though it's likely that deferred renderers need more bandwidth than forward renderers, that doesn't automatically make them slower.

What I said wasn't that it's slower, rather that because of the higher bandwidth requirements it'd be very unlikely that a deferred renderer could run at 720p and 60Hz, even if they got rid of the 2x multisampling...
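
To put rough numbers on that worry, here's a back-of-envelope estimate assuming a purely hypothetical G-buffer layout (four RGBA8 targets plus 32-bit depth, each sample written once in the geometry pass and read once in the lighting pass); none of this is a confirmed layout, it's just to get a feel for the scale:

```cpp
#include <cstdio>

// Back-of-envelope G-buffer bandwidth at 720p/60Hz with 2x MSAA.
// All numbers are assumptions for illustration: a hypothetical layout of
// four 32-bit render targets plus a 32-bit depth buffer, each sample
// written once in the geometry pass and read once in the lighting pass.
int main() {
    const double pixels         = 1280.0 * 720.0;  // 720p
    const double msaa           = 2.0;             // 2x multisampling
    const double bytesPerSample = 4 * 4 + 4;       // 4 RGBA8 targets + D32
    const double fps            = 60.0;

    const double gbufferBytes = pixels * msaa * bytesPerSample;
    const double perFrame     = gbufferBytes * 2.0; // one write + one read
    const double perSecond    = perFrame * fps;

    std::printf("G-buffer size : %.1f MB\n", gbufferBytes / 1e6);
    std::printf("Traffic/frame : %.1f MB\n", perFrame / 1e6);
    std::printf("Traffic/sec   : %.2f GB/s\n", perSecond / 1e9);
    // ~4.4 GB/s before overdraw, texturing, blending or the final
    // framebuffer: a sizeable slice of a console GPU's budget.
    return 0;
}
```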

By the way, how exactly can they implement that? I've read about the edge blur based on running an edge detect on the depth/normal passes and it's quite clever, but I'm sure that GG doesn't count this as MSAA.
 
Well, I'm no game developer and don't know how complicated their "translator" is, but if it indeed removes the burden of architectural limitations from texture artists, then at the very least it needs to convert the arbitrarily high-resolution textures artists create into a form friendly to the target platform.
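
As a minimal sketch of one step such a translator might perform, assuming the job is simply shrinking an oversized texture until it fits a per-platform memory budget (the names and the budget here are made up; a real tool would also generate mips, block-compress, retile, and so on):

```cpp
#include <cstddef>
#include <vector>

// Hypothetical "translator" step: repeatedly box-filter an arbitrarily
// large RGBA8 texture in half until it fits a platform's memory budget.
// Assumes power-of-two dimensions for brevity.
struct Texture {
    std::size_t width = 0, height = 0;
    std::vector<unsigned char> rgba;   // width * height * 4 bytes
};

Texture halve(const Texture& src) {
    Texture dst;
    dst.width  = src.width / 2;
    dst.height = src.height / 2;
    dst.rgba.resize(dst.width * dst.height * 4);
    for (std::size_t y = 0; y < dst.height; ++y)
        for (std::size_t x = 0; x < dst.width; ++x)
            for (std::size_t c = 0; c < 4; ++c) {
                // average the 2x2 source block (box filter)
                auto at = [&](std::size_t yy, std::size_t xx) -> unsigned {
                    return src.rgba[(yy * src.width + xx) * 4 + c];
                };
                unsigned sum = at(2 * y, 2 * x)     + at(2 * y, 2 * x + 1)
                             + at(2 * y + 1, 2 * x) + at(2 * y + 1, 2 * x + 1);
                dst.rgba[(y * dst.width + x) * 4 + c] =
                    static_cast<unsigned char>(sum / 4);
            }
    return dst;
}

// Halve until the texture fits the platform's texture budget.
Texture fitToBudget(Texture t, std::size_t budgetBytes) {
    while (t.rgba.size() > budgetBytes && t.width > 1 && t.height > 1)
        t = halve(t);
    return t;
}
```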

But that just helps address VRAM issues, and I don't think they claim to solve, say, vertex shader problems, so things like geometry may still be difficult to port.

I was thinking perhaps this is where the Naughty Dog genius comes in: coding to facilitate a more optimized "translator" so games perform better. Also, I've never heard of a multiplatform, or even a new single-platform, game being coded by one single coder. At least I wouldn't want to be that guy, even if I were Carmack. So I still think he deserves credit.
 
What I said wasn't that it's slower, rather that because of the higher bandwidth requirements it'd be very unlikely that a deferred renderer could run at 720p and 60Hz, even if they got rid of the 2x multisampling...
Umh... if you're not bandwidth limited, I don't see a reason that should make your deferred renderer automatically slower, as it should be far more efficient in other areas compared to a purely forward renderer.
By the way, how exactly can they implement that? I've read about the edge blur based on running an edge detect on the depth/normal passes and it's quite clever, but I'm sure that GG doesn't count this as MSAA.
I can tell you how I'd do that (rough sketch in code after the EDIT below):

Stupid/simple way: render your G-buffers with MSAA on, perform the lighting pass on each subsample, then downsample the whole image (basically this amounts to multisampling in the geometry pass and supersampling in the lighting pass).

Smarter/complex way: render your G-buffers with MSAA on, run a full-screen pass that computes a per-pixel mask telling you whether all the subsamples belonging to a pixel have the same values or not, use that mask (stencil test...) to shade only one subsample per pixel where all the subsamples match and all of them otherwise, then downsample. (This works well only if you don't have many subsamples per pixel, but since they use 2x MSAA it should be fine to use something like that...)
EDIT: it would be even better if we had a GPU that could return the state of a pixel rendered via multisampling (i.e. compressed or not...)
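
To make the two options concrete, here's a CPU-side sketch with a toy two-sample G-buffer. Everything in it (the GSample layout, the stand-in shade() function, the bit-exact compare playing the role of the stencil mask) is hypothetical illustration, not how any shipping engine does it:

```cpp
#include <algorithm>
#include <array>
#include <cstring>

// Toy per-subsample G-buffer entry: a normal plus albedo.
struct GSample { float nx, ny, nz; float r, g, b; };
struct Color   { float r, g, b; };

Color shade(const GSample& s) {          // stand-in lighting function
    float ndotl = std::max(0.0f, s.nz);  // toy directional light along z
    return { s.r * ndotl, s.g * ndotl, s.b * ndotl };
}

// Simple way: light every subsample, then average. Multisampling in the
// geometry pass, but effectively supersampling in the lighting pass.
Color resolveSimple(const std::array<GSample, 2>& s) {
    Color c0 = shade(s[0]), c1 = shade(s[1]);
    return { (c0.r + c1.r) / 2, (c0.g + c1.g) / 2, (c0.b + c1.b) / 2 };
}

// Smarter way: a bit-exact compare plays the role of the per-pixel mask.
// Interior pixels (both subsamples equal) pay for one shading op; only
// edge pixels pay for both.
Color resolveMasked(const std::array<GSample, 2>& s) {
    if (std::memcmp(&s[0], &s[1], sizeof(GSample)) == 0)
        return shade(s[0]);
    return resolveSimple(s);
}

int main() {
    std::array<GSample, 2> edge = {{ {0, 0, 1, 1, 0, 0},
                                     {0, 0, 0, 0, 0, 0} }};
    Color c = resolveMasked(edge);       // edge pixel: shades both samples
    return (c.r > 0) ? 0 : 1;
}
```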
 
In console gaming, the first-person shooters that have embarrassed PC FPS gaming have been Rare's GoldenEye 007 on N64, Bungie's Halo series and, for a smaller crowd, Rare's Perfect Dark and Free Radical's TimeSplitters. There are other console FPS games, but any attempt by Epic or id Software to make a memorable console appearance has come in weakly ported games with equally weak sales. It wasn't until recently that Epic Games broke away with Gears of War, but then again GoW is not an FPS.

I don't recall any of those games "embarrassing" the PC shooters available at the time, either graphically or gameplay-wise. Sure, they made a big impact with console gamers, mainly because of the lack of decent FPS games available to them, but I think you'll find that while good, they weren't beyond what was already available to PC gamers.
 
Stressed by the fact that he mentioned TimeSplitters in that list, which is a complete POS: 1997 graphics with pre-1997 gameplay, YAY!

Cheers
 
Yeah, that looked odd to me. TimeSplitters? I mean, get real. TS was a total crap game, especially when Half-Life/CS/TF were out two years earlier on PC.
 
You know, it occurs to me that one of the biggest things announced by id here, which people are ignoring, is the whole "Quake in a web browser" thing, which could end up being a bigger deal for Carmack than his high-end engines.

To me it would be like YouTube for games. The potential is enormous, and it's a simple idea that frankly could be revolutionary in the same way YouTube was.
 
When did you move to San Francisco, nAo? Welcome to California!

You know, that's funny. I saw nAo's signature on another forum and it said he works for LucasArts. I thought it was a mistake, but this makes me wonder about it.
Did nAo quit Ninja Theory?

Sorry for the OT.
 
Steve Nix on the id Tech 5 subject:
http://www.eurogamer.net/article.php?article_id=81052

A few snippets:

Eurogamer: How are you going to fit into the current technology pricing model? Do you feel you have to undercut Epic or anything like that?

Steve Nix: I don't think we need to really be concerned with anyone else's pricing, because we believe we have the best technology solution available. However, we have a history of very fair pricing for our technology. For our older technologies the pricing's on the webpage - you can look them up. I think we've got id Tech 4 currently at USD 250,000 against 5 points. I'd verify that actually, before you put it! [id Tech 4 page on idsoftware.com] For all the older stuff, we have the pricing up there.

The only reason we haven't been very public about id Tech 5 pricing is because honestly we haven't developed our final plan there. But we have a history of - like I said - being pretty fair, and I expect that our pricing will be...that we're not going to lose business based on price. If our engine's not quite the right engine for someone, for the type of game they want to make, they can make their decision, but I doubt we're going to lose business just solely focused on price.


Eurogamer: Are you talking to the platform holders about stepping inside the circle of their development tools or anything like that? Making yourselves part of their development offering?

Steve Nix: We've talked about the various middleware, officially approved programmes, and we're having those discussions. Obviously we work closely with Sony, we work closely with Microsoft, we work closely with Apple and we work closely with Intel - and even with AMD and ATI and NVIDIA. We work closely with everyone. As far as the support level and our interaction with those companies, we work very closely with them. All the major players in hardware and OS come regularly to the id offices, and they'll meet with John and talk about their roadmaps, and John will say 'here's where I think you should go'. John was a major player in Apple adopting GL as their rendering solution for the desktop. So John's always talking roadmap with those guys, and we have pretty good relationships.

Whether or not we take the more formal approach of being approved middleware providers, that's something we're talking about, but I don't know to what degree it...it'll help to some degree, but publishers for the most part and developers know who we are, they know we make great technology, and getting a stamp of approval, I'm not sure if that's a tremendous delta honestly.
 