I would say that he has grown into a role that is more than just a technical junkie.
What role is that? The reason we listen to Carmack is because he's a technical junkie, not because of say, his game-making expertise.
He's the technical director of a company. Part of his role is to deliver not only technology, but to deliver it within budget and on time. That's his role. The reason you/we listen to him is because (you're right) he knows his stuff.
I find it very odd that many people here call him lazy or say he prefers things "as he knows them", when most people here don't know him personally and yet judge him. id is not a large development house; to call him lazy or whatnot just because he has to work within the parameters of his company and its shareholders... I just don't know...
Yeah, maybe he's lazy or living in the past... but when making that claim, please provide irrefutable evidence. Otherwise we could just as well call every developer lazy because we're not writing everything in ASM.
How did you come to that conclusion? It could be that, given X time/budget, he'd rather work on a system that will deliver enough whiz-bang ooh-aah for the consumers.
Just because we did ASM in the past doesn't mean we don't want higher-level languages today. Why do we use XML, relational databases, OOP? We want to reduce development cost, because at the end of the day it's a business that has to make money, not something done for the cool factor.
I understood that from your original post, but the point is that if there was no PS3, it would be deemed that such optimization wouldn't be worth it.
Making a game run as fast as it can on a console is not the same as making a game as good as it can be with a given budget. They're only partially correlated optimization goals.
Well, from the things I hear regarding programming on the PS3, the Cell doesn't sound as efficient as was probably expected by Sony, considering their long investment in the chip.
Well, I kinda have to disagree here. Without handing out too much Kool-Aid, I think it's safe to say that we're seeing a pretty painful transition to a more async-job-oriented programming model. This is not really something that is limited to the PS3; we see a lot of it on 360 as well. What is holding us back here is mostly the PC, where you simply cannot assume that you have a couple of cores just yet. This relates to the "PC guys moving over to consoles" situation, which gives you tons of straight single-threaded code that is being ported to the consoles. Complete rewrites are simply not that much in fashion right now.
Anyway, back to topic. If you want to stay competitive (among AAA titles, at least), you will need to get as much processing power out of your consoles as possible. This is not only true for graphics.
To get good performance out of both Xenon and Cell, however, you can't really do PC-like pointer-chasing, or do multiple passes on big sets of data. You need to have chunks that fit into cache, and you need stream operations to work on these chunks. Basically, you turn your CPU cores into SPUs, with the cache acting as local store. This is something you cannot really get around on PS3, while you *can* get relatively far on 360 with the PC-like model. On the other hand, considering 360 memory bandwidth, I don't think you really want to fetch cache lines more often than needed.
So yes, right now, it seems that PS3 forces a programming model on you that seems unnecessarily different from what you are used to. But I'm pretty sure it's a model we will see a lot more of.
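The cache-as-local-store idea above can be sketched in a few lines. The function name and the 16 KB chunk size are made up for this example (SPU local store is 256 KB; the right chunk size depends on the workload), but the pattern is the one described: walk the data in chunks small enough to stay resident, and stream over each chunk.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Illustrative chunk size, chosen to fit comfortably in cache.
constexpr std::size_t kChunkBytes = 16 * 1024;

// Process a big array chunk-by-chunk (SPU local-store style) instead of
// pointer-chasing across the whole heap.
void ScaleAll(std::vector<float>& data, float factor) {
    constexpr std::size_t kChunkElems = kChunkBytes / sizeof(float);
    for (std::size_t base = 0; base < data.size(); base += kChunkElems) {
        const std::size_t end = std::min(base + kChunkElems, data.size());
        // On a real SPU this chunk would be DMA'd into local store first;
        // on Xenon/x86 the same contiguous layout keeps it hot in L1/L2.
        for (std::size_t i = base; i < end; ++i)
            data[i] *= factor;
    }
}
```

The same loop structure is also what lets you double-buffer DMA on the SPUs later: fetch chunk N+1 while working on chunk N.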
So my question is, what advantages does Cell really offer compared to the Xenon or any other current multi-core CPU?
The engine team is becoming an increasingly small part of the budget, and as time goes on programming models will shift towards the Cell's, simply because that's where things are going on the PC side as well.
Just one point: I don't see this happening. Caches are here to stay.
Well, from the things I hear regarding programming on the PS3, the Cell doesn't sound as efficient as was probably expected by Sony, considering their long investment in the chip.
You must be hearing from some pretty whack sources! Look outside the console space to see how Cell is accelerating processing jobs elsewhere, quite commonly an order of magnitude faster. That's even with third parties free to choose whatever processor they want. HPC shows Cell is a great high-speed processor, thanks to the design decisions they made.
Do some devs have an idea of what could have pushed Namco to make the choices they did on SC IV?
I can't think of a valid technical reason.
Even considering the process of games validation Joker454 spoke about, I don't get it.
The games could have been identical; I don't understand what motivated Namco's choices.
id's post-apocalyptic open-world shooter Rage (PC, PS3, 360, Mac) will look worse on Xbox 360 due to the compression needed to fit the game's assets on two DVDs, programmer John Carmack revealed at tonight's QuakeCon keynote.
According to Carmack, the royalty fees to include a third disc in the Xbox 360 version would be so high that it simply isn't a feasible solution, with the programmer hoping for Microsoft to make a concession. He stressed that the issue has nothing to do with the Xbox 360 hardware itself, and is merely a storage problem.
Carmack also noted his belief that neither Doom 4 nor Rage will be digitally distributed, as id just isn't looking into it at this point.
During last year's QuakeCon talk, Carmack stated that the PlayStation 3 edition of Rage would ship on a single Blu-ray disc, with the PC and Mac versions likely to arrive in both Blu-ray and DVD form.
Do some devs have an idea of what could have pushed Namco to make the choices they did on SC IV?
1) One of the 360's TCRs is that all games must ship with some form of AA. Some studios/franchises have the power to muscle this requirement aside, but assuming Soul Calibur could not, they would have to either use hardware MSAA or their own software method to meet it. It seems fairly clear from the resolution choice that they decided against hardware MSAA and tiling, so they decided to supersample instead. For 720p output, their supersampling method would meet the TCR. For 1080p output, however, it's possible that relying purely on the 360's hardware upscaler would not count as meeting the requirement. But, in a perverse twist of logic, rendering to a 1365x960 buffer, supersampling it down to 720p in software, then letting the hardware scale it back up again to the client's native 1080p display would meet the TCR.
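The arithmetic behind that rumored path is easy to check. A tiny helper (the function name is just for illustration) gives the effective supersampling factor:

```cpp
// Rendered pixels per output pixel; > 1.0 means supersampling.
double SupersampleRatio(long renderW, long renderH, long outW, long outH) {
    return static_cast<double>(renderW) * renderH
         / (static_cast<double>(outW) * outH);
}
```

For the 1365x960 buffer downsampled to 1280x720 this gives 1310400 / 921600 ≈ 1.42 rendered pixels per output pixel, with the extra samples concentrated vertically (960/720 ≈ 1.33 against 1365/1280 ≈ 1.07), which matches the anamorphic buffer described above.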
1)...
hm... somewhat less incredulous... I'm curious, how difficult is it to deal with anamorphic resolutions?
2)...
hm... seems plausible considering the HUD is 720p and presumably drawn last, i.e. 1080p = afterthought. But why not composite a pixel-perfect HUD instead and then let the hardware scale up or down? Memory constraint?
3)...
4)...
This is certainly the trend we've been seeing from cross-platform games and developer comments. The latest Carmack interview on 1up sure ain't Sony friendly.