BlueTsunami said: I say he needs to grab this challenge by the balls and see what he can do with multi-core processors. He DOES have the right to complain... but there's a point where it seems as though he's whining.
london-boy said:He said ALL THAT?! Good god the guy can talk!
Diplo said: You might think he's being pessimistic or "whining", but I see it as being realistic, especially with regard to consoles. Far too many people seem to assume that, say, 3 x 1.5 GHz cores = 4.5 GHz of processing speed. The console manufacturers hype up their devices to the max and then release unrealistic pre-rendered videos to further present this misleading picture. Damping down some of this expectation with a dose of realism is long overdue.
To say that Carmack is afraid of a challenge is simply untrue based on what he has done, not only in the graphics industry but also in his side interest of Armadillo Aerospace. The fact that he's playing around with mobile technology basically for fun shows how ready he is to embrace new technological paradigms. He was also one of the first (and almost the only) game developers to write a multi-threaded engine, as witnessed with Q3 tech, so he has more experience than most.
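As an aside, the "3 x 1.5 GHz = 4.5 GHz" arithmetic Diplo mentions is easy to put rough numbers on with Amdahl's law. The C++ sketch below is purely illustrative (the 70% parallel fraction is a made-up figure, not a measurement of any real engine or console): even a generously parallelisable workload gets nowhere near 3x from three cores.

#include <cstdio>

// Amdahl's law: speedup = 1 / ((1 - p) + p / n)
// where p is the parallelisable fraction of the work and n is the core count.
double amdahl_speedup(double parallel_fraction, int cores) {
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores);
}

int main() {
    // Hypothetical figure: 70% of a frame's CPU work can be spread across cores.
    const double p = 0.70;
    for (int cores = 1; cores <= 3; ++cores) {
        std::printf("%d core(s): %.2fx speedup\n", cores, amdahl_speedup(p, cores));
    }
    // Prints roughly 1.0x, 1.5x and 1.9x -- not the 3x the "add up the GHz" view implies.
    return 0;
}

The exact numbers don't matter; the point is simply that clock rates across cores don't add.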
Nite_Hawk said:There are really a couple of things going on all at once in my humble opinion. Like you say, we've got people with big stakes in the new consoles saying that they are the best thing since sliced bread. We also have well established PC programmers like Gabe and JC saying that the new designs are really no better (indeed, they seem to give the impression that they are worse) than the current design.
I think quite honestly that JC is just suffering from getting old. He has a *lot* of knowledge about designing and optimizing games for single-CPU systems, and going multicore is going to require throwing a lot of that out and starting over. That's not to say all of his experience is worthless. Still, I doubt that it will be JC, Gabe, or even people like Sweeney who are going to be the "god like" programmers for multicore processors. Sure, they'll be able to take stabs at it (with varying degrees of success), but it is going to be the next generation of programmers, who spend years writing multithreaded engines, that will become the masters.
The hardest thing for JC is going to be letting other (younger!) people be the new stars at ID. Sweeney (from what little I've heard) sounds like he's made an effort to bring many people into Epic that have varying areas of expertise. That is, at least in my opinion, why Epic is succeeding where it appears ID is failing.
Nite_Hawk
In no way do I want to pull his (Nite_Hawk's) statement out of context (so check out his post in the other thread)... but that seems to put everything that needs to be said about his criticisms in perspective (coming from people who feel that he's crying over this a little too much).
ERP said: To be honest, reading that thread I get:
JC Says: Xbox360/PS3 not the second coming
Sony/MS fans Say: Whine whine whine -- JC Sucks....
It's one of the wonders of the internet: there is so much crap out there that you can just regurgitate the crap that aligns with your view and reject the stuff that doesn't.
FWIW I've read the keynote and I believe he's being very fair; it's just that no-one here wants to hear that Cell and the X360 processors are not the all-conquering behemoths the flop comparisons would have us believe.
I love reading the "but it's OO code on an in-order processor"... What exactly does OO code look like?
I've seen many other threads where people claim that the compilers will solve this problem for the devs...... So what is the point here?
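To make ERP's question a little more concrete: the complaint usually gestures at code like the hypothetical C++ sketch below (the Entity/tick names are invented for illustration, not taken from any real engine). Pointer-chasing and virtual calls mean every iteration waits on the load of the previous node; a compiler cannot conjure independent work out of that chain, an out-of-order core can at least soak up some of the branch and load latency with whatever nearby work is independent, and an in-order core exposes every stall.

#include <cstddef>
#include <cstdio>
#include <vector>

// Hypothetical "typical PC-style" update loop: a linked list of heap objects
// with an indirect (virtual) call per node.
struct Entity {
    Entity* next = nullptr;
    float   health = 1.0f;
    virtual ~Entity() = default;
    virtual void update(float dt) { health -= dt; }  // indirect branch per node
};

float tick(Entity* head, float dt) {
    float total = 0.0f;
    for (Entity* e = head; e != nullptr; e = e->next) {
        e->update(dt);       // hard to predict, hard to inline through a pointer
        total += e->health;  // depends on the load of *e
    }
    return total;
}

int main() {
    // Tiny illustrative list; a real engine would have thousands of objects
    // scattered across the heap, which is where the cache misses come from.
    std::vector<Entity> pool(4);
    for (std::size_t i = 0; i + 1 < pool.size(); ++i) pool[i].next = &pool[i + 1];
    std::printf("total health: %f\n", tick(&pool[0], 0.016f));
    return 0;
}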
BlueTsunami said: My problem with JC is that he isn't giving any opinions as to what would be good about multi-threaded processes, or what could be an alternative to multi-core CPUs... it's that type of whining and bickering that doesn't get anything done... he's the "god like" programmer... he should tell us (in his opinion) what would be better rather than spout off on how bad all these CPUs are.
First off, you're using highly emotional terms in relating Carmack's keynote. I watched it, and I can assure you there was almost no emotion in it anywhere. The rants were at GDC 2005 and on Anandtech. He pointed out that it's hard, which is inarguably true. And he pointed out that hard things aren't the very best for development, which is also inarguable. If any of that sounds like whining or bickering to you, I submit that you are buying into what people are saying about Carmack more than you are buying into what Carmack said.
The issue is whether you're disagreeing with what Carmack actually said or with what people reported him to say. And I also respectfully agree with JC... but that must be a bad thing...
BlueTsunami said: EDIT: Does it matter what OO code looks like? All that matters is that it was tailored for OO CPUs... how could you think that it would magically be efficient? That blows my mind... it would seem pretty obvious that you would get a drop in performance by doing that.

Again, your lack of knowledge of programming discredits your conclusions. "OO code" is a typedef for "code". You can't tailor code to an OO unit. It just doesn't make sense.
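For what "tailoring" code to a particular core can actually mean in practice, here's a small hypothetical C++ illustration (sum_naive/sum_unrolled are invented names, not anything from Carmack or this thread): on an in-order core, breaking a long dependency chain into several independent accumulators is exactly the sort of hand-scheduling that an out-of-order core would otherwise do for you at run time.

#include <cstddef>
#include <cstdio>

// Naive reduction: every add depends on the previous one, so the loop runs at
// the latency of the floating-point adder. An out-of-order core can at least
// overlap the loads; an in-order core is stuck with the chain as written.
float sum_naive(const float* v, std::size_t n) {
    float s = 0.0f;
    for (std::size_t i = 0; i < n; ++i) s += v[i];
    return s;
}

// Hand-scheduled version: four independent chains the hardware (or the
// compiler's scheduler) can keep in flight at once. This is the kind of
// restructuring that in-order cores push back onto the programmer.
float sum_unrolled(const float* v, std::size_t n) {
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += v[i + 0];
        s1 += v[i + 1];
        s2 += v[i + 2];
        s3 += v[i + 3];
    }
    for (; i < n; ++i) s0 += v[i];  // leftovers
    return (s0 + s1) + (s2 + s3);
}

int main() {
    float data[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    std::printf("%f %f\n", sum_naive(data, 8), sum_unrolled(data, 8));
    return 0;
}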
Regurgitation? I apologize for agreeing with someone else's post... and I decided to put that in this thread because I thought it was exceptional and encapsulated what I was thinking also.
BlueTsunami said: Damn... I got raped... lol. It's true that I don't know that much about programming (I know very little, and the little I know comes from working with *nix), but my disagreement with JC is that he's bashing something that Intel, AMD and IBM find to be the next step (besides faster CPUs). His complaining (as he is) shows his reluctance to accept multi-core CPUs and the fact that he's going in with reservations... he's going to be left in the dust by people who accept this (multi-core CPUs, that is) with open arms.
It's not about consoles vs. PCs (I don't really play games that often anymore since I don't have time... I usually get down on my PSP from time to time and play on my laptop when I'm on the train). But there are already developers who have swallowed any inhibitions they may have had about these CPUs and chugged along, following the flow of the business. It just saddens me that someone as prestigious as him isn't taking on this challenge and is bickering about it.
PS...damn I got raped..
Shifty Geezer said: I thought the Intel roadmap had main cores and 'stripped-down' support cores, kinda like Cell. I don't know where OOOE fits in, as one of the aspects needed for lots of cores is simple cores. The gains made in keeping the CPU busy with OOO instructions are lost in having fewer cores/hardware threads available. Maybe advances in compiler tech will make OOO less important?
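On the "lots of simple cores" side of Shifty's trade-off, the gain only materialises if the work is explicitly split up by the programmer. Here is a minimal fork/join sketch (modern C++ threads used purely for brevity, and particle integration is an invented example, not anyone's actual engine code) of what that partitioning looks like:

#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// Update a slice of particles; each element is independent, so slices can run
// on separate cores with no locking.
static void integrate(std::vector<float>& pos, const std::vector<float>& vel,
                      std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) pos[i] += vel[i] * dt;
}

int main() {
    const std::size_t n = 1 << 20;
    std::vector<float> pos(n, 0.0f), vel(n, 1.0f);

    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = n / cores;
    std::vector<std::thread> workers;

    // Fork: one slice per hardware thread.
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = (c + 1 == cores) ? n : begin + chunk;
        workers.emplace_back(integrate, std::ref(pos), std::cref(vel), begin, end, 0.016f);
    }
    // Join before anything reads the results.
    for (auto& t : workers) t.join();

    std::printf("pos[0] = %f, threads used = %u\n", pos[0], cores);
    return 0;
}

If the per-element work were a dependent chain instead, the extra hardware threads would sit idle, which is roughly the burden the whole keynote argument is about.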