Gabe Newell on NextGen systems

ps2xboxcube

Newcomer
I searched and didn't see this mentioned yet. Here is what Gabe Newell thinks about the next-gen systems.


"Your Existing Code? Throw it Away"
Thursday, 21 July 2005
Valve boss Gabe Newell is shocked at the lack of appreciation for the giant hurdles involved in next generation console development.

"Technologically, I think every game developer should be terrified of the next generation of processors. Your existing code, you can just throw it away. It's not going to be helpful in creating next generation game titles," said Newell.

"Most of the problems of getting these systems running on these multicore processors are not solved. They are doctoral theses, not known implementation problems. So it's not even clear that over the lifespan of these next generation systems that they will be solved problems. The amount of time it takes to get a good multicore engine running, the Xbox 360 might not even be on the market any longer. That should scare the crap out of everybody."

Newell cautions "Really good engineers are going to be much more valuable and engineers who used to be valuable writing game code in the previous generation may end up becoming thorns in the side of key programmers who can write multi-core game code."

Is this different?

But learning curves for new hardware have been par for the course for the past several generations of hardware. How exactly is this different?

"Yes, it is different. It is much more difficult now to write code that will have predictable behavior. We have performance problems now in the out-of-order universe because we have programmers who can't figure out why the changes they made caused the system to behave the way it does.

"So one of the people who has a deeper understanding of the overall architecture has to come in and tinker around, more or less blind, because there aren't a lot of performance tools that give insight into what's happening in the cache memory, where a lot of this stuff goes wrong. It goes a lot worse in a multicore world, where there's a whole bunch of stuff going on in these separate [cores], can suddenly have an impact on the entire system.

"If writing in-order code [in terms of difficulty] is a one and writing out-of-order code is a four, then writing multicore code is a 10," cautions Newell. "That's going to have consequences for a lot of people in our industry. People who were marginally productive before, will now be people that you can't afford to have write engine or game code. They can't get a big enough picture of what's going on in the box so they'll be a net negative on the project."

Hardware claims

Newell is also scathing about hardware manufacturer claims regarding performance.

"Statements about 'Oh, the PS3 is going to be twice as fast as an Xbox 360' are totally meaningless. It means nothing. It's surprising that game customers don't realize how it treats them like idiots. The assumption is that you're going to swallow that kind of system, when in fact there's no code that has been run on both of those architectures that is anything close to a realistic proxy for game performance. So to make a statement like that, I'm worried for the customers. And that we view customers as complete morons that will never catch on and that we're lying to them all the time. That's a problem because in the long run, it will have an impact on our sales."

Gabe Newell has shared his opinions and experiences with us. We'd like to hear from more developers what challenges and solutions (if any) you're facing with your next generation console projects.


http://www.next-gen.biz/index.php?option=com_content&task=view&id=510&Itemid=2
 
Did he consistently swap "in-order" and "out-of-order"? :?

If not, there are a lot of points that don't make sense (basically every occurrence of those two words).

Edit: well, got it: out-of-order CPUs can run in-order code efficiently, while in-order CPUs need special care for instruction order, which is what he calls out-of-order code.
 
Late Breaking News: Ti-ti-tii-ti-ti-tii-ti-ti-tii-ti...

Our correspondents in the field have just broken this news to our home studio...

"Big Corporations LIE!!!" More information at 11!


We all knew this was the case... though, he's in no way wrong nor, sadly, are the corporations... if you ever check out other forums you see people tossing around numbers and things they don't understand, out of context and in terms of absolutes. The really disturbing thing, IMO, is the fact that even if you try to correct these people and explain to them how they misinterpret things and so on, they just turn around and flame you. They REFUSE to be educated -- that's the most disturbing part.

Life moves on, and it's up to those in the know to try and hold things together... burden of knowledge, I guess.
 
I don't think this tells us anything new. A new way of thinking will be needed to get the most performance out of these multi-core processors. There will definitely be hurdles, but I doubt it will be as bad as Gabe Newell is making it out to be. Give it a year or two and the tools and compilers for the consoles will be much better than they are initially.
 
Gabe Newell said:
So to make a statement like that, I'm worried for the customers. And that we view customers as complete morons that will never catch on and that we're lying to them all the time.
Half-Life 2 will be released this September (2003)!
 
Vysez said:
Gabe Newell said:
So to make a statement like that, I'm worried for the customers. And that we view customers as complete morons that will never catch on and that we're lying to them all the time.
Half-Life 2 will be released this September (2003)!

Hey I like that one. What did my mom tell me about people in glass houses? :)

Gabe said:
The amount of time it takes to get a good multicore engine running, the Xbox 360 might not even be on the market any longer. That should scare the crap out of everybody."

This guy is going way too far with his scare tactics. Does anybody, I mean ANYBODY, believe this? And this is coming from a respectable guy. Psst, I guess there won't be a Half-Life 3 on the PS3 or Xbox 360. Scare the crap out of everybody? Ummm... no, that's what good support and software are for. Didn't MS make something called XNA, doesn't the PS3 come with a free package of Ageia and Havok software, and aren't plenty of other players aboard, like Epic, to make this multithreaded thing easier?

Nobody in life should be as scared as he is. What is your alternative, Gabe? Is it to work on PC-style CPUs until Intel goes multithreaded? When should the console makers have gone with a multithreaded solution? 2010? 2015?
 
Throw away existing code??

Yet the UE3 team still got their stuff up and running on PS3 in 2 weeks.

The existing code may not work great or have the best efficiency, but I'm sure smart developers can re-work a few things and get decent performance out of it.

If Valve throws out their existing engine code to build a new system for PS3 or XBox 360 that's a waste of a lot of tech.

Speng.
 
How else will Valve spend 5 years porting their engine to PS3 or XBox360 unless they throw everything away?
 
I liked what he had to say.

I don't think he said the consoles are not powerful/less powerful than current architectures - quite the opposite. I think he said they're amazingly powerful, but will likely not reach as much of their full potential as current/previous consoles and PC's do.

Also, regarding the lying, he's right, again. People have been lied to in the past (eg, FFVIII ballroom scene) and will be lied to this gen with regard to figures. But people will of course swallow the lies and, worse, defend them to the teeth!

But there you have it.

Vysez, love the HL2 release date bit ;)
 
PARANOiA said:
I think he said they're amazingly powerful, but will likely not reach as much of their full potential as current/previous consoles and PC's do.

Well PARANOiA, we know that PCs don't reach their full potential because every 10 months a new GPU comes out. Why do they do that anyway? Also, with all the new software and middleware to help devs, why should they be scared? Have you personally seen the PGR3, GoW, Gundam videos? And that's on alpha kits. I'm personally not worried about it. Next-gen games will be alright.
 
mckmas8808 said:
Well PARANOiA we know that PCs don't reach their full potential because every 10 months a new GPU is coming out. Why do they do that anyway?

Because NVidia and ATI like making money? :) Also because people like buying "the best of the best", regardless of cost, even if it's a small (10-15%) performance increase over six months.

I threw PC's into the mix purely because my 9700pro can reach more potential than my XBox because I'll be playing new games on it in three years. Can't say the same about the Xbox, can you? I think PC graphics cards do reach their potential more than old consoles simply due to the constant influx of new games, regardless of the age of your hardware.

This is within reason of course (eg, your 3d architecture isn't outdated like Glide, or games won't run at all/playably).

mckmas8808 said:
Also, with all the new software and middleware to help devs, why should they be scared? Have you personally seen the PGR3, GoW, Gundam videos? And that's on alpha kits. I'm personally not worried about it. Next-gen games will be alright.

I'm not scared either. I think the games will look fantastic, and will enjoy playing them. I'll one up you and say games will be better than alright, they'll be jaw-dropping :)

There's a big difference between games being alright and reaching the same potential in hardware that an older system did. It'll be much harder to get the most out of the hardware next gen than this one, I imagine. But I don't want to crap up this thread, so let's agree to disagree, ok?
 
Well he's 100% right about it being a LOT harder to program for these crazy in-order multi-core oddities.

These chips seem to have been designed to make it easy to boast about them. It's just going to be insanely harder to get max performance out of them. Absolutely nontrivial. And it will require serious changes in how a senior coder from this gen deals with next gen. Nothing like throwing away years of experience on out-of-order cores?

In-order cores haven't been used in PCs since the original Pentium, excluding low-performance Centaur-team chips. Hell, even the AMD K5 was out of order! The gist of it is that the high-performance chips are all out-of-order designs. OOO has been evolving for like 10 years! And now they just drop it in favor of massive clock speed and multi-cores.

I would go so far as to say that an AthlonXP 3200+ is a decent match for what these new console chips are capable of. At least until (if it's even possible) developers do some serious, down-to-the-metal optimization of code. MASSIVE man-hours and skill.... I've visited and spoken to the developers at Raven Software a few times. Last time I was there they were telling me how development costs and time are rising out of control for big titles. Well, they sure as hell didn't need this kind of bomb dropped on them. This is going to cost developers a LOT of money. A LOT.

At least the GPUs in the new consoles are full-on designs that seem to have very few serious cost cutting measures going on.

I just think many of you are caught on this hype trail, as is ALWAYS the case with consoles. You don't see just what the CPU engineers have done to these chips.... It's very bad.
 
Wasn't the PS2 technically a concurrent system with 3 "cores": the MIPS core, plus VU0/VU1? Since the developers' apocalypse didn't occur then, I doubt it will occur this time. Tools for concurrent programming are much better these days. It is not the multicore aspect that causes difficulty; it is the in-order aspect, the maturity of compilers, and the need to hand-optimize "hot" assembly for in-order execution.

p.s. don't forget that Gabe Newell is a shill. If Sweeney is an NVidia shill, then Gabe is an ATI shill x2. Moreover, Gabe knows jack about programming.
 
If you are programming in an HLL, the fact that the processor is in-order is irrelevant to you ... and you really shouldn't worry your pretty little head about the performance impact either (it ain't all that, certainly not on a dual-issue core). If you are programming in assembler and can't juggle instruction latencies well ... I imagine Microsoft will spend a lot of money on their compiler so it can better optimize intrinsic-based code, and some bright minds (maybe even Sony) will probably bring out something similar for the PS3.

As for cost of development, artists were pulling in most of that money ... is it so bad that a larger piece of the pie goes back to the programmers? :)
 
swaaye said:
OOO has been evolving for like 10 years! And now they just drop it in favor of massive clock speed and multi-cores.

Well, "just drop it" makes it sound like a bad thing. You seem really smart, but what MS and Sony chose to do is something that even Intel will do in the far future.

swaaye said:
I would go so far to say that an AthlonXP 3200+ is a decent match for what these new console chips are capable of. At least until (or if it's possible) to do some serious, down to the metal optimization of code.

Oh please. That Athlon is not, I repeat, is not a decent match for the CELL and XeCPU!!! It's just NOT the same. The sheer number of flops that the CELL can provide is something that the Athlon just can't get near.

And what about the CELL's ability to distribute data over a network? I don't know if the PS3 will do this, but Ken has said that it will, so take that as it is.

swaaye said:
Well, they sure as hell didn't need this kind of bomb dropped on them. This is going to cost a LOT of money for developers. A LOT.

How much more do you think it's going to cost? 20%, 50%, 100% more? Can you give me your estimate? Didn't development costs rise between the PSone and the PS2?


swaaye said:
I just think many of you are caught on this hype trail, as is ALWAYS the case with consoles. You don't see just what the CPU engineers have done to these chips.... It's very bad.

Well, the demos of PGR3, GoW, Gundam, the new Koei game, and others are proving that it is not hype. The CELL and XeCPU very bad? OK, for a video game console, what can outdo these two chips right now?
 
The author of the thread has the most politically correct alias I have seen. :LOL:

As for the article itself, I won't dispute the things mentioned, though a lot of content is old news to console developers.

However, I'm sure he is quite upset that all the effort that went into the HL2 engine is more or less useless (considering the tone of the article) for the lucrative, sinfully delicious engine-licensing market for next-gen. Lots of sour grapes on his plate, seeing how Epic's development plans and direction with the Unreal engine are now earning them a monopoly over next-gen engine licensing.
 
I know why he wants to criticize the next-gen CPUs.
Because porting his Source engine to the next-gen platforms is a huge engineering effort.....
:LOL: :LOL: :LOL:
 
passby said:
However, I'm sure he is quite upset that all the effort that went into the HL2 engine is more or less useless (considering the tone of the article) for the lucrative, sinfully delicious engine-licensing market for next-gen. Lots of sour grapes on his plate, seeing how Epic's development plans and direction with the Unreal engine are now earning them a monopoly over next-gen engine licensing.

Well it does raise a lot of questions about the financial value of building a 3d engine. Source was amazing, costing millions I imagine, and less than a year later it's pretty much dated.
 
swaaye said:
I would go so far to say that an AthlonXP 3200+ is a decent match for what these new console chips are capable of.

I think Yonah + Ageia PPU would be better.
But the cost would be too much to control.... :rolleyes:
 