Predict: The Next Generation Console Tech

Status: Not open for further replies.
Real devs will step up and fill in the gap, leaving incapable, complaining devs behind

I almost hate to respond to this.
Shipping a game today is more about software engineering and production than it is about "mad skillz".
It is basically impossible to monitor code quality over a 2+ million line codebase with hard deadlines; if your team has 100 people in it, 5-10 might be stars.
There are exceptional teams; they are usually smaller, and have the benefit that their success makes them attractive places to work, so they have a deep enough hiring pool to be very picky. But even there I've seen a trend where optimizing for things other than just pure execution performance is commonplace. It's just not the biggest problem anymore.

Producing a great game is more about tools and ensuring that your gameplay people and artists can iterate efficiently than it is about eking 10% out of the hardware.

Now I'm not trying to justify the level of bugginess of a Skyrim, but much of that isn't about technical ability; it's poor production practices. Bethesda obviously doesn't manage to get good test coverage, or they simply can't keep up with the rate of change.

Unfortunately, Bethesda is unlikely to change its practices going forward, because the most common error in the industry is believing that you are successful because of your process, not in spite of it.
 
Polyphony would be implementing ray tracing, Naughty Dog would be creating Final Fantasy: The Spirits Within-quality graphics, and Bethesda and Rockstar would be complaining they can't get their ports running at 30 fps, let alone maintain resolution parity with other consoles.

Amazing hardware will not always lead to amazing games, or ports in the case of PS3.

Your line of thinking on this is flawed. Taking the PS3 as an example of current exotic hardware: you're thinking 3rd-party devs get 50-80% efficiency out of the hardware, while 1st parties manage 90-100%. That's arguably somewhat correct, but you're then extrapolating that with more standardized, straightforward hardware, performance will be limited to 80% of the alternative exotic hardware for everyone, just much easier to achieve.

It really doesn't work like that. Exotic hardware pretty much always only has advantages in theory. Some aspects of it will be great, others will be hobbled in comparison. A well rounded piece of hardware based on the most commonly used instruction set will save every developer time and is likely to produce better results because of the large library of pre-existing, extremely refined tools.

This is how I see it:
Exotic hardware:
- 1st party/exceptionally talented devs achieve 90-95% efficiency.
- 3rd party/time constrained devs achieve 50-85% efficiency.

Straightforward hardware:
- 1st party/exceptionally talented devs achieve 90-100% efficiency.
- 3rd party/time constrained devs achieve 70-95% efficiency.

That's a very simplified way of looking at it, and I have no idea how one could go about measuring such stats (you can't), but that's the sort of delta I would personally expect, even if the overall percentages are lower.

NOTE: I'm assuming both chips have the same transistor budget, of course. Exotic hardware will always have higher R&D costs, though, so arguably for the same overall cost you could have a more standard chip with more transistors. I'm not really talking about an off-the-shelf chip, however, so the R&D difference probably wouldn't be large enough to offset the cost of larger dies over the entire production life of the system.
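The speculative efficiency ranges above can be turned into a quick back-of-envelope comparison. This is just a sketch of the poster's own hypothetical numbers (same theoretical peak assumed for both designs); none of these figures are measured:

```python
# Illustrative comparison using the speculative efficiency ranges from the
# post above. Assumes both designs have the same theoretical peak
# (same transistor budget), so efficiency maps directly to delivered performance.
exotic = {"1st party": (0.90, 0.95), "3rd party": (0.50, 0.85)}
straightforward = {"1st party": (0.90, 1.00), "3rd party": (0.70, 0.95)}

for tier in exotic:
    lo_e, hi_e = exotic[tier]
    lo_s, hi_s = straightforward[tier]
    print(f"{tier}: exotic {lo_e:.0%}-{hi_e:.0%} vs straightforward {lo_s:.0%}-{hi_s:.0%}")

# Worst case: a time-constrained port on exotic hardware delivers only
# 50/70 ~= 71% of what the same team would get from the straightforward design.
worst_gap = exotic["3rd party"][0] / straightforward["3rd party"][0]
print(f"Worst-case exotic/straightforward ratio for ports: {worst_gap:.0%}")
```

The point the numbers make: the floor matters more than the ceiling, since most multi-platform titles live near the bottom of the 3rd-party range.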
 
Almost Ot... lol
We should have info about the Jaguar architecture next month. I don't expect miracles as far as performance is concerned, as AMD seems to be aiming for an even lower TDP than with Zacate (4.5 W vs 9 W).
 
I almost hate to respond to this.
Shipping a game today is more about software engineering and production than it is about "mad skillz".
It is basically impossible to monitor code quality over a 2+ million line codebase with hard deadlines; if your team has 100 people in it, 5-10 might be stars.
There are exceptional teams; they are usually smaller, and have the benefit that their success makes them attractive places to work, so they have a deep enough hiring pool to be very picky. But even there I've seen a trend where optimizing for things other than just pure execution performance is commonplace. It's just not the biggest problem anymore.

Producing a great game is more about tools and ensuring that your gameplay people and artists can iterate efficiently than it is about eking 10% out of the hardware.

Now I'm not trying to justify the level of bugginess of a Skyrim, but much of that isn't about technical ability; it's poor production practices. Bethesda obviously doesn't manage to get good test coverage, or they simply can't keep up with the rate of change.

Unfortunately, Bethesda is unlikely to change its practices going forward, because the most common error in the industry is believing that you are successful because of your process, not in spite of it.

ERP, any recommendations on how I can fit the above post-of-awesomeness into a signature? Not that it matters but you still top my list of favorite posts... after myself of course ;)

I know we all hate the lame and lazy "devs are lazy/dumb" posts, and they're not worth replying to, but your posts are succinct, valuable, and great reminders. Kudos.
 
Shipping a game today is more about software engineering and production than it is about "mad skillz".
Likewise, transferred to the realm of hardware, I'd actually argue that building a successful console today is more about clever strategy decisions and efficient hardware solutions than about "lotz of corez" and "heapz of RAM" ... but only very few people seem to agree.
 
About the next-gen consoles, I don't think AMD is involved in any of them.

AMD's Manju Hegde, in a Q&A session:

Q: Do you expect the next gen consoles to make far more use of GPU compute?

A: Cannot comment further on this since these are products being brought forward by other companies.
 
About the next-gen consoles, I don't think AMD is involved in any of them.

AMD's Manju Hegde, in a Q&A session:

Q: Do you expect the next gen consoles to make far more use of GPU compute?

A: Cannot comment further on this since these are products being brought forward by other companies.


You've completely misunderstood that comment.
 
Shipping a game today is more about software engineering and production than it is about "mad skillz".
It is basically impossible to monitor code quality over a 2+ million line codebase with hard deadlines; if your team has 100 people in it, 5-10 might be stars.
There are exceptional teams; they are usually smaller, and have the benefit that their success makes them attractive places to work, so they have a deep enough hiring pool to be very picky. But even there I've seen a trend where optimizing for things other than just pure execution performance is commonplace. It's just not the biggest problem anymore.

Producing a great game is more about tools and ensuring that your gameplay people and artists can iterate efficiently than it is about eking 10% out of the hardware.
Well said.

Players tend to think that the best teams are those that squeeze 99% out of the hardware, but they don't have enough insight into the game development process. Many of the games that receive "best graphics" awards are the best because during development they had excellent tools, minimized content/programming iteration times, minimized down time (code bugs and refactoring), minimized dependencies (people waiting for others), etc.

Things like optimizing offline tools (for example, instant coarse global illumination instead of waiting hours for the baked result) can improve a game's graphics quality more than a 10% optimization of the (run-time) graphics code, simply because of the improved iteration time. A well-thought-out code base designed on good software development principles also helps a lot during development. The less time you need to spend hunting bugs and refactoring, the more time you have to actually develop gameplay features, optimize performance, and polish the existing features to perfection. Engines and tools that allow artists to modify graphics behavior (create shader variations, add/fine-tune post effects, etc.) save programmers a lot of time and speed up iteration drastically (dependency chains between artists and programmers kill productivity).

Improving productivity is the key for making good (and good looking) games.
 
amd-to-detail-jaguar-tablet-chips-in-august

Tablet chips?

Another http://www.pcworld.com/article/2593...ar_lowpower_processor_design_for_tablets.html

They also seem to be APUs. That doesn't seem to make a lot of sense for consoles.

Then again I suppose we might be looking at something with a pretty low power draw if Microsoft really wants to pack 8 of them in Durango.

Yeah, that's why I'm thinking this might be a decent choice for a console when you consider power draw: 8 cores @ 2 GHz plus a wide vector unit, or 64-128 GCN ALUs instead. It would probably have a low power draw (under 30 W). Add to that a 70-90 W GPU (7770-7850 level) and I think you would end up with a console that operates under 150 W once all the other components are added.
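As a sanity check, the power budget above does add up. All figures here are the speculative estimates from the post, plus an assumed ~25 W allowance for the remaining components (my hypothetical number, not from the post):

```python
# Hypothetical next-gen console power budget, using the speculative
# upper-bound figures from the post above (not measured or confirmed numbers).
components = {
    "CPU (8 Jaguar cores @ 2 GHz + wide vector unit)": 30,  # watts, upper bound
    "GPU (7770-7850 class)": 90,                            # watts, upper bound
    "RAM, storage, optical drive, I/O, cooling": 25,        # assumed rough allowance
}

total = sum(components.values())
for name, watts in components.items():
    print(f"{name}: {watts} W")
print(f"Estimated total: {total} W")

# Even taking the upper-bound component estimates, the sum stays
# under the ~150 W system target mentioned above.
assert total <= 150
```

Of course this ignores PSU efficiency losses at the wall, so the real margin would be a bit tighter.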
 
About the next-gen consoles, I don't think AMD is involved in any of them.

AMD's Manju Hegde, in a Q&A session:

Q: Do you expect the next gen consoles to make far more use of GPU compute?

A: Cannot comment further on this since these are products being brought forward by other companies.

You've misunderstood it; it means Sony, MS, and Nintendo are bringing them out.
It's 100% confirmed that the Wii U has AMD hardware, and it's now probably around 95% certain that both the next PlayStation and Xbox use AMD hardware too.
 
You've misunderstood it; it means Sony, MS, and Nintendo are bringing them out.
It's 100% confirmed that the Wii U has AMD hardware, and it's now probably around 95% certain that both the next PlayStation and Xbox use AMD hardware too.


I'm not so sure about Xbox 720 using AMD. Nvidia is spending a lot of money on Maxwell.
 
I'm not so sure about Xbox 720 using AMD. Nvidia is spending a lot of money on Maxwell.


So what you're saying is that Nvidia is willing to license its IP tech to Microsoft so MS can do with it as it pleases? Or is the reality of the situation that MS has been down this road with Nvidia before and doesn't want to get burned again?
 
So what you're saying is that Nvidia is willing to license its IP tech to Microsoft so MS can do with it as it pleases? Or is the reality of the situation that MS has been down this road with Nvidia before and doesn't want to get burned again?

I'm just saying Nvidia has large engineering teams working on next-gen GPU technology. I have no idea what MS will do... but I do think the situation is more dynamic than just expecting an AMD "Mutiny on the Bounty" GPU. Then you have IBM working on brand-new POWER8 cores. There is more than one valid choice for console tech. Bleeding edge does help sell Xbox Live. I'm doubtful AMD can keep pace with Nvidia.

The one thing I'm fairly certain about is that both Sony and Microsoft will use touchpad tech in their gamepads. It works well on the Vita, and I think not including it in the Wii U will turn out to be a big mistake by Nintendo.
 
I personally think that we will see AMD GPUs all around, and x86 CPUs on 2/3 (IBM only being involved as a primary CPU on Wii U).

It seems that way. This gen will be a huge win for AMD.

However, a huge Sony contract involving Nvidia did come out a couple of years ago. A lot of people assumed it was for the PSP2, but since that turned out to be ImgTec, it doesn't leave a whole lot else.
 