Predict: The Next Generation Console Tech

Yeah, I don't get it either. How much more money has Nintendo made with the Wii, ten times or more than what Sony or MS have? Then again, neither of them has really recouped their losses yet, so I'd say the Wii has actually made infinitely more money compared to the HD consoles.
Yes, it made a lot more for Nintendo, even in regard to brand strength and public perception, than the PS360 did for its makers. MS won a lot of mind share, but in the EU? I don't know; I know that in France it's still lagging in mind share, and they'll have a tough job getting people to forget about the RRoD.
As for money, I guess Nintendo made a lot of money; the current negative trend and forecast on their shares has nothing to do with the Wii.

The main problem with the Wii is that it was interesting, seemingly fun, and really cheap. That meant tons of people bought it, but after you realized how little it really offered it just stayed in a corner collecting dust. I wouldn't be surprised if the software attach rate for it was several times lower than either the PS3's or XB360's.
Well, there was a thread on the matter (where it was discussed heavily) some months (a year?) ago. Publishers sold quite a lot on the Wii; there are numerous million-sellers, Nintendo titles aside. But maybe the games that cost publishers the most to port aren't the ones that prove the most lucrative. Anyway, the trend now is negative, and I'm confident a lot of Wiis are indeed collecting dust (all my friends' are).
It's easy to do an almost-postmortem analysis, but I believe N made the wrong decision by trying to push the Wii even further down people's throats through more and more pointless peripherals, while not delivering the games that pushed people to buy the console in the first place. They took the wrong path (postmortem, it's easier to come to that conclusion, I know). Clearly, adults who don't play games often were not a reliable base, and trying to replace board/social games was not a good idea either. My postmortem take is that the Wii U should have been out for a while now, a real in-between product. Instead of trying to address non-gamers (while not delivering enough games), their focus should have been getting everybody within the family around their system, just not at the same time. That's where an online service and a more potent system would have come in handy. Nintendo pressed the lemon way too dry.
N should have shipped the Wii 2 in fall 2010, when they were still at the top of the wave. The Wii 2 should have been developed alongside their new handheld, something fresh, not a new DS. They've been cheap, and their execs will get fired in the near future, that's for sure. They were all about their "casual gaming" while wondering what to do with "da Vitality Sensor". It looks like success made them a bit tipsy.
Imagine how strong Nintendo was in 2009: what if they had announced a system at TGS 2009, launching worldwide in 2010, matched by a new handheld that could work together with the system, providing Wii U functionality and way more? They were the unquestionable wiinner. They could have, like MS, developed a Facebook application that runs Facebook games, for example; they had the strength to do so, and Minecraft-type developers would not have been saying "the user base is not big enough". No, they preferred, in cocky fashion, to spend money figuring out how to do a new Balance Board, a Wii MotionPlus, a Vitality Sensor??? WTF
postmortem WTF :LOL:
Imho they missed the point: social games and online services were the way to extend their reach further, more than Balance Board 2, Wii MotionPlus, vitalishit, and whatever else, and a more potent system was the way to take male teenagers from either Sony or MS. They got locked into a posture, searching for workarounds for their system's and business model's shortcomings instead of looking forward at what their peak popularity allowed them to do. It's like they took Sony's "10-year life span" for themselves, but a 10-year life span is not a universal truth, not for them, MS, or Sony. For them, four years may have been enough; they were massively profitable with outstanding public perception.
 
I don't think we should look at double and triple graphics card systems as a reasonable metric of PC performance. Sure, they're more available to the public compared to stuff like the GSCube or the first multi-chip 3dfx cards, but I'd put those systems into a different league and not with the "PC".
Agreed, I'm comparing only to high-end single cards (6970, GTX 580), and I still doubt we'll leapfrog that level of card at all. Similar for the CPU. I'd love to be proven wrong, but I seriously doubt that I will, given both the technical factors and the console makers' increasing apathy toward pushing high-end hardware.

Arguably, last generation barely did it anyway. The PS3 in particular was a bit depressing, with the NVIDIA 8xxx series and DX10 coming out around the same time and providing significantly superior hardware.
 
I find it encouraging that we are more than likely going to be moving away from the PPC design used in the PS3 and Xbox 360; it was just a megahertz monster with lackluster performance. Of course, the decision to go with such a CPU would have been made before it became clear how much IPC counts for, and splitting the cache on those multicore processors just served to starve them.

It would be nice to design a GPU from the ground up specifically for visual fidelity, rather than follow the trend of going all general-purpose, which doesn't really improve performance per die size. I am talking about the legacy of DirectX and features that add bloat but don't increase speed. Didn't some dev state that you can do most of the effects in DX8 that you can in DX11? (I think that's a slight exaggeration, but still, some DX9 games look incredibly similar to their DX10/11 counterparts.)

Here is hoping that Sony, Nintendo and Microsoft choose wisely!
 
Didn't some dev state that you can do most of the effects in DX8 that you can in DX11? (I think that's a slight exaggeration, but still, some DX9 games look incredibly similar to their DX10/11 counterparts.)
It's a ridiculous claim from naive people. DX10/11 stuff to this point has been - bar a few exceptions - ports. As long as you're forced to support DX9 (and consoles) it's hard to write a completely different renderer to take advantage of new features. Frostbite 2 is one of the notable exceptions that has a very DX11-styled renderer that they have "made work" on older hardware, with probably some pretty significant performance implications.

Lots of the real breakthroughs in graphics are happening due to increased flexibility enabling fundamentally more efficient algorithms to be expressed. So while you can criticize certain features as not turning out to be too useful or efficient (*cough* geometry shaders), overall the feature improvements have mattered as much if not more than the raw speed. We have plenty of raw speed in high-end PC GPUs for instance, but the problem is things like how naively the GPU pipeline makes use of the available memory bandwidth. That sort of thing needs to be addressed by better algorithms (enabled by more general features), not more bandwidth.
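
To make that concrete, here's a minimal CPU-side sketch of the tile-binning idea behind compute-based tiled deferred lighting, the kind of "fundamentally more efficient algorithm" I mean. All resolutions, tile sizes, and light positions are illustrative assumptions, and the real thing would run as a compute shader; this just shows the algorithmic shape:

```cpp
#include <cstdio>
#include <vector>
#include <algorithm>

// Illustrative only: screen-space tile binning, the core idea behind
// compute-based tiled deferred lighting. Instead of shading every pixel
// against every light (re-reading the G-buffer once per light), each
// 16x16 tile builds a short list of just the lights that touch it.
struct Light { float x, y, radius; };   // screen-space position + radius

int main() {
    const int width = 1280, height = 720, tile = 16;
    const int tilesX = (width + tile - 1) / tile;
    const int tilesY = (height + tile - 1) / tile;

    std::vector<Light> lights = { {100, 100, 50}, {640, 360, 200}, {1200, 700, 80} };
    std::vector<std::vector<int>> bins(tilesX * tilesY);

    // Bin each light into every tile its screen-space bounding square overlaps.
    for (int i = 0; i < (int)lights.size(); ++i) {
        const Light& L = lights[i];
        int x0 = std::max(0, (int)((L.x - L.radius) / tile));
        int x1 = std::min(tilesX - 1, (int)((L.x + L.radius) / tile));
        int y0 = std::max(0, (int)((L.y - L.radius) / tile));
        int y1 = std::min(tilesY - 1, (int)((L.y + L.radius) / tile));
        for (int ty = y0; ty <= y1; ++ty)
            for (int tx = x0; tx <= x1; ++tx)
                bins[ty * tilesX + tx].push_back(i);
    }

    // Each pixel now shades only against its tile's short light list,
    // reading the G-buffer once -- that's the bandwidth win.
    size_t total = 0;
    for (auto& b : bins) total += b.size();
    printf("average lights per tile: %.2f\n", (double)total / bins.size());
    return 0;
}
```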
 
They should be able to easily match 6970/580 level performance given the advantages of a closed system. I'd guess current laptop graphics within a closed ecosystem would accomplish that.
 
They should be able to easily beat 6970/580 level performance given the advantages of a closed system. I'd guess current laptop graphics within a closed ecosystem would accomplish that.

Don't forget that when the new consoles launch, PC games will be running DX11, maybe even DX12, as standard. That should significantly reduce the API overhead seen with today's DX9 games. Instead of 2x the effective performance, you may see only 50% more, or even less.

Although they'll be old by then, those performance advantages of the more modern API should translate back to the 6970/580 as well.
 
I doubt that. How many people will have a dx12 graphics card in 2013? 1%?

If the consoles are DX11/DX12 then most PC games will be as well. DX11 on the PC will be a fallback path, which will die quickly if the consoles are DX12. But if the consoles are DX11 then it's a no-brainer: all games will be DX11 on the PC within months of launch.
 
dx12 may be like dx11, i.e. still targeting directx 10.0 hardware.
windows 8 will be windows 6.2 under the hood, too, so we can believe there won't be a giant API reboot as with vista/dx10 again.

so at least a dx12 game could have multiple paths without too much trouble - if windows XP support is dropped.
an Intel roadmap shows directx 11.1 support in 2013 though, so maybe there's no dx12 after all.

that .1 release might still be significant if it allows more GPU autonomy. maybe the GPU could "create" its own debris in a building explosion rather than handling that with API draw calls? I don't know.

consoles would probably use the same dx11.1-level GPU architectures.
 
Agreed that better algorithms lead to better efficiency, and this only comes about with new iterations of DirectX. The reason we are all transfixed on DirectX is that the Windows x86 platform is spearheading graphics hardware development, in a sort of symbiotic relationship between the IHVs and ISVs.

The direction NVIDIA has taken, and ATI now will with GCN, is better compute abilities, but those abilities do not directly translate into better performance for the consumer. Going after the high-margin, lucrative compute market hurts die size, from what has been seen with Fermi. And I don't really think DX8-level pixel and vertex shaders could ever compare to the effects you get simply by having FP modes for blending and other capabilities such as HDR. Going back that far would be a complete waste of technological advancements and a shot in the foot for any vendor in the console space.
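
As a tiny illustration of why FP blending was such a step over DX8-era fixed-point targets (the values here are made up, and a real HDR pipeline would tone map afterwards):

```cpp
#include <cstdio>
#include <algorithm>

// Why FP blending matters for HDR: with 8-bit fixed-point render targets
// every additive blend clamps to 1.0, so overlapping bright lights can't
// get any brighter. Illustrative values only.
int main() {
    float a = 0.8f, b = 0.7f;                 // two overlapping light contributions
    float fixed8 = std::min(a + b, 1.0f);     // 8-bit style: blend clamps at white
    float fp16   = a + b;                     // FP target: intensity survives to tone mapping
    printf("clamped 8-bit blend: %.2f\n", fixed8);   // 1.00 -- detail lost
    printf("FP blend:            %.2f\n", fp16);     // 1.50 -- can be tone mapped later
    return 0;
}
```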

It's also a reason why I love the idea of eDRAM, not only because Bitboys sold me on it a decade ago, but also because, look at the Xbox 360: IMHO an incredibly balanced system.

Without needing an additional bus or extra logic real estate (the SPUs), it is comparable in graphics fidelity to the PS3. There are various reasons for this; one of them is that MS were smart.

EDIT: Reading up on Frostbite 2 - we definitely need developers like DICE doing crazy things with those SPUs... it seems they have found their true calling at last :)
 
Framebuffer, Z-buffer, backbuffer, particle buffer, and if you use any level of deferred rendering then that's a few more buffers again. Reflections and other off-screen render targets will sit there too.

The X360 keeps the backbuffer in main RAM, and depending on the title it might get away with using the EDRAM alone for everything else. If you do tiling or deferred rendering, or use more space for HDR (like Reach), then it might get complicated, but then you can always go sub-HD.

To do nearly anything with a buffer you need to copy it out to main memory. So if you want to read back depth for deferred lighting/shadows/depth of field/motion blur/etc. you need to copy it out to main memory. If you want to present a buffer to the display, you need to copy it out. If you want to sample a render target for reflections, you need to copy it out. The one case where you might save some memory is with MSAA...but only in the case where you can do the MSAA resolve before the copy out to main memory. This means that if you're doing deferred rendering and need access to individual subsamples, you need to copy out the non-resolved buffer which means you're not going to save any memory.

EDIT: forgot to also mention that depth buffers you don't need to sample in shaders don't need to be copied into memory, which can save some space for certain passes.
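
For a sense of scale on why tiling comes up at all, here's a back-of-the-envelope sketch. The figures are assumptions (10 MB of EDRAM, 32-bit color and 32-bit depth, a 1280x720 target), but they show how quickly MSAA overflows the budget:

```cpp
#include <cstdio>
#include <initializer_list>

// Back-of-the-envelope EDRAM budget check. Assumed figures: 10 MB of
// EDRAM, 32-bit color + 32-bit depth per sample, 1280x720 target.
int main() {
    const double edram = 10.0 * 1024 * 1024;   // bytes
    const int w = 1280, h = 720;
    for (int msaa : {1, 2, 4}) {
        // color + depth, one of each per MSAA sample
        double bytes = (double)w * h * msaa * (4 + 4);
        printf("720p %dxMSAA: %.1f MB %s\n", msaa, bytes / (1024 * 1024),
               bytes > edram ? "-> needs tiling" : "-> fits");
    }
    return 0;
}
```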
 
But if you're creating and then storing those buffers one at a time, surely you could create in EDRAM a render target that would otherwise need to be stored in main memory?
 
I don't think the execs here are thinking about Xbox having to "pay back" its investment. This isn't venture capital, and MS was not hurting for profit in those years. I think they're more focused on future profit. The E&D business is now the 4th most profitable at MS and the fastest growing. A few years of absorbed investments is a small price to pay for an entirely new revenue stream with the potential to rival the Windows and Office juggernauts.

Really? What is the revenue difference now, probably an order of magnitude?

Console business may have peaked. Aren't overall software sales lower this year?

What else is in E&D, mobile?
 
E&D profit was $1.3 billion last year, but keep in mind that division includes some big money-losers like Kin and Zune. Overall industry sales might be down, but I'm not sure that's true for MS (it's definitely not true for their hardware).
 
They should be able to easily match 6970/580 level performance given the advantages of a closed system. I'd guess current laptop graphics within a closed ecosystem would accomplish that.
What are you basing that on?

I think you're *vastly* overestimating the advantages of a "closed system", particularly in terms of the GPU load. The bigger delta from writing "to the metal" is reduced CPU overhead. The GPUs actually run pretty efficiently once the commands are submitted to them, with very little overhead. There are a couple of specific features that aren't always exposed in the portable APIs, but nothing that makes a massive difference to algorithmic efficiency.
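
To put rough numbers on that CPU-overhead point, here's a toy calculation. Every per-draw cost below is an assumption for the sake of argument, not a measurement:

```cpp
#include <cstdio>

// Rough illustration of why API/CPU overhead, not the GPU, is where a
// console's "to the metal" access pays off. Per-draw costs are assumed.
int main() {
    const double frame_ms   = 16.67;   // 60 fps budget
    const int    draw_calls = 3000;    // a busy scene
    const double us_pc_dx9  = 30.0;    // assumed driver+runtime cost per draw
    const double us_console = 3.0;     // assumed near-metal cost per draw

    printf("PC DX9 submission:  %.1f ms of a %.2f ms frame\n",
           draw_calls * us_pc_dx9 / 1000.0, frame_ms);
    printf("Console submission: %.1f ms of a %.2f ms frame\n",
           draw_calls * us_console / 1000.0, frame_ms);
    // The GPU-side work is identical in both cases; the saving is CPU time
    // freed for game code, which is exactly the point made above.
    return 0;
}
```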

You may get decent visuals at lower resolutions and frame rates, like current console games, but I'm extremely skeptical that in any sort of apples-to-apples comparison the next-gen consoles are going to reach high-end PC performance when they launch. There's just no magical way to make that happen in that form factor at a reasonable price point. Physics is too much of an issue these days.
 
yes. thermodynamics, to be precise. a similar problem is how to make a gaming laptop; we had the $2800 razer laptop not long ago (I guess it is pretty forgettable to most people) with lowish specs.

an underclocked core i7 and some geforce GTS 450 derivative with 2GB VRAM. this kind of hardware can give excellent results nonetheless and would be apt in a console (it has 8GB ddr3 too :p)
 
The Razer is not a good example of a performance laptop. Pick one with an M6990 (underclocked 6870) and it would crucify the Razer's performance. Cooling is less of an issue for a console box, and you certainly don't need a battery. You can even get a number of laptops with CrossFired mobile 6990s.

What are you basing that on?

I think you're *vastly* overestimating the advantages of a "closed system", particularly in terms of the GPU load. The bigger delta from writing "to the metal" is reduced CPU overhead. The GPUs actually run pretty efficiently once the commands are submitted to them, with very little overhead. There are a couple of specific features that aren't always exposed in the portable APIs, but nothing that makes a massive difference to algorithmic efficiency.

You may get decent visuals at lower resolutions and frame rates, like current console games, but I'm extremely skeptical that in any sort of apples-to-apples comparison the next-gen consoles are going to reach high-end PC performance when they launch. There's just no magical way to make that happen in that form factor at a reasonable price point. Physics is too much of an issue these days.

It's one platform that's going to perform almost exactly the same across millions of machines, where you know how fast everything is going to be all of the time; but mostly I'm basing it on history and developer comments. And you'll be dealing with hardware somewhat customized for its purpose, with dev kits built just for it. Even if the advantage has shrunk, as was suggested, I think it's enough that for the target resolutions (1080p or less) they will be very close to the high end. Obviously consoles won't be doing 7680x3200 or whatever with custom super-high-res texture packs, but if they can't manage something akin to 6970 performance in 2014 they shouldn't bother.
 
The best laptop solutions are still typically around half the performance of a high-end desktop, and more expensive as well. There's no way you make up a 2x GPU performance deficit purely with low-level tricks, and there's no way even those laptop-level solutions will fit into a console budget. While cooling may be "less" of an issue, it's still a really big one... just listen to how loud the fans on your console get compared to a desktop PC with an order of magnitude more power, purely due to form factors :)

Again, I'd love to be proven wrong, but I severely doubt we'll see either Sony or Microsoft heavily pushing/subsidizing high-performance solutions for their next consoles. I expect mid-range desktop equivalents at best, which, while still quite decent and a large step up from the current generation, won't really rock the boat for us researchers :)
 
Really? What is the revenue difference now, probably an order of magnitude?

Console business may have peaked. Aren't overall software sales lower this year?

What else is in E&D, mobile?
Total revenue was half that of the Windows or Office orgs last year (~9 billion to ~19 billion). Profits are an order of magnitude apart (~1.3 billion to ~12 billion). E&D revenue grew 45% last year, compared to a 2% dip for Windows and the more normal 10-15% growth in the other MS divisions. At that growth rate the E&D division could exceed Windows revenue in three years (not that I think that's going to happen, but it's a possibility). Profit will never have the same ridonkulous margins that Windows and Office have, but it's a pretty respectable business now. By itself, the E&D business would qualify for the Fortune 500 (somewhere around #250).
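
For concreteness, the three-year claim is just compound growth at the stated 45%, assuming (unrealistically) that the rate holds and Windows stays flat:

$$\$9\,\mathrm{B} \times 1.45^{3} \approx \$9\,\mathrm{B} \times 3.05 \approx \$27.4\,\mathrm{B} > \$19\,\mathrm{B}$$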

And yes, it includes mobile, and I think the hardware division (mouse/keyboard stuff) too.
 
Again, I'd love to be proven wrong, but I severely doubt we'll see either Sony or Microsoft heavily pushing/subsidizing high-performance solutions for their next consoles. I expect mid-range desktop equivalents at best, which, while still quite decent and a large step up from the current generation, won't really rock the boat for us researchers :)

I generally agree with you, but I think something like the 6970's level of performance will be mid-range sooner than the next gen arrives. The upcoming 28nm shrink will do that already. If, more or less one year from now, you're already holding the next refresh after the 79xx chip, a card with over 2x the theoretical performance of the 6970, will you still think 6970-level performance is hard to achieve in a console in 2014?

I still think something around the 6970 is what we'll be looking at, with some modifications. A single-chip solution would ruin that, and I think that might be possible as well...
 