Technically, it'd mean rendering exactly the same scene and assets at a 50% higher framerate or resolution, or a mix of the two, although other features (e.g. alternative lighting models) would count as an improvement without being quantifiable. If you're using the GPU efficiently, though, higher framerate/resolution should be the most apparent advantage.

You brought up a good point. The early rumors stated the Wii U was 50% more powerful than what we have now... my question is how powerful exactly "50%" is.
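As an illustration of that trade-off, here's a back-of-envelope sketch. The 720p/30fps baseline is purely hypothetical for the sake of the arithmetic, not an actual spec of any of these machines:

```python
# What "50% more" GPU throughput could buy, assuming a hypothetical
# baseline of 1280x720 at 30 fps.
base_w, base_h, base_fps = 1280, 720, 30
base_pixels_per_sec = base_w * base_h * base_fps

budget = base_pixels_per_sec * 1.5  # 50% more pixel throughput

# Option A: spend it all on framerate at the same resolution.
fps_at_720p = budget / (base_w * base_h)  # -> 45 fps

# Option B: spend it all on resolution at the same framerate.
pixels_per_frame = budget / base_fps
scale = (pixels_per_frame / (base_w * base_h)) ** 0.5  # ~1.22x per axis
print(f"Option A: {fps_at_720p:.0f} fps at 720p")
print(f"Option B: {base_w * scale:.0f}x{base_h * scale:.0f} at 30 fps")
```

So "50% more" is either 45 fps instead of 30, or roughly 1568x882 instead of 1280x720 - a visible but not generational difference, which is the point being argued here.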
That's PR fire-fighting, like Kutaragi saying PS3 has remarkable potential that needs to be untapped, yet it's in the same ball-park performance-wise as XB360 and not magically more powerful. Never go to corporate executives or press releases to gain an understanding of a device. We have much better reference material - games showing no obvious improvements that would come with much more (50% more) powerful hardware, tiny dies, leaks and trustworthy rumours, hacker investigations revealing clocks. We could really do with eDRAM specifics and a final nailing of the GPU class, but we have a pretty good picture now, and there's not much that can change that picture significantly. The only real shocker would be if the eDRAM implementation is both very fast and 'smart' like XB360's, but that should be apparent in launch titles getting 'free AA', which they're not.

Is he basically confirming there that the Wii U is 50% more powerful than what we have, and if so, again, how powerful is 50% more powerful?
No, he is saying that their "improvements" still have "50% more optimization to go". Whatever that may mean.
Yeah, I was wondering and never got a decent answer. I know people hate the numbers game and random multipliers, but it's something to think about; if anyone should know, he should.
Are you purposely being obtuse, or are you just consistently missing the point and logic we are trying to bring to the thread? I understand what you are saying, but I am trying to show you how all of what you say doesn't add up.
How about you answer my question, what exactly do you think devs will accomplish with the Wii-U? Do you honestly believe it will keep up with the other next gen consoles?
I answered in another post, but yes.
What you should be asking yourself is if the Wii-U is so capable as you assume, why do devs need to reach the full potential of the system to outclass the PS360? If the leap in performance were there to begin with, devs wouldn't have to dig that deep to find it.
I never said the Wii U has to be maxed out to outclass the PS360. I just said give the def
This isn't going to happen and is nothing more than a pipedream for Nintendo fans. Why would developers be asking MS and Sony for more power if they plan to lead development on the Wii-U? Why would MS and Sony even invest in new consoles if they thought next gen development would lead on the Wii-U?
The reason is this: in some cases there will be Wii U games that are exclusives. Some developers would prefer to port a game rather than build it from the ground up for the new MS and Sony consoles. Also, rather than building, say, three versions of the same game from the ground up, developers may prefer to build from the one they are most familiar with, which would be the Wii U version. Like in art: if you have the basic framework of something drawn, it is a lot easier to just color it in than to do all of the drawing and the coloring too.
No need to be so defensive. We can't compare the console market to the handheld market because Nintendo has had a history of dominating that market, not so with home consoles.
The reason I brought it up had to do with expectations. What most people expected didn't happen. That was my point.
You have no idea how well or poorly the Wii-U will sell.
I say you have no idea how well the Wii-U will sell.
Beyond power, there's also the architecture that needs to be considered. Regardless of how close the PS3 and 360 seem on the surface, each system has strengths and bottlenecks that differ in some big ways. Besides, while we all do it at some point, you can't just measure the "power" of a system in such a general way. There are many things to consider when you look at how a system is expected to perform. So just because the PS3 and 360 produce similar results, that doesn't mean porting would be easy. That kind of question shows a level of naivety on your part.
Please don't stray off-topic. Just to clarify, though, ERP was talking about the success of third-party games and not the system itself. Time will tell if Nintendo will be able to satisfy third-party publishers enough for good support, but we already know that a lot of publishers will be betting on Microsoft's next console, and the PS4 to a lesser extent, at this time.
Straying off-topic? Hmmm, let's see. I shall look above. Console Forum: check. Console Technology: check. Wii U hardware discussion and investigation: check.

My statements relate to the Wii U hardware. I may appear to be off track at times, but all of it relates to the initial discussion of the Wii U hardware.
The issue with your PS3 vs Xbox 360 comparison is that the PS3 is NOT several times stronger than the Xbox 360 in every way. In fact, the PS3 has a weaker GPU (half the triangle setup rate, etc.), so some GPU tasks have to be offloaded to its CELL processor to make up for it if you're doing Xbox 360 -> PS3 ports. Its split memory, compared to the 360's unified memory, apparently gave devs some issues too.
Every console probably has some advantage over another in some technical way no matter how small or obscure. I mean overall.
For the Wii U, it appears to have the opposite problem from the PS3 in some ways. From what we know, its CPU is clocked slower and does not have as many GFLOPS as either current-gen CPU. The processor, however, is probably more efficient in some tasks, and the GPU has been stated to be stronger overall. We are still not sure how much stronger it is at this time, but it is safe to assume that it is not an order-of-magnitude difference from the 360's GPU.
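For a sense of scale on the CPU GFLOPS point, here's a rough peak-rate sketch. Every figure in it is an assumption (clocks and SIMD widths as commonly reported or rumoured at the time), not confirmed hardware data:

```python
# Rough peak-GFLOPS comparison under assumed specs:
#   Wii U CPU: 3 cores @ ~1.24 GHz, paired-single FPU (2-wide) with FMA
#   Xenon:     3 cores @ 3.2 GHz, VMX128 (4-wide) with FMA
# An FMA counts as 2 floating-point operations per lane per cycle.
def peak_gflops(cores, ghz, simd_width, flops_per_lane=2):
    return cores * ghz * simd_width * flops_per_lane

wiiu  = peak_gflops(3, 1.24, 2)  # ~14.9 GFLOPS
xenon = peak_gflops(3, 3.2, 4)   # ~76.8 GFLOPS
print(f"Wii U CPU ~{wiiu:.1f} GFLOPS, Xenon ~{xenon:.1f} GFLOPS "
      f"(~{xenon / wiiu:.1f}x on paper)")
```

Peak numbers like these say nothing about sustained general-purpose code, where a short-pipeline OoOE design can close much of the gap - which is exactly the "more efficient in some tasks" caveat above.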
The specs for the PS4/Durango have not been confirmed, but they will likely be over 8x more powerful than current-gen overall, and probably at least several times stronger than the previous systems in almost everything (though probably not theoretically as powerful as CELL in some ways). That means that both will be at least several times more powerful than the Wii U and should be capable of outperforming it in graphics by brute force alone.
Now... it may be arguable how much better some games will look, due to diminishing returns, budgets, and personal taste. For early ports, we may see enhanced 360 games at higher resolutions and framerates, or simply ports of the PC versions. The point is that they should be superior on a technical level.
I explained more about my stance in another post. I am trying to keep up with all of the replies lol.
Looking at games on a technical level is different from looking at how appealing they are. Mario Galaxy, for example, is a generation behind games for the HD systems in many technical ways, but looking at it from another perspective will give you various opinions.
The people at Beyond3D generally discuss things at a technical level, so that is something to consider when you make statements that you may not be able to back up with technical knowledge.
You are completely out of your depth here. You do not have the technological understanding to contribute to this thread. By all means hang around and learn and ask smart questions, but please don't derail this thread with off-topic and confused posts.
You are using an open-ended phrase. What is 'anything'? Is the Wii U only showing 10% of its capabilities, or 50%, or 90%? When you look at the architecture as we know it, there's nothing complicated about using the Wii U, so there's no reason to think usage will be well below capabilities. Yes, 3rd parties may be being cheap, but clearly not everyone is, including Nintendo. Trine 2, for example, is showing improvements. There are some improvements that don't need understanding or effort but just performance, like MSAA, and the lack of such features tells us a decent bit about the capabilities of the Wii U (not much above PS360). Your other post gives a better point of reference:
By sharper I guess you mean higher resolution. That's a matter of RAM, ROPS, etc., and not developer skill. If there isn't higher resolution in launch titles, there's not likely to be a big improvement there. There will be sharper textures as a result of more RAM. Lighting and shading, yes, they'll be somewhat improved as it's DX10 versus DX9, although probably not a massive improvement. DX9 is pretty capable. So yes, Wii U will look better in the end, as I've said before. It's just not going to be massively better - the laws of physics tell us this. The small, low-wattage parts in Wii U will not be comparable to large, hot parts in the next-gen consoles. There's no need to wait on Wii U games to learn that.
For the purposes of this thread, Wii U's launch titles are adequate to tell us there isn't a massive amount of performance in there, which would be obvious in launch titles on cross-platform engines (UE3) managing higher framerates and better IQ as a result of the potent GPU. We have a ball-park performance metric, die sizes, rumours/leaks about chip names and flavours....Wii U's launch titles don't bring anything more to the discussion, unless they are exhibiting a particular effect like higher quality DOF.
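The "free AA" point can be made concrete with simple framebuffer arithmetic. The layout below (32-bit colour plus 32-bit depth/stencil per sample) is an illustrative assumption, not a confirmed Wii U configuration; the 360's 10 MB eDRAM is the known reference point:

```python
# Framebuffer size at 720p for various MSAA levels, assuming
# 32-bit colour + 32-bit depth/stencil per sample (8 bytes/sample).
def fb_bytes(width, height, msaa_samples, bytes_per_sample=4 + 4):
    return width * height * msaa_samples * bytes_per_sample

MB = 1024 * 1024
for samples in (1, 2, 4):
    size = fb_bytes(1280, 720, samples) / MB
    print(f"720p {samples}xMSAA: {size:.1f} MB")
```

720p with no MSAA is about 7 MB, so it fits in the 360's 10 MB eDRAM, but 2x (~14 MB) and 4x (~28 MB) do not - hence the 360's tiling. A Wii U eDRAM pool that is both larger and fast would make MSAA nearly free, which is exactly why its absence in launch titles is telling.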
Which is inadequate to contribute to the technical investigation.

I am not out of my depth, as I never claimed to be a 20-year development vet. I just use scientific principles and the technical knowledge I do possess.
Language barrier aside, your remarks about coding troubles are too wishy-washy and generic to be going anywhere. Prior to your and artstyledev's appearance in this thread, we were merrily discussing CPU architecture and GPU architecture and system bandwidth and all sorts of real subject matter, and I want to return to this topic.

All of my comments have related to what this thread is categorized as.
In addition, the thread title also says "hardware discussion", and isn't that what is going on here?
I know you do, but that's not based on anything substantial. It's a belief based on hope, not technical understanding. No matter how much experience the devs have with the hardware, there's only so much they are going to be able to accomplish with a 40-watt system, a tiny tri-core 1.25 GHz PPC CPU with probably weak-sauce OoOE, and a 550 MHz, probably RV770-class, 150 mm^2 GPU including eDRAM*. The parameters of their operation leave a bit of margin for unknown reveals, but something like really fast BW in the eDRAM should be apparent in the launch games. That is obvious to most with a decent bit of system understanding. You don't need specific Wii U experience; 20+ years of following console and graphics tech developments, or even just a little while learning how different machines have worked over the years, is enough to tell us that 3rd-party devs should be able to enable free MSAA on 128+ GBps eDRAM if it existed.

I believe there is a lot yet to be learned about the Wii U's power.
No, or at least not necessarily in terms of power or transistors. In terms of time, sure, but that's not the first thing that comes to mind. I don't say a Ferrari is more efficient at getting to 60 mph than a Dacia. Heck, a dual-core Pentium 4 is more powerful than a single-core Pentium 4, but I'm not sure I'd say it's more efficient.

The more powerful a system is, the more efficiently it will run the code.
Yes.

I believe there is a lot yet to be learned about the Wii U's power.
And yet, what definitive evidence are you basing your optimism on? That's the crux of the resistance that you're seeing.

If, in a few years, when developers have learned the ins and outs of it and begun to build more games from the ground up, the results are similar, then I would say you are right. So unless that happens, I won't agree without more definitive evidence.
Hmm... I don't think "weaksauce" is the correct word for the Wii U's CPU. It is definitely weaker in SIMD tasks than the 360's CPU and Cell, but Broadway was surprisingly strong clock-for-clock with them in some things like general code. It is also clocked high considering its short-stage pipelines. Each core is clocked around 70% higher than Broadway and has larger asymmetric caches with eDRAM. It is probably roughly 6x Broadway in performance (5x raw power, + 33% more efficient per core?), and ~9x Gekko.
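The "~6x Broadway" ballpark falls out of simple arithmetic. Broadway's 729 MHz clock is known; the ~1.24 GHz Wii U clock comes from hacker reports, and the +33% per-core efficiency figure is the poster's guess, not a measurement:

```python
# Reproducing the "~6x Broadway" estimate from its stated inputs.
broadway_mhz = 729   # Wii CPU clock (known)
espresso_mhz = 1243  # Wii U CPU clock (hacker-reported)
cores = 3

clock_ratio = espresso_mhz / broadway_mhz  # ~1.7x per core
raw = cores * clock_ratio                  # ~5.1x "raw" uplift
with_efficiency = raw * 1.33               # ~6.8x if +33%/core holds
print(f"per-core clock: {clock_ratio:.2f}x, raw: {raw:.1f}x, "
      f"with assumed efficiency: {with_efficiency:.1f}x")
```

Note this assumes perfect scaling across all three cores, which real workloads won't achieve, so "roughly 6x" is a ceiling rather than an expectation.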
Matt from NeoGAF (who does seem to know some things) did claim that there were some things still locked in the "final" SKU. Time will tell.

There is no magic or special sauce to Wii U. It's not like the GC or PS2, with novel architectures that needed new ways of thinking to master any extra performance. It's tech 3rd parties understand very well from having worked with similar hardware the past 7 years, and the only substantial limitations at this point beyond hardware will be APIs. E.g. it's remotely possible that there's a truckload of BW in the eDRAM but the reason we aren't seeing its clear application is the dev environment not being up to snuff yet. The console was clearly launched in beta flavour, and there may be a situation where using Nintendo's compulsory systems (if they are compulsory) means no AA until the systems are updated.
I believe it should be clarified that we are indeed talking at a technical level. Something "looking" better is subjective and can lead to the subject derailing. :smile:

What there is no doubt about whatsoever is that next-gen consoles will look better than the best the Wii U has to offer even if they don't launch for two years, assuming the rumours for them are accurate. The alternative isn't possible, just as impossible as PS2 games looking better than PS3 launch titles, which really were a poor use of the PS3 hardware. Efficiency and optimisation only get you so far. The hardware gap between the PS4 and Wii U will be a significant leap (as others say, not having a significant leap would mean PS4 games looking worse than PS3 games, meaning Sony fail to compete with their own old system, which has never happened in a console release ever).
Hmm.. I don't think "weaksauce" is the correct word for the Wii U's CPU.
It is probably roughly 6x Broadway in performance (5x raw power, + 33% more efficient per core?), and ~9x Gekko.
Yep, those commas are there for a reason.

Shifty was talking about the OoOE capability of the Wii U's CPU with his weaksauce comment, i.e., although it is an OoOE design, that element of the design is very simplistic and ineffective compared with modern OoOE designs, e.g. x86.
A modern multithreaded engine should get close to 3x performance versus a single core - it's literally 3 single-core processors, and as long as they aren't dependent on each other and are working on independent workloads, they can run at full tilt.

The 3x from the core-count increase is something you simply aren't going to actually get in anything. I would be curious to hear some estimates on how much 3 cores helps you.
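Whether "close to 3x" materialises depends on how much of the frame is serial. Amdahl's law gives the ceiling; this sketch just evaluates it for 3 cores:

```python
# Amdahl's law: speedup = 1 / (serial + parallel/cores).
# "Close to 3x" requires almost no serial work per frame.
def speedup(parallel_fraction, cores=3):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for p in (1.00, 0.95, 0.90, 0.75):
    print(f"{p:.0%} parallel -> {speedup(p):.2f}x on 3 cores")
```

Even a 10% serial portion caps you at 2.5x, and 25% serial caps you at 2x, which is why both positions above can be partly right: 3x is the theoretical limit, not the typical result.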
It's DDR3 instead of the GDDR3 or GDDR5 in the other consoles, so its absolute latency should be much lower. But if it's still going through the GPU, that'll be a hit. Still, it has the potential to be under half what it is on the Xbox 360, which could be a big benefit. But Nintendo has to be using a competent memory controller for this.
Inuhanyou said: I've tried to research this a lot, to no avail really, so maybe you can give me an answer: what is the general difference between DDR3 RAM and GDDR3 RAM? I've been going under the assumption that GDDR3 RAM, like in the 360 and PS3, is more important for GPU functions, and that DDR3 is a much more general-purpose RAM used in normal off-the-shelf computers.
But what actually separates them? And is there an advantage to Nintendo using DDR3 over GDDR3 besides the supposed cost savings? Or is that what it was plainly about?
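One concrete difference is easy to quantify: peak bandwidth, which falls out of bus width and transfer rate regardless of the DDR3/GDDR3 label. The configurations below are the commonly reported ones, not official specs:

```python
# Peak bandwidth = (bus width in bytes) x (transfers per second).
# Reported configurations (assumptions, not official figures):
#   Wii U: 64-bit DDR3-1600 (1600 MT/s)
#   360:   128-bit GDDR3 at 700 MHz (1400 MT/s, double data rate)
def peak_gbps(bus_bits, mega_transfers_per_sec):
    return (bus_bits / 8) * mega_transfers_per_sec / 1000  # GB/s

print(f"Wii U DDR3: {peak_gbps(64, 1600):.1f} GB/s")
print(f"360 GDDR3:  {peak_gbps(128, 1400):.1f} GB/s")
```

That works out to roughly 12.8 GB/s versus 22.4 GB/s of main-memory bandwidth, which is why so much of this thread's speculation hinges on what the eDRAM provides on top.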
theizzeee, there are devs and people who have worked in the business in one form or another for years on this forum. While they aren't the majority, you really might want to take the time to learn who they are.
Meanwhile, you have a lot to learn. Processors are governed by a lot of things, but power/heat tends to be the main driver; i.e., in anything with a decent design, a 100-watt system will crush a 25- or 50-watt system, etc.