The 2GB rumor came from me after asking a couple of people. It wasn't based on wishful thinking.
So did the 512MB of RAM reserved for the OS come from you as well? Or was your source actually talking about the 512MB of flash reserved for the OS?
2GB comes from people on GAF who don't understand the tech. If you look at the SDK leak, you have 1.5GB of RAM plus 512MB of flash for the OS; they added the two together and got 2GB.
And yes, there is a ton of wishful thinking on GAF.
Care to elaborate on all that wishful thinking on GAF? Because from what I see it's been quite down to earth post-E3.
About the memory: I don't see why developers would need 1.5GB for debug tasks, that seems a bit too beefy. 1.5GB seems confirmed, but 2GB doesn't seem impossible.
But you surely seem to enjoy downplaying the Wii U as much as possible, and what's the point? We already know it will be really far from the PS720 spec-wise, so why do you need to keep at that unnecessary task?
Why do people think that 'Hollywood' will be less powerful or advanced than the Xbox2 ATI GPU?
* Work on the Revolution GPU started way before MS joined up with ATI.
* Revolution will be released way after Xbox2.
IMO the 'Hollywood' GPU will be at the very least as powerful and advanced as the Xbox2 GPU; having more time to design it and releasing a year later both point to that scenario.
Maybe graphic power will not mean anything next gen, but I'm sure a lot of people are going to be surprised with 'Hollywood'.
Oh, and I'm really satisfied that Revolution is backwards compatible; as a nice bonus we now know that aside from the innovative controls, a standard controller will be there for sure.
Awesome news without a doubt...
Pretty much anything on GAF: GPGPU, next-gen graphics... and it was way worse before the E3 crash.
I'm not downplaying anything; I've been saying the same thing since I saw the case for the Wii U when it was announced. If you are going to call someone out, at least post something to back up what you are saying. What have I been downplaying?
But I do very much enjoy the tech talk, and I'm in the camp that the Wii U, like the Wii, doesn't have to be "so powerful" to be fun to play. The Wii U is doing its own thing, and people focus a lot on specs and try to turn it into something it's not. There is a reason Nintendo doesn't even comment on specs.
And just like with the Wii before it, you have posters with a "ton of wishful thinking", like below.
http://forum.beyond3d.com/showpost.php?p=426155&postcount=8
A lot of people were "surprised" by Hollywood. It's funny how much the Wii U has been just like the Wii. But that's not a bad thing; people seem to forget the Wii outsold everything last gen.
Hahahahaha, daaaamn, I got a good chuckle from that. I feel honored that you felt the need to dig through my old posts and find a seven-year-old one about the Wii to disprove my point. I didn't remember that post, so that really made my day!
People were quite content to own a Wii despite its lack of graphical power. ("Maybe graphic power will not mean anything next gen...")
Everyone was surprised by 'Hollywood'! ("...but I'm sure a lot of people are going to be surprised with 'Hollywood'.")
"Nintendo laughs in the face of Moore's law and reverses it!!"
You know... I laughed at this IRL because it's a funny line (to a tech geek like me), but it's also true. So, so true...
Sounds reasonable. Did you give any thought to the memory?
2GB of RAM, with a large amount used by the OS. I think they will go for at least 52GB/s for the RAM; I'm not sure that is possible with DDR3, and I don't think GDDR5 is out of the question, especially if it's moderately clocked.
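As a sanity check on those numbers, here is a minimal sketch of the peak-bandwidth arithmetic (bus width times transfer rate). The bus widths and transfer rates below are illustrative assumptions, not confirmed Wii U specs:

```python
# Peak theoretical bandwidth = (bus width in bytes) x (transfers per second).
# All configurations below are illustrative guesses, not confirmed hardware.

def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: float) -> float:
    """Peak bandwidth in GB/s for a given bus width (bits) and rate (MT/s)."""
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000

configs = [
    ("DDR3-1600, 128-bit bus", 128, 1600),         # 25.6 GB/s: common and cheap
    ("DDR3-1600, 256-bit bus", 256, 1600),         # 51.2 GB/s: needs a wide bus
    ("GDDR5 @ 3.2 GT/s, 128-bit bus", 128, 3200),  # 51.2 GB/s: 'moderately clocked'
]

for name, width, rate in configs:
    print(f"{name}: {peak_bandwidth_gb_s(width, rate):.1f} GB/s")
```

The point of the arithmetic: DDR3 only reaches the ~52GB/s range with an expensively wide bus, while even moderately clocked GDDR5 gets there on a 128-bit bus, which is why the post above doesn't rule it out.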
Thanks. I think the reason devs are complaining is that GFLOPS performance is lower, and also that a thread gets tied to wireless data transfer to the controller: the more data streamed to the controller, the bigger the hit the CPU takes. Combine that with ports likely not using the DSP and you could tie up two or three threads, leaving fewer threads than Xenon. The 476FP maxed out at 2GHz, so we will have to wait and see, but I would guess the clocks are within 400MHz of that. It's just a wild guess, though.

Like your logic... so I'm thinking it's OoO and SMT? That would make it really hard for it to be weaker than Xenon, would it not? It's certainly a very similar architecture to Xenon, so it isn't going to be hard to dev for... in that scenario it should not in any way be weaker than a seven-year-old CPU, and there shouldn't be a single complaint from ANY devs about CPU power.
...If they had to underclock it for TDP, then that is very short-sighted and, in my opinion, stupid of them... setting a matchbox-sized case as the target is all well and good, but you have to make sure the machine is more powerful than your last-generation competitors and easily compatible for ports... incompetence from Nintendo if that is the truth.
Saying that, though, apart from the likely low CPU clocks the rest of the system looks twice as powerful as the PS360... not sure about disc space or HDD options... but I would be very impressed by what you have written, as long as the CPU clocks improve...
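To put rough numbers on the "GFLOPS performance being less" point above, here is a back-of-envelope peak-FLOPS comparison. Only Xenon's figures are public; the Wii U side (3 cores, 1.6GHz per the "within 400MHz of 2GHz" guess, and a Xenon-like 4-wide FMA SIMD unit) is pure assumption:

```python
# Rough peak single-precision FLOPS: cores x clock x flops per cycle per core.
# The Wii U values are hypothetical guesses taken from the discussion above.

def peak_gflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    return cores * clock_ghz * flops_per_cycle

# Xenon: 3 cores @ 3.2 GHz; 4-wide VMX128 fused multiply-add = 8 flops/cycle.
xenon = peak_gflops(3, 3.2, 8)

# Hypothetical Wii U CPU: 3 cores at the 476FP's 2 GHz ceiling minus the
# guessed 400 MHz (1.6 GHz), assuming the same 8 flops/cycle (unconfirmed).
wiiu_guess = peak_gflops(3, 1.6, 8)

print(f"Xenon peak:      {xenon:.1f} GFLOPS")       # 76.8
print(f"Wii U (assumed): {wiiu_guess:.1f} GFLOPS")  # 38.4
```

Under those assumptions the deficit comes almost entirely from clock speed, which is why the clock guess dominates the argument.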
It's very clear how Wii turned out. Nintendo saw how much it would cost to compete with Microsoft and Sony from a hardware perspective eight years ago. They knew that they would have to release an HD box that pumped out high-end graphics, but they also saw how well the Gamecube sold with its "superior" hardware compared to the king of that generation, the PS2. So they came up with a different approach in the Wii: they came up with the Wiimote, but realizing how much they had sunk into R&D for the Wiimote, they simply didn't want to spend the resources on new hardware. That is why the Wii is a 50% overclocked Gamecube.
Is there much reason to think Nintendo will have a meaty OS? They aren't pursuing the media-hub angle as actively as MS and Sony. What percentage of RAM does the DS/3DS use for its OS?
Why would the CPU be involved in wireless communication? As wireless video is such a huge part of Wii U, Nintendo should be using more efficient hardware for the job, such as video encoding to reduce bandwidth.
Why would Wuu have a thread/CPU dedicated to data transfer to the pad? That doesn't make sense.
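For a sense of why a dedicated encoder is the natural answer here, a quick numbers check. The pad's resolution and framerate below are assumed values (FWVGA at 60fps), but any plausible figures lead to the same conclusion:

```python
# Bandwidth needed to stream the pad's video uncompressed.
# Resolution and framerate are assumed values for illustration only.

width, height = 854, 480   # assumed pad resolution (FWVGA)
bytes_per_pixel = 3        # 24-bit RGB, no compression
fps = 60

raw_mbit_s = width * height * bytes_per_pixel * 8 * fps / 1e6
print(f"Uncompressed stream: {raw_mbit_s:.0f} Mbit/s")  # ~590 Mbit/s

# Real-world Wi-Fi throughput of that era is tens of Mbit/s, so hardware
# compression (an H.264-class encoder, say) is effectively mandatory --
# and with one, there is no reason to burn a CPU thread on the job.
```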
I think that's an oversimplification. Unless we're missing something, Nintendo would have lost very little by going with something like an ATI R300-series GPU, which was a couple of years old and small and cheap by then but would have offered much better visuals than the Wii. It was clearly more profitable for Nintendo to go with the Wii's final architecture, but they didn't have to be that cheap. That's the concern for Wii U: that Nintendo will make a choice that's excessively conservative.