> Is it not possible that there can be a fair amount of idle time for the GPU, or parts of the GPU deactivated, in 720p portable mode for games developers want to run at 1080p in docked mode? That way, there wouldn't have to be a massive "overclock".

I do not think it is possible to have such a wide range of clocks (not counting bandwidth) to deliver 2.25x the pixels at the same quality on the TV vs. the handheld. But 900p vs. 720p is doable.
At the same time, I hope Nvidia & Nintendo launch a real 1080p TV console.
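To put a number on that 2.25x figure, here's a minimal back-of-envelope sketch in Python. It assumes GPU load scales linearly with pixel count, which ignores bandwidth and fixed per-frame costs, so treat it as a rough ceiling rather than a real performance model:

```python
# Rough pixel-count scaling between the resolutions discussed above.
# Assumes per-pixel cost is constant, so the ratio is also the extra
# GPU throughput docked mode would need for the same image quality.

resolutions = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

base = 1280 * 720  # handheld target

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px -> {pixels / base:.2f}x the 720p load")

# Output:
# 720p: 921,600 px -> 1.00x the 720p load
# 900p: 1,440,000 px -> 1.56x the 720p load
# 1080p: 2,073,600 px -> 2.25x the 720p load
```

So a 1080p docked target needs roughly 2.25x the fill rate of 720p portable, while 900p only needs about 1.56x, which is why the 900p-vs-720p split looks far more plausible on the same chip.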
That's more than the Wii U sold for, isn't it?
And Nintendo is hoping for better sales than the Wii U?
Anyone who is thinking Nintendo is trying to go head to head with the other consoles is kidding themselves. If Nintendo wanted to create the best possible experience for the TV, they wouldn't have made the system portable. The key feature is the portability, and of course Nintendo unifying their first-party efforts onto a single device. I can promise you, Nintendo is not under the illusion that the Switch will be the device that plays the best-looking versions of AAA games, but if the Switch is able to acquire a wide variety of software that can be played anywhere, they have the potential to offer consumers something no one else really is. We already know how well Nintendo portables can do without western publishers; now imagine that portable with a decent amount of western support, TV play, and a unified Nintendo software offering. It's tough not to be optimistic as a Nintendo fan.
Yeah, the Wii U was 10+ year-old hardware sold for the same price as the PS4/Xbox One; of course it was priced too high for what it was.
The Switch seems to be much more up to date from a hardware perspective and will therefore be more expensive. The Wii launched at $250, and that was very bare-bones hardware.
I think anybody hoping for $250 is going to be disappointed. High $200s MIGHT be possible, but looking at similar hardware on the market, I'd say $300 is probably the lowest you can realistically expect.
Are we shader-limited in most console games? I had the impression that consoles were mostly CPU-limited.
Despite all the criticism I see in this forum about the 1.6GHz Jaguar cores, I have yet to see a single developer actually complaining about single-threaded CPU performance in the 2013 consoles.
> PlanetSide devs said the PS4 CPU is the bottleneck a few weeks ago...

Again, stating that the CPU is the bottleneck in comparison to the GPU or RAM or whatever else is not the same as complaining about the CPU.

> Developers are inclined to prioritize what sells, and what the consumer deems most attractive. Most games are not pushing AI and physics to the extreme because there isn't the same demand for them as there is for high-resolution graphics and framerate.

Then everyone should agree that prioritizing the GPU area/power budget over the CPU area/power budget was a good idea.

> I also wouldn't assume that the lack of numerous complaints about the CPU is any indication that developers think it's a great CPU. I am pretty confident that every single developer would much rather have an i5 quad core than the Jaguar CPU.

And every single developer would also much rather have a Pascal Titan X than a Bonaire/Pitcairn-class GPU in the 2013 consoles. And would much rather have 24GB of 10GT/s GDDR5X on a 384-bit bus than 8GB of 256-bit DDR3/GDDR5. And would much rather have NVMe SSDs than disc-spinning drives.

> If either the PS4 or X1 had shipped with an i5 while the other shipped with the Jaguar, I would bet we would be hearing far more grumbling about the Jaguar CPU.

Then I guess we'll see if the PS4 Pro's 2.1GHz Jaguar gets all this criticism you suggest if/when developers start comparing it to Scorpio with a Zen CPU.
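For scale, a quick ratio check on the clocks mentioned above. This counts only clock speed and ignores any IPC differences, so it's a best-case number for the Pro's bump:

```python
# Clock-only scaling of the PS4 Pro's Jaguar over the base 1.6GHz parts.
# Ignores IPC, so this is an upper bound on the single-threaded gain.
base_clock = 1.6  # GHz, 2013 consoles
pro_clock = 2.1   # GHz, PS4 Pro

print(f"PS4 Pro CPU uplift: {pro_clock / base_clock:.2f}x")  # -> 1.31x
```

A ~1.31x clock bump is still the same Jaguar core, which is presumably why the comparison to a Zen-based Scorpio is the interesting one.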