Wouldn't it be ironic if the Xbox 3 essentially had more of the Wii U's CPUs?
It would be even more ironic if the Xbox 3 had more of the PS4 CPUs... again
http://www.joystiq.com/2012/04/04/gearbox-boss-says-impressive-wii-u-a-really-nice-bridge-to-the/
Gearbox boss: "Wii U is a really nice bridge to the next gen"
"Assumptions that Wii U games will look like 'up rezzed' current-gen titles with better textures aren't quite right. They'll look just as good, but not better," one developer told CVG. "You shouldn't expect anything special from a graphics point of view," they added.
"We're still working on dev machines but there have definitely been some issues [in porting PS3/360 games]," our source said. "It's not actually a problem getting things up and running because the architecture is pretty conventional, but there are constraints with stuff like physics and AI processing because the hardware isn't quite as capable."
http://www.ign.com/articles/2012/04...specs?utm_campaign=twposts&utm_source=twitter
PS4 devkit has APU + HD 7670 @ 1 GHz
An A8-3850 APU and a Radeon HD 7670... not impressive. Because there was speculation about chip stacking being used in the PS4, I'm gonna hope that there will be two of each of those onboard.
So, eh, IGN said the nextbox uses a 6670 and the PS4 uses a rebranded 6670?
Is IGN crazy? I don't think I need to trust them anymore.
What are the chances that the 7670 is there just to provide roughly similar performance to the to-be-used APU's GPU? Kind of like how early XB360 devkits used dual GPUs.
If there's any truth to this, at least it's clocked higher than in my guesstimate (I was super conservative with my 650 MHz HD 6650-based clock speed).
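Quick back-of-envelope on what the clock alone is worth, purely as a sketch: assuming the devkit part really is a Turks-class chip with 480 stream processors (the stock HD 6670 configuration; treat the shader count as an assumption), theoretical peak shader throughput scales linearly with clock.

```python
# Rough peak-throughput scaling with GPU clock (sketch, not leaked specs).
# Assumption: Turks-class GPU (HD 6670 / rebranded 7670) with 480 stream
# processors, each doing 2 FLOPs per clock (multiply-add).
STREAM_PROCESSORS = 480          # assumed HD 6670 figure
FLOPS_PER_SP_PER_CLOCK = 2       # multiply-add

def peak_gflops(clock_mhz: float) -> float:
    """Theoretical single-precision peak in GFLOPS at a given core clock."""
    return STREAM_PROCESSORS * FLOPS_PER_SP_PER_CLOCK * clock_mhz / 1000.0

for clock_mhz in (650, 800, 1000):  # guesstimate / stock 6670 / rumoured devkit clock
    print(f"{clock_mhz} MHz -> ~{peak_gflops(clock_mhz):.0f} GFLOPS peak")
```

On those assumptions the jump from 650 MHz to 1 GHz is roughly 624 to 960 GFLOPS on paper, i.e. about 50% more throughput from the clock alone.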
How much did the PS3 and X360 devkits change between their first and last versions? Can someone make a breakdown of all the versions?
I've no idea. So, honest question: do you think 32 GB/s would be a bottleneck for reading textures from main RAM? If they'd go with a split memory pool, I'm quite certain they'd use more of it for the GPU, as the biggest data hogs are on that side and not the CPU's.
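For a sense of scale, a rough per-frame budget sketch (the 32 GB/s figure is just the one being discussed; the 60 fps target and the traffic split are purely illustrative assumptions):

```python
# Illustrative per-frame bandwidth budget from a shared 32 GB/s pool (sketch).
TOTAL_BANDWIDTH_GB_S = 32   # figure under discussion, not a confirmed spec
FPS = 60                    # assumed frame-rate target

per_frame_mb = TOTAL_BANDWIDTH_GB_S * 1000 / FPS  # MB of traffic per frame
print(f"~{per_frame_mb:.0f} MB of total memory traffic per frame at {FPS} fps")

# Assume (arbitrarily) that CPU, framebuffer and geometry traffic leave only
# a third of that for texture fetches:
print(f"~{per_frame_mb / 3:.0f} MB/frame left for texture reads under that split")
```

Whether that counts as a bottleneck depends on how much of the frame's texture working set actually has to come over the bus rather than out of GPU-side caches, so this only frames the question rather than answering it.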
For starters, you'll be lucky to get a bit more than half of that 32 GB/s on 128-bit DDR3. Second, it has to be shared with the CPU. I'm quite sure it would be cheaper, and with far smaller overhead, to just go with x MB of RAM + 2-4x MB of VRAM, or just one huge shared memory pool.
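As a sanity check on the "a bit more than half" claim, here is the usual peak-bandwidth arithmetic for a 128-bit DDR3 interface; the speed grades and the 60-70% efficiency factor are assumptions, not anything from the rumoured specs.

```python
# Peak bandwidth of a 128-bit DDR3 interface: bus width (bytes) x transfer rate.
BUS_WIDTH_BITS = 128

def peak_gb_per_s(transfer_mt_s: float) -> float:
    """Theoretical peak in GB/s for a given transfer rate in MT/s."""
    return (BUS_WIDTH_BITS / 8) * transfer_mt_s * 1e6 / 1e9

for grade, mt_s in (("DDR3-1600", 1600), ("DDR3-2133", 2133)):
    peak = peak_gb_per_s(mt_s)
    # Assume 60-70% of peak survives CPU/GPU contention, refresh and
    # bus-turnaround overheads.
    print(f"{grade}: {peak:.1f} GB/s peak, "
          f"~{0.6 * peak:.0f}-{0.7 * peak:.0f} GB/s usable (assumed)")
```

Even the faster grade only reaches about 34 GB/s on paper, and the slower one lands in the 15-18 GB/s usable range under those assumptions, which lines up with the "a bit more than half of 32 GB/s" estimate above.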