> They can't overclock the CPU too much either.

Well, percentages are percentages of something. Running at 2GHz would require a ~66% increase in TDP.
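A rough sanity check on that figure, assuming the leaked 1.6GHz base clock and dynamic power scaling as P ∝ f·V², with a ~15% voltage bump needed to hold the higher clock (all assumptions, not confirmed numbers):

```python
# Rough TDP scaling sanity check for a 1.6GHz -> 2.0GHz overclock.
# Assumed: dynamic power ~ f * V^2, and a ~15% voltage increase is
# needed to keep 2GHz stable. Both are illustrative assumptions.
base_clock = 1.6     # GHz, leaked Jaguar clock
target_clock = 2.0   # GHz, hypothetical overclock
voltage_scale = 1.15 # assumed voltage bump

power_scale = (target_clock / base_clock) * voltage_scale ** 2
print(f"TDP scale factor: {power_scale:.2f}")  # ~1.65, i.e. ~65% more TDP
```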
> MS could always just go ahead as planned, and release an upgraded Xbox One, or Xbox Two (lol), that runs all previous Xbox One games and is more powerful, in even just a few years.

If you did that then you would screw all the millions of people with older boxes, who would be unable to run new games on their existing hardware. You would also screw developers by essentially pushing the reset button on your installed user base.
> That's probably where I would take the ball and run with it at MS.

Tempted to say, we should all count our blessings that you're not in charge then...
> If you did that then you would screw all the millions of people with older boxes, who would be unable to run new games on their existing hardware. You would also screw developers by essentially pushing the reset button on your installed user base.

You would design it to run the same games, just at lower quality, like PC or Android. Every new game runs on the old console, at lower quality. There's a thread on this subject.
> IMO, as even 900MHz won't get them parity, they should go for the CPU.

They've lower latency memory, a fancy audio processor and extra bandwidth for some things. Not much to shout about, but ~10% higher CPU/GPU clocks would create plenty of smoke.
The web can be stupid; at least a faster CPU than the competition would give people something to argue about for eons regarding the respective merits of the designs.
Guru3D has a source who confirms that the Xbox One specifications have changed, and the clocks have been revised compared to the leaked specs.
http://www.guru3d.com/news_story/mi...box_one_hardware_before_it_even_releases.html
> If the APIs remain the same, I don't see what the issue would be in releasing an Xbox Two in three or four years. That's the benefit of the API model.

Okay, I assume you are talking about the new VM model(s) for the OS, in which case that's a great point. But it doesn't answer a crucial question regarding early adopters.
> Isn't it the same piece from examiner? And where is the "confirmation"?

> Essentially, the source has confirmed that the increase in GPU clock speed is 100% true

Not that I care much, tbh, but that's what the article says. The maths behind the specifications hasn't been shown, and as such we are going with rumours when something "sounds better". Allegedly, their source also says that the 12GB rumour isn't true.
> They've lower latency memory

To eSRAM (where the CPU doesn't have direct access), sure. Otherwise not so much (DDR3 and GDDR5 are in the same ballpark as all DRAM, latency-wise).
> a fancy audio processor

Granted. It's somewhat probable SHAPE can do more than the PS4 audio processor. How relevant this will be, we have to wait and see.
> and extra bandwidth for some things.

What extra bandwidth? The eSRAM may be enough to compensate for the bandwidth disadvantage of the main RAM for some stuff (mainly ROP exports in first-generation games; later on we may see one or two examples of more unorthodox uses) and to help make up for the XB1 having just half the ROPs by providing them with adequate bandwidth (the 32 PS4 ROPs could be bandwidth-starved in some [contrived?] scenarios with HDR color formats and blending, but it's hard to imagine they will end up slower than the 16 ROPs of Durango). But the single memory pool of the PS4 will be easier to develop for and won't cause the headaches of splitting assets in the most optimal way between two memory pools just to get parity with the PS4's bandwidth.
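Back-of-the-envelope on that ROP point, using the leaked/rumoured figures (16 ROPs plus ~102GB/s of eSRAM for Durango, 32 ROPs plus 176GB/s of GDDR5 for the PS4, both at 800MHz; assumptions from the leaks, not confirmed specs):

```python
# Bandwidth the ROPs would need to sustain peak fill rate, vs what the
# rumoured memory systems provide. FP16 RGBA with blending costs a read
# plus a write per pixel: 8 + 8 = 16 bytes.
def rop_demand_gbs(rops, clock_ghz, bytes_per_pixel):
    # pixels/s * bytes/pixel = bytes/s; clock in GHz gives GB/s directly
    return rops * clock_ghz * bytes_per_pixel

for name, rops, bw_avail in [("Durango eSRAM", 16, 102),
                             ("PS4 GDDR5", 32, 176)]:
    demand = rop_demand_gbs(rops, 0.8, 16)
    print(f"{name}: need ~{demand:.0f} GB/s, have ~{bw_avail} GB/s")

# Durango: need ~205 GB/s -> also starved in this case, at half the fill rate.
# PS4: need ~410 GB/s -> the 32 ROPs can indeed be bandwidth-starved with
# HDR + blending, yet still push more pixels/s than 16 ROPs would.
```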
> Not much to shout about, but ~10% higher CPU/GPU clocks would create plenty of smoke.

It will just help them close the gap a bit more.
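To put a number on "a bit": a quick sketch using the leaked shader counts (768 ALUs for the XB1, 1152 for the PS4; rumoured figures, not official):

```python
# Raw shader throughput before and after a ~10% GPU clock bump.
def gflops(alus, clock_ghz):
    return alus * 2 * clock_ghz  # 2 FLOPs per ALU per cycle (FMA)

xb1_base   = gflops(768, 0.80)   # ~1229 GFLOPS
xb1_bumped = gflops(768, 0.88)   # ~1352 GFLOPS at +10%
ps4        = gflops(1152, 0.80)  # ~1843 GFLOPS

print(f"deficit before: {1 - xb1_base / ps4:.0%}")    # ~33%
print(f"deficit after:  {1 - xb1_bumped / ps4:.0%}")  # ~27%
```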
> Their big win, if there is to be one, is Kinect in the box and the all-in-one bit.

The all-in-one bit is probably quite US-centered. Personally, I hate set-top boxes. That means I prefer a CAM plugged directly into the TV (current models include receivers for everything from DVB-T over cable to satellite, so one really doesn't need a set-top box). And this also means I have no signal for the HDMI-in of the XB1. At least for me, the TV functionality of the XB1 is completely wasted.
> Isn't it the same piece from examiner? And where is the "confirmation"?
From the piece:
> Essentially, the source has confirmed that the increase in GPU clock speed is 100% true, with the increase happening in direct response to the PlayStation 4. The increase has been a reality since before the reveal, as they have "been reacting to Sony ever since the first leaks of both systems." The news of this bump in clock speed was announced to first parties only, as they "actively spread disinformation to 3rd parties just before reveal to prevent leaks."
I guess you could say it's news insofar as this Hilbert Hagedoorn fellow has a trusted source saying the above. Of course, the active spread of disinformation could have entrapped our trusted source as well. Would an actual speed increase be matched by the PS4, since an increase in RAM would not be?
I keep mentioning this big fact: They are trying to cool 100W (near) silently.
100W is nothing for a chip of that size.
They are not in the "overclock" or "burning hot" region at all. They are in the opposite region, by their choice.
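Rough power-density check, assuming the rumoured ~360mm² APU die (an assumption; there's no official die size):

```python
# ~100W spread over a big die is a modest power density.
tdp_w, die_mm2 = 100.0, 360.0            # rumoured die area, not official
print(f"{tdp_w / die_mm2:.2f} W/mm^2")   # ~0.28 W/mm^2
# Desktop GPUs commonly run at double that density or more on (noisy)
# air coolers, so cooling 100W quietly leaves comfortable headroom.
```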
If that's the case then I see, though my question for some time has been about the power supplies used for the PS4 and XB1: is the PS4's external, and is it going to be bigger than the XB1's? I don't know the expected size of the XB1's power supply, but we do know the actual PS4 console is going to be smaller than the XB1. I don't see how it's possible for Sony to include the power supply in the new design... unless there's something I'm overlooking.

Also, how difficult would it be to change the XB1's power supply if it was needed? I do see the costs going up a little to enable a GPU alteration, but I don't see it being too expensive to be possible.
I wouldn't consider the ability to accidentally use the wrong kind of power brick, thus destroying it, any kind of plus. AFAIK the PS3's (and presumably PS4's) internal power supply can actually recognize and deal with the regional differences in the electricity supplied. The only thing you'd have to change is the $3 cable to the wall.