Technical Comparison: Sony PS4 and Microsoft Xbox One

I never had a PS3, did it have problems streaming high definition files(larger than 4GB) over a home network?

Not over the network, but FAT32 doesn't support files that size on plugged-in HDDs. TBH, it's been a year or two since I've tried plugging in an exFAT-formatted HDD, as my HTPC is far quieter for media playback.
 

I'm always looking to consolidate equipment when I can, but have been putting off building an HTPC. I'm hoping I can use one of the upcoming consoles to do the trick, because the 360 isn't doing it for me. It is crazy loud and I can't stream some larger files.
 
Your speculative conclusion became an assertion as soon as you started defending it with such vigor.

That's ridiculous. Now I can't even disagree with YOUR assertions without my points magically turning into "assertions". Good grief. Had no idea making rigorous points constitutes 'assertions' nowadays.

Leaks have proved informative about the actual hardware; even so, you don't hold them in any regard. Speculation that deviates very far from the leaks is a stretch. This includes the 33% GPU clock increase, which this board has given numerous reasons to believe is not very likely.

I've already corrected you and addressed this.

However, you refute any detracting reasons...

God forbid! :???:

I'm simply not interested in your confused notions of what an assertion is or isn't, nor am I eager to hear you repeat your assumptions as fact. It's clear you aren't the slightest bit interested in discussing the issue further, so feel welcome to ignore any further posts I make regarding it. At this point we are simply bickering and I've grown tired of it. I've made my points and you have evidently exhausted yours, so between the two of us we can leave it at that. :)
 
That's ridiculous. Now I can't even disagree with YOUR assertions without my points magically turning into "assertions". Good grief. Had no idea making rigorous points constitutes 'assertions' nowadays.



I've already corrected you and addressed this.

An MS representative said the SoC has a 100W TDP. Shouldn't that be enough to stop any idle speculation about a late-game rescue?
 

The real rescue will come in early 2016 when MS releases the next flavor of Xbox (Xbox One Point Five) that has more gpu, cpu, and bandwidth, while maintaining 100% backwards compatibility with Xbox One due to their software infrastructure using VM configurations.
 

I have thought about that also. I heard someone on the PCPer podcast say that Xbone games wouldn't be "coded to the metal" as X360 games were.

I would love the idea of updating hardware each year, like an iPad/iPhone.
 

I wouldn't. I like the novelty of consoles and the rhetoric used during the console wars :p
If they updated annually, I'm afraid it would turn into Android vs iOS, which I don't want to be invested in.

I'll stick to PC for quick hardware updates.
 
The real rescue will come in early 2016 when MS releases the next flavor of Xbox (Xbox One Point Five) that has more gpu, cpu, and bandwidth, while maintaining 100% backwards compatibility with Xbox One due to their software infrastructure using VM configurations.

Only if XO is $299 max.
 
I'm always looking to consolidate equipment when I can, but have been putting off building an HTPC. I'm hoping I can use one of the upcoming consoles to do the trick, because the 360 isn't doing it for me. It is crazy loud and I can't stream some larger files.

OT, but a Western Digital TV is pretty awesome for this. 99 bucks. Assuming you want a dumb box rather than a full-fledged PC (I prefer a dumb box; easier).

In my experience the 360 is OK, but it sometimes has problems refreshing folders, struggles with some HD content, and I don't believe it supports MKV, etc. The WDTV fixes all those problems and in my experience just works amazingly; it plays every video you throw at it. Oh, and it's nearly hockey-puck sized, Roku style; I can't remember if it's passively cooled, but it's definitely quiet. Definitely one of my favorite products in the world. I guess the UI is kind of amateurish, though, for quibbles.

It would be nice if the Xbone were targeted to be a perfect streamer box too, fixing the issues with the 360. Unfortunately, I doubt it's an area MS is interested in. No money in it, for one (they'd rather funnel you to their pay video services), and it can be seen as condoning piracy, for two.
 
An MS representative said the SoC has a 100W TDP. Shouldn't that be enough to stop any idle speculation about a late-game rescue?

Not really, no. Looking at the TDPs of 1GHz+ AMD discrete GPUs, it's probably still doable even if the 100 watts were some super-hard limit rather than a guideline.

Example: a 1GHz HD 7770 + 1GB GDDR5 = 80 watts TDP (the 14-CU 7790 is 85 watts, too). Anand said the 8-core Jaguar probably maxes out at 30 watts. That seems close enough to 100 watts for wiggle room, and if it didn't fit in 100 watts, I doubt ending up with a ~110-watt SoC would cause spacetime ruptures or anything.
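The wiggle-room claim above is easy to check with back-of-envelope arithmetic. The figures below are the ones quoted in the post (discrete-card TDP and Anand's Jaguar estimate), not measured console numbers:

```python
# Back-of-envelope check of the wiggle-room claim, using the figures
# quoted in the post (all assumed, not measured):
#   1 GHz HD 7770 + 1 GB GDDR5 ~= 80 W, 8-core Jaguar ~= 30 W.
GPU_TDP_W = 80       # assumed discrete-card figure for a 1 GHz, 10-CU part
CPU_TDP_W = 30       # Anand's rough maximum for the 8-core Jaguar
SOC_BUDGET_W = 100   # the TDP MS reportedly quoted

total = GPU_TDP_W + CPU_TDP_W
print(f"Estimated combined draw: {total} W ({total - SOC_BUDGET_W:+d} W vs budget)")
```

So even taking the discrete-card numbers at face value, the overshoot is only about 10W, which is the "spacetime ruptures" point: the budget is tight but not wildly blown.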
 
Surely the camera angles are a bigger indication it's not being played :LOL:

You mean like basically every next-gen game shown yet (including everything at the PS4 event except Killzone)?

It's annoying, but for racing games, for example, I don't think gameplay video/screens even get released anymore. I don't think I've ever seen a gameplay angle of a Gran Turismo game before its release. You won't see gameplay till you buy it and play it.

For played next-gen footage you've got BF4 and Shadow Fall; that's it.
 
The real rescue will come in early 2016 when MS releases the next flavor of Xbox (Xbox One Point Five) that has more gpu, cpu, and bandwidth, while maintaining 100% backwards compatibility with Xbox One due to their software infrastructure using VM configurations.

You don't need a VM for forward compatibility, just well-designed OS APIs. It would be really silly if, three years from now, game releases needed different VM code branches to work around what was a solved problem in the first place.
 
BTW, did MS indicate why they went Blu-Ray?

Their official line had been that DVDs delivered more than enough capacity for games, and that things like procedural generation of textures made higher-capacity media superfluous.

Plus presumably, they're going to push streaming or digital downloads for watching video content, not discs, right?
 
You don't need a VM for forward compatibility, just well-designed OS APIs. It would be really silly if, three years from now, game releases needed different VM code branches to work around what was a solved problem in the first place.

But that's the trick, isn't it? Consoles outperform relative to their specs because they are coded closer to the metal, allowing for some neat tricks. If that is all abstracted behind an API, then we are really just looking at a duff PC. VMs or no, there won't be a simple drop-in upgrade for the next next gen.
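The API-versus-metal tradeoff being argued above can be sketched in a few lines. Everything here is illustrative (the class and method names are invented, not any real console SDK): a game written against a stable OS-level interface runs unchanged when the hardware behind that interface improves, which is the forward-compatibility claim, while metal-level code bakes in assumptions about the old chip.

```python
# Hypothetical sketch of the forward-compatibility argument: a game
# coded against a stable OS API runs unchanged on new hardware.
# All names here are illustrative, not any real SDK.

class GpuApi:
    """Stable OS-level graphics interface the game is written against."""
    def draw_triangles(self, count: int) -> str:
        raise NotImplementedError

class XboxOneGpu(GpuApi):
    def draw_triangles(self, count: int) -> str:
        return f"12-CU GPU drew {count} triangles"

class XboxOnePointFiveGpu(GpuApi):
    # Faster hardware behind the same interface: no game changes needed.
    def draw_triangles(self, count: int) -> str:
        return f"beefier GPU drew {count} triangles (faster)"

def game_frame(gpu: GpuApi) -> str:
    # The game only ever sees the API, never the chip underneath.
    return gpu.draw_triangles(1000)
```

The counter-argument in the reply is that the abstraction layer itself costs performance relative to hand-tuned metal-level code, so the forward compatibility isn't free.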
 
Yeah, I think the Forza 5 footage is unlikely to be actual gameplay; most probably it's from replay mode. I don't think any of the other Forza games' trailers showed actual gameplay either (the games always looked worse in gameplay, car-model and aliasing wise).

Do we know if the Xbone has an ARM core (TrustZone), and if that core can be used to do the same thing as the PS4's CPU in the southbridge?

Yes, it definitely has an ARM core, but I don't know if it's there for anything besides security.

And astrograd's "MS could well have increased the clocks" line is misguided. We haven't heard a single actual rumour saying they increased the clocks; it's pure speculation at this point, and the most likely scenario remains that VGLeaks are completely on the money and MS is being creative to come up with their 200GB/s.

I mean, they did claim it was a 1TF machine last gen by counting non-programmable ops (with Sony then claiming the PS3 was 2TF lol)

They also did this comparative analysis with the PS3:
http://au.ign.com/articles/2005/05/20/e3-2005-microsofts-xbox-360-vs-sonys-playstation-3?page=1

So, it is very plausible that the same tech guys who came up with that stuff this gen to make their machine look better are helping the marketing guys by doing the same for the Xbox One.
 
BTW, did MS indicate why they went Blu-Ray?

Their official line had been that DVDs delivered more than enough capacity for games, and that things like procedural generation of textures made higher-capacity media superfluous.

Plus presumably, they're going to push streaming or digital downloads for watching video content, not discs, right?

If Microsoft wants to be the one-and-only digital device under your TV, they'll need to support all those high def movie discs out there.
 
BTW, did MS indicate why they went Blu-Ray?

Their official line had been that DVDs delivered more than enough capacity for games, and that things like procedural generation of textures made higher-capacity media superfluous.

Plus presumably, they're going to push streaming or digital downloads for watching video content, not discs, right?

Blu-Ray for the movie player; the Xbox One is not a Wii U.

Yeah, I think the Forza 5 footage is unlikely to be actual gameplay; most probably it's from replay mode. I don't think any of the other Forza games' trailers showed actual gameplay either (the games always looked worse in gameplay, car-model and aliasing wise).

From EuroGamer:

From a rendering perspective, the game still operates at native 720p, but the locked 2x multi-sample anti-aliasing of the previous Forza titles has been altered to allow for an improved 4x MSAA implementation which we think is tied into the game mode selected: time-trial gives better edge-smoothing, while the more processing intensive race modes seem to be using the same 2x solution.

I guess the game aliasing is not an issue:

http://images.eurogamer.net/2011/articles//a/1/4/1/1/1/8/0/Forza_4_env2.bmp.jpg
 
Yet 3rd party devs didn't know about it. And we also have a VGLeaks rumor claiming Durango's specs were adjusted several weeks back. I won't concede anything as I'm not in a position to do so atm. Thus far I don't see any arguments being presented that don't extensively rely on assumptions that themselves are fragile. Just because you assert an assumption is valid or rock solid or 'highly likely' doesn't make it so.

I don't care how many times you repeat these assertions/assumptions or point to others repeating them. Having a discussion on this kind of topic should entail multiple perspectives and multiple hypotheses until we get more detailed info from MS to incorporate into our considerations.



Speculation isn't the same thing as an assertion. Speculation comes from pooling together information and putting the pieces together in a way that seems to fit and explain their origins. Assertions are made without pooling together information, instead relying on the circular logic of assuming something is true and then using that assumption to justify what you've assumed.

Some of you guys need to learn what words mean before you throw them around at others in efforts to dismiss them. You are asserting things, not me. :rolleyes:



And I've been correct on a large set of things as can be seen from my posts at TXB years back outlining how they'd approach this coming gen. I'll be sure to pop the cork for both of us the moment they unveil clock speeds. :smile:

You asked about why MS would want to up clocks. I'd agree it would be desperate on their end, likely informed by marketing agendas. An increased clock would give them the ability to tell consumers their box has a better CPU and an on par GPU while having more bandwidth. Will they care? Before the reveal I'd have said no, but then I also expected them to tell us all about how their box had chips every bit as fast as PS4's and they didn't. That's what got me curious about this whole thing.



Boglin,

I'm not saying my hypothesis is correct. I'm saying it needs vetting and thus far efforts to do so rely on the same assumptions/assertions my hypothesis calls into question in the first place. It doesn't matter how many times you repeat those assumptions. You can run around that circular logic all day long. Doesn't get you anywhere.

My hypothesis fits with the various disparate info we have and isn't as logistically implausible as you think it is imho. You are welcome to disagree, but do so without confusing who is assuming/asserting what in the process.

The problem I have with your 'speculation' is that it's literally not possible with the information that we have. We know the TDP of the box is 100W or thereabouts, and a 2GHz quad-core Jaguar is somewhere near 50W on its own. That only leaves 50W for the GPU, which won't really get you much. Also, the GPU and CPU seem to have the same base clock, which is 200MHz I think, so you will need to go in multiples of that.

If you look at the tables on Wikipedia you can easily find which GPU has a TDP that fits the remaining numbers. Granted, you'll see that raising the core clock from 800MHz to 1000MHz drastically increases the TDP (from 6.875W a CU to 8W a CU). So I think that's pretty conclusive on the massive-upclock side of things.

So, to sum up: with the standard clock speeds they have a 30W Jaguar and 70W left over for the GPU.

Assuming that the GPU is similar to the ones on the Wikipedia page, or at least follows the same basic thermals:

12 CUs at 800MHz would be 82.5W. Now, obviously this is a bit over, but we can probably assume some thermal savings somewhere.

12 CUs at 1000MHz would be 96W. Once again, maybe a little off, but the problem is it needs to be 26W off, and I don't see that happening.

With your clock speeds they have a 50W Jaguar and only 50W left over for the GPU.

I'm not going to bother showing the difference down here, as it's obvious the 100W budget is majorly blown with either GPU clock.
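The budget walked through above can be sketched as a few lines of arithmetic. The per-CU and CPU watt figures are the post's assumptions (scaled from Wikipedia's desktop parts), not measured console numbers:

```python
# Sketch of the TDP budget argument. Per-CU and CPU watt figures are
# the post's assumptions (from desktop-part tables), not measured data.
SOC_BUDGET_W = 100
CU_COUNT = 12

def soc_watts(cpu_w: float, watts_per_cu: float) -> float:
    """Total SoC draw as CPU power plus CU count times per-CU power."""
    return cpu_w + CU_COUNT * watts_per_cu

scenarios = {
    "1.6 GHz Jaguar + 800 MHz GPU": (30, 6.875),  # "standard" clocks
    "1.6 GHz Jaguar + 1 GHz GPU":   (30, 8.0),    # GPU upclock only
    "2 GHz Jaguar + 1 GHz GPU":     (50, 8.0),    # the disputed upclock
}

for label, (cpu_w, w_per_cu) in scenarios.items():
    total = soc_watts(cpu_w, w_per_cu)
    print(f"{label}: {total:.1f} W ({total - SOC_BUDGET_W:+.1f} W vs budget)")
```

The standard-clock case overshoots by a margin small enough to hand-wave as thermal savings; the upclocked cases overshoot by far more, which is the post's point.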
 
Not really, no. Looking at the TDPs of 1GHz+ AMD discrete GPUs, it's probably still doable even if the 100 watts were some super-hard limit rather than a guideline.

Example: a 1GHz HD 7770 + 1GB GDDR5 = 80 watts TDP (the 14-CU 7790 is 85 watts, too). Anand said the 8-core Jaguar probably maxes out at 30 watts. That seems close enough to 100 watts for wiggle room, and if it didn't fit in 100 watts, I doubt ending up with a ~110-watt SoC would cause spacetime ruptures or anything.

That would be more like a 120W TDP, then. The point isn't that they couldn't do that, but that they defined it as a 100W TDP, and there are diminishing returns anyway because of their design decisions.

I regard the whole project as MS's coming Stalingrad, for several reasons: a lost cause they'll drive into the ground to save the faces of the people in the command chain for as long as humanly possible. We should start bets on how many billions they'll vaporize before they change direction.
 