No mobile has a 120 Hz screen. There are very few 120 Hz LCDs at all AFAIK.
I believe most modern televisions are actually 120Hz to allow them to display 24Hz/fps content and 60Hz content without judder. The difference is that they only have to accept 24Hz content; they're not trying to display 120Hz content at 120Hz. The motion interpolation stuff is just a fairly crude copy-frame + filter operation though; its effectiveness is variable, with sports content apparently faring best, IIRC from the last time I was on the AVS forums salivating over TVs.
Mine displays 24p content at 72fps. But then mine is also not completely "modern" anymore. I think I bought it in 2005. Aah, Pioneer Kuro, how I love thee..
Right. Most 120Hz/240Hz etc. TVs do in fact refresh at those rates, but they don't actually accept 120Hz signals. The purpose is to let film play back at its native framerate by repeating each 24fps frame 5x, 10x etc., avoiding the 3:2 pulldown that causes judder. People wrongly link frame interpolation to 120Hz, but they really are not the same thing.
Plasmas generally use 48Hz, 72Hz or 96Hz for the same purpose.
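Just to spell out the arithmetic behind the judder point, here's a quick throwaway sketch (plain Python, nothing TV-specific, only the numbers already mentioned above) of how many refreshes each 24fps film frame gets at the common panel rates, and why 60Hz is the odd one out:

```python
# 24fps film on common panel refresh rates. Rates that are an integer
# multiple of 24 show every film frame the same number of times (even
# cadence, no judder); 60Hz can't, so frames alternate 3 and 2 refreshes
# each -- the 3:2 pulldown judder mentioned above.

FILM_FPS = 24

for refresh_hz in (48, 60, 72, 96, 120, 240):
    if refresh_hz % FILM_FPS == 0:
        print(f"{refresh_hz}Hz: each frame shown {refresh_hz // FILM_FPS}x (even cadence)")
    else:
        print(f"{refresh_hz}Hz: {refresh_hz / FILM_FPS:.2f} refreshes/frame (uneven 3:2-style cadence)")
```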
FWIW, my plasma TV said 600Hz on the box. Of course, it was bought in 2009 so I'm damn sure it won't get 120Hz through HDMI.
That's the subfield drive, not the refresh rate.
And IMHO the next round of games is going to be interesting. Won't we see games with textures that suddenly need to be lower res on the PC because of the limited RAM on the graphics card? Then what? The PS4 has 8GB of GDDR5 RAM; even the most expensive graphics cards fall short of that.
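Not aimed at any particular game, but here's a rough back-of-the-envelope of why texture resolution is usually the first thing to give when VRAM is tight (the bytes-per-texel figures below are purely illustrative assumptions):

```python
# Rough texture memory estimate: width * height * bytes per texel, with
# ~1/3 extra for the mip chain. Illustrative numbers only.

def texture_mb(width, height, bytes_per_texel, with_mips=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if with_mips else base  # full mip chain adds ~33%
    return total / (1024 * 1024)

for size in (1024, 2048, 4096):
    raw = texture_mb(size, size, 4)   # uncompressed RGBA8
    bc = texture_mb(size, size, 1)    # BC7/DXT-style, ~1 byte per texel
    print(f"{size}x{size}: ~{raw:.0f} MB raw, ~{bc:.0f} MB block-compressed")
```

Doubling the resolution quadruples the footprint, which is why a 2-3GB card fills up long before an 8GB shared pool would.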
I don't recall anyone saying the 260 performed better.
First of all, it's a console game so there's no such thing as CPU or GPU bound. The CPU may be what's providing the hard limit on performance, but the GPU will still be pushed to its full potential by the simple addition of graphical effects, or even more basic: resolution.
Of course, but if you test with a CPU-bound game and use a CPU two or more times as powerful as the console's, you aren't testing only the GPU (which was the point of the article) but a combination of CPU + GPU. If they had underclocked the AMD hexacore to 1.6GHz then yes, it would have been fair. It wasn't fair. Why didn't they underclock the CPU they used? They did underclock a GPU once when they wanted to predict the future framerate gap between PS4 and X1 in a previous article.
So this isn't a simple matter of ignoring the GPU and focusing on the CPU. The GPU in the PS4 is clearly being pushed to its limits to achieve the sub-1080p resolution, or it would run higher (or match the PC's high graphical effects).
On BF4 for consoles, the main bottleneck is the CPU; the devs have implied it several times. Ergo one can't use this game to compare GPUs at all.
And just for the record, I don't see the value in claiming the 260 is a 2TF GPU while ignoring its other specs. It's as if you're trying to show that even with more power the PC GPU can only just keep up, whereas in reality the PC GPU, while (almost) 2TF, actually has only 70% of the PS4's fill rate and 60% of its memory bandwidth, so getting as close as it does is a big achievement.
It's either 1.97TF or 2TF, but they said 1.9TF in the article. Why? Who rounds down numbers like that? Don't forget the bandwidth allocated to the CPU on consoles too...
Umm, this is flat out wrong. I'm not sure where you're getting those numbers from, but the 260 has only 38% of the X1's bandwidth when you include the eSRAM, which you obviously must.
And you can't take Microsoft's PR bandwidth numbers. The eSRAM has a 109GB/s peak bandwidth, which is roughly what the GPU card has. I very much doubt the COD devs will have optimized it to hit, in certain rare cases, the ideal 140GB/s (only usable for some specific operations), and I doubt they will combine eSRAM and main RAM bandwidth in the game. The 720p-versus-1080p gap against the PS4 strongly suggests a lack of optimization for the X1 architecture. For COD, and for this comparison, we can roughly treat the X1 as a machine with at most 109GB/s of GPU bandwidth.
And what a bizarre conclusion that would be given the available evidence.
Well, many things are a matter of interpretation. I used my fuzzy logic differently than DF or you did.
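For anyone trying to follow the numbers in this back-and-forth, here's roughly where figures like 1.97TF, 70%, 60% and 38% seem to come from. The specs below are just the commonly quoted public ones for a 260X-class card, PS4 and Xbox One at launch, so treat them as assumptions rather than measurements:

```python
# Back-of-the-envelope comparison using commonly quoted launch-era specs.
# Every figure here is an assumption from public spec sheets, not a measurement.

def tflops(shaders, clock_mhz):
    # 2 FLOPs per shader per clock (fused multiply-add)
    return shaders * 2 * clock_mhz / 1e6

r7_260x = {"tf": tflops(896, 1100), "fill_gpix": 16 * 1.1, "bw_gbs": 104.0}  # 128-bit GDDR5
ps4 = {"tf": tflops(1152, 800), "fill_gpix": 32 * 0.8, "bw_gbs": 176.0}      # 256-bit GDDR5

# Xbox One: 68GB/s DDR3 plus eSRAM, quoted anywhere from 109GB/s
# (conservative peak) to ~204GB/s (combined read+write PR figure).
x1_bw_conservative = 68 + 109
x1_bw_pr = 68 + 204

print(f"260X compute: {r7_260x['tf']:.2f} TF")                                  # ~1.97
print(f"260X fill rate vs PS4: {r7_260x['fill_gpix'] / ps4['fill_gpix']:.0%}")  # ~69%
print(f"260X bandwidth vs PS4: {r7_260x['bw_gbs'] / ps4['bw_gbs']:.0%}")        # ~59%
print(f"260X bandwidth vs X1 (PR figure): {r7_260x['bw_gbs'] / x1_bw_pr:.0%}")  # ~38%
print(f"260X bandwidth vs X1 (conservative): {r7_260x['bw_gbs'] / x1_bw_conservative:.0%}")  # ~59%
```

Which of the last two lines you quote is basically the disagreement above.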
My point was that even taking those 30fps-locked games into the comparison was disingenuous when they could unlock the NFS PC game, for instance, but not the next-gen game.
The framerates are not comparable because we don't know the real framerate on console!
Changing the subject, are there any more advanced methods of post-processing AA that could see the light of day now the new hardware is out? Or are SMAA, MLAA and FXAA pretty much it?
There are and will be. Ryse is using an improved post-FX AA to very good effect IMO, which reportedly uses some temporal analysis IIRC (bit unsure on that though).
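For what "temporal analysis" tends to mean in these post-process AA schemes (this is a generic sketch of the idea, not Crytek's or anyone's actual implementation): the current frame is blended with an accumulated history buffer, so edges are effectively supersampled across frames. Real versions also reproject the history with motion vectors and clamp it against the current frame to avoid ghosting.

```python
import numpy as np

# Generic temporal accumulation sketch (not any engine's real code): keep an
# exponential moving average of recent frames so aliased edges settle toward
# their average coverage over time.

def temporal_accumulate(history, current, blend=0.1):
    """Blend the new frame into the history; lower blend = more smoothing."""
    return (1.0 - blend) * history + blend * current

# Toy usage: a flickering "aliased edge" (random 0/1 pixels) converges toward
# a smooth grey value after a number of frames.
history = np.zeros((4, 4))
for _ in range(32):
    current = np.random.randint(0, 2, size=(4, 4)).astype(float)
    history = temporal_accumulate(history, current)
print(np.round(history, 2))
```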
How the hell is the Xbox One getting a 6fps range (24fps low to 30fps high) while the PS4 is getting a 27fps range (33fps low to 60fps high)?
They probably decided to cap the X1 version because it was closer to 30fps overall anyway.
BTW, where is the 45fps peak that was mentioned for Xbox One?