Predict: Next gen console tech (9th iteration and 10th iteration edition) [2014 - 2017]

It's something most people just call 'blur'. This is an example:


It's a very low-res video, but it shows pretty well that during the white test-screen pattern, one display shows the lines moving whereas the other just doesn't keep up and blurs the image to the point where you basically no longer see the lines.

This happens in all TVs to a certain extent and deteriorates the image quality. Think about moving the camera (not even too quickly) in a game, say with lots of detailed trees in the background: some TVs will retain more detail and others will blur it and show a green mess.
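For a rough sense of why "motion resolution" differs from static resolution: on a sample-and-hold display, your eye tracks the moving object while each frame stays fixed on screen, so the frame gets smeared across the retina. A minimal sketch of that back-of-envelope math (the pan speed and persistence figures are illustrative assumptions, not measurements of any particular TV):

```python
# Rough estimate of sample-and-hold motion blur.
# Assumption: the viewer's eye tracks the moving object, so each
# frame is smeared across the retina for as long as it is held.

def persistence_blur_px(pan_speed_px_s: float, refresh_hz: float,
                        persistence_fraction: float = 1.0) -> float:
    """Blur width in pixels = tracking speed * time the frame is held."""
    frame_time_s = 1.0 / refresh_hz
    return pan_speed_px_s * frame_time_s * persistence_fraction

# A 960 px/s pan (half the width of a 1920-wide panel per second)
# on a full-persistence 60 Hz display smears edges across ~16 px:
print(round(persistence_blur_px(960, 60), 1))   # 16.0
# The same pan with 50% persistence (e.g. backlight strobing) halves it:
print(round(persistence_blur_px(960, 60, 0.5), 1))   # 8.0
```

This is why strobing/scanning-backlight sets and plasmas can resolve more lines in motion at the same panel resolution.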
 
Is motion resolution a plausible reason why some consumers can't see much difference between 720p and 1080p while the game is in motion? E.g. my plasma does 700 lines in motion, my Sony does 1080. I'm not going to see all that sharpness or resolution difference unless I switch to a PC monitor or a better TV?

If so, good to know. Lol, my Xbox pairs well with my plasma, and if I get a PS4 I'll pair it with my Sony. LOL
 
PS3, 2006, 256 GFlops (actually a 2005 architecture, no?). 16x == 4 TFlops. That arrived in the GTX 780, 2013, 7 (8) years later, at a crazy price-point. What makes you think that 16x PS4, 29 teraflops, will be achievable and affordable in five years?

Well, technically PS3 launched in November 2006 and AMD's 7970 was basically at 4 TF in January 2012, only 5 years and 2 months after the PS3. Additionally, PS3's GPU was closer to the high end back then vs the PS4 at the end of 2013. AMD's 290X launched in 2013 and has 5.6 TF.

29.4 TF / 5.6 TF is only an increase of 5.25x, so with a new node shrink coming pretty soon, at least one more should fit within that timeframe, plus the increased focus on improving performance/watt, it's not a pipe dream to have a GPU around that performance level at the high end. I probably have to pretend that I didn't see that affordable part though :)
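The 5.25x figure can be turned into a timeline with simple compound-growth math. A sketch, where the yearly improvement rates are assumptions picked for illustration, not measurements:

```python
import math

# How many years of steady growth does the high end need to go from
# the 290X's 5.6 TF to ~29.4 TF (16x PS4)? Pure compound-interest math;
# the yearly rates themselves are guesses.

def years_to_multiple(multiple: float, yearly_rate: float) -> float:
    """Years of steady growth needed to reach `multiple` x today's perf."""
    return math.log(multiple) / math.log(1.0 + yearly_rate)

needed = 29.4 / 5.6          # ~5.25x over a 290X (5.6 TF, late 2013)
for rate in (0.30, 0.40, 0.50):
    print(f"{rate:.0%}/yr: {years_to_multiple(needed, rate):.1f} years")
```

At an assumed 30-50% yearly improvement, 5.25x works out to roughly 4 to 6.5 years, which is what makes the "around 2020" claim at least arithmetically plausible.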
 
That is cheating! The PS4 GPU is not 4 TF, which would be the direct comparison. What is the point of comparing to a PC GPU?
Did we forget about the power and heat restrictions?

So on PC you may expect 30 TF or maybe more, but for a console you can expect about half that, unless you want your console to be as big as an ATX-sized PC and very expensive.
 
Well technically PS3 launched in November 2006 and AMD's 7970 was basically at 4TF in January 2012, only 5 years and 2 months after the PS3.
Okay, didn't compare AMD parts.
Additionally PS3's GPU was closer to high end back then vs PS4 at the end of 2013.
That's kinda the point. Had PS4 gone for power, it could have had far more Flops, 4 TF. But it didn't because Money. ;)
 
In November 2006 (PS3 premiere) I already had a GeForce 8800 GTX, which was twice as fast as the PS3. The GeForce Titan, the most powerful PC card available at the PS4 premiere, is like 1.7 times faster than a Radeon 7870. The point is that the PS4 (apart from the CPU) was closer to the high end than the PS3 was at its premiere.
 
It would have been interesting to see what would have happened had Sony launched at the same price as the XOne+Kinect, and used those extra $100 (or whatever the difference was) on a beefier setup. Interesting both in that maybe it wouldn't have succeeded like it did, and also that perhaps we wouldn't have seen that much of a difference to justify the increase in price, after all. Who knows!
 
That is cheating! The PS4 GPU is not 4 TF, which would be the direct comparison. What is the point of comparing to a PC GPU?
Did we forget about the power and heat restrictions?

Shifty's comparison was PS3 to GTX 780, and he stated that a similar TF jump is probably not possible in the next 5 years or so; I just stated that a GPU with that kind of power might come out. 2020 is still some time away, and continued mobile-first development might bring further large advances in perf/watt on desktop as well. Look how much mobile GPUs have improved; Maxwell is a nice step in the right direction and I expect tech to improve further. I'm sure the laws of physics still allow more performance. :)
 
That's kinda the point. Had PS4 gone for power, it could have had far more Flops, 4 TF. But it didn't because Money. ;)

Yes, but pushing as hard as they did with PS3 would make it easier to put distance between PS5 and PS4. Not too likely, yeah, just saying.

The GeForce Titan, the most powerful PC card available at the PS4 premiere, is like 1.7 times faster than a Radeon 7870. The point is that the PS4 (apart from the CPU) was closer to the high end than the PS3 was at its premiere

This is wrong in many ways. For starters, the 7870 has 40% more flops than the PS4, and comparing flops across different architectures/manufacturers is flawed. The AMD 290X was also out when the PS4 launched, and it has 3.04x the TF of the PS4. The PS4 isn't even close to being as high end as the PS3 was, let alone what the 360 was in 2005.
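Those ratios check out against the commonly cited theoretical peak single-precision figures. A quick sanity check (the TF numbers are the usual published peaks, not measured performance):

```python
# Sanity check of the flops ratios quoted above, using commonly cited
# theoretical peak single-precision TFLOPS figures.
ps4    = 1.84   # 18 CUs @ 800 MHz
hd7870 = 2.56   # 20 CUs @ 1000 MHz
r290x  = 5.6    # 44 CUs @ up to 1 GHz

print(f"7870 vs PS4: {hd7870 / ps4:.2f}x")   # 1.39x, i.e. ~40% more
print(f"290X vs PS4: {r290x / ps4:.2f}x")    # 3.04x
```

As the post says, peak flops across different architectures are only loosely comparable; within the same GCN family, though, the ratios are a reasonable first-order proxy.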
 
Dr Evil said:
Shifty's comparison was PS3 to GTX 780, and he stated that a similar TF jump is probably not possible in the next 5 years or so; I just stated that a GPU with that kind of power might come out. 2020 is still some time away, and continued mobile-first development might bring further large advances in perf/watt on desktop as well.

I do expect the chip-stacking DRAM technology to make a massive difference to performance, however. The fabs are slowing down considerably going from node to node, and it is getting more expensive. For the first time it is not the GPU companies that are using the cutting-edge 20nm node but the mobile chip companies.

Look how much mobile GPUs have improved; Maxwell is a nice step in the right direction and I expect tech to improve further. I'm sure the laws of physics still allow more performance.
That is also already on 20nm.

Tegra X1 is not mobile, unless you mean it is aimed at the automobile market. ;-) The tech is nice, but seriously, it is a bit overhyped: we are talking about an NVIDIA GeForce GT 730 equivalent in its current iteration, one that requires a huge heatsink and probably quite a bit of power. No good for a phone or tablet in its current version.
 
This is wrong in many ways. For starters, the 7870 has 40% more flops than the PS4, and comparing flops across different architectures/manufacturers is flawed. The AMD 290X was also out when the PS4 launched, and it has 3.04x the TF of the PS4. The PS4 isn't even close to being as high end as the PS3 was, let alone what the 360 was in 2005.

This is not wrong at all.
I'm talking about in-game performance, not flops, and in this case the proportional difference between the PS4 (Radeon 7850-7870 class) and the Titan is much smaller than between the GeForce 7900 series and the GeForce 8800 GTX, which was the high end in 2006.
Not to mention that the RSX was much slower than the 7900 series, with a gimped memory bus and half the ROPs.

PS3 was not the high end then, and PS4 is not the high end now.
 
Tegra X1 is not mobile, unless you mean it is aimed at the automobile market. ;-) The tech is nice, but seriously, it is a bit overhyped: we are talking about an NVIDIA GeForce GT 730 equivalent in its current iteration, one that requires a huge heatsink and probably quite a bit of power. No good for a phone or tablet in its current version.
This is just wrong.
 
Yes, but pushing as hard as they did with PS3 would make it easier to put distance between PS5 and PS4. Not too likely, yeah, just saying.



This is wrong in many ways. For starters, the 7870 has 40% more flops than the PS4, and comparing flops across different architectures/manufacturers is flawed. The AMD 290X was also out when the PS4 launched, and it has 3.04x the TF of the PS4. The PS4 isn't even close to being as high end as the PS3 was, let alone what the 360 was in 2005.

It's also worth remembering that the enthusiast GPU market in 2005 was a completely different beast to what it is today. Correct me if I'm wrong, but I think SLI was re-introduced by Nvidia just before that time, PCIe came out, and basically the era of silly-high budgets for highest-end graphics started to develop then. And today, as long as you have the cash, you can build yourself a PC with many times the teraflop performance of a console, and let's not even talk about the CPU side of things.

At the same time, consoles have gone from the days of the PS2 and PS3 where they were launched near the top of the performance game, selling at a loss and basically ruining Sony, to a much healthier market today where no console is sold at a loss, and performance takes a back seat.

The two markets have basically gone in opposite directions so I fail to see the point of continuing to compare their hardware and performance today. It's all about the money, boys, and thank god for that. For the price, I'm quite happy with the things I see coming out of my PS4.
 
This is not wrong at all.
I'm talking about in-game performance, not flops, and in this case the proportional difference between the PS4 (Radeon 7850-7870 class) and the Titan is much smaller than between the GeForce 7900 series and the GeForce 8800 GTX, which was the high end in 2006.
Not to mention that the RSX was much slower than the 7900 series, with a gimped memory bus and half the ROPs.

I think it matters quite a bit in this argument that the 8800 GTX launched basically at the same time as the PS3, whereas something like the AMD 7970 had been out for nearly two years before the PS4 while offering far better performance. The 8800 GTX is one of, if not the, most impressive GPU launches ever, so I admit it looks great here, but if you really pick a truly GPU-limited situation, the difference between the 290X and the 7850 is pretty huge, and that's within the same core architecture. The G80 was a major overhaul over the G70. The PS3 did suffer a bit here launching late, but by the same token the 360 looks that much stronger compared to PC GPUs, and the One even weaker.

That is also already on 20nm.

Tegra X1 is not mobile unless you mean it is aimed at the automobile market. ;-). The tech is nice but seriously it is a bit over hyped - we are talking about an NVIDIA GeForce 730 equivalent in its current iteration that requires a huge heatsink and probably quite a bit of power. No good for a mobile or tablet in its current version.

I meant the benefits desktop Maxwell got from the increased focus on perf/watt driven by mobile development. Mobile GPUs themselves have seen a huge boost in performance in the last 5 years, and nVidia seems able to bring some of that to desktops as well. I hope AMD can do the same, as they seem more likely to be in the future consoles. I just worry that their R&D capability has been suffering too much with their poor finances...

I'm quite happy with the things I see coming out of my PS4.

Me too, but I probably would have been happy with a 200W slightly larger PS4 as well. :)
 
The huge transition to day-one digital makes me wonder if you can build a 9th-generation console that doesn't have backwards compatibility. Plus, think about the huge number of software investments made this generation in apps; I'm not sure we can go through that again. Both Sony and Microsoft made a huge bet that the other would not have backwards compatibility, since PPC was simply not the right choice to scale up to modern performance levels, but next time around, who knows?

If it comes down to box A being more powerful but hitting the reset button again on available games (i.e. none) and OS/system software, vs box B being weaker but feeling like an upgraded 8th-gen box with more power and all 8th-gen games available, I would go all in on box B in a heartbeat. We've already seen far more people spend time and money playing games on phones than on consoles, so my ultimate question becomes: does better software outdo more powerful hardware (Microsoft), or does more powerful hardware outdo software (Sony)? The answer to that question, I feel, dictates everything from CPU arch to final BOM to the approach of capturing more of the people who spend most of their time on phones through streaming (assuming mobile networks can handle such a thing). I mean, a 50 Mbps, sub-20ms, reliable mobile connection to a phone plus a controller changes a lot about the economics of the hardware being built, since cost considerations can be reshuffled.
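For scale, that hypothetical 50 Mbps link can be turned into a per-frame video budget with trivial arithmetic (the link speed and frame rate are just the numbers from the scenario above):

```python
# Back-of-envelope budget for the streaming scenario:
# what does a 50 Mbps link give the encoder per frame at 60 fps?
link_mbps = 50
fps = 60

bits_per_frame = link_mbps * 1_000_000 / fps
kib_per_frame = bits_per_frame / 8 / 1024
print(f"{kib_per_frame:.0f} KiB per frame")   # 102 KiB
```

Roughly 100 KiB of compressed video per frame is comfortable for 1080p60 with a modern codec, which is why the bandwidth side of game streaming is less of a problem than the latency side.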

Or maybe I'm just way off the mark. I just feel these boxes can no longer be built in isolation of other things happening in the world, or they will go the way of the dodo and all we'll get is Flappy Birds X. :)
 
Or maybe I'm just way off the mark. I just feel these boxes can no longer be built in isolation of other things happening in the world, or they will go the way of the dodo and all we'll get is Flappy Birds X. :)
Backwards compatibility is certainly desirable, but from what Sony have said about the large number of non-PlayStation owners getting a PS4, I don't think it's that big a deal. If you move platforms, BC is out anyway.

But predicting the priorities and whims of the market as it will be in 4 to 5 years is impossible. Lack of compatibility hasn't hindered this gen or the last.
 
I know it's been mentioned in this thread once before already, but what are the chances that Next-Gen is delayed long enough that we end up with something that looks like a much more integrated CPU-GPU hybrid than the APUs which exist in the current generation consoles?

I was under the impression that this was the direction AMD was going in with the whole HSA initiative (which no one seems to talk about anymore).

The idea of a scalable CPU/GPU hybrid with shared local memory resources and even some fixed-function units sounds interesting to me; pretty much a paradigm shift in computing. Would such a thing be possible? Maybe even something like Larrabee, but with GPU shader arrays and perhaps one huge eDRAM/ESRAM L3 cache in addition to the sea of CPU cores?

How far away are we to something that really starts to blur the line between what a CPU and a GPU are traditionally considered to be?
 
Nvidia's GTX 960 would have been a good GPU for this gen: 2.3 teraflops on a 128-bit bus, with Nvidia's bandwidth-optimizing tech making that possible.

I suppose that would have limited it to 4 GB of RAM, but Samsung has double-density chips coming that would have enabled 8 GB.

I still like Nvidia for a wild card possibility for next gen.
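The 128-bit bus figure implies a fairly modest raw bandwidth, which is the trade-off the delta-color-compression tech is meant to offset. A sketch of the arithmetic, assuming the 7 Gbps GDDR5 the retail GTX 960 shipped with:

```python
# Illustrative bandwidth math for a GTX 960-style setup.
# Assumption: 7 Gbps (per-pin) GDDR5, as on the retail GTX 960.
bus_width_bits = 128
data_rate_gbps = 7.0    # effective transfer rate per pin

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")   # 112 GB/s
```

112 GB/s is well under the PS4's 176 GB/s on paper, which shows how much Nvidia's compression has to make up.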
 