Predict: The Next Generation Console Tech

IMO that is a great idea. Cheaper than having to include two separate controllers.
Final design better not look like that lol.

I think we already had this discussion when the Move was first unveiled. Basically they should make a split controller that has all the buttons of the regular controller and make it the default controller. That way they can ditch the standard controller in a PS3 Move package. You could have one Move and one navigation controller, or, as I'd prefer, like the one in the patent where each half could act as a standalone Move controller.
 
I don't think they can ever get rid of the glowing orbs. There's just no alternative that would work as reliably. They can be measured from any angle. They remain in view even when not pointed at the TV/camera. You don't need a super-high resolution camera to get good measurements (important for high frame rate input). Unfortunately, image is important and gamers will always shun the "clown noses" for looking dumb, no matter how much better they work.
I'm a gamer and I (and many others) like the multicolor glowing spheres. Don't be afraid of the healing properties of chroma-therapy. ;) I remember some design pictures of a break apart Dualshock/Move controller, a year or two ago. I thought they were well done.
 
I don't think they can ever get rid of the glowing orbs.
They could be inflatable and retract into the controller when not using camera tracking. ;)

I do find the bright coloured balls kinda distracting. In a Move game it's not so bad, but I wouldn't like to be playing an FPS or such using it as a dual-shock and have two lights blaring away. Maybe Sony should use IR illuminated spheres and an IR camera, so they look just grey to us but can be tracked in game?
 
Maybe Sony should use IR illuminated spheres and an IR camera, so they look just grey to us but can be tracked in game?
That's an option of course, but how does the camera/software (reliably) differentiate between different bobbles when there's more than one in view? Strobing the bobbles in time with the camera shutter would work, but it requires a very fast camera, and some way to sync the two reliably and wirelessly (easier said than done with the fairly laggy CMOS consumer cameras and Bluetooth tech used...)

Maybe you should just get used to the blaring lights? ;)
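For what it's worth, distinct visible colours are exactly what make multi-controller disambiguation cheap for an RGB camera; identical grey IR blobs lose that for free. A minimal sketch with OpenCV (the hue windows are made up for illustration, not anything from Sony's actual tracker):

```python
import cv2
import numpy as np

# Hypothetical hue windows for two orbs (OpenCV hue runs 0-179).
ORB_COLOURS = {
    "p1_magenta": ((145, 80, 80), (165, 255, 255)),
    "p2_cyan":    ((85, 80, 80), (100, 255, 255)),
}

def find_orbs(frame_bgr):
    """Return {orb_name: (x, y)} centroid for each orb the camera sees."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    found = {}
    for name, (lo, hi) in ORB_COLOURS.items():
        mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] > 500:  # enough lit pixels to be an orb, not noise
            found[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return found
```

With IR you'd have to fall back on strobing or spatial heuristics to tell the two blobs apart, which is where the sync problems above come in.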
 
One thing that would help would be to set the orb brightness according to ambient brightness. It doesn't need to be a 40w bulb for the camera to clearly see it in a dark room! As part of the setup the orbs can come on dim and gradually increase in brightness until the camera has a clear view.
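Something like this loop, as a sketch; set_orb_brightness() and orb_clearly_visible() are hypothetical stand-ins for the real controller/camera APIs:

```python
def calibrate_orb(set_orb_brightness, orb_clearly_visible,
                  step=0.05, margin=0.10):
    """Ramp the orb up from dim until the camera segments it cleanly,
    then keep a little headroom for lighting changes."""
    level = 0.0
    while level <= 1.0:
        set_orb_brightness(level)
        if orb_clearly_visible():  # e.g. blob detected with a stable centroid
            return min(1.0, level + margin)
        level += step
    return 1.0  # bright room: fall back to full brightness
```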
 
How much eDRAM would it be possible to include in a next-gen console without it getting too expensive?

I suppose a better question would be how much a typical deferred renderer would need at 720p with some form of AA.

720p with 4xMSAA is ~30 MB of frame buffer?

And I'm sure that a deferred engine running at the same resolution with 4xMSAA would require a good chunk more.
 
If there's enough system BW, eDRAM isn't necessary. If using eDRAM, I guess a lot depends on how many buffers you want in eDRAM at any moment. Theoretically you only need two and combine all the buffers serially. 720p at 32-bit colour plus 32-bit depth (8 bytes per sample) is ~30 MB with 4xMSAA. 1080p would be about twice that. You'd want twice whatever your backbuffer is for combining buffers, although I presume that works in screen space so MSAA samples aren't needed? That'd mean 30 MB should be enough to render 720p 4xMSAA or 1080p 2xMSAA deferred, with the buffers fitting as simple 1280x720 x bit-depth buffers, one sample per pixel.
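Rough numbers to sanity-check that, assuming 8 bytes per sample (32-bit colour + 32-bit depth); actual G-buffer layouts will differ:

```python
# Back-of-envelope framebuffer sizes. Assumes 8 bytes per sample
# (32-bit colour + 32-bit depth); real G-buffer layouts vary.
def framebuffer_mb(width, height, bytes_per_sample=8, msaa=1):
    return width * height * bytes_per_sample * msaa / (1024 ** 2)

print(framebuffer_mb(1280, 720, msaa=4))    # ~28.1 MB: 720p 4xMSAA
print(framebuffer_mb(1920, 1080, msaa=2))   # ~31.6 MB: 1080p 2xMSAA
print(framebuffer_mb(1920, 1080, msaa=4))   # ~63.3 MB: 1080p 4xMSAA
```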

If you want to use the eDRAM for texturing and stuff as well though, you'd want plenty more.

It's worth pointing arijoytunir to this eDRAM thread. As expense is a measure of compromise (more eDRAM means less processing power from the same size silicon), there's no straight answer to the question.
 
[image: o8i2yg.png]

Seems to point to a Jaguar for sure, if we can believe it.
 
You all seem to be missing the important picture in that Sony patent:

[image: Fullscreen_capture_30112012_110358.jpg]

So it doesn't necessarily have to be a big orb. Something with a more discreet profile could be more appealing to consumers.
 
How do you get signals from the PCB to the die on top of the interposer through the interposer? Yes, with TSVs! ;)

Or wirebond; what do you need to go off-interposer for? Power in, HDMI out, SATA in/out. What else?

Actually, if it were just TSVs through the interposer, it would already be a done deal.

The question in my mind is whether there's an interim solution that preserves much of the benefit of stacking (for consoles, high bandwidth) while waiting for the process to mature (2015?).
 
How much eDRAM would it be possible to include in a next-gen console without it getting too expensive?

The latest and greatest chip is IBM's POWER7+, an update and shrink of the POWER7 to 32nm; it has 80MB of eDRAM L3.
It's a very expensive 8-core CPU with lots of stuff.

If Nintendo can afford 32MB of eDRAM on a 40nm GPU, then a bigger console may afford a higher amount, maybe 64MB, with 80MB as an upper limit - maybe that's stretching it. But that could allow easier 1080p and high-quality 720p.
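A quick fit check against those capacities, under the same rough 8-bytes-per-sample assumption as the framebuffer figures earlier in the thread (a sketch, not real hardware budgets, and it ignores tiling overhead):

```python
# Which render targets fit in a given eDRAM budget? Assumes
# 8 bytes per sample (32-bit colour + 32-bit depth).
targets = {
    "720p 4xMSAA":  1280 * 720 * 8 * 4,
    "1080p 2xMSAA": 1920 * 1080 * 8 * 2,
    "1080p 4xMSAA": 1920 * 1080 * 8 * 4,
}
for budget_mb in (32, 64, 80):
    fits = [name for name, size in targets.items()
            if size <= budget_mb * 1024 ** 2]
    print(budget_mb, "MB holds:", ", ".join(fits))
```

By that yardstick 32MB already covers 720p 4xMSAA, while 64MB is about where 1080p 4xMSAA starts to fit.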
 
If there's enough system BW, eDRAM isn't necessary.

Is eDRAM needed if we want backwards compatibility with Xbox 360 games?
 
MS was rumored to simply include a Xenon CPU.

For the PS3, well, there were early PS3s with full PS2 hardware, then PS3s that left out the CPU, I believe, while keeping the GPU+eDRAM, and then PS3s with nothing, which run no PS2 games at all.
 