Sony's Next Generation Portable unveiling - PSP2 in disguise

That would be a highly theoretical number based on shader power exclusively (i.e. TMUs are quadrupled), and we (well, at least some of us) don't know exactly how well the architecture scales across multiple cores.

Strange, as IMG have repeatedly stated publicly that their MP IP scales virtually linearly (i.e. >95%): 2 cores gets you almost twice the performance, 4 cores gets you almost 4x. They state on their website that a 543MP4 @ 200MHz gives 133M polys per sec, which is slightly less than 4x the spec for an SGX540 (which as I recall is 35M).
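A quick sanity check of those quoted figures (a sketch; the 35M and 133M numbers are just the ones recalled above, not official specs):

```python
# Scaling check using the figures quoted above (assumed, not official specs).
single_core = 35e6   # polys/s for one SGX540, as recalled above
mp4 = 133e6          # polys/s for an SGX543MP4 @ 200MHz, per IMG's site

speedup = mp4 / single_core     # how much faster the 4-core part is
efficiency = speedup / 4        # fraction of a perfect 4x
print(f"{speedup:.2f}x speedup, {efficiency:.0%} scaling efficiency")
```

That comes out to 3.80x, i.e. 95% of perfectly linear, consistent with the ">95%" claim.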
 
Strange, as IMG have repeatedly stated publicly that their MP IP scales virtually linearly (i.e. >95%): 2 cores gets you almost twice the performance, 4 cores gets you almost 4x. They state on their website that a 543MP4 @ 200MHz gives 133M polys per sec, which is slightly less than 4x the spec for an SGX540 (which as I recall is 35M).

Fill-rate isn't everything. Even more so now that we'll be talking about shader-intensive games.
One SGX543 has 2-3x the shader power of an SGX535, but the same number of TMUs.
The SGX543MP4 has 4x the TMUs of a single SGX535.

And then at some point there'll be a memory bandwidth bottleneck. The A4's SGX535 deals with a 2-channel 32bit memory controller, and that's probably the sweet spot for the whole SoC.
I doubt the PSP2's SoC will have more than a 128bit UMA, or 64/128bit for the CPU and 64bit for the GPU.
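For rough context, a back-of-envelope peak-bandwidth sketch for those bus widths (the memory speed is my assumption, since no real memory specs are public):

```python
# Peak theoretical bandwidth = bus width (in bytes) x transfer rate.
# The 800 MT/s figure is an assumed LPDDR2-class speed, not a known spec.
def peak_bw_gb_s(bus_bits, transfers_per_s):
    return (bus_bits / 8) * transfers_per_s / 1e9

print(peak_bw_gb_s(64, 800e6))    # 2x32bit, A4-style: 6.4 GB/s
print(peak_bw_gb_s(128, 800e6))   # 128bit UMA: 12.8 GB/s
```

Either way, on a UMA those GB/s get shared between CPU and GPU, which is where the bottleneck concern comes from.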

You cannot just add in more cores and expect the end result to be a linear multiple of the one-core result.
 
BAD
- touch screen front
How is having a touch screen a bad thing?

-what is the difference between OLED and AMOLED?
There are two kinds of OLED pixel displays: active matrix (AMOLED) and passive matrix (PMOLED). All full colour OLED displays of a reasonable size are AMOLED, thus the terms OLED and AMOLED are often used interchangeably.

Exactly the same applies to (AM/PM-)LCD.

Pet peeve: both AMOLED and AM-LCD are TFT displays, as Thin Film Transistors are the core elements of the Active Matrix.
 
Here's that horrible looking interface in video form:
http://game.watch.impress.co.jp/video/gmw/docs/423/457/html/ppv.flv.html

It looks functional, I guess. But it's tacky as all hell.

Well, IMO the tackiness wasn't as bad as the lack of practical functionality :LOL:

Hierarchy, order, sensible typographical taste: all absent!
Given the splendid hardware specs, the UI could have been utterly amazing, and that would definitely have helped impressions of slickness...

I have my own misgivings regarding the exterior aesthetics of the NGP, but compared to the UI they're rather tame and personal. The UI.... *aaaaaaaaaarghhhhhhhhh*
 
How is having a touch screen a bad thing?
1. adds to cost
2. smudges on screen
3. screen is obscured by hand // most important

Having one on the back will eliminate points 2+3.
Note - you don't need to see the touchscreen when you use it, for the same reason you don't need to look at your mouse when you use it.
My worry about a touch input on the back is that you might trigger it unintentionally, e.g. your pinky finger brushing it whilst playing.

Also, personally I'm one of those guys who prefers, e.g. in phones, a physical keypad to a screen. I like the tactile feel
 
I like the tactile feel

That's what she said.
Sorry, couldn't resist..


I for one think that well-implemented haptic feedback replaces the "urge" for a tactile feel.


Which reminds me of another feature that's absent from the specs: rumble.
Does the NGP have it?
Nintendo decided to keep it out of the 3DS (and the DSi and DSi XL), even though there was a rumble add-on for the DS/Lite.
 
No, but you should worry about losing half the screen's brightness after 2 years of use.


Now for a far-fetched question:

3D gaming-wise, how would the quad A9s + SGX543MP4 compare to AMD's C-50 (dual Bobcat @ 1GHz + 16xVLIW5 + 8 TMUs + 4 ROPs @ 280MHz)?
Performance-wise, how close is the NGP's system to AMD's lowest-power x86 APU?

My Zune HD is now 2 years old and the screen looks just as bright as it did when I first bought it.
 
What are the IQ options with the SGX543? Given the VRAM (any idea what speed? We have no real detailed specs yet, do we?), will we possibly get uniform 4xMSAA or better? In that respect the screen results could look more polished than many PS3 titles. I'd certainly like to see a minimum IQ that doesn't have very obvious pixel crawl all over the shop.
 
But I digress - my main question is: whose bright idea of a "game changer" was it to come up with a completely new proprietary memory-card format? Especially if the rumours about internal storage (or rather, the lack thereof) turn out to be true.
That's got to be an anti-piracy measure. The DS's standard SD card slot has been very costly in that regard. A lack of on-board flash is bad though. Mem cards had better be competitively priced, or the cost of ownership will be notably higher than the NGP's list price.

And this thing is aimed squarely at hardcore gamers? With floaty bubbles?! Appalling design IMO.
 
Fill-rate isn't everything. Even more so now that we'll be talking about shader-intensive games.
One SGX543 has 2-3x the shader power of an SGX535, but the same number of TMUs.
The SGX543MP4 has 4x the TMUs of a single SGX535.

And then at some point there'll be a memory bandwidth bottleneck. The A4's SGX535 deals with a 2-channel 32bit memory controller, and that's probably the sweet spot for the whole SoC.
I doubt the PSP2's SoC will have more than a 128bit UMA, or 64/128bit for the CPU and 64bit for the GPU.

You cannot just add in more cores and expect the end result to be a linear multiple of the one-core result.

TBDRs have, for good reason, a healthy advantage in bandwidth-constrained scenarios. There's no bandwidth overhead for MP, by the way.

What are the IQ options with SGX543? given the VRAM (any idea what speed? We have no real detailed specs yet do we) will we possibly get uniform 4xMSAA or better? In that respect the screen results could look more polished than many PS3 titles. I'd certainly like to see a minimum IQ that doesn't have very obvious pixel crawl all over the shop.

Memory footprint and bandwidth cost for multi-sampling should be roughly in the coverage-sampling league of an IMR. Otherwise the cost is typically very small, especially if you consider that the MP4 has 64 z/stencil units. It'll be, as always, a developer affair more than anything else.
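To illustrate why multisampling tends to be cheap on a TBDR, a toy comparison of external framebuffer storage (assuming the NGP's reported 960x544 screen and 4 bytes each for colour and depth; the numbers are illustrative, not measured):

```python
W, H, BPP, SAMPLES = 960, 544, 4, 4   # assumed screen and formats

# IMR: colour and depth are typically kept per-sample in external memory.
imr = W * H * BPP * SAMPLES * 2       # multisampled colour + depth
# TBDR: per-sample data stays in on-chip tile memory and is resolved
# before write-out, so external memory holds roughly the resolved buffer.
tbdr = W * H * BPP                    # resolved colour only
print(f"IMR: {imr / 2**20:.1f} MiB, TBDR: {tbdr / 2**20:.1f} MiB")
```

Roughly 16 MiB versus 2 MiB under these assumptions, which is why the cost ends up being mostly a developer decision rather than a bandwidth one.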
 
Since PSP games will run completely through emulation on this thing, and Sony knows what held back PS2 emulation on the PS3, I'm thinking maybe they went ahead and built this in a way that makes PS2 emulation possible too, and will start putting PS2 games on PSN as a head start: by the time the PS4 comes out, they will already have most of the PS2 games on PSN.

And with this, I'm starting to see how big PSN can become.
 
I don't think summing the TMUs on the MP setup is directly comparable to that number on the single-core setup. Unless SGX MP cores can all work on the same tile, that is.
You do realise that PowerVR tiles are on the order of a thousand pixels, not hundreds of thousands? It wouldn't make sense to let multiple cores work on the same tile in the same frame.
 
I don't think summing the TMUs on the MP setup is directly comparable to that number on the single-core setup. Unless SGX MP cores can all work on the same tile, that is.

See also Arun's post. Oversimplified theoretical example: assume you have a scene with evenly distributed data across it; once the master tiling unit of the MP splits it into four (same-sized) macro tiles and assigns one to each core, why exactly would each core NOT make full use of its 2 TMUs? Same goes for any other unit.
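A toy sketch of that split (purely illustrative; a real master tiling unit balances load dynamically rather than cutting fixed bands):

```python
# Split the screen into one macro region per core; with evenly spread
# geometry, each core then runs its own 2 TMUs at full tilt.
def split_regions(width, height, cores=4):
    band = height // cores
    return [(0, i * band, width, band) for i in range(cores)]

for core, region in enumerate(split_regions(960, 544)):
    print(f"core {core}: x0,y0,w,h = {region}")
```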
 
The 2 years comment may have been exaggerated, but those 10 years sound a bit of a long shot to me.
The biggest problem is the disparity in longevity between pixels of different colors, which results in an increasingly inaccurate color display.
The blue OLEDs are rated at losing half their brightness at 14,000 hours. Reaching ~75% brightness (which is pretty noticeable) could happen after some 7,000 hours (5 years at 4 hours/day), while the reds and greens would be at ~95%.
That could become an issue still within the console's lifespan.
It's nothing to worry about right now, though.
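A quick check of those figures, assuming simple exponential decay (real OLED luminance decay is usually modelled with stretched exponentials, and the red/green half-life here is an assumption, so treat this as a rough sketch):

```python
# Remaining brightness after `hours` for a material with the given half-life,
# under a simple exponential-decay assumption.
def brightness(hours, half_life):
    return 0.5 ** (hours / half_life)

usage = 5 * 365 * 4          # 5 years at 4 hours/day = 7300 hours
print(f"blue (14,000 h half-life): {brightness(usage, 14_000):.0%}")
print(f"red/green (assumed 150,000 h): {brightness(usage, 150_000):.0%}")
```

That lands around 70% for blue and 97% for red/green, roughly in line with the ~75% / ~95% ballpark above.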

The best blue dyes currently have a half-life of around 50,000 hours max, while the max for red and green is about 150,000 hours. The solution that Sony and many others use is to shorten the life of the red and green dyes to match the blue dye. Not only does that reduce production costs, as cheaper red and green dyes can be used, it also gives an even decay rate, so the PQ stays awesome and avoids the problem you are talking about.

Sorry, but that really sounds like a pure marketing comment. Power draw wasn't even a variable in my question.
I don't know what the clocks are in that SGX543MP4, but I have a hard time believing it can beat the 500MHz Robson with 80 ALUs, 8 TMUs, full DX11 shaders and a programmable tessellator.

I do believe that if you restrict a game to the OpenGL ES 2.0 feature set, the SGX543MP4 could show better results in a scene with lots of overdraw, where TBDR shows its advantages, but I also believe a 500MHz Robson could pull off a lot better graphics if all the eye-candy features are implemented.

I can't reveal the clocks, but you can't count ATi ALUs as single units; you have to count them in blocks of 4 or 5 depending on the architecture used. At a comparable 800x600 resolution the C-50 performs very poorly in games like WoW and CoD4.
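The grouping point can be made concrete with some toy arithmetic (the utilisation figure is an assumption for illustration, not a measurement):

```python
alus, vliw_width = 80, 5          # "80 ALUs" on a VLIW5 design
units = alus // vliw_width        # really 16 independent 5-wide units
avg_slots = 3.5                   # assumed average slots filled per clock
effective = units * avg_slots     # effective scalar ops per clock
print(units, effective)
```

So the headline "80 ALUs" behaves more like 16 units that rarely fill all 5 slots, which is why raw ALU counts mislead in cross-architecture comparisons.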

Power draw is probably the most important aspect of embedded devices; the A9 + SGX543MP SoC consumes 50% less power than the C-50 while yielding more impressive results. There are downsides and trade-offs, but by avoiding x86 like the plague we will get battery life in the order of 5-6 hours vs 1-2 hours.

I still don't think you can compare the two directly; they offer different experiences, and Fusion isn't designed for embedded products. I'm sure if AMD made a Fusion product directed at embedded devices, then we could make a proper comparison.
 
1. adds to cost
2. smudges on screen
3. screen is obscured by hand // most important

Having one on the back will eliminate points 2+3.
Note - you don't need to see the touchscreen when you use it, for the same reason you don't need to look at your mouse when you use it.
With a mouse you have a cursor. For the rear touchpad to work as well as a touchscreen it would either have to detect proximity (and somehow show it on the screen, which may obscure things), or you'd need a lot of practice.

I wouldn't be totally surprised if the touchscreen will get used less and less as people get used to the rear pad and build up muscle memory. But to tap on something small like a link or for direct interaction with objects on the screen the touchscreen is still the better interface.

Yes, touchscreen adds some cost, but points 2 and 3 only apply when you actually use it at which point you also get the benefits.
 
You don't think this might be a bit of an exaggeration?

Even if he's exaggerating, I can't help thinking about what AMD would really do if they planned to enter the embedded market down to smartphone level. Something like this could be complete bullshit: http://www.digitimes.com/news/a20110111PD215.html ...or not, in the end.

***edit: by the way, the C-50's GPU is clocked at 280MHz and not 500MHz from what I can see: http://www.anandtech.com/show/4134/the-brazos-review-amds-e350-supplants-ion-for-miniitx
 
I for one think that well-implemented haptic feedback replaces the "urge" for a tactile feel.
Just like trying to tickle yourself vs someone else tickling you: at the end of the day, they just can't compare, not even close.
 