News & Rumours: Playstation 4/ Orbis *spin*

What's the efficiency of a 680, closer to Durango or closer to Xenos?

Well, I don't know about Nvidia's architecture, but I do know that Thuway's quote was talking about the improvements of the GCN architecture over Xenos, not about the ESRAM.

But I think it's safe to assume Nvidia has a very efficient architecture and reaches nearly, if not exactly, the same efficiency as GCN.
 
Thread Title Key English words: News, Rumours, Playstation

Things Not in thread title: Business, strategies, MyLife purchases, Xbox.
 
Some tidbits from the Japanese press, courtesy of GAF again. Take it with a grain of salt as usual:
http://m.neogaf.com/showthread.php?t=516480

- PS4 to Vita Remote Play will run at native Vita resolution for all titles.

- You do not need to re-encode or change the format when you transfer movies from PS4 to PC.

- The DS4's touchpad is not a priority feature, but some developers have interesting ideas for its usage.

- The PS4 Eye's two cameras will detect the user's location even if the user doesn't have a DS4 or Move controller.

- They might add PS1, PS2, and PSP emulation in the future, if possible.

I looked at the source article; it also mentions background separation and simultaneous photo capture and motion tracking.

[Disclaimer: I took Japanese 101 long ago, don't lean too hard on my Japanese translation :)]
 

Well, maybe you can read some of this for me. It's from the PlayStation 4 Eye patent.


[Images: pages 19, 20, 21, and 28 of patent WO2013014844A1]
 
Have there been any updates as to when the game streaming services (Gaikai or Remote Play) will actually start working? I thought it would not be ready at launch, based on an article I caught on my phone (quotes from Tretton, maybe?), but I was later not able to find which website it came from. If it is not ready for launch, I wonder what kind of rollout they are thinking about.
 
It looks like a "simple" patent for capturing stereo photos/video as a series of scaled images. The unit can output raw or filtered images (e.g., cropped).
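
For what it's worth, a minimal sketch of what "a series of scaled images" could look like in practice, going only off that summary (the function names and the 2x2 averaging are mine, purely for illustration, not anything taken from the patent):

Code:
import numpy as np

# Toy sketch only: progressively scaled copies of a frame, plus an
# optional cropped ("filtered") output. Not the actual Eye pipeline.
def image_pyramid(img, levels=4):
    """Halve resolution at each level by 2x2 block averaging."""
    out = [img]
    for _ in range(levels - 1):
        h, w = img.shape[:2]
        img = img[:h - h % 2, :w - w % 2]  # trim to even dimensions
        img = img.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))
        out.append(img)
    return out

def crop(img, x, y, w, h):
    """A cropped window, one example of a 'filtered' output."""
    return img[y:y + h, x:x + w]

frame = np.random.rand(480, 640, 3)      # stand-in for one camera frame
pyramid = image_pyramid(frame)
print([p.shape[:2] for p in pyramid])    # (480, 640), (240, 320), (120, 160), (60, 80)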

I kinda picked up what it was for; I was just wondering what was being said in the images.
 
As others have pointed out in other threads, this source has no history of accuracy (or inaccuracy, for that matter).

I'm guessing they didn't mention CPU clock speed because they don't deem it important or they simply haven't decided yet.

1.6GHz simply doesn't make sense to me. (That was the clock of Cell's ring bus.)

My AMD Barton-core CPU from 2003 is clocked at 1.8-2.0GHz. Ten years later, I'm not sure in-order single-thread processing is really much quicker. In ten years, has single-thread performance even doubled in multi-core processors? ~ I get that 1.6 is half of the 3.2GHz the PS3 had, but 2GHz seems like the minimum to expect.
(My Gateway 64-bit dual-core laptop from 2007 is 1.6GHz.) Could the PS4 be 2.0-3.2GHz? (3.2 makes a lot of sense; 1.6GHz was the clock of the PS3's internal ring bus.)

Does anyone have any sources? I keep thinking the PS4 is a tablet when I see 1.6GHz.
 
It has to do with pipeline length, power, and out-of-order processing. A single Jaguar thread at 1.6GHz will run 2-3 times the number of instructions per second of a PPE thread. Two Jaguar threads (or cores) would handily outperform the PPE with its hyperthreading. Sony has put 8 cores in the PS4, which would destroy the PS3's PPE to the tune of more than 16x. The GPU also does not need SPEs to help it out; they could shave 4 CUs off for SPE-like compute work and still have more than enough graphics power to spare.
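
A quick sanity check of those multipliers, plugging in the numbers quoted above (the 2-3x per-thread figure is the estimate from this post, and the SMT gain below is my own rough assumption, not a measurement):

Code:
# Rough sanity check using the figures quoted above.
ppe_smt_gain = 1.2          # assumption: two PPE hardware threads ~ 1.2x one thread
per_thread_advantage = 2.5  # midpoint of the quoted 2-3x Jaguar-vs-PPE-thread range
jaguar_cores = 8

jaguar_total = jaguar_cores * per_thread_advantage   # 20.0
ppe_total = 1.0 * ppe_smt_gain                       # 1.2

print(jaguar_total / ppe_total)   # ~16.7x, in line with the "more than 16x" above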
 
Well, clock speed clearly depends on the architecture. Bobcats were pretty much dogs when it came to overclocking (though they usually did not come with nice motherboards offering a lot of options); Jaguar should be better, but by AMD's own numbers, not by much.
Bobcat ran at 1.6GHz; AMD promised a 10% increase in clock within the same power envelope, or less power at the same clock speed.

I guess it comes down to how many good chips Sony can get out of a wafer; pushing the limit on the clock speed may have them discarding chips that would otherwise meet their requirements (8 CPU cores are OK, 18 CUs and the 32 ROPs are OK, memory controller is OK, etc.).

I'm not sure what their margin is with regard to power either. An HD 7850 sucks some juice; add the CPU cores and it becomes a considerable amount to dissipate out of a single chip. Definitely doable, but as usual it comes at a cost: the cost of a better cooling system. They already pushed the limit with regard to RAM.

W&S. I would think that 1.6GHz is the bottom line; if they have impressive yields and a lot of chips that would qualify to run at, say, 1.8GHz, they may consider a light bump "for free". My expectations are low on the matter, and if it were to happen I don't expect more than a couple hundred extra MHz above 1.6GHz. 2GHz sounds highly unlikely to me (looking at power and AMD's claims for the Jaguar CPU cores).
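
For what it's worth, the arithmetic behind that expectation, using AMD's quoted ~10% figure:

Code:
# Back-of-envelope: Bobcat's 1.6GHz plus AMD's claimed ~10% clock gain
# for Jaguar within the same power envelope.
bobcat_clock = 1.6                      # GHz
jaguar_clock = bobcat_clock * 1.10
print(round(jaguar_clock, 2))           # ~1.76 GHz, well short of 2GHz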
 

Thank you, Liolio. ~ It sucks that I worry so much about something like this, so it helps that there are positive people to reassure us naive enthusiasts.

bkilian, what a Jaguar can do when all the cards are in order and paired together may be impressive. ~ The reason for my needless worry or rejection of the 1.6GHz claims is that the SPU on Cell could do 32-bit operations in 7 cycles, and the internal ring bus was pretty efficient. Most CPU cores can do more, but the time it takes to return a result has often been much longer. I'm looking at http://developer.amd.com/resources/documentation-articles/ trying to see how well the internal latency compares to STI's Cell Broadband Engine Architecture.

This may not apply, but the construction rule is to plan for 20% waste. So even if it is double per cycle, that ain't enough; what I want to know is how long it takes to get the result back.

I'm really wasting my breath. My anxiety: where is the 16MB of L3 cache to make up for GDDR5's slowness in responding to CPU requests? I want those ninja-quick coding elements, like rain and light deformation, that don't require texture processing. The Tanker level in MGS2 is still the most amazing rain effect I have ever seen. I could watch the rain bend and sway from the top of the screen until it hit the deck and splashed...

People love big rolling explosions, but the really small details require really fast, low-latency memory. High bandwidth is great for large amounts of data, but online FPS players all know it is the latency between when you see them and when they see you that decides alive or dead, not the size of the graphics memory.

1. Does the PS4 need L3 cache too?
2. Would Sony be smarter to stop and go back to the drawing board than to release with 8GB of GDDR5?
3. Would you rather have (4MB L2 + 16MB L3) + 4GB of GDDR5, or 4MB L2 + 8GB of GDDR5?

For #3: which system could you code more amazing ninja graphical tricks into?
 
*cough* *cough*

Jaguar Vanilla
* 1.8GHz LC Clocks (can be under-clocked for specific low-powered battery device needs - tablets, etc...).
* 2MB shared L2 cache per CU
* 1-4 CUs can be outfitted per chip. (i.e. 4-16 logical cores)
* 5-25 watts depending on the device/product. (45 watts is achievable under proper conditions)

PS4 Jaguar with chocolate syrup.
* 2GHz is correct as of now.
* 4MB of total L2 cache (2MB L2 x 2 CUs)
* 2 CUs (8 Logical cores).
* Idles around 7 watts during non-gaming operations and around 12 watts during Blu-ray movie playback. Gaming is a mixed bag...

What would be nice is a fully loaded Jaguar chip.

Quit bringing up platforms beyond the scope of the thread.
 

So you think they should just keep cranking the GHz up?

They are all pretty boring labels, like input processing component, image processing component, demosaic component, etc.

Thanks.
 
Well, yes, if you're talking about the entire Cell as a unit, then even an 8-core Jaguar would not be able to compete. The Cell had about 150 GFLOPS or more of compute. An 8-core Jaguar sits at about 100 GFLOPS. Add a single CU from the GPU and you've exceeded the Cell.

As for latency-sensitive operations, sure, it's going to be trickier, but they'll manage. It doesn't matter if it takes twice as long to return a result if you can throw 4x as many resources at it.
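
Those round numbers line up with rough peak single-precision figures, assuming 6 game-available SPEs on the PS3 and the commonly reported 1.6GHz / 800MHz PS4 clocks (theoretical peaks only, as a sketch):

Code:
# Peak single-precision FLOPS behind the comparison above (rough sketch).
spu  = 3.2e9 * 8            # 4-wide FMA per cycle -> 25.6 GFLOPS per SPU
ppe  = 3.2e9 * 8            # VMX unit, roughly the same peak
cell = 6 * spu + ppe        # ~179 GFLOPS available to PS3 games

jaguar_core = 1.6e9 * 8     # two 128-bit FP pipes -> ~8 flops/cycle
jaguar_8 = 8 * jaguar_core  # ~102 GFLOPS for the 8-core CPU

gcn_cu = 64 * 2 * 0.8e9     # 64 lanes, FMA, 800MHz -> ~102 GFLOPS per CU

print(cell / 1e9, jaguar_8 / 1e9, (jaguar_8 + gcn_cu) / 1e9)
# ~179, ~102, ~205 -> one CU on top of the 8 Jaguar cores passes the Cell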
 
I'd rather read what you think of the on-chip cache situation.
The PS4 has 8 logical cores and 1,152 stream processors (that do GPU compute).

Is 512KB x 8 cores (4MB) enough?
How much is the latency, and how wide is the GDDR5 bus?
Is it 176GB/s divided by 800MHz? What are the estimated specs for the GDDR5?
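
(Rough math on my own question, treating the widely reported 256-bit bus and 5.5Gbps GDDR5 pin speed as assumptions rather than confirmed specs:)

Code:
# Where the widely reported 176GB/s would come from, assuming a 256-bit
# bus and 5.5Gbps-per-pin GDDR5. The 800MHz figure is the GPU clock and
# doesn't enter into this.
bus_bits = 256
gbps_per_pin = 5.5
print(bus_bits / 8 * gbps_per_pin)   # 176.0 GB/s

# And the L2 total from the question above:
print(512 * 8 / 1024, "MB")          # 4.0 MB of L2 across 8 cores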


onQ,
In response to your question: an actively air-cooled enclosure... 1.8-2.8GHz. (No idea why, but temperature and cooling never seem to be a problem when loaded to the max in these ranges. Around 3.2GHz the thermal diodes on the core will sometimes gate down the processor clock for individual cores. Remember, rooms without central air conditioning exist, where ambient temps approach the outside temperature.)

Rant:
Most console designs need to imagine poor people... or country boys who would put one in a tree house or garage, if they could still safely climb up there.

Say what you will, 4-6 cores is as much as a non-Chrome browser needs. Programs that break down into multiple threads on their own can benefit from low-clocked multi-core processors, but so long as you have single-threaded applications, the fastest solution is a faster clock. I have read that processor design and SIMD extensions continue to evolve... Scorched Earth (the DOS game) is the only convincing proof I have seen. ~ Ever since the arrival of GUI interfaces, it's harder to measure, because code keeps getting more bloated. So for "bloated code", should you get a faster clock or a more expensive (Itanium 2) processor?

Why the hell don't we have a 16-core Itanium 2 in the consumer market? Guess its descendants cost a little more... http://ark.intel.com/products/family/451 :-*
 

But the PS4 is a closed box, so when people develop for it they already know it's a multi-core CPU, and they will write their code for multiple cores.
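
A toy illustration of that point (a Python stand-in with a made-up workload; a real engine would use a native job system, but slicing work for a known, fixed core count is the same idea):

Code:
from concurrent.futures import ThreadPoolExecutor

CORES = 8                                   # fixed, known target on a console

def simulate_chunk(entities):
    # stand-in for one slice of per-frame work (AI, physics, etc.)
    return sum(e * e for e in entities)

world = list(range(10_000))
chunks = [world[i::CORES] for i in range(CORES)]      # one slice per core

with ThreadPoolExecutor(max_workers=CORES) as pool:
    results = list(pool.map(simulate_chunk, chunks))

print(sum(results))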
 
onQ,
My only concern... do the developers on here think the PS4 needs L3 cache?

If so, then I hope one of you with a better understanding than me can convince them to put the brakes on and trade 2GB of GDDR5 for 16MB of L3 cache. ~ That is all.

The PS2 lived for nearly a decade because of its low-latency, quad-ported 1,024-bit eDRAM.
Speed and low latency allow independent developers to compete with big production houses.
 