News & Rumours: Playstation 4/ Orbis *spin*

Discussion in 'Console Industry' started by patsu, Jan 23, 2013.

Thread Status:
Not open for further replies.
  1. ManuVlad3.0

    Regular

    Joined:
    May 26, 2004
    Messages:
    303
    Likes Received:
    4
    Location:
    Brasil
  2. anexanhume

    Veteran Regular

    Joined:
    Dec 5, 2011
    Messages:
    1,520
    Likes Received:
    692
    As others have pointed out in other threads, this source has no history of accuracy (or inaccuracy, for that matter).

    I'm guessing they didn't mention CPU clock speed because they don't deem it important or they simply haven't decided yet.
     
  3. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,544
    Likes Received:
    10
    Location:
    In the land of the drop bears
    Well, I don't know about Nvidia's architecture, but I do know that Thuway's quote was talking about the improvements of the GCN architecture over Xenos, not really about the ESRAM.

    But I think it's safe to assume Nvidia has a very efficient architecture and reaches nearly, if not exactly, the same efficiency as GCN.
     
  4. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,719
    Likes Received:
    5,815
    Location:
    ಠ_ಠ
    Thread Title Key English words: News, Rumours, Playstation

    Things Not in thread title: Business, strategies, MyLife purchases, Xbox.
     
  5. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,614
    Likes Received:
    60
    Some tidbits from the Japanese press, courtesy of GAF again. Take it with a grain of salt as usual:
    http://m.neogaf.com/showthread.php?t=516480

    I looked at the source article; it mentions background separation, and simultaneous photo capture and motion tracking too.

    [Disclaimer: I took Japanese 101 long ago, so don't lean too hard on my translation :)]
     
  6. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    55
    Well, maybe you can read some of this for me. It's from the PlayStation 4 Eye patent.


    [Attached: four diagram images from the PlayStation 4 Eye patent]
     
  7. Jwm

    Jwm
    Veteran Regular

    Joined:
    Feb 27, 2013
    Messages:
    1,037
    Likes Received:
    155
    Location:
    Texas
    Have there been any updates on when the game streaming services (Gaikai or Remote Play) will actually start to work? I thought they would not be ready at launch, based on an article I caught on my phone (was it quotes from Tretton, maybe?), but I was later unable to find which website it came from. If they are not ready for launch, I wonder what kind of rollout they are planning.
     
  8. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,614
    Likes Received:
    60
    It looks like a "simple" patent for capturing stereo photos/video as a series of scaled images. The unit can output raw or filtered images (e.g., cropped).
     
  9. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    55
    I kind of picked up what it was for; I was just wondering what was being said in the images.
     
  10. LuckyGuess

    Banned

    Joined:
    Feb 27, 2013
    Messages:
    39
    Likes Received:
    0
    Location:
    North America
    1.6GHz simply doesn't make sense to me. (That was the clock of Cell's internal ring bus.)

    My AMD Barton-core CPU from 2003 is clocked at 1.8-2.0GHz. Ten years later, I'm not sure in-order single-thread processing is really much quicker. Has single-thread performance in multi-core processors even doubled in those ten years? I get that 1.6GHz is half of the PS3's 3.2GHz, but 2GHz seems like the minimum to expect. (My Gateway 64-bit dual-core laptop from 2007 is 1.6GHz.) Could the PS4 be 2.0-3.2GHz? (3.2 makes a lot of sense; 1.6GHz was the clock of the PS3's internal ring bus.)

    Does anyone have any sources? I keep thinking the PS4 is a tablet when I see 1.6GHz.
     
  11. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,614
    Likes Received:
    60
    They are all pretty boring labels, like input processing component, image processing component, demosaic component, etc.
     
  12. bkilian

    Veteran

    Joined:
    Apr 22, 2006
    Messages:
    1,539
    Likes Received:
    3
    It has to do with pipeline length, power, and out-of-order processing. A single Jaguar thread at 1.6GHz will run 2-3 times the number of instructions per second of a PPE thread. Two Jaguar threads (or cores) would handily outperform the PPE with its hardware multithreading. Sony has put 8 cores in the PS4, which would destroy the PS3's PPE to the tune of more than 16x. The GPU also does not need SPEs to help it out; they could shave 4 CUs off for SPE-like compute work and still have more than enough graphics power to spare.
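
    A rough back-of-envelope for the claim above (a sketch only: the 2-3x per-thread figure is from the post itself, while the ~1.3x PPE SMT scaling is an assumption added here, since SMT rarely doubles throughput):

    ```python
    # Relative instruction throughput: 8 Jaguar cores vs the PS3's PPE.
    # The 2-3x per-Jaguar-thread advantage is the post's estimate; the 1.3x
    # factor for the PPE's two hardware threads is an assumption.

    JAGUAR_CORES = 8
    PPE_SMT_SCALING = 1.3  # whole PPE relative to one PPE thread (assumed)

    for per_thread in (2.0, 3.0):  # Jaguar thread vs PPE thread, low/high
        ratio = JAGUAR_CORES * per_thread / PPE_SMT_SCALING
        print(f"At {per_thread}x per thread: ~{ratio:.0f}x the whole PPE")
    # prints ~12x and ~18x, bracketing the "more than 16x" figure
    ```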
     
  13. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,723
    Likes Received:
    193
    Location:
    Stateless
    Well, clock speed clearly depends on the architecture. Bobcats were pretty much dogs when it came to overclocking (though they usually did not come on nice motherboards offering lots of options). Jaguar should be better, but by AMD's own numbers, not by much: Bobcat ran at 1.6GHz, and AMD promised a 10% increase in clock within the same power envelope, or less power at the same clock speed.

    I guess it comes down to how many good chips Sony can get out of a wafer; pushing the limit on clock speed may have them discarding chips that could otherwise meet their requirements (8 CPU cores are OK, the 18 CUs and 32 ROPs are OK, the memory controller is OK, etc.).

    I'm not sure about their margin with regard to power either. An HD 7850 sucks some juice; add the CPU cores and that is a considerable amount of heat to dissipate out of a single chip. Definitely doable, but as usual it comes at a cost: the cost of a better cooling system. They are already pushing the limit with regard to RAM.

    Wait and see. I would think that 1.6GHz is the bottom line; if they have impressive yields and a lot of chips that would qualify to run at, say, 1.8GHz, they may consider a light bump "for free". My expectations are low on the matter, and if it were to happen I don't expect more than a couple hundred extra MHz above 1.6GHz. 2GHz sounds highly unlikely to me (looking at power and AMD's claims for the Jaguar CPU cores).
     
  14. LuckyGuess

    Banned

    Joined:
    Feb 27, 2013
    Messages:
    39
    Likes Received:
    0
    Location:
    North America
    Thank you, liolio. ~ It sucks that I worry so much about something like this, so it helps that there are positive people to reassure us naive enthusiasts.

    bkilian, what a Jaguar can do when all the cards are in order and paired together may be impressive. ~ The reason for my needless worry about, and rejection of, the 1.6GHz claims is that an SPU on Cell could do 32-bit operations in 7 cycles, and the internal ring bus was pretty efficient. Most CPU cores can do more per cycle, but the time it takes to return a result has often been much longer. I'm looking at http://developer.amd.com/resources/documentation-articles/ trying to see how well the internal latency compares to STI's Cell Broadband Architecture.

    This may not apply, but the rule in construction is to plan for 20% waste. So even if it is double per cycle, that isn't enough; what I want to know is how long it takes to get the result back.

    I'm really wasting my breath. My anxiety boils down to this: where the hell is the 16MB of L3 cache to make up for GDDR5's slowness in responding to CPU requests? I want those ninja-quick coding elements, like rain and light deformation, that don't require texture processing. The tanker level in MGS2 still has the most amazing rain effect I have ever seen. I could watch the rain bend and sway from the top of the screen until it hit the deck and splashed...

    People love big rolling explosions, but the really small details require really fast, low-latency memory. High bandwidth is great for large amounts of data, but online FPS players all know it is the latency between when you see them and when they see you that decides who lives and who dies, not the size of the graphics memory.

    1. Does the PS4 need L3 cache too?
    2. Would Sony be smarter to stop and go back to the drawing board than to release with 8GB of GDDR5?
    3. Would you rather have (4MB L2 + 16MB L3) + 4GB of GDDR5, or 4MB L2 + 8GB of GDDR5?

    For #3: which system could you code more amazing ninja graphical tricks into?
     
  15. MikeR

    Newcomer

    Joined:
    Feb 21, 2013
    Messages:
    10
    Likes Received:
    0
    *cough* *cough*

    Jaguar Vanilla
    * 1.8GHz LC clocks (can be under-clocked for specific low-powered battery devices - tablets, etc.).
    * 2MB shared L2 cache per CU (a CU here is a four-core Jaguar module, not a GPU compute unit)
    * 1-4 CUs can be outfitted per chip (i.e. 4-16 logical cores)
    * 5-25 watts depending on the device/product (45 watts is achievable under the right conditions)

    PS4 Jaguar with chocolate syrup:
    * 2GHz is correct as of now.
    * 4MB of total L2 cache (2MB L2 x 2 CUs)
    * 2 CUs (8 logical cores)
    * Idles around 7 watts during non-gaming operations and around 12 watts during Blu-ray movie playback. Gaming is a mixed bag...

    What would be nice is a fully loaded Jaguar chip.

    Quit bringing up platforms beyond the scope of the thread.
     
  16. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    55
    So you think they should just keep cranking the GHz up?

    Thanks.
     
  17. bkilian

    Veteran

    Joined:
    Apr 22, 2006
    Messages:
    1,539
    Likes Received:
    3
    Well, yes, if you're talking about the entire Cell as a unit, then even an 8-core Jaguar would not be able to compete. The Cell had about 150 GFLOPS or more of compute; an 8-core Jaguar sits at about 100 GFLOPS. Add a single CU from the GPU and you've exceeded the Cell.

    As for latency-sensitive operations, sure, it's going to be trickier, but they'll manage. It doesn't matter if it takes twice as long to return a result if you can throw 4x as many resources at it.
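
    For reference, the peak numbers behind this comparison work out roughly as follows (a sketch using commonly quoted theoretical single-precision peaks and the rumoured PS4 clocks; these are paper peaks, not sustained rates):

    ```python
    # Theoretical peak single-precision GFLOPS (units x clock in GHz x FLOPs/cycle).

    cell = 6 * 3.2 * 8       # 6 game-usable SPEs, 3.2GHz, 4-wide FMA = 8 FLOPs/cycle
    jaguar = 8 * 1.6 * 8     # 8 cores, rumoured 1.6GHz, 4-wide mul + 4-wide add
    one_cu = 64 * 0.8 * 2    # one GCN CU, rumoured 800MHz, 64 lanes x FMA

    print(f"Cell (6 SPEs):  {cell:.1f} GFLOPS")           # ~153.6
    print(f"8-core Jaguar:  {jaguar:.1f} GFLOPS")          # ~102.4
    print(f"Jaguar + 1 CU:  {jaguar + one_cu:.1f} GFLOPS") # ~204.8, exceeds Cell
    ```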
     
  18. LuckyGuess

    Banned

    Joined:
    Feb 27, 2013
    Messages:
    39
    Likes Received:
    0
    Location:
    North America
    I'd rather read what you think of the on-chip cache situation.
    The PS4 has 8 logical cores and 1,152 stream processors (which also do GPU compute).

    Is 512KB x 8 cores (4MB) enough?
    How long is the delay, and how wide is the GDDR5 bus?
    Is it 176GB/sec divided by 800MHz? What are the estimated specs for the GDDR5?


    onQ,
    In response to your question: with an actively air-cooled enclosure, 1.8-2.8GHz. (No idea why, but in that range temperature and cooling never seem to be a problem even when loaded to the max. Around 3.2GHz, the thermal diodes on the core will sometimes gate down the clock for individual cores. Remember, rooms without central air conditioning exist, where ambient temperature approaches the outside temperature.)

    Rant:
    Most console designs need to account for poor people... or country boys who would put one in a tree house or garage, if they could still safely climb up there.

    Say what you will, 4-6 cores is as much as a non-Chrome browser needs. Programs that break down into multiple threads on their own can benefit from low-clocked multi-core processors, but as long as you have single-threaded applications, the fastest solution is a faster clock. I have read that processor design and SIMD extensions continue to evolve... Scorched Earth (the DOS game) is the only convincing proof I have seen. ~ Ever since the arrival of GUI interfaces it has been harder to measure, because code keeps getting more bloated. So given bloated code, should you get a faster clock or a more expensive (Itanium 2) processor?

    Why the hell don't we have a 16-core Itanium 2 in the consumer market? Guess the descendants cost a little more... http://ark.intel.com/products/family/451 :-*
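
    On the "176GB/sec divided by 800MHz" question above: peak GDDR5 bandwidth is bus width times per-pin data rate, not bandwidth divided by a clock. A minimal sketch, assuming the commonly rumoured 256-bit bus and 5500MT/s effective data rate (neither confirmed at this point):

    ```python
    # Peak GDDR5 bandwidth from the rumoured figures (assumptions, not specs).

    bus_width_bits = 256    # e.g. 8 GDDR5 chips with 32-bit interfaces
    data_rate_mtps = 5500   # effective transfers per second per pin ("5.5Gbps")

    bandwidth_gbs = bus_width_bits * data_rate_mtps / 8 / 1000  # bits -> GB
    print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")  # -> 176 GB/s
    ```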
     
  19. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    55
    But the PS4 is a closed box, so when people develop for it they already know it's a multi-core CPU, and they will write their code for multi-core.
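
    A tiny illustration of that point: on fixed hardware the core count is a known constant, so work can be partitioned for it up front (a sketch only; the function names are made up, and real console engines use job/task schedulers rather than anything this naive):

    ```python
    # Splitting per-frame work across a known, fixed number of cores.
    from concurrent.futures import ProcessPoolExecutor

    NUM_CORES = 8  # fixed and known in advance on a closed box

    def simulate_chunk(entities):
        # stand-in for per-frame work (AI, physics, ...)
        return sum(e * e for e in entities)

    def update_frame(entities):
        size = max(1, len(entities) // NUM_CORES)
        chunks = [entities[i:i + size] for i in range(0, len(entities), size)]
        with ProcessPoolExecutor(max_workers=NUM_CORES) as pool:
            return sum(pool.map(simulate_chunk, chunks))

    if __name__ == "__main__":
        print(update_frame(list(range(10_000))))
    ```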
     
  20. LuckyGuess

    Banned

    Joined:
    Feb 27, 2013
    Messages:
    39
    Likes Received:
    0
    Location:
    North America
    onQ,
    My only concern... do the developers on here think the PS4 needs L3 cache?

    If so, then I hope one of you with a better understanding than me can convince them to put the brakes on and trade 2GB of GDDR5 for 16MB of L3 cache. ~ That is all.

    The PS2 lived for nearly a decade because of its low-latency, quad-ported, 1,024-bit eDRAM.
    Speed and low latency allow independent developers to compete with big production houses.
     