PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Discussion in 'Console Technology' started by Love_In_Rio, Jan 28, 2013.

Thread Status:
Not open for further replies.
  1. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
    Why does it seem that everyone has forgotten about the Secondary Custom Chip?

     
  2. Airon

    Banned

    Joined:
    Dec 12, 2012
    Messages:
    172
    Likes Received:
    0
    As Cerny said, it will manage background tasks like downloads and HDD access.
    To me, it will also manage the low-power / standby state.
    Microsoft has managed this by customizing the CPU (I read somewhere, if I remember well, that they added gates that enable/disable CPU cores according to the power state). Sony instead, and this is just my assumption, will use this secondary ARM chip to manage the low-power / standby state.
     
    #3262 Airon, Sep 11, 2013
    Last edited by a moderator: Sep 11, 2013
  3. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    Why do you need to reserve CPU resources for voice recognition? Sony simply said the PS4 will support voice navigation. It may not be an "always-on" speech recognition system like Google Now or Kinect 2's.
     
  4. dumbo11

    Regular

    Joined:
    Apr 21, 2010
    Messages:
    440
    Likes Received:
    7
    I don't think "voice recognition requires a microphone" would be seen as an extreme requirement.

    As to whether voice recognition works "all the time", that's unclear. Whilst gaming there are 2 Jaguar cores doing (basically) nothing; on the other hand, voice recognition whilst gaming is of debatable usefulness for the PS4.

    (With 'buttons-free' gaming [aka Kinect], you clearly need a separate navigation method, but if you have a "home" button in your hands, then being able to say 'home' isn't a great leap forward.)

    Gaming machines of all shapes/sizes are usually a weak CPU with a great GPU. In some ways that's disappointing, but I don't see that changing any time soon.
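    The "2 spare Jaguar cores" idea can be sketched in software. This is a hypothetical illustration only: the reserved core IDs and the OS/game split are assumptions, not anything Sony has documented, and `os.sched_setaffinity` is a Linux-only API, hence the guard.

```python
import os

# Hypothetical split on an 8-core part: cores 6-7 reserved for OS /
# background work (downloads, voice recognition), 0-5 left to the game.
RESERVED = {6, 7}

def choose_affinity(reserved, available):
    """Cores a background task should be pinned to: the reserved set,
    falling back to everything if this machine lacks those cores."""
    usable = reserved & available
    return usable if usable else set(available)

avail = set(range(os.cpu_count() or 1))
target = choose_affinity(RESERVED, avail)
if hasattr(os, "sched_setaffinity"):  # Linux-only API
    os.sched_setaffinity(0, target)   # pin this process to the target cores
```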
     
  5. gurgi

    Regular

    Joined:
    Jul 7, 2003
    Messages:
    605
    Likes Received:
    1
    GPGPU is maybe a good compromise then? ;)
     
  6. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    I have no idea. It depends on how they design their run-time. E.g., if they reserve enough RAM to keep transient states, they may not need to reserve any CU.
     
  7. DieH@rd

    Legend

    Joined:
    Sep 20, 2006
    Messages:
    6,387
    Likes Received:
    2,411
    Sony is packing a mono headset with a mic in every PS4 box.

    But they may force voice recognition to be active only with the PS Eye. That move would drive better adoption of the camera.
     
  8. Solarus

    Newcomer

    Joined:
    Jan 12, 2009
    Messages:
    156
    Likes Received:
    0
    Location:
    With My Brother
  9. upnorthsox

    Veteran

    Joined:
    May 7, 2008
    Messages:
    2,106
    Likes Received:
    380
  10. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,493
    Likes Received:
    474
    I don't know how technical he is, but I think of someone asking for confirmation as being a good thing rather than a sign of ignorance.
     
  11. jlippo

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    1,744
    Likes Received:
    1,090
    Location:
    Finland
    #3271 jlippo, Sep 13, 2013
    Last edited by a moderator: Sep 13, 2013
  12. HokutoNoKen

    Newcomer

    Joined:
    Nov 2, 2010
    Messages:
    76
    Likes Received:
    26
    Power struggle: the real differences between PS4 and Xbox One performance

    Mod: Link is off topic. It isn't providing technical insight into what PS4 hardware is - we already know what it is! PS4 vs XB1 development has its own thread where this has been posted.


    / Ken
     
    #3272 HokutoNoKen, Sep 13, 2013
    Last edited by a moderator: Sep 13, 2013
  13. Airon

    Banned

    Joined:
    Dec 12, 2012
    Messages:
    172
    Likes Received:
    0
    Well, I am not so sure about this.

    Cell was an incredibly powerful CPU, and I believe the Xbox 360 also had a good CPU for its time.

    People in my circle, much more tech-wise than me (not game-industry employed, anyway), keep telling me that the CPU will be the "Achilles' heel" of the next generation.

    For a start, Jaguar is not a powerful CPU per se, nor by 2013 standards in general.

    Online multiplayer, persistent open worlds, procedural content, etc. could be some of the main genres of the next generation, and they are quite CPU-intensive.

    Most probably AI and physics engines will also be more demanding (let's hope so, anyway).

    On top of this we could add that the PS4 CPU will most probably be loaded with other tasks:
    sound tasks, mainly, and maybe some PS Eye-related work, as the PS Eye is most probably simply a camera with a microphone.
    Apart from speech recognition (a feature I am willing to try), the sound area is very peculiar, and I hope for a next-gen boost in this department too.

    Then we have to consider the memory factor. PS4 bandwidth to the CPU seems to be below 20 GB/s. And we must not forget that GDDR5 latency is not optimized for CPU tasks.
    And speaking of which, has GDDR5 memory ever been used for a CPU before?

    I do not know; with these considerations in hand, could the CPU be the real bottleneck of the PS4?
     
  14. MrFox

    MrFox Deludedly Fantastic
    Legend

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
    No. It's been debunked multiple times.
     
  15. DrJay24

    Veteran

    Joined:
    May 16, 2008
    Messages:
    3,894
    Likes Received:
    634
    Location:
    Internet
    That must be why the console makers went with such beefy Intel CPUs. :wink:

    Your list does not even remotely back your argument.

    They say this every gen, but never fear: the 8-core Jaguar can certainly beat the snot out of the Cell PPU, and the CUs can do some SPU-type heavy lifting.

    Yes, by design. The CPU does not need much bandwidth; go check the bandwidth of high-end i7 systems.

    Do you have evidence, or are you just repeating something you have read? Memory controllers for GPUs are not designed for latency; again, this is by design. APUs have so far always been used in low-end systems, where GDDR is too costly. Don't read a cost decision as a technical failure, though.

    This seems to be the new console-war talking point, but when has a CPU ever been a bottleneck for games? I can play BF3 on my i5 PC, but my PS3 also plays it much the same, the difference being in rendering, not CPU chores. I would expect that gulf in CPU power to be cut by an order of magnitude for the PS4, at least until PC CPUs get more powerful in the future.

    Both Sony and MS sat with their engineers and designed systems to get rid of old bottlenecks and avoid new ones. They both decided on low-power x86 CPUs. I would expect they both made the right decision and did not bottleneck themselves. Sony is betting on some offloading to GPGPU, and MS on DSPs. 3rd parties will boil down to the lowest common denominator, and neither will be used much outside of middleware.
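    The latency-vs-bandwidth point above can be made concrete with two access patterns. This is a toy sketch, not a benchmark of any console: a pointer chase, where each load's address depends on the previous load's result and so pays full memory latency every step, versus a streaming pass, where loads are independent and the memory system (what GDDR5 is built for) can overlap them.

```python
import random

random.seed(42)
N = 1 << 12

# A random chain of indices: each load depends on the previous one, so
# the CPU cannot overlap them -> latency-bound, the worst case for GDDR5.
chain = list(range(N))
random.shuffle(chain)

def pointer_chase(next_idx, steps):
    i = 0
    for _ in range(steps):
        i = next_idx[i]  # serialized, dependent loads
    return i

# Independent, sequential loads: prefetchers and wide memory buses can
# stream them back-to-back -> bandwidth-bound, where GDDR5 excels.
def streaming_sum(data):
    return sum(data)

last = pointer_chase(chain, 1000)
total = streaming_sum(range(N))
```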
     
  16. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    There's a lot of CPU power in ~100 GFLOPS of efficient processing power (using that as an overall performance metric people understand, and taking the rest of the CPU to be balanced to similar 'performance' across integer and memory ops; PS3 had poor processor utilisation in some workloads). Unlike Cell, it won't be needed for graphics work, so there's possibly an abundance of processing power, certainly compared to the norms of console design. And there's compute on both consoles to do some tasks.

    CPU will no doubt be where most devs hit the ceiling first, but I wouldn't call it an Achilles' heel, which suggests a platform-crippling weakness. And given the options, it seems a capable and sensible solution. Hence why both MS and Sony went that route, I guess.
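    As a back-of-envelope check on where a ~100 GFLOPS figure for the CPU could come from, assuming 8 Jaguar cores at 1.6 GHz each retiring 8 single-precision FLOPs per cycle through their 128-bit SIMD units (the commonly cited numbers, not official specs):

```python
cores = 8
clock_hz = 1.6e9
flops_per_cycle = 8  # assumed: 4-wide SP multiply + 4-wide SP add per cycle
peak_gflops = cores * clock_hz * flops_per_cycle / 1e9
print(peak_gflops)  # ~102.4
```

    Peak numbers like this are never reached in practice, but they put the "100 GF" shorthand in context.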
     
  17. racca

    Newcomer

    Joined:
    Apr 3, 2010
    Messages:
    51
    Likes Received:
    0
    Well, you can't rule out a possible software update, though. If the hardware shipping today is qualified for, say, 1.8GHz/900MHz, then should the need arise, Sony could add boost modes quite easily (to be activated by the developers only, of course) without increasing power consumption. For example, 1.8GHz/850MHz with only 6 CPU cores active sounds reasonable to me.
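    The trade suggested here (fewer active cores for a higher clock at roughly constant power) can be sanity-checked with a crude model in which dynamic power scales with active cores times frequency. This is a simplification for illustration: voltage is held constant, while real boost modes usually raise it too.

```python
def relative_power(active_cores, freq_ghz, base_cores=8, base_freq_ghz=1.6):
    """CPU power relative to an 8-core / 1.6 GHz baseline, assuming power
    scales linearly with active core count and frequency (voltage fixed)."""
    return (active_cores / base_cores) * (freq_ghz / base_freq_ghz)

print(relative_power(8, 1.6))  # 1.0 (baseline)
print(relative_power(6, 1.8))  # 0.84375 -> under budget in this toy model
```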
     
  18. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Seems to me they could only do this if they are already screening out the chips that can't make those clocks. In other words, only if this was planned from the start; otherwise some consoles might ship that can't make 1.8 or 850 once they're upclocked.

    Anyway, they've said nothing, so I'll assume it's 1.6/800 until proven otherwise.
     
  19. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Indeed, it'd be a little odd to include hardware with room to improve but a reluctance to do so. In a handheld like the PSP, the cap made some sense to conserve battery life; for a home console the only issues are heat and noise. Given that a CPU upclock to 1.8 GHz should only add a few watts of heat, if the processors are capable of it then I'd expect Sony to go with it. It all comes down to what the product out of the factory can achieve in numbers: too many failures at 1.8 GHz and Sony will want to pick the lower speed.
     
  20. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    The process is mature at this point. We have to assume the CPU core macro AMD supplied is competently designed and, I hear, intended to run at up to 2 GHz (possibly more). 1.6 GHz, and certainly 1.8 GHz too, is a very conservative clock by today's standards; I doubt very many chips, if any, would actually fail because of a 200 MHz bump that is still well below the alleged ceiling of the design...

    Still, none of that means it's going to happen. I wouldn't hold my breath, TBH; if Sony already has the lead performance-wise, why would they have to do anything...? :razz: (Certainly not what a geek like me wants to think, but we've got to be realistic here.)
     