ATI on Xenos

Discussion in 'Console Technology' started by pipo, Jun 10, 2005.

  1. Titanio

    Legend

    Joined:
    Dec 1, 2004
    Messages:
    5,670
    Likes Received:
    51
    This is an interesting point. Assuming 18 months of work, and assuming a new design based on another architecture/chip, can you take out the "redundant" stuff while designing it? I'm sure RSX is a different chip than the PC cards, but based on the same tech - and ~18 months seems like enough time, considering refreshes happen in less time (and there may have been more). I don't know how modular architectures/designs are in terms of removing things, though..(?)
     
  2. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Well, the video encoding options - I don't know how integrated they are into the chip, or whether they can be removed from the design. While it's a great boon for the PC, with Cell it's really not needed - but what if it's integral to the pipeline structure?
     
  3. Titanio

    Legend

    Joined:
    Dec 1, 2004
    Messages:
    5,670
    Likes Received:
    51
    I haven't a clue either. I guess this is something we won't have an answer to until if and when NVidia or Sony starts talking about RSX, or some charitable dev spills some beans..
     
  4. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Well, therein lies the rub - it's not as though I can go up to the graphics vendors and ask them "So, exactly how inefficient is your architecture?" and get a reply that doesn't say "graphics is inherently parallelisable, so we're never inefficient". However, now we are getting down to pretty low-level detail on exactly how best to deal with the inherent latencies of shader processing and how best to handle those latencies.

    Actually, I believe their main complaint was about the different types of demands VS and PS place on the processor, and how the ALUs need to handle the latencies differently, because PS usually has filtered texture lookups, which are usually the most latency-bound of operations. However, ATI's point is that they have removed the ALU-texture link by keeping them separated - if there is a texture instruction in a shader program, the texture samplers will be tasked with retrieving that texture data while completely independent threads are working on the ALUs; when the texture data is ready, the shader program that requested it can go back into context and run on the ALUs, and the texture data will already be buffered and usable immediately when the shader instruction that operates on that data is processed.
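    The scheduling idea described above can be sketched in a toy simulation. This is not ATI's actual scheduler - the latency figure and thread programs are made up for illustration - but it shows the principle: a thread that issues a texture fetch is descheduled, independent threads keep the ALUs busy in the meantime, and the waiting thread resumes only once its data is buffered.

    ```python
    # Toy latency-hiding scheduler (illustrative only, not ATI's design).
    # Each thread is a list of instructions: 'alu' or 'tex'.
    from collections import deque

    TEXTURE_LATENCY = 4  # cycles a texture fetch takes (made-up number)

    def run(threads):
        ready = deque(range(len(threads)))   # runnable thread ids
        pc = [0] * len(threads)              # per-thread program counter
        waiting = {}                         # thread id -> cycle its data arrives
        cycle, trace = 0, []
        while ready or waiting:
            # wake threads whose texture data has arrived
            for tid in [t for t, c in waiting.items() if c <= cycle]:
                del waiting[tid]
                if pc[tid] < len(threads[tid]):
                    ready.append(tid)
            if ready:
                tid = ready.popleft()
                instr = threads[tid][pc[tid]]
                pc[tid] += 1
                trace.append((cycle, tid, instr))
                if instr == 'tex':
                    # samplers fetch in the background; the ALUs stay free
                    waiting[tid] = cycle + TEXTURE_LATENCY
                elif pc[tid] < len(threads[tid]):
                    ready.append(tid)        # more ALU work to do
            cycle += 1
        return trace

    # Two threads: while thread 0 waits on its texture, thread 1's ALU
    # work fills the cycles instead of stalling the pipeline.
    trace = run([['tex', 'alu'], ['alu', 'alu', 'alu', 'alu']])
    for cycle, tid, instr in trace:
        print(f"cycle {cycle}: thread {tid} issues {instr}")
    ```

    In the example, thread 0's texture fetch at cycle 0 would stall a coupled design for four cycles; here those cycles are filled by thread 1's independent ALU instructions, which is the efficiency argument being made.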
     
  5. Carl B

    Carl B Friends call me xbd
    Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    Hey, remember that before R500, there was R400... :wink:

    So it's not like things can't go wrong on either side.

    By the way, I looked up the PureVideo transistor count on the GeForce 6 chips, and it's roughly ~20 million.
     
  6. j^aws

    Veteran

    Joined:
    Jun 1, 2004
    Messages:
    1,992
    Likes Received:
    137
    [Image: RSX architecture diagram]

    Dave,

    I can see the inherent advantages of decoupling the shader ALUs from texture ALUs.

    I can also see the advantages of specialised shader ALUs and unified shader ALUs. However, the RSX architecture above also shows a 'decoupling' of texture ALUs but isn't clear on whether the shader ALUs are specialised or unified. So I can see two paths here for optimisation/efficiency,

    1. Decoupled texture ALUs VS Coupled texture ALUs

    2. Unified Shader ALUs VS Specialised Shader ALUs

    Both RSX and Xenos look like they have 1, but 2 isn't clear? :?

    I have yet to hear any of your comments on RSX on that note.
     
  7. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Oh, no doubt. However, the fact that ATI scrapped the R400, or postponed it, leaves me feeling that Xenos will work as intended, and that looking at the problem from a semi-different approach may help a lot.

    Anyway, we really don't know how either RSX or Xenos works. Looking at transistor counts alone won't tell the whole story, and I don't think in the end it will be as clear-cut as one being more powerful than the other. I think you will find that they each end up edging out the other at different tasks.
     
  8. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I would say that the image above is a picture to display a concept, not an architectural representation.
     
  9. j^aws

    Veteran

    Joined:
    Jun 1, 2004
    Messages:
    1,992
    Likes Received:
    137
    Well, that could be true, but the image does very little to convey anything about points 1 and 2 - especially point 1.

    Though apparently this is a clue to the architectural differences between Xenos and RSX,

    http://www.beyond3d.com/forum/viewtopic.php?p=522443#522443

    So, "How independant?", is apparently the difference between Xenos and RSX...?

    I look forward to your Xenos article, but I also hope you can do a thorough article on RSX when info becomes available... :p
     
  10. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Jaws, I can't talk because I'm under NDA, but given what I know about upcoming parts and what we hear about RSX I think you can "colour me surprised" if that diagram is a representation of architectural operation.

    As for an article on RSX - given all the commentary from NVIDIA, do you actually expect it to be significantly different from their PC parts?
     
  11. j^aws

    Veteran

    Joined:
    Jun 1, 2004
    Messages:
    1,992
    Likes Received:
    137
    Well, that NDA explains a lot! :p

    Well the curiosity is on three levels,

    1. How different is G70 from NV40?

    2. How different is RSX from G70/G80/WGF 2.0, with its implementation/customisation with Cell and OpenGL ES?

    3. PS3 and X360, different architectures, same result?

    So I take this as a hint that an RSX article is unlikely?
     
  12. Titanio

    Legend

    Joined:
    Dec 1, 2004
    Messages:
    5,670
    Likes Received:
    51
    Just suggestions..

    If RSX is too similar to PC technology being covered anyway, looking at the system (PS3) as a whole with regard to graphics, as opposed to just the GPU in isolation, could be worthwhile. Coverage/analysis/investigation of the relationship between Cell and RSX wrt graphics, the role Cell can play there, etc. is very lacking (well, all coverage is, tbh, but that may be symptomatic of Sony/NVidia's stance on information release rather than a lack of effort on the part of websites). Cell is arguably a step closer to the "CPU that's like a GPU", architecturally, and Sony evidently sees a role for it in graphics beyond the norm for a CPU. So that is something "different" that could be worth looking at. But perhaps curiosity in that area isn't widespread.. just my own personal thought, perhaps!

    RSX itself beyond that could be treated as any "refresh" is.
     
  13. ERP

    ERP
    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Because they both envisage different usage patterns.
    Because neither one can see the future so they have to assume and extrapolate.

    Ideas can be great on paper and suck in practice. And sometimes stupid brute-force solutions are the best way to go.
     
  14. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I don't know yet, we'll see if/when people start talking a little more about RSX.

    With Xenos we have some different processing elements to consider: eDRAM / MSAA / Z pass / tiling is one element to consider for Xenon, and then there is the shader architecture, which is a bit out there - and I firmly believe this will give us a glimpse of ATI's PC future (important to this site). RSX's pixel operation looks to be more in line with other processors, so there probably isn't much to look at there, and the architecture appears to be PC-derived, so it will probably be covered in one form or another sooner or later - if it's later (but before it's covered from the PC side) then I'd like to do the same (if possible), but my suspicion is that it's sooner.
     
  15. pc999

    Veteran

    Joined:
    Mar 13, 2004
    Messages:
    3,628
    Likes Received:
    31
    Location:
    Portugal
    Well, in the end I'm almost sure you will talk about it in the forums (and for a "variant" that should be enough, like NV2A for the Xbox); we could later just make a collection of posts - that should be enough, and it would save a lot of time.
     
  16. Pozer

    Regular

    Joined:
    Feb 9, 2005
    Messages:
    664
    Likes Received:
    6
    Location:
    Ohio
    If you look at the financials of the Sony/NVIDIA deal, you would conclude that Sony got a PC-derived part and not some exotic design ala Xenos. Also, I think the Cg libraries were very important to Sony, as they can't afford a two-year grace period for devs to get good visuals.

    I think RSX is a marriage of convenience. Sony was too proud up till the 11th hour to admit it needed a decent GPU, and NVIDIA was too cheap to go with a tech licensing deal... that is, until they saw they might not end up in any console except the Phantom, and were maybe a little worried about consumer perception. In the end NVIDIA ate crow, but at least this time they didn't pay too many engineers to cook it.
     
  17. gosh

    Newcomer

    Joined:
    Jul 20, 2004
    Messages:
    149
    Likes Received:
    0
    Is there any way to know, or a link showing, that Sony admitted they needed a decent GPU and didn't plan on NVIDIA until late last year?
     
  18. scooby_dooby

    Legend

    Joined:
    May 28, 2005
    Messages:
    8,563
    Likes Received:
    145
    Location:
    E-town, Alberta
    I read they had filed patents regarding using 3 Cells in the PS3; 2 would act as the GPU.

    Then last year they signed a deal with NVIDIA to make a GPU, and went with only 1 Cell processor.

    So I think most of the speculation is based on the patents Sony filed.
     
  19. ondaedg

    Regular

    Joined:
    Oct 5, 2003
    Messages:
    350
    Likes Received:
    1
    That is a reasonable way to determine the order of events. I personally think that NVIDIA had a proposal on the table all along. I think that Sony decided to go with NVIDIA because they bring a lot to the table, including a shader language, Linux expertise, and console experience. Furthermore, if anyone could design a bus between a GPU and a CPU, it's NVIDIA.
     
  20. X-AleX

    Newcomer

    Joined:
    May 20, 2005
    Messages:
    75
    Likes Received:
    14
    When will your eagerly awaited article be up on the website, Dave? :shock:
     