Xbox One (Durango) Technical hardware investigation

Discussion in 'Console Technology' started by Love_In_Rio, Jan 21, 2013.

Thread Status:
Not open for further replies.
  1. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    I just thought of something that would be fun to see in a game.

    You know how whenever you run across a TV in game it always shows some looped, generally low resolution video or series of images?

    Combined with the HDMI in on Durango, it'd be possible for those TVs to display a live video feed of the user's choosing. So, in theory, every time you ran across a TV set in a game, it could be displaying a real TV stream that is currently airing. That would go a long way towards making the game world that much more immersive.

    Regards,
    SB
     
  2. Ethatron

    Regular Subscriber

    Joined:
    Jan 24, 2010
    Messages:
    948
    Likes Received:
    417
    Not incapable, no. It just appears difficult to utilize all the features at full tilt at the same time; most engine developers will probably pick only a subset, in line with their strategic engine plans and guided by their budgets for "optimization vs. portability". This isn't bad per se; you're not forced at gunpoint to use them all. I believe 2D engines, for example, would be quite happy with some of the features.
    It may turn out to be a successful jack-of-all-trades platform, and I'd hope it can be positively understood as that, instead of being downplayed as a silicon wasteland in the upcoming vs. discussions and reviews.
     
  3. dobwal

    Legend

    Joined:
    Oct 26, 2005
    Messages:
    5,955
    Likes Received:
    2,325
    Doesn't the display planes' function revolve around MS's version of Google Glass? A plane for each eye?
     
  4. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    They don't have to be.
     
  5. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    I think we are starting to know what's inside the Durango. It seems to me that there are a lot of dedicated chips, just like the Super Nintendo. And you know how it turned out....
     
  6. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    Well... The HUD plane could be "thrown" to a SmartGlass display for example.

    Since the HUD layer is separate, presumably they could also tweak the theme or even combine them where the OS sees fit (for in and out of app/game use).
     
  7. astrograd

    Regular

    Joined:
    Feb 10, 2013
    Messages:
    418
    Likes Received:
    0
    Read the patent for it. :wink:
     
  8. astrograd

    Regular

    Joined:
    Feb 10, 2013
    Messages:
    418
    Likes Received:
    0
    You should read the patent. I think it's pretty clear from that document what the intended/expected uses of this tech are.

    http://www.faqs.org/patents/app/20110304713



    It is pretty clear they are looking at dynamic res/fps/HDR/etc. on the two application (game) planes. The goal, presumably, would be to have the HUD plane always at 1080p with dynamic fps, while the game plane would be dynamic res at a locked 30fps (or perhaps 60fps). By the sound of your theory, one of the frames would be wholly worthless, which is pretty clearly not the case.

    My question is what exactly devs can choose to put in these planes. Could the HUD plane also include the hand/gun in an FPS? Could devs find ways to layer the game world so one plane displays the world's foreground and the other the background? The patent has images that seem to depict a racing-game setup, with the steering wheel in the HUD plane. So what about the rest of the car's interior, for instance? I mean, if they seem to indicate a steering wheel... :?:

    Here is the image I am referring to: http://www.faqs.org/patents/imgfull/20110304713_02

    There are also applications for 3D gaming and streaming to other devices.
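A rough sketch of the dynamic-res idea discussed above: the game plane's render scale is nudged per frame against a fixed budget while the HUD plane stays at native 1080p. The budget, step size, and policy here are all illustrative assumptions, not anything from the patent.

```python
# Hypothetical sketch of dynamic resolution on a game plane with a fixed-res
# HUD plane. Numbers and the adjustment policy are illustrative only.

TARGET_FRAME_MS = 33.3  # 30 fps budget for the game plane

def next_render_scale(current_scale, last_frame_ms,
                      min_scale=0.5, max_scale=1.0, step=0.05):
    """Nudge the game plane's render scale toward the frame budget.
    The HUD plane stays at native 1080p regardless."""
    if last_frame_ms > TARGET_FRAME_MS:          # over budget: drop resolution
        return max(min_scale, current_scale - step)
    if last_frame_ms < TARGET_FRAME_MS * 0.85:   # comfortably under: raise it
        return min(max_scale, current_scale + step)
    return current_scale                         # in the dead zone: hold steady

scale = 1.0
for frame_ms in [40.0, 38.0, 30.0, 25.0, 25.0]:  # simulated GPU frame times
    scale = next_render_scale(scale, frame_ms)

game_res = (round(1920 * scale), round(1080 * scale))  # scaler upsamples this
hud_res = (1920, 1080)                                 # HUD plane never scales
print(scale, game_res, hud_res)
```

The point of the separate planes is exactly that `hud_res` never enters this loop: only the game plane pays for a busy frame.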
     
  9. astrograd

    Regular

    Joined:
    Feb 10, 2013
    Messages:
    418
    Likes Received:
    0
    Oops! Ignore my comment about that image depicting the steering wheel as part of the HUD plane. It's actually part of the game plane. Still re-reading the patent's details. Sorry about that! :/
     
  10. Quaz51

    Regular

    Joined:
    May 18, 2002
    Messages:
    916
    Likes Received:
    1
    Location:
    France
    Hardware merge features are classic in the console world. For example, GT5's 1280x1080 rendering is upscaled (in hardware) and merged (in hardware) with its 1920x1080 HUD; it's a hardware job, not a software one.

    Another example: the merge circuit in the PS2 doc:


    [PS2 merge-circuit diagrams]



    Durango's merge circuit is an enhancement of this.
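What a merge stage like this does can be sketched in a few lines: upscale the lower-res game plane to the display resolution, then alpha-blend the native-res HUD plane on top (source-over). The toy 2x2 → 4x4 resolutions below stand in for GT5's 1280x1080 → 1920x1080; the code is a software illustration, not the hardware's actual datapath.

```python
# Toy sketch of a hardware "merge": nearest-neighbour upscale of a low-res
# game plane, then per-pixel source-over blend with a native-res HUD plane.

def upscale_nearest(plane, out_w, out_h):
    """Nearest-neighbour upscale of a 2D grid of pixel values."""
    in_h, in_w = len(plane), len(plane[0])
    return [[plane[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)] for y in range(out_h)]

def blend(game, hud):
    """Source-over blend: out = hud*alpha + game*(1-alpha), per pixel.
    Each hud entry is (value, alpha); game entries are plain values."""
    return [[h_val * a + g * (1 - a)
             for g, (h_val, a) in zip(g_row, h_row)]
            for g_row, h_row in zip(game, hud)]

game_lowres = [[10, 20], [30, 40]]                    # 2x2 "game plane"
hud = [[(255, 1.0) if (x, y) == (0, 0) else (0, 0.0)  # one opaque HUD pixel
        for x in range(4)] for y in range(4)]         # 4x4 "HUD plane"

merged = blend(upscale_nearest(game_lowres, 4, 4), hud)
print(merged[0][0], merged[3][3])  # HUD wins at (0,0); game shows through elsewhere
```

In real hardware both steps would happen at scan-out, so neither plane's frame buffer is ever touched by the other.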
     
    #970 Quaz51, Feb 12, 2013
    Last edited by a moderator: Feb 12, 2013
  11. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    Quaz51, are you talking about the vertical scaler in RSX for your GT5 example? For merging, do you mean GPU blend or something else (something in the scan-out engine?)
     
  12. astrograd

    Regular

    Joined:
    Feb 10, 2013
    Messages:
    418
    Likes Received:
    0
    More info from the patents...

    [0029] Turning now to FIG. 2, FIG. 2 illustrates a method 50 of outputting a video stream. At 52, method 50 includes retrieving from memory a first plane of display data having a first set of display parameters. It can be appreciated that "plane" as used herein refers to a plane (e.g., layer) of a 2D memory buffer, and thus is distinct from a plane in the traditional sense with respect to a plane of a 2D image. Planes may correspond to (e.g., be sourced by) application-generated display data or system-generated display data, resulting from, for example, frame-buffer(s) produced by the graphics core and/or other system components. Further, planes may be associated with various sources such as main sources, HUD sources, etc. and thus, the first plane may be any such suitable plane.

    [0030] For example, the first plane may be an application main plane comprising an application-generated primary-application display surface for displaying primary application content (e.g., main screen of a driving game). As another example, the first plane may be a system main plane comprising a system-generated primary-system display surface for a computing system (e.g., a window displaying system messages).

    [0031] The first plane has an associated first set of display parameters. Such display parameters indicate how display data of the plane is to be displayed. For example, display parameters could include resolution, color space, gamma value, etc. as described in more detail hereafter.

    [0032] Further, the first plane may be retrieved in any suitable manner, such as by direct memory access (DMA). As an example, the DMA may retrieve front buffer contents from a main memory. As such, a system-on-a-chip (SoC) may be designed to deliver a favorable latency response to display DMA read and write requests. The memory requests may be issued over a dedicated memory management unit (MMU), or they may be interleaved over a port that is shared with the System GPU block requesters. The overhead of the GPU and SoC memory controllers may then be taken into account in the latency calculations in order to design a suitable amount of DMA read buffering and related latency hiding mechanisms. Display DMA requests may be address-based to main memory. All cacheable writes intended for the front buffers may optionally be flushed, either via use of streaming writes or via explicit cache flush instructions.



    [0038] The video scaler may further provide for dynamic resolution adjustment based on system loading for fill limited applications. As such, the resampler may be configured to support arbitrary scaling factors, so as to yield minimal artifacts when dynamically changing scaling factors. Further, resampling may be independent on each of the sources of the planes. In such a case, a high quality 2D filter such as a high quality non-separable, spatially adaptive 2D filter may be desirable for a main plane, whereas non-adaptive, separable filters may be used for HUDs.



    [0050] By performing such post-processing on a per-plane basis, attributes of the sources (e.g. color space, size, location, etc.) can change on a frame by frame basis and therefore can be appropriately buffered to prevent bleeding/coherency/tearing issues. Thus, all display planes may be updated coherently.



    [0058] At 76, method 50 includes outputting the blended display data. In some embodiments, the blended display data may be output to a video encoder. However, in some embodiments, content that is formatted and composited for output may be written back into memory for subsequent use, including possible video compression. The source may be taken from any blending stage, for example, to include or exclude system planes. Alternatively, for fuller flexibility, a separate set of blenders may be added. Such outputting to memory also provides a debug path for the display pipeline.

    Read more: http://www.faqs.org/patents/app/20110304713#ixzz2KeFveMEu
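Paragraph [0031] above says each plane carries its own display parameters (resolution, color space, gamma, etc.) that the pipeline consults when scaling and blending. A minimal per-plane descriptor along those lines might look like this; the field names and values are my assumptions, not patent text.

```python
# Minimal sketch of a per-plane descriptor per [0029]-[0031]: each plane has
# its own display parameters, scaled independently and blended in z-order.
# Field names and example values are assumptions, not from the patent.
from dataclasses import dataclass

@dataclass
class Plane:
    source: str          # "app_main", "system_main", "hud", ...
    width: int
    height: int
    color_space: str     # e.g. "sRGB", "Rec.709"
    gamma: float
    z_order: int         # blend order, low = bottom

game = Plane("app_main", 1600, 900, "sRGB", 2.2, z_order=0)   # dynamic res
hud = Plane("hud", 1920, 1080, "sRGB", 2.2, z_order=1)        # always native

# The pipeline would resample each plane to the output mode independently,
# then blend bottom-up in z-order:
planes = sorted([hud, game], key=lambda p: p.z_order)
print([p.source for p in planes])  # ['app_main', 'hud']
```

Because the parameters live with the plane, [0050]'s point follows naturally: any of them can change per frame without the other planes noticing.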


     
  13. Oboro Shogun

    Newcomer

    Joined:
    Dec 17, 2005
    Messages:
    109
    Likes Received:
    0
    Just from reading, and with my quite modest understanding, Durango is starting to remind me of the Sega Saturn and its small army of processors. If the PS4 is released first and, with its more straightforward and potentially more capable design, becomes the lead platform, I can see all these little "custom" pieces becoming a lot of extra effort to maintain parity across the two machines. I wonder how much the devs will push to achieve it. Look at the difference in effort this gen between 1st- and 3rd-party devs in maximizing the capability of Cell's nuances.
     
  14. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    195
    Location:
    Stateless
    Forget about it. I read the article again and I misunderstood it in the first place; shit happens :lol:
     
  15. Ketto

    Newcomer

    Joined:
    Jul 30, 2012
    Messages:
    39
    Likes Received:
    0
    Location:
    Winter Park, Florida; and London UK.
    Sega Saturn will rise again

    -Huge SS fan.
     
  16. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    My gut feel, which can of course be wrong, is: It's for enabling new (and consistent) user experience.

    Resource saving is secondary, because in the first patent quotes you highlighted we see the word "sacrificed". Dynamic resolution is activated when certain visual elements, such as resolution, have been compromised to hit a higher framerate. This is different from optimization techniques that improve or do not impact visuals (e.g., culling).

    The invention allows the devs to soften the blow by minimizing/hiding the impact to HUD. Durango is certainly not designed for compromises as their first priority. They probably have other worthy goodies in mind that may take away some of the resources under certain scenarios (e.g., running something else together with gaming, or perhaps to enable OS animated thumbnail mode, etc.).

    Your steering wheel HUD example is OK. You don't have to follow the patent example to a tee. For games that are SmartGlass friendly, it may make sense to render the steering wheel and dashboard to a separate plane (even together with the HUD).

    Most of the other points you listed from the patents can be done using software on Durango or Orbis too.

    I am more interested in the workflow, such as the virtual texture workflow, in Durango. If they do enough work during bake time, they may be able to load more details and pre-calculated data into the 8GB RAM.
     
  17. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,680
    Nothing about this setup prevents you from doing culling, or any optimizations. It may or may not make dynamic resolution easier, but by no means makes it automatic either. This hardware is there for QoS guarantees, and to provide some simple workarounds for visual issues, like HUD and overlay resolutions in non-native resolution games.
     
  18. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    It should/would simplify development while safeguarding some basic user experience.

    Developers will optimize their software for the target GPUs (Durango, Orbis, PS3, 360, WiiU) as usual. The display planes' scaling will happen in parallel on-demand. A real time OS should support various prioritization schemes and policies.

    e.g., Besides "protecting" game HUD resolution, the OS may also reduce the resolution of DVR videos or drop frames when a game is playing.

    Without the display planes, these adjustments have to be done in an app-specific manner (like on Mac/PC). For instance, PS3's Torne DVR will drop frames "automatically" when there are insufficient resources.

    With display planes, they are handled more uniformly and consistently, in line with OS policies and use cases.
     
  19. Nevod

    Newcomer

    Joined:
    Feb 6, 2013
    Messages:
    18
    Likes Received:
    0
    Location:
    Krasnoyarsk, RF
    On the proposition that the LZ-decoding DME would be used to decompress LZ-packed DXT: that does seem to be the most efficient way to use it. However, even LZ-packed, DXT is really only suitable for characters and other small, repeatable objects, not for environments; the compression ratio of any JPEG-like method is much higher than that of LZ-DXT. It's strange that they did not implement a JPEG XR decoder, since it doesn't seem much more complex than plain JPEG and doesn't have blocking artifacts. Strange as well that the JPEG decoder doesn't output DXT, though that does leave the possibility of doing some deblocking post-processing and then packing to DXT via GPGPU.
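Some back-of-envelope numbers behind that comparison. DXT1's fixed 4 bits per pixel (8 bytes per 4x4 block) is exact; the LZ gain and the JPEG-like ratio below are rough assumptions purely for illustration.

```python
# Back-of-envelope texture-size math. DXT1's 4 bits/pixel is exact; the
# LZ (~1.8x) and JPEG-like (~15:1) ratios are assumed, illustrative values.

def dxt1_bytes(w, h):
    """DXT1/BC1 stores each 4x4 pixel block in 8 bytes (4 bits per pixel)."""
    blocks = ((w + 3) // 4) * ((h + 3) // 4)
    return blocks * 8

W, H = 1024, 1024
raw_rgb = W * H * 3            # 24bpp source: 3 MiB
dxt1 = dxt1_bytes(W, H)        # fixed 6:1 vs 24bpp
lz_dxt = dxt1 / 1.8            # assumed ~1.8x further LZ gain on DXT blocks
jpeg_like = raw_rgb / 15       # assumed ~15:1 for a JPEG-like codec

# Report sizes in KiB; the JPEG-like path ends up well below LZ-DXT.
print(raw_rgb // 1024, dxt1 // 1024, int(lz_dxt) // 1024, int(jpeg_like) // 1024)
```

Under these assumptions LZ-DXT lands around 11:1 overall while the JPEG-like codec reaches 15:1 (and higher at lower quality), which is the gap the post is pointing at.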
     
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    I suppose, using the HDMI in, they could render the TV signal to a texture. The composition planes won't help because they aren't 3D-mapped to geometry. But then why not fetch video over the internet? That way devs could maintain content style, which is important. You don't want an episode of 24 or Smallville in a game set in the 1960s, or in a fantasy game.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.