Intel ARC GPUs, Xe Architecture for dGPUs

Discussion in 'Architecture and Products' started by DavidGraham, Dec 12, 2018.

  1. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    A laptop I purchased in 2011 (with an i5-2500H or something like that) had a Sandy Bridge GPU and I was soooo hyped. Especially about the Miracast feature, so I could use my Miracast-capable TV with my laptop without cables. It turns out that my particular Intel HD 3000 Graphics GPU wasn't Miracast compatible (iirc, some HD 3000 parts were compatible, and the HD 4000 was fully compatible).

    Your point makes sense. It was with Sandy Bridge that I could finally complete The Witcher 1 on my laptop, and I loved the feeling of having a computer I could carry around and play good games on.

    Also, Diablo 3 was playable on the HD 3000. I purchased it day one, and while it rarely hit 720p 60fps even at the lowest possible details, the feeling of having your humble laptop running an AAA game was kinda special. The less you have, the more you come to recognize the true worth of things. :smile2:

    Did you have the Intel 740? I didn't, tbh; it was only when I started buying laptops that I began to use Intel GPUs.

    The first laptop I ever got (2005) had some GMA-something GPU, which was REALLY bad for gaming, but well, it worked. I had several laptops with integrated Intel GMA solutions until 2011, but I didn't complain; that was the period of my life (2005 'til 2015) when I played consoles the most, 'cos my laptops couldn't keep up, and I prefer laptops to desktop computers.
     
    PSman1700 likes this.
  2. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    What do you mean by "compared to the other guys, where every SIMD lane is a 'core'"? Could you elaborate? Just curious...

    Now that you mention it, I wonder how Intel teraflops are going to compare to Nvidia and AMD teraflops.

    Teraflop numbers aside, one area where Intel is light years ahead of the competition is their GPU control panel. It looks nice, it's fast and compact, and it looks really professional compared to the Nvidia Control Panel (a horrible, slow design I suffer every day; GeForce Experience isn't much better) and AMD's software (I purchased an RX 570 back in 2017 and also built three mining rigs with Vega GPUs for people who asked me to back then, and that red background colour plus the overall design was poor, although slightly better than Nvidia's, and faster).

    I kinda miss that Intel HD Graphics option in the context menu that appears when you right-click on the desktop. It was simple, it was fast, and the interface was gorgeous, without being the Sistine Chapel, imho.
     
    PSman1700 likes this.
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York
    That would actually make more sense. There are some other interesting architectural features that aren't exposed in the marketing. An Xe core is configured as 16x256-bit, an Nvidia SM is 4x2x512-bit, an AMD CU is 2x1024-bit, etc.
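
    To put those configurations on a common footing, here's a rough back-of-the-envelope sketch of the FP32 lane math (plain arithmetic on the figures quoted above; the groupings are paraphrased, not official vendor terminology):

    # Rough sketch: FP32 lanes per hardware block, from the configs quoted above.
    # Assumes each vector unit is packed with 32-bit float lanes (width / 32).
    FP32_BITS = 32

    configs = {
        "Intel Xe core": (16, 256),     # 16 vector engines x 256-bit
        "Nvidia SM":     (4 * 2, 512),  # 4 partitions x 2 datapaths x 512-bit
        "AMD CU":        (2, 1024),     # 2 SIMDs x 1024-bit
    }

    for block, (units, width_bits) in configs.items():
        lanes = units * width_bits // FP32_BITS
        print(f"{block}: {lanes} FP32 lanes")
    # -> Intel Xe core: 128, Nvidia SM: 128, AMD CU: 64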
     
    Cyan, PSman1700 and DavidGraham like this.
  4. neckthrough

    Newcomer

    Joined:
    Mar 28, 2019
    Messages:
    138
    Likes Received:
    388
    Despite all the vitriol that it gets, I loved the Intel 740 for what it was: a 3D accelerator that was actually available at an affordable price in the country where I grew up. Cards based on Nvidia and 3dfx chips were exorbitantly expensive due to the purchasing-power differential in a developing country, plus markups and import duties. The only two realistic options for us at that time were the SiS 6326 and the Intel 740.

    As far as I remember the SiS 6326 didn't really "accelerate" much, but I think it did provide texture filtering. As a teenager who grew up looking at blocky software textures, seeing texture filtering in action on the Half Life: Uplink demo was mind-boggling. When the game first loaded up I thought it was stuck on a pre-rendered loading screen. I distinctly remember thinking that the game had hung because it was taking so long to load! When I moved the mouse and realized the filtered textures on the walls were being rendered in real-time, it was... one of those defining moments that you remember all your life. All that from the sorry little SiS. The Intel 740 was a huge step up in comparison.

    I believe the i810 was just an integrated version of the 740. I remember it being plagued by horrible driver (and/or performance?) problems that were never an issue with the 740. The problems did not go away over multiple chipset generations.
     
  5. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    Back then I had a G200 together with the Voodoo 2. A year later the Matrox G400 Max. In 2000 it was the first Radeon 64MB DDR VIVO AGP (R100), then Radeon 8500, Radeon 9700 PRO, Radeon X800 XT, Radeon 4890, Radeon 5870 ... how time flies!
     
    Lightman, Cyan and PSman1700 like this.
  6. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,090
    Loved that GPU. Ripped through HL2 ;)
     
  7. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York
    Nvidia and AMD market their cards based on the number of FP32 instructions per clock. So the 3070 Ti has 6144 “cores”. In Intel speak that would be 48 cores with 8 x 512-bit vector ALUs each.

    Intel’s version is more accurate but not as helpful for marketing.
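
    As a sanity check on that arithmetic, the two counting schemes describe the same hardware (a sketch, assuming 32-bit lanes in each 512-bit ALU):

    # The 3070 Ti counted both ways, using the figures quoted above.
    xe_style_cores = 48        # "cores" in Intel-style counting
    alus_per_core = 8          # 512-bit vector ALUs per core
    lanes_per_alu = 512 // 32  # 16 FP32 lanes per 512-bit ALU

    # Nvidia-style counting: every FP32 lane is a "core".
    assert xe_style_cores * alus_per_core * lanes_per_alu == 6144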
     
    Cyan likes this.
  8. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    Even Intel's terminology for what a "core" means is arguably spurious from the standpoint of CPUs. An "Xe core" is likely not an individual unit of control flow. On AMD HW, a unit of control flow is defined per vector unit on the CUs. Without delving too much deeper, a unit of control flow could possibly be defined as either per vector engine or per pair of vector engines, which might be closer to the traditional concept of a "core"...
     
  9. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    960
    Likes Received:
    853
  10. Putas

    Regular

    Joined:
    Nov 7, 2004
    Messages:
    737
    Likes Received:
    354
    Limited Edition? Is it going to be something like the GeForce FE, where you pay more for the privilege of getting the card earlier, or is it going to be worse?
     
  11. Dayman1225

    Newcomer

    Joined:
    Sep 9, 2017
    Messages:
    77
    Likes Received:
    169
    AFAIK it’s more like the Founders Edition.
     
  12. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    https://videocardz.com/newz/intel-a...sted-slower-than-gtx-1650-up-to-2-2-ghz-clock
     
  13. Cyan

    Cyan orange
    Legend

    Joined:
    Apr 24, 2007
    Messages:
    9,734
    Likes Received:
    3,460
    That laptop is close to what I want for my future computer, a super light gaming laptop, although I'd prefer another, more capable Intel GPU (still waiting to see whether they deliver or not).

    AMD has shared a comparison of their RX 6500M vs the Arc A370M (the one with a 35W to 50W TDP):

    AMD Posts Intel Arc A370M vs. Radeon RX 6500M Benchmarks | Tom's Hardware (tomshardware.com)
     
    Lightman and PSman1700 like this.
  14. Jay

    Jay
    Veteran

    Joined:
    Aug 3, 2013
    Messages:
    4,029
    Likes Received:
    3,428
    Dayman1225 likes this.
  15. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,887
    Likes Received:
    4,534
  16. del42sa

    Newcomer

    Joined:
    Jun 29, 2017
    Messages:
    208
    Likes Received:
    137
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,465
    Location:
    Finland
    For those not familiar with Siru, it was founded by Kaj & Mika Tuomi and Mikko Alho.
    Kaj & Mika have been in "the scene" since the Future Crew days, after which they founded BitBoys, which was eventually sold to ATI and later sold by AMD to Qualcomm as part of their Imageon group.
    The Tuomi brothers left Qualcomm in 2010 and asked Alho, who had been with them since at least the BitBoys days, to leave Qualcomm too and co-found Siru.

    As for Siru's products, their IP is in customers' production hardware, but who those customers are isn't public information.
     
  18. BlackAngus

    Newcomer

    Joined:
    Apr 2, 2003
    Messages:
    134
    Likes Received:
    31
    Woot! Future Crew!
    Man I loved their demos back in the day.
     
  19. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,309
    Likes Received:
    1,105
    Location:
    35.1415,-90.056
    Hell yes!
     
    Lightman and PSman1700 like this.
  20. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,969
    Likes Received:
    963
    Location:
    Torquay, UK
    They know how to use the blitter and the Copper :cool2:
     