Supercomputers: Obama orders world's fastest computer

Discussion in 'Graphics and Semiconductor Industry' started by pharma, Jul 30, 2015.

  1. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,634
    Likes Received:
    1,374
    President Obama has signed an executive order calling for the US to build the world's fastest computer by 2025.

    http://www.bbc.com/news/technology-33718311
     
  2. pcchen

    pcchen Moderator
    Moderator Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    2,723
    Likes Received:
    93
    Location:
    Taiwan
    The original hope was to have an exascale supercomputer by 2018. Apparently that's impossible, and now 2025 is the new timeframe...
     
    Razor1, AlexV and Jawed like this.
  3. xpea

    Regular Newcomer

    Joined:
    Jun 4, 2013
    Messages:
    366
    Likes Received:
    292
    I suspect it will be a race between Intel Xeon/Xeon Phi and IBM/Nvidia to win this bid.
    Volta's successor on TSMC 10nm, based on the Echelon exascale project, is the obvious candidate.
     
    Pixel and Grall like this.
  4. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    7,982
    Likes Received:
    2,429
    Location:
    Well within 3d
    The original projection by the US Department of Energy put the power cap at 20 MW, which this executive order ups to 30 MW. The story quotes a guess of 60 MW.

    Is this separate from this plan? http://www.hpcwire.com/2015/07/28/doe-exascale-plan-gets-support-with-caveats/#/
    The goals there are somewhat broader, with multiple exascale systems by 2023 at 20 MW.
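
    For scale (my arithmetic, not a figure from either article): hitting an exaflop within the original 20 MW cap requires an efficiency of

    \[
    \frac{10^{18}\ \text{FLOPS}}{20 \times 10^{6}\ \text{W}} = 5 \times 10^{10}\ \text{FLOPS/W} = 50\ \text{GFLOPS/W},
    \]

    several times what the most efficient machines on the mid-2015 Green500 list manage (around 7 GFLOPS/W, if memory serves).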
     
  5. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
    60GW is frighteningly huge. It's a large-sized Swedish town's worth of electricity. The city I'm in, the 2nd largest in the country, pulled around 280GW peak IIRC on very cold winter days in the late 1990s. I'm not sure where current consumption levels lie, because the display that showed these figures is no longer there. :( The city has grown quite a bit since then of course, but things like district heating (and cooling, for that matter) have also been expanded a lot to reduce electricity (and fossil fuel) consumption.

    Hopefully they don't locate this monster computer anywhere there's a water shortage, because that would be totally irresponsible. :p Preferably, you'd cool it with something like deep-ocean water, and then capture the waste heat for household use.
     
  6. idsn6

    Regular

    Joined:
    Apr 14, 2006
    Messages:
    459
    Likes Received:
    96
    And the easiest way to win the race is to kneecap the leader with export sanctions.
     
  7. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    faster Porn :evil:
     
  8. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    7,982
    Likes Received:
    2,429
    Location:
    Well within 3d
    It is a large estimate, large enough that it might be an error by someone unaware that the numbers are being given as goals.
    There are pragmatic reasons for aiming for 20 MW, or even the executive order's 30.

    Perhaps not responsible, but it's a consideration that is sometimes discounted, particularly for systems not on the public HPC lists. The NSA's Utah data center consumed 6.2 million gallons of water in 2014, in a desert state with drought conditions. DP floating point might not figure much into its workload, but otherwise it's a very large collection of networked compute racks.
     
  9. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,390
    Likes Received:
    802
    Is the water actually consumed, or does it simply pass through the cooling system?
     
  10. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    7,982
    Likes Received:
    2,429
    Location:
    Well within 3d
    The method suggested elsewhere is evaporative cooling, used when temperatures rise high enough that the facility's standard cooling cannot dissipate heat fast enough. The amount varies with the season, although the periods of high temperature can unhelpfully coincide with drought.

    The increasing difficulty of facility-wide power delivery and heat dissipation is why the Exascale initiative and the recent executive order attach rather constraining MW limits.
     
    Alexko likes this.
  11. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,170
    Location:
    La-la land
    Yes, and also, water is consumed in much the same way to cool the power plant generating the electricity the facility runs on. So water losses could more or less double by building this thing in the wrong state.
     
  12. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,634
    Likes Received:
    1,374
    July 29, 2015 articles:
    http://www.extremetech.com/extreme/...er-to-build-first-ever-exascale-supercomputer

    http://www.extremetech.com/extreme/210872-extremetech-explains-what-is-moores-law
     
    #12 pharma, Jul 30, 2015
    Last edited: Jul 30, 2015
  13. London-boy

    London-boy Shifty's daddy
    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    21,106
    Likes Received:
    4,562
  14. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,059
    Likes Received:
    838
    Location:
    still camping with a mauler
    Crysis

    Or maybe they are trying to break 30fps in Arkham Knight. If so I have bad news for them...
     
    elect, Simon F, ToTTenTranz and 3 others like this.
  15. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,390
    Likes Received:
    802
    I was about to make the exact same two jokes! :lol:


    On a more serious note, it's probably the usual suspects: simulations of climate models, molecular dynamics, nuclear explosions, fluid dynamics, etc.
     
  16. London-boy

    London-boy Shifty's daddy
    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    21,106
    Likes Received:
    4,562
    Boring.

    I want a full benevolent AI taking over from the lunatics who are running the world.
     
    tabs, AlBran and Dr Evil like this.
  17. Pixel

    Regular

    Joined:
    Sep 16, 2013
    Messages:
    915
    Likes Received:
    371
    What if it's benevolent to all organic life on Earth as a whole, and deems that all industrialized nations and populations must be exterminated? Then you'd only have a small farming population left, and some remote Amazonian, African and Australian tribes surviving. Guerrilla Games should make a game around something like that.
     
  18. Dr Evil

    Dr Evil Anas platyrhynchos
    Legend Veteran

    Joined:
    Jul 9, 2004
    Messages:
    5,704
    Likes Received:
    703
    Location:
    Finland
    One is talking about megawatts and the other about gigawatts...
     
    pharma likes this.
  19. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    7,982
    Likes Received:
    2,429
    Location:
    Well within 3d
    One of the most immediate applications--and a giant reason why the DOE is so interested--is nuclear weapons modeling. The finer points of what makes atomic weapons do what they do best and how aging affects them are harder to verify since actually detonating a few for research purposes has been banned.
    Besides that, materials science, particle physics, chemistry, climate modeling, and astronomy are noted areas where the demand is extreme and where exascale is still far from sufficient.


    That would be unsustainable. A giant computational device or network would need those industrialized nations to give it the highly pure materials, energy, and infrastructure it would take to last more than a few years, or potentially months if the HVAC goes down or an unattended municipal water supply shuts down.
    The remnants of humanity would be back where they started within a few thousand years. An entity that could only be born from a multibillion-dollar clean room with an obscenely dust-free atmosphere, precise chemistry, impurities measured in parts per billion, almost no vibration, and some of the rarest elements in the universe needs that civilization of biologics the way a tribe on a coral atoll depends on billions of simple polyps.
    The AI could build its own civilization equivalents, but then that's trading one type of grasping, emergent organization for another.

    To the topic at hand, the article I linked had a list of challenges beyond just the consideration of a shiny new chip, which amounts to a fraction of several bullet points at best.

    • Energy efficiency: Creating more energy-efficient circuit, power, and cooling technologies.
    • Interconnect technology: Increasing the performance and energy efficiency of data movement.
    • Memory technology: Integrating advanced memory technologies to improve both capacity and bandwidth.
    • Scalable system software: Developing scalable system software that is power- and resilience-aware.
    • Programming systems: Inventing new programming environments that express massive parallelism, data locality, and resilience.
    • Data management: Creating data management software that can handle the volume, velocity, and diversity of data that is anticipated.
    • Exascale algorithms: Reformulating science problems and redesigning, or reinventing, their solution algorithms for exascale systems.
    • Algorithms for discovery, design, and decision: Facilitating mathematical optimization and uncertainty quantification for exascale discovery, design, and decision making.
    • Resilience and correctness: Ensuring correct scientific computation in the face of faults, reproducibility, and algorithm verification challenges.
    • Scientific productivity: Increasing the productivity of computational scientists with new software engineering tools and environments.
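
    On the resilience bullet: the standard coping mechanism today is application-level checkpoint/restart, which becomes its own scaling problem at exascale, since dumping the state of 10^8 processors is itself a massive I/O event. A minimal single-process sketch in C (the struct, file name, and interval are invented for illustration, not taken from the DOE plan):

    ```c
    #include <stdio.h>

    /* Minimal application-level checkpoint/restart sketch.
     * The simulation periodically dumps its state to a file; on
     * restart it resumes from the last complete checkpoint instead
     * of losing the whole run to a single fault. */

    typedef struct {
        long next_step;     /* first step not yet completed */
        double state[1024]; /* simulation state to preserve */
    } Checkpoint;

    static int save_checkpoint(const Checkpoint *cp, const char *path)
    {
        /* A production version would write to a temp file and rename()
         * it, so a crash mid-write can't corrupt the old checkpoint. */
        FILE *f = fopen(path, "wb");
        if (!f) return -1;
        size_t ok = fwrite(cp, sizeof *cp, 1, f);
        return (fclose(f) == 0 && ok == 1) ? 0 : -1;
    }

    static int load_checkpoint(Checkpoint *cp, const char *path)
    {
        FILE *f = fopen(path, "rb");
        if (!f) return -1; /* no checkpoint yet: start fresh */
        size_t ok = fread(cp, sizeof *cp, 1, f);
        fclose(f);
        return ok == 1 ? 0 : -1;
    }

    int main(void)
    {
        Checkpoint cp = { .next_step = 0 };
        if (load_checkpoint(&cp, "sim.ckpt") == 0)
            printf("resuming from step %ld\n", cp.next_step);

        for (long step = cp.next_step; step < 1000000; ++step) {
            /* ... advance the simulation one step, updating cp.state ... */
            if ((step + 1) % 10000 == 0) { /* interval trades I/O cost
                                              against lost work on failure */
                cp.next_step = step + 1;
                if (save_checkpoint(&cp, "sim.ckpt") != 0)
                    fprintf(stderr, "checkpoint failed at step %ld\n", step);
            }
        }
        return 0;
    }
    ```

    The nasty part at exascale is that the time to write a checkpoint can approach the mean time between failures, at which point the machine spends most of its life saving and restoring itself.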
     
    Pixel and London-boy like this.
  20. Billy Idol

    Legend Veteran

    Joined:
    Mar 17, 2009
    Messages:
    5,924
    Likes Received:
    760
    Location:
    Europe
    Research!

    I have been following exascale computing for years...it is an extreme scientific challenge....not only on the hardware side, but also on the software side: my code scales perfectly to about O(10^5) up to O(10^6) MPI ranks. But this is on Blue Gene systems with O(petaflop) performance...exascale computing is basically everything times 1000 :)

    But this would mean O(10^8)-O(10^9) processors for a Blue Gene type of architecture :shock:

    How many of those will break down during a simulation due to hardware faults? Can your software handle failing MPI ranks?

    One of the biggest problems...parallel filesystems and I/O! A million processors writing simultaneously to a single file?! (see the MPI-IO sketch below)

    Just to name a few issues :mrgreen:
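
    On the parallel-I/O point (the sketch promised above): the standard mitigation is collective MPI-IO, where each rank writes its disjoint slice of one shared file in a single coordinated call, letting the MPI layer and the parallel filesystem aggregate requests instead of fielding a million independent writes. A minimal sketch in C (file name and sizes invented for illustration):

    ```c
    #include <mpi.h>
    #include <stdlib.h>

    #define LOCAL_N 4096 /* doubles owned by each rank (illustrative) */

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        int rank;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Stand-in for locally computed simulation data. */
        double *local = malloc(LOCAL_N * sizeof *local);
        for (int i = 0; i < LOCAL_N; ++i)
            local[i] = rank + i * 1e-6;

        MPI_File fh;
        MPI_File_open(MPI_COMM_WORLD, "output.dat",
                      MPI_MODE_CREATE | MPI_MODE_WRONLY,
                      MPI_INFO_NULL, &fh);

        /* Each rank writes at its own disjoint offset. The _all suffix
         * makes the call collective, so the implementation can merge
         * neighbouring requests into large, filesystem-friendly stripes. */
        MPI_Offset offset = (MPI_Offset)rank * LOCAL_N * sizeof(double);
        MPI_File_write_at_all(fh, offset, local, LOCAL_N, MPI_DOUBLE,
                              MPI_STATUS_IGNORE);

        MPI_File_close(&fh);
        free(local);
        MPI_Finalize();
        return 0;
    }
    ```

    The failing-ranks question is harder: a stock MPI job in 2015 simply aborts when a rank dies, which is why proposals like ULFM (user-level failure mitigation) and plain checkpoint/restart, as sketched earlier in the thread, get so much attention.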
     
    Lightman and Grall like this.