Nvidia Pascal Announcement

Not sure what they mean by "Founders Edition" anyway (reference?)... Anyway, the 1070 is priced very aggressively ($220 less than the 1080).
 
I think both products are very well priced. The difference with the Founders Edition may be the boost clock, since Jen-Hsun said it would have crazy overclockability... I think the demo may have been using the GTX 1080 Founders Edition at the 2114 MHz clock.
 
I think the Founders Edition cards will be the first ones available; when availability gets better, the MSRP will drop and they will drop that name.
 
So is the 2.1 GHz an OC'd speed, or an error in the frequency reading?

EVGA PrecisionX OC
GPU Clock: 2114 MHz
Memory Clock: 5508 MHz
GPU Temp: 67 °C



 
http://nvidianews.nvidia.com/news/a-quantum-leap-in-gaming:-nvidia-introduces-geforce-gtx-1080

Superb Craftsmanship. Increases in bandwidth and power efficiency allow the GTX 1080 to run at clock speeds never before possible -- over 1700 MHz -- while consuming only 180 watts of power. New asynchronous compute advances improve efficiency and gaming performance. And new GPU Boost™ 3 technology supports advanced overclocking functionality.
 
This is their way of saying "we don't want to stop making money with g-sync yet"
Not at all. Adaptive Sync will always be an optional part of the DP spec: there is no reason to require it for business monitors, etc. I think it's because the spec is too new for certification testing. If DP is anything like other telecom specs, a major part of certification is interop testing. I don't think there are DP 1.3 monitors on the market right now, let alone DP 1.4.
 
AMD can send all its high-end stuff to the local municipal dump. The $379 price is devastating for Fury X.

I think we can also let go of the idea that 16nm is not yet mature enough: it'd be insane to price this aggressively without being able to supply the market, given that Nvidia just made most of its Maxwell chips irrelevant.
 
The performance numbers published vary a bit.

In the materials NV published online, it says 2x a Titan X for VR applications (mostly due to some kind of "Multi-Projection" support?), and 1.3-1.4x for existing titles.

Perf announced on stage at the event is a flat 2x without any differentiation (while the slide in the background has "VR performance" on the axis).

Does anyone know what this "Multi-Projection" support actually includes? It sounds like it batches geometry for stereo projection prior to rasterization, so even the 2x perf is probably just the absolute best case for a synthetic, geometry-limited scene? It probably achieves the savings only during tessellation and geometry shader evaluation?
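To make that intuition concrete, here is a toy back-of-the-envelope model (my own sketch with made-up frame-time numbers, not figures from NV): if multi-projection lets both eyes share a single geometry pass, only the geometry fraction of the frame is halved, so the speedup only approaches 2x when the scene is almost entirely geometry-bound.

```cpp
// Toy model of stereo rendering cost with and without shared geometry work.
// All numbers are assumptions for illustration only.
#include <cstdio>

int main() {
    // Hypothetical per-eye frame-time split, in milliseconds.
    const double geometry_ms = 8.0;  // vertex/tessellation/GS work
    const double raster_ms   = 4.0;  // rasterization + pixel shading

    // Naive stereo: run the full pipeline once per eye.
    double naive = 2.0 * (geometry_ms + raster_ms);

    // Multi-projection: geometry is submitted once and broadcast to both
    // eye viewports; per-pixel work is still paid twice.
    double multiproj = geometry_ms + 2.0 * raster_ms;

    std::printf("speedup: %.2fx\n", naive / multiproj);
    // Prints 1.50x with these numbers; the ratio approaches 2x only as
    // raster_ms -> 0, i.e. in a purely geometry-limited scene.
    return 0;
}
```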
 
Specs are a perfect match of what was previously speculated.
Still to be confirmed: 160 TMUs, 64 ROPs, 40 SMs.

A question arises: what will the big graphics Pascal be, given the 15B transistor budget?
Will it be based on the P100, i.e. 60 SMs, 3840 cores, 240 TMUs?
Or will it be 2x GP104, i.e. 80 SMs, 5120 cores, 320 TMUs?

I'm hoping for the latter.
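For what it's worth, a quick sanity check of those configurations (my own arithmetic, using the 64 cores and 4 TMUs per SM that the numbers above imply):

```cpp
// Derive core/TMU counts from SM counts under the post's apparent
// assumption of a GP100-style SM (64 cores, 4 TMUs each).
#include <cstdio>
#include <initializer_list>

int main() {
    const int cores_per_sm = 64;  // assumed, GP100-style SM
    const int tmus_per_sm  = 4;   // 160 TMUs / 40 SMs

    for (int sms : {40, 60, 80}) {
        std::printf("%d SMs -> %d cores, %d TMUs\n",
                    sms, sms * cores_per_sm, sms * tmus_per_sm);
    }
    // 40 SMs -> 2560 cores, 160 TMUs   (GP104 as speculated)
    // 60 SMs -> 3840 cores, 240 TMUs   (P100-based)
    // 80 SMs -> 5120 cores, 320 TMUs   (2x GP104)
    return 0;
}
```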
 
Does anyone know what this "Multi-Projection" support actually includes? It sounds like it batches geometry for stereo projection prior to rasterization, so even the 2x perf is probably just the absolute best case for a synthetic, geometry-limited scene? It probably achieves the savings only during tessellation and geometry shader evaluation?
GeForce can render the same triangle to multiple viewports without GS amplification costs; see GL_NV_viewport_array2 and GL_NV_viewport_swizzle. Maxwell can do this with up to 9 viewports, Pascal with up to 16.
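For illustration, a minimal sketch (my own, not NVIDIA sample code) of what that broadcast looks like from the shader side, assuming a GL 4.5 context with GL_NV_viewport_array2 available; context creation, shader compilation, and the per-eye glViewportIndexedf() setup are left out:

```cpp
// GL_NV_viewport_array2 lets a vertex shader write gl_ViewportMask, sending
// one primitive to several viewports with no geometry-shader amplification.
#include <cstdio>

static const char* kStereoVertexShader = R"(
    #version 450
    #extension GL_NV_viewport_array2 : require

    layout(location = 0) in vec4 in_position;

    void main() {
        gl_Position = in_position;
        // Bitmask of destination viewports: bits 0 and 1 select the
        // left-eye and right-eye viewports, so this single vertex-shader
        // invocation feeds both stereo views.
        gl_ViewportMask[0] = 0x3;
    }
)";

int main() {
    // Placeholder so the sketch compiles on its own; in a real program the
    // string above would go to glShaderSource()/glCompileShader().
    std::puts(kStereoVertexShader);
    return 0;
}
```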
 