NVIDIA Kepler speculation thread

What's Hainan? Alright, it's a next-gen Radeon above Bonaire, replacing Pitcairn. So it's about two market segments higher and will likely compete with the GTX 660 or 760.

GK110 (and I guess GK208 as well) can do 64 shifts per clock per SMX instead of 32, which surely should help _a lot_ with Bitcoin mining. IIRC it also has a rotate instruction now. Obviously it's not enough to catch GCN there (I think GCN can do full-rate shifts, that is 64 per CU), but at least it looks less appalling now. Not sure what the story is with LuxMark.
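
To spell out why shift throughput matters for mining: SHA-256 (the hash Bitcoin uses) leans heavily on 32-bit rotates, and without a native rotate instruction each one decomposes into two shifts plus an OR. A minimal C sketch of the hot pattern (just an illustration, not actual miner code):

```c
#include <stdint.h>
#include <stdio.h>

/* Without a native rotate instruction, every 32-bit rotate costs two
   shifts plus an OR, so per-SMX shift throughput bounds hashing speed.
   Valid for 0 < n < 32. */
static uint32_t rotr32(uint32_t x, unsigned n)
{
    return (x >> n) | (x << (32 - n));
}

/* One of SHA-256's Sigma functions: three rotates per evaluation, and
   the compression loop evaluates several such functions per round,
   64 rounds per 512-bit block. */
static uint32_t big_sigma0(uint32_t x)
{
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22);
}

int main(void)
{
    /* 0x6a09e667 is the first SHA-256 initialisation constant. */
    printf("%08x\n", big_sigma0(0x6a09e667u));
    return 0;
}
```

A native rotate collapses each of those three-op sequences to a single instruction, which is why it matters so much here on top of the doubled shift rate.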

GK208 also has double the L2 size vs. GK107.
What about a Blender Cycles benchmark, would it fare well?
Anyway, this Gainward card looks usable; the reference design was garbage, with its noisy tiny fan.
Good card for a Linux or FreeBSD desktop, I'd say. Also, £60 is about 70 euros; that's not very cheap but at least affordable. Now I hope they actually sell it. GK208 is wholly unavailable here, like DX9/DX10 S3 Graphics cards were.
 
What's Hainan? Alright, it's a next-gen Radeon above Bonaire, replacing Pitcairn. So it's about two market segments higher and will likely compete with the GTX 660 or 760.

No, it's a very low-end chip. Like Oland/Mars, but with a smaller memory bus, probably half the ROPs, and perhaps other differences; I'm not sure.

It might be OEM-only, however.
 
No, it's a very low-end chip. Like Oland/Mars, but with a smaller memory bus, probably half the ROPs, and perhaps other differences; I'm not sure.
And it's missing another CU, plus it has no display outputs for instance, so it would be difficult to sell it as a desktop card :). I think Mars is quite a valid comparison against GK208. Mars indeed has twice the memory bus width, meaning it doesn't have to totally suck when equipped with just DDR3, unlike GK208 (though plenty of useless 64-bit DDR3 mobile versions of Mars/Oland exist too, and I believe GK208 still has some advantage in such more or less 100% bandwidth-limited scenarios). But overall both achieve similar performance with a similar die size, and both are "complete" chips.
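
To put rough numbers on the bandwidth point, here's a quick peak-bandwidth calc; the memory speeds are assumed examples for illustration, not the actual card configs:

```c
#include <stdio.h>

/* Peak memory bandwidth: bus width in bytes times transfer rate.
   The clocks below are illustrative, not actual card specs. */
static double peak_gb_per_s(int bus_bits, double mt_per_s)
{
    return (bus_bits / 8.0) * mt_per_s / 1000.0;
}

int main(void)
{
    printf("64-bit  DDR3-1800:  %4.1f GB/s\n", peak_gb_per_s(64, 1800.0));  /* 14.4 */
    printf("128-bit DDR3-1800:  %4.1f GB/s\n", peak_gb_per_s(128, 1800.0)); /* 28.8 */
    printf("64-bit  GDDR5-5000: %4.1f GB/s\n", peak_gb_per_s(64, 5000.0));  /* 40.0 */
    return 0;
}
```

So a 128-bit DDR3 Mars simply has twice the bandwidth of a 64-bit DDR3 GK208, which is what the "doesn't have to totally suck with DDR3" remark comes down to.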
 
No, it's a very low-end chip. Like Oland/Mars, but with a smaller memory bus, probably half the ROPs, and perhaps other differences; I'm not sure.

It might be OEM-only, however.
320 shaders (5 CUs), a 64-bit memory bus, probably 4 ROPs, and no UVD or display controllers, so it can only be paired with APUs.
 
Could be true; if so, I called both to perfection. :p http://forum.beyond3d.com/showpost.php?p=1735997&postcount=6462

Could also be a deflection because they have nothing to fight VI with except the promise of the next great series, making it Evergreen vs Fermi all over again.

But in this case, why release more 700 series cards in November and December if they plan to release Maxwell on 28nm early in Q1? I can see a new Titan (even if I see a problem on price vs. the AMD counterparts). I'm not saying it's not the case, but to me the article looks like it takes nearly every rumor and slaps them together in a really confusing way.
 
So I've heard good and bad things about TXAA.
Some IQ reviews at HardOCP show TXAA having good IQ at a minimal performance hit, less than comparable MSAA modes, but other reviews show terrible blurring.
 
So I've heard good and bad things about TXAA.
Some IQ reviews at HardOCP show TXAA having good IQ at a minimal performance hit, less than comparable MSAA modes, but other reviews show terrible blurring.

They have corrected the blur level a bit, but it's a trade-off; you can't have it all. If you like downsampling, supersampling and ultra-sharp textures, this is not for you.
 
Or a quick response to BF4's enhanced physics/weather and particle effects shown during gamescom.
That's beside my point (NVIDIA vs AMD). What you are discussing is EA vs Activision.

At any rate, after analyzing the period of 2012-2013, the sudden surge of AMD-supported games is attributed largely to the deal between AMD and Square Enix, which released 3 games in that period: Hitman, Sleeping Dogs and Tomb Raider (plus Deus Ex a year before).

Both Far Cry 3 and Crysis 3 were supported by both companies. Far Cry 3 supported HDAO (AMD) and HBAO (NVIDIA), while Crysis 3 had AMD's logo and NVIDIA's TXAA.

NVIDIA has an old partnership with Ubisoft, which is releasing 3 games this year: Assassin's Creed, Splinter Cell and Watch Dogs, all of them heavily influenced by NVIDIA.

From now on, there will be a clear divide in the market: some studios are taking NVIDIA's side, others are taking AMD's. NVIDIA's side is still larger; we'll see down the road if that remains true.
 
So I've heard good and bad things about TXAA.
Some IQ reviews at HardOCP show TXAA having good IQ at a minimal performance hit, less than comparable MSAA modes, but other reviews show terrible blurring.
It's pretty good at eliminating crawling edges and flickering, but it does introduce noticeable blurring.
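
For context on where the blur comes from: TXAA's exact filter is proprietary, but any temporal AA resolves the current frame against a reprojected history buffer, roughly like this C sketch (the blend weight is an assumed example):

```c
#include <stdio.h>

typedef struct { float r, g, b; } rgb;

/* Generic temporal AA resolve: blend this frame's sample with the
   reprojected history buffer. The heavy history weight is what kills
   crawling edges and flicker, but resampling the history every frame
   is also where the softness/blur comes from. */
static rgb taa_resolve(rgb current, rgb history, float alpha)
{
    rgb out;
    out.r = alpha * current.r + (1.0f - alpha) * history.r;
    out.g = alpha * current.g + (1.0f - alpha) * history.g;
    out.b = alpha * current.b + (1.0f - alpha) * history.b;
    return out;
}

int main(void)
{
    rgb cur  = {1.0f, 1.0f, 1.0f};  /* bright edge pixel this frame */
    rgb hist = {0.2f, 0.2f, 0.2f};  /* darker history after motion  */
    rgb out  = taa_resolve(cur, hist, 0.1f); /* ~90% history: stable but soft */
    printf("%.2f %.2f %.2f\n", out.r, out.g, out.b); /* prints 0.28 0.28 0.28 */
    return 0;
}
```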
 
That's beside my point (NVIDIA vs AMD). What you are discussing is EA vs Activision.

At any rate, after analyzing the period of 2012-2013, the sudden surge of AMD-supported games is attributed largely to the deal between AMD and Square Enix, which released 3 games in that period: Hitman, Sleeping Dogs and Tomb Raider (plus Deus Ex a year before).

Both Far Cry 3 and Crysis 3 were supported by both companies. Far Cry 3 supported HDAO (AMD) and HBAO (NVIDIA), while Crysis 3 had AMD's logo and NVIDIA's TXAA.

NVIDIA has an old partnership with Ubisoft, which is releasing 3 games this year: Assassin's Creed, Splinter Cell and Watch Dogs, all of them heavily influenced by NVIDIA.

From now on, there will be a clear divide in the market: some studios are taking NVIDIA's side, others are taking AMD's. NVIDIA's side is still larger; we'll see down the road if that remains true.

I should have written that I completely agree with you; my post was a little joke about the fight between DICE and the Call of Duty developers.

The day before, DICE had done a presentation of BF4 and dropped some videos of the enhanced physics, weather, smoke and particle system. I found it funny to see the other "team" coming back with NVIDIA and TXAA + APEX smoke.
 
Yeah, I hear ya. It's funny when you see those COD guys touting their so-called next-gen engine, when it is in fact the same bloody old engine since MW1 with a minor and pathetic paint job.

COD is in serious need of a graphics and gameplay makeover.
 
Yeah, I hear ya. It's funny when you see those COD guys touting their so-called next-gen engine, when it is in fact the same bloody old engine since MW1 with a minor and pathetic paint job.

COD is in serious need of a graphics and gameplay makeover.

Sadly yes. I hope NVIDIA can push them a bit to change this (they have the leverage for it in general).
 
GamesCom 2013: Call of Duty: Ghosts To Be Enhanced With NVIDIA TXAA & NVIDIA PhysX

It seems NVIDIA is going big with PhysX this year in particular; this is the 8th game to feature it in 2013. That has never happened before. A response to AMD Gaming Evolved, maybe?
CoD: Ghosts needs better lighting and overall graphics, not better smoke physics, which will only be eye candy.
Despite the "next-gen engine", CoD: Ghosts seriously lags behind Battlefield 4's graphics.

COD is in serious need of a graphics and gameplay makeover.
I think there's nothing wrong with the gameplay. CoD is designed for Bloody Steves. If you want a more complex shooter, buy a new Battlefield, or a new ARMA if you want much more complex gameplay.
 
I think that's rather a design decision, with Battlefield targeting 30ish fps and CoD being a 60 fps game.
 
I don't think EA is targeting 30 fps on the PS4/Xbone; they have announced repeatedly that they will target 60 fps. In fact, a recent editorial at IGN states that they will do so at the expense of visual quality.

I highly doubt the next-gen consoles have the capability to run BF3 even @1080p/30fps without some concessions. They lack the CPU power and GPU power as well (especially the XBone).
 
I don't think EA is targeting 30 fps on the PS4/Xbone; they have announced repeatedly that they will target 60 fps. In fact, a recent editorial at IGN states that they will do so at the expense of visual quality.

I highly doubt the next-gen consoles have the capability to run BF3 even @1080p/30fps without some concessions. They lack the CPU power and GPU power as well (especially the XBone).

Battlefield 4 for PS4 looks like piss so far.

http://ca.ign.com/articles/2013/08/23/gamescom-battlefield-4s-graphics-lacking-on-ps4

Watching the clip in HD, it doesn't seem like a step up from last gen. I would assume they have about 3 months left to work on the game, which isn't that much time really.
 