Painkiller And NVIDIA

http://www.nvnews.net/#1068777586

When two marketing monkeys get together, there's no end to how wonderful everything is. Someone should start a "review the marketing monkeys' PR stuff" feature; this is just too wrong, it's almost a lie, and PR crap like this should not end up on the front page of a popular news site. Brainwashing it is, no less.


Painkiller takes advantage of the feature sets and performance of today's newest graphics technology, more specifically NVIDIA's GeForce FX family of GPUs, to achieve new levels of texture detail and the latest lighting and shadowing techniques.

Real-time shadows and per-pixel lighting can be very impressive, but I hope the "PC NVIDIA GFFX" version looks less dry and "Unreal Tournament" than this screenshot. Where's the bump mapping?


Based on the proprietary 3D "PAIN" engine, which is able to output 100 times more polygons than other games in the genre, Painkiller also relies on the raw performance of NVIDIA's GeForce FX GPUs to achieve blazingly fast performance to keep gamers in a world of non-stop action.

http://www.gamespot.com/gamespot/features/pc/nvidiageforcefx/02.html
http://www.xbitlabs.com/articles/video/display/geforcefx-5900ultra_8.html

The GFFX 5900 Ultra outperforms the R9800 Pro in regular "T&L" but not in vertex shading (maybe because it's clocked so high). In any case, a higher polycount is always welcome, even in Painkiller.

their "proprietary 3d engine" is i guess optimized for GFFX architecture then, I wonder if they optimized/coded a different render path for radeon also, or maybe its not nescesseray (they didnt use any PS 2.0 effekts maybe)


Painkiller features truly stunning physics-based gameplay, frantic over-the-top action and some of the nastiest level bosses ever seen.

Although not stunning, it does look fun when monsters blow apart; think of 35 of those running at your plasma gun while you mow them down just like in Doom 1 and 2. Braindead action is weapons with personality, nice death animations, and unique-looking monsters.


"Painkiller represents a new class of PC games that is pushing the latest shader technologies to achieve cinematic-quality visual effects in real-time, without having to sacrifice performance," said Bill Rehbock, director of developer relations at NVIDIA.

Please, don't tell me this game's graphics are the "cinematic quality" NVIDIA spent so many dollars marketing. Wait, it goes on:


"We're pleased to lend our hand in the development and marketing of this game to ensure that gamers around the world get to experience Painkiller the way it's meant to be played.

The way it's meant to be played. Any definition of that slogan other than zero PS 2.0 effects and no HL2 support? Obviously NVIDIA's self-confidence seems low these days.
 
Gambler FEX online said:
Based on the proprietary 3D "PAIN" engine, which is able to output 100 times more polygons than other games in the genre.
I always thought that since T&L, it was the graphics cards pushing the polygons, not the engines... Turns out I was wrong! :oops:
 
NeARAZ said:
I always thought that since T&L, it was the graphics cards pushing the polygons, not the engines... Turns out I was wrong! :oops:

I think the engines still need to keep track of all the polygons they send off to be rendered, and I think some engines cut processing overhead by capping the renderer at a polygon count considered beyond the reach of hardware during the engine's lifetime. It beats setting aside resources for something that wouldn't make sense.
 
There's no technical reason for an engine to have limited polygon counts. Well, maybe there's one ... 16-bit indices for vertex buffers limit you to 64k vertices per draw call. You can just switch to 32-bit indices then. Duh!
Otherwise, virtual address space is the limit: a couple of gigabytes divided by a typical vertex size works out to roughly 40 million unique vertices, give or take a few.
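To put that in concrete terms, here is a minimal sketch (plain OpenGL, hypothetical helper functions, nothing from the PAIN engine) of the 16-bit versus 32-bit index choice:

#include <GL/gl.h>
#include <cstdint>
#include <vector>

// 16-bit indices can only address 2^16 = 65,536 vertices per draw call.
// Assumes the vertex arrays are already set up and bound.
void draw_with_16bit_indices(const std::vector<uint16_t>& indices)
{
    glDrawElements(GL_TRIANGLES, static_cast<GLsizei>(indices.size()),
                   GL_UNSIGNED_SHORT, indices.data());
}

// Switching the index type to 32 bits lifts that cap; the practical limit
// is then memory and address space, not the engine.
void draw_with_32bit_indices(const std::vector<uint32_t>& indices)
{
    glDrawElements(GL_TRIANGLES, static_cast<GLsizei>(indices.size()),
                   GL_UNSIGNED_INT, indices.data());
}

Either way, the index format is an engine choice, not a hardware wall, which is the point above.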

Pure BS.
 
I've seen screenshots of this game. Where are the 100 times more polygons? It looks like a 2000 game. Errrh, wait, it looks like Will Rock, minus the bump mapping on the characters....
 
You've got to remember this is Nvidia giving money to the game's *publisher* so that Nvidia can put out press releases like this. There may be nothing added by the developer, or more likely Nvidia is writing some of the code for the developer (in addition to giving cash to the publisher) because the developer isn't going to go through the hassle of trying to program for Nvidia's twitchy architecture.

And in the end, the ATI cards may very well play the game faster and look better (as per many of the other TWIMTBP games) just because their hardware is so much better.
 
You mean like Tomb Raider 6, or Half-Life 2, which are TWIMTBP?

Errrr, we could think of Dawn too.... Oooops, Dawn isn't a game, just an optimized technology demo for GeForce FX working way better on the competitor's products... Wooohoooo! :devilish:
 
Of course it will look better, the question is how "optimised" is the game for FX hardware?

Get my drift? :)

Bouncing Zabaglione Bros. said:
You've got to remember this is Nvidia giving money to the game's *publisher* so that Nvidia can put out press releases like this. There may be nothing added by the developer, or more likely Nvidia is writing some of the code for the developer (in addition to giving cash to the publisher) because the developer isn't going to go through the hassle of trying to program for Nvidia's twitchy architecture.

And in the end, the ATI cards may very well play the game faster and look better (as per many of the other TWIMTBP games) just because their hardware is so much better.
 
K.I.L.E.R said:
Of course it will look better, the question is how "optimised" is the game for FX hardware?

Get my drift? :)

Well, what do you mean by "optimised"? Valve spent five times longer optimising the NV3x path, and still ended up with a slower game and lower IQ. Nvidia will be tooting their marketing agreement, but the coding may not reflect any special Nvidia features.

It may have code that tries (but fails) to get to the level of the ATI cards, merely making the gap less of a yawning chasm. Nvidia may trumpet special effects like Tron's glow, which were better supported on ATI hardware without all the marketing BS.

Nvidia's hardware and drivers are so far behind, I expect any special work to be done just to close that gap, rather than to do anything extra. Let's face it, there isn't anything as far as gaming goes that the Nvidia cards can do that the ATI cards can't do prettier and faster.

What I'm also getting at here is that marketing BS deals between Nvidia and a publisher may have nothing (or very little) to do with the actual coding done by the developers.
 
Quite an honour, a game called Pain-K.I.L.E.R.

I think I know what kind of FSAA I need to use :)
 
" Let's face it, there isn't anything as far as gaming goes that the Nvidia cards can do that the ATI cards can't do prettier and faster."
------------------------------------------------------------------------------------
Simply not true! nVidia hardware is much faster at shader replacement, and has features ATI can't touch, like LDR (low dynamic range), quincunx FSAA (who doesn't want their full scene enhanced with the power of 4 cunx?), pseudo-trilinear filtering, and shadow buffers.

Shadow buffers are actually kinda neat, and have been used in a few games. Other than that, the main thing nVidia has over ATI is better multimonitor support: they can do spanned desktops, which is kinda neat to run games on.
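For reference, a minimal, API-free sketch of what a shadow buffer actually does (all struct and function names here are made up for illustration): render a depth buffer from the light's point of view, then a surface point is lit only if nothing closer to the light already covers its texel.

#include <algorithm>
#include <vector>

// Depth buffer rendered from the light's point of view: each texel holds the
// distance of the closest occluder along that light ray.
struct ShadowBuffer {
    int width, height;
    std::vector<float> depth;   // size = width * height

    float sample(float u, float v) const {   // u, v in [0, 1]
        int x = std::max(0, std::min(width  - 1, int(u * float(width))));
        int y = std::max(0, std::min(height - 1, int(v * float(height))));
        return depth[y * width + x];
    }
};

// A point is lit if it is no farther from the light than the stored occluder
// depth, plus a small bias to stop the surface from shadowing itself.
// u and v come from projecting the point into the light's view; the GPU does
// that per pixel in the real thing.
bool lit(const ShadowBuffer& sb, float u, float v, float distanceToLight)
{
    const float bias = 0.005f;
    return distanceToLight <= sb.sample(u, v) + bias;
}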
c:
 
see colon said:
" Let's face it, there isn't anything as far as gaming goes that the Nvidia cards can do that the ATI cards can't do prettier and faster."
------------------------------------------------------------------------------------
Simply not true! nVidia hardware is much faster at shader replacement, and has features ATI can't touch, like LDR (low dynamic range), quincunx FSAA (who doesn't want their full scene enhanced with the power of 4 cunx?), pseudo-trilinear filtering, and shadow buffers.


Yeah, yeah. When I said "prettier and faster", I meant "better" :LOL:

see colon said:
Shadow buffers are actually kinda neat, and have been used in a few games. Other than that, the main thing nVidia has over ATI is better multimonitor support: they can do spanned desktops, which is kinda neat to run games on.
c:

Luckily, the ATI cards have enough grunt to keep shadows up to speed. For *gaming*, is multimonitor support really relevant bar a few flight sims? I'd count multimonitor support in gaming as something very niche, like stereoscopic glasses.
 
Well, one monitor could show the rendering, and another one the real-time FPS counter. A neat new feature implemented in nVIEW ;)
 
digitalwanderer said:
Meh, the screenshots make me think the game ain't gonna live up to any of its hype. ;)

You do so too? Strange, when you say to some people that it reminds you more of Will Rock than of a blockbuster like, well....... Errrrm, what FPS blockbuster did we get this summer?

Well, even Contract Jack, based on the LithTech engine from NOLF2, seems way better...... Well, in fact, Contract Jack is from a proven committee ;)
 
faster and better

"Yeah, yeah. When I said "prettier and faster", I meant "better" "
-----------------------------------------------------------------------------------
better than 4 cunx?!

"Luckily, the ATI cards have enough grunt to keep shadows up to speed."
--------------------------------------------------------------------------------------
I remember reading somewhere, or perhaps in an acid flashback, that Homeworld 2 uses shadow buffers and said shadows are causing "issues" on ATI hardware. And Splinter Cell uses shadow buffers, mainly for visual effect.


"For *gaming* is multimonitor support really relevent bar a few flight sims? I'd count multimonitor support in gaming as something very niche, like stereoscopic glasses."
--------------------------------------------------------------------------------------
Strategy games work well with multiple monitors as well. AoM supports it out of the box, and it works quite nicely, I might add. The only reason it's a niche is that it's unsupported and undocumented by nVidia and ATI (and thus most gaming card owners either don't have it or don't know they have it). Look at Matrox's list of games that support multiple monitors: there are quite a few, and lots of games that don't expressly state they support multiple monitors still do, if you know how to trick them into it and have support for spanned monitors.
c:
 
Magic-Sim said:
digitalwanderer said:
Meh, the screenshots make me think the game ain't gonna live up to any of its hype. ;)

You do so too? Strange, when you say to some people that it reminds you more of Will Rock than of a blockbuster like, well....... Errrrm, what FPS blockbuster did we get this summer?

Well, even Contract Jack, based on the LithTech engine from NOLF2, seems way better...... Well, in fact, Contract Jack is from a proven committee ;)
Yeah. Nothing against this game since I know nothing really about it 'cept the screenshots, but the screenshots really didn't make me jump up-n-down excited and I just don't see anything very special about 'em. :(
 
Hmmm, they officially call their engine PAIN. Who else thinks that's dumb?

Q: Fellas, how's your engine going so far?
A: Well, it's been a real PAIN so far and it will stay like that.

Q: Hey, has anyone seen last week's code snapshot CD?
A: I'm sitting on it.
Q: So you've got PAIN in your ...?
A: Underneath, technically speaking.
 