Epic Says This Is What Next-Gen Should Look Like

The comparison was between two character models.
The comparison was between demo quality and actual games. TBH bullshots shouldn't be considered at all in this, except that Epic released high-quality character images around the unveiling of UE3 which set the bar pretty high.

There's a part that is a direct jab at Crytek.
I know man, it's like the engine developer version of a diss track. How long until we see Crytek respond with Prophet curb-stomping Marcus Fenix?
 
You ought to be comparing bullshot to bullshot really
I shouldn't be comparing something that's not accurate with something else that's also not accurate.
So if company X decides to throw a little raytracer into the engine (seconds per frame), is it OK for them to show screenshots from that and say those are the pictures the engine produces?
ME - OK, as long as you have a disclaimer

(from other thread)
Perhaps we can move along now from old... claims and PR stunts. They're rather inane as it is!
But we have history repeating itself here. If they're not held accountable they will continue to deceive. Personally (like with politicians etc.) I would like to hear/see the truth. Some companies manage to do it, so why shouldn't we hold everyone to this standard? (Surely it's not too much to ask.)

Well, you've proven that 2005 hardware can't compete with 2004 bullshots, or that a game scenario with a lit environment and other characters to render is a much bigger workload than a portrait shot... :s
Thanks for agreeing. OK, but doesn't that make you have doubts about the actual validity of the premise "Epic Says This Is What Next-Gen Should Look Like"?
 
I know man, it's like the engine developer version of a diss track. How long until we see Crytek respond with Prophet curb-stomping Marcus Fenix?

Funny, when you look at the Cryengine demos, they are a lot more impressive to me. Epic does robots, armoured goons, cars, dark rainy streets with neon lights, etc. Crytek does forests, trees, deserts, heat haze, girls with skirts that blow in the wind, water flow, daytime simulation, helicopter downblast, butterflies, etc. It's a heck of a lot more impressive to me that Crytek can simulate realistic and natural-looking things than Epic with their usual stocky soldiers and big square guns.

Personally, I don't think Epic should be poking fun at Crytek when they've yet to show anything that looks as good or as realistic as the Cryengine demos.
 
Funny, when you look at the Cryengine demos, they are a lot more impressive to me.

Yeah, same here. Epic's video looks nice, but I can't help but think that it's just basically the current Unreal engine brute-forced to new levels with an absurd amount of GPU hardware behind it, whereas Crytek and Frostbite look more real to me in that they can run on today's hardware. So while it looks cool, I'm just not all that excited about it. Frostbite actually looks to be the best of the bunch so far to me, but they need to get their tools in line with what Crytek has shown for theirs. Good times either way though, as Frostbite and Cryengine will really be forcing everyone's hand to keep pace and compete.
 
Thanks for agreeing. OK, but doesn't that make you have doubts about the actual validity of the premise "Epic Says This Is What Next-Gen Should Look Like"?

"Should" isn't a promise or a guarantee.

This is seriously a boring discussion for a tech thread. :| Aren't we supposed to be smart enough on this forum to understand a few things about PR and bullshots by now? We've had years of silly discussions that cover the same crap over and over again. And now you're just bringing that back up. Give it a rest? Don't have anything worthwhile to contribute to the tech discussion? How about discussing the tech inside the demo and what's feasible or not feasible?

Tech demos are rarely indicative of final game quality due to resolution, memory, or other changes in scope. I mean yay, awesome. Just confirmed that bullshot quality isn't possible in a game. Quite frankly, the continued trolling about PR bullshots is getting quite tiresome. But hey, let's ignore the value of scalable settings for recorded videos, marketing material, or expanding interest beyond just the games industry. Bullshotting is old news since marketing began!

*sigh* :(
 
But we have history repeating itself here. If they're not held accountable they will continue to deceive. Personally (like with politicians etc.) I would like to hear/see the truth. Some companies manage to do it, so why shouldn't we hold everyone to this standard? (Surely it's not too much to ask.)
Yes it is, because they're not going to take any notice of this forum, and the whole world of marketing is based on stretching the truth, and it's tiresome and pointless, so give it a rest.

For the purposes of our discussions, we know the difference between a publicity shot and the in-game experience, and we have the sense to be able to change an opinion if a premature expectation of a lighting or shader system presented in a PR shot doesn't appear in the game.
 
OK, so we think this was run on 3 580s. What was the rest of the hardware? I think that's fairly important for discussing when we could realistically see a *console* run this at a decently steady 30 fps.

I mean there's no AI in a tech demo, but what kind of a processor is feeding these 580s to keep them happy? How much RAM is necessary, even with streaming? What device is streaming (HDD, SSD, Optical)?

I'm just having trouble seeing that kind of power draw and heat reduced to a $350 package in the next 2 years.
 
So you believe the only reason why it was run on 3 580s is because they didn't have time to...finish some magical ritual?

Your third answer contradicts your thesis.
 
So you believe the only reason why it was run on 3 580s is because they didn't have time to...finish some magical ritual?

No, I believe the amount of effort put into optimizing was not high. Rein made a comment about the size of the team that worked on the project. And 3 cards in no way suggests 3x the performance.
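
Just to put rough numbers on that last point, here's a back-of-the-envelope Amdahl-style sketch (all figures made up for illustration, nothing from Epic) of how the non-parallel part of a frame keeps three cards well short of 3x:

# Back-of-the-envelope multi-GPU scaling, Amdahl-style.
# 'serial' is the fraction of frame work that doesn't split across
# GPUs (sync, CPU submission, inter-frame dependencies).
# All numbers here are illustrative assumptions, not measurements.

def speedup(gpus: int, serial: float) -> float:
    """Amdahl's law: overall speedup from 'gpus' cards when a
    'serial' fraction of the workload can't be parallelized."""
    return 1.0 / (serial + (1.0 - serial) / gpus)

for serial in (0.05, 0.15, 0.30):
    print(f"serial fraction {serial:.0%}: 3 GPUs -> {speedup(3, serial):.2f}x")

Even a modest 15% of unsplittable work drags three cards down to about 2.3x, so "3 cards" on its own tells you very little about the real multiplier.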

Your third answer contradicts your thesis.

No it doesn't.
 
So you believe the only reason why it was run on 3 580s is because they didn't have time to...finish some magical ritual?

Your third answer contradicts your thesis.

Have I told you how much their HQ 'Cinematic' Bokeh DOF costs performance-wise with their latest public SDK? (I think I mentioned it before to someone.)

About 10x slower than without it. If they kept the code the same then it will surely be utterly taxing, which is in line with Mark saying it would probably run well on a single GPU given some time. I mean, look how relatively low the performance impact of Crytek's CE3/Crysis 2 Bokeh DOF is while still giving good results (on PC, 64 taps or 32 taps depending on the setting, with an option for full-res rendering). :)
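
For anyone wondering where a 10x hit can come from, here's a toy cost model of a gather-style bokeh DOF pass (my own illustrative numbers; this is not the actual UE3 or CE3 code). The fetch count scales linearly with tap count and with the pixels the filter runs at, and the two factors multiply:

# Toy cost model for a gather-style bokeh DOF pass: each output
# pixel samples the scene buffer 'taps' times, so texture fetches
# scale as pixels * taps. Purely illustrative; the real passes in
# UE3/CE3 are more involved than this.

def dof_fetches(width: int, height: int, taps: int, res_scale: float = 1.0) -> int:
    """Texture fetches for one DOF pass at a scaled buffer size."""
    pixels = int(width * res_scale) * int(height * res_scale)
    return pixels * taps

cheap = dof_fetches(1280, 720, taps=32, res_scale=0.5)   # half-res, 32 taps
cine = dof_fetches(1280, 720, taps=128, res_scale=1.0)   # full-res, 128 taps (hypothetical)
print(f"cheap pass:     {cheap / 1e6:.1f}M fetches")
print(f"cinematic pass: {cine / 1e6:.1f}M fetches ({cine / cheap:.0f}x)")

Bump both the tap count and the buffer resolution at once and an order-of-magnitude slowdown like that 10x figure stops looking surprising.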
 
I think these types of visuals next gen should be do-able. If not then I'll be highly disappointed in the next round of consoles.

Bullshotting is old news since marketing began!

*sigh* :(

Quoted for truth. I still remember Eidos using PC screens of the original Tomb Raider on the back of PSone and Saturn cases. :p
 
It looks great, but I see nothing I wouldn't expect a single 580 to handle at 30 Hz. They could have been doing all the post-processing at full resolution, which would hardly affect the IQ but could explain the massive GPU requirement. Truth is, we know almost nothing about how they did this.
 
It looks great, but I see nothing I wouldn't expect a single 580 to handle at 30 Hz. They could have been doing all the post-processing at full resolution, which would hardly affect the IQ but could explain the massive GPU requirement. Truth is, we know almost nothing about how they did this.

Indeed. Post-FX buffer res and even tessellation factor (lol @ head wireframe) would certainly count as performance optimizations. What's rather "down-to-earth" in the demo are the objects with easily identifiable polygon edges (the coat collar, the odd object in the background such as the garbage can), and even the background image of the building.
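
To put the buffer-res point in numbers (a toy calculation, since we don't actually know what the demo used): running the post chain at full resolution instead of the common half-res buffer quadruples the pixels every pass has to shade:

# Pixels shaded per post-processing pass at different buffer scales.
# Illustrative arithmetic only; the demo's actual buffer sizes are unknown.

FULL_W, FULL_H = 1920, 1080

def post_pixels(scale: float) -> int:
    """Pixels per post pass when the buffer is scaled on each axis."""
    return int(FULL_W * scale) * int(FULL_H * scale)

full = post_pixels(1.0)
half = post_pixels(0.5)
print(f"full-res post: {full:,} px/pass")
print(f"half-res post: {half:,} px/pass")
print(f"cost ratio:    {full / half:.0f}x per pass")

Stack a few passes (DOF, bloom, motion blur) at full res and the "massive GPU requirement" could come from brute force alone.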
 
I always liked the UEs.
I'll agree with you, but UE3 gets a bit annoying because until the latest version it has been console-unfriendly.

UE1: This was unique and incredible back in the day. In fact I preferred Unreal's graphics to Quake 2's, because of things like the reflections displayed on the main screen of the game (the 3D scenery was pretty cool). Unreal was also a good game, featuring alien creatures similar to Halo's Elites, peaceful odd native inhabitants with four arms, birds in the sky, and some fun weapons.

UE2: I only enjoyed it while playing Unreal Championship on the Xbox. It looked to me like one of the most advanced original Xbox games out there, graphics-wise.

UE3: After Oblivion, Gears of War was the next game that oozed next gen all over it, looking and running better than a lot of games with worse graphics.

I was amazed by the sheer quality of the graphics back in 2006. Besides that, some stages, like the Kryll one, were pretty cool. I wondered how they rendered so many birds at the same time without stressing the machine.

Also, the rough and clumsy dialogue of the characters helps with the general appeal of the franchise. I mean, the game wouldn't be as fun without Marcus & company, or if they changed their approach to something like Fable 3 (a gag-worthy game at times, btw; I'd give it a 2).

I love that kind of cultured dialogue, but in F3 everything is, and sounds, so childish and stupid... it's not the same at all. Anyway, what I mean is that Gears has been deservedly successful because of the right use of the engine and a decent story, setting and background.

However, save a few games like Gears, most UE3 games lack features compared to titles on other engines and don't run that well on consoles, aside from looking like Gears clones. That's how some people grew tired of the engine, except for those games that really make the most of it on consoles.
 
I recall Epic publicly declaring with UE2 that they wanted to beat Halo's terrain rendering ;)

And yeah, Unreal 1 was really amazing; I still have pretty clear memories of some of the levels. It is a surprisingly big and long game too; I tried to play it again in HD recently and it took forever.
Funnily enough it only had vertex lighting and no lightmaps at all, but at least that allowed them to do that awesome sequence where they locked you in, turned off the lights and then unleashed a Skaarj on you...
(still, meeting the first Shambler in Quake 1 was a lot more scary, three of us have been sitting in front of the PC and we were completely shocked by it...)
 
UE2: I only enjoyed it while playing Unreal Championship on the Xbox. It looked to me like one of the most advanced original Xbox games out there, graphics-wise.

Mm... indeed. It seemed like UC2 was the only game to use UE2X, though. Quite a shame since it was such a huge upgrade over the first game, but then next gen hit anyway.

Gotta wonder if they're planning any more upgrades for the current round of consoles. The displacement mapping/tessellation is certainly out of the question. Hopefully they'll have the GDC material up on UnrealTechnology.com soon.
 
Screw more detailed muscled monsters (human or otherwise) in crappy looking armour.

How about they get clothing and hair right? Now that would be a step forward. Oh wait, that can't be leveraged on antique consoles, whereas you can just throw LOD and simplified shaders at this sort of stuff to get it to run.

This is because an actual interactive game has to deal with multiple AIs, checklist features, and resolution and frame-rate decisions, not to mention the art direction of going for realism or sci-fi, as is the case with the UT3 demonstration; it does not make sense to expect those things in that context.

Any game programmer and artist team can slap together an impressive realism-oriented tech demo, even with old hardware like the NV40; that is not the same as a playable game.
 