Gears of War from X05 in 720p direct feed.

one said:
Aren't the 5th and the last pic from a cut scene? How can you control a character shooting toward you? :???:

Exactly. I was going to say that too.

Anyway why does the trailer run so slooowww on my PC?
 
Deepak said:
Exactly. I was going to say that too.

Anyway why does the trailer run so slooowww on my PC?

Because your PC is not fast enough. It sucks like my computer. I always have to watch the low-res versions too.

And to all the "this is CG, it's not real" crowd, please do us a favor and shut up.

Thank you.
 
Presumably the gameplay framerate issues are the game being pretty severely CPU bound, no? What I'm seeing is something like 15fps, and if the graphics engine is fine in the cutscenes then the gameplay, being CPU based, is the limiting factor for framerate. If it's only running on one core that'd make sense, but I can't find any official confirmation of this. The only references I've found so far have been forum comments, and I'm shocked and surprised that from E3 to X05, a span of some six months, they're still only on one core? Plus from that interview they're not maxing physics yet?

Can someone link to details of the hardware this was running on for X05 and how much it's being used? And why, assuming it's still on one core, is it still on one core?! What the blazes have they been programming this past five/six months?! :???:
 
Shifty Geezer said:
Presumably the gameplay framerate issues are the game being pretty severely CPU bound, no? What I'm seeing is something like 15fps, and if the graphics engine is fine in the cutscenes then the gameplay, being CPU based, is the limiting factor for framerate. If it's only running on one core that'd make sense, but I can't find any official confirmation of this. The only references I've found so far have been forum comments, and I'm shocked and surprised that from E3 to X05, a span of some six months, they're still only on one core? Plus from that interview they're not maxing physics yet?

Can someone link to details of the hardware this was running on for X05 and how much it's being used? And why, assuming it's still on one core, is it still on one core?! What the blazes have they been programming this past five/six months?! :???:

you clicked the gamasutra link? :???:

GS: Will the framerate be improving?

MR: Well we've been working on this (actual hardware) like I say, for about two weeks [as of Tokyo Game Show]. We've done very little optimization, I'd like to say the lowest of low-hanging fruit optimization. We're only running on a single core now, so we'll get at least double that, it'll be super smooth. We didn't even expect to get onto the final box until X05, and here we are. So the Xbox 360 really exceeded our expectations.

there are choppy parts in the trailer, but those are taken from the early playable demo. this is all in-game to me.

Was it mentioned that UE 3.0 doesn't have Xenos tiling support? It's no small feat that Cliffy came up with such quality. To some, this is the best of next gen yet.

Shame his editing team are amateurs. They could've at least sped up the whole trailer as others have done.
 
fireshot said:
you clicked the gamasutra link? :???:
Nope :oops:
When did XeCPUs appear in SDKs? The final beta hardware that shipped in August? I guess that's not a lot of time (though with MS saying devs can just lift code from one core to the other two, it ought to have been an easy transition to multicore to make :p). And GOW isn't a launch title either, is it, so I guess they've been developing the game and will make the switch to final hardware in the later months.

there are choppy parts in the trailer, but those are taken from the early playable demo. this is all in-game to me.
Why show old footage if you've something newer to show, though? See, to me they would have been better off 'pulling a Sony'. They got great visuals at 30/60fps in their cutscenes, and then dragged the quality of the showing down with slow gameplay that's not indicative of the final game. They should have stuck to showing the in-game trailers and, if they wanted gameplay footage, sped it up to 30fps to show what the actual game they're advertising will be like, rather than what their current WIP is producing, as you allude to at the end of your message.
Was it mentioned that UE 3.0 doesn't have Xenos tiling support? It's no small feat that Cliffy came up with such quality. To some, this is the best of next gen yet.
I think Xenos is already showing what a great advance is being made in next-gen visuals. I don't imagine this'll be affecting the gameplay speed, as the graphics engine is the same in cutscenes and gameplay, and the cutscenes looked great and were very smooth.
 
But are these all from the same work in progress? There are a couple of underwhelming pics :???:

786_0009.jpg

786_0012.jpg
 
fireshot said:
They could've at least sped up the whole trailer as others have done.


The point Cliff has always tried to make in interviews is that all footage you see is running realtime on the Xbox 360 hardware/dev kits, and that there are no smoke and mirrors, so whenever we do get to see footage we can automatically say that it is realtime.
 
c0_re said:
LOL! Hilarious the amount of sad fanb0ys in this thread who want this to not be realtime!
If you are referring to me, why don't you check my other posts? You found one post of mine you didn't like and called me a ******?

I am trying to be as even-handed as I can with all platforms without favouring one in particular.


rosman said:
Lol? Guys, you are taking screenshots from vids, that's the underwhelming thing ;)
I just linked them from the site. I see a considerable detail drop during gameplay in these pics compared to the realtime cutscenes.
I hope they aren't using the same technique they used in some PS2 games (having more detailed models in cutscenes and less detailed ones during gameplay). Or perhaps the engine reduces detail the further objects are from the camera.


(not trying to bash or to doubt 360's capabilities mind you)
 
I have one question.
Why do devs have to lock games at 30 or 60fps only? Why not 45, for example? On my PC, when a game runs at 45fps (when I can lock it, for example in Doom 3) I don't have any tearing. Does tearing occur only when the fps is greater than the screen refresh rate, or can you have tearing below the screen refresh rate??? Because at 45fps games start to be really smooth...
 
That's because the TV set has a fixed refresh rate that's quite low. At 45fps you'd get each new frame displayed for either 1 refresh or 2, sometimes even 3 - so the frame pacing would fluctuate all around and your eyes would hurt.
Also, most PC gamers are willing to play without vsync and accept the occasional tearing, but consoles just don't do that.
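To put that in rough numbers - my own back-of-envelope sketch, not from anyone here, assuming a fixed 60Hz TV and vsync; the refresh rate and the helper function are just illustrative:

Code:
import math

REFRESH_HZ = 60  # assumed fixed TV refresh rate

def hold_counts(fps, n_frames=9):
    # For each of the first n_frames game frames, count how many 60Hz
    # refreshes it stays on screen, assuming vsync: a frame finished at
    # time t is first shown on the refresh boundary at or after t.
    shown_at = [math.ceil(i / fps * REFRESH_HZ) for i in range(n_frames + 1)]
    return [shown_at[i + 1] - shown_at[i] for i in range(n_frames)]

print(hold_counts(30))  # [2, 2, 2, ...]          - every frame held 2 refreshes, even pacing
print(hold_counts(45))  # [2, 1, 1, 2, 1, 1, ...] - hold times keep changing, i.e. judder
print(hold_counts(60))  # [1, 1, 1, ...]          - every frame held 1 refresh, even pacing

At 30 and 60fps every frame persists for the same number of refreshes; at 45fps the hold time keeps alternating, which is the uneven motion described above.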
 
I've just tried playing some games on my standard TV (Quake 3 and Doom 3) and I don't see any problem with the max fps set to 45... that's strange.
 
Someone correct me if I'm wrong, but I thought the Jak games (meaning II and III) had vsync off, no? As did a number of others, but the Jak & Daxter games are the only ones I specifically remember.
 
If you are referring to tearing, yes, Jak 2 and 3 had some.

V-Rally on the PS2, though, had lots of eye-hurting tearing. It was killing my eyes!!!!
 
Nesh said:
If you are referring to me, why don't you check my other posts? You found one post of mine you didn't like and called me a ******?

I am trying to be as even-handed as I can with all platforms without favouring one in particular.



I just linked them from the site. I see a considerable detail drop during gameplay in these pics compared to the realtime cutscenes.
I hope they aren't using the same technique they used in some PS2 games (having more detailed models in cutscenes and less detailed ones during gameplay). Or perhaps the engine reduces detail the further objects are from the camera.


(not trying to bash or to doubt 360's capabilities mind you)
Nesh, I was going to defend your post, since when I watched the video one small snippet jumped out as being just bad. Then I came across this post of yours in another thread:

Nesh said:
I wish I had broadband. I am missing all the good stuff :devilish:

Nice impressions though :D

I take this to mean that you are commenting on this video in this thread without actually having viewed the video? This is a bit disingenuous, to say the least.

.Sis
 
Shifty Geezer said:
Nope :oops:
When did XeCPUs appear in SDKs? The final beta hardware that shipped in August? I guess that's not a lot of time (though with MS saying devs can just lift code from one core to the other two, it ought to have been an easy transition to multicore to make :p). And GOW isn't a launch title either, is it, so I guess they've been developing the game and will make the switch to final hardware in the later months.

Also keep in mind that the one core they are using is also handling all the overhead as well: audio, network, XBL interface, etc., which IIRC can take up 50% of one core (someone please correct me if I'm wrong). That means they are using only about a sixth of the actual CPU power for the actual gameplay/graphics/AI/physics. Now I know you don't just get a linear improvement by adding more cores like that, but I thought it was worth noting that you won't have to pay these 'housekeeping' overhead costs on cores 2 and 3.
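As a quick sanity check of that fraction - my own back-of-envelope, assuming 3 symmetric Xenon cores and 50% of the one in-use core lost to housekeeping, neither figure confirmed anywhere official:

Code:
TOTAL_CORES = 3          # assumed: Xenon has 3 symmetric cores
CORES_IN_USE = 1         # the game currently runs on a single core
OVERHEAD_FRACTION = 0.5  # assumed share of that core eaten by audio/network/XBL

game_share = CORES_IN_USE * (1 - OVERHEAD_FRACTION) / TOTAL_CORES
print(f"{game_share:.2f}")  # 0.17 - roughly a sixth of the total CPU left for gameplay/graphics/AI/physics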

Shifty Geezer said:
Why show old footage if you've something newer to show, though? See, to me they would have been better off 'pulling a Sony'. They got great visuals at 30/60fps in their cutscenes, and then dragged the quality of the showing down with slow gameplay that's not indicative of the final game. They should have stuck to showing the in-game trailers and, if they wanted gameplay footage, sped it up to 30fps to show what the actual game they're advertising will be like, rather than what their current WIP is producing, as you allude to at the end of your message.

Damned if you do, damned if you don't. :)

Shifty Geezer said:
I think Xenos is already showing what a great advance is being made in next-gen visuals. I don't imagine this'll be affecting the gameplay speed, as the graphics engine is the same in cutscenes and gameplay, and the cutscenes looked great and were very smooth.

Yes. It's obvious from that alone that this game is currently VERY CPU bound.
 