John Carmack "not all that excited" by next-gen hardware

Both Crysis 2 and Rage look superb but I'd definitely give Crysis 2 the edge. That's coming from someone who recently completed Rage and is currently playing through Crysis 2. Both at 60fps incidentally.
 
Yeah well, I've played the Rage demo on 360 too. Now what? Move on to your next strawman. It actually looks good/great in its own way. And 60 FPS is awesome too. But it has its weaknesses too (e.g., close-up textures, an overall lack of dynamism).

Cool! Then maybe you can answer my first question, "Which do you think is more fun to play?" I am interested in your opinion, otherwise I would not ask it.

Also, I do not get what the strawman is. I just think that you cannot rate a game/movie/music piece unless you have experienced it. Maybe you think differently there?
 
So you've never played Crysis but you're "assuming" on what makes it look bad, while your whole thrust of attack is based on claiming I'm assuming x y and z about rage based only on videos and screens.

Apparently you cannot read. I wrote: "It is kind of difficult to rate Crysis' and Rage's looks based on videos because almost all video that is available removes what makes Rage look good (the fluidity) and improves on what I assume makes Crysis look bad (lack of fluidity)." So here I make the assumption that a 20-30 fps frame rate (which Crysis has, http://www.eurogamer.net/articles/digitalfoundry-crysis2-face-off?page=2) makes a game look "bad" and a close-to-60 fps frame rate makes a game look good. Now, since almost all available videos are 30 fps, you remove the 60 fps fluidity, while the nature of video encoding tends to hide the non-fluidity of a 20-30 fps game. In my opinion, you tend to feel that more while you are actually playing the game (movement becomes slower, more stuttery and/or more blurry).

But maybe your opinion and experiences differ from mine; that would be interesting to hear.
 
I also disagree that you can't tell anything about a game's graphics from videos or screens. That's just false. Obviously. Go look at an 8-bit game on YouTube, compare it to current gen, and tell me you can't tell any difference.
I never claimed what you wrote. You can tell a lot of things from a properly encoded video, for example the effective frame rate. But videos are totally worthless for telling how much fun a game is.

Screenshots from a game tell about as much about the quality of the game as screenshots from a movie tell about the quality of the movie. Maybe you can use them to count pixels, which is pretty interesting. I guess you can tell that current-gen games use five times as many pixels as previous-gen games.
 
Both Crysis 2 and Rage look superb but I'd definitely give Crysis 2 the edge. That's coming from someone who recently completed Rage and is currently playing through Crysis 2. Both at 60fps incidentally.

Yeah, same here: played both at 60fps and finished Crysis 2 but didn't finish Rage. It's not that Rage is bad, it's a good game; it's just that the bar for these types of games is pretty high now. I liked Crysis 2 much better than Rage, hence I finished C2 and got bored of Rage. If one wants a roam-the-wasteland shooter, I'd recommend Borderlands over Rage. Just my personal opinion. Visually I found C2 to look better than Rage as well.

Regarding VR helmets, am I the only one who has no interest in wearing gaming accessories while I play games? I won't wear 3D glasses, let alone a helmet! It just seems too cumbersome. Maybe it's because I tend to play core games while lying back on the couch; I can't imagine how a helmet would work there.
 
Regarding VR helmets, am I the only one who has no interest in wearing gaming accessories while I play games? I won't wear 3D glasses, let alone a helmet! It just seems too cumbersome. Maybe it's because I tend to play core games while lying back on the couch; I can't imagine how a helmet would work there.

You're not alone; in fact, outside of these forums you are part of a massive majority.
 
Headphones are my preferred way to play. Also, fearsomepirate, I agree with you on local multiplayer and I'm in my low thirties. That's why I own every licensed Lego game (that reminds me, I need to pick up Lego Batman 2): it lets me play with my SO in split screen.

OTOH, I don't quite agree with you on CoD. I think most of its success is driven by brand alone. People play CoD because most of their friends play CoD; it's like a law unto itself now. It's similar to the FIFA/PES split here in Europe: people will play one or the other and keep buying basically the same game every year until all their friends decide to switch.

joker: WRT a wasteland shooting game, I found Rage actually worked best as a corridor shooter. The outdoor areas were already fairly restricted by the environment, then you had invisible walls, and apart from roaming buggies and stationary towers there wasn't much to do in the wasteland, whereas the "instanced dungeons" had the game's full complement of gameplay goodies.

It's not the racing or even the outdoor visuals: for id's first "racing game", Rage puts other, more experienced companies' efforts to shame. It's just that, no matter how respectable that part was, it clearly wasn't the game's strongest suit.

Btw, I'd recommend you try at least the distillery, the gear-head level, and the awesome Jackal Canyon in the second act if you didn't get that far. I don't know if it's because act 2 is shorter and they thus had more disc storage to spend on the MTs, but these areas are a step up in graphics from the first act. Also, Subway Town is decidedly different from Wellspring (I can't say I liked the wild-west undertone). Jackal Canyon is one of the few spots in the game where MT's benefits really show through (that and Dead City, but that was brought down by the low texture resolution). Jackal Canyon also seems to have a lot fewer invisible walls than other levels: you can actually fall down cliffs and die :oops:
 
I think local multiplayer is the most underappreciated feature in gaming today. Developers don't think about it because they all have the machine and a TV sitting on their desks, and we don't think about it because (I think) we're almost all in our thirties on this board and either living alone or raising families. But as a university educator, I can assure you it's a huge deal to my students, both in frat houses and in dorms; they talk about it all the time. Without local multiplayer, Halo and COD wouldn't rule the dorm. If they don't rule the undergrads, they're not going to suddenly capture them when they turn 25, either. You also lose 100% of the "I played this with a friend and want to buy it now" sales.

Seriously, find me a shooter that sold over 5 million units on the Xbox that has no local multiplayer. You can't, because it doesn't exist. I doubt anything will convince you, because conventional wisdom is that boys/young men ages 13-21 don't play video games with friends any more (either that, or they're irrelevant), but I think the conventional wisdom is driven entirely by thirtysomethings in the industry projecting their habits onto the rest of the world. I think if you've got an action game that you're not going to let high school and college students play together in the same room, you are putting a hard limit on your cross-platform sales of around 4-5 million units.
I honestly wonder if we'll ever see local multiplayer make a serious splash in the next gen of gaming. It seems like, outside of Wii(-U) games, kid-friendly PS3/360 games, and a very few genre-specific titles, it's something most game companies are moving away from. I only say that because they'll likely increase their reliance on online multiplayer to drive profits.

I've heard a lot of game developers claim in interviews that it's hard to produce a proper split-screen experience without sacrificing other technical areas of a game. I'm honestly wondering whether any graphical power the next consoles have will be spent restoring that functionality, or set aside for even more graphically intensive elements in games.

Maybe I'm being a bit too pessimistic about the next generation of consoles, seeing as a few developers will definitely keep the option around, but I do feel it will only be a handful who decide to add local multiplayer back into their future games.
 
I've heard a lot of game developers claim in interviews that it's hard to produce a proper split-screen experience without sacrificing other technical areas of a game.
They were quite happy to make such sacrifices last gen to support multiplayer...
 
We'll see if AR glasses or VR helmets take off. It seems more like a search for the next big thing than actually bringing something good. The analogy to the holodeck is wrong. Think instead of Wild Palms, where people put on stylish (for the time) sunglasses which immersed them in some locale other than their living room.

How many people use AR apps with their smartphones? They're available, but they haven't seemed to light the world on fire.

Local multiplayer: who has the time to do that once they graduate and grow up? Sure, the Wii and some of the motion dance games were hits at parties. But aren't those fads over?
 
The gameplay is refreshingly different from all other shooters, it's far more dynamic and the weapons are more varied.

Not true. You should play more shooters...

Regarding VR helmets, am I the only one who has no interest in wearing gaming accessories while I play games? I won't wear 3D glasses, let alone a helmet! It just seems too cumbersome. Maybe it's because I tend to play core games while lying back on the couch; I can't imagine how a helmet would work there.

Completely agree. I have a 3D plasma TV. Although lots of my games offer a 3D mode, I just don't bother to put on the glasses: not worth it, no real difference.
I am expecting the same for VR helmets...
If you really want a game-changing experience, you need this tech and effort:
http://www.youtube.com/watch?v=eg8Bh5iI2WY&feature=player_embedded
 
I said it a long time ago: 3D just isn't suited to console players ... the FoV just isn't there to allow immersion for 99% of them. Stereoscopic 3D is only suited to people with a home cinema setup or people who sit close to a large monitor ... when your surroundings already take up the majority of the view, trying to increase immersion by stereoscopics on just the display is a lost cause, which is where HMDs come in (calling them helmets is tendentious; the market is moving to glasses ... not helmets).
 
I said it a long time ago: 3D just isn't suited to console players ... the FoV just isn't there to allow immersion for 99% of them. Stereoscopic 3D is only suited to people with a home cinema setup or people who sit close to a large monitor ... when your surroundings already take up the majority of the view, trying to increase immersion by stereoscopics on just the display is a lost cause, which is where HMDs come in (calling them helmets is tendentious; the market is moving to glasses ... not helmets).

Hm, I am not sure. I also have my PC hooked up to my plasma. I sit about two feet away from a 42" TV, and playing my PC games or console games in 3D doesn't really feel different.
The problem is: you not only need the appropriate tech, but also smart game developers who use that tech to improve the actual gaming experience!
Just throwing in 3D (like in movies) does not really substantially enhance the experience, imo.
 
I don't think 3D will ever be immersive without player tracking of some form. Otherwise it's just a static view into a world behind a screen, rather than a world you are in. HMDs with head tracking are the ideal, but Kinect/PSEye tracking to shift the camera subtly following the player should also do a good job.
 
One big criterion for that kind of immersion is photorealistic graphics. You turn your head and your FOV rotates as it would in real life, but you're still aware that you're viewing computer graphics.

Even cinema-quality CGI may not be enough for immersion.

There used to be these View-Master viewers, with discs of stereoscopic images of faraway places. The 3D effect was good, and they were photos. But it wasn't really more immersive than looking at travel photos in a book.
 
I've said it before: the problem with VR is how the user conveys movement, not head tracking. It doesn't matter if they're able to sell those goggles for 50 euros; I'm not going to put them on my head if I have to press buttons/sticks to move my character around. What happens when you have to look down at your controller, or <shudder> your keyboard?

I would be fine, even or especially with the keyboard :p
I'm not a laptop user though; I've only ever dealt with standard-layout keyboards, so my fingers have been hard-wired into knowing the position of every letter and key already.
An HMD would be funny, especially if used to the point where you have no apparent display besides it.

But it's a bit over the top, unless you head-track to look at additional web browser instances, text terminals and other funny stuff! The caveat is, if you're displaying your apps as textures in a small 3D world, or just drawn on a virtual cylinder, then you need very high DPI. I find the 3D-accelerated desktop to be useless, because when you use its only feature, zooming out windows to look at all of them at a glance, text becomes unreadable and you can't make out what anything is (file managers and terminals all look alike). Even with a small transformation of their "distance" and orientation, they look bad.

So, ungodly high DPI is needed for an HMD to be really great.
But I'm still delighted at Carmack taking a look at the problem and slashing the latencies, even if HMDs only become workable because of huge improvements in gyroscopes, accelerometers, etc. and computing power.

Motion controls are old: Nintendo came out with them for the NES (the Power Glove). It was a failure. But they still released motion controls 15 years later, with the Wii. Useless tech became useful, with increased precision and lowered latency.

Wow, 15 years ago I badly wanted a VFX-1 to play Descent under MS-DOS.
Incidentally, it was about the price of Carmack's set, but it got nowhere besides being the most well-known HMD back then. No way my parents would have forked over the money. I'm sure it would have run Descent and nothing else.
http://museum.bounce-gaming.net/vfx1.html
 
Hm, I am not sure. I also have my PC hooked up to my plasma. I sit about two feet away from a 42" TV, and playing my PC games or console games in 3D doesn't really feel different.
The problem is: you not only need the appropriate tech, but also smart game developers who use that tech to improve the actual gaming experience!
Just throwing in 3D (like in movies) does not really substantially enhance the experience, imo.
Skyrim wasn't created with stereoscopic display in mind, but I played with it enabled on the PC and the difference was dramatic. I really enjoyed it.

I'm not excited about HMDs because I don't think I want my entire field of vision consumed by the game. Maybe I'll feel differently if I try it, but the biggest hurdle for me will be not getting sick while playing.
 
Skyrim wasn't created with stereoscopic display in mind, but I played with it enabled on the PC and the difference was dramatic. I really enjoyed it.

Yeah, when 3D works it makes a dramatic difference IMO. I'll have to try it out in Skyrim when I eventually get it.
 
Local multiplayer: who has the time to do that once they graduate and grow up? Sure, the Wii and some of the motion dance games were hits at parties. But aren't those fads over?
There are people with families; maybe you want to play more than just the Wii-centric stuff with your siblings and cousins while they're at your place. Also roommates and close friends who are in the same room with you.

Friend: That's a cool looking game, can I play?

You: Nope, you'll have to buy it separately on your console, then go back to your place and set up your online pass with your online account so we can play.

Friend: Uh huh...

Online multiplayer shouldn't be the only option; both should be available when the opportunity presents itself.
 