Tom Clancy's The Division [PS4, XO]

Makes sense. It could actually be the birth of a new, effective marketing strategy, where they can then say, "look, our console game looks as good as a '$4000' PC!" It's a clever strategy because very few people will be willing to believe that the game was purposely downgraded, whereas most console fans will absolutely believe that their shiny new console can match PCs costing "10x" more. Maybe that does sell them more units; it's an interesting strategy.
 

Terrible decision (strategy) if that’s what UBI is setting out to do…
 

I dunno; ultimately Ubisoft are chasing as many sales as they can, so in spite of how we may feel about the tactic, if it succeeds in making people feel like their $400 console is a supercomputer and gets them spending more, then mission accomplished. Watch Dogs seems to have sold well in spite of the internet crying over visual downgrades, so maybe they will try the strategy again. I mean hey, you can't blame them for trying something new; whatever it takes to make sales. Use the PC version to generate buzz at E3, then downgrade it at ship to show how their l33t coders got the console version to match the uber-expensive PC version. Get people happy about their consoles to get them spending more. You have to admit, all moral or ethical issues aside, it's kind of a clever sales tactic. It's not like the masses will know any better anyways.
 
It's not like the masses will know any better anyways.
I would have thought that 'the masses' would have no idea what the game looks like on PC, wouldn't see it at E3, and would just buy it. I imagine the gamers that watch E3 and would see the console version matching the PC version are the sorts of gamers who'd then read about PC downgrades on websites. Certainly I'd expect the PC gamer to be informed. As such, the PC version wouldn't need to be downgraded. You could show the PC version in promo materials while drumming up interest, with the masses only knowing the game through the occasional YouTube clip their gamer buddies point them to, and then sell them the console version which doesn't look as good but the masses aren't aware of that.
 
I'm not a PC gamer, but if I was one of the more serious ones with a proper rig, and all my newest games were being downgraded because the devs don't want 'such a huge gap' with consoles, I'd be seriously f'ked off.
What's the point of mega powerful PC GPUs if most of the good games cater to much lower specs, and are downgraded on purpose??
 
Many like to believe that the next-gen consoles are very, very underpowered because they don't have exotic architecture and thus have no advantages whatsoever over any PC, not just high-end PCs. But to be frank, only fools could believe that a $400 machine is as powerful as a $4000 one.
 

But of course. What worries me is that even a mid-range PC appears to perform better than the consoles.
People expect consoles to perform better than the average gaming PC.

I am astonished that some not-very-demanding games perform worse, or not as well as expected, on consoles. Current console games perform like polished 1.5 versions of the previous gen.

I get the impression that both manufacturers were more cost-conservative this time around, so the consoles didn't get the boost they used to get from gen to gen. That, coupled with how efficiently these consoles already extract their performance, makes me suspect that we will see significant improvements, but probably not as large as we used to. Of course, I could be wrong: the 360 gave a similar impression when it was first released, but exploded with amazing quality a year later with games such as Gears and Bioshock.
 
I think it would be nice to know whether the next-gen consoles still hold the "same" advantages over PC that we witnessed in past generations (low-level access to hardware, focus on fixed specs, less overhead, etc.), because it could help us better set our expectations. BUT I think we can safely say that there's room for improvement, and that year-one titles do not represent the full capabilities of the next-gen.
sebbbi said to DF that "Launch games never show the true long term potential of the consoles", which IMO should be a comfort for those who are worried about the next-gen.
Also, I believe that Guerrilla Games, and some of MS's devs too, said that in time we should expect better results, but the message probably didn't get through.

P.S.
I think that players should learn to stop worrying and love the [strike]bomb[/strike] next-gen.
 
I'm not a PC gamer, but if I was one of the more serious ones with a proper rig, and all my newest games were being downgraded because the devs don't want 'such a huge gap' with consoles, I'd be seriously f'ked off.

It's not because the developers don't want a huge gap (even if they say that). It is because with the discrete GPU market asymptotically diminishing to a very low fraction, there is no reason why they should sink a lot of money into features which will only be enjoyed by a few.

In 1999 every PC (as in 100%) had an add-in board (AIB) for graphics; last year it was 21%, and this year will be lower still. New memory technologies (3D stacking, DDR4) will accelerate this trend.

The PC mass market will have performance equivalent to the new consoles, or lower.

Cheers
 
If that's true and devs target the lowest common denominator, doesn't that spell the end of monster PC rigs? What's the point in monster cards if they don't benefit? Or will the PC elite just indulge in the same games at 4k supersampling and be done with it?
 
If that's true and devs target the lowest common denominator

Not the lowest common denominator, the most profitable denominator, which will be around current gen. console performance levels.

doesn't that spell the end of monster PC rigs?

Regrettably, yes.

What's the point in monster cards if they don't benefit? Or will the PC elite just indulge in the same games at 4k supersampling and be done with it?

Devs will implement easily scalable features: higher spatial resolution, higher temporal resolution, longer draw distances, more particles, etc.

Cheers
 
Devs will implement easily scalable features: higher spatial resolution, higher temporal resolution, longer draw distances, more particles, etc.
Does that mean that technical progress in the GPU space may as well come to a halt until new consoles? It's generally been the case that PC advances aren't properly used, and if it's becoming even more that way, new techniques will be even more pointless when devs aren't even going to consider them.
 

This saddens me... I always enjoyed building high-spec'd rigs knowing that games would support them. Games that really required beefy PC components: engines using an insane amount of geometry, an ungodly amount of shaders, intensive shadowing, 2K/4K resolutions, etc. What diehard PC gamer really wants to build a high-spec'd rig just to run a poorly coded / poorly optimized console port? *cough* Watch Dogs *cough*

IMHO, this stagnates the industry rather than improving it.
 
Does that mean that technical progress in the GPU space may as well come to a halt until new consoles? It's generally been the case that PC advances aren't properly used, and if it's becoming even more that way, new techniques will be even more pointless when devs aren't even going to consider them.

Somehow your reply makes me think of the PS2 era (not in a good way): the staggeringly long life cycle that system enjoyed. Games started looking like each other across that system; even the PS2-to-PC ports had that PS2-ish feeling to them. :runaway:
 
If that's true and devs target the lowest common denominator, doesn't that spell the end of monster PC rigs? What's the point in monster cards if they don't benefit? Or will the PC elite just indulge in the same games at 4k supersampling and be done with it?

Don't see why it'll be any different from the past few years in those regards. This is a mature market area and we pretty much know what will happen.

PC will be better, by some degree or another, even if the baseline is held down, and some people will fork out for that. If it was nothing but 60 FPS and higher res on PC, they would. But of course it will be more.
 
Don't see why it'll be any different from the past few years in those regards.
AFAIK this is the first time a company has basically said they are capping their games to the lowest common denominator. In DF face-offs, PC has always shown better effects etc. versus PS360. It sounds like now the only difference will be an increase in IQ values.
 
I honestly expected the new generation of consoles to help boost PC gaming, not gimp it.
 
I would have thought that 'the masses' would have no idea what the game looks like on PC, wouldn't see it at E3, and would just buy it.

They will know whatever the hype machine tells them. Most publications will not go into downgrades or anything like that; they will gloss over all of it because it's in their interest to keep the video-game hype machine fed. So they will blast out news about how face-melting some game looks and ignore the fact that it doesn't actually look like that when it finally comes out. It's the hype machine; reality never has much to do with it, it's just business. Whatever info comes out of that is what the masses will know, and/or what will become "fact". As for the rest, the completely uninformed don't care, because they've probably moved on to other gaming devices anyway, and the rest of us forum types will bitch and moan to zero effect because our complaints are irrelevant. And let's face it, most people who complain and pledge never to buy a game from company X are full of it; most of them break down and buy it anyways.


This saddens me... I always enjoyed building high-spec'd rigs knowing that games would support them. Games that really required beefy PC components: engines using an insane amount of geometry, an ungodly amount of shaders, intensive shadowing, 2K/4K resolutions, etc. What diehard PC gamer really wants to build a high-spec'd rig just to run a poorly coded / poorly optimized console port? *cough* Watch Dogs *cough*

IMHO, this stagnates the industry rather than improving it.

The good news is that it will help phones and tablets bridge the visual gap much quicker, since if 7-year-old hardware dictates what we visually see, then hardware on a yearly cycle will be able to catch up to where things look "close enough" much more quickly.
 