Gaming Journalism

Still, the attempt should be made. There's no need to run into cases of devs pointing out the errors of your analysis in public when it could have been handled in private.

As far as I know, DF always tries to talk to the developers. They will always prefer to have a developer talk about their work. But this is always very hard: first to get the talk done during crunch time (which is usually when it is relevant, because that's when the game comes out and people want to know about it), and then to get PR to sign off. So when it even happens, it typically happens quite some time after the game is out.

Also, I very strongly doubt that DF is so profitable that anyone there has a full-time job working on DF articles, so I highly doubt they have a PS4 devkit. I also doubt they would get much benefit from one, as developers aren't going to give them their source code anyway, and you can't profile retail releases on a devkit - you may well not even be able to run retail releases on one.
 
Ok, but that's already using some form of AA for most games (upscaling ;) )

I mean, X360 games are 720p, but upscaled to 1440x900 they look good (clean of aliasing, sharp) because of the high PPI of a small screen (19 inch).

It's like when I streamed a PC game to my Nokia Symbian phone years ago. No aliasing, sharp. I think it was a 480p screen at a very small size - smaller than the 3DS screen.
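
For what it's worth, the PPI point is easy to put numbers on. A back-of-the-envelope sketch (the 40" TV and the ~3" 480p phone panel below are assumed sizes for comparison, not quoted specs):

```python
# Rough PPI comparison for the kinds of screens mentioned above.
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'19" 1440x900 monitor : {ppi(1440, 900, 19):.0f} PPI')   # ~89 PPI
print(f'40" 1920x1080 TV     : {ppi(1920, 1080, 40):.0f} PPI')  # ~55 PPI (assumed TV size)
print(f'3" 640x480 phone     : {ppi(640, 480, 3):.0f} PPI')     # ~267 PPI (assumed panel size)
```

The small, dense screens pack several times more pixels into the same physical area, so the same aliasing artefacts simply end up too small to notice.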
 
Then DF should document their attempts in the articles, along with any attempts to source outside devs who may serve as a technical source. If the devs are busy, restricted from answering, or simply are A-holes, then that's totally fine.

But the technical portions of their articles that go beyond simple frame or pixel counting should be structured in a way that shows they have attempted to gather the most accurate information. It's not hard to write that the developers were unavailable for comment, that outside technical sources were unable to readily determine what's happening, and that here is our best guess of what's going on.

There's nothing wrong with guessing at what's happening as long as it's noted. But postulations should be left to aspects of the analysis that are easy to get right.

In short, if they want to develop and maintain trust with their readership, they should document their attempts to get accurate information from primary or secondary sources, and denote their own analyses as educated guesses when those sources are unavailable. Readers shouldn't be left to assume.
 
I am still finding at times that I am not a fan of AA. In Project CARS I had MSAA turned on, but turned it off, and the framerates are better, the details can be cranked up higher, the colors and lighting pop more, etc. Hmm. The shimmering at 1080p isn't that bad to begin with. Starting to think more and more that the benefit of AA is linked to the PPI (which is probably a bit obvious, but I'm not sure everyone realises it).
^^ Playing the X360 on a 19" 1440x900 monitor already looks very clean :D
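
Another way to put that: what matters is how large one 720p source pixel ends up on the glass after scaling. A rough sketch, assuming a 40" living-room 1080p TV for comparison (that size is my assumption, not something quoted above):

```python
# Hedged sketch: physical width one 720p source pixel covers after upscaling.
import math

def ppi(w, h, diag_inches):
    return math.hypot(w, h) / diag_inches

def source_pixel_mm(src_w, dst_w, dst_h, diag_inches):
    """Width in mm that one horizontal 720p source pixel spans after upscaling."""
    display_pixel_mm = 25.4 / ppi(dst_w, dst_h, diag_inches)  # 25.4 mm per inch
    return (dst_w / src_w) * display_pixel_mm

print(f'720p on 19" 1440x900 : {source_pixel_mm(1280, 1440, 900, 19):.2f} mm per source pixel')
print(f'720p on 40" 1920x1080: {source_pixel_mm(1280, 1920, 1080, 40):.2f} mm per source pixel')
```

A stair-step that spans roughly 0.3 mm on the 19" monitor spans roughly 0.7 mm on the assumed 40" set, which goes a long way towards explaining why the same game reads as clean on one and jaggy on the other.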
Yup, games on the X360 using a VGA cable look superbly crisp whether they have AA or not. If the resolution matches your TV's native resolution, you are in for a treat. :smile2:

That was a huge advantage for the Xbox One.

I remember discussing this eons ago, but HDMI looked a bit more colourful on my old TV, although when I next played the same game over VGA, the image quality was so good: there was no shimmering, no jaggies. Even the sub-HD games looked great; jaggies were visible to some extent, but they never seemed to cut like a saw just by looking at them.
 
@Cyan VGA is a bit blurrier than HDMI/DVI, at least that's what I got when comparing inputs on my 29" 1080p HDTV with the X360. But the blur is not the annoying blur you get from upsampling an image. I don't know how to describe it.
 
I had an official 360 VGA cable, but either the cable or HANA had some internal interference that caused a kind of ghosting. Reminded me a little of when I built an RGB cable for my Megadrive, but the composite channel signal interfered with the RGB because it wasn't shielded (removing composite from the cable fixed it). On 360 I set output range to full and that made the issue disappear, for whatever reason.

VGA was pretty damn sharp if your monitor was sampling the signal properly and the signal was free of interference. If you're not sampling the analogue signal properly, or if there's interference (which could be in the display adapter, the cable, or even the monitor), values can bleed between the pixels that make up each row.
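
A toy way to see that bleed (just a one-scanline model with made-up values, nothing to do with the 360's actual DAC or timing):

```python
# Toy model: if the monitor's ADC samples the analogue scanline off-centre
# (wrong phase / pixel clock), each captured value mixes two neighbouring pixels.
import numpy as np

scanline = np.array([0, 255, 0, 255, 0, 255, 0, 255], dtype=float)  # sharp 1-pixel pattern
grid = np.arange(len(scanline))                                     # pixel centres

def resample(phase):
    """Re-sample the (linearly interpolated) analogue line at an offset grid."""
    return np.interp(grid + phase, grid, scanline)

print(resample(0.0))   # perfect phase: pattern stays 0/255, razor sharp
print(resample(0.5))   # half a pixel off: interior values collapse to mid-grey mush
```

Most monitors had an auto-adjust (plus manual phase/clock controls) for exactly this reason.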

I really regret ditching my Trinitron monitors. :(

Edit: I just replied to a necro. Happy 2015. :oops:
 
VGA looked way better than HDMI on my Samsung HD Ready TV at the time (which I gave away to my mother) when playing X360 games. The HDMI signal had to be upscaled to the TV's 1366x768 native res, but I could set VGA to 1440x900; two horizontal black bars appeared at the top and bottom of the screen, and the image was so good it made jaggier games look beautiful and jaggie-free. I only noticed that colours seemed a bit dimmer, but the change was really worth it overall.
 