^^ Playing X360 on a 19" 1440x900 monitor already looks very clean.
Still, the attempt should be made. No need to run into cases of having devs point out the errors of your analysis in public when it could have been handled in private.
Ok, but that is already using some form of AA for most games (upscaling).
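For what it's worth, the "upscaling is a form of AA" point is easy to see with a toy example: a linear scaler produces output pixels that are weighted averages of neighbouring source pixels, so hard edges come out with intermediate values, much like an AA resolve. A minimal sketch, my own illustration and nothing to do with the 360's actual scaling hardware:

```python
import numpy as np

# One row of source pixels containing a hard black/white edge.
src = np.array([0, 0, 0, 1, 1, 1], dtype=float)
src_x = np.arange(len(src))

# Upscale 6 -> 9 samples with linear filtering: each output pixel is a
# weighted blend of its two nearest source pixels.
dst_x = np.linspace(0, len(src) - 1, 9)
dst = np.interp(dst_x, src_x, src)

print(dst)  # the edge now steps through a fractional value instead of jumping 0 -> 1
```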
Wait, so now developers have access to 7 Xbox One cores instead of 6?
As far as I know, DF always tries to talk to the developers; they would much rather have a developer talk about their own work. But that is very hard to arrange - first to get the talk done during crunch time (which is usually when it is relevant, because that's when the game comes out and people want to know about it), and then to get PR to sign off - so when it does happen, it typically happens quite some time after the game is out.
I am still finding at times that I am not a fan of AA. In Project CARS I had MSAA turned on, but turned it off, and the framerates are better, the details can be cranked up higher, the colors and lighting pop more, etc. Hmm. The shimmering at 1080p isn't that bad to begin with. Starting to think more and more that the benefit of AA is linked to the ppi (which is probably a bit obvious, but I'm not sure everyone realises it).
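On the ppi point, a quick back-of-the-envelope helps: what actually matters is angular pixel density (pixels per degree), which depends on ppi and viewing distance together. The screen sizes and distances below are my own assumptions, not anything from this thread:

```python
import math

def pixels_per_degree(h_res, v_res, diagonal_in, view_dist_in):
    # Pixels per inch from the diagonal, then the angle one pixel subtends
    # at the given viewing distance (in inches), then its reciprocal.
    ppi = math.hypot(h_res, v_res) / diagonal_in
    pixel_angle_deg = 2 * math.degrees(math.atan(1.0 / (2 * ppi * view_dist_in)))
    return 1.0 / pixel_angle_deg

# Assumed setups: a 24" 1080p monitor at ~60 cm, a 27" 1440p at ~60 cm,
# and a 55" 1080p TV at ~2.5 m.
for args in [(1920, 1080, 24, 24), (2560, 1440, 27, 24), (1920, 1080, 55, 100)]:
    print(args, round(pixels_per_degree(*args), 1))
```

The higher the pixels-per-degree figure, the smaller the angle each stair-step subtends, so shimmer is less visible and AA buys you less - which fits the small-high-ppi-screen (or far-away TV) experience.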
Yup, games on the X360 using a VGA cable look superbly crisp whether they have AA or not. If the resolution matches your TV's native resolution, you are in for a treat. :smile2:
Where have you been since the SDK leak?
Analogue.
@Cyan VGA is a bit blurrier than HDMI/DVI, at least that's what I got when comparing inputs on my 29" 1080p HDTV with the X360. But the blur is not the annoying blur you get from upsampling an image. I don't know how to describe it.
VGA looked way better than HDMI on my Samsung HD Ready TV at the time -which I gave away to my mother- when playing X360 games. The HDMI signal had to be upscaled to the 1366x768 native res of my TV, but I could set VGA to 1440x900: two horizontal black bars appeared at the top and bottom of the screen, and the image was so good it made even jaggy games look beautiful and practically jaggie-free. I only noticed that colours seemed a bit more dim, but the change overall was really worth it.
I had an official 360 VGA cable, but either the cable or the HANA chip had some internal interference that caused a kind of ghosting. It reminded me a little of when I built an RGB cable for my Megadrive and the composite signal interfered with the RGB because it wasn't shielded (removing composite from the cable fixed it). On the 360 I set the output range to full and that made the issue disappear, for whatever reason.
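The black bars in the post above line up with straightforward aspect-ratio maths, assuming the console renders a 16:9 frame (say 1280x720) and the scaler fits it to the width of the 16:10 1440x900 mode:

```python
# Assumed 16:9 source resolution; 1440x900 is the monitor mode from the post.
src_w, src_h = 1280, 720
out_w, out_h = 1440, 900

scale = out_w / src_w            # fit to width: 1.125
scaled_h = round(src_h * scale)  # 810 picture lines
bar = (out_h - scaled_h) // 2    # 45-pixel black bar top and bottom
print(scale, scaled_h, bar)      # 1.125 810 45
```

A 1366x768 panel, by contrast, is close enough to 16:9 that the whole frame gets stretched to fill it, which is consistent with there being no bars over HDMI.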
VGA was pretty damn sharp if your monitor was sampling the signal properly and the signal was free of interference. If you're not sampling the analogue signal properly, or if there's interference (which could be in the display adapter, the cable, or even the monitor), values bleed between neighbouring pixels along each row.
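To make the "bleed between pixels" point concrete, here's a toy model. It is a big simplification: it treats the analogue scanline as a linearly interpolated waveform and the monitor's ADC as a resampler with a phase error.

```python
import numpy as np

# One scanline of alternating black/white pixels, as the console would drive it.
row = np.array([0, 255, 0, 255, 0, 255, 0, 255], dtype=float)
x = np.arange(len(row))

good = np.interp(x, x, row)        # sample clock aligned with pixel centres: exact recovery
bad = np.interp(x + 0.4, x, row)   # 0.4-pixel phase error: each sample mixes two neighbours

print(good)  # [  0. 255.   0. 255. ...]
print(bad)   # values land around 102 and 153 instead of 0 and 255: softened, "bled" pixels
```

That blending is why a badly synced VGA input looks soft in a different way from an upscaled image: it's neighbouring pixel values mixing within the row, not a resolution change.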
I really regret ditching my Trinitron monitors.
Edit: I just replied to a necro. Happy 2015.