You can configure your videocard and/or monitor for that. Or just use a custom resolution whose dimensions are relatively prime to the native ones.
If you still need a 1:1 pixel mapping, just disable resolution stretching; your active display area will then be exactly the same as you'd get on a smaller display.
I know I wouldn’t give up the extra space when working with text editors, IDEs, design apps, CAD, Studio, Maya, PS etc., or just watching movies. But I also ain’t gonna shell out for a fancy video adapter just for the sake of games.
Because that is a 16:9 aspect ratio. 16:9 is the widescreen movie ratio. For example, 1920x1080 is a 1080p movie and 1280x720 is a 720p movie. For a long time, 16:10 (or 8:5) was the standard for computer monitors; however, 16:10 monitors are becoming less common, and that makes me a sad panda because I don’t like things THAT widescreen.
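For what it’s worth, you can check what ratio a resolution really is by dividing both dimensions by their greatest common divisor. A throwaway Python sketch (the resolutions are just the usual examples from this thread):

from math import gcd

def aspect_ratio(width, height):
    # Reduce width:height by the greatest common divisor of the two
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1920, 1080))  # 16:9
print(aspect_ratio(1280, 720))   # 16:9
print(aspect_ratio(1920, 1200))  # 8:5, i.e. 16:10
print(aspect_ratio(1680, 1050))  # 8:5, i.e. 16:10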
Also @vivanto: No matter what, scaling has a negative impact on image quality, even if it is the correct aspect ratio.
My monitor is 1920x1080 and it’s fine; I have no idea what you guys are talking about. My games don’t lag; Crysis Warhead runs flawlessly on Enthusiast settings at 1920x1080. Seriously, unless your computer can’t handle high resolutions, there is absolutely nothing wrong with a 1920x1080 monitor.
It [lag] has nothing to do with the fact that the resolution is 1920x1080. However, display quality / image scaling DOES. Most people (who know their stuff) prefer 16:10 for computing uses and 16:9 for media. And 16:10 has no problem displaying 16:9 - you get more out of a 16:10 monitor, as a 16:9 CANNOT natively display 16:10 content.
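To put numbers on that (using the common panel sizes, nothing specific to anyone's setup here): a 1920x1200 (16:10) panel can show a full 1920x1080 frame pixel-for-pixel with a (1200 - 1080) / 2 = 60 pixel black bar above and below, whereas a 1920x1080 panel would have to downscale a 1920x1200 image. A rough Python sketch of that check:

def fits_natively(panel_w, panel_h, content_w, content_h):
    # Content can be shown 1:1 only if it's no bigger than the panel in both directions
    return content_w <= panel_w and content_h <= panel_h

# 16:10 panel showing 16:9 content: fits, with 60 px letterbox bars top and bottom
print(fits_natively(1920, 1200, 1920, 1080))  # True
# 16:9 panel showing 16:10 content: doesn't fit, so it has to be scaled down
print(fits_natively(1920, 1080, 1920, 1200))  # False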
It [the reason for not getting 1920x1080] has to do with the fact that you’re rendering a game at a high resolution. What videocard do you have, Garth? Because I have a 5870, and I’m positive that I could run Crysis maxed out at 1920x1200 if I had a monitor that large (but I don’t, so I run Crysis at 100fps @ 1440x900).
So unless you have a bitchin’ videocard, you’re better off with 1680x1050ish. Especially if you plan on playing games that will come out in the near future WITHOUT having to upgrade your videocard. For example, your current graphics card can play Crysis @ 1920x1080, but in a year, a new “Crysis” will come out (i.e. a game that brings GPUs to their knees) and you’ll suddenly be gaming at 1024x768 just to get playable framerates.
I figure I’ll get a 16:10 monitor, but run (Crysis-type GPU-intensive) games at 1680x1050. I’ve found nothing on the web that mentions scaling being horrible. I’m confident (hoping) that my GPU/monitor will be able to scale without tearing or anything like that. (If there is some, I’m hoping it will be very minor. If this were a big problem, I’d think there would be something out there I could have found on the subject.)
Well, I’m not necessarily saying 1920x1080 is better than anything else; I was simply disagreeing with everyone else saying it was shit. My videocard is a 9800GTX. My computer is over a year old and I have yet to find a game it can’t handle at full settings. I guess you know more than me, and the OP is best off going with your advice.
I’m also gonna disagree that 1920x1080 is shit; most games that support 16:10 also support 16:9 natively. Also, you shouldn’t rely on scaling to play games faster - with LCDs it’s either native resolution or scaling disabled if you want good image quality.