View Poll Results: The better approach to optimize video quality on a big (24") monitor (17 voters)
- Keep native resolution, lower video options
- Lower resolution, keep video options as high as you can
- There is no 'one better way', young grasshopper.
03-02-2013, 12:37 PM #1
Join Date: Oct 2011 · Berlin (GMT+1)
Optimal approach to downgrading video settings
My rig can no longer run games that have come out in the last couple of years at max video settings. And I won't be upgrading anytime soon.
Here's my question, and where I would like your experience to weigh in.
In order to run games at acceptable frame rates whilst trying to keep the most beautiful output possible, do you think the better approach is:
- Lower the resolution (my native is 1920x1200) e.g. 1440x900 but try and keep texture, shader etc quality as high as possible
- Keep native resolution but lower all other settings
I honestly struggle with this. Sometimes the upscaling involved in lowering resolution makes things look like shit, whilst other times lowering e.g. texture or model detail makes it look even more like shit.
Any thoughts on this topic? Is there some kind of strategy I could apply other than spending the day trying all sorts of combinations until I find one acceptable?
ps: and just because I can, I'll even add a poll for the funs!
Find me on Steam
03-02-2013, 12:52 PM #2
Mostly it depends on the game (or engine) in question. It's a case of finding the things that make unreasonable requests of your system and tuning them down; if it slows down when certain things happen (usually particle effects for me), then downgrade those. Do you really notice full shadow effects?
Official forums (much that I hate them) will often have posts from people who have done all the graphics fiddling so you don't have to.
03-02-2013, 01:33 PM #3
Personally I always go with turning down the shader settings and such, and keep the resolution at native. I don't really need SSAO for my gaming enjoyment; I just want a sharp image.
As for my recommendations: always turn down shader settings first (HBAO or SSAO, HDR, that sort of thing), as they are the hardest for your card to handle. Then turn down stuff like shadows and lighting, and only then start toning down meshes or models, and then textures. Graphics cards are really good at rendering polygons and textures these days; it's the fancy stuff that makes things difficult for them.
For me personally, a game without much shaders but good textures and models is still very pretty to look at.
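That priority order can be sketched as a simple tuning loop. This is just an illustration, not any game's real API: the setting names are hypothetical, and `measure_fps` stands in for an actual in-game benchmark.

```python
# Minimal sketch of the priority order above. Setting names are
# hypothetical; measure_fps() is a caller-supplied stub standing in
# for a real in-game benchmark. The point is the ordering, not the API.

TARGET_FPS = 60

# Most expendable first, per the post: shader effects, then shadows
# and lighting, then geometry, and textures last.
PRIORITY = ["ambient_occlusion", "hdr", "shadows", "lighting",
            "model_detail", "texture_quality"]

def tune(settings, measure_fps):
    """Step each setting down one notch at a time until the target holds."""
    for name in PRIORITY:
        while measure_fps(settings) < TARGET_FPS and settings[name] > 0:
            settings[name] -= 1  # e.g. High (2) -> Medium (1) -> Low (0)
    return settings
```

The nice property of the ordering is that textures only get touched once everything cheaper to sacrifice is already at its floor.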
03-02-2013, 03:02 PM #4
I'd rather turn down the resolution than go to the most gimped texture resolutions and texture (anisotropic) filtering settings. Other than that, you can probably sacrifice most settings before resolution. AA is a matter of taste, some people are more annoyed by jaggies than others.
03-02-2013, 05:54 PM #5
If the game in question has a Tweak Guide, then it's worth checking out the page for it to see which settings you can lower without sacrificing fidelity or your enjoyment of the game too much. Lots of comparison shots with different settings on or off.
04-02-2013, 10:42 AM #6
I find it interesting that in the past 6 years a few enthusiasts have contributed quite amazing performance configs for UE3 and CryEngine games. Since most games are based on the Unreal Engine or UDK, it's best to start there.
e.g. the Mster config for Crysis. It results in a level of quality very close to the standard Very High settings while retaining performance. I was getting 80-90fps with it, then decided to max out half the settings (post-processing, motion blur, object and texture quality) to get a stable 60fps.
With a bit of testing, you can find out the lowest resolution your monitor or video card can upscale well. I found the best upscaling on nvidia cards to be at 1366x768, while ATI cards do well at 1440x900. Concentrate on finding the sweet spot among lower resolutions on your particular rig and you should be good. If the resolution is adequate, the result is much better than lowering details/texture resolution for the sake of higher resolution.
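The "which resolutions upscale cleanly" question can be sanity-checked with quick arithmetic: a lower resolution upscales more gracefully when its horizontal and vertical scale factors match, i.e. when it keeps the panel's aspect ratio. The candidate list below is just an example; the nvidia/ATI sweet spots above are the poster's own findings.

```python
NATIVE = (1920, 1200)  # the OP's native resolution, a 16:10 panel

CANDIDATES = [(1440, 900), (1366, 768), (1280, 800), (1680, 1050)]

def scale_factors(native, candidate):
    """Return horizontal/vertical upscale factors and whether they match."""
    sx = native[0] / candidate[0]
    sy = native[1] / candidate[1]
    same_aspect = abs(sx - sy) < 1e-9
    return sx, sy, same_aspect

for cand in CANDIDATES:
    sx, sy, ok = scale_factors(NATIVE, cand)
    print(f"{cand}: x{sx:.3f} horizontal, x{sy:.3f} vertical, "
          f"{'keeps' if ok else 'distorts'} the 16:10 aspect")
```

Worth noting: on a 16:10 panel like 1920x1200, 1366x768 is a 16:9 mode, so it either letterboxes or stretches, whereas 1440x900 and 1280x800 keep the aspect ratio.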
Last edited by mashakos; 04-02-2013 at 10:46 AM.
Steam profile
PC Specs: I have a big e-peen
04-02-2013, 11:56 AM #7
After post-processing, AA is what I'd always go for next: 16x AA tends to munch up your FPS. I regard downgrading my resolution as a last step to try; if the gameplay is good, I'm not usually too fussed if it looks a bit worse from turning down the shinies.
"Swans are so big, they're like the Ostriches of the bird world"
04-02-2013, 07:05 PM #8
Downgrading resolution is a last resort for me - I cannot imagine any option faffery would result in the ugliness that non-native resolutions wreak on games (at least any game which displays text or glyphs or stuff like that).
As someone who generally has a cheap PC, I usually find that easing off texture quality and shadows is all I need. There are usually issues with various types of AA whereby it's often better to have some than none; sometimes you need to force them on/off, and sometimes the same with VSYNC.
If that doesn't work - I put the game to one side for when I have a better PC rather than playing it in shit-o-vision.
05-02-2013, 11:29 AM #9
For me, V-Sync, Post Processing and AA go down first, because I hardly notice while playing. If that doesn't help, the resolution goes down. Lowering the resolution also provides some blurriness which is some kind of AA if you want ;)
Btw. isn't it funny how modern post processing renders super sharp images and blurs them again? Depth of field for example looks pretty ridiculous in most games, imo. And you already have DoF available. In your eyes. It's called peripheral vision.
Last edited by Ernesto; 06-02-2013 at 09:56 AM.
05-02-2013, 11:50 AM #10
Yeah I find depth of field pretty pointless. At least the blurring in anti-aliasing serves a real purpose, DOF doesn't really help improve the quality of the scene. It's not realistic - IRL any blurry bits would pretty much always be in your peripheral vision, while on a screen your eyes can roam into the blurry bits whenever you like. At best, it makes the graphics more of an analogue of a video or photograph than something seen first-hand, and that still meshes poorly with first-person games.
06-02-2013, 09:53 AM #11
07-02-2013, 02:16 PM #12
Depth-of-field isn't pointless: your monitor is a flat plane, nothing diminishes on it, and your eyes provide nothing in that respect. Your peripheral vision just confirms you're looking at a flat picture (unless you've got a 27"+ in which case it confirms nothing!!)
It's frequently overused (see also most other post-processing techniques) but it definitely has its place. It works best where you don't notice it at all, though: if it's obvious, it's wrong.
I have to repeat what I said earlier too - I'd rather not play a game than play at non-native res (exception: anything so old it can't do that of course - but those games looked like Lego before and still do!!) - as someone who's owned some pretty basic PCs, lowering the res seldom fixes anything once you've turned all the other stuff off anyway...
07-02-2013, 02:40 PM #13
Hmm... I found this http://www.flickr.com/photos/28887548@N02/5246349795/ to illustrate what I mean. You can focus on a small area on your monitor. Now with the depth of field effect the same thing happens (so it's redundant), but you can't decide where you want to focus. In shooters for example it's always the middle of the screen. If you want to point your weapon somewhere and check your field of view, you see a blurry image when it shouldn't be blurry.
I guess it depends on how you play, but I don't always point the middle of the screen at what I want to look at. I let my eyes wander around the screen instead.