Ignore Anti Aliasing Settings ??
Posted: Mon Nov 07, 2005 6:15 pm
by nco2k
hi folks,
How can I ignore the AA settings, so that my game will always run with AA turned off, no matter what the user has selected in his GPU driver?
c ya,
nco2k
Re: Ignore Anti Aliasing Settings ??
Posted: Mon Nov 07, 2005 8:31 pm
by PB
I don't think it's possible... the drivers override app settings every time, as
far as I know. If your app is for public use, it's bad practice to do it anyway.
Posted: Mon Nov 07, 2005 9:16 pm
by dracflamloc
Since it's set at the driver level, you're pretty much stuck running with AA on. Just curious, why would you want to force it off anyway?
Posted: Mon Nov 07, 2005 9:46 pm
by nco2k
@PB & dracflamloc
You guys are wrong, it is 100% possible. See Neverwinter Nights or Guild Wars, and some other games as well.
I want to turn this off for performance reasons. I am making a small old-school-looking tile-based 2D game with one or two 3D sprites (not Ogre!), and when AA is enabled the game slows down so much that it is (almost) unplayable anymore. There is also no difference in image quality. So what is the point of AA in a 2D game that does nothing visually but drags performance down hard? That's why I want to force AA off.
I also tried some small games written in PB a while ago and saw the same effect, so it's not a bug in my code, but maybe a bug in PB? I will try to reproduce it, but it will be hard.
Anyway, I want to override the driver's AA settings, just to be on the safe side.
c ya,
nco2k
Posted: Tue Nov 08, 2005 6:09 pm
by Trond
nco2k wrote:
anyway i want to override the drivers aa settings, just to be on the safe side.
I don't know if you knew, but there are four kinds of settings:
Off by default.
On by default.
Always on.
Always off.
You should be able to override "Off by default" and "On by default". The whole PURPOSE of the other two is to NOT "play it the way it was meant to be played". If some user was dumb enough to select one of the "always" options without reason, don't help him, just to really anger the ones who, for reasons you don't know, need to have it on.
Bottom line: why have a settings dialog if the program doesn't respect it?
Posted: Tue Nov 08, 2005 6:33 pm
by nco2k
@Trond
I don't know if you knew...
Yeah, I know.
If some user was dumb enough to select one of the always ones without reason
Dumb or not, the user should have the freedom to select whatever he wants. But my game should run with AA turned off, whatever the user has in his driver settings, because there is no need for it in a tile-based 2D game; the only effect is a performance drop. :roll:
don't help him just to really anger the ones who for some reason you don't know needs to have it on
See Guild Wars: you can set "force AA always on", but it won't affect the game until you set "AA by application" in your driver. That's exactly what I want. I don't want to change the user's global driver settings or anything like that; I just want my game to always run with AA turned off. Otherwise I anger the ones who have to turn AA off manually to play my game and set it back before playing other games. That's bad coding, see Silent Hill 2 for example. :roll:
Bottom line: Why have a settings dialog if the programs doesn't respect it?
Because sometimes it's needed; it's just professional to make sure the game will always run in every situation. Like I said, I don't want to modify the user's system settings, just my own game, and I have the right to do that.
Please don't argue about whether it makes sense anymore. I know it's the correct way, and there are tons of games out there doing it the same way. So if someone can help me out, please do.
c ya,
nco2k
Posted: Tue Nov 08, 2005 10:07 pm
by THCM
I have nearly the same problem with my game. When AA is forced at driver level, my game isn't playable anymore. I don't use the Sprite3D functions. I get massive slowdowns and not all tiles on the screen are drawn correctly. I posted to the bug forum before, but didn't get an answer.
Posted: Tue Nov 08, 2005 10:56 pm
by dracflamloc
AA at the driver level is typically faster than if done by an in-game setting.
There are several reasons for this.
If a user has a forced setting, it will be used regardless of the Guild Wars setting, unless you unforce it by selecting "Application Control".
If a user is smart enough to force something like AA, then they will know to try turning it off if your program runs slowly.
Posted: Tue Nov 08, 2005 11:20 pm
by nco2k
@dracflamloc
If a user has a forced setting, regardless of the guild wars setting, it will use the forced setting unless you unforce it by allowing "Application COntrol"
Wrong!
It won't use the forced AA setting; it uses no AA instead, at least in Guild Wars. Other games can be different, I know. Neverwinter Nights, as far as I can remember, uses its own AA settings even if driver AA is forced off.
If a user is smart enough to force something like AA, then they will know to try turning it off if your program runs slow.
And if the coder were smart enough, he would take care of this problem so the user doesn't have to do this every time.
But like I already said, please don't argue about it (smart or not, blah...) anymore.
c ya,
nco2k
Posted: Tue Nov 08, 2005 11:51 pm
by dracflamloc
I have Guild Wars right here, and I don't know what you are doing or what kind of card/drivers you use, but my AA settings are always forced to 4x and Guild Wars definitely doesn't disable them. This is on a 6800GT with the latest NVIDIA drivers.
Heh, "if the coder was smart enough" he'd realize that driver-level forced settings are there for a reason and are not supposed to be software-controlled.
Changing a user's system settings without notice and permission is something I find backhanded, and I would never run a program that did so. If you really want to find out how to do this, it will be hard; I suggest you look for NVIDIA and ATI API information on their websites. If you do implement this, a message box with a choice would be a good compromise.
Of course, this still won't help with other cards like Intel, SiS, Matrox, etc...
Posted: Wed Nov 09, 2005 12:14 am
by dagcrack
We usually run 16x AA without much slowdown on the workstations at my job, but then we have cutting-edge GPUs. The point is that your card might be third or fourth generation (even some fifth-generation cards struggle past 4x); the problem there is that they can't handle the amount of data anti-aliasing needs, and some other GPUs are not well optimized or don't have a good AA implementation. Others, like the FX5200 and the FX5500, share the exact same core yet have quite good processing power for 2x and 4x AA (the only difference between the 5200 and the 5500 is the clock speed; you could reflash your 5200's BIOS and get a "5500", since it's the same core at a higher speed with fewer overclocking limits, although you shouldn't go wild overclocking ANY card even with an "uber" cooling system installed; it degrades the hardware's life, and I will always think you're stupid for overclocking hardware :p).
But that's why you see the slowdown. Don't worry about it; add a section about it to the readme and that's all. If the user has been playing with his driver settings, that's not something you should worry about. You should care that your game runs on most systems with correct (default) settings, and you should say so in the readme too. Beyond that, you can't expect to control the universe...
...You CAN override any setting (yes), but if you do, make sure you REVERT IT BACK TO NORMAL. Be aware that this may not happen if your program crashes and you have no way of handling such an event.
When it comes to changing gamma (using the gamma ramp), you really do change the user's settings, so just make sure you revert it back to normal; the same goes for anything else.
It's a pity that so many of the AA algorithms require so much fillrate...
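For what it's worth, the "change it, then revert it" advice for the gamma ramp can be sketched with the standard Win32 GDI calls. This is only a sketch, not code from anyone in this thread; it assumes you already have a valid device context (`hdc`) for the game's window, and real code would also try to restore from a crash/exit handler:

```c
#include <windows.h>

/* The gamma ramp is 3 channels (R, G, B) x 256 WORD entries each. */
static WORD g_savedRamp[3][256];

/* Call once at startup, BEFORE touching the ramp. */
BOOL SaveGamma(HDC hdc)
{
    return GetDeviceGammaRamp(hdc, g_savedRamp);
}

/* Call at shutdown (and ideally from any crash handler) so the
   user's desktop is left exactly as it was. */
BOOL RestoreGamma(HDC hdc)
{
    return SetDeviceGammaRamp(hdc, g_savedRamp);
}
```

The same save/restore discipline applies to any other "permanent" setting a game touches.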
Posted: Wed Nov 09, 2005 12:14 am
by nco2k
@drac
I have guild wars right here...
Same here, 6800GT and latest drivers, but make sure 4x AA is disabled in GW itself, then force it in your driver and restart the game. No change here.
Heh, "If the coder was smart enough" he'd realize...
Wrong; otherwise the driver would not be designed to be accessed/controlled by software.
Changing a users system settings without notice...
I am just changing the internal DirectX setting of my game, nothing else.
If you really wish to find out how to do this, it will be hard...
http://msdn.microsoft.com/archive/defau ... e_type.asp AFAIK, using D3DMULTISAMPLE_NONE overrides the driver's AA setting anyway; at least that's how I understood it from a Visual Basic site a while back.
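For context, `D3DMULTISAMPLE_NONE` is a value of the `D3DMULTISAMPLE_TYPE` enum that goes into the presentation parameters when a Direct3D 9 device is created. A minimal C sketch (assuming the DirectX 9 SDK headers and an existing window handle `hWnd`; whether this actually beats a driver-forced AA profile depends on the driver, which is exactly what this thread is arguing about):

```c
#include <d3d9.h>

/* Sketch only: error handling and window creation omitted. */
IDirect3DDevice9 *CreateDeviceNoAA(IDirect3D9 *d3d, HWND hWnd)
{
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;

    /* Explicitly request no multisampling for this device. */
    pp.MultiSampleType    = D3DMULTISAMPLE_NONE;
    pp.MultiSampleQuality = 0;

    IDirect3DDevice9 *dev = NULL;
    IDirect3D9_CreateDevice(d3d, D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                            D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
    return dev;
}
```

Note that `D3DSWAPEFFECT_DISCARD` is the only swap effect compatible with multisampling at all, so a device created this way with `D3DMULTISAMPLE_NONE` is as close as plain Direct3D 9 gets to "AA off by application".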
If you do implement this, a message box with a choice would be a good compromise
What for?! "Hey user, thank you for playing my game. If you have forced 16x AA or whatever, it won't change anything in my game, because I disabled it in my internal DirectX routines, hahaha; but it's a 2D game anyway, isn't it? Best wishes, the coder. Press OK or Cancel now."
Of course this still won't help with other cards like intel, sis, matrox, etc...
That's why I want to override this via DirectX, like thousands of other games do as well. Is that so hard to understand?
c ya,
nco2k
Posted: Wed Nov 09, 2005 12:18 am
by dagcrack
thats why i want to override this due directx, like thousands other games aswell. is that so hard to understand?
It isn't hard for me to understand; I already told you (I hope you've seen my post).
Anyway, if you play with the gamma ramp or any "permanent" setting, just make sure to revert it back to normal, always.
Posted: Wed Nov 09, 2005 12:29 am
by nco2k
@dagcrack
Yeah, sorry, you sent your post while I was still typing.
Well, even the newest games run pretty fine on my machine with 16x AA and 16x AF, with all high-quality settings on and optimizations turned off (RivaTuner), but a lousy PureBasic game drops hard even with 2x AA. Why?!
Don't worry, I will never ever change anything permanently (that's bullsh*t coding), which is why I want to do it via DirectX and only for my game. Just like SetFrameRate(), it doesn't change anything else, anywhere else.
c ya,
nco2k