OpenGL multisampling problems
category: general [glöplog]
Hi fellow pouëteers.
I have a problem concerning OpenGL and GL_MULTISAMPLE. I know there are forums for this and everything...
Anyway. I create a render context using a multisample pixel format, checking for the highest possible number of samples (<= 8). Then I enable/disable multisampling with glEnable/glDisable(GL_MULTISAMPLING). Standard stuff.
The problem is that turning multisampling on/off only works on some systems and fails on others, effectively giving me multisampling when I don't need/want it.
I tried querying GL_SAMPLE_BUFFERS and GL_SAMPLES. The first one always returns 1 (multisampling supported) or 0 (no multisampling). The latter returns 2 on systems where switching works and other values (8) on systems where it doesn't. The thing that puzzles me is that I get a rendering context with 8x multisampling and GL_SAMPLES returns 2...
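Roughly what the relevant code does (a simplified sketch, not my actual Qt code; assumes the context is already current):
[code]
// Query what the framebuffer actually provides, then toggle multisample rasterization.
GLint sampleBuffers = 0, samples = 0;
glGetIntegerv(GL_SAMPLE_BUFFERS, &sampleBuffers); // 1 = framebuffer is multisampled
glGetIntegerv(GL_SAMPLES, &samples);              // sample count of the framebuffer

glEnable(GL_MULTISAMPLE);   // multisampled rendering for the 3D pass
// ... draw 3D ...
glDisable(GL_MULTISAMPLE);  // should give plain single-sample rasterization again
// ... draw 2D ...
[/code]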
The gfx cards are an Nvidia FX 5xx (works) and an Nvidia FX 1100 (works). It doesn't work on the laptop I tried it on, which has some Nvidia GeForce 6xxx chipset, or on another PC with some GeForce4 Quadro chipset.
The driver settings are adjusted to let the application control antialiasing. The drivers are more or less fresh versions...
Oh, and I'm using Qt4 to get the render context here...
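In case it matters, the context request looks more or less like this (sketched from memory, the real code is a bit more involved):
[code]
#include <QGLWidget>

// Ask Qt4 for a multisample-capable context and check what we actually got.
QGLFormat fmt;
fmt.setSampleBuffers(true); // request a multisample pixel format
fmt.setSamples(8);          // request up to 8 samples per pixel

QGLWidget *view = new QGLWidget(fmt);
bool gotMS = view->format().sampleBuffers(); // did we get a multisample format?
int  nSmp  = view->format().samples();       // how many samples it actually has
[/code]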
any ideas?
I've never used GL_MULTISAMPLING, but I use GL_MULTISAMPLE_ARB and it seems to work fine on most configs I've tried.
Did you check that?
http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=46
My code is based on this example.
OpenGL is death!!!
killed by Vista.
even the almighty god Carmack is coding for DX and the 360 now :(
Quote:
Standard stuff.
OpenGL makes even standards exciting ;)
for me GL_MULTISAMPLE and GL_MULTISAMPLE_ARB have the same enum value...
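(From the GL headers, if I'm not mistaken, both names map to the same token:)
[code]
#define GL_MULTISAMPLE      0x809D  // core token, OpenGL 1.3+
#define GL_MULTISAMPLE_ARB  0x809D  // same value, from GL_ARB_multisample
[/code]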
The strange thing is I don't get a GL error. I query glGetError() after glDisable(GL_MULTISAMPLE), but it returns GL_NO_ERROR, so it should have disabled multisampling.
[quote]OpenGL is death!!![/quote]
Yeah, that's 'cause people stopped using it, even though it supports everything DX can do... and Vista kills GL because it uses all of the texture memory for the Aero windows. There are other OSes though, and they usually don't support DX...
OpenGL state machine = CANCER
Anyways, you should also check the gfx card's driver preferences, since they usually let you force multisampling on/off or force a specific sample count...
I don't force the settings... they are set to "application defined" on all machines...
gfx cards = state machines
IMHO DX will do the same as GL behind the scenes, you just don't see it...
all sequential processes and events = state machines
hi,
well, first of all make sure you do it like in the tutorial keops pointed out: you have to create a first, normal window to get the extensions and check whether multisampling is available, and if so, destroy it and create a new window with the multisample pixel format.
Assuming that part is done correctly, other problems might come from the fact that on some configs it's possible to enable multisampling but not to disable it... so it's typically the kind of feature you'd enable at startup and never touch.. ;)
hope this helps somehow,
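If it helps, the two-pass setup from that tutorial boils down to something like this (a rough WGL sketch, error handling omitted; hDC and the dummy window/context are assumed to exist already so wglGetProcAddress works):
[code]
// Fetch the extension entry point using the dummy context (declaration from wglext.h).
PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
    (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");

int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_ACCELERATION_ARB,   WGL_FULL_ACCELERATION_ARB,
    WGL_COLOR_BITS_ARB,     24,
    WGL_DEPTH_BITS_ARB,     24,
    WGL_SAMPLE_BUFFERS_ARB, GL_TRUE,   // ask for a multisample format...
    WGL_SAMPLES_ARB,        4,         // ...with 4 samples
    0
};

int  pixelFormat = 0;
UINT numFormats  = 0;
if (wglChoosePixelFormatARB(hDC, attribs, NULL, 1, &pixelFormat, &numFormats) && numFormats > 0)
{
    // Destroy the dummy window/context, create the real window, call
    // SetPixelFormat() with pixelFormat on its DC, then wglCreateContext().
}
[/code]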
rarefluid: some nvidia cards implement high "multisampling" levels by combining multi- and super-sampling. On some systems, the super-sampling part can't be turned off. Perhaps this is what's happening? Try selecting 2 or 4 samples...
And no, d3d won't do the same, since multisampling is a render-target-configuration, not a state-setting (which makes soo much more sense).
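To illustrate what I mean by render-target configuration (a minimal D3D9 sketch, not from any real project; you'd normally verify support with CheckDeviceMultiSampleType first):
[code]
// In D3D9 the sample count is baked into the swap chain / render target at
// creation time instead of being toggled with a runtime state call.
D3DPRESENT_PARAMETERS pp = {0};
pp.Windowed           = TRUE;
pp.SwapEffect         = D3DSWAPEFFECT_DISCARD;     // required for multisampled back buffers
pp.BackBufferFormat   = D3DFMT_X8R8G8B8;
pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;  // fixed for the lifetime of the target
pp.MultiSampleQuality = 0;
[/code]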
Quote:
even the almighty god Carmack is coding for DX and the 360 now :(
Carmack writes code?
...seems to be working on the FX cards, but not on the Go ones (laptop cards)
well, it seems to work like nystep said. You can enable it, but can't disable it...
It also seems that if I glEnable multisampling on a window that already has some sort of antialiasing, the picture gets "even more multisampled"... The only safe way for the application to turn it off is to NOT acquire a multisampling render context in the first place. I need a NON-multisampling context for 2D stuff, so I might try using different windows for 2D and 3D, roughly as sketched below. Hope that'll work.
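(Rough Qt4 sketch of that split; the names are just placeholders:)
[code]
#include <QGLWidget>

// One multisampled context for the 3D view, one plain context for the 2D stuff.
QGLFormat fmt3d;
fmt3d.setSampleBuffers(true);   // multisample format for 3D
fmt3d.setSamples(4);

QGLFormat fmt2d;
fmt2d.setSampleBuffers(false);  // plain single-sample format for 2D

QGLWidget *view3d = new QGLWidget(fmt3d);
QGLWidget *view2d = new QGLWidget(fmt2d);
[/code]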
I'll do some more tests next week...
Thanks for your replies so far guys!
Well. Guess what. Problems are gone with either a new driver or - I rather suspect this - a new version of the Qt library (4.3.0)... That stuff drove me crazy and now it is fucking gone...
And a note on a similar problem: if you're using GL_GENERATE_MIPMAP_HINT_SGIS for textures on cards that claim to support the GL_SGIS_generate_mipmap extension, DON'T TRUST THEM! Some older cards only support it in SOFTWARE (to be OpenGL 2.0 compliant)!
It will be fuckin slow, that's how you'll notice...
I suspect the Quadro FX1100 (GeForce 5700), Quadro FX500/600 (GeForce 5x00) and probably some ATI Radeons too here...
jsyk.
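For the record, the path I mean is this one (simplified sketch; tex/w/h/pixels are placeholders and the texture setup details are left out):
[code]
// Driver-generated mipmaps via GL_SGIS_generate_mipmap. On the cards above this
// can silently fall back to a software path -> crawling texture uploads.
glBindTexture(GL_TEXTURE_2D, tex);
glHint(GL_GENERATE_MIPMAP_HINT_SGIS, GL_NICEST);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP_SGIS, GL_TRUE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

// Safer fallback on such cards: build the chain yourself, e.g.
// gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA8, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
[/code]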
you could also take a look at the opengl.org forums.
Quote:
That stuff drove me crazy and now it is fucking gone...
Or maybe gone fucking. In that case: it will be back!