determining why glsl shaders fail on ati
category: code [glöplog]
I work on a project which generates shaders at runtime, and we are trying to move from Cg to GLSL. The new GLSL stuff seems to work great...but only on nvidia hardware.
I'm not really experienced enough to be able to look at the shaders and determine what is angering the ati driver/hardware gods. Is there a nice list of such things to look out for or any write-ups/posts someone could direct me to about this compatibility issue?
FWIW, here are some example shaders:
http://min.us/mboAd2fK4t#1o
Being an NVIDIA guy myself, I've found ATI's ShaderAnalyzer very useful for pinpointing the problems - did you give it a try? Perhaps the most common pitfall I keep running into is skipping channels when storing color output.
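For illustration, a guess at the kind of channel skipping meant here (hypothetical fragment shader lines, not taken from the linked shaders; "color" is an assumed vec3):
Code:
// risky: only rgb is written, alpha is left undefined
gl_FragColor.rgb = color;
// safer: store all four channels explicitly
gl_FragColor = vec4(color, 1.0);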
what i learnt first was:
always have a dot in your float literals... e.g. "float a=1;" should be at least "float a=1.;" (1.0 works of course, but if you are trying to make it tiny you can skip the zeros), while "0" itself can be "0." or ".0", it doesn't matter.
i hated having to do so back when i had an nvidia card and everything worked fine except on other people's ATI cards, but what has to be done has to be done :/ (in the end it's only a few bytes you "lose" to that, though.)
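A minimal sketch of the literal issue described above (hypothetical lines, assuming a compiler that enforces the spec as strictly as the ATI drivers mentioned in this thread):
Code:
float a = 1;    // int literal assigned to a float - strict GLSL compilers reject this
float b = 1.0;  // fine everywhere
float c = 1.;   // also fine, one byte shorter for size coding
float d = .5;   // the leading zero can be dropped too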
you could also sketch your shaders in "ATI RenderMonkey" (it's free, google it, download/install it, love it); if they work in there they should compile in your code as well.
shuffle2: In my experience, outputting the compilation-log even if the compilation succeeds, and fixing the warnings reported by the NVIDIA compiler fixes 99% of ATI-incompatibilities (or rather, NVSL-dependencies).
Another option is to declare the shader version (i.e. "#version 100" even if you only use GLSL 1.0 features); this seems to enable a strict mode in NV's GLSL compiler.
hArDy: Rendermonkey doesn't prevent all NV-ism in shaders, so that won't help.
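A rough sketch of the #version suggestion above: put an explicit directive at the top of the generated source (the version number here is only an example; use whatever the generator actually targets). With it in place, a strict compiler should reject the non-standard constructs discussed later in this thread instead of silently accepting them:
Code:
#version 110
// with an explicit version, the compiler is expected to enforce that version of
// the spec: no 0.5f literals, no saturate()/frac(), only standard built-ins
void main()
{
    gl_FragColor = vec4(0.5);
}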
I had a look at your shaders. They are all wrong, in that they declare floating point numbers with an f suffix. For example (Shader003),
Code:
vec3 c01 = (c0 + c1) * 0.5f;
is not legal GLSL. It should be:
Code:
vec3 c01 = (c0 + c1) * 0.5;
Another mistake you are making is using saturate(), which doesn't exist in GLSL. For example, in Shader119, replace
Code:
prev.a = saturate(cprev.a*crastemp.a);
to
Code:
prev.a = clamp(cprev.a*crastemp.a, 0.0, 1.0);
Another error is using frac(), which doesn't exist in GLSL. Use fract() instead. For example, in Shader119, replace
Code:
cprev = frac(prev * (255.0f/256.0f)) * (256.0f/255.0f);
with
Code:
cprev = fract(prev * (255.0/256.0)) * (256.0/255.0);
etc etc etc. Basically, your shaders will work in GLSL when they follow the GLSL standard. More info on Wikipedia: http://en.wikipedia.org/wiki/GLSL
i know i can be bitchy, but eh, how many times do we have to answer the same question?
shuffle: Read and follow the spec (btw it seems the document has changed a lot since I last looked) instead of relying on what happens to work in "nvsl" ;)
IIRC it also covers when it's needed to specify float constants and when it's not (but I guess that's only really relevant for size coding)
hardy: rendermonkey (naturally) uses the opengl driver on the system, so no, that won't help. But as kbi says ShaderAnalyzer is really useful to test if shaders at least compile (which will catch most pitfalls).
And as kusma says, nvidia's compiler is actually quite strict (ie glsl and not nvsl) if you start specifying the version - at least in the higher versions I remember it as more strict than amd's.
OK, thanks everyone!
iq: It didn't come off as overly bitchy. However, we #define frac and saturate, among other things, when needed (i.e. when generating GLSL):
Code:
#define frac(x) fract(x)
#define saturate(x) clamp(x, 0.0f, 1.0f)
So with any luck that just leaves the float constant syntax as the remaining problem. Thanks a lot for looking at the files though.
also we are currently using
Code:
#version 330 compatibility
so i'll fiddle with this to make it more strict
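One possible way to tighten this (an assumption on my part, not something confirmed to fit this particular generator): request the core profile instead of compatibility, which drops the legacy built-ins and tends to surface non-portable code earlier:
Code:
#version 330 core
// core profile drops the legacy built-ins, so non-portable code surfaces earlier;
// gl_FragColor is gone and the output has to be declared explicitly
out vec4 fragColor;

void main()
{
    fragColor = vec4(1.0);
}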
why not just use the standard naming instead of macros? :)
What iq and others said, basically. You could have easily detected all these errors using AMD's GPU ShaderAnalyzer, and just stick to the GLSL 330 specification that you can get here.
It's really annoying to see people just blaming hardware vendor X or API Y instead of actually trying to figure out what is going wrong. :/ OpenGL and ATI work fine, you just have to produce good code and test. It's not so difficult.
Quote:
The new GLSL stuff seems to work great...but only on nvidia hardware.
so gl continues to be a total jackass festival, regardless of what iq justly pointed out.
Quote:
so gl continues to be a total jackass festival, regardless of what iq justly pointed out.
No, nvidia just continues to make their drivers more "developer friendly" than "standards compliant".. =)
sol: No, it's more that NVIDIA fucked up a long time ago (they used the same compiler for Cg and GLSL), and doesn't want to break existing code that depends on this misbehavior.
So they need to default to non-strict compilation, and find a future-proof way to introduce strictness. Enabling it on "#version" makes sense as such.
In other words, they are "user friendly" rather than "developer friendly", as people get to play the games they bought, even if the developer went belly-up after shipping.
kusma: thanks for that. I've long thought nvidia were just catering to lowest-denominator coders or something with their glsl compiler; now I can move from general disgust to grudging acceptance of the situation :)
Informative thread, spread the word.
Don't only blame NV - ATI shader compilers are crap too. _really_
kusma: Yeah I figured it'd be that, the "Cg" conflict. But any more or less standardized API that allows major vendors to pull off jackassery like that for whatever legacy reasons they have.... is questionable :) But I know all this doesn't really help the discussion so I'll leave it at that.
(then again NV also has told "people" that if "you set obsolete renderstate X to Y" your "game will run better come next driver release", so it's all good business I guess)
i couldn't even compile my last intro on ATI. (it gets stuck without bailing out, no error message, task manager needed. still no fix for that, that's why there's no FINAL yet :/ )
fuck them shader compilers!
rasmus: because currently we can still produce Cg shaders.
las: +1
I have been curious for some time how Apple deals with this problem. Anyone know? I find it doubtful that Jobs would allow such a discrepancy in 'his' OS and hardware, but Nvidia, ATI and Intel gfx Macs do indeed exist.
I doubt he'd give a flying fuck about differences between GLSL compilers tbh, but who knows :)
Never had a mac with nvidia somehow, but I've not seen anything like the nvidia-only mess we've seen over the years on PC (and I'm involved with other scenes where a problem like this would be well known). In general ati/nv incompatibilities are rare. If anything I'd say ATI has a better reputation for drivers than nvidia on mac, but it's nothing people really bitch about. The general state of opengl and various other stuff gets plenty of complaints though ;)
wow, someone quoting a dead jobs in a glsl context. props for that feat, really. and i thought i had seen it all.