Debugging

It's a sad fact of programming that code rarely works on the first go. You type something and it doesn't compile, so you remove the worst syntax errors and eventually it runs, but doesn't do what it should. If you're even somewhat experienced in coding, you probably have your own set of debugging techniques to cope: finding out the value of some variables and comparing them with what you know they should be, printing a log message when a condition occurs that should never occur, and similar things.

In GLSL, that's completely infeasible. A fragment shader runs 30 times per second for two million pixels - if you could print a line to the console in every execution, you'd get sixty million messages every second. Not really useful - which is why GLSL doesn't even contain anything like cout to access an output stream.

What to do then?

Syntax errors

The somewhat easier part to deal with is syntax errors. If the shader doesn't compile, the graphics card driver generates an error message, usually even with a line number. Where exactly this error message ends up depends on your application and your operating system - under Linux, a program started from the console usually directs the error stream back to the console, but there's no guarantee; the application can redirect the error stream to a log file or elsewhere. So you first need to find out where errors end up, and then use the information provided to catch syntax errors.

Of course, before you do that, you need to realize that something is wrong. Depending on how your effect is structured, it may be that if a shader doesn't compile, you get to see a fallback instead. In this case you get meaningful visuals on the screen, they just don't change regardless of what you change in the code (because the shader you're editing isn't the one being run).

If you ever get the feeling that the output doesn't react to whatever you do to the code, insert a line

gl_FragColor = vec4 (1.0, 0.0, 0.0, 1.0);

at the very end of your fragment shader. If your object doesn't turn red, your shader isn't running.

Algorithm errors

Say you have verified that your shader compiles and runs, but it doesn't do what it should. What then? Are the normals attached properly? Is there a problem with the view vector definition? You'd like to know the value of the normals or of the dot product, but you can't print it.

You can, however, color-code it and bring it on-screen that way. Remember, whatever you write to gl_FragColor is what you'll see on-screen. You can in fact write the normal vector as well - simply pass it over from the vertex shader (a minimal sketch of that plumbing follows below the figure) and do:

gl_FragColor = vec4 (normal, 1.0);

and you'll get something like this (well, if you look in the right direction, one hemisphere will be black where the normal components go negative...):

[Figure: gl_Normal of an airplane windshield rendered as a color distribution.]

Here you see nicely that there are in fact sharp discontinuities in the normals at the edges - so the rendering problem I was seeing in precisely that region when doing this particular debug was caused by the incoming mesh, not by a problem in the shader.
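In case the 'pass it over' part is unclear, here is a minimal sketch of the plumbing, assuming legacy GLSL with the same built-ins (gl_Normal, gl_FragColor) used elsewhere in this text - the variable name normal is of course just a choice. In the vertex shader:

varying vec3 normal;

void main()
{
    normal = gl_Normal;           // hand the raw model-space normal over to the fragment stage
    gl_Position = ftransform();
}

and in the fragment shader:

varying vec3 normal;

void main()
{
    gl_FragColor = vec4 (normal, 1.0);   // x, y, z mapped to red, green, blue
}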

The important thing here is to be creative and come up with a useful mapping - color channels are always supposed to be between 0 and 1. The components of a normalized normal vector are automatically in that range (apart from the negative part, which shows as black), but a quantity like MSL altitude is not - so you have to transform it before writing it as a color, otherwise you'll only see white.
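For example, something along these lines will do - here altitude and max_alt are made-up names, with max_alt being whatever scale is sensible for your scene:

float shade = clamp(altitude / max_alt, 0.0, 1.0);   // map [0, max_alt] to [0,1]
gl_FragColor = vec4 (shade, shade, shade, 1.0);      // grey ramp: black at zero, white at max_alt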

It takes some time to get used to looking at colors to diagnose rendering problems, but it actually works, and with some practice it's nearly as powerful as being able to print logs to the console.

A similar technique helps if you want to know whether a certain condition which should never occur does in fact occur. Try

fragColor = mycolor;
if (condition) {fragColor = vec4 (1.0, 0.0, 0.0, 1.0);}   // mark the offending pixels red
gl_FragColor = fragColor;

and you'll see all pixels marked in red where the condition which should not occur does in fact occur. Of course that still leaves you with the problem of why it occurs - but that part of debugging is the same everywhere.

Driver idiosyncrasies

A matter which took me some time to get used to is how much the outcome of what you do depends on the graphics driver you're working with. It's really as if there's an invisible companion working with you on a rendering project who has a character of his own - something I most definitely don't know from C++ or FORTRAN coding.

NVIDIA drivers usually try to be helpful and accommodating - if you make a syntax error, the driver still tries to interpret your intention, and if that is reasonably clear, the shader compiles and runs fine without you getting much of a warning. The most striking example I have seen: I had declared a color-valued function to return a float, yet assigned a vector to the return value - and strangely enough it worked; I never realized until the shader ran on a different system and did not even compile.

AMD/Radeon drivers, on the other hand, feel like strict schoolmasters, insisting on correct syntax to the letter - otherwise they'll do nothing.

Sometimes drivers also have bugs - code that's formally okay isn't run properly. Problems like this are often cured by updates - or by restructuring the code ever so slightly so as not to trigger that particular bug. The fact that under Linux graphics cards often run on open-source rather than the proprietary drivers doesn't necessarily help here.

Numerical issues

Sometimes a shader runs fine, but you get to see odd colors at just a glancing reflection angle. An unexpected black chunk gouged out of an otherwise pale blue sky. That kind of thing.

It's likely that these are numerical issues. You're computing in floating point precision in the first place, and graphics card drivers are usually more concerned with speed than with accuracy. So the numbers you get may just be close, but not exact.

Usually that's not a problem, because you can't visually tell if a color value is off by a percent. But if your calculation relies on the math working out to a result between 0 and 1, sometimes a percent of difference means you get a value of -0.000001. Which is close to zero - but if you pass it as a color value it's not zero, and it causes problems. Even more so if the result goes next into a function like a square root that doesn't take negative arguments!

A fairly reliable cure, if you really need an output in a certain range and don't want to trust your luck with numerics, is to clamp return values to the valid range

value = clamp(value, 0.0, 1.0);

or to restrict their range otherwise

value = max(value, 0.0);
parameter = min (parameter, 1.0);
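To make this concrete, here is the kind of place where such a guard earns its keep - a hypothetical specular term, with the names made up:

float NdotH = dot(normal, halfVector);               // can come out as -0.000001 at glancing angles
float specular = pow(max(NdotH, 0.0), shininess);    // max() keeps pow() away from a negative base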

While this should not be necessary mathematically, insisting on the math unfortunately buys you nothing if the driver doesn't play along.

Continue with Texturing.



Created by Thorsten Renk 2016 - see the disclaimer and contact information.