Blender Game Engine.
It is nice.
But it also has its problems. Just like everything else.
This time I had a hard time doing some GLSL shaders. Not the normal ones.
Just for testing I did a little GLSL viewer in BGE:
Just edit the simple.fs as you like. The code inside is taken from http://glsl.heroku.com/e#6952.0
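For context, a viewer along those lines can be sketched with the BGE shader API. This is only a hypothetical reconstruction of my setup: the file name simple.fs, the pass-through vertex shader and running the script from an Always sensor are all assumptions:

```python
# Hypothetical sketch of a minimal GLSL viewer for the BGE.
# Assumed setup: an Always sensor runs this script on the object that
# should show the effect, and simple.fs sits next to the blend file.

VERTEX_SRC = """
void main() {
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
"""

def load_fragment_source(path):
    """Read the fragment shader text from disk so it can be edited freely."""
    with open(path) as f:
        return f.read()

def apply_viewer_shader():
    import bge  # only available inside the running game engine
    obj = bge.logic.getCurrentController().owner
    frag_src = load_fragment_source(bge.logic.expandPath("//simple.fs"))
    for mesh in obj.meshes:
        for material in mesh.materials:
            shader = material.getShader()
            if shader is not None and not shader.isValid():
                shader.setSource(VERTEX_SRC, frag_src, True)
```

Since the fragment source is re-read from disk, editing simple.fs and restarting the game shows the new effect.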
You can find more nice codes here:
But with this little sample I got a problem:
How do I find out whether a uniform is active?
glGetUniformLocation should return -1 if the uniform isn't active. But that function is not in the Python API.
So currently I have no choice but to pass every uniform I know of and live with lots of messages about some of them not being in the shader.
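A crude workaround, sketched here as a hypothetical helper, is to scan the fragment source for uniform declarations and only pass those. It still misses uniforms the compiler optimised away, but it silences most of the noise:

```python
import re

# Matches declarations like "uniform float time;" or "uniform vec2 resolution;".
# A rough pattern for illustration: it ignores arrays, structs and comments.
UNIFORM_RE = re.compile(r'^\s*uniform\s+\w+\s+(\w+)\s*;', re.MULTILINE)

def declared_uniforms(glsl_source):
    """Return the set of uniform names declared in the given GLSL text."""
    return set(UNIFORM_RE.findall(glsl_source))

def filter_uniforms(glsl_source, candidates):
    """Keep only candidate uniforms that appear in the source, so we
    don't spam the console with 'uniform not found' messages."""
    declared = declared_uniforms(glsl_source)
    return {name: value for name, value in candidates.items()
            if name in declared}
```

With that, setUniform* only gets called for names the shader at least mentions.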
Here I tried using the RenderToTexture feature to create a texture for an object on the fly:
Short answer: I couldn't get it working.
1) How do I switch between shaders without recreating them all the time?
2) How do I render onto another texture than just the first one?
3) How can I hide shader compile messages?
Don't wanna talk about the performance issue…
Or that it won't work as I thought it could.
So to get some answers:
1) Currently the shader program is part of BL_Shader, and it is encapsulated pretty well, so BL_Shader should be exchangeable. Some "setShader" method inside KX_BlenderMaterial would be nice.
2) RAS_IPolyMaterial just has one texture. BL_Material has access to more, and the same goes for KX_BlenderMaterial. As KX_BlenderMaterial is a RAS_IPolyMaterial, there might be a way to query the texture name with a virtual method.
3) Create a state that disables the output of those messages.
My first intent was to improve the interest detection I talked about last time [http://urfoex.blogspot.de/2013/02/bgeglsl-finding-interest-points.html]. For that I thought about using framebuffer objects.
Simply put, it is the same as RenderToTexture: you create a buffer and then render to it instead of to the screen. When done, you can use the buffer, e.g. as a texture for the real screen rendering.
It is not the thing I had hoped for but at least it is something that could have helped a bit. My real intention was to have a buffer in GLSL that I could read and write to. The buffer should be on GPU. The shader could access and modify it each round.
The framebuffer could do the trick. The steps would be the following:
- create as many framebuffers as needed
- bind first fb, bind shader for first fb
- render just the object to the fb
- repeat steps 2 and 3 until all framebuffers are used; consider using previous FBs as samplers
- set shaders, fb-samplers and other things for real rendering
- for each following rendering: skip step 1
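The steps above can be sketched as a tiny planning helper. The GL calls themselves (glGenFramebuffers, glBindFramebuffer, glUseProgram) are left as placeholder callbacks since they need a live context; the pass structure is the point here:

```python
def fbo_pass_plan(n_buffers):
    """Return a list of (target, samplers) passes mirroring the steps above:
    pass i renders into framebuffer i and may sample all earlier buffers;
    the last pass (target None) renders to the screen using every buffer."""
    plan = [(i, list(range(i))) for i in range(n_buffers)]
    plan.append((None, list(range(n_buffers))))
    return plan

def render_frame(plan, bind_fbo, bind_shader, draw):
    """One frame: walk the plan, bind each FBO and its shader, then draw.
    bind_fbo/bind_shader/draw stand in for the real GL calls
    (glBindFramebuffer, glUseProgram, the object's draw call)."""
    for target, samplers in plan:
        bind_fbo(target)               # None means the default framebuffer
        bind_shader(target, samplers)  # previous FBs go in as samplers
        draw()
```

Once the framebuffers exist, every following frame just re-runs render_frame, which is the "skip step 1" part.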
Here are my first tries:
They don't work.
Why not? I'm not quite sure. Using PyOpenGL should be simple. But somehow it is not.
Or at least not in combination with BGE.
I had found a sample on how to include it:
But, "yay!", that only works for single-texture mode (see http://blenderartists.org/forum/showthread.php?277025-KX_PolygonMater*ial-and-KX_BlenderMater*ial-question&highlight=KX_PolygonMaterial).
Writing some code I could get some framebuffer code together, but to use the FB afterwards I would need to set it as a sampler. How to do that without proper access to the shader? By building the shader via PyOpenGL. If only it would give me more than just an error:
File "Shader.py", line 68, in load
File "/usr/local/lib/python3.3/dist-packages/PyOpenGL-3.0.2-py3.3.egg/OpenGL/GL/shaders.py", line 220, in compileShader
    glCompileShader( shader )
File "/usr/local/lib/python3.3/dist-packages/PyOpenGL-3.0.2-py3.3.egg/OpenGL/latebind.py", line 41, in __call__
    return self._finalCall( *args, **named )
File "/usr/local/lib/python3.3/dist-packages/PyOpenGL-3.0.2-py3.3.egg/OpenGL/GL/VERSION/GL_2_0.py", line 137, in GLSLCheckError
    description= glGetInfoLog( cArguments )
OpenGL.error.GLError: GLError(
    description = b"Vertex shader failed to compile wit...,
    baseOperation = glCompileShader,
    cArguments = (14,)
)
So no PyOpenGL shader for me.
It's a pain if you just want to code and get things done and it doesn't work the way you want. Python moves forward but its modules don't follow: PyOpenGL only has experimental support for Python 3.2, and its acceleration support in particular doesn't work with Python 3.3. On the other hand, OpenGL has long since moved on to version 4.3. But you can't just use it via Python in the BGE. It's not even in the C++ code.
Looking at those problems, they mostly arise from trying to code everything in Python. Extending the engine would do the trick and might help others too.
Most of the time in game making you probably won't notice the problems I had. Shaders don't change on the fly, and RenderToTexture is expensive and so gets used just once or twice, e.g. for mirrors.
Most shaders from shader sites like those above simply work inside the BGE. As far as I can see, most of them should run unchanged:
One thing that could also be easily implemented, but for now is just a bit annoying: creating your own GLSL shader.
It's not the shader itself but the corresponding lighting, shadowing and such things. If you use material nodes to do your tricks, everything is just fine. But if you create your own shader, you lose all of that for the corresponding object, or better, 'material'. Every extra like light and shadow you need to add yourself.
As an idea, I thought about providing a main shader that should come with the BGE, with some includes and functions in there. Via Python you could set the right file to include and so the right function to call.
E.g. something like this:
out vec4 glFragColor;

void main() {
    vec4 diffuse = myDiffuseShader();
    vec4 specular = mySpecularShader();
    vec4 normal = myNormalShader();
    vec4 ambient = myAmbientShader();
    glFragColor = mainShader(diffuse, specular, normal, ambient);
}
def useShader():
    fragShader = read("mainFrag.glsl")

Shouldn't that work?
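On the Python side, that include idea could be assembled with plain string substitution. The "#include <name>" syntax and the part names here are made up for illustration:

```python
def assemble_shader(template, parts):
    """Replace lines of the form '#include <name>' in the main shader
    template with the GLSL snippet registered under that name."""
    out = []
    for line in template.splitlines():
        stripped = line.strip()
        if stripped.startswith("#include <") and stripped.endswith(">"):
            name = stripped[len("#include <"):-1]
            out.append(parts[name])
        else:
            out.append(line)
    return "\n".join(out)

# Example: pick a diffuse implementation at runtime.
MAIN_TEMPLATE = """#include <diffuse>
void main() {
    glFragColor = myDiffuseShader();
}"""

LAMBERT = "vec4 myDiffuseShader() { return vec4(0.8); }"
```

The assembled string would then go to setSource as usual, so the engine still sees one complete fragment shader.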
I think I'll try that another day.
While reading through the BGE sources, here are some things I would wish for:
→ nullptr instead of 0
→ no goto
→ shared_ptr instead of normal *
→ mercurial / git instead of svn
→ no "spit"ting around; use clog and cerr, or even better proper logging
→ newest Python 3.3 with hardly any modules working, yet really old OpenGL
→ C++ code that needs refactoring