I've got a strange problem with my deferred shading implementation. I render the required information via MRT into an FBO, currently diffuse, position, and normals in world space, which looks like this:

This is done with the following setup for all three textures:
diffuse = std::shared_ptr<bb::Texture>(new bb::Texture(GL_TEXTURE_2D)); // generates the texture ID
diffuse->bind();
diffuse->texture2D(0, GL_RGB, width, height, 0, GL_RGB, GL_FLOAT, 0); // wraps glTexImage2D
diffuse->parameterf(GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
diffuse->parameterf(GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
diffuse->parameterf(GL_TEXTURE_MIN_FILTER, GL_LINEAR);
diffuse->parameterf(GL_TEXTURE_MAG_FILTER, GL_LINEAR);
diffuse->unbind();
texture2D(GL_COLOR_ATTACHMENT0+1, diffuse->getID(), 0); // attaches the texture to the FBO

These are then used in my drawing stage:
dsShader->bind();
dsShader->enableVertexAttribArrays();
ds->diffuse->bind(GL_TEXTURE0); // "ds" is an FBO containing the textures
ds->position->bind(GL_TEXTURE0+1);
ds->normal->bind(GL_TEXTURE0+2);
dsShader->sendUniform("diffuse", 0);
dsShader->sendUniform("position", 1);
dsShader->sendUniform("normal", 2);
dsShader->sendUniform("camera", camera3D->position.x, camera3D->position.y, camera3D->position.z);
dsOut->indexBuffer->bind();
dsOut->vertex2Buffer->bind();
dsOut->vertex2Buffer->vertexAttribPointer(dsShader->getAttribLocation("vertex0"), 2, GL_FLOAT, false, 0, 0);
glDrawElements(GL_TRIANGLES, dsOut->indexBuffer->size(), GL_UNSIGNED_INT, 0);
ds->diffuse->unbind();
ds->position->unbind();
ds->normal->unbind();
dsShader->disableVertexAttribArrays();
dsShader->unbind();

With the following shader (just the relevant part; the light source is hardcoded):
struct DirLight {
    vec3 direction;
    vec4 diffuse, specular;
};

uniform sampler2D diffuse;
uniform sampler2D position;
uniform sampler2D normal;
uniform vec3 camera;

DirLight light0 = DirLight(vec3(1, 1, 0), vec4(0.3), vec4(0.1));

in vec2 vertex;

void main(){
    vec4 color = texture(diffuse, vertex)*0.5;
    vec3 p = vec3(texture(position, vertex));
    vec3 n = normalize(vec3(texture(normal, vertex)));
    float ndotl = max(dot(n, normalize(light0.direction)), 0.0); // diffuse (Lambert) term
    if(ndotl > 0.0){
        color += ndotl*light0.diffuse;
    }
    gl_FragColor = color;
}

The weird part is: if I set the light source direction to negative values, say:
DirLight light0 = DirLight(vec3(-1, 0, 0), vec4(0.3), vec4(0.1));

the final result is not shaded, while it looks right (more or less) for positive values. Here is a picture of the output:

And there could also be a problem with the normals, marked in the red area.
If you store the normals with an unsigned normalized internal format like GL_RGB, then the G-Buffer itself also has this clamping behavior: negative components are clamped to zero when written. You can either use a signed format like GL_RGB8_SNORM, or do the old * 0.5 + 0.5 trick to manually scale into [0,1] and the * 2.0 - 1.0 trick to scale back to [-1,1].