Render to Texture
Many advanced effects require textures as both output and input. Shadows are the most important example of this. One way to render shadows is to render the scene from the light's point of view and record the distance from the light to each surface. Then, render from the camera's point of view and, for each fragment, compare its distance to the light against the recorded distance. If the recorded distance is smaller, some other object sits between the fragment and the light, so the fragment is in shadow.
To accomplish this effect, we need to render the scene and store the result for later use. This is done by rendering into a texture.
In OpenGL, we have so far rendered only to the default framebuffer, whose contents are displayed on the screen. However, we can create additional framebuffers on the GPU and render into them. These alternate buffers have no predefined use; we can use them for whatever we wish.
To store render results, we first create a texture on the GPU. This texture is not backed by any image data; we simply need the storage space on the GPU.
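A minimal sketch of allocating such an empty texture follows. The name `colorTex` and the 1024x1024 RGBA format are illustrative assumptions, not requirements; the key detail is passing `NULL` as the data pointer so OpenGL allocates storage without uploading an image.

```c
GLuint colorTex;
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);

/* NULL data pointer: reserve storage on the GPU, upload nothing. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1024, 1024, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

/* Filtering parameters so the texture is usable when sampled later. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```

For a shadow map, a depth format such as `GL_DEPTH_COMPONENT24` would be used instead of `GL_RGBA8`.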
Once we have the space, we can tell OpenGL to use the storage as an output buffer for the fragment shader stage.
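This hookup is done through a framebuffer object. A sketch, continuing with the hypothetical `colorTex` texture from above:

```c
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

/* Attach the texture as the framebuffer's first color output. */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

/* The framebuffer must be "complete" before it can be rendered into. */
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle the error */
}

glBindFramebuffer(GL_FRAMEBUFFER, 0); /* back to the default framebuffer */
```

If depth testing is needed while rendering into the texture, a depth attachment (for example a renderbuffer) must be attached to the same framebuffer as well.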
Then, before rendering with glDraw* calls, we bind the framebuffer so that rendering is directed into our texture instead of the screen.
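A sketch of a render pass into the texture; `fbo` and `vertexCount` are assumed from the surrounding setup, and the viewport is sized to match the assumed 1024x1024 texture:

```c
glBindFramebuffer(GL_FRAMEBUFFER, fbo);     /* render into our texture */
glViewport(0, 0, 1024, 1024);               /* match the texture's size */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glDrawArrays(GL_TRIANGLES, 0, vertexCount); /* any glDraw* call works */

glBindFramebuffer(GL_FRAMEBUFFER, 0);       /* back to the screen */
```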
After rendering, the storage on the GPU is filled with the results of our render. Remember that this storage is still an ordinary texture, so we can bind it as input to another render step.
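The second pass can then sample the first pass's output like any other texture. In this sketch, `program` and the sampler uniform name `uSceneTex` are hypothetical:

```c
glBindFramebuffer(GL_FRAMEBUFFER, 0);   /* now render to the screen */
glUseProgram(program);

/* Bind the previous render result as input on texture unit 0. */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, colorTex);
glUniform1i(glGetUniformLocation(program, "uSceneTex"), 0);

/* ...draw geometry whose fragment shader samples uSceneTex... */
```

This two-pass pattern is exactly what shadow mapping needs: the light-view pass fills the texture, and the camera-view pass reads it back.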