Project Journal – Radiosity frame analysis

We are now moving on to the screen-space radiosity effect. To help understand how the effect works, I am going to break down a single frame in the same way we did when looking at how AO worked.

This is the frame we are going to break down:

Radiosity frame analysis

This is a much more complex example than before. It includes everything we covered last time plus lighting and radiosity, so we will start where we did before and have another look at the G-Buffer.

G-Buffer Generation 

As with last time, the first step is generating the deep G-Buffer. The demo is using a 2-pass depth peeling method rather than the single-pass prediction method. However, the results are the same, so this doesn't cause any issues.
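For reference, the peeling itself boils down to a simple per-fragment depth comparison in the second pass. Below is a minimal C++-style sketch of that test; the minimum-separation parameter and the names are my own assumptions rather than anything taken from the demo.

```cpp
// Hypothetical per-fragment test for the second depth-peeling pass: keep only
// fragments that lie behind the first layer's depth. The minimum-separation
// term is an assumption; the names do not come from the demo.
bool belongsToSecondLayer(float fragmentDepth,    // camera-space depth of the candidate fragment
                          float firstLayerDepth,  // depth stored by the first pass at this pixel
                          float minSeparation)    // gap required between the two layers
{
    // Anything at or in front of the first layer was already captured,
    // so the peeled layer only keeps strictly farther fragments.
    return fragmentDepth > firstLayerDepth + minSeparation;
}
```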

The first 2 render targets are screen-space normals and the base colour. These look as follows for each layer.

First targets of G-Buffer

 

So nothing different so far as you would expect.

Last time the following two render targets appeared to be empty. However, this time they have been written to, and we can see that we now have both a glossiness/shininess factor and an emissive term.

The gloss target stores specular colour in the red, green and blue channels and smoothness in the alpha. Below you can see these terms separated, with alpha on the left and RGB on the right. Note: the levels were modified slightly for the RGB channels; the actual values are close to 0.

Gloss/spec
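For my own G-buffer the layout described above maps to a trivial decode. A quick sketch, assuming the RGB/alpha split shown here (the names are hypothetical):

```cpp
#include <glm/glm.hpp>

// Decode of the gloss/spec target as described above: specular colour in RGB,
// smoothness in A. The struct and function names are hypothetical.
struct GlossSpec
{
    glm::vec3 specularColour;
    float     smoothness;
};

GlossSpec decodeGlossTexel(const glm::vec4& texel)
{
    return { glm::vec3(texel), texel.a };  // RGB -> specular colour, A -> smoothness
}
```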

In the emissive texture below I again had to modify the levels, as the real values were close to zero. You can see that between the tops of the buildings a sky cubemap is drawn into the emissive target.

emissive

The cubemap used is displayed below.

cube

This information isn't required for the second layer, as that layer isn't properly shaded. Most of this is already available in my own G-Buffer, so not much needs to be changed. All I am missing is the emissive skybox, which I should be able to add quite easily.

 

Ambient Occlusion 

We covered this in plenty of detail last time around so we will skip over exactly how the algorithm works.

It still performs the custom mip generation for the camera-space depth, and then generates and blurs the AO, producing the following texture.

AO

 

Pre-Lighting

Now that all of the resources have been created, the scene is lit with diffuse lighting from a single directional light. This data will later be used to calculate the radiosity. The previous frame's radiosity is also included to simulate multiple indirect bounces, allowing the previous frame's values to affect the current frame's values.

The normals, current depth buffer, previous depth buffer, diffuse colour, previous radiosity (I'll come to this shortly), a shadow map (which appears to be empty) and the screen-space velocity are all passed into the shader.

Looking at the uniforms passed into the shader, the MVP for the light is empty, so we can assume that shadows are not currently enabled (it doesn't look like they are); however, we will eventually want them in our solution.

Looking at the shader, it just performs the direct lighting calculation and then applies the indirect lighting by sampling from the previous radiosity buffer, using the screen-space velocities to reproject the correct sample. Later, the specular and ambient lighting will be calculated using the AO and gloss/spec data. The HDR values are stored in an R11_G11_B10_Float buffer to avoid using any additional space: the alpha channel wasn't required, so its bits are shared amongst the other channels instead.
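To make that concrete, here is a rough CPU-side sketch of what the per-pixel work appears to be, written with glm. The reprojection helper and the names are assumptions on my part, not code lifted from the demo.

```cpp
#include <functional>
#include <glm/glm.hpp>

// Hypothetical sketch of the pre-lighting pass for one pixel: Lambertian
// diffuse from the single directional light, plus last frame's radiosity
// fetched through the screen-space velocity. Names are assumptions.
glm::vec3 preLightPixel(const glm::vec3& albedo,
                        const glm::vec3& normal,    // from the G-buffer
                        const glm::vec3& lightDir,  // direction towards the light
                        const glm::vec3& lightColour,
                        const glm::vec2& uv,
                        const glm::vec2& velocity,  // screen-space motion since last frame
                        const std::function<glm::vec3(glm::vec2)>& prevRadiosity)
{
    // Direct term from the directional light.
    float nDotL = glm::max(glm::dot(normal, lightDir), 0.0f);
    glm::vec3 direct = albedo * lightColour * nDotL;

    // Indirect term: reproject into the previous frame and reuse its radiosity,
    // which is what lets bounces accumulate across frames.
    glm::vec3 indirect = albedo * prevRadiosity(uv - velocity);

    return direct + indirect;  // written to the R11_G11_B10_Float target
}
```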

In the following pass the second layer is shaded in exactly the same fashion; however, it uses the same radiosity buffer as the first layer.

The final shaded scenes for both layers appear as follows.

Shaded layers compared

 

Everything looks saturated because the scene is stored in High Dynamic Range (HDR); the values in this image range as high as 10 or more, so when clamped to [0, 1] it looks a little blown out. This is sorted out in a tonemapping pass before presentation.

 

Radiosity Prep 

In preparation for the radiosity calculation, the scene data is downsampled in exactly the same way as was done for the AO. This is likely for the same reason: it improves cache locality when using wide sample areas.

The downsampled data includes both shaded layers, both sets of normals and the camera-space depth again. The depth does not need to be downsampled a second time, but I presume it was left in so that this pass doesn't depend on the AO having been calculated beforehand.

One interesting point is that the normals for both layers are packed into a single RGBA8 texture. The Z component is dropped, so layer 1's XY are stored in RG while layer 2's XY are stored in BA. This reduces memory cost, at only the added cost of reconstructing the Z component (and a minor loss of precision).
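A sketch of that packing scheme as I read it, assuming view-space normals that face the camera so the positive root can be taken on reconstruction (the names are hypothetical):

```cpp
#include <cmath>
#include <glm/glm.hpp>

// Layer 1's normal XY in RG, layer 2's in BA, Z reconstructed on read.
glm::vec4 packNormalsXY(const glm::vec3& n1, const glm::vec3& n2)
{
    // Remap XY from [-1, 1] into [0, 1] so they survive an unsigned RGBA8 target.
    return glm::vec4(glm::vec2(n1) * 0.5f + 0.5f,
                     glm::vec2(n2) * 0.5f + 0.5f);
}

glm::vec3 unpackNormalXY(const glm::vec2& encoded)
{
    glm::vec2 xy = encoded * 2.0f - 1.0f;
    // Rebuild Z from unit length; this is where the minor precision loss
    // mentioned above shows up.
    float z = std::sqrt(glm::max(1.0f - glm::dot(xy, xy), 0.0f));
    return glm::vec3(xy, z);
}
```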

No images have been included as it doesn’t help to describe the process. Just imagine the pictures above but smaller. 🙂

 

Calculating Screen-Space Radiosity

The prepped data is now sent to a shader to calculate the indirect lighting. The results are written out to two RGBA16_Float buffers. Here all four channels are required: RGB stores the calculated bounced lighting and A stores "ambient visibility". I think this means how confident the sample coverage is, i.e. how visible the point is to the indirect lighting samples. Areas of low confidence will be filled with static environment map samples; we will see this value being used later when the indirect lighting is used to shade the scene.

The first render target stores the calculated values for the first layer, while the second target is supposed to store the values for the peeled layer. However, it seems as though this is disabled here, as both outputs are identical and only show the first layer.
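I haven't picked the gather shader apart line by line yet, but conceptually each pixel gathers radiance from nearby screen samples using the packed normals and the lit colour from the pre-lighting pass. Here is a rough sketch of one sample's contribution, assuming simple Lambertian transfer between the two points; this is my reading of the idea, not the demo's shader.

```cpp
#include <glm/glm.hpp>

// Rough sketch of a single gathered sample's contribution to a pixel's
// bounced light, assuming Lambertian transfer between the two surface points.
glm::vec3 bounceFromSample(const glm::vec3& receiverPos,     // camera-space position of the pixel
                           const glm::vec3& receiverNormal,
                           const glm::vec3& samplePos,       // camera-space position of the sample
                           const glm::vec3& sampleNormal,
                           const glm::vec3& sampleRadiance)  // lit colour from the pre-lighting pass
{
    // Direction from the receiver towards the sampled point.
    glm::vec3 w = glm::normalize(samplePos - receiverPos);

    // Light has to leave the sample towards the receiver and arrive within the
    // receiver's hemisphere; both cosines are clamped to zero.
    float emit    = glm::max(glm::dot(sampleNormal, -w), 0.0f);
    float receive = glm::max(glm::dot(receiverNormal, w), 0.0f);

    return sampleRadiance * (emit * receive);
}
```

The real pass presumably sums many such samples around each pixel and keeps track of how many were usable, which would be what feeds the ambient-visibility value in the alpha channel.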

The generated texture looks as follows

Radiosity calculated for layer 1

You can see that the texture is very noisy for now. This will be fixed up in the next couple of passes. You can also see that the alpha approaches 1 at edges and depth discontinuities, because the sample count is low in these areas. Interestingly, it looks a little bit like AO. Everything is very purple due to light bouncing off the curtains and hitting other surfaces; since most other areas are white, this purple colour stands out.

In the next pass the newly calculated radiosity is merged with the previous frame's. This works almost identically to temporal AA: the last frame's value is reprojected and blended with the new values, effectively increasing the sample count over a few frames.
The accumulated frame can be seen below. It still looks noisy, but this will be reduced later.

accumulated radiosity
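A sketch of what that blend might look like, assuming an exponential moving average as used in temporal AA; the hysteresis value here is a guess on my part.

```cpp
#include <glm/glm.hpp>

// Temporal accumulation step: blend the new result towards the reprojected
// history. Over a few frames this behaves like averaging many samples,
// which is why the noise settles down.
glm::vec4 accumulateRadiosity(const glm::vec4& current,      // this frame's noisy result
                              const glm::vec4& reprojected,  // last frame's result, reprojected
                              float hysteresis = 0.9f)       // how much history to keep
{
    return glm::mix(current, reprojected, hysteresis);
}
```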

 

Post Processing 

To reduce the noise in the resulting radiosity, it is put through a bilateral filter in exactly the same way we saw with the AO. This reduces noise while preserving edges. At the same time the guard band is removed, discarding the incorrectly computed values that naturally occur at the screen edges.
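For completeness, here is a sketch of the idea behind a depth-aware (bilateral) blur: a normal spatial kernel whose taps are weighted down when their depth differs from the centre pixel, so the blur does not bleed across edges. This is not the demo's exact filter, just the general shape of one.

```cpp
#include <cmath>
#include <vector>
#include <glm/glm.hpp>

struct Tap
{
    glm::vec3 colour;         // radiosity at the neighbouring pixel
    float     depth;          // camera-space depth of that pixel
    float     spatialWeight;  // e.g. Gaussian weight by distance from the centre
};

glm::vec3 bilateralBlur(const std::vector<Tap>& taps, float centreDepth, float depthSigma)
{
    glm::vec3 sum(0.0f);
    float totalWeight = 0.0f;
    for (const Tap& tap : taps)
    {
        // Taps across a depth discontinuity get a tiny weight, preserving edges.
        float d = (tap.depth - centreDepth) / depthSigma;
        float w = tap.spatialWeight * std::exp(-d * d);
        sum += tap.colour * w;
        totalWeight += w;
    }
    return totalWeight > 0.0f ? sum / totalWeight : sum;
}
```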

You can see the final blurred indirect lighting below.

blurred

 

This is no different from the bilateral filter used for AO, so it will be easy to integrate into the current engine.

In a final pass, motion blur is applied. Since the camera is not moving nothing changes, and since it isn't directly related to the project we are going to skip over it.

Shading

Now that the radiosity has finally been calculated, the scene is shaded using all of the calculated data, including gloss/spec, emissive and AO.

As previously mentioned, the radiosity's alpha channel contains a confidence value. We can see in this pass that an environment map is included, which will be used to fill areas of low confidence.

radiosity cube

I'm not sure if this is created on startup or was pre-generated and is loaded from disk. Either way, we will want to find a way to generate some form of cubemap from the local scene to ensure the filler values are accurate. This one is rendered from the perspective of the angel in the middle of the scene. That ensures the lighting is accurate for it, but it would be wrong for objects in different locations. This is not too much of an issue as it is only a filler. The fact that the main values are purple helps to ensure some continuity between the screen-space and cubemap radiosity.
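A sketch of how the confidence value might be used when shading: fall back to the cubemap wherever the screen-space result has poor coverage. The blend itself is an assumption on my part; the demo may well weight things differently.

```cpp
#include <glm/glm.hpp>

// I am assuming a value of 1 means "fully trust the screen-space radiosity";
// the stored value may actually have the opposite polarity.
glm::vec3 resolveIndirect(const glm::vec4& radiosity,  // RGB = bounced light, A = ambient visibility
                          const glm::vec3& envSample)  // filler fetched from the cubemap
{
    float confidence = glm::clamp(radiosity.a, 0.0f, 1.0f);
    return glm::mix(envSample, glm::vec3(radiosity), confidence);
}
```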

The lighting is very similar to what we are using now, except for the environment mapping, which should not be too hard to include. Morgan McGuire has written a paper about simple estimation of correct diffuse and glossy ambient lighting, which is likely the same approach he is using here.

The final shaded scene now looks almost identical to the capture at the top of this page. All data is still stored in HDR using the R11_G11_B10_Float buffers.

Shaded radiosity

The blue border you can see is the guard band which is cropped out later.

Tonemapping 

Finally, tonemapping and a little bit of bloom are applied to the final image.

The bloom works by first tonemapping the image into the correct range before downsampling to half size along both axes, a 4x reduction in pixel count.
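The exact tonemapping curve isn't obvious from the capture, so as a stand-in here is the simple Reinhard operator, which brings values of 10+ back into [0, 1] and explains why the untonemapped frames earlier looked blown out; the bloom downsample alongside it is the usual 2x2 average.

```cpp
#include <glm/glm.hpp>

// Stand-in tonemap: Reinhard maps [0, inf) into [0, 1).
glm::vec3 tonemapReinhard(const glm::vec3& hdr)
{
    return hdr / (hdr + glm::vec3(1.0f));
}

// Half-size downsample for the bloom: one output pixel averages a 2x2 block,
// halving each axis for a 4x reduction in pixel count.
glm::vec3 downsample2x2(const glm::vec3& a, const glm::vec3& b,
                        const glm::vec3& c, const glm::vec3& d)
{
    return (a + b + c + d) * 0.25f;
}
```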

Finally with the bloom added back in we get the resulting image we saw at the start of the frame. Very nice!

final shaded frame

 

Conclusion 

The final result looks pretty cool, especially when you compare it to static environment maps.

dynamic radiosity vs static light probes

 

There is a lot of work to do, but it is not all that bad. As far as I can see, these are the only tasks that need to be completed:

  • Manual mipmap generation for colour and normals
  • Scene lighting computed in HDR
  • Generate radiosity from the lit scene.
  • Shade scene using calculated radiosity (Excluding env map)
  • Add env map to shading
  • Tonemapping
  • Bloom (Optional)

Quite a lot of work, but a lot of it is already in place. Mipmap generation will run the same shader as used for AO, HDR lighting can be achieved by changing a few parameters, and shading the scene with the calculated radiosity should just be an extension of what we are already doing.

The tricky bit will be getting the radiosity calculation correct and properly accumulating it over frames. The environment mapping should also be easy; if worst comes to worst I can just nick the cubemap used for this demo.

By the next entry I believe I should have most of the work done. I will give an explanation of how the indirect lighting works and hopefully show some screenshots.
