Friday, May 29, 2009

Horizontal displacement

As always I did not keep strictly to the plan, and decided to try one of the things I've wanted to do someday - horizontal displacement of terrain.

The fractal map computed for a quadtree node already contains 3 independent fractal noise channels. The first one computes elevation and is seeded from the heightmap data. The other two are used for detail material mixing and other things. There is also a fourth channel containing the global slope.
I modified the shader that computes vertex positions to displace also in the horizontal directions, using one of the two independent fractal channels. The amount of displacement also varies with the global slope - points in flat regions are shifted minimally, but sloped parts that are also treated as rock are displaced a lot. This makes the rocky parts much more interesting. For the record, the actual equation used for displacing a point on a sphere in tangent space is this:

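As a rough sketch of the idea (my own reconstruction, not the actual Outerra formula - the function name, parameters and constants here are hypothetical), slope-scaled displacement in tangent space with the up axis along x might look like:

```python
def displace(p, h, d, slope, k_h=1.0, k_d=0.35):
    """Displace a tangent-space point p = (x, y, z), with x as the local up axis.

    h     - elevation from the first fractal channel (vertical displacement)
    d     - (du, dv) pair from an independent fractal channel (horizontal)
    slope - global slope in [0, 1]; flat terrain barely shifts, rocky slopes shift a lot
    k_h, k_d - made-up scale constants for this sketch
    """
    x, y, z = p
    du, dv = d
    s = k_d * slope          # horizontal displacement amount scales with slope
    return (x + k_h * h,     # vertical displacement along local up
            y + s * du,      # horizontal displacement along tangent
            z + s * dv)      # horizontal displacement along binormal

# flat terrain (slope 0): only vertical displacement survives
print(displace((0.0, 0.0, 0.0), 1.0, (0.5, -0.5), 0.0))   # (1.0, 0.0, 0.0)
```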
The next thing that had to be done was computing the normals of the deformed surface. An article in GPU Gems about deformers provides nice info about the Jacobian matrix that can be used for the job. After some pounding on my math circuits I managed to produce the following Jacobian of the above equation (in tangent space):

The normal is then computed as the cross product of the second and third columns of the Jacobian, since the tangent and binormal in tangent space are {0,1,0} and {0,0,1} respectively.
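As an illustrative sketch of that recipe (my own, not the shader code), the two Jacobian columns can be approximated numerically by differentiating a displacement function along the tangent (y) and binormal (z) directions, then crossing them:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def deformed_normal(f, y, z, eps=1e-4):
    """Approximate the normal of the deformed surface p = f(y, z).

    f maps tangent-plane coordinates (y, z) to a displaced 3D point. The two
    central differences below approximate the 2nd and 3rd columns of the
    Jacobian - the derivatives along the tangent {0,1,0} and binormal {0,0,1}.
    """
    def sub(a, b):
        return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    d_dy = sub(f(y + eps, z), f(y - eps, z))   # ~ Jacobian column 2
    d_dz = sub(f(y, z + eps), f(y, z - eps))   # ~ Jacobian column 3
    n = cross(d_dy, d_dz)
    length = (n[0]**2 + n[1]**2 + n[2]**2) ** 0.5
    return (n[0]/length, n[1]/length, n[2]/length)

# sanity check: an undeformed flat patch has its normal along local up (x)
print(deformed_normal(lambda y, z: (0.0, y, z), 0.0, 0.0))
```

In the shader the Jacobian columns are of course analytic rather than finite differences, but the cross product of columns two and three is the same step.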

Finally, here is the result - on the left side the original with only vertical displacement, on the right side vertical & horizontal displacement:

There are still some issues but the overall effect is quite nice. Of course, collisions with the sloped parts are no longer accurate and I'll have to do something about that later.

Outerra planetary engine

1 comment:

Anonymous said...

Recently I added Bezier quadratic patch support to my own terrain system; it clocked in at 77 instructions (18 of which are texture instructions), counting the world-view and projection transforms after the patch evaluation (I had to preserve the view-space position for my pixel shader). When the position on the surface is known at compile time it optimizes down even further.

If you could afford the extra cycles then you could use that to generate C2 normals and tangents. I had to flip the normals in my implementation because they were all facing inward, but they did all come out on the right axis; same deal with the tangents. The results look amazing - it took around 24:1 (triangles:pixels) to see any artifacts with no filtering on the samples, and I went as far as 400:1 with linear sampling without any artifacts (not even a flat spot where there shouldn't be one).
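(Not the commenter's shader, but for reference: a biquadratic Bezier patch is a tensor product of the quadratic Bernstein basis over a 3x3 control grid. A minimal CPU-side sketch of the evaluation:)

```python
def bezier2_patch(ctrl, u, v):
    """Evaluate a biquadratic Bezier patch at (u, v) in [0,1]^2.

    ctrl is a 3x3 grid of 3D control points. The quadratic Bernstein
    weights are (1-t)^2, 2t(1-t), t^2.
    """
    def basis(t):
        return ((1 - t) * (1 - t), 2 * t * (1 - t), t * t)
    bu, bv = basis(u), basis(v)
    p = [0.0, 0.0, 0.0]
    for i in range(3):
        for j in range(3):
            w = bu[i] * bv[j]
            for k in range(3):
                p[k] += w * ctrl[i][j][k]
    return tuple(p)

# a simple planar control grid: point (i, j, 0) at row i, column j
grid = [[(float(i), float(j), 0.0) for j in range(3)] for i in range(3)]
print(bezier2_patch(grid, 0.0, 0.0))   # corner: (0.0, 0.0, 0.0)
print(bezier2_patch(grid, 0.5, 0.5))   # center: (1.0, 1.0, 0.0)
```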