Wednesday, May 27, 2015

Evaluation of 30m elevation data in Outerra

In September 2014 it was announced that the 30m (1") SRTM dataset would be released with global coverage (previously it was available only for the US region). This was eagerly awaited by a lot of people, especially simulator fans, as the 90m data lack the finer erosion patterns.

The final release is planned for September 2015, but a large part of the world is already available, with the exception of Northeast Africa and Southwest Asia. I decided to do some early testing of the data, and here's a comparison video showing the differences. Atmospheric haze was lowered for the video to make the differences more visible:

Depending on the location, the 30m dataset can be a huge enhancement, adding lots of smaller-scale detail.

Note that "30m" actually refers to 38m OT dataset that was created by re-projecting to OT quad-sphere mapping and fractal-resampling from 30m (1") sources, while the original dataset is effectively a 76m one, produced by bilinear resampling (which washes out some details by itself).

Here are also some animated pics, switching between 38/30m and 76/90m data:

As can be seen, detail is added both on the flat parts and on the slopes. The increased detail roughly triples the resulting dataset size, from 12.5GB to 39GB, excluding the color data, which remain the same.

However, an interesting thing is that a 76/30 dataset (76m fractal resampling from the 30m sources) still looks way better than the original dataset made from the 90m data, while staying roughly the same size. The following animation shows the difference between the 76/30 and 38/30 data:

The extra detail visible on the OT terrain in the 38/30 data is actually not fully caused by differences in the source detail; it seems that the procedural refinement, which continues producing finer detail, is overreacting and producing a lot of small dips, as can be seen in the snow pattern.

Quality of the data

The problem seems to be that the 30m SRTM data are still noticeably filtered and smoothed. When the resampled 38m grid is overlaid on some known mountain crests, it's obvious that they are way smoother than the real ones. This has negative consequences when the data are further refined procedurally, because the elevations coming from the real data are already missing a part of the spectrum due to the filtering.

The effective resolution actually seems to be closer to 90m, but it's still way better than the 90m source data, which were produced via a 3x3 averaging filter - meaning an even worse loss of detail.
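
For illustration, that 3x3 averaging amounts to something like the following minimal GLSL sketch (sample90, src30 and the texel layout are illustrative, not the actual SRTM processing code):

// one 3" (90m) output sample as the plain mean of a 3x3 block of
// 1" (30m) input samples - a box filter that discards the high
// frequencies entirely
float sample90(sampler2D src30, ivec2 base)
{
    float sum = 0.0;
    for (int j = 0; j < 3; ++j)
        for (int i = 0; i < 3; ++i)
            sum += texelFetch(src30, base + ivec2(i, j), 0).r;
    return sum / 9.0;
}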

The 30m sources can definitely serve to make a better 76m OT base world dataset, but I'm not certain the increased detail alone justifies the threefold increase in size compared to the 76/30 compilation (as opposed to the original 76/90 one).
There are still things that can be done with the 30m data, though. For example, attempting to reconstruct the lost detail by applying a crest-sharpening filter to the source data, as sketched below. We can also use even more detailed elevation data as inputs to the dataset compiler where possible, while keeping the output resolution at 38 meters.
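
Such a crest-sharpening pass could, for instance, take the form of an unsharp mask; a hedged sketch (the kernel, the name sharpen and the strength parameter k are assumptions, not the actual filter we'd ship):

// hypothetical unsharp mask on elevations: amplify the residual between
// a sample and its local 3x3 average to put back some of the spectrum
// lost to the SRTM filtering
float sharpen(sampler2D elev, ivec2 p, float k)
{
    float c = texelFetch(elev, p, 0).r;
    float avg = 0.0;
    for (int j = -1; j <= 1; ++j)
        for (int i = -1; i <= 1; ++i)
            avg += texelFetch(elev, p + ivec2(i, j), 0).r;
    avg /= 9.0;
    return c + k * (c - avg);   // k > 0 boosts crests and dips
}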


Apart from the filtering problem, there are some other issues that show up in the new data, some of which are present in the old dataset as well but grew worse with the increased detail.

The first issue is various holes and false values in the acquired radar data, caused by some regions lying under heavy clouds during all the passes of the SRTM mission. While many of these cases were fixed in the 90m data (except for some patterns in mountainous regions), the new 30m sources still contain plenty of them. It might be useful to create an in-game page where users can report these bugs, crowd-sourcing the fixes.

Another issue is that dense urban areas with tall or large buildings have them baked into the elevations. This was present in the 90m data as well, but here it's more visible. For example, this is the Vehicle Assembly Building in Launch Complex 39:

The plan is to filter these out using urban-area masks, which will also be useful for leveling the terrain in cities. One potential problem is that the urban mask data are available only at a considerably coarser resolution than the elevation data, which may cause some unwanted effects.

Bathymetric data

Together with the higher resolution land data, we also went on to use enhanced-precision ocean depth data, recently released at 500m resolution. The previously used dataset had a resolution of 1km, which was insufficient, especially in coastal areas.

Unfortunately, the effective resolution of these data is still 1km or worse in most places, and the way the data were upscaled introduces nasty artifacts, since OT now takes them as authoritative and cannot refine them using its fractal (and much more natural-looking) algorithms. The result is actually much worse than the original 1km sources (a Hawaiian island):

Just as with the land data, artificial resampling only makes things worse. Fortunately, for important coastal areas there are plenty of other sources with much finer resolution that we can combine with the global 1km bathymetric data. This is how a preliminary test of these sources looks (ignore the land fills in the bays):

Sunday, December 7, 2014

TitanIM, an Outerra-based military simulation platform

Revealed at I/ITSEC (Orlando, Dec 1-5, 2014), the world's largest modeling, simulation and training conference oriented toward military use, TitanIM is a new simulation platform built on top of the Outerra engine, utilizing its ability to render the whole planet with the full range of detail levels, from space down to the blades of grass.

Military simulation has always been one of the best-fitting areas for the use of the engine. Unlike most other procedural engines, Outerra focuses on using real-world data, enhancing them by seamless procedural refinement, which allows it to render accurate geography with first-person-level ground detail without requiring an extraordinary amount of streamed data to achieve geo-typical terrain. The supported scale range allows it to combine all types of simulation environments into a single world, and eventually into a single battlefield, which is something highly desired in this field.

Over the years we have been in contact with several companies in the military simulation business that were interested in using the technology. As many people probably know, Bohemia Interactive Simulations (BIS), maker of VBS, is a major player in the "serious games" field. What is probably less known is that the company was originally founded as Bohemia Interactive Australia by David Lagettie, an Australian who saw the potential of the Operation Flashpoint game and went on to use it for military simulation and training software, which soon saw widespread adoption.

Later, around the time BIA was relocated to Prague, he left and founded Virtual Simulation Systems (VSS), a company developing all kinds of simulation hardware used in weapon and vehicle/aircraft simulators. Several of these were actually used at the I/ITSEC demo, shown on the screens below.

A new era: TitanIM/Outerra

TitanIM is a company founded by David Lagettie to develop a simulation platform based on the Outerra engine, in close cooperation with us. Right now the Outerra engine isn't generally available, as it's still in the development phase, so any early projects have to be developed with our direct participation. We have already worked with TitanIM for some time, providing a specialized API and the functionality they require for the specific tasks of that domain. This effort culminated at this year's I/ITSEC conference, where TitanIM was officially revealed, although several projects had committed to using the Titan platform even before the official launch.

Here's a quickly made video showing some (though not all) of the simulators demoed at I/ITSEC:

The Titan booth was shared with two well-known companies that are already using Titan for their hardware simulators: Laser Shot and Intelligent Decisions (ID), showing the diversity of applications even in this early phase.

A couple of photos of the simulators demoed:

Complete Aircrew Training System (CATS) with UH-60 Helicopter simulator

Boat platform, taking data from Outerra vehicle simulation and driving the platform servos.

Phil inside the F18 simulator using Oculus DK2, with accurate cockpit construction matching the rendered 3D model.

Overall it was a great success, with the whole Titan team working hard to get everything connected and working. These guys are seriously dedicated and insanely hard-working; Phil (TitanIM co-founder and COO) had to be forcibly sent off to get a bit of sleep after running for three days without rest, with the other guys usually getting only short naps as well.

We also decided to grant TitanIM an exclusive license to the Outerra engine for military use (direct or indirect use by the military) to secure its position, since we are already participating in it quite closely. This probably won't be good news for some other interested parties, but as many people point out, competition can only be good in this field. With the Outerra engine powering TitanIM, a globally integrated simulation platform is possible for the first time, connecting all simulation areas - space, air, ground and water - into a single limitless world.

What does this mean for Outerra? Apart from gaining an experienced partner handling the simulation aspects that we could not cover by ourselves, lots of the work done for Titan will also flow back into the Outerra engine and our games and simulators. We are also getting access to other connected companies, especially the hardware makers, making the engine more robust and universal in the process. It has also allowed us to grow and hire more people into our office in Bratislava, and the results will be showing up soon.

Sunday, May 11, 2014

Double precision approximations for map projections in OpenGL

(written mainly for my own reference, but it's possible someone else will find it useful as well)

The problem: the OpenGL extension ARB_gpu_shader_fp64, which brings support for double-precision operations on the GPU, specifically states that double-precision versions of the angle, trigonometry, and exponential functions are not supported. All we get are the basic add/mul operations and sqrt.
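
For reference, the double-precision code below assumes a GLSL context where fp64 is available, either via the core version or the extension:

#version 400 core
// or, on older GLSL versions: #extension GL_ARB_gpu_shader_fp64 : enable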

When rendering planetary-scale geometry you'll often hit numerical precision issues, yet it's a good idea to avoid using 64-bit floats altogether because they come at a price. The performance hit is usually even larger than it has to be, as some vendors intentionally cripple the double-precision operations to create a greater gap between their consumer and professional GPU lines.

In some cases, though, it's not worth the trouble of trying to solve both the performance and the precision problems at once, for example in tools that process real-world data or project maps. These usually need trigonometric functions when converting between different map projection types, and a higher precision than 32-bit floats can give. Unless one can (or wants to) use OpenCL interoperation, here are some tricks for reducing the equations involved to just the basic supported double-precision operations.

fp64 atan2 approximation

In our case we needed to implement the geographic and Mercator projections. Both need the atan (or better, atan2) function, which we'll get through a polynomial approximation.

An article on the Lol Engine blog colorfully explains why it's not a good idea to use Taylor series for approximating functions over an interval, and compares it with the Remez/minimax method. What's nice is that the good folks there also released a tool for computing minimax approximations, called the Remez exchange toolbox (lolremez). Using this tool we can get a good atan approximation with a custom adjustable polynomial degree / maximum error.

Here's the source code for the atan2 approximation with error less than 5⋅10⁻⁹, computed using the lolremez tool. An angular error of 5⋅10⁻⁹ radians translates to ~3cm on the Earth's surface (5⋅10⁻⁹ × Earth's ~6371km radius ≈ 3.2cm), which is very good for our purposes.

// atan2 approximation for doubles for GLSL
// using a minimax polynomial computed with the lolremez tool

double atan2(double y, double x)
{
    // the ten minimax coefficients produced by lolremez
    // (the coefficient values were elided in this listing)
    const double atan_tbl[] = {
        /* atan_tbl[0] ... atan_tbl[9] */
    };

    /* argument reduction:
       arctan(-x)  = -arctan(x)
       arctan(1/x) = 1/2 * pi - arctan(x), when x > 0 */

    double ax = abs(x);
    double ay = abs(y);
    double t0 = max(ax, ay);
    double t1 = min(ax, ay);
    double a = 1.0LF / t0;
    a *= t1;                        // a = min/max, in [0, 1]

    // evaluate the odd polynomial a + a^3*(c0 + c1*s + ... + c9*s^9),
    // s = a^2, with a chain of fused multiply-adds (Horner scheme)
    double s = a * a;
    double p = atan_tbl[9];

    p = fma( fma( fma( fma( fma( fma( fma( fma( fma( fma(p, s,
        atan_tbl[8]), s,
        atan_tbl[7]), s,
        atan_tbl[6]), s,
        atan_tbl[5]), s,
        atan_tbl[4]), s,
        atan_tbl[3]), s,
        atan_tbl[2]), s,
        atan_tbl[1]), s,
        atan_tbl[0]), s*a, a);

    // undo the argument reduction and resolve the quadrant
    double r = ay > ax ? (1.57079632679489661923LF - p) : p;

    r = x < 0.0LF ?  3.14159265358979323846LF - r : r;
    r = y < 0.0LF ? -r : r;

    return r;
}
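
With the approximation in place, the geographic projection itself is straightforward; a small usage sketch (pos being a dvec3 ECEF position with the usual axis convention; both names are assumed here):

// longitude/latitude (in radians) from an ECEF position, using the
// fp64 atan2 approximation above
double lon = atan2(pos.y, pos.x);
double lat = atan2(pos.z, sqrt(pos.x*pos.x + pos.y*pos.y));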

Mercator projection

While the geographic projection can make do with the above atan2 function to compute the longitude and latitude, Mercator uses a different mapping for the y-axis to preserve the shapes:

    y = ln( tan(π/4 + φ/2) )

where φ is the latitude angle. Bad luck: neither the ln nor the tan function is available with double arguments. A logarithm approximation doesn't give good precision with a reasonable polynomial degree, but it turns out we actually don't need to approximate either ln or tan.

It follows that:

    tan(π/4 + φ/2) = sec φ + tan φ = √(tan²φ + 1) + tan φ

so, with k = tan φ, the mapping becomes y = ln( √(k² + 1) + k ).

Given that, going from 3D coordinates we can get the value of tan φ directly (assuming an ECEF coordinate system), since z is the height above the equatorial plane and √(x² + y²) is the distance from the polar axis:

    tan φ = z / √(x² + y²)

Meaning that the inner logarithm expression now consists solely of operations that are supported by the ARB_gpu_shader_fp64 extension.

Finally, we can get rid of the logarithm itself by utilizing the identity:

    ln(a) - ln(b) = ln(a/b)

and using the single-precision logarithm function to compute a delta value from some reference angle computed on the CPU. Typically that means computing √(k² + 1) + |k| for the screen-center coordinate, and using the computed logarithm difference directly as the projected screen y coordinate.

Since the resulting ratio a/b is very close to 1 in the cases when we need the extra double precision (zoomed in), we can even use the crudest ln approximation around 1, ln(1+x) ≅ 2x/(2+x) with x = a/b - 1, evaluated in double arithmetic.
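
Putting it together, here is a hedged sketch of how the pieces can combine (not necessarily the exact OT shader): the Mercator y of a point relative to the screen center, using only fp64 add/mul/sqrt plus a single-precision log. For simplicity it assumes the view stays on one side of the equator (k ≥ 0); m_ref = √(k_ref² + 1) + k_ref comes precomputed from the CPU.

// sketch: relative Mercator y from ECEF using only fp64 add/mul/sqrt
// and a single-precision log (m_ref is precomputed on the CPU)
double mercator_y_delta(dvec3 p, double m_ref)
{
    // tan(latitude) straight from ECEF: k = z / sqrt(x^2 + y^2)
    double k = p.z / sqrt(p.x*p.x + p.y*p.y);

    // inner Mercator expression: tan(pi/4 + phi/2) = sqrt(k^2 + 1) + k
    double m = sqrt(k*k + 1.0LF) + k;

    // ln(m) - ln(m_ref) = ln(m/m_ref); the ratio is ~1 when zoomed in,
    // so even a single-precision log keeps the delta accurate
    return double(log(float(m / m_ref)));

    // alternative, in full doubles via the approximation around 1:
    //   double x = m / m_ref - 1.0LF;
    //   return 2.0LF * x / (2.0LF + x);
}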