The complete dataset for Earth at 76m sampling is 192GB; after wavelet compression it shrinks to roughly 14GB. That is still too much for direct download, and it will grow slightly once more detailed data replace the parts at higher latitudes that are currently only coarsely defined. One possible solution would be to use coarser sampling, but that rapidly washes out the nice terrain features that the fractal refinement cannot easily recreate.
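A quick back-of-envelope check of these numbers, assuming one 16-bit elevation sample per 76m grid cell (the actual storage format is an assumption, so some overhead on top of this is expected):

```python
import math

# Rough estimate of the raw dataset size for Earth at 76 m sampling,
# assuming 2 bytes (16 bits) per elevation sample.
R = 6_371_000.0                   # mean Earth radius in metres
area = 4.0 * math.pi * R * R      # surface area, ~5.1e14 m^2
samples = area / (76.0 * 76.0)    # ~8.8e10 grid cells
raw_gb = samples * 2 / 1e9        # -> ~177 GB raw
ratio = 192.0 / 14.0              # quoted compression ratio, ~14x
print(f"~{raw_gb:.0f} GB raw, ~{ratio:.1f}x wavelet compression")
```

The ~177GB estimate lands in the same ballpark as the quoted 192GB; the difference plausibly comes from format overhead or tile borders.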
Fortunately, it seems we will be able to use the 76m dataset after all, since progressive download works quite nicely.
Wavelet compression of elevation data helps here because it not only compresses the data well but, perhaps more significantly, arranges them in layers by level of detail. A particular level of detail for a quad-tree node can then be downloaded only when it is needed.
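The idea can be sketched roughly like this: each quad-tree node tracks which wavelet detail layers it has cached, and the streamer fetches only the missing layers up to the level the camera distance demands. All names and the distance heuristic below are my assumptions, not the engine's actual code:

```python
import math

class QuadNode:
    """Hypothetical quad-tree node holding cached wavelet detail layers."""
    def __init__(self, level, size_m):
        self.level = level          # 0 = coarsest
        self.size_m = size_m        # node edge length in metres
        self.loaded_levels = set()  # detail layers already downloaded

def required_level(node, camera_dist_m, max_level=13):
    # Crude heuristic: drop one detail level each time the camera
    # distance doubles relative to the node size.
    if camera_dist_m <= node.size_m:
        return max_level
    drop = int(math.log2(camera_dist_m / node.size_m))
    return max(0, max_level - drop)

def stream(node, camera_dist_m, fetch):
    """Download only the missing detail layers up to the required level."""
    target = required_level(node, camera_dist_m)
    for lvl in range(target + 1):
        if lvl not in node.loaded_levels:
            fetch(node, lvl)        # e.g. an HTTP range request or torrent piece
            node.loaded_levels.add(lvl)
```

A second pass over the same node downloads nothing, which is what makes the scheme progressive: moving the camera only ever pulls in the layers it does not already have.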
The result is that landing on the ground in a mountainous area (where the data are largest) requires around 30-40MB of compressed data to be downloaded progressively and cached. Additional data (3-8MB) are needed only after traveling some 300km from that spot. So it seems that even a camera following a cruise missile can be handled, though at that speed it could probably skip the most detailed data anyway.
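A byte-bounded tile cache is one way to keep the resident set near those 30-40MB; a minimal sketch with LRU eviction (the eviction policy and all names are my assumption, not a description of the engine):

```python
from collections import OrderedDict

class TileCache:
    """Least-recently-used cache for downloaded detail layers,
    bounded by total size in bytes (policy is an assumption)."""
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.tiles = OrderedDict()          # key -> (data, size)

    def get(self, key):
        if key in self.tiles:
            self.tiles.move_to_end(key)     # mark as recently used
            return self.tiles[key][0]
        return None

    def put(self, key, data):
        size = len(data)
        if key in self.tiles:
            self.used -= self.tiles.pop(key)[1]
        # Evict least-recently-used tiles until the new one fits.
        while self.tiles and self.used + size > self.capacity:
            _, (_, old_size) = self.tiles.popitem(last=False)
            self.used -= old_size
        self.tiles[key] = (data, size)
        self.used += size
```

With a budget like `TileCache(40 * 1024 * 1024)`, tiles far behind the camera get evicted first as new ones stream in.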
Lastly, the downloader uses the libtorrent library, so the data can be downloaded over p2p, initially from an HTTP seed.
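The HTTP-seed part rests on web seeding (BEP 19, which libtorrent supports): each torrent piece corresponds to a plain byte range of the underlying file, so any ordinary HTTP server that honors Range requests can bootstrap the swarm before peers exist. A small illustration of that mapping (this is the mechanism, not libtorrent's actual API):

```python
def piece_byte_range(piece_index, piece_size, total_size):
    """Map a torrent piece to the inclusive byte range an HTTP seed
    would serve via a Range request (BEP 19 web seeding)."""
    start = piece_index * piece_size
    if start >= total_size:
        raise IndexError("piece beyond end of file")
    end = min(start + piece_size, total_size) - 1
    return start, end

def range_header(piece_index, piece_size, total_size):
    """Format the HTTP Range header for a given piece."""
    start, end = piece_byte_range(piece_index, piece_size, total_size)
    return f"bytes={start}-{end}"
```

The last piece is simply clamped to the file size, which is why a dumb HTTP mirror needs no torrent-specific logic at all.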
The finer sampling shows mainly in mountainous areas, where high-frequency features are most common.
Here's a comparison of the High Tatras as rendered by the engine and as they appear in nature.
We are currently working on a material mixer that will let us assign different materials to different climatic bands, so the peaks here will no longer be covered by grass, and mountain pine will grow at higher elevations too.
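One way such a mixer might pick a material per climatic band is to shift elevation thresholds with latitude; the band names and numbers below are entirely made up for illustration:

```python
def pick_material(elevation_m, latitude_deg):
    """Toy climatic-band material selection: the snow line and the
    mountain-pine band move lower toward the poles. All thresholds
    here are invented for illustration, not engine values."""
    shift = abs(latitude_deg) * 25.0   # bands sit lower at high latitudes
    snow_line = 2800.0 - shift
    pine_line = 1800.0 - shift         # mountain-pine band
    grass_line = 1200.0 - shift
    if elevation_m >= snow_line:
        return "rock_snow"
    if elevation_m >= pine_line:
        return "mountain_pine"
    if elevation_m >= grass_line:
        return "alpine_grass"
    return "forest"
```

With a scheme like this, a 2,500m peak at the latitude of the High Tatras ends up as bare rock and snow rather than grass, which is the kind of correction the mixer is meant to make.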