So, there's a lot to explain in that image, and it's only the second floor. I've moved the old signal processing consoles down into the 'basement', and the upper floor is left for the planetarium/system map.
The earth and moon should be obvious. Both datasets are from NASA: Earth is the 2013 bathymetric "blue marble" image, and the moon is albedo-corrected. The Sun is textured with the 2012 STEREO image (the first simultaneous full-sun image map). And far off to the left is a pair of Planck microwave anisotropy datasets: background radiation in red, and matter density in blue.
Oh yeah, and there's a skybox too, but it's fake. Looks nice though. There'll be an option to turn it off.
The spheres are not actually spheres - in model space they're cubes. I suppose technically they're 'voxels'. Each sphere is raytraced within its cube with correct perspective, but without creating z-buffer geometry, so "hologram" might also be a good word. The spheres look geometrically perfect, but won't intersect each other correctly.
Why go to the trouble, compared with simply creating a tessellated triangle mesh for each sphere? Well, each "sphere" voxel is just eight vertices, with no normal or surface buffers. I can have thousands of spheres - millions - so long as they're stacked in non-overlapping minecraft rows.
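The heart of the trick is a per-fragment ray/sphere intersection test. Here's a minimal CPU-side sketch of that math in TypeScript (the real thing lives in a fragment shader; all names here are hypothetical, for illustration only):

```typescript
// Per-fragment ray/sphere test, as a sphere-imposter shader would run it.
// CPU-side sketch for illustration; names are hypothetical.

type Vec3 = [number, number, number];

const dot = (a: Vec3, b: Vec3) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub = (a: Vec3, b: Vec3): Vec3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];

// Returns the distance t along the ray to the near sphere hit, or null
// if the ray misses (that fragment would simply be discarded).
function raySphere(origin: Vec3, dir: Vec3, center: Vec3, radius: number): number | null {
  const oc = sub(origin, center);
  const b = dot(oc, dir);               // dir assumed normalized
  const c = dot(oc, oc) - radius * radius;
  const disc = b * b - c;
  if (disc < 0) return null;            // ray misses the sphere entirely
  const t = -b - Math.sqrt(disc);       // near intersection
  return t >= 0 ? t : null;
}

// A ray from (0,0,5) looking down -z hits a unit sphere at the origin at t = 4.
console.log(raySphere([0, 0, 5], [0, 0, -1], [0, 0, 0], 1)); // → 4
```

Because the hit point is computed analytically per pixel, the silhouette stays perfectly round at any zoom level - but since no depth is written for the curved surface, two overlapping "spheres" won't clip against each other correctly.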
I'm essentially ready to tackle the next major algorithm, the one which should elevate Astromech to a whole new level: Ray Bundle Adjustment.
Long story short: Ray Bundle Adjustment is a tomography technique; think of it as the advanced successor to the 'backprojection' algorithm used to reconstruct CAT scans. Here's a good paper: http://luthuli.cs.uiuc.edu/~daf/courses/Optimization/Papers/bundleadjust.pdf
What's the application? Given data coming from multiple viewpoints (such as a ring of solar observatory satellites like SOHO, SDO, or STEREO-A and B), one should be able to reconstruct the 3D structure of the observed volume.
Near-real-time 3D maps of the Sun and its near volume (flares, mass ejections) sound like a neat thing. I expect to have something working within the month.
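To make the backprojection idea concrete, here's a toy sketch: a 2D "volume" observed from two orthogonal directions (row sums and column sums), with each measurement smeared back along its ray. This is a deliberately stripped-down stand-in, not the ray bundle adjustment algorithm itself:

```typescript
// Toy unfiltered backprojection on a 2D grid with two orthogonal views.
// Each cell accumulates every measurement whose ray passes through it.

function backproject(rowSums: number[], colSums: number[]): number[][] {
  const img: number[][] = [];
  for (let y = 0; y < rowSums.length; y++) {
    img.push([]);
    for (let x = 0; x < colSums.length; x++) {
      img[y].push(rowSums[y] + colSums[x]); // smear both rays back into the cell
    }
  }
  return img;
}

// A single bright voxel at (row 1, col 2) in a 3x4 volume...
const truth = [
  [0, 0, 0, 0],
  [0, 0, 9, 0],
  [0, 0, 0, 0],
];
const rowSums = truth.map(r => r.reduce((a, v) => a + v, 0));
const colSums = truth[0].map((_, x) => truth.reduce((a, r) => a + r[x], 0));
const recon = backproject(rowSums, colSums);
// ...comes back with its maximum at (1, 2), surrounded by smear artifacts.
console.log(recon[1][2]); // → 18
```

With only two views the reconstruction is full of streaks; adding more viewpoints (more satellites around the ring) and an optimization step over the ray bundle is what sharpens it into real structure.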
Another load off my mind is that I found an excellent solution for one of my other problems: sensible rendering of cluster data containing thousands or millions of points. This is some beautiful work, based on the brilliant idea of leaving a step out. Expect to see something very similar in Astromech as soon as I get to it.
Lastly, I'm investigating going almost fully peer-to-peer for the networking layer using WebRTC. In terms of spec and capabilities for moving video and data streams around, I'd be an idiot not to. I'm putting together a small signalling server, and thinking about adaptive mesh "gossip" networks.
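The relay rule behind a gossip mesh is simple enough to sketch: each peer forwards a message to its neighbours exactly once, using a seen-set to stop the flood from echoing forever. This is a hypothetical illustration of the idea, not Astromech's actual networking code (which would ride over RTCDataChannel):

```typescript
// Gossip-flood relay rule for a peer mesh: forward each message id once.
// Hypothetical sketch; real peers would relay over WebRTC data channels.

class Peer {
  neighbours: Peer[] = [];
  seen = new Set<string>();       // message ids already relayed
  received: string[] = [];

  gossip(id: string, payload: string) {
    if (this.seen.has(id)) return; // duplicate arriving by another path: drop
    this.seen.add(id);
    this.received.push(payload);
    for (const n of this.neighbours) n.gossip(id, payload); // relay onward
  }
}

// Three peers in a triangle: one send reaches every peer exactly once,
// even though each message arrives by two redundant paths.
const [a, b, c] = [new Peer(), new Peer(), new Peer()];
a.neighbours = [b, c];
b.neighbours = [a, c];
c.neighbours = [a, b];
a.gossip("msg-1", "hello mesh");
console.log([a, b, c].every(p => p.received.length === 1)); // → true
```

The "adaptive" part would come from reshaping the neighbour lists as peers join, leave, or turn out to have bad links - the flood rule itself stays the same.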
Google gave an excellent talk on what WebRTC is all about, which I recommend you watch given how much impact that particular spec is going to have on the peer-to-peer internet. Without this technology, I would need some heavy central servers to videoconference a dozen telescopes. Now it can go point-to-point.
That changes many things. Assuming it all works. And it's already in your browser, most likely. There are still Chrome/Moz compatibility issues, but only because the spec is still moving.