Fisheye warping for spherical mirror fulldome projection

Written by Paul Bourke
July 2012


The following is an attempt at a technical description of how fisheye images are warped for use in a dome using the spherical mirror projection technique. There are currently a number of packages that support this warping; they include the author's own warpplayer, World Wide Telescope, Software Bisque, a patch for Quartz Composer, Blender and Unity3D implementations ... and others.

Before introducing the warping, a short discussion of how fisheye and warped fisheye based projections are generally configured in a planetarium (or other domes such as the iDome) is in order. Irrespective of whether the computer incorporates a display (a laptop or an iMac) or is a machine with a graphics card and two graphics pipes, the operator does not (in general) use the dome to navigate but rather a personal flat display. The two most common modes of operation are then as follows:

  • The operator's display and the projected display are mirrored. The GUI elements should not be warped; it is the operator's responsibility to keep such elements out of the view of the audience in the dome. The software developers should provide keystroke or scripting support to facilitate this. An alternative strategy is to place the GUI elements outside the fisheye circle. Since the mirroring is performed by the OS, it is important that the operator's display is of the same or greater resolution than the projector; in most implementations of mirroring the final resolution on the dome will be the lower of the display and projector resolutions.

  • The two displays are not mirrored, in which case the operator view may differ from the projected imagery. With warping this allows the operator's view to be, for example, an unwarped fisheye. The GUI elements should again not be warped. Note that it is not generally optimal to capture the fisheye from the operator's display and warp that to the projector display: the resolution of the warped fisheye would then be limited to the vertical resolution of the operator's display, which will rarely be adequate.

It is the author's advice that the fisheye should be rendered to a texture of sufficient resolution (see later) and then presented/warped onto both the operator and projector displays. Whether the operator's display shows the fisheye and the projector display the warped image, or both show the warped image, is a matter for the software developers based upon ease of implementation and/or performance; both are acceptable. Note that data projectors have the ability to mirror the image vertically or horizontally, so this need not be the concern of the software. However, in order to facilitate compatibility with other software, horizontal mirroring to the projector display may be helpful. In any case the image on the operator display must not be mirrored.

This discussion will start from the position of having a fisheye image; the techniques for creating a fisheye using current real time graphics APIs (eg: OpenGL and DirectX) are a separate discussion. They generally involve either multipass rendering of cubic faces (environment maps) that are then assembled into a fisheye, or a vertex shader. Some discussion of this is given here for Unity3D and here for Blender.

The motivation for this arose from the developers of Stellarium and NightShade, two astronomy visualisation packages that, while they include patchy support for spherical mirror projection, implement it in a different fashion. The proposed method takes the fisheye image, generally rendered to an offscreen texture, and applies that texture to a mesh which implements the desired warping. There are some very important consequences of this:

  • The software does not need to know anything about the optics or geometry of the system, it simply applies the fisheye as a texture to a mesh where the mesh nodes and texture coordinates determine the final warped result.

  • The technique is much more general than just spherical mirror projection. It can be used to warp fisheye images so as to correct for nonlinear relationships between latitude and radius on the fisheye image, it can deal with any truncated fisheye arrangement (as opposed to treating these as special cases), and it can even warp into other shapes such as rectangular rooms (the bedroom planetarium).


The "magic" obviously occurs in the details of the mesh file. The author will include some default mesh files currently in common usage. However, ideally a mesh file is generated with knowledge of the projector/mirror/dome geometry as well as the optics of some of those components. In the current versions of Stellarium and Nightshade, the geometry and optical specification of the projection system is embedded in the code, and the user adjusts a number of parameters until the image looks correct on the dome. The author maintains that the best way to do this is to have an external application that creates warp mesh files (for those cases where the sample mesh files are inadequate); this way all applications can share one warping description. It should be noted that the current parameter set in Stellarium and Nightshade is inadequate on two fronts: there are parameters relevant to the projection optics that are not covered, and the parameters used are somewhat arbitrary in the sense that they do not follow standard methods for describing projector optics. At the moment there are at least three options for creating precise warp mesh files; one is based upon Blender, and another is the author's own meshmapper that is supplied with the warpplayer software.
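As an illustration of what such an external application might do, here is a minimal Python sketch that generates a warp mesh in the plain-text format described later in this document (an image-type line, the grid dimensions, then one "x y u v intensity" line per node). Everything specific here is hypothetical: the function name, the grid size, and in particular the lens model, which assumes the only correction needed is a nonlinear latitude-to-radius mapping of the fisheye, with no mirror or projector geometry involved.

```python
import math

def write_warp_mesh(path, nx=17, ny=17, aspect=1.0,
                    radius_of_zenith_angle=None):
    """Write a type-2 (fisheye) warp mesh whose (u,v) coordinates
    compensate for a nonlinear latitude-to-radius fisheye lens.

    radius_of_zenith_angle maps a zenith angle in [0, pi/2]
    (0 = zenith, pi/2 = horizon) to a normalised image radius in
    [0, 1]; the default is the linear (equidistant) fisheye.
    """
    if radius_of_zenith_angle is None:
        radius_of_zenith_angle = lambda t: t / (math.pi / 2)
    with open(path, "w") as f:
        f.write("2\n")                     # input image type: fisheye
        f.write("%d %d\n" % (nx, ny))      # mesh dimensions Nx, Ny
        for j in range(ny):
            y = -1.0 + 2.0 * j / (ny - 1)
            for i in range(nx):
                x = -aspect + 2.0 * aspect * i / (nx - 1)
                r = math.hypot(x, y)       # ideal (equidistant) radius
                if r > 1.0:
                    # outside the fisheye circle: negative intensity
                    # marks the node as not to be drawn
                    f.write("%g %g 0 0 -1\n" % (x, y))
                    continue
                theta = r * math.pi / 2    # zenith angle at this node
                rl = radius_of_zenith_angle(theta)  # radius on the real lens image
                phi = math.atan2(y, x)
                u = 0.5 + 0.5 * rl * math.cos(phi)
                v = 0.5 + 0.5 * rl * math.sin(phi)
                f.write("%g %g %g %g 1\n" % (x, y, u, v))
```

A real calibration tool such as meshmapper would instead derive (u,v) from the full projector/mirror/dome arrangement; only the structure of the output file is the same.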

Below are some "standard" mesh files which have been used up to now. Note that using a standard warp mesh file means the physical geometry of the projector and mirror needs to be adjusted to suit it, whereas the custom mesh file calibration approach means that a mesh file can be created to target a particular geometric/optical arrangement. In general, standard mesh files are often satisfactory for inflatable domes, whereas fixed domes warrant a better result.

Mesh file   Projector resolutions
4x3         XGA: 1024x768 (Not recommended)
            SXGA+: 1400x1050
5x4         SXGA: 1280x1024
16x9        WXGA: 1280x720 (Not recommended)
            HD: 1920x1080
16x10       UXGA+: 1920x1200

Please note that spherical mirror projection only makes sense for data projection, and then only when the projector is operated in fullscreen mode. The form of the mesh is not a function of the projector resolution but rather of the aspect ratio. This explains why there are only three standard warp files listed above; they correspond to the three most prevalent data projector aspect ratios. An aspect of 5x4 is not included as such projectors are relatively unusual today and not ideal for spherical mirror projection.

If the reader downloads one of the warp mesh files above they will be observed to be plain ascii text files. The first line indicates the input image type, which is "2" for fisheye images; remember this basic technique can be used for input image projections other than fisheye, for example standard perspective, spherical (equirectangular), cylindrical, etc. In the case of a planetarium warping a fisheye image the first line is always "2"; this can optionally be used as a test that a chosen warp file is appropriate. The second line contains two numbers indicating the dimensions of the 2D mesh: the first is the number of nodes horizontally (Nx) and the second the number of nodes vertically (Ny). This and other aspects are illustrated in the next figure. Each subsequent line of the file contains 5 numbers; they are:

  • The position of the node (x,y). These are given in normalised screen coordinates, so the horizontal range will be -aspect to aspect and the vertical range -1 to 1. Normalised coordinates ensure that if the mesh is viewed with an orthographic camera of width -aspect to aspect horizontally and -1 to 1 vertically then the mesh will fill a screen of the same aspect. In OpenGL this camera might be something like:

       glViewport(0,0,width,height);
       glOrtho(-width/(double)height,width/(double)height,-1.0,1.0,0.1,10);
    
  • The texture coordinate (u,v) of the node; each of these ranges from 0 to 1 and indexes the fisheye image that will be applied to the mesh.

  • A multiplicative intensity value in the range 0 to 1. This allows one to cope with brightness control for different light path lengths and densities on the dome (for example: fading the image towards the "back" of the dome). Note that a negative intensity value indicates the mesh node should not be drawn; indeed the whole mesh grid quad (or its triangles) should not be drawn.


Note that while the (x,y) positions of the nodes here form a regular grid, they need not. Sometimes it is easier to implement the warp with a regular (x,y) grid and varying (u,v), sometimes a variable (x,y) arrangement is preferable, even though it is still topologically a grid.
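A minimal reader for this file format might look as follows in Python (a sketch only; the function name and in-memory layout are this author's choices, not part of any standard):

```python
def read_warp_mesh(path):
    """Parse a warp mesh file: image type, grid size (Nx, Ny), then
    Nx*Ny lines of (x, y, u, v, intensity), row by row."""
    with open(path) as f:
        tokens = f.read().split()
    itype = int(tokens[0])
    if itype != 2:                      # "2" = fisheye input image
        raise ValueError("not a fisheye warp mesh (type %d)" % itype)
    nx, ny = int(tokens[1]), int(tokens[2])
    vals = list(map(float, tokens[3:3 + 5 * nx * ny]))
    # nodes[j][i] = (x, y, u, v, intensity)
    nodes = [[tuple(vals[5 * (j * nx + i): 5 * (j * nx + i) + 5])
              for i in range(nx)]
             for j in range(ny)]
    return nx, ny, nodes
```

Per the description above, a node carrying a negative intensity marks the quads (or triangles) that use it as not to be drawn; the reader simply stores the value and leaves that decision to the renderer.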

There are only two parameters the user needs to specify; they are:

  • The resolution in pixels of the offscreen texture to which the fisheye is being rendered. The exact value may be modified for performance reasons, but for HD or UXGA+ projector resolutions the recommended default would be 2048 (if powers of two are required) or 1600 pixels (if they are not).

  • The name of the mesh warp file, optionally including the path if that is not prescribed.

Mouse selection

Mouse clicking on a warped fisheye needs some special handling. While there are a number of different ways object selection can be handled in real time APIs, it is generally the case that they require the pixel position before warping. It will be assumed here that the application can already handle mouse clicks and object selection in fisheye space; the following describes how to derive the pixel position in the fisheye given a pixel position in the warped fisheye.

Given the convention in the above figure the procedure is as follows.

  • Let (iw,jw) be the clicked pixel position in the warped image. They range from 0 to the display width and height respectively.

  • Convert these to normalised screen coordinates xw and yw as follows
    xw = 2 * aspect * (iw / displaywidth - 1/2)
    yw = 2 * (jw / displayheight - 1/2)

  • Now xw and yw are in the same coordinate system as the (x,y) coordinates of the mesh, one now needs to find the quad or the triangle of the mesh that contains the point (xw,yw).

  • Once the quad or triangle has been found, the texture coordinate corresponding to the point (xw,yw) can be interpolated from the relative (x,y) position of the point within the quad or triangle and the (u,v) texture coordinates at each vertex.

  • This texture coordinate (uf,vf) is the index into the fisheye image; it just needs to be scaled to give the image coordinate (if,jf) for the fisheye. Since uf and vf both range from 0 to 1 this is simply
    if = fisheyewidth * uf
    jf = fisheyeheight * vf
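The whole procedure can be sketched in Python as below, with the mesh stored as nodes[j][i] = (x, y, u, v, intensity). The names are hypothetical, and it is assumed that jw increases in the same vertical direction as the mesh y coordinate (flip it first if the windowing system places the origin at the top).

```python
def warped_to_fisheye(iw, jw, displaywidth, displayheight,
                      nodes, fisheyewidth, fisheyeheight):
    """Map a click (iw, jw) on the warped display back to a fisheye
    pixel (if, jf).  Returns None if the click misses the mesh."""
    aspect = displaywidth / float(displayheight)
    # normalised screen coordinates: x in [-aspect, aspect], y in [-1, 1]
    xw = 2.0 * aspect * (iw / float(displaywidth) - 0.5)
    yw = 2.0 * (jw / float(displayheight) - 0.5)
    ny, nx = len(nodes), len(nodes[0])
    for j in range(ny - 1):
        for i in range(nx - 1):
            quad = (nodes[j][i], nodes[j][i + 1],
                    nodes[j + 1][i + 1], nodes[j + 1][i])
            # split the quad into two triangles and test each
            for tri in (quad[0:3], (quad[0], quad[2], quad[3])):
                uv = _interp_uv(tri, xw, yw)
                if uv is not None:
                    uf, vf = uv
                    return int(fisheyewidth * uf), int(fisheyeheight * vf)
    return None

def _interp_uv(tri, px, py):
    """Barycentric containment test and (u, v) interpolation."""
    (x0, y0, u0, v0, _), (x1, y1, u1, v1, _), (x2, y2, u2, v2, _) = tri
    d = (y1 - y2) * (x0 - x2) + (x2 - x1) * (y0 - y2)
    if d == 0:
        return None
    a = ((y1 - y2) * (px - x2) + (x2 - x1) * (py - y2)) / d
    b = ((y2 - y0) * (px - x2) + (x0 - x2) * (py - y2)) / d
    c = 1.0 - a - b
    if min(a, b, c) < 0 or max(a, b, c) > 1:
        return None                      # point is outside this triangle
    return (a * u0 + b * u1 + c * u2,
            a * v0 + b * v1 + c * v2)
```

A linear scan over the quads is shown for clarity; a production implementation would cache the last containing quad or use a spatial index, and would skip quads flagged with negative intensity.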