ODSP (Omni Directional Stereoscopic Panorama) Cylindrical Camera for PovRay
Written by Paul Bourke
October 2025
The following describes how to implement an ODSP (Omni Directional Stereoscopic Panorama)
cylindrical camera in Povray version 3.8, in particular the type of ODSP panorama required
for a cylindrical display that is stereoscopic capable.
The details of an ODSP cylindrical panorama
will not be described here; details can be found
here, and physical camera methods
here.
The particular display being developed for
is an LED based 360 degree cylinder, approximately 8m in diameter and 4m high.
The resolution is 12816 pixels around the circumference and 2048 pixels high, each pixel at approximately 2mm pitch.
Seen from the centre of the cylinder the radius in pixel units is 12816/(2 pi), so the vertical
field of view is 2*atan(1024 / (12816/(2 pi))) = 2*atan(pi*2048/12816) = 53.3 degrees.
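As a quick check, the same figure can be reproduced with a few Povray declarations (a standalone sketch, not part of the camera code below; the identifier names are arbitrary):

// Standalone sketch: verify the vertical field of view from the panel counts
#declare PIX_AROUND = 12816;                        // pixels around the circumference
#declare PIX_HIGH = 2048;                           // pixels vertically
#declare RADIUS_PIX = PIX_AROUND / (2*pi);          // cylinder radius expressed in pixels
#declare VFOV = 2*atan(0.5*PIX_HIGH / RADIUS_PIX);  // = 2*atan(pi*2048/12816)
#debug concat("Vertical FOV = ", str(degrees(VFOV), 0, 1), " degrees\n") // reports 53.3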
A bit of history ...
In 2007, version 3.6 of Povray was modified by the author to support a range of stereoscopic
displays, including stereoscopic enabled cylindrical displays, which at the time were multi-projector
based. Details of this can be found here.
Another, earlier approach used by the author was to render narrow slits as the cameras rotated
about their central axis. Each of these slits was then concatenated together to form
the two 360 degree panoramas, one for each eye. This was first used by the author in 2002, described
here.
The main issue with this approach is its inefficiency: each panorama requires hundreds of narrow slit
renderings, and Povray must parse the scene for each render.
In Povray version 3.7 the mesh_camera was introduced. This allows one to define a triangle for
each pixel in the image; the ray for that pixel has its origin at the center of the triangle
and a direction corresponding to the normal of the triangle. While this is a cunning idea, in
practice it has two drawbacks. First, high resolution images contain a large
number of pixels, and therefore triangles, so parsing can become a significant portion
of the total rendering time. Second, there is no clean way to perform high quality
antialiasing (at least at the time of writing).
A number of forks from the main Povray branch have implemented ODSP cameras. Most implement
full equirectangular panoramas because of their applicability to head mounted displays,
more commonly referred to as VR (virtual reality) headsets.
For the application here one could of course extract a cylindrical panorama from an
equirectangular panorama; this is, for example, an option for
filmed content using 360 stereoscopic video cameras.
But rendering the full equirectangular panorama is
wasteful if only a more limited vertical field of view is needed.
For example, the LED cylinder that motivated this exercise has only a 53.3 degree
vertical field of view.
It also turns
out that an ODSP pair for VR headsets typically sets the zero parallax at infinity
(or at least at the distance of the furthest object in the scene),
whereas for screen based systems the zero parallax should be at a fixed
distance from the camera, that is, at the screen distance.
The fundamental requirement for a raytracer camera is to be able to define an
origin and direction for each pixel in the final image. For many camera types, like
a standard perspective camera, the origin for each ray is identical and the direction
vectors diverge from that position.
For an orthographic camera the origin for each ray is different (they all lie
on a plane) and the direction vectors are all identical.
For an ODSP camera both the ray origin and direction vector vary across the
panorama image.
At the time of writing Povray 3.8 is still in beta, but it has an extremely powerful
camera type called "user_defined". This camera type allows one to specify exactly
the origin and direction vector for each pixel. The equations can be seen in the code
below.
#version 3.8;
/*
Experiment with ODSP (cylindrical) using PovRay user_defined camera in version 3.8
Designed and tested on the Baptist University LED cylinder in Hong Kong: 8m diameter, 4m high
Deals with left and right eye separately, suggest ffmpeg to create top/bottom arrangement
Handles positioning of zero parallax, typically at screen distance or infinity
Supports both a left and right handed coordinate system
The camera rig can be panned in order to set where "front" (y axis) corresponds to in the panorama
All length units in meters
*/
#declare CYLRADIUS = 4.0;
#declare CYLHEIGHT = 4.0;
#declare ODSP_camerax = 0; // View position
#declare ODSP_cameray = 0;
#declare ODSP_cameraz = CYLHEIGHT/2;
#declare ODSP_eyesep = 0.065/2; // Half the eye separation
#declare ODSP_handed = -1; // 1 for a left handed coordinate system, -1 for right handed
#declare ODSP_whicheye = -1; // -1 for left, 1 for right, 0 for center
#declare ODSP_vfov = 2*atan(pi*image_height/image_width); // Based on image dimensions, not cylinder dimensions
#declare ODSP_parallax = 4.0; // Distance to zero parallax
// Set to very large number for infinity
#declare ODSP_parangle = atan(ODSP_eyesep / ODSP_parallax); // Inward rotation of each eye ray so that matching left/right rays cross at the zero parallax distance
#declare ODSP_panangle = 180*pi/180; // Pan the camera rig
// Mostly to control where "front" is on the panorama
// ODSP cylindrical panorama
camera {
   user_defined
   location {
      // Each eye's ray origin lies on a circle of radius ODSP_eyesep about the rig position
      function { ODSP_camerax + ODSP_whicheye * ODSP_eyesep * cos(x * 2*pi + ODSP_panangle) * ODSP_handed }
      function { ODSP_cameray + ODSP_whicheye * ODSP_eyesep * sin(x * 2*pi + ODSP_panangle) }
      function { ODSP_cameraz }
   }
   direction {
      // Tangential ray direction, rotated inward by ODSP_parangle to set the zero parallax distance
      function { sin(x * 2*pi - ODSP_whicheye * ODSP_parangle + ODSP_panangle) * ODSP_handed }
      function { -cos(x * 2*pi - ODSP_whicheye * ODSP_parangle + ODSP_panangle) }
      // Vertical component spans the chosen vertical field of view
      function { y * 2 * tan(0.5 * ODSP_vfov) }
   }
}
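The listing above fixes the eye with a #declare. One convenient variation, a sketch only and not part of the original listing (the file names here are placeholders), is to guard that declaration with #ifndef so the two eye renders can be driven from the command line and then stacked into the top/bottom arrangement mentioned in the header comment:

// Sketch: allow the eye selection to be overridden from the command line, e.g.
//    povray +Iodsp.pov +Oleft.png  Declare=ODSP_whicheye=-1
//    povray +Iodsp.pov +Oright.png Declare=ODSP_whicheye=1
// then, for example, ffmpeg -i left.png -i right.png -filter_complex vstack topbottom.png
#ifndef (ODSP_whicheye)
   #declare ODSP_whicheye = -1; // -1 for left, 1 for right, 0 for center
#end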
The camera code above is based upon the conventions illustrated in the following. Note that the author typically
defines "up" as the z axis; coordinate conventions with a different
up vector simply require a swapping of the order of the functions.
For these types of displays an arbitrary up vector is generally not a good idea, since
there is already a physical sense of "up", but it could
readily be implemented in the camera or by tilting the scene.
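As a concrete illustration of that reordering, the following sketch (an assumption, not part of the original code) shows the location block rearranged for a y-up convention, reusing the #declare values from the listing; the direction block would be reordered in the same way.

// Sketch only: the location block reordered so that "up" is the y axis
location {
   function { ODSP_camerax + ODSP_whicheye * ODSP_eyesep * cos(x * 2*pi + ODSP_panangle) * ODSP_handed }
   function { ODSP_cameraz } // the height, previously the third (z) function
   function { ODSP_cameray + ODSP_whicheye * ODSP_eyesep * sin(x * 2*pi + ODSP_panangle) }
}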
Creating stereoscopic views that convey correct scale and depth, and don't place undue
stress on the human visual system, requires precise and careful
attention to the camera model. A test scene,
theworld.inc, is supplied with objects at known positions and of
known dimensions so that the ODSP camera model and the resulting images can
be checked. In particular, the scene elements have the following features.
Small spheres (stars) in the distance. This enables a check of their expected
separation. Specifically, if zero parallax is set to infinity then there should
not be any separation between their positions in the left and right eye image.
If zero parallax is set to the cylindrical display depth then their
separation should be the same as the eye separation.
A cylindrical grid with a radius equal to the display radius. When zero parallax distance is
set to the display radius, this mesh should exhibit no separation.
Axes showing "up" (z axis in green) and "right" (x axis in red). This checks that the left or right
handed coordinate system control is correct.
Rotating cubes at 3m away from the camera, cylinders at 4m, cones at 5m away
from the camera, and a 1m square floor plan grid. These all allow judgement of correct
depth when viewed within the cylindrical display.
For example, the following is the resulting stereo pair when the zero parallax distance
is set to infinity. The full resolution images are supplied so that the features
as documented can be verified.
[Stereo pair with zero parallax at infinity]
The following checks can be performed.
The "stars" (distant small spheres) are at infinity so the separation between their representations
should be 0.
All objects in the scene exhibit negative parallax. That is, the representation of
an object in the right eye will be to the left of the representation of the same object in
the left eye.
There is no vertical separation for any objects. This is always the case for cylindrical ODSP,
whereas vertical parallax is one of the problems with equirectangular ODSP images.
In the following the zero parallax distance is set to 4m, the radius of the cylindrical
display used for this exercise.
[Stereo pair with zero parallax at 4m]
The following checks can be performed.
The "stars" (distant small spheres) are at infinity so the separation between their representations
should be the same as the eye separation used. The measured separation is 33 pixels; at the 2mm LED
pixel pitch this is approximately 65mm, matching the eye separation. Note that this can be verified on the physical cylinder
by measuring the distance between the stars. Separations greater than the eye separation are a strong
cause of eye fatigue, since they require the human eyes to diverge, something they are never required
to do in real life.
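For reference, the expected separation in pixels follows directly from the quantities already given (a standalone sketch; the identifier names are arbitrary):

// Sketch: expected on-screen separation of an object at infinity when
// zero parallax is set at the display surface
#declare EYESEP = 0.065;                    // full eye separation in meters
#declare PIXELPITCH = 0.002;                // LED pixel pitch in meters
#declare SEP_PIXELS = EYESEP / PIXELPITCH;  // = 32.5, that is, about 33 pixels
#debug concat("Expected separation = ", str(SEP_PIXELS, 0, 1), " pixels\n")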
The grid is created at the radius of the actual LED cylinder, 4m away. As such its representation
should be identical in the left and right images, that is, at zero parallax. In the cylinder the
grid will appear to be at exactly the same depth as the cylinder surface.
Objects positioned further than 4m away (the cones) exhibit positive parallax and will appear behind the cylindrical
display; objects closer than 4m (the cubes) will have negative parallax and appear in front of the
cylindrical display.
The floor grid should appear to be coplanar with the real floor within the cylinder, at least for
an observer located at the center of the cylinder.