Generate 360° images with ParaView

Context

Nowadays, it is quite common to have access to a device capable of displaying 360° scenes, such as a VR headset, a planetarium/dome screen, or a simple smartphone whose internal gyroscope drives the virtual camera direction.

To display an image or a video on these devices, a convenient way is to project the 360° scene using a common projection such as the equirectangular projection (Figure 1) or the azimuthal equidistant projection, also known as fish-eye (Figure 2).

Figure 1 – Equirectangular projection.
Image credit: Daniel R. Strebe (CC BY-SA 3.0)
Figure 2 – Azimuthal equidistant projection.
Image credit: Daniel R. Strebe (CC BY-SA 3.0)

The type of projection is generally chosen to minimize the distortion of the image around the center of interest. The equirectangular projection is used when the center of interest lies on a horizontal plane, whereas the azimuthal equidistant projection is used when the center of interest lies in a specific direction.
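To make the difference concrete, the two projections can be sketched with their standard formulas: equirectangular maps longitude and latitude linearly to image coordinates, while azimuthal equidistant maps the angular distance from a chosen center direction (here the pole, as an illustrative choice) linearly to a radius.

```python
import math

def equirectangular(lon, lat):
    """Map longitude/latitude (radians) linearly to normalized image coords."""
    x = (lon + math.pi) / (2 * math.pi)   # 0..1 across the full 360°
    y = (lat + math.pi / 2) / math.pi     # 0..1 from south pole to north pole
    return x, y

def azimuthal_equidistant(lon, lat):
    """Project onto a plane tangent at the pole (the center of interest):
    the angular distance from the pole maps linearly to the radius."""
    rho = math.pi / 2 - lat               # angular distance from the pole
    return rho * math.cos(lon), rho * math.sin(lon)
```

Note how distortion behaves in each case: equirectangular stretches regions near the poles horizontally, while azimuthal equidistant distorts increasingly far from the center direction.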

Figure 3 – 360° equirectangular projection video of an animated scene
produced with the new ParaView Projection View.

Sometimes, visualizing a dataset from the inside rather than looking at it from the outside adds a new level of perception. Being limited to a small viewing angle can be frustrating with some datasets, and full immersion drastically improves the visualization experience.

With the upcoming ParaView 5.7, it is now possible to produce 360° images or videos in real time thanks to the “Panoramic View Plugin” and the new “Panoramic Projection View” it offers.

Taking advantage of this new feature, DKRZ (German Climate Computing Centre) and its partner HZG (Centre for Materials and Coastal Research) are using ParaView to generate various animations, ranging from climate and weather to materials science, to be displayed within a large mobile dome. HZG also developed an app to display 360° animations on any smartphone, showcasing their 360° Science. Both centers are also using VR goggles, not only for public outreach but also to foster their research.

Feature usage

In the plugin list, be sure to enable the “PanoramicProjectionView” plugin. Once it is enabled, a new view is available in the view list (see Figure 4).

Figure 4 – New ParaView view enabled by the plugin.

Three parameters are available when this new view is selected (see Figure 5).

  • Projection Type: selects between the two projection types, Equirectangular (see Figure 3) or Azimuthal Equidistant (see Figure 4).
  • Projection Angle: the angle of the capture, if the user does not need the full 360° scene.
  • Cube Resolution: the resolution of each face of the internal cubemap texture, which configures the tradeoff between quality and performance. The default value is 500; increasing it improves rendering quality but slows down rendering.

Figure 5 – Panoramic Projection View parameters
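The same setup can also be scripted with pvpython. The sketch below is an assumption-laden fragment, not a verified recipe: the proxy name passed to `CreateView` and the property names (`ProjectionType`, `ProjectionAngle`, `CubeResolution`) are inferred from the GUI labels above; use Tools > Start Trace in the ParaView GUI to confirm the exact names for your build.

```python
# Run with ParaView's pvpython. Proxy and property names below are
# assumptions inferred from the GUI labels; verify them with a Python trace.
from paraview.simple import *

# Load the plugin shipped with ParaView (name assumed from the plugin list).
LoadDistributedPlugin('PanoramicProjectionView', remote=False, ns=globals())

view = CreateView('PanoramicProjectionView')
view.ProjectionType = 'Equirectangular'   # or 'Azimuthal Equidistant'
view.ProjectionAngle = 360                # capture angle in degrees
view.CubeResolution = 1000                # per-face cubemap resolution

# Show any source in the panoramic view and save a projected image.
sphere = Sphere()
Show(sphere, view)
Render(view)
SaveScreenshot('panoramic.png', view, ImageResolution=[2048, 1024])
```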

Known limitations

The new render view works well with both surface and volume rendering but there are still known limitations.

  • Ray tracing renderers (OSPRay and OptiX) are not supported yet.
  • Overlay widgets (color bar and orientation widget) are not projected, so visualizing them with a VR device will produce distorted widgets.
  • Representations using “Lines as tubes” and “Points as spheres” can produce shading artifacts.
  • Lights attached to the camera only point to the front face, leading to low luminosity in some parts of the image.

Implementation details

Under the hood, ParaView exploits the power of the VTK render pass pipeline to capture six off-screen square images, one for each direction (front, back, top, bottom, right, and left), into a cubemap texture. This cubemap contains the whole 360° scene. Special care must be taken with the camera parameters when capturing each direction: the field of view must be exactly 90° and the horizontal/vertical aspect ratio exactly 1:1, so that the images correctly fit the cubemap faces.
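Why 90° and 1:1 works can be checked with a small sketch: with those camera settings, any view direction falls on exactly one cube face, and the six faces tile the sphere with no gap or overlap. The face names and axis conventions below are illustrative, not VTK's actual ones.

```python
def cubemap_face_uv(d):
    """Given a view direction (dx, dy, dz), return the cubemap face it hits
    and normalized (u, v) coordinates in [0, 1] on that face. The dominant
    axis of the direction selects the face; dividing the other two
    components by it gives coordinates in [-1, 1], exactly the extent of a
    90-degree, 1:1 capture."""
    dx, dy, dz = d
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if ax >= ay and ax >= az:                  # x axis dominates
        face, u, v = ('+x' if dx > 0 else '-x'), dy / ax, dz / ax
    elif ay >= az:                             # y axis dominates
        face, u, v = ('+y' if dy > 0 else '-y'), dx / ay, dz / ay
    else:                                      # z axis dominates
        face, u, v = ('+z' if dz > 0 else '-z'), dx / az, dy / az
    return face, (u + 1) / 2, (v + 1) / 2
```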

An image processing pass then combines all the faces of the cubemap and projects them onto a single 2D image that is displayed in the ParaView render view.
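The core of this combining step is the inverse mapping: each output pixel of the (here, equirectangular) image corresponds to a view direction, and that direction is used to sample the cubemap. A minimal sketch, with an assumed z-up axis convention rather than VTK's actual one:

```python
import math

def pixel_to_direction(px, py, width, height):
    """For a pixel of the output equirectangular image, compute the unit 3D
    view direction to sample from the cubemap (z up, illustrative only)."""
    lon = (px + 0.5) / width * 2 * math.pi - math.pi    # -pi .. pi
    lat = math.pi / 2 - (py + 0.5) / height * math.pi   # +pi/2 (top) .. -pi/2
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))
```

In the real pipeline this runs per pixel on the GPU as an image processing pass; the Python loop form is only for illustration.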

Acknowledgements

Special thanks to Niklas Röber from DKRZ for his valuable help with this new feature and the writing of this article.

This work was supported by Helmholtz-Zentrum Geesthacht – Center for Materials and Coastal Research (HZG).

Developments were done by Kitware SAS, France.

Michael Migliore is an R&D Engineer at Kitware, France. He has been a developer of VTK and ParaView since 2017. His areas of expertise include computer graphics, physics simulation, collision detection, and software development.

Joachim Pouderoux is a Technical Lead at Kitware, France. He has been a developer of VTK and ParaView since 2012. His areas of expertise include scientific visualization, computer graphics, interaction techniques, Voronoi meshing, and software development.

2 Responses to Generate 360° images with ParaView

  1. Olivier Burri says:

    Hi there!

    Awesome option! Just a question. Usually 360 videos for VR lenses are exported as a single movie that contains the view for the left eye and the view for the right eye side by side. If we wanted to play these exported views as a 360 video on YouTube, is this possible? I haven’t found a way to export them that way yet…

    • Michael Migliore says:

      Hi Olivier.
      It is difficult to capture 360° stereo videos because the positions of the two cameras are fixed.
      That means that if you turn your head to the right, you will no longer have a left eye and a right eye side by side but one eye behind the other, and that breaks the 3D perception.

Questions or comments are always welcome!
