ParaView can run on a supercomputer with thousands of nodes to provide visualization and analysis of very large datasets. In this configuration, the same version of the ParaView analysis pipeline runs on each node to process a piece of the data; the results are rendered in software using Off-Screen Mesa and composited into a final image, which is sent to the ParaView client for display.
Software rendering is used because, until recently, supercomputer nodes did not provide graphics cards, as they were used mainly for computation. This is beginning to change with the release of new GPU accelerator cards, such as the NVIDIA Tesla, which can be used for both computation and off-screen rendering.
EGL, the Khronos Native Platform Graphics Interface, provides a means to render to a native windowing system, such as Android, the X Window System or Microsoft Windows, or to an off-screen buffer. As the rendering API, one can choose OpenGL ES, OpenVG or, starting with EGL version 1.4, full OpenGL.
We enabled VTK and the ParaView server (pvserver) to render to an EGL off-screen buffer. Through this work we allow server-side hardware-accelerated rendering without the need to install a windowing system.
To compile VTK or ParaView for off-screen rendering through EGL you will need:
- A graphics card driver that supports OpenGL rendering through EGL (full OpenGL rendering is supported only in EGL version 1.4 or later). We have tested our code with the NVIDIA driver version 355.11.
- You might need the EGL headers, as they did not come with the NVIDIA driver used in our tests. You can download them from the Khronos EGL Registry.
- Set the advanced VTK configuration option VTK_USE_OFFSCREEN_EGL.
You’ll get a configuration error if any windowing system is enabled (VTK_USE_X or VTK_USE_COCOA), so you’ll have to disable your windowing system. You’ll also get an error if you are on WIN32, ANDROID or APPLE_IOS.
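Putting the steps above together, a configuration run might look like the following. This is only a sketch: the source and build paths are placeholders, and the device index option is explained below.

```shell
# Configure VTK for EGL off-screen rendering (paths are placeholders).
# VTK_USE_OFFSCREEN_EGL enables the EGL render window; the windowing
# system (here X) must be disabled or configuration fails.
cmake ../VTK \
  -DVTK_USE_OFFSCREEN_EGL=ON \
  -DVTK_USE_X=OFF \
  -DVTK_EGL_DEVICE_INDEX=0 \
  -DEGL_INCLUDE_DIR=/usr/include
```

If the EGL headers or libraries are not found automatically, point the EGL_* cache variables at them as described later in this post.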
If you have several graphics cards on your system, you may need to set the index of the graphics card you want to use, if that is different from the default card chosen by the driver. You can do that if your driver supports the EGL_EXT_platform_device and EGL_EXT_device_base extensions.
You can set the default graphics card used by the render window in VTK through the advanced configuration option VTK_EGL_DEVICE_INDEX, an integer such as 0 or 1 for a system with two cards installed. By default, this variable is set to 0, which means that the default graphics card is used. We are investigating a more user-friendly mechanism, such as the name of the graphics card. Note that the index of the graphics card you need to pass is the same as the index of the card returned by the nvidia-smi command.
For a system with more than one graphics card installed, you can also choose the graphics card used for rendering at runtime, in case it is different from the card set up at configuration time.
If you want to change the graphics card selected at configuration time, you can call vtkRenderWindow::GetNumberOfDevices() to query the number of devices available on the system and vtkRenderWindow::SetDeviceIndex(deviceIndex) to set the device you want to use for rendering.
To start pvserver with rendering on a graphics card different from the one set at configuration time, pass the following command-line parameter:
--egl-device-index=<device_index>, where <device_index> is the graphics card index.
To check that you are rendering on the correct graphics card in ParaView, see Help, About, Connection Information, OpenGL Renderer.
- Make sure that EGL_INCLUDE_DIR, EGL_LIBRARY, EGL_gldispatch_LIBRARY and EGL_opengl_LIBRARY point to valid headers and libraries. On Ubuntu 16.04 with NVIDIA driver version 361.42, the libraries are: /usr/lib/nvidia-361/libEGL.so, /usr/lib/nvidia-361/libGLdispatch.so.0 and /usr/lib/nvidia-361/libOpenGL.so.
- Pass --disable-xdisplay-test to pvserver if this option exists. We have seen a case in which this test creates problems with EGL rendering.
We hope you enjoy this new feature. It is available in the VTK and ParaView git repositories.
Thanks to Peter Messmer from NVIDIA for answering all our EGL questions.