Setting up PixelStreaming
For general purposes, follow the tutorial at the Epic Games webpage.
It covers how to get started in general, and there is also deployment information on how to put this into motion in a cloud environment over at Unreal Containers.
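As a rough sketch of what the tutorial boils down to on Linux, a packaged build is usually launched with Pixel Streaming flags on the command line. The project name below is a placeholder, and the `-PixelStreamingIP`/`-PixelStreamingPort` pair is the UE4-style interface; newer UE5 releases replace it with a single `-PixelStreamingURL=ws://host:port` argument.

```shell
# Launch a packaged Linux build headlessly with Pixel Streaming enabled.
# "MyProject.sh" is a placeholder for your packaged project's launcher.
./MyProject.sh -RenderOffscreen -Vulkan \
  -PixelStreamingIP=127.0.0.1 -PixelStreamingPort=8888 \
  -AllowPixelStreamingCommands
```

The signalling server from the Pixel Streaming infrastructure then connects to that port and brokers the WebRTC session with the browser or, in the Synavis case, the receiving client.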
Supercomputing environment
Generally, compute nodes do not have access to the internet. Furthermore, compute nodes might not be able to run UE at all. Things you need to watch out for here are:
- General graphics capabilities. This includes whether Vulkan is supported. At some point in the UE4 release cycle, Epic dropped OpenGL rendering on Linux and now supports only Vulkan, which is the better choice in my opinion. However, this also means that you need Vulkan drivers installed on the system and a GPU with graphics capabilities. Modern tensor-core cards, such as the NVIDIA Hopper architecture, might not be able to support this without a lot of extra programming work.
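A quick way to probe whether a node can render with Vulkan at all is the `vulkaninfo` tool from the vulkan-tools package; module or package names vary by site, so treat this as a sketch.

```shell
# Probe Vulkan support on a compute node. A working setup lists at least
# one physical device; a loader without an ICD/GPU reports an error.
if command -v vulkaninfo >/dev/null 2>&1; then
  # --summary prints the available physical devices and driver versions
  vulkaninfo --summary || echo "Vulkan loader present, but no usable ICD/GPU found"
else
  echo "vulkaninfo not found - try loading a Vulkan module or installing vulkan-tools"
fi
```

If no device shows up here, UE will not be able to render on that node regardless of how it is launched.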
- Encoding/decoding abilities. This is largely independent of graphics, but you need to have the codec headers ready. For a lot of concurrent sessions, you should move to VP9 encoding. Be aware that, right now, decoding the images on the receiving side is in your hands, as we generally do not want to keep updating for different encoding/decoding schemes. An ffmpeg build is always available on the JSC software stack, as it is also used for Xpra, for example.
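For the receiving side mentioned above, the ffmpeg CLI is one way to turn a dumped stream into raw frames for further processing; the file names here are placeholders, and a VP9 dump works the same way since ffmpeg picks the decoder from the input.

```shell
# Decode a received H.264 elementary-stream dump into raw RGB24 frames.
# "received_stream.h264" and "frames.rgb" are placeholder file names.
ffmpeg -i received_stream.h264 -f rawvideo -pix_fmt rgb24 frames.rgb
```

The resulting raw buffer can then be reshaped into width x height x 3 arrays in whatever analysis environment sits downstream.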