Technical data
Nergame uses a distributed architecture based mainly on events and services. This structure is composed of the following elements:
For the MVP version, the technologies we use to publish video and audio with low latency are RTMP for ingest (recording) and HLS for playback, for both live and deferred streams. This combination allows us to build a horizontally scalable system in a simple way while allowing its use across an unbounded number of devices.
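As an illustration of that pipeline, the sketch below repackages an incoming RTMP stream as HLS with ffmpeg. It is only a minimal example, not the platform's actual setup: the ingest URL, stream key and output path are hypothetical, and it assumes ffmpeg is installed on the server.

```python
import subprocess

# Hypothetical ingest URL and playlist path, for illustration only.
RTMP_INPUT = "rtmp://ingest.example.com/live/stream-key"
HLS_OUTPUT = "/var/www/hls/stream-key.m3u8"

subprocess.run([
    "ffmpeg",
    "-i", RTMP_INPUT,                 # read the RTMP ingest
    "-c:v", "copy",                   # keep the incoming H.264 video as-is
    "-c:a", "copy",                   # keep the incoming AAC audio as-is
    "-f", "hls",                      # package the output as HLS
    "-hls_time", "4",                 # 4-second segments (latency vs. overhead trade-off)
    "-hls_list_size", "6",            # keep the last 6 segments in the playlist
    "-hls_flags", "delete_segments",  # drop old segments from disk
    HLS_OUTPUT,
], check=True)
```

Because HLS playback is just plain HTTP file serving, the generated playlist and segments can be served by any web server or cache, which is what makes the playback side easy to scale horizontally.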
A future improvement of the platform will be RTMP distribution through relay servers, allowing us to handle higher traffic. This will require new solutions for the scalability limitations of the protocol, which we plan to address with a CDN for the playback side and an event-driven architecture.
The latency of both protocols is very similar, with a maximum delay of about 12 seconds between the broadcast and what viewers see.
All live streams are recorded by the platform and stored in MP4 format with the H.264 video codec and the AAC audio codec. These recordings can later be published on Nergame's YouTube channel and watched inside the platform.
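As a rough sketch of that recording step, the example below archives an RTMP stream to an MP4 file with H.264 video and AAC audio using ffmpeg. The input URL and output path are hypothetical, and this is only one possible way to produce such a recording.

```python
import subprocess

# Hypothetical ingest URL and archive path, for illustration only.
RTMP_INPUT = "rtmp://ingest.example.com/live/stream-key"
RECORDING = "/var/recordings/stream-key.mp4"

subprocess.run([
    "ffmpeg",
    "-i", RTMP_INPUT,
    "-c:v", "libx264",              # encode video as H.264
    "-c:a", "aac",                  # encode audio as AAC
    "-movflags", "+faststart",      # put MP4 metadata up front for easier playback and upload
    RECORDING,
], check=True)
```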