Synchronised video capture from multiple cameras with splitmuxsink
gotsring
gotsring at live.com
Sat Feb 20 00:27:12 UTC 2021
Just a few comments, no hard advice (sorry).
I recently created a sort of NVR that would receive videos from 2-4 IP cams,
show the live view, and record all streams on-demand. When recording, each
camera streams to its own file, 1 file per camera per record session. In
general, I could replay the videos "synchronized" by just starting to play
them all at the same time. The streams were all probably within 50-100 ms of
each other (wall-clock time), so if you want something that is pretty much
visually synchronized, you don't have to really mess with pipeline clocks or
specific gst elements. Anything more seriously synchronized is beyond my
experience.
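To illustrate the "just play them all at the same time" approach, here is a rough gst-launch sketch that starts two recordings together in one pipeline (file names are placeholders, and the compositor layout values are my own assumption, not something from the original setup):

```shell
# Play two per-camera recordings side by side, started simultaneously.
# Both branches share one pipeline clock, so playback begins together.
gst-launch-1.0 compositor name=mix sink_1::xpos=640 ! videoconvert ! autovideosink \
    filesrc location=cam1.mp4 ! decodebin ! mix. \
    filesrc location=cam2.mp4 ! decodebin ! mix.
```

Any residual offset between the files (the 50-100 ms mentioned above) will still be visible; this only guarantees they start at the same moment.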
Implementation details if you're curious:
I created one pipeline per camera, and all pipelines were managed by the
same g_main_loop so I could start/stop recordings in one place, which pretty
much consisted of linking/unlinking a tee to a sub-pipeline with an encoder
and filesink. Of course, each pipeline took 2-5 seconds to initialize as it
connected to the cameras, but once that was done, starting and stopping
recordings took a negligible amount of time.
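A sketch of what one such per-camera pipeline might look like (camera URL, codec, and element choices are placeholder assumptions; the actual application attached the recording branch dynamically from code rather than via gst-launch):

```shell
# Live-view pipeline, one per camera. The tee exposes the decoded
# stream so a recording branch can be linked/unlinked on demand.
gst-launch-1.0 rtspsrc location=rtsp://camera1/stream ! rtph264depay \
    ! avdec_h264 ! tee name=t \
    t. ! queue ! videoconvert ! autovideosink

# Starting a recording is roughly equivalent to linking another
# branch off the tee:
#   t. ! queue ! videoconvert ! x264enc ! mp4mux ! filesink location=cam1.mp4
# Stopping it means unlinking that branch and sending EOS to the muxer
# so the file is finalized cleanly.
```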
After receiving and decoding the camera feeds, I had a videorate element to
make sure each stream stayed a consistent 30 FPS, even when the connection
to a camera dropped and restarted (videorate would just repeat the last
frame until the connection was restored). This forced a uniformity between
the camera streams that helped keep them synchronized during longer
recordings (the longest I tried was 70 hours).
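The videorate step looks roughly like this as a pipeline fragment (the test source and exact caps are illustrative, not the original settings):

```shell
# videorate duplicates or drops frames so the output holds a constant
# 30 FPS; the capsfilter after it pins the target framerate. If the
# source stalls, the last frame is repeated to fill the gap.
gst-launch-1.0 videotestsrc is-live=true \
    ! videorate ! video/x-raw,framerate=30/1 \
    ! autovideosink
```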
Hope this helps!