Stereo Camera Synchronization
nicolas at ndufresne.ca
Thu Apr 20 20:15:05 UTC 2023
On Thursday, 20 April 2023 at 10:53 -0400, Dwight Kulkarni via gstreamer-devel wrote:
> Hi all,
> I have two cameras with independent pipelines; each pipeline ends in an appsink
> where the JPEG frames are consumed.
> The problem is that I need to extract the jpeg frames from both cameras at
> exactly the same time. There can't be slight time differences between the two
> images. This is for stereo image analysis.
> Right now, I terminate each pipeline in an appsink and then consume the JPEG
> frames that are output.
> If each camera runs on a separate pipeline, I will get two different callbacks
> containing the JPEG images, and there is no guarantee that both images are in
> sync unless GStreamer provides something that can sync them.
> Alternatively, we can change the camera driver so that the raw video coming out
> of the ISP is stitched into a frame twice as large, and then crop the image in
> GStreamer when recording the video to select only one camera; the JPEG frame
> being returned would then already contain both frames in sync.
> Before pursuing these ideas, I was hoping for any comments on what the best
> approach would be.
That suggestion is good; it can simplify your GStreamer work a lot. If you still
need synchronization, though, only streams within the same pipeline can be
correlated in time and synchronized. So your two source elements (whatever those
elements are) need to run in the same pipeline, under the same clock. From there
you can convert their PTS into running-time and sync on that (the appsink
GstSample gives you both the segment and the timestamp to help with this). The
challenge is getting your timestamp source to be accurate; hardware timestamps
are ideal for this use case. You might also want some fuzzy matching, e.g.
truncating to millisecond precision.
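To illustrate the fuzzy-match idea, here is a minimal sketch in plain Python.
The helper names are hypothetical, and the running-times are hard-coded; in a
real application they would come from each appsink sample's segment and buffer
PTS (segment.to_running_time(buffer.pts) in the Python bindings):

```python
from collections import deque

def truncate_ms(running_time_ns):
    """Truncate a nanosecond running-time to millisecond precision."""
    return running_time_ns // 1_000_000

def match_frames(left_queue, right_queue):
    """Pair frames whose truncated running-times are equal.

    Each queue holds (running_time_ns, frame) tuples in arrival order.
    Frames too old to have a partner on the other side are dropped.
    """
    pairs = []
    while left_queue and right_queue:
        lt, _ = left_queue[0]
        rt, _ = right_queue[0]
        lkey, rkey = truncate_ms(lt), truncate_ms(rt)
        if lkey == rkey:
            pairs.append((left_queue.popleft()[1], right_queue.popleft()[1]))
        elif lkey < rkey:
            left_queue.popleft()   # left frame has no partner, drop it
        else:
            right_queue.popleft()  # right frame has no partner, drop it
    return pairs

# Example: two ~30 fps streams with sub-millisecond jitter between cameras.
left = deque([(33_366_000, "L0"), (66_733_000, "L1"), (100_100_000, "L2")])
right = deque([(33_400_000, "R0"), (66_700_000, "R1"), (133_466_000, "R2")])
print(match_frames(left, right))  # → [('L0', 'R0'), ('L1', 'R1')]
```

The tolerance (here 1 ms) should be chosen smaller than half the frame period,
otherwise adjacent frames could be mismatched.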
> Dwight Kulkarni