Compositor sink stream synchronisation

Terry Barnaby terry1 at
Wed Nov 16 15:45:39 UTC 2022

We are having a problem with NXP hardware-based compositors where 
occasionally (after pause/unpause) the pipeline gets into a state where 
the output video buffers only contain frames from one of the compositor's 
input sinks. Digging into the code, it looks like the compositor (which 
uses GstVideoAggregator) is dropping buffers from the main live camera 
stream because their timestamps are before the compositor's current 
output buffer timestamps. So the compositor just reuses the last camera 
buffer it had, often from tens of seconds ago.
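To illustrate the behaviour I think we are seeing, here is a simplified model of the buffer selection on an aggregator pad. This is my own sketch, not the actual GstVideoAggregator code, and all names in it are invented: buffers that end before the output frame starts are dropped, and the most recent buffer seen is reused.

```python
# Simplified model of how an aggregator pad might choose the buffer for an
# output frame (a sketch to illustrate the symptom -- NOT the actual
# GstVideoAggregator code; all names here are invented).

def pick_buffer(queued, last, out_start):
    """queued: list of (start, end) buffer times in arrival order."""
    chosen = last
    for start, end in queued:
        if end <= out_start:
            # Buffer ends before the output frame starts: it is dropped,
            # but remembered as the most recent buffer seen on this pad.
            chosen = (start, end)
            continue
        # The first buffer overlapping the output frame wins.
        chosen = (start, end)
        break
    return chosen

# After a pause, the live pad's timestamps lag far behind the output
# window, so every queued buffer is "too old" and a stale frame is reused:
print(pick_buffer([(0.033, 0.066), (0.066, 0.100)],
                  last=(0.0, 0.033), out_start=10.0))   # -> (0.066, 0.1)
```

In this model the pad can never catch up: as long as the camera's timestamps stay behind the output window, every new buffer is dropped on arrival.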

We have two video frame sources: a v4l2src feeding through one of our own 
simple video deinterlacer elements into a compositor, and an overlay 
stream coming from an appsrc. The CPU is only lightly used.

We have seen a harder issue with a different NXP hardware-based 
compositor (no frames at all from the camera source) and have seen some 
interesting frame drops with the standard software compositor.

Our problem may be due to our deinterlacer element somehow (latency?), 
but it is pretty simple: just a VideoFilter (GstVideoFilterClass) 
implementing a transform_frame function.
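In case the latency hypothesis is relevant, here is a back-of-the-envelope sketch of why an element that delays buffers without reporting that delay in the pipeline's latency could make them look late downstream. This is a simplified timing model, not a GStreamer API; the function and parameters are invented:

```python
# Simplified timing model (invented names, not a GStreamer API): a buffer
# is "on time" downstream if its presentation deadline (pts plus the
# latency the pipeline has been told about) has not passed on arrival.

def arrives_in_time(buffer_pts, processing_delay, reported_latency):
    arrival = buffer_pts + processing_delay
    deadline = buffer_pts + reported_latency
    return arrival <= deadline

# A deinterlacer that buffers one 40 ms field but reports zero latency
# makes every buffer appear late; reporting the delay fixes that:
print(arrives_in_time(0.0, 0.040, 0.0))    # -> False (delay unreported)
print(arrives_in_time(0.0, 0.040, 0.040))  # -> True  (delay reported)
```

If something like this applies, the fix would be for the deinterlacer to add its processing delay to the latency query rather than to change the compositor.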

My question is: how, in general, is GStreamer compositor frame 
synchronisation supposed to work? I can't find any info on this.

Our v4l2src is a live video source at 30 fps and our appsrc is a non-live 
source at 5 fps (although I have tried 30 fps). I would have assumed 
that the compositor's output frames should be synchronised predominantly 
to the camera's live stream, but this does not seem to be the case. 
It does output 30 fps but is obviously not synchronised in time to the 
camera's buffer stream.
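What I seem to observe, sketched as a simplified model (my own invented names, not the real videoaggregator logic), is that output timestamps are generated from the aggregator's own output framerate, independent of when any input buffer actually arrived:

```python
# Simplified model (invented names, not the real videoaggregator logic):
# output pts values are spaced 1/fps apart from the segment start,
# regardless of the timestamps on any one input stream.

def output_timestamps(fps, n):
    return [i / fps for i in range(n)]

print(output_timestamps(30, 4))
```

Which would explain output at a steady 30 fps that is nonetheless not locked to the camera's buffer timestamps.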

