mpegtsmux AV synch and dynamic linking

Marianna Smidth Buschle msb at qtec.com
Mon Oct 25 07:06:37 UTC 2021


This pipeline works, and it is a simplification of part of a Python app:

gst-launch-1.0 udpsrc address=224.1.1.1 port=5000 multicast-iface=eth1 ! \
    application/x-rtp,media=video,payload=33,clock-rate=90000,encoding-name=MP2T ! \
    rtpbin ! queue ! decodebin caps="video/x-h264" ! h264parse ! \
    "video/x-h264,stream-format=byte-stream,alignment=au" ! tee name=t ! \
    queue ! "video/x-h264,stream-format=byte-stream,alignment=au" ! \
    multiqueue name=mq ! mpegtsmux name=mux ! rtpmp2tpay ! \
    filesink location=test.ts \
    audiotestsrc ! \
    "audio/x-raw,format=S32LE,layout=interleaved,rate=44100,channels=2,channel-mask=(bitmask)0x3" ! \
    mq. mq. ! audioconvert ! avenc_aac ! mux. \
    -v --gst-debug=*:3

Basically, there is video (MPEG-TS / H.264) coming in over RTP which is to 
be muxed with local audio (in the real app an alsasrc) and saved to a 
file. (There is a lot more going on, but this is the relevant part.)

My problem/question relates to the synchronization of audio and video 
and to how to properly link the necessary pads dynamically.

A/V synchronization:

- All sources are supposedly using the same clock: the alsasrc is set to 
provide-clock=0 and the remote UDP source has also received the same clock.

- So, should I use the multiqueue as a form of synchronization? Does it 
make a difference, or could I just as well use normal queues?

- Anything else I should do or be aware of in terms of having the audio 
and video in synch?

Dynamic linking:

- The alsasrc should be running from the beginning; however, the camera 
sending the UDP stream might show up at any later point, disconnect, and 
reconnect.

- When I only have the video, everything works. The link between rtpbin 
and the first queue is made from the pad-added signal, and the same is 
true for the link between decodebin and h264parse.

- When I try adding the audio I get all sorts of errors, depending on 
when I try to link the avenc_aac and the mpegtsmux.

   - E.g. linking when the decodebin starts producing buffers:

WARN basetsmux gstbasetsmux.c:811:gst_base_ts_mux_create_pad_stream:<mpegtsmux2> error: Could not create handler for stream

What does this error mean?
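
My own unverified reading is that the muxer has to map each sink pad's 
caps to an MPEG-TS (ISO/IEC 13818-1) stream type and errors out when no 
mapping exists, e.g. if caps it cannot handle (or no caps at all) reach 
it at link time. A minimal pure-Python sketch of that idea (the mapping 
below is a subset and my assumption, not the muxer's actual table):

```python
# Stream types from ISO/IEC 13818-1 for the formats used in my pipeline.
STREAM_TYPES = {
    "video/x-h264": 0x1B,  # H.264 / AVC
    "audio/mpeg":   0x0F,  # AAC (ADTS)
}

def stream_type_for(caps_name):
    """Return the TS stream type for a caps media type, or None when
    the muxer would have no handler for that stream."""
    return STREAM_TYPES.get(caps_name)
```

So raw caps like audio/x-raw would yield None here, which matches the 
"Could not create handler" failure I see when I link too early.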

And what is the proper way/order to link audio and video to the 
mpegtsmux under these constraints?

-- 
Best regards / Med venlig hilsen
“Marianna Smidth Buschle”


