webrtcbin: Remove a connecting webrtcbin from running pipeline

Matthew Waters <ystreet00@gmail.com>
Wed Jan 26 04:24:26 UTC 2022


Before negotiation completes, webrtcbin will block the input pad to
avoid losing data (at the expense of a potential burst of data once
negotiation completes and the block is released).

This block on the webrtcbin input pad in turn deadlocks the upstream
queue: the queue's streaming thread is stuck pushing into the blocked
pad while holding the pad stream lock, so stopping the queue's task
waits forever on that lock:

Thread 1 (Thread 0x7ffff7907500 (LWP 1962063) "python3"):
#0  0x00007ffff7a6a7b0 in __lll_lock_wait () at /lib64/libpthread.so.0
#1  0x00007ffff7a635d0 in pthread_mutex_lock () at /lib64/libpthread.so.0
#2  0x00007fffe8f72869 in gst_pad_stop_task (pad=pad@entry=0x555555bbe3e0) at ../subprojects/gstreamer/gst/gstpad.c:6438
#3  0x00007fffe7c356f3 in gst_queue_src_activate_mode (pad=0x555555bbe3e0, parent=<optimized out>, mode=GST_PAD_MODE_PUSH, active=0) at ../subprojects/gstreamer/plugins/elements/gstqueue.c:1766
#4  0x00007fffe8f6c65a in activate_mode_internal (pad=pad@entry=0x555555bbe3e0, parent=parent@entry=0x555555ade370, mode=mode@entry=GST_PAD_MODE_PUSH, active=active@entry=0) at ../subprojects/gstreamer/gst/gstpad.c:1216
#5  0x00007fffe8f6ce48 in gst_pad_set_active (pad=pad@entry=0x555555bbe3e0, active=0) at ../subprojects/gstreamer/gst/gstpad.c:1114
#6  0x00007fffe8f469a1 in activate_pads (vpad=<optimized out>, ret=0x7fffffff6360, active=0x7fffffff63bc) at ../subprojects/gstreamer/gst/gstelement.c:3171
#7  0x00007fffe8f5c6b3 in gst_iterator_fold (it=it@entry=0x555555a36220, func=func@entry=0x7fffe8f46980 <activate_pads>, ret=ret@entry=0x7fffffff6360, user_data=user_data@entry=0x7fffffff63bc) at ../subprojects/gstreamer/gst/gstiterator.c:617
#8  0x00007fffe8f47036 in iterator_activate_fold_with_resync (iter=iter@entry=0x555555a36220, user_data=user_data@entry=0x7fffffff63bc, func=0x7fffe8f46980 <activate_pads>) at ../subprojects/gstreamer/gst/gstelement.c:3195
#9  0x00007fffe8f492bb in gst_element_pads_activate (element=element@entry=0x555555ade370, active=<optimized out>, active@entry=0) at ../subprojects/gstreamer/gst/gstelement.c:3231
#10 0x00007fffe8f49591 in gst_element_change_state_func (element=0x555555ade370, transition=GST_STATE_CHANGE_PAUSED_TO_READY) at ../subprojects/gstreamer/gst/gstelement.c:3305
#11 0x00007fffe8f4b78e in gst_element_change_state (element=element@entry=0x555555ade370, transition=GST_STATE_CHANGE_PAUSED_TO_READY) at ../subprojects/gstreamer/gst/gstelement.c:3083
#12 0x00007fffe8f4c20a in gst_element_continue_state (element=element@entry=0x555555ade370, ret=ret@entry=GST_STATE_CHANGE_SUCCESS) at ../subprojects/gstreamer/gst/gstelement.c:2791
#13 0x00007fffe8f4b7db in gst_element_change_state (element=element@entry=0x555555ade370, transition=transition@entry=GST_STATE_CHANGE_PLAYING_TO_PAUSED) at ../subprojects/gstreamer/gst/gstelement.c:3122
#14 0x00007fffe8f4becd in gst_element_set_state_func (element=0x555555ade370, state=GST_STATE_NULL) at ../subprojects/gstreamer/gst/gstelement.c:3037
#15 0x00007ffff7fadc04 in ffi_call_unix64 () at /lib64/libffi.so.6
#16 0x00007ffff7fad107 in ffi_call () at /lib64/libffi.so.6
#17 0x00007fffea047e06 in pygi_invoke_c_callable (function_cache=0x555555bc2600, state=<optimized out>, py_args=<optimized out>, py_kwargs=<optimized out>) at ../subprojects/pygobject/gi/pygi-invoke.c:684
#18 0x00007fffea049b5a in pygi_function_cache_invoke (function_cache=<optimized out>, py_args=py_args@entry=0x7fffe9108f00, py_kwargs=py_kwargs@entry=0x0) at ../subprojects/pygobject/gi/pygi-cache.c:862
#19 0x00007fffea0486d5 in pygi_callable_info_invoke (user_data=0x0, cache=<optimized out>, kwargs=0x0, py_args=0x7fffe9108f00, info=<optimized out>) at ../subprojects/pygobject/gi/pygi-invoke.c:727
#20 0x00007fffea03a97e in _callable_info_call (self=0x7fffea16c8f0, args=0x7fffe912e670, kwargs=0x0) at ../subprojects/pygobject/gi/pygi-info.c:548
#21 0x00007ffff7d5c357 in _PyObject_MakeTpCall () at /lib64/libpython3.9.so.1.0
#22 0x00007ffff7d5921e in _PyEval_EvalFrameDefault () at /lib64/libpython3.9.so.1.0
#23 0x00007ffff7d60c83 in function_code_fastcall () at /lib64/libpython3.9.so.1.0
#24 0x00007ffff7d54231 in _PyEval_EvalFrameDefault () at /lib64/libpython3.9.so.1.0
#25 0x00007ffff7d52c4d in _PyEval_EvalCode () at /lib64/libpython3.9.so.1.0
#26 0x00007ffff7dcdc35 in _PyEval_EvalCodeWithName () at /lib64/libpython3.9.so.1.0
#27 0x00007ffff7dcdbcd in PyEval_EvalCodeEx () at /lib64/libpython3.9.so.1.0
#28 0x00007ffff7dcdb7f in PyEval_EvalCode () at /lib64/libpython3.9.so.1.0
#29 0x00007ffff7dfb3f4 in run_eval_code_obj () at /lib64/libpython3.9.so.1.0
#30 0x00007ffff7df7486 in run_mod () at /lib64/libpython3.9.so.1.0
#31 0x00007ffff7ccec80 in pyrun_file.cold () at /lib64/libpython3.9.so.1.0
#32 0x00007ffff7df16a3 in PyRun_SimpleFileExFlags () at /lib64/libpython3.9.so.1.0
#33 0x00007ffff7dee888 in Py_RunMain () at /lib64/libpython3.9.so.1.0
#34 0x00007ffff7dc072d in Py_BytesMain () at /lib64/libpython3.9.so.1.0
#35 0x00007ffff7aa0b75 in __libc_start_main () at /lib64/libc.so.6
#36 0x000055555555509e in _start ()

To fix this, you need to stop the queue from blocking, which means
removing the data from the pipeline.  One (tested) solution is to bring
the state of webrtcbin down before the queue element, which causes
webrtcbin's sink pad to return flushing upstream on any push and the
webrtcbin input pad block to be removed.  Another fix would probably be
to push a flush-start/flush-stop pair manually into the queue element.
Yet another may be to bring the element states down in lock-step from
PLAYING->PAUSED->READY->NULL (which is what bins do internally).


On 26/1/22 05:11, Eric Timmons via gstreamer-devel wrote:
> I've got an application where we're streaming video to multiple clients using WebRTC. Our video goes into a tee element and then each WebRTC branch of the tee has a queue, rtph264pay, and webrtcbin element. Dynamically adding new branches to this pipeline works great. Dynamically removing branches as clients go away works, but only if both the SDP and ICE negotiations have finished.
> We can't figure out how to remove a branch where an offer has been sent, but no answer received. Currently, we disconnect the branch from the tee, set the states of every element in the branch to NULL, and then remove them from the pipeline. If negotiations have not finished, then setting any element's state to NULL blocks indefinitely.
> We've additionally tried sending an EOS through the branch after disconnecting from the tee, but the EOS doesn't ever seem to leave the queue.
> I've attached an example using the Python bindings. This hangs on Line 107 in both 1.18.4 and current main branch.
> Thanks!
> -Eric
