<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class="">Hi,<div class=""><br class=""></div><div class="">Wow, thanks for the fast reply!</div><div class=""><br class=""></div><div class="">I would like to keep these components as separate applications (we already have this architecture, sending UDP multicast streams between components), so when I referred to separating them into individual pipelines I meant separate processes.</div><div class=""><br class=""></div><div class="">My thought process was that I want to replace our code with standard GStreamer elements as much as possible (no code is the best code :D). GStreamer is so packed with features that my use cases could most probably be covered with out-of-the-box elements (maybe even with a simple gst-launch-1.0 component, or a small Python wrapper around it).</div><div class=""><br class=""></div><div class=""><img apple-inline="yes" id="F199F3E5-B20E-411F-965A-DE7556702EBF" width="433" height="251" src="cid:1DB67A56-A5EF-4F25-9334-EAC06C17B9C8" class=""></div><div class=""><br class=""></div><div class="">So preferably I would handle this "delay / get back to realtime without losing data" functionality in the "Delayer" component.</div><div class=""><br class=""></div><div class="">Best,</div><div class="">Peter</div><div class=""><br class=""><div><br class=""><blockquote type="cite" class=""><div class="">On 2022. 
Feb 11., at 16:49, Nicolas Dufresne <<a href="mailto:nicolas@ndufresne.ca" class="">nicolas@ndufresne.ca</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><div class="">On Friday, 11 February 2022 at 15:11 +0100, Peter Biro via gstreamer-devel<br class="">wrote:<br class=""><blockquote type="cite" class="">Hi all,<br class=""><br class="">In my application I need to record streaming video (and audio) while I<br class="">also have to include some time from before the recording is initiated by the user.<br class="">Currently I'm doing this with a small C++ application which executes the<br class="">following pipeline:<br class=""><br class="">udpsrc multicast-group=<VIDEO_SRC_IP> auto-multicast=true<br class="">port=<VIDEO_SRC_PORT><br class=""> ! application/x-rtp,media=video,encoding-name=VP8,payload=96 !<br class="">rtpjitterbuffer ! rtpvp8depay ! tee name=video_stream_spilt<br class=""> udpsrc port=<AUDIO_SRC_PORT><br class=""> ! application/x-rtp,media=audio,clock-rate=44100,encoding-<br class="">name=L24,encoding-params=1,channels=1,payload=96,ssrc=687131883,timestamp-<br class="">offset=3784732336,seqnum-offset=8272<br class=""> ! rtpL24depay ! audioconvert ! audioresample ! queue<br class="">name=file_sink_audio_queue ! voaacenc ! queue name=q2 max-size-bytes=0 max-<br class="">size-buffers=0 max-size-time=0 ! aacparse ! faad ! audioresample !<br class="">audioconvert ! voaacenc ! video_mux.<br class=""> video_stream_spilt. ! queue name=file_sink_video_queue ! omxvp8dec !<br class="">videoconvert ! omxh264enc bitrate=8000000 control-rate=2 insert-sps-pps=true !<br class="">matroskamux name=video_mux<br class=""> ! filesink name=file_sink location=<OUTPUT_FILENAME><br class=""> video_stream_spilt. ! 
fakesink<br class=""><br class="">In the C++ application logic I configure the "max-size-time" on the<br class="">"file_sink_video_queue" and "file_sink_audio_queue" queues to retain video and audio<br class="">data from the past, and block the src pads on the queues. When the record<br class="">start event arrives it waits for the first keyframe in the video and then<br class="">unblocks the pads to start recording the audio and video data into the<br class="">file.<br class=""><br class="">I would like to simplify this application by splitting it into two simpler<br class="">applications. First by adding a new pipeline running in the middle which would<br class="">be responsible for delaying and returning the stream to realtime. Then<br class="">a simple recorder component which receives the audio and video streams at the<br class="">end.<br class=""><br class="">So basically what I would like to achieve is that instead<br class="">of blocking/unblocking the queues I would have a "delayed stream", and I'm<br class="">looking for an event that I can send to the pipeline (or some other way) to<br class="">reduce the delay to realtime again (or minimize it to close to zero) in a way<br class="">that flushes / sends out the data from the queue instead of discarding<br class="">it.<br class=""></blockquote><br class="">My recommendation would be to split this into 3 GStreamer pipelines: receiver,<br class="">player, recorder.<br class=""><br class="">The receiver would receive, depay and, if needed, decode the audio/video streams,<br class="">which are then exposed as appsinks (one per stream).<br class=""><br class="">The video player would consume the data from the appsinks and render it to a<br class="">display. This is going to be live playback. 
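<br class=""><br class="">As a rough illustration, the receiver idea could look something like the pipeline description below (a minimal sketch only: it reuses the caps and element names from the original pipeline, while the appsink names are hypothetical; in a real application you would build this with gst_parse_launch() and pull samples from the named appsinks rather than run it as-is):<br class=""><br class="">

```
# Hypothetical receiver: depay both RTP streams and hand them to appsinks.
# <VIDEO_SRC_IP>, <VIDEO_SRC_PORT> and <AUDIO_SRC_PORT> are placeholders,
# as in the original pipeline.
udpsrc multicast-group=<VIDEO_SRC_IP> auto-multicast=true port=<VIDEO_SRC_PORT>
  ! application/x-rtp,media=video,encoding-name=VP8,payload=96
  ! rtpjitterbuffer ! rtpvp8depay ! appsink name=video_out
udpsrc port=<AUDIO_SRC_PORT>
  ! application/x-rtp,media=audio,clock-rate=44100,encoding-name=L24,encoding-params=1,channels=1,payload=96
  ! rtpL24depay ! appsink name=audio_out
```

<br class="">The controller would then pull samples from video_out/audio_out and push them into the appsrc elements of whichever downstream pipeline (player or recorder) is active.<br class="">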
You can of course at any time change<br class="">the player pipeline and play from a file, though to make that work you would<br class="">probably have to rework your storage, perhaps using splitmuxsink instead.<br class=""><br class="">The recorder will also be fed from the receiver's appsink data, but will<br class="">transcode, mux and store to file. Using splitmuxsink will save you the burden of<br class="">muxing, but also of dealing with video synchronization points.<br class=""><br class="">The fourth component of your software will be the controller, whose role will be<br class="">to watch for data from the appsinks and pass it to the active appsrc element.<br class="">The only catch: you need to work out the segment/timestamps for feeding the non-live<br class="">recording pipeline.<br class=""><br class="">Nicolas<br class=""><blockquote type="cite" class=""><br class="">I found this example:<br class=""><a href="https://stackoverflow.com/questions/32908047/gstreamer-increase-and-decrease-delay-on-the-fly" class="">https://stackoverflow.com/questions/32908047/gstreamer-increase-and-decrease-delay-on-the-fly</a><br class=""><br class="">But this covers a slightly different use case: with this approach, when I reset the queue<br class="">params it jumps to the present and drops all the data in the buffer.<br class=""><br class="">Do you have any idea how I could solve this?<br class=""><br class="">Thank you for your help!<br class=""><br class="">Best,<br class="">Peter<br class=""></blockquote><br class=""></div></div></blockquote></div><br class=""></div></body></html>