Get back to realtime without losing data

Peter Biro pettair at gmail.com
Mon Feb 14 22:03:19 UTC 2022


Hi

Basically, I can simplify my use case to this question:

Is there any option to reduce the 'min-threshold-time' property on a queue to (close to) 0 without losing data?


So the ideal behaviour would be to send all the frames from the buffer in a burst when I reset the delay to 0. The recorder component would be overloaded for a while, but given enough memory it should catch up over time.
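
A plain (non-leaky) queue should already behave this way: queue only discards buffers when its leaky property is set, so dropping min-threshold-time to 0 at runtime should release the backlog downstream in a burst rather than discard it. A minimal python sketch of the idea (ports, caps and element names are illustrative assumptions, not taken from the thread):

    #!/usr/bin/env python3
    # Sketch: flip a delaying queue back to realtime without losing data.
    # Assumes the delay is a plain (non-leaky) queue named "delay_queue".
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    pipeline = Gst.parse_launch(
        "udpsrc port=5000 "
        "! application/x-rtp,media=video,encoding-name=VP8,payload=96 "
        "! queue name=delay_queue max-size-time=0 max-size-buffers=0 "
        "max-size-bytes=0 min-threshold-time=5000000000 "  # hold back ~5 s
        "! udpsink host=127.0.0.1 port=5001")
    pipeline.set_state(Gst.State.PLAYING)

    def go_realtime():
        # A non-leaky queue never drops buffers; once the threshold is 0
        # it simply stops holding data back and the backlog drains out.
        delay_queue = pipeline.get_by_name("delay_queue")
        delay_queue.set_property("min-threshold-time", 0)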

Bests,
Peter


> On 2022. Feb 11., at 21:09, Peter Biro <pettair at gmail.com> wrote:
> 
> Hi,
> 
> Wow, thanks for the fast reply!
> 
> I would like to keep these components as separate applications (we already
> have this architecture, with UDP multicast streams between the components),
> so when I referred to separating them into individual pipelines I meant
> separate processes.
> 
> My thought process was that I want to replace our code with standard
> GStreamer elements as much as possible (no code is the best code :D).
> GStreamer is so feature-rich that most probably my use cases could be
> covered with out-of-the-box elements (maybe even with a simple
> gst-launch-1.0 component, or a small python wrapper around it).
> 
> [attached image: PastedGraphic-2.png]
> 
> So preferably I would handle this "delay / get back to realtime without losing data" functionality in the "Delayer" component.
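
If a fixed re-send delay is enough, the buffering part of such a Delayer could indeed be a plain gst-launch-1.0 pipeline in the style of the one quoted below (a sketch; the 5 s threshold and the <...> placeholders are illustrative):

    gst-launch-1.0 udpsrc multicast-group=<VIDEO_SRC_IP> auto-multicast=true port=<VIDEO_SRC_PORT> \
        ! queue name=delay_queue max-size-time=0 max-size-buffers=0 max-size-bytes=0 min-threshold-time=5000000000 \
        ! udpsink host=<RECORDER_IP> port=<RECORDER_PORT>

Getting back to realtime then only requires resetting min-threshold-time to 0 from a small wrapper, as in the python sketch above.
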
> 
> Bests,
> Peter
> 
> 
>> On 2022. Feb 11., at 16:49, Nicolas Dufresne <nicolas at ndufresne.ca> wrote:
>> 
>> On Friday, February 11, 2022 at 15:11 +0100, Peter Biro via gstreamer-devel
>> wrote:
>>> Hi all,
>>> 
>>> In my application I need to record streaming video (and audio), and I
>>> also have to include some time from before the recording is initiated by
>>> the user. Currently I'm doing this with a small C++ application which
>>> executes the following pipeline:
>>> 
>>> udpsrc multicast-group=<VIDEO_SRC_IP> auto-multicast=true port=<VIDEO_SRC_PORT>
>>>     ! application/x-rtp,media=video,encoding-name=VP8,payload=96
>>>     ! rtpjitterbuffer ! rtpvp8depay ! tee name=video_stream_spilt
>>> udpsrc port=<AUDIO_SRC_PORT>
>>>     ! application/x-rtp,media=audio,clock-rate=44100,encoding-name=L24,encoding-params=1,channels=1,payload=96,ssrc=687131883,timestamp-offset=3784732336,seqnum-offset=8272
>>>     ! rtpL24depay ! audioconvert ! audioresample ! queue name=file_sink_audio_queue
>>>     ! voaacenc ! queue name=q2 max-size-bytes=0 max-size-buffers=0 max-size-time=0
>>>     ! aacparse ! faad ! audioresample ! audioconvert ! voaacenc ! video_mux.
>>> video_stream_spilt. ! queue name=file_sink_video_queue ! omxvp8dec ! videoconvert
>>>     ! omxh264enc bitrate=8000000 control-rate=2 insert-sps-pps=true
>>>     ! matroskamux name=video_mux ! filesink name=file_sink location=<OUTPUT_FILENAME>
>>> video_stream_spilt. ! fakesink
>>> 
>>> In the C++ application logic I configure "max-size-time" on the
>>> "file_sink_video_queue" and "file_sink_audio_queue" queues to buffer
>>> video and audio data from the past, and I block the src pads of the
>>> queues. When the record start event arrives, the application waits for
>>> the first keyframe in the video and then unblocks the pads to start
>>> recording the audio and video data into the file.
>>> 
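
For reference, this block / wait-for-keyframe / unblock logic maps directly onto pad probes. A minimal python sketch (the queue name follows the pipeline above; the rest is illustrative, not the poster's actual code):

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    def block(queue_srcpad):
        # Hold all data back at the queue's src pad until recording starts.
        return queue_srcpad.add_probe(
            Gst.PadProbeType.BLOCK_DOWNSTREAM,
            lambda pad, info: Gst.PadProbeReturn.OK)

    def start_recording(queue_srcpad, block_id):
        # Drop buffers until the first keyframe, then let everything flow.
        def wait_keyframe(pad, info):
            buf = info.get_buffer()
            if buf is not None and not buf.has_flags(Gst.BufferFlags.DELTA_UNIT):
                return Gst.PadProbeReturn.REMOVE  # keyframe: stop filtering
            return Gst.PadProbeReturn.DROP        # discard leading delta frames
        # Install the keyframe filter first so nothing slips past it,
        # then release the block so the queued data starts flowing.
        queue_srcpad.add_probe(Gst.PadProbeType.BUFFER, wait_keyframe)
        queue_srcpad.remove_probe(block_id)
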
>>> I would like to simplify this application by splitting it into two
>>> simpler applications: first a new pipeline running in the middle, which
>>> would be responsible for delaying the stream and returning it to
>>> realtime, and then a simple recorder component which receives the audio
>>> and video streams at the end.
>>> 
>>> So basically, what I would like to achieve is that instead of
>>> blocking/unblocking the queues I would have a "delayed stream", and I am
>>> looking for an event I can send to the pipeline (or some other way) to
>>> reduce the delay back to realtime (or close to zero) such that it
>>> flushes / sends out the data from the queue instead of discarding it.
>> 
>> My recommendation would be to split this into 3 GStreamer pipelines: receiver,
>> player, recorder.
>> 
>> The receiver would receive, depay and, if needed, decode the audio/video
>> streams, which are then exposed as appsinks (one per stream).
>> 
>> The video player would consume the data from the appsink and render it to a
>> display. This is going to be live playback. You can of course change the
>> player pipeline at any time and play from file, though to make that work you
>> would probably have to rework your storage, perhaps using splitmuxsink
>> instead.
>> 
>> The recorder will also be fed from the receiver's appsink data, but will
>> transcode, mux and store to file. Using splitmuxsink will save you the burden
>> of muxing, and also of dealing with video synchronization points.
>> 
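
A sketch of what such a splitmuxsink-based recorder tail could look like, in the style of the pipeline quoted above (fragment length, file pattern and the x264enc stand-in for the hardware encoder are illustrative assumptions):

    ... ! x264enc ! h264parse \
        ! splitmuxsink location=recording%05d.mkv muxer=matroskamux max-size-time=10000000000

splitmuxsink starts each fragment on a keyframe, which is what removes the manual wait-for-keyframe handling.
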
>> The fourth component of your software will be the controller, whose role will
>> be to watch for data from the appsink and pass it to the active appsrc
>> element. The only catch: you need to work out the segment/timestamps when
>> feeding the non-live recording pipeline (sketched below).
>> 
>> Nicolas
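
A minimal python sketch of that appsink -> appsrc hand-off, including the timestamp rework mentioned above (all caps, encoders and element names are illustrative assumptions; error handling is omitted):

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    # Receiver: depay/decode and expose the stream on an appsink.
    receiver = Gst.parse_launch(
        "udpsrc port=5000 "
        "! application/x-rtp,media=video,encoding-name=VP8,payload=96 "
        "! rtpjitterbuffer ! rtpvp8depay ! vp8dec "
        "! appsink name=sink emit-signals=true")

    # Recorder: non-live, fed by an appsrc; format=time so the
    # controller owns the timestamps.
    recorder = Gst.parse_launch(
        "appsrc name=src is-live=false format=time "
        "! videoconvert ! x264enc ! h264parse "
        "! splitmuxsink location=rec%05d.mkv muxer=matroskamux")

    appsink = receiver.get_by_name("sink")
    appsrc = recorder.get_by_name("src")
    base_pts = None  # PTS of the first buffer handed to the recorder

    def on_new_sample(sink):
        global base_pts
        sample = sink.emit("pull-sample")
        buf = sample.get_buffer().copy()  # copy so timestamps can be rewritten
        if base_pts is None:
            base_pts = buf.pts
            appsrc.set_property("caps", sample.get_caps())
        # Re-base timestamps so the non-live recording starts at t = 0.
        buf.pts -= base_pts
        buf.dts = Gst.CLOCK_TIME_NONE
        appsrc.emit("push-buffer", buf)
        return Gst.FlowReturn.OK

    appsink.connect("new-sample", on_new_sample)
    receiver.set_state(Gst.State.PLAYING)
    recorder.set_state(Gst.State.PLAYING)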
>>> 
>>> I found this example:
>>> https://stackoverflow.com/questions/32908047/gstreamer-increase-and-decrease-delay-on-the-fly
>>> 
>>> But this covers a slightly different use case: with that approach, when I
>>> reset the queue params it jumps to the present and drops all the data in
>>> the buffer.
>>> 
>>> Do you have any idea how I could solve this?
>>> 
>>> Thank you for your help!
>>> 
>>> Bests,
>>> Peter
>> 
> 
