Get back to realtime without losing data

Nicolas Dufresne nicolas at ndufresne.ca
Fri Feb 11 15:49:35 UTC 2022


On Friday, 11 February 2022 at 15:11 +0100, Peter Biro via gstreamer-devel
wrote:
> Hi all,
> 
> In my application I need to record streaming video (and audio), and I also
> have to include some time from before the recording is initiated by the user.
> Currently I'm doing this with a small C++ application which executes the
> following pipeline:
> 
> udpsrc multicast-group=<VIDEO_SRC_IP> auto-multicast=true port=<VIDEO_SRC_PORT>
>         ! application/x-rtp,media=video,encoding-name=VP8,payload=96
>         ! rtpjitterbuffer ! rtpvp8depay ! tee name=video_stream_spilt
>         udpsrc port=<AUDIO_SRC_PORT>
>         ! application/x-rtp,media=audio,clock-rate=44100,encoding-name=L24,encoding-params=1,channels=1,payload=96,ssrc=687131883,timestamp-offset=3784732336,seqnum-offset=8272
>         ! rtpL24depay ! audioconvert ! audioresample ! queue name=file_sink_audio_queue
>         ! voaacenc ! queue name=q2 max-size-bytes=0 max-size-buffers=0 max-size-time=0
>         ! aacparse ! faad ! audioresample ! audioconvert ! voaacenc ! video_mux.
>         video_stream_spilt. ! queue name=file_sink_video_queue ! omxvp8dec ! videoconvert
>         ! omxh264enc bitrate=8000000 control-rate=2 insert-sps-pps=true
>         ! matroskamux name=video_mux
>         ! filesink name=file_sink location=<OUTPUT_FILENAME>
>         video_stream_spilt. ! fakesink
> 
> In the C++ application logic I configure the "max-size-time" on the
> "file_sink_video_queue" and "file_sink_audio_queue" queues to hold video and
> audio data from the past, and block the src pads on those queues. When the
> record start event arrives, it waits for the first keyframe in the video and
> then unblocks the pads to start recording the audio and video data into the
> file.
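
A minimal sketch of that block/unblock mechanism for reference, using the queue
name from the pipeline above and leaving out the keyframe check; this is only
illustrative, not the actual application code:

#include <gst/gst.h>

static gulong block_id;

/* On a blocking probe, returning OK keeps the data waiting inside the queue. */
static GstPadProbeReturn
hold_data (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  return GST_PAD_PROBE_OK;
}

/* At setup time: block the recording queue's src pad so it retains the last
 * max-size-time worth of data. */
static void
arm_prerecord_buffer (GstElement *pipeline)
{
  GstElement *q = gst_bin_get_by_name (GST_BIN (pipeline), "file_sink_video_queue");
  GstPad *srcpad = gst_element_get_static_pad (q, "src");

  block_id = gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
                                hold_data, NULL, NULL);
  gst_object_unref (srcpad);
  gst_object_unref (q);
}

/* On the record-start event (after the keyframe check), release the block. */
static void
start_recording (GstElement *pipeline)
{
  GstElement *q = gst_bin_get_by_name (GST_BIN (pipeline), "file_sink_video_queue");
  GstPad *srcpad = gst_element_get_static_pad (q, "src");

  gst_pad_remove_probe (srcpad, block_id);
  gst_object_unref (srcpad);
  gst_object_unref (q);
}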
> 
> I would like to simplify this application by splitting it into two simpler
> applications: first, a new pipeline running in the middle which would be
> responsible for delaying the stream and returning it to realtime, and then
> a simple recorder component which receives the audio and video streams at the
> end.
> 
> So basically, instead of blocking/unblocking the queues, I would have a
> "delayed stream", and I'm searching for an event that I can send to the
> pipeline (or some other way) to reduce the delay back to realtime (or
> minimize it to close to zero) in a way that flushes / sends out the data from
> the queue instead of discarding it.

My recommendation would be to split this into three GStreamer pipelines: a
receiver, a player and a recorder.

The receiver would receive, depayload and, if needed, decode the audio/video
streams, which are then exposed through appsink elements (one per stream).
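
A rough sketch of such a receiver, with placeholder ports, caps and names (not
a tested pipeline):

#include <gst/gst.h>

/* Receiver sketch: depayload each RTP stream and hand it to an appsink. */
static GstElement *
make_receiver (void)
{
  return gst_parse_launch (
      "udpsrc port=5000 "
      "  caps=\"application/x-rtp,media=video,encoding-name=VP8,payload=96\" "
      "  ! rtpjitterbuffer ! rtpvp8depay "
      "  ! appsink name=video_out emit-signals=true "
      "udpsrc port=5002 "
      "  caps=\"application/x-rtp,media=audio,clock-rate=44100,encoding-name=L24,"
      "encoding-params=1,channels=1,payload=96\" "
      "  ! rtpjitterbuffer ! rtpL24depay "
      "  ! appsink name=audio_out emit-signals=true",
      NULL);
}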

The video player would consume the data from the appsinks and render it to a
display. This is going to be live playback. You can of course change the player
pipeline at any time and play from file, though to make that work you would
probably have to rework your storage, perhaps by using splitmuxsink instead.
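
A live player sketch along those lines, fed through appsrc; the decoder and
sink choices are only examples:

#include <gst/gst.h>

/* Player sketch: live playback fed by the controller through appsrc. */
static GstElement *
make_player (void)
{
  return gst_parse_launch (
      "appsrc name=video_in is-live=true format=time caps=video/x-vp8 "
      "  ! vp8dec ! videoconvert ! autovideosink "
      "appsrc name=audio_in is-live=true format=time "
      "  caps=\"audio/x-raw,format=S24BE,layout=interleaved,rate=44100,channels=1\" "
      "  ! audioconvert ! audioresample ! autoaudiosink",
      NULL);
}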

The recorder will also be fed with data from the receiver appsinks, but will
transcode, mux and store it to file. Using splitmuxsink will save you the burden
of muxing, but also of dealing with video synchronization points (keyframes).
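
A recorder sketch using splitmuxsink; the re-encode to H.264 mirrors your
current pipeline, but every element and property here is only an example:

#include <gst/gst.h>

/* Recorder sketch: transcode and store through splitmuxsink, which starts
 * every fragment on a keyframe and does the muxing for you (10 s fragments,
 * default mp4 muxer; the muxer is configurable on splitmuxsink). */
static GstElement *
make_recorder (void)
{
  return gst_parse_launch (
      "appsrc name=video_in is-live=true format=time caps=video/x-vp8 "
      "  ! vp8dec ! videoconvert ! x264enc key-int-max=30 ! h264parse "
      "  ! splitmuxsink name=smux location=recording-%05d.mp4 "
      "    max-size-time=10000000000 "
      "appsrc name=audio_in is-live=true format=time "
      "  caps=\"audio/x-raw,format=S24BE,layout=interleaved,rate=44100,channels=1\" "
      "  ! audioconvert ! voaacenc ! aacparse ! queue ! smux.audio_0",
      NULL);
}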

The fourth component of your software will be the controller, whose role will
be to watch for data from the appsinks and pass it to the active appsrc
elements. The only catch is that you need to work out the segment/timestamps
when feeding the non-live recording pipeline.
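
One way to do the forwarding is from the appsink "new-sample" callback,
rebasing timestamps with a single shared offset so audio and video stay
aligned. A minimal sketch, with error handling, locking and caps propagation
(gst_app_src_set_caps) left out:

#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>

static GstClockTime rec_base = GST_CLOCK_TIME_NONE;  /* shared PTS offset */

/* Connected to an appsink's "new-sample" signal with the matching appsrc as
 * user_data, e.g.:
 *   g_signal_connect (video_out, "new-sample",
 *                     G_CALLBACK (forward_sample), video_in);
 */
static GstFlowReturn
forward_sample (GstAppSink *sink, gpointer user_data)
{
  GstAppSrc *dst = GST_APP_SRC (user_data);
  GstSample *sample = gst_app_sink_pull_sample (sink);
  GstBuffer *buf =
      gst_buffer_make_writable (gst_buffer_ref (gst_sample_get_buffer (sample)));

  /* Rebase timestamps so the non-live recording pipeline starts at 0. */
  if (!GST_CLOCK_TIME_IS_VALID (rec_base))
    rec_base = GST_BUFFER_PTS (buf);
  GST_BUFFER_PTS (buf) -= rec_base;
  GST_BUFFER_DTS (buf) = GST_CLOCK_TIME_NONE;

  gst_app_src_push_buffer (dst, buf);   /* takes ownership of buf */
  gst_sample_unref (sample);
  return GST_FLOW_OK;
}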

Nicolas
> 
> I found this example:
> https://stackoverflow.com/questions/32908047/gstreamer-increase-and-decrease-delay-on-the-fly
> 
> But this covers a slightly different use case: with that approach, when I
> reset the queue params it jumps to the present and drops all the data in the
> buffer.
> 
> Do you have any idea how I could solve this?
> 
> Thank you for your help!
> 
> Bests,
> Peter


