Instant Replay with GStreamer

Sebastian Dröge sebastian at centricular.com
Sat Dec 28 03:53:50 PST 2013


On Fri, 2013-12-27 at 10:04 -0500, Todd Agulnick wrote:
> Hi,
> 
> I'm trying to build a server application that allows a client to request a
> video covering a specific time range in the recent past from a camera
> attached to the server.
> 
> My plan is to block the source pad of a leaky queue which is set up to hold
> ~ 1 GB of encoded video, with the queue always running full and dropping
> old video as the new video arrives. The first part of the pipeline looks
> something like:
> 
> videotestsrc ! videoconvert ! queue ! x264enc ! queue
> 
> On a signal from the client, the controlling app would unblock the
> queue's source pad, inspecting each buffer as it comes through: drop any
> buffers that are before the requested time window, pass through to a mux
> and filesink any buffers within the requested window, and send along an EOS
> to wrap things up and block the queue again as soon as it sees a buffer
> timestamped after the requested window. (Requests will always be
> non-overlapping and in chronological order.)
> 
> To handle a request, the app would append to the above pipeline:
> 
> mp4mux ! filesink
> 
> ... and then tear those elements down again to wait for the next request.
> 
> My questions are:
> 
>  * Are blocking probes the right tool here? The documentation suggests that
> blocking probes are really intended for short-term blockages, but in this
> system the normal state is that the stream is blocked and accumulating in
> the queue. I worry whether this will have unexpected consequences, and
> indeed my experiments thus far have failed to produce a proper MP4 file.

Is your source producing data in real time, and what limits did you set
on the queue? In general this approach should work, provided the source
really is producing data in real time.
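
Something like the following should work for the probe part. This is
only a rough, untested sketch: "start", "stop" and "block_id" are
placeholders for your own request state, and mapping the requested
wall-clock times to the buffer timestamps is up to you.

#include <gst/gst.h>

/* Requested window in the same time base as the buffer timestamps,
 * and the id of the blocking probe while no request is active. */
static GstClockTime start, stop;
static gulong block_id;

/* Idle state: keep the first buffer (and with it the queue's streaming
 * thread) blocked so the leaky queue keeps accumulating data. */
static GstPadProbeReturn
block_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  return GST_PAD_PROBE_OK;
}

/* Added as a plain GST_PAD_PROBE_TYPE_BUFFER probe on the queue's src
 * pad when a request comes in, right before the blocking probe is
 * removed. */
static GstPadProbeReturn
filter_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  GstClockTime pts = GST_BUFFER_PTS (buf);
  GstPad *peer;

  if (pts < start)
    return GST_PAD_PROBE_DROP;        /* before the requested window */

  if (pts < stop)
    return GST_PAD_PROBE_OK;          /* inside the window, goes to the muxer */

  /* First buffer after the window: block the queue again, push EOS
   * towards the muxer so it finalizes the file, and remove this probe. */
  block_id = gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
      block_cb, NULL, NULL);

  peer = gst_pad_get_peer (pad);
  gst_pad_send_event (peer, gst_event_new_eos ());
  gst_object_unref (peer);

  gst_pad_remove_probe (pad, GST_PAD_PROBE_INFO_ID (info));
  return GST_PAD_PROBE_DROP;
}

At startup you would install block_cb on the queue's src pad with
GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM and keep the returned id in
block_id; data then piles up in the leaky queue until a request
arrives.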

How did your experiments fail so far? Did you make sure the MP4 files
are finalized before trying to play them back (i.e. the muxer got the
EOS event and rewrote the headers)? MP4 does not support streaming
unfinished files. If you need to stream the files to the client
"immediately", before they are finalized, you would have to implement
something like DASH with fragmented MP4 files.
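
In other words, after filter_cb (or whatever sends the EOS) has pushed
the EOS towards mp4mux, wait until it actually reached the filesink
before you use the file or remove the elements. A minimal sketch,
assuming the filesink is the only sink element in the pipeline (so the
pipeline posts an EOS message on the bus once the muxer is done) and
"pipeline" is your pipeline:

GstBus *bus = gst_element_get_bus (pipeline);
GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
    GST_MESSAGE_EOS | GST_MESSAGE_ERROR);

if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_EOS) {
  /* mp4mux has rewritten the headers, the file can be handed to the
   * client now */
}

gst_message_unref (msg);
gst_object_unref (bus);

If you already have a bus watch running, handle GST_MESSAGE_EOS there
instead of polling the bus like this.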

Do you make sure that the first H.264 data you pass to the muxer
contains a keyframe, and that the H.264 data is in AVC stream format
(stream-format=avc)?
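
For the keyframe part you can extend the buffer probe above to also
drop everything up to the first keyframe inside the window, e.g.
("seen_keyframe" would be per-request state that you reset for every
new request):

/* In filter_cb, after the "pts < start" check: */
if (!seen_keyframe) {
  if (GST_BUFFER_FLAG_IS_SET (buf, GST_BUFFER_FLAG_DELTA_UNIT))
    return GST_PAD_PROBE_DROP;        /* still waiting for a keyframe */
  seen_keyframe = TRUE;
}

For the stream format, pin the caps between x264enc and the queue
(with a capsfilter if you build the pipeline in code), e.g.

x264enc key-int-max=30 ! video/x-h264,stream-format=avc,alignment=au ! queue ...

A small key-int-max also makes sure that there is always a keyframe
reasonably close to the start of the requested window.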

>  * Do you know of any examples of similar problems and solutions on the 1.x
> codebase? The only examples I've been able to find are pre-1.x. (For
> example, this one:
> http://gstreamer-devel.966125.n4.nabble.com/Dynamically-updating-filesink-location-at-run-time-on-the-fly-tt4660569.html#a4660577
> )

Yes, check this for example:
http://lists.freedesktop.org/archives/gstreamer-devel/2013-December/045246.html

The code in that example creates a new muxer and filesink every now and
then.
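
For your pipeline the request handling could look roughly like this
(again only a sketch; "filter_cb" and "block_id" are from the probe
sketch above, everything else is a placeholder):

static void
start_request (GstElement * pipeline, GstElement * queue,
    const gchar * location)
{
  GstElement *mux, *sink;
  GstPad *srcpad;

  /* start, stop and any per-request state (e.g. seen_keyframe) would
   * be filled in from the client request before this point. */

  mux = gst_element_factory_make ("mp4mux", NULL);
  sink = gst_element_factory_make ("filesink", NULL);
  g_object_set (sink, "location", location, NULL);

  gst_bin_add_many (GST_BIN (pipeline), mux, sink, NULL);

  /* The queue's src pad already has H.264 caps at this point, so
   * linking requests a video pad from mp4mux. */
  gst_element_link_many (queue, mux, sink, NULL);

  gst_element_sync_state_with_parent (sink);
  gst_element_sync_state_with_parent (mux);

  /* Install the filtering probe first, then release the block so the
   * queued data starts flowing into the muxer. */
  srcpad = gst_element_get_static_pad (queue, "src");
  gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BUFFER, filter_cb,
      NULL, NULL);
  gst_pad_remove_probe (srcpad, block_id);
  gst_object_unref (srcpad);
}

Once the EOS message from above arrived you set mux and sink to NULL
state, remove them from the pipeline with gst_bin_remove(), and wait
for the next request.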

-- 
Sebastian Dröge, Centricular Ltd - http://www.centricular.com
Expertise, Straight from the Source