Instant Replay with Gstreamer

Todd Agulnick todd at agulnick.com
Sat Dec 28 22:58:27 PST 2013


Sebastian,

Thanks for your response. As you surmised, the problem is that I'm not
finalizing the MP4 file -- but it's not for lack of trying.

Inspired by the example in the App Developers Manual, I'd been trying to
finalize the file by sending an EOS through the bin that contains the mux
and filesink. I can make that work, but the EOS also propagates up and shuts
down the whole pipeline, which isn't what I want. So then I found that I
could override the bin's handle_message method to discard the EOS before it
bubbled up to the pipeline. It works, but what a mess!
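
In case it helps, the override looks roughly like this (a trimmed-down
sketch of what I have, not the exact code; RecordBin is just a placeholder
name for the GstBin subclass):

#include <gst/gst.h>

typedef struct _RecordBin { GstBin parent; } RecordBin;
typedef struct _RecordBinClass { GstBinClass parent_class; } RecordBinClass;

G_DEFINE_TYPE (RecordBin, record_bin, GST_TYPE_BIN);

static void
record_bin_handle_message (GstBin * bin, GstMessage * message)
{
  if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_EOS) {
    /* The mux/filesink have drained; swallow the EOS so it never reaches
     * the pipeline's bus and stops everything. */
    gst_message_unref (message);
    return;
  }

  /* Everything else goes through the normal GstBin handling. */
  GST_BIN_CLASS (record_bin_parent_class)->handle_message (bin, message);
}

static void
record_bin_class_init (RecordBinClass * klass)
{
  GST_BIN_CLASS (klass)->handle_message = record_bin_handle_message;
}

static void
record_bin_init (RecordBin * self)
{
}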

So I was happy to see the solution that you linked below that shows the
code simply setting the bin to GST_STATE_NULL to trigger the finalizing of
the file. But I can't seem to get that to work in my app. The critical
difference seems to be that as soon as I decide that the muxer has seen all
the data intended for the current file, I want to block the source pad on
the queue upstream of the bin so that the queue resumes accumulating video.
And my impression is that if I'm blocking buffers on that pad
(with GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_BUFFER), the transition
to NULL state doesn't finalize the file -- the elements transition to
GST_STATE_NULL, but the finalization code isn't called.

I understand that the bin is operating in a thread created by the upstream
queue, so I can sort of see how blocking the queue's source pad might cause
problems for the bin. But if I unblock the queue's source pad, then data is
going to need to stream somewhere and that's not what I want. So I feel
like I'm stuck.

The critical bits of code are:

static GstPadProbeReturn
blockpad_probe_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  app_data *app = user_data;

  /* The queue's source pad is now blocked; try to finalize the file by
   * taking the bin (mux ! filesink) to NULL and pulling it out of the
   * pipeline. This is the part that never finalizes the MP4. */
  gst_element_set_state (app->bin, GST_STATE_NULL);
  gst_bin_remove (GST_BIN (app->pipeline), app->bin);

  /* Keep the probe installed so the pad stays blocked and the queue goes
   * back to accumulating video. */
  return GST_PAD_PROBE_OK;
}

void
block_pipeline (app_data * app)
{
  g_print ("Blocking pipeline...\n");

  /* Block on the next buffer that arrives at the queue's source pad. */
  app->blockpad_probe_id = gst_pad_add_probe (app->blockpad,
      GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_BUFFER,
      blockpad_probe_cb, app, NULL);
}
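
For context, blockpad is the source pad of the queue that feeds the bin,
and the bin wraps the muxer and filesink. The setup looks roughly like the
following (a simplified sketch with error checking omitted; the element and
file names are placeholders):

static void
setup_recording_bin (app_data * app)
{
  GstElement *queue, *mux, *sink;
  GstPad *muxpad;

  /* The queue that sits after x264enc and accumulates encoded video. */
  queue = gst_bin_get_by_name (GST_BIN (app->pipeline), "enc_q");
  app->blockpad = gst_element_get_static_pad (queue, "src");

  /* The bin that gets torn down per request: mp4mux ! filesink. */
  app->bin = gst_bin_new ("recbin");
  mux = gst_element_factory_make ("mp4mux", "mux");
  sink = gst_element_factory_make ("filesink", "sink");
  g_object_set (sink, "location", "replay.mp4", NULL);

  gst_bin_add_many (GST_BIN (app->bin), mux, sink, NULL);
  gst_element_link (mux, sink);

  /* Expose the muxer's request pad as a ghost pad so the queue can link
   * straight to the bin. */
  muxpad = gst_element_get_request_pad (mux, "video_%u");
  gst_element_add_pad (app->bin, gst_ghost_pad_new ("sink", muxpad));
  gst_object_unref (muxpad);

  gst_bin_add (GST_BIN (app->pipeline), app->bin);
  gst_element_link (queue, app->bin);
  gst_object_unref (queue);
}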


And the tail of the log looks like this (note the absence of the message
"Updating remaining values and sending last data", which the muxer emits
when it properly finalizes a file):

0:00:21.687079000 88586 0x7fe8b40f48a0 LOG   qtmux gstqtmux.c:2442:gst_qt_mux_handle_buffer:<mux> selected pad video_0 with time 0:00:19.766666666
0:00:21.687123000 88586 0x7fe8b40f48a0 LOG   qtmux gstqtmux.c:2315:gst_qt_mux_add_buffer:<mux> Pad (video_0) dts updated to 0:00:19.666666666
0:00:21.687142000 88586 0x7fe8b40f48a0 LOG   qtmux gstqtmux.c:2319:gst_qt_mux_add_buffer:<mux> Adding 1 samples to track, duration: 100 size: 6145 chunk offset: 1665581
0:00:21.687157000 88586 0x7fe8b40f48a0 DEBUG qtmux gstqtmux.c:2347:gst_qt_mux_add_buffer: dts: 0:00:19.633333333 pts: 0:00:19.599999999 timebase_dts: 58900 pts_offset: -100
0:00:21.687178000 88586 0x7fe8b40f48a0 LOG   qtmux gstqtmux.c:1203:gst_qt_mux_send_buffer:<mux> sending buffer size 6145
0:00:21.687188000 88586 0x7fe8b40f48a0 LOG   qtmux gstqtmux.c:1219:gst_qt_mux_send_buffer:<mux> downstream
0:00:21.720409000 88586 0x7fe8b40f48a0 DEBUG qtmux gstqtmux.c:3232:gst_qt_mux_release_pad:<mux> Releasing mux:video_0
0:00:21.720431000 88586 0x7fe8b40f48a0 DEBUG qtmux gstqtmux.c:3236:gst_qt_mux_release_pad: Checking mux:video_0


Also, to answer some of your questions:

 * yes, the source is "is-live"
 * I'm setting the queue's max-size-bytes to 10 MB for now, with
leaky=downstream (rough snippet after this list)
 * I haven't implemented the key frame check yet, but I see how to do that
 * what is avc stream format? Does that mean setting x264enc
byte-stream=true?
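
For reference, those queue settings look roughly like this ("queue" here is
just whatever handle the app keeps on that element):

  /* Leaky queue holding the recent encoded video: cap it by bytes and
   * drop the oldest data as new data arrives. */
  g_object_set (queue,
      "max-size-bytes", 10 * 1024 * 1024,   /* ~10 MB for now */
      "leaky", 2,                           /* downstream */
      NULL);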

Thanks again for any enlightenment you can offer; I really appreciate it.

-Todd


On Sat, Dec 28, 2013 at 6:53 AM, Sebastian Dröge
<sebastian at centricular.com> wrote:

> On Fr, 2013-12-27 at 10:04 -0500, Todd Agulnick wrote:
> > Hi,
> >
> > I'm trying to build a server application that allows a client to request
> a
> > video covering a specific time range in the recent past from a camera
> > attached to the server.
> >
> > My plan is to block the source pad of a leaky queue which is set up to
> hold
> > ~ 1 GB of encoded video, with the queue always running full and dropping
> > old video as the new video arrives. The first part of the pipeline looks
> > something like:
> >
> > videotestsrc ! videoconvert ! queue ! x264enc ! queue
> >
> > On a signal from the client, the controlling app would unblock the
> > queue's source pad, inspecting each buffer as it comes through: drop any
> > buffers that are before the requested time window, pass through to a mux
> > and filesink any buffers within the requested window, and send along an
> EOS
> > to wrap things up and block the queue again as soon as it sees a buffer
> > timestamped after the requested window. (Requests will always be
> > non-overlapping and in chronological order.)
> >
> > To handle a request, the app would append to the above pipeline:
> >
> > mp4mux ! filesink
> >
> > ... and then tear those elements down again to wait for the next request.
> >
> > My questions are:
> >
> >  * Are blocking probes the right tool here? The documentation suggests
> that
> > blocking probes are really intended for short-term blockages, but in this
> > system the normal state is that the stream is blocked and accumulating in
> > the queue. I worry whether this will have unexpected consequences, and
> > indeed my experiments thus far have failed to produce a proper MP4 file.
>
> Is your source creating data in real time? What limits are you using on
> the queue? In general this should work if the source is creating data in
> real time.
>
> How did your experiments fail so far? Did you make sure the MP4 files
> are finalized before trying to play them back (i.e. the muxer got the
> EOS event and rewrote the headers)? MP4 does not support streaming
> unfinished files. If you need to stream the files to the client
> "immediately" before finishing them you have to implement something like
> DASH with fragmented MP4 files.
>
> Do you make sure that the first h264 data you pass to the muxer contains
> a keyframe and that the h264 data is in avc stream format?
>
> >  * Do you know of any examples of similar problems and solutions on the
> 1.x
> > codebase? The only examples I've been able to find are pre-1.x. (For
> > example, this one:
> >
> http://gstreamer-devel.966125.n4.nabble.com/Dynamically-updating-filesink-location-at-run-time-on-the-fly-tt4660569.html#a4660577
> > )
>
> Yes, check this for example:
>
> http://lists.freedesktop.org/archives/gstreamer-devel/2013-December/045246.html
>
> This code is creating a new muxer and filesink every now and then.
>
> --
> Sebastian Dröge, Centricular Ltd - http://www.centricular.com
> Expertise, Straight from the Source
>
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
>
>