Audiomixer dropping input when mixing live and non-live sources

Sean DuBois sean at siobud.com
Sun Mar 12 19:21:17 UTC 2017


On Sun, Mar 12, 2017 at 01:41:31PM -0500, Sean DuBois wrote:
> On Sun, Mar 12, 2017 at 10:08:16AM -0400, Nicolas Dufresne wrote:
> > Le 12 mars 2017 4:35 AM, "Sean DuBois" <sean at siobud.com> a écrit :
> >
> > Hey list!
> >
> > I am attempting to combine a mixture of live and non-live sources, but I am
> > having trouble with the audiomixer dropping audio. The following is my
> > example pipeline; the audio from my rtmpsrc is lost.
> > The rtmpsrc is 'live': it is an h264/aac FLV produced on the fly by a
> > remote camera.
> >
> >
> > ```
> > #include <gst/gst.h>
> >
> > int main(int argc, char *argv[]) {
> >   gst_init(&argc, &argv);
> >
> >   auto *loop = g_main_loop_new(NULL, FALSE);
> >   auto *pipeline = gst_parse_launch(
> >       "videotestsrc is-live=true ! compositor name=c ! video/x-raw,width=1280,height=720 ! queue ! autovideosink "
> >       "audiotestsrc volume=0.0 is-live=true ! audiomixer name=a ! queue ! autoaudiosink "
> >       "rtmpsrc location=\"rtmp://localhost/serve/live\" ! decodebin name=d ! videoconvert name=vconv ! queue ! c. d. ! audioconvert name=aconv ! queue ! a.",
> >       NULL);
> >
> >   gst_element_set_state(pipeline, GST_STATE_PLAYING);
> >   g_main_loop_run(loop);
> >
> >   return 0;
> > }
> > ```
> >
> > If I remove `is-live=true` from the videotestsrc and audiotestsrc the audio
> > works.
> > If I add latency=2000000000 to the compositor/audiomixer the audio works.
> >
> > However, I can't add the latency property, because other sources feeding the
> > audiomixer/compositor (RTP) break things very quickly.
> >
> >
> > Can you clarify how it fails for you? You need some latency for this to
> > work, but 2s might be too much. You would need enough latency on the RTP
> > jitter buffer too.
> >
> >
> > One thing I do find peculiar is that the compositor always works, it is just
> > empty; there is some difference in logic/state between the audiomixer and
> > the compositor (where the compositor is the well-behaved one).
> >
> >
> > Video is simpler to deal with, since you can repeat frames without you
> > noticing.
> >
> >
> > I can also add a GST_PAD_PROBE_TYPE_BUFFER probe and add ~2 seconds to the
> > PTS of the raw audio buffers on the audioconvert sink pad, and that fixes it
> > as well. However, I don't understand where those 2 seconds come from. I
> > would like to measure and understand that before I resort to a hack like
> > that.
> >
> > So if anyone has any ideas/can point out what I am doing wrong I would love
> > to hear!
> >
> > thanks
> > _______________________________________________
> > gstreamer-devel mailing list
> > gstreamer-devel at lists.freedesktop.org
> > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
>
> Hi Nicolas, thanks for the quick response
>
> > Can you clarify how it fails for you? You need some latency for this to
> > work, but 2s might bee to much. You would need enough latency on rtp jitter
> > buffer too.
>
> The audiomixer drops all incoming buffers from my rtmpsrc; the output is completely
> silent (but if I add an audiotestsrc it works, proving that the audiomixer itself is working).
>
> How would I know the magnitude of latency I should be adding? It seems
> variable: sometimes 2 seconds works and sometimes it doesn't. Is there a signal
> from the audiomixer pad that I could watch so I can raise the latency until it
> works?
>
> The other issue is that when I do add latency and RTP sources, the
> compositor starts to behave very poorly. That makes sense to me: I don't want any
> latency on my RTP input; if a buffer doesn't arrive in time I am fine with
> discarding it. For my RTMP input, however, I know the video/audio is fine;
> it just seems to have 'fallen behind', because a PadProbe that adds 2
> seconds to it fixes everything. I just don't know what I should be
> measuring to figure out where those 2 seconds come from.
>
>
> Here is my RTP input. However, even if I do get latency+RTP working, it
> still doesn't change the fact that I don't know how much latency to
> add.
> ```
> gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp" ! rtph264depay ! decodebin ! videoconvert ! compositor latency=5000000000 sink_1::xpos=500 name=c ! autovideosink videotestsrc is-live=true ! c.
>
> gst-launch-1.0 videotestsrc ! x264enc speed-preset=veryfast ! rtph264pay ! udpsink port=5000 host="127.0.0.1"
> ```

Also, here is what I mean by a PadProbe that modifies timestamps. This
pipeline works, but I am not sure *why*: the +-3 seconds on audio/video
came purely from me tweaking things. I would love to know why those
magnitudes work (and then use the smallest value possible, or derive it
from the pipeline clock).
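One way to derive it, presumably, is to ask the pipeline what latency it has
actually negotiated once it reaches PLAYING. The sketch below just uses the
standard latency query (`print_pipeline_latency` is my own helper name, and I
have not verified its numbers against the pipeline that follows):

```
#include <gst/gst.h>

/* Sketch: query and print the latency the pipeline reports once it is
 * PLAYING. The minimum is presumably the number to compare against the
 * hard-coded 3 seconds in the probes below. */
static void print_pipeline_latency(GstElement *pipeline) {
  GstQuery *query = gst_query_new_latency();

  if (gst_element_query(pipeline, query)) {
    gboolean live;
    GstClockTime min_latency, max_latency;

    gst_query_parse_latency(query, &live, &min_latency, &max_latency);
    g_print("live=%d min=%" GST_TIME_FORMAT " max=%" GST_TIME_FORMAT "\n",
            live, GST_TIME_ARGS(min_latency), GST_TIME_ARGS(max_latency));
  }

  gst_query_unref(query);
}
```

The full pipeline with the PTS-shifting probes follows.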

```
#include <gst/gst.h>

GstPadProbeReturn video_pad_probe(GstPad *pad, GstPadProbeInfo *info, gpointer user_data) {
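  /* Probe on the rtmpsrc video branch ("vconv" sink pad): pass the first
   * buffer through untouched, then shift the PTS of every later buffer back
   * by 3 seconds (3000000000 ns). */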
  auto *buffer = GST_PAD_PROBE_INFO_BUFFER(info);
  static gboolean first = true;
  if (first) {
    first = false;
  } else {
    GST_BUFFER_PTS(buffer) -= 3000000000;
  }

  return GST_PAD_PROBE_OK;
}

GstPadProbeReturn audio_pad_probe(GstPad *pad, GstPadProbeInfo *info, gpointer user_data) {
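  /* Same trick for the rtmpsrc audio branch ("aconv" sink pad): skip the
   * first buffer, then shift every later PTS back by 3 seconds. */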
  auto *buffer = GST_PAD_PROBE_INFO_BUFFER(info);
  static gboolean first = true;
  if (first) {
    first = false;
  } else {
    GST_BUFFER_PTS(buffer) -= 3000000000;
  }

  return GST_PAD_PROBE_OK;
}

int main(int argc, char *argv[]) {
  gst_init(&argc, &argv);

  auto *loop = g_main_loop_new(NULL, FALSE);
  auto *pipeline = gst_parse_launch(
      "flvmux name=mux ! queue max-size-bytes=0 max-size-time=0 max-size-buffers=0 ! rtmpsink location=\"rtmp://localhost/inbound/test\" "
      "videotestsrc is-live=true ! compositor name=c ! video/x-raw,width=1280,height=720 ! queue ! x264enc speed-preset=veryfast tune=zerolatency ! queue ! mux. "
      "audiotestsrc is-live=true volume=0.0 ! audiomixer name=a ! queue ! voaacenc ! queue ! mux. "
      "rtmpsrc location=\"rtmp://localhost/serve/live\" ! decodebin name=d ! videoconvert name=vconv ! queue ! c. d. ! audioconvert name=aconv ! queue ! a.",
      NULL);

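  /* Install the PTS-shifting probes on the sink pads of the convert elements
   * in the rtmpsrc branches, before the buffers reach audiomixer/compositor. */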
  gst_pad_add_probe(gst_element_get_static_pad(gst_bin_get_by_name(GST_BIN(pipeline), "aconv"), "sink"), GST_PAD_PROBE_TYPE_BUFFER, audio_pad_probe, nullptr, nullptr);
  gst_pad_add_probe(gst_element_get_static_pad(gst_bin_get_by_name(GST_BIN(pipeline), "vconv"), "sink"), GST_PAD_PROBE_TYPE_BUFFER, video_pad_probe, nullptr, nullptr);

  gst_element_set_state(pipeline, GST_STATE_PLAYING);
  g_main_loop_run(loop);

  return 0;
}
```
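And as for a signal to watch: the closest thing I have found so far is the
latency message on the bus. Below is only a sketch (untested with this
pipeline, and `bus_callback` is just my name for it); it reacts to the message
by asking the pipeline to redistribute latency rather than guessing a fixed
number:

```
/* Sketch: react to latency messages instead of hard-coding a PTS offset.
 * When an element (e.g. something in the rtmpsrc branch) reports that its
 * latency changed, ask the pipeline to re-query and redistribute latency. */
static gboolean bus_callback(GstBus *bus, GstMessage *message, gpointer user_data) {
  GstElement *pipeline = GST_ELEMENT(user_data);

  if (GST_MESSAGE_TYPE(message) == GST_MESSAGE_LATENCY)
    gst_bin_recalculate_latency(GST_BIN(pipeline));

  return TRUE; /* keep the watch installed */
}

/* In main(), before g_main_loop_run():
 *   GstBus *bus = gst_element_get_bus(pipeline);
 *   gst_bus_add_watch(bus, bus_callback, pipeline);
 *   gst_object_unref(bus);
 */
```

Whether that alone stops the audiomixer from dropping the rtmpsrc audio I
don't know, but at least it avoids guessing the 2-3 seconds.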


More information about the gstreamer-devel mailing list