Audiomixer dropping input when mixing live and non-live sources

Sean DuBois sean at siobud.com
Tue Mar 14 16:29:01 UTC 2017


On Sun, Mar 12, 2017 at 06:18:15PM -0400, Nicolas Dufresne wrote:
> On Sunday, March 12, 2017 at 13:41 -0500, Sean DuBois wrote:
> > Hi Nicolas, thanks for the quick response
> >
> > > Can you clarify how it fails for you? You need some latency for
> > > this to work, but 2s might be too much. You would need enough
> > > latency on the RTP jitter buffer too.
> >
> > The audiomixer drops all incoming buffers from my rtmpsrc, and the
> > output is completely silent (but if I add an audiotestsrc it works,
> > proving that the audiomixer itself is working).
>
> Interesting, that looks like everything is being dropped for being
> late. I'm wondering if rtmpsrc really implements a live source. A
> live source would set timestamps based on arrival time, while this
> problem seems to indicate that it sets timestamps from zero or
> something, regardless of how much time it took to start. What you
> could try, though, is to pause your pipeline first, wait a bit, and
> then start it.
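
(For anyone trying that suggestion, here is a minimal sketch of the
pause-then-play approach; the two-second wait is an arbitrary guess,
not a measured value:)

```
/* Pre-roll in PAUSED, give the RTMP connection a moment to settle,
 * then switch to PLAYING. The 2 s wait is an arbitrary assumption. */
gst_element_set_state(pipeline, GST_STATE_PAUSED);
g_usleep(2 * G_USEC_PER_SEC);
gst_element_set_state(pipeline, GST_STATE_PLAYING);
```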
>
> >
> > How would I know the magnitude of latency I should be adding? It
> > seems variable; sometimes 2 seconds works and sometimes it doesn't.
> > Is there a signal from the audiomixer pad that I could watch so I
> > can raise the latency until it works?
>
> rtmp is TCP, so yes, it's variable and uncontrolled. Ideally, the
> rtmpsrc element would compute the initial latency and advertise it to
> the rest of the pipeline so it could all work. Then just a small
> amount of latency in the mixer for jitter would be sufficient.
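
(You can at least inspect the latency the pipeline currently
advertises with a standard latency query; a sketch, assuming
`pipeline` is your top-level element:)

```
/* Ask the pipeline what latency it currently advertises. */
GstQuery *q = gst_query_new_latency();
if (gst_element_query(pipeline, q)) {
  gboolean live;
  GstClockTime min_latency, max_latency;
  gst_query_parse_latency(q, &live, &min_latency, &max_latency);
  g_print("live=%d min=%" GST_TIME_FORMAT " max=%" GST_TIME_FORMAT "\n",
          live, GST_TIME_ARGS(min_latency), GST_TIME_ARGS(max_latency));
}
gst_query_unref(q);
```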
>
> >
> > The other issue is that when I do add latency and use RTP sources,
> > the compositor starts to act very poorly. This makes sense to me: I
> > don't want any latency on my RTP input, and if it doesn't arrive in
> > time I am ok with discarding it. However, for my RTMP input I know
> > the video/audio is fine; it just seems to have 'fallen behind',
> > because a pad probe that adds 2 seconds to it fixes everything. I
> > just don't know what I should be measuring to figure out where
> > those 2 seconds come from.
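
(For reference, the kind of pad probe meant here looks roughly like
this; the 2 s offset is the magic number in question, not a
recommendation:)

```
/* Buffer probe that shifts each buffer's timestamp forward by a
 * fixed offset (2 s here, purely illustrative). */
static GstPadProbeReturn shift_pts(GstPad *pad, GstPadProbeInfo *info,
                                   gpointer user_data) {
  GstBuffer *buf = gst_buffer_make_writable(GST_PAD_PROBE_INFO_BUFFER(info));
  if (GST_BUFFER_PTS_IS_VALID(buf))
    GST_BUFFER_PTS(buf) += 2 * GST_SECOND;
  GST_PAD_PROBE_INFO_DATA(info) = buf;
  return GST_PAD_PROBE_OK;
}

/* gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER, shift_pts, NULL, NULL); */
```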
>
> The mixing latency is there to compensate for small delays between
> inputs; 0 is a little racy.
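
(That latency is the `latency` property, in nanoseconds, that
audiomixer and compositor expose; a sketch, with 200 ms as an
arbitrary starting point and `mixer` assumed to be your audiomixer
element:)

```
/* "latency" is in nanoseconds; 200 ms here is an arbitrary starting
 * point, not a recommended value. */
g_object_set(mixer, "latency", (guint64)(200 * GST_MSECOND), NULL);
```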
>
> >
> >
> > Here is my RTP input; however, even if I do get latency+RTP
> > working, it still doesn't change the fact that I don't know how
> > much latency to add.
> > ```
> > gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp" !
> > rtph264depay ! decodebin ! videoconvert ! compositor
> > latency=5000000000 sink_1::xpos=500 name=c ! autovideosink
> > videotestsrc is-live=true ! c.
>
> You are missing an rtpjitterbuffer. That means your packets could end
> up in the wrong order, and the timing information will be jittery
> too. This is far from ideal when doing RTP. It will also make the
> compositor start really early, which then makes the other streams
> appear late.
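
(The receive pipeline above with a jitter buffer inserted might look
like this; rtpjitterbuffer's latency is in milliseconds, and 200 is
just its default, not a tuned value:)

```
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp" ! \
  rtpjitterbuffer latency=200 ! rtph264depay ! decodebin ! \
  videoconvert ! compositor latency=5000000000 sink_1::xpos=500 name=c ! \
  autovideosink videotestsrc is-live=true ! c.
```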
>
> >
> > gst-launch-1.0 videotestsrc ! x264enc speed-preset=veryfast !
> > rtph264pay ! udpsink port=5000 host="127.0.0.1"
> > ```



So this is probably an awful hack, but here is how I solved it (in
case anyone ends up here from a search engine).

Note you MUST clean this up if you don't want it to leak
(gst_bin_get_by_name and similar calls return references that need to
be released). I use unique_ptr locally but ripped it out for this
example.

The only remaining issue is that audiorate/audioresample no longer
work; they complain about a discontinuity. I might end up starting
another thread about that.


```
#include <gst/gst.h>

#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>

/* Pulls a sample from the appsink and re-pushes its buffer into the
 * matching appsrc (passed as user_data), with its timing cleared so
 * the appsrc re-stamps the buffer on arrival. */
GstFlowReturn NewSample(GstElement *object, gpointer user_data) {
  GstBuffer *buff;
  GstSample *sample;
  auto *appsrc = (GstAppSrc *)user_data;

  if ((sample = gst_app_sink_pull_sample((GstAppSink *)object)) == NULL) {
    return GST_FLOW_EOS;
  }
  if ((buff = gst_sample_get_buffer(sample)) == NULL) {
    gst_sample_unref(sample);
    return GST_FLOW_EOS;
  }

  /* Take our own ref before dropping the sample, otherwise the
   * buffer can be freed along with the sample. */
  gst_buffer_ref(buff);
  gst_sample_unref(sample);

  /* Clear all timing so the appsrc (do-timestamp=true) stamps the
   * buffer with its arrival time instead of the RTMP timestamps. */
  GST_BUFFER_DURATION(buff) = GST_BUFFER_PTS(buff) = GST_BUFFER_DTS(buff) = GST_CLOCK_TIME_NONE;
  gst_app_src_push_buffer(appsrc, buff);

  return GST_FLOW_OK;
}

/* Routes flvdemux's dynamic pads to the right queue and copies the
 * pad caps onto the matching appsrc. (The element and pad refs
 * fetched here leak, as noted above.) */
void PadAdded(GstElement *object, GstPad *arg0, gpointer user_data) {
  auto *pipeline = (GstBin *)user_data;
  auto *caps = gst_pad_get_current_caps(arg0);
  g_autofree gchar *caps_string = gst_caps_to_string(caps);

  if (g_strrstr(caps_string, "video") != NULL) {
    g_object_set(gst_bin_get_by_name(pipeline, "h264_appsrc"), "caps", caps, nullptr);
    if (GST_PAD_LINK_FAILED(gst_pad_link(arg0, gst_element_get_static_pad(gst_bin_get_by_name(pipeline, "h264_queue"), "sink")))) {
      g_print("Video link failed\n");
    }
  } else {
    g_object_set(gst_bin_get_by_name(pipeline, "aac_appsrc"), "caps", caps, nullptr);
    if (GST_PAD_LINK_FAILED(gst_pad_link(arg0, gst_element_get_static_pad(gst_bin_get_by_name(pipeline, "aac_queue"), "sink")))) {
      g_print("Audio link failed\n");
    }
  }

  gst_caps_unref(caps);
}

void add_rtmp(GstBin *pipeline) {
  auto *rtmpsrc          = gst_element_factory_make("rtmpsrc", nullptr),
       *flvdemux         = gst_element_factory_make("flvdemux", nullptr),
       *h264_queue       = gst_element_factory_make("queue", "h264_queue"),
       *h264_appsink     = gst_element_factory_make("appsink", nullptr),
       *h264_appsrc      = gst_element_factory_make("appsrc", "h264_appsrc"),
       *avdec_h264       = gst_element_factory_make("avdec_h264", nullptr),
       *videoconvert     = gst_element_factory_make("videoconvert", nullptr),
       *video_queue      = gst_element_factory_make("queue", nullptr),
       *aac_queue        = gst_element_factory_make("queue", "aac_queue"),
       *aac_appsink      = gst_element_factory_make("appsink", nullptr),
       *aac_appsrc       = gst_element_factory_make("appsrc", "aac_appsrc"),
       *avdec_aac        = gst_element_factory_make("avdec_aac", nullptr),
       *audioconvert     = gst_element_factory_make("audioconvert", nullptr),
       *audio_queue      = gst_element_factory_make("queue", nullptr),
       *h264parse        = gst_element_factory_make("h264parse", nullptr);

  gst_bin_add_many(pipeline, rtmpsrc, flvdemux, h264_queue, h264_appsink, h264_appsrc, avdec_h264, videoconvert, aac_queue, aac_appsink, aac_appsrc, avdec_aac, audioconvert, audio_queue, video_queue, h264parse, nullptr);

  gst_element_link(rtmpsrc, flvdemux);
  gst_element_link(h264_queue, h264_appsink);
  gst_element_link(aac_queue, aac_appsink);

  gst_element_link_many(h264_appsrc, h264parse, avdec_h264, videoconvert, video_queue, gst_bin_get_by_name(pipeline, "c"), nullptr);
  gst_element_link_many(aac_appsrc, avdec_aac, audioconvert, audio_queue, gst_bin_get_by_name(pipeline, "a"), nullptr);

  /* YOUR_RTMP_URL: substitute your rtmp:// URL here */
  g_object_set(rtmpsrc, "location", YOUR_RTMP_URL, nullptr);

  g_object_set(h264_appsink, "emit-signals", TRUE, nullptr);
  g_object_set(aac_appsink, "emit-signals", TRUE, nullptr);

  g_object_set(h264_appsrc, "is-live", TRUE, "do-timestamp", TRUE, nullptr);
  gst_util_set_object_arg(G_OBJECT(h264_appsrc), "format", "time");

  g_object_set(aac_appsrc, "is-live", TRUE, "do-timestamp", TRUE, nullptr);
  gst_util_set_object_arg(G_OBJECT(aac_appsrc), "format", "time");

  g_signal_connect(flvdemux, "pad-added", G_CALLBACK(PadAdded), pipeline);

  g_signal_connect(h264_appsink, "new-sample", G_CALLBACK(NewSample), h264_appsrc);
  g_signal_connect(aac_appsink, "new-sample", G_CALLBACK(NewSample), aac_appsrc);

}


int main(int argc, char *argv[]) {
  gst_init(&argc, &argv);

  auto *loop = g_main_loop_new(NULL, FALSE);
  auto *pipeline = gst_parse_launch(
      "flvmux name=mux ! queue max-size-bytes=0 max-size-time=0 max-size-buffers=0 ! rtmpsink location=\"rtmp://localhost/inbound/test\" "
      "videotestsrc is-live=true ! compositor name=c ! video/x-raw,width=1280,height=720 ! queue ! x264enc speed-preset=veryfast tune=zerolatency ! queue ! mux. "
      "audiotestsrc is-live=true volume=0.0 ! audiomixer name=a ! queue ! voaacenc ! queue ! mux. ", NULL);

  add_rtmp(GST_BIN(pipeline));
  gst_element_set_state(pipeline, GST_STATE_PLAYING);
  g_main_loop_run(loop);

  return 0;
}
```
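
If you want to build this, something along the lines of
`g++ mix.cpp $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-app-1.0)`
should work (mix.cpp being whatever you named the file).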

