how to create proper HLS pipeline
James
jam at tigger.ws
Tue Sep 21 23:01:29 UTC 2021
> On 22 Sep 2021, at 3:08 am, Andres Gonzalez via gstreamer-devel <gstreamer-devel at lists.freedesktop.org> wrote:
>
> Hi Chris,
>
> Using a tsparse element after the udpsrc and before a hlssink2 (not hlssink) gives this runtime error:
>
> GStreamer-CRITICAL **: 18:42:53.362: Element splitmuxsink0 already has a pad named video, the behaviour of gst_element_get_request_pad() for existing pads is undefined!
>
> But using a tsdemux after udpsrc and before hlssink2 does indeed work. This pipeline works great:
>
> gst-launch-1.0 udpsrc port=50000 caps="video/mpegts,systemstream=true" ! tsdemux ! hlssink2 target-duration=5 playlist-location="playlist.m3u8" location=segment_%05d.ts
>
> However, when I code this pipeline up in my C++ application (using gst_element_factory_make(), gst_bin_add(), and gst_element_link()), I get that same runtime error I got when using tsparse.
>
> So the pipeline udpsrc --> tsdemux --> hlssink2 works when I use gst-launch, but when I code it up manually I get that runtime error. So those 3 elements won't link with the typical gst_element_link(); some other pad linking must be going on under the covers.
>
> How do I do the link manually so these 3 elements will link up properly?
>
> Thanks,
> -Andres
>
> On Mon, Sep 20, 2021 at 6:41 PM Chris Wine via gstreamer-devel <gstreamer-devel at lists.freedesktop.org> wrote:
> It might be that hlssink wants to see keyframes and durations of all the content so that it knows where to split... try adding a "tsparse" element after udpsrc. If that doesn't work, you might have to demux the individual audio/video streams, run them through their respective parsers, and then use hlssink2 to remux them.
>
> --Chris
>
> On Mon, Sep 20, 2021 at 5:45 PM Andres Gonzalez via gstreamer-devel <gstreamer-devel at lists.freedesktop.org> wrote:
> Thanks for your response Chris.
>
> I tried your suggestion:
>
> gst-launch-1.0 udpsrc port=50000 caps="video/mpegts,systemstream=true" ! hlssink target-duration=5 playlist-location="playlist.m3u8" location=segment.%03d.ts
>
> This pipeline runs, but it never creates the playlist and just generates a single segment file that keeps growing and growing. I think the pipeline needs to know something about the video, otherwise how does it know when the segment file contains enough video to match the specified target duration?
>
> In my application, I am using hlssink2 for the pipeline that is generating a new video stream (with appsrc, encoder, etc). But this pipeline has a udpsrc and so there isn't an encoder in the pipeline. That is the reason I am using hlssink instead of hlssink2 for this pipeline.
>
> -Andres
>
>
>
>
>
>
> On Mon, Sep 20, 2021 at 3:14 PM Chris Wine via gstreamer-devel <gstreamer-devel at lists.freedesktop.org> wrote:
> Hi Andres,
>
> I believe hlssink takes an MPEG transport stream, so you don't need to have the tsdemux element in there unless you need to do something else to the streams before remuxing them and sending them to hlssink. So maybe just "udpsrc ! hlssink" (with properties of course) will give you what you're looking for.
>
> For making sure the caps are correct, you should just be able to set the "caps" property on udpsrc:
> udpsrc caps="video/mpegts,systemstream=true"
>
> If you already have separate audio and video streams, I'd use hlssink2 which does the mpegts muxing internally.
>
> --Chris
>
> On Mon, Sep 20, 2021 at 4:00 PM Andres Gonzalez via gstreamer-devel <gstreamer-devel at lists.freedesktop.org> wrote:
> Just some additional info. I am using gst-launch only to figure out what the appropriate elements should be. I have a C++ app where I code up the pipeline for my application, but since I am not yet sure which elements the pipeline needs, I am experimenting with gst-launch first.
> Thanks,
> -Andres
>
> On Mon, Sep 20, 2021 at 1:49 PM Andres Gonzalez <andres.agoralabs at gmail.com> wrote:
> Hi,
> I have a question about creating an HLS pipeline. I am receiving an MPEG-TS UDP stream and I want a pipeline that receives the UDP TS stream and then creates/publishes a corresponding HLS stream. For example, something like this:
>
> gst-launch-1.0 udpsrc port=50000 ! tsdemux ! hlssink playlist-root=http://192.168.0.100:8080 location=segment.%03d.ts
>
> This actually works and receives the UDP/TS stream and starts to generate the segment_000.ts file. But it never creates the playlist file and keeps adding to the segment_000.ts file.
>
> I am assuming that I need to provide some caps information about the video coming in on the UDP/TS stream but I am not sure how to do that. So could someone tell me what the pipeline should consist of to get this working?
>
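I can't see your C++ code, but the usual gotcha with a pipeline like this is that tsdemux's src pads are "sometimes" pads: they only appear once the PMT has been parsed, so a plain gst_element_link (tsdemux, hlssink2) at construction time can't make that link, and hlssink2's "video"/"audio" sink pads are request pads that should only be requested once per stream. The usual pattern is to link udpsrc to tsdemux statically and do the tsdemux-to-hlssink2 link from a "pad-added" callback. A minimal untested sketch (element names, port and file locations are just the ones from your gst-launch line):

#include <gst/gst.h>

/* Link each tsdemux stream to the matching hlssink2 request pad as it appears.
 * tsdemux src pads are named e.g. video_0041 / audio_0042. */
static void
on_pad_added (GstElement *demux, GstPad *new_pad, gpointer user_data)
{
  GstElement *hlssink = GST_ELEMENT (user_data);
  const gchar *tmpl = NULL;

  if (g_str_has_prefix (GST_PAD_NAME (new_pad), "video"))
    tmpl = "video";
  else if (g_str_has_prefix (GST_PAD_NAME (new_pad), "audio"))
    tmpl = "audio";
  if (tmpl == NULL)
    return;

  /* hlssink2's video/audio sink pads are request pads: request each one once */
  GstPad *sinkpad = gst_element_get_request_pad (hlssink, tmpl);
  if (sinkpad != NULL) {
    if (!gst_pad_is_linked (sinkpad))
      gst_pad_link (new_pad, sinkpad);
    gst_object_unref (sinkpad);
  }
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_pipeline_new ("hls-pipeline");
  GstElement *src   = gst_element_factory_make ("udpsrc", "src");
  GstElement *demux = gst_element_factory_make ("tsdemux", "demux");
  GstElement *sink  = gst_element_factory_make ("hlssink2", "sink");

  GstCaps *caps = gst_caps_from_string ("video/mpegts,systemstream=true");
  g_object_set (src, "port", 50000, "caps", caps, NULL);
  gst_caps_unref (caps);
  g_object_set (sink, "target-duration", 5,
      "playlist-location", "playlist.m3u8",
      "location", "segment_%05d.ts", NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, demux, sink, NULL);

  /* udpsrc -> tsdemux can be linked up front ... */
  gst_element_link (src, demux);

  /* ... but the tsdemux -> hlssink2 link has to wait for pad-added,
   * because tsdemux only creates its src pads once the PMT is parsed */
  g_signal_connect (demux, "pad-added", G_CALLBACK (on_pad_added), sink);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}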
What worked for me when struggling like this was to use the DOT graph debug output (set GST_DEBUG_DUMP_DOT_DIR and dump the pipeline) to examine it in detail. I also use gst_parse_launch() to build the pipeline, e.g.:
pipeline = gst_parse_launch ("v4l2src name=source ! "
                             "queue ! "
                             "videoscale ! video/x-raw,width=768,height=576 ! "
                             "videoconvert ! "
                             "xvimagesink name=sink1 force-aspect-ratio=false sync=false", NULL);
James