streaming into a web page

R C cjvijf at gmail.com
Mon May 13 15:45:39 UTC 2019


well, I was trying to get "something" in a page. I am not that happy
with using ffmpeg because it creates a bunch of files, and there is a
delay of about a minute (which might be fine for streaming a movie or such).

I don't mind a delay of a few seconds, buffering etc., but a minute or two
is not what I am looking for.
I did this ffmpeg HTML5/HLS trial to see if I could get at least something
working in a browser.
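From what I've read since, the delay is mostly a function of the segment
length and playlist size, and the roughly one-minute startup matches the
-hls_time 10 -hls_list_size 10 settings in the command quoted further down;
players typically buffer a few segments before starting. An untested sketch
of shorter segments (using ffmpeg's built-in testsrc instead of the camera so
it runs standalone; the /tmp/hls-test path is just a placeholder):

```shell
# Generate ~12 s of test video and package it as HLS with 2-second segments.
# testsrc stands in for the RTSP camera; swap in the camera URL for real use.
mkdir -p /tmp/hls-test
ffmpeg -f lavfi -i testsrc=duration=12:size=640x360:rate=30 \
       -c:v libx264 -pix_fmt yuv420p \
       -hls_time 2 -hls_list_size 3 -hls_flags delete_segments \
       /tmp/hls-test/mystream.m3u8
```

With -hls_time 2 the player should be able to start after a few seconds
instead of ~30+, at the cost of more frequent playlist/segment requests.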

I have been reading up a little, not pretending I understand everything.


So the mechanism (from what I read) is to set up a stream/pipe with
gstreamer/gst-launch which connects to a camera, "transcodes" the
stream, and "dumps" it somewhere into a "sink".

So I guess I need to install an Apache plugin or so to make that part work,
or is there another way to do it? (or some dev device??)    From what I
read there is something called a "souphttpclientsink"; I also saw an hlssink
mentioned (probably with the same issues as I have now with ffmpeg), so I am
wondering which 'sink' to use, and how, if I want the stream to be displayed
in a webpage (I don't want the client/browser/player to connect to anything
else than http port 80).

thanks,

Ron



On Mon, May 13, 2019 at 12:50 AM Marc Leeman <marc.leeman at gmail.com> wrote:

> Ah, but you're not using WebRTC now, you are using HLS.
>
> HLS will create a bunch of small files in a TS container for the
> browser to pick up from a web location.
>
> If you want to recreate this with GStreamer, you can use the hlssink.
>
> Just feed it a stream in a transport stream container and it will do
> something similar.
>
> If you drop the files in a location served by your webserver, you can
> access it with a browser.
>
>
> On Mon, 13 May 2019 at 06:55, R C <cjvijf at gmail.com> wrote:
> >
> > So I managed to get some streaming going using ffmpeg, but I really want
> to use gstreamer (ffmpeg creates a bunch of files, and the player somehow
> starts 3 minutes after the current time).
> >
> > So I would like to accomplish the same as I have working with ffmpeg:
> >
> >
> > Here is what I am using.
> >
> > this is the html:
> >
> > <!DOCTYPE html>
> > <html>
> >
> > <head>
> >
> > <title>Live Cam</title>
> >
> > <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
> >
> > <script>
> > function hlsStart() {
> >    if (Hls.isSupported()) {
> >       var video = document.getElementById('video');
> >       var hls = new Hls();
> >
> >       // bind them together
> >       hls.attachMedia(video);
> >       hls.on(Hls.Events.MEDIA_ATTACHED, function () {
> >          console.log("video and hls.js are now bound together!");
> >          hls.loadSource("http://zoneminder.localdomain/IP-Cameras/stream/test-camera/mystream.m3u8");
> >          hls.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
> >             console.log("manifest loaded, found " + data.levels.length + " quality levels");
> >             });
> >          });
> >       }
> >    }
> > </script>
> > </head>
> > <body onload="hlsStart();">
> > <video id="video" autoplay="true" controls="controls"></video>
> > </body>
> > </html>
> >
> >
> >
> > and this is what I am doing with ffmpeg:
> >
> > ffmpeg -i
> "rtsp://192.168.x.y:554/user=admin_password=XXXXXXXX_channel=1_stream=0.sdp?real_stream"
> -y -c:a aac -b:a 160000 -ac 2 -s 960x540 -c:v libx264 -b:v 800000 -hls_time
> 10 -hls_list_size 10 -start_number 1 mystream.m3u8
> >
> >
> > any suggestions?  (I probably need an Apache plugin, or something like
> that?)
> >
> >
> > thanks,
> >
> >
> > Ron
> >
> >
> >
> > On 5/7/19 4:13 AM, Marc Leeman wrote:
> >
> > I don't think you need to transcode, H.264 should also be supported by
> > the browsers.
> >
> > On Tue, 7 May 2019 at 09:10, Ralf Sippl <ralf.sippl at gmail.com> wrote:
> >
> > Hi Ron,
> >
> > if the pipeline works, you got the GStreamer part right. Of course there
> are
> > two streams, video and audio. Each is sent to a different UDP port.
> >
> > Now you need to run the receiving part, i.e. Janus. The streaming demo
> > listens to the ports your pipeline sends to. This is obviously off-topic
> > here, use the Janus site, or contact me if that doesn't work.
> >
> > You can use webrtcbin instead, as Nirbheek suggested, but I found it
> harder
> > to set up (you need to run the websocket part on your own), and it will
> be a
> > 1-to-1 connection, so you can't use it for broadcast.
> >
> > Ralf
> >
> >
> >
> > --
> > Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> > _______________________________________________
> > gstreamer-devel mailing list
> > gstreamer-devel at lists.freedesktop.org
> > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
> >
> >
>
>
>
> --
> g. Marc

