streaming into a web page

Marc Leeman marc.leeman at gmail.com
Mon May 13 06:49:52 UTC 2019


Ah, but you're not using WebRTC now, you are using HLS.

HLS writes a bunch of small segment files in an MPEG-TS container, plus an
.m3u8 playlist, for the browser to pick up from a web location over plain HTTP.

If you want to recreate this with GStreamer, you can use the hlssink.

Just feed it a stream in a transport stream container and it will produce
the same kind of output: segment files plus a playlist.
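
Something along these lines should work as a starting point (an untested
sketch; the camera URL is a placeholder and the output directory is just an
example matching the URL in your HTML). It keeps the camera's H.264 as-is
and only remuxes it into TS:

  # hypothetical directory served by your web server at /IP-Cameras/stream/test-camera
  cd /var/www/html/IP-Cameras/stream/test-camera
  gst-launch-1.0 rtspsrc location="rtsp://<camera-url>" ! \
      rtph264depay ! h264parse ! mpegtsmux ! \
      hlssink playlist-location=mystream.m3u8 location=segment%05d.ts \
          target-duration=10 max-files=10

If you do want to rescale and re-encode like your ffmpeg command does
(960x540, ~800 kbit/s), you would decode first and put videoscale, a caps
filter and x264enc in front of mpegtsmux instead of the depay/parse pair.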

If you drop the files in a location served by your web server, you can
access the playlist with a browser.
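
You don't need an Apache plugin for this; the playlist and segments are just
static files. As a quick check (using the URL from your HTML below), you can
fetch the playlist directly or play it back with gst-play-1.0:

  curl http://zoneminder.localdomain/IP-Cameras/stream/test-camera/mystream.m3u8
  gst-play-1.0 http://zoneminder.localdomain/IP-Cameras/stream/test-camera/mystream.m3u8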


On Mon, 13 May 2019 at 06:55, R C <cjvijf at gmail.com> wrote:
>
> So I managed to get some streaming going using ffmpeg, but I really want to use GStreamer (ffmpeg creates a bunch of files and the player somehow starts 3 minutes after the current time).
>
> So I would like to accomplish the same as I have working with ffmpeg:
>
>
> Here is what I am using.
>
> this is the html:
>
> <!DOCTYPE html>
> <html>
>
> <head>
>
> <title>Live Cam</title>
>
> <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
>
> <script>
> function hlsStart() {
>    if (Hls.isSupported()) {
>       var video = document.getElementById('video');
>       var hls = new Hls();
>
>       // bind them together
>       hls.attachMedia(video);
>       hls.on(Hls.Events.MEDIA_ATTACHED, function () {
>          console.log("video and hls.js are now bound together !");
>          hls.loadSource("http://zoneminder.localdomain/IP-Cameras/stream/test-camera/mystream.m3u8");
>          hls.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
>             console.log("manifest loaded, found " + data.levels.length + " quality levels");
>             });
>          });
>       }
>    }
> </script>
> </head>
> <body onload="hlsStart();">
> <video id="video" autoplay="true" controls="controls"></video>
> </body>
> </html>
>
>
>
> and this is what I am doing with ffmpeg:
>
> ffmpeg -i "rtsp://192.168.x.y:554/user=admin_password=XXXXXXXX_channel=1_stream=0.sdp?real_stream" -y -c:a aac -b:a 160000 -ac 2 -s 960x540 -c:v libx264 -b:v 800000 -hls_time 10 -hls_list_size 10 -start_number 1 mystream.m3u8
>
>
> any suggestions?  (I probably need an Apache plugin, or something like that?)
>
>
> thanks,
>
>
> Ron
>
>
>
> On 5/7/19 4:13 AM, Marc Leeman wrote:
>
> I don't think you need to transcode, H.264 should also be supported by
> the browsers.
>
> On Tue, 7 May 2019 at 09:10, Ralf Sippl <ralf.sippl at gmail.com> wrote:
>
> Hi Ron,
>
> if the pipeline works, you got the GStreamer part right. Of course there are
> two streams, video and audio. Each is sent to a different UDP port.
>
> Now you need to run the receiving part, i.e. Janus. The streaming demo
> listens on the ports your pipeline sends to. This is obviously off-topic
> here; see the Janus site, or contact me if that doesn't work.
>
> You can use webrtcbin instead, as Nirbheek suggested, but I found it harder
> to set up (you need to run the WebSocket signalling part yourself), and it
> will be a one-to-one connection, so you can't use it for broadcast.
>
> Ralf
>
>
>



-- 
g. Marc

