gstreamer-devel Digest, Vol 100, Issue 12

todor at newblueinc.com todor at newblueinc.com
Mon May 6 17:30:28 UTC 2019


Matt is taking me to the conf room directly 

> On May 3, 2019, at 10:30 AM, <gstreamer-devel-request at lists.freedesktop.org> <gstreamer-devel-request at lists.freedesktop.org> wrote:
> 
> Send gstreamer-devel mailing list submissions to
>    gstreamer-devel at lists.freedesktop.org
> 
> To subscribe or unsubscribe via the World Wide Web, visit
>    https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
> or, via email, send a message with subject or body 'help' to
>    gstreamer-devel-request at lists.freedesktop.org
> 
> You can reach the person managing the list at
>    gstreamer-devel-owner at lists.freedesktop.org
> 
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of gstreamer-devel digest..."
> 
> 
> Today's Topics:
> 
>   1. Re: STREAMING OVER UDP FROM RASPBERRY PI CAMERA USING
>      GSTREAMER (Vinod Kesti)
>   2. Video to stdout (lec1)
>   3. Re: STREAMING OVER UDP FROM RASPBERRY PI CAMERA USING
>      GSTREAMER (Aditya Oza)
>   4. H265 video file streaming over TCP (guiltylotus)
>   5. Recording and Streaming Simultaneously (Aditya Oza)
>   6. Re: Why is GstBuffer not writable in the _fill method of
>      GstPushSrc? (Ben Rush)
> 
> 
> ----------------------------------------------------------------------
> 
> Message: 1
> Date: Fri, 3 May 2019 07:53:34 -0500 (CDT)
> From: Vinod Kesti <vinodkesti at yahoo.com>
> To: gstreamer-devel at lists.freedesktop.org
> Subject: Re: STREAMING OVER UDP FROM RASPBERRY PI CAMERA USING
>    GSTREAMER
> Message-ID: <1556888014281-0.post at n4.nabble.com>
> Content-Type: text/plain; charset=us-ascii
> 
> By default, VLC chooses an MPEG-2 TS depayloader.
> 
> Use rtpmp2tpay to test:
> 
> gst-launch-1.0 --gst-debug-level=3 -v v4l2src device=/dev/video0 !
> capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! queue !
> omxh264enc ! queue ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink
> host=127.0.0.1 port=8080
> 
> and try rtp://@:8080 in VLC.
> 
> VLC may have an option to choose the depayloader, but I haven't tried it.
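> 
> (If VLC still refuses, a GStreamer receiver can verify the stream
> independently of VLC. This is an untested sketch; the caps, in particular
> payload=33 for MP2T, are assumptions:
> 
> gst-launch-1.0 udpsrc port=8080 \
>   caps="application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33" ! \
>   rtpjitterbuffer ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! autovideosink
> )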
> 
> 
> 
> --
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> 
> 
> ------------------------------
> 
> Message: 2
> Date: Fri, 3 May 2019 08:16:05 -0500 (CDT)
> From: lec1 <lec at geisel-software.com>
> To: gstreamer-devel at lists.freedesktop.org
> Subject: Video to stdout
> Message-ID: <1556889365346-0.post at n4.nabble.com>
> Content-Type: text/plain; charset=us-ascii
> 
> Hi, thanks for your wonderful work.
> 
> I am adding gstreamer support to a Node.js application. This application
> currently uses ffmpeg and has grown around the "fluent-ffmpeg" framework.
> The "fluent-ffmpeg" software drives the video via a pipe to stdout, and a
> lot of support software has been written in the application around this
> model. Now to my problem with gstreamer: to be as minimally disruptive as
> possible, I have adopted the same model, an RTSP stream at the head and a
> stdout sink. Here are some of the things I have tried:
> 
> 
> gst-launch-1.0 uridecodebin silent=true
> uri=rtsp://192.168.1.x:8822/test.stm ! queue ! fdsink fd=1
> gst-launch-1.0 rtspsrc silent=true
> location=rtsp://192.168.1.x:8822/test.stm ! rtph264depay ! fdsink fd=1
> gst-launch-1.0 rtspsrc silent=true
> location=rtsp://192.168.1.x:8822/test.stm
> caps="video/x-h264,width=1280,height=800,framerate=(fraction)25/1" ! queue !
> rtph264depay ! h264parse ! avdec_h264 ! fdsink fd=1
> etc.
> 
> All of the above produce video when terminated by "autovideosink" instead of
> the "fdsink fd=1" construct.
> 
> However, with "fdsink fd=1" or "filesink location=/dev/stdout" as the pipe
> termination, the pipeline stalls.
> 
> The software that handles the pipeline fills up buffers and extracts the
> mpeg images to display on a web page. I have wrapped gstreamer in a wrapper
> very similar to the "fluent" wrapper for ffmpeg and have attached event
> handlers to follow the progress of the pipeline. I have enabled debug
> logging and nothing jumps out at me. It looks like the pipeline is behaving
> as it should, yet I am still unable to get video output.
> 
> I have also attempted to concatenate pipelines.
> 
> e.g.
> gst-launch-1.0 uridecodebin silent=true uri=rtsp://192.168.1.x:8822/test.stm
> ! queue ! fdsink fd=1 | gst-launch-1.0 fdsrc fd=0 ! decodebin !
> autovideosink
> but thus far I have not succeeded.
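> 
> (One thing I am now wondering about, and this is just a guess: gst-launch-1.0
> itself prints status messages to stdout, so without -q those messages get
> interleaved with the video bytes on fd 1, and raw decoded frames on a pipe
> carry no container for the reader to typefind. An untested variant along
> those lines, keeping the stream in an MPEG-TS container and silencing
> gst-launch, would be:
> 
> gst-launch-1.0 -q rtspsrc location=rtsp://192.168.1.x:8822/test.stm ! \
>   rtph264depay ! h264parse ! mpegtsmux ! fdsink fd=1 | \
>   gst-launch-1.0 fdsrc fd=0 ! decodebin ! autovideosink
> )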
> 
> Any light you can shed on the resolution of this issue is greatly
> appreciated.
> 
> 
> 
> --
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> 
> 
> ------------------------------
> 
> Message: 3
> Date: Fri, 3 May 2019 08:54:48 -0500 (CDT)
> From: Aditya Oza <aditya.oza at teksun.in>
> To: gstreamer-devel at lists.freedesktop.org
> Subject: Re: STREAMING OVER UDP FROM RASPBERRY PI CAMERA USING
>    GSTREAMER
> Message-ID: <1556891688666-0.post at n4.nabble.com>
> Content-Type: text/plain; charset=us-ascii
> 
> Thank you for the reply, Vinod Kesti.
> 
> gst-launch-1.0 --gst-debug-level=3 -v v4l2src device=/dev/video0 !
> capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! queue !
> omxh264enc ! queue ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink
> host=127.0.0.1 port=8080
> 
> It's working fine on the local server. :)
> 
> Now I have changed 127.0.0.1 to my access point 192.168.5.1 (the Raspberry
> Pi) and run the command in a terminal, but the stream is not displayed on
> the client side.
> 
> Please give me some suggestions.
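> 
> (For what it's worth, my understanding, which may be wrong, is that
> udpsink's host should be the address of the machine that receives the
> stream, or a multicast address, rather than the Pi's own address. A sketch
> of what I mean, with <client-ip> standing in for the receiver's address:
> 
> gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
>   capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! queue ! \
>   omxh264enc ! queue ! h264parse ! mpegtsmux ! rtpmp2tpay ! \
>   udpsink host=<client-ip> port=8080
> )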
> 
> 
> 
> --
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> 
> 
> ------------------------------
> 
> Message: 4
> Date: Fri, 3 May 2019 09:47:02 -0500 (CDT)
> From: guiltylotus <minh02091997 at gmail.com>
> To: gstreamer-devel at lists.freedesktop.org
> Subject: H265 video file streaming over TCP
> Message-ID: <1556894822320-0.post at n4.nabble.com>
> Content-Type: text/plain; charset=us-ascii
> 
> Hello,
> 
> I'm using GStreamer 1.8.3 on a Jetson TX2.
> On the Jetson TX2, I want to take an MP4 file, encode it with H.265, and
> stream it over TCP with this command:
> 
> gst-launch-1.0 filesrc location=hncloud.mp4 ! decodebin ! omxh265enc !
> mpegtsmux ! queue ! tcpserversink host=xxx.xxx.x.xxx port=5000
> recover-policy=keyframe sync-method=latest-keyframe sync=false
> 
> To receive, I use VLC on my Linux PC to play the stream
> (tcp://xxx.xxx.x.xx:5000), but nothing happens; the screen is just black.
> Can someone please help me?
> 
> I tested with videotestsrc:
> 
> gst-launch-1.0 videotestsrc ! decodebin ! omxh265enc ! mpegtsmux ! queue !
> tcpserversink host=xxx.xxx.x.xxx port=5000 recover-policy=keyframe
> sync-method=latest-keyframe sync=false
> 
> Everything works fine.
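> 
> (One difference I may try, though I am not sure it is the cause: inserting
> h265parse between the encoder and the muxer so that mpegtsmux gets properly
> framed HEVC. An untested sketch:
> 
> gst-launch-1.0 filesrc location=hncloud.mp4 ! decodebin ! omxh265enc ! \
>   h265parse ! mpegtsmux ! queue ! tcpserversink host=xxx.xxx.x.xxx port=5000 \
>   recover-policy=keyframe sync-method=latest-keyframe sync=false
> )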
> 
> 
> 
> --
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> 
> 
> ------------------------------
> 
> Message: 5
> Date: Fri, 3 May 2019 20:30:07 +0530
> From: Aditya Oza <aditya.oza at teksun.in>
> To: gstreamer-devel at lists.freedesktop.org
> Subject: Recording and Streaming Simultaneously
> Message-ID:
>    <CAJXhsbE68H=8hgXY5P06yF_25wVNy7+vaCZ6Sm8X49pLSQTpeQ at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> Hello,
> 
> The objective I am trying to achieve is to stream 1080p video from a
> Raspberry Pi camera and record the video simultaneously.
> 
> I tried recording with the HTTP stream as the source, but it did not work
> at 30 fps: a lot of frames were missing and I got only about 8 fps.
> 
> As a second approach, I am trying to record the file directly from the
> camera and then stream the "recording in progress"/buffer file, using
> GStreamer for both. Please suggest whether this is a good option or whether
> I should try something else.
> 
> For recording with GStreamer I used:
> 
> gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter
> caps="video/x-raw,width=1920,height=1080,framerate=30/1" !
> videoflip method=clockwise ! videoflip method=clockwise ! videoconvert !
> videorate ! x264enc ! avimux ! filesink location=test_video.h264
> 
> Result: the recorded video shows 1080p and 30 fps, but frames are dropping
> heavily.
> 
> For streaming the video buffer I have used UDP in GStreamer:
> 
> gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter
> caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue !
> rtph264pay ! udpsink host=192.168.5.1 port=8080
> 
> Result: no specific errors on the terminal, but I can't get the stream in
> VLC.
> 
> Please suggest the best method here.
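> 
> (A direction I am considering, sketched below and untested: capture and
> encode once, then split with tee so that one branch records to a file and
> the other streams over RTP/UDP. Here <client-ip> is a placeholder for the
> receiver's address, and on the Raspberry Pi the hardware encoder omxh264enc
> would probably have to replace x264enc to sustain 1080p at 30 fps:
> 
> gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
>   video/x-raw,width=1920,height=1080,framerate=30/1 ! videoconvert ! \
>   x264enc tune=zerolatency speed-preset=ultrafast ! h264parse ! tee name=t \
>   t. ! queue ! matroskamux ! filesink location=test_video.mkv \
>   t. ! queue ! rtph264pay ! udpsink host=<client-ip> port=8080
> )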
> 
> ------------------------------
> 
> Message: 6
> Date: Fri, 3 May 2019 12:29:56 -0500
> From: Ben Rush <ben at ben-rush.net>
> To: Discussion of the development of and with GStreamer
>    <gstreamer-devel at lists.freedesktop.org>
> Subject: Re: Why is GstBuffer not writable in the _fill method of
>    GstPushSrc?
> Message-ID:
>    <CANV0r97iYLEB3mMXiqa2mOGcSnkXtnGnyvX7dQ1EesZNG=4nuw at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> Sebastian,
> 
> I apologize for not getting back to you sooner. I will try putting up some
> sample code for you in the near future. However, I have opened up the
> following issue on the Intel Media SDK gstreamer github repo that may be of
> interest to you: https://github.com/intel/gstreamer-media-SDK/issues/173.
> 
> Essentially I'm using their H.264 encoder. I don't know exactly how it
> works, but when I request AYUV, I'm able to get a writable buffer (and
> yes, everything works). It looks as though the GstBuffer is writable, but
> allocated by the gstreamer pipeline itself. When I request NV12, which the
> H.264 plugin advertises as a supported media type from upstream elements,
> the Intel Media pipeline allocates memory for me. This is video memory.
> Now, one could argue that this memory isn't writable from CPU code because
> it lives on the video device, but I know that with Intel's OpenCL drivers
> I'm able to get writable memory buffers, since the GPU shares a memory
> space with the CPU (the whole point of the Intel on-chip GPU). So it should
> be possible. Why the code above does not appear to work with it, I'm
> unsure.
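> 
> For reference, a stripped-down sketch of the kind of fill() implementation
> involved (the element name and the memset placeholder are illustrative, not
> my actual code):
> 
> #include <string.h>
> #include <gst/gst.h>
> #include <gst/base/gstpushsrc.h>
> 
> static GstFlowReturn
> gst_feynman_src_fill (GstPushSrc * src, GstBuffer * buf)
> {
>   GstMapInfo map;
> 
>   /* With the mfxvideobufferpool0 buffers (NV12) this map fails;
>    * with the ordinary videobufferpool0 buffers (AYUV) it succeeds. */
>   if (!gst_buffer_map (buf, &map, GST_MAP_WRITE))
>     return GST_FLOW_ERROR;
> 
>   /* ... fill map.data with one frame of image data (map.size bytes) ... */
>   memset (map.data, 0, map.size);
> 
>   gst_buffer_unmap (buf, &map);
>   return GST_FLOW_OK;
> }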
> 
> BTW: I've also opened another issue that some GStreamer users might
> encounter when using the Intel Media SDK (on Windows) here:
> https://github.com/intel/gstreamer-media-SDK/issues/169. I might try fixing
> this (it could be done with a simple preprocessor statement preventing
> that code from compiling on Windows).
> 
> Anyway. Keeping you up to date.
> 
> On Tue, Apr 30, 2019 at 5:39 AM Sebastian Dröge <sebastian at centricular.com>
> wrote:
> 
>>> On Mon, 2019-04-29 at 10:46 -0500, Ben Rush wrote:
>>> Sebastian,
>>> 
>>> So, I'm not sure precisely what's going on, but I think it might -
>>> maybe - have something to do with the fact that I'm using the MFX
>>> (Intel Media SDK) GStreamer modules, and that perhaps things aren't
>>> playing well together. Let me explain. I've got an example
>>> application whose only job is to encode a static image of the
>>> scientist Richard Feynman to disk as an MP4 file (500 frames). I use
>>> it to kind of tinker and explore the various parts of the GStreamer
>>> pipeline.
>>> 
>>> I have found that if I set the output caps to NV12, the
>>> gst_buffer_map() call fails:
>>> 
>>> static GstStaticPadTemplate gst_feynman_template =
>>> GST_STATIC_PAD_TEMPLATE("src",
>>>      GST_PAD_SRC,
>>>      GST_PAD_ALWAYS,
>>>      GST_STATIC_CAPS(GST_VIDEO_CAPS_MAKE("{NV12}"))
>>> );
>>> 
>>> If, on the other hand, I change this to AYUV, it works:
>>> 
>>> static GstStaticPadTemplate gst_feynman_template =
>>> GST_STATIC_PAD_TEMPLATE("src",
>>>      GST_PAD_SRC,
>>>      GST_PAD_ALWAYS,
>>>      GST_STATIC_CAPS(GST_VIDEO_CAPS_MAKE("{AYUV}"))
>>> );
>>> 
>>> It doesn't matter what I actually send as this failure happens before
>>> I send anything at all. During the first invocation to fill the
>>> buffer, the gst_buffer_map call fails as I've indicated in previous
>>> emails if I specify NV12. As I'm writing this email out, certain
>>> things are starting to become clearer, and I presume what's
>>> happening is that per the handshake between my source DLL and the
>>> downstream gstmfx.dll (and the x264 encoder therein) something is
>>> failing to initialize properly, thereby causing the GStreamer
>>> pipeline to fail to also initialize the GstBuffer for me properly.
>>> Specifically, the gstmfxenc_h264.c encoder is failing to handle NV12
>>> properly (or my asking it to handle this format is failing on its
>>> side, and it is therefore falling back to some standard GStreamer
>>> allocator/handler (I don't know the terminology well as I'm still
>>> learning)).
>>> 
>>> I know that the gstmfx encoder suite advertises that it supports
>>> these formats:
>>> 
>>> # define GST_MFX_SUPPORTED_INPUT_FORMATS \
>>>    "{ NV12, YV12, I420, YUY2, P010_10LE, BGRA, BGRx }"
>>> 
>>> What tipped me off was inspecting the GstBuffer object in-memory, and
>>> seeing when I specified NV12, the GstBuffer->Pool->Object->name was
>>> "mfxvideobufferpool0", whereas if I specify AYUV, the name is
>>> "videobufferpool0".
>>> 
>>> So, does this appear to be a bug in the intel media encoder side of
>>> things? Or am I still not using something correctly? If it is a bug
>>> in the Intel side of things, do you have any advice on how to track
>>> it down? They've been pretty unhelpful so far regarding various other
>>> issues I've encountered using their stuff.
>> 
>> Does it work correctly if you use AYUV?
>> 
>> I don't see anything specifically wrong in your code, especially not in
>> the fill() function. What you say sounds like the MFX plugin is
>> providing you with a buffer pool that provides buffers that can't be
>> write-mapped. That would be a bug in the MFX plugin.
>> 
>> Which MFX plugin are you using? Does it work better if you use
>> something else instead?
>> 
>> 
>> In any case, instead of just providing the code in-line in a mail it
>> would be good if you could provide a git repository with everything
>> needed to compile and run it so that it's easier to check what exactly
>> is happening.
>> 
>> That said, from what you describe it sounds like a bug in the MFX
>> plugin.
>> 
>> --
>> Sebastian Dröge, Centricular Ltd · https://www.centricular.com
>> _______________________________________________
>> gstreamer-devel mailing list
>> gstreamer-devel at lists.freedesktop.org
>> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
> 
> ------------------------------
> 
> Subject: Digest Footer
> 
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
> 
> ------------------------------
> 
> End of gstreamer-devel Digest, Vol 100, Issue 12
> ************************************************

