From anilkumar03006 at hotmail.com Mon Mar 1 05:39:56 2021
From: anilkumar03006 at hotmail.com (anil0407)
Date: Sun, 28 Feb 2021 23:39:56 -0600 (CST)
Subject: video play fast with gst new buffer allocation
Message-ID: <1614577196958-0.post@n4.nabble.com>

Hi,

In a _chain function (custom plugin), when I copy the existing GstBuffer into a newly allocated GstBuffer, I observe that the video plays very fast. What changes are required to make it play at normal speed? Here is the code snippet:

buf = gst_buffer_make_writable (buf);
buf_size = gst_buffer_get_size (buf);
n_buf = gst_buffer_new_allocate (NULL, buf_size, NULL);
gst_buffer_map (n_buf, &n_map, GST_MAP_WRITE);
if (gst_buffer_map (buf, &map, GST_MAP_WRITE)) {
  ptr = (guint16 *) map.data;
  memcpy ((guint16 *) n_map.data, (guint16 *) ptr, buf_size);
  gst_buffer_unmap (buf, &map);
  gst_buffer_unmap (n_buf, &n_map);
  return gst_pad_push (filter->srcpad, n_buf);
}

Thanks,
Anil

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From deepanshu.saxena30 at gmail.com Mon Mar 1 12:01:57 2021
From: deepanshu.saxena30 at gmail.com (deepanshu)
Date: Mon, 1 Mar 2021 06:01:57 -0600 (CST)
Subject: Gstreamer webrtcbin flow stuck at add-ice-candidate signal
In-Reply-To: <4af33a22-8118-bc5b-a3ad-e8e65979d5d4@gmail.com>
References: <1614278444335-0.post@n4.nabble.com> <249be6ba-39e5-ba86-db6e-a5b076e1f651@gmail.com> <1614416918797-0.post@n4.nabble.com> <4af33a22-8118-bc5b-a3ad-e8e65979d5d4@gmail.com>
Message-ID: <1614600117505-0.post@n4.nabble.com>

Hi Matthew,

Thanks for your valuable input. Please find attached a document containing the objective and my approach, along with the source code: problem_statement_and_source_code.zip

It would be really helpful if you could guide me in getting this resolved. Thanks in advance.
Deepanshu

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From franka1986 at gmail.com Mon Mar 1 12:41:34 2021
From: franka1986 at gmail.com (Fran Raga)
Date: Mon, 1 Mar 2021 06:41:34 -0600 (CST)
Subject: Injecting metadata from file MISP
In-Reply-To: 
References: <1586802356984-0.post@n4.nabble.com>
Message-ID: <1614602494710-0.post@n4.nabble.com>

After many years I have found a clue. I have published what I think could be a good starting point for multiplexing a video and ingesting telemetry (KLV). It's only a draft, but I think it's a good start; I'm not a GStreamer expert. https://gist.github.com/All4Gis/509fbe06ce53a0885744d16595811e6f

To extract the KLV we are developing this library, which extracts KLV data and parses it in real time: https://github.com/All4Gis/QGISFMV/tree/master/code/klvdata
If you don't want to use it inside QGIS, you can use the library separately: https://github.com/paretech/klvdata

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From riccardo at phascode.org Mon Mar 1 13:09:00 2021
From: riccardo at phascode.org (RiccardoCagnasso)
Date: Mon, 1 Mar 2021 07:09:00 -0600 (CST)
Subject: Gtk3 window unresponsive on windows 10 when video embedded in
Message-ID: <1614604140771-0.post@n4.nabble.com>

Using this code I can embed a videosink in a GTK3 window's drawing area on Windows 10. But this causes the window to become unresponsive and eventually triggers the Windows 10 unresponsive-application error message.

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From keith.thornton at zeiss.com Mon Mar 1 13:18:17 2021
From: keith.thornton at zeiss.com (Thornton, Keith)
Date: Mon, 1 Mar 2021 13:18:17 +0000
Subject: AW: video play fast with gst new buffer allocation
In-Reply-To: <1614577196958-0.post@n4.nabble.com>
References: <1614577196958-0.post@n4.nabble.com>
Message-ID: 

Hi,
your new buffer is missing the PTS, DURATION and OFFSET metadata. You should set these with the values from the original buffer, e.g. with gst_buffer_copy_into (n_buf, buf, GST_BUFFER_COPY_TIMESTAMPS, 0, -1).
Gruesse

-----Ursprüngliche Nachricht-----
Von: gstreamer-devel Im Auftrag von anil0407
Gesendet: Montag, 1. März 2021 06:40
An: gstreamer-devel at lists.freedesktop.org
Betreff: video play fast with gst new buffer allocation

Hi,

In a _chain function (custom plugin), when I copy the existing GstBuffer into a newly allocated GstBuffer, I observe that the video plays very fast. What changes are required to make it play at normal speed? Here is the code snippet:

buf = gst_buffer_make_writable (buf);
buf_size = gst_buffer_get_size (buf);
n_buf = gst_buffer_new_allocate (NULL, buf_size, NULL);
gst_buffer_map (n_buf, &n_map, GST_MAP_WRITE);
if (gst_buffer_map (buf, &map, GST_MAP_WRITE)) {
  ptr = (guint16 *) map.data;
  memcpy ((guint16 *) n_map.data, (guint16 *) ptr, buf_size);
  gst_buffer_unmap (buf, &map);
  gst_buffer_unmap (n_buf, &n_map);
  return gst_pad_push (filter->srcpad, n_buf);
}

Thanks,
Anil

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
gstreamer-devel at lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From luca.bacci982 at gmail.com Mon Mar 1 14:36:00 2021
From: luca.bacci982 at gmail.com (Luca Bacci)
Date: Mon, 1 Mar 2021
15:36:00 +0100
Subject: Gtk3 window unresponsive on windows 10 when video embedded in
In-Reply-To: <1614604140771-0.post@n4.nabble.com>
References: <1614604140771-0.post@n4.nabble.com>
Message-ID: 

Hi Riccardo,
I can't see any code in your email using the GMail web client. Could you post it again?

On Mon 1 Mar 2021 at 14:09, RiccardoCagnasso wrote:
> Using this code I can embed a videosink in a GTK3 window's drawing area on
> Windows 10
>
> But this causes the window to become unresponsive and eventually triggers
> the windows 10 unresponsive application error message.
>
> --
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From riccardo at phascode.org Mon Mar 1 16:56:43 2021
From: riccardo at phascode.org (RiccardoCagnasso)
Date: Mon, 1 Mar 2021 10:56:43 -0600 (CST)
Subject: Gtk3 window unresponsive on windows 10 when video embedded in
In-Reply-To: 
References: <1614604140771-0.post@n4.nabble.com>
Message-ID: <1614617803703-0.post@n4.nabble.com>

The code seems to have been removed for some reason. I'll post a pastebin: https://pastebin.com/jNVb2jMN

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From riccardo at phascode.org Mon Mar 1 16:57:47 2021
From: riccardo at phascode.org (RiccardoCagnasso)
Date: Mon, 1 Mar 2021 10:57:47 -0600 (CST)
Subject: Virtual webcam on Windows 10
Message-ID: <1614617867862-0.post@n4.nabble.com>

Is there some way to create a virtual webcam on Windows 10 and put the output of a pipeline into it? Something like using v4l2loopback with v4l2sink on Linux.
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From contact at olivieraubert.net Mon Mar 1 18:33:39 2021
From: contact at olivieraubert.net (Olivier Aubert)
Date: Mon, 01 Mar 2021 19:33:39 +0100
Subject: Gtk3 window unresponsive on windows 10 when video embedded in
In-Reply-To: <1614617803703-0.post@n4.nabble.com>
References: <1614604140771-0.post@n4.nabble.com> <1614617803703-0.post@n4.nabble.com>
Message-ID: <924cf113d3217c0e69a71fe68f7fbe824df736ab.camel@olivieraubert.net>

FWIW, on win32 I transitioned from using the Overlay API with a drawing area to using the gtksink element, which directly provides a Gtk widget. It is much more convenient to handle.

Olivier

On Mon, 2021-03-01 at 10:56 -0600, RiccardoCagnasso wrote:
> The code seems to be removed for some reason.
>
> I'll post a pastebin https://pastebin.com/jNVb2jMN
>
> --
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From franka1986 at gmail.com Mon Mar 1 12:49:59 2021
From: franka1986 at gmail.com (Fran Raga)
Date: Mon, 1 Mar 2021 06:49:59 -0600 (CST)
Subject: python code for muxing klv and video
In-Reply-To: 
References: 
Message-ID: <1614602999681-0.post@n4.nabble.com>

I have made some changes to your code: https://gist.github.com/All4Gis/509fbe06ce53a0885744d16595811e6f

If I use your code with all the frames of a video that is, for example, 1 minute long, the pipeline dies; I don't know what happens. I need to cut the video to, for example, the first 10 seconds. Another point: the resulting video gives me a "meta/x-klv" error when I play it, but the video plays without problems.
I have made some changes to avoid the variable "t"; I don't think that step is necessary, and neither is extracting the frames. We can use OpenCV to read the frames in real time and ingest the data; that is only my proposal. I would be happy to make this code work well with your help.

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From seungha at centricular.com Mon Mar 1 19:13:06 2021
From: seungha at centricular.com (Seungha Yang)
Date: Mon, 1 Mar 2021 13:13:06 -0600 (CST)
Subject: Virtual webcam on Windows 10
In-Reply-To: <1614617867862-0.post@n4.nabble.com>
References: <1614617867862-0.post@n4.nabble.com>
Message-ID: <1614625986439-0.post@n4.nabble.com>

Hi,

Unfortunately there's no direct equivalent to v4l2loopback on Windows. There are two options:

- You can implement your own DirectShow COM server which serves video data to client DirectShow source filters as if it were a webcam device, so that client DirectShow applications can read from it. OBS supports a virtual webcam on Windows using this approach.
- Or, you can implement a Windows driver which serves video data. That way, both DirectShow and Media Foundation (and probably Windows kernel streaming) applications can read data from the virtual webcam.

In any case, it's not simple to achieve on Windows.

Regards,
Seungha

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From Bruce.Liang at abilitycorp.com.tw Tue Mar 2 03:58:56 2021
From: Bruce.Liang at abilitycorp.com.tw (Bruce Liang)
Date: Tue, 2 Mar 2021 03:58:56 +0000
Subject: gst-rtsp-server DESCRIBE fail if previous client PLAY without SETUP all streams
Message-ID: <5B8432EE2A8F384A945090BB45D0E6F5011B77076A@HQ-EXMS.VID-HQ.vqti.com.tw>

Hello,

I want to push my own video and audio data to an RTSP server, so I chose /gst-rtsp-server/examples/test-appsrc2.c as a base to develop my application.
The pipeline description has two separate pipelines in it (/gst-rtsp-server/examples/test-appsrc2.c):

gst_rtsp_media_factory_set_launch (factory,
    "( appsrc name=videosrc ! h264parse ! rtph264pay name=pay0 pt=96 "
    "  appsrc name=audiosrc ! audioconvert ! rtpL24pay name=pay1 pt=97 )");

Client1 only sends SETUP for stream0 (the video stream) and then sends a PLAY request (client1 doesn't need the audio stream, so it sets up the video stream only). Then client2 sends a DESCRIBE request and receives a [503 Service Unavailable] response. The debug log shows:

0:00:03.537321800 2007 0x7fffeae68760 ERROR basesrc gstbasesrc.c:3088:gst_base_src_loop: pausing task, reason not-linked
0:00:03.538008300 2007 0x7fffeae68760 WARN basesrc gstbasesrc.c:3137:gst_base_src_loop: error: Internal data stream error.
0:00:03.538212500 2007 0x7fffeae68760 WARN basesrc gstbasesrc.c:3137:gst_base_src_loop: error: streaming stopped, reason not-linked (-1)

How can I fix this? Is there any setting I have to feed to gst-rtsp-server before I start it?

Thanks,
Bruce Liang

*** Confidentiality Note *** This e-mail message and any accompanying documents are for sole use of the intended recipient(s) and may contain confidential and privileged information. Any unauthorized review, use, disclosure or distribution is prohibited. If you are not the intended recipient, please contact the sender by reply e-mail and destroy all copies of the original message.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: From vandanachadha at yahoo.com Tue Mar 2 09:02:56 2021 From: vandanachadha at yahoo.com (Vandana Chadha) Date: Tue, 2 Mar 2021 09:02:56 +0000 (UTC) Subject: transcoding pipeline for dvbsrc and dashsink References: <344230640.1132224.1614675776879.ref@mail.yahoo.com> Message-ID: <344230640.1132224.1614675776879@mail.yahoo.com> Need to read from dvbsrc which has audio and video content both and transcode the content using aac and x264 into the dashsink Am trying the following amongst many other tries, but not working: gst-launch-1.0 dashsink name=dashsink mpd-baseurl=http://localhost/media/kllywq84 mpd-root-path=/var/www/localhost/media/kllywq84 mpd-filename=live.mpd target-duration=5 min-buffer-time=10 minimum-update-period=10 dynamic=true muxer=ts dvbsrc modulation=5 adapter=0 frequency=147000000 delsys=dvb-c-b ! tsdemux name=demux demux. ! queue ! audioresample ! audioconvert ! avenc_aac bitrate = 128000 demux. ! queue ! x264enc bitrate=1200 key-int-max=60 ! video/x-h264,stream-format=byte-stream,profile=main ! dashsink. Output:Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... /GstPipeline:pipeline0/GstDashSink:dashsink/GstSplitMuxSink:splitmuxsink0/GstFileSink:sink: async = false Pipeline is PREROLLED ... Setting pipeline to PLAYING ... /GstPipeline:pipeline0/GstDvbSrc:dvbsrc0.GstPad:src: caps = video/mpegts, mpegversion=(int)2, systemstream=(boolean)true /GstPipeline:pipeline0/GstTSDemux:demux.GstPad:sink: caps = video/mpegts, mpegversion=(int)2, systemstream=(boolean)true New clock: GstSystemClock WARNING: from element /GstPipeline:pipeline0/GstTSDemux:demux: Delayed linking failed. Additional debug info: subprojects/gstreamer/gst/parse/grammar.y(544): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstTSDemux:demux: failed delayed linking some pad of GstTSDemux named demux to some pad of GstQueue named queue0 WARNING: from element /GstPipeline:pipeline0/GstTSDemux:demux: Delayed linking failed. 
Additional debug info:
subprojects/gstreamer/gst/parse/grammar.y(544): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstTSDemux:demux:
failed delayed linking some pad of GstTSDemux named demux to some pad of GstQueue named queue1
ERROR: from element /GstPipeline:pipeline0/GstDvbSrc:dvbsrc0: Internal data stream error.
Additional debug info:
../subprojects/gstreamer/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstDvbSrc:dvbsrc0: streaming stopped, reason not-linked (-1)

Regards,
Vandana

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From scerveau at gmail.com Tue Mar 2 09:11:46 2021
From: scerveau at gmail.com (dabrain34)
Date: Tue, 2 Mar 2021 03:11:46 -0600 (CST)
Subject: transcoding pipeline for dvbsrc and dashsink
In-Reply-To: <344230640.1132224.1614675776879@mail.yahoo.com>
References: <344230640.1132224.1614675776879@mail.yahoo.com>
Message-ID: <1614676306467-0.post@n4.nabble.com>

Hello Vandana,

You need to explicitly give the name of the dashsink pad to be used by each transcoding branch. For example, for a video branch you'll use dashsink.video_0, and for audio dashsink.audio_0. If you want to add an additional transcoding path, you'll use an audio pad named dashsink.audio_1.

Can you give this pipeline a try?

$ gst-launch-1.0 dashsink name=dashsink mpd-baseurl=http://localhost/media/kllywq84 mpd-root-path=/var/www/localhost/media/kllywq84 mpd-filename=live.mpd target-duration=5 min-buffer-time=10 minimum-update-period=10 dynamic=true muxer=ts dvbsrc modulation=5 adapter=0 frequency=147000000 delsys=dvb-c-b ! tsdemux name=demux demux. ! queue ! audioresample ! audioconvert ! avenc_aac bitrate=128000 ! dashsink.audio_0 demux. ! queue ! x264enc bitrate=1200 key-int-max=60 ! video/x-h264,stream-format=byte-stream,profile=main ! dashsink.video_0

Regards.
Stéphane

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From michiel at aanmelder.nl Tue Mar 2 09:21:31 2021
From: michiel at aanmelder.nl (Michiel Konstapel)
Date: Tue, 2 Mar 2021 10:21:31 +0100
Subject: two sink and one src custome plugin
In-Reply-To: <1613719127830-0.post@n4.nabble.com>
References: <1613638727612-0.post@n4.nabble.com> <1613661972766-0.post@n4.nabble.com> <1613719127830-0.post@n4.nabble.com>
Message-ID: <5b8fb4a1-1b8c-04da-34be-50218db10c56@aanmelder.nl>

I think input-selector might be what you want: "Direct one out of N input streams to the output pad."

HTH,
Michiel

On 19-02-2021 08:18, anil0407 wrote:
> Hi,
>
> Thanks for your reply. Like an N-to-1 pipe fitting; can you suggest any
> reference plugin if one exists (the opposite of tee)?
> Pad Templates:
>   SINK template: 'sink_%u'
>     Availability: On request
>     Capabilities: ANY
>
>   SRC template: 'src'
>     Availability: Always
>     Capabilities: ANY
>
> Thanks,
> Anil

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From michiel at aanmelder.nl Tue Mar 2 09:29:00 2021
From: michiel at aanmelder.nl (Michiel Konstapel)
Date: Tue, 2 Mar 2021 10:29:00 +0100
Subject: Combining 2 Video Streams Side by Side, Getting Choppy Results
In-Reply-To: 
References: 
Message-ID: 

On 16-02-2021 16:06, Nicolas Dufresne wrote:
> Le mardi 16 février 2021 à 02:50 -0500, Jim Ruxton a écrit :
>> I am trying to combine 2 video streams. One from my laptop camera and
>> one from an external webcam. I'm using the following pipeline and
>> getting very choppy results. Neither my CPU nor GPU seems to be working
>> hard. Any ideas how to make this less choppy?
>>
>> ./gst-launch-1.0 v4l2src device=/dev/video4 ! videoscale ! videoconvert !
>> video/x-raw,format=YUY2,framerate=30/1,width=640,height=480 ! alpha alpha=1 !
>> videobox border-alpha=0 left=-640 ! videomixer name=Mix ! videoconvert !
>> autovideosink v4l2src device=/dev/video2 ! videoscale ! videoconvert !
>> video/x-raw,format=YUY2,framerate=30/1,width=640,height=480 ! alpha alpha=1 !
>> videobox border-alpha=0 right=-640 ! Mix.
>
> First recommendation: move away from videomixer and use compositor
> (videomixer is just a backward-compatibility shim on top of compositor). As you have
> a live pipeline, you should release the composition pressure by
> configuring a latency on the compositor element. The latency is in
> nanoseconds; one or two frames of latency should be fine in general,
> but the default is none, and that would only work with perfectly synched
> sources which have accurate latency (v4l2src does not have accurate
> latency, it simply claims 1 frame, always).
>
> Second recommendation: consider adding a queue before your display
> sink; this will improve the timeout logic inside the compositor, by
> giving it a bit more freedom (with thread separation).

Interesting! Does any of the above apply to *gl*videomixer as well? Are there any specific considerations for using that in a live pipeline? Should we also give that more latency? I see the glmixerbin superclass has a property for that, also defaulting to zero.

Kind regards,
Michiel

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From nicholas at umantec.net Tue Mar 2 11:34:12 2021
From: nicholas at umantec.net (Nick_law)
Date: Tue, 2 Mar 2021 05:34:12 -0600 (CST)
Subject: How to dynamically link to tee after pipeline started using gst_parse_launch
Message-ID: <1614684852011-0.post@n4.nabble.com>

Good afternoon all,

Thanks to @gotsring in this post http://gstreamer-devel.966125.n4.nabble.com/Is-it-possible-to-dynamically-update-tee-sink-interleave-src-td4696637.html I have learnt how to unlink and relink pads in a pipeline.

What I would like to do now is start a pipeline with audiotestsrcs connected to a fakesink and, once the pipeline is started, link the audiotestsrcs using tees to the corresponding interleave pads.
I am unsure if this is even possible using gst_parse_launch but I am hoping someone is able to lead me in a direction where I can get access to interleave and tee pads without having them linked in the original parse launch string. I can see that the tee plugin has a num-src-pads property which would allow to specify the number of pads needed but thereafter I don't know how to access them. The main problem being that I would hope a tee src pad doesn't need to be defined with a named pad in the string. It would also be nice to be able to access individual interleave pads without creating named pads as well. i.e i.src_2 or t2_src3 etc just from the named tee and interleave values. The main idea is to be able to create a multichannel (16 channels +) pipeline that initially outputs 0 until audio sources are mapped to the desired channels. I have a current broken example that shows a little of what I'm trying to do but obviously missing some core concepts: ``` gst_init(NULL, NULL); GError* err = NULL; std::stringstream gst_parse; //audiotestsrc needed the is-live=true flag added to allow for remapping and continued playout. //Currently unsure how that would react with a file player (wavparse and rawaudioparse) gst_parse << "interleave name=i ! audioconvert ! wavenc ! filesink location=test.wav "; gst_parse << "audiotestsrc is-live=true ! tee name=t1 "; gst_parse << "audiotestsrc is-live=true wave=2 ! tee name=t2 "; gst_parse << "audiotestsrc is-live=true wave=4 ! tee name=t3 "; //Silent wave used as Fakesrc gst_parse << "t3. ! queue ! volume name=out0 volume=1.0 ! i. "; gst_parse << "t3. ! queue ! volume name=out1 volume=1.0 ! i. "; gst_parse << "t3. ! queue ! volume name=out2 volume=1.0 ! i. "; gst_parse << "t3. ! queue ! volume name=out3 volume=1.0 ! i. "; gst_parse << "t1. ! queue ! volume name=vol0 volume=1.0 ! fakesink "; //Ideally this is not needed to start pipeline gst_parse << "t2. ! queue ! volume name=vol1 volume=1.0 ! 
fakesink "; GstElement* pipeline = gst_parse_launch(gst_parse.str().c_str(), &err); if (err != NULL) { g_print("Pipeline failed to be created! You did a bad parse!\n"); g_printerr("Error: %s\n", err->message); return -1; } // Get elements by name GstElement* out0 = gst_bin_get_by_name(GST_BIN(pipeline), "out0"); GstElement* out1 = gst_bin_get_by_name(GST_BIN(pipeline), "out1"); GstElement* out2 = gst_bin_get_by_name(GST_BIN(pipeline), "out2"); GstElement* out3 = gst_bin_get_by_name(GST_BIN(pipeline), "out3"); GstElement* vol0 = gst_bin_get_by_name(GST_BIN(pipeline), "vol0");//Again ideally not needed GstElement* vol1 = gst_bin_get_by_name(GST_BIN(pipeline), "vol1"); if ( !out0 || !out1 || !out2 || !out3 || !vol0 || !vol1) { g_print("Error getting bins!\n"); return -1; } // Get their the sink and src pads that need to be unlinked and re-linked GstPad* out0_sink_pad = gst_element_get_static_pad(out0, "sink"); //sink side is where the fakesrc silent wave is conected GstPad* out1_sink_pad = gst_element_get_static_pad(out1, "sink"); GstPad* out2_sink_pad = gst_element_get_static_pad(out2, "sink"); GstPad* out3_sink_pad = gst_element_get_static_pad(out3, "sink"); GstPad* vol0_src_pad = gst_element_get_static_pad(vol0, "src"); //src side is where the Fakesink is connected GstPad* vol1_src_pad = gst_element_get_static_pad(vol1, "src"); if ( !out0_sink_pad || !out1_sink_pad || !out2_sink_pad || !out3_sink_pad || !vol0_src_pad || !vol1_src_pad) { g_print("Error getting src pads!\n"); return -1; } // I'm not sure that pad order is guaranteed when the pipeline is constructed, // meaning interleave pad sink1 might be connected to vol0 instead of vol1 // So, get pads based on peers! peers are the element that the pad is connected to. 
GstPad* fakesrc0_pad = gst_pad_get_peer(out0_sink_pad); GstPad* fakesrc1_pad = gst_pad_get_peer(out1_sink_pad); GstPad* fakesrc2_pad = gst_pad_get_peer(out2_sink_pad); GstPad* fakesrc3_pad = gst_pad_get_peer(out3_sink_pad); GstPad* fakesink0_pad = gst_pad_get_peer(vol0_src_pad); GstPad* fakesink1_pad = gst_pad_get_peer(vol1_src_pad); if (!fakesrc0_pad || !fakesrc1_pad || !fakesrc2_pad || !fakesrc3_pad || !fakesink0_pad || !fakesink1_pad) { g_print("Error getting peer pads!\n"); return -1; } // Play for a bit g_print("Playing...\n"); gst_element_set_state(pipeline, GST_STATE_PLAYING); GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "playing_orig"); g_usleep(2000000); // Swap around pads g_print("Rearranging pads...\n"); gst_element_set_state(pipeline, GST_STATE_PAUSED); gboolean good = TRUE; good &= gst_pad_unlink(fakesrc0_pad, out0_sink_pad); good &= gst_pad_unlink(fakesrc1_pad, out1_sink_pad); good &= gst_pad_unlink(fakesrc2_pad, out2_sink_pad); good &= gst_pad_unlink(fakesrc3_pad, out3_sink_pad); good &= gst_pad_unlink(vol0_src_pad, fakesink0_pad); good &= gst_pad_unlink(vol1_src_pad, fakesink1_pad); if (!good) { g_print("Pad unlink was not good!"); REQUIRE(0); } // This is where I am unsure how to relink to a tee that hasn't got a named pad in the original parse launch string. // The relink won't work as the vol*_src_pads are being relinked to more than one pad. 
good &= gst_pad_link(vol0_src_pad, out0_sink_pad) == GST_PAD_LINK_OK; good &= gst_pad_link(vol1_src_pad, out1_sink_pad) == GST_PAD_LINK_OK; good &= gst_pad_link(vol0_src_pad, out2_sink_pad) == GST_PAD_LINK_OK; good &= gst_pad_link(vol1_src_pad, out3_sink_pad) == GST_PAD_LINK_OK; //It seems that you MUST reconnect the fakesinks and Fakesrcs or the relink doesn't play out good &= gst_pad_link(fakesrc0_pad, fakesink0_pad) == GST_PAD_LINK_OK; good &= gst_pad_link(fakesrc1_pad, fakesink1_pad) == GST_PAD_LINK_OK; good &= gst_pad_link(fakesrc2_pad, fakesink0_pad) == GST_PAD_LINK_OK; good &= gst_pad_link(fakesrc3_pad, fakesink1_pad) == GST_PAD_LINK_OK; if (!good) { g_print("Pad relink was not good!"); REQUIRE(0); } // Play some more g_print("Playing more...\n"); gst_element_set_state(pipeline, GST_STATE_PLAYING); GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "playing_swapped"); g_usleep(2000000); g_print("Stopping...\n"); //TODO: test this with the hanging fifo issue and see if that stops gstreamer from hanging gst_element_send_event(pipeline, gst_event_new_eos()); gst_element_set_state(pipeline, GST_STATE_NULL); g_print("Done.\n"); ``` I hope I have explained my problem clearly and that someone can help? Thanks so much, Nick -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicholas at umantec.net Tue Mar 2 11:46:58 2021 From: nicholas at umantec.net (Nick_law) Date: Tue, 2 Mar 2021 05:46:58 -0600 (CST) Subject: concat multiple .mp4 files with KLV In-Reply-To: References: Message-ID: <1614685618657-0.post@n4.nabble.com> Have a look at multifilesrc: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-multifilesrc.html You should be able to play multiple files as if they are one if their names are the same but an index appended. 
i.e in.0.mp4 in.1.mp4 multifilesrc location=in.%01d.mp4 -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From josh at joshdoe.com Tue Mar 2 12:30:59 2021 From: josh at joshdoe.com (Josh Doe) Date: Tue, 2 Mar 2021 07:30:59 -0500 Subject: Gstreamer plugin CMakeList.txt In-Reply-To: <1614347280976-0.post@n4.nabble.com> References: <1614347280976-0.post@n4.nabble.com> Message-ID: On Fri, Feb 26, 2021 at 11:35 AM anil0407 wrote: > Can any one suggest gstreamer plugin CMakeLists.txt for cmake build system. > I use CMake in gst-plugins-vision [0], that being said I plan to switch to Meson "soon." [0]: https://github.com/joshdoe/gst-plugins-vision -------------- next part -------------- An HTML attachment was scrubbed... URL: From random.plant at gmail.com Tue Mar 2 12:50:49 2021 From: random.plant at gmail.com (Anton Novikov) Date: Tue, 2 Mar 2021 17:50:49 +0500 Subject: RFC: MIDI's MIME types Message-ID: Hi everyone, I plan to write a MIDI sequencer/editor and would like to use GStreamer for separate rendering of each track via wildmidi/timidity/fluidsynth/whatever and mixing it up with audiomixer plugin. (It is crucial that tracks have separate channel namespaces, it will be a certain extension of SMF, if more precisely -- for microtonal tunings support). What we/you have now is wildmididec which consumes audio/midi, which currently stands for SMF file data. I propose to break that up to Standard Midi File (and RIFF) parser (of .mid being mapped to something like application/smf) and renderer of resulting audio/midi streams containing raw track data (division number being specified in renderer's variable parameter). Or I could just abandon audio/midi and use something like audio/x-raw-midi in a newly written timidity/fluidsynth renderer, to not mess up with wildmididec. What do you think? Thanks, Anek -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From deepanshu.saxena30 at gmail.com Tue Mar 2 14:33:12 2021 From: deepanshu.saxena30 at gmail.com (deepanshu) Date: Tue, 2 Mar 2021 08:33:12 -0600 (CST) Subject: Gstreamer webrtcbin flow stuck at add-ice-candidate signal In-Reply-To: <1614600117505-0.post@n4.nabble.com> References: <1614278444335-0.post@n4.nabble.com> <249be6ba-39e5-ba86-db6e-a5b076e1f651@gmail.com> <1614416918797-0.post@n4.nabble.com> <4af33a22-8118-bc5b-a3ad-e8e65979d5d4@gmail.com> <1614600117505-0.post@n4.nabble.com> Message-ID: <1614695592597-0.post@n4.nabble.com> Hi, Any update on this. It will be really helpful. Thanks Deepanshu -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From jean_philippe_arnaud at yahoo.fr Tue Mar 2 16:51:28 2021 From: jean_philippe_arnaud at yahoo.fr (jean-philippe) Date: Tue, 2 Mar 2021 10:51:28 -0600 (CST) Subject: How to retrieve last-sample from d3d11videosink Message-ID: <1614703888453-0.post@n4.nabble.com> I need to store a snapshot from my live display. I'm using the D3D11 elements on Windows 10 with GStreamer 1.18.3. The relevant section of the pipeline: "d3d11h264dec ! d3d11convert ! d3d11videosink". Variable 'sample' is null after the call in the following code: GstSample* sample = nullptr; g_object_get(videosink, "last-sample", &sample, NULL); if (!sample) THROW("Failed getting sample"); If the last-sample option does not work, what alternative do I have to take a snapshot? Thanks -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Tue Mar 2 17:28:52 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Tue, 02 Mar 2021 12:28:52 -0500 Subject: Combining 2 Video Streams Side by Side, Getting Choppy Results In-Reply-To: References: Message-ID: Le mardi 02 mars 2021 ? 10:29 +0100, Michiel Konstapel a ?crit?: > On 16-02-2021 16:06, Nicolas Dufresne wrote: > > Le mardi 16 f?vrier 2021 ? 02:50 -0500, Jim Ruxton a ?crit?: > > > I am trying to combine 2 video streams. 
One from my laptop camera and one
>> from an external webcam. I'm using the following pipeline and getting very
>> choppy results. Neither my CPU nor GPU seems to be working hard. Any ideas how
>> to make this less choppy?
>>
>> ./gst-launch-1.0 v4l2src device=/dev/video4 ! videoscale ! videoconvert !
>> video/x-raw,format=YUY2,framerate=30/1,width=640,height=480 ! alpha alpha=1 !
>> videobox border-alpha=0 left=-640 ! videomixer name=Mix ! videoconvert !
>> autovideosink v4l2src device=/dev/video2 ! videoscale ! videoconvert !
>> video/x-raw,format=YUY2,framerate=30/1,width=640,height=480 ! alpha alpha=1 !
>> videobox border-alpha=0 right=-640 ! Mix.
>
> First recommendation: move away from videomixer and use compositor (videomixer
> is just a backward-compatibility shim on top of compositor). As you have
> a live pipeline, you should release the composition pressure by
> configuring a latency on the compositor element. The latency is in
> nanoseconds; one or two frames of latency should be fine in general,
> but the default is none, and that would only work with perfectly synched
> sources which have accurate latency (v4l2src does not have accurate
> latency, it simply claims 1 frame, always).
>
> Second recommendation: consider adding a queue before your display
> sink; this will improve the timeout logic inside the compositor, by
> giving it a bit more freedom (with thread separation).
>
> Interesting! Does any of the above apply to *gl*videomixer as well? Are there
> any specific considerations for using that in a live pipeline? Should we also
> give that more latency? I see the glmixerbin superclass has a property for
> that, also defaulting to zero.

glvideomixer is based on GstVideoAggregator, hence it works fine for live pipelines. The mixer bin is there to help take care of uploading the pixels to your GPU, as GPUs don't (or rarely) use linear / malloc'd pixel data.

> Kind regards,
> Michiel

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From nicolas at ndufresne.ca Tue Mar 2 18:39:29 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Tue, 02 Mar 2021 13:39:29 -0500 Subject: How to retrieve last-sample from d3d11videosink In-Reply-To: <1614703888453-0.post@n4.nabble.com> References: <1614703888453-0.post@n4.nabble.com> Message-ID:

Le mardi 02 mars 2021 à 10:51 -0600, jean-philippe a écrit : > I need to store a snapshot from my live display. I'm using the D3D11 elements > on Windows 10 with GStreamer 1.18.3. > > The relevant section of the pipeline: "d3d11h264dec ! d3d11convert ! > d3d11videosink". > > Variable 'sample' is null after the call in the following code: > > GstSample* sample = nullptr; > > g_object_get(videosink, "last-sample", &sample, NULL); > if (!sample) >     THROW("Failed getting sample"); > > If the last-sample option does not work, what alternative do I have to take > a snapshot?

Can you check if the enable-last-sample property is set?

> > Thanks > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From seungha at centricular.com Tue Mar 2 18:56:58 2021 From: seungha at centricular.com (Seungha Yang) Date: Tue, 2 Mar 2021 12:56:58 -0600 (CST) Subject: How to retrieve last-sample from d3d11videosink In-Reply-To: <1614703888453-0.post@n4.nabble.com> References: <1614703888453-0.post@n4.nabble.com> Message-ID: <1614711418359-0.post@n4.nabble.com>

Hi, g_object_get(videosink, "last-sample", &sample, NULL) could return a null sample if it's called before the videosink has received any buffer from upstream. Regards, Seungha -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From riccardo at phascode.org Wed Mar 3 10:02:54 2021 From: riccardo at phascode.org (RiccardoCagnasso) Date: Wed, 3 Mar 2021 04:02:54 -0600 (CST) Subject: Gtk3 window unresponsive on windows 10
when video embedded in In-Reply-To: <924cf113d3217c0e69a71fe68f7fbe824df736ab.camel@olivieraubert.net> References: <1614604140771-0.post@n4.nabble.com> <1614617803703-0.post@n4.nabble.com> <924cf113d3217c0e69a71fe68f7fbe824df736ab.camel@olivieraubert.net> Message-ID: <1614765774394-0.post@n4.nabble.com>

Well, gtksink works fine. Too bad that you can't choose which videosink you are actually using. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From michal.rudowicz at fl9.eu Wed Mar 3 10:17:36 2021 From: michal.rudowicz at fl9.eu (=?UTF-8?Q?Micha=C5=82_Rudowicz?=) Date: Wed, 03 Mar 2021 11:17:36 +0100 Subject: Multiple synchronized video streams into one appsink Message-ID: <5c3fcee4-662b-429c-b9c2-2695e711ebd0@www.fastmail.com>

Hi, I'm trying to create a pipeline that, simplified, looks like this:

v4l2src ! tee ! videoconvert ! videoscale --\
         \                                   \
          \                                   + appsink
           \---------------------------------/

The idea is to be able to perform operations on a converted and scaled stream in the appsink, but - depending on the result of the operations - also being able to access the unmodified stream in there. Synchronization of frames is important - there's no point in that if the original and scaled frames do not share the same source frame. Is there an easy way to do that in GStreamer, or should I move the convert and scaling to the appsink and essentially handle that myself? Thanks in advance for any tips!

From contact at olivieraubert.net Wed Mar 3 10:40:39 2021 From: contact at olivieraubert.net (Olivier Aubert) Date: Wed, 03 Mar 2021 11:40:39 +0100 Subject: Gtk3 window unresponsive on windows 10 when video embedded in In-Reply-To: <1614765774394-0.post@n4.nabble.com> References: <1614604140771-0.post@n4.nabble.com> <1614617803703-0.post@n4.nabble.com> <924cf113d3217c0e69a71fe68f7fbe824df736ab.camel@olivieraubert.net> <1614765774394-0.post@n4.nabble.com> Message-ID:

On Wed, 2021-03-03 at 04:02 -0600, RiccardoCagnasso wrote: > Well, gtksink works fine.
Too bad that you can't choose which videosink you > are actually using.

AFAICS from the sources[1], gtksink does not use a specific videosink: it simply renders frame buffers using the cairo API, which may be suboptimal in terms of performance wrt. specialized protocols like XVideo, but can be considered an acceptable trade-off for the simplification in integration. You may also try gtkglsink (in combination with glupload). Olivier

[1] https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/blob/1.18/ext/gtk/gtkgstwidget.c#L42

From jean_philippe_arnaud at yahoo.fr Wed Mar 3 12:22:02 2021 From: jean_philippe_arnaud at yahoo.fr (jean-philippe) Date: Wed, 3 Mar 2021 06:22:02 -0600 (CST) Subject: How to retrieve last-sample from d3d11videosink In-Reply-To: References: <1614703888453-0.post@n4.nabble.com> Message-ID: <1614774122559-0.post@n4.nabble.com>

enable-last-sample is set to TRUE by default, but setting it to TRUE anyway makes last-sample work.

videosink = gst_element_factory_make("d3d11videosink", "");
int x = -1;
g_object_get(videosink, "enable-last-sample", &x, NULL);
--> x == TRUE, but last-sample returns nullptr

g_object_set(videosink, "enable-last-sample", TRUE, NULL);
g_object_get(videosink, "enable-last-sample", &x, NULL);
--> x == TRUE, last-sample returns a valid GstSample *

Many thanks for the help! Do you need me to file a bug report? If so, can you let me know where please? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From gotsring at live.com Wed Mar 3 15:22:26 2021 From: gotsring at live.com (gotsring) Date: Wed, 3 Mar 2021 09:22:26 -0600 (CST) Subject: How to dynamically link to tee after pipeline started using gst_parse_launch In-Reply-To: <1614684852011-0.post@n4.nabble.com> References: <1614684852011-0.post@n4.nabble.com> Message-ID: <1614784946477-0.post@n4.nabble.com>

May I ask why you would like to use gst_parse_launch for this?
Yes, it's probably possible, but it seems either way you need to hold references to certain elements or pads. I'd wager that it's better to programmatically create elements and save references to them in an array, especially for this amount of manipulation. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From rfernandez at sensia-solutions.com Wed Mar 3 16:24:01 2021 From: rfernandez at sensia-solutions.com (rfn) Date: Wed, 3 Mar 2021 10:24:01 -0600 (CST) Subject: no element ksvideosrc Message-ID: <1614788641664-0.post@n4.nabble.com>

Hi everybody, I'm fairly new to gstreamer, so excuse me if this is trivial. I have a working pipeline on Linux using:

descr = g_strdup_printf ("v4l2src device=/dev/video0 ! video/x-raw,format=GRAY16_LE ! videoscale ! video/x-raw,width=640,height=480 ! videoconvert ! appsink name=sink");
data->pipe = gst_parse_launch (descr, &error);

Now I am trying to get the equivalent code for Windows. I was trying by changing the source, using ksvideosrc device-name=*device-name* instead of v4l2src device=/dev/video0. This returns: No element ksvideosrc. However, if I execute in cmd:

gst-launch-1.0 ksvideosrc device-name=*device-name* ! video/x-raw,format=GRAY16_LE ! videoscale ! video/x-raw,width=640,height=480 ! videoconvert ! autovideosink

it works just fine. Since ksvideosrc works on the command line but not in the code, I am guessing that some kind of include, library ... is missing. I have tried including basically every .dll and library. I used both the gstreamer runtime and development installers in complete mode. Any idea what it could be? Kind regards, Raul. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From riccardo at phascode.org Wed Mar 3 17:24:32 2021 From: riccardo at phascode.org (RiccardoCagnasso) Date: Wed, 3 Mar 2021 11:24:32 -0600 (CST) Subject: Please consider ndi support Message-ID: <1614792272757-0.post@n4.nabble.com>

I hope this post is not off topic on this list.
I would like to make the case to the gstreamer team to include NDI support in the default gstreamer distribution. NDI is a very useful tool. I tested ndisink from https://github.com/davidmhewitt/gst-plugin-ndi and it works just fine on Linux. The code seems simple enough. I wonder if it is possible to include it in good/bad/ugly with time. It would be a shame if it goes undeveloped for whatever reason. I didn't test any ndisrc, but I know that there's one available. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From sebastian at centricular.com Wed Mar 3 17:39:08 2021 From: sebastian at centricular.com (Sebastian =?ISO-8859-1?Q?Dr=F6ge?=) Date: Wed, 03 Mar 2021 19:39:08 +0200 Subject: Please consider ndi support In-Reply-To: <1614792272757-0.post@n4.nabble.com> References: <1614792272757-0.post@n4.nabble.com> Message-ID:

On Wed, 2021-03-03 at 11:24 -0600, RiccardoCagnasso wrote: > I hope this post is not off topic on this list. > > I would like to make the case to the gstreamer team to include NDI support in > the default gstreamer distribution. NDI is a very useful tool. > > I tested ndisink from https://github.com/davidmhewitt/gst-plugin-ndi and it > works just fine on Linux. The code seems simple enough. I wonder if it is > possible to include it in good/bad/ugly with time. It would be a shame if > it goes undeveloped for whatever reason. > > I didn't test any ndisrc, but I know that there's one available.

There's an NDI plugin available here: https://github.com/teltek/gst-plugin-ndi and a PR by me that adds a sink here: https://github.com/teltek/gst-plugin-ndi/pull/55

That plugin is used in production by a few people by now so should generally work well, and I'll also continue working on it as needed and time permits. The main reason why I didn't consider including it as part of gst-plugins-rs is because it requires a proprietary SDK, so nobody's going to ship it as part of their packages and you'll have to build it yourself anyway.
I don't know about the one you linked, but it's written in Vala so it can't really be part of the good/bad/ugly GStreamer plugin modules for that reason too, in addition to the complications with the SDK. -- Sebastian Dröge, Centricular Ltd · https://www.centricular.com

From deepanshu.saxena30 at gmail.com Wed Mar 3 18:31:40 2021 From: deepanshu.saxena30 at gmail.com (deepanshu) Date: Wed, 3 Mar 2021 12:31:40 -0600 (CST) Subject: Gstreamer webrtcbin flow stuck at add-ice-candidate signal In-Reply-To: <1614695592597-0.post@n4.nabble.com> References: <1614278444335-0.post@n4.nabble.com> <249be6ba-39e5-ba86-db6e-a5b076e1f651@gmail.com> <1614416918797-0.post@n4.nabble.com> <4af33a22-8118-bc5b-a3ad-e8e65979d5d4@gmail.com> <1614600117505-0.post@n4.nabble.com> <1614695592597-0.post@n4.nabble.com> Message-ID: <1614796300906-0.post@n4.nabble.com>

I was able to get some more logs by setting export G_MESSAGES_DEBUG=all and export NICE_DEBUG=all, in case these might be of some help in understanding the issue: https://pastebin.com/gKHKQZsX

The below logs keep on repeating:

(python:9058): libnice-DEBUG: 23:55:45.760: Agent 0x325dbf0 : stream 1: timer tick #51: 0 frozen, 0 in-progress, 0 waiting, 1 succeeded, 0 discovered, 0 nominated, 1 waiting-for-nom, 1 valid.
(python:9058): libnice-DEBUG: 23:55:45.761: Agent 0x325dbf0 : stream 2: timer tick #51: 0 frozen, 0 in-progress, 0 waiting, 0 succeeded, 0 discovered, 0 nominated, 0 waiting-for-nom, 0 valid.

Thanks -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From keith.thornton at zeiss.com Thu Mar 4 04:50:24 2021 From: keith.thornton at zeiss.com (Thornton, Keith) Date: Thu, 4 Mar 2021 04:50:24 +0000 Subject: AW: no element ksvideosrc In-Reply-To: <1614788641664-0.post@n4.nabble.com> References: <1614788641664-0.post@n4.nabble.com> Message-ID:

Hi, try the same with videotestsrc. It may be that the environments in which they are running are different (PATH, GST_PLUGIN_PATH).
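Keith's environment hint can be checked directly by comparing what the working command line sees with what the application process sees. The commands below are a diagnostic sketch in Windows cmd syntax (ksvideosrc is Windows-only); the install path shown is a typical installer location and an assumption, not taken from the thread.

```shell
rem Does the CLI environment find the plugin, and which .dll provides it?
gst-inspect-1.0 ksvideosrc

rem Which extra plugin directories are configured? (empty means only the built-in default)
echo GST_PLUGIN_PATH=%GST_PLUGIN_PATH%

rem If the application loads a different GStreamer installation, pointing it at the
rem same plugin directory is a common fix for "No element ksvideosrc"
rem (the path below is an assumed installer location):
set GST_PLUGIN_PATH=C:\gstreamer\1.0\msvc_x86_64\lib\gstreamer-1.0
```

Running the same checks from inside the environment that launches the application (or printing these variables from the application itself) usually shows where the two setups diverge.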
Regards

-----Original Message----- From: gstreamer-devel On behalf of rfn Sent: Wednesday, 3 March 2021 17:24 To: gstreamer-devel at lists.freedesktop.org Subject: no element ksvideosrc

_______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From nicholas at umantec.net Thu Mar 4 07:22:33 2021 From: nicholas at umantec.net (Nick_law) Date: Thu, 4 Mar 2021 01:22:33 -0600 (CST) Subject: How to dynamically link to tee after pipeline started using gst_parse_launch In-Reply-To: <1614784946477-0.post@n4.nabble.com> References: <1614684852011-0.post@n4.nabble.com> <1614784946477-0.post@n4.nabble.com> Message-ID: <1614842553759-0.post@n4.nabble.com>

Hi @gotsring, Short answer: legacy and time. I am trying to get the go-ahead to remove the dependency on gst_parse_launch but will need a considerable amount of time to remove, refactor and learn everything that is needed to construct the pipelines fully using the gstreamer api. So I would still like to know how to do it with gst_parse_launch, in case I don't have enough time to re-implement. regards, Nick

gotsring wrote > May I ask why you would like to use gst_parse_launch for this? Yes, it's > probably possible, but it seems either way you need to hold references to > certain elements or pads.
> > I'd wager that it's better to programmatically create elements and save > references to them in an array, especially for this amount of > manipulation. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From dongjin.ha at 42dot.ai Thu Mar 4 08:31:12 2021 From: dongjin.ha at 42dot.ai (dongjin.ha) Date: Thu, 4 Mar 2021 02:31:12 -0600 (CST) Subject: How can I set my custom memory buffer to v4l2src userptr? Message-ID: <1614846672339-0.post@n4.nabble.com>

Hi all, My C++ application runs the camera with gst_parse_launch("my_gst_command_string"); For example, my command string is:

"v4l2src device=/dev/video" + std::to_string(num) + " ! video/x-raw, format=UYVY, width=1920,height=1080 ! \
    videoconvert ! video/x-raw, format=RGB ! appsink name=mysink"

Actually, with this command there's no problem. But I want to allocate a specific buffer and make my gst command use it. I could set v4l2src io-mode to userptr, but I don't know how I can assign my buffer to my gstreamer pipeline in my application. Could you help or give some guidance on assigning my buffer to my gstreamer pipeline? Thanks in advance. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From anilkumar03006 at hotmail.com Thu Mar 4 11:46:00 2021 From: anilkumar03006 at hotmail.com (anil0407) Date: Thu, 4 Mar 2021 05:46:00 -0600 (CST) Subject: AW: video play fast with gst new buffer allocation In-Reply-To: References: <1614577196958-0.post@n4.nabble.com> Message-ID: <1614858360737-0.post@n4.nabble.com>

Hi, After adding the PTS, DURATION and OFFSET stuff, a short video plays fine but a longer video hangs.
Here is the snippet code:

buf = gst_buffer_make_writable (buf);
buf_size = gst_buffer_get_size (buf);
n_buf = gst_buffer_new_allocate (NULL, buf_size, NULL);
gst_buffer_map (n_buf, &n_map, GST_MAP_WRITE);
if (gst_buffer_map (buf, &map, GST_MAP_WRITE)) {
    ptr = (guint16 *) map.data;
    GST_BUFFER_DURATION (n_buf) = GST_BUFFER_DURATION (buf);
    GST_BUFFER_PTS (n_buf) = GST_BUFFER_PTS (buf);
    GST_BUFFER_OFFSET(n_buf) = GST_BUFFER_OFFSET(buf);
    memcpy((guint16 *)n_map.data, (guint16 *)ptr, buf_size);
    gst_buffer_unmap (buf, &map);
    gst_buffer_unmap (n_buf, &n_map);
    return gst_pad_push (filter->srcpad, n_buf);
}

Thanks, Anil -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From nicolas at ndufresne.ca Thu Mar 4 15:31:32 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Thu, 04 Mar 2021 10:31:32 -0500 Subject: How can I set my custom memory buffer to v4l2src userptr? In-Reply-To: <1614846672339-0.post@n4.nabble.com> References: <1614846672339-0.post@n4.nabble.com> Message-ID:

Le jeudi 04 mars 2021 à 02:31 -0600, dongjin.ha a écrit : > Hi all, > > My c++ application runs the camera with > gst_parse_launch("my_gst_command_string"); > For example, my command string is here. > > "v4l2src device=/dev/video" + std::to_string(num) + " ! video/x-raw, > format=UYVY, width=1920,height=1080 ! \ > videoconvert ! video/x-raw, format=RGB ! appsink name=mysink" > > Actually with this command, there's no problem. > > But I want to allocate specific buffer and make my gst command use it. > I could set v4l2src io-mode to userptr but I don't know how can I assign my > buffer to my gstreamer pipeline in my application. > Could you help or give some guide to assign my buffer to my gstreamer > pipeline?

If you enable userptr, v4l2src will use downstream buffers as provided by the pool in the allocation query. If videoconvert is active, videoconvert will be the provider. With appsink, to handle the allocation query you need to use a pad probe; this is not very hard.
Here is an example; it's not offering any pool in this case though, that part is left as an exercise: https://gitlab.freedesktop.org/mesa/kmscube/-/blob/master/gst-decoder.c#L242

WARNING: V4L2 drivers don't always operate with virtual memory. Drivers also have limited support for user-provided strides. Your mileage may vary from platform to platform. Also, userptr requires blind cache flushing, as we cannot assume whether it will be used by the CPU or the device. This has an important performance impact. I'm not sure why you need this, but I'm sure it will be difficult and you should handle a copy fallback.

> > Thanks in advance. > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From riccardo at phascode.org Thu Mar 4 17:22:13 2021 From: riccardo at phascode.org (RiccardoCagnasso) Date: Thu, 4 Mar 2021 11:22:13 -0600 (CST) Subject: Please consider ndi support In-Reply-To: References: <1614792272757-0.post@n4.nabble.com> Message-ID: <1614878533734-0.post@n4.nabble.com>

I tried your ndisink. It works really well. I hope it's merged into the main branch soon. I'll build a PKGBUILD for AUR, maybe. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From toby at rosecott.net Thu Mar 4 20:23:03 2021 From: toby at rosecott.net (Toby Widdows) Date: Thu, 04 Mar 2021 20:23:03 +0000 Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files Message-ID:

Hi, I am by no means a developer, and I have struggled to get my head around gstreamer.
I have what on the face of it seems like a simple issue, and one which I have achieved with FFMPEG and some Linux scripting. However, I have a Jetson Nano that I want to use for this, and the FFMPEG someone has built to support the nvidia encoders and decoders gives me weird video issues, and as the nvidia forums promote gstreamer, I thought I would give this a go.

I have video in MKV containers in H264 video format with multiple audio and multiple subtitle tracks. I want to be able to transcode the video, and pass through all the audio and subtitle tracks into a new MKV file. I have figured out how to do the video, but cannot get my head around the methods needed to do the audio and subtitle tracks. I can discover how many audio and subtitle tracks there are and can build a script around all this, if I could only just figure out the syntax for gst-launch. Help much appreciated. And apologies in advance for the series of stupid questions that may come out of this thread. This is my starting point:

gst-launch-1.0 filesrc location=./2.mkv \
  ! matroskademux \
  ! h264parse \
  ! nvv4l2decoder \
  ! nvv4l2h265enc bitrate=2000000 \
  ! h265parse \
  ! matroskamux \
  ! progressreport update-freq=1 \
  ! filesink location=./new2.mkv

Thanks again in advance Toby

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From aaron at sipsorcery.com Thu Mar 4 22:42:43 2021 From: aaron at sipsorcery.com (Aaron Clauson) Date: Thu, 4 Mar 2021 22:42:43 +0000 Subject: Static build fails to load libnice elements Message-ID:

I'm experimenting with non-static and static gstreamer builds using gst-build on an Ubuntu docker image. In both cases I'm able to successfully build gstreamer and compile my basic application that's using a webrtcbin pipeline.
Note that static in this case is referring to the "Static build" description from the gst-build readme (https://gitlab.freedesktop.org/gstreamer/gst-build#static-build) rather than attempting to link my executable with actual static libraries.

For the non-static build my application is able to run the webrtcbin pipeline as expected. For static builds I get the warning below about libnice and the pipeline fails to launch (I definitely have libnice-dev installed and both docker images install an identical set of packages).

0:00:54.272738800 16 0x558ef5419c00 WARN webrtcbin gstwebrtcbin.c:133:_have_nice_elements: error: libnice elements are not available

My understanding with the static build was that all the plugins are included in libgstreamer-full-1.0.so. The libnice symbols all seem to be in that library:

root at 3600f52e3abd:/src/gst-webrtc-echo/builddir# nm /usr/local/lib/x86_64-linux-gnu/libgstreamer-full-1.0.so | grep nice
0000000000caaf60 t _gst_nice_thread
0000000000c95700 t _have_nice_elements
0000000000cb2310 t _nice_agent_stream_ids_get_type
0000000000cb22f0 t _nice_agent_stream_ids_get_type_once
...

The meson command I'm using to generate the static build is:

meson -Dgood=enabled -Dgst-plugins-good:vpx=enabled \
  -Dgst-plugins-good:rtpmanager=enabled \
  -Dbad=enabled -Dgst-plugins-bad:dtls=enabled \
  -Dbad=enabled -Dgst-plugins-bad:srtp=enabled \
  -Dbad=enabled -Dgst-plugins-bad:webrtc=enabled \
  --default-library=static \
  builddir

And the command I then use to build my application is:

gcc gst-webrtc-echo.c -I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/x86_64-linux-gnu/glib-2.0/include -I/usr/local/include/gstreamer-1.0 -L/usr/local/lib -L/usr/local/lib/x86_64-linux-gnu -lgstreamer-full-1.0 -lglib-2.0 -lgmodule-2.0 -lgobject-2.0 -lgio-2.0 -lpthread -lm -ldl -lpcre -lffi -levent -lcjson

If anybody can spot anything I might be doing wrong, any suggestions would be welcome.
Aaron

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From olivier.crete at collabora.com Thu Mar 4 23:00:29 2021 From: olivier.crete at collabora.com (Olivier =?ISO-8859-1?Q?Cr=EAte?=) Date: Thu, 04 Mar 2021 18:00:29 -0500 Subject: Static build fails to load libnice elements In-Reply-To: References: Message-ID: <130a856b16515170276c9d65aaae13ce3244ab07.camel@collabora.com>

Hi, I'm pretty sure that the libnice elements don't get included in libgstreamer-full, you have to link them separately as they're not officially part of GStreamer. Olivier

On Thu, 2021-03-04 at 22:42 +0000, Aaron Clauson wrote: > I'm experimenting with non-static and static gstreamer builds using > the gst-build on an Ubuntu docker image. In both cases I'm able to > successfully build gstreamer and compile my basic application that's > using a webrtcbin pipeline. > > Note that static in this case is referring to the "Static build" > description from the gst-build readme > (https://gitlab.freedesktop.org/gstreamer/gst-build#static-build) > rather than attempting to link my executable with actual static > libraries. > > For the non-static build my application is able to run the > webrtcbin pipeline as expected. For static builds I get the warning > below about libnice and the pipeline fails to launch (I definitely > have libnice-dev installed and both docker images install an > identical set of packages). > > 0:00:54.272738800 16 0x558ef5419c00 WARN webrtcbin > gstwebrtcbin.c:133:_have_nice_elements: error: libnice > elements are not available > > My understanding with the static build was that all the plugins are > included in libgstreamer-full-1.0.so.
The libnice symbols all seem to > be in that library: > > root at 3600f52e3abd:/src/gst-webrtc-echo/builddir# nm > /usr/local/lib/x86_64-linux-gnu/libgstreamer-full-1.0.so | grep nice > 0000000000caaf60 t _gst_nice_thread > 0000000000c95700 t _have_nice_elements > 0000000000cb2310 t _nice_agent_stream_ids_get_type > 0000000000cb22f0 t _nice_agent_stream_ids_get_type_once > ...

> The meson command I'm using to generate the static build is: > > meson -Dgood=enabled -Dgst-plugins-good:vpx=enabled \ > -Dgst-plugins-good:rtpmanager=enabled \ > -Dbad=enabled -Dgst-plugins-bad:dtls=enabled \ > -Dbad=enabled -Dgst-plugins-bad:srtp=enabled \ > -Dbad=enabled -Dgst-plugins-bad:webrtc=enabled \ > --default-library=static \ > builddir

> And the command I then use to build my application is: > > gcc gst-webrtc-echo.c -I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/x86_64-linux-gnu/glib-2.0/include -I/usr/local/include/gstreamer-1.0 -L/usr/local/lib -L/usr/local/lib/x86_64-linux-gnu -lgstreamer-full-1.0 -lglib-2.0 -lgmodule-2.0 -lgobject-2.0 -lgio-2.0 -lpthread -lm -ldl -lpcre -lffi -levent -lcjson

> If anybody can spot anything I might be doing wrong, any suggestions > would be welcome. > > Aaron

_______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

-- Olivier Crête olivier.crete at collabora.com

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From aaron at sipsorcery.com Thu Mar 4 23:25:02 2021 From: aaron at sipsorcery.com (Aaron Clauson) Date: Thu, 4 Mar 2021 23:25:02 +0000 Subject: Static build fails to load libnice elements Message-ID:

> I'm pretty sure that the libnice elements don't get included in libgstreamer-full, you have to link them separately as they're not officially part of GStreamer.
Thanks, But why would libnice be treated differently from any other plugin?

A quick comparison of the symbols in libnice.so and related symbols in libgstreamer-full-1.0.so does seem to show that at least some of the libnice symbols are being pulled in. For example, from libnice.so:

000000000000b630 T nice_address_copy_to_sockaddr
000000000000b840 T nice_address_dup
000000000000b760 T nice_address_equal
000000000000b9f0 T nice_address_equal_no_port
...
0000000000037380 T stun_agent_build_unknown_attributes_error
00000000000360b0 T stun_agent_default_validater
0000000000036e90 T stun_agent_finish_message
0000000000036a80 T stun_agent_forget_transaction
0000000000036060 T stun_agent_init
...

From libgstreamer-full-1.0.so:

0000000000cb1c40 t nice_address_copy_to_sockaddr
0000000000cb1e50 t nice_address_dup
0000000000cb1d70 t nice_address_equal
0000000000cb2010 t nice_address_equal_no_port
...
0000000000cde970 t stun_agent_build_unknown_attributes_error
0000000000cdd650 t stun_agent_default_validater
0000000000cdd4b0 t stun_agent_find_unknowns.isra.0
0000000000cde480 t stun_agent_finish_message
0000000000cde070 t stun_agent_forget_transaction
0000000000cdd600 t stun_agent_init
...

-------------- next part -------------- An HTML attachment was scrubbed... URL:

From olivier.crete at collabora.com Thu Mar 4 23:47:33 2021 From: olivier.crete at collabora.com (Olivier =?ISO-8859-1?Q?Cr=EAte?=) Date: Thu, 04 Mar 2021 18:47:33 -0500 Subject: Static build fails to load libnice elements In-Reply-To: References: Message-ID: <01159d972b7ef43d4ac5cbb5758ee28dcfc8b3e7.camel@collabora.com>

Hi, Those symbols are from the library, not from the plugin. The simple reason the libnice plugin isn't included is that I think you need to do some meson magic inside the libnice build system to make it be included...
and, with my libnice maintainer hat on, I have no idea what this is, so merge requests are welcome ;) Olivier

On Thu, 2021-03-04 at 23:25 +0000, Aaron Clauson wrote: > > I'm pretty sure that the libnice elements don't get included in > libgstreamer-full, you have to link them separately as they're not > officially part of GStreamer. > > Thanks, > > But why would libnice be treated differently from any other plugin? > > A quick comparison of the symbols in libnice.so and related symbols > in libgstreamer-full-1.0.so does seem to show that at least some of > the libnice symbols are being pulled in. > > For example, from libnice.so: > > 000000000000b630 T nice_address_copy_to_sockaddr > 000000000000b840 T nice_address_dup > 000000000000b760 T nice_address_equal > 000000000000b9f0 T nice_address_equal_no_port > ... > 0000000000037380 T stun_agent_build_unknown_attributes_error > 00000000000360b0 T stun_agent_default_validater > 0000000000036e90 T stun_agent_finish_message > 0000000000036a80 T stun_agent_forget_transaction > 0000000000036060 T stun_agent_init > ... > > From libgstreamer-full-1.0.so: > > 0000000000cb1c40 t nice_address_copy_to_sockaddr > 0000000000cb1e50 t nice_address_dup > 0000000000cb1d70 t nice_address_equal > 0000000000cb2010 t nice_address_equal_no_port > ...
URL: From nicolas at ndufresne.ca Fri Mar 5 00:51:10 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Thu, 04 Mar 2021 19:51:10 -0500 Subject: Static build fails to load libnice elements In-Reply-To: References: Message-ID: <22f33d43fb5abb63e8fcaffb09ce053677ad796d.camel@ndufresne.ca> Le jeudi 04 mars 2021 ? 23:25 +0000, Aaron Clauson a ?crit?: > > I'm pretty sure that the libnice elements don't get included in > libgstreamer-full, you have to link them separately as they're not > officially part of GStreamer. > > Thanks, > > But why would libnice be treated differently from any other plugin? The creation of the gstreamer-full library is closely tied to GStreamer build system. The libnice plugin is not part of GStreamer project, but par of libnice project. I think there would be some values to try and figure-out a way, but for now, the build system is simply not aware of the existance of that plugin (nothing is hardcoded, it's all done through meson). > > A quick?comparison of the symbols in libnice.so and related symbols in > libgstreamer-full-1.0.so does seem to show that at least some of the libnice > symbols are being pulled in. > > For example.?From libnice.so: > > 000000000000b630 T nice_address_copy_to_sockaddr > 000000000000b840 T nice_address_dup > 000000000000b760 T nice_address_equal > 000000000000b9f0 T nice_address_equal_no_port > ... > 0000000000037380 T stun_agent_build_unknown_attributes_error > 00000000000360b0 T stun_agent_default_validater > 0000000000036e90 T stun_agent_finish_message > 0000000000036a80 T stun_agent_forget_transaction > 0000000000036060 T stun_agent_init > ... > > From?libgstreamer-full-1.0.so: > > 0000000000cb1c40 t nice_address_copy_to_sockaddr > 0000000000cb1e50 t nice_address_dup > 0000000000cb1d70 t nice_address_equal > 0000000000cb2010 t nice_address_equal_no_port > ... 
> 0000000000cde970 t stun_agent_build_unknown_attributes_error > 0000000000cdd650 t stun_agent_default_validater > 0000000000cdd4b0 t stun_agent_find_unknowns.isra.0 > 0000000000cde480 t stun_agent_finish_message > 0000000000cde070 t stun_agent_forget_transaction > 0000000000cdd600 t stun_agent_init > ... > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicolas at ndufresne.ca Fri Mar 5 02:21:59 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Thu, 04 Mar 2021 21:21:59 -0500 Subject: Static build fails to load libnice elements In-Reply-To: <01159d972b7ef43d4ac5cbb5758ee28dcfc8b3e7.camel@collabora.com> References: <01159d972b7ef43d4ac5cbb5758ee28dcfc8b3e7.camel@collabora.com> Message-ID: <4594073e58e4e6783b281858b3b7311698f38400.camel@ndufresne.ca> Le jeudi 04 mars 2021 à 18:47 -0500, Olivier Crête a écrit : > Hi, > > Those symbols are from the library, not from the plugin. The simple reason the > libnice plugin isn't included is that I think you need to do some meson magic > inside the libnice build system to make it be included... and, with my libnice > maintainer hat on, I have no idea what this is, so merge requests are welcome > ;) Ah, it uses introspection, just like gst-env.py, just need to define a plugins array somewhere. Sent an MR. https://gitlab.freedesktop.org/libnice/libnice/-/merge_requests/190 commit c4c58f22917c2519f3d64ba67cb8fe993c31f9f3 (HEAD -> gst-env-fix, dev/gst-env-fix) Author: Nicolas Dufresne Date: Thu Mar 4 21:18:47 2021 -0500 gst: Fix gst-env and libgstreamer-full.so suppport gst-build uses meson introspection and reads the plugins array of each subproject in order to locate the plugins.
Setting libnice plugins array allow enabling nice plugin in both gst-env.py and when building single library libgstreamer-full.so. diff --git a/gst/meson.build b/gst/meson.build index 091a37f..572c6ab 100644 --- a/gst/meson.build +++ b/gst/meson.build @@ -16,6 +16,7 @@ libgstnice = library('gstnice', link_with: libnice, install_dir: gst_plugins_install_dir, install: true) +plugins = [libgstnice] # Generate pc files for static plugins if we build static plugins if get_option('default_library') != 'shared' --- -------------- next part -------------- An HTML attachment was scrubbed... URL: From keith.thornton at zeiss.com Fri Mar 5 06:04:39 2021 From: keith.thornton at zeiss.com (Thornton, Keith) Date: Fri, 5 Mar 2021 06:04:39 +0000 Subject: AW: AW: video play fast with gst new buffer allocation In-Reply-To: <1614858360737-0.post@n4.nabble.com> References: <1614577196958-0.post@n4.nabble.com> <1614858360737-0.post@n4.nabble.com> Message-ID: Hi Just as an example you might like to look at the following GstSample* pSample = gst_app_sink_pull_sample(appsink); if (!pSample) { MULTIMEDIA_ERROR << " sample == nullptr"; return GST_FLOW_ERROR; } GstBuffer* pBuffer = gst_sample_get_buffer(pSample); if (!pBuffer) { MULTIMEDIA_ERROR << "failed to get buffer from sample"; gst_sample_unref(pSample); return GST_FLOW_ERROR; } GstBuffer* pNewBuffer = gst_buffer_copy_deep(pBuffer); if (!pNewBuffer) { MULTIMEDIA_ERROR << "failed to create new buffer for streaming"; return GST_FLOW_ERROR; } GstCaps* pNewCaps = gst_sample_get_caps(pSample); GstSegment* pSegment = gst_sample_get_segment(pSample); GstSample* pNewSample = gst_sample_new(pNewBuffer, pNewCaps, pSegment, nullptr); gst_sample_unref(pSample); if (! 
pNewSample) { MULTIMEDIA_ERROR << "failed to create new sample"; gst_buffer_unref(pNewBuffer); return GST_FLOW_ERROR; } GstFlowReturn ret = gst_app_src_push_sample(appSrc, pNewSample); if (ret == GST_FLOW_OK) { MULTIMEDIA_TRACE << "PTS=" << GST_BUFFER_PTS(pNewBuffer) << ", DTS=" << GST_BUFFER_DTS(pNewBuffer) << ", DURATION=" << GST_BUFFER_DURATION(pNewBuffer); } else { MULTIMEDIA_ERROR << "failed to push the buffer downstream, error code = " << ret; gst_buffer_unref(pNewBuffer); gst_sample_unref(pNewSample); return GST_FLOW_OK; } // g_signal_emit_by_name(appsrc, "push-buffer", buffer, &ret); gst_buffer_unref(pNewBuffer); gst_sample_unref(pNewSample); return ret; } -----Ursprüngliche Nachricht----- Von: gstreamer-devel Im Auftrag von anil0407 Gesendet: Donnerstag, 4. März 2021 12:46 An: gstreamer-devel at lists.freedesktop.org Betreff: Re: AW: video play fast with gst new buffer allocation Hi, After adding the PTS, DURATION and OFFSET stuff, a short video now plays fine, but a longer video hangs.
Here the snippet code: buf = gst_buffer_make_writable (buf); buf_size = gst_buffer_get_size (buf); n_buf = gst_buffer_new_allocate (NULL, buf_size, NULL); gst_buffer_map (n_buf, &n_map, GST_MAP_WRITE); if (gst_buffer_map (buf, &map, GST_MAP_WRITE)) { ptr = (guint16 *) map.data; GST_BUFFER_DURATION (n_buf) = GST_BUFFER_DURATION (buf); GST_BUFFER_PTS (n_buf) = GST_BUFFER_PTS (buf); GST_BUFFER_OFFSET(n_buf) = GST_BUFFER_OFFSET(buf); memcpy((guint16 *)n_map.data, (guint16 *)ptr, buf_size); gst_buffer_unmap (buf, &map); gst_buffer_unmap (n_buf, &n_map); return gst_pad_push (filter->srcpad, n_buf); } Thanks, Anil -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From gevaria.purva at einfochips.com Fri Mar 5 06:53:42 2021 From: gevaria.purva at einfochips.com (Purva) Date: Fri, 5 Mar 2021 00:53:42 -0600 (CST) Subject: videoscale not working with android Message-ID: <1614927222486-0.post@n4.nabble.com> Hi, I am developing an Android application. I want to scale the video to a fixed width and height, but videoscale is not working. Below is my pipeline code: "udpsrc port=2222 !" + "queue ! " + "h265parse ! " + "decodebin3 ! 
" + "videoscale ! " + "glimagesink sync=false" Any suggestions on same ? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From aaron at sipsorcery.com Fri Mar 5 12:19:14 2021 From: aaron at sipsorcery.com (Aaron Clauson) Date: Fri, 5 Mar 2021 12:19:14 +0000 Subject: Static build fails to load libnice elements Message-ID: > Ah, it uses introspection, just like gst-env.py, just need to define a plugins > array somewhere. Sent an MR. > > https://gitlab.freedesktop.org/libnice/libnice/-/merge_requests/190 Great, thanks a lot. That fixed my problem! -------------- next part -------------- An HTML attachment was scrubbed... URL: From gotsring at live.com Fri Mar 5 16:54:03 2021 From: gotsring at live.com (gotsring) Date: Fri, 5 Mar 2021 10:54:03 -0600 (CST) Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: References: Message-ID: <1614963243590-0.post@n4.nabble.com> I'll be honest, I've not really dealt with more than one stream out of matroskademux, so not sure if there's a more automatic way. You can try something like this. This assumes the audio is FLAC, thus the flacparse, but you should be able to change out the parser (or maybe remove it?) for other audio streams. gst-launch-1.0.exe filesrc location=2.mkv ! matroskademux name=d \ d.video_0 ! queue ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=2000000 ! h265parse ! matroskamux name=m ! filesink location=new2.mkv \ d.audio_0 ! queue ! flacparse ! m. \ d.audio_1 ! queue ! flacparse ! m. 
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From toby at rosecott.net Fri Mar 5 17:19:51 2021 From: toby at rosecott.net (toby at rosecott.net) Date: Fri, 5 Mar 2021 17:19:51 +0000 Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: <1614963243590-0.post@n4.nabble.com> References: <1614963243590-0.post@n4.nabble.com> Message-ID: <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@Spark> So by 'remove it' you mean remove the '! flacparse'? Regards Toby On 5 Mar 2021, 16:54 +0000, gotsring , wrote: > I'll be honest, I've not really dealt with more than one stream out of > matroskademux, so not sure if there's a more automatic way. > > You can try something like this. This assumes the audio is FLAC, thus the > flacparse, but you should be able to change out the parser (or maybe remove > it?) for other audio streams. > > gst-launch-1.0.exe filesrc location=2.mkv ! matroskademux name=d \ > d.video_0 ! queue ! h264parse ! nvv4l2decoder ! nvv4l2h265enc > bitrate=2000000 ! h265parse ! matroskamux name=m ! filesink > location=new2.mkv \ > d.audio_0 ! queue ! flacparse ! m. \ > d.audio_1 ! queue ! flacparse ! m. > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From gotsring at live.com Fri Mar 5 17:32:08 2021 From: gotsring at live.com (gotsring) Date: Fri, 5 Mar 2021 11:32:08 -0600 (CST) Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@Spark> References: <1614963243590-0.post@n4.nabble.com> <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@Spark> Message-ID: <1614965528428-0.post@n4.nabble.com> Yep.
Though I would expect that you'll probably have to replace it with an audio parser that matches your audio streams. You'll have to experiment with this. You can test the outputs with gst-play-1.0. I think pressing 'a' or 's' in the output window switches between the audio and subtitle streams, so you can verify that they're all there. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From toby at rosecott.net Fri Mar 5 20:05:25 2021 From: toby at rosecott.net (Toby Widdows) Date: Fri, 05 Mar 2021 20:05:25 +0000 Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: <1614965528428-0.post@n4.nabble.com> References: <1614963243590-0.post@n4.nabble.com> <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@spark> <1614965528428-0.post@n4.nabble.com> Message-ID: Hi, so I've been fiddling and can get the audio streams to pass through if I know what they are, e.g. gst-launch-1.0 filesrc location=2.mkv ! matroskademux name=demux demux.video_0 ! queue ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=2000000 ! h265parse ! matroskamux name=mux ! progressreport update-freq=1 ! filesink location=2222.mkv demux.audio_0 ! queue ! aacparse ! mux. This works and passes the AAC audio track through to the output with no apparent loss in performance. I got this from the NVIDIA forum for GStreamer, and it does the same thing but does not need to know what the audio stream is: gst-launch-1.0 filesrc location=2.mkv ! matroskademux name=demux demux.video_0 ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! mux.video_0 demux.audio_0 ! queue ! mux.audio_0 matroskamux name=mux ! progressreport update-freq=1 ! filesink location=2222.mkv I can see there are differences, but because I can't understand the command structure, and it appears to be very flexible, I can't make the top one work using the bottom one's syntax. Everything is in a different order, and to my poor brain that just seems wrong!
TIA Toby ------ Original Message ------ From: "gotsring" To: gstreamer-devel at lists.freedesktop.org Sent: 05/03/2021 17:32:08 Subject: Re: Questions about transcoding video with audio and subtitle passthrough using MKV files Yep. Though I would expect that you'll probably have to replace it with an audio parser that matches your audio streams. You'll have to experiment with this. You can test the outputs with gst-play-1.0. I think pressing 'a' or 's' in the output window switches between the audio and subtitle streams, so you can verify that they're all there. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From gotsring at live.com Fri Mar 5 21:18:42 2021 From: gotsring at live.com (gotsring) Date: Fri, 5 Mar 2021 15:18:42 -0600 (CST) Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: References: <1614963243590-0.post@n4.nabble.com> <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@Spark> <1614965528428-0.post@n4.nabble.com> Message-ID: <1614979122804-0.post@n4.nabble.com> I sure hope I'm not explaining this wrong, but here goes: It looks like they're manually directing *types* of streams between the demuxer and muxer instead of letting it try to decide on its own. Broken down, you have 3 sections of the pipeline: 1. The demuxer, which reads an existing file and splits out the types of streams (like audio, video, subtitles) 2. The transcode/passthrough, where each individual stream is transcoded (video) or left alone (audio). 3. The muxer, which re-combines the new streams into an output file So looking at the pipeline, you can break that into 3 sections: Demuxer section: filesrc location=2.mkv ! matroskademux name=demux Transcode/Passthrough section demux.video_0 ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! 
nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! mux.video_0 demux.audio_0 ! queue ! mux.audio_0 Muxer Section matroskamux name=mux ! progressreport update-freq=1 ! filesink location=2222.mkv You'll notice that the transcode/passthrough section is much larger because they are routing individual streams instead of the one file. You'll also notice that they are using named elements to take things from the demuxer and put it into the muxer (conveniently named demux and mux). So the video is transcoded here: demux.video_0 ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! mux.video_0 And the audio stream is left as is and routed from the audio_0 pad of the demuxer to the audio_0 pad of the muxer. demux.audio_0 ! queue ! mux.audio_0 If there was another audio track, you could also pass that forwards by incrementing the pad count: demux.audio_1 ! queue ! mux.audio_1 Hopefully that clarifies the format so you can adjust it to your needs. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From gotsring at live.com Sat Mar 6 00:05:50 2021 From: gotsring at live.com (gotsring) Date: Fri, 5 Mar 2021 18:05:50 -0600 (CST) Subject: How to dynamically link to tee after pipeline started using gst_parse_launch In-Reply-To: <1614842553759-0.post@n4.nabble.com> References: <1614684852011-0.post@n4.nabble.com> <1614784946477-0.post@n4.nabble.com> <1614842553759-0.post@n4.nabble.com> Message-ID: <1614989150160-0.post@n4.nabble.com> Tee and interleave both have request pads, meaning you can just request a new pad without necessarily pre-allocating them. 
So for example, if you find the interleave element, you can request a new sink pad using gst_element_get_request_pad(interleave, "sink_%u"). Wasn't quite sure what your intended use case is, but I altered the example code to create a pipeline that's not completely linked, and then grabs references to interleave and the volume elements to finish linking everything together before playing. dynamic_interleave_link.cpp I also noticed just now that interleave has a channel-positions property to (I think) rearrange which inputs map to which channel. Maybe this is useful to you? https://gstreamer.freedesktop.org/documentation/interleave/interleave.html?gi-language=c#interleave:channel-positions Hope this helps! -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From deepanshu.saxena30 at gmail.com Sat Mar 6 03:43:39 2021 From: deepanshu.saxena30 at gmail.com (deepanshu) Date: Fri, 5 Mar 2021 21:43:39 -0600 (CST) Subject: Gstreamer webrtcbin flow stuck at add-ice-candidate signal In-Reply-To: <1614796300906-0.post@n4.nabble.com> References: <1614278444335-0.post@n4.nabble.com> <249be6ba-39e5-ba86-db6e-a5b076e1f651@gmail.com> <1614416918797-0.post@n4.nabble.com> <4af33a22-8118-bc5b-a3ad-e8e65979d5d4@gmail.com> <1614600117505-0.post@n4.nabble.com> <1614695592597-0.post@n4.nabble.com> <1614796300906-0.post@n4.nabble.com> Message-ID: <1615002219220-0.post@n4.nabble.com> Can anybody suggest starting pointers on the problem I am facing? Also, is adding and pairing of TCP ICE candidates supported in GStreamer's webrtcbin? I am currently using the Ubuntu distribution version of GStreamer, which is 1.14.
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From ashdsh121 at gmail.com Sat Mar 6 02:24:01 2021 From: ashdsh121 at gmail.com (Maverick) Date: Fri, 5 Mar 2021 20:24:01 -0600 (CST) Subject: Media formats Message-ID: <1614997441305-0.post@n4.nabble.com> Hi, I was looking at the media formats described in - https://gstreamer.freedesktop.org/documentation/additional/design/mediatype-video-raw.html?gi-language=c#formats Some formats have these fields but not all of them. What's the reason behind that? default offset: 0 default rstride: RU4 (width) default size: rstride (component0) * RU2 (height) Also, some formats have offset and others have it as "default offset". Does that mean these can be configured while creating the buffer? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From volatileconst at gmail.com Sun Mar 7 01:03:36 2021 From: volatileconst at gmail.com (evaluat0r) Date: Sat, 6 Mar 2021 19:03:36 -0600 (CST) Subject: Coercing a branch of your pipeline to have updated running time Message-ID: <1615079016520-0.post@n4.nabble.com> Hi all, I'm noticing that a part of my pipeline that feeds into a mixer will fall behind the running time of the pipeline (with PTS values also affected), and then 'correct' itself after about N seconds pass. The part of my pipeline where this happens is an RTP source. The RTP source sends when it has data, and doesn't send anything when it doesn't. Where in GStreamer would this correction logic on PTS values be?
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From ezerbib at kramerav.com Sun Mar 7 10:22:27 2021 From: ezerbib at kramerav.com (Eric Zerbib) Date: Sun, 7 Mar 2021 10:22:27 +0000 Subject: Backporting rtmp2sink in gstreamer 1.14 Message-ID: Hello GStreamer devs, I want to use GStreamer for live RTMP streaming to Facebook. I tested rtmpsink in the past and it worked while Facebook still accepted plain (non-secure) RTMP; today Facebook no longer supports RTMP and requires RTMPS. Unfortunately, rtmpsink is based on an old version of librtmp which is no longer maintained, and some posts indicate that we need to switch to rtmp2sink, a new RTMP implementation rewritten from scratch. My hardware vendor (Xilinx) uses an old version of GStreamer, 1.14.4, which is optimized for the FPGA board we are using. I would like to know if it's possible to backport the rtmp2sink code to GStreamer 1.14, or is there a way to install two versions of GStreamer, 1.14 and 1.18, to combine the hardware acceleration available with GStreamer 1.14 and the new library from GStreamer 1.18? Thanks in advance EZ -------------- next part -------------- An HTML attachment was scrubbed... URL: From anville at hotmail.com Mon Mar 8 17:46:15 2021 From: anville at hotmail.com (anville) Date: Mon, 8 Mar 2021 11:46:15 -0600 (CST) Subject: Example of setting RTSP server session name / title Message-ID: <1615225575649-0.post@n4.nabble.com> I have a simple RTSP server app that just creates a server, a media factory, sets up a mount point, and starts the loop. When I connect via an RTSP client (mpv), the session name comes back as "Session streamed with Gstreamer". I'd like to change this, but it's not apparent how to.
I see that the string is hard-coded in rtsp-client.c in the library (https://github.com/GStreamer/gst-rtsp-server/blob/master/gst/rtsp-server/rtsp-client.c). To modify this in my app, I'm guessing I'll need to set up callbacks for some signals, but I can't find any examples of how I might do this. Any suggestions? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From yogle38 at yahoo.com Mon Mar 8 19:03:09 2021 From: yogle38 at yahoo.com (Joachim Gossmann) Date: Mon, 8 Mar 2021 19:03:09 +0000 (UTC) Subject: ios Tutorials 4+5 : "Error received from element source: Secure connection setup failed." References: <821783180.567225.1615230189342.ref@mail.yahoo.com> Message-ID: <821783180.567225.1615230189342@mail.yahoo.com> Hi - I am currently n00bing my way through the tutorials. I was able to build, link and run iOS Tutorials 1-3 successfully. However, the media-player Tutorials 4+5 build fine but do not run correctly. The Xcode console shows no errors; however, the player does not show the video, and instead it displays "Error received from element source: Secure connection setup failed." Unfortunately I seem out of luck trying to find a remedy. Do the certs have to be replaced? Is the app unable to find the certs? Skipping the secure-connection requirement by using a URL available over "http://" produces the same message/error. What am I missing? Cheers, j From joel.winarske at gmail.com Mon Mar 8 21:59:00 2021 From: joel.winarske at gmail.com (Joel Winarske) Date: Mon, 8 Mar 2021 13:59:00 -0800 Subject: render to egl texture Message-ID: Is https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter still the recommended pattern for rendering to an EGL texture? Thanks, Joel -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ystreet00 at gmail.com Tue Mar 9 00:32:10 2021 From: ystreet00 at gmail.com (Matthew Waters) Date: Tue, 9 Mar 2021 11:32:10 +1100 Subject: render to egl texture In-Reply-To: References: Message-ID: No. clutter has not been recommended for many years. gst-plugins-gl neither for many more. gst-plugins-gl has been migrated into gst-plugins-bad as can be seen from the latest commit on that repo: https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f ) and then promoted to gst-plugins-base and is available as the libgstgl-1.0 library. Cheers -Matt On 9/3/21 8:59 am, Joel Winarske wrote: > Is > https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter > > still the recommended pattern for rendering to an EGL texture? > > Thanks, > Joel > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 495 bytes Desc: OpenPGP digital signature URL: From joel.winarske at gmail.com Tue Mar 9 00:59:38 2021 From: joel.winarske at gmail.com (Joel Winarske) Date: Mon, 8 Mar 2021 16:59:38 -0800 Subject: render to egl texture In-Reply-To: References: Message-ID: Thank you for that. What is the current recommended pattern for rendering to a GL texture which gets consumed by a shared context? The shared context handles the rendering. Cheers, Joel On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters wrote: > No. > > clutter has not been recommended for many years. gst-plugins-gl neither > for many more.
gst-plugins-gl has been migrated into gst-plugins-bad as > can be seen from the latest commit on that repo: > https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f) > and then promoted to gst-plugins-base and is available as the libgstgl-1.0 > library. > > Cheers > -Matt > > On 9/3/21 8:59 am, Joel Winarske wrote: > > Is > https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter > still the recommended pattern for rendering to an EGL texture? > > Thanks, > Joel > > > _______________________________________________ > gstreamer-devel mailing listgstreamer-devel at lists.freedesktop.orghttps://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From joel.winarske at gmail.com Tue Mar 9 02:37:58 2021 From: joel.winarske at gmail.com (Joel Winarske) Date: Mon, 8 Mar 2021 18:37:58 -0800 Subject: render to egl texture In-Reply-To: References: Message-ID: I'm figuring a pipeline like this: uridecodebin uri=file:///usr/local/share/assets/video.mp4 ! video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! glimagesink To get the texture id I see a pattern in the cube example of attaching callback to "client-draw" of glimagesink, then mapping the video buffer which provides access to the texture id. Is this the only way to access the texture id? Thanks, Joel On Mon, Mar 8, 2021 at 4:59 PM Joel Winarske wrote: > Thank you for that. > > What is the current recommended pattern for rendering to a GL texture > which gets consumed by a shared context? The shared context handles the > rendering. > > Cheers, > Joel > > > On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters wrote: > >> No. >> >> clutter has not been recommended for many years. gst-plugins-gl neither >> for many more. 
gst-plugins-gl has been migrated into gst-plugins-bad as >> can be seen from the latest commit on that repo: >> https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f) >> and then promoted to gst-plugins-base and is available as the libgstgl-1.0 >> library. >> >> Cheers >> -Matt >> >> On 9/3/21 8:59 am, Joel Winarske wrote: >> >> Is >> https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter >> still the recommended pattern for rendering to an EGL texture? >> >> Thanks, >> Joel >> >> >> _______________________________________________ >> gstreamer-devel mailing listgstreamer-devel at lists.freedesktop.orghttps://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >> >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From ystreet00 at gmail.com Tue Mar 9 05:57:01 2021 From: ystreet00 at gmail.com (Matthew Waters) Date: Tue, 9 Mar 2021 16:57:01 +1100 Subject: render to egl texture In-Reply-To: References: Message-ID: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> That is one option if you're looking to use glimagesink's rendering.? If you're rendering the texture yourself, something like https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/tests/examples/gl/sdl/sdlshare.c is more appropriate. Cheers -Matt On 9/3/21 1:37 pm, Joel Winarske wrote: > I'm figuring a pipeline like this: > uridecodebin uri=file:///usr/local/share/assets/video.mp4 ! > video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! glimagesink > > To get the texture id I see a pattern in the cube example of attaching > callback to "client-draw" of glimagesink, then mapping the video > buffer which provides access to the texture id.? Is this the only way > to access the texture id? > > Thanks, > Joel > > > On Mon, Mar 8, 2021 at 4:59 PM Joel Winarske > wrote: > > Thank you for that. 
> > What is the current recommended pattern for rendering to a GL > texture which gets consumed by a shared context? The shared > context handles the rendering. > > Cheers, > Joel > > > On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters > wrote: > > No. > > clutter has not been recommended for many years. > gst-plugins-gl neither for many more.? gst-plugins-gl has been > migrated into gst-plugins-bad as can be seen from the latest > commit on that repo: > https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f > ) > and then promoted to gst-plugins-base and is available as the > libgstgl-1.0 library. > > Cheers > -Matt > > On 9/3/21 8:59 am, Joel Winarske wrote: >> Is >> https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter >> >> still the recommended pattern for rendering to an EGL texture? >> >> Thanks, >> Joel >> >> >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.freedesktop.org >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: OpenPGP_signature Type: application/pgp-signature Size: 495 bytes Desc: OpenPGP digital signature URL: From leedag8 at gmail.com Tue Mar 9 08:33:43 2021 From: leedag8 at gmail.com (JimmyHan) Date: Tue, 9 Mar 2021 02:33:43 -0600 (CST) Subject: Something may have went wrong in setting Up GStreamer environment for developing In-Reply-To: <83dbd4af-9ae0-38fd-2891-c3e4b06c47f7@centricular.com> References: <1614184959288-0.post@n4.nabble.com> <1e06c52de2c91a1aa2d079e84e61966d83551a4b.camel@ndufresne.ca> <1614198031041-0.post@n4.nabble.com> <1614238023379-0.post@n4.nabble.com> <83dbd4af-9ae0-38fd-2891-c3e4b06c47f7@centricular.com> Message-ID: <1615278823879-0.post@n4.nabble.com> So in practicing, I realized what the problem was. I ran all the tutorials to see which ones crashed with the segmentation fault. Then, using the tutorials that crashed, I looked for similarities in the code, and I found something. *All the tutorials utilizing this line of code crashed with segmentation faults:* /* Build the pipeline */ pipeline = gst_parse_launch("playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.*webm*", NULL); or /* Set the URI to play */ g_object_set (data.source, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.*webm*", NULL); It would crash with the segmentation fault. So I decided to go to the address "https://www.freedesktop.org/software/gstreamer-sdk/data/media/" to find any other formats. Fortunately I found the *.ogv* format. Replacing *.webm* with *.ogv* made everything work perfectly. The uneducated conclusion for now is that whenever the source is pulled from the internet and comes in the *.webm* format, it causes problems. Well, on my system (Pop!_OS), at least. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From samueldemouramoreira at gmail.com Tue Mar 9 12:06:57 2021 From: samueldemouramoreira at gmail.com (Samuel de Moura) Date: Tue, 9 Mar 2021 09:06:57 -0300 Subject: Pausing/resuming buffering of HTTP-delivered media? Message-ID: <2F559A6B-509F-4033-AD25-1E237B3C7BD2@gmail.com> Hello, I'm currently developing an application where I need to have control over the buffering of HTTP-delivered media. Specifically, I need to at least be able to pause/resume buffering even before preroll has finished. The pipeline is just a regular playbin with custom audio/video/text sinks. Initially I tried setting the connection-speed property on the playbin itself before discovering that it's not an actual speed limiter. After that, I tried iterating over the elements in the bin and, for each queue/queue2/multiqueue object, changing its size limit to the amount of data it was currently holding (and, since this needs to happen before preroll finishes, I also added an element-added callback to also set the properties on any new queue elements that may get added to the pipeline). That also didn't work: the bus keeps receiving buffering messages in a pattern like 0%-4%-0%-4%-0% for a few seconds, and then simply proceeds to buffer normally after a little while. My next idea was to go mess with souphttpsrc's implementation directly, but before doing that I came here to ask for help in case there's a simpler solution I'm missing. Thanks, Samuel From andrew at surion.io Tue Mar 9 06:39:01 2021 From: andrew at surion.io (andrew at surion.io) Date: Tue, 9 Mar 2021 00:39:01 -0600 (CST) Subject: splitmuxsink and timestamps In-Reply-To: <20180604063754.kpfx4yd5zgkvvqr7@rgvaio.localdomain> References: <20180531115231.skqe34kbka7bwkb3@rgvaio.localdomain> <7153D1686E3590498D57D986E0F83BA60C6665@adeerl01sms004> <20180604063754.kpfx4yd5zgkvvqr7@rgvaio.localdomain> Message-ID: <1615271941644-0.post@n4.nabble.com> I agree. The ability to start each segment from 00:00 would be really great. 
In the interim, should we use a different element, or offset the timestamps?

Regards
Andrew

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From alex at kordecki.de Tue Mar 9 12:17:01 2021 From: alex at kordecki.de (klex0) Date: Tue, 9 Mar 2021 06:17:01 -0600 (CST) Subject: Ending a pipeline with EOS sometimes hangs Message-ID: <1615292221062-0.post@n4.nabble.com>

Hi, I have a pipeline for interleaving two RTP audio streams and writing them into a file. I have around 100-200 of these pipelines running in parallel and around 30,000 of them running per day. To end a pipeline, I send an EOS event and wait with "bus.timed_pop_filtered(Gst.SECOND, Gst.MessageType.EOS | Gst.MessageType.ERROR)" for the EOS message, to be sure everything is written. Every few minutes a "timed_pop_filtered" call does not return, not even with the timeout. I found this happens reproducibly when only one RTP stream has sent packets but not the other. I have fixed that case by supervising the SEGMENT events on each RTP stream. Nevertheless, I still have this problem of threads hanging in timed_pop_filtered and currently have no idea about the cause. Is there a way to get "timed_pop_filtered" to time out? Or do any of you have an idea what can induce this problem?

Regards, alex

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From joel.winarske at gmail.com Tue Mar 9 18:34:58 2021 From: joel.winarske at gmail.com (Joel Winarske) Date: Tue, 9 Mar 2021 10:34:58 -0800 Subject: render to egl texture In-Reply-To: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> References: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> Message-ID:

In my use case (video player) I just need to initialize the pipeline and return a texture id. Is there a way to determine the texture id without loading a frame? Is the texture id constant over the lifecycle of the pipeline?

On Mon, Mar 8, 2021 at 9:57 PM Matthew Waters wrote: > That is one option if you're looking to use glimagesink's rendering.
If > you're rendering the texture yourself, something like > https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/tests/examples/gl/sdl/sdlshare.c > is more appropriate. > > Cheers > -Matt > > On 9/3/21 1:37 pm, Joel Winarske wrote: > > I'm figuring a pipeline like this: > uridecodebin uri=file:///usr/local/share/assets/video.mp4 ! > video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! glimagesink > > To get the texture id I see a pattern in the cube example of attaching > callback to "client-draw" of glimagesink, then mapping the video buffer > which provides access to the texture id. Is this the only way to access > the texture id? > > Thanks, > Joel > > > On Mon, Mar 8, 2021 at 4:59 PM Joel Winarske > wrote: > >> Thank you for that. >> >> What is the current recommended pattern for rendering to a GL texture >> which gets consumed by a shared context? The shared context handles the >> rendering. >> >> Cheers, >> Joel >> >> >> On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters >> wrote: >> >>> No. >>> >>> clutter has not been recommended for many years. gst-plugins-gl neither >>> for many more. gst-plugins-gl has been migrated into gst-plugins-bad as >>> can be seen from the latest commit on that repo: >>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f) >>> and then promoted to gst-plugins-base and is available as the libgstgl-1.0 >>> library. >>> >>> Cheers >>> -Matt >>> >>> On 9/3/21 8:59 am, Joel Winarske wrote: >>> >>> Is >>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter >>> still the recommended pattern for rendering to an EGL texture? 
>>> >>> Thanks, >>> Joel >>> >>> >>> _______________________________________________ >>> gstreamer-devel mailing listgstreamer-devel at lists.freedesktop.orghttps://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >>> >>> >>> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gotsring at live.com Tue Mar 9 22:32:41 2021 From: gotsring at live.com (gotsring) Date: Tue, 9 Mar 2021 16:32:41 -0600 (CST) Subject: Ending a pipeline with EOS sometimes hangs In-Reply-To: <1615292221062-0.post@n4.nabble.com> References: <1615292221062-0.post@n4.nabble.com> Message-ID: <1615329161084-0.post@n4.nabble.com> I believe I had encountered something similar when I tried closing a pipeline with an HTTP stream that never started. My solution (kind of a hack) was to set a timer that triggered the cleanup function after 5 seconds regardless of EOS. I did this using g_timeout_add_seconds(). Of course, this probably means not all elements are flushing correctly, so your output files might not be finalized correctly. Or you can try monitoring where the EOS is getting hung up by enabling debug output (level 4 or 5 maybe?) and from there further debug the problem. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From ystreet00 at gmail.com Wed Mar 10 00:42:36 2021 From: ystreet00 at gmail.com (Matthew Waters) Date: Wed, 10 Mar 2021 11:42:36 +1100 Subject: render to egl texture In-Reply-To: References: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> Message-ID: There is no way to know the texture ID without uploading the frame. The texture ID is almost never a constant value.? At least, there will probably be two textures that will be flipped between. At most, each texture id will be unique. Cheers -Matt On 10/3/21 5:34 am, Joel Winarske wrote: > In my use case (video player) I just need to initialize the pipeline > and return a texture id. > > Is there a way to determine the texture id without loading a frame? 
> > Is the texture id constant over the lifecycle of the pipeline? > > > On Mon, Mar 8, 2021 at 9:57 PM Matthew Waters > wrote: > > That is one option if you're looking to use glimagesink's > rendering.? If you're rendering the texture yourself, something > like > https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/tests/examples/gl/sdl/sdlshare.c > > is more appropriate. > > Cheers > -Matt > > On 9/3/21 1:37 pm, Joel Winarske wrote: >> I'm figuring a pipeline like this: >> uridecodebin uri=file:///usr/local/share/assets/video.mp4 ! >> video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! >> glimagesink >> >> To get the texture id I see a pattern in the cube example of >> attaching callback to "client-draw" of glimagesink, then mapping >> the video buffer which provides access to the texture id.? Is >> this the only way to access the texture id? >> >> Thanks, >> Joel >> >> >> On Mon, Mar 8, 2021 at 4:59 PM Joel Winarske >> > wrote: >> >> Thank you for that. >> >> What is the current recommended pattern for rendering to a GL >> texture which gets consumed by a shared context?? The shared >> context handles the rendering. >> >> Cheers, >> Joel >> >> >> On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters >> > wrote: >> >> No. >> >> clutter has not been recommended for many years.? >> gst-plugins-gl neither for many more. gst-plugins-gl has >> been migrated into gst-plugins-bad as can be seen from >> the latest commit on that repo: >> https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f >> ) >> and then promoted to gst-plugins-base and is available as >> the libgstgl-1.0 library. >> >> Cheers >> -Matt >> >> On 9/3/21 8:59 am, Joel Winarske wrote: >>> Is >>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter >>> >>> still the recommended pattern for rendering to an EGL >>> texture? 
>>> >>> Thanks, >>> Joel >>> >>> >>> _______________________________________________ >>> gstreamer-devel mailing list >>> gstreamer-devel at lists.freedesktop.org >>> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 495 bytes Desc: OpenPGP digital signature URL: From homanhuang at gmail.com Wed Mar 10 09:27:02 2021 From: homanhuang at gmail.com (HomanX) Date: Wed, 10 Mar 2021 03:27:02 -0600 (CST) Subject: How do I build GTK+GStreamer with basic-tutorial-5.c? Message-ID: <1615368422134-0.post@n4.nabble.com> I did Mingw64 with gcc, and pkg-config --cflags --libs gtk+-3.0 gstreamer-1.0 gstreamer-video-1.0. No luck, my exe file show dll files missing: libpng16 libgstreamer-1.0 libgstvideo-1.0 libgio-2.0 In VS2019, I cannot combine them with missing GST lib or missing GTK lib. I confused why I can't mix them. Looks like they are fighting on some space and cannot share the same project. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From ezerbib at kramerav.com Wed Mar 10 10:23:49 2021 From: ezerbib at kramerav.com (Eric Zerbib) Date: Wed, 10 Mar 2021 10:23:49 +0000 Subject: no element ksvideosrc Message-ID: C:\gstreamer\1.18.0\msvc_x86_64\bin>gst-inspect-1.0.exe ksvideosrc Factory Details: Rank primary (256) Long-name KsVideoSrc Klass Source/Video/Hardware Description Stream data from a video capture device through Windows kernel streaming Author Ole Andr? 
Vadla Ravn?s Haakon Sporsheim Andres Colubri Plugin Details: Name winks Description Windows kernel streaming plugin Filename C:\gstreamer\1.18.0\msvc_x86_64\lib\gstreamer-1.0\gstwinks.dll Version 1.18.3 License LGPL Source module gst-plugins-bad Source release date 2021-01-13 Binary package GStreamer Bad Plug-ins source release Origin URL Unknown package origin -------------- next part -------------- An HTML attachment was scrubbed... URL: From renjith.thankachan at matrixcomsec.com Wed Mar 10 10:49:28 2021 From: renjith.thankachan at matrixcomsec.com (Renjith Thankachan (Software Development - Telecom)) Date: Wed, 10 Mar 2021 16:19:28 +0530 Subject: RTSP Client sometimes doesn't send video (stream 0) packets to the client Message-ID: Hello, We are using gstreamer rtsp server in our streaming server. I would like to discuss & find a possible reason for one of the behaviors which we have observed while streaming our videos. - Our Streaming Server is deployed in a Windows 10 PC - We are using a single shared rtspmediafactory to stream our videos. - a video streamed via the gstreamer rtsp server was displayed in different vlc clients in different machines (around 20-30 clients) - we observed that in some clients the video was not displayed. *Our analysis* - We verified from wireshark(client end) that video packets were not received. 
audio was received - We verified from wireshark(server end) that audio packets were sent to client, but video packets were not sent - we observed that initially the video packets were sent/received for around 1 second but gstreamer stopped sending it (before stopping an ACK was received from the client with PSH bit on) - On enabling the gst debugs a warning was printed "*already a queued data message for channel 0*" - from do_send_data_list (rtsp-client.c) - though the stream in some clients were not displayed, in all other clients it was displayed properly *Our Troubleshooting techniques * - we replaced the client with ODM ( same behavior was observed ) - we replaced the client PC ( same behavior was observed ) - we shifted to a private network ( still same behavior was observed in some clients) *Configurations used* Camera Resolution :- 2048 X 1536 @ 30fps H.264 Codec TCP stream was rendered to the clients 1000 Mbps switch was used All client PC were windows 10 PC We are stuck with this analysis and trying to find the exact reason for the video disconnection. Please provide your valuable inputs for the above mentioned issue faced by us Regards Renjith Thankachan -------------- next part -------------- An HTML attachment was scrubbed... URL: From jacklawi at gmail.com Wed Mar 10 12:52:05 2021 From: jacklawi at gmail.com (omer.tal) Date: Wed, 10 Mar 2021 06:52:05 -0600 (CST) Subject: Recommended parameters for RTSP over bad network Message-ID: <1615380725696-0.post@n4.nabble.com> Hey guys... I'm trying to stream H264 with RTSP over a bad network. Are there any recommended parameters which are better to avoid artifacts and frame loss in the receiver's end? Also, I'm trying to keep the latency as low as possible. 
Thanks

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From jshanab at jfs-tech.com Wed Mar 10 12:57:27 2021 From: jshanab at jfs-tech.com (Jeff Shanab) Date: Wed, 10 Mar 2021 07:57:27 -0500 Subject: Recommended parameters for RTSP over bad network In-Reply-To: <1615380725696-0.post@n4.nabble.com> References: <1615380725696-0.post@n4.nabble.com> Message-ID:

H.264, especially without incremental periodic frame refresh, is probably not the best choice. You may need to consider transcoding to JPEG 2000, which is designed for variable bandwidth and is keyframeless. Because it uses wavelets instead of DCT, failure shows up more as a loss of resolution than as block artifacts.

On Wed, Mar 10, 2021, 07:52 omer.tal wrote:
> Hey guys...
> I'm trying to stream H264 with RTSP over a bad network. Are there any
> recommended parameters which are better to avoid artifacts and frame loss in
> the receiver's end?
> Also, I'm trying to keep the latency as low as possible.
>
> Thanks
>
> -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

-------------- next part -------------- An HTML attachment was scrubbed... URL: From marc.leeman at gmail.com Wed Mar 10 14:00:15 2021 From: marc.leeman at gmail.com (Marc Leeman) Date: Wed, 10 Mar 2021 15:00:15 +0100 Subject: RTSP Client sometimes doesn't send video (stream 0) packets to the client In-Reply-To: References: Message-ID:

Investigate the RTSP session management exchange first; maybe that will contain valuable information. E.g.:
1. did the client request the video
2. what transport mechanism is used (I'm guessing RTP over TCP)
3. do you get the same behaviour when using UDP, and what about UDP/multicast ...
On Wed, 10 Mar 2021 at 13:15, Renjith Thankachan (Software Development - Telecom) wrote: > > Hello, > > We are using gstreamer rtsp server in our streaming server. > > I would like to discuss & find a possible reason for one of the behaviors which we have observed while streaming our videos. > > - Our Streaming Server is deployed in a Windows 10 PC > - We are using a single shared rtspmediafactory to stream our videos. > - a video streamed via the gstreamer rtsp server was displayed in different vlc clients in different machines (around 20-30 clients) > - we observed that in some clients the video was not displayed. > > Our analysis > > - We verified from wireshark(client end) that video packets were not received. audio was received > - We verified from wireshark(server end) that audio packets were sent to client, but video packets were not sent > - we observed that initially the video packets were sent/received for around 1 second but gstreamer stopped sending it (before stopping an ACK was received from the client with PSH bit on) > - On enabling the gst debugs a warning was printed "already a queued data message for channel 0" - from do_send_data_list (rtsp-client.c) > - though the stream in some clients were not displayed, in all other clients it was displayed properly > > > Our Troubleshooting techniques > > - we replaced the client with ODM ( same behavior was observed ) > - we replaced the client PC ( same behavior was observed ) > - we shifted to a private network ( still same behavior was observed in some clients) > > > Configurations used > Camera Resolution :- 2048 X 1536 @ 30fps H.264 Codec > TCP stream was rendered to the clients > 1000 Mbps switch was used > All client PC were windows 10 PC > > > We are stuck with this analysis and trying to find the exact reason for the video disconnection. 
> > Please provide your valuable inputs for the above mentioned issue faced by us > > Regards > Renjith Thankachan > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -- g. Marc From gotsring at live.com Wed Mar 10 15:41:51 2021 From: gotsring at live.com (gotsring) Date: Wed, 10 Mar 2021 09:41:51 -0600 (CST) Subject: How do I build GTK+GStreamer with basic-tutorial-5.c? In-Reply-To: <1615368422134-0.post@n4.nabble.com> References: <1615368422134-0.post@n4.nabble.com> Message-ID: <1615390911852-0.post@n4.nabble.com> What's your setup? Maybe check your path environment var? If you're using MSYS2, I had to manually install pkg-config using pacman, but it built and ran afterwards. The build command shown in the tutorial resulted in gcc claiming that there were missing libraries, but the real issue was that pkg-config wasn't even installed If you are using MSVC builds, I don't think that GTK is even built with them. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From jason.a.willis99 at gmail.com Wed Mar 10 15:30:53 2021 From: jason.a.willis99 at gmail.com (Xsabin) Date: Wed, 10 Mar 2021 09:30:53 -0600 (CST) Subject: Monitoring Audio/Source devices on Windows In-Reply-To: References: Message-ID: <1615390253255-0.post@n4.nabble.com> I think this is an issue with Windows. I cannot get it to work with my usb head set or local audio card. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From jacklawi at gmail.com Wed Mar 10 16:25:21 2021 From: jacklawi at gmail.com (omer.tal) Date: Wed, 10 Mar 2021 10:25:21 -0600 (CST) Subject: Recommended parameters for RTSP over bad network In-Reply-To: References: <1615380725696-0.post@n4.nabble.com> Message-ID: <1615393521573-0.post@n4.nabble.com> Hey there. Unfortunately I'm using a limited bandwidth and must encode to video, otherwise my bandwidth won't be sufficient. 
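(Whether a limited link is sufficient can be sanity-checked before picking an encoder bitrate: every RTP packet also pays for RTP/UDP/IP headers, so only part of the link carries encoded video. A rough stdlib-only estimate; the 1400-byte payload and 40 bytes of headers below are assumptions for illustration, not measurements:)

```python
# Rough budget: how much of a constrained link an RTP/H.264 stream can
# actually use for video once per-packet headers are paid for.
# Assumed numbers: 1400-byte RTP payloads (a typical payloader MTU),
# 12 B RTP + 8 B UDP + 20 B IPv4 headers. Real overhead varies.

def usable_video_bps(link_bps, payload=1400, headers=12 + 8 + 20):
    """Portion of the link bitrate left over for encoded video."""
    return int(link_bps * payload / (payload + headers))

# Encoder target bitrate should stay below this, with margin for
# retransmission/FEC on a lossy network:
print(usable_video_bps(2_000_000))
```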
I was just wondering how would it be best to set the parameters of the rtppayloader or the encoder so that it will best fit the network? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From michael.gruner at ridgerun.com Wed Mar 10 16:33:08 2021 From: michael.gruner at ridgerun.com (Michael Gruner) Date: Wed, 10 Mar 2021 10:33:08 -0600 Subject: Recommended parameters for RTSP over bad network In-Reply-To: <1615393521573-0.post@n4.nabble.com> References: <1615380725696-0.post@n4.nabble.com> <1615393521573-0.post@n4.nabble.com> Message-ID: <883F99BE-11BA-444A-BAA9-15234ED7F6C9@ridgerun.com> NVIDIA has a summary on how to tune their encoders for different scenarios. One of them is low latency. You may extrapolate this configuration to your encoder: https://docs.nvidia.com/video-technologies/video-codec-sdk/nvenc-video-encoder-api-prog-guide/#recommended-nvenc-settings > On 10 Mar 2021, at 10:25, omer.tal wrote: > > Hey there. > > Unfortunately I'm using a limited bandwidth and must encode to video, > otherwise my bandwidth won't be sufficient. > I was just wondering how would it be best to set the parameters of the > rtppayloader or the encoder so that it will best fit the network? > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From toby at rosecott.net Wed Mar 10 17:35:17 2021 From: toby at rosecott.net (Toby Widdows) Date: Wed, 10 Mar 2021 17:35:17 +0000 Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: <1614979122804-0.post@n4.nabble.com> References: <1614963243590-0.post@n4.nabble.com> <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@spark> <1614965528428-0.post@n4.nabble.com> <1614979122804-0.post@n4.nabble.com> Message-ID:

Hi, Sorry I have not replied to this; real life got in the way!! Thanks for this, it made sense and has greatly increased my ability to understand the command. I have tested this with a single video and multiple audio tracks, and it works like a charm, apart from when there is a TrueHD audio track. Then I get this:

WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux: failed delayed linking pad audio_0 of GstMatroskaDemux named demux to some pad of GstQueue named queue2

The command I am giving is this:

gst-launch-1.0 filesrc location="/external/test/pickup/2.mkv" ! matroskademux name=demux \
demux.video_0 ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=3500000 ! h265parse ! queue ! mux.video_0 \
demux.audio_0 ! queue ! mux.audio_0 \
matroskamux name=mux ! progressreport update-freq=1 ! filesink location="/external/test/pickup/2new.mkv"

gst-discoverer gives me this for audio:

    audio: AC-3 (ATSC A/52)
    audio: AC-3 (ATSC A/52)
    audio: DTS
    audio: E-AC-3 (ATSC A/52B)
    audio: AC-3 (ATSC A/52)
    audio: AC-3 (ATSC A/52)
    audio: DTS
    audio: Dolby TrueHD

If I use any index other than 0 it works like a dream. As there are duplicates of all codecs apart from Dolby TrueHD, I am assuming that is the problem. Also, if I use the same mechanism for subtitles it just does not work; regardless of the format of the subtitles, I get:
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux: failed delayed linking pad subtitle_0 of GstMatroskaDemux named demux to some pad of GstQueue named queue2

I generated that message with this command:

gst-launch-1.0 filesrc location="/external/test/pickup/2.mkv" ! matroskademux name=demux \
demux.video_0 ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=3500000 ! h265parse ! queue ! mux.video_0 \
demux.subtitle_0 ! queue ! mux.subtitle_0 \
matroskamux name=mux ! progressreport update-freq=1 ! filesink location="/external/test/pickup/2new.mkv"

I get that regardless of whether the subtitle is "Timed Text" or "PGS subtitles". I'm presuming the errors I am getting are because matroskamux does not have pads for the codec being used, even though it says it has them for text subtitles. The help provided has been great, and my understanding is growing. But I think I still have a long way to go in understanding gstreamer.

regards Toby

------ Original Message ------ From: "gotsring" To: gstreamer-devel at lists.freedesktop.org Sent: 05/03/2021 21:18:42 Subject: Re: Re[2]: Questions about transcoding video with audio and subtitle passthrough using MKV files

I sure hope I'm not explaining this wrong, but here goes: It looks like they're manually directing *types* of streams between the demuxer and muxer instead of letting it try to decide on its own.

Broken down, you have 3 sections of the pipeline:
1. The demuxer, which reads an existing file and splits out the types of streams (like audio, video, subtitles)
2. The transcode/passthrough, where each individual stream is transcoded (video) or left alone (audio).
3. The muxer, which re-combines the new streams into an output file

So looking at the pipeline, you can break that into 3 sections:

Demuxer section: filesrc location=2.mkv !
matroskademux name=demux

Transcode/Passthrough section:
demux.video_0 ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! mux.video_0
demux.audio_0 ! queue ! mux.audio_0

Muxer Section:
matroskamux name=mux ! progressreport update-freq=1 ! filesink location=2222.mkv

You'll notice that the transcode/passthrough section is much larger because they are routing individual streams instead of the one file. You'll also notice that they are using named elements to take things from the demuxer and put them into the muxer (conveniently named demux and mux).

So the video is transcoded here:
demux.video_0 ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! nvv4l2h265enc bitrate=20000000 ! h265parse ! queue ! mux.video_0

And the audio stream is left as is and routed from the audio_0 pad of the demuxer to the audio_0 pad of the muxer:
demux.audio_0 ! queue ! mux.audio_0

If there were another audio track, you could also pass that forward by incrementing the pad count:
demux.audio_1 ! queue ! mux.audio_1

Hopefully that clarifies the format so you can adjust it to your needs.
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From gotsring at live.com Wed Mar 10 18:11:16 2021 From: gotsring at live.com (gotsring) Date: Wed, 10 Mar 2021 12:11:16 -0600 (CST) Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: References: <1614963243590-0.post@n4.nabble.com> <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@Spark> <1614965528428-0.post@n4.nabble.com> <1614979122804-0.post@n4.nabble.com> Message-ID: <1615399876952-0.post@n4.nabble.com>

I also couldn't get subtitles to work using gst-launch, but I wasn't sure if that's because I was generating test files wrong or because I'm using Windows. It might be the case that you have to do more complex linking using C/Python code rather than just a gst-launch command, but more likely is that I don't know what I'm doing.

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From toby at rosecott.net Wed Mar 10 18:47:54 2021 From: toby at rosecott.net (toby at rosecott.net) Date: Wed, 10 Mar 2021 18:47:54 +0000 Subject: Questions about transcoding video with audio and subtitle passthrough using MKV files In-Reply-To: <1615399876952-0.post@n4.nabble.com> References: <1614963243590-0.post@n4.nabble.com> <90a2f1f5-40f3-4a87-84bf-50728d6bbb58@Spark> <1614965528428-0.post@n4.nabble.com> <1614979122804-0.post@n4.nabble.com> <1615399876952-0.post@n4.nabble.com> Message-ID: <700a032b-08c1-496a-adf0-5fc275e1f768@Spark>

Haha. You have way more idea about it than I do!! I'm not going to learn C or Python for this. I can do it the long way round by dumping the TrueHD and subtitles to files and then remuxing them with ffmpeg or mkvmerge. I can handle that with scripting.
It would be nice to know why the TrueHD and subs don't work, though.

Regards Toby

On 10 Mar 2021, 18:11 +0000, gotsring , wrote:
> I also couldn't get subtitles to work using gst-launch, but I wasn't sure if
> that's because I was generating test files wrong or because I'm using
> Windows.
>
> It might be the case that you have to do more complex linking using c/python
> code rather than just a gst-launch command, but more likely is that I don't
> know what I'm doing.
>
> -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

-------------- next part -------------- An HTML attachment was scrubbed... URL: From deepthimj at rangsons-ds.com Thu Mar 11 06:50:06 2021 From: deepthimj at rangsons-ds.com (deepthimj) Date: Thu, 11 Mar 2021 12:20:06 +0530 Subject: Requesting for a Solution Message-ID: <7b0f3ed4-f051-9a4c-d49c-943c36d98e87@rangsons-ds.com>

Hello, I am a newbie to GStreamer and Qt, and I got stuck while doing some coding; I hope you can help solve my problem. I am using GStreamer 1.16.0. In my code I overlay the video on a QWidget. Now I am trying to overlay both text and a clock on the video through GStreamer, but I get either the text overlay or the clock overlay, not both. My code is below. Thank you in advance.

#include
#include
#include
#include
#include
#include
#include

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);
    QApplication app(argc, argv);
    app.connect(&app, SIGNAL(lastWindowClosed()), &app, SLOT(quit()));

    // prepare the pipeline
    QString uri = "udp://192.169.25.88:500";
    GstElement *pipeline = gst_pipeline_new("pipeline");
    GstElement *src = gst_element_factory_make("udpsrc", "source");
    GstElement *depay = gst_element_factory_make("rtph264depay", "depay");
    GstElement *parse = gst_element_factory_make("h264parse", "parse");
    GstElement *queue = gst_element_factory_make("queue", "queue");
    GstElement *clkoverlay = gst_element_factory_make("clockoverlay", "clockoverlay");
    GstElement *textOverlay = gst_element_factory_make("textoverlay", nullptr);
    GstElement *decode = gst_element_factory_make("avdec_h264", "decode");
    GstElement *sink = gst_element_factory_make("glimagesink", "sink");
    if (!textOverlay) {
        GST_WARNING("need textoverlay from gst-plugins-base");
    }
    g_object_set(textOverlay, "text", "GStreamer", "font-desc", "Sans,30", nullptr);

    // Creating and linking pipeline
    GstCaps *caps = gst_caps_new_simple("application/x-rtp", "encoding-name", G_TYPE_STRING, "H264", "payload", G_TYPE_INT, 26, nullptr);
    g_object_set(G_OBJECT(src), "uri", uri.toLatin1().data(), "caps", caps, nullptr);
    gst_bin_add_many(GST_BIN(pipeline), src, depay, parse, decode, clkoverlay, textOverlay, queue, sink, nullptr);

    // prepare the ui
    QWidget window;
    window.resize(320, 240);
    window.show();
    WId xwinid = window.winId();

    if (gst_element_link_many(src, depay, parse, decode, clkoverlay, textOverlay, queue, sink, nullptr) != true) {
        qDebug() << "Element could not be linked";
    } else {
        gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink), xwinid);
        // run the pipeline
        GstStateChangeReturn sret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
        if (sret == GST_STATE_CHANGE_FAILURE) {
            gst_element_set_state(pipeline, GST_STATE_NULL);
            gst_object_unref(pipeline);
            // Exit application
            QTimer::singleShot(0, QApplication::activeWindow(), SLOT(quit()));
        }
    }

    int ret = app.exec();

    // releasing the resources
    window.hide();
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return ret;
}
-------------- next part -------------- An HTML attachment was scrubbed... URL: From 330271189 at qq.com Thu Mar 11 04:10:46 2021 From: 330271189 at qq.com (strong) Date: Thu, 11 Mar 2021 12:10:46 +0800 Subject: how to use avdec_g726 to decode g726 raw data Message-ID:

Hi all, I want to use gstreamer/libav to encode/decode with G.726. The command lines I used are:

encode:
gst-launch-1.0 alsasrc device=hw:0 ! audio/x-raw, rate=8000, channels=1, format=S16LE, layout=interleaved ! avenc_g726 code_size=5 ! filesink location=test.g726

It works well; I can use ffplay to decode test.g726.

decode:
gst-launch-1.0 filesrc location=test.g726 blocksize=100 ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726 ! avdec_g726 ! filesink location=test.pcm

But here the following issue is found:

Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
../subprojects/gstreamer/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0: streaming stopped, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

It seems that the filesrc plugin cannot parse G.726 raw data. Can anybody give me a suggestion to fix the issue? Thanks

-------------- next part -------------- An HTML attachment was scrubbed... URL: From renjith.thankachan at matrixcomsec.com Thu Mar 11 13:10:28 2021 From: renjith.thankachan at matrixcomsec.com (renjith.t) Date: Thu, 11 Mar 2021 07:10:28 -0600 (CST) Subject: RTSP Client sometimes doesn't send video (stream 0) packets to the client In-Reply-To: References: Message-ID: <1615468228154-0.post@n4.nabble.com>

Hi, I have actually gone through the session media files.
What I found is that -> gstreamer rtsp server is closing the transport for the stream 0 -> it is done from "update_transport" method in rtsp-stream.c -> info debug printed is "removing TCP 192.168.111.78" where 111.78 was the client where video was not displayed so it is clear that gstreamer is explicitly closing the transport. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From keith.thornton at zeiss.com Thu Mar 11 14:22:42 2021 From: keith.thornton at zeiss.com (Thornton, Keith) Date: Thu, 11 Mar 2021 14:22:42 +0000 Subject: AW: RTSP Client sometimes doesn't send video (stream 0) packets to the client In-Reply-To: <1615468228154-0.post@n4.nabble.com> References: <1615468228154-0.post@n4.nabble.com> Message-ID: Hi, that may be because the server is not receiving the keep alive timer from the client. If you have a wireshark log look to see if the client sends a GET_PARAMETER once a minute (If you haven't changed the keep alive timeout). Gruesse -----Urspr?ngliche Nachricht----- Von: gstreamer-devel Im Auftrag von renjith.t Gesendet: Donnerstag, 11. M?rz 2021 14:10 An: gstreamer-devel at lists.freedesktop.org Betreff: Re: RTSP Client sometimes doesn't send video (stream 0) packets to the client Hi, I have actually gone through the session media files. What I found is that -> gstreamer rtsp server is closing the transport for the stream 0 it is -> done from "update_transport" method in rtsp-stream.c info debug -> printed is "removing TCP 192.168.111.78" where 111.78 was the client where video was not displayed so it is clear that gstreamer is explicitly closing the transport. 
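(For reference, the keep-alive expiry that the replies below suspect boils down to per-session "last seen" bookkeeping on the server: a transport is dropped when no keep-alive arrives within the session timeout. A plain-Python sketch of that logic only, assuming the usual 60-second default; this is an illustration, not the actual gst-rtsp-server code:)

```python
# Keep-alive expiry bookkeeping, sketched in plain Python (not the real
# gst-rtsp-server implementation): a session is considered dead when
# nothing that counts as a keep-alive (GET_PARAMETER, OPTIONS, RTCP...)
# has arrived within the timeout. 60 s default assumed.

class SessionTable:
    def __init__(self, timeout=60.0):
        self.timeout = timeout
        self.last_seen = {}

    def keepalive(self, session_id, now):
        """Record any request that refreshes the session."""
        self.last_seen[session_id] = now

    def expired(self, now):
        """Sessions whose last keep-alive is older than the timeout."""
        return [s for s, t in self.last_seen.items() if now - t > self.timeout]

table = SessionTable()
table.keepalive("client-78", now=0.0)
print(table.expired(now=61.0))  # ['client-78'] once the minute has lapsed
```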
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From jshanab at jfs-tech.com Thu Mar 11 14:42:15 2021 From: jshanab at jfs-tech.com (Jeff Shanab) Date: Thu, 11 Mar 2021 09:42:15 -0500 Subject: RTSP Client sometimes doesn't send video (stream 0) packets to the client In-Reply-To: References: <1615468228154-0.post@n4.nabble.com> Message-ID: I deal with a large number of security cameras of different brands. Just an FYI... 60 seconds and GET_PARAMETER are the default and work 90+% of the time. The SETUP reply will tell you if it is not the default, but it is there for the default too, depending on brand: the end of the Session header, "timeout=60". I have seen cameras that 1) cannot handle GET_PARAMETER in any transport (RED Vision) 2) must have OPTIONS or SET_PARAMETER instead 3) use rtcp, and the rest are ignored. (I thought the timeout=nn told me which, but it is inconsistent.) The behaviour seems to vary with transport: 1) rtsp-over-http (2 sockets): All bets are off; every manufacturer interprets the vague combination of the Apple QuickTime spec differently.
2) rtsp-over-tcp (1 socket): Usually GET_PARAMETER is universally accepted here and is all that is needed. 3) rtsp/rtp/udp (up to 7 sockets): GET_PARAMETER for the main session, but some must have rtcp receiver reports per session or they disconnect. Patterns I see: if it disconnects in seconds, it is an rtcp issue; if it is right on the 30-second, 1-minute or 2-minute mark, then it is GET_PARAMETER/OPTIONS/SET_PARAMETER (OPTIONS on very old cameras; otherwise OPTIONS tells you if it supports GET_PARAMETER). Wireshark is your friend. On Thu, Mar 11, 2021 at 9:22 AM Thornton, Keith wrote: > Hi, > that may be because the server is not receiving the keep-alive from > the client. If you have a Wireshark log, look to see whether the client sends a > GET_PARAMETER once a minute (if you haven't changed the keep-alive timeout). > Regards > > -----Original Message----- > From: gstreamer-devel On > Behalf Of renjith.t > Sent: Thursday, 11 March 2021 14:10 > To: gstreamer-devel at lists.freedesktop.org > Subject: Re: RTSP Client sometimes doesn't send video (stream 0) packets > to the client > > Hi, > > I have actually gone through the session media files. > > What I found is that > -> the gstreamer rtsp server is closing the transport for stream 0 > -> it is done from the "update_transport" method in rtsp-stream.c > -> the info debug printed is "removing TCP 192.168.111.78", where 111.78 was the > client where video was not displayed > > so it is clear that gstreamer is explicitly closing the transport.
> > > > > -- > Sent from: > http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From nicolas at ndufresne.ca Thu Mar 11 16:21:11 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Thu, 11 Mar 2021 11:21:11 -0500 Subject: how to use avdec_g726 to decode g726 raw data In-Reply-To: References: Message-ID: <8e7628b78b6d98dc7943e9e6eaa11e5fd67f1f48.camel@ndufresne.ca> On Thursday, 11 March 2021 at 12:10 +0800, strong wrote: > hi all, > I want to use gstreamer/libav to encode/decode with g726, the command line > I used is > encode: > gst-launch-1.0 alsasrc device=hw:0 ! audio/x-raw, rate=8000, channels=1, format=S16LE, layout=interleaved ! avenc_g726 code_size=5 ! filesink location=test.g726 > it can work well, I can use ffplay to decode test.g726 > > decode: > gst-launch-1.0 filesrc location=test.g726 blocksize=100 !
audio/x-adpcm, > bitrate=40000, rate=8000, channels=1, layout=g726 ! avdec_g726 ! filesink > location=test.pcm It took me a bit of research, but I found this: 0:00:00.343734425 725274 0x55c94b173000 DEBUG audiodecoder gstaudiodecoder.c:2396:gst_audio_decoder_sink_eventfunc: unsupported format; ignoring I didn't search further, but the decoder refuses byte segments and fails everything. You can hack/work around that with filesrc do-timestamp=TRUE: gst-launch-1.0 filesrc do-timestamp=TRUE location=test.g726 blocksize=100 ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726 ! avdec_g726 ! filesink location=test.pcm It would be nice to open an issue in gst-ffmpeg about that. Nicolas > > some issues are found > > Setting pipeline to PAUSED ... > Pipeline is PREROLLING ... > ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data > stream error. > Additional debug info: > ../subprojects/gstreamer/libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop > (): /GstPipeline:pipeline0/GstFileSrc:filesrc0: > streaming stopped, reason error (-5) > ERROR: pipeline doesn't want to preroll. > Setting pipeline to NULL ... > Freeing pipeline ... > > It seems that filesrc plugin can not parse g726 raw data, do anybody give me > some suggestion to fix the issue? > > Thanks > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From joel.winarske at gmail.com Fri Mar 12 00:02:58 2021 From: joel.winarske at gmail.com (Joel Winarske) Date: Thu, 11 Mar 2021 16:02:58 -0800 Subject: render to egl texture In-Reply-To: References: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> Message-ID: I'm taking a look at the sdlshare example, and porting it to Fedora 33 - wayland/egl. After pausing the pipeline I get an eglCreateContext EGL error of EGL_BAD_CONTEXT. The context being passed in is shared, without a surface.
Code: https://gist.github.com/jwinarske/a518d16f18a4e0345d91027984098ec9 Log: https://gist.github.com/jwinarske/2d19e39590415fb8331af2edbeb1b984 How do I avoid the dummy window altogether? My end goal is to simply update a texture on each frame. Something else renders the texture. On Tue, Mar 9, 2021 at 4:42 PM Matthew Waters wrote: > There is no way to know the texture ID without uploading the frame. > > The texture ID is almost never a constant value. At least, there will > probably be two textures that will be flipped between. At most, each > texture id will be unique. > > Cheers > -Matt > > On 10/3/21 5:34 am, Joel Winarske wrote: > > In my use case (video player) I just need to initialize the pipeline and > return a texture id. > > Is there a way to determine the texture id without loading a frame? > > Is the texture id constant over the lifecycle of the pipeline? > > > On Mon, Mar 8, 2021 at 9:57 PM Matthew Waters wrote: > >> That is one option if you're looking to use glimagesink's rendering. If >> you're rendering the texture yourself, something like >> https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/tests/examples/gl/sdl/sdlshare.c >> is more appropriate. >> >> Cheers >> -Matt >> >> On 9/3/21 1:37 pm, Joel Winarske wrote: >> >> I'm figuring a pipeline like this: >> uridecodebin uri=file:///usr/local/share/assets/video.mp4 ! >> video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! glimagesink >> >> To get the texture id I see a pattern in the cube example of attaching >> callback to "client-draw" of glimagesink, then mapping the video buffer >> which provides access to the texture id. Is this the only way to access >> the texture id? >> >> Thanks, >> Joel >> >> >> On Mon, Mar 8, 2021 at 4:59 PM Joel Winarske >> wrote: >> >>> Thank you for that. >>> >>> What is the current recommended pattern for rendering to a GL texture >>> which gets consumed by a shared context? The shared context handles the >>> rendering. 
>>> Cheers, >>> Joel >>> >>> On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters >>> wrote: >>> >>>> No. >>>> >>>> clutter has not been recommended for many years. gst-plugins-gl >>>> neither for many more. gst-plugins-gl has been migrated into >>>> gst-plugins-bad, as can be seen from the latest commit on that repo: >>>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f >>>> and then promoted to gst-plugins-base, and is available as the libgstgl-1.0 >>>> library. >>>> >>>> Cheers >>>> -Matt >>>> >>>> On 9/3/21 8:59 am, Joel Winarske wrote: >>>> >>>> Is >>>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter >>>> still the recommended pattern for rendering to an EGL texture? >>>> >>>> Thanks, >>>> Joel >>>> >>>> >>>> _______________________________________________ >>>> gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >>>> >>>> >>>> >> > From bing.song at nxp.com Fri Mar 12 08:03:44 2021 From: bing.song at nxp.com (Bing Song) Date: Fri, 12 Mar 2021 08:03:44 +0000 Subject: deep learning video analytics Message-ID: Hi, Are there any deep learning video analytics plugins in the upstream repo? There are many such projects, such as gstinference/nnstreamer/deepstream/gst-video-analytics. Which one do you think is most likely to be accepted by the community? Regards, Bing From matthew.crane at carrier.com Thu Mar 11 16:09:53 2021 From: matthew.crane at carrier.com (Crane, Matthew) Date: Thu, 11 Mar 2021 16:09:53 +0000 Subject: GstBuffer metadata across shmsink/shmsrc Message-ID: Hi, I'm currently working on a project where we are passing data between GStreamer pipelines in separate processes using shmsink/shmsrc.
Our current pipelines look something like the following: Process 1 pipeline: rtspsrc ! rtph264depay ! h264parse ! custom_ts_element ! gdppay ! shmsink Process 2 pipeline: shmsrc ! gdpdepay ! appsink The custom_ts_element attaches a reference timestamp metadata item to the buffers that pass through (based on UTC time). We need to be able to get the reference timestamp for the buffers in Process 2. In preliminary testing we have discovered that our buffer metadata is not making it across the shm interface to the pipeline in Process 2. Here is the buffer logging output of a test run with shmsrc ! gdpdepay ! fakesink that shows "meta: none": /GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (16756 bytes, dts: 0:01:10.277195667, pts: 0:01:07.872940835, duration: 0:00:00.100000000, offset: 11793417, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x5648824925a0 I examined some of the shmsink code and saw a spot where the element iterates through the GstMeta objects attached to the buffer and copies them in preparation for copying the buffer to shared memory, and expected the metadata to be available in the buffers from shmsrc. I also ran a test where I removed the gdppay / gdpdepay elements to see if that serialization was removing the metadata, but that didn't change anything. Is this a bug in shmsink/shmsrc? Or is it functioning as designed? -Matt From sebastian at centricular.com Fri Mar 12 10:38:10 2021 From: sebastian at centricular.com (Sebastian Dröge) Date: Fri, 12 Mar 2021 12:38:10 +0200 Subject: GstBuffer metadata across shmsink/shmsrc In-Reply-To: References: Message-ID: <3a7f432dd930dba1b3193f66d6718b761b8ed833.camel@centricular.com> On Thu, 2021-03-11 at 16:09 +0000, Crane, Matthew wrote: > > Is this a bug in shmsink/shmsrc? Or is it functioning as designed?
GstMeta unfortunately doesn't provide an API for serializing/deserializing it, and for many GstMeta that also wouldn't be possible (there are quite a few that contain pointers). A first step for making this all work in shmsrc/sink or gdp would be to add an optional interface on GstMeta for serialization/deserialization, and then implement that in the GstMetas you care about. -- Sebastian Dröge, Centricular Ltd - https://www.centricular.com From hassanmuhammad221 at gmail.com Fri Mar 12 12:05:44 2021 From: hassanmuhammad221 at gmail.com (Hassan Muhammad) Date: Fri, 12 Mar 2021 17:05:44 +0500 Subject: Question Regarding GStreamer Pipeline Graph Message-ID: Hi there, I have a question regarding generating a .dot file for my gstreamer pipeline. I am trying to build an application with the gst-rust bindings and webrtc and would like to visualize my dynamic pipeline. I've set the appropriate environment variable; however, the .dot files are only generated when using the "gst-launch" command from the terminal. I also looked into the documentation for: "gstreamer::functions::debug_bin_to_dot_file" and called the function with the following parameters: gst::debug_bin_to_dot_file(&pipeline, gst::DebugGraphDetails::ALL, "C:/tmp/out.dot") but after launching the application, no files are generated. Any help would be appreciated. Thanks, Hassan From sebastian at centricular.com Fri Mar 12 12:45:15 2021 From: sebastian at centricular.com (Sebastian Dröge) Date: Fri, 12 Mar 2021 14:45:15 +0200 Subject: Question Regarding GStreamer Pipeline Graph In-Reply-To: References: Message-ID: On Fri, 2021-03-12 at 17:05 +0500, Hassan Muhammad wrote: > Hi there, > > I have a question regarding generating a .dot file for my gstreamer > pipeline.
I am trying to build an application with the gst-rust bindings > and webrtc and would like to visualize my dynamic pipeline. I've set > the appropriate environment variable; however, the .dot files are only > generated when using the "gst-launch" command from the terminal. I > also looked into the documentation for: > > "gstreamer::functions::debug_bin_to_dot_file" > > and called the function with the following parameters: > > gst::debug_bin_to_dot_file(&pipeline, gst::DebugGraphDetails::ALL, > "C:/tmp/out.dot") > > but after launching the application, no files are generated. Please check the C documentation here: https://gstreamer.freedesktop.org/documentation/gstreamer/gstdebugutils.html?gi-language=c#GST_DEBUG_BIN_TO_DOT_FILE In short, you need to set an environment variable (GST_DEBUG_DUMP_DOT_DIR) to define the directory where the .dot files should be dumped, and then the filename you provide to the function call is really just a filename (prefix). You can use gst::debug_bin_to_dot_data() to get a string and handle the writing yourself if you need more control. That function is also not affected by the environment variable. -- Sebastian Dröge, Centricular Ltd - https://www.centricular.com From nicolas at ndufresne.ca Fri Mar 12 17:03:23 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Fri, 12 Mar 2021 12:03:23 -0500 Subject: deep learning video analytics In-Reply-To: References: Message-ID: <46a97eb9b9b1db894d5b7a2dd1edc681d3a4ccd8.camel@ndufresne.ca> On Friday, 12 March 2021 at 08:03 +0000, Bing Song wrote: > Hi, > > Are there any deep learning video analytics plugins in the upstream repo? There > are many such projects, such as > gstinference/nnstreamer/deepstream/gst-video-analytics. Which one do you think > is most likely to be accepted by the community?
It's on my todo list to review, but I think this is the most promising one today: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1997 > > Regards, > Bing > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From backob at hotmail.com Sat Mar 13 05:53:03 2021 From: backob at hotmail.com (Evan) Date: Sat, 13 Mar 2021 15:53:03 +1000 Subject: OpenOb Message-ID: Hi, On your list of GStreamer-based applications, I noticed the OpenOb audio streamer is missing; I understand this app is based on GStreamer. Can you throw any light on this? Is it being maintained, and if so, what version of GStreamer is it running under? Thanks From kochmmi1 at fel.cvut.cz Fri Mar 12 15:09:10 2021 From: kochmmi1 at fel.cvut.cz (kochmmi1) Date: Fri, 12 Mar 2021 09:09:10 -0600 (CST) Subject: Correct EOS when saving stream from USB camera Message-ID: <1615561750930-0.post@n4.nabble.com> I have a USB camera on an embedded device. I want to have the possibility to first save the video from the camera to a file, and later probably to display the video and/or stream it via RTSP by means of a C/C++ app. Now I have a working terminal command to save a given number of frames: ``` gst-launch-1.0 v4l2src device=/dev/video1 num-buffers=900 ! image/jpeg, width=1920, height=1080, io-mode=4 ! imxvpudec ! imxvpuenc_mjpeg ! avimux ! filesink location=/some/path.avi ```. I would like to be able to save an undefined length of video (until some user input stops it). I have tried several approaches. Of course I could simply use some syscall, but the problems with that are probably not necessary to list; I will not be taking that route.
Now the next simplest (yet probably problematic if I would want to do something else with the stream in the future) is: ``` void pipelineVideoStart(){ if (!gst_is_initialized()) { qWarning()<<"initializing GST"; setenv("GST_DEBUG", ("*:" + std::to_string(3)).c_str(), 1); gst_init(nullptr, nullptr); } GstElement *pipeline; GstBus *bus; GstMessage *msg; std::string command = "v4l2src device=/dev/video1 ! image/jpeg, width=1920, height=1080, io-mode=4 ! imxvpudec ! imxvpuenc_mjpeg ! avimux ! filesink location = /some/file.avi"; pipeline = gst_parse_launch (command.c_str(), NULL); /* Start playing */ gst_element_set_state (pipeline, GST_STATE_PLAYING); /* Wait until error or EOS */ bus = gst_element_get_bus (pipeline); msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GstMessageType( GST_MESSAGE_ERROR | GST_MESSAGE_EOS)); /* Free resources */ if (msg != NULL) gst_message_unref (msg); gst_object_unref (bus); gst_element_set_state (pipeline, GST_STATE_NULL); gst_object_unref (pipeline); return; } ``` Problems I find with this approach: 1) The video is created, however with wrong metadata (0 seconds length, even though it is playable). Because... it's never successfully finished, is it? How do I stop the recording? 2) Zero control.
I expect problems with any additional sink (appsink, rtsp streaming) In second approach I was trying to build the pipeline myself: ``` static GstElement *pipeline; void gstreamerUsbCamera::selfPipelineVideo(){ if (!gst_is_initialized()) { setenv("GST_DEBUG", ("*:" + std::to_string(3)).c_str(), 1); gst_init(nullptr, nullptr); } GstCaps *caps; GstStateChangeReturn ret; GstElement *source, *muxer, *sink; source = gst_element_factory_make ("v4l2src", "source"); g_object_set (source, "device", "/dev/video1", NULL); muxer = gst_element_factory_make ("avimux", "avimux"); sink = gst_element_factory_make ("filesink", "sink"); g_object_set (sink, "location", "/mnt/ssd/someTest.avi", NULL); pipeline = gst_pipeline_new ("pipeline_src"); if (!pipeline || !source || !muxer || !sink) { g_printerr ("Not all elements could be created.\n"); return; } caps = gst_caps_new_simple ("image/jpeg", "width", G_TYPE_INT, 1920, "height", G_TYPE_INT, 1080, "io-mode", G_TYPE_INT, 4, "framerate", GST_TYPE_FRACTION, 30, 1, "pixel-aspect-ratio", GST_TYPE_FRACTION, 1,1, "interlace-mode", G_TYPE_STRING, "progresive", NULL); GstPadTemplate *template1 = gst_element_class_get_pad_template(GST_ELEMENT_GET_CLASS(source), "src_%u"); GstPad *pad = gst_element_request_pad(src, template1, "pad", caps);//");// gst_pad_new("source", GST_PAD_SRC); gst_caps_unref (caps); gst_bin_add_many (GST_BIN (pipeline), source, muxer, sink, NULL); if (gst_element_link_many(source, muxer,sink, NULL) != TRUE) { g_printerr ("Elements could not be linked.\n"); gst_object_unref (pipeline); return; } ret = gst_element_set_state (pipeline, GST_STATE_PLAYING); if (ret == GST_STATE_CHANGE_FAILURE) { g_printerr ("Unable to set the pipeline to the playing state.\n"); gst_object_unref (pipeline); return; } // Start playing recording = true; return; } int endVideoPipeline(void) { GstMessage *message = gst_message_new_eos(nullptr); gst_bus_post(pipeline->bus, message); /* Free resources */ if (message != NULL) gst_message_unref 
(message); gst_element_set_state (pipeline, GST_STATE_PAUSED); gst_element_set_state (pipeline, GST_STATE_NULL); gst_object_unref (pipeline); return 1; gst_app_src_end_of_stream(GST_APP_SRC(mGstData.appsrc)); usleep(500000); // Important gst_element_set_state (mGstData.pipeline_src, GST_STATE_NULL); gst_object_unref (mGstData.pipeline_src); recording = false; return 0; } ``` Problems with this: 1) wrong metadata (wrong lenght in VLC) and wrong format (the caps negotiating fails, I get 3000x4000&10fps instead of HD at 30fps). I also tried (at some point) to use the code from https://gist.github.com/crearo/1dc01b93b2b513e0000f183144c61b20 with some tweaks (commented-out the displaying, since that is not what I want now, and changed startRecording function to ```void gstreamerUsbCamera::startRecordingS() { g_print("startRecording\n"); GstPad *sinkpad; GstPadTemplate *templ, *temp2; GstCaps *caps; templ = gst_element_class_get_pad_template(GST_ELEMENT_GET_CLASS(tee), "src_%u"); teepad = gst_element_request_pad(tee, templ, NULL, NULL); queue_record = gst_element_factory_make("queue", "queue_record"); encoder = gst_element_factory_make("imxvpuenc", NULL); muxer = gst_element_factory_make("avimux", NULL); filesink = gst_element_factory_make("filesink", NULL); g_object_set(filesink, "location", "output.avi", NULL); caps = gst_caps_new_simple ("image/jpeg", "width", G_TYPE_INT, 1920, "height", G_TYPE_INT, 1080, "io-mode", G_TYPE_INT, 4, NULL); temp2 = gst_element_class_get_pad_template(GST_ELEMENT_GET_CLASS(src), "src_%u"); GstPad *pad = gst_element_request_pad(src, templ, NULL, caps); gst_bin_add_many(GST_BIN(pipeline), queue_record, encoder, muxer, filesink, NULL); gst_element_link_many(queue_record, encoder, muxer, filesink, NULL); gst_element_sync_state_with_parent(queue_record); gst_element_sync_state_with_parent(encoder); gst_element_sync_state_with_parent(muxer); gst_element_sync_state_with_parent(filesink); sinkpad = gst_element_get_static_pad(queue_record, "sink"); 
gst_pad_link(teepad, sinkpad); gst_object_unref(sinkpad); isRecording = true; } ``` Here I get an empty (0 B) file at the output. Now I do think that I am aiming at some tweak of the last two mentioned approaches. Can anyone point me in the direction of my mistakes and how to fix them? Thank you. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From bing.song at nxp.com Sat Mar 13 13:58:45 2021 From: bing.song at nxp.com (Bing Song) Date: Sat, 13 Mar 2021 13:58:45 +0000 Subject: [EXT] Re: deep learning video analytics In-Reply-To: <46a97eb9b9b1db894d5b7a2dd1edc681d3a4ccd8.camel@ndufresne.ca> References: <46a97eb9b9b1db894d5b7a2dd1edc681d3a4ccd8.camel@ndufresne.ca> Message-ID: Is it possible to separate pre-process/inference/post-process into separate plugins? Then they can be processed in different threads, and the pre/post-processing can be accelerated by HW, as in nnstreamer and deepstream. The MR below is only for ONNX; is it possible to add an adapter layer for more inference engines? Regards, Bing From: gstreamer-devel On Behalf Of Nicolas Dufresne Sent: 13 March 2021 1:03 To: Discussion of the development of and with GStreamer Cc: Aaron Boxer Subject: [EXT] Re: deep learning video analytics Caution: EXT Email On Friday, 12 March 2021 at 08:03 +0000, Bing Song wrote: Hi, Are there any deep learning video analytics plugins in the upstream repo? There are many such projects, such as gstinference/nnstreamer/deepstream/gst-video-analytics. Which one do you think is most likely to be accepted by the community? It's on my todo list to review, but I think this is the most promising one today: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1997 Regards, Bing _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.freedesktop.org https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed...
URL: From gotsring at live.com Sat Mar 13 15:20:44 2021 From: gotsring at live.com (gotsring) Date: Sat, 13 Mar 2021 09:20:44 -0600 (CST) Subject: Correct EOS when saving stream from USB camera In-Reply-To: <1615561750930-0.post@n4.nabble.com> References: <1615561750930-0.post@n4.nabble.com> Message-ID: <1615648844784-0.post@n4.nabble.com> It doesn't look like you're stopping the pipeline correctly, so the data is never written to the file. In your first attempt, you are waiting for an EOS, but you never actually send one. In the next, you send an EOS, but then immediately stop the pipeline, so the EOS never goes anywhere. To stop a pipeline, you need to - Send an EOS event to the pipeline - Wait for it to travel through the pipeline - Set the pipeline state to READY or NULL - Proceed with anything else basic_eos.cpp Check out this example code. It creates the pipeline, runs it, and then schedules an EOS event to be sent. When the cb_message function receives the EOS signal, this means it has reached the end of the pipeline, so it is safe to stop the pipeline. It uses gst_parse_launch to create the pipeline, but you can switch to the manual method as needed. I used g_timeout_add_seconds() to schedule the EOS, but you can easily capture ctrl+c signals or other keyboard events to send an EOS. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From joel.winarske at gmail.com Sun Mar 14 21:13:05 2021 From: joel.winarske at gmail.com (Joel Winarske) Date: Sun, 14 Mar 2021 14:13:05 -0700 Subject: render to egl texture In-Reply-To: References: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> Message-ID: Using the below combo of variables seems to be invalid, for both Ubuntu 18.04 LTS (Wayland on Ubuntu) and Fedora 33 (Wayland is active by default). Canonical (Ubuntu) is stating they will default with Wayland (again) in an upcoming release. GST_GL_WINDOW=wayland GST_GL_PLATFORM=egl GST_GL_API=gles2 What gl test cases are expected to work via wayland/egl/gles2? 
Or any test case for that matter. Thanks, Joel On Thu, Mar 11, 2021 at 4:02 PM Joel Winarske wrote: > I'm taking a look at the sdlshare example, and porting to Fedora 33 - > wayland/egl. > > After pausing the pipeline I get an eglCreateContext EGL Error of > EGL_BAD_CONTEXT. The context being passed in is shared, without a surface. > > Code: https://gist.github.com/jwinarske/a518d16f18a4e0345d91027984098ec9 > Log: https://gist.github.com/jwinarske/2d19e39590415fb8331af2edbeb1b984 > > How do I avoid the dummy window altogether? > > My end goal is to simply update a texture on each frame. Something else > renders the texture. > > > > > On Tue, Mar 9, 2021 at 4:42 PM Matthew Waters wrote: > >> There is no way to know the texture ID without uploading the frame. >> >> The texture ID is almost never a constant value. At least, there will >> probably be two textures that will be flipped between. At most, each >> texture id will be unique. >> >> Cheers >> -Matt >> >> On 10/3/21 5:34 am, Joel Winarske wrote: >> >> In my use case (video player) I just need to initialize the pipeline and >> return a texture id. >> >> Is there a way to determine the texture id without loading a frame? >> >> Is the texture id constant over the lifecycle of the pipeline? >> >> >> On Mon, Mar 8, 2021 at 9:57 PM Matthew Waters >> wrote: >> >>> That is one option if you're looking to use glimagesink's rendering. If >>> you're rendering the texture yourself, something like >>> https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/tests/examples/gl/sdl/sdlshare.c >>> is more appropriate. >>> >>> Cheers >>> -Matt >>> >>> On 9/3/21 1:37 pm, Joel Winarske wrote: >>> >>> I'm figuring a pipeline like this: >>> uridecodebin uri=file:///usr/local/share/assets/video.mp4 ! >>> video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! 
glimagesink >>> >>> To get the texture id I see a pattern in the cube example of attaching >>> callback to "client-draw" of glimagesink, then mapping the video buffer >>> which provides access to the texture id. Is this the only way to access >>> the texture id? >>> >>> Thanks, >>> Joel >>> >>> >>> On Mon, Mar 8, 2021 at 4:59 PM Joel Winarske >>> wrote: >>> >>>> Thank you for that. >>>> >>>> What is the current recommended pattern for rendering to a GL texture >>>> which gets consumed by a shared context? The shared context handles the >>>> rendering. >>>> >>>> Cheers, >>>> Joel >>>> >>>> >>>> On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters >>>> wrote: >>>> >>>>> No. >>>>> >>>>> clutter has not been recommended for many years. gst-plugins-gl >>>>> neither for many more. gst-plugins-gl has been migrated into >>>>> gst-plugins-bad as can be seen from the latest commit on that repo: >>>>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f) >>>>> and then promoted to gst-plugins-base and is available as the libgstgl-1.0 >>>>> library. >>>>> >>>>> Cheers >>>>> -Matt >>>>> >>>>> On 9/3/21 8:59 am, Joel Winarske wrote: >>>>> >>>>> Is >>>>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter >>>>> still the recommended pattern for rendering to an EGL texture? >>>>> >>>>> Thanks, >>>>> Joel >>>>> >>>>> >>>>> _______________________________________________ >>>>> gstreamer-devel mailing listgstreamer-devel at lists.freedesktop.orghttps://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >>>>> >>>>> >>>>> >>> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicolas at ndufresne.ca Mon Mar 15 00:04:17 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Sun, 14 Mar 2021 20:04:17 -0400 Subject: [EXT] Re: deep learning video analytics In-Reply-To: References: <46a97eb9b9b1db894d5b7a2dd1edc681d3a4ccd8.camel@ndufresne.ca> Message-ID: Le sam. 
13 mars 2021 10 h 00, Bing Song a écrit : > Is it possible to separate pre-process/inference/post-process into separate > plugins? Then they can be processed in different threads, and the pre/post-processing > can be accelerated by HW, as in nnstreamer and deepstream. > > > > The MR below is only for ONNX; is it possible to add an adapter layer for more > inference engines? > As GStreamer is plugin based, and since there is very little that is shareable between frameworks, each framework's support should live in its own plugin, in my opinion. This keeps the adaptation layer thin and efficient. Some frameworks, like Nvidia's and some others, maintain their own plugins in their own repositories. And this situation is perfectly suitable, as without maintenance these plugins would have no value. > > Regards, > > Bing > > > > *From:* gstreamer-devel *On > Behalf Of *Nicolas Dufresne > *Sent:* 13 March 2021 1:03 > *To:* Discussion of the development of and with GStreamer < > gstreamer-devel at lists.freedesktop.org> > *Cc:* Aaron Boxer > *Subject:* [EXT] Re: deep learning video analytics > > > > *Caution: *EXT Email > > On Friday, 12 March 2021 at 08:03 +0000, Bing Song wrote: > > Hi, > > > > Are there any deep learning video analytics plugins in the upstream repo? There > are many such projects, such as > gstinference/nnstreamer/deepstream/gst-video-analytics. Which one do you think > is most likely to be accepted by the > community?
> > > > On my todo to review, but I think this is the most promising one today: > > > > > https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1997 > > > > > > > Regards, > > Bing > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.freedesktop.org > > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ystreet00 at gmail.com Mon Mar 15 09:40:38 2021 From: ystreet00 at gmail.com (Matthew Waters) Date: Mon, 15 Mar 2021 20:40:38 +1100 Subject: render to egl texture In-Reply-To: References: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> Message-ID: This combination is fine if libgstgl is built for those platforms and your log would indicate that this is the case. Almost all gl elements in gst-plugins-base support that combination.? glupload, gleffects, glimagesink, gltestsrc are non-exhaustive examples that support that just fine.? There are very few gl elements nowadays that struggle with the (gles2) requirement.? wayland+egl is fine and is supported by all OpenGL elements. Further comments inline. On 15/3/21 8:13 am, Joel Winarske wrote: > Using the below combo of variables seems to be invalid, for both > Ubuntu 18.04 LTS (Wayland on Ubuntu) and Fedora 33 (Wayland is active > by default).? Canonical (Ubuntu) is stating they will default with > Wayland (again) in an upcoming release. > > GST_GL_WINDOW=wayland > GST_GL_PLATFORM=egl > GST_GL_API=gles2 These variables will only have any effect when GStreamer is creating the necessary resources.? If you are passing in some shared resources (display or OpenGL context), GStreamer will use the values from there instead. 
> What gl test cases are expected to work via wayland/egl/gles2?? Or any > test case for that matter. The gtk examples support wayland, otherwise the gtkglsink or qmlglsink elements contain the necessary code for retrieving the wl_display from those respective toolkits and then passing the required GstGLDisplay to GStreamer. > Thanks, > Joel > > > On Thu, Mar 11, 2021 at 4:02 PM Joel Winarske > wrote: > > I'm taking a look at the sdlshare example, and porting to Fedora > 33 - wayland/egl. > > After pausing the pipeline I get an eglCreateContext EGL Error of > EGL_BAD_CONTEXT.? The context being passed in is shared, without a > surface. > > Code: > https://gist.github.com/jwinarske/a518d16f18a4e0345d91027984098ec9 > > Log: > https://gist.github.com/jwinarske/2d19e39590415fb8331af2edbeb1b984 > > > How do I avoid the dummy window altogether? > > My end goal is to simply update a texture on each frame. Something > else renders the texture. > For wayland, you need to share the wl_display between SDL and GStreamer.? Once you retrieve the wl_display from SDL, create the necessary GstGLDisplay with gst_gl_display_wayland_new_with_display(). Cheers -Matt > On Tue, Mar 9, 2021 at 4:42 PM Matthew Waters > wrote: > > There is no way to know the texture ID without uploading the > frame. > > The texture ID is almost never a constant value.? At least, > there will probably be two textures that will be flipped > between. At most, each texture id will be unique. > > Cheers > -Matt > > On 10/3/21 5:34 am, Joel Winarske wrote: >> In my use case (video player) I just need to initialize the >> pipeline and return a texture id. >> >> Is there a way to determine the texture id without loading a >> frame? >> >> Is the texture id constant over the lifecycle of the pipeline? >> >> >> On Mon, Mar 8, 2021 at 9:57 PM Matthew Waters >> > wrote: >> >> That is one option if you're looking to use glimagesink's >> rendering.? 
If you're rendering the texture yourself, >> something like >> https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/tests/examples/gl/sdl/sdlshare.c >> >> is more appropriate. >> >> Cheers >> -Matt >> >> On 9/3/21 1:37 pm, Joel Winarske wrote: >>> I'm figuring a pipeline like this: >>> uridecodebin >>> uri=file:///usr/local/share/assets/video.mp4 ! >>> video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D >>> ! glimagesink >>> >>> To get the texture id I see a pattern in the cube >>> example of attaching callback to "client-draw" of >>> glimagesink, then mapping the video buffer which >>> provides access to the texture id.? Is this the only way >>> to access the texture id? >>> >>> Thanks, >>> Joel >>> >>> >>> On Mon, Mar 8, 2021 at 4:59 PM Joel Winarske >>> >> > wrote: >>> >>> Thank you for that. >>> >>> What is the current recommended pattern for >>> rendering to a GL texture which gets consumed by a >>> shared context?? The shared context handles the >>> rendering. >>> >>> Cheers, >>> Joel >>> >>> >>> On Mon, Mar 8, 2021 at 4:32 PM Matthew Waters >>> > >>> wrote: >>> >>> No. >>> >>> clutter has not been recommended for many >>> years.? gst-plugins-gl neither for many more.? >>> gst-plugins-gl has been migrated into >>> gst-plugins-bad as can be seen from the latest >>> commit on that repo: >>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/commit/bedade404ec82432742a901c663f18dfaa24356f >>> ) >>> and then promoted to gst-plugins-base and is >>> available as the libgstgl-1.0 library. >>> >>> Cheers >>> -Matt >>> >>> On 9/3/21 8:59 am, Joel Winarske wrote: >>>> Is >>>> https://github.com/freedesktop/gstreamer-gst-plugins-gl/blob/master/tests/examples/clutter >>>> >>>> still the recommended pattern for rendering to >>>> an EGL texture? 
>>>> >>>> Thanks, >>>> Joel >>>> >>>> >>>> _______________________________________________ >>>> gstreamer-devel mailing list >>>> gstreamer-devel at lists.freedesktop.org >>>> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 495 bytes Desc: OpenPGP digital signature URL: From 330271189 at qq.com Mon Mar 15 07:23:04 2021 From: 330271189 at qq.com (strong) Date: Mon, 15 Mar 2021 15:23:04 +0800 Subject: about how to use audiofx to realize audio EQ/bass/treble Message-ID: hi all, I am studying how to use the gstreamer audiofx plugin to realize audio bass/treble/EQ, but it seems that the functionality of audiofx is very basic. Does anybody know how to realize audio effects and EQ with a gstreamer plugin? Thanks very much -------------- next part -------------- An HTML attachment was scrubbed... URL: From hassanmuhammad221 at gmail.com Mon Mar 15 10:04:18 2021 From: hassanmuhammad221 at gmail.com (Hassan Muhammad) Date: Mon, 15 Mar 2021 15:04:18 +0500 Subject: Question Regarding GStreamer Pipeline Graph In-Reply-To: References: Message-ID: Thank you for your prompt response. I was able to use the gst::debug_bin_to_dot_data() function as per your suggestion to get a string and write a .dot file, which I then converted to a graph. On Fri, Mar 12, 2021 at 5:45 PM Sebastian Dröge wrote: > On Fri, 2021-03-12 at 17:05 +0500, Hassan Muhammad wrote: > > Hi there, > > I have a question regarding generating .dot files for my gstreamer > pipeline. I am trying to build an application in gst-rust bindings with > webrtc and would like to visualize my dynamic pipeline. I've set the > appropriate environment variable; however, the .dot files are only generated > when using the "gst-launch" command from the terminal.
I also looked into > the documentation for: > > "gstreamer::functions::debug_bin_to_dot_file" > > and called the function with the following parameters: > > gst::debug_bin_to_dot_file(&pipeline, gst::DebugGraphDetails::ALL, > "C:/tmp/out.dot") > > but after launching the application, no files are generated. > > > Please check the C documentation here: > https://gstreamer.freedesktop.org/documentation/gstreamer/gstdebugutils.html?gi-language=c#GST_DEBUG_BIN_TO_DOT_FILE > > In short, you need to set an environment variable (GST_DEBUG_DUMP_DOT_DIR) defining the > directory where the .dot files should be dumped; the filename you > provide to the function call is then really just a filename (prefix). > > You can use gst::debug_bin_to_dot_data() to get a string and handle the > writing yourself if you need more control. That function is also not > affected by the environment variable. > > -- > > Sebastian Dröge, Centricular Ltd · https://www.centricular.com > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastian at centricular.com Mon Mar 15 10:33:15 2021 From: sebastian at centricular.com (Sebastian Dröge) Date: Mon, 15 Mar 2021 12:33:15 +0200 Subject: about how to use audiofx to realize audio EQ/bass/treble In-Reply-To: References: Message-ID: On Mon, 2021-03-15 at 15:23 +0800, strong wrote: > hi all, > I am studying how to use the gstreamer audiofx plugin to realize > audio bass/treble/EQ, but it seems that > the functionality of audiofx is very basic. Does anybody know how to > realize audio effects and EQ with a gstreamer > plugin?
You can find a graphical example with the equalizer here: https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/blob/master/tests/examples/equalizer/demo.c That probably answers your question of how to use it. If not, please ask a more specific question. -- Sebastian Dröge, Centricular Ltd · https://www.centricular.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From kepitto at outlook.com Mon Mar 15 12:23:13 2021 From: kepitto at outlook.com (kepitto) Date: Mon, 15 Mar 2021 07:23:13 -0500 (CDT) Subject: splitmuxsink segmentation fault on raspberry pi 4 Message-ID: <1615810993038-0.post@n4.nabble.com> I'm working on a recording application and am trying to create files with timestamps in 10-minute intervals. The application works fine on an x86 device; however, it doesn't work on a Raspberry Pi 4. I want to use the max-size-time property of splitmuxsink, but I have to assign "(guint64)0" to it for it to work: /g_object_set(splitmux, "muxer", matroskamux, "max-size-time", 10000000, NULL);/ delivers the following warning, and no file is created: "GLib-GObject-WARNING ... g_object_is_valid_property: object class "GstSplitMuxSink" has no property named '\x80;P\xb5\u0001' /g_object_set(splitmux, "muxer", matroskamux, "max-size-time", (guint64)10000000, NULL);/ results in an instant segmentation fault (same bug as for the dvb plugin: https://www.raspberrypi.org/forums/viewtopic.php?t=117819). By using (guint64)0 to make my application work I can't use the property for its purpose. Is there any workaround or solution to this problem?
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From lusinehayrapetyan1992 at gmail.com Mon Mar 15 12:48:03 2021 From: lusinehayrapetyan1992 at gmail.com (Lusine) Date: Mon, 15 Mar 2021 07:48:03 -0500 (CDT) Subject: webrtcbin segmentation fault In-Reply-To: <1613033333270-0.post@n4.nabble.com> References: <1613033333270-0.post@n4.nabble.com> Message-ID: <1615812483804-0.post@n4.nabble.com> If anyone else faces the same issue: the cause is WSL 1; it seems gstreamer webrtc doesn't work on it. It was solved by moving to WSL 2. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From lusinehayrapetyan1992 at gmail.com Mon Mar 15 13:10:31 2021 From: lusinehayrapetyan1992 at gmail.com (Lusine) Date: Mon, 15 Mar 2021 08:10:31 -0500 (CDT) Subject: webrtcbin state change issue Message-ID: <1615813831370-0.post@n4.nabble.com> Hi Folks, I need to change my pipeline (which streams with webrtcbin) state from PLAYING to READY and then again recover the playing state. gst_element_set_state(pipe1, GST_STATE_READY); // DO something and then again set the playing state. gst_element_set_state(pipe1, GST_STATE_PLAYING); My pipeline is this: https://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/tests/examples/webrtc/webrtc.c But after setting the PLAYING state again the streaming doesn't work. Is there an issue, or do I need to do some extra step to recover the PLAYING state? Regards, Lusine -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From joel.winarske at gmail.com Mon Mar 15 14:09:13 2021 From: joel.winarske at gmail.com (Joel Winarske) Date: Mon, 15 Mar 2021 07:09:13 -0700 Subject: render to egl texture In-Reply-To: References: <4a041b86-104d-28fc-ee1c-528b3ac1ee83@gmail.com> Message-ID: > For wayland, you need to share the wl_display between SDL and GStreamer. > Once you retrieve the wl_display from SDL, create the necessary > GstGLDisplay with gst_gl_display_wayland_new_with_display().
> Replacing gst_gl_display_new() with gst_gl_display_wayland_new_with_display() did the trick. Thank you -------------- next part -------------- An HTML attachment was scrubbed... URL: From t.i.m at zen.co.uk Mon Mar 15 19:43:51 2021 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Mon, 15 Mar 2021 19:43:51 +0000 Subject: GStreamer 1.18.4 stable bug fix release Message-ID: The GStreamer team is pleased to announce another bug fix release in the stable 1.18 release series. This release only contains bug fixes and security fixes. It should be safe to upgrade from 1.18.x We recommend you upgrade at the earliest opportunity. Highlighted bugfixes: - important security fixes for ID3 tag reading, matroska and realmedia parsing, and gst-libav audio decoding - audiomixer, audioaggregator: input buffer handling fixes - decodebin3: improve stream-selection message handling - uridecodebin3: make "caps" property work - wavenc: fix writing of INFO chunks in some cases - v4l2: bt601 colorimetry, allow encoder resolution changes, fix decoder frame rate negotiation - decklinkvideosink: fix auto format detection, and fixes for 29.97fps framerate output - mpeg-2 video handling fixes when seeking - avviddec: fix bufferpool negotiation and possible memory corruption when changing resolution - various stability, performance and reliability improvements - memory leak fixes - build fixes: rpicamsrc, qt overlay example, d3d11videosink on UWP Release notes with details about changes and fixed bugs can be found at: https://gstreamer.freedesktop.org/releases/1.18/#1.18.4 For details about the security fixes (which also apply to older branches) see: https://gstreamer.freedesktop.org/security/ Binaries for Android, iOS, Mac OS X and Windows should be available soon. As always, please let us know of any issues you run into by filing an issue or Merge Request in Gitlab: https://gitlab.freedesktop.org/gstreamer/ Thanks! 
-------------- next part -------------- https://gstreamer.freedesktop.org/src/gstreamer/gstreamer-1.18.4.tar.xz 9aeec99b38e310817012aa2d1d76573b787af47f8a725a65b833880a094dfbc5 https://gstreamer.freedesktop.org/src/gst-plugins-base/gst-plugins-base-1.18.4.tar.xz 29e53229a84d01d722f6f6db13087231cdf6113dd85c25746b9b58c3d68e8323 https://gstreamer.freedesktop.org/src/gst-plugins-good/gst-plugins-good-1.18.4.tar.xz b6e50e3a9bbcd56ee6ec71c33aa8332cc9c926b0c1fae995aac8b3040ebe39b0 https://gstreamer.freedesktop.org/src/gst-plugins-ugly/gst-plugins-ugly-1.18.4.tar.xz 218df0ce0d31e8ca9cdeb01a3b0c573172cc9c21bb3d41811c7820145623d13c https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.4.tar.xz 74e806bc5595b18c70e9ca93571e27e79dfb808e5d2e7967afa952b52e99c85f https://gstreamer.freedesktop.org/src/gst-libav/gst-libav-1.18.4.tar.xz 344a463badca216c2cef6ee36f9510c190862bdee48dc4591c0a430df7e8c396 https://gstreamer.freedesktop.org/src/gst-rtsp-server/gst-rtsp-server-1.18.4.tar.xz a46bb8de40b971a048580279d2660e616796f871ad3ed00c8a95fe4d273a6c94 https://gstreamer.freedesktop.org/src/gst-editing-services/gst-editing-services-1.18.4.tar.xz 4687b870a7de18aebf50f45ff572ad9e0138020e3479e02a6f056a0c4c7a1d04 https://gstreamer.freedesktop.org/src/gst-python/gst-python-1.18.4.tar.xz cb68e08a7e825e08b83a12a22dcd6e4f1b328a7b02a7ac84f42f68f4ddc7098e https://gstreamer.freedesktop.org/src/gstreamer-vaapi/gstreamer-vaapi-1.18.4.tar.xz 92db98af86f3150d429c9ab17e88d2364f9c07a140c8f445ed739e8f10252aea https://gstreamer.freedesktop.org/src/gst-omx/gst-omx-1.18.4.tar.xz e35051cf891eb2f31d6fcf176ff37d985f97f33874ac31b0b3ad3b5b95035043 https://gstreamer.freedesktop.org/src/gst-devtools/gst-devtools-1.18.4.tar.xz ffbd194c40912cb5e7fca2863648bf9dd8257b7af97d3a60c4fcd4efd8526ccf https://gstreamer.freedesktop.org/src/gstreamer-sharp/gstreamer-sharp-1.18.4.tar.xz 59eadf4a5e4e1cbc5e3f601754e6cd00bb7ba23bd82c602d72c7951ed496f2c2 
https://gstreamer.freedesktop.org/src/gstreamer-docs/gstreamer-docs-1.18.4.tar.xz 7899b2d7b293c76cc5e76ddc85084f92e533c7b8978894e6882cf3924008b30a -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: This is a digitally signed message part URL: From www.andreabertolaso at gmail.com Mon Mar 15 23:06:49 2021 From: www.andreabertolaso at gmail.com (Andressio) Date: Mon, 15 Mar 2021 18:06:49 -0500 (CDT) Subject: Regulate speed of decoded frames using presentation timestamp Message-ID: <1615849609962-0.post@n4.nabble.com> Hi all. I am developing an application that consumes h264 streams from two different models of ip cameras via rtsp. I am using a gstremer pipeline that looks like this: /rtspsrc location=camera_rtsp protocols=GST_RTSP_LOWER_TRANS_TCP ! rtph264depay ! h264parse ! video/x-h264,stream-format=byte-stream,alignment=au ! appsink/ Buffers pulled from appsink hold encoded h264 packets. I feed these packets directly into the NVDEC decoder of an nvidia gpu to obtain the decoded raw frames using opencv + nvidia video codec sdk (note that I don't use the nvdec element from gstreamer plugins bad). What happens is that with one of the cameras everything works fine: the frames are decoded at the right speed (in line with video framerate) and displaying them results in a smooth video. On the other hand the second camera gives poor results: frames are decoded at variable speed leading to a bulky video when displayed but, if I average the times between all consecutives frames from one keyframe to the next keyframe, the speed is in line with video framerate. To sum up this last case: all frames are decoded correctly, but are not decoded at a constant speed. I am quite confident that the decoder works properly. I guess it is simply a matter of correctly using the presentation timestamp that can be extracted from appsink when pulling buffers (N.B. 
for the first camera the presentation timestamp is mostly monotonically increasing but sometimes returns back in time, for the second camera it is monotonically increasing). In an early stage of the application I put gstreamer's nvdec element between /h264parse/ and /appsink/. I obtained regular streams with raw frames emitted at a regular speed for both cameras, but unfortunately it was not very reliable. How does gstreamer's nvdec regulate the emission of decoded frames? How can I implement a similar behavior using the presentation timestamps from the /appsink/ while keeping the above pipeline unchanged? I guess I should use some buffers, but are you aware of some examples or resources that can be used as a starting point? Thanks -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From ystreet00 at gmail.com Tue Mar 16 01:43:23 2021 From: ystreet00 at gmail.com (Matthew Waters) Date: Tue, 16 Mar 2021 12:43:23 +1100 Subject: webrtcbin state change issue In-Reply-To: <1615813831370-0.post@n4.nabble.com> References: <1615813831370-0.post@n4.nabble.com> Message-ID: <0b125490-61e1-f311-ed58-ab2b313eac43@gmail.com> Hi, webrtcbin doesn't really support restarting at all. Why are you setting webrtcbin to READY? Cheers -Matt On 16/3/21 12:10 am, Lusine wrote: > Hi Folks, > > I need to change my pipeline(which streams with webrtcbin) state from > PLAYING to READY and then again recover the playing state. > gst_element_set_state(pipe1, GST_STATE_READY); > // DO something and then again set the playing state. > gst_element_set_state(pipe1, GST_STATE_PLAYING); > > My pipeline is this: > https://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/tests/examples/webrtc/webrtc.c > > But after setting the PLAYING state again the streaming doesn't work, is > there an issue or do I need to do some extra step to recover the PLAYING state?
> > Regards, > Lusine > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 495 bytes Desc: OpenPGP digital signature URL: From vladimir.tyutin at gmail.com Tue Mar 16 09:56:19 2021 From: vladimir.tyutin at gmail.com (Vladimir Tyutin) Date: Tue, 16 Mar 2021 12:56:19 +0300 Subject: gstreamer 1.18.2 issue Message-ID: Hi all, I have built gstreamer and base plugins version 1.18.2 in buildroot for my camera board. Now I'm trying to verify it on the board. gst-inspect-1.0 prints the version: gst-inspect-1.0 --version gst-inspect-1.0 version 1.18.2 GStreamer 1.18.2 Unknown package origin But if I use it to show all available plugins or a specific plugin like this: gst-inspect-1.0 /usr/lib/gstreamer-1.0/libgstvideoscale.so it shows nothing. In the log the last message I see is default gsttracerutils.c:77:_priv_gst_tracing_init: Initializing GstTracer and after that a strange line: more: : No such file or directory and nothing else. Please see the full log attached. Thanks, Vladimir -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: gstreamer.log Type: application/octet-stream Size: 366090 bytes Desc: not available URL: From t.i.m at zen.co.uk Tue Mar 16 14:12:33 2021 From: t.i.m at zen.co.uk (Tim-Philipp Müller) Date: Tue, 16 Mar 2021 14:12:33 +0000 Subject: GStreamer 1.18.4 stable bug fix release In-Reply-To: References: Message-ID: > Binaries for Android, iOS, Mac OS X and Windows > should be available soon Binaries are up now too: https://gstreamer.freedesktop.org/download/ Cheers Tim From nicolas at ndufresne.ca Tue Mar 16 14:28:46 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Tue, 16 Mar 2021 10:28:46 -0400 Subject: Regulate speed of decoded frames using presentation timestamp In-Reply-To: <1615849609962-0.post@n4.nabble.com> References: <1615849609962-0.post@n4.nabble.com> Message-ID: Le lundi 15 mars 2021 à 18:06 -0500, Andressio a écrit : > Hi all. I am developing an application that consumes h264 streams from two > different models of ip cameras via rtsp. I am using a gstreamer pipeline that > looks like this: > > /rtspsrc location=camera_rtsp protocols=GST_RTSP_LOWER_TRANS_TCP ! > rtph264depay ! h264parse ! > video/x-h264,stream-format=byte-stream,alignment=au ! appsink/ > > Buffers pulled from appsink hold encoded h264 packets. I feed these packets > directly into the NVDEC decoder of an nvidia gpu to obtain the decoded raw > frames using opencv + nvidia video codec sdk (note that I don't use the > nvdec element from gstreamer plugins bad). > > What happens is that with one of the cameras everything works fine: the > frames are decoded at the right speed (in line with video framerate) and > displaying them results in a smooth video. On the other hand the second > camera gives poor results: frames are decoded at variable speed, leading to a > bulky video when displayed, but, if I average the times between all > consecutive frames from one keyframe to the next keyframe, the speed is in > line with video framerate.
To sum up this last case: all frames are decoded > correctly, but are not decoded at a constant speed. > > I am quite confident that the decoder works properly. I guess it is simply a > matter of correctly using the presentation timestamp that can be extracted > from appsink when pulling buffers (N.B. for the first camera the > presentation timestamp is mostly monotonically increasing but sometimes > returns back in time, for the second camera it is monotonically increasing). This going back in time indicates the presence of B-frames. The decoder will reorder them for you. Note the DTS, which should be monotonic before the decoder. > > In an early stage of the application I put gstreamer's nvdec element between > /h264parse/ and /appsink/. I obtained regular streams with raw frames > emitted at a regular speed for both cameras, but unfortunately it was not > very reliable. > > How does gstreamer's nvdec regulate the emission of decoded frames? How can > I implement a similar behavior using the presentation timestamps from the > /appsink/ while keeping the above pipeline unchanged? I guess I should use > some buffers, but are you aware of some examples or resources that can be > used as a starting point? In GStreamer, decoders are not responsible for smoothing the transmission. They simply process as fast as possible. Downstream of the decoder, playback pipelines will usually contain a short raw video queue (which allows buffering when decoding is faster). After that queue, there is a display component (which could be glimagesink). These elements use the GstBaseSink base class, which implements most of the synchronisation code. It translates the PTS into running-time, and then translates that into clock time in order to determine how long to wait before displaying. Audio sinks are much more complex. p.s. Remember that decoding complexity varies within a stream, so not all frames decode at equal speed.
Some HW decoders would smooth this out, but that is atypical for GPU decoders and PC software. > > Thanks > > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From patrick.fischer at vitec.com Tue Mar 16 14:29:08 2021 From: patrick.fischer at vitec.com (Patrick Fischer) Date: Tue, 16 Mar 2021 14:29:08 +0000 Subject: sometimes seek blocks In-Reply-To: References: Message-ID: Hello, I have a mobile (Android) app with gst, and unfortunately it hangs occasionally after a seek. I've been searching for several weeks and just can't get any further. Therefore, I hope that this way I can get a few tips on what else I could try. The pipeline is attached as an image, but in this case it is not a udpsrc but a souphttpsrc. Sometimes after a seek the pipeline doesn't get into the playing state anymore, so I can't read from the queue of the appsink anymore. It hangs and triggers a deadlock. gst_element_seek_simple(customData->pipeline, GST_FORMAT_TIME, (GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT), time_nanoseconds) I tried it with a flush before the seek. To get appropriate log output I set the log levels like this: setenv("GST_DEBUG", "GST_TRACER:7,GST_BUFFER*:3,GST_EVENT:9,GST_MESSAGE:9,appsink:9,queue:9", true); setenv("GST_TRACERS", "leaks(GstEvent,GstMessage)", true); Of course this generates a lot of output; therefore, I attached the logfile. If my app can't get any data from the appsink for a certain time, I walk through the pipeline and output each element's state. You can see that both appsinks (one each for audio/video) are still hanging in the PAUSED state. Even if I try to set them explicitly to playing, nothing happens. I also can't get g_signal_emit_by_name(mAppSink, "pull-sample", &sample); or similar on the appsink anymore.
It seems to hang on a mutex in gst-libs/gst/app/gstappsink.c. I don't know why. I suspect that some buffer is full. I also do not know why every now and then I get "Got data flow before segment event", but that doesn't seem to be the problem, because it sometimes works anyway. What can I do to get to the bottom of this? Regards Patrick -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: testmobile.log Type: text/x-log Size: 474927 bytes Desc: testmobile.log URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: pipeline.png Type: image/png Size: 246058 bytes Desc: pipeline.png URL: From vladimir.tyutin at gmail.com Tue Mar 16 16:53:02 2021 From: vladimir.tyutin at gmail.com (Vladimir Tyutin) Date: Tue, 16 Mar 2021 19:53:02 +0300 Subject: gst-inspect-1.0 shows nothing In-Reply-To: References: Message-ID: Changed the title ---------- Forwarded message --------- From: Vladimir Tyutin Date: Tue, Mar 16, 2021 at 12:56 PM Subject: gstreamer 1.18.2 issue To: Discussion of the development of and with GStreamer < gstreamer-devel at lists.freedesktop.org> Hi all, I have built gstreamer and base plugins version 1.18.2 in buildroot for my camera board. Now I'm trying to verify it on the board. gst-inspect-1.0 prints the version: gst-inspect-1.0 --version gst-inspect-1.0 version 1.18.2 GStreamer 1.18.2 Unknown package origin But if I use it to show all available plugins or a specific plugin like this: gst-inspect-1.0 /usr/lib/gstreamer-1.0/libgstvideoscale.so it shows nothing. In the log the last message I see is default gsttracerutils.c:77:_priv_gst_tracing_init: Initializing GstTracer and after that a strange line: more: : No such file or directory and nothing else. Please see the full log attached. Thanks, Vladimir -------------- next part -------------- An HTML attachment was scrubbed...
URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: gstreamer.log Type: application/octet-stream Size: 366090 bytes Desc: not available URL: From patrickcusack at mac.com Tue Mar 16 18:59:03 2021 From: patrickcusack at mac.com (Patrick Cusack) Date: Tue, 16 Mar 2021 11:59:03 -0700 Subject: gstreamer and BigSur Message-ID: <8360783F-221D-4E47-AB69-9CB446B60A5C@mac.com> Question: Does gstreamer on macOS have a sink that utilizes Metal or does it still use OpenGL? Thanks, Patrick From michiel at aanmelder.nl Tue Mar 16 19:51:45 2021 From: michiel at aanmelder.nl (Michiel Konstapel) Date: Tue, 16 Mar 2021 20:51:45 +0100 Subject: WebRTCBin not receiving caps signal In-Reply-To: References: <1610622034499-0.post@n4.nabble.com> Message-ID: After several days of banging my head on the wall, I've gone and split off the webrtc part into a separate pipeline, connected to the main one through udpsink/udpsrc. However, even then adding webrtcbins will randomly lock up. I've narrowed it down to a pretty small testcase in the attached python file. In my environment (1.18.3 built from source) I had to run export GI_TYPELIB_PATH=/usr/local/lib/x86_64-linux-gnu/girepository-1.0 or it fails with: ValueError: Namespace GstWebRTC not available Run it once as a producer pipeline: python3 testcase.py producer And then run the consumer: python3 testcase.py consumer This will just add a whole bunch of webrtcbins to the running pipeline. 
Occasionally, this will successfully add 100 webrtcbins to the pipeline, but usually, it'll lock up: gstwebrtcbin.c:319:gst_webrtc_bin_pad_new:<'':sink_0> new visible pad with direction sink gstwebrtcbin.c:469:_find_transceiver_for_mline: Found transceiver (NULL) for mlineindex 0 gstwebrtcbin.c:5823:gst_webrtc_bin_request_new_pad: Created new transceiver for mline 0 gstwebrtcbin.c:5737:gst_webrtc_bin_change_state: changing state: NULL => READY gstwebrtcbin.c:1354:_check_if_negotiation_is_needed: checking if negotiation is needed gstwebrtcbin.c:1360:_check_if_negotiation_is_needed: no negotiation possible until caps have been received on all sink pads gstwebrtcbin.c:5737:gst_webrtc_bin_change_state: changing state: READY => PAUSED gstwebrtcbin.c:5737:gst_webrtc_bin_change_state: changing state: PAUSED => PLAYING And then it just freezes. Sometimes it's the 98th, sometimes it's the 10th. I've ran it with maximum GST_DEBUG, added print statements to gstwebrtcbin.c, and I can't find anything that sets apart the successful and failing attempts. Adding sleeps seems to help, but it's not foolproof and I have no idea why it would help. Matthew or other experts, any ideas? Is there an issue tracker I should add this to? Is there something wrong with this approach? I did find a workaround: if I pause the whole consumer pipeline, add the webrtcbin, and then set the pipeline playing, that appears to prevent the race condition. Now that it's in a separate pipeline, that's a viable option, but it still doesn't sit right with me that it just randomly... stops. Kind regards, Michiel On 14-01-2021 12:04, Michiel Konstapel wrote: > Apologies, that lost a lot of context since it references an old email > to the list. 
It concerns this message by Dominik: > http://gstreamer-devel.966125.n4.nabble.com/WebRTCBin-not-receiving-caps-signal-tt4689946.html > > Quoting: >> I am working on a python app that turns an rtsp stream from a >> networked camera into a stream that you can access through the >> browser via WebRTC. >> >> For that, I am using a dynamic pipeline, the rtp audio and video >> streams go to a fakesink through a tee element. For each incoming >> websocket connection, I build a WebRTC connection: I am adding two >> queues, one for audio, one for video to the tee elements and connect >> them to a new WebRTC element. Then I exchange the SDP offer and >> response via websockets. >> >> Now, in some situations, after quick reloads of the web page that >> brokers the connection, the newest added WebRTCBin element does not >> send the "on-negotiation-needed" signal. I am assuming this is because >> it did not receive caps on the newly requested pad, thus the >> gst_webrtcbin_sink_event method did not get triggered and the >> WebRTCBin is not checking whether a negotiation is needed. >> >> A PDF of the pipeline is here: >> http://roettsch.es/debug_gst_webrtcbin.pdf and an image of the >> relevant portions is attached. >> >> The WebRTCBin element sendrecv-2-772371 has blocked pads, as it >> hasn't done any negotiation and unblocked its pads, but also the CAPS >> on the links to the queue elements only show "ANY". >> >> I have a few questions to further investigate this: >> >> What could be the reason the caps event is not received by the >> GstWebRTCBin? >> When is the caps event usually sent downstream? Is it sent only once >> and could it be that the GstWebRTCBin somehow missed it through some >> race condition? >> When connecting queue and webrtcbin to the tee element, do I need to >> follow a certain order? Do I need to insert probes in order not to >> miss the caps event?
> > > On 14-01-2021 12:00, mkonstapel wrote: >> Hi Dominik et al, >> >> I think I am running into the exact same problem - quickly reloading the >> "player" page (thus adding webrtcbins) eventually, and apparently >> randomly, >> locks up the pipeline. Did you ever find a solution? >> >> Kind regards, >> Michiel Konstapel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: testcase.py Type: text/x-python Size: 4160 bytes Desc: not available URL: From michiel at aanmelder.nl Tue Mar 16 20:14:59 2021 From: michiel at aanmelder.nl (Michiel Konstapel) Date: Tue, 16 Mar 2021 21:14:59 +0100 Subject: WebRTCBin not receiving caps signal In-Reply-To: References: <1610622034499-0.post@n4.nabble.com> Message-ID: Oh, the producer pipeline in the example was not quite correct: it should include key frames, a profile specification and config-interval for the h264 parser and payloader: ... nvh264enc gop-size=50 ! video/x-h264, profile=baseline ! h264parse config-interval=-1 ! rtph264pay config-interval=-1 ... Updated version is attached. But even then, it still breaks. On 16-03-2021 20:51, Michiel Konstapel wrote: > > After several days of banging my head on the wall, I've gone and split > off the webrtc part into a separate pipeline, connected to the main > one through udpsink/udpsrc. However, even then adding webrtcbins will > randomly lock up. I've narrowed it down to a pretty small testcase in > the attached python file. > > In my environment (1.18.3 built from source) I had to run > > export GI_TYPELIB_PATH=/usr/local/lib/x86_64-linux-gnu/girepository-1.0 > > or it fails with: > > ValueError: Namespace GstWebRTC not available > > Run it once as a producer pipeline: > > python3 testcase.py producer > > And then run the consumer: > > python3 testcase.py consumer > > This will just add a whole bunch of webrtcbins to the running pipeline. 
> > Occasionally, this will successfully add 100 webrtcbins to the pipeline, but usually, it'll lock up:
> gstwebrtcbin.c:319:gst_webrtc_bin_pad_new:<'':sink_0> new visible pad with direction sink
> gstwebrtcbin.c:469:_find_transceiver_for_mline: Found transceiver (NULL) for mlineindex 0
> gstwebrtcbin.c:5823:gst_webrtc_bin_request_new_pad: Created new transceiver for mline 0
> gstwebrtcbin.c:5737:gst_webrtc_bin_change_state: changing state: NULL => READY
> gstwebrtcbin.c:1354:_check_if_negotiation_is_needed: checking if negotiation is needed
> gstwebrtcbin.c:1360:_check_if_negotiation_is_needed: no negotiation possible until caps have been received on all sink pads
> gstwebrtcbin.c:5737:gst_webrtc_bin_change_state: changing state: READY => PAUSED
> gstwebrtcbin.c:5737:gst_webrtc_bin_change_state: changing state: PAUSED => PLAYING
> > And then it just freezes. > > Sometimes it's the 98th, sometimes it's the 10th. I've run it with > maximum GST_DEBUG, added print statements to gstwebrtcbin.c, and I > can't find anything that sets apart the successful and failing > attempts. Adding sleeps seems to help, but it's not foolproof and I > have no idea why it would help. > > Matthew or other experts, any ideas? Is there an issue tracker I > should add this to? Is there something wrong with this approach? > > I did find a workaround: if I pause the whole consumer pipeline, add > the webrtcbin, and then set the pipeline playing, that appears to > prevent the race condition. Now that it's in a separate pipeline, > that's a viable option, but it still doesn't sit right with me that it > just randomly... stops. > > Kind regards, > Michiel > > On 14-01-2021 12:04, Michiel Konstapel wrote: >> Apologies, that lost a lot of context since it references an old >> email to the list.
It concerns this message by Dominik: >> http://gstreamer-devel.966125.n4.nabble.com/WebRTCBin-not-receiving-caps-signal-tt4689946.html >> >> Quoting: >>> I am working on a python app that turns an rtsp stream from a >>> networked camera into a stream that you can access through the >>> browser via WebRTC. >>> >>> For that, I am using a dynamic pipeline, the rtp audio and video >>> streams go to a fakesink through a tee element. For each incoming >>> websocket connection, I build a WebRTC connection: I am adding two >>> queues, one for audio, one for video to the tee elements and connect >>> them to a new WebRTC element. Then I exchange the SDP offer and >>> response via websockets. >>> >>> Now, in some situations, after quick reloads of the web page that >>> brokers the connection, the newest added WebRTCBin element does not >>> send the "on-negotiation-needed" signal. I am assuming this is >>> because it did not receive caps on the newly requested pad, thus the >>> gst_webrtcbin_sink_event method did not get triggered and the >>> WebRTCBin is not checking whether a negotiation is needed. >>> >>> A PDF of the pipeline is here: >>> http://roettsch.es/debug_gst_webrtcbin.pdf and an image of the >>> relevant portions is attached. >>> >>> The WebRTCBin element sendrecv-2-772371 has blocked pads, as it >>> hasn't done any negotiation and unblocked its pads, but also the >>> CAPS on the links to the queue elements only show "ANY". >>> >>> I have a few questions to further investigate this: >>> >>> What could be the reason the caps event is not received by the >>> GstWebRTCBin? >>> When is the caps event usually sent downstream? Is it sent only once >>> and could it be that the GstWebRTCBin somehow missed it through some >>> race condition? >>> When connecting queue and webrtcbin to the tee element, do I need to >>> follow a certain order? Do I need to insert probes in order not to >>> miss the caps event?
> > > On 14-01-2021 12:00, mkonstapel wrote: >>> Hi Dominik et al, >>> >>> I think I am running into the exact same problem - quickly reloading >>> the >>> "player" page (thus adding webrtcbins) eventually, and apparently >>> randomly, >>> locks up the pipeline. Did you ever find a solution? >>> >>> Kind regards, >>> Michiel Konstapel -- Michiel Konstapel /Lead Software Developer/ aanmelder.nl T: +31(0)15 2400 119 E: michiel at aanmelder.nl -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: testcase.py Type: text/x-python Size: 4298 bytes Desc: not available URL: From nicolas at ndufresne.ca Tue Mar 16 20:54:10 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Tue, 16 Mar 2021 16:54:10 -0400 Subject: gstreamer and BigSur In-Reply-To: <8360783F-221D-4E47-AB69-9CB446B60A5C@mac.com> References: <8360783F-221D-4E47-AB69-9CB446B60A5C@mac.com> Message-ID: On Tuesday, 16 March 2021 at 11:59 -0700, Patrick Cusack wrote: > Question: Does gstreamer on macOS have a sink that utilizes Metal or does it > still use OpenGL? There is no Metal support in GStreamer. So far, glimagesink has been used, and I think the Vulkan sink can be used with MoltenVK. Metal is quite niche in the context of the GStreamer project, but if someone is willing to contribute and maintain support for that, patches welcome.
> > Thanks, > > Patrick > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From www.andreabertolaso at gmail.com Tue Mar 16 21:21:29 2021 From: www.andreabertolaso at gmail.com (Andressio) Date: Tue, 16 Mar 2021 16:21:29 -0500 (CDT) Subject: Regulate speed of decoded frames using presentation timestamp In-Reply-To: References: <1615849609962-0.post@n4.nabble.com> Message-ID: <1615929689022-0.post@n4.nabble.com> Thank you for your reply Nicolas Dufresne-5 wrote > This return back in time indicates the presence of B-Frames. The decoder > will > reorder them for you. Notice the DTS, which should be monotonic before the > decoder. I don't think there are B-frames in the stream. The return back in time happens a variable number of times: in my experiments 3 times per hour at most. I guess that if it were due to B-frames the jump back would be much more frequent and predictable. Consider that I'm using a keyframe interval of 10 at 25 fps. Nicolas Dufresne-5 wrote > In GStreamer, decoders are not responsible for smoothing the transmission. > They > simply process as fast as possible. Downstream of the decoder, playback > pipelines > will usually contain a short raw video queue (which allows buffering when > decoding is faster). After that queue, there is a display component > (could > be glimagesink). These elements use the GstBaseSink base class, which > implements > most of the synchronisation code. It will translate the PTS into > running-time, > and then translate this to clock time in order to determine how long to > wait > before displaying. Audio sinks are much more complex. I will look into GstBaseSink, thanks. Is relying on the PTS the best shot I have to regulate the output? Nicolas Dufresne-5 wrote > p.s. Remember that decoding complexity varies within a stream, so not all > frames > decode at equal speed.
Some HW decoders would smooth this, but this is > atypical > for GPU decoders and PC software. It is very interesting. Could you please provide some documentation about these different behaviours? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From iotsystek at gmail.com Tue Mar 16 23:29:32 2021 From: iotsystek at gmail.com (iotsystek) Date: Tue, 16 Mar 2021 18:29:32 -0500 (CDT) Subject: How is gst-libav installed on windows 10? Message-ID: <1615937372910-0.post@n4.nabble.com> I am using GStreamer 1.18.2. I have my cameras streaming from embedded Linux to a Windows machine that runs Linux in a virtual machine. I have been trying for several days to get this working directly on Windows, without the virtual machine. When I run the following GStreamer command on Windows (it works in Linux): gst-launch-1.0 -v udpsrc port=1235 ! "application/x-rtp, payload=127" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false I receive the following error: WARNING: erroneous pipeline: no element "avdec_h264" I get a very similar error when I run: gst-inspect-1.0 avdec_h264 No such element or plugin 'avdec_h264' After much research I conclude that Windows needs to have gst-libav installed, but I cannot find the install files anywhere. It seems that these files are installed automatically on Linux. Any assistance with this is greatly appreciated. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From gotsring at live.com Wed Mar 17 03:54:23 2021 From: gotsring at live.com (gotsring) Date: Tue, 16 Mar 2021 22:54:23 -0500 (CDT) Subject: How is gst-libav installed on windows 10? In-Reply-To: <1615937372910-0.post@n4.nabble.com> References: <1615937372910-0.post@n4.nabble.com> Message-ID: <1615953263035-0.post@n4.nabble.com> I'm assuming you're using MSVC instead of MinGW. I have avdec_h264 listed in gst-inspect-1.0, and I just did a stock install. I'm using the GStreamer 1.18.1 MSVC x64 binaries downloaded from the GStreamer downloads page.
I don't think I did anything more than that, but I could have forgotten something. Easiest way to check is to do a *complete* install of the 1.18 binaries and you should be good to go. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From fred.sund at gmail.com Wed Mar 17 05:33:37 2021 From: fred.sund at gmail.com (RedMarsBlueMoon) Date: Wed, 17 Mar 2021 00:33:37 -0500 (CDT) Subject: Stutter/Missing Frames Raw h264 over rtp. Message-ID: <1615959217960-0.post@n4.nabble.com> Hello! I'm running gst-launch with a Raspberry Pi 4b as sender and a PC with Ubuntu 20.04 on the receiving end. Locally, over a LAN Ethernet cable. The receiver is getting a good image with only very occasional breakage. The receiver also seems to be keeping the correct playback speed. But it's stuttering, or maybe frame skipping. The receiver PC is otherwise completely fine playing back other video files with the same resolution and higher bit rates. This is the command on the sender (GStreamer 1.14.4): gst-launch-1.0 -v filesrc location=/home/pi/Videos/mulan_trailer.h264 ! video/x-h264, framerate=25/1, width=1920, height=804 ! queue ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=8160 This is the command on the receiver: gst-launch-1.0 udpsrc port=8160 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink I tried adding that jitter step but it didn't help: gst-launch-1.0 udpsrc port=8160 ! application/x-rtp,encoding-name=H264,payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink I don't think this is a hardware limitation, but it seems more like there's missing timing metadata on the stream, or that I'm not reading the stream correctly, or something like that? Can anyone see what I'm doing wrong? Cheers!
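One thing worth ruling out is the timestamping itself: an H.264 RTP stream carries 32-bit timestamps on a 90 kHz clock, and the depayloader/jitterbuffer turn those into buffer PTS for the sink to sync against. A rough pure-Python illustration of that arithmetic (not GStreamer's actual code, just the mapping):

```python
RTP_CLOCK_RATE = 90000          # 90 kHz clock used for H.264 RTP (RFC 6184)
GST_SECOND = 1_000_000_000      # GStreamer expresses timestamps in nanoseconds

def rtp_to_pts(rtp_ts, first_rtp_ts, clock_rate=RTP_CLOCK_RATE):
    """Map an RTP timestamp to a stream-relative PTS in nanoseconds,
    handling 32-bit wrap-around of the RTP timestamp field."""
    delta = (rtp_ts - first_rtp_ts) % (1 << 32)
    return delta * GST_SECOND // clock_rate

# At 25 fps, consecutive frames are 90000 / 25 = 3600 ticks apart,
# i.e. 40 ms between PTS values; irregular gaps here show up as stutter.
print(rtp_to_pts(123456 + 3600, 123456))  # -> 40000000 (40 ms)
```

Since the payloader is fed from a raw .h264 file (as in the sender command), h264parse has to derive timestamps from the framerate caps, so it is worth checking the sender's -v output to confirm that sensible PTS values are actually being produced.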
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From jacklawi at gmail.com Wed Mar 17 08:41:29 2021 From: jacklawi at gmail.com (omer.tal) Date: Wed, 17 Mar 2021 03:41:29 -0500 (CDT) Subject: Stutter/Missing Frames Raw h264 over rtp. In-Reply-To: <1615959217960-0.post@n4.nabble.com> References: <1615959217960-0.post@n4.nabble.com> Message-ID: <1615970489517-0.post@n4.nabble.com> Hey, Can you try to run it with GST_DEBUG=3? Maybe there are some warnings that can help debug. Also, try to add the mtu parameter to rtph264pay, and set it to 100 / 300 / 700 etc. (the default is 1400). I had a similar issue in the past and it appeared to relate to the network I was using. The mtu parameter really helped. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From marc.leeman at gmail.com Wed Mar 17 08:50:38 2021 From: marc.leeman at gmail.com (Marc Leeman) Date: Wed, 17 Mar 2021 09:50:38 +0100 Subject: RTSP Client sometimes doesn't send video (stream 0) packets to the client In-Reply-To: References: <1615468228154-0.post@n4.nabble.com> Message-ID: Unless 'OPTIONS' lists commands that it does not support :-) The joy of network cameras... On Thu, 11 Mar 2021 at 16:45, Jeff Shanab wrote: > > I deal with a large number of security cameras of different brands. > > Just an FYI.... > 60 seconds and GET_PARAMETER are the default and work 90+% of the time. > The SETUP reply will tell you if it is not the default, but it is there for the default as well, depending on brand: at the end of the Session header, "timeout=60" > > I have seen cameras that > 1) cannot handle GET_PARAMETER in any transport (RED Vision) > 2) must have OPTIONS or SET_PARAMETER instead > 3) use rtcp, the rest are ignored. (I thought the timeout=nn told me which, but it is inconsistent) > > The behaviour seems to vary with transport > 1) rtsp-over-http (2 sockets): All bets are off, every mfg interprets the vague combination of the Apple Quicktime Spec differently.
> 2) rtsp-over-tcp (1 socket): Usually GET_PARAMETER is universally accepted here and is all that is needed. > 3) rtsp/rtp/udp (up to 7 sockets): GET_PARAMETER for the main session, but some must have rtcp receiver reports per session or they disconnect, > > Patterns I see are > if disconnect in seconds, rtcp issue > if right on the 30-second, 1-minute or 2-minute mark, then GET_PARAMETER/OPTIONS/SET_PARAMETER (OPTIONS on very old cameras; otherwise OPTIONS tells you if it supports GET_PARAMETER) > > Wireshark is your friend. > > > > > On Thu, Mar 11, 2021 at 9:22 AM Thornton, Keith wrote: >> >> Hi, >> that may be because the server is not receiving the keep-alive message from the client. If you have a wireshark log, look to see if the client sends a GET_PARAMETER once a minute (if you haven't changed the keep-alive timeout). >> Regards >> >> -----Original Message----- >> From: gstreamer-devel On behalf of renjith.t >> Sent: Thursday, 11 March 2021 14:10 >> To: gstreamer-devel at lists.freedesktop.org >> Subject: Re: RTSP Client sometimes doesn't send video (stream 0) packets to the client >> >> Hi, >> >> I have actually gone through the session media files. >> >> What I found is that >> -> gstreamer rtsp server is closing the transport for the stream 0 it is >> -> done from "update_transport" method in rtsp-stream.c info debug >> -> printed is "removing TCP 192.168.111.78" where 111.78 was the >> client where video was not displayed >> >> so it is clear that gstreamer is explicitly closing the transport.
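Jeff's rules of thumb above can be condensed into a little triage helper; this is purely illustrative (it encodes the observations from this thread, not anything in GStreamer):

```python
def triage_rtsp_disconnect(seconds_until_drop):
    """Rough triage of an RTSP session drop, following the patterns above:
    a drop within seconds points at RTCP handling, while a drop right on a
    30 s / 1 min / 2 min boundary points at the keep-alive request
    (GET_PARAMETER / OPTIONS / SET_PARAMETER)."""
    if seconds_until_drop < 10:
        return "rtcp"           # likely missing RTCP receiver reports
    for boundary in (30, 60, 120):
        if abs(seconds_until_drop - boundary) < 2:
            return "keepalive"  # likely missing/ignored keep-alive request
    return "unknown"            # time for Wireshark

print(triage_rtsp_disconnect(60.5))  # -> keepalive
```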
>> >> >> >> -- >> Sent from: http://gstreamer-devel.966125.n4.nabble.com/ >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.freedesktop.org >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.freedesktop.org >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -- g. Marc From vladimir.tyutin at gmail.com Wed Mar 17 09:32:25 2021 From: vladimir.tyutin at gmail.com (Vladimir Tyutin) Date: Wed, 17 Mar 2021 12:32:25 +0300 Subject: gst-inspect-1.0 show nothing In-Reply-To: References: Message-ID: Hi all, Please give me any feedback on the issue below. Plugin registry seems to work, but gst-inspect-1.0 prints no data.
Thanks, Vladimir On Tue, Mar 16, 2021 at 7:53 PM Vladimir Tyutin wrote: > Changed the title > > ---------- Forwarded message --------- > From: Vladimir Tyutin > Date: Tue, Mar 16, 2021 at 12:56 PM > Subject: gstreamer 1.18.2 issue > To: Discussion of the development of and with GStreamer < > gstreamer-devel at lists.freedesktop.org> > > > Hi all, > I have built gstreamer and base plugins version 1.18.2 in buildroot for my > camera board. > Now I'm trying to verify it on the board. gst-inspect-1.0 prints the > version: > gst-inspect-1.0 --version > gst-inspect-1.0 version 1.18.2 > GStreamer 1.18.2 > Unknown package origin > > But if I use it to show all available plugins or a specific plugin like this: > gst-inspect-1.0 /usr/lib/gstreamer-1.0/libgstvideoscale.so > it shows nothing. > In the log I see the last message > default gsttracerutils.c:77:_priv_gst_tracing_init: Initializing GstTracer > and after that a strange line: > more: : No such file or directory > and nothing else. > Please see the full log attached. > > Thanks, > Vladimir > -------------- next part -------------- An HTML attachment was scrubbed... URL: From marc.leeman at gmail.com Wed Mar 17 11:24:05 2021 From: marc.leeman at gmail.com (Marc Leeman) Date: Wed, 17 Mar 2021 12:24:05 +0100 Subject: gst-inspect-1.0 show nothing In-Reply-To: References: Message-ID: Start with clearing the registry cache ~/.cache/gstreamer-1.0/registry.x86_64.bin Does this show up for all gst-inspect (e.g. without parameters)? Does this show up after a clean rebuild? On Wed, 17 Mar 2021 at 11:45, Vladimir Tyutin wrote: > > Hi all, > Please give me any feedback on the issue below. > Plugin registry seems to work, but gst-inspect-1.0 prints no data.
> > Thanks, > Vladimir > > On Tue, Mar 16, 2021 at 7:53 PM Vladimir Tyutin wrote: >> >> Changed the title >> >> ---------- Forwarded message --------- >> From: Vladimir Tyutin >> Date: Tue, Mar 16, 2021 at 12:56 PM >> Subject: gstreamer 1.18.2 issue >> To: Discussion of the development of and with GStreamer >> >> >> Hi all, >> I have built gstreamer and base plugins version 1.18.2 in buildroot for my camera board. >> Now I'm trying to verify it on the board. gst-inspect-1.0 prints the version: >> gst-inspect-1.0 --version >> gst-inspect-1.0 version 1.18.2 >> GStreamer 1.18.2 >> Unknown package origin >> >> But if I use it to show all available plugins or a specific plugin like this: >> gst-inspect-1.0 /usr/lib/gstreamer-1.0/libgstvideoscale.so >> it shows nothing. >> In the log I see the last message >> default gsttracerutils.c:77:_priv_gst_tracing_init: Initializing GstTracer >> and after that a strange line: >> more: : No such file or directory >> and nothing else. >> Please see the full log attached. >> >> Thanks, >> Vladimir > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -- g. Marc From 330271189 at qq.com Tue Mar 16 07:45:21 2021 From: 330271189 at qq.com (=?ISO-8859-1?B?c3Ryb25n?=) Date: Tue, 16 Mar 2021 15:45:21 +0800 Subject: gstreamer-devel Digest, Vol 122, Issue 29 In-Reply-To: References: Message-ID: Re: about how to use audiofx to realize audio EQ/bass/treble hi, Sebastian     Thanks very much, about EQ, you have given me what I wanted.     And, do you know which plugin can produce audio effects like bass and treble? I can do it with sox, but I do not know how to do it with gstreamer.
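For bass and treble specifically, one candidate (my suggestion, not something confirmed in this thread) is the equalizer-3bands element from gst-plugins-good, where band0 is the low band, band1 the mid and band2 the high band, each a gain in dB. A sketch that only assembles the launch description:

```python
def eq_pipeline(bass_db=6.0, treble_db=-3.0):
    """Build a gst-launch description with a 3-band equalizer.
    band0/band1/band2 are gains in dB for the low/mid/high bands of
    equalizer-3bands (gst-plugins-good); the values here are examples."""
    return (
        "audiotestsrc ! audioconvert ! "
        f"equalizer-3bands band0={bass_db} band1=0.0 band2={treble_db} ! "
        "audioconvert ! autoaudiosink"
    )

print(eq_pipeline())
```

For finer control there are also equalizer-10bands and equalizer-nbands in the same plugin set.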
Thanks very much ------------------ Original ------------------ From: "gstreamer-devel" From kepitto at outlook.com Wed Mar 17 07:34:48 2021 From: kepitto at outlook.com (kepitto) Date: Wed, 17 Mar 2021 02:34:48 -0500 (CDT) Subject: Stutter/Missing Frames Raw h264 over rtp. In-Reply-To: <1615959217960-0.post@n4.nabble.com> References: <1615959217960-0.post@n4.nabble.com> Message-ID: <1615966488253-0.post@n4.nabble.com> I've had a stuttery stream before when I didn't parse before decoding. Try adding h264parse before the avdec_h264 element on the receiving end. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From kirti2.goel at altran.com Wed Mar 17 07:40:07 2021 From: kirti2.goel at altran.com (Kirti Goel) Date: Wed, 17 Mar 2021 02:40:07 -0500 (CDT) Subject: not able to stream the video on server Message-ID: <1615966807418-0.post@n4.nabble.com> hi, I am referring to the repo https://github.com/GStreamer/gst-rtsp-server/tree/master/examples and I am getting this error on the server: 1) gcc -o test-launch test-record.c `pkg-config --cflags --libs gstreamer-rtsp-server-1.0` 2) ./test-launch -p 7081 filesrc location=Relaxing_highway_traffic_777.mp4 ! decodebin ! x264enc ! fakesink After running the client, it shows this error: *(test-launch:37329): GLib-GObject-WARNING **: 12:41:21.811: invalid cast from 'GstFileSrc' to 'GstBin' (test-launch:37329): GStreamer-CRITICAL **: 12:41:21.811: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed* On the client side the error is *ERROR: from element /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: Could not read from resource.
Additional debug info: ../gst/rtsp-sink/gstrtspclientsink.c(3092): gst_rtsp_client_sink_send (): /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: Got error response: 415 (Unsupported Media Type).* -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From alemmat at gmail.com Mon Mar 15 23:59:18 2021 From: alemmat at gmail.com (alemmat) Date: Mon, 15 Mar 2021 18:59:18 -0500 (CDT) Subject: loudness measure Message-ID: <1615852758206-0.post@n4.nabble.com> Hello everyone, I am trying to measure the loudness in an AES67 stream. I found in the Ubuntu repository the plugin gstreamer1.0-crystalizer-pulseeffects, which has an element peautogain that allows measuring loudness.
When I use this element in a pipeline created from the terminal I do not receive any error, but the output is a little strange compared to other elements like level:
/GstPipeline:pipeline0/GstPeautogain:peautogain0: g = 0.39685201644897461
/GstPipeline:pipeline0/GstPeautogain:peautogain0: m = -13.623071670532227
/GstPipeline:pipeline0/GstPeautogain:peautogain0: s = -16.76246452331543
/GstPipeline:pipeline0/GstPeautogain:peautogain0: i = -15.127962112426758
/GstPipeline:pipeline0/GstPeautogain:peautogain0: r = -25.127962112426758
/GstPipeline:pipeline0/GstPeautogain:peautogain0: l = -15.116959571838379
/GstPipeline:pipeline0/GstPeautogain:peautogain0: lra = 0
/GstPipeline:pipeline0/GstPeautogain:peautogain0: g = 0.4035041332244873
When I try to use the peautogain element in a Python script I can't hear anything, I get this error, and the script does not close:
GstMessageError, gerror=(GError)NULL, debug=(string)"gstbasesrc.c\(3072\):\ gst_base_src_loop\ \(\):\ /GstPipeline:player/GstUDPSrc:udpsrc:\012streaming\ stopped\,\ reason\ not-linked\ \(-1\)", details=(structure)"details\,\ flow-return\=\(int\)-1\;";
GstMessageError, gerror=(GError)NULL, debug=(string)"gstqueue.c\(988\):\ gst_queue_handle_sink_event\ \(\):\ /GstPipeline:player/GstQueue:queue:\012streaming\ stopped\,\ reason\ not-linked\ \(-1\)", details=(structure)"details\,\ flow-return\=\(int\)-1\;";
Thanks for reading this post; if anyone can give me a hint I would be very glad. Sorry if I didn't express myself well, I don't speak English well. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From quocanhkcn2018 at gmail.com Wed Mar 17 03:34:38 2021 From: quocanhkcn2018 at gmail.com (quocanhkcn2018) Date: Tue, 16 Mar 2021 22:34:38 -0500 (CDT) Subject: How can I enable rtmp in gst-plugins-bad? Message-ID: <1615952078241-0.post@n4.nabble.com> Hello, I am a newbie. I am trying to use gstreamer rtmp in my Quectel Kit to live stream my video.
But when I use rtmpsink I get the error: "No such element or plugin 'rtmpsink'" So I think I need to enable rtmp in gst-plugins-bad. Could someone help me, please? Thanks so much. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Wed Mar 17 13:14:30 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Wed, 17 Mar 2021 09:14:30 -0400 Subject: Regulate speed of decoded frames using presentation timestamp In-Reply-To: <1615929689022-0.post@n4.nabble.com> References: <1615849609962-0.post@n4.nabble.com> <1615929689022-0.post@n4.nabble.com> Message-ID: <2f785917fab548dafd9bb84df5da5cf6872c1ee3.camel@ndufresne.ca> On Tuesday, 16 March 2021 at 16:21 -0500, Andressio wrote: > Thank you for your reply > > > Nicolas Dufresne-5 wrote > > This return back in time indicates the presence of B-Frames. The decoder > > will > > reorder them for you. Notice the DTS, which should be monotonic before the > > decoder. > > I don't think there are B-frames in the stream. The return back in time > happens a variable number of times: in my experiments 3 times per hour at > most. I guess that if it was due to B-frames the jump back should be much > more frequent and predictable. Consider that I'm using a keyframe interval > of 10 at 25 fps. Good point, then perhaps a clock skew in the jitterbuffer; best run with this env set, so you get warnings from gst: GST_DEBUG=2 > > > Nicolas Dufresne-5 wrote > > In GStreamer, decoders are not responsible for smoothing the transmission. > > They > > simply process as fast as possible. Downstream of the decoder, playback > > pipelines > > will usually contain a short raw video queue (which allows buffering when > > decoding is faster). After that queue, there is a display component > > (could > > be glimagesink). These elements use the GstBaseSink base class, which > > implements > > most of the synchronisation code.
It will translate the PTS into > > running-time, > > and then translate this to clock time in order to determine how long to > > wait > > before displaying. Audio sinks are much more complex. > > I will look into GstBaseSink, thanks. Is relying on the PTS the best shot I > have to regulate the output? Well, running-time, which is obtained through: running_time = gst_segment_to_running_time (segment, GST_FORMAT_TIME, pts); The segment structure is obtained by watching for segment events; it is also copied into GstSamples to make it easy for appsink users. > > > Nicolas Dufresne-5 wrote > > p.s. Remember that decoding complexity varies within a stream, so not all > > frames > > decode at equal speed. Some HW decoders would smooth this, but this is > > atypical > > for GPU decoders and PC software. > > It is very interesting. Could you please provide some documentation about > these different behaviours? I don't have much documentation to share, this is mostly accumulated knowledge. In general, CODECs are composed of multiple compression tools; see each tool as a function that needs to be called with specific parameters and order. If a tool isn't used, you have fewer functions to call, hence you get the output in fewer clock cycles.
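The translation Nicolas describes can be sketched as plain arithmetic; this is a simplified version of what gst_segment_to_running_time() does for a forward-playing segment (ignoring segment.offset, applied-rate and negative rates):

```python
GST_SECOND = 1_000_000_000  # GStreamer timestamps are in nanoseconds

def to_running_time(pts, seg_start, seg_base=0, rate=1.0):
    """Simplified segment-to-running-time for a forward segment:
    running_time = (pts - segment.start) / |rate| + segment.base."""
    assert pts >= seg_start, "PTS falls outside the segment"
    return int((pts - seg_start) / abs(rate)) + seg_base

# A buffer 2 s into a segment that starts at 1 s, played back at 2x,
# reaches the sink 1 s of running time after the segment's base.
print(to_running_time(3 * GST_SECOND, 1 * GST_SECOND, 0, 2.0))  # -> 1000000000
```

GstBaseSink then adds the pipeline's base time (and any configured latency) to this running time to obtain the absolute clock time to wait for before rendering.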
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From vladimir.tyutin at gmail.com Wed Mar 17 16:06:47 2021 From: vladimir.tyutin at gmail.com (Vladimir Tyutin) Date: Wed, 17 Mar 2021 19:06:47 +0300 Subject: gst-inspect-1.0 show nothing In-Reply-To: References: Message-ID: Hi Marc, Yes, I tried cleaning the registry bin file. I also tried a clean build and gst-inspect without parameters. No output at all. When I activate GST_DEBUG=6 I see in the log that plugins seem to be parsed and added to the registry, but there is no output with the plugin details. I compared the logs with my Ubuntu platform and they seem similar, but on Ubuntu I see the plugin details while on my camera's Linux I see nothing. For the camera I use a buildroot package to build GStreamer. Maybe some meson config parameter is missing? What config parameters should be used for GStreamer to get gst-inspect working? > On 17 Mar 2021, at 14:23, Marc Leeman wrote: > > Start with clearing the registry cache > ~/.cache/gstreamer-1.0/registry.x86_64.bin > > Does this show up for all gst-inspect (e.g. without parameters)? > Does this show up after a clean rebuild? > > >> On Wed, 17 Mar 2021 at 11:45, Vladimir Tyutin wrote: >> >> Hi all, >> Please give me any feedback on the issue below. >> Plugin registry seems to work, but gst-inspect-1.0 prints no data. >> >> Thanks, >> Vladimir >> >>> On Tue, Mar 16, 2021 at 7:53 PM Vladimir Tyutin wrote: >>> >>> Changed the title >>> >>> ---------- Forwarded message --------- >>> From: Vladimir Tyutin >>> Date: Tue, Mar 16, 2021 at 12:56 PM >>> Subject: gstreamer 1.18.2 issue >>> To: Discussion of the development of and with GStreamer >>> >>> >>> Hi all, >>> I have built gstreamer and base plugins version 1.18.2 in buildroot for my camera board. >>> Now I'm trying to verify it on the board.
gst-inspect-1.0 prints the version: >>> gst-inspect-1.0 --version >>> gst-inspect-1.0 version 1.18.2 >>> GStreamer 1.18.2 >>> Unknown package origin >>> >>> But if I use it to show all available plugins or a specific plugin like this: >>> gst-inspect-1.0 /usr/lib/gstreamer-1.0/libgstvideoscale.so >>> it shows nothing. >>> In the log I see the last message >>> default gsttracerutils.c:77:_priv_gst_tracing_init: Initializing GstTracer >>> and after that a strange line: >>> more: : No such file or directory >>> and nothing else. >>> Please see the full log attached. >>> >>> Thanks, >>> Vladimir >> >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.freedesktop.org >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > > -- > g. Marc > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From wanted002 at 163.com Wed Mar 17 12:46:09 2021 From: wanted002 at 163.com (wanted002) Date: Wed, 17 Mar 2021 07:46:09 -0500 (CDT) Subject: Don't know how to make avdec_h264 work in multi-threaded mode, seems not working with max-threads Message-ID: <1615985169511-0.post@n4.nabble.com> Hi, all: I'm a newbie to GStreamer. I appreciate your wonderful work! My idea is to reduce the video latency by optimizing the CPU cost of decoding. I'm now stuck on this problem and need your help, thanks very much~~. This is my pipeline for receiving and decoding a remote rtp stream: *====screen print start=====* 200390 200356 54 20:15 ? 00:01:05 /usr/bin/gst-launch-1.0 -v udpsrc port=1991 caps="application/x-rtp, media=video" ! rtpjitterbuffer latency=20 ! rtpmp2tdepay ! tsdemux name=demuxer demuxer. ! queue name=video_ch max-size-buffers=0 max-size-time=0 ! h264parse ! queue name=dec0 ! avdec_h264 max-threads=2 skip-frame=1 ! videoconvert n-threads=4 !
xvimagesink display=:0 sync=false demuxer. ! queue name=audio_ch max-size-buffers=0 max-size-time=0 ! aacparse ! avdec_aac ! audioconvert ! audioresample ! autoaudiosink *====screen print end=====* But I found that avdec_h264 is not working with multiple threads, even though I supposed 'max-threads' should enable that. See the "top -H" print below: all decoding work is loaded on the 200398 thread, named "dec0". *====screen print start=====* top -H | grep dec 200398 root 20 0 1164484 69284 46408 S 57.9 0.9 0:42.92 dec0:src 200398 root 20 0 1161412 66248 43372 R 32.3 0.8 0:43.90 dec0:src 200398 root 20 0 1164484 69284 46408 S 82.6 0.9 0:46.41 dec0:src 200398 root 20 0 1161412 66248 43372 R 82.9 0.8 0:48.93 dec0:src 200398 root 20 0 1164484 69284 46408 S 59.0 0.9 0:50.73 dec0:src 200398 root 20 0 1164484 69284 46408 S 32.9 0.9 0:51.73 dec0:src 200398 root 20 0 1164484 69284 46408 S 24.4 0.9 0:52.47 dec0:src 200398 root 20 0 1164484 69284 46408 S 24.3 0.9 0:53.21 dec0:src 200398 root 20 0 1164484 69284 46408 S 24.0 0.9 0:53.94 dec0:src *====screen print end=====* This is the pstree print of the "parent PID" 200390, which is the /usr/bin/gst-launch-1.0 program. *====screen print start=====* pstree -pt 200390 gst-launch-1.0(200390)─┬─{audio_ch:src}(200399) ├─{audio_ch:src}(201471) ├─{dec0:src}(200398) ├─{dec0:src}(200405) ├─{gmain}(200404) ├─{gst-launch-1.0}(200397) ├─{rtpjitterbuffer}(200402) ├─{timer}(200401) ├─{udpsrc0:src}(200403) ├─{video_ch:src}(200400) ├─{videoconvert}(200406) ├─{videoconvert}(200407) └─{videoconvert}(200408) *====screen print end=====* And I debugged thread 200405, which is the sibling decoding thread of 200398; it seems to have been blocked in pthread_cond_wait since it was spawned.
Attaching to process 200390 [New LWP 200397] [New LWP 200398] [New LWP 200399] [New LWP 200400] [New LWP 200401] [New LWP 200402] [New LWP 200403] [New LWP 200404] [New LWP 200405] [New LWP 200406] [New LWP 200407] [New LWP 200408] [New LWP 201471] [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib/aarch64-linux-gnu/libthread_db.so.1". 0x0000007f82ceedb8 in poll () from /lib/aarch64-linux-gnu/libc.so.6 (gdb) info threads Id Target Id Frame * 1 Thread 0x7f83114790 (LWP 200390) "gst-launch-1.0" 0x0000007f82ceedb8 in poll () from /lib/aarch64-linux-gnu/libc.so.6 2 Thread 0x7f7866e1e0 (LWP 200397) "gst-launch-1.0" 0x0000007f82cc541c in clock_nanosleep () from /lib/aarch64-linux-gnu/libc.so.6 3 Thread 0x7f77e6d1e0 (LWP 200398) "dec0:src" 0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 4 Thread 0x7f776311e0 (LWP 200399) "audio_ch:src" 0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 5 Thread 0x7f76e301e0 (LWP 200400) "video_ch:src" 0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 6 Thread 0x7f7662f1e0 (LWP 200401) "timer" 0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 7 Thread 0x7f75e2e1e0 (LWP 200402) "rtpjitterbuffer" 0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 8 Thread 0x7f7562d1e0 (LWP 200403) "udpsrc0:src" 0x0000007f82ceedb8 in poll () from /lib/aarch64-linux-gnu/libc.so.6 9 Thread 0x7f74e2c1e0 (LWP 200404) "gmain" 0x0000007f82ceedb8 in poll () from /lib/aarch64-linux-gnu/libc.so.6 10 Thread 0x7f57fff1e0 (LWP 200405) "dec0:src" 0x0000007f82da6038 in pthread_cond_wait@@GLIBC_2.17 () from /lib/aarch64-linux-gnu/libpthread.so.0 11 Thread 0x7f574fe1e0 (LWP 200406) "videoconvert" 0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 12 Thread 0x7f56cfd1e0 (LWP 200407) "videoconvert" 0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 13 Thread 0x7f564fc1e0 (LWP 200408) "videoconvert" 
0x0000007f82cf4760 in syscall () from /lib/aarch64-linux-gnu/libc.so.6 14 Thread 0x7f554031e0 (LWP 201471) "audio_ch:src" 0x0000007f82ceedb8 in poll () from /lib/aarch64-linux-gnu/libc.so.6 (gdb) c Continuing. (gdb) bt #0 0x0000007f83073038 in pthread_cond_wait@@GLIBC_2.17 () at /lib/aarch64-linux-gnu/libpthread.so.0 #1 0x0000007f800c5cbc in () at /lib/aarch64-linux-gnu/libavutil.so.56 #2 0x0000007f8306c4fc in start_thread () at /lib/aarch64-linux-gnu/libpthread.so.0 #3 0x0000007f82fc530c in () at /lib/aarch64-linux-gnu/libc.so.6 -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From miguel.exposito at generaldrones.es Wed Mar 17 20:09:33 2021 From: miguel.exposito at generaldrones.es (mikeGD) Date: Wed, 17 Mar 2021 15:09:33 -0500 (CDT) Subject: Playing a raw h.264 stream from a USB source (timestamp issues) Message-ID: <1616011773036-0.post@n4.nabble.com> I'm dealing with a live video source (drone wireless video receiver) that outputs a raw h.264 video stream over USB. My goal is to integrate it into QGroundStation in Android, which has a GStreamer pipeline like this: I have dumped a slice of the received USB data to a file, which is perfectly playable with vlc using the following command: However, if I play it back using this GStreamer pipeline, the playback speed is too high (like x10) I'm using appsrc to push the USB data into the QGroundControl pipeline. The video plays, but lots of frames are dropped and gstreamer complains about packets dropped because frames are too late. After closer inspection of my dump, I realized that the stream is lacking pts and dts information (which seems to be usual in baseline h.264 streams) But apparently, the duration information is there. The USB endpoint reads 512-byte chunks (due to the USB Hi-Speed max. payload size for a bulk endpoint), and some transfers are smaller (400+ bytes long). I have no way to detect the beginning/end of NALs since it's an opaque continuous byte stream. 
(video/x-h264, stream-format=(string)byte-stream, alignment=none) So I built an appsrc to push the video stream to the pipeline and tried to blindly timestamp the buffers like this: ... but still no luck ... I have had limited success by using the following pipeline that encodes the h.264 stream into RTP payloads and then decodes it with a caps filter specifying the target framerate: I could build that into QGroundControl in C++ but I don't think it's the right approach and I should not make any assumptions about the target framerate since in this case it's 30 fps, but it may change dynamically. So, my questions are: - What would be the right approach to getting the video playing at the right speed without any frame drops? - Is it reasonable or possible to ask GStreamer to generate the PTS/DTS (there are no B-frames, so PTS should be equal to DTS) based on the duration information of the packets using the standard pipeline? Any help would be greatly appreciated. Thanks! Mike -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From zingfrid at gmail.com Thu Mar 18 00:34:49 2021 From: zingfrid at gmail.com (Anton Pryima) Date: Thu, 18 Mar 2021 02:34:49 +0200 Subject: gst-inspect-1.0 show nothing In-Reply-To: References: Message-ID: Hello Vladimir. Try to look into pkg-config at your camera's platform and check that it found appropriate .so files. Best regards, Anton. On Wed, Mar 17, 2021, 18:06 Vladimir Tyutin wrote: > Hi Marc, > Yes, I tried clean registry bin file. > Tried clean build also and gst-inspect without parameters. > No output at all. > When I activate GST_DEBUG=6 I see in log plugins seems to be parsed and > added to registry. But no output with plugins details. > I compared logs with my Ubuntu platform and log seems similar but on > Ubuntu I see plugin details on my camera Linux nothing. > For camera I use buildroot package to build gstreamer. May be some meson > config parameter is missed? 
> What should be used in config parameters for gstreamer to get gst-inspect > working? > > > On 17 Mar 2021, at 14:23, Marc Leeman wrote: > > > > ?Start with clearing the registry cache > > ~/.cache/gstreamer-1.0/registry.x86_64.bin > > > > Does this show up for all gst-inspect (e.g. without parameters)? > > Does this show up after a clean rebuild? > > > > > >> On Wed, 17 Mar 2021 at 11:45, Vladimir Tyutin < > vladimir.tyutin at gmail.com> wrote: > >> > >> Hi all, > >> Please give me any feedback on the issue below. > >> Plugin registry seems to work, but gst-inspect-1.0 prints no data. > >> > >> Thanks, > >> Vladimir > >> > >>> On Tue, Mar 16, 2021 at 7:53 PM Vladimir Tyutin < > vladimir.tyutin at gmail.com> wrote: > >>> > >>> Changed the title > >>> > >>> ---------- Forwarded message --------- > >>> From: Vladimir Tyutin > >>> Date: Tue, Mar 16, 2021 at 12:56 PM > >>> Subject: gstreamer 1.18.2 issue > >>> To: Discussion of the development of and with GStreamer < > gstreamer-devel at lists.freedesktop.org> > >>> > >>> > >>> Hi all, > >>> I have build gstreamer and base plugins version 1.18.2 in buildroot > for my camera board. > >>> Now I'm trying to verify it on the board. gst-inspect-1.0 prints the > version: > >>> gst-inspect-1.0 --version > >>> gst-inspect-1.0 version 1.18.2 > >>> GStreamer 1.18.2 > >>> Unknown package origin > >>> > >>> But it I use it to show all available plugins or specific plugin like > this: > >>> gst-inspect-1.0 /usr/lib/gstreamer-1.0/libgstvideoscale.so > >>> It shows nothing. > >>> In the log I see the last message > >>> default gsttracerutils.c:77:_priv_gst_tracing_init: Initializing > GstTracer > >>> and after that strange line: > >>> more: : No such file or directory > >>> and nothing else. > >>> Please see the full log attached. 
> >>> > >>> Thanks, > >>> Vladimir > >> > >> _______________________________________________ > >> gstreamer-devel mailing list > >> gstreamer-devel at lists.freedesktop.org > >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > > > > > > -- > > g. Marc > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.freedesktop.org > > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fred.sund at gmail.com Thu Mar 18 01:54:57 2021 From: fred.sund at gmail.com (RedMarsBlueMoon) Date: Wed, 17 Mar 2021 20:54:57 -0500 (CDT) Subject: Stutter/Missing Frames Raw h264 over rtp. In-Reply-To: <1615970489517-0.post@n4.nabble.com> References: <1615959217960-0.post@n4.nabble.com> <1615970489517-0.post@n4.nabble.com> Message-ID: <1616032497291-0.post@n4.nabble.com> Thank you both for the suggestions! I tried adding the 'h264parse' but unfortunately I got no different result with that. I added the debug option and got some interesting messages repeatedly: WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage: A lot of buffers are being dropped. Additional debug info: gstbasesink.c(3003): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage: There may be a timestamping problem, or this computer is too slow. I don't see any upticks on my CPU monitor. It's running at about 10% maybe, on an i7-3770 4+4 core.
My GPU is a gtx970 which has onboard hw h264 decode according to https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new Is there a way to see whether the decode is actually running on the CPU or the GPU? That error message also says, "There may be a timestamping problem", so what could I be missing for that? I thought that's what 'rtph264pay' did? Cheers! -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From gotsring at live.com Thu Mar 18 03:32:24 2021 From: gotsring at live.com (gotsring) Date: Wed, 17 Mar 2021 22:32:24 -0500 (CDT) Subject: Don't know how to make avdec_h264 work in multi-threaded mode, seems not working with max-threads In-Reply-To: <1615985169511-0.post@n4.nabble.com> References: <1615985169511-0.post@n4.nabble.com> Message-ID: <1616038344494-0.post@n4.nabble.com> Not really a direct answer, mostly a question: Is there any reason you're using CPU decoders instead of hardware-accelerated decoders? I'll admit I haven't recently done much GStreamer stuff on Linux (it appears you're using Linux), but depending on your hardware, you may have the option of using vaapi (vaapih264dec) or OMX (omxh264dec) plugins. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From wanted002 at 163.com Thu Mar 18 05:48:16 2021 From: wanted002 at 163.com (wanted002) Date: Thu, 18 Mar 2021 00:48:16 -0500 (CDT) Subject: Don't know how to make avdec_h264 work in multi-threaded mode, seems not working with max-threads In-Reply-To: <1616038344494-0.post@n4.nabble.com> References: <1615985169511-0.post@n4.nabble.com> <1616038344494-0.post@n4.nabble.com> Message-ID: <1616046496522-0.post@n4.nabble.com> Thanks~. My GStreamer is 1.16.2, and the pipeline runs on my Linux box (Ubuntu 20.04). Because there's no hardware acceleration available, the CPU has to take the decoding job, which causes a high cost that I believe leads to the latency.
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From lusinehayrapetyan1992 at gmail.com Thu Mar 18 11:34:59 2021 From: lusinehayrapetyan1992 at gmail.com (Lusine) Date: Thu, 18 Mar 2021 06:34:59 -0500 (CDT) Subject: webrtcbin state change issue In-Reply-To: <0b125490-61e1-f311-ed58-ab2b313eac43@gmail.com> References: <1615813831370-0.post@n4.nabble.com> <0b125490-61e1-f311-ed58-ab2b313eac43@gmail.com> Message-ID: <1616067299626-0.post@n4.nabble.com> Hi Matt, I need to change the tune & speed-preset properties of x264enc elements at runtime. These properties can only be changed when x264enc is in the ready state. That's why I was thinking to set the pipeline to the ready state, change these properties, and then restore the playing state, which doesn't work. Is there any other way to achieve what I want? Regards, Lusine -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From activecraft at gmail.com Thu Mar 18 12:23:33 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Thu, 18 Mar 2021 17:53:33 +0530 Subject: Issue of fetching Camera from website Message-ID: Hi, We have installed Ubuntu and GStreamer at AWS and when running this command gst-launch-1.0 v4l2src ! videoconvert ! ximagesink we are getting this error: Setting pipeline to PAUSED ... ERROR: Pipeline doesn't want to pause. ERROR: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Could not initialise X output Additional debug info: ximagesink.c(860): gst_x_image_sink_xcontext_get (): /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Could not open display Setting pipeline to NULL ... Freeing pipeline ... ------------------------ and gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc tune=zerolatency ! rtph264pay ! udpsink port=10000 Error Setting pipeline to PAUSED ... ERROR: Pipeline doesn't want to pause. ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Cannot identify device '/dev/video0'.
Additional debug info: v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: system error: No such file or directory Setting pipeline to NULL ... Freeing pipeline ... Anurag Biala +91-9814808323 | +1(646)-797-2775 SKYPE: activecraft at hotmail.com | Gmail activecraft at gmail.com Website: https://www.activecraft.com * |* Email info at activecraft.com Website Design & Develop + App Design & Develop + SEO/SMM + Graphic Design + UI/UX CONFIDENTIALITY NOTICE: The information in this email may be confidential and/or privileged. This email is intended to be reviewed by only the individual or organization named above. If you are not the intended recipient or an authorized representative of the intended recipient, you are hereby notified that any review, dissemination or copying of this email and its attachments, if any, or the information contained herein is prohibited. If you have received this email in error, please immediately notify the sender by return email and delete this email from your system. -------------- next part -------------- An HTML attachment was scrubbed... URL: From maaloulsafouane at gmail.com Thu Mar 18 13:49:36 2021 From: maaloulsafouane at gmail.com (SAFOUANE) Date: Thu, 18 Mar 2021 08:49:36 -0500 (CDT) Subject: Using gstreamer with cmake in android? Message-ID: <1616075376144-0.post@n4.nabble.com> I succeeded in receiving the video stream with GStreamer on Android as the client and the Raspberry Pi as the server. The problem is that I use ndk-build to add the GStreamer C files to my application. I want to know how to add them using cmake?
Or can I add the Android.mk and Application.mk to my application using cmake? Best regards, Safouane -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Thu Mar 18 15:31:34 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Thu, 18 Mar 2021 11:31:34 -0400 Subject: Don't know how to make avdec_h264 work in multi-threaded mode, seems not working with max-threads In-Reply-To: <1616046496522-0.post@n4.nabble.com> References: <1615985169511-0.post@n4.nabble.com> <1616038344494-0.post@n4.nabble.com> <1616046496522-0.post@n4.nabble.com> Message-ID: <50b192f8634c89fdb8277d26ab9007bd4d672767.camel@ndufresne.ca> On Thursday, 18 March 2021 at 00:48 -0500, wanted002 wrote: > Thanks~. My GStreamer is 1.16.2, and the pipeline runs on my Linux box > (Ubuntu 20.04). Because there's no hardware acceleration available, the > CPU has to take the decoding job, which causes a high cost that I believe > leads to the latency. Perhaps you want to be aware of this: https://gitlab.freedesktop.org/gstreamer/gst-libav/-/blob/master/ext/libav/gstavviddec.c#L561 If the pipeline is live, we know that frame-based threading will introduce a lot of latency, so we flip it to slice-based threading. For this mode to run on multiple threads, the encoded stream needs to have at least as many slices as the number of threads you want to use. If you have control over the encoder, that's the approach I would use. You can always override this by setting the property "thread-type" to frame (1). Note that you then get one frame of latency per thread, as frame threading requires introducing render delays. The parallelism still varies with how the references are encoded, since sometimes you have to decode the reference before you can do anything else.
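In command-line terms, the override described above might look like the following untested sketch of the original receive pipeline with frame threading forced (whether the thread-type enum accepts the symbolic value `frame` or only the numeric `1` may depend on the gst-libav version):

```shell
gst-launch-1.0 -v udpsrc port=1991 caps="application/x-rtp, media=video" ! \
    rtpjitterbuffer latency=20 ! rtpmp2tdepay ! tsdemux ! queue ! \
    h264parse ! avdec_h264 thread-type=frame max-threads=2 ! \
    videoconvert n-threads=4 ! xvimagesink sync=false
```

Expect roughly one extra frame of latency per decoding thread in this mode; if the encoder is under your control, configuring it to emit multiple slices per frame instead lets the default slice threading parallelise without that delay.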
Nicolas > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From nicolas at ndufresne.ca Thu Mar 18 15:35:00 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Thu, 18 Mar 2021 11:35:00 -0400 Subject: Issue of fetching Camera from website In-Reply-To: References: Message-ID: On Thursday, 18 March 2021 at 17:53 +0530, Activecraft Software Development wrote: > Hi, > > We have installed Ubuntu and GStreamer at AWS > and when running this command > gst-launch-1.0 v4l2src ! videoconvert ! ximagesink > > we are getting this error: > Setting pipeline to PAUSED ... > ERROR: Pipeline doesn't want to pause. > ERROR: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Could > not initialise X output > Additional debug info: > ximagesink.c(860): gst_x_image_sink_xcontext_get (): > /GstPipeline:pipeline0/GstXImageSink:ximagesink0: > Could not open display > Setting pipeline to NULL ... > Freeing pipeline ... You have to start an X11 server and set the DISPLAY env accordingly in order to use ximagesink. > and > gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc > tune=zerolatency ! rtph264pay ! udpsink port=10000 > > Error > Setting pipeline to PAUSED ... > ERROR: Pipeline doesn't want to pause. > ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Cannot > identify device '/dev/video0'. > Additional debug info: > v4l2_calls.c(609): gst_v4l2_open (): > /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: > system error: No such file or directory > Setting pipeline to NULL ... > Freeing pipeline ... You have to have a camera connected to your system and exposed as /dev/video0 to use v4l2src too. While running a headless X11 server is possible on AWS, attaching a camera seems rather atypical.
If you use a full VM, you would enable vivid driver, which will provide an emulated camera to your linux kernel. > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From marc.leeman at gmail.com Thu Mar 18 16:50:28 2021 From: marc.leeman at gmail.com (Marc Leeman) Date: Thu, 18 Mar 2021 17:50:28 +0100 Subject: not able to stream the video on server In-Reply-To: <1615971581443-0.post@n4.nabble.com> References: <1615971581443-0.post@n4.nabble.com> Message-ID: I don't know this particular tool by heart, but I seem to remember that you need to specify the pipeline up until the RTP payloader *and* you need to pass it the name pay0 (or more if you mux multiple streams into one session).
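For the stock test-launch.c example that would mean passing the part up to the payloader as a single quoted description wrapped in parentheses, with the payloader named pay0. An untested sketch based on the original command (the file name is from the post; pt=96 is an assumption):

```shell
./test-launch -p 7081 \
  "( filesrc location=Relaxing_highway_traffic_777.mp4 ! decodebin ! \
     x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"
```

The earlier "invalid cast from 'GstFileSrc' to 'GstBin'" warning is consistent with the description having been passed as separate unquoted arguments, so only the first element was parsed and no pay0 payloader could be found.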
On Wed, 17 Mar 2021 at 14:45, Kirti Goel wrote: > > hi, > > I am referring to the repo > https://github.com/GStreamer/gst-rtsp-server/tree/master/examples > > I am getting this error on server > 1) gcc -o test-launch test-record.c `pkg-config --cflags --libs > gstreamer-rtsp-server-1.0` > 2)./test-launch -p 7081 filesrc location=Relaxing_highway_traffic_777.mp4 ! > decodebin ! x264enc ! fakesink > > after running client, error it shows: > *(test-launch:37329): GLib-GObject-WARNING **: 12:41:21.811: invalid cast > from 'GstFileSrc' to 'GstBin' > > (test-launch:37329): GStreamer-CRITICAL **: 12:41:21.811: > gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed* > > on client side error > *ERROR: from element > /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: Could not read > from resource. > Additional debug info: > ../gst/rtsp-sink/gstrtspclientsink.c(3092): gst_rtsp_client_sink_send (): > /GstPipeline:pipeline0/GstRTSPClientSink:rtspclientsink0: > Got error response: 415 (Unsupported Media Type).* > > > > > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -- g. Marc From iotsystek at gmail.com Thu Mar 18 19:05:59 2021 From: iotsystek at gmail.com (iotsystek) Date: Thu, 18 Mar 2021 14:05:59 -0500 (CDT) Subject: Unable to set Capabilities on v4l2src using C Message-ID: <1616094359820-0.post@n4.nabble.com> Setup: I am using GStreamer 1.18.2 on embedded Linux with a couple of USB cameras and sending the video to Windows. I have a working pipeline. USB Cameras attached to the embedded computer using the following command example: gst-launch-1.0 v4l2src device=/dev/video0 ! \ 'video/x-raw, width=640, height=480' ! \ videorate max-rate=6 ! \ videoconvert ! \ x264enc pass=qual quantizer=20 tune=zerolatency ! \ rtph264pay ! 
\ udpsink host=192.168.168.32 port=1234 The PC receives and displays the streams with no issue. Now I am attempting to translate the gst-launch-1.0 command into C code. I have been totally unsuccessful in setting the capabilities ('video/x-raw, width=640, height=480') on the v4l2src, which does not have a ?caps? property. I would be grateful for both a code snippet / example detailing the C source code needed to do this and guidance as to where I might look to find this kind of answer directly. Is there a repository of GStreamer version 1.0 C Code snipets? Also related to the above. Once I have this working I will need to be able to change these camera height and width capabilities. I also have a socket connection (totally unrelated to GStreamer) to the embedded Linux system and use it for control and feedback of additional hardware. I will need to raise and lower the cameras width and height while it is running via this socket connection. Will I have to totally stop the video flow or is there a way to adjust these setting on the fly. Again the code snippet would be greatly appreciated. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From iotsystek at gmail.com Thu Mar 18 20:44:37 2021 From: iotsystek at gmail.com (iotsystek) Date: Thu, 18 Mar 2021 15:44:37 -0500 (CDT) Subject: Unable to set Capabilities on v4l2src using C In-Reply-To: <1616094359820-0.post@n4.nabble.com> References: <1616094359820-0.post@n4.nabble.com> Message-ID: <1616100277064-0.post@n4.nabble.com> iotsystek wrote > Setup: > I am using GStreamer 1.18.2 on embedded Linux with a couple of USB cameras > and sending the video to Windows. > > I have a working pipeline. > USB Cameras attached to the embedded computer using the following command > example: > > gst-launch-1.0 v4l2src device=/dev/video0 ! \ > 'video/x-raw, width=640, height=480' ! \ > videorate max-rate=6 ! \ > videoconvert ! \ > x264enc pass=qual quantizer=20 tune=zerolatency ! \ > rtph264pay ! 
\ > udpsink host=192.168.168.32 port=1234 > > The PC receives and displays the streams with no issue. > > Now I am attempting to translate the gst-launch-1.0 command into C code. > I > have been totally unsuccessful in setting the capabilities ('video/x-raw, > width=640, height=480') on the v4l2src, which does not have a "caps" > property. > > I would be grateful for both a code snippet / example detailing the C > source > code needed to do this and guidance as to where I might look to find this > kind of answer directly. Is there a repository of GStreamer version 1.0 C > code snippets? > > Also related to the above. Once I have this working I will need to be > able > to change these camera height and width capabilities. > > I also have a socket connection (totally unrelated to GStreamer) to the > embedded Linux system and use it for control and feedback of additional > hardware. I will need to raise and lower the camera's width and height > while > it is running via this socket connection. Will I have to totally stop the > video flow or is there a way to adjust these settings on the fly. > > Again the code snippet would be greatly appreciated. > > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at .freedesktop > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel Got it working! It turns out that: 'video/x-raw, width=640, height=480' ! \ translates to: capsfilter caps='video/x-raw, width=800, height=600' ! \ capsfilter has a caps property which is easily set, and the capsfilter applies to the output of the preceding element in the chain. So at this point, any help or suggested best practices on changing parameters (width, height) on the fly would be greatly appreciated.
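In C, the capsfilter approach can be sketched roughly as follows (untested; element and variable names are illustrative, error handling omitted):

```c
#include <gst/gst.h>

/* Build "... v4l2src ! capsfilter ! ..." and later change the resolution
 * by replacing the filter caps; downstream renegotiates on the next buffer. */
static void
set_resolution (GstElement *capsfilter, gint width, gint height)
{
  GstCaps *caps = gst_caps_new_simple ("video/x-raw",
      "width",  G_TYPE_INT, width,
      "height", G_TYPE_INT, height,
      NULL);

  g_object_set (capsfilter, "caps", caps, NULL);
  gst_caps_unref (caps);
}

/* somewhere in setup: */
/* GstElement *filter = gst_element_factory_make ("capsfilter", "filter"); */
/* set_resolution (filter, 640, 480); */
```

Whether v4l2src can renegotiate the new size without a restart depends on the driver; if the live change fails, setting the pipeline to READY, updating the caps, and returning to PLAYING is the conservative fallback.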
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Thu Mar 18 20:53:14 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Thu, 18 Mar 2021 16:53:14 -0400 Subject: Unable to set Capabilities on v4l2src using C In-Reply-To: <1616094359820-0.post@n4.nabble.com> References: <1616094359820-0.post@n4.nabble.com> Message-ID: Le jeudi 18 mars 2021 ? 14:05 -0500, iotsystek a ?crit?: > Setup: > I am using GStreamer 1.18.2 on embedded Linux with a couple of USB cameras > and sending the video to Windows. > > I have a working pipeline. > USB Cameras attached to the embedded computer using the following command > example: > > gst-launch-1.0 v4l2src device=/dev/video0 ! \ > ??? 'video/x-raw, width=640, height=480' ! \ > ??? videorate max-rate=6 ! \ > ??? videoconvert ! \ > ??? x264enc pass=qual quantizer=20 tune=zerolatency ! \ > ??? rtph264pay ! \ > ??? udpsink host=192.168.168.32 port=1234 > > The PC receives and displays the streams with no issue. > > Now I am attempting to translate the gst-launch-1.0 command into C code.? I > have been totally unsuccessful in setting the capabilities ('video/x-raw, > width=640, height=480') on the v4l2src, which does not have a ?caps? > property. You'll need to place an element called "capsfilter" which has a caps property. > > I would be grateful for both a code snippet / example detailing the C source > code needed to do this and guidance as to where I might look to find this > kind of answer directly.? Is there a repository of GStreamer version 1.0 C > Code snipets? > > Also related to the above.? Once I have this working I will need to be able > to change these camera height and width capabilities. > > I also have a socket connection (totally unrelated to GStreamer) to the > embedded Linux system and use it for control and feedback of additional > hardware.? I will need to raise and lower the cameras width and height while > it is running via this socket connection.? 
Will I have to totally stop the > video flow or is there a way to adjust these setting on the fly. > > Again the code snippet would be greatly appreciated. > > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From leedag8 at gmail.com Thu Mar 18 21:43:36 2021 From: leedag8 at gmail.com (JimmyHan) Date: Thu, 18 Mar 2021 16:43:36 -0500 (CDT) Subject: changing default settings to video in code e.g width, height, hue, contrast Message-ID: <1616103816889-0.post@n4.nabble.com> Hello Beginner here, Started learning and understanding more n more with practice. I have been searching up n down for examples on video manipulation in code but after a few days I still don't have a clue. E.g setting width and height, or denoise, detail, brightness, hue, frc-algorithm and the list goes on. My boss has built a UI in html, php; where he is am able to add to insert text or choose from a drop down list options for the pipeline. Looks like this screenshot of my setup: My task was to add those from the pic and integrate those in the options shown in the image. After choosing those options I then create a xml file, then read that xml file options into variables using an xmlreader. This is done in C, using "gst.h" and "libxml/xmlreader.h". Now I want to add options mentioned above but I'm not sure how to do this using Gstreamer API, I can't find ntn on the internet for doing those settings using the API. I saw APIs like gst-libs/mfx/gstmfxfilter.h but I'm not sure if there is a native way of doing it. I might not know what I'm saying but I'm just looking for advice or links I may have over looked. Also I'd would really appreciate any advice on a whole different more efficient setup for this. I met this setup in my new job out of university and it's my first project. 
I feel it can be done better. We were also looking at gst_parse_launch() and adding them as caps but again idk what I'm saying. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From leedag8 at gmail.com Fri Mar 19 01:34:51 2021 From: leedag8 at gmail.com (JimmyHan) Date: Thu, 18 Mar 2021 20:34:51 -0500 (CDT) Subject: Help with setting Caps in src code - not using gst-parse-launch Message-ID: <1616117691251-0.post@n4.nabble.com> Hello Beginner here, Started learning and understanding more and more with practice. I have been searching up and down for examples on adding caps in code but after a few days I still don't have a clue. E.g. setting width and height, or denoise, detail, brightness, hue, frc-algorithm, and the list goes on. My boss has built a UI in html, php; where he has been able to add or insert text or choose from a drop-down list of options for the pipeline. Looks like this screenshot of my setup: My task was to add and integrate those options shown in the image. After choosing those options I then create an xml file, then read that xml file's options into variables using an xmlreader. For example, I know a video source's capabilities would include width and height. I know how to add those caps using gst-launch-1.0 or gst-parse-launch; what I really want to know is how do I get that done in code. I don't even have any idea which element has capabilities such as the others shown in the image, but for now I'd just like to know how to add them in code. I have seen e.g.
such as: GstElement *capsfilter; GstCaps *caps; capsfilter = gst_element_factory_make ("capsfilter", "capsfilter"); caps = gst_caps_new_simple ("video/x-raw", "width", G_TYPE_INT, 1920, "height", G_TYPE_INT, 1080, "framerate", GST_TYPE_FRACTION, 60, 1, NULL); g_object_set (capsfilter, "caps", caps, NULL); gst_caps_unref (caps); I'd also appreciate any idea or guide on adjusting these in code: denoise, detail, brightness, hue, frc-algorithm, etc. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ystreet00 at gmail.com Fri Mar 19 04:04:04 2021 From: ystreet00 at gmail.com (Matthew Waters) Date: Fri, 19 Mar 2021 15:04:04 +1100 Subject: webrtcbin state change issue In-Reply-To: <1616067299626-0.post@n4.nabble.com> References: <1615813831370-0.post@n4.nabble.com> <0b125490-61e1-f311-ed58-ab2b313eac43@gmail.com> <1616067299626-0.post@n4.nabble.com> Message-ID: You can set only x264enc to the READY state then, instead of the entire pipeline. That may result in a different stream configuration though, which may require a reconfiguration of the input stream (another thing that webrtcbin doesn't really support at the moment) and renegotiating the SDP. On 18/3/21 10:34 pm, Lusine wrote: > Hi Matt, > I need to change the tune & speed-preset properties of x264enc elements at > runtime. These properties are only possible to change when the x264enc is in > the ready state. That's why I was thinking to set the pipeline to the ready state, > change these properties and then recover the playing state, which doesn't > work. Is there any other way to achieve what I want? > > Regards, > Lusine > -------------- next part -------------- A non-text attachment was scrubbed...
Name: OpenPGP_signature Type: application/pgp-signature Size: 495 bytes Desc: OpenPGP digital signature URL: From gotsring at live.com Fri Mar 19 04:14:23 2021 From: gotsring at live.com (gotsring) Date: Thu, 18 Mar 2021 23:14:23 -0500 (CDT) Subject: Help with setting Caps in src code - not using gst-parse-launch In-Reply-To: <1616117691251-0.post@n4.nabble.com> References: <1616117691251-0.post@n4.nabble.com> Message-ID: <1616127263220-0.post@n4.nabble.com> I'm not entirely sure what you're asking, but I'll throw a few things at the wall to see what sticks. To set caps, you pretty much have the code already. Just add an element called "capsfilter" and link it between the pipeline elements where you need to specify the caps. For example, the pipeline videotestsrc ! video/x-raw, format=I420, width=1920, height=1080, framerate=20/1 ! autovideosink becomes (in C code) // Create the elements GstElement* testsrc = gst_element_factory_make("videotestsrc", "testsrc"); GstElement* filter = gst_element_factory_make("capsfilter", "filter"); GstElement* videosink = gst_element_factory_make("autovideosink", "videosink"); gst_bin_add_many(GST_BIN(pipeline), testsrc, filter, videosink, NULL); // Set the caps GstCaps* caps = gst_caps_new_simple( "video/x-raw", "format", G_TYPE_STRING, "I420", "width", G_TYPE_INT, 1920, "height", G_TYPE_INT, 1080, "framerate", GST_TYPE_FRACTION, 20, 1, NULL); g_object_set(G_OBJECT(filter), "caps", caps, NULL); // Link everything gst_element_link_many(testsrc, filter, videosink, NULL); Some of those other things like brightness and rotation actually look like element properties instead of caps. You can see what properties are available for a specific element by using gst-inspect-1.0. For example, 'gst-inspect-1.0 videotestsrc' shows us that there's a "pattern" property that allows us to change what test pattern the videotestsrc element produces. 
In code, we can set the pattern property using g_object_set(G_OBJECT(testsrc), "pattern", 18, NULL); This will show a moving ball instead of the regular test pattern. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From keith.thornton at zeiss.com Fri Mar 19 05:02:14 2021 From: keith.thornton at zeiss.com (Thornton, Keith) Date: Fri, 19 Mar 2021 05:02:14 +0000 Subject: AW: Help with setting Caps in src code - not using gst-parse-launch In-Reply-To: <1616117691251-0.post@n4.nabble.com> References: <1616117691251-0.post@n4.nabble.com> Message-ID: Hi, I haven't tried this myself but you might find something useful in the re-negotiation part of https://gstreamer.freedesktop.org/documentation/plugin-development/advanced/negotiation.html?gi-language=c Regards From: gstreamer-devel On behalf of JimmyHan Sent: Friday, 19 March 2021 02:35 To: gstreamer-devel at lists.freedesktop.org Subject: Help with setting Caps in src code - not using gst-parse-launch Hello Beginner here, Started learning and understanding more and more with practice. I have been searching up and down for examples on adding caps in code but after a few days I still don't have a clue. E.g. setting width and height, or denoise, detail, brightness, hue, frc-algorithm, and the list goes on. My boss has built a UI in html, php; where he has been able to add or insert text or choose from a drop-down list of options for the pipeline. Looks like this screenshot of my setup: [http://gstreamer-devel.966125.n4.nabble.com/file/t379895/rsz_screenshot_from_2021-03-18_17-24-45.png] My task was to add and integrate those options shown in the image. After choosing those options I then create an xml file, then read that xml file's options into variables using an xmlreader. For example, I know a video source's capabilities would include width and height. I know how to add those caps using gst-launch-1.0 or gst-parse-launch; what I really want to know is how do I get that done in code.
I don't even have any idea which element has capabilities such as the others shown in the image, but for now I'd just like to know how to add them in code. I have seen e.g. such as: GstElement *capsfilter; GstCaps *caps; capsfilter = gst_element_factory_make ("capsfilter", "capsfilter"); caps = gst_caps_new_simple ("video/x-raw", "width", G_TYPE_INT, 1920, "height", G_TYPE_INT, 1080, "framerate", GST_TYPE_FRACTION, 60, 1, NULL); g_object_set (capsfilter, "caps", caps, NULL); gst_caps_unref (caps); I'd also appreciate any idea or guide on adjusting these in code: denoise, detail, brightness, hue, frc-algorithm, etc. ________________________________ Sent from the GStreamer-devel mailing list archive at Nabble.com. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vladimir.tyutin at gmail.com Fri Mar 19 09:28:50 2021 From: vladimir.tyutin at gmail.com (Vladimir Tyutin) Date: Fri, 19 Mar 2021 12:28:50 +0300 Subject: gst-inspect-1.0 show nothing In-Reply-To: References: Message-ID: Hi Anton, Is pkg-config really required to get gst-inspect to work? In the log I see that plugins are parsed by the registry and added to the registry cache. pkg-config is an unknown command on my camera, so it's missing. But on another camera from Lindenis it's also missing and gst-inspect works fine. I suppose some package is not included in buildroot. But I don't know which one. What is required to get gst-inspect to print information about plugins? Thanks, Vladimir On Thu, Mar 18, 2021 at 3:35 AM Anton Pryima wrote: > Hello Vladimir. > > Try to look into pkg-config at your camera's platform and check that it > found appropriate .so files. > > Best regards, > Anton. > > On Wed, Mar 17, 2021, 18:06 Vladimir Tyutin > wrote: > >> Hi Marc, >> Yes, I tried cleaning the registry bin file. >> Tried a clean build also and gst-inspect without parameters. >> No output at all. >> When I activate GST_DEBUG=6 I see in the log that plugins seem to be parsed and >> added to the registry.
But no output with plugins details. >> I compared logs with my Ubuntu platform and log seems similar but on >> Ubuntu I see plugin details on my camera Linux nothing. >> For camera I use buildroot package to build gstreamer. May be some meson >> config parameter is missed? >> What should be used in config parameters for gstreamer to get gst-inspect >> working? >> >> > On 17 Mar 2021, at 14:23, Marc Leeman wrote: >> > >> > ?Start with clearing the registry cache >> > ~/.cache/gstreamer-1.0/registry.x86_64.bin >> > >> > Does this show up for all gst-inspect (e.g. without parameters)? >> > Does this show up after a clean rebuild? >> > >> > >> >> On Wed, 17 Mar 2021 at 11:45, Vladimir Tyutin < >> vladimir.tyutin at gmail.com> wrote: >> >> >> >> Hi all, >> >> Please give me any feedback on the issue below. >> >> Plugin registry seems to work, but gst-inspect-1.0 prints no data. >> >> >> >> Thanks, >> >> Vladimir >> >> >> >>> On Tue, Mar 16, 2021 at 7:53 PM Vladimir Tyutin < >> vladimir.tyutin at gmail.com> wrote: >> >>> >> >>> Changed the title >> >>> >> >>> ---------- Forwarded message --------- >> >>> From: Vladimir Tyutin >> >>> Date: Tue, Mar 16, 2021 at 12:56 PM >> >>> Subject: gstreamer 1.18.2 issue >> >>> To: Discussion of the development of and with GStreamer < >> gstreamer-devel at lists.freedesktop.org> >> >>> >> >>> >> >>> Hi all, >> >>> I have build gstreamer and base plugins version 1.18.2 in buildroot >> for my camera board. >> >>> Now I'm trying to verify it on the board. gst-inspect-1.0 prints the >> version: >> >>> gst-inspect-1.0 --version >> >>> gst-inspect-1.0 version 1.18.2 >> >>> GStreamer 1.18.2 >> >>> Unknown package origin >> >>> >> >>> But it I use it to show all available plugins or specific plugin like >> this: >> >>> gst-inspect-1.0 /usr/lib/gstreamer-1.0/libgstvideoscale.so >> >>> It shows nothing. 
>> >>> In the log I see the last message >> >>> default gsttracerutils.c:77:_priv_gst_tracing_init: Initializing >> GstTracer >> >>> and after that strange line: >> >>> more: : No such file or directory >> >>> and nothing else. >> >>> Please see the full log attached. >> >>> >> >>> Thanks, >> >>> Vladimir >> >> >> >> _______________________________________________ >> >> gstreamer-devel mailing list >> >> gstreamer-devel at lists.freedesktop.org >> >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >> > >> > >> > >> > -- >> > g. Marc >> > _______________________________________________ >> > gstreamer-devel mailing list >> > gstreamer-devel at lists.freedesktop.org >> > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.freedesktop.org >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >> > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From wanted002 at 163.com Fri Mar 19 09:54:56 2021 From: wanted002 at 163.com (wanted002) Date: Fri, 19 Mar 2021 04:54:56 -0500 (CDT) Subject: Don't kown how to make avdec_h264 working under multi-threads modes, seems not working with max-threads In-Reply-To: <50b192f8634c89fdb8277d26ab9007bd4d672767.camel@ndufresne.ca> References: <1615985169511-0.post@n4.nabble.com> <1616038344494-0.post@n4.nabble.com> <1616046496522-0.post@n4.nabble.com> <50b192f8634c89fdb8277d26ab9007bd4d672767.camel@ndufresne.ca> Message-ID: <1616147696623-0.post@n4.nabble.com> Thanks for your help, Nicolas. Since I'm a fresh on video and gstreamer, so Further more please :) 1) Seems slice base *threading* need the cooperation from encoder side. 
So I need to checkout the slice number in the encoded frame first ? Then I can decide the max-threads value according to it . 2) I should use thread-type=slice ? But that's introduced in gstreamer 1.18 , may be I need to upgrade my gstreamer 3) "The paralellism still vary on the encoding of references, since sometimes you have to decode the reference before you can do anything else. " ----for this, I'm a little puzzled. "references" means the I frame? And I need to decode the I frame 1st for the sync between multi-threads ? Thanks again. Best wishes ! -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From activecraft at gmail.com Fri Mar 19 10:52:12 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Fri, 19 Mar 2021 16:22:12 +0530 Subject: Issue with Command Message-ID: I am running below command and getting an error, please help. root at ip-172-31-31-90:~# gst-launch-1.0 v4l2src num-buffers=1 ! jpegenc quality=100 ! multifilesink location=/var/www/html/public/img/IMG%06d.jpg Setting pipeline to PAUSED ... ERROR: Pipeline doesn't want to pause. ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Cannot identify device '/dev/video0'. Additional debug info: v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: system error: No such file or directory Setting pipeline to NULL ... Freeing pipeline ... root at ip-172-31-31-90:~# Anurag Biala +91-9814808323 | +1(646)-797-2775 SKYPE: activecraft at hotmail.com | Gmail activecraft at gmail.com Website: https://www.activecraft.com * |* Email info at activecraft.com Website Design & Develop + App Design & Develop + SEO/SMM + Graphic Design + UI/UX CONFIDENTIALITY NOTICE: The information in this email may be confidential and/or privileged. This email is intended to be reviewed by only the individual or organization named above. 
If you are not the intended recipient or an authorized representative of the intended recipient, you are hereby notified that any review, dissemination or copying of this email and its attachments, if any, or the information contained herein is prohibited. If you have received this email in error, please immediately notify the sender by return email and delete this email from your system. -------------- next part -------------- An HTML attachment was scrubbed... URL: From renjith.thankachan at matrixcomsec.com Fri Mar 19 11:40:42 2021 From: renjith.thankachan at matrixcomsec.com (renjith.t) Date: Fri, 19 Mar 2021 06:40:42 -0500 (CDT) Subject: RTSP Client sometimes doesn't send video (stream 0) packets to the client In-Reply-To: References: <1615468228154-0.post@n4.nabble.com> Message-ID: <1616154042215-0.post@n4.nabble.com> Hello All, Its not an issue with the GET_PARAMETERS Request. GET_PARAMETERS request is sent by the client, it is also received and processed further by the rtsp server. We took time to analyze this issue, and below is our observation - gst_rtsp_watch_write_serialized_messages :- function writes the stream on to the socket. - If the messages to be sent to the clients is in big numbers, then some messages are not sent to the client, this function adds the pending messages to the client's watch and allocates the dispatcher (gst_rtsp_source_dispatch_write) for sending those pending messages, and sets a sequence no in the last message. - when gst_rtsp_source_dispatch_write is invoked, it is responsible to send those pending messages to the client. when all messages are sent successfully, the callback for message_sent is called, which in-turn will reset the sequence number. 
*Now the problem arises in the below situation* (it occurs when the number of TCP clients for a single shared stream is huge - I reproduced this issue with 10-15 TCP clients (5-7 clients per PC) with a resolution of 2048 X 1536 @ 25fps, H.264 codec) - gst_rtsp_watch_write_serialized_messages was not able to write the messages completely - it added the pending messages to the client's watch & assigned the dispatcher for it - the above was done for all the 5-6 clients on the same PC - now the dispatcher came into action - the dispatcher took clients one by one and started to send the messages from the client's watch - before the dispatcher could start sending messages to the last client, the next message for that client was scheduled to be sent through gst_rtsp_watch_write_serialized_messages (*I believe it is a case of context switching*) - now send_tcp_message of rtsp-stream.c came into action - it tried to send a message to the client from check_transport_backlog, where it failed - on failing, it removed the transport of the client using update_transport - hence no video was displayed on the client side from then on - after that the dispatcher came into the picture once again and delivered the message to the client - it was only after the transport was removed that the client's object received the message-sent notification for the sequence which was set earlier. On a trial basis, I stopped removing the transport, and it worked fine with 15-20 clients for around 15 hrs. Another observation I came across was - every time, the dispatcher was called from the same thread (even for different clients of different streams) - the same thread was also used to send messages to the other clients. Any help or solution to this problem is heartily welcomed.
Best Regards -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From kepitto at outlook.com Fri Mar 19 12:00:36 2021 From: kepitto at outlook.com (kepitto) Date: Fri, 19 Mar 2021 07:00:36 -0500 (CDT) Subject: Creating multiple udp sources? Message-ID: <1616155236596-0.post@n4.nabble.com> I'm trying to receive multiple udp streams mix them together with videomixer encode + mux them into a file. I know there is a multiudpsink, but how does one implement multiple udpsrc? My attempt was through a pointer-Array like: GstElement* udpsrc[COUNT]; I create the elements in loops afterwards and link them in a loop to the videomixer. This doesn't seem like a good solution. What's the proper way to implement this? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Fri Mar 19 12:21:40 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Fri, 19 Mar 2021 08:21:40 -0400 Subject: Unable to set Capabilities on v4l2src using C In-Reply-To: <1616100277064-0.post@n4.nabble.com> References: <1616094359820-0.post@n4.nabble.com> <1616100277064-0.post@n4.nabble.com> Message-ID: Le jeu. 18 mars 2021 17 h 45, iotsystek a ?crit : > iotsystek wrote > > Setup: > > I am using GStreamer 1.18.2 on embedded Linux with a couple of USB > cameras > > and sending the video to Windows. > > > > I have a working pipeline. > > USB Cameras attached to the embedded computer using the following command > > example: > > > > gst-launch-1.0 v4l2src device=/dev/video0 ! \ > > 'video/x-raw, width=640, height=480' ! \ > > videorate max-rate=6 ! \ > > videoconvert ! \ > > x264enc pass=qual quantizer=20 tune=zerolatency ! \ > > rtph264pay ! \ > > udpsink host=192.168.168.32 port=1234 > > > > The PC receives and displays the streams with no issue. > > > > Now I am attempting to translate the gst-launch-1.0 command into C code. 
> > I > > have been totally unsuccessful in setting the capabilities ('video/x-raw, > > width=640, height=480') on the v4l2src, which does not have a ?caps? > > property. > > > > I would be grateful for both a code snippet / example detailing the C > > source > > code needed to do this and guidance as to where I might look to find this > > kind of answer directly. Is there a repository of GStreamer version 1.0 > C > > Code snipets? > > > > Also related to the above. Once I have this working I will need to be > > able > > to change these camera height and width capabilities. > > > > I also have a socket connection (totally unrelated to GStreamer) to the > > embedded Linux system and use it for control and feedback of additional > > hardware. I will need to raise and lower the cameras width and height > > while > > it is running via this socket connection. Will I have to totally stop > the > > video flow or is there a way to adjust these setting on the fly. > > > > Again the code snippet would be greatly appreciated. > > > > > > > > > > -- > > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > > _______________________________________________ > > gstreamer-devel mailing list > > > gstreamer-devel at .freedesktop > > > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > Got it working? > > It turns out that: > 'video/x-raw, width=640, height=480' ! \ > Translates to: > capsfilter caps='video/x-raw, width=800, height=600' ! \ > > capsfilter has a caps property which is easily set and capsfilter is > applied > to the preceding element in the chain. > > So at this point. > Any help or suggested best practices on changing parameters (width, height) > on the fly would be greatly appreciated. > I think you want to use the device provider to get the list of caps. Whatever you want to set on caps filter, you should check that it intersect with the capabilities from the provider. Then it's just a matter of setting the caps. 
It can be changed at runtime, note that the change is asynchronous, expect few frames delay. > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gotsring at live.com Fri Mar 19 14:29:29 2021 From: gotsring at live.com (gotsring) Date: Fri, 19 Mar 2021 09:29:29 -0500 (CDT) Subject: Issue with Command In-Reply-To: References: Message-ID: <1616164169370-0.post@n4.nabble.com> It looks like /dev/video0 doesn't exist. What are you trying to play? Try setting the device property on v4l2src to something that exists. See docs here: https://gstreamer.freedesktop.org/documentation/video4linux2/v4l2src.html?gi-language=c -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From activecraft at gmail.com Fri Mar 19 14:35:06 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Fri, 19 Mar 2021 20:05:06 +0530 Subject: Issue with Command In-Reply-To: <1616164169370-0.post@n4.nabble.com> References: <1616164169370-0.post@n4.nabble.com> Message-ID: Hi, Can you please update in which file we have to set device property. Please provide a path of the gstreamer file location. We are new. Anurag Biala On Fri, Mar 19, 2021 at 7:59 PM gotsring wrote: > It looks like /dev/video0 doesn't exist. > What are you trying to play? Try setting the device property on v4l2src to > something that exists. > > See docs here: > > https://gstreamer.freedesktop.org/documentation/video4linux2/v4l2src.html?gi-language=c > > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gotsring at live.com Fri Mar 19 14:59:48 2021 From: gotsring at live.com (gotsring) Date: Fri, 19 Mar 2021 09:59:48 -0500 (CDT) Subject: Creating multiple udp sources? In-Reply-To: <1616155236596-0.post@n4.nabble.com> References: <1616155236596-0.post@n4.nabble.com> Message-ID: <1616165988875-0.post@n4.nabble.com> Why not just add several udpsrc elements to the same pipeline, then mix? Also, I think compositor is preferred over videomixer. Something like... gst-launch-1.0.exe compositor name=c sink_1::xpos=640 sink_2::ypos=480 sink_3::xpos=640 sink_3::ypos=480 ! videoconvert ! x264enc ! h264parse ! matroskamux ! filesink location=out.mkv \ udpsrc name=src0 ! decodebin ! c. \ udpsrc name=src1 ! decodebin ! c. \ udpsrc name=src2 ! decodebin ! c. \ udpsrc name=src3 ! decodebin ! c.
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From gotsring at live.com Fri Mar 19 15:10:39 2021 From: gotsring at live.com (gotsring) Date: Fri, 19 Mar 2021 10:10:39 -0500 (CDT) Subject: Issue with Command In-Reply-To: References: <1616164169370-0.post@n4.nabble.com> Message-ID: <1616166639187-0.post@n4.nabble.com> There is not a file you edit, it is part of the gst-launch-1.0 command. Something like: gst-launch-1.0 v4l2src num-buffers=1 device=/dev/video5 ! jpegenc quality=100 ! multifilesink location=/var/www/html/public/img/IMG%06d.jpg This is pretty basic usage. I suggest you Google examples of the gst-launch command, there are several out there. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From perini.davide at dpsoftware.org Fri Mar 19 15:26:45 2021 From: perini.davide at dpsoftware.org (Davide Perini) Date: Fri, 19 Mar 2021 16:26:45 +0100 Subject: GStreamer 1.20 Message-ID: <4f46bd91-c586-43f9-64a9-05e0700540a9@dpsoftware.org> Hi guys, sorry for the question but I am waiting for some cool new feature in GStreamer 1.20 (d3d11desktopdupsrc), is there any plan to release GStreamer 1.20 in march or april? Thank you!!! Davide From activecraft at gmail.com Fri Mar 19 16:30:51 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Fri, 19 Mar 2021 22:00:51 +0530 Subject: Issue with Command In-Reply-To: <1616166639187-0.post@n4.nabble.com> References: <1616164169370-0.post@n4.nabble.com> <1616166639187-0.post@n4.nabble.com> Message-ID: I tried below command and same error. Gstreamer is installed at AWS, so is there any issue of fetching webcam. gst-launch-1.0 v4l2src num-buffers=1 device=/dev/video5 ! jpegenc quality=100 ! multifilesink location=/var/www/html/public/img/IMG%06d.jpg Setting pipeline to PAUSED ... ERROR: Pipeline doesn't want to pause. ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Cannot identify device '/dev/video5'. 
Additional debug info: v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: system error: No such file or directory Setting pipeline to NULL ... Freeing pipeline ... Anurag Biala On Fri, Mar 19, 2021 at 8:40 PM gotsring wrote: > There is not a file you edit, it is part of the gst-launch-1.0 command. > > Something like: > gst-launch-1.0 v4l2src num-buffers=1 device=/dev/video5 ! jpegenc > quality=100 ! multifilesink location=/var/www/html/public/img/IMG%06d.jpg > > This is pretty basic usage. > I suggest you Google examples of the gst-launch command, there are several > out there. > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From gotsring at live.com Fri Mar 19 17:00:03 2021 From: gotsring at live.com (gotsring) Date: Fri, 19 Mar 2021 12:00:03 -0500 (CDT) Subject: Issue with Command In-Reply-To: References: <1616164169370-0.post@n4.nabble.com> <1616166639187-0.post@n4.nabble.com> Message-ID: <1616173203380-0.post@n4.nabble.com> I have not ever dealt with GStreamer on a remote device or AWS. You should make sure that the webcam you are trying to access is even recognized by the OS (try playing it with VLC or something first). Typically, video devices are listed under /dev/video*, but this may not be the case for you. This sounds like something you have to sort out on your end; this is not a GStreamer issue. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From activecraft at gmail.com Fri Mar 19 17:02:30 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Fri, 19 Mar 2021 22:32:30 +0530 Subject: Issue with Command In-Reply-To: <1616173203380-0.post@n4.nabble.com> References: <1616164169370-0.post@n4.nabble.com> <1616166639187-0.post@n4.nabble.com> <1616173203380-0.post@n4.nabble.com> Message-ID: GStreamer is installed at AWS and we are running the command on the website. Does GStreamer work only on a local PC? Anurag Biala On Fri, Mar 19, 2021 at 10:30 PM gotsring wrote: > I have not ever dealt with GStreamer on a remote device or AWS. You should > make sure that the webcam you are trying to access is even recognized by > the > OS (try playing it with VLC or something first). Typically, video devices > are listed under /dev/video*, but this may not be the case for you. > > This sounds like something you have to sort out on your end; this is not a > GStreamer issue. > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From trey.hutcheson at gmail.com Fri Mar 19 17:20:34 2021 From: trey.hutcheson at gmail.com (Trey Hutcheson) Date: Fri, 19 Mar 2021 12:20:34 -0500 Subject: webrtcbin and caps on sink pads Message-ID: Ok, so I know that before generating an offer/answer, we need to wait on all sink pads to get caps from upstream. But we are encountering a problem: sometimes caps are never sent. Example: * webrtcbin1 is receiving audio and video from its remote peer. Each incoming media stream is terminated by a tee element. * I create a new webrtcbin instance, and branch off the tees hanging off webrtcbin1; each branch contains a queue and then depayloads and repayloads the rtp, and is connected via new sink pads on webrtcbin2. Repeat that process for each new webrtc peer/webrtcbin instance that needs to participate in the media session. Problem #1: the first time a branch is created from the source tee, caps are sent downstream. However, new branches never receive caps. To get around that, we explicitly create a caps event and send it down the branch.
Problem #2: sometimes that caps event takes several seconds (2 seconds, 5 seconds, it's indeterminate) to actually get to the new sink pad. And I have no idea why. This delay creates all kinds of problems for us internally. Why is the tee element not sending caps down to new elements after they are linked? Is it supposed to? If it's not, is there any built-in element that will function like a tee, cache the last known caps, and send them downstream when new src pads are linked? -------------- next part -------------- An HTML attachment was scrubbed... URL: From jpmelian at gmail.com Fri Mar 19 17:54:25 2021 From: jpmelian at gmail.com (JPM) Date: Fri, 19 Mar 2021 12:54:25 -0500 (CDT) Subject: input-selector problem in Android Message-ID: <1616176465041-0.post@n4.nabble.com> Hi, In Android Tutorial3 I have added this new native function: static void gst_native_acceso (JNIEnv* env, jobject thiz, int lan) { CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id); GstElement *selector = gst_bin_get_by_name (GST_BIN (data->pipeline), "sel"); gint nb_sources; GstPad *active_pad, *new_pad; gchar *active_name; g_object_get (G_OBJECT (selector), "n-pads", &nb_sources, NULL); g_object_get (G_OBJECT (selector), "active-pad", &active_pad, NULL); active_name = gst_pad_get_name(active_pad); gchar *message = g_strdup_printf("active_name: %s , n-pads: %d", active_name, nb_sources); set_ui_message(message, data); g_free (message); } The pipeline is: data->pipeline = gst_parse_launch("input-selector name=sel udpsrc multicast-group=224.1.1.1 auto-multicast=true port=6000 ! \ image/jpeg,width=1280,height=720,framerate=31/1 ! jpegparse ! jpegdec ! sel. tcpclientsrc host=80.28.166.18 port=6001 ! \ h264parse ! avdec_h264 ! sel. sel. ! glimagesink sync=false", &error); The problem is that the active-pad name is "(NULL)" when it should be "sink_0" or "sink_1". Any help? Thanks and best regards.
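For what it's worth, one defensive variant of that lookup (an untested sketch, assuming the selector is named "sel" as in the pipeline above, and that the hypothetical helper name is ours, not from the tutorial) checks whether "active-pad" is NULL and selects a pad explicitly before reading its name, since input-selector may not have chosen a pad yet:

```c
#include <gst/gst.h>

/* Hypothetical helper, a sketch only: make sure input-selector "sel"
 * has an active pad before reading its name. */
static void
ensure_active_pad (GstElement *pipeline)
{
  GstElement *selector = gst_bin_get_by_name (GST_BIN (pipeline), "sel");
  GstPad *active = NULL;

  if (selector == NULL)
    return;

  g_object_get (selector, "active-pad", &active, NULL);
  if (active == NULL) {
    /* Nothing selected yet: explicitly activate the first request pad. */
    GstPad *sink0 = gst_element_get_static_pad (selector, "sink_0");
    if (sink0 != NULL) {
      g_object_set (selector, "active-pad", sink0, NULL);
      gst_object_unref (sink0);
      g_object_get (selector, "active-pad", &active, NULL);
    }
  }
  if (active != NULL) {
    gchar *name = gst_pad_get_name (active);
    g_print ("active pad: %s\n", name);
    g_free (name);
    gst_object_unref (active);
  }
  gst_object_unref (selector);
}
```

Guarding active_pad against NULL also avoids the "(NULL)" string the original code prints when gst_pad_get_name is handed a NULL pad.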
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Fri Mar 19 18:54:44 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Fri, 19 Mar 2021 14:54:44 -0400 Subject: Don't kown how to make avdec_h264 working under multi-threads modes, seems not working with max-threads In-Reply-To: <1616147696623-0.post@n4.nabble.com> References: <1615985169511-0.post@n4.nabble.com> <1616038344494-0.post@n4.nabble.com> <1616046496522-0.post@n4.nabble.com> <50b192f8634c89fdb8277d26ab9007bd4d672767.camel@ndufresne.ca> <1616147696623-0.post@n4.nabble.com> Message-ID: <46659ff832cde2c517a1d96b9cc776ec9721b439.camel@ndufresne.ca> On Friday, 19 March 2021 at 04:54 -0500, wanted002 wrote: > Thanks for your help, Nicolas. Since I'm new to video and gstreamer, > a little more guidance please :) > 1) It seems slice-based threading needs cooperation from the encoder side. So I > need to check the slice count in the encoded frame first? Then I can > decide the max-threads value according to it. Correct, that setting will depend on the encoder you use, of course. An example: for openh264enc, slice-mode=n-slices and num-slices=N will do. > 2) I should use thread-type=slice? But that's introduced in gstreamer 1.18, > so maybe I need to upgrade my gstreamer. Oh, oops, well, or backport the changes. > 3) "The parallelism still varies with the encoding of references, since > sometimes you have to decode the reference before you can do anything else." > ---- for this, I'm a little puzzled. Does "references" mean the I-frame? And do I > need to decode the I-frame first for the sync between multiple threads? A reference frame is a frame used to decode other frames. The compression method used in H.264 includes the ability to start from a previous frame and edit it (moving some blocks, stretching from other blocks) in order to reconstruct a similar image. If you haven't decoded that reference frame yet, it's not really possible to decode.
Decoders can be fancy, of course, and wait until the specific block is ready. > Thanks again. Best wishes! > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From olivier.crete at collabora.com Fri Mar 19 19:09:51 2021 From: olivier.crete at collabora.com (Olivier Crête) Date: Fri, 19 Mar 2021 15:09:51 -0400 Subject: webrtcbin and caps on sink pads In-Reply-To: References: Message-ID: On Fri, 2021-03-19 at 12:20 -0500, Trey Hutcheson wrote: > Ok, so I know that before generating an offer/answer, we need to > wait on all sink pads to get caps from upstream. But we are > encountering a problem: sometimes caps are never sent. You can also skip all of that by just setting the exact codec you will receive as the codec preferences on the transceiver itself. > > Example: > * webrtcbin1 is receiving audio and video from its remote peer. Each > incoming media stream is terminated by a tee element. > * I create a new webrtcbin instance, and branch off the tees hanging > off webrtcbin1; each branch contains a queue and then depayloads and > repayloads the rtp, connecting via new sink pads on webrtcbin2. > > Repeat that process for each new webrtc peer/webrtcbin instance that > needs to participate in the media session. > > Problem #1: the first time a branch is created from the source tee, > caps are sent downstream. However, new branches never receive caps. > To get around that, we explicitly create a caps event and send it > down the branch. > > Problem #2: sometimes that caps event takes several seconds (2 > seconds, 5 seconds, it's indeterminate) to actually get to the new > sink pad. And I have no idea why. This delay creates all kinds of > problems for us internally. Possibly blocking on a keyframe in the parser?
> > Why is the tee element not sending caps down to new elements after > they are linked? Is it supposed to? If it's not, is there any built > in element that will function like a tee, cache the last known caps, > and send them downstream when new src pads are linked? Tee should do exactly that already. -- Olivier Crête olivier.crete at collabora.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From activecraft at gmail.com Fri Mar 19 21:25:34 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Sat, 20 Mar 2021 02:55:34 +0530 Subject: Help needed Message-ID: We have installed GStreamer on AWS Ubuntu. Now when I run the command below, how will my PC webcam work? What steps are needed so that a webcam can be fetched? root at ip-172-31-31-90:~# gst-launch-1.0 v4l2src num-buffers=1 ! jpegenc quality=100 ! multifilesink location=/var/www/html/public/img/IMG%06d.jpg Setting pipeline to PAUSED ... ERROR: Pipeline doesn't want to pause. ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Cannot identify device '/dev/video0'. Additional debug info: v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: system error: No such file or directory Setting pipeline to NULL ... Freeing pipeline ... root at ip-172-31-31-90:~# Anurag Biala +91-9814808323 | +1(646)-797-2775 -------------- next part -------------- An HTML attachment was scrubbed... URL: From giri_2984 at yahoo.co.in Fri Mar 19 22:15:48 2021 From: giri_2984 at yahoo.co.in (RK29) Date: Fri, 19 Mar 2021 17:15:48 -0500 (CDT) Subject: Getting AUdio Video Stream using Kinesis Message-ID: <1616192148765-0.post@n4.nabble.com> Hello There, I can get the video with the below GST command but cannot get audio: gst-launch-1.0 rtspsrc location="rtsp://192.168.1.111:554/" short-header=TRUE ! rtph264depay ! video/x-h264, format=avc,alignment=au ! h264parse ! kvssink stream-name="test-stream" aws-region="east-1" retention-period=1 Can someone recommend a solution? My audio is AAC. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From giri_2984 at yahoo.co.in Fri Mar 19 22:17:51 2021 From: giri_2984 at yahoo.co.in (RK29) Date: Fri, 19 Mar 2021 17:17:51 -0500 (CDT) Subject: Gstreamer not able to connect with RTSP source In-Reply-To: <1585285038396-0.post@n4.nabble.com> References: <1585285038396-0.post@n4.nabble.com> Message-ID: <1616192271102-0.post@n4.nabble.com> Did you find the answer? I am getting only the video, not the audio: gst-launch-1.0 rtspsrc location="rtsp://192.168.1.111:554/" short-header=TRUE ! rtph264depay ! video/x-h264, format=avc,alignment=au ! h264parse ! kvssink stream-name="test-stream" aws-region="east-1" retention-period=1 Any thoughts for audio?
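One common pattern for sending both streams from an RTSP source into a single sink is to name the rtspsrc and kvssink elements and link each branch explicitly. The sketch below is untested and makes assumptions beyond the thread: it assumes the camera delivers its AAC audio as RTP MP4-generic (so rtpmp4gdepay is the right depayloader, which varies by camera) and that the kvssink build in use accepts an audio branch.

```shell
# Untested sketch: video and audio branches from one rtspsrc into kvssink.
# Assumption: audio is RTP AAC (MP4-generic); swap rtpmp4gdepay for the
# depayloader your camera actually announces in the SDP.
gst-launch-1.0 rtspsrc location="rtsp://192.168.1.111:554/" short-header=TRUE name=src \
  src. ! queue ! rtph264depay ! video/x-h264, format=avc,alignment=au ! h264parse \
       ! kvssink name=sink stream-name="test-stream" aws-region="east-1" retention-period=1 \
  src. ! queue ! rtpmp4gdepay ! aacparse ! sink.
```

Running rtspsrc with -v first shows which payloads the camera actually sends, which settles the depayloader choice.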
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From gotsring at live.com Fri Mar 19 22:19:06 2021 From: gotsring at live.com (gotsring) Date: Fri, 19 Mar 2021 17:19:06 -0500 (CDT) Subject: Help needed In-Reply-To: References: Message-ID: <1616192346803-0.post@n4.nabble.com> What are you trying to do overall? What is the AWS instance for? Storing images? Does your AWS instance have a camera connected to it, or is your camera only connected to your local PC? If your webcam is on your PC, then you will have to run the command on your PC, not the AWS console. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From activecraft at gmail.com Fri Mar 19 23:15:04 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Sat, 20 Mar 2021 04:45:04 +0530 Subject: Help needed In-Reply-To: <1616192346803-0.post@n4.nabble.com> References: <1616192346803-0.post@n4.nabble.com> Message-ID: If I run the command on my PC, this is the error: D:\>gst-launch-1.0 v4l2src0 num-buffers=1 device="/dev/video0" ! jpegenc quality=100 ! multifilesink location=/var/www/html/public/img/IMG%06d.jpg 'gst-launch-1.0' is not recognized as an internal or external command, operable program or batch file. Our aim: GStreamer and Ubuntu are installed at AWS. Now I want to run commands on a live website so the website can access the user's webcam (whoever opened the site in a browser) and live stream to the browser. Can live websites fetch cameras? Anurag Biala +91-9814808323 | +1(646)-797-2775 On Sat, Mar 20, 2021 at 3:49 AM gotsring wrote: > What are you trying to do overall? > What is the AWS instance for? Storing images? Does your AWS instance have a > camera connected to it, or is your camera only connected to your local PC? > > If your webcam is on your PC, then you will have to run the command on your > PC, not the AWS console. > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From patrickcusack at mac.com Fri Mar 19 23:21:54 2021 From: patrickcusack at mac.com (Patrick Cusack) Date: Fri, 19 Mar 2021 16:21:54 -0700 Subject: gstreamer and BigSur In-Reply-To: References: Message-ID: <62B99327-B9CA-4305-8A76-D5D9F6F245A5@mac.com> Ok. Thanks. Sent from my iPhone > On Mar 16, 2021, at 1:54 PM, Nicolas Dufresne wrote: > > On Tuesday, 16 March 2021 at 11:59 -0700, Patrick Cusack wrote: >> Question: Does gstreamer on macOS have a sink that utilizes Metal or does it >> still use OpenGL? > > There is no Metal support in GStreamer. So far, glimagesink has been used, and I > think Vulkan sink can be used with MoltenVK. Metal is quite niche in the context > of GStreamer project, but if someone is willing to contribute and maintain > support for that, patches welcome.
> >> Thanks, >> Patrick >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.freedesktop.org >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From gotsring at live.com Sat Mar 20 00:42:19 2021 From: gotsring at live.com (gotsring) Date: Fri, 19 Mar 2021 19:42:19 -0500 (CDT) Subject: Help needed In-Reply-To: References: <1616192346803-0.post@n4.nabble.com> Message-ID: <1616200939999-0.post@n4.nabble.com> So you're making a web app that will access the client's webcam and record stuff? This is beyond my experience, but I believe this can be achieved with WebRTC. Perhaps look into this, there are no doubt examples on the Internet. I'm not sure if GStreamer can run in web apps, but either way, I don't think this is something you can achieve using gst-launch-1.0 commands. Someone please correct me if I'm wrong. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From anilkumar03006 at hotmail.com Fri Mar 19 09:12:33 2021 From: anilkumar03006 at hotmail.com (anil0407) Date: Fri, 19 Mar 2021 04:12:33 -0500 (CDT) Subject: gst pad request template. Message-ID: <1616145153552-0.post@n4.nabble.com> Hi All, Is there any gst-template plugin that uses GST_PAD_REQUEST in its GST_STATIC_PAD_TEMPLATE? If not, please suggest the changes required to the gst-template myfilter plugin, which uses GST_PAD_ALWAYS. Thanks, Anil -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From activecraft at gmail.com Sat Mar 20 19:34:21 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Sun, 21 Mar 2021 01:04:21 +0530 Subject: Stream to web browser Message-ID: Hi, gst-launch-1.0 videotestsrc \ ! queue ! vp8enc ! rtpvp8pay \ ! application/x-rtp,media=video,encoding-name=VP8,payload=96 \ !
webrtcbin name=sendrecv Can someone help in consuming this pipeline with a Laravel-based server to display the stream in a web browser? Anurag Biala +91-9814808323 | +1(646)-797-2775 -------------- next part -------------- An HTML attachment was scrubbed... URL: From kepitto at outlook.com Mon Mar 22 07:28:15 2021 From: kepitto at outlook.com (kepitto) Date: Mon, 22 Mar 2021 02:28:15 -0500 (CDT) Subject: Creating multiple udp sources? In-Reply-To: <1616165988875-0.post@n4.nabble.com> References: <1616155236596-0.post@n4.nabble.com> <1616165988875-0.post@n4.nabble.com> Message-ID: <1616398095996-0.post@n4.nabble.com> That's basically what I'm doing with my approach in C. I thought maybe there's an element I missed that handles the variable input size better. Also, I tried the compositor, but it freezes up after ~30 seconds of recording, which doesn't happen with videomixer.
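One thing worth ruling out with a compositor freeze is starvation at the aggregator: compositor waits on all of its sink pads, so each live branch usually needs its own queue so one stalled source cannot block the mixing thread. A minimal untested sketch; the ports, payload type, and codec here are assumptions, not taken from the thread:

```shell
# Untested sketch: two live UDP branches into compositor, one queue per
# branch. Ports/caps are placeholders; substitute your own.
gst-launch-1.0 compositor name=mix sink_1::xpos=640 \
  ! videoconvert ! autovideosink \
  udpsrc port=5000 caps="application/x-rtp,media=video,encoding-name=VP8,payload=96" \
  ! rtpvp8depay ! vp8dec ! queue ! mix.sink_0 \
  udpsrc port=5001 caps="application/x-rtp,media=video,encoding-name=VP8,payload=96" \
  ! rtpvp8depay ! vp8dec ! queue ! mix.sink_1
```

If the freeze persists with queues in place, comparing latency handling between compositor and videomixer (e.g. with GST_DEBUG logging on the aggregator) would be the next step.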
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From karim_atiki at hotmail.com Mon Mar 22 10:15:08 2021 From: karim_atiki at hotmail.com (karimchtx) Date: Mon, 22 Mar 2021 05:15:08 -0500 (CDT) Subject: Cannot play a 4K mkv content with v4l2h265dec plugin Message-ID: <1616408108829-0.post@n4.nabble.com> Hi,I'm currently trying to play a 4K content on an IMX8 based board, an apalis imx8.I planed to use the v4l2h265dec decoder, as at first glance, the imx vpudec plugin is not available for this processor, though it's supported on imx8m and mx8mm.I've designed the following pipeline:root at apalis-imx8:~# XDG_RUNTIME_DIR=/run/user/0 gst-launch-1.0 -v filesrc location=/media/sda1/4K_content.2160p.UHD.BLURAY.REMUX.HDR.HEVC.x265.mkv ! matroskademux name=d d. ! queue ! h265parse ! v4l2h265dec ! waylandsinkUnfortunately, I systematicaly get the following error, whatever I try to change the pipeline. Setting pipeline to PAUSED ...Pipeline is PREROLLED ...Got context from element 'sink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayWayland\)\ gldisplaywayland0";Setting pipeline to PLAYING ...New clock: GstSystemClock/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h265, level=(string)5.1, tier=(string)high, profile=(string)main-10, codec_data=(buffer)012220000000b0000000000099f000fcfdfafa00000f04a00001002140010c01fff00b00000030000030099148c0c00000fa40001770140a10001003d420101222000000300b00000030000030099a001e020021c4db148e490a50bc06d4244026d9400000fa4000177018930397800057bcc0017d7be78f1e8a2000100084401c072f69bfb64a70001003d4e010538428219561c292e3a1fdf34154454d4520546974616e2046696c6520332e382e31362028342e382e31362e302920202020202080, stream-format=(string)hvc1, alignment=(string)au, width=(int)3840, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, fraction)24000/1001/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h265, level=(string)5.1, tier=(string)high, profile=(string)main-10, 
codec_data=(buffer)012220000000b0000000000099f000fcfdfafa00000f04a00001002140010c01fff00b00000030000030099148c0c00000fa40001770140a10001003d420101222000000300b00000030000030099a001e020021c4db148e490a50bc06d4244026d9400000fa4000177018930397800057bcc0017d7be78f1e8a2000100084401c072f69bfb64a70001003d4e010538428219561c292e3a1fdf34154454d4520546974616e2046696c6520332e382e31362028342e382e31362e302920202020202080, stream-format=(string)hvc1, alignment=(string)au, width=(int)3840, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, fraction)24000/1001/GstPipeline:pipeline0/GstH265Parse:h265parse0.GstPad:sink: caps = video/x-h265, level=(string)5.1, tier=(string)high, profile=(string)main-10, codec_data=(buffer)012220000000b0000000000099f000fcfdfafa00000f04a0000100214002000000300b00000030000030099148c0c00000fa40001770140a10001003d420101222000000300b00000030000030099a001e020021c4db148e490a50bc06d4244026d9400000fa4000177018930397800057bcc0017d7be78f1e8a2000100084401c072f69bfb64a70001003d4ec9bb89248219561c292e3a1fdf34154454d4520546974616e2046696c6520332e382e31362028342e382e31362e302920202020202080, stream-format=(string)hvc1, alignment=(string)au, width=(int)3840, height=(int)2160, pixel-aspect-ratio=(fractimerate=(fraction)24000/1001/GstPipeline:pipeline0/GstH265Parse:h265parse0.GstPad:src: caps = video/x-h265, level=(string)5.1, tier=(string)high, profile=(string)main-10, stream-format=(string)byte-stream, alignment=(string)au, width=(int)3840, heigh, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)24000/1001, chroma-format=(string)4:2:0, bit-depth-luma=(uint)10, bit-depth-chroma=(uint)10, parsed=(boolean)true/GstPipeline:pipeline0/v4l2h265dec:v4l2h265dec0.GstPad:sink: caps = video/x-h265, level=(string)5.1, tier=(string)high, profile=(string)main-10, stream-format=(string)byte-stream, alignment=(string)au, width=(int)3840, hei60, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)24000/1001, chroma-format=(string)4:2:0, bit-depth-luma=(uint)10, 
bit-depth-chroma=(uint)10, parsed=(boolean)true ERROR: from element /GstPipeline:pipeline0/GstMatroskaDemux:d: Internal data stream error. Additional debug info: ../git/gst/matroska/matroska-demux.c(5715): gst_matroska_demux_loop (): /GstPipeline:pipeline0/GstMatroskaDemux:d: streaming stopped, reason not-negotiated (-4) Execution ended after 0:00:00.131678311 Setting pipeline to PAUSED ... Setting pipeline to READY ... Setting pipeline to NULL ... Total showed frames (0), playing for (0:00:00.132085050), fps (0.000). Is there something obviously wrong in my pipeline? Thanks for your feedback. K. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From crg7475 at mailbox.org Mon Mar 22 10:22:57 2021 From: crg7475 at mailbox.org (Carlos Rafael Giani) Date: Mon, 22 Mar 2021 11:22:57 +0100 Subject: GStreamer and MPEG-H 3D audio Message-ID: <3df89edd-bed9-cc54-07ac-d943c92cc64d@mailbox.org> Has anyone begun to work on adding MPEG-H 3D audio support to GStreamer? In particular, this would involve qtdemux (to parse MPEG-H audio atoms) and alsasink (to output encoded MPEG-H 3D audio directly to a DSP). I've been looking for something like that, but did not find any attempts at actually adding that support. Does anyone else know more? From 330271189 at qq.com Mon Mar 22 03:50:06 2021 From: 330271189 at qq.com (strong) Date: Mon, 22 Mar 2021 11:50:06 +0800 Subject: how to use avdec_g726 to decode g726 raw data Message-ID: Hi Nicolas, thanks for your help. Now I can make avdec_g726 work, like: gst-launch-1.0 filesrc location=8000_1_b5.g726 do-timestamp=true blocksize=100 ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726, format=S16LE ! avdec_g726 !
alsasink sync=false But another question: "mute" cannot be set when the "volume" plugin is used. For example, this command line works: gst-launch-1.0 filesrc location=8000_1_b5.g726 do-timestamp=true blocksize=100 ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726, format=S16LE ! avdec_g726 ! volume volume=5.0 ! alsasink sync=false But this command line does not: gst-launch-1.0 filesrc location=8000_1_b5.g726 do-timestamp=true blocksize=100 ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726, format=S16LE ! avdec_g726 ! volume mute=true ! alsasink sync=false The printed message is: Setting pipeline to PAUSED ... Pipeline is PREROLLING ... Redistribute latency... Pipeline is PREROLLED ... Setting pipeline to PLAYING ... New clock: GstAudioSinkClock Got EOS from element "pipeline0". Execution ended after 0:00:00.255093264 Setting pipeline to NULL ... Freeing pipeline ... It seems that mute in volume closes the stream; I do not know the reason. Thanks very much -------------- next part -------------- An HTML attachment was scrubbed... URL: From fblackmessms at yandex.ru Mon Mar 22 09:21:08 2021 From: fblackmessms at yandex.ru (Maksim Danilov) Date: Mon, 22 Mar 2021 04:21:08 -0500 (CDT) Subject: How to configure webrtc echo cancellation in Windows 10? Message-ID: <1616404868390-0.post@n4.nabble.com> Good afternoon. I'm trying to do echo cancellation in this pipeline: gst-launch-1.0.exe rtpsession name=s udpsrc port=8078 caps="application/x-rtp, media=audio, clock-rate=8000, encoding-name=PCMU, payload=0" ! s.recv_rtp_sink s.recv_rtp_src ! rtpjitterbuffer latency=100 ! rtppcmudepay ! mulawdec ! audioresample ! audioconvert ! audio/x-raw, rate=48000, format=S16LE ! webrtcechoprobe ! audioconvert ! wasapisink buffer-time=30000 wasapisrc ! audioresample ! audioconvert ! audio/x-raw, rate=48000, format=S16LE ! webrtcdsp ! audioresample ! audioconvert ! mulawenc ! rtppcmupay !
s.send_rtp_sink s.send_rtp_src ! udpsink host=10.8.0.218 port=30012 Everything is good, but echo cancellation is not working at all. I have read multiple mailing list threads but can't get it working. Can someone help me? Do you know of any guidance on how to configure latency and buffer-time? -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From sartoricz at gmail.com Mon Mar 22 10:57:10 2021 From: sartoricz at gmail.com (Jan Vitek) Date: Mon, 22 Mar 2021 05:57:10 -0500 (CDT) Subject: srtserversink not transmitting video pakets In-Reply-To: <1579790763941-0.post@n4.nabble.com> References: <1579555018621-0.post@n4.nabble.com> <1579790763941-0.post@n4.nabble.com> Message-ID: <1616410630168-0.post@n4.nabble.com> Hi, I just succeeded with my first srt pipeline, using gstreamer packaged in Debian Testing. It needs some more tuning - buffering, encryption and statistics - though. gst-inspect-1.0 version 1.18.3 GStreamer 1.18.3 http://packages.qa.debian.org/gstreamer1.0 gst-launch-1.0 -vvv udpsrc multicast-iface=enp7s0 auto-multicast=true uri=udp://10.1.1.1 at 232.10.10.1:2314 ! srtserversink uri=srt://:12314/ Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... Pipeline is PREROLLED ... Setting pipeline to PLAYING ... New clock: GstSystemClock WARNING: from element /GstPipeline:pipeline0/GstSRTServerSink:srtserversink0: Pipeline construction is invalid, please add queues. Additional debug info: ../libs/gst/base/gstbasesink.c(1249): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstSRTServerSink:srtserversink0: Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000. ^Chandling interrupt. Interrupt: Stopping pipeline ... Execution ended after 0:17:03.034216413 Setting pipeline to NULL ...
ERROR: from element /GstPipeline:pipeline0/GstSRTServerSink:srtserversink0: Failed to write to SRT socket: Canceled waiting for a connection. Additional debug info: ../ext/srt/gstsrtsink.c(181): gst_srt_sink_render (): /GstPipeline:pipeline0/GstSRTServerSink:srtserversink0 An error happened while waiting for EOS ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data stream error. Additional debug info: ../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: streaming stopped, reason error (-5) An error happened while waiting for EOS Freeing pipeline ... -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From info at activecraft.com Mon Mar 22 04:09:32 2021 From: info at activecraft.com (info at activecraft.com) Date: Mon, 22 Mar 2021 09:39:32 +0530 Subject: Issue of fetching Camera from website In-Reply-To: References: Message-ID: <20210322093932.Horde.B6f9H087DJslPIw02wDgTWZ@activecraft.com> Hi, again clarifying: we have installed GStreamer on AWS/Ubuntu. Now we want to fetch the webcam from a website. Is this possible with these commands .... gst-launch-1.0 v4l2src ! ------------------- Or gst-launch-1.0 webrtcbin ........... Or not. Please help. Kind Regards, Anurag Quoting Nicolas Dufresne : > On Thursday, 18 March 2021 at 17:53 +0530, Activecraft Software > Development wrote: >> Hi, >> >> We have installed Ubuntu and GStreamer at AWS >> and when running this command >> gst-launch-1.0 v4l2src ! videoconvert ! ximagesink >> >> We are getting this error >> Setting pipeline to PAUSED ... >> ERROR: Pipeline doesn't want to pause. >> ERROR: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Could >> not initialise X output >> Additional debug info: >> ximagesink.c(860): gst_x_image_sink_xcontext_get (): >> /GstPipeline:pipeline0/GstXImageSink:ximagesink0: >> Could not open display >> Setting pipeline to NULL ... >> Freeing pipeline ...
>> ------------------------ > You have to start an X11 server and set the DISPLAY env accordingly > in order to > use ximagesink. > >> and >> gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc >> tune=zerolatency ! rtph264pay ! udpsink port=10000 >> >> Error >> Setting pipeline to PAUSED ... >> ERROR: Pipeline doesn't want to pause. >> ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Cannot >> identify device '/dev/video0'. >> Additional debug info: >> v4l2_calls.c(609): gst_v4l2_open (): >> /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: >> system error: No such file or directory >> Setting pipeline to NULL ... >> Freeing pipeline ... > > You have to have a camera connected to your system and exposed as > /dev/video0 to > use v4l2src too. > > While running a headless X11 is possible over AWS, attaching a camera seems > rather atypical. If you use a full VM, you would enable the vivid > driver, which will > provide an emulated camera to your Linux kernel.
Anurag Biala +91-9814808323 | +1(646)-434-7132 -------------- next part -------------- An HTML attachment was scrubbed...
URL:

From wanted002 at 163.com  Mon Mar 22 12:46:10 2021
From: wanted002 at 163.com (wanted002)
Date: Mon, 22 Mar 2021 07:46:10 -0500 (CDT)
Subject: Don't know how to make avdec_h264 work in multi-threaded mode, seems not working with max-threads
In-Reply-To: <46659ff832cde2c517a1d96b9cc776ec9721b439.camel@ndufresne.ca>
References: <1615985169511-0.post@n4.nabble.com> <1616038344494-0.post@n4.nabble.com> <1616046496522-0.post@n4.nabble.com> <50b192f8634c89fdb8277d26ab9007bd4d672767.camel@ndufresne.ca> <1616147696623-0.post@n4.nabble.com> <46659ff832cde2c517a1d96b9cc776ec9721b439.camel@ndufresne.ca>
Message-ID: <1616417170799-0.post@n4.nabble.com>

Many thanks! I really appreciate your time and advice. I still have some beginner questions to figure out, so I beg for more of your guidance :)

Q1: About "thread_type" and "thread_count" in the source code:
I read the function gst_ffmpegviddec_set_format in gst-libav-1.16.2/ext/libav/gstavviddec.c and found this code:

    if (is_live)
      ffmpegdec->context->thread_type = FF_THREAD_SLICE;
    else
      ffmpegdec->context->thread_type = FF_THREAD_SLICE | FF_THREAD_FRAME;

So the default "thread_type" parameter is FF_THREAD_SLICE, meaning multithreading based on slices, for version 1.16.2? And does the parameter "ffmpegdec->context->thread_count" indicate how many decoding threads will be spawned?

Q2: Why does the decoding job still run on only one CPU core?
I use the "top -H -p xxxx" command, where xxxx is the pid of my gst-launch-1.0 program running the pipeline. I saw the decoding job loaded on the first decoding thread while the other three were idle. (My CPU has 4 cores, so I set avdec_h264 with "max-threads=4".) But when I tested this on Ubuntu 20.10 with gst-1.18.0, the decoding jobs were loaded onto every CPU core with the parameter "thread-type=Frame".
By the way, the encoding side is a Windows 10 laptop, and I guess the Windows OS may have encoded the H.264 stream with slice=1, but I'm ashamed that I don't know how to prove it... Any advice, please?

Q3: Why is one frame of latency per thread introduced when multithreading is based on frames?
I read the function gst_ffmpegviddec_set_format in gst-libav-1.18.0/ext/libav/gstavviddec.c and saw this comment:

    /* When thread type is FF_THREAD_FRAME, extra latency is introduced equal
     * to one frame per thread. We thus need to calculate the thread count
     * ourselves */

Thank you very much for your guidance. Best wishes.

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From giri_2984 at yahoo.co.in  Mon Mar 22 14:46:25 2021
From: giri_2984 at yahoo.co.in (RK29)
Date: Mon, 22 Mar 2021 09:46:25 -0500 (CDT)
Subject: Getting Audio Video Stream using Kinesis
In-Reply-To: <1616192148765-0.post@n4.nabble.com>
References: <1616192148765-0.post@n4.nabble.com>
Message-ID: <1616424385978-0.post@n4.nabble.com>

Can someone answer the question?

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From t.i.m at zen.co.uk  Mon Mar 22 17:53:57 2021
From: t.i.m at zen.co.uk (Tim-Philipp Müller)
Date: Mon, 22 Mar 2021 17:53:57 +0000
Subject: GStreamer 1.20
In-Reply-To: <4f46bd91-c586-43f9-64a9-05e0700540a9@dpsoftware.org>
References: <4f46bd91-c586-43f9-64a9-05e0700540a9@dpsoftware.org>
Message-ID: <05c7193c52b65985810e90c59c2e2a646913c6f5.camel@zen.co.uk>

On Fri, 2021-03-19 at 16:26 +0100, Davide Perini wrote:

Hi Davide,

Pretty safe to say that 1.20 won't come out in March, but next on our list is to get a 1.19.1 development pre-release out soon, which should get the ball rolling towards 1.20.

Cheers
 Tim

> Hi guys,
> sorry for the question but I am waiting for some cool new features in
> GStreamer 1.20 (d3d11desktopdupsrc),
> is there any plan to release GStreamer 1.20 in March or April?
>
> Thank you!!!
> Davide
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From nicolas at ndufresne.ca  Mon Mar 22 18:36:26 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Mon, 22 Mar 2021 14:36:26 -0400
Subject: how to use avdec_g726 to decode g726 raw data
In-Reply-To:
References:
Message-ID: <8dfcac185356489978ee9edffb7e0beed08e0489.camel@ndufresne.ca>

On Monday, 22 March 2021 at 11:50 +0800, strong wrote:
> hi, Nicolas
> Thanks for your help. Now I can make avdec_g726 work, like
> gst-launch-1.0 filesrc location=8000_1_b5.g726 do-timestamp=true blocksize=100
> ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726,
> format=S16LE ! avdec_g726 ! alsasink sync=false
>
> But another question is that "mute" cannot be set when the "volume" plugin is
> used. For example, this command line works:
> gst-launch-1.0 filesrc location=8000_1_b5.g726 do-timestamp=true blocksize=100
> ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726,
> format=S16LE ! avdec_g726 ! volume volume=5.0 ! alsasink sync=false
>
> But this command line does not:
> gst-launch-1.0 filesrc location=8000_1_b5.g726 do-timestamp=true blocksize=100
> ! audio/x-adpcm, bitrate=40000, rate=8000, channels=1, layout=g726,
> format=S16LE ! avdec_g726 ! volume mute=true ! alsasink sync=false
> The printed messages are:
> Setting pipeline to PAUSED ...
> Pipeline is PREROLLING ...
> Redistribute latency...
> Pipeline is PREROLLED ...
> Setting pipeline to PLAYING ...
> New clock: GstAudioSinkClock
> Got EOS from element "pipeline0".
> Execution ended after 0:00:00.255093264
> Setting pipeline to NULL ...
> Freeing pipeline ...
>
> It seems that mute in volume will close the stream; I do not know the reason.

Mute is optimized: instead of filling the stream with silence, it will send gaps. As you have asked GStreamer not to do synchronization, gaps are consumed very fast as they aren't blocked by the audio queue. Drop the sync=false for better results.

> Thanks very much
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From nicolas at ndufresne.ca  Mon Mar 22 18:40:16 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Mon, 22 Mar 2021 14:40:16 -0400
Subject: Issue of fetching Camera from website
In-Reply-To: <20210322093932.Horde.B6f9H087DJslPIw02wDgTWZ@activecraft.com>
References: <20210322093932.Horde.B6f9H087DJslPIw02wDgTWZ@activecraft.com>
Message-ID:

On Monday, 22 March 2021 at 09:39 +0530, info at activecraft.com wrote:
> Hi,
>
> Clarifying again: we have installed GStreamer on AWS/Ubuntu.
> Now we want to fetch a webcam stream from a website.
> Is this possible with these commands ....
>
>     gst-launch-1.0 v4l2src ! -------------------
> Or
>     gst-launch-1.0 webrtcbin ...........

What can I say: yes. Please come back with specific questions if you need help doing so; this is too broad a question. GStreamer can be used to bridge a webcam from your PC to a WebRTC channel so your WebRTC-enabled AWS node can ingest it. This is not a ready-made solution though; GStreamer is a framework that will let you build a customized solution for your specific needs.

> Or not.
>
> Please help.
>
> Kind Regards,
> Anurag
>
> [snip]

From nicolas at ndufresne.ca  Mon Mar 22 18:44:36 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Mon, 22 Mar 2021 14:44:36 -0400
Subject: How to configure webrtc echo cancellation in Windows 10?
In-Reply-To: <1616404868390-0.post@n4.nabble.com>
References: <1616404868390-0.post@n4.nabble.com>
Message-ID: <019b1aa111310ed1306f17e345ff50cfad170cef.camel@ndufresne.ca>

On Monday, 22 March 2021 at 04:21 -0500, Maksim Danilov wrote:
> Good afternoon. I'm trying to make echo cancellation work in this pipeline:
> gst-launch-1.0.exe rtpsession name=s udpsrc port=8078
> caps="application/x-rtp, media=audio, clock-rate=8000, encoding-name=PCMU,
> payload=0" ! s.recv_rtp_sink s.recv_rtp_src ! rtpjitterbuffer latency=100 !
> rtppcmudepay ! mulawdec ! audioresample ! audioconvert ! audio/x-raw,
> rate=48000, format=S16LE ! webrtcechoprobe ! audioconvert ! wasapisink

I'm not a Windows expert, but perhaps you should set the "low-latency" property on both src/sink in order to ensure you have reliable latency.

> buffer-time=30000 wasapisrc ! audioresample ! audioconvert ! audio/x-raw,
> rate=48000, format=S16LE ! webrtcdsp ! audioresample ! audioconvert !

Assuming your capture/playback latency (ignoring network latency) is under 400 ms, you may want to give the delay-agnostic mode a try.

> mulawenc ! rtppcmupay !
> s.send_rtp_sink s.send_rtp_src ! udpsink
> host=10.8.0.218 port=30012
>
> Everything is good, but the echo cancellation is not working at all. I have read multiple
> mailing list threads, but I can't make it through.
> Can someone help me? Do you know of some research on how to configure latency
> and buffer-time?
>
> --
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From nicolas at ndufresne.ca  Mon Mar 22 18:50:19 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Mon, 22 Mar 2021 14:50:19 -0400
Subject: Cannot play a 4K mkv content with v4l2h265dec plugin
In-Reply-To: <1616408108829-0.post@n4.nabble.com>
References: <1616408108829-0.post@n4.nabble.com>
Message-ID: <5a00e037ad7965ae5f1a0acbc2ca0196e4093500.camel@ndufresne.ca>

On Monday, 22 March 2021 at 05:15 -0500, karimchtx wrote:
> Hi, I'm currently trying to play 4K content on an IMX8-based board, an Apalis
> iMX8. I planned to use the v4l2h265dec decoder, as at first glance the imx
> vpudec plugin is not available for this processor, though it's supported on
> imx8m and imx8mm. I've designed the following pipeline:
> root at apalis-imx8:~# XDG_RUNTIME_DIR=/run/user/0 gst-launch-1.0 -v filesrc
> location=/media/sda1/4K_content.2160p.UHD.BLURAY.REMUX.HDR.HEVC.x265.mkv !
> matroskademux name=d d. ! queue ! h265parse ! v4l2h265dec ! waylandsink
> Unfortunately, I systematically get the following error, whatever I try to
> change in the pipeline.
[snip]
> streaming stopped, reason not-negotiated (-4) Execution ended after
> 0:00:00.131678311 Setting pipeline to PAUSED ... Setting pipeline to READY ...
> Setting pipeline to NULL ... Total showed frames (0), playing for
> (0:00:00.132085050), fps (0.000).
> Is there something obviously wrong in my pipeline? Thanks for your feedback.
> K.

Not-negotiated is a common pipeline error. It means that two elements in your pipeline could not negotiate their format. It does not say which ones though, but I suspect v4l2h265dec ! waylandsink is the culprit. You need your compositor to support the pixel format produced by your decoder. To confirm this, add videoconvert to see if that makes it display something; don't expect 4K if software video conversion is taking place. Next step, inspect the caps: you can use -v to see the caps being set on each pad. The video/x-raw caps will have formats which will help you. You can also use weston-info to list the formats supported by your compositor. Remember that this is still downstream, and NXP carries a large amount of patches across gstreamer, weston, mesa etc. So you may need to use these patches if you aren't using their vendor OS.

> Sent from the GStreamer-devel mailing list archive at Nabble.com.
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From activecraft at gmail.com  Mon Mar 22 18:51:09 2021
From: activecraft at gmail.com (Activecraft Software Development)
Date: Tue, 23 Mar 2021 00:21:09 +0530
Subject: Issue of fetching Camera from website
In-Reply-To:
References: <20210322093932.Horde.B6f9H087DJslPIw02wDgTWZ@activecraft.com>
Message-ID:

Hi Nicolas,

We are new to this, but we have lots of clients asking for GStreamer.

So if we have to fetch the stream using getUserMedia, then we should use WebRTC, and after that a command like

gst-launch-1.0 -v udpsrc port= ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! udpsink host=3.137.50.48 port=10001

or with webrtcbin. Any example of getUserMedia with WebRTC into GStreamer would be very helpful.

Anurag Biala
+91-9814808323 | +1(646)-797-2775

On Tue, Mar 23, 2021 at 12:10 AM Nicolas Dufresne wrote:
> [snip]
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From nicolas at ndufresne.ca  Mon Mar 22 19:10:20 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Mon, 22 Mar 2021 15:10:20 -0400
Subject: Don't know how to make avdec_h264 work in multi-threaded mode, seems not working with max-threads
In-Reply-To: <1616417170799-0.post@n4.nabble.com>
References: <1615985169511-0.post@n4.nabble.com> <1616038344494-0.post@n4.nabble.com> <1616046496522-0.post@n4.nabble.com> <50b192f8634c89fdb8277d26ab9007bd4d672767.camel@ndufresne.ca> <1616147696623-0.post@n4.nabble.com> <46659ff832cde2c517a1d96b9cc776ec9721b439.camel@ndufresne.ca> <1616417170799-0.post@n4.nabble.com>
Message-ID: <80ff5e5e2e947baefda6fbbcd12d6c263b6f4cb6.camel@ndufresne.ca>

On Monday, 22 March 2021 at 07:46 -0500, wanted002 wrote:
> Many thanks! I really appreciate your time and advice.
> I still have some beginner questions to figure out, so I beg for more of your guidance :)
>
> Q1: About "thread_type" and "thread_count" in the source code:
> I read the function gst_ffmpegviddec_set_format in
> gst-libav-1.16.2/ext/libav/gstavviddec.c and found this code:
>
>     if (is_live)
>       ffmpegdec->context->thread_type = FF_THREAD_SLICE;
>     else
>       ffmpegdec->context->thread_type = FF_THREAD_SLICE | FF_THREAD_FRAME;
>
> So the default "thread_type" parameter is FF_THREAD_SLICE, meaning
> multithreading based on slices, for version 1.16.2?
> And does "ffmpegdec->context->thread_count" indicate how many
> decoding threads will be spawned?

Not exactly: the default is to enable threads, SLICE for live pipelines, and SLICE or FRAME (both) for non-live pipelines.

> Q2: Why does the decoding job still run on only one CPU core?
> I use the "top -H -p xxxx" command, where xxxx is the pid of my
> gst-launch-1.0 program running the pipeline. I saw the decoding job
> loaded on the first decoding thread while the other three were idle. (My CPU
> has 4 cores, so I set avdec_h264 with "max-threads=4".) But when I tested this
> on Ubuntu 20.10 with gst-1.18.0, the decoding jobs were loaded onto every CPU
> core with the parameter "thread-type=Frame".
> By the way, the encoding side is a Windows 10 laptop, and I guess the Windows
> OS may have encoded the H.264 stream with slice=1, but I'm ashamed that I
> don't know how to prove it... Any advice, please?

I assume you use Media Foundation; see "Slice encoding": https://docs.microsoft.com/en-us/windows/win32/medfound/h-264-video-encoder
This is the default; it will produce as many slices as you have CPU cores on the encoder side (assuming I'm reading the doc right).

> Q3: Why is one frame of latency per thread introduced when multithreading is
> based on frames?
> I read the function gst_ffmpegviddec_set_format in
> gst-libav-1.18.0/ext/libav/gstavviddec.c and saw this comment:
>     /* When thread type is FF_THREAD_FRAME, extra latency is introduced equal
>      * to one frame per thread. We thus need to calculate the thread count
>      * ourselves */
>
> Thank you very much for your guidance. Best wishes.

This is what is documented in the FFmpeg API documentation, and it was also observed; I haven't looked at the internal details. But adding render delays ensures, for a live pipeline, that the thread pool fills; otherwise the pool will always starve and run single-threaded. Let's hope this is not your situation, as that could mean that your timing information is late, your network latency is too high, or latency is mis-configured somewhere.
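To put rough numbers on that one-frame-per-thread rule, here is a small illustrative Python sketch. The formula follows the FFmpeg comment quoted in Q3 (FF_THREAD_FRAME holds roughly one decoded frame per thread); the 4-thread / 30 fps figures are made-up examples, not measurements from any of the pipelines in this thread:

```python
def frame_thread_latency_s(thread_count: int, fps: float) -> float:
    """Extra decode latency from FF_THREAD_FRAME: roughly one frame
    duration per decoder thread (per the gstavviddec.c comment)."""
    return thread_count * (1.0 / fps)

# Example: avdec_h264 with max-threads=4 on a 30 fps stream
extra = frame_thread_latency_s(4, 30.0)
print(f"extra latency: {extra * 1000:.1f} ms")  # ~133.3 ms
```

This is also why GStreamer reports that extra latency downstream: a live pipeline has to budget for it, and if the pool never fills, decoding effectively degrades to a single thread.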
> > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From nicolas at ndufresne.ca Mon Mar 22 19:12:55 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Mon, 22 Mar 2021 15:12:55 -0400 Subject: Stream to web browser In-Reply-To: References: Message-ID: <232b8aef4b69f45c99cf9295940a9f78d6edeea8.camel@ndufresne.ca> Le dimanche 21 mars 2021 ? 01:04 +0530, Activecraft Software Development a ?crit?: > Hi, > gst-launch-1.0 videotestsrc \ > ! queue ! vp8enc ! rtpvp8pay \ > ! application/x-rtp,media=video,encoding-name=VP8,payload=96 \ > ! webrtcbin name=sendrecv > Can someone help in consuming this pipeline with a Laravel based server to > display the stream onto a web browser? GStreamer webrtcbin only implement the RTP/ICE (the streaming) part of a WebRTC session. You still need to implement your signalling as per W3C signalling protocol. Also, you will not be able to do webrtc streaming to a browser without writing some code. Please find various code examples here: https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc > > > ??Anurag Biala???? > +91-9814808323 |?+1(646)-797-2775?? > SKYPE:? ?activecraft at hotmail.com? ? ? |? ? ?Gmail? ? ??activecraft at gmail.com > Website:?https://www.activecraft.com??|? ??Email? ? ??info at activecraft.com?? ? > ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? > > > Website Design & Develop + App Design & Develop + SEO/SMM +?Graphic Design + > UI/UX > ? ? ? ? ? ? ?? > CONFIDENTIALITY NOTICE: The information in this email may be confidential > and/or privileged. This email is intended to be reviewed by only the > individual or organization named above. 
If you are not the intended recipient > or an authorized representative of the intended recipient, you are hereby > notified that any review, dissemination or copying of this email and its > attachments, if any, or the information contained herein is prohibited. If you > have received this email in error, please immediately notify the sender by > return email and delete this email from your system.? > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From karim_atiki at hotmail.com Mon Mar 22 19:34:43 2021 From: karim_atiki at hotmail.com (karimchtx) Date: Mon, 22 Mar 2021 14:34:43 -0500 (CDT) Subject: Cannot play a 4K mkv content with v4l2h265dec plugin In-Reply-To: <5a00e037ad7965ae5f1a0acbc2ca0196e4093500.camel@ndufresne.ca> References: <1616408108829-0.post@n4.nabble.com> <5a00e037ad7965ae5f1a0acbc2ca0196e4093500.camel@ndufresne.ca> Message-ID: <1616441683716-0.post@n4.nabble.com> Hi Nicolas, Thanks for your quick reply. You're definitly right in your analysis. A couple hours after my posts, I've slightly changed my pipeline as following: gst-launch-1.0 -v filesrc location=/media/sda1/solo.mkv ! video/x-matroska ! aiurdemux ! h265parse ! v4l2h265dec ! imxvideoconvert_g2d ! ! queue ! waylandsink You were right, I needed an explicit plugin before waylandsink for the conversion. But: at first sight, it works. But the video is sometime freezing during many seconds...(but gstreamer doesn't crash). Regarding the CPU, the vpu is effectively working. I can't figure out how to explain the "freezing" sequences. I tried to play the same media within souphttpsrc, it's working as well but the playback is still stuttering in "fast sequences" of the movie. And finally, I just can't figure ou what's happening and how could I debug / monitor the plugins task. 
Any suggestion would be welcome. Best Regards, Karim -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Mon Mar 22 20:50:41 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Mon, 22 Mar 2021 16:50:41 -0400 Subject: Cannot play a 4K mkv content with v4l2h265dec plugin In-Reply-To: <1616441683716-0.post@n4.nabble.com> References: <1616408108829-0.post@n4.nabble.com> <5a00e037ad7965ae5f1a0acbc2ca0196e4093500.camel@ndufresne.ca> <1616441683716-0.post@n4.nabble.com> Message-ID: <9c050368d2c482c1b393dbf6aad7f26635e8d0e6.camel@ndufresne.ca> Le lundi 22 mars 2021 ? 14:34 -0500, karimchtx a ?crit?: > Hi Nicolas, > > Thanks for your quick reply. > You're definitly right in your analysis. > A couple hours after my posts, I've slightly changed my pipeline as > following: > > ?gst-launch-1.0 -v filesrc location=/media/sda1/solo.mkv ! video/x-matroska > ! aiurdemux ! h265parse ! v4l2h265dec ! imxvideoconvert_g2d ! ! queue ! > waylandsink > > > You were right, I needed an explicit plugin before waylandsink for the > conversion. > But: > > at first sight, it works. > But the video is sometime freezing during many seconds...(but gstreamer > doesn't crash). > Regarding the CPU, the vpu is effectively working. > I can't figure out how to explain the "freezing" sequences. > > I tried to play the same media within souphttpsrc, it's working as well but > the playback is still stuttering in "fast sequences" of the movie. > > And finally, I just can't figure ou what's happening and how could I debug / > monitor the plugins task. Few things to try: Add queues before/after your decoder, to help parallelism and mimic what playbin would do. Try and make sure imxvideoconvert_g2d runs in zero-copy. Try and change aiurdemux with matroskademux, just in case the NXP demuxer have timing issues (that being said, it's quite good for a downstream demuxer). 
You may want to test with fakevideosink to make sure you decoder is keeping up, as if it's already the case, you can start looking at graphics. > > Any suggestion would be welcome. > > Best Regards, > > Karim > > > > > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From ystreet00 at gmail.com Tue Mar 23 01:35:22 2021 From: ystreet00 at gmail.com (Matthew Waters) Date: Tue, 23 Mar 2021 12:35:22 +1100 Subject: Issue of fetching Camera from website In-Reply-To: References: <20210322093932.Horde.B6f9H087DJslPIw02wDgTWZ@activecraft.com> Message-ID: Take your pick among any of these webrtc examples: https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc . On 23/3/21 5:51 am, Activecraft Software Development wrote: > Hi Nicolas, > > We are new. But we have lots of clients asking for Gstreamer. > > So if? we have to fetch stream using getusermedia than we should use > WebRTC and after that any of the command like > > gst-launch-1.0 -v udpsrc port= ! > application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! > jpegdec ! udpsink host=3.137.50.48 port=10001 > or with webrtcbin > > If you have Any ex. of usergetmedia with Webrtc to Gstreamer will be > very helpful. > > > Anurag Biala > +91-9814808323 | +1(646)-797-2775 > SKYPE: activecraft at hotmail.com ? ? ? > |Gmail activecraft at gmail.com > Website: https://www.activecraft.com > *?|*Email __info at activecraft.com > > > > Website Design & Develop + App Design & Develop + SEO/SMM + Graphic > Design + UI/UX > CONFIDENTIALITY NOTICE: The information in this email may be > confidential and/or privileged. This email is intended to be reviewed > by only the individual or organization named above. 
If you are not the > intended recipient or an authorized representative of the intended > recipient, you are hereby notified that any review, dissemination or > copying of this email and its attachments, if any, or the information > contained herein is prohibited. If you have received this email in > error, please immediately notify the sender by return email and delete > this email from your system. > > > > > On Tue, Mar 23, 2021 at 12:10 AM Nicolas Dufresne > > wrote: > > Le lundi 22 mars 2021 ? 09:39 +0530, info at activecraft.com > a ?crit?: >> >> ? Hi, >> ? Again clarifying >> ? We have Installed Gstreamer at AWS/ubuntu. >> ? Now we want to fetch webcam from website. >> ? ?Is this possible with these commands .... >> ? ? ? gst-launch-1.0 v4l2src ! ------------------- >> ? ?Or >> ? ? ? gst-launch-1.0 webrtcbin ........... > > What I can I say, yes. Please come back with specific questions if > you need help doing so, this is too broad of a quesiton. GStreamer > can be used to bridge a webcam from your PC to a WebRTC channel so > you WebRTC enabled AWS node can injest. This is not a ready made > solution though, GStreamer is a framework that will let you build > a customized solution for your specific needs. > >> ? Or not. >> ? Please help. >> ?Kind Regards, >> Anurag >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> Quoting Nicolas Dufresne > >: >> >>> Le jeudi 18 mars 2021 ? 17:53 +0530, Activecraft Software >>> Development a ?crit?: >>> >>>> Hi, >>>> >>>> We have Installed Ubuntu and Gstreamer at AWS >>>> and when running this command >>>> gst-launch-1.0 v4l2src ! videoconvert ! ximagesink >>>> >>>> We are getting this error >>>> Setting pipeline to PAUSED ... >>>> ERROR: Pipeline doesn't want to pause. 
>>>> ERROR: from element >>>> /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Could >>>> not initialise X output >>>> Additional debug info: >>>> ximagesink.c(860): gst_x_image_sink_xcontext_get (): >>>> /GstPipeline:pipeline0/GstXImageSink:ximagesink0: >>>> Could not open display >>>> Setting pipeline to NULL ... >>>> Freeing pipeline ... >>>> ------------------------ >>>> >>> You have to start an X11 server and set the DISPLAY env >>> accordingly in order to >>> use ximagesink. >>>> >>>> and >>>> gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc >>>> tune=zerolatency ! rtph264pay ! udpsink port=10000 >>>> >>>> Error >>>> Setting pipeline to PAUSED ... >>>> ERROR: Pipeline doesn't want to pause. >>>> ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: >>>> Cannot >>>> identify device '/dev/video0'. >>>> Additional debug info: >>>> v4l2_calls.c(609): gst_v4l2_open (): >>>> /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: >>>> system error: No such file or directory >>>> Setting pipeline to NULL ... >>>> Freeing pipeline ... >>>> >>> You have to have a camera connected to your system and exposed >>> as /dev/video0 to >>> use v4l2src too. >>> >>> While running a headless X11 is possible over aws, attaching a >>> camera seems >>> rather atypical. If you use a full VM, you would enable vivid >>> driver, which will >>> provide an emulated camera to your linux kernel. >>>> >>>> ??Anurag Biala >>>> +91-9814808323 |?+1(646)-797-2775 >>>> SKYPE: activecraft at hotmail.com >>>> ? ? |? ? ?Gmail activecraft at gmail.com >>>> >>>> Website: https://www.activecraft.com??| >>>> ??Email >>>> info at activecraft.com >>>> >>>> >>>> >>>> Website Design & Develop + App Design & Develop + SEO/SMM >>>> +?Graphic Design + >>>> UI/UX >>>> >>>> CONFIDENTIALITY NOTICE: The information in this email may be >>>> confidential >>>> and/or privileged. This email is intended to be reviewed by >>>> only the >>>> individual or organization named above. 
>> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.freedesktop.org >> >> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel >> > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: OpenPGP_signature Type: application/pgp-signature Size: 495 bytes Desc: OpenPGP digital signature URL: From gregoire at gentil.com Tue Mar 23 01:29:32 2021 From: gregoire at gentil.com (Gregoire Gentil) Date: Mon, 22 Mar 2021 18:29:32 -0700 Subject: UDP NET_ADMIN error on Android app In-Reply-To: References: Message-ID: Hello, I'm using gstreamer 1.16.2 in an Android app built with API version 29 and NDK 21.1. I'm running that app on Android TV 9 with kernel 4.1. I get the strange error message: 03-21 19:10:46.896 9080 9223 W GStreamer+udpsrc: 0:00:10.827964588 0xd1507750 ../gst/udp/gstudpsrc.c:1447:gst_udpsrc_open: warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege? 03-21 19:10:46.897 9080 9223 W GStreamer+udpsrc: 0:00:10.828812588 0xd1507750 ../gst/udp/gstudpsrc.c:1457:gst_udpsrc_open: have udp buffer of 262144 bytes while 524288 were requested 03-21 19:10:51.996 9080 9223 W GStreamer+rtspsrc: 0:00:15.928045518 0xd1507750 ../gst/rtsp/gstrtspsrc.c:5771:gst_rtspsrc_reconnect: warning: Could not receive any UDP packets for 5.0000 seconds, maybe your firewall is blocking it. Retrying using a tcp connection. 
03-21 19:11:13.472 9080 9223 W GStreamer+rtspsrc: 0:00:37.404373634 0xd1507750 ../gst/rtsp/gstrtspsrc.c:5560:gst_rtspsrc_loop_interleaved: error: Could not receive message. (System error) 03-21 19:11:13.474 9080 9223 W GStreamer+rtspsrc: 0:00:37.406002134 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6056:gst_rtspsrc_loop: error: Internal data stream error. 03-21 19:11:13.474 9080 9223 W GStreamer+rtspsrc: 0:00:37.406265426 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6056:gst_rtspsrc_loop: error: streaming stopped, reason error (-5) 03-21 19:11:13.495 9080 9223 W GStreamer+default: 0:00:37.427520719 0xd1507750 ../gst-libs/gst/rtsp/gstrtspconnection.c:1234:write_bytes Operation was cancelled 03-21 19:11:13.496 9080 9223 W GStreamer+rtspsrc: 0:00:37.427919303 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6402:gst_rtspsrc_try_send: send interrupted 03-21 19:11:13.496 9080 9223 W GStreamer+rtspsrc: 0:00:37.428175428 0xd1507750 ../gst/rtsp/gstrtspsrc.c:8650:gst_rtspsrc_pause: PAUSE interrupted 03-21 19:11:13.503 9080 9223 W GStreamer+default: 0:00:37.435170178 0xd1507750 ../gst-libs/gst/rtsp/gstrtspconnection.c:1234:write_bytes Error sending data: Broken pipe 03-21 19:11:13.503 9080 9223 W GStreamer+rtspsrc: 0:00:37.435540720 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6400:gst_rtspsrc_try_send: error: Could not send message. (System error) 03-21 19:11:13.504 9080 9223 W GStreamer+rtspsrc: 0:00:37.436036929 0xd1507750 ../gst/rtsp/gstrtspsrc.c:8075:gst_rtspsrc_close: error: Could not send message. (System error) My pipeline is: rtspsrc latency=250 location=rtsp://10.10.10.1:81 ! rtph264depay ! queue ! h264parse ! mpegpsmux ! filesink NET_ADMIN privilege is only for system app. Obviously, the device is not rooted. I'm confused why I get such udpsrc error. Note that the second line of the log indicates that it can't get a 512kB buffer but it can get a 256kB. Has anyone got an idea? 
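[The warning comes from udpsrc asking the kernel for a socket receive buffer larger than the unprivileged limit; forcing a bigger one needs CAP_NET_ADMIN. As the follow-up message in this thread confirms, rtspsrc exposes a udp-buffer-size property, so requesting a size at or below the OS limit (256 kB here) avoids the privileged path. A sketch based on the pipeline above (the filesink location is a placeholder):

```
gst-launch-1.0 rtspsrc latency=250 udp-buffer-size=262144 location=rtsp://10.10.10.1:81 ! \
    rtph264depay ! queue ! h264parse ! mpegpsmux ! filesink location=/path/to/out.mpg
```
]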
Gr?goire From gregoire at gentil.com Tue Mar 23 07:56:05 2021 From: gregoire at gentil.com (Gregoire Gentil) Date: Tue, 23 Mar 2021 00:56:05 -0700 Subject: UDP NET_ADMIN error on Android app In-Reply-To: References: Message-ID: Hello, I have added udp-buffer-size=262144 to my pipeline and the net_admin permission problem disappears. But now, I'm getting: PermissionCache: checking android.permission.ACCESS_SURFACE_FLINGER for uid=10075 => denied (2263 us) It's like Android wants to get a system app to do video via GStreamer on Android TV. Any clue what's going on? Gr?goire On 3/22/21 6:29 PM, Gregoire Gentil wrote: > Hello, > > I'm using gstreamer 1.16.2 in an Android app built with API version 29 > and NDK 21.1. I'm running that app on Android TV 9 with kernel 4.1. > > I get the strange error message: > > 03-21 19:10:46.896? 9080? 9223 W GStreamer+udpsrc: 0:00:10.827964588 > 0xd1507750 ../gst/udp/gstudpsrc.c:1447:gst_udpsrc_open: > warning: Could not create a buffer of requested 524288 bytes (Operation > not permitted). Need net.admin privilege? > > 03-21 19:10:46.897? 9080? 9223 W GStreamer+udpsrc: 0:00:10.828812588 > 0xd1507750 ../gst/udp/gstudpsrc.c:1457:gst_udpsrc_open: have > udp buffer of 262144 bytes while 524288 were requested > > 03-21 19:10:51.996? 9080? 9223 W GStreamer+rtspsrc: 0:00:15.928045518 > 0xd1507750 > ../gst/rtsp/gstrtspsrc.c:5771:gst_rtspsrc_reconnect: warning: > Could not receive any UDP packets for 5.0000 seconds, maybe your > firewall is blocking it. Retrying using a tcp connection. > > 03-21 19:11:13.472? 9080? 9223 W GStreamer+rtspsrc: 0:00:37.404373634 > 0xd1507750 > ../gst/rtsp/gstrtspsrc.c:5560:gst_rtspsrc_loop_interleaved: > error: Could not receive message. (System error) > > 03-21 19:11:13.474? 9080? 9223 W GStreamer+rtspsrc: 0:00:37.406002134 > 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6056:gst_rtspsrc_loop: > error: Internal data stream error. > > 03-21 19:11:13.474? 9080? 
9223 W GStreamer+rtspsrc: 0:00:37.406265426 > 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6056:gst_rtspsrc_loop: > error: streaming stopped, reason error (-5) > > 03-21 19:11:13.495? 9080? 9223 W GStreamer+default: 0:00:37.427520719 > 0xd1507750 ../gst-libs/gst/rtsp/gstrtspconnection.c:1234:write_bytes > Operation was cancelled > > 03-21 19:11:13.496? 9080? 9223 W GStreamer+rtspsrc: 0:00:37.427919303 > 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6402:gst_rtspsrc_try_send: > send interrupted > > 03-21 19:11:13.496? 9080? 9223 W GStreamer+rtspsrc: 0:00:37.428175428 > 0xd1507750 ../gst/rtsp/gstrtspsrc.c:8650:gst_rtspsrc_pause: > PAUSE interrupted > > 03-21 19:11:13.503? 9080? 9223 W GStreamer+default: 0:00:37.435170178 > 0xd1507750 ../gst-libs/gst/rtsp/gstrtspconnection.c:1234:write_bytes > Error sending data: Broken pipe > > 03-21 19:11:13.503? 9080? 9223 W GStreamer+rtspsrc: 0:00:37.435540720 > 0xd1507750 ../gst/rtsp/gstrtspsrc.c:6400:gst_rtspsrc_try_send: > error: Could not send message. (System error) > > 03-21 19:11:13.504? 9080? 9223 W GStreamer+rtspsrc: 0:00:37.436036929 > 0xd1507750 ../gst/rtsp/gstrtspsrc.c:8075:gst_rtspsrc_close: > error: Could not send message. (System error) > > My pipeline is: > > rtspsrc latency=250 location=rtsp://10.10.10.1:81 ! rtph264depay ! queue > ! h264parse ! mpegpsmux ! filesink > > NET_ADMIN privilege is only for system app. Obviously, the device is not > rooted. I'm confused why I get such udpsrc error. Note that the second > line of the log indicates that it can't get a 512kB buffer but it can > get a 256kB. Has anyone got an idea? 
> > Gr?goire From activecraft at gmail.com Tue Mar 23 08:17:12 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Tue, 23 Mar 2021 13:47:12 +0530 Subject: Stream to web browser In-Reply-To: <232b8aef4b69f45c99cf9295940a9f78d6edeea8.camel@ndufresne.ca> References: <232b8aef4b69f45c99cf9295940a9f78d6edeea8.camel@ndufresne.ca> Message-ID: Hi, We have Installed Project GST-examples We run below command ```console cd /path/to/gst-examples meson _builddir and received Error--------------- Also couldn't find a fallback subproject in subprojects/gst-plugins-bad for the dependency gstreamer-play-1.0 Reason: Subproject directory 'subprojects/gst-plugins-bad' does not exist and cannot be downloaded: No gst-plugins-bad.wrap found for 'subprojects/gst-plugins-bad' meson.build:23:0: ERROR: Native dependency 'gstreamer-play-1.0' not found Anurag Biala +91-9814808323 | +1(646)-797-2775 SKYPE: activecraft at hotmail.com | Gmail activecraft at gmail.com Website: https://www.activecraft.com * |* Email info at activecraft.com Website Design & Develop + App Design & Develop + SEO/SMM + Graphic Design + UI/UX CONFIDENTIALITY NOTICE: The information in this email may be confidential and/or privileged. This email is intended to be reviewed by only the individual or organization named above. If you are not the intended recipient or an authorized representative of the intended recipient, you are hereby notified that any review, dissemination or copying of this email and its attachments, if any, or the information contained herein is prohibited. If you have received this email in error, please immediately notify the sender by return email and delete this email from your system. On Tue, Mar 23, 2021 at 12:42 AM Nicolas Dufresne wrote: > Le dimanche 21 mars 2021 ? 01:04 +0530, Activecraft Software Development a > ?crit : > > Hi, > > gst-launch-1.0 videotestsrc \ > ! queue ! vp8enc ! rtpvp8pay \ > ! application/x-rtp,media=video,encoding-name=VP8,payload=96 \ > ! 
webrtcbin name=sendrecv > > Can someone help in consuming this pipeline with a Laravel based server to display the stream onto a web browser? > > > GStreamer webrtcbin only implement the RTP/ICE (the streaming) part of a > WebRTC session. You still need to implement your signalling as per W3C > signalling protocol. Also, you will not be able to do webrtc streaming to a > browser without writing some code. Please find various code examples here: > > https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc > > Anurag Biala > +91-9814808323 | +1(646)-797-2775 > SKYPE: activecraft at hotmail.com | Gmail > activecraft at gmail.com > Website: https://www.activecraft.com * |* Email > info at activecraft.com > > > Website Design & Develop + App Design & Develop + SEO/SMM + Graphic > Design + UI/UX > > CONFIDENTIALITY NOTICE: The information in this email may be confidential > and/or privileged. This email is intended to be reviewed by only the > individual or organization named above. If you are not the intended > recipient or an authorized representative of the intended recipient, you > are hereby notified that any review, dissemination or copying of this email > and its attachments, if any, or the information contained herein is > prohibited. If you have received this email in error, please immediately > notify the sender by return email and delete this email from your system. > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From javier.carrasco at wolfvision.net Tue Mar 23 08:19:36 2021 From: javier.carrasco at wolfvision.net (Javiku) Date: Tue, 23 Mar 2021 03:19:36 -0500 (CDT) Subject: force keyframe ignored (v4l2h264enc) Message-ID: <1616487576901-0.post@n4.nabble.com> Hello, I tried to implement the solution proposed in http://gstreamer-devel.966125.n4.nabble.com/How-to-force-keyframes-using-Python-GST-bindings-td4695810.html but my "v4l2h264enc" encoder seems to ignore the event I send, and no extra key frames are generated. I just added a function to get a pointer to the encoder, plus some warning messages that are in fact never displayed because the conditions are always false. This is my code:

ForceKeyStruct = gst_structure_new ("GstForceKeyUnit", "all-headers", G_TYPE_BOOLEAN, TRUE, NULL);
if (ForceKeyStruct == NULL)
    ipcd_warn ("ForceKeyStruct NULL");
force_key_unit_event = gst_event_new_custom (GST_EVENT_CUSTOM_UPSTREAM, ForceKeyStruct);
if (force_key_unit_event == NULL)
    ipcd_warn ("force_key_unit_event NULL");
enc = gst_bin_get_by_name (GST_BIN (pipeline), "v4l2h264enc0");
if (enc == NULL)
    ipcd_warn ("enc NULL");
encoder_src_pad = gst_element_get_static_pad (enc, "src");
if (!gst_pad_send_event (encoder_src_pad, force_key_unit_event))
    ipcd_warn ("event not sent");
gst_object_unref (encoder_src_pad);

When this code is executed I do not notice any change... as if the event was completely ignored. The frame pattern does not change, I do not get any additional key frame, and I do not get any error message either. Am I doing anything wrong? Is my encoder not able to handle that event? Should I explicitly configure the encoder to accept this event? Thanks in advance.
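[For reference, the force-key-unit event is normally built with the helper from the GStreamer video library rather than assembled by hand; an untested C sketch, with the encoder name taken from the poster's code:

```c
#include <gst/gst.h>
#include <gst/video/video.h>

/* Ask the named encoder for a keyframe as soon as possible.
 * Returns FALSE if the element rejected (or silently dropped) the event. */
static gboolean
request_keyframe (GstElement * pipeline)
{
  GstElement *enc;
  GstEvent *evt;
  gboolean ret;

  enc = gst_bin_get_by_name (GST_BIN (pipeline), "v4l2h264enc0");
  if (enc == NULL)
    return FALSE;

  /* GST_CLOCK_TIME_NONE = "now", all-headers = TRUE, count = 0 */
  evt = gst_video_event_new_upstream_force_key_unit (GST_CLOCK_TIME_NONE, TRUE, 0);
  ret = gst_element_send_event (enc, evt);  /* upstream events go in via the src pad */

  gst_object_unref (enc);
  return ret;
}
```

Whether this actually produces a keyframe still depends on the element implementing the event, which, per the reply further down in this thread, v4l2h264enc did not at the time.]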
Regards Javiku -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From karim_atiki at hotmail.com Tue Mar 23 08:30:52 2021 From: karim_atiki at hotmail.com (karimchtx) Date: Tue, 23 Mar 2021 03:30:52 -0500 (CDT) Subject: Cannot play a 4K mkv content with v4l2h265dec plugin In-Reply-To: <9c050368d2c482c1b393dbf6aad7f26635e8d0e6.camel@ndufresne.ca> References: <1616408108829-0.post@n4.nabble.com> <5a00e037ad7965ae5f1a0acbc2ca0196e4093500.camel@ndufresne.ca> <1616441683716-0.post@n4.nabble.com> <9c050368d2c482c1b393dbf6aad7f26635e8d0e6.camel@ndufresne.ca> Message-ID: <1616488252754-0.post@n4.nabble.com> Hi Nicolas, Thanks a lot for your recommendations. They helped me a lot to tune my pipeline! I have integrated matroskademux, as well as queues before/after the hardware decoding. As I'm writing this mail, the movie is still playing. It is very smooth... but sometimes it still stutters a little bit, though it's drastically better than yesterday. My guess is that it's maybe related to frame-rate issues. The movie is typically 23.976 fps; the TV is currently set up @ 60 fps. I'll try @ 24 fps and @ 30 fps later. Or maybe it is still related to bitrate (it's an action movie... in a galaxy far, far away...) in fast sequences, and thus maybe the "queue" elements should be tuned in their size as well? Karim -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From ottavio at campana.vi.it Tue Mar 23 08:50:51 2021 From: ottavio at campana.vi.it (Ottavio Campana) Date: Tue, 23 Mar 2021 09:50:51 +0100 Subject: Webrtc with sip for signalling? Message-ID: Hello, WebRTC does not specify a standard signalling protocol. I am wondering if there is any example of using SIP for signalling and GStreamer for sending the audio and video feeds. Or are you aware of any technical limitation in doing this? Thank you, Ottavio -- Non c'è più forza nella normalità, c'è solo monotonia -------------- next part -------------- An HTML attachment was scrubbed...
URL: From nicolas at ndufresne.ca Tue Mar 23 12:21:47 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Tue, 23 Mar 2021 08:21:47 -0400 Subject: Cannot play a 4K mkv content with v4l2h265dec plugin In-Reply-To: <1616488252754-0.post@n4.nabble.com> References: <1616408108829-0.post@n4.nabble.com> <5a00e037ad7965ae5f1a0acbc2ca0196e4093500.camel@ndufresne.ca> <1616441683716-0.post@n4.nabble.com> <9c050368d2c482c1b393dbf6aad7f26635e8d0e6.camel@ndufresne.ca> <1616488252754-0.post@n4.nabble.com> Message-ID: Le mar. 23 mars 2021 05 h 45, karimchtx a écrit : > Hi Nicolas, > > Thanks a lot for your recommendations. > They helped me a lot to tune my pipeline! > > I have integrated matroskademux, as well as queues before/after the hw > decoding. > As I'm writing this mail, the movie is still playing. > It is very smooth... but sometimes it still stutters a little bit, though > it's drastically better than yesterday. > > My guess is that it's maybe related to frame-rate issues. The movie is typically > 23.976 fps; the TV is currently set up @ 60 fps. I'll try @ 24 fps and @ 30 > fps later. > > Or maybe it is still related to bitrate (it's an action movie... in a galaxy > far, far away...) in fast sequences, and thus maybe the "queue" elements > should be tuned in their size as well? > As it's 4K, you can tune the queue before the display sink with a larger max size, e.g. max-size-bytes=0 (unlimited). Though, at this level, the stutter could be caused by colliding events in your compositor, or a GPU spike if the GPU is being used. Wayland has mechanisms to improve accuracy further, but this is not yet implemented. > > > Karim > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed...
URL: From nicolas at ndufresne.ca Tue Mar 23 12:23:58 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Tue, 23 Mar 2021 08:23:58 -0400 Subject: force keyframe ignored (v4l2h264enc) In-Reply-To: <1616487576901-0.post@n4.nabble.com> References: <1616487576901-0.post@n4.nabble.com> Message-ID: Le mar. 23 mars 2021 05 h 30, Javiku a ?crit : > Hello, > I tried to implement the solution proposed in > > http://gstreamer-devel.966125.n4.nabble.com/How-to-force-keyframes-using-Python-GST-bindings-td4695810.html > < > http://gstreamer-devel.966125.n4.nabble.com/How-to-force-keyframes-using-Python-GST-bindings-td4695810.html> > > but my "v4l2h264enc" encoder seems to ignore the event I send and no extra > key frames is generated. I just added a function to get a pointer to the > encoder and some messages that in fact are never displayed because the > conditions are always false. This is my code: > This is not yet implemented in this element, in fact there wasn't yet support for this in Linux when that element was posted. I can provide you hints on how to implement support for that, it should be fairly straightforward. > ForceKeyStruct = gst_structure_new("GstForceKeyUnit","all-headers", > G_TYPE_BOOLEAN, TRUE, NULL); > if(ForceKeyStruct == NULL) > ipcd_warn("ForeKeyStruct NULL"); > force_key_unit_event = > gst_event_new_custom(GST_EVENT_CUSTOM_UPSTREAM, > ForceKeyStruct); > if(force_key_unit_event == NULL) > ipcd_warn("force_key_unit_event NULL"); > enc = gst_bin_get_by_name(GST_BIN(pipeline), "v4l2h264enc0"); > if(enc == NULL) > ipcd_warn("enc NULL"); > encoder_src_pad = gst_element_get_static_pad(enc, "src"); > if(!gst_pad_send_event(encoder_src_pad, force_key_unit_event)) > ipcd_warn("event not sent"); > gst_object_unref(encoder_src_pad); > > When this code is executed I do not notice any change... as if the event > was > completely ignored. The frame pattern does not change and I do not get any > additional key frame. I do not get any error message either. 
Am I doing > anything wrong? Is my encoder not able to handle that event? should I > explicitly configure the encoder to accept this event? Thanks in advance. > Regards Javiku > > > > > -- > Sent from: http://gstreamer-devel.966125.n4.nabble.com/ > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From davinder16 at gmail.com Tue Mar 23 14:23:30 2021 From: davinder16 at gmail.com (Davinder Singh) Date: Tue, 23 Mar 2021 19:53:30 +0530 Subject: Gstreamer WebRTC Examples Message-ID: Hello I am totally new to Gstreamer but i have installed Ubuntu & Gstreamer on the server. I have found an example https://gitlab.freedesktop.org/gstreamer/gst-examples/ but could not understand how these examples will work. I cloned webrtc examples on the server but don't know how it will work. Please have a look at following link: https://livestream.activecraft.com/gst-examples/webrtc/sendrecv/js/index.html Please help me with the steps i should follow -- Thanks & Regards Davinder Singh -------------- next part -------------- An HTML attachment was scrubbed... URL: From javier.carrasco at wolfvision.net Tue Mar 23 15:24:01 2021 From: javier.carrasco at wolfvision.net (Javiku) Date: Tue, 23 Mar 2021 10:24:01 -0500 (CDT) Subject: force keyframe ignored (v4l2h264enc) In-Reply-To: References: <1616487576901-0.post@n4.nabble.com> Message-ID: <1616513041641-0.post@n4.nabble.com> Nicolas Dufresne-5 wrote > This is not yet implemented in this element, in fact there wasn't yet > support for this in Linux when that element was posted. I can provide you > hints on how to implement support for that, it should be fairly > straightforward. That would be great because I must use this element and a way to force key frames is an important requirement. 
Therefore implementing support for that seems to be the only solution. Thank you! -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From karim_atiki at hotmail.com Tue Mar 23 15:38:53 2021 From: karim_atiki at hotmail.com (karimchtx) Date: Tue, 23 Mar 2021 10:38:53 -0500 (CDT) Subject: Cannot play a 4K mkv content with v4l2h265dec plugin In-Reply-To: References: <1616408108829-0.post@n4.nabble.com> <5a00e037ad7965ae5f1a0acbc2ca0196e4093500.camel@ndufresne.ca> <1616441683716-0.post@n4.nabble.com> <9c050368d2c482c1b393dbf6aad7f26635e8d0e6.camel@ndufresne.ca> <1616488252754-0.post@n4.nabble.com> Message-ID: <1616513933292-0.post@n4.nabble.com> Thanks Nicolas. That's actually what I did, set max-size-bytes=0. Finally I found out the problem was rather related to the parameters of the monitor used. It's a 4K Samsung big screen, and additional features such as "auto smooth" options were activated. We've deactivated this option, as the screens used in production won't have this option set. And there we are, the playback is smooth... let's say at 99% :) I still need to investigate further into wayland/weston regarding the mode's frequency. I've set weston with:

[output]
name=HDMI-A-1
mode=3840x2160@60

knowing that the movie framerate is @ 24000/1001. If I change it to 3840x2160@24... the playback is completely and awfully stuttering. Well, that's a part of wayland/weston that's a bit obscure for my understanding. Thanks again, I've appreciated your help on this subject. K. -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From activecraft at gmail.com Tue Mar 23 17:41:42 2021 From: activecraft at gmail.com (Activecraft Software Development) Date: Tue, 23 Mar 2021 23:11:42 +0530 Subject: No subject Message-ID: When I have installed the project and run the command meson _builddir, it gives the below error.
Reason: Subproject directory 'subprojects/gst-plugins-bad' does not exist and cannot be downloaded: No gst-plugins-bad.wrap found for 'subprojects/gst-plugins-bad' meson.build:23:0: ERROR: Native dependency 'gstreamer-play-1.0' not found Anurag Biala +91-9814808323 | +1(646)-797-2775 SKYPE: activecraft at hotmail.com | Gmail activecraft at gmail.com Website: https://www.activecraft.com * |* Email info at activecraft.com Website Design & Develop + App Design & Develop + SEO/SMM + Graphic Design + UI/UX CONFIDENTIALITY NOTICE: The information in this email may be confidential and/or privileged. This email is intended to be reviewed by only the individual or organization named above. If you are not the intended recipient or an authorized representative of the intended recipient, you are hereby notified that any review, dissemination or copying of this email and its attachments, if any, or the information contained herein is prohibited. If you have received this email in error, please immediately notify the sender by return email and delete this email from your system. -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicolas at ndufresne.ca Tue Mar 23 19:32:10 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Tue, 23 Mar 2021 15:32:10 -0400 Subject: force keyframe ignored (v4l2h264enc) In-Reply-To: <1616513041641-0.post@n4.nabble.com> References: <1616487576901-0.post@n4.nabble.com> <1616513041641-0.post@n4.nabble.com> Message-ID: Le mardi 23 mars 2021 ? 10:24 -0500, Javiku a ?crit?: > Nicolas Dufresne-5 wrote > > This is not yet implemented in this element, in fact there wasn't yet > > support for this in Linux when that element was posted. I can provide you > > hints on how to implement support for that, it should be fairly > > straightforward. > > That would be great because I must use this element and a way to force key > frames is an important requirement. 
Therefore implementing support for that > seems to be the only solution. Thank you. Here is some information: the V4L2 framework offers a control of type button (a trigger) which asks for a new keyframe (it works ASAP). V4L2_CID_MPEG_VIDEO_FORCE_KEY_FRAME (button) Force a key frame for the next queued buffer. Applicable to encoders. This is a general, codec-agnostic keyframe control. On the GStreamer side, the GstVideoEncoder base class will handle the custom event and will set a flag on the GstVideoCodecFrame of the associated frame we are requested to turn into a key frame. You have to trigger the button before you queue the frame. Here's an example of code reading the flag: https://gitlab.freedesktop.org/gstreamer/gst-plugins-ugly/-/blob/master/ext/x264/gstx264enc.c#L2514 And on the V4L2 side, here's where I think the code to trigger the button should be called: https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/blob/master/sys/v4l2/gstv4l2videoenc.c#L796 I don't think that lock matters, you can trigger with or without it, I think. The helpers for controls are not great and don't support buttons, so I would just call the ioctl() directly (well, using the function pointer in v4l2object).
Something like this (not tested): https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/vidioc-g-ctrl.html

struct v4l2_control ctrl = { V4L2_CID_MPEG_VIDEO_FORCE_KEY_FRAME, 1 };

if (v4l2object->ioctl (v4l2object->video_fd, VIDIOC_S_CTRL, &ctrl) < 0)
  GST_ELEMENT_WARNING (self, RESOURCE, FAILED,
      (_("Failed to force keyframe.")), (NULL));

(if it's too complicated, I'll make an MR and you can help testing it)

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
gstreamer-devel at lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From kochmmi1 at fel.cvut.cz Wed Mar 24 11:29:48 2021
From: kochmmi1 at fel.cvut.cz (kochmmi1)
Date: Wed, 24 Mar 2021 06:29:48 -0500 (CDT)
Subject: v4l2src: device or resource busy
In-Reply-To: <1568116756979-0.post@n4.nabble.com>
References: <1568054218276-0.post@n4.nabble.com> <20190910133325.760fdf07@fluffyspider.com> <1568116756979-0.post@n4.nabble.com>
Message-ID: <1616585388939-0.post@n4.nabble.com>

Hi,
I am experiencing the same issue, I guess. Have you reached some solution?
Thanks
Michal

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From nicolas at ndufresne.ca Wed Mar 24 15:19:13 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Wed, 24 Mar 2021 11:19:13 -0400
Subject: v4l2src: device or resource busy
In-Reply-To: <1616585388939-0.post@n4.nabble.com>
References: <1568054218276-0.post@n4.nabble.com> <20190910133325.760fdf07@fluffyspider.com> <1568116756979-0.post@n4.nabble.com> <1616585388939-0.post@n4.nabble.com>
Message-ID: 

On Wednesday, March 24, 2021 at 06:29 -0500, kochmmi1 wrote:
> Hi,
> I am experiencing the same issue, I guess. Have you reached some solution?
> Thanks
> Michal

This report was against GStreamer 1.8, which is very ancient.
My recommendation is to use a newer kernel and newer GStreamer, as the memory handling has been improved, allowing buffers to be detached from the driver, hence avoiding these EBUSY errors.

> 
> 
> 
> -- 
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From anilkumar03006 at hotmail.com Tue Mar 23 05:37:02 2021
From: anilkumar03006 at hotmail.com (anil0407)
Date: Tue, 23 Mar 2021 00:37:02 -0500 (CDT)
Subject: gst pad request template.
In-Reply-To: <1616145153552-0.post@n4.nabble.com>
References: <1616145153552-0.post@n4.nabble.com>
Message-ID: <1616477822905-0.post@n4.nabble.com>

Hi All,
With gst-inspect-1.0 myplugin I am getting: Failed to load plugin gstmyplugin.so: undefined symbol: gst_collect_pads_add_pad. Please help, what am I missing?

static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("video_%u",
    GST_PAD_SINK,
    GST_PAD_REQUEST,
    GST_STATIC_CAPS ("ANY")
    );

in _init() function
  filter->srcpad = gst_pad_new_from_static_template (&src_factory, "src");
  gst_pad_use_fixed_caps (filter->srcpad);
  gst_element_add_pad (GST_ELEMENT (filter), filter->srcpad);
  filter->silent = FALSE;
  filter->collect = gst_collect_pads_new ();
  ...

in _class_init() function
  ...
  gstelement_class->request_new_pad = gst_request_new_pad;
  gstelement_class->release_pad = gst_release_pad;
  gstelement_class->change_state = gst_change_state;
  ...

in gst_request_new_pad() function
  ...
  pad_name = "video_0";
  newpad = gst_pad_new_from_template (templ, pad_name);
  dvpad->collect = gst_collect_pads_add_pad (dvmux->collect, newpad,
      sizeof (GstmypluginPad), NULL, TRUE);
  ....
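[A common cause of this undefined symbol (an assumption, not confirmed in this thread) is that the plugin library is not linked against libgstbase-1.0, which is where GstCollectPads lives. A minimal meson sketch, with illustrative target and file names:]

```meson
# Hypothetical meson.build fragment; 'gstmyplugin' and the source file
# name are placeholders for the actual plugin.
gst_dep = dependency('gstreamer-1.0')
gst_base_dep = dependency('gstreamer-base-1.0')  # provides GstCollectPads

library('gstmyplugin',
  'gstmyplugin.c',
  dependencies : [gst_dep, gst_base_dep],
  install : true,
)
```

With autotools or a plain Makefile, the equivalent would be adding the output of `pkg-config --cflags --libs gstreamer-base-1.0` to the build flags.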
Thanks,
Anil

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From josephngari at hotmail.com Wed Mar 24 17:59:50 2021
From: josephngari at hotmail.com (pineapplejoe)
Date: Wed, 24 Mar 2021 12:59:50 -0500 (CDT)
Subject: How to stream both audio and video from an RTSP source to Amazons kinesis
Message-ID: <1616608790257-0.post@n4.nabble.com>

I am trying to stream both audio and video from an IP camera, through a Raspberry Pi, and up into AWS Kinesis. I know the IP camera is producing both audio and video because I can use VLC to connect to the camera and both are present. I now have a pipeline that should produce both audio and video but I'm getting an error. The pipeline:

gst-launch-1.0 rtspsrc location="rtsp://Pass:Word at 192.168.1.15/stream" latency=0 name=d d. ! queue ! \
rtppcmudepay ! mulawdec ! audioresample ! audioconvert ! queue ! matroskamux name=mux d. ! queue ! \
rtph264depay ! h264parse ! queue ! mux. mux. ! \
kvssink <...aws creds ... >

The error I'm getting is

WARNING: erroneous pipeline: could not link mux to kvssink0

If I remove the ! mux. mux. ! - the stream will work but there is no audio, just video.
What am I missing?

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From fblackmessms at yandex.ru Wed Mar 24 06:49:02 2021
From: fblackmessms at yandex.ru (Maksim Danilov)
Date: Wed, 24 Mar 2021 01:49:02 -0500 (CDT)
Subject: How to configure webrtc echo cancellation in Windows 10?
In-Reply-To: <019b1aa111310ed1306f17e345ff50cfad170cef.camel@ndufresne.ca>
References: <1616404868390-0.post@n4.nabble.com> <019b1aa111310ed1306f17e345ff50cfad170cef.camel@ndufresne.ca>
Message-ID: <1616568542941-0.post@n4.nabble.com>

Thanks for the answer. I gave up on configuring AEC over RTP and came up with a simple example that looks like:
wasapisrc ! audioconvert ! audio/x-raw, format=S16LE, rate=48000 ! audioconvert ! webrtcdsp ! webrtcechoprobe ! audioconvert ! wasapisink.
It gives no result (on Linux everything works as expected).
I tried your suggestion with low-latency mode, however I got messages saying 'invalid latency add some queues to the pipeline', even if I add a queue to the pipeline. I can't hear myself at all. That pipeline gives the same result: wasapisrc low-latency=true ! ! wasapisink low-latency=true.
So I think the problem is in the plugin. It probably adds some additional latency in its implementation, because it doesn't act like real time. I inspected the code of the plugin and it looks like a very fast implementation of the API under Windows. Moreover it can't even capture specific caps of src/sink and uses the shared format F32LE (maybe it is resampling and converting audio in the core and we get some latency through that).

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From walkers_aug_7 at yahoo.co.jp Tue Mar 23 05:49:38 2021
From: walkers_aug_7 at yahoo.co.jp (Yusuke)
Date: Tue, 23 Mar 2021 00:49:38 -0500 (CDT)
Subject: I want to use decklink to output SDI input to SDI
Message-ID: <1616478578142-0.post@n4.nabble.com>

I want to use decklink to output SDI input to SDI.

This is OK.
gst-launch-1.0 \
decklinkvideosrc device-number=0 connection=sdi mode=1080i5994 profile=4 ! \
decklinkvideosink device-number=7 mode=1080i5994 profile=4 sync=false \
decklinkaudiosrc device-number=0 channels=16 ! audio/x-raw ,format=S32LE ! decklinkaudiosink device-number=7

This is NG (does not work).
gst-launch-1.0 \
decklinkvideosrc device-number=0 connection=sdi mode=1080i5994 profile=4 ! \
decklinkvideosink device-number=7 mode=1080i5994 profile=4 sync=false \
decklinkaudiosrc device-number=0 channels=16 ! audio/x-raw ,format=S32LE ! audiomixer ! decklinkaudiosink device-number=7

How can I use "audio mixer" to output SDI?

I am Japanese. Sorry for the poor English.
Thank you.
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From ikenetik at mail.ru Thu Mar 25 11:20:00 2021
From: ikenetik at mail.ru (ikenetik)
Date: Thu, 25 Mar 2021 06:20:00 -0500 (CDT)
Subject: Gstreamer-python video and audio rtsp-tcp
Message-ID: <1616671200068-0.post@n4.nabble.com>

Hello, I am new to this aspect, so I may do stupid things. I am unable to adequately combine the video and audio track for output to the stream. It outputs video, but no audio. Please help. The code:

import cv2
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import GLib, Gst, GstRtspServer, GObject

class SensorFactory(GstRtspServer.RTSPMediaFactory):
    def __init__(self, **properties):
        super(SensorFactory, self).__init__(**properties)
        self.cap = cv2.VideoCapture("test.mp4")
        self.number_frames = 0
        self.fps = 30
        self.duration = 1 / self.fps * Gst.SECOND  # duration of a frame in nanoseconds
        self.launch_string = "! video/x-raw-yuv, framerate={}/1," \
                             "! theoraenc bitrate=400 ! mux. " \
                             "audiotestsrc ! audiorate ! legacyresample ! " \
                             "audioconvert ! audio/x-raw-float,channels=2 " \
                             "! vorbisenc bitrate=64000 ! mux. " \
                             "oggmux name=mux ! fgdpsink ".format(self.fps)
        # streams to gst-launch-1.0 rtspsrc location=rtsp://localhost:8554/test latency=50 ! decodebin ! autovideosink

    def on_need_data(self, src, lenght):
        if self.cap.isOpened():
            ret, frame = self.cap.read()
            if ret:
                data = frame.tostring()
                #print(data)
                buf = Gst.Buffer.new_allocate(None, len(data), None)
                buf.fill(0, data)
                buf.duration = self.duration
                timestamp = self.number_frames * self.duration
                buf.pts = buf.dts = int(timestamp)
                buf.offset = timestamp
                self.number_frames += 1
                retval = src.emit('push-buffer', buf)
                #print('pushed buffer, frame {}, duration {} ns, durations {} s'.format(self.number_frames,
                #                                                                       self.duration,
                #                                                                       self.duration / Gst.SECOND))
                if retval != Gst.FlowReturn.OK:
                    print(retval)
        else:
            print('file is not opened, use normal path!')

    def do_create_element(self, url):
        print('do_create is working!')
        return Gst.parse_launch(self.launch_string)

    def do_configure(self, rtsp_media):
        print('do_configure working too!')
        self.number_frames = 0
        appsrc = rtsp_media.get_element().get_child_by_name('source')
        appsrc.connect('need-data', self.on_need_data)

class GstServer(GstRtspServer.RTSPServer):
    print('gi correctly')

    def __init__(self, **properties):
        super(GstServer, self).__init__(**properties)
        self.factory = SensorFactory()
        self.factory.set_shared(True)
        self.get_mount_points().add_factory("/test", self.factory)
        self.attach(None)

Gst.init(None)
server = GstServer()
loop = GLib.MainLoop()
loop.run()

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From mhaines4102 at gmail.com Thu Mar 25 17:00:08 2021
From: mhaines4102 at gmail.com (Mike Haines)
Date: Thu, 25 Mar 2021 12:00:08 -0500
Subject: rtspsrc giving EOS every 8 seconds
Message-ID: 

I am connected to an IP camera live. When the protocols property is set to TCP, every 8 seconds I receive an EOS signal. When not set to TCP the signal is not sent.
let src = gst::ElementFactory::make("rtspsrc", None)
    .map_err(|_| MissingElement("filesink"))
    .unwrap();
src.set_property("location", &uri)
    .expect("setting rtspsrc location property failed");
src.set_property("latency", &3000u32)
    .expect("setting rtspsrc latency property failed");
src.set_property("timeout", &8000000u64)
    .expect("setting rtspsrc timeout property failed");
src.set_property_from_str("protocols", "tcp");

The IP camera is on the local network with me. If I do not set protocols to tcp the behavior does not occur. Tried changing the latency and the timeout, no change. It is EVERY 8 seconds. is-live does not accept a property change but should be set to true by default.

Just to be clear about current installations:
gstreamer1.0-plugins-base is already the newest version (1.14.5-0ubuntu1~18.04.1).
gstreamer1.0-plugins-good is already the newest version (1.14.5-0ubuntu1~18.04.1).
libgstreamer-plugins-base1.0-dev is already the newest version (1.14.5-0ubuntu1~18.04.1).
libgstreamer1.0-dev is already the newest version (1.14.5-0ubuntu1~18.04.1).
gstreamer1.0-libav is already the newest version (1.14.5-0ubuntu1~18.04.1).
gstreamer1.0-plugins-bad is already the newest version (1.14.5-0ubuntu1~18.04.1).
gstreamer1.0-plugins-ugly is already the newest version (1.14.5-0ubuntu1~18.04.1).
libgstrtspserver-1.0-dev is already the newest version (1.14.5-0ubuntu1~18.04.1).

-------------- next part --------------
An HTML attachment was scrubbed... URL:

From weiac at amazon.com Thu Mar 25 17:10:25 2021
From: weiac at amazon.com (weianchen)
Date: Thu, 25 Mar 2021 12:10:25 -0500 (CDT)
Subject: TWCC stats are null with send-only connection to chrome in master
In-Reply-To: 
References: 
Message-ID: <1616692225152-0.post@n4.nabble.com>

Hello Faraz, I'm also trying to enable TWCC stats. Would you please let me know how/where you added the "extmap-3=http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01"? Thanks!!
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From nicolas at ndufresne.ca Thu Mar 25 18:30:38 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Thu, 25 Mar 2021 14:30:38 -0400
Subject: I want to use decklink to output SDI input to SDI
In-Reply-To: <1616478578142-0.post@n4.nabble.com>
References: <1616478578142-0.post@n4.nabble.com>
Message-ID: <3e40e0c374b5632d1f7aaabbb250c97edab30374.camel@ndufresne.ca>

On Tuesday, March 23, 2021 at 00:49 -0500, Yusuke wrote:
> I want to use decklink to output SDI input to SDI.
> 
> This is OK.
> gst-launch-1.0 \
> decklinkvideosrc device-number=0 connection=sdi mode=1080i5994 profile=4 ! \
> decklinkvideosink device-number=7 mode=1080i5994 profile=4 sync=false \
> decklinkaudiosrc device-number=0 channels=16 ! audio/x-raw ,format=S32LE !
> decklinkaudiosink device-number=7
> 
> This is NG.
> gst-launch-1.0 \
> decklinkvideosrc device-number=0 connection=sdi mode=1080i5994 profile=4 ! \
> decklinkvideosink device-number=7 mode=1080i5994 profile=4 sync=false \
> decklinkaudiosrc device-number=0 channels=16 ! audio/x-raw ,format=S32LE !
> audiomixer ! decklinkaudiosink device-number=7

Perhaps you want to allow some latency on audiomixer? (See the audiomixer latency property in gst-inspect-1.0.)

> 
> How can I use "audio mixer" to output SDI?
> 
> I am Japanese.
> Sorry for the poor English.
> Thank you.
> 
> 
> 
> 
> 
> -- 
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From nicolas at ndufresne.ca Thu Mar 25 18:33:11 2021
From: nicolas at ndufresne.ca (Nicolas Dufresne)
Date: Thu, 25 Mar 2021 14:33:11 -0400
Subject: How to configure webrtc echo cancellation in Windows 10?
In-Reply-To: <1616568542941-0.post@n4.nabble.com>
References: <1616404868390-0.post@n4.nabble.com> <019b1aa111310ed1306f17e345ff50cfad170cef.camel@ndufresne.ca> <1616568542941-0.post@n4.nabble.com>
Message-ID: 

On Wednesday, March 24, 2021 at 01:49 -0500, Maksim Danilov wrote:
> Thanks for the answer. I gave up on configuring AEC over rtp and came up with
> simple example that look like:
> wasapisrc ! audioconvert ! audio/x-raw, format=S16LE, rate=48000 !
> audioconvert ! webrtcdsp ! webrtcechoprobe ! audioconvert ! wasapisink.
> It gives no result (in linux everything work as expected).
> I tried you suggestion with low-latency mode, however got messages with
> 'invalid latency add some queues to the pipeline'. Even if I add queue to
> pipeline. I can't hear my self at all.
> That pipeline gives the same result: wasapisrc low-latency=true ! !
> wasapisink low-latency=true.
> So I think the problem is in plugin. Probably it add some additional latency
> in implementation, cause it doesn't act like real time.
> I inspected the code of plugin and it looks like a very fast implementation
> of api under Windows. Moreover it can't even capture specific caps of
> src/sink and uses shared format F32LE (maybe it is resampling and converting
> audio in core and we get some latency through).

I'm clueless at this point. Perhaps Nirbheek can help here?

> 
> 
> 
> -- 
> Sent from: http://gstreamer-devel.966125.n4.nabble.com/
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From kepitto at outlook.com Fri Mar 26 09:08:36 2021
From: kepitto at outlook.com (kepitto)
Date: Fri, 26 Mar 2021 04:08:36 -0500 (CDT)
Subject: omxh264enc CBR on Raspberry Pi 4?
Message-ID: <1616749716100-0.post@n4.nabble.com>

I'm trying to encode a video on a Raspberry Pi 4 in a realtime application, so I want to test out the constant bitrate.
Setting the control-rate property to 2 and the target-bitrate=1000000 doesn't work though, a file is being created however the size is 0. control-rate=1 (variable) with target-bitrate=1000000 does work. (doesn't work without setting target-bitrate) 1. Is there anything I'm missing regarding CBR encoding to a file? 2. Is the target-bitrate for VBR just the maximum possible bitrate? Or what does target-bitrate do for VBR? Thanks -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From mhaines4102 at gmail.com Fri Mar 26 14:48:45 2021 From: mhaines4102 at gmail.com (Mike Haines) Date: Fri, 26 Mar 2021 09:48:45 -0500 Subject: rtspsrc - protocols=tcp EOS 8 seconds Message-ID: gst-launch-1.0 -e rtspsrc location=rtsp://cameraip/live.sdp protocols=tcp ! rtph264depay ! h264parse ! filesink location="record_test1.mp4" Will not record more than 8 seconds Output from process: Got EOS from element "pipeline0". Execution ended after 0:00:07.399556977 I have changed every setting I can find and no change. Occasionally it will record longer than 8 seconds but it is 1 out of 20. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhaines4102 at gmail.com Fri Mar 26 15:40:27 2021 From: mhaines4102 at gmail.com (mhaines4102) Date: Fri, 26 Mar 2021 10:40:27 -0500 (CDT) Subject: rtspsrc - protocols=tcp EOS 8 seconds In-Reply-To: References: Message-ID: <1616773227800-0.post@n4.nabble.com> To be clear. The camera is directly connected to my box. The camera is not having any connectivity issues. Replaced Cable Changed Computers Works on UDP but not TCP. The answer cannot be "just use udp" Changed timeout and tcp-timeout with no change Changed buffer type with no change Changed the latency with no change Changed the retries with no change I am having trouble believing that I am the only person to ever see this behavior. 
Originally I was seeing this on an application I was writing, so I changed to just using the gst-launcher so that no "code" was in the way.

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From Markus.Elfring at web.de Sat Mar 27 12:32:32 2021
From: Markus.Elfring at web.de (Markus Elfring)
Date: Sat, 27 Mar 2021 13:32:32 +0100
Subject: Checking support for clipping of input data
Message-ID: <1ca24852-ba27-ab7d-3d25-1aab0b8bbe6a@web.de>

Hello,

I was informed that "clipping" is a general data processing operation.
https://en.wikipedia.org/wiki/Clipping_(computer_graphics)

Which GStreamer element implementations do you know of which perform this kind of transformation for selected multimedia input data?

Regards,
Markus

From trey.hutcheson at gmail.com Sat Mar 27 15:37:30 2021
From: trey.hutcheson at gmail.com (Trey Hutcheson)
Date: Sat, 27 Mar 2021 10:37:30 -0500
Subject: webrtcbin, h264 profiles and iOS?
Message-ID: 

Here's my situation:
* remote peer (ios) produces an sdp offer with h264. I negotiate with webrtcbin, and the remote peer sends in video.
* a second peer (ios) joins, and my media server creates a new webrtcbin instance, and depayloads/repayloads the h264 from the first webrtcbin to the 2nd
* webrtcbin instance 2 creates an offer.
* the 2nd remote peer rejects the offer with the message: "Failed to set remote offer sdp: Failed to set remote video description send parameters'.'

From what I've read, I *believe* the 2nd peer is rejecting the offer because of the h264 profile offered by webrtcbin.
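[As background (an editorial aside, not from the thread): profile-level-id is three hex bytes - profile_idc, the constraint_set flags, and level_idc - so two values like 42e01f and 42001f differ only in the constraint byte (0xE0 vs 0x00, i.e. Constrained Baseline vs plain Baseline). A quick sketch to decode it:]

```python
# Decode an H.264 profile-level-id (three bytes, per RFC 6184) into its parts.
def decode_plid(hexstr):
    profile_idc, constraints, level_idc = bytes.fromhex(hexstr)
    names = {66: "Baseline", 77: "Main", 88: "Extended", 100: "High"}
    profile = names.get(profile_idc, "profile_idc %d" % profile_idc)
    # constraint_set1_flag (bit 6) on a Baseline stream means Constrained Baseline
    if profile_idc == 66 and constraints & 0x40:
        profile = "Constrained Baseline"
    return profile, level_idc / 10.0

print(decode_plid("42e01f"))  # -> ('Constrained Baseline', 3.1)
print(decode_plid("42001f"))  # -> ('Baseline', 3.1)
```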
The original offer from the video producing peer looks like this (relevant parameters):

a=rtpmap:98 H264/90000
a=rtcp-fb:98 goog-remb
a=rtcp-fb:98 transport-cc
a=rtcp-fb:98 ccm fir
a=rtcp-fb:98 nack
a=rtcp-fb:98 nack pli
a=fmtp:98 profile-level-id=42e01f;level-asymmetry-allowed=1;packetization-mode=1

And then when webrtcbin produces the offer for the second peer, it looks like this:

a=rtpmap:98 H264/90000
a=rtcp-fb:98 nack pli
a=fmtp:98 packetization-mode=1;profile-level-id=42001f;sprop-parameter-sets=J0IAH6tAUB7TUCAgKkG0EQjUAA==,KM48MA==

So the video sending peer offered profile level 42e01f. But for the same h264 stream, webrtcbin offered profile level 42001f.

What's going on here? What is the correct behavior?

Chrome accepts the offer and plays the video just fine. Safari does not.

-------------- next part --------------
An HTML attachment was scrubbed... URL:

From pettair at gmail.com Sat Mar 27 20:46:27 2021
From: pettair at gmail.com (Peter Biro)
Date: Sat, 27 Mar 2021 21:46:27 +0100
Subject: Update only certain libraries
Message-ID: 

Hello

Unfortunately I'm locked in at version 1.14.5 since I'm using NVidia's Jetson Nano. This is the highest version they package into their distribution and, as I learned, they have no plan to update it. They also provide a handful of custom extensions, which I also need, so building and installing the whole package from source would not work for me.

I would need to update the WebRTC lib to 1.17.90 at least to have mDNS support. If there is a chance, I would like to update only the gstreamer-webrtc-1.0 shared lib, but of course any solution would do.

Can you point me in a direction for how I can achieve this partial update (if it is even possible)?

Thank you!

Bests,
Peter

-------------- next part --------------
An HTML attachment was scrubbed... URL:

From trey.hutcheson at gmail.com Sun Mar 28 14:01:15 2021
From: trey.hutcheson at gmail.com (Trey Hutcheson)
Date: Sun, 28 Mar 2021 09:01:15 -0500
Subject: webrtcbin, h264 profiles and iOS?
In-Reply-To: References: Message-ID: Ok so it appears Safari is in fact rejecting the profile 42001f. When the first webrtcbin instance fires the pad-added signal for the src pad, the caps show a profile-level-id of 42e01f. We then repayload the stream (queue ! rtph264depay ! h264parse config-interval=-1 ! rtph264pay) on its way to the 2nd webrtcbin instance, and the caps on the sink pad end up having a profile-level-id of 42001f. Is it h264parse that's modifying the caps? Is there any way to preserve the profile-level-id here? On Sat, Mar 27, 2021 at 10:37 AM Trey Hutcheson wrote: > Here's my situation: > * remote peer (ios) produces an sdp offer with h264. I negotiate with > webrtcbin, and the remote peer sends in video. > * a second peer (ios) joins, and my media server creates a new webrtcbin > instance, and depayloads/repayloads the h264 from the first webrtcbin to > the 2nd > * webrtcbin instance 2 creates an offer. > * the 2nd remote peer rejects the offer with the message: "Failed to set > remote offer sdp: Failed to set remote video description send parameters'.' > > From what I've read, I *believe* the 2nd peer is rejecting the offer > because of the h264 profile offered by webrtcbin. > > The original offer from the video producing peer looks like this (relevant > parameters): > a=rtpmap:98 H264/90000 > a=rtcp-fb:98 goog-remb > a=rtcp-fb:98 transport-cc > a=rtcp-fb:98 ccm fir > a=rtcp-fb:98 nack > a=rtcp-fb:98 nack pli > a=fmtp:98 > profile-level-id=42e01f;level-asymmetry-allowed=1;packetization-mode=1 > > And then when webrtcbin produces the offer for the second peer, it looks > like this: > a=rtpmap:98 H264/90000 > a=rtcp-fb:98 nack pli > a=fmtp:98 > packetization-mode=1;profile-level-id=42001f;sprop-parameter-sets=J0IAH6tAUB7TUCAgKkG0EQjUAA==,KM48MA== > > So the video sending peer offered profile level 42e01f. But for the same > h264 stream, webrtcbin offered profile level 42001f. > > What's going on here? What is the correct behavior? 
> 
> Chrome accepts the offer and plays the video just fine. Safari does not.
> 
> 
-------------- next part --------------
An HTML attachment was scrubbed... URL:

From amindfv at mailbox.org Sun Mar 28 21:15:03 2021
From: amindfv at mailbox.org (amindfv at mailbox.org)
Date: Sun, 28 Mar 2021 15:15:03 -0600
Subject: gst-play vs ffplay efficiency?
Message-ID: <20210328211503.GA30694@painter.painter>

Not sure how I haven't noticed this before now, but: when seeking with arrow keys, for everything from very low-res to very high-res video, gst-play has a noticeable lag, vs. ffplay which seems to seek immediately. Also, for very high-res videos gst-play playback gets quite choppy while ffplay doesn't.

I'm running older versions of both players - the ones from Debian stable (gst-play-1.0 version 1.14.4 / ffplay version 4.1.6-1~deb10u1). Possibly a cause.

Is this to be expected?

Thanks!
Tom

From keith.thornton at zeiss.com Mon Mar 29 03:56:43 2021
From: keith.thornton at zeiss.com (Thornton, Keith)
Date: Mon, 29 Mar 2021 03:56:43 +0000
Subject: AW: rtspsrc - protocols=tcp EOS 8 seconds
In-Reply-To: <1616773227800-0.post@n4.nabble.com>
References: <1616773227800-0.post@n4.nabble.com>
Message-ID: 

Hi, what does Wireshark say? Is there a disconnect after 8 seconds? If so, who disconnects?
Regards

-----Original Message-----
From: gstreamer-devel On behalf of mhaines4102
Sent: Friday, 26 March 2021 16:40
To: gstreamer-devel at lists.freedesktop.org
Subject: Re: rtspsrc - protocols=tcp EOS 8 seconds

To be clear. The camera is directly connected to my box. The camera is not having any connectivity issues.
Replaced Cable
Changed Computers
Works on UDP but not TCP. The answer cannot be "just use udp"
Changed timeout and tcp-timeout with no change
Changed buffer type with no change
Changed the latency with no change
Changed the retries with no change
I am having trouble believing that I am the only person to ever see this behavior.
Originally I was seeing this on an application I was writing, so I changed to just using the gst-launcher so that no "code" was in the way.

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
gstreamer-devel at lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

From jam at tigger.ws Mon Mar 29 05:06:43 2021
From: jam at tigger.ws (James Linder)
Date: Mon, 29 Mar 2021 13:06:43 +0800
Subject: newby help
Message-ID: 

G'day

I tease my mate "windows is so simple", only a fool can use it. I feel like that about gstreamer.
I have a command that works.

gst-launch-1.0 v4l2src device=$1 ! tee name=t \
t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! \
videoconvert ! xvimagesink force-aspect-ratio=false \
t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! \
videoconvert ! xvimagesink force-aspect-ratio=false

The tutorials say:
Please note that gst-launch is primarily a debugging tool for developers and users. You should not build applications on top of it. For applications, use the gst_parse_launch() function of the GStreamer API as an easy way to construct pipelines from pipeline descriptions.

WHY?
I'm trying to build a pipeline based on the above, but I do not understand the whole linking bit. In particular I think that linking the tee to each queue should be all that is required, but a core dump says I'm wrong.

The reason for the gst-launch above is that I need the sink windows to have no decoration. I've made suitable canvases but I now need to reparent the xv window into a window of size and position I declare, i.e. in my canvases. Any suggestions are welcome. A while after instantiation the windows are called "gst-launch-1.0" and I can manipulate them. But I often miss. I can catch them as "*" but I don't want ALL windows having no decoration, just the XV windows.

Please, a few moments to lead an ignorant soul out of the swamp.

James

From dor.forer at indoor-robotics.com Mon Mar 29 13:05:29 2021
From: dor.forer at indoor-robotics.com (dforer)
Date: Mon, 29 Mar 2021 08:05:29 -0500 (CDT)
Subject: RTSP streaming using rtsp server
Message-ID: <1617023129930-0.post@n4.nabble.com>

Hello everyone. I have read a lot and decided to use the rtsp server to stream my h264 camera via RTSP. I have an h264 camera with this spec:

ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'H264' (compressed)
    Name        : H.264

And I tried to stream my camera with this line of code:

./test-launch --gst-debug=3 "( v4l2src device=/dev/video1 ! video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96 )"

When I do so, it seems to work, but when I try to show it on another device using ffplay or VLC, I get a lot of:

RTP: missed 1 packets

errors in P frames, and the live video is not smooth and gets stuck a lot. It feels like I'm missing something in the h264 encoding because I'm using a camera with hardware encoding. BTW, when I try to use VLC RTSP streaming everything works great. What can it be?
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

From w.appelhoff at tv1.eu Mon Mar 29 13:19:14 2021
From: w.appelhoff at tv1.eu (Winand Appelhoff)
Date: Mon, 29 Mar 2021 13:19:14 +0000
Subject: AW: Update only certain libraries
In-Reply-To: 
References: 
Message-ID: 

Hi Peter,
you can actually update GStreamer on a Nano using the "gst-install" script
https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide/accelerated_gstreamer.html#wwpID0E06E0HA

Cheers
Winand

________________________________
From: gstreamer-devel on behalf of Peter Biro
Sent: Saturday, 27 March 2021 21:46
To: Discussion of the development of and with GStreamer
Subject: Update only certain libraries

Hello

Unfortunately I'm locked in at version 1.14.5 since I'm using NVidia's Jetson Nano. This is the highest version they package into their distribution and, as I learned, they have no plan to update it. They also provide a handful of custom extensions, which I also need, so building and installing the whole package from source would not work for me.

I would need to update the WebRTC lib to 1.17.90 at least to have mDNS support. If there is a chance, I would like to update only the gstreamer-webrtc-1.0 shared lib, but of course any solution would do.

Can you point me in a direction for how I can achieve this partial update (if it is even possible)?

Thank you!

Bests,
Peter

-------------- next part --------------
An HTML attachment was scrubbed... URL:

From gotsring at live.com Mon Mar 29 14:50:15 2021
From: gotsring at live.com (gotsring)
Date: Mon, 29 Mar 2021 09:50:15 -0500 (CDT)
Subject: newby help
In-Reply-To: 
References: 
Message-ID: <1617029415308-0.post@n4.nabble.com>

gst-launch is good for testing pipelines, but it lacks the ability to do any monitoring, live changes, error-handling, etc. Building pipelines in code takes getting used to, but it allows much more flexibility and robustness.

For basic pipelines, you can use gst_parse_launch to run a pipeline similar to how you would use gst-launch. For example, a simple pipeline with a tee can be started from the shell using:

gst-launch-1.0 videotestsrc ! tee name=t \
t. ! queue ! videoconvert ! autovideosink \
t. ! queue ! videoconvert ! autovideosink

or in C code using gst_parse_launch:

GError* err = NULL;
GstElement* pipeline = gst_parse_launch(
    "videotestsrc ! tee name=t "
    "t. ! queue ! videoconvert ! autovideosink "
    "t. ! queue ! videoconvert ! autovideosink"
    , &err);
gst_element_set_state(pipeline, GST_STATE_PLAYING); // Play it

You can swap out my test pipeline for whatever you want.
Not quite sure what you're trying to do in terms of a GUI, but maybe use GTK? Check out basic tutorial 5: https://gstreamer.freedesktop.org/documentation/tutorials/basic/toolkit-integration.html?gi-language=c -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From gotsring at live.com Mon Mar 29 14:54:02 2021 From: gotsring at live.com (gotsring) Date: Mon, 29 Mar 2021 09:54:02 -0500 (CDT) Subject: Checking support for clipping of input data In-Reply-To: <1ca24852-ba27-ab7d-3d25-1aab0b8bbe6a@web.de> References: <1ca24852-ba27-ab7d-3d25-1aab0b8bbe6a@web.de> Message-ID: <1617029642245-0.post@n4.nabble.com> Have you looked at videocrop or videobox? I'm not quite sure what you're trying to do. https://gstreamer.freedesktop.org/documentation/videocrop/videocrop.html?gi-language=c https://gstreamer.freedesktop.org/documentation/videobox/index.html?gi-language=c -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From mhaines4102 at gmail.com Mon Mar 29 16:48:57 2021 From: mhaines4102 at gmail.com (mhaines4102) Date: Mon, 29 Mar 2021 11:48:57 -0500 (CDT) Subject: AW: rtspsrc - protocols=tcp EOS 8 seconds In-Reply-To: References: <1616773227800-0.post@n4.nabble.com> Message-ID: <1617036537875-0.post@n4.nabble.com> There is no disconnect. I get an interleaved packet. Then I get a sender report: "1652","6.337378","192.168.55.160","192.168.55.1","RTCP","122","Sender Report Source description " Then I get an EOS in my system. It always seems to be directly after the sender report, but this is not the only sender report that comes through. I have included a pcap file. This is from the API version of my app. The command-line version of this test does a teardown directly after as well. "TEARDOWN rtsp://camera0/live.sdp/ RTSP/1.0" The attached report was produced by setting the pipeline to NULL as soon as the EOS hits and then exiting the program; you will see that in the report as the FIN packet.
Test1.pcap -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From Markus.Elfring at web.de Mon Mar 29 18:12:31 2021 From: Markus.Elfring at web.de (Markus Elfring) Date: Mon, 29 Mar 2021 20:12:31 +0200 Subject: Checking support for clipping of input data In-Reply-To: <1617029642245-0.post@n4.nabble.com> References: <1ca24852-ba27-ab7d-3d25-1aab0b8bbe6a@web.de> <1617029642245-0.post@n4.nabble.com> Message-ID: <336c986e-3099-7ab6-e4c0-803dfee0ef3a@web.de> > https://gstreamer.freedesktop.org/documentation/videocrop/videocrop.html?gi-language=c > https://gstreamer.freedesktop.org/documentation/videobox/index.html?gi-language=c Thanks for your links. My application interests may also grow towards customised cropping of pictures. This operation refers to rectangular areas. I am also looking for clipping support for other shapes, and for excluding further image parts. Can the data processing results be represented as virtual cameras? Regards, Markus From gotsring at live.com Mon Mar 29 18:32:25 2021 From: gotsring at live.com (gotsring) Date: Mon, 29 Mar 2021 13:32:25 -0500 (CDT) Subject: Checking support for clipping of input data In-Reply-To: <336c986e-3099-7ab6-e4c0-803dfee0ef3a@web.de> References: <1ca24852-ba27-ab7d-3d25-1aab0b8bbe6a@web.de> <1617029642245-0.post@n4.nabble.com> <336c986e-3099-7ab6-e4c0-803dfee0ef3a@web.de> Message-ID: <1617042745294-0.post@n4.nabble.com> I'm not aware of any stock GStreamer elements that allow you to mask off other shapes. Worst case scenario, you can create a filter element or use OpenCV to mask out parts of an image in whatever shape you want. Assuming you're using Linux, you can use v4l2sink to pipe GStreamer video outputs to a /dev/video device that can be used by other applications. You just have to install a new video device beforehand.
https://gstreamer.freedesktop.org/documentation/video4linux2/v4l2sink.html?gi-language=c See this post for more info: http://gstreamer-devel.966125.n4.nabble.com/Error-using-v4l2sink-when-combining-2-camera-streams-td4696728.html -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From nicolas at ndufresne.ca Mon Mar 29 19:36:06 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Mon, 29 Mar 2021 15:36:06 -0400 Subject: gst-play vs ffplay efficiency? In-Reply-To: <20210328211503.GA30694@painter.painter> References: <20210328211503.GA30694@painter.painter> Message-ID: Le dimanche 28 mars 2021 à 15:15 -0600, amindfv at mailbox.org a écrit : > Not sure how I haven't noticed this before now, but: > > When seeking with arrow keys, for everything from very low-res to very high-res video, gst-play has a noticeable lag, vs. ffplay which seems to immediately seek. > > Also, for very high-res videos gst-play playback gets quite choppy while ffplay doesn't. > > I'm running older versions of both players - the ones from Debian stable (gst-play-1.0 version 1.14.4 / ffplay version 4.1.6-1~deb10u1). Possibly a cause. It would be more fair to compare against latest GStreamer of course. Now, it's likely ffmpeg seeking is more efficient, but before jumping to conclusions, you have to make sure you compare apples-to-apples. GStreamer provides a multitude of seek operations, which yield different speeds. gst-play-1.0 is using GST_SEEK_FLAG_ACCURATE, which is one of the slowest, but also one which will vary in performance considering how far from a keyframe you seek. It will also be affected by how the stream was encoded, like the distance between keyframes. https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/tools/gst-play.c#L992 Nicolas > > Is this to be expected? > > Thanks!
> Tom > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From Markus.Elfring at web.de Mon Mar 29 20:00:25 2021 From: Markus.Elfring at web.de (Markus Elfring) Date: Mon, 29 Mar 2021 22:00:25 +0200 Subject: Checking support for clipping of input data In-Reply-To: <1617042745294-0.post@n4.nabble.com> References: <1ca24852-ba27-ab7d-3d25-1aab0b8bbe6a@web.de> <1617029642245-0.post@n4.nabble.com> <336c986e-3099-7ab6-e4c0-803dfee0ef3a@web.de> <1617042745294-0.post@n4.nabble.com> Message-ID: <7e6dbcc4-f8f4-6371-d23c-b5d1078518e5@web.de> > I'm not aware of any stock GStreamer elements that allow you to mask off > other shapes. Are any other drawing programs needed for the construction of clipping areas? Will software development interests grow for such a use case? Regards, Markus From sean at siobud.com Mon Mar 29 21:26:26 2021 From: sean at siobud.com (Sean DuBois) Date: Mon, 29 Mar 2021 14:26:26 -0700 Subject: WebRTC-Echoes: Interop for WebRTC. Any GStreamer devs interested in getting involved? Message-ID: Hey list, We have started a WebRTC interop project called WebRTC-echoes[0]. GStreamer has already been added to the test matrix, but would love it if some devs would be involved! My hope is that this is something we could take to the IETF and bring more stability to the protocol. [0] https://github.com/sipsorcery/webrtc-echoes From mhaines4102 at gmail.com Mon Mar 29 21:51:29 2021 From: mhaines4102 at gmail.com (mhaines4102) Date: Mon, 29 Mar 2021 16:51:29 -0500 (CDT) Subject: AW: rtspsrc - protocols=tcp EOS 8 seconds In-Reply-To: <1617036537875-0.post@n4.nabble.com> References: <1616773227800-0.post@n4.nabble.com> <1617036537875-0.post@n4.nabble.com> Message-ID: <1617054689061-0.post@n4.nabble.com> I ran the cli with a fakesink and verbose. This is the very end of the output where it stops.
I do not see any reason for this behavior, but I must be missing something: gst-launch-1.0 -v -e rtspsrc location=rtsp://camera0/live.sdp protocols=tcp ! fakesink /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats, ssrc=(uint)3934040712, internal=(boolean)true, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)false, seqnum-base=(int)-1, clock-rate=(int)-1, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)0, packets-received=(guint64)0, bitrate=(guint64)0, packets-lost=(int)0, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)false, sr-ntptime=(guint64)0, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0;", "application/x-rtp-source-stats, ssrc=(uint)2127178463, internal=(boolean)false, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)true, seqnum-base=(int)-1, clock-rate=(int)90000, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)3383604, packets-received=(guint64)2457, bitrate=(guint64)5250303, packets-lost=(int)65536, jitter=(uint)6, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)true, sr-ntptime=(guint64)16432694533086695424, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0, sent-rb=(boolean)true, sent-rb-fractionlost=(uint)0, sent-rb-packetslost=(int)65536, sent-rb-exthighestseq=(uint)67992, sent-rb-jitter=(uint)6, sent-rb-lsr=(uint)2828868386, sent-rb-dlsr=(uint)311701, have-rb=(boolean)false, rb-fractionlost=(uint)0,
rb-packetslost=(int)0, rb-exthighestseq=(uint)0, rb-jitter=(uint)0, rb-lsr=(uint)0, rb-dlsr=(uint)0, rb-round-trip=(uint)0;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats, ssrc=(uint)3934040712, internal=(boolean)true, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)false, seqnum-base=(int)-1, clock-rate=(int)-1, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)0, packets-received=(guint64)0, bitrate=(guint64)0, packets-lost=(int)0, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)false, sr-ntptime=(guint64)0, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0;", "application/x-rtp-source-stats, ssrc=(uint)2127178463, internal=(boolean)false, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)true, seqnum-base=(int)-1, clock-rate=(int)90000, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)3569173, packets-received=(guint64)2592, bitrate=(guint64)5250303, packets-lost=(int)65536, jitter=(uint)17, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)true, sr-ntptime=(guint64)16432694533086695424, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0, sent-rb=(boolean)true, sent-rb-fractionlost=(uint)0, sent-rb-packetslost=(int)65536, sent-rb-exthighestseq=(uint)68127, sent-rb-jitter=(uint)17, sent-rb-lsr=(uint)2828868386, sent-rb-dlsr=(uint)324901, have-rb=(boolean)false, rb-fractionlost=(uint)0, rb-packetslost=(int)0, 
rb-exthighestseq=(uint)0, rb-jitter=(uint)0, rb-lsr=(uint)0, rb-dlsr=(uint)0, rb-round-trip=(uint)0;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession2: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats, ssrc=(uint)3379194541, internal=(boolean)true, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)false, seqnum-base=(int)-1, clock-rate=(int)-1, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)0, packets-received=(guint64)0, bitrate=(guint64)0, packets-lost=(int)0, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)false, sr-ntptime=(guint64)0, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0;", "application/x-rtp-source-stats, ssrc=(uint)476900212, internal=(boolean)false, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)true, seqnum-base=(int)-1, clock-rate=(int)90000, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)18461, packets-received=(guint64)20, bitrate=(guint64)0, packets-lost=(int)65536, jitter=(uint)1044, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)true, sr-ntptime=(guint64)16432694533048040448, sr-rtptime=(uint)0, sr-octet-count=(uint)933, sr-packet-count=(uint)1, sent-rb=(boolean)true, sent-rb-fractionlost=(uint)0, sent-rb-packetslost=(int)65536, sent-rb-exthighestseq=(uint)65555, sent-rb-jitter=(uint)1044, sent-rb-lsr=(uint)2828867796, sent-rb-dlsr=(uint)321389, have-rb=(boolean)false, rb-fractionlost=(uint)0, rb-packetslost=(int)0, rb-exthighestseq=(uint)0, 
rb-jitter=(uint)0, rb-lsr=(uint)0, rb-dlsr=(uint)0, rb-round-trip=(uint)0;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession1: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats, ssrc=(uint)2086076471, internal=(boolean)false, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)true, seqnum-base=(int)-1, clock-rate=(int)8000, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)40960, packets-received=(guint64)64, bitrate=(guint64)64165, packets-lost=(int)65536, jitter=(uint)79, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)true, sr-ntptime=(guint64)16432694533425997824, sr-rtptime=(uint)0, sr-octet-count=(uint)656, sr-packet-count=(uint)1, sent-rb=(boolean)true, sent-rb-fractionlost=(uint)0, sent-rb-packetslost=(int)65536, sent-rb-exthighestseq=(uint)65596, sent-rb-jitter=(uint)96, sent-rb-lsr=(uint)2828873564, sent-rb-dlsr=(uint)314490, have-rb=(boolean)false, rb-fractionlost=(uint)0, rb-packetslost=(int)0, rb-exthighestseq=(uint)0, rb-jitter=(uint)0, rb-lsr=(uint)0, rb-dlsr=(uint)0, rb-round-trip=(uint)0;", "application/x-rtp-source-stats, ssrc=(uint)239842252, internal=(boolean)true, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)false, seqnum-base=(int)-1, clock-rate=(int)-1, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)0, packets-received=(guint64)0, bitrate=(guint64)0, packets-lost=(int)0, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)false, sr-ntptime=(guint64)0, sr-rtptime=(uint)0, 
sr-octet-count=(uint)0, sr-packet-count=(uint)0;" >, rtx-count=(uint)0; /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstRtpBin:manager/GstRtpSession:rtpsession0: stats = application/x-rtp-session-stats, rtx-drop-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, source-stats=(GValueArray)< "application/x-rtp-source-stats, ssrc=(uint)3934040712, internal=(boolean)true, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)false, seqnum-base=(int)-1, clock-rate=(int)-1, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)0, packets-received=(guint64)0, bitrate=(guint64)0, packets-lost=(int)0, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)false, sr-ntptime=(guint64)0, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0;", "application/x-rtp-source-stats, ssrc=(uint)2127178463, internal=(boolean)false, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)true, seqnum-base=(int)-1, clock-rate=(int)90000, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)3650582, packets-received=(guint64)2652, bitrate=(guint64)5250303, packets-lost=(int)65536, jitter=(uint)24, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)true, sr-ntptime=(guint64)16432694533086695424, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0, sent-rb=(boolean)true, sent-rb-fractionlost=(uint)0, sent-rb-packetslost=(int)65536, sent-rb-exthighestseq=(uint)68187, sent-rb-jitter=(uint)24, sent-rb-lsr=(uint)2828868386, sent-rb-dlsr=(uint)336688, have-rb=(boolean)false, rb-fractionlost=(uint)0, rb-packetslost=(int)0, rb-exthighestseq=(uint)0, rb-jitter=(uint)0, rb-lsr=(uint)0, rb-dlsr=(uint)0, 
rb-round-trip=(uint)0;" >, rtx-count=(uint)0; Got EOS from element "pipeline0". Execution ended after 0:00:05.506070901 Setting pipeline to PAUSED ... Setting pipeline to READY ... Setting pipeline to NULL ... Freeing pipeline ... -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From jam at tigger.ws Mon Mar 29 22:57:35 2021 From: jam at tigger.ws (James) Date: Tue, 30 Mar 2021 06:57:35 +0800 Subject: newby help In-Reply-To: <1617029415308-0.post@n4.nabble.com> References: <1617029415308-0.post@n4.nabble.com> Message-ID: > On 29 Mar 2021, at 10:50 pm, gotsring wrote: > > gst-launch is good for testing pipelines, but it lacks the ability to do any > monitoring, live changes, error-handling, etc. Building pipelines in code > takes getting used to, but it allows much more flexibility and robustness. > > For basic pipelines, you can use gst_parse_launch to run a pipeline similar > to how you would use gst-launch > > For example, a simple pipeline with a tee can be started from the shell > using: > gst-launch-1.0 videotestsrc ! tee name=t \ > t. ! queue ! videoconvert ! autovideosink \ > t. ! queue ! videoconvert ! autovideosink > > or in C code using gst_parse_launch: > GError* err = NULL; > GstElement* pipeline = gst_parse_launch( > "videotestsrc ! tee name=t " > "t. ! queue ! videoconvert ! autovideosink " > "t. ! queue ! videoconvert ! autovideosink" > , &err); > gst_element_set_state(pipeline, GST_STATE_PLAYING); // Play it > > > You can swap out my test pipeline for whatever you want. > > Not quite sure what you're trying to do in terms of a GUI, but maybe use > GTK? Check out basic tutorial 5: > https://gstreamer.freedesktop.org/documentation/tutorials/basic/toolkit-integration.html?gi-language=c Thanks. I put my exact working command into gst_parse_launch, where it failed with a core dump, I think. I've done so much wading around in basic tutorials 7 and 8 that I don't remember exactly. What I've done is to explicitly link the tee pads.
Now I'm just trying to track down 2 gstreamer-CRITICAL errors before a core dump. I find the whole toolkit rather nice, just oh so complex and I see my original question of how do I code this pipeline is extremely non-trivial James From jam at tigger.ws Tue Mar 30 06:22:09 2021 From: jam at tigger.ws (James) Date: Tue, 30 Mar 2021 14:22:09 +0800 Subject: newby help In-Reply-To: <1617029415308-0.post@n4.nabble.com> References: <1617029415308-0.post@n4.nabble.com> Message-ID: > On 29 Mar 2021, at 10:50 pm, gotsring wrote: > > gst-launch is good for testing pipelines, but it lacks the ability to do any > monitoring, live changes, error-handling, etc. Building pipelines in code > takes getting used to, but it allows much more flexibility and robustness. > > For basic pipelines, you can use gst_parse_launch to run a pipeline similar > to how you would use gst-launch > > For example, a simple pipeline with a tee can be started from the shell > using: > gst-launch-1.0 videotestsrc ! tee name=t \ > t. ! queue ! videoconvert ! autovideosink \ > t. ! queue ! videoconvert ! autovideosink > > or in C code using gst_parse_launch: > GError* err = NULL; > GstElement* pipeline = gst_parse_launch( > "videotestsrc ! tee name=t " > "t. ! queue ! videoconvert ! autovideosink " > "t. ! queue ! videoconvert ! autovideosink" > , &err); > gst_element_set_state(pipeline, GST_STATE_PLAYING); // Play it > > > You can swap out my test pipeline for whatever you want. > > Not quite sure what you're trying to do in terms of a GUI, but maybe use > GTK? Check out basic tutorial 5: > https://gstreamer.freedesktop.org/documentation/tutorials/basic/toolkit-integration.html?gi-language=c OK perhaps a different track: I compiled basic_tutorial12.c The executable runs, pretty picture etc I changed pipeline = gst_parse_launch ("playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL); for pipeline = gst_parse_launch ("v4l2src device=argv[1] ! 
tee name=t " "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " "videoconvert ! xvimagesink force-aspect-ratio=false " " t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " "videoconvert ! xvimagesink force-aspect-ratio=false", NULL); Now running gives me Unable to set the pipeline to the playing state. My 'get_parse_lauch' is wrong. Why? Where? Thanks James From sebastian at centricular.com Tue Mar 30 07:19:51 2021 From: sebastian at centricular.com (Sebastian Dröge) Date: Tue, 30 Mar 2021 10:19:51 +0300 Subject: WebRTC-Echoes: Interop for WebRTC. Any GStreamer devs interested in getting involved? In-Reply-To: References: Message-ID: <88cb9f14ea49c18935275e57574f517a1e862998.camel@centricular.com> Hi Sean, On Mon, 2021-03-29 at 14:26 -0700, Sean DuBois wrote: > Hey list, > > We have started a WebRTC interop project called WebRTC-echoes[0] > GStreamer has already been added to the test matrix, but would love > if some devs would be involved! My hope is that this is something > we could take to the IETF and bring more stability to the protocol. Is there anything specific for someone to get involved with? Sounds like a useful project in any case, thanks for setting that up :) Would someone create issues in Gitlab whenever something fails on the GStreamer side or what's your plan there? -- Sebastian Dröge, Centricular Ltd ·
https://www.centricular.com From michiel at aanmelder.nl Tue Mar 30 07:32:53 2021 From: michiel at aanmelder.nl (Michiel Konstapel) Date: Tue, 30 Mar 2021 09:32:53 +0200 Subject: newby help In-Reply-To: References: <1617029415308-0.post@n4.nabble.com> Message-ID: On 30-03-2021 08:22, James wrote: > OK perhaps a different track: > I compiled basic_tutorial12.c > The executable runs, pretty picture etc > > I changed > > pipeline = gst_parse_launch ("playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL); > > for > pipeline = gst_parse_launch ("v4l2src device=argv[1] ! tee name=t " > "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " > "videoconvert ! xvimagesink force-aspect-ratio=false " > " t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " > "videoconvert ! xvimagesink force-aspect-ratio=false", NULL); > > Now running gives me > > Unable to set the pipeline to the playing state. > > My 'get_parse_lauch' is wrong. Why? Where? > > Thanks > James Do you literally have "v4l2src device=argv[1]" in there? That's not going to work :) If it's C, you'll have to use something like snprintf() to put the argument into the parse_launch string. My C is quite rusty, but something like char cmdline[1024]; snprintf( cmdline, 1024, "v4l2src device=%s ! tee name=t " "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " "videoconvert ! xvimagesink force-aspect-ratio=false " "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " "videoconvert ! 
xvimagesink force-aspect-ratio=false", argv[1] ); pipeline = gst_parse_launch(cmdline, NULL); HTH, Michiel From jam at tigger.ws Tue Mar 30 11:41:31 2021 From: jam at tigger.ws (James Linder) Date: Tue, 30 Mar 2021 19:41:31 +0800 Subject: newby help In-Reply-To: References: <1617029415308-0.post@n4.nabble.com> Message-ID: <75B83355-58E4-4A48-9156-44F18BCE56DA@tigger.ws> > On 30 Mar 2021, at 3:32 pm, Michiel Konstapel wrote: > > On 30-03-2021 08:22, James wrote: >> OK perhaps a different track: >> I compiled basic_tutorial12.c >> The executable runs, pretty picture etc >> >> I changed >> >> pipeline = gst_parse_launch ("playbin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL); >> >> for >> pipeline = gst_parse_launch ("v4l2src device=argv[1] ! tee name=t " >> "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " >> "videoconvert ! xvimagesink force-aspect-ratio=false " >> " t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " >> "videoconvert ! xvimagesink force-aspect-ratio=false", NULL); >> >> Now running gives me >> >> Unable to set the pipeline to the playing state. >> >> My 'get_parse_lauch' is wrong. Why? Where? >> >> Thanks >> James > > Do you literally have "v4l2src device=argv[1]" in there? That's not going to work :) > > If it's C, you'll have to use something like snprintf() to put the argument into the parse_launch string. My C is quite rusty, but something like > > char cmdline[1024]; > snprintf( > cmdline, 1024, > "v4l2src device=%s ! tee name=t " > "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " > "videoconvert ! xvimagesink force-aspect-ratio=false " > "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " > "videoconvert ! xvimagesink force-aspect-ratio=false", > argv[1] > ); > pipeline = gst_parse_launch(cmdline, NULL); Of course! silly moi. My very last dumb question, I promise. I need to fiddle with the elements in the pipeline (to reparent the window). 
So the printf solution or g_object_set (..). So I'm trying to list items in the pipeline (which was made with gst_parse_launch) // walk the pipeline GstIterator *iter = gst_bin_iterate_elements (pipeline); GValue item = G_VALUE_INIT; gboolean done = FALSE; while (!done) { switch (gst_iterator_next (iter, &item)) { case GST_ITERATOR_OK: //...get/use/change item here... gst_element_get_name (&item); g_value_reset (&item); break; case GST_ITERATOR_RESYNC: //...rollback changes to items... gst_iterator_resync (iter); break; case GST_ITERATOR_ERROR: //...wrong parameters were given... done = TRUE; break; case GST_ITERATOR_DONE: done = TRUE; break; } } g_value_unset (&item); gst_iterator_free (iter); Which fails asserting item is not an object. Again stuff I don't understand, despite lots of reading James From gotsring at live.com Tue Mar 30 14:23:09 2021 From: gotsring at live.com (gotsring) Date: Tue, 30 Mar 2021 09:23:09 -0500 (CDT) Subject: newby help In-Reply-To: <75B83355-58E4-4A48-9156-44F18BCE56DA@tigger.ws> References: <1617029415308-0.post@n4.nabble.com> <75B83355-58E4-4A48-9156-44F18BCE56DA@tigger.ws> Message-ID: <1617114189668-0.post@n4.nabble.com> In this case, I don't think you need to iterate over all the elements manually. You're just trying to grab references to the xvimagesink, correct? If so, use gst_bin_get_by_name(), which returns a reference to the element in question (or NULL). You can name your elements in gst_parse_launch to make them easier to find. Something like: // Name the xvimagesinks in the parse string (just add "name=custom_name") char cmdline[1024]; snprintf( cmdline, 1024, "v4l2src device=%s ! tee name=t " "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " "videoconvert ! xvimagesink name=sink1 force-aspect-ratio=false " "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " "videoconvert !
xvimagesink name=sink2 force-aspect-ratio=false", argv[1] ); pipeline = gst_parse_launch(cmdline, NULL); // Get references to the xvimagesink elements, we called them sink1 and sink2 GstElement* sink1 = gst_bin_get_by_name(GST_BIN(pipeline), "sink1"); GstElement* sink2 = gst_bin_get_by_name(GST_BIN(pipeline), "sink2"); -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From gdenispro at gmail.com Tue Mar 30 15:38:17 2021 From: gdenispro at gmail.com (Guillaume Denis) Date: Tue, 30 Mar 2021 17:38:17 +0200 Subject: How to compile a base plugin Message-ID: Hello, I am new to GStreamer (and C) and I try to compile one of the base plugins to learn (by modifying). After having installed various dependencies (including libgstreamer-plugins-base1.0-dev) I get: $ gcc -c audioecho.c `pkg-config --cflags --libs gstreamer-1.0` In file included from audioecho.c:60: audioecho.h:71:1: warning: data definition has no type or storage class GST_ELEMENT_REGISTER_DECLARE(audioecho); ^~~~~~~~~~~~~~~~~~~~~~~~~~~~ audioecho.h:71:1: warning: type defaults to 'int' in declaration of 'GST_ELEMENT_REGISTER_DECLARE' [-Wimplicit-int] audioecho.h:71:1: warning: parameter names (without types) in function declaration audioecho.c:88:40: error: expected ')' before string constant GST_ELEMENT_REGISTER_DEFINE (audioecho, "audioecho", ^~~~~~~~~~~~ ) I guess GST_ELEMENT_REGISTER_DECLARE and GST_ELEMENT_REGISTER_DEFINE macros are not defined, but I don't know how to solve this. Many thanks for your help! Guillaume From guille.rodriguez at gmail.com Tue Mar 30 16:08:11 2021 From: guille.rodriguez at gmail.com (Guillermo Rodriguez Garcia) Date: Tue, 30 Mar 2021 18:08:11 +0200 Subject: Problem with jpegparse and timestamps Message-ID: Hi all, I have been using gstreamer to stream live video from mjpeg cameras with a pipeline like this: gst-launch-1.0 souphttpsrc is-live=true location= ! multipartdemux !
image/jpeg,framerate=0/1 ! jpegparse ! avdec_mjpeg ! videoconvert ! ximagesink This was working fine with 1.14.x. Recently I upgraded gstreamer to a more recent version and the above does not work well anymore. Video stutters or freezes. I ran git bisect and found the following commit to be the culprit: https://github.com/GStreamer/gst-plugins-bad/commit/5c52f866ad36784fd66a6bd74915bfb911915ae7 If I revert that commit, the pipeline works fine again. Is this the expected behaviour? Can someone shed some light? Thanks, Guillermo Rodriguez Garcia guille.rodriguez at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From olivier.crete at collabora.com Tue Mar 30 16:21:46 2021 From: olivier.crete at collabora.com (Olivier Crête) Date: Tue, 30 Mar 2021 12:21:46 -0400 Subject: How to compile a base plugin In-Reply-To: References: Message-ID: Hi, If you are compiling the git master version of a plugin, you need to compile it against the git master version of the GStreamer core. We don't support mixing versions of the various GStreamer repositories. This macro was recently added, you could revert the relevant commit for this specific plugin if you want to backport it. Olivier On Tue, 2021-03-30 at 17:38 +0200, Guillaume Denis wrote: > Hello, > > I am new to GStreamer (and C) and I try to compile one of the base > plugins to learn (by modifying).
After having installed various > dependencies (including libgstreamer-plugins-base1.0-dev) I get: > > > $ gcc -c audioecho.c `pkg-config --cflags --libs gstreamer-1.0` > In file included from audioecho.c:60: > audioecho.h:71:1: warning: data definition has no type or storage > class > GST_ELEMENT_REGISTER_DECLARE(audioecho); > ^~~~~~~~~~~~~~~~~~~~~~~~~~~~ > audioecho.h:71:1: warning: type defaults to 'int' in declaration of > 'GST_ELEMENT_REGISTER_DECLARE' [-Wimplicit-int] > audioecho.h:71:1: warning: parameter names (without types) in > function > declaration > audioecho.c:88:40: error: expected ')' before string constant > GST_ELEMENT_REGISTER_DEFINE (audioecho, "audioecho", > ^~~~~~~~~~~~ > ) > > I guess GST_ELEMENT_REGISTER_DECLARE and GST_ELEMENT_REGISTER_DEFINE > macros are not defined, but I don't know how to solve this. Many > thanks > for your help! Guillaume > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -- Olivier Crête olivier.crete at collabora.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: From olivier.crete at collabora.com Tue Mar 30 16:41:56 2021 From: olivier.crete at collabora.com (Olivier =?ISO-8859-1?Q?Cr=EAte?=) Date: Tue, 30 Mar 2021 12:41:56 -0400 Subject: Problem with jpegparse and timestamps In-Reply-To: References: Message-ID: <256f8292dc3a569520d932ca36126abf1778c572.camel@collabora.com> Hi, Let me know if the following patch fixes your issue: https://gitlab.freedesktop.org/ocrete/gst-plugins-bad/-/commit/7247af1d20262accc70cc3d03f04eda4e7146f50 Olivier On Tue, 2021-03-30 at 18:08 +0200, Guillermo Rodriguez Garcia wrote: > Hi all, > > I have been using gstreamer to stream live video from mjpeg cameras with a > pipeline like this: > > gst-launch-1.0 souphttpsrc is-live=true location= ! > multipartdemux ! image/jpeg,framerate=0/1 ! jpegparse ! avdec_mjpeg ! > videoconvert ! ximagesink > > This was working fine with 1.14.x. > Recently I upgraded gstreamer to a more recent version and the above does > not work well anymore. Video stutters or freezes. > > I ran git bisect and found the following commit to be the culprit: > > https://github.com/GStreamer/gst-plugins-bad/commit/5c52f866ad36784fd66a6bd74915bfb911915ae7 > > If I revert that commit, the pipeline works fine again. > > Is this the expected behaviour? Can someone shed some light? > > Thanks, > > Guillermo Rodriguez Garcia > guille.rodriguez at gmail.com > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel -- Olivier Crête olivier.crete at collabora.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: From jam at tigger.ws Tue Mar 30 23:20:42 2021 From: jam at tigger.ws (James) Date: Wed, 31 Mar 2021 07:20:42 +0800 Subject: newby help In-Reply-To: <1617114189668-0.post@n4.nabble.com> References: <1617029415308-0.post@n4.nabble.com> <75B83355-58E4-4A48-9156-44F18BCE56DA@tigger.ws> <1617114189668-0.post@n4.nabble.com> Message-ID: <6F135144-F2AA-47B8-B846-6870DF5CEE35@tigger.ws> > On 30 Mar 2021, at 10:23 pm, gotsring wrote: > > In this case, I don't think you need to iterate over all the elements > manually. You're just trying to grab references to the xvimagesink, correct? > If so, use gst_bin_get_by_name(), which returns a reference to the element > in question (or NULL). You can name your elements in gst_parse_launch to > make them easier to find. > > Something like: > // Name the xvimagesinks in the parse string (just add "name=custom_name") > char cmdline[1024]; > snprintf( > cmdline, 1024, > "v4l2src device=%s ! tee name=t " > "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " > "videoconvert ! xvimagesink name=sink1 force-aspect-ratio=false " > "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " > "videoconvert ! xvimagesink name=sink2 force-aspect-ratio=false", > argv[1] > ); > pipeline = gst_parse_launch(cmdline, NULL); > > // Get references to the xvimagesink elements, we called them sink1 and > sink2 > GstElement* sink1 = gst_bin_get_by_name(GST_BIN(pipeline), "sink1"); > GstElement* sink2 = gst_bin_get_by_name(GST_BIN(pipeline), "sink2"); Thank you lots James From emad.albloushi at gmail.com Wed Mar 31 02:28:28 2021 From: emad.albloushi at gmail.com (Emad Alblueshi) Date: Wed, 31 Mar 2021 05:28:28 +0300 Subject: rtsp server for videoconferencing Message-ID: Hi, it's mentioned on the GStreamer website that the RTSP server scales from small embedded devices to large-scale videoconferencing. Does it mean streaming audio/video from multiple parties in predefined endpoints?
I have checked the examples in the gitlab official repo and the email list history but didn't find a simple example for such a use case. Currently I'm working on a desktop application for educational purposes. Your guidance is highly appreciated. Thank you -------------- next part -------------- An HTML attachment was scrubbed... URL: From jam at tigger.ws Wed Mar 31 07:13:33 2021 From: jam at tigger.ws (James) Date: Wed, 31 Mar 2021 15:13:33 +0800 Subject: newby help In-Reply-To: <6F135144-F2AA-47B8-B846-6870DF5CEE35@tigger.ws> References: <1617029415308-0.post@n4.nabble.com> <75B83355-58E4-4A48-9156-44F18BCE56DA@tigger.ws> <1617114189668-0.post@n4.nabble.com> <6F135144-F2AA-47B8-B846-6870DF5CEE35@tigger.ws> Message-ID: <551E6370-9362-4304-8D92-CC801D4A9A51@tigger.ws> > On 31 Mar 2021, at 7:20 am, James wrote: > > > >> On 30 Mar 2021, at 10:23 pm, gotsring wrote: >> >> In this case, I don't think you need to iterate over all the elements >> manually. You're just trying to grab references to the xvimagesink, correct? >> If so, use gst_bin_get_by_name(), which returns a reference to the element >> in question (or NULL). You can name your elements in gst_parse_launch to >> make them easier to find. >> >> Something like: >> // Name the xvimagesinks in the parse string (just add "name=custom_name") >> char cmdline[1024]; >> snprintf( >> cmdline, 1024, >> "v4l2src device=%s ! tee name=t " >> "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " >> "videoconvert ! xvimagesink name=sink1 force-aspect-ratio=false " >> "t. ! queue ! videoscale ! video/x-raw,width=768,height=576 ! " >> "videoconvert !
xvimagesink name=sink2 force-aspect-ratio=false", >> argv[1] >> ); >> pipeline = gst_parse_launch(cmdline, NULL); >> >> // Get references to the xvimagesink elements, we called them sink1 and >> sink2 >> GstElement* sink1 = gst_bin_get_by_name(GST_BIN(pipeline), "sink1"); >> GstElement* sink2 = gst_bin_get_by_name(GST_BIN(pipeline), "sink2"); > > Thank you lots

I want to re-parent a window. The only googled results I can find are old and use deprecated APIs. I tried

GstVideoOverlay* ov1 = (GstVideoOverlay*) sink1;
GstVideoOverlay* ov2 = (GstVideoOverlay*) sink2;
gst_video_overlay_set_window_handle (ov1, (guintptr) preview1Wid);

based on https://gstreamer.freedesktop.org/documentation/video/gstvideooverlay.html?gi-language=c#gst_video_overlay_set_window_handle but was rewarded with

gcc myalt.c -o myalt `pkg-config --cflags --libs gstreamer-1.0`
/usr/lib64/gcc/x86_64-suse-linux/7/../../../../x86_64-suse-linux/bin/ld: /tmp/ccT1P5wY.o: in function `main':
myalt.c:(.text+0x316): undefined reference to `gst_video_overlay_set_window_handle'
collect2: error: ld returned 1 exit status
make: *** [Makefile:9: myalt] Error 1

Anybody please James -------------- next part -------------- An HTML attachment was scrubbed... URL: From guille.rodriguez at gmail.com Wed Mar 31 07:46:59 2021 From: guille.rodriguez at gmail.com (Guillermo Rodriguez Garcia) Date: Wed, 31 Mar 2021 09:46:59 +0200 Subject: Problem with jpegparse and timestamps In-Reply-To: <256f8292dc3a569520d932ca36126abf1778c572.camel@collabora.com> References: <256f8292dc3a569520d932ca36126abf1778c572.camel@collabora.com> Message-ID: Hello, Yes indeed, I confirm that this patch fixes the problem. Thank you for the quick response!
Guillermo El mar, 30 mar 2021 a las 18:42, Olivier Crête () escribió: > Hi, > > Let me know if the following patch fixes your issue: > > > https://gitlab.freedesktop.org/ocrete/gst-plugins-bad/-/commit/7247af1d20262accc70cc3d03f04eda4e7146f50 > > Olivier > > On Tue, 2021-03-30 at 18:08 +0200, Guillermo Rodriguez Garcia wrote: > > Hi all, > > I have been using gstreamer to stream live video from mjpeg cameras with a > pipeline like this: > > gst-launch-1.0 souphttpsrc is-live=true location= ! > multipartdemux ! image/jpeg,framerate=0/1 ! jpegparse ! avdec_mjpeg ! > videoconvert ! ximagesink > > This was working fine with 1.14.x. > Recently I upgraded gstreamer to a more recent version and the above does > not work well anymore. Video stutters or freezes. > > I ran git bisect and found the following commit to be the culprit: > > > https://github.com/GStreamer/gst-plugins-bad/commit/5c52f866ad36784fd66a6bd74915bfb911915ae7 > > If I revert that commit, the pipeline works fine again. > > Is this the expected behaviour? Can someone shed some light? > > Thanks, > > Guillermo Rodriguez Garcia > guille.rodriguez at gmail.com > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > > > -- > > Olivier Crête > olivier.crete at collabora.com > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel > -- Guillermo Rodriguez Garcia guille.rodriguez at gmail.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: From javier.carrasco at wolfvision.net Wed Mar 31 08:16:41 2021 From: javier.carrasco at wolfvision.net (Javiku) Date: Wed, 31 Mar 2021 03:16:41 -0500 (CDT) Subject: force keyframe ignored (v4l2h264enc) In-Reply-To: References: <1616487576901-0.post@n4.nabble.com> <1616513041641-0.post@n4.nabble.com> Message-ID: <1617178601201-0.post@n4.nabble.com> I don't know if the task is complicated for someone who has some experience but it is definitely over my head :P I am new to GStreamer and I have to admit that the code you linked overwhelmed me. On the other hand I have plenty of time to test and I will be pleased to help somehow. You can contact me directly to test new code that is not ready to be released or just post messages within this thread if the implementation might be useful for other people. Thank you, best regards, Javiku -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From arturs.valenieks at gmail.com Wed Mar 31 09:22:11 2021 From: arturs.valenieks at gmail.com (Townsheriff) Date: Wed, 31 Mar 2021 04:22:11 -0500 (CDT) Subject: Gst Device Provider Bus Callbacks Message-ID: <1617182531123-0.post@n4.nabble.com> Hey, I'm writing a small application where I want to start a pipeline when a device is connected. To detect devices I'm using the DeviceMonitor API. I'm having trouble receiving an event from `gst_bus_add_watch_full`, but it works fine if I do it with timed iteration `gst_bus_timed_pop`. I was following this tutorial. What am I doing wrong here?
fn main() -> Result<(), Error> {
    gst::init();

    let monitor = gst::DeviceMonitor::new();
    monitor.set_show_all_devices(true);

    let bus = monitor.get_bus();
    let r = bus.add_watch(|bus, msg| {
        // never called
        println!("WATCH Message {:?}", &msg);
        Continue(true)
    });
    println!("result {:?}", r);

    let caps = gst::Caps::new_any();
    monitor.add_filter(None, Some(&caps));
    monitor.start();

    monitor.get_providers()
        .into_iter()
        .for_each(|provider| println!("provider {:?}", String::from(provider)));

    let devices = monitor.get_devices();
    println!("devices {:?}", &devices);

    for msg in bus.iter_timed(gst::CLOCK_TIME_NONE) {
        // works fine
        println!("Message iter {:?}", &msg);
    }

    Ok(())
}

Cheers! -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From ottavio at campana.vi.it Wed Mar 31 13:59:49 2021 From: ottavio at campana.vi.it (Ottavio Campana) Date: Wed, 31 Mar 2021 15:59:49 +0200 Subject: Do you know a program for measuring RTSP streaming quality? Message-ID: I would like to measure the quality of an RTSP stream from a gstreamer based server. For quality I mean jitter, delay, packet loss. I don't really care about decoding. Do you know if there is a program, hopefully based on gstreamer, that does it? Thank you, Ottavio -- Non c'è più forza nella normalità, c'è solo monotonia -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicolas at ndufresne.ca Wed Mar 31 15:31:41 2021 From: nicolas at ndufresne.ca (Nicolas Dufresne) Date: Wed, 31 Mar 2021 11:31:41 -0400 Subject: Do you know a program for measuring RTSP streaming quality? In-Reply-To: References: Message-ID: <6c12715e14ada18910e11b96d2f2d0079c61a4a9.camel@ndufresne.ca> Le mercredi 31 mars 2021 à 15:59 +0200, Ottavio Campana a écrit : > I would like to measure the quality of an RTSP stream from a gstreamer based > server. For quality I mean jitter, delay, packet loss. I don't really care > about decoding. > > Do you know if there is a program, hopefully based on gstreamer, that does it?
There are various "stats" properties for that. With the debug tool gst-launch-1.0, adding the -v option is often sufficient to get them traced when updated. Though I'm not aware of an open source tool that would automatically gather and display this. > > Thank you, > > Ottavio > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.freedesktop.org > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel From letizia.mariotti at gmail.com Wed Mar 31 16:16:46 2021 From: letizia.mariotti at gmail.com (Letty) Date: Wed, 31 Mar 2021 11:16:46 -0500 (CDT) Subject: Accessing gstreamer timestamp from python Message-ID: <1617207406995-0.post@n4.nabble.com> Hi all, I am a beginner with gstreamer! I successfully managed to create a gstreamer pipeline to read frames into Python using OpenCV cv2.VideoCapture. I understand that gstreamer has a buffer through which it is possible to read the timestamp at which the current frame was taken, but it is not clear to me if it is possible to access this information through python and how. For completeness, this is my gstreamer pipeline: which then I call through VideoCapture -- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From JamesMartin_2000 at yahoo.com Wed Mar 31 20:08:05 2021 From: JamesMartin_2000 at yahoo.com (jamesM) Date: Wed, 31 Mar 2021 15:08:05 -0500 (CDT) Subject: Play V4l2 video source on framebuffer Message-ID: <1617221285007-0.post@n4.nabble.com> Hi, I installed Ubuntu Server 20.04 (without GUI) on my desktop PC and also installed a video capture card on my PC.
I want to play a real-time PAL camera frame (720×576) on the framebuffer but I can't.

>> v4l2-ctl --all
Driver Info:
    Driver name      : tw686x
    Card type        : tw6864
    Bus info         : PCI:0000:01:00.0
    Driver version   : 5.4.94
    Capabilities     : 0x85200001
        Video Capture
        Read/Write
        Streaming
        Extended Pix Format
        Device Capabilities
    Device Caps      : 0x05200001
        Video Capture
        Read/Write
        Streaming
        Extended Pix Format
Priority: 2
Video input : 0 (Composite0: ok)
Video Standard = 0x0000b000
    NTSC-M/M-JP/M-KR
Format Video Capture:
    Width/Height      : 720/480
    Pixel Format      : 'UYVY' (UYVY 4:2:2)
    Field             : Interlaced
    Bytes per Line    : 1440
    Size Image        : 691200
    Colorspace        : SMPTE 170M
    Transfer Function : Default (maps to Rec. 709)
    YCbCr/HSV Encoding: Default (maps to ITU-R 601)
    Quantization      : Default (maps to Limited Range)
    Flags             :
Streaming Parameters Video Capture:
    Capabilities     : timeperframe
    Frames per second: 30.000 (30/1)
    Read buffers     : 3
User Controls
    brightness 0x00980900 (int) : min=-128 max=127 step=1 default=0 value=0 flags=slider
    contrast   0x00980901 (int) : min=0 max=255 step=1 default=100 value=100 flags=slider
    saturation 0x00980902 (int) : min=0 max=255 step=1 default=128 value=128 flags=slider
    hue        0x00980903 (int) : min=-128 max=127 step=1 default=0 value=0 flags=slider

>> gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw,width=640,height=576,format=(string)YUY2" ! fbdevsink="/dev/fb0"
WARNING: erroneous pipeline: could not link v4l2src0 to fbdevsink0, fbdevsink0 can't handle video/x-raw,width=(int)720,height=(int)576,(string)YUY2

It seems the gstreamer pipeline needs some filters. Would you please help me with what kind of filters I should use?
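The missing "filters" in the question above are most likely converter elements. A minimal, untested sketch (it assumes the stock videoconvert/videoscale elements are available, and that fbdevsink selects its framebuffer through its device property; `fbdevsink="/dev/fb0"` is not valid gst-launch syntax):

```shell
# Untested sketch: let v4l2src produce its native caps (UYVY according to
# the v4l2-ctl output above), then convert the colorspace and scale so
# fbdevsink can negotiate whatever format and size the framebuffer accepts.
gst-launch-1.0 v4l2src device=/dev/video0 ! \
    videoconvert ! videoscale ! \
    fbdevsink device=/dev/fb0
```

The fixed 640×576 YUY2 caps filter in the original pipeline also looks suspicious, since the driver reports 720×480 UYVY; leaving the caps unconstrained and letting the converters negotiate is the simpler first step. (Pipeline only; it needs a capture card and framebuffer to run, so no automated test is attached.)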
-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/ From mikeybards at gmail.com Wed Mar 31 21:59:49 2021 From: mikeybards at gmail.com (mdb) Date: Wed, 31 Mar 2021 16:59:49 -0500 (CDT) Subject: Automatically set pipeline to paused after X frames Message-ID: <1617227989982-0.post@n4.nabble.com> I set up the following pipeline v4l2src device=/dev/video0 num-buffers=5 ! filesink location=/tmp/foo.bar in a function, which returns a GStreamer element (writing in OCaml) in paused state. I then set the state to playing manually and see the expected files in /tmp. However, the pipeline remains in playing after the 5 buffers have been stored in foo.bar. I would like the pipeline to reset itself to stop after the 5 buffers have been collected. I have not found anything to achieve this yet; I briefly read through position tracking and seeking & pipeline manipulation. I am comfortable stubbing C functions in OCaml, I just cannot figure out what to stub.
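For the last question, GStreamer itself signals the stopping point: with num-buffers=5, v4l2src sends EOS after the fifth buffer, and the pipeline then posts a GST_MESSAGE_EOS on its bus. A sketch in C (not the poster's OCaml; only core GStreamer calls are used, and the pipeline construction itself is assumed to happen elsewhere) of the piece that could be stubbed:

```c
/* Sketch: run a pipeline until the EOS that num-buffers triggers,
 * then drop to GST_STATE_NULL. GStreamer has no separate "stopped"
 * state; NULL (or READY) is the idle state. */
#include <gst/gst.h>

static void
run_until_eos (GstElement *pipeline)
{
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg;

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until either EOS or an error is posted on the bus. */
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
}
```

For a non-blocking variant, gst_bus_add_watch() can install a callback that checks for GST_MESSAGE_EOS and changes the state from there, which requires a running GLib main loop. (The sketch needs a live GStreamer installation and a capture device, so no automated test is attached.)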