Usage of rtponviftimestamp in gst-rtsp-server

Howling wong watertreader at hotmail.com
Fri Mar 4 05:51:01 UTC 2022


Hi

DESCRIPTION
I have some problems using the ONVIF timestamp plugin, rtponviftimestamp. The plugin is used to extend RTP packets with NTP timestamps. I have added it for both payloads 96 and 97. The source, udpsrc, receives its buffers from a udpsink running in another thread, which carries an MPEG-TS stream.
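
For context, the sender side pushes the MPEG-TS stream over RTP to port 2346. Roughly, it looks like the sketch below; the elements in front of the muxer are placeholders for illustration, not my exact code:

    /* Sender-side sketch feeding the udpsrc on port 2346.
     * videotestsrc/x264enc stand in for the real encoder chain. */
    gchar *sender =
        g_strdup_printf ("videotestsrc ! x264enc ! h264parse ! mpegtsmux "
                         "! rtpmp2tpay ! udpsink host=127.0.0.1 port=2346");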

My pipeline with rtponviftimestamp (I am using GStreamer 1.14):

gchar *pipeline =
    g_strdup_printf ("( udpsrc port=2346 name=src0 "
                     "! queue ! rtpmp2tdepay ! video/mpegts, systemstream=true, packetsize=188 ! aiurdemux name=d "
                     "d.video_0 ! queue ! h264parse name=video ! rtph264pay config-interval=1 name=pay0 pt=96 ! rtponviftimestamp name=tsv "
                     "d.audio_0 ! queue ! aacparse name=audio ! rtpmp4gpay name=pay1 pt=97 ! rtponviftimestamp name=tsa )");
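
As a side thought, I was also wondering whether the rtponviftimestamp element itself should carry the pay0/pay1 name, so that the server picks up its (still unlinked) src pad rather than the payloader's. This is only a guess on my part and not something I have verified to work:

    /* Untested guess: let rtponviftimestamp carry the pay0/pay1 names so
     * that the element the server looks for still has an unlinked src pad. */
    gchar *pipeline2 =
        g_strdup_printf ("( udpsrc port=2346 name=src0 "
                         "! queue ! rtpmp2tdepay ! video/mpegts, systemstream=true, packetsize=188 ! aiurdemux name=d "
                         "d.video_0 ! queue ! h264parse ! rtph264pay config-interval=1 pt=96 ! rtponviftimestamp name=pay0 "
                         "d.audio_0 ! queue ! aacparse ! rtpmp4gpay pt=97 ! rtponviftimestamp name=pay1 )");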

PROBLEM
When I try to play the stream, the following errors occur:

(VideoServerTest2:3782): GStreamer-CRITICAL **: gst_ghost_pad_new: assertion '!gst_pad_is_linked (target)' failed

(VideoServerTest2:3782): GStreamer-CRITICAL **: gst_pad_set_active: assertion 'GST_IS_PAD (pad)' failed

(VideoServerTest2:3782): GStreamer-CRITICAL **: gst_element_add_pad: assertion 'GST_IS_PAD (pad)' failed

** (VideoServerTest2:3782): CRITICAL **: gst_rtsp_stream_new: assertion 'GST_IS_PAD (pad)' failed

** (VideoServerTest2:3782): CRITICAL **: gst_rtsp_stream_set_multicast_iface: assertion 'GST_IS_RTSP_STREAM (stream)' failed

** (VideoServerTest2:3782): CRITICAL **: gst_rtsp_stream_set_profiles: assertion 'GST_IS_RTSP_STREAM (stream)' failed

** (VideoServerTest2:3782): CRITICAL **: gst_rtsp_stream_set_protocols: assertion 'GST_IS_RTSP_STREAM (stream)' failed
Segmentation fault (core dumped)

EFFORT

I tried to trace the error, and the problem seems to be in gst-rtsp-media.c: the "gst_rtsp_media_collect_streams"
function, which in turn calls "gst_rtsp_media_create_stream", which tries to create a ghost pad to link the payloader to the stream.
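
If my reading is correct, the assertion can be reproduced outside the server, because the src pad of pay0 is already linked to rtponviftimestamp when the server tries to ghost it. This is just my guess at the mechanism, sketched as a standalone test:

    /* Standalone sketch of what I think happens inside
     * gst_rtsp_media_create_stream(): ghosting an already-linked pad. */
    #include <gst/gst.h>

    int
    main (int argc, char *argv[])
    {
      gst_init (&argc, &argv);

      GstElement *bin = gst_bin_new ("media");
      GstElement *pay = gst_element_factory_make ("rtph264pay", "pay0");
      GstElement *ts = gst_element_factory_make ("rtponviftimestamp", "tsv");

      gst_bin_add_many (GST_BIN (bin), pay, ts, NULL);
      gst_element_link (pay, ts);

      /* pay0's src pad is already linked, so ghosting it trips the same
       * assertion and gst_ghost_pad_new() returns NULL, which would then
       * cascade into the later GST_IS_PAD / GST_IS_RTSP_STREAM criticals. */
      GstPad *src = gst_element_get_static_pad (pay, "src");
      GstPad *ghost = gst_ghost_pad_new ("src0", src);

      g_print ("ghost pad: %p\n", (void *) ghost);

      gst_object_unref (src);
      gst_object_unref (bin);
      return 0;
    }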

When I remove rtponviftimestamp from the pipeline, everything works fine.

Queries

  1.  Is this the correct way to use rtponviftimestamp?
  2.  Is it mandatory to extend the RTP packets according to the ONVIF specification? I have tried with some players, and it seems that extending the RTP packets is not required.
  3.  Is there another way to carry audio and video together in one stream? I thought of doing this so that audio and video stay in sync with each other at the source end.

Thanks

Regards
