Probing the element UDP source when no data exists on the port.
venkateshkuppan26 at gmail.com
Wed Sep 26 13:24:34 UTC 2018
Hello James and others,
I am trying to simulate delay in UDP streaming using the Linux tool netem
<https://wiki.linuxfoundation.org/networking/netem>.
I use the following command to induce a latency of 500 ms (on the iMX6 side):
tc qdisc add dev eth0 root netem delay 500ms
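For completeness, the standard netem invocations for adding, changing, and
removing the delay look like this (interface name `eth0` as in the original;
these must be run as root):

```shell
# Add a fixed 500 ms delay to all egress packets on eth0.
tc qdisc add dev eth0 root netem delay 500ms

# Change the rule in place (e.g. to a different delay) without removing it.
tc qdisc change dev eth0 root netem delay 250ms

# Remove the netem qdisc when done testing.
tc qdisc del dev eth0 root netem
```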
This delay is observed when I ping between the two devices, connected via LAN.
I am transmitting video packets from iMX6 to PC, where I have configured
delay of 500 ms for the eth0 port of iMX6.
gst-launch-1.0 -v imxv4l2videosrc device=/dev/video1 ! imxvpuenc_h264
bitrate=500 ! h264parse ! rtph264pay ! udpsink host=192.168.1.11 port=xxxx
I have configured a timeout of 10 ms (10000000 ns) for the UDP source:
udpsrc port=xxxx timeout=10000000 ! rtph264depay ! h264parse ! avdec_h264
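As a side note, `udpsrc` measures its `timeout` property (in nanoseconds) between received packets and, when it expires, posts a `GstUdpSrcTimeout` element message on the bus. One quick way to see whether that message is ever posted, without writing application code, is the `-m` flag of `gst-launch-1.0`, which prints bus messages (pipeline sketched from the original, port left as written, sink assumed):

```shell
# -m prints all bus messages, so a GstUdpSrcTimeout element message from
# udpsrc will show up in the output when the timeout fires.
gst-launch-1.0 -m udpsrc port=xxxx timeout=10000000 ! rtph264depay ! \
    h264parse ! avdec_h264 ! autovideosink
```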
Since I configured a delay of 500 ms at the transmit side (iMX6), I was
expecting that at the receiving side I would receive each frame with a delay
of 500 ms, which would be detected by the udpsrc timeout. However, this is
not the case.
When I completely stop sending packets from iMX6, I can see the timeout
occurring and the callback called. So the timeout is called, when there are
no packets at all, and not when there is a delay in receiving the packets.
To sum up the observations:
1. the timeout is called when the iMX6 completely stops sending udp packets.
2. the timeout is not called when the iMX6 sends delayed udp packets.
3. the video rendered at the receiving side shows a visible delay when the
500 ms delay is configured, which is expected.
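These observations are consistent with the timeout being an inter-packet gap, not an absolute lateness: a constant 500 ms delay shifts every arrival by the same amount, so the gaps between consecutive packets stay at the sending interval and never exceed 10 ms. A minimal sketch of that arithmetic (packet interval and names invented for illustration):

```python
FRAME_INTERVAL_S = 0.005  # assumed: one packet every 5 ms
NETWORK_DELAY_S = 0.500   # netem delay from the transmit side
TIMEOUT_S = 0.010         # udpsrc timeout of 10 ms

def arrival_times(n_packets, frame_interval, network_delay):
    """Arrival time of each packet: its send time shifted by the constant delay."""
    return [i * frame_interval + network_delay for i in range(n_packets)]

def timeout_fires(arrivals, timeout):
    """The timeout fires when the gap BETWEEN consecutive packets exceeds
    `timeout`, not when packets are merely late in absolute terms."""
    gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
    return any(gap > timeout for gap in gaps)

arrivals = arrival_times(100, FRAME_INTERVAL_S, NETWORK_DELAY_S)
print(timeout_fires(arrivals, TIMEOUT_S))   # False: every gap is still 5 ms

# Once the sender stops, the gap to the "next" packet grows without bound,
# which is exactly when the timeout is observed in practice.
stopped = arrivals + [arrivals[-1] + 1.0]   # a 1 s silence after the stream
print(timeout_fires(stopped, TIMEOUT_S))    # True
```

This matches observations 1 and 2: a steady-but-delayed stream keeps the inter-packet gap small, while stopping the stream opens a gap larger than the timeout.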
Does anyone know why the `udpsrc` element does not trigger the timeout for
delayed packets, but only when packets stop being sent entirely?
Sent from: http://gstreamer-devel.966125.n4.nabble.com/