Why can't I receive an RTP video stream produced by appsrc?
t.i.m at zen.co.uk
Thu Aug 4 23:48:14 UTC 2022
> I'm trying to use the appsrc element with rtpbin and udpsink to
> create an RTP sender using the VP8 codec (vp8enc).
> This program creates a fake video stream by switching black and white
> frames every 100 milliseconds. You can see the video by uncommenting
> the macro USE_AUTOVIDEOSINK.
> receiver.sh (using gst-launch-1.0):
> When I run the receiver, I only get the first frame. It doesn't
> blink. What am I doing wrong?
> PS: I'm using libgstreamer 1.20.3 from Ubuntu 22.04.
I haven't looked at the code in detail or tried to run it, but in your
broadcaster you don't seem to set buffer timestamps on the buffers you
push into appsrc, e.g. with something like
  GST_BUFFER_PTS (buffer) = gst_util_uint64_scale_round (frame_number,
      GST_SECOND * FPS_DENOM, FPS_NUM);
or try setting appsrc do-timestamp=true and perhaps also set appsrc
min-latency=40000000 (the frame duration in nanoseconds at 25 fps).
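As a configuration sketch, those properties can be set on the appsrc element like this ("src" is assumed to be the appsrc instance from your pipeline; setting format to GST_FORMAT_TIME is the usual companion setting so appsrc operates on timestamps):

```c
/* Let appsrc stamp each incoming buffer with the running time at which
 * it arrives (do-timestamp), and report one frame of latency at 25 fps. */
g_object_set (G_OBJECT (src),
    "do-timestamp", TRUE,
    "min-latency", (gint64) 40000000,  /* 40 ms = one frame at 25 fps */
    "format", GST_FORMAT_TIME,         /* operate in time format */
    NULL);
```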
If your thread produces data in a timed manner (as it does, given the
usleep) you also don't really need to bother with the need-data and
enough-data signals. Just produce your buffer and push it into appsrc
when it's ready (and make sure it gets timestamped one way or another).
Your usleep will throttle data production automatically.
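Putting both suggestions together, the producer thread could be reduced to something like the sketch below (under the same 25 fps assumption; make_frame() is a hypothetical helper standing in for your black/white frame generator, and gst_app_src_push_buffer takes ownership of the buffer):

```c
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <unistd.h>

/* Hypothetical helper, assumed to allocate and fill one raw video frame. */
extern GstBuffer *make_frame (guint64 frame_number);

static void
produce_frames (GstElement *appsrc, guint64 num_frames)
{
  for (guint64 frame = 0; frame < num_frames; frame++) {
    GstBuffer *buffer = make_frame (frame);

    /* One frame every 40 ms at 25 fps. */
    GST_BUFFER_PTS (buffer) =
        gst_util_uint64_scale_round (frame, GST_SECOND, 25);
    GST_BUFFER_DURATION (buffer) =
        gst_util_uint64_scale_round (1, GST_SECOND, 25);

    /* push_buffer takes ownership; stop on flushing/error. */
    if (gst_app_src_push_buffer (GST_APP_SRC (appsrc), buffer) != GST_FLOW_OK)
      break;

    usleep (40 * 1000);  /* ~40 ms between frames, throttles production */
  }
  gst_app_src_end_of_stream (GST_APP_SRC (appsrc));
}
```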