Splitting source to measure latency

James Cameron quozl at laptop.org
Tue May 29 10:43:17 UTC 2018


On Tue, May 29, 2018 at 12:27:00PM +0200, Wolfgang Grandegger wrote:
> Hello,
> 
> I want to split the video source to measure the latency between a video
> generated and displayed on a local and remote display using:
> 
> # gst-launch-1.0 -v videotestsrc \
>   ! video/x-raw,format=RGBx,width=800,height=600,framerate=30/1 \
>   ! timeoverlay ! tee name=t \
>   t. ! queue ! kmssink \
>   t. ! queue ! vaapipostproc ! vaapijpegenc quality=90 \
>      ! rtpjpegpay ! udpsink host=192.168.0.254 port=50004
> 
> I'm interested in the latency due to:
> 
>  vaapipostproc ! vaapijpegenc quality=90 ! rtpjpegpay \
>  ! udpsink host=192.168.0.254 port=50004
> 
> Then I take a picture of both the local and the remote display. I see
> that the time shown on the remote side is delayed by more or less
> exactly *one* second. How does the "tee" deliver the buffers to both
> branches? One by one? Is it possible at all to measure the latency
> using "tee"? Have I missed something else?

I'm only new here, but does your delay of exactly one second change if
you give each queue a max-size-time other than the default of one second?
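For example, a sketch of the same pipeline with an explicit, shorter
max-size-time on each queue (100 ms here; the value is only illustrative,
and max-size-time is given in nanoseconds):

```shell
gst-launch-1.0 -v videotestsrc \
  ! video/x-raw,format=RGBx,width=800,height=600,framerate=30/1 \
  ! timeoverlay ! tee name=t \
  t. ! queue max-size-time=100000000 ! kmssink \
  t. ! queue max-size-time=100000000 ! vaapipostproc \
     ! vaapijpegenc quality=90 \
     ! rtpjpegpay ! udpsink host=192.168.0.254 port=50004
```

If the remote delay shrinks to roughly 100 ms with this change, the one
second you measured was the queue filling to its default limit rather
than encoder or network latency.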

Could you examine the current-level-time property of each queue?
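One way to watch the fill levels at runtime, without writing any
application code, is to raise the queue element's debug output; the
queue logs its current levels in buffers, bytes and nanoseconds as data
flows. The "queue_dataflow" category name and level 6 (LOG) are my
assumptions from the queue sources; adjust if your build logs under a
different name or level:

```shell
GST_DEBUG=queue_dataflow:6 gst-launch-1.0 videotestsrc \
  ! video/x-raw,format=RGBx,width=800,height=600,framerate=30/1 \
  ! timeoverlay ! tee name=t \
  t. ! queue name=local ! kmssink \
  t. ! queue name=net ! vaapipostproc ! vaapijpegenc quality=90 \
     ! rtpjpegpay ! udpsink host=192.168.0.254 port=50004
```

Naming the queues (name=local, name=net; my choice of names) makes it
easy to tell the two branches apart in the log and see which one is
holding close to a second of data.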

-- 
James Cameron
http://quozl.netrek.org/


More information about the gstreamer-devel mailing list