Splitting source to measure latency

Wolfgang Grandegger wg at grandegger.com
Tue May 29 12:51:27 UTC 2018


Hello,

On 29.05.2018 at 12:43, James Cameron wrote:
> On Tue, May 29, 2018 at 12:27:00PM +0200, Wolfgang Grandegger wrote:
>> Hello,
>>
>> I want to split the video source to measure the latency between the
>> video shown on a local display and on a remote one, using:
>>
>> # gst-launch-1.0 -v videotestsrc \
>>   ! video/x-raw,format=RGBx,width=800,height=600,framerate=30/1 \
>>   ! timeoverlay ! tee name=t \
>>   t. ! queue ! kmssink \
>>   t. ! queue ! vaapipostproc ! vaapijpegenc quality=90 \
>>      ! rtpjpegpay ! udpsink host=192.168.0.254 port=50004
>>
>> I'm interested in the latency due to:
>>
>>  vaapipostproc ! vaapijpegenc quality=90 ! rtpjpegpay \
>>  ! udpsink host=192.168.0.254 port=50004
>>
>> Then I take a picture of both the local and the remote display. I see
>> that the time shown on the remote side lags the local one by almost
>> exactly *one* second. How does the "tee" deliver the buffers to the
>> two branches? One by one? Is it possible at all to measure the latency
>> using "tee"? Have I missed something else?
> 
> I'm only new here, but does the exactly-one-second delay change if
> you give each queue a max-size-time other than the default of one
> second?
> 
> Could you examine current-level-time of each queue?

Setting "max-size-time" to 0 didn't help. My problem was on the receiver
side, as pointed out by Nicolas. With VLC settings from [1] I was able
to reduce the latency below 100ms.
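
For anyone reproducing this with GStreamer instead of VLC, a minimal
low-latency receiver along these lines should also work (a sketch; the
port and the payload type 26 are assumed to match the rtpjpegpay and
udpsink defaults of the sender pipeline above):

  gst-launch-1.0 udpsrc port=50004 \
      caps="application/x-rtp,media=video,encoding-name=JPEG,clock-rate=90000,payload=26" \
    ! rtpjpegdepay ! jpegdec ! autovideosink sync=false

Setting sync=false on the sink trades clock synchronization for the
lowest possible display latency.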
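
Independent of the receiver, the time spent in the encode path itself
can be measured with the latency tracer (a sketch, assuming a GStreamer
new enough to ship it, i.e. 1.8 or later):

  GST_TRACERS=latency GST_DEBUG=GST_TRACER:7 gst-launch-1.0 \
    videotestsrc ! timeoverlay ! vaapipostproc ! vaapijpegenc quality=90 \
    ! rtpjpegpay ! udpsink host=192.168.0.254 port=50004

The tracer prints the measured source-to-sink latency to stderr while
the pipeline runs.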

[1] https://www.slac.stanford.edu/grp/cd/soft/unix/VLC-setup.html

Wolfgang.
