[Spice-devel] [spice] Enable mm_time adjustments on startup

Francois Gouget fgouget at codeweavers.com
Thu May 2 07:57:23 UTC 2019


On Wed, 17 Apr 2019, Snir Sheriber wrote:
[...]
> > The reason for that is that while a minimum 400 ms latency is fine when
> > playing a YouTube video [1], it's very annoying when the whole desktop
> > is being streamed, either through the streaming agent or through the
> > future Virgl remote access, because then it translates into a 400 ms
> 
> Are you working on something like that (remote virgl)?

Yes. See (though I need to refresh these GitHub branches):

https://lists.freedesktop.org/archives/spice-devel/2019-January/047329.html


> Notice that currently there's small hacky patch on client to ignore latency
> when it's full
> screen streaming and there is no audio playback.
> 
> (d047b2fb7f5d492d6c49f589ba5ff862c6b115da)

Right. It never actually had any effect in my tests.
* When testing the fullscreen streaming in QEmu (i.e. when 
  streaming_mode is true) I had to use the mjpeg encoding because there 
  seems to be some conflict between QEmu and GStreamer on Debian. But 
  the patch has no effect on the builtin mjpeg decoder because the only 
  thing mjpeg_decoder_queue_frame() uses the latency for is to drop 
  frames when it is negative. To determine when to display a frame it 
  uses the frame's timestamp.
* The rest of the time I'm testing by running an OpenGL application in 
  a regular session, so streaming_mode is false, the GstVideoOverlay is 
  not used, and thus the patch has no effect.

  That's because spice_gst_decoder_queue_frame() uses the latency both 
  to drop frames when it is negative and to determine the deadline for 
  decoding the frame. When not using the video overlay the decoded 
  frames are then queued until it is time to display them, according 
  to their mm_time timestamp, which is not affected by the latency 
  (see the sketch after this list).
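
To make that concrete, here is a minimal sketch of the pattern both 
decoders follow (this is not the actual Spice code: frame_t, 
now_mm_time() and the other names are made up). The latency only 
decides whether an incoming frame gets dropped; the display time comes 
from the frame's own mm_time timestamp.

  #include <stdint.h>
  #include <stdio.h>

  /* Hypothetical frame descriptor; the real Spice structures differ. */
  typedef struct {
      uint32_t mm_time;   /* server multimedia time at which to display */
  } frame_t;

  /* Stub standing in for the client's estimate of the server's
   * multimedia clock (which already accounts for the latency). */
  static uint32_t now_mm_time(void) { return 1000; }

  static void drop_frame(frame_t *f)
  {
      printf("drop frame due at %u\n", f->mm_time);
  }

  static void queue_for_decoding(frame_t *f)
  {
      printf("queue frame, display when the clock reaches %u\n",
             f->mm_time);
  }

  /* The latency only decides whether an incoming frame is dropped;
   * the display time is the frame's own mm_time timestamp. */
  static void decoder_queue_frame(frame_t *frame)
  {
      int32_t latency = (int32_t)(frame->mm_time - now_mm_time());

      if (latency < 0) {               /* already late on arrival */
          drop_frame(frame);
          return;
      }
      queue_for_decoding(frame);
  }

  int main(void)
  {
      frame_t late    = { .mm_time =  980 };  /* past its display time */
      frame_t on_time = { .mm_time = 1040 };  /* 40 ms of margin left */

      decoder_queue_frame(&late);
      decoder_queue_frame(&on_time);
      return 0;
  }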

That said, I'm not sure ignoring the frames' mm_time timestamp is a 
good idea, as it means the network jitter will translate directly into 
framerate jitter (e.g. in OpenGL applications or in silent movies).
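
As a toy illustration (plain C with made-up numbers, nothing 
Spice-specific): with frames captured every 33 ms but delivered with a 
varying network delay, displaying them on arrival passes the delay 
jitter straight into the inter-frame interval, whereas displaying them 
at their mm_time plus a fixed latency keeps the interval constant.

  #include <stdio.h>

  int main(void)
  {
      /* Made-up numbers: mm_time of 4 frames captured every 33 ms and
       * the network delay each one happens to experience. */
      const int capture[] = {0, 33, 66, 99};
      const int delay[]   = {20, 45, 15, 60};
      const int latency   = 80;   /* fixed display latency in ms */

      for (int i = 1; i < 4; i++) {
          int arrive_prev = capture[i - 1] + delay[i - 1];
          int arrive_cur  = capture[i] + delay[i];
          /* Interval if frames are shown as soon as they arrive. */
          int on_arrival  = arrive_cur - arrive_prev;
          /* Interval if frames are shown at mm_time + latency. */
          int on_mm_time  = (capture[i] + latency)
                          - (capture[i - 1] + latency);
          printf("frame %d: on-arrival %3d ms, mm_time-based %3d ms\n",
                 i, on_arrival, on_mm_time);
      }
      return 0;
  }

With these numbers the on-arrival intervals come out as 58, 3 and 
78 ms while the mm_time-based intervals stay at 33 ms.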


> > Other steps are:
> > * Reducing the default latency.
> 
> What will be the default? what will happen to late video frames?

Late frames will be dropped, as has always been the case (at least in 
the mjpeg case; otherwise it's a bit more complex).

As before the latency must be adjusted, and in particular increased, 
as required to avoid dropping frames. Of course when starting with a 
huge latency like 400 ms, handling the latency correctly is much less 
important, as it only becomes important when you have a really bad 
network connection or very long encoding times.

It's important to distinguish between increasing and decreasing the 
latency.

Increasing the latency means a frame queued on the client will be 
displayed even later. So the latency can be increased freely.

Decreasing the latency means a bunch of frames queued on the client may 
suddenly become late, causing them to be dropped. So more care must be 
taken.
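
A small sketch of that asymmetry (again not the actual Spice code; 
queued_frame_t and the helpers are made-up names): applying a latency 
change directly to the deadlines of the frames already queued shows why 
an increase is always safe while a decrease may force drops.

  #include <stdint.h>
  #include <stdio.h>

  /* Made-up queued frame: the deadline is already expressed in the
   * client's clock (mm_time translated through the clock offset and
   * the configured latency). */
  typedef struct queued_frame {
      int64_t deadline_ms;
      struct queued_frame *next;
  } queued_frame_t;

  static int64_t client_now_ms(void) { return 5000; }   /* stub clock */

  static void drop_frame(queued_frame_t *f)
  {
      printf("drop frame that was due at %lld\n",
             (long long)f->deadline_ms);
  }

  /* Apply a latency change to every queued frame.  An increase
   * (delta_ms > 0) only pushes deadlines further out, so it is always
   * safe.  A decrease (delta_ms < 0) pulls deadlines in, and any frame
   * whose new deadline is already in the past must be dropped. */
  static void apply_latency_change(queued_frame_t **queue, int64_t delta_ms)
  {
      queued_frame_t **link = queue;

      while (*link) {
          queued_frame_t *frame = *link;
          frame->deadline_ms += delta_ms;

          if (frame->deadline_ms < client_now_ms()) {
              *link = frame->next;       /* unlink the now-late frame */
              drop_frame(frame);
          } else {
              link = &frame->next;
          }
      }
  }

  int main(void)
  {
      queued_frame_t f2 = { 5250, NULL };
      queued_frame_t f1 = { 5100, &f2 };
      queued_frame_t *queue = &f1;

      apply_latency_change(&queue, -200);  /* f1 becomes late, f2 survives */
      return 0;
  }

In practice a decrease would probably have to be applied gradually, or 
only when the queue is empty, to avoid dropping the queued frames.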


-- 
Francois Gouget <fgouget at codeweavers.com>

