[Spice-devel] question: how to test the gstreamer:h264 with qemu ?

Francois Gouget fgouget at codeweavers.com
Mon Aug 8 16:18:23 UTC 2016


On Mon, 8 Aug 2016, Frediano Ziglio wrote:
[...]
> The current h/w acceleration allows to allocate the buffer and decode
> into that buffer. As you allocate that buffer you can easily understand
> (a copy from that specific area/buffer) if client is using it.
> Not saying that's easy it require some page toggling from Qemu and/or
> the driver to support drawing from this buffer that is a much more aware
> guest driver.
> Yes, the application would have to use this buffer for drawing to the
> screen in order to understand the relationship between original byte
> stream and final screen at this point you know if is better to send the
> original stream or not (because for instance you display only small part
> of the video).
> 
> > * In fact the guest applications could very well not be displaying the
> >   decoded video. That's the case when you transcode a video. In that
> >   case streaming the video to the client would make no sense.
> > 
> 
> In this case the buffer is not copied directly to screen.

Unless the applications that apply overlays have two codepaths I'm not 
sure the case where the decoded buffer is copied directly to screen is 
very common.


> > * But let's say the guest application does display the video. Now you
> >   must prevent the spice server from sending the corresponding screen
> >   updates either through streaming or regular updates. Otherwise you'll
> >   end up doubling the bandwidth usage! (and wasting CPU cycles if it is
> >   streaming them)
> 
> Yes, you would have to just change the "output" of the streaming to the
> destination buffer.

Not sure what you mean by that.


> > * However the video detected by the server may not have the same size as
> >   the original stream. This is almost always the case when you display a
> >   video fullscreen for instance. Furthermore the spice server often
> >   detects a video as two videos (e.g. one with the first 100 lines or
> >   so, and another with the bottom 140, causing banding in YouTube for
> >   instance). So it seems that matching the video being displayed with
> >   the intercepted video stream would require solving that issue.
> > 
> 
> Still this is possible to detect with a guest driver that support operation
> like stretch (or 3D!).

Assuming the scaling is performed by the hardware. At least that's 
somewhat likely since the application would want the scaled video before 
working with it (e.g. to apply overlays).


> About the split, yes, this should be fixed. But with 3D support for instance
> the application usually display the whole frames.

Based on past discussions my understanding is that this happens when the 
application does not blit the frame in one go. I don't know the details 
but I believe the fact it happens implies at least Chrome does not do a 
direct copy of the decoded frame to the screen.


-- 
Francois Gouget <fgouget at codeweavers.com>


More information about the Spice-devel mailing list