[RFC] wl_surface video protocol extension

Pekka Paalanen ppaalanen at gmail.com
Thu Oct 17 11:22:52 CEST 2013


On Thu, 17 Oct 2013 09:38:41 +0100
James Courtier-Dutton <james.dutton at gmail.com> wrote:

> Hi,
> 
> I have extensive experience with video streaming and display, having
> worked on the xine video player.
> Setting timestamps on a stream of video frames is not the entire
> solution.
> The biggest problem with displaying video on computer displays is "frame
> rate adaption".
> I.e. The frame rate of the video frames might not equal the frame rate of
> the display.
> What is really needed is a predictor.
> I.e. if I send this frame to the video hardware, what is the exact time
> it will be displayed?
> A predictor might be difficult, so some callback after the frame is
> displayed giving the exact time the frame was displayed would also help.
> The video player could then use this information to adjust the frame
> submission to ensure the frame was displayed closer to when it needs to be.

Hi James,

that is exactly what is being planned, but was not mentioned yet:
presentation timestamp events coming from the compositor, telling the
client when the wl_buffer actually hit the screen. Then you know which
timestamp you asked for, when you queued it, and what actually
happened, and can work out statistics on the difference.
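On the client side the bookkeeping could look roughly like this. This
is only a sketch in C; the event and all the names are made up here,
since the protocol is still just an RFC:

    #include <stdint.h>

    /* One record per submitted frame; all names are hypothetical. */
    struct frame_timing {
        uint64_t queued_ns;    /* when the queueing request was sent */
        uint64_t target_ns;    /* the timestamp asked for */
        uint64_t presented_ns; /* what the compositor reports back */
    };

    /* Called when the compositor's (hypothetical) presentation
     * feedback event arrives with the actual presentation time. */
    static void
    frame_presented(struct frame_timing *t, uint64_t presented_ns)
    {
        t->presented_ns = presented_ns;
        int64_t error_ns = (int64_t)(presented_ns - t->target_ns);
        /* Feed error_ns into running statistics, e.g. to bias
         * future target timestamps. */
        (void)error_ns;
    }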

> There are various artifacts that appear as a result of frame rate adaption.
> One of these is field dominance. This is where the top field is displayed
> for longer than the bottom field. This results in the appearance of
> flickering lines to the viewer.
> If you could determine how long the top field was presented to the user,
> you could maybe make adjustments to make the bottom field appear for a
> similar amount of time.

Ok, you want to know how long a frame (buffer) has been on screen.
The presentation timestamp tells you when the frame started to show,
so maybe you could use the presentation timestamp of the next frame as
the ending timestamp for this frame?
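That calculation is trivial once you have consecutive presentation
timestamps (again just a sketch, with hypothetical names):

    #include <stdint.h>

    /* How long frame N was on screen: the gap between its own actual
     * presentation timestamp and that of frame N+1. */
    static int64_t
    on_screen_ns(uint64_t presented_n, uint64_t presented_next)
    {
        return (int64_t)(presented_next - presented_n);
    }

For interlaced content you could then compare the durations of
top-field and bottom-field frames to spot dominance problems.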

Btw. would you need to know or set the scanout "phase" of top vs. bottom
field for each frame on an interlaced display and video?

> So, I would propose the following.
> 1) sequence number the frames at the display rate.
> 2) provide an api for the application to predict the time of each frame
> number in the sequence. I.e. the next 5 frames will display at times
> X1,X2,X3,X4,X5. Also provide a way to determine the "frame submitted to
> frame displayed" latency, so the user application knows how many frames
> in advance it needs to submit.
> 3) provide an api for the application to ensure that a frame it submits
> will get displayed at display rate sequence number X.
> 4) If frame X has already passed, only display it if there are no newer
> frames in the queue, otherwise drop it.

To me, all that sounds like a library API of somewhat higher level than
display server protocol. We just need to make sure the protocol has
everything to support such a library.

I think item 4 is already implied in the protocol.
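Purely as an illustration, such a library might expose something like
the following. Every name here is invented for the example and is not
proposed protocol; note it uses target timestamps rather than sequence
numbers, for the reason given below:

    #include <stdint.h>

    struct vsync_ctx;   /* opaque; wraps the Wayland connection */
    struct video_buf;   /* a decoded frame ready for display */

    /* Item 2: predict the presentation times of the next n refresh
     * cycles, and report the current submit-to-display latency. */
    int vsync_predict(struct vsync_ctx *ctx, uint64_t *times_ns, int n);
    int64_t vsync_latency_ns(struct vsync_ctx *ctx);

    /* Item 3: queue a buffer to be shown at one of the predicted
     * times. */
    int vsync_queue(struct vsync_ctx *ctx, struct video_buf *buf,
                    uint64_t target_ns);

    /* Item 4: a buffer whose target time has already passed is shown
     * only if nothing newer is queued, otherwise it is dropped; the
     * library (or the compositor) handles this without further calls. */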

Also, monitor refresh rates may change, which is why I think a
timestamp is better in the protocol than a sequence number tied to a
refresh rate. Another thing missing in the protocol sketch is an event
notifying the client about current output refresh rate changes.
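For the current rate itself, core wl_output already helps a little:
its mode event carries the refresh in millihertz, so a client can at
least track mode switches that go through wl_output. A sketch of the
listener side:

    #include <stdint.h>
    #include <wayland-client.h>

    /* Keep the current refresh rate (in mHz) up to date from
     * wl_output.mode events. */
    static void
    output_mode(void *data, struct wl_output *output, uint32_t flags,
                int32_t width, int32_t height, int32_t refresh)
    {
        if (flags & WL_OUTPUT_MODE_CURRENT)
            *(int32_t *)data = refresh;
    }

    static void
    output_geometry(void *data, struct wl_output *output, int32_t x,
                    int32_t y, int32_t phys_w, int32_t phys_h,
                    int32_t subpixel, const char *make,
                    const char *model, int32_t transform)
    {
        /* not needed for this example */
    }

    static const struct wl_output_listener output_listener = {
        .geometry = output_geometry,
        .mode = output_mode,
    };

Whether a compositor re-sends the mode event on every refresh change
is another matter, hence the suggestion of an explicit event.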

> Summary:
> For the best video streaming experience, timestamps on the frame are not
> very useful; what is useful is predicting exactly when the frame will
> actually be displayed to the user.

I hope the three timestamps would allow you to estimate everything you
need:
- when you queued a buffer (sent the request)
- the timestamp you asked the buffer to be presented at
- the timestamp the buffer really was presented

Plus of course the same for the next buffer, so you know, for example,
how long a buffer was on screen.

How does that sound?


Thanks,
pq

