[Spice-devel] [spice] gstreamer-encoder: Return the average frame size as a 32 bit int
Francois Gouget
fgouget at codeweavers.com
Mon May 20 13:56:40 UTC 2019
On Thu, 16 May 2019, Frediano Ziglio wrote:
[...]
> > @@ -520,7 +520,7 @@ static uint32_t get_min_playback_delay(SpiceGstEncoder *encoder)
> >       * an I frame) and an average frame. This also takes into account the
> >       * frames dropped by the encoder bit rate control.
> >       */
> > -    uint64_t size = get_maximum_frame_size(encoder) + get_average_frame_size(encoder);
> > +    uint32_t size = get_maximum_frame_size(encoder) + get_average_frame_size(encoder);
> >      uint32_t send_time = MSEC_PER_SEC * size * 8 / encoder->bit_rate;
> >
>
> Here you have 8000 * 2 * frame_size, so it could overflow uint32_t with
> frame_size >= ~256 KiB.
Right. It's confusing that NSEC_PER_SEC and NSEC_PER_MILLISEC are LL
constants while NSEC_PER_MICROSEC and MSEC_PER_SEC are not. Should that
be changed?

Since they are all less than or equal to one billion, should they all
just be basic constants (which would avoid the %lld vs. %ld confusion
whenever they are used in a trace), or should they all be LL constants
to avoid unexpected overflows?
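For illustration, here is a minimal standalone sketch of both sides of
that trade-off (MSEC_PER_SEC_LL is a hypothetical LL variant, and the
frame sizes are made up):

    #include <stdint.h>
    #include <stdio.h>

    #define MSEC_PER_SEC    1000    /* plain int, as currently defined */
    #define MSEC_PER_SEC_LL 1000LL  /* hypothetical LL variant */

    int main(void)
    {
        /* max + average frame with ~300 KiB frames, past the ~256 KiB limit */
        uint32_t size = 2 * 300 * 1024;
        uint32_t bit_rate = 8 * 1000 * 1000;  /* 8 Mbps */

        /* plain constant: the product stays 32-bit, wraps past UINT32_MAX,
         * and prints 77 instead of 614 */
        printf("wrapped: %u ms\n", MSEC_PER_SEC * size * 8 / bit_rate);

        /* LL constant: the whole expression is computed in 64 bits, so it
         * cannot wrap, but it needs %lld in a trace rather than %ld or %d */
        printf("correct: %lld ms\n", MSEC_PER_SEC_LL * size * 8 / bit_rate);
        return 0;
    }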
> I agree get_average_frame_size can safely return uint32_t, but you should
> change the above line to
>
> uint32_t send_time = (uint32_t) ((uint64_t) (MSEC_PER_SEC * 8) * size / encoder->bit_rate);
>
> or leave size as uint64_t.
I would be OK with that too. It's mostly having get_average_frame_size()
return a uint64_t while get_maximum_frame_size() returns a uint32_t that
was bothering me.
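Concretely, the shape we seem to be converging on would be something
like the following sketch (the stub type and helper bodies here are
hypothetical stand-ins for the real spice code, just to make it
self-contained):

    #include <stdint.h>
    #include <stdio.h>

    /* hypothetical stand-ins for the real spice declarations */
    typedef struct { uint32_t bit_rate; } SpiceGstEncoder;
    #define MSEC_PER_SEC 1000

    static uint32_t get_maximum_frame_size(SpiceGstEncoder *e) { (void)e; return 300 * 1024; }
    static uint32_t get_average_frame_size(SpiceGstEncoder *e) { (void)e; return 150 * 1024; }

    /* both helpers return uint32_t for consistency; the explicit widening
     * keeps MSEC_PER_SEC * size * 8 in 64 bits so it cannot wrap */
    static uint32_t get_min_playback_delay(SpiceGstEncoder *encoder)
    {
        uint32_t size = get_maximum_frame_size(encoder) +
                        get_average_frame_size(encoder);
        return (uint32_t)((uint64_t)MSEC_PER_SEC * size * 8 / encoder->bit_rate);
    }

    int main(void)
    {
        SpiceGstEncoder encoder = { 8 * 1000 * 1000 };  /* 8 Mbps */
        printf("min playback delay: %u ms\n", get_min_playback_delay(&encoder));
        return 0;
    }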
--
Francois Gouget <fgouget at codeweavers.com>