<div dir="ltr"><div class="gmail_quote"><div dir="ltr">On Tue, Oct 16, 2018 at 5:06 PM Keith Packard <<a href="mailto:keithp@keithp.com">keithp@keithp.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Bas Nieuwenhuizen <<a href="mailto:bas@basnieuwenhuizen.nl" target="_blank">bas@basnieuwenhuizen.nl</a>> writes:<br>
<br>
> Well the complication here is that in the MONOTONIC (not<br>
> MONOTONIC_RAW) case the CPU measurement can happen at the end of the<br>
> MONOTONIC_RAW interval (as the order of measurements is based on<br>
> argument order), so you can get a tick that started `period` (5 in<br>
> this case) monotonic ticks before the start of the interval and a CPU<br>
> measurement at the end of the interval.<br>
<br>
Ah, that's an excellent point. Let's split out raw and monotonic and<br>
take a look. You want the GPU sampled at the start of the raw interval<br>
and monotonic sampled at the end, I think?<br>
<br>
<pre>
           w x y z 0 1 2 3 4 5 6 7 8 9 a b c d e f
Raw        -_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-

           0         1         2         3
GPU        -----_____-----_____-----_____-----_____

              x y z 0 1 2 3 4 5 6 7 8 9 a b c
Monotonic     -_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-

Interval               &lt;-----------------&gt;
Deviation    &lt;--------------------------&gt;

start = read(raw)        2
gpu   = read(GPU)        1
mono  = read(monotonic)  2
end   = read(raw)        b
</pre>
<br>
In this case, the error between the monotonic pulse and the GPU is<br>
interval + gpu_period (probably plus one to include the measurement<br>
error of the raw clock).<br></blockquote><div><br></div><div>I'm very confused by this case. Why is the monotonic timeline delayed? It seems to me that only the monotonic sampling is delayed, so mono ends up closer to end than to start, and the sampled value would be something like 9 or a rather than 2?</div><div><br></div><div>I think we can model this fairly simply as two components:</div><div><br></div><div> 1) The size of the sampling window; this is "end - start + monotonic_raw_tick".</div><div> 2) The maximum phase shift of any sample. The only issue here is that a tick may have started before the sampling window opened, so we need to add on the maximum tick size. The worst-case bound is hit when the earliest-sampled clock is read at the end of one of its ticks and the latest-sampled clock is read at the beginning of one of its ticks.<br></div><div><br></div><div>The result is that we're looking at something like "end - start + monotonic_raw_tick + max(gpu_tick, monotonic_tick)". Have I just come full circle?<br></div></div></div>