[igt-dev] [PATCH i-g-t 03/17] benchmarks/gem_wsim: fix scaling of period steps

Bernatowicz, Marcin marcin.bernatowicz at linux.intel.com
Fri Sep 29 11:30:56 UTC 2023



On 9/29/2023 12:52 PM, Tvrtko Ursulin wrote:
> 
> On 29/09/2023 10:31, Bernatowicz, Marcin wrote:
>> Hi,
>>
>> On 9/29/2023 10:01 AM, Tvrtko Ursulin wrote:
>>>
>>> On 28/09/2023 18:45, Marcin Bernatowicz wrote:
>>>> Period steps take scale time (-F) command line option into account.
>>>
>>> "Make period steps.."?
>>>
>>> "Periods steps should take.."?
>>>
>>>> This allows scaling the workload without having to modify the .wsim file.
>>>>
>>>> ex. having following example.wsim
>>>>
>>>> 1.VCS1.3000.0.1
>>>> 1.RCS.500-1000.-1.0
>>>> 1.RCS.3700.0.0
>>>> 1.RCS.1000.-2.0
>>>> 1.VCS2.2300.-2.0
>>>> 1.RCS.4700.-1.0
>>>> 1.VCS2.600.-1.1
>>>> p.16000
>>>>
>>>> we can scale the whole workload x10 with:
>>>>
>>>> gem_wsim -w example.wsim -f 10 -F 10
>>>>
>>>> -f is for batch duration steps, -F for period and delay steps
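
(For illustration, assuming both options are plain multipliers on the
values in the .wsim file: with -f 10 -F 10 the 3000 duration in the first
step becomes 30000, the 500-1000 range becomes 5000-10000, and the
p.16000 period becomes p.160000.)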
>>>
>>> Actually I am having a little bit of a second thought here. Thinking 
>>> that perhaps it was deliberate to not scale periods.
>>>
>>> Think of it like this. -f 0.5 simulates a twice as fast GPU. -F 2 
>>> simulates a twice as slow CPU.
>>>
>>> In both cases if something wants to hit 60 fps, it still wants to hit 
>>> 60 fps. What use case for scaling the period do you have in mind?
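
(As a rough illustration, assuming periods are given in microseconds: a
60 fps target is roughly p.16667; with -f 0.5 the batches finish in half
the time, while the desired period, i.e. the target framerate, arguably
stays at 16667.)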
>>>
>>
>> That gives another view on the matter.
>>
>> I thought about it more like having a common unit, so giving -F 1000 
>> makes all CPU values (period/duration) effectively expressed in ms.
> 
> I lost you here. -F is also a scaling factor and not a time value.
> 
>> The -f option may be used to calibrate for a difference between GPU and 
>> CPU time, e.g. if a wrongly reported frequency makes the real GPU duration 
>> 10x shorter than the CPU-measured one (e.g. a specified 10ms batch 
>> duration takes 1ms in reality), we can provide -f 10 and still have GPU 
>> durations correspond to CPU time.
> 
> Hm but in either case nothing of this relates to framerate.

true

> 
> My current thinking is to drop this patch unless you can think of a good 
> use case for scaling periods. Or we need a new command-line option for 
> scaling only periods.

I don't see a use case for it; I will drop the patch.
--
marcin

> 
> Regards,
> 
> Tvrtko
> 
>>
>> Regards,
>> marcin
>>
>>
>>> Regards,
>>>
>>> Tvrtko
>>>
>>>> v2:
>>>> - apply same approach as with DELAY step (Tvrtko)
>>>>
>>>> Signed-off-by: Marcin Bernatowicz <marcin.bernatowicz at linux.intel.com>
>>>> ---
>>>>   benchmarks/gem_wsim.c | 2 ++
>>>>   1 file changed, 2 insertions(+)
>>>>
>>>> diff --git a/benchmarks/gem_wsim.c b/benchmarks/gem_wsim.c
>>>> index 42690d3d0..41557517c 100644
>>>> --- a/benchmarks/gem_wsim.c
>>>> +++ b/benchmarks/gem_wsim.c
>>>> @@ -1186,6 +1186,8 @@ parse_workload(struct w_arg *arg, unsigned int flags, double scale_dur,
>>>>   add_step:
>>>>           if (step.type == DELAY)
>>>>               step.delay = __duration(step.delay, scale_time);
>>>> +        else if (step.type == PERIOD)
>>>> +            step.period = __duration(step.period, scale_time);
>>>>           step.idx = nr_steps++;
>>>>           step.request = -1;

