Calculate the real world time at which a frame was captured?

George Hawkins george.hawkins at beta-engine.net
Sun Aug 4 22:20:11 UTC 2019


On Mon, Jul 29, 2019 at 2:58 PM "pisymbol ." <pisymbol at gmail.com> wrote:

> On Sun, Jul 28, 2019 at 2:10 PM George Hawkins <gstreamer at beta-engine.net>
> wrote:
>
>> If I use the filesrc element to playback a video file I can retrieve data
>> like this on a per frame basis:
>>
>>     index=0, timestamp=832000000, stream-time=0
>>     index=1, timestamp=870000000, stream-time=38000000
>>     ...
>>
>> But what is the first timestamp above relative to? How can I retrieve a
>> real-world start time such that I can combine it with this timestamp in
>> order to calculate the real-world time at which the frame was captured?
>>
>> I control the original file capture process as well as the playback but I
>> haven't found how to capture and recover the start time that I need for
>> combining with timestamps in this way.
>>
>> Currently, I capture the video file like so:
>>
>>     gst-launch-1.0 nvarguscamerasrc \
>>         ! 'video/x-raw(memory:NVMM), width=3280, height=2464,
>> framerate=21/1' \
>>         ! nvjpegenc \
>>         ! matroskamux \
>>         ! filesink location=out.mkv
>>
>> I can change the container and video format if this makes it easier to
>> encode and recover the start time later. I can obviously get an
>> _approximate_ start time by recording the time at which the pipeline
>> started - but I'd prefer something more precise (and _if possible_ I'd
>> prefer that the value was encoded somewhere in the resulting video file
>> rather than stored separately).
>>
>> I've used GST_DEBUG to see if I could see anything that looked like a
>> start time when replaying the file but didn't spot anything.
>>
>> And if I look at the file with a tool like mediainfo the only date I see
>> is:
>>
>>     Encoded date : UTC 2019-07-24 19:20:42
>>
>> TL;DR - when recording my video file, how do I capture and later recover
>> a value that can be combined with a relative timestamp (like the one for
>> index 0 above) to give the real-world time at which the frame was
>> captured?
>>
> Why can't you generate an epoch (UTC) on a per-frame basis? That's what I
> do with 'nvcamerasrc' at least. I read its timestamp it stores from the ISP
> as well as generate my own epoch when the frame is received in the pipeline
> (see identity's "handoff" signal).
>
> I mean you have to define "start time" precisely here too.
>
> -aps
>

Thanks for the reply, Aps - and sorry for being slow in following up. Since
GStreamer is used in quite a lot of machine-vision projects, I thought this
would be a fairly common thing to want to do: record video and then, at a
later stage, correlate it with time signals from another source.

E.g. one might continuously record video of the night sky and later come
back and want to view particular frames when some other information source
tells you that there might have been something interesting to see in-frame
at e.g. 3:24 AM.

So I didn't want to reinvent the wheel. But in the end, I did as you
suggested and coded up a new element, using identity as a template.
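For reference, a minimal sketch of the handoff approach Aps described,
assuming the Python GStreamer bindings (gi.repository.Gst) - the element and
variable names here are made up, but the callback itself is plain Python and
just pairs each buffer's PTS with the wall-clock time at which it passed
through:

```python
import time


def make_handoff_callback(log):
    """Return a handoff callback that appends (pts_ns, epoch_seconds) pairs to log."""
    def on_handoff(identity, buffer):
        # buffer.pts is the buffer's pipeline timestamp in nanoseconds;
        # time.time() is the wall-clock epoch when the buffer was handed off.
        log.append((buffer.pts, time.time()))
    return on_handoff


# With the real bindings this would be wired up roughly as (hypothetical names):
#   identity = pipeline.get_by_name("ident")
#   identity.connect("handoff", make_handoff_callback(log))
```

Note that the wall-clock time recorded this way is the time the buffer
reached the identity element, not the exact sensor capture time, so there is
some pipeline latency baked in.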

The learning curve for getting started with coding GStreamer elements is
fairly steep - but ultimately, the actual code required to implement what I
wanted was trivial.

On the off-chance that it might be useful to someone else, the results can
be found here: https://github.com/george-hawkins/gst-absolutetimestamps

The README is far longer than the few lines of code required to print out
the timestamps I wanted :)
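For completeness, the recovery step is just arithmetic: given one recorded
(stream-time, epoch) anchor, any later frame's wall-clock capture time is
the anchor epoch plus the stream-time delta. A sketch using the stream-time
values from my first mail (the anchor epoch below is a made-up example):

```python
NS_PER_S = 1_000_000_000


def capture_epoch(anchor_epoch_s, anchor_stream_time_ns, frame_stream_time_ns):
    """Wall-clock capture time (epoch seconds) of a frame, given one anchor."""
    delta_s = (frame_stream_time_ns - anchor_stream_time_ns) / NS_PER_S
    return anchor_epoch_s + delta_s


# Stream-times from the original mail; the anchor epoch is hypothetical:
anchor = 1563996042.0                         # epoch recorded for index=0 (stream-time=0)
t1 = capture_epoch(anchor, 0, 38_000_000)     # index=1, stream-time=38000000
# t1 is 38 ms after the anchor
```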


More information about the gstreamer-devel mailing list