Calculate the real world time at which a frame was captured?

pisymbol . pisymbol at gmail.com
Mon Jul 29 18:58:51 UTC 2019


On Sun, Jul 28, 2019 at 8:10 AM George Hawkins <gstreamer at beta-engine.net>
wrote:

> If I use the filesrc element to playback a video file I can retrieve data
> like this on a per frame basis:
>
>     index=0, timestamp=832000000, stream-time=0
>     index=1, timestamp=870000000, stream-time=38000000
>     ...
>
> But what is the first timestamp above relative to? How can I retrieve a
> real-world start time such that I can combine it with this timestamp in
> order to calculate the real-world time at which the frame was captured?
>
> I control the original file capture process as well as the playback but I
> haven't found how to capture and recover the start time that I need for
> combining with timestamps in this way.
>
> Currently, I capture the video file like so:
>
>     gst-launch-1.0 nvarguscamerasrc \
>         ! 'video/x-raw(memory:NVMM), width=3280, height=2464,
> framerate=21/1' \
>         ! nvjpegenc \
>         ! matroskamux \
>         ! filesink location=out.mkv
>
> I can change the container and video format if this makes it easier to
> encode and recover the start time later. I can obviously get an
> _approximate_ start time by recording the time at which the pipeline
> started - but I'd prefer something more precise (and _if possible_ I'd
> prefer that the value was encoded somewhere in the resulting video file
> rather than stored separately).
>
> I've used GST_DEBUG to see if I could see anything that looked like a
> start time when replaying the file but didn't spot anything.
>
> And if I look at the file with a tool like mediainfo the only date I see
> is:
>
>     Encoded date : UTC 2019-07-24 19:20:42
>
> TL;DR - when recording my video file how do I capture and later recover a
> value that can be combined with a relative timestamp value (like the one
> for index 0 up above) to give the real world time at which the frame was
> captured.
>

Why can't you generate an epoch (UTC) timestamp on a per-frame basis? That's
what I do with 'nvcamerasrc', at least: I read the timestamp it stores from
the ISP and also generate my own epoch timestamp when the frame is received
in the pipeline (see identity's "handoff" signal).
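
For illustration, here is a rough sketch of that handoff approach in Python.
It's untested against your exact setup - I use videotestsrc as a stand-in for
nvarguscamerasrc, and the element name "probe" and the callback name are
arbitrary choices of mine:

    #!/usr/bin/env python3
    # Sketch: record a wall-clock epoch for every buffer passing through an
    # identity element via its "handoff" signal, next to the buffer's PTS.
    import time
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    # videotestsrc is a stand-in here for your camera source.
    pipeline = Gst.parse_launch(
        "videotestsrc num-buffers=5 "
        "! identity name=probe signal-handoffs=true "
        "! fakesink"
    )

    def on_handoff(element, buffer):
        # buffer.pts is the pipeline timestamp in nanoseconds;
        # time.time() is the wall-clock epoch when the frame reached
        # this element.
        print("pts=%d ns  epoch=%.6f s" % (buffer.pts, time.time()))

    pipeline.get_by_name("probe").connect("handoff", on_handoff)

    pipeline.set_state(Gst.State.PLAYING)
    bus = pipeline.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                           Gst.MessageType.EOS | Gst.MessageType.ERROR)
    pipeline.set_state(Gst.State.NULL)

In your recording pipeline you could drop an identity in front of matroskamux
the same way and persist the PTS/epoch pairs while the file is being written,
then combine them with the per-frame timestamps on playback.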

I mean, you have to define "start time" precisely here, too.

-aps