Get camera timestamp from rtsp stream.
Jeff Shanab
jshanab at jfs-tech.com
Wed Jul 19 19:23:15 UTC 2023
It is. The RTSP/RTP(/TCP) protocol has RTP timestamps, and they are negotiated
offsets from connection time. This is required to sync substreams such as
audio and metadata with the video in light of possibly different network paths,
and with other streams, as in a conference call.
The URL you provided was an ONVIF one, and if the device is ONVIF it probably
supports the extension; you will know when you send it the header and it replies back.
The extension was originally designed for replay of stored video from cameras and
NVRs, but it is the only known source of an absolute frame timestamp, since RTP
timestamps are only synchronization offsets, not wall-clock times.
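In case it helps, here is a rough gstreamer-sharp sketch of picking such an
absolute timestamp up on the client side. It assumes a GStreamer new enough that
rtspsrc exposes an add-reference-timestamp-meta property (1.22 or so, if I
remember right), which makes the absolute time show up as a
GstReferenceTimestampMeta on each buffer; that property name and the Timestamp
field on the meta are from memory, so treat this as an untested sketch rather
than a recipe:

var src = Gst.ElementFactory.Make("rtspsrc", "src");
src["location"] = "rtsp://user:pwd@camera/onvif/media?profile=Profile1";
// Version dependent: ask rtspsrc to attach absolute times (from RTCP SRs or
// the ONVIF RTP extension) as GstReferenceTimestampMeta on outgoing buffers.
src["add-reference-timestamp-meta"] = true;

// Later, on any buffer that reaches your appsink (see the snippet further
// down in this thread); a timestamp of 0 apparently means "nothing attached".
var meta = buffer.GetReferenceTimestampMeta();
if (meta.Timestamp != 0)
    Console.WriteLine("absolute capture time (ns): " + meta.Timestamp);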
On Wed, Jul 19, 2023 at 2:23 PM Anand Sivaram via gstreamer-devel <
gstreamer-devel at lists.freedesktop.org> wrote:
> I think Jeff was talking about this one.
>
> https://www.onvif.org/specs/stream/ONVIF-Streaming-Spec.pdf
> Section 6.3, RTP header extension.
> That would require the camera to support it.
>
> @Anton - got your requirement. If the camera does not support any extra
> timing as described above, then I am not sure how to get the exact actual
> capture timestamp of the frame from the camera.
>
> We can try to derive it based on the RTP timestamp only.
> - With splitmuxsink and the "format-location" callback, get the system time,
> for example the time.time() equivalent in Python, on the RTSP client and put
> it in the MP4 filename.
> - The RTP clock runs at 90000 Hz, so at 30 fps it advances by 3000 for
> every frame.
> - Since you are using 3 sec MP4 files, each would hold 90 frames, i.e. an RTP
> timestamp increase of 270000.
> - Now run the recording for some time, correlate the actual time observed in
> "format-location" with the RTP timestamp, compute the jitter and see
> if it is sufficient.
>
> This method has one assumption: that the network jitter is low and
> manageable. If the camera and the RTSP client are on the same network,
> preferably wired, then the jitter would be only a few msec, which is
> less than 10% of the 33 msec inter-frame time.
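>
> To make the arithmetic concrete, a small sketch (plain C#, no GStreamer calls;
> the anchor pair would be captured once per fragment, e.g. at "format-location"
> time, and the numbers are just the example values from above):
>
> // Anchor taken when a new fragment starts: the RTP timestamp of its first
> // frame and the system time observed at that moment on the RTSP client.
> ulong rtpAnchor = 123450000;               // illustrative value
> DateTime wallAnchor = DateTime.UtcNow;
>
> // Approximate capture time of any later frame from its RTP timestamp
> // (ignoring 32-bit RTP wrap-around for brevity).
> ulong rtpNow = rtpAnchor + 270000;         // 90 frames later at 30 fps
> double elapsedSec = (rtpNow - rtpAnchor) / 90000.0;  // video RTP clock: 90 kHz
> DateTime approxCapture = wallAnchor.AddSeconds(elapsedSec);  // wallAnchor + 3.0 s
> // Accuracy is bounded by network jitter: a few msec on a wired LAN, i.e. well
> // under the 33 msec inter-frame time at 30 fps.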
>
> Anyway, I would be happy to hear better suggestions from experts.
>
> Thanks and Regards
>
> Anand
>
> On Wed, 19 Jul 2023 at 22:56, Антон Шаров <sharov_am at mail.ru> wrote:
>
>> Hi, Anand!
>>
>> Thank you so much for your reply!
>>
>> The thing is, I don’t need RTP; I need the exact camera (server) capture time
>> of a frame.
>> Ideally I need the capture time for each frame, but the time of some
>> keyframe of the MP4 file
>> (the first frame of this video file?) would be OK as well. I’m trying to use my
>> custom appsink,
>> something like:
>> appSink.NewSample += AppSink_NewSample;
>>
>> splitmuxsink["sink"] = appSink;
>> splitmuxsink["max-size-time"] = 3000000000;
>> splitmuxsink["async-finalize"] = false;
>>
>> private static void AppSink_NewSample(object o, NewSampleArgs args)
>> {
>>     if (o is AppSink aps)
>>     {
>>         var sample = aps.PullSample();
>>         var buf = sample.Buffer;
>>         buf.Map(out var info, MapFlags.Read);
>>         var ts = buf.GetReferenceTimestampMeta(); // !!!!!!
>>         buf.Unmap(info);
>>     }
>> }
>> I don’t quite understand whether it is possible to have the needed (3 sec in
>> this case) MP4 file in
>> a whole buffer, for which I could use GetReferenceTimestampMeta(), which would
>> return me the timestamp
>> for this buffer (hence for the whole file, hence for the keyframe). But in
>> practice, with my custom sink,
>> I get some weird chunks in the buffer, and GetReferenceTimestampMeta returns
>> null (not exactly null,
>> but some useless info where the needed timestamp = 0).
>>
>> I believe this ideal approach won’t work (because appsink is not
>> seekable), but at least it looks like
>> the desired solution.
>>
>> In the case of the default sink (filesink), I need to:
>>
>> 1. change the file name, which somehow seems to be possible but is buggy
>> in the .NET library; at least it seems to be possible;
>> 2. get the timestamp of the first frames (buffer) of the newly created file.
>> Ideally I would like to name the newly created file timestamp_value.mp4.
>>
>>
>>
>>
>>
>> Wednesday, 19 July 2023, 12:37 +03:00, from Anand Sivaram <aspnair at gmail.com>:
>>
>> Hello Anton,
>>
>> The RTSP media comes as separate RTP streams for video and audio. They
>> carry only the usual RTP parameters like timestamp, sequence number and
>> payload type.
>> Moreover, the RTP timestamps are initialized randomly as per the standard, so
>> the timestamps of the video and audio streams have no relation at all.
>> If you are planning to read the time from the video frame using some OCR
>> software, then you will have to decode the H.264, and the typical granularity
>> is 1 sec.
>>
>> Are you using the "format-location" callback signal of splitmuxsink, with
>> which you can generate custom filenames with a timestamp and any prefix? It
>> is not available from gst-launch, but it is definitely available from C and
>> Python.
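>>
>> (To illustrate the idea, a rough C# sketch of such a handler: the signature
>> mirrors the C signal (element, fragment id) returning a filename, and the name
>> carries the wall-clock time at which the fragment starts. How the
>> "format-location" signal is hooked up from gstreamer-sharp depends on the
>> binding's dynamic-signal support, so that part is only indicated in the
>> comment; from C or Python it is a plain g_signal_connect / connect call.)
>>
>> // Connect this to splitmuxsink's "format-location" signal
>> // (g_signal_connect() in C; the gstreamer-sharp hookup is binding dependent).
>> static string OnFormatLocation(Gst.Element splitmux, uint fragmentId)
>> {
>>     var now = DateTime.UtcNow;   // system time when the new fragment opens
>>     return $"/tmp/rec_{now:yyyyMMdd_HHmmss_fff}_{fragmentId:D5}.mp4";
>> }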
>>
>> Thanks and Regards
>>
>> Anand
>>
>>
>> On Mon, 17 Jul 2023 at 15:36, Антон Шаров via gstreamer-devel <
>> gstreamer-devel at lists.freedesktop.org> wrote:
>>
>>
>> Hi.
>>
>> I’m given an RTSP stream from a camera, where each frame has a capture
>> timestamp.
>> The connection string looks like
>> rtsp://{usr}:{pwd}@ip_addr/onvif/media?profile=Profile1
>> I need to store this data for some time and provide search access to
>> this data: either return the closest (exact) frame for a provided timestamp, or
>> return the MP4 file which contains this closest frame. My first pipeline is rtspsrc
>> ! rtph264depay ! h264parse ! splitmuxsink location=… max-size-time=10seconds
>> (for example). When I save a new file via splitmuxsink, I somehow need to
>> get the camera timestamp
>> for the first frame of the video (or the keyframe) and maybe store this MP4 file as
>> timestamp.mp4 (or save the timestamp for later in some DB, for example).
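>>
>> (Spelled out with the actual element names and property (rtspsrc, rtph264depay,
>> and splitmuxsink's max-size-time, which takes nanoseconds), that first pipeline
>> would look roughly like this from gstreamer-sharp; the URL and paths are
>> placeholders:)
>>
>> var pipeline = Gst.Parse.Launch(
>>     "rtspsrc location=rtsp://{usr}:{pwd}@ip_addr/onvif/media?profile=Profile1 " +
>>     "! rtph264depay ! h264parse " +
>>     "! splitmuxsink location=/tmp/chunk_%05d.mp4 max-size-time=10000000000");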
>>
>> Another approach is to use jpegenc and store each frame with its
>> timestamp, but I don’t know how to obtain the timestamp for the JPEG buffer (rtspsrc !
>> decodebin ! jpegenc ! appsink). But I assume that this won’t be an efficient
>> solution in terms of CPU and storage usage, and it is better to store MP4 files.
>>
>> So, in both cases I don’t know how to get the reference-timestamp-meta for
>> the needed buffer.
>>
>> Can someone help me?
>>
>> PS: I use the C# wrapper, namely gstreamer-sharp, but I don’t think it is
>> relevant to this problem.
>>
>> Thanks in advance.
>>
>> --
>> Best regards,
>> Шаров Антон
>>
>>
>>
>> --
>> Best regards,
>> Шаров Антон
>>
>>
>