<div dir="ltr"><div class="gmail_default" style="font-family:monospace,monospace;font-size:small">I think Jeff was talking about this one.<br><br><a href="https://www.onvif.org/specs/stream/ONVIF-Streaming-Spec.pdf">https://www.onvif.org/specs/stream/ONVIF-Streaming-Spec.pdf</a><br>Section 6.3, RTP header extension.<br>That would require the camera to support it.<br><br>@Anton - got your requirement. If the camera does not support any extra timing as described above, then I am not sure how to get the exact actual capture timestamp of the frame from the camera.<br><br>We can try to derive it based on the RTP timestamp only.<br>- With splitmuxsink and the "format-location" callback, get the system time - for example the time.time() equivalent of Python - on the RTSP client and mark it in the MP4 filename.<br>- The RTP clock is incremented at 90000 Hz, so at 30 fps it advances by 3000 every frame.<br>- Since you are using 3 sec MP4 files, each file spans 90 frames, i.e. an RTP timestamp increase of 270000.<br>- Now run the recording for some time, correlate the actual time observed in "format-location" with the RTP timestamp, compute the jitter and see if it is sufficient.<br><br>This method has one assumption: that the network jitter is low and manageable. If the camera and RTSP client are on the same network - preferably wired - then the jitter would be only a few msec, which is less than 10% of the 33 msec inter-frame time.<br><br>Anyway, I would be happy to hear better suggestions from experts.<br><br>Thanks and Regards<br><br>Anand<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, 19 Jul 2023 at 22:56, Антон Шаров <<a href="mailto:sharov_am@mail.ru">sharov_am@mail.ru</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
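The correlation step above can be sketched in Python. This is a minimal sketch, assuming one reference pair (RTP timestamp, system time) noted in the "format-location" callback; the function and variable names are illustrative:

```python
RTP_CLOCK_RATE = 90000  # RTP clock rate for H.264 video

def rtp_to_wallclock(rtp_ts, ref_rtp_ts, ref_wall_time, clock_rate=RTP_CLOCK_RATE):
    """Map an RTP timestamp to wall-clock time using one reference pair.

    ref_rtp_ts / ref_wall_time come from noting the system time (e.g.
    time.time()) in the "format-location" callback for a file whose
    first frame carried ref_rtp_ts.  At 30 fps the RTP timestamp
    advances 3000 per frame, so a 3 sec file spans 90 frames, i.e.
    270000 ticks.  The 32-bit wraparound is handled modulo 2**32.
    """
    delta = (rtp_ts - ref_rtp_ts) % (1 << 32)
    return ref_wall_time + delta / clock_rate

# Reference noted at file start: RTP ts 100000 at wall clock 1689750000.0
print(rtp_to_wallclock(100000 + 270000, 100000, 1689750000.0))  # -> 1689750003.0
```

Whether the residual jitter is acceptable can then be judged by comparing this mapping against the times observed over many fragments.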
<div><div>Hi, Anand!</div><div> </div><div>Thank you so much for your reply!</div><div> </div><div>The thing is I don’t need RTP; I need the exact camera (server) capture time of a frame.</div><div>Ideally, I need the capture time for each frame, but the time of some keyframe of the mp4 file</div><div>(the first frame of this video file?) would probably be ok also. I’m trying to use my custom appsink,</div><div>something like</div><div><div>appSink.NewSample += AppSink_NewSample;</div></div><div> </div><div>splitmuxsink["sink"] = appSink;</div><div><div><div>splitmuxsink["max-size-time"] = 3000000000;</div><div>splitmuxsink["async-finalize"] = false;</div></div></div><div> </div><div>private static void AppSink_NewSample(object o, NewSampleArgs args)</div><div><div><div> {</div><div> if (o is AppSink aps)</div></div><div><div> {</div><div> var sample = aps.PullSample();</div><div> var buf = sample.Buffer;</div><div> buf.Map(out var info, MapFlags.Read);</div></div><div> var ts = buf.GetReferenceTimestampMeta();// !!!!!!</div><div> buf.Unmap(info);</div><div><div> }</div></div><div> }</div></div><div>I don’t quite understand whether it is possible to get the needed (3 sec. in this case) mp4 file in</div><div>a whole buffer, for which I could use GetReferenceTimestampMeta(), which would return me the timestamp</div><div>for this buffer (hence for the whole file, hence for the keyframe). But in practice, with my custom sink</div><div>I got some weird chunks in the buffer, and GetReferenceTimestampMeta returns null (not null exactly,</div><div>but some useless info where the needed timestamp = 0).</div><div> </div><div>I believe this ideal approach won’t work (because appsink is not seekable), but at least it looks like</div><div>the desired solution.</div><div> </div><div>In the case of the default sink (filesink), I need to</div><ol><li>change the file name, which seems to be possible but is buggy in the .NET library;</li><li>get the timestamp of the first frame (buffer) of the newly created file. 
Ideally I would like to name the newly created file timestamp_value.mp4.</li></ol><div> </div><div> </div><div> </div><blockquote style="border-left:1px solid rgb(8,87,166);margin:10px;padding:0px 0px 0px 10px">Wednesday, 19 July 2023, 12:37 +03:00 from Anand Sivaram <<a href="mailto:aspnair@gmail.com" target="_blank">aspnair@gmail.com</a>>:<br> <div id="m_4517595678549198887"><div><div><div id="m_4517595678549198887style_16897594241878875035_BODY"><div><div><div style="font-family:monospace,monospace;font-size:small">Hello Anton,</div><div style="font-family:monospace,monospace;font-size:small"> </div><div style="font-family:monospace,monospace;font-size:small">The RTSP media comes as separate RTP streams for video and audio. They likely carry only the usual RTP parameters like timestamp, sequence number and payload type.</div><div style="font-family:monospace,monospace;font-size:small">Moreover, RTP timestamps are initialized randomly as per the standard, so the timestamps of the video and audio streams have no relation at all.</div><div style="font-family:monospace,monospace;font-size:small">If you are planning to read the time from the video frame using some OCR software, then you will have to decode H.264, and the typical granularity is 1 sec.</div><div style="font-family:monospace,monospace;font-size:small"> </div><div style="font-family:monospace,monospace;font-size:small">Are you using the "format-location" callback signal in splitmuxsink? With it you can generate custom filenames with a timestamp and any prefix. 
It is not available from gst-launch, but it is definitely available in C and Python.</div><div style="font-family:monospace,monospace;font-size:small"> </div><div style="font-family:monospace,monospace;font-size:small">Thanks and Regards</div><div style="font-family:monospace,monospace;font-size:small"> </div><div style="font-family:monospace,monospace;font-size:small">Anand</div><div style="font-family:monospace,monospace;font-size:small"> </div></div> <div><div><div><span>On Mon, 17 Jul 2023 at 15:36, Антон Шаров via gstreamer-devel <<a href="//e.mail.ru/compose/?mailto=mailto%3agstreamer%2ddevel@lists.freedesktop.org" target="_blank">gstreamer-devel@lists.freedesktop.org</a>> wrote:</span></div><div><blockquote style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div> </div><div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">Hi.</div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"> </div><div 
style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">I’m given an rtsp stream from a camera, where each frame has a capturing timestamp.</div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">The connection string looks like rtsp://{usr}:{pwd}@ip_addr/onvif/media?profile=Profile1</div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">I need to store this data for some time and provide search access to it: either return the closest (exact) frame for a provided timestamp, or return the mp4 file which contains this closest frame. My first pipeline is <span>rtsp ! h264depay ! h264parse ! 
splitmuxsink location=… max-size-time=10seconds (for example). When I save a new file via splitmuxsink, I somehow need to get the camera timestamp</span></div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"><span>for the first frame of the video (or the key frame) and maybe store this mp4 file as timestamp.mp4 (or save the ts for later in some db, for example).</span></div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"> </div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">Another approach is to use jpegenc and store each frame with its timestamp, but I don’t know how to obtain the timestamp for a jpeg buffer (rtsp ! decodebin ! jpegenc ! appsink). 
But I assume that this won’t be an effective solution in terms of CPU and storage usage, and that it is better to store mp4 files.</div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"> </div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">So, in both cases I don’t know how to get the reference-timestamp-meta for the needed buffer.</div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">Can someone help me? 
</div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"> </div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">PS: I use the C# wrapper, namely gstreamer-sharp, but I don’t think it is relevant to this problem.</div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px"> </div><div style="background-color:rgb(255,255,255);color:rgb(44,45,46);font-family:Arial,Tahoma,Verdana,sans-serif;font-size:15px;font-style:normal;font-variant-caps:normal;font-variant-ligatures:normal;font-weight:400;letter-spacing:normal;text-align:start;text-decoration-color:initial;text-decoration-style:initial;text-indent:0px;text-transform:none;white-space:normal;word-spacing:0px">Thanks in advance.</div></div><div> </div><div><div><div>--<br>Best regards,<br>Шаров Антон</div></div></div></div></blockquote></div></div></div></div></div></div></div></div></blockquote><div> <div> </div><div><div><div>--<br>Best regards,<br>Шаров Антон</div></div></div><div> </div></div></div>
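For the search side of the requirement in the quoted message - finding the fragment and frame closest to a queried capture time - the lookup is simple arithmetic once each file start carries a wall-clock timestamp. A minimal Python sketch, assuming 30 fps and 3 sec fragments; the index structure is illustrative and not part of any GStreamer API:

```python
import bisect

FPS = 30            # assumed camera frame rate
FRAGMENT_SEC = 3.0  # assumed splitmuxsink max-size-time

def locate(query_time, index):
    """Find the frame closest to query_time.

    index is a sorted list of (start_wall_time, filename) pairs, one
    per recorded MP4 fragment.  Returns (filename, frame_number), or
    None if the query falls outside all recorded fragments.
    """
    starts = [t for t, _ in index]
    i = bisect.bisect_right(starts, query_time) - 1
    if i < 0:
        return None
    start, name = index[i]
    if query_time >= start + FRAGMENT_SEC:
        return None  # query lands in a gap or after the last fragment
    frame = min(int(round((query_time - start) * FPS)),
                int(FRAGMENT_SEC * FPS) - 1)
    return name, frame

idx = [(100.0, "a.mp4"), (103.0, "b.mp4")]
print(locate(104.5, idx))  # -> ('b.mp4', 45)
```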
</blockquote></div>
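As a footnote to the thread: the filename-with-timestamp idea comes down to returning a timestamped path from the splitmuxsink "format-location" handler. A minimal sketch of such a name builder in Python; the prefix and naming scheme are illustrative, and in a real pipeline this string would be returned from the "format-location" signal callback:

```python
import time

def make_location(fragment_id, prefix="recordings"):
    """Build the path a "format-location" handler could return.

    Embeds the client's wall-clock time in milliseconds so each MP4
    fragment can later be correlated with RTP timestamps or searched
    by capture time.
    """
    ts_ms = int(time.time() * 1000)
    return f"{prefix}/{ts_ms}_{fragment_id:05d}.mp4"

print(make_location(0))  # e.g. recordings/1689750000123_00000.mp4
```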