<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<p>x264enc adds an offset of 1000 hours to both PTS and DTS in order to be
able to handle negative DTS (B-frames may have negative
timestamps).</p>
<p>It does that by sending a new SEGMENT event.</p>
<p>So if you look at the PTS and DTS alone they will show the
offset, but if you use the segment you should get the "corrected"
values.</p>
<p>Something like:</p>
<pre>
if self.segment is not None:
    print(name + " PTS: " + Gst.TIME_ARGS(buf.pts) +
          " DTS: " + Gst.TIME_ARGS(buf.dts) +
          " now: " + Gst.TIME_ARGS(time_now))
    print(name + " running-time PTS: " +
          Gst.TIME_ARGS(self.segment.to_running_time(self.segment.format, buf.pts)) +
          " running-time DTS: " +
          Gst.TIME_ARGS(self.segment.to_running_time(self.segment.format, buf.dts)))
</pre>
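<p>With an appsink the segment travels along with the pulled sample, so you
can do the conversion right where you read the buffers out. A rough Python
sketch of that (the appsink reference and variable names are placeholders for
whatever your code uses; in Rust the pulled Sample carries the same segment):</p>
<pre>
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

# 'appsink' is assumed to be the appsink element from your pipeline.
sample = appsink.emit("pull-sample")
buf = sample.get_buffer()
segment = sample.get_segment()

# The raw timestamps still carry the encoder's offset (around 1000 hours) ...
print("PTS:", Gst.TIME_ARGS(buf.pts), "DTS:", Gst.TIME_ARGS(buf.dts))

# ... while converting them through the segment gives running times that
# start at 0 again. (A DTS slightly before the segment start may come back
# as CLOCK_TIME_NONE.)
print("running PTS:", Gst.TIME_ARGS(segment.to_running_time(segment.format, buf.pts)),
      "running DTS:", Gst.TIME_ARGS(segment.to_running_time(segment.format, buf.dts)))
</pre>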
<p><br>
</p>
<p>There are other GStreamer elements that do the same (add this
offset).</p>
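<p>If you want to see that SEGMENT event with your own eyes, a pad probe on
the encoder's src pad shows it going downstream with the offset applied. A
small self-contained Python sketch (the test pipeline and the element name
"enc" are just an example I made up for illustration):</p>
<pre>
#!/usr/bin/env python3
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def on_downstream_event(pad, info):
    event = info.get_event()
    if event.type == Gst.EventType.SEGMENT:
        segment = event.parse_segment()
        # With x264enc upstream, segment.start should show the ~1000 hour
        # offset, which is what keeps the running time starting at 0.
        print("SEGMENT from encoder: start=%s time=%s base=%s" % (
            Gst.TIME_ARGS(segment.start),
            Gst.TIME_ARGS(segment.time),
            Gst.TIME_ARGS(segment.base)))
    return Gst.PadProbeReturn.OK

pipeline = Gst.parse_launch(
    "videotestsrc num-buffers=60 ! x264enc name=enc ! h264parse ! fakesink")
enc = pipeline.get_by_name("enc")
enc.get_static_pad("src").add_probe(
    Gst.PadProbeType.EVENT_DOWNSTREAM, on_downstream_event)

pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
</pre>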
<blockquote type="cite"
cite="mid:mailman.1725.1644951547.1114.gstreamer-devel@lists.freedesktop.org">
<pre class="moz-quote-pre" wrap="">So I tried to do a bit more digging on this, not because I have an issue with your pipeline (I will probably go that route in the end) but I suspect there's something else off with my element clocks that's the true problem. Specifically, when I play video through my pipeline VLC is showing a start time of 1000:00:00.
Through debugging I have verified the pts and dts values I am setting look correct (by looking at the results of the buffer.pts()? and buffer.dts()? values. For example, I'm getting these for the first 3 h264 buffers I am passing in:
-------------------
INPUT - dts Some(0:00:00.000000000) pts Some(0:00:00.083000000)
INPUT - dts Some(0:00:00.041000000) pts Some(0:00:00.250000000)
INPUT - dts Some(0:00:00.083000000) pts Some(0:00:00.166000000)
--------------------
(The Some() just means it's not a null value, if you aren't familiar with Rust.)
I then added debugging of the values coming out of the appsink, which gives me
---------------------
dts 999:59:59.917000000 pts 1000:00:00.000000000
Running time: Some(0:00:00.000138500)
Start time: Some(0:00:00.000000000)
Clock time: Some(25:59:28.529101100)
dts 999:59:59.959000000 pts 1000:00:00.167000000
Running time: Some(0:00:00.047566400)
Start time: Some(0:00:00.000000000)
Clock time: Some(25:59:28.576529600)
dts 1000:00:00.000000000 pts 1000:00:00.083000000
Running time: Some(0:00:00.092902000)
Start time: Some(0:00:00.000000000)
Clock time: Some(25:59:28.621872200)
------------------------
We can see that the dts and pts values are all being set up so that presentation starts at 1000:00:00. The pipeline's start time is 00:00:00 and the current running time seems to be correct. The clock time seems weird (it doesn't match my system time, but based on the docs it can be pulled from a device somewhere, and probably doesn't matter?).
So, why would my pipelines be starting from a PTS of 1000 hours? I can't find much in the docs to help debug this further.
For reference, the current iteration of my test code uses the following to create a pipeline of appsrc ! decodebin ! videoscale ! capsfilter ! x264enc ! h264parse ! appsink:
---------------
let appsrc = ElementFactory::make("appsrc", None).unwrap();
let decoder = ElementFactory::make("decodebin", None).unwrap();
let scale = ElementFactory::make("videoscale", None).unwrap();
let capsfilter = ElementFactory::make("capsfilter", None).unwrap();
let encoder = ElementFactory::make("x264enc", None).unwrap();
let output_parser = ElementFactory::make("h264parse", None).unwrap();
let appsink = ElementFactory::make("appsink", None).unwrap();
let pipeline = Pipeline::new(None);
pipeline.add_many(
&[&appsrc, &decoder, &scale, &capsfilter, &encoder, &output_parser, &appsink]
).unwrap();
appsrc.link(&decoder).unwrap();
Element::link_many(&[&scale, &capsfilter, &encoder, &output_parser, &appsink]).unwrap();
decoder.connect_pad_added(move |src, src_pad| {
src.link_pads(Some(&src_pad.name()), &scale.clone(), None).unwrap();
});
let caps = Caps::builder("video/x-raw")
.field("width", 800)
.field("height", 600)
.build();
capsfilter.set_property("caps", caps);
encoder.set_property_from_str("speed-preset", "veryfast");
-------------
I appreciate any insights you (or anyone) could give!
</pre>
</blockquote>
<pre class="moz-signature" cols="72">--
Best regards / Med venlig hilsen
“Marianna Smidth Buschle”</pre>
</body>
</html>