<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<style type="text/css" style="display:none;"> P {margin-top:0;margin-bottom:0;} </style>
</head>
<body dir="ltr">
<div style="font-family: Calibri, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0); background-color: rgb(255, 255, 255);">
Thank you so much for the sample pipeline. It wasn't clear to me how to get audio into the same pipeline in a way that muxes properly, but using flvmux makes perfect sense. I also wasn't aware you could use named pads like that in the pipeline syntax,
so all around appreciated <span id="🙂" contenteditable="false">🙂</span><br>
</div>
<div id="appendonsend"></div>
<hr style="display:inline-block;width:98%" tabindex="-1">
<div id="divRplyFwdMsg" dir="ltr"><font face="Calibri, sans-serif" style="font-size:11pt" color="#000000"><b>From:</b> Nirbheek Chauhan &lt;nirbheek.chauhan@gmail.com&gt;<br>
<b>Sent:</b> Tuesday, February 15, 2022 8:52 AM<br>
<b>To:</b> Discussion of the development of and with GStreamer &lt;gstreamer-devel@lists.freedesktop.org&gt;<br>
<b>Cc:</b> Matthew Shapiro &lt;me@mshapiro.net&gt;<br>
<b>Subject:</b> Re: Trying to keep non-transcoded audio and transcoded video in sync</font>
<div> </div>
</div>
<div class="BodyFragment"><font size="2"><span style="font-size:11pt;">
<div class="PlainText">Hi,<br>
<br>
On Mon, Feb 14, 2022 at 4:45 PM Matthew Shapiro via gstreamer-devel<br>
&lt;gstreamer-devel@lists.freedesktop.org&gt; wrote:<br>
><br>
> I have a custom-written RTMP server in Rust, and I'm trying to integrate GStreamer to provide dynamic transcoding pipelines. I wrote all the RTMP code by hand, so I have some media knowledge, but I'm far, far from an expert, and I'm confused about
some timing issues I'm getting.<br>
><br>
> I have a pre-encoded FLV of Big Buck Bunny, and I'm using ffmpeg to push the video (without transcoding) into my RTMP server, with ffplay acting as an RTMP client. With zero transcoding the audio and video are perfectly synced, but once I pass
the packets through x264enc, the audio arrives several seconds before the corresponding video. My understanding is that RTMP clients/video players use the RTMP timestamp values to keep audio and video in sync, so either the timestamps
I'm getting from GStreamer are incorrect or I'm misunderstanding something.<br>
><br>
> For transcoding I'm using the following pipeline for my proof of concept:<br>
><br>
> -------------------<br>
> appsrc name=input ! decodebin ! videoscale ! video/x-raw,width=800,height=600 ! x264enc speed-preset=veryfast name=encoder ! h264parse name=parser ! appsink name=output<br>
> -------------------<br>
><br>
<br>
The issue is likely something to do with timestamps being wrong when<br>
pushing video into the pipeline (or getting it out of the pipeline),<br>
or synchronization breaking due to some other reason on the sender<br>
side (probably in the muxer).<br>
<br>
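If the sender-side timestamps are the suspect, one concrete thing to check is the conversion from RTMP's 32-bit millisecond timestamps to GStreamer's nanosecond PTS. A minimal Rust sketch (a hypothetical helper, not code from this thread) that also extends the counter across 32-bit wraparound:<br>
<br>

```rust
/// Extends raw 32-bit RTMP millisecond timestamps into a monotonically
/// increasing count, then converts to a GStreamer-style nanosecond PTS.
/// Hypothetical helper for illustration only.
struct RtmpClock {
    last_raw: u32,
    wraps: u64,
}

impl RtmpClock {
    fn new() -> Self {
        RtmpClock { last_raw: 0, wraps: 0 }
    }

    /// Feed the raw 32-bit timestamp from an RTMP message; returns the
    /// buffer PTS in nanoseconds.
    fn pts_ns(&mut self, raw_ms: u32) -> u64 {
        // A large backwards jump means the 32-bit counter rolled over
        // (RTMP timestamps wrap roughly every 49.7 days).
        if raw_ms < self.last_raw && self.last_raw - raw_ms > u32::MAX / 2 {
            self.wraps += 1;
        }
        self.last_raw = raw_ms;
        let extended_ms =
            self.wraps * (u64::from(u32::MAX) + 1) + u64::from(raw_ms);
        extended_ms * 1_000_000 // ms -> ns
    }
}

fn main() {
    let mut clock = RtmpClock::new();
    assert_eq!(clock.pts_ns(0), 0);
    assert_eq!(clock.pts_ns(40), 40_000_000); // 40 ms -> 40,000,000 ns
    // Simulate a wrap: near u32::MAX, then a small value.
    clock.pts_ns(u32::MAX - 5);
    assert!(clock.pts_ns(10) > u64::from(u32::MAX) * 1_000_000);
    println!("ok");
}
```

The resulting nanosecond value is what you would set as the PTS on each buffer before pushing it into the appsrc; if audio and video are converted against different zero points, they will drift apart exactly as described.<br>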
The simplest way to fix this would be to let GStreamer do the muxing<br>
too, with a pipeline like:<br>
<br>
appsrc name=vinput ! queue ! decodebin ! videoscale !<br>
video/x-raw,width=800,height=600 ! x264enc speed-preset=veryfast<br>
name=encoder ! h264parse name=parser ! mux.video<br>
appsrc name=ainput ! queue ! mux.audio<br>
flvmux name=mux ! appsink name=voutput<br>
<br>
Here flvmux will ensure correct synchronization, as long as the audio<br>
and video buffers pushed into the two appsrcs are timestamped<br>
correctly.<br>
<br>
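One practical note: a muxer like flvmux waits for data on all of its sink pads, so it helps to push tags into the two appsrcs in roughly timestamp order rather than draining one stream far ahead of the other; otherwise the queues have to absorb the skew. A small Rust sketch of that interleaving step (the `Tag` type and queue shapes are assumptions, not from this thread; in the real server each popped tag would become a buffer with its PTS set, pushed into `ainput` or `vinput`):<br>
<br>

```rust
// Hypothetical demuxed-RTMP-tag type; only the timestamp matters here.
#[derive(Debug, PartialEq)]
enum Tag {
    Audio { pts_ms: u64 },
    Video { pts_ms: u64 },
}

impl Tag {
    fn pts_ms(&self) -> u64 {
        match self {
            Tag::Audio { pts_ms } | Tag::Video { pts_ms } => *pts_ms,
        }
    }
}

/// Merge two per-stream queues into one push order by timestamp, so the
/// muxer sees roughly interleaved input.
fn interleave(mut audio: Vec<Tag>, mut video: Vec<Tag>) -> Vec<Tag> {
    let mut out = Vec::with_capacity(audio.len() + video.len());
    audio.reverse(); // popping from the back now pops the front
    video.reverse();
    loop {
        match (audio.last(), video.last()) {
            (Some(a), Some(v)) if a.pts_ms() <= v.pts_ms() => {
                out.push(audio.pop().unwrap())
            }
            (Some(_), Some(_)) | (None, Some(_)) => {
                out.push(video.pop().unwrap())
            }
            (Some(_), None) => out.push(audio.pop().unwrap()),
            (None, None) => break,
        }
    }
    out
}

fn main() {
    let audio = vec![
        Tag::Audio { pts_ms: 0 },
        Tag::Audio { pts_ms: 23 },
        Tag::Audio { pts_ms: 46 },
    ];
    let video = vec![Tag::Video { pts_ms: 0 }, Tag::Video { pts_ms: 40 }];
    let order: Vec<u64> =
        interleave(audio, video).iter().map(|t| t.pts_ms()).collect();
    assert_eq!(order, vec![0, 0, 23, 40, 46]);
    println!("ok");
}
```
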
Cheers,<br>
Nirbheek<br>
</div>
</span></font></div>
</body>
</html>