Hi,

I have a server-client video streaming setup using GStreamer and RTP. So basically:

[source storage] --> [server] ---------> [client] --> [frames rendered on screen]

Is there any way to know, for each rendered frame, what the original playing time was in the video source on the server?
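To make the setup concrete, here is a rough Python/GStreamer sketch of the two pipelines I mean. The codec and element choices (H.264, x264enc, rtph264pay, UDP on port 5000, and so on) are only illustrative assumptions, not necessarily what the real pipelines use:

#!/usr/bin/env python3
# Rough sketch of the setup described above; all element choices
# are illustrative. In practice the server and client would run on
# different machines (or at least in separate processes).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Server side: read from storage, decode, re-encode, payload as RTP,
# send over UDP.
server = Gst.parse_launch(
    "filesrc location=source.mp4 ! decodebin ! videoconvert ! "
    "x264enc tune=zerolatency ! rtph264pay ! "
    "udpsink host=127.0.0.1 port=5000"
)

# Client side: receive RTP over UDP, depayload, decode, render.
client = Gst.parse_launch(
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'clock-rate=90000,encoding-name=H264,payload=96" ! '
    "rtph264depay ! avdec_h264 ! videoconvert ! autovideosink"
)

server.set_state(Gst.State.PLAYING)
client.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()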
For example:
 - At 10 fps, after 300 seconds of playing, the "playing time" would be 300.0 seconds.
 - But if the network is slow, the playing time could very well be only 276.4 seconds.
 - Or some frames could be lost if using UDP.
 - Also, if the source storage has a variable framerate, I cannot use "10 fps" or any other average fps for any calculations.
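To illustrate the kind of per-frame hook I am thinking about on the client side: a pad probe like the Python sketch below can print each rendered frame's PTS, but that PTS is a timestamp in the client pipeline, not the playing position inside the server's source, which is exactly the value I am missing. The pipeline string and element names are again only illustrative:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Receiving pipeline as above, with the sink named so a probe can
# be attached to its sink pad.
client = Gst.parse_launch(
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'clock-rate=90000,encoding-name=H264,payload=96" ! '
    "rtph264depay ! avdec_h264 ! videoconvert ! autovideosink name=sink"
)

def on_frame(pad, info):
    buf = info.get_buffer()
    if buf is not None and buf.pts != Gst.CLOCK_TIME_NONE:
        # PTS of the frame about to be rendered, in the client
        # pipeline's timeline; NOT the playing time inside the
        # server's source file.
        print("rendered frame pts: %.3f s" % (buf.pts / Gst.SECOND))
    return Gst.PadProbeReturn.OK

sinkpad = client.get_by_name("sink").get_static_pad("sink")
sinkpad.add_probe(Gst.PadProbeType.BUFFER, on_frame)

client.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()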
How could this be achieved?

Thanks!

--
Regards,
 Bruno González
_______________________________________________
Jabber: stenyak AT gmail.com
http://www.stenyak.com