I understand all this, but how does my_mpeg4_dec send the timing data to alsasink?

On Sun, Apr 3, 2011 at 10:06 AM, sudarshan bisht <bisht.sudarshan@gmail.com> wrote:
Timestamps for each video frame should be calculated from the framerate in the my_mpeg4_dec plugin, set on the GstBuffer that carries one frame of video data, and then the GstBuffer should be pushed to the sink element. You can print the timestamp of each frame and check whether it is set correctly.
For more information about clocks, see:
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-clocks.html
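For illustration, here is a minimal sketch (not the actual my_mpeg4_dec code; the helper name push_decoded_frame and the fps_n/fps_d/frame_count fields are assumptions about what the plugin keeps on its instance struct) of how a 0.10 decoder might stamp each decoded buffer from the framerate before pushing it downstream:

    #include <gst/gst.h>

    /* Hypothetical helper: stamp one decoded frame and push it out.
     * fps_n/fps_d come from caps negotiation, frame_count is the number
     * of frames already pushed by this decoder instance. */
    static GstFlowReturn
    push_decoded_frame (GstPad *srcpad, GstBuffer *outbuf,
                        gint fps_n, gint fps_d, guint64 frame_count)
    {
      /* timestamp = frame_count * (fps_d / fps_n) seconds, in nanoseconds */
      GST_BUFFER_TIMESTAMP (outbuf) =
          gst_util_uint64_scale (frame_count, GST_SECOND * fps_d, fps_n);
      /* each frame lasts fps_d / fps_n seconds */
      GST_BUFFER_DURATION (outbuf) =
          gst_util_uint64_scale (GST_SECOND, fps_d, fps_n);

      /* print the timestamp so you can verify it advances at the right rate */
      GST_LOG ("frame %" G_GUINT64_FORMAT " ts %" GST_TIME_FORMAT,
          frame_count, GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (outbuf)));

      return gst_pad_push (srcpad, outbuf);
    }

Note that the decoder never talks to alsasink directly: each sink synchronizes its own buffers' timestamps against the shared pipeline clock (usually provided by alsasink), so correctly timestamped video buffers are all the video sink needs to render at the right speed.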
On Sat, Apr 2, 2011 at 3:22 AM, Radivoje Jovanovic <radivojejovanovic@gmail.com> wrote:
I have the following pipeline:

    gst-launch-0.10 filesrc location=my_movie.mp4 ! qtdemux name=t ! queue ! my_mpeg4_dec ! my_video_sink t. ! queue ! faad ! alsasink

I am using the same pipeline to play MPEG-4 and H.264 videos (different decoder for H.264). H.264 works just fine, but MPEG-4 has sync problems: the video plays twice as fast as it should. If I add sync=false to the my_mpeg4_dec line it all works fine, but I do not want to do it this way. I just read in the documentation that:
<br>"Sometimes it is a parser element the one that knows the time, for instance if a pipeline contains a<br>filesrc element connected to a MPEG decoder element, the former is the one that knows the time of<br>each sample, because the knowledge of when to play each sample is embedded in the MPEG format.<br>
In this case this element will be regarded as the source element for this discussion."<br><br>I
am not sure how does ALSA gets the time stamp when it cannot see mpeg4
directly? Is mpeg4 supposed to update global clock upstream so ALSA can
see it or what?<br>
Cheers,
Ogi
--
Regards,

Sudarshan Bisht