[gst-devel] Use videorate in MJPEG pipeline
Ron McOuat
rmcouat at smartt.com
Tue Nov 18 07:55:56 CET 2008
I figured out why pre-roll never completed for the pipeline quoted
below. During pre-roll the buffers arrive at videorate without
timestamps, because the clock does not start until the pipeline
reaches the PLAYING state, so videorate throws away every buffer it
receives on its sink pad (it sees no valid timestamp,
GST_CLOCK_TIME_NONE, in the buffers). As a result the sink at the end
never receives a video buffer and the PAUSED-state pre-roll never
completes.
I forced live mode on the gnomevfssrc element and now the pipeline plays.
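For reference, from application code (gst-launch offers no way to set
it on gnomevfssrc) the live flag can be forced roughly as follows.
This is only a sketch; the helper name is arbitrary and not how my
local change is actually implemented.

/* Sketch: force live mode on the source from application code.
 * gnomevfssrc is a GstBaseSrc subclass, so the base-class live flag
 * can be set directly even though no is-live property is exposed. */
#include <gst/gst.h>
#include <gst/base/gstbasesrc.h>

static GstElement *
make_live_camera_src (const gchar *uri)
{
  GstElement *src = gst_element_factory_make ("gnomevfssrc", "camsrc");

  g_object_set (src, "location", uri, "do-timestamp", TRUE, NULL);
  gst_base_src_set_live (GST_BASE_SRC (src), TRUE);

  return src;
}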
I need the videorate element to handle the image/jpeg MIME type so I
can split the feed from an Axis camera into two different sinks; I am
using videorate only to lower the frame rate on the tee branch that
does not need the full rate, roughly as in the sketch below. Doing
this lowers network bandwidth by acquiring the camera data once
instead of opening two feeds over the network. I have no need to
decode the JPEG frames; the jpegdec pipeline quoted below is only
there to illustrate the problem.
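For illustration, the two-branch pipeline I am after would look
something like this (only a sketch: the filesink branches stand in
for my real sinks, and it assumes a videorate patched to accept
image/jpeg as discussed below):

gst-launch -v gnomevfssrc
location="http://root:pass@192.168.2.90/axis-cgi/mjpg/video.cgi?fps=8&resolution=640x480"
do-timestamp=true ! multipartdemux !
image/jpeg,width=640,height=480,framerate=8/1 ! tee name=t
t. ! queue ! filesink location=full-rate.mjpeg
t. ! queue ! videorate ! image/jpeg,framerate=2/1 !
filesink location=low-rate.mjpeg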
Questions on alternative ways to address this:
1) Enhance videorate
Should I submit an enhancement request, or would that be stretching
the videorate component to a use that is not intended? I could also
write my own plugin, using videorate as a model. As mentioned in my
first posting, I have already added the image/jpeg MIME type to the
src and sink caps templates in my local source copy. The failure to
complete pre-roll could probably be fixed by not discarding
untimestamped buffers during pre-roll, or by passing a single buffer
through as a special case in the PAUSED state so the sink is
satisfied. Otherwise videorate only works with live sources, or at a
point in the pipeline where a decoder has already added timestamps to
the raw video buffers.
2) Use is-live on source to avoid pre-roll
gnomevfssrc and souphttpsrc do not have an is-live property (as
videotestsrc does, for example) that would allow the live flag in
GstBaseSrc to be set from gst-launch. The property could be added to
each element, or would it be better to add it to GstBaseSrc itself,
like do-timestamp, so that any source could expose it through the
base class? Maybe there is a reason for not making the property
generally available on sources that can also read files (too many
problems when used for the wrong purpose)? Getting frames from a
network camera over HTTP is sort of live, because frames are lost
while the pipeline is paused, yet the source can still pre-roll and
supply data, which a live source generally cannot. The do-timestamp
option only takes effect once the pipeline goes to PLAYING.
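To make option 2 concrete, the base-class property could follow the
existing do-timestamp pattern, roughly as sketched below
(PROP_IS_LIVE and the description strings are placeholders, not
actual GstBaseSrc code):

/* in gst_base_src_class_init (): */
g_object_class_install_property (gobject_class, PROP_IS_LIVE,
    g_param_spec_boolean ("is-live", "Is live",
        "Act as a live source (do not produce data in PAUSED)",
        FALSE, G_PARAM_READWRITE));

/* in gst_base_src_set_property (): */
case PROP_IS_LIVE:
  gst_base_src_set_live (src, g_value_get_boolean (value));
  break;

/* in gst_base_src_get_property (): */
case PROP_IS_LIVE:
  g_value_set_boolean (value, gst_base_src_is_live (src));
  break;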
Suggestions?
Ron McOuat wrote:
> I would like to use videorate in an MJPEG pipeline in order to get one
> feed from, for example, an Axis camera, then use a tee to split the
> image/jpeg stream and deliver it to two different destination sinks at
> different frame rates: for example, deliver image/jpeg at 8 fps to one
> sink and at 2 fps to another by putting videorate in the 2 fps tee
> branch.
>
> To illustrate, I tested and got videorate running correctly with the
> following gst-launch command (no tee to two sinks yet, for
> simplicity), sending the output to a window so I could observe it:
>
> gst-launch -v gnomevfssrc
> location="http://root:pass@192.168.2.90/axis-cgi/mjpg/video.cgi?fps=8&resolution=640x480"
> do-timestamp=true ! multipartdemux !
> image/jpeg,width=640,height=480,framerate=8/1 ! jpegdec ! videorate !
> video/x-raw-yuv,framerate=1/5 ! xvimagesink
>
> where the frame rate change happens in the section of the pipeline
> whose MIME type is video/x-raw-yuv. The pipeline refused to run until
> framerate=8/1 was added to the caps entry after multipartdemux.
> Watching the X window, the camera's on-screen time updates once every
> 5 seconds, as requested by videorate's output framerate of 1/5.
>
>
> Since the pad templates on videorate allow only video/x-raw-yuv and
> video/x-raw-rgb, I altered the source to add image/jpeg as a third
> MIME type in the src and sink pad templates:
>
> lines 112 and 119 in gstvideorate.c were changed from
>
> GST_STATIC_CAPS ("video/x-raw-yuv; video/x-raw-rgb")
>
> to
>
> GST_STATIC_CAPS ("video/x-raw-yuv; video/x-raw-rgb; image/jpeg")
>
> snip rest of first post.
>