[gst-devel] Why does this video format conversion fail?

wally_bkg wb666greene at gmail.com
Fri Dec 10 18:13:21 CET 2010



wally_bkg wrote:
> 
> I need to do it in C, but I can illustrate the problem simply with a few
> gst-launch commands.
> 
> Basically I have a USB video capture device (Hauppauge WinTV-HVR 950Q)
> that works with gstreamer if I simply do:
> 
> gst-launch v4l2src device=/dev/video2 ! xvimagesink
> 
> However I'm having trouble figuring out a caps filter to use that will let
> me get the buffers in a yuv type format.
> 
> 
> On a normal capture card if I do:
> 
> gst-launch v4l2src device=/dev/video1 ! video/x-raw-yuv,
> framerate=\(fraction\)30000/1001, width=640, height=480 ! xvimagesink
> 
> It works fine, but if I change to /dev/video2 (the USB device) I get:
> 
> Setting pipeline to PAUSED ...
> ERROR: Pipeline doesn't want to pause.
> ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not
> negotiate format
> Additional debug info:
> gstbasesrc.c(2719): gst_base_src_start ():
> /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
> Check your filtered caps, if any
> 
> 
> 
> So I tried using ffmpegcolorspace to convert:
> 
> gst-launch v4l2src device=/dev/video2 ! ffmpegcolorspace !
> video/x-raw-yuv, framerate=\(fraction\)30000/1001, width=640, height=480 !
> ffmpegcolorspace ! xvimagesink
> 
> And I get the same error message as without the ffmpegcolorspace elements
> around the capsfilter.
> 
> 
> One of my main reasons for trying to use gstreamer is to let it do the
> heavy lifting of dealing with video input and output. At the end of the
> day, all I want from the appsink element is a pointer to the video data in
> a format documented well enough that I can pull out a 640x480 intensity
> (grayscale) image.
> 
> Up to getting this device, setting the caps to { video/x-raw-yuv,
> framerate=\(fraction\)30000/1001, width=640, height=480 } has worked fine
> for all the capture cards I've tried, and obviously needing to deal with
> only a single raw format in my code simplifies it greatly.
> 
> 
> In my C code, I'm having trouble extracting the caps that get negotiated
> if I leave out the capsfilter from my pipeline.  Are there any samples out
> there of how to do it?
> 
> 

I figured out how to extract the caps.
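In case it helps anyone searching the archives, this is roughly what I ended
up with (a sketch against the GStreamer 0.10 API; the element pointer and pad
name are from my own pipeline, so adjust them to yours, and note the caps are
only available once the pipeline has reached PAUSED or PLAYING):

```c
/* Sketch: print the caps negotiated on one pad of a running pipeline.
 * GStreamer 0.10 API; call after the pipeline reaches PAUSED/PLAYING. */
#include <gst/gst.h>

static void
print_negotiated_caps (GstElement *element, const gchar *padname)
{
  GstPad  *pad  = gst_element_get_static_pad (element, padname);
  /* Returns NULL if negotiation hasn't happened yet. */
  GstCaps *caps = gst_pad_get_negotiated_caps (pad);

  if (caps != NULL) {
    gchar *desc = gst_caps_to_string (caps);
    g_print ("Negotiated caps: %s\n", desc);
    g_free (desc);
    gst_caps_unref (caps);
  } else {
    g_print ("Pad %s not negotiated yet\n", padname);
  }
  gst_object_unref (pad);
}
```

I call it with the v4l2src element and "src" as the pad name after setting the
pipeline to PLAYING.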

When using /dev/video1 (saa713x card) the "default" buffer caps are:
video/x-raw-gray, bpp=(int)8, framerate=(fraction)30000/1001,
width=(int)704, height=(int)480

When using /dev/video2 (the 950Q USB device) the buffer caps are: video/x-raw-rgb,
bpp=(int)24, depth=(int)24, red_mask=(int)255, green_mask=(int)65280,
blue_mask=(int)16711680, endianness=(int)4321,
framerate=(fraction)30000/1001, width=(int)720, height=(int)480

But this doesn't give me any clues as to why ffmpegcolorspace can't convert
the rgb caps to the yuv or gray caps I'd prefer to use.
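One thing I should probably check, though: ffmpegcolorspace only converts
pixel formats, it never changes the frame size, and the 950Q reports 720x480
rather than 640x480. So a capsfilter demanding width=640 may be unsatisfiable
no matter what colorspace converter is in front of it. Something along these
lines, with videoscale added to handle the resize (untested with this device),
might negotiate:

```shell
gst-launch v4l2src device=/dev/video2 ! ffmpegcolorspace ! videoscale ! \
    video/x-raw-yuv, framerate=\(fraction\)30000/1001, width=640, height=480 ! \
    xvimagesink
```

If that works, swapping xvimagesink for appsink should give my C code the
single yuv format it expects.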



-- 
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Why-does-this-video-format-conversion-fail-tp3080822p3082344.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.



