Camerabin : Failed to link camera source's vfsrc pad to viewfinder queue

Lee Matthews lma at
Tue Jun 17 07:45:05 PDT 2014

> What exactly is vfsrc ? I cannot find any reference to it. If it means view finder source, I don't understand why the v4l2src element would have these pads...

I suspect camerabin wants a 'wrappercamerabinsrc' here. So try creating a
'wrappercamerabinsrc' element, set v4l2src on that as source via the
"video-source" property, and then add the wrappercamerabinsrc as source on
camerabin via the "camera-source" property.


Thanks Tim. I'm still having problems though; I think there is some general confusion on my part regarding camerabin.

I have tried setting up a pipeline as follows:

  	g_print ("create camera.\n");
  	pipeline = gst_element_factory_make ("camerabin", "camerabin");
  	if (!pipeline) {
  		g_print ("camera create fail.\n");
  		return HTTP_FAIL;
  	}
  	g_print ("create camera wrapper.\n");
  	camerawrapper = gst_element_factory_make ("wrappercamerabinsrc", "wrappercamerabinsrc");
  	if (!camerawrapper) {
  		g_print ("camerawrapper create fail.\n");
  		return HTTP_FAIL;
  	}
  	g_print ("create source.\n");
  	source = gst_element_factory_make ("v4l2src", "webcam");
  	if (!source) {
  		g_print ("source create fail.\n");
  		return HTTP_FAIL;
  	}
  	g_print ("set device location.\n");
  	g_object_set (source, "device", "/dev/video3", NULL);
  	g_print ("set camerawrapper video source.\n");
  	g_object_set (camerawrapper, "video-source", source, NULL);
  	g_print ("set camerabin camera-source.\n");
  	g_object_set (pipeline, "camera-source", camerawrapper, NULL);
  	g_print ("create vp8enc.\n");
  	vp8enc = gst_element_factory_make ("vp8enc", "vp8enc");
  	if (!vp8enc) {
  		g_print ("vp8enc create fail.\n");
  		return HTTP_FAIL;
  	}
  	g_print ("create webmmux.\n");
  	webmmux = gst_element_factory_make ("webmmux", "webmmux");
  	if (!webmmux) {
  		g_print ("webmmux create fail.\n");
  		return HTTP_FAIL;
  	}
  	g_print ("create multisocketsink.\n");
  	multisocketsink = gst_element_factory_make ("multisocketsink", NULL);
  	if (!multisocketsink) {
  		g_print ("multisocketsink create fail.\n");
  		return HTTP_FAIL;
  	}
  	g_object_set (multisocketsink,
  		"unit-format", GST_FORMAT_TIME,
  		"units-max", (gint64) 7 * GST_SECOND,
  		"units-soft-max", (gint64) 3 * GST_SECOND,
  		"recover-policy", 3 /* keyframe */ ,
  		"timeout", (guint64) 10 * GST_SECOND,
  		"sync-method", 1 /* next-keyframe */ ,
  		NULL);
  	gst_bin_add_many (GST_BIN (pipeline), vp8enc, webmmux, multisocketsink, NULL);
  	if (!gst_element_link_many (vp8enc, webmmux, multisocketsink, NULL)) {
  		g_print ("Unable to link encoder and mux elements.\n");
  		return HTTP_FAIL;
  	}
  	g_print ("set viewfinder.\n");
  	g_object_set (pipeline, "viewfinder-sink", vp8enc, NULL);
  	g_print ("end set viewfinder.\n");


When I set the pipeline to play, I get the following:

I/GStreamer+basetransform(29884): 0:00:18.286744993 0x78566580 gstbasetransform.c:1359:gst_base_transform_setcaps:<imagebin-capsfilter> reuse caps
I/GStreamer+GST_EVENT(29884): 0:00:18.286855150 0x78566580 gstevent.c:677:gst_event_new_caps creating caps event video/x-raw, format=(string)YUY2, width=(int)2304, height=(int)1536, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)2/1
I/GStreamer+v4l2(29884): 0:00:18.319647285 0x78566580 gstv4l2bufferpool.c:473:gst_v4l2_buffer_pool_set_config:<webcam:pool:src> increasing minimum buffers to 2
I/GStreamer+v4l2(29884): 0:00:18.319709420 0x78566580 gstv4l2bufferpool.c:479:gst_v4l2_buffer_pool_set_config:<webcam:pool:src> reducing maximum buffers to 32
I/GStreamer+v4l2(29884): 0:00:18.319761868 0x78566580 gstv4l2bufferpool.c:490:gst_v4l2_buffer_pool_set_config:<webcam:pool:src> can't allocate, setting maximum to minimum
E/GStreamer+v4l2(29884): 0:00:18.334737649 0x78566580 gstv4l2bufferpool.c:550:gst_v4l2_buffer_pool_streamon:<webcam:pool:src> error with STREAMON 22 (Invalid argument)
E/GStreamer+v4l2allocator(29884): 0:00:18.334862805 0x78566580 gstv4l2allocator.c:1326:gst_v4l2_allocator_dqbuf:<webcam:pool:src:allocator> failed dequeuing a mmap buffer: Invalid argument
E/GStreamer+v4l2allocator(29884): 0:00:18.334921816 0x78566580 gstv4l2allocator.c:1338:gst_v4l2_allocator_dqbuf:<webcam:pool:src:allocator> The buffer type is not supported, or the index is out of bounds, or no buffers have been allocated yet, or the userptr or length are invalid.
W/GStreamer+v4l2src(29884): 0:00:18.334977805 0x78566580 gstv4l2src.c:736:gst_v4l2src_create:<webcam> error: Failed to allocate a buffer
I/GStreamer+GST_ERROR_SYSTEM(29884): 0:00:18.335234472 0x78566580 gstelement.c:1832:gst_element_message_full:<webcam> posting message: Failed to allocate a buffer
I/GLib+stdout(29884): Error Failed to allocate a buffer

I seem to remember having a STREAMON 22 error before, and it looks like it was to do with the configuration of the caps. But I'm not sure how I would set the caps for camerabin.

Do I create a bin with v4l2src, my capsfilter and a videoconvert, and then set that bin as the "video-source" property of wrappercamerabinsrc?

Or do I try to set the caps via camerabin's "video-capture-caps" property?
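In case it helps, this is roughly what I would try for that second option. The caps values here are just an example on my part; presumably they need to match a mode the webcam actually reports (e.g. via `v4l2-ctl --list-formats-ext`):

```c
/* Sketch of the "video-capture-caps" route; the format/size/rate values
 * are examples only and would need to match what /dev/video3 supports. */
GstCaps *caps = gst_caps_from_string (
    "video/x-raw, format=YUY2, width=640, height=480, framerate=30/1");
g_object_set (pipeline, "video-capture-caps", caps, NULL);
gst_caps_unref (caps);
```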

I wish to be able to stream encoded video over the network using multisocketsink, and I equally need to be able to save the encoded video to disk on command.

Can I presume that video is passed from camerabin to the viewfinder-sink in raw (unencoded) form? If I wish to use this stream over the network, would I need to do my encoding twice? (i.e. once within camerabin, to save to disk, and once outside of camerabin, to stream over the network?)
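To make the goal concrete: outside of camerabin I would have expected to encode once and split the *encoded* stream with a tee, something like the gst-launch sketch below. The element choices (filesink, tcpserversink) are just my illustration of the pattern, not a claim about what camerabin does internally:

```shell
gst-launch-1.0 v4l2src device=/dev/video3 ! videoconvert ! vp8enc ! tee name=t \
    t. ! queue ! webmmux ! filesink location=capture.webm \
    t. ! queue ! webmmux ! tcpserversink port=5000
```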


Tim Müller, Centricular Ltd

gstreamer-devel mailing list