Streaming ... images getting with OpenCV

Matias Hernandez Arellano msdark at
Mon Mar 28 19:30:41 PDT 2011

I'm still trying..

The project uses Qt libraries, so I will use QtGstreamer for the streaming part.

I read Tiago's code and I see how I can copy the data of an OpenCV image into a Buffer to push into an appsrc pipeline.
So I tried to do that with QtGstreamer:

I create the pipeline:
  QString pipeDescr = QString("appsrc name=\"buffer\" ! queue ! videoparse format=14 "
                              "width=640 height=480 framerate=25/1 "
                              "! videorate ! videoscale ! ffmpegcolorspace "
                              "! video/x-raw-rgb, width=640, height=480 ! avimux ! "
                              "filesink location=\"app_sink_01\"");
        m_pipeline = QGst::Parse::launch(pipeDescr).dynamicCast<QGst::Pipeline>();
        m_pipeline->bus()->addSignalWatch(); // needed so "message::*" signals are emitted
        QGlib::connect(m_pipeline->bus(), "message::error", this, &Player::onBusMessage);
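The downstream half of that description can be sanity-checked from the command line before wiring up appsrc. A sketch, assuming the GStreamer 0.10 tools are installed; videotestsrc stands in for the appsrc that the application will drive, and test.avi is an arbitrary output name:

```shell
# Hypothetical sanity check of the muxing/filesink part of the pipeline
# (videotestsrc replaces appsrc; ffmpegcolorspace converts to whatever
# raw format avimux will accept during negotiation)
gst-launch-0.10 videotestsrc num-buffers=50 \
  ! video/x-raw-rgb,width=640,height=480,framerate=25/1 \
  ! ffmpegcolorspace ! avimux \
  ! filesink location=test.avi
```

If this produces a playable test.avi, any remaining problem is in the appsrc/videoparse side rather than the muxing side.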

Then I try to push the data of the image every time I query a frame:
	 IplImage* frame = capture->queryFrame();
	 uchar* IMdata = (uchar*)frame->imageData;
	 // sizeof(data) was wrong here: it yields the size of a pointer, not of
	 // the image, so size the buffer from the frame itself
	 const QGst::BufferPtr buffer = QGst::Buffer::create(frame->imageSize);

But here I'm stuck: how can I copy the data of the image (IMdata) into a QGst::Buffer so I can use QGst::Utils::ApplicationSource to push the data into the pipeline?

Am I doing this right?

Any idea, guideline or anything else will be appreciated...

Thanks in advance.

On 26-03-2011, at 0:57, Tiago Katcipis wrote:

> On Fri, Mar 25, 2011 at 9:19 PM, Matias Hernandez Arellano <
> msdark at> wrote:
>> (Sorry for my English)
>> Hi, I did some research about this, and finally I have some ideas on how
>> to accomplish it.
>> First: I have an application that uses OpenCV to get the camera frames,
>> then does some image processing and shows the result on the screen. Now I
>> need to send these images (the result) over the network to see the result
>> on another device (or devices), maybe using the HTML5 video tag.
>> After some reading I finally decided to use GStreamer for the streaming
>> part, but I don't know how I can put the OpenCV images (frames from the
>> camera) in a buffer to stream them with GStreamer.
>> So, finally, I read about appsrc.
>> This can be used to push data into a GStreamer pipeline, right?
>> So if I can push the OpenCV data into a pipeline I can use the streaming
>> capabilities of GStreamer and see the result on another device.
>> Am I correct?
>> Any example of how I can do that?
> Don't know if it will help you, but I worked together with a friend of mine
> on some simple code that captures video using GStreamer, processes the data
> in OpenCV (it actually detects faces and draws a rectangle around them using
> OpenCV Haar features) and then pushes the processed OpenCV data into another
> pipeline, using appsrc. The code was just a test of how OpenCV Haar works
> and how to integrate it with GStreamer, so it is a mess (lots of
> commented-out code and even some comments in Portuguese)... but maybe it
> will help you :-).
> The new_buffer callback should interest you; it basically processes a buffer
> coming from an appsink (pipeline 1) and then pushes it into an appsrc
> (pipeline 2).
> Best regards,
> Tiago Katcipis
>> An OpenCV image is basically a matrix containing the information of the
>> image, so I need to push this matrix into a GStreamer pipeline and stream it.
>> Matías Hernandez Arellano
>> Ingeniero de Software/Proyectos en VisionLabs S.A
>> CDA Archlinux-CL
>> _______________________________________________
>> gstreamer-devel mailing list
>> gstreamer-devel at

Matías Hernandez Arellano
Ingeniero de Software/Proyectos en VisionLabs S.A
CDA Archlinux-CL
