[gst-devel] Starting to use GStreamer as both streaming source and target

Julien Moutte julien at moutte.net
Tue Feb 22 02:43:05 CET 2005


You should really have a look at Flumotion (http://www.flumotion.net).

Cheers,

On Mon, 2005-02-21 at 20:28 +0100, Philip Van Hoof wrote:
> Hi there,
> 
> At the location where I'm doing a project (Maia Scientific) we are
> creating an application that basically asks a motorized microscope to go
> to a certain XY and Z (focus) position and take a picture. After taking
> the picture some image analysis is performed upon it. Like counting
> cells. Or measuring their sizes. Stuff like that.
> 
> Not really important for the question, but some of you might like to
> know what kind of technology some of your open-source work is being
> used in today.
> 
> For taking those pictures we are, of course, using cameras. And for
> bringing the video of such a camera to the computer we are, of course,
> using a professional framegrabber. A framegrabber which isn't supported
> by a typical "Video4Linux" device driver. The device driver that takes
> care of support on Linux basically makes the framebuffer memory readable
> and some ioctl calls possible from userspace, and there's a binary-only
> library to do useful stuff with the card.
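A side note for the list: the userspace side of a driver like that usually
boils down to open + mmap + a few ioctls. A rough sketch of that pattern,
where the device node, the ioctl request and the frame size are placeholders
of mine, not the card's real interface:

    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    #define GRAB_START  0x4701              /* placeholder ioctl request */
    #define FB_SIZE     (768 * 576 * 3)     /* one RGB PAL-sized frame   */

    int main (void)
    {
      /* Placeholder device node; the real driver exposes its own name. */
      int fd = open ("/dev/framegrabber0", O_RDWR);
      if (fd < 0)
        return 1;

      /* Map the card's framebuffer memory into our address space. */
      unsigned char *fb = mmap (NULL, FB_SIZE, PROT_READ, MAP_SHARED, fd, 0);
      if (fb == MAP_FAILED)
        return 1;

      ioctl (fd, GRAB_START);   /* ask the driver to start grabbing */
      /* ... fb now points at live frame data ... */

      munmap (fb, FB_SIZE);
      close (fd);
      return 0;
    }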
> 
> We'd now like to create a small video-streaming server. Why would we
> like to do this: the customer who is going to use the device/software
> would like to see live video images coming from the "microscope". We,
> however, have separated the user interface from the core software, so
> we can't access the video framegrabber from the user interface directly.
> 
> The good thing, however, is that both pieces of software use common
> technologies also seen in many GNOME applications. They currently
> communicate through CORBA, using ORBit-2.
> 
> We can, in the core software, trigger an event each time a frame is
> written/available. We have very good knowledge of the device itself and
> know when a vsync happens. We also know when odd/even lines are written.
> We can create a ringbuffer of multiple frames, know which frame was the
> last one written, and read all of those frames.
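That is essentially everything a streaming source needs. Just to check I
read you correctly, I imagine the core side looks roughly like the sketch
below (the names and the fixed PAL geometry are my own invention):

    #include <string.h>

    #define NUM_FRAMES  4
    #define FRAME_SIZE  (768 * 576 * 3)     /* packed RGB, PAL geometry */

    typedef struct {
      unsigned char frames[NUM_FRAMES][FRAME_SIZE];
      volatile int  last;                   /* index of last complete frame */
    } FrameRing;

    /* Called from the grabber's frame-written / vsync event. */
    static void
    on_frame_written (FrameRing *ring, const unsigned char *fb)
    {
      int next = (ring->last + 1) % NUM_FRAMES;
      memcpy (ring->frames[next], fb, FRAME_SIZE);
      ring->last = next;
    }

    /* Called by the streaming side whenever it wants the newest frame. */
    static const unsigned char *
    latest_frame (const FrameRing *ring)
    {
      return ring->frames[ring->last];
    }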
> 
> We have a pointer to the framebuffer (which holds a typical RGB PAL
> image) and we know the formatting and how to, for example, write it to
> an image-file.
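Dumping such a frame to a file is indeed trivial for packed 24-bit RGB; a
sketch assuming a 768x576 frame and the plain PPM format, to be adjusted to
whatever the grabber really delivers:

    #include <stdio.h>

    /* Dump one packed 24-bit RGB frame to a binary PPM file. */
    static int
    write_ppm (const char *path, const unsigned char *rgb,
               int width, int height)
    {
      FILE *f = fopen (path, "wb");
      if (f == NULL)
        return -1;
      fprintf (f, "P6\n%d %d\n255\n", width, height);
      fwrite (rgb, 3, (size_t) width * height, f);
      fclose (f);
      return 0;
    }

    /* e.g. write_ppm ("frame.ppm", framebuffer_ptr, 768, 576); */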
> 
> How can we, using that as the source, easily create a GStreamer video
> streaming server? And how can we easily create a client that will
> display the video stream? (I'll probably look at the sources of Totem
> for the client.) Where can I find the demos and sources? Which
> applications demonstrate what we are planning to create?
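Flumotion sits on top of GStreamer and does most of the server work for
you, but a hand-rolled pipeline is not much code either. Here is a rough
sketch of the server side, written against the current GStreamer 1.x API
rather than the 0.8 series this thread used (so take element and function
names as assumptions to translate; appsrc in particular did not exist back
then). The dummy frame buffer stands in for your ring buffer, and the
geometry, framerate and port number are made up:

    /* Build with: gcc server.c $(pkg-config --cflags --libs gstreamer-app-1.0) */
    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>

    #define WIDTH   768
    #define HEIGHT  576
    #define FPS     25

    /* Stand-in for the grabber's ring buffer. */
    static unsigned char frame[WIDTH * HEIGHT * 3];

    int
    main (int argc, char **argv)
    {
      gst_init (&argc, &argv);

      /* appsrc receives the raw RGB frames we push; jpegenc/multipartmux/
       * tcpserversink give a very simple stream a client can pick up. */
      GstElement *pipeline = gst_parse_launch (
          "appsrc name=grabsrc is-live=true format=time "
          "! videoconvert ! jpegenc ! multipartmux "
          "! tcpserversink port=5000", NULL);
      GstElement *src = gst_bin_get_by_name (GST_BIN (pipeline), "grabsrc");

      /* Tell appsrc what the raw frames look like. */
      GstCaps *caps = gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "RGB",
          "width", G_TYPE_INT, WIDTH,
          "height", G_TYPE_INT, HEIGHT,
          "framerate", GST_TYPE_FRACTION, FPS, 1, NULL);
      gst_app_src_set_caps (GST_APP_SRC (src), caps);
      gst_caps_unref (caps);

      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* In the real application this loop would be driven by the
       * frame-written event and copy from the ring buffer instead. */
      for (int i = 0; i < 10 * FPS; i++) {
        gsize size = sizeof (frame);
        GstBuffer *buf = gst_buffer_new_allocate (NULL, size, NULL);
        gst_buffer_fill (buf, 0, frame, size);
        GST_BUFFER_PTS (buf) = gst_util_uint64_scale (i, GST_SECOND, FPS);
        GST_BUFFER_DURATION (buf) = GST_SECOND / FPS;
        gst_app_src_push_buffer (GST_APP_SRC (src), buf); /* takes ownership */
        g_usleep (G_USEC_PER_SEC / FPS);
      }

      gst_app_src_end_of_stream (GST_APP_SRC (src));
      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (src);
      gst_object_unref (pipeline);
      return 0;
    }

On the client side a pipeline along the lines of tcpclientsrc host=<server>
port=5000 ! multipartdemux ! jpegdec ! videoconvert ! autovideosink (again
1.x element names) should display the stream; looking at how Totem drives
playbin is still useful, but a small dedicated client pipeline like that is
probably simpler for a first version.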
-- 
Julien Moutte <julien at moutte.net>




