How to capture jpeg-pictures from a live video stream
Edgar Thier
edgar.thier at theimagingsource.com
Tue Jun 19 11:37:28 UTC 2018
On 06/19/2018 11:46 AM, katariinaT wrote:
> Hi! I'm working on a project where I'm supposed to capture jpeg-pictures from
> a live video stream. I've been reading and researching for some time now,
> but haven't found any useful guides.
> I'm using multiple IP-cameras, so the goal would be to capture a
> jpeg-picture from each camera for example every 10 seconds. Does anyone have
> any good advice or idea how to implement or approach a problem like this?
> At the moment I'm able to connect and stream video from multiple IP-cameras,
> but the rest is yet unsolved.
> Thanks beforehand,
> Katariina
>
The most reliable way is a separate pipeline that saves the image.
Depending on how your application is structured, you can either use the buffer from the appsink you
already have, or, if you render with a ximagesink or similar, add a fakesink after a tee and pull
its last-sample property.
The sample or buffer can then be pushed into a pipeline such as:
appsrc ! queue ! videoconvert ! jpegenc ! multifilesink location=/tmp/image-%05d.jpeg
Note that filesink always writes a single file; multifilesink expands the %05d
index so every snapshot gets its own file.
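For the original requirement of one JPEG per camera every 10 seconds, an alternative without any application code is to let videorate drop the stream down to one frame per 10 seconds on a tee branch. A hedged gst-launch-1.0 sketch, with the RTSP URL as a placeholder:

```shell
# Sketch: one JPEG every 10 s from one camera; replace CAMERA-URL.
gst-launch-1.0 rtspsrc location="rtsp://CAMERA-URL" ! decodebin \
  ! videoconvert ! videorate ! video/x-raw,framerate=1/10 \
  ! jpegenc ! multifilesink location=/tmp/cam1-%05d.jpeg
```

The caps filter framerate=1/10 means one frame per ten seconds; run one such pipeline (or tee branch) per camera.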
Regards,
Edgar
More information about the gstreamer-devel mailing list