feeding images to imagefreeze

Brian McKeon brian.dev.address at gmail.com
Wed Feb 6 07:14:43 PST 2013


Thanks so much Tim!

I didn't realize that was the default behavior for a video sink. Cool.

But there's a complication that I didn't mention earlier... ;)

Eventually, I'd like to be able to send this stream over RTP instead of 
to a video sink. So from what you said, it sounds like this approach 
wouldn't work for that case.
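(For context, the eventual RTP pipeline I have in mind would be 
something along these lines -- the encoder and payloader here are just 
placeholders, not something I've settled on:)

```shell
gst-launch-1.0 -v \
   appsrc ! decodebin ! videoconvert ! x264enc tune=zerolatency ! \
   rtph264pay ! udpsink host=127.0.0.1 port=5000
```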

I'll spend some time looking at the example you provided, though, in 
case that'll work for me.

But assuming that it doesn't, I think I may try my hand at modifying 
imagefreeze. Do you have any thoughts about the viability of that?

(At this point, I'm just trying to figure out whether I'm barking up the 
wrong tree or not.)

Cheers,
Brian



On 2/6/13 9:33 AM, Tim-Philipp Müller wrote:
> On Wed, 2013-02-06 at 09:15 -0500, Brian McKeon wrote:
>
> Hi,
>
>> My pipeline is very basic at the moment because I wanted to start with a
>> proof of concept.
>>
>> It looks like:
>>
>> appsrc ! decodebin ! imagefreeze ! autovideosink
>>
>>
>> And let me restate my goal to try to add clarity:
>>
>> My code is sent a jpeg image every so often. (i.e. the input isn't
>> continuous. New images can arrive at any time.)
>>
>> I need to take this single image and create a continuous output stream
>> with it. Very much like what imagefreeze does.
>>
>> This approach is currently working, but only for the very first image I
>> receive.
>>
>> Every image after that gets enqueued into the appsrc element, but those
>> buffers never get pulled into the output stream. So the output just
>> keeps showing the first image.
>>
>> I was really hoping there was a simple way to trigger the imagefreeze
>> element to tell it to pull the next buffer from appsrc.
> Right, so imagefreeze takes a single static image as input and will
> repeat that ad infinitum basically. I don't think it fits your use case.
>
> If your output is a video sink, you should not need imagefreeze at all.
> You just push the next image whenever you want. Until then, the video
> sink should keep the old image around.
>
> However, if you want to make sure you actually render/output N images
> per second or so, regardless of how often images get pushed in, then
> that won't work of course.
>
> Something like this would almost do the trick then:
>
> gst-launch-0.10 -v \
>    videotestsrc ! video/x-raw-yuv,framerate=1/3 ! intervideosink \
>    intervideosrc ! identity ! ffmpegcolorspace ! ximagesink
>
> or
>
> gst-launch-1.0 -v \
>    videotestsrc ! video/x-raw,framerate=1/3 ! intervideosink  \
>    intervideosrc ! identity silent=false ! videoconvert ! ximagesink
>
> but you'll notice that it reverts to a black frame if it hasn't received
> a buffer for a while (~1 second). I think this value is currently
> hard-coded inside the intervideo* elements, so it can't easily be
> changed (though there's no reason not to expose a property for it).
>
> Cheers
>   -Tim
>
>
>
>
>
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
>


