Rendering raw video with GStreamer
wmetcalf at niftytv.com
Thu Jul 7 07:31:08 PDT 2011
So using y4menc I would capture the video frames into a buffer, push the
buffer into a pipeline using appsrc, link appsrc to y4menc, and then
write the result to a file with filesink.
Essentially - buffer->appsrc->y4menc->filesink
Then to play the file - filesrc->videoparse->autovideosink
Does this seem correct, or am I doing this incorrectly?
On 7/6/2011 6:26 PM, David Schleef wrote:
> On Wed, Jul 06, 2011 at 05:58:13PM -0500, William Metcalf wrote:
>> I am developing a C application using GStreamer and the DeckLink API which pulls in video frames from a capture card, stores them into a buffer, and pushes that buffer through a GStreamer pipeline. One of my development steps right now is to write the buffers into a file and try to play that file with GStreamer. I am using the following pipeline to try and play the file:
>> gst-launch filesrc location=video.raw ! videoparse ! autovideosink
> If you want to store raw video, I would start with y4menc/y4mdec.
> Otherwise, if you really want to store raw video with no framing
> information, you'll have to tell videoparse all the information
> about the video.
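To illustrate what "tell videoparse all the information" means, a hedged sketch of such a pipeline follows. The width, height, format, and framerate values are assumptions for illustration; they must match exactly how the raw frames were captured, and the property names may differ between GStreamer versions (the element lives in gst-plugins-bad):

```shell
# Playing headerless raw video: every parameter has to be supplied by hand,
# since the file itself carries no framing information. Example values only.
gst-launch filesrc location=video.raw ! \
    videoparse width=720 height=480 format=i420 framerate=30/1 ! \
    ffmpegcolorspace ! autovideosink
```

If any of these values is wrong, the output will be shifted, tinted, or garbled, which is exactly the failure mode the y4m header avoids.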
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org