[gst-devel] how to learn from .yuv files and encode with h.264?
Rafael Sousa
rafael.lmsousa at gmail.com
Wed Nov 3 16:09:08 CET 2010
Hi Marco
Thanks again, you solved my problem, the code worked beautifully.
regards
On Tue, Nov 2, 2010 at 4:13 PM, Rafael Sousa <rafael.lmsousa at gmail.com>wrote:
> the gst-launch command works fine for me. I was using the C API because it was
> the first example I found that was close to what I want, but the command-line
> way works just fine too.
>
> I'll try your solution. Thanks very much for the attention.
>
>
> On Tue, Nov 2, 2010 at 2:57 PM, Marco Ballesio <gibrovacco at gmail.com>wrote:
>
>> Hi,
>>
>> On Mon, Nov 1, 2010 at 5:56 PM, Rafael Sousa <rafael.lmsousa at gmail.com>wrote:
>>
>>> Marco,
>>>
>>> The modification that you proposed to use mp4mux didn't work; the
>>> same error occurred. I'm using the yuv format because, after the
>>> transmission of the video, I intend to calculate the PSNR between the
>>> source yuv and the received yuv for academic analysis. But the mp4 format
>>> also helps.
>>>
>>
>> the attached recv.c should do the trick. I just wonder why at this point
>> you can't simply use gst-launch...
>>
>> Regards
>>
>>
>>>
>>> Sorry for the bother, but I've researched a lot before appealing to
>>> this list.
>>>
>>> thanks for the help,
>>> best regards,
>>>
>>>
>>> On Mon, Nov 1, 2010 at 4:44 AM, Marco Ballesio <gibrovacco at gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> On Mon, Nov 1, 2010 at 2:46 AM, Rafael Sousa <rafael.lmsousa at gmail.com>wrote:
>>>>
>>>>> Hi Marco
>>>>>
>>>>> thank you so much for the help, it worked very well here... But what I
>>>>> want is to store the video to disk at the receiver. I tried to modify
>>>>> the example that you sent to me, but it didn't save the video. I modified
>>>>> the following in recv.c. I used the foreman_cif.yuv video sample
>>>>> (352x288).
>>>>>
>>>>>
>>>>> #define PIPELINE_FORMAT "\
>>>>> gstrtpbin name=rtpbin \
>>>>> udpsrc caps=\"application/x-rtp,media=(string)video,clock-rate=90000,encoding-name=(string)H264\" \
>>>>> port=5000 ! rtpbin.recv_rtp_sink_0 \
>>>>> rtpbin. ! rtph264depay ! ffdec_h264 ! \"video/x-raw-yuv, width=352, height=288, format=(fourcc)I420\" ! filesink location=test_rcv.yuv ! \
>>>>>
>>>>
>>>> you don't need the capsfilter between decoder and filesink.. everything
>>>> will automagically work ;).
>>>>
>>>> Btw, unless you're in love with raw yuv I suggest using a container
>>>> format.. in this case you don't even need to decode and thus you could save
>>>> some CPU at recording time. Try with this pipeline:
>>>>
>>>>
>>>> gstrtpbin name=rtpbin \
>>>> udpsrc caps=\"application/x-rtp,media=(string)video,clock-rate=90000,encoding-name=(string)H264\" \
>>>> port=5000 ! rtpbin.recv_rtp_sink_0 \
>>>> rtpbin. ! rtph264depay ! mp4mux ! filesink location=test_rcv.mp4 ! \
>>>>
>>>> etc. etc.
>>>>
>>>> NOTE: you'll need to add a way to send the EOS to the muxer element
>>>> (SIGHUP signal handler/read the manual ;) ), but setting the mp4mux property
>>>> "faststart" to true could make the file playable even when manually stopping
>>>> the pipeline with ctrl+c (disclaimer: I've not tried it).
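>>>>
>>>> For example (just an untested sketch; "pipeline" and "loop" are
>>>> placeholders for whatever recv.c already uses), a SIGINT handler could
>>>> inject the EOS on ctrl+c and a bus watch would tear the pipeline down
>>>> only after the muxer has finished writing:
>>>>
>>>> #include <gst/gst.h>
>>>> #include <signal.h>
>>>>
>>>> static GstElement *pipeline;   /* set after creating the pipeline */
>>>> static GMainLoop  *loop;       /* set in main() */
>>>>
>>>> /* ctrl+c: push EOS into the pipeline so mp4mux can finalise the file */
>>>> static void
>>>> on_sigint (int signum)
>>>> {
>>>>   gst_element_send_event (pipeline, gst_event_new_eos ());
>>>> }
>>>>
>>>> /* bus watch: once EOS has travelled through the muxer, stop for real */
>>>> static gboolean
>>>> bus_cb (GstBus *bus, GstMessage *msg, gpointer data)
>>>> {
>>>>   if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_EOS) {
>>>>     gst_element_set_state (pipeline, GST_STATE_NULL);
>>>>     g_main_loop_quit (loop);
>>>>   }
>>>>   return TRUE;
>>>> }
>>>>
>>>> /* in main(): signal (SIGINT, on_sigint);
>>>>  * gst_bus_add_watch (gst_element_get_bus (pipeline), bus_cb, NULL); */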
>>>>
>>>> Regards
>>>>
>>>>
>>>>> udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
>>>>> rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false"
>>>>>
>>>>>
>>>>> it returned an error:
>>>>>
>>>>> Error: internal error in data stream.
>>>>>
>>>>> I'm very grateful for your help.
>>>>>
>>>>> regards
>>>>>
>>>>>
>>>>>
>>>>> On Sun, Oct 31, 2010 at 4:48 PM, Marco Ballesio <gibrovacco at gmail.com>wrote:
>>>>>
>>>>>> mmhh.. md5 from:
>>>>>> http://gstreamer-devel.966125.n4.nabble.com/how-to-learn-from-yuv-files-and-encode-with-h-264-td3017365.html#a3017365
>>>>>>
>>>>>> is different. Attaching the sources directly.
>>>>>>
>>>>>> Regards
>>>>>>
>>>>>>
>>>>>> On Sun, Oct 31, 2010 at 10:33 PM, Marco Ballesio <
>>>>>> gibrovacco at gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> it appears the app you sent is actually more complex than what you
>>>>>>> need.. btw, some functions are not accessible from the code snippet you
>>>>>>> sent.
>>>>>>>
>>>>>>> I found some spare time to write a minimalistic send/receive pair
>>>>>>> of applications, the first one reading from a yuv file generated with:
>>>>>>>
>>>>>>> gst-launch -v videotestsrc num-buffers=2000 ! "video/x-raw-yuv,
>>>>>>> width=320, height=240, format=(fourcc)I420" ! filesink location=test.yuv
>>>>>>>
>>>>>>> and streaming to an address specified from command line. The second
>>>>>>> app opens a connection and renders all the h264 frames it receives from it.
>>>>>>> Hopefully it will give you an idea about how to get your app working.
>>>>>>>
>>>>>>> P.S. added back the gst-devel mailing list to the loop.
>>>>>>> P.P.S hopefully the attachment will make its way through the
>>>>>>> moderation.
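>>>>>>>
>>>>>>> In case it doesn't, here is a rough, untested sketch of the sender idea
>>>>>>> (not the attached file itself; the 320x240/I420 values match the
>>>>>>> test.yuv command above, and blocksize = 320*240*3/2 = 115200 bytes,
>>>>>>> i.e. exactly one frame per buffer):
>>>>>>>
>>>>>>> #include <gst/gst.h>
>>>>>>>
>>>>>>> int
>>>>>>> main (int argc, char *argv[])
>>>>>>> {
>>>>>>>   GstElement *pipeline;
>>>>>>>   GError *error = NULL;
>>>>>>>   gchar *desc;
>>>>>>>
>>>>>>>   gst_init (&argc, &argv);
>>>>>>>
>>>>>>>   /* read one raw I420 frame per buffer, encode to H.264, pay as RTP */
>>>>>>>   desc = g_strdup_printf (
>>>>>>>       "filesrc location=test.yuv blocksize=115200 ! "
>>>>>>>       "video/x-raw-yuv,width=320,height=240,format=(fourcc)I420,"
>>>>>>>       "framerate=(fraction)30/1 ! "
>>>>>>>       "x264enc ! rtph264pay ! udpsink host=%s port=5000",
>>>>>>>       argc > 1 ? argv[1] : "127.0.0.1");
>>>>>>>
>>>>>>>   pipeline = gst_parse_launch (desc, &error);
>>>>>>>   g_free (desc);
>>>>>>>   if (pipeline == NULL) {
>>>>>>>     g_printerr ("parse error: %s\n", error ? error->message : "unknown");
>>>>>>>     return 1;
>>>>>>>   }
>>>>>>>
>>>>>>>   gst_element_set_state (pipeline, GST_STATE_PLAYING);
>>>>>>>   /* a real app would watch the bus for EOS/errors; enough for a sketch */
>>>>>>>   g_main_loop_run (g_main_loop_new (NULL, FALSE));
>>>>>>>   return 0;
>>>>>>> }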
>>>>>>>
>>>>>>> Regards
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Oct 29, 2010 at 4:57 PM, Rafael Sousa <
>>>>>>> rafael.lmsousa at gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi Gibro,
>>>>>>>>
>>>>>>>> As I said, I'm a newbie on this list, and I don't know exactly how
>>>>>>>> to fix the issues in my code, so I'll send my whole code to you. If
>>>>>>>> you could take a look, I'd be very grateful for your help.
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Oct 29, 2010 at 2:05 AM, Gibro Vacco <gibrovacco at gmail.com>wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> it appears the sources you attached to the message did not go
>>>>>>>>> through the moderation.. btw a few comments below.
>>>>>>>>>
>>>>>>>>> ..snip..
>>>>>>>>>
>>>>>>>>> > //Possible problem
>>>>>>>>> >   size = 352*288*(3/2);
>>>>>>>>> >   buffer = gst_buffer_try_new_and_alloc (size);
>>>>>>>>>
>>>>>>>>> is there a particular reason for allocating this from the
>>>>>>>>> application? The pipeline is usually handling buffer allocation/release
>>>>>>>>> automagically.
>>>>>>>>>
>>>>>>>>> >   if (buffer==NULL){
>>>>>>>>> >     g_printerr("failed to allocate memory\n");
>>>>>>>>> >   }
>>>>>>>>> > //Possible problem
>>>>>>>>> >   gst_buffer_set_caps (buffer,caps);
>>>>>>>>> > //Set up the video encoding parameters
>>>>>>>>> >   caps = gst_caps_new_simple ("video/x-raw-yuv",
>>>>>>>>> >       "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'),
>>>>>>>>> >       "width", G_TYPE_INT, 352,
>>>>>>>>> >       "height", G_TYPE_INT, 288,
>>>>>>>>> >       "framerate", GST_TYPE_FRACTION, 25, 1, NULL);
>>>>>>>>> >   if ( !caps ) {
>>>>>>>>> >     g_printerr("Failed to create caps\n");
>>>>>>>>> >     return 0;
>>>>>>>>> >   }
>>>>>>>>> >   err = gst_element_link_filtered(filesrc, time, caps);
>>>>>>>>>
>>>>>>>>> what is the time element? It's possible the caps are not propagated to
>>>>>>>>> the encoder if it is not directly connected to the filesrc.
>>>>>>>>>
>>>>>>>>> Moreover, I didn't catch where you're setting the blocksize
>>>>>>>>> property in your filesrc to "size".
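>>>>>>>>>
>>>>>>>>> Roughly (untested, and with "filesrc" and "encoder" standing for
>>>>>>>>> whatever your variables are actually called) I'd expect something
>>>>>>>>> along these lines:
>>>>>>>>>
>>>>>>>>>   GstCaps *caps;
>>>>>>>>>   gulong size = 352 * 288 * 3 / 2;  /* one I420 frame; note that a
>>>>>>>>>                                      * bare (3/2) truncates to 1 in C */
>>>>>>>>>
>>>>>>>>>   /* make filesrc push exactly one frame per buffer */
>>>>>>>>>   g_object_set (G_OBJECT (filesrc), "blocksize", size, NULL);
>>>>>>>>>
>>>>>>>>>   caps = gst_caps_new_simple ("video/x-raw-yuv",
>>>>>>>>>       "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'),
>>>>>>>>>       "width", G_TYPE_INT, 352,
>>>>>>>>>       "height", G_TYPE_INT, 288,
>>>>>>>>>       "framerate", GST_TYPE_FRACTION, 25, 1, NULL);
>>>>>>>>>
>>>>>>>>>   /* link filesrc straight to the encoder so the caps actually reach it */
>>>>>>>>>   if (!gst_element_link_filtered (filesrc, encoder, caps))
>>>>>>>>>     g_printerr ("failed to link filesrc to the encoder\n");
>>>>>>>>>   gst_caps_unref (caps);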
>>>>>>>>>
>>>>>>>>> ..Snip..
>>>>>>>>>
>>>>>>>>> > Is there something wrong or missing in this function? How can I
>>>>>>>>> > make what I want work?
>>>>>>>>>
>>>>>>>>> See my comments above ;)
>>>>>>>>>
>>>>>>>>> P.S. Maybe you could attach (or better, copy to pastebin) the output
>>>>>>>>> when setting GST_DEBUG to 2 from the shell prior to executing your binary.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> >
>>>>>>>>> > Please, I need help.
>>>>>>>>> >
>>>>>>>>> > thanks for the previous answers and thanks in advance
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>