[poppler] optimized pdf rendering
harry at midnight-labs.org
Wed Jan 26 12:22:37 PST 2011
I've had a very good experience with Poppler overall doing something
relatively similar to your idea.
Caching PDFDoc objects is an OK start: in some cases I've had large
documents (1 GB+) take minutes to load, and overall they don't take much
memory to keep around (e.g. you could keep a fairly large number in an LRU
cache).
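A minimal sketch of such an LRU cache, kept generic rather than tied to
Poppler's PDFDoc API (in a real server the value type would be the parsed
document object, keyed by file path):

```cpp
#include <list>
#include <memory>
#include <string>
#include <unordered_map>
#include <utility>

// Generic LRU cache keyed by file path. The value type is a template
// parameter so the sketch stays self-contained; plug in your parsed
// document type.
template <typename V>
class LruCache {
public:
    explicit LruCache(std::size_t capacity) : capacity_(capacity) {}

    // Returns the cached value, or nullptr if absent; a hit marks the
    // entry as most recently used.
    std::shared_ptr<V> get(const std::string& key) {
        auto it = index_.find(key);
        if (it == index_.end()) return nullptr;
        order_.splice(order_.begin(), order_, it->second);
        return it->second->second;
    }

    void put(const std::string& key, std::shared_ptr<V> value) {
        auto it = index_.find(key);
        if (it != index_.end()) {
            it->second->second = std::move(value);
            order_.splice(order_.begin(), order_, it->second);
            return;
        }
        if (order_.size() == capacity_) {
            index_.erase(order_.back().first);  // evict least recently used
            order_.pop_back();
        }
        order_.emplace_front(key, std::move(value));
        index_[key] = order_.begin();
    }

private:
    std::size_t capacity_;
    std::list<std::pair<std::string, std::shared_ptr<V>>> order_;
    std::unordered_map<std::string,
        typename std::list<std::pair<std::string,
                                     std::shared_ptr<V>>>::iterator> index_;
};
```

Lookups and evictions are O(1), so the cache itself never becomes the
bottleneck even with many documents in flight.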
I would really recommend performing a large-size rendering once for each
page requested, caching it on disk (as a PNG/JPEG), and resizing it on the
fly to match the dimensions requested. The initial rasterization is the
most CPU-intensive step, while resizing a PNG/JPEG is predictable, easy to
cache, and roughly constant-time.
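A sketch of that flow, with stand-ins for the expensive and cheap steps
(`rasterize_large` and `resize` are hypothetical placeholders here; a real
server would call Poppler's renderer and an image-scaling library, and
would persist the cache to disk rather than a map):

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

// Counter so we can observe how often the expensive step actually runs.
static int rasterizations = 0;

// Hypothetical: rasterize a page once at a large, fixed size (expensive).
std::vector<unsigned char> rasterize_large(const std::string& pdf, int page) {
    ++rasterizations;
    return std::vector<unsigned char>(64, 0);  // placeholder pixels
}

// Hypothetical: scale a cached raster to the requested size (cheap).
std::vector<unsigned char> resize(const std::vector<unsigned char>& src,
                                  int w, int h) {
    return std::vector<unsigned char>(static_cast<std::size_t>(w) * h, 0);
}

// In-memory stand-in for the on-disk PNG/JPEG cache, keyed by (pdf, page).
std::map<std::pair<std::string, int>, std::vector<unsigned char>> raster_cache;

// Serve every request from the cached large raster; rasterize at most
// once per (document, page).
std::vector<unsigned char> render_page(const std::string& pdf, int page,
                                       int w, int h) {
    auto key = std::make_pair(pdf, page);
    auto it = raster_cache.find(key);
    if (it == raster_cache.end())
        it = raster_cache.emplace(key, rasterize_large(pdf, page)).first;
    return resize(it->second, w, h);
}
```

With this shape, a second request for the same page at a different size
only pays the resize cost, which is exactly the scenario Erik asked about.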
On 26 January 2011 19:24, Erik Rehn <erik at slagkryssaren.com> wrote:
> Hey Popplers!
> This is my first message to this list, hope there is some activity. :)
> Anyway, I'm doing some research into the field of PDF rendering and came
> across poppler/xpdf, which seems to be quite easy to work with. Now I have
> some questions for you guys :)
> How fast is Poppler compared to other PDF libraries, commercial and open
> source? What I'm looking for is the fastest way possible to render PDFs into
> bitmaps. I've done some testing with the pdftoppm tool and it seems to be
> slightly slower than, for instance, Foxit's SDK.
> Is there any optimization that could be done to speed up the rendering? My
> goal is to develop a server-side PDF rendering engine which clients can
> request individual pages from. If, for instance, the same page is requested
> twice but in different sizes, is there any part of the rendering process that
> could be cached to save work?
> One thing would be to cache PDFDoc objects to avoid parsing the same PDF
> over and over; does anyone have any thoughts on that?
> Thanks for your help!
> Best regards,
> Erik Rehn
> poppler mailing list
> poppler at lists.freedesktop.org