[Clipart] upload_svg.cgi and shell.php

Jonadab the Unsightly One jonadab at bright.net
Sat Sep 25 07:21:48 PDT 2004


Jonathan Phillips <jon at rejon.org> writes:

> I had to fix permissions on the screenshots directory, and now
> everything works for people to upload their own thumbnails.
>
> When someone submits their own thumbnail, the dimensions are
> checked, aren't they?

Currently no, but they could be.  I was figuring that since the HTML
specifies width and height attributes, the worst that would happen if
someone submits one at the wrong size is that we'd waste some
bandwidth.  Also, autogenerating the thumbnail is the default, so most
people will probably just use that.

However, it wouldn't be hard for the script to also autoresize
submitted thumbnails; one more line of code, I think (there's a
sketch below).  Maybe I'll put that in when I update it, which I need
to do because...

> They need to be 195 by 146.

According to this:
http://openclipart.org/cgi-bin/wiki.pl?OCAPWebsiteUpdating
They need to be 267x200.  I think that is the size of all the existing
ones (or it was, anyway).

Oh, I see, someone has resized them.  I'll change the
screenshot-upload script to reflect the new dimensions, and update
the Wiki.
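
While I'm at it, the autoresize would probably look something like
this.  Just a sketch, assuming PerlMagick (Image::Magick) is
installed on the server; $upload and $thumb are placeholder names,
not what the real script calls things:

    use Image::Magick;

    # Sketch only: force a submitted thumbnail to the expected size.
    # $upload is wherever the CGI saved the submitted file; $thumb is
    # the final destination.  Both names are invented for the example.
    my ($want_w, $want_h) = (195, 146);
    my $img = Image::Magick->new;
    my $err = $img->Read($upload);
    die "Cannot read upload: $err" if $err;
    my ($w, $h) = $img->Get('width', 'height');
    if ($w != $want_w or $h != $want_h) {
        $img->Resize(width => $want_w, height => $want_h);
    }
    $err = $img->Write($thumb);
    die "Cannot write thumbnail: $err" if $err;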
  
> Another thing, can these files then be dynamically loaded onto the
> screenshots page? That would be brilliant...

We could write a screenshots page that automatically gets them from the
same place the front page gets them from.  That wouldn't be hard.  But
the current screenshots page just lists the existing ones.
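
If we do the dynamic version, the core of it could be as simple as
the sketch below.  The paths and the naming scheme here are guesses
for the sake of the example, not how the site is actually laid out:

    #!/usr/bin/perl -w
    use strict;

    # Sketch of a dynamic screenshots page: list whatever thumbnails
    # are in the directory, linking each one to its full-size shot.
    # These paths are invented for the example.
    my $dir = '/var/www/clipart/screenshots/thumbs';
    my $url = '/screenshots';

    print "Content-type: text/html\n\n";
    print "<html><head><title>Screenshots</title></head><body>\n";
    opendir my $dh, $dir or die "Cannot open $dir: $!";
    for my $thumb (sort grep { /\.(png|jpe?g)$/i } readdir $dh) {
        print qq{<a href="$url/$thumb"><img src="$url/thumbs/$thumb"}
            . qq{ width="195" height="146" alt="$thumb"></a>\n};
    }
    closedir $dh;
    print "</body></html>\n";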

> I've added a couple 

You've tested upload_screenshot and it's working fine, then?  Good.

> Also, can the cached copy of shell.php at least be checked to see
> whether the real file has a newer modification time than the cached
> copy?  It is annoying to update shell.php and have the cache still
> be stale...  Checking the file mtime would be better than just
> loading it in each time, right?

shell.php is dynamic.  If it were a static local file, we could just
slurp it off the filesystem quickly, no problem.  The trouble is that
it is itself a dynamic web page, so we have to go through http to get
it.  Because it's dynamic, the web server will always say it has
changed, so checking the modification time doesn't tell us anything
we don't already know.
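
You can see this from the client side.  A conditional GET against a
static file comes back 304 Not Modified when the file hasn't changed;
against shell.php you get 200 and the full body every time.  (The URL
below is a guess at where shell.php lives.)

    use LWP::UserAgent;
    use HTTP::Request;
    use HTTP::Date qw(time2str);

    # Ask the server "only send this if it changed in the last hour".
    # A static file would answer 304 Not Modified; a dynamic page has
    # no fixed mtime, so it answers 200 with the whole body anyway.
    my $ua  = LWP::UserAgent->new;
    my $req = HTTP::Request->new(GET => 'http://openclipart.org/shell.php');
    $req->header('If-Modified-Since' => time2str(time() - 3600));
    my $res = $ua->request($req);
    print $res->status_line, "\n";    # 200 OK, never 304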

The performance problem comes from the overhead of http.  To get an
updated shell, the CGI script basically has to pretend to be a web
browser: contact the web server and wait for the complete response to
_finish_ before it can _start_ sending content back to the user.
This roughly doubles the total page load time and, worse, for the
first half or so of that time the user receives _no_ content, so the
page feels completely unresponsive, because it can't even _start_ to
render.  (Most pages start rendering within a few seconds, before the
last parts of the page have arrived, so the user sees some progress.)
(There's also the DNS lookup, but one hopes the server's resolver
library is caching that.)
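
In the meantime, a time-based cache would sidestep the mtime question
entirely: keep a local copy of shell.php's output and only go over
http when the copy is older than some threshold.  A rough sketch; the
file name and the five-minute window are invented:

    use LWP::Simple qw(get);

    my $cache = '/tmp/shell-cache.html';
    my $url   = 'http://openclipart.org/shell.php';

    my $shell;
    if (-e $cache and -M $cache < 5 / (24 * 60)) {
        # Cached copy is under five minutes old; skip the http trip.
        open my $fh, '<', $cache or die "$cache: $!";
        local $/;                  # slurp the whole file
        $shell = <$fh>;
        close $fh;
    } else {
        $shell = get($url);        # the slow http round trip
        if (defined $shell) {
            open my $fh, '>', $cache or die "$cache: $!";
            print $fh $shell;
            close $fh;
        }
    }

The cost is that an update to shell.php takes up to five minutes to
show through; the win is that nearly every hit can start rendering
immediately.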

In the long term, Parrot will solve this problem.  When Perl6 and PHP
both run on the Parrot VM, Perl code will be able to directly call
functions written in PHP (which will be able to directly call
functions written in Perl, or Python, or ...) without any extra
overhead.  However, useful, stable implementations of Perl6 and PHP
for Parrot are probably still several years from realization.

The problem would also be a lot less noticeable if the freedesktop.org
server were less overloaded.  There's little if any difference in page
load times between my shared dialup connection at home and the
mostly-idle T1 at work, so I'm pretty sure the freedesktop server, not
the user's connection, is the bottleneck in most of our page loads.
Loading shell.php over http doubles the effect of that bottleneck on
the page load time, as well as placing that much more stress on the
server.

-- 
$;=sub{$/};@;=map{my($a,$b)=($_,$;);$;=sub{$a.$b->()}}
split//,"ten.thgirb\@badanoj$/ --";$\=$ ;-> ();print$/



