1 introduction
this is mainly to allow database searches via an http interface, os-independent. this is going to be an add-on which many companies will offer their clients in the future: simply to check how far along a project is, no matter whether it's a big machine in production or just a parcel sent from A to B.
beware! i only put down the bare bones of scripting here. i did not even install perl.
must be i like bare bones.
at first, nothing worked.
as i have installed neither gs nor ghostview (nor xwin; that will come after i have worked through its dedicated client/server concept, which lets it run on any terminal anywhere in the world), i helped myself with a line in ~/.mc.ext:

ps    grep "(" %f | less

because the lines starting with "(" contain readable text.
soon i figured out that there had to be 4 conf files in /var/httpd/conf: httpd.conf, access.conf, srm.conf, and mime.types.
with the comments provided, httpd.conf was done in 45 secs or so.
srm.conf was even faster: i changed nothing. (it was hard for me to understand that the root of my http server lives in httpdocs while the cgi-bin root lives in cgi-bin, but those are just aliases kept in srm.conf. perhaps i should have read it before simply copying the template.)
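the document root and the cgi alias in question look roughly like this (a sketch; the paths are guessed from the directory names mentioned above, adjust them to your layout):

```
# excerpt from srm.conf (paths guessed, not verified)
DocumentRoot /var/httpd/httpdocs
ScriptAlias /cgi-bin/ /var/httpd/cgi-bin/
```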
access.conf i didn't understand. httpd definitely disliked the syntax. there are some html-style tags (like <Directory>) in this file; maybe that's why.
my new access.conf has a length of 0 bytes. THAT WORKS, though i guess i then have no directories restricted to anyone. but security concerns me AFTER everything else works, so what.
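for completeness, a typical section in an ncsa-style access.conf looks roughly like this; those <Directory> sections are the html-style tags meant above (a sketch from memory, not the broken file i had):

```
# restrict a directory tree (hypothetical example)
<Directory /var/httpd/httpdocs>
Options Indexes FollowSymLinks
AllowOverride None
<Limit GET>
order allow,deny
allow from all
</Limit>
</Directory>
```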
it worked, and displayed index.htm.
i lynxed to http://localhost/cgi-bin/date, and it displayed the date.
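such a date cgi can be as small as this shell script (a sketch; the one shipped with httpd may look different):

```shell
#!/bin/sh
# a cgi script must first print a content-type header and a blank line,
# then whatever it wants as the body
echo "Content-type: text/plain"
echo
date
```

drop it into the cgi-bin directory, make it executable (chmod +x), and lynx to http://localhost/cgi-bin/date.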
without the http:// or the localhost/, it wouldn't. bang. i had to set up index.htm with full links.
so far, so good.
it may be simplest to pass the query parameters via a user-friendly form. here is an example:
you should set the action (in line 7) to point at metacrawler.com; then this should work.
submitting this form with method=get passes the parameters (such as general, method, ...) as an environment variable to the cgi script named crawler.
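the form itself is not reproduced here; a hypothetical reconstruction, using only the field names general and method and the metacrawler target mentioned in the text, could look like this:

```html
<!-- hypothetical sketch; replace the action with metacrawler's real search cgi url -->
<form action="http://www.metacrawler.com/" method="get">
  search terms: <input type="text" name="general">
  <input type="hidden" name="method" value="0"> <!-- value guessed -->
  <input type="submit" value="search">
</form>
```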
i had to put a line "export" into crawler to get all environment variables displayed, and found out that the variable holding all parameters glued one after another is called "QUERY_STRING". the parameters are separated by & signs, and spaces are transmitted as %20 (read: hex 20, that is dec 32).
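pulling the single parameters back out of QUERY_STRING can be done with plain shell tools. a sketch (the crawler script itself is not shown here, so this variable handling is my own, and the fallback query string is made up):

```shell
#!/bin/sh
# split QUERY_STRING at the & signs and undo the space encoding
QS="${QUERY_STRING:-general=linux%20howto&method=0}"  # fallback so it runs standalone
IFS='&'
for pair in $QS; do
    name=${pair%%=*}
    value=${pair#*=}
    # forms encode spaces as %20 (or sometimes +)
    value=$(printf '%s' "$value" | sed -e 's/+/ /g' -e 's/%20/ /g')
    echo "$name = $value"
done
```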
have fun!
i found it to have a response time of 5 secs, so i decided to let it be. there's nothing i must serve to the world, anyway, that doesn't fit into my table of contents.