Who does not know the problems of maintaining their own home page? You change a few pages, barely remembering which ones they were. Then you face an awful FTP connection to a server that crashes often and is so slow that uploads take forever. Sitecopy is the solution to these problems: it keeps the files on the server identical to the files on your local disk.
It is recommended to make the sites you maintain on your local disk accessible through a local web server. This setup lets you find broken links and run CGI scripts (see also the article about Apache). Just be careful to use relative links instead of absolute links; then the sites work on both servers.
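For illustration, a relative link survives the move between the local server and the provider, while an absolute one does not (the file and site names here are made up):

```
<!-- relative: works both locally and on the provider's server -->
<a href="pics/index.html">Pictures</a>

<!-- absolute: only works on the provider's server -->
<a href="http://www.geocities.com/mysite/pics/index.html">Pictures</a>
```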
If you own two home pages, for example one at GeoCities and one at Tripod, then you should create two directories in your web server's document root:

>> mkdir /usr/local/httpd/htdocs/geo
>> mkdir /usr/local/httpd/htdocs/tripod

(/usr/local/httpd/htdocs is the directory where the web server keeps its sites in the SuSE distribution; this may be different for other distributions.) Then copy the sites into them.
Now you have to configure sitecopy so that it knows which directories are to be mirrored and to which FTP server the files should be copied. This is handled in the file .sitecopyrc, which has to be placed in your home directory.
First you create the file with:

>> touch ~/.sitecopyrc

and then edit it with a simple text editor.
Every site gets a specific name, which you use later to refer to it.
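A minimal .sitecopyrc for the two example sites might look like this; the server names, user names, passwords and remote paths below are placeholders that you must replace with the values your providers gave you:

```
site geo
  server ftp.geocities.com
  username myuser
  password secret
  local /usr/local/httpd/htdocs/geo
  remote /

site tripod
  server ftp.tripod.com
  username myuser
  password secret
  local /usr/local/httpd/htdocs/tripod
  remote /
```

Because the file contains passwords, sitecopy complains if it is readable by other users; a chmod 600 ~/.sitecopyrc takes care of that.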
If the files on the Internet are already identical to the local files, you tell sitecopy this by typing:
>> sitecopy -c -a
If not, it is recommended that you delete all files on the FTP server and upload them completely with:
>> sitecopy -u -a
Now the local and Internet sites are synchronized. Whenever you change, delete or create a file or directory, sitecopy recognizes it and adjusts the files and directories on the FTP server to mirror the local structure.
It makes sense to add the sitecopy call to the script /etc/ppp/ip-up so you do not have to start it by hand every time you connect to the Internet. But pay attention: sitecopy has to be executed by the user who owns the .sitecopyrc file in their home directory. Because the ip-up script is executed by root, you either have to copy the configuration file to /root/ or start sitecopy as a different user:
>> su -c "sitecopy -u -a" <USERNAME>
From time to time you should start sitecopy by hand so you can see whether it interrupted the upload because of errors. This happens, for example, if it tries to create a subdirectory that already exists; such synchronization problems can occur if the program was not stopped correctly. CGI scripts also need additional attention, because they are copied to the server without execute permission.
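Depending on your sitecopy version, the rc file may accept a permissions option that helps here (check the man page of your version): with permissions exec in a site's section, the execute bits of local executable files, such as CGI scripts, are reproduced on the server.

```
site geo
  server ftp.geocities.com
  ...
  permissions exec
```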
A GNOME front end exists, but it refused to compile on a SuSE distribution. That is no great loss: sitecopy is one of those small tools that should simply work in the background, and a front end is quite unnecessary.