[leafnode-list] Re: Downloading old articles
Matthias Andree
matthias.andree at gmx.de
Fri Nov 13 10:00:21 CET 2009
On 12.11.2009 at 19:12, Marcin Dziwnowski
<m.dziwnowski at gmail.com> wrote:
> On Fri, Nov 6, 2009 at 10:55 AM, Robert Grimm <lists at news.robgri.de>
> wrote:
>
>> Apparently it is "noxover = 1" in leafnode 1.
>
> ... and it doesn't help. Leafnode still doesn't download more articles.
> Well, the fault is more on the server side than leafnode's, but is
> there another way around it?
I think the workaround you need is:
1. set maxfetch=100000 in /etc/leafnode/config
2. run fetchnews -x100000
3. repeat steps 1 and 2 with 200000, then 300000, and so on, until all
the articles you want have been fetched.
You can automate this with a Bourne-like shell (bash, ksh and pdksh
qualify) and perl, like this:
#!/bin/sh
# WARNING - UNTESTED CODE BELOW
# this workaround script fetches 700000 articles in 100000 increments;
# note that max must be a multiple of inc!
inc=100000
max=700000
#
i=$inc
while [ "$i" -le "$max" ] ; do
    # bump maxfetch in the config, then fetch up to $i articles
    perl -ple "s/^maxfetch *=.*/maxfetch=$i/" -i /etc/leafnode/config
    fetchnews -nvx "$i"
    i=$(( i + inc ))
done
# end of script
# end of script
If you don't have perl installed, replace the perl line with:
sed -e "s/^maxfetch *=.*/maxfetch=$i/" -i /etc/leafnode/config
assuming your sed(1) implementation supports in-place edits (that is what
-i is for; it is a GNU sed extension, and BSD sed spells it -i '').
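If your sed supports neither form of -i, the portable fallback is to write
to a temp file and rename it over the original. A sketch, demonstrated here
on a scratch file rather than the real /etc/leafnode/config:

```shell
#!/bin/sh
# portable stand-in for "sed -i": write to a temp file, then rename it
# over the original. Demonstrated on a scratch file; in bulkfetch.sh
# you would set cfg=/etc/leafnode/config instead.
cfg=$(mktemp)
printf 'maxfetch=100000\nexpire = 20\n' > "$cfg"   # sample config
i=200000
sed -e "s/^maxfetch *=.*/maxfetch=$i/" "$cfg" > "$cfg.tmp" &&
    mv "$cfg.tmp" "$cfg"
grep '^maxfetch' "$cfg"     # should now show maxfetch=200000
rm -f "$cfg"
```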
Save the script as bulkfetch.sh, then run "sh bulkfetch.sh" as root or
as the news user.
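If you want the script to refuse to run under the wrong account, a small
guard at the top of bulkfetch.sh will do it. A sketch, assuming your
leafnode user is called news (adjust if your system uses another name):

```shell
#!/bin/sh
# guard sketch for the top of bulkfetch.sh:
# warn unless invoked as root or the news user (id -un is POSIX).
me=$(id -un)
case $me in
    root|news) echo "ok: running as $me" ;;
    *) echo "please run bulkfetch.sh as root or news" >&2 ;;  # real script: exit 1 here
esac
```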
HTH
--
Matthias Andree