[leafnode-list] Re: Downloading old articles

Matthias Andree matthias.andree at gmx.de
Thu Nov 5 09:28:17 CET 2009


On 05.11.2009 at 03:59, Marcin Dziwnowski
<m.dziwnowski at gmail.com> wrote:

> On Tue, Nov 3, 2009 at 3:33 PM, Marcin Dziwnowski
> <m.dziwnowski at gmail.com> wrote:
>
>> # fetchnews -vvvv -x 235000
>>  backing up from 1706789 to 1471789
>>  considering articles 1471789 - 1706793
>>  0 articles fetched, 0 killed
>> No information is given about why it does not attempt to download the
>> missing posts.
>
> As it turns out, the server I am using probably just can't cope with
> listing the requested 235,000 posts all at once. The articles are there
> and the XOVER command is recognized; it's just too much.

Would XHDR work? Try   usexhdr = 1   in the .../config file below your  
"server = example.pl" line.

> Is there a way to make fetchnews download posts in "batches", ten,
> maybe twenty thousand every run? Without overloading the server with
> listing three, four, five or six hundred thousand articles first?

I found that in leafnode 1 at least (I didn't check leafnode 2),
maxfetch=12345 overrides larger fetchnews -x 98765 values. I'm not sure
whether I'd consider that a bug or a feature, or whether I want to fix it.
Given leafnode-1's low release frequency and its really long propagation
into distributions, I have some reservations about touching it.
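
To illustrate the interaction described above (numbers taken from the
example; the config file path again depends on the installation): with

    maxfetch = 12345

in the config file, a run of

    # fetchnews -x 98765

behaves as if at most 12345 articles were being requested, so the larger
-x value has no extra effect. Whether repeated capped runs would eventually
work through the whole backlog of older articles is something you would
have to test against your server.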

>> Or maybe some smarter tool for the job?
>
> The "smarter tool" term was very unfortunate, Matthias, I'm sorry.

No problem, I don't usually take comments about my software personally.
"usually" means: in the absense of other ad-hominem comments and if the  
complaint isn't the purpose in itself.

-- 
Matthias Andree


