[leafnode-list] Re: Downloading old articles

Marcin Dziwnowski m.dziwnowski at gmail.com
Thu Nov 5 03:59:22 CET 2009


On Tue, Nov 3, 2009 at 3:33 PM, Marcin Dziwnowski
<m.dziwnowski at gmail.com> wrote:

> # fetchnews -vvvv -x 235000
>  backing up from 1706789 to 1471789
>  considering articles 1471789 - 1706793
>  0 articles fetched, 0 killed
> No information about why it does not attempt to download the missing
> posts is given

As it turns out, the server I am using probably just can't cope with
listing the requested 235,000 posts all at once. The articles are
there and the XOVER command is recognized, but it's simply too much.

Is there a way to make fetchnews download posts in "batches" of, say,
ten or twenty thousand per run, without first overloading the server
by asking it to list three to six hundred thousand articles?

> Or maybe some smarter tool for the job?

The phrase "smarter tool" was an unfortunate choice of words, Matthias; I'm sorry.
