comm/news/nntpget.lha

Short:        Update to NNTP client for AS225
Author:       Michael van Elst (mlelstv at serpens.rhein.de)
Architecture: m68k-amigaos
Date:         1995-05-24
Download:     http://aminet.net/comm/news/nntpget.lha
Readme:       http://aminet.net/comm/news/nntpget.readme
Downloads:    5177

nntpget is an NNTP (Network News Transfer Protocol) client that gathers news
articles from a remote NNTP server. It was written for AS225R2 but seems
to work fine with AmiTCP and the fake socket.library done by Henning
Schmiedehausen (also on Aminet).

What do you do when you do not have a working NNTP daemon, or do not want
to run one (and let it write to disk) all the time?

My solution is nntpget, a program that talks to a remote NNTP server
using the 'newnews' and 'article' commands to query for new articles
since a specific date and then retrieve them. nntpget generates a valid
batchfile suitable for 'rnews' or the like (i.e. including the
'#! rnews xxxx' headers).
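
In NNTP terms the procedure is simple. The following Python sketch
illustrates the idea; it is not the nntpget source, and the server name,
newsgroup and file names are made up:

  import socket

  SERVER = "news.example.com"          # hypothetical NNTP server
  GROUP  = "de.test"                   # hypothetical newsgroup
  SINCE  = "950519 000000 GMT"         # date/time in NEWNEWS syntax

  def read_multiline(f):
      # Read a dot-terminated NNTP response body and undo dot-stuffing.
      lines = []
      while True:
          line = f.readline().rstrip("\r\n")
          if line == ".":
              return lines
          if line.startswith(".."):
              line = line[1:]
          lines.append(line)

  sock = socket.create_connection((SERVER, 119))
  f = sock.makefile("rw", encoding="latin-1", newline="")
  f.readline()                         # server greeting

  # Ask for all MsgIDs posted to GROUP since SINCE.
  f.write("NEWNEWS %s %s\r\n" % (GROUP, SINCE))
  f.flush()
  msgids = read_multiline(f) if f.readline().startswith("230") else []

  # Fetch each article and prepend the '#! rnews <size>' batch header.
  with open("newsbatch", "w", encoding="latin-1") as batch:
      for msgid in msgids:
          f.write("ARTICLE %s\r\n" % msgid)
          f.flush()
          if not f.readline().startswith("220"):
              continue                 # article not available
          article = "\n".join(read_multiline(f)) + "\n"
          batch.write("#! rnews %d\n" % len(article))
          batch.write(article)

  f.write("QUIT\r\n")
  f.flush()
  sock.close()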

Usage:

> nntpget ?
FROM/M,SUBSCRIPTIONS/K,ARTICLES/K,SINCE/K,DATESTAMP/K,TO/A/K,APPEND/S,
NNTPSERVER/K/A,CPS/K/N,NBUFS/K,HISTORY/K,QUIET/S:

  FROM             multiple strings denoting newsgroups or complete hierarchies
                   like:   FROM de.test  alt.*  news.announce
  
  SUBSCRIPTIONS    file with list of newsgroups or hierarchies, one per line
                   Characters after the first white space are ignored, so
                   that you can use your active file.

  ARTICLES         file with list of MsgIDs, one per line

You can use all three sources at the same time. nntpget will first work on
FROM, then on SUBSCRIPTIONS, then on ARTICLES.
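
For example (the server name and file names are made up):

  > nntpget de.test alt.* TO ram:newsbatch NNTPSERVER news.example.com SINCE yesterday
  > nntpget SUBSCRIPTIONS uulib:active TO ram:newsbatch APPEND NNTPSERVER news.example.com

The first call fetches the articles of the last 24 hours from de.test and
everything below alt.; the second appends the groups listed in an active
file to the same batch (without SINCE or DATESTAMP it asks for everything
the server has, see below). An ARTICLES file with one MsgID per line can be
added to either call.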

  SINCE date       fetch only articles since the given date, in AmigaDOS
                   format. Substitutes like 'monday' or 'yesterday' are
                   valid and mean the last such weekday (at the current
                   time), so 'yesterday' retrieves the news of the last
                   24 hours. Naming the current weekday or 'today' fetches
                   all articles since this day at midnight. You can also
                   specify a time, which means today at that time (or
                   yesterday, if that time would still be in the future).
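
A few examples (the exact spellings are those of the AmigaDOS date format):

  SINCE yesterday     the last 24 hours
  SINCE monday        since last Monday, at the current time of day
  SINCE today         since midnight
  SINCE 18:00         since 18:00 today, or 18:00 yesterday if it is
                      not yet 18:00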

  DATESTAMP file   fetch only articles newer than that file. When all
                   goes well, the modification time of that file is
                   updated to the _starting time_ of nntpget. This
                   ensures that subsequent runs will not lose articles.

                   The filesystem where the datestamp file is located
                   must support the SetFileDate function (which is
                   true for the standard ROM filesystem).

                   If you use neither the SINCE nor the DATESTAMP
                   parameter, the news server is asked for articles
                   since 1.1.1900 (i.e. since forever).
                   Some servers may interpret this as 1.1.2000 instead
                   and won't send anything for a while.

                   Articles requested by the ARTICLES file are not
                   filtered out. You will get all of them regardless
                   of age!
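
The logic behind DATESTAMP is roughly the following Python sketch (not the
actual implementation; os.utime stands in for SetFileDate and the file name
is made up):

  import os, time

  STAMP = "datestamp"                    # hypothetical datestamp file

  def fetch_articles_newer_than(since):
      # stand-in for the actual NEWNEWS/ARTICLE transfer
      return True

  start = time.time()                    # remember the *starting* time
  since = os.path.getmtime(STAMP)        # fetch articles newer than this

  if fetch_articles_newer_than(since):
      # Advance the stamp only after a successful run, and only to the
      # starting time, so articles that arrived while the transfer was
      # running are not lost on the next run.
      os.utime(STAMP, (start, start))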

  TO file          generated batch file

  APPEND           append to batch file

  NNTPSERVER       the name or IP number of your NNTP server machine

  CPS num          bandwidth limitation. nntpget tries to keep the
                   traffic below the given number of characters per
                   second. This can only be an average, since there are
                   bursts of at least one TCP segment which are under
                   the control of the TCP protocol stack.
                   The bandwidth is limited only for article transfers,
                   not for the initial transfer of article MsgIDs, which
                   is short anyway.
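
The throttling boils down to something like this (illustrative Python, not
the actual code):

  import time

  def throttled_write(out, data, cps, state):
      # Write one chunk, then sleep long enough to keep the *average*
      # rate at or below cps characters per second.  The chunk itself
      # still leaves as fast as the TCP stack sends it.
      out.write(data)
      state["sent"] += len(data)
      should_take = state["sent"] / float(cps)
      elapsed = time.time() - state["start"]
      if should_take > elapsed:
          time.sleep(should_take - elapsed)

  state = {"start": time.time(), "sent": 0}
  # e.g. throttled_write(batch, chunk, 2000, state) for a 2000 cps limit

(A clock that jumps backwards makes 'elapsed' negative and the computed
sleep huge; that is the CPS problem fixed in version 0.95, see below.)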

  NBUFS num        use an n-buffered NNTP protocol while fetching
                   articles: nntpget will request up to num articles in
                   advance. This should improve throughput if you
                   receive lots of small articles. A good value for num
                   is 5. A value of 1 (the default) means to fetch
                   articles in lock-step with the server's answers.

                   There have been reports about problems with
                   n-buffering and the INN (InterNetNews) daemon. The
                   reason is that n-buffering can produce large bursts
                   of output. If the network is slow, INN erroneously
                   assumes that the connection has been lost and
                   terminates the session.
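
As a sketch, n-buffering keeps up to num ARTICLE requests in flight instead
of waiting for each answer. This reuses the connection f and the
read_multiline helper from the sketch near the top of this readme:

  def fetch_pipelined(f, batch, msgids, nbufs=5):
      pending = 0
      queue = list(msgids)
      while queue or pending:
          # Send ahead until nbufs requests are outstanding.
          while queue and pending < nbufs:
              f.write("ARTICLE %s\r\n" % queue.pop(0))
              pending += 1
          f.flush()
          # Read exactly one response and count it even if it reports a
          # failure (compare the 0.119 fix below).
          status = f.readline()
          pending -= 1
          if status.startswith("220"):
              article = "\n".join(read_multiline(f)) + "\n"
              batch.write("#! rnews %d\n" % len(article))
              batch.write(article)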

  HISTORY base     If you want to update your news database from several
                   servers, or don't know which articles need to be
                   fetched, you can use this option to tell nntpget the
                   basename of a dbz(1) compatible history database (i.e.
                   without the .pag or .dir suffixes). nntpget will then
                   match MsgIDs against the history and only transfer
                   articles that are not in your database.
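
The effect is that of the following filter. dbz is a dbm-style database, so
Python's dbm module is used as a stand-in here; the real dbz files and their
key format differ in detail:

  import dbm

  def not_in_history(msgids, basename="history"):
      # Return only the MsgIDs that are not yet known locally.
      with dbm.open(basename, "r") as history:
          return [m for m in msgids if m.encode("latin-1") not in history]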

  QUIET            suppress the verbose output messages. On high
                   bandwidth connections the verbose output would
                   degrade performance significantly, so use this
                   switch there. You will still see the average data
                   transfer rate and the number of articles transferred.

nntpget uses two temporary files in T:, one to gather all MsgIDs and one for
the current article (it cannot write the article directly to the batchfile
because it does not yet know the size, which has to go into the #! rnews
header). So please have enough space on that filesystem (usually T: is on
RAM:).

While running, nntpget shows you (part of) the NNTP traffic on stdout, as
well as the average transfer rate.

>> NEWS (0.95) <<
This version fixes some problems of the first version.

When you used the CPS option and something modified the clock (say, an
adjustment for DST), nntpget measured a negative speed and would wait a
very long time.

Again with the CPS option, it was impossible to stop nntpget while it was
throttling the transfer, which made the previous problem even nastier.

When you abort nntpget it removes the temporary files, including the
article list, but while running it used to keep an exclusive lock on the
list file. Now it closes and reopens the list file so that you can examine
(or copy) the list while the transfer is in progress. This helps when you
want to abort and resume a transfer.
Caveat: if you keep the list file open from a second process, nntpget
cannot delete it.

>> NEWS (0.101) <<

Some minor corrections to follow the NNTP specification more closely.

>> NEWS (0.115) <<

Added DATESTAMP, NBUFS and HISTORY options. Fixed two cases where a
parameter was missing from an error message.

The CPS and NBUFS options aren't that useful together: NBUFS is used to
improve transfer rates, which CPS would reduce again. You may still want
the combination to maintain a low average bandwidth usage together with a
high peak bandwidth usage after a temporary network failure.

>> NEWS (0.118) <<

Bugfixes

>> NEWS (0.119) <<

Fixed a bug where nntpget would hang when you used n-buffering and a
requested article couldn't be transferred. nntpget didn't count the
failure responses and waited forever for acknowledgements after all
articles were transferred. You could still abort nntpget with CTRL-C
and the batchfile was intact.

Added the QUIET option and modified the text output so that you can see
the number of articles transferred and rejected.

Michael van Elst


Contents of comm/news/nntpget.lha
 PERMSSN    UID  GID    PACKED    SIZE  RATIO METHOD CRC     STAMP     NAME
---------- ----------- ------- ------- ------ ---------- ------------ -------------
[generic]                13054   21752  60.0% -lh5- 101c May 19  1995 nntpget
[generic]                 3030    7337  41.3% -lh5- 04a3 May 19  1995 nntpget.readme
---------- ----------- ------- ------- ------ ---------- ------------ -------------
 Total         2 files   16084   29089  55.3%            May 24  1995
