NAME
rawdog - an RSS Aggregator Without Delusions Of Grandeur
SYNOPSIS
rawdog [options]
DESCRIPTION
rawdog is a feed aggregator for Unix-like systems.
rawdog uses the Python feedparser module to retrieve articles from a number of feeds in
RSS, Atom and other formats, and writes out a single HTML file, based on a template either
provided by the user or generated by rawdog, containing the latest articles it's seen.
rawdog uses the ETags and Last-Modified headers to avoid fetching a file that hasn't
changed, and supports gzip and delta compression to reduce bandwidth when it has. rawdog
is configured from a simple text file; the only state kept between invocations that can't
be reconstructed from the feeds is the ordering of articles.
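A minimal config file needs little more than an output file and a list of feeds, each with the period at which it should be fetched. The following sketch uses placeholder paths and URLs and shows only a few of the available options:
outputfile /home/user/public_html/rawdog.html
maxarticles 200
feed 1h http://example.com/feed.rss
feed 3h http://blog.example.org/atom.xml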
OPTIONS
This program follows the usual GNU command line syntax, with long options starting with
two dashes (`--').
General Options
-d DIR, --dir DIR
Use DIR instead of the $HOME/.rawdog directory. This option lets you have two or
more rawdog setups with different configurations and sets of feeds.
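For example, to fetch and write a second, separate setup kept in a directory of its own (the path is only an illustration):
rawdog -d /home/user/.rawdog-work -uw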
-N, --no-locking
Do not lock the state file.
rawdog usually claims a lock on its state file, to stop more than one instance from
running at the same time. Unfortunately, some filesystems don't support file
locking; you can use this option to disable locking entirely if you're in that
situation.
-v, --verbose
Print more detailed information about what rawdog is doing to stderr while it runs.
-V FILE, --log FILE
As with -v, but write the information to FILE.
-W, --no-lock-wait
Exit silently if the state file is already locked.
If the state file is already locked, rawdog will normally wait until it becomes
available, then run. However, if you've got a lot of feeds and a slow network
connection, you might prefer rawdog to just give up immediately if the previous
instance is still running.
Actions
rawdog will perform these actions in the order given.
-a URL, --add URL
Try to find a feed associated with URL and add it to the config file.
URL may be a feed itself, or it can be an HTML page that links to a feed in any of
a variety of ways. rawdog uses heuristics to pick the best feed it can find, and
will complain if it can't find one.
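For example, either of the following should work, since rawdog can follow the feed link in an HTML page (the URLs are placeholders):
rawdog -a http://example.com/blog/
rawdog -a http://example.com/blog/feed.rss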
-c FILE, --config FILE
Read FILE as an additional config file; any options provided in FILE will override
those set in the main config file (with the exception of "feed", which is
cumulative). FILE may be an absolute path or a path relative to your .rawdog
directory.
Note that $HOME/.rawdog/config will still be read first even if you specify this
option. -c is mostly useful when you want to write the same set of feeds out using
two different sets of output options.
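For example, if extra-output is a second config file in your .rawdog directory that sets a different output file and template (the filename is illustrative):
rawdog -c extra-output -w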
-f URL, --update-feed URL
Update the feed pointed to by URL immediately, even if its period hasn't elapsed
since it was last updated. This is useful when you're publishing a feed yourself,
and want to test whether it's working properly.
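For example, to re-fetch your own feed and regenerate the page immediately (placeholder URL):
rawdog -f http://example.com/my-feed.xml -w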
-l, --list
List brief information about each of the feeds that were known about at the time of
the last update.
-r URL, --remove URL
Remove feed URL from the config file.
-s TEMPLATE, --show TEMPLATE
Print one of the templates currently in use to stdout. TEMPLATE may be page, item,
feedlist or feeditem. This can be used as a starting point if you want to design
your own template for use with the corresponding template option in the config
file.
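For example, to start customising the page template (the output filename is arbitrary):
rawdog -s page > page.template
You can then edit page.template and point the corresponding template option in the config file at it.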
-u, --update
Fetch data from the feeds and store it. This could take some time if you've got
lots of feeds.
-w, --write
Write out the HTML output file.
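For example, the usual way to refresh the feeds and regenerate the page by hand is:
rawdog -uw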
Special Actions
If one of these options is specified, rawdog will perform only that action, then exit.
--dump URL
Show what rawdog's feed parser returns for URL. This can be useful when trying to
understand why rawdog doesn't display a feed correctly.
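For example (placeholder URL):
rawdog --dump http://example.com/feed.rss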
--help Provide a brief summary of all the options rawdog supports.
EXAMPLES
rawdog is typically invoked from cron(1). The following crontab(5) entry would fetch data
from feeds and write it to HTML once an hour, exiting if rawdog is already running:
0 * * * * rawdog -Wuw
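To keep a record of what each run did, the same entry can write verbose output to a log file with --log (the path is just an example):
0 * * * * rawdog -W --log /home/user/rawdog.log -uw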