Original post is here: eklausmeier.goip.de
Hiawatha is a secure and reliable web-server. It is used for this blog. AWStats is a collection of Perl-scripts to analyze log-files from web-servers. By default, AWStats can read Apache log-files. It cannot directly read log-files from Hiawatha.
The Hiawatha log-file format contains the following fields, separated by the | character (a hypothetical sample line is shown after the list):
- host, this is the IP address
- date + time
- HTTP status code, e.g., 200
- size in bytes
- URL including the method (GET/POST/etc.)
- referrer
- user agent, e.g., Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:94.0) Gecko/20100101 Firefox/94.0
- and other fields
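A hypothetical sample line, with a made-up IP address, timestamp and size, might look like this (trailing fields omitted):
192.0.2.1|Sat 02 Apr 2022 12:34:56|200|5124|GET /index.html HTTP/1.1|https://example.org/|Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:94.0) Gecko/20100101 Firefox/94.0|...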
I already use GoAccess to analyze my log-files. See Using GoAccess with Hiawatha Web-Server. GoAccess is pretty fast and the output looks good, but it cannot really filter the data: in particular, it shows huge amounts of data generated by bots. So I hoped that AWStats could fill this gap.
1. Modifying Perl program. I first tried to configure AWStats in /etc/awstats/awstats.eklausmeier.goip.de.conf. Unfortunately, even after preformatting the Hiawatha log-files this did not work. So I had to change the source code in awstats.pl.
I created a new LogFormat=5:
elsif ( $LogFormat eq '5' ) { # Hiawatha web-server log-format
    $PerlParsingFormat = "special Hiawatha web-server log-format";
    $pos_host = 0;
    $pos_date = 1;
    $pos_code = 2;
    $pos_size = 3;
    $pos_method = 4; # together with url
    $pos_url = 5; # together with method
    $pos_referer = 6;
    $pos_agent = 7;
    @fieldlib = (
        'host', 'date', 'code', 'size',
        'method', 'url', 'referer', 'ua'
    );
}
There are two places in awstats.pl which actually read from the log-file. I changed both places into a call to a small subroutine, which can then handle Hiawatha log-files natively and without hassle.
# split log line into fields
sub splitLog (@) {
    my ($PerlParsingFormat,$line) = @_;
    if ($PerlParsingFormat eq '(?^:^special Hiawatha web-server log-format)') {
        my @F = split('\|',$line);
        my @R;
        ($R[0],$R[2],$R[3],$R[6],$R[7]) = ($F[0],$F[2],$F[3],$F[5],$F[6]);
        my ($day,$month,$year,$hms) = ($F[1] =~ /\w\w\w\s+(\d+)\s+(\w+)\s+(\d+)\s+(\d+:\d+:\d+)/);
        $R[1] = sprintf("%02d/%s/%04d:%s",$day,$month,$year,$hms); # DD/Month/YYYY:HH:MM:SS (Apache)
        ($R[4],$R[5]) = ($F[4] =~ /^(\w+)\s+([^\s]+)\s+[^\s]+$/); # GET /index.html HTTP/x.x
        return @R;
    }
    return map( /$PerlParsingFormat/, $line );
}
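To illustrate what splitLog does, here is a minimal standalone sketch, not part of awstats.pl, which pushes the hypothetical log line from above through the same regular expressions and prints the Apache-style result:
use strict;
use warnings;

# hypothetical Hiawatha log line (made-up IP address, timestamp and size)
my $line = '192.0.2.1|Sat 02 Apr 2022 12:34:56|200|5124|GET /index.html HTTP/1.1|https://example.org/|Mozilla/5.0';
my @F = split('\|',$line);
# Hiawatha date "Sat 02 Apr 2022 12:34:56" becomes Apache-style "02/Apr/2022:12:34:56"
my ($day,$month,$year,$hms) = ($F[1] =~ /\w\w\w\s+(\d+)\s+(\w+)\s+(\d+)\s+(\d+:\d+:\d+)/);
my $date = sprintf("%02d/%s/%04d:%s",$day,$month,$year,$hms);
# the request field is split into method and URL
my ($method,$url) = ($F[4] =~ /^(\w+)\s+([^\s]+)\s+[^\s]+$/);
print "$F[0] $date $method $url $F[2] $F[3]\n";   # prints: 192.0.2.1 02/Apr/2022:12:34:56 GET /index.html 200 5124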
This subroutine is then called as
@field = splitLog($PerlParsingFormat,$line);
instead of
@field = map( /$PerlParsingFormat/, $line );
and, at the second place, as
if ( !( @field = splitLog($PerlParsingFormat,$line) ) ) { #map( /$PerlParsingFormat/, $line )
In total this occurs two times in awstats.pl.
2. Configuring AWStats. To actually run awstats.pl you have to symlink the lib-directory first:
ln -s /usr/share/webapps/awstats/cgi-bin/lib lib
assuming that the AWStats package was installed under /usr/share/webapps/awstats.
In /etc/awstats/awstats.eklausmeier.goip.de.conf I set:
LogFile="/tmp/access.log"
LogType=W
LogFormat=5
LogSeparator="\|"
SiteDomain="eklausmeier.goip.de"
DNSLookup=2
DirIcons="/awstatsicon"
You have to set a symbolic link in your web-root:
ln -s /usr/share/webapps/awstats/icon awstatsicon
3. Running AWStats. AWStats is then started like this:
./awstats.pl -config=eklausmeier.goip.de -output -staticlinks > /srv/http/awstats.html
Generating all the reports:
/usr/share/awstats/tools/awstats_buildstaticpages.pl -config=eklausmeier.goip.de -dir=/srv/http
Hiawatha splits the log-file and gzips the pieces. To concatenate them all, use something like:
L=/tmp/access.log; rm -f $L; for i in `seq 52 -1 2`; do zcat access.log.$i.gz >> $L; done; cat access.log.1 access.log >> $L
4. Example output. The AWStats overview of spiders and bots looks similar to this:
The detailed overview of the most requested URLs looks similar to this:
The list of used operating systems looks like this:
Added 02-Apr-2022: One heuristic AWStats uses to detect bots is to examine accesses to robots.txt: a machine that requests robots.txt is assumed to be a bot. It is only one of many ways AWStats detects bots. Well, it turns out many bots do not bother to look at robots.txt, so this heuristic is not very reliable. For the record, Google and Yandex do honor robots.txt.
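As a quick check on your own data, the following one-liner is a sketch of how to count which user agents actually request robots.txt; it assumes the pipe-separated field layout described above and the concatenated log-file in /tmp/access.log:
perl -F'\|' -lane '$ua{$F[6]}++ if $F[4] =~ m{robots\.txt}; END { printf("%6d %s\n", $ua{$_}, $_) for sort { $ua{$b} <=> $ua{$a} } keys %ua }' /tmp/access.log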