There are a few options: use a service (Google Analytics, etc.) or parse your logs. This article isn't supposed to help you decide. I just wanted simple stats based on logs: it's non-intrusive to visitors, doesn't send their browsing habits to third parties (other than what they send themselves), and uses the apache log data I've already got for the entire year. I'm mainly interested in seeing how many people actually read these articles, as well as what search terms referred them here.

I've got seven virtualhosts spread across four virtual machines. My first problem is that all of them were using /var/log/httpd/access_log for logging. A lot of grep work, and I managed to split those out to individual access logs: /var/log/httpd/access_, for example.

My biggest problem is that a lot of log entries didn't actually indicate which virtualhost they were from. I ended up spending a few hours coming up with a bunch of rules to identify all queries for my non-main virtualhosts (yay static files). Then I dumped anything that didn't match those rules into my main virtualhost's log (including all the generic GET / entries). All my logs are sorted into per-virtualhost logs, and all lines from the original are accounted for. I renamed access_log to access_log.old, just so I don't mistakenly review its data again.

Now that we've got separate access logs, we need to tell our virtualhosts to use them. In each virtualhost I added new CustomLog and ErrorLog definitions, using the domain name of the virtualhost. Then restart httpd:

$ sudo systemctl restart httpd

I also disabled logrotate, and un-rotated my logs with zcat. I'll probably need to revisit this in the future, but 1 year worth of logs is only 55MB.

It goes without saying that awstats needs to be local to the logs. Do I want to manage awstats on all four machines? No. So I wrote a bash script to pull in my logs to a local directory:

$ cat /opt/logs/update-logs
for host in chrisirwin.ca; do
    rsync -avz $host:/var/log/httpd/*log* $host/
    rsync -avz $host:/var/log/gitlab/nginx/*log* $host/
done

Configure cron + ssh-keys to acquire that data, or run it manually whenever. Now I have a log store with a directory per server, and logs per virtualhost within them.

Then I picked my internal web host, and installed awstats. This is in Fedora 22, but requires you to enable epel for CentOS/RHEL:

$ sudo dnf install awstats

And, uh, restart apache again:

$ sudo systemctl restart httpd

Now go to /etc/awstats, and make a copy of the config for each domain:

$ sudo cp .conf

You'll probably want to read through all the options, but here's all the values I modified:

LogFile="/opt/logs/chrisirwin.ca/access_"
# DNSLookups is going to make log parsing take a *very* long time.
# My site is entirely https, so tell awstats that

Let's just piggy-back on provided functionality:

$ time sudo /etc/cron.hourly/awstats

Review your logs

By default, awstats figures out what config to use based on the domain name in the URL. However, I've aggregated my logs to a single location. Luckily, awstats developers thought of this, and you can pass along an alternate config in the url.

Unless you're running awstats on your localhost, you'll be denied access. You'll likely have to edit /etc/httpd/conf.d/awstats.conf and add Require ip 10.10.10.10/16, or whatever your local ip range is. Note that while you can add hostnames instead of IPs, reverse DNS needs to be configured. While there, you could also add DirectoryIndex .

AWStats logfile analyzer 7.8 Documentation

AWStats is a log analyzer which creates advanced web, ftp, mail and streaming server statistics reports based on the rich data contained in server log files. Data is graphically presented in easy to read web pages.

Designed with flexibility in mind, AWStats can be run through a web browser CGI (common gateway interface) or directly from the operating system command line. Through the use of intermediary database files, AWStats is able to quickly process large log files, as often desired.

AWStats development started in 1997 and is still developed today by the same author (Laurent Destailleur). However, development is now done as "maintenance fixes" or small new features, since the author has spent most of his time since July 2008 as project leader on another major open-source project called Dolibarr ERP & CRM, and also works full time for TecLib, a French open-source company. A lot of other developers maintain the software, providing patches or packages, above all for Linux distributions (Fedora, Debian, Ubuntu...).
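The per-virtualhost CustomLog and ErrorLog definitions mentioned above might look like this sketch; the log file names, port, and `combined` format are assumptions, not taken from the article:

```apache
<VirtualHost *:443>
    ServerName chrisirwin.ca
    # Separate logs per virtualhost, named after the domain.
    CustomLog /var/log/httpd/access_chrisirwin.ca_log combined
    ErrorLog  /var/log/httpd/error_chrisirwin.ca_log
</VirtualHost>
```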
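The option lines that belonged with the two config comments did not survive on the page. This is a hedged reconstruction using awstats' standard option names; the values (and the SiteDomain line) are assumptions, and the truncated LogFile path is reproduced as it appears in the article:

```
LogFile="/opt/logs/chrisirwin.ca/access_"
SiteDomain="chrisirwin.ca"
# DNSLookups is going to make log parsing take a *very* long time.
DNSLookup=0
# My site is entirely https, so tell awstats that.
UseHTTPSLinkForUrl="/"
```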
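The example URL for passing an alternate config was lost from the page. With awstats' CGI it takes the general shape below, where the `config` parameter names which `/etc/awstats/awstats.<name>.conf` to use; the hostname here is a made-up placeholder:

```
http://stats.internal.example/awstats/awstats.pl?config=chrisirwin.ca
```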
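The access-control change described above might end up looking like this inside the packaged apache config. The directory path is an assumption based on the Fedora package layout; the ip range is the one from the article:

```apache
<Directory "/usr/share/awstats/wwwroot">
    Require local
    # Allow the local network to view stats. Hostnames also work here,
    # but only if reverse DNS is configured.
    Require ip 10.10.10.10/16
</Directory>
```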
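The grep-based split described in the article can be sketched like this. The match patterns here are illustrative assumptions, not the author's actual rules, and the sketch works on a tiny sample log so it is self-contained:

```shell
#!/bin/sh
# Hypothetical sketch of splitting a shared access_log per virtualhost.
mkdir -p /tmp/logsplit && cd /tmp/logsplit || exit 1
cat > access_log <<'EOF'
10.0.0.1 - - [01/Jan/2016] "GET / HTTP/1.1" 200
10.0.0.2 - - [01/Jan/2016] "GET /wiki/style.css HTTP/1.1" 200
10.0.0.3 - - [01/Jan/2016] "GET /git/info/refs HTTP/1.1" 200
EOF

# Requests only a non-main virtualhost could serve (e.g. its static files)
# go to that virtualhost's log; everything that matches no rule, including
# the generic "GET /" entries, stays with the main virtualhost.
grep -E '/wiki/|/git/' access_log > access_other_log
grep -Ev '/wiki/|/git/' access_log > access_main_log

# Check that all lines from the original are accounted for.
echo "$(cat access_other_log access_main_log | wc -l) of $(wc -l < access_log) lines sorted"
```

The final check mirrors the article's point that every line of the original log should land in exactly one of the split logs.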
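Un-rotating logs with zcat, as the article mentions, amounts to concatenating the compressed rotations back onto the live file. This is a sketch with made-up file names following logrotate's default numbering, operating on sample files so it is self-contained:

```shell
#!/bin/sh
# Hypothetical sketch of folding rotated logs back into one file.
mkdir -p /tmp/unrotate && cd /tmp/unrotate || exit 1
printf 'line from last week\n' | gzip > access_log.2.gz
printf 'line from yesterday\n' | gzip > access_log.1.gz
printf 'line from today\n' > access_log

# Oldest first, so the combined log stays in chronological order.
{ zcat access_log.2.gz access_log.1.gz; cat access_log; } > access_log.all
mv access_log.all access_log
rm access_log.*.gz
wc -l access_log
```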