How to Harden Xnix Logs

Posted on 9:25 PM by Bharathvn

Handling Logs
By now, you know what system logs there are, where they are stored, and the nature and format of their contents. All this information is useful, but analyzing several megabytes of text is inconvenient and difficult.

In a system processing numerous requests, logs grow rapidly. For example, the daily log on my Web server can exceed 4 MB. That is a lot of text, in which finding a specific entry quickly is practically impossible.

This is why programmers and administrators have written, and continue to write, log-analyzing software. Logs should be analyzed every day or, preferably, every hour. To maintain a secure system, you cannot afford to miss any important messages.

The most effective log-analyzing programs are those that analyze entries as they are recorded in the log. This is relatively simple to implement, especially on a remote computer that receives log entries from the server over the network. As entries come in, they are analyzed and then recorded in the log files for storage and more detailed future analysis. It is usually difficult to detect an attack from a single system message; sometimes a dynamic picture is necessary. For example, one failed authorization attempt means nothing, while ten or more attempts look quite suspicious.
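As a minimal sketch of this kind of threshold-based analysis, the following shell pipeline counts failed login attempts per source IP and reports only the addresses that reach a threshold. The log excerpt, file paths, and threshold value are all illustrative:

```shell
# Create a hypothetical log excerpt; real entries would come from
# /var/log/messages or /var/log/auth.log.
cat > /tmp/auth_sample.log <<'EOF'
Jan 10 10:00:01 host sshd[101]: Failed password for root from 10.0.0.5 port 22 ssh2
Jan 10 10:00:03 host sshd[102]: Failed password for root from 10.0.0.5 port 22 ssh2
Jan 10 10:00:06 host sshd[103]: Failed password for root from 10.0.0.5 port 22 ssh2
Jan 10 10:00:09 host sshd[104]: Failed password for root from 10.0.0.5 port 22 ssh2
Jan 10 10:05:00 host sshd[110]: Failed password for bob from 192.168.1.7 port 22 ssh2
EOF

# Count failed attempts per source IP; report only IPs at or above the threshold.
THRESHOLD=3
grep 'Failed password' /tmp/auth_sample.log \
  | awk '{for (i = 1; i <= NF; i++) if ($i == "from") print $(i + 1)}' \
  | sort | uniq -c \
  | awk -v t="$THRESHOLD" '$1 >= t {print $2 " failed " $1 " times"}'
```

With this sample, the single failed attempt from 192.168.1.7 is ignored, while the repeated attempts from 10.0.0.5 are reported.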

Unfortunately, none of the log-analyzing software I know of can do effective dynamic analysis. Most of these programs only apply rules according to which certain entries are considered suspicious or not. Under such rules, every failed system login entry is flagged as suspicious and must then be analyzed manually. But every day at least one user hits the wrong key when entering a password, especially a complex one, so it would make no sense to react to all such messages.

There is another shortcoming to analyzing logs line by line. Suppose that the log-analyzing utility issues a message informing you of an attempt to access a restricted disk area. For most services, such log entries contain only information about the attempt itself, not about the user account involved.

For example, a log entry recording unauthorized access to the ftp directory will contain the IP address of the client but not the user account. To find out which user produced this failed attempt, you have to open the log and look over the connection history from this IP manually. Dynamic log analysis avoids this problem.
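Even without dynamic analysis, the manual correlation can at least be scripted. The sketch below assumes a hypothetical ftpd log format in which every entry of a session is tagged with the daemon's PID; both the log lines and the file path are invented for illustration:

```shell
# Hypothetical log excerpt: the suspicious entry names only the IP address,
# but the session PID links it back to a USER line.
cat > /tmp/ftp_sample.log <<'EOF'
Jan 10 09:58:12 host ftpd[201]: USER alice
Jan 10 09:58:12 host ftpd[201]: connection from 10.0.0.5
Jan 10 09:59:40 host ftpd[202]: USER carol
Jan 10 09:59:40 host ftpd[202]: connection from 172.16.0.9
Jan 10 10:01:00 host ftpd[201]: FAIL CHDIR /etc from 10.0.0.5
EOF

# Step 1: find the session PID used by the suspicious IP.
pid=$(grep 'connection from 10\.0\.0\.5' /tmp/ftp_sample.log \
      | sed 's/.*ftpd\[\([0-9]*\)\].*/\1/' | head -n 1)

# Step 2: pull every entry for that session, including the USER line.
grep "ftpd\[$pid\]" /tmp/ftp_sample.log
```

Here the second grep recovers the whole session, so the USER entry reveals which account the offending IP was using.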

The Tail Utility
When I am working directly at the server, I launch the following command in a new terminal window:

tail -f /var/log/messages

This command displays updates to the log file in real time; that is, whenever a new entry is added to the log, the utility displays it.

This is convenient if only a few entries are recorded in the log: you can work in one terminal and periodically switch to the other to check new log messages. But if there are too many system messages (e.g., many users are working with the server), checking every new entry becomes impossible. In that case, you need a utility that filters the messages and displays only those deemed suspicious.
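A minimal way to do such filtering with standard tools is to pipe the live stream through grep; the patterns below are examples, not a complete rule set:

```shell
# Live use would be:
#   tail -f /var/log/messages | grep --line-buffered -E 'Failed password|refused'
# (--line-buffered makes grep flush each match immediately instead of
# buffering the pipe.)

# The same filter applied to a hypothetical sample shows what gets through:
cat > /tmp/messages_sample.log <<'EOF'
Jan 10 11:00:00 host sshd[301]: Accepted password for alice from 10.0.0.8
Jan 10 11:00:05 host sshd[302]: Failed password for root from 10.0.0.5
Jan 10 11:00:09 host kernel: connection refused by rule 4
EOF
grep -E 'Failed password|refused' /tmp/messages_sample.log
```

Only the failed login and the refused connection pass the filter; the routine successful login is suppressed.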

The Swatch Utility
Swatch is a powerful log-analyzing utility written in Perl. Perl is a rather simple language that many administrators know, so you can easily modify the program and add new functions. The program can be downloaded from the site http://sourceforge.net/project/swatch.

The program can analyze log entries on a schedule (if it is scheduled in the cron task manager) or immediately as they are entered into the log.
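For the scheduled mode, the cron entry might look like the following sketch; the paths and option names are illustrative, so check the options documented for your Swatch version:

```
# Hypothetical crontab entry: examine the system log once an hour.
0 * * * * /usr/bin/swatch --config-file=/etc/swatchrc --examine=/var/log/messages
```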

Because Swatch is a Perl program, its installation process differs from the usual one. It is installed by executing the following sequence of commands:

tar xzvf swatch-3.1.tgz
cd swatch-3.1
perl Makefile.PL
make test
make install
make realclean
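After installation, Swatch's behavior is driven by a configuration file (commonly ~/.swatchrc) of pattern/action pairs. The sample below is a hedged sketch assuming the classic watchfor syntax; the patterns, throttle interval, and mail address are placeholders to adapt to your own logs:

```
# Highlight failed logins, but report each at most once per 5 minutes.
watchfor /Failed password/
    echo bold
    throttle 0:05:00

# Mail the administrator on failed su attempts.
watchfor /BAD SU/
    echo red
    mail addresses=admin@example.com,subject=--BAD-SU--
```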

That the program is written in Perl is also a shortcoming. As I have already mentioned, any software that hackers could use to penetrate the system should not be installed on the server unless necessary. The Perl interpreter is necessary for a Web server that uses scripts written in this language; in other cases, I recommend against installing it, because hackers often use Perl to write their own rootkits.

The Logsurfer Utility
This is one of the few programs that can examine logs dynamically. The program can be downloaded from sourceforge.net/projects/logsurfer. As noted earlier, most log-analyzing programs work line by line, which is ineffective because it produces a lot of noise.

The program's powerful features also make it harder to configure. This is a real shortcoming, because a configuration error may result in an important event going undetected.

The Logcheck/LogSentry Utility
This is the easiest of these programs to use. It was developed by the same programmers who created the PortSentry utility considered earlier. LogSentry uses various templates to filter out suspicious log messages.
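These templates are essentially regular expressions matched against log lines. As an illustration, the snippet below tests a hypothetical ignore rule of the kind such filters use against a sample entry; both the rule and the log line are invented:

```shell
# A rule that would drop routine "Connection closed" noise from a report.
rule='sshd\[[0-9]+\]: Connection closed by [0-9.]+'

# Verify against a sample log line; grep -E exits 0 when the rule matches.
echo 'Jan 10 12:00:00 host sshd[401]: Connection closed by 10.0.0.9' \
  | grep -E "$rule"
```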

The program is user-friendly, but I am concerned about its future. It looks as if there will be no more updates, and sooner or later the current features will not be enough and a substitute will be necessary.

Even so, I have high hopes for the program. Its operation was considered in Section 12.4, together with the operation of the PortSentry program.