Infrastructure contacted me to complain that one of our SharePoint environments was logging too much data (via the ULS) and it was becoming unmanageable (an Operations Management tool such as SCOM has not been configured). Looking through many gigabytes of text, even with a free tool like ULSViewer, it is difficult to be confident that you are correctly identifying the most common issues; it is an inexact art at best.
That is why I wrote a log analyser as a PowerShell script that processes ULS log files and, using fuzzy comparison, produces a report of the most frequently occurring log entries.
I am well aware that this information will not be useful in every case (hence I had to write this script myself). Nevertheless, I found it useful in my scenario and I hope that some of you will as well.
Just in case you are interested: using this script I was able to state with confidence that logging by the SPMonitoredScope class made up almost 30% of all log entries. This will be reduced by explicitly passing a Verbose trace severity to the class constructor and keeping the SharePoint Foundation : Monitoring diagnostic logging category at a log level of Medium.
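For the second half of that change, the category's throttling level can be set from the SharePoint Management Shell. A minimal sketch using the Set-SPLogLevel cmdlet (the Identity string is the Area:Category pair as it appears in Central Administration):

```powershell
# Keep the Monitoring category at Medium so Verbose SPMonitoredScope
# entries are throttled out of the ULS logs.
Set-SPLogLevel -Identity "SharePoint Foundation:Monitoring" -TraceSeverity Medium

# Verify the current level for the category.
Get-SPLogLevel -Identity "SharePoint Foundation:Monitoring"
```

This requires the Microsoft.SharePoint.PowerShell snap-in (loaded automatically in the SharePoint Management Shell).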
A few things of note:
You may want to add to or remove from the set of replace statements to increase or decrease the 'fuzziness' of the comparison. A replace statement that strips URLs is a good candidate if you wish to increase matches.
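To illustrate the idea, here is a minimal sketch of the kind of normalisation such replace statements perform; the regexes below are illustrative, not the script's exact set:

```powershell
# Example ULS-style message with volatile parts (URL, timing value).
$message = 'Leaving Monitored Scope (Request (GET:http://intranet/sites/hr/default.aspx)). Execution Time=123.45'

# Replace volatile tokens with placeholders so similar entries compare equal.
# Order matters: strip GUIDs and URLs before collapsing bare numbers.
$normalised = $message `
    -replace '[0-9a-fA-F]{8}-([0-9a-fA-F]{4}-){3}[0-9a-fA-F]{12}', '<GUID>' `
    -replace 'https?://\S+', '<URL>' `
    -replace '\d+(\.\d+)?', '<N>'

# Entries that normalise to the same string are counted as the same issue.
$normalised
```

Adding or removing lines from that chain is how you tune how aggressively entries are merged.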
The script loads entire files into memory at once. Be aware of this if you have very large log files or not much RAM.
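If that becomes a problem, a lower-memory alternative (not what the script currently does) would be to stream each file in chunks rather than reading it whole, for example:

```powershell
# Sketch of a streaming approach: -ReadCount emits the file in batches
# of 1000 lines instead of one large array, keeping memory use flat.
Get-ChildItem -Filter *.log | ForEach-Object {
    Get-Content $_.FullName -ReadCount 1000 | ForEach-Object {
        foreach ($line in $_) {
            # ...normalise and count $line here...
        }
    }
}
```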
The output file is a CSV; open it with Excel.
By default the script will identify and analyse all *.log files in the current directory.
If you cancel the script during processing (Ctrl+C), it will still write out all data processed up to that point.
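One way to get that Ctrl+C-safe behaviour in PowerShell is a try/finally block, since the finally section still runs when the pipeline is stopped. A sketch (the $results and output path names here are illustrative, not the script's own):

```powershell
$results = @{}
try {
    foreach ($file in Get-ChildItem -Filter *.log) {
        # ...process $file, accumulating counts in $results...
    }
}
finally {
    # Runs even when the script is interrupted with Ctrl+C, so whatever
    # has been processed so far is still written to the report.
    $results.GetEnumerator() |
        Sort-Object Value -Descending |
        Select-Object @{n='Entry';e={$_.Key}}, @{n='Count';e={$_.Value}} |
        Export-Csv -Path '.\uls-analysis.csv' -NoTypeInformation
}
```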
I quite enjoy PowerShell-ing so expect to see more utilities in the future.