Getting logs the way we want them: analyzing logs on Windows systems

It's time to talk about working with logs conveniently, especially since Windows has plenty of non-obvious tools for this. Take Log Parser, for example, which is sometimes simply irreplaceable.

This article won't cover heavyweight solutions like Splunk and ELK (Elasticsearch + Logstash + Kibana). We'll focus on the simple and free.

Logs and command line

Before PowerShell, you could use cmd utilities like find and findstr. They are quite suitable for simple automation. For example, when I needed to catch errors in a 1C 7.7 data exchange, I used a simple command in the exchange scripts:

findstr "Fail" *.log >> fail.txt

This collected all exchange errors in the fail.txt file. But if something more was needed, such as information about the previous error, you had to build monstrous scripts with for loops or resort to third-party utilities. Fortunately, with the advent of PowerShell, these problems are a thing of the past.

The main tool for working with text logs is the Get-Content cmdlet, which displays the contents of a text file. For example, to dump the WSUS service log to the console, you can use the command:

Get-Content -Path 'C:\Program Files\Update Services\LogFiles\SoftwareDistribution.log' | Out-Host -Paging

To display the last lines of a log, there is the Tail parameter, which, combined with the Wait parameter, lets you watch a log in real time. Let's watch the system update itself with the command:

Get-Content -Path "C:\Windows\WindowsUpdate.log" -Tail 5 -Wait

Watching the progress of a Windows update.

If you need to catch certain events in a log, the Select-String cmdlet helps: it displays only the lines matching a search pattern. Let's take a look at the latest Windows Firewall drops:

Select-String -Path "C:\Windows\System32\LogFiles\Firewall\pfirewall.log" -Pattern 'Drop' | Select-Object -Last 20 | Format-Table Line

Seeing who is trying to get into our server.

If you need to see the log lines before and after the match, use the Context parameter. For example, to display three lines before and three lines after an error, you can use the command:

Select-String 'C:\Windows\Cluster\Reports\Cluster.log' -Pattern ' err ' -Context 3

Both useful cmdlets can be combined. For example, to display lines 46 through 75 of netlogon.log (skip the first 45 lines, then take 30), this command helps:

Get-Content 'C:\Windows\debug\netlogon.log' | Select-Object -First 30 -Skip 45
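Another combination worth sketching: piping Get-Content into Select-String to follow a log in real time while showing only the interesting lines. The log path and pattern here are just examples.

```powershell
# Tail the Windows Update log and show only lines containing "error" (case-insensitive)
Get-Content 'C:\Windows\WindowsUpdate.log' -Tail 100 -Wait | Select-String -Pattern 'error'
```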

The system event logs are stored in .evtx format, and there are separate cmdlets for working with them. For the classic logs ("Application", "System", and so on) there is Get-EventLog. This cmdlet is convenient, but it cannot work with the other application and service logs. For any log, including the classic ones, there is a more universal option: Get-WinEvent. Let's dwell on it in more detail.

To get a list of available system logs, you can run the following command:

Get-WinEvent -ListLog *

Listing the available logs and information about them.

To view a particular log, just add its name. For example, we get the last 20 entries from the System log with the command:

Get-WinEvent -LogName 'System' -MaxEvents 20

Recent entries in the System log.

Hash tables are the most convenient way to get specific events. For more on working with hash tables in PowerShell, see about_Hash_Tables on TechNet.

For example, let's get all events from the System log with event IDs 1 and 6013:

Get-WinEvent -FilterHashtable @{LogName='System'; ID=1,6013}

If you need events of a certain type, such as warnings or errors, use a filter by severity (Level). The following values are possible:

  • 0 - always write;
  • 1 - critical;
  • 2 - error;
  • 3 - warning;
  • 4 - information;
  • 5 - verbose.
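A single severity value can go straight into the hash table. A quick sketch (the log name and event count here are arbitrary):

```powershell
# Get the ten most recent error-level (Level 2) events from the System log
Get-WinEvent -FilterHashtable @{LogName='System'; Level=2} -MaxEvents 10
```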

Collecting a hash table with several severity values in a single query is not so easy. If we want both errors and warnings from the System log, we can add filtering with Where-Object:

Get-WinEvent -FilterHashtable @{LogName='system'} | Where-Object -FilterScript {($_.Level -eq 2) -or ($_.Level -eq 3)}

System log errors and warnings.

Similarly, you can filter directly by event text and by time.
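As a sketch of such a filter, the FilterHashtable key StartTime restricts events by time, and event text can be matched afterward. The one-day window and the 'disk' pattern below are arbitrary examples:

```powershell
# Error-level events from the System log for the last 24 hours,
# additionally filtered by the event message text
$start = (Get-Date).AddDays(-1)
Get-WinEvent -FilterHashtable @{LogName='System'; Level=2; StartTime=$start} |
    Where-Object { $_.Message -match 'disk' }
```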

You can read more about both cmdlets for working with event logs in the PowerShell documentation.

PowerShell is a convenient and flexible mechanism, but it requires knowing the syntax, and complex conditions or large numbers of files call for full-blown scripts. There is, however, a way to get by with plain SQL queries: the wonderful Log Parser.

Working with logs through SQL queries

The Log Parser utility appeared in the early 2000s and has since acquired an official graphical shell. Nevertheless, it has not lost its relevance and remains one of my favorite tools for log analysis. The utility can be downloaded from the Microsoft Download Center; the graphical interface for it is in the TechNet gallery. More on the GUI a bit later; let's start with the utility itself.

The capabilities of Log Parser have already been described in the article "LogParser - a familiar look at unusual things", so I will start with specific examples.

To begin, let's deal with text files. For example, let's get a list of RDP connections blocked by our firewall. The following SQL query does the job:

SELECT
 extract_token(text, 0, ' ') as date,
 extract_token(text, 1, ' ') as time,
 extract_token(text, 2, ' ') as action,
 extract_token(text, 4, ' ') as src-ip,
 extract_token(text, 7, ' ') as port
FROM 'C:\Windows\System32\LogFiles\Firewall\pfirewall.log'
WHERE action='DROP' AND port='3389'
ORDER BY date,time DESC

Let's look at the result:

Looking at the Windows Firewall log.

Of course, you can do anything with the resulting table: sort it, group it, whatever your imagination and SQL knowledge allow.
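For instance, a grouping sketch: counting dropped connections per source address. The column positions follow the firewall log layout used in the query above and may need adjusting for your log format:

```sql
SELECT
 extract_token(text, 4, ' ') AS src,
 COUNT(*) AS drops
FROM 'C:\Windows\System32\LogFiles\Firewall\pfirewall.log'
WHERE extract_token(text, 2, ' ') = 'DROP'
GROUP BY extract_token(text, 4, ' ')
ORDER BY drops DESC
```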

Log Parser also works great with many other sources. For example, let's find out where users have connected to our server via RDP.

We will work with the TerminalServices-LocalSessionManager/Operational log.

Log Parser cannot access every log directly; some it fails to open. In our case, we simply copy the log from %SystemRoot%\System32\Winevt\Logs\Microsoft-Windows-TerminalServices-LocalSessionManager%4Operational.evtx to %temp%\test.evtx.

We get the data with this query:

SELECT
 timegenerated as Date,
 extract_token(strings, 0, '|') as user,
 extract_token(strings, 2, '|') as sourceip
FROM '%temp%\test.evtx'
WHERE EventID = 21

Seeing who connected to our terminal server, and when.

Log Parser is especially convenient for working with a large number of log files, as with IIS or Exchange. Thanks to SQL, you can extract all kinds of analytics, down to statistics on the iOS and Android versions connecting to your server.

As an example, let's look at the statistics of the number of letters by day with this query:

SELECT
 TO_LOCALTIME(TO_TIMESTAMP(EXTRACT_PREFIX(TO_STRING([#Fields: date-time]),0,'T'), 'yyyy-MM-dd')) AS Date,
 COUNT(*) AS [Daily Email Traffic]
FROM 'C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\MessageTracking\*.LOG'
GROUP BY Date
ORDER BY Date ASC

If Office Web Components are installed on the system (they can be downloaded from the Microsoft Download Center), you can get a nice chart as the output.

We run the query and open the resulting picture...

We admire the result.

It should be noted that after installing Log Parser, the COM component MSUtil.LogQuery is registered in the system. It allows you to query the utility's engine not only through LogParser.exe but also from any familiar language. As an example, here is a simple PowerShell script that lists the 20 largest files on drive C:

$LogQuery = New-Object -ComObject "MSUtil.LogQuery"
$InputFormat = New-Object -ComObject "MSUtil.LogQuery.FileSystemInputFormat"
$InputFormat.Recurse = -1
$OutputFormat = New-Object -ComObject "MSUtil.LogQuery.CSVOutputFormat"
$SQLQuery = "SELECT TOP 20 Path, Size INTO '%temp%\output.csv' FROM 'C:\*.*' ORDER BY Size DESC"
$LogQuery.ExecuteBatch($SQLQuery, $InputFormat, $OutputFormat)
$CSV = Import-Csv "$env:TEMP\output.csv"
$CSV | Format-List
Remove-Item "$env:TEMP\output.csv"

You can read about the component's operation in the Log Parser COM API Overview article on the SystemManager.ru portal.

Thanks to this component, there are several helper utilities that provide a graphical shell for Log Parser. I won't cover the paid ones, but I'll show you the free Log Parser Studio.

The Log Parser Studio interface.

Its main feature is the library, which keeps all your queries in one place instead of scattered across folders. There are also plenty of ready-made examples to help you get the hang of the queries.

The second feature is the ability to export a query as a PowerShell script.

As an example, let's see how a query selecting the mailboxes that send the most mail works:

Selecting the most active mailboxes.

Log Parser Studio also supports many more log types. In "pure" Log Parser, the input formats are limited and there is no dedicated type for Exchange: you have to describe the fields and skip the headers yourself. In Log Parser Studio, the required formats are ready to use.

In addition to Log Parser, you can work with logs using the MS Excel features mentioned in the article "Excel instead of PowerShell". But maximum convenience comes from preparing the raw material with Log Parser and then processing it with Power Query in Excel.

Have you ever used any tools for digging through logs? Share in the comments.