Auditing file deletion and access, and writing the events to a log file, using PowerShell

  • Tutorial

I think many of us have faced this situation: someone comes up and asks, "A file has gone missing on the shared drive - it was there, and now it isn't. Someone must have deleted it. Can you check who did it?" In the best case you answer that you don't have time; in the worst case you go looking for a mention of that file in the logs. And even when file auditing is enabled on the file server, the logs there are, to put it mildly, very large, and finding anything in them is next to impossible.
So, after yet another such question (fine, backups run several times a day) and my usual answer of "I don't know who did this, but I'll restore the file for you", I decided that this fundamentally did not suit me ...

Let's get started.


To begin, enable auditing of access to files and folders in Group Policy (or in the Local Security Policy):
Security Settings -> Advanced Audit Policy Configuration -> Object Access
Enable "Audit File System" for both Success and Failure.
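The same setting can also be applied from an elevated PowerShell or command prompt with the built-in auditpol tool (note that subcategory names are localized, so on a non-English system the name will differ):

#Enable Success and Failure auditing for the "File System" subcategory
auditpol /set /subcategory:"File System" /success:enable /failure:enable
#Check the resulting settings
auditpol /get /category:"Object Access"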
After that, we need to configure auditing on the folders we are interested in.
Go to the properties of the shared folder on the file server, open the "Security" tab, click "Advanced", switch to the "Auditing" tab, click "Change" and then "Add". Select the users to audit - I recommend "Everyone", otherwise there is little point. Set the scope to "This folder, subfolders and files".
Choose the actions to audit. I chose "Create files / write data" (Success/Failure), "Create folders / append data" (Success/Failure), "Delete subfolders and files" and plain "Delete", also for Success/Failure.
Click OK and wait for the audit settings to propagate to all files. After that, a large number of file and folder access events will appear in the Security event log. Their number is directly proportional to the number of users working with the share and, of course, to how actively it is used.
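If there are many folders to cover, the same audit entries can be added from PowerShell instead of the GUI. Below is a minimal sketch of that idea (the path D:\Shares\Secret is just an example; on a non-English system the "Everyone" group has a localized name):

#Example path - replace with your shared folder
$Path = "D:\Shares\Secret"
#Rights that mirror the checkboxes above: create files / write data, create folders / append data,
#delete subfolders and files, delete
$Rights = [System.Security.AccessControl.FileSystemRights]"CreateFiles, AppendData, Delete, DeleteSubdirectoriesAndFiles"
#Audit Everyone, for this folder, subfolders and files, on Success and Failure
$Rule = New-Object System.Security.AccessControl.FileSystemAuditRule("Everyone", $Rights, "ContainerInherit, ObjectInherit", "None", "Success, Failure")
$Acl = Get-Acl -Path $Path -Audit
$Acl.AddAuditRule($Rule)
Set-Acl -Path $Path -AclObject $Acl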

So the data is now in the logs; all that remains is to pull out only the events that interest us, without the unnecessary noise. We will then write the data into a text file line by line, separating the fields with tabs, so that later it can be opened with, for example, a spreadsheet editor.

#Define the period over which the script is run once and the events we need are searched for. Here the period is 1 hour, i.e. all events for the last hour are checked.
$time =  (get-date) - (new-timespan -min 60)

#$BodyL - the variable that is written to the log file
$BodyL = ""

#$Body - the variable that receives ALL events with the required ID.
$Body = Get-WinEvent -FilterHashtable @{LogName="Security";ID=4663;StartTime=$Time}|where{ ([xml]$_.ToXml()).Event.EventData.Data |where {$_.name -eq "ObjectName"}|where {($_.'#text') -notmatch ".*tmp"} |where {($_.'#text') -notmatch ".*~lock*"}|where {($_.'#text') -notmatch ".*~$*"}} |select TimeCreated, @{n="Файл_";e={([xml]$_.ToXml()).Event.EventData.Data | ? {$_.Name -eq "ObjectName"} | %{$_.'#text'}}},@{n="Пользователь_";e={([xml]$_.ToXml()).Event.EventData.Data | ? {$_.Name -eq "SubjectUserName"} | %{$_.'#text'}}} |sort TimeCreated

#Then, in a loop, check whether the event contains a specific word (for example, the name of the share, e.g. Secret)
foreach ($bod in $body){
	if ($Bod -match ".*Secret*"){

#If it does, append to $BodyL a line with the data: time, full file path, user name. At the end of the line
#a newline is added, so that the next line holds the data for the next file - and so on,
#until $BodyL contains all the data about the users' file accesses.

		$BodyL=$BodyL+$Bod.TimeCreated+"`t"+$Bod.Файл_+"`t"+$Bod.Пользователь_+"`n"
	}
}

#Since there can be a lot of records (depending on how actively the share is used), it is better to split the log
#into days: each day gets a new log. The log name consists of the prefix AccessFile and the date: day, month, year.
$Day = $time.day
$Month = $Time.Month
$Year = $Time.Year
$name = "AccessFile-"+$Day+"-"+$Month+"-"+$Year+".txt"
$Outfile = "\serverServerLogFilesAccessFileLog"+$name

#Write our variable with all the data for the last hour to the log file.
$BodyL | out-file $Outfile -append
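To see what exactly the script parses, it can help to dump the EventData fields of a single 4663 event; ObjectName and SubjectUserName are the fields the filter and the calculated properties above rely on (this assumes at least one 4663 event is already in the Security log):

#Show the Name / #text pairs of the most recent 4663 event
$e = Get-WinEvent -FilterHashtable @{LogName="Security";ID=4663} -MaxEvents 1
([xml]$e.ToXml()).Event.EventData.Data | Select-Object Name, '#text'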


And now for a more interesting script.



This script writes a log of deleted files.

#The $Time variable has the same purpose here as in the previous script.

$time =  (get-date) - (new-timespan -min 60)

#$Events - contains the time and the record number of each event with ID=4660, sorted by record number.
#!!!Important note!!! When a file is deleted, 2 records are created at once: ID=4660 and ID=4663.
$Events = Get-WinEvent -FilterHashtable @{LogName="Security";ID=4660;StartTime=$time} | Select TimeCreated,@{n="Запись";e={([xml]$_.ToXml()).Event.System.EventRecordID}} |sort Запись

#The most important search commands. I describe how they work below, after the script listing.
$BodyL = ""
$TimeSpan = new-TimeSpan -sec 1
foreach($event in $events){
	$PrevEvent = $Event.Запись
	$PrevEvent = $PrevEvent - 1
	$TimeEvent = $Event.TimeCreated
	$TimeEventEnd = $TimeEvent+$TimeSpan
	$TimeEventStart = $TimeEvent- (new-timespan -sec 1)
	$Body = Get-WinEvent -FilterHashtable @{LogName="Security";ID=4663;StartTime=$TimeEventStart;EndTime=$TimeEventEnd} |where {([xml]$_.ToXml()).Event.System.EventRecordID -match "$PrevEvent"}|where{ ([xml]$_.ToXml()).Event.EventData.Data |where {$_.name -eq "ObjectName"}|where {($_.'#text') -notmatch ".*tmp"} |where {($_.'#text') -notmatch ".*~lock*"}|where {($_.'#text') -notmatch ".*~$*"}} |select TimeCreated, @{n="Файл_";e={([xml]$_.ToXml()).Event.EventData.Data | ? {$_.Name -eq "ObjectName"} | %{$_.'#text'}}},@{n="Пользователь_";e={([xml]$_.ToXml()).Event.EventData.Data | ? {$_.Name -eq "SubjectUserName"} | %{$_.'#text'}}} 
	if ($Body -match ".*Secret*"){
		$BodyL=$BodyL+$Body.TimeCreated+"`t"+$Body.Файл_+"`t"+$Body.Пользователь_+"`n"
	}
}


$Month = $Time.Month
$Year = $Time.Year
$name = "DeletedFiles-"+$Month+"-"+$Year+".txt"
$Outfile = "\serverServerLogFilesDeletedFilesLog"+$name


$BodyL | out-file $Outfile -append


As it turns out, deleting a file and deleting a handle produce the same event in the log, with ID 4663. However, the message body can contain different "Accesses" values: WriteData (or AddFile), DELETE, and so on.
Naturally, we are interested in the DELETE operation. But that is not all. The most interesting part is that an ordinary file rename also creates 2 events with ID 4663: the first with the DELETE access, the second with WriteData (or AddFile). So if we simply select all 4663 events, we get a lot of false positives, with deleted and merely renamed files mixed together.
However, I noticed that when a file is actually deleted, one more event is created, with ID 4660. If you look carefully at its message body, it contains the user name and plenty of other service information, but no file name. What it does contain is the handle ID.

And the event immediately preceding it is an event with ID 4663, which contains the file name, the user name, the time and, unsurprisingly, the DELETE operation. Most importantly, it contains a handle ID that matches the handle ID from the 4660 event above (the one created when a file is actually deleted, remember?). So, to know exactly which files were deleted, we only need to find all events with ID 4660 and, for each of them, the preceding 4663 event that holds the required handle.

These 2 events are generated at the same moment the file is deleted, but are written sequentially: first 4663, then 4660. Their record numbers therefore differ by one: the 4660 event's record number is one greater than that of the 4663 event.
It is by this property that the required event is looked up.
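A stripped-down illustration of that pairing, for a single deletion (this is only a sketch of the idea, not a replacement for the full script, and it assumes at least one 4660 event is present in the log):

#Take one 4660 (delete) event and read its record number
$Del = Get-WinEvent -FilterHashtable @{LogName="Security";ID=4660} -MaxEvents 1
$Rid = ([xml]$Del.ToXml()).Event.System.EventRecordID
#The companion 4663 record, which holds the file name, is the record written just before it
$Companion = Get-WinEvent -FilterHashtable @{LogName="Security";ID=4663;StartTime=$Del.TimeCreated.AddSeconds(-1);EndTime=$Del.TimeCreated.AddSeconds(1)} |
    where {([xml]$_.ToXml()).Event.System.EventRecordID -eq ([int]$Rid - 1)}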


That is: all events with ID 4660 are taken, and two of their properties are kept - creation time and record number.
Then, in a loop, each 4660 event is processed in turn, and its time and record number are read.
Next, the $PrevEvent variable is set to the record number of the event we need - the one that holds the information about the deleted file. A time window is also defined within which to search for the event with that record number (the same one we put into $PrevEvent). Since the two events are generated almost simultaneously, the search is narrowed down to 2 seconds: plus or minus 1 second.
(Yes, it really is +1 second and -1 second. Why exactly, I cannot say; it was found experimentally. If you do not add the extra second, some events are missed - perhaps because the two events can be written in either order.)
I should note right away that searching by record number alone across all events for a whole hour takes a very long time, because the record number is inside the event body, and each event has to be parsed to read it. That is why such a small 2-second window is needed (+/- 1 second around the 4660 event, remember?).
It is within this time window that the event with the required record number is searched for.
Once it is found, the filters are applied:
|where{ ([xml]$_.ToXml()).Event.EventData.Data |where {$_.name -eq "ObjectName"}|where {($_.'#text') -notmatch ".*tmp"} |where {($_.'#text') -notmatch ".*~lock*"}|where {($_.'#text') -notmatch ".*~$*"}}


That is, we do not record information about deleted temporary files (.*tmp), MS Office document lock files (.*lock), or MS Office temporary files (.*~$*).
In the same way we take the required fields from that event and write them to the $BodyL variable.
Once all the events have been processed, $BodyL is written to the text log file.

For the deleted-files log I use a different scheme: one file per month, with a name containing the month number and the year, since deletions are far less frequent than file accesses.

As a result, instead of endlessly digging through the logs in search of the truth, you can open the log file with any spreadsheet editor and look at the data you need, by user or by file.
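The tab-separated format also makes it easy to pull the log back into PowerShell if you prefer to filter it there. A minimal sketch (the file name and the column headers Time/File/User are only illustrative - the log files themselves have no header row, and the user name here is made up):

#Load one daily access log and show everything a particular user touched
$Log = Import-Csv "\\server\ServerLogFiles\AccessFileLog\AccessFile-1-2-2015.txt" -Delimiter "`t" -Header Time,File,User
$Log | where {$_.User -eq "ivanov"} | sort Time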

Recommendations


You will have to determine for yourself the period over which to search for the events. The longer the period, the longer the search takes; it all depends on server performance. If the server is weak, start with 10 minutes and see how fast it runs. If it takes longer than 10 minutes, either increase the period (it might help) or, on the contrary, reduce it to 5 minutes.

Once you have settled on a period, put the script in Task Scheduler and set it to run every 5, 10, or 60 minutes (depending on the period you specified in the script). Mine runs every 60 minutes: $time = (get-date) - (new-timespan -min 60).
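For reference, such a task can also be created from the command line with the built-in schtasks tool; the task name and script path below are just examples, and the task should run under an account that can read the Security log (SYSTEM here):

schtasks /Create /TN "AuditFileAccess" /SC MINUTE /MO 60 /RU SYSTEM /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\AccessFile.ps1"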

PS


Both of these scripts run for me against a 100 GB network share that about 50 users actively work with every day.
Searching one hour's worth of events for deleted files takes 10-15 minutes.
Searching for all file accesses takes from 3 to 10 minutes, depending on the load on the server.