A Custom Solution for Auditing a Highly Active SharePoint Farm
Note: This post outlines how to get SharePoint audit events into the McAfee SIEM product. Because the solution exports SharePoint audit events to flat text log files as an intermediate step, it can easily be adapted to work with any log analysis tool.
I needed to audit the activity of a heavily used SharePoint farm that served primarily as a document management system for a firm. Auditing was required on all document access, not just administrative activity, which produced nearly 1 million records per 24-hour period on average. This volume greatly outstripped the capability of SharePoint auditing exports via PowerShell or SharePoint tasks, and it caused reliability issues with LOGbinder for SharePoint (an otherwise excellent product). Additionally, querying the SharePoint database directly with the 32-bit Windows McAfee SQL Collector against my custom "Auditing" SQL view crashed the collector on every attempt, apparently due to a memory leak. So the traditional route of defining a SQL query and using McAfee's SQL Collector was not viable here, though in many environments I suspect it would work fine and would therefore not require the custom database and SQL Agent job described below.
The SharePoint audit log collection solution I ultimately came up with comprises a number of custom integration points and introduces a custom database, located on the SharePoint database server, that contains a custom table for each SharePoint content or administration site collection data source. See the "SharePoint Audit Log Collection Process Flow" diagram for the high-level collection logic, and the rest of this post for how the custom process works and its caveats. This solution has worked incredibly well and reliably in my environment: no custom recovery or "fixes" have ever been required, even when SharePoint services have gone down. In over six months of heavy production use, the only changes required have been a few additional ASP parser rules to capture more SharePoint events.
SharePoint Audit Collection Process Summary
SharePoint, when auditing is turned on, maintains audit log information in a table named dbo.AuditData within the same content database. A custom query against this table, with an INNER JOIN to the UserInfo table, is used to extract meaningful audit log data. Microsoft does not provide a mechanism to extract this log data other than a GUI or PowerShell Excel export, which breaks on our larger site collections and results in SQL locks and site performance/access problems.
A SQL Agent job on the SharePoint database server runs at scheduled intervals and:
- Queries a custom database containing a bookmark table that records, for each audit export job: the datetime of the last exported record, the time range of the last export, the number of records returned, and the time taken to execute the export query.
- Executes a SELECT statement against the SharePoint database for audit records from the last timestamp to the last timestamp plus a time increment (+30 min seems to work well). The export job quits immediately if this range would exceed the current time minus 15 minutes, to avoid missing events that have not yet occurred or have not yet been written to the SharePoint table.
Note: This direct access is not supported by Microsoft and has the potential to impact SharePoint access or performance. This is mitigated by using NOLOCK hints and requesting only small blocks of audit records at a time. Some risk remains, but it has not caused any detected issues in over six months of use in a high-transaction-volume SharePoint 2010 farm.
- The audit records are written to a flat file on the SQL server (my location is S:\SQLAuditLogs). The export is first written to a temporary file named RBS1_Temp.wrk; the contents of the .wrk file are then appended to the actual log file, named with the convention yearmonthday_hour.log.
- The time bookmark and export statistics are written to the custom database table.
- A McAfee Receiver data source periodically polls the \\Databaseserver\SQLAuditLogs directory for *.log files to ingest, then deletes the files once ingested.
Other Job Notes: The "TimeIncrement" used to collect logs needs to be chosen to:
- Minimize the SharePoint audit query time and the total time the query needs to run over the course of a day.
- Keep security audit logs in the SIEM timely, i.e. not lagging too far into the past.
- Allow catch-up to occur should the SQL Agent job be disabled for a period of time (i.e. the audit log collection time period should be greater than the SQL Agent job's scheduled frequency, which in turn should be greater than the time taken for the audit export query to run).
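The windowing and catch-up rules above can be sketched as follows. This is a minimal Python illustration, not part of the actual SQL Agent job; the function name and the default 30-minute increment and 15-minute safety lag are taken from the discussion above:

```python
from datetime import datetime, timedelta

def next_export_window(bookmark, now, increment_min=30, safety_lag_min=15):
    """Return the (start, end) window to export, or None if the window
    would extend past now minus the safety lag, i.e. some events in the
    window may not yet be written to the SharePoint AuditData table."""
    end = bookmark + timedelta(minutes=increment_min)
    cutoff = now - timedelta(minutes=safety_lag_min)
    if end > cutoff:
        return None  # too close to "now": quit and let the next run retry
    return bookmark, end
```

Because the increment (30 min) exceeds a typical job schedule (e.g. every 15 min), the window advances faster than real time, so the bookmark naturally catches up after an outage; once it gets within 15 minutes of now, the job simply exits until more time has passed.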
SharePoint Content Database AuditData Custom View:
USE [SharePoint Content DB name]
GO
/****** Object: View [dbo].[BAL_AuditView] Script Date: 08/16/2014 09:40:38 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE VIEW [dbo].[BAL_AuditView]
AS
SELECT TOP (100) PERCENT a.Occurred, a.SiteId, a.ItemId, a.ItemType, a.UserId, a.MachineName, a.MachineIp, a.DocLocation, a.LocationType, a.Event, a.EventName,
a.EventSource, a.SourceName, a.EventData, b.tp_Login, b.tp_Title, b.tp_Email, b.tp_Notes
FROM dbo.AuditData AS a WITH (NOLOCK) INNER JOIN
dbo.UserInfo AS b WITH (NOLOCK) ON a.UserId = b.tp_ID
WHERE (a.Occurred > '2014-06-21 05:33:22.000') AND (b.tp_Login <> 'SHAREPOINT\system')
ORDER BY a.Occurred
SQL Agent Job custom code:
DECLARE @BookmarkDate datetime
DECLARE @MinuteIncrement int = 60
DECLARE @NewDate datetime
DECLARE @CutoffDate datetime
DECLARE @RecordsReturned int
DECLARE @StartTime datetime
DECLARE @TimeTakenToExport int
DECLARE @BCPQuery varchar(2000)
DECLARE @BCPCommand varchar(2000)
DECLARE @LogFilePath varchar(100) = 'S:\SharepointAuditLogs\'
DECLARE @LogFileTemp varchar(100) = 'S:\SharepointAuditLogs\RBS1_Temp.wrk'
SELECT TOP 1 @BookmarkDate = Bookmark
FROM [BAL_AuditBookmark] ORDER BY Bookmark DESC
SET @NewDate = DATEADD(minute,@MinuteIncrement,@BookmarkDate)
SET @CutoffDate = DATEADD(minute,-15,GETDATE()) -- Make sure the log export lags the export attempt by 15 min. to give the latest events time to be written to the SharePoint DB
SET @CutoffDate = DATEADD(hour,+8,@CutoffDate) -- Add 8 hours to the cutoff time as the logs are in UTC; the server is in Pacific time
IF @NewDate <= @CutoffDate
BEGIN
SET @StartTime = GETDATE()
/* This query extracts the relevant audit data out of the SharePoint database */
SET @BCPQuery = 'SELECT TOP (100) PERCENT a.Occurred, a.SiteId, a.ItemId, a.ItemType, a.UserId, a.MachineName, a.MachineIp, a.DocLocation, a.LocationType, a.Event, a.EventName, a.EventSource, a.SourceName, a.EventData, b.tp_Login, b.tp_Title, b.tp_Email, b.tp_Notes FROM WSS_Content_RBS1.dbo.AuditData AS a WITH (NOLOCK) INNER JOIN WSS_Content_RBS1.dbo.UserInfo AS b WITH (NOLOCK) ON a.UserId = b.tp_ID WHERE (a.Occurred > '''+CONVERT(varchar(20),@BookmarkDate)+''') AND (a.Occurred < '''+CONVERT(varchar(20),@NewDate)+''') AND b.tp_Login <> ''SHAREPOINT\system'' ORDER BY a.Occurred'
SET @BCPCommand = 'bcp "'+@BCPQuery+'" queryout "'
SET @BCPCommand = @BCPCommand + @LogFileTemp + '" -T -c -t"|"'
EXEC master..xp_cmdshell @BCPCommand
-- APPEND WORK FILE TO LOG FILE with DATE and HOUR timestamp --
SET @BCPCommand = 'type '+@LogFileTemp+' >> '+@LogFilePath+CONVERT(varchar(10),@BookmarkDate,112)+'_'+CONVERT(varchar(2),DATEPART(hh, @BookmarkDate))+'.log'
EXEC master..xp_cmdshell @BCPCommand
SET @TimeTakenToExport = DATEDIFF(SECOND,@StartTime,GETDATE())
SET @RecordsReturned = 1 -- placeholder; the actual row count is not captured from bcp
INSERT INTO [BAL_AuditBookmark] VALUES (@NewDate,@MinuteIncrement,@RecordsReturned,@TimeTakenToExport);
END
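The log file naming used in the append step can be illustrated in Python. This is only a sketch of the convention; the real job builds the name in T-SQL, where CONVERT(varchar(10), @BookmarkDate, 112) yields yyyymmdd, followed by an underscore and the unpadded hour:

```python
from datetime import datetime

def log_file_name(bookmark):
    """Mirror the T-SQL convention: style-112 date (yyyymmdd), an
    underscore, then the hour of the bookmark (not zero-padded), '.log'."""
    return f"{bookmark.strftime('%Y%m%d')}_{bookmark.hour}.log"
```

For example, a bookmark of 2014-06-21 05:33 lands in 20140621_5.log, so each hour of bookmark time accumulates into its own file for the Receiver to pick up.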
Custom Database Creation Code:
/****** Object: Database [BAL_SharepointAudit] Script Date: 01/08/2015 11:46:28 ******/
CREATE DATABASE [BAL_SharepointAudit] ON PRIMARY
( NAME = N'BAL_SharepointAudit', FILENAME = N'F:\SQL2008Default\Data\BAL_SharepointAudit.mdf' , SIZE = 1029120KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
LOG ON
( NAME = N'BAL_SharepointAudit_log', FILENAME = N'F:\SQL2008Default\Data\BAL_SharepointAudit_log.ldf' , SIZE = 1746688KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO
ALTER DATABASE [BAL_SharepointAudit] SET COMPATIBILITY_LEVEL = 100
GO
USE [BAL_SharepointAudit]
GO
/****** Object: Table [dbo].[BAL_AuditBookmark] Script Date: 01/08/2015 11:47:26 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[BAL_AuditBookmark](
[Bookmark] [datetime] NOT NULL,
[Increment] [smallint] NOT NULL,
[RecordsReturned] [int] NOT NULL,
[ExportTimeInSec] [int] NOT NULL
) ON [PRIMARY]
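On the collection side, each exported record is one pipe-delimited line (bcp with -c -t"|") whose columns follow the SELECT order in the agent job. A minimal Python sketch of how such a line could be split into named fields; the field list mirrors the query above, and the sample values in the usage note are invented for illustration:

```python
# Column order matches the SELECT list in the SQL Agent job's bcp query.
FIELDS = [
    "Occurred", "SiteId", "ItemId", "ItemType", "UserId", "MachineName",
    "MachineIp", "DocLocation", "LocationType", "Event", "EventName",
    "EventSource", "SourceName", "EventData", "tp_Login", "tp_Title",
    "tp_Email", "tp_Notes",
]

def parse_audit_line(line):
    """Split one bcp-exported line on '|' into a dict keyed by column name.
    Assumes no embedded pipes in the data, since bcp -c does not quote fields."""
    values = line.rstrip("\r\n").split("|")
    return dict(zip(FIELDS, values))
```

For example, parse_audit_line("2014-06-21 05:33:22|...|DOMAIN\\jsmith|...") would map the first column to "Occurred" and the fifteenth to "tp_Login", which is roughly what a SIEM parser rule (such as the ASP rules mentioned earlier) does with these files.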