ELK your CM audit logs

In this post I'll show how I've leveraged Elasticsearch, Kibana, and Filebeat to build a dashboard from my CM audit logs.  I've created an Ubuntu server where I've installed all of the ELK components (note: if you don't have access to one, you don't need one to play... just install it all locally).  This segregates the stack from Content Manager and lets me work with it independently.

First I installed Filebeat onto the server generating my audit logs.  I configured it to search within the Audit Log output folder and to ignore the first line of each file.  You can see this configuration below.

(Screenshot: Filebeat configuration pointing at the audit log folder)
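For reference, here's a rough sketch of what that prospector section might look like in filebeat.yml. The log path and the header pattern are placeholders rather than my exact settings, and the exact keys vary a little by Filebeat version.

filebeat.prospectors:
  - type: log
    enabled: true
    # Hypothetical audit log output folder; point this at your own CM audit log path
    paths:
      - 'C:\CMAuditLogs\*.log'
    # Skip the header row at the top of each file (the pattern here is an assumption
    # about what the header line starts with; match your own header text)
    exclude_lines: ['^Event Description']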

I completed the configuration by directing the content to Elasticsearch directly.  Then I started Filebeat and let it run.

(Screenshot)
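If you're running Filebeat on Windows as I am, installing and starting it as a service looks roughly like this. The install folder is an assumption; the service install script ships in the Filebeat zip.

# Run from an elevated PowerShell prompt in the Filebeat install folder (path is an assumption)
cd 'C:\Program Files\Filebeat'
.\install-service-filebeat.ps1   # registers the "filebeat" Windows service
Start-Service filebeat           # begin shipping the audit logs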

When I inspect the index I can see that Filebeat has stuffed each line of the file into a single message field.

(Screenshot: each audit log line stored in the message field)

This doesn't help me.  I need to break each line of data into separate fields.  The old audit log viewer shows me the order of the columns and what data each contains.  I used this to define a regular expression to extract the properties.

(Screenshot: audit log column order shown in the old audit log viewer)

I used a grok pipeline processor to implement the regular expression, transform some of the values, and then remove the message field (so that it doesn't confuse things later).  The processor I came up with is as follows:

{
    "description": "Convert Content Manager Offline Audit Log data to indexed data",
    "processors": [
        {
            "grok": {
                "field": "message",
                "patterns": [ "%{DATA:EventDescription}\t%{DATA:EventTime}\t%{DATA:User}\t%{DATA:EventObject}\t%{DATA:EventComputer}\t%{NUMBER:UserUri}\t%{DATA:TimeSource}\t%{DATA:TimeServer}\t%{DATA:Owner}\t%{DATA:RelatedItem}\t%{DATA:Comment}\t%{DATA:ExtraDetails}\t%{NUMBER:EventId}\t%{NUMBER:ObjectId}\t%{NUMBER:ObjectUri}\t%{NUMBER:RelatedId}\t%{NUMBER:RelatedUri}\t%{DATA:ServerIp}\t%{DATA:ClientIp}$" ]
            }
        },
        {
            "convert": {
                "field": "UserUri",
                "type": "integer"
            }
        },
        {
            "convert": {
                "field": "ObjectId",
                "type": "integer"
            }
        },
        {
            "convert": {
                "field": "ObjectUri",
                "type": "integer"
            }
        },
        {
            "convert": {
                "field": "RelatedId",
                "type": "integer"
            }
        },
        {
            "convert": {
                "field": "RelatedUri",
                "type": "integer"
            }
        },
        {
            "remove": {
                "field": "message"
            }
        }
    ],
    "on_failure": [
        {
            "set": {
                "field": "error",
                "value": "{{ _ingest.on_failure_message }}"
            }
        }
    ]
}

Now when I test the pipeline I can see that the audit log data is being parsed correctly.  To test a pipeline you post the processor definition along with some sample data to the ingest pipeline simulator, and it returns the documents as the processor would index them.
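If you'd rather not use a separate REST client, the simulate call can be made from PowerShell as well.  This is just a rough sketch: the Elasticsearch host, the file paths, and the idea of saving the processor JSON above as cm-audit-pipeline.json are all assumptions for illustration.

# Load the pipeline definition (saved from the JSON above) and one sample line from an audit log
$pipeline   = Get-Content '.\cm-audit-pipeline.json' -Raw | ConvertFrom-Json
$sampleLine = (Get-Content 'C:\CMAuditLogs\sample.log')[1]   # index 1 skips the header row

# Post both to the ingest pipeline simulator and show the parsed result
$body = @{
    pipeline = $pipeline
    docs     = @(@{ _source = @{ message = $sampleLine } })
} | ConvertTo-Json -Depth 20

Invoke-RestMethod -Method Post -Uri 'http://localhost:9200/_ingest/pipeline/_simulate' `
    -ContentType 'application/json' -Body $body | ConvertTo-Json -Depth 20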

Result of posting to the pipeline simulator


As shown below, I used Postman to submit the validated pipeline to Elasticsearch and name it "cm-audit"...

(Screenshot: submitting the pipeline to Elasticsearch via Postman)
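If you don't have Postman handy, the same request can be made from PowerShell.  Here's a rough equivalent, again assuming the pipeline JSON has been saved locally and Elasticsearch is on localhost:

# Register the ingest pipeline under the name "cm-audit"
Invoke-RestMethod -Method Put -Uri 'http://localhost:9200/_ingest/pipeline/cm-audit' `
    -ContentType 'application/json' `
    -Body (Get-Content '.\cm-audit-pipeline.json' -Raw)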

Now I need to go back to my Content Manager server and update the Filebeat configuration.  Here I need to direct the output into the newly created pipeline.  I do this by adding a pipeline definition to the Elasticsearch output options.

(Screenshot: Filebeat Elasticsearch output options with the pipeline setting)
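In filebeat.yml the change amounts to one extra line in the Elasticsearch output section (the host below is a placeholder):

output.elasticsearch:
  hosts: ["http://my-elk-server:9200"]
  # Route every event through the ingest pipeline created above
  pipeline: "cm-audit"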

Next I stop Filebeat and delete the local registry file in ProgramData (this lets me re-process the audit log files).

(Screenshot: deleting the Filebeat registry file)
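Roughly, assuming Filebeat was installed as a Windows service with the default data path, that step looks like this:

Stop-Service filebeat
# Removing the registry makes Filebeat forget which files (and offsets) it has already shipped
Remove-Item 'C:\ProgramData\filebeat\registry' -Force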

Before starting it back up though, I need to delete the existing index.

Delete action in the Elasticsearch Head

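The Head plugin isn't required for this; a DELETE against the index over the REST API works too.  The index name pattern below is an assumption, so check GET _cat/indices for the actual name first.

# Remove the existing Filebeat index so the data can be re-ingested through the pipeline
Invoke-RestMethod -Method Delete -Uri 'http://localhost:9200/filebeat-*'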

Now I can start it back up and let it populate Elasticsearch.  If I check the index via Kibana I can see my custom fields for the audit logs are there.

(Screenshot: custom audit log fields visible in Kibana)

Last step: create some visualizations and then place them onto a dashboard.  In my instance I've also set up Winlogbeat (which I'll cover in another post some other day), so I have lots of information I can use in my dashboards.  For a quick example I'll show the breakdown of event types and users.

(Screenshot: dashboard showing the breakdown of event types and users)

Automating movement to cheaper storage

Now that I have an Azure Blob store configured, I want to have documents moved there after they haven't been used for a while.  I've previously shown how to do this manually within the client, but now I'll show how to automate it.

The script is very straightforward and follows these steps:

  1. Find the target store in CM
  2. Find all documents to be moved
  3. Move each document to the target store

Here's an implementation of this in PowerShell:

Clear-Host
Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$LocalStoreName = "Main Document Store"
$AzureStoreName = "Azure Storage"
$SearchString = "store:$($LocalStoreName) and accessedOn<Previous Year"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
#fetch the store and exit if missing
$Tier3Store = $Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::ElectronicStore, $AzureStoreName)
if ( $Tier3Store -eq $null ) {
    Write-Error "Unable to find store named '$($AzureStoreName)'"
    exit
}
#search for records eligible for transfer
$Records = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, Record
$Records.SearchString = $SearchString
Write-Host "Found $($Records.Count) records"
$x = 0
#transfer each record
foreach ( $Result in $Records ) 
{
    $Record = [HP.HPTRIM.SDK.Record]$Result
    $Record.TransferStorage($Tier3Store, $true)
    Write-Host "Record $($Record.Number) transferred"
    $x++
}

I ran it to get the results below.  I forced it to stop after the first record for demonstration purposes, but you should get the idea. 

(Screenshot: script output)

Automatically Creating Alerts for Users

A question came up over on the forum about the possibility of having a notification sent to users when they've been selected in a custom property.  For instance, maybe when a request for information is registered I want to alert someone.  Whoever is selected should get a notice.

(Screenshot: custom property for selecting a user)

This behavior is really what the "assignee" feature is meant to handle.  Users can then use their in-trays to see these newly created items.  However, the email notification options related to that feature are fairly limited.  If I, as the user, create an alert then I can receive a notification that a record has been created and assigned to me.

(Screenshot)

Here are the properties for the alert...

(Screenshots: alert properties)

Now the challenge here is that administrators cannot access alerts for other users.  We could tell each user to do this themselves, but then I wouldn't have much to write about.  So instead I create another user and craft a script.

(Screenshot: the additional user account)

The script needs to find all users that can log in and create an alert for each user who doesn't already have one.  To do that we'll have to impersonate those users, which requires that you either run the script as the service account or add the impersonating account to the Enterprise Studio options.  You can see below the option I'm referencing.

(Screenshot: impersonation option in Enterprise Studio)

This is how it appears after you've added the account...

Now I need a PowerShell script that finds all of the users and then connects as each one to search for an existing alert.  If it can't find an alert then it creates a new one via the impersonated connection.  This script can then be scheduled to run once every evening.  The audit logs will show that this impersonation happened, so there should be no concern regarding security.

Clear-Host
$LocationCustomPropertyName = "Alert Location"
Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
$LocationCustomProperty = [HP.HPTRIM.SDK.FieldDefinition]$Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::FieldDefinition, $LocationCustomPropertyName)
#Find all locations that can login
$Users = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, Location
$Users.SearchString = "login:* not uri:$($Database.CurrentUser.Uri)"
Write-Host "Found $($Users.Count) users"
foreach ( $User in $Users ) 
{
    try {
        #Impersonate this user so we can find his/her alerts
        $TrustedDatabase = New-Object HP.HPTRIM.SDK.Database
        $TrustedDatabase.TrustedUser = $User.LogsInAs
        $TrustedDatabase.Connect()
        Write-Host "Connected as $($TrustedDatabase.CurrentUser.FullFormattedName)"
        #formulate criteria string for this user 
        $CriteriaString = "$($LocationCustomProperty.SearchClauseName):[default:me]"
        #search using impersonated connection
        $Alerts = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch $TrustedDatabase, Alert
        $Alerts.SearchString = "user:$($User.Uri) eventType:added"
        #can't search on criteria so have to inspect each to see if already exists
        Write-Host "User $(([HP.HPTRIM.SDK.Location]$User).FullFormattedName) has $($Alerts.Count) alerts"
        $AlertExists = $false
        foreach ( $Alert in $Alerts ) 
        {
            if ( ([HP.HPTRIM.SDK.Alert]$Alert).Criteria -eq $CriteriaString ) 
            {
                $AlertExists = $true
            }
        }
        #when not existing we create it
        if ( $AlertExists -eq $false ) {
            $UserAlert = New-Object HP.HPTRIM.SDK.Alert -ArgumentList $TrustedDatabase
            $UserAlert.Criteria = $CriteriaString
            #$UserAlert.ChildSubscribers.NewSubscriber($TrustedDatabase.CurrentUser)
            $UserAlert.Save()
            Write-Host "Created an alert for $($User.FullFormattedName)"
        } else {
            Write-Host "Alert found for $($User.FullFormattedName)"
        }
    } catch [HP.HPTRIM.SDK.TrimException] {
        Write-Host "$($_)"
    }
}

When I run the script I get these results...

(Screenshot: script output showing the error)

Ah!  When I created Elmer I didn't give him an email address.  I could either update this script to automatically create email addresses for users, exclude them from the initial search for users (by adding "email:*" to the search string), or send a log of this script's output to an email address for manual rectification.
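The exclusion option is a one-line change to the search string in the script above, along these lines (assuming, as the forum suggestion implies, that "email" is a usable search clause for locations):

# Only pick up users that can log in AND have an email address (skips accounts like Elmer's)
$Users.SearchString = "login:* email:* not uri:$($Database.CurrentUser.Uri)"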

For now I manually fix it and run the script again.

(Screenshot: script output after fixing the email address)

If I run it a second time I can see it doesn't re-create it for him.

(Screenshot: script output from the second run)

Success!  Now just schedule this to run on a server once every day and you have a viable solution to the posted question.  Before implementation you might adjust the script to only target those users who would ever be selected for the custom property (if that's possible).
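As a sketch of the scheduling piece, something like the following would work; the task name, script path, run time, and account are all assumptions, and the account should be the service account or the impersonating account configured earlier.

# Register a task that runs the alert script every evening
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
           -Argument '-NoProfile -ExecutionPolicy Bypass -File "D:\Scripts\Create-UserAlerts.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 9pm
# Placeholder credentials; supply the real account when creating the task
Register-ScheduledTask -TaskName 'CM - Create User Alerts' -Action $action -Trigger $trigger `
    -User 'DOMAIN\cm_service' -Password 'P@ssw0rd'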