Enriching Record Metadata via the Google Vision API

Many times the title of the uploaded file doesn't convey any real information.  We often ask users to supply additional terms, but we can also use machine learning models to automatically tag records.  This enhances the user's experience and provides more opportunities for search.  

faulkner.jpg
Automatically generated keywords, provided by the Vision API

In the rest of this post I'll show how to build this add-in and integrate it with the Google Vision API...


First things first, I created a solution within Visual Studio that contains one class library.  The library contains one class named Addin, which is derived from the TrimEventProcessorAddIn base class.  This is the minimum needed to be considered an "Event Processor Addin".

using HP.HPTRIM.SDK;
 
namespace CMRamble.EventProcessor.VisionApi
{
    public class Addin : TrimEventProcessorAddIn
    {
        public override void ProcessEvent(Database db, TrimEvent evt)
        {
        }
    }
}

Next I'll add a second class library project with a skeleton method named AttachVisionLabelsAsTerms.  This method will be invoked by the Event Processor and will result in keywords being attached to a given record.  To do so it will call upon the Google Vision API.  The event processor itself doesn't know anything about the Google Vision API.

using HP.HPTRIM.SDK;
 
namespace CMRamble.VisionApi
{
    public static class RecordController
    {
        public static void AttachVisionLabelsAsTerms(Record rec)
        {
 
        }
    }
}

Before I can work with the Google Vision API, I have to install the client library via the NuGet package manager and import its namespace.
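
The client library is published on NuGet as Google.Cloud.Vision.V1 (it provides the Image and ImageAnnotatorClient types used in the samples below), so from the Package Manager Console the install should look something like this:

Install-Package Google.Cloud.Vision.V1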

The online documentation provides this sample code that invokes the API:

var image = Image.FromFile(filePath);
var client = ImageAnnotatorClient.Create();
var response = client.DetectLabels(image);
foreach (var annotation in response)
{
    if (annotation.Description != null)
        Console.WriteLine(annotation.Description);
}

I'll drop this into a new static method in my VisionApi class library (which also means adding using directives for System.Collections.Generic and Google.Cloud.Vision.V1).  To re-use the sample code I'll pass the file path into the method call and return a list of labels.  I'll mark the method private so that it can't be called directly from the Event Processor Addin.

private static List<string> InvokeDetectLabels(string filePath)
{
    List<string> labels = new List<string>();
    var image = Image.FromFile(filePath);
    var client = ImageAnnotatorClient.Create();
    var response = client.DetectLabels(image);
    foreach (var annotation in response)
    {
        if (annotation.Description != null)
            labels.Add(annotation.Description);
    }
    return labels;
}

Now I can go back to my record controller and build out the logic.  I'll need to extract the record's document to disk, invoke the new InvokeDetectLabels method, and work with the results.  Ultimately I should include error handling and logging, but for now this is sufficient.

public static void AttachVisionLabelsAsTerms(Record rec)
{
    // formulate local path names
    string fileName = $"{rec.Uri}.{rec.Extension}";
    string fileDirectory = $"{System.IO.Path.GetTempPath()}\\visionApi";
    string filePath = $"{fileDirectory}\\{fileName}";
    // create storage location on disk
    if (!System.IO.Directory.Exists(fileDirectory)) System.IO.Directory.CreateDirectory(fileDirectory);
    // extract the file
    if (!System.IO.File.Exists(filePath)) rec.GetDocument(filePath, false, "GoogleVisionApi", filePath);
    // get the labels
    List<string> labels = InvokeDetectLabels(filePath);
    // process the labels
    foreach (var label in labels)
    {
        AttachTerm(rec, label);
    }
    // clean-up my mess
    if (System.IO.File.Exists(filePath)) try { System.IO.File.Delete(filePath); } catch ( Exception ex ) { }
}

I'll also need to create a new method named "AttachTerm".  This method takes the label provided by Google and attaches it to the record as a keyword (thesaurus term).  If the term does not yet exist then it will create it.

private static void AttachTerm(Record rec, string label)
{
    // if record does not already contain keyword
    if ( !rec.Keywords.Contains(label) )
    {
        // fetch the keyword
        Keyword keyword = null;
        try { keyword = new HP.HPTRIM.SDK.Keyword(rec.Database, label); } catch ( Exception ex ) { }
        if (keyword == null)
        {
            // when it doesn't exist, create it
            keyword = new Keyword(rec.Database);
            keyword.Name = label;
            keyword.Save();
        }
        // attach it
        rec.AttachKeyword(keyword);
        rec.Save();
    }
}

Almost there!  The last step is to go back to the event processor add-in and update it to use the record controller.  I'll also need to ensure I'm only calling the Vision API for supported image types and in certain circumstances.  After making those changes I'm left with the code shown below.

using System;
using HP.HPTRIM.SDK;
using CMRamble.VisionApi;
 
namespace CMRamble.EventProcessor.VisionApi
{
    public class Addin : TrimEventProcessorAddIn
    {
        public const string supportedExtensions = "png,jpg,jpeg,bmp";
        public override void ProcessEvent(Database db, TrimEvent evt)
        {
            switch (evt.EventType)
            {
                case Events.DocAttached:
                case Events.DocReplaced:
                    if ( evt.RelatedObjectType == BaseObjectTypes.Record )
                    {
                        InvokeVisionApi(new Record(db, evt.RelatedObjectUri));
                    }
                    break;
                default:
                    break;
            }
        }
 
        private void InvokeVisionApi(Record record)
        {
            if ( supportedExtensions.Contains(record.Extension.ToLower()) )
            {
                RecordController.AttachVisionLabelsAsTerms(record);
            }
        }
    }
}

Next I copied the compiled solution onto the workgroup server and registered the add-in via the Enterprise Studio. 

2018-05-19_7-57-09.png

 

Before I can test it though, I'll need to create a service account within Google Cloud.  Once created I'll download the API key as a JSON file and place it onto the server.
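
If you prefer the command line over the cloud console, the same setup can be sketched with the gcloud CLI (the project, account name, and key file names below are placeholders, not values from my environment):

# enable the Vision API, create a service account, and download a key for it
gcloud services enable vision.googleapis.com
gcloud iam service-accounts create cm-vision-api --display-name "CM Vision API"
gcloud iam service-accounts keys create vision-api-key.json --iam-account cm-vision-api@my-project.iam.gserviceaccount.com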

The API requires that the path to the JSON key file be referenced within an environment variable (GOOGLE_APPLICATION_CREDENTIALS).  The file can be placed anywhere on the server that is accessible to the CM service account.  This is done within the system properties contained in the control panel.

2018-05-19_7-51-45.png
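
If you'd rather script that than click through the control panel, the variable can also be set machine-wide from an elevated PowerShell prompt (the key path below is just a placeholder):

# set the machine-wide credential variable read by the Google client libraries
# (the path is a placeholder - point it at the downloaded service account key)
[Environment]::SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", "C:\CMRamble\vision-api-key.json", "Machine")

Keep in mind the event processor only reads its environment at start-up, so the service needs a restart before it will see the new variable.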

Woot woot!  I'm ready to test.  I should now be able to drop an image into the system and see some results!  I'll use the same image as provided within the documentation, so that I can ensure similar results.  

2018-05-19_8-16-19.png

Sweet!  Now I don't need to make users pick terms.... let the cloud do it for me!

Stress Testing GCP CM via ServiceAPI and jmeter

Let's see how my GCP hosted CM infrastructure holds up under ridiculous load.  

jmeter can be downloaded here.  Since this is a pure Java application you'll need to have Java installed (I'm using JRE 9, but most should use JDK 8).  Note that JDK 10 is not yet officially supported.

After extracting jmeter you launch it by executing jmeter.bat...

2018-05-12_6-39-04.png

Every test plan should have a name, description, and a few core variables...

There are many, many possible ways to build out a test plan.  Each will have at least one thread group though, so I'll start by creating just one.  

2018-05-12_6-49-13.png

As shown below, I created a thread group devoted to creating folders.  I also set error action to stop test.  Later I'll come back and increase the thread properties, but for now I only want one thread so that I can finish the configuration.

Next I'll add an HTTP Request Sampler, as shown below...

2018-05-12_7-37-48.png

The sampler is configured to submit a JSON record definition, with values for Title and Record Type.  This is posted to the ServiceAPI end-point for records.  It's as basic as you can get!
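
For reference, a quick way to sanity-check that same request outside of jmeter is to post an equivalent body with PowerShell.  This is only a sketch: the server address, record type, and property names (RecordTypedTitle, RecordRecordType) are assumptions on my part, so adjust them to match your dataset:

# hypothetical stand-alone version of the POST the jmeter sampler makes
$body = @{ RecordTypedTitle = "jmeter test folder"; RecordRecordType = "Folder" } | ConvertTo-Json
Invoke-RestMethod -Uri "http://10.0.0.1/HPECMServiceAPI/Record" -Method Post -Body $body -Headers @{ Accept = "application/json" } -ContentType "application/json"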

I'll also need an HTTP Header Manager though, so that the ServiceAPI understands I'm intending to work with JSON data.

2018-05-12_7-34-29.png

Last, I'll add a View Results Tree Listener, like so...

2018-05-12_7-40-52.png

Now I can run the test plan and review the results...

2018-05-12_7-46-35.png

After fixing the issue with my JSON (I used "RecordRecordTitle" instead of "RecordTypedTitle"), I can run it again and see success.

2018-05-12_7-48-46.png

Now I'll disable the view results tree, add a few report listeners, and ramp up the number of threads to 5.  Each thread will submit 5 new folders.  With that I can see the average throughput.

2018-05-12_7-51-36.png

Next I'll increase the number of folders, tweak the ramp-up period, and delay thread creation until needed.  Then I can run it and review the larger sample set.

2018-05-12_7-54-24.png

This is anecdotal though.  I also need to monitor all of the resources within the stack.  For instance, as shown below, I can see the impact of that minor load on the CPU of the workgroup server.  I'd also want to monitor the workgroup server's RAM, disk IO, network IO, and event logs.  The same goes for the database server and the other components of the stack.

2018-05-12_7-58-44.png
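
Watching Task Manager works for a quick look, but for a longer run it's handier to log counters to a file.  A rough sketch using the standard Windows performance counters (the output path is just a placeholder):

# sample a few core counters on the workgroup server during the run and log them to a .blg file
$counters = '\Processor(_Total)\% Processor Time',
            '\Memory\Available MBytes',
            '\PhysicalDisk(_Total)\Avg. Disk Queue Length',
            '\Network Interface(*)\Bytes Total/sec'
Get-Counter -Counter $counters -SampleInterval 5 -Continuous |
    Export-Counter -Path 'C:\temp\wgs-stress-counters.blg'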

From here the stress test plan could include logic to also create documents within the folders, run complex searches, or perform actions.  There are many other considerations before running the test plan, such as: running the plan via command line instead of GUI, scheduling the plan across multiple workstations, calculating appropriate thread count/ramp-up period based on infrastructure, and chaining multiple HTTP requests to more aptly reflect end-user actions.
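
As a sketch of that first consideration, running a saved plan in non-GUI mode from the jmeter bin folder looks something like this (the .jmx and output names are placeholders):

# -n non-GUI mode, -t the saved test plan, -l the raw sample log, -e/-o build the HTML report afterwards
.\jmeter.bat -n -t create-folders.jmx -l create-folders.jtl -e -o .\create-folders-report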

For fun though I'm going to create 25,000 folders and see what happens!

2018-05-12_8-32-43.png

Ensuring Records Managers can access MicroStrategy

I've got a MicroStrategy server environment with a Records Management group.  I'd like to ensure that certain new CM users always have access to MSTR, so that they have appropriate access to dashboards and reports.  For simplicity of this post I'll focus just on CM administrators.

Within MicroStrategy I'd like to create a new user and include them in a "Records Management" group, like so:

2018-05-05_8-48-44.png

This implementation of MSTR does not have an instance of the REST API available, so I'm limited to using Command Manager.  My CM instance is in the cloud and I won't be allowed to install the CM client on the MSTR server.  To bridge that divide I'll use a PowerShell script that leverages the ServiceAPI and the Invoke-Expression cmdlet.

First I need a function that gets me a list of CM administrators:

function Get-CMAdministrators {
    Param($baseServiceApiUri)
    $queryUri = ($baseServiceApiUri + "/Location?q=userType:administrator&pageSize=100000&properties=LocationLogsInAs,LocationLoginExpires,LocationEmailAddress,LocationGivenNames,LocationSurname")
    $AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
    [System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
    $headers = @{ 
        Accept = "application/json"
    }
    $response = Invoke-RestMethod -Uri $queryUri -Method Get -Headers $headers -ContentType "application/json"
    Write-Debug $response
    if ( $response.TotalResults -gt 0 ) {
        return $response.Results
    }
    return $null
}

Second I need a function that creates a user within MSTR:

function New-MstrUser {
    Param($psn, $psnUser, $psnPwd, $userName,$fullName,$password,$group)
    $command = "CREATE USER `"$userName`" FULLNAME `"$fullName`" PASSWORD `"$password`" ALLOWCHANGEPWD TRUE CHANGEPWD TRUE IN GROUP `"$group`";"
    $outFile = ""
    $logFile = ""
    try {
        $outFile = New-TemporaryFile
        Add-Content -Path $outFile $command
        $logFile = New-TemporaryFile
        $cmdmgrCommand = "cmdmgr -n `"$psn`" -u `"$psnUser`" -f `"$outFile`" -o `"$logFile`""
        iex $cmdmgrCommand
    } catch {
        
    }
    if ( (Test-Path $outFile) ) { Remove-Item $outFile -Force }
    if ( (Test-Path $logFile) ) { Remove-Item $logFile -Force }
}

Last step is to tie them together:

$psn = "MicroStrategy Analytics Modules"
$psnUser = "Administrator"
$psnPwd = ""
$recordsManagementGroupName = "Records Management"
$baseUri = "http://10.0.0.1/HPECMServiceAPI"
$administrators = Get-CMAdministrators -baseServiceApiUri $baseUri
if ( $administrators -ne $null ) {
    foreach ( $admin in $administrators ) {
        New-MstrUser -psn $psn -psnUser $psnUser -psnPwd $psnPwd -userName $admin.LocationLogsInAs.Value -fullName ($admin.LocationSurname.Value + ', ' + $admin.LocationGivenNames.Value) -password $admin.LocationLogsInAs.Value -group $recordsManagementGroupName
    } 
}

After running it, my users are created!

2018-05-05_10-03-31.png