Relocating a Document Store

I've recently migrated my dataset's database into a secure cloud environment.  I've also prepared a workgroup server in the cloud and registered the dataset on that server.  When users try to open electronic documents, though, an error is generated.

download.png

The error message indicates the expected location of the electronic file, relative to the server.  In this case I should be able to browse the file system on the old server and find this directory: "C:\HPE Content Manager\DocumentStore\47".  When I look on the old server, though, I don't see it.

To sort this out I'll need to review the properties of the record and find the document store details.  That information is stored in the Document Details property.  I can view that from the Properties tab of the view pane, as shown below.

2018-04-23_21-32-49.png

To dig in further I'll need to go to the Administration ribbon and click Document Stores.

Then I locate the document store referenced in the record's Document Details property and open its properties dialog.

Well, that doesn't seem right.  The highlighted path does indeed exist.  However, when I migrated my dataset into the cloud I also changed the dataset ID!  The path shown in the document store's properties is just the starting path.  Within it will be a folder named for the dataset ID, and within that a folder named for the document store's unique ID.  I can find that ID by reviewing the view pane for the document store (you may need to customize the view pane and add that property).

2018-04-25_10-12-40.png

If I go back to the old server I should find a sub-folder within "45" (the old dataset ID) named "2"....

2018-04-25_10-14-40.png

Moving that folder onto my new server, into the path "C:\HPE Content Manager\DocumentStore\47", should resolve the issue.  After dropping the folder into that path, everything works!
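If you'd rather script the copy than drag-and-drop it, a minimal sketch is below.  It assumes the old server's drive is reachable over an admin share ("OLDSERVER" is a placeholder name); the source and destination paths come from the dataset and store IDs above.

# Minimal sketch -- "OLDSERVER" is a placeholder for the old server's name.
# Copies store folder "2" from the old dataset folder (45) into the new dataset folder (47).
Copy-Item -Path "\\OLDSERVER\C$\HPE Content Manager\DocumentStore\45\2" -Destination "C:\HPE Content Manager\DocumentStore\47" -Recurse -Force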

The moral of the story: be mindful when changing dataset IDs!

Also, it would be really nice to be able to use a Google Cloud Storage bucket as a document store!

PostgreSQL in GCP

If you're like me and moving to the GCP stack, you'll find there are a few extra steps to take when preparing your infrastructure.

First, you'll have to manually install the PostgreSQL ODBC drivers on your VM instance.  You can download them from the PostgreSQL website, located here:

https://www.postgresql.org/ftp/odbc/versions/
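If you'd rather script the install than click through it, here's a rough sketch; the MSI path below is a placeholder for whichever 64-bit psqlODBC build you download from the link above.

# Minimal sketch -- the MSI path is a placeholder for the version you downloaded.
Start-Process msiexec.exe -ArgumentList '/i "C:\Temp\psqlodbc_x64.msi" /qn' -Wait
# Confirm the driver registered with Windows ODBC
Get-OdbcDriver -Name "*PostgreSQL*"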

Second, create your PostgreSQL instance within your GCP project.  This was super simple: click, click, click, done.

2018-04-11_16-08-34.png
2018-04-11_16-09-21.png
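If you'd rather script it than click through the console, the Cloud SDK can do the same thing.  This is only a sketch; the instance name, machine size, and region are placeholders.

# Minimal sketch of the console steps above, using the Cloud SDK.
# Instance name, machine size, and region are placeholders.
gcloud sql instances create cm-postgres --database-version=POSTGRES_9_6 --cpu=1 --memory=3840MiB --region=us-central1
gcloud sql users set-password postgres --instance=cm-postgres --password="<choose-a-strong-password>"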

Next, I secured access between my PostgreSQL instance and the VM instance.
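One straightforward way to handle that step is to whitelist the VM's external IP as an authorized network on the Cloud SQL instance.  Here's a rough sketch from the Cloud SDK; the instance name and IP address are placeholders.

# Minimal sketch -- instance name and IP address are placeholders.
# Whitelists the VM's external IP so it can reach the Cloud SQL instance.
gcloud sql instances patch cm-postgres --authorized-networks=203.0.113.10/32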

After that, you need to create a schema for your Content Manager database; Content Manager will throw an error on screen if one isn't already prepared.  To resolve that, you'll need to create the schema manually within the database.  I did this via the Cloud Shell, but you could do it locally with the Cloud SDK (or any database management tool capable of managing a PostgreSQL instance).

2018-04-11_16-16-29.png
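For reference, here's roughly what that looks like; the instance, database, schema, and user names are all placeholders, so substitute whatever your Content Manager dataset expects.

# Minimal sketch, run from Cloud Shell (or anywhere with the Cloud SDK and psql available).
# Instance, database, schema, and user names are placeholders.
gcloud sql connect cm-postgres --user=postgres
# ...then at the psql prompt, switch to the Content Manager database and create the schema:
#   \c contentmanager
#   CREATE SCHEMA cmschema AUTHORIZATION cmuser;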

The only drawback to this cloud instance of PostgreSQL is that there is no GIS support...

2018-04-11_16-18-34.png

Now that my Content Manager instance is fully in the cloud I can leverage all of the really cool features of the GCP stack!

Automating movement to cheaper storage

Now that I have an Azure Blob store configured, I want to have documents moved there once they haven't been accessed for a while.  I've previously shown how to do this manually within the client, but now I'll show how to automate it.

The script is very straightforward and follows these steps:

  1. Find the target store in CM
  2. Find all documents to be moved
  3. Move each document to the target store

Here's an implementation of this in PowerShell:

Clear-Host
Add-Type -Path "D:\Program Files\Hewlett Packard Enterprise\Content Manager\HP.HPTRIM.SDK.dll"
$LocalStoreName = "Main Document Store"
$AzureStoreName = "Azure Storage"
$SearchString = "store:$($LocalStoreName) and accessedOn<Previous Year"
$Database = New-Object HP.HPTRIM.SDK.Database
$Database.Connect()
#fetch the store and exit if missing
$Tier3Store = $Database.FindTrimObjectByName([HP.HPTRIM.SDK.BaseObjectTypes]::ElectronicStore, $AzureStoreName)
if ( $Tier3Store -eq $null ) {
    Write-Error "Unable to find store named '$($AzureStoreName)'"
    exit
}
#search for records eligible for transfer
$Records = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch -ArgumentList $Database, ([HP.HPTRIM.SDK.BaseObjectTypes]::Record)
$Records.SearchString = $SearchString
Write-Host "Found $($Records.Count) records"
$x = 0
#transfer each record
foreach ( $Result in $Records ) 
{
    $Record = [HP.HPTRIM.SDK.Record]$Result
    $Record.TransferStorage($Tier3Store, $true)
    Write-Host "Record $($Record.Number) transferred"
    $x++
}

I ran it to get the results below.  I forced it to stop after the first record for demonstration purposes, but you should get the idea. 

2017-12-07_21-11-57.png