Workflow Applications Using Fortanix ACI

NOTE
Please download the Python script for use with this example and unzip the tar file into your working directory. It is best to use the scripts provided in the tar file instead of copying the scripts outlined in the example below to avoid potential formatting issues.

workflow-scripts.tar.gz

Disclaimer: The Fortanix CCM features - Workflows, Datasets, and App Configurations - are only enabled for customers with an "Enterprise" license.

1.0 Introduction

Workflow graphs are maps that show how generic applications are connected to datasets and other generic applications. These are collaborative objects where multiple users can provide their own objects and approvals. For more details refer to the User Guide: Workflows.

This article describes how to create and deploy Fortanix CCM Workflows using a simple data processing example.

1.1 A Research Portal for Patient Data

This example demonstrates a Fortanix CCM workflow with Python scripts deployed securely using Fortanix Enclave OS. The Workflow reports the most common conditions found in patient data.

For input data, a synthetic file containing the medical history of fictional patients is used. The data used for this example is available from https://synthea.mitre.org/downloads.

File: 1K Sample Synthetic Patient Records, CSV[mirror]: 9 MB

This is made available via:
Jason Walonoski, Mark Kramer, Joseph Nichols, Andre Quina, Chris Moesel, Dylan Hall,
Carlton Duffett, Kudakwashe Dube, Thomas Gallagher, Scott McLachlan,
Synthea: An approach, method, and software mechanism for generating synthetic patients
and the synthetic electronic health care record, Journal of the American Medical
Informatics Association, Volume 25, Issue 3, March 2018,
Pages 230–238, https://doi.org/10.1093/jamia/ocx079

The input and output datasets represent patient information collected by a different healthcare provider and contain the following elements:

  • Patient demographic information
  • Patient conditions
  • Patient encounters
  • Patient medications

2.0 Prerequisites

  • Set up a default config for Amazon Web Services (AWS) with your Access Key/Secret Key.
  • Run the following command to install Boto:
    sudo apt-get install python3-boto3
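The presign.py helper in the tar file relies on a named profile in your AWS configuration. As a quick sanity check, the profiles defined in an AWS-credentials-style file can be listed with the standard library alone. This is a hypothetical convenience sketch, not part of the provided scripts:

```python
#!/usr/bin/env python3
# Hypothetical helper (not part of workflow-scripts.tar.gz): list the
# profiles defined in an AWS credentials file, so you can confirm that the
# "default" profile used by presign.py exists before running it.
import configparser
import os

def list_profiles(credentials_text):
    """Return the profile names defined in AWS-credentials-style INI text."""
    parser = configparser.ConfigParser()
    parser.read_string(credentials_text)
    return parser.sections()

if __name__ == "__main__":
    path = os.path.expanduser("~/.aws/credentials")
    if os.path.exists(path):
        with open(path) as f:
            print(list_profiles(f.read()))
    else:
        print("No AWS credentials file found at", path)
```

If "default" (and any upload/download profile you plan to pass to presign.py) does not appear in the output, run aws configure first.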

3.0 Create CCM Workflow

3.1 Sign up and Log in

  1. Sign in to Fortanix CCM using the URL https://ccm.fortanix.com/.
    Figure 1: Sign up and Log in
    For more details, refer to the User's Guide: Logging In (https://support.fortanix.com/hc/en-us/articles/360034373551-User-s-Guide-Logging-in#SignUp).

  2. Once you have logged in to the Fortanix CCM user interface, select an account. If you do not have an account, create a new one.

3.2 Invite Users to Join CCM

An Administrator invites users who are Data Owners, App Owners, and Output Owners to join a CCM account and create a workflow graph.

To invite users, click the Users icon. On the Users page, click INVITE USER to invite new users to the account.

Figure 2: Invite Users

For more details, refer to https://support.fortanix.com/hc/en-us/articles/360043093352-User-s-Guide-Invite-Users.

The new users (Data Owners and App Owners) must join the CCM account and create Datasets and App Configs as described in the following sections.

3.3 Create Input and Output Datasets

Datasets are definitions containing the location and access credentials of data; they allow the Enclave OS in the workflow to download and upload data. In this example, an AWS S3 bucket is used to:

  • Store an encrypted input file that will be downloaded and decrypted by the Enclave OS.
  • Receive an encrypted output file uploaded using the credentials provided in the dataset.

To follow along, create an S3 bucket with a directory that is accessible using a URL.

3.3.1 Input User (Data Provider)

Consider that the Data Owner user has access to sensitive information and wants to allow an Application Owner to process this information.
This sensitive data is stored in a conditions.csv file.

In this example, the file is encrypted, uploaded to a storage solution (AWS S3), and a dataset is configured with credentials and an encryption key for enclave access or processing:

  1. If you haven't already, download the workflow-scripts.tar.gz file and extract it:
    tar xvfz workflow-scripts.tar.gz
  2. Obtain a copy of the CSV sample data (also available here: https://synthea.mitre.org/downloads):
    wget https://synthetichealth.github.io/synthea-sample-data/downloads/synthea_sample_data_csv_apr2020.zip
    unzip synthea_sample_data_csv_apr2020.zip
    You will only use the file csv/conditions.csv. To protect this file, generate a key locally, encrypt the file, and store the key securely in a KMS:
    1. Run the following command to generate an encryption key:
      xxd -u -l 32 -p /dev/random | tr -d '\n' > ./key.hex
    2. Use aes_256_gcm.py script included in the tar file that you downloaded. Run the following command to encrypt the file:
      ./workflow_scripts/aes_256_gcm.py enc -K ./key.hex -in ./csv/conditions.csv -out ./conditions.csv.enc
  3. Run the following command to upload the encrypted file to a secure storage location such as AWS S3:
    aws s3 --profile upload cp ./conditions.csv.enc <s3-directory>
  4. Generate a pre-signed URL to access the file. A pre-signed URL is used here to avoid embedding the whole AWS SDK in the example.
    Use the presign.py script included in the tar file and run the following command:
    ./presign.py default download <s3-directory> conditions.csv.enc 86400
    Where <s3-directory> is the directory of your S3 bucket.
    As an example, the output from the above command will be as shown below:
    https://fortanix-pocs-data.s3.amazonaws.com/conditions.csv.enc?AWSAccessKeyId=<key>&Signature=PcpH99nszG2%2Fv85z4IbgwgVDywc%3D&Expires=1613817035
    The output consists of two parts:
    • The location - https://fortanix-pocs-data.s3.amazonaws.com/conditions.csv.enc
    • Query parameters: AWSAccessKeyId=<key>&Signature=PcpH99nszG2%2Fv85z4IbgwgVDywc%3D&Expires=1613817035
      The query parameters must be base64 encoded for the dataset definition using the following command:
      echo -n 'AWSAccessKeyId=<key>&Signature=PcpH99nszG2%2Fv85z4IbgwgVDywc%3D&Expires=1613817035' | base64
    • The output will be as follows:
      QVdTQWNjZXNzS2V5SWQ9QUtJQVhPNU42R0dOQ05WMzUzV1MmU2lnbmF0dXJlPVBjcEg5OW5zekcyJTJGdjg1ejRJYmd3Z1ZEeXdjJTNEJkV4cGlyZXM9MTYxMzgxNzAzNQ==
    At this point, users accessing the URL above with the full query string parameters will be able to download the encrypted file until it expires in 1 day. If the URL expires, the dataset must be updated with new query parameters.
    NOTE
    If you access the URL without the string following '?', you will receive a 403 Forbidden error.
    Hence, treat the query parameters as access credentials.
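The provided aes_256_gcm.py script is the recommended tool for the encryption step above. For reference, a minimal sketch of the on-disk format it is expected to share with the enclave-side utils.py (a 12-byte nonce, then the ciphertext, then a 16-byte GCM tag, using AES-256-GCM from the cryptography package) could look like the following. The function names here are illustrative, not the script's actual interface:

```python
#!/usr/bin/env python3
# Sketch of the encrypted-file format used in this example:
#   nonce (12 bytes) || ciphertext || GCM tag (16 bytes)
# This mirrors encrypt_buffer()/decrypt_buffer() in the provided utils.py;
# use the provided aes_256_gcm.py for the actual workflow.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

NONCE_SIZE = 12  # bytes, GCM-recommended nonce length
TAG_SIZE = 16    # bytes, full GCM authentication tag

def encrypt(key, plaintext):
    nonce = os.urandom(NONCE_SIZE)
    encryptor = Cipher(algorithms.AES(key), modes.GCM(nonce)).encryptor()
    ciphertext = encryptor.update(plaintext) + encryptor.finalize()
    return nonce + ciphertext + encryptor.tag

def decrypt(key, blob):
    nonce, tag = blob[:NONCE_SIZE], blob[-TAG_SIZE:]
    decryptor = Cipher(algorithms.AES(key), modes.GCM(nonce, tag)).decryptor()
    return decryptor.update(blob[NONCE_SIZE:-TAG_SIZE]) + decryptor.finalize()
```

With the key read via bytes.fromhex(open("key.hex").read().strip()), encrypt() on the contents of csv/conditions.csv would produce a file that decrypt_buffer() in utils.py can open.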

Perform the following steps to create an input dataset:

  1. Click the Datasets icon in the CCM left panel and click CREATE DATASET to create a new dataset.
    Figure 3: Create a New Dataset
  2. In the Create new dataset form, enter the following details:
    • Name – the dataset name. For example: Conditions Data.
    • Description (optional) – the dataset description. For example: Patients with associated conditions.
    • Labels (optional) – attach one or more key-value labels to the dataset. For example: Key: Location and Value: East US.
    • Location – the AWS S3 URL where data can be accessed. For example: https://fortanix-pocs-data.s3.amazonaws.com/conditions.csv.enc
    • Long Description (optional) – enter the content in GitHub-flavoured Markdown format. You can also use the Fetch Long Description button to get the Markdown file from an external URL.
      Figure 4: Fetch Long Description Dialog Box
      The following is the sample long description in Markdown format:
      - Strikethrough Text
      ~~It is Strikethrough test..~~

      - Blockquote Text
      > Test Blockquote.

      - Bold
      **A sample Description.**

      - Italic
      *It is Italics*

      - Bold and Italic
      ***Bold and Italics text***

      - Link
      This is [Website](https://www.fortanix.com/)?
    • Credentials – the credentials needed to access the data. The credentials must be in the correct JSON format and consist of:
      • Query parameters that were base64 encoded.
      • The key that was used to encrypt the file.
      {
        "query_string": "<my-query-string>",
        "encryption": {
          "key": "<my-key>"
        }
      }
      For example:
      {
        "query_string": "QVdTQWNjZXNzS2V5SWQ9QUtJQVhPNU42R0dOQ05WMzUzV1MmU2lnbmF0dXJlPVBjcEg5OW5zekcyJTJGdjg1ejRJYmd3Z1ZEeXdjJTNEJkV4cGlyZXM9MTYxMzgxNzAzNQ==",
        "encryption": {
          "key": "63F0E4C07666126226D795027862ACC5848E939881C3CFE8CB3EB47DD7B3D24A"
        }
      }
      TIP
      Before saving the dataset, it is a good idea to verify that the JSON is correct. After saving the dataset you will not be able to view the credentials and access to data may fail. Any online JSON formatting tool can be used to validate that the JSON is correct.
      NOTE
      • The credentials are passed as text only while creating the dataset, over an HTTPS connection.
      • They are then stored in a KMS (Fortanix Data Security Manager) and are accessible only to approved enclaves.
      • Not even the Data Owner can retrieve the credentials.
    Figure 5: Create Input Dataset
  3. Click the CREATE DATASET button to create the input dataset.

3.3.2 Output User (Data Receiver)

Once the data has been received by the Enclave, the user application will run within the Enclave and generate output data (processed data). This data should be encrypted (using your key) before being uploaded to an untrusted store. This is achieved by defining an output dataset to be used by the workflow.
Perform the following steps:

  1. Run the following command to generate an encryption key:
    xxd -u -l 32 -p /dev/random | tr -d '\n' > ./key_out.hex
  2. Use the presign.py script included in the tar file. Run the following command to generate a pre-signed URL for the upload:
    ./presign.py default upload <s3-directory> conditions_output.csv.enc 86400
    Where <s3-directory> is the directory of your S3 bucket.
    As an example, the output of the above command will be as shown below:
    https://fortanix-pocs-data.s3.amazonaws.com/conditions_output.csv.enc?AWSAccessKeyId=<key>&Signature=HFvhxaiKY0cGR9XqgGLp5zcAWac%3D&Expires=1613817880
    The output consists of two parts:
    • The location: https://fortanix-pocs-data.s3.amazonaws.com/conditions_output.csv.enc
    • Query parameters: AWSAccessKeyId=<key>&Signature=HFvhxaiKY0cGR9XqgGLp5zcAWac%3D&Expires=1613817880
      The query parameters must be base64 encoded for the dataset definition using the following command:
      echo -n 'AWSAccessKeyId=<key>&Signature=HFvhxaiKY0cGR9XqgGLp5zcAWac%3D&Expires=1613817880' | base64
    • The output will be as follows:
      QVdTQWNjZXNzS2V5SWQ9QUtJQVhPNU42R0dOTk1SWFZLUEEmU2lnbmF0dXJlPUhGdmh4YWlLWTBjR1I5WHFnR0xwNXpjQVdhYyUzRCZFeHBpcmVzPTE2MTM4MTc4ODA=
  3. Create an output dataset with the following sample values:
    • Name – the dataset name. For example: Conditions processing output.
    • Description (optional) – the dataset description.
    • Labels (optional) – attach one or more key-value labels to the dataset. For example: Key: Location and Value: East US.
    • Location – the URL where data can be accessed. For example: https://fortanix-pocs-data.s3.amazonaws.com/conditions_output.csv.enc
    • Long Description (optional) – enter the content in GitHub-flavoured Markdown format. You can also use the Fetch Long Description button to get the Markdown file from an external URL.
      Figure 6: Fetch Long Description Dialog Box
      The sample long description in Markdown format is the same as shown for the input dataset in Section 3.3.1.
    • Credentials – the credentials needed to access the data. The credentials must be in the correct JSON format and consist of:
      • Query parameters that were base64 encoded.
      • The key that was used to encrypt the file.
      {
        "query_string": "<my-query-string>",
        "encryption": {
          "key": "<my-key>"
        }
      }
      For example:
      {
        "query_string": "QVdTQWNjZXNzS2V5SWQ9QUtJQVhPNU42R0dOTk1SWFZLUEEmU2lnbmF0dXJlPUhGdmh4YWlLWTBjR1I5WHFnR0xwNXpjQVdhYyUzRCZFeHBpcmVzPTE2MTM4MTc4ODA=",
        "encryption": {
          "key": "63F0E4C07666126226D795027862ACC5848E939881C3CFE8CB3EB47DD7B3D24A"
        }
      }
      TIP
      Before saving the dataset, it is a good idea to verify that the JSON is correct. After saving the dataset you will not be able to view the credentials and access to data may fail. Any online JSON formatting tool can be used to validate that the JSON is correct.
      NOTE
      • The credentials are passed as text only while creating the dataset, over an HTTPS connection.
      • They are then stored in a KMS (Fortanix Data Security Manager) and are accessible only to approved enclaves.
      • Not even the Data Owner can retrieve the credentials.

Figure 7: Create Output Dataset
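The Credentials JSON for either dataset can be assembled mechanically from the generated key file and the pre-signed query string. The following is a hypothetical convenience sketch, not part of the provided scripts:

```python
#!/usr/bin/env python3
# Hypothetical helper: build the dataset "Credentials" JSON from the hex key
# generated with xxd and the query-string part of a pre-signed S3 URL.
import base64
import json

def make_credentials(key_hex, query_string):
    """Return the Credentials JSON expected by the dataset form."""
    return json.dumps({
        "query_string": base64.b64encode(query_string.encode()).decode("ascii"),
        "encryption": {"key": key_hex.strip()},
    }, indent=2)
```

For example, make_credentials(open("key_out.hex").read(), "AWSAccessKeyId=<key>&Signature=...&Expires=...") prints the same structure shown above, with the query parameters already base64 encoded.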

3.4 ACI Application

3.4.1 Create a General Purpose Python Docker Image

Create a Docker image that will run arbitrary protected Python code. The following files are used; they are included in the tar file provided with this example:

Dockerfile:

FROM python:3.6

RUN apt-get update && apt-get install -y python3-cryptography python3-requests python3-pandas
RUN mkdir -p /opt/fortanix/enclave-os/app-config/rw
RUN mkdir -p /demo/code /demo/input

COPY ./start.py ./utils.py ./describe.py /demo/

CMD ["/demo/start.py"]

start.py: This file is the main entry point into the application.

#!/usr/bin/python3
  
import os
import utils
import hashlib
from subprocess import PIPE, run

def main():
    input_folder="/opt/fortanix/enclave-os/app-config/rw/"
    
    command = ["/usr/bin/python3", "/demo/describe.py"]
    
    # This downloads and decrypts all input data. File names are the object names from app config.
    for i in utils.read_json_datasets("input"):
        decrypted = utils.get_dataset(i)
        open(input_folder + i.name, 'wb').write(decrypted)
        
        # Add the file as input argument for our script
        command.append(input_folder + i.name)
        
    print("Running script")
    result = run(command, stdout=PIPE, stderr=PIPE, universal_newlines=True)
    
    # For simplicity uploading just stdout/stderr/returncode.
    utils.upload_result("output", result.returncode, result.stdout, result.stderr)
    
    print("Execution complete")

if __name__ == "__main__": main()

utils.py: This file contains the set of library functions.

#!/usr/bin/env python3

import os
import sys
import string
import json
import requests
import base64
import hashlib
from subprocess import PIPE, run

from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.ciphers import (
    Cipher, algorithms, modes
)

NONCE_SIZE=12
TAG_SIZE=16
KEY_SIZE=32

PORTS_PATH="/opt/fortanix/enclave-os/app-config/rw/"

def convert_key(key_hex):
    key=key_hex.rstrip()
    if len(key) != 64:
        raise Exception("Key file must be 64 bytes hex string for AES-256-GCM")

    if not all(c in string.hexdigits for c in key):
        raise Exception("Key must be a 64 character hex stream")

    return bytes.fromhex(key)

def encrypt_buffer(key, data):
    nonce=os.urandom(NONCE_SIZE)

    cipher = Cipher(algorithms.AES(key), modes.GCM(nonce), backend=default_backend())
    encryptor = cipher.encryptor()

    return nonce + encryptor.update(data.encode()) + encryptor.finalize() + encryptor.tag

def decrypt_buffer(key, data):
    tag=data[-TAG_SIZE:]
    nonce=data[:NONCE_SIZE]

    cipher = Cipher(algorithms.AES(key), modes.GCM(nonce, tag), backend=default_backend())
    decryptor = cipher.decryptor()

    return decryptor.update(data[NONCE_SIZE:len(data)-TAG_SIZE]) + decryptor.finalize()

class JsonDataset:
    def __init__(self, location, credentials, name):
        self.location = location
        self.credentials = credentials
        self.name = name

def read_json_datasets(port):
    ports=[]
    for folder in os.listdir(PORTS_PATH + port):
        subfolder=PORTS_PATH + port + "/" + folder
        if os.path.exists(subfolder + "/dataset"):
            credentials=json.load(open(subfolder + "/dataset/credentials.bin", "r"))
            location=open(subfolder + "/dataset/location.txt", "r").read()
            ports.append(JsonDataset(location, credentials, folder))

    return ports

def get_dataset(dataset):
    url = dataset.location + "?" + base64.b64decode(dataset.credentials["query_string"]).decode('ascii')
    r = requests.get(url, allow_redirects=True)
    r.raise_for_status()

    print("Retrieved dataset from location: " + dataset.location)
    key = convert_key(dataset.credentials["encryption"]["key"])
    return decrypt_buffer(key, r.content)

class RunResult:
    def __init__(self, returncode, stdout, stderr):
        self.returncode = returncode
        self.stdout = base64.b64encode(stdout.encode()).decode('ascii')
        self.stderr = base64.b64encode(stderr.encode()).decode('ascii')

def upload_result(port, returncode, stdout, stderr):
    result=RunResult(returncode, stdout, stderr)
    json_str=json.dumps(result.__dict__)

    for dataset in read_json_datasets(port):
        url = dataset.location + "?" + base64.b64decode(dataset.credentials["query_string"]).decode('ascii')
        key = convert_key(dataset.credentials["encryption"]["key"])

        print("Writing output to location: " + dataset.location)
        requests.put(url, encrypt_buffer(key, json_str))

describe.py: This file contains the custom code called by start.py.

#!/usr/bin/python3

import pandas as pd
import sys

for i in sys.argv[1:]:
    df_do = pd.read_csv(i)
    print("Dataset: " + i + "\n")
    print(df_do['DESCRIPTION'].describe())
    print("")

The standard docker build and docker push commands must be used to build your Docker image and push it to your registry. For example:

docker build -t <my-registry>/simple-python-sgx .
docker push <my-registry>/simple-python-sgx
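Before pushing the image, the custom code in describe.py can be smoke-tested locally against the decrypted CSV. A minimal sketch of the summary it computes on the DESCRIPTION column, assuming pandas is installed locally and using a tiny hypothetical sample in place of csv/conditions.csv:

```python
#!/usr/bin/env python3
# Local smoke test for the logic in describe.py: summarize the DESCRIPTION
# column of a conditions CSV, as the enclave application will do.
import io
import pandas as pd

def summarize(csv_text):
    """Return the describe() summary of the DESCRIPTION column."""
    df = pd.read_csv(io.StringIO(csv_text))
    return df['DESCRIPTION'].describe()

# Tiny stand-in for csv/conditions.csv (hypothetical sample rows):
sample = ("PATIENT,DESCRIPTION\n"
          "p1,Viral sinusitis (disorder)\n"
          "p2,Viral sinusitis (disorder)\n"
          "p3,Otitis media\n")
print(summarize(sample))
```

Running python3 describe.py csv/conditions.csv on the real decrypted file produces the count/unique/top/freq summary shown in Section 3.7.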

3.4.2 Create an ACI Application and Image

This section describes how to register the docker image in CCM and convert it to an SGX format.

  1. To create an ACI application using the Fortanix CCM UI, click the Applications icon, and on the Applications page click +ADD APPLICATION to create an application.
    Figure 8: Create an ACI Application
    For more details refer to https://support.fortanix.com/hc/en-us/articles/360043527431-User-s-Guide-Add-and-Edit-an-Application#add-aci-application-0-8.
    Figure 9: Create an ACI Application
  2. Create an image of the application.
    Figure 10: Create an Application Image
    For more details refer to https://support.fortanix.com/hc/en-us/articles/360043529411-User-s-Guide-Create-an-Image#create-image-for-aci-application-0-5.
  3. From the Fortanix CCM Tasks tab, fetch the domain whitelisting tasks:
    Figure 11: Fetch Tasks for Approval
  4. Approve the tasks.
    Figure 12: Approve Domain Whitelist Task
    Figure 13: Approve Build Whitelist Task

3.5 Create Application Configuration

The Docker image picks up the Python script through an Application Configuration, which defines the ports.

Perform the following steps to create an Application Configuration:

  1. Click the Applications tab in the CCM left panel and from the left menu select Configurations.
  2. Click ADD CONFIGURATION to add a new configuration.
    Figure 14: Create App Configuration
  3. In the ADD APPLICATION CONFIGURATION window, fill in the following:
    1. Image – select the <my-registry>/simple-python-sgx:latest application image for which you want to create a configuration.
      Where <my-registry> is the location of your Docker registry.
    2. Name and Description – Enter a name and description of the configuration.
    3. Ports – Enter the connections to be used in the workflow. You can add multiple ports depending on how the connection should work. For example: “input”, “output”, “heartbeat”, and so on.
    4. Labels – attach one or more key-value labels to the app config.
    5. Configuration items – These are key-value pairs used for configuring the app.
      • For Enclave OS applications, the Key is the path of the file that contains the Value for configuring an app. 
    NOTE
    Fortanix allows only files in the path /opt/fortanix/ for ACI applications.
    Figure 15: Save Configuration
  4. Click SAVE CONFIGURATION to save the configuration.
    Figure 16: Configuration Saved

3.6 Create a Workflow

Perform the following steps to create a workflow:

  1. Click the Workflows icon in the CCM left panel.
  2. On the Workflows page, click +WORKFLOW to create a new workflow.
    Figure 17: Create Workflow
  3. In the CREATE WORKFLOW dialog, enter the Workflow Name and Description (optional). Click CREATE to go to the Workflow graph.
  4. Add an app to the Workflow graph. To do this, drag the App icon into the graph area and click +APPLICATION. In the ADD APPLICATION dialog, the App Owner must select an existing application image, for example <my-registry>/simple-python-sgx:latest, from the list of available application images.
    Where <my-registry> is the location of your registry.
  5. For the selected application image, the App Owner must create a new app config or add an existing one.
  6. Click SELECT APPLICATION to select the application.
    Figure 18: Add App to Workflow
  7. Click SAVE AS DRAFT to save the draft.
    Figure 19: Save Draft
  8. To access the draft Workflow, click the Draft tab in the Workflows left menu.
  9. Add an input and output dataset to the Workflow graph. To add a dataset, drag the Dataset icon into the graph area and click +DATASET. In the DATASET dialog, the Data Owner must select from the existing datasets that were created in the sections above.
    Figure 20: Add Input Dataset to Workflow
    Figure 21: Add Output Dataset to Workflow
  10. Create connections between the applications and the input/output datasets. To do this, drag the Input Dataset connection point and join it to the Application connection point. In the SELECT PORTS window, select the Target Port as “input”. Repeat the same to connect the Application to the Output Dataset, selecting the Target Port as “output”.
    Figure 22: Create Connection
  11. When the Workflow is complete, click the REQUEST APPROVAL button to start the approval process for the Workflow.
    Figure 23: Request Workflow Approval
    WARNING
    When a draft Workflow is submitted for approval, it will be removed from the drafts list and the user will no longer be able to edit it directly once it is in a “pending” or "approved" state.
  12. The Workflow remains in a “pending” state until all the users approve it. In the Pending tab, click SHOW APPROVAL REQUEST to approve a Workflow.
    Figure 24: Workflow Pending Approval
  13. In the APPROVAL REQUEST - CREATE WORKFLOW dialog, click APPROVE to approve the workflow or DECLINE to reject it.
    Figure 25: Approve the Workflow
    NOTE
    • A user can also approve/decline a Workflow from the CCM Tasks tab.
    • Notice that the users who have approved the workflow have a green tick against their icon.
  14. All the users of a Workflow must approve it to finalize it. If any user declines, the Workflow is rejected. When all the users approve the Workflow, it is deployed:
    1. CCM configures apps to access the datasets.
    2. CCM creates the Workflow Application Configs.
    3. CCM returns the list of hashes needed to start the apps.

3.7 Run the Application

After a Workflow is approved by all the users, the Applications will have the Workflow Application Configurations provided to them. This configuration has information such as which Datasets or Apps they are connected to, any user-provided files or values to be provided within an enclave, and so on.

We provide a configuration to applications using an identifier passed as an input argument.

This identifier is a SHA-256 digest (sha256sum) of the items from the configuration and workflow that need to be secured.

Fortanix CCM also embeds this identifier inside the certificates it issues, so that the KMS can determine which configuration is in use before allowing access to credentials.

It embeds this inside a subject alternative name:

<identifier>.<mrenclave>.id.fortanix.cloud

With the identifier above, the KMS that stores the dataset credentials will authenticate and give credentials only to applications that present a proper certificate. When the application starts, CCM will keep track of which applications are allowed to access which configurations using the identifier.

Perform the following steps to view the Application Identifier:

  1. Click the application in the approved Workflow graph.
    Figure 26: Workflow Application Detailed View
  2. In the detailed view of the Workflow application, copy the value of Runtime configuration hash. This ID is used to run the application.
    Figure 27: Copy the App Identifier
  3. To deploy the ACI application using the Azure portal, perform the steps described in Section 2 - Deploy Confidential ACI Group Using Azure Portal. In the "Custom deployment" screen, for the App Config Id field, enter the Application Identifier copied in Step 2 above.
    Figure 28: Azure ACI Deployment
  4. After the App Owner starts the application with the application config identifier, the Data Output Owner can view the output using the following steps:
    1. Run the following command to download the output file:
      aws s3 --profile download cp s3://<s3-directory>/conditions_output.csv.enc .
      For example:
      aws s3 --profile download cp s3://fortanix-pocs-data/conditions_output.csv.enc .
    2. Use the aes_256_gcm.py script provided in the tar file to decrypt the file:
      ./aes_256_gcm.py dec -K ./key_out.hex -in ./conditions_output.csv.enc -out ./output.txt
      $ cat output.txt | jq .
      {
        "returncode": 0,
        "stdout": "RGF0YXNldDogL29wdC9mb3J0YW5peC9lbmNsYXZlLW9zL2FwcC1jb25maWcvaW5wdXQvY3hoY2Z4ZHZsCgpjb3VudCAgICAgICAgICAgICAgICAgICAgICAgICAgIDgzNzYKdW5pcXVlICAgICAgICAgICAgICAgICAgICAgICAgICAgMTI5CnRvcCAgICAgICBWaXJhbCBzaW51c2l0aXMgKGRpc29yZGVyKQpmcmVxICAgICAgICAgICAgICAgICAgICAgICAgICAgIDEyNDgKTmFtZTogREVTQ1JJUFRJT04sIGR0eXBlOiBvYmplY3QKCg==",
        "stderr": ""
      }
      The file now has the expected output:
      $ cat output.txt | jq -r .stdout | base64 -d
      Dataset: /opt/fortanix/enclave-os/app-config/input/cxhcfxdvl

      count 8376
      unique 129
      top Viral sinusitis (disorder)
      freq 1248
      Name: DESCRIPTION, dtype: object
  5. When the App Owner starts the application with the application config identifier, the following sequence occurs:

    1. Applications will request an attestation certificate from the NodeAgent with the identifier as part of the report data.
    2. The application requests an application certificate from the NodeAgent.
    3. The Fortanix CCM verifies that the application is allowed to access the configuration.
    4. The application requests from Fortanix CCM its configuration by providing its certificate provisioned above as an authentication mechanism.
    5. The CCM does certificate authentication, extracts the application identifier from the certificate, and sends back the configuration corresponding to that identifier.
    6. The application verifies and applies the configuration hash.
    7. The application gets the credentials from URLs in the config.
    8. The application authenticates and reads or writes data from the datasets.
