Modernizing Scripts into the Cloud

David Allan
6 min read · Aug 23, 2022


Bash scripts contain a set of instructions written in the Bash command language and are executed by a Unix shell interpreter; batch scripts on Windows provide similar capabilities. We have all used these scripts to automate tasks that are time consuming and need to be performed over and over again. Compared with modern computing, bash scripts are "old fashioned" since all interaction with the user happens through the command line interface. But there are tonnes of existing scripts and product interfaces that require command line tools: running a MAXL script to load into Essbase, command line database export utilities, orchestration of existing processes and so on. What to do, rewrite everything? That is not sensible in every case. So how do we modernize these scripts and integrate them with cloud based systems?

Looking back at Lake Tahoe

Let’s see how that can be done using the run command feature in OCI, provided by the Compute Instance Run Command plugin that is managed by the Oracle Cloud Agent software. With this we get an easy to use and manage REST API to run our scripts and command line tools, whether on UNIX or Windows. Below you can see a Windows compute instance in OCI that we can execute scripts on; this instance is created by and under the complete control of you the customer, with whatever dependent scripts and installations are needed.

OCI Console — compute instance, Oracle Cloud Agent

It’s possible to use the OCI Console to create a command, or to use the REST API or any of the supported language SDKs:

Run a command from OCI Console in Compute Instance

For the remainder of this post we will use the REST Task in OCI Data Integration, which allows us to capture the command to execute, poll and wait for its completion, and also cancel or delete the command. The command can be parameterized so we can use it for many different use cases. We can also schedule it to run at a regular cadence, so we could load Essbase on a schedule with this approach, export using command line tools and so on.

Create Command on OCI Compute

See the documentation here for creating a command on the compute instance. We will use this REST endpoint in an OCI Data Integration REST task; the first step is defining the POST method and the URL.

Create a REST task in OCI Data Integration

Note above that the endpoint has the region defined as us-ashburn-1. We can make this more generic so that it can invoke on any region by defining a parameter REGION with a default value:

Parameterize the URL to make it reusable
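As a rough sketch outside OCI Data Integration, the parameterized URL amounts to a simple string substitution over the region; the path is the Compute Instance Agent API endpoint used throughout this post:

```python
# Sketch: build the create-command endpoint URL from a REGION parameter,
# defaulting to us-ashburn-1 as in the screenshot above.
def build_create_command_url(region: str = "us-ashburn-1") -> str:
    return f"https://iaas.{region}.oraclecloud.com/20180530/instanceAgentCommands"

print(build_create_command_url())
print(build_create_command_url("uk-london-1"))
```

In the REST task itself this substitution is done for you when the URL references ${REGION}.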

On the Headers tab, we need to define the Content-Type and Accept properties with their respective values:

Content-Type: application/json
Accept: application/json

Below you can see where the headers are defined in the REST task:

Define the headers in REST task
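For reference, these are the same two headers you would pass in any HTTP client; a minimal sketch:

```python
# The two headers the REST task defines, as an HTTP client would send them.
headers = {
    "Content-Type": "application/json",  # the request body is JSON
    "Accept": "application/json",        # we expect a JSON response back
}
print(headers)
```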

Define the create command payload

The payload for the create command has a number of parameters, including the compartment where the command is created, the name for the job, the instance to run the job on and the script to run. The output can be saved to Object Storage; below we are simply reporting to standard output.

{
  "compartmentId": "${COMPARTMENT_ID}",
  "executionTimeOutInSeconds": 3600,
  "displayName": "${COMMAND_NAME}",
  "target": {
    "instanceId": "${INSTANCE_ID}"
  },
  "content": {
    "source": {
      "sourceType": "TEXT",
      "text": "${SCRIPT_TEXT}"
    },
    "output": {
      "outputType": "TEXT"
    }
  }
}

Here we can see where the payload is defined and the parameters we have defined to make this as generic as possible.

Define the request payload in the REST task
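To see what the REST task does with those ${...} parameters, here is a sketch of the same substitution in Python; the OCIDs, command name and script text below are placeholder values, not real identifiers:

```python
import json
from string import Template

# Template mirroring the create-command payload; each ${...} placeholder
# corresponds to a parameter defined on the REST task.
PAYLOAD_TEMPLATE = Template("""{
  "compartmentId": "${COMPARTMENT_ID}",
  "executionTimeOutInSeconds": 3600,
  "displayName": "${COMMAND_NAME}",
  "target": { "instanceId": "${INSTANCE_ID}" },
  "content": {
    "source": { "sourceType": "TEXT", "text": "${SCRIPT_TEXT}" },
    "output": { "outputType": "TEXT" }
  }
}""")

# Placeholder values for illustration only.
payload = PAYLOAD_TEMPLATE.substitute(
    COMPARTMENT_ID="ocid1.compartment.oc1..example",
    COMMAND_NAME="run-export-script",
    INSTANCE_ID="ocid1.instance.oc1..example",
    SCRIPT_TEXT="echo hello",
)
body = json.loads(payload)  # confirm the substituted payload is valid JSON
print(body["displayName"])
```

Note that SCRIPT_TEXT must be valid inside a JSON string, so a multi-line script needs its newlines and quotes escaped before substitution.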

Getting Status of Command

Extract the command id from the response of the create command above:

COMMAND_ID
CAST(json_path(SYS.RESPONSE_PAYLOAD, '$.id') AS String)

Below you can see we defined an expression named COMMAND_ID that we can reference in the status checking REST API, the COMMAND_ID is the “id” value in the response from the POST method above.

Define an expression to get the command id from the response
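The json_path expression has a direct equivalent in most languages; here is a sketch in Python against a trimmed, made-up response payload (the real create-command response carries many more fields):

```python
import json

# A trimmed, made-up create-command response for illustration.
response_payload = '{"id": "ocid1.instanceagentcommand.oc1..example", "lifecycleState": "ACCEPTED"}'

# Equivalent of CAST(json_path(SYS.RESPONSE_PAYLOAD, '$.id') AS String)
command_id = str(json.loads(response_payload)["id"])
print(command_id)
```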

Get the status of the command by using the COMMAND_ID

GET https://iaas.${REGION}.oraclecloud.com/20180530/instanceAgentCommands/#{COMMAND_ID}/status?instanceId=${INSTANCE_ID}

Below you can see where the polling REST endpoint is defined:

Define the polling status check REST endpoint

Canceling the Command

Cancel or delete the command using the COMMAND_ID expression and the REST endpoint, with the DELETE method:

DELETE https://iaas.${REGION}.oraclecloud.com/20180530/instanceAgentCommands/#{COMMAND_ID}

Below you can see where the DELETE command is defined and the REST endpoint for the cancel.

Define the cancel REST endpoint

Poll and Success Conditions

Finally we have to define the polling and success conditions.

We keep getting the status while the lifecycle state of the command is neither SUCCEEDED nor FAILED. Here is the polling condition:

CAST(json_path(SYS.RESPONSE_PAYLOAD, '$.lifecycleState') AS String) != 'SUCCEEDED' AND CAST(json_path(SYS.RESPONSE_PAYLOAD, '$.lifecycleState') AS String) != 'FAILED'

To determine whether the command succeeded, we check that the RESPONSE_STATUS is between 200 and 300 and that the lifecycle state is SUCCEEDED.

SYS.RESPONSE_STATUS >= 200 AND SYS.RESPONSE_STATUS <= 300 AND CAST(json_path(SYS.RESPONSE_PAYLOAD, '$.lifecycleState') AS String) == 'SUCCEEDED'
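Taken together, the polling and success conditions amount to a loop like the following sketch; get_status here is a stand-in for the GET status call above, replaying a canned sequence of responses rather than calling OCI:

```python
import time

# Canned (status_code, payload) responses standing in for the
# GET .../instanceAgentCommands/{id}/status call.
RESPONSES = iter([
    (200, {"lifecycleState": "IN_PROGRESS"}),
    (200, {"lifecycleState": "IN_PROGRESS"}),
    (200, {"lifecycleState": "SUCCEEDED"}),
])

def get_status():
    return next(RESPONSES)

# Polling condition: keep going while neither SUCCEEDED nor FAILED.
status, payload = get_status()
while payload["lifecycleState"] not in ("SUCCEEDED", "FAILED"):
    time.sleep(0.01)  # the REST task lets you configure the polling interval
    status, payload = get_status()

# Success condition: 2xx response status and lifecycleState == SUCCEEDED.
succeeded = 200 <= status <= 300 and payload["lifecycleState"] == "SUCCEEDED"
print(succeeded)
```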

Authentication and Policies

Ensure you have set up the REST task to use OCI for authentication; you can use either a workspace or an application as shown below.

You will also need policies such as the following to execute this from OCI Data Integration:

You can use resource principal policy statements for immediate effect:
allow any-user to manage instance-agent-command-family in compartment <compartment-name> where ALL {request.principal.type='disworkspace', request.principal.id='<workspace_ocid>'}
allow any-user to manage instance-agent-command-execution-family in compartment <compartment-name> where ALL {request.principal.type='disworkspace', request.principal.id='<workspace_ocid>'}

Orchestrating the Command

We can execute the REST task from within OCI Data Integration and also schedule it to run at a regular cadence. Below we have a task schedule which executes a REST task; the task schedule can also configure any parameters to specific values.

Schedule the command to run on cadence

The task can also be used in a pipeline, where you can add dependencies and notifications. The example below shows a pipeline that calls the script on the remote compute instance and then, if it fails, publishes a notification. You can see all of the output parameters for the REST task; these include STATUS, ERROR_MESSAGE and detailed information on the response status and payload.

Reuse an Existing Library

I’ve packaged this as code so you can easily recreate the task in your environment.

You can also find a Postman collection here with example tasks, including this one:

https://blogs.oracle.com/dataintegration/post/oci-rest-task-collection-for-oci-data-integration

Summary

Here we have seen how to modernize existing scripts in the cloud and automate tasks that are time consuming and need to be performed over and over again. Hopefully this gives you some good insight into running scripts, tools and commands on OCI. Let me know what you think.

For more information on OCI Data Integration and REST tasks, see the documentation:

https://docs.oracle.com/en-us/iaas/data-integration/using/rest-tasks.htm#rest-tasks

For information on running commands on an instance, see the documentation here.

Thanks for reading.


David Allan

Architect at @Oracle. The views expressed on this blog are my own and do not necessarily reflect the views of Oracle.