Provisioning resources in Oracle Cloud from the Azure DevOps side


INTRO

Over time, more and more companies are embracing the cloud in the way some of us predicted nearly 7 years ago: hybrid and multi-vendor.

This is an approach* to managing the lifecycle of resources in Oracle Cloud, from design all the way to building and maintaining all the stuff with tools and automation. Take a look at the git repo, whose README.md gives a bit more detail. The git repo is private; let me know and I’ll be more than happy to give you permissions.

(*) In response to my colleague/friend JavierRP, who asked me about this a few days ago. It took (because of my limitations) more than 12 hours of my weekend time to investigate, implement and blog this stuff, so please excuse the brevity and typos.

STEP 1: CREATE TERRAFORM FILES

In this particular use case I’m using the so-called OKIT (OCI Designer Toolkit), an open-source project that you can get here.

OKIT has an option to generate the Terraform files, and also includes others, such as generating Markdown documentation.

From the hamburger menu in the top left corner select [Terraform]; a zip file will be downloaded to your laptop.

STEP 2: CREATE GIT REPO

Create a git repo in your preferred provider and put the terraform files in it.
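The step above can be sketched from the command line. Everything in this snippet is illustrative: a dummy main.tf stands in for the real OKIT export, and the remote URL at the end is a placeholder, not from the post.

```shell
# Sketch: put the OKIT-generated Terraform files under version control.
# A dummy main.tf stands in for the files extracted from the OKIT zip.
repo=$(mktemp -d)
cd "$repo"
git init -q
echo 'variable "region" {}' > main.tf    # stand-in for the exported .tf files
git add main.tf
git -c user.email=you@example.com -c user.name=you \
    commit -q -m "Terraform files exported from OKIT"
git log --oneline
# then wire it to your provider of choice, e.g.:
#   git remote add origin https://github.com/<youruser>/<repo>.git
#   git push -u origin main
```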

STEP 3: CREATE AZURE DEVOPS PROJECT

Create a new project from your AZ DevOps organisation:

STEP 4: INSTALL AZURE PIPELINES TERRAFORM TASKS

Go to this link and install it in your AZ DevOps organisation:

STEP 5: CREATE STUFF ON THE ORACLE CLOUD SIDE

Under Identity Domains/Users, for the user that is going to be used for DevOps purposes, create a Customer Secret Key:

Write down the key and the secret for later.
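Under the hood, this key/secret pair plays the role of an AWS-style access key for the S3-compatible Object Storage API. As a sketch (the values below are made-up placeholders), they can also be fed to Terraform through the standard AWS environment variables instead of -backend-config options:

```shell
# The Customer Secret Key pair behaves like an AWS access key pair for
# the S3-compatible API. Both values below are made-up placeholders.
export AWS_ACCESS_KEY_ID="0a1b2c3d4e5f6789"       # the "key" from the console
export AWS_SECRET_ACCESS_KEY="s3cr3tFromConsole="  # the "secret", shown only once
echo "access key set: $AWS_ACCESS_KEY_ID"
```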

Create a private Object Storage bucket somewhere in the compartments hierarchy:

From Administration/Tenancy details, grab the Object Storage namespace:
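The namespace matters because it is part of the S3-compatibility endpoint the backend will point at. A quick sketch of how the endpoint URL is composed (the namespace here is an example value, not a real one):

```shell
# The S3-compat endpoint is built from the Object Storage namespace
# and the region. "frxheu3twp1b" is an example namespace.
NAMESPACE="frxheu3twp1b"
REGION="eu-madrid-1"
ENDPOINT="https://${NAMESPACE}.compat.objectstorage.${REGION}.oraclecloud.com/"
echo "$ENDPOINT"
```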

STEP 6: CREATE A SECUREFILE ENTRY

Add the private ppk file as a securefile in the library:

Locate the ppk of the OCI DevOps user and upload it to Library/SecureFiles:

Give your pipeline permissions to access this securefile:

STEP 7: CREATE A NEW PIPELINE

Create a new pipeline in the project:

Select the repo (in my case I have integrated the DevOps organisation with my GitHub account):

Select the repo with the tf files in it:

Create a file named backend.tf in the git repo with the following content (replace the parts in bold with the appropriate values):

terraform {
  backend "s3" {
    bucket                      = "bucketname"
    region                      = "eu-madrid-1"
    key                         = "IAC/<groupofresources>/terraform.tfstate"
    endpoint                    = "https://<namespace>.compat.objectstorage.eu-madrid-1.oraclecloud.com/"
    skip_region_validation      = true
    skip_credentials_validation = true
    skip_metadata_api_check     = true
    force_path_style            = true
  }
}

Review the pipeline code, paying attention to the name of the securefile; change it if needed:

# jmu 13/05/2023
# this is the first time in my life I use AZ DevOps
# I am setting up a self-hosted agent pool on my Mac, because I am waiting for Azure Support to give me quota to run builders...
trigger:
- main
#pool: mymac

steps:
# get securefile with the ppk in it
- task: DownloadSecureFile@1
  name: ppkfile
  displayName: 'Download ppkfile'
  inputs:
    secureFile: 'tef.pem'

# create tfvars file from variables
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      # creating the terraform.tfvars file in the appropriate folder on the fly from the pipeline variables 
      echo "fingerprint=\"$(FINGERPRINT)\"" > $(Agent.BuildDirectory)/s/terraform.tfvars
      echo "region=\"$(REGION)\"" >>  $(Agent.BuildDirectory)/s/terraform.tfvars
      echo "tenancy_ocid=\"$(TENANCY_OCID)\"" >>  $(Agent.BuildDirectory)/s/terraform.tfvars
      echo "compartment_ocid=\"$(COMPARTMENT_OCID)\"" >>  $(Agent.BuildDirectory)/s/terraform.tfvars
      echo "user_ocid=\"$(USER_OCID)\"" >>  $(Agent.BuildDirectory)/s/terraform.tfvars
      echo "private_key_path=\"$(ppkfile.secureFilePath)\"" >>  $(Agent.BuildDirectory)/s/terraform.tfvars
      cat $(Agent.BuildDirectory)/s/terraform.tfvars
      ls
      
- task: TerraformCLI@0
  inputs:
    command: 'init'
    commandOptions: '-backend-config="access_key=<XXXXXXXXX>" -backend-config="secret_key=<XXXXXXXXX>"'
    allowTelemetryCollection: true
- task: TerraformCLI@0
  inputs:
    command: 'plan'
    allowTelemetryCollection: true
- task: TerraformCLI@0
  inputs:
    command: 'apply'
    commandOptions: '-auto-approve'
    allowTelemetryCollection: true

Remember to replace the <XXXXXXXXX> values with the appropriate ones obtained in step 5.

Create tf variables in the pipeline’s [Variables] section (the typical variables you normally put in the terraform.tfvars file):

TENANCY_OCID: the OCID of the Oracle Cloud tenancy

REGION: the OC region friendly name

COMPARTMENT_OCID: the OCID of the compartment in which the stuff will be created

USER_OCID: the OCID of the OC user that is being used for DevOps purposes

FINGERPRINT: the fingerprint of the user’s API key

OCI_GO_SDK_DEBUG: for debugging purposes

TF_LOG: the Terraform log level (e.g. DEBUG), for troubleshooting
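These are the same values the Bash task above writes into terraform.tfvars; the translation can be reproduced locally like this (all values are fake placeholders):

```shell
# Reproduce locally what the pipeline's Bash task does: turn the
# pipeline variables into terraform.tfvars. Fake placeholder values.
TENANCY_OCID="ocid1.tenancy.oc1..exampleuniqueid"
REGION="eu-madrid-1"
FINGERPRINT="aa:bb:cc:dd"
tfvars=$(mktemp)
{
  echo "tenancy_ocid=\"${TENANCY_OCID}\""
  echo "region=\"${REGION}\""
  echo "fingerprint=\"${FINGERPRINT}\""
} > "$tfvars"
cat "$tfvars"
```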

STEP 8: CONFIGURE APPROVAL

Go to Project Settings/Agent Pools/Azure Pipelines:

Click on Approvals & Checks:

Add Approval:

Add Approver and click [Create] button:

Trigger the pipeline with a change or manually.

Check your email inbox and approve:

STEP 9: REVIEW PIPELINE EXECUTION

STEP 10: VERIFY STUFF CREATED ON THE ORACLE CLOUD SIDE

VCN and subnets created:

Route tables:

Etc:

tfstate file stored in the Object Storage bucket:

STEP 11: MODIFY STATE ON THE ORACLE CLOUD SIDE

Delete one of the subnets and trigger the pipeline again:

Verify that the pipeline does what it has to do and the subnet is recreated:

Et voilà (or eccolo qui / here it is / aquí está):


That’s all, hope it helps!! 🙂
