Multi-Environment Service Orchestrations

  • aster.cloud
  • October 17, 2022
  • 3 minute read

In a previous post, I showed how to use a GitOps approach to manage the deployment lifecycle of a service orchestration. This approach makes it easy to deploy changes to a workflow in a staging environment, run tests against it, and gradually roll out these changes to the production environment.

While GitOps helps manage the deployment lifecycle, it isn’t enough on its own. Sometimes you need to change the workflow itself before deploying it to different environments, which means you need to design workflows with multiple environments in mind.


For example, instead of hardcoding the URLs called from the workflow, you should substitute staging or production URLs depending on where the workflow is being deployed.

Let’s explore three different ways of replacing URLs in a workflow.

Option 1: Pass URLs as runtime arguments


 

In the first option, you define URLs as runtime arguments and use them whenever you need to call a service:

 

main:
  params: [args]
  steps:
    - init:
        assign:
          - url1: ${args.urls.url1}
          - url2: ${args.urls.url2}
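
Later steps can then call these URLs. As a minimal sketch of how the workflow might continue (the step names and result variables below are illustrative, not part of the original sample):

    - call_func1:
        # Call the first service using the URL passed at runtime
        call: http.get
        args:
          url: ${url1}
        result: resp1
    - call_func2:
        # Call the second service using the URL passed at runtime
        call: http.get
        args:
          url: ${url2}
        result: resp2
    - returnResults:
        return: ${resp2.body}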

 

You can deploy workflow1.yaml as an example:

 

gcloud workflows deploy multi-env1 --source workflow1.yaml

 

Run the workflow in the staging environment with staging URLs:

 

gcloud workflows run multi-env1 --data='{"urls":{"url1": "https://us-central1-projectid.cloudfunctions.net/func1-staging", "url2": "https://us-central1-projectid.cloudfunctions.net/func2-staging"}}'

 

And run the workflow in the prod environment with prod URLs:

 

gcloud workflows run multi-env1 --data='{"urls":{"url1": "https://us-central1-projectid.cloudfunctions.net/func1-prod", "url2": "https://us-central1-projectid.cloudfunctions.net/func2-prod"}}'

 

Note: These runtime arguments can also be passed when triggering with the API, client libraries, or scheduled triggers, but not when triggering with Eventarc.
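
For instance, the same arguments could be passed through the Workflows Executions API directly. A hedged sketch (PROJECT_ID is a placeholder and the JSON escaping is illustrative):

# Create an execution of multi-env1 with staging URLs via the REST API
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"argument": "{\"urls\": {\"url1\": \"https://us-central1-projectid.cloudfunctions.net/func1-staging\", \"url2\": \"https://us-central1-projectid.cloudfunctions.net/func2-staging\"}}"}' \
  "https://workflowexecutions.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/workflows/multi-env1/executions"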

Option 2: Use Cloud Build to deploy multiple versions


 

In the second option, you use Cloud Build to deploy multiple versions of the workflow with the appropriate staging and prod URLs replaced at deployment time.


Run setup.sh to enable required services and grant necessary roles.
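
The script’s exact contents aren’t shown here, but a setup of this kind typically boils down to something like the following sketch (the specific services and roles are assumptions; check setup.sh in the sample repository for the authoritative list):

# Enable the services used in this option
gcloud services enable workflows.googleapis.com cloudbuild.googleapis.com

# Allow the Cloud Build service account to deploy workflows
PROJECT_ID=$(gcloud config get-value project)
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" --format='value(projectNumber)')
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member "serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
  --role "roles/workflows.admin"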

Define a YAML (see workflow2.yaml for an example) that has placeholder values for URLs:

 

main:
  steps:
    - init:
        assign:
          - url1: REPLACE_url1
          - url2: REPLACE_url2

 

Define cloudbuild.yaml with a step that replaces the placeholder URLs and a step that deploys the workflow:

 

steps:
- id: 'replace-urls'
  name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: bash
  args:
    - -c
    - |
      sed -i -e "s~REPLACE_url1~$_URL1~" workflow2.yaml
      sed -i -e "s~REPLACE_url2~$_URL2~" workflow2.yaml
- id: 'deploy-workflow'
  name: 'gcr.io/cloud-builders/gcloud'
  args: ['workflows', 'deploy', 'multi-env2-$_ENV', '--source', 'workflow2.yaml']

 

Deploy the workflow in the staging environment with staging URLs:

 

gcloud builds submit --config cloudbuild.yaml --substitutions=_ENV=staging,_URL1="https://us-central1-projectid.cloudfunctions.net/func1-staging",_URL2="https://us-central1-projectid.cloudfunctions.net/func2-staging"

 

Deploy the workflow in the prod environment with prod URLs:

 

gcloud builds submit --config cloudbuild.yaml --substitutions=_ENV=prod,_URL1="https://us-central1-projectid.cloudfunctions.net/func1-prod",_URL2="https://us-central1-projectid.cloudfunctions.net/func2-prod"

 

Now, you have two workflows ready to run in staging and prod environments:

 

gcloud workflows run multi-env2-staging
gcloud workflows run multi-env2-prod

 

Option 3: Use Terraform to deploy multiple versions


 

In the third option, you use Terraform to deploy multiple versions of the workflow with the appropriate staging and prod URLs replaced at deployment time.

Define a YAML (see workflow3.yaml for an example) that has placeholder values for URLs:

 

main:
  steps:
    - init:
        assign:
          - url1: ${url1}
          - url2: ${url2}

 

Define main.tf that creates staging and prod workflows:

 

variable "project_id" {
  type = string
}

variable "url1" {
  type = string
}

variable "url2" {
  type = string
}

locals {
  env = ["staging", "prod"]
}

# Define and deploy staging and prod workflows
resource "google_workflows_workflow" "multi-env3-workflows" {
  for_each = toset(local.env)

  name            = "multi-env3-${each.key}"
  project         = var.project_id
  region          = "us-central1"
  source_contents = templatefile("${path.module}/workflow3.yaml", { url1 : "${var.url1}-${each.key}", url2 : "${var.url2}-${each.key}" })
}
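
Optionally, you could also add an output block to list the deployed workflow names. This block is an illustrative addition, not part of the original main.tf:

# List the names of the deployed workflows, e.g. ["multi-env3-prod", "multi-env3-staging"]
output "workflow_names" {
  value = [for w in google_workflows_workflow.multi-env3-workflows : w.name]
}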

 

Initialize Terraform:

 

terraform init

 

Check the planned changes:


 

terraform plan -var="project_id=YOUR-PROJECT-ID" -var="url1=https://us-central1-projectid.cloudfunctions.net/func1" -var="url2=https://us-central1-projectid.cloudfunctions.net/func2"

 

Apply the changes to deploy both workflows, with staging URLs for the staging environment and prod URLs for the prod environment:

 

terraform apply -var="project_id=YOUR-PROJECT-ID" -var="url1=https://us-central1-projectid.cloudfunctions.net/func1" -var="url2=https://us-central1-projectid.cloudfunctions.net/func2"
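
To avoid repeating the -var flags on every command, you could instead put the values in a terraform.tfvars file, which Terraform loads automatically (an optional convenience, not part of the original walkthrough):

# terraform.tfvars
project_id = "YOUR-PROJECT-ID"
url1       = "https://us-central1-projectid.cloudfunctions.net/func1"
url2       = "https://us-central1-projectid.cloudfunctions.net/func2"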

 

Now, you have two workflows ready to run in staging and prod environments:

 

gcloud workflows run multi-env3-staging
gcloud workflows run multi-env3-prod

 

Pros and cons

At this point, you might be wondering which option is best.

Option 1 has a simpler setup (a single workflow deployment) but a more complicated execution, as you need to pass in URLs for every execution. If you have a lot of URLs, executions can become verbose with all the runtime arguments. Also, you can’t tell which URLs the workflow will call until you actually execute it.

Option 2 has a more complicated setup, with multiple workflow deployments managed by Cloud Build. However, each deployed workflow contains the URLs it calls, which results in a simpler execution and debugging experience.

Option 3 is pretty much the same as Option 2 but for Terraform users. If you’re already using Terraform, it probably makes sense to also rely on Terraform to replace URLs for different environments.

 

This post provided examples of how to implement multi-environment workflows. If you have questions or feedback, feel free to reach out to me on Twitter @meteatamel.

 

 

By: Mete Atamel (Developer Advocate)
Source: Google Cloud Blog



Related Topics
  • Best Practice
  • Cloud Build
  • Deployment
  • Google Cloud
  • Terraform
  • Tutorials
  • Workflow