Thursday, January 11, 2024

Flowpipe: A workflow engine for devops scripters



If you define your infrastructure as code, shouldn't your workflow automation use the same as-code approach? That's how Flowpipe works. Define workflows with HCL (HashiCorp configuration language), then run them using a single binary that you deploy locally, in the cloud, or in any CI/CD pipeline. Flowpipe embodies the same architectural ingredients you'll find in any workflow tool: pipelines, steps, triggers, control flow. And it integrates with everything you'd expect from a tool in this category.

But this isn't ClickOps. You don't use a diagramming tool to build integrations. Pipeline definitions are code artifacts that live in repositories as first-class citizens of the modern software ecosystem: version-controlled and collaborative.

These pipelines can orchestrate workflows using a variety of methods. Need to monitor open issues in GitHub and then notify Slack? There's more than one way to gather the GitHub data you'll send to Slack:

  1. In a pipeline step. Use the GitHub library's list_issues pipeline, which encapsulates an http step that calls the GitHub API.
  2. In a query step. Use Steampipe's GitHub plugin to query for open issues in a repo.
  3. In a function step. Write an AWS-Lambda-compatible function, in Python or JavaScript, to call the GitHub API.
  4. In a container step. Package the GitHub CLI in a container and run gh issue list that way.

Why all these choices? The old Perl mantra "There's more than one way to do it" applies here too. Flowpipe is a modern incarnation of "duct tape for the Internet": a flexible kit full of useful tools that work well together. For any given integration, choose the tools most appropriate in that context, or that leverage existing assets, or that are most convenient. You're never blocked. There's always a way to get the job done as you navigate a complex landscape of diverse and interconnected clouds and services.

Show me the code!

Here are four ways to gather information about GitHub issues.

List GitHub issues using Flowpipe's GitHub and Slack libraries

Flowpipe mods provide reusable pipelines. In this case, there are library mods with support for both needed operations: listing GitHub issues and notifying Slack. So we can simply use those libraries in a pair of pipeline steps.

pipeline "list_open_issues_and_notify_slack" {

  step "pipeline" "list_issues" {
    pipeline = github.pipeline.list_issues  # use the github library mod
    args = {
      issue_state      = "OPEN"
      repository_owner = "turbot"
      repository_name  = "steampipe"
    }
  }

  step "pipeline" "notify_slack" {
    pipeline = slack.pipeline.post_message  # use the slack library mod
    args = {
      token   = var.slack_token
      channel = var.slack_channel
      message = step.pipeline.list_issues.value
    }
  }

}

The documentation for the GitHub and Slack mods lists available pipelines and, for each pipeline, the required and optional parameters. It's easy to use published Flowpipe mods, and equally easy to create and use your own.
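To sketch what creating your own looks like, here's a minimal custom pipeline that declares its own required and optional parameters with param blocks. The pipeline, parameter, and variable names here are hypothetical, not taken from the published libraries.

```hcl
pipeline "notify" {

  # Required parameter: no default, so callers must supply it.
  param "message" {
    type = string
  }

  # Optional parameter: the default applies when callers omit it.
  param "channel" {
    type    = string
    default = "#ops"
  }

  step "http" "post" {
    url          = var.webhook_url  # hypothetical variable
    method       = "post"
    request_body = jsonencode({
      channel = param.channel
      text    = param.message
    })
  }
}
```

Callers then invoke it in a pipeline step and pass args, just as the examples above call the library pipelines.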

List GitHub issues using Steampipe's GitHub plugin

Flowpipe doesn't require Steampipe but will happily embrace it. If you're able to use the two together you gain immense power. The GitHub plugin is just one of many wrappers for a growing ecosystem of data sources, each modeled as tables you can query with SQL in a query step.

pipeline "list_open_issues_and_notify_slack" {

  step "query" "query_list_issues" {
    connection_string = "postgres://steampipe@localhost:9193/steampipe"
    sql               = <<-EOQ
      select * from github_issue
      where repository_full_name = 'turbot/steampipe'
      and state = 'OPEN'
    EOQ
  }

  step "pipeline" "notify_slack" {
    # use the library mod as above, or another method
  }
}

All you need here is a connection string, by the way. If you connect to Steampipe you can tap into its plugin ecosystem, but if the data you're after happens to live in another database you can use SQL to query it from there.
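For instance, the same kind of query step can point at an ordinary Postgres database instead of Steampipe; only the connection string and the SQL change. The host, credentials, and table here are hypothetical placeholders.

```hcl
step "query" "open_orders" {
  # Any reachable Postgres database works, not just Steampipe's.
  connection_string = "postgres://app_user:app_pass@db.example.com:5432/sales"
  sql               = "select id, total from orders where status = 'open'"
}
```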

List GitHub issues using a Lambda-compatible function

What if there's neither a library mod nor a Steampipe plugin for your use case? Another option: call a function in a function step.

pipeline "list_open_issues_and_notify_slack" {

  step "function" "list_issues" {
    src     = "./functions"
    runtime = "python:3.10"
    handler = "list_issues.handler"
    event = {
      owner = "turbot"
      repo  = "steampipe"
    }
  }
}

Here's the function.

import requests

def handler(event, context):
    owner = event['owner']
    repo = event['repo']
    url = f"https://api.github.com/repos/{owner}/{repo}/issues?state=open"
    response = requests.get(url)
    return {
        'issues': response.json()
    }

These functions, which you can write in Python or JavaScript, are compatible with AWS Lambda functions: event-driven, stateless, short-lived. And compared to AWS Lambda functions, they're much easier to write and test. You can even live-edit your functions, because when you make changes Flowpipe automatically detects and applies them.

List GitHub issues using the GitHub CLI

Command-line interfaces are fundamental tools for DevOps integration. You can package a CLI in a container and use it in a container step.

pipeline "list_open_issues_and_notify_slack" {

  step "container" "list_issues" {
    image = "my-gh-image"
    cmd   = ["/container-list-issues.sh"]
    env = {
      GITHUB_TOKEN = var.access_token
      GH_COMMAND   = var.gh_command
    }
  }
}

That's probably overkill in this case, but the ability to use containerized commands in this way ensures maximal flexibility and portability.

Why HashiCorp configuration language?

HashiCorp configuration language (HCL) is, first of all, familiar to devops professionals who use it to express Terraform configurations. But the language turns out to be an ideal fit for workflow too. The directed acyclic graph (DAG) at the core of its execution model determines the order of operations based on resource dependencies, unlike many scripting languages where such dependencies must be managed explicitly.

If the second step in a workflow refers to the output of the first, Flowpipe implicitly sequences the steps. Concurrency is implicit too. Workflow steps that don't depend on other steps automatically run in parallel, no special syntax required. So you can create complex and highly parallel workflows in a declarative style that's easy to read and write. For example, here's a step that iterates over a list of users and uses an http step to call an API for each user.

step "http" "add_a_user" {
  for_each = ["Jerry", "Elaine", "Newman"]
  url      = "https://myapi.local/api/v1/user"
  method   = "post"
  request_body = jsonencode({
    user_name = "${each.value}"
  })
}
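The implicit sequencing and concurrency can be sketched in a minimal pipeline (the endpoints are hypothetical): the second step references the first step's output, so Flowpipe orders them; the sleep step references nothing, so it runs in parallel with the fetch.

```hcl
pipeline "fetch_and_report" {

  step "http" "fetch" {
    url = "https://myapi.local/api/v1/users"
  }

  # Depends on "fetch" because it references its output,
  # so Flowpipe runs it second -- no explicit ordering required.
  step "http" "report" {
    url          = "https://myapi.local/api/v1/report"
    method       = "post"
    request_body = jsonencode(step.http.fetch.response_body)
  }

  # References no other step, so it runs concurrently with "fetch".
  step "sleep" "pause" {
    duration = "5s"
  }
}
```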

Because things don't always go according to plan, Flowpipe's declarative style extends to error handling and retries.

step "http" "my_request" {
  url    = "https://myapi.local/subscribe"
  method = "post"
  request_body = jsonencode({
    name = param.subscriber
  })
  retry {
    max_attempts = 5
    strategy     = "exponential"
    min_interval = 100
    max_interval = 10000
  }
}

You'll often need to unpack the results of one step in a pipeline, then transform the data in order to feed it to a subsequent step. Among the Terraform-compatible HCL functions supported by Flowpipe are collection functions that work with lists and maps.

pipeline "get_astronauts" {

  step "http" "whos_in_space" {
    url    = "http://api.open-notify.org/astros"
    method = "get"
  }

  output "method_1" {
    value = [for o in step.http.whos_in_space.response_body.people: o.name]
  }

  output "method_2" {
    value = step.http.whos_in_space.response_body.people[*].name
  }
}

Here's the output of the command flowpipe pipeline run get_astronauts.

[Screenshot: output of flowpipe pipeline run get_astronauts]

The two methods are equivalent ways to iterate over the list of maps returned from the API and extract the name field from each. The first method uses the versatile for expression, which works with lists, sets, tuples, maps, and objects. The second method gives an identical result using the splat expression, which can simplify access to fields within elements of lists, sets, and tuples.

Schedules, events, and triggers

As with other workflow engines, you can trigger a Flowpipe pipeline on a cron-defined schedule.

trigger "schedule" "daily_3pm" {
  schedule = "0 15 * * *"
  pipeline = pipeline.daily_task
}

But you may also want to react immediately to events like code pushes, infrastructure changes, or Slack messages. So Flowpipe provides an HTTP-based trigger that reacts to an incoming webhook by running a pipeline.

trigger "http" "my_webhook" {
  pipeline = pipeline.my_pipeline
  args     = {
    event = self.request_body
  }
}

To use triggers, run Flowpipe in server mode.

The Goldilocks zone

Flowpipe occupies a middle ground between tools like Zapier or IFTTT, which require little or no code for simple things, and tools like N8N or Windmill, which can do complex things but require a lot of code. You express pipelines, steps, and triggers in the standard devops configuration language: HCL. As needed, you augment that code with SQL, or Python, or JavaScript, or bash, or anything you can package into a container.

You coordinate all these resources using a common execution model embedded in a single binary that runs as a CLI, and/or as a server that schedules tasks and listens for webhooks. Either way, you can run that single binary locally or deploy it to any cloud or CI/CD pipeline.

To get started, download the tool, check out the library mods and samples, and run through the tutorial.

Will Flowpipe's declarative, code-forward style resonate with devops scripters? Give it a try and let us know how it goes. And if you're inclined to contribute to Flowpipe's AGPL-licensed engine or Apache-licensed mods, we're always happy to receive pull requests!

Copyright © 2024 IDG Communications, Inc.
