Keep in mind that in getOperate, Scripts are the basis of Flows and Apps. Roughly summarized, workflows are state machines, represented as DAGs, that compose scripts together. Learn more in the Script Quickstart in the previous section. You will not necessarily have to rebuild each script, since you can reuse scripts from your workspace or from the Hub.

These workflows can run for loops and branches (parallelizable), and can suspend themselves until a timeout expires or an event is received, such as a webhook or an approval. They can also be scheduled very frequently to check for new external items to process (what we call a “Trigger” script).

The overhead and cold start between each step is about 20 ms, which is faster than other orchestration engines by a large margin.

To create your first workflow, you could simply pick one from our Hub and fork it. Here, however, we’re going to build our own flow from scratch, step by step.

From getOperate, click on + Flow, and let’s get started!

Follow our detailed section on the Flow Editor for more information.

Settings

Metadata

The first thing you’ll see is the Metadata menu. From there, you can set the permissions of the workflow: User (by default, you), and Folder (referring to read and/or write groups).

You can also give your flow a Name, a Summary and a Description. These should be explicit: we recommend giving context and making them as self-explanatory as possible.

Schedule

On another tab, you can configure a Schedule to trigger your flow. Flows can be triggered by any schedule, by their webhooks, or from their UI, but they have only one primary schedule, with which they share the same path. This menu is where you set the primary schedule with a CRON expression (see the example below). By default, no schedule is set.

Schedules

Scheduling allows you to define schedules for Scripts and Flows, automatically running them at set frequencies.
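
For illustration, here is what a primary schedule definition could look like. This is a hedged sketch: the field names are assumptions, and the example uses a standard 5-field CRON syntax, while the exact syntax accepted by getOperate may differ (some schedulers add a seconds field).

```ts
// Hypothetical schedule definition (field names are illustrative).
// Standard 5-field CRON: minute  hour  day-of-month  month  day-of-week
const primarySchedule = {
  cron: "0 8 * * 1-5",      // every weekday at 08:00
  timezone: "Europe/Paris", // assumed field; check your workspace settings
  enabled: true,
};
```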

Shared Directory

Last tab of the settings menu is the Shared Directory.

By default, flows on getOperate pass data between steps on a result basis: a step takes the results of previous steps as its inputs. This works fine for lightweight automation.

For heavier ETLs and any output that is not suitable for JSON, you might want to use the Shared Directory to share data between steps. Steps share a folder at ./shared in which they can store heavier data and pass it on to the next step.
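
For example, one step could write an intermediate artifact into ./shared and the next step could read it back. A minimal sketch, assuming steps run as TypeScript scripts with filesystem access (function and field names are illustrative):

```ts
import { readFile, writeFile } from "node:fs/promises";

// Step 1: store a heavy intermediate artifact in ./shared instead of
// returning it as JSON, and only return its path.
export async function extract() {
  const rows = Array.from({ length: 100_000 }, (_, i) => `${i},value_${i}`);
  await writeFile("./shared/rows.csv", rows.join("\n"));
  return { path: "./shared/rows.csv" }; // keep the JSON result lightweight
}

// Step 2: read the artifact back from the same shared folder.
export async function transform({ path }: { path: string }) {
  const csv = await readFile(path, "utf8");
  return { lineCount: csv.split("\n").length };
}
```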

Get more details on the Persistent Storage dedicated page.

Persistent Storage

Ensure that your data is safely stored and easily accessible whenever required.

Worker Group

When a worker group is defined at the flow level, any steps inside the flow will run on that worker group, regardless of the steps’ worker group. If no worker group is defined, the flow controls will be executed by the default worker group ‘flow’ and the steps will be executed in their respective worker group.

Workers and Worker Groups

Worker Groups allow users to run scripts and flows on different machines with varying specifications.

You can always go back to this menu by clicking on Settings at the top left, or on the name of the flow in the toolbar.

How data is exchanged between steps

Flows on getOperate are generic and reusable; they therefore expose inputs. Inputs and outputs are piped together (a conceptual sketch follows the list below).

Inputs are either:

  • Static: you can find them at the top of the side menu. This tab centralizes the static inputs of every step. It is akin to a file containing all constants. Modifying a value here modifies it directly in the step’s input.
  • Dynamically linked to others: since each step’s result is a JSON object, you can refer to the output of any step. You can reference the result of any step:
    • using the id associated with the step
    • by clicking on the plug icon, which lets you pick flow inputs or previous steps’ results (after testing the flow or step)
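
Conceptually, a step’s return value (a JSON-serializable result) becomes available as an input to later steps. A minimal sketch of two steps wired together; the wiring itself is done in the flow editor, not in code, and the names here are illustrative:

```ts
// Step "a": returns a JSON-serializable result.
export async function a() {
  return { userIds: [1, 2, 3] };
}

// Step "b": its `userIds` input can be set statically, or dynamically
// linked to the result of step "a" (e.g. picked via the plug icon).
export async function b(userIds: number[]) {
  return { count: userIds.length };
}
```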

Architecture and Data Exchange

A workflow is a JSON serializable value in the OpenFlow format.
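
As a rough illustration, such a value bundles the flow’s metadata with its ordered list of steps. The sketch below is a simplified, assumed shape, not the authoritative OpenFlow schema:

```ts
// Simplified, assumed shape of a flow value (field names are illustrative).
const flow = {
  summary: "My first flow",
  description: "Fetch items, then process each of them",
  value: {
    modules: [
      { id: "a", value: { type: "rawscript", language: "deno", content: "export function main() { return [1, 2, 3]; }" } },
      { id: "b", value: { type: "rawscript", language: "python3", content: "def main(items): return len(items)" } },
    ],
  },
  schema: {}, // JSON schema describing the flow's inputs
};
```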

Flow editor

On the left you’ll find a graphical view of the flow. From there you can architect your flow and take action at each step.

There are four kinds of scripts: Action, Trigger, Approval and Error handler. You can sequence them however you want. Action is the default script type.

Each script can be called from the Workspace or the Hub, or you can decide to write it inline.

Your flow can be extended with additional features; below are some major ones.

For loops

For loops are a special type of step that lets you iterate over a list of items given by an iterator expression.

For loops

Iterate a series of tasks.
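
Conceptually, a for-loop step evaluates the iterator expression to get a list, runs its inner steps once per item, and collects each iteration’s result into a list. A rough TypeScript equivalent, purely illustrative; in practice this is configured in the editor, and iterations can also run in parallel:

```ts
// What a for-loop step does, conceptually.
async function forLoopStep<T, R>(
  items: T[],                    // produced by the iterator expression
  body: (item: T) => Promise<R>, // the steps inside the loop
): Promise<R[]> {
  const results: R[] = [];
  for (const item of items) {
    results.push(await body(item)); // each iteration's result is collected
  }
  return results; // the loop's result is the list of iteration results
}
```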

Branching

Branches let you build conditional logic to create and manage complex workflows. There are two kinds:

  • Branch one: allows you to execute a branch if a condition is true.
  • Branch all: allows you to execute all the branches in parallel, as if each branch is a flow.

Branches

Split the execution of the flow based on a condition.
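
Conceptually (this is configured in the editor rather than written by hand): branch one runs the first branch whose condition is true, falling back to a default branch, while branch all runs every branch, optionally in parallel. A rough, illustrative sketch:

```ts
type Branch = { condition?: () => boolean; run: () => Promise<unknown> };

// "Branch one": run the first branch whose condition is true, else the default.
async function branchOne(branches: Branch[], defaultBranch: Branch) {
  const match = branches.find((b) => b.condition?.() ?? false);
  return (match ?? defaultBranch).run();
}

// "Branch all": run every branch (here in parallel) and collect all results.
async function branchAll(branches: Branch[]) {
  return Promise.all(branches.map((b) => b.run()));
}
```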

Retries

For each step, getOperate allows you to customize retries from the Advanced tab of the individual script. If defined, upon error the step will be retried after a delay, up to a maximum number of attempts.

Retries

Re-try a step in case of error.
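
Conceptually, the platform re-runs a failed step after a delay, up to a maximum number of attempts. A rough sketch of that behavior; the real settings live in the step’s Advanced tab, and the constant delay here is an assumption:

```ts
// Conceptual equivalent of a step's retry policy: retry after a fixed delay,
// up to `maxAttempts` attempts, then rethrow the last error.
async function withRetries<T>(
  step: () => Promise<T>,
  maxAttempts = 3,
  delayMs = 5_000,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await step();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        await new Promise((resolve) => setTimeout(resolve, delayMs));
      }
    }
  }
  throw lastError;
}
```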

Suspend/Approval Step

At any step, you can add Approval Scripts to manage security and control over your flows.

Approval requests can be sent by email, Slack, or any other channel. You can then automatically resume workflows with secret webhooks after the approval steps.

Approval Steps in Flows

Suspend a flow until specific event(s) are received, such as approvals or cancellations.
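
As an illustrative sketch (the helper below is hypothetical, not getOperate’s actual API): an approval step typically sends the platform-provided resume and cancel URLs to an approver, then the flow stays suspended until one of those secret webhooks is called.

```ts
// Hypothetical approval step: the resume/cancel URLs would be provided by the
// platform; here they are passed in as plain inputs for illustration.
export async function requestApproval(
  resumeUrl: string,
  cancelUrl: string,
  approverWebhook: string, // e.g. a Slack incoming-webhook URL
) {
  await fetch(approverWebhook, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: `Flow awaiting approval.\nApprove: ${resumeUrl}\nReject: ${cancelUrl}`,
    }),
  });
  // The flow now suspends until the resume (or cancel) webhook is called.
}
```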

You can find all the flows’ features in their dedicated section.

Triggers

There are several ways to trigger a flow with getOperate.

  1. The most direct one is from the autogenerated UI provided by getOperate. It is the one you will see from the flow editor.
  2. A similar but more customized way is to use getOperate Apps using the App editor.
  3. We saw above that you can trigger flows using schedules that you can check from the Runs page. One special way to use scheduling is to combine it with trigger scripts.
  4. Execute flows from the CLI to trigger your flows from your terminal.
  5. Trigger the flow from another flow.
  6. Using trigger scripts to trigger only if a condition has been met.
  7. Webhooks. Each Flow created in the app gets autogenerated webhooks; you can see them once your flow is saved (see the sketch after this list). You can even trigger flows without leaving Slack!
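
For example, once your flow is saved, calling its autogenerated webhook from any HTTP client starts a run. A hedged sketch: the URL shape and auth header below are assumptions and placeholders, so copy the real webhook URL and a token from your own workspace.

```ts
// Hypothetical example: trigger a flow via its autogenerated webhook.
// The URL and token below are placeholders, not real values.
const webhookUrl = "https://app.getoperate.example/api/w/my-workspace/jobs/run/f/u/alice/my_flow";
const token = "REPLACE_WITH_YOUR_TOKEN";

const response = await fetch(webhookUrl, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${token}`,
  },
  body: JSON.stringify({ user_id: 42 }), // the flow's inputs
});
console.log(response.status, await response.text()); // the response typically identifies the created run
```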

Triggering Flows

Trigger flows on-demand, by schedule or on external events.

Time to test

You don’t have to explore all of the flow editor’s possibilities at once. At each step, test what you’re building to stay in control of your work. You can also test up to a certain step by clicking on an action (x) and then on Test up to x.

Testing Flows

Iterate quickly and stay in control of your flow testing.

When you’re done, deploy your flow, schedule it, create an app from it, or even publish it to the Hub.

Follow our detailed section on the Flow Editor for more information.