Pipeline conditions - Azure Pipelines (2024)


Azure DevOps Services | Azure DevOps Server 2022 - Azure DevOps Server 2019

This article describes the conditions under which an Azure Pipelines stage, job, or step runs, and how to specify different conditions. For more context on stages, jobs, and steps, see Key concepts for Azure Pipelines.

  • By default, a job or stage runs if it doesn't depend on any other job or stage, or if all its dependencies completed and succeeded. This requirement applies not only to direct dependencies, but to their indirect dependencies, computed recursively.

  • By default, a step runs if nothing in its job failed yet and the step immediately preceding it completed.

You can override or customize this behavior by forcing a stage, job, or step to run even if a previous dependency fails, or by specifying a custom condition.
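As a minimal sketch of that override, the following hypothetical pipeline uses the succeededOrFailed() status function so a cleanup job runs whether its dependency succeeds or fails (the job names and scripts are illustrative):

```yaml
jobs:
- job: Build
  steps:
  - script: echo Building
- job: Cleanup
  dependsOn: Build
  condition: succeededOrFailed() # runs after Build succeeds or fails, but not if the run is canceled
  steps:
  - script: echo Cleaning up
```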

Note

This article discusses YAML pipeline capabilities. For Classic pipelines, you can specify some conditions under which tasks or jobs run in the Control Options of each task, and in the Additional options for a job in a release pipeline.

Conditions under which a stage, job, or step runs

In the pipeline definition YAML, you can specify the following conditions under which a stage, job, or step runs:

  • Only when all previous direct and indirect dependencies succeed. This is the default, equivalent to condition: succeeded().

  • Even if a previous dependency fails, unless the run is canceled: succeededOrFailed().

  • Even if a previous dependency fails, and even if the run is canceled: always().

  • Only when a previous dependency fails: failed().

  • Custom conditions.

By default, stages, jobs, and steps run if all direct and indirect dependencies succeed. This status is the same as specifying condition: succeeded(). For more information, see succeeded status function.

When you specify a condition property for a stage, job, or step, you overwrite the default condition: succeeded(). Specifying your own conditions can cause your stage, job, or step to run even if the build is canceled. Make sure the conditions you write take into account the state of the parent stage or job.

The following YAML example shows the always() and failed() conditions. The step in the first job runs even if dependencies fail or the build is canceled. The second job runs only if the first job fails.

```yaml
jobs:
- job: Foo
  steps:
  - script: echo Hello!
    condition: always() # this step runs, even if the build is canceled
- job: Bar
  dependsOn: Foo
  condition: failed() # this job runs only if Foo fails
```

You can also set and use variables in conditions. The following example defines an isMain variable that's true when Build.SourceBranch is refs/heads/main, and uses it in stage B's condition.

```yaml
variables:
  isMain: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - script: echo Hello Stage A!
- stage: B
  condition: and(succeeded(), eq(variables.isMain, true))
  jobs:
  - job: B1
    steps:
    - script: echo Hello Stage B!
    - script: echo $(isMain)
```

Important

Conditions are evaluated to determine whether to start a stage, job, or step. Therefore, nothing computed at runtime inside that unit of work is available. For example, if you have a job that sets a variable using a runtime expression with $[ ] syntax, you can't use that variable in a custom condition in that job.

Custom conditions

If the built-in conditions don't meet your needs, you can specify custom conditions. You write conditions as expressions in YAML pipeline definitions.

The agent evaluates the expression beginning with the innermost function and proceeding outward. The final result is a boolean value that determines whether or not the task, job, or stage should run. For a full guide to the syntax, see Expressions.
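For example, a custom condition can nest functions; in the following sketch, the inner ne() is evaluated first and its result feeds the outer and():

```yaml
steps:
- script: echo Not a PR build
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest')) # ne() evaluates first, then and()
```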

If any of your conditions make it possible for the task to run even after the build is canceled, specify a reasonable value for cancel timeout so that these tasks have enough time to complete after the user cancels a run.
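One minimal sketch of this, using the job-level cancelTimeoutInMinutes setting (the job name and timeout value are illustrative):

```yaml
jobs:
- job: Teardown
  condition: always() # can still run after the build is canceled
  cancelTimeoutInMinutes: 5 # give the steps up to 5 minutes to finish after cancellation
  steps:
  - script: echo Releasing resources
```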

Condition outcomes when a build is canceled

Canceling a build doesn't mean that all of its stages, jobs, or steps stop running. Which ones stop depends on the conditions you specified and on the point in the pipeline's execution at which you canceled the build. If a stage's, job's, or step's parent is skipped, it doesn't run, regardless of its conditions.

A stage, job, or step runs whenever its conditions evaluate to true. If your condition doesn't take into account the state of the task's parent, the task might run even if its parent is canceled. To control whether stages, jobs, or steps with conditions run when a build is canceled, make sure to include a job status check function in your conditions.

The following examples show the outcomes of various conditions set on stages, jobs, or steps when the build is canceled.

Stage example 1

In the following pipeline, by default stage2 would depend on stage1, but stage2 has a condition set to run whenever the source branch is main, regardless of stage1 status.

If you queue a build on the main branch and cancel it while stage1 is running, stage2 still runs, because eq(variables['Build.SourceBranch'], 'refs/heads/main') evaluates to true.

```yaml
stages:
- stage: stage1
  jobs:
  - job: A
    steps:
    - script: echo 1; sleep 30
- stage: stage2
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
  jobs:
  - job: B
    steps:
    - script: echo 2
```

Stage example 2

In the following pipeline, stage2 depends on stage1 by default. Job B in stage2 has a condition set. If you queue a build on the main branch and cancel it while stage1 is running, stage2 doesn't run, even though it contains a job whose condition evaluates to true.

The reason is that stage2 has the default condition: succeeded(), which evaluates to false when stage1 is canceled. Therefore, stage2 is skipped, and none of its jobs run.

```yaml
stages:
- stage: stage1
  jobs:
  - job: A
    steps:
    - script: echo 1; sleep 30
- stage: stage2
  jobs:
  - job: B
    condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
    steps:
    - script: echo 2
```

Stage example 3

In the following pipeline, by default stage2 depends on stage1, and the step inside job B has a condition set.

If you queue a build on the main branch and cancel it while stage1 is running, stage2 doesn't run, even though it contains a step in job B whose condition evaluates to true. The reason is that stage2 is skipped in response to stage1 being canceled.

```yaml
stages:
- stage: stage1
  jobs:
  - job: A
    steps:
    - script: echo 1; sleep 30
- stage: stage2
  jobs:
  - job: B
    steps:
    - script: echo 2
      condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
```

Job example 1

In the following YAML pipeline, job B depends on job A by default, but job B has a condition set to run whenever the source branch is main. If you queue a build on the main branch and cancel it while job A is running, job B still runs, because eq(variables['Build.SourceBranch'], 'refs/heads/main') evaluates to true.

```yaml
jobs:
- job: A
  steps:
  - script: sleep 30
- job: B
  dependsOn: A
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
  steps:
  - script: echo step 2.1
```

If you want job B to run only when job A succeeds and the build source is the main branch, your condition should be and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main')).
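Applied to the preceding example, job B's definition would then look like this sketch:

```yaml
- job: B
  dependsOn: A
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  steps:
  - script: echo step 2.1
```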

Job example 2

In the following pipeline, job B depends on job A by default. If you queue a build on the main branch and cancel it while job A is running, job B doesn't run, even though its step has a condition that evaluates to true.

The reason is that job B has the default condition: succeeded(), which evaluates to false when job A is canceled. Therefore, job B is skipped, and none of its steps run.

```yaml
jobs:
- job: A
  steps:
  - script: sleep 30
- job: B
  dependsOn: A
  steps:
  - script: echo step 2.1
    condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
```

Step example

You can also have conditions on steps.

In the following pipeline, step 2.3 has a condition set to run whenever the source branch is main. If you queue a build on the main branch and cancel it while steps 2.1 or 2.2 are running, step 2.3 still runs, because eq(variables['Build.SourceBranch'], 'refs/heads/main') evaluates to true.

```yaml
steps:
- script: echo step 2.1
- script: echo step 2.2; sleep 30
- script: echo step 2.3
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/main')
```

Condition settings

The following table shows example condition settings to produce various outcomes.

Note

Release.Artifacts.{artifact-alias}.SourceBranch is equivalent to Build.SourceBranch.

| Desired outcome | Example condition setting |
| --- | --- |
| Run if the source branch is main, even if the parent or preceding stage, job, or step failed or was canceled. | eq(variables['Build.SourceBranch'], 'refs/heads/main') |
| Run if the source branch is main and the parent or preceding stage, job, or step succeeded. | and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main')) |
| Run if the source branch isn't main, and the parent or preceding stage, job, or step succeeded. | and(succeeded(), ne(variables['Build.SourceBranch'], 'refs/heads/main')) |
| Run for user topic branches, if the parent or preceding stage, job, or step succeeded. | and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/heads/users/')) |
| Run for continuous integration (CI) builds, if the parent or preceding stage, job, or step succeeded. | and(succeeded(), in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI')) |
| Run if the build was triggered by a branch policy for a pull request, and the parent or preceding stage, job, or step failed. | and(failed(), eq(variables['Build.Reason'], 'PullRequest')) |
| Run for a scheduled build, even if the parent or preceding stage, job, or step failed or was canceled. | eq(variables['Build.Reason'], 'Schedule') |
| Run if a variable is set to true, even if the parent or preceding stage, job, or step failed or was canceled. | eq(variables['System.debug'], true) |
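As one sketch of applying a row from the table, the following hypothetical stage runs only for scheduled builds:

```yaml
stages:
- stage: Nightly
  condition: eq(variables['Build.Reason'], 'Schedule')
  jobs:
  - job: Report
    steps:
    - script: echo Nightly run
```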

Note

You can set a condition to run if a variable is null (empty string). Since all variables are treated as strings in Azure Pipelines, an empty string is equivalent to null in the following pipeline:

```yaml
variables:
- name: testEmpty
  value: ''

jobs:
- job: A
  steps:
  - script: echo testEmpty is blank
    condition: eq(variables.testEmpty, '')
```

Parameters in conditions

Parameter expansion happens before conditions are considered. Therefore, when you declare a parameter in the same pipeline as a condition, you can embed the parameter inside the condition. The script in the following YAML runs because parameters.doThing is true.

```yaml
parameters:
- name: doThing
  default: true
  type: boolean

steps:
- script: echo I did a thing
  condition: and(succeeded(), ${{ eq(parameters.doThing, true) }})
```

The condition in the preceding pipeline combines two expressions: succeeded() and ${{ eq(parameters.doThing, true) }}. The succeeded() function checks whether the previous step succeeded; because there is no previous step, it returns true.

The ${{ eq(parameters.doThing, true) }} expression checks whether the doThing parameter equals true. Since the default value for doThing is true, the condition returns true unless the pipeline sets a different value.

Template parameters in conditions

When you pass a parameter to a template, you need to either set the parameter's value in your template or use templateContext to pass the parameter to the template.

For example, the following parameters.yml file declares the doThing parameter and default value:

```yaml
# parameters.yml
parameters:
- name: doThing
  default: true # value passed to the condition
  type: boolean

jobs:
- job: B
  steps:
  - script: echo I did a thing
    condition: ${{ eq(parameters.doThing, true) }}
```

The pipeline code references the parameters.yml template. The output of the pipeline is I did a thing because the parameter doThing is true.

```yaml
# azure-pipeline.yml
parameters:
- name: doThing
  default: true
  type: boolean

trigger:
- none

extends:
  template: parameters.yml
```

For more template parameter examples, see the Template usage reference.

Job output variables used in subsequent job conditions

You can make a variable available to future jobs and specify it in a condition. Variables available to future jobs must be marked as multi-job output variables by using isOutput=true, as in the following code:

```yaml
jobs:
- job: Foo
  steps:
  - bash: |
      echo "This is job Foo."
      echo "##vso[task.setvariable variable=doThing;isOutput=true]Yes" # set variable doThing to Yes
    name: DetermineResult
- job: Bar
  dependsOn: Foo
  condition: eq(dependencies.Foo.outputs['DetermineResult.doThing'], 'Yes') # map doThing and check the value
  steps:
  - script: echo "Job Foo ran and doThing is Yes."
```

Variables created in a step used in subsequent step conditions

You can create a variable that's available for future steps to specify in a condition. Variables created from steps are available to future steps by default and don't need to be marked as multi-job output variables.

There are some important things to note about scoping variables that are created from steps.

  • Variables created in a step in a job are scoped to the steps in the same job.
  • Variables created in a step are available in subsequent steps only as environment variables.
  • Variables created in a step can't be used in the step that defines them.

The following example shows creating a pipeline variable in a step and using the variable in a subsequent step's condition and script.

```yaml
steps:
# This step creates a new pipeline variable, doThing, available to subsequent steps.
- bash: |
    echo "##vso[task.setvariable variable=doThing]Yes"
  displayName: Step 1
# This step can use doThing in its condition.
- script: |
    # Access the variable from Step 1 as an environment variable.
    echo "Value of doThing (as DOTHING env var): $DOTHING."
  displayName: Step 2
  condition: and(succeeded(), eq(variables['doThing'], 'Yes')) # or and(succeeded(), eq(variables.doThing, 'Yes'))
```

FAQ

How can I trigger a job if a previous job succeeded with issues?

You can use the result of the previous job in a condition. For example, in the following YAML, the condition eq(dependencies.A.result,'SucceededWithIssues') allows job B to run because job A succeeded with issues.

```yaml
jobs:
- job: A
  displayName: Job A
  continueOnError: true # next job starts even if this one fails
  steps:
  - script: echo Job A ran
  - script: exit 1
- job: B
  dependsOn: A
  condition: eq(dependencies.A.result, 'SucceededWithIssues') # targets the result of the previous job
  displayName: Job B
  steps:
  - script: echo Job B ran
```

I canceled my build, but it's still running. Why?

You can experience this issue if a condition configured in a stage doesn't include a job status check function. To resolve the issue, add a job status check function to the condition.
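For example, adding succeeded() to the stage condition from Stage example 1 keeps stage2 from running after a cancellation; a minimal sketch of the fix:

```yaml
- stage: stage2
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
```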

If you cancel a job while it's queued but not yet running, the entire run is canceled, including all the other stages. For more information, see Condition outcomes when a build is canceled earlier in this article.

Related content

  • Specify jobs in your pipeline
  • Add stages, dependencies, and conditions

FAQs

What is a condition in an Azure pipeline?

Conditions under which a stage, job, or step runs. In the pipeline definition YAML, you can specify the following conditions under which a stage, job, or step runs: Only when all previous direct and indirect dependencies with the same agent pool succeed.

What is the condition succeeded or failed in an Azure pipeline?

  • succeededOrFailed(): runs when the previous dependency succeeds or fails, but not when the run is canceled.
  • always(): runs regardless of the status of previous tasks, even if the run is canceled.

For example, the failed() condition can be used at the job level to run a job only when a previous job fails.

What is the difference between dependencies and stageDependencies in Azure Pipelines?

The context is called dependencies for jobs and stages and works much like variables. If you refer to an output variable from a job in another stage, the context is called stageDependencies. Note that for a stage to depend on another stage, you need to specify dependsOn: to establish the dependency.
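A minimal sketch of the stageDependencies context, with illustrative stage, job, and step names:

```yaml
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    - bash: echo "##vso[task.setvariable variable=shouldRun;isOutput=true]true"
      name: setvar
- stage: B
  dependsOn: A
  jobs:
  - job: B1
    condition: eq(stageDependencies.A.A1.outputs['setvar.shouldRun'], 'true')
    steps:
    - script: echo Job B1 runs
```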

What is an environment in Azure Pipelines?

An environment is a collection of resources that you can target with deployments from a pipeline. An environment represents a logical target where your pipeline deploys software. Typical environment names are Dev, Test, QA, Staging, and Production. Note. Azure DevOps environments aren't available in Classic pipelines.


What 2 types of pipelines can you create in Azure DevOps?

Classic pipelines are created in the Azure DevOps web portal with the Classic user interface editor. You can define a build pipeline to build, test your code, and then publish your artifact (binary). Additionally, you can define a release pipeline to consume your binary (artifact) and deploy it to specific targets.

How do you fix a failed pipeline?

If a pipeline run failed or was canceled, you can:
  1. Rerun the entire pipeline by queuing a new run.
  2. Retry only the stages or jobs that failed by selecting Rerun failed jobs on the run page.
  3. Push a new commit to fix the failure.

What are the stages of an Azure pipeline environment?

For instance, your pipeline might include stages for building, testing, deploying to a staging environment, and deploying to production. You might want all stages to run automatically except for the production deployment, which you prefer to trigger manually when ready.

What is a trigger in Azure Pipelines?

Scheduled triggers are independent of the repository and allow you to run a pipeline according to a schedule. Pipeline triggers in YAML pipelines and build completion triggers in classic build pipelines allow you to trigger one pipeline upon the completion of another.
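A scheduled trigger can be sketched with the schedules keyword (the cron time and branch shown are illustrative):

```yaml
schedules:
- cron: '0 3 * * *' # 03:00 UTC every day
  displayName: Nightly build
  branches:
    include:
    - main
  always: true # run even if there are no code changes
```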

What is the difference between a build and a release in Azure Pipelines?

The Azure DevOps Server provides two different types of pipelines to perform build, deployment, testing and further actions. A Build Pipeline is used to generate Artifacts out of Source Code. A Release Pipeline consumes the Artifacts and conducts follow-up actions within a multi-staging system.

What are two ways to configure your Azure Pipelines?

From the Pipeline settings pane you can configure the following settings.
  • Processing of new run requests - Sometimes you'll want to prevent new runs from starting on your pipeline. ...
  • YAML file path - If you ever need to direct your pipeline to use a different YAML file, you can specify the path to that file.

What is the difference between a deployment group and an environment in an Azure pipeline?

In effect, a deployment group is just another grouping of agents, much like an agent pool. Environments: Environment represents a collection of resources such as namespaces within Kubernetes clusters, Azure Web Apps, virtual machines, databases, which can be targeted by deployments from a pipeline.

How do you set environment variables in Azure Pipelines?

You can set a variable for a build pipeline by following these steps:
  1. Go to the Pipelines page, select the appropriate pipeline, and then select Edit.
  2. Locate the Variables for this pipeline.
  3. Add or update the variable.
  4. To mark the variable as secret, select Keep this value secret.
  5. Save the pipeline.
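In YAML, you can also map a value directly into a script's environment (the variable name and value are illustrative):

```yaml
steps:
- script: echo $MY_SETTING
  env:
    MY_SETTING: some-value # exposed to this script as an environment variable
```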

What is the difference between an agent job and a deployment group job?

Agent pool jobs run on an agent in an agent pool. Server jobs run on the Azure DevOps Server. Deployment group jobs run on machines in a deployment group. These jobs are only available in a release pipeline.

What is the difference between a pipeline variable and a parameter in Azure?

Pipeline parameters are resolved when the run starts and can't change during a run. Pipeline variables can be set and updated at runtime, for example with the task.setvariable logging command.

What are the features of Azure Pipelines?

Compared with alternatives such as Jenkins, Azure Pipelines offers:
  • Configuration: YAML-based, with a visual editor available.
  • Ecosystem: a comprehensive set of features for CI/CD.
  • Hosting and management: cloud-hosted as part of the Azure DevOps suite, benefiting from easy setup, scaling, and updates.
  • Cost: free and paid tiers (limited build agents and storage).

What are variables in Azure Pipelines?

Variables give you a convenient way to get key bits of data into various parts of the pipeline. The most common use of variables is to define a value that you can then use in your pipeline. All variables are strings and are mutable. The value of a variable can change from run to run or job to job of your pipeline.
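A small sketch of defining and reading a variable with macro syntax (the names are illustrative):

```yaml
variables:
  configuration: Release
steps:
- script: echo Building $(configuration) # macro syntax is replaced before the step runs
```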
