jenkinsci / pipeline-as-yaml-plugin

Jenkins Pipeline As Yaml Plugin

Home Page: https://plugins.jenkins.io/pipeline-as-yaml/

License: MIT License

Java 99.82% HTML 0.18%
jenkins jenkins-pipeline jenkins-plugin pipeline yaml pipeline-as-yaml pipeline-as-code multibranch-pipeline

pipeline-as-yaml-plugin's Introduction

Pipeline As Yaml Plugin for Jenkins


This plugin enables defining Jenkins Pipelines in YAML Format for Pipeline and MultiBranch Pipeline Jobs.

Important

Currently this plugin is in the incubation stage. It will evolve further to become more aligned with the Pipeline ecosystem, and some breaking changes are plausible. You are welcome to try out this plugin and to provide your feedback. Contributions are welcome!

Description

Jenkins enables defining pipelines with a specific DSL. With this plugin, Jenkins pipelines can be defined in YAML format.

The YAML definition is converted to Jenkins Declarative Pipeline syntax at runtime.
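As an illustrative sketch (the exact generated Groovy may differ between plugin versions), a YAML definition such as

```yaml
pipeline:
  agent:
    label: 'master'
  stages:
    - stage: "Setup"
      steps:
        - echo "hello world"
```

is converted at runtime into roughly the following Declarative Pipeline:

```groovy
pipeline {
  agent {
    node {
      label 'master'
    }
  }
  stages {
    stage('Setup') {
      steps {
        echo "hello world"
      }
    }
  }
}
```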

Any existing step from the Snippet Generator or the Declarative Directive Generator can be used in a step or script block.

Jenkins Declarative Pipeline syntax rules must be followed.

Please see below for usage examples.

Usage

Pipeline

To use Pipeline As YAML in your Pipeline job, select one of the options below.

Editor

Define the Pipeline As YAML with the embedded editor.

pipelineAsScript

SCM

Retrieve the Pipeline As YAML from the SCM definition.

pipelineAsScm

MultiBranch Pipeline

To use Pipeline As YAML in your MultiBranch Pipeline, select 'by Jenkinsfile As Yaml' in 'Build Configuration'.

Build Configuration

Pipeline As Yaml Syntax

The pipeline definition must start with the pipeline key.

For detailed usage examples please check here.

pipeline:
  agent: any
  ...

Agent

An example agent definition is shown below. Agent definitions can also be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  agent:
    node:
      label: 'label'

Environment

An example definition is shown below. Environment definitions can also be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  environment:
    KEY1: "VAL1"

Options

An example definition is shown below. Options definitions can also be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  options:
    - "timeout(time: 1, unit: 'HOURS')"
    # Or any other 'options' directive which is generated by Declarative Directive Generator

Post

An example definition is shown below. Post definitions can also be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  post:
    always:
      - echo "Test"
    changed:
      - echo "Test"
    # Or any other 'post' directive which is generated by Declarative Directive Generator 

Tools

An example definition is shown below. Tools definitions can also be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline: 
  tools:
    maven: "maven"
    # Or any other 'tools' directive which is generated by Declarative Directive Generator

When

An example definition is shown below. When definitions can also be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  stages:
    - stage: "WhenTest"
      when:
        - "branch 'production'"
      # Or any other 'when' directive which is generated by Declarative Directive Generator

Parameters

An example definition is shown below.

For further supported definition syntax, please check the documentation.

pipeline:
  parameters:
    - "string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')"
    # Or any other 'parameters' directive which is generated by Declarative Directive Generator

Triggers

An example definition is shown below.

For further supported definition syntax, please check the documentation.

pipeline:
  triggers:
    - cron('H */4 * * 1-5')
    # Or any other 'triggers' directive which is generated by Declarative Directive Generator

Library

An example definition is shown below.

Before using the Library feature, please read here.

For further supported definition syntax, please check the documentation.

pipeline:
  library: "library@master"
  agent:
    any:
  stages:
    - stage: "Stage Library"
      steps:
        script:
          - "myCustomStepInLibrary"

Stages

An example definition is shown below.

For further supported definition syntax, please check the documentation.

pipeline:
  agent:
    none:
  stages:
    - stage: "Stage1"
      steps:
        - echo "1"
    - stage: "Stage2"
      steps:
        - echo "2"

Stages can also be nested:

pipeline:
  agent:
    none:
  stages:
    - stage: "Stage1"
      stages:
        - stage: "Inner Stage1"
          steps:
            - echo "1"

Parallel stages are defined with the parallel key:

pipeline:
  stages:
    - stage: "Stage1"
      steps:
        - echo "1"
    - stage: "Parallel"
      parallel:
        - stage: "Parallel1"
          steps:
            - echo "P1"
        - stage: "Parallel2"
          steps:
            - echo "P2"

Steps

Example definition is shown below.

Any other step generated by the Snippet Generator can be used in steps definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  stages:
    - stage: "Stage"
      steps:
        - echo env.WORKSPACE # Or any other 'step' which is generated by Snippet Generator

Any other 'step' which is generated by Snippet Generator or Groovy Script can be used in steps definitions.

pipeline:
  stages:
    - stage: "Stage1"
      steps:
        script:
          - echo "1" # Or any other 'step' which is generated by Snippet Generator or Groovy Script

For implementing complex scripts or steps, a YAML literal block can be used:

pipeline:
  stages:
    - stage: "Stage1"
      steps:
        script: |
          echo "1"
          echo "2"
          echo "3"

Special Steps With Code Blocks

Some steps have their own code blocks, for example withAnt, withEnv, withCredentials, dir, or any other custom step definition which has its own code block.

Steps of this kind can also be defined in YAML.

Example definition is shown below.

pipeline:
  stages:
    - stage: "Stage"
      steps:
        script:
          - withAnt:
            script:
              - echo "No values"
          - withEnv: "['KEY=VAL']"
            script:
              - echo $KEY
          - withCredentials: "[usernamePassword(credentialsId: 'eedc7820-a4e0-4d87-a66d-b5b65ee42ad9', passwordVariable: 'PASSWORD', usernameVariable: 'USERNAME')]"
            script:
              - echo $USERNAME
          - withCredentials: "[string(credentialsId: '', variable: 'CRED')]"
            script:
              - echo $CRED

These steps can also be nested within each other's blocks.

pipeline:
  stages:
    - stage: "WithEnv Intertwined"
      steps:
        script:
          - withEnv: "['KEY1=VAL1']"
            script:
              - echo env.KEY1
              - withEnv: "['KEY2=VAL2']"
                script:
                  - echo env.KEY2

Custom steps can be converted to YAML format as shown below.

myCustomStep([customVariable: '']) {
    echo "some code"
}
pipeline:
  stages:
    - stage: "Stage"
      steps:
        script:
          - myCustomStep: "[customVariable: '']"
            script:
              - echo "some code"

Conversion and Validation

Before running Pipeline As YAML, you can convert it to a Declarative script and validate the pipeline. This way, errors can be caught before the pipeline runs.

To use this functionality, open the 'Pipeline Syntax' page shown in the job menu.

Pipeline Syntax

Click the "Pipeline As YAML Converter" link.

Pipeline As YAML Converter

Paste your Pipeline As YAML into the first text area and click the "Convert To Pipeline Declarative Script" button as shown below.

Paste Pipeline

After a successful conversion, the second text area will be filled with the Declarative Pipeline script. For validation, click the "Validate" button as shown below.

Validate

Validation results or error messages will be shown below the button.

Reporting Issues

Please create an issue in this repository.

Create Issue

Thank You!

If you feel generous today, you can buy me a coffee :)
Or you can star the project. Thanks.

pipeline-as-yaml-plugin's People

Contributors

aytuncbeken, dependabot[bot], jonesbusy, kennyg, lengyf, oleg-nenashev


pipeline-as-yaml-plugin's Issues

When replaying a pipeline the code shown is the transformed declarative code rather than the original YAML

Describe the bug
I created a pipeline as YAML, then ran it, so build no. 1 completed. I clicked on the build, then hit the "Replay" button: it showed the code of the pipeline, no longer as YAML, but as its corresponding declarative code.

To Reproduce
Steps to reproduce the behaviour:

  1. Create a simple "hello world" pipeline as YAML and save it.
  2. Build the pipeline, wait until build completes.
  3. Click on the build, then, click on "replay"
  4. See the code is not YAML anymore. Instead, it is the declarative version of the previous YAML code.

Expected behaviour
Replaying a build from a pipeline defined as YAML should show the original YAML code.

Example of YAML pipeline

pipeline:
  agent:
    label: 'master'
  stages:
   - stage: Setup
     steps: echo "hello world"

Declarative pipeline
When replaying the code of the pipeline above, this is shown

pipeline {
  agent {
    node {
      label 'master'
    }
  }
  stages {
    stage('Setup') {
      steps {
        echo "hello world"
      }
    }
  }
}

Pipeline Converter UI v1

Is your feature request related to a problem? Please describe.

  • A UI for converting YAML to Declarative
  • UI must be in the Pipeline Syntax Page for easy access

Pipeline Support

Requirements:

  • Pipeline Job support needs to be added to the plugin
  • Options
    • Jenkins file as YAML From SCM
    • Jenkins file as Yaml (Editor)

SnakeYAML method not found

SnakeYAML API Plugin 2.2-111.vc6598e30cc65
java.lang.NoSuchMethodError: org.yaml.snakeyaml.representer.Representer: method 'void <init>()' not found
at io.jenkins.plugins.pipeline.parsers.AbstractParser.<init>(AbstractParser.java:25)
at io.jenkins.plugins.pipeline.parsers.PipelineParser.<init>(PipelineParser.java:26)
at io.jenkins.plugins.pipeline.cps.PipelineCpsScmFlowDefinition.create(PipelineCpsScmFlowDefinition.java:50)
at io.jenkins.plugins.pipeline.PipelineAsYamlScmFlowDefinition.create(PipelineAsYamlScmFlowDefinition.java:74)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:311)
at hudson.model.ResourceController.execute(ResourceController.java:101)
at hudson.model.Executor.run(Executor.java:442)

NPE when running a job with 'Pipeline As Yaml from SCM'

The problem only occurs while running a job with 'Pipeline As Yaml from SCM'. What follows is the output console content:

java.lang.NullPointerException
	at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.<init>(CpsScmFlowDefinition.java:78)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.cps.PipelineCpsScmFlowDefinition.<init>(PipelineCpsScmFlowDefinition.java:30)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.PipelineAsYamlScmFlowDefinition.create(PipelineAsYamlScmFlowDefinition.java:73)
	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:309)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:428)
Finished: FAILURE

Steps to reproduce the behavior:

  1. Create a new job
  2. Setup the pipeline section with 'Pipeline As Yaml from SCM' and feed it with SCM/repo/credentials/branch.
  3. Click Save
  4. Running the job leads to a failure with an NPE

Expected behavior
A healthy job execution.

Versions:

  • Jenkins Version : 2.249.2
  • Plugin Version : 0.12-rc

Context:
It may be related to this issue : #34

Pipeline as YAML should not depend on Pipeline Aggregator

Describe the bug
Pipeline Aggregator is a meta-plugin which includes A LOT of Pipeline plugins: https://github.com/jenkinsci/workflow-aggregator-plugin . It bloats the dependency scope for the plugin and it prevents Pipeline as YAML from being included into the default distribution due to circular dependencies.

Example of a dependency conflict caused by an old plugin version in the aggregator:

Require upper bound dependencies error for com.google.guava:guava:11.0.1 paths to dependency are:
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-com.google.guava:guava:11.0.1
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.kohsuke.stapler:stapler-jrebel:1.259
      +-org.kohsuke.stapler:stapler:1.259
        +-com.google.guava:guava:11.0.1
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.jenkins-ci.plugins.workflow:workflow-aggregator:2.6
      +-org.jenkinsci.plugins:pipeline-model-definition:1.3.2
        +-org.jenkinsci.plugins:pipeline-model-api:1.3.2
          +-com.github.fge:json-schema-validator:2.0.4
            +-com.github.fge:json-schema-core:1.0.4
              +-com.google.guava:guava:11.0.1 (managed) <-- com.google.guava:guava:13.0.1

To Reproduce
Steps to reproduce the behavior:

  1. See jenkinsci/jenkinsfile-runner#316

Expected behavior
Pipeline as YAML plugin declares dependencies only on the Pipeline components it needs. https://github.com/jenkinsci/bom can ideally be used to simplify this process and further management.

scenarioInput stage missing close brace

Describe the bug
scenarioInput stage missing close brace

Additional context
pipeline {
  agent none
  stages {
    stage('Stage1') {
      input {
        message "message"
        id "id"
        ok "ok"
        submitter "submitter"
        submitterParameter "submitterParameter"
        parameters {
          string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
        }
      steps {
        echo "1"
      }
    }
  }
}

Pipeline as YAML Parser classes do not propagate exception causes

Describe the bug
Pipeline as YAML parsers do not retain the cause exceptions. In such cases, some diagnostic info may be lost, and it may become harder for users to diagnose failure causes.

Sample code:

        catch (Exception e) {
            // the original exception 'e' is not passed as the cause, so its stack trace is lost
            throw new PipelineAsYamlRuntimeException(e.getLocalizedMessage());
        }
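The intended behavior can be sketched with a minimal, self-contained demo; PipelineAsYamlRuntimeException here is a stand-in class (the real plugin class would need a matching (String, Throwable) constructor added):

```java
// Minimal sketch: passing the caught exception as the cause preserves diagnostics.
public class CausePropagationDemo {
    static class PipelineAsYamlRuntimeException extends RuntimeException {
        PipelineAsYamlRuntimeException(String message, Throwable cause) {
            super(message, cause); // retains the original exception as the cause
        }
    }

    static String causeName() {
        try {
            try {
                throw new ClassCastException("java.util.ArrayList cannot be cast to java.lang.String");
            } catch (Exception e) {
                // pass 'e' as the cause instead of dropping it
                throw new PipelineAsYamlRuntimeException(e.getLocalizedMessage(), e);
            }
        } catch (PipelineAsYamlRuntimeException wrapped) {
            return wrapped.getCause().getClass().getName();
        }
    }

    public static void main(String[] args) {
        System.out.println(causeName()); // the original ClassCastException survives as the cause
    }
}
```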

Sample error from Jenkinsfile Runner:

[2020-07-17T20:27:45.806Z] 2020-07-17 20:27:45.752+0000 [id=112]	WARNING	i.j.j.runner.JenkinsEmbedder#before: Jenkins.theInstance was not cleared by a previous test, doing that now

[2020-07-17T20:27:46.618Z] 2020-07-17 20:27:46.513+0000 [id=133]	WARNING	o.j.p.w.flow.FlowExecutionList#unregister: Owner[job/1:job #1] was not in the list to begin with: []

[2020-07-17T20:27:46.618Z] Started

[2020-07-17T20:27:46.618Z] org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.exceptions.PipelineAsYamlRuntimeException: java.util.ArrayList cannot be cast to java.lang.String

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.parsers.PipelineParser.parse(PipelineParser.java:53)

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.cps.PipelineCpsScmFlowDefinition.create(PipelineCpsScmFlowDefinition.java:50)

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.PipelineAsYamlScmFlowDefinition.create(PipelineAsYamlScmFlowDefinition.java:74)

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:309)

[2020-07-17T20:27:46.618Z] 	at hudson.model.ResourceController.execute(ResourceController.java:97)

[2020-07-17T20:27:46.618Z] 	at hudson.model.Executor.run(Executor.java:428)

[2020-07-17T20:27:46.618Z] Finished: FAILURE

To Reproduce

See the test in jenkinsci/jenkinsfile-runner#316

Expected behavior
All exceptions are properly propagated

Converting `environment` with `credentials` method is broken?

Describe the bug
The environment section parser has a bug: the special helper method credentials is parsed incorrectly. It adds single quotes around the method call, which then causes errors in the pipeline.

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'Pipeline As YAML Converter'
  2. Paste the code from Additional context
  3. Press 'Convert to Pipeline'
  4. See how the environment section is converted
    4.1 Press 'Validate' & see the validation errors

Expected behavior
As expected by docs of environment syntax.
Convert this

environment:
  COVERALLS_SECRET_TOKEN: credentials('COVERALLS_SECRET_TOKEN')

to this (without quotes around credentials method):

environment {
  COVERALLS_SECRET_TOKEN = credentials('COVERALLS_SECRET_TOKEN')
}

Desktop (please complete the following information):

  • Ubuntu 18.04 on server, Linux Mint 20 on desktop
  • Chrome 84 on desktop

Additional context
I have code like this:

pipeline:
  agent:
    label: 'master'
  stages:
    - stage: Tests
      stages:
        - stage: Start tests
          environment:
            COVERALLS_SECRET_TOKEN: credentials('COVERALLS_SECRET_TOKEN')
          steps:
            - echo "====++++executing Start tests++++===="
            - echo "COVERALLS_SECRET_TOKEN = ${COVERALLS_SECRET_TOKEN}"
            - sh 'npm install'
            - sh 'npm test'
          post:
            always:
              - echo "====++++always++++===="
            success:
              - echo "====++++Start tests executed successfully++++===="
              - sh 'npm run coverage'
            failure:
              - echo "====++++Start tests execution failed++++===="

and this is parsing result:

pipeline {
  agent {
    node {
      label 'master'
    }
  }
  stages {
    stage('Tests') {
      stages {
        stage('Start tests') {
          environment {
            COVERALLS_SECRET_TOKEN = 'credentials('COVERALLS_SECRET_TOKEN')'
          }
          steps {
            echo "====++++executing Start tests++++===="
            echo "COVERALLS_SECRET_TOKEN = ${COVERALLS_SECRET_TOKEN}"
            sh 'npm install'
            sh 'npm test'
          }

Validation says that:

startup failed:
WorkflowScript: 18: Environment variable values must either be single quoted, double quoted, or function calls. @ line 18, column 50.
LS_SECRET_TOKEN = 'credentials('COVERALLS_
^

WorkflowScript: 17: No variables specified for environment @ line 17, column 11.
environment {
^

2 errors

Add support for Kubernetes plugin

Hi,
It would be great if we could use pipeline-as-yaml-plugin to describe a pipeline which uses the kubernetes agent. At the moment it doesn't seem to be possible.
Something like:

pipeline:
  agent:
    kubernetes:
      cloud: mystack
      yaml: >
        apiVersion: v1
        kind: Pod
        spec:
            imagePullSecrets:
            - name: my-creds
            containers:
            - name: ubuntu
              image: myimage:1.1
              command: ['sleep', 'infinity']
              tty: true
              imagePullPolicy: Always'''
  stages:
    - stage: Test
      steps: echo "Hello world"

I've tried to run the pipeline above through the conversion tool provided by the plugin and it almost worked. The problem is that the "yaml" key's value gets converted into a single-quoted string, which doesn't work with multiline strings such as the pod definition above.
I reckon if the plugin were able to wrap the value of the yaml key in a multiline string (e.g. '''string''') that would work.

Matrix support

Is your feature request related to a problem? Please describe.
I need jenkins matrices support for some of my projects. I don't see any way to describe it in yaml.

Describe the solution you'd like
I would like to be able to describe matrices in yaml.

Describe alternatives you've considered
N/A

Additional context
I am pretty new to jenkins pipelines so maybe I am missing something about how to describe matrices in a pipeline.

Schema Validation

Requirements:

  • Schema of Yaml file should be validated before trying to parse.

Can't save custom script path

Describe the bug
When the job's Pipeline setting is set to 'Pipeline As Yaml from SCM' and a custom path to the Jenkinsfile.yaml is given, the location is not saved even after hitting the 'Save' button.

Steps to reproduce the behavior:

  1. Create a new job
  2. Setup the pipeline section with 'Pipeline As Yaml from SCM' and feed it with SCM/repo/credentials/branch and script path, if Jenkinsfile.yaml is in a subdirectory of a (git) projet.
  3. Click Save
  4. If you check the job configuration again, everything is fine except that the specified path has been replaced by "Jenkinsfile.yaml"

Expected behavior
A script path location taken into account

Versions:

  • Jenkins Version : 2.249.2
  • plugin version : 0.12-rc

Parity with standard Jenkinsfile for Docker

Is your feature request related to a problem? Please describe.

Since the standard Jenkinsfile doesn't require an explicit clone, pipeline as yaml shouldn't either.

Describe the solution you'd like

When it is yaml from SCM it should automatically have the code cloned (as per the configuration) to be consistent with what happens for Jenkinsfile from SCM.

Describe alternatives you've considered

The workaround is to add an explicit checkout

This works..

pipeline {
    agent { 
        docker { 
            image 'maven:3.3.3' 
            reuseNode true
        }
    }
    stages {
        stage('build') {
            steps {
                sh 'mvn clean package'
            }
        }
    }
}

But this doesn't

pipeline:
  agent:
    docker:
      image: maven:3.3.3
      # 2. Reusing the node (double-check that this doesn't happen automatically)
      reuseNode: 'true'
  stages:
    - stage: "build"
      steps:
        - sh "mvn clean package"

Unless you add a clone step before the maven build

        - "checkout([$class: 'GitSCM', branches: [[name: '*/jenkins-poc']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'example-cred', url: 'https://github.com/example-org/example-repo.git']]])"

Pipeline YAML may get processed as Groovy DSL when CpsFlowFactoryAction2 is present

Describe the bug
I hit this issue in the simple demo for jenkinsci/jenkinsfile-runner#316 . Jenkinsfile Runner uses the SCM source and the virtual FilesystemSCM, and presumptions in the Pipeline as YAML's code may lead to incorrect behavior if CpsFlowFactoryAction2 is present in the created Pipeline.

THIS IS NOT A BUG IN PIPELINE AS YAML, but some code hardening may make sense

Step 0. Jenkinsfile Runner adds the SetJenkinsfileLocation action which implements CpsFlowFactoryAction2. https://github.com/jenkinsci/jenkinsfile-runner/blob/9f41f51b6dc320b9dd5c0fa6d81f179518597d37/payload/src/main/java/io/jenkins/jenkinsfile/runner/SetJenkinsfileLocation.java

Step 1. PipelineCpsScmFlowDefinition converts YAML to Groovy DSL and then calls CpsFlowDefinition constructor

@Override
    public CpsFlowExecution create(FlowExecutionOwner owner, TaskListener listener, List<? extends Action> actions) throws Exception {
        CpsFlowExecution cpsFlowExecution =  super.create(owner, listener, actions);
        String yamlJenkinsFileContent = cpsFlowExecution.getScript();
       
       ....
 
        String jenkinsFileContent = pipelineModel.get().toPrettyGroovy();
        return new CpsFlowDefinition(jenkinsFileContent,cpsFlowExecution.isSandbox()).create(owner,listener, actions);
    }

Step 2. The CpsFlowDefinition flow execution creator consults the actions passed as arguments. One of the actions is SetJenkinsfileLocation. This action makes the method return ((CpsFlowFactoryAction2) a).create(this, owner, actions); instead of falling through to the default construction coded below. The create() method actually called builds the execution from scratch and ignores the converted DSL.


    @Override
    @SuppressWarnings("deprecation")
    public CpsFlowExecution create(FlowExecutionOwner owner, TaskListener listener, List<? extends Action> actions) throws IOException {
        for (Action a : actions) {
            if (a instanceof CpsFlowFactoryAction) {
                CpsFlowFactoryAction fa = (CpsFlowFactoryAction) a;
                return fa.create(this,owner,actions);
            } else if (a instanceof CpsFlowFactoryAction2) {
                return ((CpsFlowFactoryAction2) a).create(this, owner, actions);
            }
        }
        Queue.Executable exec = owner.getExecutable();
        FlowDurabilityHint hint = (exec instanceof Run) ? DurabilityHintProvider.suggestedFor(((Run)exec).getParent()) : GlobalDefaultFlowDurabilityLevel.getDefaultDurabilityHint();
        return new CpsFlowExecution(sandbox ? script : ScriptApproval.get().using(script, GroovyLanguage.get()), sandbox, owner, hint);
    }

Step 3. A standard Groovy Converter is called. Execution fails with...

org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 5: expecting EOF, found ':' @ line 5, column 10.
     - stage: "Print Hello"
            ^

1 error

        at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
        at org.codehaus.groovy.control.ErrorCollector.addFatalError(ErrorCollector.java:150)
        at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:120)
        at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:132)
        at org.codehaus.groovy.control.SourceUnit.addError(SourceUnit.java:350)
        at org.codehaus.groovy.antlr.AntlrParserPlugin.transformCSTIntoAST(AntlrParserPlugin.java:144)
        at org.codehaus.groovy.antlr.AntlrParserPlugin.parseCST(AntlrParserPlugin.java:110)
        at org.codehaus.groovy.control.SourceUnit.parse(SourceUnit.java:234)
        at org.codehaus.groovy.control.CompilationUnit$1.call(CompilationUnit.java:168)
        at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:943)
        at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:605)
        at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
        at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
        at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)

To Reproduce
Run a demo from jenkinsci/jenkinsfile-runner#316

Expected behavior
Pipeline as YAML code is more robust against custom CpsFlowFactoryAction2 implementations. Additional coverage for Pipeline replay/restart functionality might be needed

Additional context
For me the resolution will be clearly on the Jenkinsfile Runner side. This is rather code hardening, not a bug

Pipeline as YAML fails to run if user trigger it with Replay option

I have created a Pipeline in YAML as shown below.

pipeline:
    agent:
      any:
    stages:
     - stage: "Checkout"
       steps:
         script: 
           - git 'https://github.com/username/API.git'
     - stage: "Build Multi stage Docker Image"
       steps:
         script: 
           - sh "docker build -t username/webserver:v$BUILD_NUMBER ."

The pipeline is successful when I build it, but fails when I replay it, with the error below.

Replayed #40
java.lang.ClassCastException: java.lang.String cannot be cast to java.util.LinkedHashMap
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.parsers.PipelineParser.parse(PipelineParser.java:34)
Caused: org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.exceptions.PipelineAsYamlRuntimeException: java.lang.String cannot be cast to java.util.LinkedHashMap
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.parsers.PipelineParser.parse(PipelineParser.java:53)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.cps.PipelineCpsFlowDefinition.create(PipelineCpsFlowDefinition.java:41)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.PipelineAsYamlScriptFlowDefinition.create(PipelineAsYamlScriptFlowDefinition.java:56)
	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:309)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:428)
Finished: FAILURE

To Reproduce
To reproduce this issue please create a Pipeline using YAML and try to run via replay.

Expected behavior
The pipeline should be successful.

Proposal: Change the package name for the code and tests

Currently the code uses the org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline package. A few concerns there:

  • org.jenkinsci.plugins.workflow is an old root for Pipeline plugins. Maybe it makes sense to use io.jenkins.plugins.pipeline
  • multibranch does not seem to be needed, the code does not really depend on MultiBranch logic

If we do the change, it needs to happen before 1.0 release. It will be a breaking change, and I doubt it makes sense to spend time on data migration logic to prevent that.

Pipeline as YAML defines dependency on a higher slf4j-api version than the Jenkins core

Jenkins core includes slf4j-api, and the current versions use 1.7.26. The plugin uses version 1.7.30. It causes problems for components which use Maven Enforcer

Require upper bound dependencies error for org.slf4j:jcl-over-slf4j:1.7.26 paths to dependency are:
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.slf4j:jcl-over-slf4j:1.7.26
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.slf4j:jcl-over-slf4j:1.7.26 (managed) <-- org.slf4j:jcl-over-slf4j:1.7.30
,
Require upper bound dependencies error for org.slf4j:slf4j-api:1.7.26 paths to dependency are:
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.slf4j:slf4j-api:1.7.26
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.slf4j:jcl-over-slf4j:1.7.26
      +-org.slf4j:slf4j-api:1.7.26 (managed) <-- org.slf4j:slf4j-api:1.7.30
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.slf4j:log4j-over-slf4j:1.7.26
      +-org.slf4j:slf4j-api:1.7.26
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.slf4j:slf4j-jdk14:1.7.26 (managed) <-- org.slf4j:slf4j-jdk14:1.7.30
      +-org.slf4j:slf4j-api:1.7.26 (managed) <-- org.slf4j:slf4j-api:1.7.30

To Reproduce
jenkinsci/jenkinsfile-runner#316

Expected behavior
Pipeline as YAML uses the same library versions as Jenkins core. The recommendation is to update Plugin POM to 4.x and to use the dependency version provided by the Bill of Materials
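The recommendation above can be sketched as a POM fragment; the BOM artifact line and version below are placeholders following the jenkinsci/bom usage pattern, not values verified against this plugin:

```xml
<!-- Sketch: import the Jenkins plugin BOM so library versions track the chosen core line -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.jenkins.tools.bom</groupId>
      <!-- pick the BOM line matching the plugin's jenkins.version (placeholder) -->
      <artifactId>bom-2.235.x</artifactId>
      <!-- placeholder; use a real BOM release version -->
      <version>BOM_RELEASE</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```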

Require a working example of executing multiline shell script

Describe your use-case which is not covered by existing documentation.

I need a recommended way to execute a multiline shell script.
The script can include multiple variable declarations and if/else statements.

I see that a single-line shell command can be executed using
sh [command]
However, it would be great if we had a standard way of executing multiline shell scripts.

If it already exists, I would appreciate if someone can point me to the documentation or provide example for reference here.
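Building on the script: | literal block documented in the README above, one untested sketch (assuming a Groovy sh step with a triple-quoted string is accepted inside a script block, like any other Snippet Generator step) might look like:

```yaml
pipeline:
  stages:
    - stage: "Multiline Shell"
      steps:
        script: |
          // hypothetical sketch: pass a Groovy triple-quoted string to sh
          sh '''
            NAME="world"
            if [ -n "$NAME" ]; then
              echo "hello $NAME"
            fi
          '''
```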

Reference any relevant documentation, other materials or issues/pull requests that can be used for inspiration.

No response

Are you interested in contributing to the documentation?

No response

stash and unstash

Can you please provide examples of stash and unstash usage? I tried the below, but there seems to be no way to declare them.

  agent:
    label: "Master"
  steps:
    - deleteDir()
    - unstash 'ucdPackage'
    - echo '1'
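Since any step generated by the Snippet Generator can be used in a steps definition (per the README above), one untested sketch would be to wrap stash and unstash in quoted step strings inside regular stages (the stage names and includes pattern here are made up):

```yaml
pipeline:
  agent:
    node:
      label: "Master"
  stages:
    - stage: "Stash"
      steps:
        - "stash(name: 'ucdPackage', includes: '**')"
    - stage: "Unstash"
      steps:
        - deleteDir()
        - "unstash 'ucdPackage'"
        - echo '1'
```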

Status of this project?

Hi everyone,

I just wanted to know what the status of this project is.

We (the company) will be moving to Jenkins in the near future and the devs have been looking at how to best migrate our existing pipelines (in Azure DevOps and TeamCity).

I understand that this project is currently in incubation but I haven't seen much activity on the repository (esp. Releases).
