dorny / paths-filter
Conditionally run actions based on files modified by PR, feature branch or pushed commits
License: MIT License
Hi dorny
I use the action as below, hoping it can detect changes to a specific file, but I found that it always runs the steps even when the file has not changed.
Is there anything I am missing? Thank you!
- uses: dorny/paths-filter@v2
  id: changes
  with:
    filters: |
      core:
        - 'code365scripts/*.psd1'
      teams:
        - 'code365scripts.teams/*.psd1'
      weixin:
        - 'code365scripts.weixin/*.psd1'
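One thing worth noting about snippets like the one above (a hedged observation, not a confirmed diagnosis): paths-filter only sets outputs; subsequent steps still run unless each one is guarded with an `if:` condition on the corresponding output. A sketch with a hypothetical guarded step:

```yaml
- uses: dorny/paths-filter@v2
  id: changes
  with:
    filters: |
      core:
        - 'code365scripts/*.psd1'
# Without an `if:` like this, the step runs regardless of the filter result
- name: Build core module   # hypothetical step name
  if: steps.changes.outputs.core == 'true'
  run: echo "core .psd1 changed"
```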
With a bigger codebase that merges PRs quite frequently, the default setting of the base parameter to the master branch defeats the purpose of this action a bit, as the diff will also include the new changes from master merged after you checked out your branch.
I've tried setting it to base: ${{ github.ref }}, which seems to work better as it checks the diff against the last commit, but the problem is the first time you push your branch: then there is no commit to diff against, so in that case making the base master makes sense. I'm just not sure how to express this condition in the workflow.
Maybe someone has some suggestions?
Edit: maybe a good solution would be to use the default master base, but have the diff only check the new changes from the branch, not what the branch is missing from master, as those changes are irrelevant and won't affect the build. Using something like git diff --name-only <ref>...HEAD
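The two-dot vs three-dot distinction mentioned above can be sketched in a throwaway repo (branch and file names are made up for the demo): `master...HEAD` diffs against the merge-base, so commits that landed on master after the branch diverged are not reported.

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git symbolic-ref HEAD refs/heads/master   # pin branch name regardless of git defaults
git config user.email demo@example.com && git config user.name demo
echo a > shared.txt && git add shared.txt && git commit -qm initial
git checkout -qb feature
echo b > feature.txt && git add feature.txt && git commit -qm "feature change"
git checkout -q master
echo c > master-only.txt && git add master-only.txt && git commit -qm "master change"
git checkout -q feature
echo "two-dot diff (includes master's unrelated change):"
git diff --name-only master..HEAD
echo "three-dot diff (merge-base; only the branch's own change):"
git diff --name-only master...HEAD
```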
When setting the base to the name of the current branch, it works correctly on future commits. But the first time the job runs (no commit since the branch was created), none of the files are detected as changed.
I need a way to run a job the first time a new branch is pushed, or if certain files in the branch have changed. Can this be done?
(I can potentially handle this in later jobs when I look for the changed output, but wanted to check if I was missing something.)
I have a requirement where a workflow needs to run a job only if a file is changed in a specific folder, so I used the snippet below in our GitHub Action.
steps:
  - uses: actions/checkout@v2
  - uses: dorny/paths-filter@v2
    # https://github.com/dorny/paths-filter
    id: should_we_deploy
    with:
      filters: |
        src:
          - 'code/xyz/**'
This fails when it runs on main (for PRs it works fine) with the error "Error: This action requires 'base' input to be configured or 'repository.default_branch' to be set in the event payload". However, the repository default branch is set, and it is main. The logs from the action:
Run dorny/paths-filter@v2
  with:
    filters: src:
      - 'code/xyz/**'
    token: ***********
    list-files: none
    initial-fetch-depth: 100
Get current git ref
  /usr/bin/git branch --show-current
  main
Error: This action requires 'base' input to be configured or 'repository.default_branch' to be set in the event payload
Can this be due to misconfiguration? Thanks
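A possible workaround while this is investigated (assuming the default branch really is main): pass the base input explicitly, so the action does not need to read repository.default_branch from the event payload:

```yaml
- uses: dorny/paths-filter@v2
  id: should_we_deploy
  with:
    base: main   # explicit base instead of relying on the event payload
    filters: |
      src:
        - 'code/xyz/**'
```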
Hey there!
I'm trying to use this action in a job called 'changes' which runs at the very start of my workflow. It works brilliantly across different steps in the same job, but as soon as I add an extra job that depends on the 'changes' job via the 'needs' keyword, the context object seems to be empty. I've even tried logging the needs object and it appears to be completely empty.
Here is my workflow file. I reckon I've overlooked something but hopefully you will be able to point me to it:
name: Build and Provide LPF artifacts
on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
  workflow_dispatch:
    inputs:
      project:
        description: 'deploy to project brain-lpf-<dev|tst|prd>'
        required: true
        default: 'brain-lpf-dev'
jobs:
  changes:
    name: Identify Changes
    runs-on: ubuntu-latest
    outputs:
      dataflow-artifacts-dags:
      dataflow-artifacts-tf:
      dataproc-artifacts:
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/paths-filter@v2
        name: Identify changes
        id: changes
        with:
          list-files: shell
          filters: |
            dataflow-artifacts-dags:
              - 'cloud-migration/airflow/**'
            dataflow-artifacts-tf:
              - 'cloud-migration/deployment/terraform/**'
            dataproc-artifacts:
              - 'on-premise/src/**'
      - name: Output Changes to DAG Files
        if: ${{ steps.changes.outputs.dataflow-artifacts-dags == 'true' }}
        run: echo ${{ steps.identify-changes.outputs.dataflow-artifacts-dags_files }}
      - name: Output Changes to TF Files
        if: ${{ steps.changes.outputs.dataflow-artifacts-tf == 'true' }}
        run: echo ${{ steps.identify-changes.outputs.dataflow-artifacts-tf_files }}
      - name: Output Changes to Dataproc Jobs Source Files
        if: ${{ steps.changes.outputs.dataproc-artifacts == 'true' }}
        run: echo ${{ steps.identify-changes.outputs.dataproc-artifacts_files }}
  upload-dataflow-artifacts:
    name: Upload Dataflow Artifacts
    runs-on: ubuntu-latest
    needs: changes
    env:
      DATAFLOW_ARTIFACTS_DATAFLOW_NAME: lpf-airflow
      DATAFLOW_ARTIFACTS_DATAFLOW_LOCATION: europe-west1
      DATAFLOW_ARTIFACTS_DAG_LOCAL_LOCATION: cloud-migration/airflow
      DATAFLOW_ARTIFACTS_DAG_REMOTE_LOCATION: dag
      DATAFLOW_ARTIFACTS_TF_LOCAL_LOCATION: cloud-migration/deployment/terraform
      DATAFLOW_ARTIFACTS_TF_REMOTE_LOCATION: data/terraform
    steps:
      - uses: actions/checkout@v2
      - name: Setup gcloud / gsutil
        uses: google-github-actions/setup-gcloud@master
        with:
          service_account_key: ${{ secrets.SA_TERRAFORM }}
          project_id: ${{ github.event.inputs.project }}
          export_default_credentials: false
      - name: Identify Dataflow GCS Bucket
        id: identify_airflow_bucket
        run: |-
          bucket_name=$(
            gcloud composer environments describe ${{ env.DATAFLOW_ARTIFACTS_DATAFLOW_NAME }} \
              --project=${{ github.event.inputs.project }} \
              --location=${{ env.DATAFLOW_ARTIFACTS_DATAFLOW_LOCATION }} \
              --format="get(config.dagGcsPrefix)" \
            | cut -d/ -f 1-3
          )
          echo "Dataflow bucket identified as '$bucket_name'"
          echo ::set-output name=DATAFLOW_ARTIFACTS_BUCKET::$bucket_name
      - name: Upload DAG Files
        if: ${{ needs.changes.outputs.dataflow-artifacts-dags }}
        run: |-
          from=${{ env.DATAFLOW_ARTIFACTS_DAG_LOCAL_LOCATION }}
          to=${{ steps.identify_airflow_bucket.outputs.DATAFLOW_ARTIFACTS_BUCKET }}/${{ env.DATAFLOW_ARTIFACTS_DAG_REMOTE_LOCATION }}
          echo "Uploading DAG Files from '$from' to '$to'"
          gsutil -m cp -r $from $to
      - name: Upload Terraform Files
        if: ${{ needs.changes.outputs.dataflow-artifacts-tf }}
        run: |-
          from=${{ env.DATAFLOW_ARTIFACTS_TF_LOCAL_LOCATION }}
          to=${{ steps.identify_airflow_bucket.outputs.DATAFLOW_ARTIFACTS_BUCKET }}/${{ env.DATAFLOW_ARTIFACTS_TF_REMOTE_LOCATION }}
          echo "Uploading Terraform Files from '$from' to '$to'"
          gsutil -m cp -r $from $to
  upload-dataproc-artifacts:
    name: Build and Upload LPF Dataproc Artifacts
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.dataproc-artifacts == 'true' }}
    env:
      DATAPROC_ARTIFACTS_BUCKET: ${{ github.event.inputs.project }}-dataproc-artifacts
      DATAPROC_ARTIFACTS_BUILD_PATH: on-premise
      MVN_CMD: mvn clean package -DskipIntegrationTests
    steps:
      - uses: actions/checkout@v2
      - uses: actions/cache@v1
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-jdk8-${{ hashFiles('**/pom.xml') }}
          restore-keys: ${{ runner.os }}-maven-jdk8-
      - name: Setup gcloud / gsutil
        uses: google-github-actions/setup-gcloud@master
        with:
          service_account_key: ${{ secrets.SA_TERRAFORM }}
          project_id: ${{ github.event.inputs.project }}
          export_default_credentials: true
      - name: Setup JDK 1.8
        uses: actions/setup-java@v1
        with:
          java-version: 1.8
      - name: Build Dataproc Artifacts
        run: |-
          cd ${{ env.DATAPROC_ARTIFACTS_BUILD_PATH }}
          eval ${{ env.MVN_CMD }}
        env:
          GOOGLE_APPLICATION_CREDENTIALS_FILE: google-key.json
          GOOGLE_APPLICATION_CREDENTIALS_CONTENT: ${{ secrets.SA_TERRAFORM }}
      - name: Extract Branch name for artifact upload
        id: extract_branch
        run: echo "##[set-output name=BRANCH;]$(echo ${GITHUB_REF#refs/heads/})"
      - name: Upload Dataproc Artifacts
        run: |-
          from=${{ env.DATAPROC_ARTIFACTS_BUILD_PATH }}/target/productfeed*.jar
          to=gs://${{ env.DATAPROC_ARTIFACTS_BUCKET }}/${{ steps.extract_branch.outputs.BRANCH }}
          echo "Uploading Dataproc Artifacts from '$from' to '$to'"
          gsutil -m cp -r $from $to
Thanks!
EDIT:
<job-id>:
  - name: log context
    env:
      NEEDS_CONTEXT: ${{ toJSON(needs) }}
    run: echo $NEEDS_CONTEXT
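For what it's worth, one detail that would explain an empty needs context (an observation about the workflow above, not a verified diagnosis): job outputs must be explicitly mapped to step outputs, and the keys under outputs: in the changes job have no values. A sketch of the mapping:

```yaml
jobs:
  changes:
    name: Identify Changes
    runs-on: ubuntu-latest
    outputs:
      # Each job output must reference a step output explicitly,
      # otherwise downstream jobs see empty values via `needs`
      dataflow-artifacts-dags: ${{ steps.changes.outputs.dataflow-artifacts-dags }}
      dataflow-artifacts-tf: ${{ steps.changes.outputs.dataflow-artifacts-tf }}
      dataproc-artifacts: ${{ steps.changes.outputs.dataproc-artifacts }}
```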
First let me say THANK YOU for the awesome Action! I am already finding it extremely useful!
The proposal is to add the ability to print the list of files that were created/updated/deleted.
This would provide some transparency which can be helpful:
This could be done by default (my preference) or could be opt-in via a new input/option.
@dorny Would you be willing to review/merge a PR implementing this feature?
Thanks a lot for this amazing action! We use it at CrownLabs@Polito
We had some action (1, 2) failing on pushes, this is the log (same on both):
Run dorny/[email protected]
with:
filters: webservice:
- 'webservice/**/*'
token: ***
/usr/bin/git fetch --depth=1 origin 0000000000000000000000000000000000000000
fatal: remote error: upload-pack: not our ref 0000000000000000000000000000000000000000
fatal: the remote end hung up unexpectedly
##[error]The process '/usr/bin/git' failed with exit code 128
This is our action implementation. Can you tell us if we are doing something wrong?
Thank you very much!
I would like to also pass the head ref as a parameter. I have several workflows that listen to workflow_dispatch and repository_dispatch, and I need to filter files by path given a head and a base ref.
There are some cases where I may want to use the same key for n+1 filters. For instance, you might have src/<group>/** and deploy/<group>/** directories where changes to the <group> directory in either location should trigger a build named <group>, which is also the filter key.
Run dorny/[email protected]
Error: duplicated mapping key at line 2, column 1:
gcf : 'src/gcf/**'
^
It would be nice if the output were a set-like array that loaded unique keys from the filter.
It looks like when the changes job runs after a PR is merged, it can't sort out git? It works just fine on PR branches.
Run dorny/[email protected]
Changes will be detected against commit (cda515396b1453bfb814819e48b795d234094c28)
Checking if commit for cda515396b1453bfb814819e48b795d234094c28 is locally available
/usr/bin/git cat-file -e cda515396b1453bfb814819e48b795d234094c28^{commit}
fatal: not a git repository (or any of the parent directories): .git
Fetching cda515396b1453bfb814819e48b795d234094c28 from origin
/usr/bin/git fetch --depth=1 --no-tags origin cda515396b1453bfb814819e48b795d234094c28
fatal: not a git repository (or any of the parent directories): .git
Error: The process '/usr/bin/git' failed with exit code 128
We have separate main and develop branches, and merge to main intermittently. On all merges to develop, the GitHub API reported changed files and the paths filter worked great.
On the merge to main, even though the PR page said 280 files were changed, the output from the paths filter said none were changed:
Ideally, I'd like these changed files to trigger tests. Any thoughts on what's going on here?
I have a question :)
If I understand correctly, specifying token will speed up diff calculation. I tried to use ${{ secrets.GITHUB_TOKEN }}, but it looks like the action still fetches the last commit. Here is how I use it:
- name: Detect changes
  uses: dorny/[email protected]
  id: changes
  with:
    token: ${{ secrets.GITHUB_TOKEN }}
    filters: |
      all: &all
        - .github/workflows/main.yml
        - CMakeLists.txt
        - CMake/**
      ...
Am I doing something wrong?
Hey, great work on this, I see you've discovered our usage of it at Sentry. We're trying to use a few negation filters together, as an example
backend:
- '!**/*.tsx'
- '!**/*.less'
However, this doesn't work as I had expected: it seems to OR each pattern, so if I had a commit where a tsx file changed, backend would still match due to the second pattern. Is this expected? Am I writing the filters incorrectly?
For some use-cases it might be useful to provide access to lists of changed (or added/deleted/modified) files via output parameters.
For these use-cases there is already the Changed Files Exporter action. The benefit of implementing it here is that it could also work for push events and provide different encoding options (JSON array, newline-delimited).
Hi @dorny,
We have a workflow containing multiple triggers:
Your paths-filter action works great for the first three use-cases, detecting the context and the base to compare against. However, it fails for scheduled events with the error This action requires 'base' input to be configured or 'repository.default_branch' to be set in the event payload.
Looking at where the error is thrown in your code, we could set a base, but I'm not sure how to set it in a way that would also work for the other triggers.
Any suggestions on how to handle this? For a scheduled build there is obviously no "base" to compare against, as it's just a specific commit that is used, but ideally it would fall back and handle this situation without raising an error?
Thanks,
Joe.
Would it be possible to return an array as an output to use in a matrix?
build:
  strategy:
    fail-fast: true
    matrix:
      package: {{steps.filter.outputs}} # This is not working because it is not an array but a map?
  if: ${{ matrix.package == 'true' }}
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v2
    - uses: dorny/paths-filter@v2
      id: filter
      with:
        base: 'main'
        filters: |
          shared:
            - 'packages/shared/**'
          template:
            - 'packages/template/**'
I'm getting a cannot unmarshal !!map into []interface {}
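If I read the docs right, the action exposes a changes output holding a JSON array of the filter names that matched, which can feed a matrix via fromJSON once it is exported as a job output; a sketch:

```yaml
jobs:
  filter:
    runs-on: ubuntu-latest
    outputs:
      packages: ${{ steps.filter.outputs.changes }}   # JSON array of matched filter keys
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          base: 'main'
          filters: |
            shared:
              - 'packages/shared/**'
            template:
              - 'packages/template/**'
  build:
    needs: filter
    strategy:
      fail-fast: true
      matrix:
        package: ${{ fromJSON(needs.filter.outputs.packages) }}
    runs-on: ubuntu-latest
    steps:
      - run: echo "building ${{ matrix.package }}"
```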
The main culprit seems to be this line:
Line 58 in ca8fa40
It causes git merge-base to fail with: Not a valid object name remotes/origin/<tag>
I'm trying to check for all backend files that are NOT markdown (.md) files and am having some trouble with the patterns.
The following pattern !src/share/services/**/**.md picks up changes to .github/workflows/ci.yml (the file this is in).
It seems like this negated pattern will match anything anywhere that is not a markdown file.
jobs:
  lint:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/paths-filter@v2
        id: changes
        with:
          list-files: 'json'
          filters: |
            backend:
              - 'src/share/services/**'
              - '!src/share/services/**/**.md'
Is it possible to do something like this via picomatch?
Thanks for the great action!
Hello,
Thank you for developing this great tool; it is very helpful in my current project.
But I have a little problem: I can't ignore folders which contain other folders. Maybe your tool has some option to exclude folders from a filter?
I mean a situation where the folder structure looks like:
Folder1:
  Folder11
  Folder12
  Folder13
And I want to ignore, for example, Folder12 while still checking Folder11 and Folder13. Maybe you know a solution for that case?
P.S. I'm trying to use the 'ignore' option from micromatch - '(BigFolder/**, {ignore: InsideFolder})' - but it does not work; the filter is just disabled and doesn't see any file changes :(
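One avenue worth trying (untested here, and the folder names are taken from the example above): picomatch's extglob syntax can exclude a single subfolder at one level, without relying on how this action combines negated patterns:

```yaml
filters: |
  watched:
    # extglob: any direct child of Folder1 except Folder12
    - 'Folder1/!(Folder12)/**'
```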
When I start my workflow manually with workflow_dispatch, I get the following warning, and changes are detected even though there weren't any: 'before' field is missing in PUSH event payload - changes will be detected from last commit
Run: https://github.com/Cyberbeni/install-swift-tool/runs/1432414699?check_suite_focus=true
Workflow: https://github.com/Cyberbeni/install-swift-tool/blob/70b350e0cafa467112631131ea9e64dc29f979b0/.github/workflows/update-dependencies.yml
Commit created because changes in 'dist/**' were detected: Cyberbeni/install-swift-tool@a68d9a3
There are quite a few cases where you need to provide a comma-separated list of files for CLI arguments or properties. So I think it would be good to be able to define this in the action, rather than doing it with sed.
When triggered via pull_request, the paths-filter always compares against the base branch of the PR. While this is fine for the first push, on subsequent pushes to the same PR it ignores the "base" option, so it will always detect the same file changes rather than just the most recent ones.
In our workflow, we provision a cloud environment when a PR is created. When there is a "synchronize" event from a later commit/push, we don't want to re-provision the environment unless the Kubernetes config files have changed.
Right now, paths-filter always detects the change and re-provisions on each commit/push. If I try switching from a pull_request trigger to a push trigger, it only sees the change in the latest commit rather than the last pushed commit (GitHub only has the github.payload.before defined for pull requests).
Is there any way to allow the "base" option to be respected for pull_request triggers, so we can point to the github.payload.before instead of always using the base branch of the PR?
For some special use-cases it might be viable to trigger different filter rules when a file is added, deleted, or modified.
See:
- the status field of git diff-index --name-status
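For reference, later releases of paths-filter added roughly this shape of syntax, where a pattern can be prefixed with the change types it should match (treat the exact spelling as an assumption and check the README):

```yaml
filters: |
  addedOrModified:
    - added|modified: 'src/**'
  deleted:
    - deleted: 'src/**'
```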
I am looking into how I can build a GitHub Actions pipeline for a monorepo using paths-filter.
paths-filter ensures that only the needed jobs are executed, and this works well for PRs / feature branches. As part of the pipeline we run tests on branches and PRs. Once a change gets merged into the master branch, we run a couple more tests before the changes are deployed to our staging system. This works fine if tests are not failing ;) but in case some tests fail, I would like the next run to somehow take all changes since the last "successful" commit to master into account.
Technically I can do it by using the commit SHA of the last successful commit as the base. But then I would have to change my workflow YAML every time tests finish successfully, which doesn't sound like a good idea.
Any other ideas how to solve this?
Thanks for this great GitHub Action ;)
So one of the ways we're using this is to find added/modified files to pass to our linter. We use list-files: shell since we pass this list of files to the linter. However, we'd also like to get the count of files so that we can apply different conditionals based on the number of changed files.
Any objections to adding the count to the action outputs?
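In the meantime, the count can be derived in a follow-up step; a sketch where FILES stands in for a hypothetical ${{ steps.filter.outputs.src_files }} produced by list-files: shell:

```shell
# `list-files: shell` yields a space-delimited, shell-escaped list,
# so a word count gives the number of changed files
FILES="src/a.py src/b.py docs/readme.md"   # stand-in for the real step output
COUNT=$(echo "$FILES" | wc -w)
echo "changed files: $COUNT"
```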
When I merge a PR, the filters work correctly, but when I run the same thing from a workflow_dispatch trigger, this action views all files as added. From the logs, I see No merge base found - all files will be listed as added even though the base is correctly resolved as main. This happens even when the number of commits in the PR is small, so I don't think increasing depth would fix this.
Here's the relevant snippet from my action:
- uses: dorny/paths-filter@v2
  id: changes
  with:
    filters: |
      settings:
        - 'es_settings/**'
- name: "Trigger bootstrap"
  if: steps.changes.outputs.settings == 'true'
  id: bootstrap
  uses: ./.github/actions/bootstrap
When run as workflow_dispatch, all files in es_settings
directory are flagged as added even though none of these files changed.
Matching files:
es_settings/<filename>.json [added]
...
es_settings/<other filename>.json [added]
Here's the run logs during workflow_dispatch
:
Changes will be detected against the branch main
Checking if commit for main is locally available
/usr/bin/git cat-file -e main^{commit}
fatal: Not a valid object name main^{commit}
Fetching main
/usr/bin/git fetch --depth=10 --no-tags origin main:main
From https://github.com/<repo name>
* [new branch] main -> main
* [new branch] main -> origin/main
Searching for merge-base with main
/usr/bin/git merge-base main HEAD
/usr/bin/git rev-list --count HEAD
1
/usr/bin/git rev-list --count main
46
/usr/bin/git fetch --deepen=10 --no-tags
From https://github.com/<repo name>
* [new branch] <branch name> -> origin/<branch name>
/usr/bin/git rev-list --count HEAD
25
/usr/bin/git rev-list --count main
14
No merge base found - all files will be listed as added
and here's the run logs when running from a merge:
Fetching list of changed files for PR#58 from Github API
Am I using this wrong during workflow_dispatch? It looks like the base of the main branch is inferred correctly, but no merge base is found?
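The log above shows the two histories being deepened in lockstep without ever meeting; a commonly suggested workaround (an assumption, not verified against this repository) is to fetch the full history before the filter runs, so a merge-base with main always exists:

```yaml
- uses: actions/checkout@v2
  with:
    fetch-depth: 0   # full history instead of a shallow clone
- uses: dorny/paths-filter@v2
  id: changes
  with:
    base: main
    filters: |
      settings:
        - 'es_settings/**'
```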
If I have a workflow event with trigger:
on:
  pull_request:
    types:
      - labeled
and my job looks like:
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/[email protected]
        id: filter
        with:
          filters: |
            migrations:
              - 'db/**/*'
      - name: deploy
        if: steps.filter.outputs.migrations == 'false'
        run: ...
The above deploy
step will always run, even if there are changes to the db
directory.
I believe this is because the pull_request
event that you're using in your GitHub Action doesn't support the labeled
type.
I offer to make two changes:
Thank you a lot for this action!
Could you add the ability to use rules recursively?
For example:
- name: Detect changes
  uses: dorny/[email protected]
  id: changes
  with:
    filters: |
      all:
        - '.github/workflows/main.yml'
        - 'CMakeLists.txt'
        - 'CMake/**'
      utils:
        - ${{ steps.changes.filters.all }}
        - 'Utils/**'
Such a rule would allow avoiding an extra check for the all rule in every step. This is just a simple example, but such a feature could help avoid duplicated paths in dependent filters (very useful for projects with complex dependencies).
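Until something like this exists, plain YAML anchors and aliases get close, since the filters input is parsed as YAML; the README documents this style of reuse (sketched here with the filter names from the example above):

```yaml
filters: |
  all: &all
    - '.github/workflows/main.yml'
    - 'CMakeLists.txt'
    - 'CMake/**'
  utils:
    - *all        # re-use the 'all' pattern list by alias
    - 'Utils/**'
```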
I would like to be able to set the default output if the list is empty.
Using the below as an example, I would like to be able to do two things:
with.default: [src, server]
with.outputAllFilters: true
or something like that.
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            backend:
              - 'backend/**'
            frontend:
              - 'frontend/**'
I'd like to run actions with the matrix option only when the matrix isn't empty, but I'm currently getting an error saying "Matrix vector does not contain any values".
Ideally, if the matrix vector didn't contain any values, the test job would not run.
My action looks like this:
jobs:
  changes:
    runs-on: ubuntu-latest
    outputs:
      # Expose matched filters as job 'services' output variable
      services: ${{ steps.filter.outputs.changes }}
    steps:
      # For pull requests it's not necessary to checkout the code
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            service1:
              - service1/**/*.py
              - service1/env.yml
            service2:
              - service2/**/*.py
              - service2/env.yml
  # JOB to test each of modified services
  test:
    needs: changes
    strategy:
      matrix:
        # Parse JSON array containing names of all filters matching any of changed files
        service: ${{ fromJson(needs.changes.outputs.services) }}
    runs-on: ubuntu-latest
    ...
Thanks in advance for your help!
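A guard that may help until there is native support (assuming an empty result serializes as the literal string '[]'): skip the job when the matched-filters array is empty:

```yaml
test:
  needs: changes
  if: ${{ needs.changes.outputs.services != '[]' }}   # skip when nothing matched
  strategy:
    matrix:
      service: ${{ fromJson(needs.changes.outputs.services) }}
  runs-on: ubuntu-latest
```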
The action does not work with act cli tool. Works just fine on GH.
strategy:
41 matrix:
42 stack: ${{ fromJson(needs.changes.outputs.stacks) }}
Error: yaml: unmarshal errors:
line 42: cannot unmarshal !!str `${{ fro...` into []interface {}
I'm working on refactoring the existing Apache Pulsar (apache/pulsar) build in my fork lhotari/pulsar.
This is the way I'm using paths-filter:
https://github.com/lhotari/pulsar/blob/ef23acb30548a37de648cc89965f854668975e95/.github/workflows/pulsar-ci.yaml#L32-L50
However, this fails after a few builds with this type of error:
Fetching list of changed files for PR#11 from Github API
Number of changed_files is 157
Invoking listFiles(pull_number: 11, page: 1, per_page: 100)
Error: API rate limit exceeded for installation ID 6407194.
2021-03-05T09:10:57.7553228Z ##[section]Starting: Request a runner to run this job
2021-03-05T09:10:58.4697507Z Can't find any online and idle self-hosted runner in current repository that matches the required labels: 'ubuntu-20.04'
2021-03-05T09:10:58.4697604Z Can't find any online and idle self-hosted runner in current repository's account/organization that matches the required labels: 'ubuntu-20.04'
2021-03-05T09:10:58.4697805Z Found online and idle hosted runner in current repository's account/organization that matches the required labels: 'ubuntu-20.04'
2021-03-05T09:10:58.7015644Z ##[section]Finishing: Request a runner to run this job
2021-03-05T09:11:05.2736491Z Current runner version: '2.277.1'
2021-03-05T09:11:05.2761183Z ##[group]Operating System
2021-03-05T09:11:05.2761887Z Ubuntu
2021-03-05T09:11:05.2762255Z 20.04.2
2021-03-05T09:11:05.2762584Z LTS
2021-03-05T09:11:05.2762925Z ##[endgroup]
2021-03-05T09:11:05.2763374Z ##[group]Virtual Environment
2021-03-05T09:11:05.2763851Z Environment: ubuntu-20.04
2021-03-05T09:11:05.2764295Z Version: 20210302.0
2021-03-05T09:11:05.2765039Z Included Software: https://github.com/actions/virtual-environments/blob/ubuntu20/20210302.0/images/linux/Ubuntu2004-README.md
2021-03-05T09:11:05.2766380Z Image Release: https://github.com/actions/virtual-environments/releases/tag/ubuntu20%2F
2021-03-05T09:11:05.2767122Z ##[endgroup]
2021-03-05T09:11:05.2768917Z ##[group]GITHUB_TOKEN Permissions
2021-03-05T09:11:05.2770175Z Actions: write
2021-03-05T09:11:05.2770932Z Checks: write
2021-03-05T09:11:05.2771460Z Contents: write
2021-03-05T09:11:05.2772083Z Deployments: write
2021-03-05T09:11:05.2772752Z Issues: write
2021-03-05T09:11:05.2773287Z Metadata: read
2021-03-05T09:11:05.2774019Z OrganizationPackages: write
2021-03-05T09:11:05.2774678Z Packages: write
2021-03-05T09:11:05.2775315Z PullRequests: write
2021-03-05T09:11:05.2775974Z RepositoryProjects: write
2021-03-05T09:11:05.2776698Z SecurityEvents: write
2021-03-05T09:11:05.2777277Z Statuses: write
2021-03-05T09:11:05.2777911Z ##[endgroup]
2021-03-05T09:11:05.2781103Z Prepare workflow directory
2021-03-05T09:11:05.3421642Z Prepare all required actions
2021-03-05T09:11:05.3430093Z Getting action download info
2021-03-05T09:11:05.7040658Z Download action repository 'dorny/paths-filter@v2'
2021-03-05T09:11:08.1104090Z ##[group]Run dorny/paths-filter@v2
2021-03-05T09:11:08.1104621Z with:
2021-03-05T09:11:08.1105303Z token: ***
2021-03-05T09:11:08.1106287Z filters: src:
# pattern syntax uses https://github.com/micromatch/picomatch
- '!((site2|deployment)/**)'
2021-03-05T09:11:08.1107127Z list-files: none
2021-03-05T09:11:08.1107678Z initial-fetch-depth: 10
2021-03-05T09:11:08.1108218Z env:
2021-03-05T09:11:08.1110259Z MAVEN_OPTS: -Dmaven.test.failure.ignore=true -DtestRetryCount=0 -Xmx1024m -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.http.retryHandler.class=standard -Dmaven.wagon.http.retryHandler.count=3
2021-03-05T09:11:08.1112038Z ##[endgroup]
2021-03-05T09:11:08.5138995Z ##[group]Fetching list of changed files for PR#11 from Github API
2021-03-05T09:11:08.5140857Z Number of changed_files is 157
2021-03-05T09:11:08.5331126Z Invoking listFiles(pull_number: 11, page: 1, per_page: 100)
2021-03-05T09:11:08.6419876Z ##[error]API rate limit exceeded for installation ID 6407194.
2021-03-05T09:11:08.6544385Z Evaluate and set job outputs
2021-03-05T09:11:08.6564810Z Cleaning up orphan processes
It seems that paths-filter exceeds the GitHub API rate limits when used in a large repository like apache/pulsar.
Ideally, paths-filter would use the GitHub API in a way that doesn't exceed the rate limits.
Is there any workaround for this problem?
I have reported a similar issue about test-reporter, dorny/test-reporter#68
If the repository is checked out into a subfolder of $GITHUB_WORKSPACE, our change detection via git diff-index will fail. Checking out to a subfolder is not a common scenario, but there are valid use cases for it; one example is checking out multiple repositories.
The solution would be to add a working-directory input parameter that lets the user specify the folder where change detection (i.e. the git commands) should be executed. This option won't have any impact on change detection for pull requests using the GitHub API.
Would it be possible to create a major version tag that constantly gets updated with each minor version like in actions/checkout: https://github.com/actions/checkout/releases? This would allow us to reference this action by just doing uses: dorny/paths-filter@v2
in our workflow YML files since it is a bit of a pain to constantly keep up with each minor version release.
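For context, the way maintainers usually implement such a floating major tag is to force-move it on each release; a throwaway-repo sketch (in a real repo the final step would be git push origin v2 --force):

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com && git config user.name demo
git commit -q --allow-empty -m "release v2.0.0"
git tag -a v2 -m "v2"            # initial major tag
git commit -q --allow-empty -m "release v2.0.1"
git tag -fa v2 -m "update v2"    # force-move the major tag to the new release
git rev-parse v2^{commit}        # now points at the latest release commit
```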
Pipeline fails with:
Run dorny/[email protected]
Get current git ref
/usr/bin/git branch --show-current
error: unknown option `show-current'
Line 168 in 78ab00f
git version 2.20.1
is used.
The same issue as for actions/checkout
: actions/checkout#121
G'day:
I am testing this action locally with act like so:
act -j deployToFirebase -v
My stack:
I am running into a problem, and here's the partial stack trace:
[Build and Deploy to Firebase/deployToFirebase] Exec command '[node /actions/dorny-paths-filter@v2/dist/index.js]'
[Build and Deploy to Firebase/deployToFirebase] ❗ ::error::Cannot read property 'startsWith' of undefined
[Build and Deploy to Firebase/deployToFirebase] ❌ Failure - dorny/paths-filter@v2
...
DEBU[0040] exit with `FAILURE`: 1
DEBU[0040] exit with `FAILURE`: 1
DEBU[0040] exit with `FAILURE`: 1
Error: exit with `FAILURE`: 1
My workflow file deployToFirebase.yml
looks like:
name: Build and Deploy to Firebase
on:
  push:
    branches: [ develop ]
jobs:
  deployToFirebase:
    runs-on: ubuntu-latest
    env:
      CI: ""
    steps:
      - name: Check out code
        uses: actions/checkout@master
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            frontend:
              - 'src/**'
            backend:
              - 'functions/**'
      # run only if 'frontend' (hosting) files were changed
      - name: frontend steps
        if: steps.filter.outputs.frontend == 'true'
        run: |
          echo "<<<<<<<<<<<<<<<<<<<<<<< frontend changed"
      # run only if 'backend' (functions) files were changed
      - name: functions steps
        if: steps.filter.outputs.backend == 'true'
        run: |
          echo ">>>>>>>>>>>>>>>>>>>>>>> functions changed"
What am I not doing right? Or is there anything special I have to do in order to use this action?
I am also open to suggestions on how else to test locally, to save the hassle of first pushing the code to GitHub and then waiting for the workflow result.
This is a follow on issue to #90 (I wasn't able to reopen my issue closed by someone else because I'm not a collaborator, I think).
Fix #91 worked well for me when using a short name, but it breaks when I use a fully qualified ref. I think that the filter logic here breaks down when shortName is fully qualified; e.g., if shortName == 'refs/remotes/origin/{branch_name}', then match[1] == '{branch_name}' when match !== null.
I think that this can be fixed by modifying getShortName
to also strip 'refs/remotes/origin'
from the ref
.
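A minimal sketch of the proposed fix (the function name comes from the issue; which prefixes the real implementation should strip is my assumption):

```typescript
// Strip known ref prefixes so fully qualified refs reduce to a short branch name.
function getShortName(ref: string): string {
  if (!ref) return ref
  for (const prefix of ['refs/heads/', 'refs/tags/', 'refs/remotes/origin/']) {
    if (ref.startsWith(prefix)) return ref.slice(prefix.length)
  }
  return ref
}

console.log(getShortName('refs/remotes/origin/feature-x')) // feature-x
console.log(getShortName('refs/heads/main'))               // main
console.log(getShortName('main'))                          // main
```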
Assuming I use the example below, is there a way I can dynamically get the keys (frontend, backend) of with.filters?
Something like steps.filters.keys would output '[frontend, backend]'
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            backend:
              - 'backend/**'
            frontend:
              - 'frontend/**'
When you have an action triggered by a push to a certain branch, say my_branch, and another similarly named branch exists, say another/my_branch, getLocalRef will return another/my_branch instead of my_branch, as it shows up first in the list of matching refs pulled by getLocalRef:
const output = (await exec('git', ['show-ref', shortName], {ignoreReturnCode: true})).stdout
const refs = output.split(/\r?\n/g)
  .map(l => l.match(/refs\/.*$/)?.[0] ?? '')
  .filter(l => l !== '')
Prefixing branches with the format {some_prefix}/{branch_name} causes some trouble with show-ref, as they show up as refs for {branch_name}. Annoyingly, some bots will autogenerate branches with this pattern; e.g., the changesets action has a bot that does such a thing (see the example below).
This was caused by the changeset bot:
I'm not sure what the right solution is here; I'd be happy to work together to figure it out, but I'm not an expert in actions or git in general. It does appear to be undesirable behavior that can get by you rather silently. I think you can solve this in some cases by explicitly passing the ref in the inputs, but does it make sense to fail if remoteRef.length > 1 in getLocalRef?
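The ambiguity is reproducible with plain git: show-ref matches a short name against every ref whose trailing path components match, while --verify with a fully qualified ref is unambiguous. A throwaway-repo sketch:

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com && git config user.name demo
git commit -q --allow-empty -m init
git branch my_branch
git branch another/my_branch
echo "short name matches BOTH branches (tail-component matching):"
git show-ref my_branch
echo "fully qualified ref with --verify matches exactly one:"
git show-ref --verify refs/heads/my_branch
```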
I'm unable to get my changes job, which uses paths-filter within it, to run. It always fails with the following error:
My workflow looks like this:
name: Deploy API server
on:
  push:
    branches: [master]
jobs:
  changes:
    runs-on: ubuntu-latest
    # Set job outputs to values from filter step
    outputs:
      apiServer: ${{ steps.filter.outputs.apiServer }}
    steps:
      - name: Checkout source code
        uses: actions/checkout@v1
      - name: Check for changes
        uses: dorny/[email protected]
        id: filter
        with:
          filters: |
            apiServer:
              - 'packages/api-server/**/*'
  tests:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: packages/api-server
    if: ${{ needs.changes.outputs.apiServer == 'true' }}
    steps:
      - name: Checkout source code
        uses: actions/checkout@v1
      - name: Use Node.js 12.x
        uses: actions/setup-node@v1
        with:
          node-version: "12.x"
      - name: Install dependencies
        run: yarn
      - name: Tests
        run: yarn test
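One likely cause — a hedged guess, since the actual error text isn't quoted here: the tests job reads needs.changes.outputs.apiServer in its if: without declaring a dependency on the changes job, and GitHub Actions only populates needs.* for jobs listed under needs:. A sketch of that change:

```yaml
  tests:
    needs: changes   # required so needs.changes.outputs.* is populated
    if: ${{ needs.changes.outputs.apiServer == 'true' }}
    runs-on: ubuntu-latest
```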
I don't know if it's paths-filter that's causing it, but I thought I'd open this issue just in case it helps.
Does this action allow me to run a job only if the changed files are in src and nowhere else?
P.S. I am new to GitHub Actions...
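One hedged way to express "in src and nowhere else" is to define two filters — one matching src, one matching everything outside it — and require the first to be true and the second false (filter names here are illustrative):

```yaml
- uses: dorny/paths-filter@v2
  id: filter
  with:
    filters: |
      src:
        - 'src/**'
      other:
        - '!(src/**)'
# run only when something in src changed and nothing outside src did
- if: steps.filter.outputs.src == 'true' && steps.filter.outputs.other == 'false'
  run: echo "only src changed"
```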
For example, I have a targets array in my matrix. Then I use this action to detect changes, with filters named exactly the same as the targets:
jobs:
  build:  # hypothetical job id
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [windows-latest, ubuntu-latest]
        target: [first, second]
    steps:
      - name: Detect changes
        uses: dorny/[email protected]
        id: changes
        with:
          filters: |
            first:
              - /some/path/**
            second:
              - some/another/path/**
Is there a way to check whether the matrix target was filtered? Something like the following:
- name: My Action
  if: contains(steps.changes.outputs, matrix.target)
  run: echo "test"
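If the action version in use exposes the `changes` output — an assumption about the version; it is a JSON array of the filter names that matched — something close to the following sketch may work:

```yaml
- name: My Action
  # 'changes' is a JSON array such as '["first"]'; contains() does a
  # substring check here, which is fine for distinct target names
  if: contains(steps.changes.outputs.changes, matrix.target)
  run: echo "test"
```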
I have looked at the examples in the readme, but I can't get this to work:
There's a dir called Deploy in my project, with each file inside named after an Azure resource I want to deploy something to, and the contents being the version I want to deploy. Something like:
./Deploy/ase-something-dev
./Deploy/ase-something-staging
./Deploy/ase-something-prod
I want to use paths-filter to conditionally run some steps, including the deployment. As I saw that list-files is possible I am trying to use it to configure a matrix as such:
# on, env omitted
jobs:
  detect-changes:
    runs-on: ubuntu-latest
    outputs:
      deploy: ${{ steps.filter.outputs.deploy }}
      application: ${{ steps.filter.outputs.application }}
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            deploy:
              - 'Deploy/**'
            application:
              - '!(Deploy/**)'
          list-files: json
  # some other jobs omitted
  deploy:
    needs: [detect-changes, some-other-step]
    if: # omitted
    runs-on: ubuntu-latest
    strategy:
      max-parallel: 1
      matrix:
        # is it correct to expect the 'deploy_files' output here? And should it be valid json?
        environment: ${{ fromJSON(needs.detect-changes.outputs.deploy_files) }}
    steps:
      # I would expect this to output 'current environment is ./Deploy/ase-something-dev', etc.
      - run: echo "current environment is ${{ matrix.environment }}"
My workflow is failing at the deploy job with Error when evaluating 'strategy' for job 'deploy'. (Line: 101, Col: 22): Error reading JToken from JsonReader. Path '', line 0, position 0.,(Line: 101, Col: 22): Unexpected value ''
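A hedged guess at the failure: job outputs are not passed through automatically, and the detect-changes job above only maps deploy and application, so needs.detect-changes.outputs.deploy_files evaluates to an empty string. The deploy_files output (the name paths-filter gives the file list of the deploy filter when list-files is set) would also need to be exposed:

```yaml
  detect-changes:
    outputs:
      deploy: ${{ steps.filter.outputs.deploy }}
      application: ${{ steps.filter.outputs.application }}
      # also expose the JSON file list produced by 'list-files: json'
      deploy_files: ${{ steps.filter.outputs.deploy_files }}
```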
It would be useful if this could filter based on changes to HEAD. I've run into patterns where I want to send dispatches to start other workflows for syncing translations or uploading changed assets. But those jobs cannot use this action, because it fails with the error: This action can be triggered only by pull_request or push event
Thanks for the work!
Hello, first of all, I'm grateful to you for developing this action. I have a problem: I'm trying to use this action in my workflow, but I get an error during the step "Searching for merge-base with master", immediately after launching the build. I started the build from my branch, and I need to know whether the files in my branch after the commit differ from the files in master. Looking at the screenshot, git can't find a command like "no-auto-gc". Maybe you have seen this error before and know how to fix it?
P.S. Builds without this action work fine.
P.P.S. I can't find any information about this command.
Would it be possible to add an option to get the list of files shell-escaped but unquoted?
Right now, when filtering inside a pull_request workflow, the list of changes will be all changes inside the pull request.
Ideally, the filter would run only on the changes made between the last push to the PR and the current push. I'm aware that rebasing and force-pushing will screw things up (in that case I would take the whole changeset of the PR as the filter). But for small changes to the PR, I don't want to trigger builds of all components if only one is affected.
Is there a way to do this already?
Christian
Hello,
We have a microservice architecture operating in a mono-repo configuration.
I would like to create one GitHub Action to lint/test/publish/... only the service that has changed.
I thought of using list-files: shell, then consuming the result by iterating through the list and executing a command for each path. Something along the lines of for (service in list-files); cd service && golangci-lint run.
We'll only be changing one service at a time, so most likely the list will have just one value.
The workflow correctly detects which services have been changed, but I'm not able to consume the list correctly.
Any help is much appreciated.
Thanks,
My repo looks like this:
.
├── .github
│   └── workflows
│       └── go.yml
├── README.md
├── go.mod
└── server
    ├── serviceone
    │   └── main.go
    ├── servicetwo
    │   └── main.go
    └── servicethree
        └── main.go
My go.yml looks like:
name: Lint Go code
on: [push, pull_request]
jobs:
  Lint:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v2
      - name: Set up Go
        uses: actions/setup-go@v2
        with:
          go-version: 1.15
      - uses: dorny/paths-filter@v2
        id: changes
        with:
          filters: |
            server:
              - 'server/**'
          list-files: shell
      - name: Printing file changes
        run: |
          for val in ${steps.changes.outputs[@]}; do echo $val; done
I'm getting the following result after GitHub Actions runs my workflow:
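For what it's worth, the loop above interpolates ${steps.changes.outputs[@]} as a shell variable, which the shell sees as empty; step outputs have to be expanded with the ${{ }} expression syntax instead. A sketch of a working variant, assuming the output name server_files (derived from the server filter plus list-files: shell):

```yaml
- name: Printing file changes
  run: |
    # ${{ }} is expanded by the Actions runner before the shell runs,
    # so the space-separated, shell-escaped list becomes ordinary words
    for val in ${{ steps.changes.outputs.server_files }}; do
      echo "$val"
    done
```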