treosh / lighthouse-ci-action

Audit URLs using Lighthouse and test performance with Lighthouse CI.

License: MIT License

JavaScript 96.79% Python 2.46% HTML 0.75%
lighthouse github-actions performance-budget lighthouse-ci

lighthouse-ci-action's Introduction

Lighthouse CI Action

Audit URLs using Lighthouse and test performance with Lighthouse CI.

This action integrates Lighthouse CI with the GitHub Actions environment, making it simple to see failed tests, upload results, run jobs in parallel, store secrets, and interpolate env variables.

It is built in collaboration between the Lighthouse team, Treo (a web performance monitoring company), and many excellent contributors.

Features:

  • ✅ Audit URLs using Lighthouse v11
  • 🎯 Test performance with Lighthouse CI assertions or performance budgets
  • 😻 See failed results in the action interface
  • 💾 Upload results to a private LHCI server, Temporary Public Storage, or as artifacts
  • ⚙️ Full control over Lighthouse CI config
  • 🚀 Fast action initialization (less than 1 second)

Lighthouse CI Action

Example

Run Lighthouse on each push to the repo, test performance budget, save results as action artifacts.

Create .github/workflows/main.yml with the list of URLs to audit using Lighthouse.

name: Lighthouse CI
on: push
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Audit URLs using Lighthouse
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: |
            https://example.com/
            https://example.com/blog
          budgetPath: ./budget.json # test performance budgets
          uploadArtifacts: true # save results as action artifacts
          temporaryPublicStorage: true # upload Lighthouse report to temporary storage

Describe your performance budget in a budget.json file (resource sizes are expressed in kilobytes).

[
  {
    "path": "/*",
    "resourceSizes": [
      {
        "resourceType": "document",
        "budget": 18
      },
      {
        "resourceType": "total",
        "budget": 200
      }
    ]
  }
]
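As a sanity check, here is a rough JavaScript sketch of what this budget asserts (the measured sizes are made up for illustration; resourceSizes budgets are expressed in kilobytes):

```javascript
// What the budget above asserts, sketched in plain JavaScript.
// resourceSizes budgets are in kilobytes: "document": 18 allows up to 18 KB.
const budget = {
  path: '/*',
  resourceSizes: [
    { resourceType: 'document', budget: 18 },
    { resourceType: 'total', budget: 200 },
  ],
};

// hypothetical measured transfer sizes, in bytes
const measured = { document: 15 * 1024, total: 180 * 1024 };

const overBudget = budget.resourceSizes.filter(
  ({ resourceType, budget: kb }) => measured[resourceType] > kb * 1024
);
console.log(overBudget.length === 0 ? 'within budget' : 'over budget'); // → within budget
```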

โš™๏ธ See this workflow in use

Recipes

Run Lighthouse and validate against Lighthouse CI assertions.

Create .github/workflows/main.yml with the list of URLs to audit and identify a lighthouserc file with configPath.

main.yml

name: Lighthouse
on: push
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Lighthouse on urls and validate with lighthouserc
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: 'https://exterkamp.codes/'
          configPath: './lighthouserc.json'

Make a lighthouserc.json file with LHCI assertion syntax.

lighthouserc.json

{
  "ci": {
    "assert": {
      "assertions": {
        "first-contentful-paint": ["error", { "minScore": 0.6 }]
      }
    }
  }
}
Lighthouse CI Action: test Lighthouse assertions

โš™๏ธ See this workflow in use

Upload results to a private LHCI server.

Create .github/workflows/main.yml with the list of URLs to audit using Lighthouse, and identify a serverBaseUrl to upload to and a serverToken to use.

Note: use GitHub secrets to keep your server address hidden!

main.yml

name: Lighthouse
on: push
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Lighthouse on urls and upload data to private lhci server
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: 'https://example.com/'
          serverBaseUrl: ${{ secrets.LHCI_SERVER_URL }}
          serverToken: ${{ secrets.LHCI_SERVER_TOKEN }}
Lighthouse CI Action: Upload results to a private server

โš™๏ธ See this workflow in use

Audit with custom Chrome options and custom Lighthouse config.

Create .github/workflows/main.yml with the list of URLs to audit and identify a lighthouserc file with configPath.

main.yml

name: Lighthouse
on: push
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Lighthouse on urls with lighthouserc
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: 'https://example.com/'
          configPath: './lighthouserc.json'

Chrome flags can be set directly in the lighthouserc's collect section.

lighthouserc.json

{
  "ci": {
    "collect": {
      "numberOfRuns": 1,
      "settings": {
        "chromeFlags": "--disable-gpu --no-sandbox --no-zygote"
      }
    }
  }
}

A custom Lighthouse config can be defined in a separate file using the custom Lighthouse config syntax. That file is then referenced from the lighthouserc file via configPath.

lighthouserc.json

{
  "ci": {
    "collect": {
      "numberOfRuns": 1,
      "settings": {
        "configPath": "./lighthouse-config.js"
      }
    }
  }
}

Then put all the custom Lighthouse config in the file referenced in the lighthouserc.

lighthouse-config.js

module.exports = {
  extends: 'lighthouse:default',
  settings: {
    emulatedFormFactor: 'desktop',
    audits: [{ path: 'metrics/first-contentful-paint', options: { scorePODR: 800, scoreMedian: 1600 } }],
  },
}

โš™๏ธ See this workflow in use

Test a static site without having to deploy it.

Create .github/workflows/main.yml and identify a lighthouserc file with a staticDistDir config.

main.yml

name: Lighthouse
on: push
jobs:
  static-dist-dir:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Lighthouse against a static dist dir
        uses: treosh/lighthouse-ci-action@v11
        with:
          # no urls needed, since it uses local folder to scan .html files
          configPath: './lighthouserc.json'

lighthouserc.json

{
  "ci": {
    "collect": {
      "staticDistDir": "./dist"
    }
  }
}

Inside your staticDistDir there should be the HTML files that make up your site. LHCI will run a simple static web server to host the files, then audit each of them. More details on this process are in the Lighthouse CI docs.
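A rough sketch of that URL discovery step (the base URL and mapping are illustrative, not LHCI's exact behavior):

```javascript
// Each .html file under staticDistDir becomes a URL on LHCI's temporary
// local server; other assets are served but not audited.
function urlsForDistDir(files, baseUrl = 'http://localhost:9000') {
  return files.filter((f) => f.endsWith('.html')).map((f) => `${baseUrl}/${f}`);
}

console.log(urlsForDistDir(['index.html', 'about.html', 'main.js']));
// only the two .html files become audit URLs
```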

Lighthouse CI Action: Test a static site without having to deploy it

โš™๏ธ See this workflow in use

Integrate Lighthouse CI with Netlify

It waits for Netlify to finish building a preview and then audits the built version. Hence, the recipe is a composition of two actions: Wait for Netlify Action and Lighthouse CI Action.

name: Lighthouse CI for Netlify sites
on: pull_request
jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
      - name: Wait for the Netlify Preview
        uses: jakepartusch/wait-for-netlify-action@v1
        id: netlify
        with:
          site_name: 'gallant-panini-bc8593'
      - name: Audit URLs using Lighthouse
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: |
            ${{ steps.netlify.outputs.url }}
            ${{ steps.netlify.outputs.url }}/products/
          budgetPath: ./budget.json
          uploadArtifacts: true

โš™๏ธ See this workflow in use

Use URLs interpolation to pass secrets or environment variables

URLs support interpolation of process env variables so that you can write URLs like:

name: Lighthouse CI
on: push
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Lighthouse and test budgets
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: |
            https://pr-$PR_NUMBER.staging-example.com/
            https://pr-$PR_NUMBER.staging-example.com/blog
          budgetPath: ./budgets.json
          temporaryPublicStorage: true
        env:
          PR_NUMBER: ${{ github.event.pull_request.number }}

โš™๏ธ See this workflow in use

Use with a Lighthouse plugin.

Combine the field performance plugin with Github Actions.

main.yml

name: Lighthouse CI with a plugin
on: push
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm install # install dependencies, that includes Lighthouse plugins
      - name: Audit URLs with Field Performance Plugin
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: |
            https://www.example.com/
          configPath: '.lighthouserc.json'
          temporaryPublicStorage: true

lighthouserc.json

{
  "ci": {
    "collect": {
      "settings": {
        "plugins": ["lighthouse-plugin-field-performance"]
      }
    }
  }
}

Add a plugin as a dependency, so it's installed locally:

package.json

{
  "devDependencies": {
    "lighthouse-plugin-field-performance": "^2.0.1"
  }
}

โš™๏ธ See this workflow in use

Use `output` for a powerful composition with other actions

main.yml

# Example of output usage
name: LHCI-output-webhook
on: push
jobs:
  output-webhook:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Use output for sending data to API.
        id: LHCIAction
        uses: ./ # within this repo; use treosh/lighthouse-ci-action@v11 in yours
        with:
          urls: |
            https://treo.sh/
      - name: Webhook
        uses: denar90/[email protected]
        with:
          webhookUrl: ${{ secrets.ACTION_WEBHOOK_URL }}
          data: '{ "links": ${{ steps.LHCIAction.outputs.links }}, "manifest": ${{ steps.LHCIAction.outputs.manifest }} }'

โš™๏ธ See this workflow in use

GitHub Action workflow on self-hosted GitHub runner (e.g. on-premise)

main.yml

name: Lighthouse CI
on: push
jobs:
  lighthouse:
    runs-on: [self-hosted, your-custom-label]
    steps:
      - uses: actions/checkout@v4
      - name: Install Node.js
        uses: actions/setup-node@v3
        with:
          node-version: ${{YOUR_REQUIRED_NODE_JS_VERSION}}

      - uses: browser-actions/setup-chrome@latest

      - run: chrome --version

      - name: Audit URLs using Lighthouse
        uses: treosh/lighthouse-ci-action@v11
        with:
          urls: |
            https://example.com/
            https://example.com/blog
        [...]

Learn more about hosted runners

Dynamically generate URLs

Use github-script or any other means to dynamically generate the list of URLs to test.

main.yml

jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Generate URLs
        id: urls
        uses: actions/github-script@v6
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            const globber = await glob.create('elements/*/demo/*.html');
            const files = await globber.glob();
            const urls = files
              .map(x => x.match(/([\w-]+)/)[1])
              .map(x => `${${{ env.DOMAIN }}}/components/${x}/demo/`)
              .join('\n');
            core.setOutput('urls', urls);

      - name: Lighthouse CI Action
        id: lighthouse
        uses: treosh/lighthouse-ci-action@v8
        with:
          urls: |
            ${{ steps.urls.outputs.urls }}

Explore more workflows in public examples. Submit a pull request with a new one if they don't cover your use case.

Inputs

urls

Provide the list of URLs separated by a new line. Each URL is audited using the latest version of Lighthouse and Chrome preinstalled on the environment.

urls: |
  https://example.com/
  https://example.com/blog
  https://example.com/pricing

uploadArtifacts (default: false)

Upload Lighthouse results as action artifacts to persist them. This is equivalent to saving the results with actions/upload-artifact, without the extra action steps.

uploadArtifacts: true

uploadExtraArgs

Add extra args to the upload command.

uploadExtraArgs: "--extraHeaders.Authorization='Bearer X92sEo3n1J1F0k1E9' --extraHeaders.Foo='Bar'"

temporaryPublicStorage (default: false)

Upload reports to the temporary public storage.

Note: As the name implies, this is temporary and public storage. If you're uncomfortable with the idea of your Lighthouse reports being stored on a public URL on Google Cloud, use a private LHCI server. Reports are automatically deleted 7 days after upload.

temporaryPublicStorage: true

budgetPath

Use a performance budget to keep your page size in check. Lighthouse CI Action will fail the build if one of the URLs exceeds the budget.

Learn more about the budget.json spec and practical use of performance budgets.

budgetPath: ./budget.json

runs (default: 1)

Specify the number of runs to do on each URL.

Note: Asserting against a single run can lead to flaky performance assertions. Use 1 only for static audits like page size or performance budgets.

runs: 3

configPath

Set a path to a custom lighthouserc file for full control of the Lighthouse environment and assertions.

Use lighthouserc to configure the collection of data (via Lighthouse config and Chrome Flags), and CI assertions (via LHCI assertions).

configPath: ./lighthouserc.json

If some settings aren't provided via action inputs, they are read from the config file specified here.

serverBaseUrl

Upload Lighthouse results to a private LHCI server by specifying both serverBaseUrl and serverToken. This will replace uploading to temporary-public-storage.

serverBaseUrl: ${{ secrets.LHCI_SERVER_BASE_URL }}
serverToken: ${{ secrets.LHCI_SERVER_TOKEN }}

Note: Use Github secrets to keep your token hidden!

basicAuthUsername basicAuthPassword

Lighthouse CI servers can be protected with basic authentication. Specifying both basicAuthUsername and basicAuthPassword will authenticate the upload.

basicAuthUsername: ${{ secrets.LHCI_SERVER_BASIC_AUTH_USERNAME }}
basicAuthPassword: ${{ secrets.LHCI_SERVER_BASIC_AUTH_PASSWORD }}

Note: Use Github secrets to keep your username and password hidden!

Outputs

Use outputs to compose results of the LHCI Action with other Github Actions, like webhooks, notifications, or custom assertions.

resultsPath

A path to .lighthouseci results folder:

/Users/lighthouse-ci-action/.lighthouseci

links

A JSON string with links to the uploaded results:

{
  'https://treo.sh/': 'https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1593981455963-59854.report.html',
  ...
}

assertionResults

A JSON string with assertion results:

[
  {
    name: 'maxNumericValue',
    expected: 61440,
    actual: 508455,
    values: [508455],
    operator: '<=',
    passed: false,
    auditProperty: 'total.size',
    auditId: 'resource-summary',
    level: 'error',
    url: 'https://treo.sh/',
    auditTitle: 'Keep request counts low and transfer sizes small',
    auditDocumentationLink: 'https://developers.google.com/web/tools/lighthouse/audits/budgets',
  },
  ...
]
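A sketch of consuming this output in a follow-up github-script or Node step; the literal array below mirrors the example above, where a real workflow would inject ${{ steps.<id>.outputs.assertionResults }} instead:

```javascript
// Filter the parsed assertionResults down to failures and report them.
const assertionResults = [
  {
    name: 'maxNumericValue',
    expected: 61440,
    actual: 508455,
    passed: false,
    auditId: 'resource-summary',
    url: 'https://treo.sh/',
  },
];

const failures = assertionResults.filter((r) => !r.passed);
for (const f of failures) {
  console.error(`${f.url} ${f.auditId}: expected ${f.expected}, got ${f.actual}`);
}
console.log(`${failures.length} failed assertion(s)`); // → 1 failed assertion(s)
```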

manifest

A JSON string with report results (LHCI docs reference):

[
  {
    "url": "https://treo.sh/",
    "isRepresentativeRun": true,
    "htmlPath": "/Users/lighthouse-ci-action/.lighthouseci/treo_sh-_-2020_07_05_20_37_18.report.html",
    "jsonPath": "/Users/lighthouse-ci-action/.lighthouseci/treo_sh-_-2020_07_05_20_37_18.report.json",
    "summary": { "performance": 0.99, "accessibility": 0.98, "best-practices": 1, "seo": 0.96, "pwa": 0.71 }
  }
]

Credits

Sponsored by Treo and Google.

lighthouse-ci-action's People

Contributors

alekseykulikov, arnellebalane, bennypowers, bep, coliff, connorjclark, cqueern, denar90, dependabot[bot], djiit, emmekappa, exterkamp, gregjopa, gvdp, kaidoering, kiwiupover, kobe, koddsson, masup9, mbj36, nschonni, orta, paulirish, riccardogiorato, rogerfernandes, simpixelated, yashtotale


lighthouse-ci-action's Issues

[Previews] Netlify integration

Nice to have: a check for non-production (preview) URLs. Netlify has this feature; it builds the user's site for each branch/PR a user or collaborator pushes. The user would then see perf results for the new changes in the PR using the same action.

PoC PR - #30

The main problem is race conditions: Netlify's deploy can take longer than it takes the action to start checking, so we need to wait for Netlify to finish before running the action.

I've researched a bit, and it looks like it's tricky to wait for a Netlify build to finish.

Q: on a pr -> run a node script -> script calls the github checks API -> creates a NEW completed check with a preview url.
A: Actions currently doesn't support the behavior you're describing. On the other hand, a GitHub App could do the kind of thing that you're talking about by updating the details_url value on the check run that it creates.
https://github.community/t5/GitHub-Actions/Use-action-to-deploy-and-return-a-preview-URL/td-p/29214

I think that can be a bit complicated for a PoC. Let's just ping the preview URL for N minutes (we can even add an option for that, in case of really long builds) until it returns 200.

Thoughts?
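A sketch of the ping-until-200 idea (assumes a runtime with a global fetch, e.g. Node 18+; the timings are illustrative):

```javascript
// Poll the preview URL until it returns 200 or the timeout expires.
async function waitForPreview(url, { timeoutMs = 300000, intervalMs = 10000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const res = await fetch(url);
      if (res.status === 200) return true; // deploy is live
    } catch {
      // not reachable yet, keep polling
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false; // give up after the timeout
}
```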

setting configPath leads to "Cannot find module"

the configPath is loaded relative to the action's own source code.

In this commit ChromeDevTools/debugger-protocol-viewer@27154fb ...

you can see configPath: ./.lighthouserc.js

I was attempting to reference the .lighthouserc.js you see in the same commit.
But it failed to find it, probably because these lines expect a path that resolves from the cwd:

if (configPath) {
  const rcFileObj = loadRcFile(configPath)

The readme has the example of configPath: ./lighthouserc.json so i figured it would be read local to my checkout instead of local to the action's source.

some path.resolve() should sort this out?

Report viewer error

Hi all. I have an error when trying to view the report. After I'm redirected to the report page from GitHub, I see the following error in the browser console: "Uncaught (in promise) DOMException: Failed to execute 'setItem' on 'Storage': Setting the value of 'lastCompareReport' exceeded the quota." with an error trace. Here is a screenshot of the console: http://i.imgur.com/W9GzJMo.png
I'm using Chrome to open the report pages, but the result is the same in Firefox and Chrome incognito.
Any help is appreciated, thx.

default of `runs: 1` overrides a config's numberOfRuns

Looked into this, and the CLI arg is set regardless; the CLI prioritizes its args above anything in the config (which makes sense IMO).

So we shouldn't set the CLI numberOfRuns if it was just falling back to the action's default of 1.

There's an alternative though: the action could just default to the CLI's default (which is 3 runs). If we did that, you could just delete some code and this problem would work itself out. ;)
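The first suggestion could be sketched like this (function and input names are hypothetical, not the action's actual internals):

```javascript
// Only forward numberOfRuns to the LHCI CLI when the `runs` input was set
// explicitly, so a config's numberOfRuns can take effect.
function buildCollectArgs(runsInput) {
  const args = [];
  if (runsInput !== undefined && runsInput !== '') {
    args.push(`--numberOfRuns=${Number(runsInput)}`);
  }
  return args; // empty → the CLI falls back to lighthouserc or its own default
}
```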

Puppeteer support

My workflow links a config file that in turn links a puppeteerScript. This works locally when running lhci with puppeteer installed globally, but I've tried and failed to get it running with this action. I tried a regular npm dependency as well as the action ianwalter/[email protected].

// workflow.yml
jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps: 
      - uses: actions/checkout@v1
      - uses: treosh/lighthouse-ci-action@v2
        with:
          budgetPath: ./lighthouse-budget.json
          configPath: ./lighthouse.json
   //   ...
// lighthouse.json
{
  "ci": {
    "assert": {
      "preset": "lighthouse:recommended"
    },
    "collect": {
      "numberOfRuns": 3,
      "puppeteerScript": "./lighthouse-login.js"
    },
    "upload": {
      "target": "temporary-public-storage"
    }
  }
}
Run treosh/lighthouse-ci-action@v2
Action config
Collecting
  Error: Cannot find module 'puppeteer'
  Require stack:
  - /home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/puppeteer-manager.js
  - /home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js
  - /home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js
      at Function.Module._resolveFilename (internal/modules/cjs/loader.js:797:15)
      at Function.Module._load (internal/modules/cjs/loader.js:690:27)
      at Module.require (internal/modules/cjs/loader.js:852:19)
      at require (internal/modules/cjs/helpers.js:74:18)
      at PuppeteerManager._getBrowser (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/puppeteer-manager.js:27:23)
      at PuppeteerManager.invokePuppeteerScriptForUrl (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/puppeteer-manager.js:58:32)
      at Object.runCommand (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js:176:23)
  ##[error]LHCI 'collect' has encountered a problem.
  done in 1.17184977s
      at async run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js:84:7)

Assertion is not done on category

I am using the latest version of the Lighthouse CI GitHub action. My goal is to assert the following:

  1. Metrics
  2. Category (Performance)

I have configured budget.json (desktop) and mob-budget.json (mobile) for validation of metrics, and I also have separate config files for desktop and mobile.

Below is budget.json
image

Below is the desktop config file:
image

Below is mob-budget.json
image

Below is the mobile config file
image

Below is the workflow using above mentioned files:
image

Expectation:
The Lighthouse scan should run for all pages mentioned in urls and be asserted against the specified metric and category values.

Observed:
Only metrics are getting asserted, not categories.

Link of one of the execution: https://github.com/mkathu/ul-witelabel-visual-test/runs/4042090535?check_suite_focus=true

Sharing the report for one of the pages, which does not match the configured performance value of 0.9 for desktop but is not getting asserted.
image

Any pointers would be greatly appreciated in making this work.

I can't seem to get my Github Action to fail...

I'm trying to get my Github build to fail if my Accessibility Score falls below a certain threshold.

I suspect it's because it's not reading my lighthouserc.json file correctly (or, more likely, my file is incorrect).

Please tell me what I'm doing wrong! (Also, please tell me if there's somewhere better to ask questions like this)

My lighthouse.yml

name: Lighthouse

on: [push]

jobs:
  lighthouse:
    # Setup cut for brevity

    - name: Audit URLs using Lighthouse
      uses: treosh/lighthouse-ci-action@v7
      with:
        urls: |
          http://[my-url]/
          http://[my-url]/welcome
        configPath: ./.github/lighthouse/lighthouserc.json
        uploadArtifacts: true
        temporaryPublicStorage: true

And the config file (lighthouserc.json)

{
  "ci": {
    "assert": {
      "assertions": {
        "color-contrast": "off",
        // NOTE: I've tried both 0.99 and 99, and neither cause my build to fail
        "categories:accessibility": ["error", {"minScore": 99}],
        "categories:best-practices": "off",
        "categories:performance": "off",
        "categories:seo": "off"
      }
    }
  }
}

Even though everything's mostly off, all the assertions still get run, and even though my required A11y score is 99 (and the reported score is 92), my build still passes.

Any insight here would be greatly appreciated!

Recipe for separate Lighthouse job

@Snugug shared a cool Github Actions pattern he's using to run multiple jobs in parallel but only after the project has been built. Basically it's a combo of the needs job property combined with uploading/downloading artifacts.

it gives two benefits:

  1. you can do jobs in parallel, to get all CI finishing faster.
  2. a separate 'commit status' item for Lighthouse. Sam says he likes this, especially depending on how he set up his budgets; maybe some are required and others more optional.

https://github.com/chromeos/static-site-scaffold/blob/master/.github/workflows/nodejs.yml shows it in use. pretend the commented-out job is still there. ;)


Users of this action would probably like to use this kind of pattern.

dynamic urls list

Say I want to send the list of urls in a webhook, or calculate it based on some state in my repo, or define it using a javascript function.

It would be great to be able to dynamically build the list of urls.

Cheers

Document or integrate with github deployments/pull requests

I'd like to see my lighthouse scores for each of my deployments. I use nextjs with now for my site so I have deployments for each push, so it should be possible to test against the latest deployment in the PR and show the results inline. Is that possible?

Unable to determine current hash with `git rev-parse HEAD`

I'm having this error when I activate temporaryPublicStorage: true.

This is the run: https://github.com/frontity/frontity/pull/244/checks?check_run_id=339869140

And this is the workflow job:

  lighthouse:
    runs-on: ubuntu-latest
    needs: deploy

    steps:
      - name: Audit URLs using Lighthouse
        uses: treosh/lighthouse-ci-action@v2
        with:
          temporaryPublicStorage: true
          urls: |
            https://mars-theme-frontity.worona.now.sh/
            https://mars-theme-frontity.worona.now.sh/2016/the-beauties-of-gullfoss/

By looking at the specific line that throws the error:
https://github.com/GoogleChrome/lighthouse-ci/blob/master/packages/utils/src/build-context.js#L54
it looks like lighthouse is trying to use git rev-list -n1 HEAD to get the commit hash.

Just before that, it looks for some environment variables. I have solved the issue adding the env LHCI_BUILD_CONTEXT__CURRENT_HASH populated with ${{ github.sha }}.

I'm not sure why I am having this problem, but in case you can reproduce it, maybe this package should add the LHCI_BUILD_CONTEXT__CURRENT_HASH env variable by default:

env:
  LHCI_BUILD_CONTEXT__CURRENT_HASH: ${{ github.sha }}
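A sketch of that default (both env variable names are real; the fallback order is this sketch's assumption):

```javascript
// Prefer an explicit LHCI_BUILD_CONTEXT__CURRENT_HASH, then fall back to
// GITHUB_SHA, which is set on every GitHub Actions runner.
function currentHash(env) {
  return env.LHCI_BUILD_CONTEXT__CURRENT_HASH || env.GITHUB_SHA || null;
}
```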

Mismatching results from Lighthouse report attached to the output manifest

Hi, I have a case where the Lighthouse score numbers in the report are not equal to what we see in the PR.

Type Bug

The result variable inside the console.log below is retrieved via ${{ steps.lighthouse_audit.outputs.manifest }}[0].summary. lighthouse_audit is the id of the step above that runs the audit.

https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1624269254964-65478.report.html

image

Part of lighthouse.yaml..

...
      - name: Audit URLs using Lighthouse
        id: lighthouse_audit
        uses: treosh/lighthouse-ci-action@v7
        with:
          configPath: './lighthouserc.json'
          uploadArtifacts: true
          temporaryPublicStorage: true
      - name: Format lighthouse score
        id: format_lighthouse_score
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          script: |
            const result = ${{ steps.lighthouse_audit.outputs.manifest }}[0].summary
            console.log({ result })
            const links = ${{ steps.lighthouse_audit.outputs.links }}
            const formatResult = (res) => Math.round((res * 100))
            Object.keys(result).forEach(key => result[key] = formatResult(result[key]))
            const score = res => res >= 90 ? '🟢 ' : res >= 50 ? '🟠 ' : '🔴 '
            const comment = [
                `โšก๏ธ [Lighthouse report](${Object.values(links)[0]}) for the changes in this PR:`,
                '| Category | Score |',
                '| --- | --- |',
                `| ${score(result.performance)} Performance | ${result.performance} |`,
                `| ${score(result.accessibility)} Accessibility | ${result.accessibility} |`,
                `| ${score(result['best-practices'])} Best practices | ${result['best-practices']} |`,
                `| ${score(result.seo)} SEO | ${result.seo} |`,
                `| ${score(result.pwa)} PWA | ${result.pwa} |`,
            ].join('\n');
            core.setOutput("comment", comment);
 ...

Expected
The results returned to the script should be exactly the same as the generated report.

configPath error: ENOENT: no such file or directory

No matter what I try, I cannot get the configPath to work. Always results in error:

##[error]ENOENT: no such file or directory, open '/home/runner/work/simpixelated.com/simpixelated.com/lighthouserc.json'

image

You can see the build here:
https://github.com/simpixelated/simpixelated.com/pull/19/checks?check_run_id=989101553

Seems similar to #47 and I had hoped the new release would fix it, but it's not working for me ☹️

Here's my workflow:

name: Lighthouse CI for Netlify sites
on: pull_request
jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Use Node.js 12.x
        uses: actions/setup-node@v1
        with:
          node-version: 12.x
      - name: Wait for the Netlify Preview
        uses: jakepartusch/wait-for-netlify-action@v1
        id: netlify
        with:
          site_name: "simpixelated"
          max_timeout: 120
      - uses: actions/checkout@v2
      - name: Audit URLs using Lighthouse
        uses: treosh/[email protected]
        with:
          urls: |
            ${{ steps.netlify.outputs.url }}
            ${{ steps.netlify.outputs.url }}/two-year-work-retrospective
          configPath: "./lighthouserc.json"
          temporaryPublicStorage: true

How to use autodiscoverUrlBlocklist?

I'm working on getting Lighthouse CI to work on our repo. However, the 404.html file is picked up and tested, and we don't want that to happen.

I noticed autodiscoverUrlBlocklist and tried it in the lighthouserc file, but I'm not sure how to use it. I tried only the file name and the full URL, but the port always changes, so I can't specify the exact URL with port.

Any other way how i can exclude the error file?

[BUG] budget.json not found in assertion sub-step

The action for some reason uses an incorrect path to budget.json. The repo name in the path is duplicated, like so: /home/runner/work/%REPO%/%REPO%/budget.json

Also happens in your demo here:
https://github.com/denar90/lightouse-ci-netlify-preact/runs/891728484?check_suite_focus=true

Asserting
  Error: ENOENT: no such file or directory, open '/home/runner/work/lightouse-ci-netlify-preact/lightouse-ci-netlify-preact/budget.json'
      at Object.openSync (fs.js:440:3)
      at Object.readFileSync (fs.js:342:35)
      at readBudgets (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/assert/assert.js:41:24)
      at Object.runCommand (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/assert/assert.js:54:57)
      at run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/cli.js:88:23)
      at Object.<anonymous> (/home/runner/work/_actions/treosh/lighthouse-ci-action/v3/node_modules/@lhci/cli/src/cli.js:119:1)
      at Module._compile (internal/modules/cjs/loader.js:959:30)
      at Object.Module._extensions..js (internal/modules/cjs/loader.js:995:10)
      at Module.load (internal/modules/cjs/loader.js:815:32)

Wild variances in v3

I'm using v3, running a live URL 3 times. The URLs all hit Firebase's hosting CDN, so they're very stable performance-wise. In my latest run, my Performance score across 3 runs was 24, 73, 73. The exact same site, with identical code, run from the same infra but at a different URL, reports values of 89, 74, and 72. When running Lighthouse from Chrome, the same URLs consistently score between 85 and 87 on performance. I'm curious why there is so much variance in the action's version of Lighthouse compared to the browser version, and whether there's any way to stabilize the results.

Thanks!

Bad VMs leading to Inaccurate results

Running this a few times with only 1 run each on my personal site's GitHub repo, I got inconsistent results. This seemed to be due to the underlying VMs:

  • 100 score & 1026 benchmark index
  • 98 score & 347 benchmark index
  • 88 score & 106 benchmark index

Ideas:

  • If the benchmark index is below 500, the run should be killed. These results won't be accurate and can't be asserted against.

  • We should run a pre-flight check with the benchmarker Lighthouse uses (it's some pretty simple JS to run, seen here: https://benchmark.exterkamp.codes) and check whether the VM we got for our action was DOA. If it was, exit with code 1 and print something about re-running the action because of a bad VM.
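The gate described above could be sketched as follows, reading the benchmarkIndex that Lighthouse records in every report's environment block (the 500 threshold is the proposal's assumption, not an official recommendation):

```javascript
// Hypothetical pre-flight gate: decide whether to trust a run based on the
// benchmarkIndex stored in a Lighthouse report (LHR). The 500 cutoff is an
// assumption taken from the proposal above.
function isVmHealthy(lhr, minBenchmarkIndex = 500) {
  const index = lhr.environment && lhr.environment.benchmarkIndex;
  return typeof index === 'number' && index >= minBenchmarkIndex;
}

// Matching the observations above: 1026 is trustworthy, 106 is a bad VM.
console.log(isVmHealthy({ environment: { benchmarkIndex: 1026 } })); // true
console.log(isVmHealthy({ environment: { benchmarkIndex: 106 } })); // false
```

A runner could call this right after collection and exit non-zero when it returns false.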

[BUG] Netlify Sample Fails - Waiting for DevTools protocol response has exceeded the allotted time.

Building upon issue #31 from @denar90 - I've attempted to use the approach to wait for a preview site to be available from Netlify, and run the lighthouse-ci-action to determine SEO, Accessibility scores on PRs.

However, I'm encountering issues (as at https://github.com/chrisreddington/hugo-community/runs/1526907604?check_suite_focus=true#step:4:92). From some quick searching, it looks like there are some wider issues around this, potentially related to Chrome versions, if I understand correctly.

But I was curious whether anyone else is going down this path with Netlify and also encountering issues. @denar90, are you still using it / do you have any tips? Thanks!

v2 budget assertions seem wrong

One more thing...

I just tested with the NPM module and I have set my budgets correctly:

image

The 1 KB over budget is expected. Using this action, all my budgets are failing, and I don't know what scale the numbers are in.

image

Error when running Lighthouse on a (custom) GitHub runner (with preinstalled Chrome)

Hi,

we would like to run Lighthouse on a pre-production environment which is only available internally.
That's why we have a custom GitHub runner running on on-premises Kubernetes.
The Google Chrome browser is already installed.

But it fails with the following error. Maybe someone has a clue about what went wrong.

events.js:187
  Running Lighthouse 10 time(s) on https://www.mypage.de/
        throw er; // Unhandled 'error' event
        ^
  
  Error: spawn node ENOENT
      at Process.ChildProcess._handle.onexit (internal/child_process.js:264:19)
      at onErrorNT (internal/child_process.js:456:16)
      at processTicksAndRejections (internal/process/task_queues.js:80:21)
  Emitted 'error' event on ChildProcess instance at:
      at Process.ChildProcess._handle.onexit (internal/child_process.js:270:12)
      at onErrorNT (internal/child_process.js:456:16)
      at processTicksAndRejections (internal/process/task_queues.js:80:21) {
    errno: 'ENOENT',
    code: 'ENOENT',
    syscall: 'spawn node',
    path: 'node',
    spawnargs: [
      '/runner/_work/_actions/treosh/lighthouse-ci-action/v8/node_modules/lighthouse/lighthouse-cli/index.js',
      'https://www.mypage.de/',
      '--output',
      'json',
      '--output-path',
      'stdout',
      '--cli-flags-path',
      '/runner/_work/mypage-ui/mypage-ui/.lighthouseci/flags-30755ad7-427c-4a5d-87fb-b48693f0278e.json'
    ]
  }
  Run #1...::error::LHCI 'collect' has encountered a problem.
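The spawn node ENOENT suggests the child process can't find a node binary on the runner's PATH (an inference from the error above, not a confirmed diagnosis). A quick sanity-check step that could be added before the action:

```yaml
      - name: Verify node is on PATH
        run: |
          which node
          node --version
```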

Make LH Scores available as output parameters

It would be great to have the main LH scores available as output parameters in the GitHub action.

That way it would be highly flexible, so you could e.g.:

  • use the output to post the values in a comment
  • use the values within if-conditions to build additional logic on top of them
  • send the values to a chat system's webhook
  • etc.
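A sketch of what such outputs could contain: extracting the top-level category scores from a Lighthouse report (LHR) object. The function name and the 0-100 rounding are illustrative choices, not part of the action.

```javascript
// Hypothetical helper: map each LHR category to its score on a 0-100 scale.
function categoryScores(lhr) {
  return Object.fromEntries(
    Object.entries(lhr.categories).map(([id, cat]) => [id, Math.round(cat.score * 100)])
  );
}

// LHR category scores are fractions between 0 and 1.
console.log(categoryScores({ categories: { performance: { score: 0.87 } } })); // { performance: 87 }
```

The resulting object could then be serialized and exposed as a step output for comments, conditions, or webhooks as listed above.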

Passing flags to lighthouse

Is it possible to add flags to the lighthouse action? For the normal LHCI we can do the following:

lighthouse-ci [URL] --chrome-flags=--ignore-certificate-errors

Is this possible in the YAML?
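One possible route (a sketch, assuming the flags are wired through the action's configPath input rather than a dedicated flag input): put the Chrome flags in the LHCI config's collect.settings.

```json
{
  "ci": {
    "collect": {
      "settings": {
        "chromeFlags": "--ignore-certificate-errors"
      }
    }
  }
}
```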

URL interpolation (e.g. URLs with a PR number in them)

Hi there!

Cool work! I'm trying to audit my PR deploys, but they include a URL which differs per run.

I thought I'd reach out to see if you had any ideas.

Current (not working):

  lighthouse:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - uses: actions/checkout@master
      - name: Run Lighthouse and test budgets
        uses: treosh/lighthouse-ci-action@v1
        with:
          urls: |
            https://typescript-v2-$PR_NUMBER.ortam.now.sh
            https://typescript-v2-$PR_NUMBER.ortam.now.sh/tsconfig
            https://typescript-v2-$PR_NUMBER.ortam.now.sh/docs/handbook/integrating-with-build-tools.html

Fri, 08 Nov 2019 13:28:38 GMT GatherRunner:error DNS servers could not resolve the provided domain. https://typescript-v2-$pr_number.ortam.now.sh/

Some options:

  • straight up copy the shell-style $THING replacement (which would mean looking through all env vars and seeing if they can be replaced in strings)

  • explicitly add a template variable replacement syntax (faster, & more explicit)

  lighthouse:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - uses: actions/checkout@master
      - name: Run Lighthouse and test budgets
        uses: treosh/lighthouse-ci-action@v1
        with:
          urls: |
            https://typescript-v2-$(PR_NUM).ortam.now.sh
            https://typescript-v2-$(PR_NUM).ortam.now.sh/tsconfig
            https://typescript-v2-$(PR_NUM).ortam.now.sh/docs/handbook/integrating-with-build-tools.html
        env:
          PR_NUM: ${{ secrets.$PR_NUMBER }}
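A possible workaround with existing Actions features (a sketch, assuming the workflow runs on pull_request events, where the PR number is available as github.event.number, and that the preview URLs follow the pattern above):

```yaml
      - name: Run Lighthouse and test budgets
        uses: treosh/lighthouse-ci-action@v1
        with:
          urls: |
            https://typescript-v2-${{ github.event.number }}.ortam.now.sh
            https://typescript-v2-${{ github.event.number }}.ortam.now.sh/tsconfig
```

The expression is interpolated by Actions itself before the input reaches the action, so no replacement logic is needed inside it.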

static-dist-dir type action failed

I'm running the action on a static dir and get the following error:

Started a web server on port 44475...
  Running Lighthouse 1 time(s) on http://localhost:44475/index.html
  Run #1...failed!
  Error: Lighthouse failed with exit code 1
      at ChildProcess.<anonymous> (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/lighthouse-runner.js:103:21)
      at ChildProcess.emit (events.js:210:5)
      at Process.ChildProcess._handle.onexit (internal/child_process.js:272:12)
  Fri, 18 Jun 2021 21:02:37 GMT ChromeLauncher Waiting for browser.
  Fri, 18 Jun 2021 21:02:37 GMT ChromeLauncher Waiting for browser...
  Fri, 18 Jun 2021 21:02:38 GMT ChromeLauncher Waiting for browser.....
  Fri, 18 Jun 2021 21:02:38 GMT ChromeLauncher Waiting for browser.......
  Fri, 18 Jun 2021 21:02:39 GMT ChromeLauncher Waiting for browser.........
  Fri, 18 Jun 2021 21:02:39 GMT ChromeLauncher Waiting for browser...........
  Fri, 18 Jun 2021 21:02:40 GMT ChromeLauncher Waiting for browser.............
  Fri, 18 Jun 2021 21:02:40 GMT ChromeLauncher Waiting for browser...............
  Fri, 18 Jun 2021 21:02:41 GMT ChromeLauncher Waiting for browser.................
  Fri, 18 Jun 2021 21:02:41 GMT ChromeLauncher Waiting for browser.................✓
  Fri, 18 Jun 2021 21:02:41 GMT ChromeLauncher Killing Chrome instance 2616
  Runtime error encountered: perf.getEntriesByName is not a function
  TypeError: perf.getEntriesByName is not a function
      at Object.exports.stop (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/marky/lib/marky.cjs.js:55:24)
      at Function.timeEnd (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse-logger/index.js:128:11)
      at Function.requireGatherers (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/config/config.js:819:9)
      at new Config (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/config/config.js:337:27)
      at generateConfig (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/index.js:60:10)
      at lighthouse (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-core/index.js:43:18)
      at runLighthouse (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/lighthouse/lighthouse-cli/run.js:193:32)
      at processTicksAndRejections (node:internal/process/task_queues:96:5)
  Error: LHCI 'collect' has encountered a problem.

Here are the relevant files:

lighthouse.yaml

name: LH
on:
  pull_request:
    types: [opened, reopened, synchronize, ready_for_review]

jobs:
  static-dist-dir:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Setup Node
        uses: dcodeIO/setup-node-nvm@master
        with:
          node-version: node
      - run: |
          npm ci --ignore-scripts
          npm run setup
          npm run build
      - name: Run Lighthouse on urls and validate with lighthouserc
        uses: treosh/lighthouse-ci-action@v2
        with:
          configPath: './.github/lighthouserc.json'
          uploadArtifacts: true
      - name: Report
        uses: manrueda/lighthouse-report-action@master
        with:
          reports: '.lighthouseci'
          github-token: ${{ secrets.GITHUB_TOKEN }}

lighthouserc.json

{
  "ci": {
    "preset": "lighthouse:recommended",
    "assert": {
      "assertions": {
        "categories:performance": ["error", { "minScore": 0.5 }],
        "categories:accessibility": ["error", { "minScore": 0.7 }],
        "categories:best-practices": ["error", { "minScore": 0.85 }],
        "categories:seo": ["error", { "minScore": 0.8 }],
        "categories:pwa": ["error", { "minScore": 0.25 }]
      }
    },
    "collect": {
      "staticDistDir": "./packages/app/dist",
      "settings": {
        "chromeFlags": [
          "--enable-webgl2-compute-context",
          "--use-fake-device-for-media-stream",
          "--use-fake-ui-for-media-stream"
        ]
      }
    }
  }
}

Do action parameters override settings parameters?

The action has at least two parameters that replicate ones present in the LHCI config.

Which of them has priority if both are specified? It looks like action params override configuration options from the config file, but please confirm explicitly. And what if an action param is not specified? Will it use the value from the config file, or its own default (e.g. the action's default of runs=1)?

Also huge thanks for an awesome tool! Great job 👍

Lighthouse Plugin doesn't work without hacks

I created a sample project with a Lighthouse Plugin for my blogpost: https://engineering.q42.nl/making-a-lighthouse-plugin-work-with-lighthouse-ci/.

I couldn't get it to work on GitHub Actions without adding the lines:

- run: npm install lighthouse
- run: mv node_modules/lighthouse-plugin-social-sharing

Otherwise it gave the error messages:
Runtime error encountered: Cannot find module 'lighthouse'
Runtime error encountered: Unable to locate plugin: lighthouse-plugin-field-social-sharing

My sample repo is: https://github.com/Q42/lighthouse-plugin-sample-project

Fail if score lowers compared to previous runs

Could an option be added to fail the run if the overall score drops (maybe by a certain margin, or just for a certain metric)? Seeing as the results can be stored as an artifact, this should be possible.
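The comparison itself could be as simple as the sketch below (the margin value and function name are illustrative; fetching the previous score from a stored artifact is assumed, not shown):

```javascript
// Hypothetical regression check: did the current score drop below the
// previous run's score by more than an allowed margin? Scores are LHR-style
// fractions between 0 and 1; the 0.02 default margin is an assumption.
function hasRegressed(previousScore, currentScore, margin = 0.02) {
  return previousScore - currentScore > margin;
}

console.log(hasRegressed(0.92, 0.85)); // true  (7-point drop exceeds margin)
console.log(hasRegressed(0.92, 0.91)); // false (within margin)
```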

How to set a subfolder as the working directory?

I have the following issue trying to run the GitHub Action on a monorepo. For example, I have:

root/
  | - Client
  |      |- Lighthouse should run here
  | 
  |- Server

So with that folder structure, when I run Lighthouse, it throws the following error:

error Couldn't find a package.json file in "/home/runner/work/***/***"
 Error: LHCI 'collect' has encountered a problem.

And that makes sense, since the package.json is in "/home/runner/work/***/***/client". Right now I'm using configPath: './client/lighthouserc.js', but I wasn't able to find something like a runPath or workingDirectory option.

Any ideas? Is it just me using it the wrong way?

Thanks for your time, guys! You did excellent work with this!

Uploading results after budget fails

image

Is it correct that the upload does not proceed if the budget fails? In your examples, it looks like the results upload even when the budget fails.

My workflow

name: CI

on: push

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v1

    - name: Lighthouse CI Action
      uses: treosh/lighthouse-ci-action@v1
      with:
        urls: 'http://milesalex.github.io/gatsby'
        budgetPath: .github/workflows/budget.json

    - name: Upload results
      uses: actions/upload-artifact@master
      with:
        name: lighthouse-results
        path: './results'
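If the upload step is being skipped because the preceding step failed, one standard Actions-level fix (a sketch using the built-in always() condition, independent of this action) is:

```yaml
    - name: Upload results
      if: always() # run this step even when the budget assertion step fails
      uses: actions/upload-artifact@master
      with:
        name: lighthouse-results
        path: './results'
```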

Error: ENOENT: no such file or directory

I'm getting the following error in github actions when I try to run this on push on an Eleventy site.

Full output

 Run Lighthouse against a static dist dir
1s
      at async run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js:85:7)
Run treosh/lighthouse-ci-action@v2
Action config
Collecting
  Started a web server on port 38615...
  Error: ENOENT: no such file or directory, scandir '/home/runner/work/derekjdev/_site'
      at Object.readdirSync (fs.js:854:3)
      at FallbackServer.getAvailableUrls (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/fallback-server.js:62:26)
      at startServerAndDetermineUrls (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js:150:25)
      at processTicksAndRejections (internal/process/task_queues.js:93:5)
      at async Object.runCommand (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/collect/collect.js:189:25)
  ##[error]LHCI 'collect' has encountered a problem.
      at async run (/home/runner/work/_actions/treosh/lighthouse-ci-action/v2/node_modules/@lhci/cli/src/cli.js:85:7)

.github/workflows/main.yml

name: Lighthouse
on: push
jobs:
  static-dist-dir:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Run Lighthouse against a static dist dir
        uses: treosh/lighthouse-ci-action@v2
        with:
          # no urls needed, since it uses local folder to scan .html files
          configPath: '.github/lighthouse/lighthouserc.json'

.github/lighthouse/lighthouserc.json

{
  "ci": {
    "collect": {
      "staticDistDir": "../_site"
    }
  }
}

What am I missing?
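Judging by the paths in the error, staticDistDir appears to be resolved relative to the workspace root rather than the config file's location: the checkout lands at /home/runner/work/derekjdev/derekjdev, so "../_site" resolves to /home/runner/work/derekjdev/_site, exactly the missing directory in the ENOENT (this is an inference, not confirmed behavior). A config sketch that would match the repo layout under that assumption:

```json
{
  "ci": {
    "collect": {
      "staticDistDir": "./_site"
    }
  }
}
```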

Use `::debug` and `::warning` annotations

An LHCI assertion supports an error or warning status.
Currently, every annotation has an error status, listing errors and warnings separated visually.
If a URL only has warnings, the annotation status should be a warning.

console.log("::warning::some text%0Aafter new line")

We could also use ::debug annotations to report LH results or other useful information.

Allow use of YAML config files

I've got a YAML lighthouserc file, and it throws the following error, which leads me to believe the action currently only works with JSON files:

Run treosh/lighthouse-ci-action@v2
undefined:1
ci:
^

SyntaxError: Unexpected token c in JSON at position 0
    at JSON.parse (<anonymous>)

Getting unexpected token in JSON position 0 when passing configPath

I've been continuously getting the same error when trying to run this action with the configPath option:

Error: Unexpected token � in JSON at position 0

.github/workflows/on-push.yaml:

      - name: run-lighthouse-ci
        uses: treosh/lighthouse-ci-action@v7
        with:
          configPath: './lighthouserc.json'

I've found that if I remove the configPath option, I get a different error related to LHCI, which is probably due to me not passing urls.

./lighthouserc.json

{
  "ci": {
    "collect": { "staticDistDir": "./dist/public/static" }
  }
}

Public repo that is affected (at the commit that has the above example):
https://github.com/bradtaniguchi/bradtaniguchi.github.io/tree/dd4d980a2f9036b6e2e20333a4559cb4a64864f3
