folke / ultra-runner

๐Ÿƒโ›ฐ Ultra fast monorepo script runner and build tool

Home Page: https://www.npmjs.com/package/ultra-runner

License: Apache License 2.0

JavaScript 4.34% TypeScript 95.59% Shell 0.07%
concurrent parallel lerna pnpm yarn-workspaces monorepo npm build build-tool script-runn

ultra-runner's Introduction

๐Ÿƒ โ›ฐ๏ธ Ultra Runner


Ultra fast monorepo script runner and build tool.

✨ Features

  • zero-config: works out of the box with your existing monorepo
  • non-intrusive: no need to make any changes to your package.json files
  • workspaces: detects packages in existing lerna, yarn, npm@7 and pnpm workspaces, or searches for them recursively
  • ultra fast builds: ultra keeps track of file changes in your repo and only actually builds a package when needed
  • parallel builds: ultra builds your packages concurrently by default
  • workspace dependencies: workspace dependencies are automatically resolved and used for parallel builds
  • execute anything: one command to run package scripts, node_modules binaries or system binaries, recursively in your repository.
  • faster script execution: ultra hijacks any npm, pnpm, yarn and npx calls for faster execution.
  • concurrency within scripts: you can add optional configuration to package.json to run parts of a script in parallel. No need to change the actual scripts
  • filtering: filter on package names or subdirectories
  • monitoring: node process monitor (like top for node)
  • output zooming: when executing multiple commands in parallel, ultra will try to keep as much concurrent output on the screen as possible by showing only the last lines of each command. Once a command completes, its full log is written to the terminal. This is very useful when building a bunch of packages with --watch, for instance.
  • missing scripts: when executing scripts recursively, only packages that have the script defined will execute it.

🎥 View Demo

[Workspaces demo animation]

🤓 Smart

Ultra parses your package.json and hijacks any npm run, yarn and npx calls. Shell operators like &&, ; and || are also interpreted.

For example:

{
  "scripts": {
    "lint": "yarn lint:ts && yarn lint:eslint && yarn lint:docs",
    "lint:eslint": "npx eslint bin/*.js src/*.ts __tests__/*.ts --cache",
    "lint:docs": "npx markdownlint README.md",
    "lint:ts": "npx tsc -p tsconfig.build.json --noEmit",
    "lint:fix": "yarn lint:eslint --fix"
  }
}

Running ultra lint: [Ultra Lint screenshot]

Running ultra lint:fix will spawn exactly one child process, directly with the correct command, instead of spawning yarn as an intermediary.

Ultra will additionally execute any configured pre and post scripts, just like npm run and yarn run.
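
A minimal sketch (the script names and commands are illustrative): with the package.json below, ultra test would run pretest, then test, then posttest, just like npm run test or yarn test would.

{
  "scripts": {
    "pretest": "yarn lint",
    "test": "npx jest",
    "posttest": "node ./scripts/report.js"
  }
}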

🌴 Recursive Execution

When using -r or --recursive, the command will be executed in every package of your repository, excluding the root package. If you also want to run in the root package, combine --recursive with --root. Commands are always run concurrently, with a default concurrency of 10 (this can be changed with --concurrency).
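
A few illustrative invocations (the test script name is only an example):

$ ultra -r test                   # run "test" in every package, excluding the root
$ ultra -r --root test            # also include the root package
$ ultra -r --concurrency 4 test   # lower the default concurrency of 10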

Ultra finds packages based on your monorepo workspace:

  • lerna
  • pnpm
  • yarn workspace
  • when no monorepo manager is found, packages are searched for recursively

Use --filter <filter> to filter packages in the workspace. The filter argument can use wildcards to filter package names and/or subdirectories:

$ ultra -r --filter "@scope/app" pwd
...

$ ultra -r --filter "@scope/*" pwd
...

$ ultra -r --filter "apps/*" pwd
...

When the filter is prefixed with a +, all dependencies of the filtered packages will also be included. For example, let's say you have a package "app1" that depends on "lib1"; then using the filter +app1 will execute the command on both app1 and lib1, using the workspace topology.
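
For example, using the app1 package from the description above (the build script is illustrative):

$ ultra -r --filter "+app1" build
...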

📦 Builds

Ultra automatically detects workspace dependencies, while still allowing parallel builds. Packages are built concurrently as soon as their dependencies are built (also concurrently). Every package directory contains a .ultra.cache.json file with hashes of all files and build artifacts in your repository. Internally this uses git ls-files for files under source control and simple mtime timestamps for build artifacts. When building a package, the current state is compared with the .ultra.cache.json. Builds are skipped when no changes are detected.
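
For reference, the git invocation that produces this file list (as quoted verbatim in one of the issue reports further down) is:

$ git ls-files --full-name -s -d -c -m -o --directory -t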

Optimized builds using the dependency tree and file cache are automatically triggered when running the build script or when using --build with a custom script or command.

All commands below will trigger optimized builds.

$ ultra -r --build
...

$ ultra -r build
...

$ ultra -r --build mycustombuildscript
...

If for some reason you want to rebuild a package, use --rebuild or rebuild.

If you want some files to be excluded from the .ultra.cache.json, you can create a .ultraignore file. The format is similar to .gitignore. Whenever a file changes that is listed in your .ultraignore, a rebuild will not be triggered.
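
A sketch of what such a file could look like, assuming .gitignore-style globs and comments (the entries themselves are only illustrative):

# .ultraignore
*.log
coverage/
docs/**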

📊 Monitor

With ultra --monitor you can easily monitor all running node processes on your machine.

For every process, you can also see the package where the command was executed and a clean command line.
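
For example (the 5-second interval is only an illustration; the default is 2 seconds):

$ ultra --monitor
$ ultra --monitor --monitor-interval 5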

[Monitor screenshot]

⚡ Fast

Ultra parses your package.json scripts and will only execute the commands that are really needed. Any script interdependencies are resolved during the parsing stage. This ensures there's pretty much no overhead in execution by Ultra itself, since it only runs once. yarn run or npm run, on the other hand, will spawn new yarn or npm child processes as needed by the package scripts.

                        npm run   npx    yarn    yarn exec   ultra
package.json scripts    ✅        ❌     ✅      ❌          ✅
./node_modules/.bin/    ❌        ✅     ✅      ✅          ✅
system binaries         ❌        ✅     ❌      ✅          ✅
execution overhead (1)  250ms     60ms   220ms   200ms       65ms

1. each program was run 10x with the command true or with a {"scripts": {"true": "true"}} package script to calculate the execution overhead
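
A rough sketch of how to reproduce such a measurement yourself, assuming a package.json that defines the true script from note (1):

$ time npm run true
$ time yarn run true
$ time ultra true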

Suppose you want to run a script that calls 5 other scripts using && and/or pre/post hooks.

  • Using yarn, you would have a total overhead of 2.5s (10x 250ms)
  • Using ultra, you hit the overhead only once, so the total overhead would still be 65ms

To make execution ultra fast, you can configure which scripts should be run concurrently.

โ• there's no need to switch your scripts over to ultra. Even with the optional configuration you can still use yarn or npm to run your scripts if you want to.

Example builds:

                     yarn   ultra (not concurrent)   ultra (concurrent)
build Ultra-Runner   8.9s   7.2s                     5.1s
build Devmoji        16s    13s                      8s

🎨 Formatting

There are three output formats that each can be combined with --silent to hide command output.

--pretty is the default. It shows output in a hierarchical way and uses spinners to see exactly what's happening. Make sure to check out the animation at the top of this page. Every executed step shows the execution time.

--pretty combined with --silent is useful if you're only interested in seeing the overview.

--no-pretty doesn't use spinners and prefixes command output with the command name. This is useful for logging purposes.

Combining --no-pretty with --silent shows a flat overview.

--raw will show the exact output as you would expect when running the commands stand alone. If the command you're executing is interactive (reads from stdin), then this is the mode you should use.
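
To summarize, a few example invocations (the build and test script names are illustrative):

$ ultra -r build                       # --pretty is the default on a TTY
$ ultra -r --silent build              # pretty overview only, command output hidden
$ ultra -r --no-pretty build           # no spinners, output prefixed with the command name
$ ultra -r --no-pretty --silent build  # flat overview
$ ultra -r --raw test                  # untouched output, use this for interactive commands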

💫 Getting Started

Install with npm or yarn

globally

npm install -g ultra-runner
yarn global add ultra-runner

locally inside your project; use with npx ultra

npm install --save-dev ultra-runner
yarn add --dev ultra-runner

Now run ultra --info within your repository to see everything related to your monorepo.

See optional configuration for information on how to set up concurrent script execution.

🚀 Usage

$ ultra --help
Usage: ultra [options] <cmd> [cmd-options]

Workspace:
  --recursive, -r  Run command in every workspace folder concurrently                                      [boolean]
  --filter         Filter package name or directory using wildcard pattern                                  [string]
  --root           When using --recursive, also include the root package of the workspace                  [boolean]
  --concurrency    Set the maximum number of concurrency                                      [number] [default: 10]

Status:
  --info  Show workspace dependencies                                                                      [boolean]
  --list  List package scripts. Also works with --recursive                                                [boolean]
  --monitor           Show node process list, updated every 2 seconds                                      [boolean]
  --monitor-interval  Set process list interval in seconds                                     [number] [default: 2]

Build:
  --build, -b  Use dependency tree to build packages in correct order                                      [boolean]
  --rebuild    Triggers a build without checking for file changes                                          [boolean]

Formatting:
  --pretty  enable pretty output, spinners and separate command output. Default when a TTY [boolean] [default: true]
  --raw     Output only raw command output                                                                 [boolean]
  --silent  Skip script output. ultra console logs will still be shown                                     [boolean]
  --color   colorize output                                                                [boolean] [default: true]

Options:
  --version      Show version number                                                                       [boolean]
  --dry-run, -d  Show what commands would be executed, without actually executing them                     [boolean]
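
A few example invocations of the status and dry-run options above:

$ ultra --info        # show workspace dependencies
$ ultra -r --list     # list package scripts in every workspace
$ ultra -r -d build   # show what a recursive build would execute, without running it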

โš™๏ธ Optional Configuration

To allow parallel execution of your scripts, you can specify which scripts should run concurrently in your package.json.

{
  "scripts": {
    "lint:eslint": "npx eslint bin/*.js src/*.ts __tests__/*.ts --cache",
    "lint:docs": "npx markdownlint *.md",
    "lint:ts": "npx tsc -p tsconfig.build.json --noEmit",
    "lint": "yarn lint:eslint && yarn lint:docs && yarn lint:ts",
    "prebuild": "yarn lint && yarn jest",
    "build": "..."
  },
  "ultra": {
    "concurrent": ["lint"]
  }
}
  • yarn build will run the lint and jest commands sequentially
  • ultra build will run all lint commands concurrently and then execute jest. (Note that prebuild could also be added to concurrent, since the tests don't depend on linting; that way all commands would run concurrently, as sketched below.)
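
A sketch of that variant; only the ultra section changes:

{
  "ultra": {
    "concurrent": ["lint", "prebuild"]
  }
}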

ultra-runner's People

Contributors

alexkrautmann, dependabot[bot], etc-tiago, folke, lxgreen, omgimalexis, remorses, renovate-bot, renovate[bot], semantic-release-bot, shimarulin, zekth


ultra-runner's Issues

Shortcut for --filter to run on all dependencies of current working directory package

What I would like to have is a simpler way to do the following:

{
  "build:deps": "THIS_PACKAGE=$(jq -r .name package.json) && ultra -r --filter +$THIS_PACKAGE build:lib",
}

It would be cool if --filter would run on all of the cwd package's dependencies when doing the following:

ultra -r --filter + build:lib

Or maybe add support for paths when using + and do this instead

ultra -r --filter +./ build:lib

Ultra runs scripts inside node_modules in pnpm monorepo

Hi, first of all thank you for this great package. I'm running into an issue with my pnpm monorepo project:

In the root package.json, I have a script for linting my workspaces, ultra -r lint, but it seems to run the lint on every package inside the node_modules directory, as you can see below:

[screenshot]

When I run the lint without ultra inside each of my workspaces by hand, it works great. I don't know if I'm missing something here or if it's a bug :)

Running postinstall using ultra

Hey 👋 First of all, thanks for building this, it looks super nice.

We have a repository where a bunch of stuff happens in the postinstall hook. The imagined workflow is that people check out a branch, run yarn, and everything works.

It would be super nice if there were an easy way for me to run these postinstall hooks in an ultra context, using its concurrency features and console UI. I'm unsure what the best approach would be, though. Maybe something like ultra install?

Does ultra-runner work with git submodules?

Neither Lerna nor Rush.js is able to track changes in projects/packages if they are git submodules.

For various reasons, I'd like for some (or all) packages in my monorepo (actually I call it "umbrella repo" because it isn't a single repo) to be git submodules.

Rush.js totally fails running anything when git submodules are present in the umbrella repo.

Lerna allows me to run commands across all git submodules just fine. For example, lerna run build is able to run commands in a certain order depending on the dependency tree between packages.

However, things like lerna version or lerna publish don't work, because Lerna does not know how to determine if a package is modified if it is a git submodule.

I think the solution would be for Lerna to go into the submodule, and run git status (or whatever) inside the submodule to detect if it has changed. Or perhaps Lerna needs to identify submodule commit hash changes in git diff output at the top level. So due to lack of support for detecting changes of git submodules, Lerna isn't able to determine what packages changed (lerna changed). It just simply thinks that everything is always changed.

Does ultra-runner support git submodules? If not, could that be something to be considered as an addition to ultra-runner?

asterisk

Thanks for making this nice runner tool!

In my yarn workspace I have defined many local dependencies with an asterisk as the version (*).
But now ultra thinks that these "versions" have changed every time I run the same command again; these dependencies show up as "changed files". Am I doing something wrong?

Prefix output for `--no-pretty` with the package name

Right now, --no-pretty will mix output from all running commands and prefix it with the command.

It would be better to also add the package name to the prefix to make it easier to debug what output is coming from what command.

(see also #89)

Scriptless workspaces should not force rebuilds

Currently, workspaces without the build script (or whichever script is being run) will force all dependent workspaces to rebuild every time. This causes a huge slowdown when you have workspaces for generic configuration stuff, like browserlist-config or an internal eslint-plugin. For this kind of package there is no point in having a build script; their contents rarely even change AND most other workspaces depend on them.

While it's easy enough to bypass the issue (by adding a dummy build script), I think it would make more sense for Ultra to just check/create the cache JSON and signal that dependents should be rebuilt if the hashes have changed.

pwd is pointing to <workspace-directory>/dist

When I execute ultra -r --no-pretty --color pwd from the workspace root, it always points to <workspace-directory>/dist. This also chains for nested dist directories: <workspace-directory>/dist/dist.

asterisk application system

Hi all, I installed the Asterisk VoIP switchboard on a Raspberry Pi 3+ and it works fine. Now I would like to activate a GPIO by calling an internal number, i.e. run a shell command from the dialplan. Having created the virtual extension 100, I set up a dialplan in the extensions_custom.conf file like so:
[from-internal-custom]

exten => 100,1,answer()
exten => 100,n,wait (3)
exten => 100,n,system(/root/my folder/my file)
exten =>100,n.Hangup()

My file is an executable written in bash, and I can run it both as root@raspberrypi and from the Asterisk CLI, but it doesn't work when I call the virtual extension. Where am I making a mistake? Can someone help me?

Bug: watch doesn't run any more than concurrency limit

For long-running processes, this is a bit of an unintuitive edge case.

If I run ultra -r watch it will only run the first 10 (concurrency default) processes. I guess this makes sense, but it was confusing.

I think two changes would help:

  • Documentation around long-running processes like watch never going above concurrency
  • Commands that include watch could have a notice that says they will limit to the concurrency level.

Ultra cache could be used for more than builds (ie. lint)

It would be nice to be able to leverage ultra's cache system for linting. I don't know if this could be a CLI option?

Thanks for your work, I'm investigating ultra to see if we should migrate from Rush to ultra+pnpm workspaces :)

Building specific module with its submodules

Is there any way I can build one specific module from my monorepo with its dependencies recursively? Just like lerna's scope works. As I see, --filter doesn't work with dependencies and tries to build only packages that match pattern.

Webpack alternative?

Does it, and can it, replace webpack? I have a project that normally takes 2 minutes just to build with webpack, even though the end build is just a bit over 1 MB. That makes the dev process super slow.

Explanation of incremental build feature

Hey @folke, thanks for the really good piece of software. The ultra-runner looks awesome and really promising for our workflow.
Yesterday I ran it in our rush monorepo and, as stated in the docs, it built everything with zero config. Nice job!

However, I would like to know more about the incremental build feature: how to optimize code, what patterns to avoid, and what can cause optimization bailouts.

Could you elaborate a bit more on the strategy used for incremental build?

For example:

From doc:

.ultra.cache.json file that contains hashes of all files and build artifacts in your repository. Internally this uses git ls-files for files under source control and simple mtime timestamps for build artifacts.

Why should we even keep track of "build artefacts"? The build artefacts should not affect building, because they are, literally, build artefacts, not sources.

Following up on this question: should I store my artefacts in the project folder, e.g. projects/app1/dist, or is it better to hoist them to another level outside the package folder?

What happens if I change a root-level dependency or environment variable?
For example, the root-level tsconfig.json or dependencies in the root-level package.json.
Does it treat the package folder as "hermetic", where only changes applied to files inside that folder may produce a different build, regardless of environment (node version)?

More real-world example. I have a package with build command:

    "build": "gulp extract-messages && yarn build:bundle && yarn inline-messages",

Where gulp extract-messages creates a *.pot file in {package-name}/src/translates.

I see on my CI that this invalidates the cache every time, meaning all subsequent runs (without any changes) always trigger a full rebuild.

As far as I understand, ultra-runner should create .ultra.cache.json after the build command has executed successfully. That means it should store the mtime of these newly created files (a kind of artefact) and skip the build next time.

But somehow these files invalidate the cache every time. Do I understand the flow correctly? Am I missing something?

[Question] Execute yarn commands concurrently

I'm terribly sorry if this is a stupid question, but I don't know how to get ultra-runner to "hijack" (your words) the yarn workspaces and run the command(s) concurrently?

Root directory package.json contains:

{
  "private": true,
  "workspaces": [
    "yarns/**"
  ],
  "scripts": {
    "lint": "yarn workspaces run lint"
  },
  "ultra": {
    "concurrent": ["lint"]
  },
  "devDependencies": {
    "ultra-runner": "^3.5.0",
    **snip**   
  }
}

In one of the projects, the package.json contains:

{
  "name": "worker",
  "version": "1.0.0",
  "description": "Worker for API",
  "main": "main.js",
  "engines": {
    "node": "10.x"
  },
  "scripts": {
    "build": "tsc",
    "test": "jest --runInBand --config ../../../jest.config.js",
    "lint": "eslint --ignore-path ../../../.eslintignore \"**/*.{ts,js}\""
  },
  "dependencies": {
    **snip**
  }
}

I've tried running npx ultra lint which does not run the linting concurrently

I've tried running npx ultra yarn workspaces run lint which does not run the linting concurrently

What am I doing wrong? 🙈

I should mention that running npx ultra -r npm run lint does work, but then it's not really using yarn workspaces, is it? 🤔

Concurrency without build still seems to do a topological sort

I'd expect this:

ultra --silent -r --no-pretty --rebuild --concurrency 100 build

If I had 100 packages, it would show 100 "build" commands straight away, and then the outputs would all follow. Instead, I see what looks like a topological build still happening. Note I'm not using --build, which I know is topological; I'm trying to avoid that in this case but can't seem to.

Action Required: Fix Renovate Configuration

There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.

Error type: undefined. Note: this is a nested preset so please contact the preset author if you are unable to fix it yourself.

Doesn't seem to run without git

We're testing out ultra-runner for our docker builds, but when I run it, it says:

$ ultra -r build
โฏ @dish/auth at packages/auth
โฏ @dish/tscc-spec at packages/tscc-spec
โฏ @dish/tsickle-loader at packages/tsickle-loader
โฏ @dish/worker at packages/worker
error Not a Git repository /app/packages/auth
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
Service 'jwt-server' failed to build: The command '/bin/sh -c yarn build' returned a non-zero code: 1
##[error]Process completed with exit code 1.

I'm guessing ultra-runner does some git check and gets upset here. Seems like that should be optional.

Package ignore glob patterns should be configurable

Right now, we have a default ignore list of glob patterns for package exclusions:

const DEFAULT_IGNORE = [
  "**/node_modules/**",
  "**/bower_components/**",
  "**/test/**",
  "**/tests/**",
  "**/__tests__/**",
]

It might be a good idea to integrate cosmiconfig and make parts of ultra configurable, including the ignore list above.
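
A hypothetical sketch of what that could look like, using cosmiconfig's search API; the ignore key and the function name are assumptions, not existing ultra options:

// Hypothetical: merge user-supplied ignore globs, resolved by cosmiconfig
// (e.g. from an "ultra" property in package.json or an .ultrarc file),
// with the built-in defaults shown above.
import { cosmiconfig } from "cosmiconfig"

const DEFAULT_IGNORE = ["**/node_modules/**", "**/bower_components/**"]

async function getIgnorePatterns(): Promise<string[]> {
  const result = await cosmiconfig("ultra").search()
  const userIgnore: string[] = result?.config?.ignore ?? []
  return [...DEFAULT_IGNORE, ...userIgnore]
}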

Feature request: Arrays or more advanced glob patterns for --filter

Something that would be a huge help would be the ability to combine multiple --filter statements into one execution of ultra.

For example perhaps:

ultra -r --filter "@scope/app, services/*" start

or even:

ultra -r --filter "@scope/app" --filter "services/*" start

Forgive me if this is already possible using globs, but I can't seem to figure it out.

Builds never cached if there is a dep on a build-less dep

I have two packages @packages/a and @packages/b. @packages/a has no build step and @packages/b has a build step. @packages/b depends on @packages/a

When I run yarn ultra -r --filter "@packages/b" --build build the cache is never hit and it does a full rebuild. I'm guessing this is because @packages/a has no build, it gets no cache file, and ultra-runner thinks it needs to build it again.

Cleanup command after sending a SIGINT

Hello,

Thanks for this awesome runner! 🏃

I'm using it to run all my workspaces' dev scripts in parallel, it's so good!

One of my workspaces uses docker-compose up in its dev script, and I'm looking for a way to execute a cleanup command after I press CTRL + C in my terminal.

What signal do you send to the underlying scripts when quitting the terminal?

My use case is to be able to run docker-compose down when I press CTRL + C on ultra-runner. 😄

posttest runs before test is finished

I have seen that my posttest script is started before test has finished.

test contains two subtasks that can run in parallel. So I have added test to ultra.concurrent. posttest contains only one task and thus I have not tagged it as concurrent.

What leads to this behaviour and can I control it?

Thanks

Yarn v2 (berry) support?

Hi, this package seems great! I'm having problems with it though, running on a monorepo with yarn v2. Is yarn v2 supported?

Run yarn PnP commands faster

Yarn berry scripts can be run with

node -r ./.pnp.js path/to/$$virtual/package/binary.js

This halves the start time

I am already working on a PR for this

Recursive pattern exclusion

The recursive option may need an exclusion option for the folder pattern, because it looks recursively for package.json files even in dist folders, causing issues. For example:

# Raw environment
- packages
-- api
--- package.json
-- ui
--- package.json
package.json

So I have my build task, which will run build in each subfolder. The API compiles and copies its package.json to be installed for the Node.js server, which ends up in:

# Already built environment
- packages
-- api
--- dist
---- package.json
--- package.json
-- ui
--- dist
--- package.json
package.json

And if I run npx ultra build again, it ends up using the package.json of packages/api/dist, which breaks the build process. A solution would be to set a command like:

    "build": "rm -rf packages/*/dist && npx ultra -r yarn build",

Another solution would be to check .gitignore, maybe. Any ideas?

Ultra runner --build feature doesn't work in CI environments when non tracked files are downloaded from cache

I want to use ultra runner to have faster CI pipelines, only rebuilding changed packages

Ultra-runner uses the .ultra.cache.json files to detect when something has changed; to detect whether a non-tracked file changed, it compares it with the last-modified timestamp from the JSON file.

This doesn't work in CI because downloading the cache modifies the last-modified timestamp.

In my opinion we shouldn't check the last-modified timestamp of non-tracked files; we should only check that they exist.

We can do this by removing the --directory argument from the git ls-files command and not doing the last-modified-time check.

Removing the --directory argument from ls-files produces output like this:

+ artifacts/file.js
+ artifacts/file.d.ts
- artifacts/
normal_file.js

I can work on a PR if you agree

how are dependencies determined?

Hello, can you explain in more detail how dependencies are determined? Is it based on lerna symbolic links, or on scoped packages in the package.json dependencies object?

Group output option

When running things concurrently using --no-pretty, the logs are mixed together from all the different processes.

You could have an option, or even a default, to log the output in groups. You'd have to wait for each process to end, of course, and then output the whole log for each one as it finishes.

Perhaps --group-logs, though to be honest it's a reasonable default.

filter/reduce on git concestor

Thank you for Ultra Runner! I think a powerful feature would be to further filter based on files changed as per git history. One use case would be pre-integration CI/CD. Take ultra -r --filter "apps/*" build, which would build all the apps, whereas with this feature, ultra -r --filter "apps/*" --filter concestor build would only build the apps that have files that have changed since branching.

Git does all the work, and Ultra Runner already depends on git; this feature would provide the UI wrapper, options in the config file, intersection of multiple filters, etc. Here is a rather dated bash script I have used in the past for a real-world use case and the suggested git usage (git merge-base --fork-point $left_ref $right_ref is the star).

Idea: more efficient build strategy

Hey ultra people,

Thanks for this cool piece of code!

I'd like to share an idea. As far as I understand, the ultra cache currently relies on git file hashes for change detection in files, and timestamps for change detection in directories. I can see that the cache does not include the build artifact contents (e.g. in my project, the cache contains an entry for the dist directory, but not for its contents).

What if the cache contained entries for build artifacts as well, with hashes derived from their contents (e.g. an md5 checksum)? This would provide actual change detection. Say I've just changed a TypeScript annotation or a comment in package A. This change triggers a rebuild of package A. After the package is rebuilt, we compare bundle checksums against the cached ones. Since the change is dev-time only, it doesn't affect the bundle contents, so the rebuild can stop there and the dependent packages don't have to be rebuilt.

WDYT?

Passing args to npm run script - command failed

https://docs.npmjs.com/cli/v6/commands/npm-run-script#synopsis

// package.json (partial)
  "scripts": {
    "lint:src": "eslint \"src/**/*.{ts,tsx}\"",
    "lint:src:fix": "npm run lint:src -- --fix",
    "lint:fix": "npm run lint:src:fix"
  },
ultra lint:fix

Result

>ultra lint:fix
× lint:fix 2.906s
  × lint:src:fix 2.905s
    × lint:src 2.903s
      × $ eslint "src/**/*.{ts,tsx}" -- --fix 2.901s
        │
        │ Oops! Something went wrong! :(
        │
        │ ESLint: 7.15.0
        │
        │ No files matching the pattern "--fix" were found.
        │ Please check for typing mistakes in the pattern.
        │
        │
error
error Command eslint failed with exit code 2

npm run lint:fix works ok, no errors.

Env: Windows 10, Node 12 LTS, ultra installed globally (v3.6.0)

Expectation: be able to exclude some files from .ultra.cache.json

First, ultra-runner is awesome!!!

But there is a use case:

some build processes can produce files asynchronously, or you may want files excluded by .gitignore not to be put in .ultra.cache.json

async getFiles(directory: string, exclude: string[] = []): Promise<GitFiles> {
    ...
}

async function getPackageFiles(
  root: string,
  workspace: Workspace | undefined
): Promise<PackageFiles> {
  return {
    files: await cache.getFiles(root),
    deps: getDependencies(root, workspace),
  }
}

npx ts-node myscript.ts fails with ENOENT

I have a script in my package.json like "doit": "npx ts-node myscript.ts".
ts-node is in my devDependencies and installed.
ts-node.cmd is in node_modules/.bin/ (along with the bash and PowerShell versions).
Running the package script with npm run doit works fine.
Running npx ultra doit fails with:

error error Command ts-node failed with Error: spawn ...\node_modules\.bin\ts-node ENOENT. Is the command on your path?

I'm on Windows 10.

Install hangs with PNPM workspace

Hi!

I will try to prepare a repro case asap, but for now:

My install hangs indefinitely on "Running" while using a pnpm workspace. Using pnpm install by itself, without ultra, works well.

The install should take a few minutes (~3 minutes with pnpm) but it should complete.

OS: Windows 10 64
Package manager: pnpm 4.14.0

Prefix logged lines with package name when using `--raw`

I am using the --raw option so the logs aren't cut off when running in CI; the problem is that I can't really tell which package the logs come from.

It would be cool to have the logged lines prefixed with the package name, like this:

> ultra -r --raw build

[package-1] ./src/index.ts → lib, lib...
[package-1] babelHelpers: 'bundled' option was used by default. It is recommended to configure this option explicitly, read [package-1] more here: https://github.com/rollup/plugins/tree/master/packages/babel#babelhelpers

[another-package] ./src/index.ts → lib, lib...
[another-package] babelHelpers: 'bundled' option was used by default. It is recommended to configure this option explicitly, [another-package] read more here: https://github.com/rollup/plugins/tree/master/packages/babel#babelhelpers
[another-package] created lib, lib in 501ms

I can work on a PR for this

Large git repos are not supported

I've tried to launch a build for a lerna repo with a huge git repository, which led to this issue. I just added an additional line in cli.js directly in node_modules to see the full error with a stack trace, as it was hidden, and here is what I found.

error stdout maxBuffer length exceeded { RangeError [ERR_CHILD_PROCESS_STDIO_MAXBUFFER]: stdout maxBuffer length exceeded at Socket.onChildStdout (child_process.js:354:14) at Socket.emit (events.js:198:13) at addChunk (_stream_readable.js:288:12) at readableAddChunk (_stream_readable.js:265:13) at Socket.Readable.push (_stream_readable.js:224:10) at Pipe.onStreamRead (internal/stream_base_commons.js:94:17) cmd: 'git ls-files --full-name -s -d -c -m -o --directory -t' }

Running this command standalone indeed returns a huge list; I waited a few minutes before stopping it.

Is it possible to bypass this step or increase buffer limit for such cases?

Feature request: extended glob for --filter

In a yarn workspace setup, it would be nice to be able to target multiple, specific packages using extended globbing:

$ ultra -r --filter "@scope/(app|docs)" pwd

It is currently possible to use the --filter option to target a specific package:

$ ultra -r --filter "@scope/app" pwd

Or all packages under a scope or directory:

$ ultra -r --filter "@scope/*" pwd

Assuming this is a useful feature, maybe it would be as simple as adding config for globrex here?

Question: Can you run multiple root and recursive packages/commands selectively?

Hi thanks for the awesome work on Ultra! I'm trying to figure something out and I was hoping to get advice (I searched and read a lot of issues and through the code but couldn't find a solution).

I have a Lerna monorepo with something like this:

web/
  - app/
  - www/
  - blog/
services/
  - proxy/
  - api1/
  - api2/

I want to selectively run certain web apps but all services. Currently I have a root yarn watch script in my root package.json that uses lerna to trigger all the services, set up like "watch": "yarn ./services/setup.ts; lerna run --parallel watch" in one terminal with ultra watch, and then I just run one web app's "start": "concurrently --kill-others \"yarn:build:watch\" \"yarn:tsc:watch\"" command at a time with ultra -r --filter "@scope/app" start in (an)other terminal(s). I sometimes need to run two or more from web.

My questions are: 1) Can I selectively run a root script such as watch together with a recursive filtered script? 2) Can I run more than one recursive filtered script at a time, or more than one scoped package somehow? 3) Can I combine these all into one command?

Thanks a bunch in advance.

Better support for running watches in related workspaces

Not sure if the title is super clear, but what I mean is running "start" or "watch" commands on multiple related workspaces with recursion. Currently this is problematic, because these commands don't finish in the normal sense, so they shouldn't be waited for. But they also can't be run fully in parallel, because that ends in failures when workspace dependencies haven't been built yet.

So I'd suggest something similar to wsrun's "--done-criteria" option, that would watch the output of each running workspace and once the output matches a regex, continue on to dependent workspaces.

Unable to run more than 10 concurrent scripts

Hi,
I'm trying to run this script:

ultra -r --concurrency 20 --filter "@my-workspace/*" start

Unfortunately, even though I specified a concurrency of 20, I'm not able to run it on every package. It seems the concurrency is always limited to 10 (the default).

Do you have any advice?
