Parallel and sequential, especially at the command level, are really the wrong abstractions for running scripts. If you have multiple packages, each with its own build, there's a good chance you have inter-package dependencies, with multiple packages depending on common ones.
What you really want is a way for scripts to describe their dependencies, and then the runner figures out what order to run them in and caches scripts that don't need to be rerun because their inputs didn't change.
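To make the caching half concrete, here's a minimal sketch of "skip if inputs didn't change" in plain shell. All paths and the `run_build` helper are illustrative, not taken from any particular tool:

```shell
#!/bin/sh
# Illustrative sketch of content-addressed skipping: the script's inputs are
# hashed, and the build step only runs when the hash differs from the one
# recorded after the last successful run.
set -eu
tmp=$(mktemp -d)
mkdir -p "$tmp/src"
echo 'export const x = 1' > "$tmp/src/a.ts"

run_build() {
  # Fingerprint every input file; a real runner would also cover dependency
  # outputs, environment variables, the command line, etc.
  hash=$(cat "$tmp"/src/* | sha256sum | cut -d ' ' -f1)
  if [ -f "$tmp/build.hash" ] && [ "$(cat "$tmp/build.hash")" = "$hash" ]; then
    echo "cached, skipping"
  else
    echo "building"                    # the real build command would go here
    echo "$hash" > "$tmp/build.hash"   # record the fingerprint on success
  fi
}

run_build                                # prints "building"
run_build                                # prints "cached, skipping"
echo 'export const x = 2' > "$tmp/src/a.ts"
run_build                                # inputs changed: prints "building"
```

A dependency-aware runner does this per script, hashing each script's declared inputs plus the outputs of its dependencies, so a change anywhere invalidates exactly the scripts downstream of it.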
Wireit[1] is an npm script runner that adds that incrementally on top of package.json. I can't manage an npm monorepo without it now.
Deno started integrating the idea directly into their built-in script runner. I think this is an important enough feature that more runtimes should follow Deno's lead.
> What you really want is a way for scripts to describe their dependencies, and then the runner figures out what order to run them in and caches scripts that don't need to be rerun because their inputs didn't change.
DAG + content-addressing, with the final binary as the target and everything resolved from there. We could have some beautiful build system that just works and is fast, but it seems it never magically appears by itself, however elegant the idea. I guess Nix/NixOS is the closest we've gotten so far; it works well enough, though it's missing concurrency and parallelism.
Wireit does both DAG and content-addressing. It fingerprints the inputs and the outputs of dependencies. And you run scripts externally with plain `npm run` commands. It's really beautiful.
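For reference, a Wireit setup is roughly shaped like this (a sketch from memory; the exact field names are in the Wireit docs). Each npm script delegates to `wireit`, and a parallel `wireit` block declares the command plus the dependencies, input files, and output files used for fingerprinting:

```json
{
  "scripts": {
    "build": "wireit"
  },
  "wireit": {
    "build": {
      "command": "tsc",
      "dependencies": ["../shared-lib:build"],
      "files": ["src/**/*.ts", "tsconfig.json"],
      "output": ["lib/**"]
    }
  }
}
```

`npm run build` then builds `../shared-lib` first if needed, and skips the `tsc` step when nothing in `files` or the dependency outputs has changed.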
Google's build system Bazel is what you describe.
If only we could make something like that
But now we would need each script to independently do its own caching, which isn't all bad. At least you get more cross-runner compatibility and resilience.
Wireit really is that. The script declares dependencies and input, Wireit caches based on the direct inputs and dependency outputs.
Well, that speeds things up a lot. But I agree with spankalee, it should be a DAG.
< "ci": "CI=true bun run check && bun run test && bun run build && bun run docs && bun run zip && bun run zip:firefox"
> "ci": "CI=true bun run --parallel check test build docs && bun run --parallel zip zip:firefox"
Is it more common in English to use the terms Parallel and Sequential, or Parallel and Series? I made a React library to generate video as code and named two components <Parallel> and <Series>, and I was wondering whether those were the two best terms to use...
Electrical engineering talks about parallel and series. (Including the old parallel and serial ports on computers, before almost everything became serial.)
Programming talks about parallelism or concurrency or threading. (single-threading, multi-threading)
Or synchronous and asynchronous.
The legal system talks about concurrent and consecutive.
Process descriptions might use "sequential" rather than consecutive or series.
"Linear" is another possibility, but it's overloaded since it's often used in reference to mathematics.
Both would be understood and are roughly interchangeable.
"Sequential" feels more appropriate to me for the task runner scenario where we wait for one task to finish before running the next.
"Series" suggests a kind of concurrency to me because of the electrical circuit context, where the outputs of one are flowing into the next, but both are running concurrently. Processes that are Unix piped into each other would be another thing that feels more like a "series" than a "sequence".
The electronics terms parallel and series are about static physical connections (things are connected in parallel or series — the more grammatical form would be in a series).
The software terms parallel and sequential are about the temporal relationship of activities (things are done in parallel or sequentially). That’s why in software we also have the term “concurrent” which means something different from “parallel”.
When talking in terms of software parallelism, "parallel" and "sequential" are more common to describe, for example, multi-threaded vs. single-threaded implementations.
Sequential is a fuzzier word. It can imply that a series of steps feeds output from step A into step B and so on. But it can also drift into territory typically described as linearization, where a task runs in parallel but its results apply in series, in sequence.
I think your average person knows what sequential means but might not remember what series means. Personally I always remember the meaning of series in “parallel vs series” because it must be the opposite of parallel. I’m not proud of the fact that I always forget and have to re-intuit the meaning every time, but the only time I ever see “series” is when people are talking about a TV show or electronics.
Parallel and Series makes sense to me; it's also the terminology used for electrical circuits.
Genuine question out of curiosity. Why do I want parallel and sequential when I can just write a simple bash script to accomplish the same thing? Is there some additional complexity I’m missing?
As a note here, there are a lot of resources that make bash seem incredibly arcane, with custom functions often recommended. But a simple interruptible script to run things in parallel can be as simple as:
(trap 'kill 0' INT TERM; cmd1 & cmd2 & cmd3 & wait)
Or, for 1+2 sequentially, in parallel with 3+4 sequentially:
(trap 'kill 0' INT TERM;
(cmd1 && cmd2) &
(cmd3 && cmd4) &
wait
)
(To oversimplify: the trap runs `kill 0`, which sends the signal to every process in the current process group, covering the subshell made by the () parens and its children; it only needs to be set at the top level. & means run in the background; && means continue to the next command only on success.)
There are other reasons one might not want to depend on bash, but it's not something to be afraid of!
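One thing plain `wait` with no arguments glosses over is failure handling: it doesn't tell you whether any of the background jobs failed. A variant that propagates failures, using `wait PID` per job (the `run_parallel` helper name is made up):

```shell
#!/bin/sh
# Sketch: run two commands in parallel and exit nonzero if either fails.
# Commands are passed as strings and run via sh -c, purely for illustration.
set -eu
run_parallel() {
  ( trap 'kill 0' INT TERM
    sh -c "$1" & pid1=$!
    sh -c "$2" & pid2=$!
    status=0
    wait "$pid1" || status=1   # wait PID returns that job's exit status
    wait "$pid2" || status=1
    exit "$status"
  )
}

run_parallel 'echo one' 'echo two'                         # exit status 0
run_parallel 'echo ok' 'exit 3' || echo "a command failed"
```

Unlike bare `wait`, this makes a `ci` pipeline actually stop when one of the parallel steps breaks.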
I get where you're coming from, and if this were a package I'd agree. But having this built in as part of the tooling is nice: one less dependency. And bash isn't as ubiquitous as you assume.
This is cleaner and you don't have to write a bash script. It's one less thing (well, several: the script, bash, and its dependencies), which is important in containers and at scale.
It lets developers on Windows also build and test your package in parallel mode. If you make your build scripts bash, they're Linux-only.
> if you make your build scripts bash, they’re Linux only
Git Bash exists on Windows and is perfectly usable.
It's still much less dependable compared to something fully supported like Bun.
> when I can just write a simple bash script to accomplish the same thing
At this point you don't need most things...
But this is no more than 5 lines of code. If it was 100 I’d understand.
A few reasons.
1. A minor speed boost from not needing to start bun multiple times (or to extract the build/test/lint commands from package.json).
2. You can query/filter commands, e.g. run all my tests (both unit and integration).
3. You avoid needing a separate Bash install (for Windows).
This is nice to see, but I'm curious to check whether the WebSocket bugs are all gone (I had a watch on a particular one that stopped me from running Node-RED in some circumstances, but I can't find it on mobile...)
Why does Anthropic even need Bun? Is Claude not good enough to write something far superior very fast?
Think you answered your second question with your first.
Their C compiler project proves the opposite.
[deleted]
IIRC Claude actually did write a lot of it.