Conversation
I like the idea, it would greatly reduce the amount of filesystem operations that pnpm has to do in order to create an isolated node_modules layout using symlinks. I also suggested to arcanis to possibly go one layer deeper and allow mapping the individual files of packages. This would allow mapping node_modules directly from a content-addressable store (that consists of package files). Of course, that would increase the size of the file several times, but it would also make installation even faster.
Codecov Report

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main   #62239      +/-   ##
==========================================
- Coverage   91.55%   89.69%   -1.87%
==========================================
  Files         355      707     +352
  Lines      149381   218508   +69127
  Branches    23364    41807   +18443
==========================================
+ Hits       136765   195982   +59217
- Misses      12354    14445    +2091
- Partials      262     8081    +7819
```
This seems quite close to the import maps proposal; did you consider building on them?
I did, but felt that the semantics were too different; import maps have two fields, `imports` and `scopes`. Neither of those match the semantics we need, and reusing them just for their name but with different semantics would, in my opinion, have been misleading for third-party resolver implementors.
Love it. I'll try to give a detailed review on the flight home today, if the in-flight Wi-Fi treats me kindly.
What happens if two packages define the exact same path? An error, presumably, given that the algorithm relies on being able to determine for each path which package it belongs to? Needs a test in any case.

I'll follow up with a separate improvement to key module instances by both their path and their package ID (it's key to solving the peer dependency problem I mentioned in the PR description), but for this iteration only one package ID per path is supported. I updated the code to throw an error accordingly.
@guybedford @avivkeller Can I get another review? Once this lands I'll start looking at a prototype for generating package maps in package managers, and at the package ID follow-up.
> In the example above both `lib-old` and `lib-new` use the same `./lib` folder to store their sources, the only difference being which version of `react` they'll access when performing `require` calls or using `import`.
How does this work given the module cache? Won't the first one to load's `react` version "win"?
Yes, that's why it's not implemented in this PR (it throws should this case happen), and will be implemented as a follow-up: #62239 (comment). Still, I want it documented, since third-party tools reading package maps had better know about this early in the implementation.
What happens when there are symlinks in the specified paths? From the current implementation it looks like it basically breaks. You could sort of fix this by resolving symlinks in the package map (at the cost of a lot of filesystem operations during startup), but then you have the problem that the "longest path containing a file" heuristic is now ambiguous: you could have a file

```json
{
  "packages": {
    "my-app": {
      "path": "./src",
      "dependencies": {
        "foo": "foo",
        "baz": "baz"
      }
    },
    "foo": {
      "path": "/foo/bar"
    },
    "baz": {
      "path": "/baz"
    }
  }
}
```

and then I don't have a good intuition for what the right thing to do here is.
It depends where the symlink is:
Is it possible for @arcanis or someone else familiar with the work to come to the next TSC meeting to talk a little about this implementation? I have two concerns with landing this at the moment:
- Awareness: as mentioned in the "Why?" section of its description, this PR is introducing a completely new alternative to the more-than-a-decade-old module resolution system. Have APM folks been looped in? Were the other adjacent WGs consulted? @nodejs/package-maintenance @nodejs/tooling
- Import maps: is the current implementation going to create an irremediable incompatibility with that standard? Can this use some help from the @nodejs/wintercg folks to standardize it so that other runtimes may also implement support in the future?
That said, thanks @arcanis for taking the time to work on it! I feel like the runtime definitely needs to innovate on this and sorry to be potentially slowing things down in the immediate time but hopefully it's to make sure we land on a solution that will last! ✌️
I have been looking for someone who wanted to take over and help get #49443 over the finish line. Maybe this would be a good opportunity to start a thread with @nodejs/wintercg and come up with the technical and UX goals we have for this feature. I think we need to decide if we need spec changes (to pursue via the standards process) or if we can find workarounds that maintain spec compliance. @arcanis happy to work with you on this if you are interested.
Sure, happy to, please invite me to the next meeting. I also discussed it a little while ago with folks at the WinterTC and it wasn't clear at the time whether it was worth a multi-vendor discussion at that stage. So I figured that since this is behind
The designs serve different goals. The way import maps are keyed makes it impossible to support very common dependency graphs, causing package managers to generate semantically incorrect installs or rely on features like inject or virtual paths which add complexity and come with their own trade-offs. That's not to say import maps can't be implemented in Node.js as well, but their design limits them to solve more web-specific use cases that wouldn't satisfy the practical problems our local package managers set out to address.
Using the example in "How it works" as a base, some compatibility/security questions:

Context: I'm looking at it from the experience of having to map the dependencies for LavaMoat policy files, and these are some of the reasons we were forced to invent canonical names that represent the package as the shortest path in the dependency tree leading to it; we use those as keys in a flat structure.
Separate arbitrary IDs. The graph is stored flat in:

```json
{
  "packages": {
    "my-app": {
      "path": "./src",
      "dependencies": {
        "lodash": "lodash@1",
        "react": "react"
      }
    },
    "lodash@1": {
      "path": "./node_modules/lodash"
    },
    "lodash@2": {
      "path": "./node_modules/react/node_modules/lodash"
    },
    "react": {
      "path": "./node_modules/react",
      "dependencies": {
        "lodash": "lodash@2"
      }
    }
  }
}
```
Separate IDs as well; from my experience with Yarn, we give different identities to packages that come from different sources (git, npm, http, etc). As an example, we also don't deduplicate them during hoisting.

I'd treat `bundleDependencies` as their own source, so supporting package managers would give them their own IDs.
I like this idea generally! How does it differ from a lockfile? Could it replace the lockfile pattern, or serve both purposes? It seems like this could potentially merge the lockfile and node_modules into a single lightweight resolution/lock file, eliminating a huge number of stateful edge cases that exist with the lockfile+node_modules pattern. What is the DX for breaking out a local override / edit / vendor on specific packages?
Lockfiles are package-manager-specific files and contain arbitrary information that is only relevant to the package manager that generated them (checksums, whether the package contains scripts, and more). Depending on the package manager, they also store less information than is needed for runtime execution. As an example, Yarn doesn't store the peer dependency resolutions in the lockfile; they are always recomputed at install time. Package maps are an install byproduct, same as `node_modules`.
What's the intention around committing the resolution file to source control? It seems like if the intention is to have it committed, then a package manager could merge its lockfile state with resolution state into that single file and reduce or eliminate an entire class of post-install mutation and drift defects that can happen between lockfiles and node_modules. If it's not the intention to capture the resolution file in source control, why not?
My goal was to give it more time to raise awareness, and I'm satisfied with the extra time and outreach effort we've made since. I'm happy to lift the block now and let the rest of the collaborator base work on landing the best solution here :)
I'm the maintainer of a "create"-style package. This worked fine when we only supported Yarn. But now I've added support for npm and pnpm as well, and doing the lock file generation (by performing a full install, grabbing the package-manager-specific lock file, and then deleting node_modules) now takes around three minutes, significantly slowing down both local testing and CI.

Would shipping a package map in our "create" package provide similar speed gains as we currently achieve by shipping the lock files? If it does speed up that dependency analysis/resolution in a package-manager-agnostic way, that's another win worth highlighting :) Not to mention the reduced complexity compared to having to generate multiple lock files.
lib/internal/modules/cjs/loader.js (outdated)

```js
function tryPackageMapResolveCJS(request, parent, conditions) {
  if (!hasPackageMap()) { return undefined; }
  // …
  const parentPath = parent?.filename ?? process.cwd() + path.sep;
```
Not a blocker for landing a PR, but probably something to iron out before making this stable: using `process.cwd()` feels weird here, as that would mean a package map could load completely different files depending on the CWD (i.e. `node --experimental-package-map=subdir/package-map.json subdir/index.js` and `cd subdir && node --experimental-package-map=package-map.json index.js` would behave differently). A less surprising behavior would be to use the dirname of `package-map.json` instead.
The `process.cwd()` here is only used when a `require` call is performed by something that doesn't have a parent. Usually that's when evaluating a script with `node -e`.

As for `--experimental-package-map`, if provided a relative path it'll always be relative to the cwd, since we call `pathResolve` here. I think that's the expected behavior from a user's point of view.
Resolving the path to the JSON file is certainly fine, I agree. My point is more that the paths inside that JSON file should not be CWD-dependent.
> The `process.cwd()` here is only used when a `require` call is performed by something that doesn't have a parent. Usually that's when evaluating a script with `node -e`.
That still doesn't seem right; `evalScript` does define its own `module.filename` using the CWD. Also, any entry point script doesn't have a parent, and in general we never use the "parent" (which is not reliable, see https://nodejs.org/api/deprecations.html#dep0144-moduleparent) to resolve `require` calls.
> Resolving the path to the JSON file is certainly fine, I agree. My point is more that the paths inside that JSON file should not be CWD-dependent.
They are not, they are relative to the location of the package map:
https://github.com/nodejs/node/pull/62239/changes#diff-dc08f64ce5a7c16618ae8b4ee0e179f2a0362e31a0c7884969b2c820070d3092R95
> That still doesn't seem right; `evalScript` does define its own `module.filename` using the CWD.
I was slightly off: the "no parent" case is when calling `node -r some-package` (not `-e`). In that case the parent module that's passed to `_resolveFilename` has its `filename` property set to `null`, in which case we need to fall back to the cwd.
> Also any entry point script doesn't have a parent, and in general we never use the "parent" (which is not reliable, see https://nodejs.org/api/deprecations.html#dep0144-moduleparent) to resolve `require` calls
In the context of the resolution pipeline (`_resolveFilename`), the parent is whichever module performs the `require` call; it's not the same as `module.parent`. It's reliable in that context.
Let's use the existing util: `lib/internal/modules/cjs/loader.js`, lines 609 to 621 (in d0fa608).
wesleytodd left a comment:
If this lands, are we saying that we will not ship import maps?
I would like to repeat my concern that shipping a feature which is incompatible with an existing web platform spec in the same space is problematic, especially without working through the gaps with the spec maintainers to attempt reconciliation.

Obviously my review is not very valuable without more clarity on the gaps than is documented here, but I will re-state my ask to collaborate on an import-map-compatible solution. I believe that most of what folks asked about is addressable without breaking compatibility with the spec, via partial scope matching and falling back to the Node.js resolution algorithm for the last mile.
This PR adds a new `--experimental-package-map=<path>` flag letting Node.js resolve packages using a static JSON file instead of walking `node_modules` directories.

Why?
The `node_modules` resolution algorithm predates npm and its clear definition of the concept of packages. It works well enough and is widely supported, but has known issues:

- Phantom dependencies: packages can accidentally import things they don't declare, because hoisting makes transitive dependencies visible
- Peer dependency resolution is broken in monorepos: if `website-v1` uses `react@18` and `website-v2` uses `react@19`, and both use a shared `component-lib` with React as a peer dep, there's no `node_modules` layout that resolves correctly. The shared lib always gets whichever React was hoisted.
- Hoisting is lossy: runtimes can't tell if an import is legitimate or accidental
- Resolution requires I/O: you have to hit the filesystem to resolve packages, and that quickly adds up. Even more so in agentic worlds where working with multiple git trees becomes a common pattern.
Package managers have tried workarounds (pnpm symlinks, Yarn PnP), but are either limited by what the filesystem itself can offer (like symlinks) or by their complexity and lack of standardization (like Yarn PnP). This PR offers a mechanism for such tools to solve the problems listed above in tandem with Node.js.
How it works
A `package-map.json` declares packages, their locations (relative to the package map), and what each can import:

```json
{
  "packages": {
    "my-app": {
      "path": "./src",
      "dependencies": {
        "lodash": "lodash",
        "react": "react"
      }
    },
    "lodash": {
      "path": "./node_modules/lodash"
    },
    "react": {
      "path": "./node_modules/react"
    }
  }
}
```

When resolving a bare specifier:
- the requesting package's `dependencies` field is consulted for the specifier
- the matching package ID is mapped to its `path`
- if the specifier isn't listed in `dependencies`, an `ERR_PACKAGE_MAP_ACCESS_DENIED` error is raised
- if the package can't be found, a regular `MODULE_NOT_FOUND` error is raised

Compatibility
An important aspect of the package maps feature that separates it from competing options like Yarn PnP is its built-in compatibility with `node_modules` installs. Package managers can generate both `node_modules` folders AND `package-map.json` files, with the latter referencing paths from the former.

Tools that know how to leverage `package-map.json` can then use this pattern for both static package resolution and strict dependency checks (with optional fallbacks to hoisting if they just wish to use the package map information to emit warnings rather than strict errors), whereas tools that don't will fall back to the classical `node_modules` resolution.

Differences with import maps
Issue #49443 requested to implement import maps. In practice these aren't a good fit for runtimes like Node.js, for reasons described here and which can be summarized as: import maps take full ownership of the resolution pipeline by spec, thus preventing the implementation of additional runtime-specific behaviours such as the `exports` or `imports` fields.

This PR comes as close to implementing import maps as possible, but with a very light difference in design making it possible to stay compatible with other Node.js resolution features.
Why not a loader?
The ecosystem now has to deal with a variety of third-party resolvers, most of them not implementing the loader API for many different reasons: too complex, Turing-complete, or dependent on a JS runtime.

Having followed this path for more than six years, I can confidently say that loaders would work for Node.js itself but wouldn't be standard enough to be included in at least some of those popular third-party tools.
Package manager integration
Package maps are meant to be generated by package managers. For example you could imagine Yarn or pnpm generating a `node_modules` directory containing a `node_modules/.package-map.json` file should the feature be enabled (it would likely remain opt-in while experimental).

Then all `yarn node` calls would inject the flag into the environment when spawning Node.js.

Questions
- Should strictness only apply when `--experimental-strict-package-maps` is set? Or via a `strict` field in `package-map.json`?