Aug 5, 2024

The Deno Package Paradox


Deno packaging is now a fragmented mess. It didn’t start that way.

Designing Deno’s module system around HTTP imports was ambitious. It aimed to replace npm with a distributed system over HTTP, aligning with how ES Modules work in browsers. […] Despite its promise, several issues with HTTP imports emerged since that initial implementation.

What we got wrong about HTTP imports

This is what Ryan Dahl — creator of Deno and Node — is now saying. Contrast this with Dahl’s 2018 JSConf talk, in which NPM and package.json were listed among the regrettable mistakes. To fix this, Deno would use ESM imports with URLs to sidestep the module resolution required by Node.

If only relative files and URLs were used when importing, the path defines the version. There is no need to list dependencies.

That was 2018; in 2024 Dahl and Deno have changed course. It’s okay to admit you’re wrong. Commendable even. But I do question the motivation here. Before I get spicy and conspiratorial let’s look at the mess this has created. Oh, it’s messy…

HTTP Imports

HTTP imports will remain in Deno.

import {leftpad} from "http://...";
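To make that concrete, here is a fuller, purely illustrative example (the URL and version are my own choice, not from the post) showing how the version lives in the URL path itself:

// the dependency version is pinned in the path, not in a manifest
import { assert } from "https://deno.land/std@0.200.0/assert/mod.ts";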

Since the v1.30 release, Deno has supported an import map inside deno.json.

{
  "imports": {
    "is-even": "https://...",
    "is-odd": "https://..."
  }
}

This config file will look familiar. It’s basically package.json mixed with the import maps standard, which itself was based on package.json and is supported in browsers.
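With that map in place, project code uses the bare specifiers and Deno swaps in the mapped URLs at resolution time. A rough sketch, assuming the URLs above point at ES modules with default exports:

import isEven from "is-even";
import isOdd from "is-odd";

console.log(isEven(2)); // true
console.log(isOdd(3));  // true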

Herein lies our first problem, one that was never solved.

If you try to publish a Deno library with its dependencies in an import map, that library cannot be imported. It can’t be used. Deno only resolves the deno.json of the top-level project, not those of imported libraries, so it cannot find their sub-dependencies.

That feature is exclusive to JSR:

During publishing, jsr publish / deno publish will automatically rewrite the specifiers in your source code to fully qualified specifiers that do not require an import map / package.json anymore.

I’ll discuss JSR later. For now, if you don’t want to publish on JSR, you cannot use deno.json for your library dependencies. Instead, the convention has been a deps.ts file where you re-export your dependencies.

export * as dependency from "http://...";
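To sketch the convention a little further (the names, URL, and version here are hypothetical): the library pins and re-exports its third-party code from deps.ts, and every other module imports from that one file, so consumers never need the author’s import map:

// deps.ts: pin and re-export every third-party dependency
export { parse } from "https://deno.land/std@0.200.0/flags/mod.ts";

// mod.ts: import via deps.ts, never via a bare specifier
import { parse } from "./deps.ts";

export function cli(args: string[]) {
  return parse(args);
}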

JSR is basically shimming this for your convenience.

Why can’t the Deno binary itself resolve sub-dependencies? With HTTP imports it doesn’t know where to look for a third-party deno.json. Even if there were a convention, like a reference pointer, import maps have issues beyond my smooth brain to explain.
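Here is a hypothetical sketch of the failure mode (URLs and names are made up). The library author maps “is-even” in their local deno.json, but that mapping never travels with the published module, so consumers hit an unresolved bare specifier:

// https://example.com/my-lib@1.0.0/mod.ts
// the author's deno.json maps "is-even" to a URL,
// but Deno never fetches or reads that remote config
import isEven from "is-even";

export const even = isEven(2);

// consumer.ts
// fails: Deno cannot resolve the bare specifier "is-even"
import { even } from "https://example.com/my-lib@1.0.0/mod.ts";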

Deno v1.31.0 added package.json support and with it the ability to import NPM packages and their sub-dependencies just fine. Node had that solved all along. In Dahl’s defence, here is what he says about why support was added:

The JavaScript ecosystem’s reliance on a single centralized module registry conflicts with the web’s decentralized nature.

Why We Added package.json Support to Deno

That quote has nothing to do with why. It’s just pertinent to where I’m going next. You see, Deno is giving up on decentralised packages via HTTP imports to encourage their new centralised solution: JSR.

What is JSR?

Deno built JSR: the JavaScript Registry. You must absolutely not call JSR a “package manager” — Dahl is a little touchy about that. I presumed it was because of the whole “10 regrets” thing, but now I’m unsure.

Technically the deno binary is the package manager. You can run:

deno add npm:left-pad

This command adds the package to your deno.json import map:

{
  "imports": {
    "left-pad": "npm:left-pad@^1.0.0",
    "@std/fs": "jsr:@std/fs@^1.0.0"
  }
}

Deno are even proposing a breaking change to the install command so that it behaves the same way as npm install does for Node. Deno has added special prefixes for npm:, node:, and jsr: packages. This is the new solution to replace HTTP imports: just publish on JSR.
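Those prefixes also work directly in source code without an import map. A quick sketch, with illustrative versions:

import leftPad from "npm:left-pad@^1.0.0";
import { exists } from "jsr:@std/fs@^1.0.0";
import { readFile } from "node:fs/promises";

console.log(leftPad("5", 3, "0")); // "005"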

JSR does not support HTTP imports:

Deno code on JSR will NOT be permitted to use HTTPS imports. JSR performs deduplication of dependencies based on semantic versions, which is not possible with HTTPS-imported dependencies.
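A rough sketch of what that means in practice (package names and versions here are made up): two HTTPS imports are just two unrelated URLs as far as Deno is concerned, while two semver ranges on a JSR specifier can resolve to a single shared version:

// both URLs are fetched separately; Deno cannot prove they are the same package
import { a } from "https://deno.land/x/example@1.0.0/mod.ts";
import { b } from "https://deno.land/x/example@1.0.1/mod.ts";

// both ranges can be deduplicated to one 1.0.x release
import { c } from "jsr:@scope/example@^1.0.0";
import { d } from "jsr:@scope/example@~1.0.1";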

In my initial thoughts on JSR I pointed out a glaring issue. JSR has no support for any pre-JSR Deno library that used HTTP imports; i.e. anything with a dependency. You have to go up the dependency chain and encourage package authors to publish on JSR before you can publish yourself.

This means that publishing Deno packages on JSR is significantly more challenging than publishing Node packages, if not practically impossible. Ironic, considering JSR was supposed to fix the Deno dependency issue. Unless every Deno developer jumps on the JSR train it will fragment the Deno package ecosystem.

So, everyone is on board, right?

Oh dear… 💀

Last week I noted that six months after public launch JSR had fewer than 5000 packages. Checking today, that number is 3655 — it has fallen by over 1000; a purge? Regardless, it’s not exactly popular, is it? Remember, JSR is not just “the Deno registry”.

What would a healthy number be? The beauty of HTTP imports was that packages could be imported from anywhere; importing directly from GitHub was an option. Deno provided deno.land/x where 7576 packages currently reside. That service used Git tags for versioning rather than deno.json. Maintaining both is a chore. Even the official standard library there is 3 months out of date. It was abandoned to promote JSR.

That’s at least 7.5k packages left to rot in favour of a new registry that is incompatible with how Deno has handled dependencies up until now.

It’s obvious that NPM and its 3.4 million packages do a good enough job for Node developers. And with Node merging experimental TypeScript support, JSR’s killer feature isn’t quite the humble brag it briefly was. If Node devs are even aware of JSR they have little reason to care.

Where are the Deno developers? Are they happy with HTTP imports? Are they blocked because of HTTP imports?

Are they in the room with us now?

Something’s off

I was attracted to Deno by its decentralised nature and familiar, web-standards-based APIs. I like writing web servers with the Fetch API. Reading files with Web Streams. Using Web Crypto, Promises with top-level await, etc. Deno felt refreshingly modern compared to Node. I was excited to see what innovation followed.
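For a flavour of that original appeal, here is a small sketch using only web-standard APIs on the Deno runtime (the file path and port are arbitrary):

// Fetch API handler, Web Streams for file reading, Web Crypto for hashing
Deno.serve({ port: 8000 }, async (request: Request) => {
  const file = await Deno.open("./data.txt", { read: true });
  const bytes = new Uint8Array(await new Response(file.readable).arrayBuffer());
  const hash = await crypto.subtle.digest("SHA-256", bytes);
  return new Response(`${request.url} hashed to ${hash.byteLength} bytes`);
});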

Innovation did not follow.

Instead we’ve had months — maybe years now — of dull release notes laden with Node & NPM compatibility fixes.

I don’t think Dahl changed tune simply because of the developer experience. Despite all his regrets about Node packaging, Deno packaging is now in a worse state.

Deno, the company, is funded by venture capital. VCs want growth. VCs want a return on investment. I think the pressure is rising at Deno HQ. Have investors lost patience with ideals and innovation? All this backtracking looks like a desperate attempt to attract more developers. Where are those developers? They’re using Node. Maybe Node compatibility is the solution? Nope. It ain’t working. Why would a Node developer move to Deno only to continue using Node APIs and NPM modules?

I’m just spitballin’ but something is off with Deno. I get bad vibes around the whole Deno ecosystem.

Deno has stalled. And whilst Deno is floundering, Node is adding all the good stuff that made Deno interesting. As a hobbyist Deno developer, sure, importing NPM packages is handy. But why exactly was the entire Deno package ecosystem fragmented? TypeScript is not the answer. TypeScript was doing just fine on its own. TypeScript didn’t ask for JSR.

Should we expect “10 Things I Regret About Deno” at JSConf 2028?

Related Notes

I follow up in notes from 5th August, 6th August, and 7th August discussing further problems with Deno packages.

I’ve published a follow-up review of JSR and Deno with even more discussion.
