JavaScript Runtimes and Package Managers

I want to finish reading the Node.js Documentation, and I also want to read about the JavaScript runtimes Deno and Bun. I learned some things when reading the Node.js documentation for a Daily Reading post, and I think I will learn more useful information by reading about these 3 JavaScript runtimes. I also want to read about Package Managers to see if I could speed up my CI.

Date Created:
Last Edited:

JavaScript Runtimes


A JavaScript engine is simply a computer program that executes JavaScript code. It's responsible for translating human-readable JavaScript code into machine-readable instructions that the computer's hardware can execute.

The JavaScript code that runs in the browser doesn't interact with the computer's hardware; instead, it interacts with the JavaScript engine, which acts as an intermediary between the code and the underlying machine. Every browser has its own JS engine, but the most well known is Google's V8 engine. This V8 engine powers Google Chrome and also Node.js.

How the JavaScript Engine Works

A JS engine always contains a call stack and a heap. The call stack is where the code gets executed, with the help of execution contexts, and the heap is an unstructured memory pool that stores all the objects the application needs.

JS Engine Graph

Compilation vs Interpretation

In compilation, the entire code is converted into machine code at once and written in a binary file that can be executed by a computer.

Compilation

In interpretation, the interpreter runs through the source code and executes it line by line. The code still needs to get converted into machine code, but this time it is happening line by line while executing the program.

Interpretation

JS used to be a purely interpreted language, but the modern JS engine now uses a mix of compilation and interpretation which is known as just-in-time (JIT) compilation.

With compilation, the machine code is stored in a portable file. It can be executed at any time - there's no need to rush immediately after the compilation process. With JIT, the machine code needs to be executed as soon as the compilation ends.

JIT and JavaScript

Whenever a piece of JavaScript code enters the engine, the first step is to parse the code. The code is parsed into a data structure called the AST (Abstract Syntax Tree). This works by first splitting up each line of code into pieces that are meaningful to the language, and then saving all these pieces into the tree in a structured way. This step also checks for syntax errors. The resulting tree will be used to generate the machine code.

The next step is compilation. The engine takes the AST and compiles it into machine code. This machine code gets executed immediately because it's using JIT. To start execution as fast as possible, the modern JS engine first generates unoptimized machine code, then optimizes and recompiles that code in the background while the program is already running.

JIT in JavaScript

What is a JavaScript Runtime?

A JavaScript (JS) runtime is a comprehensive environment that enables the execution of JavaScript code. It consists of various components working together to facilitate the execution of JavaScript applications.

Asynchronous operations in JavaScript, such as handling user input or making network requests, utilize callback functions. These functions are placed in a queue known as the callback queue, awaiting execution. The callback queue ensures that asynchronous tasks are handled in an organized manner.

When the call stack is empty, callbacks waiting in the callback queue are moved onto the call stack so they can be executed, and this is handled by something called the event loop.
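This hand-off can be sketched directly: a callback scheduled with setTimeout() goes through the callback queue, so it runs only after the synchronous code has emptied the call stack.

```javascript
const order = [];

// Scheduled callback: the runtime places it in the callback queue.
setTimeout(() => order.push('callback queue'), 0);

// Synchronous code keeps the call stack busy and runs to completion first.
order.push('call stack');

// Once the stack is empty, the event loop moves the callback onto the stack.
process.on('exit', () => console.log(order)); // [ 'call stack', 'callback queue' ]
```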

Node.js


Security Best Practices

  • Denial of Service Attack
    • If too many requests are sent to the server or if bad requests (slowloris) are sent, the server could crash.
    • Mitigations:
      • Use a reverse proxy to receive and forward requests to the Node.js application. Reverse proxies can provide caching, load balancing, IP blacklisting, etc. which reduce the probability of DoS attack being effective.
      • Correctly configure the server timeouts, so that connections that are idle or where requests are arriving too slowly can be dropped.
      • Limit the number of open sockets per host and in total.
  • DNS Rebinding
    • This is an attack that can target Node.js applications being run with the debugging inspector enabled using the --inspect switch. Through DNS rebinding, an attacker can temporarily control the origin for their requests so that they seem to originate from a local IP address. See the DNS Rebinding wiki.
    • Mitigations:
      • Disable inspector on SIGUSR1 signal by attaching a process.on('SIGUSR1',...) listener to it.
      • Do not run the inspector protocol in production.
  • Exposure of Sensitive Information to an Unauthorized Actor
    • All the files in the current directory are pushed to the npm registry during the package publication
    • There are some mechanisms to control this behavior by defining a blocklist with .npmignore and .gitignore or by defining an allow list in the package.json
    • Mitigations:
      • Using npm publish --dry-run to list all the files to publish.
      • Create and maintain ignore files such as .gitignore and .npmignore.
  • HTTP Request Smuggling
    • This is an attack that involves two HTTP servers (usually a proxy and a Node.js application). A client sends an HTTP request that goes first through the front-end server (the proxy) and then is redirected to the back-end server (the application). When the front-end and back-end interpret ambiguous HTTP requests differently, there is potential for an attack to send a malicious message that won't be seen by the front-end but will be seen by the back-end, effectively smuggling it past the proxy server.
    • Mitigations:
      • Do not use the insecureHTTPParser option when creating a HTTP Server.
  • Information Exposure Through Timing Attacks
    • This is an attack that allows the attacker to learn potentially sensitive information by, for example, measuring how long it takes for the application to respond to a request. This attack is not specific to Node.js and can target almost all runtimes. The attack is possible whenever the application uses a secret in a timing-sensitive operation (e.g. branch).
    • Mitigations:
      • The crypto API exposes a function timingSafeEqual to compare the actual and expected sensitive values using a constant-time algorithm.
      • Try to avoid using secrets in variable-time operations. This includes branching on secrets, especially when the attacker could be co-located on the same infrastructure.
  • Malicious Third-Party Modules
  • Supply Chain Attacks
    • A supply chain attack on a Node.js application happens when one of its dependencies (either direct or transitive) is compromised. An attacker who takes control of an upstream package can publish a new version with malicious code in it.
  • Memory Access Violation
    • Memory-based or heap-based attacks depend on a combination of memory management errors and exploitable memory allocator. Like all runtimes, Node.js is vulnerable to these attacks if your projects run on a shared machine.
    • Mitigations:
      • Use --secure-heap=n depending on your application where n is the allocated maximum byte size.
      • Do not run your production app on a shared machine.
  • Monkey Patching
    • Monkey patching refers to the modification of properties at runtime, aiming to change the existing behavior.
    • The --frozen-intrinsics flag enables experimental frozen intrinsics, which means all the built-in JavaScript objects and functions are recursively frozen. However, you can still define new globals and replace existing globals using globalThis.
  • Prototype Pollution Attacks
    • This refers to the possibility of modifying or injecting properties into JavaScript language items by abusing the usage of __proto__, constructor, prototype and other properties inherited from built-in prototypes.

Asynchronous Work

At its core, JavaScript is designed to be non-blocking on the main thread, which is where views are rendered. When the main thread becomes blocked, it results in the infamous freezing that end users dread, and no other events can be dispatched, resulting in the loss of data acquisition, for example. This creates some unique constraints that only a functional style of programming can cure. This is where callbacks come into the picture.

Callbacks become complicated when there are multiple nested callbacks (callback hell). This is where functions come into great use. More complex operations are made up of many functions:

  1. initiator style / input
    1. This is the first function in the sequence - it accepts the original input.
  2. middleware
    1. Returns another function.
  3. terminator
    1. Invokes the callback.

The following demonstrates the flow of a network or file system request. Here the latency is 0 because all the values are available in memory.

function final(someInput, callback) {
  callback(`${someInput} and terminated by executing callback `);
}

function middleware(someInput, callback) {
  return final(`${someInput} touched by middleware `, callback);
}

function initiate() {
  const someInput = 'hello this is a function ';
  middleware(someInput, function (result) {
    console.log(result);
    // requires callback to `return` result
  });
}

initiate();

State management

Functions may or may not be state dependent. State dependency arises when the input or other variable of a function relies on an outside function. There are two primary strategies for state management:

  1. passing in variables directly to a function
  2. acquiring a variable value from a cache, session, file, database, network, or other outside source.

setTimeout instructs the CPU to store instructions elsewhere on the bus, and instructs that the data is scheduled for pickup at a later time. The same situation arises in dealing with file systems and network requests. The main thread simply cannot be blocked for an indeterminate period of time - therefore, we use callbacks to schedule the execution of code in time in a controlled manner.

You will be able to perform almost all operations with the following three patterns:

  1. In series: functions will be executed in strict sequential order
  2. Full parallel: when ordering is not an issue, such as emailing a list of 1,000,000 recipients
  3. Limited parallel: parallel with a limit, such as successfully emailing 1,000,000 recipients from a list of 10 million users
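The in-series pattern can be sketched with plain callbacks (the series helper below is illustrative, not a library function):

```javascript
// Run an array of async tasks strictly one after another.
function series(tasks, done) {
  const results = [];
  function next(i) {
    if (i === tasks.length) return done(results);
    tasks[i](result => {
      results.push(result);
      next(i + 1); // only start the next task once this one has finished
    });
  }
  next(0);
}

// Each task finishes on a later tick, but order is still preserved.
const tasks = [1, 2, 3].map(n => cb => setImmediate(() => cb(n)));
series(tasks, results => console.log(results)); // [ 1, 2, 3 ]
```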

Blocking vs Non-Blocking

Blocking is when the execution of additional JavaScript in the Node.js process must wait until a non-JavaScript operation completes. This happens because the event loop is unable to continue running JavaScript while a blocking operation is occurring.

Synchronous methods in the Node.js standard library that use libuv are the most commonly used blocking operations. All of the I/O methods in the Node.js standard library provide asynchronous versions, which are non-blocking and accept callback functions. Blocking methods execute synchronously and non-blocking methods execute asynchronously.

const fs = require('node:fs');

// synchronous
const data = fs.readFileSync('/file.md'); // blocks here until file is read

// asynchronous
fs.readFile('/file.md', (err, data) => {
  if (err) throw err;
});

JavaScript execution in Node.js is single threaded, so concurrency refers to the event loop's capacity to execute JavaScript callback functions after completing other work. Any code that is expected to run in a concurrent manner must allow the event loop to continue running as non-JavaScript operations, like I/O, are occurring.

Asynchronicity in Programming Languages

Computers are asynchronous by design. Asynchronous means that things can happen independently of the main program flow.

On current consumer computers, every program runs for a specific time slot and then stops its execution to let another program continue. This cycle runs so fast that it's impossible to notice. We think our computers run many programs simultaneously, but this is an illusion (except on multiprocessor machines).

Programs internally use interrupts, a signal that's emitted to the processor to gain the attention of the system.

JavaScript is synchronous by default and is single threaded. This means that code cannot create new threads and run in parallel. Lines of code are executed in series, one after another.

A callback is a simple function that's passed as a value to another function and will only be executed when the event happens. We can do this because JavaScript has first-class functions, which can be assigned to variables and passed around to other functions (called higher-order functions).
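For instance (the names here are illustrative), applyDiscount is a higher-order function, and the arrow function passed to it is the callback:

```javascript
// `applyDiscount` is a higher-order function: it receives another
// function (the callback) as a value and decides when to invoke it.
function applyDiscount(price, callback) {
  const discounted = price * 0.9;
  callback(discounted);
}

applyDiscount(100, result => console.log(`Final price: ${result}`)); // Final price: 90
```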

The Node.js Event Loop

The event loop is what allows Node.js to perform non-blocking I/O operations - despite the fact that a single JavaScript thread is used by default - by offloading operations to the system kernel whenever possible. Since most modern kernels are multi-threaded, they can handle multiple operations executing in the background. When one of these operations completes, the kernel tells Node.js so that the appropriate callback may be added to the poll queue to eventually be executed.

When Node.js starts, it initializes the event loop, processes the provided input script (which may make async API calls, schedule timers, or call process.nextTick()), and then begins processing the event loop, shown below.

Simplified Event Loop Order of Operations

Each phase (box shown above) has a FIFO queue of callbacks to execute. While each phase is special in its own way, generally, when the event loop enters a given phase, it will perform any operations specific to that phase, then execute callbacks in that phase's queue until the queue has been exhausted or the maximum number of callbacks has executed. When the queue has been exhausted or the callback limit is reached, the event loop will move to the next phase, and so on.

Since any of these operations may schedule more operations and new events processed in the poll phase are queued by the kernel, poll events can be queued while polling events are being processed.

  • timers: this phase executes callbacks scheduled by setTimeout() and setInterval()
  • pending callbacks: executes I/O callbacks deferred to the next loop iteration
  • idle, prepare: only used internally
  • poll: retrieve new I/O events; execute I/O related callbacks (almost all with the exception of close callbacks, the ones scheduled by timers, and setImmediate()); node will block here when appropriate
  • check: setImmediate() callbacks are invoked here
  • close callbacks: some close callbacks, e.g. socket.on('close',...)

The Node.js Event emitter

Node.js offers us the option to build a similar system to the browser using the events module. This module offers the EventEmitter class, which you can use to handle events.

The object exposes the on and emit methods:

  • emit: used to trigger an event
  • on: used to add a callback function that's going to be executed when the event is triggered

Other methods:

  • once(): add a one-time listener
  • removeListener() / off(): remove an event listener from an event
  • removeAllListeners(): remove all listeners for an event

Understanding process.nextTick()

One important part of the event loop is process.nextTick(). Every time the runtime calls back into JavaScript for an event, we call it a tick. When we pass a function to process.nextTick(), we instruct the engine to invoke this function immediately after the current operation completes, before moving to the next phase in the event loop.

process.nextTick(() => {
  // do something
});

It's the way we can tell the JS engine to process a function asynchronously (after the current function), but as soon as possible, rather than queuing it behind the rest of the event loop. Use nextTick() when you want to make sure that the code is already executed by the next event loop iteration.
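The ordering is observable: a nextTick callback runs before any timer scheduled at the same moment, because the nextTick queue is drained before the event loop moves on.

```javascript
const order = [];

setTimeout(() => order.push('timeout'), 0); // waits for the timers phase
process.nextTick(() => order.push('nextTick')); // runs right after current code
order.push('sync'); // synchronous code always finishes first

process.on('exit', () => console.log(order)); // [ 'sync', 'nextTick', 'timeout' ]
```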

Don't Block the Event Loop (or the Worker Pool)

Node.js offers a Worker Pool to handle expensive tasks like file I/O. Node.js scales well, sometimes better than more heavyweight approaches like Apache. The secret is that Node.js uses a small number of threads to handle many clients. If Node.js can make do with fewer threads, then it can spend more of your system's time and memory working on clients rather than on paying space and time overhead for threads (memory, context-switching).

Here's a good rule of thumb for keeping your Node.js server speedy: Node.js is fast when the work associated with each client at any given time is "small". Node.js uses a small number of threads to handle many clients. There are two types of threads: one Event Loop (the main loop, the main thread, event thread, etc.) and a pool of k workers in a Worker Pool.

If a thread is taking a long time to execute a callback (Event Loop) or a task (Worker), we call it blocked. While a thread is blocked or working on behalf of one client, it cannot handle requests from any other clients. This provides two motivations for blocking neither the Event Loop nor the Worker Pool:

  1. Performance: If you regularly perform heavyweight activity on either type of thread, the throughput (requests/second) of your server will suffer.
  2. Security: If it is possible that for certain input one of your threads might block, a malicious client could submit this evil input, make your threads block, and keep them from working on other clients.

The Event Loop executes JavaScript callbacks registered for events, and is also responsible for fulfilling non-blocking asynchronous requests like network I/O. The Worker Pool of Node.js is implemented in libuv, which exposes a general task submission API.

Because Node.js handles many clients with few threads, if a thread blocks handling one client's request, then pending client requests may not get a turn until the thread finishes its callback or task. The fair treatment of clients is thus the responsibility of your application. This means you shouldn't do too much work for any client in any single callback or task.

The Event Loop notices each new client connection and orchestrates the generation of a response. All incoming requests and outgoing responses pass through the Event Loop. This means that if the Event Loop spends too long at any point, all current and new clients will not get a turn.

You should make sure you never block the Event Loop. In other words, each of your JavaScript callbacks should complete quickly. This also applies to your await's, your Promise.then's, and so on. A good way to ensure this is to reason about the computational complexity of your callbacks.

However, for complex tasks you should consider bounding the input and rejecting inputs that are too long.
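A sketch of that idea - reject oversized input up front so the costly work never runs (the limit and the function below are made up for illustration):

```javascript
const MAX_INPUT_LENGTH = 10000; // arbitrary cap, chosen for illustration

function countWords(text, callback) {
  if (text.length > MAX_INPUT_LENGTH) {
    // Reject immediately: the work below never gets a chance to block
    // the event loop on attacker-sized input.
    return callback(new Error('input too long'));
  }
  callback(null, text.split(/\s+/).filter(Boolean).length);
}

countWords('do not block the event loop', (err, n) => console.log(n)); // 6
```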

Other ways to not block the event loop:

  • partitioning - for example, the asynchronous averaging below
  • offloading to the worker pool
function asyncAvg(n, avgCB) {
  // Save ongoing sum in JS closure.
  let sum = 0;
  function help(i, cb) {
    sum += i;
    if (i == n) {
      cb(sum);
      return;
    }

    // "Asynchronous recursion".
    // Schedule next operation asynchronously.
    setImmediate(help.bind(null, i + 1, cb));
  }

  // Start the helper, with CB to call avgCB.
  help(1, function (sum) {
    let avg = sum / n;
    avgCB(avg);
  });
}

const n = 10; // example input
asyncAvg(n, function (avg) {
  console.log('avg of 1-n: ' + avg);
});

In order to not block the Worker Pool, you should try to minimize the variation in the execution times of the tasks submitted to the Worker Pool.

Manipulating Files

You can get some stats about a file using fs.stat(). The following stats are available:

  • whether the file is a file or directory
  • whether the file is a symbolic link
  • the file size in bytes

You can extract information about a path in Node.js:

const path = require('node:path');

const notes = '/users/joe/notes.txt';

path.dirname(notes); // /users/joe
path.basename(notes); // notes.txt
path.extname(notes); // .txt
path.basename(notes, path.extname(notes)); // notes
// Get the absolute path
path.resolve('joe.txt'); // '/Users/joe/joe.txt' if run from my home folder

If the path starts with a /, it means that the path is an absolute path.

A file descriptor is a reference to an open file - the number (fd) returned when opening a file using the open() method offered by the fs module. The number (fd) uniquely identifies an open file in the operating system.

const fs = require('node:fs');

fs.open('/Users/joe/test.txt', 'r', (err, fd) => {
  // fd is our file descriptor
});

Command Line

REPL stands for Read-Eval-Print Loop, and it is a programming language environment that takes a single expression as user input and returns the result back to the console after execution. The REPL session provides a convenient way to quickly test JavaScript code.

Node.js provides the readline module to get input from the command line.

Bun


Bun is an all-in-one toolkit for JavaScript and TypeScript apps. It ships as a single executable called bun. At its core is the Bun runtime, a fast JavaScript runtime designed as a drop-in replacement for Node.js. It is written in Zig and powered by JavaScriptCore under the hood, drastically reducing startup times and memory usage.
$ bun run index.tsx # TS and JSX supported out of the box

The bun command-line tool also implements a test runner, script runner, and Node.js-compatible package manager, all significantly faster than previously existing tools, and usable in existing Node.js projects with little to no changes necessary.

$ bun run start # run the `start` script
$ bun install <pkg> # install a package
$ bun build ./index.tsx # bundle a project for browsers
$ bun test # run tests
$ bun cowsay 'Hello, world!' # execute a package

Bun is designed from the ground-up with today's JavaScript ecosystem in mind.

  • Speed: Bun processes start 4x faster than Node.js currently
  • TypeScript & JSX support: You can directly execute .jsx, .ts, .tsx files; Bun's transpiler converts these to vanilla JavaScript before execution
  • ESM & CommonJS compatibility
  • Web-standard APIs: Bun implements standard Web APIs like fetch, WebSocket, and ReadableStream
  • Node.js compatibility: In addition to supporting Node-style module resolution, Bun aims for full compatibility with built-in Node.js globals and modules.
Bun is more than a runtime. The long-term goal is to be a cohesive, infrastructural toolkit for building apps with JavaScript/TypeScript, including a package manager, transpiler, bundler, script runner, test runner, and more.

Installation

$ curl -fsSL https://bun.sh/install | bash # for macOS, Linux, and WSL
> powershell -c "irm bun.sh/install.ps1|iex" # for Windows
$ # Docker
$ docker pull oven/bun
$ docker run --rm --init --ulimit memlock=-1:-1 oven/bun

$ bun --version # Check the version
1.x.y
$ bun upgrade # upgrade bun

QuickStart

$ bun init
bun init helps you get started with a minimal project and tries to
guess sensible defaults. Press ^C anytime to quit.

package name (quickstart):
entry point (index.ts):

Done! A package.json file was saved in the current directory.
+ index.ts
+ .gitignore
+ tsconfig.json (for editor auto-complete)
+ README.md

To get started, run:
bun run index.ts
The generated index.ts can hold a simple HTTP server using Bun.serve:

const server = Bun.serve({
  port: 3000,
  fetch(req) {
    return new Response("Bun!");
  },
});

console.log(`Listening on http://localhost:${server.port} ...`);

$ bun index.ts
Listening on http://localhost:3000 ...

Bun can also execute "scripts" from your package.json:

{
  "name": "quickstart",
  "module": "index.ts",
  "type": "module",
  "scripts": {
    "start": "bun run index.ts"
  },
  "devDependencies": {
    "@types/bun": "^1.0.0"
  }
}

Run it with bun run start:

$ bun run start

Install a package:

$ bun add figlet
$ bun add -d @types/figlet # TypeScript users only

To install the TypeScript definitions for Bun's built-in APIs, install @types/bun

$ bun add -d @types/bun # dev dependency

At this point, you should be able to reference the Bun global in your TypeScript files without seeing errors in your editor.

{
  "compilerOptions": {
    // Enable latest features
    "lib": ["ESNext"],
    "target": "ESNext",
    "module": "ESNext",
    "moduleDetection": "force",
    "jsx": "react-jsx",
    "allowJs": true,

    // Bundler mode
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "verbatimModuleSyntax": true,
    "noEmit": true,

    // Best practices
    "strict": true,
    "skipLibCheck": true,
    "noFallthroughCasesInSwitch": true,

    // Some stricter flags
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "noPropertyAccessFromIndexSignature": true
  }
}


Deno


Deno is an open-source JavaScript, TypeScript, and WebAssembly runtime with secure defaults and a great developer experience. It features built-in dev tools, powerful platform APIs, and native support for TypeScript and JSX.

Getting Started

> irm https://deno.land/install.ps1 | iex # Windows
$ curl -fsSL https://deno.land/install.sh | sh # Linux
$ deno --version # Verify installation

Deno can run JavaScript and TypeScript with no additional tools or configuration required. Run a script with Deno:

$ deno main.ts

Init a Deno project:

$ deno init my_project

A deno.json file is created to configure your project, and two TypeScript files are created: main.ts and main_test.ts. The main.ts file is where you'll write your application code; on initial creation it will contain a simple program which adds two numbers together. The main_test.ts file is where you can write tests; initially it will contain a test for your addition program. Deno has a built-in test runner. You can write tests for your code and run them with the deno test command.

Install the Deno extension in VS Code to improve the developer experience.

Deno is a command line program. The Deno command line interface (CLI) can be used to run scripts, manage dependencies, and even compile your code into standalone executables. The Deno CLI has a number of subcommands (like run, init, and test). They are used to perform different tasks within the Deno runtime environment.

You can pass in additional script arguments after the command name (e.g., deno run main.ts arg1 arg2 arg3), and access them in the main script (Deno.args).

Fundamentals

TypeScript is a first-class language in Deno, just like JavaScript or WebAssembly. You can run or import TypeScript without installing anything more than the Deno CLI. With its built-in TypeScript compiler, Deno will compile your TypeScript code to JavaScript with no extra config needed.

Deno allows you to type-check your code with the deno check subcommand.

Modern Node.js projects will run in Deno with little to no changes required. However, there are some key differences between the two runtimes that you can take advantage of to make your code simpler and smaller when migrating Node projects to Deno.

Deno provides a compatibility layer that allows the use of Node.js built-in APIs within Deno programs. However, in order to use them, you will need to add the node: specifier to any import statement that uses them:

import * as os from "node:os";
console.log(os.cpus());

Deno has native support for importing npm packages by using npm: specifiers. For example:

import * as emoji from "npm:node-emoji";

console.log(emoji.emojify(`:sauropod: :heart: npm`));

Deno will automatically determine if a package is using CommonJS and make it work seamlessly when imported. If the file extension is .cjs, Deno will treat this module as CommonJS.

Deno is secure by default. Unless you specifically enable it, a program run with Deno has no access to sensitive APIs, such as file system access, network connectivity, or environment access. You must explicitly grant access to these resources with command line flags or with a runtime permission prompt. This is a major difference from Node, where dependencies are automatically granted full access to all system I/O, potentially introducing hidden vulnerabilities into your project.

You can configure Deno using a deno.json file. This file can be used to configure the TypeScript compiler, linter, formatter, and other Deno tools. The configuration file supports .json and .jsonc extensions.

Deno has a built in HTTP server API that allows you to write HTTP servers. The Deno.serve API supports HTTP/1.1 and HTTP/2.


Package Managers


A dependency is a third-party bit of software that was probably written by someone else and ideally solves a single problem for you. A web project can have any number of dependencies, ranging from none to many, and your dependencies might include sub-dependencies that you didn't explicitly install - your dependencies may have their own dependencies.

A bundle is a term that's generally used to refer to a single file on your web server that contains all the JavaScript code for your software - typically compressed as much as possible to help reduce the time it takes for your software to be downloaded and displayed in your visitors' browsers.

A package manager is something that makes sure dependency code is added and removed cleanly - it manages your project dependencies. The package manager will provide a method to install new dependencies (also referred to as packages) and offer capabilities for you to publish your own packages. Package managers also handle duplicate dependencies.

Dependencies can be installed globally or locally.

A package registry is a central place where a package is published and thus can be installed from. npm, as well as being a package manager, is also the name of the most commonly-used package registry for JavaScript packages.

NPM


As an npm user or organization member, you can create and publish packages that anyone can download and use in their own projects.

  • Unscoped public packages exist in the global public registry namespace and can be referenced in a package.json file with the package name alone: package-name
  • Scoped public packages belong to a user or organization and must be preceded by the user or organization when included as a dependency in a package.json file:
    • @username/package-name
    • @org-name/package-name



YARN


Yarn is an established open-source package manager used to manage dependencies in JavaScript projects. It assists with the process of installing, updating, configuring, and removing package dependencies, eventually helping you reach your objectives faster with fewer distractions.
Its areas of focus are speed, correctness, security, and developer experience, which we improve along every axis by leveraging a suite of innovative features such as workspaces, offline caching, parallel installs, hardened mode, interactive commands, and more.

Installation

The preferred way to manage Yarn is per-project and through Corepack, a tool shipped by default with Node.js. Modern releases of Yarn aren't meant to be installed globally, or from npm.

  1. Enable Corepack; this will add the yarn binary to your PATH.
$ corepack enable
$ yarn init -2 # Initialize a new project
$ # Updating Yarn
$ yarn set version stable
$ yarn install

Usage

  • Running yarn is enough to run install. It's an alias to yarn install
  • Adding or updating a dependency to a single package is done with yarn add
  • Upgrading a dependency across the whole project is done with yarn up
  • Your scripts are aliased. Calling yarn build is the same as yarn run build
  • Most registry-related commands are moved behind yarn npm

Migration

  1. Make sure you're using Node 18+
  2. Run corepack enable to activate Corepack
  3. Go into your project directory
  4. Run yarn set version berry
  5. Convert your .npmrc and .yarnrc files into .yarnrc.yml
    1. Need to change the format of these files; they should be written in YAML.
  6. Run yarn install to migrate the lockfile
  7. Commit all changes

Custom pre and post scripts are no longer supported. You should rewrite:

{
  "scripts": {
    "prestart": "do-something",
    "start": "http-server"
  }
}

Into:

{
  "scripts": {
    "prestart": "do-something",
    "start": "yarn prestart && http-server"
  }
}


PNPM


Fast, disk space efficient package manager.
  • Fast
    • pnpm is up to 2x faster than npm
  • Efficient
    • Files inside node_modules are cloned or hard linked from a single content-addressable storage
  • Supports monorepos
    • pnpm has built-in support for multiple packages in a repository
  • Strict
    • pnpm creates a non-flat node_modules by default, so code has no access to arbitrary packages

Introduction

Motivation

  • Saving Disk Space
    • When using npm, if you have 100 projects using a dependency, you will have 100 copies of that dependency saved on disk. With pnpm, the dependency will be stored in a content-addressable store
  • Boosting Installation Speed
    • pnpm performs installation in three stages:
      1. Dependency resolution. All required dependencies are identified and fetched to the store.
      2. Directory structure calculation. The node_modules directory structure is calculated based on the dependencies
      3. Linking Dependencies. All remaining dependencies are fetched and hard linked from the store to node_modules
    • This approach is significantly faster than the traditional three-stage installation process of resolving, fetching, and writing all dependencies to node_modules
  • Creating a non-flat node_modules directory
    • When installing dependencies with npm or Yarn Classic, all packages are hoisted to the root of the modules directory. As a result, source code has access to dependencies that are not added as dependencies to the project. By default, pnpm uses symlinks to add only the direct dependencies of the project into the root of the modules directory.

Installation

You need to have Node.js (at least v18) installed on the system.

> Invoke-WebRequest https://get.pnpm.io/install.ps1 -UseBasicParsing | Invoke-Expression # Windows
$ wget -qO- https://get.pnpm.io/install.sh | sh - # Linux OR
$ curl -fsSL https://get.pnpm.io/install.sh | sh -
$ npm install -g pnpm # Install with npm
$ choco install pnpm # Install with Choco

Usage

npm command → pnpm equivalent:

  • npm install → pnpm install
  • npm i <pkg> → pnpm add <pkg>
  • npm run <cmd> → pnpm <cmd>

