The Dragons of TypeScript Enums

June 2022

TypeScript supports enum types, but they have some subtle pitfalls to watch out for. This post dives into a few of the issues with TypeScript enums and proposes some alternatives.

Issue 1: Enum compilation generates code

TypeScript defines itself as "JavaScript with syntax for types". So usually JavaScript compiled from TypeScript looks just like the source TypeScript code with the types erased. As an example, take a look at this add function in TypeScript and the corresponding compiled JavaScript:

```ts
function add(x: number, y: number): number {
  return x + y;
}
```

```js
function add(x, y) {
  return x + y;
}
```

Enums violate this principle. For example, this enum looks very different after being compiled to JavaScript:

```ts
enum Direction {
  UP = 0,
  DOWN = 1,
  LEFT = 2,
  RIGHT = 3,
}
```

```js
var Direction;
(function (Direction) {
    Direction[Direction["UP"] = 0] = "UP";
    Direction[Direction["DOWN"] = 1] = "DOWN";
    Direction[Direction["LEFT"] = 2] = "LEFT";
    Direction[Direction["RIGHT"] = 3] = "RIGHT";
})(Direction || (Direction = {}));
```

This transformation is necessary because TypeScript enums aren't part of ECMAScript, so the compiler has to generate code that implements them at runtime. That makes it harder to reason about the runtime behavior of the TypeScript code and can complicate things if you ever need to debug the compiled JavaScript directly.
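One side effect of the generated object worth knowing about: numeric enums get a reverse mapping from values back to member names. A quick sketch of what the compiled object actually contains at runtime:

```js
console.log(Direction.UP); // 0    (name -> value)
console.log(Direction[0]); // 'UP' (value -> name, the reverse mapping)

// The object holds both directions of the mapping:
console.log(Object.keys(Direction));
// ['0', '1', '2', '3', 'UP', 'DOWN', 'LEFT', 'RIGHT']
```

This reverse mapping is part of why the emitted code can't just be a plain object literal.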

Issue 2: Numeric enums don't provide type safety

Any number is assignable to a numeric enum. We could declare a function that accepts a Direction parameter and call it like so:

```ts
function setDirection(direction: Direction): void {}

setDirection(Direction.LEFT);
setDirection(2);
setDirection(999);
```

The first two setDirection calls are valid (and equivalent), but the last call passes a number that isn't in the Direction enum. Surprisingly, this compiles to JavaScript just fine.

```js
function setDirection(direction) {}

setDirection(Direction.LEFT);
setDirection(2);
setDirection(999);
```

Numeric enums were intentionally designed this way to support bit flags: bitwise combinations of enum members can take on values that aren't explicitly specified in the enum. There's a good example of this use case here, and a minimal sketch follows below. Still, this behavior seems inconsistent and surprising to me; this is exactly the kind of thing I want the TypeScript compiler to catch.
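Here's that sketch of the bit-flags idea (the FileAccess enum below is my own illustration, using the TypeScript version current at the time of writing):

```ts
enum FileAccess {
  None  = 0,
  Read  = 1 << 0, // 1
  Write = 1 << 1, // 2
}

// 3 is not a declared member, but it's a meaningful combination of two
// members, so the type system has to allow numbers beyond the member list.
const readWrite: FileAccess = FileAccess.Read | FileAccess.Write;
```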

Issue 3: String enums are nominally typed

TypeScript is a structurally typed language, except for string enums, which are nominally typed. A hypothetical string decoding API might look something like this:

```ts
enum Charset {
  LATIN1 = 'latin1',
  UTF8 = 'utf8',
  UTF16 = 'utf16le',
}

function decode(bytes: Buffer, charset: Charset): string {
  // ...
  return '';
}
```

After the numeric enum example above, one might expect decode to accept any string for the charset argument, but the TypeScript compiler actually rejects every plain string. The enum must be referenced directly to call decode:

```ts
decode(Buffer.from([]), Charset.UTF8); // Compiles
decode(Buffer.from([]), 'utf8');       // Does not compile
```

But at runtime, Charset.UTF8 is really just the string 'utf8', so this works fine in plain JavaScript:

```js
decode(Buffer.from([]), 'utf8');
```

Nominally typed string enums are a weird inconsistency in the TypeScript language. This could be relevant if we were to publish a library with this interface. TypeScript and JavaScript users would interact with the API differently, which is not an ideal experience.
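To make the mismatch concrete, suppose we published this API as a package (the decoder-lib name below is hypothetical). TypeScript and JavaScript consumers would end up with different call sites:

```ts
import { decode, Charset } from 'decoder-lib'; // hypothetical package

// TypeScript consumers must go through the enum:
decode(Buffer.from([]), Charset.UTF8);
```

```js
const { decode } = require('decoder-lib'); // hypothetical package

// JavaScript consumers can pass the raw string:
decode(Buffer.from([]), 'utf8');
```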

Solution: ECMAScript-compatible alternatives exist!

As demonstrated above, TypeScript enums have some strange properties. Luckily there are other constructs we can consider that provide similar functionality with less of the weirdness.

The first is a union of string literal types. The string decoding API from above could be refactored to this:

```ts
type Charset = 'latin1' | 'utf8' | 'utf16le';

function decode(bytes: Buffer, charset: Charset): string {
  // ...
  return '';
}
```

Calling decode with a Charset string literal now works in TypeScript and JavaScript, and the TypeScript compiler will verify decode is only called with values in the Charset union.

```ts
// Valid in both TypeScript and JavaScript:
decode(Buffer.from([]), 'utf8');
```
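And unlike the numeric enum in Issue 2, out-of-range values are rejected at compile time. The compiler emits an error along these lines:

```ts
decode(Buffer.from([]), 'ascii');
// Error: Argument of type '"ascii"' is not assignable to parameter of type 'Charset'.
```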

One drawback to this approach is that if the value of one of the literals changes later (suppose latin1 changes to Latin-1), every place in the codebase referencing the old value has to be updated. The TypeScript documentation proposes an alternative that handles this well: an object with as const. Using this approach, the string decoding API would look like this:

```ts
const Charset = {
  LATIN1: 'latin1',
  UTF8: 'utf8',
  UTF16: 'utf16le',
} as const;

type Charset = typeof Charset[keyof typeof Charset];

function decode(bytes: Buffer, charset: Charset): string {
  // ...
  return '';
}
```
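The typeof Charset[keyof typeof Charset] line can look cryptic at first. Broken into steps (the Keys and Values names below are just for illustration), it resolves like this:

```ts
type Keys = keyof typeof Charset;   // 'LATIN1' | 'UTF8' | 'UTF16'
type Values = typeof Charset[Keys]; // 'latin1' | 'utf8' | 'utf16le'
```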

Now the Charset values can be referenced by their keys just like an enum. The two approaches to calling decode work in both TypeScript and JavaScript as well:

```ts
// Both calls are valid in TypeScript and JavaScript:
decode(Buffer.from([]), Charset.UTF8); // Compiles
decode(Buffer.from([]), 'utf8');       // Compiles
```

And finally, the compiled JavaScript looks pretty close to the TypeScript source, unlike what we got from the enum. Much better!

```js
const Charset = {
    LATIN1: 'latin1',
    UTF8: 'utf8',
    UTF16: 'utf16le',
};
```

Conclusion

TypeScript enums aren't ECMAScript-compatible and violate the TypeScript philosophy of being JavaScript with a type system on top. I do think it's totally possible to write production-ready TypeScript using enums, but they have some subtle pitfalls that are worth being aware of. Given that equivalent alternatives exist that are better aligned with ECMAScript standards, I imagine those will eventually emerge as the more idiomatic approach.