The Airiness of const & Object.freeze

In JavaScript, almost every value that seems solid can melt into air. Only the primitive types are truly immutable.

Even with the advent of const this has only partly changed. The const keyword makes the binding constant: the variable cannot be reassigned. But for composite types such as Object and Array, the value itself remains mutable.

Let's start with an example concerning a primitive type, String.

var someName = 'franz Kafka';
console.log(someName[0]); // 'f'

someName[0] = 'F'; // fails silently: strings are immutable
console.log(someName); // 'franz Kafka'

someName = someName.replace('f', 'F');
console.log(someName); // 'Franz Kafka'

If we had written…

const someName = 'franz Kafka';

…no reassignment of another value would be possible. We would have to create another variable,

const newSpelling = someName.replace('f', 'F');
console.log(newSpelling); // 'Franz Kafka'

…and thus the keyword const fulfils its purpose in this context.
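Unlike the string write above, reassigning a const binding does not fail silently; it throws a TypeError, in sloppy and strict mode alike. A quick sketch:

```javascript
const someName = 'franz Kafka';

let caught = null;
try {
  someName = 'Franz Kafka'; // reassignment of a const binding
} catch (e) {
  caught = e;
}

console.log(caught instanceof TypeError); // true
console.log(someName); // 'franz Kafka'
```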

However, the limits of const(ant) assignments in JavaScript become obvious when turning to Object and Array. And without the help of a library or your own functions, there is no real remedy.

The freeze and preventExtensions methods of Object are useful here. Unfortunately, both are 'shallow': they only work on the first level of an object, never on nested objects.
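The difference between the two is easy to demonstrate: preventExtensions only blocks adding new keys, while freeze also makes existing keys read-only. A minimal sketch (try/catch is used because strict mode throws where sloppy mode fails silently):

```javascript
const limited = { a: 1 };
Object.preventExtensions(limited);

limited.a = 2;                       // allowed: existing keys stay writable
try { limited.b = 3; } catch (e) {}  // adding a key fails (throws in strict mode)

console.log(limited.a);      // 2
console.log('b' in limited); // false

const frozen = { a: 1 };
Object.freeze(frozen);
try { frozen.a = 2; } catch (e) {}   // freeze also makes existing keys read-only

console.log(frozen.a); // 1
```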

Say you have a nested object:

const someObject = {
  a: { c: { d: 666 } },
  b: 666
};

// "{'a':{'c':{'d':666}},'b':666}"

Now this is possible, even though the assignment is declared with const.

someObject.b = 665;


If we use the freeze method of Object…

Object.freeze(someObject);
someObject.b = 664;


...the key 'b' is *not* assigned a new value. But...

someObject.a.c.d = 664;

// "{'a':{'c':{'d':664}},'b':665}"
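Object.isFrozen makes this shallowness directly visible:

```javascript
const nested = { a: { c: { d: 666 } }, b: 666 };
Object.freeze(nested);

console.log(Object.isFrozen(nested));     // true
console.log(Object.isFrozen(nested.a));   // false: the nested object was never frozen
console.log(Object.isFrozen(nested.a.c)); // false
```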

The key 'd' of 'c' (of key 'a') did change, and we have proven the limits of the freeze method. Nevertheless, it's quite simple to write your own deep-freeze method. You can either make a standalone function or (as I will do here) attach it to the Object prototype. In real life you should use Immutable.js or some other library, I guess.

Object.prototype.deepFreeze = function(obj = this) {
  return (function fn(objectToLoop) {
    for (let key in objectToLoop) {
      if (
        Array.isArray(objectToLoop[key]) ||
        typeof objectToLoop[key] == 'object'
      ) {
        fn(objectToLoop[key]);
      }
    }
    return Object.freeze(objectToLoop);
  })(obj);
};
This method uses a recursive strategy to iterate through the object, freezing each level with Object.freeze. Thus,

const someObject2 = {
  a: { c: { d: 666 } },
  b: 666
};
someObject2.deepFreeze();

someObject2.b = 665;
someObject2.a.c.d = 664;

// "{'a':{'c':{'d':666}},'b':666}"


If we want a more condensed formulation, we can refactor the code and write:

Object.prototype.deepFreeze = function(obj = this) {
  return (function fn(objectToLoop) {
    Object.keys(objectToLoop).forEach(
      key => Array.isArray(objectToLoop[key])
             || (typeof objectToLoop[key] == 'object'
                 && objectToLoop[key] !== null) // typeof null is 'object'
        ? fn(objectToLoop[key])
        : null
    );
    return Object.freeze(objectToLoop);
  })(obj);
};
Why is this the case? Why have a freeze method that's so limited? If we check the ECMAScript 5 specification for Object.freeze, there are no hints on this.

I for one actually think there is a consistency to this behaviour.

const arr = [[1, 2], [3, 4]];
console.log(arr.includes(2)); // false

console.log(arr[0].includes(2)); // true

We could make a method or function named deep-includes that would search every element (and its sub-elements). The same goes for deep-copy, deep-find and so on. But if this came out of the box, it would hide how the language works. Also, perhaps we don't always want to deep-search an array. What I mean is: this behaviour is a good thing. Let's not change it.
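For illustration, such a deep-includes could be sketched like this (deepIncludes is a hypothetical name, not a standard method):

```javascript
// Hypothetical helper (not a standard method): recursively search nested arrays.
function deepIncludes(arr, value) {
  return arr.some(el =>
    Array.isArray(el) ? deepIncludes(el, value) : el === value
  );
}

const matrix = [[1, 2], [3, 4]];
console.log(matrix.includes(2));      // false: the built-in includes is shallow
console.log(deepIncludes(matrix, 2)); // true
```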