Custom functions for working with collections

A little bit of procrastination on my part, but thought I would share.

I’m not intending to write a full functional library (we have lodash and Ramda for that), but I am writing a few utility functions.

Part of the inspiration to do this came from playing with Map.groupBy, a useful function that returns a Map of grouped values.

If you then want to filter, flatMap or reduce that Map, it first needs to be converted to an Array. Call it OCD on my part, but I don’t like the spread syntax much and would much rather just pass the Map in directly.
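To make the conversion step concrete, here is a hand-built grouped Map of the shape Map.groupBy produces (the data and names are made up for illustration):

```javascript
// A grouped Map of the shape Map.groupBy returns, e.g.
// Map.groupBy([1, 2, 3, 4, 5], n => n % 2 ? 'odd' : 'even')
const byParity = new Map([
    ['odd', [1, 3, 5]],
    ['even', [2, 4]],
]);

// Array methods aren't available on a Map, so you spread it into
// an array of [key, value] pairs first...
const counts = [...byParity].map(([key, values]) => [key, values.length]);
// counts is [['odd', 3], ['even', 2]]
```

It is that intermediate `[...byParity]` step I would rather avoid.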

So I have just written these reduce functions:

// import { TYPES, toString } from './type.js';
const TYPES = {
    'arguments': '[object Arguments]',
    'array': '[object Array]',
    ...
}

/**
 * reduceArray - A function that reduces an array to a single value.
 * @param {Array} arr - The array to reduce.
 * @param {Function} reducer - The reducer function.
 * @param {any} accumulator - The initial value for the accumulator.
 * @returns {any} The final accumulator value.
 * @example
 * const sum = (a, b) => a + b;
 * reduceArray([1, 2, 3, 4], sum); // returns 10
 */
function reduceArray(arr, reducer, accumulator) {
    const length = arr.length;
    let i = 0;

    if (accumulator === undefined) {
        if (!length)
        throw new TypeError('Reduce of empty array with no initial value');
        accumulator = arr[i++];
    }

    while (i < length)
        accumulator = reducer(accumulator, arr[i++]);

    return accumulator;
}


/**
 * reduce - A function that reduces an iterable collection to a single value.
 * @param {Iterable} collection - The collection to reduce.
 * @param {Function} reducer - The reducer function.
 * @param {any} accumulator - The initial value for the accumulator.
 * @returns {any} The final accumulator value.
 * @example
 * const sumValues = (a, [key, val]) => a + val;
 * const map = new Map([['a', 1], ['b', 2], ['c', 3]]);
 * reduce(map, sumValues, 0); // returns 6
 */
function reduce(collection, reducer, accumulator) {
    if (toString.call(collection) === TYPES.array)
        return reduceArray(collection, reducer, accumulator);

    let iter = collection[Symbol.iterator]();

    if (accumulator === undefined) {
        const {done, value} = iter.next();

        if (done)
        throw new TypeError('Reduce of empty iterable with no initial value');
        accumulator = value;
    }

    for (const item of iter)
        accumulator = reducer(accumulator, item);

    return accumulator;
}

The reduce function will take any iterable and perform a reduce on it without converting the collection to an Array first. If it is an array it uses the faster reduceArray; if it isn’t, it calls Symbol.iterator on the collection to get an iterator, which I believe is what for...of does under the hood.
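As a sketch of what that protocol looks like driven by hand (the same mechanism for...of uses):

```javascript
const letters = new Set(['a', 'b', 'c']);

// Ask the collection for its iterator, then pull values out manually...
const iter = letters[Symbol.iterator]();
const collected = [];
let result = iter.next();
while (!result.done) {
    collected.push(result.value); // 'a', then 'b', then 'c'
    result = iter.next();
}
// ...which is equivalent to: for (const item of letters) collected.push(item);
```

Grabbing the iterator yourself is what lets reduce consume the first value as the accumulator before looping over the rest.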

The reason I do this for reduce is that I may want to grab the first item if the accumulator argument is undefined.

for and while loops are still king in JS it seems, being much more performant than the built-in higher order functions, e.g. map, filter etc.

I created my own performance test module to see how the reduce functions in vanilla JS, lodash and my custom library performed.

Setup
A large collection of random numbers

const randomArray = Array.from({length: 1000000}, () => Math.floor(Math.random() * 100));
const randomMap = new Map(randomArray.map((val, i) => [i, val]));

const sum = (a, b) => a + b;
const sumValues = (a, [key, val]) => a + val;

Tests

const reduceMaps = (
    timeTests('reduce performance tests with a Map collection')
        .test(
            'Vanilla JS reduce',
            () => {
                Array.from(randomMap).reduce(sumValues, 0);
            }
        )
        .test(
            '_.loDash reduce',
            () => {
                _.reduce(_.toArray(randomMap), sumValues, 0);
            }
        )
        .test(
            'Custom reduce',
            () => {
                reduce(randomMap, sumValues, 0);
            }
        )
);

const reduceArrays = (
    timeTests('reduce performance tests with an Array collection')
        .test(
            'Vanilla JS reduce',
            () => {
                randomArray.reduce(sum);
            }
        )
        .test(
            '_.loDash reduce',
            () => {
                _.reduce(randomArray, sum);
            }
        )
        .test(
            'Custom reduce',
            () => {
                reduce(randomArray, sum);
            }
        )
);

reduceMaps.run(1, 25); // 1 iteration, 25 samples
reduceArrays.run(1, 25); // 1 iteration, 25 samples

My test module isn’t the best, but these were my results

Reducing a Map

Reducing an Array

I also tested reducing the Map collection with measurethat.net

So that is it, as I say just thought I would share :slight_smile:

As an aside, I haven’t used TypeScript yet, but I am dipping a toe in with ESLint’s JSDoc plugin. Got to say I like it; being able to hover over a function and see its details is pretty useful.


Is there a reason not to implement them into the prototype for Map?

myMap.reduce(fx,init) would be more… familiar to those who use it for arrays?

So what is the advantage of using your code over just using:

let sum = 0;
for (const [key, val] of map)
    sum += val;

It’s not limited to Map; it will work with numerous iterables: Strings, Sets, Maps, NodeLists etc.

e.g. a NodeList, as per the other day.

const total = reduce(
    document.querySelectorAll('input[type="checkbox"]'), 
    (x, y) => x + (y.value * y.checked), 0
)

Maybe old fashioned of me, but I have always thought it was bad practice to add functions to the built-in prototypes. Should ES2026 introduce a reduce function to Map, then you will have conflicts.

The exception, I would say, is polyfills, where you test whether reduce is already implemented in the current browser/environment and, if not, provide your own that follows the latest ES implementation.
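A guarded version might look something like this. This is only a sketch: there is no spec for Map.prototype.reduce to follow, so the reducer receiving [key, value] entries is my assumption.

```javascript
// Sketch of a guarded prototype extension; it only installs if the
// environment does not already provide Map.prototype.reduce.
if (typeof Map.prototype.reduce !== 'function') {
    Object.defineProperty(Map.prototype, 'reduce', {
        value(reducer, accumulator) {
            let acc = accumulator;
            let seeded = arguments.length > 1; // was an initial value passed?
            for (const entry of this) {        // entry is a [key, value] pair
                if (!seeded) {
                    acc = entry;               // first entry seeds the accumulator
                    seeded = true;
                    continue;
                }
                acc = reducer(acc, entry);
            }
            if (!seeded)
                throw new TypeError('Reduce of empty map with no initial value');
            return acc;
        },
        writable: true,
        configurable: true, // non-enumerable by default, like built-ins
    });
}
```

The guard means a future native implementation would win, though the conflict risk remains if the eventual native signature differs.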

sum is just an example, it could be product.

const product = (x, y) => x * y

reduce is a tried and tested higher order function, which gives you the flexibility to plug in whatever callback you like.
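For example, with different callbacks plugged in (re-stating a minimal version of the iterable reduce from earlier so the snippet runs standalone):

```javascript
// Minimal re-statement of the thread's iterable reduce, for a standalone demo
function reduce(collection, reducer, accumulator) {
    const iter = collection[Symbol.iterator]();
    if (accumulator === undefined) {
        const { done, value } = iter.next();
        if (done)
            throw new TypeError('Reduce of empty iterable with no initial value');
        accumulator = value; // seed with the first item
    }
    for (const item of iter)
        accumulator = reducer(accumulator, item);
    return accumulator;
}

const product = (x, y) => x * y;
const max = (x, y) => (y > x ? y : x);

reduce([1, 2, 3, 4], product);                      // 24
reduce(new Set([3, 9, 5]), max);                    // 9
reduce('abc', (a, ch) => a + ch.toUpperCase(), ''); // 'ABC'
```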

That’s not to say your option isn’t valid, if all you need is a sum.

Just to add, the idea here is that I want to write a few useful functions for manipulating data e.g. JSON. That and I just quite enjoy this stuff :slight_smile:

I can understand that, it just feels backwards for me if I have to reduce my map by saying
reduce(new Map(randomArray.map((val, i) => [i, val])), (a, b) => a + b, 0).someOtherFunctionThatOperatesOnTheResult()

the idea of chained operations is you start at the beginning and go to the end, rather than starting the chain by calling a function, creating the thing that that function operates on, then telling it what to do in the function call from the beginning of this sentence, and then proceed with chaining as normal:

(new Map(randomArray.map((val, i) => [i, val]))).reduce((a, b) => a + b, 0).someOtherFunctionThatOperatesOnTheResult()

It is no different to doing
[].reduce.call(myArray, somefunction)

Essentially it is demethodized.

Look at PHP or Python, and their higher order functions operate in a similar manner.

Look at lodash and ramdaJS and likewise you call the function first
_.reduce(collection, function, initValue)

(Which is why I don’t use lodash or RamdaJS… it’s one of those things where I try to do it the “normal” way, and then constantly have to go back and say “oh right, I have to call it backwards with this library indicator first” and… eugh.)

Fair enough, but just to point out this won’t work.

(new Map(randomArray.map((val, i) => [i, val]))).reduce((a, b) => a + b, 0)

This would (Not sure if the braces are balanced :))

Array.from(new Map(randomArray.map((val, i) => [i, val]))).reduce((a, b) => a + b, 0)

From the tests I did, converting to an array first can be a bit of an expensive operation.
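A rough way to see that cost for yourself (console.time numbers will of course vary by machine and engine):

```javascript
// Build a large Map, then compare summing via an intermediate array
// against iterating the Map's entries directly.
const bigMap = new Map(Array.from({ length: 100_000 }, (_, i) => [i, 1]));

console.time('Array.from then reduce');
const viaArray = Array.from(bigMap).reduce((acc, [, val]) => acc + val, 0);
console.timeEnd('Array.from then reduce');

console.time('direct iteration');
let direct = 0;
for (const [, val] of bigMap) direct += val;
console.timeEnd('direct iteration');

// Both produce the same sum; the intermediate array is pure overhead.
```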

Valid point.

It would if reduce was in the prototype for Map. Which was the point :wink:

But to each their own.


I know that reduce is able to do that, but so is my loop :slight_smile: I have never found a really compelling use for reduce; it is hard to read and, to me, has no advantage at all.
Whether you put the callback in the reduce call or in my loop body, I see no difference.

To a certain extent I would agree with you, and funnily enough I do find myself leaning more towards for loops than higher order functions (simpler and easier to read). That said, it is still nice to have a few utility functions, and writing them is an interesting learning exercise as well.