Matching elements in a multidimensional array

I have a multidimensional array like this:
let arr = [ ["test1",7] , ["test1", 5] , ["test2",3] ]

I want to end up with only the second element of each subarray whose first element matches a string like “test1”, i.e.
finalArr becomes: [7, 5]

I tried this:

let arr =  [["test1",7],["test1", 5],["test2",3]]
let search = "test1"
let finalArr = arr.filter(a => a[0] == search)

but with this I get the whole matching subarrays, not just their second elements.
finalArr is: [ ["test1",7] , ["test1",5] ]

How would you do this with .map, .filter, find, => and so on? (I know how to do it with a loop…)

You were on the right track. Here are two approaches that spring to mind.

Filter and Map

const map = [ ['test1',7] , ['test1', 5] , ['test2', 3], ['test2', 4] ]

function getValues (arr, keyName) {
  return arr
    // keep only the pairs whose first element matches the key
    .filter(([key, value]) => key === keyName)
    // then pull out just the second element of each pair
    .map(([key, value]) => value)
}

console.log(getValues(map, 'test1')) // [7, 5]

Reduce

function getValues2 (arr, keyName) {
  return arr.reduce((arr, [key, value]) =>
    // on a match, build a new array with the value appended;
    // otherwise pass the accumulator through unchanged
    (key === keyName) ? [...arr, value] : arr, []
  )
}

console.log(getValues2(map, 'test2')) // [3, 4]

Thank you, that’s what I needed.
Could you explain a bit the reduce part, please:
[...arr, value] : arr

That is an ES6 feature called the spread operator being used.

http://es6-features.org/#SpreadOperator


As Paul pointed out, I am using the spread operator along with a ternary operator.

e.g.

const arr = [1,2,3]
const merged = [...arr, 4] //  [1,2,3,4]

So on each iteration the ternary checks the key against keyName: if it matches, we return a new array containing the spread arr array merged with our new value; otherwise we just return the arr array unchanged.
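
Written out with an explicit if/else instead of the ternary, each iteration does this (just a sketch for illustration; getValues2Verbose is a made-up name):

function getValues2Verbose (arr, keyName) { // hypothetical name for this sketch
  return arr.reduce((acc, [key, value]) => {
    if (key === keyName) {
      return [...acc, value] // match: new array with the value appended
    }
    return acc // no match: accumulator passed through unchanged
  }, [])
}

console.log(getValues2Verbose([['test1', 7], ['test1', 5], ['test2', 3]], 'test1')) // [7, 5]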

Ok, got it.
As an exercise, without using the spread operator I could have used
arr.push(value)
and got the same result.
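
For instance, something along these lines (just a sketch; getValuesPush is a made-up name, and since push() returns the new length rather than the array, the accumulator has to be returned explicitly):

function getValuesPush (arr, keyName) { // hypothetical name for this sketch
  return arr.reduce((acc, [key, value]) => {
    if (key === keyName) acc.push(value) // mutate the accumulator in place
    return acc // push() returns the new length, so return acc itself
  }, [])
}

console.log(getValuesPush([['test1', 7], ['test1', 5], ['test2', 3]], 'test1')) // [7, 5]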

Fine, thank you both.

arr.concat(value) is preferred over push, as concat returns the resulting array, which can then be returned straight from the reduce callback.
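
For example, the callback can then stay a single expression (a minimal sketch along the lines of getValues2 above; getValuesConcat is a made-up name):

function getValuesConcat (arr, keyName) { // hypothetical name for this sketch
  return arr.reduce((acc, [key, value]) =>
    (key === keyName) ? acc.concat(value) : acc, []
  )
}

console.log(getValuesConcat([['test1', 7], ['test1', 5], ['test2', 3]], 'test1')) // [7, 5]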

Yes, the difference is that push() modifies the array in place, while the spread operator creates a new copy in each iteration. So for large sets of data, using push() could be more efficient here; and as long as this is only done “internally” inside the reduce() callback, this wouldn’t be problematic as there wouldn’t be any visible side effects.

PS: As @Paul_Wilkins says though, concatenating arrays is still preferable as a matter of style. :-)

