var arreglo = [3, 4, 5, 6, 7];
var suma = 0;
for (var i = 0; i < arreglo.length; i++) {
    suma = suma + arreglo[i];
}
console.log(suma);
If we look at the possible logic of the operation, the result should be 18: according to the condition given to i, i < arreglo.length, the elements 3 through 6 should be chosen, but not 7, because 7 would violate the condition. But surprise, it gives 25, and I don't know why, since that seems to violate the condition given to i. Please help me understand this.
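For anyone following along, here is a minimal sketch of what the loop actually does; it reuses the same arreglo and suma and just logs each pass, showing that i walks over positions 0 through 4 rather than over the values in the array:

var arreglo = [3, 4, 5, 6, 7];
var suma = 0;
for (var i = 0; i < arreglo.length; i++) {
    // i is a position (0 through 4), not a value taken from the array
    console.log('i =', i, 'arreglo[i] =', arreglo[i]);
    suma = suma + arreglo[i];
}
console.log(suma); // 3 + 4 + 5 + 6 + 7 = 25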
I got the logic now. Thanks. Before continuing: positions in arrays are selected through index numbers, so why did they decide to make length count differently, starting from 1 instead of starting from 0 like the indexes?
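A small sketch that makes the difference visible: length counts how many items there are, while indexes label positions starting at 0, so the last valid index is always length - 1:

var arreglo = [3, 4, 5, 6, 7];
console.log(arreglo.length);              // 5 -> how many elements there are
console.log(arreglo[0]);                  // 3 -> the first element sits at index 0
console.log(arreglo[arreglo.length - 1]); // 7 -> the last element sits at index 4
console.log(arreglo[5]);                  // undefined -> index 5 is past the end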
I thought from the very start that i, as an index, couldn't get higher than the arreglo.length condition. But the thing here is that i isn't an index; it's just another variable that can be used as an index, not inherently one. The language doesn't care whether it represents an index, a length, or anything else, as long as it's a number. The condition is only saying that i must be lower than 5, not as an index, not as anything special, because it is a normal variable.
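To illustrate that point, here is a sketch where the same loop variable is never used as an index at all; the condition i < arreglo.length still just compares two plain numbers:

var arreglo = [3, 4, 5, 6, 7];
for (var i = 0; i < arreglo.length; i++) {
    // i is only a number here; nothing forces it to be used as an index
    console.log('pass number', i + 1);
}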