I don't quite understand what "until the program finishes what it is doing" or "the program is in the middle of doing something else" means. Is it a statement that needs to finish first? A function that is currently running that needs to finish executing before the program can move on to the scheduled code (whose timer already reached zero a while ago)?
In reply I got an answer from the user "felgall", who tried to explain it to me, but his answer confused me even more. Can someone please explain his answer to me?
A computer processor can only execute a fixed number of instructions at a time. For most computers in the past that was one instruction at a time; now it is common to find processors that can execute two or sometimes even more at once.
Basically, when the computer is performing tasks it has an input queue of instructions waiting to be processed, and as it finishes processing each instruction it gets the next one off the queue. If you tell the computer to run something right now, that instruction gets added to the end of the queue and has to wait its turn behind anything already there. When a timeout timer finishes counting down to zero, it adds the code it has been told to run onto the end of the queue, no differently from any other code that is supposed to run right now, and the commands get run when the processor reaches them.
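You can see that queueing behaviour directly in JavaScript. A minimal sketch (the `order` array and its messages are my own, just for illustration): even a zero-delay timeout only joins the queue, so the synchronous code always finishes first.

```javascript
// Even a zero-delay timeout has to wait its turn on the queue:
// the callback below is queued immediately, but it only runs after
// all of the synchronous code here has finished.
const order = [];

setTimeout(() => {
  order.push('timeout callback');
}, 0); // the timer hits zero right away, yet the callback just joins the queue

// Synchronous work that is already "ahead" of the callback.
for (let i = 1; i <= 3; i++) {
  order.push('sync step ' + i);
}
order.push('sync code finished');

console.log(order.join(' -> '));
// prints: sync step 1 -> sync step 2 -> sync step 3 -> sync code finished
// "timeout callback" is appended only after this script runs to completion.
```

That is the "wait its turn" felgall describes: the countdown reaching zero and the code actually running are two separate events.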
One thing you need to remember is that ALL of the programs running on your computer are adding commands onto the queue all the time, so how soon something actually runs after you tell it to run depends on how much is already on the queue at that moment.
The more you have running at the time, the longer something that is supposed to run right now has to wait before it actually does. There would have to be a huge number of things going on before the delay becomes long enough for you to notice.
The only exception to new tasks being added to the end of the queue is a special category called interrupts. Those are generally sent by separate devices when something needs to be copied into memory right now or it will be lost, so the instruction to do the copy gets to jump the queue. You may see a list of the devices in your computer that are connected to interrupts when you first turn the computer on.
Where this delay can make a difference is when you are trying to time a process using timeouts whose code starts another countdown to run the same thing again. Then you have a countdown to zero, followed by time on the queue waiting to run, followed by another countdown to zero, and so on, so each cycle takes slightly longer than the countdown alone would indicate. One way around that is to use setInterval instead of setTimeout, because setInterval never stops counting down: it restarts automatically as soon as it reaches zero. So if you were going to use one of these commands to drive an analog clock, setTimeout would result in the clock gradually falling further behind in when it is supposed to move the hands, whereas setInterval would stay roughly the same small amount behind the whole time.
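The drift can be sketched in JavaScript. The function names (`tickWithTimeout`, `tickOnSchedule`) and the millisecond values below are illustrative, not from felgall's post; the second function manually recomputes each delay against a fixed target, which is similar in spirit to what setInterval gives you automatically.

```javascript
// Chained setTimeout: each cycle = countdown + time spent waiting on the
// queue, so the error accumulates and the "clock" falls further behind.
function tickWithTimeout(ticksLeft, intervalMs, done, startedAt = Date.now()) {
  if (ticksLeft === 0) return done(Date.now() - startedAt);
  setTimeout(
    () => tickWithTimeout(ticksLeft - 1, intervalMs, done, startedAt),
    intervalMs // the next countdown starts only after this callback runs
  );
}

// Drift-resistant alternative: schedule every tick against a fixed target
// time, so queueing delays are absorbed instead of accumulated -- roughly
// what setInterval does for you, since it keeps counting regardless of
// when the previous callback actually got to run.
function tickOnSchedule(ticks, intervalMs, done,
                        startedAt = Date.now(), n = 0) {
  if (n === ticks) return done(Date.now() - startedAt);
  const target = startedAt + (n + 1) * intervalMs;
  setTimeout(
    () => tickOnSchedule(ticks, intervalMs, done, startedAt, n + 1),
    Math.max(0, target - Date.now()) // shrink the delay to stay on schedule
  );
}

// Example: 5 ticks of 20 ms each. The chained version takes at least
// 100 ms plus whatever queueing delay each cycle added on top.
tickWithTimeout(5, 20, elapsed =>
  console.log('chained setTimeout took ' + elapsed + ' ms'));
tickOnSchedule(5, 20, elapsed =>
  console.log('fixed-schedule version took ' + elapsed + ' ms'));
```

With the chained version, a tick that ran 3 ms late pushes every later tick 3 ms late as well; with the fixed-schedule version (or setInterval), that lateness never leaks into the next countdown.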