IMHO, 0000-00-00 00:00:00 is a better choice than NULL. This is because NULL is not only the same as no value; it can also be interpreted as an unknown value. So no value plus 2 is 2, but an unknown value plus 2 is still an unknown value. It’s just good practice to use 0000-00-00 00:00:00.
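A quick sketch of that three-valued arithmetic, using SQLite through Python's sqlite3 module as a stand-in (MySQL propagates NULL through arithmetic the same way):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 0 plus 2 is 2, but NULL ("unknown value") plus 2 stays NULL,
# which Python's sqlite3 reports as None
cur.execute("SELECT 0 + 2, NULL + 2")
print(cur.fetchone())  # (2, None)
```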
If you store NULL, can certain date/time functions work incorrectly?
Will the same advice apply to INT fields? I mean, is it better to use NULL than a 0?
I googled “null vs dummy values” and “null vs default values” and came across some interesting posts. People tend to disagree a little, though, so maybe the choice should be based on a specific need rather than a rule of thumb; then again, you don’t always know all the variables and what could go wrong.
Suppose you have a list of students and the scores they earned on a test.
Here are three marks: 50, 60, 70.
In addition to those students, one student didn’t write the test because he was away. Another one wrote the test but got every answer wrong.
In your scenario you’d enter 0 for both of them. One of those is inaccurate. Also, how would you tell them apart (i.e. which was an actual score and which a dummy value)?
Now try to average the test scores. If you enter NULL for the student who didn’t write the test, the aggregate skips it, so you’d average the four real scores (0, 50, 60, 70) and come up with 45, which would be correct. If you used a 0 for the “missing” test result, you’d average five values (0, 0, 50, 60, 70) and come up with a class average of 36, which would not be correct.
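The averaging above can be checked directly. A minimal sketch using SQLite via Python's sqlite3 (MySQL's AVG() ignores NULLs the same way; the table and student names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE scores (student TEXT, score INTEGER)")

# NULL for the absent student, a real 0 for the one who got
# every answer wrong, plus the three marks 50, 60, 70
rows = [("absent", None), ("all_wrong", 0),
        ("a", 50), ("b", 60), ("c", 70)]
cur.executemany("INSERT INTO scores VALUES (?, ?)", rows)

# AVG() skips the NULL row: (0 + 50 + 60 + 70) / 4 = 45
print(cur.execute("SELECT AVG(score) FROM scores").fetchone()[0])  # 45.0

# Replacing the NULL with a dummy 0 drags the average down:
# (0 + 0 + 50 + 60 + 70) / 5 = 36
cur.execute("UPDATE scores SET score = 0 WHERE score IS NULL")
print(cur.execute("SELECT AVG(score) FROM scores").fetchone()[0])  # 36.0
```

The NULL row is also what lets you tell the two students apart afterwards: `WHERE score IS NULL` finds the absent student, while `WHERE score = 0` finds the one who actually scored zero.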