Pantglas2

Yet again, I will explain what the millennium bug was all about. I will keep it to simple stuff - because it wasn't very hard to understand. Well, I understood it, so it can't have been.
How many bits is your computing device? Is it a 64-bit one? Well, when computing was in its infancy, way back in the '70s and '80s when the BBC Micro was the latest thing, you only got 8-bit machines. So programming could only be done for 8-bit machines, and that was true for a long time.
What on earth is a bit, I hear you ask? A bit is a single binary digit - a 0 or a 1 - the smallest chunk of information that can be sent from one part of the machine to another in one go. 8 bits together made an 8-digit binary number, which was interpreted and acted upon by a more complicated set of instructions. For instance, the 8 possible colours each had their own number. With 64 bits modern machines can understand vast numbers of colours, but with 8 you only got cyan, magenta, yellow, blue, green, red, black and white. Other groupings meant other things.
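Those eight colours are no accident - they are exactly what you get from one bit each for red, green and blue. Here is a little sketch in Python (my own modern illustration, not code from any of those old machines) showing how a 3-bit code spells out the eight colours:

```python
# One bit each for red, green and blue gives 2**3 = 8 colours.
NAMES = ["black", "red", "green", "yellow", "blue", "magenta", "cyan", "white"]

for code in range(8):
    r = code & 1          # lowest bit: red on/off
    g = (code >> 1) & 1   # middle bit: green on/off
    b = (code >> 2) & 1   # highest bit: blue on/off
    print(f"{code:03b} -> red={r} green={g} blue={b} -> {NAMES[code]}")
# e.g. code 3 is binary 011: red + green mixed together make yellow
```

Mixing the "on" bits gives the in-between colours: red plus green is yellow, red plus blue is magenta, and all three together are white.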
After receiving 8 bits, the machine assumed that the next 8 bits were a new number with a new meaning. 8 bits can make 256 different combinations - the numbers 0 to 255 - so there could only be 256 different instructions to understand and carry out.
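To see where that 256 comes from: each extra bit doubles the number of patterns, so 8 bits give 2 to the power 8 of them. A quick Python sketch of the arithmetic (an illustration, nothing more):

```python
# Each added bit doubles the number of patterns a group of bits can hold.
for bits in (1, 2, 3, 8, 16):
    count = 2 ** bits
    print(f"{bits:2d} bits -> {count} possible values, largest number {count - 1}")

# And an 8-bit counter wraps around when pushed past its largest value:
value = 255
value = (value + 1) % 256  # one more than 255 rolls back over to 0
print(value)
```

That wrap-around at the top of the range is the same kind of trap, in miniature, as the one the date programmers walked into.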
8 bits didn't have enough variations to hold ALL the years from the birth of Jesus to 2000 - a single byte only counts up to 255 - so programmers used only the last two digits of the year, assuming that their software wouldn't still be in use in 2000. Those "boffins" didn't expect their programmes to be immortal. They were neither stupid nor arrogant.
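The trouble with two-digit years is what happens when you do sums with them across the century boundary. A hypothetical sketch in Python for readability (the real systems were written in older languages, and the function name here is made up for illustration):

```python
def age_from_two_digit_years(birth_yy, current_yy):
    # Naive subtraction, just as old two-digit date code did it.
    return current_yy - birth_yy

# Within the same century this works fine:
print(age_from_two_digit_years(70, 99))  # born '70, year '99 -> 29

# But at the millennium, year "00" looks EARLIER than "70":
print(age_from_two_digit_years(70, 0))   # born '70, year '00 -> -70
```

A negative age, a negative interest period, a negative elapsed time - whatever the two digits were being used for, the sum suddenly came out wrong.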
Then computers caught on, everyone wanted one, and utilities and banks and so on used programmes to carry out their businesses - in 8 bits. Other makers designed machines that would use 16 bits - wow - and then 32, and now 64, and so it goes on.
BUT the original programmes still worked, with a bit of tinkering and some additions, and the users didn't always get them completely rewritten. So by 1999 there were still bits of programmes in use here and there that only used the last two digits of the year - not only when referring to dates, but also in internal checks on whether the computer was properly tuned and running at exactly the right speed.
Programmes are many thousands of lines long, and such a reference could be hidden deep in a long sequence, so it could have come into use without warning - and the computer in question would then fail its timing check and switch itself off. That could have been disastrous.