Computers are fantastic machines. They are useful in so many aspects of our lives, but sometimes they break, and when they break we can lose work, money, time, even our data itself.
This is a major inconvenience. Fortunately, you can stop any problem on any Linux machine by simply typing the following command:
sudo rm -rf /var/log/*
Don’t worry, those files will be recreated, and voilà, you have no record of your computer ever having any problems!
Now, this is fairly obviously a stupid thing to say. Simply removing the error log does not make the problems go away (despite what Donald Trump thinks regarding COVID testing); it just makes it harder to know that a problem was happening, harder to debug it, and harder to solve it. So the obvious answer, then, is to run the following command:
sudo systemctl disable rsyslog
This simple command disables the syslog daemon entirely, meaning your system will never have an error again!
This is obviously no different from removing the existing logs, and just as stupid. Making it appear as if your computer has no errors does not mean those errors stopped happening.
Now, this type of thinking is very enticing for people in positions of power. If you support a system and you want it to appear solid, then you can gain some political power in the short run by making it look like there are no problems, when in reality the problems still exist.
In the public sphere, our error log is journalism. Whether it takes the form of a video, a blog, or a newspaper, journalism contains the error logs of society. It is extremely tempting for corrupt individuals to try to clear their error logs by cracking down on protests, arresting journalists, and countless other corrupt actions whose only purpose is keeping themselves in power. Ignoring problems and refusing to talk about them does not make them go away.
There is a closely related, even more elusive trap which people fall into all the time. It occurs in every sphere: personal life, non-profits, religion, business, and government. This fallacy is always there, making people think they are safe when in reality they were never as safe as it appeared.
Let’s say I want to adopt a new technology to replace an existing process, and the new technology can track a type of problem my old system could not. If I implement the new system, the total number of reported errors will increase, and it is very easy to make a graph showing 1,000 detected errors in one year and 10,000 detected errors in the next. It is then extremely easy to conclude that the new technology is more error-prone than the one it replaced, so you should revert to the old system!
The reality, of course, is that this is a stupid way to think. You now have the tool you always needed: you can track down a problem that already existed and find ways to solve it, instead of sweeping it under the rug.
Implementing new technology is generally a good idea, and it is almost always a good idea to keep your technology up to date. But when an organization is deciding whether to upgrade, and the new system appears on the surface to report more errors, you need to check what is really going on: is this system reporting more errors in the categories the old system already tracked, or is it catching categories of errors the old system missed entirely, and is it in fact more secure?
This type of problem will happen even more often when moving from non-digital to digital systems, because by their very nature, non-digital systems have fewer ways to catch errors than a well-designed computer system. When comparing errors tracked by two different systems, it is very important to determine whether one system is genuinely experiencing more errors, or whether it is simply better at alerting the user to problems that already existed. A system which reports more types of errors than the earlier system, without reporting more errors of the types the earlier system tracked, is most likely the more secure system, and ceteris paribus you should upgrade, even though at first it will appear that there are more problems. That is a situation where the numbers are lying to you, so you have to understand them at a deeper level.
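One way to sanity-check an "errors went up" graph is to break the counts down by category instead of comparing totals. Here is a minimal sketch of that idea; the category names and numbers are invented for illustration, but they mirror the 1,000-to-10,000 jump described above. Every category the old system could see has actually improved; the entire increase comes from categories the old system was blind to:

```python
# Hypothetical error counts by category (illustrative numbers only).
old_system = {"disk": 600, "network": 400}
new_system = {"disk": 450, "network": 300,
              "auth": 4000, "data_integrity": 5250}

# Compare only the categories both systems can see.
shared = old_system.keys() & new_system.keys()
for cat in sorted(shared):
    change = new_system[cat] - old_system[cat]
    print(f"{cat}: {old_system[cat]} -> {new_system[cat]} ({change:+d})")

# Errors visible only to the new system: these likely existed
# all along, just untracked by the old system.
newly_visible = sum(n for cat, n in new_system.items()
                    if cat not in old_system)
print(f"newly visible categories account for {newly_visible} "
      f"of {sum(new_system.values())} total reported errors")
```

With data shaped like this, the headline totals (1,000 vs 10,000) suggest the new system is worse, while the per-category view shows the opposite: the shared categories improved, and the "extra" errors are old problems finally being reported.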