2021 in 21 aphorisms
An aphorism is a concise expression of an observation, a general truth, or a principle. I collected these throughout this exceedingly strange year, which is now, finally, going away. Some of them might sound a bit like over-generalizations. One thing is for sure: they are opinionated, so read with proper care.
A plan eventually, and inevitably, detaches from reality. When that happens, there are two kinds of people: those who alter the plan to reconnect it to reality, and those who choose to alter reality to connect it to the plan (i.e., those who choose to live a lie).
No matter how complex a system is, all its requirements boil down to basically one: make it cheaper.
Most risk assessments assume normal distributions all over the place. That is, they assume the variance is known when, in most cases, it is not.
All the processes you put in place are continuously and superlinearly rotting while you sleep.
The smartest organizations produce things you cannot touch.
Expectations are always higher than results. Even if we set our expectations reasonably low, the outcomes adjust to keep the level of disappointment constant.
Many Complex Systems that work are miracles worth calling the Vatican about. Corollary: Complex Systems always contain a non-zero number of faults and flaws, hence they always run in degraded mode.
Everyone preaches about engineering standards and practices as long as the schedule permits. When shit hits the fan, standards and good practices are the first things tossed out the window.
When everyone is a manager, no one is a manager.
There are excellent engineers who are terrible managers, and excellent managers who are terrible engineers.
There is nothing as dangerous as simultaneously developing two systems that are meant to interact with each other.
If you can’t explain the failure, blame software.
Engineers will ask once if they don’t understand. They might ask a second time if they still don’t understand, but they will just go with whatever they understood instead of asking a third time.
There’s a strange type of engineer who looks misleadingly competent and sounds very knowledgeable, but doesn’t have a single flying clue about what’s going on.
Any hardware issue that can be patched in software will be patched in software; from a system perspective, there’s no issue anymore. But the hardware folks will still think there’s an issue and fix it in later versions, which makes the system fail again, requiring a new software patch and opening the doors of hell for future configuration control.
The more complicated, convoluted, acronym-infested the title, the more irrelevant the person is in the organization.
A quiet guy sitting inconspicuously in a corner is the one keeping the whole project together, if not the whole organization.
The perfect manager is virtually invisible, yet highly noticeable.
You cannot measure an engineer’s productivity in any meaningful way because you are not sure what productivity is in the first place.
An incompetent native English speaker will last longer in an organization than an incompetent non-native English speaker.
Leadership skills and effectiveness do not scale linearly or exponentially with headcount, but logarithmically.
Bonus:
Those with bullshit jobs are destined to continuously overemphasize the importance of what they do, for survival reasons.
System operators’ competence and trustworthiness are typically underestimated, so a variety of automations and protections are piled up which ultimately prevent the operators from saving the system should a perfect storm take place.
The engineer who has just left is the sole reason behind all the problems there are and ever will be.