A Non-Bullshit Attempt at the So-Called 'Digital Transformation'
As it happens, the hype industry has hijacked and bastardized the concept of “digitalization” or “digital transformation” until it means nothing anymore. We just can’t have nice things. This is a pity, because there is a real need for a true, honest discussion about how much the process of engineering complex things stems from a messy digital origin and how it then steps into the not-so-well-behaved physical world.
Up until recently, the only way to have extensive knowledge of a physical object was to be in close proximity to that object. The information about any physical object was relatively inseparable from the physical object itself. We could have superficial descriptions of that object, but at best they were limited in both extensiveness and fidelity.
Basic information about an object, such as its dimensions (height, width, and length), only became available in the late 1800s (!) with the creation of the standard meter1 and a consistent way to measure things. Before that, everyone had their own version of measurement definitions, which meant that interchangeable manufacturing was nearly impossible. It was only in the last half of the twentieth century that we could strip the information from a physical object and create a digital representation of it. This digital counterpart started off relatively sparse, as a CAD description, and has become richer and more robust over the years. While at first this digital “reflection” was merely descriptive, in recent years it has become increasingly behavioral. Behavioral means that the CAD object is no longer simply a three-dimensional object hanging in empty space, time-independent. We can now simulate physical forces on this object over time to determine its behavior. Where CAD models were static representations of form, simulations are dynamic representations, not only of form but also of function.
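To make the form-versus-function distinction concrete, here is a minimal sketch, assuming nothing more than a made-up bracket part and a crude spring-mass vibration model (all names and numbers are invented for illustration; no real CAD or FEA tool is involved). The first object only describes geometry; the second integrates a simple equation of motion over time.

```python
# Toy contrast between a static (descriptive) model and a behavioral one.
# Everything here is illustrative; a real CAD/FEA toolchain is far richer.
from dataclasses import dataclass

@dataclass
class StaticPart:
    # A CAD-like description: pure form, frozen in time.
    length_m: float
    width_m: float
    height_m: float

def simulate_vibration(mass_kg, stiffness_n_per_m, damping, x0_m, dt_s, steps):
    """Behavioral view: integrate a damped spring-mass model over time."""
    x, v = x0_m, 0.0
    history = []
    for _ in range(steps):
        a = (-stiffness_n_per_m * x - damping * v) / mass_kg  # Newton's second law
        v += a * dt_s          # semi-implicit Euler step
        x += v * dt_s
        history.append(x)
    return history

bracket = StaticPart(length_m=0.12, width_m=0.04, height_m=0.01)   # form only
response = simulate_vibration(mass_kg=0.3, stiffness_n_per_m=5e4,
                              damping=2.0, x0_m=0.001, dt_s=1e-4, steps=5000)
print(max(abs(x) for x in response))  # peak displacement: function, not just form
```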
One of the long-standing issues with only being able to work with a physical instance of an object was that experimenting on its behavior was both expensive and time-consuming. We first had to physically create the object as a one-off. We then had to create a physical environment in which the object was impacted by the elements, for instance mechanical vibration and temperature swings. This limited us to investigating the scenarios, and the associated load levels, that we thought were of concern. Often, the test scenarios would damage the object, increasing the expense. The first time we actually saw a condition not covered by a physical test would therefore be when the physical object was in actual use, which meant there were going to be unforeseen conditions or emergent behaviors resulting in failures that could cause harm and even death to users.

Aggregating these objects into systems compounds the problem. Systems are much more sophisticated than simple objects, and we need better ways to understand them. If the system contains computing elements, we also have to simulate the behavior of those computers and how they execute the code the system requires to achieve its expected behavior2. Here a problem appears: the fidelity of the simulators cannot be ignored. A CPU emulator that contains an error when emulating architecture X or Y will change the execution flow of the code, which may lead to totally different behaviors compared to the real thing.
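As a toy-scale illustration of that fidelity problem (none of this models a real ISA; the three-instruction machine and its off-by-one bug are entirely invented), the sketch below runs the same tiny program on a correct interpreter and on one whose comparison is subtly wrong. The control flow, and therefore the final state, diverges.

```python
def run(program, buggy=False):
    """Execute a made-up 3-instruction machine: ADD, JLT (jump-if-less-than), HALT."""
    acc, pc, trace = 0, 0, []
    while 0 <= pc < len(program):
        op, arg = program[pc]
        trace.append(pc)
        if op == "ADD":
            acc += arg
        elif op == "JLT":
            # The buggy emulator gets the comparison wrong by one.
            threshold = arg - 1 if buggy else arg
            if acc < threshold:
                pc = 0          # jump back to the start of the loop
                continue
        elif op == "HALT":
            break
        pc += 1
    return acc, trace

program = [("ADD", 3), ("JLT", 10), ("HALT", 0)]
print(run(program))              # correct emulator: acc ends at 12
print(run(program, buggy=True))  # buggy emulator: different branch taken, acc ends at 9
```

The point is not the specific bug: a divergence this small is already enough for the emulated behavior to stop matching the real hardware.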
The idea of a seemingly inevitable “Digital Transformation”, where we push most of our work as engineers into the digital domain, has made headlines recently, with incumbents preaching about the beauty of embracing “Digital Twins” and how they are very much doing it already3. The Digital Twin concept, by definition, means being able to design, evaluate, verify, and use/operate a virtual version of the system of interest during a large portion of the design lifecycle. In theory, with Digital Twins we could understand whether our designs are actually manufacturable, and determine their performance and modes of failure, all before the physical system is actually produced. It also downplays the role of physical prototyping, as the need to produce rough instances of the design is replaced by analyses done in the digital realm.
That’s the hype side. In reality, there is an ugly side to “Digital Twins” and the transformation they are supposed to bring. Far from the ideal of a cohesive digital replica that stock images of people touching screens with revolving 3D models, or wearing VR headsets, have tried to make us believe in, representing a complex system in the digital domain requires large amounts of unstructured information spread across disaggregated tools and software environments that will not easily talk to each other. This includes mechanical design data, thermal and fluid dynamics data, power systems, multiphysics simulation, finite element analysis, electromagnetics, application software, firmware, and programmable logic, just to name a few. As much as the Product Lifecycle Data4 tool makers have tried hard to sell us the idea that they can easily handle all of that, it remains largely untrue; PDM/PLM only bites off a small portion of the challenge, usually the portion without complex runtime behavior.
Don’t be deceived. The digitalization of engineering practice and product design is still in its infancy and looks more like an intention than a ‘transformation’. Today, it is just a dysfunctional collection of expensive tools that come with industrial amounts of vendor lock-in. Just like lifeboats on the Titanic or space tourism, the ‘Digital Transformation’ appears to be a rather bourgeois thing, only possible for those with deep pockets.
For us mortals, the “digital transformation” is still largely Excel.
The meter was originally defined in 1791, during the French Revolution, as one ten-millionth of the distance from the equator to the North Pole along the meridian passing through Paris. This definition was part of the metric system's attempt to base measurements on natural phenomena.
The current definition, established in 1983, is based on the speed of light. The meter is defined as the distance that light travels in a vacuum in 1/299,792,458 of a second. This definition effectively ties the meter to the fundamental constant of the speed of light, making it extremely precise and universally consistent.
Adapted from “Digital Twin: Mitigating Unpredictable, Undesirable Emergent Behavior in Complex Systems”, by Michael Grieves and John Vickers
Ironically, when those incumbents act as customers, they tend to ask for the most classic/physical lifecycle approach possible.
I have written about PDM before: https://managerseverywhere.substack.com/p/product-lifecycle-data-and-the-enterprise.