One of the most common phrases one hears in presentations and software demonstrations about the value of product life-cycle management (PLM) and product data management (PDM) software is that by combining multiple data sources, each representing a different aspect of a product, such as architecture and geometry, test information, and supply chain considerations, these tools can unify information to represent a "single version of the truth." But do PDM tools really represent a single version of the truth? Should they?
It is certainly correct to observe that many organizations suffer from information schizophrenia. Over the course of developing products, organizations generate and store volumes of information from a multitude of activities such as product design, modeling and simulation, testing, production, and field experience. This information comes from a variety of sources, both manual and instrument-generated, including CAD, MES, and any number of bespoke applications and spreadsheets.
So the idea of offering a single point to synthesize the different data and points of view makes sense. Thomas Aquinas claimed that "a judgment is said to be true when it conforms to the external reality." But is there really a single version of the truth, or is the truth, to paraphrase the third century BC Greek idiom, in the eye of the beholder?
In most organizations, product development is represented by a linear, forward-feeding flow of product information and decision making. This process emphasizes individual task performance and is optimized to meet the goals and constraints of each product life-cycle stage, often at the expense of downstream activities. For instance, a system designer may select a design and components best suited to meet the functional requirements of that system but may not be aware that these components are expensive and in short supply. Likewise, supply chain planners focus on identifying lower-cost suppliers but, lacking insight into design requirements and market forecast data, may select suppliers that are unable to meet quality and delivery expectations. Consequently, these two well-conceived decisions end up being at odds with each other, a fact that is not immediately apparent to the individual groups. In like manner, downstream activities such as supply chain and service planning are often treated as afterthoughts and commence late in the product life cycle, at which point the ability to influence already-made decisions is reduced.
In other words, each stakeholder's "truth" is the one that best represents that stakeholder's goals and constraints. Seldom do users have the tools to "see" downstream or upstream. All too often, lack of visibility into downstream activity, or of clarity about higher business-level goals, results in a decision that is optimal within one product group, location, or discipline but that jeopardizes the ability of a downstream group to meet its objectives. Almost by definition, serializing a set of highly optimized local decisions is likely to produce a suboptimal global decision.
Shouldn't we be able to reconcile conflicting information so decisions can be made that take all relevant "end to end" information into consideration?
Product data management is the keystone of the answer, yet it is not always sufficient. PDM software makes it easier to organize and access information from a single repository, but ease of access alone does not make information useful and impactful. PDM users need to understand the purpose and meaning of information as well as where and how to use it to derive the full context for high-fidelity decision making.
The real value of product data hinges on an active product knowledge source: one that is self-organizing, that reveals information, knowledge, and experience according to the user, task, and context, and that allows multiple "truths" to coexist, at least temporarily, in the form of each stakeholder's goals and constraints. Users need to interact, identify patterns and issues, draw conclusions, and then negotiate, harmonize, and optimize decisions such that the end result, whether a product, a process, or a decision, represents an optimal outcome, which, in turn, is incorporated into the PDM system for future use.
Mashing up data, regardless of the source and semantics, opens the door for whole life-cycle analytics and what-if analyses that are impractical if not impossible using disparate tools. Furthermore, it allows product companies to store and retrieve a higher level of knowledge. Borrowing from the study of semiotics, PDM systems should store not only items and attributes (akin to language syntax and semantics) but also complex relationships that encapsulate business processes and best practices that can be searched for and reused.
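To make this semiotics analogy concrete, consider a minimal sketch (a hypothetical data model, not any vendor's actual schema) of a store that holds not only items and their attributes but also typed relationships between them, so that encoded process knowledge can be searched for and reused:

```python
from collections import defaultdict

class ProductKnowledgeStore:
    """Toy store holding items (syntax), their attributes (semantics),
    and typed relationships that capture process knowledge."""

    def __init__(self):
        self.items = {}                     # item name -> attribute dict
        self.relations = defaultdict(list)  # relation type -> [(source, target)]

    def add_item(self, name, **attributes):
        self.items[name] = attributes

    def relate(self, relation, source, target):
        self.relations[relation].append((source, target))

    def query(self, relation):
        """Return all (source, target) pairs for a relationship type."""
        return self.relations.get(relation, [])

# Illustrative entries: part names, supplier, and test IDs are invented.
store = ProductKnowledgeStore()
store.add_item("bracket-17", material="aluminum", unit_cost=4.20)
store.add_item("supplier-A", lead_time_weeks=12)
store.relate("sourced_from", "bracket-17", "supplier-A")
store.relate("validated_by", "bracket-17", "vibration-test-042")

# A downstream planner can now discover which sourcing decisions
# are tied to which designs, instead of finding out after the fact.
print(store.query("sourced_from"))  # [('bracket-17', 'supplier-A')]
```

The point of the sketch is that the relationships themselves are first-class, queryable records, which is what lets a later project retrieve not just an item's attributes but the decisions and practices attached to it.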
If PDM is the system of record, then the PLM application is the "system of engagement." Vendors have sold this basic concept with "snaps together like Legos" marketing, but the key failing is that the syntax of PDM and the collaboration of PLM (that is, how people interact with the information, or pragmatics, to use the semiotics term) must be bridged by a much better understanding of meaning (semantics). Therefore, instead of insisting on storing and managing the "single version of the truth," PLM vendors should provide an open architecture and the building blocks to realize an effective collaborative PLM platform.