PLM Implementation Time Machine
The original article was published on my Beyond PLM blog.
It's so difficult to see what is going on when you're in the middle of something. It's only with hindsight that we can see things clearly. In my blog How to afford the next PLM upgrade, I raised a question about the complexity of PLM system upgrades and asked what PLM architecture can make them easier in the future. Failing to support upgrades will soon become a non-starter for any PLM architecture, and I think vendors and customers are realizing it. I outlined the implications of this change for both vendors and customers. Read the article and form your own opinion.
I received a number of interesting comments. Thanks to everybody who contributed to this discussion, whether online or offline by contacting me directly. My favorite comment came from Jos Voskuil, and I'd like to quote it here.
When a post starts with "Nothing is more painful in PLM business as PLM upgrades" I have the tendency to stop reading. It reflects a mindset that if you have an installed system, moving to a new version is the biggest problem. Before reaching that situation you have to implement PLM first in such a manner that what you did in the past is upgradeable in the future. This is a mission impossible for everyone.
What I like in Jos' comment is the reflection on how companies create an instant legacy by implementing PLM solutions. Hence, being able to plan ahead is very important. I fully agree with this position - companies should plan what to do and how to implement a PLM system. Planning is always important, especially when it comes to such fundamental topics as managing product information and processes.
At the same time, Jos outlined a very gloomy future in which upgrades are mission impossible. Here is the passage.
In my recent blog series, I illustrated that methodologies and data models have been changing. With digitization, we see even a more disruptive set of new best practices as they need to become data-driven. It would be good to study and describe them first before implementing multi-tenant SaaS solutions, as the same risk will be here for the future. Too much small spaghetti pieces will make it a mikado game - don't touch it as we do not know the impact.
To conclude: Having a consistent implementation vision as a strategy is step one - next, be aware with every choice you make, you create a legacy, which can be justified by its business benefits. However, some generations later we will again talk about the (different) challenge of upgrades/changes.
The recommendation to have a vision is great. But how do you make it happen? I didn't find the answer in Jos' comments. The reality of manufacturing companies is that they already have one or two PLM systems implemented, and there is a high probability that they are stuck on old revisions of PLM software. There is no time machine available for these companies that will allow them to go back and fix their PLM implementation planning. The only realistic option is to move forward and figure out how to upgrade or replace these systems.
The first upgrade step can be painful because current systems, with very few exceptions, were not designed to be upgradable. Therefore, it is very important to think about future systems and, as Jos rightfully says, to keep constant change in mind.
One of the ways to solve the problem of upgrades is to adopt a DevOps methodology in PLM implementations. I have written about it in the past - check some of my DevOps articles. Here is my favorite - SaaS, PLM, Tomkat, and why PLM vendors need to learn DevOps.
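To make the DevOps idea a bit more tangible, here is a minimal sketch of what an automated, forward-only database upgrade step could look like when it runs as part of a deployment pipeline. This is my own illustration, not code from any PLM product; the table names and migrations are hypothetical, and SQLite stands in for the real database.

```python
# A minimal sketch of a forward-only migration runner - the kind of step a
# CI/CD pipeline could execute on every deployment. All table and column
# names are hypothetical, not taken from any specific PLM product.
import sqlite3

# Ordered list of forward-only migrations. In a real pipeline these would
# live in version-controlled files, one per change.
MIGRATIONS = [
    (1, "CREATE TABLE item (id INTEGER PRIMARY KEY, part_number TEXT NOT NULL)"),
    (2, "ALTER TABLE item ADD COLUMN revision TEXT DEFAULT 'A'"),
    (3, "CREATE INDEX idx_item_part_number ON item (part_number)"),
]

def current_version(conn: sqlite3.Connection) -> int:
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER NOT NULL)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    return row[0] or 0

def upgrade(conn: sqlite3.Connection) -> None:
    applied_from = current_version(conn)
    for version, statement in MIGRATIONS:
        if version <= applied_from:
            continue  # already applied on a previous deployment
        conn.execute(statement)
        conn.execute("INSERT INTO schema_version (version) VALUES (?)", (version,))
        conn.commit()
        print(f"applied migration {version}")

if __name__ == "__main__":
    # An in-memory database stands in for the real PLM database.
    with sqlite3.connect(":memory:") as db:
        upgrade(db)   # first run applies everything
        upgrade(db)   # second run is a no-op: safe to repeat on every deploy
        print("schema version:", current_version(db))
```

The point is not the specific SQL, but that the upgrade is scripted, versioned, and safe to run on every deployment - exactly the property DevOps brings to PLM operations.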
In my article – SaaS PLM – build the airplane when you’re flying it, I shared more thoughts on how SaaS applications differ from hosted cloud systems and where the evolution of future PLM development will take us. You can read a bit more about how I see future PLM being developed in my article – PLM Circa 2020.
The previous generation of PLM systems was designed on top of an SQL database installed on company premises to control and manage company data. It was the single version of the truth paradigm. Things are changing, and the truth is now distributed. It is not located in a single database in a single company; it lives in multiple places and is updated all the time. The pace of business keeps getting faster. To support such an environment, companies need to move to different types of systems – ones that are global, online, optimized to use information coming from multiple sources, and capable of interacting in real time. Isolated SQL-based architectures running in a single company are a thing of the past. SaaS makes the PLM upgrade problem irrelevant, as everyone is running on the same version of the same software.
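To illustrate what "the truth lives in multiple places" can mean in practice, here is a small sketch - my own illustration, not any vendor's API - of merging item records that arrive from several systems, keeping the most recently updated record per part number. The source names and fields are made up.

```python
# A minimal sketch of product data that lives in several systems at once:
# records for the same part arrive from multiple sources and the newest wins.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ItemRecord:
    part_number: str
    description: str
    source: str            # e.g. "erp", "cad" - hypothetical source names
    updated_at: datetime

def merge_latest(*sources: list[ItemRecord]) -> dict[str, ItemRecord]:
    """Combine records from all sources, keeping the most recent per part."""
    merged: dict[str, ItemRecord] = {}
    for source in sources:
        for record in source:
            existing = merged.get(record.part_number)
            if existing is None or record.updated_at > existing.updated_at:
                merged[record.part_number] = record
    return merged

erp = [ItemRecord("P-100", "Bracket, steel", "erp",
                  datetime(2021, 3, 1, tzinfo=timezone.utc))]
cad = [ItemRecord("P-100", "Bracket, steel, rev B geometry", "cad",
                  datetime(2021, 3, 5, tzinfo=timezone.utc)),
       ItemRecord("P-200", "Fastener kit", "cad",
                  datetime(2021, 2, 10, tzinfo=timezone.utc))]

for part, rec in merge_latest(erp, cad).items():
    print(part, "->", rec.description, f"(from {rec.source})")
```

A real system would of course reconcile conflicts with more than a timestamp, but the shift in mindset is the same: no single database owns the data.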
New PLM SaaS architectures can make systems continuously upgradable because these architectures were created with upgrades in mind. Everything in these systems is designed to allow the system to be upgraded at any moment. More sophisticated settings can allow you to run an isolated version of the system, but only with the assumption that the system can be upgraded forward at any moment.
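One common technique behind "upgradable at any moment" is the expand/contract pattern: the application is written to tolerate both the old and the new data shape while records are migrated in the background. Here is a minimal, hypothetical sketch of the idea; the field names are made up for illustration.

```python
# A minimal sketch of the expand/contract pattern: the code reads both the
# old and the new field, so an upgrade can happen at any moment while data
# is backfilled in the background. Field names are hypothetical.
def read_item_weight(item: dict) -> float:
    """Return weight in kilograms, whether the record is old- or new-style."""
    if "weight_kg" in item:                  # new schema (expand step applied)
        return item["weight_kg"]
    return item["weight_lb"] * 0.45359237    # old schema, converted on the fly

# Records written before and after the upgrade coexist in the same tenant.
old_record = {"part_number": "P-100", "weight_lb": 2.0}
new_record = {"part_number": "P-200", "weight_kg": 1.5}

for record in (old_record, new_record):
    print(record["part_number"], read_item_weight(record))
```

Only after all records have been backfilled to the new field does the old one get removed (the "contract" step), so there is never a moment when the system cannot move forward.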
Conclusion
The universal path forward is not only to have a vision for consistent implementation but also to have technology capable of supporting this vision. SaaS architectures are by nature built on top of continuous integration and DevOps concepts and designed with the mindset of being upgradable at any moment. There are no PLM time machines available to take manufacturing companies back and fix their previous implementations. Therefore, the only path available is the path forward - new systems and PLM architectures that keep moving along the path of continuous development, operations, and upgrades. Just my thoughts…
Best, Oleg
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network platform that manages product data and connects manufacturers and their supply chain networks.