Systems and interfaces often cost more than they should to build, operate, and maintain. They can also end up constraining the business rather than supporting it. A major cause is the poor quality of the data models implemented in systems and interfaces. Business rules, specific to how things are done in a particular place or industry, are often fixed in the structure of a data model. As a result, small changes in the way business is conducted lead to large changes in computer systems and interfaces.
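As a minimal, hypothetical sketch (the entities and fields below are illustrative, not taken from any particular system), this is what it can look like when a business rule is frozen into the model structure versus kept as data:

```typescript
// Hypothetical: the rule "a customer has exactly one of three discount tiers"
// is fixed in the type itself, so a new tier or a per-product discount forces
// schema and code changes everywhere Customer is used.
interface RigidCustomer {
  id: string;
  name: string;
  discountTier: "gold" | "silver" | "bronze"; // business rule frozen into the structure
}

// A more generic model keeps the rule as data: discounts become agreements that
// reference the customer, so a new kind of discount is new data, not a new column.
interface Customer {
  id: string;
  name: string;
}

interface DiscountAgreement {
  customerId: string; // reference to Customer.id
  kind: string;       // e.g. "tier", "per-product", "seasonal"
  value: number;      // discount percentage
  validFrom?: Date;
  validTo?: Date;
}
```

With the second shape, a change in how discounts are granted becomes a data change rather than a system change.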
Entity types, such as customers or transactions, are often not identified, or are identified incorrectly. This leads to duplication of data, data structures, and functionality, with the attendant development and maintenance costs. Data models for different systems are also arbitrarily different, so complex interfaces are required between systems that share data. These interfaces can account for 25 to 70 percent of the cost of current systems.
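A small, hypothetical sketch of why those interfaces are costly (the two schemas and the mapping function below are assumptions for illustration only):

```typescript
// Two systems that model the same customer in arbitrarily different ways.
interface BillingCustomer {
  custNo: number;
  fullName: string;    // stored as "Last, First"
  countryCode: string; // ISO 3166-1 alpha-2 code
}

interface CrmCustomer {
  id: string;
  firstName: string;
  lastName: string;
  country: string;     // free-text country name
}

// Every pair of systems that shares data needs translation code like this,
// and every model change ripples through it; that is where interface cost accumulates.
function toCrmCustomer(b: BillingCustomer, countryName: string): CrmCustomer {
  const [lastName, firstName] = b.fullName.split(",").map((s) => s.trim());
  return { id: String(b.custNo), firstName, lastName, country: countryName };
}
```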
In some cases, data cannot be shared electronically with customers and suppliers at all, because the structure and meaning of the data have not been standardized. For example, dossier information is still sometimes exchanged on paper.
The root cause of these problems is a lack of standards to ensure that data models both meet business needs and are consistent.
Just as architects consider blueprints before constructing a building, organizations and consortiums should consider data before building solutions. On average, about 70 percent of software development efforts fail, and a major source of failure is premature coding. A data model helps define the problem, enabling consortiums to consider different approaches and choose the best one.
Data models also help applications get built at lower cost. Data modeling typically consumes less than 10 percent of a project budget, yet it can reduce the roughly 70 percent of the budget that is typically devoted to programming. Data modeling catches errors and oversights early, when they are easy to fix, rather than after the software has been written or, worse yet, is in production. Catching errors early also lets organizations build software faster. In addition, data models underpin automation as well as data access.
A data model provides a focus for determining scope. It gives business sponsors and developers something tangible for agreeing on precisely what is included in the software and what is omitted. Business staff can see what the developers are building and compare it with their own understanding. Models promote consensus among developers, customers, and other stakeholders. They also promote agreement on vocabulary and jargon and make the chosen terms explicit, so that they can be carried forward into software artifacts. The resulting software becomes easier to maintain and extend.
Models document important concepts and jargon, providing a basis for long-term maintenance. The documentation will serve you well through staff turnover. Today, most application vendors can provide a data model of their application upon request, because the IT industry recognizes that models convey important abstractions and ideas in a concise and understandable manner. The documentation inherent in a model also serves as an excellent starting point for analytical data mining.
Lastly, a data model forces participants to define concepts crisply and resolve confusion. As a result, application development starts with a clear vision. Developers can still make detailed errors as they write application code, but they are less likely to make deep errors that are difficult to resolve.
Organizations and consortiums can use their data models to estimate the complexity of software and gain insight into the level of development effort and project risk. To do so, they consider the size of the model as well as the intensity of its dependencies.
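As a minimal sketch of that idea (the metric below is an assumed, illustrative formula, not a standard one), such an estimate can be derived from counts of entity types and the relationships between them:

```typescript
// Hypothetical complexity indicator for a data model: more entity types and a
// denser web of relationships suggest more development effort and higher risk.
interface ModelStats {
  entityTypes: number;   // number of entity types in the model
  relationships: number; // number of relationships between entity types
}

// Average relationships per entity type; higher values mean tighter coupling,
// so changes are likely to ripple further through the model.
function dependencyIntensity(stats: ModelStats): number {
  return stats.entityTypes === 0 ? 0 : stats.relationships / stats.entityTypes;
}

// A crude effort/risk score combining model size and coupling.
function complexityScore(stats: ModelStats): number {
  return stats.entityTypes * (1 + dependencyIntensity(stats));
}

// Example: 40 entity types with 90 relationships between them.
console.log(complexityScore({ entityTypes: 40, relationships: 90 })); // 130
```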
Does your organization or consortium have its own data models? Keep owning them, but let us maintain and develop them for you!
Model mapping is both essential and the real gold in data interoperability. Let us help you map and maintain your data models as they evolve and grow richer!
Moving from one industry model to another, or to a brand-new industry standard? Don't sweat it, we've done it before and will happily do it again with you!
We love to help out and are fiercely convinced that we have what it takes to solve almost any cross-domain interoperability issue! We would love to hear from you, whether you just want to grab a coffee and chat or build the next big thing!