Governance of Master Data: Definition, Structure and Implementation

Learn about the importance of establishing a master data governance model and how you can structure and implement a governance model for your enterprise.


We have previously discussed the importance of Master Data Management (MDM) to your business, with a focus on testing laboratories. Once the need for MDM is agreed upon, the governance of that master data deserves serious consideration, to ensure that it remains a reliable central repository of your core information assets.

What is Governance?

Governance is everything that an organization does to ensure the ownership and accountability pertaining to master data management. This includes:

  • The items that form the basis for master data, and their structure
  • The processes to create and maintain master data
  • The metrics and business rules that pertain to master data management
  • The rules and processes around master data access, security and consumption

Effective governance of master data is enabled by establishing a governance organization (team) responsible for creating the policies and procedures that govern the creation, maintenance and consumption of master data, and for ensuring that appropriate metrics are in place to measure the quality and business use of that data.

Structuring a Governance Organization:

A well-structured governance organization has:

  • A Governance Council: Made up of senior members of business units and corporate functions, the council establishes the strategy and defines the tactics by which master data will be governed. Council members are responsible and accountable for propagating the council's directives in their respective business areas via the stewardship team. The council also oversees investments in master data and its consumption, and regularly evaluates the effectiveness of the governance model via metrics.
  • A Stewardship Team: Made up of information/data stewards, this team consists of individuals (generally business leaders or managers) from different business units who are tasked with propagating the program and monitoring its progress. This team needs to be well defined, as it is a critical factor in the success of governance initiatives. The stewardship team defines the specific policies, procedures and practices by which information will be governed. It continually monitors the efficacy of the governance program, identifies the projects that will have the biggest impact on information quality and leverage, and reports into the governance council. It also drives the organizational change initiatives – including training and evangelization – required to ensure the adoption of master data governance throughout the enterprise.
  • A Custodian Team: Information custodians are individuals from various business units who are responsible for specific master data domains (e.g. people, customers, standards and regulations, methods, etc.). They are directly responsible for managing the master data in their domain and work closely with the information steward for that domain to ensure alignment with and adherence to governance policy.

Governance Policies & Procedures:

To ensure clarity, a few policies should be established:

  • Policy around ownership and accountability pertaining to master data
  • Policy around governance procedure development & implementation
  • Policy around governance issues and their resolution
  • Policy around compliance audits
  • Policy around governance training

Once these policies have been established, they should then be materialized through well defined processes and procedures such as:

  • Identification, definition and implementation of new policies/procedures
  • Identification of, and building the case for, new master data initiatives
  • Making the case for modifications to the master data governance model
  • Definition of an oversight process to measure and monitor effectiveness
  • Oversight and management of the progress of existing initiatives
  • Definition and capture of metrics to measure performance
  • Definition of the calendar for, and execution of, compliance audits
  • Day-to-day management of governance and compliance issues

Centralized vs Hub & Spoke models in Testing Laboratories:

Laboratories often ask us whether they should have a centralized governance team or a distributed one. The answer lies in the way the master data elements are managed. For small to medium-sized organizations, it may be easier to implement a centralized model and then evolve it over time as the organization expands geographically. For large organizations with a geographically distributed footprint, we generally recommend the hub-and-spoke model, in which collaboration and alignment are critical components of success.

As with most things, there is no one-size-fits-all prescription when it comes to this. The choice of architecture depends on specific business problems with processes or data domains. Organizations should remain flexible in their choice of architecture and select those that best fit their business needs.

Our Test Reference Information Management Solution (TRIMS) supports both models via its definition of various roles and collaboration workflows.

Critical Success Factors:

Establishing the right governance model is critically important – and it is hard, especially for organizations that have never had that kind of structure. Here are some things that will help you along the journey of defining and maturing this capability over time:

  • Identify an Executive Sponsor: It is critical that a senior executive in the organization invests their time to be the executive sponsor of this initiative. They are positioned to ensure that the model aligns to corporate strategy while messaging to the enterprise the value the leadership team places on the initiative. As the chair of the Governance Council, they are also the final decision maker around investments related to master data and its adoption.
  • Identify a Chief Information Steward: The Chief Information Steward should be an individual who is the lead evangelist for the program and should advocate continuous improvements to master data quality (regardless of their current or past business roles / domain affiliation). They are key to driving organizational change.
  • Assign the right people to the Information Custodian role: Different business units and functions have different needs – and the information custodians should be picked to align with those needs. Most importantly, the information custodians should believe in, and actively propagate, the values and ideals of master data management and its governance inside their respective business areas.
  • Build a scalable model: Try not to build too bureaucratic a model to start. Start small (but with clearly defined roles and responsibilities) and constantly measure/ change / improve. The intent should be to build something that scales with the needs of the business.
  • Build data quality into the model: Information quality should be an integral part of the model. In fact, it should be one of the model's deliverables, ensuring that the team's output delivers the core strategic value of the initiative.
  • Make the right choice around Centralized vs Hub-and-Spoke: We talked a bit about this before – make the right choice for your organization. Be willing to revisit the choice as the organization grows and matures.
  • Measure both efficacy and efficiency: Efficiency metrics answer the question, “Are things being done right?”. Efficacy metrics answer the question, “Are the right things being done?”. It is important to review and analyze both types of metrics to determine and improve the business value of governance.
  • Build a culture of information quality: If the enterprise does not see the value of master data and its governance, your initiatives will fail. Build a culture where information is treated as a key asset. Empower decision making at the nodes, governed by a framework that ensures quality. Always evangelize the need for, and value of, the initiative to the enterprise AND to the people doing the work.

If you are a testing laboratory, we strongly suggest you take a look at the QualNimbus Test Reference Information Management Solution (TRIMS). It not only provides you with a centralized repository to manage your core master data, but also provides a full-fledged governance model built right into the solution.

The system allows for the definition of business units and departments, and restricts information creation capabilities accordingly. So, for instance, someone from the softlines business unit can only create softlines-related packages (but can still consume artifacts created by other departments, such as chemical/analytical testing, which could be a centralized capability). It has clearly demarcated roles for master data management, creation of test lines, protocols, packages, pricing management, etc., ensuring that information custodians work on the aspects they are accountable for.

The system also has a well-defined workflow for the management of core artifacts such as test lines and protocols/packages: an individual authors an artifact, it is automatically sent onward for review and approval, and its price is then configured in the system before it becomes available for consumption.
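The workflow described above can be sketched as a simple forward-moving state machine. The stage names, transitions and `Artifact` class below are illustrative assumptions, not the actual TRIMS implementation:

```python
from enum import Enum

class Stage(Enum):
    DRAFT = "draft"
    IN_REVIEW = "in_review"
    APPROVED = "approved"
    PRICED = "priced"          # price configured after approval
    PUBLISHED = "published"    # available for consumption

# Allowed transitions: artifacts move forward through the workflow,
# except that a reviewer may send a draft back for rework.
TRANSITIONS = {
    Stage.DRAFT: {Stage.IN_REVIEW},
    Stage.IN_REVIEW: {Stage.APPROVED, Stage.DRAFT},
    Stage.APPROVED: {Stage.PRICED},
    Stage.PRICED: {Stage.PUBLISHED},
    Stage.PUBLISHED: set(),
}

class Artifact:
    """A governed artifact (e.g. a test line or protocol) with a
    simple stage history serving as an audit trail."""
    def __init__(self, name: str):
        self.name = name
        self.stage = Stage.DRAFT
        self.history = [Stage.DRAFT]

    def advance(self, target: Stage) -> None:
        if target not in TRANSITIONS[self.stage]:
            raise ValueError(
                f"cannot move {self.name} from {self.stage.value} to {target.value}")
        self.stage = target
        self.history.append(target)

# Author -> review/approval -> pricing -> available for consumption.
protocol = Artifact("softlines-protocol-v2")   # hypothetical name
protocol.advance(Stage.IN_REVIEW)
protocol.advance(Stage.APPROVED)
protocol.advance(Stage.PRICED)
protocol.advance(Stage.PUBLISHED)
```

The forward-only transition table is what makes such a workflow governable: an artifact cannot be published without passing through review, approval and pricing.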

The system also provides operational reporting that gives visibility into work in progress (by status, stage, author, etc.), the aging of artifacts, and more.

We would be happy to speak to you regarding how our system could be of benefit to your organization. Please contact us and we can set up a demo to walk you through TRIMS.

As always, we would love to hear your thoughts – so leave us a comment.

Master Data Management for TIC – Lab Testing

How a Master Data Management strategy implemented via QualNimbus TRIMS could help you be productive, profitable and deliver consistent service.

The importance of Master Data Management for Testing Laboratories

In our experience, and in speaking with many laboratories big and small, most testing laboratories do not have a cohesive master data management strategy. Whatever the reasons, they are generally doing themselves a disservice. A well-structured master data management strategy allows laboratories to manage and leverage their core information assets.

Laboratories have to stay on top of the global compliance landscape and develop solutions that allow them to deliver services to their stakeholders in a consistent manner. This applies to aspects such as Regulations and Standards (which can vary by region/country/state/etc.), Methods for performing the test, associated Analytes and their limits, the definition of Test Lines and the consumption of those Test Lines in Product Protocols or Test Packages, etc.

Most laboratories that we have engaged with manage this information manually today – either in Word or Excel. There are several issues with this:

  • Information is not well structured (or consistently structured)
  • There is duplication of artifacts across the organization (across locations, for instance) which leads not only to governance issues but also the risk of inconsistency in service delivery
  • Making a change to a master data attribute can be heavily time consuming, since every artifact that consumes that attribute must be manually opened, reviewed, changed if necessary, and reviewed again.
  • The information in these files cannot be readily consumed – say by eCommerce solutions or by Laboratory Information Management Solutions (LIMS).

A lot of manual labor and errors in service delivery can thus be avoided by the implementation or adoption of a master data management solution. Besides this, having a MDM strategy also allows for highly effective data mining and business intelligence reporting around your transactional data which can help you significantly leverage your information assets.


An Illustrative Scenario

Let’s say that Laboratory ABC has Retailer XYZ as a client. The retailer has its own specifications around the testing process – for instance, it sells a product in the US and Europe (i.e. it has to comply with the regulatory requirements of both regions). Because of this, its specific needs – from both a regulatory and a performance perspective – could vary from the standard. For example, the retailer could specify requirements more stringent than those required by regulation (e.g. if the US regulation requires <= 120 ppm for Lead, the retailer may choose to go above and beyond with a requirement of <= 100 ppm). Several other variances could occur based on product attributes (e.g. applicability to specific materials, exemption of others), testing conditions (e.g. variances at different temperatures, wash cycles, etc.), and so on.

As one can imagine, if this is maintained manually, it is a nightmare to manage and govern. Oftentimes, just because of the variances, laboratories choose to create multiple test lines that are identical except for those variances (which further exacerbates the situation). These test lines are also consumed in protocols and packages, so there is now an even larger number of artifacts that need to be managed and governed.

Now think of what the lab has to go through if a single analyte limit changes (say Lead drops to 90 ppm due to a new regulatory requirement in the EU). The retailer asks the laboratory to ensure compliance with this requirement when it goes into effect. The laboratory now has to find every test line and change the appropriate ones to reflect the new requirement (with an effective date matching the regulation’s go-live date), and then do the same with every protocol or package. This often takes weeks of manual effort.


MDM – Need for a simpler solution

The implementation of a master data management solution can dramatically reduce the effort associated with managing and governing this information.

QualNimbus has developed an enterprise class solution specifically for Lab Testing that allows centralized management of testing related master reference data. This solution is called the Test Reference Information Management Solution (TRIMS).

In TRIMS, all of the master data is defined just once – and discretely.

So in the scenario above,

  1. The client specific analyte-limit for Lead (which is managed separately from the regulatory analyte-limit) is managed in the Analyte Limit master.
  2. This analyte limit is then consumed in multiple test lines (as applicable).
  3. The Test Lines are then consumed in multiple protocols or packages.
  4. Changes to key artifacts are versioned, effective dated and governed (i.e. they require review/approval workflows).
  5. Every change in the system also triggers an audit trail so you can see who made the changes to what information and when.
  6. Once a change is made, it automatically cascades through the system and affects only those artifacts that consume that specific attribute or artifact.

The same scenario, which would take weeks of manual effort, can now be accomplished in a matter of hours using TRIMS.
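The difference between copying a limit into every artifact and referencing a single master record can be sketched in a few lines. The dictionaries and key names below are hypothetical illustrations of the normalization idea, not the TRIMS data model:

```python
# Master: each client-specific analyte limit is defined exactly once.
analyte_limits = {
    "XYZ-EU-lead": {"analyte": "Lead", "limit_ppm": 100, "region": "EU"},
}

# Test lines reference the limit by key instead of copying the value.
# (Test line ids and method names here are made up for illustration.)
test_lines = [
    {"id": "TL-001", "method": "acid digestion", "limit_ref": "XYZ-EU-lead"},
    {"id": "TL-002", "method": "XRF screening", "limit_ref": "XYZ-EU-lead"},
]

def effective_limit(test_line: dict) -> int:
    """Resolve a test line's limit through the master record."""
    return analyte_limits[test_line["limit_ref"]]["limit_ppm"]

# A new EU requirement drops the Lead limit to 90 ppm: one change...
analyte_limits["XYZ-EU-lead"]["limit_ppm"] = 90

# ...cascades to every test line (and to any protocol or package
# built from those test lines), with no per-artifact editing.
assert all(effective_limit(tl) == 90 for tl in test_lines)
```

This single-point-of-definition pattern is why the regulatory change that takes weeks in spreadsheets becomes a single governed update in a master data solution.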

If you manage a lab, we recommend you take a closer look at Master Data Management as a core aspect of your business. Please contact us and we would be happy to help you understand this better and demonstrate how TRIMS could help you.



Let us know what you think via the comments section.

Information Security on QualNimbus Solutions

Learn about how QualNimbus secures information on its applications. You can use QualNimbus solutions with peace of mind!

Information Security

A lot of our prospective customers have a concern regarding information security on our platform. This is rightly so – as companies do want to ensure that the information they store on our systems is secured and protected.

We want to assure you that QualNimbus takes information security very seriously. We have the following measures in place to ensure security:

  • Our infrastructure is hosted on Amazon AWS, which has some of the highest degrees of security in the world (it is even used by the US Department of Defense!).
  • In addition to this, to ensure scalability, we have multiple tiers in our applications: Web Tier, Application Tier and Database Tier. All three of these tiers are themselves secured by complex challenge/response (password) mechanisms to prevent breach.
  • The web tier is hosted behind a load balancer, which is itself secured as well.
  • We also require that every user of our system has a strong password – and only valid users can access the system.
  • All our systems also implement role-based security, which allows fine-tuning of the capabilities that users have access to once logged into the system, based on their role(s).
  • All transactions on our applications happen over an HTTPS connection using 256-bit encryption, so information cannot be intercepted in transit between your machine and our servers.
  • Finally, as a last measure of security, we store each customer’s data in a completely separate database schema – so your information is never intermingled with information of other companies.
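The schema-per-tenant isolation described in the last bullet can be sketched as follows. The function names and the `tenant_` schema naming convention are assumptions for illustration; QualNimbus' actual implementation is not shown here:

```python
def schema_for_tenant(tenant_id: str) -> str:
    """Map a customer to its own database schema, so one tenant's
    rows never share tables with another tenant's."""
    # Whitelist-style validation keeps tenant ids from injecting SQL.
    if not tenant_id.isalnum():
        raise ValueError("invalid tenant id")
    return f"tenant_{tenant_id.lower()}"

def scoped_query(tenant_id: str, table: str) -> str:
    """Qualify every query with the tenant's schema."""
    return f"SELECT * FROM {schema_for_tenant(tenant_id)}.{table}"

print(scoped_query("AcmeLabs", "test_lines"))
# prints: SELECT * FROM tenant_acmelabs.test_lines
```

Because the schema name is derived on every request, a query for one customer structurally cannot touch another customer's data.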

We hope this addresses concerns regarding security. We have been able to win the trust of large, industry leading firms about our security and scalability. We would be happy to discuss any of these aspects in detail with you – just contact us.

We want to assure you again that we take the security of your information as seriously as you do. We want you to use QualNimbus solutions with absolute peace of mind!



The What & Why of SaaS

Learn why Software as a Service (SaaS) as a means for delivering entire applications over the internet makes sense for your business.

What is SaaS:

Historically, companies were required to buy, build, and maintain their IT infrastructures, often at substantial cost. Software as a service (or SaaS) gives companies an alternative. SaaS is a mechanism for delivering applications over the Internet—as a service. Instead of managing hardware, software and applications, you simply access them via the Internet, freeing yourself from the complexities of software and hardware management. QualNimbus, being a SaaS solution, manages the hardware and software as well as access to the application, including security, availability, and performance. All you need is an internet connection.

Benefits of SaaS:

The SaaS model has flourished because of the many benefits it offers to businesses of all sizes and types. Here’s what’s driving customers to take advantage of SaaS solutions:

  • High Adoption: SaaS applications are available from any computer or any device—any time, anywhere. Because most people are familiar with using the Internet to find what they need, SaaS applications tend to have high adoption rates, with a lower learning curve.
  • Lower Initial Costs: SaaS applications are subscription based. No license fees mean lower initial costs. Having the SaaS provider manage the IT infrastructure means lower IT costs for hardware, software, and the people needed to manage it all.
  • Painless Upgrades: Because the SaaS provider manages all updates and upgrades, there are no patches for customers to download or install. The SaaS provider also manages availability, so there’s no need for customers to add hardware, software, or bandwidth as the user base grows.
  • Seamless Integration: SaaS vendors with true multitenant architectures can scale indefinitely to meet customer demand. Many SaaS providers also provide APIs that let you integrate with existing ERP systems or other business productivity systems.

The comparison below provides an easy-to-understand view of the various types of solutions, including SaaS, and how the models differ.

On Premise Solution:

  • Entire HW/SW stack & people all in house
  • Substantial IT costs to maintain & manage (high TCO)
  • HW/SW upgrades & scalability are expensive & time consuming
  • Lights-on focus – limited innovation

Infrastructure as a Service (IaaS):

  • Full SW stack & people in house; HW vendor managed
  • High IT costs to maintain & manage (high TCO)
  • SW upgrades are expensive & time consuming
  • Lights-on focus – limited innovation

Platform as a Service (PaaS):

  • App/data & people in house; HW/SW vendor managed
  • Moderate IT costs to maintain & manage
  • App upgrades are expensive & time consuming
  • Application focus – siloed innovation

Software as a Service (SaaS):

  • Entire HW/SW stack & people vendor managed
  • No internal IT cost – pay per use
  • Low TCO (multi-tenant)
  • Highly scalable
  • Continuously evolving product; transparent upgrades
  • Resources freed for innovation



If you are a Testing service provider – or a buyer or supplier with an in-house lab – you know the importance of having a solid master data management (MDM) strategy. Having a clean information base is critical for all of your master data – including information pertaining to Tests, Test Lines and, if applicable, Test Packages and Test Protocols.

If you are a large TIC service provider, TRIMS helps ensure that you can govern this information across many laboratories worldwide, delivering consistent service to your customers everywhere. Just as important is the ability to make a change to a master data element – or test line – and have it automatically reflected across every artifact that references it.

QualNimbus’ Test Reference Information Management Solution (TRIMS) can help with this. It is a state-of-the-art solution that allows you to establish a solid master data management and governance strategy, and lays the foundation for business intelligence reporting and advanced decision support systems.

Contact us and we would be happy to have a more detailed discussion with you.



Inspekt is our SaaS solution for managing the entire value chain pertaining to inspection services. The solution is aimed primarily at small to medium-sized service providers, as well as buying houses or importers based in the sourcing regions that have their own in-house inspection teams.