Lab Test Information Management: Challenges & Way Forward

In an ever more competitive landscape, Testing Laboratories need to address their challenges by focusing on the core drivers via a cohesive information management strategy.

Lab Test Information Management: Challenges

Testing Laboratories in the TIC industry have grown aggressively, both inorganically via acquisitions and organically, while simultaneously striving to maintain and grow margins. This growth model has brought with it significant and, frankly, inevitable challenges.

Some of the key challenges are listed below:

  1. Lack of standardization – Variability in core information and operational execution platforms across lab locations and regions, including variations driven by client-specific customization.
  2. Ad-hoc and inefficient processes – Redundant and excessive manual verification steps have been introduced at every stage of operations.
  3. Ageing delivery platforms – The industry continues to rely heavily on human capital and has a lot of catching up to do in terms of technology adoption. Full automation of lab data capture, automated report generation and B2B customer intimacy platforms remain dreams that continue to elude even the most innovative, market-leading service providers.
  4. Declining Margins – Given an ever more competitive landscape and rising input costs coupled with stagnant pricing (or even downward pricing pressure), margins are coming under growing pressure. This deserves to be called out as a challenge in its own right, as it is one of the most critical metrics on which management performance is measured today.
  5. Revenue Leakage – This is a direct and inevitable result of the lack of traceability in lab information management, all the way from the Test Request Form (TRF) being submitted to the corresponding invoice being issued. Labs all over the world end up under-invoicing. While the percentage of revenue lost varies from lab to lab, it happens in every lab nevertheless.
  6. Impatient, Unhappy Customers – Let’s not forget the customer in all this. Customers want faster time to market and are unhappy with the Turn Around Time (TAT) delays they face from their TIC service provider. The list of impediments is long: rework due to lab reporting errors, invoicing mistakes (they will certainly complain if labs over-invoice!), the lack of reliable business intelligence, and so on.

Lab Test Information Management: Way forward

Focus on the drivers!

  • Gain Competitive Advantage – Do you believe you are in this business just to provide a test report? If not, how will you differentiate?
  • Retain & Grow – Customers no longer want just a test report, even if you can manage to send an accurate report within the committed TAT. How will you ensure customer retention? How will you delight your customers continually?
  • Improve Margins – It is difficult to maintain high margins while simultaneously aiming for higher growth, especially if organizations continue using human capital-intensive processes and systems. What are the operational objectives? In which areas will you be able to drive efficiency and improve cost competitiveness?

Align your objectives to the drivers

What: Ensure Customer Delight
How:
  a. Improve the consistency and granularity of reporting data by adopting a robust, universal master data management (MDM) system that spans all operational transaction management and reporting systems.
  b. Accelerate Turn Around Time (TAT) by promptly responding to regulatory or customer-requested changes to test protocols and packages. This is far easier to achieve once a workflow-based, global test protocol/package management system such as TRIMS is in place.
  c. Automate test data capture, reporting and invoicing.
  d. Deliver real-time operational information and rich business intelligence reporting through a smart customer portal. Once again, this cannot be achieved without the foundation to efficiently manage rich referential data.
Why: Differentiate; Retain & Grow

What: Improve Scalability
How:
  a. Implement best-practice processes to improve operational governance across the network.
  b. Adopt a scalable technology platform to handle simple to complex change management requests quickly and accurately, across a large and ever-increasing number of test protocols and packages.
Why: Differentiate; Retain & Grow; Improve Margins

What: Improve Efficiency and Lower Risk
How:
  a. Establish a single source of truth for your test protocols and packages – eliminating reliance on standalone, document/paper-based test protocols and, with it, the risk of non-compliance with current regulatory and customer requirements.
  b. Drastically reduce errors in test assignment, data capture and reporting, thereby minimizing rework.
  c. Save paper and printing costs. This is not just about being socially responsible – it is pure margin, very much part of the current Total Cost of Ownership (TCO) – and this cost goes away with adoption of the new test protocol and package management platform.
Why: Improve Margins

Let us know what you think – leave a comment below.

Contact Us

Master Data Management for Testing Laboratories

How a Master Data Management strategy implemented via QualNimbus TRIMS could help you be productive and profitable, and deliver consistent service.

What is Master Data Management?

The definition for master data management (MDM) from Wikipedia reads:

In business, master data management (MDM) comprises the processes, governance, policies, standards and tools that consistently define and manage the critical data of an organization to provide a single point of reference.

The data that is mastered may include:

  • reference data – the business objects for transactions, and the dimensions for analysis
  • analytical data – supports decision making

Essentially, it is the single source of truth of core reference information that can then be consumed throughout the organization in a consistent fashion. For instance, an organization should generally have a single source of truth as it pertains to its customers, suppliers, products, services, etc.

The importance of Master Data Management for Testing Laboratories

In our experience, and from speaking with many laboratories large and small, most testing laboratories do not have a cohesive master data management strategy. While this could be for many reasons, they are generally doing themselves a disservice. A well-structured master data management strategy allows laboratories to manage and leverage their core information assets.

Laboratories have to stay on top of the global compliance landscape and develop solutions that allow them to deliver services to their stakeholders in a consistent manner. This applies to aspects such as Regulations and Standards (which can vary by region/country/state/etc.), Methods for performing the test, associated Analytes and their limits, the definition of Test Lines and the consumption of those Test Lines in Product Protocols or Test Packages, etc.
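
To make this concrete, here is a rough sketch of how such a reference hierarchy might be modeled as structured data. The class and field names below are purely illustrative (this is not TRIMS's actual schema); they simply show how regulations, analyte limits, test lines and packages relate to one another.

```python
# Hedged sketch only: hypothetical names and fields, not TRIMS's actual data model.
from dataclasses import dataclass, field
from typing import List


@dataclass
class AnalyteLimit:
    analyte: str                 # e.g. "Lead"
    limit_ppm: float             # maximum allowed concentration
    regulation: str              # the regulation or standard the limit comes from
    region: str                  # jurisdiction the limit applies to


@dataclass
class TestLine:
    test_line_id: str
    method: str                  # reference to the test method used
    analyte_limits: List[AnalyteLimit] = field(default_factory=list)


@dataclass
class TestPackage:
    package_id: str
    name: str                    # e.g. a retailer- or product-specific protocol
    test_lines: List[TestLine] = field(default_factory=list)
```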

Most laboratories that we have engaged with manage this information manually today – either in Word or Excel. There are several issues with this:

  • Information is not well structured (or consistently structured)
  • There is duplication of artifacts across the organization (across locations, for instance) which leads not only to governance issues but also the risk of inconsistency in service delivery
  • Making a change to a master data attribute can be heavily time consuming, since every artifact that consumes that attribute must be manually opened, reviewed, changed if necessary and reviewed again.
  • The information in these files cannot be readily consumed – say, by eCommerce solutions or by Laboratory Information Management Systems (LIMS).

A lot of manual labor and errors in service delivery can thus be avoided by adopting a master data management solution. Beyond this, an MDM strategy also enables highly effective data mining and business intelligence reporting on your transactional data, which can help you significantly leverage your information assets.
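
As a simple illustration of that last point, once master data is structured rather than locked in documents, it can be serialized and shared with downstream systems. The JSON shape and identifiers below are hypothetical, not a defined interface of any particular product.

```python
import json

# A test line represented as structured data instead of a Word/Excel document.
# Identifiers and values are illustrative only.
test_line = {
    "test_line_id": "TL-001",
    "method": "METHOD-001",          # hypothetical test method reference
    "analyte_limits": [
        {"analyte": "Lead", "limit_ppm": 100.0, "region": "US"}
    ],
}

# Any downstream consumer (an eCommerce portal, a LIMS, a reporting tool)
# that can parse JSON now works from the same single source of truth.
print(json.dumps(test_line, indent=2))
```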


An Illustrative Scenario

Let’s say that Laboratory ABC has Retailer XYZ as a client. This retailer has its own specifications around the testing process – for instance, it sells a product in the US and Europe (i.e. it has to comply with the regulatory requirements of both regions). Because of this, its specific needs – from both a regulatory and a performance perspective – could vary from the standard. For example, it could specify requirements that are more stringent than those required by regulation (e.g. if the US regulation requires <= 120ppm for Lead, the retailer may choose to require that its products go above and beyond with a requirement of <= 100ppm). There are several other variances that could occur based on product attributes (e.g. applicability to specific materials, exemption of others), testing conditions (e.g. variances at different temperatures, wash cycles, etc.), and so on.
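
One way to picture the retailer-specific requirement is as an override resolved against the regulatory baseline. The snippet below is only a sketch of that idea, using the hypothetical numbers from the scenario above; it is not how any particular lab system implements it.

```python
# Hedged sketch: resolving a client-specific limit against the regulatory baseline.
# Numbers mirror the hypothetical scenario above (120 ppm regulation, 100 ppm retailer).

REGULATORY_LIMITS_PPM = {("Lead", "US"): 120.0}        # baseline from regulation
CLIENT_OVERRIDES_PPM = {("XYZ", "Lead", "US"): 100.0}  # retailer is stricter


def effective_limit(client: str, analyte: str, region: str) -> float:
    """The effective limit is the stricter (lower) of the regulation and any client override."""
    regulatory = REGULATORY_LIMITS_PPM[(analyte, region)]
    override = CLIENT_OVERRIDES_PPM.get((client, analyte, region))
    return min(regulatory, override) if override is not None else regulatory


print(effective_limit("XYZ", "Lead", "US"))  # 100.0, the retailer's stricter limit applies
```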

As one can imagine, if this is maintained manually, it is a nightmare to manage and govern. Oftentimes, just because of these variances, laboratories choose to create multiple test lines that are identical except for the variances (which further exacerbates the situation). These test lines are also consumed in Protocols and Packages, so there is now an even larger number of artifacts that need to be managed and governed.

Now think of what the lab has to go through if a single analyte limit changes (say Lead goes to 90ppm due to a new regulatory requirement in the EU). The retailer asks the laboratory to ensure that it will be compliant with this requirement when it goes into effect. The laboratory now has to find every test line and change the appropriate ones to reflect the new requirement (and ensure it has an effective date matching the regulation’s “go-live” date), and then do the same with every protocol or package. This often takes weeks or months of person-day effort to effect.


Is there a simpler solution?

Yes! The implementation of a master data management solution can dramatically reduce the effort associated with managing and governing this information.

QualNimbus has developed a secure and scalable Software as a Service (SaaS) solution specifically for testing laboratories that allows them to centrally manage all of their testing-related master reference data in one place. This solution is called the Test Reference Information Management Solution (TRIMS).

TRIMS Master Data Management

In TRIMS, all of the master data is defined just once – and discretely. So in the scenario above, the client-specific analyte limit for Lead (which is managed separately from the regulatory analyte limit) is managed in the Analyte Limit master. This analyte limit is then consumed in multiple test lines (as applicable). The Test Lines are in turn consumed in multiple protocols or packages. Changes to key artifacts are versioned, effective-dated and governed (i.e. they require review/approval workflows). Every change in the system also triggers an audit trail, so you can see who changed what information and when. Once a change is made, it automatically cascades through the system, affecting only those artifacts that consume that specific attribute or artifact. For instance, the scenario above can be accomplished in a matter of hours or a couple of days (accounting for review/approval workflows).
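
To illustrate the “define once, consume by reference” idea (and not TRIMS's actual implementation), here is a minimal sketch showing how a single governed change to a shared analyte limit becomes visible everywhere that limit is consumed.

```python
# Minimal sketch of reference-based cascading. Names, dates and values are illustrative;
# this is not TRIMS's actual data model or code.

lead_limit = {"analyte": "Lead", "limit_ppm": 100.0, "effective_from": "2024-01-01"}

# Test lines and the protocol hold references to the shared limit, not copies of it.
test_line_a = {"id": "TL-001", "limit": lead_limit}
test_line_b = {"id": "TL-002", "limit": lead_limit}
protocol = {"id": "PROTO-XYZ-US", "test_lines": [test_line_a, test_line_b]}

# A single governed, effective-dated change to the master record...
lead_limit["limit_ppm"] = 90.0
lead_limit["effective_from"] = "2025-07-01"   # aligned with the regulation's go-live date

# ...is reflected everywhere the limit is consumed, with no artifact-by-artifact rework.
print(protocol["test_lines"][0]["limit"]["limit_ppm"])  # 90.0
print(protocol["test_lines"][1]["limit"]["limit_ppm"])  # 90.0
```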

If you manage a lab, we recommend you take a closer look at Master Data Management as a core aspect of your business. Please contact us and we would be happy to help you understand this better and demonstrate how TRIMS could help you.


Request a Demo

Let us know what you think via the comments section.