Lab Test Information Management: Challenges & Way Forward

In an ever more competitive landscape, Testing Laboratories need to address their challenges by focusing on the core drivers via a cohesive information management strategy.

Lab Test Information Management: Challenges

Testing Laboratories in the TIC industry have grown aggressively, both inorganically via acquisitions and organically, while simultaneously striving to maintain and grow margins. This growth model has brought with it significant, and arguably inevitable, challenges.

Some of the key challenges are listed below:

  1. Lack of standardization – Variability in core information and operational execution platforms across lab locations and regions, including variability driven by client-specific customizations.
  2. Ad-Hoc and inefficient processes – Redundant and excessive manual verification steps have been introduced at every stage throughout the operations.
  3. Ageing delivery platforms – The industry continues to rely heavily on human capital and has a lot of catching up to do in terms of technology adoption. Full automation of lab data capture, automated report generation and B2B customer intimacy platforms remain aspirations that elude even the most innovative, market-leading service providers.
  4. Declining Margins – Given an increasingly competitive landscape and rising input costs, coupled with stagnant pricing (or even outright pricing pressure), margins are coming under growing pressure. This deserves to be mentioned as a challenge in its own right, as it is one of the most critical metrics on which management performance is measured today.
  5. Revenue Leakage – This is a direct and inevitable result of lack of traceability in lab information management, all the way from the Test Request Form (TRF) being submitted to the corresponding invoice being issued. Labs all over the world end up under-invoicing. While the percentage loss in revenue varies from lab to lab, it does happen in every lab nevertheless.
  6. Impatient, Unhappy Customers – Let’s not forget the customer in all this. Customers want faster time to market and are unhappy with the Turn Around Time (TAT) delays they face from the TIC service provider. The list of impediments is long – rework due to lab reporting errors, invoicing mistakes (customers will certainly complain if labs over-invoice!), the lack of reliable business intelligence, and so on.

Lab Test Information Management: Way forward

Focus on the drivers!

  • Gain Competitive Advantage – Do you believe you are in this business just to provide a test report? If not, how will you differentiate?
  • Retain & Grow – Customers no longer want just a test report, even if you can manage to send an accurate report within the committed TAT. How will you ensure customer retention? How will you delight your customers continually?
  • Improve Margins – It is difficult to maintain high margins while simultaneously aiming for higher growth, especially if organizations continue using human capital-intensive processes and systems. What are the operational objectives? In which areas will you be able to drive efficiency and improve cost competitiveness?

Align your objectives to the drivers

Ensure Customer Delight (Drivers: Differentiate, Retain & Grow)

a. Improve consistency and granularity of reporting data by adopting a robust, universal master data management (MDM) system that permeates all operational transaction management and reporting systems.

b. Accelerate Turn Around Time (TAT) by promptly responding to regulatory or customer-requested changes to test protocols and packages. This is far easier to achieve once you have a workflow-based, global test protocol/package management system such as TRIMS in place.

c. Automate test data capture, reporting and invoicing.

d. Deliver real-time operational information and rich business intelligence reporting through a smart customer portal. Once again, this cannot be achieved without the basis to efficiently manage rich referential data.

Improve Scalability (Drivers: Differentiate, Retain & Grow, Improve Margins)

a. Implement best practice processes to improve operational governance across the network.

b. Adopt a scalable technology platform to handle simple to complex change management requests, quickly and accurately, across a large and ever increasing number of test protocols and packages.

Improve Efficiency and Lower Risk (Driver: Improve Margins)

a. Establish a single source of truth for your test protocols and packages – eliminate reliance on standalone, document/paper-based test protocols and, with it, the risk of non-compliance with current regulatory and customer requirements.

b. Drastically reduce errors in test assignment, data capture and reporting, thereby minimizing rework.

c. Save paper and printing costs. This is not just about being socially responsible – it is pure margin, very much part of the current Total Cost of Ownership (TCO), and this cost goes away with the adoption of the new test protocol and package management platform.

Let us know what you think – leave a comment below.

Contact Us

Governance of Master Data: Definition, Structure and Implementation

Learn about the importance of establishing a master data governance model and how you can structure and implement a governance model for your enterprise.


We have discussed before the importance of Master Data Management (MDM) to your business (with a focus on testing laboratories). So – once we have agreed upon the need for MDM, we have to consider very seriously the matter of governance of that master data to ensure that it is a reliable central repository of your core information assets.

What is Governance?

Governance is everything that an organization does to ensure the ownership and accountability pertaining to master data management. This includes:

  • The items that form the basis for master data and their structure.
  • The processes to create and maintain master data
  • The metrics and business rules that pertain to master data management
  • The rules and processes around master data access, security and consumption

Effective governance of master data is enabled by establishing a governance organization (team) responsible for creating the policies and procedures that govern the creation, maintenance and consumption of master data, while putting appropriate metrics in place to safeguard the quality and business use of that data.

Structuring a Governance Organization:

A well structured governance organization has:

  • A Governance Council: Made up of senior members of business units and corporate functions, the council establishes the strategy and defines the tactics via which master data will be governed. Council members also own the responsibility and accountability to propagate the council's directives in their respective business areas via the stewardship team. The council oversees investments in master data and its consumption, and regularly evaluates the effectiveness of the governance model via metrics.
  • A Stewardship Team: Made up of information/data stewards, this team consists of individuals (generally business leaders or managers) from different business units who are tasked with propagating the program and monitoring progress. This team needs to be well defined, as it is a critical factor in the success of governance initiatives. The stewardship team defines the specific policies, procedures and practices via which information will be governed. They constantly monitor the efficacy of the governance program, determine the projects that will make the biggest impact on information quality and leverage, and report to the governance council. They also drive the organizational change initiatives – including training and evangelization – required to ensure the adoption of master data governance throughout the enterprise.
  • A Custodian Team: Information Custodians are individuals from various business units that are responsible for specific master data domains (e.g. people, customer, standards and regulations, methods, etc.). They are directly responsible for the management of the master data in their domain and work closely with the information steward for their domain to ensure alignment and adherence to governance policy.

Governance Policies & Procedures:

To ensure clarity a few policies should be established:

  • Policy around ownership and accountability pertaining to master data
  • Policy around governance procedure development & implementation
  • Policy around governance issues and their resolution
  • Policy around compliance audits
  • Policies around governance training

Once these policies have been established, they should then be materialized through well defined processes and procedures such as:

  • Identification, definition and implementation of new policies/procedures
  • Identification of, and business-case development for, new master data initiatives
  • Making the case for modifications to the master data governance model
  • Definition of an oversight process to measure and monitor effectiveness
  • Oversight and management of the progress of existing initiatives
  • Definition and capture of metrics to measure performance
  • Definition of the audit calendar and execution of compliance audits
  • Day-to-day management of governance and compliance issues

Centralized vs Hub & Spoke models in Testing Laboratories:

Laboratories often ask us whether they should have a centralized governance team or a distributed one. The answer lies in the way the master data elements are managed. For small to medium sized organizations, it may be easier to implement a centralized model and then evolve it over time as the geographical footprint grows. For large organizations that already have a geographically distributed footprint, we generally recommend the hub-and-spoke model, wherein collaboration and alignment are critical components of success.

As with most things, there is no one-size-fits-all prescription when it comes to this. The choice of architecture depends on specific business problems with processes or data domains. Organizations should remain flexible in their choice of architecture and select those that best fit their business needs.

Our Test Reference Information Management Solution (TRIMS) supports both models via its definition of various roles and collaboration workflows.

Critical Success Factors:

Establishing the right governance model is critically important – and it is hard, especially for organizations that have never had this kind of structure. Here are some things that will help you along the journey of defining and maturing this capability over time:

  • Identify an Executive Sponsor: It is critical that a senior executive in the organization invests their time to be the executive sponsor of this initiative. They are positioned to ensure that the model aligns to corporate strategy while messaging to the enterprise the value the leadership team places on the initiative. As the chair of the Governance Council, they are also the final decision maker around investments related to master data and its adoption.
  • Identify a Chief Information Steward: The Chief Information Steward should be an individual who is the lead evangelist for the program and should advocate continuous improvements to master data quality (regardless of their current or past business roles / domain affiliation). They are key to driving organizational change.
  • Assigning the right people to the Information Custodian Role: Different business units and functions have different needs – and the information custodians should be picked to align with those needs. Most importantly, the information custodian should believe in and fundamentally propagate the values and ideals of master data management and its governance inside their respective business areas.
  • Build a scalable model: Try not to build too bureaucratic a model to start. Start small (but with clearly defined roles and responsibilities) and constantly measure/ change / improve. The intent should be to build something that scales with the needs of the business.
  • Build data quality into the model: Information quality should be an integral part of the model. In fact, one of the model's explicit deliverables should be to ensure that the core strategic value of the initiative is actually being delivered.
  • Make the right choice around Centralized vs Hub-and-Spoke: We talked a bit about this before – make the right choice for your organization. Be willing to revisit the choice as the organization grows and matures.
  • Measure both efficacy and efficiency: Efficiency metrics answer the question, “Are things being done right?”. Efficacy metrics answer the question, “Are the right things being done?”. It is important to review and analyze both types of metrics to determine and improve the business value of governance.
  • Build a culture of information quality: If the enterprise does not see the value of master data and its governance, your initiatives will fail. Build a culture where information is treated as a key asset. Empower decision making at the nodes, governed by the right framework to ensure quality. Always evangelize the need for, and value of, the initiative to the enterprise AND to the people doing the work.

If you are a testing laboratory, we strongly suggest you take a look at the QualNimbus Test Reference Information Management Solution (TRIMS). It not only provides you with a centralized repository to manage your core master data, but also has a full-fledged governance model built right into the solution.

The system allows for the definition of business units and departments, and restricts information creation capabilities accordingly. So – for instance – someone from the softlines business unit can only create SL-related packages (but can still consume artifacts created by other departments, such as chemical/analytical testing, which could be a centralized capability). It has clearly demarcated roles for master data management, creation of test lines, protocols and packages, pricing management, etc., ensuring that information custodians work on the aspects they are accountable for.

The system also has a well defined workflow for the management of core artifacts such as test lines and protocols/packages, whereby an individual can author an artifact and it is automatically sent onward for review and approval, after which its price can be configured in the system before it becomes available for consumption.
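The author-review-approve-price-consume lifecycle described above can be sketched as a simple state machine. This is an illustrative sketch only; the stage names below are hypothetical, not the actual TRIMS lifecycle states:

```python
from enum import Enum

# Hypothetical lifecycle stages for a test line/protocol/package artifact.
class Stage(Enum):
    DRAFT = "draft"
    IN_REVIEW = "in_review"
    APPROVED = "approved"
    PRICED = "priced"
    AVAILABLE = "available"

# Each stage may only advance to the next one; review cannot be skipped.
ALLOWED = {
    Stage.DRAFT: Stage.IN_REVIEW,
    Stage.IN_REVIEW: Stage.APPROVED,
    Stage.APPROVED: Stage.PRICED,
    Stage.PRICED: Stage.AVAILABLE,
}

def advance(stage: Stage) -> Stage:
    """Move an artifact one step forward in its governed lifecycle."""
    if stage not in ALLOWED:
        raise ValueError(f"{stage} is a terminal stage")
    return ALLOWED[stage]

# An authored artifact walks through review, approval and pricing
# before it becomes available for consumption.
s = Stage.DRAFT
for _ in range(4):
    s = advance(s)
assert s is Stage.AVAILABLE
```

Enforcing the transitions in one place is what makes the workflow auditable: there is no path to AVAILABLE that bypasses review.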

The system also provides operational reporting that allows for visibility to the work in progress (by status, stage, author, etc), the aging of artifacts, etc.

We would be happy to speak to you regarding how our system could be of benefit to your organization. Please contact us and we can set up a demo to walk you through TRIMS.

As always, we would love to hear your thoughts – so leave us a comment.

Master Data Management for TIC – Lab Testing

How a Master Data Management strategy implemented via QualNimbus TRIMS could help you be productive, profitable and deliver consistent service.

The importance of Master Data Management for Testing Laboratories

Most testing laboratories – in our experience, and in our conversations with laboratories big and small – do not have a cohesive master data management strategy. While this could be for many reasons, they are generally doing themselves a disservice. Having a well structured master data management strategy allows laboratories to manage and leverage their core information assets.

Laboratories have to stay on top of the global compliance landscape and develop solutions that allow them to deliver services to their stakeholders in a consistent manner. This applies to aspects such as Regulations and Standards (which can vary by region/country/state/etc.), Methods for performing the test, associated Analytes and their limits, the definition of Test Lines and the consumption of those Test Lines in Product Protocols or Test Packages, etc.

Most laboratories that we have engaged with manage this information manually today – either in Word or Excel. There are several issues with this:

  • Information is not well structured (or consistently structured)
  • There is duplication of artifacts across the organization (across locations, for instance) which leads not only to governance issues but also the risk of inconsistency in service delivery
  • Making a change to a master data attribute can be extremely time consuming, since every artifact that consumes that attribute must be manually opened, reviewed, changed if necessary, and reviewed again.
  • The information in these files cannot be readily consumed – say by eCommerce solutions or by Laboratory Information Management Solutions (LIMS).

A lot of manual labor and many errors in service delivery can thus be avoided by implementing a master data management solution. Beyond this, having an MDM strategy also enables highly effective data mining and business intelligence reporting on your transactional data, which can help you significantly leverage your information assets.
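To make the contrast with Word/Excel concrete, here is a minimal sketch of what normalized test reference data might look like. The class and field names are hypothetical, purely for illustration – they are not an actual TRIMS or LIMS schema:

```python
from dataclasses import dataclass, field

# Illustrative only: hypothetical names, not a real TRIMS/LIMS schema.

@dataclass(frozen=True)
class AnalyteLimit:
    analyte: str      # e.g. "Lead"
    limit_ppm: float  # e.g. 120.0
    source: str       # regulation or client imposing the limit
    region: str       # e.g. "US", "EU"

@dataclass
class TestLine:
    name: str
    method: str  # identifier of the test method used
    limits: list[AnalyteLimit] = field(default_factory=list)

@dataclass
class TestPackage:
    name: str
    test_lines: list[TestLine] = field(default_factory=list)

# Each artifact is defined once and referenced, never copied:
pb_us = AnalyteLimit("Lead", 120.0, "US regulation", "US")
lead_line = TestLine("Total Lead in Surface Coating", "METHOD-001", [pb_us])
pkg = TestPackage("Toys - US Chemical", [lead_line])

# Because the package references the test line (and the test line the
# limit), there is a single source of truth rather than duplicated
# Word/Excel copies scattered across locations.
assert pkg.test_lines[0].limits[0].limit_ppm == 120.0
```

Structured this way, the data is also directly consumable by downstream systems such as eCommerce or LIMS integrations, which flat documents cannot support.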

Learn More

An Illustrative Scenario

Let’s say that Laboratory ABC has Retailer XYZ as a client. This retailer has its own specifications around the testing process – for instance, it sells a product in both the US and Europe (i.e. it has to comply with the regulatory requirements of both regions). Because of this, its specific needs – from both a regulatory and a performance perspective – could vary from the standard. For example, the retailer could specify requirements that are more stringent than those required by regulation (e.g. if the US regulation requires <= 120 ppm for Lead, the retailer may choose to require its products go above and beyond, with a limit of <= 100 ppm). Several other variances could occur based on product attributes (e.g. applicability to specific materials, exemption of others), testing conditions (e.g. variances at different temperatures, wash cycles, etc.), and so on.

As one can imagine, if this is being maintained manually, it is a nightmare to manage and govern. Often, just because of the variances, laboratories choose to create multiple test lines that are identical except for those variances (which further exacerbates the situation). These test lines are also consumed in Protocols and Packages, so there is an even larger number of artifacts that need to be managed and governed.

Now think of what the lab has to go through if a single analyte limit undergoes a change (say Lead goes to 90 ppm due to a new regulatory requirement in the EU). The retailer asks the laboratory to ensure that they will be compliant when the new requirement goes into effect. The laboratory now has to find every test line and change the appropriate ones to reflect the new requirement (with an effective date matching the regulation "go-live" date), and then do the same with every protocol or package. This often takes weeks of manual effort.
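The effective-dating logic implied above can be illustrated with a small sketch. The dates, limits and field names are all made up for illustration; the only point is that each limit is a single dated record and the "most stringent limit in effect" is computed, not hand-copied:

```python
from datetime import date

# Hypothetical effective-dated limit records for one analyte (Lead).
lead_limits = [
    {"source": "US regulation", "limit_ppm": 120, "effective_from": date(2010, 1, 1)},
    {"source": "Retailer XYZ",  "limit_ppm": 100, "effective_from": date(2012, 6, 1)},
    {"source": "EU regulation", "limit_ppm": 90,  "effective_from": date(2018, 1, 1)},
]

def effective_limit(limits, on_date):
    """Most stringent (lowest) limit among all records in effect on a date."""
    in_effect = [r["limit_ppm"] for r in limits if r["effective_from"] <= on_date]
    return min(in_effect)

# Before the EU change the retailer's 100 ppm governs; after it, 90 ppm does.
assert effective_limit(lead_limits, date(2017, 1, 1)) == 100
assert effective_limit(lead_limits, date(2018, 6, 1)) == 90
```

When the new EU record is added with its go-live date, every consumer of the limit picks it up automatically on that date; nothing needs to be edited in advance by hand.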

Contact Us

MDM – Need for a simpler solution

The implementation of a master data management solution can dramatically reduce the effort associated with managing and governing this information.

QualNimbus has developed an enterprise class solution specifically for Lab Testing that allows centralized management of testing related master reference data. This solution is called the Test Reference Information Management Solution (TRIMS).

In TRIMS, all of the master data is defined just once – and discretely.

So in the scenario above,

  1. The client specific analyte-limit for Lead (which is managed separately from the regulatory analyte-limit) is managed in the Analyte Limit master.
  2. This analyte limit is then consumed in multiple test lines (as applicable).
  3. The Test Lines are then consumed in multiple protocols or packages.
  4. Changes to key artifacts are versioned, effective dated and governed (i.e. they require review/approval workflows).
  5. Every change in the system also triggers an audit trail so you can see who made the changes to what information and when.
  6. Once a change is made, it automatically cascades through the system and affects only those artifacts that consume that specific attribute or artifact.

The same scenario, which would take weeks of manual effort, can now be accomplished in a matter of hours using TRIMS.
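Steps 1 through 6 above can be sketched as follows. The identifiers and structures here are hypothetical, purely to illustrate reference-based cascading and audit trailing; they are not TRIMS's real API:

```python
from datetime import datetime

# Hypothetical master record for a client/regulatory analyte limit.
analyte_limits = {"PB-EU": {"analyte": "Lead", "limit_ppm": 100}}

# Test lines store the limit's ID, never a copy of its value.
test_lines = {
    "TL-001": {"name": "Lead in substrate",       "limit_id": "PB-EU"},
    "TL-002": {"name": "Lead in surface coating", "limit_id": "PB-EU"},
}

audit_trail = []

def update_limit(limit_id, new_ppm, user):
    """One edit to the master record; every consumer sees it via its reference."""
    audit_trail.append({
        "who": user, "when": datetime.now(), "what": limit_id,
        "old": analyte_limits[limit_id]["limit_ppm"], "new": new_ppm,
    })
    analyte_limits[limit_id]["limit_ppm"] = new_ppm

update_limit("PB-EU", 90, user="steward.eu")

# Both test lines now resolve to 90 ppm without being touched individually,
# and the audit trail records who changed what and when.
resolved = {tl_id: analyte_limits[tl["limit_id"]]["limit_ppm"]
            for tl_id, tl in test_lines.items()}
assert resolved == {"TL-001": 90, "TL-002": 90}
assert audit_trail[0]["old"] == 100
```

The cascade is a direct consequence of defining the limit once and consuming it by reference: a change to one record is immediately visible to every test line, protocol and package that points at it.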

If you manage a lab, we recommend you take a closer look at Master Data Management as a core aspect of your business. Please contact us and we would be happy to help you understand this better and demonstrate how TRIMS could help you.



Let us know what you think via the comments section.

Release Notes: Inspekt Version 1.13.0 (20170506.1)

Inspekt Version 1.13.0 – Release Notes: New in Inspekt Version 1.13.0 – RCA/CAPA Management, enhancements to booking, scheduling, field inspections, reporting, and more!

Inspekt Version 1.13.0 (20170506.1): 06 May 2017


  • RCA/CAPA Management: The Root Cause Analysis and Corrective Action Preventive Action (RCA/CAPA) Management module now enables your Buyers to collaboratively work with their Factories to track and manage the CAPA lifecycle at a granular level for each Defect and/or failed Checkpoint.


  • Advanced Activity Report: Users can now quickly generate advanced activity reports that allow Excel export of all inspection booking information and sectional results, along with PO/Lot details and Invoice details.
  • Invoice Reference Link: Customers who prefer to retain pre-generated invoice numbers issued by their F&A dept/systems can now link the invoice number with an inspection at the booking stage itself and track it throughout the lifecycle. This linked reference number will then be used for Web Services based or file upload based integration with the F&A dept/system.


  • Security/Access Management: Admins can now manage security at a more granular level, ensuring that users, regardless of role, have record-level access only to the Locations, Operating Offices and Clients relevant to them.
  • MDM on the fly: This existing feature has been further improved for faster new company master data management.
  • Invoicing Revision History: Invoicing revision history tracking is now on par with revision tracking in other modules.


  • SP logo sizing (desktop Outlook email only): There was an issue where the Inspection Service Provider logo was losing its dimensions in some automated emails; this impacted only those customers using the desktop Outlook email client to view the emails. This issue has been fixed.

Click here to see the entire change log (for every Inspekt version released to date).

ABOUT Inspekt: Inspekt lets you manage the entire Inspections lifecycle – from order to invoice – all from your web browser. 

Not a current user of Inspekt – but want to get all these great features for Inspections Management? Start your free trial today! Register

Release Notes: Inspekt Version 1.11.0 (20170223.1)

Inspekt Version 1.11.0 – Release Notes

Inspekt Version 1.11.0 (20170223.1): 23 Feb 2017


  • Booking: Inspections Booking managers can now view and browse the location specific schedule without having to leave their booking screen.
  • Scheduling: Inspectors can now be separately scheduled for specific dates in a multi-day inspection.
  • Inspector Black/White List Management: With this new feature service providers can blacklist or whitelist any inspector for a given Buyer.
  • Custom Sampling Plan: Users can now override the standard ANSI/ASQ Z1.4 based sampling plan at the Inspection level or Protocol section level and use their own custom sampling plan.

[Screenshot: AQL Override]
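A custom sampling plan override of this kind essentially substitutes a different lookup table for the standard one. Here is a hedged sketch using a small illustrative subset of ANSI/ASQ Z1.4-style values (General Inspection Level II, normal severity, AQL 2.5) – verify these against the published standard before relying on them:

```python
# Illustrative subset of an ANSI/ASQ Z1.4-style single-sampling lookup.
# Values shown are for General Inspection Level II, normal severity,
# AQL 2.5 only; consult the published standard for the full tables.
Z14_SUBSET = [
    # (lot_min, lot_max, sample_size, accept, reject)
    (281,  500,  50,  3, 4),
    (501,  1200, 80,  5, 6),
    (1201, 3200, 125, 7, 8),
]

def sampling_plan(lot_size):
    """Look up sample size and accept/reject numbers for a lot size."""
    for lo, hi, n, ac, re in Z14_SUBSET:
        if lo <= lot_size <= hi:
            return {"sample_size": n, "accept": ac, "reject": re}
    raise ValueError("lot size outside this illustrative subset")

plan = sampling_plan(1000)
assert plan == {"sample_size": 80, "accept": 5, "reject": 6}
```

An Inspection-level or Protocol-section-level override, as described above, would amount to swapping in a client-specific table in place of Z14_SUBSET.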

  • Image Annotation: Inspectors can now annotate all images from within the application. These images and their annotations will then be printed in the final report.

[Screenshot: image annotation on a shirt]

Users can also reset the annotations at any time and revert to the original image.

  • Invoice Management: Inspekt users can now easily manage invoice templates specific to each operating office and location.


  • Dashboard: The Dashboard has been enhanced with an improved Calendar to quickly view and browse the inspection schedule.
  • Review Management: With an improved UI, it is now easier than before to configure and manage Inspector to Reviewer mappings in the workforce.
  • Query & Results Grid: An improved search query and query results grid screen is now available.


Release Notes: Inspekt Version 1.10.0 (20170125.1)

Inspekt Version 1.10.0

Inspekt Version 1.10.0 (20170125.1): 25 Jan 2017


  • Booking: Inspection booking notifications to factory contacts can now be controlled from inside the booking screen.
  • Protocol Management: Protocol writers can now Preview the Protocol online and Print to PDF or printer.
  • Rollout/Trial Run Support: Via a system setting customers can now turn off outgoing Inspekt notifications and redirect them to an internal generic “Catch All” mailbox to avoid accidental notifications during the trial run.


  • Image Management: In addition to the pre-existing Image Interface, Inspectors can now also attach images under each section.
  • Invoice Management: The system now allows for enhanced tracking of Invoice issuance and payment, for both individual and consolidated Invoices.
  • AQL Calculator: AQL calculator User Experience has been improved.
  • Revision History: Revision History tracking has been streamlined to reduce clutter and improve legibility.


Release Notes: Inspekt Version 1.9.0 (20161230.1)

Inspekt Version 1.9.0

Inspekt Version 1.9.0 (20161230.1): 30 December 2016


  • Protocol Management – Introduced default sample size for Onsite Test items.
  • Image Management – Prominently displaying guidelines to prevent aspect ratio distortions


  • Dashboard – Dashboard now shows Today’s Invoice Count as well.
  • Protocol Management – Carton Specifications now include Carton Weight.
  • Protocol Management – Enhanced configuration to allow section level AQL and choose report printing configurations.
  • Booking – The booking details edit process related to changes in Buyer/Factory details has been enhanced to improve the user experience.
  • Booking – The search grid has been improved to clearly indicate the source of an order (external Inspekt Client Portal, Internal Operations, or B2B Web Services).
  • Scheduling – Estimated Man Day calculation is now provided with a hyperlinked explanation.
  • Report – The Standard Abbreviated report template has been modified with an enhanced display of failed Defects, Check Points and Onsite Tests.
  • Mobile Verification – The mobile SMS based OTP verification process has been improved.


Product Inspections for Brands, Retailers and other large Buyers – The Way Forward

Quality Control processes followed by Brands, Retailers and other large Buyers of Consumer Products invariably include Product Inspections for goods acquired via their own sourcing offices or via third party vendors.

There are 3 main scenarios by which Buyers can pursue inspections, and these are detailed below. Depending on the Buyer’s specific requirements, they can also choose to pursue a hybrid approach that can combine the different scenarios to strategically address their specific operational and financial needs.

Product Inspection Scenarios

1st Party Inspections: These are inspections that are managed by the Suppliers/Factories. Buyers should choose this option only where regulations and the end customer allow it for low-risk products, and where the Supplier/Factory is qualified for self-inspection based on maturity and a verified history of producing defect-free, high quality products. Having visibility to all 1st Party QC activities via an Inspections Management Platform is essential.

2nd Party Inspections: These are inspections that are managed by the Buyer, using in-house QC/Inspections teams. Buyers who are risk averse, genuinely concerned about product quality, and thus need to directly control the QC process, will choose this option, while retaining the option of using 1st party and 3rd party inspections on an as-needed basis. Running the entire Inspections operation on a robust Inspections Management Platform is crucial for success.

3rd Party Inspections: These are inspections that are managed by external 3rd party inspections service providers. This option is typically chosen by the Buyer when the regulatory market demands it and/or the buyer wants to outsource the workload and/or the risks associated with non-compliance are high and the buyer needs additional assurance.

1st / 2nd / 3rd Party Inspection Approaches At a Glance
              Mgmt         Brand     Impartiality  QC Ops           Cost of QC  Cost of QC
              Control (1)  Risk (2)  (3)           Scalability (4)  Fixed (5)   Variable (5)
  1st Party   Low          High      Low           High             Low         Low
  2nd Party   High         Low      Medium        Medium           High        Low
  3rd Party   Medium       Medium    High          High             Low         High
  1. Management Control: Refers to the level of control the Buyer has over the QC process. Supported by a robust Inspections Management Platform, it reflects how well the Buyer can continually improve supply chain quality, drive higher productivity, efficiency and accuracy in the inspections team, and lower the cost of quality.
  2. Brand Risk: The Buyer’s brand risk is very sensitive to perception. Due to high QC costs, 100% inspection may not be feasible for every product that gets shipped, and quality issues are bound to surface sometimes. Buyers known to exercise greater control over the QC process, and to be “Quality Conscious” rather than “Cost Conscious”, are better positioned to protect their brand in the event of a quality issue surfacing in shipped products.
  3. Impartiality: A 3rd party by definition should have no relationship with either the Buyer or the Supplier/Factory, and is therefore the most impartial. That said, while impartiality does not imply apathy, 3rd party inspectors may lack the passion and dedication that in-house inspection teams are expected to possess.
  4. Scalability: The capacity to handle fluctuations in inspection volume is high if the inspection is either managed directly by the supplier/factory or it is outsourced to a 3rd party. Usually the Buyer who chooses to use in-house QC team has a fairly good understanding of the demand and could use a combination of 1st party and 3rd party to tide over seasonal peaks, while keeping most of the inspections in-house.
  5. Cost: Typically, an in-house QC Inspection team carries a higher fixed cost, so outsourcing QC Inspections to a 3rd party lowers this fixed cost to some extent (some Buyers may still maintain in-house admin and report-review staff if they do not fully trust 3rd Party results). On the other hand, as volumes increase, in-house operations are likely to be less expensive than a 3rd party QC firm due to lower variable cost. Moreover, an in-house QC Inspection team is expected to improve its productivity and efficiency, resulting in a lower cost per inspection over time.
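The fixed-vs-variable cost trade-off in footnote 5 implies a break-even inspection volume, which a back-of-the-envelope sketch can illustrate. All figures below are made-up placeholders, not industry benchmarks:

```python
# Hypothetical cost parameters, purely illustrative.
FIXED_INHOUSE = 500_000   # annual fixed cost of an in-house QC team
VAR_INHOUSE = 80          # incremental cost per in-house inspection
VAR_THIRD_PARTY = 250     # per-inspection fee charged by a 3rd party

def annual_cost_inhouse(n):
    return FIXED_INHOUSE + VAR_INHOUSE * n

def annual_cost_third_party(n):
    return VAR_THIRD_PARTY * n

# In-house becomes cheaper once annual volume exceeds
# fixed_cost / (3rd-party variable cost - in-house variable cost).
break_even = FIXED_INHOUSE / (VAR_THIRD_PARTY - VAR_INHOUSE)
assert round(break_even) == 2941

# At 5,000 inspections/year the in-house team wins on cost in this sketch.
assert annual_cost_inhouse(5000) < annual_cost_third_party(5000)
```

Buyers below the break-even volume lean toward 3rd party inspections; those well above it tend to benefit from an in-house team, which is consistent with the hybrid approaches discussed above.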
The Way Forward for Buyers: A Multi-Pronged Approach
  1. Gain visibility into all 1st Party QC activities by adopting a common Inspections Management Platform.
  2. Bring in 3rd Party Inspection Service Providers for Final Random Inspections (FRI) only, and mandate use of the common Inspections Management Platform. This ensures complete visibility across both 1st Party and 3rd Party inspection activity and data.
  3. Stand up an internal QA/QC team and start gradually bringing in-house inspection activities onto the common Inspections Management Platform.
  4. Over a period of 3-9 months (depending on team size), have the internal QC teams gradually perform all inspections except FRI. This standardizes inspections, eliminates data fragmentation, and ensures a high level of visibility across 1st/2nd/3rd Party inspection activities.
  5. After 9-12 months, implement a Supplier/Factory Certification Program based on the reliable performance analysis that the Inspections Management Platform makes easy. Best-performing Suppliers/Factories will require fewer QC interventions, while worse-performing ones will require more – and all QC interventions can be continually fine-tuned, driven by data analytics.
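The certification logic in step 5 can be sketched as a simple tiering rule driven by each Supplier/Factory's historical pass rate. The tier names, thresholds, and minimum-history cutoff below are illustrative assumptions, not features of any particular platform:

```python
# Illustrative risk-based QC tiering from inspection history.
# Tier names, thresholds, and min_history are assumptions for this sketch.

def pass_rate(results: list) -> float:
    """Fraction of inspections with a 'pass' result."""
    return sum(r == "pass" for r in results) / len(results)

def qc_tier(results: list, min_history: int = 20) -> str:
    """Map a supplier's inspection history to an intervention level."""
    if len(results) < min_history:
        return "full"            # not enough data: inspect every shipment
    rate = pass_rate(results)
    if rate >= 0.98:
        return "skip-lot"        # e.g. inspect only 1 shipment in 4
    if rate >= 0.90:
        return "reduced"         # e.g. inspect every other shipment
    return "full"                # tightened: inspect every shipment

history = ["pass"] * 49 + ["fail"]   # 98% pass rate over 50 inspections
print(qc_tier(history))              # skip-lot
```

As new inspection results flow in, re-running the tiering continually re-targets QC effort at the suppliers that need it most, which is the "fine-tuned, driven by data analytics" loop described above.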

Inspekt is a cloud-based Inspections & Audits Management Solution that manages, end to end, the Order-to-Invoice workflow for the entire Inspections & Audits process. For additional details and to register for a free evaluation, please go to: Register

Started by veterans of the Testing, Inspections, and Certification (TIC) industry, QualNimbus is an information and technology service provider focused on building value-chain optimization and information analytics solutions to drive Digital Transformation in the Quality Management space.

If you have any comments regarding this blog or would like to know how QualNimbus can help you, please feel free to send an email to QualNimbus or submit your comments below.

Release Notes: Inspekt Version 1.8.0 (20161128.1)

New in Inspekt Version 1.8.0: Analytics, Quotation Management and more!

Inspekt Version 1.8.0 (20161128): 28 November 2016


  • Analytics – Comparative Failure Analysis: You can now generate Comparative Failure Analysis reports. For any Product, you can visually compare the performance of all your Buyers and Factories over a period of time, or compare the performance of countries. In addition to graphs, the report is also available in tabular format.
  • Analytics – Benchmark Analysis: For any product, you can now compare the performance of Buyers and Factories, by country, against the Industry performance.
  • Quotation Module: Your operations team can now prepare a quotation (based on the pricing matrix) and post it on the Customer Portal. Portal access is automatically created if the customer (quotation recipient) doesn’t already have it. Your Customer can then Accept or Reject the Quotation; if accepted, Inspekt generates the Proforma Invoice.


  • Protocol AQL – Configurability: You now have the option to include Section-level AQLs in Protocols and display them accordingly on the auto-generated final report.
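Section-level AQLs follow standard acceptance-sampling logic: each protocol section carries its own acceptance (Ac) and rejection (Re) numbers for the sample drawn. A minimal sketch of that per-section decision follows; the Ac/Re values shown are illustrative, not taken from any specific AQL table:

```python
# Minimal section-level AQL decision sketch.
# In practice, Ac/Re per section come from an AQL table (e.g. ANSI/ASQ Z1.4)
# given the sample size and that section's AQL; the numbers here are examples.

from dataclasses import dataclass

@dataclass
class SectionAQL:
    name: str
    accept: int   # Ac: maximum defects allowed for the section to pass
    reject: int   # Re: defect count at which the section fails

def section_result(section: SectionAQL, defects_found: int) -> str:
    return "PASS" if defects_found <= section.accept else "FAIL"

sections = [
    SectionAQL("Workmanship", accept=7, reject=8),
    SectionAQL("Packaging",   accept=3, reject=4),
]
found = {"Workmanship": 5, "Packaging": 4}

for s in sections:
    print(s.name, section_result(s, found[s.name]))
```

The overall report can then be derived from the section results (e.g. fail if any section fails), which is what displaying section-level AQLs on the final report enables.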

Click here to see the entire change log (for every Inspekt version released to date).

ABOUT Inspekt: Inspekt lets you manage the entire Inspections lifecycle – from order to invoice – all from your web browser. 

Not a current user of Inspekt – but want to get all these great features for Inspections Management? Start your free trial today! Register

Release Notes: Inspekt Version 1.7.0 (20161025.1)

New in Inspekt Version 1.7.0: Quotation Management, Pricing Matrix, Analytics and more!

Inspekt Version 1.7.0 (20161025): 25 October 2016


  • Quotation Management: Your customers can now raise a quotation request via the Inspekt Portal. Your operations team can then quickly prepare a pricing-matrix-based quotation that is posted back to the Portal. Your Customer can then Accept or Reject the Quotation; if accepted, Inspekt generates the Proforma Invoice.
  • Pricing Matrix: You can now manage all your complex pricing requirements efficiently. You can set up multi-currency pricing by Service/Customer/Protocol/Location/Country/Region and so on. This in turn allows for easy, fast, accurate quotation and invoice generation.
  • Terms & Conditions (T&C): You can now set up your T&C documents with captions and display them to your Customer for every quotation and booking on the Portal. Your Customer must check a box indicating agreement to the attached T&C before placing the order.
  • Analytics – Inspector Monitoring: Managers can now generate a report on the percentage of Pass/Fail/Pending results by Inspector for each factory. Report reviewers can also quickly view this information during the review process.
  • Analytics – Failure Analysis: You can now generate rich drill-down Inspection Failure Analysis by Protocol Section, by Defects, by Onsite Tests, and so on. Users can easily filter by Buyer, Supplier, Factory, Product …. and quickly export all this rich data to Excel.
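One common way to implement a pricing matrix like the one above is a most-specific-match lookup: price rules keyed by a tuple of dimensions, with a wildcard for "any value". The sketch below uses hypothetical field names and rates, not Inspekt's actual schema:

```python
# Pricing-matrix lookup sketch: the most specific rule wins.
# Keys are (service, customer, country); None acts as a wildcard.
# All field names and rates here are hypothetical.

PRICING = {
    ("FRI", "AcmeCorp", "CN"): 240.0,   # customer + country specific rate
    ("FRI", "AcmeCorp", None): 260.0,   # customer-specific default
    ("FRI", None, None): 300.0,         # service-wide default
}

def lookup_price(service: str, customer: str, country: str) -> float:
    # Try the most specific key first, then fall back through wildcards.
    for key in [(service, customer, country),
                (service, customer, None),
                (service, None, None)]:
        if key in PRICING:
            return PRICING[key]
    raise KeyError(f"No price configured for service {service!r}")

print(lookup_price("FRI", "AcmeCorp", "CN"))   # most specific rule
print(lookup_price("FRI", "AcmeCorp", "VN"))   # falls back to customer default
print(lookup_price("FRI", "Other", "IN"))      # falls back to service default
```

Because every quotation and invoice line resolves through the same deterministic lookup, pricing stays consistent across quotes and invoices, which is the "easy, fast, accurate" property the feature description claims.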


  • Booking: During the inspections booking process, in addition to Buyer, Supplier and Factory, your customers and booking staff can now also capture Agency/Agent name and contacts.
  • Booking: Users can now capture the Country of Destination during the inspections booking process.
  • Protocol: You can assign Protocols faster than before based on Product Taxonomy mapping.
  • Scheduling: Schedulers can now specify the precise Inspection Time during the scheduling process.
  • Field Inspection: Inspekt now maintains a time log to track the time of various events such as “Arrival at Factory”, “Document Signed”, “Departure from Factory” and so on.
