Quality...

How Can I Manage My Suppliers and Delight My Users?

Quality Assessment

Note that the "spec/defect" view of quality can fit with both information-as-a-product and information-as-a-service, as can the "disappointment" view. This is made explicit in the PSP/IQ model, developed for understanding Information Quality:

                    Conforms to Specifications      Meets or Exceeds Customer Expectations

Product Quality     Sound Information               Useful Information
                      • Free of Error                 • Appropriate Amount
                      • Concise Representation        • Relevancy
                      • Completeness                  • Understandability
                      • Consistent Representation     • Interpretability
                                                      • Objectivity

Service Quality     Dependable Information          Usable Information
                      • Timeliness                    • Believability
                      • Security                      • Accessibility
                                                      • Ease of Operation
                                                      • Reputation
Source: Kahn, Strong & Wang, "Information Quality Benchmarks: Product and Service Performance", Communications of the ACM, 2002
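One way to put the model to work is to hold it as a simple data structure, so that each dimension is routed to the right assessment technique. The Python sketch below is illustrative only: the quadrant and dimension names come straight from the table above, while the structure, field names and helper function are assumptions made for this example.

    # A minimal sketch of the PSP/IQ model as data. Quadrant and dimension
    # names are from the table above; the structure itself is illustrative.
    PSPIQ = {
        # Conforms to Specifications -> objective measurement
        "Sound Information": {"column": "spec", "dimensions": [
            "Free of Error", "Concise Representation",
            "Completeness", "Consistent Representation"]},
        "Dependable Information": {"column": "spec", "dimensions": [
            "Timeliness", "Security"]},
        # Meets or Exceeds Expectations -> subjective assessment
        "Useful Information": {"column": "expectations", "dimensions": [
            "Appropriate Amount", "Relevancy", "Understandability",
            "Interpretability", "Objectivity"]},
        "Usable Information": {"column": "expectations", "dimensions": [
            "Believability", "Accessibility", "Ease of Operation", "Reputation"]},
    }

    def assessment_technique(dimension: str) -> str:
        """Return how a dimension should be assessed, per the discussion below."""
        for quadrant in PSPIQ.values():
            if dimension in quadrant["dimensions"]:
                return ("objective measures (automated checks, periodic audits)"
                        if quadrant["column"] == "spec"
                        else "subjective assessment (surveys, feedback, interviews)")
        raise ValueError(f"Unknown dimension: {dimension}")

    print(assessment_technique("Completeness"))   # objective measures ...
    print(assessment_technique("Believability"))  # subjective assessment ...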

All items here should be assessed. Items in the first column lend themselves to objective measures, either continuously (via automated systems) or during periodic audits. The results are often reported as percentages (i.e. defect rates), and thresholds can be set by reference to the formally agreed specification (or service level). Items in the second column cannot typically be measured objectively, and so must be assessed through surveys, feedback (such as complaints and compliments) or user interviews and observation. Setting thresholds for these less tangible elements often requires benchmarking against best practice. Unfortunately, this second column tends to be ignored because it is harder to measure, even though it is an absolutely vital part of the process.
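To make the first column concrete, a defect rate is just the share of records failing an agreed rule. A minimal sketch, assuming records arrive as simple dictionaries; the rules and the 2% threshold are stand-ins for whatever the formally agreed specification actually says:

    # Objective, spec-based measurement: each rule encodes one clause of the
    # agreed specification; the defect rate is the share of failing records.
    from typing import Callable

    records = [
        {"customer_id": "C001", "region": "EU", "revenue": 1200.0},
        {"customer_id": "C002", "region": None, "revenue": 950.0},   # incomplete
        {"customer_id": "C003", "region": "EU", "revenue": -40.0},   # out of range
    ]

    rules: dict[str, Callable[[dict], bool]] = {
        "Completeness": lambda r: all(v is not None for v in r.values()),
        "Free of Error": lambda r: r["revenue"] >= 0,
    }

    THRESHOLD = 0.02  # e.g. a formally agreed 2% maximum defect rate

    for name, rule in rules.items():
        defects = sum(1 for r in records if not rule(r))
        rate = defects / len(records)
        status = "OK" if rate <= THRESHOLD else "BREACH"
        print(f"{name}: {rate:.1%} defect rate ({status})")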

The final quality assessment - and one that is especially important for high-discretion systems like Enterprise Reporting - is usage monitoring. As is often said, people vote with their feet: their true preferences are revealed through their behaviour. It is imperative that any quality assurance function includes detailed monitoring, tracking and analysis of which users, reports and features are actually being used. Ideally, this should be compared against known user alternatives, such as legacy reporting systems and external information sources.
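A minimal sketch of such monitoring, assuming access events are already captured as a log of (user, report, system) records; the event shape and the legacy-system comparison are assumptions for illustration:

    # Usage monitoring: aggregate access events per report and per user, and
    # compare against a known alternative (e.g. a legacy reporting system).
    from collections import Counter

    events = [
        {"user": "alice", "report": "Sales Summary", "system": "enterprise"},
        {"user": "bob",   "report": "Sales Summary", "system": "legacy"},
        {"user": "bob",   "report": "Stock Levels",  "system": "enterprise"},
        {"user": "carol", "report": "Sales Summary", "system": "legacy"},
    ]

    by_report = Counter(e["report"] for e in events)
    by_system = Counter(e["system"] for e in events)
    active_users = {e["user"] for e in events if e["system"] == "enterprise"}

    print("Accesses per report:", dict(by_report))
    print("Enterprise vs legacy:", dict(by_system))
    print("Active enterprise users:", sorted(active_users))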

Lessons for Enterprise Reporting

  • Understand and Appreciate Users' Views - Different report users will have different views on whether reporting is a product or a service. Some will assess quality as conformance to specifications, while others will see it in terms of meeting their expectations. You need to be able to appreciate all of these perspectives to meet their needs.
  • Assess Quality of Content and Delivery - Be prepared to measure aspects of system and information quality using a variety of techniques: quantitative and qualitative, objective and subjective. The breadth and depth of this understanding sets the limit on the improvements you can implement.
  • Monitor, Analyse and Report on Usage - In your rush to monitor the reporting system itself, don't forget that it is actual usage that creates value. You need to gather statistics and anecdotes on how users adopt the system. Like any enterprise initiative, the Enterprise Reporting function itself needs a well-thought-through set of reports linked to accountable managers.
  • Remove Obstacles to Take-Up - Armed with this insight, identify the key obstacles to adoption and try to move users down the ladder. If you can link quality assessments with usage monitoring, you can evaluate the success of initiatives for driving take-up (e.g. awareness, training, incentives). This means you can target and prioritise the initiatives accordingly (see the sketch after this list).
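As a sketch of that linkage, the example below compares average weekly usage before and after a hypothetical take-up initiative; all names and figures are invented for illustration:

    # Linking quality assessment to usage monitoring: compare average weekly
    # usage before and after a take-up initiative (e.g. a training campaign).
    weekly_usage = {  # week -> report accesses (hypothetical figures)
        "2024-W01": 120, "2024-W02": 115, "2024-W03": 130,
        "2024-W04": 180, "2024-W05": 210, "2024-W06": 205,
    }
    INITIATIVE_WEEK = "2024-W04"  # training rolled out here (assumed)

    weeks = sorted(weekly_usage)
    cut = weeks.index(INITIATIVE_WEEK)
    before = [weekly_usage[w] for w in weeks[:cut]]
    after = [weekly_usage[w] for w in weeks[cut:]]

    uplift = (sum(after) / len(after)) / (sum(before) / len(before)) - 1
    print(f"Average usage uplift since initiative: {uplift:.0%}")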