Ask Our Experts: Foundations of Quality Data


Q: With all the information overload in the market, how can one distinguish quality data from white noise?

By New Frontier Data

A: Today, every successful market can be quantified through substantial data and market analysis. Effective analysis provides valuable insights into industry shifts, operators’ and stakeholders’ motivations, evolving market trends, customer demographics, and patterns of customer behavior and spending. The understanding it yields is vital to the growth of any player in the market or industry.

Whether it concerns policy or evidence-based practices, good data, carefully collected from a large and representative pool, can be invaluable. What is meant by data? Data is statistics and facts collected for reference or analysis. Big data is a complex collection of information from multiple sources, rapidly accumulated for ongoing discovery and analysis. In-depth analysis of big data is instrumental for risk mitigation, cost reduction, and greater operational efficiency in any industry. Big data and its proper analysis are crucial for operators, investors, legislators, and researchers alike to make educated decisions based on accurate, statistically meaningful facts.

In a contemporary arena tainted by fake news, cynical spin, and twisted truths, it is all too common for numbers to be distorted, whether through a biased survey, a skewed methodology, or over-reliance on data sources that serve external motives rather than reflect the underlying facts. The key to solid data quality is a transparent, unbiased collection methodology drawing on a wide range of sources to wash out the noise.

Many organizations focus only on the final data and invest in data quality control just before it is delivered. With such an approach, by the time an issue arises it is often already too late: either tracing the problem back to its origin takes too long, or fixing it becomes too costly and time-consuming.
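The difference between end-of-pipeline quality control and checking data as it arrives can be sketched in a few lines. The field names, rules, and thresholds below are illustrative assumptions for demonstration, not an actual New Frontier Data pipeline:

```python
# Illustrative sketch: validate records at ingestion rather than only at delivery,
# so problems are caught (and traceable to a source) before they propagate.
# All field names and rules here are hypothetical assumptions.

def validate_record(record: dict) -> list:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    if not record.get("source"):
        issues.append("missing source (no provenance)")
    price = record.get("price")
    if price is None or price < 0:
        issues.append("price missing or negative")
    return issues


def ingest(records: list) -> tuple:
    """Split incoming records into accepted rows and quarantined rows with reasons."""
    accepted, quarantined = [], []
    for rec in records:
        problems = validate_record(rec)
        if problems:
            quarantined.append((rec, problems))
        else:
            accepted.append(rec)
    return accepted, quarantined


accepted, quarantined = ingest([
    {"source": "survey_a", "price": 10.0},   # clean record
    {"source": "", "price": -5.0},           # no provenance, bad price
])
```

Because each quarantined record carries the reasons it was rejected, root cause analysis can start immediately instead of weeks later, after the bad rows have already fed downstream reports.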

Bad data quality begets poor information quality. A lack of actionable knowledge in business operations leads to risky business outcomes.

New Frontier Data’s work fundamentally depends on identifying quality data. Continuously providing the most accurate reporting in the industry requires strict adherence to best practices:

  • Engage with top-level management and a cross-departmental perspective.
  • Manage data quality activities as a part of a data governance framework.
  • Assign data owner and data steward roles on the business side of the organization, and place data custodian roles in business or I.T., wherever each makes the most sense.
  • Use a business glossary as the foundation for metadata management. Metadata is data about data, and metadata management must be used to identify common data definitions and link those to current and future business applications.
  • For each data-quality issue raised, start with a root cause analysis.
  • Strive to implement processes and technology to prevent issues from occurring.
  • Define data quality KPIs linked to general KPIs for business performance.
  • Strive to use fact-based impact and risk analysis to justify solutions and their required funding levels.
  • Aim to find cost-effective solutions for data onboarding that utilize third-party data sources for publicly available data. For product data, utilize second-party data from trading partners whenever possible.
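The "data quality KPIs" item above can be made concrete with a small sketch that scores a batch of records on completeness and validity, the kinds of measures that can then be tied to business KPIs. The fields and the validity rule are hypothetical assumptions chosen for illustration:

```python
# Illustrative sketch: compute simple data-quality KPIs (completeness, validity)
# over a batch of records. Field names and the validity rule are assumptions.

def completeness(records: list, field: str) -> float:
    """Share of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)


def validity(records: list, field: str, predicate) -> float:
    """Share of records whose `field` value satisfies `predicate`."""
    if not records:
        return 0.0
    valid = sum(1 for r in records if predicate(r.get(field)))
    return valid / len(records)


batch = [
    {"price": 12.5, "state": "CA"},
    {"price": None, "state": "CO"},   # incomplete price
    {"price": 8.0, "state": ""},      # incomplete state
]

kpis = {
    "price_completeness": completeness(batch, "price"),
    "state_completeness": completeness(batch, "state"),
    "price_validity": validity(batch, "price", lambda v: v is not None and v > 0),
}
```

Tracking such scores per source over time makes it possible to justify, with fact-based impact analysis, which data-quality fixes deserve funding first.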