By Hollis Tibbetts
September 10, 2011 12:00 PM EDT
In survey after survey, about half of IT executives consistently agree that data quality and data consistency are among the biggest roadblocks to getting full value from their data.
This has been true ever since the Chinese invented the abacus. I suspect it will be true long after quantum computing has solved every other problem that humanity faces.
Incorrect, inconsistent, fraudulent and redundant data cost the U.S. economy over $3 Trillion a year - an astounding figure that is over twice the amount of the 2011 Federal Deficit.
I've long been a proponent of healthy software - but healthy software can only function properly in the presence of healthy data. Does quality software even matter if the underlying data are defective? Agreed - that's pushing the point to the extreme.
The rapid, iterative, continuous testing model has measurably improved the quality of software development. Evangelists such as Kent Beck have had a huge impact on this. I recently posted a freely downloadable white paper on this topic. But where are the evangelists for data quality? Where is an open source "JUnit for Data" and if it's out there, why isn't everyone using it?
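As far as I know, no such tool has become standard. But the concept is easy to sketch: the same assertion-style checks that JUnit applies to code, applied to records instead. Here is a minimal illustration in Python - the checks and field names are hypothetical, not from any actual product:

```python
# A "unit test for data": each check is an assertion over a record,
# and a batch of records either passes or yields a list of failures.

def check_month(record):
    """Month must be an integer from 1 to 12 -- 13 is never correct."""
    return record.get("month") in range(1, 13)

def check_amount(record):
    """Billing amounts must be positive numbers."""
    amount = record.get("amount")
    return isinstance(amount, (int, float)) and amount > 0

CHECKS = [check_month, check_amount]

def run_data_tests(records):
    """Return (passed, failures) -- the data analogue of a test run."""
    failures = [(r, c.__name__) for r in records for c in CHECKS if not c(r)]
    return len(failures) == 0, failures
```

Run continuously as data trickles in, or in batch over an existing store, this is the data equivalent of a regression suite.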
The Cost of Bad Data
Anyone care to guess how much money is wasted every year due to dirty or duplicate/redundant data? I'll start by presenting one common user story - one you have probably experienced recently - and then expand on it.
Recently, I went to my mailbox and waiting for me was yet another invitation from a major bank to join their credit card program.
This shouldn't come as a surprise, as people everywhere are deluged by credit card offers. Except that I already have the particular card in question. Not only that, but because the bank in question has managed to acquire a number of other banks and credit card lines of business, between my personal accounts and my corporation's, I believe I now have five Visa cards from this one bank.
I also occasionally get mail from them offering me cash bonuses to open up a checking account at their bank. Probably wasted postage, as I already have two checking accounts there. I suppose I could open up a third, just to get the $100.
Every month, I get a significant number of expensive-looking direct mail offers from this bank, often with slightly different variations on my name, which I promptly throw away. Aside from the environmental impact and the wasted direct mail expense, it's a bit irritating. I hate junk mail, and I feel compelled to shred things like credit card offers, so they've burdened me (an existing customer) with yet another "thing to do". They've spent money, hurt the environment, irritated an existing customer, and now I get to make fun of them online. Bad investment on their part.
QAS (an Experian company) estimates that the average company wastes $180,000 per year simply on direct mail that does not reach the intended recipient because of inaccurate data. But this is just one minuscule slice of the data quality issue. In fact, it's only one small part of the "direct mail" data quality issue. A lot more money is wasted on "inappropriate offers" and "duplicate offers" such as the ones my bank sends. I also get offers from several companies that are convinced I'm married to the previous owner of my house. Those offers reach me, yet are immediately shredded - no sense opening them. So the "big picture" just for direct mail is much larger than what QAS shows.
None of this accounts for the "irritation" factor - what is the cost of annoying existing customers (or potential customers) with badly targeted offers?
Yet direct mail and all other forms of advertising together add up to a tiny slice of the bad-data pie.
Fraud Is a Bad Data Problem
Some time back, the US Attorney General's office stated that they believed that 14 percent of health care dollars are wasted in fraud or inaccurate billing.
Why do I lump fraud in with "bad data"? Bad data comes in two forms - accidentally created bad data and intentionally created bad data (for example, fraudulent billing). Either way, it's bad data. It doesn't matter how it got there, it's defective. And a lot of it could be detected and remediated "at the point of entry".
Healthcare accounts for over 16% of the U.S. GDP (Canada is 10% and Australia is 9%, by comparison). The U.S. GDP is currently approximately $14 trillion, so healthcare spending in the U.S. amounts to roughly $2.25 trillion. And the cost of bad data in healthcare: $314 billion.
That's just for fraud or inaccurate billing. What about other areas in healthcare (e.g. lost data, "bad patient outcomes", duplicate patient testing, manual rework, etc.)? Even if we round down, we're still talking about $500 billion for one industry alone. If I extrapolate that out to the entire U.S. economy, we're talking about a $3.1 trillion problem. No matter how far off my estimate is (on the high side or the low side), it's a problem of astonishing proportions.
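For the record, here is the arithmetic behind that extrapolation, using the rounded figures quoted above:

```python
# Back-of-the-envelope version of the healthcare extrapolation above.
gdp = 14e12                  # U.S. GDP, ~$14 trillion
healthcare_share = 0.16      # healthcare as a share of GDP
fraud_share = 0.14           # fraud / inaccurate billing (AG estimate)

healthcare_spend = gdp * healthcare_share        # ~$2.25 trillion
fraud_cost = healthcare_spend * fraud_share      # ~$314 billion

# Add the other bad-data costs, round down to ~$500 billion for healthcare,
# then scale to the whole economy by healthcare's share of GDP.
healthcare_bad_data = 500e9
economy_estimate = healthcare_bad_data / healthcare_share   # ~$3.1 trillion
```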
Cost of Bad Data to Business and IT
A classic but very worthwhile book from information governance expert Larry English posits that the business cost of nonquality data may be as high as 10-25% of an organization's revenue, and that as much as 50% of the typical IT budget may be spent on "information scrap and rework". If that is the case, then my $3.1 trillion estimate is not out of line.
In the introduction to his book, English states "With this proliferation of information, the challenge of managing data and providing quality information has never been more important or complex."
That was in 1999. With so much more data today, and a surprising lack of attention to the data quality issue, I can only imagine the total economic impact of things today. I do not doubt that the cost of bad data has risen.
Dealing with bad data at the I.T. level is expensive. But if I.T. doesn't deal with the bad data problem, then the cost gets pushed downstream to the "business", where the business costs are geometrically higher. The model is not that different from that of "healthy software", where it costs $1 to uncover a defect during developer/unit testing, but $100 to fix that defect if the software is released to the end-users.
"Low Hanging Fruit" - Best Practices for Bad Data Avoidance
I am not saying that there are any easy fixes to the bad data problem. Even something as relatively simple as cleaning, standardizing and de-duping a mailing list with 10,000,000 entries is essentially impossible to get completely right, no matter how much effort is put into it. Yet there are some relatively easy things that can be done to substantially improve the quality of our data. As with so many other problems in life, some version of the 80/20 rule applies here as well.
Best Practice #1: When integrating data, fix the quality problem during integration
As data are added or integrated, they should be tested. Profiling is a simple, fast, relatively easy to implement and highly effective way to eliminate significant volumes of defective data.
When developers write a new application for the input of some new data, it's normal for input fields to be "validated" - a simple "hard coded" form of profiling. Month number needs to be between 1-12. 13 is never correct. Not rocket science. And it's universally done.
Yet people have far fewer reservations about integrating data from here, there and everywhere - often not checking for even the most egregious data errors, and thereby polluting the organizational drinking water (i.e. all the data and applications downstream).
I strongly suspect that's why I get so many offers from my current mega-bank. Since the banking implosion, this particular bank has purchased every other bank around. And their credit card businesses. And their marketing databases. And (apparently) smashed them together. So I get offers for Hollis Tibbetts, Hollis W. Tibbetts, Hollis Winslow Tibbetts, Hollis Tibbets, Hollis Tibbitts and so on.
Integration of data isn't necessarily just a "big bang" event - like when one company acquires another and smashes all the data together, or when two divisional customer applications get merged. It can be more insidious with "trickle" integration - the slow feed of new data from one system into another (either within the organization or from customers/suppliers/partners). This is the class of integration causing many of the healthcare fraud problems discussed earlier.
Either way, FIX IT before integrating it. Once the poison enters the corporate drinking water, it's a lot harder to get out (not just technically, but especially politically/organizationally).
Best Practice #2: When migrating data, fix the data problem as PART of the migration project
Spending $1 billion to upgrade your Siebel system, like the US Government is doing? Sounds like a great time to fix your data quality problem.
If you're doing something like migrating your customer data from Siebel to NetSuite or Salesforce.com, data quality should be a major element in your project plan (and budget). Fixing the problems during the migration is easier than fixing them later:
- You probably already possess a lot of knowledge about the existing legacy systems and the types of problems in their data, while your new system is relatively unknown to you. So it's likely to be easier to fix data issues from a technical perspective BEFORE they get loaded into the new system.
- As part of the data migration process, you can export the data to a staging platform (on-premises or cloud), leverage any number of data quality tools/engines, and then import the data into the new application platform. This approach may partially pay for itself in an easier/smoother upgrade to the new application, but that's a rounding error in the overall scheme of things.
- Organizationally and politically, companies are much more likely to spend money to clean data if it's part of a project like "upgrade the CRM system". I'd hate to be the CIO who spends a mountain of money to upgrade the CRM system and then goes back to the board asking for another mountain of money to fix all the bad data that just got loaded into it. That's how CIOs become ex-CIOs.
Best Practice #3: Data profiling and data de-duplication engines
Data profiling engines are a great technology for quickly improving the quality of data as it is integrated from one system into another. At the highest level, they scan data and apply easily definable rules to data elements - formats, ranges, allowable values - and can also evaluate relationships between different fields.
Furthermore, these engines can be used to analyze existing data stores very rapidly and generate "exceptions files" for manual or semi-automated remediation (if anyone can find a totally automated data remediation system, I'd love to know about it). So they can be used in "continuous testing" or "batch testing" mode. In batch mode, they're ideal for application migrations or big-bang integrations, as they're easiest to use if you have your data in something like a staging database. But they can also be used to test data as it is "trickle integrated" into production systems.
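A toy version of such an engine - the rules, field names and allowable values below are purely illustrative - shows how a batch scan can emit an exceptions file, including a cross-field relationship rule:

```python
# Hypothetical profiling rules: each maps a rule name to a predicate over
# the whole record, so relationships between fields can be checked too.
RULES = {
    "month_in_range": lambda r: 1 <= r.get("month", 0) <= 12,
    "state_allowed":  lambda r: r.get("state") in {"CA", "NY", "TX"},
    "discharge_after_admit":
        lambda r: r.get("discharge") is None or r["discharge"] >= r.get("admit", ""),
}

def profile(records):
    """Batch-scan records; return the clean rows plus an 'exceptions file'
    (each bad record paired with the names of the rules it broke)."""
    clean, exceptions = [], []
    for r in records:
        broken = [name for name, rule in RULES.items() if not rule(r)]
        if broken:
            exceptions.append((r, broken))
        else:
            clean.append(r)
    return clean, exceptions
```

The exceptions list is what would be handed off for manual or semi-automated remediation.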
De-duping engines generally fit into the same category. I haven't seen them be as effective as data profiling engines, yet I believe they're essential. The technology for de-duping is considerably more sophisticated - with a large number of different algorithms and tunable thresholds and such. It's a harder class of technology to implement. More manual effort is involved. And, unlike profiling (where there is NEVER a month "13"), de-duping can "get it wrong", so the technology needs to be applied more selectively.
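Python's standard library is enough to sketch the tunable-threshold idea (the 0.85 cutoff and the normalization below are assumptions to be tuned, not recommendations):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Fuzzy similarity between two names, normalized for case/whitespace."""
    norm = lambda s: " ".join(s.lower().split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def dedupe(names, threshold=0.85):
    """Keep the first occurrence of each fuzzy cluster; later names scoring
    above the threshold against a kept name are treated as duplicates.
    Unlike profiling, this CAN get it wrong -- hence the tunable threshold."""
    kept = []
    for name in names:
        if not any(similarity(name, k) >= threshold for k in kept):
            kept.append(name)
    return kept
```

With name variants like the ones my bank holds, "Hollis Tibbets" collapses into "Hollis Tibbetts" while a genuinely different customer survives - and moving the threshold shifts where that line falls, which is exactly why de-duping needs more care than profiling.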
I've never understood why these engines haven't been more popular. There is no "JUnit for data" as far as I know. But commercial solutions are available - they're not terribly expensive and rapidly pay for themselves.
On the other hand, I've never understood why organizations are so tolerant of bad, dirty data. They waste millions and millions directly because of it (and untold quantities of money in "wasted opportunities"), but are reluctant to spend $15,000 on a data quality engine to help fix a significant portion of the problem.