The latest from Insurance Journal. See WRG
KPMG International has prepared an executive summary of the four papers provided by the following companies in support of the Executive Panel: “How well is the Life insurance industry keeping pace with rapidly changing technology?” The panel included Allianz, Legal and General Group, RGA and Sun Life Financial. Download Life-insurance-keeping-pace
A standard is a document that provides requirements, specifications, guidelines or characteristics that can be used consistently to ensure that materials, products, processes and services are fit for their purpose. We have published over 19,500 International Standards that can be purchased from the ISO store or from our members.
ISO International Standards ensure that products and services are safe, reliable and of good quality. For business, they are strategic tools that reduce costs by minimizing waste and errors, and increasing productivity. They help companies to access new markets, level the playing field for developing countries and facilitate free and fair global trade.
Learn more about ISO
I've been listening in on a developer doing a little coding. I stumbled on Ben Lopatin's blog owing to his statement that “conventions are conventions for a reason. It’s okay to do things differently when there’s a solid expected payoff, but that expectation needs to be well grounded.”
Ben relates playing with a new geocoding service. He wanted to switch the order of coordinates given by the service from (latitude, longitude) to (longitude, latitude) because prior experience with a system led him to believe the latter was the GIS standard. It turned out that the system he was familiar with had flipped the normal order for arcane reasons of its own. The rest of the world says latitude, longitude, and that's the standard – one created by convention, not regulation.
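If you've ever had to reconcile feeds that disagree on coordinate order, the usual defense is a small normalizing shim at the boundary. Here's a minimal sketch in Python (the function and feed details are mine, not Ben's; note that a few formats, GeoJSON among them, really do put longitude first):

```python
def to_lat_lon(point, native_order="lat_lon"):
    """Normalize a coordinate pair to (latitude, longitude).

    Most services return (lat, lon), but some formats -- GeoJSON,
    notably -- put longitude first, so every feed should declare
    its native order rather than leave it to habit.
    """
    a, b = point
    return (a, b) if native_order == "lat_lon" else (b, a)

# Normalize at the boundary, once, then use one convention internally.
print(to_lat_lon((40.7128, -74.0060)))                          # already (lat, lon)
print(to_lat_lon((-74.0060, 40.7128), native_order="lon_lat"))  # flipped feed
```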
It's a small example of how habit and familiarity can trip us up. It's also a good reminder that any convention only looks “right” because we're used to it. For example, it seems natural that we put the dollar sign ahead of the numerals, even though we read and say money amounts as “x dollars”.
I like Ben's takeaway. You've got to have a good reason for departing from a standard. Standards have reasons, even if you can't see them. Geocoding
As citizens press for more data transparency in government, the demand for openness is going global. The Open Contracting Data Standard has credible support and funding. It will enable governments around the world to publish their procurements in a comparable way. Governments are estimated to spend $9.5 trillion every year through contracts. Supply Management
Information professionals can sometimes get lost in conversations with technology folks – especially vendors – when the two parties are on slightly different pages when it comes to standards. Or, rather, when they're reading from different books.
This happens when both parties are talking about data standards – but the information guys mean standards at the business level, while the technology guys are thinking about the technical level.
This divergence is at the root of much confusion in information management circles, where proponents of (business) standards seem to hit a brick wall talking with colleagues who are up to their necks in (technical) standards. One side is saying, We need standards! The other side is saying, We got standards!
For example, pretty much every application you can think of that uses medical imaging uses the DICOM standard. Every system is compliant with this robust, detailed standard for image representation and transmission. But “each vendor supports the DICOM standard in its own way, so there’s no uniformity with regard to DICOM tags and how they are used”. Without interoperability of image data at the business level, it's very, very hard to reliably attach an image set to a patient record.
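To make the business-versus-technical gap concrete, here's a toy sketch (plain Python, not real DICOM handling; the tag values are invented). Both "vendors" below are perfectly compliant at the technical level, yet a consumer still needs per-vendor fallbacks to answer a simple business question:

```python
# Two vendors, both technically "DICOM compliant", park the study
# description in different places.
vendor_a_tags = {(0x0008, 0x1030): "CT CHEST W/O CONTRAST"}  # standard StudyDescription tag
vendor_b_tags = {(0x0029, 0x1010): "ct chest plain"}         # a hypothetical private tag

def study_description(tags):
    # Business-level interoperability means one agreed answer to
    # "where does this live?" -- without it, every consumer grows
    # a list of per-vendor fallbacks like this one.
    for tag in [(0x0008, 0x1030), (0x0029, 0x1010)]:
        if tag in tags:
            return tags[tag]
    return None

for tags in (vendor_a_tags, vendor_b_tags):
    print(study_description(tags))
```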
I have a tip for recognizing when you're in one of those cross-purpose conversations, where people are using different conceptions of standards. It's to ask a “Who” question.
For example: “Can your data tell me who this is a picture of?” This will flush out whether the standard in mind is at the technical or business level. Technical standards don't contain entities representing business objects, such as customers. A simpler “Who” question for this situation is: “Who uses this data?” A blank look will tell you all you need to know. Merge.com
Our understanding of big data today usually only encompasses data consciously generated by people or passively collected by relatively large sensors, such as closed circuit TV cameras. The real explosion in data is going to come from much smaller, and much more numerous, sensors. And a prime site for locating sensors is on – and in – the human body.
Already the trend toward capturing data about steps taken, calories burned, and heart rate, is growing. There are dedicated devices and smartphone apps. The tech companies are competing fiercely to reinvent the wristwatch as a device primarily dedicated to personal data monitoring. The long-stalled category of “wearable computing” is enjoying a new lease of life.
Now scientists have developed a way of creating personalized assemblies of multiple sensors that can be implanted in the body. So, for example, they make a 3D model of your heart using scanning technologies. They then use the model to print a sensor-embedded membrane that exactly fits the surface of your heart. The membrane can then transmit live data about the performance of every part of the organ.
This is one of those ideas that's amazing for about a minute – and then seems obvious. Surely everybody should have one of these as soon as it's safe and practical to do so. And that's just the heart... There's an awesome number of substances, rates, shapes, and relationships within the body that could be emitting data.
Making use of all this data is something else. The line between data collection and health improvements may be long and complex. But with sensor technology improving in leaps and bounds, this is surely the next frontier for data. Wired
This takeaway leaps out from a recent Deloitte survey: “Many data governance organizations struggle to obtain executive sponsorship, in part because they have not linked data quality to improved business performance.”
Deloitte's Tom Mongoven puts his finger on maybe the crucial disconnect of our times – the failure of enterprises to “get” the importance of data, even while they invest heavily in information technology and salute the information era we live in.
The survey details the disappointingly high number of organizations which don't track business value metrics “like the number of reductions in out of stocks driven by having more accurate product and customer data or increases in discounts from strategic suppliers driven by having more accurate vendor data”. This implies people are building sophisticated management information and ERP systems, doubtless based on specific business cases, and then ignoring the outcomes.
Is it because data governance folks think the benefits of good data are plain obvious? If so, they need to show a few more slides illustrating the measured benefits they're bringing to the business. The link between data and business performance needs to be demonstrated – regularly. This is true for every aspect of an enterprise. Don't take the contribution of data for granted. Deloitte
Tim Davies wrote a deep piece about the ethical dimensions of data standards. It's a more philosophical take on standards than I'm used to, but the issues raised are interesting. For example, Tim asks us to consider which items in a data standard are descriptive and which are normative; that is, which are, if you like, just capturing neutral facts and which are making assertions.
As I say, this stuff is deep, and for me Tim is already going deeper than most of us in business in his deft description of the purely practical aspect of standards: “Building a good standard involves practical choices based on the data that is available, the technologies that might use that data and what they expect, and the feasibility of encouraging parties who might communicate using that standard to adapt their practices (more or less minimally) in order to adopt it.”
The last point about “the feasibility of encouraging parties” to change their behaviors has been routinely missed by technologists. It's a point that's obvious to social scientists and should be salient in the thinking of business folks.
Perhaps a standard's potential for successful adoption relies in some measure on the ethical dimension Tim explores. For example, if the target community believes the standard contains items that don't describe the world they know, but attempt to force their thinking into some other mold, then they are likely to reject or undermine it.
This is one reason why community-developed data standards, like ACORD's, have greater real feasibility than mandated or proprietary ones. If you're involved in creating and maintaining the standard, then you know for sure the standard is meant to serve you and your purposes.
What do useful people in organizations do? There are lots of ways of analyzing successful behaviors. I believe a good way to characterize what makes somebody, or some team, useful is according to their orientation to standards. And this is not just a ho-hum, sort-of-interesting perspective. It goes right to the heart of people's roles in our increasingly complex enterprises.
My thesis is: All useful people are either exploiting, adapting or creating a standard. Also, about 90 per cent of the useful people are exploiting, 9 per cent are adapting, and 1 per cent are creating.
Just think about the most common problems in any business. The pain and stubborn costs in business are overwhelmingly connected with people “doing their own thing”. With “the left hand not knowing what the right hand is doing”. With duplications and omissions. With lack of clarity and ambiguity. Those who exploit standards just eradicate these problems. Standards stop all this nonsense in its tracks. It's not about forcing anyone to do anything against their will. It's just about giving them the right tools for the job.
In order to make sure the majority of people in the organization can exploit standards, a small amount of resources must be dedicated to seeking and sometimes adapting standards to areas of the business that still require them.
In a vanishingly small number of cases, an appropriate usable standard does not exist and has to be invented to meet a specific need. In the wild west days of IT (which, like the real wild west, didn't last as long as you might think) there weren't any standards, so people made their own stuff up. But those days are long gone. Similarly, it's got to be at least a hundred years since people started to study management and figure out the wisdom of what we now call business information and process management. Most of the standards required by most people exist in tried and trusted forms. In a phrase: Useful people use standards. There's no utility in doing otherwise.
Service businesses are different from product businesses. You can't force one type to behave like the other. Everyone knows that. Except, in one very important respect, everyone is wrong. Both product and service businesses need to take a product approach to data, and a service approach to data management.
The product approach to data is another way of saying standards. By using data standards, you productize your data. You define pieces of data in terms of their features and values. And you don't craft information objects from scratch. The philosophy is all about reproduction and manipulation of known quantities. The classic service businesses have data “products” like customers, contracts, and charges.
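In code terms, productizing data just means fixing the features and value types up front and reproducing them, never hand-crafting. A minimal sketch (the field names are illustrative, not drawn from any actual standard):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Charge:
    """A data 'product': fixed features, known value types,
    stamped out rather than crafted from scratch."""
    contract_id: str
    amount_cents: int      # integer cents: no float rounding surprises
    currency: str = "USD"

line_item = Charge(contract_id="CT-1001", amount_cents=12500)
print(line_item)
```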
The service approach to data management complements the usage of standards. Data serves the purposes of people, so while data must be standardized, its usability always requires the continued support of real life human beings. The aim of automation is not to remove people from the information path, but to keep shifting their location to more profitable segments of the path. Once you have data standards in place, you have a strong foundation for achieving greater business value from your information.
Harvard Magazine reports big data is big business at the university, with students and faculty of all kinds taking up courses in data science. Qualitative experts are getting together with statisticians and finding quick, high quality routes to new results: “Whenever sufficient information can be quantified, modern statistical methods will outperform an individual or small group of people every time.”
Let's pause on that precondition: “Whenever sufficient information can be quantified”. You know what I'm going to say. It's not just that someone has to have counted, measured or gathered the information of interest. They must also have assembled the data according to a standard that gives it the correct meaning.
There's also a danger we will choose increasingly to work with the data we have, rather than the data we need. There's a perception that the world is drowning in data that can be turned to useful account with a little statistical genius. But lose an airliner one day and you soon realize we only really have the data we actively seek. By-product data is great when it's there, but it's not everywhere.
It's like mapping terrorist organizations from cellphone calls. You don't have to go out into the marketplace and make contacts. But you get a skewed view of the enemy – which is dedicated to flying under whatever kind of radar you use to detect them.
Harvard people are smart. They know data without theory is useless. I hope the students flocking to data analysis realize data isn't a natural force, or an inevitable by-product of modern life. We still need to identify, source, and care for data. Data is actually a scarce resource, no matter how big the numbers sound. Harvard Magazine
The New York Times ran a long article about Enigma.io, which collates federal data and organizes it by company entity. Anyone can use Enigma to get a coherent picture of a company's business across government.
Enigma does a lot of scraping and collating to produce its analyses. None of this is cheap. It's about custom-writing interfaces for every federal feed of interest. It also involves designing rules and heuristics for deciding when two or more entities are identical, distinct, or related. Enigma recently raised $4.5m from investors – including the New York Times.
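What do those rules and heuristics look like? Something like this minimal sketch (my own toy illustration, not Enigma's actual logic):

```python
import re

def normalize(name: str) -> str:
    """Strip punctuation and legal suffixes so name variants collide."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    return re.sub(r"\b(inc|corp|llc|co|ltd)\b", "", name).strip()

def corroborated(a: dict, b: dict, field: str) -> bool:
    return a.get(field) is not None and a.get(field) == b.get(field)

def same_entity(a: dict, b: dict) -> bool:
    # Crude heuristic: matching normalized name plus one corroborating
    # field. Real resolution layers dozens of such rules -- which is
    # exactly the cost a shared identifier standard would remove.
    if normalize(a["name"]) != normalize(b["name"]):
        return False
    return corroborated(a, b, "zip") or corroborated(a, b, "duns")

print(same_entity({"name": "Acme, Inc.", "zip": "10001"},
                  {"name": "ACME INC",   "zip": "10001"}))   # True
```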
The same article quotes Hudson Hollister of the Data Transparency Coalition: “If the government adopted data standards, analytics would be cheaper and insight would be better.”
Also, I suggest, Enigma's investments in data resolution would become worthless. So, which is it to be? Will Enigma, or some other portal/aggregator/hub, stake a long-term place in the market for public data, eventually monetizing its crucial traffic cop role? Or will the government mandate sensible data standards for federal data?
Personally, while I like to see entrepreneurs addressing the fragmentation in public data, I believe the internal costs to government of not using standards will force its hand. US citizens suffer twice from the absence of federal data standards: first when duplication and error are tolerated within government, and second when the fragmented data is released to the public.
Getting information to flow properly saves time and money. The costs and frustrations of fragmented data used to happen behind closed doors. People shook their heads and said, It's the system. Computers and networks were so complex, sophisticated, and temperamental, that we were supposed to be grateful anything worked at all.
Computers remain complex, but their use has dramatically simplified. We don't need to know anything about networks to be connected wherever we go. IT isn't a white-coated priesthood any more. So no one can use technology issues as an excuse for data fragmentation.
Let's be clear about one thing: If data streams are not matching up across business boundaries, this is the result of a poor decision, or, more likely these days, a decision not taken. Anybody with a responsibility for the end-to-end delivery of a service to some stakeholder must look for instances of data fragmentation – and fix them.
The UK is putting driving licence data online for insurers so that fraud and error about endorsements no longer needs to be priced in. This should result in about a £15 reduction in annual premiums. It will also make the application process much quicker. This is a good example of business and government working together to remove the kind of fragmentation none of us should tolerate any longer. The Guardian
I hear this phrase a lot, usually accompanied by some hurry-up gestures: “We are where we are.” Heck, I might even use it myself. We've all been stuck in those one-sided conversations that seem to be more about blaming other people for what went less than well, rather than figuring out what to do next.
And yet... Often it's necessary to review past actions and learn lessons. If we don't analyze the decision processes that led to poor performance or a bad outcome, then we're likely to make the same kinds of mistakes again. Sometimes it's worth looking back so you can move forward with confidence.
Here's a choice of soundbites from a recent conference on government procurement. You can have Karen Pica from the Office of Management and Budget (OMB) saying: “There is never a dull moment with data standards.” Or you can have the Defense Department's Paul Brubaker saying: “I think we need to flip the conversation. What outcomes do we want in government and do we have the datasets?”
I don't know if Pica's quote got laughs. I hope Brubaker's got lots of nods. His point was that rather than trying to collect more and more data, agencies need to decide on their goals and then collect the data which supports those goals.
Not all the glistering data we happen to have is gold. Why would it be? The data we have is the product of past designed processes. If those processes don't intersect with other areas of interest, then the data can have no relevance to them. And even when process areas overlap, the data collected for one purpose may well not serve any other purpose. Federal Times
Ian Kalin at Socrata writes an excellent piece on open data, pinpointing and answering five key questions city leaders and CIOs raise about open data. Kalin's second question is so cool I want to quote it in full: “Hasn’t Big Data already solved the problem of messy data? - Tools exist to help mobile applications understand the similarities between a data field labeled 'Address' and one labeled 'Street, City, State'. The real problem is the scale of deployment. For example, a San Francisco-based application that places food inspection scores in people’s hands cannot easily scale to Philadelphia if scores are calculated differently in each city, or if the data isn’t available in real time via an Application Programming Interface (API).”
I'd be grateful just for Kalin's articulation of the question, let alone the answer. This is an important, and often unspoken, misconception about today's data landscape. And the answer makes it clear that technology speed will never compensate for conceptual clarity. Even if you could always assemble the required data on the fly from disparate sources, you would still need to make that data available in a comprehensible, stable, and published format. Data standards are an inherent aspect of data exploitation.
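For the curious, the field-mapping tools Kalin mentions are doing something like this, at their most naive (a sketch of mine, with invented field names):

```python
def split_address(record: dict) -> dict:
    """Map a single 'Address' field onto street/city/state."""
    if {"street", "city", "state"} <= record.keys():
        return record                    # already in the target shape
    street, city, state = [part.strip() for part in record["Address"].split(",")]
    return {"street": street, "city": city, "state": state}

print(split_address({"Address": "123 Main St, Springfield, IL"}))
```

The three-way unpack breaks the moment a record doesn't follow the comma convention – which is the point: clever mapping code papers over a missing standard, it doesn't replace one.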
Many news outlets covered a report from the Government Accountability Office (GAO) on the current confusion in health data standards. Reporter Allison Bell summed up the core issue neatly: “Just about everyone agrees that the United States should have electronic health data standards, but federal regulators are having trouble actually setting standards and making them stick.”
The report includes the eyebrow-raising information that, in some places, patient records can be conflated if the patients share the same name, birth year, and locality. How, you might ask, could any systems integrator make such a basic mistake?
I believe the answer may be that, despite the widespread use of the term “systems integration”, there's actually very little experience of systems integration within health care. There's not enough appreciation of business architectural issues. Health is late to the integration scene. If you're choosing technical resources, you're more likely to choose someone with health system experience than someone from a different domain who happens to have a greater appreciation of systems integration issues.
The report also highlights that while a number of measures have been taken to encourage greater data standardization, actions and milestones have not been set.
The headline for Bell's article is “GAO: Agencies vague on health data standards”. There are two kinds of vagueness at work here. First is a lack of clarity over domain-wide data standards – such as how you uniquely identify a patient. Second is a lack of specificity about who is meant to be doing what, and by when. The health sector needs to tighten focus in both these areas.
Two major themes of our times, big data and open data, rely on the successful agreement and usage of data standards. The benefits of working with masses of data, and with massively available data, are absolutely predicated on clear and robust schemes for understanding, collating, and sharing data in meaningful ways.
There's lots going on in the open data arena, partly because opening up data can generate new types of services, and partly because the need to standardize data ahead of opening it up makes for significant cost savings in public sector organizations.
People building and advocating data standards for open data can learn a lot from the experiences of standards groups in commerce and industry. The ACORD community has much to teach – not necessarily about individual modeling decisions, but about the process of developing, promoting, and implementing standards. We've evolved and we know what doesn't work as well as what does.
Some of the advice I'd give sounds a little paradoxical. For example, I'd say you need every stakeholder involved in the creation of a viable standard. But I'd also say the user community can't wait around for the outliers to have their say. At some point, the bus has to leave.
Similarly, I'd say that while you want to get the best possible structure and nomenclature for every item in your standard, you also have to be prepared to go with something that's 90 per cent there. Standards work attracts perfectionists – and thank goodness it does. But we also have to trade perfection against timeliness. There will always be a chance to improve a standard. There may not be a second chance for an industry to enter a new market in strength.
Automated equity traders in Europe are building a data standard. Here's why. Marc Berthoud, executive director, SIX, says: “Imagine you've got all the major data vendors having to capture all those feeds, understand the logic of the feeds and try to normalise the output in order to have a coherent output for the end user […] The point is, for those 90 feeds there is no chance for the data vendors to have a fully converging educated guess of the data normalisation process.”
The pattern for growing markets goes like this. Players turn up somewhere and begin to trade. They build systems. They spit data out of their systems. They scrape up each other's data and try to match it up. Pretty soon, most resources are devoted to laboring in the dirt. Then, somebody looks up and says: You know what? This is crazy! Let's just have all the data in a shareable form from the get-go.
It would be neat if communities didn't have to wait for this point – if they could start with standards. Most times there are pre-existing standards that could cater for at least 80% of any new sector's business needs. However, the need for standards will always have to wait for the proof of the market. No one wants to invest in standards ahead of the formation of a viable community.
The big question is: What size does a community have to reach before standards are necessary? Experience suggests communities have to get too big before people recognize the need for standards. If you're battling to keep 90 different versions of the same story in line, as our friends in the European automated equities business are doing, then the pain is loud enough to make everyone yell.
As word about standards spreads, I expect to see this level drop. Ninety is too high. You can be saving real money at numbers like five and six. This is simple network math: Once you have a handful of partners, it becomes cheaper to standardize than to maintain variance.
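The back-of-envelope version of that math (a rough model, not a formal result): without a standard, n partners need a translation per pair; with a standard, each partner maintains a single mapping.

\[
\frac{n(n-1)}{2} > n \iff n > 3
\]

At five partners that's 10 pairwise mappings against 5; at ninety, it's 4,005 against 90.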
Data standard, standard of data... It's easy to get confused. The first term means recording data in line with an agreed structure. The second means recording data with accuracy. Data standards are fundamental to any business use of IT. So is accuracy.
I just read an article that talks about accuracy of data without mentioning data standards. And yet, I think the article is about data standards, not accuracy. William McKnight revisits the Garbage In, Garbage Out (GIGO) meme, correctly noting how this age-old saying continues to be vitally relevant. Here's the section where I get confused:
“For example, in a retail situation each store can map their own keyboard. Most won’t, but some will – to take advantage of local trends. A store that sells mostly Combination #3 might change F1 from Combination #1 to Combination #3. It makes life easier at the store, but can create a headache at corporate if it is not prepared to make the adjustment to the data. The store is concerned with getting paid what it is due. Headquarters is concerned with what is selling. In this case, the source doesn’t change, the analytic store will change the data.”
To me, this is a data standards issue. You reassign F1, you go off-standard. You start speaking the wrong language. A data accuracy issue would be if you kept the keys assigned according to the standard, but you hit F1 twice every time you made a sale.
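A toy sketch of the distinction (the keyboard scenario is McKnight's; the code is mine):

```python
STANDARD_KEYMAP = {"F1": "COMBO_1", "F2": "COMBO_2", "F3": "COMBO_3"}

# Standards problem: the store redefines what F1 *means* -- corporate
# now receives well-formed data in the wrong language.
local_keymap = {**STANDARD_KEYMAP, "F1": "COMBO_3"}

# Accuracy problem: the meaning is standard but the *values* are wrong --
# the clerk hits F1 twice and one sale is recorded as two.
key_log = ["F1", "F1"]
sales = [STANDARD_KEYMAP[key] for key in key_log]
print(sales)   # ['COMBO_1', 'COMBO_1'] -- standard language, bad count
```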
The difference I'm describing is the one between item and value. Data standards ensure, among other things, that the business counts what it's interested in. But data standards can't ensure people count properly. To me, the latter issue is a data quality issue. McKnight
Computer architecture is organized around a basic separation: processes and data are kept apart, even though both are ultimately made up of ones and zeros. Programs access pieces of data, and use or transform them. However, it's increasingly clear that business doesn't work this way – not naturally, anyway. We have designed processes, companies, and entire industries to work in a stop-go way that enacts the filing and retrieval of data by people performing procedures. Our information systems mimic these human systems.
In reality, business flows. In manufacturing, raw materials and energy flow in one end of the shed, and finished goods flow out the other. In service businesses, data flows among specialists who can use or transform the data to add value. It would make sense, then, if instead of seeing data as something that is mostly stored and occasionally accessed, we saw data as a stream that is temporarily halted by value-adding nodes.
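In code, that's the difference between a database you query and a pipeline you stand in. A toy sketch of data as a stream passing through value-adding nodes (all names invented):

```python
def source(records):
    yield from records                     # data flows in ...

def price(stream):
    for claim in stream:
        claim["reserve"] = round(claim["amount"] * 1.1, 2)  # ... a node adds value ...
        yield claim

def route(stream):
    for claim in stream:
        claim["queue"] = "fast" if claim["amount"] < 1000 else "review"
        yield claim                        # ... and the data flows on.

claims = [{"id": 1, "amount": 400}, {"id": 2, "amount": 9000}]
for claim in route(price(source(claims))):
    print(claim)
```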
Why does this matter? Because, according to traditional thinking, issues like data standards, data quality, and data management are regarded as matters of housekeeping. Supposedly, looking after data properly and making it maximally usable are nice-to-haves that must be justified.
But when you see that data mostly flows, and that it's flow that generates value, these data issues take on greater prominence. Your business absolutely relies on data. So your data had better be standardized, reliable, and available.
The Obashi methodology puts flow at the centre of its approach. The founders of the methodology stress that business has always been about flow – from the water needed to drive the first factories through steam, electricity, oil, components, and now data. It's a shift in attitude that can make a lot of difference to how we view – and improve – the enterprise.
New research suggests over half of airline travelers want to use mobile solutions for flight information, baggage status, and airport directions, and that most airlines and airports will be offering such services by 2016.
What impresses me about these findings is how unimpressive they are. I'd have thought everyone with a mobile device (i.e. pretty much everyone who gets on a plane) would want to use these services. Like, 99.9%. Also, I'd have thought every airline and airport would have arranged to offer such services by about two years ago.
Of course, the reason full coverage still lies in the future is (mainly) the lack of real-time data. Or, in many cases, the lack of easily accessible, standardized data. For example, data about your baggage exists right now – otherwise they wouldn't be able to flash it up on the screens and the carousel displays. Getting the data to your mobile device is another matter. For other vital data streams – such as how long the lines are at check-in – you've got to have a human checking and reporting regularly (although you could probably calculate good estimates from CCTV feeds).
All of these solutions cost money to develop. Can the air transportation industry save time and money by pursuing standards? I hope so. Smart Thinking
I often see “enforce data standards” included in lists of actions for improving data management. It is also a duty that occurs in many job specifications. But I submit that “enforcing data standards” isn't something that one person, or one team, can do. It's not a duty that can be assigned to a few. It's a behavior that has to be manifested by the many.
Imagine someone advising you to “enforce use of auditing standards”. Good advice, right? You wouldn't then give someone the responsibility of enforcement, and sit back. What you do is design the organization's processes so that standard audit is an inevitable outcome of the actions taken by your staff.
It's unrealistic to treat data standards in a different way. If we let non-standardized processes and systems emerge through our project development processes, then this is a management issue that needs to be fixed. It's not something that an “enforcement team” can fix, however brilliant, articulate, or persuasive they are. Likewise, if we fail to use data standards as one of the criteria for deciding on continued investment in portfolio applications, no amount of enforcement can make up for the error.
This is not to say we don't need people to take the lead in standards. Sure, we need people to major in standards, just as we need people to major in audit. They should be there to help, not to punish. And certainly not to try to create an outcome which is systemically disabled by the enterprise's decision making machinery. Appreciation of standards, and care for their use, has to be baked in to the enterprise.
The Data Transparency Coalition says: “Federal data should be published online in electronic formats that make the data easy to search, sort and download. When separate federal agencies publish similar data, they should use standardized formats so that government-wide searches are possible.”
Absolutely. But we could go farther. Data standards are the minimum requirement for government to function. Strong standards make for a small state. Bringing data to the citizen must be a basic function of our democracy. SD Times
As you will hear in my opening remarks from last week's ACORD LOMA Forum, we are bringing the partnership to a close. ACORD began the conference in 1993 and we partnered with LOMA for eleven years. We plan to revert to an ACORD Conference in 2015, merging content from ACORD-LOMA into the ACORD Implementation Forum. (Play the video above and you'll see.) Anyhow, I just wanted to say publicly that it has been a real pleasure working with LOMA, and we plan to open new areas of cooperation in the future. (Picture with Jeff Hasty)
When governments ask for passenger data, it should conform to the standards which they agreed to through ICAO. The non-standard requirements of some governments should be eliminated as they complicate the system with no benefit to security.
We must modernize the collection of data. As airlines are transmitting data electronically, it is time to do away with the many paper forms that airlines, passengers and shippers are required to submit. And if we are all on the same page with the elements and format of the data, governments should create a single harmonized window through which the data can be sent.
And finally, governments should explore how passenger data can improve the effectiveness and efficiency of border controls. We should be able to measure the improvement it achieves.
These are timely and sensible recommendations. Notice how they can be applied in other situations. For example, why do some business units seek to use non-standard data formats – and what threats does this additional complexity introduce to the business? Again, Tyler's final point underlines the need to “close the loop” with data. The data we collect should be used to improve the business it serves. Tony Tyler
John Owens recently diagnosed and labeled some really important dysfunctional behaviors in data management. His description of how organizations get data wrong is funny and frightening at the same time. Owens believes many of our data problems stem from the fact that technical teams create bad data models, and business teams create workarounds so they can address the issues embodied in their KPIs.
This is a good characterization of what often happens. I'd add that the original bad data model is often a historical legacy, and that many of the workarounds stem from the needs of external partners, customers and regulators – needs that are often excluded from original data analyses.
Owens' cure lies in correct data modeling, deriving data entities from business functions. Good analysts always did this, of course. But perhaps the gradual disappearance of “systems analyst” as a role in its own right, and the prevailing emphasis on mixed “analyst/programmer” roles, have narrowed the professional perspective. With so much development work now comprising package implementation, it's not surprising the view has narrowed.
I believe industry data standards can also correct this problem, especially where those standards are accompanied by business models which encapsulate the functions driving the data. With standards, developers begin with a validated, proven-in-use, widely supported base set – a massive improvement on using “the old system” or whatever boxes we can draw on a whiteboard. Hub Designs
This article will probably only appeal to people who are really, really into data standards – which is a pity, because it's about an issue which potentially affects us all. Mandi Bishop examines the current conversion work going on in healthcare to standardize on SNOMED codes rather than ICD-9. Sounds like a boring technical issue, right? Bishop says dealing with the conversion has been left to technologists, who have developed handy click-and-assign conversion tools for hard-pressed data administrators.
The problem is, SNOMED and ICD-9 codes were developed for different purposes and they take different views of patients and their conditions. SNOMED was designed to encode a running problem list for a patient – which evolved from a pen-and-paper list of issues that need monitoring. ICD-9 was designed to capture billable ailments. So, while ICD-9 is happy with “osteoarthrosis”, SNOMED has over 20 concepts that map to this single code.
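The shape of the problem is easy to show (the mapping below is invented for illustration – real SNOMED uses numeric concept IDs – but the one-to-many fan-out is the documented issue):

```python
# One billing code fans out into many clinical concepts.
icd9_to_snomed = {
    "715.9": [                      # ICD-9: osteoarthrosis, unspecified
        "osteoarthritis of knee",
        "osteoarthritis of hip",
        "erosive osteoarthritis",
        # ... reportedly over 20 candidate concepts in all
    ],
}

# A click-and-assign tool must pick exactly one -- a clinical
# judgement made, silently, by a data administrator.
chosen = icd9_to_snomed["715.9"][0]
print(chosen)
```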
It looks to me as if the business – that is, the clinicians – should have been driving this development. Describing symptoms, signs, and illnesses is notoriously difficult. It's quite hard in prose, let alone codes. The different usages of patient data by different actors – in this case doctors, patients, insurers – must be taken into account in any data standard. HL7 Standards
Here is one focal point from an in-depth post on data and the pharmaceutical industry by Forrester analyst Skip Snow: “The recent changes in American patent laws from first to invent to first to file is driving a need to more coherently manage the data associated with creation of intellectual property.”
This is an example of how a small regulatory change can signal a big business change. “First to invent” is debatable; “first to file” is verifiable. The speed and certainty of a company's path from discovery to filing will become absolutely vital to business success.
With the migration of business online, “first to file” is also the make-or-break concern for many service organizations. For insurers addressing novel markets with innovative products, getting from invention to website is the crucial path. Snow notes that data standards play a key role in another pharma focal point, namely the move from looking at molecules to treating patients holistically. But standards can also help to accelerate the “to file” path, and make the path replicable across different areas of the business. Forrester
Over in the world of securities trading, Waters Technology runs a report on a talk by David Saul, Chief Scientist at State Street. Saul was presenting on semantic technology, which is all about exploiting unstructured data alongside structured data. He has a four point plan which starts and ends with standards: “Step one is to implement data governance with data discovery and definition processes throughout your enterprise. […] Finally, take a proactive role in the industry by making a plan to influence relevant semantic data standards.”
If you need to convince someone that semantic technology (or linked data) isn't a neat way of avoiding data standards, then Saul's four point plan is one place to start. Meaning never comes for free. You can't avoid defining your data. You either waste time and effort working up your own definitions, or do the smart thing and use the available industry standards.
Saul's call for participation in standards bodies is well made. He urges: "Standards can help you but most of the standards organizations are volunteer organizations […] They're only as good as the contributions that their members make. I urge you, if you're not involved with a standards organization, make sure that you're spending time, whether it's with the EDM council or Object Management Group. You'll get more out of it than you put into it." WatersTechnology
According to Shaun Abrahamson, smart cities are like standards-based applications: “Cities increasingly look to one another to understand what policies are working. Like data standards, similar policies across more cities mean that startups do not have to adapt product and services to unique situations in each city.”
As the world becomes more urbanized, new cities will be generated using tried and tested formulas from existing cities. City developers will need good reasons not to reuse these emergent standards. Standardization – with appropriate localization – looks like the only economic way to manage the growth in urbanization.
Data standards are, of course, not just like policies – they are policies. The word “data” fools people into thinking standards are a technical issue. In reality, data standards ensure we don't waste time and money reinventing the wheel, ensure we can deal with our business partners effectively, and underwrite our ability to grow sustainably. GIGAOM
Ever since the first mouse, the user experience has kept getting better and better. Now we expect to casually swipe at our smart phones and see the latest stock prices or catch up with friends. We don't think of ourselves as operating computers. We're just taking care of business.
In our roles as consumers of information, we are fabulously well served. In our roles as producers and exploiters of information, we are less well served. Yes, we have keyboards, we have cut and paste, we have microphones and cameras, we have email and text and Dropbox and social media – lots of ways of capturing and transmitting data. But making sense of data – pulling it together, extracting intelligence from it, converting it into information, bringing it to bear on our opportunities – we're still in the dark ages.
When bringing together numerical data, too many of us have to play with spreadsheets. When bringing together textual data, we reach for the scissors. When reviewing masses of video or audio data – well, we're waiting for smart programs to do that for us.
There's a creativity gap in IT. The effort and dedication that's gone into the user experience needs to be matched in the field of knowledge capability. This is what we regularly refer to as “the back end”. In this case, the back end is where the brain is.
I'm all for usability. Great usability makes technology a part of our lives and dramatically increases the range of things we can do. But usability isn't just about the user interface. It's also about the data landscape. By linking our data together better, and organizing it according to common standards that give every item a solid, stable meaning, we can put data to work as naturally – and profitably – as we manipulate our windows on the world.
I like a well crafted and compelling mission statement. This one comes from RESO, the Real Estate Standards Organization: “An environment for the development and implementation of data standards and processes that facilitate innovation, insure portability, eliminate redundancies and obtain maximum efficiencies for all parties participating in the real estate transaction.”
Will Swann notes the strong data standards stance of the Department of Defense, but reports that “it is feared that the complexity of the data is being over simplified and that some records are being forced to fit within the standardization mechanisms”.
Forcing data to fit a standard could mean many things. It could mean taking a value from some record and assigning it to the wrong attribute. It could mean truncating or breaking up items to fit unsuitable formats. It could mean corrupting a classification scheme.
I would doubt that any of these strategies would pass the DoD's stringent quality tests for development and integration. However, I can well imagine that unstructured data is being copied as-is into new, standardized formats. Natural language notes attached to forms are like information icebergs. In the rush to embrace big data and analytics, little attention has been paid to the opportunity to create more meaningful information out of this class of unstructured data. Standards in DOD
“RETS [Real Estate Transaction Standard] has been under development since 1999. It isn’t even close to 'done' because for most of those years no one was managing the process, and RETS development was driven almost entirely by the part-time effort of volunteers such as myself. It is only in recent years that RESO [Real Estate Standards Organization] was formed and professional project management was put in place to manage the volunteer effort. This has radically accelerated RETS standards development; if the industry puts more funding into RESO, it may be possible to hire more staff to make the process go even faster.”
In fact, fifteen years is good going for a data standard. Call it a soft launch. It takes time for people to understand the issues and to see how standards are the answer to their problems. RETS is getting implemented although Cohen stresses that customers need to press vendors to adopt standards. REALUOSO
The use of Legal Entity Identifiers (LEIs) is growing strongly in the financial sector. By being able to report on entities as well as instruments, players and regulators get a proper picture of where assets and liabilities are held. This is a major step toward curing the fragmentation that partly led to the credit crunch and its aftermath.
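As it happens, the LEI itself is a nicely simple standard: under ISO 17442 it's a 20-character alphanumeric code whose last two characters are check digits. A minimal validity sketch, on my understanding of the MOD 97-10 check (verify against the spec before relying on it):

```python
def lei_checksum_ok(lei: str) -> bool:
    """Check an ISO 17442 LEI's MOD 97-10 check digits."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map digits to themselves and letters A=10 ... Z=35, then the
    # whole number must leave remainder 1 when divided by 97.
    as_digits = "".join(str(int(ch, 36)) for ch in lei.upper())
    return int(as_digits) % 97 == 1
```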
However, LEIs are being implemented in a distributed way. There is, as yet, no central organization to manage them. So it's possible that LEIs will not be truly global, and that they may become outdated.
The debate seems to be not about whether there needs to be such a central organization, but about when it will be required, and how extensive its powers should be. My instinctive answer would be to set it up sooner rather than later, and give it only the absolute minimum authority it needs to do its job.
The world of finance is global as well as local. So it needs global standards. However, the standard for an entity identifier does not need to be complex. It just needs everybody's support. Any central coordinating body should be lightweight, but led and staffed by respected people. A TEAM
One of the paradoxes about data standards is that we need to make a lot of noise about standards to get them adopted – yet the role of standards in business is quite unobtrusive, and even invisible. This is because standards have a unique feature that makes them different from other techniques, principles, solutions, and methodologies. The unique thing about standards is that they correct the habits of the organization. Standards are systemic. It's as if the organization could swallow a pill for good behavior.
Standards encapsulate the best practice of a business community. But standards are not just descriptive. They are also prescriptive. Standards can be applied to systems and processes to ensure business operations are performed in the best possible way every time, everywhere – by default. So best practice becomes basic practice.
Having standards throughout the enterprise guarantees competence, currency, and connectivity for all your information management and exploitation activities. And since information is the lifeblood of the enterprise, standards are an essential guarantor of corporate health.
With this sound basis for operation and collaboration, the organization can reach beyond “business as usual” to produce real excellence. Standards are a sure foundation for leadership and innovation. At the same time, they reduce costs across the board by removing duplication, delay, confusion, convoluted systems, and support for non-standard solutions.
Writing in Forbes magazine, Hudson Hollister, Executive Director of the Data Transparency Coalition, gives an enthusiastic and rousing welcome to the unanimous passing of the DATA Act by the Senate: “Without common data standards across all government spending, analyses of cross-agency spending trends require endless conversions of apples to oranges. For a nation whose tech industry leads the world, there is no reason to allow this antiquated system to persist.”
Hollister is a lucid and cogent advocate of data standards. But what's notable about this article is its location – in a mass market, newsstand magazine title. This is more proof – if more proof is needed – that business really does get it. Data standards are a business issue. I believe access to meaningful, usable data is as vital as, and maybe more vital than, access to capital, skills, or physical infrastructure.
The importance of data to our economy and society is getting wider coverage thanks to the DATA Act. The Washington Post's report on the Senate vote is headed: “The DATA Act just passed the Senate. Here’s why that matters.” Forbes
The standards mindset includes a distinct attitude to innovation. Let me try to summarize this here. Innovations are often pitched as new-to-the-world. But an innovation only needs to be viable, and beneficial to the business. The substance of an innovative action may be very old – tried and tested, copied from elsewhere. So, when thinking from a standards viewpoint, we assess innovations by looking for a small difference in detail that produces a big difference in performance. For example, the inventor of the Sony Walkman took existing technology and twisted it for a new usage scenario. Basically, a Walkman was a tape recorder that didn't record and didn't have any speakers. The innovation lay in rethinking the existing technology, and then applying engineering skills to produce the required small form factor.
The most powerful innovations are those which rely 99% on existing assets. This is as true for the most outlandish skyscraper as for the most creative insurance product. Each uses a range of standardized, trusted components – plus some novelty. The standards mindset looks for maximum leverage of existing productized entities, minimum invention, and a credibly defined opportunity. I find the standards mindset helps people evaluate new ideas rationally while insulating them from the ballyhoo of passing bandwagons.
I believe the standards mindset leads to faster and more certain evaluation of innovation opportunities, and more confident implementation. Standards thinking is smart thinking. It's about using not just your own brain, but the experience and insight of the entire community behind the standard.
Leaders in data standards are usually both thoughtful and practical people. This tends to be a rare mix – not because the majority of people aren't thoughtful and practical, but because organizations tend to push people to one extreme or the other. You know, you're either a thinker or a doer – when, with information at the heart of every business, it's obvious we all need to be both.
Alex R Coley is the data standards guy at the UK's Department for Environment, Food and Rural Affairs (Defra). He's thoughtful and practical, in spades, going by his blog. He has a couple of slogans: “Data for a reason” and “What before how”. I like both of these.
Alex recently asked why we get over-attached to software, even though we know that doing so tends to create silos that we'll then labor hard to break through. He believes we get particularly attached to software which bolsters our self-image as experts in our area.
I think there's a lot in this. It makes me wonder whether there might be some sense in limiting innovation in the user interface while continuing to press for standardization of data and process under the hood.
Has there been a genuinely worthwhile innovation in user interfaces recently? The most radical thing I can think of is that icons showing floppy disks are slowly being replaced by ones that show downward-pointing arrows. How's this for a pitch: Our new release looks and feels exactly like the last one! But it uses data standards!
What does it cost an organization to go without standards? First, there are unique commitment costs. If an organization doesn't use an available standard, then it has to create, maintain, and support its own formats. It has to document, teach, and police these internal protocols. It will need to source and protect the resources and skills required to keep the non-standard approach afloat. All of these costs will persist, and swell, for the duration of the commitment.
Second, there are lost opportunity costs. Inability to trade with partners, share information across organizational boundaries, and innovate with certainty and speed will hold the business back and cause measurable failures. Competitiveness will fall and the organization's attractiveness to potential employees will wane. The organization will be deemed to have isolated itself.
Third, there are knowledge deficits. Without standards, meaningful information is hard and often impossible to capture, collate, and act upon. Process fragmentation is deeply entrenched and the business mission becomes confused. An organization without standards rapidly becomes an organization without knowledge, unable even to recognize the decisions it needs to take, let alone take them in an informed and timely manner.
What makes a data standard fit for purpose? The answer is: Recognition. I don't mean recognition in the sense of recognizing a friend in a crowd. I mean the kind of recognition that countries do when they agree to have talks with each other. Or when the chair of a meeting recognizes one of the questioners from the floor.
If something is “the recognized leader in its field” then it has the practical approval of most people who are in a position to make an informed judgement. A medical researcher comments in a paper on data standards: “We argue that it does not matter which data standards are chosen as long as well-recognized data standards are used.” At first, this might look strange. Isn't it better to have demonstrably right standards, rather than just the standards other people use?
When it comes to standards, what gets used is what's right. There is no way to prove correctness except by repeated successful application. There's no formal proof for data standards, like there might be for a math problem. And this is because standards solve human problems – for humans. Translational Medicine
If you know someone who could benefit from an introduction to data, School of Data is a great place to send them. The School has free modules written by knowledgeable folks, designed to be understandable by anybody. School of Data
Gregory A. Maciag: The Business Information Revolution: Making the Case for ACORD Standards
This book was the end result of my writing monthly columns for ten years.