The future of CRM application software – today’s tech can rebaseline the norm

CRM applications used by the frontline have been around for 20-30 years. My first consulting job was designing a CRM portal for wealth management advisors distributed around the country. Technically, it was web based, which was a bit of a reach at the time, but it was highly innovative and essentially had all the moving parts you see in CRM applications today. Over time, I went on to design and launch many more CRM applications covering a broad range of areas, some of which won awards or were highly placed. CRM apps cover a wide range of touchpoints and usages, and my focus here is on those CRM apps used by the frontline when they engage with the customer.

The world of CRM apps has not changed much. Today’s CRM apps are slicker, more integrated and easier to program. But fundamentally, they are still hard to use, hard to enforce a process with, and they generally try to force you to enter structured data, all for the explicit purpose of using that data on the backend.

In other words, the way you interact or want to interact with a customer–a fluid dance of conversations and touchpoints–comes to a jarring halt when you have to type your customer “data” into a relatively fixed, confining CRM application on your screen. Even the marketing automation space has learned that it screwed up as it realized that email campaigns have become old school and the nuances of social media marketing and messaging are the new black. After all, a growing majority of people today use email less than the previous generation, significantly less.

What is the future?

The future is not a narrow list of checkboxes, pick lists, small text boxes or small fields that each capture one specific concept, like a first name.

Instead the future is fluid and free flowing, much like many of the newer collaboration tools just now gaining prominence in small companies and now larger companies. It’s more about “notes” and small snippets of information versus structured screens. It’s more about searching different locations for data about customers and not requiring that all information be managed in a single tool. It’s about automating the interactions so that the right information is available to personalize a touchpoint.

Evidence for this model abounds:

  • CRM applications now have “chatter” or “posts” that capture a stream of unstructured notes and objects like pictures or audio clips.
  • Applications like Slack show that collaboration and documentation are easier when they are fluid, in context and completely searchable. Trello is the same way.
  • Many CRM applications capture only a few structured fields; most of the complexity is really around trying to capture additional customer information, which is where the applications start becoming unwieldy.
  • Most CRM software tries to tie together a 360 degree view of the customer using various ad-hoc methods of integrating with other applications.  They shoehorn that “app’s” data into the CRM application to get a 360 degree view of a customer. These integration costs are often the largest costs in a CRM project.
  • CRM has started to rely on data mining and machine learning algorithms to help the advisor/rep become more productive with their time while also personalizing communication to the customer.
  • CRM automation is increasing as bots and other automation techniques become more prevalent…for some products and channels, customers prefer automation.

Now CRM is more than just capturing information about customers, it’s also about servicing them and using information, again in context, to order their products, resolve their issues or try to understand their behavior. Getting information from other applications into the context “flow” has proven to be very tricky.

It’s true that some data, like an order, is highly structured and needs proper sequencing to support the supply chain; that’s fair. But a lot of CRM data does not need the same amount of structure. When interacting with a company’s rep or an automated system, the needs are much different. CRM apps do need to digest data of different media types and tell you what’s important. Or, at the very least, sort through the data and summarize it for you.

In other words, the future of CRM is really more like an instant messaging program like Slack, a free-form note taking application like OneNote or a collaborative management tool like Trello than an application framework like popular CRM platforms today. Think tweets and hashtags and AI driving data record enrichment.

It’s not about checkboxes anymore. Sales people do not really like checkboxes. Text mining, or unstructured analysis–whatever you want to call it–is mature enough to sort through the data and find postal addresses, email addresses, phone numbers and linkage information to connect all the dots and prepare the data for analytical use. Network analysis is mature enough to create a graph of contacts, with context, from your email and notes. This crystal ball thinking is true for both B2C and B2B, although B2B has regulatory issues that suggest it does require some additional “structure.” In fact, these techniques are already in play in extremely advanced CRM scenarios such as Know Your Customer in the AML/BSA space.
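To make the text mining idea concrete, here is a minimal sketch of pulling structured contact data out of a free-form note. The note text and the regular expressions are simplified illustrations of the technique, not production-grade extractors.

```python
import re

# Hypothetical free-form CRM note; patterns below are deliberately simple.
note = ("Met Jane Doe at the Chicago office. Follow up at "
        "jane.doe@example.com or 312-555-0148 next Tuesday.")

email_re = re.compile(r"[\w.+-]+@[\w-]+\.\w+")
phone_re = re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b")

def enrich(note_text):
    """Turn an unstructured note into structured fields for analytical use."""
    return {
        "emails": email_re.findall(note_text),
        "phones": phone_re.findall(note_text),
    }

record = enrich(note)
```

A real pipeline would add address parsing and entity resolution on top, but the point is the same: the rep types a sentence, and the structure is recovered after the fact rather than forced up front.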

A lot of what passes today for CRM software is just a jumble of straitjackets that are unneeded and run counter to how people communicate, create information and collaborate today.

Branding, advertising and social media

There were two articles this week/month on social media advertising that did not seem to overlap per se but are related.

The first is in the HBR, March 2016 issue, titled “Branding in the Age of Social Media.” (here) This article suggests that companies have spent billions trying to build out their brands using social media, but most of the money and effort has been a waste. The basic idea is that branded content and sponsorships used to work because there were limited channels of distribution for the content, and therefore most consumers had limited choices and had to watch what was shoved into those channels.

Today, it’s a bit different. The multitude of channels means that consumers can filter out ads, shape their own customized content flows and create their own flow of entertainment content–much of it created by their friends. Rather suddenly, brands could no longer command the audience. The article mentions that most heavily branded companies such as Coca-Cola command less viewership than two guys sitting on a couch narrating video games (“e-sports”). Now, brands must fit into the flow of either “amplified subcultures” (groups of people with more narrow interests) or “art worlds” where new creative breakthroughs occur. Either way, you have to fit in via cultural branding, where you align the brand around the culture of people in those two areas. So the brand can be there, but only in the context of, say, the subculture of people who do not like smelly socks that come from running 10 miles a day. You have to create a story about smelly socks and position your laundry detergent as part of addressing the smelly socks problem (I made up the smelly socks example).

You essentially align the product/brand around a more specific theme that resonates with the target audience. Because the specific themes are more narrow, the amount of creative customization increases.

This is not a new concept. The article is really just saying that you have to create content about your brand/product that aligns with your target audience and is delivered to them through the “channels” that they watch.

I was also scanning Bloomberg Businessweek and their article “If You Don’t Know It By Now You’ll Never Make Millions on Snapchat.” (here) It described the Snapchat phenomenon, with its rapid rise, as well as the challenge many similar companies have in maintaining their user volumes. The biggest issue is that they need to generate revenue, and Snapchat is considered “expensive” advertising with little insight into “returns.” One of the strategies Snapchat has taken is to focus their sales time on helping customers create stories to fit into their Discover channels and Snapchat’s model of perishable content. Still, a slightly talented musician posting just his daily musings and activities garners more views than all the biggest networks combined, daily. Ouch!

But it is just another lesson in what we already knew. Find the audience you want to reach, find out where their eyes are, especially now that they have more choices about how and where they engage, and tailor your content with a message and delivery that will engage them to watch, take action or whatever. Segment, segment, segment…

That’s about it. So yes, branding (and really just general advertising) has changed. It has to be more clever/entertaining, more thoughtful and more tailored to a smaller group. You cannot rely on a famous name alone to push your product, and you cannot count on blanket reach to communicate.

So there is not really a lot of new news here, just a recognition that we as companies and marketers have to be more clever because the easy ways no longer work and it’s possible to get a huge ramp (given the viewing numbers) if we put that cleverness to work.

Perhaps the real news is that some people in their current jobs need to become more clever quickly or find some clever people to help them with their branding/marketing. What is wonderful at least to me, is that the volumes of eyeballs in some of these channels makes them worth paying attention to.

Got it.

Check.

Roll credits.

Platform Scale

Sangeet Paul Choudary has put out a book about how platforms, not pipes, are the new business model. The book is very inspiring, so I recommend reading it. There are not any new ideas in it, but they are packaged together very nicely. It’s very much another “explaining things” book, and for the lens that it wants you to use, I think it does a good job.

The key thought behind the book is actually fairly simple:

Be a middleman. Reduce your costs as a middleman to gain share. Shift cost and risk out to everyone else, as much as possible. Allow companies to build on your platform. Reducing your middleman costs can gain you share, and the best way to do that is to be digital. If you only make a small slice of money at every interaction, you need a lot of interactions, so don’t forget the “make it big” part.

That’s really about it. There are not a lot of examples with deep insight in the book, and he avoids most levels of strategic thinking entirely. The book also fails to connect what is going on today to the massive “platforms” built in the past few decades, which are not necessarily fully digital like the examples reused in the book. The book spends most of its pages explaining that if you can reduce transaction costs and get scale, the world is your oyster. Of course, this is only one model of succeeding in business, and actually not always the most interesting or sustainable one.

But that’s OK. Go find your “unit,” reduce that friction and make a billion. It’s a good read.

Enjoy!

Do sanctions work? Not sure, but they will keep getting more complex

After Russia and Ukraine ran into some issues a few months back, the US gathered international support and imposed sanctions.

Most people think that sanctions sound like a good idea. But do they work?

Whether sanctions work is a deeply controversial topic. You can view sanctions through many different lenses. I will not be able to answer that question in this blog. It is interesting to note that the sanctions against Russia over the Ukraine situation are some of the most complex in history. I think the trend will continue. Here’s why.

Previously, sanctions would be imposed on a country that was doing things the sanctioning entity did not want to happen. Country-wide sanctions are fairly easy to understand and implement, for example, sanctions against Iran for nuclear enrichment. Sanctions in the past could be leveled at an entire country or a category of trade, e.g. steel or high performance computers. But they have to be balanced. In the case of Russia and Ukraine, the EU obtains significant amounts of energy from Russia. Sanctions against the energy sector would hurt both the EU and Russia.

Sanctions today often go against individuals. The central idea is to target individuals who have money at stake. OFAC publishes a list of sanctioned individuals and updates it regularly. You are not allowed to do business with anyone on the list, that is, you should not conduct financial transactions of any type with that individual (or company).

The new Russian sanctions target certain individuals and a few Russian banks (not all of them), and allow certain forms of transactions. For example, you cannot transact in a loan or debenture with longer than 90 days maturity, or in new issues. Instead of blanket sanctions, it’s a combination of attributes that determines whether a financial transaction can be made.
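The attribute-combination idea can be sketched in a few lines. This is an illustrative model only: the entity name is invented, and the rules simply mirror the 90-day debt-maturity example above, not the actual regulatory text.

```python
# Hypothetical sectoral sanctions list; a real screen would use OFAC data.
SECTORAL_LIST = {"EXAMPLE-BANK-1"}
MAX_DEBT_MATURITY_DAYS = 90

def transaction_allowed(counterparty, instrument, maturity_days=None, new_issue=False):
    """Blanket bans are a per-party yes/no; sectoral rules combine attributes."""
    if counterparty not in SECTORAL_LIST:
        return True                      # not designated: no restriction here
    if instrument in {"loan", "debenture"}:
        if new_issue:
            return False                 # new issues prohibited
        if maturity_days is not None and maturity_days > MAX_DEBT_MATURITY_DAYS:
            return False                 # long-dated debt prohibited
    return True                          # other business remains permitted

short_ok = transaction_allowed("EXAMPLE-BANK-1", "loan", maturity_days=30)
long_bad = transaction_allowed("EXAMPLE-BANK-1", "loan", maturity_days=180)
```

The point of the sketch is the shape of the logic: the decision depends on counterparty, instrument type and maturity together, which is exactly what makes these sanctions harder to implement than a simple list check.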

Why are the Russian sanctions not a blanket “no business” set of sanctions?

By carefully targeting (think targeted marketing) the influencers of national policy, the sanctions hurt the average citizen a bit less, perhaps biting them, but not so much that the average citizen turns against the sanctioning entity. Biting into the influencers and others at the top is part of a newer model of making individuals feel the pain. This approach is also being used in the anti-money laundering (AML) and regulatory space in the US to drive change in the financial services industry, e.g. holding a chief compliance officer accountable if a bad AML situation develops.

So given the philosophical change, as well as the new information-based tools that allow governments to be more targeted, sanctions will keep getting more complex.

Oso Mudslides and BigData

There was much ado about Google’s bad bigdata flu forecasts recently in the news. Google had tried to forecast flu rates in the US based on search data. That’s a hard thing to forecast well, but doing better would have public benefits by giving public officials and others information to identify pro-active actions.

Let’s also think about other places where bigdata, in a non-corporate, non-figure-out-what-customers-will-buy-next way, could also help.

Let’s think about Oso, Washington (Oso landslide area on Google Maps).

Given my background in geophysics (and a bit of geology), you can look at Oso, Washington and think…yeah…that was a candidate for a mudslide. Using Google Earth, it’s easy to look at the pictures and see the line in the forest where the earth has given way over the years. It looks like the geology of the area is mostly sand, and it was mentioned it was glacier related. All this makes sense.

We also know that homeowner’s insurance tries to estimate the risk of a policy before it’s issued, and it’s safe to assume that the policies either did not cover mudslides or catastrophes of this nature for exactly this reason.

All of this is good hindsight. How do we do better?

It’s pretty clear from the aerial photography that the land across the river was ripe for a slide. The thin sandy line, the sparse vegetation and other visual aspects from Google Earth/Maps show that detail. It’s a classic geological situation. I’ll also bet the lithology of the area is sand, a lot of sand, and more sand, possibly on top of hard rock at the base.

So let’s propose that bigdata should help give homeowners a risk assessment of their house, which they can monitor over time and use to evaluate the potential devastation that could come from a future house purchase. Insurance costs alone should not prevent homeowners from assessing their risks. Even “alerts” from local government officials sometimes fall on deaf ears.

Here’s the setup:

  • Use Google Earth imagery to interpret the land along rivers, lakes and ocean fronts
  • Use geological studies. It’s little known that universities and the government have conducted extensive studies in most areas of the US, and we could, in theory, make that information more accessible and usable
  • Use aerial photography analysis to evaluate vegetation density and surface features
  • Use land data to understand the terrain e.g. gradients and funnels
  • Align the data with fault lines, historical analysis of events and other factors.
  • Calculate risk scores for each home or identify homes in an area of heightened risk.

Do this and repeat monthly for every home in the US at risk and create a report for homeowners to read.
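The fusion step above could start as something as simple as a weighted score per home. This is a toy sketch: the feature names, weights and values are all assumptions for illustration; a real model would be fit against historical slide data.

```python
# Illustrative hazard features, each normalized to 0-1; weights are made up.
WEIGHTS = {
    "slope_gradient":    0.35,  # steeper terrain, higher risk
    "sandy_lithology":   0.30,  # loose glacial sand vs. hard rock
    "sparse_vegetation": 0.20,  # bare scarps visible in imagery
    "historical_events": 0.15,  # prior slides nearby
}

def risk_score(features):
    """Weighted sum of normalized hazard features (0 = negligible, 1 = severe)."""
    return sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)

# A hypothetical home near a river bluff with an Oso-like profile:
home = {"slope_gradient": 0.9, "sandy_lithology": 1.0,
        "sparse_vegetation": 0.8, "historical_events": 0.7}
score = risk_score(home)
```

The hard part, of course, is not the arithmetic; it is producing those feature values from imagery, surveys and terrain data at national scale, which is exactly the data fusion challenge described above.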

Now that would be bigdata in action!

This is a really hard problem to solve, but if the bigdata “industry” wants to prove that it’s good at data fusion on a really hard problem that mixes an extremely complex and large amount of disparate data and has public benefit, this would be it.

Yanukovych, Money Laundering and a Probe: The Rise of Network Analytics

I have been working in the Anti-Money Laundering (AML) space for a while. Compared to healthcare or the more general Customer Relationship Management (CRM) space, the AML and Bank Secrecy Act (BSA) space is really the “shady” side of the customer–or at least it assumes that some customers are shady and tries to find them or prevent their actions. Some estimates suggest that BSA/AML (and fraud) regulations catch only 10-20% of the illicit dollar flow in the world, so we know that while regulators and prosecutors do catch some of the bad guys, a lot of dollars remain on the table.

Take the recent case of the Ukraine. It’s been reported that the Swiss are launching a money-laundering probe into ousted president Viktor Yanukovich and his son Oleksander. They think the money laundering could amount to tens of billions. All told, over 20 Ukrainians are listed as targets of the Swiss probe.

In BSA/AML, the Yanukovichs (father and son) are clearly Politically Exposed Persons (PEPs). And apparently the son had a company established that was doing quite well. That information usually leads to flags that raise the risk score of a customer at a bank. So an investigation and PEP indicators are all good things.

Officials estimate that $70 billion disappeared from the government almost overnight. Of course, Yanukovich WAS the president of Ukraine and he was on the run up until last week. But an investigation into money laundering on tens of billions that suddenly just happened?

Recently, I attended an ACAMS event in NYC. Both Benjamin Lawsky (regulator side) and Preet Bharara (prosecution side) spoke. One of their comments was that to have a real impact on money laundering, you have to create disincentives so that people do not break the law in the future. You can sue companies and people and levy fines. These create disincentives, and disincentives are the only scalable way to reduce money laundering–stop it before it starts. The ACAMS event was US based, but the ideas are valid everywhere. The Swiss have always had issues with shielding bad people’s money, but they are playing better than before.

But the real issue is that the conduits, the pathways, were already set up to make this happen. And most likely, many dollars had been siphoned off before, with the last $70 billion being the end of the train. So the focus needs to be on active monitoring of the conduits and the pathways, with the BSA/AML components being one part of monitoring those paths. After all, the BSA/AML regulations motivate a relatively narrow view of the “network” within an organization’s boundaries.

If we want to really crack down on the large-scale movement of funds, it will not be enough to have the financial institutions–which have limited views into corporations–use traditional BSA/AML and fraud techniques. A layer of network analysis is needed at the cross-bank level that goes beyond filing a suspicious activity report (SAR) or a currency transaction report (CTR). And this network analytical layer needs to be intensely and actively monitored at all times, not just during periods of prosecution. While the Fed uses the data sent back in a company’s SARs and CTRs (and other reports) and in theory acts at the larger network level, it is not clear that such limited sampling can produce a cohesive view. Today, social media companies (like Facebook) and shopping sites (like Amazon) collect an amazing amount of information at a detailed level. The NSA tried to collect just phone metadata and was pounced on. So the information available in the commercial world is vast; what the government receives is tiny.
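A tiny sketch of why the cross-bank layer matters: each bank sees only its own transfers, but pooling the edges exposes the full conduit. The account names and amounts below are invented for illustration.

```python
from collections import defaultdict, deque

# Each tuple is (source account, destination account, amount). No single
# bank's records contain the whole chain.
transfers = [
    ("shell_co_A", "bank1_acct", 10_000_000),   # visible to bank 1 only
    ("bank1_acct", "bank2_acct", 9_900_000),    # visible to banks 1 and 2
    ("bank2_acct", "offshore_X", 9_800_000),    # visible to bank 2 only
]

graph = defaultdict(list)
for src, dst, amount in transfers:
    graph[src].append(dst)

def reachable(start):
    """All accounts that funds starting at `start` can flow into (BFS)."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

path_ends = reachable("shell_co_A")
```

Only the pooled graph reveals that money placed at the shell company ends up offshore; that end-to-end view is exactly what per-institution SAR filings sample too sparsely to reconstruct.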

In other words, the beginnings of an analytical network are clearly present in the current regulations, but the intensity and breadth of the activity needs to match the scale of the problem so that the disincentives dramatically increase. And while it is very difficult to make this happen across borders, or even politically within the US, it’s pretty clear that until the “network analysis” either increases its “resolution” or another solution is found, large-scale money laundering will continue to thrive and most enforcement efforts will continually lag.

It’s a balancing act. Too much ongoing monitoring is politically anathema to some in the US and can be very costly. Too little, and the level of disincentives may not deter future crimes.

Pop!…another $70 billion just disappeared.

Should companies organize themselves like consultancies? If they do, they need to hire like them as well.

A recent HBR article (October 2013) mentioned that P&G and other companies are rethinking how they organize themselves. The basic idea is that instead of having fixed organizations, companies should organize themselves like consultancies–everything is a project, and you assemble/disassemble teams as needed to solve problems. There will still be some ongoing operations that require “flat” jobs–jobs that are more repetitive but still require knowledge workers.

The article raises the question of whether organizing into projects and flexible staff (like consultancies) is a good thing for companies that are heavily knowledge-worker based. Part of the proof that knowledge work is becoming more dominant comes from looking at the decreasing COGS and increasing SG&A lines on financial statements. COGS indicates decreasing amounts of “blue-collar” work over time, while SG&A is a good proxy for white-collar, knowledge-worker type jobs.

So is it?

My view is that it is not so cut and dried. Consultancies create large labor pools at the practice-area level that generally have a specific industry expertise. Generally, there are also horizontal practices for people who specialize in skills that cut across industries. Typically, these practice areas are large enough that the random noise (!) of projects starting and stopping creates a consistent utilization curve over time. And a management structure, for performing reviews and connecting with people, is still needed to ensure consultants feel like they have a home.

Another important aspect cited in the article is the creation of repeatable methodologies that consultants are trained on, so that knowledge can be codified instead of hoarded.

Consultancies are good, but not super great, at knowledge management and sharing deliverables so that practices that have proven themselves to work can be re-used in other projects or contexts.

Let’s look at companies:

  • Companies have people, often fairly substantial groups, that are focused on a horizontal area, e.g. finance, marketing, IT, customer service. Companies are often organized by product, which also forces them to be organized by industry, but there are many variations to this model.
  • Companies try to organize activities into projects. Not everything can be a project, e.g. ongoing operational support of different kinds. But companies do try to kick off efforts, set deadlines, integrate teams from different groups, etc.
  • Companies share deliverables from one project to another. Unlike consultancies, the pool of deliverables is often narrower because of corporate boundaries, and sharing within an industry is often not as robust as in a consultancy. Companies that hire talent from the outside can bring these elements in, however.
  • Groups share resources, although not as robustly as consultancies, across projects and groups. Companies are less robust at true sharing because inside companies, headcount is often a measure of power. At consultancies, revenue and margin are usually the primary metrics, but of course, these are only achieved through resources.

Companies today already employ many elements of what this model calls out. Most companies are not as robust as consultancies in some respects. But are these differences the primary reason why consultancies have shown good resilience in execution across different circumstances?

There is probably another aspect. Consultancies typically seek out and retain a large amount of quality talent. Companies, to varying degrees, do not always hire highly talented individuals. Their pay, performance management approach and culture do not attract the best talent in the marketplace.

While companies could improve certain areas of their capabilities, there was an entire part of the story that was missing in the HBR article–a focus on top talent across the entire company and not just for a few key roles.

ACO and HMO: HMO redo or something new?

I have covered this topic before, but I came across an article that stimulated my thinking again.

It has been said that ACOs are really the new HMOs. HMOs in the 1990s were an experiment to put “risk and reward in the same bucket.” Much like integrated capitation, the idea is to let those who save money, while still delivering great quality care, benefit from their innovations.

This was the thinking behind the Affordable Care Act, which seeks to re-align risk and reward. It also, possibly unfortunately, makes Provider power even more concentrated. Maybe that’s good, maybe that’s bad.

A recent analysis of healthcare costs as a % of GDP came out in the New England Journal of Medicine. One of the questions we want to answer is where healthcare costs will be a decade from now based on changes today. Typical projections run that in a decade or so, 20% of US GDP will be spent on healthcare (all private and public expenditures). This is based on projections from the last 2 years of data, which have shown lower healthcare growth rates than the past 20. But those 2 years of growth rates have been thoughtfully reviewed, and the conclusion is that they are not representative of the growth rates we are likely to see in the next decade or two.

This NEJM article, published May 22, 2013, The Gross Domestic Product and Health Care Spending (Victor Fuchs, PhD), suggests that the growth rate that should be used is probably the long-term growth rate, that recent changes in the growth rate are one-time events, and that using 2-year growth rates is typically a bad idea anyway. The article also describes how the growth rates were greatly reduced, cut in half, when HMOs came out. HMOs rationed care. It is generally thought that most people want “all you can drink” healthcare at “buffet” prices, and this is the reason that HMOs were given the boot by consumers. Fuchs thinks that if you use historical growth rates, then the share of GDP for healthcare grows to 30%. That’s huge.
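The mechanics behind those projections are just compound growth: if health spending outgrows GDP by some excess rate, its share of GDP climbs. The starting share (~17%) and the excess growth rates below are my own illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope projection of healthcare's share of GDP.
def project_share(start_share, excess_growth, years):
    """Share of GDP if health spending outgrows GDP by `excess_growth` per year."""
    share = start_share
    for _ in range(years):
        share *= (1 + excess_growth)
    return share

low  = project_share(0.17, 0.010, 15)   # recent, slower excess growth: ~20%
high = project_share(0.17, 0.025, 25)   # long-run historical excess growth: ~30%+
```

The sketch shows why the choice of growth rate dominates the debate: a 1.5 point difference in the assumed excess rate, compounded over decades, is the entire gap between the 20% and 30% scenarios.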

So if ACOs are really HMOs reborn, wouldn’t that be a good thing? It’s probably worth thinking it through a bit to see if such a top-level thought holds water. First, we’ll recognize that ACOs and the concentration of power into Providers (the Act gives enormous emphasis to hospitals), which possibly leads to verticalization, are not necessarily bad, at least in business circles. And combining risk and reward to get incentives right is also probably not a bad thing.

But there are other factors. We will also assume that Americans will not engage in healthier lifestyles, since changing people’s behaviors toward the healthy has not really worked, nor probably will ever work, without significant economic rewards. And we will assume that Americans want choice and do not like being told where to get their healthcare services (especially since care is so uneven).

If normal competition were at work, then we would expect that verticalization and centralizing risk & reward should all be good. We should expect to see declining prices and improving outcomes.

But when we look at the healthcare spectrum, we see some of the largest improvements in outcomes result from drugs and medical devices. We do not see large improvements in care based on “processes” inside of hospitals. While some hospitals do indeed work on Six Sigma-type continuous process improvements and show great results, these are inconsistently used and are not a source of large-scale productivity increases.

So we have not seen that hospitals are capable of being managed to reduce their costs and improve outcomes to any significant degree. In fact, most innovation in the hospital community is around centralization, getting big to have scale. But we need to ask whether hospitals becoming large lowers cost and improves outcomes, or whether it just allows more fixed cost (beds) to be spread out over a larger surface area, thereby reducing per-unit costs but not total costs. The ACO model’s assumption that a minimum of a few thousand patients is enough to be efficient is probably way off; some estimates suggest you need a million patients. Hospital systems are responding to this scale need and scaling up. As we have seen, though, a larger company is not the most efficient when it comes to innovation or lowering cost without significant forms of competition.

And that’s where the ACO model is probably not like the HMO model. The ACO model is encouraging super-regional entities to be formed that will reduce competition in a given service area rather than increase it. Unlike national Payers that look a bit more like Walmart, super-regional ACOs will be large, but not super large. They will not have competition in their area. And improving their productivity is a bit suspect (I hope they do improve their productivity, by the way). And hospital systems are fighting changes to allow clinics to become more prevalent, as well as changes allowing non-physicians to write prescriptions, because these draw power away from them.

It has been widely studied and reported that HMOs reduced choice as a trade-off to obtain the benefit of reduced costs. This reduced choice is not directly present in the ACO model, although both Payers and Providers want patients to stay in network, of course, and have been forming “lean” or “focused” networks for just this reason. So there are no large forces in the Act that strongly ensure ACOs will help manage and control consumer choice.

So on the surface, ACOs look like a good model, but they become questionable fairly quickly. You can place your bets on what will happen or wait it out. It is clear that it will take years to demonstrate whether ACOs are working, just as it did for HMOs–way after they were killed off.

There are actually ways to fix many of these issues by addressing the underlying problems directly. For example, creating more uniform outcomes by standardizing processes and the quality of practicing physicians may reduce the need for the ultimate “go anywhere” flexibility driven by a patient’s need to find quality care. We need to promote competition on a very broad scale across multiple geographies by changing laws. Reduce Provider power around Rx writing and get people into clinics and alternative care delivery centers. We can also modify reimbursement policies and centralize risk & reward so that investors (a term I use broadly here) receive a reward for taking risk and succeeding, unlike today, where they are penalized with lower payments–essentially creating a disincentive to invest.

All of these ideas would create a dramatic change in the cost curve over time without fundamentally altering the landscape. They would be a good start.

Ranking information, “winner take all” and the Heisenberg Uncertainty Principle

Does ranking information, who likes what, top-10 rankings, produce “winner take all” situations?

There is an old rule in strategy: there are no rules. While it is always nice to try to create underlying theories or rules for how the world works, science and mathematics are still too young to describe something this complex. Trying to apply a rule like the one in the title is probably some form of confirmation bias.

Having said that, there is evidence that this effect can happen, not as a rule to be followed, but as something that does occur. How could this happen?

Ranking information allows us, as people, to see what other people are doing. That is always interesting, seeing what others are doing, looking at, or thinking about. And by looking at what other people are looking at, there is a natural increase in “viewership” of that item. So the top-10 ranking, always entertaining of course, does create healthy follow-on “views.”

But “views” do not mean involvement or agreement. In other words, while ranking information and today’s internet make it easy to see what others are seeing, our act of observation actually contributes to the appearance of popularity. That apparent popularity then seems to drive others toward the winner, producing “winner take all.”

“Winner take all” can take many forms. It can mean that once the pile-on starts, a web site becomes very popular. This is often confused with the network effect. It can also mean that a song becomes popular because it’s played a lot, so more people like it, so it’s played even more, and so on. Of course this does not describe how the song became popular to begin with; perhaps people actually liked the song and it had favorable corporate support, and there is nothing wrong with that.

And this leads us to the uncertainty principle. The act of observation disturbs the thing we are trying to measure. The more scientific formulation has to do with the limits on observing position and momentum at the atomic level, but we’ll gloss over that formal definition.

The act of observing a top-10 list on the internet causes the top-10 list to become more popular. The act of listening to a song, amplified through internet communication channels, changes the popularity of the song. So it’s clear that, given internet technology, there is a potential feedback loop that resembles the uncertainty principle.
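This feedback loop can be sketched as a toy simulation (all numbers invented): if each new “view” goes to an item with probability proportional to its current view count, early leads compound into something that looks like “winner take all,” even when every item starts out identical.

```python
import random

def simulate_ranking_feedback(n_items=10, n_views=10_000, seed=42):
    """Toy 'rich get richer' loop: the probability that an item gets the
    next view is proportional to its current view count, mimicking how a
    top-10 list steers attention toward already-popular items."""
    rng = random.Random(seed)
    views = [1] * n_items  # every item starts with exactly one view
    for _ in range(n_views):
        # pick the next viewed item with probability proportional to views
        pick = rng.choices(range(n_items), weights=views, k=1)[0]
        views[pick] += 1
    return sorted(views, reverse=True)

counts = simulate_ranking_feedback()
# the leader typically ends up with far more views than the trailing
# items, purely from the feedback loop, not from any difference in quality
```

The point of the sketch is only that observation-driven ranking alone can manufacture a skewed outcome; it says nothing about whether the winner deserved it.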

Alright, that makes sense. But the world is probably a little more complex than this simple thought.

While the act of observing could make something more popular, that does not mean that the act of observing turns something unpopular into something popular. In other words, people are not fools. They like what they like. If something is on the top-10 list or comes from a band that has good corporate airtime support, that does not mean it is a bad song or a bad list. It does not mean that people would not like it if it did not play in that venue.

The internet is a powerful tool to help people quickly find what they want. The cost of finding a new web site or a new source of top-10 lists (or whatever) is fairly low, so there is no real inherent lock-in. Given the internet’s reach, the ability to rapidly escalate and de-escalate from “winner take all” to “has been” is fairly robust. It’s quite possible, in the spirit of making business rules for fun, that the internet produces a steady stream of “winner take all” events, and if there is a steady stream of “winner take all” events, then they are really just average events after all (regression to the mean). So with my fancy new rule, there are no “winner take all” events any more, just a large number of rapidly escalating and de-escalating average events; the frequency has simply been bumped up.
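The escalate/de-escalate claim can also be sketched as a toy simulation (the spike sizes and decay rate are invented): if every period a new item spikes while all existing items decay, the #1 spot changes hands constantly, so no single “winner take all” endures.

```python
import random

def simulate_turnover(n_periods=50, decay=0.7, seed=7):
    """Each period one new item spikes (escalates) while every existing
    item's popularity decays (de-escalates); track who holds the #1 spot.
    The spike range and decay rate are illustrative assumptions."""
    rng = random.Random(seed)
    popularity = {}
    leaders = []
    for t in range(n_periods):
        # everything already popular fades a bit each period
        popularity = {item: score * decay for item, score in popularity.items()}
        # one brand-new item arrives with a random initial spike
        popularity[t] = rng.uniform(50, 150)
        leaders.append(max(popularity, key=popularity.get))
    return leaders

leaders = simulate_turnover()
distinct_leaders = len(set(leaders))
# with fast decay the top spot turns over frequently: many distinct
# leaders rather than one durable winner
```

Under these assumptions the stream of “winners” is exactly what the paragraph describes: a high-frequency parade of short-lived average events.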

That’s okay as well.

Tempering our expectations for bigdata in healthcare

Expectations around bigdata’s impact on healthcare are leaping ahead of reality, even as some good thoughts are being expressed. Healthcare has already had significant amounts of analytics applied to it. The issue is not that larger sets of data are critical, but that sharing and integration of the data are the critical parts of better analysis. Bigdata does not necessarily solve these problems, although bigdata fever may help smash through these barriers. Over 15 Blues and most of the major nationals have already purchased data warehouse appliances and advanced systems to speed up analysis, so it’s not necessarily performance or scalability that is constraining advances built on data-driven approaches. And just using unstructured text in analytics will not create a leapfrog in better outcomes from data.

We really need to think about integration and access. More people performing analysis in clever ways will make a difference. And this means more people than just the few who can access detailed healthcare data, most of which is proprietary and will stay proprietary to the companies that collect it. Privacy and other issues prevent the widespread sharing of the granular data needed to truly perform analysis and get great results. It’s a journey.

This makes the PCORI announcements about yet another national data infrastructure (based on a distributed data model concept) and Obama’s directive to get more Medicare data into the world for innovation (see the 2013 Healthcare Datapalooza that just concluded in Washington DC) that much more interesting. PCORI is really building a closed network of detailed data using a common data model and distributed analysis, while CMS is being pushed to make datasets more available to entrepreneurs and innovators; the two are a bit opposite in terms of “access.”

There are innovative ideas out there; in fact, there is no end to them. Bigdata is actually a set of fairly old ideas that are suddenly becoming economical to implement. And there is a serious lack of useful datasets that are widely available. The CMS datasets are often heavily massaged prior to release in order to conform to HIPAA rules; essentially, you cannot obtain detailed data at an individual level, despite what you think you are getting, because just stripping the name and address off a claim form is not sufficient to satisfy HIPAA rules.
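As a rough sketch of the kind of “massaging” a claims dataset gets before release (field names here are invented, and real de-identification rules cover a longer list of identifiers than this toy shows): direct identifiers are dropped outright, while quasi-identifiers such as ZIP code and birth date are coarsened rather than released in full.

```python
# Invented field names; illustrative only, not a compliant implementation.
DIRECT_IDENTIFIERS = {"name", "address", "ssn", "phone"}

def massage_claim(claim: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers before release."""
    released = {k: v for k, v in claim.items() if k not in DIRECT_IDENTIFIERS}
    if "zip" in released:
        # keep only a 3-digit ZIP prefix instead of the full code
        released["zip"] = released["zip"][:3] + "00"
    if "birth_date" in released:
        # keep year only
        released["birth_date"] = released["birth_date"][:4]
    return released

claim = {"name": "Jane Doe", "address": "1 Main St", "zip": "60614",
         "birth_date": "1961-05-02", "diagnosis": "E11.9", "paid": 182.50}
released = massage_claim(claim)
# the released record keeps clinical and payment detail but loses the
# individual-level precision an analyst might have expected
```

This is why the released data feels less granular than advertised: the clinically interesting fields survive, but the linkable detail does not.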

So it’s clear that to get great results, you probably have to follow the PCORI model, but then analysis is restricted to just the few people who can access those datasets.

That’s not to say that bigdata does not have a lot to offer if patients are willing to opt in to programs that get their healthcare data out there. Companies using bigdata technology on their proprietary datasets can make a difference, and there are many useful ideas to economically go after using bigdata, many of which are fairly obvious and easy to prioritize. But there is not suddenly going to be a large community of people with new access to the granular data that could be, and often is, the source of innovation. Let’s face it: many healthcare companies have had advanced analytics and effectively no real budget constraints for many years, and will continue to. So the reason that analytics have not been created and deployed more widely than they are today is unrelated to technology.

If bigdata hype can help executives get moving and actually innovate (it’s difficult for executives in healthcare to innovate versus just react), then that’s a good thing, and building momentum will most likely be the largest stimulus to innovation overall. That’s why change management is key when using analytics in healthcare.