data lakes: old is new and no free lunch, rinse and repeat

I recently watched a few videos from the dremio-sponsored data lake conference: https://www.dremio.com/press-releases/introducing-subsurface-the-cloud-data-lake-conference/.

It’s a good collection of videos about a relatively new topic: data lakes, an architectural focal point for data management.

Some people think data lakes are new, especially vendors selling you data lake tools and consulting. The new hotness is “separating compute and storage,” although that has been going on for nearly four decades. And despite the hype, rumors suggest that data lakes find it hard to show and deliver ROI. There are many reasons this may be true. We should step back and look at data lakes: they are nothing new, but their implementations have changed.

Let’s start with a bit of history, around the late 80s and early 90s, when data warehouses roamed the earth. Data warehouses were hot until they weren’t.

Data warehouses were a universal answer to a variety of data management and organizational problems. Today, most people love to make the data warehouse the bogeyman. Data warehouse projects became widow-makers for IT managers. It was always unfair to ask IT managers to smooth over differences in priorities, delivery speeds, and data/analytical needs across divisions. Although my point of view is not widespread, after many years helping companies with their analytics, it’s clear to me that IT is the wrong place to produce a wide range of analytical products consumed by a wide range of users. Budgets for analytics should be borne by those who need it. A few “data products” can be consolidated for cost efficiency into a shared-service group like IT. Where there is a common need or a cost-control mandate, sure, IT may be an OK place to do these things, but in general, it is not and never will be. That’s just the way business works.

At least in my world, a data warehouse’s inputs and outputs were almost always provided to different data consumers; the data warehouse itself was not the only physical data asset. But this approach and point of view was not the standard design. Data warehouses became hard-to-use silos almost *by definition*. One client hired me to find out why a data warehouse had no users. The primary user said the IT group had turned off his access, and the warehouse did not have the data he needed. Case closed! Many IT managers wanted to control these files to control “one version of the truth,” but it is not efficient to force IT to own these business issues. You do need one particular place to go for a business measure, but it is not necessarily IT’s job to own and publish it.

By providing inputs and outputs from a data warehouse, a data warehouse became a “cache” of pre-computed values. Whether it was a database table, a cube, or another proprietary data structure, there was always a cache. It is usually too expensive to recompute a result from raw source data every time. Storage and compute may be cheap, but they are not free. Caching is not a technical issue; think economics. Caches are more convenient and less costly to access. Even in a cloud environment, there is a cost to recompute from the raw data. To build a cache, you have to specify what you want before you need it. Even with automatic caching, you need to be thoughtful. And in the cloud, incremental work is often not capitalizable.
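
To make the economics concrete, here is a minimal, hypothetical sketch of the pattern in Python. The data, sizes, and function names are all invented; the point is only that you pay the recompute cost once to build the cache, then serve cheap reads from it.

```python
# A toy version of the "cache of pre-computed values" idea: pay the full
# scan once to build the cache, then answer repeated queries from it.
import time

RAW_EVENTS = [
    {"region": "east" if i % 2 else "west", "amount": i % 7}
    for i in range(1_000_000)
]

def recompute_sales_by_region(events: list) -> dict:
    """The 'from raw source data' path: correct, but paid for on every query."""
    totals: dict = {}
    for e in events:
        totals[e["region"]] = totals.get(e["region"], 0) + e["amount"]
    return totals

# Build the cache once (the warehouse table / cube / materialized view analogy).
start = time.perf_counter()
CACHE = recompute_sales_by_region(RAW_EVENTS)
print(f"build: {time.perf_counter() - start:.3f}s")

# Every subsequent "query" is a cheap lookup instead of a full scan.
start = time.perf_counter()
answer = CACHE["east"]
print(f"read:  {time.perf_counter() - start:.6f}s -> {answer}")
```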

Data virtualization, mostly on-premise, came later, in the late ’90s and early 2000s. You could combine data from any source (raw source data, data warehouses, downstream extracts, Excel files on your desktop) and query it without having prepared the data beforehand. Of course, to get anything useful, you would have to reproduce many of the same business data processing steps required under any data management approach. In some scenarios, this was a huge step forward. The pharmaceutical industry, with vast amounts of unintegrated data and complex formats such as those found in clinical trials, and other domains like it, really benefit from this approach. Interestingly enough, to get good, fast results, data virtualization tools always had a giant cache in the middle, along with a query planner and execution engine.
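
As an illustration of that pattern (and not of any particular vendor’s product), here is a small sketch using DuckDB as a stand-in federated query engine: one SQL statement joins a CSV “extract” on disk with an in-memory table, with no ETL step beforehand. The file, table, and column names are invented.

```python
# Hedged sketch of the data-virtualization pattern: one engine querying
# heterogeneous sources without loading them into a warehouse first.
import csv
import duckdb

# A downstream "extract" sitting on disk, as in the old extract-layer world.
with open("orders.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["order_id", "customer_id", "amount"])
    w.writerows([[1, 10, 250.0], [2, 11, 75.5], [3, 10, 19.99]])

con = duckdb.connect()  # in-memory engine; real tools add planners and caches
con.execute("CREATE TABLE customers (customer_id INTEGER, name VARCHAR)")
con.execute("INSERT INTO customers VALUES (10, 'Acme'), (11, 'Globex')")

# One SQL statement spans both sources; no prior data preparation.
result = con.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM read_csv_auto('orders.csv') AS o
    JOIN customers AS c USING (customer_id)
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()
print(result)  # [('Acme', 269.99), ('Globex', 75.5)]
```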

Enter the cloud and data lakes.

A data lake is a set of inputs and outputs. For some, it is a cache of intermediate computations; for others, it is a source of raw information. It usually holds data in several formats for tool convenience. It often has some metadata, lineage, and other “management” features to help navigate and understand what is available. Typically, a wide variety of tools are available that work with a number of data formats, several though not infinite. When these kinds of features are essential to your users, a data lake makes sense.

Today’s data lake companies are trying to convince you that data warehouses are evil. In many ways, I agree with them, because most data warehouses were designed wrong. However, the thinking and effort that go into a data warehouse never really go away. Even in a cloud environment, you still pretty much have to do the same work: you are still building a “thing with a cache in the middle.” At some point, you have to specify what you want done to the data to make it ready for use. Business intent and processing are inevitable. There is no free lunch.

Fortunately, newer tools, like dremio’s, AWS’s, Azure’s, and many others, make this more accessible than before. Most modern tools recognize that there are many formats, access patterns, and data access needs; one size does not fit all. This point of view alone makes these tools better than the traditional “single ETL tool” and “single DW database” approach of the prior decade.

Data lake companies provide tools and patterns that *are* more useful in a highly complex and distributed environment (organizationally and technically).

Look at dremio.

Dremio has a great product. I like it. It is cast as a data lake engine because data lakes are still kind-of hot in the market, but it is really a data virtualization tool well suited to a cloud environment. It is highly useful where you want to provide access to data in a wide variety of formats, through a wide variety of access technologies and tools. Yes, there is a finite list of “connectors.” At least part of dremio, such as Apache Arrow and Arrow Flight, is open-source, so you can add your own.

dremio has to implement patterns that have been used for decades, even if dremio describes them differently. To make queries fast enough and to lower costs, it has a cache in the middle, although the cache is optional. It has a C++ core instead of something less efficient, it targets zero-copy transfers through the networking and application stack, and it uses code generation to push computation to different locations.
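
Apache Arrow is the piece of this that is open and easy to poke at. Below is a hedged sketch of the columnar and zero-copy ideas using pyarrow; the data is invented, and it illustrates the general pattern rather than dremio’s internals.

```python
import pyarrow as pa

# Columnar layout: the values of one column sit in a contiguous buffer.
amounts = pa.array([250.0, 75.5, 19.99])

# Zero-copy: the NumPy array below is a view over Arrow's buffer, not a copy
# (possible here because the column is numeric and has no nulls).
view = amounts.to_numpy(zero_copy_only=True)
print(view.sum())  # 345.49

# Arrow's IPC stream format is what Arrow Flight ships over the network,
# avoiding row-by-row re-encoding between systems.
table = pa.table({"region": ["east", "west", "east"], "amount": amounts})
sink = pa.BufferOutputStream()
with pa.ipc.new_stream(sink, table.schema) as writer:
    writer.write_table(table)
reread = pa.ipc.open_stream(sink.getvalue()).read_all()
print(reread.num_rows)  # 3
```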

Many, if not most, of these features were implemented four decades ago for MPP internetworking and were present in the Ab Initio and Torrent data processing products, if anyone remembers them. Columnar databases with compression were available three decades ago; I used them. Separate compute and storage? Break apart the RDBMS into pieces and retarget them? Check! I’m not saying everyone claims these are completely new concepts that have never been done before.

However, newer products like dremio’s are better than yesterday’s tools. Their mindset and development approach are entirely different. Sure, they are not doing anything new architecturally, but that makes them easy to figure out and use. Under the hood, they must build out the same building blocks needed to process data as any product; you cannot escape gravity. But they are doing new things design-wise, and they are making better products. Recognizing these basic ideas should help large enterprises adopt and integrate products like dremio.

The sins of data warehousing and proprietary tools in general are many. Proprietary tools probably still make more money daily than open-source tools, though open-source tools may have higher valuations. Perhaps this reflects their ability to be used by more companies in the long run. Open-source tools are cheaper for the moment, and there are more product choices.

In the long run, no market can sustain a large number of products, so when the Fed finally stops supporting companies and capitalism returns, you may see a shrinking of funds around open-source data management tools.

All is not perfect, but it is better than before. Data lakes can be useful because they were useful 20 years ago, when they existed at companies under different names, e.g., the “input layer” or the “extract layer.” Insurance companies loved the extract layer because their source systems were many and complex, and if you could find the right extract, life was easier. I’m hoping tools like dremio’s get situated and last in the long run, because they are better.

Companies are building non-open parts of their products in order to monetize them; they still need income. Like the tools they displaced, these newer tools will be displaced by others unless they get embedded deeply enough at a client or another software company and create a sustainable source of income. Look at Palantir, for example: a little open-source, but the core product is behind the firewall. Many of these companies use open-source as a cover for coolness, but their intent is monetized proprietary software. I’m not against that, but we should recognize the situation so we are smarter about our decisions about what to use.

The cycle will continue. Rinse and repeat.

covid-19 teaches us about family

While we cannot all shelter-at-home as well as we might wish to, sheltering-at-home has reminded me of how important family is to me.

During this primary sheltering period, we have both children at home. Both are in college now, and one lives more or less permanently a couple of hours away, a bit too far to drop by on the spur of the moment. The other son will soon start a rigorous program that will not allow him to visit home very much for a few years.

I know there are a lot of bad things going on with covid-19 and many people are suffering. The burden is great. However, in spite of these burdens, there are moments of grace and joy. Having both children home again, eating dinner, watching a bit of the world on the internet, and gabbing about current issues/solving big problems reminds me that the health of the family is top-of-the-list important.

I’m glad they are safe. Parents cannot protect children forever, but we can always be supportive, encouraging, and committed to helping them succeed in life the way they want to succeed. Doing that during the pandemic reminds me of these simple ideas.

Now if we can just get them to clean up their dishes and wipe up the crumbs 🙂

covid-19 teaches us, again, the importance of in-person

While many states remain under lockdown, it is clear that sheltering-in-place and working-from-home are truly enabled by technology. From the internet to your phone to your tablet or computer, we communicate, get work done, and do things from home.

However, not all workers can do that. Many people need to touch and interact with the everyday world. Technology helps them, but they still need to be out there helping. While the promise of robots is great, they are not ready to do everyday chores. Those who cannot shelter-at-home are teaching us the value of being in-person.

covid-19 is really teaching us the importance of being there. Those of us working from home are starting to miss the interactions with friends, co-workers, and others.

I used to work in sales and delivery, much more in sales. Interactions are key. It’s hard to develop trust over the phone, although it is possible. It is almost always better to be there with a customer, working with them where they feel the most comfortable. You cannot do that sheltered at home.

Let’s hope that technology, small molecule and/or biologics, can step up to help us sooner rather than later and get our society back on track.

stimulus package shows that there is no free lunch

The stimulus package, large in relative if not absolute terms, indicates that corporate America and the marketplace, by and large, are not really working well. Over the past few years, we’ve had multiple interest rate drops, panic buying at the Fed window, and other indicators that things are amiss.

If we as Americans want to reduce taxes on companies and people and take on an emergency funding model for running our government, that’s our choice. It’s a poor model but that’s where the conservative push-and-shove has led us.

However, the implications are staggering. Over time, as emergencies and other critical items come up, we will spend the same or more than we would have had we handled payments more smoothly over time. We will need to print money and devalue our currency (through inflation), as well as cause substantial, highly localized market distortions. That’s how we have decided to pay for our standard of living: print money when a bump comes.

The stimulus package also highlights the massive cost-shifting that corporations have been doing for a long time. By using more temporary workers, companies pay less per employee, as there are fewer taxes and commitments that companies make toward those workers. Gig workers still need to buy health insurance and other important necessities. When issues come up, such as a pandemic, the government then needs to provide relief in some way, as a matter of helping the citizens it serves. Costs that should be borne by companies are being shifted to the American public.

While the stimulus package seems to be a large subsidy program for companies, it also provides relief to workers directly. Essentially, it’s a highly concentrated form of “programs” that should have been in place already for companies *and* people.

Corporations have pushed for tax relief and for deregulatory actions targeting anything they believe imposes costs on them, regardless of the costs to others, and they deliver value lopsidedly to their corporate leadership teams (vs. shareholders). Then, when something happens, companies plead hardship to the government to get cheap loans. Companies did not take the benefits provided by America and use them to build rainy-day funds, develop contingency plans, or do the things that governments normally do.

Companies and CEOs who claim to be capitalists act entirely the opposite.

In other words, there is no free lunch. Political ideologies on the left and the right seem to think that large imbalances either way are the way to run the government. The coronavirus pandemic suggests that a smoothly running, well (but not excessively) funded government is just easier and more responsive over time. Otherwise, distortions build up and then resolve in convulsive, painful, acute events such as the coronavirus pandemic.

We have great and smart people in this country; we can do better.

send checks to everyone – only if you can ID them

As part of the covid-19 response, the US federal government wants to send checks to everyone. Ideally, since people pay taxes individually, the government would already have each person’s information and could send each of them a check.

But it’s not that easy, and the process is going to be rife with fraud and abuse. Unfortunately, banks, who are guilty of several sins *again* around this, must play a role and take their cut. And the payments will be susceptible to fraud, especially for lower-income individuals with less sophisticated banking technology.

Ideally, we would have a way to identify people, independent of the banking world, so they could receive an electronic payment. Unfortunately, while the federal government is busy imposing ID requirements for traveling on planes, it is ignoring citizen identity that would serve broader national interests.

For years, technologists have been talking about decentralized identity and verifiable claims, based on blockchain concepts, and electronic payments using blockchain technology.

While we can hope this is a one-time event, it is a pretty clear example of how those technologies could make this effortless and cost-efficient.

Today, this type of payment effort will be slow, costly, and fraud-prone.

Oracle needs to be Unhurded

It is an interesting case of blinders when one considers Oracle. While Oracle was quick to enter the business applications market, developing their own products in the 90s and later buying Siebel, they started missing the mark fairly soon thereafter.

A recent article (https://www.cnbc.com/2019/12/05/oracle-shows-buybacks-can-go-too-far.html) discusses Oracle’s buyback frenzy and how it is leaving the company with net debt. There is really only one reason for that: the company was Hurded.

Mark Hurd ran the company as co-CEO for a long time. Unfortunately, Hurd rips apart companies with an eye on financials but without an eye for doing anything useful in the marketplace. He proved that time and time again: first at NCR and Teradata, then at HP, then at Oracle. Hurd was good at understanding financials, and I think that’s critical and good. He was horrible, always, at understanding what makes a company tick, and he missed the big trends. He missed them all his life; he was totally blind to them. Mostly, he propped up companies by playing with financials while undermining their cores. The companies would falter after he left, and he always left. Hurd would make a good second-in-command, just not a first-in-command.

With Ellison mostly retired (now un-retired), he was out of the loop of the marketplace. He’s been mostly a one-trick pony so far; a good trick that has its place, of course, but not a trick that takes the company to the next level. That’s why Microsoft finally got rid of its self-limiting ponies, as did, recently, google. I’m still amazed that Ellison does not understand the damage Hurd did at Oracle.

At Oracle, as at HP, Hurd scared away deep technical talent. A short-term focus on financials meant that Hurd missed the market signals. Oracle is missing the largest IT transformation story in the history of IT, cloud computing (private/hybrid/public), because he scared away the talent that understands the change. Locked into a focus on financials, he completely misread the trend. Underinvested and undercommitted in multiple ways, Oracle’s transformation story for the next decade is sorely lacking. If you are going to take on debt, at least do it to help you become more competitive.

I own Oracle stock and I want the company to succeed. I am hoping that it does not fall into a death spiral and get sold off; the marketplace needs more competitors sooner. Oracle needs to replace its senior leadership team with a new “Ellison” (Ellison was good in his time). The focus needs to be on customers and improving interactions with them. You can see a steady stream of awful sales executives leave Oracle, bounce over to IBM and HP, then bounce around again, all while delivering little value. Sales executives who learned truly terrible behaviors at Oracle replicate those behaviors elsewhere while not delivering; just look at S. Cook, who has bounced around Oracle, HP, IBM, MicroStrategy, et al.

Let’s hope Oracle succeeds at becoming competitive again and can direct itself to the next level.

WeWork’s collapse: the markets can still work

The recent WeWork IPO debacle shows that the markets can still work. WeWork’s IPO problems show how risk transfer mechanisms expose risks, and how risk management is an important part of well-performing markets.

Here’s the storyline.

Private Equity (PE) money takes risks. That’s OK and a good thing. PE places bets across industries. That’s a good thing as well. The really smart people, the ones we hope society should applaud, are doing something more concrete than moving money around (yes, sometimes after a “hit,” people move to PE). But hope is not a strategy. In the end, we have a bunch of PE people who want returns from their investments, and they obtain those returns from the work of others, say, WeWork. Given all the bets PE places, many will lose and some will win big.

Since money is involved, there is bad behavior everywhere. Money makes some people crazy. Look at WeWork. Bad behavior from people we want to succeed trickled upward into PE, where bad behavior already existed. PE players reinforced and “played up” WeWork; of course, they played it up regardless of what they actually thought about it. PE had significant investments in the company, and they wanted to win big. PE wanted to convince people that the investment made sense so others would buy into it: a classic sell job. Their bad behavior made WeWork look like it was worth tens of billions when in reality it is a corporate office rental company.

Here’s where the “markets still work” comment comes into the story.

Imagine one party that takes many risks and plays up its investments: “these investments are great!” However, public equity markets run on a higher level of transparency. Financial documents describe a company’s organization and show investors where the “value” is. Financial reporting and transparency are public market functions, explicitly designed to expose issues like WeWork’s. Public equity markets like hard facts, such as earnings. While you might argue that public equity has its downsides and is sometimes not so smart, public equity is a much larger pool of money to tap into and much more liquid. PE wants “public.”

If you move money from PE to public equity, the risk moves too: it shifts from a few PE companies to the public, and the public risk pool is much larger. Assuming private risk assessments were held to the same standards as public risk assessments, we can use their “results” to predict how smooth the transfer will be. While there are exceptions to the rule, and bad behavior during transfers happens, when the risk profiles are more or less in agreement, there is an orderly transfer of risk. Each risk holder in the public arena, in theory, holds less risk “per unit” because the equity holder pool typically becomes much larger. The public benefits because the “common” investor can invest in companies. PE benefits: payback. That’s all OK.

However, when the risk profiles between private and public are a mismatch, we see a situation like what happened with WeWork. The risk profiles were way out of whack, and the friction between the two was exposed. The mismatch was huge, and WeWork collapsed.

Sometimes the mismatch continues for a while and gets corrected later. Most “middleman” businesses, like WeWork, Uber, and “last mile” delivery companies, are not technology companies. They are service companies using technology, which is very common and mundane. Service companies get a much lower valuation/multiple. The middleman players may see a bump for a bit, but over time they are just a “tax,” and these costs need to be squeezed out. Facebook is like this as well, but less so on this particular topic.

It’s not about lower taxes…that completely misses the point again

Republicans have long focused on the “lower taxes” mantra. While recent events suggest the mantra was probably not a sticky value, the idea of lowering taxes as a key objective is wrong and incredibly poor thinking; it’s not even good business thinking. These thoughts apply to all political parties.

The real challenge is to lower my total costs, not just one part of my cost burden while jacking up other costs. That’s just cost-shifting, which invariably is a zero-sum game.

A recent article in the Washington Post highlights this issue. The article describes how CA will require all new homes to have solar power installed. The payback period is, more or less, the life of the home, and potentially sooner. This seems like an outrageous cost burden to impose. But let’s think of the alternatives.

If we believe that we can fully identify costs, including the costs of building new power plants and the cost of pollution or even the cost of servicing the national debt, then let’s focus on lowering “total” costs.

For the solar home law, CA is shifting costs to the homeowner with the anticipation of lowering, or zero-summing away, costs from large utilities and higher taxes. That does not really help on a net basis; the costs may be the same. Of course, if solar homes lower overall costs over the calculation horizon, then it’s a win and a good model. We could also speculate that the law imposes regulations on homes as a trade-off against imposing regulations on future power generation or other utilities. While you may think that all regulations are bad (a subject for another blog), this one is a cost-shifting regulation with the potential, we presume, of lowering overall costs.

It’s possible that cost-shifting stimulates innovation in the area where the costs are borne. We have seen, however, that when costs are imposed on companies, many companies just lobby to have them removed and play tax schemes vs innovating around them. That’s not good and we should demand more from the companies we purchase from.

If we imagine that there is widespread popular support for addressing global warming, etc. through solar power generation, then shifting costs to the “people” may force the people to “demand” innovation from companies. This gives the public an end-run around companies that are poorly managed or that lack morals and civic responsibility. By shifting costs to homeowners, the CA government may also be getting around various federal and state officials’ lack of action on this particular topic.

We face the “lower taxes” hoax all the time. Lawmakers have shown that they are not really interested in lowering total costs, which should be the real focus. In fact, they do not seem to be interested in lowering taxes either, except for a few donors. Lowering taxes (or taxes in one category) while increasing costs elsewhere is disingenuous. I do not care whether I pay one person or another for a certain level of benefit, comfort, or moral objective, given the two are roughly equivalent. And in some cases, forcing costs onto one party or another is not morally or tactically helpful. Sometimes it could be, sometimes it is not.

Blockchain will change everything…wait…Blockchain 2.0 will change everything…wait…

Much has been said about blockchain and how it will, literally, change the entire world. Blockchain is the hip and relatively new technology that has been described as the second coming of the internet. Most people are familiar with it through the cryptocurrency bitcoin, a pseudo-anonymous, public, distributed ledger of currency. Depending on how you use the vocabulary, bitcoin/blockchain could refer to a variety of things, ranging from the algorithm to the protocol to the currency. It’s been claimed that the blockchain can be applied to “all human endeavors,” as has been foretold since bitcoin came into public view. It’s important to remember that blockchain technology is part of a cryptocurrency, but a cryptocurrency is focused on payments, while blockchain technology can be used for more than payments.

Regardless of the risk, legal, or moral issues surrounding blockchain as a currency, bitcoin technology allows parties with various trust levels to transact together. Blockchain 1.0 really viewed the world through a currency and financial lens: financial transactions between two or more parties. Blockchain 2.0 is based on the idea that “all human endeavors” can be coded (you pick your programming language) into little programs that are baked into the blockchain and “run” based on triggers or other criteria, i.e., smart contracts. These little blockchain programs allow you to execute conditional logic, e.g., if it rains on Tuesday, pay party “A” 2 bitcoins. Obviously, as soon as a “program” is executing, you run into a large variety of issues, such as the ability of that program to run in a “trusted” fashion, or who gets access to what and whether access can be limited (talk about risk management!). Blockchain 2.0 technology also has additional features to serve the diverse needs of its users, e.g., blockchain tokens/coins representing physical (or even non-physical) assets.
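
To make the “little programs” idea concrete, here is a toy sketch of that rain-on-Tuesday contract in Python. Real platforms compile contracts and execute them on every node of the network; all the names and the ledger structure here are invented for illustration.

```python
# Toy sketch of "if it rains on Tuesday, pay party A 2 coins."
# Illustrative only: real smart-contract platforms (e.g., Ethereum) compile
# contracts to bytecode and run them across many nodes.
from dataclasses import dataclass, field

@dataclass
class Ledger:
    balances: dict = field(default_factory=dict)

    def transfer(self, sender: str, receiver: str, amount: float) -> None:
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient funds")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

def rain_contract(ledger: Ledger, rained_on_tuesday: bool) -> None:
    """Pay party A 2 coins from party B if the trigger condition holds."""
    if rained_on_tuesday:
        ledger.transfer("party_B", "party_A", 2)

ledger = Ledger(balances={"party_A": 0, "party_B": 10})
rain_contract(ledger, rained_on_tuesday=True)  # someone must assert this fact
print(ledger.balances)  # {'party_A': 2, 'party_B': 8}
```

Note the catch hidden in the example: some trusted party (an “oracle,” in smart-contract jargon) still has to assert whether it actually rained, which foreshadows the trust issues discussed below.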

Newer projects such as Ethereum, Hyperledger, and others have been created to deliver the Blockchain 2.0 vision. They add the ability to run these programs, control access, create trusted execution environments, etc. I will state for the record that all of these things are needed to create a blockchain that is useful to business interests, e.g., B2B-type activities where additional privacy, control, and capabilities are needed; governance in general. You could imagine taking Blockchain 1.0 and carefully building Blockchain 2.0 capabilities on top of it, but Blockchain 2.0 is more a rewrite than a tweak.

This is all very good, but the question you should start asking yourself is: who gets to cash the check, i.e., who really benefits? The person “cashing the check” really determines how fast things will move and whether they will share the benefits with others.

Blockchain promises to reduce the cost of transactions and make it easier for parties that do not trust each other to conduct transactions. Does that mean that banks are not needed and the cost of a transaction becomes minuscule, unlike today? I’ll mention that the concept of a “transaction” related to banks may or may not mean exchanging payments; it could also mean “asset management.”

If so, the consumer benefits and the banks do not. Or does it mean that a middleman is still needed, perhaps no longer called a bank? If so, the “new middleman” benefits at the expense of the old middleman (à la Platform Scale). Consumers may lose for a while due to an increase in choices and confusion.

The technology can deliver benefits. However, it is interesting to consider:

  • You will still need a lot of computer servers and people to feed and care for them.
    • The actual blockchain can be viewed as a database that talks to other databases to sync up and update itself (see the toy sketch after this list). Sometimes the algorithms require a lot of computational power.
  • You’ll still need to administer the process, e.g., in Blockchain 2.0, someone has to give “permission” to transact.
  • There are legacy assets that need to be retired over time and sometimes this takes a really long time–as in decades.
  • There will probably be multiple, maybe thousands of, smaller transaction networks set up for specialized interests and uses. This means that all the above issues are multiplied by “n.”
  • It is hard to get people to agree to use the same standards across the entire stack of an application unless it gives them an advantage.
  • New technology and its applications that enable new scenarios can create challenges to managing risk—not transaction risk but overall risk of the activities the transactions support.
  • Perhaps most importantly, if you transact with Blockchain 2.0, you have to trust the platform to execute, which means you have to trust the people running the platform, which is exactly the issue we have today: “who do you trust?”
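
On the “blockchain as a database” bullet above, here is a minimal, illustrative hash-chain in Python. Each block commits to the hash of its predecessor, which is what makes the ledger tamper-evident; real systems add consensus (e.g., proof-of-work), networking, and much more, and every name here is invented for the sketch.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Serialize deterministically, then hash.
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    """Recompute every link; any edit to an earlier block breaks the chain."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain: list = []
append_block(chain, [{"from": "A", "to": "B", "amount": 2}])
append_block(chain, [{"from": "B", "to": "C", "amount": 1}])
print(verify(chain))                        # True
chain[0]["transactions"][0]["amount"] = 99  # tamper with history
print(verify(chain))                        # False
```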

Blockchain 2.0 thinking is designed to be more business-friendly, e.g., less computational power needed and more access controls. As Blockchain 1.0 becomes Blockchain 2.0, the types of issues present in today’s systems creep in and impose an overhead and burden similar to the way the same requirements burden today’s environments. The key issue, though, is that IF companies can agree to use these new technologies together, then their total cost of ownership CAN go down. In other words, if companies collaborate smartly to transact, then yes, costs can go down and benefits can increase. This was true 30 years ago as well: standardization can benefit the entire ecosystem.

So it’s clear there can be a benefit. Most likely, companies will benefit first, as they will incur the initial investments, and most companies will not fully transfer everything over to the new platform. Eventually, consumers will benefit as existing goods and services come to operate on cheaper transactions. Cryptocurrencies are where people can obtain a benefit fairly quickly, if they can become comfortable with the use of a non-fiat currency. Government regulations will eventually catch up.

So back to the title: blockchain can definitely change everything. Companies could benefit the most first, incrementally. There can clearly be a shift in the players, and there are opportunities for startups to disrupt if they can get out far enough ahead, using Christensen’s definition of disruption.

But I am not convinced that it is a tidal wave about to hit me this year or next (2019 looks like a strong blockchain year, with 2018 being a ramp), especially since large corporations most likely hold the keys to deployment speed and functionality. For example, today there are only a few firms (custodians) that really hold the “ledgers” for financial accounts. These players are enormously powerful and “trusted” for good reason. That’s not going to change. They are the only ones that will really lead the charge in the financial sector, because they own the “transactions.”

They are not going to go away quietly, or at all. They will probably create a blockchain-based system that benefits them: the new market makers. Whether good or bad, they will deploy blockchain first and reap the benefits of the investments, and they are the ones who will create a system beneficial to them. It is doubtful they will ever pass along the benefits; since they must still maintain legacy systems, they’ll have two systems to maintain. More importantly, why should they pass along the benefits to others? A smart person, with morals not strictly aligned with public benefit, will seek to make money and enhance their position. This is exactly what they are doing, right now.

Sure, there are other types of “custodians” who hold ledgers today. But due to a variety of factors, once you back away from the “single, transparent system that untrusted parties can transact with,” which is what bitcoin 1.0 is today with its “proof-of-work,” the collaboration and standards benefits start bumping up against creeping costs to “use.”

Today, there are over 100 cryptocurrencies. Beyond payments, will the future hold tens of thousands of “Blockchain 2.0” ledgers? Fragmentation, even using the same technology, also seems like the bogeyman of the benefits story. In order to gain control from current owners, disruptors will try to “own” the Blockchain 2.0 ledger platforms that run smart “contracts.” In the process, the “ledgers” will fragment.

Also, since it’s doubtful that there will really be any disruption quickly (but it is coming), the limited set of players who deploy these capabilities will reap the rewards in the short and near term. There will be benefits from Blockchain 2.0, but maybe we (the public) will need to wait…until Blockchain 3.0, wait, until Blockchain 4.0, wait, …

Disclaimer: I own a few bitcoins.


Drink the Kool-Aid? Yes, But Pick When You Drink It

In the business world, drinking the kool-aid refers to an employee’s willingness to commit fully, without hesitation and without cynicism, to their organization’s and boss’s objectives–to be a fully engaged team member.

I was thinking about a friend who recently changed jobs. He is a smart guy and always has two or three things running in parallel: backup plans in case the primary activity fails. I had suggested that, for the moment, he needed to drink the Kool-Aid on his primary activity. He needed to stop keeping options actively in play once he made his primary choice, as maintaining options sometimes has its price. The idea was to stop thinking that the current gig was temporary. Was I wrong to recommend this?

HBR recently had a short article that suggested there is a real cost to making backup plans. The fundamental question goes something like this:

“When we think about what we’ll do if we fail to achieve our goals, are we less likely to succeed?”

The answer, according to principal investigator Jihae Shin, is mostly “yes.” Shin’s research concluded that people who made backup plans achieved their goals less often than those who did not. But the findings did not say *not* to make backup plans. Instead, you should be more thoughtful about the timing and the level of effort you put into your backup planning.

That makes sense. Different people operate differently. For example, we want a backup plan for our son, who is focusing on a music career in college. But we do not want him spending a lot of time on the backup plan *now*. We do not want him distracted from his focus on music *now* in order to prepare for a different career he may never pursue. We encourage him to think about options, but not to the point that doing so comes at the expense of his current focus.

I think my suggestion makes sense for my friend’s situation. I was not suggesting that he forgo having multiple threads running, but that he fully commit to the one in front of him and assume that this choice would be the solution for a very long time.

The idea is to take the opportunity as far as it will go and only then get the backup plans moving along. It was really a suggestion to stop thinking that the current objective would not be achieved and to avoid the distraction of trying to line up alternate plans prematurely.

At the right time, even temporarily, go all in, get the tee-shirt, buy the mug, think that your organization is great even if it has warts, adopt its strategy–drink the Kool-Aid. Pick a time, later, to consider options.