Which New Business Models Will Be Unleashed By Web 3.0? – Fabric

The forthcoming wave of Web 3.0 goes far beyond the initial use case of cryptocurrencies. Through the richness of interactions now possible and the global scope of counterparties available, Web 3.0 will cryptographically connect data from individuals, corporations and machines with efficient machine learning algorithms, leading to the rise of fundamentally new markets and associated business models.

The future impact of Web 3.0 is undeniable, but the question remains: which business models will crack the code to provide lasting, sustainable value in today’s economy?

A history of Business Models across Web 1.0, Web 2.0 and Web 3.0

We will dive into the native business models that have been and will be enabled by Web 3.0, but first briefly touch upon the quickly forgotten, often arduous journeys that led to the unexpected and unpredictable business models that succeeded in Web 2.0.

To set the scene anecdotally for Web 2.0’s business model discovery process, let us not forget the journey that Google went through from their launch in 1998 to 2002 before going public in 2004:

  • In 1999, while enjoying good traffic, they were clearly struggling with their business model. Their lead investor Mike Moritz (Sequoia Capital) openly stated “we really couldn’t figure out the business model, there was a period where things were looking pretty bleak”.
  • In 2001, Google was making $85m in revenue while their rival Overture was making $288m in revenue, as CPM based online advertising was falling away post dot-com crash.
  • In 2002, adopting Overture’s ad model, Google went on to launch AdWords Select: its own pay-per-click, auction-based search-advertising product.
  • Two years later, in 2004, Google handled 84.7% of all internet searches and went public at a valuation of $23.2 billion, with annualised revenues of $2.7 billion.

After struggling for four years, a single small modification to its business model launched Google into orbit, on its way to becoming one of the world’s most valuable companies.

Looking back at the wave of Web 2.0 Business Models

Content

The earliest iterations of online content merely involved the digitisation of existing newspapers and phone books … and yet, we’ve now seen Roma (Alfonso Cuarón) receive 10 Academy Awards Nominations for a movie distributed via the subscription streaming giant Netflix.

Marketplaces

Amazon started as an online bookstore that nobody believed could become profitable … and yet, it is now the behemoth of marketplaces covering anything from gardening equipment to healthy food to cloud infrastructure.

Open Source Software

Open source software development started off with hobbyists and an idealist view that software should be a freely-accessible common good … and yet, the entire internet runs on open source software today, creating $400b of economic value a year; GitHub was acquired by Microsoft for $7.5b, and Red Hat makes $3.4b in yearly revenues providing services for Linux.

SaaS

In the early days of Web 2.0, it might have been inconceivable that, after the era of massive spending on proprietary infrastructure, one could deliver business software via a browser and remain economically viable … and yet, today the large majority of B2B businesses run on SaaS models.

Sharing Economy

It was hard to believe that anyone would be willing to climb into a stranger’s car or rent out their couch to travellers … and yet, Uber and AirBnB have become the largest taxi operator and accommodation provider in the world, without owning any cars or properties.

Advertising

While Google and Facebook might have gone into hyper-growth early on, they didn’t have a clear plan for revenue generation for the first half of their existence … and yet, the advertising model turned out to fit them almost too well, and they now generate 58% of global digital advertising revenues ($111B in 2018); advertising has become the dominant business model of Web 2.0.

Emerging Web 3.0 Business Models

Looking at Web 3.0 over the past 10 years, initial business models have tended not to be repeatable or scalable, or have simply tried to replicate Web 2.0 models. We are convinced that, while there is some scepticism about their viability, the continuous experimentation by some of the smartest builders will lead to incredibly valuable models being built over the coming years.

By exploring both the more established and the more experimental Web 3.0 business models, we aim to understand how some of them will accrue value over the coming years.

  • Issuing a native asset
  • Holding the native asset, building the network
  • Taxation on speculation (exchanges)
  • Payment tokens
  • Burn tokens
  • Work Tokens
  • Other models

Issuing a native asset:

Bitcoin came first. Proof of Work coupled with Nakamoto Consensus created the first Byzantine Fault Tolerant & fully open peer-to-peer network. Its intrinsic business model relies on its native asset: BTC — a provably scarce digital token paid out to miners as block rewards. Others, including Ethereum, Monero and ZCash, have followed down this path, issuing ETH, XMR and ZEC.

These native assets are necessary for the functioning of the network and derive their value from the security they provide: a high enough incentive for honest miners to supply hashing power means the cost for malicious actors to attack the network grows alongside the price of the native asset. In turn, the added security drives further demand for the currency, further increasing its price and value. The value accrued in these native assets has been analysed & quantified at length.
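The incentive loop above can be made concrete with a toy calculation: an attacker seeking a hashing-power majority must at least match the fiat-denominated rewards flowing to honest miners, so attack cost scales linearly with token price. The block reward and blocks-per-day figures below are Bitcoin-like assumptions for illustration, not live network data.

```python
# Toy model of the security feedback loop: the cost of out-mining honest
# participants scales with the price of the native asset, because hardware
# and energy are priced in fiat while rewards are paid in the token.

def daily_security_budget(block_reward: float, blocks_per_day: int,
                          token_price: float) -> float:
    """Fiat value paid out to honest miners per day."""
    return block_reward * blocks_per_day * token_price

def min_attack_cost(block_reward: float, blocks_per_day: int,
                    token_price: float, attack_days: float = 1.0) -> float:
    """Lower bound on a majority attack: match the honest security budget."""
    return daily_security_budget(block_reward, blocks_per_day,
                                 token_price) * attack_days

# Bitcoin-like assumptions: 12.5 BTC block reward, ~144 blocks per day.
for price in (1_000, 10_000):
    print(f"token at ${price:,}: attack floor "
          f"${min_attack_cost(12.5, 144, price):,.0f}/day")
```

A tenfold rise in the token price raises the attack-cost floor tenfold, which is the feedback the paragraph describes: price funds security, and security supports demand for the asset.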

Holding the native asset, building the network:

Some of the earliest companies that formed around crypto networks had a single mission: make their respective networks more successful & valuable. Their resultant business model can be condensed to “increase their native asset treasury; build the ecosystem”. Blockstream, acting as one of the largest maintainers of Bitcoin Core, relies on creating value from its balance sheet of BTC. Equally, ConsenSys has grown to a thousand employees building critical infrastructure for the Ethereum ecosystem, with the purpose of increasing the value of the ETH it holds.

While this perfectly aligns the companies with the networks, the model is hard to replicate beyond the first handful of companies: amassing a meaningful enough balance of native assets becomes impossible after a while … and the blood, toil, tears and sweat of launching & sustaining a company cannot be justified without a large enough stake for exponential returns. As an illustration, it wouldn’t be rational for any business other than a central bank — a US remittance provider, say — to base its business purely on holding large sums of USD while working on making the US economy more successful.

Taxing the Speculative Nature of these Native Assets:

The subsequent generation of business models focused on building the financial infrastructure for these native assets: exchanges, custodians & derivatives providers. They were all built with a simple business objective — providing services for users interested in speculating on these volatile assets. While the likes of Coinbase, Bitstamp & Bitmex have grown into billion-dollar companies, they do not have a fully monopolistic nature: they provide convenience & enhance the value of their underlying networks. The open & permissionless nature of the underlying networks makes it impossible for companies to lock in a monopolistic position by virtue of providing “exclusive access”, but their liquidity and brands provide defensible moats over time.

Payment Tokens:

With the rise of the token sale, a new wave of projects in the blockchain space based their business models on payment tokens within networks: often creating two-sided marketplaces and enforcing the use of a native token for any payments made. The assumption is that as the network’s economy grows, demand for the limited supply of the native payment token will increase, leading to an increase in the token’s value. While the value accrual of such a token model is debated, the increased friction for the user is clear — what could have been paid in ETH or DAI now requires additional exchanges on both sides of a transaction. While this model was widely used during the 2017 token mania, its friction-inducing characteristics have rapidly removed it from the forefront of development over the past 9 months.

Burn Tokens:

Revenue generating communities, companies and projects with a token might not always be able to pass the profits on to the token holders in a direct manner. A model that garnered a lot of interest as one of the characteristics of the Binance (BNB) and MakerDAO (MKR) tokens was the idea of buybacks / token burns. As revenues flow into the project (from trading fees for Binance and from stability fees for MakerDAO), native tokens are bought back from the public market and burned, resulting in a decrease of the supply of tokens, which should lead to an increase in price. It’s worth exploring Arjun Balaji’s evaluation (The Block), in which he argues the Binance token burning mechanism doesn’t actually result in the equivalent of an equity buyback: as there are no dividends paid out at all, the “earning per token” remains at $0.
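The buyback-and-burn mechanic can be sketched in a few lines. The figures below are invented, and the constant-market-cap assumption (price rises exactly as supply falls) is a deliberate simplification for illustration — it is precisely the assumption Balaji’s critique questions, since no cash actually reaches token holders.

```python
# Sketch of a buyback-and-burn round: protocol revenue buys tokens on the
# open market and destroys them, shrinking circulating supply.

def burn_round(supply: float, price: float,
               revenue_for_burn: float) -> tuple[float, float]:
    """Apply one buyback-and-burn round.

    Assumes, purely for illustration, that total market value
    (supply * price) is unchanged by the burn, so price rises as
    supply falls. Returns (new_supply, new_price).
    """
    market_cap = supply * price
    tokens_burned = revenue_for_burn / price
    new_supply = supply - tokens_burned
    new_price = market_cap / new_supply  # constant-market-cap assumption
    return new_supply, new_price

# Illustrative numbers: 200m tokens at $10, $25m of revenue burned per quarter.
supply, price = 200_000_000, 10.0
for quarter in range(4):
    supply, price = burn_round(supply, price, revenue_for_burn=25_000_000)
    print(f"Q{quarter + 1}: supply={supply:,.0f} price={price:.4f}")
```

Note that the mechanism only reduces supply; whether that translates into price is entirely a market assumption, which is why the “earnings per token” objection in the paragraph above still applies.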

Work Tokens:

One of the business models for crypto-networks that we are seeing ‘hold water’ is the work token: a model that focuses exclusively on the revenue-generating supply side of a network in order to reduce friction for users. Good examples include Augur’s REP and Keep Network’s KEEP tokens. A work token model operates similarly to classic taxi medallions: it requires service providers to stake / bond a certain amount of native tokens in exchange for the right to provide profitable work to the network. One of the most powerful aspects of the work token model is the ability to incentivise actors with both carrot (rewards for the work) & stick (stake that can be slashed). Beyond providing security to the network by incentivising the service providers to execute honest work (as they have locked skin in the game denominated in the work token), these tokens can also be evaluated by the predictable future cash flows to the collective of service providers (we have previously explored the benefits and valuation methods for such tokens in this blog). In brief, such tokens should be valued based on the future expected cash flows attributable to all the service providers in the network, which can be modelled out based on assumptions on pricing and usage of the network.
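The valuation logic above can be sketched as a plain discounted-cash-flow sum over the fees expected to reach the collective of staked service providers. Every number here — fee level, growth rate, discount rate, staked token count — is a hypothetical assumption for illustration, not a view on any real network.

```python
# Minimal DCF sketch for a work token: network value as the present value
# of fees flowing to the collective of staked service providers.

def work_token_network_value(first_year_fees: float, growth_rate: float,
                             discount_rate: float, years: int) -> float:
    """Present value of provider cash flows over a finite horizon."""
    value = 0.0
    fees = first_year_fees
    for t in range(1, years + 1):
        value += fees / (1 + discount_rate) ** t
        fees *= 1 + growth_rate
    return value

# Assumed inputs: $10m of provider fees next year, growing 20%/yr,
# discounted at 30% (a high rate, reflecting network risk), 10-year horizon.
network_value = work_token_network_value(10e6, 0.20, 0.30, years=10)
per_token = network_value / 100e6  # assuming 100m tokens staked
print(f"network value ≈ ${network_value:,.0f}, ≈ ${per_token:.3f} per token")
```

The interesting lever is the spread between growth and discount rate: because the token is a claim on work rather than on equity, the discount rate has to carry both execution risk and the risk that providers exit the network.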

A wide array of other models are being explored and worth touching upon:

  • Dual token model such as MKR/DAI & SPANK/BOOTY where one asset absorbs the volatile up- & down-side of usage and the other asset is kept stable for optimal transacting.
  • Governance tokens which provide the ability to influence parameters such as fees and development prioritisation and can be valued from the perspective of an insurance against a fork.
  • Tokenised securities as digital representations of existing assets (shares, commodities, invoices or real estate) which are valued based on the underlying asset with a potential premium for divisibility & borderless liquidity.
  • Transaction fees for features such as the models BloXroute & Aztec Protocol have been exploring with a treasury that takes a small transaction fee in exchange for its enhancements (e.g. scalability & privacy respectively).
  • Tech 4 Tokens as proposed by the Starkware team who wish to provide their technology as an investment in exchange for tokens — effectively building a treasury of all the projects they work with.
  • Providing UX/UI for protocols, such as Veil & Guesser are doing for Augur and Balance is doing for the MakerDAO ecosystem, relying on small fees or referrals & commissions.
  • Network-specific services, which currently include staking providers (e.g. Staked.us), CDP managers (e.g. topping off MakerDAO CDPs before they become undercollateralised) or marketplace management services such as OB1 on OpenBazaar, which can charge traditional fees (subscription or a % of revenues).
  • Liquidity providers operating in applications that don’t have revenue generating business models. For example, Uniswap is an automated market maker, in which the only route to generating revenues is providing liquidity pairs.
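The Uniswap model mentioned in the last bullet is a constant-product market maker: liquidity providers deposit a pair of assets, traders swap against the pool while the product of the reserves is preserved, and a small fee on each swap accrues to the providers. The x·y = k invariant and the 0.3% fee are part of Uniswap’s published design; the pool reserves below are illustrative, and this sketch ignores gas costs and integer rounding.

```python
# Simplified constant-product AMM pricing, Uniswap-style.

def get_amount_out(amount_in: float, reserve_in: float, reserve_out: float,
                   fee: float = 0.003) -> float:
    """Output amount for a swap against a constant-product pool.

    Solves (reserve_in + in_after_fee) * (reserve_out - out)
         = reserve_in * reserve_out
    so the pool's invariant k is preserved on the fee-adjusted input.
    """
    amount_in_after_fee = amount_in * (1 - fee)
    return reserve_out * amount_in_after_fee / (reserve_in + amount_in_after_fee)

# Illustrative pool: 100 ETH and 20,000 DAI (an implied price of 200 DAI/ETH).
dai_out = get_amount_out(1.0, reserve_in=100.0, reserve_out=20_000.0)
print(f"1 ETH buys ≈ {dai_out:.2f} DAI")  # slightly under 200 due to slippage + fee
```

This is what “the only route to generating revenues is providing liquidity pairs” means in practice: the 0.3% retained on every swap is the liquidity providers’ entire income, as the protocol itself takes nothing.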

With this wealth of new business models arising and being explored, it becomes clear that while there is still room for traditional venture capital, the role of the investor and of capital itself is evolving. Capital itself morphs into a native asset within the network, with a specific role to fulfil. From passive network participation, to bootstrapping networks after a financial investment (e.g. computational work or liquidity provision), to direct injections of subjective work into the networks (e.g. governance or CDP risk evaluation), investors will have to reposition themselves for this new organisational mode driven by trust-minimised decentralised networks.

When looking back, we realise Web 1.0 & Web 2.0 took exhaustive experimentation to find the appropriate business models, which have created the tech titans of today. We are not ignoring the fact that Web 3.0 will have to go on an equally arduous journey of iterations, but once we find adequate business models, they will be incredibly powerful: in trust minimised settings, both individuals and enterprises will be enabled to interact on a whole new scale without relying on rent-seeking intermediaries.

Today we see thousands of incredibly talented teams pushing forward implementations of some of these models or discovering completely new viable business models. As the models might not fit the traditional frameworks, investors may have to adapt by taking on new roles, providing both work and capital (a journey we have already started at Fabric Ventures); but as long as we can see predictable and rational value accrual, it makes sense to double down, as every day the execution risk grows smaller.

Source : https://medium.com/fabric-ventures/which-new-business-models-will-be-unleashed-by-web-3-0-4e67c17dbd10

API Metrics and Status – A Regulatory Requirement or a Strategic Concern? – John Heaton-Armstrong

TL;DR – those discussing what the appropriate regulatory benchmarks for API performance and availability under PSD2 should be are missing a strategic opportunity. Any bank that simply focuses on the minimum, mandatory product will rule itself out of commercial agreements with those relying parties who have the wherewithal to consume commercial APIs at scale.

Introduction

As March approaches, those financial institutions in the UK and Ireland impacted by PSD2 are focused on readiness for full implementation. The Open Banking Implementation Entity (OBIE) has been consulting on Operational Guidelines which give colour to the regulatory requirements found in the Directive and the Regulatory Technical Standards which support it. The areas covered are not unique to the UK, and whilst they are part of an OBIE-specific attestation process, the guidelines could prove useful to any ASPSP impacted by PSD2.

Regulatory Requirements

The EBA, at guidelines 2.2-4, is clear on the obligations for ASPSPs. These are supplemented by the RTS – “[ASPSPs must] ensure that the dedicated interface offers at all times the same level of availability and performance, including support, as the interfaces made available to the payment service user for directly accessing its payment account online…” and “…define transparent key performance indicators and service level targets, at least as stringent as those set for the interface used by their payment service users both in terms of availability and of data provided in accordance with Article 36” (RTS Arts. 32(1) and (2)).

This places the market in a quandary – it is extremely difficult to compare, even at a theoretical level, the performance of two interfaces where one (PSU) is designed for human interaction and the other (API) for machine consumption. Some suggested during the EBA’s consultation period that a more appropriate comparison might be between the APIs which support the PSU interface and those delivered in response to PSD2. Those in the game of reverse engineering confirm that there is broad comparability between the functions these support – unfortunately this proved too much technical detail for the EBA.

To fill the gap, OB surveyed developers, reviewed the existing APIs already delivered by financial institutions, and settled on an average of 99% availability (c. 22 hours of downtime per quarter) and a response time of 1,000 ms per 1MB of payload (this is a short summary and more detail can be read on the same). A quick review of the API Performance page OB publishes will show that, with average availability of 96.34% across the brands in November, and only Bank of Scotland, Lloyds and the HSBC brands achieving >99% availability, there is a long way to go before this target is met – made no easier by a significant amount of change to these platforms as their functional scope expands over the next 6-8 months. This will also be in the face of increasing demand volumes, as organisations which currently rely on screen scraping for access to data begin to transfer their integrations onto APIs. In short, ASPSPs face a perfect storm in trying to achieve these goals.
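The availability arithmetic above is easy to check: at 99% availability, roughly 1% of a 91-day quarter — about 22 hours — is permitted downtime, while November’s 96.34% average corresponds to around 80 hours. A minimal sketch:

```python
# Convert an availability percentage into allowed downtime over a period.

def downtime_hours(availability: float, days: int = 91) -> float:
    """Allowed downtime in hours for a given availability over `days` days."""
    return (1 - availability) * days * 24

print(round(downtime_hours(0.99)))    # the ~22 hrs/quarter target cited above
print(round(downtime_hours(0.9634)))  # the November average reported above
```

Put side by side, the gap between the target and the observed average is nearly a factor of four in downtime, which is what makes the “long way to go” assessment concrete.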

Knowledge and Reporting

At para 2.3.1 of its guidelines, the OBIE expands on the EBA’s reporting guidelines and provides a useful template for the purpose, but this introduces a conundrum. All of the data published to date has come from the banks reporting on themselves – i.e. the technical solutions that generate this data sit inside their own domains. So, quite apart from the obvious issue of self-reporting, there have already been clear instances where services haven’t functioned correctly and the bank in question simply hasn’t known this to be the case until informed by a TPP. One of the larger banks in the UK recently misconfigured a load balancer, with the effect that 50% of the traffic it received was misdirected and went unanswered – without the bank’s knowledge. A clear case of downtime that almost certainly went unreported – if an API call goes unacknowledged in the woods, does anyone care?

Banks have a challenge, in that risk and compliance departments typically baulk at any services they own being placed in the cloud, or indeed anywhere outside their physical infrastructure. Yet independent, external monitoring is exactly what is required for their support teams to gain a true understanding of how their platforms are functioning, and to generate reliable data for their regulatory reporting requirements.

[During week commencing 21st Jan, the Market Data Initiative will announce a free/open service to solve some of these issues. This platform monitors the performance and availability of API platforms using donated consents, with the aim of establishing a clear, independent view of how the market is performing, without prejudicial comment or reference to benchmarks. Watch this space for more on that.]

Regulatory or strategic concern?

For any TPP seeking investment, where their business model necessitates consuming open APIs at scale, one of the key questions they’re likely to face is how reliable these services are, and what remedies are available in the event of non-performance. In the regulatory space, some of this information is available (see above) but is hardly transparent or independently produced, and even with those caveats does not currently make for happy reading. For remedy, TPPs are reliant on regulators and a quarterly reporting cycle for the discovery of issues. Even in the event that the FCA decided to take action, the most significant step it could take would be to instruct an ASPSP to implement a fall-back interface, and given that the ASPSP would have a period of weeks to build this, it is likely that any relying party’s business would have suffered significant detriment before it could even start testing such a facility. The consequence of this framework is that, for the open APIs, performance, availability and the transparency of information will have to improve dramatically before any commercial services rely on them.

Source : https://www.linkedin.com/pulse/api-metrics-status-regulatory-requirement-strategic-john?trk=portfolio_article-card_title

Digital Transformation of Business and Society: Challenges and Opportunities by 2020 – Frank Diana

At a recent KPMG Robotic Innovations event, Futurist and friend Gerd Leonhard delivered a keynote titled “The Digital Transformation of Business and Society: Challenges and Opportunities by 2020”. I highly recommend viewing the Video of his presentation. As Gerd describes, he is a Futurist focused on foresight and observations — not predicting the future. We are at a point in history where every company needs a Gerd Leonhard. For many of the reasons presented in the video, future thinking is rapidly growing in importance. As Gerd so rightly points out, we are still vastly under-estimating the sheer velocity of change.

With regard to future thinking, Gerd used my future scenario slide to describe both the exponential and combinatorial nature of future scenarios — not only do we need to think exponentially, but we also need to think in a combinatorial manner. Gerd mentioned Tesla as a company that really knows how to do this.

Our Emerging Future

He then described our current pivot point of exponential change: a point in history where humanity will change more in the next twenty years than in the previous 300. With that as a backdrop, he encouraged the audience to look five years into the future and spend 3 to 5% of their time focused on foresight. He quoted Peter Drucker (“In times of change the greatest danger is to act with yesterday’s logic”) and stated that leaders must shift from a focus on what is, to a focus on what could be. Gerd added that “wait and see” means “wait and die” (love that by the way). He urged leaders to focus on 2020 and build a plan to participate in that future, emphasizing the question is no longer what-if, but what-when. We are entering an era where the impossible is doable, and the headline for that era is: exponential, convergent, combinatorial, and inter-dependent — words that should be a key part of the leadership lexicon going forward. Here are some snapshots from his presentation:

  • Because of exponential progression, it is difficult to imagine the world in 5 years, and although the industrial era was impactful, it will not compare to what lies ahead. The danger of vastly under-estimating the sheer velocity of change is real. For example, in just three months, the projection for the number of autonomous vehicles sold in 2035 went from 100 million to 1.5 billion.
  • Six years ago Gerd advised a German auto company about the driverless car and the implications of a sharing economy — and they laughed. Think of what’s happened in just six years — can’t imagine anyone is laughing now. He witnessed something similar as a veteran of the music business where he tried to guide the industry through digital disruption; an industry that shifted from selling $20 CDs to making a fraction of a penny per play. Gerd’s experience in the music business is a lesson we should learn from: you can’t stop people who see value from extracting that value. Protectionist behavior did not work, as the industry lost 71% of their revenue in 12 years. Streaming music will be huge, but the winners are not traditional players. The winners are Spotify, Apple, Facebook, Google, etc. This scenario likely plays out across every industry, as new businesses are emerging, but traditional companies are not running them. Gerd stressed that we can’t let this happen across these other industries
  • Anything that can be automated will be automated: truck drivers and pilots go away, as robots don’t need unions. There is just too much to be gained not to automate. For example, 60% of the cost in the system could be eliminated by interconnecting logistics, possibly realized via a Logistics Internet as described by economist Jeremy Rifkin. But the drive towards automation will have unintended consequences and some science fiction scenarios could play out. Humanity and technology are indeed intertwining, but technology does not have ethics. A self-driving car would need ethics, as we make difficult decisions while driving all the time. How does a car decide to hit a frog versus swerving and hitting a mother and her child? Speaking of science fiction scenarios, Gerd predicts that when these things come together, humans and machines will have converged:
  • Gerd has been using the term “Hellven” to represent the two paths technology can take. Is it 90% heaven and 10% hell (unintended consequences), or can this equation flip? He asks the question: where are we trying to go with this? He used the real example of drones used to benefit society (heaven), but people buying guns to shoot them down (hell). As we pursue exponential technologies, we must do it in a way that avoids negative consequences. Will we allow humanity to move down a path where by 2030, we will all be human-machine hybrids? Will hacking drive chaos, as hackers gain control of a vehicle? A recent recall of 1.4 million Jeeps underscores the possibility. A world of super intelligence requires super humanity — technology does not have ethics, but society depends on it. Is this Ray Kurzweil’s vision what we want?
  • Is society truly ready for human-machine hybrids, or even advancements like the driverless car that may be closer to realization? Gerd used a very effective Video to make the point
  • Followers of my Blog know I’m a big believer in the coming shift to value ecosystems. Gerd described this as a move away from Egosystems, where large companies are running large things, to interdependent Ecosystems. I’ve talked about the blurring of industry boundaries and the movement towards ecosystems. We may ultimately move away from the industry construct and end up with a handful of ecosystems like: mobility, shelter, resources, wellness, growth, money, maker, and comfort
  • Our kids will live to 90 or 100 as the default. We are gaining 8 hours of longevity per day — one third of a year per year. Genetic engineering is likely to eradicate disease, impacting longevity and global population. DNA editing is becoming a real possibility in the next 10 years, and at least 50 Silicon Valley companies are focused on ending aging and eliminating death. One such company is Human Longevity Inc., which was co-founded by Peter Diamandis of Singularity University. Gerd used a quote from Peter to help the audience understand the motivation: “Today there are six to seven trillion dollars a year spent on healthcare, half of which goes to people over the age of 65. In addition, people over the age of 65 hold something on the order of $60 trillion in wealth. And the question is what would people pay for an extra 10, 20, 30, 40 years of healthy life. It’s a huge opportunity”
  • Gerd described the growing need to focus on the right side of our brain. He believes that algorithms can only go so far. Our right brain characteristics cannot be replicated by an algorithm, making a human-algorithm combination — or humarithm as Gerd calls it — a better path. The right brain characteristics that grow in importance and drive future hiring profiles are:
  • Google is on the way to becoming the global operating system — an Artificial Intelligence enterprise. In the future, you won’t search, because as a digital assistant, Google will already know what you want. Gerd quotes Ray Kurzweil in saying that by 2027, the capacity of one computer will equal that of the human brain — at which point we shift from an artificial narrow intelligence, to an artificial general intelligence. In thinking about AI, Gerd flips the paradigm to IA or intelligent Assistant. For example, Schwab already has an intelligent portfolio. He indicated that every bank is investing in intelligent portfolios that deal with simple investments that robots can handle. This leads to a 50% replacement of financial advisors by robots and AI
  • This intelligent assistant race has just begun, as Siri, Google Now, Facebook MoneyPenny, and Amazon Echo vie for intelligent assistant positioning. Intelligent assistants could eliminate the need for actual assistants in five years, and creep into countless scenarios over time. Police departments are already capable of determining who is likely to commit a crime in the next month, and there are examples of police taking preventative measures. Augmentation adds another dimension, as an officer wearing glasses can identify you upon seeing you and have your records displayed in front of them. There are over 100 companies focused on augmentation, and a number of intelligent assistant examples surrounding IBM Watson; the most discussed being the effectiveness of doctor assistance. An intelligent assistant is the likely first role in the autonomous vehicle transition, as cars step in to provide a number of valuable services without completely taking over. There are countless Examples emerging
  • Gerd took two polls during his keynote. Here is the first: how do you feel about the rise of intelligent digital assistants? Answers 1 and 2 below received the lion’s share of the votes
  • Collectively, automation, robotics, intelligent assistants, and artificial intelligence will reframe business, commerce, culture, and society. This is perhaps the key take away from a discussion like this. We are at an inflection point where reframing begins to drive real structural change. How many leaders are ready for true structural change?
  • Gerd likes to refer to the 7-ations: Digitization, De-Materialization, Automation, Virtualization, Optimization, Augmentation, and Robotization. Consequences of the exponential and combinatorial growth of these seven include dependency, job displacement, and abundance. Whereas these seven are tools for dramatic cost reduction, they also lead to abundance. Examples are everywhere, from the 16 million songs available through Spotify, to the 3D printed cars that require only 50 parts. As supply exceeds demand in category after category, we reach abundance. As Gerd put it, in five years’ time, genome sequencing will be cheaper than flushing the toilet and abundant energy will be available by 2035 (2015 will be the first year that a major oil company will leave the oil business to enter the abundance of the renewable business). Other things to consider regarding abundance:
  • Efficiency and business improvement are a path, not a destination. Gerd estimates that total efficiency will be reached in 5 to 10 years, creating value through productivity gains along the way. However, after total efficiency is met, value comes from purpose. Purpose-driven companies have an aspirational purpose that aims to transform the planet; referred to as a massive transformative purpose in a recent book on exponential organizations. When you consider the value that the millennial generation places on purpose, it is clear that successful organizations must excel at both technology and humanity. If we allow technology to trump humanity, business would have no purpose.
  • In the first phase, the value lies in the automation itself (productivity, cost savings). In the second phase, the value lies in those things that cannot be automated. Anything that is human about your company cannot be automated: purpose, design, and brand become more powerful. Companies must invent new things that are only possible because of automation
  • Technological unemployment is real this time — and exponential. Gerd pointed to a recent study by The Economist that describes how robotics and artificial intelligence will increasingly be used in place of humans to perform repetitive tasks. On the other side of the spectrum is a demand for better customer service and greater skills in innovation, driven by globalization and falling barriers to market entry. Therefore, creativity and social intelligence will become crucial differentiators for many businesses; jobs will increasingly demand skills in creative problem-solving and constructive interaction with others.
  • Gerd described a basic income guarantee that may become necessary if some of these unemployment scenarios play out. Something like this is already on the ballot in Switzerland, and it is not the first time the idea has been discussed.
  • In the world of automation, experience becomes extremely valuable — and you can’t (nor should you attempt to) automate experiences. We clearly see an intense focus on customer experience, and we had a great discussion on the topic on an August 26th Game Changers broadcast. Innovation is critical to both the service economy and the experience economy. Gerd used a visual to describe the progression of economic value:
Source: B. Joseph Pine II and James Gilmore: The Experience Economy
  • Gerd used a second poll to sense how people would feel about humans becoming artificially intelligent. Here again, the audience leaned towards the first two possible answers.

Gerd then summarized the session as follows:

The future is exponential, combinatorial, and interdependent: the sooner we can adjust our thinking (lateral) the better we will be at designing our future.

My take: Gerd hits on a key point. Leaders must think differently. There is very little in a leader’s collective experience that can guide them through the type of change ahead — it requires us all to think differently.

When looking at AI, consider trying IA first (intelligent assistance / augmentation).

My take: These considerations allow us to create the future in a way that avoids unintended consequences. Technology as a supplement, not a replacement.

Efficiency and cost reduction based on automation, AI/IA and Robotization are good stories but not the final destination: we need to go beyond the 7-ations and inevitable abundance to create new value that cannot be easily automated.

My take: Future thinking is critical for us to be effective here. We have to have a sense as to where all of this is heading, if we are to effectively create new sources of value

We won’t just need better algorithms — we also need stronger humarithms i.e. values, ethics, standards, principles and social contracts.

My take: Gerd is an evangelist for creating our future in a way that avoids hellish outcomes — and kudos to him for being that voice.

“The best way to predict the future is to create it” (Alan Kay).

My take: Our context when we think about the future puts it years away, and that is just not the case anymore. What we think will take ten years is likely to happen in two. We can’t create the future if we don’t focus on it through an exponential lens.

Source: https://medium.com/@frankdiana/digital-transformation-of-business-and-society-5d9286e39dbf

Chris McCann – Blockchains don’t scale (yet), aren’t really decentralized, distribute wealth poorly, lack killer apps, and run on a controlled Internet

Naval Ravikant recently shared this thought:

“The dirty secrets of blockchains: they don’t scale (yet), aren’t really decentralized, distribute wealth poorly, lack killer apps, and run on a controlled Internet.”
https://twitter.com/naval/status/983016288195829770

In this post, I want to dive into his fourth observation — that blockchains “lack killer apps” — and understand just how far away we are from real applications (not tokens, not stores of value, etc.) being built on top of blockchains.

Thanks to Dappradar, I was able to analyze the top decentralized applications (DApps) built on top of Ethereum, the largest decentralized application platform. My research is focused on live public DApps which are deployed and usable today. This does not include any future or potential applications not yet deployed.

If you look at a broad overview of the 312 DApps created, the main categories are:

I. Decentralized Exchanges
II. Games (Largely collectible type games, excluding casino/games of chance)
III. Casino Applications
IV. Other (we’ll revisit this category later)

On closer examination, it becomes clear only a few individual DApps make up the majority of transactions within their respective category:

  • Diving into the “Other” category, the largest individual DApps in this category are primarily pyramid schemes: PoWH 3D, PoWM, PoWL, LockedIn, etc. (*Please exercise caution: all of these projects are actual pyramid schemes.)

These top DApps are all still very small relative to traditional consumer web and mobile applications.

*Even “Peak DApp” isn’t that large: by our rough estimates, CryptoKitties only had ~14,000 unique users and 130,000 transactions daily.

Compared to:

*As another comparison point, even the top 50 apps in the Google Play Store alone get on average 25,000+ downloads per day. (This is just downloads, not even counting “transactions”).

Further trends emerge on closer inspection of the transactions of DApps tracked here:

  • More than half of all DApps have zero transactions in the last week.
  • Of the DApps with any usage, the majority of usage is skewed to a small few (see graph).
  • Only 25% of DApps have more than 100 transactions in a week.
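The aggregate figures above boil down to a simple tally over weekly per-DApp transaction counts. A minimal sketch in Python of that tally — the `usage_stats` helper and the sample `counts` list are illustrative assumptions, not the author’s actual Dappradar data:

```python
def usage_stats(weekly_tx_counts):
    """Return (share of DApps with zero transactions,
    share with more than 100 transactions) for one week of counts."""
    n = len(weekly_tx_counts)
    zero = sum(1 for c in weekly_tx_counts if c == 0)
    over_100 = sum(1 for c in weekly_tx_counts if c > 100)
    return zero / n, over_100 / n

# Hypothetical data: a few heavily used DApps and a long tail of inactive ones,
# mimicking the skew described in the post.
counts = [25000, 8000, 900, 450, 120, 40, 12, 3] + [0] * 8

zero_share, active_share = usage_stats(counts)
print(f"{zero_share:.0%} of DApps had zero transactions this week")
print(f"{active_share:.0%} had more than 100 transactions")
```

With the sample numbers above, half the DApps show zero activity and only a handful clear the 100-transaction bar, which is the shape of the distribution the post reports.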

Takeaways

Where we are and what it means for protocols and the ecosystem:

After looking through the data, my personal takeaways are:

  1. We are orders of magnitude away from consumer adoption of DApps. No killer app (outside of tokens and trading) has been created yet. Any seemingly “large” DApp (e.g. IDEX, CryptoKitties) has low usage overall.
  2. All of the top DApps are still very much about speculation on value. Decentralized exchanges, casino games, pyramid schemes, and even (I would argue) the current collectible games all center on speculation.
  3. What applications (aside from value transfer and speculation) really take advantage of the true unique properties of a blockchain (censorship resistance, immutability of data, etc) and unlock real adoption?
  4. For new protocol developers, instead of trying to convince existing DApp developers to build on your new platform — think hard about what DApps actually make sense on your protocol and how to help them have a chance at real adoption.
  5. We as an ecosystem need to build better tools and infrastructure for more widespread adoption of DApps. Metamask is an awesome tool, but it is still a difficult onboarding step for most normal users. Toshi, Status, and Cipher are all steps in the right direction, and I’m really looking forward to the creation of other tools to simplify the user onboarding experience and improve general UI/UX for normal users.

What kind of DApps do you think we as a community should be building? Would love to hear your takeaways and thoughts about the state of DApps, feel free to comment below or tweet @mccannatron.

Also, if there are any DApps or UI/UX tools I should be paying attention to, let me know — I would love to check them out.

Playbook App – 67 Blockchain Articles & Whitepapers that Shaped Crypto into What it is Today

With all of the noise surrounding bitcoin and its underlying technology, blockchain, it’s often difficult to separate real blockchain articles from those just looking for clicks.

Here’s a list, in no particular order, of articles and whitepapers written by the people actively involved in developing this new technology. The resources below range from the 1980s to today.

Source: cointelegraph.com

Articles

  1. Fat Protocols by Joel Monegro, Union Square Ventures
  2. Bitcoin — The Internet of Money by Naval Ravikant
  3. The Bitcoin Model for Crowdfunding by Naval Ravikant
  4. The Fifth Protocol by Naval Ravikant
  5. A Beginner’s Guide to Blockchain Technology by Coindesk
  6. The Blockchain Problem Space by Dax Ravi, Ironbay
  7. Blockchain and Design by Blake Hudelson
  8. A Quick Reminder Why Bitcoin was Invented in the First Place by Playbook App
  9. Making Money — Bitcoin Explained (with Emoji) by Tess Rinearson, Chain
  10. Bitcoin, Ethereum, Blockchain, Tokens, ICOs: Why should anyone care? by Preethi Kasireddy
  11. Crypto Tokens: A Breakthrough in Open Network Design by Chris Dixon
  12. Minimum Viable Block Chain by Ilya Grigorik
  13. Introduction to Bitcoin Concepts by 21
  14. Why Bitcoin Matters by Marc Andreessen
  15. Bit gold by Nick Szabo
  16. What is bitcoin and the blockchain? by Brian Forde
  17. Blockchain, simplified by Alexey Malanov
  18. Bitcoin in comic by Shi Yan
  19. The Importance of Bitcoin Not Being Money by Erik Voorhees
  20. The Role of Bitcoin as Money by Erik Voorhees
  21. Blythe Masters Tells Banks the Blockchain Changes Everything by Edward Robinson
  22. The Crypto Anarchist Manifesto by Timothy C. May
  23. The God Protocols by Nick Szabo
  24. A Cyberpunk’s Manifesto by Eric Hughes
  25. How the Bitcoin protocol actually works by Michael Nielsen
  26. Why the blockchain matters by Reid Hoffman
  27. Bitcoin faces a crossroads, needs an effective decision-making process by Arvind Narayanan
  28. Encrypted Data For Efficient Markets by Numerai
  29. Thoughts on Tokens by 21
  30. Three Technical Requirements to Connect Blockchains Without a Token by TenX
  31. Why I like the term, “Cryptoassets” by cburniske
  32. The Crypto J-Curve by cburniske
  33. Quantifying Decentralization by Balaji S. Srinivasan
  34. Beyond the boring blockchain bubble by Jon Evans
  35. Blockchains from the ground up: Part 1 by John Matthews
  36. On Bitcoin and Red Balloons by Moshe Babaioff
  37. A Proof of Stake Design Philosophy by Vitalik Buterin

White Papers & PDFs

  1. Blockchain in the Mainstream by Jeremy Epstein, Never Stop Marketing
  2. Denationalisation of Money: The Argument Refined by F. A. Hayek
  3. RPOW — Reusable Proofs of Work by Hal Finney
  4. B-Money by W. Dai
  5. Bitcoin Whitepaper by Satoshi Nakamoto
  6. Some Simple Economics of the Blockchain by Christian Catalini, MIT Sloan
  7. Hashcash — A Denial of Service Counter-Measure by Adam Back
  8. Shelling Out: The Origins of Money by Nick Szabo
  9. Bitcoin and The Age of Bespoke Silicon by Michael Bedford Taylor
  10. Majority is not Enough: Bitcoin Mining is Vulnerable by Ittay Eyal, Cornell
  11. On Bitcoin as a public randomness source by Joseph Bonneau
  12. Research Perspectives and Challenges for Bitcoin and Cryptocurrencies by Joseph Bonneau
  13. Eclipse Attacks on Bitcoin’s Peer-to-Peer Network by Ethan Heilman
  14. Does Governance Have a Role in Pricing? Cross-Country Evidence from Bitcoin Markets by Robert Viglione
  15. Bitcoin in Islamic Banking and Finance by Charles W. Evans
  16. Bitcoin-NG: A Scalable Blockchain Protocol by Ittay Eyal
  17. The Bitcoin Backbone Protocol: Analysis and Applications by Juan A. Garay
  18. The Bitcoin Lightning Network: Scalable Off-Chain Instant Payments by Joseph Poon
  19. Essays on Bitcoin by Alex Kroeger
  20. Blockchain Technology by Michael Crosby, Google
  21. Blockchain: the solution for transparency in product supply chains by Provenance
  22. Blockstack: A Global Naming and Storage System, Secured by Blockchains by Muneeb Ali
  23. Consensus: Immutable Agreement for the Internet of Value by Sigrid Seibold
  24. Distributed Ledger Technology: beyond block chain by UK Government Chief Scientific Advisor
  25. Economics of Bitcoin: is Bitcoin an alternative to fiat currencies and gold? by Peter Surda
  26. Extending Existing Blockchains with Virtualchain by Jude Nelson
  27. The Impact and Potential of Blockchain on the Securities Transaction Lifecycle by Michael Mainelli
  28. A Fistful of Bitcoins: Characterizing Payments Among Men with No Names by Sarah Meiklejohn
  29. An architecture for the Internet of Money by Meher Roy
  30. MRA Bitcoin Primer by Phillip Rapoport