News & Press

Follow our trend news and stay up to date.

Ed Sim – Enterprise Strikes Back – Legacy workloads in the enterprise continue to move to a cloud-native architecture

Consumer companies are the ones that drive the headlines, that generate the most clicks on TechCrunch, and are top of mind for many in the tech industry. So I’d like to celebrate this brief point in time where the enterprise strikes back. While one of the darlings of the last 10 years, Facebook, is getting pummeled, the enterprise market is back in the spotlight.

Look at the Dropbox IPO, which priced above its initial range and came out white hot at the end of one of the worst weeks in stock market performance. Couple that with Mulesoft being bought for 21x TTM revenue (see Tomasz Tunguz’s analysis) at $6.5 billion and Pivotal’s recent S-1 filing, and you can see why the enterprise market has everyone’s attention again. However, I’ve been around the markets long enough to know that this too shall pass.

The real story in my mind is about what’s next. It’s true that Salesforce and Workday have created some of the biggest returns in recent enterprise memory. And with that, VC money poured into every category imaginable as every VC and entrepreneur scrambled to create a new system of record…until there were no more new systems of record to be created. My view is that we will see many more of these application-layer companies go public in the next couple of years, and that will be awesome for sure. There will still be some amazing companies that raise their Series C, D and beyond funding rounds with scaling metrics. There will also be a few new SaaS app founders with incredible domain expertise reinventing pieces of the old-guard public SaaS companies.

However, as a first-check investor in enterprise startups, the companies that truly get my attention are more of the infrastructure-layer variety, like Mulesoft and Pivotal. We are at the beginning stages of one of the biggest IT shifts in history as legacy workloads in the enterprise continue to move to a cloud-native architecture. Being in NYC and working with many of the 52 Fortune 500 companies based here that are undergoing their own migrations and challenges makes us even more excited about what’s ahead. The problem is that as an investor in infrastructure, it’s quite scary to enter a world where AWS commoditizes every bit of infrastructure and elephants like Microsoft and Google are not far behind. Despite that, it’s also hard to ignore the following facts:

  1. Enormous spend on and growth in public cloud, application infrastructure, middleware, and developer software: roughly $50B (Gartner, via the Pivotal S-1)
  2. Rise of multi-cloud
  3. Fortune 1000 digital transformation journeys still in early innings
  4. Most legacy workloads are still locked on-prem and not moved to any cloud infrastructure
  5. Every large enterprise is a software company which means developer productivity is paramount
  6. The infrastructure market moves way too fast, and more software is needed to help manage this chaos
  7. New architectures = new attack vectors and security needs to be reimagined
  8. Serverless technologies…

and many more threads which can create new billion-dollar outcomes. The key here is tying all of this to a business problem to solve, not just having infrastructure for infrastructure’s sake.

SaaS to Infrastructure, Salesforce and Mulesoft

Salesforce clearly sees the future, and it’s in moving a layer deeper into the infrastructure stack, combining the world of applications with the back end and cloud with on-prem. The irony is that the company that led the “no software” movement is the one that bought Mulesoft, a company where half of its revenue comes from software installed on-premise. What Salesforce clearly understands is that in the world of enterprise, integration becomes king as organizations constantly look to get disparate applications, databases and other systems to talk to each other.

“Every digital transformation starts and ends with the customer,” Salesforce CEO Marc Benioff said in a statement. “Together, Salesforce and MuleSoft will enable customers to connect all of the information throughout their enterprise across all public and private clouds and data sources — radically enhancing innovation.”

It’s a digital transformation journey, one that every Fortune 1000 is undergoing. In a world where Gartner predicts that 75% of new applications supporting digital businesses will be built, not bought, by 2020, you can see why Mulesoft’s integration platform helps Salesforce future-proof itself and embed itself in a future where developers rule.

The Pivotal Story and Digital Transformation

If you are looking for a story about how large enterprises digitally transform themselves into agile software organizations (to the extent they can), then I suggest reading Pivotal’s S-1, filed on Friday. Their ascent over the last 5 years mirrors many of the trends we are hearing about on a daily basis: cloud in all forms — public, private, hybrid, and multi; agility; the rise of developers; monolithic apps to microservices; containers; continuous integration/deployment; abstraction of ops and infrastructure; and every Fortune 500 is a software company in disguise. Their growth to over $509mm of revenue from $281mm two years ago is a case in point. What Pivotal understood early is that there is no digital transformation and agile application development without infrastructure spend. Benioff clearly understands this, which is why he paid such a high multiple for Mulesoft.

For those that don’t know what Pivotal does, here is what they do in a nutshell:

PCF accelerates and streamlines software development by reducing the complexity of building, deploying and operating modern applications. PCF integrates an expansive set of critical, modern software technologies to provide a turnkey cloud-native platform. PCF combines leading open-source software with our robust proprietary software to meet the exacting enterprise-grade requirements of large organizations, including the ability to operate and manage software across private and public cloud environments, such as Amazon Web Services, Microsoft Azure, Google Cloud Platform, VMware vSphere and OpenStack. PCF is sold on a subscription basis.

I’ve been fortunate to have had a chance to watch closely through my first check into Greenplum many moons ago, a company that ultimately sold to EMC and was spun back out as Pivotal (along with some VMware assets). I also remember the journey the founders took when they decided to sell into P&L units at Fortune 500s charged with making a more agile company. Instead of selling infrastructure to IT, they were able to sell a vision of how P&L units could deliver on their goals faster. It was difficult in the beginning, but proved out over time. These P&L units were the ones charged with creating the bank of the future, the hotel of the future, the insurance company of the future, all centered around a better customer experience driven off of one platform that allowed developers to be more productive and delivered on any cloud.

My only fear about all of this enterprise infrastructure excitement is that, like the SaaS markets of yesteryear, this attention will attract way too much venture capital, driving up prices and reducing opportunities to create meaningful exits. It’s great that enterprise infrastructure is top of mind, but part of me prefers for it to stay in the background, stealthily delivering amazing results.

WiderFunnel – How to blast through silo mentality to create a culture of experimentation

Is your experimentation program experiencing push-back from other departments? Marketers and designers who own the brand? Developers with myriad other priorities? Product owners who’ve spent months developing a new feature?

The reality is that experimentation programs often lose steam because they are operating within a silo.

Problems arise when people outside of the optimization team don’t understand the why behind experimentation. When test goals aren’t aligned with other teams’ KPIs. When experimentation successes aren’t celebrated within the organization-at-large.

Optimization champions can struggle to scale their experimentation programs from department initiatives to an organizational strategy. Because to scale, you need buy-in.

Most companies have a few people who are optimizers by nature, interest, or experience. Some may even have a “growth team.” But what really moves the dial is when everyone in the company is on board and thinks this way reflexively, with full support from C-level leaders.

– Krista Seiden, Global Analytics Education Lead, Google

But getting buy-in for any initiative – especially one that challenges norms, like experimentation – is no easy task. Particularly if your organization suffers from silo mentality.

In this post, we outline a 5-step process for blasting through the silo-mentality blocks in your organization to create a culture of experimentation.

Our 5-step process for destroying silos so you can scale your experimentation program:

  • Step 1: Align your stakeholders on a vision for experimentation.
  • Step 2: Get buy-in from stakeholders and leaders.
  • Step 3: Clarify your experimentation protocol.
  • Step 4: Get strategic with cross-functional teams.
  • Step 5: Let the internal communication flow.

First things first: What is silo mentality?

At WiderFunnel, we often hear questions like: How can I get other people on board with our experimentation program? How can I create an organizational culture of experimentation, when our team seems to be working in a bubble?

When a company operates in silos, people have fewer opportunities to understand the priorities of other departments. Teams can become more insular. They may place greater emphasis on their own KPIs, rather than working with the team-at-large towards the organization’s business goals.

But it’s not silos that are necessarily the problem, it’s silo mentality.

And when an experimentation mindset is only adopted by the optimization team, silo mentality can be a major barrier to scaling your program to create a culture of experimentation.

Silo mentality causes people to withhold information, ignore external priorities, and delay processes where other teams are involved. All in an effort to ensure their team’s success over another’s.

Within a silo, members can suffer from groupthink, emphasizing conformity over confrontation and allowing weak ideas or processes to go unchallenged. They rely on intuition to guide their practices, and resist change because it’s new, uncomfortable, and different.

At its worst, silo mentality points to adversarial dynamics between teams and their leads: internal conflict among managers as they fight over limited resources or compete to rise to the upper echelons of your organization.

Teams follow the leader

Silo mentality often comes down to leaders, who are creating the goals and priorities for their teams. If team leads experience conflict, this us-against-them mentality can trickle down to their reports.

Managers, particularly under stress, may feel threatened by another manager’s initiatives. This is because silos often form in organizations where leaders are competing. Or, they appear in organizations where there is a clear divide between upper management and employees.

Unfortunately, silo mentality is a pain point for many optimization champions. But every department is a stakeholder in your organization’s growth. And to enable a strong organizational culture of experimentation, every department needs to understand the value of testing—the why.

So, let’s dive in and explore our 5-step process for breaking down silo mentality. At the heart of this process is creating an understanding of what experimentation can do for the individual, the department, and the organization-at-large.

Step 1: Align your stakeholders on a vision for experimentation.

You may be thinking: What does a “culture of experimentation” even look like?

That’s a great question.

A culture of experimentation is about humility and fearlessness. It means that your organization will use testing as a way to let your customers guide your decision making.

Develop your organization’s culture of experimentation with our strategic planning worksheet!

Plan your next steps for winning over doubtful executives, keeping the experimentation ball rolling, and getting your entire organization excited about optimization.

An inspiring vision can do wonders for team buy-in. A vision can help align the team on what they should aspire to with experimentation.

Ask yourself these questions to create a vision for your experimentation program:

  1. What business problem(s) do we hope to address with experimentation?
  2. Why is experimentation the best solution to our business problem?
  3. What would test-and-learn values add to our individual team members, departments, and organization?
  4. What do we envision for our organization’s “culture of experimentation”?

In traditional business settings, leadership often takes a top-down approach to communication. But experimentation flips this dynamic on its head. Instead of the HiPPO (highest paid person’s opinion) calling all the shots, all ideas must be tested before initiatives can be implemented.


To me, a culture of experimentation is simply measured by the number of people in an organization who are able to admit, ‘I don’t know the answer to that question, but I know how to get it’.

If people within your organization are telling you ‘This is what our customers want’ (without the data to back it up) then you have a problem. Organizations that excel at experimenting aren’t better at knowing what customers want, they are just better at knowing how to find out.

– Mike St. Laurent, WiderFunnel Senior Optimization Strategist

The most effective way to persuade others to adopt an experimentation mindset and subscribe to your vision is to model it yourself. You need to demonstrate the test-and-learn values of an Optimization Champion. Values like:

We listen to our gut, then test what it says.
We gather market research, then test it.
We create best practices, then test them.
We listen to our opinions, then test them.
We hear the advice of others, then test it.
We hear the advice of experts, then test it.
We believe in art and science, creativity and discipline, intuition and evidence, and continuous improvement.
We aim for marketing insights.
We aim to improve business results.
We test because it works.
Scientific testing is our crucible for decision-making.

– Chris Goward in “The Optimization Champion’s Handbook”

Once you have clarified your vision, write it down in a declarative statement. And be ready to communicate your vision over. And over. And over.

Step 2: Get buy-in from stakeholders and leaders.

You can’t achieve a culture of experimentation by yourself. You need testing allies.

Other department leads can help create momentum, acting as internal influencers, inspiring others to adopt an experimentation mindset in their workflow. They can help spread the gospel of a test-and-learn culture.

When executives embrace failure as an inherent and integral part of the learning process, there is a trickle-down effect on the entire enterprise. Increasingly, more employees from more departments are eager to learn about the customer experience and contribute their ideas. With more individuals invested and involved, it’s easier for a company to gain a deeper understanding of its customer base and make informed decisions that drive business value.

To do this, you need to understand what will motivate stakeholders to fully adopt an experimentation mindset, and how to incentivize them to champion the cause of experimentation. And of course, not everyone will subscribe to your vision.

At least not right away. It may take some finesse. In her Growth & Conversion Virtual Summit presentation, Gretchen Gary, Product Manager at HP, outlined three different types of stakeholders that may have difficulty engaging in a testing culture.

  1. New employees. Recent hires may have yet to experience a test-and-learn culture in their work histories. When their ideas are then positioned as hypotheses, they may feel threatened by the idea of testing and results.
  2. Senior-level managers. They have been hired for their expertise and granted the authority to lead projects. They may feel challenged by the need to test their ideas, rather than using their executive judgment.
  3. Long-term employees. Sometimes the most difficult stakeholders to convince are the employees who have been with your organization for decades and have set workflows. To them, they’ve found a proven process that works and they’re (perhaps) uncomfortable with change.

The underlying emotions for all three types of stakeholders are the same:

  • they may feel threatened by new ideas and concepts;
  • they may feel challenged by their reports and/or leaders;
  • or they may fear failure, being proved wrong.

Your job is to inspire them to overcome these emotions. You need to communicate the possibilities of experimentation to each department in a way that makes sense for them – particularly in terms of their own performance.

What motivates your different stakeholders in your experimentation program? One way to identify what drives different teams is to understand their KPIs.

What’s in it for your stakeholders?

You, the Optimization Champion, will need to navigate different perspectives, opinions, and levels of knowledge about testing. You’ll want to:

  1. Address the questions, “Why should I care?” Or, “What’s in it for me?”
  2. Tailor your message to each individual leader’s concerns.

The best thing you can do is try to familiarize yourself with [other teams’] KPIs so you can speak their language and know what might drive them to be more involved with your program.

– Gretchen Gary

Support your vision of experimentation by building a business case. Leverage existing case studies to demonstrate how similar organizations have achieved growth. And show, through real-world examples, how different internal teams — from product development to marketing, from branding to IT — have incorporated experimentation into their workflows.

Step 3: Clarify your experimentation protocol.

It’s important to create an experimentation protocol so that people across your organization understand how and when they can contribute to the experimentation program.

Remove bottlenecks and unify siloed and understaffed teams by formalizing an optimization methodology, empowering individuals and groups to take ownership to execute it.

– Hudson Arnold, Senior Strategy Consultant at Optimizely
An experimentation protocol is like a solid foundation: it lets people know how and when they can contribute.

A standard process enables any employee to know when they can take ownership over a test and when they’ll need to collaborate with other stakeholders.

Building a test protocol is essential. If I’ve learned anything over the last six years, it is that you really have to have formal test protocol so everyone is aware of how the testing tool works, how a test is operated and performed, and then how you’re reading out your results. You will always get questions about the credibility of the result, so the more education you can do there, the better.

– Gretchen Gary

First, evaluate how your experimentation program is currently structurally organized. And think about the ideal structure for your organization and business needs.

Experimentation programs often fall into one of the following organizational structures:

  1. All-owned testing: Optimizers exist in different teams across the organization. Each strategizes and executes experiments according to their own KPIs, as they are positioned to reach larger organizational goals. This model does not have the oversight of a central team.
  2. Center of excellence: Common in companies where individual teams have their own sites or domains. A central body enables independent teams to run in parallel, encouraging the growth of skills and expanding experimentation.
  3. A hybrid version: When there is shared ownership of digital initiatives, a central body works with all relevant stakeholders to prioritize and plan for experimentation. But individual teams have the ownership to execute tests.

Regardless of how you structure your program, education is a major part of ensuring success when experimentation is a company-wide initiative. Anyone involved in testing should understand the ultimate goals, the experimentation methodology, and how to properly design tests to reveal growth and insights.

When clarifying your organization’s experimentation methodology, you should:

  • Share your organization’s SMART goals for the experimentation program;
  • Clearly communicate expectations for experimentation, including defined key deliverables;
  • Outline the steps involved in an experiment, noting when approvals are needed and when team contributions are welcome;
  • Educate your team(s) in experimentation, including the tools, frameworks, and technologies, and how to properly structure experiments;
  • And, finally, ensure you have a proper feedback loop to determine how and why contributions are prioritized internally. When your team understands that you can’t test everything at once, they’ll understand that every idea is vetted through a prioritization framework (a minimal scoring sketch follows this list).
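
To make that prioritization concrete, here is a minimal sketch of how incoming test ideas might be scored. It uses the PIE criteria (Potential, Importance, Ease) popularized by WiderFunnel; the 1–10 scales and the example ideas are illustrative assumptions, not part of any specific protocol.

```python
# A minimal sketch of a PIE-style prioritization framework (Potential,
# Importance, Ease). The 1-10 scales and example ideas are hypothetical;
# adapt the criteria to your own program.
from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    potential: float   # How much improvement is possible here? (1-10)
    importance: float  # How valuable is this traffic or page? (1-10)
    ease: float        # How easy is it to implement and test? (1-10)

    @property
    def pie_score(self) -> float:
        # PIE averages the three criteria into a single ranking score.
        return (self.potential + self.importance + self.ease) / 3

ideas = [
    TestIdea("Simplify checkout form", potential=8, importance=9, ease=5),
    TestIdea("New homepage hero copy", potential=6, importance=7, ease=9),
    TestIdea("Rebuild pricing page", potential=9, importance=8, ease=2),
]

# Rank ideas so every contributor can see why one test runs before another,
# which helps close the feedback loop described above.
for idea in sorted(ideas, key=lambda i: i.pie_score, reverse=True):
    print(f"{idea.pie_score:.1f}  {idea.name}")
```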

“Every department should have complete access to and be encouraged to submit ideas for experimentation. But this should only be done when the company is also confident it can complete the feedback loop and provide explanation as to the acceptance or rejection of every single idea,” Mike St. Laurent explains.

An incomplete feedback loop – where people’s ideas get lost in a black hole – is one of the most detrimental things that can happen to the testing culture. Until a feedback loop can be established, it is better for a more isolated testing team to prove the value of the program, without the stressors caused by other parts of the organization getting involved.

When you are seeking buy-in from other departments, communicating why and how experimentation ideas get implemented will encourage further contributions.

Different departments in your organization offer unique insight, experience, and expertise that can lead to experiment hypotheses. Experimentation protocol should communicate why your organization is testing, and how and when people can contribute.

Step 4: Get strategic with cross-functional teams.

If silo mentality is limiting your experimentation program, cross-functional teams may be an ideal solution. On cross-functional teams, each member has a different area of expertise and can bring a unique perspective to testing.

Cross-functional teams facilitate dialogue across silos, providing the opportunity for people from different departments to share their knowledge and understand the unique KPIs of other teams. 

“Eliminate the territoriality of small teams,” advises Deborah Wahl, CMO of Cadillac and former CMO of McDonald’s. “[Leverage] small, cross-functional teams rather than teams at-large and really get people committed to working towards the same goal.”

When you form cross-functional teams, everyone benefits by gaining a deeper understanding of what drives other teams, what KPIs measure their success, and how experimentation can help everyone solve real business problems. They can also generate a wealth of experiment ideas.

Hypothesis volume is (after all) one of the biggest roadblocks that organizations experience in their optimization programs.

Cross-functional teams can channel the conflict associated with silo mentality toward innovative solutions since they help to break down the silo characteristic of groupthink.

People bring different perspectives and objectives to cross-functional teams and it will take time to get them aligned. But it’s worth the effort when you can channel their efforts towards innovative solutions.

How to move from groupthink to think tank

Bruce Tuckman’s theory of group development provides a unique lens for the problem of collaboration within teams. He breaks down the four stages of group development:

  1. Forming
  2. Storming
  3. Norming
  4. Performing

In the first stage, forming, a team comes together to learn about the goals of other team members and they become acquainted with the goals of the group. In this case, the goal is growth through experimentation.

Everyone is more polite in this stage, but they are primarily still oriented toward their own desires for an outcome. They are invested in their own KPIs, rather than aligning on a common goal. And that’s fine, because they’re just getting to know each other.

In the second stage, storming, the group learns to trust each other. And conflict starts to rear its head in group discussions, either when members offer different perspectives or when different members make power plays based on title, role, or status within the organization.

But for the team to work, people need to work outside the norms of hierarchy and seniority in favor of collaboration.

In this stage, members feel the excitement of pursuing the goals of the team, but they also may feel suspicion or anxiety over other member’s contributions. You want to make sure this stage happens so that people feel comfortable raising unconventional or even controversial perspectives.

In the context of experimentation, one person’s opinion won’t win out over another person’s opinion. Rather both opinions can be put to the test.

“I find [experimentation] has been a great way to settle disputes over experience and priorities. Basically you just need to find out what people want to know, and offer answers via testing. And that in itself is gaining trust through collaboration. And to do so you need to deliver value to all KPIs, not just the KPIs that your program will be measured on. Aligning on common goals for design, support, operations, and others will really help to drive relevancy of your program,” explains Gretchen Gary.

It’s important to enable the right kind of conflict—the kind that can propel your experimentation program toward new ideas and solutions.

The third stage, norming, is when members of the group start to accept the challenge of meeting their shared goal. They understand the perspectives of others and become tolerant of different working or communication styles. They start to find momentum in the ideation process, and start working out solutions to the problems that arise.

And the last stage, performing, is when the team becomes self-sufficient. They are now committed to the goal and competent at decision-making. And conflict, when it arises, is effectively channeled through to workable solutions.

Teams may go through these stages again and again. And it’s necessary that they do so.

Because you want weak ideas to be challenged. And you want innovative ideas to be applied in your experimentation program.

Step 5: Let the internal communication flow.

Free-flowing internal communication is essential in maintaining and scaling experimentation at your organization.

You should be spreading experiment research, results, and learnings across departments. Those learnings can inform other team’s initiatives, and plant the seed for further marketing hypotheses.

Information has a shelf-life in this era of rapid change. So, the more fluid your internal communication, the more central and accessible your data, the more likely it will be put to use.

How are customer learnings and insights shared at your organization?

One method for sharing information is to create an “intelligence report.”

An intelligence report combines data generated from your organization and data derived from external sources. Paired with stories and insights about experimentation, an intelligence report can be a helpful tool for inciting creativity and generating experimentation ideas.

Another method is to provide regular company-wide results presentations. This creates an opportunity for team members and leaders to hear recent results and customer insights, and be inspired to adopt the test-and-learn mindset. It also provides a space for individuals to express their objections, which is essential in breaking down the silo mindset.

WiderFunnel Strategists share exciting results and insights with the whole team each month.

But sharing insights can also be more informal.

WiderFunnel Strategist Dennis Pavlina shares how one of his clients posts recent test results in the bathroom stalls of their office building to encourage engagement.

A new idea doesn’t get anywhere unless someone champions it, but it’s championship without ownership. Keep it fun and find a way to celebrate the failures. Every failure has a great nugget in it, so how do you pull those out and show people what they gain from it, because that’s what makes the next phase successful.

– Deborah Wahl

Whatever tactic you find most effective for your organization, information dissemination is key. As is giving credit for experiment wins! At WiderFunnel, we credit every single contributor – ideator, strategist, designer, developer, project manager, and more – when we share our experiment results. Because it takes a team to truly drive growth with testing.

Build trust in a culture of experimentation…

A lot of what we talked about in this post is about building trust.

People need to trust systems, procedures and methodologies for them to work. And every initiative in breaking down silos should be geared towards earning that trust.

Because trust is buy-in. It’s a commitment to the process.

…but creating a culture of experimentation is an iterative process.

Creating and maintaining a culture of experimentation doesn’t happen in a straightforward, sequential manner. It’s an iterative process. For example, you’ll want to:

  • Revisit your vision after six months. Has it had widespread adoption? Is your organization aligned on the vision? Do your stakeholders thoroughly understand the benefits of a culture of experimentation?
  • Evaluate your experimentation program for efficiency. Are team members communicating any constraints, any roadblocks, any redundancies in the process? Can you make it more agile, more flexible?
  • Apply your learnings to accelerate the progress of your cross-functional teams. Are your teams forming, storming, norming, and most importantly, performing? What can be improved in terms of collaboration and knowledge-sharing?

Because a culture of experimentation is about continuous exploration and validation. And it’s about testing and optimizing what you’ve learned as an organization. Which means you’ll need to apply these concepts over and over.

Make the terms a part of your vocabulary. Make the steps a part of your routine. Day in and day out.

CBInsights – Future Factory: How Technology Is Transforming Manufacturing

Lights-out manufacturing refers to factories that operate autonomously and require no human presence. These robot-run settings often don’t even require lighting, and can consist of several machines functioning in the dark.

While this may sound futuristic, these types of factories have been a reality for more than 15 years.

Famously, the Japanese robotics maker FANUC has been operating a “lights-out” factory since 2001, where robots are building other robots completely unsupervised for nearly a month at a time.

“Not only is it lights-out,” said FANUC VP Gary Zywiol, “we turn off the air conditioning and heat too.”

To imagine a world where robots do all the physical work, one simply needs to look at the most ambitious and technology-laden factories of today.

For example, the Dongguan City, China-based phone part maker Changying Precision Technology Company has created an unmanned factory.

Everything in the factory — from machining equipment to unmanned transport trucks to warehouse equipment — is operated by computer-controlled robots. The technical staff monitors activity of these machines through a central control system.

Where it once required about 650 workers to keep the factory running, robot arms have cut Changying’s human workforce to less than a tenth of that, down to just 60 workers. A general manager for the company said that it aims to reduce that number to 20 in the future.

As industrial technology grows increasingly pervasive, this wave of automation and digitization is being labelled “Industry 4.0,” as in the fourth industrial revolution.

So, what does the future of factories hold?

To answer this, we took a deep dive into 8 different steps of the manufacturing process, to see how they are starting to change:

  • Product R&D: A look at how platforms are democratizing R&D talent, the ways AI is helping materials science, and how the drafting board of tomorrow could be an AR or VR headset.
  • Resource Planning & Sourcing: On-demand decentralized manufacturing and blockchain projects are working on the complexities of integrating suppliers.
  • Quality Assurance (QA): A look at how computer vision will find imperfections, and how software and blockchain tech will more quickly be able to identify problems (and implement recalls).
  • Warehousing: New warehouse demand could bring “lights-out” warehouses even faster than an unmanned factory, with the help of robotics and vision tracking.

Manufacturers predict overall efficiency to grow annually over the next five years at 7x the rate of growth seen since 1990. And despite representing 11.7% of US GDP and employing 8.5% of Americans, manufacturing remains an area of relatively low digitization — meaning there’s plenty of headroom for automation and software-led improvements.

Manufacturing is deeply changing with new technology, and nearly every manufacturing vertical — from cars, to electronics, to pharmaceuticals — is implicated. The timelines and technologies will vary by sector, but most steps in nearly every vertical will see improvement.

Read on for a closer look at how technology is transforming each step of the manufacturing process.

1. Product R&D

From drug production to industrial design, the planning stage is crucial for mass-production. Across industries, designers, chemists, and engineers are constantly hypothesis testing.

Will this design look right? Does this compound fit our needs? Testing and iterating is the essence of research and development. And the nature of mass-production makes last-minute redesigns costly.

Major corporations across drugs, technology, aerospace, and more pour billions of dollars each year into R&D. General Motors alone spent upwards of $8B on new development last year.

In the highly-scientific world of R&D, high-caliber talent is distributed across the globe. Now, software is helping companies tap into that pool.

When it comes to networking untapped talent in data science and finance, platforms like Kaggle, Quantopian, and Numerai are democratizing “quant” work and compensating their collaborators. The concept has also already taken off in pharmaceutical R&D, and it’s growing elsewhere as well. On-demand science platforms like Science Exchange are currently working across R&D verticals, and allow corporations to quickly solve for a lack of on-site talent by outsourcing R&D.

While R&D scientists may seem non-essential to the manufacturing process, they are increasingly critical for delivering the latest and greatest technology, especially in high-tech manufacturing.

Companies are exploring robotics, 3D printing, and artificial intelligence as avenues to improve the R&D process and reduce uncertainty when going into production. But the process of hypothesis testing has room for improvement, and tightening iteration time will translate to faster and better discoveries.

ROBOTICS & 3D PRINTING SPEED UP PRODUCT DEVELOPMENT ACROSS VERTICALS

Accelerating product development is the No. 1 priority for firms using 3D printing, according to a recent industry survey. Moreover, 57% of all 3D printing work done is in the first phases for new product development (i.e. proof of concept and prototyping).

3D printing is already a staple in any design studio. Before ordering thousands of physical parts, designers can use 3D printing to see what a future product looks like.

Similarly, robotics is automating the physical process of trial-and-error across a wide array of verticals.

In R&D for synthetic biology, for example, robotics is making a big impact for companies like Zymergen and Ginkgo Bioworks, which manufacture custom chemicals from yeast microbes. Finding the perfect microbe requires testing up to 4,000 different variants concurrently, which translates to a lot of wet lab work.

Using automatic pipette systems and robotics arms, liquid handling robots permit high-throughput experimentation to arrive at a winning combination faster and with less human error.

Below are the robots used at gene-testing company Counsyl for transferring samples (left) and Zymergen’s pipetting robot for automating microbe culture testing (right).

“Materials engineering is the ability to detect a very small particle — something like a 10-nanometer particle on a 300-millimeter wafer. That is really equivalent to finding an ant in the city of Seattle.” — Om Nalamasu, CTO at Applied Materials

Looking beyond biotech, material science has played a pivotal role in computing and electronics.

Notably, chip manufacturers like Intel and Samsung are among the largest R&D spenders in the world. As semiconductors get ever-smaller, working at nanoscale requires precision beyond human ability, making robotics the preferred option.


Tomorrow’s scientific tools will be increasingly automated and precise in order to handle work at the micro scale.

AI IS HASTENING MATERIALS SCIENCE DISCOVERIES

Thomas Edison is well-known for highlighting materials science as a process of elimination: “I have not failed 10,000 times. I have not failed once. I have succeeded in proving that those 10,000 ways will not work.”

The spirit of Edison persists in today’s R&D labs, although R&D is still less digitized and software-enabled than one might expect (the National Academy of Sciences says developing new materials is often the longest stage of developing new products). Better digitization of the scientific method will be crucial to developing new products and materials and then manufacturing them at scale.

Currently, the hottest area for deals to AI startups is healthcare, as companies employ AI for drug discovery pipelines. Pharma companies are pouring cash into drug R&D startups such as Recursion Pharmaceuticals and twoXAR, and it’s only a matter of time until this takes off elsewhere.

One company working in chemistry and materials science is Citrine Informatics (below, left). Citrine runs AI on its massive materials database, and claims it helps organizations hit R&D and manufacturing milestones in half the time. Similarly, DeepChem (right) develops a Python library for applying deep learning to chemistry.
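
As an illustration of the kind of workflow such a library enables, here is a minimal sketch assuming DeepChem’s 2.x MoleculeNet API; the dataset (the ESOL aqueous-solubility benchmark) and training settings are arbitrary choices for demonstration, not a recipe from the article.

```python
# Minimal sketch: deep learning on chemistry with DeepChem (assumes the
# DeepChem 2.x MoleculeNet API; dataset and settings are illustrative).
import deepchem as dc

# Load the ESOL solubility benchmark, featurized as molecular graphs.
tasks, datasets, transformers = dc.molnet.load_delaney(featurizer="GraphConv")
train, valid, test = datasets

# A graph-convolution regressor: the network learns directly from
# molecular structure rather than hand-crafted descriptors.
model = dc.models.GraphConvModel(n_tasks=len(tasks), mode="regression")
model.fit(train, nb_epoch=50)

# Evaluate predicted solubility against held-out molecules.
metric = dc.metrics.Metric(dc.metrics.pearson_r2_score)
print(model.evaluate(test, [metric], transformers))
```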

In short, manufacturers across sectors — industrial biotech, drugs, cars, electronics, or other material goods — are relying on robotic automation and 3D printing to remain competitive and tighten the feedback loop in bringing a product to launch.

Already, startups developing or commercializing complex materials are taking off in the 3D printing world. Companies like MarkForged employ carbon fiber composites, while others like BMF are developing composites with rare nanostructures and exotic physical properties.

Certainly, manufacturers of the future will be relying on intelligent software to make their R&D discoveries.

AUGMENTED AND VIRTUAL REALITY ‘ABSTRACT AWAY’ THE MODELING PROCESS

Currently, manufacturers of all types rely on prototyping with computer aided design (CAD) software. In future manufacturing processes, augmented and virtual reality could play a greater role in R&D, and could effectively “abstract away” the desktop PC for industrial designers, possibly eliminating the need for 3D printed physical models.

Autodesk, the software developer of AutoCAD, is a bellwether for the future of prototyping and collaboration technology. The company has been no stranger to investing in cutting-edge technology such as 3D printing, including a partnership with health AI startup Atomwise on a “confidential project.” Recently, Autodesk’s exploration into making an AR/VR game engine foreshadows the larger role it envisions for immersive computing in the design process.

Autodesk’s game engine, called Stingray, has added support for the HTC Vive and Oculus Rift headsets. Additionally, game and VR engine maker Unity has announced a partnership with Autodesk to increase interoperability.

Similarly, Apple has imagined AR/VR facilitating the design process in combination with 3D printing. Using the CB Insights database, we surfaced an Apple patent that envisions AR “overlaying computer-generated virtual information” onto real-world views of existing objects, effectively allowing industrial designers to make 3D-printed “edits” onto existing or unfinished objects.

The patent envisions using AR through “semi-transparent glasses,” but also mentions a “mobile device equipped with a camera,” hinting at potential 3D printing opportunities for using ARKit on an iPhone.

A researcher at Cornell has recently demonstrated the ability to sketch with AR/VR while 3D printing. Eventually, the human-computer interface could be so seamless that 3D models can be sculpted in real time.

Tomorrow’s R&D team will be exploring AR and VR, and testing how it works in combination with 3D printing, as well as the traditional prototyping stack.

2. Resource planning & sourcing

Once a product design is finalized, the next step is planning how it will be made at production scale. Typically, this requires gathering a web of parts suppliers, basic materials makers, and contract manufacturers to fulfill a large-scale build of the product. But finding suppliers and gaining trust is a difficult and time-consuming process.

The vacuum maker Dyson, for example, took up to two years to find suppliers for its new push into the auto industry: “Whether you’re a Dyson or a Toyota it takes 18 months to tool for headlights,” a worker on their project reported.

In 2018, assembly lines are so lean they’re integrating a nearly real-time inflow of parts and assembling them as fast as they arrive. Honda’s UK-based assembly factory, for example, only keeps one hour’s worth of parts ready to go. After Brexit, the company reported longer holdups for incoming parts at the border, and said that each 15-minute delay translates to £850,000 per year.

We looked at how technology is improving this complicated sourcing process.

DECENTRALIZED PARTS MANUFACTURING

Decentralized manufacturing may be one impending change that helps manufacturers handle demand for parts orders.

Distributed or decentralized manufacturing employs a network of geographically dispersed facilities that are coordinated with IT. Parts orders, especially for making medium- or small-run items like 3D printed parts, can be fulfilled at scale using distributed manufacturing platforms.

Companies like Xometry and Maketime offer on-demand additive manufacturing and CNC milling (a subtractive method that carves an object out of a block), fulfilling parts orders across their networks of workshops.

Xometry’s site allows users to simply upload a 3D file and get quotes on milling, 3D printing, or even injection molding for parts. Right now, the company allows up to 10,000 injection-molded parts to be ordered on-demand, so it can handle builds done by larger manufacturers.

Xometry isn’t alone in offering printing services: UPS is also embracing the movement, offering services for 3D printed plastic parts like nozzles and brackets in 60 locations and using its logistics network to deliver orders globally.

As mass-customization takes off, so could the reliance on decentralized networks of parts suppliers.

BLOCKCHAIN FOR RESOURCE TRACKING

Enterprise resource planning (ERP) software tracks resource allocation from raw material procurement all the way through customer relationship management (CRM).

Yet a manufacturing business can have so many disparate ERP systems and siloed data that, ironically, the ERP “stack” (which is intended to simplify things) can itself become a tangled mess of cobbled-together software.

In fact, a recent PwC report found that many large industrial manufacturers have as many as 100 different ERP systems.

Blockchain and distributed ledger technologies (DLT) projects aim to unite data from a company’s various processes and stakeholders into a universal data structure. Many corporate giants are piloting blockchain projects, often specifically aiming to reduce the complexity and disparities of their siloed databases.

Last year, for example, British Airways tested blockchain technology to maintain a unified database of information on flights and stop conflicting flight information from appearing at gates, on airport monitors, at airline websites, and in customer apps.

When it comes to keeping track of the sourcing of parts and raw materials, blockchain can manage the disparate inflows to a factory. With blockchain, as products change hands across a supply chain from manufacture to sale, the transactions can be documented on a permanent decentralized record — reducing time delays, added costs, and human errors.
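
The core mechanism is simple enough to sketch. In the toy Python hash chain below, each custody transfer embeds the hash of the previous record, so no earlier hand-off can be quietly rewritten. This is a single-process illustration, not any particular vendor’s DLT; real systems add consensus across parties and digital signatures.

```python
# Toy sketch of supply-chain custody tracking on a hash chain.
import hashlib, json, time

def record_transfer(chain, part_id, from_party, to_party):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {
        "part_id": part_id,
        "from": from_party,
        "to": to_party,
        "ts": time.time(),
        "prev_hash": prev_hash,   # links each record to the one before it
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)

def verify(chain):
    # Any edit to an earlier transfer breaks every later hash.
    for i, entry in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in entry.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["prev_hash"] != expected_prev or entry["hash"] != recomputed:
            return False
    return True

chain = []
record_transfer(chain, "bracket-42", "raw-materials-co", "parts-maker")
record_transfer(chain, "bracket-42", "parts-maker", "assembly-plant")
print(verify(chain))  # True; alter any field above and it prints False
```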

Viant, a project out of the Ethereum-based startup studio ConsenSys, works on a number of capital-intensive areas that serve manufacturers. And Provenance is building a traceability system for materials and products, enabling businesses to engage consumers at the point of sale with information gathered collaboratively from suppliers all along the supply chain.

Going forward, we can expect more blockchain projects to build supply chain management (SCM) software, handle machine-to-machine (M2M) communication and payments, and promote cybersecurity by keeping a company’s data footprint smaller.

3. Operations technology: Monitoring & machine data

Presumably, tomorrow’s manufacturing process will eventually look like one huge, self-sustaining cyber-physical organism that only intermittently requires human intervention. But across sectors, the manufacturing process has a long way to go before we get there.

According to lean manufacturing metrics (measured by overall equipment effectiveness, or OEE), world-class manufacturing sites are working at 85% of their theoretical capacity. Yet the average factory is only at about 60%, meaning there’s vast room for improvement in terms of how activities are streamlined.
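
OEE itself is just the product of three ratios (availability × performance × quality), which is why small losses at each step compound into the gap between 60% and 85%. A minimal sketch, with invented shift figures:

```python
# OEE (overall equipment effectiveness) = availability x performance x quality.
# The shift figures below are invented for illustration.
def oee(planned_min, downtime_min, ideal_rate, units_made, units_good):
    run_time = planned_min - downtime_min
    availability = run_time / planned_min        # time the machine actually ran
    performance = units_made / (ideal_rate * run_time)  # speed vs. ideal rate
    quality = units_good / units_made            # share of good units
    return availability * performance * quality

# An 8-hour shift: 45 min of unplanned stops, an ideal rate of 60 units/min,
# 23,000 units produced, 22,540 of them good.
print(f"{oee(480, 45, 60, 23_000, 22_540):.0%}")
# Prints ~78%: each factor looks healthy on its own, yet the product already
# sits between the ~60% average factory and the 85% world-class benchmark.
```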

Industry 4.0’s maturation over the next two decades will first require basic digitization.

Initially, we’ll see a wave of machines become more digital-friendly. Later, that digitization could translate into predictive maintenance and true predictive intelligence.

Makers of large capital goods have evolved toward a “power by the hour” business model that guarantees uptime. Power by the hour (or performance-based contracting) is now fairly common in the manufacturing world, especially in mission-critical areas like semiconductors, aerospace, and defense.

The idea dates back to the 1960s, when jet engine manufacturers like GE Aviation, Rolls Royce, and Pratt & Whitney began selling “thrust hours,” as opposed to one-off engine sales. This allows engine makers to escape the commodity trap and to focus on high-margin maintenance and digital platforms. Nowadays, GE is incentivized to track every detail of its engine, because it only gets paid if the engine is working properly.

Despite a guarantee of uptime, a machine’s owner is responsible for optimizing usage (just like airlines that buy jet engines still need to put them to good use). In short, factory owners still “own” the output risk between the chain of machines.

Without digitizing every step, efficiency is being left on the table. Yet there are serious barriers for manufacturers to take on the new burden of analytics.

Shop floors typically contain old machines that still have decades of production left in them. In addition to the significant cost, sensors tracking temperature and vibration aren’t made with a typical machine in mind, which lengthens the calibration period and reduces their efficacy.

When Harley-Davidson’s manufacturing plant went through an IIoT sensor retrofit, Mike Fisher, a general manager at the company, said sensors “make the equipment more complicated, and they are themselves complicated. But with the complexity comes opportunity.”

FROM INITIAL DIGITIZATION TO PREDICTIVE

To put it simply, operational technology (or OT) is similar to traditional IT, but tailored for the “uncarpeted areas.” Where the typical IT stack includes desktops, laptops, and connectivity for knowledge work and proprietary data, OT manages the direct control or monitoring of physical devices.

For manufacturers, the OT stack typically includes:

  • Connected manufacturing equipment (often with retrofitted industrial IoT sensors)
  • Supervisory control and data acquisition (SCADA) systems and human machine interfaces (HMI), which provide industrial monitoring for operations analysts
  • Programmable logic controllers (PLCs), the ruggedized computers that grab data on factory machines
  • 3D printers (additive manufacturing) and computer numerical control (CNC) machines for subtractive manufacturing (like whittling away a block)

In a way, IT and OT are two sides to the same tech stack token, and as manufacturing gets better digitized, the boundaries will continue to blur.

Today, the “brain” of most industrial machines is the programmable logic controller (PLC), a ruggedized computer. Industrial giants like Siemens, ABB, Schneider, and Rockwell Automation all offer high-priced PLCs, but these can be unnecessarily expensive for smaller manufacturing firms.

This has created an opportunity for startups like Oden Technologies to bring off-the-shelf computing hardware that can plug into most machines directly, or integrate with existing PLCs. This, in turn, allows small- and medium-sized businesses to be leaner and analyze their efficiency in real time.
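
To give a flavor of what plugging into a machine looks like in practice, most PLCs expose their registers over an open protocol such as Modbus TCP. The sketch below uses the open-source pymodbus library and assumes its 3.x client API; the PLC address, register layout, and scaling are hypothetical, since every machine’s register map differs.

```python
# Sketch: polling a PLC's holding registers over Modbus TCP with pymodbus
# (assumes the pymodbus 3.x client API). The address, register layout, and
# scaling factor are hypothetical -- every machine's register map differs.
import time
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("192.168.0.50", port=502)  # hypothetical PLC address
client.connect()

try:
    while True:
        # Read two registers: spindle temperature (x0.1 degC) and part count.
        rr = client.read_holding_registers(address=0, count=2, slave=1)
        if not rr.isError():
            temp_c = rr.registers[0] / 10.0
            parts = rr.registers[1]
            print(f"temp={temp_c:.1f}C parts={parts}")
        time.sleep(5)  # poll every 5 seconds
finally:
    client.close()
```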

As digitization becomes ubiquitous, the next wave in tech efficiency improvements will be about predictive analytics. Today’s narrative around the Internet of Things has suggested that everything — every conveyor and robotic actuator — will have a sensor, but not all factory functions are of equal value.

Slapping cheap IoT sensors on everything isn’t a cure-all, and it’s entirely possible that more value gets created from a smaller number of more specialized, highly accurate IoT sensors. Augury, for example, uses AI-equipped sensors to listen to machines and predict failure.
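
The idea behind such listening-based prediction can be sketched generically: reduce a vibration signal to a feature like RMS energy and flag drift from a healthy baseline. The rolling z-score below is an illustrative stand-in, not Augury’s actual method.

```python
# Generic sketch of vibration-based failure prediction: track the RMS energy
# of each sensor window and flag readings that drift far from the healthy
# baseline. An illustrative rolling z-score, not any vendor's actual model.
import math
from collections import deque

def rms(window):
    return math.sqrt(sum(x * x for x in window) / len(window))

baseline = deque(maxlen=200)  # recent "healthy" RMS values

def check(window, threshold=4.0):
    value = rms(window)
    if len(baseline) >= 30:  # wait for a warm-up period before alerting
        mean = sum(baseline) / len(baseline)
        var = sum((v - mean) ** 2 for v in baseline) / len(baseline)
        std = math.sqrt(var) or 1e-9
        if abs(value - mean) / std > threshold:
            # Anomalous windows are not added to the baseline, so a failing
            # bearing can't slowly redefine "healthy."
            return f"ALERT: rms={value:.3f} deviates from baseline {mean:.3f}"
    baseline.append(value)
    return None

# Usage: feed fixed-size windows of accelerometer samples as they arrive,
# e.g. alert = check(samples[-1024:]), and route any alert to maintenance.
```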

Cost-conscious factory owners will recognize that highly accurate sensors will deliver greater ROI than needless IoT.

NEW ARCHITECTURE AT THE EDGE

Computing done at the “edge,” or closer to the sensor, is a new trend within IIoT architecture.

Drawing on innovations in AI and smarter hardware, Peter Levine of a16z anticipates an end to cloud computing for AVs, drones, and advanced IoT objects.

Connected machines in future factories should be no different.

Companies like Saguna Networks specialize in edge computing (close to the point of collection), whereas a company like Foghorn Systems does fog computing (think a lower-hanging cloud that’s done on-site like a LAN). Both methods allow mission-critical devices to operate safely without the latency of transmitting all data to a cloud, a process that can save big on bandwidth.

In the near future, advances in AI and hardware will allow IoT as we know it to be nearly independent of centralized clouds.

This is important because in the short term, it means that rural factories don’t need to send 10,000 machine messages relaying “I’m OK,” which expends costly bandwidth and compute. Instead, they can just send anomalies to a centralized server and mostly handle the decision-making locally.
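
In code, “send anomalies only” amounts to a small filter running beside the sensor. A minimal sketch, where the limits and the uplink stub are assumptions for illustration:

```python
# Minimal sketch of report-by-exception at the edge: evaluate every reading
# locally and forward only out-of-band values upstream. The limits and the
# send_upstream() stub are assumptions for illustration.
LIMITS = {"temp_c": (10.0, 80.0), "vibration_g": (0.0, 2.5)}

def send_upstream(msg: dict) -> None:
    # Stand-in for an MQTT/HTTP publish to the central server.
    print("UPLINK:", msg)

def on_reading(sensor: str, value: float) -> None:
    lo, hi = LIMITS[sensor]
    if not lo <= value <= hi:
        send_upstream({"sensor": sensor, "value": value, "status": "anomaly"})
    # In-band readings are handled locally; no "I'm OK" message is sent,
    # which is where the bandwidth and compute savings come from.

on_reading("temp_c", 72.0)   # in range: nothing sent
on_reading("temp_c", 95.5)   # out of range: forwarded upstream
```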

Additionally, cloud computing latency has drastic downsides in manufacturing. Mission-critical systems such as connected factories can’t afford the delay of sending packets to off-site cloud databases. Cutting power to a machine a split second too late is the difference between avoiding and incurring physical damage.

And in the longer term, edge computing lays down the rails for the autonomous factory. The AI software underpinning the edge will be the infrastructure that allows factory machines to make decisions independently.

In sum, devices that leverage greater computing at the edge of the network are poised to usher in a new, decentralized wave of factory devices.

CYBERSECURITY IS A PRIORITY

One paradox of IIoT is that factories bear significant downside risk, yet are barely investing in protection: 28% of the manufacturers in a recent survey said they saw a loss of revenue due to cybersecurity attacks in the past year, but only 30% of executives said they’ll increase IT spend.

Cyber attacks can be devastating to heavy industry, where cyber-physical systems can be compromised. The WannaCry ransomware attack caused shutdowns at the Renault-Nissan auto plants in Europe. And in 2014, a sophisticated cyber attack resulted in physical damage at a German steel plant when an outage prevented a blast furnace from being shut down correctly.

Consequently, critical infrastructure is a growing segment within cybersecurity, and many startups like Bayshore Networks are offering IoT gateways (which bridge the disparate protocols for connected sensors) to allow manufacturers across many verticals to monitor their IIoT networks. Other gateway-based security companies like Xage are even employing blockchain’s tamperproof ledgers so industrial sensors can share data securely.


Similarly, adding connected IoT objects and Industrial Control System (ICS) sensors has opened up new vulnerabilities at the endpoint.

To address this, Mocana and Rubicon Labs, among others, are developing secure communication products at the IP and device level.

Additionally, several of the most active enterprise cybersecurity investors are corporates with interests in OT computing. The venture arms of Dell (which makes industrial IoT gateways), as well as Google, GE, Samsung, and Intel are among the most active in this space.


Managing the ICS and IIoT systems securely will continue to be a critical area for investment, especially as hack after hack proves OT’s vulnerability.

4. Labor augmentation & management

In a recent write-up about furniture maker Steelcase’s production line, humans were described as being solely present to guide automation technology.

Steelcase’s “vision tables,” which are computerized workstations that dictate step-by-step instructions, eliminate human error in assembling furniture. Using sound cues and overhead scanners to track assembly, the system won’t let workers proceed if a step is done incorrectly. Scanners also allow off-site operations engineers to analyze progress in real time.

The New Yorker wrote about Steelcase’s labor management: “A decade ago, industrial robots assisted workers in their tasks. Now workers — those who remain — assist the robots in theirs.”

What manufacturing looks like has changed drastically in a short time. As a retired Siemens executive recently said, “People on the plant floor need to be much more skilled than they were in the past. There are no jobs for high school graduates at Siemens today.”

But better digitization and cyber-physical technologies are all augmenting the efficiency and manpower available to the workers. Here’s how emerging technology like augmented reality (AR), wearables, and exosuits are fitting in.

AR AND MOBILE ARE DIGITIZING THE INSTRUCTION MANUAL

Augmented reality will be able to boost the skills of industrial workers.

In addition to being a hands-free “browser” that can communicate factory performance indicators and assign work, AR can analyze complicated machine environments and use computer vision to map out a machine’s parts, like a real-time visual manual. This makes highly skilled labor like field service a “downloadable” skill (in a manner not unlike The Matrix).

Daqri and Atheer are well-funded headset makers that focus on industrial settings. Upskill’s Skylight platform makes AR for the industrial workforce using Google Glass, Vuzix, ODG, and Realwear headsets. The company raised nearly $50M from the corporate venture arms of Boeing and GE, among other investors.

Many AR makers envision the tech working like a hands-free “internet browser” that lets workers see real-time stats and other relevant information. Realwear’s wearable display doesn’t aspire to true augmented reality like a Daqri headset, but even a small display in the corner of the eye can be quite useful.

Others like Scope AR do similar work in field service using mobile and iPad cameras, employing AR to highlight parts on industrial equipment and connecting to support experts in real time. This saves on the travel costs of flying people out to repair broken equipment.

Parsable, which works with mobile phones, is a workflow platform that gives out tasks and digitizes data collection, something that is often done with pencil and paper in industrial environments.

As the maxim goes, “what gets measured gets managed,” and in an area where robots are a constant competitive pressure, manufacturing organizations will invest in technologies that digitize human efforts down to each movement.

EXOSUITS & SAFETY TECH WILL BECOME STANDARD IN DIRTY & DANGEROUS JOBS

Exoskeleton technology is finally becoming a reality on factory floors, which could drastically reduce the physical toll of repetitive work. Startups here are making wearable high-tech gear that bears the load alongside a worker’s limbs and back.

Ekso Bionics is piloting its EksoVest suit at Ford Motor Company’s Michigan assembly plants, and workers using the suit have reported less neck stress in their daily demands. The EksoVest reduces wear from repetitive motion and, unlike some competing products, provides lift assistance without batteries or robotics. Ekso’s CTO has said the long-term strategy is to get workers accustomed to the technology before eventually moving into powered exoskeletons.

Sarcos is another well-known exosuit maker, which has raised funding from corporates including Schlumberger, Caterpillar, and the venture arms of Microsoft and GE. Sarcos is more strictly focused on remote-controlled robotics and powered exoskeletons, which can lift 200 lbs repeatedly. Delta Airlines recently said it would join Sarcos’ Technical Advisory Group to pilot the technology.

In similar territory is Strong Arm Technologies, which makes posture-measuring and lift-assisting wearables. Strong Arm touts the predictive power to intervene before an injury or incident occurs, and positions itself as a labor-focused risk management platform.

Where humans are still needed for some dirty and dangerous tasks, wearables and exoskeletons will augment humans’ ability to do work while also promoting safety.

5. Machining, production & assembly

Automation is coming for dirty, dull, and dangerous jobs first.

Already, many human jobs within the mass-production assembly line have been crowded out by automation. Cyber-physical systems like industrial robotics and 3D printing are increasingly common in the modern factory. Robots have gotten cheaper, more accurate, safer, and more prevalent alongside humans.

Consumer tastes have also broadened, and manufacturers are trying to keep up with increasing demands for customization and variety.

Visions for Industry 4.0 involve a completely intelligent factory where networked machines and products communicate through IoT technology, and not only prototype and assemble a specific series of products, but also iterate on those products based on consumer feedback and predictive information.

MODULAR PRODUCTION ENABLES CUSTOMIZATION

Before we reach a world where humans are largely uninvolved with manufacturing, modular design can help existing factories become more flexible.

Modularity allows the factory to be more streamlined for customization, as opposed to the uniformity that’s traditional for the assembly line. Modularity could come in the form of smaller parts, or modules, that go into a more customizable product. Or it could be equipment, such as swappable end-effectors on robots and machines, allowing for a greater variety of machining.

Mass production is already refashioning itself to handle consumer demand for greater customization and variety. In a BCG survey, 90% of auto makers said they expect a modular line setup to be relevant in final assembly by 2030. Modular equipment will allow more models to come off the same lines.

Startups are capitalizing on the push toward modular parts.

Seed-stage company Vention makes custom industrial equipment on-demand. Choosing from Vention’s modular parts, all a firm needs to do is upload a CAD design of the equipment they want, and then wait 3 days to be sent specialized tooling or robot equipment. Many existing factories have odd jobs that can be done by a simple cobot (collaborative robot) arm or custom machine, and these solutions will gain momentum as factories everywhere search for ways to improve efficiency.

Modular production will impact any sector offering increased product customization. Personalized medicine, for example, is driving demand for smaller and more targeted batches. In pharmaceutical manufacturing, modularity allows processors to produce a variety of products, with faster changeovers.

ROBOTICS AUTOMATE THE ONCE-ODD JOBS

Industrial robotics are responsible for eroding manufacturing jobs, which have been on the decline for decades. As a report by Bank of America Merrill Lynch explains: “long robots, short humans.”

But the latest wave of robotics seems to be augmenting what a human worker can accomplish.

Cobots (collaborative robots) are programmable through assisted movement: they “learn” by first being moved through a task manually and then copying that movement thereafter. These robots are considered collaborative because they can work alongside humans.
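
As a rough illustration of how this programming-by-demonstration works, here is a toy record-and-replay loop in Python. Real cobot SDKs expose their own motion APIs; the `sample_joint_angles` and `move_to` hooks below are hypothetical placeholders for reading encoders and commanding a joint controller.

```python
# Toy sketch of "programming by demonstration": during hand-guiding, the
# controller samples joint angles; in playback, it steps through the recorded
# waypoints. Everything here is simplified for illustration.
import time


def record_demonstration(sample_joint_angles, duration_s=2.0, rate_hz=5):
    """Sample joint angles while a worker physically guides the arm."""
    waypoints = []
    steps = int(duration_s * rate_hz)
    for _ in range(steps):
        waypoints.append(sample_joint_angles())  # read the joint encoders
        time.sleep(1.0 / rate_hz)
    return waypoints


def replay(waypoints, move_to):
    """Step the arm through the recorded waypoints."""
    for joints in waypoints:
        move_to(joints)  # command the joint controller


if __name__ == "__main__":
    # Fake hardware: pretend the worker sweeps joint 0 from 0 to ~1 radian.
    state = {"angle": 0.0}

    def fake_sample():
        state["angle"] += 0.1
        return [round(state["angle"], 2), 0.0, 0.0, 0.0, 0.0, 0.0]

    path = record_demonstration(fake_sample, duration_s=1.0, rate_hz=10)
    replay(path, move_to=lambda joints: print("move to", joints))
```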

Whether these robots are truly collaborative or merely rendering human labor redundant remains to be seen. After a Nissan plant in Tennessee added autonomous guided vehicles, no material handlers were laid off despite the increased productivity. European aircraft manufacturer Airbus also uses a mobile robot that works alongside humans to drill thousands of holes into passenger jets.

While even the best robots still have limitations, economists fear that automation will eventually lead to a drastic restructuring of labor.

Due to rising labor costs worldwide, robotics are presently causing a new wave of re-shoring — the return of manufacturing to the United States.

In a 2015 survey by BCG, 24% of US-based manufacturers surveyed said that they were actively shifting production back to the US from China, or were planning to do so over the next two years — up from only 10% in 2012. The majority said lower automation costs have made the US more competitive.

Robotics have become invaluable for monotonous jobs such as packaging, sorting, and repetitive lifting. Cobot manufacturer Universal Robots says some of its robot arms pay for themselves in 195 days on average. As a whole, collaborative robots are priced at an average of $24,000 apiece.
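
Taking those two figures together, a $24,000 cobot that pays for itself in 195 days implies roughly $123 of labor savings per day. A quick back-of-the-envelope check:

```python
# Sanity check on the cobot figures above: a $24,000 robot that pays for
# itself in 195 days implies roughly $123 of savings per day.
price_usd = 24_000
payback_days = 195

implied_daily_savings = price_usd / payback_days
print(f"Implied savings: ${implied_daily_savings:,.0f}/day")  # ~$123/day


# Conversely, for a given daily savings figure, the payback period is:
def payback(price_usd, daily_savings_usd):
    return price_usd / daily_savings_usd


print(f"{payback(24_000, 123):.0f} days")  # ~195 days
```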

We’ve previously identified more than 80 robotics startups, but for heavy-duty machining, significant market share is held by big industrial players like ABB, Mitsubishi, Fanuc, and Yaskawa.

In the near term, the reprogrammable nature of cobots will allow manufacturing firms to become more customized and work in parallel with existing equipment and employees. On a longer time horizon, however, robotics will be the engine for moving towards “lights-out” manufacturing.

3D PRINTING

For certain mass-produced items, 3D printing will never beat the economies of scale seen in injection molding. But for smaller runs, fulfillment using additive manufacturing will make sense.

Using metal additive manufacturing for one-third of components, GE made an engine that burns 15% less fuel than previous designs. GE says it will begin testing the Cessna Denali engine for potential flight tests in 2018.

Manufacturers will increasingly turn to 3D printing as mass-customization takes off within certain consumer products.

Shoes have become one popular use case to watch. For example, Adidas has partnered with Carbon to mass-print custom athletic shoes. Additionally, other 3D printing services companies like Voxel8 and Wiiv have positioned themselves specifically for the shoe use case.

Just a few years from now, it may be more commonplace to see mass-customized parts in consumer electronics, apparel, and other accessories — all brought to you by 3D printing. Additionally, if rocket-printing startup Relativity Space is any indication, the technology will also be applied to large-scale industrial print jobs.

Industrial 3D printing is the hottest segment within the broader space, and many startups are aiming to deliver advanced materials such as carbon fiber or metals with exotic properties.

6. Quality assurance

As the factory gets digitized, quality assurance will become increasingly embedded in the organization’s codebase. Machine learning-powered data platforms like Fero, Sight Machine, and Uptake, among a host of others, will be able to codify lean manufacturing principles into systems’ inner workings.

Computer vision and blockchain technologies are already on the scene, and offer some compelling alternative methods for tracking quality.

COMPUTER VISION

In mass production, checking whether every product is to specification is a very dull job that is limited by human fallibility. In contrast, future factories will employ machine vision to scan for imperfections that the human eye might miss.

Venture-backed startups like Instrumental are training AI to spot manufacturing issues. And famed AI researcher Andrew Ng has a new manufacturing-focused startup called Landing.ai that is already working with Foxconn, an electronics contract manufacturer.

Many imperfections in electronics aren’t even visible to the human eye. Being able to instantaneously identify and categorize flaws will automate quality control, making factories more adaptive.

BLOCKCHAIN WILL HELP WITH RECALLS

In August 2017, Walmart, Kroger, Nestle, and Unilever, among others, partnered with IBM to use blockchain to improve food safety through enhanced supply chain tracking. Walmart has been working with IBM since 2016, and said that blockchain technology helped reduce the time required to track mango shipments from 7 days to 2.2 seconds.

With 9 other big food suppliers joining the IBM project, the food industry — where collaboration is rare — could also be better aligned for safety recalls.

Similarly, factories employing blockchains or distributed ledgers could be better positioned in the event of a recall. In factories where food or automobiles are processed, a single system for managing recalls could more swiftly pinpoint the origin of faulty parts or contaminated batches, potentially saving lives and money.
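
A simple hash chain is enough to illustrate why a shared, tamper-evident record speeds up recalls. The sketch below is a minimal Python example under that assumption, not any particular blockchain platform: each batch-tracking record commits to the previous one, so edits are detectable and a batch’s full provenance is one lookup away.

```python
# Minimal sketch of a tamper-evident ledger for batch tracking: each record
# commits to the previous one via SHA-256, so rewriting history is detectable.
import hashlib
import json


def _digest(record):
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()


class BatchLedger:
    def __init__(self):
        self.chain = []

    def append(self, batch_id, origin, step):
        record = {
            "batch_id": batch_id,
            "origin": origin,
            "step": step,
            "prev_hash": self.chain[-1]["hash"] if self.chain else None,
        }
        record["hash"] = _digest(record)
        self.chain.append(record)

    def verify(self):
        """Recompute every hash; any edited record breaks the chain."""
        prev = None
        for r in self.chain:
            body = {k: v for k, v in r.items() if k != "hash"}
            if r["prev_hash"] != prev or r["hash"] != _digest(body):
                return False
            prev = r["hash"]
        return True

    def trace(self, batch_id):
        """Recall helper: pull every record touching a batch, origin first."""
        return [r for r in self.chain if r["batch_id"] == batch_id]


if __name__ == "__main__":
    ledger = BatchLedger()
    ledger.append("mango-42", origin="Farm A", step="harvested")
    ledger.append("mango-42", origin="Farm A", step="shipped to DC")
    print(ledger.verify())           # True, until any record is altered
    print(ledger.trace("mango-42"))  # full provenance in one lookup
```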

7. Warehousing

Lights-out warehouses may come even faster than lights-out factories.

With the rise of e-commerce, demand for warehouse space has exploded. Last year, the average warehouse ceiling height was up 21% compared to 2001, and spending for new warehouse construction hit a peak in October 2017, with $2.3B spent on construction in that month alone.

WAREHOUSE ROBOTICS

Amazon’s historic $775M acquisition of Kiva Systems is said to have set off an arms race among robotics makers. Riding the e-commerce wave and the industry-wide pressure to deliver orders on time, we’ve witnessed an explosion of robotics startups focused on making fulfillment more efficient.

Lately, Kiva-like companies including Fetch Robotics and GreyOrange have been focusing on other areas of warehouse automation, such as picking and palletizing.

Some startups such as Ready Robotics and Locus have applied the classic robotic arm to package e-commerce orders, though their collaborative nature makes them suited for a number of industrial tasks. We’ve previously looked at industrial robotics companies that could be targets for large corporates.

Manufacturers and hardware-focused investors will continue to hunt for the next robotics maker that’s 10x better than the status quo. And the economics of cheaper and more agile robots may mean we’ll see more robots alongside humans in the short term.

AI FOR SCANNING

As computer vision melds with enterprise resource planning, fewer people and clipboards will be needed in sorting, scanning, and spotting defects.

Aquifi, for example, uses computer vision inside fixed IIoT and handheld scanners. Machine vision can measure a product’s dimensions, count the number of boxes in a pallet, and inspect the quality of boxes. Today, this work is often done with clipboards, eyeballing, and intermittent scanning.
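
For a sense of how simple the core of such a vision check can be, here is a rough OpenCV sketch that counts boxes in a top-down pallet image and estimates their dimensions. It assumes a clean image with a dark background and a known pixels-per-centimeter calibration; the image path and constants are illustrative, and real systems must also handle lighting, occlusion, and 3D depth.

```python
# Rough sketch of a vision-based pallet check: threshold the image, find the
# outer contours, and convert bounding boxes to centimeters via calibration.
import cv2

PIXELS_PER_CM = 8.0    # assumed camera calibration
MIN_BOX_AREA_PX = 500  # ignore specks and noise


def count_and_measure(image_path):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Separate the (lighter) boxes from the darker background.
    _, mask = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if cv2.contourArea(c) < MIN_BOX_AREA_PX:
            continue
        x, y, w, h = cv2.boundingRect(c)
        boxes.append((w / PIXELS_PER_CM, h / PIXELS_PER_CM))
    return boxes


if __name__ == "__main__":
    dims = count_and_measure("pallet_top_view.png")  # illustrative path
    print(f"{len(dims)} boxes detected")
    for w_cm, h_cm in dims:
        print(f"  {w_cm:.1f} cm x {h_cm:.1f} cm")
```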

Vision will be increasingly crucial for IIoT to “abstract away” a real-time picture of what’s happening inside a warehouse. Closing the loop, so to speak, between the physical world and bits and bytes is essential to creating the autonomous warehouse. 

8. Transport & supply chain management 

Once the product is packaged and palletized, getting it out the door efficiently is a daunting task. With thousands of SKUs and orders to manage, the complexity can be astounding — and enterprise resource planning (ERP) software has proliferated to handle it.

But there’s still room for IoT and blockchain to get even more granular with real-time supply chains.

TRUCKING & FLEET TELEMATICS IOT

In general, there is poor awareness about where items are in real time throughout the supply chain.

The fleet telematics field saw several large exits in recent years, with Verizon acquiring both FleetMatics and Telogis. IoT and software for shipments will only grow more important as supply chains decentralize and get automated.

Farther out, the advent of autonomous trucks could mean that autonomous systems will deliver, depalletize, and charge upon receipt of a Bill of Lading. This will bring greener, more efficient movement, as well as simpler accounting.

Uber and Tesla both have high-profile plans for autonomous semi-trucks, and Starsky Robotics recently raised nearly $20M from Y Combinator, Sam Altman, and Data Collective, among others, specifically for long-haul trucking.

BLOCKCHAIN

As mentioned above, a number of DLT pilots and blockchain startups are trying to put supply chain management software into a distributed ledger.

The willingness to explore these technologies indicates digitization here is long overdue. The highly fragmented nature of supply chains is a fitting use case for decentralized technologies and could be part of a larger trend for eliminating the inefficiencies of global commerce.

Shipping giant Maersk, for example, is working on a startup with Hyperledger that will aim to help shippers, ports, customs offices, and banks in global supply chains track freight. Maersk’s goal is to replace related paperwork with tamper-resistant digital records.

Meanwhile Pemex, the Mexican state-owned petroleum company, is assisting Petroteq in developing oil-specific supply chain management software. The Petroteq project — an enterprise-grade, blockchain-based platform called PetroBLOQ — will enable oil and gas companies to conduct global transactions.

In the future, manufacturers will explore decentralized technologies to make their organizations more autonomous and their goods, inbound or outbound, more digitized in real time. Blockchain promises not only to simplify SCM, but also to make payments more frictionless.

Conclusion

Manufacturing is becoming increasingly efficient, customized, modular, and automated. But factories remain in flux. Manufacturers are known to be slow adopters of technology, and many may resist making new investments. But as digitization becomes the new standard in industry, competitive pressure will escalate the incentive to evolve.

The most powerful levers manufacturers can pull will come in the form of robotics, AI, and basic IoT digitization. Richer data and smart robotics will maximize a factory’s output, while minimizing cost and defects. At the unmanned factory in Dongguan, employing robotics dropped the defect rate from 25% to less than 5%.

Meanwhile, as cutting-edge categories like blockchain and AR are being piloted in industrial settings, manufacturing could eventually be taken to unprecedented levels of frictionless production and worker augmentation.

In the words of Henry Ford: “If you always do what you always did, you’ll always get what you always got.” To reach its full potential, the manufacturing industry will need to continue to embrace new technology.

Ericsson – 10 Hot Consumer Trends 2018

Tomorrow, your devices will know you

Imagine you have just arrived home from work. You wave your hand, and the lamp turns on, flashing the light in greeting. The home speaker begins to play music, but when you give it an exasperated look, it turns off. You make a coffee, but grimace because it’s too bitter. The coffee machine immediately offers to add sugar or milk.

Two things are conspicuously absent from this vision of a not-too-distant future. One is an appliance with switches and knobs, and the other is a smartphone full of remote control apps.

Our research indicates that consumers are moving towards a paradigm shift in how they expect to interact with technology. Ever more things are becoming connected, but the complexity of controlling them all is a different matter.

On the one hand, alternative yet equally good user interface solutions for simple functions have existed for much longer than we’ve had electronic gadgets. A Westerner who experiences an Asian meal for the first time soon finds out that the user interface to that meal is a pair of chopsticks rather than a knife and fork.

On the other hand, mass-market acceptance of digital technology has made the proliferation of user interfaces practically infinite. Every new device with a screen adds new user interface variations, which are then multiplied by the number of apps within each gadget.

Today you have to know all the devices. But tomorrow all the devices will have to know you. If consumers continue to be faced with the prospect of learning and relearning how to use devices in the face of an ever-increasing pace of technological change, they will become increasingly reluctant to buy in to the future.

We might already be close to that breaking point. The current generation of “flat” user interfaces does not use 3D effects or embellishments to make clickable interface elements, such as buttons, stand out. It is difficult for users to know where to click, and as a result they navigate web pages 22 percent slower. [1] For this reason, our trends for 2018 and beyond focus on various aspects of more direct interaction between consumers and technology.

With 5G, connectivity is set to become ubiquitous. This might sound simple, but it involves a huge technology upgrade; devices must be able to relay complex human interaction data to cloud-based processing, and respond intuitively within milliseconds. The Internet of Things (IoT) must provide interoperability between all devices, and allow for mobility. Network availability also needs to be maintained, so that devices do not suddenly go offline and lose their human-like capabilities.

Trend 1. Your body is the user interface

Digital tech is beginning to interact on human terms.

Consumers who already use intelligent voice assistants are leading a behavioral change. In fact, more than half of them believe we will use body language, intonation, touch and gestures to interact with tech just like we do with people; two out of three think this will happen in only three years.

Today, smartphones are almost synonymous with internet use. But when consumers increasingly interact with other types of tech, they may well start to think about a general need for connectivity.

Given that one in three intelligent voice assistant users think that eventually they will not be able to open doors, cook food or even brush their teeth without an internet connection, it is clear that reliable connectivity will become all important.

But it may also be necessary to think about what we will use less because of this change. Potentially, we will have a reduced need for smartphone-based remote control apps.

And although the keyboard and mouse are universally present and accepted by almost everyone today, 81 percent of intelligent voice assistant users actually believe such traditional input devices will be a thing of the past in only 5 years. Will we miss them? If direct interaction turns out to be more convenient, we certainly won’t.

There are many other interfaces that will also be replaced by direct interaction and a reliance on connectivity. For example, the advanced internet users in our survey voted self-driving cars as the next tech gadget that people everywhere will eventually buy. This means not only the end of steering wheels and pedals, but also that cars will have to directly interact with pedestrians. For example, how does someone waiting at a crossing know when they can go if there is no driver in the car to gesture to them?

Trend 2. Augmented hearing

In the near future, we might find that we use wireless earphones all day long – and even sleep with them in too.

Many smartphone makers are now abandoning the headphone jack in favor of digital multi-function ports, in a way forcing consumers to seek out wireless alternatives instead. Some accept this change, while others do not; but all might agree that the headphone jack represents an analogue era that we no longer live in.

This means that when consumers upgrade their phones, they also need to upgrade their earphones. And just as people expect new functions in a phone, it turns out that they expect new functions in earphones too. Today, earphones are already used not only to deliver sounds but also to block them out; noise-cancelling functionality, for example, has been serving this dual purpose for some time.

We use headphones and earphones to select what we want and do not want to hear.

It is therefore not surprising that half of all advanced internet users surveyed think that earphones that let you select which people in a room you want to hear clearly, and which people you want to mute, will be mainstream in only three years. But for that to become a reality, earphones will need to be more aware of our intentions and allow for more direct user control.

Furthermore, such functionality can be applied in many situations. In fact, 81 percent believe earphones that charge wirelessly, so that you never have to take them out at all, will be mainstream in only 5 years.

The most anticipated functionality for such earphones is real-time translation of all languages, desired by 63 percent of respondents. But 52 percent also want to block out the sound of snoring family members in order to sleep.

Trend 3. Eternal newbies

As many as 30 percent of respondents say new technology makes it impossible to keep their skills up to date. This means some of us feel like total beginners even when performing everyday routine tasks.

The pace of technological change is increasing almost every day, and it is easy to feel stress at not being able to keep up. For some, this is probably manifested as a feeling of helplessness. [1] But for many, it may present an opportunity. In fact, almost half of consumers think technology will make learning even advanced professions much quicker. On the other hand, endeavors to learn and relearn will be a never-ending rat race, with 55 percent believing that technological change will accelerate the pace of change in skills needed at work.

Luckily, the internet can also help consumers cope with this new situation. As many as 46 percent say the internet allows them to learn and forget skills at a faster pace than ever before.

Generally, we learn skills only at the moment we need them. Already today, almost half say they often just search the internet for how to do things, because they have either forgotten or because there is a new way to do it anyway.

Trend 4. Social broadcasting

Social media promised user-driven two-way communication, giving voice and power to individual consumers and redressing the balance between senders and receivers. However, social media is now being overrun by one-sided broadcasters.

Influencers with money buy followers and those with the right know-how use artificial intelligence (AI) bots to fill social media with traditional broadcasting messages – turning social media back into a platform of one-way communication.

Consumers are well aware that social networks are increasingly becoming the scene for standardized broadcast messages that are more designed to spread an opinion than to invite dialogue and reciprocity. Fifty-five percent think influential groups use social networks to broadcast their messages, and a similar number think politicians use social media to spread propaganda. Thirty-nine percent think celebrities pay to get more followers, while as many are getting tired of requests from companies to rate them and like them online. In fact, one in three confess that they do not really read other people’s status updates, implying they do not necessarily pay attention to what they see online anyway.

On the other hand, half of the advanced internet users surveyed say AI would be useful to help check whether facts stated on social networks are true or false. The same number of respondents would also like to use AI to verify the truthfulness of what politicians say.

We humans sometimes favor automated communication over spontaneous dialogue. Messaging apps in smartphones and smart watches are already offering lists of predefined answers that we use to reply even to our nearest and dearest. As many as 41 percent of those who currently use intelligent voice assistants would even want to use AI to automate their email replies.

Even though we may find it acceptable to give impersonal, machine-like responses to others, we must realize that we are also receiving them in return.

The question as to whether we humans really want to engage in sustained dialogue still remains open.

What would happen if we leave all dialogue to machines instead? Given that as many as 38 percent of those who currently already use intelligent voice assistants would like to use AI to write social network status updates, this is a question that needs to be answered. Would a world where only AI assistants interact allow for a better exchange of opinion?

Trend 5. Intelligent ads

Ads might become too smart for their own good.

Consumers have a love-hate relationship with advertising. In our study, 40 percent say they do not mind advertising if it means they get free services, whereas just over a third say they actually dislike ads.

This tension will remain, as the online advertising industry will certainly jump at the chance to create more direct interaction with consumers. Simultaneously, consumers themselves also see an opportunity to employ cutting-edge tech to make ads less invasive. For instance, 6 in 10 want to employ AI to block out online ads.

Speaking of AI, 42 percent think companies will use it to make intelligent advertising that knows exactly how to persuade us to buy things. On its own, that would leave people quite exposed to commercial exploitation. But at the same time, 6 out of 10 consumers expect to be able to use AI for price comparisons, thus helping them to select other suppliers.

This could cause issues, with consumers becoming reliant on an electronic assistant for their purchases. For example, 57 percent of current intelligent voice assistant users would like an AI to help them with everyday shopping. But many already use a voice assistant that has been developed by an advertising company or retailer.

However, some believe the urge to provide compelling experiences might eventually cause ads to defeat their own purpose. Today, only a proportion of consumers use premium versions of smartphone apps when free versions exist. However, ads using augmented reality (AR) and virtual reality (VR) will gain app-like functionality and could in essence turn into free versions of the products or services themselves. For this reason, more than half of current AR or VR users think ads will eventually replace the products being advertised. For example, you might experience a beach destination in a VR ad and realize you do not need the actual vacation anymore.

Trend 6. Uncanny communication

Machines that mimic human communication can make us feel surprisingly awkward.

One thing that we have all been practicing since the day we were born is communication with other humans. Obviously, this also makes us experts at knowing when an interaction, even a familiar one, is not quite human after all.

Although people quite easily assign human characteristics to toys, phones and pets, we can quickly become suspicious if the objects become too human-like. For example, some people who have visited the Madame Tussauds wax museum or any similar place might recall how their feelings shifted from wonder to dislike when looking at the figures on display. The fact that researchers are now creating AI-enabled robots that mimic human expression down to the slightest detail [1] may not necessarily ease our aversion to things pretending to be human.

A future where we move towards more direct communication with devices all around us will be full of pitfalls. Will machines communicate just like humans if they grow up communicating with us? Or will humans refuse to interact if machines become too similar to us?

In our research, 50 percent of respondents said that not being able to tell the difference between human and machine would spook them out. In other words, the feeling of uncertainty alone would be enough to create a negative reaction. This has implications for automation of some processes that are already well underway. For example, as many as one in three say they would avoid contacting companies that use intelligent robots in customer service.

Most likely, the smartphone is the first device that will expose consumers to these issues. Today, we already use biometric data, such as fingerprints or even facial recognition, to unlock the screen. But if the smartphone were to use such information interactively, many will feel uneasy; almost half of consumers said they would be spooked out by a smartphone that constantly watches their face. And as many as 40 percent say that it would be spooky if their smartphone sees when they are happy, sad or bored and responds accordingly.

A natural instinct in such situations might be to try to hide your face. And indeed, one in three would like to wear glasses that make it impossible for facial recognition software in their smartphone or social network to recognize them.

If consumers were to develop such mistrust of their personal devices and communication services, they would also soon doubt similar technology used on a societal level. Thus, one in three would also like to wear glasses that make it impossible for surveillance cameras to recognize them.

Trend 7. Leisure society

Creating the freedom to engage in leisure may be more important than the need to preserve work.

One in five students and working people in our study believe robots will take their jobs before they retire. Some people certainly look to such a future with trepidation, whereas others may be looking forward to a day that is free from the boredom and stress of the daily work routine.

In any case, those who think robots will take over their jobs are outnumbered by the 32 percent who do not think they need a job to find meaningful things to do in life. Furthermore, almost 4 in 10 believe their hobbies may also develop into new sources of income. For this reason, it is rather likely that more people will face a situation where work and leisure become more intertwined and income is garnered from many different sources.

At the core of this is of course the strong tie between work and income. If that connection is severed, more people would be willing to forgo work. In our research, 49 percent said they are in fact interested in a universal basic income, and as many as 1 in 3 think it is OK to not have a job as long as their economic situation is not hurt.

But is it realistic to believe that income will be separated from work? The alternative may be to have robots work for you rather than having them take your job. An example could be a taxi driver who would rather manage a few self-driving taxis than drive himself. Forty percent say they would indeed like a robot alter ego that works and earns income for them.

But both of these scenarios would lead to fewer humans actually working. Are we then heading towards a leisure society? In fact, one in three would like having everything handled by intelligent robots, giving them all the free time they could ever want. And almost a quarter of respondents even see a future where intelligent robots take control of everything.

Trend 8. Your photo is a room

Our photos are memories we have captured to revisit time and again, but they may be turning into rooms we can freely walk around in.

Smartphones are the most popular cameras ever. Not because they are necessarily the best quality, but because they are always there when you need them. When that memorable moment suddenly happens, the smartphone is with you.

For this reason, our memories have changed from physical photo albums stowed away in a cabinet, to digital albums on our smartphones. However, new technologies such as light field photography are changing the nature of photos themselves, and we will soon be able to revisit our memories from more angles than a flat picture frame allows.

Three out of four consumers believe taking photos at events such as weddings or birthdays and revisiting them in VR as if you were one of the guests will be commonplace in only five years. As many think we will also do this on holiday and at parties by then.

In order to do this, one in two already want a smartphone camera that lets you capture everything around you in 3D. Those who are currently using AR or VR have a higher level of interest in this area, with 56 percent even wanting contact lenses with built-in AR or VR functionality.

But if photos become rooms, consumers will also need to be able to manipulate objects in these rooms. In this light, it is not a big surprise that as many as 55 percent of those currently using AR or VR would also like gloves or shoes that allow you to interact with virtual objects.

Trend 9. Streets in the air

City streets are getting so crowded that citizens are looking to the skies for relief.

Urbanization keeps accelerating, as cities become increasingly powerful drivers of the global economy. But while cities contain the majority of the earth’s population, and consume an even higher proportion of its natural resources, they occupy only one percent or less of the land area worldwide. [1] Cities are, in other words, extremely space-challenged places.

Yet, from the perception of space, cities seem to be inhabited by people who have not realized there is a third dimension; apart from a few airplanes, the skies above are mostly empty.

But as city populations continue their extensive growth, this might change. Already today, 39 percent think their city is so congested that it needs a road network in the air for drones and flying vehicles.

Obviously, city dwellers recognize that new layers of streets in the air would cause some disturbances in airplane traffic, and would also increase overall street noise. But an even bigger concern, voiced by 38 percent, is the possibility of drones actually falling on their heads.

Hence, there would need to be a way of knowing where drones fly, so that citizens could take similar precautions as when they cross streets on the ground. Therefore 55 percent of current AR or VR users would like an AR smartphone app that visualizes these air corridors.

The fact that 4 out of 10 respondents are interested in using flying taxis might reveal more about current frustration levels among city dwellers than it does about the most economically viable type of transport.

A potentially more likely near-future scenario is that competition to increase the delivery speed of consumer purchases takes to the air. For example, almost half of respondents want drones that deliver takeout food so quickly that the dishes are still hot when they arrive. Given the extreme environment of the world’s largest cities, this could happen quicker than you might imagine. In fact, 77 percent think most online retailers will use drones in order to minimize delivery times in only 5 years.

Trend 10. The charged future

A connected world will require mobile power. Keeping the power flowing will be as critical as maintaining connectivity; if either goes down, instant disruption will ensue.

There are of course many aspects to how we will power our hyper-connected lives. Sustainability of resources might be one reason why consumers now rate electricity as the most popular energy source – 48 percent even think electricity should power airplanes.

Another aspect is convenience, which could explain why consumers have high expectations of batteries. Fifty-six percent of advanced internet users expect smart battery technology to fundamentally change how we power everything from phones to cars.

For many consumers, their smartphone’s battery doesn’t last a day without dying, and 71 percent want long-lasting batteries that they don’t need to worry about charging. The same percentage of respondents also want batteries you can fully charge in minutes, just in case. Consumers have been asking for batteries such as these for years, but now more than 80 percent of respondents believe they will be mainstream in only 5 years. One in two even thinks charging batteries using radio signals in the air around us will be commonplace in only three years.

It might be the renewed focus on electricity in general, and electric cars in particular, that makes people believe innovation in battery technology will pick up speed. As many as 63 percent want electricity to power cars, whereas only 33 percent prefer either oil or gas. Still, one in three believes fuel cars will not easily be replaced, indicating that there could still be some speed bumps ahead on the road to a fully charged future.

Methodology
This report presents insights based on Ericsson’s long-standing consumer trends program, now in its seventh year. The quantitative results referred to in the report are based on an online survey of 5,141 advanced internet users in Johannesburg, London, Mexico City, Moscow, New York, San Francisco, São Paulo, Shanghai, Sydney and Tokyo that was carried out in October 2017.

Respondents were advanced internet users aged 15−69, who have an urban early adopter profile with high average use of new digital technologies such as intelligent voice assistants, virtual reality headsets and augmented reality applications.

Correspondingly, they represent only 30 million citizens out of around 180 million living in the metropolitan areas surveyed, and this, in turn, is just a small fraction of consumers globally. However, we believe their early adopter profile makes them important to understand when exploring future trends.

The voice of the consumer

Ericsson ConsumerLab has more than 20 years’ experience of studying people’s behaviors and values, including the way they act and think about ICT products and services. Ericsson ConsumerLab provides unique insights on market and consumer trends.

Ericsson ConsumerLab gains its knowledge through a global consumer research program based on interviews with 100,000 individuals each year, in more than 40 countries – statistically representing the views of 1.1 billion people.

Both quantitative and qualitative methods are used, and hundreds of hours are spent with consumers from different cultures. To be close to the market and consumers, Ericsson ConsumerLab has analysts in all regions where Ericsson is present, developing a thorough global understanding of the ICT market and business models.

Haystack – 2018 Seed Stage Reset

This post is about the Bay Area startup and venture ecosystem. I believe the local ecosystem just completed an entire “reset” with respect to the earliest stages of company formation. (I need to state upfront that this isn’t about the seed landscape nationwide — I’ll address that in a different post this year.)

When I began investing nearly five years ago in 2013, there were clear delineations between investment rounds — a founding team would raise from friends and family (and eventually, through scouts and Syndicates), progress to seed funds (established ones institutionalizing while new ones sprouted up), and then reach the promised land of the institutional Series A round, typically led by an established VC firm with a history of joining boards, maintaining ownership, and needing to drive large outcomes over many years to make their fund vehicles profitable. There was little cross-stage investing and many folks became cognizant of potential signaling risks when taking money from firms who would make an offer outside their known sweet spot.

Now with 2018 in full swing, those once-clear delineations are at best muddled. Nearly every VC, seed fund, and angel investor in the Bay Area is investing earlier and earlier, to the point where “pre-seed” as an investment category has been normalized in our lexicon. With the exception of a few of the most focused funds (think Benchmark focusing on Series As and Bs, or firms like IVP and Meritech, which focus on post-traction growth rounds), the Bay Area is now in a free-for-all, no-rules-barred competition to identify and partner with entrepreneurial talent before traction, before product-market fit, and before a team can demonstrate the evidentiary proof once required to initiate multi-million dollar wires.

VC heavyweights agree. A few weeks ago, USV’s Fred Wilson wrote in “The Early Stage Slump” that for VCs, “it means seed rounds are going to be the place to be.” Earlier this week, Sequoia Capital formally announced a new $180M seed fund (note: USV’s fund size is $175M), citing its long, storied history of investing very early in a laundry list of the Internet’s seminal companies, often right at what would be considered the seed round and “first money in.”

When the leadership of firms like USV and Sequoia publicly state that seed rounds are important to them, we have to step back and parse what it means. Let’s also not forget Sequoia’s pioneering work at creating deal funnels with their scout program and also by helping Y Combinator get off the ground. A few years ago now, a16z recruited Chris Dixon and Khosla Ventures recruited Keith Rabois, notable hires as both Dixon and Rabois were highly-visible, proven angel and seed investors. Since then, both have gone on to continue to make incredible seed investments with their new VC checkbooks (while also investing in more traditional VC style rounds).

During these past five years as I’ve tried to learn to become an investor, we’ve witnessed unprecedented, orthogonal changes to the ecosystem: AngelList Syndicates are now mostly private and increasingly responsible for early-stage financings; “pre-seed” funds market themselves as “first check in” and “pre-product” backed by institutional capital; the more established seed funds which began in the previous decade have “bulked up” to become nearly the size of USV, or even larger; the established VC funds have launched discovery funds to keep their pulse on the early-stage market; hundreds of new investment firms across the spectrum (seed to growth) have sprouted, seemingly out of nowhere, eager to find the next Uber; and mega-VC funds have arrived, like Softbank’s $100B Vision Fund and Mubadala, a $50B+ sovereign wealth fund headquartered in Abu Dhabi that has opened a new office in San Francisco.

And, there you have it… an entire new seed ecosystem and “seed stage reset” in the Bay Area. As I tweeted out a few months ago, nearly every seed fund and larger VC may end up colliding around a $3M round at $12-15M post-money. Call them mango seeds or call them small Series As. Whatever label you choose, it is clear professional investors are trying to invest much earlier than they have before. Founders in the Bay Area now (theoretically) have more capital sources to pitch, but will also have to be able to decipher “option” seed checks versus concentrated seed checks from large VCs; traditional seed funds and micro funds will face a new type of competition; larger VC firms will need to manage more investments across their GP ranks, which may force them to hire more or stress-test bandwidth, a small price to pay for the luxury of turning over more cards; and larger VC funds will be able to see talented teams earlier, track them, get to know them, and see more evidence before committing more money to a particular company — this is of critical importance right now in the Bay Area given the local cost inflation and how much VC money is plunked right into high-priced office rents and salaries, which largely go into sky-high residential rents.

A lot of good money was followed by bad money in recent years, especially in 2014-15. The Bay Area is trying to learn from that and reset the system, which effectively starts now at seed — or the small Series A, or whatever term you’d like to use. There’s no more signaling risk. There’s no more stage focus. There’s no more stigma around party rounds with VCs or taking money from a successful crypto project that ICO’d. The old rules of seed are now history and will eventually be rewritten. Buckle up and enjoy the ride!

Playbook App – 67 Blockchain Articles & Whitepapers that Shaped Crypto into What it is Today

With all of the noise surrounding bitcoin and its underlying technology, blockchain, it’s often difficult to separate real blockchain articles from those just looking for clicks.

Here’s a list, in no particular order, of articles and whitepapers written by the people actively involved in developing this new technology. The resources below range all the way from the 1980s to today.

Source: cointelegraph.com

Articles

  1. Fat Protocols by Joel Monegro, Union Square Ventures
  2. Bitcoin — The Internet of Money by Naval Ravikant
  3. The Bitcoin Model for Crowdfunding by Naval Ravikant
  4. The Fifth Protocol by Naval Ravikant
  5. A Beginner’s Guide to Blockchain Technology by Coindesk
  6. The Blockchain Problem Space by Dax Ravi, Ironbay
  7. Blockchain and Design by Blake Hudelson
  8. A Quick Reminder Why Bitcoin was Invented in the First Place by Playbook App
  9. Making Money — Bitcoin Explained (with Emoji) by Tess Rinearson, Chain
  10. Bitcoin, Ethereum, Blockchain, Tokens, ICOs: Why should anyone care? by Preethi Kasireddy
  11. Crypto Tokens: A Breakthrough in Open Network Design by Chris Dixon
  12. Minimum Viable Block Chain by Ilya Grigorik
  13. Introduction to Bitcoin Concepts by 21
  14. Why Bitcoin Matters by Marc Andreessen
  15. Bit gold by Nick Szabo
  16. What is bitcoin and the blockchain? by Brian Forde
  17. Blockchain, simplified by Alexey Malanov
  18. Bitcoin in comic by Shi Yan
  19. The Importance of Bitcoin Not Being Money by Erik Voorhees
  20. The Role of Bitcoin as Money by Erik Voorhees
  21. Blythe Masters Tells Banks the Blockchain Changes Everything by Edward Robinson
  22. The Crypto Anarchist Manifesto by Timothy C. May
  23. The God Protocols by Nick Szabo
  24. A Cyberpunk’s Manifesto by Eric Hughes
  25. How the Bitcoin protocol actually works by Michael Nielsen
  26. Why the blockchain matters by Reid Hoffman
  27. Bitcoin faces a crossroads, needs an effective decision-making process by Arvind Narayanan
  28. Encrypted Data For Efficient Markets by Numerai
  29. Thoughts on Tokens by 21
  30. Three Technical Requirements to Connect Blockchains Without a Token by TenX
  31. Why I like the term, “Cryptoassets” by cburniske
  32. The Crypto J-Curve by cburniske
  33. Quantifying Decentralization by Balaji S. Srinivasan
  34. Beyond the boring blockchain bubble by Jon Evans
  35. Blockchains from the ground up: Part 1 by John Matthews
  36. On Bitcoin and Red Balloons by Moshe Babaioff
  37. A Proof of Stake Design Philosophy by Vitalik Buterin

White Papers & PDFs

  1. Blockchain in the Mainstream by Jeremy Epstein, Never Stop Marketing
  2. Denationalisation of Money: The Argument Refined by F. A. Hayek
  3. RPOW — Reusable Proofs of Work by Hal Finney
  4. B-Money by W. Dai
  5. Bitcoin Whitepaper by Satoshi Nakamoto
  6. Some Simple Economics of the Blockchain by Christian Catalini, MIT Sloan
  7. Hashcash — A Denial of Service Counter-Measure by Adam Back
  8. Shelling Out: The Origins of Money by Nick Szabo
  9. Bitcoin and The Age of Bespoke Silicon by Michael Bedford Taylor
  10. Majority is not Enough: Bitcoin Mining is Vulnerable by Ittay Eyal, Cornell
  11. On Bitcoin as a public randomness source by Joseph Bonneau
  12. Research Perspectives and Challenges for Bitcoin and Cryptocurrencies by Joseph Bonneau
  13. Eclipse Attacks on Bitcoin’s Peer-to-Peer Network by Ethan Heilman
  14. Does Governance Have a Role in Pricing? Cross-Country Evidence from Bitcoin Markets by Robert Viglione
  15. Bitcoin in Islamic Banking and Finance by Charles W. Evans
  16. Bitcoin-NG: A Scalable Blockchain Protocol by Ittay Eyal
  17. The Bitcoin Backbone Protocol: Analysis and Applications by Juan A. Garay
  18. The Bitcoin Lightning Network: Scalable Off-Chain Instant Payments by Joseph Poon
  19. Essays on Bitcoin by Alex Kroeger
  20. Blockchain Technology by Michael Crosby, Google
  21. Blockchain: the solution for transparency in product supply chains by Provenance
  22. Blockstack: A Global Naming and Storage System: Secured by Blockchains by Muneeb Ali
  23. Consensus: Immutable Agreement for the Internet of Value by Sigrid Seibold
  24. Distributed Ledger Technology: beyond block chain by UK Government Chief Scientific Advisor
  25. Economics of Bitcoin: is Bitcoin an alternative to fiat currencies and gold? by Peter Surda
  26. Extending Existing Blockchains with Virtualchain by Jude Nelson
  27. The Impact and Potential of Blockchain on the Securities Transaction Lifecycle by Michael Mainelli
  28. A Fistful of Bitcoins: Characterizing Payments Among Men with No Names by Sarah Meiklejohn
  29. An architecture for the Internet of Money by Meher Roy
  30. MRA Bitcoin Primer by Phillip Rapoport

CB Insights – History Of CVC: From Exxon And DuPont To Xerox And Microsoft, How Corporates Began Chasing ‘The Future’

CVC units have proliferated in recent years, but corporate venture’s roots go back to the dawn of America’s 20th Century business giants.

On June 8, 2016, hundreds of businesspeople filed into the Metropolitan Pavilion in New York’s Chelsea neighborhood for a conversation featuring Fred Wilson of Union Square Ventures. They probably didn’t expect a take-down of corporate venture capital at the Future of Fintech conference, since corporate investors weighed heavily on the attendee and panelist list. But that’s what they got.

“Corporate investing is dumb,” he said. “I think corporations should buy companies. Investing in companies makes no sense. Don’t waste your money being a minority investor in something you don’t control. You’re a corporation! You want the asset? Buy it.”

Wilson’s comments cut against the grain. Corporate startup investing has actually grown steadily in recent years, with large companies from IBM to 7-Eleven investing in startups both directly and especially through dedicated venture capital arms. CVC units have become a relatively common feature of large corporate organizations, and companies seemingly far removed from tech and biotech are investing in startups, including Coca-Cola, Wal-Mart, and Campbell Soup Company.

So why do Fortune 500 corporations start corporate venture units and invest in small, risky companies developing untested products and services? Is it realistic for them to imagine they can absorb innovative technologies and business models simply by taking a stake in a startup? Can they compete for deals against specialists like Wilson and other top VCs who may be better compensated?

To answer these questions, it’s important to understand the history of corporate venture capital. Digging into the history, it’s clear that the tensions and contradictions surrounding CVC have been there from the start: the tension between financial and strategic aims, the contradictory evidence over whether startup investing actually works as a form of “outsourced R&D,” and the difficulty in competing for the best deals.

In this report, we dive into the history of corporate venture investing.

As CVCs increase in number and diversity, it will become increasingly important to understand their historical origins, motivations, and constraints.

The number of active CVC units has doubled since 2012

THE ORIGINS: GM AND DUPONT

In some sense, corporate venture capital can be traced back to the earliest days of America’s business giants.

It was 1914 when Pierre S. du Pont, president of chemical and plastics manufacturer DuPont, invested in a still private 6-year-old automobile startup named General Motors. Pierre du Pont had picked a winner. Like shares in a present-day Silicon Valley success, the stock leapt in value seven-fold over the course of World War I as wartime needs led to increased demand for automobiles.

After the war, the companies would become even more intertwined. DuPont’s board of directors invested $25M in GM, betting the cash injection could speed GM’s development, which in turn would also expand the demand for DuPont’s own goods — including artificial leather, plastics, and paints.

Not to mention, they saw GM as a promising investment. The company, which had gone public in 1916, was growing sales 56% annually, already had over 85,000 employees, and had begun building a new Detroit headquarters for its executives.

In other words, DuPont’s bet on GM blended strategic and financial aims, a mixed strategy that would also later come to define more formal corporate venture capital units.

DuPont, along with companies like 3M and Alcoa, would go on to pioneer the first major era of corporate venture investing. DuPont, in fact, developed the largest corporate venture program. This first flowering of CVC stretched from the late fifties and early sixties, roughly, until the stagflation crises of the seventies.

FIRST WAVE: CONGLOMERATE VENTURE CAPITAL, 1960-1977

The prevailing spirit of American big business at mid-century favored large diversified corporations operating in many sectors. The head of General Motors, which then employed hundreds of thousands of employees, could plausibly say that what was good for GM was good for the country.

GM headquarters, Detroit

The push for diversification was, in part, a result of strict anti-trust enforcement following the Great Depression, which prevented companies from exerting too much control in any of their established markets and forced them to look to new opportunities in order to increase profits. For companies looking to expand, corporate venture investing became a natural way to extend a company’s reach into a variety of different sectors and industries.

Early CVC investors had three primary motivations:

  1. Fast growing companies wanted to diversify and find new markets.
  2. American industrial conglomerates, at the height of their success, were flush with cash and wanted to put it to productive use.
  3. Venture capital was experiencing its first successes with the nascent tech industry, providing a model for corporations to follow.

CVC investors during this early period included many titans of American industry: DuPont, 3M, Alcoa, Boeing, Dow, Ford, GE, General Dynamics, Mobil, Monsanto, Ralston Purina, Singer, WR Grace, and Union Carbide.

Not many of these companies are the sort of brands we would traditionally associate with venture capital (although amidst the current resurgence of CVC programs, many of these companies that have survived do in fact have CVC programs today). Earlier in the 20th Century, venture capital was not always exclusively tied to the tech or pharma industries. Indeed, there was not much of a tech industry to speak of during this period — computers still took up entire rooms and had not yet emerged from elite corridors into popular consciousness.

Early CVC investors employed a variety of CVC models, often at the same time. Companies invested in internal employee ventures, and tried to spin out languishing in-house technologies into new ventures. In addition, corporations also invested in external startups, usually firms that addressed the parent corporation’s needs or strategic objectives, as was the case with DuPont and 3M. The most successful of these early CVC programs was run by 3M, whose internal CVC program famously produced Post-it notes, a classic business school case study.

But there were also examples of companies that stretched further with their venture arms, looking far beyond their core businesses and embracing new technology as the key means of diversifying. While this may have been prescient, it was not necessarily prudent.

‘Very bright and very careful’: Exxon Enterprises in the first wave

An emblematic program of the first CVC era was Exxon Enterprises. Exxon was probably the largest CVC investor of the 1970s, overtaking DuPont, which is thought to have had the largest program in the 1960s.

The group was founded in 1964 with the intention of exploiting underutilized technologies from Exxon’s corporate labs. It later began to make minority investments in external startups, and finally pivoted to focus on computing systems for commercial use.

Behind Exxon’s push to diversify beyond the oil business were the 1970s energy crises, which began in 1973. With Middle Eastern countries and OPEC suddenly seemingly willing to choke off the largest sources of oil, the prognosis for the industry became much more uncertain.

The president of Exxon Enterprises said in 1976 that the goal of the program was “to involve the corporation in new technologies and in new business opportunities that could have some significance in the 1980s and beyond.” Exxon Enterprises invested in 37 ventures during the 1970s,1 about half internal and half external; companies with names like Qume, Vydec, Ramtek, Qwip, and Xentex.

These companies made products far outside of Exxon’s core business: a test scoring machine, a high-speed printer, air pollution mitigation technology, a text-editing machine, surgery equipment, solar heating panels, graphite composite golf club shafts, and advanced computers (a partial list can be found here).

Exxon Enterprises also had two wholly owned subsidiaries: a company that manufactured and sold gasoline pumps and other service-station equipment, and Exxon Nuclear, a commercial supplier of nuclear fuel products.

A company spokesperson told the New York Times in 1976 that small venture investments were one of the only ways the company could diversify while avoiding anti-trust litigation. The company, itself the product of a landmark antitrust suit, did, indeed, have a major acquisition blocked by the government a few years later.

Although the idea of Exxon investing in graphite composite golf club shafts may have seemed strange to some observers at the time, the company was generally praised for its efforts. An anonymous venture capitalist described Exxon Enterprises’ staff to The New York Times as “very bright and very careful.” And a writer at the Harvard Business Review argued in 1980 that corporate venture investing had “allowed Exxon to transform itself from a huge — though unglamorous — one-product, narrow-technology oil company” — one that was already pulling in $49B per year in revenue at the time — “to an exciting company that is expanding into computers and communications, advanced composite materials, and alternative energy devices.”

Another VC investor speculated in 1976 that “someday you might be able to drive into the local Exxon station for gas and while you’re there, rent a telephone, a computer or almost anything else you want.”

As the years dragged on, Exxon grew impatient with its investments and tried to merge and consolidate these ventures under Exxon’s corporate structure. But the entrepreneurs, who had been promised relative freedom to pursue their projects as they saw fit, fled; the effort collapsed, leading to tens of millions of dollars in losses.

Exxon Enterprises then regrouped and refocused, organizing its companies to sell computing systems and aspiring to become the next IBM or Apple. The company actually released a personal computer in 1982.

In addition, as part of this effort, it invested in and helped nurture an innovative microprocessing company founded by an early Intel engineer. But Exxon’s push to consolidate its companies and achieve vertical integration in the computing market undermined its position, as it found itself competing with the same computing companies that bought its microprocessors. When Exxon shut down the program in 1984, the losses on computing investments alone had surpassed $2B. Exxon had again decided to remain a “narrow-technology oil company.”

The End Of The Diversification Era

The first wave of corporate venture capital was largely finished by 1973, although some companies, as we have seen, continued to soldier on through the decade. The immediate cause of the decline was the economic downturn — the oil shocks and the stagflation crises — which led to the collapse of the IPO market and dried up the rich cash flows that had been funding much CVC activity.

In addition, the seventies saw the “shareholders’ revolution,” and companies were pulled apart during the beginning of the corporate-raider era. Anti-trust regulation also grew more lax. All of this put an end to the frantic diversification that had driven the first wave of CVC.

The average CVC program of this first era lasted four years. Another, lesser-known reason for the decline in CVC was a substantial increase in the capital gains tax in 1969, which hurt the stand-alone venture capital firms that many corporations emulated. By 1978, only 20 US corporations had an active CVC program.2 Around this time, however, regulatory, cultural, and technological changes took hold that would spur the second wave of corporate venture capital into action.

THE SECOND WAVE: SILICON VALLEY, 1978-1994

The release of the first personal computers in the late 1970s was the signature event that would ultimately drive the second wave of corporate venture capital investment. While corporations in the first wave looked inward for innovation and invested externally to diversify their businesses, the second wave was marked by recognition of the epochal changes introduced by computers and a sense in the business community that they must engage or risk obsolescence.

Technology was becoming a consumer-facing industry, and Silicon Valley was becoming “Silicon Valley” in the popular imagination in the early 1980s; Time Magazine named the computer its “Machine of the Year” in 1983. The hype surrounding the PC’s rise set a pattern that would continue in subsequent periods of corporate venture capital activity: a new, “can’t miss” technology breaks through in the market, generates enthusiasm in the business community and broader culture, and ultimately helps stimulate investment activity.

Early tech success stories like Microsoft and Apple were celebrated by the press, and their founders, Bill Gates and Steve Jobs, came to represent a new kind of businessman in a culture still dominated by the staid image of the company man. Gates and Jobs were already near-celebrities, with a whiff of counterculturalism about them. As a result, entrepreneurship became a buzzword for the first time.

A contemporary article in The New York Times proclaimed, “A Pioneer Sweeps Business” and offered the following observations:

Nearly 160 schools now offer courses in entrepreneurship — up from 16 in 1970. At Harvard — training ground for the new generation of business leaders — nearly two-thirds of all students take the entrepreneurial management course and, last year, 80% of first-year Harvard students said they wanted to own and manage their own business someday.

Some of the people are going into business not because they have a fundamental belief in American capitalism, but because they have a social mission.

Young people by the droves — refugees from corporate life, career-minded housewives and the cream of the business school elite — are turning their backs on giant corporations and going it alone. In doing so, they have pushed forth America’s inventive edge and are restoring vitality to an economy that, far too often in recent years, seemed to have lost its competitive might.

None of this would sound unusual in 2017 (or 1998) — but this article was written in 1984. Zenas Block, a professor of management at New York University, told The New York Times in 1985, “media publicity given to private entrepreneurship has been considerable, and that has had a major impact on large corporations.”

CVCs are followers, not leaders

As is often the case in the history of CVC, corporate investors largely followed the lead of private venture capital, which was reinvigorated by favorable regulatory changes in the late seventies. Private VC received a big boost in 1978 when the capital gains tax was significantly reduced, and then again in 1980, when it was lowered once more, incentivizing investment and creating a boom in venture capital. This increased the pool of capital available to entrepreneurs, incentivizing entrepreneurship, and creating a positive feedback loop. Between 1977 and 1982, the amount of money dedicated to venture capital grew from $2.5B to $6.7B.

Corporate funds accounted for a significant portion of this capital. Money from public corporations accounted for 41% of the $2.5B dedicated to the diminished venture capital industry in 1977, which includes both active corporate investors and passive investments by corporations in independent VC firms. By 1982, that figure had fallen to 27% of the $6.7B dedicated to venture capital, which nonetheless still represents an increase in overall corporate dollars.
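
Running the numbers makes the point concrete (figures rounded from the totals above):

\[
0.41 \times \$2.5\text{B} \approx \$1.0\text{B} \ (1977) \qquad \text{vs.} \qquad 0.27 \times \$6.7\text{B} \approx \$1.8\text{B} \ (1982)
\]

Corporate dollars nearly doubled even as the corporate share of the pool shrank.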

Companies employed several models in pursuing corporate venture capital programs during this period, often pursuing multiple strategies at once.

  • Some companies preferred an indirect approach. Many companies simply gave their money to independent VC firms. Around 100 companies used this approach in 1987; by 1989, $483M of corporate money was invested in independent VCs, 20% of the total.
  • Other times corporations provided the capital for a dedicated VC fund handled by an external fund manager, an indirect approach known as a client-based fund. Corporations sometimes teamed up to create a client-based fund; AT&T, 3M, and Gulf and Western, for example, came together to create Edelson Technology Partners, operated by a former Wall Street analyst. The number of client-based funds rose from 31 in 1982 to 102 in 1987.3
  • Internally managed CVC funds also grew in popularity over the decade, rising in number from 28 in 1982 to 76 in 1988. Some corporations made direct VC-style investments outside of any dedicated CVC fund; corporates made 245 such deals in 1985, up from 30 in 1980.
  • Others pursued more idiosyncratic strategies. Eastman Kodak, for example, used a significant percentage of its $80M CVC fund to finance internally developed employee ideas that fell outside of its core business.

The motivations behind the second wave of corporate venture capital largely mirrored those of the first: companies wanted access to technology, sometimes as a means of diversification and sometimes to expand into adjacent product lines, albeit in a more disciplined, less wide-ranging fashion than during the first wave. Many corporations rationalized that CVC was often cheaper and more profitable than outright acquisition.


Silicon Valley, 1991. Source: Reddit/Imgur via Business Insider

Access to technology could also mean protecting or hedging against existing technologies. When the tech industry was looking for alternatives to silicon-based chips, Analog Devices, a maker of silicon-based chips, started a CVC program to invest in competing technologies. No alternative was found and Analog Devices’ investments largely failed — but the company would have been well-positioned had an alternative emerged.

Given some of these motivations, the parent corporation’s strategies were not always beneficial for the startups it invested in. General Motors, for example, invested in five machine vision companies in order to create technology that could automatically inspect parts on the assembly line. GM pushed these companies’ product development to meet its own needs until it finally decided to abandon the technology and cut spending, effectively stranding the companies.

CVC expands its reach

While the second wave of CVC was largely focused on the technology industry, this was not exclusively the case. Colgate, Raytheon, and GM, for example, had CVC programs. General Electric still had its CVC fund from the first wave and invested in a number of successful tech startups. Xerox, Johnson & Johnson, Dow, WR Grace, and Motorola likewise maintained CVC programs from the first wave.

There were also a number of robust CVC investors among metal and chemical companies, such as Dow and WR Grace. One of the most active CVC investors of the period was Lubrizol Corporation, a chemical company that invested in, among other companies, Genentech, the Kleiner Perkins Caufield & Byers-backed biotech that became one of the giant exits of 1999, when it went public at a $10.8B valuation.

In addition, the second wave of CVC was when foreign companies, especially from Japan, instituted CVC programs for the first time, although the investment was still largely confined to the US. Japanese companies made 60 investments in US-based companies in 1989, and their share of US-based CVC programs rose to 12%, from 3% in 1983.

By 1990, there were 138 corporate investors in European VC funds, although only a few European corporates had internally managed CVC funds.4

Mark Radtke of consulting firm Venture Enterprises told The New York Times in 1986 that foreign corporations viewed VC as “an easy way … to effect a technological transfer,” i.e. gain access to new American technology.

There was also some investment flowing in the opposite direction. For example, Monsanto, DuPont, 3M, IBM, and Apple teamed up to create a client-based fund focused on European investments.

Hedging against the past: Xerox Technology Ventures

One of the most prominent corporate investors of the second wave was Xerox, then one of the most cutting-edge companies in Silicon Valley, partly because of its famous Palo Alto Research Center (PARC).

Xerox PARC, 2003

Xerox had had an active CVC program since the 1960s, operating an internally managed fund that invested in some of the most legendary figures in Silicon Valley, including Raymond Kurzweil and Steve Jobs. Kurzweil got his start in technology when Xerox invested in his first company, Kurzweil Applied Intelligence Inc., in 1982, to develop a computer that could transcribe spoken English. The idea behind the investment was to create technology that would increase demand for Xerox’s printing products when it ultimately hit the market.

When Apple released its revolutionary Macintosh computer, some observers claimed that it had commercialized technologies first developed at Xerox PARC, such as the mouse, windows, and icons, even dating the origin of the supposed copying to a 1979 visit to the lab by Jobs and other Apple employees — the tour is considered a seminal event in Silicon Valley lore. Although the accusation that Apple outright copied its user interface from Xerox is generally refuted today, the episode stung Xerox’s management. The company was eventually spurred into rethinking its CVC program in an effort to better capitalize on its languishing in-house technologies.

Xerox started Xerox Technology Ventures (“XTV”) in 1988 to exploit and monetize the technology created in PARC and its other research labs, funding it with $30M.

The company’s chairman said at the time that it was “a hedge against repeated missteps of the past.” Apple was one of several examples in which technology initially developed by Xerox was commercialized by more nimble competitors.5 These “missteps” were prominently highlighted in a book released that same year, Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer.

Not everyone thought the way Xerox had handled development at PARC was so terrible. Robert Adams, who would go on to lead XTV, argued, “The laser printer alone paid for all of the other PARC research projects many times over. If some of the innovation results fall off the wagon, so what?”

XTV was modeled on the structure and practices of independent VC firms, one of the earliest examples of the quasi-independent CVC unit we see today: managers were given the flexibility to act quickly on investment decisions, were allowed to invest up to $2M without asking permission, had the autonomy to monitor, exit, and liquidate investments, and were charged with maximizing ROI. And in keeping with a typical VC firm, a compensation scheme recognized the big winners who drove results, rewarding greater risk-taking.

Between 1988 and 1996, XTV invested in more than a dozen companies created from Xerox’s existing technologies. The companies were staffed with outside employees and allowed to make their own technological decisions; Xerox never intended to maintain control, and involved outside VCs as the companies grew.

Meeting at PARC

XTV was an enormous financial success, netting capital gains of $219M on the company’s initial investment, an astounding net internal rate of return of 56% — far larger than comparable returns by independent VCs during the same period.6
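
As a rough consistency check on that figure, assume for simplicity that the full $30M was deployed in 1988 and that all value, the original capital plus the $219M in gains, came back in 1996. The naive annualized return over those eight years would be:

\[
\left(\frac{\$30\text{M} + \$219\text{M}}{\$30\text{M}}\right)^{1/8} - 1 \approx 30\%
\]

The reported 56% net IRR exceeds this naive figure because, in practice, capital was deployed gradually and gains were realized along the way; earlier cash flows push an internal rate of return well above the simple eight-year average.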

Nonetheless, XTV was terminated early and replaced with Xerox New Enterprises, which did not relinquish control of its firms or allow for outside investment, and which used a more standard corporate compensation scheme.

Why was it scrapped despite the successes? XTV had created a lot of internal strife at Xerox, partly because of its compensation structure, which lavishly rewarded XTV’s executives, creating tension with other Xerox managers, a common problem for CVCs. In addition, some believed that XTV startups succeeded at the expense of other Xerox units while utilizing Xerox resources, which further undermined the VC unit.

A brief hiatus

The demise of XTV highlights the precarious position of even the most successful CVC programs, which can still fall victim to management’s whims or internal bickering between corporate units. The average lifespan of a CVC program between 1988 and 1996 was 2.5 years, one-third the duration of an independent VC.7

The 1987 stock market crash is widely considered to have brought the second wave of corporate venture capital to an end. CVC direct investment peaked in 1986 at approximately $750M, up from $300M the year before.8 The total number of companies with some sort of CVC program fell by a third between 1987 and 1992. Likewise, indirect CVC investment, made through funds set up by independent VCs, fell to $84M in 1992, from $483M in 1989.9

Nonetheless, some CVC indicators continued to rise after the crash. The number of internally managed CVC funds rose to 95 in 1991, from 76 in 1988, before falling to 69 in 1992 amid the fallout from the Savings & Loan crisis.10

If Time Magazine’s “Machine of the Year” presaged the second major CVC wave, a different magazine cover more than a decade later augured the beginning of the third wave of corporate venture capital.

THE THIRD WAVE: IRRATIONAL EXUBERANCE? 1995-2001

On August 9, 1995, Netscape Communications Inc. went public and its shares more than doubled on the first day — an unprecedented event for a company that had yet to earn a profit. Six months later, Netscape founder Marc Andreessen — barefoot, sitting on what appeared to be a throne — was featured on the cover of Time magazine next to the headline “The Golden Geeks.”

The dot com boom had begun. If the personal computer was the breakthrough technology that drove the second wave of corporate venture capital, the internet was undoubtedly the impetus behind the third wave, which far outpaced its predecessors in scope and size.

In 2000 alone, more than 20 new CVC groups made their first investment, according to CB Insights data. Nearly 100 CVCs made their first investments in the years between 1995 and 2001.

Total dollars from deals involving CVCs grew to approximately $17B in 2000, or 25% of total funding to VC-backed companies in that year.

CVC also continued to internationalize during this period, even as the US remained the most important market. Between 1990 and 1999, 71% of CVC investors and 75% of CVC firms were located in the US; the real percentage of US-based firms may have been even smaller because some foreign corporations set up their CVC units in the US.12 Large Japanese companies, for example, began the practice of sending executives on work rotations in the offices of private VC funds they had invested in to bring the gleaned information and experience back to headquarters.13 American companies, conversely, began to use CVC as a means of accessing foreign markets or foreign technology.

The alternative R&D era sweeps in

Part of the reason for this enormous growth in CVC activity was, as always, hype building on hype. As Paul Gompers and Josh Lerner write in The Venture Capital Revolution, the publicity around early high-profile success stories like eBay and Yahoo “triggered the interest of many CEOs, who sought to harness some of the same energy in their organizations.” However, it also reflected real changes in the broader business environment.

Companies that had previously relied on a central R&D lab were experimenting with new models to drive innovation because, as we have seen in prior waves, new technologies often languished internally or were commercialized by more agile competitors, often startups. Small firms’ share of total dollars spent on R&D in the US rose from 4.4% in 1981 to 24.4% in 2009, and startups now file roughly 30% of patent applications.14

CVC was thus a natural way for companies to access alternative R&D, allowing them to outsource at least part of their research and development to smaller, nimbler startups. Pharmaceutical companies and freshly ascendant tech mega-companies emerged as major CVC investors during this period.

However, whereas CVC investors had historically been companies with research and development in their DNA, the profile of CVC participants began to change with the enormous expansion of CVC investment. Media and advertising companies, for example, rushed into CVC for the first time. Reuters, News Corp., Reed Elsevier, and the WPP Group, among others, started CVC programs. The production company behind “The Blair Witch Project” even started a CVC fund with $50M of investment capital. (It completed at least one investment before shutting down in 2003.)

The scale of companies’ individual CVC programs far outstripped anything that had come before. In September 2000, for example, the German media conglomerate Bertelsmann AG pledged $1B to a fund investing in new media startups — which, even if it was only a pledged figure, exceeded the size of total CVC investment at the height of the second wave.

The growth of CVC also accentuated some of its internal contradictions, notably around compensation. Independent VCs were minting fortunes during the tech boom, seemingly overnight, far exceeding anything that had come before. Many CVC programs, however, did not offer compensation structures like those of traditional VCs. Corporations couldn’t justify the enormous payouts to individual senior investors associated with successful VC investment, and as a result they would often train and develop CVC investors only to see them jump to independent competitors in search of more lucrative compensation. GE Equity, for example, lost 18 investors between 1998 and 1999, many of whom went to leading VC firms.

Despite this tension, this period saw closer collaboration between corporate and private venture capital investors than in any of the previous waves. This occurred because the VC market was overcrowded, and partnerships with corporations offered independent VCs a competitive advantage, according to Gompers and Lerner.

These partnerships often went beyond investment syndicates. Kleiner Perkins, for example, set up a Java Fund as a way for companies to stimulate demand for the technology by investing in companies creating Java-based applications, which held the promise of being compatible with multiple operating systems. Investors in the fund included Cisco Systems, Compaq Computer, IBM, Netscape Communications, Oracle, Sun Microsystems, and Tele-Communications Inc. These companies wanted to legitimize the language, which had the potential to break Microsoft’s chokehold on application development.

Others sought to benefit from the advantages of bringing private and corporate venture capital together. Texas Instruments, for example, partnered with independent VC firm Granite Ventures to create a fund, TI Ventures. The fund was directed to invest in companies strategically relevant to TI; within that mandate, Granite focused on maximizing financial returns.

Approaches to CVC also began to change during this period. CVC was once about entering new markets or expanding product lines, but this new third wave of CVC focused on defending and supporting existing product lines by fostering a healthy business environment and ecosystem around those products. No company better exemplifies this strategy than Intel Capital, arguably the most successful longstanding CVC program.

Intel Capital: Steady does it

Intel Capital was founded in 1991 as a way to centralize Intel’s external investments, which had previously been handled by each individual business unit. While the venture unit initially focused on filling in gaps in its product line and rounding out its technology, it broadened its mandate in the mid-1990s to invest in the wider ecosystem surrounding Intel’s product offerings with the goal of improving market conditions. Intel Capital would be investing in “companies building technologies that supported, were sold alongside, or improved the value of Intel’s products in the marketplace,” according to a Harvard Business Review case study. This also included companies that stimulated demand for Intel’s products in the market.

Intel’s strategy was often to invest in multiple startups competing in the same market, which irritated some independent VCs, who typically don’t invest in competing companies. But Intel was committed to stimulating emerging technologies and market sectors rather than the success of any individual company. While this could, in theory, be off-putting to entrepreneurs, many were nonetheless eager to access Intel’s extensive resources and technical expertise. Among the companies backed by Intel Capital were Broadcom, Clearwire, and Research in Motion.

In 1999, for example, Intel started the Itanium 64 Fund, a $250M fund that invested in companies creating products using its Itanium 64-bit processor. Intel pursued a similar strategy in October 2002 when it wanted to encourage the adoption of wireless technologies using the 802.11 network standards, unveiling a $150M fund to invest in companies developing products that would drive the adoption of Wi-Fi networks and, in turn, sales of its new Centrino chip sets.


Itanium processor chip, produced 2001-2002

Part of Intel’s success is also attributable to the fact that it focused on long-term goals, insulating itself from the boom-bust cycle that hurts so many CVC investors. Intel Capital was founded at a time when most other VC investors, independent and corporate, were pulling back.

And in 2001, when CVC programs were shutting down in droves, Intel continued to play the market, investing more than industry stalwarts Kleiner Perkins and General Atlantic Partners. “It’s not an option not to do it,” Leslie L. Vadasz, the president of Intel Capital, said at the time. He later told researchers for the Harvard Business Review, “There is no question that corporate venture investing increases the risk of P&L volatility, but it is one of the risks you accept along with all the gains.”

It certainly didn’t hurt Vadasz’s standing, or his ability to gain the trust of his peers at Intel, that he was the company’s fourth employee and had previously run every major business unit.

At the height of the boom in 2000, Intel Capital posted gains of $3.7B, accounting for one third of Intel’s total profit. It then experienced 10 successive quarters of losses that nearly wiped out those gains, but Intel kept investing throughout and was again posting profitable quarters by early 2004.

Despite its financial success, Intel’s investments have always been based on strategic value, never solely on potential returns. Also notable is that Intel’s investment program was more robust than most other CVCs’: between its founding in 1991 and 2005, it invested over $5B in more than 1,000 companies, a rate of investment that surpasses most independent VCs. It also had a more international focus than other CVCs, investing more than 40% of its funds internationally. This often meant backing companies that were improving internet access in international markets, in the hope that doing so would increase PC sales and thereby demand for Intel’s products. A smaller portion of Intel Capital’s funds, about 10%, was devoted to scouting the marketplace for potential threats and opportunities years down the line.

Intel overcame the compensation issue internally by putting senior employees who were already deeply invested in the company in charge at Intel Capital. These individuals also had the institutional knowledge necessary to align Intel’s investments with the company’s overarching strategy. While Intel Capital did not take board seats, its observers did play an active role in shaping portfolio companies’ strategies and success in the market — in a way that stimulated synergies with Intel’s products and strategies — and provided portfolio companies with significant resources, including direct access to Intel’s executives. It also, crucially, had the support of management. As mentioned, Intel Capital’s president, Leslie L. Vadasz, was one of the company’s original employees.

All of this meant that Intel was uniquely well-positioned when the inevitable downturn arrived.

The tech bubble busts CVC

The third wave finally ended with massive declines in the stock market. Between March and May of 2000, the Nasdaq fell 40%. It clawed back about half of that loss by September, before losing another 50% through April of the following year. Many public tech companies went bust, as did many tech startups that depended on robust financing and IPO markets to fund their expansion.

Because most CVC units are funded through the balance sheet, CVC investors are often required to use mark-to-market accounting, and as a result they had to write down enormous losses during this period. While this method of accounting does not necessarily reflect real gains or losses, it can produce jaw-dropping headlines that scare off executives and shareholders.
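
A purely illustrative example of the mechanics, with hypothetical figures:

\[
\text{write-down} = \$50\text{M (carrying value)} - \$10\text{M (marked-to-market value)} = \$40\text{M}
\]

The $40M loss hits the parent’s income statement in the quarter the markdown is taken, even though no shares have been sold and the position could, in principle, recover.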

Corporations were forced to write down $9.5B of venture related losses in the second quarter of 2001 alone.

Microsoft wrote off more than $5.7B in 2001; Wells Fargo wrote down $1.2B; even Intel, previously impregnable, wrote down $632M. Numerous companies, including Microsoft, AT&T, and News Corp, shuttered their CVC units; Amazon and Starbucks, among others, decided to no longer invest in startups; others liquidated their holdings in fire sales. As a result of the losses and write-downs, some companies faced pressure from large shareholders and activist investors who questioned the propriety and wisdom of CVC programs.


Many companies unceremoniously dropped their CVC programs or ignored their investments, leaving a negative impression of corporate VC investors that persists, as we’ll see, in some circles today. Hewlett-Packard’s CVC program, for example, invested in hundreds of startups. After the bust, the company largely ignored the program. By 2008, when it was finally being wound down, no one at the company even knew whether its portfolio companies were still in business, had gone bust, or were earning the company strong returns — a team from strategy and corporate development had to call around to find out.15

However, not all companies pulled out of the market. Biotech and pharmaceutical companies, in particular, retained robust CVC programs throughout the early part of the decade. Shell, Mitsubishi, and Johnson Matthey, for example, came together to form a CVC fund investing in fuel cell technologies as late as 2002.

Even if CVC was momentarily discredited in the eyes of many, the boom in CVC investing had given observers a broader set of data than ever before to provide insights about the phenomenon, to understand its successes and failures.

Some of the data was encouraging. Paul Gompers writes in The Venture Capital Cycle that between 1982 and 1994, firms backed by CVC were more likely to go public than those that were not. However, CVC investments did not, in general, outperform those of independent VCs, unless there was a strategic tie between the investing corporation and the startup.

On the other hand, CVCs also paid significantly more for their investments. CVCs invested at an average valuation of $28.5M versus $18.2M for independent VCs.16

CVCs also invested less frequently over a more abbreviated period of time. The average CVC made 1.76 investments per year, whereas an independent VC made 5.75.17

What doomed many CVC programs of the period, most researchers agree, was a lack of strategic focus and clearly defined objectives. Many companies started CVC programs because it seemed, at the time, that any serious company had one, not because they had carefully thought through how such a program could improve their business. The problem was compounded when outside investors were brought in to run the program.

Companies wanted access to Silicon Valley without really knowing how Silicon Valley could be of service to them. They just knew it represented “The Future.” In other words, they were engaged in what CB Insights has dubbed “Innovation Theater.” Even when these programs had strategic objectives, the frothy atmosphere of speculation often meant that chasing financial returns crowded out their original focus. As a result, they were swept up in the moment and over-invested in flashy startups with unclear benefits to the parent company’s core business. Then they were caught off-guard when everything went bust.

Even if CVC garnered a bad reputation in some corners, though, there were still enough successes to convincingly demonstrate its importance and vitality in the right hands. It was always clear that CVC investing would one day make a comeback with the improved fortunes of the tech industry — but would it return again only to get caught up in the hype?

FOURTH WAVE: THE UNICORN ERA, 2002 TO PRESENT

While corporate venture capital fell off substantially after the bubble burst, it by no means disappeared. CVC as a percentage of total VC was halved, but CVC investment leveled out at about $2B per year through the first half of the decade, then began to increase again before dipping, along with the rest of VC investment, during the worst years of the global financial crisis; dollars from CVC-backed deals reached only $5.1B in 2009. Investment then took off when Silicon Valley began to boom again in the first half of the current decade.

Annual CVC deals and dollars

Dollars are for deals involving CVCs, which often involve non-CVC investors as well

In 2012, for example, total funding from deals involving CVCs was $8.4B, according to CB Insights — a significant increase from 2009, but still less than half of total CVC investment during the boom years. It remained at roughly that level in the following year before doubling in 2014, and then jumping nearly 70% in 2015 to an unprecedented $28.4B.


Dollars are for deals involving CVCs, which often involve non-CVC investors as well

This trend reflects the broader increase in venture capital investment, which has more than doubled since 2011, with significant gains in 2014 and again in 2015. At this point, though, CVC is actually growing at a faster rate than venture capital investment in general. And CVC units capture only part of the picture: corporate financings of private companies, done outside of a CVC unit, have also seen substantial increases in recent years.

CVC vs. corporate investments, 2016

With the new rise of CVC, critics have again surfaced. Sarah Lacy, editor of Pando, offered a harsh assessment: “At best, corporate VCs are a bit like minor league baseball teams. If you’ve got the chops to be a real VC, you’re either moving up or moving down. Either way, at many companies it’s a revolving door of talent and climbers.” (Fred Wilson, as we previously saw, also has his criticisms.)

The obvious impetus for CVC’s resurgence was the dual rise of social media and the smartphone, most prominently represented by Facebook and the iPhone. Together, internet and mobile accounted for 63% of CVC dollars by the final quarter of 2016. Healthcare now also regularly tops software and hardware, both tech boom darlings, as a destination for CVC dollars. Another factor encouraging CVC activity is that corporations are sitting on historically large piles of cash and global interest rates are historically low.

With the benefit of hindsight, two other events may have played a small role in nudging companies back into corporate venture capital. The first was Microsoft’s 2007 investment in Facebook, which was widely ridiculed at the time — “Microsoft has to be seriously desperate to be considering this much of an investment for so little, even with its bags of cash to spend,” wrote Kara Swisher on the eve of the deal.


Microsoft paid $240M for a 1.6% share of Facebook at a valuation of $15B. Facebook is valued at more than $300B today, and the investment connected the by-then tech stalwart with the hottest startup in the world.
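
The arithmetic checks out, and gives a rough sense of the paper gain (the second figure assumes, purely for illustration, that the 1.6% stake was never diluted):

\[
\frac{\$240\text{M}}{0.016} = \$15\text{B}, \qquad 0.016 \times \$300\text{B} \approx \$4.8\text{B}
\]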

The second spark that helped CVC’s resurgence was Google’s 2008 decision to start Google Ventures, likewise derided by some. The outfit has since grown into one of the largest and most respected CVC investors — a strong endorsement of the idea from one of the world’s most successful and innovative companies.

Nonetheless, hype and exuberance have undoubtedly contributed to the enormous growth in CVC in the past few years, buoyed by the triumphant press surrounding new technologies, a rash of new buzzwords, and the omnipresent fear of disruption. Arvind Sodhani, who led Intel Capital for ten years, told The New York Times earlier this year, “CEOs who are worried they’re going to get disrupted want to have an outpost in Silicon Valley to discern where the disruption is coming from.”


The recent expansion of CVC investment is impressive. There are now, as we saw, roughly 200 CVC units active in any given quarter, more than 2x the figure of just five years ago. Some companies that ditched their CVC units after the bubble burst have decided to give it another go. Dell closed its CVC unit in 2004 and opened a new one in 2011. Microsoft, likewise, has revamped its CVC efforts.

Microsoft Ventures: Disciplined ambition

While Microsoft had an ad hoc corporate venture capital program in the 1990s, making minority investments in startups without a formalized CVC unit, the new Microsoft Ventures represents a mature model for corporate VC, pioneered by large technology companies and since adopted beyond the industry. The large write-downs and losses following the tech bust, mentioned above, largely curtailed Microsoft’s ad hoc approach, although the company did continue to make intermittent investments in startups, such as its 2007 investment in Facebook. At the beginning of 2016, however, Microsoft decided it was time to ramp up its investing activities and founded its first structured CVC program, Microsoft Ventures.

Nagraj Kashyap, formerly the head of Qualcomm Ventures, was brought on to spearhead the effort, and he has hit the ground running. In 2016, Microsoft Ventures invested in 18 companies, far more than Kashyap initially expected when he took on the job. The selection of Kashyap, who has been working in CVC for more than a decade, signified Microsoft’s intentions for the group.

Kashyap came from Qualcomm Ventures, which, like Intel Capital, emulates many independent VC firms and focuses on maximizing financial returns. Kashyap confirmed that he would bring a similar philosophy to Microsoft: “We don’t see the distinction between financial and strategic [goals]. The best financial companies make for the best strategic returns.”

That does not mean, of course, that Microsoft Ventures will be investing in coffee startups or lifestyle brands. On the contrary, it is focused on enterprise startups providing business-to-business services, such as companies building next-generation cloud infrastructure, companies that help enterprises migrate to the cloud, and all sorts of business-oriented SaaS; thus far, enterprise startups account for approximately 90% of Microsoft Ventures’ investments.

MV

Why these companies? The answer is simple: These are the companies where Microsoft can provide the most help and technical support post-investment. Microsoft’s investment philosophy does not mean the venture arm is indifferent to strategic concerns, but rather that they have set investment parameters that fit their strategic interests and then focus on financial returns within those parameters. “If I spend an enormous amount of time engaging a company with Microsoft and it fails,” said Kashyap, “that is basically a waste of time for everybody.”

Microsoft Ventures’ portfolio companies are under no obligation to work with Microsoft post-investment, and vice-versa; many of the startups have not even met with the broader company at the time of investment (beyond the investment team, of course). Nonetheless, the CVC offers these companies three services, should they accept an investment:

  • Technical integration with one of Microsoft’s products, such as Azure or Office 365, if it is beneficial to the portfolio company’s product
  • Go-to-market help, using Microsoft’s large and robust enterprise sales team
  • Promotional services, such as featuring a portfolio company at a Microsoft-held conference

Microsoft Ventures makes the necessary introduction and then an internal business development team, staffed by Microsoft employees, manages its portfolio companies’ interactions with Microsoft.

The investment team, however, is staffed by investment professionals, some of whom previously worked at other CVCs, like Intel Capital, or for independent VC firms, and they are expected to evaluate startups as any other VC would, without regard to a startup’s potential strategic advantages.

Microsoft Ventures has no set investment budget, no limiting fund size, and no minimum or maximum required investments per year. In some regards, this means its investors have an even greater level of flexibility than their counterparts at independent VCs, highlighting a potential advantage of CVC. It has the autonomy to invest more or less in a company, take a large or small ownership stake, and invest in as many good opportunities as it can find.

Kashyap sees the group’s focus on financial returns as key to maintaining its unusual degree of independence and flexibility: “As long as you’re making good investments, a corporate parent will typically not have problems with you spending more or less.”

The group has made good use of that flexibility thus far. Microsoft’s 18 deals in 2016 are more than many robust independent VCs typically make per year — and the group was only founded toward the end of January. Kashyap says the market gradually improved over the course of the year, as valuations became more rational than in recent years and entrepreneurs accordingly lowered their expectations.

As can be seen from the snapshot from CB Insights’ Investor Analytics tool below, Microsoft often invests after top VCs such as Bessemer and Data Collective, and also sees VCs and CVCs like Trinity Ventures, Intel Capital, and Accel invest in its portfolio companies.

CB Insights Investor Analytics: Microsoft Ventures co-investors

Kashyap believes that market conditions have fundamentally changed since the end of the last tech boom and there is unlikely to be a shakeout as brutal, despite concern over a unicorn-era bubble.

Many CVC investors back then were looking for quick returns from portfolio companies’ IPOs — the criticism that they were short-term players was largely true, he says. Today, however, the IPO market is much less robust, meaning CVC investors can no longer expect an instant return on investment. And corporations’ balance sheets are generally much stronger — they are sitting on record levels of cash in a historically low interest rate environment — meaning they can afford to focus on longer-term goals and let the companies they’ve invested in mature over time.

Critics aside, CVC grows up

CVC investors have changed as well. There are enough established investors with strong track records that independent VCs and entrepreneurs are less wary of partnering with CVCs than they once were. Nonetheless, Kashyap warns, “There’s always going to be immature actors — not bad actors but immature actors, because they [started investing] from the wrong perspective and for the wrong goals.”

Skepticism on the part of entrepreneurs and other VCs is justified, because not all CVC investors share their principles. While some of these immature investors will undoubtedly disappear as the market cools and others will pull back, it is unlikely that we will see the sort of enormous drop-off in CVC investing that occurred in the early 2000s.

Meanwhile, the music has not yet stopped playing, despite the downturn in VC activity in 2016. More than fifty new CVC units were started in the first half of 2016 alone. The number of active CVC investors per quarter more than doubled between 2012 and 2016, according to CB Insights.

However, despite assurances from CVC investors that they are pursuing smaller, more nimble investments than during the tech boom, CVCs on average actually invest in larger deals. In the second quarter of 2016, CVC units participated in 19% of VC deals, but those deals represented 27% of VC investment dollars.
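
The gap implies, as a back-of-the-envelope calculation, that deals with CVC participation averaged roughly 1.4 times the size of the overall average deal that quarter:

\[
\frac{27\%\ \text{of dollars}}{19\%\ \text{of deals}} \approx 1.4
\]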

CVC has, however, grown up in the meantime, meaning that there is more institutional knowledge and more resources for companies just starting out. According to a survey of CVC investors, about half have processes in place to solicit and incorporate feedback from other stakeholders at their parent corporations. Two-thirds have a dedicated budget, and 80% complete more than five deals a year. CVC has, to some extent, standardized.

The mix of strategic and financial objectives continues. Four-fifths say they are primarily looking for strategic alignment in their startup investments, but three quarters also list financial considerations as a core objective.


Some companies, it should be noted, employ multiple types of funds and investment strategies. Cisco Investments, for example, makes direct investments and acts as a limited partner in a number of independent venture capital funds. Microsoft Ventures, referenced above, is an internal dedicated fund.

Others are like more elaborate iterations of the “client-based” funds first pioneered in the 1980s, i.e. external funds which may be managed by an independent investment team, but that are wholly funded by a specific corporate or group of corporates.

Unilever and Pepsi, for example, are limited partners in Physic Ventures, a firm whose stated mission is “investing in keeping people healthy” and which is designed to let corporate investors forge commercial partnerships with portfolio companies. Both companies reportedly have full-time employees working out of Physic Ventures’ offices.


Kleiner Perkins similarly teamed up with Apple to create the iFund in 2008 in order to stimulate development for the App Store and potentially funnel more companies through KPCB, much like its Java Fund of the 1990s.

Bumps in the road

Despite the generally positive atmosphere surrounding CVC investment of late, there have been setbacks. OnLive, an online gaming startup backed by Time Warner Investments, Autodesk, HTC, and AT&T, crashed and burned in 2012 after achieving a $1B valuation. Walgreens and BlueCross BlueShield Venture Partners were investors in Theranos, the highly touted blood testing company that spectacularly blew up last year after a scandal. Other corporate-backed startups have seen steep drops in their valuations lately, including Jawbone, Zenefits, and Dropbox.

This could signal the beginning of a broader chill in the market. If and when this happens, many CVC investors will have to write down significant losses — 76% of CVC investment is funded through the balance sheet, meaning that the market value of these investments must be reflected in company filings.


Even if these do not necessarily reflect real losses, the numbers will raise eyebrows and fresh questions about how worthwhile CVC really is to the corporation. Some companies have publicly stated that they will continue investing even if there is a downturn — but that is, of course, easier said than done. Nonetheless, some are putting their money where their mouth is.

Sapphire Ventures, formerly SAP’s CVC arm and still solely backed by SAP, recently raised a $1B fund.

There are also important structural differences in CVC between the dot com era and the current tech boom. Many of the largest CVC investors in the past few years are not upstart units blundering into the market, but rather the CVC arms of blue chip tech companies, many of which rode out the last downturn and kept on investing, like Intel Capital and Cisco Investments. This makes them well positioned to capitalize on the current upswing. Other large CVC tech investors, like Google and Salesforce, started their funds more recently, in 2008 and 2009, respectively, but were already investing heavily in the market before it really heated up.

Salesforce has substantially increased its investments to more than $500M, from $27M in 2011. Many of the large investors subscribe to some variant of Intel Capital’s approach to corporate venture capital. Salesforce, for example, has been funding enterprise companies in order to stimulate the ecosystem of its core product. Comcast invests in a variety of content companies that complement and could possibly be incorporated into its core offerings, as well as technical companies that augment its core competencies.

Everyone’s a VC

It is true that there have been new CVC units from companies far from the Silicon Valley ethos, such as 7-Eleven, Campbell Soup, and General Mills, and this has raised some eyebrows. However, the success of a CVC program is not contingent on its proximity, geographic or spiritual, to San Francisco.

Companies across numerous industries face fresh challenges from a rapidly changing world, much of which, indeed, is rooted in Silicon Valley, but not exclusively so. American consumers, for example, have recently shown an inclination towards healthier foods, embracing some ingredients — kale, quinoa, acai berries, etc. — and rejecting others — gluten, some dairy and meat products — in a way that would have been nearly impossible to predict a decade ago.

This is obviously a concern for a food company like General Mills. Rather than relying exclusively on internally developed product lines to meet these changes, a slow process with no assurance of success, General Mills is using its new venture arm, 301 INC, to invest in food startups like Kite Hill and Rhythm Superfoods that already have a foothold in the market.


Given its expertise in marketing and distribution, this is a natural extension of General Mills’ institutional knowledge, much more so than Sand Hill Road stalwarts investing in coffee companies and grilled cheese startups — which is not to say that it will succeed.

Corporate venture capital is, in a sense, getting back to its roots, moving beyond the strong association with the technology industry that has come to predominate in recent decades. In doing so it has underlined what corporate venture capital actually is, at its essence: a tool for augmenting a company’s products and strategies, present and future. For many reasons, that often includes Silicon Valley, but it does not mean that CVC is an easy way to access its buzziest technologies.

CVC must be informed by an underlying strategy or differentiator. Have corporate venture capital firms learned the lessons of the tech boom, when they piled in simply because the internet was the shiny new thing? Over time, we’ll learn what companies really had a strategy — and which were merely victims of FOMO.

1. Dushnitsky, Gary. “Corporate Venture Capital in the Twenty First Century: An Integral Part of Firms’ Innovation Toolkit.” D. Cumming (Ed.). The Oxford Handbook of Venture Capital, 2012.

2. McNally, Kevin. Corporate Venture Capital: Bridging the Equity Gap in the Small Business Sector, 1997.

3. Ibid.

4. Ibid.

5. Gompers, Paul Allan and Josh Lerner. The Venture Capital Cycle, 2004.

6. Ibid.

7. Dushnitsky, Gary. “Corporate Venture Capital: Past Evidence and Future Directions.” A. Basu, M. Casson, N. Wadeson, B. Young (Eds.). The Oxford Handbook of Entrepreneurship, 2006.

8. Ibid.

9. McNally, Kevin. Corporate Venture Capital: Bridging the Equity Gap in the Small Business Sector, 1997.

10. Ibid.

11. Dushnitsky, Gary. “Corporate Venture Capital in the Twenty First Century: An Integral Part of Firms’ Innovation Toolkit.” D. Cumming (Ed.). The Oxford Handbook of Venture Capital, 2012.

12. Dushnitsky, Gary. “Corporate Venture Capital: Past Evidence and Future Directions.” A. Basu, M. Casson, N. Wadeson, B. Young (Eds.). The Oxford Handbook of Entrepreneurship, 2006.

13. Romans, Andrew. Masters of Corporate Venture Capital, 2016.

14. Ibid.

15. Ibid.

16. Gompers, Paul Allan and Josh Lerner. The Venture Capital Cycle, 2004.

17. Ibid.
