GitHub’s Top 100 Most Valuable Repositories Out of 96 Million – Hackernoon

GitHub is not just a code hosting service with version control — it’s also an enormous developer network.

The sheer size of GitHub, with over 30 million accounts, more than 2 million organizations, and over 96 million repositories, makes it one of the world’s most valuable development networks.

How do you quantify the value of this network? And is there a way to get the top repositories?

Here at U°OS, we ran the GitHub network through a simplified version¹ of our reputation algorithm and produced the top 100 most valuable repositories.
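The exact calculation is in the U°OS repository (see the footnote below), so the sketch that follows is only a guess at the general shape: network scores of this kind are typically computed as a PageRank-style power iteration over the contribution graph, with accounts and repositories as nodes and weighted contribution events as edges. A minimal sketch in Python, with the toy graph and every weight invented:

    # Minimal PageRank-style sketch of a repository "value" score.
    # The graph, weights, and damping factor are illustrative assumptions,
    # not the actual U°OS algorithm.
    from collections import defaultdict

    # Weighted edges: contributor -> repository (commits, stars, PRs, ...).
    edges = {
        ("alice", "kubernetes"): 50,
        ("alice", "spark"): 10,
        ("bob", "kubernetes"): 30,
        ("bob", "vscode"): 40,
        ("carol", "vscode"): 5,
    }

    nodes = {n for pair in edges for n in pair}
    out_weight = defaultdict(float)
    for (src, _), w in edges.items():
        out_weight[src] += w

    DAMPING = 0.85
    rank = {n: 1.0 / len(nodes) for n in nodes}

    for _ in range(50):  # power iteration
        new_rank = {n: (1 - DAMPING) / len(nodes) for n in nodes}
        for (src, dst), w in edges.items():
            new_rank[dst] += DAMPING * rank[src] * w / out_weight[src]
        # Repos have no outgoing edges, so their rank mass "leaks";
        # redistribute it uniformly to keep the scores a distribution.
        leaked = 1.0 - sum(new_rank.values())
        new_rank = {n: r + leaked / len(nodes) for n, r in new_rank.items()}
        rank = new_rank

    repos = {dst for _, dst in edges}
    print(sorted(((r, n) for n, r in rank.items() if n in repos), reverse=True))

Scored this way, a repository’s value grows with the reputation of the accounts contributing to it rather than with raw activity counts, which would explain why the list below is not simply a star ranking.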

The result is as fascinating as it is eclectic, and it feels like a good reflection of our society’s interest in technology and where it is heading.

There are the big proprietary players with open source projects — Google, Apple, Microsoft, Facebook, and even Baidu. And at the same time, there’s a Chinese anti-censorship tool.

There’s Bitcoin for cryptocurrency.

There’s a particle detector for CERN’s Large Hadron Collider.

There are gaming projects like Space Station 13 and Cataclysm: Dark Days Ahead, and a game engine, Godot.

There are education projects like freeCodeCamp, Open edX, Oppia, and Code.org.

There are web and mobile app-building projects to publish your content with, like WordPress, Joomla, and Flutter.

There are databases to store your content for the web like Ceph and CockroachDB.

And there’s a search engine to navigate through the content — Elasticsearch.

There are also, perhaps unsurprisingly, jailbreak projects like Cydia compatibility manager for iOS and Nintendo 3DS custom firmware.

And there’s a smart home system — Home Assistant.

All in all, it’s really a great outlook for the technology world: we learn, we build things to broadcast our unique voices, we use crypto, we break free from proprietary software on our hardware, and in our spare time we game in our automated homes. And the big companies open-source their projects.

Before I proceed with the list: running the Octoverse through the reputation algorithm also produced a value score for every individual GitHub contributor. So, if you have a GitHub account and are curious, you can get your score at https://u.community/github and convert it into a Universal Portable Reputation.

Top 100 projects & repositories

Out of over 96 million repositories

  1. Google Kubernetes
    Container scheduling and management
    Repository: https://github.com/kubernetes/kubernetes
    Website: https://kubernetes.io/
  2. Apache Spark
    A unified analytics engine for large-scale data processing
    Repository: https://github.com/apache/spark
    Website: http://spark.apache.org/
  3. Microsoft Visual Studio Code
    A source-code editor
    Repository: https://github.com/Microsoft/vscode
    Website: https://code.visualstudio.com/
  4. NixOS Package Collection
    A collection of packages for the Nix package manager
    Repository: https://github.com/NixOS/nixpkgs
    Website: https://nixos.org
  5. Rust
    Programming language
    Repository: https://github.com/rust-lang/rust
    Website: https://www.rust-lang.org/
  6. FireHOL IP Lists
    Blacklists for FireHOL, a firewall builder
    Repository: https://github.com/firehol/blocklist-ipsets
    Website: https://iplists.firehol.org/
  7. Red Hat OpenShift
    A community distribution of Kubernetes optimized for continuous application development and multi-tenant deployment
    Repository: https://github.com/openshift/origin
    Website: https://www.openshift.com/
  8. Ansible
    A deployment automation platform
    Repository: https://github.com/ansible/ansible
    Website: https://www.ansible.com/
  9. Automattic WordPress Calypso
    A JavaScript and API powered front-end for WordPress.com
    Repository: https://github.com/Automattic/wp-calypso
    Website: https://developer.wordpress.com/calypso/
  10. Microsoft .NET CoreFX
    Foundational class libraries for .NET Core
    Repository: https://github.com/dotnet/corefx
    Website: https://docs.microsoft.com/en-us/dotnet/core/
  11. Microsoft .NET Roslyn
    .NET compiler
    Repository: https://github.com/dotnet/roslyn
    Website: https://docs.microsoft.com/en-us/dotnet/csharp/roslyn-sdk/
  12. Node.js
    A JavaScript runtime built on Chrome’s V8 JavaScript engine
    Repository: https://github.com/nodejs/node
    Website: https://nodejs.org/en/
  13. TensorFlow
    Google’s machine learning framework
    Repository: https://github.com/tensorflow/tensorflow
    Website: https://www.tensorflow.org/
  14. freeCodeCamp
    Code learning platform
    Repository: https://github.com/freeCodeCamp/freeCodeCamp
    Website: https://www.freecodecamp.org/
  15. Space Station 13
    A round-based roleplaying game
    Repository: https://github.com/tgstation/tgstation
    Website: https://www.tgstation13.org/
  16. Apple Swift
    Apple’s programming language
    Repository: https://github.com/apple/swift
    Website: https://swift.org/
  17. Elasticsearch
    A search engine
    Repository: https://github.com/elastic/elasticsearch
    Website: https://www.elastic.co/products/elasticsearch
  18. Moby
    An open framework to assemble specialized container systems
    Repository: https://github.com/moby/moby
    Website: https://mobyproject.org/
  19. CockroachDB
    A cloud-native SQL database
    Repository: https://github.com/cockroachdb/cockroach
    Website: https://www.cockroachlabs.com/
  20. Cydia Compatibility Checker
    A compatibility checker for Cydia — a package manager for jailbroken iOS devices
    Repository: https://github.com/jlippold/tweakCompatible
    Website: https://jlippold.github.io/tweakCompatible/
  21. Servo
    A web browser engine
    Repository: https://github.com/servo/servo
    Website: https://servo.org/
  22. Google Flutter
    Google’s mobile app SDK to create interfaces for iOS and Android
    Repository: https://github.com/flutter/flutter
    Website: https://flutter.dev/
  23. macOS Homebrew Package Manager
    Default formulae for the missing package manager for macOS
    Repository: https://github.com/homebrew/homebrew-core
    Website: https://brew.sh/
  24. Home Assistant
    Home automation software
    Repository: https://github.com/home-assistant/home-assistant
    Website: https://www.home-assistant.io/
  25. Microsoft .NET CoreCLR
    Runtime for .NET Core
    Repository: https://github.com/dotnet/coreclr
    Website: https://docs.microsoft.com/en-us/dotnet/core/
  26. CocoaPods Specifications
    Specifications for CocoaPods, a Cocoa dependency manager
    Repository: https://github.com/CocoaPods/Specs
    Website: https://cocoapods.org/
  27. Elastic Kibana
    An analytics and search dashboard for Elasticsearch
    Repository: https://github.com/elastic/kibana
    Website: https://www.elastic.co/products/kibana
  28. Julia Language
    A technical computing language
    Repository: https://github.com/JuliaLang/julia
    Website: https://julialang.org/
  29. Microsoft TypeScript
    A superset of JavaScript that compiles to plain JavaScript
    Repository: https://github.com/Microsoft/TypeScript
    Website: https://www.typescriptlang.org/
  30. Joomla
    A content management system
    Repository: https://github.com/joomla/joomla-cms
    Website: https://www.joomla.org/
  31. DefinitelyTyped
    A repository for TypeScript type definitions
    Repository: https://github.com/DefinitelyTyped/DefinitelyTyped
    Website: http://definitelytyped.org/
  32. Homebrew Cask
    A CLI workflow for the administration of macOS applications distributed as binaries
    Repository: https://github.com/Homebrew/homebrew-cask
    Website: https://brew.sh/
  33. Ceph
    A distributed object, block, and file storage platform
    Repository: https://github.com/ceph/ceph
    Website: https://ceph.com/
  34. Go
    Programming language
    Repository: https://github.com/golang/go
    Website: https://golang.org/
  35. AMP HTML Builder
    A way to build pages for Google AMP
    Repository: https://github.com/ampproject/amphtml
    Website: https://amp.dev/
  36. Open edX
    An online education platform
    Repository: https://github.com/edx/edx-platform
    Website: https://open.edx.org/
  37. Pandas
    A data analysis and manipulation library for Python
    Repository: https://github.com/pandas-dev/pandas
    Website: https://pandas.pydata.org/
  38. Istio
    A platform to manage microservices
    Repository: https://github.com/istio/istio
    Website: https://istio.io/
  39. ManageIQ
    A management platform for containers, virtual machines, networks, and storage
    Repository: https://github.com/ManageIQ/manageiq
    Website: http://manageiq.org/
  40. Godot Engine
    A multi-platform 2D and 3D game engine
    Repository: https://github.com/godotengine/godot
    Website: https://godotengine.org/
  41. Gentoo Repository Mirror
    A Gentoo ebuild repository mirror
    Repository: https://github.com/gentoo/gentoo
    Website: https://www.gentoo.org/
  42. Odoo
    A suite of web-based open source business apps
    Repository: https://github.com/odoo/odoo
    Website: https://www.odoo.com/
  43. Azure Documentation
    Documentation of Microsoft Azure
    Repository: https://github.com/MicrosoftDocs/azure-docs
    Website: https://docs.microsoft.com/azure
  44. Magento
    An eCommerce platform
    Repository: https://github.com/magento/magento2
    Website: https://magento.com/
  45. Saltstack
    Software to automate the management and configuration of any infrastructure or application at scale
    Repository: https://github.com/saltstack/salt
    Website: https://www.saltstack.com/
  46. AdGuard Filters
    Ad blocking filters for AdGuard
    Repository: https://github.com/AdguardTeam/AdguardFilters
    Website: https://adguard.com/en/welcome.html
  47. Symfony
    A PHP framework
    Repository: https://github.com/symfony/symfony
    Website: https://symfony.com/
  48. CMS Software for the Large Hadron Collider
    Particle detector software components for CERN’s Large Hadron Collider
    Repository: https://github.com/cms-sw/cmssw
    Website: http://cms-sw.github.io/
  49. Red Hat OpenShift
    OpenShift installation and configuration management
    Repository: https://github.com/openshift/openshift-ansible
    Website: https://www.openshift.com/
  50. ownCloud
    Personal cloud software
    Repository: https://github.com/owncloud/core
    Website: https://owncloud.org/
  51. gRPC
    A remote procedure call (RPC) framework
    Repository: https://github.com/grpc/grpc
    Website: https://grpc.io/
  52. Liferay
    An enterprise web platform
    Repository: https://github.com/brianchandotcom/liferay-portal
    Website: https://www.liferay.com/
  53. CommCare HQ
    A mobile data collection platform
    Repository: https://github.com/dimagi/commcare-hq
    Website: https://www.commcarehq.org/
  54. WordPress Gutenberg
    An editor plugin for WordPress
    Repository: https://github.com/WordPress/gutenberg
    Website: https://wordpress.org/gutenberg/
  55. PyTorch
    A Python package for Tensor computation and deep neural networks
    Repository: https://github.com/pytorch/pytorch
    Website: https://pytorch.org/
  56. Kubernetes Test Infrastructure
    A test-infra repository for Kubernetes
    Repository: https://github.com/kubernetes/test-infra
    Website: https://kubernetes.io/
  57. Keybase
    Keybase client repository
    Repository: https://github.com/keybase/client
    Website: https://keybase.io/
  58. Facebook React
    A JavaScript library for building user interfaces
    Repository: https://github.com/facebook/react
    Website: https://reactjs.org/
  59. Code.org
    Code learning resource
    Repository: https://github.com/code-dot-org/code-dot-org
    Website: https://code.org/
  60. Bitcoin Core
    Bitcoin client software
    Repository: https://github.com/bitcoin/bitcoin
    Website: https://bitcoincore.org/
  61. Arm Mbed OS
    A platform operating system for the Internet of Things
    Repository: https://github.com/ARMmbed/mbed-os
    Website: https://www.mbed.com
  62. scikit-learn
    A Python module for machine learning
    Repository: https://github.com/scikit-learn/scikit-learn
    Website: https://scikit-learn.org
  63. Nextcloud
    A self-hosted productivity platform
    Repository: https://github.com/nextcloud/server
    Website: https://nextcloud.com/
  64. Helm Charts
    A curated list of applications for Kubernetes
    Repository: https://github.com/helm/charts
    Website: https://kubernetes.io/
  65. Terraform
    An infrastructure management tool
    Repository: https://github.com/hashicorp/terraform
    Website: https://www.terraform.io/
  66. Ant Design
    A UI design language
    Repository: https://github.com/ant-design/ant-design
    Website: https://ant.design/
  67. Phalcon Framework Documentation
    Documentation for Phalcon, a PHP framework
    Repository: https://github.com/phalcon/docs
    Website: https://docs.phalconphp.com
  68. Documentation for CMS Software for the Large Hadron Collider
    Documentation for CMS Software for CERN’s Large Hadron Collider
    Repository: https://github.com/cms-sw/cms-sw.github.io
    Website: http://cms-sw.github.io/
  69. Apache Kafka Mirror
    A mirror for Apache Kafka, a distributed streaming platform
    Repository: https://github.com/apache/kafka
    Website: https://kafka.apache.org/
  70. Electron
    A framework to write cross-platform desktop applications using JavaScript, HTML and CSS
    Repository: https://github.com/electron/electron
    Website: https://electronjs.org/
  71. Zephyr Project
    A real-time operating system
    Repository: https://github.com/zephyrproject-rtos/zephyr
    Website: https://www.zephyrproject.org/
  72. The web-platform-tests Project
    A cross-browser testsuite for the Web-platform stack
    Repository: https://github.com/web-platform-tests/wpt
    Website: https://www.w3.org/
  73. Marlin Firmware
    Optimized firmware for RepRap 3D printers based on the Arduino platform
    Repository: https://github.com/MarlinFirmware/Marlin
    Website: http://marlinfw.org/
  74. Apache MXNet
    A library for deep learning
    Repository: https://github.com/apache/incubator-mxnet
    Website: https://mxnet.apache.org/
  75. Apache Beam
    A unified programming model
    Repository: https://github.com/apache/beam
    Website: https://beam.apache.org/
  76. Fastlane
    A build and release automation tool for iOS and Android apps
    Repository: https://github.com/fastlane/fastlane
    Website: https://fastlane.tools/
  77. Kubernetes Website and Documentation
    A repository for the Kubernetes website and documentation
    Repository: https://github.com/kubernetes/website
    Website: https://kubernetes.io
  78. Ruby on Rails
    A web-application framework
    Repository: https://github.com/rails/rails
    Website: https://rubyonrails.org/
  79. Zulip
    Team chat software
    Repository: https://github.com/zulip/zulip
    Website: https://zulipchat.com/
  80. Laravel
    A web application framework
    Repository: https://github.com/laravel/framework
    Website: https://laravel.com/
  81. Baidu PaddlePaddle
    Baidu’s deep learning framework
    Repository: https://github.com/PaddlePaddle/Paddle
    Website: http://www.paddlepaddle.org/
  82. Gatsby
    A web application framework
    Repository: https://github.com/gatsbyjs/gatsby
    Website: https://www.gatsbyjs.org/
  83. Rust Crate Registry
    Rust’s community package registry
    Repository: https://github.com/rust-lang/crates.io-index
    Website: https://crates.io/
  84. Nintendo 3DS Custom Firmware
    A complete guide to 3DS custom firmware
    Repository: https://github.com/hacks-guide/Guide_3DS
    Website: https://3ds.hacks.guide/
  85. TiDB
    A NewSQL database
    Repository: https://github.com/pingcap/tidb
    Website: https://pingcap.com
  86. Angular CLI
    CLI tool for Angular, a Google web application framework
    Repository: https://github.com/angular/angular-cli
    Website: https://cli.angular.io/
  87. MAPS.ME
    Offline OpenStreetMap maps for iOS and Android
    Repository: https://github.com/mapsme/omim
    Website: https://maps.me/
  88. Eclipse Che
    An Eclipse Foundation cloud IDE
    Repository: https://github.com/eclipse/che
    Website: http://www.eclipse.org/che/
  89. Brave Browser
    A browser with native BAT cryptocurrency
    Repository: https://github.com/brave/browser-laptop
    Website: https://www.brave.com/
  90. Patchwork
    A repository to learn Git
    Repository: https://github.com/jlord/patchwork
    Website: http://jlord.us/patchwork/
  91. Angular Material
    Component infrastructure and Material Design components for Angular, a Google web application framework
    Repository: https://github.com/angular/components
    Website: https://material.angular.io/
  92. Python
    Programming language
    Repository: https://github.com/python/cpython
    Website: https://www.python.org/
  93. Space Station 13
    A round-based roleplaying game
    Repository: https://github.com/vgstation-coders/vgstation13
    Website: http://ss13.moe/
  94. Cataclysm: Dark Days Ahead
    A turn-based survival game
    Repository: https://github.com/CleverRaven/Cataclysm-DDA
    Website: http://cataclysmdda.org/
  95. Material-UI
    React components that implement Google’s Material Design
    Repository: https://github.com/mui-org/material-ui
    Website: https://material-ui.com/
  96. Ionic
    A framework for developing Progressive Web Apps
    Repository: https://github.com/ionic-team/ionic
    Website: https://ionicframework.com/
  97. Oppia
    A tool for collaboratively building interactive lessons
    Repository: https://github.com/oppia/oppia
    Website: https://www.oppia.org
  98. Alluxio
    A virtual distributed storage system
    Repository: https://github.com/Alluxio/alluxio
    Website: https://www.alluxio.io/
  99. XX Net
    A Chinese web proxy and anti-censorship tool
    Repository: https://github.com/XX-net/XX-Net
    Website: None
  100. Microsoft .NET CLI
    A CLI tool for .NET
    Repository: https://github.com/dotnet/cli
    Website: https://docs.microsoft.com/en-us/dotnet/core/tools/

[1] The explanation of the calculation of the simplified version is at the U°OS Network GitHub repository.

Source : https://hackernoon.com/githubs-top-100-most-valuable-repositories-out-of-96-million-bb48caa9eb0b

 

Improving the Accuracy of Automatic Speech Recognition Models for Broadcast News – Appen

In their paper entitled English Broadcast News Speech Recognition by Humans and Machines, a team from IBM and Appen sets out to identify techniques that close the gap between automatic speech recognition (ASR) and human performance.

Where does the data come from?

IBM’s initial work in the voice recognition space was done as part of the U.S. government’s Defense Advanced Research Projects Agency (DARPA) Effective Affordable Reusable Speech-to-Text (EARS) program, which led to significant advances in speech recognition technology. The EARS program produced about 140 hours of supervised broadcast news (BN) training data and around 9,000 hours of very lightly supervised training data from television closed captions. By contrast, EARS produced around 2,000 hours of highly supervised, human-transcribed training data for conversational telephone speech (CTS).

Lost in translation?

Because so much training data is available for CTS, the team from IBM and Appen endeavored to apply similar speech recognition strategies to BN to see how well those techniques translate across applications. To understand the challenge the team faced, it helps to call out some key differences between the two speech styles:

Broadcast news (BN)

  • Clear, well-produced audio quality
  • Wide variety of speakers with different speaking styles
  • Varied background noise conditions — think of reporters in the field
  • Wide variety of news topics

Conversational telephone speech (CTS)

  • Often poor audio quality with sound artifacts
  • Unscripted
  • Interspersed with moments where speech overlaps between participants
  • Interruptions, sentence restarts, and background confirmations between participants, e.g., “okay”, “oh”, “yes”

How the team adapted speech recognition models from CTS to BN

The team adapted the speech recognition systems that had been used so successfully for the EARS CTS research: multiple long short-term memory (LSTM) and ResNet acoustic models trained on a range of acoustic features, along with word and character LSTMs and convolutional WaveNet-style language models. This strategy had produced word error rates between 5.1% and 9.9% for CTS in a previous study, specifically the HUB5 2000 English Evaluation conducted by the Linguistic Data Consortium (LDC). The team tested a simplified version of this approach on the BN data set, which wasn’t human-annotated, but rather created using closed captions.

Instead of adding all the available training data, the team carefully selected a reliable subset, then trained LSTM and residual network-based acoustic models with a combination of n-gram and neural network language models on that subset. In addition to automatic speech recognition testing, the team benchmarked the automatic system against an Appen-produced high-quality human transcription. The primary language model training text for all these models consisted of a total of 350 million words from different publicly available sources suitable for broadcast news.

Getting down to business

In the first set of experiments, the team separately tested the LSTM and ResNet models in conjunction with the n-gram and feed-forward neural network language models (FF-NNLM), and then combined scores from the two acoustic models, comparing against the results obtained on the older CTS evaluation. Unlike in the original CTS testing, no significant reduction in the word error rate (WER) was achieved after the scores from the LSTM and ResNet models were combined. The LSTM model with an n-gram LM performs quite well on its own, and its results further improve with the addition of the FF-NNLM.

For the second set of experiments, word lattices were generated after decoding with the LSTM+ResNet+n-gram+FF-NNLM model. The team generated n-best lists from these lattices and rescored them with the first LSTM language model (LSTM1-LM); a second one, LSTM2-LM, was also used to rescore the word lattices independently. Significant WER gains were observed after using the LSTM LMs, which led the researchers to hypothesize that secondary fine-tuning with BN-specific data is what allows LSTM2-LM to perform better than LSTM1-LM.
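The paper’s exact rescoring setup isn’t reproduced here, but the mechanics of n-best rescoring are standard: each hypothesis keeps its acoustic score, the new language model scores the word sequence, and the two are combined log-linearly with a word-count penalty before re-ranking. A minimal sketch, with the hypotheses, scores, and weights all invented:

    # Minimal n-best rescoring sketch. Hypotheses, scores, and weights are
    # invented; lm_score stands in for an LSTM LM's summed log-probability.
    import math

    def lm_score(words):
        # Placeholder unigram LM; a real system would query the LSTM LM here.
        unigram = {"the": 0.06, "cat": 0.002, "sat": 0.001, "mat": 0.001,
                   "on": 0.03, "hat": 0.0001}
        return sum(math.log(unigram.get(w, 1e-6)) for w in words)

    def rescore(nbest, lm_weight=0.7, word_penalty=-0.5):
        """nbest: list of (hypothesis_words, acoustic_log_score) pairs."""
        scored = []
        for words, am in nbest:
            total = am + lm_weight * lm_score(words) + word_penalty * len(words)
            scored.append((total, words))
        return max(scored)  # highest combined log-linear score wins

    nbest = [
        (["the", "cat", "sat", "on", "the", "mat"], -42.0),
        (["the", "cat", "sat", "on", "the", "hat"], -41.5),
    ]
    print(rescore(nbest))  # the LM pulls "mat" ahead of its stronger acoustic rival

Lattice rescoring applies the same idea directly to the lattice arcs rather than to a flat n-best list.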

The results

Our ASR results have clearly improved state-of-the-art performance, and significant progress has been made compared to systems developed over the last decade. When compared to the human performance results, the absolute ASR WER is about 3 percentage points worse. Although the machine and human error rates are comparable, the ASR system has much higher substitution and deletion error rates.
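For context, WER comes from a word-level edit-distance alignment between the reference and the hypothesis, and the same alignment yields the substitution, deletion, and insertion counts discussed below. A minimal sketch, illustrative only; formal evaluations use NIST-style scoring tools such as sclite:

    # Minimal WER sketch: Levenshtein alignment over words, tracking
    # substitution/deletion/insertion counts along the cheapest path.
    def wer(reference, hypothesis):
        ref, hyp = reference.split(), hypothesis.split()
        # d[i][j] = (cost, subs, dels, ins) for aligning ref[:i] with hyp[:j]
        d = [[(0, 0, 0, 0)] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(1, len(ref) + 1):
            d[i][0] = (i, 0, i, 0)             # delete all of ref[:i]
        for j in range(1, len(hyp) + 1):
            d[0][j] = (j, 0, 0, j)             # insert all of hyp[:j]
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                if ref[i - 1] == hyp[j - 1]:
                    d[i][j] = d[i - 1][j - 1]  # match, no edit
                    continue
                sub, dele, ins = d[i - 1][j - 1], d[i - 1][j], d[i][j - 1]
                best = min(sub, dele, ins)
                if best is sub:
                    d[i][j] = (best[0] + 1, best[1] + 1, best[2], best[3])
                elif best is dele:
                    d[i][j] = (best[0] + 1, best[1], best[2] + 1, best[3])
                else:
                    d[i][j] = (best[0] + 1, best[1], best[2], best[3] + 1)
        cost, subs, dels, ins = d[len(ref)][len(hyp)]
        return cost / len(ref), subs, dels, ins

    # Two edits over six reference words -> WER of about 0.33
    print(wer("the cat sat on the mat", "the cat sat on mat hat"))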

Looking at the different error types and rates, the research produced interesting takeaways:

  • There’s a significant overlap in the words that ASR and humans delete, substitute, and insert.
  • Humans seem to be careful about marking hesitations: %hesitation was the most-inserted symbol in these experiments. Hesitations seem to be important in conveying the meaning of sentences in human transcriptions. The ASR systems, however, focus on blind recognition and were not successful in conveying that same meaning.
  • Machines have trouble recognizing short function words: “the”, “and”, “of”, “a”, and “that” get deleted the most. Humans, on the other hand, seem to catch most of them. It seems likely that these words aren’t fully articulated, so the machine fails to recognize them, while humans are able to infer them naturally.

Conclusion

The experiments show that ASR techniques can be transferred across domains to provide highly accurate transcriptions. For both acoustic and language modeling, the LSTM- and ResNet-based models proved effective, and the human evaluation experiments kept us honest. That said, while our methods keep improving, there is still a gap to close between human and machine performance, demonstrating a continued need for research on automatic transcription for broadcast news.

Source : https://appen.com/blog/improving-the-accuracy-of-automatic-speech-recognition-models-for-broadcast-news/

 

Consulting or con-$ulting – Hackernoon

The article by The Register about Hertz suing Accenture over their failed website revamp deal has gained a lot of attention on social media, sparking discussion around failed software projects and IT consulting giants such as Accenture.

The saddest thing about the article is that the part about Accenture completely fumbling a huge website project doesn’t surprise me one bit: I stumble upon articles about large enterprise IT projects failing and going well over budget on a weekly basis. What is more striking is that Hertz is suing Accenture, and going public with it. This tells us something about the state of the IT consulting business, and you don’t have to be an expert to tell that there is a huge flaw somewhere in the process of how large software projects are sold by consultancies, and especially how they are purchased and handled by their clients.

Just by reading the article, one might think the faults were made completely on Accenture’s side, but there is definitely more to it. Hertz, too, has clearly made a lot of mistakes during crucial phases of the project: in purchasing, service design, and development. I’ll try to bite into the most critical and prominent flaws.

If we dig into the actual lawsuit document we start getting a better picture of what actually went down, and what led to tens of millions of dollars going down the drain on a service that is unusable.

Siloed service design & abandoning ownership

Reading through points 2 and 3 of the legal complaint, we get a small glimpse into the initial service design process:

2. Hertz spent months planning the project. It assessed the current state of its ecommerce activities, defined the goals and strategy for its digital business, and developed a roadmap that would allow Hertz to realize its vision.

3. Hertz did not have the internal expertise or resources to execute such a massive undertaking; it needed to partner with a world-class technology services firm. After considering proposals from several top-tier candidates, Hertz narrowed the field of vendors to Accenture and one other.

Hertz first “planned the project, defined the goals and strategy and developed the roadmap”. Then, after realising they “don’t have the internal expertise or resources”, they started looking for a vendor who would be able to carry out their vision.

This was the first large mistake. If the initial plan, goals, and vision are formed before the vendor (the party responsible for realising that vision) is involved, you will most likely end up in a ‘broken telephone’ situation where the vision and goals are not properly transferred from the initial planners and designers to the implementers.

This is a very dangerous starting situation. What makes it even worse is this:

6. Hertz relied on Accenture’s claimed expertise in implementing such a digital transformation. Accenture served as the overall project manager. Accenture gathered Hertz’s requirements and then developed a design to implement those requirements. Accenture served as the product owner, and Accenture, not Hertz, decided whether the design met Hertz’s requirements.

Hertz made Accenture the product owner, thus relinquishing ownership of the service to Accenture. This, if anything, tells us that Hertz did not have the required expertise and maturity to undertake this project in the first place. Making a consulting company, a company which has no deep insight into your specific domain, business, and needs, the owner and main visionary of your service is usually not a good idea. Especially when you consider that it might not be in the consulting company’s interest to finish the project within the initial budget, but rather to extend the project to generate more sales and revenue.

Having the vendor as a product owner is not a rare occurrence, and it can sometimes work if the vendor has deep enough knowledge of the client’s organisation, business, and domain. However, when working on such a large project for a huge organisation like Hertz, it’s impossible for the consulting company to have the necessary insight into and experience with Hertz’s business.

Lack of transparency & communication

Moving on to the development phase of the project:

7. Accenture committed to delivering an updated, redesigned, and re-engineered website and mobile apps that were ready to “go-live” by December 2017.

8. Accenture began working on the execution phase of the project in August 2016 and it continued to work until its services were terminated in May 2018. During that time, Hertz paid Accenture more than $32 million in fees and expenses. Accenture never delivered a functional website or mobile app. Because Accenture failed to properly manage and perform the services, the go-live date was postponed twice, first until January 2018, and then until April 2018. By that point, Hertz no longer had any confidence that Accenture was capable of completing the project, and Hertz terminated Accenture.

Hertz finally lost confidence in Accenture roughly five months after the initially planned go-live date, and seemingly more than a full year after kicking off the project partnership with them.

If it took Hertz around 1½ years to realise that Accenture couldn’t deliver, it’s safe to say that Hertz and Accenture were both working in their own silos with minimal transparency into each other’s work, and critical information was not moving between the organisations. My best guess is that Hertz and Accenture met only once in a while to assess the status of the project and share updates. But a software project like this should be an ongoing collaborative process, with constant daily discussion between the parties. In a well-functioning project, the client and vendor are one united team pushing the product out together.

The lack of communication infrastructure is a common problem in large-scale software projects between a company and its vendor. It’s hard to say whose responsibility it should be to organise the required tools, processes, meetings, and environments to make sure that the necessary discussions are being had and that knowledge is shared. But often the consulting company is the one with a more modern take on communication, and it can provide the framework and tools much more easily.

We get a deeper glimpse into the lack of transparency, especially on the technical side, when we go through points 36–42 of the legal complaint, e.g., number 40:

40. Accenture’s Java code did not follow the Java standard, displayed poor logic, and was poorly written and difficult to maintain.

Right. Accenture’s code quality and technical competence were not at a satisfactory level, and that is on Accenture, as they were hired to be the technical experts on the project. But if Hertz had had even one technical person working on the project with visibility into the codebase, they could have caught this problem right from the first commit, instead of noticing it after over a year of Accenture delivering bad-quality code. If you are buying software for tens of millions, you must have an in-house technical expert as part of the software development process, even if only as a spectator.

The lack of transparency and technical expertise, combined with the lack of ownership and responsibility, is ultimately why Hertz managed to blow tens of millions of dollars instead of just a couple. If Hertz had had the technical know-how and been more deeply involved in the work, they could have assessed early on that the way Accenture was doing things was flawed. Perhaps some people at Hertz saw that the situation was bad early on, but since ownership of the product was on Accenture’s side, it must have been hard for those people to speak up as they saw the issues. This resulted in Accenture being allowed to do unsuitable work for over a year, until the initial ‘go-live’ date was long past and it was already too late.

And finally… Crony contracts & short-term thinking

There have been rumours of Hertz leadership firing its entire well-performing in-house software development team, replacing it with an offshore workforce from IBM, and making crony ‘golf course’ deals with Accenture in 2016, with the Hertz CIO securing a $7 million bonus for the short-term ‘savings’ made by those changes. I’d recommend taking these Hacker News comments with a grain of salt, but I wouldn’t be at all surprised if the allegations were more or less true.

These kinds of crony contracts are a huge problem in the enterprise software industry in general, and the news we see about them is only the tip of the iceberg. But that is a subject for a whole other blog post.

To wrap it up

It’s important to keep in mind that the lawsuit text doesn’t really tell us the whole truth: a lot of things must have happened during those years that we will never know of. However, it’s quite clear that some common mistakes that constantly happen in consulting projects happened here too, and that the ball was dropped by both parties involved.

It’s going to be interesting to see how the lawsuit plays out, as it will work as a real-life example to both consulting companies and their clients on what could happen when their expensive software projects go south.

For a company which is considering buying software, the most important learnings to take out of this mess are:

  • Before buying software, make sure your organisation is ready for it and the required expertise is there.
  • Include the vendor from the very beginning in the planning, goal defining and service design process. Make sure you and the vendor are working as a unified team with a shared goal.
  • Make sure that the contracts are well thought out and prepare the business for worst-case scenarios.
  • Maintain the ownership of the project in your own hands, unless you are absolutely sure that the vendor has deep enough knowledge of your organisation and its business & domain.
  • Make sure the necessary communication & transparency is present both ways. The communication between you and the vendor should be constant, natural, open and wide. Include all people involved in the project, not just the managers. You must have full transparency into and understanding of the vendor’s development process.

Also, one thing to note is that many companies that have had bad experiences with large enterprise consultancies have turned to smaller, truly agile software consultancies instead of giants like Accenture. Smaller companies are better at taking responsibility for their work, and they have the required motivation to actually deliver quality, as they appreciate the chance to tackle a large project. For a small company, the impact of delivering a project well and keeping the client happy is much greater than it is for an already well-established giant.

Hopefully by learning from history and the mistakes of others, we can avoid going through the hell that the people at Hertz had to!

Source :

 

 

Geothermal Making Inroads as Baseload Power

It’s energy that has been around forever, used for years as a heating source across the world, particularly in areas with volcanic activity. Today, geothermal has surfaced as another renewable resource, with advancements in drilling technology bringing down costs and opening new areas to development.

Renewable energy continues to increase its share of the world’s power generation. Solar and wind power receive most of the headlines, but another option is increasingly being recognized as an important carbon-free resource.

Geothermal, accessing heat from the earth, is considered a sustainable and environmentally friendly source of renewable energy. In some parts of the world, the heat that can be used for geothermal is easily accessible, while in other areas, access is more challenging. Areas with volcanic activity, such as Hawaii—where the recently restarted Puna Geothermal Venture supplies about 30% of the electricity demand on the island of Hawaii—are well-suited to geothermal systems.

“What we need to do as a renewable energy industry is appreciate that we need all sources of renewable power to be successful and that intermittent sources of power need the baseload sources to get to a 100% renewable portfolio,” Will Pettitt, executive director of the Geothermal Resources Council (GRC), told POWER. “Geothermal therefore needs to be collaborating with the solar, wind, and biofuel industries to make this happen.”

1. The Nesjavellir Geothermal Power Station is located near the Hengill volcano in Iceland. The 120-MW plant contributes to the country’s 750 MW of installed geothermal generation capacity. Courtesy: Gretar Ívarsson

The U.S. Department of Energy (DOE) says the U.S. leads the world in geothermal generation capacity, with about 3.8 GW. Indonesia is next at about 2 GW, with the Philippines at about 1.9 GW. Turkey and New Zealand round out the top five, followed by Mexico, Italy, Iceland (Figure 1), Kenya, and Japan.

Research and Development

Cost savings compared to other technologies are part of geothermal’s allure. The DOE is funding research into clean energy options, including up to $84 million in its 2019 budget to advance geothermal energy development.

 

2. This graphic produced by AltaRock Energy, a geothermal development and management company, shows the energy-per-well equivalent for shale gas, conventional geothermal, an enhanced geothermal system (EGS) well, and a “super hot” EGS well. Courtesy: AltaRock Energy / National Renewable Energy Laboratory

Introspective Systems, a Portland, Maine-based company that develops distributed grid management software, in February received a Small Business Innovation Research award from the DOE in support of the agency’s Enhanced Geothermal Systems (EGS) project. At EGS (Figure 2) sites, a fracture network is developed, and water is pumped into hot rock formations thousands of feet below the earth’s surface. The heated water is then recovered to drive conventional steam turbines. Introspective Systems is developing monitoring software that enables EGS systems to be cost-competitive.
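For a sense of scale, the electric output of such a plant is roughly the geofluid mass flow times water’s specific heat times the downhole temperature rise, times a heat-to-electricity conversion efficiency. A back-of-the-envelope sketch, where every number is an illustrative assumption rather than a figure from the article:

    # Back-of-the-envelope EGS output estimate. All numbers are invented
    # assumptions for illustration, not figures from the article.
    flow_kg_s = 50.0      # geofluid mass flow through the reservoir (kg/s)
    delta_t_k = 150.0     # temperature rise of the water downhole (K)
    cp_kj_kg_k = 4.18     # specific heat of water (kJ/kg-K)
    efficiency = 0.12     # heat-to-electricity conversion for low-grade heat

    thermal_mw = flow_kg_s * cp_kj_kg_k * delta_t_k / 1000.0
    electric_mw = thermal_mw * efficiency
    print(f"{thermal_mw:.1f} MWth -> {electric_mw:.1f} MWe")  # ~31 MWth -> ~3.8 MWe

The low conversion efficiency for low-grade heat is part of why EGS economics hinge on drilling costs and flow rates: most of the heat brought to the surface never becomes electricity.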

Kay Aikin, Introspective Systems’ CEO, was among business leaders selected by the Clean Energy Business Network (CEBN)—a group of more than 3,000 business leaders from all 50 states working in the clean energy economy—to participate in meetings with members of Congress in March to discuss the need to protect and grow federal funding for the DOE and clean energy innovation overall.

Aikin told POWER that EGS technology is designed to overcome the problem of solids coming “out of the liquids and filling up all the pores,” or cracks in rock through which heated water could flow. Introspective Systems’ software uses “algorithms to find the sites [suitable for a geothermal system]. We can track those cracks and pores, and that is what we are proposing to do.”


“In my view there are three technology pieces that need to come together for EGS to be successful,” said the GRC’s Pettitt. “Creating and maintaining the reservoir so as to ensure sufficient permeability without short-circuiting; bringing costs down on well drilling and construction; [and] high-temperature downhole equipment for zonal isolation and measurements. These technologies all have a lot of crossover opportunities to helping conventional geothermal be more efficient.”

Aikin noted a Massachusetts Institute of Technology report on geothermal [The Future of Geothermal Energy: Impact of Enhanced Geothermal Systems (EGS) on the United States in the 21st Century] “that was the basis for this funding from DOE,” she said. Aikin said current goals for geothermal would “offset about 6.1% of CO2 emissions, about a quarter of the Paris climate pledge. Because it’s base[load] power, it will offset coal and natural gas. We’re talking about roughly 1,500 new geothermal plants by 2050, and they can be sited almost anywhere.”

NREL Takes Prominent Role

Kate Young, manager of the geothermal program at the National Renewable Energy Laboratory (NREL) in Golden, Colorado, talked to POWER about the biggest things that the industry is focusing on. “DOE has been working with the national labs the past several years to develop the GeoVision study, that is now in the final stages of approval,” she said.

The GeoVision study explores potential geothermal growth scenarios across multiple market sectors for 2020, 2030, and 2050. NREL’s research focuses on things such as:

    ■ Geothermal resource potential – hydrothermal, coproduction, and near-field and greenfield enhanced geothermal systems.
    ■ Techno-economic characteristics – the costs and technical issues of advanced technologies and potential future impacts and calculating geothermal capacity.
    ■ Market penetration – modeling of dozens of scenarios, including multiple reference scenarios.
    ■ Non-technical barriers – factors that create delays, increase risk, or increase the cost of project development.

The study started with analyses spearheaded by several DOE labs in areas such as exploration; reservoir development and management; non-technical barriers; hybrid systems; and thermal applications (see sidebar). NREL then synthesized the analyses from the labs in market deployment models for the electricity and heating/cooling sectors.

Geothermal Is Big Business in Boise

The first U.S. geothermal district heating system began operating in 1892 in Boise, Idaho. The city still relies on geothermal, with the largest system of its kind in the U.S., and the sixth-largest worldwide, according to city officials. The current system, which began operating in 1983, heats 6 million square feet of real estate—about a third of the city’s downtown (Figure 3)—in the winter. The city last year got the go-ahead from the state Department of Water Resources to increase the amount of water it uses, and Public Works Director Steve Burgos told POWER the city wants to connect more downtown buildings to the system.

3. This plaque, designed by artist Ward Hooper, adorns buildings across downtown Boise, Idaho, denoting properties that use geothermal energy. Courtesy: City of Boise

Burgos said it costs the city about $1,000 to pump the water out of the ground and into the system on a monthly basis, and about another $1,000 for the electricity used to inject the water back into the aquifer. Burgos said the water “comes out at 177 degrees,” and the city is able to re-use the water in lower-temperature (110 degrees) scenarios, such as at laundry facilities. The city’s annual revenue from the system is $650,000 to $750,000.

“We have approximately 95 buildings using the geothermal system,” said Burgos. “About 2% of the city’s energy use is supplied by geothermal. We’re very proud of it. It’s a source of civic pride. Most of the buildings that are hooked up use geothermal for heating. Some of the buildings use geothermal for snow melt. There’s no outward sign of the system, there’s no steam coming out of the ground.”

Colin Hickman, the city’s communication manager for public works, told POWER that Boise “has a downtown YMCA, that has a huge swimming pool, that is heated by geothermal.” He and Burgos both said the system is an integral part of the city’s development.

“We’re currently looking at a strategic master plan for the geothermal,” Burgos said. “We definitely want to expand the system. Going into suburban areas is challenging, so we’re focusing on the downtown core.” Burgos said the city about a decade ago put in an injection well to help stabilize the aquifer. Hickman noted the city last year received a 25% increase in its water rights.

Boise State University (BSU) has used the system since 2013 to heat several of its buildings, and the school’s curriculum includes the study of geothermal physics. The system at BSU was expanded about a year and a half ago—it’s currently used in 11 buildings—and another campus building currently under construction also will use geothermal.

Boise officials tout the city’s Central Addition project, part of its LIV District initiative (Lasting Environments, Innovative Enterprises and Vibrant Communities). Among the LIV District’s goals is to “integrate renewable and clean geothermal energy” as part of the area’s sustainable infrastructure.

“This is part of a broader energy program for the city,” Burgos said, “as the city is looking at a 100% renewable goal, which would call for an expansion of the geothermal energy program.” Burgos noted that Idaho Power, the state’s prominent utility, has a goal of 100% clean energy by 2045.

As Boise grows, Burgos and Hickman said the geothermal system will continue to play a prominent role.

“We actively go out and talk about it when we know a new business is coming in,” Burgos said. “And as building ownership starts to change hands, we want to have a relationship with those folks.”

Said Hickman: “It’s one of the things we like as a selling point” for the city.

Young told POWER: “The GeoVision study looked at different pathways to reduce the cost of geothermal and at ways we can expand access to geothermal resources so that it can be a 50-state technology, not limited to the West. When the study is released, it will be a helpful tool in showing the potential for geothermal in the U.S.”

Young said of the DOE: “Their next big initiative is to enable EGS, using the FORGE site,” referring to the Frontier Observatory for Research in Geothermal Energy, a location “where scientists and engineers will be able to develop, test, and accelerate breakthroughs in EGS technologies and techniques,” according to DOE. The agency last year said the University of Utah “will receive up to $140 million in continued funding over the next five years for cutting-edge geothermal research and development” at a site near Milford, Utah, which will serve as a field laboratory.

“The amount of R&D money that’s been invested in geothermal relative to other technologies has been small,” Young said, “and consequently, the R&D improvement has been proportionally less than for other technologies. The potential, however, for geothermal technology and cost improvement is significant; investment in geothermal could bring down costs and help to make it a 50-state technology, which could have a positive impact on the U.S. energy industry.”

For those who question whether geothermal would work in some areas, Young counters: “The temperatures are lower in the Eastern U.S., but the reality is, there’s heat underground everywhere. The core of the earth is as hot as the surface of the sun, but a lot closer. DOE is working to be able to access that heat from anywhere – at low cost.”

Investors Stepping Up

Geothermal installations are often found at tectonic plate boundaries, or at places where the Earth’s crust is thin enough to let heat through. The Pacific Rim, known as the Ring of Fire for its many volcanoes, has several of these places, including in California, Oregon, and Alaska, as well as northern Nevada.

Geothermal’s potential has not gone unnoticed. Some of the world’s wealthiest people, including Microsoft founder Bill Gates, Amazon founder and CEO Jeff Bezos, and Alibaba co-founder Jack Ma, are backing Breakthrough Energy Ventures, a firm that invests in companies developing decarbonization technologies. Breakthrough recently invested $12.5 million in Baseload Capital, a geothermal project development company that provides funding for geothermal power plants using technology developed by Climeon, its Swedish parent company.

Climeon was founded in 2011; it formed Baseload Capital in 2018. The two focus on geothermal, shipping, and heavy industry, in the latter two sectors turning waste heat into electricity. Climeon’s geothermal modules are scalable, and available for both new and existing geothermal systems. Climeon in March said it had an order backlog of about $88 million for its modules.

“We believe that a baseload resource such as low-temperature geothermal heat power has the potential to transform the energy landscape. Baseload Capital, together with Climeon’s innovative technology, has the potential to deliver [greenhouse gas-free] electricity at large scale, economically and efficiently,” Carmichael Roberts of Breakthrough Energy Ventures said in a statement.

Climeon says its modules reduce the need for drilling new wells and enable the reuse of older wells, along with speeding the development time of projects. The company says the compact and modular design is scalable from 150-kW modules up to 50-MW systems. Climeon says it can be connected to any heat source, and has just three moving parts in each module: two pumps, and a turbine.

4. The Sonoma Plant operated by Calpine is one of more than 20 geothermal power plants sited at The Geysers, the world’s largest geothermal field, located in Northern California.  Courtesy: Creative Commons / Stepheng3

Breakthrough Energy’s investment in Baseload Capital is its second into geothermal energy. Breakthrough last year backed Fervo Energy, a San Francisco, California-based company that says its technology can produce geothermal energy at a cost of 5¢/kWh to 7¢/kWh. Fervo CEO and co-founder Tim Latimer said the money from Breakthrough would be used for field testing of EGS installations. Fervo’s other co-founder, Jack Norbeck, was a reservoir engineer at The Geysers in California (Figure 4), the world’s largest geothermal field, located north of Santa Rosa and just south of the Mendocino National Forest.

Most of the nearly two dozen geothermal plants at The Geysers are owned and operated by Calpine, though not all are operating. The California Energy Commission says there are more than 40 operating geothermal plants in the state, with installed capacity of about 2,700 MW.

Geothermal “is something we have to do,” said Aikin of Introspective Systems. “We have to find new baseload power. Our distribution technology can get part of the way there, toward 80% renewables, but we need base power. [Geothermal] is a really good ‘all of the above’ direction to go in.”

Source : https://www.powermag.com/bringing-the-heat-geothermal-making-inroads-as-baseload-power/?printmode=1

 

Making Simulation Accessible to the Masses – American Composites Manufacturers Association

Composites simulation tools aren’t just for mega corporations. Small and mid-sized companies can reap their benefits, too.

In 2015, Solvay Composite Materials began using simulation tools from MultiMechanics to simplify testing of materials used in high-performance applications. The global business unit of Solvay recognized the benefits of conducting computer-simulated tests to accurately predict the behavior of advanced materials, such as resistance to extreme temperatures and loads. Two years later, Solvay invested $1.9 million in MultiMechanics to expedite development of the Omaha, Neb.-based startup company’s material simulation software platform, which Solvay predicts could reduce the time and cost of developing new materials by 40 percent.

Commitment to – and investment in – composites simulation tools isn’t unusual for a large company like Solvay, which recorded net sales of €10.3 billion (approximately $11.6 billion) in 2018 and has 27,000 employees working at 125 sites throughout 62 countries. What may be more surprising is the impact composites simulation can have on small to mid-sized companies. “Simulation tools are for everyone,” asserts Flavio Souza, Ph.D., president and chief technology officer of MultiMechanics.

The team at Guerrilla Gravity would agree. The 7-year-old mountain bike manufacturer in Denver began using simulation software from Altair more than a year ago to develop a new frame technology made from thermoplastic resins and carbon fiber. “We were the first ones to figure out how to create a hollow structural unit with a complex geometry out of thermoplastic materials,” says Will Montague, president of Guerrilla Gravity.

That probably wouldn’t have been possible without composites simulation tools, says Ben Bosworth, director of composites engineering at Guerrilla Gravity. Using topology optimization, which essentially finds the ideal distribution of material based on goals and constraints, the company was able to maximize use of its materials and conduct testing with confidence that the new materials would pass on the first try. (They did.) Afterward, the company was able to design its product for a specific manufacturing process – automated fiber placement.
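To make “goals and constraints” concrete, below is a toy topology-optimization sketch, unrelated to Altair’s actual solver: it distributes a fixed material budget along a discretized cantilever so that tip deflection under a point load is minimized, with a SIMP-style penalty nudging densities toward 0 or 1. The optimizer predictably piles material near the root, where the bending moment is highest. It assumes NumPy and SciPy:

    # Toy topology optimization: distribute a fixed volume of material along
    # a cantilever to minimize tip deflection under a tip load. Illustrative
    # only; real tools solve full 2D/3D finite element problems.
    import numpy as np
    from scipy.optimize import minimize

    N = 20                            # beam segments, root at x = 0
    L, F, E, I0 = 1.0, 1.0, 1.0, 1.0  # length, tip load, modulus, solid inertia
    VOLFRAC = 0.5                     # material budget: half of solid volume
    P = 3.0                           # SIMP penalty, pushes rho toward 0 or 1
    x_mid = (np.arange(N) + 0.5) * (L / N)

    def tip_deflection(rho):
        # Unit-load method: delta = sum of F*(L-x)^2*dx / (E*I(x)),
        # with SIMP interpolation I(x) = I0 * rho(x)**P.
        return np.sum(F * (L - x_mid) ** 2 * (L / N) / (E * I0 * rho ** P))

    res = minimize(
        tip_deflection,
        x0=np.full(N, VOLFRAC),       # start from uniform material
        method="SLSQP",
        bounds=[(0.01, 1.0)] * N,
        constraints={"type": "eq", "fun": lambda rho: rho.sum() - VOLFRAC * N},
    )
    print(np.round(res.x, 2))         # densities taper from root to tip

A real workflow would then feed the optimized layout back through manufacturing constraints such as ply drop rules and fiber angles, which is where composites-specific tools earn their keep.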

“There is a pretty high chance that if we didn’t utilize composites simulation software, we would have been far behind schedule on our initial target launch date,” says Bosworth. Guerrilla Gravity introduced its new frame, which can be used on all four of its full-suspension mountain bike models, on Jan. 31, 2019.

The Language of Innovation
There are dozens of simulation solutions, some geared specifically to the composites industry and others general finite element analysis (FEA) tools. But they all share the common end goal of helping companies bring pioneering products to market faster – whether those companies are Fortune 500 corporations or startup entrepreneurships.

“Composites simulation is going to be the language of innovation,” says R. Byron Pipes, executive director of the Composites Manufacturing & Simulation Center at Purdue University. “Without it, a company’s ability to innovate in the composites field is going to be quite restricted.”

Those innovations can be at the material level or within end-product applications. “If you really want to improve the micromechanics of your materials, you can use simulation to tweak the properties of the fibers, the resin, the combination of the two or even the coating of fibers,” says Souza. “For those who build parts, simulation can help you innovate in terms of the shape of the part and the manufacturing process.”

One of the biggest advantages that design simulation has over the traditional engineering approach is time, says Jeff Wollschlager, senior director of composites technology at Altair. He calls conventional engineering the “build and bust” method, where companies make samples, then break them to test their viability. It’s a safe method, producing solid – although often conservative – designs. “But the downside of traditional approaches is they take a lot more time and many more dollars,” says Wollschlager. “And everything in this world is about time and money.”

In addition, simulation tools allow companies to know more about the materials they use and the products they make, which in turn facilitates the manufacturing of more robust products. “You have to augment your understanding of your product with something else,” says Wollschlager. “And that something else is simulation.”

A Leap Forward in Manufacturability
Four years ago, Montague and Matt Giaraffa, co-founder and chief engineer of Guerrilla Gravity, opted to pursue carbon fiber materials to make their bike frames lighter and sturdier. “We wanted to fundamentally improve on what was out there in the market. That required rethinking and analyzing not only the material, but how the frames are made,” says Montague.

The company also was committed to manufacturing its products in the United States. “To produce the frames in-house, we had to make a big leap forward in manufacturability of the frames,” says Montague. “And thermoplastics allow for that.” Once Montague and Giaraffa selected the material, they had to figure out exactly how to make the frames. That’s when Bosworth – and composites simulation – entered the picture.

Bosworth has more than a decade of experience with simulation software, beginning as an undergraduate mechanical engineering student on his college’s Formula SAE® team designing, building, and testing a vehicle for competition. While creating the new frame for Guerrilla Gravity, he used Altair’s simulation tools extensively, beginning with early development to prove the material’s feasibility for the application.

“We had a lot of baseline data from our previous aluminum frames, so we had a really good idea about how strong the frames needed to be and what performance characteristics we wanted,” says Bosworth. “Once we introduced the thermoplastic carbon fiber, we were able to take advantage of the software and use it to its fullest potential.” He began with simple tensile test samples and matched those with physical tests. Next, he developed tube samples using the software and again matched those to physical tests.
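
As a hedged illustration of that matching step, the snippet below fits an elastic modulus to made-up tensile coupon data and reports how far the fitted model strays from the “test” points. The numbers are invented; a real workflow calibrates a full composite material card against many coupon tests, not a single modulus.

```python
# Fitting an elastic modulus to (made-up) tensile coupon data, the kind
# of simulation-vs-physical-test matching described above. Strain is
# dimensionless, stress in MPa; all values are invented for illustration.
import numpy as np

strain = np.array([0.000, 0.001, 0.002, 0.003, 0.004, 0.005])
stress = np.array([0.0,   58.1,  117.9, 176.2, 233.8, 291.0])  # MPa

# Least-squares slope through the origin: E = sum(s*e) / sum(e^2)
E_fit = (strain @ stress) / (strain @ strain)   # ~58 GPa, in MPa units
print(f"fitted modulus: {E_fit/1e3:.1f} GPa")

# Judge the fitted model the way a simulation is judged against the
# physical test: worst relative deviation over the measured points.
pred = E_fit * strain
err = np.max(np.abs(pred[1:] - stress[1:]) / stress[1:])
print(f"max deviation from test data: {100*err:.1f} %")
```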

“It wasn’t until I was much further down the rabbit hole that I actually started developing the frame model,” says Bosworth. Even then, he started small, first developing a computer model for the front triangle of the bike frame, then adding in the rear triangle. Afterward, he integrated the boundary conditions and the load cases and began doing the optimization.

“You need to start simple, get all the fundamentals down and make sure the models are working in the way you intend them to,” says Bosworth. “Then you can get more advanced and grow your understanding.” At the composite optimization stage, Bosworth was able to develop a high-performing laminate schedule for production and design for automated fiber placement.
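
As a rough sketch of what a laminate schedule feeds into, the following computes the classical-laminate-theory A, B and D stiffness matrices for a hypothetical symmetric stacking sequence. The ply properties and layup are assumptions chosen for illustration, not Guerrilla Gravity’s design.

```python
# Classical laminate theory: build the A/B/D stiffness matrices for a
# candidate laminate schedule. Ply properties and stacking sequence are
# assumed typical CFRP values, purely for illustration.
import numpy as np

E1, E2, G12, nu12 = 135e3, 10e3, 5e3, 0.30   # MPa
t_ply = 0.15                                  # ply thickness, mm
stack = [0, 45, -45, 90, 90, -45, 45, 0]      # symmetric layup, degrees

nu21 = nu12 * E2 / E1
den = 1 - nu12 * nu21
Q = np.array([[E1/den,      nu12*E2/den, 0  ],
              [nu12*E2/den, E2/den,      0  ],
              [0,           0,           G12]])

def Qbar(theta_deg):
    """Rotate the ply stiffness matrix into laminate axes."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    T = np.array([[ c*c, s*s,  2*c*s    ],
                  [ s*s, c*c, -2*c*s    ],
                  [-c*s, c*s,  c*c - s*s]])
    R = np.diag([1.0, 1.0, 2.0])   # Reuter matrix (engineering shear)
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

n = len(stack)
z = t_ply * (np.arange(n + 1) - n / 2)   # ply interface coordinates, mm
A = np.zeros((3, 3)); B = np.zeros((3, 3)); D = np.zeros((3, 3))
for k, theta in enumerate(stack):
    Qb = Qbar(theta)
    A += Qb * (z[k+1] - z[k])
    B += Qb * (z[k+1]**2 - z[k]**2) / 2
    D += Qb * (z[k+1]**3 - z[k]**3) / 3

print("A (N/mm):\n", np.round(A, 1))
print("B vanishes for a symmetric layup:", np.allclose(B, 0, atol=1e-6))
print("D (N*mm):\n", np.round(D, 1))
```

The A matrix governs in-plane stiffness and D bending stiffness; an optimizer searching over ply angles and counts is, in effect, shaping these matrices to meet the frame’s load cases.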

Even with all his experience, developing the bike frame still presented challenges. “One of the issues with composites simulation is there are so many variables to getting an accurate result,” admits Bosworth. “I focused on not coming up with a 100 percent perfect answer, but using the software as a tool to get us as close as we could as fast as possible.”

He adds that composites simulation tools can steer you in the right direction, but without many months of simulation and physical testing, it’s still very difficult to get completely accurate results. “One of the biggest challenges is figuring out where your time is best spent and what level of simulation accuracy you want to achieve with the given time constraints,” says Bosworth.
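
One concrete way to spend that time deliberately (an assumption on my part, not necessarily Bosworth’s workflow) is a mesh-convergence study: refine the model until the answer you care about moves by less than the error you can tolerate. The toy study below does this for the first bending frequency of a unit cantilever, where the exact answer is known.

```python
# Mesh-convergence study on a toy problem: first bending frequency of a
# unit cantilever from n Euler-Bernoulli beam elements. Illustrative only.
import numpy as np
from scipy.linalg import eigh

def first_frequency(n, EI=1.0, rhoA=1.0, L=1.0):
    le = L / n
    k0 = (EI / le**3) * np.array([
        [ 12,     6*le,    -12,    6*le   ],
        [ 6*le,   4*le**2, -6*le,  2*le**2],
        [-12,    -6*le,     12,   -6*le   ],
        [ 6*le,   2*le**2, -6*le,  4*le**2]])
    m0 = (rhoA * le / 420) * np.array([      # consistent mass matrix
        [ 156,    22*le,    54,    -13*le  ],
        [ 22*le,  4*le**2,  13*le, -3*le**2],
        [ 54,     13*le,    156,   -22*le  ],
        [-13*le, -3*le**2, -22*le,  4*le**2]])
    ndof = 2 * (n + 1)
    K = np.zeros((ndof, ndof)); M = np.zeros((ndof, ndof))
    for e in range(n):
        s = slice(2*e, 2*e + 4)
        K[s, s] += k0
        M[s, s] += m0
    free = np.arange(2, ndof)                # clamp the root node
    w2 = eigh(K[np.ix_(free, free)], M[np.ix_(free, free)],
              eigvals_only=True)
    return np.sqrt(w2[0])

prev = None
for n in (1, 2, 4, 8, 16):
    w = first_frequency(n)
    change = abs(w - prev) / w * 100 if prev else float("nan")
    print(f"{n:3d} elements: omega = {w:.6f}  (change {change:.3f} %)")
    prev = w
# Exact value for comparison: 1.8751**2 = 3.51602 on this unit beam.
```

Even one element lands within about half a percent here; each refinement shows how quickly the return on extra mesh diminishes, which is exactly the accuracy-versus-time judgment Bosworth describes.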

Wading into the Simulation Waters
The sophistication and expense of composites simulation tools can be daunting, but Wollschlager encourages people not to be put off by the technology. “The tools are not prohibitive to small and medium-sized companies – at least not to the level people think they are,” he says.

Cost is often the elephant in the room, but Wollschlager says it’s misleading to think packages will cost a fortune. “A proper suite provides you simulation in all facets of composite life cycles – in the concept, design and manufacturing phases,” he says. “The cost of such a suite is approximately 20 to 25 percent of the yearly cost of an average employee. Looking at it in those terms, I just don’t see the barrier to entry for small to medium-sized businesses.”

As you wade into the waters of simulation, consider the following:

Assess your goals before searching for a package. Depending on what you are trying to accomplish, you may need a comprehensive suite of design and analysis tools or only a module or two to get started. “If you want a simplified methodology because you don’t feel comfortable with a more advanced one, there are mainstream tools I would recommend,” says Souza. “But if you really want to innovate and be at the cutting-edge of your industry trying to understand how materials behave and reduce costs, then I would go with a more advanced package.” Decide upfront if you want tools to analyze materials, conduct preliminary designs, optimize the laminate schedule, predict the life of composite materials, simulate thermo-mechanical behaviors and so on.

Find programs that fit your budget. Many companies offer programs for startups and small businesses that include discounts on simulation software and a limited number of hours of free consulting. Guerrilla Gravity purchased its simulation tools through Altair’s Startup Program, which is designed for privately held businesses less than four years old with revenues under $10 million. The program made it fiscally feasible for the mountain bike manufacturer to create a high-performing solution, says Bosworth. “If we had not been given that opportunity, we probably would’ve gone with a much more rudimentary design – probably an isotropic, black aluminum material just to get us somewhere in the ballpark of what we were trying to do,” he says.

Engage with vendors to expedite the learning curve. Don’t just buy simulation tools from suppliers. Most companies offer initial training, plus extra consultation and access to experts as needed. “We like to walk hand-in-hand with our customers,” says Souza. “For smaller companies that don’t have a lot of resources, we can work as a partnership. We help them create the models and teach them the technology behind the product.”

Start small, and take it slow. “I see people go right to the final step, trying to make a really advanced model,” says Bosworth. “Then they get frustrated because nothing is working right and the joints aren’t articulating. They end up troubleshooting so many issues.” Instead, he recommends users start simple, as he did with the thermoplastic bike frame.

Don’t expect to do it all with simulation. “We don’t advocate for 100 percent simulation. There is no such thing. We also don’t advocate for 100 percent experimentation, which is the traditional approach to design,” says Wollschlager. “The trick is that it’s somewhere in the middle, and we’re all struggling to find the perfect percentage. It’s problem-dependent.”

Put the right people in place to use the tools. “Honestly, I don’t know much about FEA software,” admits Montague. “So it goes back to hiring smart people and letting them do their thing.” Bosworth was the “smart hire” for Guerrilla Gravity. And, as an experienced user, he agrees it takes some know-how to work with simulation tools. “I think it would be hard for someone who doesn’t have basic material knowledge and a fundamental understanding of stress and strain and boundary conditions to utilize the tools no matter how basic the FEA software is,” he says. For now, simulation is typically handled by engineers, though that may change.

Perhaps the largest barrier to implementation is ignorance – not of individuals, but industry-wide, says Pipes. “People don’t know what simulation can do for them – even many top-level senior managers in aerospace,” he says. “They still think of simulation in terms of geometry and performance, not manufacturing. And manufacturing is where the big payoff is going to be because that’s where all the economics lie.”

Pipes wants to “stretch people into believing what you can and will be able to do with simulation.” As the technology advances, that includes more and more each day – not just for mega corporations, but for small and mid-sized companies, too.

“As the simulation industry gets democratized, prices are going to come down due to competition, while the amount you can do will go through the roof,” says Wollschlager. “It’s a great time to get involved in simulation.”

Source: http://compositesmanufacturingmagazine.com/2019/05/making-simulation-accessible-to-the-masses/

 
