GitHub is not just a code hosting service with version control — it’s also an enormous developer network.
With over 30 million accounts, more than 2 million organizations, and over 96 million repositories, GitHub's sheer size translates into one of the world's most valuable development networks.
How do you quantify the value of this network? And is there a way to identify its most valuable repositories?
Here at U°OS, we ran the GitHub network through a simplified version¹ of our reputation algorithm and produced the top 100 most valuable repositories.
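The article doesn't spell out the algorithm here (the footnote points to the U°OS Network GitHub repository for the actual calculation), but a reputation score over a developer network can be sketched as a PageRank-style iteration on a contributor-to-repository graph. The following is purely illustrative; the graph, damping factor, and weights are made up for demonstration and are not the U°OS method:

```python
# Illustrative only: a PageRank-style score over a tiny
# contributor <-> repository interaction graph. The real U°OS
# algorithm is documented in their own repository; everything
# below is a made-up sketch.

def reputation_scores(edges, damping=0.85, iterations=50):
    """edges: list of (contributor, repo) interactions."""
    nodes = {n for edge in edges for n in edge}
    links = {n: [] for n in nodes}
    for src, dst in edges:
        links[src].append(dst)
        links[dst].append(src)  # treat interactions as undirected

    score = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n in nodes:
            share = score[n] / len(links[n])
            for neighbor in links[n]:
                new[neighbor] += damping * share
        score = new
    return score

edges = [("alice", "repo/a"), ("bob", "repo/a"), ("bob", "repo/b")]
scores = reputation_scores(edges)
# repo/a ends up with a higher score than repo/b: more contributors
# interact with it, so more score flows into it each iteration.
```

On a real network this kind of iteration rewards repositories that attract many well-connected contributors, which is roughly the intuition behind ranking repositories by network value.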
The result is as fascinating as it is eclectic, and it feels like a good reflection of society's interest in technology and where it is heading.
There are the big proprietary players with open source projects — Google, Apple, Microsoft, Facebook, and even Baidu. And at the same time, there’s a Chinese anti-censorship tool.
There’s Bitcoin for cryptocurrency.
There’s a particle detector for CERN’s Large Hadron Collider.
There are gaming projects like Space Station 13 and Cataclysm: Dark Days Ahead, and the Godot game engine.
There are education projects like freeCodeCamp, Open edX, Oppia, and Code.org.
There are web and mobile app building projects like WordPress, Joomla, and Flutter for publishing your content.
There are databases to store your content for the web like Ceph and CockroachDB.
And there’s a search engine to navigate through the content — Elasticsearch.
There are also, perhaps unsurprisingly, jailbreak projects, like the Cydia compatibility manager for iOS and custom firmware for the Nintendo 3DS.
And there’s a smart home system — Home Assistant.
All in all, it's a great outlook for the technology world: we learn, build things to broadcast our unique voices, use crypto, break free from proprietary software on our hardware, and in our spare time we game in our automated homes. And the big companies open-source their projects.
Before I proceed with the list: running the Octoverse through the reputation algorithm also produced a value score for every individual GitHub contributor. So if you have a GitHub account and are curious, you can get your score at https://u.community/github and convert it to a Universal Portable Reputation.
Top 100 projects & repositories
Out of over 96 million repositories
¹ The explanation of the calculation of the simplified version is in the U°OS Network GitHub repository.
Source : https://hackernoon.com/githubs-top-100-most-valuable-repositories-out-of-96-million-bb48caa9eb0b
In their paper entitled English Broadcast News Speech Recognition by Humans and Machines, the team proposes to identify techniques that close the gap between automatic speech recognition (ASR) and human performance.
IBM’s initial work in the voice recognition space was done as part of the U.S. government’s Defense Advanced Research Projects Agency (DARPA) Effective Affordable Reusable Speech-to-Text (EARS) program, which led to significant advances in speech recognition technology. The EARS program produced about 140 hours of supervised BN training data and around 9,000 hours of very lightly supervised training data from closed captions from television shows. By contrast, EARS produced around 2,000 hours of highly supervised, human-transcribed training data for conversational telephone speech (CTS).
Because so much training data is available for CTS, the team from IBM and Appen endeavored to apply similar speech recognition strategies to BN to see how well those techniques translate across applications. To understand the challenge the team faced, it’s important to call out some important differences between the two speech styles:
Broadcast news (BN)
Conversational telephone speech (CTS)
The team adapted the speech recognition systems that were used so successfully for the EARS CTS research: multiple long short-term memory (LSTM) and ResNet acoustic models trained on a range of acoustic features, along with word and character LSTMs and convolutional WaveNet-style language models. This strategy had produced word error rates between 5.1% and 9.9% for CTS in a previous study, specifically the HUB5 2000 English Evaluation conducted by the Linguistic Data Consortium (LDC). The team tested a simplified version of this approach on the BN data set, which wasn't human-annotated but rather created from closed captions.
Instead of adding all the available training data, the team carefully selected a reliable subset, then trained LSTM and residual network-based acoustic models with a combination of n-gram and neural network language models on that subset. In addition to automatic speech recognition testing, the team benchmarked the automatic system against an Appen-produced high-quality human transcription. The primary language model training text for all these models consisted of a total of 350 million words from different publicly available sources suitable for broadcast news.
In the first set of experiments, the team separately tested the LSTM and ResNet models in conjunction with the n-gram and FF-NNLM before combining scores from the two acoustic models, comparing against the results obtained on the older CTS evaluation. Unlike the original CTS testing, no significant reduction in the word error rate (WER) was achieved after the scores from the LSTM and ResNet models were combined. The LSTM model with an n-gram LM performs quite well on its own, and its results further improve with the addition of the FF-NNLM.
For the second set of experiments, word lattices were generated after decoding with the LSTM+ResNet+n-gram+FF-NNLM model. The team generated n-best lists from these lattices and rescored them with the LSTM1-LM. LSTM2-LM was also used to rescore word lattices independently. Significant WER gains were observed after using the LSTM LMs. This led the researchers to hypothesize that the secondary fine-tuning with BN-specific data is what allows LSTM2-LM to perform better than LSTM1-LM.
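The rescoring step described above, re-ranking an n-best list by combining first-pass scores with a stronger LM, can be sketched as follows. All scores here are made-up log probabilities and the interpolation weight is an assumption, not a figure from the paper:

```python
# Illustrative n-best rescoring sketch: each first-pass hypothesis
# carries an acoustic score and an n-gram LM score; a stronger
# (e.g. LSTM) LM rescores the list and hypotheses are re-ranked.
# All numbers below are invented for demonstration.

def rescore_nbest(nbest, strong_lm_score, lm_weight=0.7):
    """nbest: list of (hypothesis, acoustic_logp, ngram_lm_logp)."""
    rescored = []
    for hyp, ac_logp, ngram_logp in nbest:
        # interpolate the first-pass n-gram LM with the stronger LM
        lm_logp = (1 - lm_weight) * ngram_logp + lm_weight * strong_lm_score(hyp)
        rescored.append((hyp, ac_logp + lm_logp))
    # best (highest total log probability) hypothesis first
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)

# A toy stand-in for an LSTM LM that simply penalizes longer outputs:
toy_lm = lambda hyp: -0.5 * len(hyp.split())

nbest = [
    ("the cat sat on the mat", -10.0, -6.0),
    ("the cat sat on a mat",   -10.5, -5.5),
]
best_hyp, best_score = rescore_nbest(nbest, toy_lm)[0]
```

The same idea applies whether the stronger LM rescores n-best lists (as with LSTM1-LM here) or whole word lattices (as with LSTM2-LM).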
Our ASR results have clearly improved state-of-the-art performance, and significant progress has been made compared to systems developed over the last decade. When compared to the human performance results, the absolute ASR WER is about 3% worse. Although the machine and human error rates are comparable, the ASR system has much higher substitution and deletion error rates.
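The WER, substitution, and deletion figures discussed above all come from an edit-distance alignment between the reference transcript and the system output. A minimal sketch of the metric:

```python
# Minimal word error rate (WER) sketch: Levenshtein edit distance
# over words (substitutions, insertions, deletions) divided by the
# number of words in the reference transcript.

def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i          # i deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j          # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / len(ref)

# one substitution ("news" -> "new") in a four-word reference: 25% WER
error_rate = wer("broadcast news is hard", "broadcast new is hard")
```

A full scoring pipeline would also break the distance down by error type, which is how the substitution and deletion rates above are obtained.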
Looking at the different error types and rates, the research produced interesting takeaways:
The experiments show that ASR techniques can be transferred across domains to provide highly accurate transcriptions. For both acoustic and language modeling, the LSTM- and ResNet-based models proved effective, and the human evaluation experiments kept us honest. That said, while our methods keep improving, there is still a gap to close between human and machine performance, demonstrating a continued need for research on automatic transcription for broadcast news.
Source : https://appen.com/blog/improving-the-accuracy-of-automatic-speech-recognition-models-for-broadcast-news/
The article by The Register about Hertz suing Accenture over their failed website revamp deal has gained a lot of attention on social media, sparking discussion about failed software projects and IT consulting giants such as Accenture.
What I found saddest in the article is that the part about Accenture completely fumbling a huge website project doesn’t surprise me one bit: I stumble upon articles about large enterprise IT projects failing and going well over their budgets on a weekly basis. What was more striking about the article is that Hertz is suing Accenture, and going public with it. This tells us something about the state of the IT consulting business, and you don’t have to be an expert to tell that there is a huge flaw somewhere in the process of how large software projects are sold by consultancies, and especially how they are purchased and handled by their clients.
Just by reading the article, one might think the faults were entirely on Accenture's side, but there is definitely more to it. Hertz too clearly made a lot of mistakes during crucial phases of the project: in purchasing, service design, and development. I'll try to bite into the most critical and prominent flaws.
If we dig into the actual lawsuit document we start getting a better picture of what actually went down, and what led to tens of millions of dollars going down the drain on a service that is unusable.
Reading through points 2. and 3. of the legal complaint we get a small glimpse into the initial service design process:
2. Hertz spent months planning the project. It assessed the current state of its ecommerce activities, defined the goals and strategy for its digital business, and developed a roadmap that would allow Hertz to realize its vision.
3. Hertz did not have the internal expertise or resources to execute such a massive undertaking; it needed to partner with a world-class technology services firm. After considering proposals from several top-tier candidates, Hertz narrowed the field of vendors to Accenture and one other.
Hertz first "planned the project, defined the goals and strategy and developed the roadmap". Then, after realising they "don't have the internal expertise or resources", they started looking for a vendor who would be able to carry out their vision.
This was the first large mistake. If the initial plan, goals, and vision are set before the vendor, the party responsible for realising that vision, is even involved, you will most likely end up in a 'broken telephone' situation where the vision and goals are not properly transferred from the initial planners and designers to the implementers.
This is a very dangerous starting situation. What makes it even worse is this:
6. Hertz relied on Accenture’s claimed expertise in implementing such a digital transformation. Accenture served as the overall project manager. Accenture gathered Hertz’s requirements and then developed a design to implement those requirements. Accenture served as the product owner, and Accenture, not Hertz, decided whether the design met Hertz’s requirements.
Hertz made Accenture the product owner, thus ceding ownership of the service to Accenture. This, if anything, tells us that Hertz did not have the required expertise and maturity to undertake this project in the first place. Making a consulting company, a company with no deep insight into your specific domain, business, and needs, the owner and main visionary of your service is usually not a good idea. Especially when you consider that it might not be in the consulting company's interest to finish the project within the initial budget, but rather to extend it to generate more sales and revenue.
Having the vendor as product owner is not a rare occurrence, and it can sometimes work if the vendor has deep enough knowledge of the client's organisation, business, and domain. However, on a project this large and for a huge organisation like Hertz, it's impossible for the consulting company to have the necessary insight into and experience of Hertz's business.
Moving on to the development phase of the project:
7. Accenture committed to delivering an updated, redesigned, and re-engineered website and mobile apps that were ready to “go-live” by December 2017.
8. Accenture began working on the execution phase of the project in August 2016 and it continued to work until its services were terminated in May 2018. During that time, Hertz paid Accenture more than $32 million in fees and expenses. Accenture never delivered a functional website or mobile app. Because Accenture failed to properly manage and perform the services, the go-live date was postponed twice, first until January 2018, and then until April 2018. By that point, Hertz no longer had any confidence that Accenture was capable of completing the project, and Hertz terminated Accenture.
Hertz finally lost its confidence in Accenture about five months after the initially planned go-live date, seemingly at least a full year after kicking off the project partnership with them.
If it took Hertz around a year and a half to realise that Accenture couldn't deliver, it's safe to say that Hertz and Accenture were both working in their own silos with minimal transparency into each other's work, and that critical information was not moving between the organisations. My best guess is that Hertz and Accenture met only once in a while to assess the status of the project and share updates. But a software project like this should be an ongoing collaborative process, with constant daily discussion between the parties. In a well-functioning organisation, the client and vendor are one united team pushing the product out together.
The lack of communication infrastructure is a common problem in large-scale software projects between a company and its vendor. It's hard to say whose responsibility it should be to organise the tools, processes, meetings, and environments needed to make sure the necessary discussions happen and knowledge is shared. But the consulting company is often the one with a more modern take on communication, and it can provide the framework and tools for it much more easily.
We get a deeper glimpse into the lack of transparency, especially on the technical side, when we go through points 36 to 42 of the legal complaint, e.g. number 40:
40. Accenture’s Java code did not follow the Java standard, displayed poor logic, and was poorly written and difficult to maintain.
Right. Accenture's code quality and technical competence were not at a satisfactory level, and that is on Accenture, as they were hired to be the technical experts on the project. But had Hertz had even one technical person working on the project with visibility into the codebase, they could've caught this problem right from the first commit, instead of noticing it after over a year of Accenture delivering bad-quality code. If you are buying software for tens of millions, you must have an in-house technical expert as part of the software development process, even if only as a spectator.
The lack of transparency and technical expertise, combined with the lack of ownership and responsibility, was ultimately the reason Hertz managed to blow tens of millions of dollars instead of just a couple. Had Hertz had the technical know-how and been more deeply involved in the work, they could've assessed early on that the way Accenture was doing things was flawed. Perhaps some people at Hertz saw that the situation was bad early on, but since ownership of the product was on Accenture's side, it must have been hard for them to speak up as they saw the issues. The result was that Accenture was allowed to do unsuitable work for over a year, until the initial 'go-live' date was long past and it was already too late.
There have been rumours of Hertz leadership firing its entire well-performing in-house software development team in 2016, replacing it with an off-shore workforce from IBM, and making crony 'golf course' deals with Accenture, with the Hertz CIO securing a $7 million bonus for the short-term 'savings' those changes produced. I'd recommend taking these Hacker News comments with a grain of salt, but I wouldn't be at all surprised if the allegations were more or less true.
These kinds of crony contracts are a huge problem in the enterprise software industry in general, and the news we see about them is only the tip of the iceberg. But that is a subject for a whole other blog post.
It's important to keep in mind that the lawsuit text doesn't really tell us the whole truth: a lot must have happened during those years that we will never know of. However, it's quite clear that some mistakes that constantly happen in consulting projects happened here too, and that the ball was dropped by both parties involved.
It’s going to be interesting to see how the lawsuit plays out, as it will work as a real-life example to both consulting companies and their clients on what could happen when their expensive software projects go south.
For a company which is considering buying software, the most important learnings to take out of this mess are:
Also, one thing to note is that many companies who have had bad experiences with large enterprise consultancies have turned to the smaller, truly agile software consultancies instead of the giants like Accenture. Smaller companies are better at taking responsibility for their work, and they have the required motivation to actually deliver quality, as they appreciate the chance to tackle a large project. For a small company the impact of delivering a project well and keeping the client happy is much more important than it is for an already well established giant.
Hopefully by learning from history and the mistakes of others, we can avoid going through the hell that the people at Hertz had to!
It’s energy that has been around forever, used for years as a heating source across the world, particularly in areas with volcanic activity. Today, geothermal has surfaced as another renewable resource, with advancements in drilling technology bringing down costs and opening new areas to development.
Renewable energy continues to increase its share of the world’s power generation. Solar and wind power receive most of the headlines, but another option is increasingly being recognized as an important carbon-free resource.
Geothermal, accessing heat from the earth, is considered a sustainable and environmentally friendly source of renewable energy. In some parts of the world, the heat that can be used for geothermal is easily accessible, while in other areas, access is more challenging. Areas with volcanic activity, such as Hawaii—where the recently restarted Puna Geothermal Venture supplies about 30% of the electricity demand on the island of Hawaii—are well-suited to geothermal systems.
“What we need to do as a renewable energy industry is appreciate that we need all sources of renewable power to be successful and that intermittent sources of power need the baseload sources to get to a 100% renewable portfolio,” Will Pettitt, executive director of the Geothermal Resources Council (GRC), told POWER. “Geothermal therefore needs to be collaborating with the solar, wind, and biofuel industries to make this happen.”
1. The Nesjavellir Geothermal Power Station is located near the Hengill volcano in Iceland. The 120-MW plant contributes to the country’s 750 MW of installed geothermal generation capacity. Courtesy: Gretar Ívarsson
The U.S. Department of Energy (DOE) says the U.S. leads the world in geothermal generation capacity, with about 3.8 GW. Indonesia is next at about 2 GW, with the Philippines at about 1.9 GW. Turkey and New Zealand round out the top five, followed by Mexico, Italy, Iceland (Figure 1), Kenya, and Japan.
Cost savings from geothermal when compared to other technologies is part of its allure. The DOE is funding research into clean energy options, including up to $84 million in its 2019 budget to advance geothermal energy development.
2. This graphic produced by AltaRock Energy, a geothermal development and management company, shows the energy-per-well equivalent for shale gas, conventional geothermal, an enhanced geothermal system (EGS) well, and a “super hot” EGS well. Courtesy: AltaRock Energy / National Renewable Energy Laboratory
Introspective Systems, a Portland, Maine-based company that develops distributed grid management software, in February received a Small Business Innovation Research award from the DOE in support of the agency’s Enhanced Geothermal Systems’ (EGS) project. At EGS (Figure 2) sites, a fracture network is developed, and water is pumped into hot rock formations thousands of feet below the earth’s surface. The heated water is then recovered to drive conventional steam turbines. Introspective Systems is developing monitoring software that enables EGS systems to be cost-competitive.
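As a rough illustration of why flow rate and rock temperature matter so much to EGS economics, the recoverable electric output can be estimated from the mass flow of the circulated water, the temperature drop across the plant, and a heat-to-electricity conversion efficiency. All figures below are assumed round numbers for demonstration, not data from the article:

```python
# Back-of-the-envelope EGS power estimate. Every number here is an
# assumed round figure for illustration only.

WATER_HEAT_CAPACITY = 4184.0  # J/(kg*K), specific heat of water

def electric_power_mw(flow_kg_s, t_in_c, t_out_c, efficiency=0.12):
    """Electric output in MW for water cooled from t_in_c to t_out_c."""
    thermal_w = flow_kg_s * WATER_HEAT_CAPACITY * (t_in_c - t_out_c)
    return thermal_w * efficiency / 1e6

# 80 kg/s of water cooled from 200 C to 80 C at 12% conversion
# efficiency yields roughly 4.8 MW of electric output.
power = electric_power_mw(80.0, 200.0, 80.0)
```

The linear dependence on flow rate is why pores clogging with solids, the problem Aikin describes below, directly erodes a plant's output.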
Kay Aikin, Introspective Systems’ CEO, was among business leaders selected by the Clean Energy Business Network (CEBN)—a group of more than 3,000 business leaders from all 50 states working in the clean energy economy—to participate in meetings with members of Congress in March to discuss the need to protect and grow federal funding for the DOE and clean energy innovation overall.
Aikin told POWER that EGS technology is designed to overcome the problem of solids coming “out of the liquids and filling up all the pores,” or cracks in rock through which heated water could flow. The Introspective Systems’ software uses “algorithms to find the sites [suitable for a geothermal system]. We can track those cracks and pores, and that is what we are proposing to do.”
Looking for more insight into geothermal energy? Read our “Q&A with Geothermal Experts,” featuring Dr. Will Pettitt, executive director of the Davis, California-based Geothermal Resources Council, and Dr. Torsten Rosenboom, a partner in the Frankfurt, Germany office of global law firm Watson Farley & Williams LLP.
“In my view there are three technology pieces that need to come together for EGS to be successful,” said the GRC’s Pettitt. “Creating and maintaining the reservoir so as to ensure sufficient permeability without short-circuiting; bringing costs down on well drilling and construction; [and] high-temperature downhole equipment for zonal isolation and measurements. These technologies all have a lot of crossover opportunities to helping conventional geothermal be more efficient.”
Aikin noted a Massachusetts Institute of Technology report on geothermal [The Future of Geothermal Energy: Impact of Enhanced Geothermal Systems (EGS) on the United States in the 21st Century] “that was the basis for this funding from DOE,” she said. Aikin said current goals for geothermal would “offset about 6.1% of CO2 emissions, about a quarter of the Paris climate pledge. Because it’s base[load] power, it will offset coal and natural gas. We’re talking about roughly 1,500 new geothermal plants by 2050, and they can be sited almost anywhere.”
Kate Young, manager of the geothermal program at the National Renewable Energy Laboratory (NREL) in Golden, Colorado, talked to POWER about the biggest things that the industry is focusing on. “DOE has been working with the national labs the past several years to develop the GeoVision study, that is now in the final stages of approval,” she said.
The GeoVision study explores potential geothermal growth scenarios across multiple market sectors for 2020, 2030, and 2050. NREL’s research focuses on things such as:
The study started with analyses spearheaded by several DOE labs in areas such as exploration; reservoir development and management; non-technical barriers; hybrid systems; and thermal applications (see sidebar). NREL then synthesized the analyses from the labs in market deployment models for the electricity and heating/cooling sectors.
Geothermal Is Big Business in Boise
The first U.S. geothermal district heating system began operating in 1892 in Boise, Idaho. The city still relies on geothermal, with the largest system of its kind in the U.S., and the sixth-largest worldwide, according to city officials. The current system, which began operating in 1983, heats 6 million square feet of real estate—about a third of the city’s downtown (Figure 3)—in the winter. The city last year got the go-ahead from the state Department of Water Resources to increase the amount of water it uses, and Public Works Director Steve Burgos told POWER the city wants to connect more downtown buildings to the system.
Burgos said it costs the city about $1,000 to pump the water out of the ground and into the system on a monthly basis, and about another $1,000 for the electricity used to inject the water back into the aquifer. Burgos said the water “comes out at 177 degrees,” and the city is able to re-use the water in lower-temperature (110 degrees) scenarios, such as at laundry facilities. The city’s annual revenue from the system is $650,000 to $750,000.
“We have approximately 95 buildings using the geothermal system,” said Burgos. “About 2% of the city’s energy use is supplied by geothermal. We’re very proud of it. It’s a source of civic pride. Most of the buildings that are hooked up use geothermal for heating. Some of the buildings use geothermal for snow melt. There’s no outward sign of the system, there’s no steam coming out of the ground.”
Colin Hickman, the city’s communication manager for public works, told POWER that Boise “has a downtown YMCA, that has a huge swimming pool, that is heated by geothermal.” He and Burgos both said the system is an integral part of the city’s development.
“We’re currently looking at a strategic master plan for the geothermal,” Burgos said. “We definitely want to expand the system. Going into suburban areas is challenging, so we’re focusing on the downtown core.” Burgos said the city about a decade ago put in an injection well to help stabilize the aquifer. Hickman noted the city last year received a 25% increase in its water rights.
Boise State University (BSU) has used the system since 2013 to heat several of its buildings, and the school’s curriculum includes the study of geothermal physics. The system at BSU was expanded about a year and a half ago—it’s currently used in 11 buildings—and another campus building currently under construction also will use geothermal.
Boise officials tout the city’s Central Addition project, part of its LIV District initiative (Lasting Environments, Innovative Enterprises and Vibrant Communities). Among the LIV District’s goals is to “integrate renewable and clean geothermal energy” as part of the area’s sustainable infrastructure.
“This is part of a broader energy program for the city,” Burgos said, “as the city is looking at a 100% renewable goal, which would call for an expansion of the geothermal energy program.” Burgos noted that Idaho Power, the state’s prominent utility, has a goal of 100% clean energy by 2045.
As Boise grows, Burgos and Hickman said the geothermal system will continue to play a prominent role.
“We actively go out and talk about it when we know a new business is coming in,” Burgos said. “And as building ownership starts to change hands, we want to have a relationship with those folks.”
Said Hickman: “It’s one of the things we like as a selling point” for the city.
Young told POWER: “The GeoVision study looked at different pathways to reduce the cost of geothermal and at ways we can expand access to geothermal resources so that it can be a 50-state technology, not limited to the West. When the study is released, it will be a helpful tool in showing the potential for geothermal in the U.S.”
Young said of the DOE: “Their next big initiative is to enable EGS, using the FORGE site,” referring to the Frontier Observatory for Research in Geothermal Energy, a location “where scientists and engineers will be able to develop, test, and accelerate breakthroughs in EGS technologies and techniques,” according to DOE. The agency last year said the University of Utah “will receive up to $140 million in continued funding over the next five years for cutting-edge geothermal research and development” at a site near Milford, Utah, which will serve as a field laboratory.
“The amount of R&D money that’s been invested in geothermal relative to other technologies has been small,” Young said, “and consequently, the R&D improvement has been proportionally less than for other technologies. The potential, however, for geothermal technology and cost improvement is significant; investment in geothermal could bring down costs and help make it a 50-state technology – which could have a positive impact on the U.S. energy industry.”
For those who question whether geothermal would work in some areas, Young counters: “The temperatures are lower in the Eastern U.S., but the reality is, there’s heat underground everywhere. The core of the earth is as hot as the surface of the sun, but a lot closer. DOE is working to be able to access that heat from anywhere – at low cost.”
Geothermal installations are often found at tectonic plate boundaries, or at places where the Earth’s crust is thin enough to let heat through. The Pacific Rim, known as the Ring of Fire for its many volcanoes, has several of these places, including in California, Oregon, and Alaska, as well as northern Nevada.
Geothermal’s potential has not gone unnoticed. Some of the world’s wealthiest people, including Microsoft founder Bill Gates, Amazon founder and CEO Jeff Bezos, and Alibaba co-founder Jack Ma, are backing Breakthrough Energy Ventures, a firm that invests in companies developing decarbonization technologies. Breakthrough recently invested $12.5 million in Baseload Capital, a geothermal project development company that provides funding for geothermal power plants using technology developed by Climeon, its Swedish parent company.
Climeon was founded in 2011; it formed Baseload Capital in 2018. The two focus on geothermal, shipping, and heavy industry, in the latter two sectors turning waste heat into electricity. Climeon’s geothermal modules are scalable, and available for both new and existing geothermal systems. Climeon in March said it had an order backlog of about $88 million for its modules.
“We believe that a baseload resource such as low-temperature geothermal heat power has the potential to transform the energy landscape. Baseload Capital, together with Climeon’s innovative technology, has the potential to deliver [greenhouse gas-free] electricity at large scale, economically and efficiently,” Carmichael Roberts of Breakthrough Energy Ventures said in a statement.
Climeon says its modules reduce the need for drilling new wells and enable the reuse of older wells, along with speeding the development time of projects. The company says the compact and modular design is scalable from 150-kW modules up to 50-MW systems. Climeon says it can be connected to any heat source, and has just three moving parts in each module: two pumps, and a turbine.
4. The Sonoma Plant operated by Calpine is one of more than 20 geothermal power plants sited at The Geysers, the world’s largest geothermal field, located in Northern California. Courtesy: Creative Commons / Stepheng3
Breakthrough Energy’s investment in Baseload Capital is its second into geothermal energy. Breakthrough last year backed Fervo Energy, a San Francisco, California-based company that says its technology can produce geothermal energy at a cost of 5¢/kWh to 7¢/kWh. Fervo CEO and co-founder Tim Latimer said the money from Breakthrough would be used for field testing of EGS installations. Fervo’s other co-founder, Jack Norbeck, was a reservoir engineer at The Geysers in California (Figure 4), the world’s largest geothermal field, located north of Santa Rosa and just south of the Mendocino National Forest.
Most of the nearly two dozen geothermal plants at The Geysers are owned and operated by Calpine, though not all are operating. The California Energy Commission says there are more than 40 operating geothermal plants in the state, with installed capacity of about 2,700 MW.
Geothermal “is something we have to do,” said Aikin of Introspective Systems. “We have to find new baseload power. Our distribution technology can get part of the way there, toward 80% renewables, but we need base power. [Geothermal] is a really good ‘all of the above’ direction to go in.”
Source : https://www.powermag.com/bringing-the-heat-geothermal-making-inroads-as-baseload-power/?printmode=1
Composites simulation tools aren’t just for mega corporations. Small and mid-sized companies can reap their benefits, too.
In 2015, Solvay Composite Materials began using simulation tools from MultiMechanics to simplify testing of materials used in high-performance applications. The global business unit of Solvay recognized the benefits of conducting computer-simulated tests to accurately predict the behavior of advanced materials, such as resistance to extreme temperatures and loads. Two years later, Solvay invested $1.9 million in MultiMechanics to expedite development of the Omaha, Neb.-based startup company’s material simulation software platform, which Solvay predicts could reduce the time and cost of developing new materials by 40 percent.
Commitment to – and investment in – composites simulation tools isn’t unusual for a large company like Solvay, which recorded net sales of €10.3 billion (approximately $11.6 billion) in 2018 and has 27,000 employees working at 125 sites throughout 62 countries. What may be more surprising is the impact composites simulation can have on small to mid-sized companies. “Simulation tools are for everyone,” asserts Flavio Souza, Ph.D., president and chief technology officer of MultiMechanics.
The team at Guerrilla Gravity would agree. The 7-year-old mountain bike manufacturer in Denver began using simulation software from Altair more than a year ago to develop a new frame technology made from thermoplastic resins and carbon fiber. “We were the first ones to figure out how to create a hollow structural unit with a complex geometry out of thermoplastic materials,” says Will Montague, president of Guerrilla Gravity.
That probably wouldn’t have been possible without composites simulation tools, says Ben Bosworth, director of composites engineering at Guerrilla Gravity. Using topology optimization, which essentially finds the ideal distribution of material based on goals and constraints, the company was able to maximize use of its materials and conduct testing with confidence that the new materials would pass on the first try. (They did.) Afterward, the company was able to design its product for a specific manufacturing process – automated fiber placement.
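The core idea of topology optimization can be sketched with a toy example. The snippet below is not Altair's algorithm and uses made-up numbers; it illustrates the underlying principle on the simplest possible case: sizing segments of an axially loaded bar to minimize compliance (flexibility) under a fixed material budget. A Lagrange-multiplier argument shows the optimal cross-section of each segment is proportional to the internal force it carries, i.e., material goes where the load is.

```python
# Toy sketch of the material-distribution idea behind topology
# optimization (illustrative only, not a commercial solver's method).
# Bar segments with internal forces N_i and lengths L_i; minimize
# compliance sum(N_i^2 * L_i / (E * A_i)) subject to a fixed total
# volume sum(A_i * L_i) = V. The optimum is A_i proportional to N_i.

def compliance(areas, forces, lengths, E=70e9):
    """Compliance: sum of N_i^2 * L_i / (E * A_i) over segments."""
    return sum(n * n * l / (E * a) for n, l, a in zip(forces, lengths, areas))

def optimal_areas(forces, lengths, volume):
    """Closed-form optimum: A_i = V * N_i / sum(N_j * L_j)."""
    denom = sum(n * l for n, l in zip(forces, lengths))
    return [volume * n / denom for n in forces]

forces = [3000.0, 2000.0, 1000.0]   # N: internal force in each segment
lengths = [0.4, 0.4, 0.4]           # m: segment lengths
volume = 1.2e-3                     # m^3: total material budget

opt = optimal_areas(forces, lengths, volume)
uniform = [volume / sum(lengths)] * len(lengths)   # naive even spread

# The optimized layout is stiffer than the uniform one for the same mass.
print(compliance(opt, forces, lengths) <= compliance(uniform, forces, lengths))  # prints True
```

Real topology optimization works the same way in principle, but over thousands of finite elements and with manufacturing constraints layered on top.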
“There is a pretty high chance that if we didn’t utilize composites simulation software, we would have been far behind schedule on our initial target launch date,” says Bosworth. Guerrilla Gravity introduced its new frame, which can be used on all four of its full-suspension mountain bike models, on Jan. 31, 2019.
The Language of Innovation
There are dozens of simulation solutions, some geared specifically to the composites industry, others general-purpose finite element analysis (FEA) tools. But they all share the common end goal of helping companies bring pioneering products to market faster – whether those companies are Fortune 500 corporations or startup ventures.
“Composites simulation is going to be the language of innovation,” says R. Byron Pipes, executive director of the Composites Manufacturing & Simulation Center at Purdue University. “Without it, a company’s ability to innovate in the composites field is going to be quite restricted.”
Those innovations can be at the material level or within end-product applications. “If you really want to improve the micromechanics of your materials, you can use simulation to tweak the properties of the fibers, the resin, the combination of the two or even the coating of fibers,” says Souza. “For those who build parts, simulation can help you innovate in terms of the shape of the part and the manufacturing process.”
One of the biggest advantages that design simulation has over the traditional engineering approach is time, says Jeff Wollschlager, senior director of composites technology at Altair. He calls conventional engineering the “build and bust” method, where companies make samples, then break them to test their viability. It’s a safe method, producing solid – although often conservative – designs. “But the downside of traditional approaches is they take a lot more time and many more dollars,” says Wollschlager. “And everything in this world is about time and money.”
In addition, simulation tools allow companies to know more about the materials they use and the products they make, which in turn facilitates the manufacturing of more robust products. “You have to augment your understanding of your product with something else,” says Wollschlager. “And that something else is simulation.”
A Leap Forward in Manufacturability
Four years ago, Montague and Matt Giaraffa, co-founder and chief engineer of Guerrilla Gravity, opted to pursue carbon fiber materials to make their bike frames lighter and sturdier. “We wanted to fundamentally improve on what was out there in the market. That required rethinking and analyzing not only the material, but how the frames are made,” says Montague.
The company also was committed to manufacturing its products in the United States. “To produce the frames in-house, we had to make a big leap forward in manufacturability of the frames,” says Montague. “And thermoplastics allow for that.” Once Montague and Giaraffa selected the material, they had to figure out exactly how to make the frames. That’s when Bosworth – and composites simulation – entered the picture.
Bosworth has more than a decade of experience with simulation software, beginning as a mechanical engineering undergraduate on his college’s Formula SAE® team, which designs, builds and tests a vehicle for competition. While creating the new frame for Guerrilla Gravity, he used Altair’s simulation tools extensively, beginning with early development to prove the material feasibility for the application.
“We had a lot of baseline data from our previous aluminum frames, so we had a really good idea about how strong the frames needed to be and what performance characteristics we wanted,” says Bosworth. “Once we introduced the thermoplastic carbon fiber, we were able to take advantage of the software and use it to its fullest potential.” He began with simple tensile test samples and matched those with physical tests. Next, he developed tube samples using the software and again matched those to physical tests.
“It wasn’t until I was much further down the rabbit hole that I actually started developing the frame model,” says Bosworth. Even then, he started small, first developing a computer model for the front triangle of the bike frame, then adding in the rear triangle. Afterward, he integrated the boundary conditions and the load cases and began doing the optimization.
“You need to start simple, get all the fundamentals down and make sure the models are working in the way you intend them to,” says Bosworth. “Then you can get more advanced and grow your understanding.” At the composite optimization stage, Bosworth was able to develop a high-performing laminate schedule for production and design for automated fiber placement.
Even with all his experience, developing the bike frame still presented challenges. “One of the issues with composites simulation is there are so many variables to getting an accurate result,” admits Bosworth. “I focused on not coming up with a 100 percent perfect answer, but using the software as a tool to get us as close as we could as fast as possible.”
He adds that composites simulation tools can steer you in the right direction, but without many months of simulation and physical testing, it’s still very difficult to get completely accurate results. “One of the biggest challenges is figuring out where your time is best spent and what level of simulation accuracy you want to achieve with the given time constraints,” says Bosworth.
Wading into the Simulation Waters
The sophistication and expense of composites simulation tools can be daunting, but Wollschlager encourages people not to be put off by the technology. “The tools are not prohibitive to small and medium-sized companies – at least not to the level people think they are,” he says.
Cost is often the elephant in the room, but Wollschlager says it’s misleading to think packages will cost a fortune. “A proper suite provides you simulation in all facets of composite life cycles – in the concept, design and manufacturing phases,” he says. “The cost of such a suite is approximately 20 to 25 percent of the yearly cost of an average employee. Looking at it in those terms, I just don’t see the barrier to entry for small to medium-sized businesses.”
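Wollschlager's pricing rule of thumb is easy to put in concrete terms. The salary figure below is an assumption for illustration only; the article does not quote one.

```python
# Back-of-the-envelope version of the "20 to 25 percent of an average
# employee's yearly cost" claim. The employee cost is an assumed figure.

employee_cost = 120_000  # assumed fully loaded annual cost, USD
low = 0.20 * employee_cost
high = 0.25 * employee_cost
print(f"estimated suite cost: ${low:,.0f}-${high:,.0f} per year")
```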
As you wade into the waters of simulation, consider the following:
• Assess your goals before searching for a package. Depending on what you are trying to accomplish, you may need a comprehensive suite of design and analysis tools or only a module or two to get started. “If you want a simplified methodology because you don’t feel comfortable with a more advanced one, there are mainstream tools I would recommend,” says Souza. “But if you really want to innovate and be at the cutting-edge of your industry trying to understand how materials behave and reduce costs, then I would go with a more advanced package.” Decide upfront if you want tools to analyze materials, conduct preliminary designs, optimize the laminate schedule, predict the life of composite materials, simulate thermo-mechanical behaviors and so on.
• Find programs that fit your budget. Many companies offer programs for startups and small businesses that include discounts on simulation software and a limited number of hours of free consulting. Guerrilla Gravity purchased its simulation tools through Altair’s Startup Program, which is designed for privately-held businesses less than four years old with revenues under $10 million. The program made it fiscally feasible for the mountain bike manufacturer to create a high-performing solution, says Bosworth. “If we had not been given that opportunity, we probably would’ve gone with a much more rudimentary design – probably an isotropic, black aluminum material just to get us somewhere in the ballpark of what we were trying to do,” he says.
• Engage with vendors to expedite the learning curve. Don’t just buy simulation tools from suppliers. Most companies offer initial training, plus extra consultation and access to experts as needed. “We like to walk hand-in-hand with our customers,” says Souza. “For smaller companies that don’t have a lot of resources, we can work as a partnership. We help them create the models and teach them the technology behind the product.”
• Start small, and take it slow. “I see people go right to the final step, trying to make a really advanced model,” says Bosworth. “Then they get frustrated because nothing is working right and the joints aren’t articulating. They end up troubleshooting so many issues.” Instead, he recommends users start simple, as he did with the thermoplastic bike frame.
• Don’t expect to do it all with simulation. “We don’t advocate for 100 percent simulation. There is no such thing. We also don’t advocate for 100 percent experimentation, which is the traditional approach to design,” says Wollschlager. “The trick is that it’s somewhere in the middle, and we’re all struggling to find the perfect percentage. It’s problem-dependent.”
• Put the right people in place to use the tools. “Honestly, I don’t know much about FEA software,” admits Montague. “So it goes back to hiring smart people and letting them do their thing.” Bosworth was the “smart hire” for Guerrilla Gravity. And, as an experienced user, he agrees it takes some know-how to work with simulation tools. “I think it would be hard for someone who doesn’t have basic material knowledge and a fundamental understanding of stress and strain and boundary conditions to utilize the tools no matter how basic the FEA software is,” he says. For now, simulation is typically handled by engineers, though that may change.
Perhaps the largest barrier to implementation is ignorance – not of individuals, but industry-wide, says Pipes. “People don’t know what simulation can do for them – even many top-level senior managers in aerospace,” he says. “They still think of simulation in terms of geometry and performance, not manufacturing. And manufacturing is where the big payoff is going to be because that’s where all the economics lie.”
Pipes wants to “stretch people into believing what you can and will be able to do with simulation.” As the technology advances, that includes more and more each day – not just for mega corporations, but for small and mid-sized companies, too.
“As the simulation industry gets democratized, prices are going to come down due to competition, while the amount you can do will go through the roof,” says Wollschlager. “It’s a great time to get involved in simulation.”
Source: http://compositesmanufacturingmagazine.com/2019/05/making-simulation-accessible-to-the-masses/