15th of August 2018

Five ways supercomputing is solving some of the world's greatest challenges

The evolution of supercomputing over the last several decades has dramatically changed how scientists, researchers and technologists tackle global issues. And it has never been more important to model our world with supercomputers, given the profound challenges we face globally in energy, climate change and health. Simulating how our planet works at a granular level, and how our bodies function at the cellular level, requires the most powerful analytical and computational tools available. Supercomputers are those tools.

Can Wind Power Fuel the Entire World?

Wind power is an unbeatable renewable energy source that, if harnessed effectively, could meet all of the world's energy demands. In fact, according to GE, based on wind conditions at a typical site in the German North Sea, a single turbine could produce as much as 67 gigawatt hours (GWh) annually – enough power for as many as 16,000 European households. That's why Dr. Lawrence Cheung and the team at GE Global Research are using a supercomputer to understand the elements critical to the design and development of wind turbines and wind farms.
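
A quick sanity check on those numbers (our own arithmetic, not a figure from GE or the article's sources) shows the claim is in line with typical European household consumption of roughly 3,000-4,500 kWh per year:

```python
# Back-of-envelope check of the turbine claim above: 67 GWh/year spread
# across 16,000 households implies an average annual consumption per
# household, which we can compare with typical European usage.

annual_output_gwh = 67     # claimed annual output of one offshore turbine
households = 16_000        # households the article says it could power

kwh_per_household = annual_output_gwh * 1e6 / households  # 1 GWh = 1e6 kWh
print(f"Implied consumption: {kwh_per_household:,.0f} kWh per household per year")
# ~4,188 kWh/year, consistent with typical European household demand
```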

Wind is invisible to the naked eye, and wind motions vary in scale and change constantly – hourly, daily and seasonally. This makes wind extremely difficult to measure and track, and traditional methods such as radar and point wind-measurement devices don't provide a sufficiently comprehensive picture of what's happening around an entire array of wind turbines.

Using a supercomputer, Dr. Cheung and the GE team can run predictive simulations of actual wind farms in just a couple of weeks. They have succeeded in measuring, and then reproducing in simulation, what happens in the atmosphere and in the wind field around the turbines, and that data is now being used to build new, more efficient turbines. With supercomputing capabilities, Cheung says, "We're getting things that can't ever be reasonably measured in the field. We're getting information that would be impossible to measure by other means."

Earthquake Simulations Help Cities Plan for Safer Infrastructure

While science can't yet predict earthquakes, supercomputing is helping with the next best thing – predicting their effects on infrastructure, so that cities like San Francisco and Mexico City can plan their buildings, roads and utilities for when the "Big Ones" strike.

Specifically, the team of scientists at the Southern California Earthquake Center (SCEC) is using a supercomputer to process enormous calculations and obtain seismic intelligence on how the ground would move during an earthquake. SCEC recently ran a simulation that generated two sets of seismic maps extending from the original Los Angeles basin study area into central California and covering 438 sites, including public utility stations, historic sites and key cities.

Ground motion is notoriously difficult to model, and achieving an accurate view of it requires more than standard practical techniques. CyberShake, one of SCEC's major research efforts, has built on those standard techniques – using advanced computing, integrating more sophisticated physics, and aggregating vast amounts of data on ground motion and wave propagation – to produce complete and accurate earthquake models.

For example, the supercomputer SCEC uses for earthquake simulations has almost 300,000 CPUs and 19,000 GPUs. That means it can solve trillions of equations in a matter of moments – faster than you can scroll to the bottom of this article.
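
To see why that claim is plausible, here is a rough sketch. The per-core and per-GPU rates below are illustrative assumptions, not published specifications for SCEC's machine:

```python
# Rough plausibility sketch for the "trillions of equations in moments"
# claim. All rates here are assumed, order-of-magnitude values.

cpu_cores = 300_000
gpus = 19_000
flops_per_core = 1e10     # assumed ~10 GFLOPS per CPU core
flops_per_gpu = 5e12      # assumed ~5 TFLOPS per GPU

total_flops = cpu_cores * flops_per_core + gpus * flops_per_gpu
ops_per_equation = 1_000  # assumed cost to evaluate one equation

equations_per_second = total_flops / ops_per_equation
print(f"Aggregate: {total_flops:.1e} FLOPS -> "
      f"~{equations_per_second:.1e} equations per second")
# ~1e14 equations/second: trillions of equations resolve in under a second
```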

Advanced Weather Forecasts Save Lives

Every year, extreme weather conditions present a threat to lives and livelihoods around the globe. Advanced weather forecasting requires an incredible amount of compute power, something the Met Office – the U.K.’s national weather service – knows all too well.

The Met Office is estimated to save as many as 74 lives and £260 million a year. These life-saving forecasts, however, demand immense compute power and, above all, accuracy – without it, the public's attention to severe weather warnings would wane.

The Met Office combines more than 10 million daily weather observations from the UK alone with an advanced atmospheric model and a supercomputer to create 3,000 tailored forecasts each day. It can turn out such a vast number of forecasts because its supercomputer is capable of processing 16,000 trillion calculations each second.
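
Translated into more familiar units (a simple conversion of the figures quoted above, not additional Met Office data):

```python
# Putting the Met Office figures above on a common scale.

calcs_per_second = 16_000e12   # "16,000 trillion calculations each second"
observations_per_day = 10e6    # daily UK weather observations
forecasts_per_day = 3_000      # tailored forecasts produced each day

print(f"Sustained rate: {calcs_per_second / 1e15:.0f} petaFLOPS")
print(f"Observations per forecast: "
      f"{observations_per_day / forecasts_per_day:,.0f}")
# 16 petaFLOPS; roughly 3,333 raw UK observations feed each tailored forecast
```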

The Met Office's supercomputer also lets it take in 215 billion weather observations from all over the world every day, which it uses as the starting point for an atmospheric model containing more than a million lines of code. In addition, the supercomputer is expected to enable £2 billion of socio-economic benefits across the UK through enhanced resilience to severe weather and related hazards.

Getting Oil to the Pump Faster

The oil and gas industry is in an aggressive race toward efficiency. Studies show that energy consumption is on pace to double by 2040, and as a result the demand for oil and gas is soaring. At the same time, new oil and gas reserves are becoming harder to locate, and the industry as a whole is facing cutbacks.

Exploration and production (E&P) companies face a stark reality: finding oil and gas resources can no longer be a guessing game. PGS, a marine seismic company that creates high-resolution 3D seismic images for E&P companies, is tackling that reality with the immense computing power of a supercomputer, producing increasingly accurate and clear images that make exploration better, faster and smarter for E&Ps.

In 2014, PGS conducted a survey of a resource-rich, 10,000-square-mile area of the Gulf of Mexico that took almost two years to plan and almost a year to carry out. It was the largest seismic survey ever processed, and required supercomputing technology to crunch the data in the shortest time possible. The result was a 3D image with a pixel size of 30x30x10 meters covering the entire survey area down to a depth of 16 kilometers.
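
To get a feel for the scale of that image, here is a rough voxel count from the stated footprint and resolution (our own arithmetic, not a PGS figure):

```python
# Approximate size of the 3D seismic image described above, using the
# survey's stated 10,000-square-mile footprint and 30x30x10 m pixel size.

area_sq_miles = 10_000
sq_m_per_sq_mile = 2.59e6                # 1 square mile ≈ 2.59 km²
depth_m = 16_000

voxel_x, voxel_y, voxel_z = 30, 30, 10   # stated pixel size in meters

columns = area_sq_miles * sq_m_per_sq_mile / (voxel_x * voxel_y)
layers = depth_m / voxel_z
voxels = columns * layers
print(f"~{voxels:.1e} voxels")           # ~4.6e10: tens of billions of points
```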

Discovering Bone Fracture Treatments in One-Quarter of the Time

Treating bone fractures is still an uncertain science. Two patients can present with exactly the "same" fracture, yet an implant may work for one and fail for the other. Dr. Ralf Schneider of the High Performance Computing Center Stuttgart (HLRS) is trying to figure out why.

To predict how a specific patient's bones will heal, Dr. Schneider uses HLRS's supercomputer to run micromechanical simulations of bone tissue. But accurate calculations begin with correct material data, so he uses samples from patients of different ages and genders to build a representative database. "You have to calculate the local strain within the bone in the correct way," says Schneider. "If you don't have the right elasticity, you'll formulate the strain incorrectly, which will lead to an incorrect estimation of bone remodeling" and an incorrect calculation of the risk of implant failure.
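
Schneider's point can be seen in the simplest linear-elastic case, where strain is stress divided by the elastic modulus E. The stress and modulus values below are illustrative assumptions, not data from the HLRS database:

```python
# Minimal illustration of how a wrong elasticity skews the strain estimate.
# Linear (Hookean) strain = stress / E; the values here are assumed.

stress_pa = 5e6                  # assumed local stress in the bone, 5 MPa
e_true = 15e9                    # assumed true tissue modulus, 15 GPa
e_wrong = 20e9                   # an overestimated modulus, 20 GPa

strain_true = stress_pa / e_true
strain_wrong = stress_pa / e_wrong
error = (strain_wrong - strain_true) / strain_true

print(f"True strain: {strain_true:.2e}, with wrong modulus: {strain_wrong:.2e}")
print(f"Relative error: {error:+.0%}")   # -25%: the remodeling estimate inherits this
```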

While each micromechanical simulation is small, resolving a single tissue sample requires 120,000 of them. With a supercomputer, Dr. Schneider can complete these simulations in a day; previously, on a PC cluster, one full run took about four days. Because the simulations finish more quickly, he has time to run more of them and, therefore, gains more confidence in the results.
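
Because the 120,000 simulations are independent, the saving is straightforward throughput arithmetic (the figures below are simply the ones stated above):

```python
# The time saving in one line of arithmetic: a full tissue sample is
# 120,000 independent simulations, so wall-clock time tracks throughput.

simulations_per_sample = 120_000
cluster_days = 4.0               # stated time for one full run on the PC cluster
supercomputer_days = 1.0         # stated time on the HLRS supercomputer

speedup = cluster_days / supercomputer_days
throughput = simulations_per_sample / supercomputer_days
print(f"Speedup: {speedup:.0f}x -> ~{throughput:,.0f} simulations per day")
# 4x faster: a run that took most of a work week now finishes overnight
```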

As the above examples demonstrate, supercomputing plays an important role in finding answers to issues that will ultimately better our lives. And it has played that key role for decades already. With the coming emergence of exascale supercomputing systems that are even more powerful, we’ll all get to see what these next-generation systems can accomplish, guided by the brightest minds of today and tomorrow.  

Fred Kohout, Senior Vice President of Products and Chief Marketing Officer at Cray Inc.

Image Credit: Scanrail1 / Shutterstock
