8 Big Data success stories you (probably) didn't know about.
In his comprehensive compendium of 45 company success stories, Bernard Marr describes how organizations across a wide range of industries have achieved remarkable results with a new wave of technology.
Big Data has changed how everything operates: from the way hospitals treat our diseases, to how banks process transactions and protect clients from malicious scammers, to the way shops and brands anticipate their customers' behaviour. Whatever you do in your daily life, it has probably been affected by Big Data somehow.
This article is based on Bernard Marr's book, Big Data in Practice. I will go through 8 of the companies and organizations that showcase the current state and added value of Big Data.
1. Walmart :
With over two million employees and 20,000 stores in 28 countries, Walmart is the world's largest retailer and the world's largest company by revenue.
When Hurricane Sandy struck the United States in 2012, Walmart discovered that when data was evaluated as a whole, rather than as separate individual sets, surprising insights could emerge.
Walmart has significantly expanded its Big Data and analytics department since then, ensuring that it remains on the cutting edge. The company announced in 2015 that it was in the midst of building the world's largest private data cloud, capable of processing 2.5 petabytes of data every hour.
What problem needed to be solved ?
Supermarkets compete not only on price, but also on customer service and, crucially, convenience. Having the right items in the right place at the right time, so that the right people can buy them, poses enormous logistical challenges.
How did they act ?
Walmart formed @WalmartLabs and their Fast Big Data Team in 2011 to develop and deliver innovative data-driven initiatives across the organization.
The Data Café - a cutting-edge analytics facility at Walmart's Bentonville, Arkansas headquarters - was the pinnacle of this plan. The analytics team at the Café can monitor 200 streams of internal and external data in real time, including a 40-petabyte database of recent weeks' sales transactions.
Sales across many stores in various geographical areas can also be tracked in real time. One night - it happened to be Halloween - analysts were watching sales data for novelty cookies when they noticed the cookies weren't selling at all in certain locations. They sent an alert to the merchandising staff responsible for those stores, who quickly discovered the items had never been put on the shelves. It isn't a complicated algorithm, but it wouldn't have been feasible without real-time analytics.
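The kind of zero-sales check described above can be sketched in a few lines. This is a hypothetical illustration, not Walmart's actual system - the store names, item names, and function are all made up: it scans a window of (store, item, units) readings and flags any store where an item that is expected to sell has recorded no sales at all.

```python
from collections import defaultdict

def find_stalled_items(sales_window, expected_items):
    """Flag (store, item) pairs where an item that should be selling
    has recorded zero units in the current window of readings."""
    sold = defaultdict(float)          # (store, item) -> units sold
    stores = set()
    for store, item, units in sales_window:
        stores.add(store)
        sold[(store, item)] += units
    return sorted(
        (store, item)
        for store in stores
        for item in expected_items
        if sold[(store, item)] == 0
    )

# On Halloween night, novelty cookies sell in one store but not the other:
window = [
    ("Bentonville", "novelty cookies", 12),
    ("Bentonville", "milk", 30),
    ("Springdale", "milk", 25),        # no cookie sales recorded here
]
alerts = find_stalled_items(window, ["novelty cookies"])
```

In a real deployment, an alert like this would be pushed to the merchandising staff for the flagged stores, who could then check whether the product ever made it onto the shelves.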
Another anecdote involved a store team that was perplexed as to why sales of a particular produce item were unexpectedly falling. Once the data was in the hands of the Café experts, it was quickly determined that the drop was due to a pricing error. The mistake was corrected and sales recovered within a few days.
Walmart's Social Genome Project analyzes public social media conversations and attempts to forecast what products people will buy based on what they talk about. The company also offers the Shopycat service, which predicts how people's buying habits are influenced by their friends (again using social network data), and has built its own search engine, Polaris, to analyze the search terms customers enter on its websites.
What was the impact ?
The Data Café strategy has cut the time it takes to go from identifying a problem in the data to proposing a remedy from an average of two to three weeks to roughly 20 minutes.
2. CERN :
CERN is the international scientific research organization that runs the Large Hadron Collider (LHC), the world's largest and most sophisticated physics experimental facility. The collider, housed in a 17-mile tunnel hundreds of feet beneath the surface on the border of Switzerland and France, is designed to recreate the conditions of the universe moments after the Big Bang.
The LHC alone creates over 30 petabytes of data every year - 15 trillion pages of printed text, enough to fill 600 million filing cabinets - obviously Big Data by any definition.
What problem needed to be solved ?
The LHC's sensors record hundreds of millions of collisions between particles, which travel at nearly the speed of light as they are accelerated around the collider. This produces a vast quantity of data and necessitates very sensitive and accurate technology to measure and record the outcomes.
How did they act ?
Sensors within the collider acquire data by monitoring hundreds of millions of particle collisions every second. The sensors are essentially cameras with 100-megapixel resolution, capable of capturing images at breakneck rates.
Algorithms then compare the resulting images to theoretical data describing how target particles, such as the Higgs boson, are expected to behave. If the findings match, it indicates that the sensors have detected the target particles.
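In very simplified terms, that comparison amounts to looking for an excess of recorded events over the theoretically expected background. The sketch below is a toy version of such a check, not CERN's actual analysis (real analyses use far more sophisticated statistics): it flags histogram bins where the observed event count exceeds the expected background by more than five standard deviations, using the crude sqrt(b) approximation for the background fluctuation.

```python
import math

def find_excess_bins(observed, expected_background, threshold=5.0):
    """Return indices of bins where the observed count exceeds the
    expected background by more than `threshold` standard deviations,
    approximating the background fluctuation as sqrt(background)."""
    flagged = []
    for i, (obs, bkg) in enumerate(zip(observed, expected_background)):
        sigma = math.sqrt(bkg) if bkg > 0 else 1.0
        if (obs - bkg) / sigma > threshold:
            flagged.append(i)
    return flagged

# A clear bump in the third mass bin over a flat background:
observed = [102, 98, 180, 101]
background = [100, 100, 100, 100]
bins = find_excess_bins(observed, background)
```

The five-sigma threshold mirrors the standard particle-physics convention for claiming a discovery.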
What was the impact ?
CERN scientists reported in 2013 that they believed they had detected and documented the Higgs boson's existence. This was a tremendous step forward for science, because the particle's existence had been postulated for decades but could not be verified until technology of this scale was available.
3. Shell :
In recent years, scientists have developed the notion of the data-driven oilfield in an attempt to enhance efficiency, reduce costs, and improve safety across the sector.
What problem needed to be solved ?
While efforts are being made to create more energy from renewable or alternative sources, the great majority of the energy we consume is still derived from non-renewable sources such as oil, gas, and coal.
The search for hydrocarbons requires massive quantities of resources, equipment, and energy, so it's critical that drilling takes place in the locations that will yield the best results.
How did they act ?
Traditionally, new resource development has included placing sensors into the earth to detect low-frequency seismic waves induced by tectonic activity. These waves of energy will register differently on the sensors depending on whether they are traveling through solid rock, liquids, or gaseous substances, suggesting the likely location of hydrocarbon resources.
This information is then transferred to analytics systems and compared to information from other drilling locations throughout the world. The more closely it resembles the profiles of comparable locations where significant resources have been discovered, the more likely it is that a full-scale drilling program will be profitable.
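The profile-matching idea can be illustrated with a simple similarity measure. This is a hypothetical sketch, not Shell's actual method - the site names and readings are invented: it ranks known productive sites by the cosine similarity of their seismic response profiles to a candidate site's profile.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_similar_sites(candidate, known_sites):
    """Rank known productive sites by similarity to the candidate profile."""
    scored = [(name, cosine_similarity(candidate, profile))
              for name, profile in known_sites.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Seismic response profiles (arbitrary hypothetical readings):
known = {
    "site_a": [0.9, 0.1, 0.4, 0.8],
    "site_b": [0.1, 0.9, 0.8, 0.2],
}
ranking = rank_similar_sites([0.85, 0.15, 0.35, 0.9], known)
```

The closer the top score is to 1.0, the more the candidate resembles a site where significant resources have already been found - which, per the logic above, raises confidence that a full-scale drilling program would pay off.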
Shell also uses Big Data to monitor the operation and condition of their equipment. Sensors capture data on the functioning of each piece of equipment at a drilling site using techniques pioneered in the manufacturing and engineering sectors, allowing precise estimates of its performance and likelihood of failure to be created. This enables routine maintenance to be performed more efficiently.
What was the impact ?
Although Shell is tight-lipped about the exact nature of the algorithms it uses, the company says Big Data analytics have given it more confidence than ever in its ability to predict reserves.
4. Lotus F1 Team :
Teams and race organizers are using increasingly sophisticated data-driven strategies at every level of motorsport. In this example, we'll look at the Lotus F1 Formula One team.
What problem needed to be solved ?
Data isn't a new concept in Formula One racing: telemetry has been used since the 1980s to feed live data from the car to pit line engineers.
Thomas Mayer, COO of the Lotus F1 team, says: "Formula One has always been on the cutting edge of technological development so it was natural that data analysis would become a big thing for us. It saves us time and money: instead of having to rely on trial and error, we have real data . . . "
How did they act ?
All of this data can be used to make real-time adjustments to every component of the car in order to match it to the driver's performance. During testing, the crew can decide what to adjust or change in the car's configuration, using data transmitted by the vehicle, before it returns to the track a few minutes later. Thanks to simulations and data analysis, the team can arrive at a circuit with a good idea of how the car will perform without ever having raced it there.
Just as on the track, the speed at which data travels is critical. In 2013, Lotus F1 switched to a faster storage platform for the data received from its cars, allowing the team to upload 2,000 statistics every lap. They credited this with their junior driver Marlon Stöckinger's dramatic improvement in the Formula Renault 3.5 series: in 2013 he earned 23 points and finished 18th overall; in 2014 he collected 73 points and finished 9th.
Formula One fans also create a lot of data. During the 2014 US Grand Prix, spectators transferred over 2.3 terabytes of data via mobile networks by uploading images to social media and writing about their experience.
What was the impact ?
Big Data is an important component of Lotus F1's success, helping them to improve driver and car performance and boost competitiveness.
5. John Deere :
John Deere has always been a forerunner. Its founder invented, manufactured, and sold some of the first commercially successful steel plows. These made life considerably easier for settlers arriving in the Midwest in the middle of the nineteenth century and established the company as an American legend.
What problem needed to be solved ?
The world's population is growing rapidly, which means the demand for food will keep increasing; improving the efficiency of standard crop production is key to meeting that demand.
How did they act ?
John Deere has released a number of Big Data-enabled services that let farmers examine data collected from sensors attached to their own machinery as it works in the fields. The services also let farmers take advantage of crowdsourced, real-time monitoring of data gathered from thousands of other users. Farmers can use this information to make informed decisions about everything from which crops to plant where to how much fertilizer to use.
What was the impact ?
There may be environmental benefits in addition to increased farmer income and, perhaps, cheaper, more abundant food for the globe.
6. Royal Bank of Scotland :
Prior to the 2008 financial crisis, Royal Bank of Scotland (RBS) was the world's largest bank. When its exposure to the subprime mortgage market threatened to bring the company to its knees, the UK government stepped in, acquiring 84 percent of the company's shares.
What problem needed to be solved ?
According to RBS head of analytics Christian Nelissen, banks became disconnected from their customers during the 1970s and 1980s. The emphasis was on selling products and hitting sales targets, regardless of whether they were providing customers with the services they actually needed.
How did they act ?
A simple and straightforward example, which serves as a good starting point, is personally wishing customers a happy birthday when they call a branch. That isn't Big Data analytics, but it is consistent with the bank's internally developed notion of "personology".
Systems have also been designed to inform customers individually about how they might benefit from the deals and promotions on offer. Where logging into an online account or calling customer service was previously an opportunity for the bank to push whatever services it could most profitably offload, customers now receive personalized recommendations showing exactly how much they would save by taking up a specific offer.
Furthermore, transactional data is analyzed to identify cases where customers are paying twice for financial products - for example, paying for insurance or breakdown cover that is already included in a bundled bank account.
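A duplicate-payment check of this kind can be sketched very simply. This is a hypothetical illustration, not RBS's actual system - the payee names and categories are invented: given the set of products already bundled into a customer's account, it flags any outgoing payment whose category duplicates one of them.

```python
def find_duplicate_payments(bundled_products, transactions):
    """Flag transactions that pay separately for a product already
    included in the customer's bundled account."""
    return [tx for tx in transactions
            if tx["category"] in bundled_products]

# A packaged account that already includes breakdown cover:
bundle = {"travel_insurance", "breakdown_cover"}
payments = [
    {"payee": "AutoRescue Ltd", "category": "breakdown_cover", "amount": 9.99},
    {"payee": "GroceryMart", "category": "groceries", "amount": 54.20},
]
duplicates = find_duplicate_payments(bundle, payments)
```

The hard part in practice sits upstream of this check: classifying free-text transaction descriptions into reliable product categories in the first place.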
7. General Electric :
GE grew out of Thomas Edison's innovations in the second half of the nineteenth century, which brought electric light and machinery into homes and businesses for the first time. The company was the first privately owned corporation to invest in computer hardware and has been at the cutting edge of technology for more than a century. Its hardware is now used to generate a fifth of the world's electricity.
What problem needed to be solved ?
Downtime of critical machinery can immediately result in revenue loss, and costly human resources must be committed to system upkeep and maintenance.
How did they act ?
Data acquired by sensors installed in machinery across all sectors is analyzed to provide information on how the equipment is operating. This means the impact of small changes - such as operating temperature or fuel level - can be closely tracked and correlated with every other metric that is gathered.
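One simple way to turn such sensor streams into maintenance alerts is anomaly detection against a rolling baseline. The sketch below is a generic illustration, not GE's actual algorithm - the function, window size, and temperature readings are all assumptions: it flags any reading that deviates from the mean of the previous few readings by more than a few standard deviations.

```python
import statistics

def maintenance_alerts(readings, window=5, threshold=3.0):
    """Flag indices where a reading deviates from the rolling mean of
    the previous `window` readings by more than `threshold` times the
    rolling standard deviation."""
    alerts = []
    for i in range(window, len(readings)):
        prev = readings[i - window:i]
        mean = statistics.mean(prev)
        spread = statistics.pstdev(prev) or 1.0   # avoid division by zero
        if abs(readings[i] - mean) > threshold * spread:
            alerts.append(i)
    return alerts

# Turbine operating temperatures: stable, then a sudden spike.
temps = [70.0, 71.0, 70.0, 69.0, 70.0, 71.0, 95.0]
alerts = maintenance_alerts(temps)
```

An alert like this would prompt an inspection before the anomaly turns into a failure - exactly the kind of early warning that reduces unplanned downtime.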
What was the impact ?
Although GE has not released overall figures, it has stated that its industrial customers can expect to save an average of $8 million per year simply by reducing equipment downtime.
8. Autodesk :
Autodesk is a software publisher based in California that specializes in the development of commercial computer-aided design (CAD) tools. Beginning with AutoCAD, they have progressed to create specialized software for certain sectors of design and architecture, such as Revit (construction), Moldflow (production), and Maya (graphics and effects for entertainment media). In several of these areas, their products have become industry standards.
What problem needed to be solved ?
Customer surveys and feedback forms included in the product packaging were traditionally the only channels for receiving input, but software makers were always aware that only a small percentage of users would ever use them.
How did they act ?
In addition to hosting products in the cloud and collecting extra usage data, the company makes early, pre-release builds of several of its popular products available through its Autodesk Labs service. This gives the company vital insight into the features and capabilities customers want to see in future releases, as well as into new plugins and extensions for existing products.
What was the impact ?
The most significant impact has been the increase in the speed with which insights into user behavior can be gleaned, and hence the decrease in the time before action can be taken. This has led to a significant reduction in the time between an issue showing up in user activity and a fix being deployed.
Conclusion :
Big Data can be a great asset to any organization that starts gathering and analyzing data to answer real questions and help itself thrive. I think the case studies in Marr's book demonstrate how these ideas are put to effective use.
Regretfully, however, many firms get lost in the Big Data hype and end up stockpiling data in the vague belief that it will one day be valuable. The successful case studies help us avoid that trap.


