The servers collect data on machine states, error codes, temperature, humidity, and pressure, with remote management that precludes the need for on-site human operators. The supervisory systems collect, aggregate, and visualize equipment data in graphical dashboards that operators can monitor and control remotely. Built for edge deployment, the servers feature advanced airflow and cooling systems that prevent overheating and have specially designed dust filters to keep coffee grounds from entering the servers. Instead of replacing entire PCs, the company now simply cleans the filters.
Strauss Coffee’s solution is just one example of how organizations are deploying edge computing, a distributed processing architecture that is expected to unlock trillions of dollars’ worth of innovative new applications over the next decade.
More than sensors
Many people are confused about what edge computing is. A common misconception is that the edge is all about gathering data from intelligent devices. While sensors and cameras are often part of the equation, most experts prefer Gartner’s broader definition of edge computing as “part of a distributed computing topology where information processing is located close to … where things and people produce or consume information.”
Edge environments often include powerful servers and even small data centers. These components complement central clouds, which aren’t appropriate for every data processing scenario.
For example, game-changing applications such as computer vision and augmented reality require massive amounts of compute power and rapid response times. Sending all that data back and forth to a cloud data center hundreds of miles away is neither practical nor efficient.
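A rough back-of-envelope calculation shows why distance matters. The figures below are illustrative assumptions: light in optical fiber travels at roughly two-thirds the speed of light, about 200,000 km/s, and real networks add routing, queuing, and processing delays on top of this physical floor.

```python
# Minimum round-trip propagation delay over optical fiber (illustrative).
# Ignores routing, queuing, and processing delays, which add substantially more.
FIBER_SPEED_KM_PER_S = 200_000  # ~2/3 the speed of light in a vacuum

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds for a given one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

print(f"Cloud data center 500 km away: {round_trip_ms(500):.2f} ms minimum")
print(f"Edge site 5 km away:           {round_trip_ms(5):.2f} ms minimum")
```

Even before any real-world overhead, a distant data center imposes a latency floor that a nearby edge site avoids by two orders of magnitude, which is why applications needing split-second responses process data locally.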
“There’s a lot more data being created than ever before,” says Charles Ferland, Vice President and General Manager of Edge Computing and Communications Service Providers at Lenovo. “It’s impossible to transport it efficiently over a long distance for processing. You need a way to distribute in a hybrid way, with some at the edge for real-time processing and some in the data center for long-term storage.”
Edge architectures take advantage of continuous advances in hardware price/performance to move processing as close as possible to where data is created or used, enabling split-second decisions.
The edge market is poised for explosive growth. Gartner expects more than 15 billion IoT devices to connect to enterprise infrastructure by 2029, and Network World’s 2020 State of the Network survey found that 69% of businesses have deployed, are piloting, or are actively researching edge computing architectures.
Use cases are proliferating:
- In industrial automation scenarios like manufacturing and logistics, smart devices can track and optimize truck routes or guide robots to pick and place items in a warehouse. High-definition cameras, monitored by machine learning applications, can spot safety issues such as liquid spills, and sensors can flag equipment at risk of failure so maintenance can be performed proactively.
- Street-level sensors in smart cities can monitor traffic, detect collisions, and speed response times of emergency vehicles to crash sites.
- Drones equipped with high-definition cameras can inspect equipment in high-risk areas such as under bridges and atop cell towers to identify and even fix problems.
- Augmented reality can deliver contextual information to tourists marveling at a medieval cathedral, utility workers repairing power lines, ship captains navigating unfamiliar harbors, and countless other scenarios.
- Wearable and ingestible devices let healthcare providers continuously monitor patients at home, while robots allow surgeons to operate from hundreds of miles away.
- Insurance companies can tap into constellations of miniature weather stations to provide early warning of catastrophic events such as tornadoes and hailstorms.
Data growth drives new uses
A dominant factor underlying the growth of edge computing is the rapid increase in data volumes. Streaming data from sensors, video cameras, online transactions, and social media posts will make up more than 30% of all data generated in 2025, according to IDC, while the number of smart devices is expected to more than triple to 24 billion in 2030, according to Transforma Insights.
Gartner predicts that three-quarters of enterprise-generated data will be created and processed at the edge by 2025, up from just 10% in 2018. Small data centers and even individual servers in the field will make increasingly sophisticated decisions at the point of action while uploading aggregated data to regional nexus points and the cloud for analytical processing.
All this will place ever-greater demands on IT infrastructure. Latency, or response time, will be a key factor in the successful deployment of applications that require split-second decisions, ranging from avoiding collisions to recommending add-on purchases at the point of sale.
These dynamics are driving an evolution from centralized processing to multitier architectures, with data managed both at the point of generation and farther upstream in regional data centers and cloud servers.
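The multitier split can be sketched in a few lines of Python. The EdgeNode class, its threshold, and its batch size below are hypothetical stand-ins, not any vendor's API: the node acts on each reading immediately (the real-time tier) and forwards only compact summaries upstream (the regional/cloud tier), rather than shipping every raw reading over the network.

```python
import statistics
from dataclasses import dataclass, field

@dataclass
class EdgeNode:
    """Illustrative edge node: acts locally in real time, uploads only aggregates."""
    alert_threshold: float                        # act immediately above this value
    batch_size: int = 5                           # readings summarized per upload
    _buffer: list = field(default_factory=list)
    uploads: list = field(default_factory=list)   # stand-in for the upstream tier

    def ingest(self, reading: float) -> bool:
        """Return True if the reading demands a split-second local action."""
        self._buffer.append(reading)
        if len(self._buffer) == self.batch_size:
            # Forward a compact summary upstream instead of every raw reading
            self.uploads.append({
                "count": len(self._buffer),
                "mean": statistics.mean(self._buffer),
                "max": max(self._buffer),
            })
            self._buffer.clear()
        return reading > self.alert_threshold

node = EdgeNode(alert_threshold=80.0)
actions = [node.ingest(t) for t in [72.1, 73.4, 85.2, 71.9, 74.0]]
print(actions)        # the out-of-range reading triggers an immediate local action
print(node.uploads)   # one summary record goes upstream, not five raw readings
```

The design choice is the same one the architecture describes: latency-sensitive decisions happen at the point of data generation, while the upstream tiers receive aggregated data suited to long-term storage and analytics.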
“In order to scale applications and workloads practically, some of the ubiquitous services that used to be centralized need to move to the edge,” says Blake Kerrigan, General Manager of Edge Computing in the Emerging Business Group at Lenovo. To meet the growing need for faster processing, stronger security, and scalability close to where data is created and consumed, Lenovo now offers the ThinkEdge portfolio of purpose-built edge devices.
Where are the biggest edge opportunities right now? They’re virtually everywhere. And with high-speed 5G networks and specialized AI processors only beginning to make their impact felt, it’s safe to say the opportunities will proliferate faster than ever from here.