Technology

How edge computing can boost business efficiency


Edge computing is about processing data as close to the source as possible, which reduces both latency and bandwidth use. This concept is seen as critical for furthering the Internet of Things and for driving the development of autonomous vehicles.

What is edge computing?

Edge computing is a decentralized approach to networked computing, the opposite of cloud computing’s centralized model. Rather than sending all of a network’s data to a distant data center, processing and storage are pushed out toward the devices that generate the data. For businesses, this often means handling data on local, private servers at or near each site instead of routing everything through the cloud.

Edge computing is especially useful in cases where a lot of data is generated. The approach allows for the successful triage of data locally so that some of it is processed locally, reducing the backhaul traffic to the central data repository. This is very useful in cases where many devices are connected together, as with the Internet of Things.
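The triage idea described above can be sketched in a few lines. This is a hypothetical illustration, not any particular product’s logic; the threshold and function names are invented for the example.

```python
# Hypothetical sketch: triage sensor readings at the edge so that only
# significant readings are forwarded to the central data repository.
THRESHOLD = 75.0  # illustrative cut-off for "needs central attention"

def triage(readings):
    """Split raw readings into locally handled data and backhaul traffic."""
    local, forward = [], []
    for r in readings:
        (forward if r > THRESHOLD else local).append(r)
    return local, forward

local, forward = triage([42.0, 80.5, 61.2, 99.9])
# Only 2 of 4 readings leave the edge, halving backhaul traffic.
```

In a real deployment the "forward" branch would push to a message queue or cloud endpoint, but the principle is the same: most data is handled where it is produced.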

Edge computing helps to make the Industrial Internet of Things possible. This is an area of great value: McKinsey & Co. calculate that the Industrial Internet of Things will generate $7.5 trillion in value by 2025. The advantage lies in connecting people to machine data in ways that accelerate digital industrial transformation.

How can edge computing benefit business?

The main advantages of edge computing are that data takes less time to move and that hardware limitations are easier to address. With conventional centralized storage systems, dedicated hardware is normally required, and this can create a bottleneck that restricts how much data can be moved at any point in time. Relying on such hardware also leads to slower data transfer speeds.

Furthermore, the costs of operating and maintaining that hardware are comparatively high.

Security is also stronger with edge computing, making edge systems harder for hackers to penetrate. This is partly because data is continually moving between network nodes rather than sitting in one central store.

When data moves through a network, it passes through different security layers designed to keep hackers out, and edge computing goes beyond this. Additional security layers can be applied because, instead of data simply moving between network nodes, it travels from the Internet into local servers and then on to the nodes. This provides opportunities to add further firewalls and antivirus scanning along the way.

How are businesses using edge computing?

Businesses can derive many advantages from the edge computing concept. The edge process enables analytics and data gathering to occur at the source of the data. This enables companies to leverage resources from devices that are not necessarily continuously connected to a network, such as laptops, smartphones, tablets and sensors.

Autonomous vehicles and edge computing

Among the more specific examples is autonomous car technology. Autonomous vehicles are, in a sense, datacenters on wheels, and edge computing plays a key role in handling the high volumes of data they collect. Intel estimates that autonomous cars, with their many on-vehicle sensors, generate over 40 terabytes of data for every eight hours of driving. Since data at this scale cannot easily be sent to the cloud (and the resulting delay would present a safety risk in terms of slowed reactions), edge computing becomes a necessity.
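Intel’s figure implies a sustained data rate far beyond what any vehicle uplink could carry, which is easy to verify with a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope check of Intel's figure: 40 TB per 8 hours of driving.
terabytes = 40
seconds = 8 * 3600  # 28,800 seconds of driving

bytes_total = terabytes * 10**12
rate_gbytes_per_s = bytes_total / seconds / 10**9  # GB per second
rate_gbits_per_s = rate_gbytes_per_s * 8           # Gbit per second

print(round(rate_gbytes_per_s, 2))  # ~1.39 GB/s
print(round(rate_gbits_per_s, 1))   # ~11.1 Gbit/s
```

A sustained ~11 Gbit/s is an order of magnitude beyond typical cellular uplinks, which is why the processing has to happen onboard.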

Security cameras and edge computing

A second example is security systems. If a large complex is served by dozens of high-definition Internet of Things video cameras continuously streaming their signal to a cloud server, the system can be slow to respond, especially if the security protocol is designed to react to motion detection. This set-up also places a major strain on the building’s Internet infrastructure, with a high proportion of the bandwidth consumed by video footage.

With the edge concept, each camera would have an independent internal computer to run the motion-detecting application and then send footage to the cloud server only as needed. This improves efficiency and lowers bandwidth use.
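The camera logic described above can be sketched as follows. This is a simplified, hypothetical illustration: `detect_motion` uses a naive frame-difference check, and the upload step is only counted rather than calling a real cloud API.

```python
# Hypothetical sketch of edge camera logic: run motion detection locally
# and upload footage only when something actually changes in the scene.

def detect_motion(prev_frame, frame, threshold=30):
    """Naive motion check: mean absolute pixel difference between frames."""
    diff = sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)
    return diff > threshold

def process(frames):
    """Count how many clips would leave the edge for the cloud server."""
    uploads = 0
    for prev, cur in zip(frames, frames[1:]):
        if detect_motion(prev, cur):
            uploads += 1  # in practice: push this clip to the cloud server
    return uploads

# Three identical frames followed by a large change:
# only one upload ever leaves the edge.
frames = [[10] * 4, [10] * 4, [10] * 4, [200] * 4]
```

Real systems use far more robust detection (background subtraction, neural networks), but the bandwidth saving comes from the same structure: the decision runs on the camera, not in the cloud.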

Fleet management and edge computing

Edge computing also helps to improve the efficiency of fleet management. A large volume of key performance data needs to be collected – on wheels, brakes, batteries and electrical systems – and where such data requires an immediate response, as with a potential brake failure, some of it needs to be collected and processed locally at the edge in order to minimize the risk of vehicle breakdown or accident.

An example of edge computing applied to fleet management is trailer temperature. With most fleet monitoring systems, only temperature readings outside a set range are reported back to fleet managers through telematics, and the fleet manager then needs to assess whether or not there is a problem. With edge analytics, however, temperature readings can be analyzed onboard the vehicle and the driver notified directly, empowering the driver to take steps to mitigate the temperature fluctuation.
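The onboard check might look something like the sketch below. The safe range and alert messages are invented for illustration; a real refrigerated trailer would use setpoints specific to its cargo.

```python
# Hypothetical sketch: analyze trailer temperature onboard and alert the
# driver immediately, rather than waiting for a fleet manager to react to
# an out-of-range telematics report.

SAFE_RANGE = (-20.0, -15.0)  # °C, illustrative range for frozen goods

def onboard_check(reading):
    low, high = SAFE_RANGE
    if reading < low:
        return "ALERT: too cold, check refrigeration setpoint"
    if reading > high:
        return "ALERT: too warm, inspect trailer unit"
    return None  # in range: nothing needs to leave the vehicle

readings = (-18.2, -14.1, -17.9)
alerts = [msg for t in readings if (msg := onboard_check(t))]
# One reading is out of range, so the driver gets exactly one alert.
```

Note that the two in-range readings generate no traffic at all, which is the bandwidth-saving half of the argument.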


Technology

Purolator to design tech-driven courier hub in Toronto


A new national technology hub is being designed and built by courier service provider Purolator in Toronto, reflecting growth in the area’s e-commerce sector. The $339 million project will extend to 60 acres.

Purolator Inc.’s new super-hub in Toronto is set to open in 2021 and the expectation is that it will triple the capacity of the courier provider’s network. The development of the hub forms part of a wider $1-billion investment the company intends to inject into Canada across the next five years. Included in the plans is the intention to upgrade the vehicle fleet, taking advantage of more advanced technology. The central hub will help to coordinate Purolator’s 172 operations facilities and 111 shipping centres.

Also included in the scheme are plans to focus on the customer, including improving the online experience by making the main website easier to navigate. There will also be innovations in automation, and the hub will be designed to meet the Toronto Green Standard program (which details Toronto’s sustainable design requirements).

Quoted by Bloomberg, Purolator CEO John Ferguson says that the announcement “is one of the most ambitious in our company’s history and will future-proof our business. Purolator has experienced record growth over the past three years. We picked up and delivered over one quarter of a billion packages in 2018 and we expect our growth trajectory to continue.”

The Purolator hub is just one of several innovations making use of Toronto’s growing technology infrastructure. E-commerce company Shopify plans to increase its operations and to employ more staff over the next three years in the city.

Toronto’s cultural and economic diversity has fueled the city’s rapid growth in a number of high-tech areas, particularly for startups and developments in areas like artificial intelligence. This reflects the Canadian government’s plans to build and keep successful startup ecosystems, especially in the Toronto area.


Investment

Connecting with ‘US:’ The necessity and value of the Internet of Things

Done right, the Internet of Things is the Internet of Us, connecting the physical and digital in a human-centered way that improves the world intelligently.

Cognizant


By Frank Antonysamy, Vice President of Cognizant’s Global IoT and Engineering Services

U.S. food safety has been a concern since the days of Upton Sinclair’s classic novel about the stockyards and meatpacking industries in Chicago. Public reaction to The Jungle compelled Teddy Roosevelt and the U.S. Congress to pass food safety laws and establish the U.S. Food and Drug Administration in 1906.

More than a century later, threats clearly remain to the safety of domestic and global food supplies and the purity of water sources. Recently, we’ve learned about significant, ongoing, even deadly threats to our food and water. Food recalls have ranged from romaine lettuce to beef in the last 12 months; the tragedy in Flint, Mich., reminds us that poisonous chemicals still make their way into our water, as well. Faulty equipment or poorly executed processes often are to blame.

[Read more: The State of the Union for IoT Intelligence]

Solving Safety Challenges with Internet of Things

It doesn’t have to be this way. As the Internet of Things (IoT) begins to permeate our global infrastructure, sensor-equipped devices will soon outnumber the global population. There’s no reason to wait until communities face a food- or water-borne threat before fixing malfunctioning equipment or improving safety procedures.

Today we can automatically and rapidly glean information from IoT-enabled devices – about temperatures in IoT-equipped food storage and transportation equipment, for example, or the chemicals sensed by the pumps that filter and move our water, or the monitoring capabilities of the medical devices we increasingly rely on in hospitals and the home. With such intelligence, communities and businesses can address problems before they become a threat.

[Download]: Advancing Smart Manufacturing Operations Value with Industry 4.0

Increasing Food Safety on a Massive Scale

Recently, I had a conversation with Internet of Things maven Stacey Higginbotham on one of her Stacey on IoT podcasts. We discussed Cognizant’s work with Internet of Things adoption, and the ways in which these solutions can help businesses and the people they serve.

We talked about how one of the world’s largest sellers of fresh and frozen foods uses IoT-enabled refrigerators and freezers to reduce food spoilage across its global supply chain. Such spoilage not only results in financial losses due to food waste, but can also present risks to consumers. Although the business had already implemented alarms on the refrigeration systems in its distribution centers to signal malfunctions, it could take 36 hours for the maintenance operations team to respond – clearly too long when it comes to food safety and waste. There was also no mechanism to proactively monitor the refrigeration units and ensure timely service calls.

Our solution minimizes energy consumption and seeks to ensure consumer safety. It ties together sensors, cloud-based monitoring, algorithms that trigger alerts and warnings, reminders in handheld applications and a direct link of performance data to individual employees to encourage compliance with the company’s internal food safety protocols. The system covers hundreds of freezers, thousands of deliveries, 600 million data points and millions of pounds of food.

The results have been impressive. After rolling out the system to 100 of its stores, the business reduced priority response times from 36 hours to four hours, and decreased food loss by 10% in the first year by predicting refrigeration failures. The company aims to expand the system to 5,300 stores, with the potential to reduce operating costs by up to $40 million while ensuring the safe storage of food. (Hear more about this solution in the three-minute podcast recording below.)

[Download]: Designing Manufacturing’s Digital Future

From Providing Pumps to Offering Insights

These same principles guided our solution for a global manufacturer of high-technology industrial water pumps used in a range of applications, from providing drinking water for cities and villages, to processing waste water, to clearing and filtering the huge volumes of water moved during deep-sea drilling.

With the movement of all that water through its sensor-equipped and self-monitoring pumps, the manufacturer had access to a flood of information on everything from performance-based data on pressure and volume to the chemical composition of the water. By collecting and analyzing this information, the company could leverage and monetize its insights into not just equipment performance but also the safety of the water it delivers. If a certain chemical spikes in the water supply, for example, alerts are triggered, and municipalities can investigate. If water pressure or volume falls outside set parameters, precautions can be taken, including automatic alerts and even preemptive shutdowns.
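The alerting logic described here can be sketched simply. This is a hypothetical illustration of the pattern, not the manufacturer’s actual system; the chemical, limits and field names are all invented for the example.

```python
# Hypothetical sketch: evaluate a pump telemetry sample and decide which
# actions to trigger, mirroring the pattern described above (chemical
# spike -> alert the municipality; pressure out of range -> shut down).

CHLORINE_MAX_PPM = 4.0        # illustrative chemical limit
PRESSURE_RANGE = (2.0, 6.0)   # illustrative safe range, in bar

def evaluate(sample):
    """Return the list of actions a monitoring platform would trigger."""
    actions = []
    if sample["chlorine_ppm"] > CHLORINE_MAX_PPM:
        actions.append("alert_municipality")
    pressure = sample["pressure_bar"]
    if not PRESSURE_RANGE[0] <= pressure <= PRESSURE_RANGE[1]:
        actions.append("preemptive_shutdown")
    return actions

# A sample with both a chemical spike and low pressure triggers both actions.
actions = evaluate({"chlorine_ppm": 5.2, "pressure_bar": 1.4})
```

The monetizable insight is exactly this kind of derived signal: customers subscribe not to raw sensor feeds but to the evaluated alerts.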

Buyers of the pumps want this information. So, while using this data to improve the performance of its products, the business can also share insights with its clients on a subscription basis, opening up new revenue streams. The business is no longer just providing world-class high-tech pumps; it’s offering customers critical insights from the pumps it sells, as a value-added service. (Hear more about this solution in the three-minute podcast recording below.)

[Download]: Advancing Smart Manufacturing Operations Value with Industry 4.0

Connecting Things; Connecting to Our Needs

What links these two examples is their prioritization of real human needs as part of the solution. Clean and safe food and water are vital to human health, and companies that help provide them add value.

For many years, large industrial enterprises have lived in two separate worlds: the world of all their physical assets (factories, equipment, buildings, people) and the world of their digital assets (software, workflows, algorithms, reports). Through sensor technology, network capability, security advances and IoT platforms, these two worlds are now becoming seamlessly integrated like never before.

Today, the shorthand for this ongoing integration is the Internet of Things. In reality, though, it’s the Internet of Us. Technology offers us a path to connect our physical world with a digital one, in which we occupy a new space and a new future: a place where the physical and digital come together, enabling businesses to transform their operational and business models, in a scalable way, through intelligence. (Hear more on the Internet of Us in the three-minute podcast recording below.)


Healthcare

AI technology from IBM detects breast cancer risk before it happens


IBM has taken a step forward in disease prevention by designing an artificial intelligence technology that can predict the risk of breast cancer developing up to one year before the first signs of cancer appear.

With this new step in medical diagnosis, IBM has developed an artificial intelligence model capable of predicting malignant breast cancer within a year with an 87 percent accuracy rate (when the machine’s output is compared with assessments by expert radiologists). In addition, the technology could correctly identify 77 percent of non-cancerous cases.

The prediction method uses both mammogram images and medical records to make its assessment. This is the first application of AI to draw upon both images and clinical data to make a prediction in relation to breast cancer.

At the heart of the deep neural network technology is an algorithm, which was trained by IBM technologists along with medical professionals from Israel’s largest healthcare organizations. The training of the artificial intelligence took place using anonymized mammography images which were linked to biomarkers (like patient reproductive history) together with clinical data. The training data-set consisted of 52,936 images from 13,234 women who underwent at least one mammogram between 2013 and 2017.

The aim of the medtech is not to replace the physician, but to act as a ‘second pair of eyes’, providing a backup in the event that something has been missed through conventional patient assessment. This could prove especially useful in areas with staff shortages where a second medical professional is not available to provide a second assessment.

An assessment of the technology has been published in the journal Radiology. The research paper is titled “Predicting Breast Cancer by Applying Deep Learning to Linked Health Records and Mammograms.”

In related news, IBM is applying artificial intelligence to catch Type 1 diabetes much earlier. IBM’s other health technology project could help identify patients at risk and help chart a course for tracking the condition. The predictive tool is a joint project between IBM and JDRF (formerly known as the Juvenile Diabetes Research Foundation).
