Edge computing is about processing data as close to the source as possible, which reduces both latency and bandwidth use. This concept is seen as critical for furthering the Internet of Things and for driving the development of autonomous vehicles.
What is edge computing?
Edge computing is a decentralized approach to networking, in contrast to cloud computing’s centralized model. The concept relates to where a network processes and stores its information: instead of sending all data to a distant central data center, computation and storage are moved to the edge of the network, close to the devices that generate the data. For businesses, this often means handling data on a local or private server rather than in the public cloud.
Edge computing is especially useful in cases where a lot of data is generated. The approach allows for the successful triage of data locally so that some of it is processed locally, reducing the backhaul traffic to the central data repository. This is very useful in cases where many devices are connected together, as with the Internet of Things.
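The triage idea described above can be sketched in a few lines of Python. This is an illustrative toy, not a real edge framework: the function name, threshold, and the notion of what counts as an "anomaly" are all assumptions made for the example.

```python
# Hypothetical sketch of edge-side data triage: handle routine readings
# locally and forward only anomalies to the central data repository.

def triage(readings, threshold=100.0):
    """Split sensor readings into locally handled and forwarded sets."""
    local, forward = [], []
    for value in readings:
        if value <= threshold:
            local.append(value)    # routine: processed on the edge device
        else:
            forward.append(value)  # anomaly: sent upstream for analysis
    return local, forward

local, forward = triage([42.0, 87.5, 130.2, 99.9, 250.0])
print(f"processed locally: {len(local)}, forwarded: {len(forward)}")
```

Only the two out-of-range readings travel over the backhaul link; the rest never leave the device, which is where the bandwidth saving comes from.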
Edge computing helps to make the Industrial Internet of Things possible, and this is an area of great value: McKinsey & Co. calculates that the Industrial Internet of Things will generate $7.5 trillion in value by 2025. The advantage here is connecting people to machine data in ways that accelerate digital industrial transformation.
How can edge computing benefit business?
The advantages of edge computing are that it takes less time to move data and that hardware limitations are more easily addressed. With conventional storage systems, dedicated hardware is normally required, and this can create a bottleneck that restricts how much data can be moved at any point in time. That hardware also leads to slower data transfer speeds, and the costs of operating and maintaining it are comparatively high.
Security can also be stronger with edge computing, making edge computing systems harder for hackers to penetrate, because data is continually moving between network nodes.
When data moves through a network, it passes through different security layers designed to keep hackers out of the system, and edge computing goes beyond this. Because data travels from the Internet onto local servers and then on to the nodes, rather than simply between network nodes, there are opportunities to add further firewalls and antivirus scans along the way.
How are businesses using edge computing?
Businesses can derive many advantages from the edge computing concept. The edge process enables analytics and data gathering to occur at the source of the data. This enables companies to leverage resources from devices that are not necessarily continuously connected to a network, such as laptops, smartphones, tablets and sensors.
Autonomous vehicles and edge computing
Among the more specific examples is autonomous car technology. These vehicles are, in a sense, datacenters on wheels, and edge computing plays a key role in handling the high volumes of data they collect. Intel estimates that autonomous cars, with their many on-vehicle sensors, generate over 40 terabytes of data for every eight hours of driving. Given that this volume of data cannot easily be sent to a cloud (and that the resulting delayed reactions present a safety risk), edge computing becomes a necessity.
Security cameras and edge computing
A second example is security systems. If a large complex is served by dozens of high-definition Internet of Things video cameras that continuously stream their footage to a cloud server, the system can be slow to respond, especially if the security protocol is designed to react to motion detection. This set-up also places a major strain on the building’s Internet infrastructure, with a high proportion of the bandwidth consumed by the high volume of video footage.
With the edge concept, each camera has an independent internal computer that runs the motion-detecting application and then sends footage to the cloud server only as needed. This improves responsiveness and lowers bandwidth use.
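The camera logic above can be sketched as a simple frame-difference loop. This is an illustrative Python toy under stated assumptions, frames reduced to flat lists of pixel intensities and an arbitrary threshold, not a real video pipeline:

```python
# Hypothetical sketch of edge-side motion detection: compare consecutive
# frames and "upload" a frame only when the change exceeds a threshold.

def motion_detected(prev_frame, frame, threshold=10.0):
    """True if the mean absolute pixel difference exceeds the threshold."""
    diff = sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)
    return diff > threshold

def process_stream(frames, threshold=10.0):
    """Return only the frames worth sending to the cloud server."""
    uploads = []
    prev = frames[0]
    for frame in frames[1:]:
        if motion_detected(prev, frame, threshold):
            uploads.append(frame)  # bandwidth is spent only on motion
        prev = frame
    return uploads

still = [50] * 8   # a static scene
moving = [200] * 8 # a frame with large pixel change
print(len(process_stream([still, still, moving, still])))
```

In this sketch, only the two frames adjacent to the change are uploaded; the unchanged frames are discarded on the camera itself.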
Fleet management and edge computing
Edge computing also helps to improve the efficiency of fleet management. A large volume of key performance data needs to be collected, covering wheels, brakes, battery and electrical systems. Where such data requires a response, as with a potential brake failure, some of it needs to be collected and stored locally on the edge in order to minimize the risk of vehicle breakdown or accident.
An example of edge computing applied to fleet management is trailer temperature. With most fleet monitoring systems, only temperature readings that fall outside a set range are reported back to fleet managers through telematics, and the fleet manager then needs to assess whether or not there is a problem. With edge analytics, however, temperature readings can be analyzed onboard the vehicle and the driver notified directly, empowering the driver to take steps to mitigate the temperature fluctuation.
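The onboard check described above can be sketched as follows. This is a hypothetical example: the safe range, function name and alert mechanism are all illustrative assumptions, not taken from any real telematics product.

```python
# Hypothetical onboard temperature triage for a refrigerated trailer.
# The safe range below is an assumed refrigerated-goods range in Celsius.

def out_of_range(temp_c, low=2.0, high=8.0):
    """True if a reading falls outside the assumed safe range."""
    return temp_c < low or temp_c > high

readings = [4.1, 5.0, 9.3, 3.2]

# Edge analytics: each reading is checked onboard. Out-of-range readings
# trigger an immediate driver alert and a telematics report; in-range
# readings never leave the vehicle.
alerts = [t for t in readings if out_of_range(t)]
print(f"driver alerted for readings: {alerts}")
```

The design choice mirrors the article: the driver learns about the 9.3 °C excursion immediately on the edge, rather than waiting for a fleet manager to notice a telematics report.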
How to accelerate business change using data and AI
AI poses as many challenges as opportunities. Here are a few common characteristics we’ve seen among businesses that have realized success with data and AI.
By Bret Greenstein
Most senior leaders would agree that adapting in real time to customer and market needs is vital for achieving their visions and goals. And to develop this capability, they’d also likely agree they need data and artificial intelligence (AI).
The examples are all around us. The city of Chicago uses 12 variables, including high daily temperatures, to prioritize which of the city’s 7,000 “high-risk” restaurants it should send its 35 food inspectors to. The AI solution found violations a week earlier than inspectors otherwise would have.
We helped an insurer create an AI system that provides underwriters with more precise estimates of risk, as well as the likelihood applicants will accept a specific price quote. The insurer can adjust how the AI environment makes its approval and pricing recommendations to ensure compliance with changing corporate priorities around risk vs. revenue.
We’ve also seen AI help predict customers’ emerging needs as markets change. If the economy goes into a recession, informed analysis could help organizations not only cut costs but also provide the products and services customers might need in a downturn.
And, of course, there are the businesses that have used data and AI to create new revenue streams and business models that drive lasting competitive advantage. For game changers like Uber, data is at the core of the company and constitutes value, not cost.
Facing up to AI realities
But there’s another hard fact that would draw consensus from many business leaders today: the unprecedented effort required to move an enterprise in an AI direction. The fact is, AI-enabled business change requires as much alteration to corporate culture, organizational structures and processes, and workforce roles and skills as it does new technology.
For example, too many organizations still treat data as an expense and a security risk. Technical or organizational siloes make it difficult to pool information in flexible data lakes. Many businesses also lack the skills to manage AI-enabled analytics amid rising privacy and ethical concerns. Others are unable to provide audit trails on how AI decisions were made or deliver AI-enabled analytics quickly enough.
AI success factors
Here are a few common characteristics we’ve seen among businesses that have realized success with data and AI:
- Senior leadership is passionate about the value of data. CIOs can play a big role here. They need to work with the CEO and business leaders to resolve the tension between business managers who want more access to data and the pressure to restrict such access to reduce cost and risk.
- Data scientists embedded with business users. By working more closely with business users, data scientists can better understand the business context behind their requests. Business managers may say they need “X,” but what they really need is a solution to a problem, which may or may not be “X.”
- The establishment of an AI oversight role. Because AI systems learn and improve, they can produce different results at different times based on the data they’re given and the algorithms applied. This means they require closer oversight than traditional systems that are coded, tested and summarily released – and then left alone. In some ways, managing an AI system is like raising a child. You need to be sure they’re not being trained with intentionally or unintentionally biased data or by malicious users. If a loan application is denied, for example, you need to know the decision was ethically sound.
- Plenty of high-quality data. The more high-quality data, the better and more quickly the AI system can learn. To supply that data, organizations must refine their processes for ongoing data preparation, integration and pipelining. This essential groundwork is the hardest part in building an AI system.
- Modernized infrastructure. An infrastructure based on application programming interfaces (APIs) and Agile development techniques makes it easier to quickly deliver new AI-based applications. This more flexible infrastructure enables organizations to better access needed data from outside the enterprise, and more effectively monetize the resulting insights by sharing them across the ecosystem. For instance, we worked with one of the world’s largest and busiest airports to revamp its technology infrastructure to increase airport efficiency and performance. The organization combined data and AI to unlock more capacity to serve airlines and reduce passenger misconnects.
Our final recommendation: Start now. AI can be difficult and complex, and it’s hard to catch up once you’ve fallen behind. Even modest successes will teach you a lot, and the real danger in digital is being a laggard.
Cognizant (Nasdaq: CTSH) is dedicated to helping the world’s leading companies build stronger businesses — helping them go from doing digital to being digital.
How BMO branch technology saves employees up to 30 minutes per day
When it comes to banking, it is little surprise that customers increasingly prefer tellerless interactions.
A recent customer insight report from Mercator Advisory Group found that those who don’t like using mobile and online banking prefer to use self-service kiosks at physical branch locations.
Even back in 2015, a study by Source Technologies found that self-service retail banking kiosks improve operations, “reducing the time it takes to get an official check from nine minutes (using a teller) to 40 seconds – 13.5 times faster than a teller-conducted transaction.”
When banks invest in features like remote authentication and mobile deposits, it isn’t just customers who benefit — staff are able to better focus on more complex transactions, and developing relationships with clients.
“We see that more and more of our customers are migrating toward self-serve interactions, especially for the simpler, straightforward transactions,” explained Kyle Barnett, BMO’s chief operating officer for US personal and business banking, in an interview with PYMNTS.
One of the technologies BMO implemented was a faster, real-time process for scanning and depositing cheques, saving customers from having to fill out a paper deposit slip. This has led to deposits clearing within hours instead of days.
Another BMO implementation was its easy PIN authentication: instead of presenting a driver’s license or state-issued ID, customers use their debit cards to verify their identities. This accelerates the transaction, and the customer’s data is aggregated instantly on the teller’s screen.
Both of these improvements were implemented in more than 500 branches by the end of 2019.
“If a customer walks in and opens up an account [during the] same interaction, they can actually leave with a fully functioning, embossed card that has their name on it,” Barnett said.
And unlike before, when a customer was issued a temporary card and had to wait for the fully-functioning replacement to arrive in the mail, “they also get the PIN right there as part of the account opening, and can even set up a custom PIN if they want at the ATM.”
With the in-branch experience changing, and customers requiring fewer interactions with tellers, the result has been “really freeing up our branch bankers to have more time to dedicate to customers, and have better holistic conversations, and create more personalized recommendations.”
One case study found that employees have saved between 15 and 30 minutes per day on processing forms. Multiply that by the number of employees within BMO, and you get a major win for efficiency and time saving.
DX Journal covers the impact of digital transformation (DX) initiatives worldwide across multiple industries.
Addressing cybercrime with passwordless authentication
Cybercrime costs the global economy $2.9 million every minute, according to a RiskIQ report titled The Evil Internet Minute, adding up to $1.5 trillion in a year.
The cause? 80% of attacks are password-related issues.
Just think: How many times have you been prompted to change your password by your bank or email company? How many times have you heard the term “data breach” in the news? How many accounts do you have that require two-step authentication? How many passwords do you have, scattered across various accounts?
PINs, passwords and security questions are a daily part of our lives, but they’re typically nothing more than a hassle when we have to change them. For larger organizations, however, the costs add up: estimates show that nearly 50 percent of IT help desk costs are allocated to password resets.
Enter: Passwordless authentication — harnessing the power of technologies like AI and Machine Learning to save time and money.
Cyber security is high on the World Economic Forum’s agenda. In collaboration with the open industry association FIDO Alliance, the WEF’s whitepaper Passwordless Authentication: The next breakthrough in secure digital transformation presents a framework for future authentication systems, illustrating five key passwordless technologies organizations can implement.
As part of Davos 2020, the paper — featuring lead authors Andrew Shikiar (Executive Director and Chief Marketing Officer, FIDO Alliance) and Adrien Ogee (Project Lead, Platform for Shaping the Future of Cybersecurity and Digital Trust, World Economic Forum) — outlines that “passwords are indeed at the heart of the data breach problem,” and presents facial biometric technology, hardware keys, and even QR codes and behavioural analysis as the future of passwordless authentication.
At #Davos, @WEF's new paper points to #FIDO as a viable alternative to passwords. It’s validating to see WEF educate world leaders on the economic impact of legacy authentication practices, and recognize better alternatives ready to implement today. https://t.co/QFhLXhKfL2
— The FIDO Alliance (@FIDOAlliance) January 22, 2020
“While company adoption of platform businesses is increasingly driving business valuation and growth, the problem of digital trust is growing equally fast and eroding confidence across online communities,” explains the introduction.
“Security enhancement is a continuous process, there is no magic bullet. Cyber criminals will adapt and develop new means of attack, but the alternative authentication mechanisms presented here provide greater challenge to them and greater security in the foreseeable future.”