According to researchers from the Russian Quantum Center, Moscow, cryptocurrencies and blockchain technology will be vulnerable in the future unless they integrate quantum technologies.
Blockchain has been heralded for its security, especially as a way to conduct transactions between people who do not necessarily trust each other. It has been described as a nearly impenetrable technology that can protect data from cyberattacks and improve cybersecurity across industries.
Blockchain’s designers point to three factors that deliver the desired level of security: decentralization, cryptography and consensus. More specifically, it is the complex interplay of these characteristics that secures blockchain transactions and discourages attacks.
Blockchain’s security is based on a combination of public and private keys. These keys are long strings of random numbers and letters, and they are not directly associated with a user’s identity. Using keys in this way avoids the need for weak, easily compromised passwords and online identities.
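To make the idea concrete, here is a minimal Python sketch of a blockchain-style key pair, using the third-party `ecdsa` package. The address derivation is deliberately simplified for illustration (real Bitcoin addresses add RIPEMD-160 hashing and Base58Check encoding), but it shows how a pseudonymous identifier can be derived from keys that say nothing about their owner.

```python
import hashlib
import ecdsa  # pip install ecdsa

# The private key is just a random number; nothing about it identifies the owner.
private_key = ecdsa.SigningKey.generate(curve=ecdsa.SECP256k1)
public_key = private_key.get_verifying_key()

# A pseudonymous "address" is derived by hashing the public key
# (simplified: real cryptocurrencies add further hashing and encoding steps).
address = hashlib.sha256(public_key.to_string()).hexdigest()
print("address:", address[:16], "...")

# Transactions are authorized by signing with the private key ...
signature = private_key.sign(b"pay 1 coin to Alice")

# ... and anyone can verify the signature with the public key alone.
assert public_key.verify(signature, b"pay 1 coin to Alice")
```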
However, blockchain may not be as secure as proponents think, according to Aleksey K. Fedorov, Evgeniy O. Kiktenko and Alexander I. Lvovsky, writing in the science journal Nature.
The authors predict that in less than ten years the emerging generation of quantum computers will be able to break a blockchain’s cryptographic codes. Quantum computers leverage different physical phenomena — superposition, entanglement and interference — to manipulate information.
The vulnerability within a blockchain arises because the technology relies on ‘one-way’ mathematical functions: relatively straightforward to compute on a conventional computer, but very difficult to calculate in reverse. Quantum computers, however, will be able to invert these functions efficiently, rendering the security they provide obsolete.
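The asymmetry is easy to demonstrate with a hash function, one family of one-way functions used in blockchains: computing a hash is essentially instant, while inverting it classically means brute force, whose cost grows exponentially with input length. (For the digital signatures discussed next, the relevant one-way function is the elliptic-curve discrete logarithm, which Shor’s algorithm on a quantum computer attacks directly.) A small sketch:

```python
import hashlib
import itertools
import string

# Forward direction: hashing any input is essentially instant.
target = hashlib.sha256(b"abc").hexdigest()

# Reverse direction: the only generic classical attack is trying every input.
def brute_force_preimage(target_hex, alphabet=string.ascii_lowercase, max_len=4):
    """Cost grows as alphabet_size ** length -- hopeless for realistic inputs."""
    for length in range(1, max_len + 1):
        for chars in itertools.product(alphabet, repeat=length):
            guess = "".join(chars).encode()
            if hashlib.sha256(guess).hexdigest() == target_hex:
                return guess
    return None

print(brute_force_preimage(target))  # b'abc' -- found only because it is tiny
```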
In fact, the researchers argue, a blockchain protected by a single digital signature will be easier for a quantum computer to crack than a bank account defended by multiple layers, such as plastic cards, security questions, identity checks and the oversight of human cashiers.
To avoid a crash of the expanding cryptocurrency market, the researchers recommend developing quantum-safe encryption, for example by building quantum communication into the blockchain itself. This works because quantum states cannot be copied or measured without being altered, and any alteration immediately alerts the user to tampering.
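The idea that measurement leaves detectable traces is the basis of quantum key distribution. Below is a toy, purely classical simulation in the spirit of the BB84 protocol, for illustration only (it caricatures the physics with a simple rule): if an eavesdropper measures each transmitted bit in a randomly guessed basis and resends it, roughly 25 percent of the bits the legitimate parties compare will disagree, revealing the intrusion.

```python
import random

def measure(bit, prep_basis, meas_basis):
    """Toy rule: a matching basis returns the bit; a mismatch randomizes it."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def error_rate(n=100_000, eavesdrop=False):
    errors = sifted = 0
    for _ in range(n):
        a_bit, a_basis = random.randint(0, 1), random.choice("+x")
        bit, basis = a_bit, a_basis
        if eavesdrop:
            e_basis = random.choice("+x")
            bit = measure(bit, basis, e_basis)  # Eve's measurement disturbs the state
            basis = e_basis                     # she resends in her own basis
        b_basis = random.choice("+x")
        b_bit = measure(bit, basis, b_basis)
        if a_basis == b_basis:                  # keep only matching-basis rounds
            sifted += 1
            errors += a_bit != b_bit
    return errors / sifted

print("error rate, no eavesdropper:  ", error_rate())                # ~0.00
print("error rate, with eavesdropper:", error_rate(eavesdrop=True))  # ~0.25
```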
Intel embraces DX at Data-Centric Innovation Day
Intel’s recent Data-Centric Innovation Day in San Francisco showed how the company is putting digital transformation at the forefront of its business strategy, building a bridge from its former position as the big name in PC CPUs toward a more agile future. In a competitive world of business technology startups and scaleups, Intel is putting its DX foot forward and showing how the company’s own innovations can help its global customers embrace the wins that come with digital transformation.
While the event was a product launch for all intents and purposes, there was a bigger story going on at Data-Centric Innovation Day: the positioning of Intel as a data-centric enterprise and the company’s emphasis on collaboration with its customers around the world as they undertake digital transformation.
At the event’s outset, Intel CEO Robert Swan predicted that the company’s data-centric total addressable market will reach $200 billion by 2022. As a continually growing number of organizations move to the cloud, and C-suites continue to look to AI and analytics to develop their competitive advantage, this kind of market growth for the IT giant seems reasonable.
At the core of Intel’s data-driven shift is the customer experience. As Swan stated at the event, Intel is looking to become ‘customer-obsessed’ through the company’s new focus on data. While the role of a processor or a new hardware product within enterprise organizations has not radically shifted — it remains just one piece within the larger technology structures powering digital transformation — Intel’s attitude around their hardware and software offerings, and how they play into the customer’s overall business technology experience, has certainly taken a big leap forward.
In a press release for the event, Navin Shenoy, Intel executive vice president and general manager of the Data Center Group, noted that the new technology was all about putting data first:
“Today’s announcements reflect Intel’s new data-centric strategy. The portfolio of products announced today underscores our unmatched ability to move, store and process data across the most demanding workloads from the data center to the edge. Our 2nd-Generation Xeon Scalable processor with built-in AI acceleration and support for the revolutionary Intel Optane DC persistent memory will unleash the next wave of growth for our customers.”
Intel unveiled a new range of products, including the next generation of Xeon Scalable Processors. The new Xeon line was designed with DX tasks in mind, and the processors look to aid Intel clients with AI processes, cloud and edge computing and with running rapidly growing workloads. The new processors feature DL Boost, a unique inference acceleration offering designed specifically for AI-heavy processes.
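DL Boost itself is proprietary silicon, but the pattern it accelerates, low-precision integer inference, is easy to sketch. The hypothetical NumPy example below shows the quantize-compute-dequantize idea: the expensive multiply-accumulate work happens in int8/int32 rather than float32, which is the kind of operation such hardware speeds up. Real inference engines handle zero-points, per-channel scales and saturation far more carefully.

```python
import numpy as np

def quantize(x, num_bits=8):
    """Map float values onto signed integers with a single scale factor."""
    scale = np.abs(x).max() / (2 ** (num_bits - 1) - 1)
    return np.round(x / scale).astype(np.int8), scale

weights = np.random.randn(64, 64).astype(np.float32)
activations = np.random.randn(64).astype(np.float32)

qw, w_scale = quantize(weights)
qa, a_scale = quantize(activations)

# Heavy lifting in integers (the part hardware acceleration targets),
# accumulated in int32, then rescaled back to float at the end.
int_result = qw.astype(np.int32) @ qa.astype(np.int32)
approx = int_result * (w_scale * a_scale)

exact = weights @ activations
print("max abs error vs float32:", np.abs(approx - exact).max())
```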
The technology giant also emphasized the security enhancements of the new range. Lisa Davis, VP of Digital Transformation at Intel, announced during the event that Intel has partnered with Lockheed Martin to create hardened, full-stack security solutions for CIOs and CISOs. Processing and moving more data than ever requires ever-evolving security, and Intel made a point of emphasizing its dedication to this element of the new product line.
DX at the heart of Intel’s announcements
For an established tech company like Intel to take on data in such a massive way should be no surprise to digital transformation diehards. But Intel’s focus on moving, storing and processing every bit of client data should act as a wake-up call for organizations still hesitant to make data management a bigger part of their digital transformation efforts.
The shape that Intel’s technology is taking, as innovations like DL Boost and the cloud-centric nature of the company’s new security offerings show, is all about meeting the digital transformation needs of customers around the world.
“You can’t digitally transform as an organization if you’re focused on aging IT practices,” said Intel Canada’s Phil Vokins during an interview on the day of the event. “I think the one thing we’ve seen today which we should all be excited about is the range of capabilities and performance that we’re enabling, which was unthinkable even a couple of years ago. It’s not just about the performance of the processor, but look at the memory we can have per socket now. This will really enable businesses to take advantage of the information they have.”
Collaboration with partners and clients key
This focus on a holistic approach to data is not something Intel is doing on its own. The emphasis of Intel’s Data-Centric Innovation Day was so clearly on collaboration, with many major players in the IT and enterprise world contributing to the event. During his keynote, Shenoy was vocal about Intel’s broad set of partners and customers, emphasizing branching out and building a bigger business ecosystem.
Every technology showcased during the product launch was tied back to one of Intel’s global partners: AWS, Vodafone, Twitter, Microsoft, Alibaba, and other companies were featured and promoted through Intel’s own announcements. Featuring partners like this led to some very conversational panels on the nitty gritty of DX throughout the day’s events. But this collaborative approach to the technology also highlighted another aspect of Intel’s digital transformation journey.
Vokins said that, for Intel, the process of digital transformation is also a question of interpreting what’s happening in the world of business technology and turning that information into valuable insights to improve performance.
“We’re in a very fortunate position, given our market share, that we have huge amounts of information and resources and access to leading businesses. So we need to make sure that we can disseminate, understand and rearticulate that information back.”
Vokins emphasized the need to collaborate around each digital transformation insight, “so that we can all learn from it, and learn how customers are embracing technology to rapidly improve performance.”
How edge computing can boost business efficiency
Edge computing is about processing data as close to the source as possible, which reduces both latency and bandwidth use. This concept is seen as critical for furthering the Internet of Things and for driving the development of autonomous vehicles.
What is edge computing?
Edge computing is a decentralized approach to computing applied to networks (the opposite of cloud computing’s centralized approach). The concept relates to where a network processes and stores its information: rather than sending everything to a distant central facility, data is kept and handled close to the devices that generate it. For businesses, this typically means moving data onto local, private servers.
Edge computing is especially useful in cases where a lot of data is generated. The approach allows data to be triaged at the source, so that some of it is processed locally, reducing the backhaul traffic to the central data repository. This is very useful where many devices are connected together, as with the Internet of Things.
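A minimal sketch of what that local triage can look like, with made-up names and thresholds: the edge device keeps the raw stream, forwards a compact summary, and escalates only anomalous readings.

```python
from statistics import mean

ANOMALY_THRESHOLD = 80.0  # illustrative cutoff, e.g. degrees Celsius

def triage(readings):
    """Split raw readings into a compact summary plus anomalies worth sending."""
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    summary = {"count": len(readings), "mean": mean(readings), "max": max(readings)}
    return summary, anomalies

# A minute of raw sensor data stays on the edge device ...
raw = [71.2, 70.8, 93.5, 71.0, 70.9] * 12
summary, anomalies = triage(raw)

# ... and only a few bytes go upstream instead of the full stream.
print("sent upstream:", summary, "plus", len(anomalies), "anomalous readings")
```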
Edge computing helps to make the Industrial Internet of Things possible, an area of great value: McKinsey & Co. calculate that the Industrial Internet of Things will generate $7.5 trillion in value by 2025. The advantage lies in connecting people to machine data in ways that accelerate digital industrial transformation.
How can edge computing benefit business?
The advantages of edge computing are that data takes less time to move and that hardware limitations are more easily addressed. With conventional storage systems, centralized hardware is normally required, and this can create a bottleneck that restricts how much data can be moved at any point in time. Relying on central hardware also leads to slower data transfer speeds, and the costs of operating and maintaining that hardware are comparatively high.
Security can also be stronger with edge computing, making such systems harder for hackers to penetrate. When data moves through a network, it passes through security layers designed to keep intruders out; edge computing goes beyond this. Because data travels from devices into local servers before moving on to other nodes, there are more opportunities to interpose additional firewalls and antivirus scans along the way.
How are businesses using edge computing?
Businesses can derive many advantages from the edge computing concept. The edge process enables analytics and data gathering to occur at the source of the data. This enables companies to leverage resources from devices that are not necessarily continuously connected to a network, such as laptops, smartphones, tablets and sensors.
Autonomous vehicles and edge computing
Among the more specific examples is autonomous car technology. These vehicles are, in a sense, datacenters on wheels, and edge computing plays a key role in handling the high volumes of data they collect. Intel estimates that autonomous cars, with their many on-vehicle sensors, generate over 40 terabytes of data for each eight hours of driving. Given that this much data cannot easily be sent to a cloud (and that trying to do so would present a safety risk in the form of delayed reactions), edge computing becomes a necessity, as the back-of-the-envelope calculation below shows.
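Taking the 40 terabytes per eight hours figure at face value:

```python
terabytes, hours = 40, 8

bits = terabytes * 1e12 * 8        # decimal terabytes to bits
seconds = hours * 3600
gbit_per_s = bits / seconds / 1e9

print(f"sustained uplink needed: {gbit_per_s:.1f} Gbit/s")  # ~11.1 Gbit/s
# Sustained real-world cellular uplinks are orders of magnitude below this,
# so most of the data has to be processed on the vehicle itself.
```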
Security cameras and edge computing
A second example is security systems. If a large complex is served by dozens of high-definition Internet of Things video cameras that continuously stream their footage to a cloud server, the system can be slow to respond, especially if the security protocol is designed to react to motion detection. This set-up also places a major strain on the building’s Internet infrastructure, with a high proportion of the bandwidth consumed by the sheer volume of video footage.
With the edge concept, each camera would have an independent internal computer to run the motion-detecting application and would then send footage to the cloud server only as needed. This improves efficiency and lowers bandwidth use.
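A hypothetical sketch of that camera-side logic, with a stand-in for the real video feed and a placeholder upload function: frames are compared locally, and only frames showing motion leave the device.

```python
import numpy as np

MOTION_THRESHOLD = 12.0  # illustrative: mean pixel change that counts as motion

def has_motion(prev_frame, frame):
    """Crude motion detector: mean absolute difference between frames."""
    return np.abs(frame.astype(int) - prev_frame.astype(int)).mean() > MOTION_THRESHOLD

def upload_to_cloud(frame_id):
    print(f"uploading frame {frame_id} instead of a constant stream")

def camera_frames(n=100):
    """Stand-in for a real camera feed: static frames, with motion at frame 50."""
    for i in range(n):
        frame = np.zeros((480, 640), dtype=np.uint8)
        if i == 50:
            frame[100:300, 100:300] = 255  # something moves in this frame
        yield i, frame

prev = np.zeros((480, 640), dtype=np.uint8)
for i, frame in camera_frames():
    if has_motion(prev, frame):   # runs on the camera, not in the cloud
        upload_to_cloud(i)
    prev = frame
```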
Fleet management and edge computing
Edge computing also helps to improve the efficiency of fleet management. A large volume of key performance data needs to be collected from wheels, brakes, batteries and electrical systems; where that data requires an immediate response, as with a potential brake failure, some of it needs to be collected, stored and acted on locally on the edge in order to minimize the risk of vehicle breakdown or accident.
An example of edge computing applied to fleet management is trailer temperature monitoring. With most fleet monitoring systems, only temperature readings that fall outside a set range are reported back to fleet managers through telematics, and the fleet manager then needs to assess whether there is a problem. With edge analytics, however, temperature readings can be analyzed on board the vehicle and the driver notified immediately, empowering the driver to take steps to mitigate the temperature fluctuation.
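A hypothetical sketch of that onboard logic, with illustrative names and thresholds: every sample is checked locally, the driver is alerted at once, and only exceptions travel over the telematics link.

```python
SAFE_RANGE = (2.0, 8.0)  # illustrative range for refrigerated cargo, in Celsius

def alert_driver(reading):
    print(f"DRIVER ALERT: trailer at {reading:.1f} C, check refrigeration unit")

def report_via_telematics(reading):
    print(f"telematics: out-of-range reading {reading:.1f} C sent to fleet manager")

def on_reading(reading):
    """Runs on the vehicle for every sample; the back office sees only exceptions."""
    low, high = SAFE_RANGE
    if not low <= reading <= high:
        alert_driver(reading)           # immediate local response
        report_via_telematics(reading)  # compact exception report upstream

for reading in [4.1, 4.3, 9.7, 4.0]:    # simulated sensor samples
    on_reading(reading)
```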
VMware and AWS unleash hybrid cloud options in Canada
Cloud platforms are transforming the way organizations do business, and the competition between cloud providers is fierce. VMware and Amazon Web Services are partnering to provide cloud solutions to businesses in Canada.
At a roundtable in Toronto, virtualization company VMware, IT partner Scalar Decisions and AWS discussed how the new VMware Cloud on AWS service will open up new options for customers. The organizations hope that the new arrangement will act as a catalyst to get Canadian organizations, private and public, to the cloud.
Easing cloud migration for Canadian companies
The new offering from VMware and AWS looks to provide customers with powerful hybrid cloud options in order to help them benefit from AWS’ many capabilities. The new, on-demand service allows organizations working with VMware to extend, migrate and manage their cloud-based resources with the use of AWS services.
Sean Forkan, Vice President and Country Manager at VMware, stated that innovation in the public cloud is happening daily on a global scale, and thanks to VMware Cloud, Canadian companies can now benefit from these transformative shifts coming from within AWS.
The VMware Cloud service runs in the same regions and availability zones as Amazon services, and is managed by VMware. Customers can be served either by AWS’ Montreal-based data centre or by U.S. data centres, depending on their data residency requirements. Over time, both VMware and AWS hope to see deeper integration between their tools.
Peter Near, National Director of Solutions Engineering with VMware Canada, said that the transition to cloud services for businesses is not just a question of efficiency, but of global performance. And while the majority of data sets in Canadian databases are not easy to migrate, Near predicted that the new offering from VMware and AWS will give these companies an ‘easy button’ for migration.
Transition to cloud has never been more popular
In a recent survey by multi-cloud management company RightScale, 95 percent of respondents said they are using cloud in some way. Hybrid and public cloud were far and away the most popular amongst adopters, with 85 percent of surveyed businesses citing some kind of hybrid cloud strategy, while only 10 percent of respondents cited the use of a single public cloud.
As Eric Gales, Director of AWS Canada, said during the roundtable, “It used to be that owning and operating infrastructure was an advantage.” According to Gales, eliminating the ownership and operation of costly infrastructure is at the heart of the pronounced increase in cloud adoption in Canada.
Gales noted that artificial intelligence and machine learning are also driving organizations towards the scalability and on-demand capacity of public cloud services. AI and ML need a lot of computing power, said Gales, and VMware customers can now run existing apps and workloads on AWS to accelerate and amplify their use. In terms of the talent required to scale AI and ML tools, this could be a boon for medium-sized businesses, as the “surface area of new things they need to develop skills for or learn is lower,” said Gales.