Technology

AWS chief architect says AI is boosting human intelligence

Key takeaways from Glenn Gore at CIX 2018

Amazon Web Services’ Chief Architect, Glenn Gore. - Photo by DX Journal

Every talk about artificial intelligence at a technology event inevitably ends with a comment about robots replacing human jobs. But for Amazon Web Services’ Chief Architect, Glenn Gore, humans benefit from the process of designing artificial intelligence (AI).

“We are seeing human intelligence being brought back with the rise of artificial intelligence,” he said during his keynote speech at CIX 2018.

Gore spent the next 30 minutes talking about how various companies leverage big data for insight, and how designing artificial intelligence frameworks is creating new ways of innovating and thinking for humans.

Here are three major takeaways from his talk:

AI is boosting human intelligence

During his keynote, Gore turned to a quote from Joi Ito, director of MIT Media Lab, to get to the heart of the appeal of AI and cloud: “Want to increase innovation? Lower the cost of failure.”

Gore cites time, rather than money, as the biggest cost that failure incurs. Across multiple industries, AWS is seeing inventive uses of AI and machine learning (ML) that allow companies to innovate faster, with speed as a compelling selling point.

For example, GE Healthcare is innovating around medical imaging in order to provide better diagnoses and make preventative discoveries for patients. Its medical scanning equipment produces terabytes of data from each incredibly detailed scan, so it takes time for images to be processed. A combination of connected devices and machine learning tools is allowing GE to speed up image processing for doctors who seek to improve early detection of tumors and other abnormalities.

AI and cloud are also being used by financial services giant Moody’s to filter through complicated financial documents.

The company has tens of millions of financial documents to mine for insights, but processing them manually is incredibly time-consuming. Moody’s now uses AWS’ AI Lab Solutions to find insights around trends and patterns, reducing the need for teams of people to work through the same quantity of documents.
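Gore didn’t detail Moody’s pipeline, but as a rough illustration of the managed-AI approach to document mining, a few lines of Python against Amazon Comprehend (AWS’s natural language processing service) can pull entities out of financial text. The sample document here is hypothetical:

```python
# Illustrative sketch: entity extraction from a financial document
# using Amazon Comprehend. Moody's actual tooling is not described
# in the talk; this only shows the general managed-AI approach.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

document = (  # hypothetical snippet of a filing
    "Acme Corp reported Q3 revenue of $1.2 billion, "
    "up 8% year over year, and raised its 2019 guidance."
)

response = comprehend.detect_entities(Text=document, LanguageCode="en")
for entity in response["Entities"]:
    # Each entity carries a type (ORGANIZATION, QUANTITY, DATE, ...)
    # and a confidence score between 0 and 1.
    print(f'{entity["Type"]:>14}  {entity["Score"]:.2f}  {entity["Text"]}')
```

Run across millions of documents, the same call becomes a batch job, which is where the time savings Gore describes come from.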

Gore said it’s now becoming a question of proving an idea’s effectiveness over six weeks rather than six months. And thanks to the array of AI and ML tools available, you don’t have to be an ML engineer or a data scientist to develop a concept and get it running.

Amazon Web Services’ Chief Architect, Glenn Gore. - Photo by DX Journal

Get business out of the way of the customer

Gore also spoke about how technology can have a negative impact on customer interaction if it gets in the way. Data is often the culprit.

As a business grows, it needs to stay close to its customers. The data may signal positive trends, but at the end of the day the customer’s satisfaction and experience are what matter most to a business.

One company that is taking a customer-centric approach to its technology is the NFL, Gore said.

The league is running an advanced stats platform on AWS that draws from thousands of games, providing valuable data about every level of the game. The organization doesn’t keep these insights to itself, either – technology is used to provide an immersive experience for fans by combining statistical analysis with real-time video footage, allowing fans to engage with the sport like never before.

It’s never been easier or cheaper to collect data at any scale

One of the key factors behind the AI and ML renaissance is that businesses are dealing with “hundreds of millions of interactions” that can be tracked, stored in the cloud, measured and mined for insights.

An example of a business doing big data analysis at scale is the online multiplayer game Fortnite.

Gore said the game supports 125 million players worldwide, with 92 million “events” tracked in the cloud every minute.

Fortnite uses AWS to store and process data at that scale and to gain insights that can be used to change the game. The 40 gigabytes of analytical data tracked every minute by its makers allows the company to constantly improve, helping to win players’ attention and retain them as customers.
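Gore didn’t describe Epic’s actual pipeline, but telemetry at this scale is typically pushed into a managed streaming service and aggregated downstream. A minimal producer-side sketch in Python, assuming a hypothetical Amazon Kinesis stream called game-events:

```python
# Minimal sketch of streaming gameplay telemetry into the cloud.
# The stream name and event fields are hypothetical; the talk does
# not describe Epic's actual pipeline.
import json
import time
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_event(player_id: str, event_type: str, payload: dict) -> None:
    record = {
        "player_id": player_id,
        "event_type": event_type,  # e.g. "match_start", "elimination"
        "ts": time.time(),
        "payload": payload,
    }
    kinesis.put_record(
        StreamName="game-events",  # hypothetical stream name
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=player_id,    # keeps one player's events ordered
    )

publish_event("player-123", "elimination", {"weapon": "pickaxe", "distance_m": 12})
```

Downstream consumers can then aggregate those records into the per-minute analytics Gore describes, without the game servers ever blocking on analysis.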

Events

Intel embraces DX at Data-Centric Innovation Day

Intel’s recent Data-Centric Innovation Day in San Francisco showed how the company is putting digital transformation at the forefront of its business strategy, to build a bridge from its former position as the big name in PC CPUs, toward a more agile future. In a competitive world of business technology startups and scaleups, Intel is putting its DX foot forward, and showing how the company’s own innovations can help its global customers to embrace the wins that come with digital transformation.

While the event was, for all intents and purposes, a product launch, there was a bigger story going on at Data-Centric Innovation Day: the positioning of Intel as a data-centric enterprise and the company’s emphasis on collaboration with its customers around the world as they undertake digital transformation.

At the event’s outset, Intel CEO Robert Swan predicted that the company’s data-centric total addressable market will reach $200 billion by 2022. As a continually growing number of organizations move to the cloud, and C-suites continue to look to AI and analytics to develop their competitive advantage, this kind of market growth for the IT giant seems reasonable.

At the core of Intel’s data-driven shift is the customer experience. As Swan stated at the event, Intel is looking to become ‘customer-obsessed’ through the company’s new focus on data. While the role of a processor or a new hardware product within enterprise organizations has not radically shifted — it remains just one piece within the larger technology structures powering digital transformation — Intel’s attitude toward its hardware and software offerings, and how they play into the customer’s overall business technology experience, has certainly taken a big leap forward.

The 2nd-Gen Intel Xeon Scalable Processors are all about data and digital transformation.

In a press release for the event, Navin Shenoy, Intel executive vice president and general manager of the Data Center Group, noted that the new technology was all about putting data first:

“Today’s announcements reflect Intel’s new data-centric strategy. The portfolio of products announced today underscores our unmatched ability to move, store and process data across the most demanding workloads from the data center to the edge. Our 2nd-Generation Xeon Scalable processor with built-in AI acceleration and support for the revolutionary Intel Optane DC persistent memory will unleash the next wave of growth for our customers.”

Intel unveiled a new range of products, including the next generation of Xeon Scalable Processors. The new Xeon line was designed with DX tasks in mind, and the processors look to aid Intel clients with AI processes, cloud and edge computing and with running rapidly growing workloads. The new processors feature DL Boost, a unique inference acceleration offering designed specifically for AI-heavy processes.

Lisa Davis (left), Intel’s VP of Data Center Group and General Manager of Digital Transformation and Scale Solutions, unveils new security solutions.

The technology giant also emphasized the security enhancements of the new range. Lisa Davis, Intel’s VP of Digital Transformation, announced during the event that Intel has partnered with Lockheed Martin to create hardened, full-stack security solutions for CIOs and CISOs. Processing and moving more data than ever requires ever-evolving security, and Intel made a point of emphasizing its dedication to this element of the new product line.

DX at the heart of Intel’s announcements

For an established tech company like Intel to take on data in such a massive way should be no surprise to digital transformation diehards. But for those still hesitant to make data management a bigger part of their organization, Intel’s focus on moving, storing and processing every bit of client data should act as a wake-up call.

The shape that Intel’s technology is taking, as innovations like DL Boost and the cloud-centric nature of the company’s new security offerings show, is all about meeting the digital transformation needs of customers around the world.

“You can’t digitally transform as an organization if you’re focused on aging IT practices,” said Intel Canada’s Phil Vokins during an interview on the day of the event. “I think the one thing we’ve seen today which we should all be excited about is the range of capabilities and performance that we’re enabling, which was unthinkable even a couple of years ago. It’s not just about the performance of the processor, but look at the memory we can have per socket now. This will really enable businesses to take advantage of the information they have.”

Collaboration with partners and clients key

This focus on a holistic approach to data is not something Intel is doing on its own. The emphasis of Intel’s Data-Centric Innovation Day was so clearly on collaboration, with many major players in the IT and enterprise world contributing to the event. During his keynote, Shenoy was vocal about Intel’s broad set of partners and customers, emphasizing branching out and building a bigger business ecosystem.

Every technology showcased during the product launch was tied back to one of Intel’s global partners: AWS, Vodafone, Twitter, Microsoft, Alibaba, and other companies were featured and promoted through Intel’s own announcements. Featuring partners like this led to some very conversational panels on the nitty gritty of DX throughout the day’s events. But this collaborative approach to the technology also highlighted another aspect of Intel’s digital transformation journey.

Vokins said that, for Intel, the process of digital transformation is also a question of interpreting what’s happening in the world of business technology and turning that information into valuable insights to improve performance.

“We’re in a very fortunate position, given our market share, that we have huge amounts of information and resources and access to leading businesses. So we need to make sure that we can disseminate, understand and rearticulate that information back.”

Vokins emphasized the need to collaborate around each digital transformation insight, “so that we can all learn from it, and learn how customers are embracing technology to rapidly improve performance.”

Technology

How edge computing can boost business efficiency

Edge computing is about processing data as close to the source as possible, which reduces both latency and bandwidth use. This concept is seen as critical for furthering the Internet of Things and for driving the development of autonomous vehicles.

What is edge computing?

Edge computing is a decentralized approach to computing applied to networks (the opposite of cloud computing’s centralized approach). The concept relates to where a network processes and stores its information: rather than sending everything to a distant central data center, data is handled on or near the devices that generate it. For businesses, this often means processing data on local, private servers.

Edge computing is especially useful in cases where a lot of data is generated. The approach allows data to be triaged at the source, so that some of it is processed locally, reducing the backhaul traffic to the central data repository. This is very useful in cases where many devices are connected together, as with the Internet of Things.
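As a simplified sketch of what that triage can look like, an edge node might keep raw readings local and forward only a compact summary plus any anomalies upstream. The window contents, anomaly threshold and upstream stand-in below are all illustrative:

```python
# Simplified sketch of local data triage on an edge node: raw
# readings stay local, and only a summary record plus flagged
# anomalies travel upstream. Threshold and endpoint are hypothetical.
import statistics

UPSTREAM = []          # stand-in for a cloud endpoint
ANOMALY_THRESHOLD = 2  # standard deviations, illustrative

def triage(window: list[float]) -> None:
    mean = statistics.mean(window)
    stdev = statistics.pstdev(window) or 1e-9
    anomalies = [x for x in window if abs(x - mean) / stdev > ANOMALY_THRESHOLD]
    # Instead of shipping every raw reading, forward one summary record.
    UPSTREAM.append({"mean": mean, "stdev": stdev,
                     "anomalies": anomalies, "n": len(window)})

triage([20.1, 20.3, 19.9, 20.2, 47.5, 20.0])  # 47.5 gets flagged
print(UPSTREAM)
```

Six readings go in; one small record comes out, which is exactly the backhaul reduction described above.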

Edge computing helps to make the Industrial Internet of Things possible, and this is an area of great value: McKinsey & Co. calculates that the Industrial Internet of Things will generate $7.5 trillion in value by 2025. The advantage here lies in connecting people to machine data in ways that accelerate digital industrial transformation.

How can edge computing benefit business?

The advantages of edge computing are that it takes less time to move data and that hardware limitations are more easily addressed. With conventional storage systems, dedicated hardware is normally required, and this can create a bottleneck that restricts how much data can be moved at any one time. The reliance on that hardware also leads to slower data transfer speeds.

Furthermore, the costs of operating and maintaining such hardware are comparatively high.

Security is also stronger with edge computing, making edge systems harder for hackers to penetrate. This is because data is continually moving between network nodes.

When data moves through a network, it passes through different security layers designed to keep hackers out of the system, and edge computing goes beyond this. More security layers can be applied because, instead of data moving directly between network nodes, it moves from the Internet into the servers and on to the nodes. This provides an opportunity for additional firewalls and antivirus scans.

How are businesses using edge computing?

Businesses can derive many advantages from the edge computing concept. The edge process enables analytics and data gathering to occur at the source of the data, allowing companies to leverage resources from devices that are not necessarily connected to a network continuously, such as laptops, smartphones, tablets and sensors.

Autonomous vehicles and edge computing

Among the more specific examples is autonomous car technology. Autonomous cars are, in a sense, data centers on wheels, and edge computing plays a key role in handling the high volumes of data they collect. Intel estimates that autonomous cars, with their many on-vehicle sensors, generate over 40 terabytes of data for each eight hours of driving. Given that this level of data cannot easily be sent to a cloud (and the delayed reactions would also present a safety risk), edge computing becomes a necessity.
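A quick back-of-the-envelope check of Intel’s figure shows why round-tripping everything to the cloud is impractical:

```python
# Back-of-the-envelope check on Intel's estimate: 40 TB per
# 8 hours of driving, expressed as a sustained data rate.
tb_per_shift = 40
hours = 8

gb_per_hour = tb_per_shift * 1000 / hours   # 5,000 GB per hour
mb_per_second = gb_per_hour * 1000 / 3600   # ~1,389 MB per second

print(f"{gb_per_hour:,.0f} GB/hour ≈ {mb_per_second:,.0f} MB/s sustained")
# ~1.4 GB/s sustained — far beyond what a cellular uplink can carry,
# hence the need to process most of it on the vehicle itself.
```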

Security cameras and edge computing

A second example is security systems. If a large complex is served by dozens of high-definition Internet of Things video cameras continuously streaming footage to a cloud server, the system can be slow to respond, especially if the security protocol is designed around motion detection. This set-up also places a major strain on the building’s Internet infrastructure, with a high proportion of the bandwidth consumed by video footage.

With the edge concept, each camera would have an independent internal computer to run the motion-detecting application and would send footage to the cloud server only as needed. This improves efficiency and lowers bandwidth use.
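A minimal sketch of that on-camera loop, using simple frame differencing with OpenCV; the motion threshold and the upload function are hypothetical stand-ins:

```python
# Minimal on-camera motion detection sketch: frames are analyzed
# locally, and footage only leaves the device when motion is seen.
# MOTION_PIXELS and upload_clip() are hypothetical stand-ins.
import cv2

MOTION_PIXELS = 5000  # how many changed pixels count as motion, illustrative

def upload_clip(frame) -> None:
    pass  # stand-in for sending footage to the cloud server

cap = cv2.VideoCapture(0)  # the camera's own sensor
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    delta = cv2.absdiff(prev, gray)              # pixel-wise difference
    _, mask = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > MOTION_PIXELS:   # enough pixels changed?
        upload_clip(frame)                       # only now does data leave
    prev = gray

cap.release()
```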

Fleet management and edge computing

Edge computing also helps to improve the efficiency of fleet management. A large volume of key performance data needs to be collected across wheels, brakes, battery and electrical systems, and where that data requires a response, such as a potential brake failure, some of it needs to be collected and stored locally on the edge in order to minimize the risk of a vehicle breakdown or accident.

An example of edge computing applied to fleet management is trailer temperature. With most fleet monitoring systems, only temperature readings that fall outside a set range are reported back to fleet managers through telematics, and the fleet manager then needs to assess whether there is a problem. With edge analytics, however, temperature readings can be analyzed onboard the vehicle and the driver notified immediately, empowering the driver to take steps to mitigate the temperature fluctuation.
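A hedged sketch of that onboard filter, assuming a hypothetical safe range and notification hooks:

```python
# Sketch of onboard trailer-temperature triage: every reading is
# checked locally; only out-of-range readings go to telematics, and
# the driver is alerted immediately. Range and hooks are hypothetical.
SAFE_RANGE_C = (2.0, 8.0)  # e.g. refrigerated goods, illustrative

def alert_driver(reading_c: float) -> None:
    print(f"Cab alert: trailer at {reading_c:.1f} °C, check refrigeration")

def report_to_telematics(reading_c: float) -> None:
    print(f"Telematics: out-of-range reading {reading_c:.1f} °C")

def handle_reading(reading_c: float) -> None:
    low, high = SAFE_RANGE_C
    if reading_c < low or reading_c > high:
        alert_driver(reading_c)          # local and immediate
        report_to_telematics(reading_c)  # only exceptions go upstream
    # in-range readings never leave the vehicle

for reading in (4.5, 5.1, 9.3, 4.8):
    handle_reading(reading)  # only the 9.3 °C reading triggers anything
```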

Technology

VMware and AWS unleash hybrid cloud options in Canada

Cloud platforms are transforming the way organizations do business, and the competition between cloud providers is fierce. VMware and Amazon Web Services are partnering to provide cloud solutions to businesses in Canada.

At a roundtable in Toronto, virtualization software company VMware, IT partner Scalar Decisions and AWS discussed how the new VMware Cloud on AWS service will open up new options for customers. The organizations hope the new arrangement will act as a catalyst to get Canadian organizations, private and public, to the cloud.

Easing cloud migration for Canadian companies

The new offering from VMware and AWS looks to provide customers with powerful hybrid cloud options in order to help them benefit from AWS’ many capabilities. The new, on-demand service allows organizations working with VMware to extend, migrate and manage their cloud-based resources with the use of AWS services.

Sean Forkan, Vice President and Country Manager at VMware, stated that innovation in the public cloud is happening daily on a global scale, and thanks to VMware Cloud, Canadian companies can now benefit from these transformative shifts coming from within AWS.

The VMware Cloud service lives in the same regions and availability zones as Amazon services, and is managed by VMware. Customers can be served either by AWS’ Montreal-based data centre or by the U.S. data centre, depending on their data residency requirements. Over time, both VMware and AWS hope to see a greater merging of the tools.

Peter Near, National Director of Solutions Engineering with VMware Canada, said the transition to cloud services is not just a question of efficiency for businesses, but of global performance. And while the majority of data sets in Canadian databases are not easy to migrate, Near said the new offering from VMware and AWS gives these companies an ‘easy button’ for migration.

Transition to cloud has never been more popular

In a recent survey by multi-cloud management company RightScale, 95 percent of respondents said they are using cloud in some way. Hybrid and public cloud were far and away the most popular amongst adopters, with 85 percent of surveyed businesses citing some kind of hybrid cloud strategy, while only 10 percent of respondents cited the use of a single public cloud.

As Eric Gales, Director of AWS Canada, said during the roundtable, “It used to be that owning and operating infrastructure was an advantage.” According to Gales, eliminating the ownership and operation of costly infrastructure is at the heart of the pronounced increase in cloud adoption in Canada.

Gales noted that artificial intelligence and machine learning are also driving organizations towards the scalability and on-demand talent of public cloud services. AI and ML need a lot of computing power, said Gales, and now VMware can use existing apps and workloads through AWS to accelerate and amplify the use of these apps. In terms of the talent required to scale AI and ML tools, this could be a boon for medium-sized businesses, as the “surface area of new things they need to develop skills for or learn is lower,” said Gales.
