
Technology

Accelerating the future of privacy through SmartData agents

Digital Life of Mind

Imagine a future where you can communicate with your smartphone – or whatever digital extension of you exists at that time – through an evolved smart digital agent that readily understands you, your needs, and exists on your behalf to procure the things and experiences you want. What if it could do all this while protecting and securing your personal information, putting you firmly in control of your data?

Dr. George Tomko, University of Toronto

Dr. George Tomko, Expert-in-Residence at IPSI (the Privacy, Security and Identity Institute) at the University of Toronto, Adjunct Professor of Computer Science at Ryerson University, and neuroscientist, believes the time is ripe to address the privacy and ethical challenges we face today, and to put in place a system that works for individuals while delivering effective business performance and minimizing harm to society at large. I had the privilege of meeting George to discuss his brainchild, SmartData: the development of intelligent agents as a solution to data protection.

As AI explodes, we are witnessing incident after incident, from technology mishaps to data breaches, data misuse, and erroneous and even deadly outcomes. My recent post, Artificial Intelligence Needs to Reset, advances the need to take a step back, slow the course of AI, and examine these events with a view to educating, fixing, preventing and regulating our way toward effective and sustainable implementations.

Dr. Tomko is not new to the topic of privacy. He invented Biometric Encryption as well as the Anonymous Database in the early '90s. His SmartData concept was published in SmartData: Privacy Meets Evolutionary Robotics, co-authored with Dr. Ann Cavoukian, former three-term Privacy Commissioner of Ontario and inventor of Privacy by Design. This led to his current work, SmartData intelligent agents, the subject of this article.

There is an inherent danger in the current model. The internet did not evolve along its intended path. Tim Berners-Lee envisioned an open internet, owned by no one,

…an open platform that allows anyone to share information, access opportunities and collaborate across geographical boundaries…

That vision has been challenged: the spread of misinformation and propaganda online has exploded, partly because of the way the advertising systems of large digital platforms such as Google or Facebook have been designed to hold people's attention…

People are being distorted by very finely trained AIs that figure out how to distract them.

What has evolved is a system that’s failing. Tomko points to major corporations and digital gatekeepers who are accumulating the bulk of the world’s personal data:

He who has the personal data has the power, and as you accumulate more personal information (personally identifiable information, location, purchases, web surfing, social media), in effect you make it more difficult for competitors to get into the game. The current oligopoly of Facebook, Google, Amazon, etc. will make it more difficult for companies like DuckDuckGo and Akasha to thrive.

That would be acceptable if these companies utilized the data in accordance with the positive consent of the data subject, for the primary purpose intended, and protected it against hacking. However, we know that's not happening. Instead, they are using it for purposes never intended, selling the data to third parties, and transferring it to governments for surveillance, often without a warrant based on probable cause.

Tomko asserts that if Elon Musk and the late Stephen Hawking are correct about the potential for a dystopian AI, popularized as Skynet in The Terminator series, such an outcome becomes far more likely if the AI has access to large amounts of personal data centralized in databases. While this implies an AI with malevolent intentions, humans are relentlessly innovative, and Tomko argues for the importance of putting roadblocks in place before this happens.

Enter SmartData, the evolution of Privacy by Design: it shifts control away from the organization and places it directly in the hands of the individual (the data subject).

SmartData empowers personal data by, in effect, wrapping it in a cloak of intelligence such that it becomes the individual's virtual proxy in cyberspace. No longer will personal data be shared or stored in the cloud as merely data, encrypted or otherwise; it will be stored and shared as a constituent of the binary string specifying the neural weights of the entire SmartData agent. This agent proactively builds in privacy, security and user preferences right from the outset, not as an afterthought.

For SmartData to succeed, it requires a radical, new approach – with an effective separation from the centralized models which exist today.

Privacy Requires Decentralization and Distribution

Our current systems and policies present hurdles we must overcome as privacy becomes the norm. The advent of Europe's GDPR is already making waves and challenging business today. GDPR's Article 20 (the right to data portability) and Article 17 (the right to be forgotten) give individuals mechanisms to download their personal data and demand its absolute deletion, both of which run counter to current directives and processes. Most systems are built for data redundancy, so copies of data persist indefinitely; systems will need to evolve to fully comply with these GDPR mandates. In addition, customer transactions on private sites are collected, analyzed, shared and sometimes sold, under a prevailing mindset that data is owned at the organizational level.

Tomko explains the SmartData solution must be developed in an open source environment.

A company that says: “Trust me that the smart agent or app we developed has no “back-door” to leak or surreptitiously share your information,” just won’t cut it any longer. Open source enables hackers to verify this information. I believe that such a platform technology will result in an ecosystem that will grow, as long as there is a demand for privacy.

Within this environment, a data utility within the SmartData platform can request all personal data, under GDPR-like regulations, from the organizational database. As per the SmartData security structure, each subject's personal data is then cleaned and collated into content categories, e.g., A = MRI data, B = subscriber data. The data are de-identified, segmented, encrypted and placed in locked boxes (files in the cloud) identified by categorized metatags. A trusted enclave, such as Intel's SGX, is associated with each data subject's personal data. The enclave generates a public/private key pair and outputs the public key to encrypt the personal data by category.
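As a rough illustration of the de-identification and collation step, the sketch below (in Python, with hypothetical record fields and the category names from the example above) pseudonymizes each record with a salted one-way hash and buckets it into per-category "locked boxes". In the real design, the salt would be held inside the subject's trusted enclave and each box would additionally be encrypted with the enclave's public key.

```python
import hashlib
import secrets

# Hypothetical content categories, mirroring the example in the text.
CATEGORIES = {"A": "MRI data", "B": "subscriber data"}

def deidentify(record: dict, salt: bytes) -> dict:
    """Replace the direct identifier with a salted one-way pseudonym."""
    pseudonym = hashlib.sha256(salt + record["subject_id"].encode()).hexdigest()
    return {"pseudonym": pseudonym, "payload": record["payload"]}

def bucket_by_category(records: list, salt: bytes) -> dict:
    """Collate de-identified records into per-category "locked boxes",
    keyed by a categorized metatag, ready for per-category encryption."""
    boxes = {tag: [] for tag in CATEGORIES}
    for rec in records:
        boxes[rec["category"]].append(deidentify(rec, salt))
    return boxes

salt = secrets.token_bytes(16)  # per-subject secret; held by the enclave
records = [
    {"subject_id": "alice", "category": "A", "payload": "scan-001"},
    {"subject_id": "alice", "category": "B", "payload": "plan-gold"},
]
boxes = bucket_by_category(records, salt)
# Each box now holds only pseudonymized payloads; linking them back to
# "alice" requires the salt, which never leaves the enclave.
```

Note that the same subject yields the same pseudonym across categories, so an authorized party holding the salt can reunite a subject's data, while anyone holding only the boxes cannot.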

Today, information is stored and accessed by location. If a breach occurs, this practice increases the risk of exposure, because information about many data subjects is bundled together. Categorizing and storing personal information by content effectively prevents personal identity from being connected with the data itself. Only SmartData will know its data subject and the pointers to that subject's unique personal information, accessed by a unique private key.

SmartData Security Structure, George Tomko

Ensuring Effective Performance while Maintaining Individual Privacy

Organizations that want to utilize data effectively to improve efficiencies and performance will take a different route. How do companies analyze and target effectively without exposing personal data? Tomko declares that Federated Learning, which distributes data analytics such as machine learning (ML), is key:

Federated Learning provides an alternative to centralizing a set of data to train a machine learning algorithm, by leaving the training data at its source. For example, a machine learning algorithm can be downloaded to a myriad of smartphones, leveraging each smartphone's data as a training subset. The different devices can now contribute to the knowledge and send the trained parameters back to the organization to aggregate. We can also substitute the smartphones with the secure enclaves that protect each data subject's personal information.

Here's how it would work: an organization wants to develop a particular application based on machine learning, which requires some category of personal data from a large number of data subjects as a training set. Once it has received consent from the data subjects, it downloads the learning algorithm to each subject's trusted enclave. The relevant category of encrypted personal data is then retrieved, decrypted with the enclave's secret key, and used as input to the machine learning algorithm. The trained learning weights from all data subjects' enclaves are then sent to a master enclave within the network, which aggregates them. The iteration continues until accuracy is optimized, at which point the weights are sent to the organization. Tomko affirms,
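The loop described above is essentially federated averaging. Here is a minimal sketch, with a toy one-step "training" rule standing in for the real learning algorithm; the subject data and learning rate are invented for illustration and are not part of the SmartData design:

```python
def local_update(weights, data, lr=0.1):
    """One toy "training" step inside a subject's enclave: nudge each
    weight toward the subject's decrypted data point."""
    return [w - lr * (w - x) for w, x in zip(weights, data)]

def aggregate(all_weights):
    """Master enclave: average the trained weights across subjects.
    Only this aggregate ever leaves the network of enclaves."""
    n = len(all_weights)
    return [sum(ws) / n for ws in zip(*all_weights)]

# Three subjects' private data, never pooled centrally.
subjects = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
global_weights = [0.0, 0.0]
for _ in range(50):  # iterate until the weights stop improving
    trained = [local_update(global_weights, d) for d in subjects]
    global_weights = aggregate(trained)
# global_weights approaches the mean of the subjects' data ([3.0, 4.0])
# even though no single subject's data was ever shared.
```

The organization receives only the converged aggregate; recovering any one subject's values from it is not possible once many subjects contribute.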


The organization will only have the aggregated weights, optimized over the personal data of many data subjects. It would not be able to reverse-engineer the personal data of any single data subject. The organization would never have access to anyone's personal data, plaintext or otherwise, yet would still be able to accomplish its data analytic objectives.

Federated Learning - Master Enclave, George Tomko

Building a Secure Personal Footprint in the Cloud

To ensure personal web transactions are secure, a person instructs his SmartData agent to, for example, book a flight. The instruction is transmitted to the cloud using a secure protocol such as IPsec. This digital specification (a binary string) is decrypted and downloaded to one of many reconfigurable computers, which interpret the instructions.

Natural language processing (NLP) would convert the verbal instructions into a formal language, and would handle the encoded communications back and forth between subject and organization to facilitate the transaction, eliciting permission for passport and payment information. What's different is the creation of an agreement (stored on the blockchain) that confirms the consented terms of use between the parties. It also adds an incentive component: through cryptocurrency, the data subject can be compensated for their information, if required. This mechanism would be used before every transaction to ensure transparency and expediency between the parties.
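To see why a blockchain-style record makes such agreements tamper-evident, consider this minimal sketch of a hash-chained consent log. The field names, parties and terms are hypothetical; a real deployment would use an actual ledger with digital signatures rather than this toy chain:

```python
import hashlib
import json

def make_agreement(prev_hash: str, subject: str, org: str, terms: dict) -> dict:
    """Record one consented transaction, chained to the previous record
    so that any later tampering breaks the chain of hashes."""
    body = {"prev": prev_hash, "subject": subject, "org": org, "terms": terms}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    body["hash"] = digest
    return body

genesis = make_agreement("0" * 64, "subject-001", "airline-x",
                         {"use": "flight booking", "share_passport": True,
                          "compensation_tokens": 2})
followup = make_agreement(genesis["hash"], "subject-001", "airline-x",
                          {"use": "payment", "share_passport": False})
# Altering the genesis terms would change its hash, which would no longer
# match followup["prev"], so the modification is detectable.
```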

Tomko realizes Blockchain has its limitations:

Everyone wants to remove the intermediary and the crypto environment is moving quickly. However, we can’t rely on Blockchain alone for privacy because it is transparent, and we can’t use it for computation because it is not scalable.

AI as it exists today is working through some stumbling blocks. Most experiments fall largely within ANI (Artificial Narrow Intelligence): models and solutions built for very specific domains that cannot be transferred to adjacent domains. Deep learning has its limitations. The goal of SmartData is to develop a smart digital personal assistant that serves as a proxy for the data subject across varied transactions and contexts. Tomko illustrates,

With current Deep Learning techniques, different requests such as ‘Hey SmartData, buy me a copy of …” or “book me a flight to…” encompass different domains, and accordingly, require large sets of training data specific to that domain. The different domain-specific algorithms would then need to be strung together into an integrated whole, which, in effect, would become SmartData. This method would be lengthy, computationally costly and ultimately not very effective.

The promise of AI, to explain and understand the world around us, has yet to reveal itself.

Tomko explains:

To date, standard machine learning (ML) cannot achieve the incremental learning that is necessary for intelligent machines; it lacks the ability to store learned concepts or skills in long-term memory and to use them to compose and learn more sophisticated concepts or behaviors. A system that emulates the human brain in explaining and generally modeling the world cannot be solely engineered. It has to be evolved, within a framework of thermodynamics, dynamical systems theory and embodied cognition.

Embodied cognition is a field of research that "emphasizes the formative role that both the agents' body and the environment will play in the development of cognitive processes." Put simply, these processes develop when tightly coupled systems emerge from the real-time, goal-directed interactions between agents and their environments; in SmartData's case, a virtual environment. Tomko notes that the underlying foundation of intelligence (including language) is action.

Actions cannot be learned in the traditional ML way but must be evolved through embodied agents. The outcomes of these actions will determine whether the agent can satisfy the data subject’s needs.

Tomko references W. Ross Ashby, a cybernetics pioneer of the 1950s, who proposed that every agent has a set of essential variables that serve as its benchmark needs, against which all of its perceptions and actions are measured. The agent's existential goal is always to satisfy those needs. Using this model (see below), we can train the agent to satisfy the data subject's needs while retaining the subject's code of ethics. The essential variables determine the threshold between low surprise and high surprise. Ideally, the agent should try to maintain a low-surprise, homeostatic state (within the manifold) to be satisfied; anything outside the manifold, i.e., high surprise, should be avoided. Tomko uses Ashby's example of a mouse that wants to survive: if a cat is introduced, the mouse builds a causal model of its needs, comparing its sensory inputs against its benchmark needs to determine how to act when the cat is present and so maintain its life-giving states.

Apply this to individual privacy. As per Tomko,

The survival range will include parameters for privacy protection. Therefore, if the needs change, or there is a modified environment or changing context, the agent will modify its behavior automatically and adapt, because its needs are the puppet-master.

This can be defined as a reward function: we reward actions that result in low surprise, or low entropy. For data privacy, ideally we want to avoid any potential actions that would lead to privacy violations, which equate to high surprise (and greater disorder).
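A toy version of such a reward function might look like the following. The setpoint, tolerance and numeric risk scores are invented for illustration; they are not part of Tomko's specification:

```python
def surprise(state, setpoint=0.0, tolerance=1.0):
    """Distance of an essential variable from its homeostatic setpoint,
    scaled by the manifold's tolerance; above 1.0 means the agent has
    left the manifold (high surprise)."""
    return abs(state - setpoint) / tolerance

def reward(state):
    """Low surprise earns a positive reward; high surprise (e.g. a
    potential privacy violation) is penalized heavily."""
    s = surprise(state)
    return 1.0 - s if s <= 1.0 else -10.0

# Candidate actions and the privacy-risk state each would produce
# (0.0 = fully private, larger = riskier; numbers are invented).
actions = {"share_aggregate_only": 0.2, "share_raw_record": 3.0}
best = max(actions, key=lambda a: reward(actions[a]))
# The agent picks "share_aggregate_only", avoiding the high-surprise action.
```

Because the reward is computed from the agent's needs rather than hard-coded per situation, changing the needs (or the context that maps actions to risk) automatically changes the behavior, which is the "needs as puppet-master" idea above.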


Manifold of Needs, George Tomko

Toronto’s Sidewalk Labs: The Need for Alternative Data Practices

At the time of writing, Dr. Ann Cavoukian, Expert-in-Residence at Ryerson University and former three-term Privacy Commissioner, resigned as an advisor to Sidewalk Labs in Toronto, a significant project backed by Alphabet that aimed to develop one of the first privacy-protective smart cities in the world. Because of her strong advocacy for individual privacy, Cavoukian's resignation drew national media attention. She explains,

My reason for resigning from Sidewalk Labs is only the tip of the iceberg of a much greater issue in our digitally oriented society.  The escalation of personally identified information being housed in central databases, controlled by a few dominant players, with the potential of being hacked and used for unintended secondary uses, is a persistent threat to our continued functioning as a free and open society.

Organizations in possession of the most personal information about users tend to be the most powerful. Google, Facebook and Amazon are but a few examples in the private sector… As a result, our privacy is being infringed upon, our freedom of expression diminished, and our collective knowledge base outsourced to a few organizations who are, in effect,  involved in surveillance fascism. In this context, these organizations may be viewed as bad actors; accordingly, we must provide individuals with a viable alternative…

The alternative to centralization of personal data storage and computation is decentralization – place all personal data in the hands of the data-subject to whom it relates, ensure that it is encrypted, and create a system where computations may be performed on the encrypted data, in a distributed manner… This is the direction that we must take, and there are now examples of small startups using the blockchain as a backbone infrastructure, taking that direction.  SmartData, Enigma, Oasis Labs, and Tim Berners-Lee’s Solid platform are all developing methods to, among other things, store personal information in a decentralized manner.

Other supporters of Dr. George Tomko concur:

Dr. Don Borrett, a practicing neurologist with a background in evolutionary robotics and a master's degree from the Institute for the History and Philosophy of Science and Technology at the University of Toronto, states:

By putting control of personal data back into the hands of the individual, the SmartData initiative provides a framework by which respect for the individual and responsibility for the collective good can be both accommodated.

Bruce Pardy, a law professor at Queen's University who has written on a wide range of legal topics (human rights, climate change policy, free markets and economic liberty, among others), declares:

The SmartData concept is not just another appeal for companies to do better to protect personal information. Instead, it proposes to transform the privacy landscape. SmartData technology promises to give individuals the bargaining power to set their own terms for the use of their data and thereby to unleash genuine market forces that compel data-collecting companies to compete to meet customer expectations.

Dr. Tomko is correct: the time is indeed ripe. Sidewalk Labs, an important experiment that will vault us into the future, is an example of the journey many companies must take to propel us toward a future where privacy is commonplace.

This article originally appeared on Forbes.


Investment

4 ways to plan for the post-pandemic normal

When the crisis eases, we will have entered a new digital normal. Your strategies need to reflect this shift: Consider these factors as you plan for the longer term.


This post originally appeared at Enterprisers Project.

When I sat down to write this article, a follow-on to my previous article on common leadership oversights on the path to digital transformation, the coronavirus’s threat to global business had not reached the magnitude that we feel and see today. In a few short weeks, the pandemic has forced a new virtual work reality on businesses and entire operating models have been shifted – and in many cases, upended.

A business environment that is changing so dramatically and rapidly requires speed, innovation on the fly, and the need to scale thinking beyond anything we might have previously imagined. Now is not the time to back-burner digital initiatives but to ramp them up.


When the crisis eases, we will have entered a new digital normal. The strategies we use to run, change, and staff the business will need to reflect this shift. Consider the following factors as you plan for the longer term:

1. The right financials

Any business that isn’t digital by now likely won’t be a business for long. Learning to embrace and adjust is imperative. Continuing – or starting – a digital transformation will be more important than ever, and you’ll need to rethink your business’ capital allocation strategies for digital initiatives and the staffing that supports them.

To figure this out, become best friends with your finance team and think for both the short- and long-term. In the current climate, it can be easy to be either too short-sighted or too far-sighted, but you need to plan for the next week, month, quarter, year, three and five years.


Consider how your company may bounce back from the pandemic when stay-at-home orders are lifted, kids go back to school, and consumers begin to mobilize again: We will have entered an entirely different digital world, with new digital expectations from consumers. Is there potential for a rapid and significant surge, followed by a normalization? Will you be facing a slow rise? Digital transformation funds need to be allocated to react appropriately to these various scenarios; staffing discussions should follow based on these decisions.

2. The right tools

It is likely that at least some of your employees will remain virtual, even when the majority can get back into the office. How will you support them? You may have sacrificed some tools or technologies in your move to quickly get employees out of your building and into their homes; you may have also overpaid for the sake of quick deployment.

You’ll need to rework your strategy for the long term. This could include better or more consistent access to networks and servers, the capacity to host formal business meetings online, new portable equipment, virtual collaboration and communication software, and more.

For many, this will require working with your corporate legal team to change their thinking. Where they may have once been risk-averse for the sake of the business, they will now need to take smart risks, also for the sake of the business. State your case, find common ground, and move forward.

In some particularly dire situations, you may even need to become comfortable with making decisions first and asking for permission later.

3. The right staffing

You’ll need to continue to make smart staffing decisions – quickly. You likely have three types of talent available:

  • Employees who are great at running the business
  • Employees who are hungry for more
  • New talent that may not yet exist in your business but needs to be brought in

Unfortunately, this global crisis may have created gaps in your workforce.

Identify the individuals in the first two groups and work with your talent management team to assess whether you need to advance digital investments previously planned for. Do these individuals have the right type of skills for their teams? Are they collaborative and communicative? IT cannot work in a silo, and team members need to be able to communicate what they are doing and why, and be clear on how their actions are aligned to larger goals.

When you’ve completed this review, identify the additional skills you will need for the future. This might include teams familiar with building out cloud deployments or working with microservices, etc. Push the rest of your leadership team to break through capital allocation constraints to bring in new employees who not only have the right experience but also can quickly teach your existing teams on new tools organically.

4. The right brand permission

As you work through your accelerated digital transformation, you’ll start to think about your business as a truly digital brand. In fact, you might already think so, simply because you’ve been able to get your staff up and running remotely.

But is this the perception all your stakeholders have? According to the Yale School of Management, “Brand permission defines the limits of customers’ willingness to accept a familiar brand name in new marketplace situations.” For example, you can’t simply say, “We are digital now, world!” and expect your market to immediately accept that if you haven’t been digital historically. You need to earn this right.


Brand permission is something you and the rest of the company will need to work on – largely focused on delivering useful and impactful digital products and services – in order to attract the new talent you need. Start thinking about this now.

The global pandemic has thrown us into an entirely new world. Business leaders can no longer rest on their laurels and, certainly, can no longer put off or draw out a digital transformation. Making the right decisions now will help to ensure your business is positioned well when this crisis passes.



Technology

Five key trends shaping the application landscape


According to application services/application delivery company F5 Networks, 98% of organizations depend on applications to run or support their business — hardly surprising considering that most organizations have some version of a digital transformation plan.

In their new 2020 State of Application Services Report, F5 has found that most organizations have entered the second phase of DX, defined as the integration of automated tasks, “and taking advantage of cloud-native infrastructures to scale the process with orchestration.”

As Lori MacVittie, Principal Technical Evangelist, Office of the CTO at F5 Networks explains in a blog post about the rise of cloud-native architectures, the average enterprise app portfolio is now at 15% modern, microservices-based applications. 

“That’s now more than the stalwart 11% of monolithic / mainframe-hosted applications,” she adds. “Considering reports of extreme backlogs for new applications in every industry, that modern apps have consumed such a significant percentage of the corporate portfolio is nothing short of impressive.”

Based on a global survey of nearly 2,600 senior leaders from various industries, company sizes, and roles, F5’s report outlines five key findings on the trends shaping the application landscape, “and how organizations around the world are transforming to meet the ever-changing demands of the digital economy.”

1. 80% of organizations are executing on digital transformation—with increasing emphasis on accelerating speed to market. 

As organizations work to scale their DX efforts via a digital footprint with cloud, automation, and containers, “it is time to manage the application portfolio like the business asset it is.” 

“Organizations able to harness the application (and API) data and insights generated will be rewarded with significant business value.” 

2. 87% of organizations are multi-cloud and most still struggle with security.

27% of respondents reported that they will have more than half of their applications in the cloud by the end of 2020. 

But despite the crucial importance of applications to business strategy, “organizations are much less confident in their ability to withstand an application-layer attack in the public cloud versus in an on-premises data center.”

When F5 asked how organizations decided which cloud is best for their applications, 41% responded that it was on a “case-by-case, per application” basis — an important strategy, given the uniqueness of each application and the purpose it serves for the business. 

“It is imperative to have application services that span multiple architectures and multiple infrastructures,” outlines the report, “to ensure consistent (and cost-effective) performance, security, and operability across the application portfolio.”

3. 73% of organizations are automating network operations to boost efficiency.

Process optimization is a key motivation for DX efforts, which makes it unsurprising that most organizations are automating their network operations. The goal? Consistent automation across key pipeline components: app infrastructure, app services, network, and security.

“Despite the fact that network automation continues to rise, we are still a long way from the continuous deployment model necessary for business to really take advantage of digital transformation and expand beyond optimization of processes to competitive advantage in the marketplace.”

Respondents report that the most frequent obstacles to continuous deployment are “a lack of necessary skill sets, challenges integrating toolsets across vendors and devices, and budget for new tools.” 

4. 69% of organizations are using 10 or more application services.

With the maturation and scaling of cloud-and container-native application architectures, “more organizations are deploying related app services, such as Ingress control and service discovery, both on premises and in the public cloud.”

Among the most widely deployed application services are those dealing with corporate and per-application security. "For the third year running, respondents told us by a wide margin (over 30 percentage points) that the worst thing they could do is deploy an app without security services," details the report.

5. 63% of organizations still place primary responsibility for app services with IT operations, with more than half moving to DevOps-inspired teams. 

It's also no surprise to find that "as organizations transform from single-function to modern ops-oriented team structures," adds the report, "responsibility begins to shift from IT operations and NetOps to SecOps and DevOps."

One reason why? The shift of application services into modern architectures. “DevOps teams are intimately involved with the CI/CD pipeline, which, for cloud- and container-native apps, includes a growing portfolio of application services such as ingress control, service mesh, service discovery, and good old-fashioned load balancing.” 


Leadership

Digitized and digital: Two sides of the digital transformation coin


According to a research brief out of MIT, thriving in the digital age means undergoing two distinct transformations: Digitization, i.e. the incorporation of digital technology into core operations like accounting and invoicing, and becoming digital — “developing a digital platform for the company’s digital offerings.”

While both of these require companies to embrace emerging technologies, these present two distinct challenges, each with a differing set of rules and strategies. As explained by Sara Brown from the MIT Sloan School of Management, “Becoming digitized relies on traditional business methods. Becoming digital requires breaking old rules and embracing new thinking.” 

Digitization relies on the company’s operational backbone, which supports core operations — i.e. how a company delivers goods and services, maintains its books of record, and completes essential back office processes, explains the research brief. Traditionally, base technologies for these were ERPs, CRMs, and core banking engines. Today, though, it’s likely software-as-a-service (SaaS).

At the same time, becoming digital means creating a digital platform — “a foundation for a company’s digital offerings and their rapid innovation.” Creating speed and innovation, “this platform, a combination of different software components that can link with partners and connect with customers, enables a company to quickly develop and add new digital offerings, and targets revenue growth,” explains Brown.

When it comes to managing both sides of this digital coin, decision-makers must manage leadership, operational, and cultural differences, Brown says:

Leadership: For digitization, leadership is firmly in place, making clear decisions, outlining processes and standards, and ensuring adoption success. 

For a digital platform, however, top-down decision making stands in the way of success. Trusted teams are in the driver’s seat, innovating and implementing new ideas. It’s up to management to define an overall digital vision.

Operational: “Changes to the operational backbone can be planned and evaluated using traditional methods like metrics and customer satisfaction,” writes Brown. On the digital platform side, these methods only result in frustration.

Cultural: Digitization isn’t changing the fundamental place of the operational backbone, MIT’s research found. A digital platform, however, “means radical changes in how decisions are made and work gets done. This can be uncomfortable for people at every level.”

Image via the MIT Center for Information Systems Research

When it comes to actually managing these two different teams, MIT researchers suggest these three actions:

Keep ‘em separated: Simultaneous management of digitization and digital means clearly distinguishing their separate responsibilities, says the research brief. Examples of companies that have taken this approach include Schneider Electric, Royal Philips, and Toyota. In another example, one organization’s operational backbone was managed by the CIO, with a Chief Digital Officer taking the lead on the digital platform.

Funding should also be separate. As the researchers outline, “People responsible for digitization can better pursue operational excellence when the operational backbone receives consistent investment, year after year, at the enterprise level.” Meanwhile, funding for short-term digital innovation “experiments” can be easily upped or decreased, depending on outcomes.

It’s important, however, to keep the overall shared vision in mind, explains tech specialist and Tech Wire Asia editor Soumik Roy, for TechHQ. Leaders might feel that separate teams are a waste of resources, he writes, “because ultimately, the business needs its digital initiatives to converge — like its data, analytics, and platforms.” But in reality, separate teams can optimize DX efforts, but only if a shared vision of the organization’s future is kept top of mind: “Each team, working on their own side of improvements, can make contributions that help move closer to the end state. In practice, this is often more productive as well.”

Rule breaking: Inherent in digital innovation is breaking old rules and making new ones, the researchers found, from subverting budget processes to guarantee resources, to bypassing CRM approaches, among other challenges.

Rule breaking ends up being manageable because it’s relatively contained to a small team that’s experimenting, though it’s crucial digital teams have sign-off and ongoing support from senior leadership. 

New leadership: "Not all people who have successfully led traditional businesses are well-suited to digital business leadership," says the brief. "The idea of breaking rules to identify what works may feel terribly unnerving for some, even when they have been encouraged to experiment."

If someone in a leadership position isn’t comfortable with creating new rules, they explain, coaching could be implemented to help guide them in the right direction. Alternatively, there is likely plenty of new talent that is ready to implement a shift.

Share this:
Continue Reading

Featured