Technology

Why Big Business Should Proactively Build for Privacy


This article explores the rise of Privacy by Design (PbD): from the basic framework, to its inclusion in the GDPR, to its application in business practices and infrastructure, especially in the wake of artificial intelligence.

We had the pleasure of sitting down with Dr. Ann Cavoukian, former three-term Privacy Commissioner of Ontario and currently Distinguished Expert-in-Residence leading the Privacy by Design Centre of Excellence at Ryerson University in Toronto, Canada, to discuss this massive shift that will upend current business practices. We’ve also sought responses from top executives at AI start-ups and enterprises to address the current hurdles and future business implications of Privacy by Design. This article includes contributions from Scott Bennet, a colleague researching privacy and GDPR implications for emerging technology and current business practices.

I call myself an anti-marketer, especially these days. My background has predominantly come from database marketing and the contextualization of data to make more informed decisions to effectively sell people more stuff. The data that I saw, whether it be in banking, loyalty programs, advertising and social platforms — user transactions, digital behaviour, interactions, conversations, profiles — were sewn together to create narratives about individuals and groups, their propensities, their intents and their potential risk to the business.

While it was an established practice to analyze this information in the way that we did, the benefit was largely to businesses and to the detriment of our customers. How we depicted people was based on the data they created and on our own assumptions, which in turn informed the analysis and, ultimately, created the rules that governed the data and the decisions. Some of these rules unknowingly baked in unintended bias, drawn from experience and from factors that perpetuated claims about a specific cluster or population.

While for many years I did not question the methods we used to understand and define audiences, it’s clear that business remained largely unchecked, having used this information freely with little accountability and legal consequence.

As data becomes ever more central, and as AI analyzes and surfaces meaning at greater speeds, the danger of perpetuating these biases becomes even more serious and will inflict greater societal divisions if measures are not put in place and relentlessly enforced.

Recently, I met my maker. Call it atonement for the many years I manipulated data as a marketer. We had the honour of talking privacy with an individual I had admired for years. Dr. Ann Cavoukian, in my view, will drive a discussion across industry that will make business stand up and listen.

Remember when Canada’s Privacy Commissioner took on Facebook?

Ann Cavoukian has been an instrumental force in spreading awareness of privacy, which brought her front and centre on the world stage, pitted directly against Facebook in 2008. Back then, the federal Privacy Commissioner alleged that 22 of Facebook’s practices violated the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA). This eventually led to an FTC settlement with Facebook that mandated increased transparency with its users, requiring their explicit consent before “enacting changes that override their privacy settings.”

Ann Cavoukian is a household name in technology and business. As a three-term Privacy Commissioner of Ontario, Canada, she has propelled the privacy discussion for decades. Today that discussion has reached a fever pitch as the EU General Data Protection Regulation (GDPR), which came into effect May 25, 2018, includes Cavoukian’s long-advocated creation, Privacy by Design (PbD). This will raise the bar dramatically, and any company or platform that does business with the EU will need to comply with these standards. At the heart of the GDPR are these guiding principles for collecting, storing and processing personal consumer information:

  • Lawfulness, fairness and transparency
  • Purpose limitation
  • Data minimization
  • Accuracy
  • Storage limitation
  • Integrity and confidentiality (security)
  • Accountability

Privacy by Design’s premise is to proactively embed privacy at every stage in the creation of new products or services in a way that’s fair and ethical. Cavoukian argues that by implementing PbD, companies would, in effect, be well on their way to complying with the GDPR.

What Makes this Moment Ripe for Privacy by Design?

In the 1990s, the web was growing exponentially. Commerce, online applications, and platforms were introducing a new era that would dramatically change business and society. Ann Cavoukian was, at the time, in her first term as Privacy Commissioner of Ontario. She witnessed this phenomenon and was concerned it would grow dramatically; in an era of ubiquitous computing, increasing online connectivity and massive social media, she surmised that privacy needed to be built on a model of prevention, not one which simply “asked for forgiveness later.”

Imagine going to your doctor, and he tells you that you have some signs of cancer developing and says, “We’ll see if it gets worse and if it does, we’ll send you for some chemo”. What an unthinkable proposition! I want it to be equally unthinkable that you would let privacy harms develop and just wait for the breach, as opposed to preventing them from occurring. That’s what started PbD.

In 2010, at the International Conference of Data Protection and Privacy Commissioners, Cavoukian advanced a resolution that PbD should complement regulatory compliance in order to mitigate potential harms. It was passed unanimously. The reason?

Everyone saw this was just the tip of the iceberg in identifying the privacy harms, and we were unable to address all the data breaches and privacy harms that were evading our detection: the sophistication of perpetrators meant that the majority of breaches remained largely unknown, unchallenged and unregulated. As a result, PbD became a complement to the existing privacy regulation, which was no longer sustainable as the sole method of ensuring future privacy.

These days the issue of data security has gotten equal, if not more, airplay. Cavoukian argues:

When you have an increase in terrorist incidents like San Bernardino, the Charlie Hebdo attacks in Paris, and Manchester, the pendulum swings right back to: Forget about privacy — we need security. Of course we need security — but not to the exclusion of privacy!

I always say that Privacy is all about control — personal control relating to the uses of your own data. It’s not about secrecy. It drives me crazy when people say ‘Well, if you have nothing to hide, what’s the problem?’ The problem is that’s NOT what freedom is about. Freedom means YOU get to decide, as a law-abiding citizen, what data you want to disclose and to whom — to the government, to companies, to your employer.

Pew Research conducted an internet study post-Snowden to take the consumer pulse on individual privacy. Key findings cited:

There is widespread concern about surveillance by both government and business:
• 91% of adults agreed that consumers had lost control over their personal information;
• 80% of social network users are concerned about third parties accessing their data;
• 80% of adults agreed that Americans should be concerned about government surveillance.

Context is Key:

And while there are those who understand they are trading their information for an expectation of value, they should be fully informed of how that value is extracted from their data. Cavoukian cautions:

Privacy is not a religion. If you want to give away your information, be my guest, as long as YOU make the decision to do that. Context is key. What’s sensitive to me may be meaningless to you and vice versa… At social gatherings, even my doctors won’t admit they’re my doctors! That’s how much they protect my privacy. That is truly wonderful! They go to great lengths to protect your personal health information.

The importance of selling the need for privacy includes persistent education. Unless people have been personally affected, many don’t make the connection. Does the average person know the implications of IoT devices picking up the “sweet nothings” they’re saying to their spouse or their children? When they realize it, they usually vehemently object.

Context surfaces the importance of choice. It is no longer an all-or-nothing game subsumed under a company’s terms and conditions, where one click on “Accept” automatically grants full permission. Those days are over.

And while someone may object to analysis and contextualization for insurance purposes, they may still allow their personal health history to be included, in anonymized form, in research to understand cancers endemic to their particular region.

Context is a matter of choice; freedom of choice is essential to preserving our freedom.

Privacy Does Not Equal Secrecy

Cavoukian emphasizes that privacy is not about having something to hide. Everyone has spheres of personal information that are very sensitive to them, which they may or may not wish to disclose.

You must have the choice. You have to be the one to make the decision. That’s why the issue of personal control is so important.

I extracted this slide from Ann Cavoukian’s recent presentation:

The Chinese Social Credit System (https://www.wired.co.uk/article/china-social-credit) was created to develop more transparency and improve trustworthiness among its citizens. It’s a dystopia we do not want. China is a clear surveillance society, one that contradicts a free society’s values. Cavoukian crystallizes the notion that privacy forms the foundation of our freedom. If you value freedom, you value privacy.

Look at Germany. It’s no accident that Germany is the leading privacy and data protection country in the world. It’s no accident they had to endure the abuses of the Third Reich and the complete cessation of their privacy and their freedom. And when that ended, they said, ‘Never again will we allow the state to strip us of our privacy — of our freedom!’ And they have literally stood by that.

Post-Snowden, I wrote this: The NSA, Privacy and the Blatant Realization: Nothing You Do Online is Private, and referenced a paragraph written by Writynga in his response to Zuckerberg’s view at the time (2012) that privacy was no longer a social norm:

We like to say that we grew up with the Internet, thus we think that the Internet is all grown up. But it’s not. What is intimacy without privacy? What is a democracy without privacy?…Technology makes people stupid. It can blind you to what your underlying values are and need to be. Are we really willing to give away our constitutional and civil liberties that we fought so hard for? People shed blood for this, to not live in a surveillance society. We looked at the Stasi and said, ‘That’s not us.’

The will of the people has demanded more transparency.

But we don’t want a state of surveillance that eerily feels like we’re living in a police state. There has to be a balance between ensuring the security of the nation and preserving our civil liberties.

People will have Full Transparency… Full Control… Anytime

Since the passing of Privacy by Design (PbD) as an international standard in 2010 to complement privacy regulation, PbD has been translated into 40 languages. The approach has been modified to include the premise that efforts to ensure individual privacy can be achieved while developing consumer trust and improved revenue opportunities for business within a Positive Sum paradigm. Cavoukian is convinced this is the practical way forward for business:

We can have privacy and meet business interests, security and public safety … it can’t be an either/or proposition. I think it’s the best way to proceed, in a positive-sum, win/win manner, thereby enabling all parties to gain.

Privacy by Design’s Foundational Principles include:

  1. Proactive not Reactive: preventive not remedial
  2. Privacy as the default setting
  3. Privacy embedded into design
  4. Full functionality: positive sum, not zero-sum
  5. End-to-end security: full lifecycle protection
  6. Visibility and transparency: keep it open
  7. Respect for user privacy: keep it user-centric

Cavoukian contends that Principle #2, Privacy as the Default Setting, is critical and, of all the foundational principles, the hardest to implement: it demands the most investment and effort, with explicit requirements that change how data is collected, used and disclosed, and it will force changes to data policy and process, including new user-centric privacy controls.
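Privacy as the default can be illustrated with a minimal sketch (hypothetical class and field names, not drawn from any real platform): every sharing option starts off, and a purpose is permitted only where the user has explicitly granted it.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    """Hypothetical user settings: every share flag defaults to OFF."""
    share_with_partners: bool = False
    personalized_ads: bool = False
    analytics_tracking: bool = False
    consents: dict = field(default_factory=dict)  # purpose -> granted?

    def grant(self, purpose: str) -> None:
        # Consent is an explicit, recorded user action -- never assumed.
        self.consents[purpose] = True

    def is_allowed(self, purpose: str) -> bool:
        # Anything not explicitly granted is denied by default.
        return self.consents.get(purpose, False)

settings = PrivacySettings()
assert not settings.is_allowed("direct_marketing")  # default: denied
settings.grant("direct_marketing")
assert settings.is_allowed("direct_marketing")
```

The point of the sketch is the direction of effort: the system never has to be told “no”; it has to be told “yes”.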

Article 21 of the GDPR states that individuals have the “right to object” to the processing of their personal information at any time, including its use in direct marketing and profiling:

“The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights, and freedoms of the data subject.”

The business must be more explicit and go much further, beyond traditional disclosure and terms of service. Purpose specification and use limitation require an organization to be explicit about the information it requires and for what purpose, and to elicit consent for that purpose and that purpose alone. If a secondary use transpires later, the organization must obtain the user’s consent once again. If disclosure is key to transparency, businesses will need to find a way to do this while mitigating consent fatigue.
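Purpose specification and re-consent for secondary uses could be sketched roughly like this (a hypothetical ledger with invented names; an illustration of the idea, not a compliance implementation):

```python
class ConsentLedger:
    """Hypothetical sketch: consent is recorded per purpose. A secondary use
    not covered by an existing grant must trigger a fresh consent request."""
    def __init__(self):
        self._grants = {}  # user_id -> set of consented purposes

    def record(self, user_id, purpose):
        self._grants.setdefault(user_id, set()).add(purpose)

    def check(self, user_id, purpose):
        return purpose in self._grants.get(user_id, set())

def process(ledger, user_id, purpose):
    # Guard every processing step with a purpose-specific consent check.
    if not ledger.check(user_id, purpose):
        raise PermissionError(f"re-consent needed for purpose {purpose!r}")
    return f"processing {user_id} for {purpose}"

ledger = ConsentLedger()
ledger.record("u1", "order_fulfilment")    # consent given at collection time
process(ledger, "u1", "order_fulfilment")  # permitted: purpose matches
# process(ledger, "u1", "profiling")       # would raise: secondary use, no consent
```

The design choice worth noting is that consent is keyed by purpose, not stored as a single boolean: that is what makes “that purpose and that purpose alone” enforceable in code.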

Article 17 establishes a much stronger user right that cuts against current business practices: the Right to Erasure (“the right to be forgotten”).

The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay.

While this provision will have exceptions, such as data that establishes the data subject as an entity (health records and banking information), behavioural data, transactions, future analysis in profiling, and contextual models are all fair game for “the right to be forgotten.” The advent of the GDPR has already given business a glimpse of the potential impact: some companies saw customer record volumes drop by an average of 20% for customers who did not explicitly opt in.
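A minimal sketch of what an erasure handler respecting such exemptions might look like (hypothetical record types and store layout; a real handler would also cover backups and downstream systems):

```python
# Record types under a legal retention obligation are kept;
# behavioural and profiling data are purged.
EXEMPT_TYPES = frozenset({"health_record", "banking_record"})

def erase_user(store, user_id):
    """Erase all non-exempt records for user_id; return the count erased."""
    records = store.get(user_id, [])
    kept = [r for r in records if r["type"] in EXEMPT_TYPES]
    store[user_id] = kept
    return len(records) - len(kept)

store = {"u1": [
    {"type": "banking_record"},
    {"type": "browsing_history"},
    {"type": "ad_profile"},
]}
erase_user(store, "u1")  # -> returns 2; only the banking record remains
```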

This is a truly user-centric system. Make no mistake, Privacy by Design will challenge current practices and upend current infrastructures.

This privacy UI simulation (IBM: Journey to Compliance) displays how potential user controls will work in real time and the extent to which the user can grant consent based on different contexts. This level of user access will require a data repository that can purge user information, yet is configured with the flexibility to redeploy the data into systems down the road, should the user decide to revert.
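One way to get purge-with-revert behaviour of the kind described above is to move withdrawn records out of active systems into a sealed holding area rather than hard-deleting them immediately (a simplified sketch with invented names; a real system would also need retention clocks and audit logs):

```python
class ConsentAwareStore:
    """Hypothetical sketch: withdrawing consent pulls a user's records out of
    active systems; if the user later re-consents, they can be redeployed."""
    def __init__(self):
        self.active = {}      # user_id -> records served to live systems
        self.quarantine = {}  # user_id -> records held pending deletion or revert

    def withdraw(self, user_id):
        # Live systems no longer see this user's data.
        if user_id in self.active:
            self.quarantine[user_id] = self.active.pop(user_id)

    def revert(self, user_id):
        # The user changed their mind: redeploy the held records.
        if user_id in self.quarantine:
            self.active[user_id] = self.quarantine.pop(user_id)

store = ConsentAwareStore()
store.active["u1"] = ["profile", "history"]
store.withdraw("u1")  # purged from live systems
store.revert("u1")    # data redeployed on renewed consent
```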

 

Can Privacy by Design Create a Positive-Sum Existence for Business?
If you had asked me a year ago, I would have argued that Privacy by Design is not realistic for business adoption, let alone acceptance. It will upend process, structure and policy. However, under the mandate of the GDPR, it is an inevitability.

We asked Ann Cavoukian to consider business practices today. Both Google and Facebook have faced complaints in the wake of the GDPR seeking fines to the tune of $9.3 billion. Because of the recent Cambridge Analytica data breach, Facebook is investing millions in tools and resources to minimize future occurrences. Its recent Q2 stock plummet took the market by surprise, but Zuckerberg made it clear the company would take a performance hit for a few quarters in order to improve the platform for its users… not for its shareholders. While that positions Facebook as a beacon of how companies should behave, its clear “ask forgiveness later” model undercut any appearance that the strategy was altruistic.

Emily Sharpe, Privacy Policy Manager at Facebook, contends that in preparation for the GDPR the company paid particular attention to the Article 29 Working Party’s Transparency Guidance:

We have prepared for the past 18 months to ensure we meet the requirements of the GDPR. We have made our policies clearer, our privacy settings easier to find and introduced better tools for people to access, download, and delete their information. In the run up to GDPR we asked people to review key privacy information which was written in plain language, as well as make choices on three important topics. Our approach complies with the law, follows recommendations from privacy and design experts, and is designed to help people understand how the technology works and their choices.

Cavoukian pointed to a study by IBM with the Ponemon Institute that brought awareness to the cost of data breaches. It reports that the global average cost of a data breach is up 6.4 percent over the previous year, to $3.86 million per incident. On a per-record basis, the average cost of each lost record rose 4.8 percent, to $148. As Cavoukian points out, these costs will continue to rise as long as you maintain Personally Identifiable Information (PII) at rest.
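Those per-record figures make exposure easy to estimate. A back-of-envelope sketch using the averages cited above:

```python
# 2018 IBM/Ponemon global average: $148 per lost or stolen record.
COST_PER_RECORD_USD = 148

def breach_exposure(records_at_rest):
    """Rough potential cost if every record at rest were breached."""
    return records_at_rest * COST_PER_RECORD_USD

breach_exposure(1_000_000)  # -> 148_000_000 (USD) for 1M PII records at rest
```

The arithmetic is crude, but it illustrates Cavoukian’s point: every identifiable record kept at rest is standing liability, which is why data minimization is a cost-control measure, not just a compliance one.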

The PbD solution requires a full end-to-end solution which includes both privacy and security:

  1. IT systems;
  2. accountable business practices; and
  3. networked infrastructure.

How Do You Address the Advertisers Who Successfully Monetize Data Today?

What do you say to advertisers and publishing platforms who play in this $560-billion industry? We can’t stop progress. The more data out there, the more demand from willing buyers to extract meaning from it. On the other hand, given the fallout from Facebook, some advertisers have been greylisted or blacklisted from advertising on the platform because of questionable practices or content. The platform changes have also significantly curbed ad-reach opportunities for current advertisers. This domino effect is now compounded by GDPR mandates to garner explicit consent and create greater transparency of data use. Ann Cavoukian said this:

The value of data is enormous. I’m sorry but advertising companies can’t assume they can do anything they want with people’s data anymore. I sympathize with them. I really do; their business model will change dramatically. And that is hard to take so I genuinely feel bad for them. But my advice is: that business model is dying so you have to find a way to transform this so you involve your customers, engage them in a consensual model where benefits will accrue to customers as well. Context is key. Give individuals the choice to control their information and gain their consent to exchange it for something they value from you.

Mary Meeker’s “Paradox of Privacy” points to consumers’ increasing demand for products and services that are faster, easier, more convenient and more affordable. This requires systems that can leverage personal information to make that a reality. Increased customization is the expectation, but it brings increased business risk. As long as current business practices persist, according to Cavoukian, businesses remain vulnerable to, as we’ve witnessed, incessant data breaches and cyber attacks. Equifax and Target are two cases in point.

Communication with the data subject needs to be a win/win (positive sum). Can the business provide the necessary value, while respecting the choices dictated by the individual? When AI becomes more pervasive this will become even more challenging as streaming data will require more real-time interfaces and applications that allow access and individual configuration of data types across various contexts and vertical uses.

I asked a few executives from data start-ups and established enterprises, all with considerable business-to-consumer experience spanning advertising, social technology and network platforms, to weigh in on the privacy debate:

Josh Sutton, CEO of Agorai, was formerly Global Head of Data and AI at Publicis.Sapient. In an advertising industry that drives hundreds of millions in revenue, the quest to build consumer relevance comes at a cost, and that cost proliferates as more companies look to artificial intelligence to drive precision:

Data is clearly one of the most valuable assets in the world today — especially with the growing importance of artificial intelligence (AI) which relies on massive amounts of data. Data privacy needs to be incorporated into the fabric of how these technologies work in order for society to get the most benefit from AI. To me, data privacy means having the ability to control when and why data that you own is used — not the ability to keep it secret which is a far easier task. For that to happen, there needs to be open and transparent marketplaces where people and companies can sell data that they create, as well as a consistent set of regulations for how companies can use data.

Dr. Nitin Mayande, Chief Scientist of Tellagence and former Data Scientist at Nike, concurs with Josh Sutton. Nitin has been studying social network behavior for years and understands the need to transform current approaches:

Sooner or later I envision a data marketplace — a supply side and a demand side. Today, companies leverage data at the user’s expense and monetize it. The end user does not experience any real economic benefit. Imagine a time when data becomes so valuable the individual can have full control and become the purveyor of his/her own information.

Dana Toering, Chief Revenue Officer at Yroo and former Managing Director at Adobe Advertising Cloud, saw over his career the emergence of ad platforms that relied heavily on treasure troves of data to gain ever greater granularity in ad targeting:

As an entire ecosystem I feel we are just now coming to terms with the evolution of value exchange that was established between end users and digital publishers and software developers starting in October 1994 when Hotwired.com ran the internet’s first banner ad. The monetization of audiences through advertising and wide-spread data harvesting of the same audiences in exchange for ‘free’ content or software has enabled the meteoric growth of the internet and the businesses that are built around it but has also enabled massive amounts of fraud and nefarious activity. Thankfully we are at a tipping point where corporations/brands and users alike are taking back data ownership and demanding transparency, as well as consent and accountability. Defining and managing the core tenets of this value exchange will become even more important (and complex) in the future with the rise of new technologies and associated tools. So the time is now to get it right so both businesses and users can benefit long term.

I have had intriguing discussions with Dr. Sukant Khurana, a scientist heading the Artificial Intelligence, Data Science, and Neurophysiology laboratory at CSIR-CDRI, India. An entrepreneur also working on various disruptive projects, he had this to say, echoing the sentiments above:

The debate between privacy and security is a misleading one, as the kind and amount of data shared with private companies and the government need not and should not be the same. AI has been vilified in data privacy issues but the same technology (especially the upcoming metalearning approaches) can be used to ensure safety while preventing unwanted marketing and surveillance. If the monitoring tools (by design) were made incapable of reporting the data to authorities, unless there was a clear security threat, such situation would be like having nearly perfect privacy. It is technologically possible. Also, we need to merge privacy with profits, such that by and large, companies are not at odds with the regulatory authorities. This means there needs to be smarter media and social platforms, which present more choices for data sharing, choices that are acceptable between the end customer and the platforms.

Alfredo C. Tan, Industry Professor at the DeGroote School of Business, McMaster University, has extensive experience with B2C advertising platforms and understands the need for a fair exchange with trust baked in:

If there was better control and understanding of how personal data is being used, I believe people would be willing to be more open. The balance is ensuring there is a fair value exchange taking place. In exchange for my data, my experiences become better, if not in the present but in the future. And as long as this is a trusted relationship, and people understand the value exchange then people are open to sharing more and more information. I am happy that Facebook, Amazon, and other platforms are aware that I am a male between 35–45 with specific interests in travel and pets, but no interest in hockey or skateboarding. Or that based on certain movies I watch, Netflix makes recommendation on what other types of content I would be interested in to keep me more entertained. And maybe that data is used elsewhere, with my permission to make experiences better on other platforms. The battle for data in an increasingly competitive consumer landscape is to increase engagement using personalized insight they have gleaned about their customers to ultimately create better experiences. I am certain many people do not want to go back to the anonymous web where all of us are treated largely the same and there was no differentiation in the experience.

Everyone agrees that a regression to anonymity is neither plausible nor tenable.

Privacy, Security, Trust and Sustainability

This is the future, and it’s critical that business and government develop a stance and embrace a different way of thinking. As AI becomes more pervasive, the black box of algorithms will force businesses to develop systems and policies that guard against potential harms. Cavoukian understands it’s an uphill battle:

When I have these conversations with CEOs, at first they think I’m anti-business and all I want to do is shut them down. It’s the farthest thing from my mind. You have to have businesses operating in a way that will attract customers AND keep their business models operating. That’s the view I think you should take. It has to be a win/win for all parties.

Do you have a data map? I always start there. You need to map how the data flows throughout your organization and determine where you need additional consent. Follow the flow within your organization. This will identify any gaps that may need fixing.
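The data-mapping exercise Cavoukian describes can be sketched as a simple flow inventory that flags flows lacking a consented purpose (hypothetical systems and purposes, purely illustrative):

```python
# Hypothetical data-flow map: record where each data type travels and under
# what purpose, then flag the flows that lack a recorded consent basis.
data_flows = [
    {"data": "email", "from": "signup_form", "to": "crm", "purpose": "account"},
    {"data": "email", "from": "crm", "to": "ad_platform", "purpose": "marketing"},
    {"data": "purchases", "from": "orders_db", "to": "analytics", "purpose": "profiling"},
]
consented_purposes = {"account"}  # what users actually agreed to

gaps = [f for f in data_flows if f["purpose"] not in consented_purposes]
for flow in gaps:
    print(f"consent gap: {flow['data']} -> {flow['to']} ({flow['purpose']})")
```

Run against a real inventory, the gaps list is exactly the output Cavoukian describes: the places in the organization where additional consent is needed.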

TRUST: it takes years to build… and days to lose…

Perhaps this is the view that companies should take. Ann Cavoukian maintains that those who have implemented PbD say it builds enormous trust. When you have a trusted business relationship with your customers, they’re happy to give you additional consent down the road. They just don’t want the information flowing out to third parties unknown.

I tell companies if you do PbD, shout it from the rooftops. Lead with it. Tell your customers the lengths you’re going to to protect their privacy, and the respect you have for them. They will thank you in so many ways. You’ll gain their continued loyalty, and you’ll attract new opportunity.
To companies who see privacy as a negative, claiming it stifles creativity and innovation, I say: ‘It’s the exact opposite. Privacy breeds innovation and prosperity, and it will give you a competitive advantage. It allows you to start with a base of trust, which steadily enhances the growth of your customers and their loyalty. Make it a win/win proposition!’

Ann Cavoukian has recently launched Global Privacy and Security by Design: GPSbyDesign.org, an International Council on Global Privacy and Security. For more information on Ann Cavoukian, please go to Privacy by Design Centre of Excellence, at Ryerson University.

This article first appeared on Forbes: Part I and Part II.

Hessie Jones is the Founder of ArCompany advocating AI readiness, education and the ethical distribution of AI. She is also Cofounder of Salsa AI, distributing AI to the masses. As a seasoned digital strategist, author, tech geek and data junkie, she has spent the last 18 years on the internet at Yahoo!, Aegis Media, CIBC, and Citi, as well as tech startups including Cerebri, OverlayTV and Jugnoo. Hessie saw things change rapidly when search and social started to change the game for advertising and decided to figure out the way new market dynamics would change corporate environments forever: in process, in culture and in mindset. She launched her own business, ArCompany in social intelligence, and now, AI readiness. Through the weekly think tank discussions her team curated, she surfaced the generational divide in this changing technology landscape across a multitude of topics. Hessie is also a regular contributor to Towards Data Science on Medium and Cognitive World publications.

This article solely represents my views and in no way reflects those of DXJournal. Please feel free to contact me h.jones@arcompany.co

Share this:

Technology

Philips is all in when it comes to the IoT

Lights may be a familiar sight whether you’re at home or anything to say about it, lighting will soon do a lot more than just illuminate.

Published

on

Share this:

Sponsored by Cognizant

Stop reading for a moment and look up. Did you see a light on the ceiling, or maybe one over on the wall nearby? Of course you did. Lighting is everywhere we go — and not just indoors. It comes in all shapes, sizes, colours, and brightnesses, but if Philips Electronics has anything to say about it, lighting will soon do a lot more than just illuminate.

The Dutch electronics multinational — a leading global supplier of lighting technology — has announced that the IoT isn’t just an important technology, it’s going to be central to the company’s overall strategy in the future. That IoT-based strategy can be most clearly seen within Philips’ lighting division, where it even has its own name: Interact.

Related: Stepping into Digital with IOT – 14 Cases

“You can imagine all these devices — lamps, drivers, luminaries, sensors — being connected, sending information through software,” Philips Lighting CEO, Eric Rondolat, told attendees at the Light+Building exhibition in Frankfurt, earlier this year. “And all this software sending this information back to a cloud-based platform, an IoT platform that is called Interact,” he said.

The Power of Occupancy Sensing

While some of data gathered by Philips lighting will be related to energy consumption and other operational parameters for the lights themselves, there’s a lot more smart lighting can do.

One big area that Philips and the rest of the lighting industry is eyeing is occupancy sensing. The global occupancy sensor market was worth USD$1.7 billion in 2017, according to Market Prognosis, and is projected to reach USD$4.8 billion by 2023. Those numbers are being driven largely by a North American push to increase energy efficiency. Being able to know when someone is in a space that requires lighting, or HVAC, can lead to significant savings. But that same data has other value too, and Philips plans to leverage machine learning to unearth hidden insights trapped in that data.

Download] Stepping into Digital with IOT – 14 Cases

If these moves weren’t proof enough that Philips Lighting is betting big on the IoT, consider this: The company just hired former Cisco senior vice president, IoT sales, Chris White, to lead its Americas division. Then there’s the name. Philips Lighting announced in March that the company would change the company name to as Signify — a name it hopes will shine a light on the company’s beyond-lighting IoT focus.

DX Journal covers the impact of digital transformation (DX) initiatives worldwide across multiple industries.


Technology

Navigating the AI Hype


Welcome to Navigating the AI Hype, a timely article that curates events in AI to chart its journey as this unprecedented phenomenon makes its way into our lives: the Good, the Bad and the Ugly. We will acknowledge successes in AI, as well as areas that still require further progress. We will also highlight where human conscience will need to dictate policy and regulation, since ethical standards must be built in lockstep with technology as it evolves. Finally, we will point to references and resources for anyone wanting to dive deeper into Artificial Intelligence. Enjoy!

The Good:

Uber applies for permission to test self-driving cars again

“We have taken a measured, phased approach to returning to on-road testing, starting first with manual driving in Pittsburgh. We committed to deliver this safety report before returning to on-road testing in self-driving mode, and will go back on the road only when we’ve implemented improved processes.”

Read more.

 

LinkedIn founder Reid Hoffman makes record-breaking gift to U of T’s Faculty of Information for chair in AI

“Artificial intelligence will revolutionize how we live, creating both incredible opportunity for benefits, as well as some disruption that will be important to manage,”

Read more.

MIT is investing $1 billion in an AI college

“Interdisciplinary learning should mean better, saner Artificial Intelligence”

Read more.

The Bad:

The future of border control agents might come in the form of an AI lie detector

A six-month trial will take place at four border crossing points in Hungary, Greece and Latvia.

 

Read more.

What to know about WhatsApp in Brazil ahead of Sunday’s election

“I don’t know where they found my phone number.”

Read more.

Google wants to improve your smart home with iRobot’s room maps

The idea of Google using data about users’ homes will be justifiably unsettling to some. Although Google doesn’t have as bad a reputation for data leaks and breaches as Facebook, it has still had a number of serious lapses.

Read more.

The Ugly:

Australia’s data breach numbers steady at 245 in three months

“Everyone who handles personal information in their work needs to understand how data breaches can occur so we can work together to prevent them”

Read more.

Radisson Hotel Group suffers data breach, customer info leaked

Radisson Hotel Group loyalty scheme members are affected and may have had their personal information stolen.

Read more.

China has been ‘hijacking the vital internet backbone of western countries’

“Using these numerous PoPs, [China Telecom] has already relatively seamlessly hijacked the domestic US and cross-US traffic and redirected it to China over days, weeks, and months”

Read more.

AI courses and resources

Machine Learning AI Certification by Stanford University (Coursera)

Artificial Intelligence Certification: Learn How To Build An AI (Udemy)

The week in breaches – Newsletter 

 


Technology

Canadians up in arms: Privacy without consent and the dangerous precedent


It’s the news that has taken Canada by storm of late, on Twitter, in the headlines, and in today’s parliamentary debate. Statistics Canada, the federal agency that issues statistical research on the state of Canada, its population, economy and culture, unwittingly walked into the spotlight when Global News revealed the agency had asked TransUnion, a credit bureau that amasses credit information for many financial institutions, to provide financial transactions and credit histories on approximately 500,000 Canadians, without their prior individual consent. The Liberal government has endorsed this move.

During the parliamentary debate, Conservative opposition MP Gérard Deltell declared,

If the state has no business in people’s bedrooms, the state has no business in their bank accounts either. There is no place for this kind of intrusion in Canada. Why are the Liberals defending the indefensible?

The data being demanded, according to Global News, consists of private information including name, address, date of birth, SIN, account balances, debit and credit transactions, mortgage payments, e-transfers, overdue amounts and biggest debts, spanning 15 years’ worth of data. Equifax, the other credit reporting agency that supports financial institutions in Canada, has not been asked to provide data.

François-Philippe Champagne, Minister of Infrastructure and Communities, was vague in his response. While he affirmed Statistics Canada’s upstanding practices in anonymizing and protecting personal data, he also admitted proper consent was not obtained:

StatsCan is going above the law and is asking banks to notify clients of this use. Stats Canada is on their side… We know data is a good place to start to make policy decisions in this country, and we will treat the information in accordance with the law. They can trust Statistics Canada to do the right thing.

Statistics Canada and the Liberal government failed to disclose the explicit use of this information. However,

By law, the agency can ask for any information it wants from any source.

I posed this question to former three-term Privacy Commissioner of Ontario Ann Cavoukian, who currently leads the Privacy by Design Centre of Excellence at Ryerson University in Toronto:

Ann Cavoukian Twitter


What’s troubling is that while the opposition cried foul, hurling accusations of authoritarianism and surveillance, the latter outcome is not implausible.

According to the Personal Information Protection and Electronic Documents Act (PIPEDA) Guidelines to Obtain Meaningful Consent, these are the main exceptions to the consent requirement:

  • if the collection and use are clearly in the interests of the individual and consent cannot be obtained in a timely manner;
  • if the collection and use with consent would compromise the availability or the accuracy of the information and the collection is reasonable for purposes related to investigating a breach of an agreement or a contravention of the laws of Canada or a province;
  • if disclosure is required to comply with a subpoena, warrant, court order, or rules of the court relating to the production of records;
  • if the disclosure is made to another organization and is reasonable for the purposes of investigating a breach of an agreement or a contravention of the laws of Canada or a province that has been, is being or is about to be committed and it is reasonable to expect that disclosure with the knowledge or consent of the individual would compromise the investigation;
  • if the disclosure is made to another organization and is reasonable for the purposes of detecting or suppressing fraud or of preventing fraud that is likely to be committed and it is reasonable to expect that the disclosure with the knowledge or consent of the individual would compromise the ability to prevent, detect or suppress the fraud;
  • if required by law.

For Statistics Canada, its broad legal reach is enough to circumvent explicit disclosure of data use and permission. This alone sets a dangerous precedent that conflicts with current European GDPR mandates, which will be referenced in the updated PIPEDA at a date yet to be determined.

This privilege will not, however, make Statistics Canada immune to data breaches; in fact, it will make the agency a more attractive target for hackers. According to the Breach Level Index, more than 13 billion records have been lost or stolen since 2013, an average of more than 6.3 million per day. The increasing centralization of data makes breaches more likely. Statistics Canada, which collects tax filings, census, location, household, demographic, usage, health and economic data, is increasingly amassing that data online. According to National Newswatch, dwindling survey completions and costly census programs have necessitated a move to compile information from other organizations, such as financial institutions, which offers lower costs and better data quality.

If this is the catalyst to aggregate compiled information with the goal of record linking, it will raise significant privacy alarms in the process. For Statistics Canada, which has received significant government support because of the critical information it lends to policy decisions, there are looming dangers in being the purveyor of every Canadian’s private information, beyond data breach vulnerabilities.

Anonymized Data Doesn’t Mean Anonymous Forever

I spoke to Alejandro Saucedo, Chief Scientist at The Institute for Ethical AI & Machine Learning, a UK-based research centre that develops industry standards and frameworks for responsible machine learning development, and asked him to weigh in on this issue:

Canadians are rightly worried. It concerns me that StatsCanada is suggesting that just discarding names and addresses would be enough to anonymize the data. Not to point out the obvious, but data re-identification is actually a big problem. There have been countless cases where anonymized datasets have been reverse engineered, let alone datasets as rich as this one. 

Re-identification reverses anonymization, using alternative data sources to link information back to identity. Using publicly available data, easily found in today’s big data environment, coupled with the speed of advanced algorithms, Saucedo points to successful attempts at re-identification: reverse engineering credit card data, or when this engineer was able to recreate a complete NYC taxi data dump of 173 million trips and fare logs by reversing the hashing function that anonymized the medallion and taxi numbers.
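The taxi case illustrates a general failure mode: when the space of possible identifiers is small and its format is known, hashing offers no real anonymity, because an attacker can simply enumerate every possible input. A minimal sketch in Python (the medallion format here is simplified for illustration, and the hash choice is illustrative of how such a dump can be reversed):

```python
import hashlib
import string

def anonymize(medallion: str) -> str:
    """Hypothetical 'anonymization': replace the ID with its MD5 hash."""
    return hashlib.md5(medallion.encode()).hexdigest()

# Medallion numbers follow a small, known format (here simplified to
# digit + letter + two digits, e.g. "5D22"), so the entire input space
# is only 10 * 26 * 100 = 26,000 values. An attacker enumerates them
# all and builds a reverse lookup table.
lookup = {
    anonymize(f"{d}{l}{n:02d}"): f"{d}{l}{n:02d}"
    for d in string.digits
    for l in string.ascii_uppercase
    for n in range(100)
}

# Any "anonymized" hash appearing in the data dump is now trivially
# reversible with a single dictionary lookup.
hashed = anonymize("5D22")
print(lookup[hashed])  # recovers the original medallion: 5D22
```

Stronger defenses replace the identifier with a random surrogate key, or use a keyed hash (HMAC) with a secret key, so that enumeration alone is not enough to reverse the mapping.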

Ethical hacks are not new to banking, or to any company that collects and manages significant data volumes. These are hacks that corporations deliberately run internally against their own infrastructure to ensure vulnerabilities are mitigated, both on-premise and online. The practice keeps the organization up to par with the latest methods for encryption and security, as well as current breach mechanisms. As Saucedo points out:

Even if StatsCanada didn’t get access to people’s names (e.g. requested the data previously aggregated), it concerns me that there is no mention of more advanced methods for anonymization. Differential Privacy, for example, is a technique that adds statistical noise to the entire dataset, protecting users whilst still allowing for high-level analysis. Some tech companies have been exploring different techniques to improve privacy – governments should have a much more active role in this space.

Both Apple and Uber are incorporating Differential Privacy. The goal is to mine and analyze usage patterns without compromising individual privacy. Since behavioural patterns, not individuals, are what matter to the analysis, “mathematical noise” is added to conceal identity. This becomes more important as more data is collected to establish those patterns. The methodology is not perfect, but Apple and Uber are making momentous strides in ensuring individual privacy is the backbone of their data collection practices.
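The core construction behind differential privacy is simple to sketch. For a counting query (how many users did X?), one person’s presence changes the answer by at most 1, so adding Laplace noise with scale 1/ε masks any individual while large aggregates stay accurate. A minimal illustration in Python using NumPy, with illustrative numbers (this sketches the general Laplace mechanism, not Apple’s or Uber’s actual systems):

```python
import numpy as np

def private_count(true_count: int, epsilon: float, rng: np.random.Generator) -> float:
    """Return a differentially private count.

    A counting query has sensitivity 1 (adding or removing one person
    changes the result by at most 1), so Laplace noise with
    scale = sensitivity / epsilon gives epsilon-differential privacy.
    """
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(42)

# A single noisy answer protects the individual...
print(private_count(10_000, epsilon=0.5, rng=rng))  # typically within a few units of 10,000

# ...while aggregate analysis remains accurate: the noise has mean 0,
# so over many queries the signal dominates.
answers = [private_count(10_000, epsilon=0.5, rng=rng) for _ in range(1_000)]
print(sum(answers) / len(answers))  # close to the true count of 10,000
```

The smaller ε is, the stronger the privacy guarantee and the noisier each individual answer; choosing ε is as much a policy decision as an engineering one.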

Legislation Needs to be Synchronous with Technology

GDPR is nascent, and its laws will evolve as technology surfaces other invasive harms. Government is lagging behind technology. Any legislation that does not enforce fines for significant breaches, such as those at Google Plus, Facebook or Equifax, will ensure that business and government maintain the status quo.

Challenges in communicating the new order of data ownership will continue to be an uphill battle for the foreseeable future. Systems, standards and significant investment in transforming policy and structure will take time. For Statistics Canada and the Canadian government, creating frameworks that give individuals unequivocal control of their data requires education, training and widespread awareness. Saucedo concedes,

 A lot of great thinkers are pushing for this, but for this to work we need the legal and technological infrastructure to support it. Given the conflict of interest that the private sector often may face in this area, this is something that the public sector will have to push. I do have to give huge credit to the European Union for taking the first step with GDPR – although far from perfect, it is still a step in the right direction for privacy protection.

(Update) As of Friday, November 1, 2018, Petition E-192 (Privacy and Data Protection) was put forward to the House of Commons calling for the revocation of this initiative. 21,000 signatures have been collected to date. Canadians interested in adding their names to this petition can do so.
Petition to the House of Commons
Whereas:
  • The government plans to allow Statistics Canada to gather transactional level personal banking information of 500,000 Canadians without their knowledge or consent;
  • Canadians’ personal financial and banking information belongs to them, not to the government;
  • Canadians have a right to privacy and to know and consent to when their financial and banking information is being accessed and for what purpose;
  • Media reports highlight that this banking information is being collected for the purposes of developing “a new institutional personal information bank”; and
  • This is a gross intrusion into Canadians’ personal and private lives.
We, the undersigned, Citizens and Residents of Canada, call upon the Government of Canada to immediately cancel this initiative, which amounts to a gross invasion of privacy, and ensure such requests for personal data never happen again.

This post first appeared on Forbes.

Hessie Jones is the Founder of ArCompany advocating AI readiness, education and the ethical distribution of AI. She is also Cofounder of Salsa AI, distributing AI to the masses. As a seasoned digital strategist, author, tech geek and data junkie, she has spent the last 18 years on the internet at Yahoo!, Aegis Media, CIBC, and Citi, as well as tech startups including Cerebri, OverlayTV and Jugnoo. Hessie saw things change rapidly when search and social started to change the game for advertising and decided to figure out the way new market dynamics would change corporate environments forever: in process, in culture and in mindset. She launched her own business, ArCompany in social intelligence, and now, AI readiness. Through the weekly think tank discussions her team curated, she surfaced the generational divide in this changing technology landscape across a multitude of topics. Hessie is also a regular contributor to Towards Data Science on Medium and Cognitive World publications.

This article solely represents my views and in no way reflects those of DXJournal. Please feel free to contact me at h.jones@arcompany.co.

