It’s the news that has taken Canada by storm of late, on Twitter, in the headlines, and in today’s parliamentary debate. Statistics Canada, the federal agency that issues statistical research on the state of Canada, its population, economy and culture, unwittingly walked into the spotlight when Global News revealed the agency had asked TransUnion, a credit bureau that amasses credit information for many financial institutions, to provide financial transactions and credit histories on approximately 500,000 Canadians without their prior consent. The Liberal government has endorsed this move.
During the parliamentary debate, Conservative MP Gérard Deltell declared:
If the state has no business in people’s bedrooms, the state has no business in their bank accounts either. There is no place for this kind of intrusion in Canada. Why are the Liberals defending the indefensible?
The data being demanded, according to Global News, consists of private information including name, address, date of birth, SIN, account balances, debit and credit transactions, mortgage payments, e-transfers, overdue amounts and biggest debts, spanning 15 years of data. Equifax, the other credit reporting agency that supports financial institutions in Canada, has not been asked to provide data.
François-Philippe Champagne, Minister of Infrastructure and Communities, was vague in his response. While he affirmed Statistics Canada’s upstanding practices in anonymizing and protecting personal data, he also admitted proper consent was not obtained:
StatsCan is going above the law and is asking banks to notify clients of this use. Stats Canada is on their side… We know data is a good place to start to make policy decisions in this country, and we will treat the information in accordance with the law. They can trust Statistics Canada to do the right thing.
Statistics Canada and the Liberal government failed to disclose the explicit use of this information. However, by law, the agency can ask for any information it wants from any source.
I posed this question to former three-term Privacy Commissioner Ann Cavoukian, who currently leads the Privacy by Design practice at Ryerson University in Toronto.
What’s troubling is that while the opposition cried foul, hurling accusations of authoritarianism and surveillance, the latter outcome is not implausible. Under Canadian privacy law, personal information can be collected, used or disclosed without an individual’s consent in several circumstances:
- if the collection and use are clearly in the interests of the individual and consent cannot be obtained in a timely manner;
- if the collection and use with consent would compromise the availability or the accuracy of the information and the collection is reasonable for purposes related to investigating a breach of an agreement or a contravention of the laws of Canada or a province;
- if disclosure is required to comply with a subpoena, warrant, court order, or rules of the court relating to the production of records;
- if the disclosure is made to another organization and is reasonable for the purposes of investigating a breach of an agreement or a contravention of the laws of Canada or a province that has been, is being or is about to be committed and it is reasonable to expect that disclosure with the knowledge or consent of the individual would compromise the investigation;
- if the disclosure is made to another organization and is reasonable for the purposes of detecting or suppressing fraud or of preventing fraud that is likely to be committed and it is reasonable to expect that the disclosure with the knowledge or consent of the individual would compromise the ability to prevent, detect or suppress the fraud;
- if required by law.
For Statistics Canada, its broad legal reach is enough to let the agency circumvent explicit disclosure of data use and permission. This alone sets a dangerous precedent that sits uneasily with current European GDPR mandates, which are expected to inform an updated PIPEDA at a date yet to be determined.
This privilege will not, however, make Statistics Canada immune to data breaches; in fact, it will make the agency a more attractive target for hackers. According to the Breach Level Index, more than 13 billion records have been lost or stolen since 2013, an average of more than 6.3 million per day. The increasing centralization of data makes breaches more likely. Statistics Canada, which has been collecting tax filings, census data, and location, household, demographic, usage, health and economic data, is increasingly amassing its data online. According to National Newswatch, dwindling survey completions and costly census programs have necessitated a move to compile information from other organizations such as financial institutions, which offer more reasonable costs and better data quality.
If this is the catalyst to aggregate compiled information with the goal of record linking, it will raise significant privacy alarms in the process. For Statistics Canada, which has received significant government support because of the critical information it lends to policy decisions, the dangers of becoming the purveyor of every Canadian’s private information loom beyond data breach vulnerabilities.
Anonymized Data Doesn’t Mean Anonymous Forever
I spoke to Alejandro Saucedo, Chief Scientist at The Institute for Ethical AI & Machine Learning, a UK-based research centre that develops industry standards and frameworks for responsible machine learning development, and asked him to weigh in on the issue:
Canadians are rightly worried. It concerns me that StatsCanada is suggesting that just discarding names and addresses would be enough to anonymize the data. Not to point out the obvious, but data re-identification is actually a big problem. There have been countless cases where anonymized datasets have been reverse engineered, let alone datasets as rich as this one.
Re-identification reverses the anonymized state of data, using alternative data sources to link information back to identity. Using publicly available data, easily found in today’s big-data environment, coupled with the speed of advanced algorithms, Saucedo points to successful attempts at re-identification: reverse engineering credit card data, or the engineer who was able to de-anonymize a complete dump of 173 million NYC taxi trips and fare logs by brute-forcing the hashing function that anonymized the medallion and taxi numbers.
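The taxi case is worth pausing on, because it shows how little effort re-identification can take when the space of possible identifiers is small. The sketch below is illustrative, not the engineer's actual code; the medallion format and values are assumptions for the example. Hashing every possible medallion and matching against the "anonymized" release takes a fraction of a second:

```python
import hashlib

# Hypothetical sketch: why hashing a small identifier space fails as anonymization.
# NYC medallion numbers follow a short, known format (here assumed: digit, letter,
# two digits, e.g. "5X55"), so every possible value can be pre-hashed and the
# resulting lookup table reverses the "anonymized" hashes directly.

def build_rainbow_table():
    table = {}
    for d1 in "0123456789":
        for letter in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
            for d2 in "0123456789":
                for d3 in "0123456789":
                    medallion = f"{d1}{letter}{d2}{d3}"
                    table[hashlib.md5(medallion.encode()).hexdigest()] = medallion
    return table

table = build_rainbow_table()  # only 26,000 entries: built in milliseconds

# A value taken from the supposedly anonymized dataset
anonymized = hashlib.md5(b"7F38").hexdigest()
print(table[anonymized])  # recovers "7F38"
```

The weakness is not the hash function itself but the tiny input space: when only tens of thousands of plaintexts are possible, any deterministic hash amounts to a reversible code.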
Ethical hacks are not new to banking or to any company that collects and manages significant data volumes. These are intentional attacks that corporations run against their own infrastructure to find and mitigate vulnerabilities, both on-premise and online. The practice ensures an organization keeps pace with the latest encryption and security methods as well as current breach techniques. As Saucedo points out:
Even if StatsCanada didn’t get access to people’s names (e.g. requested the data previously aggregated), it concerns me that there is no mention of more advanced methods for anonymization. Differential Privacy, for example, is a technique that adds statistical noise to the entire dataset, protecting users whilst still allowing for high-level analysis. Some tech companies have been exploring different techniques to improve privacy – governments should have a much more active role in this space.
Both Apple and Uber are incorporating differential privacy. The goal is to mine and analyze usage patterns without compromising individual privacy. Since the behavioural patterns matter more to the analysis than the individuals behind them, mathematical noise is added to conceal identity, which becomes increasingly important as more data is collected to establish those patterns. It is not a perfect methodology, but Apple and Uber are making momentous strides in ensuring individual privacy is the backbone of their data collection practices.
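To make the "mathematical noise" concrete, here is a minimal sketch of the Laplace mechanism, one of the basic differential-privacy techniques Saucedo alludes to (this is a textbook illustration, not Apple's or Uber's actual implementation). Noise drawn from a Laplace distribution is added to an aggregate count, so that adding or removing any one person barely changes the released number:

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling of a Laplace(0, scale) variate
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def private_count(true_count, epsilon):
    # A counting query has sensitivity 1: one person's presence or absence
    # changes the count by at most 1, so the noise scale is 1/epsilon.
    return true_count + laplace_noise(1.0 / epsilon)

# A smaller epsilon means more noise and stronger privacy.
print(private_count(500000, epsilon=0.1))
```

Individual records are protected, yet the noisy aggregate stays close enough to the truth for high-level analysis, which is exactly the trade-off Saucedo describes.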
Legislation Needs to be Synchronous with Technology
GDPR is nascent, and its rules will evolve as technology surfaces other invasive harms. Government is lagging behind technology. Any legislation that does not enforce fines for significant breaches, as in the cases of Google Plus, Facebook or Equifax, will only ensure that business and government maintain the status quo.
Communicating the new order of data ownership will continue to be an uphill battle for the foreseeable future. Systems, standards and significant investment in transforming policy and structure will take time. For Statistics Canada and the Canadian government, creating frameworks that give individuals unequivocal control of their data requires education, training and widespread awareness. Saucedo concedes:
A lot of great thinkers are pushing for this, but for this to work we need the legal and technological infrastructure to support it. Given the conflict of interest that the private sector often may face in this area, this is something that the public sector will have to push. I do have to give huge credit to the European Union for taking the first step with GDPR – although far from perfect, it is still a step in the right direction for privacy protection.
Petition to the House of Commons

Whereas:
- The government plans to allow Statistics Canada to gather transactional level personal banking information of 500,000 Canadians without their knowledge or consent;
- Canadians’ personal financial and banking information belongs to them, not to the government;
- Canadians have a right to privacy and to know and consent to when their financial and banking information is being accessed and for what purpose;
- Media reports highlight that this banking information is being collected for the purposes of developing “a new institutional personal information bank”; and
- This is a gross intrusion into Canadians’ personal and private lives.
This post first appeared on Forbes.
Hessie Jones is the Founder of ArCompany advocating AI readiness, education and the ethical distribution of AI. She is also Director for the International Council, Global Privacy and Security by Design. As a seasoned digital strategist, author, tech geek and data junkie, she has spent the last 18 years on the internet at Yahoo!, Aegis Media, CIBC, and Citi, as well as tech startups including Cerebri, OverlayTV and Jugnoo. Hessie saw things change rapidly when search and social started to change the game for advertising and decided to figure out the way new market dynamics would change corporate environments forever: in process, in culture and in mindset. She launched her own business, ArCompany in social intelligence, and now, AI readiness. Through the weekly think tank discussions her team curated, she surfaced the generational divide in this changing technology landscape across a multitude of topics. Hessie is also a regular contributor to Towards Data Science on Medium and Cognitive World publications.
This article solely represents my views and in no way reflects those of DXJournal. Please feel free to contact me firstname.lastname@example.org
Purolator to design tech-driven courier hub in Toronto
A new national technology hub is being designed and built by courier service provider Purolator in Toronto, reflecting growth in the area’s e-commerce sector. The $339 million project will extend to 60 acres.
Purolator Inc.’s new super-hub in Toronto is set to open in 2021 and is expected to triple the capacity of the courier provider’s network. The hub forms part of a wider $1-billion investment the company intends to make in Canada over the next five years. The plans also include upgrading the vehicle fleet to take advantage of more advanced technology. The central hub will help to coordinate Purolator’s 172 operations facilities and 111 shipping centres.
Also included in the scheme are plans to focus on the customer, including improving the online experience by making the main website easier to navigate. There will also be innovations in automation, and the hub will be designed to meet the Toronto Green Standards program, which details Toronto’s sustainable design requirements.
Quoted by Bloomberg, Purolator CEO John Ferguson says that the announcement “is one of the most ambitious in our company’s history and will future-proof our business. Purolator has experienced record growth over the past three years. We picked up and delivered over one quarter of a billion packages in 2018 and we expect our growth trajectory to continue.”
The Purolator hub is just one of several innovations making use of Toronto’s growing technology infrastructure. E-commerce company Shopify plans to increase its operations and to employ more staff over the next three years in the city.
Toronto’s cultural and economic diversity has fueled the city’s rapid growth in a number of high-tech areas, particularly for startups and developments in areas like artificial intelligence. This reflects the Canadian government’s plans to build and keep successful startup ecosystems, especially in the Toronto area.
Connecting with ‘US:’ The necessity and value of the Internet of Things
Done right, the Internet of Things is the Internet of Us, connecting the physical and digital in a human-centered way that improves the world intelligently.
By Frank Antonysamy, Vice President of Cognizant’s Global IoT and Engineering Services
U.S. food safety has been a concern since the days of Upton Sinclair’s classic novel about the stockyards and meatpacking industries in Chicago. Public reaction to The Jungle compelled Teddy Roosevelt and the U.S. Congress to pass food safety laws and establish the U.S. Food and Drug Administration in 1906.
More than a century later, threats clearly remain to the safety of domestic and global food supplies and the purity of water sources. Recently, we’ve learned about significant, ongoing, even deadly threats to our food and water. Food recalls have ranged from romaine lettuce to beef in the last 12 months; the tragedy in Flint, Mich., reminds us that poisonous chemicals still make their way into our water, as well. Faulty equipment or poorly executed processes often are to blame.
Solving Safety Challenges with Internet of Things
It doesn’t have to be this way. As the Internet of Things (IoT) begins to permeate our global infrastructure, sensor-equipped devices will soon outnumber the global population. There’s no reason to wait until communities face a food- or water-borne threat before fixing malfunctioning equipment or improving safety procedures.
Today we can automatically and rapidly glean information from IoT-enabled devices – about temperatures in IoT-equipped food storage and transportation equipment, for example, or the chemicals sensed by the pumps that filter and move our water, or the monitoring capabilities of the medical devices we increasingly rely on in hospitals and the home. With such intelligence, communities and businesses can address problems before they become a threat.
Increasing Food Safety on a Massive Scale
Recently, I had a conversation with Internet of Things maven Stacey Higginbotham on one of her Stacey on IoT podcasts. We discussed Cognizant’s work with Internet of Things adoption, and the ways in which these solutions can help businesses and the people they serve.
We talked about how one of the world’s largest sellers of fresh and frozen foods uses IoT-enabled refrigerators and freezers to reduce food spoilage across its global supply chain. Such spoilage not only results in financial losses due to food waste, but can also present risks to consumers. Although the business had already implemented alarms on the refrigeration systems in its distribution centers to signal malfunctions, it could take 36 hours for the maintenance operations team to respond – clearly too long when it comes to food safety and waste. There was also no mechanism to proactively monitor the refrigeration units and ensure timely service calls.
Our solution minimizes energy consumption and seeks to ensure consumer safety. It ties together sensors, cloud-based monitoring, algorithms that trigger alerts and warnings, reminders in handheld applications and a direct link of performance data to individual employees to encourage compliance with the company’s internal food safety protocols. The system covers hundreds of freezers, thousands of deliveries, 600 million data points and millions of pounds of food.
The results have been impressive. After rolling out the system to 100 of its stores, the business reduced priority response times from 36 hours to four hours, and decreased food loss by 10% in the first year by predicting refrigeration failures. The company aims to expand the system to 5,300 stores, with the potential to reduce operating costs by up to $40 million while ensuring the safe storage of food. (Hear more about this solution in the three-minute podcast recording below.)
From Providing Pumps to Offering Insights
These same principles guided our solution for a global manufacturer of high-technology industrial water pumps used in a range of applications, from providing drinking water for cities and villages, to processing waste water, to clearing and filtering the huge volumes of water moved during deep-sea drilling.
With the movement of all that water through its sensor-equipped and self-monitoring pumps, the manufacturer had access to a flood of information on everything from performance-based data on pressure and volume to the chemical composition of the water. By collecting and analyzing this information, the company could leverage and monetize its insights into not just equipment performance but also the safety of the water it delivers. If a certain chemical spikes in the water supply, for example, alerts are triggered, and municipalities can investigate. If water pressure or volume falls outside set parameters, precautions can be taken, including automatic alerts and even preemptive shutdowns.
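The alerting logic described above can be sketched very simply. The code below is a hypothetical illustration, not the manufacturer's actual system; the parameter names, pump IDs and thresholds are assumptions chosen for the example:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One telemetry sample from a sensor-equipped pump (illustrative fields)."""
    pump_id: str
    pressure_psi: float
    chlorine_ppm: float

# Illustrative safe operating envelopes
PRESSURE_RANGE = (40.0, 80.0)  # psi
CHLORINE_MAX = 4.0             # ppm

def evaluate(reading: Reading) -> list[str]:
    """Return alert messages for any reading outside its safe envelope."""
    alerts = []
    if reading.chlorine_ppm > CHLORINE_MAX:
        alerts.append(f"{reading.pump_id}: chemical spike ({reading.chlorine_ppm} ppm)")
    lo, hi = PRESSURE_RANGE
    if not lo <= reading.pressure_psi <= hi:
        alerts.append(f"{reading.pump_id}: pressure out of range ({reading.pressure_psi} psi)")
    return alerts

print(evaluate(Reading("pump-17", pressure_psi=92.0, chlorine_ppm=5.2)))
```

In a production system the same check would run continuously against streamed telemetry, feeding the automatic alerts and preemptive shutdowns described above, but the core idea is just a comparison against a safe envelope.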
Buyers of the pumps want this information. So, while using this data to improve the performance of its products, the business can also share insights with its clients on a subscription basis, opening up new revenue streams. The business is no longer just providing world-class high-tech pumps; it’s offering customers critical insights from the pumps it sells, as a value-added service. (Hear more about this solution in the three-minute podcast recording below.)
Connecting Things; Connecting to Our Needs
What links these two examples is their prioritization of real human needs as part of the solution. Clean and safe food and water are vital to human health, and companies that help provide them add value.
For many years, large industrial enterprises have lived in two separate worlds: the world of all their physical assets (factories, equipment, buildings, people) and the world of their digital assets (software, workflows, algorithms, reports). Through sensor technology, network capability, security advances and IoT platforms, these two worlds are now becoming seamlessly integrated like never before.
Today, the shorthand for this ongoing integration is the Internet of Things. In reality, though, it’s the Internet of Us. Technology offers us a path to connect our physical world with a digital one, in which we occupy a new space and a new future: a place where the physical and digital come together, enabling businesses to transform their operational and business models, in a scalable way, through intelligence. (Hear more on the Internet of Us in the three-minute podcast recording below.)
Cognizant (Nasdaq: CTSH) is dedicated to helping the world’s leading companies build stronger businesses — helping them go from doing digital to being digital.
AI technology from IBM detects breast cancer risk before it happens
IBM has taken a step forward in disease prevention by designing an artificial intelligence technology that can predict the risk of breast cancer developing up to one year before the first signs of cancer appear.
With this new step in medical diagnosis, IBM has developed an artificial intelligence model capable of predicting malignant breast cancer within a year with an 87 percent accuracy rate, when the output from the machine is compared with assessments by expert radiologists. In addition, the technology could correctly predict 77 percent of non-cancerous cases.
The prediction method uses both mammogram images and medical records to make the assessment, based on a review of the medical evidence. This is the first application of AI to draw upon both images and data to make a prediction relating to breast cancer.
At the heart of the deep neural network technology is an algorithm, which was trained by IBM technologists along with medical professionals from Israel’s largest healthcare organizations. The training of the artificial intelligence took place using anonymized mammography images which were linked to biomarkers (like patient reproductive history) together with clinical data. The training data-set consisted of 52,936 images from 13,234 women who underwent at least one mammogram between 2013 and 2017.
The aim of the medtech is not to replace the physician, but to act as a ‘second pair of eyes’, providing a backup in the event that something has been missed through conventional patient assessment. This could prove especially useful in areas with staff shortages where a second medical professional is not available to provide a second assessment.
An assessment of the technology has been published in the journal Radiology. The research paper is titled “Predicting Breast Cancer by Applying Deep Learning to Linked Health Records and Mammograms.”
In related news, IBM is applying artificial intelligence to catch Type 1 diabetes much earlier. IBM’s other health technology project could help identify patients at risk and help chart a course for tracking the condition. The predictive tool is a joint project between IBM and JDRF (formerly known as the Juvenile Diabetes Research Foundation).