Data privacy as a human right must be recognized by privacy and AI bill, say advocates

Proposed legislation intended to strengthen consumer privacy protections and establish accountability frameworks for artificial intelligence (AI) requires an overhaul, according to some groups arguing acts within the bill don’t go far enough to protect human rights.
“Overall, the [Artificial Intelligence and Data] Act treats human rights and human rights impacts of artificial intelligence as a secondary issue, and actually fails to establish adequate protections for—and take into account—human rights impacts when assessing and developing AI tools,” said Tim McSorley, national co-ordinator of the International Civil Liberties Monitoring Group. “It fails to do that by failing to mention human rights in the legislation itself whatsoever.”
On April 24, an open letter was sent to Innovation Minister François-Philippe Champagne (Saint-Maurice-Champlain, Que.) calling for the Artificial Intelligence and Data Act (AIDA) to be split from the rest of Bill C-27, the Digital Charter Implementation Act, and given a full public consultation.
The International Civil Liberties Monitoring Group was among the nearly 60 civil society organizations, corporations, and academics who signed the letter arguing the MPs’ study of AIDA was “hasty, confusing and rushed,” which resulted in a “gravely and fundamentally flawed bill that lacks democratic legitimacy.”

Bill C-27 was introduced in June 2022. It completed second reading in the House in April 2023, and is currently under consideration by the House Industry Committee. The bill bundles together three proposed acts: AIDA, as well as the Consumer Privacy Protection Act, and the Personal Information and Data Protection Tribunal Act, which—if passed—would amend the Personal Information Protection and Electronic Documents Act, Canada’s 23-year-old data protection law.
Currently, AIDA includes provisions for the assessment of AI tools being developed before they’re released to the public, but doesn’t include a requirement for assessing the effect on human rights and civil liberties, according to McSorley.

“The legislation focuses on the risks posed to individuals, and particularly individuals as consumers,” he said. “Without mentioning human rights as a factor, it means that, eventually, if there are concerns and problems about the impacts of the [AI] tools and somebody files a complaint based on the fact that it violates human rights, it wouldn’t fall under the purview of the regulations being put in place, because it’s not explicitly mentioned.”
As an example, McSorley pointed to how AI tools, such as facial recognition programs, may be used by law enforcement agencies.
“If those [facial recognition tools] aren’t assessed for specific human rights impacts before they’re released, and used by law enforcement, then it could have a discriminatory impact on racialized communities, on marginalized communities, that already face over-policing,” he said. “We could see people who already face heightened levels of surveillance, or heightened levels of false accusation, face even greater repercussions because of artificial intelligence tools that haven’t been properly assessed for their impacts on those Canadians and impacts on fundamental human rights.”
How well AIDA addresses human rights is not the only concern with the bill, according to McSorley. Another issue is its lack of independence for the AI and data commissioner, who would be responsible for monitoring compliance and intervening if necessary to ensure that AI systems are safe and non-discriminatory.
“Under the proposed rules, [the commissioner] would be a part of [Innovation, Science and Economic Development Canada], whose mandate is the promotion of Canada’s AI sector. Our concerns with how the rules established under AIDA would be enforced would be significantly addressed if the government agreed to make the proposed commissioner an independent officer of parliament, similar to the Privacy Commissioner,” said McSorley in an emailed statement to The Hill Times on April 30.
The open letter calls for Ottawa to initiate an “in-depth and meaningful” consultation process so AIDA can be revised and reintroduced.
The International Civil Liberties Monitoring Group, along with OpenMedia and the Privacy and Access Council of Canada, also released a list in March of recommended “bare minimum” changes to AIDA in the event the federal government moves forward with Bill C-27 without additional public consultation. Among the recommendations is a call for the inclusion of the “fundamental right” to individual privacy, and “human rights” pertaining to privacy and data protection in the bill’s preamble.
Yuan Stevens, an academic associate with the Centre of Genomics and Policy at McGill University in Montreal, told The Hill Times that AIDA fails to address the human rights risks posed by the use of AI systems.
“Right now, there’s only a two-tiered approach in the law which basically says certain uses of AI are fine, and then the other tier says, ‘let’s be careful, this is a high risk,’” said Stevens. “We can contrast that to the [European Union] AI Act, which actually includes several prohibitions on the use of AI because of the impacts of the law in terms of rights.”
Stevens’ research examines data governance, privacy, and human rights. She said AIDA should include a list of banned uses of AI that could be considered harmful, such as facial recognition tools in public spaces, or for predicting crime.

“People who are Black, people of colour, will be over-represented in things like mugshot databases, and are subject to being stopped by the police more often, and therefore will end up in a feedback loop that will impact peoples’ right to life, liberty and … the right to privacy,” she said. “I don’t actually think that a tweaking of a line will address the concerns that I’m personally raising because what will be needed is a law that is premised upon the protection of human rights, and, therefore, there will probably be an entire section of the law that says these certain uses of AI are unacceptable, and therefore they are prohibited.”
Stevens said that she has conducted research into how countries around the world are regulating AI, and found that Canada is one of the few jurisdictions to have proposed binding general laws that would regulate the technology. She said it is important Canada get the law right when regulating AI, and recommended following the lead of the European Union AI Act, which passed on March 13 and is regarded as the world’s first comprehensive legal framework for AI.
Stevens said that act is “by no means perfect,” but more comprehensive than the current form of AIDA.
“There is this pretty important window of time where countries are proposing laws on AI. Many, maybe, won’t ever do that, but for the ones that will and have actually shown that they want to regulate this, they will look to Canada for how to regulate this,” she said. “It seems to me that Canada has an opportunity right now to propose a law on AI that would be robust, sustainable, and future-proof and comprehensive, and it’s not clear to me that, right now, the law in its current form actually meets those criteria.”
‘Human rights’ language considered in Bill C-27
The issue of acknowledging human rights has also formed part of the overall discussions of Bill C-27 during the bill’s examination by the House Industry Committee.

In April 2023, Privacy Commissioner of Canada Philippe Dufresne argued in a letter to Liberal MP Joël Lightbound (Louis-Hébert, Que.), chair of the committee, that the preamble of Bill C-27 “does not go far enough in recognizing the fundamental right to privacy,” and that stakeholders in civil society shared the view that the bill “should go further in recognizing privacy as a fundamental right.”
Among a list of proposed changes, Dufresne argued the preamble should be modified to include references to fundamental rights.
Champagne also sent a letter to Lightbound on Oct. 20, 2023, which included several draft motions, including a motion that the bill’s preamble be amended to qualify the right to privacy as a fundamental right.
On April 8, 2024, Conservative MP Brad Vis (Mission-Matsqui-Fraser Canyon, B.C.), introduced a motion for amendments to Bill C-27, including for the preamble to specify “the protection of the fundamental right to privacy of individuals.” That amendment passed on April 10.
Lyndsay Wasser, a partner at McMillan LLP who acts as a strategic advisor to organizations in the technology industry, told The Hill Times that proponents of including “human rights” in the bill’s language may argue that would be beneficial in situations where the law needs to be interpreted. Proponents may also argue that such an inclusion would help align Canada’s laws with global privacy laws and standards, such as the EU’s General Data Protection Regulation (GDPR), which is “the big one,” according to Wasser.
However, a distinction between the GDPR and Bill C-27 is that the bill tackles private sector privacy regulation, as opposed to the public sector, according to Wasser.
“The GDPR governs the activities of both public sector and private organizations, and when you talk about fundamental rights in Canada, we typically look to the Charter of Rights and Freedoms. When you look at that, you don’t see privacy in there, and there are some rights that do touch upon privacy … but those rights are vis-à-vis the government, not private organizations,” she said.
In PIPEDA currently, there’s a balance between the interests of businesses and the promotion of the digital economy versus employee privacy rights, according to Wasser.

“In my view, the most important factor is to ensure that the legislation properly falls within the constitutional authority of the federal government, and in that regard, it should be legislation that facilitates trade and commerce, and strikes the right balance between the legitimate interest of businesses versus individual privacy rights,” she said.
Wasser said that it should not be lost on the federal government that consumers in Canada want to access new, interesting, innovative and exciting technologies, and therefore protecting the interests of businesses also has benefits for the nation.
“Although I understand the need to protect individual privacy—obviously, that’s critical—I also think that there are benefits both to businesses and consumers to taking a balancing approach that’s not overly restrictive or prescriptive, such that Canadians lose out on the opportunity to be part of these developments worldwide,” she said. “Firstly, it will facilitate businesses that are developing these types of technologies, which is great for our economy, and it’s also, in my view, good for consumer access to see new and interesting technologies that will be available in other parts of the world.”
Teresa Scassa, a law professor at the University of Ottawa who researches privacy, data protection, and AI, told The Hill Times that she considers the most important question pertaining to Bill C-27 to be whether or not human rights should come first.

“The volume of data that is now being collected about people and the many, many different ways in which it can be used, has already had a very significant impact. Some of those are privacy impacts and that people can be tracked and monitored in fairly unprecedented ways. When there are data breaches, that puts people’s personal financial security and other forms of security at risk,” she said. “The use of personal data to track who may have had an abortion in the United States, for example, all of these types of things. There’s just so much data that’s being collected that it can make people quite vulnerable to a variety of different intrusions, manipulations, fraud, you name it.”
Scassa argued that viewing Bill C-27 as a balance between individuals’ right to privacy on one hand, and the need to access and use data on the other, creates a risk of privacy eroding away.
“You’d start to talk in terms of pragmatics and ‘well, maybe these rights aren’t as important,’ and certainly there’s a broad public interest in these uses of data, and, ‘it’s not that harmful’, and ‘the data isn’t that sensitive,’ and it’s like a death by 1,000 cuts, in a lot of ways,” she said.
jcnockaert@hilltimes.com
The Hill Times
Canada Privacy Concerns Statistics:
- About 93 per cent of Canadians expressed some level of concern about the protection of their privacy.
- Just over half of the Canadians surveyed (58 per cent; down from 63 per cent in 2020) feel that the federal government respects their privacy rights. Far fewer believe that businesses respect their privacy rights (39 per cent; down from 45 per cent in 2020).
- Three-quarters each said they have a fair amount or a great deal of trust in banks (76 per cent) and law enforcement (76 per cent). Fewer have this level of trust in telecommunications firms and internet service providers (41 per cent), retailers (36 per cent), and Big Tech (34 per cent). Canadians are least likely to trust social media companies. Just one in 10 trust these businesses to protect their personal data.
- Most Canadians (91 per cent) believe that at least some of what they do online or on their smartphones is being tracked by companies or organizations. In contrast, fewer Canadians (73 per cent) believe at least some of what they do online or on their smartphone is being tracked by the government.
- Three-quarters of Canadians have adjusted privacy settings on a social media account (75 per cent) or refused to provide an organization or business with their personal data due to privacy concerns (74 per cent). One-third (32 per cent) said they have raised a privacy concern with a company or organization.
Source: 2022-23 Survey of Canadians on Privacy-Related Issues, released on June 14, 2023, prepared for the Office of the Privacy Commissioner of Canada.