‘The devil’s in the details, and we don’t have any’: critics, civil liberties groups decry feds’ lack of clarity on changes to privacy and AI bill

The Liberals’ failure to produce details on suggested changes to their privacy and artificial intelligence legislation before the start of a House committee study is “the worst thing they could have done” if the government is serious about quickly passing its latest attempt to update Canada’s privacy laws and regulate the emerging technology, says NDP MP Brian Masse.
And while civil liberties groups say they are encouraged by the government’s willingness to make its proposed improvements, they argue the changes do not go far enough, and are calling for the artificial intelligence (AI) component to be “withdrawn, reworked, and reintroduced” as a separate piece of legislation.
On Sept. 24, Innovation, Science, and Industry Minister François-Philippe Champagne (Saint-Maurice–Champlain, Que.) appeared before the House Industry and Technology Committee to provide a verbal summary of the suggested amendments to Bill C-27, which would repeal parts of the Personal Information Protection and Electronic Documents Act (PIPEDA) and enact the Consumer Privacy Protection Act and the Artificial Intelligence and Data Act (AIDA). The legislation is also the Liberals’ second attempt to amend PIPEDA, after Bill C-11, the Digital Charter Implementation Act, died on the Order Paper following the dissolution of Parliament ahead of the 2021 election.

While Champagne did not provide written details of the amendments, he told the committee they will recognize privacy as a fundamental right and establish an obligation to protect children’s personal data online; strengthen and clarify the role of the proposed artificial intelligence and data commissioner, enabling it to share information and co-operate with the privacy commissioner and the Competition Bureau; and define specific obligations for “high-impact” generative systems, as well as general purpose ones like ChatGPT.
Champagne initially told committee members the text of the amendments would not be provided until the legislation reaches the clause-by-clause review stage, which will take place after the committee has completed witness testimony.
While opposition MPs on the committee responded incredulously to the suggestion they and committee witnesses are expected to seriously study the legislation without more details on the “substantial amendments” Champagne had announced, his office later said it was aiming to publish the amendments within the coming days. However, late on Sept. 29, Champagne’s office said it will not be providing the full text of the amendments until the study is complete, and will instead provide a letter outlining the changes.
In response to emailed questions from The Hill Times as to why the full text of the amendments was not ready before the committee began its study of the bill, Audrey Champoux, Champagne’s press secretary, did not directly answer, but said the letter outlining the changes would be provided to the committee in the coming days, and would focus on areas of improvement identified during debate in the House of Commons and consultations with stakeholders and experts.
“These include specific changes that members in the committee proposed they would put forward during the study. The government is proactively welcoming these improvements,” Champoux wrote. “While we have presented our intent, we are making sure to hear from the various witnesses invited at committee and welcome their opinions on Bill C-27.”
In an interview with The Hill Times on Sept. 25, Masse (Windsor West, Ont.), his party’s innovation critic, said that while he is appreciative of the government’s willingness to change a “significantly flawed piece of legislation,” he was “a little bit shocked” that Champagne did not provide any details before the committee began its study.

Compared to the changes the proposed amendments would make, Masse said the legislation as tabled is “significantly hollow,” and that the committee—alongside the more than 90 experts, stakeholders, and witnesses invited to appear before it—would be unable to provide a “responsible, educated response” without greater detail on what the final piece of legislation will ultimately look like. The committee is currently scheduled to hold 13 meetings to study the legislation, with up to five witnesses or stakeholder groups attending each.
“If the government’s intent is to move fast on this, the worst thing they could have done is to not come to committee prepared to present the actual amendments,” Masse said. “It’s disrespectful to all the groups and organizations that have to spend their time and money to come and give their thoughts based upon basically a 10-minute speech.”
As for why he believes the government did not have amendments ready for the committee’s study, Masse said that despite Champagne being a “very hard worker” who has put a lot of personal time and energy into the file, “the mortal and systemic weakness of the Liberals has always been laziness.”
Masse said it is “baffling” that while the Industry Committee had ramped up its work during the spring to finish two studies and clear its schedule to better focus on C-27, the Liberals weren’t able to use the summer to prepare at least some of the amendments before the fall.
The elements of the current legislation dealing with privacy and competition will be easier to fast-track through the committee, Masse said, but the complexity and novelty of AI technology would significantly hamper the study without more details of the government’s proposed changes.
“Champagne lived up to some of the changes that he said he was going to make, which is good, and I give him credit for that,” Masse said. “But we still need the amendments to move this along.”
Artificial Intelligence and Data Act should be its own bill, say civil liberties groups
Daniel Konikoff, interim director of the privacy, technology, and surveillance program at the Canadian Civil Liberties Association (CCLA), described his reaction to Champagne’s appearance at the committee as a “mixed bag.”
“[The CCLA] is pretty thrilled that our first key recommendation on recognizing privacy as a fundamental human right is something that Champagne came right out the gate to say the government is going to include that amendment,” Konikoff told The Hill Times, adding that while he had been encouraged by the government’s willingness to make the changes it suggested, “the big caveat is the question of process.”

“Introducing a bunch of amendments that possibly aren’t even written up and only showing this willingness essentially right before witnesses are set to testify has really thrown a wrench into that process,” Konikoff explained. “The devil is in the details, and we don’t have any details for the kind of minute and rigorous attention that these amendments merit.”
Tim McSorley, national co-ordinator of the International Civil Liberties Monitoring Group (ICLMG), told The Hill Times that despite the amendments proposed by Champagne at the committee, ICLMG’s view that AIDA needs to be “withdrawn, reworked and reintroduced” as a separate piece of legislation from C-27 has not changed.
On Sept. 25, the CCLA and ICLMG, alongside more than 40 other Canadian civil liberties organizations, experts, and academics, released an open letter addressed to Champagne outlining their main concerns with the current draft of AIDA. Specifically, the signatories say they are concerned that “shoehorning” AI regulation into Bill C-27 will not allow for adequate study of the AIDA, and will take time and attention away from the bill’s privacy provisions. The signatories also provided “bottom-line” changes to AIDA they believe will be needed, including recognizing privacy as a fundamental human right; a commitment to more active consultation with stakeholders “beyond industry leaders”; expanding AI regulation to apply to both the public and private sector, including government security agencies; and removing AI regulation from Innovation, Science, and Economic Development (ISED) Canada’s sole jurisdiction.

While McSorley said he understands why AI regulation would be included in the innovation minister’s portfolio, the concerns civil liberties groups have with AIDA’s current formulation are a symptom of the conflict between ISED’s mandate to promote the industry and its proposed mandate to regulate it.
That conflict is one of the reasons those groups have called for AI regulation to be removed from ISED’s sole jurisdiction, and for the proposed AI and data commissioner to be kept at “arm’s length” from the department and given independent powers of investigation and enforcement, rather than being appointed and having those powers delegated by the minister.
Konikoff added that while he doesn’t begrudge ISED and Champagne their focus on spurring Canadian innovation in the AI sector, that focus overemphasizes the interests of private business.
“Having some sort of external, independent body that isn’t under the ISED banner would be able to shift that emphasis ever so slightly away from those private interests,” Konikoff said, pointing to the elements of the legislation dealing with “legitimate interest,” which he says provide excessive leeway for private businesses to decide for themselves when they can collect or use personal information without a user’s knowledge or consent.
Code of conduct a ‘good first step’ in nurturing evolution of Canada’s AI industry: Ksenia Yadav
Ksenia Yadav, an adjunct professor at Carleton University and director of data engineering and AI/machine learning (ML) at Enablence Technologies, said while it is important to strike a balance between the new technology’s powerful benefits and its very real potential for harm, it is critical to ensure any proposed regulatory or oversight framework doesn’t stifle innovation at a crucial moment in the technology’s early evolution.

It’s important to responsibly guide the deployment of AI systems to align them with Canadian values, as well as competition and privacy laws, said Yadav, but it’s also crucial the government not stand in the way of the important research happening in both academia and the technology sector to support the deployment of AI systems and nurture their development.
Yadav pointed to the government’s recently unveiled voluntary AI code of conduct on the use of advanced generative AI systems, which Champagne has said will complement Bill C-27 and promote the safe development of the systems in Canada. Signatories to the code of conduct—which already include Canadian tech companies such as Telus and BlackBerry—agree to abide by several principles surrounding accountability, safety, human oversight and monitoring, and transparency regarding how they collect and use personal information, as well as implementing methods to remove bias from the systems they develop or deploy.
Yadav said the code of conduct was a “very good first step,” noting it’s important that it remain voluntary. “If you start making it mandatory, that slows everything down and makes it a bureaucratic process and that would dangerously stifle innovation.”
Fortunately, that balancing act between innovation and regulation is not new or unique to AI technologies, Yadav said, noting that the responsibility of developers to ensure the ethical development of AI mirrors the accountability required of engineers to uphold similar standards of reliability and quality across various technological domains. Furthermore, Yadav said the suggested amendment to require “high-impact” systems to have human oversight when making financial or employment decisions is an important step in catching those instances of bias before any harm can be done.
While Yadav said she understands the concerns about the lack of details regarding the final text of those amendments, she noted that the willingness to adapt to the rapidly changing environment is more important than having details that might not be relevant by the end of the committee’s study.
“Technologically, the development of AI/ML is so rapidly evolving that if we were to have this conversation two months from now, it would have a completely different context,” Yadav said, explaining that the technology’s implications on society and the government’s response to it would need to adapt just as rapidly.
“This cannot be a bill that takes care of all of those potential evolutions at this point,” Yadav continued. “This is a start, and our government will need to evolve their solutions and recommendations to grow with the technology.”
The Hill Times