AI and Private Capital in Canada – Context and Legal Outlook
Canada boasts a robust artificial intelligence (“AI”) ecosystem fuelled by, among other things, strong and consistent government support and growing venture capital (“VC”) and private equity (“PE”) interest. With the launch of the Pan-Canadian Artificial Intelligence Strategy in 2017, Canada became the first country in the world to adopt a national AI strategy,2 through which the Government of Canada is investing in efforts to drive the adoption of artificial intelligence across Canada’s economy.3 Governmental support for measures designed to position Canada as a leading global player in AI research and commercialization is evidenced by substantial funding, including, most recently, the C$2.4 billion allocated in Canada’s 2024 federal budget for a package of measures to secure Canada’s AI advantage (including C$2 billion over five years in support of a new AI Compute Access Fund and Canadian AI Sovereign Compute Strategy),4 and various initiatives at both the federal and provincial levels.
Alongside government investment, the Canadian AI sector has attracted increasing private capital in recent years, with the number of VC and PE AI deals steadily increasing year-over-year since 2014, and deal sizes generally trending upwards over the past 10 years, consistent with global trends.5 According to the most recent AI report prepared by Deloitte LLP in partnership with the Canadian Institute for Advanced Research and Canada’s three national AI institutes, in 2023 Canada ranked third among G7 countries in per capita VC investments in AI enablers, developers and users (behind only the United States and the United Kingdom), with domestic investment constituting 28% of the total VC investment.6 Over 300 AI/tech announcements regarding foreign direct investments in Canada from 2018 to July 2024 totalled an estimated US$33.6 billion.7 All signs point to continued private capital activity in the Canadian AI sector as Canada’s ecosystem evolves beyond the start-up stage and AI technologies continue to develop.
It is imperative that PE and VC actors pay attention to the rapidly changing legal landscape in Canada and beyond, whether they are contemplating investments in AI-driven companies or deploying AI solutions to support core portfolio management functions such as research, deal sourcing, contract management, due diligence or valuation. Compliance with evolving regulations will be challenging for private capital players because it will require both technical and legal expertise.
On June 16, 2022, the Government of Canada introduced the Artificial Intelligence and Data Act (“AIDA”), forming part of Bill C-27, titled the Digital Charter Implementation Act, 2022. If adopted, AIDA would become Canada’s first law specifically dedicated to regulating AI. AIDA seeks to set out clear requirements for the responsible development, deployment and use of AI systems by the private sector. Its stated goal is to put in place a rigorous regulatory framework governing the responsible adoption of AI systems and to limit the harms those systems may cause, including reproducing and amplifying biases and discrimination in decision-making that may, in turn, propagate considerable systemic harms and erode public trust in the technology, which could have a chilling effect on the development of AI. AIDA’s regulatory focus is limited to “high-impact” AI systems within the private sector, the definition of which is left to regulations that have not yet been developed but are expected to be based in part on the severity of potential harms caused by the system. Recent proposed amendments suggest that AIDA could also regulate general-purpose AI, such as generative AI systems. As of October 2024, Bill C-27 is under review by the House Standing Committee on Industry and Technology and, with upcoming Canadian federal elections in 2025, whether and when AIDA will come into force remains unclear. If enacted, AIDA’s impact on any particular PE or VC fund or its portfolio companies will need to be assessed on a case-by-case basis by AI technical and legal experts.
Beyond AIDA, Canadian law is evolving, with legislators, regulators and courts subjecting AI to increased scrutiny across disciplines, including human rights, labour and employment, intellectual property, data protection and privacy, cybersecurity, trade, and antitrust and competition. For instance, in May 2023, the Office of the Privacy Commissioner of Canada and its provincial counterparts launched an investigation into OpenAI’s ChatGPT, which remains ongoing. PE and VC firms must also consider the extraterritorial impact of legislative frameworks outside of Canada, including the European Union’s Artificial Intelligence Act, which (like the General Data Protection Regulation) can affect Canadian businesses launching AI products or services in the European Union. Such legislative frameworks should also be seen as useful best practice guides, influencing norms and contractual terms in the AI space.
PE firms that are developing, using or investing in AI should take proactive measures to manage the risks associated with AI, including regulatory compliance and litigation risks. PE firms can start by establishing policies or frameworks for the responsible use of AI at both the fund and portfolio company levels, and by ensuring that investment committees and portfolio company boards are well equipped to assess and address AI-related risks and opportunities. Policies or frameworks should be in line with current and expected regulatory requirements and standards and include, among other things, AI governance policies, measures for monitoring the output and performance of AI, and fairness and robustness testing for AI models.
As PE and other investors continue to deploy capital in this fast-moving technological, legal and regulatory landscape, deal terms are evolving to address risks specifically facing AI-intensive targets, including data integrity, model robustness, ethical design processes, compliance with established AI standards (including ISO/IEC standards) and reliance on key AI personnel. Beyond terms that are customarily applicable in tech sector transactions, specific representations and warranties, indemnity baskets, recourse provisions and post-closing covenants may be appropriate in instances where, for example, a target’s valuation is largely dependent on AI that attracts particular regulatory scrutiny, such as generative AI, or the target’s AI governance framework is lacking.
Our Private Equity &amp; Investment team’s full-service approach is tailored to deliver a seamless experience to our international clients with their inbound Canadian mandates, and to connect Canadian clients to capital and growth opportunities in Canada and globally. For more information or for specific questions on the latest market trends and how we can help you achieve success this year, please do not hesitate to contact us.
2 https://www.pm.gc.ca/en/news/news-releases/2024/04/07/securing-canadas-ai.
3 https://ised-isde.canada.ca/site/ai-strategy/en.
4 https://www.budget.canada.ca/2024/report-rapport/budget-2024.pdf (page 168).
5 https://central.cvca.ca/mapping-the-growth-of-ai-in-canada-through-investment.
6 https://cifar.ca/cifarnews/2023/09/27/deloitte-report-canada-leads-the-world-in-ai-talent-concentration/.
7 https://cifar.ca/wp-content/uploads/2024/09/pcais-one-pager-eng-8-AODA.pdf.