In United States v. Jones,[1] Justice Alito observed that “[i]n the pre-computer age, the greatest protections of privacy were neither constitutional nor statutory, but practical.”[2] Given the limited resources of government, storing mass amounts of data or conducting regular surveillance of everyday activities was not economically feasible. With the onset of the digital age, this reality was thoroughly undermined.[3] Both government and corporate entities now frequently employ rapidly advancing and complex technologies, which permit the gathering and storing of incredible amounts of information about individuals.[4]

This novel legal terrain has given rise to a general debate in the American literature about whether courts or legislatures are institutionally better equipped to meet the challenges presented by technological advancement.[5] Courts have been shown to have two main weaknesses. First, the rapid evolution of digital technologies often results in judges rendering outdated decisions.[6] Second, because of the “unusually complex” nature of digital technologies, judges operating within the adversarial system often receive inadequate evidence upon which to develop principled rules.[7] This is unsurprising since there is no guarantee that the parties to a proceeding have sufficient technological knowledge or resources to explain the intricacies of a complex technology.[8] Legislatures are arguably better equipped to respond given their greater informational capacity and ability to pass laws expediently.[9]

Others counter, however, that in practice legislatures are often unable or unwilling to update “obviously flawed and outdated provisions.”[10] These difficulties are explained by identifying structural impediments to passing legislation, as well as special interest influence on legislatures, including majoritarian influence stemming from a dislike of criminal suspects.[11] Even though judges tend to craft broad rules to give future courts flexibility in assessing novel circumstances, judicial rule-making at least allows for the incremental, evolutionary development of policy in response to changing technological and social circumstances.[12] As a result, these scholars argue that courts are better suited to govern privacy interests in complex search technologies.[13]

The Canadian literature has identified similar problems with respect to judicial governance of digital technologies.[14] Unfortunately, however, only a limited amount of scholarship has explored Canada’s legislative ability to create laws governing digital devices.[15] These authors conclude that Parliament has risen to the challenge of governing privacy in the digital age.[16] Their conclusions, however, derive from Parliament’s first few legislative responses to complex technological issues that arose from litigation under section 8 of the Charter.[17] As more difficult problems have arisen since these initial legislative reactions, more sustained study of parliamentary capacity to address the unique challenges of governing digital privacy is necessary.

As I conclude below, the initial academic optimism about Parliament’s abilities in this regard was unwarranted. Parliament often passes digital privacy laws that are broad and indeterminate, leaving it to the courts to develop a framework for governing digital privacy intrusions. Where Parliament enacts laws tailored to address a narrow aspect of digital privacy, these laws often become stagnant, lead to incoherent results, or both. As courts struggle to create informed rules within the adversarial framework, either legislatures must take a much more active role, or they must provide courts with better tools to decide issues relating to complex technologies. I contend that the latter approach is preferable since Parliament’s institutional constraints will likely continue to prevent it from legislating quickly and coherently in response to the use of new and complex technologies. Instead, I maintain that courts and Parliament should work together to ensure judicial development of the law is expedient, coherent, and even-handed.

This article is divided into three parts. In Part I, I outline my methodology for exploring the institutional capacity of legislatures to govern digital privacy as opposed to courts. In Part II, I analyze Parliament’s legislation governing complex and rapidly shifting technologies, asking whether this legislation responds quickly and coherently to technological change, and without undue influence. I conclude that Parliament suffers from many of the same weaknesses attributed to Congress in the American literature, although to varying degrees. Part III closes by using the article’s findings to develop a normative framework for governing digital privacy. Contrary to much of the literature on institutional choice, I maintain that Canadian courts should play a significant role with respect to governing digital technologies, at least in the context of criminal procedure.

I. Methodology

The term “digital technology” refers to electronic tools, systems, or devices that generate, store, or process data. Although my study primarily focuses on Parliament’s legislative responses to digital technologies, other complex and rapidly developing technologies raise similar governance concerns and, therefore, are also appropriate objects of study.[18] As I explain in Part II, Parliament’s legislative responses to these types of technologies have been enacted piecemeal over the last several decades. This time period provides ample opportunity to test parliamentary capacity to respond to digital privacy concerns.

In my review of the statutes, I seek to answer three main questions. First, I inquire as to whether Parliament has reacted quickly to developments in digital technologies. As noted above, this is one of the main weaknesses of allowing courts to create rules with respect to digital technologies. Judges operating within the adversarial system can only address technological issues when criminals or police have used new technologies in a legally relevant way.[19] Even after a technology is considered by a court, the appeals process will delay confirmation of any rule rendered at trial.[20] If Parliament reacts no more quickly than courts, this consideration will hold little sway in determining who is better capable of governing digital privacy.

Second, I will assess whether Parliament’s responses have led to incoherent or unintended results. Again, this is a main critique of allowing courts to regulate digital technologies. Courts not only face time constraints when rendering decisions,[21] they are also limited to consideration of the evidence submitted at trial.[22] Because adversarial proceedings tend to produce inadequate evidence of the operation of digital technologies, courts are prone to render decisions without vital information.[23] However, if Parliament is receiving inadequate evidence, conducting insufficient study, or passing laws in haste, it is likely that mistakes will also be found in its statutory scheme. If mistakes are made, its relative institutional competence will be undermined.

Finally, it is necessary to ask whether Parliament is subject to undue influence by special interest groups or ignores privacy interests to appeal to majoritarian bias. This is an important question in the context of search and seizure law, as prominent academics have questioned whether such concerns arise at all in the criminal law context.[24] Even if such concerns arise, others ask whether these concerns apply to novel search technologies, which are disproportionately owned by members of social classes that have few encounters with the criminal law.[25] If these concerns prove to be founded, and Parliament is unduly influenced, the value of judicial independence will assist courts in tailoring more balanced responses to governing digital privacy.

This inquiry is undertaken through the lens of public choice theory. Public choice theory applies microeconomics to political decision-making. Its broad contribution is to illustrate how the rational actor model applies to political actors.[26] Public choice theorists reject the assumption that political actors always act in the public interest, and seek to explain political behaviour by viewing political actors as “egoistic, rational, utility maximizer[s].”[27] Public choice theory is frequently used to explain inaction[28] and anomalous action (often caused by lobbyist influence[29]) by legislatures. Applying this theoretical framework to complex search technologies will allow for a reasoned conclusion concerning why Parliament reacts in the manner it does with respect to said technologies.

Before embarking on this study, it is also prudent to explain what is not the subject of inquiry. First, I limit my study to federal criminal laws; including provincial legislation would make the study overly broad. As will become evident, Parliament’s post-Charter criminal legislation includes sufficient case studies to shed general light on the normative capacity of Canadian legislatures to govern privacy, at least in the criminal law context. Second, my study excludes national security legislation. To investigate the speed, coherence, and public choice theory questions central to my study requires broad access to records related not only to the development of the laws, but also to how those laws are interpreted and acted upon. Generally speaking, such information is not sufficiently available. As one author aptly puts it, “[a]bsent whistle-blowers, it is almost impossible to develop enough understanding of the intelligence agencies and their practices to identify what should even be negatively framed in the first place.”[30]

II. Parliament’s Legislative Responses

To assess Parliament’s ability to govern digital privacy, I have divided my analysis into three sections. The first considers whether Parliament responded quickly to a technology that arose in the jurisprudence or was widely used by the public. Whether the response was intelligible or had significant gaps will be the subject of the second inquiry. The third inquiry will assess whether public choice concerns have arisen when Parliament passes digital privacy laws. I offer institutional explanations for Parliament’s successes and failures at each stage.

A. Speed of Response

The adoption of the Charter resulted in a series of assertive decisions interpreting the scope of section 8, which protects against “unreasonable searches and seizures.”[31] Perhaps unsurprisingly, Parliament’s response was somewhat lagging in the first decade, as it also had to respond to a plethora of other Charter decisions. Despite the challenge of responding to the judicial interpretation of a new bill of rights, Parliament initially reacted relatively quickly to fill gaps in the law on several occasions. In later years, however, institutional limitations prevented timely, if any, legislative response.

1. Post-Charter

The passage of the Protection of Privacy Act[32] introduced what is now Part VI of the Criminal Code.[33] With this change, Parliament followed in the footsteps of its American counterpart and provided a comprehensive scheme for governing interceptions of private communications.[34] It defined “private communications” as “any oral communication, or any telecommunication, that is made ... under circumstances in which it is reasonable for the originator to expect that it will not be intercepted by any person other than the person intended by the originator to receive it.”[35] However, as new communications technologies emerged, the limitations of Part VI’s ability to respond to privacy and law enforcement concerns were repeatedly exposed.[36]

One of the first challenges posed to the scope of Part VI arose from its application to analog pagers.[37] At least two courts concluded that these technologies did not attract a reasonable expectation of privacy, thereby precluding the need for an intercept warrant.[38] Two reasons formed the basis for this conclusion. First, it was possible that a third party would overhear the recorded messages when played back on the pagers’ speakers.[39] Second, third parties could access recorded messages by tuning another pager to the same frequency as the receiving party’s device.[40] Even though a speaker’s volume may be controlled, these courts refused to recognize a reasonable expectation of privacy in the devices.[41]

Similar difficulties arose from public use of cell phones.[42] Cell phones sent unencrypted analog signals over publicly accessible radio waves, giving rise to the question of whether they attracted a reasonable expectation of privacy.[43] In R. v. Solomon,[44] the court concluded that no reasonable expectation of privacy existed because the cell phone signals were publicly accessible. In R. v. Cheung,[45] however, the court undertook a more detailed assessment of telephony. It concluded that because the wireless networks used by some phones transfer information across many frequencies and transmission towers,[46] it would be rare to intercept any communications from these wireless phones.[47] As such, the user’s expectation of privacy was held to be reasonable.[48]

In the late 1980s, a further issue arose with respect to whether the consent of one party to covertly record a conversation supplanted the other party’s reasonable expectation of privacy. No legislative provision expressly permitted such activity. The police could therefore only rely on the evidence obtained if the accused’s expectation of privacy was unreasonable.[49] As the consenting party could repeat the words in court, there was a basis to conclude that the accused gave up any reasonable expectation of privacy.[50] In R. v. Duarte,[51] however, the Court rejected this argument. As Justice La Forest wrote, “[a] society which exposed us, at the whim of the state, to the risk of having a permanent electronic recording made of our words every time we opened our mouths might be superbly equipped to fight crime, but would be one in which privacy no longer had any meaning.”[52] Given the “wholly unacceptable” danger to privacy brought on by such new technologies, the Court concluded that prior judicial authorization was required.[53]

Shortly after Duarte, the Court in R. v. Wong[54] considered whether Part VI applied to video recordings. As outlined above, Part VI only covered oral or voice communications when it was first enacted. It therefore did not apply to non-audio-equipped video recordings. A few years earlier, the Law Reform Commission of Canada had explicitly concluded that this gap in the legislation would not lead to “unjustifiable intrusion into privacy.”[55] As a result, the police had taken advantage of this loophole and planted a non-audio-equipped video camera in the accused’s hotel room. The Court ultimately found a breach of section 8, since the accused had a reasonable expectation of privacy in his hotel room.[56] As Part VI did not provide for a warrant power, it was again unable to serve legitimate law enforcement interests.

Around the same time Duarte and Wong were decided, the courts were also considering the legality of using digital number recorders to register the metadata relating to outgoing and incoming calls.[57] In R. v. Fegan,[58] the Ontario Court of Appeal found that no warrant was required to use digital number recorders because the service provider was not acting on behalf of the state. Had such activity occurred at the behest of the state, however, pre-authorization would have been required.[59] This conclusion derived from the then-recent decision in R. v. Wise,[60] wherein the Court considered whether police installation of a tracking device on a motor vehicle required prior judicial authorization. Even though the “beeper” device at issue was unsophisticated,[61] the Court found that its use breached the occupant’s reasonable expectation of privacy. If such a minimal infringement required pre-authorization, then it was likely (contrary to an earlier appellate opinion[62]) that a digital number recorder would also require pre-authorization.[63] As the Criminal Code provided for neither power, such searches violated section 8 of the Charter.[64]

Parliament attended to many of these concerns in 1993 with Bill C-109.[65] To address the inapplicability of Part VI to wireless phone communications, Parliament amended the definition of “private communication” to include any “radio-based telephone communication that is treated electronically or otherwise for the purpose of preventing intelligible reception by any person other than the person intended by the originator to receive it.”[66] This ensured that some wireless telephone communications would require the state to meet the higher requirements for a Part VI intercept warrant.[67]

Parliament’s enactment of section 184.2 of the Criminal Code further provided for warrants to allow for consensual interception of communications. This addressed the concerns raised in Duarte. In addition, Parliament enacted provisions that permitted warrantless interception where bodily or imminent harm is reasonably foreseeable.[68] Although the requirements now found in section 184.2 do not provide the added protections of other Part VI warrants,[69] the courts have found the lower standard to be constitutional as the third-party privacy concerns raised by traditional intercepts are not engaged.[70] As Justice Watt observed in R. v. Largie,[71] “[p]articipant surveillance is generally more focused than third-party surveillance, targeting specific conversations with specific individuals.”[72] Thus, not only do captures of third-party communications become less likely, but the state agent’s control over the conversation also reduces the risk of accidentally receiving irrelevant but private information.[73]

To address the gap revealed in Wong regarding the non-applicability of Part VI to non-audio-equipped video recordings, Parliament enacted the general warrant provision under section 487.01. This broad provision offered a means for police to seek a warrant where no other legislative enactment prescribed a suitable power. It also specifically included sections 487.01(4) and (5), which extended Part VI to apply to any observation “by means of a television camera or other similar electronic device” of “any person who is engaged in activity in circumstances in which the person has a reasonable expectation of privacy.”[74] Thus, Parliament not only provided police with a means to lawfully conduct non-audio-equipped video recordings, but also gave police a flexible tool to apply for search warrants where no specific Criminal Code provision applied.

Finally, in response to Wise and Fegan, Parliament enacted sections 492.1 and 492.2 of the Criminal Code. Section 492.1 allowed tracking warrants to be issued if the police had reasonable grounds to suspect an offence had been or would be committed and that information relevant to the offence could be obtained by using a tracking device. Section 492.2 allowed for the use of digital number recorders if police had reasonable grounds to suspect information related to an accused’s telephone calls would aid in an investigation. This lower standard of reasonable suspicion was borrowed from the decision in Wise, wherein the Court concluded that any parliamentary response could allow for authorization on a lower standard given the lower privacy interests inherent in the information revealed by some searches.[75]

2. 1994–1997

The next parliamentary response to digital privacy concerns was less comprehensive, but no less important, as it updated the warrant powers for police officers under section 487 of the Criminal Code. This provision’s scope extended only to “things” found in buildings, places, or receptacles. The problem raised by digital evidence was aptly summarized by Susan Magotiaux:

Is a computer a thing? Is the data on it a thing? Is the string of binary code sent through satellites in pieces and reassembled at some other machine a thing? Is it the same “thing” when it lands as it is when it travels in pieces? And what of the places? Police can’t knock and announce their presence at the door of satellites and clouds and mobile servers. Yet without particularity of place, current tools may be unavailable.[76]

To ensure police could seek warrants for digital “things,” Parliament amended section 487 of the Criminal Code in 1997.[77] Subsections 487(2.1) and (2.2) were added to ensure police may apply to access and use computer systems found in the place of a search. These broad provisions provide that a police officer may “use or cause to be used any computer system at the building or place to search any data contained in or available to the computer system.”[78]

3. 1998–2013

As computer technologies became more prevalent, the modes for committing a wide range of crimes were fundamentally transformed.[79] Unfortunately, the wording of many criminal offences did not capture acts committed with computer technologies, while other harmful conduct now prevalent in the digital age had yet to be criminalized. Parliament spent much of this period attempting to fill these legislative gaps.

The main area addressed by Parliament was broadly concerned with the sexual exploitation of minors. Digital technologies provided new and difficult-to-trace means of possessing and distributing child pornography.[80] The typical means of “possession” in the physical sense applied to those who downloaded child pornography.[81] However, determining whether accessing an image on an internet website constituted “possessing” the data posed conceptual difficulties.[82] Although files stored in the browser’s cache may supply evidence of knowledge and control, these core elements of possession will often be difficult to prove on that basis.[83] Equally concerning, the definition of distributing child pornography did not extend to digital means of distribution, which had become increasingly common at the turn of the century.[84]

In response to these issues, Parliament enacted Bill C-15A in 2002.[85] This bill created the “accessing” child pornography offence now found in subsections 163.1(4.1) and (4.2) of the Criminal Code.[86] Parliament’s purpose in making these arrangements was to “capture those who intentionally view child pornography on the net but where the legal notion of possession may be problematic.”[87] Bill C-15A also amended the distribution of child pornography offence found in subsection 163.1(3) to include “transmission” and “making available” within the scope of the offence. This had the effect of ensuring that the “offence extends to distribution of child pornography in electronic form on the Internet by such means as e-mail and posting items to websites.”[88] Parliament further passed section 164.1, which allowed for courts to order the removal and destruction of child pornography on the internet.[89]

Bill C-15A also introduced an offence for child luring by way of a “computer system.”[90] The internet enabled the increased prevalence of this sort of predatory behaviour and, as such, the prohibition was tailored to combat the digital commission of these crimes.[91] Similarly, voyeurism offences had become increasingly prevalent with increased technological capacity. Parliament responded with a specific prohibition against recording people in private circumstances.[92] These and the child pornography provisions would not require any substantial amendments during this time period.

Parliament also updated several other offences to account for contemporary technology. For instance, the illegal gambling provisions in paragraph 202(1)(i) were amended in 2008 to include digital means for promoting or facilitating betting.[93] Section 342.01 was amended to include copying of “credit card data” as opposed to prohibiting only “forging or falsifying” credit cards, since the latter definition did not apply to the mere possession or use of a credit card’s data.[94] Parliament also introduced a criminal prohibition for using recording technology (i.e., small cameras) to record private productions such as movies on display in a theatre.[95]

In addition to creating new or amending old criminal offences, Parliament passed its first production order scheme in 2004.[96] Production orders allow police to compel third parties who are not under investigation for any offence to produce data or documents that may be relevant to the commission of an offence by another person.[97] The impetus to pass this scheme arose from Canada’s 2001 signing of the Council of Europe’s Convention on Cybercrime.[98] The convention requires that all signatories criminalize certain offences commonly committed on computers and improve investigative techniques for detecting online crime. By adopting this framework, the signatories aimed to facilitate increased co-operation between countries investigating cybercrime.[99]

Parliament’s legislation furthered these goals by providing police with two types of production orders: a general production order issuable on reasonable grounds to believe an offence occurred and a specific order relating to financial or commercial data issuable on reasonable suspicion.[100] Subsequent attempts in 2005,[101] 2009,[102] 2010,[103] and 2012[104] to bring in more narrowly tailored production orders, as well as provide a variety of other police powers necessary to ratify the Cybercrime Convention,[105] were unsuccessful. Either the Conservative government received limited opposition party support when in a minority position, an election was called, causing the proposals to die on the Order Paper, or, as discussed in detail below,[106] public backlash caused the government to retract its proposal.[107]

4. 2014–Present

The advent of email and text messaging introduced novel challenges for Part VI intercepts. Under section 183, the meaning of “intercept” includes to “listen to, record or acquire a communication or acquire the substance, meaning or purport thereof.”[108] Courts and academics had long argued that inclusion of the word “acquire” made it necessary to apply for a Part VI warrant to access retrospective email and text messages.[109] Others, however, concluded that the plain meaning of “intercept” required that the acquisition of the message occur during its transmission.[110] As this distinction fundamentally alters the prerequisites for obtaining private communications,[111] several courts heard arguments with respect to when a state act qualified as an “intercept.”[112]

The Supreme Court of Canada partially addressed this issue in Telus.[113] Unlike other telecommunications providers, Telus stores all messages sent through its infrastructure on a computer database for thirty days.[114] The police wanted to retrieve historical messages from this database, as well as future messages throughout the course of a warrant.[115] Rather than applying for a production order and an intercept warrant, the police applied for a general warrant under section 487.01.[116] A plurality of the Court found the acquisition of the future messages to be an “intercept,” since any prospective capture of communications engages the purpose of Part VI.[117] The remaining members of the majority concluded that this technique was “substantively equivalent” to an intercept.[118] The dissent found that Part VI drew a distinction between interception and use, retention, or disclosure of a communication.[119] As Telus was disclosing to police what it had independently intercepted during its delivery process, the practice did not qualify as an “intercept.”

The issue of whether an intercept warrant was required for purely historical emails or text messages reached the Court four years later in R. v. Jones.[120] The Court adopted the dissenting view in Telus that the statutory scheme supported the distinction between disclosure and interception. As such, police need only apply for a production order to obtain historical messages. Although this issue is now settled (barring a constitutional challenge),[121] it is notable that Parliament failed to update its legislation despite these ambiguities being known to the federal government for well over a decade.[122]

The use of peer-to-peer file sharing networks in the context of child pornography investigations also posed difficulties for police investigations. These networks allow users to download files directly from another user’s computer. As users are anonymous online, police must begin such investigations by procuring the Internet Protocol (IP) address that obtained the child pornography files.[123] The investigating officer can then run the IP address through a database that matches IP addresses with approximate locations and service providers.[124] The officer then makes a “law enforcement request” to the relevant service providers requesting that it release the subscriber information related to the IP address.[125] With this information, the police may then obtain a warrant to seize and search the suspect’s computer.[126]

These were the facts underlying the Court’s decision in R. v. Spencer,[127] as well as a series of earlier lower court decisions dating back to the mid-2000s.[128] The accused in Spencer successfully argued that he had a reasonable expectation of privacy in his subscriber information.[129] As such, the Court concluded that state requests for Internet Service Provider (ISP) subscriber information qualify as a search under section 8 of the Charter, thereby requiring lawful authority to conduct the search. As there was no suitable provision authorizing the state to make such requests,[130] the search was found to be unconstitutional.[131]

Technological change also affected the intrusiveness of tracking warrants. Tracking warrants are frequently attached to objects, such as vehicles, but now are also available to monitor mobile devices frequently carried on the person. The ability to track a person’s precise location with Global Positioning System (GPS) technology, as opposed to the unsophisticated methods at issue in Wise, poses significantly more serious threats to privacy. As such, it was questionable whether tracking a person based on “reasonable suspicion” still struck an appropriate balance between privacy and law enforcement interests.[132]

The utility of digital number recorders was also impacted by technological developments. Section 492.2 of the Criminal Code originally stipulated that a “number recorder” was “any device that could be used to record or identify the telephone number or location of the telephone from which a telephone call originates, or at which it is received or is intended to be received.”[133] As people now frequently communicate through other media, such as email and text messaging, it was necessary to create a broader framework for capturing the metadata of such communications. It was also unclear whether the retrievable data under section 492.2 included the location at which a call was made or received. Arguably, such a search would also be constitutional, but the legislation needed to permit it explicitly.[134]

Finally, the Court was presented with the issue of whether searching cell phones incident to arrest was constitutional.[135] This issue has especially important implications for digital privacy.[136] As such, the Court’s decision to allow warrantless cell phone searches incident to arrest—when many cell phones are functionally equivalent to computers[137]—was controversial. For a variety of reasons, the Court’s ruling has been heavily criticized.[138] Anticipating its institutional shortcomings to develop a comprehensive rule, the majority invited Parliament to pass legislation governing when police may conduct such searches.[139]

Parliament addressed some of these concerns in 2014 with Bill C-13.[140] To address the gap in Spencer, as well as other more general gaps in the production and preservation order scheme, Parliament overhauled sections 487.011 to 487.0199 of the Criminal Code. Three main production orders were created, all issuable upon reasonable grounds to suspect an offence has been or will be committed. Sections 487.015 and 487.016 were added to allow police to trace and have third parties produce “transmission data.”[141] Transmission data is effectively metadata—that is, the contextual information surrounding a communication.[142] Acquiring such data allows police to trace the origin of any telecommunication.[143] Section 487.017 allows police to apply for “tracking data,” being data that “relates to the location of a transaction, individual or thing.”[144] The amendments also provided police with the ability to compel third parties to preserve documents in their possession for a prescribed period. As such information is routinely destroyed (sometimes intentionally but often inadvertently), this provision was necessary to preserve evidence of crimes committed with digital technologies.[145]

Parliament further responded to concern over the constitutionality of tracking device warrants available under section 492.1 of the Criminal Code by raising the standard from reasonable suspicion to reasonable grounds to believe when the device being tracked is commonly on the person.[146] Parliament simultaneously updated the digital number recorder provision to include the broader term “transmission data.”[147] This allowed police to obtain data indicating the origin and intended recipient of internet and text communications, not just telephone communications.[148] The revised definition also clarified that location data during the transmission of a call may be obtained, a question left open by the previous provisions.[149] The fact that it took until 2014 to update these provisions, however, is evidence of Parliament’s difficulty keeping pace with digital technologies.

Finally, Bill C-13 updated the Criminal Code by providing an offence for what has come to be known as “cyberbullying.”[150] A legislative gap arose because digital technologies made it easy for young persons to distribute sexually explicit photos of their peers. Because charging youth with distribution of child pornography was too harsh a sanction,[151] Parliament introduced subsection 162.1(1) of the Criminal Code. Although the section in many ways mirrored the existing child pornography offences, it provided prosecutors with more moderate sentencing options for youth and young adults than the child pornography provisions.[152]

5. Summary

Several conclusions may be drawn from the above review. Parliament’s first few responses to gaps or constitutional issues with its legislative framework governing complex technologies were relatively quick.[153] At the turn of the century, however, Parliament became much less efficient. Despite Parliament’s undertaking to provide a comprehensive lawful access scheme when it signed the Convention on Cybercrime in 2001,[154] its legislation was patchwork and slow. Parliament did, however, manage to meet the requirements of the convention fourteen years after it was adopted.[155] In the interim, the Crown pursued drawn-out litigation in the courts trying to find lawful access provisions where none existed.[156] Disputes surrounding Part VI warrants fared no better: Parliament refused or was unable to address the confusion surrounding the definitions of “private communication” and “intercept,” ultimately leaving the issue to the courts.[157] Although the digital number recorder warrant was eventually updated, the provision was inapplicable to many of the most common media of communication for two decades. Other issues with significant digital privacy implications, such as searches of cell phones incident to arrest or guidelines for searching computers under subsections 487(2.1) and (2.2), have so far received no response from Parliament.[158]

Parliament did fare better in defining offences—a domain where it could not rely on courts to fill in legislative gaps. Several offences were modified in the early- to mid-2000s to allow prosecution of new ways of committing crime brought on by digital technologies. Parliament’s record with respect to updating offences, however, is not perfect. As Peter MacKay observed, given the seriousness of the child pornography offence, the delay in updating these provisions was “virtually inexcusable.”[159] The well-known practice of cyberbullying had also been an issue long before Parliament’s legislation passed. More than anything, the response was a reaction to high-profile teenage suicides.[160] Moreover, other desirable offences—such as a criminal prohibition on accessing and stealing historical data—have still not received criminal sanction.[161] Overall, although Parliament has responded reasonably quickly when updating offences, its record has blemishes.[162]

Any attempt at explaining Parliament’s slow response time will, to some extent, be guesswork. However, it is not unreasonable to at least partially explain significant delays by observing that Canadian governments are often in a minority position. This was the case from 2004 to 2011, a period in which legislative amendments regarding controversial privacy issues such as “lawful access” were repeatedly stifled.[163] A great deal more legislation was passed in the following years, which witnessed a Conservative majority government. It is also important to note, however, that all opposition parties cited instances during the Conservative government’s time in power where delay was caused by the government’s tendency to shelve bills addressing criminal justice issues, only to revive them to distract from scandals or to drum up political support around election time.[164] The “tough on crime” angle suggests that majoritarian politics were at play, and that the government was willing to sacrifice privacy interests for political gain.

B. Coherence of Response

The coherence of Parliament’s responses to complex and rapidly advancing search technologies is equally illustrative of its relative institutional capacity to govern digital privacy. As will be seen, both privacy advocates and law enforcement have expressed concern about significant deficiencies with Parliament’s legislative responses. Many of the technological developments were not anticipated by Parliament. Other anomalous results arose from unclear legislative drafting, which may be attributed to a failure to fully comprehend digital technologies. Still other responses relied on highly questionable determinations that the technology at issue did not attract a reasonable expectation of privacy.

1. Wireless Phones

Parliament’s 1993 amendment to the definition of “private communication” ensured that all encrypted digital signals sent via wireless phones came within the ambit of the term.[165] However, the various technologies used by different “generations” of cordless phones resulted in many then-current technologies falling outside of the amended definition of private communication. First generation cordless phones, which at the time of the amendments were used by 95 per cent of telephone users,[166] were susceptible to interception by simple scanner devices.[167] As a result, some courts held that communications via these phones did not attract a reasonable expectation of privacy.[168] These phones, like their analog pager predecessors, could therefore be tapped by anyone, including police, at will.

Other courts, in line with modern jurisprudence on section 8 of the Charter, concluded that the technical capabilities of private communication technology and their ability to be intercepted should not be the only factor considered.[169] To exclude 95 per cent of then-current cordless phone users was arguably not in line with what the average consumer would expect, as it is unlikely that anyone other than the police was frequently trying to intercept phone calls.[170] Moreover, placing emphasis on the type of phone one owns allows those who can afford to purchase newly available technologies to have greater privacy protections.[171] Parliament, then, arguably drew an arbitrary and unfair distinction in its first amendment to the definition of private communication.

2. Tracking Device Warrants

In its 2014 amendments, Parliament elevated the grounds necessary for issuance of a tracking device warrant if the device is commonly found on the person. Given the onset of GPS tracking, this sounds like a principled approach. However, this approach may unduly limit police, depending on what technique is used to track a device. Tracking a cell phone, for instance, may involve police using a tactic known as “pinging.” This practice indicates to police the cell phone tower with which a cell phone is exchanging signals. In Grandison, the expert testimony revealed that this tactic told police only that the accused was somewhere within a radius of 50 to 4,894 metres of a tower.[172] The court also noted that pinging does not involve constant tracking of the subject, but instead requires that police make specific requests to the telecommunication service provider to determine the subject’s approximate location at any given time.[173] This contrasts with GPS tracking, by which an accused’s exact location can be determined at any time.[174]

With a fuller understanding of the technology used for tracking the accused’s phone, the court rejected the accused’s contention that the previous reasonable grounds to suspect standard was unconstitutional.[175] It came to this conclusion even though the amendments raising the relevant standard had come into force between the time the charge arose and the time the court rendered its decision. Although the technique at issue was somewhat more sophisticated than the vehicle tracker used in Wise, the court concluded that the information revealed did not, unlike the use of GPS technologies, significantly touch on the biographical core of personal information required to constitutionally impose the higher reasonable and probable grounds standard.[176] Parliament’s amendment, although well intended, therefore inadvertently prevented police from using other reasonable and less invasive methods of cell phone tracking.

3. Digital Number and Transmission Data Recorders

As noted in the preceding section, the initial language of section 492.2 (“digital number recorder”) was not broad enough to encompass metadata relating to technologies other than telephone calls. This had the effect of leaving metadata related to technologies such as email and text to be sought under the general warrant or production order provisions.[177] As these provisions require reasonable grounds to believe, obtaining what is effectively the same information was subject to a higher standard than the reasonable suspicion required under section 492.2.[178] This was undesirable from a law enforcement perspective, since metadata is often used early in an investigation and is therefore needed to make out reasonable and probable grounds for a warrant.[179] Although the 2014 amendments corrected this mistake, it had persisted in the Criminal Code for twenty-one years.

4. General Warrants

Parliament enacted the general warrant provision found in section 487.01 to allow courts to issue warrants authorizing police to “use any device or investigative technique or procedure or do any thing described in the warrant that would, if not authorized, constitute an unreasonable search or seizure.”[180] Although section 487.01 provides police with a flexible law enforcement tool,[181] it must be acknowledged that it abdicates authority for governing many novel search technologies to the courts. For instance, the following investigative techniques have all been governed under section 487.01: Forward Looking Infrared (FLIR) thermal imaging,[182] installation of “amp meters” to measure electricity usage,[183] making electronic copies of data on a computer system,[184] review of third-party forensic files,[185] the ability to program failures into a criminal suspect’s computer hardware,[186] use of forensic fluorescent light technologies to covertly search for bloodstains,[187] and the ability to perform phallometric testing.[188] As Daniel Scanlan observes, it is reasonable to anticipate that the general warrant “will [continue to] have broad application to the investigation of offences involving computers and the capture of data.”[189]

5. Computer Searches

Subsections 487(2.1) and (2.2) of the Criminal Code allow police to use “any computer system” to search for “any data” available to the computer system.[190] As Susan Magotiaux observes, “[t]he scope of the[se] subsection[s] ... [is] potentially boundless. ... Depending on the configurations and active connections of a given device, there could be data accessible to the device from other people, other networks, other countries, or other businesses.”[191] The privacy interests implicated by such computer searches were aptly summarized by Justice Fish. As he wrote in R. v. Morelli, “[i]t is difficult to imagine a search more intrusive, extensive, or invasive of one’s privacy than the search and seizure of a personal computer.”[192] The need to ensure such searches respect privacy interests is therefore of the utmost importance.

Unfortunately, Parliament has not elaborated upon the process for searching computers. Indeed, until 2013 the Crown maintained that special authorizations for computer searches are unnecessary, because computers are no different than filing cabinets or cupboards.[193] Although the Court unanimously rejected these analogies,[194] the far more difficult question is how computer searches must be conducted.[195] This concern prompted the Court in Vu[196] to suggest that the broad scope of computer searches may require Parliament or the courts to devise search protocols.[197] By enacting subsections 487(2.1) and (2.2), and then refusing to update these sections in response to the Court’s decision in Vu, Parliament has again effectively left it to the courts to determine the rules with respect to a complex search technology.

Although some commentators believe that developing computer search protocols is not possible,[198] others have proposed ways forward.[199] The capacity and functionality of modern computers give rise to some basic questions.[200] Should police be able to look through every file and folder on a computer?[201] Does the type of crime investigated limit police to reviewing certain types of files? Should police searches be restricted to use of certain keywords? How does the plain view doctrine operate within computer searches?[202]

It is important to explore the answers to these and related questions because leaving computer searches to ex post review is inconsistent with the purpose of section 8 of the Charter: to prevent unreasonable searches and seizures.[203] This is especially important as the case law is replete with instances where police have grossly overstepped the boundaries of what would qualify as a “reasonable” search.[204] Moreover, new technological developments allow police to search in manners much more respectful of privacy interests.[205] It is unlikely that the adversarial system will be able to stay on top of these developments, since a court’s ability to respond to technological developments is limited by the evidence provided to it in a given case.[206] Parliament’s approach so far has not, however, fared any better.

6. The Definition of “Intercept”

Although the Court reconciled the competing interpretations with respect to the meaning of “intercept” in Jones,[207] two main issues persist. The first concerns the prospective acquisition of “untransmitted” communications. As Professor Steven Penney observed, the definition of “private communication” should be amended “to include the prospective interception of electronic communications before they are transmitted.”[208] Given that the current definition of “private communication” includes only “oral” communications and “telecommunications” (the latter of which requires the “emission, transmission or reception” of communicative content “by any wire, cable, radio, optical or other electromagnetic system, or by any similar technical system”), a Part VI authorization is not required to prospectively acquire non-oral communications before they are transmitted.[209] As such, covertly installed key logger software could be used to record emails and other communications before they are sent, but these would not be afforded the protections in Part VI despite implicating identical privacy interests.[210]

Second, the result of relying on the prospective–retrospective distinction may cause constitutional issues in other contexts. In Jones,[211] the police had applied for a production order under section 487.012 (now 487.014) to produce historical text messages stored on Telus’s database. In his concurring opinion in Jones, Justice Rowe raised a problem with the scheme as interpreted in Justice Cromwell’s decisions in Telus and in Jones. The prospective–retrospective distinction may break down in practice, as it leaves the possibility of police applying for a transmission data warrant, and then subsequently applying for production orders to retrieve the stored messages a short time after they receive notice that a call or text was made.[212] If the moment of authorization is what matters, then there is nothing stopping police from exploiting this loophole.[213] As I have argued elsewhere, by narrowing the definition of “intercept,” the constitutional problem with its definition has simply been shifted to Parliament’s production order scheme.[214]

7. Subscriber Information

In Re Subscriber Information,[215] the Provincial Court of Alberta considered whether subscriber information for a cell phone could be retrieved by police without a warrant. Because the phone in question was internet-connected, the court concluded that its subscriber information attracted a reasonable expectation of privacy, even if the subscriber information for non-internet-connected phones did not.[216] As such, the Crown sought to have the cell phone’s subscriber information produced through sections 487.016 and 487.017. To qualify, the information sought must relate to “telecommunication functions of dialling, routing, addressing or signalling” (487.016) or “the location of a transaction, individual or thing” (487.017). The Crown argued that cell phone subscriber information meets these tests because it is accumulated and stored to facilitate billing and collection of payment.[217] However, because subscriber information does not relate to the functioning of telecommunications as these sections require, it was held not to fall within the ambit of the provisions.[218] Other cases and legal commentary support this conclusion.[219] Parliament’s 2014 amendments therefore created an anomalous result by permitting police to obtain transmission and location data on a lower standard (reasonable suspicion via sections 492.1 and 492.2) than basic subscriber information for internet-connected cell phones (reasonable and probable grounds via section 487.014).[220]

8. Summary

In most of the areas where Parliament has responded to the challenges of governing digital privacy, noticeable gaps have been revealed via judicial or academic review. Again, it is difficult to provide a definitive reason for why holes in Parliament’s legislative scheme frequently arise. However, it is reasonable to conclude that in some circumstances Parliament was not provided with the relevant information when passing laws. Technologies are rarely presented to legislators with a list of all their current or possible future applications and their interactions with other technologies. Even with the advantage of time to study technologies in depth, it is difficult to anticipate their transformative potential. Moreover, there is no guarantee that legislatures thoroughly understand digital technologies.[221] This lack of understanding has resulted in lacklustre debates that fail to expose all the weaknesses in proposed legislation.[222]

In other instances, it may be that Parliament is acting in haste or without much interest in protecting privacy. Its response to early wireless phone technology is indicative of a lack of study or outright neglect of privacy interests in early cordless telephones. Parliament’s difficulties passing lawful access legislation also led the Conservative government, upon winning its first majority, to take advantage of its position by significantly expediting the legislation. In yet other instances, Parliament has made a deliberate choice to allow courts to create governing frameworks for digital technologies. The general warrant provision in section 487.01, as well as the broad computer search powers found in subsections 487(2.1) and (2.2), are illustrative. These responses demonstrate that Parliament often fails to respond adequately or intelligibly to digital privacy challenges despite its theoretical advantage over courts.

C. Public Choice Theory

As discussed above, public choice theory cautions that the legislative process may be skewed in favour of powerful interest groups or majoritarian interests. As such, less fortunate groups will suffer to the benefit of those that are often wealthier, less diverse, and better organized.[223] Canada is generally less susceptible to the negative influences of lobbying,[224] and it has been argued that novel search technologies are relatively immune from majoritarian concerns.[225] As digital technologies are used disproportionately by the wealthy, Professor Kerr suggests that these individuals will convey their privacy interests to legislatures, “resulting in a healthy debate and relatively favorable conditions for balanced legislative rules.”[226] These contentions have not, however, been tested in the Canadian digital privacy and criminal procedure contexts.

The lawful access experience provides an illuminating case study for investigating the influence (or lack thereof) of lobbyists and majoritarian politics on digital privacy rules. In Parliament’s first review of the issues surrounding lawful access, it consulted more than three hundred organizations, including police services, telecommunications service providers (TSPs), civil rights groups, and individual Canadians.[227] As a result of this consultation, Parliament tabled Bill C-74 in 2005, only to have it die on the order paper when an election was called.[228] As mentioned earlier, subsequent attempts to pass lawful access legislation were made in 2009, 2010, and 2012. These proposals did not make it past first reading. The 2014 proposals found in Bill C-13, however, were passed by a majority Conservative government.

Throughout this experience the federal government justified increased lawful access demands by appealing to the need to protect Canadians from terrorists, identify pedophiles, prosecute violent offenders, and address the issue of cyberbullying.[229] However, the various lawful access proposals were met with fierce opposition from civil rights groups, privacy commissioners, academics, and at times TSPs.[230] The TSPs questioned the need for broad access powers, and also raised the more self-interested question of who would incur the costs of installing the necessary infrastructure to provide government access.[231] Civil rights groups rapidly disseminated information to the public via the media to create an atmosphere of opposition to controversial aspects of each attempt to institute lawful access legislation.[232] Opposition parties also seized on the opportunity to critique the Conservatives for pandering to law enforcement demands.[233]

The impact of civil society’s opposition could be seen throughout the process. As Parliament admitted in its legislative backgrounder to Bill C-74, storage obligations requiring TSPs to collect and store information about their customers’ internet viewing histories were not included after its initial consultation.[234] This contrasts with numerous regimes in Europe, which have such data retention policies.[235] A national database storing names and addresses of customers was also not part of Bill C-74.[236] Nor was a “know your customer” requirement, which would have required retailers to verify the identity of anyone purchasing a service, preventing the sale of items such as anonymous phone cards.[237] The concerns raised by privacy advocates dissuaded Parliament from acceding to law enforcement requests to implement these anti-privacy policies.[238]

With these initial concessions, Parliament slimmed down its first proposal in Bill C-74. It maintained, however, a requirement that TSPs update their infrastructure to allow police to intercept communications.[239] It also included a provision allowing law enforcement to obtain ISP subscriber information upon request, without judicial authorization.[240] This law was designed to provide the “lawful authority” required under paragraph 7(3)(c.1) of PIPEDA to allow TSPs to hand over subscriber information without a warrant.[241] As some authors observed at the time, this development would lead to “a significant alteration in the procedural safeguards against excessive fishing expeditions by law enforcement agencies.”[242] The fact that the legislation provided no oversight for this process made the proposal even more controversial.[243] As a result, privacy advocates protested the bill, only to have it die on the order paper following the calling of an election.

The 2009, 2010, and 2012 attempts to pass lawful access legislation suffered from the same controversial aspects as Bill C-74.[244] The battle was again fought in the media, wherein civil rights groups and opposition parties aligned themselves against the government proposal. A series of social media campaigns was highly influential in painting the government’s bill as anti-privacy.[245] The opposition parties also launched campaigns against each bill.[246] In so doing, they accused the Conservatives of pandering to majoritarian desires to be “tough on crime” as opposed to drafting a constitutionally compliant lawful access scheme that took seriously the many concerns raised by pro-privacy advocates.[247]

Even TSPs played an active role in opposing the new legislation. The federal government had proposed modifications to the Solicitor General’s Enforcement Standards (SGES) for Lawful Interception of Telecommunications that would require licensed TSPs to replace circuit switched telephony systems with interconnected radio-based transmission facilities.[248] As the TSPs’ representative observed, this change “opens up several additional services to interception requirements, including Internet services, and cable and broadcasting services.”[249] The TSPs objected since this strategy sought to accomplish through regulation what Parliament had been unable to accomplish through legislation.[250] Even without significant response from other privacy advocates, the federal government backed away from this proposed change.

The result of the decade-long debate on lawful access was that the government conceded that any modernization to police powers would not include “the warrantless mandatory disclosure of basic subscriber information or the requirement for telecommunications service providers to build intercept capability within their systems.”[251] One controversial aspect nonetheless remained in the legislation Parliament passed—namely, section 487.0195—which allows TSPs to voluntarily disclose subscriber information to law enforcement without incurring civil or criminal liability. However, as the Court in Spencer recognized a reasonable expectation of privacy in ISP subscriber information, it is unlikely that telecommunications providers will risk their reputations and provide such information to police without a warrant.[252] The lawful access experience thus exemplifies the ability of civil society to mobilize to protect digital privacy interests, even in the face of persistent demands by law enforcement for expansive search powers and a government using majoritarian “tough on crime” politics to achieve political ends.

III. Implications

The above review suggests that Parliament’s advantage over courts in responding to complex and rapidly changing search technologies is more theoretical than real. Although Parliament should be able to respond quickly and coherently, it often fails to meet these objectives. It is notable, however, that there appear to have been few instances where public choice concerns have given rise to serious problems in the context of criminal law legislation governing digital technologies. Any proposal, then, needs to begin by recognizing that in the criminal law and digital privacy contexts, both courts and Parliament are slow in responding; both also make rules in incomplete information environments, but tend to make them in an even-handed manner.

Two other points must also affect any institutional strategy. First, Parliament has exclusive authority to pass new offences or update current offences. As such, it is Parliament’s sole prerogative to carefully tailor the definition of offences to keep up with digital technologies—a task that has proven to be quite challenging. Second, courts often serve a gap-filling role when developing and implementing rules governing complex and rapidly changing search technologies. The challenge is therefore twofold. First, how should Parliament tailor its non-offence related legislation knowing that it tends to react slowly and at times incoherently? Second, how can we best ensure that courts play their gap-filling role most effectively?

Any approach to governing digital privacy should begin by considering the literature on institutional choice. Professors Neil Komesar and Adrian Vermeule have each written on this topic.[253] They recognize that “comparing institutions requires identifying parallels across institutions in some acceptable, understandable, and usable fashion.”[254] To accomplish this end, Professor Komesar developed the “participation-centred approach.”[255] The model is a simple economic one wherein “[t]he character of institutional participation is determined by the interaction between the benefits of that participation and [its costs].”[256]

One of the major impediments to using courts was discussed above—namely, judicial ability to receive adequate information. Another barrier is litigation costs, how they are diffused, and whether they create incentives to litigate.[257] Professor Komesar uses pollution as his primary example to illustrate when these considerations might influence institutional approaches to rulemaking. If everyone faces small losses from pollution, no individual lawsuits will arise, and unless the amount of damages is large overall, there likely will not be a class action.[258] Moreover, preventing pollution is extremely complex. A similar logic could be applied to the digital privacy context. Given the ability of legislatures to thoroughly research an issue, legislatures are better suited to weigh the competing concerns. As long as there are no significant majoritarian or lobbying concerns, it is best to leave the issue to the legislature.

As Professor Vermeule observes, however, institutional choice is also determined by a country’s constitutional and institutional arrangements and cultures.[259] Beyond the fact that Professor Komesar is writing in the American setting, the examples he uses are not applicable to the narrower topic of this article for two reasons. First, the potential for exclusion of evidence in the criminal law context always provides an incentive to litigate vague or yet-to-be-determined police powers, even if the violation seems small.[260] Second, although public choice concerns have proven to be insignificant, Parliament has been at least as slow and confusing in passing legislation as courts have been in developing the common law. Although it is often assumed that legislatures will utilize their institutional advantages, the Canadian digital privacy and criminal law contexts provide an excellent example of Parliament being unable to take advantage of its institutional strengths.

It is therefore appropriate to be skeptical about the utility of relying on institutional competence arguments as the sole means for determining the appropriate role of each institution when governing digital privacy. As one critic of institutional choice theory observes, relying on broad generalizations of institutional competence paints “a stilted portrait of institutions” that “focuses too heavily on the current characteristics of institutions rather than on their potential for reform and change.”[261] In other words, the “inherent” strengths and weaknesses of courts and legislatures are subject to ebb and flow. This in turn affects each institution’s ability to respond effectively at different times. A better approach, then, would focus on how these institutions can work together to respond to the various challenges.[262] I suggest this approach can be applied to governing digital privacy.

To further this aim, I have elsewhere developed two institutional strategies to aid Canadian courts in developing digital privacy rules.[263] The first proposal concerns scenarios where Parliament—either intentionally or inadvertently—leaves it to the courts to develop a rule to govern a complex and rapidly advancing technology. In broad strokes, I suggest that when Parliament relies on courts to play such a role, it should send the relevant question as a reference to the Supreme Court or the provincial appellate courts.[264] The reference process not only allows appellate courts to develop rules with an ideal evidentiary record,[265] but also avoids lengthy trial and appeal delays.[266] In other words, utilizing the reference procedure allows courts to provide an informed and timely response to a digital privacy issue.

To help courts apply existing rules to digital technologies, I recommended tasking an independent institution with an investigative role.[267] By providing reports outlining timely and pertinent facts related to technologies expected to come before the courts, counsel would have reliable information upon which to argue their cases.[268] In turn, digital privacy rules would be much more likely to be applied in a principled manner.[269] Even if the ultimate ruling is of little precedential value due to technology outpacing the law, the decision will at least have been made with a robust evidentiary record and thus stand a much greater chance of being consistent with Charter principles.[270]

These reforms, however, do not address how Parliament should tailor its digital privacy legislation. As section 8 of the Charter requires that searches be authorized by law, Parliament must typically pass a law granting search powers to law enforcement.[271] Although Parliament may provide courts with broad legislation like the general warrant provision (section 487.01) or computer search provisions (subsections 487(2.1) and (2.2)), ex post judicial development of such rules is not an optimal procedure because it fails to communicate the rule before a technology is in widespread use. Legislative rules are thus preferable to the extent that they can provide clear and lasting guidance to law enforcement officers before searches of a technology become common.

In deciding how a law affecting digital privacy should be drafted, Parliament should therefore consider the relative costs of specific and general rules. As discussed earlier, when Parliament passes detailed legislation with respect to complex and rapidly advancing technologies, those laws tend to become outdated or have gaps which either needlessly undermine privacy or unduly hamper police investigations. Where the technology is stable, however, legislative rulemaking can better respond to both law enforcement and privacy interests. This follows because stable technologies can be studied in depth and rules can be crafted without concern that the law will soon become outdated. By contrast, delays inherent in the adversarial process mean that judicial rules governing stable technologies would remain unsettled for unnecessarily long periods.[272]

Where Parliament is unsure about the development of a technology, however, legislative rules are vulnerable to becoming quickly outdated. To address this concern, Parliament should approach drafting its legislation in one of two ways. First, it could draft digital privacy laws broadly and allow courts to update the law on a case-by-case basis. If my above recommendations allow courts to receive adequate information about digital technologies, courts will be well equipped to develop principled digital privacy rules. Although this approach would likely result in many rules lagging behind technological development, this is already a prominent feature of legislative and judicial rules in the digital privacy and criminal law settings.

Second, if Parliament is confident in its understanding of a complex and rapidly advancing technology and its ability to pass a rule expediently, it could consider passing rules with built-in sunset clauses.[273] By ensuring that a rule is no longer applicable after a designated period, Parliament can control, to some extent at least, whether its legislation will be overtaken by technological advancement. Moreover, sunset clauses can be designed to ensure that the law comes before a special committee tasked with reporting to Parliament before the law expires.[274] Parliament can then take the opportunity to consider any potential gaps in its legislation and respond accordingly.

This more dynamic approach to governing digital privacy requires that courts and legislatures be flexible in determining the process for making a rule. There are multiple options for crafting principled rules and some processes may prove more or less feasible at different times due to restrictions in the judicial and political processes. The ideal approach would allow Parliament to craft and expediently revisit digital privacy rules in a way that allows for judicial review of its legislation. Recognizing that this is unlikely to occur frequently, Parliament must be attuned to its institutional weaknesses, and focus on strengthening the judicial process to allow the courts to address the inevitable gaps that its legislation will leave. The above recommendations, I suggest, would go a long way in achieving these goals.

Several objections to this proposal may be anticipated. First, it may be argued that stare decisis will prevent courts from responding flexibly to technological change.[275] It should be remembered, however, that developing digital privacy rules in the criminal procedure context implicates section 8 of the Charter. As the Court recently concluded, significant factual changes underlying Charter decisions make it permissible for lower courts to reconsider even the Supreme Court’s rulings.[276] Although the Court has cautioned against liberal use of this exception,[277] it is not difficult to imagine changes in technology “fundamentally shifting” the applicable privacy and security interests central to determining whether a search or seizure is unreasonable. As such, stare decisis should not prove as restrictive as it may be in other contexts or countries.[278]

Second, any suggestion that Parliament should play a lesser role in developing police search powers is constitutionally questionable. As Professor James Stribopoulos observes, the principle of legality requires that police powers derive from Parliament, not from the courts.[279] The legality principle does not, however, inhibit Parliament from passing broad legislation to facilitate judicial development of digital privacy rules. First, it is notable that the Court has, for better or for worse, all but abandoned the legality principle by creating a variety of police powers under the common law.[280] Second, although searches must, at minimum, be authorized by law,[281] the courts have not imposed a high threshold for meeting this requirement. For instance, the broadest provision discussed above—the general warrant found in section 487.01—has survived constitutional scrutiny on this ground.[282] As such, there does not appear to be a constitutional impediment to my proposal.

Finally, it may be argued that it is undemocratic to vest significant digital privacy rule-making duties with courts. This argument may be countered in two ways. First, it is notable that those advocating for legislative primacy in the fields of digital privacy and criminal law do not present any cogent arguments to address the significant limitations of legislative rule-making.[283] Political science scholars observe that politicians tend to address issues only when they arise on the public agenda.[284] Whether a legal gap will be addressed in turn depends on what other issues of the day are demanding political attention.[285] Moreover, the fact that Canadian federal governments are often in minority positions makes passing legislation with any controversy increasingly difficult.[286] Add to this the study required to pass legislation and the requirement that bills pass through both the House of Commons and the Senate, and the temporal and practical barriers often become insurmountable for both minority and majority legislatures.[287] Parliament should acknowledge these limitations and explore institutional options to address them. This does not strike me as undemocratic: it exemplifies responsible governance.

Second, my proposal need not stifle Parliament from passing digital privacy laws or prevent Parliament from responding to digital privacy rulings. Instead, I suggest that Parliament should consider its institutional limitations before passing digital privacy legislation. This still allows for important dialogue on the content of rights to occur.[288] As Professor Peter Hogg and Allison Bushell observe, the democratic legitimacy of judicial review is bolstered because the structure of the Charter often results in judicial review of legislation leaving room for a legislative response.[289] That response is typically able to achieve the legislature’s objective while at the same time respecting constitutional rights.[290] In this way, then, constitutional dialogue provides an important mechanism for determining “how society should struggle together for the best answers to controversies about justice.”[291]

As should be evident from Part II, dialogue in the digital privacy context has been lacklustre.[292] This should not be surprising. Courts and legislatures are having difficulty determining the basic facts upon which to create rules governing digital technologies. They are also having difficulty keeping pace with the rapid development of digital technologies. Dialogue is meaningless if there is no basic understanding of what facts underlie the dialogue or if the dialogue is rendered moot because a rule becomes outdated due to its failure to keep pace with use of a particular technology. By reforming how courts receive information about digital technologies, courts will become equipped to participate in this dialogue.

Parliament’s “tone” in this dialogue should, however, be altered to reflect the changing circumstances within which this conversation takes place. A revitalized dialogue in the digital privacy context requires that Parliament pay attention to judicial and legislative weaknesses in rulemaking. In practice, this will often require Parliament to speak more cautiously, using tools such as sunset clauses to ensure its legislation does not unduly hinder law enforcement or needlessly undermine digital privacy. This modified approach to passing digital privacy laws, I suggest, provides a democratically responsible way of ensuring that Canadian institutions tasked with governing digital privacy are capable of balancing the important law enforcement and digital privacy interests at the heart of section 8 of the Charter.


American scholars have entertained a lively debate about the relative institutional capacities of legislatures and courts to govern privacy interests in light of rapidly evolving and complex search technologies. Although the Canadian judiciary has encountered problems similar to those of its American counterparts, a comprehensive study had not been undertaken to assess the potential advantages of having Canadian legislatures govern digital technologies. This article fills the void with respect to the institutional capacities of Parliament to govern digital privacy in the criminal law context. After reviewing several decades of its legislation, I conclude that there is little reason to believe that Parliament is quicker or more coherent in its responses to digital technologies than courts. Unlike Congress, however, Parliament raises only minor concerns about susceptibility to majoritarian or lobbyist influence. This may be the result of the more stable political climate in Canada, or, as Professor Kerr contends, because the populace is more likely to defend its digital privacy interests given their general importance to the polity.[293]

The research findings in this article directly inform my proposed institutional approach for governing digital privacy in the Canadian criminal law context. As courts and Parliament have similar weaknesses, it is not sensible to rely on institutional process arguments to exclude one institution from governing digital privacy. Instead, the focus should be on how to help courts and legislatures work together to ensure the best digital privacy rules are implemented. This requires thinking creatively about how to address institutional weaknesses. In addition to ensuring courts are institutionally equipped to respond to digital privacy concerns, Parliament should be vigilant about weighing the costs and benefits of responding to novel and complex technologies with legislation. When a technology is advancing quickly, Parliament can either pass broad laws that allow judges to fill in legislative gaps or proceed cautiously, using tools such as sunset clauses to ensure its legislation is not vulnerable to falling out of date. Although this approach may cede significant rule-making authority to courts, concerns about democratic legitimacy are mitigated if Parliament approaches digital privacy rule-making with a realistic assessment of its capacity to meet the challenges of governing privacy in the digital age.