Hong Kong’s New Security Law: The Battle Between Online Freedom and Chinese Censorship in the Name of ‘National Security’

Written by: Bryce Hoyt

Historical and Legal Overview

On June 30, China passed a controversial national security law for Hong Kong that aims to give mainland China greater power over the semi-autonomous city. The authority to create such a law is found within Hong Kong’s mini-constitution, the Basic Law, which gives China backdoor control over the quasi-independent city.[1] The law follows failed attempts in 2003 and 2019 to enact national security legislation in Hong Kong, each abandoned after immense pushback from citizens.[2] China argues the law is necessary to uphold the sovereignty of the mainland, but many Hong Kong citizens worry that their independent freedoms are quickly being taken away, diminishing the “one country, two systems” model that China promised before it regained control of the city from the United Kingdom in 1997.[3]

The law, which was not announced to the public until after it was passed, makes it a crime to support Hong Kong independence and potentially makes vandalizing public property or government premises a terrorist activity.[4] It also allows mainland Chinese authorities to operate in Hong Kong for the first time and supersedes any conflicting local Hong Kong law.[5] The law applies to “any person” in Hong Kong, and most of the crimes it creates are defined very broadly. Among them are four new offences: secession (breaking away from the country), subversion (undermining the power or authority of the central government), terrorism (using violence or intimidation against people), and collusion with foreign powers.[6]

If a foreign national violates the law overseas, they may still be charged upon returning to the city.[7] Beijing plans to establish a national security office in Hong Kong staffed by mainland officials who will oversee enforcement of the law. Hong Kong courts will generally hear national security cases, but Beijing has the power to take over a case, and its decisions cannot be legally challenged.[8] If a case involves “state secrets or public order,” it will face a closed-door trial with no jury.[9] Any person found in violation of the law may be subject to a maximum sentence of life imprisonment.[10] Hong Kong is also mandated to carry out education on national security, including new curriculum and social organizations, with the purpose of informing the public of the importance of national security.[11]

How does this impact Privacy?

Under Article 9, the new law allows police to employ covert online surveillance and to wiretap those suspected of any of its crimes.[12] Because the crimes are so broadly defined and carry such severe consequences, many Hong Kong residents fear that information posted on social media sites (many of which are completely banned in mainland China) referencing Hong Kong independence or criticizing the government may be used as evidence of subversion or secession.[13] This has led many people to begin scrubbing their social media accounts and deleting their online presence, much of which contains records of political debate and criticism of mainland China. China has made clear that the law will be strictly and routinely enforced, having already arrested multiple people who protested its enactment.[14]

The Tech Standoff

Many international tech companies, along with most Hong Kong residents, view this as a battle over free speech and political and economic independence, and fear that China’s online censorship is now being draped over Hong Kong.[15] Shortly after the law was announced, tech companies including Facebook, Google, Twitter, Zoom, and LinkedIn stated that they would temporarily stop complying with requests for user data from the Hong Kong authorities, in violation of Article 43 of the new security law.[16] Hong Kong has responded by threatening company employees with jail time for noncompliance.[17] In an effort to control the 2019 protests, Hong Kong had asked Google to take down online posts expressing support for independence, along with leaked police information.[18] At the time, Google said no; under the new law, however, declining such requests could expose Google to fines, equipment seizures, and arrests.[19]

TikTok, which is owned by the Chinese internet company ByteDance but managed mostly outside of China, announced that it would withdraw from app stores in Hong Kong and make the app inoperable to users in the city within days.[20] According to an official from the Internet Society of Hong Kong, a non-profit dedicated to the open development of the internet within the city, there may be technical measures companies can take to guard against the law.[21] The law provides that a request for data may be refused if the technology required to comply is “not reasonably available,” which means companies may be able to add layers of encryption to the data or store content in multiple locations to make compliance overly burdensome.[22]
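As a rough illustration of why “not reasonably available” technology could matter, consider data encrypted with a key the platform never holds. The snippet below is a toy sketch (a one-time pad standing in for real authenticated encryption), with invented data, and is not a description of any company’s actual system.

```python
# Toy illustration (not real-world cryptography): if user content is
# encrypted with a key only the user holds, the platform can store and
# serve the data yet cannot produce readable content in response to a
# government data request.

import secrets


def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR one-time pad; decryption is the same operation with the same key."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))


message = b"post criticizing the law"
key = secrets.token_bytes(len(message))  # held only by the user
stored_blob = encrypt(message, key)      # all the platform ever stores

# Without the key, the stored blob is indistinguishable from noise;
# with the key, the user recovers the original message.
assert encrypt(stored_blob, key) == message
```

The same logic underlies the article’s point about splitting content across jurisdictions: if no single actor subject to the law can reassemble the plaintext, compliance may be technically impracticable.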

Many other small businesses have either shut down or made plans to move out of Hong Kong, but for tech giants like Amazon and Google, which operate large data centers in Hong Kong, that is likely not a practical option.[23] As the battle for online freedom continues, the chilling effects on privacy and personal autonomy cannot be overstated.


[1] Hong Kong: What is the Basic Law and how does it work?, BBC (Nov. 20, 2019), https://www.bbc.com/news/world-asia-china-49633862.

[2] See Jessie Yeung, China has passed a controversial national security law in Hong Kong. Here’s what you need to know, CNN (July 1, 2020), https://www.cnn.com/2020/06/25/asia/hong-kong-national-security-law-explainer-intl-hnk-scli/index.html.

[3] Id.

[4] See Grace Tsoi, Lam Cho Wai, Hong Kong security law: What is it and is it worrying?, BBC (June 30, 2020), https://www.bbc.com/news/world-asia-china-52765838.

[5] Yeung, supra note 2.

[6] Tsoi, supra note 4.

[7] Yeung, supra note 2.

[8] Id.

[9] Id.

[10] Id.

[11] Id.

[12] See Rita Liao, The tech industry comes to grips with Hong Kong’s national security law, TechCrunch (July 8, 2020), https://techcrunch.com/2020/07/08/hong-kong-national-security-law-impact-on-tech/; see also HK National Security Law - Bilingual, China Law Translate, https://www.chinalawtranslate.com/bilingual-hong-kong-national-security-law/.

[13] See Chris Buckley, Keith Bradsher, Tiffany May, New Security Law Gives China Sweeping Powers Over Hong Kong, The New York Times (June 29, 2020), https://www.nytimes.com/2020/06/29/world/asia/china-hong-kong-security-law-rules.html. See also List of websites blocked in mainland China, Wikipedia, https://en.wikipedia.org/wiki/List_of_websites_blocked_in_mainland_China.

[14] Austin Ramzy, Elaine Yu, Tiffany May, Hong Kong Is Keeping Pro-Democracy Candidates Out of Its Election, The New York Times (July 29, 2020), https://www.nytimes.com/2020/07/29/world/asia/hong-kong-arrests-security-law.html.

[15] Paul Mozur, In Hong Kong, a Proxy Battle Over Internet Freedom Begins, The New York Times (July 7, 2020), https://www.nytimes.com/2020/07/07/business/hong-kong-security-law-tech.html.

[16] Id.

[17] Id.

[18] Id.

[19] Id.

[20] Paul Mozur, TikTok to Withdraw From Hong Kong as Tech Giants Halt Data Requests, The New York Times (July 6, 2020), https://www.nytimes.com/2020/07/06/technology/tiktok-google-facebook-twitter-hong-kong.html.

[21] Mozur, supra note 15.

[22] Id.

[23] Id.

The New Normal: Mass Temperature Screening and the Law

Written by: Michael Walsh

Disclaimer: This post does not contain legal advice. I am not a licensed attorney nor am I qualified to give compliance help or other legal services. This post is for educational purposes only.

Due to a resurgence of the COVID-19 pandemic in many states, federal and state health agencies have deployed several technologies to help track (and ultimately quell) the spread of the virus. Temperature scans and other screening technologies have become commonplace, and nonconsensual mass temperature screening has been used to mitigate the spread of major pandemics in the past.[1] The Food and Drug Administration (FDA) has issued comprehensive but nonbinding guidance on the use of thermal imaging technologies for COVID diagnostics, which advocates their use as an initial screening tool in “high throughput areas” such as airports, businesses, and other high-density areas where traditional temperature measuring techniques would be ineffective or impracticable.[2]

Technical Limitations and Efficacy of Temperature Screening

Some scientific studies support the use of telethermographic devices or non-contact infrared thermometers (NCITs) to accurately measure skin temperature (which correlates with core temperature).[3] NCITs are thermal imaging systems that measure the infrared radiation emitted by febrile humans (humans with a detectable fever) and convert that radiation map into a relative temperature measurement.[4] However, the FDA emphasizes that such technologies are not suitable as a sole means of diagnosing COVID-19.[5]

NCITs can be effective at sensing relative temperatures but have palpable limitations that can affect the technology’s efficacy. The American Civil Liberties Union (ACLU), citing a clinical study of NCITs, asserts that mass screening of open rooms can lead to wildly inaccurate temperature measurements.[6] The FDA recommends that temperature scans be made in highly controlled environments: rooms between 68 and 76 degrees Fahrenheit with no draft, radiant heat, (filament) light interference, or reflective backgrounds.[7] Because the technology senses relative infrared radiation, most systems also require a controlled temperature reference (called a blackbody) against which to compare the radiation density of the individual and the ambient environment. A rough analogy: judging whether one’s teeth are truly white (heat saturated) by comparing their color (heat radiation) against a white tissue (the blackbody). The relative differences between the thermal maps of the blackbody and the scanned individual can be used to estimate skin temperature with relatively high confidence (this study found skin temperature variations of ±10 degrees Fahrenheit within a 95% confidence interval),[8] with measured temperatures generally accurate within 2-3 degrees Fahrenheit.[9]
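The blackbody-calibration idea above can be sketched in a few lines. This is a simplified illustration only, with invented readings; real telethermographic systems apply full radiometric models rather than a single additive offset.

```python
# Simplified sketch of blackbody calibration for a thermal camera:
# the camera's raw reading of a reference of known temperature yields
# a correction that is applied to every other reading in the scene.

def calibrate_offset(raw_blackbody_reading: float, blackbody_temp_f: float) -> float:
    """Offset between the camera's raw reading of the blackbody and
    the blackbody's true, controlled temperature (in Fahrenheit)."""
    return blackbody_temp_f - raw_blackbody_reading


def estimate_skin_temp(raw_subject_reading: float, offset: float) -> float:
    """Apply the blackbody-derived offset to a subject's raw reading."""
    return raw_subject_reading + offset


# Example: the camera reads a 95.0 °F blackbody as 93.5 °F, so all
# readings are corrected upward by 1.5 °F.
offset = calibrate_offset(93.5, 95.0)
print(round(estimate_skin_temp(96.2, offset), 1))  # prints 97.7
```

This also makes the FDA’s environmental requirements intuitive: drafts, radiant heat, or reflective backgrounds perturb both the blackbody and the subject unevenly, so a single offset no longer corrects the scene.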

Additionally, FLIR, one of the most prominent thermographic device manufacturers, concedes that the technology has technical limitations and is not suitable as the sole diagnostic tool for identifying individuals with COVID.[10]

Regardless, the aforementioned CDC study found that although thermal imaging systems are highly dependent on controlled environments, infrared tech can reliably detect “elevated skin temperatures” and is significantly more accurate at determining fever than self-reported questionnaires (in that study, only one tenth of those who reported a fever were actually febrile).[11] Overall, the technology, once calibrated and controlled, can determine core temperatures with accuracy similar to more traditional oral temperature measurements.[12]

Legal Implications of NCITs

NCITs are governed exclusively by the FDA under section 201(h) of the FD&C Act, 21 U.S.C. § 321(h), which defines medical devices.[13] Generally, medical devices are those intended for use in the diagnosis of disease or other conditions, or in the “cure, mitigation, treatment, or prevention of disease.”[14] Thermal devices that are not intended for such a purpose fall outside the FDA’s regulatory scope, meaning the Food, Drug, and Cosmetic Act (FD&C) does not apply to businesses or individuals using nonmedical thermal devices. Of course, the definition of a medical device under section 201(h) depends on the intent of the user, so thermal imaging systems originally unintended for COVID screening should still comply with the FD&C and other relevant FDA guidance once repurposed for it.[15] The FDA nonetheless promotes thermal imaging as a preliminary tool for COVID screening, and states that, because the COVID-19 pandemic is a declared public health emergency, businesses likely need not comply with many medical device regulations so long as such use does not “create undue risk.”[16]

Privacy Concerns

HIPAA

HIPAA, the flagship federal legislation protecting medical health information, is rendered obsolete in the age of contact tracing. HIPAA applies primarily to health plans, clearinghouses, and health care providers, which Google, Apple, PwC, PopID, and Clear (contact tracing powerhouses) are not.[17]

Searches

It is also important to note that thermal imaging can qualify as a “search,” but constitutional protections against unreasonable searches and seizures apply only to government actors. There is, however, evidence that tech companies have shared location data with government agencies to help track the spread of COVID.[18] This data may be aggregated and anonymized, but combining relevant data sets can reidentify it, revealing private medical data traceable to specific individuals. By one estimate, 63% of individuals can be uniquely identified by a combination of gender, date of birth, and ZIP code alone.[19] By combining data sets containing “anonymized” or “aggregate” direct or indirect personal identifiers, many anonymous data sets can be reidentified, compromising the privacy of specific individuals.[20]
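The linkage attack described above can be shown in a short sketch. All names and records below are invented; the point is only that joining two data sets on shared quasi-identifiers (gender, date of birth, ZIP code) can restore the identities an “anonymized” data set was supposed to hide.

```python
# Hypothetical linkage re-identification: an "anonymized" health data set
# and a public record (e.g., a voter roll) are joined on quasi-identifiers.
# No single field names anyone, yet the combination does.

anonymized_health = [
    {"gender": "F", "dob": "1984-03-02", "zip": "94117", "result": "febrile"},
    {"gender": "M", "dob": "1990-11-19", "zip": "94110", "result": "normal"},
]

public_roll = [
    {"name": "Jane Doe", "gender": "F", "dob": "1984-03-02", "zip": "94117"},
    {"name": "John Roe", "gender": "M", "dob": "1990-11-19", "zip": "94110"},
]


def reidentify(health_rows, public_rows):
    """Join the two data sets on (gender, dob, zip), attaching names
    to the supposedly anonymous health records."""
    key = lambda r: (r["gender"], r["dob"], r["zip"])
    names = {key(p): p["name"] for p in public_rows}
    return {names[key(h)]: h["result"] for h in health_rows if key(h) in names}


print(reidentify(anonymized_health, public_roll))
# {'Jane Doe': 'febrile', 'John Roe': 'normal'}
```

This is why stripping direct identifiers alone is not enough: any aggregation or anonymization scheme must also account for quasi-identifiers present in other, publicly available data sets.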

ADA

The Americans with Disabilities Act (ADA) prohibits discrimination based on disability (under which COVID may qualify) and binds all private employers with fifteen or more employees.[21] However, the U.S. Equal Employment Opportunity Commission (EEOC) explicitly states that the ADA should not interfere with COVID-19 guidelines issued by the CDC.[22] Temperature and other COVID tests must be “job related and consistent with business necessity,” and employees may be furloughed or excluded if they have a “medical condition” that would pose a direct threat to health or safety (such as COVID-19).[23]

State Privacy Laws

Of course, there are some existing protections, such as the California Consumer Privacy Act (CCPA), Vermont’s data broker registration law, and Illinois’s biometrics law (BIPA). Each, however, contains a public health emergency, “direct relationship,” or other exception, meaning that most contact tracing companies are exempt from complying with these privacy laws until the laws are amended or COVID is no longer classified as a health emergency.[24]

Two companies, Clear and PopID, have already begun using biometric face scanning and thermal imaging technologies to monitor COVID-19 in businesses and other public places.[25] Some restaurants are implementing these screening procedures in response to the White House guidelines, which require businesses to “monitor workforce[s] for indicative symptoms.”[26]

Pending Federal Legislation

Amid concerns over private health information gathered from COVID screening, senators have introduced a COVID-19 privacy bill that would: (1) require express consent to collect, process, or transfer “personal health, geolocation, or proximity information”; (2) require disclosure of to whom that data will be transferred and by whom it will be retained; (3) give individuals the opportunity to opt out of having their health information stored or compiled; and (4) give individuals the right to delete or deidentify all personal information that is no longer being used.[27] However, the bill has been criticized for preempting stricter state laws (including the CCPA) and for not providing a private right of action.[28] Another bill, the Public Health Emergency Privacy Act (PHEPA), is sufficiently broad in its definitions of medical health data, contains clauses prohibiting discrimination against those who opt out of COVID tracing programs, and does not undermine existing state data privacy laws through preemption.[29]

The novel coronavirus is just that, novel. Government health agencies and businesses are scrambling to adapt to the constantly changing circumstances. Due to resurgences in cases, the global pandemic has appropriately been categorized as a national health crisis. There is evidence that contact tracing, health screening, and mass temperature scanning can help mitigate the spread of the virus, or at the very least, allow researchers to learn more about the virus. The remaining question is what we are willing to give up in the process. Will government agencies forfeit the private health data that was shared with them once the virus subsides? If so, how will the government and cooperating tech companies protect individuals’ data privacy?


[1]Pejman Ghassemi et al., Best Practices For Standardized Performance Testing of Infrared Thermographs Intended For Fever Screening, PLoS ONE, 1710 (Sept. 19, 2018), https://doi.org/10.1371/journal.pone.0203302 [https://perma.cc/SUB3-8JKB].

[2]U.S. Food and Drug Administration, Enforcement Policy For Telethermographic Systems during the Coronavirus Disease 2019 (COVID-19) Public Health Emergency, Food And Drug Administration, 2 (April 2020), https://www.fda.gov/media/137079/download [https://perma.cc/SZ4J-RGXU].

[3]An Nguyen, et al., Comparison of 3 Infrared Thermal Detection Systems and Self-Report for Mass Fever Screening, Centers For Disease Control and Prevention, 1713-14 (Nov. 2010), https://www.cdc.gov/eid/article/16/11/10-0703 [https://perma.cc/5CXG-TAZS].

[4] U.S. Food And Drug Administration, Thermal Imaging Systems (Infrared Thermographic Systems/ Thermal Imaging Cameras), Food and Drug Administration (May 13, 2020), https://www.fda.gov/medical-devices/general-hospital-devices-and-supplies/thermal-imaging-systems-infrared-thermographic-systems-thermal-imaging-cameras [https://perma.cc/89CQ-WR8N].

[5]U.S. Food and Drug Administration, supra note 2, at 3.

[6] Jay Stanley, Temperature Screening and Civil Liberties During an Epidemic, American Civil Liberties Union, 1-4 (May 19, 2020), https://www.aclu.org/aclu-white-paper-temperature-screening-and-civil-liberties-during-epidemic [https://perma.cc/8ZVH-AUHP].

[7]U.S. Food and Drug Administration, supra note 4.

[8] Nguyen, supra note 3, at 1713.

[9] Id.

[10]Frequently Asked Questions: Thermal Imaging for Elevated Skin Temperature Screening, FLIR (May 13, 2020), https://www.flir.com/discover/public-safety/faq-about-thermal-imaging-for-elevated-body-temperature-screening/ [https://perma.cc/J2AG-X9MM].

[11] Nguyen, supra note 3, at 1713-15.

[12] Id. at 1713.

[13] U.S. Food and Drug Administration, supra note 2, at 3.

[14] Id.

[15] Id. at 4.

[16] Id.; pt. 510(k) of the FD&C Act (21 U.S.C. § 360(k)) (requiring device certification and quality testing before the introduction of the device into interstate commerce); 21 C.F.R. pt. 807.81 (requiring device manufacturers to submit a premarket approval request to the FDA before commercial distribution of the device); 21 C.F.R. pt. 806 (governing the scope and definitions of manufacturer liability for medical devices that have been removed or corrected from current marketed equivalents); 21 C.F.R. pt. 80 (governing medical device registration); 21 C.F.R. pt.  820 (governing device quality control and system requirements); 21 C.F.R. pt. 830 (requiring unique identifiers for medical devices); 21 CFR pt. 801.20 (governing labeling requirements for medical devices).

[17]U.S. Department of Health and Human Services, HIPAA for Professionals (April 2015), https://www.hhs.gov/hipaa/for-professionals/privacy/index.html [https://perma.cc/RE33-8JTK];  Adam Schwartz, Two Federal COVID-19 Privacy Bills: A Good start and a Misstep, Electronic Frontier Foundation (May 28, 2020), https://www.eff.org/deeplinks/2020/05/two-federal-covid-19-privacy-bills-good-start-and-misstep [https://perma.cc/TFW6-LWBR].

[18] Garret Stone, Constitution in Crisis: The Fourth Amendment and Combating COVID-19, Wake Forest J. of L. and Pol’y (April 20, 2020), https://wfulawpolicyjournal.com/2020/04/20/constitution-in-crisis-the-fourth-amendment-and-combating-covid-19/ [https://perma.cc/98TB-F4TM].

[19]Boris Lubarsky, Re-identification of “Anonymized” Data, 1 Geo. L. Tech Rev. 202 (2017), https://georgetownlawtechreview.org/wp-content/uploads/2017/04/Lubarsky-1-GEO.-L.-TECH.-REV.-202.pdf [https://perma.cc/AU9G-E4FA].

[20] Id.

[21] U.S. Equal Employment Opportunity Commission, What You Should Know About COVID-19 and the ADA, the Rehabilitation Act, and Other EEO Laws (June 17, 2020), https://www.eeoc.gov/wysk/what-you-should-know-about-covid-19-and-ada-rehabilitation-act-and-other-eeo-laws [https://perma.cc/F2TN-LAY5].

[22]Id.

[23]Id.

[24] Adam Schwartz, Vermont’s New Data Privacy Law, Electronic Frontier Foundation (Sept. 27, 2018), https://www.eff.org/deeplinks/2018/09/vermonts-new-data-privacy-law [https://perma.cc/MH8P-QE4B]; Daniel Gottlieb, California Bill Proposes CCPA Exceptions for HIPAA De-Identified Information, McDermott Will & Emery (Jan. 17, 2020), https://www.mwe.com/de/insights/california-bill-proposes-ccpa-exceptions-for-hipaa-de-identified-information-other-health-data/ [https://perma.cc/BR9P-GJ2Z]; 740 ILCS 14, Illinois General Assembly, https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57 [https://perma.cc/8TBQ-CDUP].

[25]Natasha Singer, Employers Rush to Adopt Virus Screening. The Tools May Not Help Much., New York Times, (May, 11, 2020), https://www.nytimes.com/2020/05/11/technology/coronavirus-worker-testing-privacy.html [https://perma.cc/7AMC-QANC].

[26] The White House, Opening Up America Again, https://www.whitehouse.gov/openingamerica/ [https://perma.cc/MKT6-Z9MG].

[27] John Thune, Thune, Wicker, Moran, Blackburn Announce Plans to Introduce Data Privacy Bill, U.S. Senator for South Dakota (April 30, 2020), https://www.thune.senate.gov/public/index.cfm/press-releases?ID=37E557F5-566E-4872-A66D-EBBFEC1D190A [https://perma.cc/F6RM-LCZN].

[28] Schwartz, supra note 17.

[29]Id.

Quiltwork: Existing State Privacy Legislation and Federal Intervention

Written by: Michael Walsh

Between 2018 and June 2020, forty state privacy bills were proposed (up from twenty-seven as of February 2020).[1] Of those, only fourteen died in committee or were postponed.[2] Nevertheless, the introduction of these bills indicates that states are becoming increasingly concerned with consumer privacy protection, and dead bills can still be reintroduced and enacted. Six of the bills were instead replaced with dedicated task forces to monitor and enforce consumer privacy concerns. Excluding the bills that died in committee or were replaced with task forces, twenty bills remain under consideration for passage.

Consumer Rights

The majority of these bills focus on consumers’ rights, including many of the following fundamental provisions: (1) Right of Access; (2) Right of Deletion; (3) Right to opt out; (4) Private Right of Action; (5) Right to Fair Notice; and (6) Right to Nondiscriminatory Access.

  • Right of Access (15 of 20 bills include this provision)

The consumer may submit a request to a business or data collector (a “data controller”) to receive a file noting the categories or “specific pieces” of personal data the controller has collected from that consumer. A consumer should be able to submit such a request through more than one means (written or electronic), and requests should be fulfilled in a timely manner and returned to the consumer in a common file format.

  • Right of Deletion (14 of 20 bills include this provision)

The consumer may submit a request to a data controller to delete any or all personal data that the data controller has collected from said consumer.

  • Right to Opt Out (17 of 20 bills include this provision)

The consumer may affirmatively opt out of the sale of his or her personal information to third parties.

  • Private Right of Action (9 of 20 bills include this provision)

The consumer may seek civil damages from the data controller for violations of a consumer data privacy statute.

  • Right to Fair Notice (15 of 20 bills include this provision)

A data controller shall provide to the consumer reasonable notice of the collection of said consumer’s personal information.

  • Right to Nondiscriminatory Access (13 of 20 bills include this provision)

A consumer shall not be discriminated against or have impaired access to services merely for exercising his or her privacy rights under a consumer data privacy statute.

Is a Quilt Better than a Blanket?

The current privacy landscape in the U.S. can be described as a patchwork: about half of the states have introduced some type of consumer privacy law.[3] So, should we just enact federal privacy legislation? The Electronic Frontier Foundation (EFF), a digital rights advocacy group, urges caution. Tech superpowers including Facebook and Google (through the “Internet Association”) have been lobbying for a federal privacy law, but the EFF contends that such a law would undermine stricter state laws through preemption (an issue that careful legislative drafting might resolve).[4]

Conversely, some business advocates contend that universal federal privacy legislation resembling the General Data Protection Regulation (GDPR) would be needlessly costly.[5] Gartner estimates that fulfilling a consumer request under current and future privacy legislation will cost, on average, $1,406 and take about a week.[6] These compliance costs are doubtless significant but may be offset by returns from increased consumer trust.[7]

Regardless, Congress is considering two bills that resemble the GDPR (with the private right of action the most controversial provision): the Consumer Online Privacy Rights Act (COPRA) and the United States Consumer Data Privacy Act (USCDPA). COPRA defines personal information broadly, while the USCDPA is restricted to a narrower definition of “sensitive” personal information. COPRA allows a private right of action; the USCDPA does not. COPRA retains state authority over most areas of privacy protection (allowing states to enforce their own laws where more stringent than the federal equivalent), while the USCDPA preempts most areas of existing state data privacy law.[8] We will likely see one of these bills pass in 2020, albeit in modified form. Get to know them here: COPRA[9] and USCDPA.[10]


[1] Mitchell Noordyke, U.S. State Comprehensive Privacy Law Comparison, Iapp (June 2020), https://iapp.org/resources/article/state-comparison-table/ [https://perma.cc/V583-4LMB].

[2] Id.

[3] Id.

[4] Bennett Cyphers, Big Tech’s Disingenuous Push For a Federal Privacy Law, Electronic Frontier Foundation (Sept. 18, 2019), https://www.eff.org/deeplinks/2019/09/big-techs-disingenuous-push-federal-privacy-law [https://perma.cc/2XLJ-MEF8]; Michael Beckerman, Americans Will Pay a Price for State Privacy Laws, New York Times (Oct. 14, 2019), https://www.nytimes.com/2019/10/14/opinion/state-privacy-laws.html [https://perma.cc/8KEX-VPPA].

[5] Alan McQuinn and Daniel Castro, The Costs of an Unnecessarily Stringent Federal Data Privacy Law, Information Technology and Innovation Foundation (Aug. 5, 2019), https://itif.org/publications/2019/08/05/costs-unnecessarily-stringent-federal-data-privacy-law [https://perma.cc/P5XL-H66C].

[6] Jordan Bryan, 4 Legal Tech Trends for 2020, Gartner (Feb. 6, 2020), https://www.gartner.com/smarterwithgartner/4-legal-tech-trends-for-2020/ [https://perma.cc/HN7R-SVKB].

[7] Nasdaq, Cisco 2020 Data Privacy Benchmark Study Confirms Positive Financial Benefits of Strong Corporate Data Privacy Practices (Jan. 27, 2020), https://www.nasdaq.com/press-release/cisco-2020-data-privacy-benchmark-study-confirms-positive-financial-benefits-of [https://perma.cc/3386-D6GL]; Brooke Auxier et al., Americans’ Attitudes and Experiences With Privacy Policies and Laws, Pew Research Center (Nov. 15, 2019), https://www.pewresearch.org/internet/2019/11/15/americans-attitudes-and-experiences-with-privacy-policies-and-laws/ [https://perma.cc/XN2K-DPJJ]; Emily Leach, Iapp (2016), https://iapp.org/media/pdf/resource_center/ROI_Whitepaper_FINAL.pdf [https://perma.cc/BV8R-P5BQ].

[8] Wendy Zhang, Comprehensive Federal Privacy Law Still Pending, National Law Review (Jan. 22, 2020), https://www.natlawreview.com/article/comprehensive-federal-privacy-law-still-pending [https://perma.cc/4U4H-8EPX]; Christian T. Fjeld, Christopher Harvie, Cynthia J. Larose, Congressional Privacy Action – Part 1: The Senate, National Law Review (Jan. 28, 2020), https://www.natlawreview.com/article/congressional-privacy-action-part-1-senate [https://perma.cc/VTK5-3YMC]; Angelique Carson, At Senate, consensus on federal law until you get to ‘private right of action’, Iapp (Dec. 5, 2019), https://iapp.org/news/a/at-senate-consensus-on-federal-law-until-you-get-to-that-private-right-of-action/ [https://perma.cc/938E-RKKU]; Charlie Warzel, Will Congress Actually Pass a Privacy Bill?, New York Times (Dec. 17, 2019), https://www.nytimes.com/2019/12/10/opinion/congress-privacy-bill.html [https://perma.cc/GY6Q-T9QG].

[9] Consumer Online Privacy Rights Act, 116th Cong. (2019), https://www.cantwell.senate.gov/imo/media/doc/COPRA%20Bill%20Text.pdf [https://perma.cc/EA85-5BQT].

[10] United States Consumer Privacy Act of 2019, 116th Cong. (2019), https://aboutblaw.com/NaZ [https://perma.cc/3J8X-MP3G].
