As previously suggested in Digital Contact Tracing: Hope or Hype?, digital contact tracing, if implemented successfully, can help to control the spread of COVID-19. But the government is not a tech company. Therefore, no state has undertaken an effort to independently develop a digital instrument to conduct contact tracing. Instead, the private sector has underpinned this effort.
A variety of institutions have undertaken efforts to build digital contact tracing applications, but the rollout of these applications has been slow. To date, only twenty-two states have employed some method of digital contact tracing, and some states have not even attempted it. Moreover, rolling out a digital contact tracing application can take months. Indeed, the largest of these efforts—a collaboration between Apple and Google that provides contact tracing to their smartphone users—took three months after its initial release to become widely available. And, even though it was the first digital contact tracing tool released, it is currently available in only ten states. This is not surprising, however, given that the legal landscape of digital contact tracing applications is complicated and riddled with new and evolving regulatory standards.
This article provides a roadmap of the major legal guideposts developers must navigate to roll out their digital contact tracing applications.
App-Design: Digital Contact Tracing Systems
Generally, digital contact tracing involves a three-pronged effort: (1) identify who has the virus; (2) identify with whom they have come in contact; and (3) notify those contacts to stay at home. However, no two applications are identical. Below is a list of several ways digital contact tracing applications can differ that affect their respective ethical and legal postures:
Voluntariness and Disclosure
First, digital contact tracing applications differ in voluntariness and disclosure. There are several features that can affect this:
- Source of the Data: Applications differ on where proximity data is obtained. Some applications collect proximity data from users who use their application, while others collect data from a variety of third-party sources, such as data-sharing partners and software development kits.
- Opt-in vs. Opt-Out: Applications differ on whether they require the user to “opt-in” (i.e. require users to affirmatively enable tracking before it begins) or “opt out” (i.e. begin tracking automatically and require users to affirmatively disable tracking to end it).
- Method of Reporting: There are three ways in which applications can identify who has been diagnosed with COVID. First, applications can rely on a voluntary system where they ask users to self-report. Alternatively, applications can obtain this information through healthcare providers or employers. Finally, the government might mandate self-reporting and allow applications access to that data.
- Disclosure: A closely related idea is disclosure. Applications vary in the degree of their disclosure, the extensiveness of their privacy policies, and whether users can opt-in or opt-out of the application’s data-sharing network.
So far, there are two ways in which digital contact tracing applications track COVID exposure:
- Bluetooth Tracking: Bluetooth technology tracks the least user information. It monitors and logs nearby Bluetooth broadcast signals to identify which phones came into contact with one another. And, once a smartphone user reports a positive COVID diagnosis, the application anonymously alerts all phones from this log that they were in proximity to someone with COVID.
- GPS Tracking: GPS tracking collects location data and is typically used in conjunction with Bluetooth technology. This system also monitors and logs smartphone users’ contacts and anonymously alerts users of possible exposure. But, in addition to monitoring contacts, the GPS information allows health organizations and officials to map the spread.
While both methods serve the same function—to notify other phones that were in proximity to the COVID-19 diagnosed person within a specified period of time—they also suffer from the same limitation. Both technologies can overreport exposure because they are unable to distinguish genuine contact from situations where people are merely separated by a wall or driving near each other on the highway.
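To make the Bluetooth approach concrete, the following is a minimal, purely illustrative Python sketch of the log-and-cross-reference mechanism described above. The class and identifier names are hypothetical; real systems (such as the Apple/Google framework) use rotating cryptographic identifiers, signal-strength thresholds, and on-device matching:

```python
import time

class ContactLog:
    """Illustrative sketch of a Bluetooth contact-tracing log.

    Each phone logs the broadcast identifiers it hears nearby; when a
    diagnosis is reported, the phone cross-references that report
    against its own log and alerts the user on a match.
    """

    def __init__(self):
        # (identifier, timestamp) pairs heard over Bluetooth
        self.seen = []

    def record_broadcast(self, identifier):
        """Log a nearby phone's broadcast identifier."""
        self.seen.append((identifier, time.time()))

    def check_exposure(self, reported_ids):
        """Return True if any logged contact reported a diagnosis."""
        return any(ident in reported_ids for ident, _ in self.seen)

# Usage: phone A hears phone B's broadcast; B later reports positive.
log = ContactLog()
log.record_broadcast("b-phone-id")
print(log.check_exposure({"b-phone-id"}))  # True -> alert the user
```

Note that nothing in this sketch captures distance or duration, which is precisely why such systems can overreport exposure through walls or between passing cars.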
Data Storage and Access
Digital contact tracing applications also differ in how they store data and who can access it. There are two primary ways in which these applications approach this:
- Centralized Storage: A centralized approach stores proximity data in servers controlled by the application developers, companies, or governments. This approach informs public health officials about social-distancing behavior to create risk-scoring systems, but also allows them to identify and sanction individuals for violating social-distancing orders.
- Decentralized Storage: Conversely, a decentralized approach stores proximity data across peer-to-peer networks without oversight from an organization – for example, storage on each individual user’s device.
Additionally, applications vary in the type of data they collect and store to record proximity. Currently, there are two ways that applications do this:
- Identifiable Information: Types of identifiable information for contact tracing include federally-issued identification information, such as social security numbers, and phone numbers. This type of information is considered less anonymous than contact event numbers.
- Contact Event Numbers: Essentially, contact event numbers are randomly-generated numbers that obscure identifiable information using cryptography. In the event someone reports a positive COVID diagnosis, the application sends the contact event number of the diagnosed person to a server. The server then distributes the number to every phone in the system, and each phone cross-references the number with the contact event numbers stored in its log and alerts the user if necessary.
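The contact event number flow described above can be sketched in a few lines of Python. The function names and the use of `secrets.token_hex` are illustrative assumptions; deployed systems derive their rotating identifiers from device keys using purpose-built cryptography:

```python
import secrets

def new_contact_event_number():
    """Generate a random, non-identifying contact event number."""
    return secrets.token_hex(16)

def phones_to_alert(diagnosed_numbers, local_logs):
    """Cross-reference the server's list against each phone's log.

    local_logs maps a (hypothetical) phone name to the set of contact
    event numbers that phone has heard; returns the phones whose logs
    overlap the diagnosed person's numbers and should be alerted.
    """
    return [phone for phone, log in local_logs.items()
            if log & set(diagnosed_numbers)]

# Usage: "carol" was near the diagnosed person; "dave" and "erin" were not.
alice = new_contact_event_number()  # diagnosed person's number
bob = new_contact_event_number()    # an unrelated contact's number
logs = {"carol": {alice}, "dave": {bob}, "erin": set()}
print(phones_to_alert([alice], logs))  # ['carol']
```

Because the server distributes only random numbers, no phone in the system learns who the diagnosed person is, only that one of its logged contacts reported a diagnosis.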
Finally, some, but not all, digital contact tracing applications include sunset clauses in their user agreements.
- Sunset clauses: Though they vary in their provisions, sunset clauses generally provide a safeguard against long-term data uses by requiring the application developer to take some action after a predetermined period of time. To demonstrate, a digital contact tracing sunset clause might require the developer to sever data-sharing with third-parties or government authorities, automatically uninstall the application, or automatically destroy all personal data, after the pandemic has ended.
App-Development: United States Privacy Laws
First, digital contact tracing application developers must ensure their application comports with all applicable laws. But, because the applications share user information, compliance with privacy laws should be a primary concern. However, the United States does not have a unified approach to data privacy. Instead, data collection is governed by a slew of dated and incidental federal laws and piecemeal state privacy laws.
As digital contact tracing is in essence a health application, the developer’s first legal consideration should be the federal Health Insurance Portability and Accountability Act of 1996 (“HIPAA”). HIPAA only applies to “covered entities.” Additionally, HIPAA extends to “business associates” whose work on behalf of or provide services to covered entities involves accessing “protected health information.” Protected health information is any identifiable information regarding the health status of an individual that is created, collected, transmitted, or maintained by a covered entity relating to healthcare services. Therefore, developers should consider the following to determine whether their application falls under the purview of HIPAA:
- Is the developer a “covered entity” or “business associate”? HIPAA applies only to covered entities and their business associates.
- Does the digital contact tracing use “protected health information”? HIPAA protects the disclosure and use of protected health information.
Most applications—even digital contact tracing applications—do not fall into any of these categories and therefore, HIPAA does not apply. But to avoid falling under the purview of HIPAA, developers should carefully monitor the sources of their data to ensure it is not protected health information and avoid contracting with covered entities.
If HIPAA applies, developers should ensure their application is in full compliance and take special note of HIPAA’s “Privacy Rule.” The Privacy Rule addresses the use and disclosure of protected health information. The Privacy Rule’s requirements are extensive. They impose obligations on developers, such as to: (1) use only the “minimum necessary” protected health information; (2) meet certain requirements for the digital packaging and protection of protected health information; and (3) obtain consent or de-identify protected health information before sharing it. Digital contact tracing application developers subject to HIPAA should carefully investigate their obligations and ensure their application is in compliance.
While recent HHS bulletins have stressed that the privacy protections guaranteed by HIPAA have not been set aside during the pandemic, HHS has relaxed HIPAA during the pandemic by: (1) waiving several provisions of the act; and (2) declining to enforce against disclosures to law enforcement that violate the Privacy Rule but are made in a “good faith” attempt at compliance.
In addition to HIPAA, digital contact tracing developers should consider the California Consumer Privacy Act (“CCPA”), which took full effect in August 2020. Generally speaking, the CCPA protects only California residents and their personal information not already covered by HIPAA—information that is identifiable, but not protected health information. It applies to for-profit entities that: (1) operate in California; (2) collect or receive consumers’ personal information; and (3) meet certain revenue or in-state operation minimums.
Thus, because the CCPA covers personal information only where HIPAA does not, the CCPA has limited applicability to digital contact tracing applications. However, there are several factors a developer should consider in determining whether their application falls under the umbrella of the CCPA:
- Is the developer a for-profit entity or part of a state or local government? For-profit entities are covered by the CCPA even if they contract with government authorities. However, the CCPA does not cover nonprofits and government entities.
- Does the developer operate in California? The CCPA only applies to businesses that operate in California.
- Does the developer collect Californians’ personal information outside of protected health information? The CCPA applies to digital contact tracing applications if the application collects data beyond the user’s protected health information, such as location and other personal information.
- Does the developer meet the CCPA’s revenue or operation requirements? The developer must reach the scale required to satisfy the CCPA’s revenue or operation minimums for the law to apply.
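The four questions above can be sketched as a simple applicability check. The threshold figures ($25 million in revenue, 50,000 Californians, 50% of revenue from selling data) come from the statute as described in the footnote below; the function name and parameters are illustrative assumptions, and this is not legal advice:

```python
def ccpa_may_apply(is_for_profit,
                   operates_in_california,
                   collects_ca_personal_info,
                   annual_gross_revenue,
                   ca_consumers_with_data,
                   revenue_share_from_selling_data):
    """Rough, non-authoritative sketch of the CCPA applicability test."""
    meets_a_threshold = (
        annual_gross_revenue > 25_000_000          # >$25M annual gross revenue
        or ca_consumers_with_data >= 50_000        # data on 50,000+ Californians
        or revenue_share_from_selling_data > 0.5   # >50% of revenue from selling data
    )
    return (is_for_profit                  # nonprofits and governments excluded
            and operates_in_california
            and collects_ca_personal_info  # beyond HIPAA-protected health info
            and meets_a_threshold)

# Usage: a for-profit developer operating in California with $30M revenue.
print(ccpa_may_apply(True, True, True, 30_000_000, 10_000, 0.1))  # True
```

A developer failing any one of the first three prongs, or all three thresholds, falls outside the statute entirely.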
If the CCPA does apply, application developers should install protocols to ensure its users can exercise the three rights the CCPA guarantees, which are: (1) the right to certain information about how and why their data is collected and used; (2) the right to request the application delete any information it has collected about the consumer; and (3) the right to opt out of the sale of the consumer’s personal information to third-parties.
Other State Privacy Laws
Without a harmonizing federal privacy law, developers must also consider all relevant state laws. With the exception of California, most states do not have robust privacy protections or any applicable privacy laws at all (although many predict that some states will soon enact them). Thus, application developers will find that in many states users are entirely reliant on their word to avoid misusing data or violating their privacy policies.
App-Launch: Government Procurement Process
Digital contact tracing application developers may also want to contract with state governments. Though some states have issued guidance to expedite emergency purchases of goods and services related to the pandemic, government procurement remains a slow process for most contact tracing applications.
The government’s key concerns when evaluating a digital contact tracing application are likely efficacy, privacy, and interoperability across states. However, digital contact tracing application developers report that state government officials lack the technical expertise to evaluate these novel applications, and this slows the procurement process. Indeed, Teddy Gold, executive director of Zero, a pandemic response software, described his experience to Time, “You’d get sent from the public health department to the governor’s office, to the [chief information officer], back to a mayor’s office, back to the chief information security officer’s office. It was this Kafkaesque thing where no one had ever done this. No one had ever developed a contact-tracing app before. States don’t develop apps.”
In response, the private sector stepped up to reduce friction from state governments’ lack of technical expertise during the procurement process. U.S. Digital Response (“USDR”), for example, reviewed several digital contact tracing applications and shared its own guidelines regarding efficacy, privacy, and interoperability. Moreover, the organization provides free evaluation and implementation assistance to governments considering digital contact tracing applications.
Meanwhile, Google and Apple met states more than halfway. In August, Google and Apple launched a new initiative called “Exposure Notifications Express,” which created a system whereby: (1) state authorities would submit parameters for contact tracing to Apple and Google; then (2) Apple and Google would develop an app or notification system on behalf of the state. This bolstered Google and Apple’s previous digital contact tracing tool, which was only an application programming interface, thus leaving state governments to develop the system and user interface. The purpose of Exposure Notifications Express was to speed up the procurement process by relieving state governments of the hassle of developing the application interface and system.
Digital contact tracing application developers should learn from these lessons. First, they should expect a slow procurement process. Second, in light of state governments’ lack of technical expertise, developers should proactively work with the government and USDR throughout the development process and carry their application to completion, rather than just provide the underlying software.
The regulatory landscape for digital contact tracing application developers is complex, but there are several things developers and the federal government can do to flatten these regulatory hurdles:
Recommendations for Developers
- Subscribe to a policy of data minimization and take a privacy-by-design approach: To avoid CCPA and HIPAA regulation and to anticipate future privacy laws already underway, application developers should employ a policy of data minimization and a privacy-by-design approach. That is, developers should use the least amount of data that is the most effective epidemiologically and incorporate privacy into application designs by default. This would require developers to:
- Limit data-sharing
- Use an opt-in and voluntary self-reporting strategy
- Include extensive and flexible disclosure agreements
- Track using Bluetooth software only
- Store data on decentralized peer-to-peer networks
- Where possible, abstain from using identifiable information and use only contact event numbers
- De-identify any identifiable information
- Include sunset clauses that promise to uninstall the application and destroy all stored data and personal information once the pandemic ends
- Carefully examine data sources to ensure they are not a “covered entity” or “business associate” under HIPAA
- Consider, before introducing an application to California, whether that introduction will subject the application to regulation by the CCPA
- Comply with all privacy laws: Investigate and prepare to comply with all relevant privacy laws.
- Allow substantial time for the procurement process and work preemptively with state governments: Digital contact tracing applications should anticipate a slow procurement process and take care to address the state governments’ core considerations, which are efficacy, privacy, and interoperability. But they should also take steps to alleviate state governments’ lack of technical expertise by: (1) working with USDR and state governments during the development process; and (2) developing the application to completion rather than just providing a tool which the government would have to build over.
Recommendations for the Federal Government
- Pass a federal privacy law: Three federal privacy bills are currently awaiting Congressional action, and another COVID-19 and the Law blog post, “The Constitutionality of Technology-Assisted Contact Tracing,” provides a helpful comparison of them. Whatever form the law takes, it should simplify the privacy law landscape developers currently must navigate. Specifically, it should harmonize with existing federal and state laws and apply to all digital data collection, transfer, storage, and use.
- Pursue a whole-government contract for digital contact tracing: The federal government should consider procuring a nationwide digital contact tracing system to eliminate the slow rollout caused by the states’ procurement processes. Reportedly, the White House considered a similar nationwide digital contact tracing system in April 2020 but did not execute on it, citing concerns about patient privacy and civil liberties. But other countries have done this with some success. Australia, for instance, contracted with Amazon Web Services to host COVIDSafe, its national digital contact tracing application. This deal integrated all federal, state, and territory agencies on one platform and enabled the country to quickly respond to the pandemic. Therefore, the Biden administration should reconsider implementing a national digital contact tracing system.
 Covered entities are healthcare providers, health insurance organizations, and clearinghouses.
 In addition to the first two requirements, the CCPA only applies to for-profit entities that meet one of the following thresholds: (1) those that earn more than $25 million in annual gross revenue; (2) those that buy, sell, or receive personal information of 50,000 or more Californians; or (3) those that derive more than 50% of their annual revenue from the sale of Californians’ personal information.