More than a dozen privacy bills have been introduced in this Congress. Here’s what it needs to do.
By Peter M. Lefkowitz
Mr. Lefkowitz is the chief privacy and digital risk officer at Citrix Systems.
June 25, 2019
This is the year of privacy, a unique opportunity to protect Americans’ privacy rights while protecting technological innovation and civil discourse.
If this sounds lofty for a technology issue, consider: The Cambridge Analytica scandal exposed significant threats to consumers and to democracy posed by the misuse of consumer personal information; the General Data Protection Regulation, Europe’s year-old comprehensive privacy law, required companies globally to rework their privacy programs, at real expense and with the possibility of monumental fines; and the California Consumer Privacy Act, enacted quickly last year, gives consumers extensive new rights, including the ability to tell companies not to sell or retain their data.
Congress is working on a consumer privacy law, with the prospect that the United States will join 80 other countries with comprehensive national laws protecting personal information. At least 15 bills have been filed this year, and a group of six senators is expected to introduce its own bill in coming weeks, the most substantial undertaking to date.
There are many reasons to support federal privacy legislation. A federal law would set a consistent standard for how companies treat consumers’ personal information and would inspire greater confidence in how responsible companies behave. It could address the significant risks posed by the aggregation of consumer profiles, which include racial and economic discrimination and a lack of transparency about how information is collected and used. And a federal law would allow the United States to harmonize its laws with those of other major economies, easing trade concerns and promoting American technology in Europe and beyond.
For all of this promise, there are serious deficiencies in the way privacy is being discussed in public and in Congress, particularly the failure to consider how proposals designed to rein in social media would affect critical work elsewhere.
First, “opt-in consent” — the model suggested in a number of federal proposals — does not protect consumers.
Website privacy policies average more than 2,500 words and are carefully drafted to maximize data use and minimize legal exposure. Few consumers read these policies before agreeing to give up information, and the practices of ad networks and social media are not clear to most of us when we click the “I agree” button.
Second, rigid rules about the “sale of data” and limits on the use of artificial intelligence are not a productive way to prevent abuse and would impede activities essential to our safety and security.
Some behavior raises alarms and should be stopped, like secretly sharing minutely detailed personal profiles with political operatives to sway elections. However, essential activities — including advances in health care, cybersecurity, financial services and fundamental scientific research — depend upon large data sets and broad data sharing. Massachusetts has funded a public-private partnership called Mass Digital Health, and the American Medical Association has created the Integrated Health Model Initiative to promote data sharing across the medical and scientific communities to improve health care outcomes. Technology companies are deploying artificial intelligence across massive data sets to advance understanding of Parkinson’s, Alzheimer’s and other diseases.
Detecting fraud and cybercrime relies upon compiling and analyzing large sets of metadata, as bad actors intentionally strike broadly and over time. And A.I. is being used to benefit underserved communities. There are innovative programs using a range of personal data to offer loans to disadvantaged consumers, and there is research on internet search data to predict and prevent infectious diseases. Requiring companies to remove individual pieces of data from large data sets on demand, and prohibiting analytics because some might use them improperly, would render many of these activities impractical.
Finally, the law must not be so burdensome that it cuts off innovation and economic opportunity.
The digital sector represented roughly 7 percent of the American economy as of 2017, and it is growing at nearly triple the rate of the overall economy, according to the Bureau of Economic Analysis. Contrast that with Continental Europe, where a burdensome regulatory environment contributes to anemic growth in the tech sector.
At the same time, American companies are competing to develop more privacy-protective technologies and to wed their brands to how well they guard our data. Any new law must foster this commercial interest in the value of privacy without making it onerous for new businesses to emerge and compete. Notably, creating high barriers to new services only further entrenches the few social media and data companies with established services, large data sets and financial cushions.
Some ideas already employed widely outside the United States deserve consideration. Privacy laws in Canada, Japan and elsewhere rightly require companies to consider the benefits and risks their data practices pose to consumers and society. Setting standards for data use and requiring more granular disclosures would take the pressure off consumers to consent to all possible uses at once.
Any new law must also recognize that data is important to all we do and that we cannot simply make it go away. Instead, a variety of tools — including reasonable data minimization, development of industry standards and Federal Trade Commission rule-making — can protect against misuse while allowing development of science and industry.
Time is running out for Congress to pass meaningful privacy legislation before the implementation of the California privacy law in 2020, the adoption of inconsistent laws in other states, and the distraction of the 2020 elections. Let’s use the coming months constructively to enact a privacy law that protects consumers and promotes technology innovation and a healthy digital environment.
He was the 2018 chairman of the International Association of Privacy Professionals.