
Privacy Problems Mount for Tech Giants

January 21, 2019

By Sam Schechner
Jan. 21, 2019 6:30 a.m. ET

Big tech companies have taken a public lashing in the past year over their handling of users’ personal information. But many of their biggest privacy battles have yet to be fought—and the results will help determine the fate of some of the world’s largest businesses.

So far, tech giants like Facebook Inc. and Alphabet Inc.’s Google have proved relatively resilient against a growing backlash over possible abuse of their users’ personal privacy. Tech companies’ stocks may have swooned, but advertisers are continuing to cut them checks, and their profits are still growing at double-digit rates that would earn most CEOs a standing ovation.

This year may be stormier. Growing discontent among users over privacy and other issues—such as the widespread feeling that mobile devices and social media are addictive—could damp profit growth, discourage employees or chase away ad dollars. In Europe, regulators are slated to make major rulings about tech companies’ privacy practices, likely setting off high-stakes litigation. In the U.S., revelations about allegedly lax privacy protections are raising political pressure for federal privacy regulation.

At risk are tens of billions of dollars that marketers spend every year in online advertisements targeted at users with the help of personal information about individuals’ web browsing, mobile-app usage, physical location and sometimes other data, like income levels.

The behavior of tech giants is likely to be a major topic at the World Economic Forum this week in Davos, Switzerland. While the yearly meeting of world leaders and company executives normally celebrates how businesses can solve the world’s problems, tech companies were on the defensive last year against complaints that ranged from fomenting political polarization to building artificial intelligence that will displace millions of workers.

Since then, the pressure has increased. Facebook executives have been dragged before legislators on both sides of the Atlantic, after the company said data related to as many as 87 million people may have been improperly shared with Cambridge Analytica, a political analytics firm. And in September, Facebook said hackers had gained access to nearly 50 million accounts.

Google, meanwhile, has faced criticism of its privacy practices from political leaders, including flak after The Wall Street Journal reported that the company had exposed the private data of hundreds of thousands of users of its Google+ social network and opted initially not to disclose it.

Some tech executives have raised alarms, too. Apple Inc. Chief Executive Tim Cook, speaking in October before a privacy conference organized by the European Union, called for tighter regulation in the U.S. along the lines of a strict new privacy law in the EU, saying that some companies had “weaponized” users’ personal information in what he described as a “data-industrial complex.”

Facebook and Google both say that they have been investing heavily in improving how they protect user privacy and that they welcome tighter privacy rules; both companies support passage of a U.S. federal privacy law. Tech-industry lobbyists say they are planning to support U.S. privacy legislation over the coming year, in part to avoid contending with a patchwork of laws like one passed last year in California.

“Our industry strongly supports stronger privacy protections for consumers,” says Josh Kallmer, executive vice president for policy at the Information Technology Industry Council, which represents Facebook, Google and other tech companies. Mr. Kallmer says consumers “benefit incredibly from these technological innovations,” but adds that “alongside that are some very legitimate concerns about how data is being handled.”

What impact will stricter privacy rules have? There are two theories.

One school of thought says that stricter rules and tighter enforcement will benefit big, incumbent companies that already have access to large amounts of user data and can spend more heavily on legal-compliance efforts. The other argues that rules like those in the EU’s new General Data Protection Regulation, if strictly applied, will force significant changes to how the biggest tech companies collect and analyze individuals’ personal information—undercutting their advertising businesses and weakening their advantage over existing or potential new competitors.

“Both are reasonable claims. But it is far too early to tell which will turn out to be true,” says Alessandro Acquisti, a professor at Carnegie Mellon University who studies the behavioral economics of privacy.

At issue, in part, is the distinction between short-term and long-term effects. There are signs that Google, for one, benefited at least initially from the transition to the GDPR in May, in part because advertisers shifted money to the bigger firms, which were able to show they had users’ consent to display targeted ads.

In Europe, Google saw a 0.9% increase in the share of websites that include its advertising trackers two months after the GDPR went into effect compared with two months before, according to Cliqz, which makes antitracking tools for consumers. Facebook’s share declined 6.7%. The share for the other top 50 online-ad businesses fell more than 20%.

The longer-term impact on big firms is harder to predict. One study of nearly 10,000 online display advertising campaigns showed that users’ intent to purchase products was diminished after earlier EU laws restricted advertisers’ ability to collect data in order to target those ad campaigns. But more research is needed to determine what impact tighter rules would have on consumer spending more broadly, Prof. Acquisti says.

How the laws are enforced by regulators and courts will play an important role. Ireland’s Data Protection Commission, which is the EU’s lead regulator for Facebook and Google, is investigating complaints from privacy activists that the consent companies sometimes request for the processing of individuals’ data is a condition of using a service and so is not “freely given,” as the law requires.

In Germany, the federal antitrust enforcer says it will issue early this year a final decision regarding its preliminary finding that Facebook uses its power as the most popular social network in the country to strong-arm users into allowing it to collect data about them from third-party sources. A German decision wouldn’t involve fines, but could include orders to change business practices.

Both Facebook and Google say they comply with privacy laws.

Initial decisions could come this year, but whichever way the watchdogs come down, their actions are likely to be reviewed in court. Those cases will determine how new privacy standards are applied. And that will determine how profound their impact is.

“There is active litigation in a couple of places that could become hugely important,” Mr. Kallmer says. “It’s uncertainty that our industry thinks it’s on the right side of.”

Mr. Schechner is a Wall Street Journal reporter in Paris. Email sam.schechner@wsj.com.



Why data privacy is hot and machine learning is not

January 15, 2019

Looking back on the past twelve months, we will all remember cybersecurity scares, revelations of data malpractices, and countless large-scale data breaches. Allegations ranged from Google’s non-consensual tracking of user location data to (unlikely) instances of China covertly installing microscopic spy chips on US tech hardware.

It’s clear that data privacy will underpin innovation and technological advancement in 2019, while the latest buzzword tech is set to go the way cryptocurrency went last year (who saw that coming?). As a member of the recently fashionable – not to mention increasingly lucrative – open source community, I wanted to share my predictions for the year ahead.

The bellwether blunder from the past 12 months has to be the Facebook-Cambridge Analytica data scandal, where it was alleged that the poached data of millions was used to influence voter behavior for key political decisions, including Trump’s election and the Brexit referendum.

Thus Mark Zuckerberg’s purported clandestine global practices were brought to the forefront of public debate; according to one survey, five per cent of Brits subsequently deleted their Facebook accounts.

For most of us in the open source community, it really feels like we are at a tipping point. Finally, the general public is waking up to what we have known for a long time: their data is being used, and misused, for heinous purposes.

That, coupled with the fact that millennials are rapidly falling out of love with the platform, suggests that in 2019 we can expect to take one step closer to the “grown-up” internet, sometimes called Web 3.0.

Pressure will continue to mount for companies to embrace user-friendly data privacy and ownership. The shift in cultural expectations, as well as the introduction of data legislation (e.g. GDPR), will force the “walled gardens” to stop siloing data without disclosing how they use it. Otherwise users will flee, and the network effect will work in reverse: if one domino falls, they all fall.

Although we have seen a rise of alternatives trying to take down the big silos – sometimes utilizing open source technology – we are yet to witness a viable alternative to Facebook, Twitter, Google, etc. But as public figures continue to call for change, I expect that in 2019 we will see more and more attempts at a federated, distributed messaging system competitor. And somebody might just get it right.

Downswing: Machine learning
On the back of the rollercoaster ride that was the Bitcoin bubble, blockchain was the buzzword at the beginning of 2018. And yet as we move into 2019, only the staunchest of fans will be defending its – eternally-yet-to-be-established – application as a solution for problems that we may or may not face in the future.

I won’t be bucking any trends with this prediction: it’s very likely that blockchain will continue its way along the downswing phase of its hype cycle in the new year.

But one hot topic that I feel most people are giving too much credit to is machine learning. OK, so it isn’t blockchain: companies have certainly done some great things with it, but those tend to be very specialized applications. It seems that for the broader, more headline-grabbing applications, the problems are harder to solve than we ever realized.

Take the example of autonomous vehicles. We’ve been promised self-driving cars for what feels like an age, and yet how close are we really to achieving full autonomy? When will we see the first example of a driverless car that is able to react to all situations that you encounter on the road? Certainly not in 2019.

With the machine learning approach, it’s become clear that autonomous cars are unable to recognize the social component of driving on roads full of human motorists. Eye contact and social cues are essential to road safety, and they come easily to us as humans. For machines, it’s a different story, and teaching them to recognize nuanced human communication is going to be a difficult task no matter how many CAPTCHAs of road traffic signs and storefronts we complete.

Then on top of all this, yet again there’s the privacy element. Cars collect a lot of data about us, including where we’ve been and when. And it’s not always apparent who this data is sold to. It may be ‘anonymized’, but location data is one of the most identifiable types of personal information that a company can hold on a user. People need to have more control over this.

As far as the future of autonomous vehicles is concerned, the key will be networking all of the cars using open standards and open protocols. Once all the cars are connected, there’s no need for machines to understand the complexities of human communication. This is the open source approach and it’s how we created the internet, so why not create the Internet of Cars?

Granting users control
The past year has been an eventful time for the wider tech sector – and amid the Teslas in space and genetic bioengineering, we’ve taken several leaps forward in regards to bringing issues surrounding data privacy into the public sphere of influence.

Facebook’s meltdown quite conveniently coincided with the introduction of GDPR, acting as a catalyst for debate over data rights, privacy and ownership. At the other end of the scale, revolutionary technologies that we once thought would be upon us by now continue to be pushed back.

We now begin 2019 with interest in open source solutions at an all-time high. And I cannot help but anticipate that innovators will continue to apply this technology, taking control away from the silos and granting it back to the users.



Apple is portraying itself as the defender of privacy in the tech world, but it’s one slip away from embarrassment

January 10, 2019

Analysis: Apple has continued to ratchet up its criticism of competitors in a bid to differentiate itself as the “most secure” tech company.
The move is a risky one, as Apple is exposed on several fronts to possible privacy and security leaks and breaches, putting it one step removed from a significant reputation dent that could further hurt sales.
Kate Fazzini

Tim Cook, Chief Executive Officer of Apple Inc., takes a selfie with a customer and her iPhone as he visits the Apple Store in Chicago, Illinois, U.S., March 27, 2018.
John Gress | Reuters
Apple ramped up its efforts this week to differentiate its business on the basis of privacy and security, a risky move given the exposure of its cloud-based backup service and a challenging privacy environment globally, particularly in China, where the company says it is struggling.

Apple took a high-profile swipe at Google, Amazon and Facebook at this year’s Consumer Electronics Show, with a full-building ad touting “What happens on your iPhone, stays on your iPhone.” CEO Tim Cook has criticized competitors for their privacy practices and their willingness to share data with third parties.

Apple is now also reportedly hiring ex-Facebook engineer Sandy Parakilas, who called Facebook a “living, breathing crime scene” because of its misuse by Russian hackers in the 2016 election. (Parakilas is reportedly taking an internal spot as a privacy product manager at Apple, a role not likely to include public-facing statements like these in the future).

For sure, Apple’s core business is different from Facebook’s and Google’s. Apple makes the bulk of its money selling iPhones and other computing devices, and charging consumer subscriptions for things like Apple Music. That means Apple has little reason to compile detailed information about users, and even less incentive to sell that information to third parties. But Facebook and Google make the vast majority of their money from advertising.

But putting such a big stake in privacy as a differentiator may be a risky business move.

First, Apple is just one iCloud breach away from an embarrassing incident that could damage its “what happens on your iPhone, stays on your iPhone” claims.

Apple has already come dangerously close: scandals in recent years involved major celebrities having nude photographs stolen from their iCloud archives. Apple has said these incidents involved username and password theft, giving criminals access to iCloud files through the celebrities’ password information, not a breached iCloud database.

But iCloud relies on the same cloud-based network architecture most companies rely on, including Amazon Web Services, Google’s cloud platform and Microsoft Azure. No database is impenetrable, and that includes the ones iCloud uses. A single instance of leaked data or an insider theft could put the company at serious reputational risk.

Third-party applications are also a potential sticking point. From a security point of view, Apple’s app store has stringent safeguards in place that make it more resilient to security issues like application spoofing than competitors such as Google’s Play store.

But independent iPhone apps still have the capacity to misuse data. The company routinely removes applications from the store for providing user information to unauthorized third parties. The New York Times reported earlier this year that numerous free iOS apps track detailed user information and provide it to third parties.

So Apple may also be one data-tracking scandal away from significantly denting the idea that data necessarily “stays on your iPhone.”



Why Privacy Needs All of Us

December 17, 2018

By Cyrus Farivar
Dec 17 2018 – 7:30am
An excerpt from “Habeas Data: Privacy vs. the Rise of Surveillance Tech” (Melville House, 2018)

There is one American city that is the furthest along in creating a workable solution to the current inadequacy of surveillance law: Oakland, California — which spawned rocky road ice cream, the mai tai cocktail, and the Black Panther Party. Oakland has now pushed pro-privacy public policy along an unprecedented path.

Today, Oakland’s Privacy Advisory Commission acts as a meaningful check on city agencies — most often, police — that want to acquire any kind of surveillance technology. It doesn’t matter whether a single dollar of city money is being spent — if it’s being used by a city agency, the PAC wants to know about it. The agency in question and the PAC then have to come up with a use policy for that technology and, importantly, report back at least once per year to evaluate its use.

The nine-member all-volunteer commission is headed by a charismatic, no-nonsense 40-year-old activist, Brian Hofer. During the PAC’s 2017 summer recess, Hofer laid out his story over a few pints of beer. In the span of just a few years, he has become an unlikely crusader for privacy in the Bay Area.

In July 2013, when Edward Snowden was still a fresh name, the City of Oakland formally accepted a federal grant to create something called the Domain Awareness Center. The idea was to provide a central hub for all of the city’s surveillance tools, including license plate readers, closed circuit television cameras, gunshot detection microphones and more — all in the name of protecting the Port of Oakland, the third largest on the West Coast.

Had the city council been presented with the perfunctory vote on the DAC even a month before Snowden, it likely would have breezed by without even a mention in the local newspaper. But because government snooping was on everyone’s mind, including Hofer’s, it became a controversial plan.

After reading a few back issues of the East Bay Express in January 2014, Hofer decided to attend one of the early meetings of the Oakland Privacy Working Group, largely an outgrowth of Occupy and other activists opposed to the DAC. The meeting was held at Sudoroom — then a hackerspace hosted amidst a dusty collective of offices and meeting space in downtown Oakland.

Within weeks, Hofer, who had no political connections whatsoever, had meetings scheduled with city council members and other local organizations. By September 2014, Hofer was named as the chair of the Ad Hoc Privacy Committee. In January 2016, a city law formally incorporated that Ad Hoc Privacy Committee into the PAC; each city council member could appoint a member of their district as a representative. Hofer was its chair, representing District 3, in the northern section of the city. Hofer ended up creating the city’s privacy watchdog, simply because he cared enough to do so.

On the first Thursday of every month, the PAC meets in a small hearing room, on the ground floor of City Hall. Although there are dozens of rows of theater-style folding seats, more often than not there are more commissioners in attendance than citizens. While occasionally a few local reporters and concerned citizens are present, most of the time, the PAC plugs away quietly. Turns out, the most ambitious local privacy policy in America is slowly and quietly made amidst small printed name cards — tented in front of laptops — one agenda item at a time.

Its June 1, 2017, meeting was called to order by Hofer, who was flanked by seven fellow commissioners and two non-voting liaisons.

The PAC comprised a wide variety of commissioners: a white law professor at the University of California, Berkeley; an African-American former OPD officer; a 25-year-old Muslim activist; an 85-year-old founder of a famed user group for the Unix operating system; a young Latino attorney; and an Iranian-American businessman and former mayoral candidate.

Professor Deirdre Mulligan, who as of September 2017 announced her intention to step down from the PAC pending a replacement, is probably the highest-profile member of the commission. Mulligan is a veteran of the privacy law community: she was the founding director of the Samuelson Clinic, a Berkeley Law clinic that focuses on technology-related cases.

“The connection between race and surveillance and policing has become more evident to people,” she told me. “It seemed like Oakland was in a good position to create some good examples. To think about how the introduction of technology would affect not just privacy, but equity and fairness issues.”

For his part, Robert Oliver tends to sit back — his eyes toggling between his black laptop and whoever in the PAC happens to be speaking. As the only Oakland native in the group, an army vet with a computer science degree from Grambling State University, and a former Oakland Police Department cop, Oliver comes to the commission with a very unique perspective. When uniformed officers come to speak before the PAC, Oliver doesn’t underscore that he served among them from 1998 until 2006. But he understands what a difficult job police officers are tasked with, especially in a city like Oakland, where, in recent years, there have been around 80 murders annually.

“From a beat officer point of view, who doesn’t have the life experience — and of course they’re not walking around with the benefit of case law sloshing around in their heads — they’re trying to make these decisions on the fly and still remain within the confines of the law while simultaneously trying not to get hurt or killed,” he told me over beers.

The way he sees it, Riley v. California is a “demarcation point” — the legal system is starting to figure out what the appropriate limits are. Indeed, the Supreme Court does seem to understand in a fundamental way that smartphones are substantively different from every other class of device that has come before.

Meanwhile, Reem Suleiman stands out, as she is both the youngest member of the PAC and the only Muslim. A Bakersfield native, Suleiman has been cognizant of what it means to be Muslim and American nearly her entire life. Since Sept. 11, 2001, she’s known of many instances where the FBI or other law enforcement agencies would turn up at the homes or workplaces of people she knew.

“It felt like a prerequisite as a Muslim in America,” she told me at a downtown Oakland coffee shop.

After leaving home, Suleiman went to the University of California, Los Angeles, to study, where she also became a board member of the Muslim Student Association. After graduation and moving to the Bay Area, she got a job as a community organizer with Asian Law Caucus, a local advocacy group. She quickly realized that a lot of people, including her own father, take the position that if law enforcement comes to your door, you should help out as much as possible, presuming that you have nothing to hide.

Suleiman would advise people: “Never speak with them without an attorney. Ask for their business card and say that your attorney will contact them. People didn’t understand that they had a right to refuse and that they [weren’t required] to let them enter without a warrant. It could be my father-in-law. It could be my dad, it was very personal.”

This background was her foray into how government snooping could be used against Muslims like her.

“The surveillance implications aren’t even in the back of anybody’s heads,” she said. “I feel like if the public really understood the scope of this they would be outraged.”

In some ways, Lou Katz is the polar opposite of Suleiman: he’s 85, Jewish and male. But they share many of the same civil liberties concerns. In 1975, Katz founded USENIX, a well-known Unix users’ group that continues today — he’s the nerdy, lefty grandpa of the Oakland PAC. Throughout the Vietnam era, and into the post-9/11 timeframe, Katz has been concerned about government overreach.

“I was a kid in the Second World War,” he told me over coffee. “When they formed the Department of Homeland Security, the bells and sirens went off. ‘Wait a minute, this is the SS, this is the Gestapo!’ They were using the same words. They were pulling the same crap that the Nazis had pulled.”

Katz got involved as a way to potentially stop a local government program, right in his own backyard, before it got out of control.

“It’s hard to imagine a technology whose actual existence should be kept secret,” he continued. “Certainly not at the police level. I don’t know about at the NSA or CIA level, that’s a different thing. NSA’s adversary is other nation states, the adversaries in Oakland are, at worst, organized crime.”

Serving alongside Katz is Raymundo Jacquez, a 32-year-old attorney with the Centro Legal de la Raza, an immigrants’ rights legal group centered in Fruitvale, a largely Latino neighborhood in East Oakland. Jacquez’s Oakland-born parents raised him in nearby Hayward with an understanding of ongoing immigrant and minority struggles. It was this upbringing that eventually made him want to be a civil rights attorney.

“This committee has taken on a different feel post-Trump,” he said. “You never know who is going to be in power and you never know what is going to happen with the data. We have to shape policies in case there is a Trump running every department.”

As of late 2017, the PAC’s most comprehensive policy success has been its policy on stingrays, the cell-site simulators police use to locate phones. Since the passage of the California Electronic Communications Privacy Act, California law enforcement agencies must, in nearly all cases, obtain a warrant before using the devices. But the Oakland Police Department must now go a few steps further: As of February 2017, stingray use can only be approved by the chief of police or the assistant chief of police. (In an emergency situation, a lieutenant or above must approve.) In either case, each use must be logged, with the name of the user and the reason and results of each use. In addition, the police must provide an annual report that describes those uses and any violations of policy (alleged or confirmed), and must describe the “effectiveness of the technology in assisting in investigations based on data collected.”
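The logging requirements above can be sketched as a simple record type plus an annual summary. The field and function names here are illustrative, not Oakland’s actual schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class StingrayUse:
    """One logged deployment, per the policy's required fields."""
    approved_by: str   # chief or assistant chief; lieutenant+ in emergencies
    user: str          # name of the officer who used the device
    reason: str
    results: str
    used_on: date

def annual_report(uses: list[StingrayUse], violations: list[str]) -> dict:
    """Summarize a year of logged uses and policy violations."""
    return {
        "total_uses": len(uses),
        "reasons": [u.reason for u in uses],
        "violations": violations,
    }
```

The point of the structure is that every deployment carries its own justification and outcome, so the yearly report can be generated mechanically from the log rather than reconstructed after the fact.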

Cyrus Farivar (@cfarivar) is a senior tech policy reporter at Ars Technica, and a radio producer and author. “Habeas Data” builds on his coverage by diving deep into the legal cases over the last 50 years that have had an outsized impact on surveillance and privacy law in America. Excerpt published courtesy of Melville House.




Private Blockchains Could Be Compatible with EU Privacy Rules, Research Shows

November 12, 2018

Private blockchains, such as interbanking platforms set up to share information on customers, could be compatible with new E.U. privacy rules, according to research published Nov. 6. The study was conducted by Queen Mary University of London and the University of Cambridge, U.K.

The General Data Protection Regulation (GDPR), a recent piece of legislation that regulates the storage of personal data for all individuals within the European Union, came into effect this May. Under the law, all data controllers have to respect citizens’ rights in terms of keeping and transferring their private information. Should a data controller fail to do so, the potential fine is set at €20 million (about $22 million) or four percent of global turnover/revenues, whichever is higher.
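To make the “whichever is higher” rule concrete, here is a minimal sketch of the fine ceiling, using hypothetical turnover figures:

```python
def gdpr_max_fine(global_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine: EUR 20 million or 4% of
    worldwide annual turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * global_turnover_eur)

# For a small firm, the flat EUR 20M floor dominates;
# for a giant with EUR 2B turnover, the 4% figure (EUR 80M) does.
small = gdpr_max_fine(100_000_000)      # 4% would be only EUR 4M
giant = gdpr_max_fine(2_000_000_000)    # 4% is EUR 80M
```

This is why the same statute scales from a fixed penalty for smaller controllers to turnover-linked exposure for the largest platforms.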

The recent U.K. study, published in the Richmond Journal of Law and Technology, views blockchain and its nodes through the lens of the GDPR. According to the researchers, crypto-related technologies could fall under these rules and be treated as “controllers,” given that they publicly store private information about E.U. citizens in the chain and allow third parties to operate on it. This, the study reveals, might slow down the technology’s implementation in the EU:

“There is a risk that this legal uncertainty will have a chilling effect on innovation, at least in the EU and potentially more broadly. For example, if all nodes and miners of a platform were to be deemed joint controllers, they would have joint and several liability, with potential penalties under the GDPR.”

However, the researchers emphasize that blockchain operators could be treated like “processors” instead, the same as the companies behind cloud technologies who act on behalf of users rather than control their data. This, the study continues, is mostly applicable for Blockchain-as-a-Service (BaaS) offerings, where a third party provides the supporting infrastructure for the network while users store their data and control it personally.

As examples of this type of blockchain platform, the researchers cite centralized land-registry platforms and private interbanking solutions that set up “a closed, permissioned blockchain platform with a small number of trusted nodes.” Such closed systems could effectively comply with GDPR rules, the report continues.

To meet the privacy law, blockchain networks might also store personal data externally or allow trusted nodes to delete the private key for encrypted information, thus leaving indecipherable data on the chain, the researchers state.
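The key-deletion approach the researchers describe (sometimes called “crypto-shredding”) can be illustrated with a toy one-time pad. This is a sketch of the general idea only, not any platform’s actual scheme:

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: a random key as long as the message, XORed byte by byte.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR with the same key recovers the plaintext.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b"alice@example.com"          # hypothetical personal data
on_chain, key = encrypt(record)        # only the ciphertext goes on the chain
assert decrypt(on_chain, key) == record

# "Erasure": a trusted node deletes the key held off-chain. The immutable
# ciphertext remains on the chain but is now indecipherable.
del key
```

The immutable ledger never changes; what changes is whether anyone can still read the entry, which is why the researchers treat key deletion as a candidate for satisfying erasure requests.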

However, the GDPR rules are extremely difficult to comply with for more decentralized networks, such as those concerned with mining and cryptocurrency. In this case, the nodes operating with the data of E.U. citizens might agree to fork a new version of the blockchain from time to time, thus reflecting mass requests for rectification or erasure. “However, in practice, this level of coordination may be difficult to achieve among potentially thousands of nodes,” the study reads.

In conclusion, the researchers urge the European Data Protection Board, the independent regulatory body behind the GDPR, to issue clearer guidance on the application of data protection law to various common blockchain models.

As Cointelegraph wrote earlier, the GDPR could both support and harm blockchain. Despite the fact that current E.U. legislation partially shares the same goals as crypto-related technologies, such as decentralizing data control, blockchain companies could also face extremely high fines as data controllers.

