
Posts Tagged ‘privacy’


Apple is portraying itself as the defender of privacy in the tech world, but it’s one slip away from embarrassment

January 10, 2019

Analysis: Apple has continued to ratchet up its criticism of competitors in a bid to differentiate itself as the “most secure” tech company.
The move is risky: Apple is exposed on several fronts to possible privacy and security leaks and breaches, leaving it one incident away from a significant reputational dent that could further hurt sales.
Kate Fazzini

Tim Cook, Chief Executive Officer of Apple Inc., takes a selfie with a customer and her iPhone as he visits the Apple Store in Chicago, Illinois, U.S., March 27, 2018.
John Gress | Reuters
Apple ramped up its efforts this week to differentiate its business on the basis of privacy and security, a risky move given the exposure of its cloud-based backup service and a challenging privacy environment globally, particularly in China, where the company says it is struggling.

Apple took a high-profile swipe at Google, Amazon and Facebook at this year’s Consumer Electronics Show, with a full-building ad touting “What happens on your iPhone, stays on your iPhone.” CEO Tim Cook has criticized competitors for their privacy practices and their willingness to share data with third parties.

Apple is now also reportedly hiring ex-Facebook engineer Sandy Parakilas, who called Facebook a “living, breathing crime scene” because of its misuse by Russian hackers in the 2016 election. (Parakilas is reportedly taking an internal spot as a privacy product manager at Apple, a role not likely to include public-facing statements like these in the future).

To be sure, Apple’s core business is different from Facebook’s and Google’s. Apple makes the bulk of its money selling iPhones and other computing devices, and charging consumer subscriptions for things like Apple Music. That means Apple has little reason to compile detailed information about users, and even less incentive to sell that information to third parties. Facebook and Google, by contrast, make the vast majority of their money from advertising.

But putting such a big stake in privacy as a differentiator may be a risky business move.

First, Apple is just one iCloud breach away from an embarrassing incident that could damage its “what happens on your iPhone, stays on your iPhone” claims.

Scandals in recent years involving major celebrities whose nude photographs were stolen from their iCloud archives have come dangerously close to that. Apple has said these incidents involved username and password theft: criminals gained access to iCloud files using the celebrities’ credentials, not through a breached iCloud database.

But iCloud relies on the same cloud-based network architecture most companies rely on, including Amazon Web Services, Google’s cloud platform and Microsoft Azure. No database is impenetrable, including the ones iCloud uses. A single instance of leaked data or an insider theft could put the company at serious reputational risk.

Third-party applications are also a potential sticking point. From a security point of view, Apple’s app store has stringent safeguards in place that make it more resilient to security issues like application spoofing than competitors such as Google’s Play store.

But independent iPhone apps still have the capacity to misuse data. The company routinely removes applications from the store for providing user information to unauthorized third parties. The New York Times reported earlier this year that numerous free iOS apps track detailed user information and provide it to third parties.

So Apple may also be one data-tracking scandal away from significantly denting the idea that data necessarily “stays on your iPhone.”



Why Privacy Needs All of Us

December 17, 2018

By Cyrus Farivar
Dec 17 2018 – 7:30am
An excerpt from “Habeas Data: Privacy vs. the Rise of Surveillance Tech” (Melville House, 2018)

There is one American city that is the furthest along in creating a workable solution to the current inadequacy of surveillance law: Oakland, California — which spawned rocky road ice cream, the mai tai cocktail, and the Black Panther Party. Oakland has now pushed pro-privacy public policy along an unprecedented path.

Today, Oakland’s Privacy Advisory Commission acts as a meaningful check on city agencies — most often, police — that want to acquire any kind of surveillance technology. It doesn’t matter whether a single dollar of city money is being spent — if it’s being used by a city agency, the PAC wants to know about it. The agency in question and the PAC then have to come up with a use policy for that technology and, importantly, report back at least once per year to evaluate its use.

The nine-member all-volunteer commission is headed by a charismatic, no-nonsense 40-year-old activist, Brian Hofer. During the PAC’s 2017 summer recess, Hofer laid out his story over a few pints of beer. In the span of just a few years, he has become an unlikely crusader for privacy in the Bay Area.

In July 2013, when Edward Snowden was still a fresh name, the City of Oakland formally accepted a federal grant to create something called the Domain Awareness Center. The idea was to provide a central hub for all of the city’s surveillance tools, including license plate readers, closed circuit television cameras, gunshot detection microphones and more — all in the name of protecting the Port of Oakland, the third largest on the West Coast.

Had the city council been presented with the perfunctory vote on the DAC even a month before Snowden, it likely would have breezed by without even a mention in the local newspaper. But because government snooping was on everyone’s mind, including Hofer’s, it became a controversial plan.

After reading a few back issues of the East Bay Express in January 2014, Hofer decided to attend one of the early meetings of the Oakland Privacy Working Group, largely an outgrowth of Occupy and other activists opposed to the DAC. The meeting was held at Sudoroom — then a hackerspace hosted amidst a dusty collective of offices and meeting space in downtown Oakland.

Within weeks, Hofer, who had no political connections whatsoever, had meetings scheduled with city council members and other local organizations. By September 2014, Hofer was named as the chair of the Ad Hoc Privacy Committee. In January 2016, a city law formally incorporated that Ad Hoc Privacy Committee into the PAC — each city council member could appoint a representative from their district. Hofer was its chair, representing District 3, in the northern section of the city. Hofer ended up creating the city’s privacy watchdog, simply because he cared enough to do so.

On the first Thursday of every month, the PAC meets in a small hearing room, on the ground floor of City Hall. Although there are dozens of rows of theater-style folding seats, more often than not there are more commissioners in attendance than citizens. While occasionally a few local reporters and concerned citizens are present, most of the time, the PAC plugs away quietly. Turns out, the most ambitious local privacy policy in America is slowly and quietly made amidst small printed name cards — tented in front of laptops — one agenda item at a time.

Its June 1, 2017, meeting was called to order by Hofer. He was flanked by seven fellow commissioners and two non-voting liaison members.

The PAC was comprised of a wide variety of commissioners: a white law professor at the University of California, Berkeley; an African-American former OPD officer; a 25-year-old Muslim activist; an 85-year-old founder of a famed user group for the Unix operating system; a young Latino attorney; and an Iranian-American businessman and former mayoral candidate.

Professor Deirdre Mulligan, who as of September 2017 announced her intention to step down from the PAC pending a replacement, is probably the highest-profile member of the commission. Mulligan is a veteran of the privacy law community: she was the founding director of the Samuelson Clinic, a Berkeley Law clinic that focuses on technology-related cases.

“The connection between race and surveillance and policing has become more evident to people,” she told me. “It seemed like Oakland was in a good position to create some good examples. To think about how the introduction of technology would affect not just privacy, but equity and fairness issues.”

For his part, Robert Oliver tends to sit back — his eyes toggling between his black laptop and whoever in the PAC happens to be speaking. As the only Oakland native in the group, an army vet with a computer science degree from Grambling State University, and a former Oakland Police Department cop, Oliver comes to the commission with a very unique perspective. When uniformed officers come to speak before the PAC, Oliver doesn’t underscore that he served among them from 1998 until 2006. But he understands what a difficult job police officers are tasked with, especially in a city like Oakland, where, in recent years, there have been around 80 murders annually.

“From a beat officer point of view, who doesn’t have the life experience — and of course they’re not walking around with the benefit of case law sloshing around in their heads — they’re trying to make these decisions on the fly and still remain within the confines of the law while simultaneously trying not to get hurt or killed,” he told me over beers.

The way he sees it, Riley v. California is a “demarcation point” — the legal system is starting to figure out what the appropriate limits are. Indeed, the Supreme Court does seem to understand in a fundamental way that smartphones are substantively different from every other class of device that has come before.

Meanwhile, Reem Suleiman stands out, as she is both the youngest member of the PAC and the only Muslim. A Bakersfield native, Suleiman has been cognizant of what it means to be Muslim and American nearly her entire life. Since Sept. 11, 2001, she’s known of many instances where the FBI or other law enforcement agencies would turn up at the homes or workplaces of people she knew.

“It felt like a prerequisite as a Muslim in America,” she told me at a downtown Oakland coffee shop.

After leaving home, Suleiman went to the University of California, Los Angeles, to study, where she also became a board member of the Muslim Student Association. After graduation and moving to the Bay Area, she got a job as a community organizer with Asian Law Caucus, a local advocacy group. She quickly realized that a lot of people, including her own father, take the position that if law enforcement comes to your door, you should help out as much as possible, presuming that you have nothing to hide.

Suleiman would advise people: “Never speak with them without an attorney. Ask for their business card and say that your attorney will contact them. People didn’t understand that they had a right to refuse and that they [weren’t required] to let them enter without a warrant. It could be my father-in-law. It could be my dad, it was very personal.”

This background was her foray into how government snooping could be used against Muslims like her.

“The surveillance implications aren’t even in the back of anybody’s heads,” she said. “I feel like if the public really understood the scope of this they would be outraged.”

In some ways, Lou Katz is the polar opposite of Suleiman: he’s 85, Jewish and male. But they share many of the same civil liberties concerns. In 1975, Katz founded USENIX, a well-known Unix users’ group that continues today — he’s the nerdy, lefty grandpa of the Oakland PAC. Throughout the Vietnam era, and into the post-9/11 timeframe, Katz has been concerned about government overreach.

“I was a kid in the Second World War,” he told me over coffee. “When they formed the Department of Homeland Security, the bells and sirens went off. ‘Wait a minute, this is the SS, this is the Gestapo!’ They were using the same words. They were pulling the same crap that the Nazis had pulled.”

Katz got involved as a way to potentially stop a local government program, right in his own backyard, before it got out of control.

“It’s hard to imagine a technology whose actual existence should be kept secret,” he continued. “Certainly not at the police level. I don’t know about at the NSA or CIA level, that’s a different thing. NSA’s adversary is other nation states, the adversaries in Oakland are, at worst, organized crime.”

Serving alongside Katz is Raymundo Jacquez, a 32-year-old attorney with the Centro Legal de la Raza, an immigrants’ rights legal group centered in Fruitvale, a largely Latino neighborhood in East Oakland. Jacquez’s Oakland-born parents raised him in nearby Hayward with an understanding of ongoing immigrant and minority struggles. It was this upbringing that eventually made him want to be a civil rights attorney.

“This committee has taken on a different feel post-Trump,” he said. “You never know who is going to be in power and you never know what is going to happen with the data. We have to shape policies in case there is a Trump running every department.”

As of late 2017, the PAC’s most comprehensive policy success has been its stingray policy. Since the passage of the California Electronic Communications Privacy Act, California law enforcement agencies must, in nearly all cases, obtain a warrant before using a stingray. But the Oakland Police Department must now go a few steps further: As of February 2017, stingray use can only be approved by the chief of police or the assistant chief of police. (In an emergency situation, a lieutenant or above must approve.) In either case, each use must be logged, with the name of the user and the reason and results of each use. In addition, the police must provide an annual report that describes those uses and any violations of policy (alleged or confirmed), and must describe the “effectiveness of the technology in assisting in investigations based on data collected.”
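
The approval-and-logging regime described above amounts to a small data model: an approver rank, a named operator, a reason and results for each use, plus an annual roll-up of uses, policy violations and effectiveness. A minimal sketch of what such a log could look like follows; all class and field names here are hypothetical, not drawn from the city’s actual policy text.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a per-use stingray log: each deployment records
# who approved it, who used it, why, and what it produced. Lieutenant
# approval is valid only in emergencies, per the 2017 Oakland policy.
APPROVER_RANKS = {"chief", "assistant chief", "lieutenant"}

@dataclass
class StingrayUse:
    approved_by: str        # rank of the approving officer
    operator: str           # name of the user, as the policy requires
    reason: str
    results: str
    emergency: bool = False

    def __post_init__(self) -> None:
        if self.approved_by not in APPROVER_RANKS:
            raise ValueError(f"unauthorized approver rank: {self.approved_by}")
        if self.approved_by == "lieutenant" and not self.emergency:
            raise ValueError("lieutenant approval is valid only in an emergency")

@dataclass
class AnnualReport:
    year: int
    uses: list = field(default_factory=list)
    violations: list = field(default_factory=list)  # alleged or confirmed
    effectiveness: str = ""  # "effectiveness ... based on data collected"

report = AnnualReport(year=2017)
report.uses.append(
    StingrayUse("chief", "Ofc. Example", "kidnapping case", "suspect located")
)
```

The point of the structure is that every deployment carries its own audit trail, so the annual report can be generated mechanically rather than reconstructed after the fact.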

Cyrus Farivar (@cfarivar) is a senior tech policy reporter at Ars Technica, and a radio producer and author. “Habeas Data” builds on his coverage by diving deep into the legal cases over the last 50 years that have had an outsized impact on surveillance and privacy law in America. Excerpt published courtesy of Melville House.




Private Blockchains Could Be Compatible with EU Privacy Rules, Research Shows

November 12, 2018

Private blockchains, such as interbanking platforms set up to share information on customers, could be compatible with new E.U. privacy rules, according to research published Nov. 6. The study was conducted by Queen Mary University of London and the University of Cambridge, U.K.

The General Data Protection Regulation (GDPR), which governs the storage of personal data for all individuals within the European Union, came into effect this May. Under the law, all data controllers have to respect citizens’ rights in keeping and transferring their private information. If a data controller fails to do so, the potential fines are set at €20 million (about $22 million) or four percent of global turnover/revenues, whichever is higher.

The recent U.K. study, published in the Richmond Journal of Law and Technology, views blockchain and its nodes through the lens of the GDPR. According to the researchers, crypto-related technologies could fall under these rules and be treated as “controllers,” given that they publicly store private information about E.U. citizens in the chain and allow third parties to operate it. This, the study suggests, might slow down technology adoption in the EU:

“There is a risk that this legal uncertainty will have a chilling effect on innovation, at least in the EU and potentially more broadly. For example, if all nodes and miners of a platform were to be deemed joint controllers, they would have joint and several liability, with potential penalties under the GDPR.”

However, the researchers emphasize that blockchain operators could instead be treated as “processors,” like the companies behind cloud technologies that act on behalf of users rather than control their data. This, the study continues, applies mostly to Blockchain-as-a-Service (BaaS) offerings, where a third party provides the supporting infrastructure for the network while users store their data and control it personally.

As an example of this type of blockchain platform, the researchers cite centralized platforms for land registry and private interbanking solutions that set up “a closed, permissioned blockchain platform with a small number of trusted nodes.” Such closed systems could effectively comply with GDPR rules, the report continues.

To meet the privacy law, blockchain networks might also store personal data externally or allow trusted nodes to delete the private key for encrypted information, thus leaving indecipherable data on the chain, the researchers state.
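
That key-deletion approach, sometimes called crypto-shredding, can be illustrated with a toy sketch: ciphertext lives on the append-only chain, the key lives off-chain with a trusted node, and honoring an erasure request means deleting only the key. Everything below is illustrative, not drawn from the study: the XOR one-time pad stands in for a real cipher such as AES-GCM, and the class and method names are invented.

```python
import hashlib
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad XOR; encryption and decryption are the same op."""
    return bytes(b ^ k for b, k in zip(data, key))

class TrustedNode:
    """Holds per-record keys off-chain; only ciphertext is 'on-chain'."""

    def __init__(self) -> None:
        self._keys = {}   # off-chain, mutable key store
        self.chain = []   # append-only list standing in for the ledger

    def store(self, record_id: str, personal_data: bytes) -> str:
        key = os.urandom(len(personal_data))
        self._keys[record_id] = key
        ciphertext = xor_cipher(personal_data, key)
        self.chain.append((record_id, ciphertext))
        # A hash commitment is what a real chain would carry.
        return hashlib.sha256(ciphertext).hexdigest()

    def read(self, record_id: str) -> bytes:
        key = self._keys[record_id]           # KeyError once shredded
        ciphertext = dict(self.chain)[record_id]
        return xor_cipher(ciphertext, key)

    def erase(self, record_id: str) -> None:
        """Erasure request: delete only the off-chain key. The immutable
        ciphertext stays on the chain but is now indecipherable."""
        del self._keys[record_id]

node = TrustedNode()
node.store("citizen-42", b"alice@example.eu")
assert node.read("citizen-42") == b"alice@example.eu"
node.erase("citizen-42")   # the chain itself is never rewritten
```

The design choice the researchers describe is visible here: erasure never touches the append-only structure, so the chain’s integrity guarantees survive while the personal data becomes unrecoverable.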

However, the GDPR rules are extremely difficult to comply with for more decentralized networks, such as those built around mining and cryptocurrency. In that case, the nodes operating with the data of E.U. citizens might agree to fork a new version of the blockchain from time to time, thus reflecting mass requests for rectification or erasure. “However, in practice, this level of coordination may be difficult to achieve among potentially thousands of nodes,” the study reads.

In conclusion, the researchers urge the European Data Protection Board, the independent regulatory body behind the GDPR, to issue clearer guidance on the application of data protection law to various common blockchain models.

As Cointelegraph wrote earlier, the GDPR could both support and harm blockchain. Although current E.U. legislation shares some of the same goals as crypto-related technologies, such as decentralizing data control, blockchain companies could also face extremely high fines as data controllers.



Just Don’t Call It Privacy

September 23, 2018

What do you call it when employers use Facebook’s advertising platform to show certain job ads only to men or just to people between the ages of 25 and 36?

How about when Google collects the whereabouts of its users — even after they deliberately turn off location history?

Or when AT&T shares its mobile customers’ locations with data brokers?

American policymakers often refer to such issues using a default umbrella term: privacy. That at least is the framework for a Senate Commerce Committee hearing scheduled for this Wednesday titled “Examining Safeguards for Consumer Data Privacy.”

After a spate of recent data-mining scandals — including Russian-sponsored ads on Facebook aimed at influencing African-Americans not to vote — some members of Congress are now rallying behind the idea of a new federal consumer privacy law.

At this week’s hearing, legislators plan to ask executives from Amazon, AT&T, Google, Twitter and other companies about their privacy policies. Senators also want the companies to explain “what Congress can do to promote clear privacy expectations without hurting innovation,” according to the hearing notice.

There’s just one flaw with this setup.

In a surveillance economy where companies track, analyze and capitalize on our clicks, the issue at hand isn’t privacy. The problem is unfettered data exploitation and its potential deleterious consequences — among them, unequal consumer treatment, financial fraud, identity theft, manipulative marketing and discrimination.

In other words, asking companies whose business models revolve around exploiting data-based consumer-influence techniques to explain their privacy policies seems about as useful as asking sharks to hold forth on veganism.

“Congress should not be examining privacy policies,” Marc Rotenberg, the executive director of the Electronic Privacy Information Center, a prominent digital rights nonprofit, told me last week. “They should be examining business practices. They should be examining how these firms collect and use the personal data of customers, of internet users.”

The Senate Commerce hearing, however, doesn’t seem designed to investigate commercial surveillance and influence practices that might merit government oversight.

For one thing, only industry executives are currently set to testify. And most of them are lawyers and policy experts, not engineers versed in the mechanics of data-mining algorithms.

Companies are sending their “policy and law folks to Washington to make the government go away — not the engineering folks who actually understand these systems in depth and can talk through alternatives,” Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, told me.

That may be because Congress is under industry pressure.

California recently passed a new privacy law that would give Californians some power over the data companies hold on them. Industry groups hope to defang that statute by pushing Congress to pass federal privacy legislation that would overrule state laws. The industry-stacked Senate hearing lineup seems designed to pave the way for that, said Danielle Citron, a law professor at the University of Maryland.

Frederick Hill, a spokesman for the Senate Commerce Committee, said the group planned future hearings that would include other voices, such as consumer groups. But “for the first hearing,” Mr. Hill said, “the committee is bringing in companies most consumers recognize to make the discussion about privacy more relatable.”

What is at stake here isn’t privacy, the right not to be observed. It’s how companies can use our data to invisibly shunt us in directions that may benefit them more than us.

Many consumers know that digital services and ad tech companies track and analyze their activities. And they accept, or are at least resigned to, data-mining in exchange for conveniences like customized newsfeeds and ads.

But revelations about Russian election interference and Cambridge Analytica, the voter-profiling company that obtained information on millions of Facebook users, have made it clear that data-driven influence campaigns can scale quickly and cause societal harm.

And that leads to a larger question: Do we want a future in which companies can freely parse the photos we posted last year, or the location data from the fitness apps we used last week, to infer whether we are stressed or depressed or financially strapped or emotionally vulnerable — and take advantage of that?

“Say I sound sick when I am talking to Alexa, maybe they would show me medicine as a suggestion on Amazon,” said Franziska Roesner, an assistant professor of computer science at the University of Washington, using a hypothetical example of Amazon’s voice assistant. “What happens when the inferences are wrong?”

(Amazon said it does not use Alexa data for product recommendations or marketing.)

It’s tough to answer those questions right now when there are often gulfs between the innocuous ways companies explain their data practices to consumers and the details they divulge about their targeting techniques to advertisers.

AT&T’s privacy policy says the mobile phone and cable TV provider may use third-party data to categorize subscribers, without using their real names, into interest segments and show them ads accordingly. That sounds reasonable enough.

Here’s what it means in practice: AT&T can find out which subscribers have indigestion — or at least which ones bought over-the-counter drugs to treat it.

In a case study for advertisers, AT&T describes segmenting DirecTV subscribers who bought antacids and then targeting them with ads for the medication. The firm was also able to track those subscribers’ spending. Households who saw the antacid ads spent 725 percent more on the drugs than a national audience.

Michael Balmoris, a spokesman for AT&T, said the company’s privacy policy was “transparent and precise, and describes in plain language how we use information and the choices we give customers.”

But consumer advocates hope senators will press AT&T, Amazon and other companies this week to provide more details on their consumer-profiling practices. “We want an inside look on the analytics and how they’re categorizing, ranking, rating and scoring us,” Professor Citron said.

Given the increased public scrutiny, some companies are tweaking their tactics.

AT&T recently said it would stop sharing users’ location details with data brokers. Facebook said it had stopped allowing advertisers to use sensitive categories, like race or religion, to exclude people from seeing ads. Google created a feature for users to download masses of their data, including a list of all the sites Google has tracked them on.

Government officials in Europe are not waiting for companies to police themselves. In May, the European Union introduced a tough new data protection law that curbs some data-mining.

It requires companies to obtain explicit permission from European users before collecting personal details on sensitive subjects like their religion, health or sex life. It gives European users the right to see all of the information companies hold about them — including any algorithmic scores or inferences.

European users also have the right not to be subject to completely automated decisions that could significantly affect them, such as credit algorithms that use a person’s data to decide whether a bank should grant him or her a loan.

Of course, privacy still matters. But Congress now has an opportunity to press companies like Amazon on broader public issues. It could require them to disclose exactly how they use data extracted from consumers. And it could force companies to give consumers some rights over that data.



Privacy and security: no simple solution, warns Rachel Dixon

September 18, 2018

The tide is turning when it comes to privacy and security, with Australians gradually becoming more aware of the need to protect their personal data and the risks involved in sharing it.

Rachel Dixon, privacy and data protection deputy commissioner at the Office of the Victorian Information Commissioner, says that with public debates over My Health Record and new tech surveillance laws, the public is now more informed about these issues than ever before.

“Not that many years ago there was (a view) that privacy is dead,” she says. “That now sounds quite outdated. In some ways the conversation still does need to get more mature. But this has been a real watershed year for privacy issues making it to the mainstream.

“That’s a very good thing.”

According to Ms Dixon, the theme of the last decade broadly had been to “hoover up as much data as possible”, and that’s now shifting to a theme of “taking the data that is necessary to fulfil the function”.

“There’s been a change in people’s understanding around their privacy,” she says. “Increasingly they’re more concerned, and are less willing to hand over data in certain circumstances. A lot of the use of data now is looking at the risks involved.

“Humans historically have not been very good at calculating risk. That’s been terrific in the past, it’s allowed us to sail across oceans and go into space. But we’re not very good at it. So I want us to move to having a risk-based framework, and change the culture around assessing risk.”

Debate is currently raging over whether Australian law enforcement agencies should have the right to decrypt smart devices to prevent and solve criminal activity, with ferocious opinions on both sides.

For former FBI agent Ed Stroz, the founder and co-president of Stroz Friedberg, the ability to thwart terrorist attacks is more important overall than the right to an individual’s privacy.

“You can see both sides of the issue. And it comes down to, ‘Do people have the right to privacy?’ I’m a little more sympathetic to the law enforcement side,” he says.

“People do value their privacy now, but if you have an encrypted phone held by a criminal, that creates a sacred category of evidence we’ve never had in our judicial system before. Out of the box, this engineering empowers adverse behaviour and that has big social effects.

“If we didn’t have that many adversaries around, it probably wouldn’t matter that much. But I weigh that aspect of it more heavily than valuing privacy overall. That’s a personal view that I have.”

Ms Dixon said encryption was a complex issue, and that there was no simple, obvious, single solution to the balance between privacy and security.

“If there was, we would have done it by now,” she says. “Chances are, the solution here is a combination of things. But the debate is definitely going to be messy. At least the discussion has raised some really good points. I would caution against looking for a simple answer or seeing the issue as binary. It’s not; these are healthy tensions between privacy, data protection and freedom of information.”

Marcin Kleczynski is chief executive of Malwarebytes, a security company he started as a 16-year-old. He said that while users had become more savvy about their own security and privacy, they were still generally the weak link when it comes to viruses and other attacks. “It takes a lot to always be patching your systems and keeping everything up to date,” he says. “There are so many damn security companies, I could name 60 or 70, but an attack still comes and no-one’s ready.

“I’m fairly pessimistic about this stuff. I think we still haven’t solved a lot of the basics when it comes to security. We need a lot more user awareness training about security and storing your own information, there needs to be a lot more basic hygiene in place. We’re slowly getting there.”

