
Posts Tagged ‘privacy’


US legal eagle: Well done, you bought privacy compliance tools. Doesn’t mean you comply with anything

February 25, 2019

From California state regs to Europe’s GDPR: It’s all just a ‘veneer of protection’

By Rebecca Hill, 25 Feb 2019 at 14:44
Much-lauded privacy laws risk being undermined as compliance is outsourced to tech vendors and “toothless trainings, audits and paper trails” are confused for genuine protections, a New York Law School professor has said.

In a paper in the Washington Law Review, published online last week, Ari Ezra Waldman argued that recently strengthened privacy laws actually offer “false promises” for consumers.

He said that laws like the European Union’s GDPR or California’s state privacy rules are failing to deliver on their promised protections partly because of the “booming market” in tech vendors hawking privacy compliance tools.

“The responsibility for fulfilling legal obligations is being outsourced to engineers at third-party technology vendors who see privacy law through a corporate, rather than substantive, lens,” he wrote.

“Toothless trainings, audits, and paper trails, among other symbols, are being confused for actual adherence to privacy law, which has the effect of undermining the promise of greater privacy protection for consumers.”

The problem is heightened because, as they fear increasing fines under the new laws, organisations – particularly those without the cash to build tools in-house or hire in experts – are more likely to look for a quick fix.

However, Waldman warned that this could have knock-on effects: organisations buying shoddy kit risk non-compliance, and both the vendors’ long-term prospects and consumers stand to suffer.

“Not all innovation is good innovation,” Waldman said. “Companies that develop shoddy products may lose out in the market in the long term, but in the short and medium term, they risk putting millions of persons’ data at risk.”

‘Symbols of compliance standing in for real protections’
The paper aimed to emphasise the importance of privacy laws by pointing to Facebook’s “cavalier” approach to data protection, mobile app platforms that “routinely sweep in user data” because they can, and even academics’ interest in hoovering up personal info as part of studies.

As the implications of such mass data hoarding, harvesting and hawking have come to light, a set of comprehensive international privacy laws have been drawn up – but Waldman said that, in reality, the law’s “veneer of protection is hiding the fact that it is built on a house of cards”.

He pins much of this on the burgeoning “privacy outsourcing market” and the idea that third-party tech vendors “instantiate their own vision of the law into their services” to fling at organisations desperate to avoid whopping fines.

The argument is based on a socio-legal principle of “legal endogeneity”, first mooted by academic Lauren Edelman. This is when the law is shaped by ideas emerging from the space it seeks to regulate, rather than constraining or guiding those organisations’ behaviour.

It occurs when “ambiguously worded legal requirements” allow compliance professionals on the ground to define what the law means in practice – and in the case of privacy laws, much of this comes down to tech vendors and compliance professionals.

Some of the law’s most important premises – like privacy by design or consent – “are so unclear that professionals on the ground have wide latitude to frame the law’s requirements, kicking endogeneity into high gear”.

Tech can’t save you – but everyone wants it to
Mixed in with this is the fact that both private and public bodies have (misplaced) faith in technology to solve their problems; meanwhile the threats of financial penalties make organisations “uniquely susceptible to promises that vendors can make their troubles disappear”.

This opens the door to vendors selling compliance, and Waldman said that there are 200-plus firms that “instantiate their own interpretations of privacy law into the designs of automated tools, often marketing themselves as one-stop compliance shops”.

The author – hoping to see off any “not all vendors!” comebacks – emphasised that he isn’t saying every firm is part of the problem, nor that they alone are responsible for undermining the promise of privacy law.

Instead, Waldman said that the impact of privacy tech vendors on the legal frameworks is “both significant and under-explored” – and aimed to probe this by assessing the claims made by 165 companies listed in a 2018 report (PDF) from the International Association of Privacy Professionals.

He found that almost three-quarters had at some point positioned their products and services as achieving GDPR compliance – when most are designed to meet just two or three of the GDPR’s requirements, “if that”.

‘Privacy law can’t be broken down into code-able pieces’
A further issue described in the paper is that, by promoting these tools for compliance, vendors are attempting to reduce the law into “code-able pieces” when the law is about more than just paper trails and data maps.

“Such under-inclusive compliance technologies may then have the effect of increasing corporate exposure to administrative fines if in-house constituencies confuse purchasing a compliance technology that does a few things with actually solving a problem,” Waldman wrote.

He also posits the idea that this could lead to an imbalance between firms that have to outsource because they lack the money or time to recruit legal experts or build their own tools in-house, and those that can afford to do this.

Meanwhile, consumers are being disempowered because they are increasingly faced with tech-driven conversations about compliance based on black box algorithms. This also risks “erasing” traditional safeguards that see the law interpreted in the open and on the public record.

Waldman proposed that lawmakers edge away from “transactional visions of privacy law that are susceptible to symbolic structures”, and called on regulators such as the US Federal Trade Commission to be “more active vendor regulators” with better audits.

For vendors, he called for “more modest approaches” that include hiring lawyers and professionals and establishing a closer relationship with regulators, possibly including certification.

Possible products and services include summaries and comparisons of legislation, training courses and tools that scan the data a company has to seek out personal information.

He also called for further research that puts vendors in an ecosystem of social forces that influence the implementation of privacy law on the ground, as well as work on the problem of privacy education for engineers. ®



Give To Get: Sensing, Tracking And Your Privacy

February 11, 2019

Feb 10, 2019, 06:00pm
By Tracy Brower: I write about the changing nature of work, workers and the workplace.

What are you willing to give up when it comes to your privacy? In the end, it depends on what you get in return—the give to get equation. Technology can track us anywhere. As it gains momentum, and as we increasingly choose to be tracked, we also give up some privacy. But, what do we gain?

Your Fitbit knows your steps, your water intake and your weight. MapMyWalk knows your whereabouts and your level of workout discipline. Your ride sharing app knows the route you took, what time you traveled, and how much you paid. Facebook knows your friends, your sentiments, and so much more. If you use these apps, you believe that you get something in return that is worth the personal information you share—better health, convenient transportation or connections with a community of friends.

It’s Only The Beginning

These are only the beginning, of course. New contact lenses for diabetics sense glucose levels. A dress outfitted with sensors tracked how often women were touched at a nightclub. There is even a diaper that senses when a baby needs a change. In addition, China is beginning to use a social credit system to assess economic and social reputation.

Employee Monitoring

Increasingly, our employers know a lot about us as well. Sensing systems track handwashing compliance in medical workers. In an effort to curtail theft, TSA established a surveillance system for its employees. Unfortunately, it had unintended consequences: employees felt devalued and did everything they could to stay out of the cameras’ view.

Consequences of these capabilities and systems – both intended and unintended – hold both promise and peril. Growth in sensing technology will have far-reaching implications for our social norms and systems. Data gathering is not inherently negative; it’s a matter of how transparent companies are in gathering data and the choices they make about how the data is used.
Some companies provide discounts on health insurance in exchange for the use of a health app into which employees enter their most personal health information. Companies track worker locations through badging data, and talent information through performance management systems.

None of this is bad, necessarily, but what do employees receive in return? The risk is that the answer is “not much”. The pendulum can swing toward the value of data for the company—where organizations are getting a lot of data—without much of a give-back to employees. But there is an opportunity to level the playing field—ensuring employees receive a message that they are trusted and that tracking delivers value for them as much as for the company.

Give To Get

What if the data that is collected via your email or badge tracking could come back in a way that helps you work more effectively? What if the data you enter into the talent management system could help you curate your career, enhance your job fit, or match you with a mentor? What if the data you enter into the company health program could help you maintain fitness and reduce stress?

“Give to get” is the balancing of what the company receives in terms of the data it collects and the value it returns to employees. The best companies are those which do both—extract data that helps them achieve organizational results and provide value to employees as well.

Constructive Cultures

For organizations, the holy grail is constructive productive cultures where people want to work, make discretionary effort and contribute their best skills. The best cultures are transparent – sharing openly so employees can make the most informed decisions. They seed innovation by fostering appropriate risk taking, and encourage employees to share and explore. This type of culture can be antithetical to a need to protect security and privacy through limited information sharing and confidentiality. The ways companies gather, track, and monitor information send important messages to employees about value and trust.

Companies can balance the need for both security and privacy by educating people about why they’re gathering information and being as transparent as possible. Trust and positive culture are also enhanced by providing more choice and control—giving employees the opportunity to opt out of data gathering when it’s possible. Ultimately, companies need to do what’s right—not just what’s possible—by using their values as a guide.

Resolving The Tension

It’s a tension and the idea of “give to get” is one way to resolve it. When companies extract data from employees and consumers, they must give back as well.



Wise words on privacy, insurance company fined for privacy breach, and secure that email

February 5, 2019

By Howard Solomon @howarditwc
Published: February 4th, 2019
Wise words on privacy from a Canadian expert, a U.K. insurance company fined for mixing business and politics over Brexit, and how to secure that email.

Welcome to Cyber Security Today. It’s Monday February 4th. To hear the podcast click on the arrow below:
I was at a privacy conference in Toronto last week where I heard the respected Canadian expert Ann Cavoukian speak. She reminded attendees that privacy and security go together: they aren’t opposites. In fact, she said, privacy is essential to innovation. Companies that do both privacy and security will have an advantage over competitors because customers will trust them more. Is improving the control customers have over their personal information costly, including giving them the ability to refuse to allow their personal data to be re-used or shipped to another firm? Maybe. But, Cavoukian adds, that’s nothing compared to the damage to your brand, loss of trust, and lawsuits that result from a data breach.

Speaking of privacy breaches, the U.K. information commissioner has fined a British insurance company that sent over one million emails to subscribers of the Leave EU Brexit campaign without their full consent three years ago. The campaign was fined as well, for unlawfully using the insurance company to send 300,000 political marketing messages to customers. “It is deeply concerning that sensitive personal data gathered for political purposes was later used for insurance purposes; and vice versa,” said the information commissioner. You can read her full report here.

Fake email, where an attacker uses a phony “from” email address, a deceptive domain or a display name that impersonates a familiar company, is behind many successful data breaches. Someone clicks on a link or opens an attachment and in seconds they’re infected. If only there were a way to authenticate where email comes from. Actually, there is: it’s an open standard called DMARC. The good news is that more companies are using it. The bad news, according to security vendor Valimail, is that not enough of them are doing it, nor are they configuring it correctly. In a study released on Friday, the company said 80 per cent of U.S. federal government domains now use DMARC. By comparison, at least 50 per cent of Fortune 500 and large U.S. tech companies have adopted it. Does your company have a way of authenticating the email it sends? You should ask.
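For the curious, a DMARC policy is simply a DNS TXT record published at `_dmarc.<your-domain>`: a semicolon-separated list of tag=value pairs that tells receiving mail servers what to do with messages that fail authentication. A minimal sketch of reading one in Python (the record and reporting address below are hypothetical examples, not any real domain’s policy):

```python
def parse_dmarc(record: str) -> dict:
    """Split a DMARC TXT record into its tag=value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

# Hypothetical record: quarantine unauthenticated mail, send aggregate reports
record = "v=DMARC1; p=quarantine; pct=100; rua=mailto:dmarc-reports@example.com"
policy = parse_dmarc(record)
print(policy["p"])  # quarantine
```

The `p` tag is the policy receivers should apply to mail that fails the check (`none`, `quarantine` or `reject`), and `rua` is where aggregate reports are sent. You can look up any domain’s record yourself with `dig +short TXT _dmarc.example.com`.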

Finally, ever wonder how cellphone companies co-ordinate the billions of phone calls and text messages sent around the world? They do it through a protocol called SS7. This protocol has vulnerabilities. Until recently, however, it was thought only intelligence agencies could break into it. But last week the news site Motherboard confirmed a British bank was victimized by an SS7 hack, so it seems cybercriminals now have the capability as well. What this means is that the six-digit codes a financial institution texts you as part of a two-factor authentication login are increasingly likely to be stolen. I’ve talked about this before: the standard text messaging app that comes with many smartphones may not be safe enough for two-factor authentication. Some cellphone companies say they have taken steps to better secure their text messaging. But if you use an email, company login or bank app that offers two-factor authentication in addition to a username and password, see if it offers the ability to get the special code through a dedicated authenticator app. Four of them are Google Authenticator, Authy, Authenticator Plus, and Duo.
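Why are authenticator apps immune to the SS7 weakness? Because their codes never travel over the phone network at all: both your app and the server derive the code locally from a shared secret and the current time, using the standard TOTP algorithm (RFC 6238), which is HMAC-SHA1 over a 30-second counter. A minimal sketch in Python, verified against the published RFC 6238 test vector:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, now=None, digits: int = 6, step: int = 30) -> str:
    """Compute an RFC 6238 time-based one-time password."""
    counter = int((time.time() if now is None else now) // step)
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890" at t=59s
print(totp(b"12345678901234567890", now=59, digits=8))  # 94287082
```

Because an attacker would need the shared secret itself rather than a message in transit, there is nothing for an SS7 eavesdropper to intercept.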

That’s it for Cyber Security Today. Subscribe on Apple Podcasts, Google Podcasts or add us to your Flash Briefing on your smart speaker. Thanks for listening.



Privacy is a human right, we need a GDPR for the world: Microsoft CEO

January 28, 2019

This article is part of the World Economic Forum Annual Meeting
24 Jan 2019
Ceri Parker
Commissioning Editor, Agenda, World Economic Forum

Against the backdrop of a “techlash”, the CEO of Microsoft called for new global norms on privacy, data and Artificial Intelligence.

Satya Nadella, who has been shifting Microsoft’s focus to cloud computing, said he would welcome clearer regulations as every company and industry grappled with the data age.
In a talk at Davos, he praised GDPR, the European regulation on data protection and privacy that came into force last year.

“My own point of view is that it’s a fantastic start in treating privacy as a human right. I hope that in the United States we do something similar, and that the world converges on a common standard.”

The default position had to be that people owned their own data, he said.

Privacy is just one controversial area for tech companies. Nadella also addressed the growing field of facial recognition.
“It’s a piece of technology that’s going to be democratized, that’s going to be prevalent. I can come up with 10 uses that are very virtuous and important and can improve human life, and 10 uses that would cause problems,” he said.

Microsoft’s own website lists the following as applications to celebrate:

“Police in New Delhi recently trialed facial recognition technology and identified almost 3,000 missing children in four days. Historians in the United States have used the technology to identify the portraits of unknown soldiers in Civil War photographs taken in the 1860s. Researchers successfully used facial recognition software to diagnose a rare, genetic disease in Africans, Asians and Latin Americans.”

But the dark sides include invasion of privacy and bias. While Microsoft has built a set of principles for the ethical use of AI, Nadella said that self-regulation was not enough.

“In the marketplace there’s no discrimination between the right use and the wrong use… We welcome any regulation that helps the marketplace not be a race to the bottom.”

Share

Written by

Ceri Parker, Commissioning Editor, Agenda, World Economic Forum

The views expressed in this article are those of the author alone and not the World Economic Forum.



Privacy Problems Mount for Tech Giants

January 21, 2019

By Sam Schechner
Jan. 21, 2019 6:30 a.m. ET

Big tech companies have taken a public lashing in the past year over their handling of users’ personal information. But many of their biggest privacy battles have yet to be fought—and the results will help determine the fate of some of the world’s largest businesses.

So far, tech giants like Facebook Inc. and Alphabet Inc.’s Google have proved relatively resilient against a growing backlash over possible abuse of their users’ personal privacy. Tech companies’ stocks may have swooned, but advertisers are continuing to cut them checks, and their profits are still growing at double-digit rates that would earn most CEOs a standing ovation.

This year may be stormier. Growing discontent among users over privacy and other issues—such as the widespread feeling that mobile devices and social media are addictive—could damp profit growth, discourage employees or chase away ad dollars. In Europe, regulators are slated to make major rulings about tech companies’ privacy practices, likely setting off high-stakes litigation. In the U.S., revelations about allegedly lax privacy protections are raising political pressure for federal privacy regulation.

At risk are tens of billions of dollars that marketers spend every year in online advertisements targeted at users with the help of personal information about individuals’ web browsing, mobile-app usage, physical location and sometimes other data, like income levels.

The behavior of tech giants is likely to be a major topic at the World Economic Forum this week in Davos, Switzerland. While the yearly meeting of world leaders and company executives normally celebrates how businesses can solve the world’s problems, tech companies were on the defensive last year against complaints that ranged from fomenting political polarization to building artificial intelligence that will displace millions of workers.

Since then, the pressure has increased. Facebook executives have been dragged before legislators on both sides of the Atlantic, after the company said data related to as many as 87 million people may have been improperly shared with Cambridge Analytica, a political analytics firm. And in September, Facebook said hackers had gained access to nearly 50 million accounts.

Google, meanwhile, has faced criticism of its privacy practices from political leaders, including flak after The Wall Street Journal reported that the company had exposed the private data of hundreds of thousands of users of its Google+ social network and opted initially not to disclose it.

Some tech executives have raised alarms, too. Apple Inc. Chief Executive Tim Cook, speaking in October before a privacy conference organized by the European Union, called for tighter regulation in the U.S. along the lines of a strict new privacy law in the EU, saying that some companies had “weaponized” users’ personal information in what he described as a “data-industrial complex.”

Facebook and Google both say that they have been investing heavily in improving how they protect user privacy and that they welcome tighter privacy rules; both companies support passage of a U.S. federal privacy law. Tech-industry lobbyists say they are planning to support U.S. privacy legislation over the coming year, in part to avoid contending with a patchwork of laws like one passed last year in California.

“Our industry strongly supports stronger privacy protections for consumers,” says Josh Kallmer, executive vice president for policy at the Information Technology Industry Council, which represents Facebook, Google and other tech companies. Mr. Kallmer says consumers “benefit incredibly from these technological innovations,” but adds that “alongside that are some very legitimate concerns about how data is being handled.”

What impact will stricter privacy rules have? There are two theories.

One school of thought says that stricter rules and tighter enforcement will benefit big, incumbent companies that already have access to large amounts of user data and can spend more heavily on legal-compliance efforts. The other argues that rules like those in the EU’s new General Data Protection Regulation, if strictly applied, will force significant changes to how the biggest tech companies collect and analyze individuals’ personal information—undercutting their advertising businesses and weakening their advantage over existing or potential new competitors.

“Both are reasonable claims. But it is far too early to tell which will turn out to be true,” says Alessandro Acquisti, a professor at Carnegie Mellon University who studies the behavioral economics of privacy.

At issue, in part, is the distinction between short-term and long-term effects. There are signs that Google, for one, benefited at least initially from the transition to the GDPR in May, in part because advertisers shifted money to the bigger firms, which were able to show they had users’ consent to display targeted ads.

In Europe, Google saw a 0.9% increase in the share of websites that include its advertising trackers two months after the GDPR went into effect compared with two months before, according to Cliqz, which makes antitracking tools for consumers. Facebook’s share declined 6.7%. The share for the other top 50 online-ad businesses fell more than 20%.

The longer-term impact on big firms is harder to predict. One study of nearly 10,000 online display advertising campaigns showed that users’ intent to purchase products was diminished after earlier EU laws restricted advertisers’ ability to collect data in order to target those ad campaigns. But more research is needed to determine what impact tighter rules would have on consumer spending more broadly, Prof. Acquisti says.

How the laws are enforced by regulators and courts will play an important role. Ireland’s Data Protection Commission, which is the EU’s lead regulator for Facebook and Google, is investigating complaints from privacy activists that the consent companies sometimes request for the processing of individuals’ data is a condition of using a service and so is not “freely given,” as the law requires.

In Germany, the federal antitrust enforcer says it will issue early this year a final decision regarding its preliminary finding that Facebook uses its power as the most popular social network in the country to strong-arm users into allowing it to collect data about them from third-party sources. A German decision wouldn’t involve fines, but could include orders to change business practices.

Both Facebook and Google say they comply with privacy laws.

Initial decisions could come this year, but whichever way the watchdogs come down, their actions are likely to end up reviewed in court. Those cases will end up determining how new privacy standards will be applied. And that will determine how profound their impact is.

“There is active litigation in a couple of places that could become hugely important,” Mr. Kallmer says. “It’s uncertainty that our industry thinks it’s on the right side of.”

Mr. Schechner is a Wall Street Journal reporter in Paris. Email sam.schechner@wsj.com.

