
Free Encrypted Email

Posts Tagged ‘privacy’


Your Privacy Is Our Business

April 30, 2019

Let us reassure you: You’re worried only because you don’t understand anything about anything.

By Jessica Powell
Ms. Powell writes about the tech world.

April 27, 2019

Illustration: Zak Tebbal
You look upset. You’ve been making a lot of noise lately about deleting our company’s app from your phone. But please, sit down. Let me explain. You just don’t understand how technology works.

I don’t mean that in a condescending way — it’s just that you clearly don’t have a Ph.D. in internet security, and I think you sometimes get a little confused when you talk about these matters.

You keep saying we don’t respect your privacy, but look — here’s your data, right here in front of you. Here’s your picture of Uncle Greg throwing his face into the Mai Tai Volcano Bowl at that tiki bar. There’s your post on President Trump (nice job! That one got a lot of likes!). It’s all there, safe and secure in the warm blue blanket we’ve placed around it. No one has hacked it. It’s totally safe. Private.


Maybe by “privacy,” what you really mean to say is “control” — that you don’t have any control over your data. Believe me, we are all about control. We have a ton of privacy settings on our app. Just click on the top right of your screen, and then the link on the pull-down menu, and then follow that one and the one after that and — there you go!

The reason we make it a bit hard to get to your settings is that people aren’t really that interested in messing with those things. We know this because we watch everything you do.

Having said that, we care deeply about privacy and security, and think it’s so empowering for you (for all of us, really) to have control over your data. So by all means, if it will make you feel better, go right ahead and change your ad settings so that we can’t target you as closely as before. But just so you know, this means we’re going to have to show you a lot of ads for colorful socks with elephants on them, because we really have no idea who wants to buy those things, and we’re paid to show them to someone.

(Also, don’t be surprised if you suddenly feel lonely when all of our precision targeting disappears and your favorite brands are no longer in touch with you. What are you going to do? Go outside? Call your friends on the phone? This is 2019. No one wants to talk to you as much as the advertisers on our app.)

You’re mumbling again and looking irritated. Something about us and a data breach. I can tell you’ve been reading the newspapers. Don’t you know those things are obsolete? And journalists are so ill informed on security matters. They don’t respect privacy — their whole job is to reveal private things! They are the last people you should listen to on this issue.

Trust me when I assure you that there hasn’t been any data breach. Your data is secure. I mean, we did give third parties access to your data and sat back and watched as a political party used an ethically dubious service to target people in one of the most fraught elections in modern history, but really, who doesn’t do that in this industry? Also, we want to be really clear on this point since you aren’t a very technical person: Your data was secure the whole time — it was just in someone else’s very secure hands.

You’re shaking your head. Look, I know this is hard for you to understand, but I’m basically Henry Ford here and you are the old carriage driver who’s really worried that I’m going to run over your horse with my new car. And well, yes, I am going to do that — but here’s the thing! We’re doing it together.

All right, I’m still working on that metaphor, but let’s talk about the printing press. You see, when the printing press was invented, people were very afraid of it. They tried to stop it. But the printing press democratized access to information. That’s what we want to do with your information — democratize it — and make it available to every person, everywhere. We want to set your data free!

And we intend to make a fortune doing it.

Jessica Powell (@themoko) is the former head of communications for Google and the author of “The Big Disruption: A Totally Fictional but Essentially True Silicon Valley Story.”



Coffee with Privacy Pros: Three Constants of Privacy

April 23, 2019

A look behind the career and privacy theology of the law-lovin’ CPO of Uber, Ruby Zefo
By Jared Coseglia, Apr 23, 2019

Ruby Zefo spent over fifteen years at Intel doing a ton of cool stuff! Now she’s at Uber, and privacy is her focus and passion. Like many of her preeminent peers in the privacy profession, Zefo found her way to the field after a series of internal promotions and having developed a reputation for understanding the brand and successfully implementing sensitive security and legal systems and protocols. “My manager said, ‘We need to step up the privacy team! You have built global teams here before, dropkick that!’” recalls Zefo after serving first as Intel’s managing counsel for trademarks and brands, then legal director for corporate affairs and IT/privacy and security, group counsel for McAfee products, followed by chief privacy and security counsel and finally group counsel for AI products. “Privacy was becoming the new black long before the GDPR,” Zefo admits. “I saw privacy as an opportunity for another career disruption.” Zefo is now the chief privacy officer for Uber, a company that has become a household name in under a decade and could possibly move toward a major public offering as early as this year.

Zefo is thoughtful, funny and to the point. She breaks down privacy into three pillars of challenge and constant consideration that should serve as a simple, recyclable reminder of what this profession is all about: laws, customers and technology. As she gets into the weeds of these three segments of the discipline, she illuminates potential opportunities for professionals looking to get ahead in the continuously competitive landscape of privacy.

Zefo is herself an attorney but feels that the responsibility to stay educated on the laws related to privacy is universal for anyone in the field. “Privacy laws and regulations are constantly changing; just keeping up can be really hard,” shares Zefo. And just keeping up may be enough for some hungry newcomers to the vertical looking to make an impact. According to Zefo, the speed and intensity with which privacy laws are changing and growing globally “levels the playing field for more junior people to break into the industry.” Zefo elaborates that, “You can be in privacy for twenty years, but if you are not keeping up [with the laws], you’re not worth much.”

Ruby Zefo, CPO at Uber
“But it’s not just keeping up,” she interjects. “Then you need to know how to operationalize it.” Zefo is indifferent to whether a privacy professional is a lawyer or not. “Legal … nonlegal … I can make anything work.” However, she does have a strong opinion about where the legal and operational privacy professionals should be based in the org chart. “I believe it’s best to have those privacy professionals condensed in the legal department. When you are split up, the nonlegal operational professionals end up in IT and not part of the privacy budget. Having done it both ways, I’ve been much happier being able to control the budget and share talent. No one questions if they are lawyers and nonlawyers. Both elements are helpful to a program.” One exception to this rule may be privacy engineers. “Privacy engineers often remain in engineering, which is appropriate in my opinion and a different kind of privacy function.” Zefo’s program at Uber is under legal.

When asked how privacy pros – lawyers or not – can keep learning and educating, Zefo, like many, points to attending privacy conferences, creating a social network, joining the IAPP, buying inexpensive books by experts or creating committees or working groups all in an effort to champion privacy programs and awareness. When asked what she looks for in new hires – lawyers or not – many of whom do not have explicit privacy experience in their background, she identifies “self-starters, go-getters and people willing to take chances.” Zefo finds the easiest people to weed out are the ones “who have taken no steps to self-groom themselves for a new practice area.”

The laws are but one pillar in Zefo’s triad of privacy practice. The customer, perhaps the most ever-changing and often the hardest variable to understand and affect, is squarely at the center of Zefo’s professional ethos. “We are always asking what’s happening with customers,” says Zefo. “How do we differentiate in a competitive market? How do we keep up with customer expectations?” This in many cases has nothing to do with what is lawful, but rather what is liked. Zefo describes a constant ebb and flow between practicality and culture. “Sometimes the law isn’t enough. You’re doing something totally lawful, but customers don’t like it. Sometimes we remove things that they don’t like.” For large organizations, engaging customers about privacy and understanding what they want and do not want can be, according to Zefo, “hard to decipher, but has to be part of the conversation.” Zefo continues, “Privacy is about feelings – not just how a regulator will enforce it, but how customers will view the policy.”

The final pillar in Zefo’s triad is technology. She includes both the use of technology to automate tasks for privacy programs and how everyone in the company uses technology as core to this consideration. Vendors are increasingly entering the privacy vertical, and Zefo welcomes these entrants. “You need external vendors to choose from who are experts in the space,” comments Zefo. This list is growing, and Zefo believes this is largely due to the rapidly evolving priorities for buyers in the space. Additionally, Zefo views the dynamic between software provider and client as a partnership, pointing to the importance of giving vendors feedback so they can grow the tools in the right direction. Those who can wield technology and work collaboratively with complex in-house corporate legal teams have a real opportunity for professional advancement and enhancement in the privacy space.

“Technology has made a big impact on privacy,” says Zefo, “but it is not just about AI. AI is a very broad term – a true scientist would tell you to clarify.” When it relates to an individual’s personal data, Zefo feels people at a minimum want to know “what is going into our system, so what comes out isn’t bias[ed].” Zefo smartly holds technology and the people who use it to a standard of “people don’t like what they don’t understand.” Thus, marrying transparency with simplicity when describing what technology does with one’s data is paramount to the practice of privacy programs and policies. “Your AI-enabled vacuum is not going to take over the world, but people are rightfully concerned when AI gets fancier,” adds Zefo.

Zefo closes the conversation with a clear example of how consideration of these three pillars of law, customer and technology is demonstrated by a simple Uber user experience. Say an Uber customer is in Tecopa, California, and has a driver picking them up and taking them to Las Vegas, Nevada. The destinations are in different states. How could this affect data regulation? Will an application technology know when or if data generation is created in different states? Are notifications necessary? Will this nuance require any unique user experience differentiation? Should it? “Laws are my job. I have to make sure my company is compliant,” says Zefo, “but that’s not the end of my analysis, especially when you are looking for customers to have a uniform experience.”
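The Tecopa-to-Las-Vegas example reduces to a simple jurisdiction check. A minimal sketch of how an application might flag trips whose data falls under more than one state’s rules; the function and field names here are invented for illustration, not Uber’s actual implementation:

```python
def applicable_jurisdictions(pickup_state: str, dropoff_state: str) -> set:
    """Return the set of states whose data rules may apply to one trip."""
    return {pickup_state, dropoff_state}

# Tecopa, California -> Las Vegas, Nevada
trip = {"pickup": "CA", "dropoff": "NV"}
states = applicable_jurisdictions(trip["pickup"], trip["dropoff"])

# A cross-state trip may trigger different notice or data-handling
# requirements in each state, even though the rider expects one
# uniform experience.
needs_cross_state_review = len(states) > 1
print(sorted(states), needs_cross_state_review)
```

The interesting design question is not the check itself but what happens downstream: whether the product surfaces any of this to the user, or absorbs the regulatory nuance behind a uniform experience, as Zefo describes.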

A uniform customer experience is a recurring theme among privacy leaders (see previous Coffees with Privacy Pros with CPO Cynthia Van Ort and CPO James Howard) and challenges all privacy pros to stay very in touch, not only with the laws, but with the people who use their service and products every day. The constant need for educational refreshment in these three pillars of privacy, coupled with the move of privacy from corporate to social consciousness, forecasts a high likelihood that privacy as a profession will continue to grow, expand and demand top talent who can keep pace and stay relevant.



We’ve Stopped Talking And Searching About Privacy

April 15, 2019

By Kalev Leetaru, AI & Big Data contributor. I write about the broad intersection of data and society.

After a year of nearly continuous privacy revelations involving Facebook, and in an era when breaches and privacy failures have become so commonplace that they no longer attract much in the way of headlines, have we as a society simply given up on the quaint historical notion of privacy?

Historically, mentions of the word “privacy” in English-language books published between 1800 and 2000 took off in the late 1960s, coinciding with a near-vertical increase in mentions of “data” and “computer.” From the very beginning, it seems, our societal discussion of privacy was tightly linked to the rise of the computing era.

In fact, the US Government’s Privacy Act of 1974 was motivated in part by the “potential abuses presented by the government’s increasing use of computers to store and retrieve personal data by means of a universal identifier.”

Looking to American television news over the past decade as monitored by the Internet Archive’s Television News Archive via the GDELT Project, the timeline below shows the percentage of CNN, MSNBC and Fox News’ combined airtime June 2009 to April 2019 that mentioned privacy.
It seems the impact of Edward Snowden’s disclosures in June 2013 was short lived at best. Over the past decade there has been no appreciable long-term increase in coverage mentioning “privacy.” In fact, overall there has been a slight decrease. Even BBC News London shows no increase since the Archive began monitoring it in January 2017, suggesting Europe’s greater focus on privacy has not translated into greater media coverage of the term.


Looking to worldwide online news coverage in the 65 languages monitored by GDELT since January 2017, the impact of the Cambridge Analytica scandal can be seen more clearly.
Global coverage nearly doubled in the year since the scandal broke, but has since plateaued, with no substantial increase even as Facebook has racked up privacy issue after privacy issue and the steady drumbeat of global data breaches has accelerated.

Worldwide web searches via Google Trends from 2004 to 2019, using Google’s “Privacy” topic (which includes its translations into languages across the world), show that we searched far more about privacy in 2004. Something changed between 2004 and 2007, as we searched less and less about it; from 2007 to 2019, we seem to have cared little about privacy.

We do seem to care more about privacy-related news, however. Looking at worldwide Google News searches about privacy from 2008 to 2019, we see that 2009 marked the rise of news-related interest in privacy, but that overall news searches about privacy have remained relatively stable. Facebook’s privacy shift in May 2010 caused a brief spike in searches, as did the Obama White House’s proposed “Privacy Bill of Rights” in February 2012. Other than these two brief blips, though, interest seems to have waned until last year’s Cambridge Analytica story. Yet even a year of Facebook privacy stories seems to have increased interest only slightly.
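One caveat when reading these charts: Google Trends does not report absolute search volumes. Each series is rescaled so its peak equals 100, which is why the 2004 high point dominates everything after it. A minimal sketch of that normalization, using invented sample volumes:

```python
def trends_scale(counts):
    """Rescale raw counts the way Google Trends presents them:
    relative to the series peak, as 0-100 integers."""
    peak = max(counts)
    return [round(100 * c / peak) for c in counts]

# Invented yearly search volumes illustrating a 2004 peak and later decline.
raw = [5000, 4100, 2600, 1500, 1400, 1450]
print(trends_scale(raw))  # the peak year scores 100; later years far lower
```

Because the scale is relative, a flat-looking recent period can still hide meaningful absolute volume; the charts only show interest shrinking relative to the peak.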
Of course, the concept of privacy can be expressed in myriad ways and searches for stories like Cambridge Analytica are implicitly searches about digital privacy. It could certainly be the case that we still care about privacy, but simply search about it using different terminology or more tactically, researching specific privacy-related stories we believe have the greatest impact on ourselves, rather than researching privacy as a whole.

Regardless, it is intriguing that at least the literal term “privacy” is fading overall from both media coverage and search interest.

Putting this all together, perhaps the greatest contribution of the digital world has been not the introduction of global access to information, but rather the completion of the great governmental dream of eliminating privacy once and for all. It seems Orwell’s 1984 was just a few decades too early.



US legal eagle: Well done, you bought privacy compliance tools. Doesn’t mean you comply with anything

February 25, 2019

From California state regs to Europe’s GDPR: It’s all just a ‘veneer of protection’

By Rebecca Hill, 25 Feb 2019
Much-lauded privacy laws risk being undermined as compliance is outsourced to tech vendors and “toothless trainings, audits and paper trails” are confused for genuine protections, a New York Law School professor has said.

In a paper in the Washington Law Review, published online last week, Ari Ezra Waldman argued that recently strengthened privacy laws actually offer “false promises” for consumers.

He said that laws like the European Union’s GDPR or California’s state privacy rules are failing to deliver on their promised protections partly because of the “booming market” in tech vendors hawking privacy compliance tools.

“The responsibility for fulfilling legal obligations is being outsourced to engineers at third-party technology vendors who see privacy law through a corporate, rather than substantive, lens,” he wrote.

“Toothless trainings, audits, and paper trails, among other symbols, are being confused for actual adherence to privacy law, which has the effect of undermining the promise of greater privacy protection for consumers.”

The problem is heightened because, as they fear increasing fines under the new laws, organisations – particularly those without the cash to build tools in-house or hire in experts – are more likely to look for a quick fix.

However, Waldman warned that this could have knock-on effects: organisations buying shoddy kit risk non-compliance, and both the vendors’ long-term prospects and consumers stand to suffer.

“Not all innovation is good innovation,” Waldman said. “Companies that develop shoddy products may lose out in the market in the long term, but in the short and medium term, they risk putting millions of persons’ data at risk.”

‘Symbols of compliance standing in for real protections’
The paper aimed to emphasise the importance of privacy laws by pointing to Facebook’s “cavalier” approach to data protection, mobile app platforms that “routinely sweep in user data” because they can, and even academics’ interest in hoovering up personal info as part of studies.

As the implications of such mass data hoarding, harvesting and hawking have come to light, a set of comprehensive international privacy laws have been drawn up – but Waldman said that, in reality, the law’s “veneer of protection is hiding the fact that it is built on a house of cards”.

He pins much of this on the burgeoning “privacy outsourcing market” and the idea that third-party tech vendors “instantiate their own vision of the law into their services” to fling at organisations desperate to avoid whopping fines.

The argument is based on a socio-legal principle of “legal endogeneity”, first mooted by academic Lauren Edelman. This is when the law is shaped by ideas emerging from the space it seeks to regulate, rather than constraining or guiding those organisations’ behaviour.

It occurs when “ambiguously worded legal requirements” allow compliance professionals on the ground to define what the law means in practice – and in the case of privacy laws, much of this comes down to tech vendors and compliance professionals.

Some of the law’s most important premises – like privacy by design or consent – “are so unclear that professionals on the ground have wide latitude to frame the law’s requirements, kicking endogeneity into high gear”.

Tech can’t save you – but everyone wants it to
Mixed in with this is the fact that both private and public bodies have (misplaced) faith in technology to solve their problems; meanwhile the threats of financial penalties make organisations “uniquely susceptible to promises that vendors can make their troubles disappear”.

This opens the door to vendors selling compliance, and Waldman said that there are 200-plus firms that “instantiate their own interpretations of privacy law into the designs of automated tools, often marketing themselves as one-stop compliance shops”.

The author – hoping to see off any “not all vendors!” comebacks – emphasised that he isn’t saying every firm is part of the problem, nor that they alone are responsible for undermining the promise of privacy law.

Instead, Waldman said that the impact of privacy tech vendors on the legal frameworks is “both significant and under-explored” – and aimed to probe this by assessing the claims made by 165 companies listed in a 2018 report (PDF) from the International Association of Privacy Professionals.

He found that almost three-quarters had at some point positioned their products and services as achieving GDPR compliance – when most are designed to meet just two or three of the GDPR’s requirements, “if that”.

‘Privacy law can’t be broken down into code-able pieces’
A further issue described in the paper is that, by promoting these tools for compliance, vendors are attempting to reduce the law into “code-able pieces” when the law is about more than just paper trails and data maps.

“Such under-inclusive compliance technologies may then have the effect of increasing corporate exposure to administrative fines if in-house constituencies confuse purchasing a compliance technology that does a few things with actually solving a problem,” Waldman wrote.

He also posits the idea that this could lead to an imbalance between firms that have to outsource because they lack the money or time to recruit legal experts or build their own tools in-house, and those that can afford to do this.

Meanwhile, consumers are being disempowered because they are increasingly faced with tech-driven conversations about compliance based on black-box algorithms. This also risks “erasing” traditional safeguards that see the law interpreted in the open and on the public record.

Waldman proposed lawmakers edge away from “transactional visions of privacy law that are susceptible to symbolic structures”, as well as calling on the US Federal Trade Commission to be “more active vendor regulators” with better audits.

For vendors, he called for “more modest approaches” that include hiring lawyers and professionals and establishing a closer relationship with regulators, possibly including certification.

Possible products and services include summaries and comparisons of legislation, training courses and tools that scan the data a company has to seek out personal information.

He also called for further research that puts vendors in an ecosystem of social forces that influence the implementation of privacy law on the ground, as well as work on the problem of privacy education for engineers.



Give To Get: Sensing, Tracking And Your Privacy

February 11, 2019

Feb 10, 2019
By Tracy Brower: I write about the changing nature of work, workers and the workplace.

What are you willing to give up when it comes to your privacy? In the end, it depends on what you get in return—the give to get equation. Technology can track us anywhere. As it gains momentum, and as we increasingly choose to be tracked, we also give up some privacy. But, what do we gain?

Your Fitbit knows your steps, your water intake and your weight. MapMyWalk knows your whereabouts and your level of workout discipline. Your ride sharing app knows the route you took, what time you traveled, and how much you paid. Facebook knows your friends, your sentiments, and so much more. If you use these apps, you believe that you get something in return that is worth the personal information you share—better health, convenient transportation or connections with a community of friends.

It’s Only The Beginning

These are only the beginning, of course. New contact lenses for diabetics sense glucose levels. A dress outfitted with sensors tracked how often women were touched at a nightclub. There is even a diaper that senses when a baby needs a change. And China is rolling out a social credit system to assess citizens’ economic and social reputation.

Employee Monitoring

Increasingly, our employers know a lot about us as well. Sensing systems track handwashing compliance in medical workers. In an effort to curtail theft, the TSA established a surveillance system for its employees. Unfortunately, it met with unintended consequences: employees felt devalued and did everything they could to stay out of the cameras’ view.

Consequences of these capabilities and systems – both intended and unintended – carry both promise and peril. Growth in sensing technology will have far-reaching implications for our social norms and systems. Data gathering is not inherently negative; what matters is how transparent companies are in gathering data and the choices they make about how the data is used.

Some companies provide discounts on health insurance in exchange for the use of a health app into which employees enter their most personal health information. Companies track worker locations through badging data and gather talent information through performance management systems.

None of this is bad, necessarily, but what do employees receive in return? The risk is that the answer to that question is “not much”. The pendulum can swing toward the value of data for the company – where organizations are getting a lot of data – without much of a give-back to employees. But there is an opportunity to level the playing field: ensuring employees get the message that they are trusted, and that tracking delivers value for them as much as for the company.

Give To Get

What if the data that is collected via your email or badge tracking could come back in a way that helps you work more effectively? What if the data you enter into the talent management system could help you curate your career, enhance your job fit, or match you with a mentor? What if the data you enter into the company health program could help you maintain fitness and reduce stress?

“Give to get” is the balancing of what the company receives in terms of the data it collects against the value it returns to employees. The best companies are those that do both: extract data that helps them achieve organizational results and provide value to employees as well.

Constructive Cultures

For organizations, the holy grail is a constructive, productive culture where people want to work, put in discretionary effort and contribute their best skills. The best cultures are transparent, sharing openly so employees can make the most informed decisions. They seed innovation by fostering appropriate risk-taking, and they encourage employees to share and explore. This type of culture can be at odds with the need to protect security and privacy through limited information sharing and confidentiality. The ways companies gather, track and monitor information send important messages to employees about value and trust.

Companies can balance the need for both security and privacy by educating people about why they’re gathering information and being as transparent as possible. Trust and positive culture are also enhanced by providing more choice and control—giving employees the opportunity to opt out of data gathering when it’s possible. Ultimately, companies need to do what’s right—not just what’s possible—by using their values as a guide.

Resolving The Tension

It’s a tension and the idea of “give to get” is one way to resolve it. When companies extract data from employees and consumers, they must give back as well.

