Free Encrypted Email

Posts Tagged ‘privacy’

Give To Get: Sensing, Tracking And Your Privacy

February 11, 2019

Feb 10, 2019, 06:00pm
By Tracy Brower: I write about the changing nature of work, workers and the workplace.

What are you willing to give up when it comes to your privacy? In the end, it depends on what you get in return—the give to get equation. Technology can track us anywhere. As it gains momentum, and as we increasingly choose to be tracked, we also give up some privacy. But, what do we gain?

Your Fitbit knows your steps, your water intake and your weight. MapMyWalk knows your whereabouts and your level of workout discipline. Your ride sharing app knows the route you took, what time you traveled, and how much you paid. Facebook knows your friends, your sentiments, and so much more. If you use these apps, you believe that you get something in return that is worth the personal information you share—better health, convenient transportation or connections with a community of friends.

It’s Only The Beginning

These are only the beginning, of course. New contact lenses for diabetics sense glucose levels. A dress outfitted with sensors tracked how often women were touched at a nightclub. There is even a diaper that senses when a baby needs a change. In addition, China is beginning to use a social credit system to assess economic and social reputation.

Employee Monitoring

Increasingly, our employers know a lot about us as well. Sensing systems track handwashing compliance in medical workers. In an effort to curtail theft, TSA established a surveillance system for its employees. Unfortunately, it met with unintended consequences in which employees felt devalued and did everything they could to stay out of the cameras’ view.

Consequences of these capabilities and systems – both intended and unintended – hold both promise and peril. Growth in sensing technology will have far-reaching implications for our social norms and systems. Data gathering is not inherently negative; what matters is how transparent companies are in gathering data and the choices they make about how the data is used.
Some companies provide discounts on health insurance in exchange for the use of a health app into which employees enter their most personal health information. Companies also track worker locations through badging data, and talent information through performance management systems.

None of this is bad, necessarily, but what do employees receive in return? The risk is that the answer to that question is “not much.” The pendulum can swing toward the value of data for the company—where organizations are getting a lot of data—without much of a give-back to employees. But there is an opportunity to level the playing field for employees—ensuring employees receive a message that they are trusted and that tracking delivers value for them as much as for the company.

Give To Get

What if the data that is collected via your email or badge tracking could come back in a way that helps you work more effectively? What if the data you enter into the talent management system could help you curate your career, enhance your job fit, or match you with a mentor? What if the data you enter into the company health program could help you maintain fitness and reduce stress?

“Give to get” is the balancing of what the company receives in terms of the data it collects against the value it returns to employees. The best companies do both—extract data that helps them achieve organizational results and provide value to employees as well.

Constructive Cultures

For organizations, the holy grail is constructive productive cultures where people want to work, make discretionary effort and contribute their best skills. The best cultures are transparent – sharing openly so employees can make the most informed decisions. They seed innovation by fostering appropriate risk taking, and encourage employees to share and explore. This type of culture can be antithetical to a need to protect security and privacy through limited information sharing and confidentiality. The ways companies gather, track, and monitor information send important messages to employees about value and trust.

Companies can balance the need for both security and privacy by educating people about why they’re gathering information and being as transparent as possible. Trust and positive culture are also enhanced by providing more choice and control—giving employees the opportunity to opt out of data gathering when it’s possible. Ultimately, companies need to do what’s right—not just what’s possible—by using their values as a guide.

Resolving The Tension

It’s a tension and the idea of “give to get” is one way to resolve it. When companies extract data from employees and consumers, they must give back as well.


Wise words on privacy, insurance company fined for privacy breach, and secure that email

February 5, 2019

Howard Solomon @howarditwc
Published: February 4th, 2019
Wise words on privacy from a Canadian expert, a U.K. insurance company fined for mixing business and politics over Brexit, and how to secure that email.

Welcome to Cyber Security Today. It’s Monday February 4th.
I was at a privacy conference in Toronto last week where I heard the respected Canadian expert Ann Cavoukian speak. She reminded attendees that privacy and security go together: they aren’t opposites. In fact, she said, privacy is essential to innovation. Companies that do both privacy and security will have an advantage over competitors because customers will trust them more. Is it costly to improve the control customers have over their personal information, including giving them the ability to refuse to allow their personal data to be re-used or shipped to another firm? Maybe. But, Cavoukian adds, that’s nothing compared to the damage to your brand, the loss of trust and the lawsuits that result from a data breach.

Speaking of privacy breaches, the U.K. information commissioner has fined a British insurance company that sent over one million emails to subscribers of the Leave.EU Brexit campaign without their full consent three years ago. The campaign was fined as well, for unlawfully using the insurance company to send 300,000 political marketing messages to customers. “It is deeply concerning that sensitive personal data gathered for political purposes was later used for insurance purposes, and vice versa,” said the information commissioner. You can read her full report online.
Fake email, where an attacker uses a phony “from” address, a deceptive domain or a display name that impersonates a familiar company, is behind many successful data breaches. Someone clicks on a link or opens an attachment, and in seconds they’re infected. If only there were a way to authenticate where email comes from. Actually, there is: an open standard called DMARC. The good news is that more companies are using it. The bad news, according to security vendor Valimail, is that not enough of them are doing it, nor are they configuring it correctly. In a study released on Friday, the company said 80 per cent of U.S. federal government domains now use DMARC. By comparison, at least 50 per cent of Fortune 500 and large U.S. tech companies have adopted DMARC. Does your company have a way of authenticating the email it sends? You should ask.
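For the curious, a DMARC policy is just a DNS TXT record published at _dmarc.<domain>: a semicolon-separated list of tag=value pairs. As a rough illustration (the domain and the record below are made up for the example, not a real lookup), here is how such a record breaks down:

```python
def parse_dmarc(record):
    """Split a DMARC TXT record into its tag=value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            # partition at the first "=", so values like mailto: URLs survive
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

# Hypothetical record for an example domain:
record = "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com; pct=100"
policy = parse_dmarc(record)
# policy["p"] is what receivers should do with mail that fails
# authentication: "none" (monitor only), "quarantine" or "reject"
```

The study’s point about misconfiguration is visible right in that one tag: a domain can publish `p=none`, which only reports on spoofing and blocks nothing, so merely “using DMARC” is much weaker than actually enforcing it.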

Finally, ever wonder how cellphone companies co-ordinate the billions of phone calls and text messages sent around the world? They do it through a protocol called SS7. This protocol has vulnerabilities. Until recently it was thought only intelligence agencies could break into it, but last week the news site Motherboard confirmed a British bank was victimized by an SS7 hack, so it seems cybercriminals now have the capability as well. What this means is that the six-digit codes a financial institution sends you by text message as part of a two-factor authentication login are increasingly at risk of being intercepted. I’ve talked about this before: the standard text messaging app that comes with many smartphones may not be safe enough for two-factor authentication. Some cellphone companies say they have taken steps to better secure their text messaging. But if you use an email, company login or bank app that offers two-factor authentication in addition to a username and password, see if it can deliver the special code through a safer authenticator app. Four of them are Google Authenticator, Authy, Authenticator Plus, and Duo.
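Authenticator apps like those four sidestep SS7 entirely: the code never travels over the phone network at all. It is computed on the device from a shared secret and the current time using the open TOTP standard (RFC 6238). As a minimal sketch of that standard (not any vendor’s actual implementation), the algorithm looks like this:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over a 30-second time counter."""
    counter = int(time.time() if for_time is None else for_time) // step
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 6238 test secret (the ASCII bytes "12345678901234567890") and a timestamp of 59 seconds, this yields the published test value "287082". Because the server holds the same secret, it can recompute and verify the code locally, with nothing crossing the SS7 network for an attacker to intercept.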

That’s it for Cyber Security Today. Subscribe on Apple Podcasts, Google Podcasts or add us to your Flash Briefing on your smart speaker. Thanks for listening.


Privacy is a human right, we need a GDPR for the world: Microsoft CEO

January 28, 2019

This article is part of the World Economic Forum Annual Meeting
24 Jan 2019
Ceri Parker
Commissioning Editor, Agenda, World Economic Forum

Against the backdrop of a “techlash”, the CEO of Microsoft called for new global norms on privacy, data and Artificial Intelligence.

Satya Nadella, who has been shifting Microsoft’s focus to cloud computing, said he would welcome clearer regulations as every company and industry grappled with the data age.
In a talk at Davos, he praised GDPR, the European regulation on data protection and privacy that came into force last year.

“My own point of view is that it’s a fantastic start in treating privacy as a human right. I hope that in the United States we do something similar, and that the world converges on a common standard.”

The default position had to be that people owned their own data, he said.

Privacy is just one controversial area for tech companies. Nadella also addressed the growing field of facial recognition.
“It’s a piece of technology that’s going to be democratized, that’s going to be prevalent, I can come up with 10 uses that are very virtuous and important and can improve human life, and 10 uses that would cause problems,” he said.

Microsoft’s own website lists the following as applications to celebrate:

“Police in New Delhi recently trialed facial recognition technology and identified almost 3,000 missing children in four days. Historians in the United States have used the technology to identify the portraits of unknown soldiers in Civil War photographs taken in the 1860s. Researchers successfully used facial recognition software to diagnose a rare, genetic disease in Africans, Asians and Latin Americans.”

But the dark sides include invasion of privacy and bias. While Microsoft has built a set of principles for the ethical use of AI, Nadella said that self-regulation was not enough.

“In the marketplace there’s no discrimination between the right use and the wrong use… We welcome any regulation that helps the marketplace not be a race to the bottom.”


The views expressed in this article are those of the author alone and not the World Economic Forum.


Privacy Problems Mount for Tech Giants

January 21, 2019

By Sam Schechner
Jan. 21, 2019 6:30 a.m. ET

Big tech companies have taken a public lashing in the past year over their handling of users’ personal information. But many of their biggest privacy battles have yet to be fought—and the results will help determine the fate of some of the world’s largest businesses.

So far, tech giants like Facebook Inc. and Alphabet Inc.’s Google have proved relatively resilient against a growing backlash over possible abuse of their users’ personal privacy. Tech companies’ stocks may have swooned, but advertisers are continuing to cut them checks, and their profits are still growing at double-digit rates that would earn most CEOs a standing ovation.

This year may be stormier. Growing discontent among users over privacy and other issues—such as the widespread feeling that mobile devices and social media are addictive—could damp profit growth, discourage employees or chase away ad dollars. In Europe, regulators are slated to make major rulings about tech companies’ privacy practices, likely setting off high-stakes litigation. In the U.S., revelations about allegedly lax privacy protections are raising political pressure for federal privacy regulation.

At risk are tens of billions of dollars that marketers spend every year in online advertisements targeted at users with the help of personal information about individuals’ web browsing, mobile-app usage, physical location and sometimes other data, like income levels.

The behavior of tech giants is likely to be a major topic at the World Economic Forum this week in Davos, Switzerland. While the yearly meeting of world leaders and company executives normally celebrates how businesses can solve the world’s problems, tech companies were on the defensive last year against complaints that ranged from fomenting political polarization to building artificial intelligence that will displace millions of workers.

Since then, the pressure has increased. Facebook executives have been dragged before legislators on both sides of the Atlantic, after the company said data related to as many as 87 million people may have been improperly shared with Cambridge Analytica, a political analytics firm. And in September, Facebook said hackers had gained access to nearly 50 million accounts.

Google, meanwhile, has faced criticism of its privacy practices from political leaders, including flak after The Wall Street Journal reported that the company had exposed the private data of hundreds of thousands of users of its Google+ social network and opted initially not to disclose it.

Some tech executives have raised alarms, too. Apple Inc. Chief Executive Tim Cook, speaking in October before a privacy conference organized by the European Union, called for tighter regulation in the U.S. along the lines of a strict new privacy law in the EU, saying that some companies had “weaponized” users’ personal information in what he described as a “data-industrial complex.”

Facebook and Google both say that they have been investing heavily in improving how they protect user privacy and that they welcome tighter privacy rules; both companies support passage of a U.S. federal privacy law. Tech-industry lobbyists say they are planning to support U.S. privacy legislation over the coming year, in part to avoid contending with a patchwork of laws like one passed last year in California.

“Our industry strongly supports stronger privacy protections for consumers,” says Josh Kallmer, executive vice president for policy at the Information Technology Industry Council, which represents Facebook, Google and other tech companies. Mr. Kallmer says consumers “benefit incredibly from these technological innovations,” but adds that “alongside that are some very legitimate concerns about how data is being handled.”

What impact will stricter privacy rules have? There are two theories.

One school of thought says that stricter rules and tighter enforcement will benefit big, incumbent companies that already have access to large amounts of user data and can spend more heavily on legal-compliance efforts. The other argues that rules like those in the EU’s new General Data Protection Regulation, if strictly applied, will force significant changes to how the biggest tech companies collect and analyze individuals’ personal information—undercutting their advertising businesses and weakening their advantage over existing or potential new competitors.

“Both are reasonable claims. But it is far too early to tell which will turn out to be true,” says Alessandro Acquisti, a professor at Carnegie Mellon University who studies the behavioral economics of privacy.

At issue, in part, is the distinction between short-term and long-term effects. There are signs that Google, for one, benefited at least initially from the transition to the GDPR in May, in part because advertisers shifted money to the bigger firms, which were able to show they had users’ consent to display targeted ads.

In Europe, Google saw a 0.9% increase in the share of websites that include its advertising trackers two months after the GDPR went into effect compared with two months before, according to Cliqz, which makes antitracking tools for consumers. Facebook’s share declined 6.7%. The share for the other top 50 online-ad businesses fell more than 20%.

The longer-term impact on big firms is harder to predict. One study of nearly 10,000 online display advertising campaigns showed that users’ intent to purchase products was diminished after earlier EU laws restricted advertisers’ ability to collect data in order to target those ad campaigns. But more research is needed to determine what impact tighter rules would have on consumer spending more broadly, Prof. Acquisti says.

How the laws are enforced by regulators and courts will play an important role. Ireland’s Data Protection Commission, which is the EU’s lead regulator for Facebook and Google, is investigating complaints from privacy activists that the consent companies sometimes request for the processing of individuals’ data is a condition of using a service and so is not “freely given,” as the law requires.

In Germany, the federal antitrust enforcer says it will issue early this year a final decision regarding its preliminary finding that Facebook uses its power as the most popular social network in the country to strong-arm users into allowing it to collect data about them from third-party sources. A German decision wouldn’t involve fines, but could include orders to change business practices.

Both Facebook and Google say they comply with privacy laws.

Initial decisions could come this year, but whichever way the watchdogs come down, their actions are likely to end up reviewed in court. Those cases will end up determining how new privacy standards will be applied. And that will determine how profound their impact is.

“There is active litigation in a couple of places that could become hugely important,” Mr. Kallmer says. “It’s uncertainty that our industry thinks it’s on the right side of.”

Mr. Schechner is a Wall Street Journal reporter in Paris. Email sam.schechner@wsj.com.


Why data privacy is hot and machine learning is not

January 15, 2019

By Rafael Laguna, Contributor
Looking back on the past twelve months, we will all remember cybersecurity scares, revelations of data malpractices, and countless large-scale data breaches. Allegations ranged from Google’s non-consensual tracking of user location data to (unlikely) instances of China covertly installing microscopic spy chips on US tech hardware.

It’s clear that data privacy will underpin innovation and technological advancement in 2019, while the latest buzzword tech is set to go the way cryptocurrency went last year (who saw that coming?). As a member of the recently fashionable – not to mention increasingly lucrative – open source community, I wanted to share my predictions for the year ahead.

The bellwether blunder from the past 12 months has to be the Facebook-Cambridge Analytica data scandal, where it was alleged that the poached data of millions was used to influence voter behavior for key political decisions, including Trump’s election and the Brexit referendum.

Thus Mark Zuckerberg’s purported clandestine global practices were brought to the forefront of public debate, which according to one survey prompted five per cent of Brits to delete their Facebook accounts.

For most of us in the open source community, it really feels like we are at a tipping point. Finally, the general public are beginning to understand what we have known for a long time: people are waking up to the realization that their data is being used and misused for heinous purposes.

That, coupled with the fact that millennials are rapidly falling out of love with the platform, suggests that in 2019 we can expect to take one step closer to the “grown-up” internet – sometimes called the Web 3.0.

Pressure will continue to mount for companies to embrace user-friendly data privacy and ownership. The shift in cultural expectations, as well as the introduction of data legislation (e.g. GDPR), will force the “walled gardens” to stop siloing data without disclosing how they use it. Otherwise users will flee, and the network effect will work in reverse: once one domino falls, they all fall.

Although we have seen a rise of alternatives trying to take down the big silos – sometimes utilizing open source technology – we are yet to witness a viable alternative to Facebook, Twitter, Google, etc. But as public figures continue to call for change, I expect that in 2019 we will see more and more attempts at a federated, distributed messaging system competitor. And somebody might just get it right.

Downswing: Machine learning
On the back of the rollercoaster ride that was the Bitcoin bubble, blockchain was the buzzword at the beginning of 2018. And yet as we move into 2019, only the staunchest of fans will be defending its – eternally-yet-to-be-established – application as a solution for problems that we may or may not face in the future.

I won’t be bucking any trends with this prediction: it’s very likely that blockchain will continue its way along the downswing phase of its hype cycle in the new year.

But one hot topic that I feel most people are giving too much credit to is machine learning. OK, so it isn’t blockchain – and it is certainly true that companies have done some great things with it – but those tend to be very specialized applications. It seems that for the broader, more headline-grabbing applications, the problems are harder to solve than we ever realized.

Take the example of autonomous vehicles. We’ve been promised self-driving cars for what feels like an age, and yet how close are we really to achieving full autonomy? When will we see the first example of a driverless car that is able to react to all situations that you encounter on the road? Certainly not in 2019.

With the machine learning approach, it’s become clear that autonomous cars are unable to recognize the social component of driving on roads full of human motorists. Eye-contact and social cues are essential to ensure road safety, and as humans this comes easily to us. For machines, it’s a different story – and teaching them to recognize nuanced human communication is going to be a difficult task no matter how many CAPTCHAs of road traffic signs and storefronts we complete.

Then on top of all this, yet again there’s the privacy element. Cars collect a lot of data about us, including where we’ve been and when. And it’s not always apparent who this data is sold to. It may be ‘anonymized,’ but location data is one of the most identifiable types of personal information a company can hold on a user. People need to have more control over this.

As far as the future of autonomous vehicles is concerned, the key will be networking all of the cars using open standards and open protocols. Once all the cars are connected, there’s no need for machines to understand the complexities of human communication. This is the open source approach and it’s how we created the internet, so why not create the Internet of Cars?

Granting users control
The past year has been an eventful time for the wider tech sector – and amid the Teslas in space and genetic bioengineering, we’ve taken several leaps forward in regards to bringing issues surrounding data privacy into the public sphere of influence.

Facebook’s meltdown quite conveniently coincided with the introduction of GDPR, acting as a catalyst for debate over data rights, privacy and ownership. At the other end of the scale, revolutionary technologies we once thought would already be upon us continue to be pushed back.

We now begin 2019 with interest in open source solutions at an all-time high. And I cannot help but anticipate that innovators will continue to apply this technology, taking control away from the silos and granting it back to the users.

