
Posts Tagged ‘privacy’


Private Blockchains Could Be Compatible with EU Privacy Rules, Research Shows

November 12, 2018

Private blockchains, such as interbanking platforms set up to share information on customers, could be compatible with new E.U. privacy rules, according to research published Nov. 6. The study was conducted by Queen Mary University of London and the University of Cambridge, U.K.
The General Data Protection Regulation (GDPR), recent legislation that regulates the storage of personal data for all individuals within the European Union, came into effect this May. Under the law, all data controllers have to respect citizens’ rights when keeping and transferring their private information. If a data controller fails to do so, the potential fines are set at €20 million (about $22 million) or four percent of global turnover/revenues, whichever is higher.
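That fine ceiling is simply the larger of two quantities. A minimal sketch (the function name and the turnover figures are illustrative, not drawn from the regulation’s text):

```python
def gdpr_fine_cap(global_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine for a serious infringement:
    EUR 20 million or 4% of global annual turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * global_turnover_eur)

# A firm with EUR 1 billion in annual turnover: 4% is EUR 40 million,
# which exceeds the EUR 20 million floor.
print(gdpr_fine_cap(1_000_000_000))  # 40000000.0

# A firm with EUR 100 million in turnover: 4% is only EUR 4 million,
# so the EUR 20 million figure applies instead.
print(gdpr_fine_cap(100_000_000))  # 20000000.0
```

The "whichever is higher" wording is what makes the cap scale with company size rather than sitting at a flat amount.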

The recent U.K. study, published in the Richmond Journal of Law and Technology, views blockchain and its nodes through the lens of the GDPR. According to the researchers, crypto-related technologies could fall under these rules and be treated as “controllers,” given that they publicly store private information about E.U. citizens in the chain and allow third parties to operate it. This, the study reveals, might slow down technology adoption in the EU:

“There is a risk that this legal uncertainty will have a chilling effect on innovation, at least in the EU and potentially more broadly. For example, if all nodes and miners of a platform were to be deemed joint controllers, they would have joint and several liability, with potential penalties under the GDPR.”

However, the researchers emphasize that blockchain operators could instead be treated as “processors,” the same as the companies behind cloud technologies that act on behalf of users rather than control their data. This, the study continues, is mostly applicable to Blockchain-as-a-Service (BaaS) offerings, where a third party provides the supporting infrastructure for the network while users store their data and control it personally.

As an example of such a blockchain platform, the researchers cite centralized platforms for land registry and private interbanking solutions that set up “a closed, permissioned blockchain platform with a small number of trusted nodes.” Such closed systems could effectively comply with GDPR rules, the report continues.

To meet the privacy law, blockchain networks might also store personal data externally or allow trusted nodes to delete the private key for encrypted information, thus leaving indecipherable data on the chain, the researchers state.
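The key-deletion approach can be illustrated with a toy sketch (Python standard library only; the XOR cipher below is a stand-in for real authenticated encryption, and the whole scenario is a hypothetical illustration of the researchers’ suggestion, not an actual platform’s implementation):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR each byte with a same-length key (illustration only).
    return bytes(b ^ k for b, k in zip(data, key))

# Personal data never touches the chain in the clear: only ciphertext is stored.
record = b"alice@example.com"
key = secrets.token_bytes(len(record))   # held off-chain by a trusted node
on_chain = xor_cipher(record, key)

# While the key exists, the record is recoverable.
assert xor_cipher(on_chain, key) == record

# "Erasure": the trusted node deletes its copy of the key. The ciphertext
# stays on the immutable chain but is now indecipherable, which is the state
# the researchers suggest could satisfy a GDPR erasure request.
del key
```

The design choice here is that the chain itself is never modified; erasure is achieved entirely off-chain by destroying the only means of decryption.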

However, the GDPR rules are extremely difficult to comply with for more decentralized nets, such as those concerned with mining and cryptocurrency. In this case, the nodes, operating with the data of E.U. citizens, might agree to fork a new version of the blockchain from time to time, thus reflecting mass requests for rectification or erasure. “However, in practice, this level of coordination may be difficult to achieve among potentially thousands of nodes,” the study reads.

In conclusion, the researchers urge the European Data Protection Board, an independent regulatory body behind the GDPR, to issue clearer guidance on the application of data protection law to various common blockchain models.

As Cointelegraph wrote earlier, the GDPR could both support and harm blockchain. Although current E.U. legislation partly shares the same goals as crypto-related technologies, such as decentralizing data control, blockchain companies could also face extremely high fines as data controllers.



Just Don’t Call It Privacy

September 23, 2018

What do you call it when employers use Facebook’s advertising platform to show certain job ads only to men or just to people between the ages of 25 and 36?

How about when Google collects the whereabouts of its users — even after they deliberately turn off location history?

Or when AT&T shares its mobile customers’ locations with data brokers?

American policymakers often refer to such issues using a default umbrella term: privacy. That at least is the framework for a Senate Commerce Committee hearing scheduled for this Wednesday titled “Examining Safeguards for Consumer Data Privacy.”

After a spate of recent data-mining scandals — including Russian-sponsored ads on Facebook aimed at influencing African-Americans not to vote — some members of Congress are now rallying behind the idea of a new federal consumer privacy law.
At this week’s hearing, legislators plan to ask executives from Amazon, AT&T, Google, Twitter and other companies about their privacy policies. Senators also want the companies to explain “what Congress can do to promote clear privacy expectations without hurting innovation,” according to the hearing notice.

There’s just one flaw with this setup.

In a surveillance economy where companies track, analyze and capitalize on our clicks, the issue at hand isn’t privacy. The problem is unfettered data exploitation and its potential deleterious consequences — among them, unequal consumer treatment, financial fraud, identity theft, manipulative marketing and discrimination.
In other words, asking companies whose business models revolve around exploiting data-based consumer-influence techniques to explain their privacy policies seems about as useful as asking sharks to hold forth on veganism.

“Congress should not be examining privacy policies,” Marc Rotenberg, the executive director of the Electronic Privacy Information Center, a prominent digital rights nonprofit, told me last week. “They should be examining business practices. They should be examining how these firms collect and use the personal data of customers, of internet users.”

The Senate Commerce hearing, however, doesn’t seem designed to investigate commercial surveillance and influence practices that might merit government oversight.
For one thing, only industry executives are currently set to testify. And most of them are lawyers and policy experts, not engineers versed in the mechanics of data-mining algorithms.

Companies are sending their “policy and law folks to Washington to make the government go away — not the engineering folks who actually understand these systems in depth and can talk through alternatives,” Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, told me.

That may be because Congress is under industry pressure.

California recently passed a new privacy law that would give Californians some power over the data companies hold on them. Industry groups hope to defang that statute by pushing Congress to pass federal privacy legislation that would overrule state laws. The industry-stacked Senate hearing lineup seems designed to pave the way for that, said Danielle Citron, a law professor at the University of Maryland.

Frederick Hill, a spokesman for the Senate Commerce Committee, said the group planned future hearings that would include other voices, such as consumer groups. But “for the first hearing,” Mr. Hill said, “the committee is bringing in companies most consumers recognize to make the discussion about privacy more relatable.”

What is at stake here isn’t privacy, the right not to be observed. It’s how companies can use our data to invisibly shunt us in directions that may benefit them more than us.

Many consumers know that digital services and ad tech companies track and analyze their activities. And they accept, or are at least resigned to, data-mining in exchange for conveniences like customized newsfeeds and ads.

But revelations about Russian election interference and Cambridge Analytica, the voter-profiling company that obtained information on millions of Facebook users, have made it clear that data-driven influence campaigns can scale quickly and cause societal harm.
And that leads to a larger question: Do we want a future in which companies can freely parse the photos we posted last year, or the location data from the fitness apps we used last week, to infer whether we are stressed or depressed or financially strapped or emotionally vulnerable — and take advantage of that?

“Say I sound sick when I am talking to Alexa, maybe they would show me medicine as a suggestion on Amazon,” said Franziska Roesner, an assistant professor of computer science at the University of Washington, using a hypothetical example of Amazon’s voice assistant. “What happens when the inferences are wrong?”

(Amazon said it does not use Alexa data for product recommendations or marketing.)

It’s tough to answer those questions right now when there are often gulfs between the innocuous ways companies explain their data practices to consumers and the details they divulge about their targeting techniques to advertisers.

AT&T’s privacy policy says the mobile phone and cable TV provider may use third-party data to categorize subscribers, without using their real names, into interest segments and show them ads accordingly. That sounds reasonable enough.

Here’s what it means in practice: AT&T can find out which subscribers have indigestion — or at least which ones bought over-the-counter drugs to treat it.

In a case study for advertisers, AT&T describes segmenting DirecTV subscribers who bought antacids and then targeting them with ads for the medication. The firm was also able to track those subscribers’ spending. Households that saw the antacid ads spent 725 percent more on the drugs than a national audience.

Michael Balmoris, a spokesman for AT&T, said the company’s privacy policy was “transparent and precise, and describes in plain language how we use information and the choices we give customers.”
But consumer advocates hope senators will press AT&T, Amazon and other companies this week to provide more details on their consumer-profiling practices. “We want an inside look on the analytics and how they’re categorizing, ranking, rating and scoring us,” Professor Citron said.

Given the increased public scrutiny, some companies are tweaking their tactics.

AT&T recently said it would stop sharing users’ location details with data brokers. Facebook said it had stopped allowing advertisers to use sensitive categories, like race or religion, to exclude people from seeing ads. Google created a feature for users to download masses of their data, including a list of all the sites Google has tracked them on.

Government officials in Europe are not waiting for companies to police themselves. In May, the European Union introduced a tough new data protection law that curbs some data-mining.

It requires companies to obtain explicit permission from European users before collecting personal details on sensitive subjects like their religion, health or sex life. It gives European users the right to see all of the information companies hold about them — including any algorithmic scores or inferences.

European users also have the right not to be subject to completely automated decisions that could significantly affect them, such as credit algorithms that use a person’s data to decide whether a bank should grant him or her a loan.

Of course, privacy still matters. But Congress now has an opportunity to press companies like Amazon on broader public issues. It could require them to disclose exactly how they use data extracted from consumers. And it could force companies to give consumers some rights over that data.



Privacy and security: no simple solution, warns Rachel Dixon

September 18, 2018

The tide is turning when it comes to privacy and security, with Australians gradually becoming more aware of the need to protect their personal data and the risks involved in sharing it.

Rachel Dixon, privacy and data protection deputy commissioner at the Office of the Victorian Information Commissioner, says that with public debates over My Health Record and new tech surveillance laws, the public is now more informed about these issues than ever before.

“Not that many years ago there was (a view) that privacy is dead,” she says. “That now sounds quite outdated. In some ways the conversation still does need to get more mature. But this has been a real watershed year for privacy issues making it to the mainstream.
“That’s a very good thing.”

According to Ms Dixon, the theme of the last decade broadly had been to “hoover up as much data as possible”, and that’s now shifting to a theme of “taking the data that is necessary to fulfil the function”.

“There’s been a change in people’s understanding around their privacy,” she says. “Increasingly they’re more concerned, and are less willing to hand over data in certain circumstances. A lot of the use of data now is looking at the risks involved.

“Humans historically have not been very good at calculating risk. That’s been terrific in the past, it’s allowed us to sail across oceans and go into space. But we’re not very good at it. So I want us to move to having a risk-based framework, and change the culture around assessing risk.”

Debate is currently raging as to whether Australian law enforcement agencies should have the right to decrypt smart devices to prevent and solve criminal activity, with ferocious opinions coming on both sides of the debate.

For former FBI agent Ed Stroz, the founder and co-president of Stroz Friedberg, the ability to thwart terrorist attacks is more important overall than the right to an individual’s privacy.

“You can see both sides of the issue. And it comes down to, ‘Do people have the right to privacy?’ I’m a little more sympathetic to the law enforcement side,” he says.

“People do value their privacy now, but if you have an encrypted phone held by a criminal, that creates a sacred category of evidence we’ve never had in our judicial system before. Out of the box, this engineering empowers adverse behaviour and that has big social effects.

“If we didn’t have that many adversaries around, it probably wouldn’t matter that much. But I weigh that aspect of it more heavily than valuing privacy overall. That’s a personal view that I have.”

Ms Dixon said encryption was a complex issue, and that there was no simple, obvious, single solution to the balance between privacy and security.

“If there was, we would have done it by now,” she says. “Chances are, the solution here is a combination of things. But the debate is definitely going to be messy. At least the discussion has raised some really good points. I would caution against looking for a simple answer or seeing the issue as binary. It’s not; these are healthy tensions between privacy, data protection and freedom of information.”

Marcin Kleczynski is chief executive of Malwarebytes, a security company he started as a 16-year-old. He said that while users had become more savvy about their own security and privacy, they were still generally the weak link when it comes to viruses and other attacks. “It takes a lot to always be patching your systems and keeping everything up to date,” he says. “There are so many damn security companies, I could name 60 or 70, but an attack still comes and no one’s ready.

“I’m fairly pessimistic about this stuff. I think we still haven’t solved a lot of the basics when it comes to security. We need a lot more user awareness training about security and storing your own information, there needs to be a lot more basic hygiene in place. We’re slowly getting there.”



Are you privacy literate?

September 11, 2018

Online privacy is a new literacy that educators and students need to learn and practice. But what should teachers consider before adopting a digital tool?

Below are the top five questions teachers should ask themselves when vetting new tools and platforms; these are the questions that tend to elicit the most red flags. You’ll find the answers to these questions in privacy policies and terms of use documents. As a rule, you won’t have to apply these questions to tools provided by your school for institution-wide use, such as Google Drive or your learning management system. These tools have already been vetted by the school and/or district.

Does the product collect Personally Identifiable Information? Personally identifiable information—or PII as it’s known—is data that’s tied to a student’s name or ID number. The relevant section in a privacy policy will be called something like “Information We Collect from You.” If the service collects your name and email address, that’s information needed to create an account, so that’s not alarming. Other commonly collected information includes the following:

Usage information that includes interactions with a website’s or product’s services
Device information such as unique device identifiers
Operating system information
Internet service provider
IP address
Dates and times of your log-ins and requests
Does the vendor scrub PII from deleted accounts? Say you decide to stop using a service, or a family decides that they’re not comfortable with their child using it. When students completely delete their accounts, the vendor is required by federal regulations to scrub all the PII. But if its documents do not explicitly state that it does, you’re taking the risk that it may not.

Does the vendor share the information they collect? With whom? Vendors may share information with third-party service providers that help them with cloud storage or data management, or that consult with them on improving their product. Someone needs to look through the privacy policies of each of those third-party services to make sure that they follow school-approved privacy conditions.

That can be a huge task and it’s why no district should require teachers to vet their own apps. This should be the responsibility of whoever is assigned to do vetting in your IT or legal department. But this helps you understand why sometimes it takes a while for an app to be vetted.

The privacy policy may also say that the company will disclose your personal information without notice if required to do so by law (this is standard). If they say something like “…or in the good faith belief that such action is necessary,” that’s a little looser and might be something you want to consider.

What are the age restrictions? This information is commonly in the terms of use. The service may state that it does not knowingly collect or use PII from children under the age of 13. This particular guideline is an indication that the service may not be compliant with the Children’s Online Privacy Protection Act (COPPA), which controls what information can be collected from children under 13.

If the service does say that users under 13 can use it, it is likely to be COPPA compliant in the way it collects, stores and preserves information. When the terms say that users under 13 must get the consent of a parent or guardian, the school might be able to step in and give permission. But this would be something you’d have to review with the school officials who do the vetting.

Does the product display targeted ads? It’s against federal regulations to display targeted ads within websites, games and apps aimed at children under 13. If an online service does this, it is no longer considered educational and can no longer claim that designation. A service with targeted ads collects PII so that it can select ads that will draw in each individual user, which is a distraction from educational work.

Teach students to be privacy savvy

Fear of data privacy laws has compelled many schools and teachers to say no to tools that students suggest. I recommend, instead, that we work with students and teach them digital data privacy literacy.

Start by expressing interest in the suggested tool. Then ask them some of the above questions to help them figure out whether the tool is collecting their PII. For instance, does it say anything about collecting information about location, contacts, IP address, operating system, and the like? Students are usually shocked by how much information apps and companies are collecting about them.

Next, tell them, “Before I can give you permission to use this as part of our work, I have to run it through our IT department or our legal department. But I’m really excited that you’re excited, and I’m going to get back to you as soon as I can.”

Digital literacy is an essential life skill, so having conversations with students about the tools they’re excited to use and how they can tell whether those tools will protect their data is extremely important. Just as we build character education into lessons, it’s important for us to build in these digital literacy conversations whenever appropriate.

Kerry is the assistant principal of teaching and learning at St. John’s Prep in Danvers, Mass. She is also the director of K12 Education for ConnectSafely.org, an internet safety nonprofit based in Palo Alto, Calif. Kerry is an EdSurge columnist and also writes for her own blog, Start With a Question, which can be found at www.KerryHawk02.com. She is co-author of The Educator’s Guide to Student Data Privacy. She has been a middle and high school educator in public and private schools for 16 years and is also a keynote speaker, writer, and professional learning facilitator sought after by schools, districts, and conferences nationwide. Kerry can be found on social media at @KerryHawk02.

Tech Tips is a weekly column in SmartBrief on EdTech. Have a tech tip to share? Contact us at knamahoe@smartbrief.com




Monero (XMR), the Privacy-Oriented Coin: Its Story, a 10.00% Increase and Predicted Success

September 5, 2018

Seemingly out of the blue, Monero (XMR) has taken center stage over the last couple of weeks, steadily pulling more investors and traders toward it. With so many options on the radar, it should not be overlooked.

This should not come as a surprise, as Monero is one of the few coins that follows the original idea of cryptocurrencies: respecting anonymity and safety. While the leading coins also target these goals in some shape or form, they do so in a misleading way.

Those coins tend to conform to present regulations set by officials, which runs contrary to the above-mentioned original idea. Monero’s main concentration on privacy, with transactions that are almost untraceable and unlinkable, makes it a choice that stands out.
Monero is headed by a group of seven developers, five of whom have chosen to remain anonymous, while two have come out openly in public: David Latapie and Riccardo Spagni, aka “Fluffypony”. The project is open source and crowdfunded.
Bitcoin (BTC), Monero (XMR)–According to a report by the initial coin offering (ICO) advisory and research firm Satis Group, both Monero and Bitcoin look to be the biggest winners in terms of price gain over the next decade. Satis, which publishes outlooks for both ICOs and current cryptocurrencies, has released a new forecast for the next ten years that puts XMR as the greatest price gainer.

Just recently, the team behind Monero announced that a third party has completed a technical audit of the ‘bulletproofs’ protocol. An official Monero blog post elaborates on what the improvement entails, noting that bulletproofs allow for cheaper, smaller and faster transactions, and will allow Monero to scale much more easily.

“Overall, bulletproofs represent a huge advancement in Monero transactions. We get massive space savings, better verification times, and lower fees.”

The 10th largest coin by market capitalization saw a gain of over 10.00% against the US dollar in the last 24 hours. The XMR/USD pair is changing hands at $134.56, leading BTC’s market with a 10.40% gain. This marks its highest level in almost two months.

