Flattr now deletes your web browsing history within 3 months (ctrl.blog)
128 points by d2wa on June 1, 2018 | hide | past | favorite | 94 comments


I love how some of the tech industry is beginning to see data as a liability rather than an asset. It dramatically reduces the ability for government mass surveillance for two reasons:

1. If companies only collect what they need (to reduce their liability), governments can't demand more than that (or even hack in to get the data illegally).

2. If the industry culture is to limit data collection, governments can't just say, "Well every company does it, so why can't we."

There's a wonderful talk, Haunted By Data, that covers a lot of the societal downsides of treating data as an asset. Highly encourage watching/reading.

Text: http://idlewords.com/talks/haunted_by_data.htm

Video: https://www.youtube.com/watch?v=GAXLHM-1Psk


Early in my career, which was in finance/banking, my 'Head' (as the organisational naming structure went) sent an email re-forwarding and re-emphasizing the data retention policy.

I was inclined to never delete an email rather than comply with 'your inbox should not exceed 500MB'. That was 50 emails a day then, my inbox now exceeds 500MB on a daily basis.

Sure, you can save archive locally or on a shared drive.

But the key idea... we don't want customer data because of liability. If we're storing it for a specific purpose, be that regulatory reporting or clearly defined analytics, fine.

Before the term 'big data', simply 'lots of data' sounded nice.

But don't let it hang around. At some point it will become a liability. You're serving customers, and the key part of that is keeping their data safe. Not having it anymore (or keeping it in a non-accessible, auto-deleting lock-away repository for legal purposes) is more than good enough; properly storing and managing what you do keep is even better.


Just to be contrarian: I'm not so sure that's a good thing... Data can be used for great things, e.g. longitudinal data in healthcare. I think seeing data as a liability might reduce the speed of progress.


In the talk, there's a parallel drawn between the nuclear industry 60 years ago and big data now: nuclear was originally touted as a miracle cure for everything, then disasters happened, and it never really got over the stigma despite its huge potential.

Society decided, for now, the upsides aren't worth the downsides.

Oil is another parallel: society is currently just reaching the point where we no longer value the upsides over the downsides.


Seems like oil and nuclear energy are different because both can be replaced with alternative sources of energy. What are the alternative sources of user data?


Paper records, human memory....

Comparing growth in data storage versus energy usage per capita is interesting.

Even if you look back to the founding of the U.S., the change in energy use per person is actually only a few fold, definitely less than an order of magnitude.

Harder to compare quantity of data storage but the change would seem much larger. How much data is there, per U.S. person?


Why is data a requirement to keep?

Currently has value with externalized downsides.


Data can be used for great things, e.g. longitudinal data in healthcare. I think seeing data as a liability might reduce the speed of progress

I think the word "can" is acting as a euphemism or term of elision here, since "longitudinal data in healthcare" includes things like the Tuskegee Syphilis Experiment.


> governments can't just say, "Well every company does it, so why can't we."

Odd. That's precisely what's happening with GDPR. While the EU governments are demanding that private companies limit collection and use of data, the intelligence arms of these same countries (I'm looking at you, GCHQ) and their FVEY partners are doing everything they can to hoover up and store every single bit of data on the world population, including their own citizens.

And somehow, we think this is fine. An entity with the power to disappear you and render you to black sites can have all the data on you they wish. But Facebook determining that there is an 83.7% chance you are stressed serving up an ad for vacation rentals is completely verboten.

It's absolutely mad.


Wonder what will happen if the US passes laws requiring companies to retain data for a minimum of a year, for the sake of digital-forensic integrity in cybercrime cases. On the other hand, maybe GDPR is good for pushing VPN services to be much more transparent.


I suspect there will simply be a direct pipe for logging to the government, where companies don't need to retain anything.


GDPR requires you to make that kind of information available.


However governments often have undisclosed access not even known to the holder of the information.

I doubt GDPR will significantly affect information to which the government wants access

Can I ask the government to delete all information on me?


There will undoubtedly be a lawsuit again soon over whether companies are allowed to transfer data out of the EU to US government warrantless requests.


I hope it's Greece so we can all be as transparent as possible about the real issue here...

(Democracy)


If I'm not mistaken, GDPR has the usual exemptions in place for that kind of usage (national security etc.), hasn't it?


Do we really believe large Chinese tech companies will be complying with GDPR?


Are there any that do a lot of business in the EU that you’re thinking of? I can think of lots of hardware, but not many traditional consumer Internet companies. Maybe Alibaba for SMEs?


Tencent, to name one.


The thought anyone believes that makes me chuckle, thanks.


I concur, and would add that this view of data collection has become common only because of the slew of breaches. Hackers leaking massive customer databases has forced companies to review their collection policies because they don't want to deal with the fallout, technically, politically and otherwise. This is true to the hacker ethos: using radical, often criminal behavior to point out glaring flaws that others have become complacent with. Of course, these days that goal is secondary, if it is considered at all, to the goal of financial gain, and that's a damn shame.


I feel like this whole thing with the GDPR is people fighting each other for the wrong reasons.

The people for it argue that gets large companies to behave better.

The people against it argue that it's unnecessarily complicated, poorly drafted and burdensome to small businesses.

But it's both at the same time.

The issue is that it doesn't have to be. It's possible to get the desired effect without using such a complicated system, and to mitigate the impact it imposes.

For example, one of the major burdens on small businesses is the cost of producing the data the company has on you -- but there is a simple way to fix that. Require the requester to pay the cost of collecting the information, similar to FOIA requests in the US. Then the requirement is no longer an unfunded mandate and can't be used by griefers as a method of harassment.

Nobody bothered to use common sense like that when drafting it, so now in order to fix it, it has to be thrown out and rewritten. But then the people who support the spirit of the law end up fighting against the people who oppose the letter of the law, even though everybody really wants the same thing.


Your proposed solution gives companies zero incentive to minimize data collected, and even incentivizes them to make fulfilling such a request as convoluted and expensive as possible.

What people online love to call "common sense solutions" are usually not solutions at all.


> Your proposed solution gives companies zero incentive to minimize data collected

The expense of producing the data isn't supposed to be an incentive to minimize data collected, that is what the other sections are for. For large companies it wouldn't be regardless because they could automate the process.

> and even incentivizes them to make fulfilling such a request as convoluted and expensive as possible.

A company charging outrageous prices would obviously invite investigative scrutiny.


45 C.F.R. § 164.524 (c)(4) requires medical providers to make our records available to you upon request, for a reasonable fee to reflect their cost of preparing it.

“Fees. If the individual requests a copy of the protected health information or agrees to a summary or explanation of such information, the covered entity may impose a reasonable, cost-based fee, provided that the fee includes only the cost of:

(i) Labor for copying the protected health information requested by the individual, whether in paper or electronic form;

(ii) Supplies for creating the paper copy or electronic media if the individual requests that the electronic copy be provided on portable media;

(iii) Postage, when the individual has requested the copy, or the summary or explanation, be mailed; and

(iv) Preparing an explanation or summary of the protected health information, if agreed to by the individual as required by paragraph (c)(2)(iii) of this section.”

Despite this language, actual costs for getting medical records vary widely. “Fees range very widely, from $2-55 for short records of 15 pages to $15-585 for long ones of 500 pages. Times also range widely, from 1–30 days (or longer for off-site records)” from a survey study on the topic.

No one has been investigated on this that I have ever heard of.
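The permitted fee in § 164.524(c)(4) is just a sum of the four components quoted above. A minimal sketch of that calculation; note that all of the rates used here (labor rate, per-page supply cost) are hypothetical placeholders, not figures from the regulation:

```python
# Sketch of the HIPAA cost-based fee from 45 C.F.R. § 164.524(c)(4).
# Only the four permitted components may be charged; the default rates
# below are illustrative assumptions, not values from the regulation.

def hipaa_fee(pages, labor_hours, labor_rate=20.0, per_page_supplies=0.10,
              postage=0.0, summary_cost=0.0):
    """Sum the four cost components HIPAA permits a provider to charge."""
    labor = labor_hours * labor_rate          # (i) copying labor
    supplies = pages * per_page_supplies      # (ii) paper/media supplies
    # (iii) postage and (iv) summary preparation are passed through as-is
    return round(labor + supplies + postage + summary_cost, 2)

# A short 15-page record: half an hour of labor plus supplies
print(hipaa_fee(pages=15, labor_hours=0.5))   # 11.5
```

Under these assumed rates, even a mailed 500-page record comes out around $95, which is one way to read the low end of the survey's range; the $585 high end is hard to reconcile with cost-based pricing.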


> Despite this language, actual costs for getting medical records varies widely. “Fees range very widely, from $2-55 for short records of 15 pages to $15-585 for long ones of 500 pages. Times also range widely, from 1–30 days (or longer for off-site records)” from a survey study on the topic.

All of those numbers are in line with the actual costs of producing information like that. The problem scenario is the one where the company claims everything costs a zillion dollars just so they don't have to provide the information at all, which the rules appear to be successfully preventing from happening.


Since meaningful use regulation has strong-armed almost all providers in the country into EMRs, it costs absolute pennies to provide those documents.

And for most people, $500 for a medical record is effectively zillions.


> Since meaningful use regulation has strong-armed almost all providers in the country into EMRs, it costs absolute pennies to provide those documents.

The reproduction cost is not the dominant factor in processing costs. Someone has to review the information, e.g. to make sure the file actually belongs to the requesting party and some clerical error doesn't cause them to receive someone else's records, the information may be in multiple independent systems, the request may be for a subset of the information which then has to be separated out, etc.

Not all of that happens all of the time, which is why it costs $2 sometimes and $500 other times.

> And for most people, $500 for a medical record is effectively zillions.

$500 is <1% of the median income in the US.


What’s sad isn’t how far off you are in your conception of what the process looks like; what’s sad is that what you’re describing sounds like an entirely reasonable security effort, and it’s disappointing that it has almost no reflection in reality.

$500 is also more than the majority of Americans have in their savings accounts. (http://money.cnn.com/2017/01/12/pf/americans-lack-of-savings...)


The problem is we don't all agree about what "outrageous prices" are, and courts are unlikely to conclude that anything that matches standard industry practices is "outrageous".

The cost of complying with the GDPR nightmare letter, for the average large company that's not designed around the ability to comply with it, is likely to involve multiple engineers for several months digging out the data - if not a large internal redesign to make it even possible to dig out the data. It is not outrageous that the cost of doing that work is objectively going to be hundreds of thousands of dollars. And end users aren't going to pay for that.

So the question is whether we think that it's worth it to society to cause companies to figure it out anyway, or not. If we don't, we don't need a law at all; we just let companies carry on as they are now. If we do, then we should give incentives to design your company right from the start, to minimize data collected, etc.


The nightmare letter is specifically designed as an instrument of malice to maximize processing costs. If it does what it was intended to do and that generates a large processing fee, the system is working as intended.

If the cost of preparing for compliance is reasonable then amortizing it over each request should not produce unreasonable costs per user. If the costs are unreasonable to begin with then that is the root of the problem which is what needs to be corrected independent of who pays.


> A company charging outrageous prices would obviously invite investigative scrutiny.

Outrageous? No, just the mere cost that really happens.

Say, the data can only be accessed by the CEO personally. For privacy reasons, okay? Very reasonable, and totally in the spirit of the GDPR.

It can only be accessed in a very secure server location in Norway. Also commendable.

Our CEO needs to travel (first-class air travel) to Norway. He stays in a first-class hotel. Oh, and his time has value, too. So let's say a three-day business trip; that adds 1 percent of his salary to the bill.

Of course, this is a hypothetical, not-really-serious exaggeration. But it demonstrates the effect that would certainly happen on a much smaller scale.


> Say, the data can only be accessed by the CEO personally. For privacy reasons, okay? Very reasonable, and totally in the spirit of the GDPR.

Then the company has the same costs in making any use of the data, at which point they might as well not even collect it.

> Of course, this is a hypothetical, not-really-serious exaggeration. But it demonstrates the effect that would certainly happen on a much smaller scale.

But the scale makes all the difference. If they literally require the CEO to fly to Norway and stay in a hotel then it would add millions of dollars to the cost, but it's also manifestly unreasonable.

They can easily add some nominally unnecessary bureaucracy to the process that adds a few percent to the cost and get away with it, but that isn't really going to deter a lot of people.

The more they try to get away with, the more likely they make it that the regulators will crawl inside them and lay eggs. It's a losing trade off just so you can spite your own customers.


I think you're thinking of this GDPR thing as two sides fighting each other, and you're there in the middle with this reasonable compromise.

But you're wrong :)

You're actually on the same side as most of the anti-GDPR arguments I see.

Rather than speak for everyone, I'll just state my case: I like the intent of the law, and I don't like my privacy or anyone else's being violated. I'm fine with reasonable regulation to prevent that.

But the GDPR is legitimately terrible regulation. It's overly broad, vague, has the potential for huge penalties, is anti-small business, tries to establish the entire globe as being within its jurisdiction because someone from the EU might stumble across your website, and expects us to just trust that regulators from 28 countries will forever be gracious and helpful instead of capriciously punitive.

If the GDPR was restricted to company with an actual presence in the EU and above a certain size, I'd be fully in favor of it. I'm not sure I'd think it's smart, but the EU should do what it thinks is right for its citizens. But instead the GDPR claims that my side project on a server in Alabama (hypothetical) is a violation of their residents' human rights worth a fine up to millions of Euros depending on the whims of 28 different regulatory agencies (yes, I know they almost certainly won't fine me millions, but this law leaves it up to them to decide).

But the reason the other side disagrees is because: a) they trust the regulators, and b) they see any collection of anything remotely resembling "private data" as a fundamental violation of their human rights.

Under that set of assumptions, why would you let small businesses off the hook so they can be free to violate anyone's human rights as they see fit?


>For example, one of the major burdens on small businesses is the cost of producing the data the company has on you -- but there is a simple way to fix that. Require the requester to pay the cost of collecting the information, similar to FOIA requests in the US. Then the requirement is no longer an unfunded mandate and can't be used by griefers as a method of harassment.

If you can't cheaply and quickly respond to a subject access request, then you're probably breaching several other parts of GDPR. How would you respond to a deletion request under Article 17, a restriction of processing request under Article 18, a portability request under Article 20 or an objection under Article 21?

Compliance with all of those requirements necessitates clear documentation of how you're processing personal data. If you don't have that documentation, how can you comply with the stewardship and security requirements of Articles 24, 25 and 32? How will you comply with the record-keeping requirements of Article 30 and the impact assessment requirements of Article 35? If you have a data breach, how will you be able to properly fulfil your notification requirements under Articles 33 and 34? How can you be confident that you're abiding by the general principles laid out in Chapter 2?

Processing personal data is a serious responsibility with potentially grave risks. If you have been taking that responsibility seriously (and been in compliance with the 1995 Data Protection Directive), then complying with the GDPR is relatively straightforward. If you just copy-pasted a privacy policy and gave no real thought to data protection while hoovering up tons of potentially sensitive personal data, then compliance will be a painful and expensive process.

Being a small business doesn't give you the right to be negligent. Not under the GDPR and not under any of the other laws governing businesses. If you don't have the expertise to comply with data protection law, you need to delegate those tasks to someone who is suitably qualified. If the scale and complexity of your processing activities makes compliance impossible, then the EU would very much prefer that you find a different business model or stop doing business with their citizens.


> Being a small business doesn't give you the right to be negligent.

I can't understand this attitude that it's negligence to not want to be bankrupted by poorly crafted regulations.

We're dealing with things like a website to sell wood carvings that someone makes as a hobby. They've got names and addresses because they need to know where to ship them.

This is not a reasonable place to apply a complex regulatory framework. The idea that this involves "serious responsibility" or "grave risks" is akin to applying a regulatory framework designed for an oil refinery to a garage sale because a used lawnmower may contain some "hazardous" motor oil.

It's not that the danger isn't theoretically possible, it's that the regulatory requirements are totally inappropriate to the context.


>This is not a reasonable place to apply a complex regulatory framework. The idea that this involves "serious responsibility" or "grave risks" is akin to applying a regulatory framework designed for an oil refinery to a garage sale because a used lawnmower may contain some "hazardous" motor oil.

You're subject to exactly the same regulations on health and safety and hazardous materials handling. The steps you need to take are smaller and simpler, because the scope of your activities is smaller and less hazardous. You're not allowed to take the guards off your machine tools or pour motor oil down the drain just because you're a small shop. Same with GDPR.

If you're a small company that sells wood carvings, you don't need to do very much to comply. If you're a small company that maintains a blacklist of construction workers, you're in deep trouble.

https://en.wikipedia.org/wiki/Consulting_Association


> You're subject to exactly the same regulations on health and safety and hazardous materials handling.

It's appropriate to have special requirements for large scale operations, e.g. to have fire suppression systems with regular inspections, an assigned safety officer with specific training, material safety data sheets, site zoning requirements and so on.

None of that is appropriate for someone who just has some motor oil in their garage. And the requirements that are appropriate -- the small subset like not pouring motor oil down the drain -- are easy to understand and consist of a simple prohibition rather than an unfunded mandate to affirmatively take specific actions.

> If you're a small company that sells wood carvings, you don't need to do very much to comply.

If that were true there wouldn't be small companies writing off the EU because of the compliance costs.


No, the point is that the cost of automating this should be part of the cost of doing the collection, not a separate item.

If you can't afford to deal with the so-called "nightmare letter", then you can't afford to snoop.


The problem is there is no exception for companies whose business model isn't snooping.

Normal companies have customer lists, accounts receivable information, customer email addresses for password resets and announcements etc. A business can't stop keeping records of who owes them money. But then they're subject to regulations designed to make things difficult for the likes of Facebook and Microsoft.


There are exceptions, though, since this data is required. They just need to delete it if they no longer have a reason to keep it.


Yes, because it's so amazingly difficult to run a few SQL queries and send out an email.
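For what it's worth, for a small shop with a conventional database the mechanics really can be that simple. A minimal sketch using SQLite with a hypothetical two-table schema (a real deployment would also have to purge backups, logs, and data held by third-party processors):

```python
import sqlite3

# Hypothetical minimal schema for a small shop; names are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, address TEXT)")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, item TEXT)")
db.execute("INSERT INTO customers VALUES (1, 'alice@example.com', '1 Main St')")
db.execute("INSERT INTO orders VALUES (10, 1, 'wood carving')")

def erase_customer(db, email):
    """Handle an erasure request: delete the customer and their orders.
    Returns False if no matching record exists."""
    row = db.execute("SELECT id FROM customers WHERE email = ?", (email,)).fetchone()
    if row is None:
        return False
    db.execute("DELETE FROM orders WHERE customer_id = ?", (row[0],))
    db.execute("DELETE FROM customers WHERE id = ?", (row[0],))
    db.commit()
    return True

erase_customer(db, "alice@example.com")
print(db.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # 0
```

The confirmation email to the requester is the other half; the hard part in practice is knowing every place the data lives, not the queries themselves.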


A pay solution disadvantages anyone poor or unable to access funds from your plan. There's no reason a company can't just jack up the prices because they failed to create a decent system for performing mass deletions.


"cost of producing the data" seems really hard to strictly define. What's stopping a company from implementing extremely tedious procedures and billing $100 an hour to discourage people from requesting it?

In addition, GDPR says that getting access to data on you is a right, and it seems contradictory to have to pay for something that is your right to have.


> "cost of producing the data" seems really hard to strictly define. What's stopping a company from implementing extremely tedious procedures and billing $100 an hour to discourage people from requesting it?

Government agencies have been known to do this when they don't want to release FOIA information. Then they get sued over it by the ACLU or similar.

> In addition, GDPR says that getting access to data on you is a right, and it seems contradictory to have to pay for something that is your right to have.

That is how all the other rights work. Freedom of the press isn't a right to use someone else's printing press without compensating them.


   access to data on you is a right 
   Freedom of the press isn't a right 
Hence the classic liberal distinction between a right to something and the freedom to do something.


By this definition all of the usual "rights" are really freedoms. Which leaves me waiting for an explanation of why accessing data a third party has on you should be elevated above the freedom of speech, travel, association and so on as something private parties should be required to do for all comers without compensation.


One (admittedly theoretical) way to get companies to reveal the cost is to tax them based on the self-declared value of the data. The lower the value they declare, the less tax they pay, but the cheaper they have to give the data away. One could do this for personal requests, or, even more radical, force the companies to sell the data, at the declared price, to competitors.

I think this is pretty much suggested by Eric Posner in his book Radical Markets.


I'm not so sure - how do you make sure the cost isn't so high that it deters people with legitimate reasons to request their data, but also high enough to actually cover the cost of obtaining it? I would expect that those are two very different amounts.


> I'm not so sure - how do you make sure the cost isn't so high that it deters people with legitimate reasons to request their data, but also high enough to actually cover the cost of obtaining it? I would expect that those are two very different amounts.

If the cost of doing something legitimately exceeds the value of doing it to the person asking for it to be done, isn't not doing it then the rational outcome?


GDPR is trying to push a change in society's norms regarding privacy. The value of that change cannot be meaningfully measured by multiplying the "value" of one person's privacy by the number of people; there are effects that happen at scale that don't happen when you change privacy/data handling for an individual person for an individual company. And it's definitely unclear that the ability of any individual to pay for their data is a meaningful measure of that value; among other things, it can't be resold without destroying it.

Imagine a city with no roads, just human-width sidewalks between skyscrapers. The cost of moving all the skyscrapers to fit even a single road is enormous; no single driver would be willing to pay it for the incremental benefit of being able to drive. Would you say that this means that rationally the society should never build roads?


You can't justify an auto repair shop with just one customer with just one car either. The way it works isn't that the first customer pays all the fixed costs and every subsequent customer pays only the incremental costs, it's that the fixed costs are amortized over the projected number of customers.

What obviously doesn't work is to try to say that the auto repair shop pays all the costs and never charges anything to anyone who wants to have their car repaired, because then they go out of business.
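The amortization point is simple arithmetic: the per-request price covers the marginal cost plus a share of the one-time fixed cost spread over the expected number of requests. A small sketch with hypothetical figures (the $50,000 setup cost, $5 marginal cost, and 10,000 requests are illustrative assumptions, not data from the thread):

```python
def cost_per_request(fixed_cost, incremental_cost, projected_requests):
    """Amortize a one-time compliance/setup cost over the expected
    number of requests, then add the marginal cost of serving each one."""
    return fixed_cost / projected_requests + incremental_cost

# Hypothetical: a $50,000 one-time effort to make data exportable,
# $5 of staff time per request, 10,000 expected requests over the system's life.
print(cost_per_request(50_000, 5, 10_000))  # 10.0
```

The repair-shop analogy falls out of the same formula: with one customer the amortized share is the whole fixed cost, which is why no single requester can reasonably be asked to fund the redesign.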


"Ads on this site don’t track or stalk you. Please disable your blocker."

According to my blocker those are Google ads so...


Google AdSense on Ctrl blog is configured to only show non-personalized ads to any users with the Do-Not-Track (DNT) setting enabled or European Economic Area (EEA) citizens. AdSense still uses cookies for rate-limiting and fraud prevention, but not ad personalization or tracking. requestNonPersonalizedAds=1 is part of AdSense’s GDPR APIs. https://www.ctrl.blog/entry/adsense-gdpr-consent

You also only see that particular message if your browser sends the DNT header, and an adblocker is detected.


In my case, I don't use an ad blocker, but I do use Privacy Badger, which is only supposed to block tracking, and should only block ads if they can't have their tracking disabled. It looks like Privacy Badger decided "pagead2.googlesyndication.com" should now be blocked as a tracking domain.

Perhaps Google should split off their nonpersonalized ads service into a different domain so such services can easily distinguish between the two?


Actually, this is something that should be fixed in Privacy Badger. They can detect the URL parameter that flips ads from personalized to non-personalized (npa=1) and block accordingly. https://github.com/EFForg/privacybadger/issues/2046
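The check proposed in that issue is a simple predicate: look for `npa=1` in the ad request's query string and treat the request as non-personalized. A standalone sketch of that logic (not actual Privacy Badger code; the URLs are illustrative):

```python
from urllib.parse import urlparse, parse_qs

def is_nonpersonalized_ad_request(url):
    """Return True if the ad request URL carries the npa=1 flag
    AdSense uses to mark non-personalized ad serving."""
    query = parse_qs(urlparse(url).query)
    return query.get("npa") == ["1"]

print(is_nonpersonalized_ad_request(
    "https://pagead2.googlesyndication.com/pagead/ads?npa=1&client=x"))  # True
print(is_nonpersonalized_ad_request(
    "https://pagead2.googlesyndication.com/pagead/ads?client=x"))        # False
```

As the replies below note, the harder question is whether a client-side flag like this should be trusted at all, since the blocker can't verify what the server does with the request.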


"use cookies to combat fraud and rate-limiting (not showing the exact same ads repeatedly)" and "you cannot opt out of showing ads to users based on their previous interactions with the advertiser, such as visits to an advertiser's website, known as remarketing"[1] sure sound like tracking to me. Far less invasive tracking than Google normally does, but clearly still tracking.

[1] https://support.google.com/adsense/answer/142293?hl=en


That doesn't make any sense. The point of privacy badger is to prevent cookies that track you across different domains.

Allowing tracking cookies just because a certain parameter is present is asking for abuse.

If you are serious about not tracking people, don't use tracking cookies.


Cookies are just a tool; it all comes down to how you use them. Not all knives are stabbing-people-to-death knives.


Privacy badger is quite specific in what it blocks. It blocks third party tracking cookies that track you across multiple domains.

First party cookies are no problem. Third party cookies that are only used in one domain are no problem. Third party non-tracking cookies are no problem.

So it does block the equivalent of bringing a large knife to a bar.


Privacy badger doesn't have filters like this. That's the whole point. It sees Google doesn't respect the DNT header (it sets tracking cookies) and shuts it down. This is privacy badger working as intended.


Wow, I never knew that was an option. Now I wish my adblocker gave the option of allowing those ads. That sounds like what I'd hoped AdBlocks's Acceptable Ads program would be.


> install the company’s browser extension which collects their browsing history

Yeah I think I'll pass thanks. What person would install such a thing?


Well, … people choose to install and use Google Chrome … It all comes down to who you trust with your data.


It's hard to use the internet or live without a web browser, but I suspect most people get along just fine without letting a startup with a cool name scrape their browsing history.


Wtf?! Why is a micropayment processor even recording browser history?!


From the article:

"Flattr subscribers make a voluntary payment from 3 USD/month, install the company’s browser extension which collects their browsing history, and then Flattr divides their subscription fee out among the creators and websites they spent the most time on."


That could be done in a totally anonymous way. Add up total number of donations and total number of visits and distribute money accordingly. There is no need for them to ever store personal information for this.


(a) How do you get the total number of visits?

(b) This doesn't correctly distribute funds because there is a correlation between how much someone is willing to give per month and what kind of sites they frequent.


You can calculate the numbers as soon as the data about visits come in and then immediately discard all of it. You certainly don't need to store the browsing history. You need the payment amount, though, and even that you can discard after a month. I am going through this at work right now, where we are working on data collection for a medical device. First we wanted to suck up all the data and figure out what to do with it later. Now we have to think about the data use upfront and get that data in the least intrusive way. It requires more thinking, but we get the same benefit with only a fraction of the data. GDPR is a good thing because it causes companies to rethink their data collection strategies.


There isn't enough data to calculate numbers as soon as the visits come in. If I pay $3/month and visit your site, the amount you get depends on how many other site visits I accumulate during the month. If I visit 2 sites, you get $1.50. If I visit 20 sites, you get $0.15.

You also need to store browsing history to avoid scammers redirecting other users' donations. This happens on similar platforms like Spotify. You register for Flattr and just run a script that constantly visits your own site. With aggregated statistics, this shifts a bit of everyone's donations to you and can be extremely lucrative. With per-user session history, you just get your own donation back, minus fees.
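The per-user arithmetic is easy to make concrete: each subscriber's fee is split in proportion to their own visits, which is why the payout to any one site depends on the rest of that user's browsing for the month. A simplified sketch (Flattr's actual weighting by time spent is more involved; the site names are placeholders):

```python
from collections import Counter

def allocate(fee, visits):
    """Split one subscriber's monthly fee across the sites they visited,
    in proportion to visit counts (simplified: Flattr weights by time spent)."""
    counts = Counter(visits)
    total = sum(counts.values())
    return {site: round(fee * n / total, 2) for site, n in counts.items()}

# With 2 sites visited once each, each gets half of a $3 fee...
print(allocate(3.0, ["a.com", "b.com"]))  # {'a.com': 1.5, 'b.com': 1.5}

# ...but the same single visit is worth far less if the user browsed 20 sites.
shares = allocate(3.0, [f"site{i}.com" for i in range(20)])
print(shares["site0.com"])  # 0.15
```

This also shows why aggregated totals aren't enough: without the per-user breakdown, a scammer's self-visits would dilute everyone's pool instead of only their own $3.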


Thank you. The scammer aspect is exactly what I was getting at. Without allocating any given user's donations fairly across the sites they have individually visited, I don't see how it works. It'd be susceptible to massive click-fraud.


> as soon as the data about visits come in

How does this work, exactly? You are eliding all of the important details that require more information than just "My webserver reported a visit".


You will need their relevant browsing history for the entire payment period.


OK. Then throw it away after that.


That is what they just announced.


True. To be honest I don't know how their stuff works in detail. My main point was that most of the officially stated needs for data collection can be met with much less data if you really think about it. Until now most incentives were for collecting as much as possible.


Flattr is an extension/service that monitors your web usage and uses it to direct your payment towards the sites that you use. Recording your browser history is almost the extension's entire purpose.


Are you new to the internet? Everyone does it.


Except it's not 'in practice', because the GDPR does not require any such thing. In spirit, maybe.

[i.e. practically they wouldn't be fined for this]


It might be reasonable to conclude that, in practice, GDPR has caused a number of companies to re-assess their data collection and retention hygiene (even beyond the minimum bounds of the law). In particular, this change seems to be a very charitable or expansive reading of the GDPR's requirement that companies not collect more than is necessary -- as the post ends by noting.


I think hygiene was mostly in place, but there were no PR points to score. Before GDPR, news that a company stores something for 3 months would have been a non-story. GDPR doesn't in any way protect people from data leaks.


>GDPR doesn't in any way protect people from data leaks.

Article 32:

Security of processing

Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:

(a) the pseudonymisation and encryption of personal data;

(b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;

(c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;

(d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.

Recital 83:

In order to maintain security and to prevent processing in infringement of this Regulation, the controller or processor should evaluate the risks inherent in the processing and implement measures to mitigate those risks, such as encryption. Those measures should ensure an appropriate level of security, including confidentiality, taking into account the state of the art and the costs of implementation in relation to the risks and the nature of the personal data to be protected. In assessing data security risk, consideration should be given to the risks that are presented by personal data processing, such as accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed which may in particular lead to physical, material or non-material damage.

GDPR also mitigates the impact of leaks. Art. 5 requires that data is stored for no longer than necessary for the purposes for which it was collected. Art. 33 requires that the supervisory authority must be notified of any data breach within 72 hours. Art. 34 requires that data subjects be notified of any breach without undue delay. All of this is enforceable with heavy fines.

https://gdpr-info.eu/art-32-gdpr/

https://gdpr-info.eu/recitals/no-83/


I can't see how this protects people. You can evaluate risks all you want, but unless you have an exceptional security team, you won't improve your situation beyond what's already been established in the industry. Unless you mean companies leaving their databases exposed to the public without a password; but then I still can't see how GDPR would help there.

Requirements for post-mortem actions are quite sensible, though. But given arbitrary rules, that is likely only going to be a cash cow for governments, as even a second of delay is "undue".


The criterion for when such a delay is undue in Germany ("ohne schuldhafte Verzögerung") is whether the required action was within your power and could have been done earlier without gross risks or costs (unless it's your own fault for creating the situation with gross risks/costs). It is not undue delay if you needed to sleep, or if your ISP just cut you off and you had to go into the city to get another ISP to restore your connection. But it is your fault if you then sit around for a month waiting for the ISP, when the connection could have been restored sooner. You would be expected, for example, to get a permit to string fibre from the next hub to your building if there was no other way to get it done sooner, e.g. because there was nobody with free time to dig up the street and fix the cable.


> GDPR doesn't in any way protect people from data leaks.

If GDPR provides PR reasons for better data hygiene the result is the same: less data retained, less data at risk of being leaked.


Art. 5:

Personal data shall be:

kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed

Recital 39:

The personal data should be adequate, relevant and limited to what is necessary for the purposes for which they are processed. This requires, in particular, ensuring that the period for which the personal data are stored is limited to a strict minimum.

The GDPR is very explicit on this point. You must delete or thoroughly anonymise personal data as soon as is practically possible.

https://gdpr-info.eu/art-5-gdpr/

https://gdpr-info.eu/recitals/no-39/


Yes, it's quoted in the article too; however, "necessary for the purposes" is subject to interpretation, and there is huge leeway here. In practice it is always possible to claim that some procedure requires longer retention, and it would be impossible to prosecute a company for that.


If they divvy up a monthly fee to the sites you visit, they're going to have a hard time explaining the need to keep your browsing data more than a single month: Enough time to pay out to the websites. Then you can maybe add a couple months to account for backups and the like.

It sounds like trimming the data retention here was well-warranted, particularly given the high sensitivity of browsing data.


It's literally "in practice" if companies are doing this because of GDPR. The fact that they are misinterpreting the guidelines doesn't change the fact that they are making decisions on what they believe are the guidelines.


Well, it's like calling a common pointer error "C in practice". You're not wrong, it's just viewing it at a particular angle.


What they’ve done is much cheaper to develop and maintain than implementing more granular data deletion tools. Especially when they really don’t need to store the data for longer than one billing period.


Don't they still need to implement granular deletion to support user requests?


But it does.

> “The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. […]”

― GDPR: Article 25 Data protection by design and by default: Paragraph 2

> “The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay [when] the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;”

― GDPR: Article 17 Right to erasure: Paragraph 1: Point A



