* A company is holding PII in a system they don't have the resources to manage.
* The software is insufficiently secure to hold that data.
* The company appears to be holding data even on people who never did business with it.
* This is in part caused by subcontracting to companies that were also not scrupulous with PII in the past.
You say that this hurts said company, and that it is going to stop doing that.
I'd say this is the exact intended effect of the law. Not so stupid after all!
Meanwhile, for people who scrupulously and ethically avoided collecting extraneous PII in the first place, I think the GDPR imposes no great additional burden.
An email address is PII. Given that many preexisting systems used email addresses as usernames, consider a small business that in 2015 hired a company to build a web app which let users create accounts with their email addresses and wrote those addresses into a log file along with each user's activity. The contracted developer finished the site at a cost of 25,000 EUR, far more than the business could afford to spend on tech for another ten years. If this company gets 500 GDPR requests and cannot remove the PII because it lacks the skill or money, should it be fined? Should it shut down? What if there were 14 million companies with the same problem?
You are asking the question as if this were some sort of moral issue, and that's pretty much guaranteed to lead to terrible decisions -- ultimately immoral decisions -- so my advice is not to approach technical problems through a moralistic lens, but through a technical one.
The situation we have now is that massive amounts of code and business processes were created without the assumption that things like email addresses are protected information that users have a right to purge whenever they want. It doesn't particularly matter whether you think this is right or wrong; what matters is that this is how the world is. So then, what to do about it?
A rational approach is to do a cost-benefit analysis of the various solutions -- how much would it cost to refactor the code and update the business processes? More importantly, how much would it cost to put controls in place that effectively ensure all the data is deleted? Finally, how much would it cost to get rid of all the existing data -- remember that companies can have tape backups and recovery centers, and data has been sprayed everywhere for decades.
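To make the "purge on request" part of that estimate concrete, here is a minimal sketch of what scrubbing email addresses out of plain-text logs might look like. The regex and file layout are assumptions, not a complete solution -- obfuscated addresses, binary formats, databases, and tape backups are exactly the parts this naive pass would miss:

```python
import re

# Simple pattern for the common case of an email address in free text;
# real-world redaction would also need to handle encoded or obfuscated forms.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact_emails(line: str, replacement: str = "[REDACTED]") -> str:
    """Replace anything that looks like an email address in a log line."""
    return EMAIL_RE.sub(replacement, line)

def redact_log(in_path: str, out_path: str) -> None:
    """Stream a plain-text log file, writing a redacted copy line by line."""
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(redact_emails(line))
```

Even this trivial pass has to be run over every log on every machine, which is why the operational controls tend to dominate the cost, not the code itself.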
So you get some number, say a hundred billion. Is it still worth the expense? Could there be some other solution?
For example, force companies to delete old data X years after the business relationship has ended, where X is, say, 10. That approach might cost only 20 billion. Or force companies to do this for new code and business processes but leave the legacy ones in place for X years. That might be only 10 billion.
As another example, look at C code. It's unsafe. We are now aware of the problems with C and have developed safer languages, but the cost of rewriting the existing pool of C code is huge. It doesn't help to wring your hands and approach this as a moral argument -- do we declare indignantly that security doesn't matter? Instead, look for practical ways of transitioning to safer languages over time, and other ways to isolate and mitigate the damage of unsafe code.
But at all costs, understand the limitations involved and craft remedies that give you the most bang for the buck, because resources are limited, and a dollar spent on this is a dollar not spent on some other cause -- one that might be more worthwhile than being able to delete any email address on a customer's demand.
If we consider the consequences of sloppy and/or outright malicious data handling to be negative externalities foisted on society, then removing such externalities by law is pretty much fair game in my opinion.
Especially taking into account that there might not be a linear relationship between the damage accrued by society and the cost to the company of ameliorating that damage.