Flock has knowledge/use of the data. Their system processes can relate the photos “owned” by two different entities. They’re interacting with it and selling their access to it as a feature. That’s obviously distinct from S3.
I know quite a bit about Flock, having been intimately involved in the process of evicting it from our municipality, and I don't think the distinction you're trying to draw here is meaningful. Flock will say they provide a service, one avidly sought by the actual owners of the data, to generate analysis based on that data.
They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case.
Flock's facilitation of data-sharing is a huge part of their value proposition over other cameras, and why their customers buy from them over their competitors.
As such, even if they can contract it such that they are not legally responsible for such use, they are very much knowingly facilitating it. If this were physical goods rather than data, they would probably be held as responsible as their customers.
I've read our contract. I know what it says. This isn't an abstraction. They can do lots of things. What they actually do is not data brokerage under California law, at least not as far as I can tell.
What Flock's contract calls the relationship does not make it so; courts very much duck type.
Flock knowingly collects PII of people it has no direct relationship with, and transfers it to third parties. Whether that transfer, which Flock seems to gain from, is legally a sale is something to be argued at great expense in front of a court.
But regardless of that legal definition, I do think any reasonable person (i.e., not a corporate lawyer) would consider this a sale of data.
Except their customer's data isn't actually theirs: OP requested their private data to be deleted from the system. So OP expressed a clear intent for their data not to be used by Flock's customer. We could say that the data thus becomes abusively retained on these systems. As a result, IF Flock has the technical means of performing the requested data deletion, it should be compelled to perform it.
This is the same situation as a web hosting provider: if it is communicated to them that one of their customers uses their service to host illegal content, then it becomes the web hosting provider's responsibility to remove that content.
Reasonable technical feasibility for the service provider is key here, but it can be argued since the data can apparently be shared in ways that identify OP.
Probably not how the law currently works (don't know, not a lawyer), but I guess it should, as otherwise it allows creating a platform that shares abusively retained data without any reasonable recourse for the subjects of this data to remove the data from the platform.
If I as a photographer take a photograph of someone, the photo does not belong to that person—the photographer retains the IP and ownership rights.
You have rights too, such as privacy/likeness rights, which allow you to restrict what the IP owner is allowed to do with the image that they own, but you do not own the data, and your rights give you a claim against the data owner.
Flock probably have legal obligations or contractual commitments not to delete or destroy their customers' data, and changing that is not necessarily a good thing.
That's not the case under GDPR, CCPA, HIPAA, or other privacy regimes which codify our right to decide who can store our personal data and what they can do with it.
Can you point me to the part of the GDPR that gives you ownership of data that relates to you? I’m fairly confident that you are assigned rights over personal information as it relates to you, but it doesn’t assign ownership.
Look, I’m all in on public transit, I have straight up never had a drivers license in my 38 years on earth. That said, I pay for that shit by tapping my phone against a circle. They accept cash but I no longer work a tip-based job. They know where I got on and in some transit systems (not Chicago) you gotta tap to get out, as well. They got cameras on the bus and train pointed right at the spot where everyone’s face is as they get on. Public transit solves many problems, surveillance ain’t one of them.
In hindsight I forgot that all those systems are automated now. Where I live you can just walk in with no identification or payment, and I guess I take that for granted. Of course, if you get caught that's another problem, but even then you just renew your subscription and only show it when checked.
The argument is that P(customer wants to run their own firmware) cancels out and 2,3 are just the raw probability of you on the receiving end of an evil maid attack. If you think this is a high probability, a locked bootloader won’t save you.
Very neat, but 1) is not really P(customer wants to run their own firmware), but P(customer wants to run their own firmware on their own device).
So, the terms in 1) and 2) are NOT the same, and it is quite conceivable that the probability in 2) is indeed higher than the one in 1) (which your pseudo-statistical argument aimed, unsuccessfully, to refute).
The fuck? I'm not arguing anything. I'm only pointing out that buying the book only to support the author likely isn't as impactful as it might seem, especially if the idea is to give the middle finger to large corps like Amazon.
Responding to your first paragraph, the rest wasn’t constructive.
Shopify paying for infrastructure related to Ruby is an investment, not charity. Hosting gems costs money, and a healthy community depends on that gem hosting. Shopify, in turn, depends on that healthy community to produce and maintain gems, train future employees, stuff like that. They're not paying that money for fun; it is to protect their interests.
And all of the above would be true even if the OSS committee wasn’t 100% Shopify affiliated. That’s gravy.
You’d think that name, Shopify, would appear three times, once per employee/committee member. Or just once, to say the entire OSS committee was employed by Shopify, if we’re still identifying the group strictly as a group. Either would be fine.
> Or just once, to say the entire OSS committee was employed by Shopify,
Mike works at Basecamp (now and then). Based on comms I don't believe any of them acted on behalf of their employer i.e. no "team orders." Or if they did, they did so in ways that aligned with my perception of what I believed to be the correct read of the situation.
I also think that we (as humans) are much less capable of knowing what things sway and influence our opinions than we think. We are much less capable of correcting for conflicts of interest than we would like. The "tappers and listeners" study is about adjusting for knowledge (the curse of knowledge), but I think it applies to influence as well. Which is to say... I'm sure that everyone was influenced in many ways, but I felt they acted as individuals and reacted in real time.
There are other details of affiliations that I omitted from the former maintainers as well, that are true to state, and likely had some impact on their decisions ... but I used judgment to omit what I didn't think was fair or didn't think was immediately relevant. Not saying I got it all right all the time, but sort of chiming in to say "I'm not only omitting information in favor of one party." Yes, I'm biased...but I'm trying to correct for that bias. (A funny thing to state after just saying humans are bad at it, I know).
But you knew that.