
The main issue I see is how they plan to fix this issue:

> Instead of guessing our location, relationships, or hidden desires behind our backs, they could ask questions and respect our answers.

You __really__ cannot trust what someone tells you about themselves. Machines can rely on the information people feed into social media because that's information the person is OK with sharing with anyone and everyone (or, at least, with their friend group). But once they start asking specifically about relationships, (bad) habits, and embarrassing situations ("were you at <a bar> yesterday at 2 AM?"), people often lie to the machine to make themselves seem like better people or to boost their own morale.



Yes, but that's okay. Why shouldn't humans be able to lie to machines? We value privacy because some things are simply not other people's (or other machines') business.

In the end, it matters why you share the data. If you share data with a service because you have an addiction problem and the service will help you overcome it, then it is reasonable to be more open and personal. But if the same data is also handed to others to judge your creditworthiness, then we have a breach of trust, and that is rightfully unacceptable.

If you share your data on Facebook (or any more trustworthy platform), your objectives are of a different kind: you want to share aspects of your social life. This is a lot less private than the addiction service, and also a lot less critical when it comes to the correctness of the information.


It’s not really lying to machines, though; the machines don’t care. It’s lying to the people on the other end of the machine.

Suppose somebody working on Hacker News wants to redesign it, and wants to know if they should optimize for long visits or short visits. If they call me on the phone and ask me how often I open HN and what my mean usage duration is, I’m just going to make up numbers. Their website analytics, on the other hand, can tell them something closer to reality.

I’m not saying that’s good or bad; I’m just saying that our devices are often more reliable records of ourselves than our own memories and thoughts.
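The kind of analytics described above can be sketched in a few lines. This is a toy illustration, not HN's actual pipeline: it assumes a simple log of page-view timestamps for one visitor and uses an arbitrary 30-minute gap to split views into sessions before averaging their durations.

```python
from datetime import datetime, timedelta

# Arbitrary sessionization cutoff; real analytics tools pick their own.
SESSION_GAP = timedelta(minutes=30)

def mean_session_minutes(timestamps):
    """Group sorted page-view timestamps into sessions separated by
    more than SESSION_GAP, then return the mean session duration in minutes."""
    ts = sorted(timestamps)
    if not ts:
        return 0.0
    durations = []
    start = prev = ts[0]
    for t in ts[1:]:
        if t - prev > SESSION_GAP:
            # Gap too large: close the current session, start a new one.
            durations.append((prev - start).total_seconds() / 60)
            start = t
        prev = t
    durations.append((prev - start).total_seconds() / 60)
    return sum(durations) / len(durations)

# Hypothetical visitor: a 10-minute morning session and a 5-minute evening one.
views = [datetime(2019, 7, 1, 9, 0), datetime(2019, 7, 1, 9, 10),
         datetime(2019, 7, 1, 21, 0), datetime(2019, 7, 1, 21, 5)]
print(mean_session_minutes(views))  # -> 7.5
```

Unlike a phone survey answer, this number follows mechanically from logged behavior, which is the point being made above.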


This gets into very deep water about what the "real" self is. And using this in the machine feedback loop produces weird side effects, like YouTube's tendency to recommend far-right, conspiracy, anti-vax etc. videos because those have high engagement. It's almost a dissociative effect, where people keep watching or clicking but understand less and less why they're doing it.

People who wire a machine directly into the id have obviously never seen or understood Forbidden Planet.


It gets into the value of the humanities. Not the departments at universities but the course of study, which must be concerned with the functional sum of sensory systems.

Since Plato, the humanities have served critical purposes beyond the production of immersive experience. But the two are no more separable than science is from logic.

The value of the humanities, in my view, is in beginning with the sum of the effects and critiquing the mechanics from there. I am afraid this necessitates something like the opposite of a technocracy, with no lack of sincerity and brevity. Hence its historical congruence with enlightenment, whatever that is ...


> Not the departments at universities but the course of study

Why are you making this distinction? I think most people undertaking that course of study will be doing it at a department in a university.


>> Not the departments at universities but the course of study

> Why are you making this distinction?

Probably because there's a pretty deeply held belief among a lot of technologists that non-STEM degrees and departments are at best wastes of time, and at worst utterly corrupt.


I've observed the very YouTube behavior you describe, on a newly set-up computer in a new location, where I hadn't logged in with my Google account. It's really horrible.


> I’m not saying that’s good or bad; I’m just saying that our devices are often more reliable records of ourselves than our own memories and thoughts.

True. But I am still running a script blocker and other privacy mechanisms to make the analytics fail.


> ...but once they start asking specifically about relationships [etc] people often lie to the machine to make them seem like a better person or boost their own morale.

People do this on social media anyway. Because it's pseudo-public, social media is mostly performative. You post your staged selfies, victories, and happy memories and don't post the rest. The only place you'll find the kind of honesty you're describing is private chat, and analyzing chat for "preferences" is like snooping on people's private conversations - ultra creepy.


I see another aspect to this as well. Our incentives are fundamentally opposed to the incentives of those who are collecting data on us. So not only do they have reason not to believe what we tell them about ourselves if they were to ask; they also have reason to find better ways to use the information against us if we were to answer.

I don't think there's a good solution to this as long as so many services that people want to use are monolithic and centralized.



