I didn't go to FB thinking "it's unethical, but money." I went there because I wanted to work on a product that I and every one of my friends/family use, one that helped me a lot when I was getting divorced, etc. The money was OK, but you can make more than at FB, e.g. in finance or other special places.
I've never worked at a company that takes data protection as seriously as FB, or has the caliber of people protecting data.
Also impressive was that _every_ week there was a 1-hour Q&A where any intern/employee could ask Zuck a question (it's open mic). In my time there I never saw Zuck really duck a question. He stands behind what his company does and believes in the mission, as do most employees.
"it's the response of lots of pragmatic employees/"investors" being protective and worried about their investment" > sorry, this is just silly...
>> I've never worked at a company that takes data protection as seriously as FB, or has the caliber of people protecting data.
I don't doubt this, but I think Facebook has a flawed culture that allows and encourages employees to use this mindset as a rationalization for unethical things like:
- run emotional experiments on news feeds
- silently logging all Android users' calls and texts
- allow proliferation of fake news
- allow buying of propaganda ads from state actors
- zero safeguarding of data to ensure it wasn't sold by app creators/devs.
Data is central to Facebook's business model, and the ability to collect, analyze, and sell lots of data (a natural result of the 'big data' hype) became an infatuation for Facebook employees.
The Boz memo supports my point - except he cleverly frames it as 'grow at all costs' rather than the underlying 'collect/analyze/sell data at all costs'.
"run emotional experiments on news feeds" > I believe this happened once, for the advancement of science. Personally, given that FB is the only place where such real social science can happen, I wish they'd do it much, much more. I don't think any social scientist can perform an A/B test at that scale outside of Facebook.
"silently logging all Android users' calls and texts" > I also don't like this.
"allow proliferation of fake news" > I think the "allow" part is disingenuous. It's not like FB people are able to guess all the bad vectors ahead of time and have alerts set up in advance. Also, remember, 2B people are on FB, so there will be a lot of shit, because that's what people are like. I actually think they reacted pretty quickly after the first credible attack.
"allow buying of propaganda ads from state actors" > Not sure what you mean? It's okay to run ads in US elections, right? You're saying other countries shouldn't, if you don't like the gov't? This is a lose-lose for FB I think: either people like you bitch that they're enabling a bad gov't, or they're seen as a censor. Believe me, a lot of smart people are trying to figure out what the "least wrong" thing to do is on things like this.
"zero safeguarding of data to ensure it wasn't sold by app creators/devs" > bullshit, they stopped this in 2015. But I agree, the way it worked before was really broken and invited this to happen.
>> "zero safeguarding of data to ensure it wasn't sold by app creators/devs"
> bullshit, they stopped this in 2015. But I agree, the way it worked before was really broken and invited this to happen.
> I've never worked at a company that takes data protection as seriously as FB
Before or after 2015? Because a couple of years does not quite make up for the preceding period starting in 2007 (if I recall correctly) during which FB clearly didn't care.
I worked there in 2016-2017. I was a Data Engineer, so I was pretty close to this. It was taken very seriously, to the point of being annoying (tables with PII get anonymized, which then means extra work, etc.). Also, the sheer amount of effort that went into this [the tooling/infra that was already there when I arrived] was impressive.
I'm not claiming FB couldn't have / can't do a better job, you can always do a better job, hire even more people for this, etc. But it was definitely taken very seriously, much more seriously than you'd think from all this bad press. And if you go and work there, you'll be impressed, I guarantee that.
However, what I'm talking about is data protection, the problem here was that app permissions were explicitly too loose [until 2015]. As I said, I also think this was a bad policy, and people are rightly upset. But there's way too much generalization happening in this thread.
> In my time there I've never seen Zuck really duck a question. He stands behind what his company does and believes in the mission, as do most employees.
Just this week he ducked questioning by the UK parliament and opted not to stand behind what his company does. A couple of weeks ago, when this story broke, he opted to hide for a little while, issue legal threats against The Guardian and the NYT, and see if the whole thing would blow over.
My understanding is FB is still going; it's just that Zuck is sending somebody else. FB operates in a lot of countries, and he can't personally talk to every parliament for PR purposes. He said he is happy to go to the US one. I'd do the same: go to the US one and send others to the rest.
I believe the problem last time was that whoever was doing the answering (an FB lawyer?) too often claimed not to have the info being asked for ("I'd have to check and get back to you"). The questioners felt it was an intentional ploy to weasel out of answering uncomfortable questions. Hence the current insistence that Zuck be there himself.
There's no guarantee - in fact I'd suspect it's less likely - that Zuck would be able to answer those questions better than a relevant lawyer or a more relevant lower-level director or VP.