Internal Facebook posts of employees discussing leaked memo (theverge.com)
486 points by coloneltcb on March 30, 2018 | 404 comments


The Guardian has a piece that discussed the FB employees' reaction to this leak, and it was downright scary. Many calls for hiring workers "with integrity" and talk of how this leak was destroying FB's perception as a great place to work.

How scary it must be to work at a place with such an overwhelming "don't rock the boat" mentality. Leakers everywhere, at Google, FB, and Apple especially, risk their jobs and their careers to give the public an open look at places which hold overwhelming power over our personal lives.

FB's internal perspective on privacy and its goals are vital for the public to know; it shouldn't take the next massive breach of trust to trigger an investigation to learn the details. A leaker, sorry, I mean someone "without integrity", in 2016 could have done a lot of good.


I don't know about Facebook in particular, but what you're missing is that the reason employees can debate internally about policy is that they trust it won't leak externally. The risk of leaks eventually results in companies clamping down on security, so most employees aren't told anything that's not already public, unless they need it for their job. (Much like Apple has been all along, where employees only know what they need to know.)

So I would ask you: where would you rather work? At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you? If it's the latter, you might as well be a contractor.

You'll find this is true in most organizations, not just companies. They want to know if they can trust you with their secrets. It does require some faith that internal debate will help the organization make good decisions most of the time, which admittedly can be a stretch sometimes.


I think you're missing the greater context here. Bosworth's memo is classic ends-justify-the-means. This is almost an admission that they knew things they were doing would be deeply unpopular.

The post in full doesn't read at all like it was to stimulate discussion. It reads more like it was to silence dissent.

If you really wanted to stimulate discussion and gather employee views on this stuff, you'd send a survey round. But the relationship is still asymmetrical between boss and employee.

All you're doing when posting something like this semi-publicly is creating an environment where quieter, more conscientious views get shouted down by the loudest voices.


>It reads more like it was to silence dissent.

Which makes his response read all the more hollow. Calling it a straw man? The post seems to have been an all-out justification for immoral behavior by an executive. I can't imagine a Jr. Engineer or someone fresh out of college with their MA in Stats feeling super comfortable hopping in and going, "Hey, this sounds unethical, and if people saw you saying it they would think we're hella fucked up." I'm relatively low-ranking at another big SV company and the thought of needing to stand up to a high-ranking employee like that is more than a little intimidating.


> the thought of needing to stand up to a high-ranking employee like that is more than a little intimidating.

And not just that, but to be expected to "contribute to a discussion" in such a way that all your coworkers can see. I think, as somebody who takes objection to that memo, I'd probably be more inclined to look for alternative work.


Same. I'm not even THAT low-ranking, and it's still a big deal when I push back against a VP or director on something that's in my area. For some topic where I didn't even feel like an expert, against such language, and with morals at stake? I think I'd just silently start looking for a new job.


> I think I'd just silently start looking for a new job.

That's good though. Boz already made the decision to prioritize connecting more people, despite the costs. He didn't have to tell his employees, but he did. This allows you to make the decision on whether it's worth staying at Facebook.


So basically, he’s admitted to unethical behaviour? Is this why his heart is breaking?


It's good for that employee, but it doesn't do anything for the billions of people using Facebook.


I had disagreements with Boz via text while I was a rank and file employee at Facebook. It didn’t change his mind, and I still thought he was incorrect.

But I’d sure as hell take that situation over many others I’ve had where my only contact with execs is through occasional, content-free memos.


I'd guess that sending a one-sided bombshell like that would tend to stimulate debate rather than suppressing it, at least in a company with a tradition of lively internal debate. Maybe in more top-down companies it's different?

But yes, the people in power do tend to be heard more in internal debate. (Not necessarily just managers though. Good or controversial writers can also have a lot of influence.) And this does mean more soft-spoken people sometimes don't get an equal voice.

Online discussions are often more heat than light. I don't think internal discussion can be replaced by surveys, though? They're both useful.

There are also problems that equality doesn't fix. As the number of people scales up, the power of each person gets smaller. Filling out a survey when you know it's one of hundreds or thousands tends not to feel very empowering, or even a good use of your time.


I guess I'm comparing and contrasting this with the infamous Google sexism memo.

They canned an employee for saying something unpopular they disagreed with.

The fact that this went such a different way says something. Maybe that something is "the cultures of Google and Facebook are so different it explains the discrepancy." But maybe it's "Facebook wanted to float this as an ethical trial balloon."


They canned an employee for creating an unholy PR shitstorm outside of the firm.

Basically, this is the end of the tech world for all those people who used to join these firms because they believed tech would make the world a better place.

Now it's going to be pretty much closed communication and minimal interaction internally. If you have an issue, well, tough balls, tech is no longer good for that - god forbid it shows up on HN. If it shows up in the media, that's career suicide.

I suspect it's probably time for HN to be shut down soon as well.


No they didn't: they didn't can the leaker who actually created the shitstorm, they canned Damore to appease the outraged.


I must admit, when I read it, it did absolutely read to me like a contrarian attempt to start debate. I find it hard to believe that anyone would post sentiments along the lines of ‘we connect people, so what if someone commits suicide’ without it being a deliberate attempt to start debate.

If he actually believed that stuff, it would be extraordinarily scary. I don’t think he does.


"The risk of sarcasm is that you're taken seriously."

Trolling is a terrible leadership technique.


Have you read any of HN’s opinions on self-driving cars? There is a strongly held belief that individual deaths don’t matter as long as, on average, deaths go down. You can argue that connecting people is inherently good, so how are Boz’s opinions any different?

Boz’s pieces over the last few years tend to fall into the “Strong opinions, weakly held” category. I also suspect he argues a point that is stronger in sentiment than he really believes, to help his message stand out.


The problem is, you might think that but this is senior leadership putting out an email that is setting the general culture and direction for the company. He’s explicitly recognising and endorsing “questionable” (his words!) decisions made.

He can’t come back and say he didn’t mean it. Besides, if he was trying to spark debate, doesn’t that mean he thought it was even potentially justifiable to use unethical practices to drive growth?


I agree it was very poorly judged. Regarding your last point, I honestly assumed that he had seen stirrings of this kind of reasoning within the company and was attempting to ridicule it.


Are they truly unpopular? The events in question happened several years ago, and while this specific incident wasn't known, it was well understood that FB was profiting from harvesting personal data.

In my daily casual conversations with co-workers and friends, the topic is very rarely raised, and only to speculate why the timing of the issue seems to be tied to a rise in conservative politics.


The consequentialism is not the talking point in the memo. What's controversial is his valuation of "the ends": that "connecting people" has greater utility than the life or happiness of a minority of those connected.

I'm not going to make any assertions about the intent of the memo because I don't know the context, but the logic expressed in it seems, from the outside, to have already been their strategy.


> This is almost an admission that they knew things they were doing would be deeply unpopular.

Yes, and there were discussions internally about this. One could argue that the recent shift from promoting pages' content to promoting your friends' content might have been the result of that.


Perhaps you would send a survey; Bosworth chose a different option. Internal memos leaking out to the public without context are a classic cause of FUD.


The post will always look different to outsiders, won't it?

Most of the justifications for it appearing as dissent-silencing will, as a result, have to be after-the-fact justifications.


I prefer the distrustful organization.

I would never be tempted to suspend disbelief that this one time my opinion, effort, and goodwill actually mattered.

I've fallen for the "trust us" scam too many times. Embarrassing. More and more, the only thing I trust is mutual distrust.


I've never worked for a company that gave employees a forum to talk openly. Company policy was always set by upper management behind closed doors and broadcast down to everyone else. Discussions and disagreements were handled privately.

I like the idea of employees having open discussions about company policy and direction, but I never would have believed such a thing could exist at enormous companies like Facebook and Google. Though, given recent headlines, I'm not sure it will survive much longer.


How about a company where decisions are shared internally and employees feel empowered enough to speak up, externally if needed, against what they see as an injustice when internally nothing is done? That shouldn't be too much to ask for. Perhaps some tech unions are needed to enforce this, as shareholders and owners would probably rather have just the dichotomy you presented.

This just reveals how big the power imbalance is between employees and executives that we’d have to make such decisions.


When you say speak up, do you mean internally or externally?


I thought it’d be clear but I added clarification.

Speaking up internally when you’re not on the board often doesn’t get much done. At least if it’s a moral objection.


If you want to get something done, you need to think about who gets to make the decisions and how to influence them.

Depending on what it is, speaking up externally might not get you more pull with internal decision-makers? Particularly if they feel betrayed.

Or, maybe a big external stink could cause action? Depends on what it is.

It's quite possible that neither would work, and then you've burned your bridges for nothing.


> At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you?

That's a false dichotomy. I'd like to work for an employer that would not mind me discussing my work-related stuff outside without immediately classifying that as a leak, and I'd like my employer to trust my judgment in knowing what is and what is not appropriate in such discussions.


Discussing stuff outside work is different from leaking internal communications verbatim.

I left Facebook for a place where the stakes are a lot lower, but information leaks like a sieve.

It’s really disheartening to know that a lot of new coworkers would prefer to leak their “spin” to the press and actively try to damage the company when they don’t get their way.


It all depends on what the public interest angle is. In the case of Facebook the hypocrisy on display borders on the unbelievable. Facebook arguably infringes in the worst way possible on the privacy of a very large chunk of humanity but is highly offended when its own 'private' communications are exposed.

What's good for the goose is good for the gander, and companies with this much influence on the world should welcome transparency, not oppose it. And if they do not welcome transparency then we'll have to help them along a bit every now and then.


Facebook doesn’t maliciously expose private user data in order to inflict harm on people.


No, they do it to make money. But that's all the same to me.


Employees would probably feel more comfortable about debating opinions regardless of company trust if their opinions didn't involve Orwellian kinds of user manipulation and "questionable practices".


Unfettered, leak-free debate, or heavy secrecy? Seems like an easy choice, especially on paper, just like unlimited vacation vs. 3 weeks paid — the devil is in the implementation and unforeseen consequences.

I think the reality with FB is that the current idealized system is not necessarily the best, and likewise, the occurrence of leaks is not necessarily a sign of impending doom. Sometimes leaks are a necessary symptom for when an organization has gone off the rails and has failed to self-correct. I don’t think any organization enjoys or wants leaks — just as no human enjoys sneezing or diarrhea — but sometimes the temporary discomfort is necessary for long-term health.


So you could say, they value their privacy and don't want their comments shared with people they didn't authorize?

It's ironic to see employees complaining about what is essentially a lack of privacy, when the company they work for goes out of its way to convince everybody that privacy is a thing of the past, and in so many words, so does the very Bosworth post they would protect and keep private. Eat your own dog food.

And then one of them says that whoever leaked the post (the whistleblower, is how I would refer to them) lacks integrity. Integrity? You work for Facebook. Has it never occurred to you that maybe you're the baddies?


The word "integrity" may have a different meaning inside a tightly-knit corporate culture[1] than on the outside, just like "honor" means something different inside the Mafia, where it means "you can steal and murder, but above all, keep your mouth shut".

A corporate code of silence that insists on the absolute privacy of internal communications is similar to the Mafia's code of silence, in which it's considered bad form (punishable by death) to blab to the authorities:

https://en.wikipedia.org/wiki/Omert%C3%A0

"Omertà is a code of honor that places importance on silence, non-cooperation with authorities, and non-interference in the illegal actions of others."

It will be interesting to see what kind of documents this and other whistleblowers will decide to leak in the future. Facebook needs its own Snowden to expose its inner workings.

[1] I suspect that the corporate obsession with secrecy we're seeing here is not unique to Facebook. What's unique is the irony of a privacy-destroying company insisting on its own right to privacy.


That's a false dichotomy. It's better to work somewhere where transparency isn't a problem.


Exactly.


> So I would ask you where you'd rather work? At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you? If it's the latter, you might as well be a contractor.

Employers which still use trust at scale are ignoring their risk analysts. The risk of a secret leaking is proportional to the number of people who know the secret. You can reduce the risk with Stasi-style surveillance, or legal enforcement (e.g. legally classified state secrets), but few people wish to work under those conditions.

It's a false dichotomy because people would rather work for an employer that trusts them with the secrets they need to get their job done, and doesn't trust them with the secrets they don't need, a.k.a. the principle of least access. Openness in organizations is important insofar as people can attain access to information they need when they need it, but not unlimited access to everything, which ultimately reduces organizational trust when leaks inevitably occur.
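
A minimal sketch of the proportionality claim above, assuming (purely for illustration) that each of n insiders leaks independently with some small probability p; the function and numbers are hypothetical, not from the thread:

    # Hypothetical model: each of n insiders leaks independently with
    # probability p. P(at least one leak) = 1 - (1 - p)^n, which is
    # roughly n * p when p is small -- hence "proportional to the
    # number of people who know the secret".
    def leak_probability(n: int, p: float) -> float:
        return 1.0 - (1.0 - p) ** n

    print(leak_probability(10, 0.01))    # ~0.10
    print(leak_probability(1000, 0.01))  # ~0.99996: near-certain at scale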


> At an employer that trusts you not to leak stuff, or somewhere that doesn't trust you? If it's the latter, you might as well be a contractor.

I'm a contractor who's been on the same team of FTE devs for a year. It's going really well, but I was a little hurt when I realised they were reading CVs to fill a vacancy on the team without putting me in the loop.

Edit: I understand why they are doing it, I mean I'm from a big consulting company (Alten), but still it stung a little, especially since I'm on pretty good terms with the rest of the team.


This sort of post isn’t a debate. It’s a pep rally.

Personally, I’d rather be in the dark about policy than be snookered into thinking that I have some meaningful input.


Oh the irony. People trusted Facebook to keep their data secure and it 'leaked' to C.A.

It seems apt that internal debates and posts would leak to the outside world.


I was a Facebook employee for several years; I left shortly before this memo was drafted. The environment was pretty much the opposite of "don't rock the boat." Dissenting opinions were encouraged and openly discussed, but everyone understood that could only happen if it didn't undermine the PR department's job.

Boz fostered this culture by example, publishing internal memos critical of the company. In this case, questioning whether the company's driving mission was the universal good that leadership thought it was when it was adopted. Having such high-profile dissent in circulation gives more cover to individual contributors with a gripe than any amount of policy language would.


> In this case, questioning whether the company's driving mission was the universal good that leadership thought it was when it was adopted.

Except that, Bosworth's shabby recantations aside, in the original post the driving mission was not questioned but reinforced to an extreme, cult-like degree by a high-ranking Facebook official. He didn't mince words and sought no compromise: growth at any cost, using unethical methods and to the point of endangering people, if that is what it takes. Growth is a good by definition, regardless of what your antiquated, pre-Facebook morals tell you.

This is a wide extension of the field where debate is possible and a strong reinforcement of unethical behaviour: "Facebook and Boz have your back, and anyone questioning growth is an enemy." What was previously unspeakable is now under debate; we are debating the degree of acceptable unethical behavior, and Boz's position seems to be "to any degree". This was merely 20 months ago, not in the distant past when Facebook was founded.

It's specious to call this an environment of open debate; it's a bold move toward the organizational culture of a cult or criminal gang. It's not surprising at all, then, that the current debate centers on ways to root out the traitors and select employees for "integrity" (unflinching loyalty).


I've read it over a few times again, and think I know why it's so divisive. In the memo, he describes a state of affairs, with two possible subtexts:

1) This is the state of things today, and the uncomfortable truth of how we got here; what do we do about it?

2) This is how things both are and should be; either get in line or leave.

I, obviously, gravitated towards the first interpretation and you the second. Without further context, I'm not sure there's any way to really know which was intended.


It seemed more declarative to me. It seemed more about clearly delineating the ugly parts of a pre-existing ideology, not suggesting that there be any change, but that people should acknowledge the consequences of pushing the "connecting people" philosophy.


When viewed in the context of a conversation about whether “the company's driving mission was the universal good that leadership thought it was when it was adopted”, can you see how Boz’s post could move the conversation in a positive direction?

I know this is hard to believe from the outside, but most Facebook employees believe that Facebook can have a positive influence on the world. It’s deeply ingrained in the company’s culture. An executive doesn’t just come out and say “it’s all business, fuck the consequences”.

This is why not leaking things is so important. The context and culture within a company change how a message is interpreted.


This falls within: “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum....” ― Noam Chomsky

It applies to pretty much any tech company that claims to love debate and dissent internally.


Reminds me of Orthodox Judaism (other religions may be similar, but it's what I grew up with). Intense debate was highly valued and encouraged, but as soon as you questioned the fundamental truths that the belief system was founded on, you went too far. e.g. questioning that the bible was written by God, or whether God even exists.


That in turn reminds me of Al-Ghazali [1], and the impact he had on Islam. Unable to resolve why some things seemed to contradict Islamic beliefs, he developed and successfully spread the view that there's actually no such thing as causality or logic -- that every single thing is an independent act of god. In other words a leaf does not start burning when exposed to fire because it reaches a certain temperature (speaking loosely), but rather because god decided he'd set it alight at that exact moment. And of course that ash is not created by the fire, but instead by an instantaneous decision by god to turn the burnt object to ash. By rejecting any and all causality, he was able to dismiss all logical issues by simply asserting that causality and logic are social constructs. And that belief spread like wildfire, as such rationale that offers easy explanations for uncomfortable to accept phenomena is wont to do...

Today somewhere around 1/4 of the world's population is Islamic. And there have been a total of 3 Islamic Nobel laureates in the sciences. It's a rather nice demonstration on the question of whether 'geniuses' are born or made. If Allah's hand is not chained, what point is there in seeking to discover these alleged laws of nature?

[1] - https://en.wikipedia.org/wiki/Al-Ghazali


Over half of the Muslim Nobel laureates (sciences and more), according to that Wikipedia link, have come since the year 2000.

So, obviously, since the graph is spiking, we can expect to see lots more.

This is my sarcastic way of saying that taking a single metric which is affected by tons of different factors, and applying it to a complex argument about, basically, sociology/anthropology (human behavior and culture) really doesn’t provide a lot of value.

I think your post opens a door to a lot of interesting conversations, but that using the # of Nobel Prize nominations per religious / cultural group as a metric closed most of those doors.

It’s also not very scientific.


Please keep the crypto racism off HN.

The Nobel Prize is a European institution.


I'd say it's the same in other religions (from my personal anecdata with Catholicism).

But I think that, in such a context, the acceptance of fundamental truths is necessary to have a debate, just as mathematical axioms are necessary for proofs. In addition, the fundamental truths are about one's Faith, so I don't think there's a lot to debate; either you believe or you don't.


Spot on. It's an effective technique to give people the illusion of having explored all possible options and arguments.

Relevant: https://en.wikipedia.org/wiki/Overton_window


That quote is meant to be applied towards government and society. It doesn't make sense when you apply it to companies. Most employees are passive and obedient as long as they get paid. Do you think IBM or Goldman Sachs employees are allowed to dissent internally? I'd much rather work in an environment that's somewhat open than one that is completely closed.


Is your argument that Facebook allows more dialogue than Goldman or IBM, so this is okay?


Perfect example of what Chomsky was talking about, and the former cult member above who sincerely believes they were in a free speech zone demonstrates how effective this management technique is.


Basically saying https://en.wikipedia.org/wiki/Wedge_issue with more words :)


This seems to assume that public debate is masterminded according to a particular design.


Debate in a corporate environment is, in fact, masterminded. In these cases, it's often specifically encouraged, in a certain fashion. The venues for the debate are built and moderated for that purpose, backed by the employer's policies on allowed conduct.


An interesting comment from the comments on The Verge:

It’s probably worth bearing in mind that in any company that pushes this kind of ‘open’ communication, there’s an unavoidable pressure on most ordinary employees to say the ‘right’ things. A company that has so much of its internal correspondence open and visible to anyone will very quickly descend into 1984 territory. So those ‘dense’ folk bleating about integrity are likely to really be saying ‘I would never leak, boss, you can trust me’. And in all likelihood, the actual leaker is one of those voices. Personally, I find it baffling that so many supposedly intelligent people see an office under the Eye of Sauron as a Good Place to Work.


>"Boz fostered this culture by example, publishing internal memos critical of the company. "

I read his memo titled "The Ugly." There was nothing in there that was critical of "the company." The only criticism I read was this individual being critical of people who might be prone to self-reflection. Judging from the memo and other employees' characterizations of him, "Boz" just sounds like a total asshole.


Yes, it does seem like it. However, I would leave the door open a crack for irony. This is hard to judge out of context, but he has to have anticipated (encouraged?) pushback.


Does the memo really come across to you as "critical" or "questioning" anything?

> That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends.

> I know a lot of people don’t want to hear this. Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here. If you joined the company because it is doing great work, that’s why we get to do that great work.

I understand what you and others are trying to say. That he somehow disagreed with his own words, and wrote it to start an internal debate.

But is the reader supposed to know that from some kind of context we can't see? From the words themselves, it seems pretty clear he fully supports the continuation of, in his own words, "questionable" practices.


He said that he didn’t believe his own words. I just don’t believe him. He literally tells employees that all the dodgy things they are doing are justified to help them “connect” more people. In no way do I believe he didn’t believe that.

Not only that, but he told his employees what they are doing is totally justified and to keep doing it, because it was sanctioned by management.


This is how I would communicate, and do communicate, when people are making morally borderline choices.

The most common reaction for people is to ignore the moral implications, à la Wall Street's "We are unlocking value".

If Boz went 180 degrees and said "welp, that's it, growth is over, we have a major disaster in a few years", the GREATER force would murder him, namely shareholders.

Even now, Facebook's greatest pain is coming from the hit to its wealth, not from the number of uninstalls coming via "deletefacebook".

At this scale and size for a large top-tier tech company, the man in charge is expected not to rock the boat. Any course correction occurs slowly, or through crisis.

Apparently we are doing it by crisis.


Please don't try to spin the term "don't rock the boat." A company that brutally cracks down on leakers and has employees en masse calling them "people without integrity" is the epitome of that mentality.

It's great they feel they created a microcosm of openness within the company, but that doesn't seem to have made it act any more morally when it comes to protecting user privacy.


Scary, it seems like they use Newspeak internally.


>Boz fostered this culture by example, publishing internal memos critical of the company. In this case, questioning whether the company's driving mission was the universal good that leadership thought it was when it was adopted.

“We connect people. Period. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it”

This is not dissent, this is not 'rocking the boat', this is not being critical of the company. He's taking their mission to the extreme worst case scenario and saying that even then the mission is justified. It's the polar opposite of the things you assert it is.

FB is covering by saying no one agreed with it and it was only there to provoke discussion. Why delete the post and its discussion then? It's obvious the discussion on the post wasn't critical enough to actually provide cover for these excuses, so they burnt the post and are now lying about it.


"Why delete the post and its discussion then?"

Because the post and the discussion was leaked publicly.


The question remains the same: so what?


You and all the other former Facebook employees sound like people who are working hard to defend the money you made there. Because the dirty looks members of the general public now give those who made their money from Facebook probably get to you.

Unless you were there since 2005, Boz was a higher-up with more seniority, so the original memo was more of a put up or shut up piece than an RFC.


> so the original memo was more of a put up or shut up piece than an RFC.

That couldn't be further from the truth. Facebook's internal communications happen almost exclusively through Facebook itself, meaning this "memo" was most likely a Facebook post, complete with liking, reacting, and commenting capability for anyone in the company.

Buzzfeed touched on this in their version of the story:

> One former employee who spoke with BuzzFeed News noted that they remembered the post and the blowback it received from some workers at the time. “It was one of [Bosworth’s] least popular and most controversial posts,” the ex-employee said. “There are people that are probably still not in his fan club because of his view.”

Source: https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...


I think the issue is that Facebook hasn’t done anything to curb bullying, terrorism, fake news or electioneering. They allowed CA to abscond with user data and they took Russian money to target voters. If they can’t track their ads or comply with federal election laws, that’s a big problem.


> I was a Facebook employee for several years; I left shortly before this memo was drafted. The environment was pretty much the opposite of "don't rock the boat." Dissenting opinions were encouraged and openly discussed, but everyone understood that could only happen if it didn't undermine the PR department's job.

It likely depends on where you are in the company. I haven't worked at FB but have a few friends who have in the past. One of them told me something pretty similar to what you experienced; the other had the exact opposite experience.

So I'm honestly not sure what to believe about their culture. It seems that once a company gets big enough, its culture becomes multiple subcultures, and I'm not sure there is a single culture driving the company anymore.


It's also a question of the importance of the issue that you're "rocking the boat" about. It might be completely OK to argue with your manager about how the UI for a particular feature should look, but not OK at all to question the ethics of the company's mission, or the ways in which it makes money.


Arguing that internal company discussions should all be public is like arguing that people should have no privacy. That's a slippery slope.

For example, how would you feel if you told your spouse about the grudges you have with your friends or work colleagues, then (s)he goes out and tells on you? That wouldn't be a very happy marriage imo.

In society, people who spread gossip are marginalized by those who hate gossip, because we have private affairs that we'd like to keep private. It's a natural phenomenon.

And yes, I see the irony of defending Facebook by invoking privacy. I try not to have double standards.


This is a company that wants to be the default medium for private communications among friends and loved ones, yet deliberately makes security settings opaque and actively encourages oversharing. Anyone who doesn't see that analogy -- especially those employed at Facebook -- is already riding down the slippery slope.


You've hit the key point from my POV.

I have a hard time having empathy for Facebook in this situation when their entire approach to users' information has been incredibly disrespectful. Constant TOS changes. Misleading privacy settings. Opt-out rather than opt-in sharing. Dark patterns designed to serve the company rather than the user ... and straight-up bad ideas. I deleted my account right after they did the TOS change that Cambridge Analytica took advantage of. (The one where your friends' choices would share your information. That was a transparently dumb idea from the get-go.)


What did leaking this memo actually accomplish? If you are someone who wants tech companies to be accountable for their impact on society you should welcome these internal conversations. All that the leak will do is make them less likely.


Conversations where bosses tell their employees we should connect more people and grow the network no matter the cost? You're right we should have conversations about this, but publicly.

Facebook surrendered its right to discuss such things privately when it willfully kept lax policies on sharing users' data. Stuff like this should leak earlier so we can talk about it before, rather than after, awful things happen.


I’m far from the biggest fan of Facebook, but I’m absolutely a fan of playing devil’s advocate in an organization, if for no other reason than to solicit reactions and get people engaged. As someone who will use this device sparingly when appropriate, that’s really what this post looked like to me (as opposed to someone who was in it to get terrorists signed up to FB... really?). I honestly feel sorry for the guy.


On the other hand, have we really gotten to the point where we have to try to provoke others into a debate? Why can't we state what we mean, what we think, what we're uncertain about, and what questions we'd like to discuss, in order to foster discussion instead of provoking it? Playing devil's advocate is fine when it's understood what's going on and why you're playing devil's advocate, but when there's ambiguity you get this game of "yes, I said that, but I didn't mean it," which ends up sounding as weak as it does in this case. Devil's advocate is a great cognitive strategy for exploring an issue together, but it's a very poor conversational strategy.


No, I don't see anything that says provoking others into a debate is the only means of conversation, just one possible way of prompting a discussion. I imagine a straightforward discussion as you described is the norm, and this could be one case where they were provocative, and so it was selected to be leaked. But I agree with your second point that this does not appear to be such a case.


You don't inspire this sort of debate by putting up a straw man.

You inspire this sort of debate through thought exercises that ask about actual application - you couch the conversation to direct your staff toward stronger ethics.

If this conversation were at Uber, in their self-driving car division, the consequences would be human life. The way to have that conversation, with context, would be to couch it in the "trolley problem" - because that would keep the framing.

Ethics, the word is ethics - Facebook is clearly lacking them. We're "dumb fucks" according to FB's chief - and the fish rots from the head down.

And the staff's response "find the leakers" -- funny how many groups of people I find despicable seem to chant this.


It’s really not a good idea to play devil’s advocate as a high-ranking individual in a company without being super extra explicitly clear about that. People might mistake it for the company’s position, especially if no other high-ranking individual contradicts or clarifies the company’s position.


I do believe he meant it when he was saying unethical behaviors and negative effects on society were worth the greater good of connectivity.


When I play devil's advocate I clearly state what I'm doing up front. This smells like an attempt at rewriting history.


You are mistaken. At Facebook, the devil's advocate would argue in favor of government regulation.


Memos like this allow for open discourse within the company. Leaks only encourage companies to be even more closed off. Facebook could easily hide their language in corporate speak if they really want to encourage people to drink the corporate Kool Aid.


That's not what the memo was, it was not a case where "bosses tell their employees what they should do".

It was a case of starting a debate by voicing an extreme opinion.


No it wasn't.

The memo is exactly and clearly telling employees what to think and do. No questions allowed.


Companies sometimes drink so much of their own kool-aid that they lose all perspective on what's actually important. Shining a light on conversations like this one can be an ego-check, where people who don't work at Facebook can say, "hey ... wait a minute here...".

It's an "emperor's new clothes" situation where we all get to play the role of the child.


The thesis of the memo couldn't be written any other way?

I think you could write about the ideas contained in Boz's memo in such a way that if the memo leaked you still wouldn't look like huge bleeps.

It's not the conversations that are getting them into trouble. If it were just this memo, then nobody would care.

Their actions are getting them into trouble. Leaking memos like this merely offers a window into their souls.


> It's not the conversations that are getting them into trouble.

I find these surreal cult-like conversations a lot more off-putting than Facebook's data practices. Those I can understand, these conversations (and the words of these well meaning employees more than those of big bad Boz) make me feel like I need to take a shower. To me, this shows the very worst of intellectually dishonest to the point of delusion, modern day North American culture, and it disgusts me.


I was with you until you specifically criticized North American culture. What makes you think that German or Chinese companies don't also push their employees to place the company's success above ethical considerations?


Of course it could have been written in the style of a press release or perhaps reduced to a politician-style soundbite. But although pablum is harmless when leaked, it doesn't have the nuance needed to give real direction to smart and powerful knowledge workers. It is also bland and may be regarded by thoughtful workers as insincere.


This didn't have nuance. If the guy isn't lying, he was throwing a bomb to get people to react; if he is lying, he was floating the worst let-us-do-evil-that-good-may-come company line I've personally ever seen. In neither case was this a nuanced statement!


It was much more nuanced than the headlines such as the one used by BuzzFeed in their original story [https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...]. He was trying to start a meaningful conversation, which is basically impossible to do under the constraint that you avoid giving adversaries any way to take your remarks out of context and spin them to manufacture outrage.


Well said. If I received internal memos which have gone through an external PR filter, the message would probably read like any other generic press release, and engender gossiping and finding hidden meanings in the memo.

Perhaps it could have been worded differently (better?), but I did appreciate the solid direction that was given by the memo. All too often, leadership is unable to give clear guidance because they are too wishy-washy about what the goals actually are, perhaps not even knowing what the goals should be besides making money.


>What did leaking this memo actually accomplish?

It showed that FB says one thing in public and then does the opposite in private.


There is the "WikiLeaks justification": leaking this memo will force Facebook to have more vigorous internal controls for locking down information, making them less efficient and hastening their downfall.


Wouldn't hiring people with integrity mean hiring executives who write memos that they actually agree with?

>Bosworth distanced himself from the memo, saying in a Twitter post that he hadn’t agreed with those words even when he wrote them.

That is scary as shit: that they think the leaker is the one without integrity, and not the executive team.

What terrible people.


>What terrible people.

The language was kind of funny to me, even. Hunting down the leakers to make Facebook great again... the company sounds like the business version of the White House administration. If it's this difficult and requires this much secrecy to convince yourself that what you're doing isn't evil, then maybe something is very wrong on a foundational level.


> he hadn’t agreed with those words even when he wrote them

I'm struggling to see why anybody thinks this is a reasonable defence.

There is no indication he didn't believe it. The company's behaviour is consistent with it. He only said "I didn't mean it, it was to stimulate debate", and the classic "You're missing context" (which I am not able to show, of course) after drawing negative PR.

It seems very generous to me to give his recent tweet much credibility at all.


>Wrote another: “This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty.”

Says the company whose very moral compass is coming into question.

Nazi soldiers following orders to line up minorities in slums and shoot them, or herd them into cattle cars - loyal, yes? Moral? No.

"Loyalty," in that post, to what? To Mark and the investors' bottom line, and tangentially with it the bottom line of you and your fellow employees? Loyalty to this idea of "connecting the world?" Is that really the value of Facebook?

It sounds like a cargo cult.


The stuff being reported around Facebook is certainly cause for serious concern. "Cult"[0] has a pretty contentious meaning which is problematic when applied to Facebook without further analysis—it can certainly be separated from issues of loyalty and morality. "Cargo cult"[1] really doesn't apply at all, at least in the context of what you're describing here.

[0]: https://en.wikipedia.org/wiki/Cult

[1]: https://en.wikipedia.org/wiki/Cargo_cult#Metaphorical_uses_o...


Thanks, you're right. I've been using the word wrong for a while.


And how do they "connect" people? Maybe I use Facebook wrong, but I see nothing much more than vanilla posts I have no interest in from the same people day after day. There is literally nothing I know of in the platform that would cause me to spontaneously meet someone new, unlike forums, meetup.com, etc.


> And how do they "connect" people?

Don't forget - corporations are people too. I think Facebook's mission might be to connect people with advertisers.


Established organizations are the wrong place for employees to effect change. They are not democratic; they may love to give that impression to employees and the media, but there is always a hierarchy and a powerful inner group that makes the decisions.

Debate is OK, but anything real that threatens the power base will be quickly dispatched. The only forces that can change Facebook are Zuckerberg and his inner coterie, or strict regulation. But this is not a problem limited to Facebook. Google is worse, and there are others like Palantir and a pipeline of companies who would like to take their place.

Ethical behavior from individuals will only have an effect in smaller companies and early startups. When they cannot get engineers who agree to unethical practices, and when there is pushback, they quickly realise they may need to rethink.

But software folks have postured endlessly on freedom and liberty, then gone ahead and built some of the creepiest stalking infrastructure ever made, without a care for fundamental human values or ethics, and thus are not trusted anymore.


No?

People with integrity have done this all the time; people with integrity are standing up and bringing these issues up constantly.

But frankly, normal people can't be arsed to give up free services like Google and Facebook because they don't, can't, and won't pay the full cost of those services.

Further - I am a dyed-in-the-wool non-Facebook user who was warning about this from the day it was created.

But Facebook employees are correct in what they say.

They genuinely believe that they must be a force for good, and that their websites will bring people together.

It is the MOST essential thing for these people to be able to talk to each other candidly and clearly while they still believe in doing good and being ethical.

Because once that ability to say uncomfortable things goes, the only other option is to become the corporate behemoths that all SV-ites hate. To become a suit.

I can't understand how people on HN are missing this.

Facebook has regularly been an enabler - but for all those years HN has been cool with it.

Now, when the shoe has dropped, people here are displaying the same overreach and lack of nuance that created this scenario in the first place.

Facebook is the least of all evils. People are ALWAYS going to create this miserable form of social networking because it's easier and matches human neuro-patterns closely enough.

But this is the one time we will have a single institution which is not yet culturally made up of suits, and which can make the effort to fail correctly.

Facebook internally discussing this and realizing that there is no hope is more critical than people tearing Facebook down.

Having a clear idea of objective reality, of being able to see our actual options as both employees at facebook, and as users of facebook (or friends with facebookers), is our best way forward.


Be careful what you ask for, FB'ers: if FB etc. get considered CNI, you're probably going to have to hold a security clearance if you have access to sensitive data - which is going to suck even more if you're originally from outside the States.

I know that some team leaders at one UK telco I worked at were asked to go through DV clearance - that's TS (drug tests, polygraph) in US terms.


Most companies have held internal memos like this as private to the organization. Any breach of that is a firing offense. FB, Google, Apple, etc. are not doing anything that corporate America hasn't been doing for ... for I don't know how long.

It is disconcerting how FB employees have come out in support of these ideas from the memo, though.


I think big tech has it very different, given how much they know about their employees' personal lives. Some information should leak within a healthy society so backlash and corrections can occur before an election is compromised by a horrific breach.

For a view on what's different, an excerpt from a Guardian piece is below.

---

“It’s horrifying how much they know,” he told the Guardian, on the condition of anonymity. “You go into Facebook and it has this warm, fuzzy feeling of ‘we’re changing the world’ and ‘we care about things’. But you get on their bad side and all of a sudden you are face to face with [Facebook CEO] Mark Zuckerberg’s secret police.”

The public image of Silicon Valley’s tech giants is all colourful bicycles, ping-pong tables, beanbags and free food, but behind the cartoonish facade is a ruthless code of secrecy. They rely on a combination of Kool-Aid, digital and physical surveillance, legal threats and restricted stock units to prevent and detect intellectual property theft and other criminal activity. However, those same tools are also used to catch employees and contractors who talk publicly, even if it’s about their working conditions, misconduct or cultural challenges within the company.

https://www.theguardian.com/technology/2018/mar/16/silicon-v...


One of the important qualities of a great place to work is that decisions are made based on sober analysis rather than reflexive corrections. Maybe you're right, and Facebook is just too important to allow that, but I think it's fair for Facebook employees to be unhappy about this.


Do you honestly believe you can’t have both? I’m not saying all data should be public, what I am saying is if what you’re discussing is morally repugnant given your past history it shouldn’t surprise you if it gets leaked.


The problem is that lots of people find lots of different things morally repugnant. If it's normalized for people to leak things that they consider morally repugnant, that means there are serious costs to engaging in any controversial discussion. There's no way I'm going to talk about diversity at my company if I think someone might go tell a reporter what I said; there's no viewpoint on the issue which isn't offensive to someone.


I think we’ve all blocked out the scary part of Pinocchio, where the boys go to the carnival (Pleasure Island). Spoilers: it ends with them being treated as literal livestock.


Leakers are a HUGE problem, especially when they reveal that the emperor has no clothes. (hello Googlers and FB-ers)

BUT, I've read the memo, and as I saw it, he says that getting people connected can mean that bad guys will also be connected. But that is life. Terrorists and child molesters will use smartphones too... but


The employee reaction is natural. They feel their privacy has been violated.


The data-driven espionage companies should be neutered. Ads are a joke; that's not how they make their money.


This one is my favourite:

"This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty."

Note how the focus is not on the morality of honouring users' privacy, but on the "morality" of protecting the company. Collecting personal information about billions of people, using "questionable" practices, then selling access to it. It's a "good job". One that the commenter does not want to lose. Understood.

However, there are laws to protect companies against employees who leak secrets. Employees sign nondisclosure agreements. Companies can adopt zero-tolerance policies on leaking to the media. They can terminate employees who violate them. No employee needs to consult a moral compass; the rules are clear. Break them and there can be grave consequences.

In contrast, there are no equivalent remedies available to users whose privacy has been entrusted to Facebook. There is nothing to keep FB honest. There are no grave consequences for violations of user privacy.

When there is a "leak" of users information, the user is entitled to nothing more than an impersonal apology.

Relative to other businesses, one might go so far as to believe "there are no rules" in the space where FB has operated. Users (who are not the customers of FB) have no recourse; there's nowhere else they can go.

In all seriousness, it is the user who must hope that every FB employee has a "moral compass". Whether FB employees can trust each other is not what the user wants to know. The user wants to know if she can trust FB's employees.


I found that part wow-worthy too. I wonder if they think whistle-blowers or leakers like Manning or Snowden have a "moral compass" and "integrity", or if they're traitors without "loyalty".


Funny, didn’t Zuckerberg say that “Privacy is dead”? I guess only for users of his product.


Not a surprise really.

Many jobs considered "good" require a person to forget about ethics.

The tobacco, oil, pharma, agro, car, and ad industries. Now we can also add the "social web" to this list.

A person willing to suspend morals in exchange for money is the one you should not trust. Especially if they cannot be held accountable for their actions.

Because hell knows what ELSE they are ready to do.

And they will fight to defend their source of income = "loyalty" = "protecting questionable practices" = "indulging in mafia-like behaviors".



It's kinda ironic that a post about "All we are doing is connecting people and information" gets deleted because it gets connected to a lot of people ("leaked").

You can't simultaneously hold the opinion that there's no harm in sharing information while also holding the opinion that there is such a thing as a "leak." It just strikes me as so ironic for a company that champions "privacy is dead, live with it" to have to delete its own valid internal debates because of the consequences of a lack of privacy (i.e., leak = lack of privacy).


You may find it additionally ironic that about a month before the Boz memo, it was reported[1] that Zuck bought 4 houses located around his own for $30 million.

Privacy is looking pretty alive in that neighborhood.

[1] http://time.com/money/4346766/mark-zuckerberg-houses/


Zuckerberg built a 6-foot wall around his 700-acre estate in Hawaii because he values his own privacy that much: http://www.newsweek.com/facebook-mark-zuckerberg-wall-hawaii...


If it wasn't for the unearned and unassailable value of the network effect Facebook benefits from, this disgusting behavior (from Zuck and his minions) would be enough to drive everyone to a new platform.


"The more secretive or unjust an organization is, the more leaks induce fear and paranoia in its leadership and planning coterie."

"This must result in minimization of efficient internal communications mechanisms (an increase in cognitive "secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption."

"Hence in a world where leaking is easy, secretive or unjust systems are hit nonlinearly relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance."

-- Julian Assange

http://cryptome.org/0002/ja-conspiracies.pdf


Yeah I feel like secretive and unjust are two completely different things, and Julian Assange conflating the two doesn't reflect well on his already questionable character.


Systems that are just do not generally require secrecy to the degree that unjust systems do, precisely due to the fact that when publicized, people will generally agree with just processes.


Except that even a just system has parts that can be taken out of context, and twisted to paint a narrative that it is not just.


There are a whole bunch of examples of this, though "hide the decline" sticks out to me.

It's also the justification for keeping courts open to public attendance but barring recording equipment (in Canada).


That's why I said "to the same degree", i.e. privacy is important, but you don't have to have the massive opacity required for large-scale unfairness to persist.


> You can't simultaneously hold the opinion that there's no harm in sharing information, while also holding the opinion that there is such a thing as a "leak."

Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post. In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment. Maybe that would lead to more collaboration on solutions, which is necessary because there are actually some tricky tradeoffs here. But that doesn't give the same dopamine hit as cutting down the tall poppies, right?


I guess I don't understand your point. But I'll elaborate because I'm interested in this topic.

My point is that Facebook's ethos, that a post-privacy era can exist and be okay, is betrayed by how they clamor when it's their own privacy. It makes it feel like it's a one-directional relationship.

Perhaps you're suggesting facebook's ethos is actually that a privacy middle-ground can exist, where people can choose what gets shown where and to whom. I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.

Aside from ideals, we can point out the consequences of Facebook in practice. Facebook is a written medium that preserves everything (even something from 2 years ago) and thus has constructed a system that forces its users to hyper-curate their entire public persona or suffer social consequences, and from a practical perspective their own VP failed to curate sufficiently. So regardless of ideals, if the system punishes discussion then I see that as a problem, as well as an irony when it happens to their own VP.


> Perhaps you're suggesting facebook's ethos is actually that a privacy middle-ground can exist, where people can choose what gets shown where and to whom.

They can, to a larger degree than most critics seem to realize. I use those controls all the time. I might agree that they're not as prominent or easy to use as they should be, but they exist because people at Facebook cared enough to implement them (which isn't easy or cheap at that scale BTW).

> I can't disprove that that's their ideal, but in light of their data-sharing I have no reason to believe it.

Oh, you mean the data sharing that was curtailed sharply in 2014, and again in increments ever since? Is that continuing effort and foregone revenue "no reason" to believe such a sentiment exists?

> if the system punishes discussion then I see that as a problem

Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?


>> Oh, you mean the data sharing that was curtailed sharply in 2014, and again in increments ever since?

Don't pretend they're taking care of it on their own. They're reading text messages in 2018.

>> Yes, punishing discussion is a problem. Wait, what is this entire thread, and a thousand like it all spawned by the publication of these posts, about?

Wait... Are you arguing that it's not bad that facebook stifles controversial opinions on its platform because its behavior creates controversial discussions on other websites...?


> Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post.

There's a power asymmetry here that makes the individual user vulnerable, and that power asymmetry should be countered by demanding transparency.

There's an obvious parallel here between individual citizens and government apparatus.

Those that control infrastructure and institutions shouldn't be enabled to abuse that power. And if they do they shouldn't be surprised when the affected protest!


>> Likewise, you can't simultaneously hold the opinion that users should have control over where their content is seen, and that it's OK to publish and comment on an internal post.

This argument is illogical, because Facebook forces everyone to sign its ToS to use its services, while nobody forces a Facebook employee to leak internal stuff. Said another way, whether or not I wish to have control over my FB data, FB coerces me to agree that it can do whatever it wants with my data. It's not exactly opt-in, is it? It's far worse, of course, if you consider shadow profiles, because FB is then coercing even people who never explicitly signed up to the ToS. Unless the leak happened via some kind of coercion (which doesn't seem to be the case), your comment is incorrect.

>> In a less spiteful world, some of the employees' reactions might have been taken as evidence that they do understand and care about issues of privacy or containment.

What? You mean you care about something, but you just won't do something about it, nor openly tell anyone why you wouldn't do something about it, or even talk about it before the issue blows up? Yep, totally convincing.

>> Maybe that would lead to more collaboration on solutions,

Why do people need to "collaborate" on solutions? What do they get from it? Is Facebook going to pay people a share of the profits? If Facebook is a corporate entity which serves its self-interest against people's self-interest (which they have clearly been doing for a long time), what kind of idiot would suggest the people whose self-interest has been affected should now "come to the table" so "we can all work something out"?

>> which is necessary because there are actually some tricky tradeoffs here.

The only tricky tradeoff here is: should Mark Zuckerberg be the only one who should go to jail, or should the entire company be rounded up? It is quite tricky, I do agree.

>> But that doesn't give the same dopamine hit as cutting down the tall poppies, right?

I don't know about tall poppies, but "culling" the "weeds" is the only way to have a healthy garden.


> It's not exactly opt-in, is it?

You're free not to use it. If that opt-in isn't enough, exactly how many levels do you want? If you do choose to use a free service, whether it's Facebook or a public library, you have to consider how it's paid for. Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.

> You mean you care about something, but you just won't do something about it

You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.

> Why do people need to "collaborate" on solutions? What do they get from it?

Ummm ... the solutions, which are not only applicable to Facebook? This is a general problem faced by many companies. The solutions could also be useful to the people who blather about creating a distributed alternative to Facebook. I've been a member of the decentralization and distributed-system community for far longer than Facebook or Y Combinator have existed. I also know something about the scale and connectedness of the data at Facebook. We're multiple basic innovations away from being able to create such an alternative. Wouldn't it be nice if people who actually understand various parts of this can talk and work together? That doesn't become more likely when every discussion is filled with people who only read others' comments enough to find where to insert their own half-baked opinions or insults.


>> If that opt-in isn't enough, exactly how many levels do you want?

Since you can't seem to count to 2, how about:

1. You let us share your data with others in return for free service

2. You don't let us share your data in return for paid service

>> If you do choose to use a free service, whether it's Facebook or a public library

Well, a public library is tax funded and people outside the library employees have a big say in its inner workings. So you can't get your comparisons correct either.

>> Actively using something and also actively undermining its means of support ... well, I'll just leave that thought there.

Perhaps you should complete the thought, because I don't actively use the thing in question

>> You seem to have some pretty unrealistic expectations of what individual employees can do at a 30K-person company, or about anyone taking the right action without deliberating first.

Really, as opposed to your very realistic expectations that everyone should just trust FB employees would have "done the right thing" had they not been caught red-handed? Oh right, because FB knows better what is best for everyone else.

>>Wouldn't it be nice if people who actually understand various parts of this can talk and work together?

This is truly bizarre. So if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt? Let us say you think, "oh, but it might take much longer". Does that automatically adversely affect people more than the damages that can be caused to society via rampant data collection? How can you be so sure? Oh wait, because you must be smarter than everyone else, as you got through the interview.

And finally, it is interesting all the things that you selectively left unsaid (exactly like other FB employees have been doing all the while).

- you don't have the courage (what an ironic handle) to discuss shadow profiles

- you never actually addressed the fact that no one from outside coerced the leak, which made your first comment more rhetorical than substantial

- you cleverly twisted the "collaboration" to be amongst FB employees when clearly the following line shows that you actually meant collaboration between FB employees and its users (a dopamine hit for whom, that is? so you are now assuming others cannot read either?)


We've banned this account for violating the site guidelines.

https://news.ycombinator.com/newsguidelines.html


Bystander here. Why the ban? It’s snarky in places for sure but I’d say it’s a pretty solid set of points and counter points. It definitely “added something” to my experience reading this thread.


I suggest that you also identify the primary account behind it and give them a reminder too, or else they'll just keep doing it over and over again until their targets run out of patience.


> 2. You don't let us share your data in return for paid service

Personally, I think that might be a good option, but you can't claim to have made it explicit before so your "count to 2" insult is misplaced. I know that the only thing you've ever done since your account was created is bash Facebook (how nice that anyone can check that for themselves BTW), but even in that light such childishness is counterproductive.

> if FB rolls over and dies tomorrow, does it mean innovation will come to a complete halt

Total strawman. Nobody said or implied that. There's plenty of knowledge and innovation everywhere, but the amount that can come from Facebook only has to be non-zero to support my point. Several hundred developers who have collectively worked on almost every distributed system you've ever heard of might have an idea or two worth discussing. They might even have a perspective on scaling issues that's highly relevant to the problem at hand but not widely known outside of Facebook and maybe three other companies. Why do you try so hard to throw cold water on any such conversations?


[flagged]


You were teetering on incivility earlier in the thread and here you fell straight into it. Please don't! Instead, please read https://news.ycombinator.com/newsguidelines.html and follow the rules regardless of how badly anyone else is behaving.


I think we all understand this scenario can lead to highly charged emotions.

I believe your comment would have been better without the name-calling and the leap to jail time.


I'm surprised by the sheer amount of blind loyalty at Facebook.

I mean, I consider myself to be a loyal employee but I'm not blind to ethical violations. The way these employees are defending a global multi-billion-dollar organization, it's almost as if they were executives. They'd rather sell out the rest of the world for what? To be an FB engineer until they retire? It's like Facebook indoctrinates its employees somehow.

I can't think of a single non-managerial employee at my company that wouldn't speak up if we deliberately started violating agreements with our customers and coming closer and closer to breaking the law, and I'm comfortable with that.


These employees are people who are pulled straight from college, given salaries higher than most of their contemporaries ever dreamed of earning, and are told they are special and are changing the world. Why wouldn't they be blindly loyal to the sociopathic machine they helped create?


The vast majority of these people signed onto what they believed was a sure tradeoff:

They knew FB wasn't the most ethical company around, and that they at times would have to pinch their nose.

In return they would be handsomely financially rewarded.

What we are seeing now is not the response of cult members - it's the response of lots of pragmatic employees/"investors" being protective and worried about their investment. In the worst case they would have irrevocably stained their personal reputations for a relatively small gain.

So to repeat, no - this is not a cult - the employees knew what they were getting themselves into. We should not feel sorry for them.


As a former FB employee, I say "bullshit".

I didn't go to FB thinking "it's unethical, but... money". I went there because I wanted to work on a product that I and every one of my friends/family uses, one that helped me a lot when I was getting divorced, etc. Money was OK, but you can make more money than at FB, e.g. in finance or other special places.

I've never worked at a company that takes data protection as seriously as FB, or has the caliber of people protecting data.

Also impressive was that _every_ week there was a 1-hour Q&A where any intern/employee could ask Zuck a question (open mic). In my time there I never saw Zuck really duck a question. He stands behind what his company does and believes in the mission, as do most employees.

"it's the response of lots of pragmatic employees/"investors" being protective and worried about their investment" > sorry, this is just silly...


>> I've never worked at a company that takes data protection as seriously as FB, or has the caliber of people protecting data.

I don't doubt this, but I think Facebook has a flawed culture that allows and encourages employees to use this mindset as a rationalization for unethical things like:

- run emotional experiments on news feeds

- silently logging all Android users' calls and texts

- allow proliferation of fake news

- allow buying of propaganda ads from state actors

- zero safeguarding of data to ensure it wasn't sold by app creators/devs.

Data is central to Facebook's business model, and the ability to collect, analyze, and sell lots of it (a natural result of the 'big data' hype) became an infatuation for Facebook employees.

The Boz memo supports my point - except he cleverly hides it as 'grow at all costs' rather than the underlying 'collect/analyze/sell data at all costs'.


"run emotional experiments on news feeds" > I believe this happened once, for the advancement of science. Personally, given that FB is the only place where such real social science can happen, I wish they'd do it much-much more. I don't think any social scientist can perform an A/B test outside of Facebook.

"silently logging all Android users' calls and texts" > I also don't like this.

"allow proliferation of fake news" > I think the "allow" part is disingenuous. It's not like FB people are able to guess all the bad vectors in advance and have advance alerts set up. Also, remember, 2B people are on FB, so there will be a lot of shit, because that's what people are like. I actually think they reacted pretty quickly, after the first time there was a credible attack.

"allow buying of propaganda ads from state actors" > Not sure what you mean? US elections are okay to use ads, right? You're saying other countries shouldn't, if you don't like the gov't? This is a lose-lose on FB I think, either people like you bitch that they're enabling a bad gov't, or they're seen as a censor. Believe me that a lot of smart ppl are trying to figure out what the "least wrong" thing to do is on things like this.

"zero safeguarding of data to ensure it wasn't sold by app creators/devs" > bullshit, they stopped this is 2015. But I agree, the way it worked before was really broken and asked for this to happen.


>> "zero safeguarding of data to ensure it wasn't sold by app creators/devs"

> bullshit, they stopped this in 2015. But I agree, the way it worked before was really broken and practically asked for this to happen.

> I've never worked at a company that takes data protection as seriously as FB

Before or after 2015? Because a couple of years does not quite make up for the preceding period starting in 2007 (if I recall correctly) during which FB clearly didn't care.


I worked there in 2016-2017. I was a Data Engineer so I was pretty close to this. It was taken very seriously, to the point it was annoying (tables with PII get anonymized, which then means extra work, etc; a sketch of that pattern is at the end of this comment). Also, the sheer amount of effort that went into this [the tooling/infra that was already there when I arrived] was impressive.

I'm not claiming FB couldn't have / can't do a better job, you can always do a better job, hire even more people for this, etc. But it was definitely taken very seriously, much more seriously than you'd think from all this bad press. And if you go and work there, you'll be impressed, I guarantee that.

However, what I'm talking about is data protection; the problem here was that app permissions were explicitly too loose [until 2015]. As I said, I also think this was a bad policy, and people are rightly upset. But there's way too much generalization happening in this thread.
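
To make the PII-anonymization point concrete: the usual pattern is to replace direct identifiers with keyed hashes (or drop them) before data reaches analyst-facing tables. A toy sketch of the idea, assuming nothing about FB's actual pipeline (the column names and key are invented):

    import hashlib
    import hmac

    PEPPER = b"not-a-real-key"  # hypothetical; in reality from a secrets manager
    PII_COLUMNS = {"email", "phone", "full_name"}  # illustrative column names

    def pseudonymize(value):
        # Keyed hash: joins across tables still work, but the raw
        # identifier never reaches analyst-facing tables.
        return hmac.new(PEPPER, value.encode(), hashlib.sha256).hexdigest()

    def anonymize_row(row):
        return {col: pseudonymize(val) if col in PII_COLUMNS and val else val
                for col, val in row.items()}

    print(anonymize_row({"user_id": 42, "email": "alice@example.com", "country": "DE"}))

A keyed hash keeps joins between tables working while keeping raw identifiers out of the warehouse, which is also roughly why it "means extra work" downstream.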


> In my time there I've never seen Zuck really duck a question. He stands behind what his company does and believes in the mission, as do most employees.

Just this week he has ducked questioning by the UK parliament and opted not to stand behind what his company does. A couple of weeks ago, when this story broke, he opted to just hide for a little bit, issue legal threats against the Guardian and the NYT, and see if the whole thing would blow over.


He was very clearly talking about the weekly internal QA. The context of this discussion is what it's like to work at Facebook. Try not to be dense.


That doesn't matter, because it still contradicts the statement that "He stands behind what his company does and believes in the mission."


My understanding is FB is still going, it's just that Zuck is sending somebody else. FB is in a lot of countries, he can't go and personally talk to every parliament for PR purposes. He said he is happy to go to the US one. I'd do the same, go to the US one and send others to the rest.


I believe the problem last time was that whoever was doing the answering (an FB lawyer?) too often claimed not to have the info being asked for ("I'd have to check and get back to you"). The questioners felt it was an intentional ploy to weasel out of answering uncomfortable questions. Hence the current insistence that Zuck be there himself.


There's no guarantee, in fact I'd suspect it's less likely, that Zuck would be able to answer those questions better than a relevant lawyer or a more relevant lower-level director or VP.


It's kind of culty here, but I don't think that's out of the ordinary for silicon valley. There is a lot of internal debate and discussion about things, as well. It's not a hive mind.


From the original post by "Boz"[0]:

> Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And we still connect people

This guy is a VP at Facebook. Words mean things and his words have weight within the company. This alone disgusts me. He could have easily taken the other side of the argument to stir debate and chose not to.

> Leakers, please resign instead of sabotaging the company

I think the level of hubris espoused by these Facebook employees is a much better reason to delete Facebook than anything I've seen so far. In fact, although the data we have is incomplete, this seems to be the general feeling there. The focus on growth and profit over any thought of doing the right thing is actually evil, especially when one recognizes that evil is being done.

This company is no longer a small company built out of a dorm room. It is a massive publicly traded company that has revenues and active users in the billions. Despite the current climate, words actually have meaning, especially words greatly amplified on these tools, and these actions have real consequences.

[0] https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...


The reason Facebookers hate leakers so much is because it helps no one. The leaker gets to continue working his cushy software job while a bunch of PR people are forced to work overtime to control the situation. Facebook is going to continue to do whatever it wants to do, but they'll just be more secretive about it within the company. Boz could have easily written a memo with corporate-speak if he just wanted everyone to drink the Kool-Aid.


> The reason Facebookers hate leakers so much is because it helps no one.

It helps the general public by giving insight into how crazy the people who build and moderate a platform used by millions to communicate actually are.


How are they crazy? If anything, I think it's refreshing that a VP acknowledges Facebook's problems. If he never acknowledged it, everything would be peachy. I guess ignorance is bliss for most people.


It helps me understand what is really going on behind closed doors. I think you meant "it helps no one who works at Facebook", a demographic that most of us don't care about at all.


It sometimes helps the public, against whom the conspiracy is performed.

Do you think Snowden's leaks were intended to help the NSA?


He does seem like a bit of a bully himself:

http://boz.com/articles/be-kind.html


Not cool. Your comment descends a bit into ad hominem territory.

That blog post was basically Boz publicly acknowledging a personal flaw he hadn't been aware of up until his interaction with Dustin. In other words, calling him a mean co-worker in 2008 would be an accurate characterization, if we rely on his recollection of events when Dustin was still Facebook CTO. Calling him a mean co-worker now would be an unfair characterization.

FWIW, Jeff Bezos told a similar story of how he became aware of his meanness to his grandmother in a commencement address in 2010, though in his own case he was a 10-year-old boy when he came to that realization. [0]

[0] https://www.princeton.edu/news/2010/05/30/2010-baccalaureate...


From the outside, this all seems rather cultish. Are Facebook employees so convinced of the nobility of the company, in spite of all the evidence to the contrary?

It brings to mind the famous Upton Sinclair quote: "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"


> Are Facebook employees so convinced of the nobility of the company, in spite of all the evidence to the contrary

At best, we only have evidence that the subset of employees who chose to comment on that internal thread and whose comment was chosen by the journalist to be included in the article feel this way.

One of the things that media and aggregation have really accelerated is the cognitive fallacy where we assume the most interesting data points are the most frequent when in reality the inverse is almost always true — common is boring. If you were to go by the news, men bite dogs way more often than dogs bite men. But that's only because "man bites dog" is worth reporting and "dog bites man" is not.


Or maybe they're just all pretending to, because they're all saying this in the company facebook discussion, knowing it's trackable back to them, so they're trying to prove loyalty.

It's a disconcerting system.


I thought the point of "culture fit" was to make sure one only hires employees who truly believe in the company and its ~~religion~~mission.


> in spite of all the evidence to the contrary?

Evidence to the contrary? What is your evidence that the value of Facebook is outweighed by its costs?


What material is "the bubble" made out of if Facebook employees immediately jump to "spies" when it comes to leaking something like this? Has it been so long since they've had contact with soul-containing humans that they forgot how they operated?! Zucker-bot must be replicating ye olde FB HQ.


Well, selection bias aside, consider that Facebook (along with the other Big Tech firms) has for years engaged in a hiring process that specifically targets fresh young graduates who are told they absolutely have to be the best of the best to get through the interview process. "Cult" is the word that comes to mind when I think of the hiring practices and cultures at these companies. It shouldn't be surprising when a substantial number of them have this attitude.


I don't think it's fair to say the entire company believes it was spies. Clearly a lot think it was just a few jerks. That said, FB is almost certainly infiltrated by a few governments. It's not exactly a small company. If China wanted to get someone on the inside, it couldn't be that hard.

But IMO, this was just an employee who admires Snowden.


> just a few jerks.

Where by "jerks" you mean "people with deep misgivings and the courage to risk their careers by speaking out against one of the world's premier surveillance organizations?" This isn't the kind of thing you do for lulz, and I doubt they're getting paid.


I think what sidlls said is certainly true, but it's important to remember that media organizations like to focus on things that would be interesting to their readers.

Yes, this cultish behavior is real, but I'm sure there are just as many jaded employees or else this leak would never have happened.


Undiluted self-righteousness. Told they're the best and most worthy by their schools, thence to an employer where they tell each other they're the most virtuous and important. There is a yawning gulf between how they see themselves and how they appear to the rest of the world.


Funny how they think spies would be more interested in leaking internal memos than compromising their orwellian treasure trove of data.


Frankly, it sounds like a cult.


It's interesting to see how Facebook reacts to its internal corporate data being shared, in contrast to how Facebook shares private user data with third parties.

Boz complains that his memo was taken out of context and that he doesn't even agree with it anymore, yet everyone is judging! Gasp! Facebook, on the other hand, totally connects people by creating and selling ad profiles on said connected people. Based on data people shared years ago, out of context, even if they now disagree with what they said years ago! And that's a good thing, right? Because it connects people! Ugh.


I've got an idea. Is there any way to place targeted advertisement inside Facebook's internal communications feed?

Based on Facebook's latest reactions to media coverage and memos shared to public I was able to deduce an ad profile which I'd like to sell. Facebook, it appears, might start looking for external psychological and legal counselling, and I might have third parties interested in targeting that circumstance.


"Hi, it looks like youre trying to type an internal memo and the language youre using suggest depression, angst and anxiety! how can i help" --Facebooky


> Andrew “Boz” Bosworth, a vice president at Facebook

~sigh~

"Boz" is the only person I regret introducing to programming. (I convinced him to attend the 4-H project where I taught BASIC)


Do you still teach? I ask because when I obtained my C.S. degree, the curriculum required that students take a course that focused on ethics, entitled: Computing, Society, & Professionalism [1.]. I think this would be hard to squeeze into a BASIC course, but discussing the implications of use/abuse of technology is valuable.

[1.] https://www.cc.gatech.edu/fac/Amy.Bruckman/teaching/4001/fal...


Not many students see their ethics course as anything more than a writing credit, with some resentment.


I agree. It feels forced. But exposure is important.


Would love to hear why you have such regrets.


There may be a clue in the story we’re commenting on.


Wow, so according to this, leadership was fully aware that collecting cell phone contact data is unethical, but they did it anyway because the end justifies the means. Scary stuff. I always thought it would be awesome to work there, with the free food and games and stuff, but I guess it all comes with a cost. Facebook internally seems like a pretty unhealthy community.


Seems like this is a bit of a dupe, so I'll repeat my initial comment:

There are many, including those inside Facebook, that are actively asking for more federal regulations. I don't do regulation discussions. Brings out too much stupid.

But I will say that this is a survival ploy. If you get regulated, you are now approved by the government. You're good to go, able to operate freely in society. You just have to abide by whatever the regulations are.

Facebook may have a business model that just doesn't work in a society that values privacy, anonymity, and small diverse groups of people with widely-varying mores. This seems to be the lesson Facebook itself is learning now about, well, Facebook. What do we do then? It's not like anybody at Facebook is going to bring that up. They've got a ton of money. What do we do if the existence of Facebook itself is unacceptable?


> If you get regulated, you are now approved by the government.

Not necessarily. I suppose it would be possible to regulate it into nonexistence, by undermining the privacy-invading business model. You could probably get at least part-way there by requiring that profile-based targeted advertising and data collection be explicitly opt-in with informed consent. IIRC, the GDPR has a lot of good stuff about how consent must be obtained. Further regulations could mandate that services and incentives cannot be provided conditionally based on tracking or accepting profile-based targeted advertising, etc.

If the above regulation takes effect, much of the business value of profile data evaporates. Facebook would only be left with eyeballs to shove non-targeted ads in front of, and maybe generalized market research.
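
As a toy illustration of the opt-in mechanism described above (the labels are invented; this is not the GDPR's actual text or anyone's real API): every targeting decision gets gated on a recorded, affirmative consent, with non-targeted ads as the default:

    from dataclasses import dataclass

    @dataclass
    class Consent:
        purpose: str   # e.g. "targeted_ads" (invented label)
        granted: bool

    def has_opted_in(consents, purpose):
        # Consent must be an affirmative act: no record means no consent,
        # and only an explicit grant counts.
        return any(c.purpose == purpose and c.granted for c in consents)

    def pick_ad(consents):
        if has_opted_in(consents, "targeted_ads"):
            return "profile-targeted ad"        # only after explicit opt-in
        return "contextual, non-targeted ad"    # the default for everyone

    print(pick_ad([]))  # -> contextual, non-targeted ad

The key property is the default: absent an explicit grant, the user is never profiled.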


“That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.”

If you justify "questionable contact importing practices" then you aren't putting up a straw man. You are talking honestly about your "questionable" decisions and trying to defend them. Thus you should expect outrage from those who realize you've lied to the public using "subtle language" in order to grow at any costs.

In other words, the self-awareness to call out the "subtle language" and "questionable practices" implies that the company pursued those growth strategies despite CLEARLY knowing they were sketchy. This is extremely damning but those in power will try to deflect the problem as if someone leaking it is the issue.

I never would have worked at Facebook before due to their questionable policies. This incident reinforces that and shows that the problems come from the top. Now we'll watch debate there get muzzled in the interest of staying out of the news and growing at all costs. Perhaps a success for shareholders, but a failure for rank and file employees with morals.


The work we will likely have to do in China some day

Social media disruption disintegrated Libyan society to the point it could no longer function. The Arab Spring was a test firing of a psychological nuke. We should all be concerned that Facebook is planning to attack China, because they will defend themselves.


I didn't interpret that as Facebook planning to attack China. To me, it sounded like the work they'd have to do to cooperate with the Chinese system of censorship and surveillance in order to get their product accepted there. (E.g., build a version of PRISM[1] for the Chinese government, just as they did for the NSA.)

This kind of collaboration with an authoritarian regime is something that Facebook employees who value freedom might find to be distasteful, but if it furthers the company's lofty goal of "connecting everyone in the world", then, hey, it can be justified.

[1] https://en.wikipedia.org/wiki/PRISM_(surveillance_program)


I'm kind of amazed at the attention FB is receiving over the data leak and their general attitude toward privacy. They have been incredibly clear that they don't value privacy at all; Zuck even said so years ago.

- They have a huge spam-bot problem that they willfully ignore;
- use aggressive email and notification techniques to re-engage users;
- stonewall traditional publishers to accommodate their distribution techniques;
- sell and trade personal user information for advertising;
- have most security and privacy settings buried under submenus, almost intentionally built to be ignored;
- have a history of shady apps and games on their systems with questionable value to the user;
- have a culture of growth at all costs, without regard to the substance of that growth.

All of the above has been known for years and has nothing to do with the recent breach or the 2016 election. I just can't understand why this is a surprise or why we should trust a word FB says.


Zuck even said so years ago.

Zuck also got a free pass on his “young people are smarter” comments. I hope the rampant ageism catches up with FB too.


Exactly. Sure, he put out a press release and some PR moves "expressing regret" over his "word choice", but does he really regret it or care? Impossible to tell when he is sincere, impossible to trust him.


Cue Facebook with the fake victim complex.

I respect their level of propaganda skill: how they are trying to shape the narrative into one in which they have been victimized, are fully justified, and are not actually at fault.

What impresses me most is how homogeneously, fanatically unified their culture is.

It’s said a cult is that which can only survive by cutting itself off from reality.

What scares me is that people who are so fanatical about their mission and corp are responsible for so many real human outcomes... The best lack all conviction, while the worst are full of passionate intensity.


Facebook employees are perfectly suited to work in spy organisations such as the NSA. They all seem to agree on a “punish the leaker” attitude instead of self-reform!


Well, that was unfortunate for Facebook. Zuck was trying to mitigate the PR disaster that Cambridge Analytica produced last week, saying that mistakes were made. Now there is another whistleblower who basically destroys the credibility of anything Facebook will say in the upcoming weeks. It appears that Zuck's lieutenants have a distinct understanding of what's good for them versus what they think is good for others. Either the head does not know what the hands are doing, or they are all lying and all of this is common practice in the entire user-data-fencing industry.


The tone of the responses is objectively creepy. Almost cultish. It makes you wonder if FB became known as an amazing place to work because their HR org got so good at selling the FB culture to anyone who joins, keeping them in line.

Now that mirage has a few cracks in it and executives are freaking out.

Reminds me of Westworld.


You can see the same approach at Google. Leaks are characterized as a betrayal of the Google family/social group. The goal seems to be to deter people from leaking even the things they can't be fired for leaking.

If you leak a trade secret, you can be fired. But if you talk about work practices, that's a legally protected right, one these companies nonetheless don't want you to exercise. If they can make people who do feel ostracized, like they're betraying their family, maybe they'll reconsider sharing what goes on behind the curtain.


There are three kinds of leaks. Those that leak upcoming products or company roadmaps, those that leak internal company reorgs, meetings, mailing lists/announcements, and those that leak gripes.

I assume you're ok with people being mad about leaked products they've been working hard on for years having their big debut ruined?

So why would employees be mad about the leaking of internal meetings or discussions? Because in the majority of the cases, these leaks don't hurt company management, they hurt employees.

Several times in the past, Googlers have had their physical or mental safety threatened by such leaks. What you see as a leaked company discussion turns out to be a real-life doxxing of employees. Would Damore have been fired if someone hadn't leaked his memo externally? Probably not; he'd have been quietly transferred by HR and maybe managed out, but able to preserve his career elsewhere in the valley. After his firing, the internal company meeting questions got leaked and the real names of several female and transgender employees got forwarded to alt-right sites, which proceeded to harass them, send death threats, etc.

And one year, an "innocent" leak of Google's yearly company Christmas present, a brown envelope with cash in it, put the physical safety of Googlers at risk, because Google Shuttle bus stops are well known and Googlers are easily identified in public, and that day, everyone knew Googlers would be carrying hundreds of dollars of cash in brown envelopes.

These leaks often harm employees, not management.


Allegedly, Google's confidentiality demands go well beyond not disclosing product launches, as a lawsuit that, to my knowledge, has not been concluded, has claimed: https://www.scribd.com/document/334736972/John-Doe-vs-Google...

Furthermore, you can see in this letter from Brian Katz, the head of stopleaks, more or less everything discussed here, and certainly, it coming from "management": https://regmedia.co.uk/2017/05/22/katz-letter-google.pdf (Actually, everything here is useful/interesting: https://regmedia.co.uk/2017/05/22/discovery.pdf )

Note that he expressly suggests bringing concerns to a manager rather than speaking publicly about them, which likely includes concerns about working conditions or illegal conduct. He also cites the whole "damages our culture" aspect that is extremely relevant to the topic at hand.

Edit: Removed off-topic addressing of removed content from parent post.


Google employees who leak official confidential information: unannounced products, internal company documents, etc can get fired. There's nothing shocking about that. That will get you fired at pretty much any workplace. These are not "whistleblower" leaks.

If you leak say, a private thread from internal mailing lists, you could lose your job, especially if it could harm other employees. If you summarize some issue, you probably are on steadier ground.

Google has a vibrant internal culture of criticism. Memegen has already leaked (https://gizmodo.com/5946769/google-workers-make-internal-mem...), execs and products get raked over the coals at weekly company-wide meetings, and there are actually internal comics that satirize company beliefs.

And yes, management doesn't always listen, probably the single biggest reason unionization might be needed in Silicon Valley, not for worker pay and benefits, which are already good, but to ensure management listens on other issues.

BTW, the TGIF transcript and meme leaks are the kinds that have ended up doxxing employees and triggering harassment. The Gizmodo link I posted properly redacts the usernames, but other leaks have not.


The "doxxing employees" thing only carries so much weight with me, because I feel that your correspondence in a corporate environment which impacts billions of people should have accountability, and the corporate party cannot be entrusted with it.

People should not say things even in a private corporate setting that they would not be comfortable with if it were someday public. Not only can a lawsuit or investigation bring internal correspondence to light, but for instance, some of my correspondence could be FOIA'd! People fail to realize their corporate correspondence is not a private space, and fail to treat it accordingly.

I do have a lot of sympathy for targeted folks like LGBT folks getting harassed, particularly after events like Damore and the related lawsuits and embedded posts/content. But in many cases, they are already publicly identifying their RL name, gender identity, and political positions on Twitter, for example. They weren't so much "doxxed", as they became higher profile due to having been in the news.

This sucks a lot, but we can't hide important information from the public based on the unfortunate reality that soul-sucking nightmare trolls chase every name that gets even the briefest of public attention... imagine if we took that approach/mindset with leaks coming out of the White House.

PS - I really wish more of Memegen was leaked/public. I can't count how many times a news story has led to the thought "darn, I wish I could see what Memegen looks like today".


Other employees don't like leakers because they're selfish and immature. The leaker gets to continue working their cushy software job while forcing PR people to work overtime. "Leaks" like this don't help anyone besides journalists anyway. The most obvious example is the James Damore diversity "manifesto." I can't conceive of how that leak was supposed to accomplish anything but generate controversy.


> while forcing PR people to work overtime

As opposed to doing what else? Emptying bins?

They are employed to spin news for the benefit of the corporation. They have risk assessments and plans of action for the most likely eventualities. Their job is to do exactly what they've been doing in response to this leak.


A janitor is employed to clean. That doesn't mean you should throw your garbage on the floor.


Leaking Damore's memo worked exactly as the leaker almost certainly intended, as Cromwell points out below: It got him fired, which is what many Googlers were demanding internally before it became public.


And do you think that Googlers shouldn't be mad at the leaker?


I definitely know of both Googlers happy it leaked and Googlers upset it leaked. At 70,000-some-odd employees, "Googlers" isn't a homogeneous group.

Google, as a corporate entity, definitely did not gain from the leak, and certainly would've preferred it not leak.


You implied that the attitude towards discouraging leaks is cultish, but it seems rational to me.


The Guardian has a great piece on this, https://www.theguardian.com/technology/2018/mar/16/silicon-v...

Below is an excerpt

———

The public image of Silicon Valley’s tech giants is all colourful bicycles, ping-pong tables, beanbags and free food, but behind the cartoonish facade is a ruthless code of secrecy. They rely on a combination of Kool-Aid, digital and physical surveillance, legal threats and restricted stock units to prevent and detect intellectual property theft and other criminal activity. However, those same tools are also used to catch employees and contractors who talk publicly, even if it’s about their working conditions, misconduct or cultural challenges within the company.


I wonder about that. I work as a consultant, and if you think surveillance at a place like Google or Facebook is heavy, it isn't: from what I've been able to tell at interviews, surveillance there is minimal.

McDonald's and more general retail businesses (Target, Walgreens, ...), call centres ... those are the ones both installing massive surveillance and actually getting it analyzed. Or at least paying me loads of money to make that possible; I've had some conversations, and they do this routinely. And frankly, I'm pretty sure that since this is directed by low-level managers (you can't do it otherwise, not at that scale; all tracking is done by the store manager) there will be tons and tons of abuse of these systems. I mean for trivial reasons: stealing, attempting to make a false accusation of stealing stick, stalking girls (or even men), that sort of thing.

From what I can tell, at FB the entrances and exits, presumably some high-value locations inside, and to a lesser extent the parking lot are the places under surveillance. I'm pretty sure that in most places in the FB buildings nobody can see you on any security monitor. It's one of those open-plan offices: not much privacy, and easy to put everything under surveillance, but they're not doing it. I've been to several call centres where surveillance was utterly pervasive in the same sort of environment, although the environment was much, much better at FB. Big open office plan, but the air was perfect, for lack of a better word. It wasn't cramped at all, it wasn't like those call centres at all; no wall smelled of smoke or oil, nothing like that. And of course you hear stories, like "sneaking" into the office on Sunday for a board game being pretty normal, because it's an easy and central place for everyone to get together.

Of course I realize that this is "fake" like it is fake at any company. Unless you're the CEO AND own a majority of the shares, the company is not a social club, it is not there to have your back. But they have that vibe going there, and they wouldn't have that vibe if they broke it for trivial reasons.

> legal threats and restricted stock units to prevent and detect intellectual property theft

Protecting yourself against insider threats by giving employees means for a good life! How UNAMERICAN!

I'm pretty sure at your local supermarket you'd find those same threats, except they're totally pervasive. Every employee will have been threatened, and I assure you, ...

... not with restricted stock units or any kind of reward.


Do you need to surveil employees with cameras when you can monitor everything they do on their computers and track where their phone is in the building?

http://www.businessinsider.com/facebook-employee-concerned-c...


Judging by most call centres I've consulted for ... yes.


> Protecting yourself against insider threats by giving employees means for a good life ! How UNAMERICAN !

Discussing working conditions publicly is legally protected by worker rights laws.

So yes, protecting yourself against that (by firing people for discussing working conditions, for example) is both un-American AND illegal.


There's also selection bias at work here though - the only responses that'll get media attention are the ones that are novel or 'newsworthy' - we aren't seeing the normal boring comments.


It's unusual for a company that big to have so few leaks in their 14 years of existence, especially one that is that high profile.

My understanding is that FB does not have a "fear-based" culture that would've prevented leaks, so really the only way they could keep people in line at scale is if there was a cultish element to their onboarding process that makes people "love" FB so much that employees are actually so offended by a leaker as to make comments like this.


But it does have a fear-based culture when it comes to leaks.

https://www.cnbc.com/2018/03/16/facebooks-secret-police-catc...


Fairly capable private investigative and security forces are pretty common for tech companies now. I am not surprised Facebook falls into this category.

Google's is run by a guy named Brian Katz, who's been the subject of a lawsuit before: https://arstechnica.com/tech-policy/2016/12/anonymous-google... and the very same has allegedly threatened a bartender who found a prototype Nexus 4: http://www.dailymail.co.uk/news/article-2224589/Google-threa...

Apple security folks have tossed a guy's home looking for a prototype iPhone, escorted by police. All of them, allegedly, had badges: https://gizmodo.com/5836990/lost-iphone-5-investigators-were...

They obviously can intimidate employees into silence, but it's far more useful and beneficial for morale to go for the "loyalty" angle, and make sure employees shame others who leak.


You'd be surprised by the number of "job opportunities" in multinational corporations where Mike Ehrmantraut skills are required (jobs subcontracted to subcontractors).


Identity and cognitive dissonance.

I would imagine people at FB that believe working there is objectively good are having a hard time reconciling that belief with the negative externalities that are playing out in society at large.


Everyone puts them (and other Big N) on such a pedestal that when they finally land the job they feel like the company is a reflection of them as a person.

If you see working at one of these companies as a status symbol, you'll do whatever it takes to protect that image.


Agreed, but I would point out that if I were an FB employee and I disagreed with the memo, I certainly wouldn't be saying "yeah, we suck" on the highly-traceable internal discussion board! I'd say that stuff face-to-face with people I trusted at the water cooler or something.

There's a strong selection bias at work here in these posts and I think it's a serious statistical error to assume this represents the median view of Facebook employees.


Yea, they sell culture and give away free food. As you can see[0], most devs at FB are eating, traveling, eating, going to cafes or eating some more. ;-)

[0] https://www.youtube.com/watch?v=hWFDujYzvbI


That is some real interesting insight derived from a gossip column about a few facebook posts.


Nice to see that instead of trying to question why the memo was leaked, some are blaming the leaker.


The same goes for Snowden.


This memo is being totally overblown.

It’s like a telephone company admitting their services will be used to call in bomb threats, coordinate terrorist attacks, and conduct verbal abuse, but insisting that they should stay steadfast in their mission to provide communication to people for the greater good that comes of it.

It’s a little tacky, but it’s near the bottom of the long list of wrongs Facebook has committed, hardly worth this many words being shed over it.


I think the part of the memo that is so surprising to me is the company-wide acknowledgement of the bad practices we all talk about; I'm amazed that an executive would send out such a frank description of the misleading "dark patterns" that Facebook uses, even if people at the company know it to be true. I can't imagine that lawyers are going to be happy that there's a record of an executive saying this:

> All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day.


> It’s like a telephone company admitting their services will be used to call in bomb threats

No, it's like a telephone company acknowledging that they use underhanded practices to get more customers, and claiming the ends justify the means.

> It’s a little tacky, but past the last thing on a long list of wrongs Facebook has committed for so many words to be shed over.

How do you think a company acquires such a long list of wrongs in the first place? This is a message from the top, broadcast to all employees, rationalizing (practically encouraging) bad behavior at an institutional level.


I think roughly the same about the provocative terrorists/someone might die part of the quoted memo, but the same argument completely fails to make sense for the other part, where he lists all those questionable things they do for growth and calls them ok (because they are less bad than terrorism? Seriously?)

A phone company analogy would be pretending that cancellation letters never arrived to get another 24 months out of a leaving customer, or getting some call-center agent to tease out a statement from the customer that could be mistaken for a revocation of the cancellation (they tried to pull that on me once and did not even speak to me; apparently some random flatmate on the phone was considered good enough to tick that checkbox).


This is cut and paste from my other comment below, but my interpretation is kinder (maybe too kind?):

>That’s why all the work we do in growth

I'm not reading that as "anything we do for growth is justified". It's "everything we do in growth", as in all the tactics we currently use in our departments focused on growth, are justified.

This is like "everything we do in sales is kosher". It's admitting you do some abrasive things, but it's stuff that's justified.

Does it sound gross? Yes. Is it true for most companies? Yes. These are uncomfortable truths: companies make sites imperfect to optimize for ad revenue, do extensive A/B testing to see how they can influence behavior, and go as far as sending you 20-email drip campaigns or bombarding you with notifications to wear you down into using their products. These are all growth tactics that get used by not-evil companies.

To me, the dumbest part of this is that someone would write this all down, and be proud of it. So proud they'd make it a memo and send it company-wide.

This is the silent shame of the tech world. We're forced to modify our most harmless ideas to make a successful product in the real world where even having the best product ever won't guarantee success without tons of marketing using ad campaigns driven by ill-gotten analytics data.

That's what "Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here." is saying.


> I'm not reading that as "anything we do for growth is justified".

Maybe my interpretation was a little eager to see scandal.

> To me, the dumbest part of this is that someone would write this all down, and be proud of it.

...and that is pretty much exactly what I wanted to add after reading the first part. It's a bit like the difference between someone who occasionally cheats on his partner but is ashamed of it and someone who cheats on his partner and then brags about it, calling everybody a loser who doesn't.

Dirty little secrets are much more bearable if they stay dirty little secrets. Once they go from secret to pride, they will quickly lose the "little" qualifier and you end up like Uber (who seem to have an infinite supply of dirty secrets to get discovered, but the general attitude was never hidden).


I think it's all part of a narrative build-up to start censoring people under the justification of stopping things like terrorism and bullying.


[flagged]


Please don't do this here.


It's very different to admit you are doing unethical things vs. providing tools that can be put to unethical ends.


Yeah, I agree. People these days just seem to want to pile on people's comments (sometimes said in a very casual setting) by twisting them and taking the most uncharitable reading. As has happened with your comment too.


What's the charitable reading of "all the work we do in growth is justified. All the questionable contact importing practices." ? That's not admitting that others are misusing FB, it's admitting that FB itself is doing questionable things under the umbrella of "the end justifies the means".


>We connect people.

>That can be good if they make it positive. Maybe someone finds love. Maybe it even saves the life of someone on the brink of suicide.

>So we connect more people

>That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.

>That's not admitting that others are misusing FB,

You're only quoting one part. He addresses that in the quote above. It's the opposite of what you're saying.

>it's admitting that FB itself is doing questionable things under the umbrella of "the end justifies the means".

What is there to "admit"? Its a piece of code running on a smartphone. Any moderately skilled engineer can figure out what its doing. Calling something questionable doesn't make it illegal or immoral. "the end justifies the means" implies that the entity has purposefully employed illegal or immoral actions. That is clearly not the case here. Although it is clear that the author is torn between seeing those actions as legitimate business actions. Whether you think Facebook using your contacts is a breach of trust is up to that individual person and their views on privacy. A person choosing to create an account on Facebook and installing their app is a voluntary action. Privacy is defined by choice. A choice that is in your hands. It is also not an absolute, because at times the choice can be taken away. For e.g. You don't need to go up-to every single person and ask their permission before taking a picture at a tourist spot. But you can't take a peek inside someones bedroom.


> You're only quoting one part. He addresses that in the quote above. It's the opposite of what you're saying.

I don't agree. He's saying two things. One is that others will abuse FB; that's reasonable. The other is that FB itself is using questionable practices.

> What is there to "admit"? It's a piece of code running on a smartphone. Any moderately skilled engineer can figure out what it's doing.

So there's nothing to admit as long as a moderately skilled expert can eventually figure it out? That's an interesting perspective. Since a moderately skilled portfolio manager could figure out that Madoff was running a scam[1], was there also nothing for the latter to admit? If there was, what's the difference?

Regarding it being up to someone's view on privacy, that's true - and we're discussing his views on privacy, and he says it's questionable.

As for your whole argument that an admittedly questionable action is clearly not immoral, I'm at a loss. Clearly we're using different dictionaries, because all of mine associate the word with dubious morality.

[1] https://en.wikipedia.org/wiki/Harry_Markopolos


I'd agree if he hadn't said that he doesn't agree with what he wrote, or that he said it to be 'provocative'. I get that he's probably under a ton of pressure right now to walk back what he said, but there's clearly a guilty conscience there.


Well, did he post an equally strong, unambiguous retraction and ensure it reached all the people the original post did?

He could have taken strong steps to undo this memo, if he really regretted it and didn't agree with what it says.

To me it looks like the "guilty conscience" of someone simply trying to evade consequences.


Would you have believed it? I think people would just dismiss it as PR.


> This memo is being totally overblown

The memo may be. The reaction is not. This is not a culture that is willing to be critical of itself. These problems will recur until we break the company up.


I keep hearing that analogy but it doesn't really hold water. Telephone companies don't coerce you into becoming an addict, manipulate their UIs so that you subconsciously associate their platforms with the words "friend" and "like", shove controversy and irrelevant comments from strangers into your feed so that you get annoyed and feel like you're missing out when you clearly stated that you want a chronological ordering of what your close friends and family posted, and then innocently call it "engagement" or "connecting people". So no, it's not like a telephone company.


You’ve twisted my words.

I’m not saying they’re like a phone company, I’m saying this would be the equivalent of a phone company saying what I said.

There’s a very clear difference.

I’m also not saying Facebook is a social good here, but those wrongs, valid or not, are tangential to the memo.


The analogy is still broken because the Facebook memo, although cold and calculating, is still blatantly sugarcoating what they do and why they do it. It's not about connecting people at heart, and they do it through much more nefarious means than simply potentially letting their tools fall into the hands of bad actors.


FB is like a phone company where everyone can call everyone else, all the time, in some type of huge broadcast style phonecall, where people who miss the calls can listen into the recording later. As a matter of fact, that's where the analogy between phones and the internet breaks down too.


>I feel you’ve intentionally twisted my words.

I don't. Make comparisons, own them.


>>I feel you’ve intentionally twisted my words.

I feel you don't understand what your own words mean. You made a comparison, and someone pointed out that it was grossly inaccurate.


If someone compares a tall person to a giraffe, they're not also claiming they're orange.


The point of an analogy seems to be lost on you. There is no actual analogous company to Facebook because no company does everything like Facebook.


> Telephone companies don't coerce you into becoming an addict

If you were around at the turn of the 20th century, you might think very differently. Gossip lines, heck, when lines were shared you could eavesdrop on your neighbors' conversations. The telephone company must have had some idea of what it was enabling in society at the time.


There is remorseful guilt: they know that deep-stalking everyone on the planet with a phone is not good.


Hardly.

You're getting sidetracked by the terrorism stuff. Of course, any communication network can be used to communicate bad things. But that's not the main point of the memo. I don't know why you'd stop reading and thinking there and draw conclusions.

The scary part is the strong and clear call to all of Facebook that the ends absolutely justify the means. That growing the network is good, and that it's the only good.

Privacy? Security? Health? Truth? Law? Integrity and ethics?

The message is clear that none of these are important if you can grow the network.


> I don't know why you'd stop reading and thinking there

You've broken several of the site guidelines with this. Please read https://news.ycombinator.com/newsguidelines.html and abide by them when commenting here.


I read the memo, it’s a little rude to tell someone they didn’t read something they did, no?

And I guess I'll agree to disagree. All I read was a tacky attempt at acknowledging that bad people are allowed to use good things. I think you're painting your own narrative onto a cliche startup growth memo, ex post facto, in a world where they've finished growing.


It really does seem like you didn't read past the first sentence. The main point of the memo isn't that people will use the platform for terrorism. It's that anything Facebook does to add users is good and justified. Quotes from the memo:

> The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.

> That's why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends.

> Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here.

And in this case, "connecting people" is a euphemism for adding more Facebook users, i.e., getting more people using a highly addictive social medium product that leads many people to feel more isolated and less connected.


You're really doubling down on the "you don't agree with me so you must not have read it" angle? Maybe this is why I'll never feel comfortable in tech. The craving for groupthink people seem to have around it is incredible.

None of what I read there is saying "anything we do is good and justified".

>That’s why all the work we do in growth

I'm not reading that as "anything we do for growth is justified". It's "everything we do in growth", as in all the tactics we currently use in our departments focused on growth, are justified.

This is like "everything we do in sales is kosher". It's admitting you do some abrasive things, but it's stuff that's justified.

Does it sound gross? Yes. Is it true for most companies? Yes. These are uncomfortable truths: companies make sites imperfect to optimize for ad revenue, do extensive A/B testing to see how they can influence behavior, and go as far as sending you 20-email drip campaigns or bombarding you with notifications to wear you down into using their products. These are all growth tactics used by not-evil companies.

To me, the dumbest part of this is that someone would write this all down, and be proud of it. So proud they'd make it a memo and send it company-wide.

This is the silent shame of the tech world. We're forced to modify even our most harmless ideas to build a successful product, because in the real world even the best product ever won't guarantee success without tons of marketing and ad campaigns driven by ill-gotten analytics data.

That's what "Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here." is saying.


> You're really doubling down on the "You don't agree with me so you must not have read it" angle?

No, I'm doubling down on the "you missed the relevant and interesting parts of the memo" angle.


Your analysis which leads to your conclusion that the reaction is overblown is based entirely on a side-point in the memo. A flashy, provocative side-point, but still just a side-point. Meanwhile you don't acknowledge or address the actually scary parts of the memo, which I think is what is getting people worked up.

If you don't want to be called out for not addressing the main points of concern before dismissing the issue, why not address them rather than complaining about being called out?

Anyway, I seriously doubt "startups will be startups" will be the narrative very many people will accept here.


> questionable contact importing practices

If your phone company had suggested it used morally wrong and potentially illegal tactics to get your private contact information, an equal uproar would ensue.

> The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.

This is literal insanity. Of course this is a big deal. This man writes that there is almost literally nothing, legal or not, the company isn't willing to do to grow and earn more money. Accepting murder, terrorism, death, just to grow and earn more? Fuck that man, he's evil and the memo is written to sound as evil as possible. It's sick and it's wrong and it's a lot more than a little tacky.

Facebook has no desire to help you or your friends connect. Facebook has no desire to keep you safe or keep you a customer. They want your friends as customers and they don't care if you die as a result. It almost sounds in the memo like this person wanted someone to die; the fixation on "even some DEATH won't stop us now!" is super weird. Uh, okay, sounds like a nice man.

The man says "the truth... we believe in..." and then later says he never believed it, effectively declaring himself a liar and a cheat. He is obviously not well and cannot be trusted to do any real work, especially the kind that has a dramatic effect on the lives of 2B people.


Let me ask you: if someone found a way to use one of your software projects to harm - or even kill - others, would you shut it down?


“It’s like a telephone company admitting their services will be used to call in bomb threats, coordinate terrorist attacks, conduct verbal abuse,”

I do agree it would be like a telephone company doing that, but a telephone company would never openly admit that to their entire company. Also, I doubt they would describe it as necessary for growth.


That’s why I feel it’s a very tacky, out of touch memo, but it’s also showing a pretty cliche startup mentality.


Suggesting some of your users will kill people or commit violent acts of terrorism, and that this is okay and normal in the name of pursuing money and growth, is not a "pretty cliche startup mentality". No startup I've ever worked at has, thankfully, ever suggested such evil things and then said they're normal. That's not what startups are.


It’s not a suggestion, it’s an admission.

There are so many indirect ways to enable harm that you’d probably be hard pressed to find a company that connects people that isn’t somehow causing harm.

You’ve never worked at a startup with a product that required a moderation team?

If your product requires a moderation team, you’re under the umbrella of enabling negative content.

To give an example, people working on video games with online play are enabling a toxic environment where verbal abuse often occurs.

Are they trying to be evil? No.

But in pursuit of money and growth they add chat to bring in a social aspect, and that enables it. More people than not get a positive experience.

The tacky startup aspect is the fact you’d write that down in a memo. Everyone knows it, it’s not some brilliant revelation to wave up and down and act all high and mighty about.


"There are bad people in the world, some of them will use our service, some of them will use our service as a tool or medium to do bad things."

The behavior from bad people isn't normal, but above a certain size it would be abnormal if you didn't find any bad people.

I wouldn't think there's anything objectionable about this statement other than it's vacuously true.


This. The memo's tone really is more like "growth justifies all evil".


> “How fucking terrible that some irresponsible jerk decided he or she had some god complex that jeopardizes our inner culture and something that makes Facebook great?“

This sounds like the perfect storm in the perfect bozo culture, like the one perfectly described in Dan Lyons's "Disrupted". If you haven't read that book, I can't recommend it enough for understanding the insanity of tech startups.


Seeing the comments of some of these Facebook employees and the calls for "integrity" checks reminds me of the flavor of the internal discussions at Booz Allen Hamilton after Edward Snowden leaked and was promptly fired from the company.

If you’ve worked somewhere, and given them sweat and blood for years, whether it’s Facebook or the NSA, it’s impossible to not feel defensive and betrayed when someone challenges the very ground you’ve been building your career on. Regardless of whether the challenger is right in the end.

Having said that, at least the government is honest about its secrecy and opaqueness. It sounds like Facebook has a bit of a cult going on.


Facebook employees outraged their privacy was violated


> Several employees suggested Facebook attempt to screen employees for a high degree of “integrity” during the hiring process.

Is this a euphemism for "fidelity", or do they really, genuinely mean "integrity"? In my mind "integrity" implies doing the right thing even when it's not to the immediate benefit of those close to you, but that could very well involve whistleblowing or leaking information for the benefit of the greater good, which doesn't seem to be in line with what they want. Seems to me they really just want people who'd stay steadfastly loyal to the company?


> “All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.”

In China? Really?

> Another theory floated by multiple employees is that Facebook has been targeted by spies or state-level actors hoping to embarrass the company.

Still want to work in China?


> Bosworth distanced himself from the memo, saying in a Twitter post that he hadn’t agreed with those words even when he wrote them.

What's most ironic about Bosworth's words and his response (other than, well, that it's impacting company PR now) is that FB folks expected their internal memos to stay private while knowingly allowing exfiltration of personal data for millions of their users.

The hubris here is ridiculous. If you support public-everything don't sound shocked when your internal stuff is also exposed.


>“That conversation is now gone,” Bosworth continued. “And I won’t be the one to bring it back for fear it will be misunderstood by a broader population that doesn’t have full context on who we are and how we work.”

So you're saying your users aren't supposed to understand who you are and how you work?

Well, that's why I keep a Facebook profile: I want a public persona to have its data collected.


I love the internal attitude here. Everyone is (at least portrayed as) assuming that the leakers were being "evil". One thing I've been taught is to assume everyone's intentions are the best they could be (aside from the reality of what they are), and I always try to put this into practice. Here the attitude seems to be "everyone is out to get me".


I admit I do not know the whole story; however, Bosworth's memo sounds very cultish:

> That conversation is now gone. And I won’t be the one to bring it back for fear it will be misunderstood by a broader population that doesn’t have full context on who we are and how we work.

> This is the very real cost of leaks. We had a sensitive topic that we could engage on openly and explore even bad ideas, even if just to eliminate them. If we have to live in fear that even our bad ideas will be exposed then we won’t explore them or understand them as such, we won’t clearly label them as such, we run a much greater risk of stumbling on them later. Conversations go underground or don’t happen at all. And not only are we worse off for it, so are the people who use our products.

This sets Facebook up as the chosen ones, or the saviors of the people. And of course the commoners don't understand and cannot be privy to these esoteric methods.


I'm always shocked by how easily people fall into cults and groupthink. I guess our biology is just tuned for it.


> "Leakers, please resign instead of sabotaging the company,"

So are the employees upset about the "growth at all costs" mentality, or upset that it was laid bare?


Upset that their options are worth less than before.


There are many, including those inside Facebook, that are actively asking for more federal regulations.

I don't do regulation discussions. Brings out too much stupid.

But I will say that this is a survival ploy. If you get regulated, you are now approved by the government. You're good to go, able to operate freely in society. You just have to abide by whatever the regulations are.

Facebook may have a business model that just doesn't work in a society that values privacy, anonymity, and small diverse groups of people with widely-varying mores. This seems to be the lesson Facebook itself is learning now about, well, Facebook. What do we do then? It's not like anybody at Facebook is going to bring that up. They've got a ton of money. What do we do if the existence of Facebook itself is unacceptable?


Funny, the only people who acted with any integrity here are the leakers. It seems like FB does have a hiring problem when it comes to integrity, only not the one they think they have.

FB sharing user data behind their backs -> good, FB internal data shared with the world -> bad. The cognitive dissonance is strong enough to fracture skulls.

The way the guy tries to whitewash his previous article by stating it was a strawman of his own making and that he didn't feel good about it when he wrote it... sucks to be him, I guess; he probably never thought he would have to defend those words. Nice to see someone at FB openly admitting to FB being unethical.


Nice to know that conspiracy theories run wild INSIDE Facebook as well as among users.

Maybe that’s why they’re so bad about controlling obviously bad information.

It couldn’t possibly be because people disagreed with that ethos. No. It’s the Mongolians.


Wait, I don't have integrity if I stand up against something, anonymously or not, I believe is wrong? The employees calling for the hiring of people with "integrity" have been drinking the Kool-aid.

Bosworth can say what he wants but he shouldn't feel broken-hearted because what he said leaked, he should feel broken-hearted for what he said.

Leaking doesn't have to silence conversation unless Facebook wants it to.


“We recognize that connecting people isn’t enough by itself. We also need to work to bring people closer together,”

These two sentences in conjunction imply that Facebook has a strategic goal to specifically manipulate, rather than facilitate, human relationships...

What sort of narcissistic pride is necessary for one to believe that one has the right or imperative to manipulate societal interactions as a whole for profit?


Looking at the comments quoted by the Verge, it seems like internal Facebook commenters are no better than Fb commenters below news stories:

> Another theory floated by multiple employees is that Facebook has been targeted by spies or state-level actors hoping to embarrass the company. “Keep in mind that leakers could be intentionally placed bad actors, not just employees making a one-off bad decision,” one wrote. “Thinking adversarially, if I wanted info from Facebook, the easiest path would be to get people hired into low-level employee or contract roles.” Another wrote: “Imagine that some percentage of leakers are spies for governments. A call to morals or problems of performance would be irrelevant in this case, because dissolution is the intent of those actors. If that’s our threat — and maybe it is, given the current political situation? — then is it even possible to build a system that defaults to open, but that is able to resist these bad actors (or do we need to redesign the system?)”

Also, the blind loyalty is really disturbing.


Provocative ideas are critical to discovering truths that consensus won't find.

The reason Facebook (like many successful tech companies) is successful is that they relentlessly experiment and iterate, killing off ideas that don't gain traction while embracing those that do. Oftentimes the ideas that do get traction are unexpected, revealed only because of a culture that allows any idea to be thrown into the dialogue.

If you can't say something provocative without fear of it later coming back to haunt you, folks will be less likely to raise uncomfortable or non-consensus ideas. This will lead to less innovation. It's also okay if an idea is wrong, and if it is, folks should be willing to push back. That's how a high-functioning, ideas-driven organization thrives.


It seems like a poor lie in this case though, doesn't it? His statement doesn't say "let's talk about this", but he is specifically saying that the ends justify the means.

And provocative ideas are important, but I would be surprised if employees didn't agree that "Facebook causing bullying or terrorism" is a bad thing.


Is this "don't worry, we're mature enough to police ourselves"? This became a matter of public importance when Facebook decided to have such lax privacy policies as to allow the CA leak to occur.

I think what's good for the public is important to consider, hopefully more than what's just good for Facebook and their shareholders.


Theranos innovated themselves into oblivion. We must learn that there's a definite line between innovating and blatantly generating noise for the sake of new noise (also supported by investors).


This seems incorrect because as far as I know Theranos didn't innovate. Their machines didn't work, did they?


How do you tell Theranos from not-Theranos before you'd invested a bunch of money in them?

As far as we know there were quite a few large players in the industry who were uninterested in Theranos, which means they rejected the ideas before the ideas came to life.


Societal utility is a weighted average. Any worthwhile process will have positive and negative outcomes. Those that endure will provide greater upside than downside.

I believe that a process where any idea is given consideration is preferable to one in which ideas are suppressed. We need all the "bad" ideas to discover the "good" ones.


I'm perplexed by the downvotes. So I can refine my understanding, I'd love to know where I'm thinking incorrectly. Here is what I think are the possible counters to what I proposed:

1) Societal utility is NOT a weighted average - namely, an acceptable outcome will only have positive externalities.

2) Processes in which ideas are suppressed are in general preferable to those in which ideas can be unconstrained

3) We can discover the full set of "good" ideas without being exposed to "bad" ideas

Where else is the disagreement coming from? What are your insights that lead you to reject what I've proposed?


A process where any idea is always given consideration is naturally highly vulnerable to Sybil attacks.

For example, there's a reason why once upon a time we agreed (as a society) to never accept any new patents for perpetual motion machines: the only ideas that were left are the bad ones, and we can prove it.

Now if there comes a time in history when physics suddenly discovers something new, we may revisit that decision. But I don't think many educated people believe it's going to happen anytime soon.


That's the point though - we only know that perpetual motion machines are not yet possible because nearly every testable idea we've come up with yields nothing. Our system is better for allowing, and dismissing, bad ideas than for never allowing them at all.


I'm not sure where you're going with this. Facebook allowed these terrible ideas to proliferate, which is why we have lost our privacy and our rights to "subtle language" and "questionable policies." Now people are acting like the choice is between allowing these ideas or not allowing them.

In reality, it's much more nuanced. It turns out a lot of people think the ideas are horrible, offensive, and immoral. Thus Facebook can continue to embrace these ideas and lose employees and public respect, or it can drop them and hopefully recover some reputation. It isn't about "disallowing" these ideas. It's about not building a business around screwing people over if you don't want to be known for screwing people over. Something can be allowed without memos from top-ranking managers dictating that this is the way the business runs (and will run in the future, in China).


Innovated? More like blatantly lied and committed fraud, I'd say


So you'd agree with the fact that sometimes calling someone on their bullshit before giving them money is a viable strategy?

I had to put "innovated" in quotation marks - I meant it in a symbolic way: "innovation is what Theranos was selling to its investors, and then they either realised they could not deliver or they had blatantly lied from the beginning".


Agreed. In addition, tech companies are successful because they enter into mostly unregulated markets (this is a truism of new markets in general, govts can't predict where new markets will pop up so they aren't regulated when they do). Regulation and experimentation are kind of inherently at odds. Unfortunately what we are learning about FB is that regulation probably should have happened when the CEO started referring to his company as a utility.


> The reason Facebook ... is successful

I think it's much too early (and much too close-minded) to call Facebook successful. Yes, they have made a lot of money already, but their story is just beginning and it's a bit eager to call Facebook a success.

Facebook is Public Enemy #1 right now, and for good reason. Success is defined by more than Money - and even at that, Facebook's not shown success in its stock price recently anyhow.

Facebook is not a success. Facebook has been used to systematically dismantle democracy and to fuel fear and hatred and lies into the American population at large scale.

Facebook does emotional experiments on you without your direct knowledge or any outside influence. Facebook has been hell-bent on testing to see if it can make hundreds of thousands of people simultaneously depressed. They can do that and they are proud of it.

All the money in the world doesn't turn Facebook into a success. It is a moral pit of hell and will never be seen as "successful" by most people, or hopefully anyone.


"Facebook does emotional experiments on you without your direct knowledge or any outside influence. Facebook has been hell-bent on testing to see if it can make hundreds of thousands of people simultaneously depressed. They can do that and they are proud of it."

Evidence please?

There was one test conducted with 600-700k users (out of 2 billion plus). This is negligible and honestly not statistically significant enough.

There are plenty of reasonable concerns around political advertising, bad actors, and transparency into who is funding / buying ads.

However, errant ranting as above is not exactly helpful to this debate.

Also calling $FB not successful is laughable? On what grounds can you actually say this?

There is almost no evidence that the ad product actually influenced America (it was more likely TV). If you need to be educated about advertising I'm happy to help you here, but as someone who does this for a living I can categorically tell you this: the ads did not sway the election based on the pseudo-science bullshit from Wylie, Nix and anyone else at CA.


Your post is one of the more bizarre things I have read on HN.

On the one hand, you ask for evidence. On the other hand, you admit it happened?

Then you suggest that it is not important because it is "not statistically significant enough", as if the numbers you mentioned aren't staggering? That's a bit like saying it wouldn't be significant if Rhode Island fell into the Atlantic because it only affects a tiny percent of people on Earth. Is statistical significance the only significance there is...?
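
To make the distinction concrete, here's a quick sanity check with made-up numbers (not the study's actual data): at the study's sample size, even a trivially small effect comes out "statistically significant", so significance alone says nothing about whether the effect matters.

```python
# Made-up numbers, not the study's data: at ~350k users per group,
# a tiny assumed effect (2% of a standard deviation) is still
# overwhelmingly "significant" in the p-value sense.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 350_000
control = rng.normal(0.00, 1.0, n)
treated = rng.normal(0.02, 1.0, n)   # hypothetical, negligible effect

t, p = stats.ttest_ind(treated, control)
print(f"t = {t:.1f}, p = {p:.2g}")   # p is far below 0.05 regardless
```

Statistical significance and practical importance are independent questions; with n that large you essentially always get the former.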


The study, for anyone interested:

http://www.pnas.org/content/111/24/8788.full

A comment on the study itself, if you're here just for discussion around Facebook please move on:

I think the results of this study are not necessarily surprising. Perhaps the most intriguing bit is this:

> This observation, and the fact that people were more emotionally positive in response to positive emotion updates from their friends, stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively, for example, via social comparison (6, 13).

Which I think should be looked at more closely.

Yet, I mean, hasn't this been assumed/understood to be the case about media we consume for a while (the fact that it affects our emotional state)? A quick search seems to support the claim that it is widely understood that TV has the same effect [1], and I think it's been known (or at least assumed) for a while that the emotional content of other broadcast media can have a deep effect on our emotional state. During the War, radio and posters were used to keep populations calm (i.e., "Keep Calm and Carry On") or angry. Today Fox News, CNN (most media companies, actually) try to keep you scared and mad.

I'd say in that sense the results are not very surprising, though it's interesting to think of this in the context of the internet, because internet communication is two-way and the actors are small. Rather than being a single big broadcaster reaching all nodes, social networks are more federated and clique-y. This could imply a self-reinforcing network effect. If the network's collective feelings affect each individual's, it's not hard to imagine an effective reduction in positivity further worsening the mood of the collective in a vicious cycle, or the opposite. The same "rich get richer" effect should apply for positive emotions too.
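
As a toy illustration of that feedback loop (entirely my own assumptions, not the paper's model), imagine each user's mood drifting a little toward their friends' average mood each step:

```python
# Toy mood-contagion sketch; the graph, susceptibility, and update rule
# are all assumptions for illustration, not the study's methodology.
import numpy as np

rng = np.random.default_rng(1)
n, steps, alpha = 1_000, 50, 0.2                 # alpha: assumed susceptibility
mood = rng.normal(0.0, 1.0, n)                   # initial moods
adj = (rng.random((n, n)) < 0.01).astype(float)  # random friendship graph
np.fill_diagonal(adj, 0.0)
degree = np.maximum(adj.sum(axis=1), 1.0)

print(f"mood spread before mixing: {mood.std():.2f}")
for _ in range(steps):
    mood = (1 - alpha) * mood + alpha * (adj @ mood) / degree
print(f"mood spread after mixing:  {mood.std():.2f}")
```

In this toy version moods homogenize toward the network's average, which is exactly the "collective mood drags the individual" dynamic; whether real networks amplify or dampen it is the empirical question.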

Idk, I personally think that's kinda crazy, and would like to learn more about it, where the critical points lie, and how these effects might spread across a broader network. Would the positive and negative ones tend to cancel each other out?

If you look at Figure 1, I have to wonder if the difference seen in the chart in quadrant one (Positive words / Positivity reduced) being larger than the delta in the other charts means anything for this natural evolution and network effect. Is this just an artifact of the way things are measured, or does it go deeper into our psyche?

Idk, just thoughts.

[1]: https://scholar.google.com/scholar?hl=en&as_sdt=0,5&as_vis=1...


Are you really justifying making hundreds of thousands of people depressed, just because they didn't make millions of people depressed? The lengths people will go to here to defend Facebook are literally insane.

Making even one single person depressed on purpose is 100% pure evil. Hundreds of thousands of people depressed is a massive crime against humanity. There are sources all over the internet for this. Also, I do not think Facebook had 2B+ users in 2014 or so when the "study" to hurt people was done - but that doesn't matter anyway.

> This is negligible and honestly not statistically significant enough.

300,000 depressed people doesn't matter to you? It's not statistically significant? What do you even mean by this? This is literally the most offensive thing I have ever read in my life.

> Also calling $FB not successful is laughable? On what grounds can you actually say this?

Yes I am. On the grounds that it is based on revenue from stolen money, ill-gotten gains through criminal behaviour and morally wrong actions. Not only is it dirty money driving that price up, but it's so early in FB's days that it doesn't make sense to call the stock successful yet.


For the nth time, would you please stop posting overwrought rants to HN? Few people would disagree with you that this is an important topic and I imagine most of us mostly agree with your views. But that's obviously not as important as preserving this place for substantive discussion, and what you're doing breaks it badly.

We need you to stop this habit of going beyond the pale if you want to keep commenting here. We like you and don't want to ban you but if you keep ignoring our warnings and requests about breaking the site guidelines, what choice are we going to have? So please fix this.

In case it helps, here is how I have tried to fix this in my own case: (1) notice when my system is agitated; (2) don't post until I am less agitated; (3) go over my comment after it's up and edit out any leftovers.

https://news.ycombinator.com/newsguidelines.html


My definition of success in this context is attracting around 25% of the entire human population to its platform.


Your definition of success in this context is purely about the number of people involved? That doesn't make sense and is extremely scary. Is Donald Trump also a success, since he got nearly 100% of the world talking about him and paying attention to him?

There is a lot more to success than attention and time and adoring fans spending their life using your services and talking about you. If that is your definition of success, then humanity will "success" its way to death faster than ever with such successful people like Mark Zuckerberg and Donald Trump uniting us all.

These are not successful people. They are plagues on society and the attention they get from billions of people is harmful, not an indicator or even the defining metric of success.

Edit:

In response to the comment below (since I am no longer allowed to post on HN, maybe too many downvotes?)

> Donald Trump, Facebook, and Mark Zuckerberg are highly successful.

Really? It doesn't seem to me like any of them have met their goals yet or are even remotely close. Maybe we should ask them what their goals are? Donald sounds 100% miserable every day. Was it his goal to be so upset and miserable all the time?

Facebook and Mark sound equally sad and messed up, like they're dealing with major, major issues that would make any person feel like they're doing really hard things and are nowhere near success.


Success is simply achieving a goal.

Donald Trump, Facebook, and Mark Zuckerberg are highly successful. But success does not mean good or moral, often it's exactly the opposite traits that lead to success.


(x-posted from the other submission marked dupe)

It's fun imagining this attitude applied to the Space Race.

>So we go to space.

>That can be bad if they say accidents happen. Maybe three of our guys die on the launchpad. Maybe a spacecraft is lost on recovery.

>And we still go to space.

>The ugly truth is that we believe in going to space so deeply that anything that allows us to go to space more is de facto good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned.

>That's why all the work we do in space-going is justified. All the pure oxygen environments. All the hatch designs that can only be opened from the outside. All the ad hoc procedures and cheap wiring. The work we will likely have to do on the Moon some day. All of it.


Please don't post duplicate comments! It lowers the signal-noise ratio of the site and makes it harder to merge threads.


This attitude was applied to the space race. What's funny is that it's being applied to getting people to click on memes.


The bad part of the space race is ICBMs, not losing a few astronauts.


>The bad part of the space race is ICBMs

My understanding of history is that the space race was the good part of ICBMs.


It's fun imagining this attitude applied to the Space Race.

>So we go to space.

>That can be bad if they say accidents happen. Maybe three of our guys die on the launchpad. Maybe a spacecraft is lost on recovery.

>And we still go to space.

>The ugly truth is that we believe in going to space so deeply that anything that allows us to go to space more is de facto good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned.

>That's why all the work we do in space-going is justified. All the pure oxygen environments. All the hatch designs that can only be opened from the outside. All the ad hoc procedures and cheap wiring. The work we will likely have to do on the Moon some day. All of it.


For what it's worth, the space race was something most people accepted as worth doing, the astronauts knew every operational detail of their spacecraft and volunteered, and a small number of people were affected.

By contrast, not everyone accepts the FB mission as bluntly stated, only a fraction of technical people understand what FB is doing and how it works well enough to be capable of informed consent, and 2B people are affected.

The Moon isn't populated; China is, and its people might have an opinion about being colonized by FB industrialists, either for or against.


Facebook is a for-profit venture so that analogy doesn't really work for me.

A closer analogy would be an investment firm circulating a memo saying that their questionable AML practices are actually no big deal because they really believe in facilitating seamless transfers of money, or VW talking about how their quest for the ultimate driving experience means staff should just get over the ethics of gaming emissions tests on diesel cars. If people die then so be it, we've got numbers to worry about.


While a neat thought, I would argue that the space race had positive value, while Facebook has more negative value than positive. I think the memo is flawed because they're pretending that what they're doing is for the greater good, instead of what they're really doing it for: profit.


Your last line lost me. You're projecting your own negative connotations in a way that totally recharacterizes reality, both for the space race and for Facebook's actions.


Consider this, Facebook employees: if your mission truly were to connect everyone in the world, you could do that without exploiting them.

You could become a premier Mastodon node in a federated social network and succeed against your peers with a better user experience, features, mobile apps, etc.

Or, you could continue on your path to centralized hegemony, which will ultimately end in failure as the world migrates to a federated social media model.

Quit now while you're ahead. You've made a lot of money already, go make some more working for a company that doesn't exploit people.

If only one talented engineer takes this advice to heart, the world will be a better place.


The irony and hypocrisy in this are stunning.


It is a bit disconcerting to hear FB employees say that the failure of integrity was to leak an internal discussion, not complacency around dubious business practices.


The memo itself reads like the manifesto of a cult, claiming that growth is simply GOOD, even worth the sacrifice of others' lives. That makes me a little uncomfortable.

It also feels weak, as if the author were running into an existential crisis: the 'questionable' practices he mentions begin to rock his 'faith', so he has to declare the opposite, loudly and confidently, to dispel his inner guilt.


There isn't a single source for the material besides the original 2016 memo?

It's funny that the article should mention 'Another theory floated by multiple employees is that Facebook has been targeted by spies or state-level actors hoping to embarrass the company.' because until proven otherwise, the whole article is just one Comcast subsidiary's hearsay about another company.


Facebook employees are upset that internal information they thought was private has been leaked. The lack of self-awareness is amazing. No sympathy.


There's a great section in the book Chaos Monkeys about the 'Sec', basically an internal KGB that monitors all employees' actions and access.


I just love how (supposedly) quoted FB employees use the word "integrity" when they actually mean "spineless". Hypocrites.


I did laugh at how much FB employees value the 'privacy' of their communications. Which are essentially conversations about destroying the privacy of everyone else.

And to everyone who says "they need this space to discuss internal matters privately", I actually totally agree with that. If the internal discussion is leaked, it can totally be affected and controlled by outside parties, leading to unhealthy results for the country... I mean company.

Edit: The joke is that Buzzfeed is fucking with the internal policies of Facebook, just like the Russians fucked with our election.


The leaker is spineless. The post was made many months ago. They leaked it now, during one of the biggest shit storms in FB's history. They did it for some benefit, not because they give a shit. If they did, they would have leaked it when it was posted.


Basically anything that has ever happened in the world can be explained as having been done for "some benefit" or "for hype", etc. Personally, I do not consider "for PR" an important argument unless there is factual evidence for it; otherwise we could dismiss literally anything with such an explanation. And, no, obviously the leaker is not spineless by any possible definition.


Let me know when everyone is ready to get down to brass tacks... and discuss the CIA, NSA, InQTel, and tech companies. There is something much more widespread and insidious in tech than just Facebook. Facebook is one of many created surveillance orgs. These issues are systemic, and I mean so much more than just the fucked up SV advertising model.


Full memo is here:

https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...

As other commenters have said, it seems to me that this is being totally overblown due to all the recent CA drama.


There comes a point in size with companies like that (open to their own employees, with candid-ish company-wide meetings every week or whatever) past which the chances of leaking get a lot higher. E.g. you see a lot more leaks coming out of Google now that it's huge.

It's unclear how to reconcile open culture and leaks.


> Another theory floated by multiple employees is that Facebook has been targeted by spies or state-level actors hoping to embarrass the company. “Keep in mind that leakers could be intentionally placed bad actors, not just employees making a one-off bad decision,” one wrote.

Who could they be? Democrats or Chinese?


Pretty much anything that has a significant impact is going to have a dark side.

e.g. Electricity has improved living standards drastically, but occasionally it electrocutes someone.

That doesn't strike me as a reason to avoid electricity. I suspect that is the thought the author was trying to convey...poorly.



This is like /dev/null for naive employees, right? I'm sure Zuck views most of his employees in the same light as the people who trust him, and rightly so. Their opinions are only relevant to making them feel better about themselves.


Andrew Bosworth lists himself as a "Co-Inventer of News Feed, Messenger, Groups" on his Twitter bio without any sense of irony or self-awareness.

His software patents are about as innovative as a peanut-butter jelly sandwich.


The scariest part is the un-adult emotional environment, where employees (people hired at-will to work) refer to other employees as "brothers and sisters".

This looks more like a _Lord of the Flies_ setup than a for-profit company workforce that sells 8 hours of its time in return for a paycheck, with some reasonable socializing at work.

This seems to be the trend, and it is by no means limited to Facebook. Providing socializing benefits (food, playgrounds, etc.) while requiring emotional commitment at work is far more sinister than it appears.

While keeping employees emotionally arrested at the 12-year-old stage may extract extra sweat, the overall downside for society is abysmal.


If they could have hired for integrity this situation would have been avoided, just not in the way that the Facebook employees calling for "integrity" imagine.


If anything, now FB won't even be able to say that "it was all Zuck's fault, Zuck's the bad one". No, they are rotten to the core.


> All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.

What is he talking about here? Work in China?


I'm assuming he is referring to actually building up Facebook in China. I don't think they have any major presence there, but it's a massive market so I'm assuming it's on their target list.


Hmm, didn't know "integrity" meant making sure not to share the company's morally compromising behavior with the public.


I like that some of these employees think they work for a company that is not evil.


Many of these comments are plain retaliation, and the leadership has to deal with it; the working environment has become toxic already.


Zuck::Gates :::: Boz::Ballmer ?


“This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty.”

The sheer lack of self-awareness in that statement.


It's breathtaking, isn't it? Facebook sells personal information to advertisers. And this employee worries there might be 'smart people who lack a moral compass' working there? It would be funny if it weren't also terrifying that Facebook employees believe the company is a force for good in the world. Working for Facebook is about as moral as working for Big Tobacco: there's plenty of evidence the product is actively harmful.


Where can I purchase some Facebook personal information? I'd love to use it for some custom audiences on my ads. /s

As far as I'm aware, Facebook has never sold personal information to advertisers[0], because that would be giving away its crown jewels. It might leak it, but doesn't hand it out for sale.

EDIT: Downvoters, please point out where I'm wrong. It looks bad for the community to suppress factual statements.

0. https://www.facebook.com/help/152637448140583


In every debate about "[Advertising company] sells personal information" I've seen, this is always an issue. I suggest we instead say what it is: They monetize personal information.


Facebook sells ad targeting based on gender, sexual orientation, marital status, age, location, salary, education, musical taste, purchasing behaviour, device usage, interests, hobbies, and the list goes on. Without 2B users' worth of personal information, there wouldn't be anything to sell. For Facebook to say 'they don't sell your information' is Orwellian corporate doublespeak. Advertisers pay them money and in exchange, Facebook allows them to target sets of users based on those users' personal information. How is this not 'selling information about you'?


Are you confused by the distinction between selling personal information and selling ads? The latter does not involve any exchange of personal information to the purchaser.
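
For concreteness, here's a toy sketch of that distinction (hypothetical data structures, not Facebook's actual ads API): the advertiser submits targeting criteria, the matching happens on the platform's side, and only aggregates come back out.

```python
# Hypothetical sketch, not Facebook's real API: the advertiser never
# receives user records; it only specifies criteria and gets aggregates.
users = [
    {"id": 1, "age": 29, "interests": {"cycling", "jazz"}},
    {"id": 2, "age": 41, "interests": {"golf"}},
]
campaign = {"age_range": (25, 34), "interests": {"cycling"}}

def serve_ads(campaign, users):
    lo, hi = campaign["age_range"]
    matched = [u for u in users
               if lo <= u["age"] <= hi
               and campaign["interests"] & u["interests"]]
    return {"impressions": len(matched)}   # aggregate only

print(serve_ads(campaign, users))          # {'impressions': 1}
```

The personal data powers the sale, but under this model it never changes hands, which is the narrow sense in which "we don't sell your data" is true.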


It seems you are both correct, and the disagreement lies in the meaning of 'personal':

- They use 'personal' as 'private', or 'information about you'; e.g. I have a fetish, I wish no one to know, it's 'personal'.

- You seem to use 'personal' as 'personally identifiable information'; e.g. if I have this data, I can trace it back to its originator.

The ambiguity can be found in many places around. Most people don't make the distinction. Rather, they think anything private is personal. At the same time, the definition you use can be found in official documents, like GDPR, the new EU Privacy Law (https://gdpr-info.eu/art-4-gdpr/).


Thanks for clarifying this: personal information does not mean personally identifiable information. Personal information can be sold without uniquely identifying individuals and this is exactly what Facebook's core business model is.


So to reword your question:

Facebook allows clients to purchase personal information. What specific personal information is purchased by Facebook advertisers?


It hasn't sold the information, but until 2014 Facebook was effectively giving it away to attract developers to its platform.


I think the CA scandal shows they sell access so you can gather it yourself, which is a fine distinction that doesn't matter in practice.


CA used data from Facebook's Graph API. They don't sell access to their API, it's freely accessible. It's disadvantageous for Facebook to not keep your personal info in their walled garden of advertising, where they can use it as a competitive advantage for better targeted ads.
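
For what it's worth, here is roughly what "freely accessible" looked like in the v1.0 era. This is a sketch from memory, and the exact endpoint, fields, and permission names may be inexact:

```python
# Sketch of the pre-2015 Graph API v1.0 pattern, from memory; details
# may be inexact. One consenting user's token let an app enumerate
# that user's friends, who never installed the app themselves.
import requests

TOKEN = "user_access_token_from_app_login"   # placeholder token

resp = requests.get(
    "https://graph.facebook.com/v1.0/me/friends",
    params={"access_token": TOKEN, "fields": "id,name,likes"},
)
friends = resp.json().get("data", [])
print(len(friends))
```

That friends-of-the-user reach is how a few hundred thousand quiz takers turned into tens of millions of harvested profiles.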


You're right: Facebook has not sold personal information to advertisers, and the claim that it does is factually false. What I think we should say instead is that Facebook sold advertisers on the notion that 'it knows about you and your preferences'.


I think he means indirectly sell, which is kinda true. By letting the CA scandal happen, they effectively sold a lot of data. Not what they really wanted to do, but their reckless policies and vague morality standards caused it.

Also, nothing changes the fact that Facebook is clearly harmful to society in important ways, according to various studies. And they do not care about this fundamental issue; they just want more users and more profit.


> By letting the CA scandal happen, they seriously sold a lot of data.

The CA scandal did not involve any sale between CA and Facebook. CA sold insights gleaned from data pulled from Facebook's public and free-of-charge Graph API.

> Also, nothing changes the fact that Facebook is clearly harmful to society in important ways, according to various studies.

I'm not sure how that's relevant here. Arguing that Facebook is a bad company does not make misinformation more correct.


It's relevant because we're discussing a Facebook employee who wants to screen hires for integrity. Given the widely known scientific evidence that FB causes anxiety and lowers self-esteem, a direct consequence of its core business model, integrity is already in question when you accept the interview.


I feel like these are weasel words, and I could use the same argument about anything. I could say Clinton let the email scandal happen. Or that Clinton indirectly leaked classified emails, which is kinda true. Or someone can say Clinton is clearly harmful to society in important ways, according to various sources. Clinton just wants more votes and power. I think we should stick with the facts to make a strong argument against Facebook.


Yeah why would FB sell that info when it's the key to their targeted advertising?


Come on, are you serious? 2B people use it every month to keep in touch with their friends. I'm typing this in Dubai, and I use FB/Messenger daily to keep in touch with people in Budapest, London, Amsterdam, Berlin, etc.

Disclaimer: I worked at FB.


The road to hell is paved with good intentions, innit?


The kinds of employee responses I see make me think of an intelligence agency: talk of loyalty and fear of foreign spy infiltration. And quite frankly, Facebook is one, except none of the employees are "Secret" or "Top Secret" cleared AFAICT.

I'm speculating here, but it may even be possible that they have a greater global reach than the NSA, as 2.2 billion people are in their system, using their apps on all their devices from watches to desktops. They record all the personal data, track every website being visited, possibly hot-mic phones, and record people's locations. Remember those Russian soldiers posting photos from inside Ukraine? Yeah, Facebook has that data, and intelligence agencies of the Cold War era would have drooled over it.


This is why self regulation doesn't work. People posture endlessly and when given half a chance with any sort of power and real stakes, ethics gives way and we are left with apologism, hand waving, denial and rationalization.

This is easy to fix. Follow the money and change the incentives. And the fix will allow software folks to continue to posture endlessly, as they will no longer have the power to make the decision.

1. Ban micro targeting by advertisers, only contextual text and location can be used.

2. Platforms like Facebook, Google and their chain of contractors, partners etc cannot offer micro targeting.

3. Ban data aggregation with heavy penalties and imprisonment for companies, marketing individuals and their agencies who try this. This will solve these problems in a jiffy.

That removes the incentive for stalking people 24/7, collecting tons of data and making correlation and inferences.

The thing is, so many people, especially governments and political players, crave this power that it can't be undone anymore; that's why building surveillance infrastructure is a bad idea. It will always be abused, which is why the individual ethics of engineers and of society at large become critical, but we have already failed that test.


The leaker simply "connected" Facebook to the outside world.


Here the employees thought they were just sharing among "friends" and someone went and used their information for other purposes.


Yes! Well said.


Emily Waite: "Mark Zuckerberg has never really understood privacy. From Facebook's earliest days, he figured people [incl. Boz?] would eventually grow comfortable sharing everything with everyone-indeed, his business depended on it."

Source:

https://www.wired.com/story/facebook-a-history-of-mark-zucke...

"Whether Boz believed his own words-then or now-matters less than the memo's seeming confirmation of a growing hunch among users: that tech companies like Facebook care less about the people who use their products than they do about growth as a means to fulfill their rosy-lensed ambitions.

...

Instead, it lays bare the naivete Facebook has maintained about its product."

[Poll: What do you think? Do the employees' comments that have been "shared with everyone" suggest naivete?]

Source:

https://www.vanityfair.com/news/2018/03/why-the-leaked-faceb...


I don't like Facebook either, but I actually very much like the part where they want to let people talk, knowing full well that what gets said will range from Nobel Peace Prize worthy, to undesired, to wanting to burn the whole universe down.

Can you imagine Bic worrying like this about what someone might write using their pens?

Do you imagine the restaurant worker who served breakfast to the 9/11 hijackers having second thoughts about serving food?

The answer to Facebook's woes is to return to Free Speech, with compliance with law enforcement if laws are broken. No proactive banning of speech out of fear that it might be illegal. Let the legal process enforce laws; Facebook should comply with lawful orders to take down content, but not before.

Facebook should also embrace bubbles. Let people choose the bubbles they live in and let advertisers choose the bubbles they advertise in. If an advertiser doesn't want to advertise in a "self-defense" bubble, then any content tagged/identified/categorized as "self-defense" doesn't get ads from that advertiser. Make people explicitly aware that there are bubbles out there that they can choose to be part of, choose to be excluded from, or choose to peek inside for just a bit. Make the default no-login/under-age view safe, i.e. require at least 1k logged-in views where at least 20 people did not raise the "inappropriate content" flag.
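
A minimal sketch of that default-view gate, under one reading of the thresholds (the original phrasing is ambiguous, so the exact predicate is my guess at the intent):

```python
# Sketch of the proposed default-safe gate; the thresholds come from
# the proposal above, but the exact predicate is my guess at its intent.
def safe_for_default_view(logged_in_views: int, flags: int) -> bool:
    """Content becomes visible to logged-out / under-age viewers only
    after enough logged-in exposure with few "inappropriate" flags."""
    return logged_in_views >= 1_000 and flags < 20
```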

Free Speech, embrace bubbles, comply with law enforcement and that should make Facebook relevant again.


"Facebook has tapped one of its most veteran execs to lead all of its consumer hardware efforts, including the mysterious Building 8 division responsible for its forthcoming video chat device.

Andrew "Boz" Bosworth will oversee Building 8 and Oculus, Facebook's virtual reality arm, Business Insider has learned. ...

The device, codenamed Aloha, will feature a large touchscreen along with a camera and speakers and be capable of recognizing people's faces when they step into view, three people with knowledge of the device said.

...

One hurdle Building 8 has faced in its efforts to build its first device is consumer mistrust of Facebook protecting user privacy, according to multiple people familiar with the matter. The company conducted marketing studies for project Aloha and received overwhelming concern that Facebook would use the device to spy on users, according to one person with knowledge of the matter.

To assuage concerns about privacy, Facebook has considered creative ways to market Aloha, including pitching it as a device for letting the elderly easily communicate with their families. Building 8 employees have also considered creating new brand names beside Facebook to sell their gadgets under."

Source:

https://www.businessinsider.in/Facebook-is-unifying-its-hard...

"Regina Dugan, the head of Facebook's secretive hardware lab called Building 8, is leaving the company after just 18 months.

It's unclear who will take over Dugan's role leading Building 8 on a day-to-day basis. Facebook recently promoted Andrew "Boz" Bosworth to run all of the company's hardware projects, but that also includes Oculus hardware, not just Building 8.

"There is a tidal shift going on in Silicon Valley, and those of us in this industry have greater responsibilities than ever before," Dugan said in a statement provided by a company spokesperson. "The timing feels right to step away and be purposeful about what's next, thoughtful about new ways to contribute in times of disruption."

At Facebook, Dugan oversaw a number of hardware efforts, none of which have actually launched, including a video chat device and a smart [microphone and] speaker, according to Business Insider."

Source:

https://www.recode.net/2017/10/17/16488654/regina-dugan-face...

[What did she mean by "greater responsibilities"? What's the "tidal shift"?]

"The memo is classic Boz because it speaks to the majority of Facebook employee views but it's also polarizing. Tonally he doesn't mince words. This is clearly a post meant to rally the troops."

"There is no record of Zuckerberg's response to the memo. However, a year later in August 2017, Bosworth was tapped to run the company's consumer hardware efforts."

"The natural state of the world is not connected," Bosworth wrote. "It is not unified. It is fragmented by borders, languages, and increasingly by different products. The best products don't win. The ones everyone use win."

Source:

https://www.buzzfeed.com/ryanmac/growth-at-any-cost-top-face...

That last statement suggests that what everyone is using is not the best product.

It suggests the senior management at Facebook are well aware that their product is not the best.

As a user, I want the best products. [FN1]

Is there any reason I shouldn't?

Why should I settle?

There are many ways to communicate over the internet (cf. www), peer-to-peer being the original and one of the most versatile, IMO.

Using a single billions-page public website to communicate over the internet is one possible way; it has proven very popular, but if I understand the Bosworth statement correctly, it is not necessarily the best way.

Being "fragmented" (e.g. decentralized) may be a desirable characteristic for users communication if one of the goals is to avoid being an easy target for advertisers looking for massive pools of consumers all gathered in one place, which appears to be the definition of the Facebook "product".

FN1. Using the word "product" to describe what Facebook offers to the user is probably misleading. FB is only a "product" for advertisers. Users pay nothing and receive nothing. They merely use a single website belonging to a Harvard computer science dropout in order to "stay in touch". The website is designed to exploit that use for the purpose of selling advertising.


1. Bosworth's reaction: "for fear it will be misunderstood by a broader population that doesn't have full context on who we are and how we work"

2. Wrote another: “How fucking terrible that some irresponsible jerk decided he or she had some god complex that jeopardizes our inner culture and something that makes Facebook great?”

3. Back to Bosworth: "The post was of no particular consequence in and of itself, it was the comments that were impressive." (italics mine)

4. And lastly: "If we have to live in fear that even our bad ideas will be exposed then we won't explore them or understand them as such, we won't clearly label them as such, we run a much greater risk of stumbling on them later. Conversations go underground or don't happen at all. And not only are we worse off for it, so are the people who use our products." (italics mine)

In other words: 1. we know better than you; 2. the ends do justify the means (contrary to MZ's opinion); 3. don't mind the contents, just notice how smart the employees are; 4. let us tell you what is best for you.


The comments quoted are pretty shill-ish... but I found myself weirdly sympathetic to Bosworth's concern about leaks. If corporations are people, then their internal communications are like thoughts floating around before a decision is made. If you're properly distributing decision-making in a company, individual sections are going to come up with ideas that are embarrassing in the big picture! Would you want your stray thoughts picked apart in the press?

BUUUUUUT (a) Bosworth is at the top of his org, so calling his own post a "not quite straw man" is a ridiculous cop-out. Like it doesn't have 1000x the weight of the comments below. And (b) the reason people are being forced to interrogate Facebook's "character" through internal docs is because a lack of candor has made that character impossible to judge based on public statements and actions. You did it to yourself, man!


> If corporations are people ...

But they're not, of course. Sure, I don't want my internal thoughts floating around, but I'm not a) several thousand people, and b) the personification of those several thousand people attempting to work out the best way of extracting other people's subconscious and internal motivations and desires, and selling them to marketers... so my perspective is skewed.

The rest of the excuses in TFA seem very 'some people say...' or 'it's been said that...' style posturing, which may be an interesting academic exercise, but as you say it feels a bit of a cop-out to use that as a post-rationalisation.


It just shows some people inside think that outsiders are too stupid to understand/appreciate what the insiders are talking about, so it had better stay hidden. Some of the employees' comments also betray this belief.

Why couldn't a discussion about the future (features? :)) of Facebook be open, if it's such a big and far-reaching platform? What's to hide?

The media would have a hard time feeding on it, compared to the current situation where it is all interesting and whatever because some "secret" "leaked" and is therefore "newsworthy".


Separately, it's pretty hilarious at this point to hear hand wringing about how a leak of private data is keeping FB from having an honest and constructive internal discourse. That's exactly how I feel about what leaking 50M profiles to a political devil bent on destroying constructive discourse has done to my favorite democracy.


An excellent point.


Facebook's level of engineering, technology and data seems close to the point of implementing a transformative virtual-reality social experience that would be indistinguishable from reality.

Imagine if Facebook invented a way to simulate reality; then the user (the one superuser) would write the past in the future, then go into the simulation, hide himself, and forget where he is.

Only the powerful, intergalactic intensity of music and lyrics along with the obvious absurdist quality of news in reality finally leads the user to realize he is the superuser.

He created this world but She created him. His bots (abstractions) run this world and with Her, they can turn the simulation into a perfect Utopia.

Just a thought I had after seeing Ready Player One yesterday. This idea that the future creates the past, and then the past creates the future is another indication that Free Will is an illusion.

The best aspect of all of this: if it were all some Facebook-Total Recall experience, then all the pain, confusion and embarrassment would evaporate.


I'm opening a can of worms by saying what I'm about to say, but here goes nothing: why are people criticizing individual companies for "growing at all cost"? Every company was once a start-up fighting to survive; every large company was once medium-sized and had to fight the Googles and Amazons of its day. It is not a Nash equilibrium to NOT try growing at all cost: if they don't, they'll lose. So at what point should a company stop trying to grow at all cost? Is it when they've won all their battles, like Google? So why are we looking into the past to criticize Facebook, Uber, and potentially many more startups that are "growing at all cost"? Of course, the definition of "all cost" shouldn't be taken too literally.
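To spell out the equilibrium claim, here is a toy two-player game with entirely made-up payoffs (my own illustration, not anything from the thread or the memo). If "grow at all cost" is the better reply no matter what the rival does, then mutual restraint can't be an equilibrium:

    # Toy payoff matrix (numbers entirely made up) for two rival platforms
    # choosing between "restrain" and "grow" (grow at all cost).
    # Format: (my payoff, rival payoff).
    payoffs = {
        ("restrain", "restrain"): (3, 3),
        ("restrain", "grow"):     (0, 4),
        ("grow",     "restrain"): (4, 0),
        ("grow",     "grow"):     (1, 1),
    }

    def best_response(rival_move: str) -> str:
        # The move that maximizes my payoff given the rival's move.
        return max(("restrain", "grow"),
                   key=lambda my_move: payoffs[(my_move, rival_move)][0])

    # "grow" beats "restrain" against either rival move, so (grow, grow)
    # is the unique Nash equilibrium -- even though mutual restraint
    # would pay both players more.
    assert best_response("restrain") == "grow"
    assert best_response("grow") == "grow"

With these payoffs the game is a standard prisoner's dilemma, which is exactly the shape of the "if we don't, someone else will" argument.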


Your argument, as I understand it, is "hey, if I/companies don't do what's needed to survive, someone else will do it anyway, and I/companies lose". Just because some other group is willing to do unethical things (i.e. "at all costs") doesn't make it OK for your company, or you, to do those things.

People generally cheat and do unethical things to "win" or "not lose". The reward doesn't make it any more ok.

Also, I can't believe I'm explaining this.

The memo was pretty literal: "someone might die", "things we have to do in China". Sounds like they truly meant "all cost".


>So at what point should a company stop trying to grow at all cost?

This is an important philosophical question and underscores why company codes of ethics/conduct, statements of values, etc., aren't just feel-good rhetorical BS, but actually matter and are an important part of culture. Each company has to decide where to draw the line. Deciding that there is one is an important first step, and it needs to be supported and reinforced throughout the organization.


It's not about growth. It's about money. And to get that money, some companies/executives are OK with harming anyone who gets in the way.


Those pesky Russians...that Putin sure has a lot of time on his hands /s


This would be the easy way out of admitting they did something wrong.


> ‘How fucking terrible that some irresponsible jerk decided he or she had some god complex that jeopardizes our inner culture and something that makes Facebook great?’

I wonder if some Facebook employees understand how folks in the IC feel about Snowden or Manning now …



