Eight months later, in August 2020, Lindsey attended a virtual meeting to discuss the company’s goal of hiring more Black engineers. In the meeting, a White manager played a Drake song in the background whose chorus repeats the phrase “Where the [n-word]s be at?” five times, according to videos of the incident reviewed by The Washington Post.
Lindsey asked in the chat system why they were playing the song, then said he was “really disappointed,” according to the video. Nine other employees who were present in the meeting echoed his frustrations by putting emoji expressing shock alongside his comment.
“It shows you the insensitivity and the lack of awareness,” Lindsey said. A manager subsequently apologized, according to the video.
The country was in the midst of a historic reckoning over racial justice, and Facebook had just set an ambitious hiring goal of 30 percent more people of color in leadership by 2025.
But Lindsey and other current and former Black employees involved in hiring — as well as potential recruits who filed a claim with the Equal Employment Opportunity Commission last summer — describe a problematic system that makes it difficult to achieve that and other diversity goals. They say the company has adopted metrics that prompt recruiters to go through the motions without actually delivering talent. Even the diverse candidates who are brought in can be rejected over vague terms such as “cultural fit.” They also say that the problem goes deeper than hiring and that many employees of color feel alienated by the social network’s culture.
Facebook spokesman Andy Stone said the company is focused on advancing racial justice in the workplace and in recruiting. “We’ve added diversity and inclusion goals to senior leaders’ performance reviews. We take seriously allegations of discrimination and have robust policies and processes in place for employees to report concerns, including concerns about microaggressions and policy violations,” he said. He did not address the incident involving the Drake song.
Lindsey quit the company in November, just 11 months after he started, and has since founded a start-up.
Facebook is facing a federal investigation by the EEOC that launched last summer into allegations of bias in hiring, promotion and pay, according to the complaint. That case has since been expanded into a systemic probe by the EEOC, a special designation which means that the federal agency is examining whether company practices may be contributing to widespread discrimination and is assessing the potential to bring a broader lawsuit representing an entire class of workers, according to the lawyers representing the complainants.
The EEOC declined to comment. The agency can only speak publicly about charges if they result in a lawsuit against the employer.
In the EEOC complaint, three Black job applicants say they met all the advertised job qualifications but were rejected after going through the interview process. They say they were told by Facebook interviewers that the company was looking for people who would fit in culturally. One candidate, whose lawyer requested The Washington Post withhold her name because parts of the complaint are not public, was told by a Facebook hiring manager, “There’s no doubt you can do the job, but we’re really looking for a culture fit,” but was not given any further explanation.
Culture fit is an ill-defined term for whether a candidate is a good match for a company’s internal culture.
Oscar Veneszee Jr., a Black Facebook operations manager who is identified in the complaint and still works at the company, said in an interview that he had submitted more than half a dozen qualified applicants who were underrepresented minorities for jobs at Facebook. All were rejected, and he suspected it was because they failed the cultural fit test.
“When I was interviewing at Facebook, the thing I was told constantly was that I needed to be a culture fit, and when I tried to recruit people, I knew I needed to find people who were a culture fit,” he said. “But unfortunately not many people I knew could pass that challenge because the culture here does not reflect the culture of Black people.”
Facebook’s Stone said the company’s recruiters do not assess cultural fit, but the company looks for whether skills and behaviors in the interview process, such as responses to questions about what a person might do in a particular scenario, align with Facebook’s values.
“There is no culture fit check mark on an application form, but at Facebook it is like this invisible cloud that hangs over candidates of color,” said Lindsey. He added that at least a dozen qualified candidates of color whom he referred for interviews were also rejected by Facebook, with culture fit cited as part of those decisions. “It really boils down to who do I feel comfortable hanging out with.”
Racial issues at Facebook have been particularly acute over the last year because of the decision by CEO Mark Zuckerberg to give wide latitude to racially divisive comments by President Trump during last summer’s protests, and because of the company’s role in providing a platform for extremist groups that espouse white supremacist ideas. The decision to leave up Trump’s comment was of particular concern to workers of color, some of whom met personally with senior leaders to protest the decision while others have left the company. Facebook software engineer Ashok Chandwaney quit publicly in the fall, citing unease with the social media giant’s role in fueling hate.
Zuckerberg’s decision “created such lack of psychological safety on all kinds of levels, and Black employees in particular didn’t know how to truly process that,” said a former Black executive who cited the decision as one of her reasons for resigning.
Facebook is one of several Silicon Valley companies, including Google and Microsoft, to announce ambitious diversity targets in the wake of the death of George Floyd, an unarmed Black man killed while in police custody. But years of annual diversity reports from tech companies show only incremental progress on increasing the ratio of Black and Latino employees, along with high attrition rates among Black women, a pattern supported by recent accounts of racial bias and inequities in pay and promotion from Black women at Google, Pinterest and Amazon.
Google’s leadership is more than 95 percent White or Asian and 73 percent male, and Facebook’s is more than 87 percent White or Asian and 66 percent male, according to the companies’ 2020 diversity reports.
The independent civil rights auditors Facebook hired to scrutinize its record last summer found attrition was of concern to employees of color and to civil rights advocates, and noted a “disconnect” between the experiences described by employees of color and the company’s myriad diversity and inclusion initiatives. In the report, which was made public by Facebook, auditors also called the company’s permissive stance on politicians’ speech a “tremendous setback” for its civil rights progress, saying such decisions were made by members of senior leadership who lacked civil rights expertise.
The auditors noted that “civil rights leaders have characterized the current numbers for Hispanic and African American staff as abysmal across every category.”
Facebook has pledged that 50 percent of its workforce will be made up of underrepresented people by 2024 — defined as underrepresented minorities and women — but its progress so far has been modest. Currently just over 85 percent of its workforce is White or Asian, and more than 90 percent of those in highly compensated technical roles are White or Asian, according to its annual diversity report. That is down from 91 percent of the overall workforce and 94 percent of the technical workforce in 2014, when the company first published its annual diversity report.
Women make up 37 percent of all roles, up from 31 percent in 2014. And they now fill 24 percent of technical roles, up from 15 percent in 2014.
Many tech workers and civil rights advocates say progress on diversity often is thwarted by invisible biases and outright discrimination that permeate Silicon Valley culture, leading to fewer hires, unwelcoming environments and high attrition among workers of color.
Ifeoma Ozoma, a former public policy official at Facebook, said that when she was asked to interview job candidates at the social media giant, the process was superficial and numbers-driven. In one instance, she alleged, a Facebook recruiter told her there were already sufficient numbers of women in the running for a particular role, implying there was no need to find additional applicants from underrepresented groups.
Ozoma added that none of the recruiting measures matter if retention isn’t a focus, “because even the best managers are not always able to protect their hires from toxic work environments.”
When Lindsey started sourcing candidates for Facebook, he said he and other recruiters used a custom-built software dashboard called FBR, or Facebook Recruiting platform, where recruiters recorded their outreach to candidates and created profiles of them. Their manager had weekly meetings with them to gauge their progress.
If recruiters didn’t hit targets of making contact or starting the recruiting process with a specific number of people of each race and gender each week, they were told specifically that executives, namely Zuckerberg, were unhappy with them, Lindsey said. Managers would then enforce 30- to 90-day “lockdowns,” a long-standing companywide practice where employees are told to drop all other responsibilities to make progress on a single metric, such as a requirement to increase the number of Latino candidates a recruiter pinged on LinkedIn. Sometimes the recruiters were able to hit the targets, but it’s not clear how well it worked, he said.
Managers primarily instructed recruiters to infer the race and gender of candidates by scouring the Internet, particularly Instagram, Twitter and Facebook, he said.
They then formalized those guesses and entered them into a system that any person who interviewed the candidate could see, according to Lindsey and to a screenshot of the system viewed by The Post. The screenshot showed nine affirmative action, or AA, categories, each with a bubble for a check mark, as part of a candidate’s profile for a job at Facebook. Seven of the categories were race-based, including Black and Hispanic, and the other two were for women and veterans. He said the system was still being used when he left.
The pressure sometimes led the recruiters to make problematic assumptions, Lindsey said, based on his conversations with other employees, discussions at their team meetings and his own experience.
Tech recruiters say it’s a common, if unspoken, practice in Silicon Valley to guess the race and gender of applicants. It’s a less-than-ideal outcome of companies trying to reduce the risk of violating civil rights laws, which effectively prevent employers from requiring candidates to disclose their race outright.
Another former Black Facebook recruiter, who requested anonymity because he was not authorized to speak publicly, confirmed that FBR contained race tags when he worked there in 2018. He said that in his department, which was different from Lindsey’s, both team leaders and other recruiters told him not to use them. He was told they were being phased out because that type of guesswork was frowned upon. His team also no longer used “culture fit” as a hiring criterion, he said.
Julie Levinson Werner, a partner at the law firm Lowenstein Sandler, said that because both state and federal law prohibit making hiring decisions because of a protected characteristic, companies avoid asking candidates about race in order to mitigate their risk of a lawsuit.
But many recruiters don’t put those guesses in writing or, if they risk doing so, might note the guesswork in code, such as emoji, in their applicant-tracking software, said diversity expert Nicole Sanchez, founder and CEO of Vaya Consulting. She advises clients against this kind of guesswork.
Lindsey’s description of that practice within Facebook was troubling, Sanchez said, because it took that guesswork to another level by potentially exposing it to everyone involved in the hiring process.
Stone said the company does not instruct recruiters to visually inspect the race of prospective candidates and instead asks them to use objective criteria, such as membership in a professional society for a certain ethnicity or whether a person went to a historically Black school.
According to Lindsey, the pressure resulted in some cases in recruiters feeling the need to duplicate candidate profiles — manipulating the profile slightly so the same candidate would count twice, or tagging White people as people of color — in the tracking system to inflate their numbers. He said he never did so.
Facebook’s Stone declined to comment on Lindsey’s allegation.
That system then fed into an interview and selection process that was itself not inclusive, Lindsey said. Candidates of color, whose educational backgrounds more frequently included less prestigious universities, coding boot camps or HBCUs, were often dubbed less of a “cultural fit” in HR meetings than other candidates, Lindsey said.
Lindsey says his time at Facebook led him to co-found his start-up, Siimee, an app that helps job candidates connect with recruiters while prioritizing inclusivity and mitigating bias. It lets people create profiles where they can openly discuss their identities and ambitions and show those to potential employers.
He says a lot of people are surprised that he left such a prestigious company.
“It was a hard year. The work from home, the racial justice protests. I worked for a platform where there was a lot of hate and disconnect, and as a Black man, I felt I was always under a microscope,” he said. “Just because you’re in this place everyone thinks is great and have stability, you don’t have to stay somewhere that is problematic in your eyes.”