CBP Knew About Agents’ Racist and Sexist Postings on the “I’m 10-15” Facebook Group Beginning in August 2016
The Committee obtained a list of thirteen "I'm 10-15" cases that CBP investigated between August 2016 and November 2018.[25] Despite the agency's longstanding knowledge of employee misconduct on the "I'm 10-15" group and other Facebook groups, CBP did not take sufficient disciplinary measures or other action to enforce its social media policies until it faced scrutiny during the summer of 2019.[26]
The first of these cases occurred in August 2016, soon after the creation of "I'm 10-15," when CBP received a referral of an image of a Border Patrol agent posing "behind a mannequin positioned in a prone, sexually suggestive manner." CBP found that the agent had committed misconduct and suspended him for three days. CBP subsequently investigated twelve other cases involving racist, sexist, and political attacks, as well as the unauthorized release of sensitive information. CBP substantiated the allegations against eight of the agents and gave them written reprimands or counseling. The remaining four cases were closed with no action or found to be unsubstantiated.[27]
An arbitrator who overturned one of CBP's proposed removals following the 2019 revelations wrote that the agency's "history of tolerating racist and bigoted social media posts" affected its ability to hold agents fully accountable after their misconduct was made public.[28] When overturning CBP's proposed removal of another agent, an arbitrator observed that the "I'm 10-15" page "was a secret to a lot of people, but it was not secret to the Agency."[29]
The Wall Street Journal recently reported that Facebook was aware of activity on its platform related to drug cartels and human trafficking but failed to remove the offending posts.[30] Facebook similarly failed to take appropriate action in response to "I'm 10-15": it did not enforce its Community Standards or remove content in the group that violated its rules. An arbitrator in a CBP agent's case pointed to Facebook's longstanding knowledge, finding that the "I'm 10-15" page "was not secret to Facebook, which enabled obscene, harassing, and disruptive behavior."[31] Facebook's Community Standards prohibit this activity, stating, "[W]e don't allow hate speech on Facebook."[32] The company's policy also states, "We also protect refugees, migrants, immigrants and asylum seekers from the most severe attacks."[33] The Wall Street Journal also reported that, based on a review of internal company documents, "Facebook Inc. knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands."[34]