Last month, a researcher at Meta prepared a talk for colleagues that they knew would hit close to home. The subject: how to cope as a researcher when the company you work for is constantly receiving negative press. The talk had been approved to show at the company’s annual research summit for employees in early November. But shortly before the event, Meta’s legal and communications department determined that the risk of the contents leaking was too great. So it disappeared from the research summit’s agenda days before, along with another pre-taped talk describing efforts to combat hate speech and bullying. Neither talk ever saw the light of day.

The pulling of the talks highlights how a barrage of leaks and external scrutiny has chilled the flow of information inside the company formerly known as Facebook. Many of the changes appear designed to thwart the next Frances Haugen, who worked in the Integrity organization responsible for making the social network safer before she quit earlier this year, taking thousands of internal documents with her. Those documents served as the basis for a series of damning stories in The Wall Street Journal and dozens of other news outlets, including The Verge.

Some of them, such as internal research showing Instagram and Facebook can have negative effects on young people, have led to congressional hearings and lawsuits. And as the bad press continues, Meta executives have argued that the documents were cherry-picked to smear the company and paint an incomplete picture. While the documents Haugen leaked haven’t yet caused Meta to make meaningful changes to its products, they’ve already left a lasting mark on how the world’s largest social network operates, particularly in its research and Integrity divisions. Ten of the 70 preapproved talks presented at the internal research summit a couple of weeks ago received a second, more stringent review to minimize leak risk.
Senior leaders, including policy and communications chief Nick Clegg, have in recent months slowed the release of Integrity research internally, asking for reports to be reviewed again before they’re shared even in private groups. In some cases, researchers have been told to make clear what is defensible by data in their work and what is an opinion, and that their projects will need to be cleared by more managers before work begins.

Last month, Meta rolled out a new “Integrity Umbrella” system designed to thwart leakers. The Umbrella maintains a list of employees in Integrity and gives them automatic access to join private Integrity groups in Workplace, the internal version of Facebook used by employees. When it was introduced, several employees internally pointed out that the system wouldn’t have stopped Haugen, since she worked in the Integrity division when she gathered the leaked documents.

It’s not just the Integrity division that is locking down access to Workplace groups. The change has become so widespread that employees have taken to a group in Workplace titled “Examples of Meta Culture trending towards ‘Closed,’” where they’ve been posting screenshots of previously open groups they belong to being set to private.

This story is based on conversations with current and former Meta employees and internal Workplace posts from the past month obtained by The Verge. In response to this story, Meta confirmed that the company was making changes to internal communication. “Since earlier this year, we have been talking about the right model of information sharing for the company, balancing openness with sharing relevant information and maintaining focus,” said Mavis Jones, a Meta spokesperson. “This is a work in progress and we are committed to an open culture for the company.”