For the past couple of years, Facebook's reputation has diminished, so much so that people's trust in the company dropped by 66 percent after the infamous data scandal last year. Because of this mess, Facebook's own employees recently told their managers they were worried about fielding difficult questions about their workplace from friends and family.

So, Facebook's public relations teams built an artificial intelligence bot to help employees deflect criticism. The tool, "Liam Bot," teaches Facebookers official company answers to uncomfortable questions. For example, if your worried nan asked how the company deals with hate speech published on the platform, Liam would tell you to cite statistics from a Facebook report and reply with any of the following: "Facebook consults with experts on the matter," "It has hired more moderators to police its content," "It is working on AI to spot hate speech," and "Regulation is important for addressing the issue."

As first reported by The New York Times, the tool was rolled out to employees shortly before Thanksgiving last week, having first been tested earlier this year. The bot also provides links to company blog posts and news releases on how it's working to regain people's trust.

Facebook has found itself in multiple problematic situations over the course of the year, including the Cambridge Analytica scandal, in which data from 50 million accounts was harvested, and its decision to allow politicians to lie in political ads in the run-up to the 2020 US presidential election, an issue its own employees protested in an open letter last month. This year, Facebook fell from the top spot to seventh place when people were asked where they most wanted to work, according to a survey by Glassdoor, the employee reviews site. In the past, Facebook employees' most pressing workplace question was probably a relative asking how to change their password. Now, people question why anyone works at Facebook at all.
The tech giant's answer to protecting its employees from difficult questions seems lazy, relying on tech to clean up a mess of its own making, and it risks turning workers into robot-like mouthpieces for the firm.