Working at Facebook was once a matter of pride for some people and their parents. Now, being employed by the social networking giant apparently leads to uncomfortable questions from inquisitive relatives asking what part you, or your colleagues, played in the Cambridge Analytica scandal.

For the past couple of years, Facebook’s reputation has suffered, so much so that people’s trust in the company fell by 66 percent since the infamous data scandal last year. Because of this mess, Facebook’s own employees recently told their managers that they were worried about answering difficult questions about their employer from friends and family. So, Facebook’s public relations team built an artificial intelligence bot to help employees deflect criticism.

The tool, “Liam Bot,” teaches Facebookers official company answers to uncomfortable questions. For example, if your worried nan asked how the company deals with hate speech appearing on the platform, Liam would tell you to cite statistics from a Facebook report and reply with any of the following: “Facebook consults with experts on the matter,” “It has hired more moderators to police its content,” “It is working on AI to spot hate speech,” and “Regulation is important for addressing the issue.”

As first reported by The New York Times, the tool was rolled out to employees shortly before Thanksgiving last week, and it was first tested earlier this year. The bot also provides links to company blog posts and news releases on how it’s working to regain people’s trust.

Facebook has found itself in various questionable situations over the course of the year, including the Cambridge Analytica scandal, in which 50 million accounts were harvested, and allowing politicians to lie in political ads in the runup to the 2020 US presidential elections — an issue its own employees protested against in an open letter last month.

This year, Facebook fell to seventh place from the top spot when people were asked where they most wanted to work, according to a survey by Glassdoor, the employer reviews site.