How two of the world’s biggest chat apps tackle fake news
WhatsApp limits message forwarding, while WeChat bans some content entirely

Eating breakfast frequently is dangerous. Orange peel can unlock smartphones. Australia announced the end of cancer.
These were all real headlines that circulated on China’s biggest chat app last year. None of them sounds particularly convincing, yet they became some of the most widely shared hoaxes on WeChat in 2018.
Around the world, social media firms are grappling with how to curb the spread of misinformation. If you’re a WhatsApp user, for instance, you may have noticed that starting this week you can only forward a message to five chats at a time, a new measure adopted by app owner Facebook to fight fake news, which has plagued users in India and Brazil (among many other places).
Placing limits on message sharing isn’t new, but the way it’s done differs from platform to platform. On WhatsApp, where messages are end-to-end encrypted, moderators can’t read what’s actually being shared, which makes it difficult to deal with problematic content directly.
WeChat’s approach isn’t limited to fighting fake news. It also targets a long list of other content the platform deems inappropriate, including obscene material and solicitations for religious donations.