Instagram will soon start sending out warning messages to users who post “potentially offensive” captions on their photos and videos.
The details: Instagram will send these warnings when someone posts a potentially offensive caption on a photo or video to the main feed, the company announced.
- An artificial intelligence-powered tool will review each caption to see if it could be harmful.
- The app will send a notification indicating that the caption looks similar to others that have been reported as offensive (see the sketch after this list).
- Instagram will then ask users to either edit the caption or leave it untouched.
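Instagram hasn't published how the check works, but conceptually it resembles a text-similarity filter: compare a new caption against captions that users have already reported, and show the warning when the similarity crosses a threshold. Below is a minimal, purely illustrative sketch using TF-IDF cosine similarity as a stand-in for Instagram's actual model; the reported captions and threshold are made up.

```python
# Illustrative sketch only -- not Instagram's real system or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical corpus of captions previously reported as offensive.
reported_captions = [
    "you are such a loser, nobody likes you",
    "everyone thinks you look ridiculous",
]

SIMILARITY_THRESHOLD = 0.4  # made-up cutoff for demonstration


def caption_warning(new_caption: str) -> bool:
    """Return True if the caption should trigger an 'edit or post anyway' prompt."""
    vectorizer = TfidfVectorizer().fit(reported_captions + [new_caption])
    reported_vecs = vectorizer.transform(reported_captions)
    new_vec = vectorizer.transform([new_caption])
    # Highest similarity between the new caption and any reported caption.
    best_score = cosine_similarity(new_vec, reported_vecs).max()
    return float(best_score) >= SIMILARITY_THRESHOLD


if __name__ == "__main__":
    caption = "nobody likes you, loser"
    if caption_warning(caption):
        print("This caption looks similar to others that have been reported.")
        print("Options: Edit caption / Share anyway")
```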
Flashback: In July, Instagram introduced an artificial intelligence tool that reviews comments and warns users when a comment they are about to post may be considered offensive, according to The Verge.
- In October, Instagram launched a feature called “Restrict,” which lets people quietly limit a bully’s interactions with their account without blocking them. The company has also worked to identify bullying comments and filter out offensive ones.
- The Verge: “Unlike its other moderation tools, the difference here is that Instagram is relying on users to spot when one of their comments crosses the line. It’s unlikely to stop the platform’s more determined bullies, but hopefully it has a shot at protecting people from thoughtless insults.”
Yes, but: Instagram’s changes may come a little too late, according to TechCrunch, since the app wasn’t designed with bullying in mind from the start.