Be able to detect adult content, inappropriate language, or drugs in any given image. Be able to create safe platforms for all of your users, and keep them away from unwanted, unpleasant, or unsafe images. The API has many possible uses and can be helpful to a wide range of companies and roles.
Wondering how it works? It is very simple; below we explain it step by step. First, you input the URL of the image you want to moderate. The Image Moderation API will then scan every image provided for improper words or drug content and return a new link with a tag indicating that this particular image or post has been checked and moderated. Second, you can choose the tags the API returns: “No” if the image contains no drug content, and “Drug” if it does. Third, you can configure your software to detect specific types of drugs and receive tags such as “opioid” or “cocaine”. Finally, you can view the results as a list of URLs where the indicated tag is present, along with a percentage showing how likely it is that the image was tagged correctly.
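To make the flow above concrete, here is a minimal sketch of how a client might submit an image URL and read back a tag and confidence score. The endpoint URL, authentication header, and request/response field names (`url`, `tag`, `confidence`) are placeholders chosen for illustration, not the API's documented interface.

```python
# Sketch of a client for an image moderation endpoint, assuming a simple
# JSON-over-HTTP interface. Endpoint, header, and field names are assumptions.
import requests

API_URL = "https://example.com/image-moderation"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                          # placeholder credential


def moderate_image(image_url: str) -> dict:
    """Submit an image URL and return the moderation result as a dict."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"url": image_url},  # assumed request field
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    result = moderate_image("https://example.com/photos/party.jpg")
    # Assumed response shape: {"url": ..., "tag": "Drug" or "No", "confidence": 0.92}
    print(f'{result.get("url")}: tag={result.get("tag")}, '
          f'confidence={result.get("confidence")}')
```

In practice you would loop this call over all the image URLs you want checked and keep only those whose returned tag and confidence meet your moderation policy.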
Be able to detect any improper words in a given image. Be able to filter any unwanted content from your platforms.
You can check out the Text Moderation in Images API for free here.