NSFW JS

NSFW Image Detection

NSFWJS is a JavaScript library that identifies potentially inappropriate images directly in the client's browser using TensorFlowJS. Experience safe browsing with CameraBlur Protection.

Project Description

NSFWJS is a JavaScript library that detects and filters potentially inappropriate images entirely on the client side. Powered by TensorFlowJS, an open-source machine learning library, NSFWJS classifies images to determine their appropriateness, with a reported accuracy rate of 93%.

One of its standout features is CameraBlur Protection, which automatically blurs any image the library identifies as potentially inappropriate, helping provide a safer browsing experience. The library is actively maintained, with new models released periodically to improve detection.

NSFWJS is free to use, and its code can be modified and distributed under the permissive MIT license. A mobile demo lets users test images on their own devices. Available through GitHub, the project encourages users to report false positives and contribute to its development. Whether you are an individual or an organization, NSFWJS offers a practical way to build a more secure, family-friendly online experience with AI-powered NSFW image detection.
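As a rough sketch of how client-side classification works: the library loads a model, classifies an image element, and returns per-category probabilities that application code can act on. The `shouldBlur` helper and its 0.7 threshold below are illustrative assumptions, not part of the NSFWJS API; the commented `nsfwjs.load()` / `model.classify()` calls reflect the library's documented usage.

```javascript
// Minimal sketch of classifying an image with NSFWJS, plus a pure helper
// that decides whether to blur. The helper name, flagged-class set, and
// threshold are illustrative choices, not part of the library itself.

// In a browser project with nsfwjs and @tensorflow/tfjs installed:
//
//   import * as nsfwjs from "nsfwjs";
//
//   const model = await nsfwjs.load();            // load the default model
//   const img = document.getElementById("photo"); // an <img> element
//   const predictions = await model.classify(img);
//   // predictions: [{ className: "Neutral", probability: 0.91 }, ...]

// Return true when any flagged class exceeds the probability threshold,
// signaling that the image should be blurred before display.
function shouldBlur(predictions, threshold = 0.7) {
  const flagged = new Set(["Porn", "Hentai", "Sexy"]);
  return predictions.some(
    (p) => flagged.has(p.className) && p.probability >= threshold
  );
}

// Example with mock predictions (no model required):
shouldBlur([
  { className: "Neutral", probability: 0.92 },
  { className: "Sexy", probability: 0.05 },
]); // → false
```

An application using CameraBlur-style protection could call a helper like this for each classified image and apply a CSS blur filter when it returns true.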