
We prototyped an AI camera app to safeguard children from sexting 🙌

Hello community! 👋

A couple of months ago, I partnered up 🤝 with Berend (@Daalen on IH) and Paul for an upcoming tech adventure.

Since then, we have been working relentlessly on an AI camera app that safeguards children from sharing sexually explicit content.

2.5 months into our agreement, we have an iOS prototype with our ML model that detects sexually explicit content in real time. For the geeks out there: we are predicting multiple objects (e.g., exposed female genitalia, exposed buttocks) at an average of 30 frames per second.

Here is a screen grab from the iOS app prototype 📱: https://ibb.co/k8Z3Tzz

The app can work offline with the help of two machine learning models:

  1. An object detector for real-time recognition.
  2. A classifier model that runs a final check after an image is captured and before it's saved to the device.

This gives us extra assurance that capturing sexually explicit images would be next to impossible. 😸
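For the curious, here is a rough sketch of how a two-stage check like this could be wired up on iOS with Core ML and the Vision framework. The class, model parameters, the "nsfw" label, and the thresholds below are placeholders for illustration, not our actual setup:

```swift
import CoreML
import Vision
import CoreVideo

// Sketch of the two-stage check. The model inputs and the "nsfw"
// label are hypothetical placeholders, not the app's real models.
final class ExplicitContentGuard {
    private let detectorRequest: VNCoreMLRequest   // stage 1: per-frame object detector
    private let classifierRequest: VNCoreMLRequest // stage 2: whole-image classifier

    init(detector: MLModel, classifier: MLModel) throws {
        detectorRequest = VNCoreMLRequest(model: try VNCoreMLModel(for: detector))
        classifierRequest = VNCoreMLRequest(model: try VNCoreMLModel(for: classifier))
    }

    /// Stage 1: runs on every camera frame (~30 fps) to flag explicit objects in real time.
    func frameLooksExplicit(_ frame: CVPixelBuffer, threshold: Float = 0.5) -> Bool {
        let handler = VNImageRequestHandler(cvPixelBuffer: frame, options: [:])
        try? handler.perform([detectorRequest])
        let objects = detectorRequest.results as? [VNRecognizedObjectObservation] ?? []
        return objects.contains { $0.confidence > threshold }
    }

    /// Stage 2: final check after capture, before the photo is written to the device.
    func capturedImageLooksExplicit(_ image: CVPixelBuffer, threshold: Float = 0.5) -> Bool {
        let handler = VNImageRequestHandler(cvPixelBuffer: image, options: [:])
        try? handler.perform([classifierRequest])
        guard let top = (classifierRequest.results as? [VNClassificationObservation])?.first
        else { return false }
        return top.identifier == "nsfw" && top.confidence > threshold
    }
}
```

In a setup like this, the stage-1 check would typically be fed from the camera's AVCaptureVideoDataOutput delegate, and only captures that also pass the stage-2 check would ever be saved.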

The prototype model is based on an open-source image dataset. We look forward to introducing many improvements that will further refine it.

For more information and to sign up for our closed beta, visit https://www.getnuno.com/?lang=en

We hope this app helps solve a potentially dangerous problem that stems from the overexposure to online content that today's children and teens face.

If you are interested, and especially if you are a parent who wishes to help make this app better for the next generation of children, we would be glad to have you join our closed beta program! 🤩

Lastly, I have found Indie Hackers to be a brilliant platform/community for finding like-minded people, and this project is a good example.

Open to questions and suggestions. Have a nice time 👋

posted to Growth on December 18, 2020
  1. This is so great for the world, Dave!

  2. @meetdave - love the ingenuity and the clear talent you, @Daalen, and Paul have.

    Question.

    Once a parent installs the app on their child's/teen's mobile phone (iPhone/Android), what prevents the child from uninstalling it?

    How do users actually know the app is working? Are parents supposed to test the app by exposing themselves to the camera privately?

    Or will you be suggesting they find an explicit set of photos and put them in front of the camera to test it?

    Either seems like it might be awkward, unless you can find a fun way to approach it.

    1. Hey @iammike. Thanks for the appreciation, it means a lot to us! :)

      Answer.

      If you use Nuno as a parental control app, you can use Google Family Link on Android to enforce it. Ideally, the parent would also block camera access for all other apps and only allow camera access to Nuno. Nuno would then work as the standalone app for capturing and storing images on the device.

      For iOS, we are still exploring possibilities for the ideal approach; it's still under R&D.

      Lastly, we get it, it's awkward to test with a set of explicit photos. At present, we don't suggest that parents test by exposing themselves; however, they are free to do so.

      At the moment, this problem boils down to trust, until we find a fun way to approach it (as you suggested :))

      Dave

      1. Good stuff! Wishing you the very best of luck in your endeavors.

  3. This would safeguard a lot of vulnerable people and save them from bad decisions.
