No effect on AWS Rekognition? #138
I see the same effect with my photos. In case I don't get back to this: I realized I had only run it with `--mode=low`. I'm currently processing with `--mode=high`, but it's taking a while.
So I decided a before-and-after comparison wasn't a realistic test. What we really need to test are two different images of the same person, both run through Fawkes. I did that, and the results are not good: 99.9%, 99.5%, and 99.3% similarity for low, mid, and high modes respectively. I suspect these facial recognition services saw all the publicity around Fawkes and trained their networks to recognize cloaked images. Generative adversarial networks might be a good next step, but I'm not an expert on this. We would probably need scripts for automating this testing.
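For anyone who wants to reproduce this, here is a minimal sketch of the test using boto3's Rekognition `compare_faces` call. The file paths and the `top_similarity` helper are my own names, not part of any project script; it assumes boto3 is installed and AWS credentials are configured in the environment.

```python
# Sketch: compare two (cloaked) photos of the same person with AWS Rekognition
# and report the highest similarity score. Requires boto3 + AWS credentials.

def top_similarity(response):
    """Return the highest Similarity score from a CompareFaces response, or 0.0
    if Rekognition found no matching faces."""
    matches = response.get("FaceMatches", [])
    return max((m["Similarity"] for m in matches), default=0.0)

def compare_images(source_path, target_path, region="us-east-1"):
    """Send two local image files to Rekognition CompareFaces."""
    import boto3  # performs a real AWS API call; credentials required
    client = boto3.client("rekognition", region_name=region)
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        response = client.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            SimilarityThreshold=0,  # report all matches, even weak ones
        )
    return top_similarity(response)

if __name__ == "__main__":
    # Hypothetical file names for two Fawkes-cloaked photos of one person:
    # print(compare_images("person_a_cloaked.png", "person_b_cloaked.png"))
    pass
```

If cloaking worked, two cloaked images of the same person should score noticeably below the ~99% range reported above.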
https://www.theregister.com/2022/03/15/research_finds_data_poisoning_cant/
I just downloaded two images, original and cloaked, from your website and uploaded them to AWS Rekognition. The result is 100% similarity.
Did you upload the wrong images? (I checked all of them, including Obama's; they have different file sizes.)