I was thinking it might be a filter to prevent pedos, but my post had nothing of a sexual nature. The photo was a small slime girl gnawing on a chunk of raw meat with a smile.
@edwardthefma99@lemmy.world
So I typed basically every pedo and sexual word I could think of, hit generate, and it generated normal-looking 20+ year old girls. I tried to add one to the gallery and it said network failure, probably because my internet is really slow right now. Trying to add a normal image to the gallery also says network failure, so that part seems to be on my end.
My GUESS is it doesn't have to do with pedo prevention, since the method to prevent that seems to be detecting keywords and rewriting the prompt on the server side before we even receive the image. My internet is really slow during the day though, so it's hard for me to test. What I just typed would surely have ranked a bajillion percent higher on pedo detection than whatever your innocent prompt was, but I got no error; it just automatically weeded out what I had typed and gave back a legal image.
Someone else probably knows more about this or has seen the error before and knows why. Based on my testing, the anti-pedo filter doesn't seem to be the cause.
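The server-side keyword-and-rewrite behaviour I'm guessing at could look roughly like this. This is purely a sketch of my guess, not the service's actual code: `BANNED_TERMS`, `SAFE_REPLACEMENT`, and `sanitize_prompt` are all made-up names for illustration.

```python
import re

# Hypothetical sketch: the server silently replaces flagged terms in the
# prompt before generation, so the user gets a "legal" image back instead
# of an error. The term list here is a placeholder, not a real filter list.
BANNED_TERMS = {"badword1", "badword2"}
SAFE_REPLACEMENT = "adult"


def sanitize_prompt(prompt: str) -> str:
    """Swap any banned term for a safe substitute, case-insensitively."""
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(t) for t in BANNED_TERMS) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(SAFE_REPLACEMENT, prompt)


print(sanitize_prompt("a badword1 eating raw meat"))
# the flagged word is quietly swapped; no error ever reaches the client
```

That would explain why my loaded prompt produced no error at all: the rewrite happens before generation, so there's nothing left to flag by the time the image comes back.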
Yo, I got like seven warnings on different pics that contained nothing like that (neither the image nor the prompt), nor anything anywhere near as illegal as that. After the 7th or 8th warning it showed "Error: last warning", and then I was banned from all the galleries. None of my prompts are visible to others anymore; they just disappear from the gallery, even for me, after a few minutes.