I tried the same AI and asked it to provide a list of 20 things, and it only gave me 5. I asked for the rest and it apologized and then provided the rest. It's weird that it stumbles at first but is able to see its error and fix it. I wonder if it's a thing that it 'learned' from the data set. People not correctly answering prompts the first time.
It's weird though because they were able to point out the absurdity of its comment and it did agree. No, it's not just algorithmic phrase matching, there is an actual "thought process" going on.
I've never been able to get an AI to explain its logic, though, which is a shame. I'm sure it would be useful to know why they come up with the answers they do.
I never really understood the outrage associated with using this word. I have no problem calling myself male, or being referred to as male, even though I just identify as a "person", tbh. People just want to be pissed off.
I never really developed an association with age as it relates to the term "girl" or "boy". Sure, I'd call a child a boy or a girl over a man or a woman every time, but there's not some magical age at which it becomes inappropriate.
We have brains and we are capable of interpreting things based on context. Things in the real world are fluid and flexible and rigid definitions are silly in the face of societal, cultural, and personal diversity. Stop trying to find outrage. It's pointless and you end up being wrong more ways than you're right.
The only reason I stopped using "girl" as a descriptor was because I stopped using "boy". And I stopped using "boy" because I learned white bigots use the word in a derogatory and demeaning way against black men, and I didn't want to accidentally demean a friend or coworker.
I likely would eventually have stopped using "girl" as I began to understand more about relative power dynamics and the implications of using diminutive language on adults. That being said, when you grow up hearing language used a certain way, you don't really think about its use as an adult until someone brings it to your attention.
Or at least that's my experience. But I generally think the best of people until contrary evidence presents itself. Like if they use blatant bigoted speech, or drive on the same road I'm driving on. Then I know they're hateful, awful people that should be eliminated from the gene pool.
I don't think it's dead Internet theory. But this is more dystopian in that private corporations are censoring our speech and searching based on their own criteria.
I find CSAM repulsive, but having a corporation or AI restrict unrelated content because their system construes an innocent search as potentially bad is almost worse.
"Lady" does sound like an older woman. "Gal", "woman" or "chick" would probably work, but that's all beside the point; it's very common to use <whatever> girl as a search term: biker girl, skater girl, ring girl, bikini girl, racer girl, etc. It's just dumb to automatically assume any search with "girl" means "child".
Maybe it's just me, but I don't think warning people they might expose themselves to CSAM meets the definition of "broke the internet". I bet if you replace "girl" with "woman" you'll get the expected results.
Funnily enough, wanted to just find one of those reels where the girl sits on the gas tank in front, legs pincered over the guy, because I wanted to show it to my girlfriend as a thing to do 😄
Always wonder about these images. Even without straight up editing them with paint/photoshop you could just search for something explicitly outrageous first, then type something ordinary in the search bar and take a screenshot.
I think it's plausible here if they integrated AI into the search. AI doesn't understand context very well; when you say "girl", it thinks you mean a child.
Yeah, or perhaps one of the returned results would have been at issue, even if the search wasn't seeking such things. Automation has been quite a bit less helpful than people initially imagined.
Yet you try to report that user who has listed 530 S/M sex toys in the marketplace, including huge anal hooks, horse dildos and nipple pinchers, and they reply that they don't see anything wrong here. Even when you ask them to go through the report one more time.
Why, ignoring the bullshit censoring, does it not just refuse to return results, but feel it needs to lecture you on top of it?
It could have just returned anything near the results, ignoring the word "girl", but no, it has to tell you you're bad and why. Fucking judgemental piece of shit on top of bullshit censorship.