There’s something wrong with a world where one needs to use the words “no nudity” when setting up a Stable Diffusion AI art prompt.
Surely “no nudity” should be the default? I have many doubts about this technology: its power use is ridiculous, and a huge amount of intellectual property theft is nodded through.
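For readers unfamiliar with how this works in practice, here is a minimal sketch assuming the Hugging Face diffusers library. Exclusions like “no nudity” aren’t written into the prompt itself; they go into a separate negative_prompt parameter, which is empty unless you opt in to setting it. The prompt text and model name below are illustrative only.

```python
# A minimal sketch, assuming the Hugging Face diffusers library.
# The negative prompt is opt-in: it stays empty unless you set it.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    prompt="a portrait of a woman reading in a library",
    negative_prompt="nudity, nsfw",  # nothing like this is supplied by default
).images[0]
image.save("portrait.png")
```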
Yet nothing is more off-putting, and more indicative of moral bankruptcy, than the realisation that the default assumption is that one uses these tools primarily for smut consumption. It’s not just the objectification of women; some of the models treat men as sex objects too.
On top of that, what is it with Stable Diffusion models and breasts? Nine out of ten generated images of women have unnaturally large breasts. This adds to the underlying not-quite-safe-for-work vibe.
It is possible to navigate these issues and create safe-for-work images, but that’s not the prevailing culture of the Stable Diffusion scene. It needs to clean up its act.
@billbennett
I think the bigger issue is around jurisprudence: prompt inputs and intent are clearly divorced. That's a major problem to deal with.
@billbennett
Training for censorship breaks human anatomy, as well as classical art depictions, quite badly in the 'safe' models; biasing machines towards some sort of Nancy Reagan-esque classification is likewise more of a moral wrong, IMO. Cf. SD3, which is a major step backwards for rendering anatomy correctly. Yes, accidental porn is a problem, and maybe some consumer guardrails are warranted. But conversely, you don't look down the barrel of a loaded gun, pull the trigger, and expect not to get shot.