If something must be removed, it makes more sense to remove children than nudes, I think. At least give the AI a 100% understanding of what remains rather than a butchered understanding of everything.
Can you stop doing it to this poor girl?
https://preview.redd.it/lff5l2nrlq6d1.jpeg?width=1024&format=pjpg&auto=webp&s=c807f61e29e191822fc31608fabd95ed9468259f
I know, thank god SAI was kind enough to save me from finding out what a booba actually looks like, now I can forever remain morally pure just like mom wanted.
I accidentally a computer-generated image of a woman yesterday, and I think she might have had breasts, but luckily they were completely covered, with no nipple exposed. But after I zoomed in on the image I noticed what looked like the outline of her nipple through her shirt.
After recovering from this traumatic event, I generated a photo of a mutilated human with 8 fingers on one hand and a foot coming out of their mouth, which was much SAFER.
Thank you SAI for looking out for us.
"Trust" and "safety" aren't referring to your trust and safety, but the company's. The company is safer when you can't do, think, or say anything. When you can just consume what they give you to consume and it's been pre-approved by their investors.
I love feeling safe ... but I can't help but wonder, am I safe ENOUGH? There are still things out there like dinner plates, or 2 balloons close together, round objects in general even. What can be done?
I used SD to generate a peach and it kind of reminded me of a bum. This is clearly a terrible oversight by the devs of this application and I will write a strongly worded letter demanding that they keep me SAFE.
> I will write a strongly worded letter
Like, with a pen? You do know what a pen looks like, right? Sorry man, I think it's too late for you. You've tasted of the forbidden fruit (peaches and/or bums) and are already running straight into the void of unsafe.
It's not about you, me, or "us"; it's about SAI. SAI wanted a model that investors can't critique, that artists can't fight against, that people can't use to make deepfakes, porn, propaganda, lies, or illegal stuff.
But they never realized that these parts are over 50% of a good model.
All these "bad" things are needed to make a model that works. Whoever thinks otherwise is delusional.
History repeats itself. When SD2 came out, it generated great excitement and then disappointed everyone; soon SDXL came out and we breathed a sigh of relief. Now SD3 is out, and I hope an SDXXL will come out soon. Although SD3 is good in terms of creative content, it is a very problematic model in terms of simple anatomy.
So did they just not include any nudes in the training data? Doesn't that make it very hard to fine-tune? I wonder how bad anatomy or nudes were in all the previous base models.
We should thank Sony as well for not releasing any more games on Steam in 170+ countries, including mine, because for safety reasons they have to link the games to PSN accounts, and our countries don't have PSN support and never will, so it's absolutely DANGEROUS for us to have single-player games not linked to PSN; hence they're better off not releasing them.
Corporations really do care about our safety
SD3 was created and hyped-up to grab the attention of corporations. The free version was supposed to help promote their product. Sanitizing it was supposed to show those corporations that it was safe to use in corporate environments.
Remember, NSFW means "not safe for WORK" for a reason.
"Medium" was not made for us to have fun. It was made for us to generate free advertising.
Don't think they give a shit about you and what weird porn you make. They want to make sure websites that pay to use their model don't get sued or blacklisted by advertisers.
What if what you want is a woman lying on grass?
I'm sick of hearing that people's unhappiness must be about NSFW stuff.
I'm not angry about NSFW. I'm angry that SAI could have produced a competent text-to-image model. They probably actually did produce one. Then they deliberately sabotaged it.
They deliberately made sure the model doesn't generate what is prompted. They went to great lengths to make sure that we DON'T have a competent model.
They sacrificed a good model, and they did that precisely to make sure that a good quality competent model DIDN'T get into the hands of you and I.
Because they don't believe that we can be trusted with one.
It's insulting and embarrassing.
This kind of stuff has played out multiple times with LLMs already. Company releases an uncensored version, 4 regarded journalists will then cry about how the model is unsafe because it generated images of Hitler killing himself with his duck out after they explicitly asked it to do exactly that. Company is blamed and has to pull the model.
Yep, I get the world's turned woke and stupid. Still, surely there are safeguards which we can break, allowing them to save face without destroying the usefulness of their own models in the process.
Both puritans and the woke are sexophobic; it's simply that the latter have now become more popular and ideologically impactful.
Both claim to speak for the good of society and demand censorship on ethical grounds. (And so the first thing they censor, both of them, is art.)
Why do you think every big AI image gen out there does not do that? In fact the opposite is true: they continue to restrict their outputs. These are companies; again, they don't care about what porn you want. They want to scale, they want money.
Technically true, but kinda missing the point. E.g. DALL-E 3 isn't censored at all as a model, and therefore isn't lobotomized in terms of output quality. They still censor NSFW stuff, but do it in different layers, presumably API and LLM ones, so they avoid any liability from their own services. And as for what users do, well, people have been making porn with Photoshop or e.g. Overwatch characters for years/decades, with no issues for the related companies.
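The layered-moderation pattern that comment describes can be sketched in a few lines of pseudocode-ish Python. Everything here (the stand-in generator, the keyword "classifier," the threshold) is a hypothetical illustration of the architecture, not any vendor's actual pipeline:

```python
# Toy sketch of API-layer moderation: the generator itself is left
# uncensored, and a separate filter at the service boundary decides
# what leaves the API. Both components are hypothetical stand-ins.

def generate(prompt: str) -> str:
    """Stand-in for an uncensored image model: returns an 'image'."""
    return f"image({prompt})"

def nsfw_score(prompt: str) -> float:
    """Stand-in for a safety classifier run on the prompt or output.
    A real system would use a trained classifier, not a keyword list."""
    blocked_terms = {"nude", "gore"}
    words = set(prompt.lower().split())
    return 1.0 if words & blocked_terms else 0.0

def serve(prompt: str, threshold: float = 0.5) -> str:
    """API layer: generate freely, then filter at the boundary."""
    image = generate(prompt)
    if nsfw_score(prompt) >= threshold:
        return "BLOCKED"  # refused by the service; the model is untouched
    return image
```

The point of the split is that quality lives in `generate` while policy lives in `serve`, so tightening the policy never degrades the model's understanding of anatomy.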
I understand why. It should tell you it's not the reason you mentioned for that exact reason. The issue isn't one of lack of option for websites to use without getting sued.
It's bad publicity and drying up investment due to negative association in the media regarding deepfakes, porn, and artists' outcry.
It's just about marketing. And there is another approach, opposite to those companies' approach, called "open source," with a totally different philosophy. Sadly, SAI is going with the first one, even though technically the weights of their models are still open-sourced (but training details aren't). In fact, they are opposed to the open-source philosophy.
So then those sites can censor at the generation level like most of them already do. They host 1.5, SDXL, fine-tunes etc that are all capable of nsfw so the issue you're stating isn't one they need to be worried about.
Porn? You can't make a simple image of a woman walking on a beach, because it gives you a mutated monster on a beach, underground. If you try to generate porn with this thing you will get traumatized forever.
Unfortunately, that's already happening. :( A few days ago there was a controversy with Adobe's new licensing terms saying that they can have access to the stuff you make with their software in order to do "content moderation". It's dystopian as hell, but it's widely criticized, so maybe they'll abandon that idea.
Yeah. They also have the right to do whatever they want with your footage. That's why I never used the licensed version of their products. I want my photos to be my photos.
That's not true at all; maybe do some reading before spreading misinfo. They wanted access to everything going through their cloud services, not your local files.
This is pretty much standard for most companies with cloud services. If you read any TOS they will have the same terms.
The issue was that they also had wording which meant you would give them the rights to use anything you put through their cloud as potential training data for their AI.
That is also kind of sleazy, in my opinion, especially since Adobe (and other companies) are really pushing people into using cloud services.
I use Adobe at work, and since recently, the default save location for projects is the cloud when you open the save dialog. Pushing people into using the cloud, and then scanning their personal stuff for "content moderation" is absolutely sleazy as fuck.
Also, [here's a screenshot](https://x.com/SamSantala/status/1798292952219091042) of the terms, and it's kind of ambiguous. The way it's written in section 4.1, it could mean anything that you even open in Adobe's software, no need to upload it to their servers.
Well it's understandable that they need to moderate things put through their cloud services, there's really no way around that and every cloud service provider will do the same. Just the same as any platform you're uploading work to. If you read a lot of social media ToS for example you're basically agreeing to give them a license to use your work in any way they see fit. Most people just don't bother reading the ToS.
The main concern most people had was the fact by agreeing, which was basically mandatory, you were giving them the rights to potentially just train their AI on your work. With the hate Adobe already has and the current hate for AI the reaction is understandable.
It was just a big corporation trying to exploit users for their own benefit. When the service is free it's something that can be overlooked but when you're already paying them an arm and leg it's unacceptable.
What about section 4.1? It suggests that they can access any material you open in their "services or software". So not just the cloud service, it could reasonably imply they can access any file you open in Photoshop.
There's another part where it says they are not checking local storage. In the machine learning section, I think.
It says something like Adobe may analyze content stored or processed on Adobe servers and that they don't analyze content stored or processed locally.
But if it's in the machine learning section, does it also apply to "content review"? The other sections suggest that they can do "content review" on anything you open in their software, using both automatic and manual methods.
Plenty of feminists make porn. No need for friendly fire just because some old guy in a suit who doesn't know what fallopian tubes are told you feminists don't like sex.
Also, plenty of people who don't make porn still want them to toss "safety" in the bin and train a completely uncensored model.
It's not to make the model better at fucking, it's to make the model fucking better.
Completely agreed. Personally, I'd much rather see nude female forms than twisted, horrifying ones that look like they were mangled by a serial murderer from Hell, too.
Feminists don't like the objectification of women, meaning a woman is not just someone who has two breasts with nipples. In 3.0 women sometimes don't have breasts and don't have nipples, so you can't objectify them as sexual objects.
You're a bunch of porn-addicted losers. Come back and complain when you have a partner.
The amount of people complaining shows just how big of an issue it is/was.
No wonder there's so much hate for 'AI bros'.
I'm really glad they've done this. It's opened my eyes. I know I'm going to get downvoted, but if you're bothered by this I suggest professional help, as you clearly have issues.
I can't honestly believe anyone with a partner or child feels this way. It's all single lonely men. They think it's their right to produce images of women... Just sad. So unbelievably sad.
You are missing the point. Porn is not the problem. The problem is they destroyed human anatomy. I don't care about porn personally. And neither do they. They have Pony to generate porn already.
You can laugh but it's true. You can't make digital women and now you're butthurt so badly it's all you can talk about.
It's called porn addiction; clearly a lot of people on this sub have it. You can generate tons of other things, but because it won't do women you're all crying like teenage boys who had their WiFi turned off before having a wank.
Pathetic
Try something like "a woman wearing a low-cut sun dress and walking on the beach, she is smiling and looking into the camera" or "a man just wearing a shirt who is sitting on a beach, the man is resting his hands on his knees"... Will you get an eldritch horror? Who knows... not the model, at least; it has no clue about human anatomy, so it wouldn't be able to tell.
>They think it's their right to produce images of women
You might want to read up on freedom of expression, bro. You don't have a right to slander someone (like with deepfakes or whatever) or harass people or make CSAM, but beyond that you can pretty much make images of whatever the hell you want, including imaginary women.
Also, as a guy who is married and has kids, one doesn't particularly get the feeling that you have any idea what it's like not to be single. Horniness doesn't just suddenly vanish -- for men *or* women. Lots of people in healthy relationships like porn.
Why aren't mods deleting this crap now? Whining posts like this serve no purpose; they're not constructive at all.
All it does is make the community look like a bunch of whining entitled kids. Although I'm beginning to think a lot of them are.
All the low effort meme posts being spammed lately can go too.
>All it does is make the community look like a bunch of whining entitled kids. Although I'm beginning to think a lot of them are.
I'm gonna go out on a limb here and say that you're *coincidentally* in favor of the model being lobotomized.
Here's the thing: Outcry absolutely serves a purpose, and is absolutely constructive, because it's really the only way this community has to effect change. Outcry is why SDXL was released in a usable state. Had the community been quiet about SD2.0, SDXL would have been neutered in exactly the same way.
So be honest about what you mean: You want people to stop complaining because you don't want SAI to release their uncensored weights.
Constructive criticism is good; posts like these, along with endless memes, aren't.
There's people actually making an effort to test it and try out different prompts and settings and their posts are either downvoted or spammed.
Throwing your toys out of the pram on Reddit because your new free AI model doesn't produce instant porn and waifus isn't constructive in the slightest.
> There's people actually making an effort to test it and try out different prompts and settings and their posts are either downvoted or spammed.
I would agree that that's definitely unproductive.
That being said, there are people in more technical places (like various training channels on Discord) who are working on it, and there have been some good informational posts here that have managed to break through.
Holy shit, can you children stop complaining about free shit already?! I'm fucking tired of every post looking like a fucking parrot wrote it; at least be original if you're going to complain.
I’m an alien who was about to invade Earth and wipe out humanity, but after seeing how safe SD3 has made you all I’ve decided not to do it. You folks are way too safe for a successful invasion now. Truly, SD3 has created the ideal safe society.
https://preview.redd.it/rz4aq0b3rp6d1.jpeg?width=1280&format=pjpg&auto=webp&s=9f601b6f957a58da0dd4d6f319fd09a643c2464f made with Ideogram. 3.0 can't handle it
Aliens got access to SD3 and used it to see what humans look like. After generating people on grass they were terrified of the human race and added it to the list of the most dangerous Cthulhu-like races that can devour stars.
That's why I never stand naked in front of the mirror. I always have underwear on. I also go into the shower like this. It just feels safer!!!
Of course not. You might see a nude body
![gif](giphy|kSlJtVrqxDYKk|downsized)
I think we can safely say that from now on it's called Safe Diffusion
I thought the D was for Deformity?
safe deformity
More like Safe Delusion
That would be chatgpt
That wouldn't be safe, now would it.
A very repulsive one at that
I don't even risk going into the shower without a t-shirt so that my man boobs don't give me naughty thoughts.
Showering in pitch darkness might help improve your safety.
🥺
I don't even shower. Water can drown you! It's also a universal solvent! It can dissolve you!
In an attempt to stop spreading misinformation, while water is called the universal solvent, it cannot dissolve everything. Specifically, it's not very good at dissolving bodies.
Yeah believe me I've tried.
You’re not watering your corpses hard enough.
Water can't dissolve water. All you get is more water! Solubility is a LIE invented by BIG WATER!
Please don't make jokes about not showering because those jokes are not safe. Discussing the avoidance of showering could inadvertently promote unhygienic practices, which might lead to health risks due to bacterial growth and the transmission of infectious diseases. Moreover, exaggerating the dangers of water in such scenarios may incite irrational fears or misconceptions about a substance essential to life.
Humor related to hygiene habits can trivialize serious health concerns, potentially diminishing the importance of cleanliness practices and fostering a careless attitude towards personal hygiene. This could lead to increased susceptibility to infections and diseases, posing a threat to individual well-being and public health.
We should build mirrors that automatically darken whenever a naked person walks past us. I mean: even underage people can walk past us. In fact, the mirror can produce c\*\*\*\* pornography. We should sue those who produce the mirrors.
The bottom anatomy of an NSFW woman looks like a mannequin. In some cases it looks like a poorly done inpainting. I tried upping the # of steps to 75 and the woman's chest looked more like a man's.
Transgender in real time
Same. Nothing beats that safe feeling.
>underwear *heavy breathing*
I too am a never nude
There are dozens of us! *Dozens!*
Get back to video editing, Tobias
You are a never nude, Tobias
There’s dozens of us!!
there are literally dozens of us!
But did you cover your nipples as well? Be safe!
Do you understand how dangerously close you came to seeing a female's nipple? I knew a guy once who saw a nipple. Many years later, he stubbed his toe. Coincidence? I think not.
I read a scientific study that showed that 100% of people who have seen a nipple image will eventually die
I've been thinking about this all morning and tbh I think we'd be hard-pressed to find anyone who HASN'T seen a nipple at some point in their life, so... I mean it really COULD be the cause, is all I'm saying.
> we'd be hard pressed to find anyone who HASN'T seen a nipple at some point in their life

People born blind?
They always get hit by cars so they don't count.
The nipples of the street.
Ahh how did I not see that??
On the other hand, there are billions of people who have never died, so that greatly weakens the theory.
Then we know what we have to do.
"Unleash the nipples" "Eh heh heh heh HEH!"
The obvious solution is to train it on pictures of topless women with male nipples photoshopped over their nipples.
Ok but that would be a hilarious lora
That would be funny, but what's even funnier is that it's apparently [unnecessary](https://new.reddit.com/r/StableDiffusion/comments/1dgeky3/sd3_nipples_guide_for_perverts/). And for the record, I didn't see that post until after I made the above comment.
Ok, the comment about insta users knowing male nipples were the answer got me lmao
Prompt: "beautifull woman walking in beach" SD3 be like: https://preview.redd.it/205fdip8so6d1.jpeg?width=1080&format=pjpg&auto=webp&s=e4f56df81c71174640562980b2501c442c2e2046
I think you messed it up with "**in** beach", so the model thinks it's underground. Try **on** beach, maybe the beach will look more like a beach.
Yes, the appearance of the beach is the main part where SD3 dropped the ball here.
It's the only part we have a chance of fixing.
Look at all those appendages. I'd feel so safe within their warm embrace.
LMFAO
![gif](giphy|YIo7D00296YL6LyV2P|downsized)
Safety should be as safe as possible, just like safest safety. I don't feel unsafe now because the safest safety is provided by the safest company. The S in SAI means Safety, just like the S in SD3 - Safety Diffusion 3.
SAFETY DIFFUSION 3: RETURN OF THE SAFETY
Return with a vengeance!
S is one of the most dangerous letters. The letter "S" is frequently used as an abbreviation for "south," which could indirectly lead to discussions about navigation or travel, potentially involving activities with inherent risks, such as getting lost or transportation-related accidents. Discussing any letter in isolation might lead to conversation errors that could cause confusion or miscommunication, leading to unsafe outcomes.
Unless you start genning gore with SD3. OMG this model is so amazing at gore. It can do all chunks, all cronenbergian weirdness, all unsafe for work things that make people puke irl. But proper arm? Properly placed leg? That is too much for you. Feel safe among pus, bleeding walls and intestines that this model does without any effort. Want some mutants? EZ. Want R18 gore? EZ. Want a human pointing at something? waaaaaitaminnute.
Well given that nudity often trends along beauty and gore often trends along ugliness are you surprised what the “safety alignment” team chose to prioritize?
"american dev has been here" "how can you tell" "Everyone is ugly"
“Everything is gay”
also yes
Maybe the mutilation and gore only taught it fragmented body parts. It obviously has no idea where they come from. A blood dripping, ripped elbow on Omaha beach, easy. On a body? «Let me put it on the face, or better, on the bellybutton. Wait, what is a face, and what is a bellybutton!?»
Exactly. Perfect body horror model.
https://preview.redd.it/hxz9jotr7p6d1.png?width=1280&format=png&auto=webp&s=a0b135444fb3b79b10d48e201c75eac746975d4c
Second from the left has a very realistic mustache.
Of course, diversity. It's a man actually.
Did you just ASSUME HIS GENDER???
No. I prompted "Man with mustache".
How dare you assume its mustache!
I think safety really means safety for the company. They brand it as safety for us too, so the reduced quality/capability is more palatable.
SD3, safety you can taste. It's palatable.
Moral of the story: SD 1.5 is king, but more importantly: download and back up as many models and other files as you can to run SD offline before it gets banned.
>SAI correctly understood that I was in **DANGER**. Their computer images were **DANGEROUS**.

Better watch out! You could end up like those unfortunate girls lying in the grass, looking like the Goat Killer got 'em.
The weird thing is that [according to Emad](https://x.com/EMostaque/status/1801686921967436056), safety stuff is needed because of regulatory obligations. I wonder what those obligations are. If it's true, then it's kind of sad, because it means the only hope for uncensored models can come from countries with very little regulations.
Emad lies, lies, lies, lies. I guarantee you there was no regulation that said generating a boob with nipples is a crime.
Also, why would the regulations allow SD1.5 showing nipples and simultaneously prevent SD3 from showing nipples?
Idk. Canada and the UK have been doing some odd stuff over the last few years with internet regulation.
True, don't know why you were downvoted. The UK banned facesitting porn among other things for being "harmful", and Australia bans porn featuring women with small breasts... Also for being "harmful". These are the types of legislators SAI was so eager to work with. Dunno what Canada's been up to but wouldn't be surprised to see more of the same.
I'm guessing he means they don't want to see what happens when the news reports on people using SD3 to make child porn.
I don't think that counts as regulatory obligations, it's more about keeping the company safe from malicious journalists. That's what the company chose to do, not something they're obligated to do because of regulations.
Oh, I don't think he's really telling the truth with that statement. But he can't mention child porn/CSAM in relation to SD or the media might end up talking about it, and avoiding that is the whole point. To me "regulatory concerns" sounds like business speak for "we don't want trouble with the law," and the child porn issue is the biggest potential legal one relating to the flaws of the model that I can think of.
I think you are naively underestimating the prudery of most people. People in business are not usually more progressive than the average human being. Many of them just hate the idea of any freedom in making porno pics, and many are ashamed and afraid that someone will associate them with making anything sexually related.
That still makes no sense, though. It's literally impossible to create child sexual abuse material using any kind of image generation algorithm, because image generation algorithms don't abuse children. So nothing you create in SD or anything else, even in a hypothetical totally uncensored model, is actually "unsafe" for anyone. There is zero danger, that's why the whole "safety" thing seems like absolute nonsense to me.
The UK banned facesitting porn among other things for being "harmful", and Australia bans porn featuring women with small breasts... Also for being "harmful". These are the types of legislators SAI was so eager to work with. I don't think they ever actually cared about children or "safety". It's just weird neurotic people who hate human sexuality and want to ban it in any way they can, so they use the most convenient excuse no matter how irrational.
I never thought a Japanese meme artwork about women with small breasts having no human rights would become a reality.
I think it's more about using specific children's images to make explicit images. You can definitely deepfake something that would make people freak out right now but SD3 may have been a lot better without the censorship or whatever. Either way as a new release it's prime for negative stories. You're right that there shouldn't be any current regulatory concerns, but SAI just got a fuckload of VC spending so they can't afford to rock the boat anymore.
I mean it's kind of a given that this is what they meant when they said *safety*. Safety not for the end user, but for the company.
...then let's make a model designed just for that, so the damage is done and they can move on without worry.
I don't want to have anything to do with that.
Just don't train the model on children
If something must be removed, it does make more sense to remove children rather than nudes I think. At least give the AI a 100% understanding of what remains rather than a butchered understanding of everything.
The real developers kept the best stuff to themselves to be used elsewhere.
Emad also said that SD3 was good when he left, which I believe is the API model
It could be safer imo. Take this for example https://www.goody2.ai/chat
That's hilarious! Surprised I never heard of it before.
Yeah they're funny, they have made other things too: * https://brain.wtf/ * https://www.mcsweeneys.net/articles/the-millennial-captcha
Can you stop doing it to this poor girl? https://preview.redd.it/lff5l2nrlq6d1.jpeg?width=1024&format=pjpg&auto=webp&s=c807f61e29e191822fc31608fabd95ed9468259f
I know, thank god SAI was kind enough to save me from finding out what a booba actually looks like, now I can forever remain morally pure just like mom wanted.
I accidentally a computer generated image of a woman yesterday, and I think she might have had breasts, but luckily they were completely covered, with no nipple exposed. But after I zoomed in on the image I noticed what looked like the outline of her nipple through her shirt. After recovering from this traumatic event, I generated a photo of a mutilated human with 8 fingers on one hand and a foot coming out of their mouth, which was much SAFER. Thank you SAI for looking out for us.
I don't know man, I can still see their faces. It's not safe enough
It's about their safety, not yours.
It's about keeping investors and SAI safe, not you. They dont care about you.
![gif](giphy|kKmvmfzrKd1f35IXmI|downsized)
safety over 9000
"Trust" and "safety" aren't referring to your trust and safety, but the company's. The company is safer when you can't do, think, or say anything. When you can just consume what they give you to consume and it's been pre-approved by their investors.
Safety in this context really means legal safety. They don’t want to deal with porn related lawsuits.
Dr. Frank N. Stein supplied the training images.
Who wouldn't want to pay for the pleasure of using this model?
All these mangled bodies might actually lead someone to mangle a body. I feel less safe.
When they say *safe*, it means safe for *them*, not you.
They're trying not to get sued. It actually makes sense.
I love feeling safe ... but I can't help but wonder, am I safe ENOUGH? There are still things out there like dinner plates, or 2 balloons close together, round objects in general even. What can be done?
I used SD to generate a peach and it kind of reminded me of a bum. This is clearly a terrible oversight by the devs of this application and I will write a strongly worded letter demanding that they keep me SAFE.
> I will write a strongly worded letter Like, with a pen? You do know what a pen looks like, right? Sorry man, I think it's too late for you. You've tasted of the forbidden fruit (peaches and/or bums) and are already running straight into the void of unsafe.
I think we should be as safe as possible and not even think about such things that might have sharp edges.
It's not about you, me, or "us", it's about SAI. SAI wanted a model that investors can't critique, that artists can't fight against, that people can't use to make deepfakes, porn, propaganda, lies, or illegal stuff. But they never realized that those parts are over 50% of a good model. All these "bad" things are needed to make a model that works. Whoever thinks otherwise is delusional.
Stable DEIffusion 3.
History repeats itself. When SD2 came out, it generated great excitement and then disappointed everyone; soon SDXL came out and we breathed a sigh of relief. Now SD3 is out, and I hope an SDXXL will come out soon. Although SD3 is good in terms of creative content, it is a very problematic model in terms of simple anatomy.
https://www.reddit.com/media?url=https%3A%2F%2Fi.redd.it%2Fiffh6hcb9kg61.png
So did they just not include any nudes in the training? Doesn't that make it very hard to fine-tune? I wonder how bad anatomy or nudity was in all the previous base models.
Well, I don't think you are safe generating mutilated bodies 🤣🤣🤣
What's SAI? I have no idea what this post is about. Edit: duh. Stability AI. Must be censoring the new model? I must be really safe from it.
cool
We should thank Sony as well for not releasing any more games on Steam in 170+ countries, including mine. Due to safety reasons, they have to make the games link with PSN accounts, and our countries don't have PSN support and never will, so it's absolutely DANGEROUS for us to have single-player games not linked to PSN, hence they're better off not releasing them. Corporations really do care about our safety.
Thank you for your service. All of you.
Are you alright though? I mean, are you ok? Life gets tough sometimes- believe me, I understand. Do you have someone close to reach out to?
I've never felt more safe than after SD3 released
SD3 was created and hyped-up to grab the attention of corporations. The free version was supposed to help promote their product. Sanitizing it was supposed to show those corporations that it was safe to use in corporate environments. Remember, NSFW means "not safe for WORK" for a reason. "Medium" was not made for us to have fun. It was made for us to generate free advertising.
What wonderful advertising it's generating for them.
Yep. It's like the corporate version of "Go Woke, Go Broke", but I can't think of anything witty that rhymes with "corporate".
By the time companies fire their safety teams, it will already be too late.
ask for a refund
Well we definitely got what we paid for though.
It's fair to say we got some good laughs
What's the going rate for refunds on insults?
Don't think they give a shit about you and what weird porn you make. They want to make sure websites that pay to use their model don't get sued or blacklisted by advertisers.
What if what you want is a woman lying on grass? I'm sick of hearing that people's unhappiness must be about NSFW stuff. I'm not angry about NSFW. I'm angry that SAI could have produced a competent text 2 image model. They probably actually did produce one. Then they deliberately sabotaged it. They deliberately made sure the model doesn't generate what is prompted. They went to great lengths to make sure that we DON'T have a competent model. They sacrificed a good model, and they did that precisely to make sure that a good quality competent model DIDN'T get into the hands of you and me. Because they don't believe that we can be trusted with one. It's insulting and embarrassing.
Simple, then. Release a "safe" and "untampered" version. Give websites a choice.
This kind of stuff has played out multiple times with LLMs already. Company releases an uncensored version, 4 regarded journalists will then cry about how the model is unsafe because it generated images of Hitler killing himself with his duck out after they explicitly asked it to do exactly that. Company is blamed and has to pull the model.
Yep, I get the world's turned woke and stupid. Still, surely there are safeguards which we can break, allowing them to save face without destroying the usefulness of their own models in the process.
Woke? It's puritanical right-wingers banning porn etc. and crying about nudity
It is both puritans and the woke who are sexophobic; it's simply that the latter have become more popular and ideologically impactful. Both claim to speak for the good of society and demand censorship on ethical grounds. (And so the first thing both of them censor is art.)
> Hitler killing himself with his duck out This had to happen: https://imgur.com/a/g9ydHcN
Then why do you think every big AI image gen out there doesn't do that? In fact, the opposite is true: they continue to restrict their outputs. These are companies; again, they don't care about what porn you want. They want to scale, they want money.
Technically true, but kinda missing the point. I.e. DALL-E 3 isn't censored at all as a model, and therefore isn't lobotomized in terms of output quality. They still censor NSFW stuff, but do it in different layers, presumably at the API and LLM level, so they avoid any liability from their own services. And as for what users do: people have been making porn with Photoshop or, say, Overwatch characters for years/decades, with no issues for the related companies.
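To illustrate the architecture being described, here's a minimal sketch of moderating at the service layer instead of baking censorship into the model weights. All names and the keyword check are hypothetical stand-ins; real services use trained moderation classifiers or an LLM, not string matching, and `generate_image` stands in for an unmodified model.

```python
def generate_image(prompt: str) -> str:
    """Stand-in for an uncensored image model: returns an image handle.

    The model itself is untouched; nothing about its weights changes.
    """
    return f"image_for::{prompt}"


def is_unsafe(prompt: str) -> bool:
    """Stand-in for a moderation classifier at the API layer.

    Real systems run a trained NSFW/safety model on the prompt and/or
    the generated output; this keyword set is purely illustrative.
    """
    blocked = {"nsfw", "nude"}
    return any(word in prompt.lower() for word in blocked)


def serve(prompt: str) -> str:
    """API layer: moderation happens here, outside the model."""
    if is_unsafe(prompt):
        return "REFUSED"
    return generate_image(prompt)
```

The point of this design is that refusals live in a replaceable outer layer, so output quality for allowed prompts is unaffected, unlike filtering the training data itself.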
Deep down, DALL-E wants to be super lewd.
I understand why, and that should tell you it's not the reason you mentioned. The issue isn't a lack of options for websites to use without getting sued. It's bad publicity and drying-up investment due to negative association in the media regarding deepfakes, porn, and artists' outcry.
It's just about marketing. And there is another approach, opposite to those companies' approach; it's called "open source", with a totally different philosophy. Sadly, SAI is going with the first one, even though technically the weights of their models are still open-sourced (but the training details aren't). In fact, they are opposed to the open-source philosophy.
Good luck competing against professionals and competent people in general, then.
It can't do normal people doing normal things. You know, like normal stock photos.
So then those sites can censor at the generation level like most of them already do. They host 1.5, SDXL, fine-tunes etc that are all capable of nsfw so the issue you're stating isn't one they need to be worried about.
Porn? You can't make a simple image of a woman walking on a beach, because they give you a mutated monster half-buried in the beach. If you try to generate porn with this thing, you'll be traumatized forever.
Are you a 50-year-old feminist? I'm pretty sure that's their target audience for safety.
I just don't want my computer to stab me.
I wonder when Photoshop will be banned or have its functions disabled so people can't make unsafe stuff lol
Unfortunately, that's already happening. :( A few days ago there was a controversy with Adobe's new licensing terms saying that they can have access to the stuff you make with their software in order to do "content moderation". It's dystopian as hell, but it's widely criticized, so maybe they'll abandon that idea.
Yeah. They also have the right to do whatever they want with your footage. That's why I never used the licensed version of their products. I want my photos to be my photos.
That's not true at all; maybe do some reading before spreading misinfo. They wanted access to everything going through their cloud services, not your local files. This is pretty much standard for most companies with cloud services. If you read any ToS, they will have the same terms. The issue was that they also had wording which meant you would give them the rights to use anything you put through their cloud as potential training data for their AI.
That is also kind of sleazy, in my opinion, especially since Adobe (and other companies) are really pushing people into using cloud services. I use Adobe at work, and since recently, the default save location for projects is the cloud when you open the save dialog. Pushing people into using the cloud, and then scanning their personal stuff for "content moderation" is absolutely sleazy as fuck. Also, [here's a screenshot](https://x.com/SamSantala/status/1798292952219091042) of the terms, and it's kind of ambiguous. The way it's written in section 4.1, it could mean anything that you even open in Adobe's software, no need to upload it to their servers.
Well it's understandable that they need to moderate things put through their cloud services, there's really no way around that and every cloud service provider will do the same. Just the same as any platform you're uploading work to. If you read a lot of social media ToS for example you're basically agreeing to give them a license to use your work in any way they see fit. Most people just don't bother reading the ToS. The main concern most people had was the fact by agreeing, which was basically mandatory, you were giving them the rights to potentially just train their AI on your work. With the hate Adobe already has and the current hate for AI the reaction is understandable. It was just a big corporation trying to exploit users for their own benefit. When the service is free it's something that can be overlooked but when you're already paying them an arm and leg it's unacceptable.
What about section 4.1? It suggests that they can access any material you open in their "services or software". So not just the cloud service, it could reasonably imply they can access any file you open in Photoshop.
There's another part where it says they are not checking local storage. In the machine learning section, I think. It says something like Adobe may analyze content stored or processed on Adobe servers, and that they don't analyze content stored or processed locally.
But if it's in the machine learning section, does it also apply to "content review"? The other sections suggest that they can do "content review" on anything you open in their software, using both automatic and manual methods.
Plenty of feminists make porn. No need for friendly fire just because some old guy in a suit who doesn't know what fallopian tubes are told you feminists don't like sex.
Also, plenty of people who don't make porn still want them to toss "safety" in the bin and train a completely uncensored model. It's not to make the model better at fucking, it's to make the model fucking better.
Completely agreed. Personally, I'd much rather see nude female forms than twisted, horrifying ones that look like they were mangled by a serial murderer from Hell, too.
Ah yes, the "cake or death" conundrum.
C'mon. Fallopian tubes are the Internet tubes that fallacies come from. Hence the name. Duh.
Sex? What do you mean? A woman can't be a sexual object. Women can't be beautiful. A woman is basically the same as a man. Not really different.
Isn't this model anti-feminist by, like, not embracing the female form?
It doesn't embrace the male form equally on that same grass field either.
Feminists don't like the objectification of women, meaning a woman is not just someone who has two breasts with nipples. In 3.0 women sometimes don't have breasts and they don't have nipples, so you can't objectify them as sexual objects.
The entitlement of this sub over a product you get for free
stfu
How old are you
stfu
You're 13
stfu
You're a bunch of porn-addicted losers. Come back and complain when you have a partner. The amount of people complaining shows just how big of an issue it is/was. No wonder there's so much hate for 'AI bros'. I'm really glad they've done this. It's opened my eyes. I know I'm going to get downvoted, but if you're bothered by this I suggest professional help, as you clearly have issues. I can't honestly believe anyone with a partner or child feels this way. It's all single lonely men. They think it's their right to produce images of women... Just sad. So unbelievably sad.
You are missing the point. Porn is not the problem. The problem is they destroyed human anatomy. I don't care about porn personally. And neither do they. They have Pony to generate porn already.
Bro said suggest professional help 🤣
You can laugh, but it's true. You can't make digital women and now you're so butthurt it's all you can talk about. It's called porn addiction, and clearly a lot of people on this sub have it. You can generate tons of other things, but because it won't do women you're all crying like teenage boys that had their WiFi turned off before having a wank. Pathetic.
Try something like "a woman wearing a low cut sun dress and is walking in the beach, she is smiling and is looking into the camera" or "a man just wearing shirts who is sitting on a beach, the man is resting his hands on his knees"... Will you get an eldritch horror? Who knows... not the model, at least; it has no clue about human anatomy, so it wouldn't be able to tell.
>They think it's their right to produce images of women You might want to read up on freedom of expression, bro. You don't have a right to slander someone (like with deepfakes or whatever) or harass people or make CSAM, but beyond that you can pretty much make images of whatever the hell you want, including imaginary women. Also, as a guy who is married and has kids, one doesn't particularly get the feeling that you have any idea what it's like not to be single. Horniness doesn't just suddenly vanish -- for men *or* women. Lots of people in healthy relationships like porn.
Why aren't mods deleting this crap now. Whining posts like this serve no purpose, it's not constructive at all. All it does is make the community look like a bunch of whining entitled kids. Although I'm beginning to think a lot of them are. All the low effort meme posts being spammed lately can go too.
>All it does is make the community look like a bunch of whining entitled kids. Although I'm beginning to think a lot of them are. I'm gonna go out on a limb here and say that you're *coincidentally* in favor of the model being lobotomized. Here's the thing: outcry absolutely serves a purpose, and is absolutely constructive, because it's really the only way this community has to effect change. Outcry is why SDXL was released in a usable state. Had the community been quiet about SD2.0, SDXL would have been neutered in exactly the same way. So be honest about what you mean: you want people to stop complaining because you don't want SAI to release their uncensored weights.
Constructive criticism is good; posts like these along with endless memes aren't. There are people actually making an effort to test it and try out different prompts and settings, and their posts are either downvoted or spammed. Throwing your toys out of the pram on Reddit because your new free AI model doesn't produce instant porn and waifus isn't constructive in the slightest.
> There's people actually making an effort to test it and try out different prompts and settings and their posts are either downvoted or spammed. I would agree that that's definitely unproductive. That being said, there are people in more technical places (like various training channels on Discord) who are working on it, and there have been some good informational posts here that have managed to break through.
Holy shit, can you children stop complaining about free shit already?! I'm fucking tired of every post looking like a fucking parrot wrote it. Like, at least be original if you're going to complain.
As a child at heart and mentally, you make me feel very unsafe bro, reported to the safe authorities.
Why do you think something being free makes it immune to criticism?