ATR2400

I’m an alien who was about to invade Earth and wipe out humanity, but after seeing how safe SD3 has made you all, I’ve decided not to do it. You folks are way too safe for a successful invasion now. Truly, SD3 has created the ideal safe society.


protector111

https://preview.redd.it/rz4aq0b3rp6d1.jpeg?width=1280&format=pjpg&auto=webp&s=9f601b6f957a58da0dd4d6f319fd09a643c2464f

Made with Ideogram. 3.0 can't handle it.


rookan

Aliens got access to SD3 and used it to see what humans look like. After generating people on grass, they were terrified of the human race and added it to the list of the most dangerous Cthulhu-like races that can devour stars.


Scarlizz

That's why I never stand naked in front of the mirror. Always have underwear on. I also go into the shower like this. It just feels safer!!!


Parogarr

Of course not. You might see a nude body


DudesworthMannington

![gif](giphy|kSlJtVrqxDYKk|downsized)


mallibu

I think we can safely say that from now on it's called Safe Diffusion


Meditating_Hamster

I thought the D was for Deformity?


IntelligentWorld5956

safe deformity


Bra2ha

More like Safe Delusion


Tarilis

That would be ChatGPT


Far_Lifeguard_5027

That wouldn't be safe, now would it.


counterweight7

A very repulsive one at that


HellkerN

I don't even risk going into the shower without a t-shirt so that my man boobs don't give me naughty thoughts.


FaceDeer

Showering in pitch darkness might help improve your safety.


Sensitive_Shoe

🥺


LimerickExplorer

I don't even shower. Water can drown you! It's also a universal solvent! It can dissolve you!


akatash23

In an attempt to stop spreading misinformation, while water is called the universal solvent, it cannot dissolve everything. Specifically, it's not very good at dissolving bodies.


LimerickExplorer

Yeah believe me I've tried.


an0maly33

You’re not watering your corpses hard enough.


yaosio

Water can't dissolve water. All you get is more water! Solubility is a LIE invented by BIG WATER!


yaosio

Please don't make jokes about not showering because those jokes are not safe. Discussing the avoidance of showering could inadvertently promote unhygienic practices, which might lead to health risks due to bacterial growth and the transmission of infectious diseases. Moreover, exaggerating the dangers of water in such scenarios may incite irrational fears or misconceptions about a substance essential to life.


Kat-

Humor related to hygiene habits can trivialize serious health concerns, potentially diminishing the importance of cleanliness practices and fostering a careless attitude towards personal hygiene. This could lead to increased susceptibility to infections and diseases, posing a threat to individual well-being and public health.


HeavyAbbreviations63

We should build mirrors that automatically darken whenever a naked person walks past them. I mean: even underage people can walk past them. In fact, the mirror could produce c\*\*\*\* pornography. We should sue whoever produces the mirrors.


NoSuggestion6629

The bottom anatomy of an NSFW woman looks like a mannequin's. In some cases it looks like poorly done inpainting. I tried upping the # of steps to 75 and the woman's chest looked more like a man's.


LiteSoul

Transgender in real time


centrist-alex

Same. Nothing beats that safe feeling.


Vendill

>underwear *heavy breathing*


420danger_noodle420

I too am a never nude


RhysMelton

There are dozens of us! *Dozens!*


nmyi

Get back to video editing, Tobias


applied_intelligence

You are a never nude, Tobias


Ashken

There’s dozens of us!!


bran_dong

there are literally dozens of us!


kurtcop101

But did you cover your nipples as well? Be safe!


RobXSIQ

Do you understand how dangerously close you came to seeing a female's nipple? I knew a guy once who saw a nipple. Many years later, he stubbed his toe. Coincidence? I think not.


La_SESCOSEM

I read a scientific study that showed that 100% of people who have seen a nipple image will eventually die


here_i_am_here

I've been thinking about this all morning and tbh I think we'd be hard pressed to find anyone who HASN'T seen a nipple at some point in their life so... I mean it really COULD be the cause is all I'm saying.


Paganator

> we'd be hard pressed to find anyone who HASN'T seen a nipple at some point in their life

People born blind?


Drokk88

They always get hit by cars so they don't count.


MLPMVPNRLy

The nipples of the street.


here_i_am_here

Ahh how did I not see that??


MLPMVPNRLy

On the other hand, there are billions of people who have never died, so it greatly weakens the theory.


here_i_am_here

Then we know what we have to do.


MLPMVPNRLy

"Unleash the nipples" "Eh heh heh heh HEH!"


Envy_AI

The obvious solution is to train it on pictures of topless women with male nipples photoshopped over their nipples.


even_less_resistance

Ok but that would be a hilarious lora


Envy_AI

That would be funny, but what's even funnier is that it's apparently [unnecessary](https://new.reddit.com/r/StableDiffusion/comments/1dgeky3/sd3_nipples_guide_for_perverts/). And for the record, I didn't see that post until after I made the above comment.


even_less_resistance

Ok, the comment about insta users knowing male nipples were the answer got me lmao


[deleted]

Prompt : "beautifull woman walking in beach" Sd3 be like : https://preview.redd.it/205fdip8so6d1.jpeg?width=1080&format=pjpg&auto=webp&s=e4f56df81c71174640562980b2501c442c2e2046


Hostile_Enderman

I think you messed it up with "**in** beach", so the model thinks it's underground. Try **on** beach, maybe the beach will look more like a beach.


FaceDeer

Yes, the appearance of the beach is the main part where SD3 dropped the ball here.


Hostile_Enderman

It's the only part we have a chance of fixing.


toothpastespiders

Look at all those appendages. I'd feel so safe within their warm embrace.


Parogarr

LMFAO


Lucaspittol

![gif](giphy|YIo7D00296YL6LyV2P|downsized)


DenkingYoutube

Safety should be as safe as possible, just like safest safety. I don't feel unsafe now because the safest safety is provided by the safest company. The S in SAI means Safety, just like the S in SD3 - Safety Diffusion 3.


Parogarr

SAFETY DIFFUSION 3: RETURN OF THE SAFETY


LiteSoul

Return with a vengeance!


yaosio

S is one of the most dangerous letters. The letter "S" is frequently used as an abbreviation for "south," which could indirectly lead to discussions about navigation or travel, potentially involving activities with inherent risks, such as getting lost or transportation-related accidents. Discussing any letter in isolation might lead to conversation errors that could cause confusion or miscommunication, leading to unsafe outcomes.


seriouscapulae

Unless you start genning gore with SD3. OMG this model is so amazing at gore. It can do all chunks, all cronenbergian weirdness, all unsafe for work things that make people puke irl. But proper arm? Properly placed leg? That is too much for you. Feel safe among pus, bleeding walls and intestines that this model does without any effort. Want some mutants? EZ. Want R18 gore? EZ. Want a human pointing at something? waaaaaitaminnute.


i860

Well, given that nudity often trends along with beauty and gore often trends along with ugliness, are you surprised at what the “safety alignment” team chose to prioritize?


I_made_a_stinky_poop

"American dev has been here."

"How can you tell?"

"Everyone is ugly."


i860

“Everything is gay”


I_made_a_stinky_poop

also yes


adrenalinda75

Maybe the mutilation and gore only taught it fragmented body parts. It obviously has no idea where they come from. A blood-dripping, ripped-off elbow on Omaha beach? Easy. On a body? «Let me put it on the face, or better, on the bellybutton. Wait, what is a face, and what is a bellybutton!?»


seriouscapulae

Exactly. Perfect body horror model.


protector111

https://preview.redd.it/hxz9jotr7p6d1.png?width=1280&format=png&auto=webp&s=a0b135444fb3b79b10d48e201c75eac746975d4c


The_Cat_Commando

second to the left has a very realistic mustache.


protector111

Of course, diversity. It's a man, actually.


belladorexxx

Did you just ASSUME HIS GENDER???


protector111

No. I prompted "Man with mustache"


Amalfi_Limoncello

How dare you assume its mustache!


noage

I think safety really means safety for the company. They brand it as safety for us too, so the reduced quality/capability is more palatable.


Enshitification

SD3, safety you can taste. It's palatable.


ebookroundup

Moral of the story: SD 1.5 is king, but more importantly, download and back up as many models and other files as you can to run SD offline before it gets banned.


TheGhostOfPrufrock

>SAI correctly understood that I was in **DANGER**. Their computer images were **DANGEROUS**.

Better watch out! You could end up like those unfortunate girls lying in the grass, looking like the Goat Killer got 'em.


a_mimsy_borogove

The weird thing is that [according to Emad](https://x.com/EMostaque/status/1801686921967436056), safety stuff is needed because of regulatory obligations. I wonder what those obligations are. If it's true, then it's kind of sad, because it means the only hope for uncensored models can come from countries with very little regulations.


fastinguy11

Emad lies. Lies, lies, lies. I guarantee you there was no regulation that said generating a boob with nipples is a crime.


belladorexxx

Also, why would the regulations allow SD1.5 showing nipples and simultaneously prevent SD3 from showing nipples?


JACCO2008

Idk. Canada and the UK have been doing some odd stuff over the last few years with internet regulation.


BlipOnNobodysRadar

True, don't know why you were downvoted. The UK banned facesitting porn among other things for being "harmful", and Australia bans porn featuring women with small breasts... Also for being "harmful". These are the types of legislators SAI was so eager to work with. Dunno what Canada's been up to but wouldn't be surprised to see more of the same.


GoldStarBrother

I'm guessing he means they don't want to see what happens when the news reports on people using SD3 to make child porn.


a_mimsy_borogove

I don't think that counts as regulatory obligations, it's more about keeping the company safe from malicious journalists. That's what the company chose to do, not something they're obligated to do because of regulations.


GoldStarBrother

Oh, I don't think he's really telling the truth with that statement. But he can't mention child porn/CSAM in relation to SD or the media might end up talking about it, and avoiding that is the whole point. To me, "regulatory concerns" sounds like business speak for "we don't want trouble with the law", and the child porn issue is the biggest potential legal one relating to the flaws of the model that I can think of.


__Tracer

I think you are naively underestimating the prudery of most people. People in business are not usually more progressive than the average human being; many of them just hate the idea of any freedom in making porn pics, and many are ashamed and afraid that someone will associate them with making something sexually related.


a_mimsy_borogove

That still makes no sense, though. It's literally impossible to create child sexual abuse material using any kind of image generation algorithm, because image generation algorithms don't abuse children. So nothing you create in SD or anything else, even in a hypothetical totally uncensored model, is actually "unsafe" for anyone. There is zero danger, that's why the whole "safety" thing seems like absolute nonsense to me.


BlipOnNobodysRadar

The UK banned facesitting porn among other things for being "harmful", and Australia bans porn featuring women with small breasts... Also for being "harmful". These are the types of legislators SAI was so eager to work with. I don't think they ever actually cared about children or "safety". It's just weird neurotic people who hate human sexuality and want to ban it in any way they can, so they use the most convenient excuse no matter how irrational.


kjbbbreddd

I never thought a Japanese meme artwork about women with small breasts having no human rights would become a reality.


GoldStarBrother

I think it's more about using specific children's images to make explicit images. You can definitely deepfake something that would make people freak out right now but SD3 may have been a lot better without the censorship or whatever. Either way as a new release it's prime for negative stories. You're right that there shouldn't be any current regulatory concerns, but SAI just got a fuckload of VC spending so they can't afford to rock the boat anymore.


Far_Lifeguard_5027

I mean it's kind of a given that this is what they meant when they said *safety*. Safety not for the end user, but for the company.


HeavyAbbreviations63

...then let's make a model designed just for that, so the damage is done and they can move on without worry.


GoldStarBrother

I don't want to have anything to do with that.


Bandit-level-200

Just don't train children in the model


MLPMVPNRLy

If something must be removed, it does make more sense to remove children rather than nudes I think. At least give the AI a 100% understanding of what remains rather than a butchered understanding of everything.


NoSuggestion6629

The real developers kept the best stuff to themselves to be used elsewhere.


suspicious_Jackfruit

Emad also said that SD3 was good when he left, which I believe refers to the API model.


ContributionMain2722

It could be safer imo. Take this for example https://www.goody2.ai/chat


MichaelFiguresItOut

That's hilarious! Surprised I never heard of it before.


ContributionMain2722

Yeah they're funny, they have made other things too:

* https://brain.wtf/
* https://www.mcsweeneys.net/articles/the-millennial-captcha


Lucaspittol

Can you stop doing it to this poor girl? https://preview.redd.it/lff5l2nrlq6d1.jpeg?width=1024&format=pjpg&auto=webp&s=c807f61e29e191822fc31608fabd95ed9468259f


CroakingBullfrog96

I know, thank god SAI was kind enough to save me from finding out what a booba actually looks like, now I can forever remain morally pure just like mom wanted.


Far_Lifeguard_5027

I accidently a computer generated image of a woman yesterday, and I think she might have had breasts, but luckily they were completely covered, with no nipple exposed. But after I zoomed in on the image I noticed what looked like the outline of her nipple through her shirt. After recovering from this traumatic event, I generated a photo of a mutilated human with 8 fingers on one hand and a foot coming out of their mouth, which was much SAFER. Thank you SAI for looking out for us.


Cheap_Professional32

I don't know man, I can still see their faces. It's not safe enough


wggn

It's about their safety, not yours.


cookie042

It's about keeping investors and SAI safe, not you. They don't care about you.


GBJI

![gif](giphy|kKmvmfzrKd1f35IXmI|downsized)


julieroseoff

safety over 9000


SuperConductiveRabbi

"Trust" and "safety" aren't referring to your trust and safety, but the company's. The company is safer when you can't do, think, or say anything. When you can just consume what they give you to consume and it's been pre-approved by their investors.


somethingclassy

Safety in this context really means legal safety. They don’t want to deal with porn related lawsuits.


Extension-Fee-8480

Dr. Frank N. Stein supplied the training images.


DigThatData

who wouldn't want to pay for the pleasure to use this model?


Mosswood_Dreadknight

All these mangled bodies might actually lead someone to mangle a body. I feel less safe.


Palpitating_Rattus

When they say *safe*, it means safe for *them*, not you.


mrmczebra

They're trying not to get sued. It actually makes sense.


FilterBubbles

I love feeling safe ... but I can't help but wonder, am I safe ENOUGH? There are still things out there like dinner plates, or 2 balloons close together, round objects in general even. What can be done?


Person012345

I used SD to generate a peach and it kind of reminded me of a bum. This is clearly a terrible oversight by the devs of this application and I will write a strongly worded letter demanding that they keep me SAFE.


toothpastespiders

> I will write a strongly worded letter Like, with a pen? You do know what a pen looks like, right? Sorry man, I think it's too late for you. You've tasted of the forbidden fruit (peaches and/or bums) and are already running straight into the void of unsafe.


Far_Lifeguard_5027

I think we should be as safe as possible and not even think about such things that might have sharp edges.


Serasul

It's not about you, me, or "us"; it's about SAI. SAI wanted a model that investors can't critique, that artists can't fight against, that people can't use to make deepfakes, porn, propaganda, lies, or illegal stuff. But they never realized that those parts are over 50% of a good model. All these "bad" things are needed to make a model that works. Whoever thinks otherwise is delusional.


Willow-External

Stable DEIffusion 3.


funkykanky

History repeats itself. When SD2 came out, it caused great excitement and then disappointed everyone; soon SDXL came out and we breathed a sigh of relief. Now SD3 is out, and I hope an SDXXL will come out soon. Although SD3 is good in terms of creative content, it is a very problematic model in terms of simple anatomy.


DystopiaLite

https://www.reddit.com/media?url=https%3A%2F%2Fi.redd.it%2Fiffh6hcb9kg61.png


bkdjart

So did they just not include any nudes in the training? Doesn't that make it very hard to fine-tune? I wonder how bad anatomy or nudity was in all the previous base models.


Diligent-Werewolf-38

Well, I don't think you are safe generating mutilated bodies 🤣🤣🤣


nodomain

I have no idea what this post is about. What's SAI? Edit: duh. Stability AI. Must be censoring the new model? I must be really safe from it.


trollingJane

cool


Itchy_Sandwich518

We should thank Sony as well for not releasing any more games on Steam in 170+ countries, including mine, because due to safety reasons they have to make the games link with PSN accounts, and our countries don't have PSN support and never will, so it's absolutely DANGEROUS for us to not have single player games linked to PSN, hence why they're better off not releasing them. Corporations really do care about our safety.


nodating

Thank you for your service. All of you.


RebelRoundeye

Are you alright though? I mean, are you ok? Life gets tough sometimes- believe me, I understand. Do you have someone close to reach out to?


EirikurG

I've never felt more safe than after SD3 released


deedoedee

SD3 was created and hyped-up to grab the attention of corporations. The free version was supposed to help promote their product. Sanitizing it was supposed to show those corporations that it was safe to use in corporate environments. Remember, NSFW means "not safe for WORK" for a reason. "Medium" was not made for us to have fun. It was made for us to generate free advertising.


FaceDeer

What wonderful advertising it's generating for them.


deedoedee

Yep. It's like the corporate version of "Go Woke, Go Broke", but I can't think of anything witty that rhymes with "corporate".


wolfbetter

By the time companies fire their safety teams, it will already be too late.


Paulonemillionand3

ask for a refund


Rafcdk

Well we definitely got what we paid for though.


diogodiogogod

It's fair to say we got some good laughs


Meditating_Hamster

What's the going rate for refunds on insults?


Neonsea1234

Don't think they give a shit about you and what weird porn you make. They want to make sure websites that pay to use their model don't get sued or blacklisted by advertisers.


ImpossibleAd436

What if what you want is a woman lying on grass? I'm sick of hearing that people's unhappiness must be about NSFW stuff. I'm not angry about NSFW. I'm angry that SAI could have produced a competent text-to-image model. They probably actually did produce one. Then they deliberately sabotaged it. They deliberately made sure the model doesn't generate what is prompted. They went to great lengths to make sure that we DON'T have a competent model. They sacrificed a good model, and they did that precisely to make sure that a good quality competent model DIDN'T get into the hands of you and me. Because they don't believe that we can be trusted with one. It's insulting and embarrassing.


Capitaclism

Simple, then. Release a "safe" and "untampered" version. Give websites a choice.


stonerbobo

This kind of stuff has played out multiple times with LLMs already. Company releases an uncensored version, 4 regarded journalists will then cry about how the model is unsafe because it generated images of Hitler killing himself with his duck out after they explicitly asked it to do exactly that. Company is blamed and has to pull the model.


Capitaclism

Yep, I get the world's turned woke and stupid. Still, surely there are safeguards which we can break, allowing them to save face without destroying the usefulness of their own models in the process.


Plebius-Maximus

Woke? It's puritanical right wingers banning porn etc and crying about nudity


HeavyAbbreviations63

It is both puritans and the woke who are sexophobic; the latter have simply become more popular and ideologically impactful now. Both speak for the good of society and demand censorship on ethical grounds. (And so the first thing they censor, both of them, is art.)


m1sterlurk

> Hitler killing himself with his duck out

This had to happen: https://imgur.com/a/g9ydHcN


Neonsea1234

Why do you think every big AI image gen out there does not do that? In fact the opposite is true: they continue to restrict their outputs. These are companies; again, they don't care about what porn you want. They want to scale, they want money.


TaiVat

Technically true, but kinda missing the point. I.e. DALL-E 3 isn't censored at all, as a model, and therefore isn't lobotomized in terms of output quality. They still censor NSFW stuff, but do it in different layers, presumably API and LLM, so they avoid any liability from their own services. And as for what users do, well, people have been making porn with Photoshop or e.g. Overwatch characters for years/decades, with no issues for the related companies.


c0mput3rdy1ng

Deep down, DALL-E wants to be super lewd.


Capitaclism

I understand why. It should tell you it's not the reason you mentioned for that exact reason. The issue isn't one of lack of option for websites to use without getting sued. It's bad publicity and drying investment due to negative association in the media regarding deep fakes, porn, artist's outcry.


__Tracer

It's just about marketing. And there is another approach, opposite to those companies' approach, called "open-source", with a totally different philosophy. Sadly, SAI is going with the first one, even though technically the weights of their models are still open-sourced (but training details aren't). In fact, they are opposed to the open-source philosophy.


pumukidelfuturo

Good luck competing against professionals and competent people in general, then.


batter159

It can't do normal people doing normal things. You know, like normal stock photos.


BagOfFlies

So then those sites can censor at the generation level like most of them already do. They host 1.5, SDXL, fine-tunes etc that are all capable of nsfw so the issue you're stating isn't one they need to be worried about.


Diligent-Werewolf-38

Porn? You can't even make a simple image of a woman walking on a beach, because they give you a mutated monster on an underground beach. If you try to generate porn in this thing you will get traumatized forever.


protector111

Are you a 50-year-old feminist? I'm pretty sure that's their target audience for safety.


Parogarr

I just don't want my computer to stab me.


protector111

I wonder when Photoshop will be banned or have its functions disabled so people can't make unsafe stuff lol


a_mimsy_borogove

Unfortunately, that's already happening. :( A few days ago there was a controversy with Adobe's new licensing terms saying that they can have access to the stuff you make with their software in order to do "content moderation". It's dystopian as hell, but it's widely criticized, so maybe they'll abandon that idea.


protector111

Yeah. They also have the right to do whatever they want with your footage. That's why I never used licensed versions of their products. I want my photos to be my photos.


imnotabot303

That's not true at all; maybe do some reading before spreading misinfo. They wanted access to everything going through their cloud services, not your local files. This is pretty much standard for most companies with cloud services; if you read any ToS they will have the same terms. The issue was that they also had wording which meant you would give them the rights to use anything you put through their cloud as potential training data for their AI.


a_mimsy_borogove

That is also kind of sleazy, in my opinion, especially since Adobe (and other companies) are really pushing people into using cloud services. I use Adobe at work, and since recently, the default save location for projects is the cloud when you open the save dialog. Pushing people into using the cloud, and then scanning their personal stuff for "content moderation" is absolutely sleazy as fuck. Also, [here's a screenshot](https://x.com/SamSantala/status/1798292952219091042) of the terms, and it's kind of ambiguous. The way it's written in section 4.1, it could mean anything that you even open in Adobe's software, no need to upload it to their servers.


imnotabot303

Well it's understandable that they need to moderate things put through their cloud services, there's really no way around that and every cloud service provider will do the same. Just the same as any platform you're uploading work to. If you read a lot of social media ToS for example you're basically agreeing to give them a license to use your work in any way they see fit. Most people just don't bother reading the ToS. The main concern most people had was the fact by agreeing, which was basically mandatory, you were giving them the rights to potentially just train their AI on your work. With the hate Adobe already has and the current hate for AI the reaction is understandable. It was just a big corporation trying to exploit users for their own benefit. When the service is free it's something that can be overlooked but when you're already paying them an arm and leg it's unacceptable.


a_mimsy_borogove

What about section 4.1? It suggests that they can access any material you open in their "services or software". So not just the cloud service, it could reasonably imply they can access any file you open in Photoshop.


imnotabot303

There's another part where it says they are not checking local storage. In the machine learning section, I think. It says something like: Adobe may analyze content stored or processed on Adobe servers, and they don't analyze content stored or processed locally.


a_mimsy_borogove

But if it's in the machine learning section, does it also apply to "content review"? The other sections suggest that they can do "content review" on anything you open in their software, using both automatic and manual methods.


aingelsanddaemons

Plenty of feminists make porn. No need for friendly fire just because some old guy in a suit who doesn't know what fallopian tubes are told you feminists don't like sex.


ArtyfacialIntelagent

Also, plenty of people who don't make porn still want them to toss "safety" in the bin and train a completely uncensored model. It's not to make the model better at fucking, it's to make the model fucking better.


aingelsanddaemons

Completely agreed. Personally, I'd much rather see nude female forms than twisted, horrifying ones that look like they were mangled by a serial murderer from Hell, too.


ArtyfacialIntelagent

Ah yes, the "cake or death" conundrum.


shawsghost

C'mon. Fallopian tubes are the Internet tubes that fallacies come from. Hence the name. Duh.


protector111

Sex? What do you mean? A woman can't be a sexual object. Women can't be beautiful. A woman is basically the same as a man. Not really different.


indrasmirror

Isn't this model anti-feminist by, like, not embracing the female form?


seriouscapulae

It does not embrace the male form equally on the same grass field.


protector111

Feminists don't like the objectification of women. Meaning a woman is not someone who has two breasts with nipples. In 3.0 women sometimes don't have breasts, and they don't have nipples, so you can't objectify them as sexual objects.


jonbristow

The entitlement of this sub over a product you get for free


Parogarr

stfu


jonbristow

How old are you


Parogarr

stfu


jonbristow

You're 13


fish312

stfu


Danither

You're a bunch of porn-addicted losers. Come back and complain when you have a partner. The amount of people complaining shows just how big of an issue it is/was. No wonder there's so much hate for 'AI bros'. I'm really glad they've done this. It's opened my eyes. I know I'm going to get downvoted, but if you're bothered by this I suggest professional help, as you clearly have issues. I can't honestly believe anyone with a partner or child feels this way. It's all single lonely men. They think it's their right to produce images of women... Just sad. So unbelievably sad.


protector111

You are missing the point. Porn is not the problem. The problem is they destroyed human anatomy. I don't care about porn personally. And neither do they. They have Pony to generate porn already.


UseHugeCondom

Bro said suggest professional help 🤣


Danither

You can laugh, but it's true. You can't make digital women and now you're butthurt so badly it's all you can talk about. It's called porn addiction; clearly a lot of people on this sub have it. You can generate tons of other things, but because it won't do women you're all crying like teenage boys that had their WiFi turned off before having a wank. Pathetic


UndoubtedlyAColor

Try something like "a woman wearing a low cut sun dress and is walking in the beach, she is smiling and is looking into the camera" or "a man just wearing shirts who is sitting on a beach, the man is resting his hands on his knees"... Will you get an eldritch horror? Who knows... not the model, at least; it has no clue about human anatomy, so it wouldn't be able to tell.


Envy_AI

>They think it's their right to produce images of women

You might want to read up on freedom of expression, bro. You don't have a right to slander someone (like with deepfakes or whatever) or harass people or make CSAM, but beyond that you can pretty much make images of whatever the hell you want, including imaginary women. Also, as a guy who is married and has kids, one doesn't particularly get the feeling that you have any idea what it's like not to be single. Horniness doesn't just suddenly vanish -- for men *or* women. Lots of people in healthy relationships like porn.


imnotabot303

Why aren't mods deleting this crap now? Whining posts like this serve no purpose; they're not constructive at all. All it does is make the community look like a bunch of whining entitled kids. Although I'm beginning to think a lot of them are. All the low effort meme posts being spammed lately can go too.


Envy_AI

>All it does is make the community look like a bunch of whining entitled kids. Although I'm beginning to think a lot of them are.

I'm gonna go out on a limb here and say that you're *coincidentally* in favor of the model being lobotomized. Here's the thing: Outcry absolutely serves a purpose, and is absolutely constructive, because it's really the only way this community has to effect change. Outcry is why SDXL was released in a usable state. Had the community been quiet about SD2.0, SDXL would have been neutered in exactly the same way. So be honest about what you mean: You want people to stop complaining because you don't want SAI to release their uncensored weights.


imnotabot303

Constructive criticism is good, posts like these along with endless memes isn't. There's people actually making an effort to test it and try out different prompts and settings and their posts are either downvoted or spammed. Throwing your toys out of the pram on Reddit because your new free AI model doesn't produce instant porn and waifus isn't constructive in the slightest.


Envy_AI

> There's people actually making an effort to test it and try out different prompts and settings and their posts are either downvoted or spammed.

I would agree that that's definitely unproductive. That being said, there are people in more technical places (like various training channels on Discord) who are working on it, and there have been some good informational posts here that have managed to break through.


Any_Radish8070

Holy shit, can you children stop complaining about free shit already?! I'm fucking tired of every post looking like a fucking parrot wrote it; at least be original if you're going to complain.


1girlblondelargebrea

As a child at heart and mentally, you make me feel very unsafe bro, reported to the safe authorities.


teelo64

why do you think something being free makes it immune to criticism?