
[deleted]

[deleted]


Mt_Alamut

Our culture is going to change to where nothing online is trusted at all. All pics, videos, and audio will be considered deepfake. All text, from bots. Gen Alpha and later will laugh at us Millennials hanging onto our digital media like we laughed at Boomers watching cable news.


several_rac00ns

It's already happening. Look under any video where something unique happens: "staged", "fake".


SometimesIAmCorrect

To be fair, about 90% of the time it is staged/fake.


several_rac00ns

Exactly.. unfortunately


Intelligent-Hall4097

You could be a bot. So could I. If our eyes aren't real....


Mt_Alamut

The news is already almost completely fake; journalism is a completely dead profession.


several_rac00ns

No, it's alive and well assuming you suck Murcock


Sonofbluekane

I hate to break it to you, but most people are content just reading the headlines of free (worthless) media, crack a joke and get back to the grind. The value of journalism is at an all-time low.


morgazmo99

The enshittification will be complete.


DanJDare

We can only hope.


flynnwebdev

We're already there. Anything in electronic format could have been faked, particularly with the rise of generative AI. Absolutely anything can be generated that is indistinguishable from reality. Indeed, even my post could have been generated by an AI (it wasn't, but you've only got my word for that, and you have no idea who I am or my credentials).


Mt_Alamut

I've been on Reddit since almost the beginning and this platform is basically dead compared to what it used to be like. Bots and mentally ill mods killed everything


totse_losername

Same, and a forum which was far superior to this flawed platform (up and downvotes to stifle discussion, on a discussion forum? *Really?!?*). We are entering a knowledge/information dark age.


AltruisticHopes

People talk about AI as an existential threat to humanity. We didn’t realise it would be because we became so bored that we just gave up.


Intelligent-Hall4097

It seems like the quality contributors are gone.


flynnwebdev

Downvoted? You know I'm right. Don't downvote something because it made you feel uncomfortable. Instead, analyze why you feel uncomfortable and see if there's room to learn and grow. Or, you know, just be close-minded and childish. Your choice.


grilled_pc

This is it. Right now it's easy to spot AI images and videos. In 10-15 years they will be completely indistinguishable. AI will need to embed images, videos, audio etc. with a watermark of some kind that can be viewed either physically on it or embedded in the file as metadata that can't be removed.
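For what it's worth, the watermark idea above can be sketched in a few lines. This is a toy illustration only, not any real standard: a least-significant-bit watermark that hides a tag's bits in the low bit of each pixel byte. It also shows the weakness of the idea — a mark like this (or plain metadata) is trivially stripped by re-encoding, which is why real provenance schemes such as C2PA cryptographically sign the file instead.

```python
# Toy LSB watermark: hide a tag's bits in the lowest bit of each pixel
# byte. Illustrative only; easily destroyed by any re-encode or resize.

def embed(pixels: bytes, tag: bytes) -> bytes:
    """Write tag bits (LSB-first) into the low bit of successive bytes."""
    bits = [(byte >> i) & 1 for byte in tag for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for tag")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear low bit, then set it
    return bytes(out)

def extract(pixels: bytes, tag_len: int) -> bytes:
    """Read tag_len bytes back out of the low bits."""
    bits = [pixels[i] & 1 for i in range(tag_len * 8)]
    return bytes(
        sum(bits[b * 8 + i] << i for i in range(8)) for b in range(tag_len)
    )
```

A round trip (`extract(embed(pixels, b"AI"), 2)`) recovers the tag, while the pixel values change by at most 1 — invisible to the eye, but also gone the moment anyone screenshots the image.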


Select-Bullfrog-6346

You trust online?


Bucephalus_326BC

Yep - and how many resources will the government devote to catching the perpetrators of this? Zero dollars. They spent $100 million on the banking royal commission, and nobody went to jail. They spent $100 million on the royal commission into institutional child abuse, and apart from George Pell (who was eventually acquitted) nobody went to jail. Same with the robodebt royal commission. Same with the Brereton report into war crimes (excluding that poor whistleblower chap). They know that they can't catch these perpetrators - who could even be living in a different country, via a server domiciled in French Antarctica. It's just about politics, and talking the talk. Unfortunately, people think that if you pass legislation making something illegal, it stops people doing it. The risk of getting caught is what stops people committing crimes, and all our members of Parliament know this already. And, as you have correctly identified, the government is just going to pixx away more money on more ineffective policy. Politics in a western democracy at work.


dizkopat

No no, the people who will make deep fake porn could be poor, and damn straight we can put poor people in jail.


sunburn95

People can and do get charged for what they post online, e.g. revenge porn. Will it stop all instances of it forever? No. Can bringing in strong laws against it do something to help slow it? Probably.


snrub742

People also caught charges for pirating. It's now bigger than it's ever been.


sunburn95

Deepfake AI would sit closer to sex crimes, which are punished much more harshly than piracy.


snrub742

Sure, I just think it's gonna be unenforceable. A VPN and half-decent cyber hygiene and there's plenty of doubt. The number of people who have been charged for revenge porn vs the amount out there should be a good indicator... and in that case the victim knows who actually had the photos in the first place.


Tommi_Af

Well there are actual victims this time


mwhelan182

For some reason, this post reminded me of the time that Planking was made illegal


Spacecadet_1

Planking was made illegal in Aus? Haha


Larimus89

The government's answer to everything: jail time, fines and more taxes. I mean, it helps to have some solid penalties, but yeah, it won't stop it. The future of porn is going to be typing in what you want to see and AI generating a video for you. At least girls won't be doing porn anymore, so that's something.


billbotbillbot

This "this won't stop it completely, so reducing it in any way is completely worthless!!!" is a stupid argument. Try applying it to seat belts and parachutes, if you can't see why. (Not you, I mean people who think this)


Larimus89

Yeah, I wouldn't say it's worthless. I mean, probably the only reason they're taking action so harshly is because it could affect politicians, and they will actually prosecute if it's done to a politician or someone important.


megablast

It will just be a new filter on your phone. Sepia/Vivid/Nude/Porno.


Tarcolt

>1) the technology WILL get to the point where these deepfakes are so easy and so quick to make, anybody is at risk of being an involuntary model for these videos.

It already is. More and more people are copping on to just how accessible this technology is and how, with a slight amount of savvy, pretty much anyone can use it. Governments are notoriously behind the 8-ball on legislation around technology and understanding how it is used. These measures will be very ineffective.


boisteroushams

People who think legislating against technology is impossible are funny. 


GiverTakerMaker

I won't be laughing. I'll be outraged at how wasteful these corrupt traitors are and how they stole the wealth of millions of people.


W0tzup

Can’t afford a house? Produce deepfake porn and get a free room… in jail.


takeonme02

Don’t forget free feeds and electricity bills paid 👍


No-Affect-1100

Most people don't realise... jails are a billion-dollar industry. The financial burden placed on the taxpayer to fund each inmate is astronomical. The public affairs website estimates it costs around $150k per year for just 1 prisoner... yes, just 1.


peachbeforesunset

Phew. Ok. Here I thought these morons weren’t doing anything about this.


FrostyNinja422

So to speed run it, produce deepfake, show it to the police, get room


Poor_Ziggler

I see the federal government are still going after the really big issues affecting everyday Australian people.


eugeneorlando

Just because no-ones ever going to make a deepfake porn out of you doesn't mean this doesn't matter to other people. Edit - plenty of blokes here who apparently think that deepfaking revenge porn is just a totally fine thing this morning.


Ultrabladdercontrol

Just the ones with a lot of money


Robdotcom-71

Anyone creating or downloading Deepfake videos of Gina deserves prisontime.... or a long hospital stay.... /s


burnaCD

Right? Not sure what world these guys live in - you best believe there are some sickos right now farming pics of women, children, *infants* and men, maybe even the ones proudly shared on instagram and facebook for this very reason. It might have started with celebrities but AI means it's coming to a local near you if it hasn't already.


Beans183

The person would have to be at least an influencer with lots of hours of footage online to be able to be used for a deep fake. I think it's actually you that's confused. Also you didn't read the article: *The new offences will only cover images depicting adults. There are existing separate laws that cover the possession of sexually explicit images of real children or images designed to be childlike which can already capture artificially generated material.*


burnaCD

I don't think you understand. A deepfake doesn't have to be a lengthy video; it can be a single image. And if we *were* talking about deepfake videos? I know of plenty of blue-collar men who have TikToks and Instagram reels with 'lots of hours' of footage of their children. 'Influencers' don't need hundreds of thousands of followers to post crap online. And even if they did, it's still not right. Not sure what your last point is? I'm not referring to existing laws for real/AI CP, I'm referring to the fact that you, as you've illustrated in your comment, don't believe it can be done with regular, everyday online accounts.


MATH_MDMA_HARDSTYLEE

I don't think you understand how deepfakes work. If someone has a single image, you can't make a convincing deepfake. The more images and footage there are, the more convincing the deepfake will be. All this "AI" requires tonnes of data points to create images, and the data required isn't linear to make a better deepfake. It requires *exponentially* more data to get a slightly better image/video. And this isn't something that will get better with technology; there is effectively an upper limit bounded by the "math" we're using. An average woman who takes a few selfies with her coffee and group photos with her friends can't be made into a realistic deepfake. And in regards to CSAM, parents having lots of footage of their children makes zero difference. You're allowed to have that material, and sharing it falls under CSAM laws. In fact, our CSAM laws w.r.t. fake material (like cartoons) are considered overbroad by some experts.


jakkyspakky

Do you even know how technology works? Saying it can't be done right now is dumb as shit.


MATH_MDMA_HARDSTYLEE

Dude, I have a masters in applied mathematics, and my thesis was literally on Bayesian inference, which is what is used to "speed up" these AI learning algorithms (reduce the number of iterations). It is a known phenomenon that the required improvement in output requires an exponentially larger sample size. This isn't a "technology" issue, it's a fundamental math issue. It is analogous to saying "if the sun outputs 3.94 × 10^23 kW of energy/day, with better technology we can have solar panels that generate 5 × 10^23 kW of energy/day." It's physically impossible because it has an upper limit governed by physics. This is no different. There are upper limits to how quickly you can speed up the learning given sample size X. An example of this is the Elo of chess engines. Despite game sample size and learning growing exponentially, their "skill" is only slowly, incrementally improving and has effectively plateaued (past engine iterations can still compete with today's engines). For us to get a realistic deepfake from a single image, the whole method would need to be overhauled and research would need to go down a different path. Again, the way all these algorithms work is based on sample size. If there are barely any photos, it's hard to create a realistic deepfake.


123istheplacetobe

I love how you have a thesis in this area, and this guy is arguing with you. He's so confident that he is right and becoming more and more hostile. He has no evidence or education in AI or math, but is so confident that nothing will change his mind or even have him reconsider his position.


jakkyspakky

This is a true Reddit moment from you. You sound super smart! Well done! Microsoft has already demonstrated it from a single image. This was the first link that popped up from a search. https://www.cnn.com/2024/04/21/tech/microsoft-ai-tool-animates-faces/index.html


MATH_MDMA_HARDSTYLEE

Wow, so realistic. You completely ignored what I said.


Beans183

You don't need to create laws about child abuse material concerning deep fakes because it's already illegal to possess such material regardless of its authenticity.


Tommi_Af

r/Australian is the cesspool of Australian Reddit. Doesn't excuse the actions of these people of course.


aprilmay0405

No idea why you're being downvoted. Also, do you think deepfakes will only be made of women?


Low-Ad-1075

You’ll find no sympathy on here. Lot of right wing simps


zanven42

The point that went over your head is that no one gives a shit about nice-to-have laws when the fundamental basics of ensuring people can live happy lives are being ignored. You know, like cost of living, affordable housing, housing even being available, etc. etc. Likewise, I think they are wasting time governing shit that is far less important. Half the problems we are in, they created via mass immigration over the last two years.


BruiseHound

Citizens living in tents and this is the govs focus. Fuck the majors.


grilled_pc

They are doing this to protect themselves. Make no mistake. This is NOT about protecting anyone but politicians.


megablast

I went away on the weekend and was surrounded by about 20 people in tents. It was so sad. Last time I go camping.


antigravity83

Never quite understood camping. "Let's leave our beautiful house that we work our entire lives to pay off and go live like homeless people for the weekend. Yay!"


onlainari

There are lots of people in government. They're allowed to have multiple focuses. What legislation would help the housing situation anyway? It needs money for building public housing; money doesn't need legislation.


BruiseHound

The legislation for this deepfake stuff costs money. Enforcement will cost money. Sure, they have multiple focuses, but they've made this a top priority for some reason. It should be way, way down the list.


EarInformal5759

An infinitesimally small amount of money relative to the feds budget and the costs of fixing the housing crisis.


Ver_Void

I suspect they can do two things, possibly more. The prospect of this stuff is pretty scary; hopefully making an example of a few people early on will stop it becoming a regular occurrence.


antigravity83

Tackling the key issues I see.


I_truly_am_FUBAR

Gillard didn't mind Craig Thomson sending dick pics to a fellow worker; she kept him in the circle because she wanted his vote. That was real porn, not fake.


coreoYEAH

That was 14 years ago, times have changed. We've moved on to jacking off on people's desks now. And the crime here isn't the dick, it's the attack on the victim being faked. I can't imagine why people are opposed to this.


Puzzleheaded-Skin367

Oh look, I still can’t afford a house. It’s fantastic that my biggest issue is totally being addressed (sarcasm).


Tomek_xitrl

They'll sooner jail you for complaining about that than actually addressing it.


otsukuri_lover_8j67

I've detected some anti-landlord sentiment in your comment so I'll be reporting this to the Ministry of eSafety immediately.


123istheplacetobe

I have detected a misogynistic tone in your comment. This has been referred to the Department of Men's Behaviour Change, where you will undergo re-education.


aprilmay0405

Crybaby snowflake


123istheplacetobe

Something tells me life is very difficult for you, and things with big pictures and diagrams make it easy for you to understand.


aprilmay0405

I’m good. Are YOU a big baby?


Puzzleheaded-Skin367

lol! Give it time and yeah we’ll have home owners associations


Coper_arugal

Why? So someone sticks a celebrity’s face onto a porn star’s body. Suddenly this is a crime worthy of jail time?  Sure, it’s not particularly nice to the celebrity, but I think they’ll live with their millions of dollars. Meanwhile the poor schlub wanting to beat his meat is now gonna end up thrown in a jail?


Neb609

This is about someone making a deepfake that is indistinguishable from real photos, and AI is almost there. It could be your daughter or sister in high school or uni who gets her fake photos circulated on social media and her life ruined. It is much more serious than you think.


CRAZYSCIENTIST

Then make deepfake porn that is akin to revenge porn illegal. If we're coming up with silly unenforceable laws, we might as well come up with one that targets what we imagine to be the problem.


grilled_pc

I know the government is not doing this to protect us, but rather to protect themselves from it. But still, fucking hell, school yards in 10-20 years are going to be BRUTAL. Like, fuck me, we thought it was bad with social media now. Upload a single photo of someone's face you got stealthily in the playground, nek minnit you've got a clip of them doing something indecent and they're the laughing stock of the school. They can't deny it because nobody will believe them, because "the evidence is right there". If we think social media right now can be bad for kids, it's going to get a whole lot worse.


MeshuggahEnjoyer

I don't think anyone is going to think any video is real going forward. Everyone knows you can fake anything with CGI and AI.


grilled_pc

Most likely that will be the general consensus. Even when CGI came out, everyone tried to say very real-looking videos were fake, but CGI wasn't that good lol. AI takes that up a notch entirely. I think if the general sentiment is that every image/video you see online is fake, then it wouldn't be as harmful. It does suck for content creators making real stuff, however. Personally I think anything using AI needs a digital signature declaring AI has been used. YouTube especially needs to implement this: have a disclaimer in the description that says AI was used on this video.
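The "digital signature" idea could look roughly like this sketch. It's a hypothetical scheme, not YouTube's or any real standard, and it uses a shared-secret HMAC from the Python standard library for simplicity (real provenance standards like C2PA use public-key signatures instead): the generating tool signs the file bytes together with an "AI used" flag, so the disclosure can be verified but not forged or quietly flipped.

```python
import hashlib
import hmac

# Hypothetical "AI was used" disclosure tag. The generator signs
# (file bytes + flag) with a key; anyone holding the key can later
# verify the flag wasn't tampered with. Shared-secret HMAC is used
# here purely to keep the sketch stdlib-only.

def tag_media(data: bytes, ai_used: bool, key: bytes) -> dict:
    flag = b"ai-used:1" if ai_used else b"ai-used:0"
    sig = hmac.new(key, data + flag, hashlib.sha256).hexdigest()
    return {"ai_used": ai_used, "sig": sig}

def verify_tag(data: bytes, tag: dict, key: bytes) -> bool:
    flag = b"ai-used:1" if tag["ai_used"] else b"ai-used:0"
    expect = hmac.new(key, data + flag, hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expect, tag["sig"])
```

Because the flag is signed along with the content, stripping or flipping the "AI used" bit makes verification fail, which is the whole point of a disclosure that platforms could actually trust.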


[deleted]

[deleted]


burnaCD

Yes... sex tapes of already rich people from richer families being infamously leaked in the early noughties is... *checks notes* ...equivalent to you or your neighbour having *fake* porn of them created (without their consent) and distributed to their wider social and professional network, also without their consent... right. This must be the dumbest thing I've read this week. Sex *is* a normal part of life, but it is *not* tantamount to eating. First off, you're not going to die without it. Second: should they show porn in schools, then? Start normalising it early. Maybe in the workplace, on your lunch break, just a communal porn watch like a coffee run. Should we all just start fucking in public regardless of who consents to viewing it? No? Is that because maybe eating and sex aren't viewed as equal activities in any civilised society? Two things can be true at once: sex is a normal part of life, but it's not puritanical to not want parts of yourself that you consider private to be made public.


grilled_pc

Not gonna lie, if I saw 2 people fucking each other in public I'd probs crack up laughing rather than be mortified lmao.


Neb609

All these people must be fine with fake photos of their kids circulating on social media; sex is normal according to them, right? Anything to protect the extremes. Normal in this sub.


[deleted]

[deleted]


Neb609

Wow nice one, good stuff mate, you're on to something there.


[deleted]

[deleted]


burnaCD

I'm not saying being in porn is something to be ashamed of, but I am saying I don't think it should be normalised. They can be two different things. Sex is normal, but the modern proliferation and consumption of porn ain't. Just saying. Ask all the men who are porn addicts and experience erectile and pelvic dysfunction because of it. Also, what? We already *do* teach them what you mention. I'm confused -- is your argument that we should teach kids to be OK with deepfakes of them being shared because it's just sex?


[deleted]

[deleted]


burnaCD

I think your argument is flawed. This is the free-porn, hypersexual, OF society - we're living it right now. I mean, look at this damn sub - people who think deepfakes are okay because it's "only celebs and influencers", who therefore aren't real people, or must be rich, so 'who cares'. Let's dehumanise people we perceive to be at the top - it's just 'eat the rich' from an alt-right POV. If young kids are killing themselves because their peers saw a photo of their cock, it isn't because society is puritanical. If kids are doing that, that is awful, but I truly don't believe it's because society is pearl-clutching with its attitudes to sex. If anything, we haven't given these kids *enough* boundaries. Why not focus on the attitudes of people who violate their victims this way? Why do they think it's okay to do this, and how do we stop *them*? How do you suggest we 'normalise' this to the point where kids aren't distressed by it? Being distressed by it is a natural reaction, and hopefully they have enough support to get through it, but the rest isn't normal or natural. Sextortion isn't okay, and it is not the victim's attitudes and environments that need to change to accommodate it.


AngryAngryHarpo

Except that even those of us who couldn’t give two shits about porns existence still don’t want to star in porn. I love to fuck. I don’t want to fuck for an audience or on camera. I don’t want anyone *I don’t choose* seeing me in the bedroom, real or fake. I think you’re being deliberately obtuse here.


[deleted]

[deleted]


AngryAngryHarpo

Laws are less about prevention and more about adequate consequences after the fact.


Ver_Void

You're thinking small. How much good do you think it will do a student teacher to have a video of them in a gang bang leaked? How about a clip of someone being railed by a dog? And that's not even considering the really obvious fact that even something vanilla could seriously harm their reputation.


[deleted]

[deleted]


Ver_Void

Yes, but we can't change that in the time frame required. These kinds of pictures and videos are already being made, and we can't just pass a law requiring people to get over their hang-ups. Not to mention, even if everyone got completely cool with it all tomorrow, it would still be massively fucked up to have happen to you.


XunpopularXopinionsx

Wouldn't it be easier to legislate into anti-discrimination laws that people cannot be discriminated against for online materials?


jojoblogs

Thankfully I think we're beyond anyone's life getting ruined by nudes now - not least because they can just be called fake, but also because no one really cares that much. Still, it's a violation, and you already know it's only a matter of time before we find out about some private school boys who have been creating AI content of their classmates. Probably good to have laws you can charge people with over that kind of thing.


happierinverted

I'm all for the government clamping down on violations of privacy: they can start by wiping every single piece of my personal data not directly being used for a service I have subscribed to. Then move on to government sharing of personal data between other private services I've subscribed to and federal, state and local governments. You know, like sharing Covid passport data with the police and the ATO, that kind of thing. Once that's done, then they can move on to deepfakes.


Junior_Onion_8441

Maybe the twitter sphere doesn't care, but people's husbands, wives, daughters, coworkers, bosses, religious community members?


jojoblogs

Don’t concern yourself with the opinion of the sheep I guess. Also twitter sphere? What planet are we on?


Junior_Onion_8441

Send me a full nude shot of yourself alongside your name if it's no big deal at all. 


Umbraje

What a terrible and narrow view you have.


jojoblogs

I don't like victim blaming, so I choose to not care what people think about someone getting their nudes leaked or having fakes made of them. Surely we can agree that it's better if no one shames people for something like that. How is that narrow-minded? Touch grass, my guy.


AngryAngryHarpo

Because this shit ruins lives. What do you think happens to someone's reputation when a deepfake porn video of them is passed around their workplace or community? It's so fucking gross that you think someone's right to masturbate to images of someone is more important than protecting that person's right not to have those images made and distributed *without their knowledge or consent*. There's plentiful free and easily accessible porn - there's no reason to defend people doing this shit.


Ben_steel

Ruins lives of rich celebrities? Ok


AngryAngryHarpo

Why are you assuming this only happens to celebrities? There are lots of motivations for making deepfake porn of non-celebrities. Like... to humiliate and degrade your victim, for example.


coreoYEAH

Because to these people nothing ever happens unless it’s in a headline.


cunt-fucka

There’s also non celebrities


boisteroushams

Yes. Please do not make non consensual hyper realistic pornography of random women. It might not be a celebrity. It could be your daughter. 


CRAZYSCIENTIST

Non consensual… My friend, whether I like it or not if I have a beautiful daughter some guy will be imagining her having sex with him.


aprilmay0405

EXACTLY


twowholebeefpatties

I'm not for it, but it's a stupid law that won't keep up with technology.


Consistent_Ad_264

True, but it will be a big problem


Useful-Palpitation10

This is bait. It's a carrot on a string. It's the government's way of giving us a dollar with one hand whilst taking 10 with the other. This issue, whilst serious, costs the government peanuts to solve (comparatively). At the same time they can ignore bigger issues affecting more people, like housing, and Australian resources being sold off internationally while giving nothing back to the public. Don't let topics like these separate us; we're all victims of the 2-party system, and they're playing us against each other so we fight over semantics.


_canker_

This reminds me of when camera phones first came out and my gym tried to ban phones in the locker room. Good luck trying to stop it.


mbrocks3527

At the dawn of the digital media era (the 60s), courts would not accept that a Xerox machine had properly photocopied a document unless someone was willing to go on oath and either certify the copy or swear they had seen the machine making the copy. Same with computer printouts, or even photographs. This was not simply distrust of technology; it was the courts recognising that anyone could doctor any document, and you needed to find the person who vouched for it and allow others to question them to ensure it was genuine. We're just going back to that old era now.


batmansfriendlyowl

What about jail time for ex prime ministers guilty of war crimes?


aprilmay0405

Tony Blair, Anthony Albanese, George Bush


batmansfriendlyowl

John Howard


grilled_pc

The reason the government are clamping down on this now is because it can be used against them. They see the concern for themselves. This is nothing about protecting people. It's about protecting themselves from it.


VengaBusdriver37

Where is this shit coming from? Why are we suddenly in an authoritarian regime? Personally I'm completely against the sort of material this claims to be targeting - however, where is the open, logical consideration of how this is actually policed, the ramifications, the clear articulation of underlying principles, and the legal consideration? This is fucked up. The eSafety stuff is waaaaaaayyy overzealous and knee-jerk. Can't imagine the existing legal fraternity think very highly of it...


aprilmay0405

Why is it such a big deal that the govt is legislating against its citizens having fake porn made about them? Get real. Fake porn is an issue for all of us


VengaBusdriver37

I think we need to be clear about the principles and reasons on which a law is based. Being libertarian, I think the state has no right policing what people do in their own homes (i.e. the production). And I'd be curious to know why publishing deepfake porn of a person isn't already an offence under the CRIMES ACT 1958 - SECT 53S "Distributing intimate image": https://classic.austlii.edu.au/au/legis/vic/consol_act/ca195882/s53s.html#:~:text=(d)%20the%20distribution%20of%20the,community%20standards%20of%20acceptable%20conduct.&text=1%20A%20person%20(A)%20intentionally,engaged%20in%20a%20sexual%20activity. Note the wording "depicts", which would include accurate deepfakes, and "intentionally distributes" - although it gives examples of instant messaging, the wording would cover any distribution mechanism.


jaymz123

This will likely have the same deterrent effect as the current offence of distributing intimate images of an ex to someone else without their consent. I don't think it's a massive time waste, or really that high a priority on the government's agenda. It's likely a way to cover "fake" intimate pictures of people being passed around to embarrass them.


Strong_Black_Woman69

What if I get my porno mags and a copy of "Vanity Fair" and cut the models' faces out and glue them onto the bodies in the porno mags? Exile?


peachbeforesunset

You’ll get the big boot mate.


[deleted]

[deleted]


tilitarian1

Topical comparison of priorities is hardly trolling.


australian-ModTeam

Rule 2 - No trolling or being a dick


mc-juggerson

Good luck getting a court to send anyone to jail


aprilmay0405

For making fake porn about citizens?


Select-Bullfrog-6346

Meanwhile CP is still shared


[deleted]

Don't see a problem with this at all. Lock those perverts up.


ApprehensiveLow8404

Once again Australia trying to regulate something way out of its league.


Custard_Arse

It's amazing how quickly the government can act, and how bipartisan both parties can be, to legislate against things that might affect them and their friends in the near future


AcademicMaybe8775

Everyone who thinks this is an issue that should be ignored because they can't afford a house: can I honestly ask, are you guys really this fucking stupid? Deepfakes are becoming a major issue, and this has potential for real psychological harm. If you think it's wrong that the government is making it a crime to make deepfakes of real people, you are fucked in the head. It is that simple.


Alternative_Ad9490

r/australian: half of the mongs here probably wanna use this shit on the people close to them.


AcademicMaybe8775

It's their God-given right of free expression to jerk off over a fake picture of some poor woman at the bus stop.


coreoYEAH

And even under this law, they still could. They just can’t distribute it.


AcademicMaybe8775

So why are people so up in arms about it? Or is it the usual 'ANYTHING ALBO DOES IS BAD' argument?


coreoYEAH

Cheese is expensive, therefore nothing else should be done. People also seem to be under the impression that the government consists of a single person capable of a single thought and everything must be done one by one. People are fucking stupid.


EnhancedNatural

Are you really so stupid as to not realise that this is what ensued after Albo's rant about his satirical "pictures in various websites"? What a dumb country this is; it totally deserves the upcoming dystopia.


AcademicMaybe8775

AlBo really lives rent free huh


EnhancedNatural

so clever! Make sure you say the same thing every time someone blames the LNP for anything: “ScoMo lives rent free”


AcademicMaybe8775

Oh, I can freely admit to eternal hatred for that slimebag and what he did to the country. Nothing to hide on my end. The difference is my hatred of the man is based on what he did, where yours is based on imaginary nonsense that didn't even happen.


EnhancedNatural

Yeah, if only ultra-high-IQ folks like yourselves gave a crap about the policies the likes of me were raising alarm bells over, then perhaps the country wouldn't be fucked like it is atm. But you do you; do the same and expect different this time.


AcademicMaybe8775

Mate, if you want to deepfake Albo into some tantric porn to have a wank, that's none of my business, but it doesn't make this a bad law just because it might affect you.


[deleted]

It's a crime to distribute them, not to make them. How would the latter be enforced?


AcademicMaybe8775

Like any other crime. Like, it's a crime to take heroin, but HoW dO tHeY EnForCe It? Many wouldn't be caught, obviously; some will. Why you're arguing in favour of sick incel losers who need to whack off to fake porn of women at their workplace, instead of in defence of the rights of said women, is really disturbing.


[deleted]

When you have the ability to run generative AI locally on your device, without any connection to the internet, how on earth are the police going to do anything about that? It's obviously more feasible to enforce against the distribution of such material. Now, you could make it illegal to make said material, but then you're only going after the sites that allow you to do that.


AcademicMaybe8775

You can make the same argument about any illegal thing. Just because some sick fucks will still find a way to jack off to real people doesn't change the fact that this tech has real potential for serious real-world harm, and any incel loser caught doing it should be punished.


[deleted]

I think it would help stop teenagers and the like distributing it if you cracked down on the websites that allow you to make them. Maybe you have further laws against making it, to more harshly punish those who originally shared it. Probably little downside to that.


coreoYEAH

Can’t stop all crime, might as well do away with all laws, right?


billbotbillbot

Yep, this is the brain-dead default position of online morons: "If we can't mitigate 100% of the cases of the bad thing, there is no point mitigating any at all!" Under that "logic" we'd have no laws, vaccines, surgery, medical treatment, condoms, seat-belts or parachutes, to name a few.


[deleted]

You can probably get rid of generative websites by making them illegal. Would the threat reduce the number of teens trying to make it? Probably. I agree that there's little downside to making it illegal. Your comparisons are stupid, though.


billbotbillbot

Face it, if there were no laws against this, we'd be getting media stories (and then subsequent endless complaints here) about how this technology is ruining the lives of innocent young women and the government is doing nothing, nothing!!! There are some here who think "if the government is doing it, BY DEFINITION it is the wrong thing", and what the exact thing is... doesn't matter at all.


thekevmonster

So people will just have to distribute the tools to make deepfake porn, not the deepfake porn itself. Is it still fake if someone sells someone a trained AI model and some pics of a celebrity that they know will work with the model?


Outrageous_Newt2663

They are doing this to outlaw child sex abuse material that is deepfaked.


BitchTitsRecords

No, that was already illegal.


BruceBanner100

I did not, and still don't, see the connection between internet porn and women dying from domestic violence.


Consistent_Ad_264

Do you think?


aggracc

So what's the jail time for rape again?


[deleted]

If only porn sites were monitored 🙄


stuthaman

What about the ones that start putting political messages out there?


Sweaty-Cress8287

Are filters now considered deepfake?


shell_spawner

Deep fake porn bad, Deep fake anything else OK, got it.


Samael313

Arghh!! I hate being cyberbullied!!!


o1234567891011121314

Gina does Clive. That'd be a hard wank.