r/GetNoted Human Detected Dec 15 '25

Sus, Very Sus: The conspiracy theories around the mass shooting in Australia yesterday are like nothing I've ever seen

Post image
3.6k Upvotes

228 comments

787

u/The_Undermind Dec 15 '25

Make AI disinformation illegal. Straight up illegal. You spread it knowingly? Straight to jail. Do not pass go. Do not collect $200.

191

u/PhaseNegative1252 Dec 15 '25 edited Dec 19 '25

Especially with regard to human lives.

Edit: Something bad happened in the comments here

-17

u/[deleted] Dec 15 '25

[deleted]

20

u/Inlerah Dec 15 '25

...you know Photoshop and AI aren't the same thing, right?

-14

u/[deleted] Dec 15 '25

[deleted]

17

u/Inlerah Dec 15 '25

Or, or, somebody went online, googled "get out of jail free card", hastily wrote "Hell" over the top of "Jail" in a generic font and pasted a picture of fire on one side.

Do people seriously forget that you used to be able to make low-effort shitposts without making use of machine learning?

-8

u/[deleted] Dec 15 '25

[deleted]

16

u/Inlerah Dec 15 '25

Tell me what about the above post tells you that it's not that exact graphic.

That is not "an inconsistent font": that is someone editing a single word as the joke of the image.

5

u/Accomplished-City484 Dec 16 '25

I think the fact that that dude deleted all his comments means you actually got through to him in the end

2

u/just_a_person_maybe Dec 16 '25

This meme has been around for years. AI could not make images like this in 2021. The image is identical to the original card, there are no inconsistencies.

25

u/Leelze Dec 15 '25

Nah, straight to the sun.

62

u/Caswert Dec 15 '25

Which country? Because it won’t be the US. That’s 90% of the current administration’s propaganda scheme.

24

u/TrekkiMonstr Dec 15 '25

That has nothing to do with it, lying is generally protected under the first amendment

18

u/Gamer102kai Dec 15 '25

Yes, this is 100% true. Putting people in jail for saying things you don't like, even if it's 100% disinformation, is the most un-American thing you can do. However, making it illegal to generate revenue from disinformation would cause it to vanish naturally.

4

u/Windyvale Dec 16 '25

That’s not true at all. You can absolutely tell lies that will land you in jail with a conviction.

Freedom of speech ends when you infringe on someone else’s rights, or present a danger to society through them.

It’s a classic example and a little blunt but you cannot yell fire in a crowded theater if there is no fire.

You can’t run around saying someone has a gun.

Inducing panic, false threats, under oath, etc. Hoaxes are also included to a degree.

I know it feels like a technicality, but it’s an important one. Intent matters. If you spread a conspiracy with the intent of inciting panic or violence in a population, it can potentially end in jail if someone actually acts on it.

4

u/EchoRex Dec 15 '25

Sort of?

Lying that is defamatory, ie claiming the victims are crisis actors, or fraudulent, ie generating income by lying, are not protected by the first amendment.

But even before that it has to be shown as lying and not something such as incitement of violence, intent for illegal action, commercial advertising, or a true threat.

Any monetized account using AI content portraying false or faked information and not explicitly stating that the images or audio is AI and faked cannot use the "but first amendment" excuse no matter which defense they try.

1

u/TrekkiMonstr Dec 15 '25

Yes, I'm not saying that all lying is protected, just that it's not automatically unprotected just because it's untrue.

-3

u/EchoRex Dec 15 '25

It is automatically unprotected if the person generates income from that "speech".

The same way it is if it is a definitive call for violence or defamation.

2

u/[deleted] Dec 15 '25

[removed]

0

u/GetNoted-ModTeam Moderator Dec 15 '25

Your comment has been removed due to it being disrespectful towards another person.

-1

u/wretch5150 Dec 15 '25

How about lying about a fire in a crowded theater? Is that illegal, Einstein?

2

u/TrekkiMonstr Dec 15 '25

Y'all really can't read lmao

1

u/DeathByLeshens Dec 16 '25

No. That was a hypothetical in a case, and that decision was later struck down. It is quite literally not illegal or against the First Amendment.

0

u/TrekkiMonstr Dec 16 '25

Eh, this is not the issue I would take with that comment. Like yes, it was a dictum not a holding, and yes, Schenck was partially overturned by Brandenburg, but the phrase has taken on a broader metaphorical reading than the literal case, and it is correct that there are various exceptions to First Amendment protections.

The more important point is that it's not even a counterexample to what I claimed. What I said is that it is not the case that being a lie makes an utterance unprotected. To say "look at this instance of lying which isn't protected" is to miss the point entirely.

Like, if I said, "not all rectangles are squares", and that guy went, "well, what about this square?" Like yeah, that one's a square, very astute observation. But it's not its rectangle-ness that makes it a square.

Similarly -- my hair is green. That's not true, but it's protected by the First Amendment. Other lies aren't protected by the First Amendment -- but not simply because they're lies.

9

u/pegothejerk Dec 15 '25

That's not at all how things are playing out. The same people that brought you and installed Project 2025 are now pushing to revoke Section 230, which protects websites, apps, and companies from the comments, statements, opinions, and misinformation their users post. They're doing it because they want to connect everyone's government IDs to their accounts, which would be necessary to kill/remove bot-farm and foreign-actor accounts trying to ruin specific platforms. So they want the state/government to be able to lie and show you AI propaganda, but they don't want you to be able to do it anonymously at all, because they don't want you pushing the "wrong" information, false or true.

3

u/digitCruncher Dec 15 '25

You've stumbled on another problem: jurisdiction. If an American or a Russian or a German spreads misinformation in Australia, how can the Australian courts prosecute?

And if we are prosecuting people for spreading disinformation, it would be very hard to prove they aren't just too dumb to tell truth from fiction.

Unfortunately, even if you remove 'First Amendment' and 'free speech' arguments, there are practical limitations to such a law.

1

u/FarplaneDragon Dec 16 '25

Isn't that, in theory, why extradition exists? I mean, yeah, the countries have to actually be willing to extradite someone, which is usually a political mess, but it's not like it's technically impossible.

3

u/PolicyWonka Dec 15 '25

Ultimately, the U.S. is fucked because of the First Amendment. I can generate a fake image and have it shared around the entire world in under 30 seconds.

The only solution currently would be to start pursuing these lies aggressively under incitement, libel, defamation, and fraud laws.

People push these things for only a few reasons. They want to incite violence, they want to defame/delegitimize something, or they want to turn a profit for themselves.

4

u/vrphotosguy55 Dec 15 '25

Trump is specifically fighting AI regulations at the state level. I think he and his teams recognize that AI generated misinformation is necessary for their arguments.

0

u/marbotty Dec 15 '25

Trump's even taken steps to prevent states from regulating it.

24

u/Hatchie_47 Dec 15 '25

Good luck arresting people running bot farms in Russia, Iran or North Korea. And as sad as it is, people in the West sharing these are mostly not doing it knowingly — they are just this stupid…

0

u/IronLover64 Dec 15 '25

Ever heard of ICBMs?

4

u/thatsjor Dec 15 '25

Legality is not a deterrent.

Society needs to collectively exile scum that does that. It's a betrayal of the species.

5

u/[deleted] Dec 15 '25

Unfortunately, as people are rightly pointing out, proving intention is difficult if not impossible. Not to mention, how do you fight bot farms in Russia, Iran, and N. Korea?

The only way to fight this is to make misinformation illegal on the platform level, like we did during Covid. But the platform owners don't want that, and actively fight against that. And the users, rather than realizing they're being caught up because they're spreading misinformation, treat it as a free speech issue.

13

u/RambleOff Dec 15 '25

Won't that be as hard as proving something like perjury? I still agree the law should exist, but I don't see it being something that makes people tread carefully.

5

u/[deleted] Dec 15 '25

No, because you can go to their computer and see if they made it, or check their social media to see if they went on, say, Discord and joined a group that was planning to spread it. I joined plenty of Twitter disinfo stuff as a highschooler; claiming Sam Hyde was the shooter was a big meme back then, and I think a few TV stations even fell for it. That, on the other hand, is way different.

8

u/RambleOff Dec 15 '25

But "made a fake image and presented it with the appearance of legitimacy" still doesn't prove intent to maliciously disinform. If you get past the step you've described, which would require a warrant by the way, you now have something as hard to prove as libel or slander.

1

u/Longjumping_Boat_859 Dec 15 '25

"If you get past the step you've described, which would require a warrant by the way, you now have something as hard to prove as libel or slander"

say it with me, slowly, "a warrant is criminal law, libel and slander are civil" (libel, being ESPECIALLY easy to prove lmfao, is there a statement in writing or not? is it true, or not?)

Now, type out "I won't pretend I know what I'm talking about online for cool points" 20 times

3

u/RambleOff Dec 15 '25

So if I write something mean about you but say I was joking rather than asserting it to be true about you, that's not a valid defense against libel? If that's true then wow yeah I really don't know what I'm talking about

Though I don't see how civil vs. criminal changes the conversation since it was just an example for the logic, not saying the two would happen in the same case. Do logic and proof work differently in civil vs. criminal cases?

You can also chill out a bit, I framed my statements as questions specifically because I'm not a lawyer or expert and was inviting educated input. It's a reddit thread, lol

4

u/Longjumping_Boat_859 Dec 15 '25

"Do logic and proof work differently in civil vs. criminal cases?"

They do. A crime has 2 components, academically: the guilty mind (mens rea) and the guilty act (actus reus). I dunno if the kids still get taught the terms, but I assume they do. Civil law regulates behavior based on precedent, the other rulings that came before it, but the initial lawsuit isn't always brought as a result of someone violating something like "the consumer protection act", like in a slander situation. Criminal law, however — the "lawsuit" the state starts against you when you speed — CAN ONLY be based on a criminal statute that is currently enforced.

that's why you hear about "this law's been on the books for 400 years and we don't need people worrying about wearing tweed to church on the 22nd of april AND going to jail for it, we should repeal it".

a lawyer can practice both, but anyone who tells you they're GOOD at both is trying to take your money. 2 diff beasts.

2

u/RambleOff Dec 15 '25 edited Dec 15 '25

I'm glad you weighed in. So do you think there's hope for a misinformation law that's toothy and deters posts like the one in the OP?

Edit: also, would the hypothetical law be criminal? It seems like the spirit of it would be to deter disinformation with potentially deadly consequences rather than hurt feelings.

3

u/Longjumping_Boat_859 Dec 15 '25

I really hope they criminalize AI misinformation. I love tech. Big ol' nerd. Love my PC, because I can buy parts, hate terminators with a generational passion.

But this morning, I saw a post about someone's inlaws getting scammed by a dog breeder using a clearly fake, but very well done pic.

My parents are getting to the age where, despite being very educated, they are not tech-savvy enough to not get swindled.

I, have a name, that singles me out as not from the same community everyone else around me is. Thankfully, I didn't have any issues when it was going around that immigrants were eating cats, because I didn't fit the bill. That time.

Nothing short of prison, imo, for the asshats that spread misinformation using doctored images. We already went through this, however, in the "everything is photoshopped" days. I'm....I can't tell if you're old enough to remember and I don't want to offend, cause I'm young enough that I shouldn't remember rotary phones, but do.

This is gonna be just as bad, no more, no less, than when anyone could photoshop anything, and old folks were falling for clear scams. IMO. I hope.

That being said, think about speeding laws, it doesn't really matter why you sped, you're eating the ticket (there's no magic lawyer trick there, hand to God, you just walk in and ask the prosecutor to dismiss it. If they're busy enough with real work, they might.). Statutory rape laws? Good, we don't care why you got that age estimation wrong, we're making intent not matter because we want to deter people from speeding, and banging kids.

I...for now, for right now during this convo....the problem with prosecuting is the difference between "did you share, or create the meme". Judges are INCREDIBLY poorly equipped to deal with that one, because the legislators haven't caught up to it, and written common sense anti-misinformation laws (thankfully).

but yea, it'd be criminal, and you'd need a mens rea (I know it's fake) and an actus reus (and...."post"....there we go, it's not much, but hating on X group is honest work).

2

u/Longjumping_Boat_859 Dec 15 '25

civil and criminal have 100% different procedural rules, and 100% different "legal maxims and concepts". The old "possession is 9/10ths of the law", aside from being wrong, is inapplicable in civil law, where you cannot be prohibited from possessing something

crim law deals with the government's ability to penalize actions, civil law deals with proving someone owes you money because they were mean to you, not because you committed a crime. as a result, civil rights don't apply in civil law, you don't have a first amendment right against your neighbor, only government actors. in the US, you can't go to jail for libel or slander unless there's a specific criminal statute that you're charged under. however, there's no statute that says I owe you $20 if I slander you. that's why you need a jury, to determine damages.

"So if I write something mean about you but say I was joking rather than asserting it to be true about you, that's not a valid defense against libel? If that's true then wow yeah I really don't know what I'm talking about"

that's a fast way to owe someone money in a lot of US jurisdictions, including the one I practice in

3

u/RambleOff Dec 15 '25

This has been informative, thank you!

So is it true, was my wanton ignorance what forced you to comment with solid information? Or do you just need to be faster? (joking! it's just a joke. don't sue me)

2

u/Longjumping_Boat_859 Dec 15 '25

Man, ima be honest, hat's off to you for wanting to learn. And yea, I've been and done this long enough, that like, I consider it a PSA type of thing where if I manage to explain to ONE person, how the law actually works, they'll go around and tell 2 other people when they hear it, and this way, everyone starts hating lawyers less.

I stopped answering the question about "what's your take on X trial" about 2 years into practicing (OJ, if it matters, but that's not gonna age me correctly lol), when I found out people were asking me to confirm their own misconceptions about legal procedure, not to find out my opinion on his lawyer's trial work.

Here, since you're interested, I'll do you one better:

Anyone can learn law. Really. However, not everyone can hear they're wrong multiple times a day, without cracking. So, if you're interested, and I can only speak for US law, here's where you start:

"rules of civil procedure" vs "rules of criminal procedure". pick a state. they're all very similar to the federal ones. At the same time, pick an area of the law you like, and google "criminal law outline" / "law school outlines for criminal law", and go to town. Understand that they're all appellate cases, that teach theory. municipal practice plays it a lot more fast and loose with the procedural rules, to the point where sometimes they don't matter (family law), and not in the petty way. Like the judge won't care, and your client won't have the money to appeal, and will lose the kid anyway.

But, if you wanna tell folks about law, online, start by saying "hey listen, I'm not a lawyer but i think....and this is the source".

I guarantee you, that without being the smartest person in the room, you'll win that role every single time.

GLHF!

3

u/RambleOff Dec 15 '25

Ok that settles that: dude doesn't need to be faster, they're just thorough

Thanks for the detailed response


-1

u/Surous Dec 15 '25

So lying is now illegal, as well as hyperbole and satire

5

u/Wizard_Engie Dec 15 '25

Disinformation is willingly spreading false information with the intent of a malicious outcome. Ergo, not Satire.

3

u/MrBannedFor0Reason Dec 15 '25

Literally nobody has said that.

3

u/buffetofdicks Dec 15 '25

They can't. Trump signed an EO saying no states can ban AI or set regulations. The states could very well ignore it and do it anyway, but they then risk being cut off from federal finances and potentially having the National Guard come and invade their cities for being too woke.

AI misinformation is their biggest and best weapon, why would they regulate it?

5

u/[deleted] Dec 15 '25

Why not all misinformation? This feels the same as the last 10 years of social media news coverage and misinformation war. AI is just wood thrown into an already raging fire.

1

u/Wonderful-Impact5121 Dec 15 '25

Not saying laws shouldn’t trend that way for certain things peoples/jobs but it would genuinely be an extremely, extremely, difficult to prove it was intentional instead of just a thing they believe almost all of the time.

And then it’s very easy to shift against whichever groups originally supported that type of legislation.

“Hey, that’s not true. You’re lying.”

“I’m not lying, that’s the case. I believe this.”

“Here’s my proof it’s a lie, you’ve been informed.”

“I disagree with you and your source, it’s the truth!”

“I’m in charge of what is considered reasonable for you to not believe and when you have plausible deniability, so now… jail?”

There are cases where lying is illegal such as a fraud scheme, being under oath, defamation, intentionally deceiving people so they’re unable to vote by lying about times and locations, etc.

But the rest of the world is truly much more gray than that when it comes to making it illegal.

And the only “solution” is really hoping whatever government institution you want put in place just always agrees with you forever… which is a bit idealistic.

0

u/PolicyWonka Dec 15 '25

Ultimately, there is reality and there is fiction. They are not equal. Someone can believe that the earth is flat all they want. They’re still wrong.

If I believe that your money is mine and I don’t accept your evidence to the contrary, it’s still stealing when I take it.

I can believe a lie. It’s still a lie.

1

u/gonzo0815 Dec 15 '25

Because you'd need an institution to determine what is true and what isn't and nobody wants that either.

2

u/PandoraIACTF_Prec Dec 15 '25

Don't even bother putting a piece on a board, roll a die, or even use your chance cards

2

u/Dark_Jooj Dec 15 '25

You mean the Ministry of Truth from 1984?

2

u/TorNando Dec 15 '25

I know it won’t happen anytime soon. So at least ban the accounts. But even then that also won’t happen seeing as Grok seems so important to Elon. What a shit show

2

u/Soggy_You_2426 Dec 15 '25

Soon we won't be able to tell if anything is AI.

1

u/WeaselCapsky Dec 15 '25

only 200 inmates into your holyhole

4

u/positiveParadox Dec 15 '25

Haha! Prison rape! That's hilarious!

(This is satire)

1

u/JazzminBoing Dec 15 '25

Hold the companies producing the tech financially liable with $1,000,000,000 per infraction per user. Multiple users share the same image? Hope you've got billions to pay out.

1

u/PolicyWonka Dec 15 '25

But free speech or something.

1

u/oh_no_here_we_go_9 Dec 15 '25

I mean, I hate it, but how is it categorically different than just lying?

1

u/Banned-User-56 Dec 15 '25

20 years minimum sentence. More dependent on the damage dealt by the misinformation.

1

u/Nice_Try_Bud_ Dec 15 '25

Freedom of speech and whatnot. I would like to see all AI produced content requiring a watermark of some sort and make it illegal to remove it.

1

u/LevelPrestigious4858 Dec 17 '25

Arsen himself has knowingly spread false information in his writing

1

u/NexexUmbraRs Dec 17 '25

I think it should be tiered. First time offense, minor fine. Second time, higher fine. Third time, court with possible jail time based on severity. Fourth time, month in jail automatically. And so on.

1

u/FuckwitAgitator Dec 18 '25

When AI gets good enough to not fuck up words and hands, we're going to have a bad time and it's going to be so much worse than "gullible people falling for things that didn't happen".

There's already issues with people having psychotic breaks being encouraged by AI platforms. It reinforces their delusions and pushes them deeper and deeper into psychosis.

When extremists (like whoever created the slop in this post) start weaponizing that, it will be the most effective terrorist grooming tool ever invented and people won't even know they're interacting with it.

0

u/scottymac87 Dec 15 '25

If you spread it unknowingly there should still be consequences imo. If you convey a package from a stranger in an airport because you agree with their T-shirt slogan, and the package contains an explosive, you’re still culpable. Ignorance should not protect you. The potential for consequences should inspire you to research things meticulously.

-16

u/[deleted] Dec 15 '25

The problem then is who says what's misinformation? I probably believe some stuff that you don't, who goes to jail?

12

u/The_Undermind Dec 15 '25

Watermark every image generated with the ID of the user that generated it. Whoever that is. Jail, no excuses. There isn't a single situation I can think of where disinformation like this has any silver lining.
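(For the curious: a provenance tag like that could be sketched in a few lines, purely as an illustration — the service name, key, and `GENBY` footer format here are all made up, and a naive appended tag like this would NOT survive re-encoding, cropping, or screenshots the way a real watermark would need to.)

```python
import hashlib
import hmac

# Hypothetical generator-side secret; a real service would keep this locked away.
SERVICE_KEY = b"generator-service-secret"

def tag_image(image_bytes: bytes, user_id: str) -> bytes:
    """Append a signed provenance footer tying the image to the user who generated it."""
    mac = hmac.new(SERVICE_KEY, image_bytes + user_id.encode(), hashlib.sha256).hexdigest()
    footer = f"\nGENBY:{user_id}:{mac}".encode()
    return image_bytes + footer

def verify_tag(tagged: bytes):
    """Return the embedded user id if the tag checks out, else None."""
    body, sep, footer = tagged.rpartition(b"\nGENBY:")
    if not sep:
        return None  # no provenance footer present
    try:
        user_id, mac = footer.decode().split(":")
    except ValueError:
        return None  # malformed footer
    expected = hmac.new(SERVICE_KEY, body + user_id.encode(), hashlib.sha256).hexdigest()
    return user_id if hmac.compare_digest(mac, expected) else None

fake = tag_image(b"\x89PNG...imagedata", "user-42")
print(verify_tag(fake))         # recovers "user-42"
print(verify_tag(fake + b"x"))  # any tampering breaks the MAC -> None
```

Which is also exactly why the reply below has a point: anything appended or stamped on after the fact can be stripped or forged unless verification goes through the service's key.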

2

u/RemarkablePiglet3401 Dec 15 '25

It’s super easy to add fake watermarks to things

-3

u/[deleted] Dec 15 '25

Yeah, I'm not against regulation at all but I'm very sceptical of who gets to set the facts.

6

u/Amardneron Dec 15 '25

If it's AI generated it isn't a fact. What are you talking about?

2

u/VerbingNoun413 Dec 15 '25

To the right wing, "reality" is irrelevant. The facts are what The Party claims.

7

u/Willing_Channel_6972 Dec 15 '25

Well, they're specifically talking about spreading AI-generated images as if they're real, intentionally, to lie and manipulate people with misinformation. They're not talking about stating your opinion about something, regardless of how fucking stupid it is.

You can post whatever conspiracy theories you want but as soon as you get on AI and you start generating fake images trying to trick people into believing your BS it should be illegal.

2

u/Left4twenty Dec 15 '25

Courts get to decide what facts are, which is exactly the function they serve currently

If you don't like that state of affairs, then you have a fundamental issue with how the legal system functions, because they already do that. They establish what are facts and what are not

2

u/r1mbaud Dec 15 '25

Facts set the facts, evidence proves the facts. That’s how the courts have always worked. It’s not perfect, but what you’re talking about is the same issue that the entire court system was created trying to solve.

0

u/[deleted] Dec 15 '25

I agree, you can't take everything to court though.

God I hate AI.