A. My understanding is that an underage teen who takes a nude selfie and sends it can be charged with distributing "child porn." This is a serious problem and should not be true. If other people forward it, they should be charged, but a teenager should not get in legal trouble for making a nude selfie and sending it privately.
B. We need to culturally get over a lot of our hangups about certain things. His girlfriend was attending church. The church should be telling kids "This is not worth killing yourself over."
C. We somehow need to come up with reasonable accommodation for the reality that teens have phones and budding sexualities and these two things are colliding horrifically in a legal and cultural system strongly rooted in assuming "child porn" is made solely by abusive adults and not willingly by underage teens who don't think it's a big deal at the time the photo is snapped.
Taking a nude selfie should be safer than baring your body in person. No risk of STDs or pregnancy is involved. And yet we've turned it into this hugely dangerous thing for anyone who hasn't spent years thinking about sexual morality and who isn't prepared to say to law enforcement and the world "Nudes of me? Big fucking deal."
No idea how to further any of those things, but this is hardly the first article I've read about teen selfies leading to extremely harsh consequences that I think morally should not happen, and that I think have roots in cultural and legal norms going bad places, in part because we aren't keeping up with the times.
It's not just the CSAM aspect, it's the whole idea that sex ed is grooming. It's like the difference between saying "you shouldn't have sex until you are a responsible adult" vs "I honestly don't recommend that you have sex at this age, but if you do, here's how to do it safely and responsibly".
Teaching kids "don't send naked pictures of yourself to other people" is one thing, but if they look around and see their peers sext with each other with no real repercussions, this warning will fall on deaf ears. They will file it next to "don't drink until you're 21".
On the other hand, if you start tailoring this message into something like, "sexting is a bad idea, don't do it, especially if you don't know the other person or don't trust them 100%; if you do sext, here's how to minimize the impact of cyberbullying or blackmail: ...", then people will loudly complain that schools are teaching their children how to sext.
I homeschooled my kids, so school-based sex education doesn't really register on my radar.
It's all very alien to me how teaching safe practice of something that has a huge probability of happening can still be seen as encouraging it, "grooming". Education should be a bare minimum here: here's how your body works, here's your risk of becoming/making your partner pregnant (and what that entails for you), of catching a life-long disease, cancer, or a nasty painful thing, here's what medical science and the FDA say you can both use as contraceptives, with their failure rates and side effects, no means fricking no, and all of this compounds if you drink or do drugs...
Not teaching this amounts to not preparing your kid for the actual world and reduces body autonomy and responsible behaviour... They will experiment, and they will do stupid things.
It's as if teaching civics was a gateway to anarchist bombing or teaching chemistry a gateway to Breaking Bad... American puritanism and the whole "teaching is grooming" is such a weird thing.
What about the toxic masculinity culture taught by the right and the church? The right says, "keep these women under control." The church says, "women MUST obey their husbands no matter what." If that's not grooming, I don't know what is.
We home schooled our children. We made sure our kids knew what could happen in life because we are all flawed and sometimes make decisions that could be life altering in unintended ways. We knew that we couldn't educate them enough about the variables of life, but we tried.
> The church says, "women MUST obey their husbands no matter what."
Which church exactly says that? Only a fringe group of American evangelicals. I have never come across even them saying "no matter what" although there are probably a few real lunatics who do.
Here is the view of the largest Christian church, that submission in marriage is mutual: https://www.vatican.va/content/dam/francesco/pdf/apost_exhor... (page 115)
Many evangelical fundamentalist churches, for example, use the material from The Institute in Basic Life Principles. On marriage it says, "A husband's authority over his wife is God-given, as is his wife's non-negotiable duty to submit to him; she must respect his position regardless of his 'deficiencies'."
https://en.wikipedia.org/wiki/Institute_in_Basic_Life_Princi...
There seems to be a lot of (new?) confusion between pedophiles and adults being interested in sexually mature teenagers, which, sure, can cause its own issues, but is neither a sickness nor a one-way street, and that's what the different ages of consent, majority, and being able to interact with adult industries are for.
And the cutoff for reaching full majority while still being able to interact with your slightly younger peers has to be dealt with anyway, ideally in a gradual way.
There are multiple ways of dealing with it. Romeo and Juliet style age of consent laws for example. Not treating teenagers who do something stupid the same way as adults who distribute child porn.
But it takes some serious mental resilience to contemplate that as a teenager when someone is threatening to send your nudes to the people you love. Not many kids are going to have that kind of intestinal fortitude.
The tendency of porn to give men severe anxiety about their genitals is playing into this; kids are more embarrassed/ashamed about their bodies because all the other penises they've seen are so huge.
We need to get really cool about a whole bunch of stuff really quickly if we're going to make this (and other problems) go away. I don't think AI is going to be part of that.
And this is the problem. In years past kids would be naked with other kids, when changing after gym for example. Nowadays the only penis a boy sees is his own and that of porn people, because we've made normal nude situations extremely rare, especially outside of the context of sex.
On the other hand, bodies change over time, so it gets easier to say "That 10 year old pic? No, not real."
(Unless it's extremely attractive. Then enthuse about how gorgeous you once were, I guess, and see if people buy that.)
I think the problem is knowing what to do preemptively (which affects everyone who might help, not just the church). I am sure that if they asked a priest/pastor/whatever they would be told that.
What can you do if they do not ask for help? Preach a sermon on this specific problem? There are many variants and you need to get the message across. Maybe a sermon on the risk of blackmail in general, talking about where victims can get help, and explaining that the blackmailer is the one doing something wrong.
I think parents have a critical role to play. Talk about things like staying safe online, not trusting people, the fact that people assume false identities.
I know my teenage daughter does not disclose her real identity online. I have spoken to her about the dangers of doing so. I will give this as an example, not because I am worried that exactly the same thing would happen, but as an example of the general sort of things that happen. It might be something you say rather than a selfie, for example.
> Taking a nude selfie should be safer than baring your body in person. No risk of STDs or pregnancy is involved. And yet we've turned it into this hugely dangerous thing for anyone who hasn't spent years thinking about sexual morality and who isn't prepared to say to law enforcement and the world "Nudes of me? Big fucking deal."
I agree. The law and culture is badly out of date.
It’s more complicated than that. Due to a new law in Germany, a teacher getting to know that these images are floating around and forwarding them to the respective parent could be charged for distributing child porn.
I don't actually want to see my sons naked, thanks. They hit puberty, suddenly discovered the concept of privacy and I have no idea what their private parts look like.
That’s an insane overreach which could easily lead to abuse, they shouldn’t be accessing children’s phones, especially under the premise of ‘I need that image for safeguarding purposes’ - absolutely fucking not.
By accident? The teacher and the other teen have the same first or last name, or are next to each other in the contact list, or the autofill rearranged just before clicking, or just one misclick... there are so many ways it could have happened.
Not to mention your absolute stretch of a scenario. Do kids typically have their teachers' phone numbers in their contacts? Have them added on Facebook? Follow them on Insta? These are all genuine safeguarding issues in themselves, imo (granted I haven't been in school for 10-15 years). I feel you're being disingenuous here.
> It’s more complicated than that. Due to a new law in Germany, a teacher getting to know that these images are floating around and forwarding them to the respective parent could be charged for distributing child porn.
I was only addressing how this hypothetical teacher could have had access to it, not what they should do about it, since you were making very strong suppositions about it.
> Do kids typically have their teachers phone numbers in their contacts? Have them added on Facebook? Follow them on Insta?
Is it that rare for some teacher (even in high school) to receive some of their students' work online? And didn't the Covid situation, with remote classes, Zoom, etc., make such a thing (students having some way of contacting their teachers online) way more common now than before?
Anyway, not sure anything I can say could change your mind.
Edit: About what the teacher should do in that case: this was my assumption, I may be wrong, but I think when @pflenker mentioned this law, they were only trying to put it in a scenario, in relation to this thread, where we could think that child pornography wasn't involved, similar to when @DoreenMichele mentioned that a teenager sharing (with consent) a nude with other teenagers should not be charged and labeled as CP.
>Is it that rare for some teacher (even in high school) to receive some of their students work online?
It's been a while, but when I worked in the education sector we had systems for students to upload their work, get graded, feedback, reports etc. Yeah, you could upload .pngs and .jpgs but it's hardly 'the next contact in the list' or 'maybe they had the same first or last name' in those scenarios, you upload the work for specific classes/courses. Obviously that's a single system, I don't know the full scope of what schools use these days.
> Anyway, not sure anything I can say could change your mind.
No, nothing will change my mind: forwarding nude pictures of minors to literally anyone other than a law enforcement case handler is not acceptable.
As you (unintentionally) point out, there's money to be made/data to be acquired in education. A few years ago, before Covid, one of my siblings had to use a cloud offering because of school. I forget which, Microsoft Teams for Education maybe, or Facebook Education Groups. I just remember being disappointed. People are used to their interfaces, and those companies have the money for lobbying and ads; it's easier to sell something "free" with easy onboarding to school administrators.
I presume it’s a very difficult market to break into though for various reasons, a lot of which tech won’t solve any time soon, to the detriment of our new generations.
Edit: wait, Facebook education groups? Is this a thing that schools run, or something else?
Who made the case it was acceptable?
Granted, in this case I agree that it's better judgment not to forward the picture for all the reasons you've mentioned - but calling it insane is a bit much.
If you're a teacher, it would be wise to assume the parents may be in some way part of the problem.
If you're a teacher coming up with excuses to share nudes of your students with other people, you are at risk of being investigated very seriously as a potential child molester or distributor of child porn.
This is a minefield for a teacher and I don't think it's really a stretch to say it's "insane" or "extremely stupid" or other similarly strong language.
It has bad idea written all over it in blinking neon letters.
(And that's part of the reason why some of the ways in which parts of society react to these things are so problematic. If you're reacting that strongly to something that most likely is just the result of naivety, then frankly you have a much stronger claim to being insane than the person who's being naive.)
But your intuition is pretty much spot-on with the training I have received.
Sending a nude photo of a child to the child’s parents is absolutely beyond any form of rationality. Discuss the concern with the parents, inform the police, get them to verify that their little angel did in fact take nudes and send them around.
It’s literally forwarding CP. Let the authorities deal with this; if the parent wants to see their kid's nudes then… well, I don’t really know what to say other than let the authorities deal with it. They can show them for verification purposes or something, without sending this image to any more devices than it needs to exist on.
The police and social services, along with school district policy, will advise the next steps, including how parents are informed.
Parents don't have full control in the later teen years, but over everything leading up to that, mostly yes. For every parent I see raising kids the hard way (tons of time spent together consistently every day, being a positive role model, motivating and supporting them in all the right directions while explaining everything else in detail), I see another doing it the other way (obsessed with their pathetic office careers; kids with phones/screens from a very early age who then go mental if they have to spend a weekend without them; parents addicted to their phones and other screens too; overweight, depressed, without any real healthy passions in their lives).
The results, I mean the kids, always show how parenting went (barring, say, some inborn mental issues and traumatic accidents, which I have no right to comment on).
I'd say this leans much more heavily on the father too, like it or not. The mother is a safe haven and the source of initial care and nurture, but the father is a) a hardcore role model for the boys, and b) a template for what the girls will look for in partners later. Yeah, missing-dad syndrome is brutal, every single effin' time; the best mothers minimize this and that's about it. I don't like it, it's deeply unfair, and I'm not sure who to complain to.
Yet your comment reminds me of the modern opposite take on parental responsibility: blame most everything on childhood trauma, and blame the trauma on the parents.
We used to blame autism on mothers.
Seriously, we make as much sense as our ancestors blaming miasma for sickness. Remember the hellscape fad of recovering repressed memories? There are people that blame trauma on their past lives!
We all literally have no idea about any of this: our best bet is to be non-judgemental, do our best to create good communities, and to accept our own ignorance.
> The results, I mean the kids, always show how parenting went
Why is this such a common way to think?
Personally some of the worst things I have done I have learnt from my peers. My middle-class innocent parents are not to blame.
> Mother is a safe haven and initial care and nurture
I think stereotypes are dangerous. I'm middle-aged and while we can make generalisations about mothers and fathers, I've learnt that those generalisations can't be applied to individual mothers and fathers.
And conversing about this is just plain hard. I definitely don't want to attack you: https://news.ycombinator.com/item?id=35576696
Anyways: To my best knowledge I've really veered off the path - please don't delve into my comment too far. Black and white thinking can be a problem and I'm just as guilty of that as anybody: https://en.m.wikipedia.org/wiki/Splitting_(psychology)
That idea is pure revisionism coming from individualist ideology.
In 'those' days, everybody raised everyone. 'It takes a village' and all that. Check any actual scientific research on the matter.
Currently, the Catholic church has a track record of de facto aiding and abetting child molesting priests. I think there's room for improvement.
It's a really, really brave teenager that's going to be able to front up with naked photos sent to his parents and friends and say "they're fake". And, of course, if they're not fake and one person finds out, everyone will know soon enough. This kind of stuff is what teenage drama is made of.
From the article: the photo included his pyjamas, which matched hers. How is an AI going to know the exact pattern of pyjamas he wore? It might, but even asking that question is a problem if you're the victim.
I don't think ignoring the problem and telling the victim to toughen up is helping.
Claim the AI saw a picture of your pyjamas? But yeah, people are bad at lying. It would be enough to claim you don't know how the AI made the picture.
Anyone can write absolutely anything with little check and balance on substance, and MILLIONS still believe tabloids and headlines. We've been able to reliably edit photos/video/film for 80 years with an exponential increase in its efficiency within the last 30 - and we still have millions that would take a photo or video's substance at face-value.
It was shared on social media just before the elections, during the 'moratorium' [0]. There was no time or venue for the mainstream media to debunk it.
The scams are real, the victims are real, it's in the global "west" and sorely underreported, and the conveyors (Facebook, Twitter, etc.) have washed their hands of dealing with it:
eg.
Forrest's Facebook fight dropped by Australian prosecutors https://www.watoday.com.au/national/western-australia/forres...
Andrew Forrest’s legal battle to hold social media giant Meta to account over the proliferation of scam ads using his likeness on Facebook has been dealt a major blow.
Australian victim of fraud syndicate advertising on Facebook on a mission to stop others being burned https://www.abc.net.au/news/2024-04-15/australians-falling-f...

The realism of deepfake video with real known public figures apparently endorsing investments as "safe" is another level past Nigerian prince scams.
All of those scams work perfectly fine; that's why they're still around. Money lost through scams is only increasing; it's a growth market.
Even if they had your actual real nudes, you not replying might even make a real scammer go "shit they don't check their messages", and move to the next target.
"Meta has a portal for police to file requests to preserve records of accounts connected to criminal investigations. Like other social media companies, it has to hold the records—including emails, IP addresses, message transcripts and general usage history—for 90 days. It only hands over user data if it’s ordered to do so by a court.
There’s one way to expedite the request: file it as an emergency, meaning a child could be harmed or there’s risk of death. Larson believed this case qualified. He told Meta that a 17-year-old was already dead, and there was a high probability other kids were in danger, too.
*Meta declined his request within an hour, he says. “The request you submitted does not rise to the level of an emergency,” the company responded.*"
Also iMessage really does not seem to be well designed to handle abuse. Your only option is to block, which is a couple of taps away from a conversation. The block does not instantly take effect cross device, so you still get messages on your Apple Watch until you reboot it. Your only option for reporting anything is when deleting a conversation, and all you can do is say it’s spam.
He called in a panic and I told him to ignore it. Nothing ever came of it (because in this case, how could it? And we're both adults). I convinced him that even if it were true, it wouldn't matter. Again, highly case-specific, but he agreed it wouldn't.
I doubt the same would be true in all cases, just wanted to share.
If people were all that good at being rational about safety in the face of possible sexytimes, it's true this wouldn't happen. And we wouldn't have STDs or unintended pregnancies, either. But as the man says, life finds a way.
My first assumption would be the "girl" is a bot and the image generated by AI until proven otherwise, i.e. by meeting up at a public location and seeing it's a real person from my country. Am I too paranoid? It sure feels like the dead internet theory is just about to get real and we should act accordingly.
People who are exposed to this line of thinking will now be doubly ashamed if they fall for this, making them even less likely to ask for help.
There's a real world parallel in Search and Rescue operations that I'm personally familiar with: https://vancouver.citynews.ca/2021/01/17/shaming-those-who-n...
But... WHY?! You're not The Rock. Unless you're ripped and/or packing, the chances of your "nudes" sent in reply scaring the recipient away are pretty high.
No. Apparently it also happens to British Members of Parliament. https://www.theguardian.com/politics/2024/apr/04/senior-tory...
(1) Most young people are insecure as anything and have unhealthy self-esteem.
(2) Culturally the west is very (and I mean very) ashamed of nudity.
Problem (1) isn't something that can be addressed half-assed. The kid will need a healthy family, friends, support system. They will need to be free of traumatic influences... Whereas (2) I see as being more practical.
If you look at some European countries, they are much more naturist regarding nudity. The Germans have their Freikörperkultur (FKK), a "free body culture" movement, with many places where you can go and do activities nude. Mixed-gender saunas where people are naked are common. Then there's Japan, where parents commonly bathe with their children. It teaches them not to be ashamed of their body.
We don't really have anything like that in the west. It's really quite dangerous because its just like: do we expect literal teenagers to practice good opsec when adults can't even get that shit right?
I can't believe phone companies (in the UK) don't provide better protection, even a registry of whitelisted numbers, that could be set for the old/vulnerable.
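A whitelist registry like the one suggested could be very simple in principle. Here's a minimal sketch of the idea in Python; the numbers (from the UK's fictional 01632 960xxx drama range) and the three-way policy are invented for illustration, and a real deployment would of course live in the carrier's network, not a script:

```python
# Hypothetical call-screening policy for a protected (elderly/vulnerable) line.
# ALLOWED would be maintained by a family member or care worker via the carrier.
ALLOWED = {"+441632960001", "+441632960002"}  # e.g. family, GP surgery

def screen_call(caller_id):
    """Decide what to do with an incoming call on a whitelisted line."""
    if caller_id is None:
        return "reject"       # withheld numbers never ring through
    if caller_id in ALLOWED:
        return "ring"         # trusted contacts reach the phone directly
    return "voicemail"        # everyone else leaves a message for later review
```

Routing unknown callers to voicemail rather than rejecting them outright is one plausible middle ground: legitimate new contacts (a hospital, a delivery) can still get through indirectly, while live pressure tactics from scammers lose their bite.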
Indians, by number of victims, are the biggest victims of Indian scammers. Several people I know personally have been burned.
Our phone numbers are in several lists, and they get leaked.
I think one solution to this is strict data privacy laws. If there is a list with phone numbers/addresses, it should be subject to highest level of care and security. Or there should be laws banning collection of phone numbers unless absolutely needed.
No amount of spreading awareness seems to work. The local law enforcement of the exact two states in India where the domestic scammers are from are also "involved".
I still get occasional scam calls a few times each month, often using mobile phone numbers or fraudulent VOIP numbers registered in Austria. They usually hang up as soon as you push back on their claim that they got the phone number "from the database, maybe you participated in a contest once" when prompted with the question of how they got your number. The callers are mostly women with Eastern European accents.
I suspect my number ended up in a list when it was leaked in a Facebook data leak because I had to connect it once for account recovery/verification.
By all accounts the doctor would probably be in better shape if the kidnappers weren’t able to contact their family.
Because there are some criminals in a geographical area you want to exclude the whole area? Why don't you demand a US-only internet instead?
"NOTICE: The user appears to be in XXX, Nigeria. Beware of scams or impersonations by people hijacking your friend's accounts"
But Social media companies would never do this because it would destroy their so called "brand-trust"
Or if a caretaker educated a child about dangers on the internet.
Do you really think caretakers would watch their kids' screens remotely all the time, and that such horrific events then wouldn't happen? I don't think it would help at all if that feature existed.
And, I would definitely have caught things sooner if I'd had that ability. (Disclaimer: yes, we have been victims of these kinds of scams.)
Are they supposed to monitor the content of the chats? Some would call that eavesdropping.
The sad fact is that desperate people exist. And these desperate people are willing to do despicable things to make a buck.
So "Facebook should abort itself" is one option.
AFAIK OnlyFans doesn't directly pursue blackmail cases with its own legal teams, it just bans accounts.
FWIW, if the goal is to detect potential scam/abuse scenarios and offer help, then nowadays, they could train a ML classifier and have it run locally. If the conversation gets classified as highly suspect, the app could pop up some "this looks suspect, if you need help press HERE to share the conversation with us" warning. Privacy concerns would go away entirely, if nothing is actually reported to the mothership until the user specifically reports a conversation.
There's only so much they can do for users behind their backs, and a lot of that is undesirable unless you trust the company to be benevolent; local classifiers popping up recommendations could, however, help users help themselves.
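To make the local-classifier idea concrete, here's a toy sketch. Everything in it is illustrative: the training phrases, the naive Bayes scoring, the zero-log-odds threshold, and the warning text are all made up, and a real system would use a properly trained model on far more data. The key property is the one described above: nothing leaves the device unless the user chooses to report.

```python
import math
import re
from collections import Counter

# Tiny hand-labeled corpus: True = extortion/scam pressure, False = benign.
TRAIN = [
    ("send me money or I will share your photos with everyone", True),
    ("pay me in gift cards now or your family sees these pictures", True),
    ("you have 24 hours to send payment or I post the screenshots", True),
    ("hey are we still on for the movie tonight", False),
    ("can you send me the notes from class today", False),
    ("what time does practice start tomorrow", False),
]

def tokens(text):
    return re.findall(r"[a-z']+", text.lower())

# "Train" a tiny naive Bayes classifier: per-class word frequency counts.
counts = {True: Counter(), False: Counter()}
docs = {True: 0, False: 0}
for text, label in TRAIN:
    counts[label].update(tokens(text))
    docs[label] += 1
vocab = set(counts[True]) | set(counts[False])

def scam_score(conversation):
    """Log-odds that the text looks like scam/extortion pressure."""
    score = math.log(docs[True] / docs[False])
    for w in tokens(conversation):
        if w not in vocab:
            continue
        # Laplace-smoothed per-class word likelihoods.
        p_scam = (counts[True][w] + 1) / (sum(counts[True].values()) + len(vocab))
        p_ok = (counts[False][w] + 1) / (sum(counts[False].values()) + len(vocab))
        score += math.log(p_scam / p_ok)
    return score

def maybe_warn(conversation):
    # Positive log-odds => more scam-like than benign. Nothing is sent
    # anywhere; the app only *offers* the user a way to report.
    if scam_score(conversation) > 0:
        return ("This conversation looks like it could be a scam or blackmail. "
                "If you need help, press HERE to share it with our safety team.")
    return None
```

With a real model the same shape holds: score locally, and only surface an opt-in prompt when the score crosses a threshold, so the privacy objection to server-side scanning never arises.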
"NOTICE: The user appears to be in XXX, Nigeria. The User's hometown and usual IP is XXX , USA. Beware of scams or impersonations by people hijacking your friend's accounts"
But Social media companies would never do this because it would destroy their so called "brand-trust"
Another user noted that OnlyFans is analyzing text and bans anyone saying anything approaching blackmail.
Odd (but maybe not odd at all) to think that OnlyFans might be the leader on this.
But if you can prevent script kiddies, why not? The current barrier to entry is so damn low. A paid VPN is at least some barrier to entry.
From Facebook's point of view: I think they use SMS verification, and their app also requires location permission. Facebook is pretty good at preventing bad signups. What they need to do is invest that effort in keeping their user base safe.
You don’t get to say “we will read your messages to sell ads, we will straight up sell your message history to other companies, but no we can’t do anything to protect users from trivially detectable scams.”
They used to actually sell your data to select groups.
Now they sell advertiser ads based on your data but the advertiser does not get to see said data.
I rather assumed they already did monitor chat, in order to target adverts, A/B test, handle abuse reports, filter spam, etc.
Even if FB says it has "end to end encryption", I can't fully believe them, because the political pressure under the banner of "terrorism/kids get hurt" is immense, regardless of whether we here regard that as a thought-terminating cliché from the intelligence agencies.
[1] https://www.michigan.gov/mdhhs/-/media/Project/Websites/mdhh...
https://www.bbc.com/news/world-australia-68720247
I'm not really confident this is solvable with law enforcement in a world where the police press release says "located in a slum in Nigeria with a population of 25 million people".
Like if Facebook is being a safe haven for Nigerian extortionists, either they block Nigeria or Australia blocks Facebook.
Imagine a large American city getting cut off from Facebook because there are many people selling shoplifted items on FB Marketplace and your underfunded govt doesn't have the financial ability to crack down on them. The average person would lose all contact with friends and family because some people are using a service to commit crimes.
I think it's fine to cut off FB entirely in response… but then, I also think FB monetised what was previously free, collecting rent on being social, and as such everyone will be better off if it gets blocked in their area.
Including the advertisers. Sales can't exceed global income, so at this point the extra ad slots being forced everywhere only serve to make the advertisers part of a Nash game to spend ever more to fight each other for the same potential reward.
Said country affected can sign an extradition treaty to restore access.
Now? Does any country today block zero domains? There's nearly 200 nations so I've never bothered to check, but my guess is all block something.
Banning a company from doing business with a country is called "sanctions", not "war", and sanctions happen a lot.
Facebook making money off of illegal activity conveniently hidden behind national borders is not the cornerstone of the Internet.
"…an utterly insignificant little blue green planet whose ape-descended life forms are so amazingly primitive that they still think social media is a pretty neat idea."
I suspect that the same will eventually apply to IP traffic crossing borders. Big companies like google, netflix, meta etc will be approved by default, but anything else will be blocked.
Unless, of course, "it's so hard to regulate" is just a thought-terminating cliché because the SV set also benefits from lax internet laws. But I'm sure it can't be that, no...
It's actually not completely fine. Sanctions are effectively an act of war, just instead of shooting people and risking your own troops, you have your enemy's civilian population starve and shoot each other. This can be justified in some situations, possibly like the one you refer to; but it's definitely not an action to take lightly.
Ironically, in the originally proposed case of blocking Facebook, this is a bit of a "cut off your nose to spite your face" situation. How many small and medium businesses rely on Facebook as their main, or only, customer acquisition, communication and/or sales channel? For many countries, banning Facebook out of the blue would cause some serious economic issues and lead to plenty of actual suffering of innocent people.
(And yes, businesses will adapt, but let's not forget that adaptation in nature only ever means that the survivors of a mass die-off have more resources to use to bounce back. And it's the "die-off" part that's actually the necessary part.)
We let people fly to countries without extradition treaties. VPNs can be seen just like that: crossing the digital border. But for a company operating in a nation, it makes sense to impose regulation.
Eventually you keep walking down this line until you write laws that local ISPs are required to globally blackhole countries which otherwise evade law enforcement.
Residential VPNs exist, but they cost money and increase the barrier for entry, while also being a choke point for abuse.
There is a strong network to support rape victims, and it should be used for such cases.
Maybe you know different teenagers than me, but I'm not expecting that the boys described here would be big on joining a rape support group.
There should be support. I suspect there are sources of support, but a victim of this may not think to look for them.
I would absolutely join a rape support group, but there are none for men. Maybe that is the real problem. At least they should get a break from school, or help with relocation!
And correct labeling would help with school bullies. Reposting some "scam" pictures carries zero punishment; spreading revenge porn, on the other hand...
Reposting dick pics of exes, or groups like "Are we dating the same guy?" are sadly way too common!
Perhaps it is a scam and revenge porn at the same time! And the $300 scam is the smaller part! There are many precedents, but with girls!
This tone policing "men can never be a victim of a sexual crime" is getting old.
But NCP is used in all sorts of ways. One of those ways, this particular way here, is as part of a scam. And I think it's important to keep that in mind because to be effective fighting crime, you have to understand it. Anti-scam education is also an important component of online safety training. So when you say, "Why is it called 'scam' and 'fraud'?" as if that were somehow incorrect, I'm going to explain why it's correct.
[1] https://www.cagoldberglaw.com/states-with-revenge-porn-laws/
No, it isn't typical, you are confusing "revenge" with "extortion."
The scammers aren't doing it because they are jilted lovers, obsessed stalkers, or even just bullies: They set out from the very beginning to create a relationship to extort cash.
I.e. extortion.
Beyond it being a truly despicable crime, it's interesting how this extortion hasn't changed much since the early 2000s; the only difference is that it's done on a larger scale and probably more for money than for other nefarious reasons.
The shame is the key to this scam. The intersection of (especially teenage) insecurity and American culture's nonsensical relationship to sex and shame is tragic.
There are fortunately few ways for random strangers to cram images I don't want to see into my eyes. Most of the ones that do exist (e.g. making a social media profile and trying to "add" me) are automatically scanned for porn by the social network site.
I mentioned that shame (of sexuality/nudity/vulgarity) as a contrast to fear of being caught doing something illegal, because I think we should work to remove that shame. Or at least that magnitude of it.
I have my own relationship and journey with this American-puritanical bs shame. Life's short, lots of people are horny, no one really cares about your junk that much anyway. But yeah, I didn't have any of that perspective as a teenager.
also UK and US: "We don't even wank to it"
Asking because that’s the situation the article in the OP is talking about, and I don’t see how any of what you said applies to it.
The perpetrator I am thinking of was a foreign national, but was a student in the USA, attending a US university.
The problem is law enforcement.
Best to contact a cyber division of some sort, especially sexual harassment task forces and non-profits. They're usually able to get the ball moving. Local PDs, especially in Metro areas, simply don't have the resources. It's a sad reality.