
Author Topic: The problem of social media  (Read 44650 times)

morganism

  • Nilas ice
  • Posts: 1764
  • Liked: 218
  • Likes Given: 128
Re: The problem of social media
« Reply #250 on: February 15, 2024, 02:29:43 AM »
Your AI Girlfriend Is a Data-Harvesting Horror Show

The privacy mess is troubling because the chatbots actively encourage you to share details that are far more personal than in a typical app.

Lonely on Valentine’s Day? AI can help. At least, that’s what a number of companies hawking “romantic” chatbots will tell you. But as your robot love story unfolds, there’s a tradeoff you may not realize you’re making. According to a new study from Mozilla’s *Privacy Not Included project, AI girlfriends and boyfriends harvest shockingly personal information, and almost all of them sell or share the data they collect.

“To be perfectly blunt, AI girlfriends and boyfriends are not your friends,” said Misha Rykov, a Mozilla Researcher, in a press statement. “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

Mozilla dug into 11 different AI romance chatbots, including popular apps such as Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. Every single one earned the Privacy Not Included label, putting these chatbots among the worst categories of products Mozilla has ever reviewed. The apps mentioned in this story didn’t immediately respond to requests for comment.

You’ve heard stories about data problems before, but according to Mozilla, AI girlfriends violate your privacy in “disturbing new ways.” For example, CrushOn.AI collects details including information about sexual health, use of medication, and gender-affirming care. 90% of the apps may sell or share user data for targeted ads and other purposes, and more than half won’t let you delete the data they collect. Security was also a problem. Only one app, Genesia AI Friend & Partner, met Mozilla’s minimum security standards.

One of the more striking findings came when Mozilla counted the trackers in these apps, little bits of code that collect data and share them with other companies for advertising and other purposes. Mozilla found the AI girlfriend apps used an average of 2,663 trackers per minute, though that number was driven up by Romantic AI, which called a whopping 24,354 trackers in just one minute of using the app.
(more)

https://gizmodo.com/your-ai-girlfriend-is-a-data-harvesting-horror-show-1851253284

morganism

  • Nilas ice
  • Posts: 1764
  • Liked: 218
  • Likes Given: 128
Re: The problem of social media
« Reply #251 on: February 18, 2024, 08:09:38 PM »
(Excellent article by Mike on Techdirt taking apart a Wired piece about Section 230, algorithms, and First Amendment issues. Touches on mods. Doesn't address problems with bots and AI, which is a different issue ahead, though with the Patent and Trademark Office saying no trademarks for AI content, it may be possible to filter from a blacklist.)

(...)
No one — and I do mean no one — wants a website where companies can only moderate based on the First Amendment. Such a site would almost immediately turn into harassment, abuse, and garbage central. Most speech is protected under the First Amendment. Very, very, very little speech is not protected. The very “harassment” that the authors complain about literally one paragraph above is almost entirely protected under the First Amendment.

Also, if you could only moderate based on the First Amendment, all online forums would be the same. The wonder of the internet right now is that every online forum gets to set its own rules and moderate accordingly. And that’s because Section 230 allows them to do so without fear of litigation over their choices.

Under this plan, you couldn’t (for example) have a knitting community with a “no politics” rule. You’d have to allow all legal speech. That’s… beyond stupid.

And, as if to underline that the authors, the fact checkers, and the editors have no idea how any of this works, they throw this in:

    The United States has more than 200 years of First Amendment jurisprudence that establishes categories of less protected speech—obscenity, defamation, incitement, fighting words—to build upon, and Section 230 has effectively impeded its development for online expression. The perverse result has been the elevation of algorithms over constitutional law, effectively ceding judicial power.

The first sentence is partially right. There is jurisprudence establishing exceptions to the First Amendment. Though it’s very narrow and very clearly defined. Indeed, the inclusion of “fighting words” in the list of exceptions above shows that the authors are unaware that over the past 50 years the fighting words doctrine has been effectively deprecated as an exception.

It’s also just blatantly, factually incorrect that 230 has somehow “impeded” the development of First Amendment exceptions. It’s as if the authors are wholly unaware of the myriad attempts, in the decades since Section 230 went into effect, to convince courts to establish new exceptions. Most notable was US v. Stevens, in which the Supreme Court made it clear that it wasn’t really open to adding new exceptions to the First Amendment.

https://www.techdirt.com/2024/02/15/has-wired-given-up-on-fact-checking-publishes-facts-optional-screed-against-section-230-that-gets-almost-everything-wrong/

morganism

  • Nilas ice
  • Posts: 1764
  • Liked: 218
  • Likes Given: 128
Re: The problem of social media
« Reply #252 on: February 24, 2024, 06:34:07 AM »
The “Need for Chaos” and Motivations to Share Hostile Political Rumors

Why are some people motivated to circulate hostile political information? While prior studies have focused on partisan motivations, we demonstrate that some individuals circulate hostile rumors because they wish to unleash chaos to “burn down” the entire political order in the hope they gain status in the process. To understand this psychology, we theorize and measure a novel psychological state, the Need for Chaos, emerging in an interplay of social marginalization and status-oriented personalities. Across eight studies of individuals living in the United States, we show that this need is a strong predictor of motivations to share hostile political rumors, even after accounting for partisan motivations, and can help illuminate differences and commonalities in the frustrations of both historically privileged and marginalized groups. To stem the tide of hostility on social media, the present findings suggest that real-world policy solutions are needed to address social frustrations in the United States.

https://www.cambridge.org/core/journals/american-political-science-review/article/need-for-chaos-and-motivations-to-share-hostile-political-rumors/7E50529B41998816383F5790B6E0545A


(And another write-up article on Turchin's theory of cliodynamics, which is always worth a read.)

https://peterturchin.com/cliodynamica/

How we’re using maths and data to reveal why societies collapse

(...)
We create structured, analysable information by surveying the huge amount of scholarship available about the past. For instance, we can record a society’s population as a number, or answer questions about whether something was present or absent. Like, did a society have professional bureaucrats? Or, did it maintain public irrigation works?

These questions get turned into numerical data – a present can become a “1” and absent a “0” – in a way that allows us to examine these data points with a host of analytical tools. Critically, we always combine this “hard” quantitative data with more qualitative descriptions, explaining why the answers were given, providing nuance and marking uncertainty when the research is unclear, and citing relevant published literature.
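
(As a rough illustration of the present/absent coding described above, not the project's actual schema or tooling, here is a minimal sketch pairing each numeric value with its qualitative note and an uncertainty flag; every name and entry in it is hypothetical.)

```python
from dataclasses import dataclass

# Hypothetical record format for illustration only; the actual databank
# uses its own schema and coding conventions.
@dataclass
class CodedVariable:
    polity: str       # society being described
    variable: str     # e.g. "professional bureaucrats"
    value: int        # 1 = present, 0 = absent
    note: str         # qualitative justification, citations, nuance
    uncertain: bool   # flagged when the underlying scholarship is unclear

records = [
    CodedVariable("Polity A", "professional bureaucrats", 1,
                  "Hypothetical entry; cite the supporting literature here.", False),
    CodedVariable("Polity A", "public irrigation works", 0,
                  "Hypothetical entry; evidence judged unclear.", True),
]

# The numeric values can then feed ordinary analytical tools, e.g. a
# simple count of confidently coded traits for one polity.
confident_present = sum(r.value for r in records if not r.uncertain)
print("Traits confidently coded present:", confident_present)
```

The point of the pairing is that the 0/1 values can be analysed statistically while the notes preserve the nuance and uncertainty of the underlying scholarship.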

We’re focused on gathering as many examples of past crises as we can. These are periods of social unrest that often result in major devastation — things like famine, disease outbreaks, civil wars and even complete collapse.

Our goal is to find out what drove these societies into crisis, and then what factors seem to have determined whether people could course-correct to stave off devastation.

But why? Right now, we are living in an age of polycrisis – a state where social, political, economic, environmental and other systems are not only deeply interrelated, but nearly all of them are under strain or experiencing some kind of disaster or extreme upheaval.
(more)

https://www.rawstory.com/how-were-using-maths-and-data-to-reveal-why-societies-collapse/

(can't find the link to the Conversation article)


morganism

  • Nilas ice
  • Posts: 1764
  • Liked: 218
  • Likes Given: 128
Re: The problem of social media
« Reply #253 on: April 14, 2024, 02:06:47 AM »
(Step right up, get ur clicks right here!)

How games are used to control you
You don't have to play by other people's rules

(...)
But it begins with a mild-mannered psychologist who studied pigeons at Harvard in the Thirties. B.F. Skinner believed environment determines behaviour, and a person could therefore be controlled simply by controlling their environment. He began testing this theory, known as behaviourism, on pigeons. For his experiments, he developed the “Skinner box”, a birdcage with a food dispenser controlled by a button.

Skinner’s goal was to make the pigeons peck the button as many times as possible. From his experiments, he made three discoveries. First, the pigeons pecked most when doing so yielded immediate, rather than delayed, rewards. Second, the pigeons pecked most when it rewarded them randomly, rather than every time. Skinner’s third discovery occurred when he noticed the pigeons continued to peck the button long after the food dispenser was empty, provided they could hear it click. He realised the pigeons had become conditioned to associate the click with the food, and now valued the click as a reward in itself.
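
(For illustration only: a minimal sketch of the two reward schedules just described, rewarding every response versus rewarding at random. The probability used is an arbitrary assumption, not Skinner's actual experimental parameters.)

```python
import random

# Sketch of the reward schedules described above; the probability below
# is an arbitrary illustration, not Skinner's actual setup.
def continuous_schedule(peck_number: int) -> bool:
    """Reward every peck (fixed, predictable reinforcement)."""
    return True

def variable_ratio_schedule(peck_number: int, p: float = 0.25) -> bool:
    """Reward a peck at random with probability p (unpredictable reinforcement)."""
    return random.random() < p

def run_session(schedule, pecks: int = 100) -> int:
    """Count how many pecks are reinforced under a given schedule."""
    return sum(1 for i in range(1, pecks + 1) if schedule(i))

random.seed(0)
print("continuous reinforcement:", run_session(continuous_schedule), "rewards")
print("variable-ratio reinforcement:", run_session(variable_ratio_schedule), "rewards")
```

The behavioural claim in the article is that the second, unpredictable schedule sustains responding far more strongly than the first, even though it hands out fewer rewards.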
(more)

https://unherd.com/2024/04/how-games-are-used-to-control-you/