• BoofStroke@sh.itjust.works · 13 days ago

    “For the children” tech laws should all be abolished. Why should I be burdened because you can’t be bothered to raise your own damned kids properly?

    • dogslayeggs@lemmy.world · 13 days ago

      You’re right, because kids have been shown to listen to their parents all the time and have never had problems handling adult situations when their parents aren’t around. Even amazing parents raise kids who do stupid shit, and once those amazing parents aren’t around 100% of the time, their kids are still kids and will make bad decisions. This is especially true when it’s something literally every person around them is doing (adults, kids, friends, celebrities).

      • Baylahoo@sh.itjust.works · 11 days ago

        Sure, you’re correct that parents can’t hover at every moment to correct their kid every time they make a mistake. But at this point, it’s easier to put controls that actually work on any internet-connected device you give them than to police the shenanigans they could get up to outside of supervision. Give them a tablet with parental controls; that’s better control than you have when they go to the corner and buy drugs, or whatever the real-life equivalent is. It has never been easier for a parent to manage their child’s online consumption, and it will only get better. The offline risks aren’t changing the same way.

        • dogslayeggs@lemmy.world · 9 days ago

          Honest question, because with no kids I have zero interest in this stuff: how easy are parental controls to bypass these days? They used to be trivial for any kid to get around.

  • dhork@lemmy.world · 13 days ago

    Normally, I am all for Techdirt’s takes. But I think this one is off the mark a bit, because I legitimately think that infinite scroll and auto play are insidious, and actually harmful enough to be treated as a dangerous design decision.

    The whole point of Section 230 is that communications companies can’t be held responsible for harmful things people transmit on their networks, because it’s the people transmitting those harmful things who are actually at fault. That was reasonable in the early stages of the Internet, when people posted on bulletin boards (or even early social media) and harmful content had a much smaller reach. People had to “opt in,” essentially, to be exposed to this content, and if they stumbled on something they found objectionable they could easily change their focus.

    But the purpose of infinite scroll and autoplay is to get people hooked on content. The algorithms exist to maximize engagement, regardless of the value of that engagement. I think the comparison to cigarettes is particularly apt: they are looking to hook people into actively harmful behaviors, for profit. And the algorithms don’t really differentiate between good engagement and harmful engagement. Anything that attracts the user’s attention is fair game.

    The author’s points regarding how these rulings can be abused are correct, but that doesn’t negate how fundamentally harmful these addictive practices are. It will be up to lawmakers to make sure that the laws are drafted in such a way that they can be applied equitably… (So maybe we’re screwed after all…)

  • azuth@sh.itjust.works · 11 days ago

    One of the key pieces of evidence the New Mexico attorney general used against Meta was the company’s 2023 decision to add end-to-end encryption to Facebook Messenger. The argument went like this: predators used Messenger to groom minors and exchange child sexual abuse material. By encrypting those messages, Meta made it harder for law enforcement to access evidence of those crimes. Therefore, the encryption was a design choice that enabled harm.

    The state is now seeking court-mandated changes including “protecting minors from encrypted communications that shield bad actors.”

    I don’t see any of the people celebrating this decision discussing this. Perhaps it’s a misrepresentation by the author; I can’t find the actual decision text.

    This is going to harm small non-corporate websites, not just social media, far more than it harms Facebook or TikTok. “Harmful content” is also going to include stuff like LGBTQ content, especially anything trans-related, and ‘antisemitism’ (but probably not actual antisemitism).

  • Corkyskog@sh.itjust.works · edited · 12 days ago

    The author reads like he doesn’t understand context, or the legal concept of a rational actor. What users are going to purposefully upload boring content?

    Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?

    Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.

  • Übercomplicated@lemmy.ml · 11 days ago

    The beginning of the article is pretty weak, especially Masnick kinda defending addictive design:

    Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?

    Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.

    But I gotta say, it does seem like this could set a dangerous precedent. If it becomes easy to file cases over design decisions on a platform, that precedent could easily be abused.

  • Lumisal@lemmy.world · 11 days ago

    Local wannabe crack dealer Mike Masnick says crack isn’t harmful, life without it would be boring. More at 11