• greenskye@lemm.ee · 2 months ago

    I thought the same until someone shared some additional insights with me.

    So basically, for age verification to work, you have to prove to someone that you're an adult, typically by linking your real ID. The problem arises when you log in to a porn website and it tries to confirm you're an adult by reaching out to that trusted third party. Even though the porn site doesn't know who you are, only that you're an adult, the 'trusted verifier' does know that you've visited the porn website. That makes the verifier a huge security risk, because it directly links your identity to visits to controversial websites.
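
    The linkage described above can be sketched in a few lines. This is a hypothetical model, not any real verification API; the class names, tokens, and site names are all invented for illustration:

    ```python
    # Hypothetical sketch of the third-party verification flow.
    # All names here are invented; no real service works exactly this way.

    class Verifier:
        """Trusted third party holding real IDs."""
        def __init__(self):
            self.identities = {}   # user token -> real name (from ID upload)
            self.access_log = []   # (real name, requesting site) pairs

        def register(self, token, real_name):
            self.identities[token] = real_name

        def confirm_adult(self, token, site):
            # The site only ever receives a yes/no answer...
            is_adult = token in self.identities
            # ...but to answer, the verifier necessarily learns *which*
            # site asked, linking a real identity to the visit.
            self.access_log.append((self.identities.get(token), site))
            return is_adult

    verifier = Verifier()
    verifier.register("tok123", "Jane Doe")

    # The site learns only a boolean; the verifier's log knows everything.
    result = verifier.confirm_adult("tok123", "example-adult-site.com")
    print(result)                # True
    print(verifier.access_log)   # [('Jane Doe', 'example-adult-site.com')]
    ```

    The point is structural: whoever answers the "is this person an adult?" question has to be asked it, and the question itself reveals where you are browsing.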

    Who would you really trust with that info? Corporation or government, either way, collecting it carries major risks. What happens when FL bans porn and starts targeting people they know have accessed it via this database? What happens when LGBT info is labeled 'adult only' and requires this tech to access, creating a database of potential 'undesirables'?

    Once it's created, it's all but certain that the data will be hacked and that the government will use this mechanism to target at-risk groups.

    The difference between this and in-person ID checks is data persistence. A bar just looks at your ID; it doesn't typically log it in a database. Compiling a persistent database of every 'adults only' action is just too risky.

    • kbal@fedia.io · 2 months ago

      The "you'd have to prove to someone that you're an adult" part is where we disagree. I was talking about parents setting a "user is a child" flag on the devices they let their kids use. They already know who their children are; no proof is necessary. The device can then send an HTTP header to websites, for example, indicating that it's a child user. That part could be mandated and standardized by law. It solves 99% of the problems (in legal theory; obviously not every website and app in the world will choose to participate in any of these schemes) with 1% of the dangers.

      So long as they don’t go overboard with misguided efforts to make it impossible for children to defeat the thing, it seems fine. It’s dismaying that all the proposals end up with all these ridiculously dysfunctional ideas instead.
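
      The header-flag idea can be sketched very simply. Note that no such header is standardized today; the "Child-User" name below is invented purely for illustration:

      ```python
      # Sketch of the parent-set device flag idea.
      # "Child-User" is a hypothetical header name; no standard exists.

      def apply_parental_flag(headers, user_is_child):
          """Device OS adds the flag to outgoing requests when the
          parent-set 'child account' switch is on."""
          if user_is_child:
              headers = dict(headers, **{"Child-User": "1"})
          return headers

      def serve(request_headers):
          """A participating site checks the flag and gates adult content."""
          if request_headers.get("Child-User") == "1":
              return "403: age-restricted content withheld"
          return "200: content served"

      print(serve(apply_parental_flag({}, user_is_child=True)))
      print(serve(apply_parental_flag({}, user_is_child=False)))
      ```

      The privacy appeal of this design is that no identity ever leaves the device: adults send nothing at all, and children send only a one-bit flag.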

        • greenskye@lemm.ee · 2 months ago

        That isn't sufficient for the people trying to pass these laws. They want the government, not the parents, to enforce parental controls. Controls like that already exist, and they were deemed insufficient.

        This is mostly because these people are not interested in protecting children, but rather in shutting down anything they don't like. It's the same way they tried to shut down abortion clinics by attempting to hold them to full-blown hospital building standards. It wasn't that the clinics were unsafe; it was a way to harass clinics they disapproved of.

        • captainlezbian@lemmy.world · 2 months ago

          Yeah, it's important to keep in mind that while some of these people are just concerned about children, many ultimately want content that is currently legal to be made illegal. And for those funding and enacting these measures, that content is often all of pornography (as per their definitions) and all queer content.