• dogslayeggs@lemmy.world · 2 days ago

    You’re right, because kids have been shown to listen to their parents all the time and have never had problems handling adult situations when their parents aren’t around 100% of the time. Even amazing parents raise kids who do stupid shit. And once those amazing parents aren’t around their kid 100% of the time, those kids are still kids and will make bad decisions. This is especially true when it’s something that literally every person around them is doing (adults, kids, friends, celebrities).

    • Baylahoo@sh.itjust.works · 13 hours ago

      Sure, you’re correct that parents can’t hover at every moment to correct their kid every time they make a mistake. At this point, it’s easier to put controls that actually work on any internet-connected device you give them than to police whatever shenanigans they could get up to outside of supervision. Give them a tablet with parental controls. That will be a better control than anything you have when they go to the corner to buy drugs, or whatever the real-life equivalent is. It has never been easier for a parent to manage their child’s online consumption than now, and it will only get better. The offline risks aren’t really changing the same way.

    • bootstrap@lemmy.dbzer0.com · 2 days ago

      We all did dumb shit as kids, but tech wasn’t anywhere near what it is now.

      These platforms need to be punished and held to account for the pervasive technology they have designed for profit. These things (FB, Insta, TikTok, etc.) shouldn’t be able to exist in their current state in the first place. There were no guard rails put in place. Just like with the flood of AI, technology moves so much quicker than legislation can keep up, and companies do really shitty things with that gap.

      I believe it starts at a parenting level, but it’s much more difficult to manage these days compared to 20 years ago. Age verification bullshit is not the answer, but parents need to be given some form of help against these fuckers and their incredibly easy-to-access addiction machines.

      • atrielienz@lemmy.world · 1 day ago

        I have a question. What if it’s not just at a parenting level? What if it’s also at a school level? Because I think there is at least partially a disconnect between media and internet literacy and people of all ages, including children and parents.

        I think we’re going to need such skills going forward, and there are places in the world where students are already being taught these things and benefiting from them significantly.

        Yet the immediate knee-jerk reaction seems to be to blame the parents and blame the companies that facilitate access to the content.

        It doesn’t have to be a parents-by-themselves-against-the-world system. But it also can’t just be a “companies protecting the children” system, because that’s not what companies do or are for. The need to maintain a profit margin flies directly in the face of the aim to hold companies responsible, and the laws seem intent on capping the monetary consequences of a breach.

        I do feel that the least a parent should be required to do, before complaining to a governing body that someone else is “harming” their child, is show that they have done their due diligence to protect said child. We punish parents for willful negligence and child endangerment all the time. I don’t understand why this is different, but I also wonder if there are other options for educating both children and adults that could help the situation significantly.

        • Baylahoo@sh.itjust.works · 14 hours ago

          I think you make some good points here, but just for context, I do think there is a level of responsibility on the parents in combination with the companies. There are plenty of “online literacy” classes that I think would be appropriate for adolescent education. I had the misfortune of needing to master cursive as a class one year and then typing the next. Schools would do more good teaching kids internet literacy; they can probably drop some of the old stuff. They also often don’t teach other things like financial literacy (despite heavy capitalist leanings in real life). The education system sucks, but that is not an excuse to let iPad kids control my freedoms, and the root cause for age verification has never been about protecting children in the first place.

          • atrielienz@lemmy.world · 1 hour ago

            I absolutely agree that parents play a role and have some responsibility for both their own and their children’s internet literacy, as well as for what their children access on the internet. I also agree that companies bear some responsibility (for deliberately making their platforms addictive in order to make money off people they already know are underage).

            I just really want to put forth other ideas for fixing this problem that don’t involve companies being forced by law to enact ID verification, when they can’t be trusted to safeguard such information and it feeds into the database they already have, which will more than likely be used to violate the privacy of their users.

            If the government absolutely must get involved, making it illegal to produce and give access to a platform found to be addictive would be a start, but so would media and internet literacy education. Both are solutions that don’t violate the privacy of minors or adults.

            Digital media literacy is part of the education system in Denmark and some other European countries and it’s been beneficial to their populace. I think it could be a good solution.

        • bootstrap@lemmy.dbzer0.com · 13 hours ago

          In response to your last paragraph: the issue is the predatory nature of the attention-addiction machines these companies make.

          You could compare it to a child who got into a van with “free candy” written on the side. The door was open; someone standing next to the van asking the kid to get in would be the advertising. Now the kid gets abducted. Their “attention” is held hostage, in the case of social media.

          Now, would the parents have had to tell the kids not to get into a van with “free candy” written on it before they could report it to the police? Bad luck otherwise? Now what if every month a new van rocks up with more bells and whistles: it’s a different colour, it’s got flames down the side, whatever. The point is it’s different and cool and more appealing each time. More kids go missing. The “predators” have figured out what makes these kids tick and what makes them more likely to get in the van every time.

          It’s a bit of an out-there and confronting comparison, but really, these companies are preying on your mind instead of your body, which apparently is fair game. They are still predators.

          They know the harm their platforms cause, they suppress studies that report that harm, they cover it up, they fight tooth and nail and spend millions lobbying government to let them continue to do it.

          Back on track, sorry: schools are also responsible, but you run into the same issues once companies start targeting school kids like Google did with Chromebooks, the shittiest PCs sold at a loss just so they could hook the younger generation into their ecosystem of surveillance and advertising early.

          Companies will NEVER protect the children. They will only ever protect shareholders, profits and their pedo CEOs.

          Real change will only ever come from real (not sponsored) education, government legislation that isn’t bullshit (I don’t know what that would look like, but ID checking isn’t it), and holding the tech bros increasingly accountable for their fucked-up apps.

          • atrielienz@lemmy.world · 4 hours ago

            So, for the “it’s the parents fault” bit I’ll say this. Parents are the arbiters of Internet access in their homes. If that van with “Free Candy” written on it pulled into their driveway and they didn’t call the police or warn their children not to get in the van, yes I would consider them liable.

            The fact is, lots of parents do know their children are using social media like Facebook, Instagram, TikTok, etc. A lot of parents are my age and younger (the age where we grew up with the internet and social media in its toddler years, if not its infancy). A lot of us do know the dangers (and are probably addicted ourselves).

            What some of us may lack is the knowledge to use parental controls effectively (and at least some of that is because we do dumb shit like using the same password for everything, or not changing default passwords).

            But I also think that some of us (looking at you, Collective Shout and other organizations like it) just want to offload our responsibilities onto these companies so we have someone to blame.

            And even though I agree that what these companies are doing is wrong (directly targeting minors, deliberately making their platforms addictive, collecting data on minors, etc.), and I want them held accountable, I still don’t think ID collection is warranted. I view it as a way to violate privacy and collect data for surveillance purposes, which I believe is wrong to do to people who haven’t done anything illegal.

            Even if that weren’t the case, these companies also just cannot be trusted to safeguard the PII data they’re wanting to collect. So as far as I’m concerned the ID verification thing is just not going to work.

      • paraphrand@lemmy.world · 2 days ago

        Kids should be banned from the platforms.

        But that requires the tools to do so. And then we’re back to checking ages and identities.