• 0 Posts
  • 8 Comments
Joined 3 years ago
Cake day: June 12th, 2023



  • Computers were in use for at least a decade before the first home computers appeared and made the technology accessible to the public. That’s just history. The first cell phones were also prohibitively expensive, and it took a while before we got to the point where everyone had one.

    Social media would have been a better example to go with, since it has ruined a good number of lives and enabled genocides because of how quickly it was deployed before users (or the companies) knew how to use it properly. Many of the problems were also created by the companies themselves as they rolled social media out: algorithms optimizing for engagement have fueled widespread body-image and health problems, and the genocide in Myanmar was organized on the platform and assisted by algorithms pushing conflict posts for engagement. So that’s not a user issue; that’s very much a professional using a tool for malicious purposes in pursuit of profit.

    To your point that it comes down to the individual and not the tool: how does anyone know who is a responsible user in early interactions on the internet? That may be why a lot of people default to assuming all users are irresponsible, for lack of proof otherwise.


  • Maybe if computers and steam engines had been forced into everyday life and into use by people who did not understand how to use them appropriately, or the context of any of the answers given out, the critics would also have been riled up.

    As it was, computers and steam engines were prohibitively expensive, so they were used only by actual experts for long enough that basic use protocols worked their way into society before a slow entry into public use. Not really the same thing at all.

    AI and LLMs are being forced on everyday users without much recourse, so you get a lot more problematic use, both by malicious users and by people who don’t understand the tools. That is entirely the fault of the tool and of the companies making it.