• 0 Posts
  • 5 Comments
Joined 10 months ago
Cake day: May 28th, 2024


  • “Separate the art from the artist” is accepted by SANE people, who don’t have time to milk tirades and play victim on the internet just so strangers will think they’re virtuous.

    There is not a thing you do, have, own, buy, or operate that isn’t tied to slavery, human exploitation, etc. in some way or another. You only do this because you want to virtue signal to others that you think “good thoughts” and so you can be praised for being brave.

    You’re not brave. You don’t make a difference. And nobody cares.

    If you eat any kind of meat, you’re an evil “carnist”. If you don’t adopt pets from a shelter, then you’re contributing to pet farming. If you don’t drink from a paper straw, then you’re killing turtles.

    Everyone everywhere has some problem with someone’s something. You literally cannot avoid it all. Own a smartphone? Then you’re evil and deserve to die, because you’re carrying an item made with slave labor! Let me guess…you’re not gonna give up the smartphone…are you? Yeah - I didn’t think you would.

  • kitnaht@lemmy.world to Technology@lemmy.world · Generative AI is a Parasitic Cancer
    English · 23 upvotes, 5 downvotes · edited · 2 months ago

    The website she refers to in her video, and the massively wrong verbiage in the preceding paragraphs, are definitely not “AI slop”; this is a pretty common thing to see in mass website farms from India.

    Go ask chat.openai.com to explain GLTF and GLB to you; it’s not going to produce any of this weird grammatical-error shit. It explains GLB pretty succinctly, because it’s a widely discussed subject.
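
    For context (and purely as an illustrative sketch, with a hypothetical file name), GLB isn’t some obscure edge case: it’s just the glTF JSON plus its binary buffers packed into one file behind a small fixed header. A minimal Python sketch that reads that header, following the published glTF 2.0 binary container layout:

    ```python
    import struct

    # Minimal GLB header reader, per the glTF 2.0 binary container spec.
    # "model.glb" is a hypothetical example file.
    with open("model.glb", "rb") as f:
        # 12-byte header: magic, version, total length (little-endian uint32s)
        magic, version, length = struct.unpack("<III", f.read(12))
        assert magic == 0x46546C67, "not a GLB file (missing 'glTF' magic)"
        print(f"glTF version {version}, {length} bytes total")

        # The first chunk is required to be the JSON chunk ('JSON' == 0x4E4F534A)
        chunk_len, chunk_type = struct.unpack("<II", f.read(8))
        if chunk_type == 0x4E4F534A:
            print(f"JSON chunk: {chunk_len} bytes of ordinary glTF JSON")
    ```

    A plain .gltf file is that same JSON sitting loose on disk; GLB just bundles it with the binary data. It’s exactly the kind of well-documented format an LLM can explain cleanly.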

    Unfortunately, sooooo many people are worried about AI that they’re attributing anything and everything to it now. These kinds of grammatically broken websites existed before any kind of useful LLM text generation, and they’ll continue to exist afterwards. But they are not AI-generated.

    AI hallucinates and generates gibberish when you ask it to write about edge cases and subjects that aren’t commonly discussed. This is not one of those examples.