• arc99@lemmy.world · 2 days ago

    Hardly surprising. LLMs aren't *thinking*, they're just shitting out the next token for whatever sequence of tokens you feed them.
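
    For what it's worth, that "next token" loop really is the whole inference step. A minimal sketch of greedy autoregressive generation, assuming the Hugging Face `transformers` library and the `gpt2` checkpoint purely as an example:

    ```python
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Small example model; any causal LM works the same way.
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    ids = tokenizer("The cat sat on the", return_tensors="pt").input_ids
    for _ in range(10):
        logits = model(ids).logits                            # scores over the whole vocabulary
        next_id = logits[:, -1, :].argmax(-1, keepdim=True)   # pick the single most likely next token
        ids = torch.cat([ids, next_id], dim=-1)               # append it and go again

    print(tokenizer.decode(ids[0]))
    ```

    Sampling strategies (temperature, top-p, etc.) just change how that one next token gets picked; the model never does anything but score the next token given everything so far.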