• kescusay@lemmy.world · 2 months ago

      It’s true, although the smart companies aren’t laying off workers in the first place, because they’re treating AI as a tool to enhance their productivity rather than a tool to replace them.

      • ShittyBeatlesFCPres@lemmy.world · 2 months ago

        I don’t know if it even helps with productivity that much. A lot of bosses think developers’ entire job is just churning out code, when it’s actually more like 50% coding and 50% listening to stakeholders, planning, collaborating with designers, etc. I mean, it’s fine for a quick Python script or whatever, but that might save an experienced developer 20 minutes, max.

        And if you “write” me an email using ChatGPT and I just read a summary, what is the fucking point? All the nuance is lost. Specialized A.I. is great! I’m all for it combing through giant astronomy data sets or protein folding and stuff like that. But I don’t know that I’ve seen generative A.I. without a specific focus increase productivity very much.

        • cecilkorik@piefed.ca · 2 months ago

          As a senior developer, my most productive days are genuinely when I remove a lot of code. This might seem like negative productivity to a naive beancounter, but in fact it is my peak contribution to the software and the organization. Simplifying, optimizing, identifying what code is no longer needed, removing technical debt, improving maintainability: this is what requires most of my experience, skill, and contextual knowledge to do safely and correctly. AI has no ability to do this in any meaningful way, and codebases filled with mostly AI-generated code are bound to become an unmaintainable nightmare (which I suspect I will eventually be paid handsomely to fix).

          • 6nk06@sh.itjust.works · 2 months ago

            That’s what I suspect. ChatGPT is never wrong; even when it doesn’t know, it acts like it does and still answers something. I guess it’s no different for source code: always add, never delete.

            • saltesc@lemmy.world · 2 months ago

              Yesterday it tried to tell me Duration.TotalYears() and Number.IsNaN() were M functions within the first few interactions. I immediately called it out, and for the first time ever, it doubled down.
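
              (For anyone curious, a rough sketch from memory of how durations are actually totalled in M; there is Duration.TotalDays and similar, but as far as I know no Duration.TotalYears, which is why it stood out.)

                  let
                      // a duration literal: 365 days, 0 hours, 0 minutes, 0 seconds
                      d = #duration(365, 0, 0, 0),
                      // Duration.TotalDays is a real M function; years have to be derived manually
                      days = Duration.TotalDays(d),
                      approxYears = days / 365.25
                  in
                      approxYears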

              I think I’m at a level where, in most cases, what I ask of LLMs for coding is too advanced, or else I just do it myself. This results in a very high count of bullshit. But even for the most basic stuff, I have to take the time to read all of it and fix or optimise the mistakes.

  • Jhuskindle@lemmy.world · 2 months ago

    Same thing happened during the outsourcing craze of the early 2000s. Everything, and I mean everything, moved to India or the Philippines. There’s even a movie about it because it was so common. I and everyone else lost our jobs. About a year later the contracts expired, we all got our jobs back, and outsourcing is now used in balance. Eventually AI use will be balanced too, I hope. It cannot replace us. Not yet anyways.

  • rozodru@lemmy.world · 2 months ago

    As someone who has been a consultant/freelance dev for over 20 years now, this is true. Lately I’ve been getting offers and contacts from places to essentially clean up the mess from LLMs/AI.

    A lot of it is pretty bad. It’s a mess. But like I said, I’ve been at it for a while, and I’ve seen this before, back when companies were offshoring anything and everything to India, and surprise, surprise, they didn’t learn anything. It’s literally the exact same thing. Instead of an Indian guy who claims he knows everything and will work for peanuts, it’s AI pretty much stating the same shit.

    I’ve been getting so many requests for gigs that I’ve been hitting up random out-of-work devs on LinkedIn in my city and referring the jobs to them. I’ve burned through all my contacts, so now I’m just reaching out to absolute strangers to get them work.

    Yes, it’s that bad (well, bad for companies; it’s fantastic for developers).

    EDIT: Since my comment has gained a lot of traction, I’ve added people’s usernames and portfolios/emails to my dev list. If something more comes up (and trust me, it will) I’ll shoot you an email or a message on here. I’ve already shoved off a bunch of stuff to others and have nothing as of now, but I imagine that will change by next week, so if more stuff comes up I’ll shoot you an email or DM.

  • jsomae@lemmy.ml · 2 months ago

    AI as it exists today is only effective if used sparingly and cautiously by someone with domain knowledge who can identify the tasks (usually menial ones) that don’t need a human touch.

    • Awkwardparticle@programming.dev · 2 months ago

      The biggest point is that you must be an expert in the field you are using it in. I rarely get fooled by hallucinations and stupid bugs because they are glaringly obvious to me. The best use case is having the LLM write code for a library that has poor documentation, that I am going to use once, and that I am too lazy to learn. These tools are scary when used by juniors; they are creating more work for everyone by using LLMs to code. I just imagine myself using this when I was a fresh grad, and it is terrifying. It would have only been one step up from vibe coding.