• bpev@lemmy.world · 15 hours ago

    I think the biggest difference between this and blue-collar workers losing their jobs, though, is that the same people losing their jobs are also very well placed to benefit from the technology. Blue-collar workers losing manufacturing jobs couldn’t, because they were priced out of obtaining that manufacturing hardware themselves, but programmers can use AI on an individual basis to augment their production. Not sure what the industry will look like in 10 years, but I feel like there will be plenty of opportunities for people who build digital things.

    That being said, people who were looking to become junior developers exactly right now… uhhh… that’s some extremely unlucky timing. I wish you luck.

    • kossa@feddit.org · 10 hours ago

      They can now, because big “AI” companies sell their product at a loss.

      The individual programmer is already outpriced when it comes to training those kinds of models themselves. Once the companies want to turn a profit, the just-laid-off worker is outpriced as well. If an LLM can really perform as well as a human programmer who costs 70–100k, nothing stops the LLM provider from charging 35–50k easily. Try to augment your productivity at that price point, especially without a job.

      I mean, society made it through the transformation of the first and second economic sectors, and we could reap the new productivity gains for the benefit of all, but, alas, here we are at the beginning of a new crisis 😅

      • bpev@lemmy.world · edited 12 minutes ago

        mmm so I’ve only used the online models for agent coding, since I’m on a laptop and lack the hardware, but my understanding is that local models like Devstral and Llama are relatively competitive, and those can run on like… a gaming rig? I don’t think they’d be able to push the price up that much.

        But I don’t disagree that big companies will try their darnedest to.

    • Proudly Green@feddit.uk · edited 12 hours ago

      Well, I’m old, so I’m not looking for a job; I am just learning programming because I want to. But to your point, I am seeing LOTS of developers who have been laid off, and finding another job is proving more challenging than ever before. It’s rough out there and I feel for them.

      To copy what someone else in this thread said:

      The idea that AI will some day be good at coding isn’t the issue. The issue is that some people in management think it’s already well on the way to being a good substitute, and they’re trying to do more with fewer coders to everyone’s detriment.

      • bpev@lemmy.world · edited 5 hours ago

        Oh, layoffs are definitely happening. I’m just not sure if they’re caused by AI productivity gains, or if AI is just the latest excuse (the pandemic, then soft layoffs via “back to office” enforcement, and now AI). Especially since the companies talking most about AI productivity gains are the same companies that benefit from AI adoption…

        What I wanted to explain is just that the skills to program actually translate pretty well. At my old company, we used to say “you know someone’s a staff engineer, because they only make PowerPoint presentations and diagrams, and don’t actually write any code”. And those skills directly translate to directing an AI to build the thing you need. The abstracted architect role will probably increase in value, as the typing value decreases.

        My biggest concern is probably that AI is currently eating junior dev jobs, since what it excels at is typically the kind of work you’d give to a junior engineer. And I think that more gruntwork kinda tasks are the way that someone develops the higher level skills that are important later; you start to see the kinds of edge cases first hand, so it makes them memorable. But I feel like that might just be a transition thing; many developers these days don’t know bare code down to the 1s and 0s. The abstraction might just move up another level, and people will build more things. At least, this is the optimistic view. 🤷 But I’m an optimistic guy.