• areyouevenreal@lemm.ee

    No, actually it has changed pretty fundamentally. These aren’t simply a bunch of FCNs (fully connected networks) stacked together. Look up what a transformer is; that was one of the major breakthroughs that made modern LLMs possible. There is a concrete sketch of the core operation below.
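    To make the contrast concrete, here is a minimal sketch of scaled dot-product self-attention, the core transformer operation, in plain numpy. The names and shapes are purely illustrative, not taken from any particular library:

        import numpy as np

        def self_attention(X, Wq, Wk, Wv):
            # X: (seq_len, d_model) token embeddings
            # Wq, Wk, Wv: (d_model, d_k) learned projection matrices
            Q, K, V = X @ Wq, X @ Wk, X @ Wv
            scores = Q @ K.T / np.sqrt(K.shape[-1])         # pairwise similarities, scaled
            weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
            weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
            return weights @ V                              # each token mixes info from every other token

        # Toy example: 4 tokens, 8-dimensional embeddings
        rng = np.random.default_rng(0)
        X = rng.normal(size=(4, 8))
        Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
        print(self_attention(X, Wq, Wk, Wv).shape)          # (4, 8)

    The point of the sketch: the mixing weights are computed from the input itself rather than being fixed learned parameters, which is exactly what a plain stack of fully connected layers does not do.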

    • sudneo@lemm.ee

      That is a technical detail, not a fundamental change. By fundamental mechanism I mean what the machine is designed to do. Of course techniques and implementations evolve, get refined and improve over 60 years, but the idea behind the technology (NLP) did not evolve much.

      • areyouevenreal@lemm.ee

        Did backpropagation even exist in the 60s? That was a pretty fundamental change in what they do.
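        For context: backpropagation has precursors in 1960s control theory, but it only became the standard way to train multi-layer neural networks in the 1980s. Here is a minimal sketch of what it actually does: a two-layer network learning XOR with nothing but numpy and the chain rule (the layer sizes and learning rate are just illustrative):

            import numpy as np

            # A tiny two-layer network trained by backpropagation to learn XOR.
            rng = np.random.default_rng(0)
            X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
            y = np.array([[0], [1], [1], [0]], dtype=float)
            W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
            W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
            lr = 0.5

            for step in range(5000):
                h = np.tanh(X @ W1 + b1)                   # forward pass, hidden layer
                out = 1 / (1 + np.exp(-(h @ W2 + b2)))     # forward pass, sigmoid output
                d_out = out - y                            # error at the output
                d_h = (d_out @ W2.T) * (1 - h ** 2)        # error pushed back through tanh (chain rule)
                W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
                W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

            print(out.round(2).ravel())                    # should approach [0, 1, 1, 0]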

        If we are arguing about really fundamental changes, then arguably any neural network is the same thing, and a computer is the same as ChatGPT or a mouse, or even something as simple as a single-layer perceptron.
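        For contrast, a single-layer perceptron in the 1960s sense amounts to roughly this (Rosenblatt's mistake-driven update rule; a sketch, not historical code):

            import numpy as np

            # Rosenblatt-style perceptron: one weight vector, updated only on mistakes.
            def train_perceptron(X, y, epochs=20):
                Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
                w = np.zeros(Xb.shape[1])
                for _ in range(epochs):
                    for xi, target in zip(Xb, y):
                        pred = 1 if xi @ w > 0 else 0
                        w += (target - pred) * xi           # the entire learning rule
                return w

            X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
            print(train_perceptron(X, np.array([0, 0, 0, 1])))   # learns AND fine
            # ...but it can never learn XOR, no matter how long it runs.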