• thedeadwalking4242@lemmy.world · 13 hours ago

    I feel like people overestimate how effective LLMs actually are. A lot of the slop bros are unreliable narrators. A lot of their shit isn’t finished or never really worked in the first place.

    LLMs just steal large swaths of code from their training data with light recomposition. That’s why they fail if you try to do anything remotely novel or specific.

    They are just casinos for stack overflow posts.

    The models getting bigger just means they can store more data.

    I’ll give them that LLMs have some intelligence, but it’s very minor.

    People equate speed with quality all the time. Management thinks that just because they get code faster, it’s better for long-term company profits.

    What they don’t realize is the code they’re getting is completely unmaintainable, and they’ll hit a wall in a year or two once the LLM has churned the codebase into a spaghettied mess.

    I just hope people in charge of critical infrastructure know better.

    I can’t imagine dying because some medical equipment I’m using fails because of LLM slop.

  • bobbear@lemmy.world · 1 day ago

    The first half of the article reads like the dime-a-dozen “if it ain’t perfect I’m not using it” take. You can safely skip it. In the second part, he gets around to his point, which is finally brought home in the summary:

    However, especially in software development, it is quite likely that AI-based software development will eventually become the predominant paradigm and the tools will mature. Therefore, it is highly advisable not to ignore AI-based software development even if the tooling is still highly immature. Instead, it is important to follow the evolution and understand the technology well enough to know when the inflection points arrive.

  • resipsaloquitur@lemmy.world · edited · 2 days ago

    Not sure I follow the DOS/Windows analogy. Unless you spent all your time designing GUIs, a lot of those skills carried over. Especially with the advent of Visual Studio (also intended to replace engineers), you could drag and drop a window layout, double-click a button, and continue coding as before. There was a small mental leap necessary to write event-driven code (instead of using a superloop), but that’s an afternoon-long “a ha” epiphany, not going back to school for a new degree. Ditto for MVC-like layering.

    And the DOS/Windows analogy is further baffling since Windows-native coding has mostly come and gone. Even “native” Windows apps are typically Electron: web pages captured in a window.

    Unless the author is saying we must always run to stand still, which they seem to reject early on.

  • arnitbier@sh.itjust.works · edited · 3 days ago

    This is a really good read; not sure why it’s not being appreciated more.

    Well thought out and well done. Thanks for sharing.

    Guess it just kinda pisses everyone off by not “taking a side” on this set of issues, but it’s a rock-solid blog post for anyone IMO.

  • tal@lemmy.today · 3 days ago

    UNIX schedulers became better and better, and eventually nobody needed to set process priorities and nice levels anymore.

    I use nice levels.