29 comments

  • CodingJeebus 26 minutes ago

    > Which brings us to the question: why does this pattern repeat?

    The pattern repeats because the market incentivizes it. AI has been pushed by these companies as an omnipotent job-killer because shareholder value depends on enough people believing in it, not on whether the tooling is actually capable. It's telling that folks like Jensen Huang talk about people's negativity towards AI as one of the biggest barriers to advancement, as if they should be immune from scrutiny.

    They'd rather try to discredit the naysayers than actually work towards making these products function the way they're being marketed, and once the market wakes up to this reality, it's gonna get really ugly.

      psychoslave 11 minutes ago

      >The pattern repeats because the market incentivizes it.

      The market is not universal gravity; it's just a storefront for social policy.

      No political order, no market; no market, no market incentives.

  • MontyCarloHall 14 minutes ago

    It's not so much about replacing developers as it is about raising the level of abstraction developers can work at, allowing them to work on more complex problems.

    The first electronic computers were programmed by manually re-wiring their circuits. Going from that to encoding machine instructions on punchcards did not replace developers. Nor did going from raw machine instructions to assembly code. Nor did going from hand-written assembly to compiled low-level languages like C/FORTRAN. Nor did going from low-level languages to higher-level languages like Java, C++, or Python. Nor did relying on libraries/frameworks for functionality that previously had to be written from scratch each time.

    Each of these steps freed developers from worrying about low-level problems and let them focus on higher-level ones. Mel's intellect, freed from optimizing the placement of instructions on the memory drum [0], can go toward optimizing the higher-level logic and algorithms of the problem he's solving. As a result, software has become not only more complex but also much more capable.

    (The thing that distinguishes gen-AI from all the previous examples of increasing abstraction is that those examples are deterministic and often formally verifiable mappings from higher abstraction -> lower abstraction. Gen-AI is neither.)

    [0] http://catb.org/jargon/html/story-of-mel.html

  • xnx 22 minutes ago

    Don't take it personally. All businesses want to reduce costs. As long as people cost money, they'll want to reduce people.

      bill_joy_fanboy 7 minutes ago

      Which is why quiet quitting is the logical thing.

      Managers and business owners shouldn't take it personally that I do as little as possible and minimize the amount of labor I provide for the money I receive.

      Hey, it's just business.

  • walterbell an hour ago

    > Understanding this doesn’t mean rejecting new tools. It means using them with clear expectations about what they can provide and what will always require human judgment.

    Speaking of tools, that style of writing rings a bell... Ben Affleck made a similar point about the evolving use of computers and AI in filmmaking, wielded with creativity by humans with lived experiences: https://www.youtube.com/watch?v=O-2OsvVJC0s. Faster visual effects production enables more creative options.

  • SonnyTark 5 minutes ago

    I recently did a one-semester higher education contract in a heavily coding-focused course. I have a few years of teaching experience from before LLMs, so I could gauge the impact for myself; my conclusion is that academic education as we know it is basically broken forever.

    If educators use AI to write and update the lectures and assignments, students use AI to do the assignments, and then AI evaluates the students' submissions, what is the point?

    I'm worried about some major software engineering fields experiencing the same problem. If design and requirements are written by AI, code is mostly written by AI, and users are mostly AI agents, what is the point?

  • klodolph 31 minutes ago

    Kind of off topic but this has got to be one of my least favorite CSS rules that I’ve seen in recent memory:

      /* enlarges the first letter of every paragraph in a blog entry */
      .blog-entry p:first-letter {
        font-size: 1.2em;
      }

  • PeterStuer 23 minutes ago

    The reverse is developers' recurring dream of replacing non-IT people, usually with a 100% online, automated, self-promoting SaaS. AI is also the latest incarnation of that.

  • erichocean 33 minutes ago

    Spreadsheets replaced developers for that kind of work, while simultaneously enabling orders of magnitude more work of that type to be performed.

      ozim 2 minutes ago

      I do agree; that's my go-to thought.

      Citizen developers were already there doing Excel. I have seen basically full-fledged applications in Excel since I was in high school, which was already 25 years ago.

  • cyanydeez 25 minutes ago

    Mythical Man-Month -> Mythical AI Agent Swarm

  • krater23 an hour ago

    The link doesn't work for me; I just get thrown to the main page after a second.

  • jalapenos 2 hours ago

    The dumb part of this is: so who prompts the AI?

    Well, we'd probably want a person who really gets the AI, as they'll have a talent for prompting it well.

    Meaning: knows how to talk to computers better than other people.

    So a programmer then...

    I think it's not that people are stupid. I think there's actually a glee behind the claims that AI will put devs out of work - like they feel good about the idea of hurting them, rather than being driven by dispassionate logic.

    Maybe it's the ancient jocks vs nerds thing.

      kankerlijer 30 minutes ago

      Outside of SV, the idea that More Tech is the answer to ever greater things is met with great skepticism these days. It's not that people hate engineers, and most people are content to hold their nose while the Mag 7 make their 401(k)s go up, but people are sick of Big Tech. Like it or not, the Musks, Karps, Thiels, and Bezoses have a lot to do with that.

        cyanydeez 23 minutes ago

        Popularity gets you nowhere, though. What matters is money, and only money. Those 401k holders are tied to the oligarchy.

      peacebeard an hour ago

      Devs are where projects meet the constraints of reality and people always want to kill the messenger.

        booleandilemma 31 minutes ago

        Devs are where the project meets reality in general, and this is what I always try to explain to people. And it's the same with construction, by the way. Pictures and blueprints are nice but sooner or later you're going to need someone digging around in the dirt.

      rvz 19 minutes ago

      Who fixes the unmaintainable mess that the AI created from the vibe coder's prompts?

      The Vibe Coder? The AI?

      Take a guess who fixes it.

        lkjdsklf a few seconds ago

        The real question is, do you even need to fix it? Does it matter?

        The reason those things matter in a traditional project is that a person needs to be able to read and understand the code.

        If you're vibe coding, that's no longer true. So maybe it doesn't matter. Maybe the things we used to consider maintenance headaches are irrelevant.

        tosapple 11 minutes ago

        For now, training these things on code and logic is the first step toward building a technological singularity.

      benoau an hour ago

      Some people just see it as a cost. At one "tech" startup I worked at, I got a lengthy pitch from a sales exec that they shouldn't have a software team at all, that we'd never be able to build anything useful without spending millions, and that the money would be better spent on the sales team, even though they'd have nothing to sell, lmfao. And the real laugh was that the dev team was heavily subsidized by R&D grants anyway.

      plagiarist an hour ago

      They don't need to put all developers out of work to have a financial impact on the career.

      dboreham an hour ago

      > who prompts the AI

      LLMs are a box where the input has to be generated by someone or something, but the output also has to be verified somehow (because, like humans, they aren't always correct). So you either need a human at "both ends", or some very clever AI filling those roles.

      But I think the human doing those things probably needs slightly different skills and experience than the average legacy developer.
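
      A minimal sketch of that "human at both ends" loop, purely illustrative: the generator and verifier below are hypothetical callables standing in for an LLM call and whatever check you actually trust (a test suite, a reviewer), not any real library API.

        from typing import Callable, Optional, Tuple

        def build_with_verification(
            generate: Callable[[str, str], str],        # (task, feedback) -> candidate output
            verify: Callable[[str], Tuple[bool, str]],  # candidate -> (passed?, failure report)
            task: str,
            max_attempts: int = 3,
        ) -> Optional[str]:
            feedback = ""
            for _ in range(max_attempts):
                candidate = generate(task, feedback)  # something has to produce the input side
                ok, report = verify(candidate)        # something has to check the output side
                if ok:
                    return candidate
                feedback = report                     # failures feed the next attempt
            return None                               # out of attempts: escalate to a human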

        reactordev an hour ago

        Rules engines were designed for just such a thing: validating input/output. You don't need a human to prompt AI, you need a pipeline.

        While a single LLM won't replace you, a well-designed system of flows for software engineering using LLMs will.
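
        To make that concrete, here's a toy sketch (every stage and rule here is a made-up placeholder, not a real engine or API): LLM stages chained into a flow, with deterministic rules checking whatever each stage hands to the next.

          from dataclasses import dataclass
          from typing import Callable, List

          @dataclass
          class Stage:
              name: str
              run: Callable[[str], str]            # e.g. an LLM call: spec -> design -> code
              rules: List[Callable[[str], bool]]   # deterministic checks: lints, schemas, tests

          def run_flow(stages: List[Stage], artifact: str) -> str:
              for stage in stages:
                  artifact = stage.run(artifact)
                  failed = [rule.__name__ for rule in stage.rules if not rule(artifact)]
                  if failed:
                      # a real rules engine would retry or reroute; this sketch just stops
                      raise ValueError(f"{stage.name} failed checks: {failed}")
              return artifact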

          Alex_L_Wood a minute ago

          Well, who designs the system of flows?

      spwa4 an hour ago

      Even that is the wrong question. The whole promise of the stock market, and now of AI, is that you can "run companies" by just owning shares and knowing nothing at all. I think that is what "leaders" hope to achieve. It's a slightly more dressed-up get-rich-quick scheme.

      Invest $1000 into AI, have a $1000000 company in a month. That's the dream they're selling, at least until they have enough investment.

      It of course becomes "oh, sorry, we happen to have taken the only huge business for ourselves. Is your kidney now for sale?"

        rvz 8 minutes ago

        > Invest $1000 into AI, have a $1000000 company in a month. That's the dream they're selling, at least until they have enough investment.

        But you need to buy my AI engineer course for that first.

      duskdozer 40 minutes ago

      How about another AI? And who prompts that AI? You're right - another AI!