Over the last few months, most conversations in tech have been dominated by Artificial Intelligence (AI) and Large Language Models (LLMs). These tools try to ape how people think and talk, with access to far more computational power and data than any person has. Naturally, they have become useful for tasks that people want to offload, or want done much faster. These are not dull tasks like crunching numbers over large tables of data, but tasks that require some kind of “translation” from one concept to another - tasks like writing, where you are converting your thoughts and feelings into prose. AI seems to be good at listening to you and converting your needs into useful artefacts.
One of the first things AI seems to be good at is computer programming. Millions of people in the world write computer programs: programs that help people with communication, trade and commerce, programs that solve complex problems like finding the optimal path to a destination, and tools that organise personal lives and run institutions. The most fun programming projects are the ones that make tools which make it easier for us to use computers and build new tools. Programming jobs come in a large variety - some are focussed on making the right tool to get the job done, and some on creating beautiful experiences for users.
Whether programming jobs are about utility (making things that work) or art (making beautiful things), most programming activity sounds like fun. It is about composing something, then making it simpler, better and faster. There is also a visual design and interactive element to it: how do you represent complex systems, how do you simplify interactions, how do you give feedback? Making software is a great mix of visual design, writing, storytelling and theatre. The way to express all of this is by writing “code”, a set of instructions telling a computer what to do. Not all code is fun - there are languages like Java which are terrible, and languages like Python that are a delight to write. For every over-engineered style-sheet, there is a beautifully crafted design system. There are “grunt” parts of programming that could be less grunt, which is why we keep building abstractions (and elegant tools like Python). A well laid out abstraction strikes the right balance between reducing noise and still giving you the power to do what you want. Finding the right abstractions is what makes programming fun.
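To make the idea of “trading grunt work for an abstraction” concrete, here is a minimal, purely illustrative sketch (the names and functions are invented for this example, not taken from any real codebase). The repetitive version writes the same check once per field; the abstracted version captures the pattern once and lets the caller state intent.

```python
# Repetitive version: the same validation logic, copy-pasted per field.
def validate_name(doc):
    if not doc.get("name"):
        raise ValueError("name is required")

def validate_email(doc):
    if not doc.get("email"):
        raise ValueError("email is required")

# Abstracted version: the pattern is written exactly once.
def require(doc, *fields):
    """Raise ValueError if any of the given fields is missing or empty."""
    for field in fields:
        if not doc.get(field):
            raise ValueError(f"{field} is required")

# One call now replaces any number of hand-written validators.
require({"name": "Jai", "email": "jai@example.com"}, "name", "email")
```

The abstraction reduces noise without taking away power: callers can still validate any combination of fields, which is exactly the balance described above.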
On the dark side, a lot of programming jobs exist to build machines that accelerate the accumulation of capital (like all machines). Companies are in a race with each other to make the greatest, fastest and most powerful machines to power society and commerce. For people whose motive is to acquire power, tools such as AI are a blessing, because they mimic human programmers at scale and help build the machine faster and faster. Large corporations that exist for the purpose of “winning” see AI as a boon that will help them win even faster, and they are investing heavily in it.
On the other side are people who want the ability to build things but don’t have the right skills for it (yet). These are people who have the hunger to build software and a surface-level understanding of how it works, but are unaware of the deeper issues and higher-order outcomes of building. They too are investing in AI. For writing automation scripts, AI has already shown a lot of value, because such software is neither extensive nor complex.
From what we have seen so far, both these groups - the large companies racing to build society’s tooling, and the smaller companies playing “catch-up” with the rest of the crowd - are using AI tools heavily to win in the market. So where does this leave companies or programmers who don’t want to use AI? Like I said before, a lot of people are programmers because they love the job - they love tinkering and making things - and for them AI could be an existential threat. They are caught between the fun of making things and the pressure to do it fast. This is where we find the industry today. If you need proof that AI is getting better at these jobs, look at the billions of dollars being invested in new AI tools and the chatter on social media.
Even at Frappe, we are seeing this trend. Frappe Build 2026, which ended yesterday, was mostly about AI. So far we have been laggards at adopting AI, but we are seeing the ecosystem invest heavily in it. Inside Frappe, engineers use AI to quickly find solutions to problems or to get assistance. Slowly they are delegating the job of writing code to the AI, with some impressive results. As someone who has skin in the game at multiple levels - as a programmer of ~30 years, and as someone invested in Frappe as a company (emotionally, more than anything else) - I see this trend as problematic. While AI certainly gives a short-term boost, it does not seem to reward deep thinking. Instead of writing “boiler-plate” code (highly verbose, repetitive code), we should be abstracting it out. Engineers are no longer looking under the hood; they rely on “high level instructions” to write code.
The next question, then, is: is AI just another layer of abstraction? Isn’t AI like the Framework, a tool that saves you from writing verbose code? The immediate answer to me is no. AI is not a Framework, because the Framework does not write boilerplate code; it only captures the configuration for the object it is trying to create. If programming is a “puzzle”, the goal of the puzzle is to write code only once and then use it many times, by making it robust enough to accept variations of the same instruction. Making abstractions is hard, but it is also the most fun part of programming. To make the right abstractions, you must name objects to mean exactly what they represent (semantics). All of this is what makes the “craft” of programming, much like the craft of writing, i.e. choosing words that mean exactly what you want to say. If “English” is the new language to write code in, I wager Python is far more specific and efficient.
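The distinction between “writing boilerplate” and “capturing configuration” can be sketched in a few lines. This is a toy example and assumes nothing about the actual Frappe Framework API - the class names and field types here are invented for illustration. The behaviour is written exactly once in the base class; subclasses only declare configuration.

```python
# Map declarative field types to Python types (invented for this sketch).
FIELD_TYPES = {"Data": str, "Int": int}

class Document:
    """Base class: all validation behaviour lives here, written once."""
    fields: dict = {}  # configuration, supplied by subclasses

    def __init__(self, **values):
        for name, fieldtype in self.fields.items():
            value = values.get(name)
            if value is not None and not isinstance(value, FIELD_TYPES[fieldtype]):
                raise TypeError(f"{name} must be of type {fieldtype}")
            setattr(self, name, value)

class Customer(Document):
    # Pure configuration - no per-field boilerplate written or generated.
    fields = {"customer_name": "Data", "credit_days": "Int"}

c = Customer(customer_name="Acme", credit_days=30)
```

The point is that nothing here is repeated or generated: the framework holds the single copy of the logic, and each new document type is just data. An AI that emits fresh validation code for every new type would be doing the opposite.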
My hunch is that AI is not really an abstraction, but a completely new thing. At Frappe, there used to be a frequent quip: “Instead of instructing someone to do something, it is faster if I do it myself”. It means that the best (and most efficient) abstraction for a good programmer is the code itself. With AI this is no longer the case. Programmers who are comfortable using AI have found natural language to be the fastest abstraction to get what they need. To me it means they have given up trying to make better abstractions in code. AI abstractions seem attractive when you don’t want to make quality software, just software that is run occasionally. Most of the cool demos I have seen are incomplete projects or proofs of concept.
Apart from the worry about whether AI can write high-quality code, my other worry is: what will it do to programmers who no longer write code? Will they have the same kind of “grip” on the quality of abstractions that makes code robust? Engineers gleefully tell me that the Frappe Framework is great for AI coding because it embeds many of the abstractions that ensure AI writes efficient code. Well, this is how all code should be written - great abstractions, minimum repetition - just say what you want to say in the fewest words. As the quip goes (often attributed to Mark Twain, though originally Blaise Pascal’s), “If I had more time, I would have written a shorter letter.” By writing “prompts” and verbose “code”, my hunch is that programmers are doing the wrong thing. Not only will they lose their “grip” on the code, they will also end up making verbose and repetitive code artefacts. I am not even talking about the cost of being dependent on external AI compute (which I assume, much like electricity, will eventually be freely available).
While I don’t think this debate has a conclusion yet, a lot of questions linger. What is going to be the long-term cost of AI coding, on both products and programmers? When the future is indeterminate, you have to trust your gut. Personally, my recommendation to every programmer out there is to continue to write code (and not specs) and use AI as little as possible. Using AI for searching and summarising seems okay (though reading the reference docs is better), but writing specs and letting the AI write the code feels wrong. Most engineers at Frappe are engineers because they love the process of tinkering and making things, and I fear that living at a very high level of abstraction will make them lose that ability. At Frappe we also celebrate freedom, and people are smart enough to choose their own tools, so they are going to experiment for sure. I am not trying to ban AI or stop it (I can't), just asking hard questions so that we think before we leap.



