Using AI for programming: totally missing the underlying message

Some thoughts about “flipping over the rock” and looking at some shocking stuff beneath a gray, featureless surface. I’ll explain below.

 

Lucas Maes

Senior Programmer

Maes Associates

Advisor to the Project Counsel Media team

18 February 2025 (Berlin, Germany) – – Earlier today I enjoyed a coffee with my major client, Greg Bufithis, who owns/runs The Posse List, on his visit to my neck of the woods. We were summarizing my career. It hit me that I have been a computer programmer/developer for over 30 years, so my experience predates AI by a lot.

I’ve programmed for multiple industries: advertising, cybersecurity, hospitals/medical coding, legal technology, media, and telecommunications. Plus plain old holistic system design, which is the process of defining the architecture, components, modules, interfaces, and data for a system to satisfy specified requirements.

My first job was at Bell Labs which is credited with the development of radio astronomy, the transistor, the laser, the photovoltaic cell, the charge-coupled device (CCD), information theory, the Unix operating system, and the programming languages B, C, C++, S, SNOBOL, AWK, AMPL, and others, throughout the 20th century. 

In fact, for a time I worked for Dennis Ritchie. I cannot do justice to Ritchie’s sweeping influence on the modern world of technology. Ritchie is the father of the C programming language, and with fellow Bell Labs researcher Ken Thompson, he used C to build UNIX, the operating system that so much of the world is built on – including the Apple empire built by Steve Jobs. Jobs had said Apple computers, and even the iPhone, would never have happened without the input of Ritchie. 

And ever so briefly: C is a general-purpose programming language, and it remains very widely used and influential. By design, C’s features cleanly reflect the capabilities of the targeted central processing unit. It has found lasting use in operating systems code and in all the things whose names are part of today’s vernacular: kernels, device drivers, protocol stacks and application software – though its use in that last category has been declining. But C is still commonly used on computer architectures that range from the largest supercomputers to the smallest microcontrollers and embedded systems.
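
To make “cleanly reflect the capabilities of the CPU” concrete, here is a minimal, illustrative C sketch of the kind of low-level work the language was built for: treating a word of memory as raw bits and setting individual flags through a pointer, roughly the way a device driver pokes a hardware register. The “register” below is just an ordinary local variable so the example compiles and runs anywhere; real driver code would point at a memory-mapped hardware address, and the bit names are invented for the example.

/* Illustrative sketch: a fake "status register" manipulated the way
   driver code manipulates a real memory-mapped one. */
#include <stdint.h>
#include <stdio.h>

#define ENABLE_BIT (1u << 3)   /* hypothetical: bit 3 enables the device */
#define READY_BIT  (1u << 0)   /* hypothetical: bit 0 reports "ready"    */

int main(void)
{
    uint32_t fake_register = 0;              /* stand-in for real hardware */
    volatile uint32_t *reg = &fake_register; /* drivers read and write through
                                                volatile pointers like this */

    *reg |= ENABLE_BIT;                      /* set one bit, leave the rest alone */
    *reg |= READY_BIT;

    if (*reg & READY_BIT)                    /* test a single bit */
        printf("status word: 0x%08X\n", (unsigned) *reg);

    return 0;
}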

What triggered my reminiscing was the article “New Junior Developers Can’t Actually Code”. The write-up is quite interesting if you are involved with programming, but even more so if you work with AI in any capacity.

But I think an important point in the essay has been either overlooked or sidestepped. In my opinion, the main point of the article is this one:

The foundational knowledge that used to come from struggling through problems is just … missing. We’re trading deep understanding for quick fixes, and while it feels great in the moment, we’re going to pay for this later.

I agree. The push is to turn creating software into what I like to describe as a “TikTok mindset”: the idea that one can do a quick software search and get an answer, preferably in less than 30 seconds. I know there are young people who spend time working through coding problems. We have one of those 14-year-old wunderkinds in our family. The problem is that I am not sure how many other 14-year-olds have this baked-in desire to actually put in the hard time and work through problems. From what I see and hear, teachers are concerned that students are in TikTok mode, not in “work through” mode, particularly in class.

The write-up notes:

Here’s the reality: The acceleration has begun and there’s nothing we can do about it. Open source models are taking over, and we’ll have AGI running in our pockets before we know it. But that doesn’t mean we have to let it make us worse developers. The future isn’t about whether we use AI – it’s about how we use it. And maybe, just maybe, we can find a way to combine the speed of AI with the depth of understanding that we need to learn.

Yes. I (kind of) agree. Now my “however ….”

1. Mistakes with older software may not be easily remediated. I am approaching dinobaby-hood. Dinobabies drop out, or die. The time required to figure out why something isn’t working may not be available. For a small issue, that might be a nuisance. For something larger – an enormous, cannot-fail banking system or a major cybersecurity incident – it can be a very difficult problem.

2. People with modern skills may not know where to look for an answer. The reference materials, the snippets of code, or the knowledge about a specific programming language may not be available. There are many reasons for this “knowledge loss.” Once gone, it takes time and money to recover the information – not a TikTok fix. It’s why my most valuable IP is my programming language database (hardcopy and optical media) that goes back to 1957, plus my network of dinobaby programmers.

I still have all of the original texts and updates of Robert Korfhage, the American computer scientist who is the “Godfather” of information storage and retrieval.

Plus the work of Brian Kernighan, the Canadian computer scientist who worked alongside Dennis Ritchie. In the 1990s he developed a pioneering university course with a companion text, “D is for Digital”, which became widely used because it was aimed at everybody – a brilliant course and book for non-technical readers, helping them understand how the world of digital computing and communications operates, from hardware through software to the Internet and the web. Unbelievable detail on how these systems work, no matter what your technical background.

3. The software itself may be somebody else’s hack job. I remember a coding project at Bell Labs. The regional manager running the project asked us if we could write the documentation. My colleague said, “Ho ho ho. We’ll just use assembler and make it work” – which was kind of a cheater’s way to write code. The manager said, “You can’t do that for this project. I want the manual. Think it through and write it properly.” That standard held at the time, but in the years since, “short cuts” have become the rule.

So what, you say? Well, today we are reaching the point where the connection between essential computer science knowledge and actual implementation in large-scale, mission-critical systems is being lost. Maybe AI can do what we all did back at Bell Labs so very long ago.

But I am really skeptical. Yes, I do see the pronouncements that “with AI anybody can code!!” 

And that’s a problem with the “TikTok approach” and smart software. If the model gets it wrong, there may be no fix. The TikTok/ChatGPT approach won’t be much help either. It has no depth, no understanding.

This write-up basically says you do not need to “flip over the rock”. And that’s bad, because there can be some really shocking stuff beneath that gray, featureless surface. And if you really do not know “how to code” – how these systems work – you’ll kill yourself.