Feel free to not use a debugger for your software. But I don’t hate myself so I’m going to stick to using one whenever possible.
Principal Engineer for Accumulate
Because I don’t hate myself
Ethan@programming.dev to Programmer Humor@programming.dev • The #1 Programmer Excuse for Legitimately Slacking Off (2026 Edition) (English)
5 points · 19 days ago
We need better distributed connectivity. It wouldn’t be that hard to build a project management system (issues, etc.) on top of Git, but DVCS only gets you so far without a way to connect directly to the other contributors.
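To make the idea concrete, here’s a minimal Go sketch of one way to do it, shelling out to the git CLI (in the spirit of tools like git-bug). The `refs/issues/*` namespace and the `Issue` schema are invented for illustration, not an established convention:

```go
// issuestore.go - a sketch of issue tracking on top of Git.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"os/exec"
	"strings"
)

type Issue struct {
	Title  string `json:"title"`
	Body   string `json:"body"`
	Status string `json:"status"`
}

// createIssue writes the issue into Git's object database and points a
// ref at it, so it can be pushed and fetched like any branch or tag.
func createIssue(id string, is Issue) error {
	data, err := json.Marshal(is)
	if err != nil {
		return err
	}
	// `git hash-object -w --stdin` stores the blob and prints its object ID.
	hash := exec.Command("git", "hash-object", "-w", "--stdin")
	hash.Stdin = bytes.NewReader(data)
	out, err := hash.Output()
	if err != nil {
		return err
	}
	oid := strings.TrimSpace(string(out))
	// Point refs/issues/<id> at the blob so it rides along with the repo.
	return exec.Command("git", "update-ref", "refs/issues/"+id, oid).Run()
}

func main() {
	if err := createIssue("42", Issue{Title: "Flaky test on CI", Status: "open"}); err != nil {
		fmt.Println("error:", err)
	}
}
```

Because the issues live in ordinary refs, `git push origin 'refs/issues/*:refs/issues/*'` would sync them like branches - which is exactly where the connectivity point above bites: you still need a reachable remote or a direct peer connection.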
They’re different signals. The default handling is the same - terminate - but they’re triggered by different things, and a process that catches them can handle each with a separate handler.
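As a concrete illustration in Go (with SIGINT and SIGTERM standing in, since the parent comment isn’t visible here): both default to terminating the process, but a process that catches them can route each to its own handler.

```go
// Both signals terminate the process by default, but a process that
// catches them can route each one to its own handler.
package main

import (
	"fmt"
	"os"
	"os/signal"
	"syscall"
)

func main() {
	sigint := make(chan os.Signal, 1)
	sigterm := make(chan os.Signal, 1)
	signal.Notify(sigint, syscall.SIGINT)   // e.g. Ctrl-C in a terminal
	signal.Notify(sigterm, syscall.SIGTERM) // e.g. plain `kill <pid>`

	select {
	case <-sigint:
		fmt.Println("caught SIGINT: user interrupt, bail out quickly")
	case <-sigterm:
		fmt.Println("caught SIGTERM: shutdown request, flush state and exit")
	}
}
```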
I tried for a good 20-30 minutes to get it to generate something that worked. It just gave me a sequence of garbage. Each one looked plausible but as soon as I really read the code it was clear there were issues. Different issues each time. I wasted more time trying to get it to produce correct code than it took to write it myself, once I gave up on the AI.
For me the most productive way to use AI is to ask it to suggest a strategy and to implement that myself. If the code is simple enough for the AI to get it right, I can often write it myself with less effort. If it’s something that requires multiple reprompts to get right, I can almost always do it faster myself.
Or something more complicated than it can handle. Ask it to implement an ASN.1 floating point parser. It looks good but the closer you look at the code the more trash it is.
Ethan@programming.dev to Programmer Humor@programming.dev • y'all are gonna hate me for this, but it's the truth (English)
11 points · 27 days ago
A good teacher will do that. And even the best teacher can’t force you to learn if you don’t want to. LLMs can be a powerful tool for learning if you’re sufficiently motivated.
Ethan@programming.dev to Programmer Humor@programming.dev • y'all are gonna hate me for this, but it's the truth (English)
21 points · 27 days ago
AI is an extremely useful tool. I’m on the other end of the career track, but it seems to me that it’s almost like having a personal tutor. And as with any other teacher, if you use it as an aid to figure things out yourself, I imagine it will help immensely; but if you use it as a crutch to do your work for you, your skills will be as weak as someone who cheats off their friends in school. I attribute a large part of my skills to spending lots of time reading other people’s code and understanding why they wrote it the way they did (usually because some library didn’t do what I wanted, so I figured out how to beat it into submission out of pure stubbornness). If you use the AI as an aid and spend the time to really understand the code it’s producing (and the flaws in that code), I think you’ll build up your skills well.
My rant about code monkeys was inspired by people I’ve interviewed and worked with who had to be told exactly how to solve a problem because they apparently had zero problem-solving skills themselves. The “programming is just writing code” attitude drives me up the fucking wall, and “LLMs are going to make programmers obsolete” is just the latest iteration of that bullshit.
Ethan@programming.dev to Programmer Humor@programming.dev • y'all are gonna hate me for this, but it's the truth (English)
18 points · 27 days ago
Knowing how to write code has only ever been half (or less) of the job. A real programmer solves problems with code, especially problems that aren’t like any they’ve seen before. Someone who can write code but can’t solve problems, or can only ‘solve’ problems they’ve seen before, is just a code monkey. AI can regurgitate code it’s seen before (that is, code it was trained on), and it can do some problem solving, but it falls on its face quickly if you ask it to do anything complex (at least by my metric of what is complex).
Ethan@programming.dev to Programmer Humor@programming.dev • Don't touch my garbage! (English)
1 point · 3 months ago
I make my code open source and public so people can use it if they find it useful, not because I expect anyone to contribute.
And there’s a big fucking difference between actively hostile and “I’m not interested in accepting this change”.
This is the only community I subscribe to that has memes. I wasn’t being precise, but I know I’ve seen this before, recently, more than once.
He can know about it as a concept without really understanding it, and if he treats the dev team as a code-producing machine then he could be ignorant of how much technical debt there is. Or maybe there’s an asshat on the dev team telling him there’s no technical debt.
The boss probably isn’t lying about no technical debt. He’s probably just too dumb or ignorant to know about it.
That would be a reasonable take if this hadn’t been reposted twice in the last month.
Honestly I didn’t really follow OP’s meme or care enough to understand it, I’m just here to provide some context and nuance. I opened the comments to see if there was an explanation of the meme and saw something I felt like responding to.
Edit: Actually, I can’t see the meme. I was thinking of a different post. The image on this one doesn’t load for me.
“The answer we’ve all been waiting for” is a flawed premise. There will never be one language to rule them all. Even completely ignoring preferences, languages are targeted at different use cases. Data scientists and systems programmers have very different needs. And preferences are huge. Some people love the magic of Ruby and hate the simplicity of Go. I love the simplicity of Go and hate the magic of Ruby. Expecting the same language to satisfy both groups is unrealistic because we have fundamentally different views of what makes a good language.
It is being used. Objective-C (used for macOS and iOS apps) has used reference counting since the language was created. Originally it was manual, but since 2011 it’s been automatic by default. And Swift (which basically replaced Objective-C) only supports ARC - it does not support manual reference counting. The downside is that reference counting doesn’t handle cycles, so the programmer has to be careful to prevent those (there’s a sketch of the problem below). Also, the compiler has to insert the reference increment and decrement calls, and that’s a significant engineering challenge for the compiler designers.

Rust tracks ownership instead of references, but that means its compiler is even more complicated. Rust’s system is a little bit like compile-time reference counting, though that’s not really accurate. Apparently Python, Perl, and PHP use reference counting too, plus tracing GC (aka ‘normal’ GC) in Python and PHP to handle cycles.

So your implicit assumption that reference counting is not widely used is false. Based on what I can find online, Python and JavaScript are by far the most used languages today and are roughly equal, so in that respect reference-counting GC is as popular as, or possibly more popular than, pure tracing GC.
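Here’s a hedged sketch of that cycle problem. Go itself uses a tracing GC, so the counter below is a toy model of pure reference counting, invented just to show why cycles leak under that scheme:

```go
// Two nodes that strongly reference each other keep each other's count
// above zero forever - this is why ARC code needs weak references.
package main

import "fmt"

type Node struct {
	refs int
	peer *Node // a "strong" reference under our toy counting rule
}

func release(n *Node) {
	n.refs--
	if n.refs == 0 {
		fmt.Println("node freed")
		if n.peer != nil {
			p := n.peer
			n.peer = nil
			release(p) // freeing n drops its strong reference to the peer
		}
	}
}

func main() {
	a := &Node{refs: 1} // held by main
	b := &Node{refs: 1} // held by main
	a.peer, b.peer = b, a
	a.refs++ // b now points at a
	b.refs++ // a now points at b

	release(a) // main lets go of a: refs drops to 1, not freed
	release(b) // main lets go of b: refs drops to 1, not freed
	// Neither count ever reaches zero, so pure reference counting never
	// frees this cycle. A tracing collector would reclaim both nodes.
	fmt.Println("a.refs =", a.refs, "b.refs =", b.refs) // prints 1 and 1
}
```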
OK, I concede the point - “garbage collection” technically includes reference counting. However, the practical point remains: reference counting doesn’t come with the same performance penalties as ‘normal’ (tracing) garbage collection. It has essentially the same performance characteristics as manual memory management, because that’s essentially what it’s doing.
TinyGo isn’t that much slower, and it uses LLVM.
Garbage collection means analyzing the heap and figuring out what can be collected. Reference counting requires the code to increment or decrement a counter and frees memory when the counter hits zero. They’re fundamentally different approaches. Also, reference counting isn’t necessarily automatic - Objective-C had manual reference counting from day one.
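A rough sketch of that mechanism, modeling Objective-C-style manual retain/release in Go (the type and method names are invented; Go’s runtime doesn’t expose anything like this):

```go
// A toy Buffer with manual retain/release semantics: callers bump the
// count when they take a reference, and the resource is freed the moment
// the count hits zero - no heap analysis involved.
package main

import "fmt"

type Buffer struct {
	refs int
	data []byte
}

func NewBuffer(n int) *Buffer {
	// The creator starts out holding the only reference.
	return &Buffer{refs: 1, data: make([]byte, n)}
}

func (b *Buffer) Retain() { b.refs++ }

func (b *Buffer) Release() {
	b.refs--
	if b.refs == 0 {
		b.data = nil // stand-in for actually freeing the memory
		fmt.Println("buffer freed")
	}
}

func main() {
	buf := NewBuffer(1024)
	buf.Retain()  // a second owner takes a reference: refs = 2
	buf.Release() // first owner is done: refs = 1, nothing happens
	buf.Release() // last owner is done: refs = 0, freed immediately
}
```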
I, on the other hand, would not consider someone a serious, competent professional if they were unable to do their job without an IDE. Sure, every serious developer I’ve known uses an IDE or similar for day to day work, but that’s a matter of convenience. In my book “competence” includes being able to do your job without needing your hand held by the IDE.