From programming insights to storytelling, it's all here.
Once, I inherited an application so offensive, it felt like a personal insult. It wasn’t just bad code, it was a decade of bad decisions stacked like a Jenga tower. My task was to address a complaint from internal users about a report the tool generated. I looked in the company’s git repository for it, but it was nowhere to be found. Only after digging through the server it ran on did I discover it was using the Concurrent Versions System, or CVS, a version control system older than some interns.
Every developer knows the rush. You’re driving and suddenly you’re struck by a “life-altering” idea (your 14th this week). At the next red light, you record a voice memo while driving, avoiding eye contact with what clearly looks like a cop’s car. At 2 AM, you wake abruptly, remembering the recording. Now you’re setting up repositories, debating frameworks, and buying AWS servers in the middle of the night. The blind spot? You’re convinced that this time, you’ll finish.
When I built shotsrv, a service that takes screenshots of URLs using PhantomJS, I did what most solo developers do: I opened my IDE, hacked together some Node.js scripts, and celebrated when it spat out its first screenshot. There was no roadmap, no team to consult, and no definition of “success” beyond “it works… mostly.”
Every programmer has a “lightbulb moment” that sparks their love for code. For many of us, it’s video games. You marvel at sprawling open worlds, intricate quests, and heroes who swing swords or fire plasma rifles, and you think, “I want to build something like this.” Then you join the industry and realize: nobody lets you build the sword.
You’ve probably heard of the Trolley Problem. It’s that classic ethical dilemma where you have to choose between saving five people by sacrificing one, or doing nothing and letting five die. It’s a favorite of philosophers, ethicists, and anyone who wants to sound deep at a dinner party. But here’s the thing: when it comes to self-driving cars, the trolley problem is pretty much irrelevant. Why? Because self-driving cars don’t work the way the trolley problem assumes they do. Let’s break it down.
When someone says, “Everything is getting more expensive,” what does that actually mean? Rising prices are easy to observe, but quantifying their impact on daily life requires a deeper look at how income shapes our perception of affordability.
When it comes to generative content, whether it’s video, audio, or images, there’s one group that’s been quick to embrace these tools with open arms: spammers.
I used to pride myself on being the "Google expert." I’d snatch keyboards from unsuspecting hands, dictate search terms like a tyrannical librarian, or laugh at anyone typing fully-formed sentences into Google’s search box. “Type like you’re instructing a machine, not chatting with a friend,” I’d scoff.
When companies develop AI products with the potential to replace workers, they like to sprinkle in a little reassurance: “AI won’t take your job. It’ll handle the repetitive, tedious tasks so you can focus on the more complex, meaningful work.” Sounds fair, doesn’t it? Almost comforting. But it’s not entirely true.
After deploying an AI customer service agent for a large client, the first thing I’d do was wait for customer feedback. Most customers never leave a review, or a Customer Satisfaction Score (CSAT), as it’s commonly known in the industry. But for a large enough client, it was only a matter of minutes before the first responses rolled in. Like clockwork, the initial feedback appeared.