A few months after I started this blog, I experienced an influx of traffic like never before. I wrote an article that went "viral" on both Hacker News and Reddit.
The moment it hit the front page, my tiny server went into full meltdown. Requests flooded in like a tsunami, Apache struggled to stay afloat, and I sat helplessly, restarting the machine over and over, like a firefighter trying to put out a wildfire with a squirt gun. In internet parlance, this is called the Hug of Death.
It's hard to convey just how intense the flood of requests was or the stress it caused me at the time. But I want to bring it to life with a dynamic animation.

Just this February, I wrote another post that shot up to #1 on Hacker News in minutes. This time, I was prepared. I saved a copy of my server logs and built a visualization to illustrate exactly what my tiny $6/month server endured.
Web Request Visualization
Sorry, mobile users: use landscape mode for now.
Hit the play button and watch.
Each web request is represented by a circle traveling toward the server. Check the bottom right for the legend:
- Bots vs. real users: Based on user-agent detection. Legit bots often contain "bot" in their name, while others are identified heuristically (a rough sketch of this follows the list).
- Response types:
  - 200 OK: Successful request
  - Redirects: Represented as wiggling dots
  - 404 Not Found: A red dot falling off the screen
  - Zip bombs: More on that later
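For the curious, the classification behind that legend boils down to a bit of log parsing. Here is a minimal sketch in Python, assuming Apache's combined log format; the regex and the bot keywords are illustrative, not the exact heuristics behind the animation:

```python
import re

# Rough parser for Apache's "combined" log format (illustrative, not exhaustive).
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Very rough bot heuristic: "bot" in the user agent, plus a few common giveaways.
BOT_HINTS = ("bot", "crawler", "spider", "curl", "python-requests")

def classify(line: str):
    m = LOG_RE.match(line)
    if m is None:
        return None
    agent = m.group("agent").lower()
    is_bot = any(hint in agent for hint in BOT_HINTS)
    status = int(m.group("status"))
    if 200 <= status < 300:
        kind = "ok"          # successful request
    elif 300 <= status < 400:
        kind = "redirect"    # the wiggling dots
    elif status == 404:
        kind = "not_found"   # the red dots falling off the screen
    else:
        kind = "other"
    return {"bot": is_bot, "kind": kind, "path": m.group("path")}
```

Anything whose user agent matches one of the hints is drawn as a bot; everything else counts as a real visitor.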
Server Specs
Despite the chaos, my tiny $6/month server, armed with just 1GB of RAM, Apache 2, and a basic MySQL database, stood its ground. No fancy cloud autoscaling, no load balancers. Just a lean setup and good caching.
- Hosting: DigitalOcean (1GB RAM)
- Web Server: Apache 2
- Environment: Ubuntu + PHP
- Database: MySQL
- Price: $6/month
The blog runs on a custom framework. Most pages are cached in memcached, so the database is only queried about once an hour per page. This efficiency allowed the same setup to handle millions of requests in the past, including my viral stories about getting fired by a machine and being featured on the BBC.
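For context, the caching pattern is nothing exotic. Here is a minimal sketch of the idea in Python using pymemcache (the blog actually does this in PHP, and `build_page_from_database` is a hypothetical stand-in for the real rendering code):

```python
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))
PAGE_TTL = 3600  # seconds; pages are rebuilt from the database roughly once an hour

def render_page(slug: str) -> bytes:
    key = f"page:{slug}"
    html = cache.get(key)
    if html is None:
        # Cache miss: this is the only time the database gets touched for this page.
        html = build_page_from_database(slug)  # hypothetical rendering function
        cache.set(key, html, expire=PAGE_TTL)
    return html
```

With a one-hour TTL, even a front-page stampede reaches MySQL about once per page per hour; everything else is served straight from memory.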
Timeline of Events
- 4:43 PM (PST): Post submitted to Hacker News.
- 4:53 PM (PST): Hits the homepage. A flurry of bots swarms in.
- 5:17 PM (PST): #1 on Hacker News. The floodgates open.
- 8:00 PM (PST): Mods rename the article (for reasons unknown). Traffic plummets.
- 3:56 AM (PST): A bot scans 300 URLs searching for vulnerabilities.
- 9:00 AM (PST): Traffic surges again, largely from Mastodon networks.
- 9:32 AM (PST): Massive spam attack: ~4,000 requests in one minute, mostly advertising darknet markets (a quick way to spot spikes like this is sketched after the timeline).
- 4:00 PM (PST): In 24 hours, my server handled 46,000 requests.
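Spikes like that 9:32 AM spam burst jump out as soon as you bucket the access log per minute. A rough sketch, reusing the `LOG_RE` parser from the earlier snippet and assuming Apache's default timestamp format:

```python
from collections import Counter

def requests_per_minute(log_path: str) -> Counter:
    per_minute = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_RE.match(line)  # parser from the earlier sketch
            if m:
                # "10/Oct/2000:13:55:36 -0700" -> "10/Oct/2000:13:55"
                per_minute[m.group("time")[:17]] += 1
    return per_minute

# The five busiest minutes of the day:
# for minute, count in requests_per_minute("access.log").most_common(5):
#     print(minute, count)
```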
The Elephant in the Room
The server never crashed. In fact, CPU usage never exceeded 16%.
But you might have noticed something odd in the visualization: my 1GB RAM server is constantly at 50% memory usage. Why? MySQL.
When I started this blog, I ambitiously logged every single request into a database. This was useful for tracking post popularity. But over 12 years later, the database ballooned. Sorting through millions of rows for simple analytics became a costly operation.
After this viral surge, I made a backup and deleted the table. It was time.
Zip Bombs: Fighting Back Against Content Thieves
You may have noticed small explosions in the animation. Let me explain.
One day, I discovered a site was stealing my content in real time. Whenever someone visited their page, they'd fetch my blog post, swap out my branding, and serve it as their own.
At first, I fought back manually, feeding them fake data. But that got old fast. So I deployed my secret weapon: a zip bomb.
When their bot accessed my site, I served it a tiny compressed file. Their server eagerly downloaded and decompressed it, only to unleash several gigabytes of chaos. Boom. Game over.
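If you have never built one: a zip bomb is just a file that is tiny on the wire but enormous once decompressed. A minimal sketch of how one could be generated in Python (the sizes and filename are illustrative, not what I actually serve):

```python
import gzip

# Write ~10 GB of zeros through gzip; zeros compress down to a few megabytes on disk.
with gzip.open("bomb.gz", "wb", compresslevel=9) as f:
    chunk = b"\0" * (1024 * 1024)   # 1 MB of zeros
    for _ in range(10 * 1024):      # repeated ~10,000 times, about 10 GB total
        f.write(chunk)
```

Serve that file with a `Content-Encoding: gzip` header, and a well-behaved HTTP client will decompress it automatically, chewing through its own memory or disk in the process.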
Over the years, zip bombs have become my shield against bots attempting to scrape, exploit, or abuse my site.
Lessons Learned
- Dang (Hacker News mod) will change your article title, and there's nothing you can do about it.
- Most traffic comes from bots, not humans. They scan, scrape, and spam relentlessly.
- Apache's worker limits matter. I discovered two running instances with a max of 75 workers, something I hadn't noticed before (see the config sketch after this list).
- A well-optimized, lightweight setup beats expensive infrastructure. With proper caching, a $6/month server can withstand tens of thousands of hits; no need for Kubernetes.
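For reference, that worker ceiling lives in the MPM configuration. On Ubuntu with the prefork MPM it looks roughly like this (illustrative values; on a 1GB box you want MaxRequestWorkers low enough that Apache cannot swap itself to death):

```apache
# /etc/apache2/mods-available/mpm_prefork.conf (illustrative values)
<IfModule mpm_prefork_module>
    StartServers             5
    MinSpareServers          5
    MaxSpareServers         10
    MaxRequestWorkers       75
    MaxConnectionsPerChild   0
</IfModule>
```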
Final Thoughts
This experience taught me more than just server management. Watching the web request visualization unfold gave me a new appreciation for how traffic flows, how bots operate, and how even the simplest optimization decisions can make the difference between a crash and smooth sailing.
And most importantly: if you're going to go viral, be ready.