i gave up on blocking bots
estimated reading time: 1 - 2 min
if you’re older than a seedling in spring, i’m sure you’ve heard of the simpsons. in the series there’s a famous quote: “i’m a sign, not a cop”. you know what else is a sign, not a cop? `robots.txt`. “please don’t scrape my data”, yet it happens anyway.
since this website’s inception, i’ve been blocking about 24 user-agents, slapping `disallow: /` under each one, all until today. today, i finally admitted this shitshow is about as effective as yelling at a tornado to knock it off. sure, chatgpt, claude and their fancy friends claim they’ll politely tiptoe around my robots.txt like it’s a “wet floor” sign (heh, sign, see what i did there). but let’s be real: since when do tech overlords give a damn about playing nice forever? tomorrow they could flip a switch and just crawl shit anyways. besides, how the fuck were they trained in the first place? they gotta pull the fucking data from somewhere, and they can’t pull it out of their own ass quite yet. i’m done gatekeeping. let the digital raccoons rummage through the dumpster pile that is my blog.
because why stop at existential dread? let’s toss in some delusional optimism and the false belief that this won’t be so bad, while we’re at it. maybe i’ll throw a welcome party for those 24 formerly blocked agents, now that my robots.txt has been reduced to `user-agent: *` and `allow: /`. go ahead, index my blog posts. archive my cringy hot takes. do your worst, you relentless code gremlins, just don’t expect me to pretend this isn’t a dumpster fire. i am as cringe as one can get and i don’t have enough fucks to give. so fuck you and fuck your bots, come, the feast is ready.
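for anyone curious what the surrender actually looks like on disk, here’s a rough sketch of the before and after. the bot names below are illustrative examples, not my exact list of 24:

```text
# before: ~24 entries like these, one block per bot
# (gptbot and claudebot are just examples, not the full list)
user-agent: gptbot
disallow: /

user-agent: claudebot
disallow: /

# after: the entire file, everyone gets in
user-agent: *
allow: /
```

strictly speaking the `allow: /` line is redundant, since crawling everything is already the default when nothing is disallowed, but it makes the capitulation nice and explicit.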
got more to say? email me: hi[at]riri[dot]my
posted on: 2025-02-05 05:03 am