This is cool. I am a total hypocrite; I say I blog for the love of it and that being a slave to analytics is terrible, but in reality I love the sense of immediate feedback when I see a bunch of hits on a project I spent hours on.
I did end up implementing a simple hit counter on my site just to satisfy my craven need for validation without resorting to full analytics. It doesn't beep at me, but maybe it should.
I did my part and manually reloaded the page about once a second for 5 minutes so that Andrew could get their dev validation beep quota in for the day (unless it's not naive hits but unique users, in which case this has been a fantastically hilarious waste of time).
Ok, I've played Voyage of the Marigold for a little bit now. Is it possible to get out of the shoals? Also, can you beat the dreadnaught near the last two sectors with 1 torpedo and lasers, or do you need 2 torpedoes?
Thank you for playing my game. Yes, it is possible to make it to the top right sector and deliver your cargo. It is not easy - I recommend choosing the option to take extra fuel right at the start since running out of fuel is usually what kills most runs.
The dreadnaught is tough but not impossible. Torpedoes will help but a few lucky hits with lasers will also do the job.
Ink is such a cool language. I worked really hard on that game and entered it into an interactive fiction competition last year. It didn't come close to winning (I didn't expect it to) but people seemed to like it, especially if they grew up watching Star Trek.
I picture this like the classic Garfield comic where Jon just stares in increasing frustration at his rotary phone for multiple panels, before finally shouting JUST RING ALREADY.
(His cat adds some dry remark which I have forgotten)
Now see, I'm curious: would it be better if we had no context on what the comic _is supposed_ to be? Or is this only hilarious in comparison to the typical "i hate mondays, how many pieces of flair" type schtick the original goes for? Honestly I think it's the latter, 'cause after 20 of these or so they kinda lose their appeal except for the very occasional guffaw. Though, is that still a higher laugh count than the original? I'm not about to read Garfield to find out.
https://garfieldminusgarfield.net/
Actually no. My hit-counter uses javascript which filters out almost (but not quite) all of the bots. It probably misses some real users that have javascript turned off.
This is what I did as well. Not wanting to take away my users' privacy, I built my own simple counter in 2022. I wrote about it in "The Raspberry Pi 400 in My Bedroom" on my blog at the URL below.
https://joeldare.com/private-analtyics-and-my-raspberry-pi-4...
I just downloaded a click sound and I think I'm going to see if adding it drives me crazy.
Good question. It is not bad to enjoy attention for a project you worked on.
But I feel that, if unchecked, that same impulse can lead to deliberately doing projects specifically for validation, which leads to low-quality clickbait and vapid self-promotion. I think a healthy indifference to the public at large is a good thing.
That is one of the reasons I got rid of detailed, real-time analytics in favor of a simple hit counter (the other is privacy). If I really stuck to my principles I wouldn't even do that but I am a hypocrite.
When I was hosting a site run on bespoke PHP pages, I had a hit counter that used straight text files under the hood. It was surprisingly effective and a fun experience.
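(For the curious: roughly the same trick in shell. A rough sketch, with a hypothetical file name, ignoring concurrent writes, which the PHP version probably did too.)
hits=$(cat hits.txt 2>/dev/null || echo 0)   # read the current count, defaulting to 0
echo $((hits + 1)) > hits.txt                # write it back incremented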
Just goes to show that even the most obscure comments can net thousands of views, considering only a small percentage of people who read a comment will actually engage, and that small percentage here was over 4k folks. Kind of puts things in perspective for me.
The Fish Doorbell would be a great use case for AI, but I'd rather live in a world where volunteers just watch the video and ring a bell whenever a fish wants to get through.
The point of the fish doorbell is educating people about what lives in the water. There would be much less resource-intensive ways of "solving" the problem, if that was the goal.
Fun. You can tell it's receiving some love right now:
while true; do sleep 5; curl http://susam.net:8000; done
curl: (1) Received HTTP/0.9 when not allowed
curl: (1) Received HTTP/0.9 when not allowed
curl: (7) Failed to connect to susam.net port 8000 after 11 ms: Couldn't connect to server
curl: (56) Recv failure: Connection reset by peer
curl: (7) Failed to connect to susam.net port 8000 after 8 ms: Couldn't connect to server
curl: (1) Received HTTP/0.9 when not allowed
curl: (7) Failed to connect to susam.net port 8000 after 8 ms: Couldn't connect to server
curl: (1) Received HTTP/0.9 when not allowed
curl: (7) Failed to connect to susam.net port 8000 after 10 ms: Couldn't connect to server
curl: (7) Failed to connect to susam.net port 8000 after 11 ms: Couldn't connect to server
curl: (56) Recv failure: Connection reset by peer
curl: (56) Recv failure: Connection reset by peer
curl: (1) Received HTTP/0.9 when not allowed
You might want to add the --http0.9 flag to curl, to tell it that getting a response of just "ok" (HTTP 0.9 style, body only without headers) isn't an error.
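Combined with the 5-second loop above, that would be something like:
while true; do sleep 5; curl -s --http0.9 http://susam.net:8000; done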
Here's a more advanced - and 'ancient' (2000) - version of this idea: Peep (The Network Auralizer): Monitoring Your Network With Sound [1].
I ran this for a number of months back in the day; it made my living room sound like a jungle. Running the same setup nowadays would probably make it sound like the gates of hell, given the increase in network traffic.
You can still find it at SourceForge, but it will need some work, or maybe a VM running an older Linux distribution:
https://sourceforge.net/projects/peep/
[1] https://www.usenix.org/legacy/publications/library/proceedin...
Sometime circa 1998 there was a group looking for new technical hires for startups they invested in. They posted somewhere, perhaps /., that they were accepting résumés via SMTP on a non-standard port, as a filter mechanism.
I never heard back, although I ended up working for one of their companies the next year anyway.
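(For anyone who never had to do this by hand, applying presumably meant a manual SMTP session; the host, port, and addresses here are invented:)
nc jobs.example.com 2525
HELO applicant.example.org
MAIL FROM:<me@example.org>
RCPT TO:<resumes@example.com>
DATA
Subject: Resume

Resume text goes here...
.
QUIT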
Around the same time people sometimes posted job openings in the html source of their websites. I never answered any, 'cause I wasn't looking for technical jobs at the time, but it always seemed clever to me.
Any, and only, nerds who were interested in web development incessantly "View Source"ed on every page that looked interesting. It was a major vector by which early-web frontend techniques spread themselves, and it was great: you could cut-and-paste the html, direct download the .css and other resources, and get an offline model of their site running for you to tinker with to learn their secrets. All the magic was out in the open (for those who cared to pull back the curtain), and the future seemed limitless.
> At first, I fought back manually, feeding them fake data. But that got old fast. So I deployed my secret weapon: a zip bomb.
> When their bot accessed my site, I served it a tiny compressed file. Their server eagerly downloaded and decompressed it, only to unleash several gigabytes of chaos. Boom. Game over.
How did you know their bot would decompress it? I thought a bot would copy the HTML content of your article, maybe the images, and paste them on their own website. At no point does it involve editing or decompressing files?
Impressive animation, by the way—the number of bots is staggering.
As soon as bots reach a page with the compressed payload, they never make another request. That's how I know it worked. Also, curl, wget, and most HTTP libraries automatically decompress gzip content.
Of course, some bots just post spam without ever reading the content back, which defeats my scheme.
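(A sketch of such a payload using standard tools; the sizes are illustrative, not the author's actual numbers:)
dd if=/dev/zero bs=1M count=10240 2>/dev/null | gzip -9 > bomb.gz  # ~10 GB of zeros shrinks to roughly 10 MB
# serve bomb.gz with the header "Content-Encoding: gzip" so clients inflate it automatically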
> The other party can use whatever client they have to connect to port 8000 of my system, e.g., a web browser, nc HOST 8000, curl HOST:8000, or even, ssh HOST -p 8000, irssi -c HOST -p 8000, etc.
Now I feel a bit like writing some tool to automatically follow your posts (easy with existing HN replies), do a semantic analysis on them to determine when you will make that mistake again, and send some alert (the hard/expensive part).
I put an unsecured open FTP server on the internet about 20 years ago, just to see what would happen.
Within half a day I had some pirate "marking" their claim to my FTP server; then they started uploading a game. I deleted everything and left it open again.
It was a long time ago, so I don't remember all the details, but all the pirates would create directories inside directories, upload files, then tag them with their mark. All of this was scripted, I gather.
After a while, I set up a file system watcher that deleted subdirectories. This gave me an FTP server I could use for anything. I shut it down a few months later.
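(Not the original watcher, but inotify-tools would make short work of it today; the FTP root path is hypothetical:)
inotifywait -m -e create --format '%w%f' /srv/ftp | while read -r newpath; do
    [ -d "$newpath" ] && rm -rf "$newpath"  # delete any freshly created subdirectory
done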
When this gets popular, what does the author do at night? Sleep as far away as possible from the terminal so that the beeping doesn’t keep them awake? Or is the terminal at work? HN needs to know this vital information!
This has been the way it is for a long time, even before the AI startups got going. Seems everyone and their mother has built some sort of HN aggregator with built-in link scraping.
Totally. Most websites where you show and vote for a product (HackerNews, ProductHunt, etc.) are riddled with bots.
For example, one person here offers:
> Our AI generates relevant, useful replies to selected mentions, that aim to genuinely help the original poster, and that include a subtle mention of your product.
and ProductHunt: https://wakatime.com/blog/67-bots-so-many-bots
When I originally shared this in 2022, it was just a comment here: https://news.ycombinator.com/item?id=30146019#30146451
I didn't expect it to get much attention, so I went with four beeps. I felt that a set of four beeps spread over three seconds would be long enough to grab my attention, even if I was busy with something else.
Also, as another commenter pointed out, a sequence of four beeps is distinctive enough that it doesn't get lost among the stray, ordinary beeps I might hear while working in a terminal or Emacs (like from hitting backspace or pressing ctrl+g, etc.).
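(For the curious: a minimal sketch of such a listener, assuming the Debian "traditional" netcat; not necessarily the exact setup here.)
while true; do
    printf 'ok\n' | nc -l -p 8000 -q 1 > /dev/null  # answer one connection with "ok", then exit
    for i in 1 2 3 4; do printf '\a'; sleep 1; done # four bells spread over about three seconds
done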
I clicked the link to see why it was the hug of "death", only to realize after reading that it was the hug of "deaf". I wonder what the unique user count was.
In the current graphs, the x-axis represents the hour of the day in UTC. For example, the first graph shows 43 connections between 10:00 UTC and 11:00 UTC, 407 connections between 11:00 UTC and 12:00 UTC, and so on.
Previously, the first graph showed the number of hours elapsed since the experiment began (which started at 10:14 UTC). That's likely what you saw earlier today. After reading your question, I updated the graph to display the actual hour of the day (in UTC) instead of elapsed time, making the time of the day clearer. Thanks for the great question!
Good stuff. Cheers.
I really hate that modern websites include multiple trackers - there is really no need for invasive analytics.
Talking about that, I have a great blog that…
Just kidding
https://visdeurbel.nl/en/
I'd've said the benefit is that it's simply a concise single command instead of a "while true" loop and a "sleep 5" command.
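Presumably referring to watch(1), i.e. something like:
watch -n 5 'curl -s --http0.9 http://susam.net:8000'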
The best kind of experiments. And sometimes huge innovations, inventions, medicine, progress, or just more fun arise from them.
https://en.m.wikipedia.org/wiki/Trojan_Room_coffee_pot
https://github.com/NARKOZ/hacker-scripts
https://idiallo.com/blog/surviving-the-hug-of-death (sorry not mobile friendly)
There is a surprising number of bots. It will be fun to set up something like this whenever I get HN traffic.
> The other party can use whatever client they have to connect to port 8000 of my system, e.g., a web browser, nc HOST 8000, curl HOST:8000, or even, ssh HOST -p 8000, irssi -c HOST -p 8000, etc.
> curl -v --http0.9 susam.net:8000
Interesting though.
You can type it in a terminal with ctrl-g. It won't be displayed in most cases and, if you've configured your terminal like mine, it won't make a sound either.
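From a script, the same character is just BEL (0x07):
printf '\a'  # rings the terminal bell, same as ctrl-g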
It's kinda risky to put something like this in the comments; what if nobody ever sees it? What if it never beeps?
It's just weird enough that people (like myself) would do it. I would have if I saw it, but I missed it.
Such a shame susam.net still has not adopted IPv6 in 2025 :-Q
Hi everyone, my VPN is running slow. Anyone have any tips?
Look at the quoted command, there it is.