  • The only way I can think of is blacklisting everything by default, redirecting visitors to a proper, challenging captcha (which can be self-hosted), and temporarily whitelisting IPs that have proven to be human (see the sketches below).

    A captcha will inconvenience your users, though. If you just want to make things worse for the crawlers, let them spend compute resources through something like https://altcha.org/ (which would still allow them to crawl your site, but would make DDoSing very expensive).
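
    To make the altcha-style cost concrete, here is a rough, hypothetical proof-of-work sketch in plain shell. It illustrates the general idea, not altcha's actual protocol: the client brute-forces a nonce until a hash meets a difficulty target, while the server can verify the result with a single hash.

    ```bash
    #!/usr/bin/env bash
    # Hypothetical proof-of-work sketch, not altcha's real protocol:
    # find a nonce so that sha256(challenge + nonce) starts with
    # "difficulty" zero hex digits. Solving burns CPU on the client;
    # verifying costs the server a single hash.
    challenge="token-the-server-handed-out"   # made-up example value
    difficulty=4                              # required leading zeros
    prefix=$(printf '0%.0s' $(seq 1 "$difficulty"))

    nonce=0
    while true; do
        hash=$(printf '%s%s' "$challenge" "$nonce" | sha256sum | cut -d' ' -f1)
        [[ $hash == "$prefix"* ]] && break
        nonce=$((nonce + 1))
    done
    echo "solved: nonce=$nonce hash=$hash"
    ```

    Raising the difficulty makes each request cost the crawler measurably more CPU than it costs you to verify, which is what makes large-scale scraping or DDoSing expensive.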
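
    For the blacklist-by-default idea in the first paragraph, one hypothetical way to handle the temporary whitelist is an ipset with a timeout, so proven-human IPs expire on their own (the set name, port, and timeout here are made up, and in practice you might do this in the reverse proxy instead of the firewall):

    ```bash
    # Whitelist set whose entries expire automatically after an hour
    ipset create humans hash:ip timeout 3600

    # Visitors not in the set get redirected to the captcha service
    # on :8080; whitelisted IPs reach the real site on :80 untouched
    iptables -t nat -A PREROUTING -p tcp --dport 80 \
        -m set ! --match-set humans src -j REDIRECT --to-ports 8080

    # The captcha backend would run this for every solved challenge
    ipset add humans 198.51.100.23
    ```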

  • Here is my personal approach to this.

    • I have set my bash history to a ridiculous 1000000-line max length, so that I can use CTRL+R to search for commands that I have run before (the exact settings are in the sketch after this list)

    • I write down a lot of commands in a searchable plain-text notes file

    • Ask ChatGPT

    • Use the tldr command for community-written, example-based help pages

    • I've added a LOT of verbose custom aliases and scripts. For example, instead of

    `inotifywait -m -r --exclude "(/tmp.*|/var/cache.*|/dev/pts/|/var/log.*)" -e MOVED_TO -e CREATE -e CLOSE_WRITE -e DELETE -e MODIFY .` (nobody can remember that alphabet gibberish),

    I just type `watch_for_changes .`

    Since it is verbose and straight from my brain, I always remember it, and it works with autocomplete. I have around 30 such commands so far; one is sketched after this list.
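
    As a minimal sketch, the relevant ~/.bashrc pieces for the history setting and one such wrapper might look like this (the inotifywait call is the exact one from above; writing watch_for_changes as a function rather than an alias lets it take a path argument):

    ```bash
    # ~/.bashrc sketch: HISTSIZE/HISTFILESIZE are standard bash variables;
    # watch_for_changes is the wrapper name from the comment above

    # Huge history so CTRL+R can dig up anything ever typed
    export HISTSIZE=1000000
    export HISTFILESIZE=1000000

    # Verbose, autocomplete-friendly wrapper around inotifywait
    # (requires the inotify-tools package)
    watch_for_changes() {
        inotifywait -m -r \
            --exclude "(/tmp.*|/var/cache.*|/dev/pts/|/var/log.*)" \
            -e MOVED_TO -e CREATE -e CLOSE_WRITE -e DELETE -e MODIFY \
            "${1:-.}"
    }
    ```

    After sourcing it, `watch_for_changes` watches the current directory and `watch_for_changes /some/path` watches any other path.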