I get way too many bot requests trying to access the Mastodon API endpoints.
A lot of them seem to be well-behaved bots, as they always request robots.txt first.
Would it make sense to add the /api endpoint to robots.txt as disallowed? Or to make robots.txt easily configurable?
(I know I can change the code to block this endpoint, but that's not a very elegant solution.)
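For clarity, something like this is what I have in mind (just a minimal sketch; the exact path prefix depends on how the API routes are mounted):

    User-agent: *
    Disallow: /api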
Hey,
Could you try it first by updating your robots.txt locally, to see how effective it would be (and report back)?
Thanks!
Sure, I'll do that now and check next week.
Indeed, it doesn't seem to make much of a difference :(