@phnt@fluffytail.org imagine using Anubis for a public social networking site smdfkh
RE: https://fluffytail.org/objects/483ac226-65c4-439c-bd25-aa7c62e0d7dc
@yakumo_izuru @phnt I thought of doing so to prevent scraping. But then again, it wouldn't help due to how AP works.
@newt @yakumo_izuru It would work, but you would also have to enable signed fetches.
@phnt@fluffytail.org which would lump you with the Mastodon crowd instead of us :P
@phnt @yakumo_izuru signed fetches aren't exactly helpful, given how most servers operate on a blacklist policy, meaning a brand new server can scrape things like no tomorrow until banned. Then buy a new domain, rinse and repeat.
@newt @yakumo_izuru Pleroma does not even enforce that, I think; only Mastodon and, recently, Akkoma do. GTS also probably does, but who even uses that.
If the request is signed, the signature is valid, and the key is fetchable, you will get the Object no matter whether the instance is on the MRF reject list or not.
So it only prevents scraping by those that don't properly sign their requests (most scrapers). The current scraper tries to sign requests, but it doesn't have a domain name associated with it.
@phnt @yakumo_izuru @newt A domain name is no longer needed, since Google's free CA now issues IP certs. Wouldn't even need that if Mastodon weren't so dependent on WebFinger, which has HTTPS hardcoded in the RFC.
@mint @yakumo_izuru @newt It would, but that's also really easy to mitigate, and I bet that at least GTS would do so.