I use good ol' obscurity. My reverse proxy requires the correct subdomain to reach any service I host, and my domain has a wildcard DNS entry. So if you access asdf.example.com you get an error, same if you hit my IP directly, but going to jellyfin.example.com works.
And since I don't post my valid URLs anywhere, no web scraper can find them.
This filters out 99% of bots, and the rest are handled by Authelia and CrowdSec.
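The setup described above can be sketched as an Nginx config: a catch-all `default_server` that rejects anything not using a known subdomain, plus one `server` block per real service. Paths, ports, and the `return 444` choice here are illustrative assumptions, not the poster's actual config.

```nginx
# Catch-all: any unknown subdomain (or direct IP access) gets no useful answer.
server {
    listen 443 ssl default_server;
    server_name _;
    ssl_certificate     /etc/ssl/wildcard.crt;   # wildcard cert, paths are illustrative
    ssl_certificate_key /etc/ssl/wildcard.key;
    return 444;  # nginx-specific: close the connection without replying
}

# Only the exact, unpublished subdomain reaches the service.
server {
    listen 443 ssl;
    server_name jellyfin.example.com;
    ssl_certificate     /etc/ssl/wildcard.crt;
    ssl_certificate_key /etc/ssl/wildcard.key;

    location / {
        proxy_pass http://127.0.0.1:8096;  # Jellyfin's default HTTP port
        proxy_set_header Host $host;
    }
}
```

With a wildcard DNS record, every subdomain resolves to the same IP, so the `server_name` match is the only thing separating real services from the catch-all.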
Are you using HTTPS? It's highly likely that your domains/certificates are being logged for Certificate Transparency. Unless you're using wildcard certificates, it's very easy to enumerate your subdomains.
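To illustrate the enumeration point: CT log search services like crt.sh expose issued certificates as JSON, where each entry's `name_value` field lists the certified hostnames. This sketch parses a made-up payload in that shape (the records are fabricated for the example):

```python
import json

# Sample records in the shape crt.sh returns from
# https://crt.sh/?q=example.com&output=json (fields trimmed; data is made up).
sample = json.dumps([
    {"name_value": "jellyfin.example.com"},
    {"name_value": "nextcloud.example.com\n*.example.com"},
    {"name_value": "jellyfin.example.com"},
])

def subdomains(ct_json: str) -> set[str]:
    """Collect unique hostnames from CT entries (one entry may list several names)."""
    names = set()
    for entry in json.loads(ct_json):
        names.update(entry["name_value"].splitlines())
    return names

print(sorted(subdomains(sample)))
```

Note the asymmetry: per-subdomain certificates leak every hostname to the logs, while a wildcard certificate only leaks `*.example.com`, which tells a scanner nothing about which subdomains actually exist.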
If you're using jellyfin in the URL, that's an easily guessable name; if you instead use random words unrelated to what's being hosted, e.g. salmon.example.com, the chances are lower. Also, ideally your server should reply with a 200 to all subdomains so scrapers can't tell valid from invalid ones. And ideally it also sends some random data in each of those responses so they don't all look identical. But that's approaching paranoid levels of security.
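The "paranoid" variant, returning 200 with randomized filler for every probed subdomain, could look something like this. The function name and the HTML wrapper are my own invention, just to show the idea that status code and body size give nothing away:

```python
import secrets

def decoy_response(host: str) -> tuple[int, bytes]:
    """Return 200 with random filler for any unknown subdomain, so a scanner
    can't distinguish real services from decoys by status code or body size."""
    filler = secrets.token_hex(secrets.randbelow(512) + 64)  # random length and content
    body = f"<html><body>{filler}</body></html>".encode()
    return 200, body

# Two different probes get the same status but different-looking pages.
status_a, body_a = decoy_response("asdf.example.com")
status_b, body_b = decoy_response("qwer.example.com")
```

In practice you'd wire this into the catch-all virtual host of the reverse proxy rather than run a separate app, but the principle is the same: make invalid subdomains indistinguishable from valid ones.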
Of course I get a bunch of scanners hitting ports 80 and 443. But if they don't use the correct domain, they all end up on an Nginx server hosting a static error page. Not much they can do there.
That reminds me... another annoying thing Google did was list my private Jellyfin instance as a "deceptive site" after crawling it uninvited.