I'm setting up my web page to support SSR, which brings me to my question: can I detect whether the client is a web crawler, so that I only do SSR for crawlers?
That way I would serve my web page as-is (client-side rendered) to clients that are not web crawlers.
I have seen that Googlebot can be verified as described in https://stackoverflow.com/a/3308728/8991228.
But is there a general way of doing so?
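To make my idea concrete, here is a minimal sketch of the kind of check I have in mind, assuming an Express server written in TypeScript. The bot pattern, the `renderToHtml` function, and the `dist` path are placeholders I made up for illustration, not a real implementation:

```typescript
import express, { Request, Response } from "express";

// Hypothetical list of User-Agent substrings for well-known crawlers.
// This is just a heuristic; the header can be spoofed by any client.
const BOT_PATTERN =
  /googlebot|bingbot|yandex|baiduspider|duckduckbot|slurp|facebookexternalhit|twitterbot|linkedinbot/i;

function isCrawler(req: Request): boolean {
  const userAgent = req.headers["user-agent"] ?? "";
  return BOT_PATTERN.test(userAgent);
}

// Placeholder for the real SSR entry point (e.g. ReactDOMServer.renderToString).
function renderToHtml(url: string): string {
  return `<!doctype html><html><body>Pre-rendered content for ${url}</body></html>`;
}

const app = express();

app.get("*", (req: Request, res: Response) => {
  if (isCrawler(req)) {
    // Crawlers get the server-rendered HTML.
    res.send(renderToHtml(req.url));
  } else {
    // Regular browsers get the normal client-rendered app shell.
    res.sendFile("index.html", { root: "dist" });
  }
});

app.listen(3000, () => console.log("Listening on port 3000"));
```

I'm aware that checking the User-Agent header like this is easy to spoof, which is exactly why I'm asking whether there is a more reliable, general way to identify crawlers.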