What is the role of robots.txt? Simply put, it has two main roles.
First: block dynamic URLs
Second: block files that should not be indexed
If the website has static URLs enabled, you need to block the dynamic URLs; otherwise both versions will be indexed, and having the dynamic link and the static link coexist hurts search engine optimisation, looks untidy, and is bad for the user experience. So you need to use the robots protocol to block the dynamic URLs and let search engines index only the static URLs, forming a unified URL.
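As a minimal sketch, assuming the site's dynamic URLs carry query strings (a `?` in the path), a robots.txt like the following blocks them for all crawlers. Note that the `*` and `$` wildcards are honoured by major engines such as Google and Bing, but are not part of the original robots exclusion standard:

```
User-agent: *
# Block any URL containing a query string (dynamic URL)
Disallow: /*?
```

The static versions of the same pages remain crawlable, so only the unified static URLs get indexed.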
We can also use the robots protocol to block files containing sensitive information, which improves the security of the website.
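A sketch of this, using hypothetical directory names for illustration; one caution is that robots.txt is itself publicly readable, so it keeps pages out of the index but is not a substitute for real access control:

```
User-agent: *
# Hypothetical paths that should not appear in search results
Disallow: /admin/
Disallow: /backup/
```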
Robots.txt has one more role: blocking a specific search engine. If you don't want a particular search engine to index your site, you can use the robots protocol to block it, but few people do this. SEO exists precisely to gain search engine rankings, so using robots to block a search engine from indexing your site is obviously counterproductive.
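For completeness, blocking a single engine is done by naming its crawler's user-agent. Using Googlebot here purely as an example (any crawler's user-agent token could be substituted):

```
# Block only Google's crawler from the entire site
User-agent: Googlebot
Disallow: /

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```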