I hope you already know what robots.txt is from my previous article, which I have linked below. Now it is worth learning how it works. First, write a basic robots.txt file in Notepad++ in the following format:
User-agent: *
Disallow: /our-placed-students
Disallow: /cgi-bin/
Disallow: /VIET-webworkshop/
Disallow: /MenClothing
Allow: /MenClothing/tshirts
In this file, you mention which website pages or directories you want to allow or disallow, following the format above. After this, your robots.txt file is ready to save: save it with the name robots (the final file name must be the lowercase robots.txt) and set Save as type to Text Documents. Your file is now saved.
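Before uploading, you can sanity-check the rules with Python's standard `urllib.robotparser` module. This is just a quick sketch; `example.com` and the sample paths stand in for your own site:

```python
from urllib.robotparser import RobotFileParser

# The same rules we wrote in Notepad++ above
rules = """\
User-agent: *
Disallow: /our-placed-students
Disallow: /cgi-bin/
Disallow: /VIET-webworkshop/
Disallow: /MenClothing
Allow: /MenClothing/tshirts
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blocked for every bot (*):
print(rp.can_fetch("*", "https://example.com/cgi-bin/script.pl"))  # False
# Anything not matched by a Disallow rule stays crawlable:
print(rp.can_fetch("*", "https://example.com/contact"))            # True
```

One caveat: `urllib.robotparser` applies rules in file order (first match wins), while search engines such as Google use longest-match precedence, so results for overlapping Allow/Disallow pairs like the MenClothing lines can differ between this checker and a real crawler.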
After this, you will upload the file in one of two ways, depending on which of the website types below you have:
- Non-WordPress website
- WordPress website
- Non-WordPress website
- First, log in to your website's cPanel, scroll down, and select File Manager.
- Inside File Manager, open the public_html directory and upload the robots.txt file from the location where you saved it.
- After the upload finishes, File Manager returns to its home view, and you will see your file listed there.
- If you now open your website URL with robots.txt appended, you will see the same robots.txt file in the web browser that you uploaded through cPanel.
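Note that the URL check in the last step always points at the site root: crawlers only ever look for /robots.txt at the top of the host, never in a subdirectory. A small Python sketch makes the point (the helper name `robots_url` and `example.com` are my own hypothetical choices):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url):
    # Crawlers ignore the page path: robots.txt is always fetched
    # from the root of the scheme + host combination.
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://example.com/MenClothing/tshirts"))
# https://example.com/robots.txt
```

This is why the file must sit directly inside public_html and not in any folder below it.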
- WordPress website
- Next, on your own WordPress website, go to your dashboard and open the Tools option in the sidebar.
- Inside the Tools option you will see the Robots.txt Editor; scroll down that page and you will find the robots.txt file there.
- As in the two steps above, add directory paths with the Allow or Disallow option, and write the user agent as a specific bot name (for example, Googlebot) or set * (asterisk) for all search bots.
- Then click the Save Changes option. Be aware that if you set * together with Disallow: /, your site will not be crawled by any search engine or bot.
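If you are unsure what to put in the editor, a reasonable starting point is the default virtual robots.txt that WordPress itself serves (this is WordPress's own default, not specific to any plugin):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

This keeps bots out of the admin area while still allowing the admin-ajax.php endpoint, which some front-end features depend on.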