
Proposed Security.txt will work like Robots.txt



Ed Foudil, a web developer and security researcher, has submitted a draft to the IETF (Internet Engineering Task Force) seeking standardization of security.txt, a file that webmasters can host at their domain root to describe the site's security policies. The file is akin to robots.txt, the standard websites use to communicate policies to web and search engine crawlers....
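As a rough sketch of the idea (the exact field names are still subject to change while the draft is under discussion, and the domain and URLs below are placeholders), a security.txt file served at the root of a site might look something like this:

```
# Hypothetical /security.txt -- fields based on those proposed in the draft
Contact: security@example.com
Encryption: https://example.com/pgp-key.txt
Acknowledgement: https://example.com/hall-of-fame.html
```

A security researcher who finds a vulnerability could then check https://example.com/security.txt to learn where to report it, rather than hunting for a contact address.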

Read more about Proposed Security.txt will work like Robots.txt on Lunarsoft.


