
SoylentNews is people

posted by n1 on Thursday May 21 2015, @08:28PM   Printer-friendly
from the hackers.txt dept.

Robots.txt files are simple text files that website owners place in directories to keep web crawlers like Google and Yahoo from indexing their contents. It's a game of trust: webmasters don't actually trust the spiders to stay out of every file in those directories, they just expect the documents not to appear in search engines. By and large, the bargain has been kept.
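For readers who haven't looked at one, a robots.txt file is nothing more than a list of paths crawlers are asked to skip (these example paths are invented for illustration):

```text
User-agent: *
Disallow: /admin/
Disallow: /backups/
Disallow: /staging/
```

Nothing enforces those Disallow lines; a polite crawler skips the listed paths, while anything else is free to fetch them, and the list itself advertises exactly where the interesting files live.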

But hackers have made no such bargain, and the mere presence of a robots.txt file is like an X on a treasure map. Website owners get careless, and, yes, some operate under the delusion that the spiders' promise actually protects these documents.

The Register has an article explaining that hackers and rogue web crawlers actually use robots.txt files to find directories worth crawling.

Melbourne penetration tester Thiebauld Weksteen is warning system administrators that robots.txt files can give attackers valuable information on potential targets by giving them clues about directories their owners are trying to protect.

Once a hacker gets into a system, it is standard reconnaissance practice to compile and update detailed lists of interesting subdirectories by harvesting robots.txt files. It requires less than 100 lines of code.
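A harvester along those lines does indeed fit in far fewer than 100 lines. Here's a rough Python sketch of the idea; the function names and parsing details are my own, not from the article:

```python
from urllib.parse import urljoin
from urllib.request import urlopen


def parse_disallow(robots_text):
    """Return the paths a robots.txt asks crawlers to stay out of."""
    paths = []
    for line in robots_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths


def harvest(base_url):
    """Fetch a site's robots.txt and list its 'protected' directories."""
    with urlopen(urljoin(base_url, "/robots.txt")) as resp:
        text = resp.read().decode("utf-8", "replace")
    return [urljoin(base_url, p) for p in parse_disallow(text)]
```

Run `harvest()` across a list of targets and you have exactly the kind of recon index the article describes, built entirely from files the sites published voluntarily.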

If you watch your logs, you've probably seen web crawler tracks, and you've probably seen some walk right past your robots.txt file. If you're smart, there really isn't anything of value "protected" by your robots.txt. But the article lists some examples of people who should know better leaving lots of sensitive information hiding behind one.

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 1, Insightful) by Anonymous Coward on Thursday May 21 2015, @08:40PM (#186190)

    In security questions, never trust the other side.

  • (Score: 5, Insightful) by VortexCortex (4067) on Thursday May 21 2015, @08:49PM (#186195)

    For computer security there are three rules:

    0. Never turn on the computer.
    1. If you turn on the computer, never connect it to another computer.
    2. If you turn on the computer and connect it to a network, never let any non-sysadmin operate its clients.

    When it comes to robots.txt there are three rules:
    0. The first rule of robots.txt is that you don't talk about robots.txt.
    1. The second rule of robots.txt is that you don't talk about robots.txt.
    3. Rule number three of robots.txt is that you don't list the 3rd rule.

    You know how I keep Google et al. from indexing things I don't want indexed? I don't make such things available outside of logged-in sessions.

    For some real fun, mask your browser user agent as a search engine and get granted read access to many things you should probably not have.
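    The user-agent trick the commenter describes takes one header. A minimal Python sketch, with a placeholder URL (sites that whitelist search-engine user agents may serve this request content they'd hide from an ordinary browser):

```python
from urllib.request import Request, urlopen

# Claim to be Googlebot; the target URL is hypothetical.
req = Request(
    "https://example.com/members-only/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                           "+http://www.google.com/bot.html)"},
)
# urlopen(req) would then send the request with the spoofed agent string.
```

    Note that checking the User-Agent header proves nothing: it's entirely client-controlled, which is exactly why gating content on it is a bad idea.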

    As a hacker, I hate robots.txt. What I love is URL munging, since it even works with "cookies" disabled...