May 8, 2011

Looking for web security breaches with Skipfish (II)

I am going to finish my article about looking for web security breaches with Skipfish. After getting a general overview of skipfish in the first part, I will now run a test against a default MediaWiki installation.

First, I must create a dictionary, although it will not be used in this test. One interesting option that I have chosen is -I, which makes skipfish follow only those URLs that match the string passed to the parameter.

javi@ubuntu-server:~/skipfish-1.86b$ cp -a dictionaries/complete.wl dictionary.wl

javi@ubuntu-server:~/skipfish-1.86b$ ./skipfish -W /dev/null -I http://target/mediawiki/ -o mediawiki_dir http://target/mediawiki/

(Here http://target/mediawiki/ stands for the URL of the target installation.)

If you do not set this option and skipfish discovers links to other sites, it will scan them as well. Conversely, if you want to exclude a specific URL from the scan, you can specify it with the -X parameter.
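For instance, the two options can be combined to keep the crawl inside the wiki while skipping a given path (the URLs below are placeholders; adjust them to your own installation):

```shell
# Restrict the scan to URLs containing the wiki path (-I), but skip
# anything matching the excluded string (-X). Placeholder URLs.
./skipfish -W /dev/null \
  -I http://target/mediawiki/ \
  -X /index.php/Special: \
  -o mediawiki_dir \
  http://target/mediawiki/
```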

During the crawl, skipfish shows real-time information about its analysis.

skipfish version 1.86b by <>

Scan statistics:

  Scan time      : 0:59:38.542
  HTTP requests  : 34669 (10.4/s), 100769 kB in, 12487 kB out (31.6 kB/s)
  Compression    : 77967 kB in, 255451 kB out (53.2% gain)
  HTTP faults    : 0 net errors, 0 proto errors, 0 retried, 0 drops
  TCP handshakes : 351 total (152.0 req/conn)
  TCP faults     : 0 failures, 0 timeouts, 3 purged
  External links : 202 skipped
  Reqs pending   : 18692

Database statistics:

  Pivots         : 880 total, 462 done (52.50%)
  In progress    : 34 pending, 145 init, 221 attacks, 18 dict
  Missing nodes  : 4 spotted
  Node types     : 1 serv, 186 dir, 544 file, 16 pinfo, 76 unkn, 57 par, 0 val
  Issues found   : 11 info, 75 warn, 57 low, 0 medium, 128 high impact
  Dict size      : 263 words (263 new), 4 extensions, 256 candidates

At the end of the process, skipfish dumps all the collected data into the mediawiki_dir directory (set by the -o option), which in turn contains an HTML file (index.html) that lets you view the generated report.

In the previous run, the only severe problems skipfish found were related to the server accepting HTTP PUT requests.
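This kind of finding can be double-checked by hand; for example with curl (the target URL is a placeholder for your own installation):

```shell
# Try to upload a file with an HTTP PUT request; a 2xx response
# would confirm that the server really accepts PUT (placeholder URL)
curl -i -X PUT -d 'test' http://target/mediawiki/put_test.txt
```

A 405 Method Not Allowed or 501 Not Implemented response here would suggest the issue is a false positive.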

In order to understand the results offered by skipfish, if you do not have deep knowledge of web security (like me), you can take a look at the Browser Security Handbook, written and maintained by the same author who develops skipfish.

Another interesting parameter is -A, used to pass HTTP authentication credentials.
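For example, against an area protected with HTTP authentication it could be used like this (the credentials and URL below are made up):

```shell
# Pass user:password credentials via -A (placeholder values)
./skipfish -W /dev/null -A admin:secret -o auth_dir http://target/protected/
```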

And finally, note that you can also tune skipfish's networking and crawling behaviour through different options, which allow you to set, for instance, parameters related to TCP connections or the depth of the analysis. For more information, check the project documentation.
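As a quick sketch of that tuning, two such options are -m (maximum number of simultaneous connections) and -d (maximum crawl tree depth); run skipfish -h for the full list (the URL is again a placeholder):

```shell
# Gentler scan: at most 5 simultaneous connections, crawl depth limited to 4
./skipfish -W /dev/null -m 5 -d 4 -o tuned_dir http://target/mediawiki/
```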
