Skipfish: Web Application Security Scanner

Introduction

Skipfish is a powerful active reconnaissance tool that carries out security checks on web-based applications. Through a recursive crawl and dictionary-based probes, it prepares an interactive sitemap of the targeted site. The resulting map is then annotated with the output of the various security checks.

The tool then generates a report that can aid in handling security-related issues on a given web application and serve as a foundation for security assessments of that application.

Skipfish: Security Scanner for Web Applications

As a security scanner, Skipfish is very efficient and can spot vulnerabilities such as SQL injection, command injection, and directory listings, among others. It also offers a notable advantage: it probes for security issues that other tools may have difficulty handling.

Skipfish is very fast, handling 2000+ requests per second on LAN/MAN networks and 500+ requests per second against responsive Internet targets. It also supports HTTP authentication, which makes the tool particularly handy on sites that require basic-level authentication. Because Skipfish is good at capturing authenticated session cookies, it can also be applied to sites that implement authentication at the web application layer.
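
As a quick sketch (the target URL, credentials, and cookie value below are placeholders), HTTP Basic credentials can be supplied with -A, while an already-authenticated application session can be reused by passing its cookie with -C:

skipfish -A user:password -o basic_auth_scan http://target.example/
skipfish -C "SESSIONID=token_value" -o cookie_scan http://target.example/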

Sessions run in Skipfish can be protected from disruption by rejecting new cookies and excluding logout links from the crawl. The adaptive crawling scope of Skipfish also allows scans of sites containing a large volume of data: you can easily tune the tool's crawl depth and limit scans to subdirectories that you have chosen.
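
For instance (placeholder URL, cookie, and paths), -N tells Skipfish not to accept new cookies so the session is not clobbered, -X excludes URLs matching a string such as a logout link, and -I restricts the crawl to a chosen subdirectory:

skipfish -N -X /logout -C "SESSIONID=token_value" -I /app/ -o session_scan http://target.example/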

By applying Snort-style content signatures, the tool highlights server-related errors and flags web applications that may be potentially dangerous. Through handcrafted dictionaries, it produces accurate results within a timeframe favorable to the user. Skipfish also performs an in-depth analysis of the content available on the site and can use this information to automatically construct a word list.
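
Skipfish ships with several dictionaries in its source tree (e.g. dictionaries/minimal.wl and dictionaries/complete.wl). One common pattern, shown here against a placeholder target, is to copy a dictionary into a read-write wordlist passed via -W, letting Skipfish append auto-learned keywords to it, while loading a larger read-only list with -S:

cp dictionaries/minimal.wl dictionary.wl
skipfish -W dictionary.wl -S dictionaries/complete.wl -o dict_scan http://target.example/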

Subtle problems, such as incorrect caching directives, can also be detected through Ratproxy-style logic, which enables the tool to pick up the slightest underlying problems within a given web application.

Features

  • 500+ requests per second against Internet targets, 2000+ requests per second on LAN / MAN networks, and 7000+ requests per second against local instances.
  • Automatic word list construction based on site content analysis.
  • Heuristic recognition of obscure path and query-based parameter handling schemes.
  • Snort-style content signatures which highlight server errors, information leaks, or potentially dangerous web applications.
  • Bundled security checks are designed to handle tricky scenarios: Stored XSS (path, parameters, headers), blind SQL or XML injection, or blind shell injection.

Installation

Clone the repo:

git clone https://github.com/spinkham/skipfish.git

or download the source archive, unpack it, and run:

make

Note: Make sure libidn is installed before building; otherwise skipfish will fail to build or run.
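
On Debian-based distributions, the usual build dependencies (package names assumed for Debian/Ubuntu; they may differ elsewhere) can be installed with:

sudo apt-get install libpcre3-dev libidn11-dev libssl-dev zlib1g-dev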

Usage

  • To get a list of all parameters, type:
skipfish -h
  • To scan a target and write the output to a directory (here named 202):
skipfish -o 202 http://address/
  • To read URLs from a given file, use:
skipfish [...other options...] @../path/to/url_list.txt
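
Once a scan finishes, Skipfish writes an interactive HTML report (index.html) into the output directory; on a desktop Linux system it can be opened with, for example:

xdg-open 202/index.html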

When scanning big sites, you may need to customize your HTTP requests and constrain the crawl scope. The following options are useful:

-H  Insert additional, non-standard headers.
-F  Define a custom mapping between a host and an IP.
-d  Limit crawl depth to a specified number of subdirectories.
-c  Limit the number of children per directory.
-x  Limit the total number of descendants per crawl tree branch.
-r  Limit the total number of requests to send in a scan.
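
Putting several of these together (the host, IP mapping, and limit values below are illustrative), a constrained scan of a large site might look like:

skipfish -H X-Scan-Note=authorized-test -F staging.example.com=10.0.0.5 -d 5 -c 100 -x 1000 -r 200000 -o big_site_scan http://staging.example.com/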
