NetSweeper attempts a denial of service attack.

I looked at my MRTG statistics this morning and noticed something odd: between 8AM and 9AM (UTC), the number of active Apache processes (usually fewer than 5) jumped to 40. Looking at my server logs, the culprit was obvious:
portsnap.daemonology.net 66.207.120.227 - - [19/Sep/2005:08:20:19 +0000] "GET /t/733a6bdcdea7399617d98aab38f79345 bc35865c175a2d546e94343122de897f HTTP/1.1" 403 357 "-" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0"

... (3222 lines elided)

portsnap.daemonology.net 66.207.120.227 - - [19/Sep/2005:08:57:12 +0000] "GET /bp/597f82d31ae80bf2c7e28bdfb5b8162b 4ded33ebdb7e2783e777f4c8b8ce96d5- 63cda9e7234e119fc7d3b81e17873d8c 1359e48b7e57bfac5de0a1ec31c4869d HTTP/1.1" 403 423 "-" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0"
(I've added some spaces in the file names to allow the lines to wrap usefully.)
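
For anyone who wants to reproduce this sort of tally, a minimal sketch in Python follows; the log path is hypothetical, and the parsing is deliberately naive -- it simply splits each line on whitespace and takes the second field, since (as in the excerpt above) my log lines begin with the virtual host name, followed by the client address.

    #!/usr/bin/env python
    # Count requests per client IP in an access log whose lines begin
    # with the virtual host, then the client address (as excerpted above).
    from collections import Counter

    counts = Counter()
    with open("/var/log/httpd-access.log") as log:  # hypothetical path
        for line in log:
            fields = line.split()
            if len(fields) >= 2:
                counts[fields[1]] += 1  # second field: client IP

    for ip, n in counts.most_common(10):
        print(n, ip)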

These peculiarly named files are used by portsnap to fetch updates to the FreeBSD ports tree. As such, there is no reason for any user-agent other than portsnap to be fetching these files. The IP address 66.207.120.227 resolves to host227.net-sweeper.com.
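
The reverse lookup is easy enough to repeat; here's a one-off sketch in Python, bearing in mind that the 2005-era PTR record may no longer exist:

    import socket

    # gethostbyaddr performs a reverse (PTR) lookup and raises
    # socket.herror if the address has no PTR record.
    try:
        name, aliases, addrs = socket.gethostbyaddr("66.207.120.227")
        print(name)
    except socket.herror:
        print("no PTR record")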

A few points come to mind concerning the behaviour of NetSweeper's robot here:

  1. Over the course of 2213 seconds, it sent 3224 requests: one request every 0.68 seconds. The 1993 Guidelines for Robot Writers suggest sending no more than one request per minute.
  2. It sends a fraudulent User-Agent header. It claims to be Firefox/1.0, but no interactive user would ever have attempted to download such a large number of files (particularly when many of them had not existed, or been linked to from anywhere, for several days -- the files used by portsnap are rather ephemeral).
  3. It does not obey instructions in robots.txt files -- in fact, it never fetches them. (An example robots.txt follows this list.)
  4. Even after receiving over 3000 "HTTP 403 (Forbidden)" responses -- which, even for a robot that doesn't obey the Robot Exclusion Standard, should be a fairly clear indication that its presence is undesired -- it continued attempting to fetch files.
  5. After a request was denied by my server, NetSweeper's robot held its connection open -- thereby keeping an Apache process busy -- until Apache stopped waiting for it and closed the connection 15 seconds later. The result was the maximum possible load on my server for the set of files requested. (A sketch of shutting such a client out entirely follows below.)
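
For reference, the Robot Exclusion Standard that NetSweeper ignores amounts to nothing more than a plain-text file at the root of the site; a robots.txt barring all robots from the directories seen in the log excerpt above would read something like:

    User-agent: *
    Disallow: /t/
    Disallow: /bp/

Of course, this only helps against robots which bother to fetch and honour the file.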
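
Since a robot this broken is evidently not going to start reading robots.txt, the blunter instrument is to refuse it outright; with the mod_access syntax of Apache 1.3/2.0, a stanza along these lines would do (the directory path is hypothetical):

    <Directory "/usr/local/www/data">
        # With Order Allow,Deny, a request matching both directives
        # is denied -- so everyone gets in except the listed address.
        Order Allow,Deny
        Allow from all
        Deny from 66.207.120.227
    </Directory>

Incidentally, the 15-second wait described in point 5 is consistent with Apache's default KeepAliveTimeout of 15 seconds.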
It has been remarked on many occasions that "Any sufficiently sophisticated denial of service attack is indistinguishable from a large amount of legitimate traffic". In light of my observations today, I'd like to add the NetSweeper corollary: Any sufficiently broken robot is indistinguishable in its behaviour from a deliberate denial of service attack.

Posted at 2005-09-19 17:00