Broken links harm the user experience and your site's SEO. The web is ever-changing: a URL that is valid and functioning one day may point to different, or non-existent, content the next. Checking every link on my own site by hand has become tedious, and the problem is only worse for sites much larger and more complex than mine. I wrote this web crawler to automate the process.

When run, the script recurses through the hyperlinks of the site specified in its configuration file. Along the way, it checks the URL in every 'href' attribute it encounters. Each URL is tested only once per run, to avoid placing unnecessary load on other webmasters' servers. Once complete, the script outputs a report listing every broken link it found and the pages on which each one appears.
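The script itself isn't reproduced in this post, but the core loop might look something like the sketch below. It assumes the third-party packages `requests` and `beautifulsoup4`; `START_URL` is a hypothetical placeholder for the site named in the configuration file, and the function and variable names are mine, not the script's.

```python
# Minimal sketch of the crawl-and-check idea described above.
# Assumes `requests` and `beautifulsoup4` are installed;
# START_URL stands in for the site read from the configuration file.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/"  # hypothetical placeholder

def crawl(start_url):
    site = urlparse(start_url).netloc
    checked = {}      # url -> status code: each URL is tested once per run
    broken = {}       # broken url -> set of pages it appears on
    seen_pages = set()
    queue = [start_url]

    while queue:
        page = queue.pop()
        if page in seen_pages:
            continue
        seen_pages.add(page)
        try:
            resp = requests.get(page, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            # Resolve relative links against the current page.
            url = urljoin(page, anchor["href"])
            if url not in checked:
                try:
                    checked[url] = requests.head(
                        url, timeout=10, allow_redirects=True
                    ).status_code
                except requests.RequestException:
                    checked[url] = None  # unreachable counts as broken
            if checked[url] is None or checked[url] >= 400:
                broken.setdefault(url, set()).add(page)
            elif urlparse(url).netloc == site:
                queue.append(url)  # recurse only within the same site

    # Report: each broken link and the pages it appears on.
    for url, pages in sorted(broken.items()):
        print(f"{url} ({checked[url]}) found on: {', '.join(sorted(pages))}")

if __name__ == "__main__":
    crawl(START_URL)
```

A real run would want politeness features this sketch omits (rate limiting, a robots.txt check, retry logic), but the `checked` dictionary is what implements the once-per-run guarantee, and `broken` maps each dead URL back to the pages that link to it for the final report.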