SANS Penetration Testing

Pen Test Poster: "White Board" - Bash - Website Cloner


By Matthew Toussain, Geoff Pamerleau

 

According to Verizon's 2016 Data Breach Investigations Report, 30 percent of phishing messages get opened by targeted users and 12 percent of those users click on the malicious attachment or link. That works out, on average, to roughly one click for every 28 phished users. Developing a highly targeted message and leveraging additional backend resources to validate the message's legitimacy can go a long way toward raising those odds even further. In many cases, the first step to establishing this "legitimacy" is developing a web presence. Having a believable page to redirect users to is a key component of maximizing success. While there are robust methods and frameworks for doing so, the Social Engineering Toolkit being a community favorite, sometimes a quick and dirty command line clone is enough to get the job done. When that is the case, the wget command has options to crawl and retrieve a website's page hierarchy.

 

Methods Covered in this Section:

Checking and starting the Apache Web Server:

systemctl status apache2 and systemctl start apache2

Cloning a target website:

wget -r -nH $URL

Cloning a target webpage:

wget -r -np $URL

 

For this example, we'll assume that our target organization is involved in workplace automation. With that in mind, we do a little research and find a recently published article with the headline "Half the work people do can be automated".

Original Page

 

To host the site we plan on cloning, we need to use a web server. Popular options include Apache, Nginx, IIS, and Tomcat. Since apache2 ships with Kali Linux, we'll use that. On Kali, we can check the webserver's status by using the systemctl command. By default, Apache will not start on boot, so our command output should look something like this:

 

Checking the webserver status

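As a minimal sketch, the status check (assuming the apache2 service name used on Kali) is simply:

systemctl status apache2    # before the service is started, the Active line typically reads "inactive (dead)"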

 

Issuing a systemctl start apache2 command activates the webserver. The start command won't display any output if it is successful. We can verify that Apache is running with the systemctl status apache2 command. This time the output should show the status as active (running).
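As a quick sketch (again assuming the apache2 service name), the start-and-verify sequence is just:

systemctl start apache2
systemctl status apache2    # should now report "active (running)"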

 

Checking if our webserver is running (again)

Awesome, Apache is running

 

Now that our server is up, we just need to clone our site and move it to the document root of the default website. This is the folder that a webserver uses to distribute content upon receipt of a web request. Since this location can vary depending on Apache version and distribution, it may be necessary to verify it by examining the current Apache configuration. Depending on the setup, this information will most likely be in either /etc/httpd/conf/httpd.conf or /etc/apache2/sites-enabled/000-default.conf by default. On the Kali distribution used here, the latter was the case. The grep command provides a succinct mechanism to validate the DocumentRoot configuration:
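A one-liner along these lines does the trick (the config path assumes a Debian-based install like Kali):

grep -R DocumentRoot /etc/apache2/sites-enabled/    # on a default install this points at /var/www/html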

 

 


We will clone our site into DocumentRoot at /var/www/html

 

The server is now running and we have identified where to put the mirrored website. The final step is to clone our target site using the wget command. Using our target as the URL:

 

Cloning the website:
wget -r -np <url here>

 

Plus more output till the pages are cloned

 

The flags in the above command tell wget to recursively (-r) fetch resources and to avoid fetching anything at the parent directory or above except for explicitly listed links (-np). Recursive fetching ensures that all referenced resources are downloaded, and the no-parent option prevents the retrieval of resources above the targeted site in the directory hierarchy. The no-host-directories (-nH) option can be used to cut out the host prefix; essentially, this causes the directory structure to be created relative to the cloned copy. After running the wget command, multiple resources will be downloaded, and eventually we'll get a finished message telling us how many files were downloaded and the total size of the download. To finally host the content, we just have to move all the downloaded files to the DocumentRoot that we determined earlier.
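Putting it all together, a minimal sketch might look like the following (the target URL is hypothetical, and the DocumentRoot is the /var/www/html location identified above):

mkdir /tmp/clone && cd /tmp/clone
wget -r -np -nH http://www.example.com/automation-article/    # hypothetical target URL
cp -r ./* /var/www/html/    # drop the mirrored files into Apache's DocumentRoot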

 

That's it! At this point there is a cloned website being hosted on our Kali station, ready for use as a landing site to point our phishing victims to. Now we can freely edit the page, maybe to inject content we want to execute in the victim's browser, or check our Apache logs to see what public IPs connected from our phishing attempt.
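For example, on a default Kali/Debian Apache install (log locations can differ on other distributions), watching connections roll in is as simple as:

tail -f /var/log/apache2/access.log    # shows source IP, request, and User-Agent for each visitor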

 

Cloned Page

 

Best of luck on your phishing trip!

SANS Note:

Matthew Toussain is teaching SANS SEC560: Network Penetration Testing and Ethical Hacking in Austin, TX in December 2017 and Anaheim, CA in February 2018. Or you can take SEC560 at any time, online, via OnDemand.

SANS Pen Test HackFest 2017


  • 2-Day Penetration Testing & Ethical Hacking Summit w/ 20+ Speakers
  • 3-Nights of SANS NetWars CTF w/ Coin-A-Palooza, your chance to earn up to five SANS Pen Test Challenge Coins
  • 1-Night of SANS CyberCity Missions - Hack/Defend a Modern City
  • Special Field Trip Experience for all Summit Attendees
  • 9 SANS Training Courses - 6-days of amazing SANS Training
  • https://www.sans.org/hackfest
