Today we’re looking at Hack This Site Realistic Mission 5. This is a fun challenge that requires you to enumerate the web application and then crack the discovered hashes. To solve this mission, you will need some basic knowledge of how Linux web servers work, as well as an understanding of what password hashes are and how to crack them. If you haven’t seen my other posts in the realistic series, you can find them here: Part 1, Part 2 and Part 3.
The purpose of this mission is to hack the password for a telemarketer’s website. With that password, we are to destroy their database, thus restoring privacy to the lives of their victims. To get the password, we will need to explore the web application and look for clues. Upon logging in, we receive the following communication from Spiffomatic64.
Exploring The Web Application
Spiffomatic64 has given us a link to the telemarketer’s application. Visiting the web application presents us with the page depicted in the image below. The application has some basic functionality, such as a news page, a contact page, and a database page.
Navigating to the news page tells us a lot about the application, probably more than the creator should have told us. For instance, they inform us that the website was previously hacked. They also mention that Google was grabbing links it shouldn’t, so they have taken extra precautions. To stop Google from indexing certain parts of a web application, you can add a Robots Exclusion Standard file (robots.txt) to the root of your domain. By adding URLs to the robots.txt file, you are telling Google that you don’t want those URLs indexed the next time it crawls the application.
Navigating to the robots.txt file, we can see that it tells all user agents (denoted by the asterisk) not to index /lib and /secret. Search engine spiders have their own User-Agent strings that allow applications to identify and whitelist them. See my post on User-Agent switching for how this could be abused.
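Based on that description, the telemarketer’s robots.txt would look something like this (the exact file on the target may differ slightly):

```
User-agent: *
Disallow: /lib
Disallow: /secret
```

The asterisk matches every crawler, and each Disallow line asks compliant crawlers to skip that path. Note that this is purely advisory: nothing stops a human (or a less polite bot) from visiting the listed paths, which is exactly what we are about to do.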
Navigating to each of these directories reveals some interesting files. Firstly, in the secret directory, we have an admin.bak.php file and an admin.php file. We can deduce that admin.bak.php is likely a backup of the admin.php file. Attempting to access the admin.php file results in an incorrect password warning, which allows us to conclude that admin.php is the page we’re attempting to gain access to.
Moving forward, let’s take a look at the admin.bak.php file. I’ve downloaded a copy of this file to my virtual machine and used the cat command to display its contents. We can see from the file that there is a reference to MD4. MD4 is an old hashing algorithm that was once commonly used to hash passwords. Hashing helps protect passwords in the event that they are leaked, because the application stores the hash rather than the plaintext password.
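To make the algorithm concrete, here is a minimal pure-Python MD4 sketch written from the RFC 1320 specification (this is my own illustration, not code from the mission; Python’s hashlib only exposes MD4 when the underlying OpenSSL build still ships it, so a standalone version is handy):

```python
import struct

def _rotl(x, n):
    """Rotate a 32-bit integer left by n bits."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def md4(data: bytes) -> str:
    """Pure-Python MD4 per RFC 1320; returns the hex digest."""
    # Pad: append a 1 bit, zeros until length is 56 mod 64 bytes,
    # then the original bit length as a 64-bit little-endian integer.
    msg = bytearray(data) + b"\x80"
    while len(msg) % 64 != 56:
        msg.append(0)
    msg += struct.pack("<Q", len(data) * 8)

    # The four 32-bit state registers from the RFC.
    state = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476]

    F = lambda x, y, z: (x & y) | (~x & z)
    G = lambda x, y, z: (x & y) | (x & z) | (y & z)
    H = lambda x, y, z: x ^ y ^ z

    # (round function, additive constant, message-word order, shift amounts)
    rounds = [
        (F, 0x00000000, list(range(16)), [3, 7, 11, 19]),
        (G, 0x5A827999, [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15], [3, 5, 9, 13]),
        (H, 0x6ED9EBA1, [0, 8, 4, 12, 2, 10, 6, 14, 1, 9, 5, 13, 3, 11, 7, 15], [3, 9, 11, 15]),
    ]

    for off in range(0, len(msg), 64):  # process each 64-byte block
        X = struct.unpack("<16I", bytes(msg[off:off + 64]))
        regs = state[:]
        for func, const, order, shifts in rounds:
            for i in range(16):
                j = (-i) % 4  # register updated this step: A, D, C, B, A, ...
                a, b, c, d = regs[j], regs[(j + 1) % 4], regs[(j + 2) % 4], regs[(j + 3) % 4]
                regs[j] = _rotl((a + func(b, c, d) + X[order[i]] + const) & 0xFFFFFFFF,
                                shifts[i % 4])
        state = [(s + r) & 0xFFFFFFFF for s, r in zip(state, regs)]

    return struct.pack("<4I", *state).hex()
```

A quick sanity check is the RFC 1320 test vector: md4(b"") should come out as 31d6cfe0d16ae931b73c59d7e0c089c0.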
The Application Hack
Going back to the robots.txt file, there was another disallowed directory called /lib. Navigating to this directory reveals a file called hash. If we download this file and view its contents, it appears that we have recovered a password hash: “51ba17c17338c1031e11432dfb47105a”.
Based on the information we found in the admin.bak.php file, we can safely assume that this is an MD4 hash. Fortunately, MD4 is a fairly old hashing algorithm, and its hashes can be cracked easily. I attempted to crack the hash online using crackstation.net and a few other sites, but it appears the sneaky admins over at Hack This Site like to change the hashes periodically, and this hash hadn’t been cracked before. So to crack the hash, we need to call on our old friend John. Using John The Ripper, I specified the raw-MD4 format and told John which file to crack. Normally I would specify a wordlist, but I left John to use its default one. The command to crack the hash is as follows.
john --format=raw-MD4 <file-to-crack>
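Under the hood, John’s wordlist mode is conceptually just a guess, hash, and compare loop. Here is a rough Python sketch of the idea; the `crack` helper, the wordlist, and the target below are all made up for illustration, and it uses MD5 as a stand-in algorithm because hashlib only provides MD4 through OpenSSL, which may not ship it on modern systems:

```python
import hashlib

def crack(target_hex, wordlist, algo="md5"):
    """Hash each candidate word and compare against the target digest."""
    for word in wordlist:
        if hashlib.new(algo, word.encode()).hexdigest() == target_hex.lower():
            return word  # this candidate hashes to the target: password found
    return None  # exhausted the wordlist without a match

# Illustration with a tiny made-up wordlist and an MD5 target:
target = hashlib.md5(b"letmein").hexdigest()
print(crack(target, ["123456", "password", "letmein"]))  # prints "letmein"
```

Real crackers differ mainly in scale: huge wordlists, mangling rules that mutate each candidate, and heavily optimized hash implementations, but the core loop is the same.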
Once the correct password has been recovered, you can head over to the database link on the main page and paste it in. This should complete the mission.