In the name of Allah, the Most Gracious, the Most Merciful
If you're starting from this section, we recommend reading the previous part, where we covered the foundational concepts of stealer logs, the types of data stolen, organizing stealer logs, identifying IOCs, and delving into a case study on "Exploiting UUID-Based Broken Access Control Using URL History from Stealer Logs".
Specifically, in this second part of the write-up, we will discuss the role of stealer logs in asset discovery.
---
TL;DR
Given the length of the process, we have provided a TL;DR section summarizing our journey in 16 short points, leading to successful access to the main target.
However, if readers wish to understand the mindset used at each stage, they can continue reading until the end.
Below is a summary of the write-up:
Quick lesson learned: Understanding stealer logs goes beyond testing credentials - it also involves analyzing URLs linked to the target, which standard subdomain tools might overlook. This broader approach leads to a more thorough risk assessment. In this case, the target operates on three distinct domains, not subdomains, each with its own operational function.
In the following section, we will start by outlining the process from the beginning for this case study.
---
VI. Case Study 2: Stealer Logs for "Asset Discovery"
6.1. Search for Credentials related to the Target in Stealer Logs
"Stealer Logs can assist in the process of asset discovery". This may sound like a unique statement, but it's quite relevant to reality. In this case, we will explain how we gained extensive access, starting from asset discovery through stealer logs where almost all content had expired, downloading sensitive files from the target via information on archive.org, and eventually obtaining numerous credentials belonging to the target.
Before diving into this case study, it's important to understand that one of the challenges we may face when searching for target information from stealer logs is dealing with companies that have a large customer base, as they directly interact with end users - for example, ride-sharing companies, ICT companies selling devices (like notebooks, phones, etc.), food delivery services, and others.
Why is this a challenge? Because instead of finding portals used by internal teams, the search results are more likely to display information from customers affected by the stealer malware. If the goal is to test the target's system, then accessing customer accounts for public-facing services is certainly not the appropriate approach. So, if we encounter a situation like this, the search must be more specific, for example, by filtering out URLs commonly used by customers.
---
While we could start by exploring the target's public assets and then automating the search based on those assets, this method could take longer.
By narrowing down the URLs commonly used by customers, the search becomes more focused on assets frequently visited by users. From there, we can explore further into other assets that users also access.
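As a rough illustration of this narrowing step, the sketch below filters stealer-log URLs against a blocklist of customer-facing patterns. The patterns, hostnames, and log entries are all hypothetical; a real filter would be tuned to the URLs actually used by the target's public services.

```python
# Sketch: surfacing non-customer portals from stealer-log URLs by filtering
# out patterns commonly tied to end-user traffic. Everything here is a
# made-up example, not real target data.

CUSTOMER_PATTERNS = ("/login", "/signin", "/account", "/order", "app.")  # hypothetical

def is_internal_candidate(url: str, customer_patterns=CUSTOMER_PATTERNS) -> bool:
    """Return True if the URL does not match common customer-facing patterns."""
    lowered = url.lower()
    return not any(p in lowered for p in customer_patterns)

logs = [
    "https://app.abc.tld/account/orders",    # typical customer traffic: filtered out
    "https://partner.abc.tld/portal/auth",   # partner portal: worth a closer look
]
candidates = [u for u in logs if is_internal_candidate(u)]
print(candidates)
```

The same idea scales to millions of log lines: the cheap string filter runs first, and only the surviving candidates get manual review.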
---
In short, we once found a target from a public service (let's call it ABC-1) with a very large customer base. Initially, the results we obtained mostly displayed customer data affected by stealer malware. However, after refining our search using a more specific method, we finally found relevant information.
From our first specific search result, we immediately found an application with a simple login page that appeared to be intended for partners collaborating with the company (as inferred from the application's title). Although the account we found could technically have been used to test the partner's application further for vulnerabilities, that was not our primary goal at the time, as we were not authorized to conduct testing.
So, what did we find? After closer inspection, we discovered a foreign logo and name located in the footer of the application.
Partner Portal with External Logo

One might ask why we didn't search by email domain.
To clarify, we did try, but found no results related to ABC-1's emails in the stealer logs (whether mail, email, webmail, etc.). Also, since the email username format for the target was unknown, we temporarily set aside checking common email services like Microsoft or Google.
---
6.2. Searching for Company Information Based on the Logo Found in a Partner Application
Upon encountering this situation, we immediately searched for information regarding the company using the logo. Fortunately, we quickly identified it through a Google search (likely due to the uniqueness of the company name) - let's call it XYZ-2.
At this stage, we didn't dive into searching for leaked information related to XYZ-2 right away. Instead, we first conducted a profile investigation to learn more about XYZ-2, looking into its company profile and social media presence. Some of the aspects we examined included:
If XYZ-2 turned out to be a third-party service provider, we would legally be prohibited from engaging with them unless they had a vulnerability disclosure program or something similar in place. However, after further research, we concluded that XYZ-2 was part of ABC-1, handling all IT-related matters for ABC-1. We also realized that XYZ-2 developed ABC-1's main platform, which is used by ABC-1's customers.
---
6.3. Asset Discovery Through XYZ-2 - The IT Handler for ABC-1
Once it was confirmed that XYZ-2 was part of ABC-1, we resumed our search through the stealer logs. Unfortunately, we only found around five results, with the latest being from 2022. Worse yet, it appeared that the internal employee affected in 2022 had moved to another company, leading us to believe that all credentials linked to the company were no longer valid. Upon validation, our assumption proved correct - all the credentials were invalid.
We then re-analyzed all the URLs without focusing solely on XYZ-2's domain. Through this, we discovered that the device owner had logged into an email account (tied to Active Directory) on JKL-3's domain. Upon further investigation, it turned out that JKL-3 was another part of ABC-1, focusing on a more specific business area. From a quick look, it seemed that all of ABC-1's email operations used the JKL-3 domain. This led us down a very interesting path.
---
6.4. Searching for Logs Containing ABC-1, XYZ-2, and JKL-3 Connections
As a recap, at this stage, we had identified three distinct domains:
ABC-1's domain (abc.tld), which appeared to be the parent of all the company's subsidiaries.
XYZ-2's domain (xyz.tld), seemingly responsible for handling all IT matters for ABC-1 and developing ABC-1's primary customer platform.
JKL-3's domain (jkl.tld), which appeared to be used for email operations (mail.jkl.tld) by ABC-1, XYZ-2, and JKL-3 itself.
Our next step was to start searching for leaks involving JKL-3's email domain (mail.jkl.tld), while also looking for relevant information related to XYZ-2.
Didn't XYZ-2 only return five leaked results, with the last one appearing in 2022? Yes, that's correct. However, it's important to remember that connections to XYZ-2 aren't just tied to URLs or usernames; they can also be surfaced through the passwords or URLs being used.
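One way such a connection can be spotted is by pivoting on shared passwords rather than on domains: a log entry whose URL and username never mention the target can still be tied to it when the same password appears alongside a known-related record. The sketch below groups hypothetical log rows by password to expose cross-domain reuse; all hosts, usernames, and passwords are made up.

```python
# Sketch: pivoting stealer-log records on password reuse. Rows sharing a
# password across different domains likely belong to the same victim, which
# links otherwise-unrelated assets. All data below is fabricated.
from collections import defaultdict

def pivot_by_password(records: list) -> dict:
    """Group log URLs by password, keeping only passwords seen on 2+ URLs."""
    groups = defaultdict(list)
    for url, user, password in records:
        groups[password].append(url)
    return {pw: urls for pw, urls in groups.items() if len(urls) > 1}

records = [
    ("https://portal.xyz.tld/login", "staff1", "Winter2022!"),
    ("https://mail.jkl.tld/owa", "staff1", "Winter2022!"),  # reuse links the two domains
    ("https://shop.example.tld", "buyer", "hunter2"),
]
print(pivot_by_password(records))
```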
In short, we discovered one leak that was about a year old at the time. Here are the key findings from this leak:
So, what were the results? Unfortunately, none of the credentials we obtained were valid.
Even the URL that appeared to function as a UAT environment returned a 403 Forbidden message. However, it is important to understand that a 'forbidden' message can have several meanings, such as:
Access is Forbidden
---
6.5. Accessing Data via Sites Recorded in Archives Due to Certificate Transparency
After discovering this situation, we checked whether this URL had been archived on archive.org, either manually (a deliberate submission) or automatically (triggered by the issuance of an SSL/TLS certificate). We were fortunate to find that on a specific date, shortly after this URL received a new SSL/TLS certificate, it was recorded on archive.org.
Recorded on archive.org

To understand why this record occurred, we need to grasp how Certificate Transparency (CT) works.
When an organization or individual purchases an SSL/TLS certificate, it is first issued by the Certificate Authority (CA). After issuance, the CA logs the certificate in the Certificate Transparency (CT) log, a public record of all issued certificates. This process allows anyone to monitor and verify issued certificates, ensuring that no unauthorized certificates exist.
In this way, archive.org can use data from CT logs to capture and store a copy of a website shortly after a certificate is issued for it. We found that archive.org indeed had a recording of the website on the date corresponding to the information from the CT log, giving us the opportunity to view a snapshot of the site at that time.
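For readers who want to replicate the CT side of this, certificate issuance for a domain can be looked up through a public CT search service such as crt.sh, which offers a JSON output mode. The sketch below builds such a query URL and parses a canned sample response so it runs offline; the domain and sample data are hypothetical, and a real run would fetch the URL with an HTTP client.

```python
# Sketch: querying CT logs via crt.sh's JSON interface. The sample response
# below is fabricated to show the parsing; fetch crtsh_query_url(domain)
# with urllib or requests to get real data.
import json
from urllib.parse import urlencode

def crtsh_query_url(domain: str) -> str:
    """Build a crt.sh search URL covering the domain and its subdomains."""
    return "https://crt.sh/?" + urlencode({"q": f"%.{domain}", "output": "json"})

def issued_names(raw_json: str) -> set:
    """Collect every DNS name seen across the returned certificate entries."""
    names = set()
    for entry in json.loads(raw_json):
        names.update(entry.get("name_value", "").splitlines())
    return names

sample = json.dumps([
    {"name_value": "uat.xyz.tld\nwww.xyz.tld", "not_before": "2023-01-10T00:00:00"},
])
print(sorted(issued_names(sample)))
```

The `not_before` timestamps from such results are what let us correlate a certificate's issuance date with an archive.org capture date.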
Returning to the topic, we next decided to review some earlier records and discovered an interesting situation from 2023, when the UAT interface was still a directory listing. This was quite intriguing because it suggested that the files listed in the interface might still be accessible and possibly updated.
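Snapshot history like this can be enumerated through the Wayback Machine's CDX API, which returns one row per capture. The sketch below builds a CDX query and parses a canned sample response offline; the target URL and the sample rows are hypothetical stand-ins.

```python
# Sketch: listing archived captures of a URL via the Wayback Machine CDX API.
# The first row of the JSON response is a header; each following row is one
# capture. The sample below is fabricated to demonstrate the parsing.
import json
from urllib.parse import urlencode

def cdx_query_url(target: str) -> str:
    """Build a CDX API query restricted to a few useful fields."""
    return "https://web.archive.org/cdx/search/cdx?" + urlencode({
        "url": target, "output": "json", "fl": "timestamp,original,statuscode",
    })

def snapshots(raw_json: str) -> list:
    """Turn the header-plus-rows JSON layout into a list of dicts."""
    rows = json.loads(raw_json)
    header, data = rows[0], rows[1:]
    return [dict(zip(header, row)) for row in data]

sample = json.dumps([
    ["timestamp", "original", "statuscode"],
    ["20230115120000", "https://uat.xyz.tld/", "200"],  # hypothetical capture
])
for snap in snapshots(sample):
    print(snap["timestamp"], snap["statuscode"])
```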
Encountered an Interface in the Form of a Directory Listing

The result? Our assumption turned out to be correct. We were able to download the files smoothly without any issues. Additionally, we found that the application administrators had restricted access only to the main directory, not to the subdirectories. This allowed us to freely explore the files within.
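The re-check itself can be as simple as rebuilding the file URLs seen in the archived listing against the live host and probing each one. A minimal sketch, assuming a hypothetical host and paths (a real probe would issue HEAD or GET requests with urllib or requests):

```python
# Sketch: joining paths recovered from an old archived directory listing
# onto the live base URL, producing candidate download targets. Host and
# paths are hypothetical examples.
from urllib.parse import urljoin

def candidate_urls(base: str, archived_paths: list) -> list:
    """Join each archived path onto the live base URL."""
    return [urljoin(base, p) for p in archived_paths]

paths = ["files/build.zip", "files/legacy.dll"]  # as seen in the 2023 snapshot
print(candidate_urls("https://uat.xyz.tld/", paths))
```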
You might now be wondering, what types of file formats were found there? We discovered various files, such as documents, .zip files, and some .dll files that appeared to be outdated. However, the most notable discovery was that one of the .zip files actually contained a more up-to-date .dll compared to the one listed in the interface.
Download the File

Accessing the Files
---
6.6. Trying to Decompile a Compiled .dll
One common mistake often made by .NET-based application developers is compiling their code into a .dll without implementing obfuscation. This becomes problematic because, without obfuscation, the resulting .dll file becomes much easier to decompile and analyze, which opens up the potential for information leakage or vulnerabilities to be exploited.
In this case, we attempted to decompile the file (which can be done using dnSpy or JetBrains' dotPeek).
In short, from this activity we discovered a significant number of credentials within the file, including credentials for logging into email (connected to Active Directory), databases, tokens, and more.
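To see why the lack of obfuscation matters, recall that .NET string literals are stored in the assembly as plain UTF-16, so even a naive byte scan recovers them; proper analysis would use dnSpy or dotPeek as noted above. The toy sketch below runs such a scan over a fake byte blob standing in for a compiled .dll; the embedded connection string is fabricated.

```python
# Sketch: why unobfuscated .NET assemblies leak secrets. String literals sit
# in the binary as UTF-16LE, so a regex over printable-char/NUL byte pairs
# pulls them straight out. The blob below is a fake stand-in for a .dll.
import re

def utf16_strings(blob: bytes, min_chars: int = 6) -> list:
    """Extract printable UTF-16LE runs of at least min_chars characters."""
    pattern = re.compile(rb"(?:[\x20-\x7e]\x00){%d,}" % min_chars)
    return [m.group().decode("utf-16-le") for m in pattern.finditer(blob)]

# Fabricated binary data with an embedded (fake) connection string.
fake_dll = b"\x00MZ..." + "Server=db;Password=S3cret!".encode("utf-16-le") + b"\x00\x00"
print(utf16_strings(fake_dll))
```

Obfuscators break this by encrypting or encoding string constants and renaming symbols, which is exactly the hardening step the developers here skipped.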
Sample Result of the Decompiling Activity
---
6.7. Lesson Learned from Case Study #2
Finally, we have reached the end of this article. As usual, we will highlight the key lessons learned from the case study discussed, namely:
---
REFERENCES