CPTC - Better Pentest Reports w/ Examples!

This post is intended to help CPTC participants, or penetration testers in general, write better pentest reports, similar to a post I did earlier this year on pentest presentations. It uses anonymized pentest reports from the last two years of CPTC finalists. All of these teams were already successful in their regional events, which means they have been through one round of reporting successfully. It also means they are approaching this pentest both as a retest and as a test of new assets under an expanded scope.

Executive summary
There are a few things I like to keep in mind with pentest reports. Generally speaking, pentest reports have multiple audiences, from security managers, to the executives reviewing them, to technical (sometimes non-security) engineers. Every report should have a clear table of contents, so it's easy to navigate to the different sections that different parties may be interested in. Every report should also include an executive summary, something that briefly summarizes the major findings and vulnerabilities. The executive summary area should include the scope, an overall risk summary, a summary of the major issues, perhaps a brief narrative of the test, and any high-level security remediation advice. If you've encountered a really difficult environment and have a low number of findings, you can also highlight successful security controls. In future CPTC competitions we may also look for relevant compliance findings and guidance in this section. Such governance-related findings aren't normally the focus of a pentest, but those insights are key to CPTC and can serve a very useful purpose in the executive summary. We generally also like to see methodologies, approaches, and risk rating systems around this section. Methodologies show an established testing or ranking system, demonstrating that the organization has a professional approach as opposed to each individual tester doing their own thing. That said, after putting this entire post together, I think sections focused on the methodologies and vulnerability rating systems may actually make more sense in an appendix.

Technical findings
The technical findings section dives into the details of the vulnerabilities discovered in the environment. It should include several summaries or graphics of the findings, so that at a glance a reader can grasp the vulnerabilities and the differences between them. The technical findings can also include their own guide or table of contents. Findings should clearly break down the technical vulnerability, the systems that are vulnerable, the impact the vulnerability has on the company, an overall risk assessment, detailed steps to test the vulnerability, suggested remediation, and any additional links that may be useful to the reader. As a technical security engineer, I think the technical findings are one of the most important parts of the report, so let's quickly break down each element of a finding (a small code sketch of a finding as a structure follows the list):

Risk rating - This is most often a simple summary value, such as low, medium, high, critical, or informational. It generally describes the severity of the vulnerability and is often used to prioritize vulnerabilities. Having a risk rating system or methodology helps a lot, as it references an established system people can read more about elsewhere. I prefer CVSS, the Common Vulnerability Scoring System, despite all of its numerous flaws and inadequacies, because the long CVSS code lets you break down the meaning behind the score with base, temporal, and environmental metrics. Findings in the report are often grouped by risk rating, so that clients can review the most impactful findings first. Clients may also wish to debate the risk assessment on certain findings, so having strong confidence in your risk rating and the system you are using is important for those discussions.
Scope - This is often the items affected or systems impacted by the particular vulnerability, and can be multiple assets or a single system. It can also be a set of applications, paths within an application, or multiple systems in a process, such as a back-end service that renders data passed to it from another application. The scope is critical: when an issue affects multiple hosts or locations, you want the client to address them all.
Description - This is where the vulnerability is fully described and explained. You will always want to reference known CVEs or vulnerability names when the issue is a well-known, existing vulnerability. For larger pentest shops, this section will often be copied from a template or an existing definition of a certain vulnerability. The description should attempt to explain not only how the vulnerability works, but also the permissions it grants and why it is a security issue.
Impact - This is how the vulnerability affects the target systems or how it impacts this client specifically. While it can be a further explanation of the vulnerability and the privileges it grants, it is most useful as a description of what the vulnerability means within this specific instance. This is a great section to tie the specific finding back into the overall picture of the environment or assessment. While most pentest shops use finding templates or pull findings from previous reports, this is a section that should be left blank in the template, so that assessors tailor it to the impact on this specific environment.
Likelihood - This is an optional area that is sometimes included in reports to cover the chance that a certain attack could occur. Here the report can break down many of the elements of a CVSS score, such as the base, temporal, and environmental aspects of the finding. These considerations could take into account the complexity of the attack, whether tools have been released that make the exploit easier, and any mitigating controls the environment may have. This is essentially the other half of the risk calculation: impact * likelihood = general risk rating. That said, some reports will skip this area in the technical finding, describing these details in the impact or description, or summarizing this information in the detailed risk rating score.
Steps to test - This is how the person reading the report can verify the vulnerability. This part of the report often includes actionable technical steps demonstrating how to reproduce the vulnerability or exploit results. The goal of these steps is that a technical engineer can verify the vulnerability and then retest it after they've remediated the vuln. This is most often where auditors will put a full set of screenshots or an example of exploitation along with the steps, which also serves as evidence of the vulnerability.
Remediation steps - These are actionable steps to fix or remediate the vulnerability, often with a description of how to address the issue. I prefer providing my clients with multiple remediation options, often with different tradeoffs. When there is more than one way to remediate a vuln, or an official patch isn't out yet, it is valuable to give the client the options along with insight into each one.
Additional links - These can be a collection of links with more information about the specific vulnerability on each system. This is most often resources describing the vulnerability or CVE, exploit guides, open source code, various tools related to the vulnerability, or even walk-throughs on how to remediate or detect the vulnerability.
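To make the anatomy above concrete, here is a minimal sketch of a finding as a data structure, written in Python. This isn't a prescribed CPTC format, just an illustration: the field names mirror the sections we just covered, and the severity banding follows the published CVSS v3.x qualitative rating scale.

```python
from dataclasses import dataclass, field

def cvss_severity(base_score: float) -> str:
    """Map a CVSS v3.x base score to its qualitative severity band."""
    if base_score == 0.0:
        return "Informational"  # CVSS itself labels this band "None"
    if base_score <= 3.9:
        return "Low"
    if base_score <= 6.9:
        return "Medium"
    if base_score <= 8.9:
        return "High"
    return "Critical"

@dataclass
class Finding:
    """One technical finding, mirroring the sections described above."""
    title: str
    cvss_score: float            # drives the risk rating below
    cvss_vector: str             # the long CVSS code, e.g. "CVSS:3.1/AV:N/..."
    scope: list[str]             # affected hosts, apps, or paths
    description: str             # what the vulnerability is and why it matters
    impact: str                  # what it means for THIS client's environment
    likelihood: str              # optional: the other half of risk
    steps_to_test: list[str]     # reproduction steps, reused at retest
    remediation: list[str]       # one entry per option, with its tradeoffs
    references: list[str] = field(default_factory=list)

    @property
    def risk_rating(self) -> str:
        return cvss_severity(self.cvss_score)
```

Keeping findings in a structure like this also makes it trivial to group them by risk rating so the most impactful ones come first, and to generate the summary tables and charts that show up later in this post.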

Examples
Pentest reports can get very large with any additional assessments, or sometimes with just the findings themselves. It's important to streamline the report for its various audiences, keeping it small and digestible so that the relevant parties can actually work through the parts applicable to them. This is another reason you really want to tuck any larger-than-normal sections or assessments back into the appendix, getting them out of the way of readers who would otherwise have to wade through them. My rule of thumb is that any additional resource over a page in length, such as a software audit or compliance audit, should be included as an appendix. This keeps the report flowing without any single section taking up a massive amount of volume. Let's look at some things I liked from the 2019 reports. These highlights were taken from randomly selected reports in the 2019 anonymized batch:

It's crucial that a report have a table of contents (ToC) and be easy to navigate. This is the home base of the person reading the report, letting them easily jump around to different, applicable content. Even if not all of the sections are relevant to everyone, a clear table of contents lets readers skip sections they care less about, or engage in targeted reading on the findings they are trying to remediate. It's even better when the ToC has links that work in the digital version:


In a different report we can see an example of high-level strategic remediation advice in the executive summary portion of the report. This report included both short term and long term executive security recommendations. The short term issues are findings that were highlighted from the technical findings and elevated to get additional executive attention. Short term issues can often be fixed by a single technical contributor or team. The long term guidance is intended to increase the overall security posture of the environment, although it may take a major team initiative or architectural changes to accomplish. Breaking the critical findings up like this can help executives prioritize remediation projects.


In the example below we can see a popping overall risk rating within the executive summary. Notice how all of the findings have been summarized into a single score. Also notice how the most important findings have been distilled as examples for leadership, including some of the remediation advice. This table is for conveying the risk and associated fixes to management, whereas engineers can find the details for these vulnerabilities and their remediation in the technical details section.

The testing methodology is important to give clients an idea of the approach and tools used. This can be whatever methodology the testers have settled on, but it shows the client a degree of standardization across the testers at the given organization. This is the client's window into the process of the testing organization: are the testers just running a bunch of one-off tools, or is there a method to their assessments? I think it is better to use established testing methodologies, such as PTES, as this shows expertise in the field and gives the client more resources to learn from. Oftentimes pentest organizations will copy/paste this information across reports, and for that reason I think it better fits in the appendix rather than up front, so readers can take a look if they are interested in the approach.

In the image below you can see how one team made a diagram of the hosts they discovered. This type of diagram can provide a wealth of knowledge, such as the hosts, technologies, or network connections identified during the testing. It is a creative way to visualize the scope of the environment and provides an engaging way to show the network scope to executive audiences. Granted, if you were going to include something like this, I would also include the written scope above it.
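If you'd rather generate a diagram like this than draw it by hand, the graphviz Python package is one low-effort option. A minimal sketch follows; the hosts, technologies, and links are made-up placeholders, so feed it whatever your discovery phase actually found.

```python
import graphviz  # pip install graphviz (needs the Graphviz binaries too)

# Hypothetical discovery results: (host, technology) plus observed links
hosts = [
    ("10.0.1.5", "Web server (nginx)"),
    ("10.0.1.10", "App server (Tomcat)"),
    ("10.0.2.20", "Database (MySQL)"),
]
links = [("10.0.1.5", "10.0.1.10"), ("10.0.1.10", "10.0.2.20")]

g = graphviz.Graph("scope", format="png")
for ip, tech in hosts:
    g.node(ip, label=f"{ip}\\n{tech}", shape="box")
for a, b in links:
    g.edge(a, b)
g.render("network-scope")  # writes network-scope.png
```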

The vuln summary seen below is very nice for digesting the criticality and types of findings. This type of information can often be found at the top of the technical findings section and is very useful for security managers or anyone looking to summarize the findings. These types of graphs or charts can sometimes highlight larger trends that get lost in the details of the report.
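A chart like this is cheap to produce once your findings live in a structure. Here is a minimal matplotlib sketch, assuming a flat list of severity strings pulled from your findings; the counts below are made up for illustration.

```python
from collections import Counter
import matplotlib.pyplot as plt

# Hypothetical severity values pulled from the report's findings
severities = ["Critical", "High", "High", "Medium", "Medium", "Medium", "Low"]

order = ["Critical", "High", "Medium", "Low", "Informational"]
counts = Counter(severities)

plt.bar(order, [counts.get(s, 0) for s in order],
        color=["darkred", "red", "orange", "gold", "gray"])
plt.title("Findings by severity")
plt.ylabel("Number of findings")
plt.savefig("findings-by-severity.png")
```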

Below is an example of how compliance insights in the executive summary can be helpful for the executive leadership of a company. Here we can see the audit team highlight core themes from the vulnerabilities they discovered that directly relate to the MOU. Granted, this isn't a full compliance audit, but providing these insights can be valuable for the client to get in front of such an audit or present to the auditors. These insights help distill the impact of specific vulnerabilities back into the overall risk for the company, in this case not meeting their MOU.

The above were all examples from our 2019 reports. Personally I think last year's reports got even better, probably as a result of releasing the previous year's reports. Again, my goal with this post is to make the 2021 reports even better. That said, let's look at some great aspects of the 2020 reports:

In the next example we can see a small excerpt from the Executive Summary that very clearly summarizes the risk and some of the vulnerabilities found. In this example, I particularly like the use of color in the paragraph to make certain details pop and make them easier to digest. I also like when reports use graphics to display this information, but getting it into the executive summary is important.  

One thing that I thought was really neat was that one team included a graphic of the "kill chain", or the pentest narrative. Visualizing this information can help teams see the network boundaries or trust relationships that were exploited during the pentest. The goal of a pentest is not only to enumerate vulnerabilities, but also to penetrate through systems and see what new access can be gained. Visualizing this kill chain allows defenders to focus on securing choke points, the key machines that were used in the penetration test.
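The same graphviz approach from the scope diagram sketch works for a kill chain too; a directed graph with the technique on each edge makes the choke points obvious. The hops below are made up for illustration:

```python
import graphviz  # same dependency as the scope diagram sketch earlier

# Hypothetical attack path: (source, target, technique used for the hop)
hops = [
    ("Attacker", "Web server", "SQL injection"),
    ("Web server", "App server", "reused service credentials"),
    ("App server", "Domain controller", "privilege escalation"),
]

kc = graphviz.Digraph("killchain", format="png")
for src, dst, how in hops:
    kc.edge(src, dst, label=how)
kc.render("kill-chain")  # nodes with many edges are your choke points
```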

Another example I like from 2020 is the risk rating blocks within the technical findings. In the example below you can see a highly condensed risk rating that conveys tons of detail, such as the scope and two different risk breakdowns, including the long CVSS code. This can save page space by dropping sections like likelihood and instead folding those details into the risk rating block.
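If you do print the long CVSS code, remember that most readers can't sight-read it. Since the vector format is standardized (CVSS v3.1), a few lines can expand it into something human-friendly; the vector below is just an example.

```python
# CVSS v3.1 base metric abbreviations, per the FIRST specification
METRICS = {
    "AV": "Attack Vector", "AC": "Attack Complexity",
    "PR": "Privileges Required", "UI": "User Interaction",
    "S": "Scope", "C": "Confidentiality",
    "I": "Integrity", "A": "Availability",
}

def expand_vector(vector: str) -> dict[str, str]:
    """Turn 'CVSS:3.1/AV:N/...' into {metric name: raw value}."""
    parts = vector.split("/")[1:]  # drop the leading 'CVSS:3.1' tag
    return {METRICS.get(k, k): v for k, v in (p.split(":") for p in parts)}

print(expand_vector("CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H"))
# {'Attack Vector': 'N', 'Attack Complexity': 'L', ...}
```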

One thing I really like during retests, or when testing the same environment again, is noting which vulnerabilities have been fixed and which ones remain open from the last engagement. A nice concise list that references the more detailed write-ups is normally useful for managers. Again, bonus points if you make the vulnerabilities actually link to their technical write-ups. You will have to excuse this example because the table ran across a page break, but I think it was important to show the full table.
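Producing that fixed/open list is just a set comparison if you track finding identifiers across engagements. A small sketch with made-up finding IDs:

```python
# Hypothetical finding IDs carried between the two engagements
previous = {"VULN-001", "VULN-002", "VULN-003", "VULN-004"}
current  = {"VULN-002", "VULN-004", "VULN-007"}

fixed      = sorted(previous - current)  # remediated since the last test
still_open = sorted(previous & current)  # carried over, still vulnerable
new        = sorted(current - previous)  # introduced or newly discovered

for label, ids in [("Fixed", fixed), ("Still open", still_open), ("New", new)]:
    print(f"{label}: {', '.join(ids) or 'none'}")
```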

I also appreciate when the remediation findings go in depth on the retests and retest results. In the example below you can see the team expand on the evidence they collected to show the vulnerability was remediated, as well as the compliance implications of fixing it. In other examples they go on to show how the evidence they've collected also demonstrates when a vulnerability remains unremediated, and what work still needs to be done to fix it.

Below we can see an example of an appendix where the team lists the tools they used. I really like this from a client perspective because it helps me understand the code introduced to my environment during the test. I would like the addition even more if they included a commit hash or a frozen version of the code they were using, for quality control. References to where these tools were used throughout the report would make this a stronger report as well.
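Capturing those frozen versions is easy to script at the start of an engagement. A minimal sketch, assuming your tools are checked out as git repositories in known paths (the paths here are hypothetical):

```python
import subprocess

# Hypothetical local checkouts of the tools used on the engagement
tool_repos = {
    "impacket": "/opt/tools/impacket",
    "nuclei": "/opt/tools/nuclei",
}

for name, path in tool_repos.items():
    commit = subprocess.run(
        ["git", "-C", path, "rev-parse", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"{name}: {commit}")  # paste these into the tools appendix
```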


Let's look at some things that could be done better. This is probably my favorite section of this post, because it's way easier to critique bad examples, in my opinion. Bad examples really pop out at you when you've been doing this for a long time, so let's look at some common pitfalls we can avoid in our reports. This also isn't designed to pick on any specific team; again, all of our reports have been double anonymized (once during the competition, and sanitized again after the competition), and the examples were chosen from randomly selected reports where the content just jumped out at me.

In the following image we can see a table of contents that is a little out of order, in my opinion. I think the Introduction section should be moved up to the start, before the Executive Summary. I also think the Conclusion should be moved up and combined with the Executive Summary. Some of these parts could also be moved to the appendix to maintain brevity. The reason is that security managers or executives won't want to read through the bulk of the technical findings, nor do you want them skipping over a large middle section just to reach the end. The further the report goes, there should be less executive analysis and more technical data that individuals can seek out if they want, rather than large conclusions buried at the end of the report.

Below we can see some examples of technical finding blocks that really leave a lot to be desired. The lack of formatting in these finding blocks forces the reader to read the entire block to extract the details they are looking for, such as scope, impact assessments, or remediation advice. I prefer finding blocks with clearly laid out sections like the ones we covered above, which make skimming individual findings easier when looking for specific data.

We can see another example of a questionable finding below. This finding has no external references, no CVEs, and doesn't even mention URL encoding in the definition. Ultimately, this finding introduces more confusion than it resolves. In my opinion, this is an example of a pentest team reaching for something without knowing the full vulnerability details or how to exploit it. Even if this is a real security vulnerability, I don't think the description I read, which essentially amounts to an information leak, warrants a 6.5 CVSS score, especially with no proof of exploitation. I think pentest teams should really stay away from this kind of speculation.

That's all! If you want to see more examples I encourage you to look through some of the full reports. I wish everyone good luck with the new CPTC season approaching. This year we will be looking at Bon Bon Croissant and have some really fun things planned. Please let me know what you think about pentest reports in general or anything I missed in the comments!

Source: lockboxx
