
Pentest Notes - Approaching a Target
by Eva Prokofiev
A list of notes on approaching a target during the reconnaissance stage, when looking for potential application entry points, misconfigurations, and information exposure. Based on third-party resources and some of my own.
Approaching a Target at a High Level:
- *.site.com - Subdomains are known for not receiving the same security focus as the primary site, so subdomain enumeration is key (here's one of my favorite guides to subdomain enumeration methods)
- Amass (https://github.com/caffix/amass) is also helpful
- Check hidden subdomains e.g. *.*.site.com
- Port-scan for obscure services on all hosts. Many high-severity issues have been found on non-standard ports, either via service exploitation or by finding additional hosted web servers.
- Manually review the web application's source code for credentials and potential entry points exposed client-side: JS/HTML, comment areas, redirect points, unreferenced pages or files, etc.
Other clues in published content
Many web applications leave clues in published content that can lead to the discovery of hidden pages and functionality. These clues often appear in the source code of HTML and JavaScript files, so the source code of all published content should be manually reviewed. For example:
- Programmers' comments and commented-out sections of source code may refer to hidden content
- JavaScript may contain page links that are only rendered within the user's GUI under certain circumstances
- HTML pages may contain FORMs that have been hidden by disabling the SUBMIT element

<!-- <A HREF="uploadfile.jsp">Upload a document to the server</A> -->
<!-- Link removed while bugs in uploadfile.jsp are fixed -->
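A minimal sketch of pulling such clues out of fetched page source. This is my own illustration, not a tool from the article; it simply extracts HTML comments with a regex, using the upload-link comment above as sample input:

```python
import re

def find_comment_clues(html: str) -> list[str]:
    """Return all HTML comments, which often hide removed links or dev notes."""
    return re.findall(r"<!--(.*?)-->", html, re.DOTALL)

# Sample input reproducing the commented-out link from the text above
sample = (
    '<html><body>'
    '<!-- <A HREF="uploadfile.jsp">Upload a document to the server</A> -->'
    '<!-- Link removed while bugs in uploadfile.jsp are fixed -->'
    '</body></html>'
)

for clue in find_comment_clues(sample):
    print(clue.strip())
```

In practice you would run this over every response captured in your interception proxy, not a single page.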
- Check for site functionalities, loaded components from remote or local servers or pointed references to the API or a particular page/function in the site.
- Do OSINT on your target. Many interesting bugs, misconfigurations, vulnerabilities, and information exposures can be found this way. I personally use this as the first step, before any specific pentesting/recon methodology.
One example of OSINT is using Google dorks to find interesting pages, content, and directories: pages with errors, hidden credentials, etc.
- Different search engines may reveal different indexed information (Bing, Yandex, Google, etc.)
A comprehensive guide to Google hacking can be found here
Below are some things worth checking on target domain (some taken from here)
Directory listing
site:*.site.com intitle:index.of
Configuration files
site:*.target.com ext:xml | ext:conf | ext:cnf | ext:reg | ext:inf | ext:rdp | ext:cfg | ext:txt | ext:ora | ext:ini
Database + log files
site:*.target.com ext:sql | ext:dbf | ext:mdb | ext:log
Backup and old files
site:*.target.com ext:bkf | ext:bkp | ext:bak | ext:old | ext:backup
Login pages
site:*.target.com inurl:login
SQL errors
site:*.target.com intext:"sql syntax near" | intext:"syntax error has occurred" | intext:"incorrect syntax near" | intext:"unexpected end of SQL command"
Publicly exposed documents
site:*.target.com ext:doc | ext:docx | ext:odt | ext:pdf | ext:rtf | ext:sxw | ext:psw | ext:ppt | ext:pptx | ext:pps | ext:csv
Check how files are served to the end user on the server
Interesting keywords
This search can provide insight into potential information disclosure: errors, session errors, misconfigurations, hidden login panels, application errors, database connection errors, app entry points, etc.
- Depending on the target, the keywords used can be customized.
site:*.target.com inurl:adm | dashboard | logout | ...etc
site:*.target.com status | test | session | null | system | download | version | powered by | etc | expired
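For repeated engagements it can be convenient to template these queries per target. A small sketch of my own (the category names are arbitrary labels, not part of any tool) that assembles the dorks listed above for a given domain:

```python
def build_dorks(domain: str) -> dict[str, str]:
    """Assemble the Google dork queries from the notes above for one target domain."""
    scope = f"site:*.{domain}"
    return {
        "dir_listing": f"{scope} intitle:index.of",
        "config_files": f"{scope} ext:xml | ext:conf | ext:cnf | ext:reg | ext:inf | ext:rdp | ext:cfg | ext:txt | ext:ora | ext:ini",
        "db_and_logs": f"{scope} ext:sql | ext:dbf | ext:mdb | ext:log",
        "backups": f"{scope} ext:bkf | ext:bkp | ext:bak | ext:old | ext:backup",
        "login_pages": f"{scope} inurl:login",
        "documents": f"{scope} ext:doc | ext:docx | ext:odt | ext:pdf | ext:rtf | ext:ppt | ext:pptx | ext:csv",
    }

for name, query in build_dorks("target.com").items():
    print(f"{name}: {query}")
```

Paste the generated queries into the search engine of your choice; as noted above, different engines index different content.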
You can also check these tools for automating Google dork searches:
https://github.com/ZephrFish/GoogD0rker/
Cached pages
Check cached pages for content that has been changed or removed from the page, login information, test credentials, etc.
If some pages aren't available in the cache, try the Wayback Machine (web.archive.org) or its alternatives.
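The Wayback Machine exposes a CDX API for listing archived captures of a domain. A small helper of my own that builds such a query URL (the parameter choices here, like collapsing duplicate captures and a result limit, are my assumptions about a sensible default):

```python
from urllib.parse import urlencode

def wayback_cdx_url(domain: str, limit: int = 50) -> str:
    """Build a Wayback Machine CDX API query listing archived URLs for a domain."""
    params = {
        "url": f"{domain}/*",
        "output": "json",
        "collapse": "urlkey",  # de-duplicate near-identical captures
        "limit": str(limit),
    }
    return "https://web.archive.org/cdx/search/cdx?" + urlencode(params)

print(wayback_cdx_url("target.com"))
```

Fetching that URL returns a JSON list of archived snapshots, which often includes pages long since removed from the live site.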
Approaching a Target at the Application Level:
For fingerprinting, you need to understand and identify the frameworks you are testing against. Some quick Chrome extensions and tools that can help with that:
- Wappalyzer
- BuiltWith
- Retire.js
- https://github.com/jobertabma/relative-url-extractor
- Whatweb
- Another alternative online source I like is https://suip.biz/ for webapp testing
These are just some of the methods you can use. There are nmap NSE scripts that are designed for this as well.
When fingerprinting identifies a piece of software or a framework, check for publicly available known CVEs/exploits/PoCs, e.g. via Sploitus, searchsploit, or Google.
Identifying software version
Sometimes software is configured not to expose its installed version in an obvious place, so the options below can serve as alternatives:
- run a scanner
- check the response headers
- check the bottom of the page (footers often carry version strings)
- mousing over objects on the page may reveal version info
- review the source code
- connect to exposed services and inspect the output
- banner grabbing
- does changing the HTTP request method reveal extra info?
- does curl http://www.site.com/page.htm reveal any information?
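Several of the checks above come down to reading version-revealing HTTP headers out of a raw response (the output of a banner grab or a curl -i). A sketch of that parsing step, using a made-up sample response for illustration:

```python
def parse_banner(raw_response: str) -> dict[str, str]:
    """Extract version-revealing headers from a raw HTTP response."""
    interesting = ("server", "x-powered-by", "x-aspnet-version")
    headers = {}
    for line in raw_response.split("\r\n"):
        if ":" in line:
            name, _, value = line.partition(":")
            if name.strip().lower() in interesting:
                headers[name.strip().lower()] = value.strip()
    return headers

# Hypothetical response; real targets may hide or falsify these headers
sample = (
    "HTTP/1.1 200 OK\r\n"
    "Server: Apache/2.4.29 (Ubuntu)\r\n"
    "X-Powered-By: PHP/7.2.24\r\n"
    "Content-Type: text/html\r\n\r\n"
)
print(parse_banner(sample))
```

Remember that these headers can be stripped or spoofed, so treat them as a hint to confirm through other channels, not as proof.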
Mapping is the key to finding application entry points/paths; in large applications it becomes a necessity. Conventional wisdom says that your spider or scanner will give you a perfect site tree to inspect, but seasoned testers know this is simply not true.
A full browse of the site while connected to an interception proxy is mandatory. Are there ways to speed this up or ensure completeness? Not 100%, but I do like utilizing something like Linkclump to drive exploration.
Directory bruteforcing
I prefer using wfuzz or dirb with lists from the FuzzDB, SecLists, SVNDigger, and GitDigger projects. It is also often a good idea to check for customized directories (try the name of the domain, subdomain, or other indicators as the directory name),
e.g. target.e231.com/e231 or subname.e231.com/subname
For hidden or interesting subdomains that I find, such as dev/test servers or specific production servers (marketing/demo/api/tests, etc.), I will usually use bigger lists for directory bruteforcing.
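A sketch of generating those customized directory candidates from a hostname, as suggested above. The extra suffix words are my own illustrative picks; in practice you would append these to a standard wordlist before feeding it to wfuzz or dirb:

```python
def custom_dir_candidates(hostname: str) -> list[str]:
    """Derive directory-name guesses from the hostname's own labels."""
    labels = hostname.split(".")
    # Keep the subdomain and domain labels, drop the TLD
    base = labels[:-1] if len(labels) > 2 else labels[:1]
    extras = ["dev", "test", "backup", "old", "admin"]  # illustrative additions
    seen, out = set(), []
    for word in base + extras:
        if word and word not in seen:
            seen.add(word)
            out.append(f"/{word}")
    return out

print(custom_dir_candidates("subname.e231.com"))
```

For subname.e231.com this yields /subname and /e231 first, mirroring the target.e231.com/e231 example above.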
Parameter Testing
Something that has worked for me is testing parameters: pick a parameter that has an effect on the flow of the application. For example, a field that takes a number (let's call it ID).
What happens if:
- put in a negative number value?
- increment or decrement the number?
- put in a really large number?
- string or symbol characters?
- traverse a directory with ../ ?
- XSS vectors?
- SQLi vectors?
- non-ASCII characters?
- mess with the variable type, such as casting a string to an array?
- null characters or no value?
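The checklist above can be turned into a reusable payload table. A sketch of my own, with one illustrative payload per check (the exact strings are my assumptions; tune them to the target):

```python
def id_mutations(value: int) -> dict[str, str]:
    """Mutations of a numeric ID parameter mirroring the checks above."""
    return {
        "negative": str(-abs(value)),
        "increment": str(value + 1),
        "decrement": str(value - 1),
        "huge": str(10 ** 18),
        "string": "abc",
        "symbols": "!@#$%",
        "traversal": "../../etc/passwd",
        "xss": "<script>alert(1)</script>",
        "sqli": "1' OR '1'='1",
        "non_ascii": "\u00ff\u00ff\u00ff",
        "array_cast": f"id[]={value}",  # type juggling: send the param as an array
        "null_char": "%00",
        "empty": "",
    }

for name, payload in id_mutations(42).items():
    print(f"id={payload!r}  # {name}")
```

Send each payload through your interception proxy and diff the responses; the increment/decrement cases in particular are a quick IDOR check.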
Check if you can draw any conclusions from the outcomes of these tests:
- understand the error output
- is anything broken or exposed?
- can this action affect other things in the web app?
S3 buckets
Enumerating S3 buckets during recon is a good idea. Look for them manually or use tools such as:
- https://github.com/0xSearches/sandcastle
- https://digi.ninja/projects/bucket_finder.php
- Or target related exposed information using https://buckets.grayhatwarfare.com/
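The tools above largely work by permuting the target's name into likely bucket names. A minimal sketch of that idea (the suffix list is my own guess at common naming patterns, not exhaustive); each generated URL can then be probed for a response other than NoSuchBucket:

```python
def bucket_candidates(name: str) -> list[str]:
    """Generate common S3 bucket-name permutations for a target name."""
    suffixes = ["", "-backup", "-dev", "-prod", "-assets", "-media", "-logs"]
    return [f"{name}{s}" for s in suffixes]

def bucket_urls(name: str) -> list[str]:
    """Map candidate names onto S3 virtual-hosted-style URLs."""
    return [f"https://{b}.s3.amazonaws.com" for b in bucket_candidates(name)]

print(bucket_urls("target"))
```

Only probe buckets that are in scope for your engagement; an open bucket belonging to a third party is not fair game.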
Some good stuff to check/read
- OWASP testing guide
- https://pentester.land/conference-notes/2018/08/02/levelup-2018-the-bug-hunters-methodology-v3.html
- https://www.owasp.org/index.php/Testing_Checklist
- http://www.0daysecurity.com/penetration-testing/enumeration.html
- https://resources.infosecinstitute.com/manually-web-application-penetration-testing-fuzzing/
- https://highon.coffee/blog/penetration-testing-tools-cheat-sheet/
- https://www.cybrary.it/0p3n/web-application-penetration-testing-checklist-detailed-cheat-sheet/
About the Author
A cyber threat intelligence expert with close to 10 years of experience in information security, Eva's expertise and passion lie in making organizations secure and bringing them value and awareness of their real cyber threats.
The article has been originally published at: https://www.linkedin.com/pulse/pentest-notes-approaching-target-eva-prokofiev/