Awesome Tool
Here’s a beginner-friendly Awesome List on creating, designing, and maintaining tools for bug bounty and cybersecurity development. This list focuses on a streamlined approach with templates, checklists, and best practices.
Awesome Tool Creation for Cybersecurity
A curated list of resources, templates, and checklists to help you quickly create tools that are functional, good-looking, and easy to maintain.
Table of Contents
1. Introduction
Building custom tools for bug bounty or cybersecurity tasks is a skill that can save time, reduce errors, and improve efficiency. This list will guide you through creating tools with:
Proper documentation.
Easy-to-use interfaces.
Scalability for future updates.
2. Development Basics
Languages to Learn
Python: Great for automation and scripting.
Bash: Perfect for lightweight scripts and command-line tools.
Go: Efficient and fast for building scalable tools.
Environment Setup
Install essential software:
sudo apt update && sudo apt install -y git curl python3 python3-pip golang
Set up a project structure:
mkdir -p ~/tools/mytool/{src,docs,tests}
Version Control
Initialize Git:
git init
Use branches for new features:
git checkout -b feature/my-feature
3. Pre-Built Templates
CLI Tool Template
#!/bin/bash
# MyTool: A simple CLI example
target=$1
output_dir="./output"
# Ensure target is specified
if [ -z "$target" ]; then
echo "Usage: ./mytool.sh <target>"
exit 1
fi
# Create output directory
mkdir -p $output_dir
# Example task
echo "[+] Scanning $target"
nmap -sV -oN $output_dir/nmap_$target.txt $target
Python Recon Script
import os
import subprocess
import sys

def check_tool(tool_name):
    result = subprocess.run(["which", tool_name], capture_output=True, text=True)
    if not result.stdout.strip():
        print(f"Error: {tool_name} not found. Install it first!")
        sys.exit(1)

def main(target):
    os.makedirs("./output", exist_ok=True)
    output_file = f"./output/{target}_scan.txt"
    print(f"[+] Scanning {target}")
    subprocess.run(["nmap", "-sV", "-oN", output_file, target])

if __name__ == "__main__":
    if len(sys.argv) != 2:
        print("Usage: python recon.py <target>")
        sys.exit(1)
    check_tool("nmap")
    main(sys.argv[1])
4. Tool Design Checklist
Basic Features
Accept input via command-line arguments.
Include usage instructions (--help flag).
Validate user input (e.g., ensure the domain is valid).
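A minimal sketch of that kind of input validation in Bash (the regex is a loose hostname sanity check, not full RFC validation):
#!/bin/bash
target=$1
# Reject empty input and anything that does not look like a hostname
if [[ -z "$target" || ! "$target" =~ ^[A-Za-z0-9.-]+\.[A-Za-z]{2,}$ ]]; then
  echo "Usage: $0 <domain>" >&2
  exit 1
fi
echo "[+] Target accepted: $target"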
Structure
src/: Codebase.
docs/: Documentation files (README, usage guides).
tests/: Unit and functional tests.
Error Handling
Log errors to a file:
command_here 2>> error.log
Exit on failure with meaningful messages.
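One common pattern is a small die helper that prints the message, logs it, and exits; a sketch (the error.log path is an example):
die() {
  echo "[-] $1" | tee -a error.log >&2
  exit 1
}

# Example: stop the run if a required tool is missing
command -v nmap >/dev/null 2>&1 || die "nmap is not installed"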
5. Automation and Recon Scripting
Recon Pipeline Example
#!/bin/bash
target=$1
output_dir="./recon/$target"
if [ -z "$target" ]; then
echo "Usage: $0 <target>"
exit 1
fi
mkdir -p $output_dir
# Subdomain enumeration
echo "[+] Enumerating subdomains..."
subfinder -d $target -silent > $output_dir/subdomains.txt
# Probing live domains
echo "[+] Probing live domains..."
cat $output_dir/subdomains.txt | httpx -silent > $output_dir/live_domains.txt
# Scanning for vulnerabilities
echo "[+] Running Nuclei..."
nuclei -l $output_dir/live_domains.txt -t ~/nuclei-templates -o $output_dir/nuclei_results.txt
6. User Interface Tips
Command-Line Features
Add colorful output:
echo -e "\033[0;32m[+] Task Completed\033[0m"
Include a progress bar for long tasks.
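For example, a minimal in-place counter for a loop over a target list (targets.txt is a placeholder; tools like pv offer fancier bars):
total=$(wc -l < targets.txt)
count=0
while read -r host; do
  count=$((count + 1))
  # \r rewrites the same line so the counter updates in place
  printf "\r[+] Checking %s (%d/%d)" "$host" "$count" "$total"
done < targets.txt
printf "\n"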
Customizing Output
Use tables for clarity:
printf "%-15s %-10s\n" "Domain" "Status"
printf "%-15s %-10s\n" "example.com" "Live"
Generate Markdown reports:
echo "# Report for $target" > report.md
echo "- Live Domains: $(wc -l < live_domains.txt)" >> report.md
7. DevOps and CI/CD Integration
GitHub Actions Pipeline
name: Tool Deployment
on:
  push:
    branches:
      - main
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3
      - name: Install Dependencies
        run: |
          sudo apt update && sudo apt install -y nmap
          # subfinder and nuclei are not in the default apt repos; install them via Go
          go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
          go install -v github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest
          echo "$HOME/go/bin" >> "$GITHUB_PATH"
      - name: Run Tool
        run: ./mytool.sh example.com
8. Resources for Inspiration
Tool Repositories
Cheatsheets
Books
"Automate the Boring Stuff with Python" by Al Sweigart.
"Black Hat Python" by Justin Seitz.
This list is a starting point for building your tools and automating tasks. Feel free to customize and expand it for your needs. 🚀
Comprehensive guidelines for building effective, scalable, and user-friendly tools, focusing on bug bounty, cybersecurity, and automation.
Table of Contents
1. Introduction
Building tools isn't just about automating tasks; it's about creating reliable, efficient, and reusable workflows. This list provides detailed steps to go from zero to building highly functional tools for bug bounty and security research.
2. Getting Started with Tool Development
Languages to Learn
Python: Ideal for rapid development and API integration.
Bash: Great for quick and lightweight automation scripts.
Go (Golang): Perfect for building high-performance tools.
JavaScript: Use for browser automation and interacting with web apps.
Tools to Install
Git for version control:
sudo apt install git
Package managers:
Python:
pip and pipenv
JavaScript:
npm or yarn
Go:
go install
Development Environment:
IDE: VSCode, PyCharm, or IntelliJ.
Linters:
flake8 for Python, shellcheck for Bash, and golangci-lint for Go.
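Typical invocations look like this (paths are placeholders):
flake8 src/                # Python style and error checks
shellcheck scripts/*.sh    # Bash static analysis
golangci-lint run ./...    # Go linters, run from the module root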
3. Tool Design Fundamentals
Key Design Principles
User-friendly interface:
Use clear command-line options (-h, --help).
Add error messages for incorrect inputs.
Scalable architecture:
Use modular functions.
Store reusable logic in libraries.
Comprehensive output:
Include color-coded CLI output.
Generate reports in multiple formats (JSON, CSV, Markdown).
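As a rough sketch of multi-format output, the same counts can be written to Markdown and JSON in one pass (assumes jq is installed and a live_hosts.txt file already exists):
#!/bin/bash
target=$1
live_count=$(wc -l < live_hosts.txt)

# Markdown summary
{
  echo "# Report for $target"
  echo "- Live hosts: $live_count"
} > report.md

# JSON version of the same data, built with jq
jq -n --arg target "$target" --argjson live "$live_count" \
  '{target: $target, live_hosts: $live}' > report.json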
Directory Structure for Projects
mytool/
├── README.md # Documentation
├── LICENSE # License file
├── requirements.txt # Python dependencies
├── main.py # Entry point
├── src/ # Core logic
│ ├── utils.py # Helper functions
│ ├── modules/ # Submodules
├── tests/ # Unit tests
├── config/ # Configuration files
└── output/             # Scan results
4. Advanced Automation Scripts
Full Recon Workflow
#!/bin/bash
domain=$1
output_dir="./output/$domain"
if [ -z "$domain" ]; then
echo "Usage: $0 <domain>"
exit 1
fi
mkdir -p $output_dir
echo "[+] Enumerating subdomains..."
subfinder -d $domain -silent > $output_dir/subdomains.txt
amass enum -passive -d $domain >> $output_dir/subdomains.txt
echo "[+] Checking live hosts..."
cat $output_dir/subdomains.txt | httpx -silent > $output_dir/live_hosts.txt
echo "[+] Running Nuclei..."
nuclei -l $output_dir/live_hosts.txt -t ~/nuclei-templates -o $output_dir/nuclei_results.txt
Scheduled Recon with Crontab
Edit the crontab:
crontab -e
Add a daily schedule:
0 2 * * * /path/to/your/script.sh >> /path/to/logfile.log 2>&1
5. Scripting Best Practices
Bash
Set strict mode to catch errors:
set -euo pipefail
Handle arguments:
while getopts "d:o:" opt; do
  case $opt in
    d) domain=$OPTARG ;;
    o) output_dir=$OPTARG ;;
    *) echo "Invalid option"; exit 1 ;;
  esac
done
Python
Use argparse for CLI tools:
import argparse

parser = argparse.ArgumentParser(description="Tool Description")
parser.add_argument("-d", "--domain", help="Target domain", required=True)
args = parser.parse_args()
Leverage virtual environments:
python3 -m venv venv
source venv/bin/activate
6. Integrating APIs in Tools
API Keys
Store them in environment variables:
export API_KEY="your_api_key"
Access in Python:
import os

api_key = os.getenv("API_KEY")
Example API Call
Using Python's requests library:
import requests
url = "https://api.shodan.io/shodan/host/search"
params = {"key": "your_api_key", "query": "apache"}
response = requests.get(url, params=params)
print(response.json())
7. Report Generation
Markdown Report Template
# Recon Report: [Target]
## Summary
- Total Subdomains Found: XX
- Live Hosts: XX
- Vulnerabilities Found: XX
## Subdomains
- subdomain1.example.com
- subdomain2.example.com
## Vulnerability Findings
1. **XSS**: URL: `https://example.com/test?param=<script>alert()</script>`
2. **SQL Injection**: URL: `https://example.com?id=1'`
JSON Output
import json
data = {
"target": "example.com",
"subdomains": ["sub1.example.com", "sub2.example.com"],
"vulnerabilities": [
{"type": "XSS", "url": "https://example.com/test"},
{"type": "SQLi", "url": "https://example.com?id=1"}
]
}
with open("report.json", "w") as f:
    json.dump(data, f, indent=4)
8. Testing and Debugging
Unit Testing
Use Python’s unittest module:
import unittest

def add(a, b):
    return a + b

class TestMath(unittest.TestCase):
    def test_add(self):
        self.assertEqual(add(1, 2), 3)

if __name__ == "__main__":
    unittest.main()
Debugging Tips
Use set -x in Bash scripts for tracing.
In Python, use pdb:
import pdb; pdb.set_trace()
9. Useful Libraries and Frameworks
Bash
Python
Requests: Simplified HTTP requests.
BeautifulSoup: Web scraping.
Go
10. Resources and Learning Platforms
OWASP Cheat Sheets: Security best practices.
HackTricks: Offensive techniques.
PayloadsAllTheThings: Exploitation payloads.
This comprehensive guide should give you the confidence to start building, automating, and refining your own tools. 🚀
Here's an expanded version of the Awesome List with additional tips, tools, workflows, and techniques.
Extended Ultimate Awesome List for Tool Building and Automation
Build faster, more efficient tools for bug bounty, cybersecurity, and recon tasks with this comprehensive guide.
11. Advanced Recon Techniques
Subdomain Enumeration
Passive Techniques:
Use crt.sh for Certificate Transparency logs.
Tools: subfinder, amass, assetfinder, dnsx.
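For the crt.sh route, a quick sketch of querying its JSON endpoint (needs curl and jq; the endpoint can be slow or rate-limited):
curl -s "https://crt.sh/?q=%25.example.com&output=json" \
  | jq -r '.[].name_value' \
  | sed 's/^\*\.//' | sort -u > crtsh_subdomains.txt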
Active Techniques:
DNS brute-forcing with puredns or dnsx.
Permutation-based enumeration using gotator.
DNS Data Gathering:
Retrieve DNS records with dig, host, or dnsx:
echo target.com | dnsx -a -resp
Recursive Subdomain Search:
Use tools like dsieve to recursively find deeper subdomains:
dsieve -d target.com -r
URL Discovery
Combine Wayback Machine, Common Crawl, and gau for maximum coverage:
cat targets.txt | gau | sort -u > urls.txt
waybackurls < targets.txt >> urls.txt
12. Dynamic Wordlist Generation
Generating Wordlists from JS Files
Use getJS and jsluice:
cat urls.txt | getjs | xargs -n 1 jsluice -u > wordlist.txt
Generate Password Lists
Use pydictor:
pydictor -base rule.txt -o passwords.txt
Custom Subdomain Lists
Merge and deduplicate existing subdomain lists, and validate your resolvers with dnsvalidator before brute-forcing:
cat list1.txt list2.txt | sort -u > subdomains.txt
dnsvalidator -tL resolvers.txt -threads 20 -o valid_resolvers.txt
13. Advanced Scripting Tips
Parallel Processing
Use GNU Parallel for running commands on multiple cores:
cat subdomains.txt | parallel -j 10 "curl -Is {}"
Tool Dependency Checker
Ensure all required tools are installed before execution:
tools=("nuclei" "amass" "subfinder")
for tool in "${tools[@]}"; do
  if ! command -v $tool &>/dev/null; then
    echo "$tool is not installed. Please install it first!"
    exit 1
  fi
done
Error Handling
Redirect errors to a separate log file:
command_here >> results.log 2>> errors.log
14. Automation Pipelines
GitHub Actions Pipeline
Automate recon with scheduled scans:
name: Automated Recon Pipeline
on:
  schedule:
    - cron: '0 2 * * *'  # Daily at 2 AM
  workflow_dispatch:
jobs:
  recon:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3
      - name: Install Tools
        run: |
          sudo apt update
          # subfinder, httpx, and nuclei are not in the default apt repos; install them via Go
          go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
          go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
          go install -v github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest
          echo "$HOME/go/bin" >> "$GITHUB_PATH"
      - name: Run Recon
        run: |
          ./scripts/recon.sh
Google Cloud Automation
Schedule automated scans with Google Cloud Functions and Cloud Scheduler.
Example function for subdomain enumeration:
import subprocess

def enumerate_subdomains(event, context):
    domain = event.get("domain", "example.com")
    subprocess.run(["subfinder", "-d", domain, "-o", f"{domain}.txt"])
15. Post-Processing and Data Visualization
Filtering Unique URLs
Deduplicate and sort URLs:
cat urls.txt | sort -u > clean_urls.txt
Create Visual Maps
Subdomain Graphs
Use amass and Maltego for visualizing subdomain connections.
Export with:
amass viz -d target.com -o output.json
Network Graphs
Use tools like neo4j or Graphviz.
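A minimal Graphviz sketch that draws live subdomains as a star around the root domain (assumes a live_hosts.txt file and the dot binary are available):
#!/bin/bash
domain=$1
{
  echo "digraph recon {"
  while read -r sub; do
    echo "  \"$domain\" -> \"$sub\";"
  done < live_hosts.txt
  echo "}"
} > recon.dot
dot -Tpng recon.dot -o recon.png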
HTML Reporting
Use Python libraries like Jinja2 (templating) or BeautifulSoup (HTML handling) to generate reports:
from jinja2 import Template

template = Template("<h1>Report for {{ domain }}</h1>")
print(template.render(domain="example.com"))
16. Tool Examples
API Token Validator
import requests
def check_api_key(api_key):
    response = requests.get("https://api.example.com", headers={"Authorization": f"Bearer {api_key}"})
    if response.status_code == 200:
        print("API Key is valid")
    else:
        print("Invalid API Key")
Port Scanning Automation
#!/bin/bash
target=$1
output_dir="./output/$target"
mkdir -p $output_dir
echo "[+] Scanning ports on $target..."
nmap -sC -sV -oN $output_dir/ports.txt $target
17. Notifications
Slack Integration
Send results to Slack:
webhook_url="https://hooks.slack.com/services/your/webhook/url"
message="Recon complete for $target"
curl -X POST -H 'Content-type: application/json' --data '{"text":"'"$message"'"}' $webhook_url
Telegram Bot
Notify via Telegram:
import requests
bot_token = "your_bot_token"
chat_id = "your_chat_id"
message = "Recon complete"
requests.post(f"https://api.telegram.org/bot{bot_token}/sendMessage", data={"chat_id": chat_id, "text": message})
18. Advanced Tools and Frameworks
Recon Tools
katana: Fast crawler for endpoints.
waymore: Wayback URLs with custom filtering.
Interlace: Automate tool chaining for multithreaded scans.
Fuzzing Tools
ffuf: Fast web fuzzer for directories and parameters.
GoFuzz: Fuzzing Go applications.
19. Continuous Improvement
Set Benchmarks
Track metrics for improvement:
Time to find vulnerabilities.
Tool efficiency (false positives vs. true positives).
Automation speed.
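A simple way to track automation speed over time is to append each run's duration to a CSV (recon.sh is a placeholder for your own pipeline):
#!/bin/bash
target=$1
start=$(date +%s)
./recon.sh "$target"
end=$(date +%s)
echo "$(date -u +%Y-%m-%dT%H:%M:%SZ),$target,$((end - start))s" >> benchmarks.csv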
Integrate Machine Learning
Use AI-based tools like ChatGPT or Weka to analyze recon data patterns for hidden vulnerabilities.
20. Resources for Inspiration
Books
"Black Hat Python" by Justin Seitz.
"Hacking APIs" by Corey Ball.
Communities
Learning Platforms
This extended guide is designed to give you everything you need to get started with tool development, automation, and advanced recon workflows.
The sections below build on the previous list with new categories and actionable insights for scripting, automation, and tool creation in bug hunting and cybersecurity:
31. Web Scraping and Automation
Python Web Scraping
Scraping with requests and BeautifulSoup:
import requests
from bs4 import BeautifulSoup

url = "https://example.com"
response = requests.get(url)
soup = BeautifulSoup(response.content, "html.parser")
for link in soup.find_all("a"):
    print(link.get("href"))
Scraping APIs:
import requests

url = "https://api.example.com/data"
headers = {"Authorization": "Bearer your_api_token"}
response = requests.get(url, headers=headers)
print(response.json())
Browser Automation
Using Selenium for Dynamic Content:
from selenium import webdriver

driver = webdriver.Chrome()
driver.get("https://example.com")
print(driver.page_source)
driver.quit()
Headless Browsing with Puppeteer:
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto('https://example.com');
  console.log(await page.content());
  await browser.close();
})();
32. Advanced Network Scanning
Bash Utilities
Masscan for Fast Port Scanning:
masscan -p1-65535 192.168.1.0/24 --rate 10000 -oG masscan_results.txt
Custom Banner Grabbing:
nmap -sV --script=banner 192.168.1.1
Python Scanners
Custom Port Scanner:
import socket

def scan(ip, port):
    try:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(1)
        sock.connect((ip, port))
        print(f"Port {port} is open on {ip}")
        sock.close()
    except (socket.timeout, OSError):
        pass

scan("192.168.1.1", 80)
33. OSINT Automation
OSINT Tools
Email Enumeration with holehe:
holehe -l emails.txt
Gather emails, subdomains, and hosts with theHarvester:
theharvester -d target.com -b all
Custom Scripts
Automate Google Dorking:
dorks=("site:example.com inurl:admin" "site:example.com ext:sql")
for dork in "${dorks[@]}"; do
  query=$(echo $dork | sed 's/ /+/g')
  curl "https://www.google.com/search?q=$query"
done
LinkedIn Scraping for Employee Info:
import requests

url = "https://www.linkedin.com/search/results/people/"
headers = {"User-Agent": "Mozilla/5.0"}
response = requests.get(url, headers=headers)
print(response.text)
34. Advanced Vulnerability Exploitation
Exploitation Scripts
SQL Injection Exploitation with Python:
import requests

payload = "' OR '1'='1"
url = "https://example.com/login"
data = {"username": payload, "password": payload}
response = requests.post(url, data=data)
print(response.text)
XSS Automation:
echo "<script>alert(1)</script>" > payloads.txt
cat urls.txt | while read url; do
  curl "$url?q=$(cat payloads.txt)"
done
35. API Security Testing
Custom Scripts
API Endpoint Testing:
import requests

url = "https://api.example.com/v1/data"
headers = {"Authorization": "Bearer token"}
response = requests.get(url, headers=headers)
if response.status_code == 200:
    print("API is functional")
else:
    print("API error:", response.status_code)
Automate Rate-Limiting Checks:
for i in {1..100}; do curl -X GET "https://api.example.com" & done
36. Data Extraction
Extract Key Info
Find All IPs in a Log File:
grep -Eo '([0-9]{1,3}\.){3}[0-9]{1,3}' logs.txt | sort -u > ips.txt
Extract Emails:
grep -Eo "\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,}\b" logs.txt > emails.txt
37. Continuous Monitoring Pipelines
GitHub Actions for Automation
Recon Automation:
name: Recon Automation
on:
  schedule:
    - cron: '0 0 * * *'
jobs:
  recon:
    runs-on: ubuntu-latest
    steps:
      - name: Subdomain Enumeration
        run: subfinder -d example.com > subs.txt
Slack Alerts for Findings:
curl -X POST -H 'Content-type: application/json' --data '{"text":"Scan completed!"}' $SLACK_WEBHOOK
38. File Handling in Automation
Parse and Process Large Files
Bash: Split Large Files:
split -l 1000 large_file.txt small_
Python: Process JSON:
import json

with open("data.json") as f:
    data = json.load(f)
for item in data:
    print(item)
39. Advanced Reporting
Visualizations
Graph Vulnerabilities:
Use Python’s matplotlib:
import matplotlib.pyplot as plt

data = [10, 15, 20]
labels = ["XSS", "SQLi", "SSRF"]
plt.pie(data, labels=labels, autopct='%1.1f%%')
plt.title("Vulnerability Distribution")
plt.show()
Heatmaps for Severity:
import seaborn as sns
import matplotlib.pyplot as plt

data = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
sns.heatmap(data, annot=True)
plt.show()
40. Quick Automation Ideas
Instant Tools
Certificate Transparency:
curl -s "https://crt.sh/?q=example.com" | grep "example.com"
Screenshot Script:
eyewitness --web -f live.txt -d screenshots
Check for WAF:
wafw00f https://example.com
This expanded list includes even more actionable insights for scripting, automation, and creating tools with over 40 unique sections. Each item is tailored to help you create effective, automated, and scalable bug bounty workflows.
21. Advanced Recon Pipelines
GitOps for Recon
Use GitLab CI/CD for recon pipelines:
stages:
  - recon
  - scan

recon:
  stage: recon
  script:
    - subfinder -d target.com > subs.txt
    - httpx -l subs.txt -o live.txt

scan:
  stage: scan
  script:
    - nuclei -l live.txt -t ~/nuclei-templates/
Kubernetes Integration:
Deploy tools like nuclei and ffuf on a Kubernetes cluster for scalable recon.
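One lightweight way to try this is a one-off Kubernetes Job via kubectl; the projectdiscovery/nuclei image and the target URL are assumptions, adjust them for your cluster:
# Run a single nuclei scan as a Kubernetes Job
kubectl create job nuclei-scan --image=projectdiscovery/nuclei -- nuclei -u https://example.com
# Follow the scan output
kubectl logs job/nuclei-scan -f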
Axiom for Distributed Scanning:
Create distributed pipelines using Axiom:
axiom-scan live.txt -m nuclei -t ~/nuclei-templates/
22. Custom Scripting for Bug Bounty
Enhanced Bash Utilities
Retry Logic for Unstable Commands:
retry() {
  local n=1
  local max=5
  local delay=5
  while true; do
    "$@" && break || {
      if [[ $n -lt $max ]]; then
        ((n++))
        echo "Command failed. Attempt $n/$max:"
        sleep $delay
      else
        echo "Command failed after $n attempts."
        return 1
      fi
    }
  done
}
Use as:
retry curl -I https://example.com
Dynamic Wordlist Updates:
cat live_urls.txt | grep ".js" | cut -d '/' -f3 | sort -u > js_wordlist.txt
Automated Screenshot Script:
cat live.txt | aquatone -out screenshots
Python Recon Scripts
HTTP Header Analyzer:
import requests

def analyze_headers(url):
    response = requests.get(url)
    headers = response.headers
    for header, value in headers.items():
        print(f"{header}: {value}")

analyze_headers("https://example.com")
Directory Brute-Forcer:
import requests

url = "https://example.com"
wordlist = ["admin", "login", "config"]
for word in wordlist:
    response = requests.get(f"{url}/{word}")
    if response.status_code == 200:
        print(f"Found: {url}/{word}")
23. Advanced Data Parsing
Extract Domains from JS Files
Using Bash:
grep -oP 'https?://[a-zA-Z0-9.-]+' *.js | sort -u > domains.txt
With Python:
import re

with open("file.js", "r") as f:
    content = f.read()
urls = re.findall(r"https?://[a-zA-Z0-9./-]+", content)
for url in urls:
    print(url)
Parse JSON Files for Sensitive Data
Use jq:
jq '.keys[] | select(.type=="AWS")' sensitive.json
24. Enhanced Vulnerability Detection
Custom Scripts
SSRF Testing:
import requests

payload = {"url": "http://localhost:8080/admin"}
response = requests.post("https://target.com/api", json=payload)
print(response.text)
XSS Payload Testing:
cat urls.txt | while read url; do
  curl "$url?q=<script>alert(1)</script>"
done
SQL Injection Automation:
sqlmap -u "https://example.com?id=1" --batch --dbs
25. Workflow Optimization Tools
Version Control
Use Git hooks to enforce standards:
# pre-commit hook
echo "Running security checks..."
Linters and Formatters
Python:
black, flake8
Bash:
shellcheck
Automated Deployment
Use Docker:
FROM python:3.9
WORKDIR /app
COPY . .
RUN pip install -r requirements.txt
CMD ["python", "app.py"]
26. Best Practices for Tool Development
Modularization
Use reusable functions:
def subdomain_enum(domain):
    results = []
    # Enumeration logic goes here (e.g., call subfinder via subprocess)
    return results
Error Handling
Provide detailed error messages:
import requests

try:
    response = requests.get("https://example.com")
    response.raise_for_status()
except requests.RequestException as e:
    print(f"Error: {e}")
Performance Optimization
Multi-threading:
from concurrent.futures import ThreadPoolExecutor

def scan(url):
    print(f"Scanning {url}")

urls = ["https://example.com", "https://test.com"]
with ThreadPoolExecutor(max_workers=5) as executor:
    executor.map(scan, urls)
27. Continuous Monitoring
Scheduled Tasks
Use cron for periodic scans:
0 2 * * * /path/to/recon.sh >> /path/to/logs.txt
Real-Time Alerts
Integrate Slack or Telegram for notifications:
curl -X POST -H 'Content-type: application/json' --data '{"text":"Scan completed"}' $SLACK_WEBHOOK
28. GitHub Awesome List Templates
Building Your Own List
Use GitHub Markdown Templates:
Structure:
# Awesome List
## Introduction
## Tools
## Resources
## Tutorials
## Contributing
Automated Updates
Use GitHub Actions to periodically update content:
on:
  schedule:
    - cron: '0 0 * * 1'
29. Advanced Reporting
HTML Reports
Create visually appealing reports using ReportLab or Flask:
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def index():
    return render_template("report.html")
Markdown Reports
Script Example:
echo "# Recon Report" > report.md
cat results.txt >> report.md
Export to PDF:
Use pandoc:
pandoc report.md -o report.pdf
If GitHub blocks your push because it detects a secret (like a GitHub Personal Access Token) in your repository, follow these steps to resolve it and safeguard your repository:
Steps to Fix and Prevent Issues
1. Remove Secrets from History
Identify the Secret: GitHub tells you where the secret is located, e.g.,
MyMac/Automated-Scanner/tools/.tokens:1. Open the file and remove the secret.
Amend the Commit: After removing the secret, re-commit the changes:
git rm --cached MyMac/Automated-Scanner/tools/.tokens
git commit --amend --no-edit
git push origin main --force
2. Use git filter-repo to Remove Secrets from Entire History
If the secret exists in older commits:
Install git-filter-repo:
pip install git-filter-repo
Remove the Secret: Replace <path> with the file path containing the secret:
git filter-repo --path <path> --invert-paths
Force Push Clean History:
git push origin main --force
Set Up Best Practices to Prevent Future Issues
3. Enable GitHub Push Protection
GitHub will block pushes containing sensitive information by default if push protection is enabled. Ensure it's active in your repository settings:
Go to Settings > Code Security and Analysis > Push Protection.
Enable Push Protection.
4. Use .gitignore to Prevent Secrets from Being Committed
Add paths of sensitive files to a .gitignore file:
# Ignore token files
*.tokens
*.env
Then stage and commit the .gitignore file:
git add .gitignore
git commit -m "Add .gitignore to prevent committing sensitive files"
git push origin main
5. Scan for Secrets Locally
Use tools like GitGuardian CLI to scan your commits for secrets before pushing:
ggshield secret scan pre-commit
6. Replace Existing Tokens
If a token was exposed, revoke it and create a new one:
Revoke the token via GitHub's Settings > Developer Settings > Personal Access Tokens.
Create a new token and securely store it in environment variables or secret management tools like AWS Secrets Manager or HashiCorp Vault.
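For example, with the AWS CLI (the secret name is a placeholder):
# Store the new token once
aws secretsmanager create-secret --name bugbounty/github-token --secret-string "your_new_token"
# Retrieve it at runtime instead of hard-coding it
export GITHUB_TOKEN=$(aws secretsmanager get-secret-value \
  --secret-id bugbounty/github-token --query SecretString --output text)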
7. Use .env Files for Secrets
Store secrets in .env files and load them dynamically into your scripts using tools like dotenv. Example .env file:
GITHUB_TOKEN=your_token_here
Add .env to .gitignore:
.env
Push Again After Fixing Issues
After cleaning the history and ensuring no secrets remain, retry your push:
git push origin main
Comprehensive GitHub Solutions and Best Practices
Here’s a detailed guide to handling common Git issues, maintaining repository hygiene, securing your code, and improving workflows for professional-grade repositories.
1. General Git and GitHub Commands
Clone a Repository
git clone https://github.com/username/repo.git
Create and Push a New Branch
git checkout -b feature-branch
git push origin feature-branch
Merge Branches
git checkout main
git merge feature-branch
git push origin main
Stash Uncommitted Changes
git stash
# Apply the stash later
git stash pop
2. Fixing Common Git Issues
Remove Untracked Files
git clean -f -d
Undo the Last Commit
git reset --soft HEAD~1 # Keeps changes staged
git reset --hard HEAD~1 # Discards changes completely
Resolve Merge Conflicts
Open conflicted files.
Edit the conflict markers (<<<<<<<, =======, >>>>>>>).
Stage the resolved files:
git add <file>
Continue the merge:
git commit
Fix Detached HEAD
git checkout <branch>
3. Managing Secrets
Prevent Secrets in Commits
Add sensitive files to .gitignore:
.env
*.key
*.pem
*.tokens
Scan Commits for Secrets
Use tools like TruffleHog:
trufflehog --regex --entropy=True https://github.com/username/repo.git
Replace Leaked Secrets
If a secret is exposed:
Revoke the token immediately.
Use git filter-repo to remove it:
git filter-repo --path path/to/sensitive-file --invert-paths
Push the cleaned history:
git push origin main --force
4. Improving Repository Security
Enable 2FA for Your GitHub Account
Go to Settings > Security > Two-factor authentication.
Follow the prompts to set up 2FA.
Set Up Branch Protection
Go to Settings > Branches > Add branch protection rule.
Enable:
Require pull request reviews.
Require status checks.
Restrict who can push.
Enable Secret Scanning
Go to Settings > Security & Analysis.
Enable Secret Scanning and Push Protection.
5. Automating Workflows
Using GitHub Actions
Create a .github/workflows/main.yml file:
name: CI/CD Pipeline
on:
  push:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3
      - name: Run Tests
        run: |
          npm install
          npm test
Schedule Automated Tasks
Run a workflow daily:
on:
  schedule:
    - cron: "0 0 * * *"
6. Commit Hygiene
Write Descriptive Commit Messages
Follow the conventional commit format:
<type>(<scope>): <short summary>
[optional body]
[optional footer(s)]
Example:
feat(auth): add login endpoint
fix(auth): resolve token expiration issue
Sign Commits
Enable GPG signing for commits:
git config --global user.signingkey <GPG_KEY_ID>
git config --global commit.gpgsign true
7. Version Control Best Practices
Tagging Releases
git tag -a v1.0.0 -m "Initial release"
git push origin v1.0.0
Changelog Automation
npx conventional-changelog-cli -p angular
8. Collaboration
Create a Pull Request
Push changes:
git push origin feature-branch
Go to the repository on GitHub and create a PR.
Review PRs
Use GitHub’s review tools to add comments.
Mark a PR as approved or request changes.
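The same review actions can be scripted with the GitHub CLI (the PR number is a placeholder):
gh pr review 42 --approve --body "LGTM"
gh pr review 42 --request-changes --body "Please add tests"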
9. Repository Hygiene
Archive Old Branches
Delete merged branches:
git branch -d feature-branch
git push origin --delete feature-branch
Clean Up Commit History
Use rebase for a cleaner history:
git rebase -i HEAD~<number-of-commits>
10. Advanced Git Commands
Revert a Specific Commit
git revert <commit-hash>
Cherry-Pick a Commit
git cherry-pick <commit-hash>
View Commit History
git log --oneline --graph --decorate
Show Changes Between Branches
git diff branch1..branch2
11. Backup and Restore
Backup a Repository
git bundle create repo.bundle --all
Restore from Backup
git clone repo.bundle
12. Troubleshooting
Fix Corrupted Repositories
git fsck --full
Rebuild the Index
rm -f .git/index
git reset
Recover Lost Commits
git reflog
git reset --hard <commit-hash>
13. Resources for Learning
This list ensures you have everything covered, from basic Git operations to advanced GitHub features, to help you create efficient, secure, and collaborative workflows. If you still encounter issues, GitHub’s documentation on push protection provides more details on resolving blocked pushes.
30. Learning Resources
Tutorials
Videos
YouTube Channels:
Books
"The Web Application Hacker's Handbook" by Dafydd Stuttard.
"Practical Binary Analysis" by Dennis Andriesse.
This expanded Awesome List now provides 200+ actionable tips, tools, and workflows for building, automating, and optimizing cybersecurity scripts and tools!