# LinkSleuth

LinkSleuth is a high-performance, modular URL discovery and security analysis tool written in Go. Designed for security researchers, bug hunters, and developers, it provides deep insights into web application structures through concurrent crawling, sensitive endpoint detection, and comprehensive reporting.
## Features

- ⚡ High-Speed Discovery: Concurrent scanning using optimized worker pools (see the concurrency sketch after this list).
- 🛡️ Security Focused: Automatic detection of sensitive files (`.env`, `config`, `backup`) and administrative endpoints.
- 📊 Rich Reporting: Export results in structured JSON, CSV, or interactive HTML dashboards.
- 🔁 Built-in Resilience: Automatic retries with exponential backoff and HTTP 429 (rate limit) handling.
- 🎭 Stealthy Operations: Random User-Agent rotation to bypass basic WAF and rate-limiting rules.
- 🧩 Modular Core: Extensible analyzer and reporter architecture.
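The concurrency and resilience features above can be pictured as a bounded worker pool in which each worker retries failed or rate-limited requests with exponential backoff and rotates its User-Agent on every attempt. The following is a minimal, self-contained Go sketch of that model, not LinkSleuth's actual internals: the `fetchWithRetry` helper, the User-Agent list, and the probe paths are illustrative assumptions, and the worker and retry counts simply mirror the `--threads` and `--retry` defaults from the flags table.

```go
package main

import (
	"fmt"
	"math/rand"
	"net/http"
	"sync"
	"time"
)

// Illustrative User-Agent pool; LinkSleuth ships its own rotation list.
var userAgents = []string{
	"Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
	"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
	"LinkSleuth/1.0",
}

// fetchWithRetry issues a GET, retrying with exponential backoff on
// network errors and HTTP 429 (rate limit) responses.
func fetchWithRetry(client *http.Client, url string, retries int) (*http.Response, error) {
	var lastErr error
	for attempt := 0; attempt <= retries; attempt++ {
		req, err := http.NewRequest(http.MethodGet, url, nil)
		if err != nil {
			return nil, err
		}
		req.Header.Set("User-Agent", userAgents[rand.Intn(len(userAgents))]) // rotate per request
		resp, err := client.Do(req)
		if err == nil && resp.StatusCode != http.StatusTooManyRequests {
			return resp, nil
		}
		if err == nil {
			resp.Body.Close() // got a 429: discard the body and back off
			lastErr = fmt.Errorf("rate limited (429) on %s", url)
		} else {
			lastErr = err
		}
		time.Sleep(time.Duration(1<<attempt) * 500 * time.Millisecond) // 0.5s, 1s, 2s, ...
	}
	return nil, lastErr
}

func main() {
	client := &http.Client{Timeout: 10 * time.Second}
	jobs := make(chan string)
	var wg sync.WaitGroup

	const threads = 10 // mirrors the --threads default
	for i := 0; i < threads; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for url := range jobs {
				resp, err := fetchWithRetry(client, url, 2) // mirrors the --retry default
				if err != nil {
					fmt.Println("error:", err)
					continue
				}
				fmt.Println(resp.StatusCode, url)
				resp.Body.Close()
			}
		}()
	}

	for _, path := range []string{"/admin", "/.env", "/backup.zip"} {
		jobs <- "https://example.com" + path
	}
	close(jobs)
	wg.Wait()
}
```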
## Installation

Install the latest release with `go install`:

```bash
go install github.com/ismailtsdln/linksluth@latest
```

Or build from source:

```bash
git clone https://github.com/ismailtsdln/LinkSleuth.git
cd LinkSleuth
go build -o linksluth main.go
```

## Usage

Scan a target domain using a wordlist and save results to a JSON file:

```bash
./linksluth scan --url https://example.com --wordlist wordlist.txt --output results.json
```

Analyze previously generated results directly in your terminal, with colored status codes and findings:

```bash
./linksluth analyze --input results.json
```

Generate a visual HTML report for stakeholders or documentation:

```bash
./linksluth report --input results.json --output report.html
```

## Flags

| Flag | Short | Description | Default |
|---|---|---|---|
| `--url` | `-u` | Target URL (scheme required) | - |
| `--wordlist` | `-w` | Path to directory/file wordlist | - |
| `--threads` | `-t` | Number of concurrent workers | 10 |
| `--retry` | `-r` | Number of retries per failed request | 2 |
| `--agent` | `-a` | Custom User-Agent string | LinkSleuth/1.0 |
| `--output` | `-o` | Path to save scan results | - |
| `--verbose` | `-v` | Enable detailed debug logging | false |
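Since the CLI layer is powered by Cobra (see Architecture below), the flags in the table would typically be registered on the `scan` subcommand along these lines. This is a hedged sketch, not LinkSleuth's actual source: the `cmd` package layout and the `scanCmd` variable are assumptions, while the flag names, shorthands, and defaults mirror the table above.

```go
package cmd

import "github.com/spf13/cobra"

// scanCmd sketches how the scan subcommand could be declared with Cobra.
var scanCmd = &cobra.Command{
	Use:   "scan",
	Short: "Scan a target URL using a wordlist",
	RunE: func(cmd *cobra.Command, args []string) error {
		// ... read the parsed flag values and hand them to the crawler core ...
		return nil
	},
}

func init() {
	scanCmd.Flags().StringP("url", "u", "", "Target URL (scheme required)")
	scanCmd.Flags().StringP("wordlist", "w", "", "Path to directory/file wordlist")
	scanCmd.Flags().IntP("threads", "t", 10, "Number of concurrent workers")
	scanCmd.Flags().IntP("retry", "r", 2, "Number of retries per failed request")
	scanCmd.Flags().StringP("agent", "a", "LinkSleuth/1.0", "Custom User-Agent string")
	scanCmd.Flags().StringP("output", "o", "", "Path to save scan results")
	scanCmd.Flags().BoolP("verbose", "v", false, "Enable detailed debug logging")
}
```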
## Architecture

LinkSleuth is built with a decoupled architecture for maximum flexibility (an interface sketch follows the diagram below):
- Crawler Core: Handles networking, concurrency, and worker management.
- Analyzer Engine: Processes HTTP responses and applies security heuristics.
- Reporter Module: Transforms raw data into human-readable and machine-parsable formats.
- CLI Layer: Powered by Cobra for a modern and intuitive user experience.
```mermaid
graph TD
    CLI[CLI Layer - Cobra] --> Crawler[Crawler Core]
    Crawler --> HTTP[HTTP Worker Pool]
    HTTP --> Target[Target Web Application]
    Target --> HTTP
    HTTP --> Analyzer[Analyzer Engine]
    Analyzer --> Reporter[Reporter Module]
    Reporter --> JSON[JSON Output]
    Reporter --> CSV[CSV Output]
    Reporter --> HTML[HTML Dashboard]
```
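To make the "extensible analyzer and reporter architecture" claim concrete, the contract between the Analyzer Engine and the Reporter Module might look like the interfaces below. This is a minimal sketch under stated assumptions: the `Finding` type, the interface names, and `SensitiveFileAnalyzer` are hypothetical illustrations, not LinkSleuth's exported API.

```go
package core

import (
	"net/http"
	"strings"
)

// Finding is a hypothetical record produced by an analyzer.
type Finding struct {
	URL      string
	Status   int
	Severity string
	Note     string
}

// Analyzer inspects one HTTP response and reports zero or more findings;
// new security heuristics are added by implementing this interface.
type Analyzer interface {
	Analyze(url string, resp *http.Response) []Finding
}

// Reporter renders findings into an output format (JSON, CSV, HTML, ...).
type Reporter interface {
	Report(findings []Finding) ([]byte, error)
}

// SensitiveFileAnalyzer is a toy heuristic matching the feature list:
// it flags reachable paths that look like configuration or backup files.
type SensitiveFileAnalyzer struct{}

func (SensitiveFileAnalyzer) Analyze(url string, resp *http.Response) []Finding {
	if resp.StatusCode != http.StatusOK {
		return nil
	}
	for _, marker := range []string{".env", "config", "backup"} {
		if strings.Contains(url, marker) {
			return []Finding{{
				URL:      url,
				Status:   resp.StatusCode,
				Severity: "high",
				Note:     "possible sensitive file: " + marker,
			}}
		}
	}
	return nil
}
```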
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
## License

Distributed under the MIT License. See `LICENSE` for more information.
Developed with ❤️ by Ismail Tasdelen