Wayback-Urls is an OSINT (Open Source Intelligence) tool that leverages the Wayback Machine for URL reconnaissance. Built in Python, it retrieves historical URLs for a target domain, with support for keyword filtering, result limiting, screenshots, and file export.
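The retrieval step for tools like this is typically a query against the Wayback Machine's public CDX API. Below is a minimal, self-contained sketch of that approach: the endpoint and query parameters are the CDX API's own, but the function names and exact option choices are illustrative assumptions, not this tool's actual source:

```python
import json
import urllib.parse
import urllib.request

CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def build_cdx_query(domain, keyword=None, limit=None):
    """Build a CDX API query URL for archived captures of a domain."""
    params = {
        "url": f"{domain}/*",   # match every path under the domain
        "output": "json",       # JSON rows instead of plain text
        "fl": "original",       # return only the original URL field
        "collapse": "urlkey",   # de-duplicate repeated captures of the same URL
    }
    if keyword:
        params["filter"] = f"original:.*{keyword}.*"  # server-side regex filter
    if limit:
        params["limit"] = str(limit)
    return CDX_ENDPOINT + "?" + urllib.parse.urlencode(params)

def fetch_archived_urls(domain, keyword=None, limit=None):
    """Fetch archived URLs; the first JSON row is a header, the rest are data."""
    query = build_cdx_query(domain, keyword, limit)
    with urllib.request.urlopen(query, timeout=30) as resp:
        rows = json.load(resp)
    return [row[0] for row in rows[1:]]
```

For example, `fetch_archived_urls("example.com", keyword="js", limit=100)` would return up to 100 archived URLs containing `js`.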
- Python 3.7+
- requests
- selenium
- Firefox + geckodriver (only required for screenshot functionality)
```
git clone https://github.com/atraxsrc/wayback-Urls.git
cd wayback-Urls
pip install -r requirements.txt
```

For screenshots, make sure `geckodriver` is installed and available in your `$PATH`. On Ubuntu/Debian:

```
sudo apt install firefox-geckodriver
```

On Arch:

```
sudo pacman -S geckodriver
```

Or download it manually from github.com/mozilla/geckodriver.
```
python3 waybackurls.py [-h] -d target.com [-k keyword] [-l limit] [-s] [-r seconds] [-o output]
```

| Flag | Long form | Description | Default |
|---|---|---|---|
| `-h` | `--help` | Show help message | |
| `-d` | `--domain` | Target domain (e.g., target.com) | required |
| `-k` | `--keyword` | Filter by extension or keyword (e.g., js, pdf, admin, login) | |
| `-l` | `--limit` | Maximum number of URLs to return | |
| `-s` | `--screenshot` | Take a screenshot of each URL found | |
| `-r` | `--rate-limit` | Delay in seconds between screenshots | 1 |
| `-o` | `--output` | Save results to a file at the specified path | |
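A flag set like the one above maps naturally onto Python's `argparse` module. The sketch below is a plausible reconstruction of such a CLI, not the tool's exact source; the parser and attribute names are assumptions:

```python
import argparse

def build_parser():
    """Declare a CLI matching the flag table above (illustrative reconstruction)."""
    p = argparse.ArgumentParser(
        description="Retrieve archived URLs for a domain from the Wayback Machine."
    )
    p.add_argument("-d", "--domain", required=True,
                   help="Target domain (e.g., target.com)")
    p.add_argument("-k", "--keyword",
                   help="Filter by extension or keyword (e.g., js, pdf, admin, login)")
    p.add_argument("-l", "--limit", type=int,
                   help="Maximum number of URLs to return")
    p.add_argument("-s", "--screenshot", action="store_true",
                   help="Take a screenshot of each URL found")
    p.add_argument("-r", "--rate-limit", type=float, default=1,
                   help="Delay in seconds between screenshots")
    p.add_argument("-o", "--output",
                   help="Save results to a file at the specified path")
    return p
```

Note that `--rate-limit` becomes the attribute `args.rate_limit`, since argparse converts hyphens in long options to underscores.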
Retrieve all archived URLs for a domain:

```
python3 waybackurls.py -d example.com
```

Filter for a specific file extension:

```
python3 waybackurls.py -d example.com -k js
```

Filter by keyword and limit results:

```
python3 waybackurls.py -d example.com -k login -l 100
```

Take screenshots with a 5-second delay between each:

```
python3 waybackurls.py -d example.com -s -r 5
```

Retrieve URLs and save to a file:

```
python3 waybackurls.py -d example.com -o urls.txt
```

Retrieve URLs, take screenshots, and save output:

```
python3 waybackurls.py -d example.com -s -r 2 -o urls.txt
```

When the `-s` flag is used, screenshots are saved to the `screens/` directory inside the project folder (created automatically if it doesn't exist). Files are named `screen-<number>.png` sequentially.
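A screenshot loop with that behavior can be sketched with Selenium's headless Firefox driver. The directory and filename convention follow the description above; everything else (function names, option handling) is an assumption about the implementation rather than the tool's actual code. The Selenium import is deferred into the function so that, as with the tool itself, Firefox and geckodriver are only required when screenshots are requested:

```python
import os
import time

def screenshot_path(out_dir, index):
    """Build the sequential filename convention: <out_dir>/screen-<index>.png."""
    return os.path.join(out_dir, f"screen-{index}.png")

def screenshot_urls(urls, out_dir="screens", delay=1.0):
    """Visit each URL headlessly and save a screenshot, pausing between shots."""
    # Deferred import: selenium/geckodriver are only needed for screenshots.
    from selenium import webdriver
    from selenium.webdriver.firefox.options import Options

    os.makedirs(out_dir, exist_ok=True)  # created automatically if it doesn't exist
    opts = Options()
    opts.add_argument("--headless")      # no visible browser window
    driver = webdriver.Firefox(options=opts)
    try:
        for i, url in enumerate(urls, start=1):
            driver.get(url)
            driver.save_screenshot(screenshot_path(out_dir, i))
            time.sleep(delay)            # the -r rate limit between screenshots
    finally:
        driver.quit()
```

The `try`/`finally` ensures the browser process is cleaned up even if a page fails to load partway through the list.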
By default, retrieved URLs are printed to the console with a count summary. Use -o to save results to a file instead.
If no URLs are found for the given domain or keyword, the tool will display a warning and exit cleanly.
Contributions are welcome! If you have ideas, improvements, or bug fixes, please open an issue or submit a pull request.
This project is licensed under the MIT License.