- Given a start link, crawl all hyperlinks matching a given pattern up to a given depth - Done
- Support a configurable maximum number of concurrent requests per domain (using goroutines) - Done
- Respect robots.txt
- Set timeouts on HTTP requests - Done
- Change HTTP headers dynamically - Done
- Use cookies
pkmishra/goscraper