I have a recursive scraping script running, and after about 100 scrapes it always dies with:
FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory
Initially I thought it was some JSON.stringify code that periodically saves the scraped data to a text file, but now I suspect it's the scraper library itself. Have you experienced this at all?
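To narrow it down, I've been logging heap usage between scrapes with `process.memoryUsage()` to see whether memory climbs steadily (a leak somewhere in the loop) or spikes just before the crash (one huge allocation, e.g. the stringify of the whole data set). A minimal sketch of that diagnostic, with the `scrapeOnce` call standing in for whatever the actual scrape function is:

```javascript
// Log V8 heap usage so growth between scrapes is visible in the console.
function logHeap(label) {
  const { heapUsed, heapTotal } = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1);
  console.log(`${label}: heap ${mb(heapUsed)} / ${mb(heapTotal)} MB`);
}

// Hypothetical stand-in for the real scrape step.
function scrapeOnce(i) {
  return { page: i, data: 'x'.repeat(1000) };
}

const results = [];
for (let i = 1; i <= 5; i++) {
  results.push(scrapeOnce(i));
  logHeap(`scrape #${i}`); // steady climb here suggests a leak in the loop
}
```

If the numbers climb monotonically, the leak is probably in whatever accumulates between iterations; if they stay flat until the save step, the periodic `JSON.stringify` is the likelier culprit.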