How to Use Crawlbase and AI to Summarize Web Data

For a full walkthrough of this project, see the accompanying post on the Crawlbase blog.

Setting Up Your Coding Environment

Before building your AI-powered web data summarizer, you'll need to set up a basic Python environment. Here's how to get started:

  1. Install Python 3 on your computer
  2. Run pip install -r requirements.txt to install the project's dependencies (a sketch of a typical dependency list follows these steps).
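
If you need to recreate the environment by hand, the dependency list might look roughly like the sketch below. The package names are assumptions based on the tools this README mentions and the charts described in the project summary; the requirements.txt shipped with the repository is authoritative.

```
# Hypothetical requirements.txt sketch — defer to the file shipped with this repo.
openai        # OpenAI Python SDK used for summarization
crawlbase     # Crawlbase Python client used to fetch pages
matplotlib    # assumed plotting library for the pie/bar/line charts
```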

Create an OpenAI Account and Get an API Key

  1. Go to: https://platform.openai.com/signup
  2. After logging in, get your API key from: https://platform.openai.com/api-keys (a minimal usage check follows these steps).
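
To confirm the key works, a minimal summarization call with the official openai Python package (v1 or later) might look like this. The model name and prompt are placeholders chosen for the sketch, not values taken from this repository's scripts.

```python
from openai import OpenAI

client = OpenAI(api_key="<OpenAI API Key>")  # replace with your own key

# Ask the model to summarize a snippet of text; swap in real page text later.
completion = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model you can access works
    messages=[
        {"role": "system", "content": "Summarize the user's text in 3-5 sentences."},
        {"role": "user", "content": "Paste some sample web page text here."},
    ],
)
print(completion.choices[0].message.content)
```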

Obtaining Crawlbase API Credentials

  1. Create an account at Crawlbase and log in.
  2. After registration, you will receive 5,000 free requests.
  3. Locate and copy your Normal requests token (a quick fetch example follows these steps).
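
As a quick sanity check, fetching a page with the crawlbase Python package (pip install crawlbase) and your Normal requests token might look like the sketch below; the URL is a placeholder, and the response handling follows the package's documented dictionary format.

```python
from crawlbase import CrawlingAPI

# Initialize the client with your Normal requests token.
api = CrawlingAPI({'token': '<Crawlbase Normal requests token>'})

# Fetch a placeholder page through Crawlbase.
response = api.get('https://www.example.com/')

if response['status_code'] == 200:
    html = response['body']  # raw HTML of the fetched page
    print(html[:500])        # preview the first 500 bytes
else:
    print('Request failed with status', response['status_code'])
```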

Running the Example Scripts

Before running the examples, make sure to:

  1. Replace <OpenAI API Key> with your OpenAI API Key.
  2. Replace every instance of <Crawlbase Normal requests token> with your Crawlbase Normal requests token (an end-to-end sketch of the overall flow follows these steps).
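
At a high level, the example scripts pair the two services: fetch a page through Crawlbase, reduce the HTML to text, and ask OpenAI for a summary. The sketch below illustrates that flow under stated assumptions (BeautifulSoup for HTML-to-text extraction, gpt-4o-mini as the model, a placeholder URL); it is not a copy of the repository's scripts.

```python
from bs4 import BeautifulSoup   # assumed HTML-to-text step; any extractor works
from crawlbase import CrawlingAPI
from openai import OpenAI

crawlbase_api = CrawlingAPI({'token': '<Crawlbase Normal requests token>'})
openai_client = OpenAI(api_key='<OpenAI API Key>')

def summarize_page(url: str) -> str:
    """Fetch a page via Crawlbase, strip it to plain text, and summarize it with OpenAI."""
    response = crawlbase_api.get(url)
    if response['status_code'] != 200:
        raise RuntimeError(f"Crawlbase request failed: {response['status_code']}")

    # Strip tags and collapse whitespace to get plain page text.
    text = BeautifulSoup(response['body'], 'html.parser').get_text(separator=' ', strip=True)

    completion = openai_client.chat.completions.create(
        model='gpt-4o-mini',  # assumed model name
        messages=[
            {'role': 'system', 'content': 'Summarize the following web page text in one short paragraph.'},
            {'role': 'user', 'content': text[:12000]},  # crude truncation to stay within context limits
        ],
    )
    return completion.choices[0].message.content

if __name__ == '__main__':
    print(summarize_page('https://www.example.com/'))
```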

🛡 Disclaimer: This repository is for educational purposes only. Please make sure you comply with the Terms of Service of any website you scrape, and use it responsibly and only where permitted.


Copyright 2025 Crawlbase

About

Automatically summarize web data with AI and create visualizations, such as pie charts, bar charts, and line graphs, for your data reporting workflow.
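
As a rough illustration of the reporting side, a chart built from summarized results could be produced with matplotlib; the library choice, labels, and counts below are assumptions for the sketch, not values taken from this repository.

```python
import matplotlib.pyplot as plt  # assumed plotting library; the repo's own scripts may differ

# Hypothetical category counts extracted from summarized web data.
labels = ['Positive', 'Neutral', 'Negative']
counts = [42, 31, 27]

plt.pie(counts, labels=labels, autopct='%1.1f%%')
plt.title('Example breakdown of summarized web data')
plt.savefig('summary_pie_chart.png')
```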
