
Commit 581251a

cleanup, version 0.1.2

1 parent ba2d2e6

File tree

6 files changed: +27 additions, -35 deletions

.gitignore

Lines changed: 1 addition & 0 deletions

```diff
@@ -1,3 +1,4 @@
 build/
+dist/
 scrapy_proxy_headers.egg-info/
 scrapy_proxy_headers/__pycache__/
```

README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@ The `scrapy-proxy-headers` package is designed for adding proxy headers to HTTPS
 
 In normal usage, custom headers put in `request.headers` cannot be read by a proxy when you make a HTTPS request, because the headers are encrypted and passed through the proxy tunnel, along with the rest of the request body. You can read more about this at [Proxy Server Requests over HTTPS](https://docs.proxymesh.com/article/145-proxy-server-requests-over-https).
 
-Because Scrapy does not have a good way to pass custom headers to a proxy when you make HTTPS requests, we at ProxyMesh made this extension to support our customers that use Scrapy and want to use custom headers to control our proxy behavior. But this extension can work for any custom headers through a proxy.
+Because Scrapy does not have a good way to pass custom headers to a proxy when you make HTTPS requests, we at [ProxyMesh](https://proxymesh.com) made this extension to support our customers that use Scrapy and want to use custom headers to control our proxy behavior. But this extension can work for any custom headers through a proxy.
 
 To use this extension, do the following:
 
```
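The README change above links to an explanation of why ordinary request headers never reach the proxy on HTTPS requests: the proxy only sees the plaintext CONNECT preamble that opens the tunnel, and everything after it is TLS-encrypted. A minimal illustrative sketch of that distinction (the `connect_preamble` helper is hypothetical, not part of this package):

```python
# Sketch of the only plaintext a forward proxy sees for an HTTPS
# request: the CONNECT preamble. Headers placed here are readable by
# the proxy; headers inside the tunneled request are encrypted.
def connect_preamble(host: str, port: int, proxy_headers: dict) -> str:
    lines = [f"CONNECT {host}:{port} HTTP/1.1", f"Host: {host}:{port}"]
    lines += [f"{name}: {value}" for name, value in proxy_headers.items()]
    return "\r\n".join(lines) + "\r\n\r\n"

preamble = connect_preamble("example.com", 443, {"X-ProxyMesh-Country": "US"})
print(preamble)
```

This is why the extension sends its custom headers on the CONNECT request rather than inside the tunneled request.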

pyproject.toml

Lines changed: 25 additions & 0 deletions

```diff
@@ -0,0 +1,25 @@
+[build-system]
+requires = ["setuptools>=61.0"]
+build-backend = "setuptools.build_meta"
+
+[project]
+name = "scrapy-proxy-headers"
+version = "0.1.2"
+authors = [
+  { name="ProxyMesh", email="support@proxymesh.com" },
+]
+description = "Add custom proxy headers to HTTPS requests in Scrapy"
+readme = "README.md"
+requires-python = ">=3.8"
+classifiers = [
+    "Programming Language :: Python :: 3",
+    "Operating System :: OS Independent",
+    "License :: OSI Approved :: BSD License",
+    "Intended Audience :: Developers",
+    "Topic :: Internet :: WWW/HTTP",
+    "Topic :: Software Development :: Libraries :: Python Modules",
+]
+
+[project.urls]
+Homepage = "https://github.com/proxymesh/scrapy-proxy-headers"
+Changelog = "https://github.com/proxymesh/scrapy-proxy-headers/commits/main/"
```

scrapy_proxy_headers/__init__.py

Lines changed: 0 additions & 12 deletions

```diff
@@ -1,13 +1 @@
-"""
-To use this, in your settings, do the following:
-
-DOWNLOAD_HANDLERS = {
-    "https": "scrapy_proxy_headers.HTTP11ProxyDownloadHandler"
-}
-
-Then when you make a request with a custom proxy header, instead of using request.headers, use request.meta["proxy_headers"] like this:
-
-request.meta["proxy_headers"] = {"X-ProxyMesh-Country": "US"}
-"""
-
 from .download_handler import HTTP11ProxyDownloadHandler
```
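The docstring deleted from `__init__.py` above carried the package's usage instructions. A minimal sketch of that same configuration, at the dict level only so it runs without Scrapy installed (the handler path and header name come from the removed docstring):

```python
# Settings fragment from the removed docstring: route HTTPS downloads
# through the package's handler so proxy headers can reach the proxy.
DOWNLOAD_HANDLERS = {
    "https": "scrapy_proxy_headers.HTTP11ProxyDownloadHandler",
}

# Per-request: custom proxy headers go in request.meta["proxy_headers"],
# not request.headers (those are encrypted past the proxy).
meta = {"proxy_headers": {"X-ProxyMesh-Country": "US"}}
print(meta["proxy_headers"]["X-ProxyMesh-Country"])
```

In a real spider, `DOWNLOAD_HANDLERS` would go in `settings.py` and `meta` would be passed to `scrapy.Request(url, meta=meta)`.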

scrapy_proxy_headers/agent.py

Lines changed: 0 additions & 2 deletions

```diff
@@ -1,5 +1,3 @@
-# TODO: handle response headers
-
 from scrapy.core.downloader.handlers.http11 import TunnelingAgent, TunnelingTCP4ClientEndpoint, ScrapyAgent, HTTP11DownloadHandler
 from scrapy.core.downloader.webclient import _parse
 from scrapy.utils.python import to_bytes
```

setup.py

Lines changed: 0 additions & 20 deletions
This file was deleted.

0 commit comments