-
-Databases gives you simple asyncio support for a range of databases.
-
-It allows you to make queries using the powerful [SQLAlchemy Core][sqlalchemy-core]
-expression language, and provides support for PostgreSQL, MySQL, and SQLite.
-
-Databases is suitable for integrating with any async Web framework, such as [Starlette][starlette],
-[Sanic][sanic], [Responder][responder], [Quart][quart], [aiohttp][aiohttp], [Tornado][tornado], or [FastAPI][fastapi].
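-
-For instance, here is a minimal sketch of wiring the connection lifecycle into a
-Starlette application using its startup/shutdown hooks (the SQLite URL is
-illustrative):
-
-```python
-from databases import Database
-from starlette.applications import Starlette
-
-database = Database('sqlite+aiosqlite:///example.db')
-
-# Open the connection pool on startup and close it cleanly on shutdown.
-app = Starlette(
-    on_startup=[database.connect],
-    on_shutdown=[database.disconnect],
-)
-```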
-
-**Documentation**: [https://www.encode.io/databases/](https://www.encode.io/databases/)
-
-**Requirements**: Python 3.7+
-
----
-
-## Installation
-
-```shell
-$ pip install databases
-```
-
-The supported database drivers are:
-
-* [asyncpg][asyncpg]
-* [aiopg][aiopg]
-* [aiomysql][aiomysql]
-* [asyncmy][asyncmy]
-* [aiosqlite][aiosqlite]
-
-You can install the required database drivers with:
-
-```shell
-$ pip install "databases[asyncpg]"
-$ pip install "databases[aiopg]"
-$ pip install "databases[aiomysql]"
-$ pip install "databases[asyncmy]"
-$ pip install "databases[aiosqlite]"
-```
-
-Note that if you are using any synchronous SQLAlchemy functions, such as `metadata.create_all(engine)` or [alembic][alembic] migrations, then you still have to install a synchronous DB driver: [psycopg2][psycopg2] for PostgreSQL and [pymysql][pymysql] for MySQL.
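-
-As a concrete sketch of that split (the URL and table here are illustrative, and
-this assumes psycopg2 is installed alongside the async driver), schema creation
-runs through the synchronous engine while runtime queries use `databases`:
-
-```python
-import sqlalchemy
-
-metadata = sqlalchemy.MetaData()
-
-notes = sqlalchemy.Table(
-    "notes",
-    metadata,
-    sqlalchemy.Column("id", sqlalchemy.Integer, primary_key=True),
-    sqlalchemy.Column("text", sqlalchemy.String(length=100)),
-)
-
-# Synchronous side: the default PostgreSQL dialect uses psycopg2 for DDL.
-engine = sqlalchemy.create_engine("postgresql://localhost/example")
-metadata.create_all(engine)
-
-# Async side: queries at runtime go through an async driver instead, e.g.
-# Database("postgresql+asyncpg://localhost/example")
-```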
-
----
-
-## Quickstart
-
-For this example we'll create a very simple SQLite database to run some
-queries against.
-
-```shell
-$ pip install "databases[aiosqlite]"
-$ pip install ipython
-```
-
-We can now run a simple example from the console.
-
-Note that we use `ipython` here because it supports `await`
-expressions directly from the console.
-
-```python
-# Create a database instance, and connect to it.
-from databases import Database
-database = Database('sqlite+aiosqlite:///example.db')
-await database.connect()
-
-# Create a table.
-query = """CREATE TABLE HighScores (id INTEGER PRIMARY KEY, name VARCHAR(100), score INTEGER)"""
-await database.execute(query=query)
-
-# Insert some data.
-query = "INSERT INTO HighScores(name, score) VALUES (:name, :score)"
-values = [
- {"name": "Daisy", "score": 92},
- {"name": "Neil", "score": 87},
- {"name": "Carol", "score": 43},
-]
-await database.execute_many(query=query, values=values)
-
-# Run a database query.
-query = "SELECT * FROM HighScores"
-rows = await database.fetch_all(query=query)
-print('High Scores:', rows)
-```
-
-Check out the documentation on [making database queries](https://www.encode.io/databases/database_queries/)
-for examples of how to start using databases together with SQLAlchemy core expressions.
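-
-As a taste of that style, here is a sketch of the quickstart queries rewritten
-with SQLAlchemy Core expressions instead of raw SQL (run under `ipython` as
-above; the table mirrors the example and the inserted row is illustrative):
-
-```python
-import sqlalchemy
-from databases import Database
-
-metadata = sqlalchemy.MetaData()
-
-high_scores = sqlalchemy.Table(
-    "HighScores",
-    metadata,
-    sqlalchemy.Column("id", sqlalchemy.Integer, primary_key=True),
-    sqlalchemy.Column("name", sqlalchemy.String(length=100)),
-    sqlalchemy.Column("score", sqlalchemy.Integer),
-)
-
-database = Database('sqlite+aiosqlite:///example.db')
-await database.connect()
-
-# INSERT built as a core expression; values bind by column name.
-query = high_scores.insert()
-await database.execute(query=query, values={"name": "Ada", "score": 99})
-
-# SELECT ... WHERE score > 80, without hand-written SQL.
-query = high_scores.select().where(high_scores.c.score > 80)
-rows = await database.fetch_all(query=query)
-print('High Scores:', rows)
-```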
-
-
-
-— ⭐️ —
-
-Databases is BSD licensed code. Designed & built in Brighton, England.
-
-[sqlalchemy-core]: https://docs.sqlalchemy.org/en/latest/core/
-[sqlalchemy-core-tutorial]: https://docs.sqlalchemy.org/en/latest/core/tutorial.html
-[alembic]: https://alembic.sqlalchemy.org/en/latest/
-[psycopg2]: https://www.psycopg.org/
-[pymysql]: https://github.com/PyMySQL/PyMySQL
-[asyncpg]: https://github.com/MagicStack/asyncpg
-[aiopg]: https://github.com/aio-libs/aiopg
-[aiomysql]: https://github.com/aio-libs/aiomysql
-[asyncmy]: https://github.com/long2ice/asyncmy
-[aiosqlite]: https://github.com/omnilib/aiosqlite
-
-[starlette]: https://github.com/encode/starlette
-[sanic]: https://github.com/huge-success/sanic
-[responder]: https://github.com/kennethreitz/responder
-[quart]: https://gitlab.com/pgjones/quart
-[aiohttp]: https://github.com/aio-libs/aiohttp
-[tornado]: https://github.com/tornadoweb/tornado
-[fastapi]: https://github.com/tiangolo/fastapi
-
-
diff --git a/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/RECORD b/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/RECORD
deleted file mode 100644
index a06df23..0000000
--- a/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/RECORD
+++ /dev/null
@@ -1,28 +0,0 @@
-databases-0.6.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
-databases-0.6.1.dist-info/LICENSE.md,sha256=rRqP1p8qMCB3kKToUMl8a-RwjN2E9EnoY7ZMUq6Eu0c,1518
-databases-0.6.1.dist-info/METADATA,sha256=S5ycd7NgxqrORd_ygrrpUK8_aX6qv89iLB8urtvi7F0,5387
-databases-0.6.1.dist-info/RECORD,,
-databases-0.6.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-databases-0.6.1.dist-info/WHEEL,sha256=G16H4A3IeoQmnOrYV4ueZGKSjhipXx8zc8nu9FGlvMA,92
-databases-0.6.1.dist-info/top_level.txt,sha256=SyMnm1m_TzLrs_3rWyRdm4A054u0rGeToSJum8uz8xg,29
-databases/__init__.py,sha256=SHRpxOcN4vsJXQ0phNfJImSx6tPGC4GLuM1UPAMJvlU,110
-databases/__pycache__/__init__.cpython-39.pyc,,
-databases/__pycache__/core.cpython-39.pyc,,
-databases/__pycache__/importer.cpython-39.pyc,,
-databases/__pycache__/interfaces.cpython-39.pyc,,
-databases/backends/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-databases/backends/__pycache__/__init__.cpython-39.pyc,,
-databases/backends/__pycache__/aiopg.cpython-39.pyc,,
-databases/backends/__pycache__/asyncmy.cpython-39.pyc,,
-databases/backends/__pycache__/mysql.cpython-39.pyc,,
-databases/backends/__pycache__/postgres.cpython-39.pyc,,
-databases/backends/__pycache__/sqlite.cpython-39.pyc,,
-databases/backends/aiopg.py,sha256=Og3lXUp1cxurQDI4D6E0Q4_17iKWO74TKxAhIYR4Oaw,10121
-databases/backends/asyncmy.py,sha256=szT7tgsINmjE1MCnAPzTnq2SR-GKureKAkVdDDo7L_o,10361
-databases/backends/mysql.py,sha256=HL3e9xn9lvKadmmY93fcnld5Djs4xAgkKuOPzAQOslA,9971
-databases/backends/postgres.py,sha256=Rn5PTh8eFBgQYMWYrUzXN7JGYlV_Ih7L_xBEkt6F6YU,12080
-databases/backends/sqlite.py,sha256=3s43TQ31VFd_iReLEhAsEZtj7ag1_NJAF36greXlbRM,9090
-databases/core.py,sha256=D91csL53jbD73y6ax6Tg48_biccOqvB4G0OTm35qmBQ,17808
-databases/importer.py,sha256=UeU5jzybPYKDTLCnip0SD-NJ6JV0mqSPIDYIHGbC1fM,1104
-databases/interfaces.py,sha256=5zYdeU3B_4B58Cr8IYh7cGfiq6qryXQOAz2bHqUZqZI,2566
-databases/py.typed,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
diff --git a/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/REQUESTED b/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/REQUESTED
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/WHEEL b/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/WHEEL
deleted file mode 100644
index becc9a6..0000000
--- a/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/WHEEL
+++ /dev/null
@@ -1,5 +0,0 @@
-Wheel-Version: 1.0
-Generator: bdist_wheel (0.37.1)
-Root-Is-Purelib: true
-Tag: py3-none-any
-
diff --git a/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/top_level.txt b/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/top_level.txt
deleted file mode 100644
index f35b8ed..0000000
--- a/env/lib/python3.9/site-packages/databases-0.6.1.dist-info/top_level.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-databases
-databases/backends
diff --git a/env/lib/python3.9/site-packages/databases/__init__.py b/env/lib/python3.9/site-packages/databases/__init__.py
deleted file mode 100644
index 1a4a091..0000000
--- a/env/lib/python3.9/site-packages/databases/__init__.py
+++ /dev/null
@@ -1,4 +0,0 @@
-from databases.core import Database, DatabaseURL
-
-__version__ = "0.6.1"
-__all__ = ["Database", "DatabaseURL"]
diff --git a/env/lib/python3.9/site-packages/databases/backends/__init__.py b/env/lib/python3.9/site-packages/databases/backends/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/databases/backends/aiopg.py b/env/lib/python3.9/site-packages/databases/backends/aiopg.py
deleted file mode 100644
index 60c741a..0000000
--- a/env/lib/python3.9/site-packages/databases/backends/aiopg.py
+++ /dev/null
@@ -1,280 +0,0 @@
-import getpass
-import json
-import logging
-import typing
-import uuid
-
-import aiopg
-from aiopg.sa.engine import APGCompiler_psycopg2
-from sqlalchemy.dialects.postgresql.psycopg2 import PGDialect_psycopg2
-from sqlalchemy.engine.cursor import CursorResultMetaData
-from sqlalchemy.engine.interfaces import Dialect, ExecutionContext
-from sqlalchemy.engine.row import Row
-from sqlalchemy.sql import ClauseElement
-from sqlalchemy.sql.ddl import DDLElement
-
-from databases.core import DatabaseURL
-from databases.interfaces import (
- ConnectionBackend,
- DatabaseBackend,
- Record,
- TransactionBackend,
-)
-
-logger = logging.getLogger("databases")
-
-
-class AiopgBackend(DatabaseBackend):
- def __init__(
- self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any
- ) -> None:
- self._database_url = DatabaseURL(database_url)
- self._options = options
- self._dialect = self._get_dialect()
- self._pool: typing.Union[aiopg.Pool, None] = None
-
- def _get_dialect(self) -> Dialect:
- dialect = PGDialect_psycopg2(
- json_serializer=json.dumps, json_deserializer=lambda x: x
- )
- dialect.statement_compiler = APGCompiler_psycopg2
- dialect.implicit_returning = True
- dialect.supports_native_enum = True
- dialect.supports_smallserial = True # 9.2+
- dialect._backslash_escapes = False
- dialect.supports_sane_multi_rowcount = True # psycopg 2.0.9+
- dialect._has_native_hstore = True
- dialect.supports_native_decimal = True
-
- return dialect
-
- def _get_connection_kwargs(self) -> dict:
- url_options = self._database_url.options
-
- kwargs = {}
- min_size = url_options.get("min_size")
- max_size = url_options.get("max_size")
- ssl = url_options.get("ssl")
-
- if min_size is not None:
- kwargs["minsize"] = int(min_size)
- if max_size is not None:
- kwargs["maxsize"] = int(max_size)
- if ssl is not None:
- kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()]
-
- for key, value in self._options.items():
- # Coerce 'min_size' and 'max_size' for consistency.
- if key == "min_size":
- key = "minsize"
- elif key == "max_size":
- key = "maxsize"
- kwargs[key] = value
-
- return kwargs
-
- async def connect(self) -> None:
- assert self._pool is None, "DatabaseBackend is already running"
- kwargs = self._get_connection_kwargs()
- self._pool = await aiopg.create_pool(
- host=self._database_url.hostname,
- port=self._database_url.port,
- user=self._database_url.username or getpass.getuser(),
- password=self._database_url.password,
- database=self._database_url.database,
- **kwargs,
- )
-
- async def disconnect(self) -> None:
- assert self._pool is not None, "DatabaseBackend is not running"
- self._pool.close()
- await self._pool.wait_closed()
- self._pool = None
-
- def connection(self) -> "AiopgConnection":
- return AiopgConnection(self, self._dialect)
-
-
-class CompilationContext:
- def __init__(self, context: ExecutionContext):
- self.context = context
-
-
-class AiopgConnection(ConnectionBackend):
- def __init__(self, database: AiopgBackend, dialect: Dialect):
- self._database = database
- self._dialect = dialect
- self._connection = None # type: typing.Optional[aiopg.Connection]
-
- async def acquire(self) -> None:
- assert self._connection is None, "Connection is already acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- self._connection = await self._database._pool.acquire()
-
- async def release(self) -> None:
- assert self._connection is not None, "Connection is not acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- await self._database._pool.release(self._connection)
- self._connection = None
-
- async def fetch_all(self, query: ClauseElement) -> typing.List[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- rows = await cursor.fetchall()
- metadata = CursorResultMetaData(context, cursor.description)
- return [
- Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- for row in rows
- ]
- finally:
- cursor.close()
-
- async def fetch_one(self, query: ClauseElement) -> typing.Optional[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- row = await cursor.fetchone()
- if row is None:
- return None
- metadata = CursorResultMetaData(context, cursor.description)
- return Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- finally:
- cursor.close()
-
- async def execute(self, query: ClauseElement) -> typing.Any:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- return cursor.lastrowid
- finally:
- cursor.close()
-
- async def execute_many(self, queries: typing.List[ClauseElement]) -> None:
- assert self._connection is not None, "Connection is not acquired"
- cursor = await self._connection.cursor()
- try:
- for single_query in queries:
- single_query, args, context = self._compile(single_query)
- await cursor.execute(single_query, args)
- finally:
- cursor.close()
-
- async def iterate(
- self, query: ClauseElement
- ) -> typing.AsyncGenerator[typing.Any, None]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- metadata = CursorResultMetaData(context, cursor.description)
- async for row in cursor:
- yield Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- finally:
- cursor.close()
-
- def transaction(self) -> TransactionBackend:
- return AiopgTransaction(self)
-
- def _compile(
- self, query: ClauseElement
- ) -> typing.Tuple[str, dict, CompilationContext]:
- compiled = query.compile(
- dialect=self._dialect, compile_kwargs={"render_postcompile": True}
- )
-
- execution_context = self._dialect.execution_ctx_cls()
- execution_context.dialect = self._dialect
-
- if not isinstance(query, DDLElement):
- args = compiled.construct_params()
- for key, val in args.items():
- if key in compiled._bind_processors:
- args[key] = compiled._bind_processors[key](val)
-
- execution_context.result_column_struct = (
- compiled._result_columns,
- compiled._ordered_columns,
- compiled._textual_ordered_columns,
- compiled._loose_column_name_matching,
- )
- else:
- args = {}
-
- logger.debug("Query: %s\nArgs: %s", compiled.string, args)
- return compiled.string, args, CompilationContext(execution_context)
-
- @property
- def raw_connection(self) -> aiopg.connection.Connection:
- assert self._connection is not None, "Connection is not acquired"
- return self._connection
-
-
-class AiopgTransaction(TransactionBackend):
- def __init__(self, connection: AiopgConnection):
- self._connection = connection
- self._is_root = False
- self._savepoint_name = ""
-
- async def start(
- self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any]
- ) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- self._is_root = is_root
- cursor = await self._connection._connection.cursor()
- if self._is_root:
- await cursor.execute("BEGIN")
- else:
- id = str(uuid.uuid4()).replace("-", "_")
- self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}"
- try:
- await cursor.execute(f"SAVEPOINT {self._savepoint_name}")
- finally:
- cursor.close()
-
- async def commit(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- cursor = await self._connection._connection.cursor()
- if self._is_root:
- await cursor.execute("COMMIT")
- else:
- try:
- await cursor.execute(f"RELEASE SAVEPOINT {self._savepoint_name}")
- finally:
- cursor.close()
-
- async def rollback(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- cursor = await self._connection._connection.cursor()
- if self._is_root:
- await cursor.execute("ROLLBACK")
- else:
- try:
- await cursor.execute(f"ROLLBACK TO SAVEPOINT {self._savepoint_name}")
- finally:
- cursor.close()
diff --git a/env/lib/python3.9/site-packages/databases/backends/asyncmy.py b/env/lib/python3.9/site-packages/databases/backends/asyncmy.py
deleted file mode 100644
index e15dfa4..0000000
--- a/env/lib/python3.9/site-packages/databases/backends/asyncmy.py
+++ /dev/null
@@ -1,273 +0,0 @@
-import getpass
-import logging
-import typing
-import uuid
-
-import asyncmy
-from sqlalchemy.dialects.mysql import pymysql
-from sqlalchemy.engine.cursor import CursorResultMetaData
-from sqlalchemy.engine.interfaces import Dialect, ExecutionContext
-from sqlalchemy.engine.row import Row
-from sqlalchemy.sql import ClauseElement
-from sqlalchemy.sql.ddl import DDLElement
-
-from databases.core import LOG_EXTRA, DatabaseURL
-from databases.interfaces import (
- ConnectionBackend,
- DatabaseBackend,
- Record,
- TransactionBackend,
-)
-
-logger = logging.getLogger("databases")
-
-
-class AsyncMyBackend(DatabaseBackend):
- def __init__(
- self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any
- ) -> None:
- self._database_url = DatabaseURL(database_url)
- self._options = options
- self._dialect = pymysql.dialect(paramstyle="pyformat")
- self._dialect.supports_native_decimal = True
- self._pool = None
-
- def _get_connection_kwargs(self) -> dict:
- url_options = self._database_url.options
-
- kwargs = {}
- min_size = url_options.get("min_size")
- max_size = url_options.get("max_size")
- pool_recycle = url_options.get("pool_recycle")
- ssl = url_options.get("ssl")
-
- if min_size is not None:
- kwargs["minsize"] = int(min_size)
- if max_size is not None:
- kwargs["maxsize"] = int(max_size)
- if pool_recycle is not None:
- kwargs["pool_recycle"] = int(pool_recycle)
- if ssl is not None:
- kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()]
-
- for key, value in self._options.items():
- # Coerce 'min_size' and 'max_size' for consistency.
- if key == "min_size":
- key = "minsize"
- elif key == "max_size":
- key = "maxsize"
- kwargs[key] = value
-
- return kwargs
-
- async def connect(self) -> None:
- assert self._pool is None, "DatabaseBackend is already running"
- kwargs = self._get_connection_kwargs()
- self._pool = await asyncmy.create_pool(
- host=self._database_url.hostname,
- port=self._database_url.port or 3306,
- user=self._database_url.username or getpass.getuser(),
- password=self._database_url.password,
- db=self._database_url.database,
- autocommit=True,
- **kwargs,
- )
-
- async def disconnect(self) -> None:
- assert self._pool is not None, "DatabaseBackend is not running"
- self._pool.close()
- await self._pool.wait_closed()
- self._pool = None
-
- def connection(self) -> "AsyncMyConnection":
- return AsyncMyConnection(self, self._dialect)
-
-
-class CompilationContext:
- def __init__(self, context: ExecutionContext):
- self.context = context
-
-
-class AsyncMyConnection(ConnectionBackend):
- def __init__(self, database: AsyncMyBackend, dialect: Dialect):
- self._database = database
- self._dialect = dialect
- self._connection = None # type: typing.Optional[asyncmy.Connection]
-
- async def acquire(self) -> None:
- assert self._connection is None, "Connection is already acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- self._connection = await self._database._pool.acquire()
-
- async def release(self) -> None:
- assert self._connection is not None, "Connection is not acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- await self._database._pool.release(self._connection)
- self._connection = None
-
- async def fetch_all(self, query: ClauseElement) -> typing.List[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- async with self._connection.cursor() as cursor:
- try:
- await cursor.execute(query_str, args)
- rows = await cursor.fetchall()
- metadata = CursorResultMetaData(context, cursor.description)
- return [
- Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- for row in rows
- ]
- finally:
- await cursor.close()
-
- async def fetch_one(self, query: ClauseElement) -> typing.Optional[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- async with self._connection.cursor() as cursor:
- try:
- await cursor.execute(query_str, args)
- row = await cursor.fetchone()
- if row is None:
- return None
- metadata = CursorResultMetaData(context, cursor.description)
- return Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- finally:
- await cursor.close()
-
- async def execute(self, query: ClauseElement) -> typing.Any:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- async with self._connection.cursor() as cursor:
- try:
- await cursor.execute(query_str, args)
- if cursor.lastrowid == 0:
- return cursor.rowcount
- return cursor.lastrowid
- finally:
- await cursor.close()
-
- async def execute_many(self, queries: typing.List[ClauseElement]) -> None:
- assert self._connection is not None, "Connection is not acquired"
- async with self._connection.cursor() as cursor:
- try:
- for single_query in queries:
- single_query, args, context = self._compile(single_query)
- await cursor.execute(single_query, args)
- finally:
- await cursor.close()
-
- async def iterate(
- self, query: ClauseElement
- ) -> typing.AsyncGenerator[typing.Any, None]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- async with self._connection.cursor() as cursor:
- try:
- await cursor.execute(query_str, args)
- metadata = CursorResultMetaData(context, cursor.description)
- async for row in cursor:
- yield Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- finally:
- await cursor.close()
-
- def transaction(self) -> TransactionBackend:
- return AsyncMyTransaction(self)
-
- def _compile(
- self, query: ClauseElement
- ) -> typing.Tuple[str, dict, CompilationContext]:
- compiled = query.compile(
- dialect=self._dialect, compile_kwargs={"render_postcompile": True}
- )
-
- execution_context = self._dialect.execution_ctx_cls()
- execution_context.dialect = self._dialect
-
- if not isinstance(query, DDLElement):
- args = compiled.construct_params()
- for key, val in args.items():
- if key in compiled._bind_processors:
- args[key] = compiled._bind_processors[key](val)
-
- execution_context.result_column_struct = (
- compiled._result_columns,
- compiled._ordered_columns,
- compiled._textual_ordered_columns,
- compiled._loose_column_name_matching,
- )
- else:
- args = {}
-
- query_message = compiled.string.replace(" \n", " ").replace("\n", " ")
- logger.debug("Query: %s Args: %s", query_message, repr(args), extra=LOG_EXTRA)
- return compiled.string, args, CompilationContext(execution_context)
-
- @property
- def raw_connection(self) -> asyncmy.connection.Connection:
- assert self._connection is not None, "Connection is not acquired"
- return self._connection
-
-
-class AsyncMyTransaction(TransactionBackend):
- def __init__(self, connection: AsyncMyConnection):
- self._connection = connection
- self._is_root = False
- self._savepoint_name = ""
-
- async def start(
- self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any]
- ) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- self._is_root = is_root
- if self._is_root:
- await self._connection._connection.begin()
- else:
- id = str(uuid.uuid4()).replace("-", "_")
- self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}"
- async with self._connection._connection.cursor() as cursor:
- try:
- await cursor.execute(f"SAVEPOINT {self._savepoint_name}")
- finally:
- await cursor.close()
-
- async def commit(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- if self._is_root:
- await self._connection._connection.commit()
- else:
- async with self._connection._connection.cursor() as cursor:
- try:
- await cursor.execute(f"RELEASE SAVEPOINT {self._savepoint_name}")
- finally:
- await cursor.close()
-
- async def rollback(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- if self._is_root:
- await self._connection._connection.rollback()
- else:
- async with self._connection._connection.cursor() as cursor:
- try:
- await cursor.execute(
- f"ROLLBACK TO SAVEPOINT {self._savepoint_name}"
- )
- finally:
- await cursor.close()
diff --git a/env/lib/python3.9/site-packages/databases/backends/mysql.py b/env/lib/python3.9/site-packages/databases/backends/mysql.py
deleted file mode 100644
index 2a0a842..0000000
--- a/env/lib/python3.9/site-packages/databases/backends/mysql.py
+++ /dev/null
@@ -1,271 +0,0 @@
-import getpass
-import logging
-import typing
-import uuid
-
-import aiomysql
-from sqlalchemy.dialects.mysql import pymysql
-from sqlalchemy.engine.cursor import CursorResultMetaData
-from sqlalchemy.engine.interfaces import Dialect, ExecutionContext
-from sqlalchemy.engine.row import Row
-from sqlalchemy.sql import ClauseElement
-from sqlalchemy.sql.ddl import DDLElement
-
-from databases.core import LOG_EXTRA, DatabaseURL
-from databases.interfaces import (
- ConnectionBackend,
- DatabaseBackend,
- Record,
- TransactionBackend,
-)
-
-logger = logging.getLogger("databases")
-
-
-class MySQLBackend(DatabaseBackend):
- def __init__(
- self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any
- ) -> None:
- self._database_url = DatabaseURL(database_url)
- self._options = options
- self._dialect = pymysql.dialect(paramstyle="pyformat")
- self._dialect.supports_native_decimal = True
- self._pool = None
-
- def _get_connection_kwargs(self) -> dict:
- url_options = self._database_url.options
-
- kwargs = {}
- min_size = url_options.get("min_size")
- max_size = url_options.get("max_size")
- pool_recycle = url_options.get("pool_recycle")
- ssl = url_options.get("ssl")
-
- if min_size is not None:
- kwargs["minsize"] = int(min_size)
- if max_size is not None:
- kwargs["maxsize"] = int(max_size)
- if pool_recycle is not None:
- kwargs["pool_recycle"] = int(pool_recycle)
- if ssl is not None:
- kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()]
-
- for key, value in self._options.items():
- # Coerce 'min_size' and 'max_size' for consistency.
- if key == "min_size":
- key = "minsize"
- elif key == "max_size":
- key = "maxsize"
- kwargs[key] = value
-
- return kwargs
-
- async def connect(self) -> None:
- assert self._pool is None, "DatabaseBackend is already running"
- kwargs = self._get_connection_kwargs()
- self._pool = await aiomysql.create_pool(
- host=self._database_url.hostname,
- port=self._database_url.port or 3306,
- user=self._database_url.username or getpass.getuser(),
- password=self._database_url.password,
- db=self._database_url.database,
- autocommit=True,
- **kwargs,
- )
-
- async def disconnect(self) -> None:
- assert self._pool is not None, "DatabaseBackend is not running"
- self._pool.close()
- await self._pool.wait_closed()
- self._pool = None
-
- def connection(self) -> "MySQLConnection":
- return MySQLConnection(self, self._dialect)
-
-
-class CompilationContext:
- def __init__(self, context: ExecutionContext):
- self.context = context
-
-
-class MySQLConnection(ConnectionBackend):
- def __init__(self, database: MySQLBackend, dialect: Dialect):
- self._database = database
- self._dialect = dialect
- self._connection = None # type: typing.Optional[aiomysql.Connection]
-
- async def acquire(self) -> None:
- assert self._connection is None, "Connection is already acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- self._connection = await self._database._pool.acquire()
-
- async def release(self) -> None:
- assert self._connection is not None, "Connection is not acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- await self._database._pool.release(self._connection)
- self._connection = None
-
- async def fetch_all(self, query: ClauseElement) -> typing.List[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- rows = await cursor.fetchall()
- metadata = CursorResultMetaData(context, cursor.description)
- return [
- Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- for row in rows
- ]
- finally:
- await cursor.close()
-
- async def fetch_one(self, query: ClauseElement) -> typing.Optional[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- row = await cursor.fetchone()
- if row is None:
- return None
- metadata = CursorResultMetaData(context, cursor.description)
- return Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- finally:
- await cursor.close()
-
- async def execute(self, query: ClauseElement) -> typing.Any:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- if cursor.lastrowid == 0:
- return cursor.rowcount
- return cursor.lastrowid
- finally:
- await cursor.close()
-
- async def execute_many(self, queries: typing.List[ClauseElement]) -> None:
- assert self._connection is not None, "Connection is not acquired"
- cursor = await self._connection.cursor()
- try:
- for single_query in queries:
- single_query, args, context = self._compile(single_query)
- await cursor.execute(single_query, args)
- finally:
- await cursor.close()
-
- async def iterate(
- self, query: ClauseElement
- ) -> typing.AsyncGenerator[typing.Any, None]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- cursor = await self._connection.cursor()
- try:
- await cursor.execute(query_str, args)
- metadata = CursorResultMetaData(context, cursor.description)
- async for row in cursor:
- yield Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- finally:
- await cursor.close()
-
- def transaction(self) -> TransactionBackend:
- return MySQLTransaction(self)
-
- def _compile(
- self, query: ClauseElement
- ) -> typing.Tuple[str, dict, CompilationContext]:
- compiled = query.compile(
- dialect=self._dialect, compile_kwargs={"render_postcompile": True}
- )
-
- execution_context = self._dialect.execution_ctx_cls()
- execution_context.dialect = self._dialect
-
- if not isinstance(query, DDLElement):
- args = compiled.construct_params()
- for key, val in args.items():
- if key in compiled._bind_processors:
- args[key] = compiled._bind_processors[key](val)
-
- execution_context.result_column_struct = (
- compiled._result_columns,
- compiled._ordered_columns,
- compiled._textual_ordered_columns,
- compiled._loose_column_name_matching,
- )
- else:
- args = {}
-
- query_message = compiled.string.replace(" \n", " ").replace("\n", " ")
- logger.debug("Query: %s Args: %s", query_message, repr(args), extra=LOG_EXTRA)
- return compiled.string, args, CompilationContext(execution_context)
-
- @property
- def raw_connection(self) -> aiomysql.connection.Connection:
- assert self._connection is not None, "Connection is not acquired"
- return self._connection
-
-
-class MySQLTransaction(TransactionBackend):
- def __init__(self, connection: MySQLConnection):
- self._connection = connection
- self._is_root = False
- self._savepoint_name = ""
-
- async def start(
- self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any]
- ) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- self._is_root = is_root
- if self._is_root:
- await self._connection._connection.begin()
- else:
- id = str(uuid.uuid4()).replace("-", "_")
- self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}"
- cursor = await self._connection._connection.cursor()
- try:
- await cursor.execute(f"SAVEPOINT {self._savepoint_name}")
- finally:
- await cursor.close()
-
- async def commit(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- if self._is_root:
- await self._connection._connection.commit()
- else:
- cursor = await self._connection._connection.cursor()
- try:
- await cursor.execute(f"RELEASE SAVEPOINT {self._savepoint_name}")
- finally:
- await cursor.close()
-
- async def rollback(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- if self._is_root:
- await self._connection._connection.rollback()
- else:
- cursor = await self._connection._connection.cursor()
- try:
- await cursor.execute(f"ROLLBACK TO SAVEPOINT {self._savepoint_name}")
- finally:
- await cursor.close()
diff --git a/env/lib/python3.9/site-packages/databases/backends/postgres.py b/env/lib/python3.9/site-packages/databases/backends/postgres.py
deleted file mode 100644
index 3e1a6ff..0000000
--- a/env/lib/python3.9/site-packages/databases/backends/postgres.py
+++ /dev/null
@@ -1,325 +0,0 @@
-import logging
-import typing
-
-import asyncpg
-from sqlalchemy.dialects.postgresql import pypostgresql
-from sqlalchemy.engine.interfaces import Dialect
-from sqlalchemy.sql import ClauseElement
-from sqlalchemy.sql.ddl import DDLElement
-from sqlalchemy.sql.schema import Column
-from sqlalchemy.types import TypeEngine
-
-from databases.core import LOG_EXTRA, DatabaseURL
-from databases.interfaces import (
- ConnectionBackend,
- DatabaseBackend,
- Record as RecordInterface,
- TransactionBackend,
-)
-
-logger = logging.getLogger("databases")
-
-
-class PostgresBackend(DatabaseBackend):
- def __init__(
- self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any
- ) -> None:
- self._database_url = DatabaseURL(database_url)
- self._options = options
- self._dialect = self._get_dialect()
- self._pool = None
-
- def _get_dialect(self) -> Dialect:
- dialect = pypostgresql.dialect(paramstyle="pyformat")
-
- dialect.implicit_returning = True
- dialect.supports_native_enum = True
- dialect.supports_smallserial = True # 9.2+
- dialect._backslash_escapes = False
- dialect.supports_sane_multi_rowcount = True # psycopg 2.0.9+
- dialect._has_native_hstore = True
- dialect.supports_native_decimal = True
-
- return dialect
-
- def _get_connection_kwargs(self) -> dict:
- url_options = self._database_url.options
-
- kwargs = {} # type: typing.Dict[str, typing.Any]
- min_size = url_options.get("min_size")
- max_size = url_options.get("max_size")
- ssl = url_options.get("ssl")
-
- if min_size is not None:
- kwargs["min_size"] = int(min_size)
- if max_size is not None:
- kwargs["max_size"] = int(max_size)
- if ssl is not None:
- kwargs["ssl"] = {"true": True, "false": False}[ssl.lower()]
-
- kwargs.update(self._options)
-
- return kwargs
-
- async def connect(self) -> None:
- assert self._pool is None, "DatabaseBackend is already running"
- kwargs = dict(
- host=self._database_url.hostname,
- port=self._database_url.port,
- user=self._database_url.username,
- password=self._database_url.password,
- database=self._database_url.database,
- )
- kwargs.update(self._get_connection_kwargs())
- self._pool = await asyncpg.create_pool(**kwargs)
-
- async def disconnect(self) -> None:
- assert self._pool is not None, "DatabaseBackend is not running"
- await self._pool.close()
- self._pool = None
-
- def connection(self) -> "PostgresConnection":
- return PostgresConnection(self, self._dialect)
-
-
-class Record(RecordInterface):
- __slots__ = (
- "_row",
- "_result_columns",
- "_dialect",
- "_column_map",
- "_column_map_int",
- "_column_map_full",
- )
-
- def __init__(
- self,
- row: asyncpg.Record,
- result_columns: tuple,
- dialect: Dialect,
- column_maps: typing.Tuple[
- typing.Mapping[typing.Any, typing.Tuple[int, TypeEngine]],
- typing.Mapping[int, typing.Tuple[int, TypeEngine]],
- typing.Mapping[str, typing.Tuple[int, TypeEngine]],
- ],
- ) -> None:
- self._row = row
- self._result_columns = result_columns
- self._dialect = dialect
- self._column_map, self._column_map_int, self._column_map_full = column_maps
-
- @property
- def _mapping(self) -> typing.Mapping:
- return self._row
-
- def keys(self) -> typing.KeysView:
- import warnings
-
- warnings.warn(
- "The `Row.keys()` method is deprecated to mimic SQLAlchemy behaviour, "
- "use `Row._mapping.keys()` instead.",
- DeprecationWarning,
- )
- return self._mapping.keys()
-
- def values(self) -> typing.ValuesView:
- import warnings
-
- warnings.warn(
- "The `Row.values()` method is deprecated to mimic SQLAlchemy behaviour, "
- "use `Row._mapping.values()` instead.",
- DeprecationWarning,
- )
- return self._mapping.values()
-
- def __getitem__(self, key: typing.Any) -> typing.Any:
- if len(self._column_map) == 0: # raw query
- return self._row[key]
- elif isinstance(key, Column):
- idx, datatype = self._column_map_full[str(key)]
- elif isinstance(key, int):
- idx, datatype = self._column_map_int[key]
- else:
- idx, datatype = self._column_map[key]
- raw = self._row[idx]
- processor = datatype._cached_result_processor(self._dialect, None)
-
- if processor is not None:
- return processor(raw)
- return raw
-
- def __iter__(self) -> typing.Iterator:
- return iter(self._row.keys())
-
- def __len__(self) -> int:
- return len(self._row)
-
- def __getattr__(self, name: str) -> typing.Any:
- return self._mapping.get(name)
-
-
-class PostgresConnection(ConnectionBackend):
- def __init__(self, database: PostgresBackend, dialect: Dialect):
- self._database = database
- self._dialect = dialect
- self._connection = None # type: typing.Optional[asyncpg.connection.Connection]
-
- async def acquire(self) -> None:
- assert self._connection is None, "Connection is already acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- self._connection = await self._database._pool.acquire()
-
- async def release(self) -> None:
- assert self._connection is not None, "Connection is not acquired"
- assert self._database._pool is not None, "DatabaseBackend is not running"
- self._connection = await self._database._pool.release(self._connection)
- self._connection = None
-
- async def fetch_all(self, query: ClauseElement) -> typing.List[RecordInterface]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, result_columns = self._compile(query)
- rows = await self._connection.fetch(query_str, *args)
- dialect = self._dialect
- column_maps = self._create_column_maps(result_columns)
- return [Record(row, result_columns, dialect, column_maps) for row in rows]
-
- async def fetch_one(self, query: ClauseElement) -> typing.Optional[RecordInterface]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, result_columns = self._compile(query)
- row = await self._connection.fetchrow(query_str, *args)
- if row is None:
- return None
- return Record(
- row,
- result_columns,
- self._dialect,
- self._create_column_maps(result_columns),
- )
-
- async def fetch_val(
- self, query: ClauseElement, column: typing.Any = 0
- ) -> typing.Any:
- # we are not calling self._connection.fetchval here because
- # it does not convert all the types, e.g. JSON stays string
- # instead of an object
- # see also:
- # https://github.com/encode/databases/pull/131
- # https://github.com/encode/databases/pull/132
- # https://github.com/encode/databases/pull/246
- row = await self.fetch_one(query)
- if row is None:
- return None
- return row[column]
-
- async def execute(self, query: ClauseElement) -> typing.Any:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, result_columns = self._compile(query)
- return await self._connection.fetchval(query_str, *args)
-
- async def execute_many(self, queries: typing.List[ClauseElement]) -> None:
- assert self._connection is not None, "Connection is not acquired"
- # asyncpg uses prepared statements under the hood, so we just
- # loop through multiple executes here, which should all end up
- # using the same prepared statement.
- for single_query in queries:
- single_query, args, result_columns = self._compile(single_query)
- await self._connection.execute(single_query, *args)
-
- async def iterate(
- self, query: ClauseElement
- ) -> typing.AsyncGenerator[typing.Any, None]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, result_columns = self._compile(query)
- column_maps = self._create_column_maps(result_columns)
- async for row in self._connection.cursor(query_str, *args):
- yield Record(row, result_columns, self._dialect, column_maps)
-
- def transaction(self) -> TransactionBackend:
- return PostgresTransaction(connection=self)
-
- def _compile(self, query: ClauseElement) -> typing.Tuple[str, list, tuple]:
- compiled = query.compile(
- dialect=self._dialect, compile_kwargs={"render_postcompile": True}
- )
-
- if not isinstance(query, DDLElement):
- compiled_params = sorted(compiled.params.items())
-
- mapping = {
- key: "$" + str(i) for i, (key, _) in enumerate(compiled_params, start=1)
- }
- compiled_query = compiled.string % mapping
-
- processors = compiled._bind_processors
- args = [
- processors[key](val) if key in processors else val
- for key, val in compiled_params
- ]
-
- result_map = compiled._result_columns
- else:
- compiled_query = compiled.string
- args = []
- result_map = None
-
- query_message = compiled_query.replace(" \n", " ").replace("\n", " ")
- logger.debug(
- "Query: %s Args: %s", query_message, repr(tuple(args)), extra=LOG_EXTRA
- )
- return compiled_query, args, result_map
-
- @staticmethod
- def _create_column_maps(
- result_columns: tuple,
- ) -> typing.Tuple[
- typing.Mapping[typing.Any, typing.Tuple[int, TypeEngine]],
- typing.Mapping[int, typing.Tuple[int, TypeEngine]],
- typing.Mapping[str, typing.Tuple[int, TypeEngine]],
- ]:
- """
- Generate column -> datatype mappings from the column definitions.
-
- These mappings are used throughout PostgresConnection methods
- to initialize Record-s. The underlying DB driver does not do type
- conversion for us so we have wrap the returned asyncpg.Record-s.
-
- :return: Three mappings from different ways to address a column to \
- corresponding column indexes and datatypes: \
- 1. by column identifier; \
- 2. by column index; \
- 3. by column name in Column sqlalchemy objects.
- """
- column_map, column_map_int, column_map_full = {}, {}, {}
- for idx, (column_name, _, column, datatype) in enumerate(result_columns):
- column_map[column_name] = (idx, datatype)
- column_map_int[idx] = (idx, datatype)
- column_map_full[str(column[0])] = (idx, datatype)
- return column_map, column_map_int, column_map_full
-
- @property
- def raw_connection(self) -> asyncpg.connection.Connection:
- assert self._connection is not None, "Connection is not acquired"
- return self._connection
-
-
-class PostgresTransaction(TransactionBackend):
- def __init__(self, connection: PostgresConnection):
- self._connection = connection
- self._transaction = (
- None
- ) # type: typing.Optional[asyncpg.transaction.Transaction]
-
- async def start(
- self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any]
- ) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- self._transaction = self._connection._connection.transaction(**extra_options)
- await self._transaction.start()
-
- async def commit(self) -> None:
- assert self._transaction is not None
- await self._transaction.commit()
-
- async def rollback(self) -> None:
- assert self._transaction is not None
- await self._transaction.rollback()
diff --git a/env/lib/python3.9/site-packages/databases/backends/sqlite.py b/env/lib/python3.9/site-packages/databases/backends/sqlite.py
deleted file mode 100644
index 9626dcf..0000000
--- a/env/lib/python3.9/site-packages/databases/backends/sqlite.py
+++ /dev/null
@@ -1,245 +0,0 @@
-import logging
-import typing
-import uuid
-
-import aiosqlite
-from sqlalchemy.dialects.sqlite import pysqlite
-from sqlalchemy.engine.cursor import CursorResultMetaData
-from sqlalchemy.engine.interfaces import Dialect, ExecutionContext
-from sqlalchemy.engine.row import Row
-from sqlalchemy.sql import ClauseElement
-from sqlalchemy.sql.ddl import DDLElement
-
-from databases.core import LOG_EXTRA, DatabaseURL
-from databases.interfaces import (
- ConnectionBackend,
- DatabaseBackend,
- Record,
- TransactionBackend,
-)
-
-logger = logging.getLogger("databases")
-
-
-class SQLiteBackend(DatabaseBackend):
- def __init__(
- self, database_url: typing.Union[DatabaseURL, str], **options: typing.Any
- ) -> None:
- self._database_url = DatabaseURL(database_url)
- self._options = options
- self._dialect = pysqlite.dialect(paramstyle="qmark")
- # aiosqlite does not support decimals
- self._dialect.supports_native_decimal = False
- self._pool = SQLitePool(self._database_url, **self._options)
-
- async def connect(self) -> None:
- pass
- # assert self._pool is None, "DatabaseBackend is already running"
- # self._pool = await aiomysql.create_pool(
- # host=self._database_url.hostname,
- # port=self._database_url.port or 3306,
- # user=self._database_url.username or getpass.getuser(),
- # password=self._database_url.password,
- # db=self._database_url.database,
- # autocommit=True,
- # )
-
- async def disconnect(self) -> None:
- pass
- # assert self._pool is not None, "DatabaseBackend is not running"
- # self._pool.close()
- # await self._pool.wait_closed()
- # self._pool = None
-
- def connection(self) -> "SQLiteConnection":
- return SQLiteConnection(self._pool, self._dialect)
-
-
-class SQLitePool:
- def __init__(self, url: DatabaseURL, **options: typing.Any) -> None:
- self._url = url
- self._options = options
-
- async def acquire(self) -> aiosqlite.Connection:
- connection = aiosqlite.connect(
- database=self._url.database, isolation_level=None, **self._options
- )
- await connection.__aenter__()
- return connection
-
- async def release(self, connection: aiosqlite.Connection) -> None:
- await connection.__aexit__(None, None, None)
-
-
-class CompilationContext:
- def __init__(self, context: ExecutionContext):
- self.context = context
-
-
-class SQLiteConnection(ConnectionBackend):
- def __init__(self, pool: SQLitePool, dialect: Dialect):
- self._pool = pool
- self._dialect = dialect
- self._connection = None # type: typing.Optional[aiosqlite.Connection]
-
- async def acquire(self) -> None:
- assert self._connection is None, "Connection is already acquired"
- self._connection = await self._pool.acquire()
-
- async def release(self) -> None:
- assert self._connection is not None, "Connection is not acquired"
- await self._pool.release(self._connection)
- self._connection = None
-
- async def fetch_all(self, query: ClauseElement) -> typing.List[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
-
- async with self._connection.execute(query_str, args) as cursor:
- rows = await cursor.fetchall()
- metadata = CursorResultMetaData(context, cursor.description)
- return [
- Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
- for row in rows
- ]
-
- async def fetch_one(self, query: ClauseElement) -> typing.Optional[Record]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
-
- async with self._connection.execute(query_str, args) as cursor:
- row = await cursor.fetchone()
- if row is None:
- return None
- metadata = CursorResultMetaData(context, cursor.description)
- return Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
-
- async def execute(self, query: ClauseElement) -> typing.Any:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- async with self._connection.cursor() as cursor:
- await cursor.execute(query_str, args)
- if cursor.lastrowid == 0:
- return cursor.rowcount
- return cursor.lastrowid
-
- async def execute_many(self, queries: typing.List[ClauseElement]) -> None:
- assert self._connection is not None, "Connection is not acquired"
- for single_query in queries:
- await self.execute(single_query)
-
- async def iterate(
- self, query: ClauseElement
- ) -> typing.AsyncGenerator[typing.Any, None]:
- assert self._connection is not None, "Connection is not acquired"
- query_str, args, context = self._compile(query)
- async with self._connection.execute(query_str, args) as cursor:
- metadata = CursorResultMetaData(context, cursor.description)
- async for row in cursor:
- yield Row(
- metadata,
- metadata._processors,
- metadata._keymap,
- Row._default_key_style,
- row,
- )
-
- def transaction(self) -> TransactionBackend:
- return SQLiteTransaction(self)
-
- def _compile(
- self, query: ClauseElement
- ) -> typing.Tuple[str, list, CompilationContext]:
- compiled = query.compile(
- dialect=self._dialect, compile_kwargs={"render_postcompile": True}
- )
-
- execution_context = self._dialect.execution_ctx_cls()
- execution_context.dialect = self._dialect
-
- args = []
-
- if not isinstance(query, DDLElement):
- params = compiled.construct_params()
- for key in compiled.positiontup:
- raw_val = params[key]
- if key in compiled._bind_processors:
- val = compiled._bind_processors[key](raw_val)
- else:
- val = raw_val
- args.append(val)
-
- execution_context.result_column_struct = (
- compiled._result_columns,
- compiled._ordered_columns,
- compiled._textual_ordered_columns,
- compiled._loose_column_name_matching,
- )
-
- query_message = compiled.string.replace(" \n", " ").replace("\n", " ")
- logger.debug(
- "Query: %s Args: %s", query_message, repr(tuple(args)), extra=LOG_EXTRA
- )
- return compiled.string, args, CompilationContext(execution_context)
-
- @property
- def raw_connection(self) -> aiosqlite.core.Connection:
- assert self._connection is not None, "Connection is not acquired"
- return self._connection
-
-
-class SQLiteTransaction(TransactionBackend):
- def __init__(self, connection: SQLiteConnection):
- self._connection = connection
- self._is_root = False
- self._savepoint_name = ""
-
- async def start(
- self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any]
- ) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- self._is_root = is_root
- if self._is_root:
- async with self._connection._connection.execute("BEGIN") as cursor:
- await cursor.close()
- else:
- id = str(uuid.uuid4()).replace("-", "_")
- self._savepoint_name = f"STARLETTE_SAVEPOINT_{id}"
- async with self._connection._connection.execute(
- f"SAVEPOINT {self._savepoint_name}"
- ) as cursor:
- await cursor.close()
-
- async def commit(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- if self._is_root:
- async with self._connection._connection.execute("COMMIT") as cursor:
- await cursor.close()
- else:
- async with self._connection._connection.execute(
- f"RELEASE SAVEPOINT {self._savepoint_name}"
- ) as cursor:
- await cursor.close()
-
- async def rollback(self) -> None:
- assert self._connection._connection is not None, "Connection is not acquired"
- if self._is_root:
- async with self._connection._connection.execute("ROLLBACK") as cursor:
- await cursor.close()
- else:
- async with self._connection._connection.execute(
- f"ROLLBACK TO SAVEPOINT {self._savepoint_name}"
- ) as cursor:
- await cursor.close()
diff --git a/env/lib/python3.9/site-packages/databases/core.py b/env/lib/python3.9/site-packages/databases/core.py
deleted file mode 100644
index efa5947..0000000
--- a/env/lib/python3.9/site-packages/databases/core.py
+++ /dev/null
@@ -1,537 +0,0 @@
-import asyncio
-import contextlib
-import functools
-import logging
-import typing
-from contextvars import ContextVar
-from types import TracebackType
-from urllib.parse import SplitResult, parse_qsl, unquote, urlsplit
-
-from sqlalchemy import text
-from sqlalchemy.sql import ClauseElement
-
-from databases.importer import import_from_string
-from databases.interfaces import DatabaseBackend, Record
-
-try: # pragma: no cover
- import click
-
- # Extra log info for optional coloured terminal outputs.
- LOG_EXTRA = {
- "color_message": "Query: " + click.style("%s", bold=True) + " Args: %s"
- }
- CONNECT_EXTRA = {
- "color_message": "Connected to database " + click.style("%s", bold=True)
- }
- DISCONNECT_EXTRA = {
- "color_message": "Disconnected from database " + click.style("%s", bold=True)
- }
-except ImportError: # pragma: no cover
- LOG_EXTRA = {}
- CONNECT_EXTRA = {}
- DISCONNECT_EXTRA = {}
-
-
-logger = logging.getLogger("databases")
-
-
-class Database:
- SUPPORTED_BACKENDS = {
- "postgresql": "databases.backends.postgres:PostgresBackend",
- "postgresql+aiopg": "databases.backends.aiopg:AiopgBackend",
- "postgres": "databases.backends.postgres:PostgresBackend",
- "mysql": "databases.backends.mysql:MySQLBackend",
- "mysql+asyncmy": "databases.backends.asyncmy:AsyncMyBackend",
- "sqlite": "databases.backends.sqlite:SQLiteBackend",
- }
-
- def __init__(
- self,
- url: typing.Union[str, "DatabaseURL"],
- *,
- force_rollback: bool = False,
- **options: typing.Any,
- ):
- self.url = DatabaseURL(url)
- self.options = options
- self.is_connected = False
-
- self._force_rollback = force_rollback
-
- backend_str = self._get_backend()
- backend_cls = import_from_string(backend_str)
- assert issubclass(backend_cls, DatabaseBackend)
- self._backend = backend_cls(self.url, **self.options)
-
- # Connections are stored as task-local state.
- self._connection_context = ContextVar("connection_context") # type: ContextVar
-
- # When `force_rollback=True` is used, we use a single global
- # connection, within a transaction that always rolls back.
- self._global_connection = None # type: typing.Optional[Connection]
- self._global_transaction = None # type: typing.Optional[Transaction]
-
- async def connect(self) -> None:
- """
- Establish the connection pool.
- """
- if self.is_connected:
- logger.debug("Already connected, skipping connection")
- return None
-
- await self._backend.connect()
- logger.info(
- "Connected to database %s", self.url.obscure_password, extra=CONNECT_EXTRA
- )
- self.is_connected = True
-
- if self._force_rollback:
- assert self._global_connection is None
- assert self._global_transaction is None
-
- self._global_connection = Connection(self._backend)
- self._global_transaction = self._global_connection.transaction(
- force_rollback=True
- )
-
- await self._global_transaction.__aenter__()
-
- async def disconnect(self) -> None:
- """
- Close all connections in the connection pool.
- """
- if not self.is_connected:
- logger.debug("Already disconnected, skipping disconnection")
- return None
-
- if self._force_rollback:
- assert self._global_connection is not None
- assert self._global_transaction is not None
-
- await self._global_transaction.__aexit__()
-
- self._global_transaction = None
- self._global_connection = None
- else:
- self._connection_context = ContextVar("connection_context")
-
- await self._backend.disconnect()
- logger.info(
- "Disconnected from database %s",
- self.url.obscure_password,
- extra=DISCONNECT_EXTRA,
- )
- self.is_connected = False
-
- async def __aenter__(self) -> "Database":
- await self.connect()
- return self
-
- async def __aexit__(
- self,
- exc_type: typing.Type[BaseException] = None,
- exc_value: BaseException = None,
- traceback: TracebackType = None,
- ) -> None:
- await self.disconnect()
-
- async def fetch_all(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.List[Record]:
- async with self.connection() as connection:
- return await connection.fetch_all(query, values)
-
- async def fetch_one(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.Optional[Record]:
- async with self.connection() as connection:
- return await connection.fetch_one(query, values)
-
- async def fetch_val(
- self,
- query: typing.Union[ClauseElement, str],
- values: dict = None,
- column: typing.Any = 0,
- ) -> typing.Any:
- async with self.connection() as connection:
- return await connection.fetch_val(query, values, column=column)
-
- async def execute(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.Any:
- async with self.connection() as connection:
- return await connection.execute(query, values)
-
- async def execute_many(
- self, query: typing.Union[ClauseElement, str], values: list
- ) -> None:
- async with self.connection() as connection:
- return await connection.execute_many(query, values)
-
- async def iterate(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.AsyncGenerator[typing.Mapping, None]:
- async with self.connection() as connection:
- async for record in connection.iterate(query, values):
- yield record
-
- def connection(self) -> "Connection":
- if self._global_connection is not None:
- return self._global_connection
-
- try:
- return self._connection_context.get()
- except LookupError:
- connection = Connection(self._backend)
- self._connection_context.set(connection)
- return connection
-
- def transaction(
- self, *, force_rollback: bool = False, **kwargs: typing.Any
- ) -> "Transaction":
- return Transaction(self.connection, force_rollback=force_rollback, **kwargs)
-
- @contextlib.contextmanager
- def force_rollback(self) -> typing.Iterator[None]:
- initial = self._force_rollback
- self._force_rollback = True
- try:
- yield
- finally:
- self._force_rollback = initial
-
- def _get_backend(self) -> str:
- return self.SUPPORTED_BACKENDS.get(
- self.url.scheme, self.SUPPORTED_BACKENDS[self.url.dialect]
- )
-
-
-class Connection:
- def __init__(self, backend: DatabaseBackend) -> None:
- self._backend = backend
-
- self._connection_lock = asyncio.Lock()
- self._connection = self._backend.connection()
- self._connection_counter = 0
-
- self._transaction_lock = asyncio.Lock()
- self._transaction_stack = [] # type: typing.List[Transaction]
-
- self._query_lock = asyncio.Lock()
-
- async def __aenter__(self) -> "Connection":
- async with self._connection_lock:
- self._connection_counter += 1
- try:
- if self._connection_counter == 1:
- await self._connection.acquire()
- except BaseException as e:
- self._connection_counter -= 1
- raise e
- return self
-
- async def __aexit__(
- self,
- exc_type: typing.Type[BaseException] = None,
- exc_value: BaseException = None,
- traceback: TracebackType = None,
- ) -> None:
- async with self._connection_lock:
- assert self._connection is not None
- self._connection_counter -= 1
- if self._connection_counter == 0:
- await self._connection.release()
-
- async def fetch_all(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.List[Record]:
- built_query = self._build_query(query, values)
- async with self._query_lock:
- return await self._connection.fetch_all(built_query)
-
- async def fetch_one(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.Optional[Record]:
- built_query = self._build_query(query, values)
- async with self._query_lock:
- return await self._connection.fetch_one(built_query)
-
- async def fetch_val(
- self,
- query: typing.Union[ClauseElement, str],
- values: dict = None,
- column: typing.Any = 0,
- ) -> typing.Any:
- built_query = self._build_query(query, values)
- async with self._query_lock:
- return await self._connection.fetch_val(built_query, column)
-
- async def execute(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.Any:
- built_query = self._build_query(query, values)
- async with self._query_lock:
- return await self._connection.execute(built_query)
-
- async def execute_many(
- self, query: typing.Union[ClauseElement, str], values: list
- ) -> None:
- queries = [self._build_query(query, values_set) for values_set in values]
- async with self._query_lock:
- await self._connection.execute_many(queries)
-
- async def iterate(
- self, query: typing.Union[ClauseElement, str], values: dict = None
- ) -> typing.AsyncGenerator[typing.Any, None]:
- built_query = self._build_query(query, values)
- async with self.transaction():
- async with self._query_lock:
- async for record in self._connection.iterate(built_query):
- yield record
-
- def transaction(
- self, *, force_rollback: bool = False, **kwargs: typing.Any
- ) -> "Transaction":
- def connection_callable() -> Connection:
- return self
-
- return Transaction(connection_callable, force_rollback, **kwargs)
-
- @property
- def raw_connection(self) -> typing.Any:
- return self._connection.raw_connection
-
- @staticmethod
- def _build_query(
- query: typing.Union[ClauseElement, str], values: dict = None
- ) -> ClauseElement:
- if isinstance(query, str):
- query = text(query)
-
- return query.bindparams(**values) if values is not None else query
- elif values:
- return query.values(**values)
-
- return query
-
-
-_CallableType = typing.TypeVar("_CallableType", bound=typing.Callable)
-
-
-class Transaction:
- def __init__(
- self,
- connection_callable: typing.Callable[[], Connection],
- force_rollback: bool,
- **kwargs: typing.Any,
- ) -> None:
- self._connection_callable = connection_callable
- self._force_rollback = force_rollback
- self._extra_options = kwargs
-
- async def __aenter__(self) -> "Transaction":
- """
- Called when entering `async with database.transaction()`
- """
- await self.start()
- return self
-
- async def __aexit__(
- self,
- exc_type: typing.Type[BaseException] = None,
- exc_value: BaseException = None,
- traceback: TracebackType = None,
- ) -> None:
- """
- Called when exiting `async with database.transaction()`
- """
- if exc_type is not None or self._force_rollback:
- await self.rollback()
- else:
- await self.commit()
-
- def __await__(self) -> typing.Generator[None, None, "Transaction"]:
- """
- Called if using the low-level `transaction = await database.transaction()`
- """
- return self.start().__await__()
-
- def __call__(self, func: _CallableType) -> _CallableType:
- """
- Called if using `@database.transaction()` as a decorator.
- """
-
- @functools.wraps(func)
- async def wrapper(*args: typing.Any, **kwargs: typing.Any) -> typing.Any:
- async with self:
- return await func(*args, **kwargs)
-
- return wrapper # type: ignore
-
- async def start(self) -> "Transaction":
- self._connection = self._connection_callable()
- self._transaction = self._connection._connection.transaction()
-
- async with self._connection._transaction_lock:
- is_root = not self._connection._transaction_stack
- await self._connection.__aenter__()
- await self._transaction.start(
- is_root=is_root, extra_options=self._extra_options
- )
- self._connection._transaction_stack.append(self)
- return self
-
- async def commit(self) -> None:
- async with self._connection._transaction_lock:
- assert self._connection._transaction_stack[-1] is self
- self._connection._transaction_stack.pop()
- await self._transaction.commit()
- await self._connection.__aexit__()
-
- async def rollback(self) -> None:
- async with self._connection._transaction_lock:
- assert self._connection._transaction_stack[-1] is self
- self._connection._transaction_stack.pop()
- await self._transaction.rollback()
- await self._connection.__aexit__()
-
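-# A sketch of the three equivalent usage patterns the docstrings above
-# describe (illustrative only; assumes a connected `database` instance):
-#
-#     # 1. As an async context manager:
-#     async with database.transaction():
-#         await database.execute(query)
-#
-#     # 2. Low-level, starting explicitly and committing/rolling back:
-#     transaction = await database.transaction()
-#     try:
-#         await database.execute(query)
-#     except Exception:
-#         await transaction.rollback()
-#         raise
-#     else:
-#         await transaction.commit()
-#
-#     # 3. As a decorator around an async function:
-#     @database.transaction()
-#     async def populate():
-#         await database.execute(query)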
-
-class _EmptyNetloc(str):
- def __bool__(self) -> bool:
- return True
-
-
-class DatabaseURL:
- def __init__(self, url: typing.Union[str, "DatabaseURL"]):
- if isinstance(url, DatabaseURL):
- self._url: str = url._url
- elif isinstance(url, str):
- self._url = url
- else:
- raise TypeError(
- f"Invalid type for DatabaseURL. Expected str or DatabaseURL, got {type(url)}"
- )
-
- @property
- def components(self) -> SplitResult:
- if not hasattr(self, "_components"):
- self._components = urlsplit(self._url)
- return self._components
-
- @property
- def scheme(self) -> str:
- return self.components.scheme
-
- @property
- def dialect(self) -> str:
- return self.components.scheme.split("+")[0]
-
- @property
- def driver(self) -> str:
- if "+" not in self.components.scheme:
- return ""
- return self.components.scheme.split("+", 1)[1]
-
- @property
- def userinfo(self) -> typing.Optional[bytes]:
- if self.components.username:
- info = self.components.username
- if self.components.password:
- info += ":" + self.components.password
- return info.encode("utf-8")
- return None
-
- @property
- def username(self) -> typing.Optional[str]:
- if self.components.username is None:
- return None
- return unquote(self.components.username)
-
- @property
- def password(self) -> typing.Optional[str]:
- if self.components.password is None:
- return None
- return unquote(self.components.password)
-
- @property
- def hostname(self) -> typing.Optional[str]:
- return (
- self.components.hostname
- or self.options.get("host")
- or self.options.get("unix_sock")
- )
-
- @property
- def port(self) -> typing.Optional[int]:
- return self.components.port
-
- @property
- def netloc(self) -> typing.Optional[str]:
- return self.components.netloc
-
- @property
- def database(self) -> str:
- path = self.components.path
- if path.startswith("/"):
- path = path[1:]
- return unquote(path)
-
- @property
- def options(self) -> dict:
- if not hasattr(self, "_options"):
- self._options = dict(parse_qsl(self.components.query))
- return self._options
-
- def replace(self, **kwargs: typing.Any) -> "DatabaseURL":
- if (
- "username" in kwargs
- or "password" in kwargs
- or "hostname" in kwargs
- or "port" in kwargs
- ):
- hostname = kwargs.pop("hostname", self.hostname)
- port = kwargs.pop("port", self.port)
- username = kwargs.pop("username", self.components.username)
- password = kwargs.pop("password", self.components.password)
-
- netloc = hostname
- if port is not None:
- netloc += f":{port}"
- if username is not None:
- userpass = username
- if password is not None:
- userpass += f":{password}"
- netloc = f"{userpass}@{netloc}"
-
- kwargs["netloc"] = netloc
-
- if "database" in kwargs:
- kwargs["path"] = "/" + kwargs.pop("database")
-
- if "dialect" in kwargs or "driver" in kwargs:
- dialect = kwargs.pop("dialect", self.dialect)
- driver = kwargs.pop("driver", self.driver)
- kwargs["scheme"] = f"{dialect}+{driver}" if driver else dialect
-
- if not kwargs.get("netloc", self.netloc):
- # Using an empty string that evaluates as True means we end up
- # with URLs like `sqlite:///database` instead of `sqlite:/database`
- kwargs["netloc"] = _EmptyNetloc()
-
- components = self.components._replace(**kwargs)
- return self.__class__(components.geturl())
-
- @property
- def obscure_password(self) -> str:
- if self.password:
- return self.replace(password="********")._url
- return self._url
-
- def __str__(self) -> str:
- return self._url
-
- def __repr__(self) -> str:
- return f"{self.__class__.__name__}({repr(self.obscure_password)})"
-
- def __eq__(self, other: typing.Any) -> bool:
- return str(self) == str(other)
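-
-# A rough sketch of how DatabaseURL parses and rebuilds URLs (the example
-# URL below is illustrative only):
-#
-#     url = DatabaseURL("postgresql+asyncpg://user:secret@localhost:5432/mydb")
-#     url.dialect           # "postgresql"
-#     url.driver            # "asyncpg"
-#     url.database          # "mydb"
-#     url.replace(database="test_" + url.database)  # returns a new DatabaseURL
-#     url.obscure_password  # same URL with the password shown as "********"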
diff --git a/env/lib/python3.9/site-packages/databases/importer.py b/env/lib/python3.9/site-packages/databases/importer.py
deleted file mode 100644
index f680782..0000000
--- a/env/lib/python3.9/site-packages/databases/importer.py
+++ /dev/null
@@ -1,35 +0,0 @@
-import importlib
-import typing
-
-
-class ImportFromStringError(Exception):
- pass
-
-
-def import_from_string(import_str: str) -> typing.Any:
- module_str, _, attrs_str = import_str.partition(":")
- if not module_str or not attrs_str:
-        message = (
-            'Import string "{import_str}" must be in format "<module>:<attribute>".'
-        )
- raise ImportFromStringError(message.format(import_str=import_str))
-
- try:
- module = importlib.import_module(module_str)
- except ImportError as exc:
- if exc.name != module_str:
- raise exc from None
- message = 'Could not import module "{module_str}".'
- raise ImportFromStringError(message.format(module_str=module_str))
-
- instance = module
- try:
- for attr_str in attrs_str.split("."):
- instance = getattr(instance, attr_str)
- except AttributeError as exc:
- message = 'Attribute "{attrs_str}" not found in module "{module_str}".'
- raise ImportFromStringError(
- message.format(attrs_str=attrs_str, module_str=module_str)
- )
-
- return instance
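-
-# Example: resolving a backend class from a "<module>:<attribute>" string,
-# mirroring the SUPPORTED_BACKENDS entries in core.py:
-#
-#     backend_cls = import_from_string("databases.backends.sqlite:SQLiteBackend")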
diff --git a/env/lib/python3.9/site-packages/databases/interfaces.py b/env/lib/python3.9/site-packages/databases/interfaces.py
deleted file mode 100644
index fd6a24e..0000000
--- a/env/lib/python3.9/site-packages/databases/interfaces.py
+++ /dev/null
@@ -1,78 +0,0 @@
-import typing
-from collections.abc import Sequence
-
-from sqlalchemy.sql import ClauseElement
-
-
-class DatabaseBackend:
- async def connect(self) -> None:
- raise NotImplementedError() # pragma: no cover
-
- async def disconnect(self) -> None:
- raise NotImplementedError() # pragma: no cover
-
- def connection(self) -> "ConnectionBackend":
- raise NotImplementedError() # pragma: no cover
-
-
-class ConnectionBackend:
- async def acquire(self) -> None:
- raise NotImplementedError() # pragma: no cover
-
- async def release(self) -> None:
- raise NotImplementedError() # pragma: no cover
-
- async def fetch_all(self, query: ClauseElement) -> typing.List["Record"]:
- raise NotImplementedError() # pragma: no cover
-
- async def fetch_one(self, query: ClauseElement) -> typing.Optional["Record"]:
- raise NotImplementedError() # pragma: no cover
-
- async def fetch_val(
- self, query: ClauseElement, column: typing.Any = 0
- ) -> typing.Any:
- row = await self.fetch_one(query)
- return None if row is None else row[column]
-
- async def execute(self, query: ClauseElement) -> typing.Any:
- raise NotImplementedError() # pragma: no cover
-
- async def execute_many(self, queries: typing.List[ClauseElement]) -> None:
- raise NotImplementedError() # pragma: no cover
-
- async def iterate(
- self, query: ClauseElement
- ) -> typing.AsyncGenerator[typing.Mapping, None]:
- raise NotImplementedError() # pragma: no cover
- # mypy needs async iterators to contain a `yield`
- # https://github.com/python/mypy/issues/5385#issuecomment-407281656
- yield True # pragma: no cover
-
- def transaction(self) -> "TransactionBackend":
- raise NotImplementedError() # pragma: no cover
-
- @property
- def raw_connection(self) -> typing.Any:
- raise NotImplementedError() # pragma: no cover
-
-
-class TransactionBackend:
- async def start(
- self, is_root: bool, extra_options: typing.Dict[typing.Any, typing.Any]
- ) -> None:
- raise NotImplementedError() # pragma: no cover
-
- async def commit(self) -> None:
- raise NotImplementedError() # pragma: no cover
-
- async def rollback(self) -> None:
- raise NotImplementedError() # pragma: no cover
-
-
-class Record(Sequence):
- @property
- def _mapping(self) -> typing.Mapping:
- raise NotImplementedError() # pragma: no cover
-
- def __getitem__(self, key: typing.Any) -> typing.Any:
- raise NotImplementedError() # pragma: no cover
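-
-# Together these classes define the contract a concrete driver backend
-# implements: DatabaseBackend owns pool setup/teardown and hands out
-# ConnectionBackend instances, which execute queries and create
-# TransactionBackend objects mapping begin/commit/rollback onto the driver.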
diff --git a/env/lib/python3.9/site-packages/databases/py.typed b/env/lib/python3.9/site-packages/databases/py.typed
deleted file mode 100644
index 8b13789..0000000
--- a/env/lib/python3.9/site-packages/databases/py.typed
+++ /dev/null
@@ -1 +0,0 @@
-
diff --git a/env/lib/python3.9/site-packages/distutils-precedence.pth b/env/lib/python3.9/site-packages/distutils-precedence.pth
deleted file mode 100644
index 7f009fe..0000000
--- a/env/lib/python3.9/site-packages/distutils-precedence.pth
+++ /dev/null
@@ -1 +0,0 @@
-import os; var = 'SETUPTOOLS_USE_DISTUTILS'; enabled = os.environ.get(var, 'local') == 'local'; enabled and __import__('_distutils_hack').add_shim();
diff --git a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/INSTALLER b/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/INSTALLER
deleted file mode 100644
index a1b589e..0000000
--- a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/INSTALLER
+++ /dev/null
@@ -1 +0,0 @@
-pip
diff --git a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/LICENSE b/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/LICENSE
deleted file mode 100644
index 3e92463..0000000
--- a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/LICENSE
+++ /dev/null
@@ -1,21 +0,0 @@
-The MIT License (MIT)
-
-Copyright (c) 2018 Sebastián Ramírez
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
diff --git a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/METADATA b/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/METADATA
deleted file mode 100644
index bbfc20f..0000000
--- a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/METADATA
+++ /dev/null
@@ -1,550 +0,0 @@
-Metadata-Version: 2.1
-Name: fastapi
-Version: 0.82.0
-Summary: FastAPI framework, high performance, easy to learn, fast to code, ready for production
-Home-page: https://github.com/tiangolo/fastapi
-Author: Sebastián Ramírez
-Author-email: tiangolo@gmail.com
-Requires-Python: >=3.6.1
-Description-Content-Type: text/markdown
-Classifier: Intended Audience :: Information Technology
-Classifier: Intended Audience :: System Administrators
-Classifier: Operating System :: OS Independent
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python
-Classifier: Topic :: Internet
-Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
-Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Classifier: Topic :: Software Development :: Libraries
-Classifier: Topic :: Software Development
-Classifier: Typing :: Typed
-Classifier: Development Status :: 4 - Beta
-Classifier: Environment :: Web Environment
-Classifier: Framework :: AsyncIO
-Classifier: Framework :: FastAPI
-Classifier: Intended Audience :: Developers
-Classifier: License :: OSI Approved :: MIT License
-Classifier: Programming Language :: Python :: 3 :: Only
-Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Classifier: Programming Language :: Python :: 3.10
-Classifier: Topic :: Internet :: WWW/HTTP :: HTTP Servers
-Classifier: Topic :: Internet :: WWW/HTTP
-Requires-Dist: starlette==0.19.1
-Requires-Dist: pydantic >=1.6.2,!=1.7,!=1.7.1,!=1.7.2,!=1.7.3,!=1.8,!=1.8.1,<2.0.0
-Requires-Dist: requests >=2.24.0,<3.0.0 ; extra == "all"
-Requires-Dist: jinja2 >=2.11.2,<4.0.0 ; extra == "all"
-Requires-Dist: python-multipart >=0.0.5,<0.0.6 ; extra == "all"
-Requires-Dist: itsdangerous >=1.1.0,<3.0.0 ; extra == "all"
-Requires-Dist: pyyaml >=5.3.1,<7.0.0 ; extra == "all"
-Requires-Dist: ujson >=4.0.1,!=4.0.2,!=4.1.0,!=4.2.0,!=4.3.0,!=5.0.0,!=5.1.0,<6.0.0 ; extra == "all"
-Requires-Dist: orjson >=3.2.1,<4.0.0 ; extra == "all"
-Requires-Dist: email_validator >=1.1.1,<2.0.0 ; extra == "all"
-Requires-Dist: uvicorn[standard] >=0.12.0,<0.18.0 ; extra == "all"
-Requires-Dist: python-jose[cryptography] >=3.3.0,<4.0.0 ; extra == "dev"
-Requires-Dist: passlib[bcrypt] >=1.7.2,<2.0.0 ; extra == "dev"
-Requires-Dist: autoflake >=1.4.0,<2.0.0 ; extra == "dev"
-Requires-Dist: flake8 >=3.8.3,<6.0.0 ; extra == "dev"
-Requires-Dist: uvicorn[standard] >=0.12.0,<0.18.0 ; extra == "dev"
-Requires-Dist: pre-commit >=2.17.0,<3.0.0 ; extra == "dev"
-Requires-Dist: mkdocs >=1.1.2,<2.0.0 ; extra == "doc"
-Requires-Dist: mkdocs-material >=8.1.4,<9.0.0 ; extra == "doc"
-Requires-Dist: mdx-include >=1.4.1,<2.0.0 ; extra == "doc"
-Requires-Dist: mkdocs-markdownextradata-plugin >=0.1.7,<0.3.0 ; extra == "doc"
-Requires-Dist: typer >=0.4.1,<0.5.0 ; extra == "doc"
-Requires-Dist: pyyaml >=5.3.1,<7.0.0 ; extra == "doc"
-Requires-Dist: pytest >=6.2.4,<7.0.0 ; extra == "test"
-Requires-Dist: pytest-cov >=2.12.0,<4.0.0 ; extra == "test"
-Requires-Dist: mypy ==0.910 ; extra == "test"
-Requires-Dist: flake8 >=3.8.3,<6.0.0 ; extra == "test"
-Requires-Dist: black == 22.3.0 ; extra == "test"
-Requires-Dist: isort >=5.0.6,<6.0.0 ; extra == "test"
-Requires-Dist: requests >=2.24.0,<3.0.0 ; extra == "test"
-Requires-Dist: httpx >=0.14.0,<0.19.0 ; extra == "test"
-Requires-Dist: email_validator >=1.1.1,<2.0.0 ; extra == "test"
-Requires-Dist: sqlalchemy >=1.3.18,<1.5.0 ; extra == "test"
-Requires-Dist: peewee >=3.13.3,<4.0.0 ; extra == "test"
-Requires-Dist: databases[sqlite] >=0.3.2,<0.6.0 ; extra == "test"
-Requires-Dist: orjson >=3.2.1,<4.0.0 ; extra == "test"
-Requires-Dist: ujson >=4.0.1,!=4.0.2,!=4.1.0,!=4.2.0,!=4.3.0,!=5.0.0,!=5.1.0,<6.0.0 ; extra == "test"
-Requires-Dist: python-multipart >=0.0.5,<0.0.6 ; extra == "test"
-Requires-Dist: flask >=1.1.2,<3.0.0 ; extra == "test"
-Requires-Dist: anyio[trio] >=3.2.1,<4.0.0 ; extra == "test"
-Requires-Dist: types-ujson ==4.2.1 ; extra == "test"
-Requires-Dist: types-orjson ==3.6.2 ; extra == "test"
-Requires-Dist: types-dataclasses ==0.6.5 ; extra == "test" and ( python_version<'3.7')
-Project-URL: Documentation, https://fastapi.tiangolo.com/
-Provides-Extra: all
-Provides-Extra: dev
-Provides-Extra: doc
-Provides-Extra: test
-
-FastAPI framework, high performance, easy to learn, fast to code, ready for production
-
----
-
-**Documentation**: https://fastapi.tiangolo.com
-
-**Source Code**: https://github.com/tiangolo/fastapi
-
----
-
-FastAPI is a modern, fast (high-performance) web framework for building APIs with Python 3.6+ based on standard Python type hints.
-
-The key features are:
-
-* **Fast**: Very high performance, on par with **NodeJS** and **Go** (thanks to Starlette and Pydantic). [One of the fastest Python frameworks available](#performance).
-* **Fast to code**: Increase the speed to develop features by about 200% to 300%. *
-* **Fewer bugs**: Reduce about 40% of human (developer) induced errors. *
-* **Intuitive**: Great editor support. Completion everywhere. Less time debugging.
-* **Easy**: Designed to be easy to use and learn. Less time reading docs.
-* **Short**: Minimize code duplication. Multiple features from each parameter declaration. Fewer bugs.
-* **Robust**: Get production-ready code. With automatic interactive documentation.
-* **Standards-based**: Based on (and fully compatible with) the open standards for APIs: OpenAPI (previously known as Swagger) and JSON Schema.
-
-* estimation based on tests on an internal development team, building production applications.
-
-## Sponsors
-
-Other sponsors
-
-## Opinions
-
-"_[...] I'm using **FastAPI** a ton these days. [...] I'm actually planning to use it for all of my team's **ML services at Microsoft**. Some of them are getting integrated into the core **Windows** product and some **Office** products._"
-
-Kabir Khan - Microsoft (ref)
-
----
-
-"_We adopted the **FastAPI** library to spawn a **REST** server that can be queried to obtain **predictions**. [for Ludwig]_"
-
-
-Piero Molino, Yaroslav Dudin, and Sai Sumanth Miryala - Uber (ref)
-
----
-
-"_**Netflix** is pleased to announce the open-source release of our **crisis management** orchestration framework: **Dispatch**! [built with **FastAPI**]_"
-
-
-Kevin Glisson, Marc Vilanova, Forest Monsen - Netflix (ref)
-
----
-
-"_I’m over the moon excited about **FastAPI**. It’s so fun!_"
-
-Brian Okken - Python Bytes podcast host (ref)
-
----
-
-"_Honestly, what you've built looks super solid and polished. In many ways, it's what I wanted **Hug** to be - it's really inspiring to see someone build that._"
-
-Timothy Crosley - Hug creator (ref)
-
----
-
-"_If you're looking to learn one **modern framework** for building REST APIs, check out **FastAPI** [...] It's fast, easy to use and easy to learn [...]_"
-
-"_We've switched over to **FastAPI** for our **APIs** [...] I think you'll like it [...]_"
-
-Ines Montani - Explosion AI founder - spaCy creator (ref)
-
----
-
-## **Typer**, the FastAPI of CLIs
-
-If you are building a CLI app to be used in the terminal instead of a web API, check out **Typer**.
-
-**Typer** is FastAPI's little sibling. And it's intended to be the **FastAPI of CLIs**. ⌨️ 🚀
-
-## Requirements
-
-Python 3.6+
-
-FastAPI stands on the shoulders of giants:
-
-* Starlette for the web parts.
-* Pydantic for the data parts.
-
-## Installation
-
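-Install FastAPI with `pip`:
-
-```console
-$ pip install fastapi
-```
-
-You will also need an ASGI server, for production such as Uvicorn or Hypercorn.
-
-```console
-$ pip install "uvicorn[standard]"
-```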
-
-## Example
-
-### Create it
-
-* Create a file `main.py` with:
-
-```Python
-from typing import Union
-
-from fastapi import FastAPI
-
-app = FastAPI()
-
-
-@app.get("/")
-def read_root():
- return {"Hello": "World"}
-
-
-@app.get("/items/{item_id}")
-def read_item(item_id: int, q: Union[str, None] = None):
- return {"item_id": item_id, "q": q}
-```
-
-
-Or use async def...
-
-If your code uses `async` / `await`, use `async def`:
-
-```Python hl_lines="9 14"
-from typing import Union
-
-from fastapi import FastAPI
-
-app = FastAPI()
-
-
-@app.get("/")
-async def read_root():
- return {"Hello": "World"}
-
-
-@app.get("/items/{item_id}")
-async def read_item(item_id: int, q: Union[str, None] = None):
- return {"item_id": item_id, "q": q}
-```
-
-**Note**:
-
-If you don't know, check the _"In a hurry?"_ section about `async` and `await` in the docs.
-
-
-
-### Run it
-
-Run the server with:
-
-
-
-```console
-$ uvicorn main:app --reload
-
-INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
-INFO: Started reloader process [28720]
-INFO: Started server process [28722]
-INFO: Waiting for application startup.
-INFO: Application startup complete.
-```
-
-About the command uvicorn main:app --reload...
-
-The command `uvicorn main:app` refers to:
-
-* `main`: the file `main.py` (the Python "module").
-* `app`: the object created inside of `main.py` with the line `app = FastAPI()`.
-* `--reload`: make the server restart after code changes. Only do this for development.
-
-
-
-### Check it
-
-Open your browser at http://127.0.0.1:8000/items/5?q=somequery.
-
-You will see the JSON response as:
-
-```JSON
-{"item_id": 5, "q": "somequery"}
-```
-
-You already created an API that:
-
-* Receives HTTP requests in the _paths_ `/` and `/items/{item_id}`.
-* Both _paths_ take `GET` operations (also known as HTTP _methods_).
-* The _path_ `/items/{item_id}` has a _path parameter_ `item_id` that should be an `int`.
-* The _path_ `/items/{item_id}` has an optional `str` _query parameter_ `q`.
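-
-If you prefer the terminal, you can make the same request with `curl` (assuming the server from the previous step is still running) and should see the same JSON:
-
-```console
-$ curl "http://127.0.0.1:8000/items/5?q=somequery"
-{"item_id":5,"q":"somequery"}
-```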
-
-### Interactive API docs
-
-Now go to http://127.0.0.1:8000/docs.
-
-You will see the automatic interactive API documentation (provided by Swagger UI):
-
-
-
-### Alternative API docs
-
-And now, go to http://127.0.0.1:8000/redoc.
-
-You will see the alternative automatic documentation (provided by ReDoc):
-
-
-
-## Example upgrade
-
-Now modify the file `main.py` to receive a body from a `PUT` request.
-
-Declare the body using standard Python types, thanks to Pydantic.
-
-```Python hl_lines="4 9-12 25-27"
-from typing import Union
-
-from fastapi import FastAPI
-from pydantic import BaseModel
-
-app = FastAPI()
-
-
-class Item(BaseModel):
- name: str
- price: float
- is_offer: Union[bool, None] = None
-
-
-@app.get("/")
-def read_root():
- return {"Hello": "World"}
-
-
-@app.get("/items/{item_id}")
-def read_item(item_id: int, q: Union[str, None] = None):
- return {"item_id": item_id, "q": q}
-
-
-@app.put("/items/{item_id}")
-def update_item(item_id: int, item: Item):
- return {"item_name": item.name, "item_id": item_id}
-```
-
-The server should reload automatically (because you added `--reload` to the `uvicorn` command above).
-
-### Interactive API docs upgrade
-
-Now go to http://127.0.0.1:8000/docs.
-
-* The interactive API documentation will be automatically updated, including the new body:
-
-
-
-* Click on the "Try it out" button; it lets you fill in the parameters and interact directly with the API:
-
-
-
-* Then click on the "Execute" button, the user interface will communicate with your API, send the parameters, get the results and show them on the screen:
-
-
-
-### Alternative API docs upgrade
-
-And now, go to http://127.0.0.1:8000/redoc.
-
-* The alternative documentation will also reflect the new query parameter and body:
-
-
-
-### Recap
-
-In summary, you declare **once** the types of parameters, body, etc. as function parameters.
-
-You do that with standard modern Python types.
-
-You don't have to learn a new syntax, the methods or classes of a specific library, etc.
-
-Just standard **Python 3.6+**.
-
-For example, for an `int`:
-
-```Python
-item_id: int
-```
-
-or for a more complex `Item` model:
-
-```Python
-item: Item
-```
-
-...and with that single declaration you get:
-
-* Editor support, including:
- * Completion.
- * Type checks.
-* Validation of data:
- * Automatic and clear errors when the data is invalid.
- * Validation even for deeply nested JSON objects.
-* Conversion of input data: coming from the network to Python data and types. Reading from:
- * JSON.
- * Path parameters.
- * Query parameters.
- * Cookies.
- * Headers.
- * Forms.
- * Files.
-* Conversion of output data: converting from Python data and types to network data (as JSON):
- * Convert Python types (`str`, `int`, `float`, `bool`, `list`, etc).
- * `datetime` objects.
- * `UUID` objects.
- * Database models.
- * ...and many more.
-* Automatic interactive API documentation, including 2 alternative user interfaces:
- * Swagger UI.
- * ReDoc.
-
----
-
-Coming back to the previous code example, **FastAPI** will:
-
-* Validate that there is an `item_id` in the path for `GET` and `PUT` requests.
-* Validate that the `item_id` is of type `int` for `GET` and `PUT` requests.
-    * If it is not, the client will see a useful, clear error (an example response follows this list).
-* Check if there is an optional query parameter named `q` (as in `http://127.0.0.1:8000/items/foo?q=somequery`) for `GET` requests.
- * As the `q` parameter is declared with `= None`, it is optional.
- * Without the `None` it would be required (as is the body in the case with `PUT`).
-* For `PUT` requests to `/items/{item_id}`, read the body as JSON:
- * Check that it has a required attribute `name` that should be a `str`.
- * Check that it has a required attribute `price` that has to be a `float`.
- * Check that it has an optional attribute `is_offer`, that should be a `bool`, if present.
- * All this would also work for deeply nested JSON objects.
-* Convert from and to JSON automatically.
-* Document everything with OpenAPI, that can be used by:
- * Interactive documentation systems.
- * Automatic client code generation systems, for many languages.
-* Provide 2 interactive documentation web interfaces directly.
-
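-For example, with the code above, a request to `/items/foo` fails the `int` validation for `item_id`, and the client gets back an HTTP `422` response along these lines (the exact body shape comes from Pydantic's error reporting):
-
-```JSON
-{
-    "detail": [
-        {
-            "loc": ["path", "item_id"],
-            "msg": "value is not a valid integer",
-            "type": "type_error.integer"
-        }
-    ]
-}
-```
-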
----
-
-We just scratched the surface, but you already get the idea of how it all works.
-
-Try changing the line with:
-
-```Python
- return {"item_name": item.name, "item_id": item_id}
-```
-
-...from:
-
-```Python
- ... "item_name": item.name ...
-```
-
-...to:
-
-```Python
- ... "item_price": item.price ...
-```
-
-...and see how your editor will auto-complete the attributes and know their types:
-
-For a more complete example including more features, see the Tutorial - User Guide.
-
-**Spoiler alert**: the tutorial - user guide includes:
-
-* Declaration of **parameters** from other places, such as: **headers**, **cookies**, **form fields** and **files**.
-* How to set **validation constraints** such as `maximum_length` or `regex`.
-* A very powerful and easy to use **Dependency Injection** system.
-* Security and authentication, including support for **OAuth2** with **JWT tokens** and **HTTP Basic** auth.
-* More advanced (but equally easy) techniques for declaring **deeply nested JSON models** (thanks to Pydantic).
-* **GraphQL** integration with Strawberry and other libraries.
-* Many extra features (thanks to Starlette) such as:
- * **WebSockets**
- * extremely easy tests based on `requests` and `pytest`
- * **CORS**
- * **Cookie Sessions**
- * ...and more.
-
-## Performance
-
-Independent TechEmpower benchmarks show **FastAPI** applications running under Uvicorn as one of the fastest Python frameworks available, only below Starlette and Uvicorn themselves (used internally by FastAPI). (*)
-
-To understand more about it, see the section Benchmarks.
-
-## Optional Dependencies
-
-Used by Pydantic:
-
-* ujson - for faster JSON "parsing".
-* email_validator - for email validation.
-
-Used by Starlette:
-
-* requests - Required if you want to use the `TestClient`.
-* jinja2 - Required if you want to use the default template configuration.
-* python-multipart - Required if you want to support form "parsing", with `request.form()`.
-* itsdangerous - Required for `SessionMiddleware` support.
-* pyyaml - Required for Starlette's `SchemaGenerator` support (you probably don't need it with FastAPI).
-* ujson - Required if you want to use `UJSONResponse`.
-
-Used by FastAPI / Starlette:
-
-* uvicorn - for the server that loads and serves your application.
-* orjson - Required if you want to use `ORJSONResponse`.
-
-You can install all of these with `pip install "fastapi[all]"`.
-
-## License
-
-This project is licensed under the terms of the MIT license.
-
diff --git a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/RECORD b/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/RECORD
deleted file mode 100644
index a9f7269..0000000
--- a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/RECORD
+++ /dev/null
@@ -1,91 +0,0 @@
-fastapi-0.82.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
-fastapi-0.82.0.dist-info/LICENSE,sha256=Tsif_IFIW5f-xYSy1KlhAy7v_oNEU4lP2cEnSQbMdE4,1086
-fastapi-0.82.0.dist-info/METADATA,sha256=-rNIwFA_p3JTxvXpnv16E6jRdoEUXBxE8BQmHTckwYk,24933
-fastapi-0.82.0.dist-info/RECORD,,
-fastapi-0.82.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-fastapi-0.82.0.dist-info/WHEEL,sha256=4TfKIB_xu-04bc2iKz6_zFt-gEFEEDU_31HGhqzOCE8,81
-fastapi/__init__.py,sha256=eXbm2FAen_dneWot44Iemtfn6QzsR2SBffXiv7EtFVw,1015
-fastapi/__pycache__/__init__.cpython-39.pyc,,
-fastapi/__pycache__/applications.cpython-39.pyc,,
-fastapi/__pycache__/background.cpython-39.pyc,,
-fastapi/__pycache__/concurrency.cpython-39.pyc,,
-fastapi/__pycache__/datastructures.cpython-39.pyc,,
-fastapi/__pycache__/encoders.cpython-39.pyc,,
-fastapi/__pycache__/exception_handlers.cpython-39.pyc,,
-fastapi/__pycache__/exceptions.cpython-39.pyc,,
-fastapi/__pycache__/logger.cpython-39.pyc,,
-fastapi/__pycache__/param_functions.cpython-39.pyc,,
-fastapi/__pycache__/params.cpython-39.pyc,,
-fastapi/__pycache__/requests.cpython-39.pyc,,
-fastapi/__pycache__/responses.cpython-39.pyc,,
-fastapi/__pycache__/routing.cpython-39.pyc,,
-fastapi/__pycache__/staticfiles.cpython-39.pyc,,
-fastapi/__pycache__/templating.cpython-39.pyc,,
-fastapi/__pycache__/testclient.cpython-39.pyc,,
-fastapi/__pycache__/types.cpython-39.pyc,,
-fastapi/__pycache__/utils.cpython-39.pyc,,
-fastapi/__pycache__/websockets.cpython-39.pyc,,
-fastapi/applications.py,sha256=M06H_ZDLtGHFariN4FpPKOZr3JlpruyzZ1XHciZFUBw,38062
-fastapi/background.py,sha256=HtN5_pJJrOdalSbuGSMKJAPNWUU5h7rY_BXXubu7-IQ,76
-fastapi/concurrency.py,sha256=-fRzZADbfXiPauNjLnQZs5wZIurRA1qy83OSwprWh1Q,1666
-fastapi/datastructures.py,sha256=oW6xuU0C-sBwbcyXI-MlBO0tSS4BSPB2lYUa1yCw8-A,1905
-fastapi/dependencies/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-fastapi/dependencies/__pycache__/__init__.cpython-39.pyc,,
-fastapi/dependencies/__pycache__/models.cpython-39.pyc,,
-fastapi/dependencies/__pycache__/utils.cpython-39.pyc,,
-fastapi/dependencies/models.py,sha256=zNbioxICuOeb-9ADDVQ45hUHOC0PBtPVEfVU3f1l_nc,2494
-fastapi/dependencies/utils.py,sha256=zsbThdKoayRv0pWplZ6jWjK9ekIoix036NO_nv565xc,27373
-fastapi/encoders.py,sha256=0Zd7Ad60YMVzB5A0rcHkUZwwriY9In7FY6ziyX5BZpw,5885
-fastapi/exception_handlers.py,sha256=UVYCCe4qt5-5_NuQ3SxTXjDvOdKMHiTfcLp3RUKXhg8,912
-fastapi/exceptions.py,sha256=Wy1sP3EisJohtxr-uKoH58QumPWmqHp6cpXOD3TTPOs,1117
-fastapi/logger.py,sha256=I9NNi3ov8AcqbsbC9wl1X-hdItKgYt2XTrx1f99Zpl4,54
-fastapi/middleware/__init__.py,sha256=oQDxiFVcc1fYJUOIFvphnK7pTT5kktmfL32QXpBFvvo,58
-fastapi/middleware/__pycache__/__init__.cpython-39.pyc,,
-fastapi/middleware/__pycache__/asyncexitstack.cpython-39.pyc,,
-fastapi/middleware/__pycache__/cors.cpython-39.pyc,,
-fastapi/middleware/__pycache__/gzip.cpython-39.pyc,,
-fastapi/middleware/__pycache__/httpsredirect.cpython-39.pyc,,
-fastapi/middleware/__pycache__/trustedhost.cpython-39.pyc,,
-fastapi/middleware/__pycache__/wsgi.cpython-39.pyc,,
-fastapi/middleware/asyncexitstack.py,sha256=72XjQmQ_tB_tTs9xOc0akXF_7TwZUPdyfc8gsN5LV8E,1197
-fastapi/middleware/cors.py,sha256=ynwjWQZoc_vbhzZ3_ZXceoaSrslHFHPdoM52rXr0WUU,79
-fastapi/middleware/gzip.py,sha256=xM5PcsH8QlAimZw4VDvcmTnqQamslThsfe3CVN2voa0,79
-fastapi/middleware/httpsredirect.py,sha256=rL8eXMnmLijwVkH7_400zHri1AekfeBd6D6qs8ix950,115
-fastapi/middleware/trustedhost.py,sha256=eE5XGRxGa7c5zPnMJDGp3BxaL25k5iVQlhnv-Pk0Pss,109
-fastapi/middleware/wsgi.py,sha256=Z3Ue-7wni4lUZMvH3G9ek__acgYdJstbnpZX_HQAboY,79
-fastapi/openapi/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-fastapi/openapi/__pycache__/__init__.cpython-39.pyc,,
-fastapi/openapi/__pycache__/constants.cpython-39.pyc,,
-fastapi/openapi/__pycache__/docs.cpython-39.pyc,,
-fastapi/openapi/__pycache__/models.cpython-39.pyc,,
-fastapi/openapi/__pycache__/utils.cpython-39.pyc,,
-fastapi/openapi/constants.py,sha256=mWxYBjED6PU-tQ9X227Qkq2SsW2cv-C1jYFKt63xxEs,107
-fastapi/openapi/docs.py,sha256=JBRaq7EEmeC-xoRSRFj6qZWQxfOZW_jvTw0r-PiKcZ4,6532
-fastapi/openapi/models.py,sha256=_XWDBU4Zlp5M9V6YI1kmXbqYKwK_xZxaqIA_DhLwqHk,11027
-fastapi/openapi/utils.py,sha256=DoI_rwP8wepUTsSCeaCGfXLuGm7Q7dlBqMbOkxYyk9Y,18808
-fastapi/param_functions.py,sha256=mhV6aNZmXuf_A7rZ830o3V-DFqbonIDRs_prDTetLs4,7521
-fastapi/params.py,sha256=LRoO2H1XBBIfBGB82gHtfnXhDZiDz-7CIordN3FoU1I,10600
-fastapi/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-fastapi/requests.py,sha256=zayepKFcienBllv3snmWI20Gk0oHNVLU4DDhqXBb4LU,142
-fastapi/responses.py,sha256=_-2YuL2PWB0WcVUm-T0bJzbo2Zl_v8n6XAavAUYwHjs,1279
-fastapi/routing.py,sha256=QmK293u0dclJivGQ2y8vapcwNb5h1Biw9FGyH5kYDoQ,53471
-fastapi/security/__init__.py,sha256=bO8pNmxqVRXUjfl2mOKiVZLn0FpBQ61VUYVjmppnbJw,881
-fastapi/security/__pycache__/__init__.cpython-39.pyc,,
-fastapi/security/__pycache__/api_key.cpython-39.pyc,,
-fastapi/security/__pycache__/base.cpython-39.pyc,,
-fastapi/security/__pycache__/http.cpython-39.pyc,,
-fastapi/security/__pycache__/oauth2.cpython-39.pyc,,
-fastapi/security/__pycache__/open_id_connect_url.cpython-39.pyc,,
-fastapi/security/__pycache__/utils.cpython-39.pyc,,
-fastapi/security/api_key.py,sha256=NbVpS9TxDOaipoZa8-SREHyMtTcM3bmy5szMiQxEX9s,2793
-fastapi/security/base.py,sha256=dl4pvbC-RxjfbWgPtCWd8MVU-7CB2SZ22rJDXVCXO6c,141
-fastapi/security/http.py,sha256=ZSy3DFKFDLa3-I4vwsY1r8hQB_VrtAXw4-fMJauZIK0,5984
-fastapi/security/oauth2.py,sha256=1NPA12T1_r2uo4iQWxJCKjUqVVdb532YDvX9e3PVpcE,8212
-fastapi/security/open_id_connect_url.py,sha256=iikzuJCz_DG44Q77VrupqSoCbJYaiXkuo_W-kdmAzeo,1145
-fastapi/security/utils.py,sha256=izlh-HBaL1VnJeOeRTQnyNgI3hgTFs73eCyLy-snb4A,266
-fastapi/staticfiles.py,sha256=iirGIt3sdY2QZXd36ijs3Cj-T0FuGFda3cd90kM9Ikw,69
-fastapi/templating.py,sha256=4zsuTWgcjcEainMJFAlW6-gnslm6AgOS1SiiDWfmQxk,76
-fastapi/testclient.py,sha256=nBvaAmX66YldReJNZXPOk1sfuo2Q6hs8bOvIaCep6LQ,66
-fastapi/types.py,sha256=r6MngTHzkZOP9lzXgduje9yeZe5EInWAzCLuRJlhIuE,118
-fastapi/utils.py,sha256=Kn8iHxH2qur5N7Blx8N2uwj6AcDDrJ76O192DzyOzvE,6709
-fastapi/websockets.py,sha256=419uncYObEKZG0YcrXscfQQYLSWoE10jqxVMetGdR98,222
diff --git a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/REQUESTED b/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/REQUESTED
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/WHEEL b/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/WHEEL
deleted file mode 100644
index 668ba4d..0000000
--- a/env/lib/python3.9/site-packages/fastapi-0.82.0.dist-info/WHEEL
+++ /dev/null
@@ -1,4 +0,0 @@
-Wheel-Version: 1.0
-Generator: flit 3.7.1
-Root-Is-Purelib: true
-Tag: py3-none-any
diff --git a/env/lib/python3.9/site-packages/fastapi/__init__.py b/env/lib/python3.9/site-packages/fastapi/__init__.py
deleted file mode 100644
index c50543b..0000000
--- a/env/lib/python3.9/site-packages/fastapi/__init__.py
+++ /dev/null
@@ -1,24 +0,0 @@
-"""FastAPI framework, high performance, easy to learn, fast to code, ready for production"""
-
-__version__ = "0.82.0"
-
-from starlette import status as status
-
-from .applications import FastAPI as FastAPI
-from .background import BackgroundTasks as BackgroundTasks
-from .datastructures import UploadFile as UploadFile
-from .exceptions import HTTPException as HTTPException
-from .param_functions import Body as Body
-from .param_functions import Cookie as Cookie
-from .param_functions import Depends as Depends
-from .param_functions import File as File
-from .param_functions import Form as Form
-from .param_functions import Header as Header
-from .param_functions import Path as Path
-from .param_functions import Query as Query
-from .param_functions import Security as Security
-from .requests import Request as Request
-from .responses import Response as Response
-from .routing import APIRouter as APIRouter
-from .websockets import WebSocket as WebSocket
-from .websockets import WebSocketDisconnect as WebSocketDisconnect
diff --git a/env/lib/python3.9/site-packages/fastapi/applications.py b/env/lib/python3.9/site-packages/fastapi/applications.py
deleted file mode 100644
index a242c50..0000000
--- a/env/lib/python3.9/site-packages/fastapi/applications.py
+++ /dev/null
@@ -1,871 +0,0 @@
-from enum import Enum
-from typing import (
- Any,
- Awaitable,
- Callable,
- Coroutine,
- Dict,
- List,
- Optional,
- Sequence,
- Type,
- Union,
-)
-
-from fastapi import routing
-from fastapi.datastructures import Default, DefaultPlaceholder
-from fastapi.encoders import DictIntStrAny, SetIntStr
-from fastapi.exception_handlers import (
- http_exception_handler,
- request_validation_exception_handler,
-)
-from fastapi.exceptions import RequestValidationError
-from fastapi.logger import logger
-from fastapi.middleware.asyncexitstack import AsyncExitStackMiddleware
-from fastapi.openapi.docs import (
- get_redoc_html,
- get_swagger_ui_html,
- get_swagger_ui_oauth2_redirect_html,
-)
-from fastapi.openapi.utils import get_openapi
-from fastapi.params import Depends
-from fastapi.types import DecoratedCallable
-from fastapi.utils import generate_unique_id
-from starlette.applications import Starlette
-from starlette.datastructures import State
-from starlette.exceptions import ExceptionMiddleware, HTTPException
-from starlette.middleware import Middleware
-from starlette.middleware.errors import ServerErrorMiddleware
-from starlette.requests import Request
-from starlette.responses import HTMLResponse, JSONResponse, Response
-from starlette.routing import BaseRoute
-from starlette.types import ASGIApp, Receive, Scope, Send
-
-
-class FastAPI(Starlette):
- def __init__(
- self,
- *,
- debug: bool = False,
- routes: Optional[List[BaseRoute]] = None,
- title: str = "FastAPI",
- description: str = "",
- version: str = "0.1.0",
- openapi_url: Optional[str] = "/openapi.json",
- openapi_tags: Optional[List[Dict[str, Any]]] = None,
- servers: Optional[List[Dict[str, Union[str, Any]]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- default_response_class: Type[Response] = Default(JSONResponse),
- docs_url: Optional[str] = "/docs",
- redoc_url: Optional[str] = "/redoc",
- swagger_ui_oauth2_redirect_url: Optional[str] = "/docs/oauth2-redirect",
- swagger_ui_init_oauth: Optional[Dict[str, Any]] = None,
- middleware: Optional[Sequence[Middleware]] = None,
- exception_handlers: Optional[
- Dict[
- Union[int, Type[Exception]],
- Callable[[Request, Any], Coroutine[Any, Any, Response]],
- ]
- ] = None,
- on_startup: Optional[Sequence[Callable[[], Any]]] = None,
- on_shutdown: Optional[Sequence[Callable[[], Any]]] = None,
- terms_of_service: Optional[str] = None,
- contact: Optional[Dict[str, Union[str, Any]]] = None,
- license_info: Optional[Dict[str, Union[str, Any]]] = None,
- openapi_prefix: str = "",
- root_path: str = "",
- root_path_in_servers: bool = True,
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- swagger_ui_parameters: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- **extra: Any,
- ) -> None:
- self._debug: bool = debug
- self.title = title
- self.description = description
- self.version = version
- self.terms_of_service = terms_of_service
- self.contact = contact
- self.license_info = license_info
- self.openapi_url = openapi_url
- self.openapi_tags = openapi_tags
- self.root_path_in_servers = root_path_in_servers
- self.docs_url = docs_url
- self.redoc_url = redoc_url
- self.swagger_ui_oauth2_redirect_url = swagger_ui_oauth2_redirect_url
- self.swagger_ui_init_oauth = swagger_ui_init_oauth
- self.swagger_ui_parameters = swagger_ui_parameters
- self.servers = servers or []
- self.extra = extra
- self.openapi_version = "3.0.2"
- self.openapi_schema: Optional[Dict[str, Any]] = None
- if self.openapi_url:
- assert self.title, "A title must be provided for OpenAPI, e.g.: 'My API'"
- assert self.version, "A version must be provided for OpenAPI, e.g.: '2.1.0'"
- # TODO: remove when discarding the openapi_prefix parameter
- if openapi_prefix:
- logger.warning(
- '"openapi_prefix" has been deprecated in favor of "root_path", which '
- "follows more closely the ASGI standard, is simpler, and more "
- "automatic. Check the docs at "
- "https://fastapi.tiangolo.com/advanced/sub-applications/"
- )
- self.root_path = root_path or openapi_prefix
- self.state: State = State()
- self.dependency_overrides: Dict[Callable[..., Any], Callable[..., Any]] = {}
- self.router: routing.APIRouter = routing.APIRouter(
- routes=routes,
- dependency_overrides_provider=self,
- on_startup=on_startup,
- on_shutdown=on_shutdown,
- default_response_class=default_response_class,
- dependencies=dependencies,
- callbacks=callbacks,
- deprecated=deprecated,
- include_in_schema=include_in_schema,
- responses=responses,
- generate_unique_id_function=generate_unique_id_function,
- )
- self.exception_handlers: Dict[
- Any, Callable[[Request, Any], Union[Response, Awaitable[Response]]]
- ] = ({} if exception_handlers is None else dict(exception_handlers))
- self.exception_handlers.setdefault(HTTPException, http_exception_handler)
- self.exception_handlers.setdefault(
- RequestValidationError, request_validation_exception_handler
- )
-
- self.user_middleware: List[Middleware] = (
- [] if middleware is None else list(middleware)
- )
- self.middleware_stack: ASGIApp = self.build_middleware_stack()
- self.setup()
-
- def build_middleware_stack(self) -> ASGIApp:
- # Duplicate/override from Starlette to add AsyncExitStackMiddleware
- # inside of ExceptionMiddleware, inside of custom user middlewares
- debug = self.debug
- error_handler = None
- exception_handlers = {}
-
- for key, value in self.exception_handlers.items():
- if key in (500, Exception):
- error_handler = value
- else:
- exception_handlers[key] = value
-
- middleware = (
- [Middleware(ServerErrorMiddleware, handler=error_handler, debug=debug)]
- + self.user_middleware
- + [
- Middleware(
- ExceptionMiddleware, handlers=exception_handlers, debug=debug
- ),
- # Add FastAPI-specific AsyncExitStackMiddleware for dependencies with
- # contextvars.
- # This needs to happen after user middlewares because those create a
- # new contextvars context copy by using a new AnyIO task group.
- # The initial part of dependencies with yield is executed in the
- # FastAPI code, inside all the middlewares, but the teardown part
- # (after yield) is executed in the AsyncExitStack in this middleware,
- # if the AsyncExitStack lived outside of the custom middlewares and
- # contextvars were set in a dependency with yield in that internal
- # contextvars context, the values would not be available in the
- # outside context of the AsyncExitStack.
- # By putting the middleware and the AsyncExitStack here, inside all
- # user middlewares, the code before and after yield in dependencies
- # with yield is executed in the same contextvars context, so all values
- # set in contextvars before yield is still available after yield as
- # would be expected.
- # Additionally, by having this AsyncExitStack here, after the
- # ExceptionMiddleware, now dependencies can catch handled exceptions,
- # e.g. HTTPException, to customize the teardown code (e.g. DB session
- # rollback).
- Middleware(AsyncExitStackMiddleware),
- ]
- )
-
- app = self.router
- for cls, options in reversed(middleware):
- app = cls(app=app, **options)
- return app
-
- def openapi(self) -> Dict[str, Any]:
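-        # Build the OpenAPI schema once and cache it on the instance;
-        # subsequent calls return the cached dict.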
- if not self.openapi_schema:
- self.openapi_schema = get_openapi(
- title=self.title,
- version=self.version,
- openapi_version=self.openapi_version,
- description=self.description,
- terms_of_service=self.terms_of_service,
- contact=self.contact,
- license_info=self.license_info,
- routes=self.routes,
- tags=self.openapi_tags,
- servers=self.servers,
- )
- return self.openapi_schema
-
- def setup(self) -> None:
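-        # Register the documentation routes (OpenAPI JSON, Swagger UI,
-        # ReDoc, and the OAuth2 redirect page) for any configured URLs.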
- if self.openapi_url:
- urls = (server_data.get("url") for server_data in self.servers)
- server_urls = {url for url in urls if url}
-
- async def openapi(req: Request) -> JSONResponse:
- root_path = req.scope.get("root_path", "").rstrip("/")
- if root_path not in server_urls:
- if root_path and self.root_path_in_servers:
- self.servers.insert(0, {"url": root_path})
- server_urls.add(root_path)
- return JSONResponse(self.openapi())
-
- self.add_route(self.openapi_url, openapi, include_in_schema=False)
- if self.openapi_url and self.docs_url:
-
- async def swagger_ui_html(req: Request) -> HTMLResponse:
- root_path = req.scope.get("root_path", "").rstrip("/")
- openapi_url = root_path + self.openapi_url
- oauth2_redirect_url = self.swagger_ui_oauth2_redirect_url
- if oauth2_redirect_url:
- oauth2_redirect_url = root_path + oauth2_redirect_url
- return get_swagger_ui_html(
- openapi_url=openapi_url,
- title=self.title + " - Swagger UI",
- oauth2_redirect_url=oauth2_redirect_url,
- init_oauth=self.swagger_ui_init_oauth,
- swagger_ui_parameters=self.swagger_ui_parameters,
- )
-
- self.add_route(self.docs_url, swagger_ui_html, include_in_schema=False)
-
- if self.swagger_ui_oauth2_redirect_url:
-
- async def swagger_ui_redirect(req: Request) -> HTMLResponse:
- return get_swagger_ui_oauth2_redirect_html()
-
- self.add_route(
- self.swagger_ui_oauth2_redirect_url,
- swagger_ui_redirect,
- include_in_schema=False,
- )
- if self.openapi_url and self.redoc_url:
-
- async def redoc_html(req: Request) -> HTMLResponse:
- root_path = req.scope.get("root_path", "").rstrip("/")
- openapi_url = root_path + self.openapi_url
- return get_redoc_html(
- openapi_url=openapi_url, title=self.title + " - ReDoc"
- )
-
- self.add_route(self.redoc_url, redoc_html, include_in_schema=False)
-
- async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
- if self.root_path:
- scope["root_path"] = self.root_path
- await super().__call__(scope, receive, send)
-
- def add_api_route(
- self,
- path: str,
- endpoint: Callable[..., Coroutine[Any, Any, Response]],
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- methods: Optional[List[str]] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Union[Type[Response], DefaultPlaceholder] = Default(
- JSONResponse
- ),
- name: Optional[str] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> None:
- self.router.add_api_route(
- path,
- endpoint=endpoint,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=methods,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def api_route(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- methods: Optional[List[str]] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- def decorator(func: DecoratedCallable) -> DecoratedCallable:
- self.router.add_api_route(
- path,
- func,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=methods,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
- return func
-
- return decorator
-
- def add_api_websocket_route(
- self, path: str, endpoint: Callable[..., Any], name: Optional[str] = None
- ) -> None:
- self.router.add_api_websocket_route(path, endpoint, name=name)
-
- def websocket(
- self, path: str, name: Optional[str] = None
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- def decorator(func: DecoratedCallable) -> DecoratedCallable:
- self.add_api_websocket_route(path, func, name=name)
- return func
-
- return decorator
-
- def include_router(
- self,
- router: routing.APIRouter,
- *,
- prefix: str = "",
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- default_response_class: Type[Response] = Default(JSONResponse),
- callbacks: Optional[List[BaseRoute]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> None:
- self.router.include_router(
- router,
- prefix=prefix,
- tags=tags,
- dependencies=dependencies,
- responses=responses,
- deprecated=deprecated,
- include_in_schema=include_in_schema,
- default_response_class=default_response_class,
- callbacks=callbacks,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def get(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.get(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def put(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.put(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def post(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.post(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def delete(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.delete(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def options(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.options(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def head(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.head(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def patch(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.patch(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def trace(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[routing.APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.router.trace(
- path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
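
The `get`/`put`/`post`/`delete`/`options`/`head`/`patch`/`trace` methods deleted above are thin shims: every keyword argument is forwarded verbatim to the corresponding `APIRouter` method. A minimal usage sketch (the `Item` model and route are illustrative, not from this diff):

```python
from typing import Optional

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):  # illustrative model
    name: str
    price: Optional[float] = None


# Identical in effect to calling the same decorator on app.router,
# since the app method simply delegates every keyword argument.
@app.get("/items/{item_id}", response_model=Item, response_model_exclude_none=True)
async def read_item(item_id: int) -> Item:
    return Item(name=f"item-{item_id}")  # "price" is dropped from the JSON
```
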
diff --git a/env/lib/python3.9/site-packages/fastapi/background.py b/env/lib/python3.9/site-packages/fastapi/background.py
deleted file mode 100644
index dd3bbe2..0000000
--- a/env/lib/python3.9/site-packages/fastapi/background.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.background import BackgroundTasks as BackgroundTasks # noqa
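
`background.py` is a one-line re-export of Starlette's `BackgroundTasks`, so the standard Starlette pattern applies unchanged; a short sketch (the `write_log` task is illustrative):

```python
from fastapi import BackgroundTasks, FastAPI

app = FastAPI()


def write_log(message: str) -> None:
    # illustrative side effect; runs after the response has been sent
    with open("log.txt", "a") as f:
        f.write(message + "\n")


@app.post("/send")
async def send(background_tasks: BackgroundTasks) -> dict:
    background_tasks.add_task(write_log, "notification sent")
    return {"status": "queued"}
```
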
diff --git a/env/lib/python3.9/site-packages/fastapi/concurrency.py b/env/lib/python3.9/site-packages/fastapi/concurrency.py
deleted file mode 100644
index c728ec1..0000000
--- a/env/lib/python3.9/site-packages/fastapi/concurrency.py
+++ /dev/null
@@ -1,47 +0,0 @@
-import sys
-from typing import AsyncGenerator, ContextManager, TypeVar
-
-import anyio
-from anyio import CapacityLimiter
-from starlette.concurrency import iterate_in_threadpool as iterate_in_threadpool # noqa
-from starlette.concurrency import run_in_threadpool as run_in_threadpool # noqa
-from starlette.concurrency import ( # noqa
- run_until_first_complete as run_until_first_complete,
-)
-
-if sys.version_info >= (3, 7):
- from contextlib import AsyncExitStack as AsyncExitStack
- from contextlib import asynccontextmanager as asynccontextmanager
-else:
- from contextlib2 import AsyncExitStack as AsyncExitStack # noqa
- from contextlib2 import asynccontextmanager as asynccontextmanager # noqa
-
-
-_T = TypeVar("_T")
-
-
-@asynccontextmanager
-async def contextmanager_in_threadpool(
- cm: ContextManager[_T],
-) -> AsyncGenerator[_T, None]:
-    # Blocking __exit__ from running while it waits on a free thread
-    # can create race conditions/deadlocks if the context manager itself
-    # has its own internal pool (e.g. a database connection pool).
-    # To avoid this we let __exit__ run without a capacity limit.
-    # Since we're creating a new limiter for each call, any non-zero
-    # limit works (1 is arbitrary).
- exit_limiter = CapacityLimiter(1)
- try:
- yield await run_in_threadpool(cm.__enter__)
- except Exception as e:
- ok = bool(
- await anyio.to_thread.run_sync(
- cm.__exit__, type(e), e, None, limiter=exit_limiter
- )
- )
- if not ok:
- raise e
- else:
- await anyio.to_thread.run_sync(
- cm.__exit__, None, None, None, limiter=exit_limiter
- )
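
`contextmanager_in_threadpool` above exists so a synchronous context manager's blocking `__enter__`/`__exit__` never stall the event loop. A sketch of driving it directly, assuming `sqlite3` as a stand-in for any blocking resource:

```python
import sqlite3

import anyio

from fastapi.concurrency import contextmanager_in_threadpool


async def main() -> None:
    # check_same_thread=False so the threadpool may drive __exit__;
    # __enter__ runs in a worker thread, and __exit__ runs with its own
    # single-slot limiter, exactly as in the implementation above.
    cm = sqlite3.connect(":memory:", check_same_thread=False)
    async with contextmanager_in_threadpool(cm) as conn:
        conn.execute("CREATE TABLE t (x INTEGER)")


anyio.run(main)
```
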
diff --git a/env/lib/python3.9/site-packages/fastapi/datastructures.py b/env/lib/python3.9/site-packages/fastapi/datastructures.py
deleted file mode 100644
index b20a25a..0000000
--- a/env/lib/python3.9/site-packages/fastapi/datastructures.py
+++ /dev/null
@@ -1,56 +0,0 @@
-from typing import Any, Callable, Dict, Iterable, Type, TypeVar
-
-from starlette.datastructures import URL as URL # noqa: F401
-from starlette.datastructures import Address as Address # noqa: F401
-from starlette.datastructures import FormData as FormData # noqa: F401
-from starlette.datastructures import Headers as Headers # noqa: F401
-from starlette.datastructures import QueryParams as QueryParams # noqa: F401
-from starlette.datastructures import State as State # noqa: F401
-from starlette.datastructures import UploadFile as StarletteUploadFile
-
-
-class UploadFile(StarletteUploadFile):
- @classmethod
- def __get_validators__(cls: Type["UploadFile"]) -> Iterable[Callable[..., Any]]:
- yield cls.validate
-
- @classmethod
- def validate(cls: Type["UploadFile"], v: Any) -> Any:
- if not isinstance(v, StarletteUploadFile):
- raise ValueError(f"Expected UploadFile, received: {type(v)}")
- return v
-
- @classmethod
- def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
- field_schema.update({"type": "string", "format": "binary"})
-
-
-class DefaultPlaceholder:
- """
- You shouldn't use this class directly.
-
- It's used internally to recognize when a default value has been overwritten, even
- if the overridden default value was truthy.
- """
-
- def __init__(self, value: Any):
- self.value = value
-
- def __bool__(self) -> bool:
- return bool(self.value)
-
- def __eq__(self, o: object) -> bool:
- return isinstance(o, DefaultPlaceholder) and o.value == self.value
-
-
-DefaultType = TypeVar("DefaultType")
-
-
-def Default(value: DefaultType) -> DefaultType:
- """
- You shouldn't use this function directly.
-
- It's used internally to recognize when a default value has been overwritten, even
- if the overridden default value was truthy.
- """
- return DefaultPlaceholder(value) # type: ignore
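
`Default`/`DefaultPlaceholder` above let FastAPI tell "the caller passed a value" apart from "this is still the library default", even when the default itself is truthy. A minimal sketch, using `dict` as a stand-in for `JSONResponse`:

```python
from fastapi.datastructures import Default, DefaultPlaceholder

response_class = Default(dict)  # stand-in for Default(JSONResponse)

if isinstance(response_class, DefaultPlaceholder):
    # Nothing overrode the default: unwrap the real value.
    actual_class = response_class.value
else:
    actual_class = response_class

assert actual_class is dict
```
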
diff --git a/env/lib/python3.9/site-packages/fastapi/dependencies/__init__.py b/env/lib/python3.9/site-packages/fastapi/dependencies/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/fastapi/dependencies/models.py b/env/lib/python3.9/site-packages/fastapi/dependencies/models.py
deleted file mode 100644
index 443590b..0000000
--- a/env/lib/python3.9/site-packages/fastapi/dependencies/models.py
+++ /dev/null
@@ -1,58 +0,0 @@
-from typing import Any, Callable, List, Optional, Sequence
-
-from fastapi.security.base import SecurityBase
-from pydantic.fields import ModelField
-
-
-class SecurityRequirement:
- def __init__(
- self, security_scheme: SecurityBase, scopes: Optional[Sequence[str]] = None
- ):
- self.security_scheme = security_scheme
- self.scopes = scopes
-
-
-class Dependant:
- def __init__(
- self,
- *,
- path_params: Optional[List[ModelField]] = None,
- query_params: Optional[List[ModelField]] = None,
- header_params: Optional[List[ModelField]] = None,
- cookie_params: Optional[List[ModelField]] = None,
- body_params: Optional[List[ModelField]] = None,
- dependencies: Optional[List["Dependant"]] = None,
- security_schemes: Optional[List[SecurityRequirement]] = None,
- name: Optional[str] = None,
- call: Optional[Callable[..., Any]] = None,
- request_param_name: Optional[str] = None,
- websocket_param_name: Optional[str] = None,
- http_connection_param_name: Optional[str] = None,
- response_param_name: Optional[str] = None,
- background_tasks_param_name: Optional[str] = None,
- security_scopes_param_name: Optional[str] = None,
- security_scopes: Optional[List[str]] = None,
- use_cache: bool = True,
- path: Optional[str] = None,
- ) -> None:
- self.path_params = path_params or []
- self.query_params = query_params or []
- self.header_params = header_params or []
- self.cookie_params = cookie_params or []
- self.body_params = body_params or []
- self.dependencies = dependencies or []
- self.security_requirements = security_schemes or []
- self.request_param_name = request_param_name
- self.websocket_param_name = websocket_param_name
- self.http_connection_param_name = http_connection_param_name
- self.response_param_name = response_param_name
- self.background_tasks_param_name = background_tasks_param_name
- self.security_scopes = security_scopes
- self.security_scopes_param_name = security_scopes_param_name
- self.name = name
- self.call = call
- self.use_cache = use_cache
-        # Store the path to be able to re-generate a dependant from it in overrides
- self.path = path
- # Save the cache key at creation to optimize performance
- self.cache_key = (self.call, tuple(sorted(set(self.security_scopes or []))))
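
The `cache_key` computed in the last line above is `(call, sorted-unique-scopes)`, which is what lets `solve_dependencies` reuse a dependency's solved value within one request. A small sketch (the `get_db` callable is illustrative):

```python
from fastapi.dependencies.models import Dependant


async def get_db():  # illustrative dependency callable
    return "db"


d1 = Dependant(call=get_db, security_scopes=["read"])
d2 = Dependant(call=get_db, security_scopes=["read", "read"])  # duplicates collapse

# Same callable + same scope set => same cache key => one evaluation per request.
assert d1.cache_key == d2.cache_key == (get_db, ("read",))
```
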
diff --git a/env/lib/python3.9/site-packages/fastapi/dependencies/utils.py b/env/lib/python3.9/site-packages/fastapi/dependencies/utils.py
deleted file mode 100644
index d098b65..0000000
--- a/env/lib/python3.9/site-packages/fastapi/dependencies/utils.py
+++ /dev/null
@@ -1,756 +0,0 @@
-import dataclasses
-import inspect
-from contextlib import contextmanager
-from copy import deepcopy
-from typing import (
- Any,
- Callable,
- Coroutine,
- Dict,
- List,
- Mapping,
- Optional,
- Sequence,
- Tuple,
- Type,
- Union,
- cast,
-)
-
-import anyio
-from fastapi import params
-from fastapi.concurrency import (
- AsyncExitStack,
- asynccontextmanager,
- contextmanager_in_threadpool,
-)
-from fastapi.dependencies.models import Dependant, SecurityRequirement
-from fastapi.logger import logger
-from fastapi.security.base import SecurityBase
-from fastapi.security.oauth2 import OAuth2, SecurityScopes
-from fastapi.security.open_id_connect_url import OpenIdConnect
-from fastapi.utils import create_response_field, get_path_param_names
-from pydantic import BaseModel, create_model
-from pydantic.error_wrappers import ErrorWrapper
-from pydantic.errors import MissingError
-from pydantic.fields import (
- SHAPE_FROZENSET,
- SHAPE_LIST,
- SHAPE_SEQUENCE,
- SHAPE_SET,
- SHAPE_SINGLETON,
- SHAPE_TUPLE,
- SHAPE_TUPLE_ELLIPSIS,
- FieldInfo,
- ModelField,
- Required,
- Undefined,
-)
-from pydantic.schema import get_annotation_from_field_info
-from pydantic.typing import ForwardRef, evaluate_forwardref
-from pydantic.utils import lenient_issubclass
-from starlette.background import BackgroundTasks
-from starlette.concurrency import run_in_threadpool
-from starlette.datastructures import FormData, Headers, QueryParams, UploadFile
-from starlette.requests import HTTPConnection, Request
-from starlette.responses import Response
-from starlette.websockets import WebSocket
-
-sequence_shapes = {
- SHAPE_LIST,
- SHAPE_SET,
- SHAPE_FROZENSET,
- SHAPE_TUPLE,
- SHAPE_SEQUENCE,
- SHAPE_TUPLE_ELLIPSIS,
-}
-sequence_types = (list, set, tuple)
-sequence_shape_to_type = {
- SHAPE_LIST: list,
- SHAPE_SET: set,
- SHAPE_TUPLE: tuple,
- SHAPE_SEQUENCE: list,
- SHAPE_TUPLE_ELLIPSIS: list,
-}
-
-
-multipart_not_installed_error = (
- 'Form data requires "python-multipart" to be installed. \n'
- 'You can install "python-multipart" with: \n\n'
- "pip install python-multipart\n"
-)
-multipart_incorrect_install_error = (
- 'Form data requires "python-multipart" to be installed. '
- 'It seems you installed "multipart" instead. \n'
- 'You can remove "multipart" with: \n\n'
- "pip uninstall multipart\n\n"
- 'And then install "python-multipart" with: \n\n'
- "pip install python-multipart\n"
-)
-
-
-def check_file_field(field: ModelField) -> None:
- field_info = field.field_info
- if isinstance(field_info, params.Form):
- try:
- # __version__ is available in both multiparts, and can be mocked
- from multipart import __version__ # type: ignore
-
- assert __version__
- try:
- # parse_options_header is only available in the right multipart
- from multipart.multipart import parse_options_header # type: ignore
-
- assert parse_options_header
- except ImportError:
- logger.error(multipart_incorrect_install_error)
- raise RuntimeError(multipart_incorrect_install_error)
- except ImportError:
- logger.error(multipart_not_installed_error)
- raise RuntimeError(multipart_not_installed_error)
-
-
-def get_param_sub_dependant(
- *, param: inspect.Parameter, path: str, security_scopes: Optional[List[str]] = None
-) -> Dependant:
- depends: params.Depends = param.default
- if depends.dependency:
- dependency = depends.dependency
- else:
- dependency = param.annotation
- return get_sub_dependant(
- depends=depends,
- dependency=dependency,
- path=path,
- name=param.name,
- security_scopes=security_scopes,
- )
-
-
-def get_parameterless_sub_dependant(*, depends: params.Depends, path: str) -> Dependant:
- assert callable(
- depends.dependency
- ), "A parameter-less dependency must have a callable dependency"
- return get_sub_dependant(depends=depends, dependency=depends.dependency, path=path)
-
-
-def get_sub_dependant(
- *,
- depends: params.Depends,
- dependency: Callable[..., Any],
- path: str,
- name: Optional[str] = None,
- security_scopes: Optional[List[str]] = None,
-) -> Dependant:
- security_requirement = None
- security_scopes = security_scopes or []
- if isinstance(depends, params.Security):
- dependency_scopes = depends.scopes
- security_scopes.extend(dependency_scopes)
- if isinstance(dependency, SecurityBase):
- use_scopes: List[str] = []
- if isinstance(dependency, (OAuth2, OpenIdConnect)):
- use_scopes = security_scopes
- security_requirement = SecurityRequirement(
- security_scheme=dependency, scopes=use_scopes
- )
- sub_dependant = get_dependant(
- path=path,
- call=dependency,
- name=name,
- security_scopes=security_scopes,
- use_cache=depends.use_cache,
- )
- if security_requirement:
- sub_dependant.security_requirements.append(security_requirement)
- return sub_dependant
-
-
-CacheKey = Tuple[Optional[Callable[..., Any]], Tuple[str, ...]]
-
-
-def get_flat_dependant(
- dependant: Dependant,
- *,
- skip_repeats: bool = False,
- visited: Optional[List[CacheKey]] = None,
-) -> Dependant:
- if visited is None:
- visited = []
- visited.append(dependant.cache_key)
-
- flat_dependant = Dependant(
- path_params=dependant.path_params.copy(),
- query_params=dependant.query_params.copy(),
- header_params=dependant.header_params.copy(),
- cookie_params=dependant.cookie_params.copy(),
- body_params=dependant.body_params.copy(),
- security_schemes=dependant.security_requirements.copy(),
- use_cache=dependant.use_cache,
- path=dependant.path,
- )
- for sub_dependant in dependant.dependencies:
- if skip_repeats and sub_dependant.cache_key in visited:
- continue
- flat_sub = get_flat_dependant(
- sub_dependant, skip_repeats=skip_repeats, visited=visited
- )
- flat_dependant.path_params.extend(flat_sub.path_params)
- flat_dependant.query_params.extend(flat_sub.query_params)
- flat_dependant.header_params.extend(flat_sub.header_params)
- flat_dependant.cookie_params.extend(flat_sub.cookie_params)
- flat_dependant.body_params.extend(flat_sub.body_params)
- flat_dependant.security_requirements.extend(flat_sub.security_requirements)
- return flat_dependant
-
-
-def get_flat_params(dependant: Dependant) -> List[ModelField]:
- flat_dependant = get_flat_dependant(dependant, skip_repeats=True)
- return (
- flat_dependant.path_params
- + flat_dependant.query_params
- + flat_dependant.header_params
- + flat_dependant.cookie_params
- )
-
-
-def is_scalar_field(field: ModelField) -> bool:
- field_info = field.field_info
- if not (
- field.shape == SHAPE_SINGLETON
- and not lenient_issubclass(field.type_, BaseModel)
- and not lenient_issubclass(field.type_, sequence_types + (dict,))
- and not dataclasses.is_dataclass(field.type_)
- and not isinstance(field_info, params.Body)
- ):
- return False
- if field.sub_fields:
- if not all(is_scalar_field(f) for f in field.sub_fields):
- return False
- return True
-
-
-def is_scalar_sequence_field(field: ModelField) -> bool:
- if (field.shape in sequence_shapes) and not lenient_issubclass(
- field.type_, BaseModel
- ):
- if field.sub_fields is not None:
- for sub_field in field.sub_fields:
- if not is_scalar_field(sub_field):
- return False
- return True
- if lenient_issubclass(field.type_, sequence_types):
- return True
- return False
-
-
-def get_typed_signature(call: Callable[..., Any]) -> inspect.Signature:
- signature = inspect.signature(call)
- globalns = getattr(call, "__globals__", {})
- typed_params = [
- inspect.Parameter(
- name=param.name,
- kind=param.kind,
- default=param.default,
- annotation=get_typed_annotation(param, globalns),
- )
- for param in signature.parameters.values()
- ]
- typed_signature = inspect.Signature(typed_params)
- return typed_signature
-
-
-def get_typed_annotation(param: inspect.Parameter, globalns: Dict[str, Any]) -> Any:
- annotation = param.annotation
- if isinstance(annotation, str):
- annotation = ForwardRef(annotation)
- annotation = evaluate_forwardref(annotation, globalns, globalns)
- return annotation
-
-
-def get_dependant(
- *,
- path: str,
- call: Callable[..., Any],
- name: Optional[str] = None,
- security_scopes: Optional[List[str]] = None,
- use_cache: bool = True,
-) -> Dependant:
- path_param_names = get_path_param_names(path)
- endpoint_signature = get_typed_signature(call)
- signature_params = endpoint_signature.parameters
- dependant = Dependant(
- call=call,
- name=name,
- path=path,
- security_scopes=security_scopes,
- use_cache=use_cache,
- )
- for param_name, param in signature_params.items():
- if isinstance(param.default, params.Depends):
- sub_dependant = get_param_sub_dependant(
- param=param, path=path, security_scopes=security_scopes
- )
- dependant.dependencies.append(sub_dependant)
- continue
- if add_non_field_param_to_dependency(param=param, dependant=dependant):
- continue
- param_field = get_param_field(
- param=param, default_field_info=params.Query, param_name=param_name
- )
- if param_name in path_param_names:
- assert is_scalar_field(
- field=param_field
- ), "Path params must be of one of the supported types"
- ignore_default = not isinstance(param.default, params.Path)
- param_field = get_param_field(
- param=param,
- param_name=param_name,
- default_field_info=params.Path,
- force_type=params.ParamTypes.path,
- ignore_default=ignore_default,
- )
- add_param_to_fields(field=param_field, dependant=dependant)
- elif is_scalar_field(field=param_field):
- add_param_to_fields(field=param_field, dependant=dependant)
- elif isinstance(
- param.default, (params.Query, params.Header)
- ) and is_scalar_sequence_field(param_field):
- add_param_to_fields(field=param_field, dependant=dependant)
- else:
- field_info = param_field.field_info
- assert isinstance(
- field_info, params.Body
- ), f"Param: {param_field.name} can only be a request body, using Body()"
- dependant.body_params.append(param_field)
- return dependant
-
-
-def add_non_field_param_to_dependency(
- *, param: inspect.Parameter, dependant: Dependant
-) -> Optional[bool]:
- if lenient_issubclass(param.annotation, Request):
- dependant.request_param_name = param.name
- return True
- elif lenient_issubclass(param.annotation, WebSocket):
- dependant.websocket_param_name = param.name
- return True
- elif lenient_issubclass(param.annotation, HTTPConnection):
- dependant.http_connection_param_name = param.name
- return True
- elif lenient_issubclass(param.annotation, Response):
- dependant.response_param_name = param.name
- return True
- elif lenient_issubclass(param.annotation, BackgroundTasks):
- dependant.background_tasks_param_name = param.name
- return True
- elif lenient_issubclass(param.annotation, SecurityScopes):
- dependant.security_scopes_param_name = param.name
- return True
- return None
-
-
-def get_param_field(
- *,
- param: inspect.Parameter,
- param_name: str,
- default_field_info: Type[params.Param] = params.Param,
- force_type: Optional[params.ParamTypes] = None,
- ignore_default: bool = False,
-) -> ModelField:
- default_value: Any = Undefined
- had_schema = False
- if not param.default == param.empty and ignore_default is False:
- default_value = param.default
- if isinstance(default_value, FieldInfo):
- had_schema = True
- field_info = default_value
- default_value = field_info.default
- if (
- isinstance(field_info, params.Param)
- and getattr(field_info, "in_", None) is None
- ):
- field_info.in_ = default_field_info.in_
- if force_type:
- field_info.in_ = force_type # type: ignore
- else:
- field_info = default_field_info(default=default_value)
- required = True
- if default_value is Required or ignore_default:
- required = True
- default_value = None
- elif default_value is not Undefined:
- required = False
- annotation: Any = Any
- if not param.annotation == param.empty:
- annotation = param.annotation
- annotation = get_annotation_from_field_info(annotation, field_info, param_name)
- if not field_info.alias and getattr(field_info, "convert_underscores", None):
- alias = param.name.replace("_", "-")
- else:
- alias = field_info.alias or param.name
- field = create_response_field(
- name=param.name,
- type_=annotation,
- default=default_value,
- alias=alias,
- required=required,
- field_info=field_info,
- )
- if not had_schema and not is_scalar_field(field=field):
- field.field_info = params.Body(field_info.default)
- if not had_schema and lenient_issubclass(field.type_, UploadFile):
- field.field_info = params.File(field_info.default)
-
- return field
-
-
-def add_param_to_fields(*, field: ModelField, dependant: Dependant) -> None:
- field_info = cast(params.Param, field.field_info)
- if field_info.in_ == params.ParamTypes.path:
- dependant.path_params.append(field)
- elif field_info.in_ == params.ParamTypes.query:
- dependant.query_params.append(field)
- elif field_info.in_ == params.ParamTypes.header:
- dependant.header_params.append(field)
- else:
- assert (
- field_info.in_ == params.ParamTypes.cookie
- ), f"non-body parameters must be in path, query, header or cookie: {field.name}"
- dependant.cookie_params.append(field)
-
-
-def is_coroutine_callable(call: Callable[..., Any]) -> bool:
- if inspect.isroutine(call):
- return inspect.iscoroutinefunction(call)
- if inspect.isclass(call):
- return False
- call = getattr(call, "__call__", None)
- return inspect.iscoroutinefunction(call)
-
-
-def is_async_gen_callable(call: Callable[..., Any]) -> bool:
- if inspect.isasyncgenfunction(call):
- return True
- call = getattr(call, "__call__", None)
- return inspect.isasyncgenfunction(call)
-
-
-def is_gen_callable(call: Callable[..., Any]) -> bool:
- if inspect.isgeneratorfunction(call):
- return True
- call = getattr(call, "__call__", None)
- return inspect.isgeneratorfunction(call)
-
-
-async def solve_generator(
- *, call: Callable[..., Any], stack: AsyncExitStack, sub_values: Dict[str, Any]
-) -> Any:
- if is_gen_callable(call):
- cm = contextmanager_in_threadpool(contextmanager(call)(**sub_values))
- elif is_async_gen_callable(call):
- cm = asynccontextmanager(call)(**sub_values)
- return await stack.enter_async_context(cm)
-
-
-async def solve_dependencies(
- *,
- request: Union[Request, WebSocket],
- dependant: Dependant,
- body: Optional[Union[Dict[str, Any], FormData]] = None,
- background_tasks: Optional[BackgroundTasks] = None,
- response: Optional[Response] = None,
- dependency_overrides_provider: Optional[Any] = None,
- dependency_cache: Optional[Dict[Tuple[Callable[..., Any], Tuple[str]], Any]] = None,
-) -> Tuple[
- Dict[str, Any],
- List[ErrorWrapper],
- Optional[BackgroundTasks],
- Response,
- Dict[Tuple[Callable[..., Any], Tuple[str]], Any],
-]:
- values: Dict[str, Any] = {}
- errors: List[ErrorWrapper] = []
- if response is None:
- response = Response()
- del response.headers["content-length"]
- response.status_code = None # type: ignore
- dependency_cache = dependency_cache or {}
- sub_dependant: Dependant
- for sub_dependant in dependant.dependencies:
- sub_dependant.call = cast(Callable[..., Any], sub_dependant.call)
- sub_dependant.cache_key = cast(
- Tuple[Callable[..., Any], Tuple[str]], sub_dependant.cache_key
- )
- call = sub_dependant.call
- use_sub_dependant = sub_dependant
- if (
- dependency_overrides_provider
- and dependency_overrides_provider.dependency_overrides
- ):
- original_call = sub_dependant.call
- call = getattr(
- dependency_overrides_provider, "dependency_overrides", {}
- ).get(original_call, original_call)
- use_path: str = sub_dependant.path # type: ignore
- use_sub_dependant = get_dependant(
- path=use_path,
- call=call,
- name=sub_dependant.name,
- security_scopes=sub_dependant.security_scopes,
- )
-
- solved_result = await solve_dependencies(
- request=request,
- dependant=use_sub_dependant,
- body=body,
- background_tasks=background_tasks,
- response=response,
- dependency_overrides_provider=dependency_overrides_provider,
- dependency_cache=dependency_cache,
- )
- (
- sub_values,
- sub_errors,
- background_tasks,
- _, # the subdependency returns the same response we have
- sub_dependency_cache,
- ) = solved_result
- dependency_cache.update(sub_dependency_cache)
- if sub_errors:
- errors.extend(sub_errors)
- continue
- if sub_dependant.use_cache and sub_dependant.cache_key in dependency_cache:
- solved = dependency_cache[sub_dependant.cache_key]
- elif is_gen_callable(call) or is_async_gen_callable(call):
- stack = request.scope.get("fastapi_astack")
- assert isinstance(stack, AsyncExitStack)
- solved = await solve_generator(
- call=call, stack=stack, sub_values=sub_values
- )
- elif is_coroutine_callable(call):
- solved = await call(**sub_values)
- else:
- solved = await run_in_threadpool(call, **sub_values)
- if sub_dependant.name is not None:
- values[sub_dependant.name] = solved
- if sub_dependant.cache_key not in dependency_cache:
- dependency_cache[sub_dependant.cache_key] = solved
- path_values, path_errors = request_params_to_args(
- dependant.path_params, request.path_params
- )
- query_values, query_errors = request_params_to_args(
- dependant.query_params, request.query_params
- )
- header_values, header_errors = request_params_to_args(
- dependant.header_params, request.headers
- )
- cookie_values, cookie_errors = request_params_to_args(
- dependant.cookie_params, request.cookies
- )
- values.update(path_values)
- values.update(query_values)
- values.update(header_values)
- values.update(cookie_values)
- errors += path_errors + query_errors + header_errors + cookie_errors
- if dependant.body_params:
- (
- body_values,
- body_errors,
- ) = await request_body_to_args( # body_params checked above
- required_params=dependant.body_params, received_body=body
- )
- values.update(body_values)
- errors.extend(body_errors)
- if dependant.http_connection_param_name:
- values[dependant.http_connection_param_name] = request
- if dependant.request_param_name and isinstance(request, Request):
- values[dependant.request_param_name] = request
- elif dependant.websocket_param_name and isinstance(request, WebSocket):
- values[dependant.websocket_param_name] = request
- if dependant.background_tasks_param_name:
- if background_tasks is None:
- background_tasks = BackgroundTasks()
- values[dependant.background_tasks_param_name] = background_tasks
- if dependant.response_param_name:
- values[dependant.response_param_name] = response
- if dependant.security_scopes_param_name:
- values[dependant.security_scopes_param_name] = SecurityScopes(
- scopes=dependant.security_scopes
- )
- return values, errors, background_tasks, response, dependency_cache
-
-
-def request_params_to_args(
- required_params: Sequence[ModelField],
- received_params: Union[Mapping[str, Any], QueryParams, Headers],
-) -> Tuple[Dict[str, Any], List[ErrorWrapper]]:
- values = {}
- errors = []
- for field in required_params:
- if is_scalar_sequence_field(field) and isinstance(
- received_params, (QueryParams, Headers)
- ):
- value = received_params.getlist(field.alias) or field.default
- else:
- value = received_params.get(field.alias)
- field_info = field.field_info
- assert isinstance(
- field_info, params.Param
- ), "Params must be subclasses of Param"
- if value is None:
- if field.required:
- errors.append(
- ErrorWrapper(
- MissingError(), loc=(field_info.in_.value, field.alias)
- )
- )
- else:
- values[field.name] = deepcopy(field.default)
- continue
- v_, errors_ = field.validate(
- value, values, loc=(field_info.in_.value, field.alias)
- )
- if isinstance(errors_, ErrorWrapper):
- errors.append(errors_)
- elif isinstance(errors_, list):
- errors.extend(errors_)
- else:
- values[field.name] = v_
- return values, errors
-
-
-async def request_body_to_args(
- required_params: List[ModelField],
- received_body: Optional[Union[Dict[str, Any], FormData]],
-) -> Tuple[Dict[str, Any], List[ErrorWrapper]]:
- values = {}
- errors = []
- if required_params:
- field = required_params[0]
- field_info = field.field_info
- embed = getattr(field_info, "embed", None)
- field_alias_omitted = len(required_params) == 1 and not embed
- if field_alias_omitted:
- received_body = {field.alias: received_body}
-
- for field in required_params:
- loc: Tuple[str, ...]
- if field_alias_omitted:
- loc = ("body",)
- else:
- loc = ("body", field.alias)
-
- value: Optional[Any] = None
- if received_body is not None:
- if (
- field.shape in sequence_shapes or field.type_ in sequence_types
- ) and isinstance(received_body, FormData):
- value = received_body.getlist(field.alias)
- else:
- try:
- value = received_body.get(field.alias)
- except AttributeError:
- errors.append(get_missing_field_error(loc))
- continue
- if (
- value is None
- or (isinstance(field_info, params.Form) and value == "")
- or (
- isinstance(field_info, params.Form)
- and field.shape in sequence_shapes
- and len(value) == 0
- )
- ):
- if field.required:
- errors.append(get_missing_field_error(loc))
- else:
- values[field.name] = deepcopy(field.default)
- continue
- if (
- isinstance(field_info, params.File)
- and lenient_issubclass(field.type_, bytes)
- and isinstance(value, UploadFile)
- ):
- value = await value.read()
- elif (
- field.shape in sequence_shapes
- and isinstance(field_info, params.File)
- and lenient_issubclass(field.type_, bytes)
- and isinstance(value, sequence_types)
- ):
- results: List[Union[bytes, str]] = []
-
- async def process_fn(
- fn: Callable[[], Coroutine[Any, Any, Any]]
- ) -> None:
- result = await fn()
- results.append(result)
-
- async with anyio.create_task_group() as tg:
- for sub_value in value:
- tg.start_soon(process_fn, sub_value.read)
- value = sequence_shape_to_type[field.shape](results)
-
- v_, errors_ = field.validate(value, values, loc=loc)
-
- if isinstance(errors_, ErrorWrapper):
- errors.append(errors_)
- elif isinstance(errors_, list):
- errors.extend(errors_)
- else:
- values[field.name] = v_
- return values, errors
-
-
-def get_missing_field_error(loc: Tuple[str, ...]) -> ErrorWrapper:
- missing_field_error = ErrorWrapper(MissingError(), loc=loc)
- return missing_field_error
-
-
-def get_body_field(*, dependant: Dependant, name: str) -> Optional[ModelField]:
- flat_dependant = get_flat_dependant(dependant)
- if not flat_dependant.body_params:
- return None
- first_param = flat_dependant.body_params[0]
- field_info = first_param.field_info
- embed = getattr(field_info, "embed", None)
- body_param_names_set = {param.name for param in flat_dependant.body_params}
- if len(body_param_names_set) == 1 and not embed:
- check_file_field(first_param)
- return first_param
-    # If one field requires embedding, all have to be embedded, in case
-    # a sub-dependency is evaluated with a single unique body field that
-    # is combined (embedded) with other body fields
- for param in flat_dependant.body_params:
- setattr(param.field_info, "embed", True)
- model_name = "Body_" + name
- BodyModel: Type[BaseModel] = create_model(model_name)
- for f in flat_dependant.body_params:
- BodyModel.__fields__[f.name] = f
- required = any(True for f in flat_dependant.body_params if f.required)
-
- BodyFieldInfo_kwargs: Dict[str, Any] = dict(default=None)
- if any(isinstance(f.field_info, params.File) for f in flat_dependant.body_params):
- BodyFieldInfo: Type[params.Body] = params.File
- elif any(isinstance(f.field_info, params.Form) for f in flat_dependant.body_params):
- BodyFieldInfo = params.Form
- else:
- BodyFieldInfo = params.Body
-
- body_param_media_types = [
- getattr(f.field_info, "media_type")
- for f in flat_dependant.body_params
- if isinstance(f.field_info, params.Body)
- ]
- if len(set(body_param_media_types)) == 1:
- BodyFieldInfo_kwargs["media_type"] = body_param_media_types[0]
- final_field = create_response_field(
- name="body",
- type_=BodyModel,
- required=required,
- alias="body",
- field_info=BodyFieldInfo(**BodyFieldInfo_kwargs),
- )
- check_file_field(final_field)
- return final_field
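
`get_dependant` above is the entry point for all of this introspection machinery: it reads an endpoint's typed signature and sorts each parameter into path/query/header/cookie/body buckets. A hedged sketch (path and endpoint are illustrative):

```python
from fastapi import Query
from fastapi.dependencies.utils import get_dependant


async def read_items(item_id: int, q: str = Query(default="")) -> dict:
    return {"item_id": item_id, "q": q}


dependant = get_dependant(path="/items/{item_id}", call=read_items)

# item_id matches a path placeholder, so it lands in path_params;
# the scalar q falls through to query_params.
print([f.name for f in dependant.path_params])   # ['item_id']
print([f.name for f in dependant.query_params])  # ['q']
```
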
diff --git a/env/lib/python3.9/site-packages/fastapi/encoders.py b/env/lib/python3.9/site-packages/fastapi/encoders.py
deleted file mode 100644
index f64e4b8..0000000
--- a/env/lib/python3.9/site-packages/fastapi/encoders.py
+++ /dev/null
@@ -1,167 +0,0 @@
-import dataclasses
-from collections import defaultdict
-from enum import Enum
-from pathlib import PurePath
-from types import GeneratorType
-from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Union
-
-from pydantic import BaseModel
-from pydantic.json import ENCODERS_BY_TYPE
-
-SetIntStr = Set[Union[int, str]]
-DictIntStrAny = Dict[Union[int, str], Any]
-
-
-def generate_encoders_by_class_tuples(
- type_encoder_map: Dict[Any, Callable[[Any], Any]]
-) -> Dict[Callable[[Any], Any], Tuple[Any, ...]]:
- encoders_by_class_tuples: Dict[Callable[[Any], Any], Tuple[Any, ...]] = defaultdict(
- tuple
- )
- for type_, encoder in type_encoder_map.items():
- encoders_by_class_tuples[encoder] += (type_,)
- return encoders_by_class_tuples
-
-
-encoders_by_class_tuples = generate_encoders_by_class_tuples(ENCODERS_BY_TYPE)
-
-
-def jsonable_encoder(
- obj: Any,
- include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- by_alias: bool = True,
- exclude_unset: bool = False,
- exclude_defaults: bool = False,
- exclude_none: bool = False,
- custom_encoder: Optional[Dict[Any, Callable[[Any], Any]]] = None,
- sqlalchemy_safe: bool = True,
-) -> Any:
- custom_encoder = custom_encoder or {}
- if custom_encoder:
- if type(obj) in custom_encoder:
- return custom_encoder[type(obj)](obj)
- else:
- for encoder_type, encoder_instance in custom_encoder.items():
- if isinstance(obj, encoder_type):
- return encoder_instance(obj)
- if include is not None and not isinstance(include, (set, dict)):
- include = set(include)
- if exclude is not None and not isinstance(exclude, (set, dict)):
- exclude = set(exclude)
- if isinstance(obj, BaseModel):
- encoder = getattr(obj.__config__, "json_encoders", {})
- if custom_encoder:
- encoder.update(custom_encoder)
- obj_dict = obj.dict(
- include=include, # type: ignore # in Pydantic
- exclude=exclude, # type: ignore # in Pydantic
- by_alias=by_alias,
- exclude_unset=exclude_unset,
- exclude_none=exclude_none,
- exclude_defaults=exclude_defaults,
- )
- if "__root__" in obj_dict:
- obj_dict = obj_dict["__root__"]
- return jsonable_encoder(
- obj_dict,
- exclude_none=exclude_none,
- exclude_defaults=exclude_defaults,
- custom_encoder=encoder,
- sqlalchemy_safe=sqlalchemy_safe,
- )
- if dataclasses.is_dataclass(obj):
- obj_dict = dataclasses.asdict(obj)
- return jsonable_encoder(
- obj_dict,
- exclude_none=exclude_none,
- exclude_defaults=exclude_defaults,
- custom_encoder=custom_encoder,
- sqlalchemy_safe=sqlalchemy_safe,
- )
- if isinstance(obj, Enum):
- return obj.value
- if isinstance(obj, PurePath):
- return str(obj)
- if isinstance(obj, (str, int, float, type(None))):
- return obj
- if isinstance(obj, dict):
- encoded_dict = {}
- allowed_keys = set(obj.keys())
- if include is not None:
- allowed_keys &= set(include)
- if exclude is not None:
- allowed_keys -= set(exclude)
- for key, value in obj.items():
- if (
- (
- not sqlalchemy_safe
- or (not isinstance(key, str))
- or (not key.startswith("_sa"))
- )
- and (value is not None or not exclude_none)
- and key in allowed_keys
- ):
- encoded_key = jsonable_encoder(
- key,
- by_alias=by_alias,
- exclude_unset=exclude_unset,
- exclude_none=exclude_none,
- custom_encoder=custom_encoder,
- sqlalchemy_safe=sqlalchemy_safe,
- )
- encoded_value = jsonable_encoder(
- value,
- by_alias=by_alias,
- exclude_unset=exclude_unset,
- exclude_none=exclude_none,
- custom_encoder=custom_encoder,
- sqlalchemy_safe=sqlalchemy_safe,
- )
- encoded_dict[encoded_key] = encoded_value
- return encoded_dict
- if isinstance(obj, (list, set, frozenset, GeneratorType, tuple)):
- encoded_list = []
- for item in obj:
- encoded_list.append(
- jsonable_encoder(
- item,
- include=include,
- exclude=exclude,
- by_alias=by_alias,
- exclude_unset=exclude_unset,
- exclude_defaults=exclude_defaults,
- exclude_none=exclude_none,
- custom_encoder=custom_encoder,
- sqlalchemy_safe=sqlalchemy_safe,
- )
- )
- return encoded_list
-
- if type(obj) in ENCODERS_BY_TYPE:
- return ENCODERS_BY_TYPE[type(obj)](obj)
- for encoder, classes_tuple in encoders_by_class_tuples.items():
- if isinstance(obj, classes_tuple):
- return encoder(obj)
-
- try:
- data = dict(obj)
- except Exception as e:
- errors: List[Exception] = []
- errors.append(e)
- try:
- data = vars(obj)
- except Exception as e:
- errors.append(e)
- raise ValueError(errors)
- return jsonable_encoder(
- data,
- include=include,
- exclude=exclude,
- by_alias=by_alias,
- exclude_unset=exclude_unset,
- exclude_defaults=exclude_defaults,
- exclude_none=exclude_none,
- custom_encoder=custom_encoder,
- sqlalchemy_safe=sqlalchemy_safe,
- )
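
`jsonable_encoder` above recursively reduces arbitrary objects to JSON-ready primitives. A usage sketch (the `Event` model and datetime are illustrative inputs):

```python
from datetime import datetime

from fastapi.encoders import jsonable_encoder
from pydantic import BaseModel


class Event(BaseModel):  # illustrative model
    name: str
    when: datetime


event = Event(name="launch", when=datetime(2022, 1, 1, 12, 0))

# datetime is handled by ENCODERS_BY_TYPE and becomes an ISO string.
print(jsonable_encoder(event))
# {'name': 'launch', 'when': '2022-01-01T12:00:00'}

print(jsonable_encoder(event, exclude={"when"}))
# {'name': 'launch'}
```
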
diff --git a/env/lib/python3.9/site-packages/fastapi/exception_handlers.py b/env/lib/python3.9/site-packages/fastapi/exception_handlers.py
deleted file mode 100644
index 2b286d7..0000000
--- a/env/lib/python3.9/site-packages/fastapi/exception_handlers.py
+++ /dev/null
@@ -1,25 +0,0 @@
-from fastapi.encoders import jsonable_encoder
-from fastapi.exceptions import RequestValidationError
-from starlette.exceptions import HTTPException
-from starlette.requests import Request
-from starlette.responses import JSONResponse
-from starlette.status import HTTP_422_UNPROCESSABLE_ENTITY
-
-
-async def http_exception_handler(request: Request, exc: HTTPException) -> JSONResponse:
- headers = getattr(exc, "headers", None)
- if headers:
- return JSONResponse(
- {"detail": exc.detail}, status_code=exc.status_code, headers=headers
- )
- else:
- return JSONResponse({"detail": exc.detail}, status_code=exc.status_code)
-
-
-async def request_validation_exception_handler(
- request: Request, exc: RequestValidationError
-) -> JSONResponse:
- return JSONResponse(
- status_code=HTTP_422_UNPROCESSABLE_ENTITY,
- content={"detail": jsonable_encoder(exc.errors())},
- )
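
These two functions are only the defaults; an application can register its own handlers for the same exception types. A sketch of overriding the validation handler (the plain-text payload is an illustrative choice):

```python
from fastapi import FastAPI, Request
from fastapi.exceptions import RequestValidationError
from starlette.responses import PlainTextResponse

app = FastAPI()


@app.exception_handler(RequestValidationError)
async def plain_validation_handler(
    request: Request, exc: RequestValidationError
) -> PlainTextResponse:
    # Replace the default JSON 422 body with the raw pydantic report.
    return PlainTextResponse(str(exc), status_code=422)
```
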
diff --git a/env/lib/python3.9/site-packages/fastapi/exceptions.py b/env/lib/python3.9/site-packages/fastapi/exceptions.py
deleted file mode 100644
index 0f50acc..0000000
--- a/env/lib/python3.9/site-packages/fastapi/exceptions.py
+++ /dev/null
@@ -1,36 +0,0 @@
-from typing import Any, Dict, Optional, Sequence, Type
-
-from pydantic import BaseModel, ValidationError, create_model
-from pydantic.error_wrappers import ErrorList
-from starlette.exceptions import HTTPException as StarletteHTTPException
-
-
-class HTTPException(StarletteHTTPException):
- def __init__(
- self,
- status_code: int,
- detail: Any = None,
- headers: Optional[Dict[str, Any]] = None,
- ) -> None:
- super().__init__(status_code=status_code, detail=detail, headers=headers)
-
-
-RequestErrorModel: Type[BaseModel] = create_model("Request")
-WebSocketErrorModel: Type[BaseModel] = create_model("WebSocket")
-
-
-class FastAPIError(RuntimeError):
- """
- A generic, FastAPI-specific error.
- """
-
-
-class RequestValidationError(ValidationError):
- def __init__(self, errors: Sequence[ErrorList], *, body: Any = None) -> None:
- self.body = body
- super().__init__(errors, RequestErrorModel)
-
-
-class WebSocketRequestValidationError(ValidationError):
- def __init__(self, errors: Sequence[ErrorList]) -> None:
- super().__init__(errors, WebSocketErrorModel)
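
The `headers` parameter is the main thing FastAPI's `HTTPException` adds over Starlette's; the canonical use is attaching `WWW-Authenticate` to a 401. A short sketch:

```python
from fastapi import FastAPI, HTTPException

app = FastAPI()


@app.get("/private")
async def private() -> dict:
    raise HTTPException(
        status_code=401,
        detail="Not authenticated",
        # Sent along with the error response by http_exception_handler.
        headers={"WWW-Authenticate": "Bearer"},
    )
```
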
diff --git a/env/lib/python3.9/site-packages/fastapi/logger.py b/env/lib/python3.9/site-packages/fastapi/logger.py
deleted file mode 100644
index 5b2c4ad..0000000
--- a/env/lib/python3.9/site-packages/fastapi/logger.py
+++ /dev/null
@@ -1,3 +0,0 @@
-import logging
-
-logger = logging.getLogger("fastapi")
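
Since this module exposes a bare stdlib logger with no handlers, standard `logging` configuration applies; the handler and format below are arbitrary choices:

```python
import logging

from fastapi.logger import logger

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("fastapi logger configured")  # now visible on stderr
```
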
diff --git a/env/lib/python3.9/site-packages/fastapi/middleware/__init__.py b/env/lib/python3.9/site-packages/fastapi/middleware/__init__.py
deleted file mode 100644
index 620296d..0000000
--- a/env/lib/python3.9/site-packages/fastapi/middleware/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.middleware import Middleware as Middleware
diff --git a/env/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py b/env/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py
deleted file mode 100644
index 503a68a..0000000
--- a/env/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py
+++ /dev/null
@@ -1,28 +0,0 @@
-from typing import Optional
-
-from fastapi.concurrency import AsyncExitStack
-from starlette.types import ASGIApp, Receive, Scope, Send
-
-
-class AsyncExitStackMiddleware:
- def __init__(self, app: ASGIApp, context_name: str = "fastapi_astack") -> None:
- self.app = app
- self.context_name = context_name
-
- async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
- if AsyncExitStack:
- dependency_exception: Optional[Exception] = None
- async with AsyncExitStack() as stack:
- scope[self.context_name] = stack
- try:
- await self.app(scope, receive, send)
- except Exception as e:
- dependency_exception = e
- raise e
- if dependency_exception:
- # This exception was possibly handled by the dependency but it should
- # still bubble up so that the ServerErrorMiddleware can return a 500
- # or the ExceptionMiddleware can catch and handle any other exceptions
- raise dependency_exception
- else:
- await self.app(scope, receive, send) # pragma: no cover
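
The stack this middleware stores under `scope["fastapi_astack"]` is the one `solve_generator` (in `dependencies/utils.py` above) enters context managers on, which is why `yield` dependencies are cleaned up only after the response. A sketch of such a dependency (the session dict is an illustrative resource):

```python
from fastapi import Depends, FastAPI

app = FastAPI()


async def get_session():
    session = {"open": True}  # illustrative resource
    try:
        yield session
    finally:
        # Runs when the AsyncExitStack placed in the scope is closed,
        # i.e. after the endpoint has produced its response.
        session["open"] = False


@app.get("/")
async def index(session: dict = Depends(get_session)) -> dict:
    return {"open": session["open"]}
```
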
diff --git a/env/lib/python3.9/site-packages/fastapi/middleware/cors.py b/env/lib/python3.9/site-packages/fastapi/middleware/cors.py
deleted file mode 100644
index 8dfaad0..0000000
--- a/env/lib/python3.9/site-packages/fastapi/middleware/cors.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.middleware.cors import CORSMiddleware as CORSMiddleware # noqa
diff --git a/env/lib/python3.9/site-packages/fastapi/middleware/gzip.py b/env/lib/python3.9/site-packages/fastapi/middleware/gzip.py
deleted file mode 100644
index bbeb2cc..0000000
--- a/env/lib/python3.9/site-packages/fastapi/middleware/gzip.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.middleware.gzip import GZipMiddleware as GZipMiddleware # noqa
diff --git a/env/lib/python3.9/site-packages/fastapi/middleware/httpsredirect.py b/env/lib/python3.9/site-packages/fastapi/middleware/httpsredirect.py
deleted file mode 100644
index b7a3d8e..0000000
--- a/env/lib/python3.9/site-packages/fastapi/middleware/httpsredirect.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from starlette.middleware.httpsredirect import ( # noqa
- HTTPSRedirectMiddleware as HTTPSRedirectMiddleware,
-)
diff --git a/env/lib/python3.9/site-packages/fastapi/middleware/trustedhost.py b/env/lib/python3.9/site-packages/fastapi/middleware/trustedhost.py
deleted file mode 100644
index 08d7e03..0000000
--- a/env/lib/python3.9/site-packages/fastapi/middleware/trustedhost.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from starlette.middleware.trustedhost import ( # noqa
- TrustedHostMiddleware as TrustedHostMiddleware,
-)
diff --git a/env/lib/python3.9/site-packages/fastapi/middleware/wsgi.py b/env/lib/python3.9/site-packages/fastapi/middleware/wsgi.py
deleted file mode 100644
index c4c6a79..0000000
--- a/env/lib/python3.9/site-packages/fastapi/middleware/wsgi.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.middleware.wsgi import WSGIMiddleware as WSGIMiddleware # noqa
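The middleware modules above are thin re-exports of Starlette's implementations; registering them is one call each. A minimal sketch (the origin and size threshold are illustrative):

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.middleware.gzip import GZipMiddleware

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://example.com"],  # illustrative origin
    allow_methods=["*"],
    allow_headers=["*"],
)
app.add_middleware(GZipMiddleware, minimum_size=1000)  # compress bodies >= 1 KB
```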
diff --git a/env/lib/python3.9/site-packages/fastapi/openapi/__init__.py b/env/lib/python3.9/site-packages/fastapi/openapi/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/fastapi/openapi/constants.py b/env/lib/python3.9/site-packages/fastapi/openapi/constants.py
deleted file mode 100644
index 1897ad7..0000000
--- a/env/lib/python3.9/site-packages/fastapi/openapi/constants.py
+++ /dev/null
@@ -1,2 +0,0 @@
-METHODS_WITH_BODY = {"GET", "HEAD", "POST", "PUT", "DELETE", "PATCH"}
-REF_PREFIX = "#/components/schemas/"
diff --git a/env/lib/python3.9/site-packages/fastapi/openapi/docs.py b/env/lib/python3.9/site-packages/fastapi/openapi/docs.py
deleted file mode 100644
index bf33511..0000000
--- a/env/lib/python3.9/site-packages/fastapi/openapi/docs.py
+++ /dev/null
@@ -1,203 +0,0 @@
-import json
-from typing import Any, Dict, Optional
-
-from fastapi.encoders import jsonable_encoder
-from starlette.responses import HTMLResponse
-
-swagger_ui_default_parameters = {
- "dom_id": "#swagger-ui",
- "layout": "BaseLayout",
- "deepLinking": True,
- "showExtensions": True,
- "showCommonExtensions": True,
-}
-
-
-def get_swagger_ui_html(
- *,
- openapi_url: str,
- title: str,
- swagger_js_url: str = "https://cdn.jsdelivr.net/npm/swagger-ui-dist@4/swagger-ui-bundle.js",
- swagger_css_url: str = "https://cdn.jsdelivr.net/npm/swagger-ui-dist@4/swagger-ui.css",
- swagger_favicon_url: str = "https://fastapi.tiangolo.com/img/favicon.png",
- oauth2_redirect_url: Optional[str] = None,
- init_oauth: Optional[Dict[str, Any]] = None,
- swagger_ui_parameters: Optional[Dict[str, Any]] = None,
-) -> HTMLResponse:
- current_swagger_ui_parameters = swagger_ui_default_parameters.copy()
- if swagger_ui_parameters:
- current_swagger_ui_parameters.update(swagger_ui_parameters)
-
- html = f"""
- <!-- Swagger UI page markup stripped during HTML extraction: it loads
- {swagger_css_url}, {swagger_favicon_url} and {swagger_js_url}, titles the
- page {title}, and boots SwaggerUIBundle against {openapi_url} with the
- merged current_swagger_ui_parameters, plus the oauth2_redirect_url and
- init_oauth settings when given -->
- """
- return HTMLResponse(html)
-
-
-def get_redoc_html(
- *,
- openapi_url: str,
- title: str,
- redoc_js_url: str = "https://cdn.jsdelivr.net/npm/redoc@next/bundles/redoc.standalone.js",
- redoc_favicon_url: str = "https://fastapi.tiangolo.com/img/favicon.png",
- with_google_fonts: bool = True,
-) -> HTMLResponse:
- html = f"""
- <!-- ReDoc page head (charset/viewport meta tags, the {title} element and
- the {redoc_favicon_url} icon link) was stripped during HTML extraction -->
- """
- if with_google_fonts:
- html += """
- <!-- Google Fonts stylesheet link, stripped during HTML extraction -->
- """
- html += f"""
- <!-- page body (noscript notice, the redoc element pointed at {openapi_url}
- and the {redoc_js_url} script tag) was stripped during HTML extraction -->
- """
- return HTMLResponse(html)
-
-
-def get_swagger_ui_oauth2_redirect_html() -> HTMLResponse:
- # copied from https://github.com/swagger-api/swagger-ui/blob/v4.14.0/dist/oauth2-redirect.html
- html = """
-
-
-
- Swagger UI: OAuth2 Redirect
-
-
-
-
-
- """
- return HTMLResponse(content=html)
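`get_swagger_ui_html` is also the documented hook for serving the interactive docs from a custom route; a minimal sketch:

```python
from fastapi import FastAPI
from fastapi.openapi.docs import get_swagger_ui_html

app = FastAPI(docs_url=None)  # disable the built-in /docs route

@app.get("/docs", include_in_schema=False)
async def custom_docs():
    return get_swagger_ui_html(
        openapi_url=app.openapi_url,
        title=f"{app.title} - Swagger UI",
        swagger_ui_parameters={"deepLinking": False},  # merged over the defaults
    )
```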
diff --git a/env/lib/python3.9/site-packages/fastapi/openapi/models.py b/env/lib/python3.9/site-packages/fastapi/openapi/models.py
deleted file mode 100644
index 35aa167..0000000
--- a/env/lib/python3.9/site-packages/fastapi/openapi/models.py
+++ /dev/null
@@ -1,406 +0,0 @@
-from enum import Enum
-from typing import Any, Callable, Dict, Iterable, List, Optional, Union
-
-from fastapi.logger import logger
-from pydantic import AnyUrl, BaseModel, Field
-
-try:
- import email_validator # type: ignore
-
- assert email_validator # make autoflake ignore the unused import
- from pydantic import EmailStr
-except ImportError: # pragma: no cover
-
- class EmailStr(str): # type: ignore
- @classmethod
- def __get_validators__(cls) -> Iterable[Callable[..., Any]]:
- yield cls.validate
-
- @classmethod
- def validate(cls, v: Any) -> str:
- logger.warning(
- "email-validator not installed, email fields will be treated as str.\n"
- "To install, run: pip install email-validator"
- )
- return str(v)
-
-
-class Contact(BaseModel):
- name: Optional[str] = None
- url: Optional[AnyUrl] = None
- email: Optional[EmailStr] = None
-
- class Config:
- extra = "allow"
-
-
-class License(BaseModel):
- name: str
- url: Optional[AnyUrl] = None
-
- class Config:
- extra = "allow"
-
-
-class Info(BaseModel):
- title: str
- description: Optional[str] = None
- termsOfService: Optional[str] = None
- contact: Optional[Contact] = None
- license: Optional[License] = None
- version: str
-
- class Config:
- extra = "allow"
-
-
-class ServerVariable(BaseModel):
- enum: Optional[List[str]] = None
- default: str
- description: Optional[str] = None
-
- class Config:
- extra = "allow"
-
-
-class Server(BaseModel):
- url: Union[AnyUrl, str]
- description: Optional[str] = None
- variables: Optional[Dict[str, ServerVariable]] = None
-
- class Config:
- extra = "allow"
-
-
-class Reference(BaseModel):
- ref: str = Field(alias="$ref")
-
-
-class Discriminator(BaseModel):
- propertyName: str
- mapping: Optional[Dict[str, str]] = None
-
-
-class XML(BaseModel):
- name: Optional[str] = None
- namespace: Optional[str] = None
- prefix: Optional[str] = None
- attribute: Optional[bool] = None
- wrapped: Optional[bool] = None
-
- class Config:
- extra = "allow"
-
-
-class ExternalDocumentation(BaseModel):
- description: Optional[str] = None
- url: AnyUrl
-
- class Config:
- extra = "allow"
-
-
-class Schema(BaseModel):
- ref: Optional[str] = Field(default=None, alias="$ref")
- title: Optional[str] = None
- multipleOf: Optional[float] = None
- maximum: Optional[float] = None
- exclusiveMaximum: Optional[float] = None
- minimum: Optional[float] = None
- exclusiveMinimum: Optional[float] = None
- maxLength: Optional[int] = Field(default=None, ge=0)
- minLength: Optional[int] = Field(default=None, ge=0)
- pattern: Optional[str] = None
- maxItems: Optional[int] = Field(default=None, ge=0)
- minItems: Optional[int] = Field(default=None, ge=0)
- uniqueItems: Optional[bool] = None
- maxProperties: Optional[int] = Field(default=None, ge=0)
- minProperties: Optional[int] = Field(default=None, ge=0)
- required: Optional[List[str]] = None
- enum: Optional[List[Any]] = None
- type: Optional[str] = None
- allOf: Optional[List["Schema"]] = None
- oneOf: Optional[List["Schema"]] = None
- anyOf: Optional[List["Schema"]] = None
- not_: Optional["Schema"] = Field(default=None, alias="not")
- items: Optional[Union["Schema", List["Schema"]]] = None
- properties: Optional[Dict[str, "Schema"]] = None
- additionalProperties: Optional[Union["Schema", Reference, bool]] = None
- description: Optional[str] = None
- format: Optional[str] = None
- default: Optional[Any] = None
- nullable: Optional[bool] = None
- discriminator: Optional[Discriminator] = None
- readOnly: Optional[bool] = None
- writeOnly: Optional[bool] = None
- xml: Optional[XML] = None
- externalDocs: Optional[ExternalDocumentation] = None
- example: Optional[Any] = None
- deprecated: Optional[bool] = None
-
- class Config:
- extra: str = "allow"
-
-
-class Example(BaseModel):
- summary: Optional[str] = None
- description: Optional[str] = None
- value: Optional[Any] = None
- externalValue: Optional[AnyUrl] = None
-
- class Config:
- extra = "allow"
-
-
-class ParameterInType(Enum):
- query = "query"
- header = "header"
- path = "path"
- cookie = "cookie"
-
-
-class Encoding(BaseModel):
- contentType: Optional[str] = None
- headers: Optional[Dict[str, Union["Header", Reference]]] = None
- style: Optional[str] = None
- explode: Optional[bool] = None
- allowReserved: Optional[bool] = None
-
- class Config:
- extra = "allow"
-
-
-class MediaType(BaseModel):
- schema_: Optional[Union[Schema, Reference]] = Field(default=None, alias="schema")
- example: Optional[Any] = None
- examples: Optional[Dict[str, Union[Example, Reference]]] = None
- encoding: Optional[Dict[str, Encoding]] = None
-
- class Config:
- extra = "allow"
-
-
-class ParameterBase(BaseModel):
- description: Optional[str] = None
- required: Optional[bool] = None
- deprecated: Optional[bool] = None
- # Serialization rules for simple scenarios
- style: Optional[str] = None
- explode: Optional[bool] = None
- allowReserved: Optional[bool] = None
- schema_: Optional[Union[Schema, Reference]] = Field(default=None, alias="schema")
- example: Optional[Any] = None
- examples: Optional[Dict[str, Union[Example, Reference]]] = None
- # Serialization rules for more complex scenarios
- content: Optional[Dict[str, MediaType]] = None
-
- class Config:
- extra = "allow"
-
-
-class Parameter(ParameterBase):
- name: str
- in_: ParameterInType = Field(alias="in")
-
-
-class Header(ParameterBase):
- pass
-
-
-class RequestBody(BaseModel):
- description: Optional[str] = None
- content: Dict[str, MediaType]
- required: Optional[bool] = None
-
- class Config:
- extra = "allow"
-
-
-class Link(BaseModel):
- operationRef: Optional[str] = None
- operationId: Optional[str] = None
- parameters: Optional[Dict[str, Union[Any, str]]] = None
- requestBody: Optional[Union[Any, str]] = None
- description: Optional[str] = None
- server: Optional[Server] = None
-
- class Config:
- extra = "allow"
-
-
-class Response(BaseModel):
- description: str
- headers: Optional[Dict[str, Union[Header, Reference]]] = None
- content: Optional[Dict[str, MediaType]] = None
- links: Optional[Dict[str, Union[Link, Reference]]] = None
-
- class Config:
- extra = "allow"
-
-
-class Operation(BaseModel):
- tags: Optional[List[str]] = None
- summary: Optional[str] = None
- description: Optional[str] = None
- externalDocs: Optional[ExternalDocumentation] = None
- operationId: Optional[str] = None
- parameters: Optional[List[Union[Parameter, Reference]]] = None
- requestBody: Optional[Union[RequestBody, Reference]] = None
- # Using Any for Specification Extensions
- responses: Dict[str, Union[Response, Any]]
- callbacks: Optional[Dict[str, Union[Dict[str, "PathItem"], Reference]]] = None
- deprecated: Optional[bool] = None
- security: Optional[List[Dict[str, List[str]]]] = None
- servers: Optional[List[Server]] = None
-
- class Config:
- extra = "allow"
-
-
-class PathItem(BaseModel):
- ref: Optional[str] = Field(default=None, alias="$ref")
- summary: Optional[str] = None
- description: Optional[str] = None
- get: Optional[Operation] = None
- put: Optional[Operation] = None
- post: Optional[Operation] = None
- delete: Optional[Operation] = None
- options: Optional[Operation] = None
- head: Optional[Operation] = None
- patch: Optional[Operation] = None
- trace: Optional[Operation] = None
- servers: Optional[List[Server]] = None
- parameters: Optional[List[Union[Parameter, Reference]]] = None
-
- class Config:
- extra = "allow"
-
-
-class SecuritySchemeType(Enum):
- apiKey = "apiKey"
- http = "http"
- oauth2 = "oauth2"
- openIdConnect = "openIdConnect"
-
-
-class SecurityBase(BaseModel):
- type_: SecuritySchemeType = Field(alias="type")
- description: Optional[str] = None
-
- class Config:
- extra = "allow"
-
-
-class APIKeyIn(Enum):
- query = "query"
- header = "header"
- cookie = "cookie"
-
-
-class APIKey(SecurityBase):
- type_ = Field(SecuritySchemeType.apiKey, alias="type")
- in_: APIKeyIn = Field(alias="in")
- name: str
-
-
-class HTTPBase(SecurityBase):
- type_ = Field(SecuritySchemeType.http, alias="type")
- scheme: str
-
-
-class HTTPBearer(HTTPBase):
- scheme = "bearer"
- bearerFormat: Optional[str] = None
-
-
-class OAuthFlow(BaseModel):
- refreshUrl: Optional[str] = None
- scopes: Dict[str, str] = {}
-
- class Config:
- extra = "allow"
-
-
-class OAuthFlowImplicit(OAuthFlow):
- authorizationUrl: str
-
-
-class OAuthFlowPassword(OAuthFlow):
- tokenUrl: str
-
-
-class OAuthFlowClientCredentials(OAuthFlow):
- tokenUrl: str
-
-
-class OAuthFlowAuthorizationCode(OAuthFlow):
- authorizationUrl: str
- tokenUrl: str
-
-
-class OAuthFlows(BaseModel):
- implicit: Optional[OAuthFlowImplicit] = None
- password: Optional[OAuthFlowPassword] = None
- clientCredentials: Optional[OAuthFlowClientCredentials] = None
- authorizationCode: Optional[OAuthFlowAuthorizationCode] = None
-
- class Config:
- extra = "allow"
-
-
-class OAuth2(SecurityBase):
- type_ = Field(SecuritySchemeType.oauth2, alias="type")
- flows: OAuthFlows
-
-
-class OpenIdConnect(SecurityBase):
- type_ = Field(SecuritySchemeType.openIdConnect, alias="type")
- openIdConnectUrl: str
-
-
-SecurityScheme = Union[APIKey, HTTPBase, OAuth2, OpenIdConnect, HTTPBearer]
-
-
-class Components(BaseModel):
- schemas: Optional[Dict[str, Union[Schema, Reference]]] = None
- responses: Optional[Dict[str, Union[Response, Reference]]] = None
- parameters: Optional[Dict[str, Union[Parameter, Reference]]] = None
- examples: Optional[Dict[str, Union[Example, Reference]]] = None
- requestBodies: Optional[Dict[str, Union[RequestBody, Reference]]] = None
- headers: Optional[Dict[str, Union[Header, Reference]]] = None
- securitySchemes: Optional[Dict[str, Union[SecurityScheme, Reference]]] = None
- links: Optional[Dict[str, Union[Link, Reference]]] = None
- # Using Any for Specification Extensions
- callbacks: Optional[Dict[str, Union[Dict[str, PathItem], Reference, Any]]] = None
-
- class Config:
- extra = "allow"
-
-
-class Tag(BaseModel):
- name: str
- description: Optional[str] = None
- externalDocs: Optional[ExternalDocumentation] = None
-
- class Config:
- extra = "allow"
-
-
-class OpenAPI(BaseModel):
- openapi: str
- info: Info
- servers: Optional[List[Server]] = None
- # Using Any for Specification Extensions
- paths: Dict[str, Union[PathItem, Any]]
- components: Optional[Components] = None
- security: Optional[List[Dict[str, List[str]]]] = None
- tags: Optional[List[Tag]] = None
- externalDocs: Optional[ExternalDocumentation] = None
-
- class Config:
- extra = "allow"
-
-
-Schema.update_forward_refs()
-Operation.update_forward_refs()
-Encoding.update_forward_refs()
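Since these are ordinary pydantic models, an OpenAPI document can be built and validated directly; a minimal sketch:

```python
from fastapi.openapi.models import Info, OpenAPI

spec = OpenAPI(
    openapi="3.0.2",
    info=Info(title="Example API", version="0.1.0"),
    paths={},  # PathItem objects (or raw dicts) keyed by path
)
# Aliased fields such as "$ref" and "in" round-trip via by_alias=True.
print(spec.dict(by_alias=True, exclude_none=True))
```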
diff --git a/env/lib/python3.9/site-packages/fastapi/openapi/utils.py b/env/lib/python3.9/site-packages/fastapi/openapi/utils.py
deleted file mode 100644
index 86e15b4..0000000
--- a/env/lib/python3.9/site-packages/fastapi/openapi/utils.py
+++ /dev/null
@@ -1,448 +0,0 @@
-import http.client
-import inspect
-import warnings
-from enum import Enum
-from typing import Any, Dict, List, Optional, Sequence, Set, Tuple, Type, Union, cast
-
-from fastapi import routing
-from fastapi.datastructures import DefaultPlaceholder
-from fastapi.dependencies.models import Dependant
-from fastapi.dependencies.utils import get_flat_dependant, get_flat_params
-from fastapi.encoders import jsonable_encoder
-from fastapi.openapi.constants import METHODS_WITH_BODY, REF_PREFIX
-from fastapi.openapi.models import OpenAPI
-from fastapi.params import Body, Param
-from fastapi.responses import Response
-from fastapi.utils import (
- deep_dict_update,
- generate_operation_id_for_path,
- get_model_definitions,
- is_body_allowed_for_status_code,
-)
-from pydantic import BaseModel
-from pydantic.fields import ModelField, Undefined
-from pydantic.schema import (
- field_schema,
- get_flat_models_from_fields,
- get_model_name_map,
-)
-from pydantic.utils import lenient_issubclass
-from starlette.responses import JSONResponse
-from starlette.routing import BaseRoute
-from starlette.status import HTTP_422_UNPROCESSABLE_ENTITY
-
-validation_error_definition = {
- "title": "ValidationError",
- "type": "object",
- "properties": {
- "loc": {
- "title": "Location",
- "type": "array",
- "items": {"anyOf": [{"type": "string"}, {"type": "integer"}]},
- },
- "msg": {"title": "Message", "type": "string"},
- "type": {"title": "Error Type", "type": "string"},
- },
- "required": ["loc", "msg", "type"],
-}
-
-validation_error_response_definition = {
- "title": "HTTPValidationError",
- "type": "object",
- "properties": {
- "detail": {
- "title": "Detail",
- "type": "array",
- "items": {"$ref": REF_PREFIX + "ValidationError"},
- }
- },
-}
-
-status_code_ranges: Dict[str, str] = {
- "1XX": "Information",
- "2XX": "Success",
- "3XX": "Redirection",
- "4XX": "Client Error",
- "5XX": "Server Error",
- "DEFAULT": "Default Response",
-}
-
-
-def get_openapi_security_definitions(
- flat_dependant: Dependant,
-) -> Tuple[Dict[str, Any], List[Dict[str, Any]]]:
- security_definitions = {}
- operation_security = []
- for security_requirement in flat_dependant.security_requirements:
- security_definition = jsonable_encoder(
- security_requirement.security_scheme.model,
- by_alias=True,
- exclude_none=True,
- )
- security_name = security_requirement.security_scheme.scheme_name
- security_definitions[security_name] = security_definition
- operation_security.append({security_name: security_requirement.scopes})
- return security_definitions, operation_security
-
-
-def get_openapi_operation_parameters(
- *,
- all_route_params: Sequence[ModelField],
- model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str],
-) -> List[Dict[str, Any]]:
- parameters = []
- for param in all_route_params:
- field_info = param.field_info
- field_info = cast(Param, field_info)
- if not field_info.include_in_schema:
- continue
- parameter = {
- "name": param.alias,
- "in": field_info.in_.value,
- "required": param.required,
- "schema": field_schema(
- param, model_name_map=model_name_map, ref_prefix=REF_PREFIX
- )[0],
- }
- if field_info.description:
- parameter["description"] = field_info.description
- if field_info.examples:
- parameter["examples"] = jsonable_encoder(field_info.examples)
- elif field_info.example != Undefined:
- parameter["example"] = jsonable_encoder(field_info.example)
- if field_info.deprecated:
- parameter["deprecated"] = field_info.deprecated
- parameters.append(parameter)
- return parameters
-
-
-def get_openapi_operation_request_body(
- *,
- body_field: Optional[ModelField],
- model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str],
-) -> Optional[Dict[str, Any]]:
- if not body_field:
- return None
- assert isinstance(body_field, ModelField)
- body_schema, _, _ = field_schema(
- body_field, model_name_map=model_name_map, ref_prefix=REF_PREFIX
- )
- field_info = cast(Body, body_field.field_info)
- request_media_type = field_info.media_type
- required = body_field.required
- request_body_oai: Dict[str, Any] = {}
- if required:
- request_body_oai["required"] = required
- request_media_content: Dict[str, Any] = {"schema": body_schema}
- if field_info.examples:
- request_media_content["examples"] = jsonable_encoder(field_info.examples)
- elif field_info.example != Undefined:
- request_media_content["example"] = jsonable_encoder(field_info.example)
- request_body_oai["content"] = {request_media_type: request_media_content}
- return request_body_oai
-
-
-def generate_operation_id(
- *, route: routing.APIRoute, method: str
-) -> str: # pragma: nocover
- warnings.warn(
- "fastapi.openapi.utils.generate_operation_id() was deprecated, "
- "it is not used internally, and will be removed soon",
- DeprecationWarning,
- stacklevel=2,
- )
- if route.operation_id:
- return route.operation_id
- path: str = route.path_format
- return generate_operation_id_for_path(name=route.name, path=path, method=method)
-
-
-def generate_operation_summary(*, route: routing.APIRoute, method: str) -> str:
- if route.summary:
- return route.summary
- return route.name.replace("_", " ").title()
-
-
-def get_openapi_operation_metadata(
- *, route: routing.APIRoute, method: str, operation_ids: Set[str]
-) -> Dict[str, Any]:
- operation: Dict[str, Any] = {}
- if route.tags:
- operation["tags"] = route.tags
- operation["summary"] = generate_operation_summary(route=route, method=method)
- if route.description:
- operation["description"] = route.description
- operation_id = route.operation_id or route.unique_id
- if operation_id in operation_ids:
- message = (
- f"Duplicate Operation ID {operation_id} for function "
- + f"{route.endpoint.__name__}"
- )
- file_name = getattr(route.endpoint, "__globals__", {}).get("__file__")
- if file_name:
- message += f" at {file_name}"
- warnings.warn(message)
- operation_ids.add(operation_id)
- operation["operationId"] = operation_id
- if route.deprecated:
- operation["deprecated"] = route.deprecated
- return operation
-
-
-def get_openapi_path(
- *, route: routing.APIRoute, model_name_map: Dict[type, str], operation_ids: Set[str]
-) -> Tuple[Dict[str, Any], Dict[str, Any], Dict[str, Any]]:
- path = {}
- security_schemes: Dict[str, Any] = {}
- definitions: Dict[str, Any] = {}
- assert route.methods is not None, "Methods must be a list"
- if isinstance(route.response_class, DefaultPlaceholder):
- current_response_class: Type[Response] = route.response_class.value
- else:
- current_response_class = route.response_class
- assert current_response_class, "A response class is needed to generate OpenAPI"
- route_response_media_type: Optional[str] = current_response_class.media_type
- if route.include_in_schema:
- for method in route.methods:
- operation = get_openapi_operation_metadata(
- route=route, method=method, operation_ids=operation_ids
- )
- parameters: List[Dict[str, Any]] = []
- flat_dependant = get_flat_dependant(route.dependant, skip_repeats=True)
- security_definitions, operation_security = get_openapi_security_definitions(
- flat_dependant=flat_dependant
- )
- if operation_security:
- operation.setdefault("security", []).extend(operation_security)
- if security_definitions:
- security_schemes.update(security_definitions)
- all_route_params = get_flat_params(route.dependant)
- operation_parameters = get_openapi_operation_parameters(
- all_route_params=all_route_params, model_name_map=model_name_map
- )
- parameters.extend(operation_parameters)
- if parameters:
- all_parameters = {
- (param["in"], param["name"]): param for param in parameters
- }
- required_parameters = {
- (param["in"], param["name"]): param
- for param in parameters
- if param.get("required")
- }
- # Make sure required definitions of the same parameter take precedence
- # over non-required definitions
- all_parameters.update(required_parameters)
- operation["parameters"] = list(all_parameters.values())
- if method in METHODS_WITH_BODY:
- request_body_oai = get_openapi_operation_request_body(
- body_field=route.body_field, model_name_map=model_name_map
- )
- if request_body_oai:
- operation["requestBody"] = request_body_oai
- if route.callbacks:
- callbacks = {}
- for callback in route.callbacks:
- if isinstance(callback, routing.APIRoute):
- (
- cb_path,
- cb_security_schemes,
- cb_definitions,
- ) = get_openapi_path(
- route=callback,
- model_name_map=model_name_map,
- operation_ids=operation_ids,
- )
- callbacks[callback.name] = {callback.path: cb_path}
- operation["callbacks"] = callbacks
- if route.status_code is not None:
- status_code = str(route.status_code)
- else:
- # It would probably make more sense for all response classes to have an
- # explicit default status_code, and to extract it from them, instead of
- # doing these inspection tricks; that would probably happen in the future.
- # TODO: probably make status_code a default class attribute for all
- # responses in Starlette
- response_signature = inspect.signature(current_response_class.__init__)
- status_code_param = response_signature.parameters.get("status_code")
- if status_code_param is not None:
- if isinstance(status_code_param.default, int):
- status_code = str(status_code_param.default)
- operation.setdefault("responses", {}).setdefault(status_code, {})[
- "description"
- ] = route.response_description
- if route_response_media_type and is_body_allowed_for_status_code(
- route.status_code
- ):
- response_schema = {"type": "string"}
- if lenient_issubclass(current_response_class, JSONResponse):
- if route.response_field:
- response_schema, _, _ = field_schema(
- route.response_field,
- model_name_map=model_name_map,
- ref_prefix=REF_PREFIX,
- )
- else:
- response_schema = {}
- operation.setdefault("responses", {}).setdefault(
- status_code, {}
- ).setdefault("content", {}).setdefault(route_response_media_type, {})[
- "schema"
- ] = response_schema
- if route.responses:
- operation_responses = operation.setdefault("responses", {})
- for (
- additional_status_code,
- additional_response,
- ) in route.responses.items():
- process_response = additional_response.copy()
- process_response.pop("model", None)
- status_code_key = str(additional_status_code).upper()
- if status_code_key == "DEFAULT":
- status_code_key = "default"
- openapi_response = operation_responses.setdefault(
- status_code_key, {}
- )
- assert isinstance(
- process_response, dict
- ), "An additional response must be a dict"
- field = route.response_fields.get(additional_status_code)
- additional_field_schema: Optional[Dict[str, Any]] = None
- if field:
- additional_field_schema, _, _ = field_schema(
- field, model_name_map=model_name_map, ref_prefix=REF_PREFIX
- )
- media_type = route_response_media_type or "application/json"
- additional_schema = (
- process_response.setdefault("content", {})
- .setdefault(media_type, {})
- .setdefault("schema", {})
- )
- deep_dict_update(additional_schema, additional_field_schema)
- status_text: Optional[str] = status_code_ranges.get(
- str(additional_status_code).upper()
- ) or http.client.responses.get(int(additional_status_code))
- description = (
- process_response.get("description")
- or openapi_response.get("description")
- or status_text
- or "Additional Response"
- )
- deep_dict_update(openapi_response, process_response)
- openapi_response["description"] = description
- http422 = str(HTTP_422_UNPROCESSABLE_ENTITY)
- if (all_route_params or route.body_field) and not any(
- [
- status in operation["responses"]
- for status in [http422, "4XX", "default"]
- ]
- ):
- operation["responses"][http422] = {
- "description": "Validation Error",
- "content": {
- "application/json": {
- "schema": {"$ref": REF_PREFIX + "HTTPValidationError"}
- }
- },
- }
- if "ValidationError" not in definitions:
- definitions.update(
- {
- "ValidationError": validation_error_definition,
- "HTTPValidationError": validation_error_response_definition,
- }
- )
- if route.openapi_extra:
- deep_dict_update(operation, route.openapi_extra)
- path[method.lower()] = operation
- return path, security_schemes, definitions
-
-
-def get_flat_models_from_routes(
- routes: Sequence[BaseRoute],
-) -> Set[Union[Type[BaseModel], Type[Enum]]]:
- body_fields_from_routes: List[ModelField] = []
- responses_from_routes: List[ModelField] = []
- request_fields_from_routes: List[ModelField] = []
- callback_flat_models: Set[Union[Type[BaseModel], Type[Enum]]] = set()
- for route in routes:
- if getattr(route, "include_in_schema", None) and isinstance(
- route, routing.APIRoute
- ):
- if route.body_field:
- assert isinstance(
- route.body_field, ModelField
- ), "A request body must be a Pydantic Field"
- body_fields_from_routes.append(route.body_field)
- if route.response_field:
- responses_from_routes.append(route.response_field)
- if route.response_fields:
- responses_from_routes.extend(route.response_fields.values())
- if route.callbacks:
- callback_flat_models |= get_flat_models_from_routes(route.callbacks)
- params = get_flat_params(route.dependant)
- request_fields_from_routes.extend(params)
-
- flat_models = callback_flat_models | get_flat_models_from_fields(
- body_fields_from_routes + responses_from_routes + request_fields_from_routes,
- known_models=set(),
- )
- return flat_models
-
-
-def get_openapi(
- *,
- title: str,
- version: str,
- openapi_version: str = "3.0.2",
- description: Optional[str] = None,
- routes: Sequence[BaseRoute],
- tags: Optional[List[Dict[str, Any]]] = None,
- servers: Optional[List[Dict[str, Union[str, Any]]]] = None,
- terms_of_service: Optional[str] = None,
- contact: Optional[Dict[str, Union[str, Any]]] = None,
- license_info: Optional[Dict[str, Union[str, Any]]] = None,
-) -> Dict[str, Any]:
- info: Dict[str, Any] = {"title": title, "version": version}
- if description:
- info["description"] = description
- if terms_of_service:
- info["termsOfService"] = terms_of_service
- if contact:
- info["contact"] = contact
- if license_info:
- info["license"] = license_info
- output: Dict[str, Any] = {"openapi": openapi_version, "info": info}
- if servers:
- output["servers"] = servers
- components: Dict[str, Dict[str, Any]] = {}
- paths: Dict[str, Dict[str, Any]] = {}
- operation_ids: Set[str] = set()
- flat_models = get_flat_models_from_routes(routes)
- model_name_map = get_model_name_map(flat_models)
- definitions = get_model_definitions(
- flat_models=flat_models, model_name_map=model_name_map
- )
- for route in routes:
- if isinstance(route, routing.APIRoute):
- result = get_openapi_path(
- route=route, model_name_map=model_name_map, operation_ids=operation_ids
- )
- if result:
- path, security_schemes, path_definitions = result
- if path:
- paths.setdefault(route.path_format, {}).update(path)
- if security_schemes:
- components.setdefault("securitySchemes", {}).update(
- security_schemes
- )
- if path_definitions:
- definitions.update(path_definitions)
- if definitions:
- components["schemas"] = {k: definitions[k] for k in sorted(definitions)}
- if components:
- output["components"] = components
- output["paths"] = paths
- if tags:
- output["tags"] = tags
- return jsonable_encoder(OpenAPI(**output), by_alias=True, exclude_none=True) # type: ignore
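`get_openapi` is the entry point FastAPI itself calls, and the documented way to customize the generated schema; a minimal sketch (the extension key is illustrative):

```python
from fastapi import FastAPI
from fastapi.openapi.utils import get_openapi

app = FastAPI()

def custom_openapi() -> dict:
    if not app.openapi_schema:
        schema = get_openapi(title="Custom API", version="1.0.0", routes=app.routes)
        schema["info"]["x-maintainer"] = "platform-team"  # illustrative extension
        app.openapi_schema = schema
    return app.openapi_schema

app.openapi = custom_openapi  # replace the default schema generator
```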
diff --git a/env/lib/python3.9/site-packages/fastapi/param_functions.py b/env/lib/python3.9/site-packages/fastapi/param_functions.py
deleted file mode 100644
index 1932ef0..0000000
--- a/env/lib/python3.9/site-packages/fastapi/param_functions.py
+++ /dev/null
@@ -1,290 +0,0 @@
-from typing import Any, Callable, Dict, Optional, Sequence
-
-from fastapi import params
-from pydantic.fields import Undefined
-
-
-def Path( # noqa: N802
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
-) -> Any:
- return params.Path(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- deprecated=deprecated,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-def Query( # noqa: N802
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
-) -> Any:
- return params.Query(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- deprecated=deprecated,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-def Header( # noqa: N802
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- convert_underscores: bool = True,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
-) -> Any:
- return params.Header(
- default=default,
- alias=alias,
- convert_underscores=convert_underscores,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- deprecated=deprecated,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-def Cookie( # noqa: N802
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
-) -> Any:
- return params.Cookie(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- deprecated=deprecated,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-def Body( # noqa: N802
- default: Any = Undefined,
- *,
- embed: bool = False,
- media_type: str = "application/json",
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- **extra: Any,
-) -> Any:
- return params.Body(
- default=default,
- embed=embed,
- media_type=media_type,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- **extra,
- )
-
-
-def Form( # noqa: N802
- default: Any = Undefined,
- *,
- media_type: str = "application/x-www-form-urlencoded",
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- **extra: Any,
-) -> Any:
- return params.Form(
- default=default,
- media_type=media_type,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- **extra,
- )
-
-
-def File( # noqa: N802
- default: Any = Undefined,
- *,
- media_type: str = "multipart/form-data",
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- **extra: Any,
-) -> Any:
- return params.File(
- default=default,
- media_type=media_type,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- **extra,
- )
-
-
-def Depends( # noqa: N802
- dependency: Optional[Callable[..., Any]] = None, *, use_cache: bool = True
-) -> Any:
- return params.Depends(dependency=dependency, use_cache=use_cache)
-
-
-def Security( # noqa: N802
- dependency: Optional[Callable[..., Any]] = None,
- *,
- scopes: Optional[Sequence[str]] = None,
- use_cache: bool = True,
-) -> Any:
- return params.Security(dependency=dependency, scopes=scopes, use_cache=use_cache)
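These functions declare request parameters directly in endpoint signatures; a minimal sketch using `Path` and `Query` with validation constraints:

```python
from typing import Optional

from fastapi import FastAPI, Path, Query

app = FastAPI()

@app.get("/items/{item_id}")
async def read_item(
    item_id: int = Path(..., ge=1, description="Numeric item ID"),
    q: Optional[str] = Query(None, max_length=50),
) -> dict:
    return {"item_id": item_id, "q": q}
```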
diff --git a/env/lib/python3.9/site-packages/fastapi/params.py b/env/lib/python3.9/site-packages/fastapi/params.py
deleted file mode 100644
index 5395b98..0000000
--- a/env/lib/python3.9/site-packages/fastapi/params.py
+++ /dev/null
@@ -1,380 +0,0 @@
-from enum import Enum
-from typing import Any, Callable, Dict, Optional, Sequence
-
-from pydantic.fields import FieldInfo, Undefined
-
-
-class ParamTypes(Enum):
- query = "query"
- header = "header"
- path = "path"
- cookie = "cookie"
-
-
-class Param(FieldInfo):
- in_: ParamTypes
-
- def __init__(
- self,
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
- ):
- self.deprecated = deprecated
- self.example = example
- self.examples = examples
- self.include_in_schema = include_in_schema
- super().__init__(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- **extra,
- )
-
- def __repr__(self) -> str:
- return f"{self.__class__.__name__}({self.default})"
-
-
-class Path(Param):
- in_ = ParamTypes.path
-
- def __init__(
- self,
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
- ):
- self.in_ = self.in_
- super().__init__(
- default=...,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- deprecated=deprecated,
- example=example,
- examples=examples,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-class Query(Param):
- in_ = ParamTypes.query
-
- def __init__(
- self,
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
- ):
- super().__init__(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- deprecated=deprecated,
- example=example,
- examples=examples,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-class Header(Param):
- in_ = ParamTypes.header
-
- def __init__(
- self,
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- convert_underscores: bool = True,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
- ):
- self.convert_underscores = convert_underscores
- super().__init__(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- deprecated=deprecated,
- example=example,
- examples=examples,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-class Cookie(Param):
- in_ = ParamTypes.cookie
-
- def __init__(
- self,
- default: Any = Undefined,
- *,
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- **extra: Any,
- ):
- super().__init__(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- deprecated=deprecated,
- example=example,
- examples=examples,
- include_in_schema=include_in_schema,
- **extra,
- )
-
-
-class Body(FieldInfo):
- def __init__(
- self,
- default: Any = Undefined,
- *,
- embed: bool = False,
- media_type: str = "application/json",
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- **extra: Any,
- ):
- self.embed = embed
- self.media_type = media_type
- self.example = example
- self.examples = examples
- super().__init__(
- default=default,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- **extra,
- )
-
- def __repr__(self) -> str:
- return f"{self.__class__.__name__}({self.default})"
-
-
-class Form(Body):
- def __init__(
- self,
- default: Any,
- *,
- media_type: str = "application/x-www-form-urlencoded",
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- **extra: Any,
- ):
- super().__init__(
- default=default,
- embed=True,
- media_type=media_type,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- **extra,
- )
-
-
-class File(Form):
- def __init__(
- self,
- default: Any,
- *,
- media_type: str = "multipart/form-data",
- alias: Optional[str] = None,
- title: Optional[str] = None,
- description: Optional[str] = None,
- gt: Optional[float] = None,
- ge: Optional[float] = None,
- lt: Optional[float] = None,
- le: Optional[float] = None,
- min_length: Optional[int] = None,
- max_length: Optional[int] = None,
- regex: Optional[str] = None,
- example: Any = Undefined,
- examples: Optional[Dict[str, Any]] = None,
- **extra: Any,
- ):
- super().__init__(
- default=default,
- media_type=media_type,
- alias=alias,
- title=title,
- description=description,
- gt=gt,
- ge=ge,
- lt=lt,
- le=le,
- min_length=min_length,
- max_length=max_length,
- regex=regex,
- example=example,
- examples=examples,
- **extra,
- )
-
-
-class Depends:
- def __init__(
- self, dependency: Optional[Callable[..., Any]] = None, *, use_cache: bool = True
- ):
- self.dependency = dependency
- self.use_cache = use_cache
-
- def __repr__(self) -> str:
- attr = getattr(self.dependency, "__name__", type(self.dependency).__name__)
- cache = "" if self.use_cache else ", use_cache=False"
- return f"{self.__class__.__name__}({attr}{cache})"
-
-
-class Security(Depends):
- def __init__(
- self,
- dependency: Optional[Callable[..., Any]] = None,
- *,
- scopes: Optional[Sequence[str]] = None,
- use_cache: bool = True,
- ):
- super().__init__(dependency=dependency, use_cache=use_cache)
- self.scopes = scopes or []
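`Form` and `File` above require an explicit default and embed their fields in the request body; a minimal usage sketch (multipart handling needs `python-multipart` installed):

```python
from fastapi import FastAPI, File, Form, UploadFile

app = FastAPI()

@app.post("/upload")
async def upload(
    name: str = Form(...),         # sent as form field "name"
    data: UploadFile = File(...),  # sent as multipart file "data"
) -> dict:
    return {"name": name, "filename": data.filename}
```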
diff --git a/env/lib/python3.9/site-packages/fastapi/py.typed b/env/lib/python3.9/site-packages/fastapi/py.typed
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/fastapi/requests.py b/env/lib/python3.9/site-packages/fastapi/requests.py
deleted file mode 100644
index d16552c..0000000
--- a/env/lib/python3.9/site-packages/fastapi/requests.py
+++ /dev/null
@@ -1,2 +0,0 @@
-from starlette.requests import HTTPConnection as HTTPConnection # noqa: F401
-from starlette.requests import Request as Request # noqa: F401
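`Request` (re-exported above) gives direct access to the underlying connection; a minimal sketch:

```python
from fastapi import FastAPI, Request

app = FastAPI()

@app.get("/whoami")
async def whoami(request: Request) -> dict:
    client = request.client
    return {"host": client.host if client else None, "path": request.url.path}
```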
diff --git a/env/lib/python3.9/site-packages/fastapi/responses.py b/env/lib/python3.9/site-packages/fastapi/responses.py
deleted file mode 100644
index 88dba96..0000000
--- a/env/lib/python3.9/site-packages/fastapi/responses.py
+++ /dev/null
@@ -1,36 +0,0 @@
-from typing import Any
-
-from starlette.responses import FileResponse as FileResponse # noqa
-from starlette.responses import HTMLResponse as HTMLResponse # noqa
-from starlette.responses import JSONResponse as JSONResponse # noqa
-from starlette.responses import PlainTextResponse as PlainTextResponse # noqa
-from starlette.responses import RedirectResponse as RedirectResponse # noqa
-from starlette.responses import Response as Response # noqa
-from starlette.responses import StreamingResponse as StreamingResponse # noqa
-
-try:
- import ujson
-except ImportError: # pragma: nocover
- ujson = None # type: ignore
-
-
-try:
- import orjson
-except ImportError: # pragma: nocover
- orjson = None # type: ignore
-
-
-class UJSONResponse(JSONResponse):
- def render(self, content: Any) -> bytes:
- assert ujson is not None, "ujson must be installed to use UJSONResponse"
- return ujson.dumps(content, ensure_ascii=False).encode("utf-8")
-
-
-class ORJSONResponse(JSONResponse):
- media_type = "application/json"
-
- def render(self, content: Any) -> bytes:
- assert orjson is not None, "orjson must be installed to use ORJSONResponse"
- return orjson.dumps(
- content, option=orjson.OPT_NON_STR_KEYS | orjson.OPT_SERIALIZE_NUMPY
- )
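`ORJSONResponse` can be used per route or as the application default (it requires `orjson` installed); a minimal sketch:

```python
from fastapi import FastAPI
from fastapi.responses import ORJSONResponse

app = FastAPI(default_response_class=ORJSONResponse)

@app.get("/numbers", response_class=ORJSONResponse)
async def numbers() -> dict:
    # orjson also handles non-str keys and numpy values per the options above.
    return {"values": [1, 2, 3]}
```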
diff --git a/env/lib/python3.9/site-packages/fastapi/routing.py b/env/lib/python3.9/site-packages/fastapi/routing.py
deleted file mode 100644
index 233f79f..0000000
--- a/env/lib/python3.9/site-packages/fastapi/routing.py
+++ /dev/null
@@ -1,1237 +0,0 @@
-import asyncio
-import dataclasses
-import email.message
-import inspect
-import json
-from enum import Enum, IntEnum
-from typing import (
- Any,
- Callable,
- Coroutine,
- Dict,
- List,
- Optional,
- Sequence,
- Set,
- Tuple,
- Type,
- Union,
-)
-
-from fastapi import params
-from fastapi.datastructures import Default, DefaultPlaceholder
-from fastapi.dependencies.models import Dependant
-from fastapi.dependencies.utils import (
- get_body_field,
- get_dependant,
- get_parameterless_sub_dependant,
- solve_dependencies,
-)
-from fastapi.encoders import DictIntStrAny, SetIntStr, jsonable_encoder
-from fastapi.exceptions import RequestValidationError, WebSocketRequestValidationError
-from fastapi.types import DecoratedCallable
-from fastapi.utils import (
- create_cloned_field,
- create_response_field,
- generate_unique_id,
- get_value_or_default,
- is_body_allowed_for_status_code,
-)
-from pydantic import BaseModel
-from pydantic.error_wrappers import ErrorWrapper, ValidationError
-from pydantic.fields import ModelField, Undefined
-from starlette import routing
-from starlette.concurrency import run_in_threadpool
-from starlette.exceptions import HTTPException
-from starlette.requests import Request
-from starlette.responses import JSONResponse, Response
-from starlette.routing import BaseRoute, Match
-from starlette.routing import Mount as Mount # noqa
-from starlette.routing import (
- compile_path,
- get_name,
- request_response,
- websocket_session,
-)
-from starlette.status import WS_1008_POLICY_VIOLATION
-from starlette.types import ASGIApp, Scope
-from starlette.websockets import WebSocket
-
-
-def _prepare_response_content(
- res: Any,
- *,
- exclude_unset: bool,
- exclude_defaults: bool = False,
- exclude_none: bool = False,
-) -> Any:
- if isinstance(res, BaseModel):
- read_with_orm_mode = getattr(res.__config__, "read_with_orm_mode", None)
- if read_with_orm_mode:
- # Let from_orm extract the data from this model instead of converting
- # it now to a dict.
- # Otherwise there's no way to extract lazy data that requires attribute
- # access instead of dict iteration, e.g. lazy relationships.
- return res
- return res.dict(
- by_alias=True,
- exclude_unset=exclude_unset,
- exclude_defaults=exclude_defaults,
- exclude_none=exclude_none,
- )
- elif isinstance(res, list):
- return [
- _prepare_response_content(
- item,
- exclude_unset=exclude_unset,
- exclude_defaults=exclude_defaults,
- exclude_none=exclude_none,
- )
- for item in res
- ]
- elif isinstance(res, dict):
- return {
- k: _prepare_response_content(
- v,
- exclude_unset=exclude_unset,
- exclude_defaults=exclude_defaults,
- exclude_none=exclude_none,
- )
- for k, v in res.items()
- }
- elif dataclasses.is_dataclass(res):
- return dataclasses.asdict(res)
- return res
-
-
-async def serialize_response(
- *,
- field: Optional[ModelField] = None,
- response_content: Any,
- include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- by_alias: bool = True,
- exclude_unset: bool = False,
- exclude_defaults: bool = False,
- exclude_none: bool = False,
- is_coroutine: bool = True,
-) -> Any:
- if field:
- errors = []
- response_content = _prepare_response_content(
- response_content,
- exclude_unset=exclude_unset,
- exclude_defaults=exclude_defaults,
- exclude_none=exclude_none,
- )
- if is_coroutine:
- value, errors_ = field.validate(response_content, {}, loc=("response",))
- else:
- value, errors_ = await run_in_threadpool( # type: ignore[misc]
- field.validate, response_content, {}, loc=("response",)
- )
- if isinstance(errors_, ErrorWrapper):
- errors.append(errors_)
- elif isinstance(errors_, list):
- errors.extend(errors_)
- if errors:
- raise ValidationError(errors, field.type_)
- return jsonable_encoder(
- value,
- include=include,
- exclude=exclude,
- by_alias=by_alias,
- exclude_unset=exclude_unset,
- exclude_defaults=exclude_defaults,
- exclude_none=exclude_none,
- )
- else:
- return jsonable_encoder(response_content)
-
-
-async def run_endpoint_function(
- *, dependant: Dependant, values: Dict[str, Any], is_coroutine: bool
-) -> Any:
- # Only called by get_request_handler. Has been split into its own function to
- # facilitate profiling endpoints, since inner functions are harder to profile.
- assert dependant.call is not None, "dependant.call must be a function"
-
- if is_coroutine:
- return await dependant.call(**values)
- else:
- return await run_in_threadpool(dependant.call, **values)
-
-
-def get_request_handler(
- dependant: Dependant,
- body_field: Optional[ModelField] = None,
- status_code: Optional[int] = None,
- response_class: Union[Type[Response], DefaultPlaceholder] = Default(JSONResponse),
- response_field: Optional[ModelField] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- dependency_overrides_provider: Optional[Any] = None,
-) -> Callable[[Request], Coroutine[Any, Any, Response]]:
- assert dependant.call is not None, "dependant.call must be a function"
- is_coroutine = asyncio.iscoroutinefunction(dependant.call)
- is_body_form = body_field and isinstance(body_field.field_info, params.Form)
- if isinstance(response_class, DefaultPlaceholder):
- actual_response_class: Type[Response] = response_class.value
- else:
- actual_response_class = response_class
-
- async def app(request: Request) -> Response:
- try:
- body: Any = None
- if body_field:
- if is_body_form:
- body = await request.form()
- else:
- body_bytes = await request.body()
- if body_bytes:
- json_body: Any = Undefined
- content_type_value = request.headers.get("content-type")
- if not content_type_value:
- json_body = await request.json()
- else:
- message = email.message.Message()
- message["content-type"] = content_type_value
- if message.get_content_maintype() == "application":
- subtype = message.get_content_subtype()
- if subtype == "json" or subtype.endswith("+json"):
- json_body = await request.json()
- if json_body != Undefined:
- body = json_body
- else:
- body = body_bytes
- except json.JSONDecodeError as e:
- raise RequestValidationError(
- [ErrorWrapper(e, ("body", e.pos))], body=e.doc
- ) from e
- except HTTPException:
- raise
- except Exception as e:
- raise HTTPException(
- status_code=400, detail="There was an error parsing the body"
- ) from e
- solved_result = await solve_dependencies(
- request=request,
- dependant=dependant,
- body=body,
- dependency_overrides_provider=dependency_overrides_provider,
- )
- values, errors, background_tasks, sub_response, _ = solved_result
- if errors:
- raise RequestValidationError(errors, body=body)
- else:
- raw_response = await run_endpoint_function(
- dependant=dependant, values=values, is_coroutine=is_coroutine
- )
-
- if isinstance(raw_response, Response):
- if raw_response.background is None:
- raw_response.background = background_tasks
- return raw_response
- response_args: Dict[str, Any] = {"background": background_tasks}
- # If status_code was set, use it, otherwise use the default from the
- # response class, in the case of redirect it's 307
- current_status_code = (
- status_code if status_code else sub_response.status_code
- )
- if current_status_code is not None:
- response_args["status_code"] = current_status_code
- if sub_response.status_code:
- response_args["status_code"] = sub_response.status_code
- content = await serialize_response(
- field=response_field,
- response_content=raw_response,
- include=response_model_include,
- exclude=response_model_exclude,
- by_alias=response_model_by_alias,
- exclude_unset=response_model_exclude_unset,
- exclude_defaults=response_model_exclude_defaults,
- exclude_none=response_model_exclude_none,
- is_coroutine=is_coroutine,
- )
- response = actual_response_class(content, **response_args)
- if not is_body_allowed_for_status_code(status_code):
- response.body = b""
- response.headers.raw.extend(sub_response.headers.raw)
- return response
-
- return app
-
-
-def get_websocket_app(
- dependant: Dependant, dependency_overrides_provider: Optional[Any] = None
-) -> Callable[[WebSocket], Coroutine[Any, Any, Any]]:
- async def app(websocket: WebSocket) -> None:
- solved_result = await solve_dependencies(
- request=websocket,
- dependant=dependant,
- dependency_overrides_provider=dependency_overrides_provider,
- )
- values, errors, _, _2, _3 = solved_result
- if errors:
- await websocket.close(code=WS_1008_POLICY_VIOLATION)
- raise WebSocketRequestValidationError(errors)
- assert dependant.call is not None, "dependant.call must be a function"
- await dependant.call(**values)
-
- return app
-
-
-class APIWebSocketRoute(routing.WebSocketRoute):
- def __init__(
- self,
- path: str,
- endpoint: Callable[..., Any],
- *,
- name: Optional[str] = None,
- dependency_overrides_provider: Optional[Any] = None,
- ) -> None:
- self.path = path
- self.endpoint = endpoint
- self.name = get_name(endpoint) if name is None else name
- self.path_regex, self.path_format, self.param_convertors = compile_path(path)
- self.dependant = get_dependant(path=self.path_format, call=self.endpoint)
- self.app = websocket_session(
- get_websocket_app(
- dependant=self.dependant,
- dependency_overrides_provider=dependency_overrides_provider,
- )
- )
-
- def matches(self, scope: Scope) -> Tuple[Match, Scope]:
- match, child_scope = super().matches(scope)
- if match != Match.NONE:
- child_scope["route"] = self
- return match, child_scope
-
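For context, this is roughly how the websocket plumbing above is used from application code; the path and handler are hypothetical:

```python
from fastapi import FastAPI, WebSocket

app = FastAPI()

# Parameters (here the "room" path parameter) are resolved through the same
# dependency system that HTTP endpoints use.
@app.websocket("/ws/{room}")
async def ws_endpoint(websocket: WebSocket, room: str) -> None:
    await websocket.accept()
    await websocket.send_text(f"joined {room}")
    await websocket.close()
```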
-
-class APIRoute(routing.Route):
- def __init__(
- self,
- path: str,
- endpoint: Callable[..., Any],
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- name: Optional[str] = None,
- methods: Optional[Union[Set[str], List[str]]] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Union[Type[Response], DefaultPlaceholder] = Default(
- JSONResponse
- ),
- dependency_overrides_provider: Optional[Any] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Union[
- Callable[["APIRoute"], str], DefaultPlaceholder
- ] = Default(generate_unique_id),
- ) -> None:
- self.path = path
- self.endpoint = endpoint
- self.response_model = response_model
- self.summary = summary
- self.response_description = response_description
- self.deprecated = deprecated
- self.operation_id = operation_id
- self.response_model_include = response_model_include
- self.response_model_exclude = response_model_exclude
- self.response_model_by_alias = response_model_by_alias
- self.response_model_exclude_unset = response_model_exclude_unset
- self.response_model_exclude_defaults = response_model_exclude_defaults
- self.response_model_exclude_none = response_model_exclude_none
- self.include_in_schema = include_in_schema
- self.response_class = response_class
- self.dependency_overrides_provider = dependency_overrides_provider
- self.callbacks = callbacks
- self.openapi_extra = openapi_extra
- self.generate_unique_id_function = generate_unique_id_function
- self.tags = tags or []
- self.responses = responses or {}
- self.name = get_name(endpoint) if name is None else name
- self.path_regex, self.path_format, self.param_convertors = compile_path(path)
- if methods is None:
- methods = ["GET"]
- self.methods: Set[str] = {method.upper() for method in methods}
- if isinstance(generate_unique_id_function, DefaultPlaceholder):
- current_generate_unique_id: Callable[
- ["APIRoute"], str
- ] = generate_unique_id_function.value
- else:
- current_generate_unique_id = generate_unique_id_function
- self.unique_id = self.operation_id or current_generate_unique_id(self)
- # normalize enums e.g. http.HTTPStatus
- if isinstance(status_code, IntEnum):
- status_code = int(status_code)
- self.status_code = status_code
- if self.response_model:
- assert is_body_allowed_for_status_code(
- status_code
- ), f"Status code {status_code} must not have a response body"
- response_name = "Response_" + self.unique_id
- self.response_field = create_response_field(
- name=response_name, type_=self.response_model
- )
-        # Create a clone of the field, so that a Pydantic submodel is not returned
-        # as is just because it's an instance of a subclass of a more limited class.
-        # E.g. UserInDB (containing hashed_password) could be a subclass of User
-        # that doesn't have the hashed_password. Because it's a subclass, it would
-        # pass validation and be returned as is. With a freshly cloned field, a new
-        # model is always created instead, so inherited extra attributes never leak
-        # through.
- self.secure_cloned_response_field: Optional[
- ModelField
- ] = create_cloned_field(self.response_field)
- else:
- self.response_field = None # type: ignore
- self.secure_cloned_response_field = None
- if dependencies:
- self.dependencies = list(dependencies)
- else:
- self.dependencies = []
- self.description = description or inspect.cleandoc(self.endpoint.__doc__ or "")
- # if a "form feed" character (page break) is found in the description text,
- # truncate description text to the content preceding the first "form feed"
- self.description = self.description.split("\f")[0].strip()
- response_fields = {}
- for additional_status_code, response in self.responses.items():
- assert isinstance(response, dict), "An additional response must be a dict"
- model = response.get("model")
- if model:
- assert is_body_allowed_for_status_code(
- additional_status_code
- ), f"Status code {additional_status_code} must not have a response body"
- response_name = f"Response_{additional_status_code}_{self.unique_id}"
- response_field = create_response_field(name=response_name, type_=model)
- response_fields[additional_status_code] = response_field
- if response_fields:
- self.response_fields: Dict[Union[int, str], ModelField] = response_fields
- else:
- self.response_fields = {}
-
- assert callable(endpoint), "An endpoint must be a callable"
- self.dependant = get_dependant(path=self.path_format, call=self.endpoint)
- for depends in self.dependencies[::-1]:
- self.dependant.dependencies.insert(
- 0,
- get_parameterless_sub_dependant(depends=depends, path=self.path_format),
- )
- self.body_field = get_body_field(dependant=self.dependant, name=self.unique_id)
- self.app = request_response(self.get_route_handler())
-
- def get_route_handler(self) -> Callable[[Request], Coroutine[Any, Any, Response]]:
- return get_request_handler(
- dependant=self.dependant,
- body_field=self.body_field,
- status_code=self.status_code,
- response_class=self.response_class,
- response_field=self.secure_cloned_response_field,
- response_model_include=self.response_model_include,
- response_model_exclude=self.response_model_exclude,
- response_model_by_alias=self.response_model_by_alias,
- response_model_exclude_unset=self.response_model_exclude_unset,
- response_model_exclude_defaults=self.response_model_exclude_defaults,
- response_model_exclude_none=self.response_model_exclude_none,
- dependency_overrides_provider=self.dependency_overrides_provider,
- )
-
- def matches(self, scope: Scope) -> Tuple[Match, Scope]:
- match, child_scope = super().matches(scope)
- if match != Match.NONE:
- child_scope["route"] = self
- return match, child_scope
-
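The `secure_cloned_response_field` logic above is what makes `response_model` filtering safe. A sketch of the behaviour it guarantees, with hypothetical models:

```python
from fastapi import FastAPI
from pydantic import BaseModel

class User(BaseModel):
    username: str

class UserInDB(User):
    hashed_password: str

app = FastAPI()

@app.get("/me", response_model=User)
async def read_me():
    # UserInDB passes validation because it subclasses User, but since the
    # response field is a fresh clone of User, hashed_password is stripped
    # from the serialized response.
    return UserInDB(username="alice", hashed_password="secret")
```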
-
-class APIRouter(routing.Router):
- def __init__(
- self,
- *,
- prefix: str = "",
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- default_response_class: Type[Response] = Default(JSONResponse),
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- routes: Optional[List[routing.BaseRoute]] = None,
- redirect_slashes: bool = True,
- default: Optional[ASGIApp] = None,
- dependency_overrides_provider: Optional[Any] = None,
- route_class: Type[APIRoute] = APIRoute,
- on_startup: Optional[Sequence[Callable[[], Any]]] = None,
- on_shutdown: Optional[Sequence[Callable[[], Any]]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> None:
- super().__init__(
- routes=routes,
- redirect_slashes=redirect_slashes,
- default=default,
- on_startup=on_startup,
- on_shutdown=on_shutdown,
- )
- if prefix:
- assert prefix.startswith("/"), "A path prefix must start with '/'"
- assert not prefix.endswith(
- "/"
- ), "A path prefix must not end with '/', as the routes will start with '/'"
- self.prefix = prefix
- self.tags: List[Union[str, Enum]] = tags or []
-        self.dependencies = list(dependencies or [])
- self.deprecated = deprecated
- self.include_in_schema = include_in_schema
- self.responses = responses or {}
- self.callbacks = callbacks or []
- self.dependency_overrides_provider = dependency_overrides_provider
- self.route_class = route_class
- self.default_response_class = default_response_class
- self.generate_unique_id_function = generate_unique_id_function
-
- def add_api_route(
- self,
- path: str,
- endpoint: Callable[..., Any],
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- methods: Optional[Union[Set[str], List[str]]] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Union[Type[Response], DefaultPlaceholder] = Default(
- JSONResponse
- ),
- name: Optional[str] = None,
- route_class_override: Optional[Type[APIRoute]] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Union[
- Callable[[APIRoute], str], DefaultPlaceholder
- ] = Default(generate_unique_id),
- ) -> None:
- route_class = route_class_override or self.route_class
- responses = responses or {}
- combined_responses = {**self.responses, **responses}
- current_response_class = get_value_or_default(
- response_class, self.default_response_class
- )
- current_tags = self.tags.copy()
- if tags:
- current_tags.extend(tags)
- current_dependencies = self.dependencies.copy()
- if dependencies:
- current_dependencies.extend(dependencies)
- current_callbacks = self.callbacks.copy()
- if callbacks:
- current_callbacks.extend(callbacks)
- current_generate_unique_id = get_value_or_default(
- generate_unique_id_function, self.generate_unique_id_function
- )
- route = route_class(
- self.prefix + path,
- endpoint=endpoint,
- response_model=response_model,
- status_code=status_code,
- tags=current_tags,
- dependencies=current_dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=combined_responses,
- deprecated=deprecated or self.deprecated,
- methods=methods,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema and self.include_in_schema,
- response_class=current_response_class,
- name=name,
- dependency_overrides_provider=self.dependency_overrides_provider,
- callbacks=current_callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=current_generate_unique_id,
- )
- self.routes.append(route)
-
- def api_route(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- methods: Optional[List[str]] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- def decorator(func: DecoratedCallable) -> DecoratedCallable:
- self.add_api_route(
- path,
- func,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=methods,
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
- return func
-
- return decorator
-
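`api_route` is the generic registration decorator; the HTTP verb helpers further down (`get`, `put`, `post`, ...) simply call it with a fixed `methods` list. So these two declarations should be equivalent:

```python
from fastapi import APIRouter

router = APIRouter()

@router.api_route("/ping", methods=["GET"])
def ping():
    return {"ping": "pong"}

# Equivalent shorthand via the verb helper:
#
# @router.get("/ping")
# def ping():
#     return {"ping": "pong"}
```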
- def add_api_websocket_route(
- self, path: str, endpoint: Callable[..., Any], name: Optional[str] = None
- ) -> None:
- route = APIWebSocketRoute(
- self.prefix + path,
- endpoint=endpoint,
- name=name,
- dependency_overrides_provider=self.dependency_overrides_provider,
- )
- self.routes.append(route)
-
- def websocket(
- self, path: str, name: Optional[str] = None
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- def decorator(func: DecoratedCallable) -> DecoratedCallable:
- self.add_api_websocket_route(path, func, name=name)
- return func
-
- return decorator
-
- def include_router(
- self,
- router: "APIRouter",
- *,
- prefix: str = "",
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- default_response_class: Type[Response] = Default(JSONResponse),
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- deprecated: Optional[bool] = None,
- include_in_schema: bool = True,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> None:
- if prefix:
- assert prefix.startswith("/"), "A path prefix must start with '/'"
- assert not prefix.endswith(
- "/"
- ), "A path prefix must not end with '/', as the routes will start with '/'"
- else:
- for r in router.routes:
- path = getattr(r, "path")
- name = getattr(r, "name", "unknown")
- if path is not None and not path:
- raise Exception(
- f"Prefix and path cannot be both empty (path operation: {name})"
- )
- if responses is None:
- responses = {}
- for route in router.routes:
- if isinstance(route, APIRoute):
- combined_responses = {**responses, **route.responses}
- use_response_class = get_value_or_default(
- route.response_class,
- router.default_response_class,
- default_response_class,
- self.default_response_class,
- )
- current_tags = []
- if tags:
- current_tags.extend(tags)
- if route.tags:
- current_tags.extend(route.tags)
- current_dependencies: List[params.Depends] = []
- if dependencies:
- current_dependencies.extend(dependencies)
- if route.dependencies:
- current_dependencies.extend(route.dependencies)
- current_callbacks = []
- if callbacks:
- current_callbacks.extend(callbacks)
- if route.callbacks:
- current_callbacks.extend(route.callbacks)
- current_generate_unique_id = get_value_or_default(
- route.generate_unique_id_function,
- router.generate_unique_id_function,
- generate_unique_id_function,
- self.generate_unique_id_function,
- )
- self.add_api_route(
- prefix + route.path,
- route.endpoint,
- response_model=route.response_model,
- status_code=route.status_code,
- tags=current_tags,
- dependencies=current_dependencies,
- summary=route.summary,
- description=route.description,
- response_description=route.response_description,
- responses=combined_responses,
- deprecated=route.deprecated or deprecated or self.deprecated,
- methods=route.methods,
- operation_id=route.operation_id,
- response_model_include=route.response_model_include,
- response_model_exclude=route.response_model_exclude,
- response_model_by_alias=route.response_model_by_alias,
- response_model_exclude_unset=route.response_model_exclude_unset,
- response_model_exclude_defaults=route.response_model_exclude_defaults,
- response_model_exclude_none=route.response_model_exclude_none,
- include_in_schema=route.include_in_schema
- and self.include_in_schema
- and include_in_schema,
- response_class=use_response_class,
- name=route.name,
- route_class_override=type(route),
- callbacks=current_callbacks,
- openapi_extra=route.openapi_extra,
- generate_unique_id_function=current_generate_unique_id,
- )
- elif isinstance(route, routing.Route):
- methods = list(route.methods or [])
- self.add_route(
- prefix + route.path,
- route.endpoint,
- methods=methods,
- include_in_schema=route.include_in_schema,
- name=route.name,
- )
- elif isinstance(route, APIWebSocketRoute):
- self.add_api_websocket_route(
- prefix + route.path, route.endpoint, name=route.name
- )
- elif isinstance(route, routing.WebSocketRoute):
- self.add_websocket_route(
- prefix + route.path, route.endpoint, name=route.name
- )
- for handler in router.on_startup:
- self.add_event_handler("startup", handler)
- for handler in router.on_shutdown:
- self.add_event_handler("shutdown", handler)
-
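`include_router` merges prefixes, tags, dependencies, callbacks and responses from the including router into each re-registered route. A small sketch of how nested prefixes combine (names are illustrative):

```python
from fastapi import APIRouter, FastAPI

items = APIRouter()

@items.get("/", tags=["items"])
def list_items():
    return []

api = APIRouter(prefix="/api")
api.include_router(items, prefix="/items")

app = FastAPI()
app.include_router(api)
# The route is served at /api/items/ (router prefix + include prefix + path)
# and carries the "items" tag.
```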
- def get(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["GET"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def put(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["PUT"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def post(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["POST"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def delete(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["DELETE"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def options(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["OPTIONS"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def head(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["HEAD"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def patch(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["PATCH"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
-
- def trace(
- self,
- path: str,
- *,
- response_model: Any = None,
- status_code: Optional[int] = None,
- tags: Optional[List[Union[str, Enum]]] = None,
- dependencies: Optional[Sequence[params.Depends]] = None,
- summary: Optional[str] = None,
- description: Optional[str] = None,
- response_description: str = "Successful Response",
- responses: Optional[Dict[Union[int, str], Dict[str, Any]]] = None,
- deprecated: Optional[bool] = None,
- operation_id: Optional[str] = None,
- response_model_include: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_exclude: Optional[Union[SetIntStr, DictIntStrAny]] = None,
- response_model_by_alias: bool = True,
- response_model_exclude_unset: bool = False,
- response_model_exclude_defaults: bool = False,
- response_model_exclude_none: bool = False,
- include_in_schema: bool = True,
- response_class: Type[Response] = Default(JSONResponse),
- name: Optional[str] = None,
- callbacks: Optional[List[BaseRoute]] = None,
- openapi_extra: Optional[Dict[str, Any]] = None,
- generate_unique_id_function: Callable[[APIRoute], str] = Default(
- generate_unique_id
- ),
- ) -> Callable[[DecoratedCallable], DecoratedCallable]:
- return self.api_route(
- path=path,
- response_model=response_model,
- status_code=status_code,
- tags=tags,
- dependencies=dependencies,
- summary=summary,
- description=description,
- response_description=response_description,
- responses=responses,
- deprecated=deprecated,
- methods=["TRACE"],
- operation_id=operation_id,
- response_model_include=response_model_include,
- response_model_exclude=response_model_exclude,
- response_model_by_alias=response_model_by_alias,
- response_model_exclude_unset=response_model_exclude_unset,
- response_model_exclude_defaults=response_model_exclude_defaults,
- response_model_exclude_none=response_model_exclude_none,
- include_in_schema=include_in_schema,
- response_class=response_class,
- name=name,
- callbacks=callbacks,
- openapi_extra=openapi_extra,
- generate_unique_id_function=generate_unique_id_function,
- )
diff --git a/env/lib/python3.9/site-packages/fastapi/security/__init__.py b/env/lib/python3.9/site-packages/fastapi/security/__init__.py
deleted file mode 100644
index 3aa6bf2..0000000
--- a/env/lib/python3.9/site-packages/fastapi/security/__init__.py
+++ /dev/null
@@ -1,15 +0,0 @@
-from .api_key import APIKeyCookie as APIKeyCookie
-from .api_key import APIKeyHeader as APIKeyHeader
-from .api_key import APIKeyQuery as APIKeyQuery
-from .http import HTTPAuthorizationCredentials as HTTPAuthorizationCredentials
-from .http import HTTPBasic as HTTPBasic
-from .http import HTTPBasicCredentials as HTTPBasicCredentials
-from .http import HTTPBearer as HTTPBearer
-from .http import HTTPDigest as HTTPDigest
-from .oauth2 import OAuth2 as OAuth2
-from .oauth2 import OAuth2AuthorizationCodeBearer as OAuth2AuthorizationCodeBearer
-from .oauth2 import OAuth2PasswordBearer as OAuth2PasswordBearer
-from .oauth2 import OAuth2PasswordRequestForm as OAuth2PasswordRequestForm
-from .oauth2 import OAuth2PasswordRequestFormStrict as OAuth2PasswordRequestFormStrict
-from .oauth2 import SecurityScopes as SecurityScopes
-from .open_id_connect_url import OpenIdConnect as OpenIdConnect
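The module is purely a re-export surface; application code is expected to import the security schemes from `fastapi.security` directly, e.g.:

```python
from fastapi.security import APIKeyHeader, HTTPBasic, OAuth2PasswordBearer
```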
diff --git a/env/lib/python3.9/site-packages/fastapi/security/api_key.py b/env/lib/python3.9/site-packages/fastapi/security/api_key.py
deleted file mode 100644
index 36ab60e..0000000
--- a/env/lib/python3.9/site-packages/fastapi/security/api_key.py
+++ /dev/null
@@ -1,92 +0,0 @@
-from typing import Optional
-
-from fastapi.openapi.models import APIKey, APIKeyIn
-from fastapi.security.base import SecurityBase
-from starlette.exceptions import HTTPException
-from starlette.requests import Request
-from starlette.status import HTTP_403_FORBIDDEN
-
-
-class APIKeyBase(SecurityBase):
- pass
-
-
-class APIKeyQuery(APIKeyBase):
- def __init__(
- self,
- *,
- name: str,
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True
- ):
- self.model: APIKey = APIKey(
- **{"in": APIKeyIn.query}, name=name, description=description
- )
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(self, request: Request) -> Optional[str]:
- api_key: str = request.query_params.get(self.model.name)
- if not api_key:
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- return api_key
-
-
-class APIKeyHeader(APIKeyBase):
- def __init__(
- self,
- *,
- name: str,
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True
- ):
- self.model: APIKey = APIKey(
- **{"in": APIKeyIn.header}, name=name, description=description
- )
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(self, request: Request) -> Optional[str]:
- api_key: str = request.headers.get(self.model.name)
- if not api_key:
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- return api_key
-
-
-class APIKeyCookie(APIKeyBase):
- def __init__(
- self,
- *,
- name: str,
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True
- ):
- self.model: APIKey = APIKey(
- **{"in": APIKeyIn.cookie}, name=name, description=description
- )
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(self, request: Request) -> Optional[str]:
- api_key = request.cookies.get(self.model.name)
- if not api_key:
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- return api_key
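All three API-key classes follow the same pattern: read the key from its location, raise 403 when `auto_error` is set, otherwise return `None`. A hypothetical wiring of `APIKeyHeader` (the header name and secret value are made up):

```python
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import APIKeyHeader

app = FastAPI()
api_key_header = APIKeyHeader(name="X-API-Key")  # raises 403 if header absent

@app.get("/secure")
async def secure(api_key: str = Depends(api_key_header)):
    # In real code, look the key up in a store instead of comparing inline.
    if api_key != "expected-secret":
        raise HTTPException(status_code=403, detail="Invalid API key")
    return {"ok": True}
```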
diff --git a/env/lib/python3.9/site-packages/fastapi/security/base.py b/env/lib/python3.9/site-packages/fastapi/security/base.py
deleted file mode 100644
index c43555d..0000000
--- a/env/lib/python3.9/site-packages/fastapi/security/base.py
+++ /dev/null
@@ -1,6 +0,0 @@
-from fastapi.openapi.models import SecurityBase as SecurityBaseModel
-
-
-class SecurityBase:
- model: SecurityBaseModel
- scheme_name: str
diff --git a/env/lib/python3.9/site-packages/fastapi/security/http.py b/env/lib/python3.9/site-packages/fastapi/security/http.py
deleted file mode 100644
index 1b473c6..0000000
--- a/env/lib/python3.9/site-packages/fastapi/security/http.py
+++ /dev/null
@@ -1,165 +0,0 @@
-import binascii
-from base64 import b64decode
-from typing import Optional
-
-from fastapi.exceptions import HTTPException
-from fastapi.openapi.models import HTTPBase as HTTPBaseModel
-from fastapi.openapi.models import HTTPBearer as HTTPBearerModel
-from fastapi.security.base import SecurityBase
-from fastapi.security.utils import get_authorization_scheme_param
-from pydantic import BaseModel
-from starlette.requests import Request
-from starlette.status import HTTP_401_UNAUTHORIZED, HTTP_403_FORBIDDEN
-
-
-class HTTPBasicCredentials(BaseModel):
- username: str
- password: str
-
-
-class HTTPAuthorizationCredentials(BaseModel):
- scheme: str
- credentials: str
-
-
-class HTTPBase(SecurityBase):
- def __init__(
- self,
- *,
- scheme: str,
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True,
- ):
- self.model = HTTPBaseModel(scheme=scheme, description=description)
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(
- self, request: Request
- ) -> Optional[HTTPAuthorizationCredentials]:
- authorization: str = request.headers.get("Authorization")
- scheme, credentials = get_authorization_scheme_param(authorization)
- if not (authorization and scheme and credentials):
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- return HTTPAuthorizationCredentials(scheme=scheme, credentials=credentials)
-
-
-class HTTPBasic(HTTPBase):
- def __init__(
- self,
- *,
- scheme_name: Optional[str] = None,
- realm: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True,
- ):
- self.model = HTTPBaseModel(scheme="basic", description=description)
- self.scheme_name = scheme_name or self.__class__.__name__
- self.realm = realm
- self.auto_error = auto_error
-
- async def __call__( # type: ignore
- self, request: Request
- ) -> Optional[HTTPBasicCredentials]:
- authorization: str = request.headers.get("Authorization")
- scheme, param = get_authorization_scheme_param(authorization)
- if self.realm:
- unauthorized_headers = {"WWW-Authenticate": f'Basic realm="{self.realm}"'}
- else:
- unauthorized_headers = {"WWW-Authenticate": "Basic"}
- invalid_user_credentials_exc = HTTPException(
- status_code=HTTP_401_UNAUTHORIZED,
- detail="Invalid authentication credentials",
- headers=unauthorized_headers,
- )
- if not authorization or scheme.lower() != "basic":
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_401_UNAUTHORIZED,
- detail="Not authenticated",
- headers=unauthorized_headers,
- )
- else:
- return None
- try:
- data = b64decode(param).decode("ascii")
- except (ValueError, UnicodeDecodeError, binascii.Error):
- raise invalid_user_credentials_exc
- username, separator, password = data.partition(":")
- if not separator:
- raise invalid_user_credentials_exc
- return HTTPBasicCredentials(username=username, password=password)
-
-
-class HTTPBearer(HTTPBase):
- def __init__(
- self,
- *,
- bearerFormat: Optional[str] = None,
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True,
- ):
- self.model = HTTPBearerModel(bearerFormat=bearerFormat, description=description)
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(
- self, request: Request
- ) -> Optional[HTTPAuthorizationCredentials]:
- authorization: str = request.headers.get("Authorization")
- scheme, credentials = get_authorization_scheme_param(authorization)
- if not (authorization and scheme and credentials):
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- if scheme.lower() != "bearer":
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN,
- detail="Invalid authentication credentials",
- )
- else:
- return None
- return HTTPAuthorizationCredentials(scheme=scheme, credentials=credentials)
-
-
-class HTTPDigest(HTTPBase):
- def __init__(
- self,
- *,
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True,
- ):
- self.model = HTTPBaseModel(scheme="digest", description=description)
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(
- self, request: Request
- ) -> Optional[HTTPAuthorizationCredentials]:
- authorization: str = request.headers.get("Authorization")
- scheme, credentials = get_authorization_scheme_param(authorization)
- if not (authorization and scheme and credentials):
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- if scheme.lower() != "digest":
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN,
- detail="Invalid authentication credentials",
- )
- return HTTPAuthorizationCredentials(scheme=scheme, credentials=credentials)
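A sketch of using `HTTPBasic` above; the credentials are placeholders, and `secrets.compare_digest` is used to avoid timing side channels when comparing them:

```python
import secrets

from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPBasic, HTTPBasicCredentials

app = FastAPI()
security = HTTPBasic()

@app.get("/whoami")
def whoami(credentials: HTTPBasicCredentials = Depends(security)):
    correct_user = secrets.compare_digest(credentials.username, "alice")
    correct_pass = secrets.compare_digest(credentials.password, "wonderland")
    if not (correct_user and correct_pass):
        raise HTTPException(
            status_code=401,
            detail="Invalid authentication credentials",
            headers={"WWW-Authenticate": "Basic"},
        )
    return {"username": credentials.username}
```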
diff --git a/env/lib/python3.9/site-packages/fastapi/security/oauth2.py b/env/lib/python3.9/site-packages/fastapi/security/oauth2.py
deleted file mode 100644
index 653c301..0000000
--- a/env/lib/python3.9/site-packages/fastapi/security/oauth2.py
+++ /dev/null
@@ -1,220 +0,0 @@
-from typing import Any, Dict, List, Optional, Union
-
-from fastapi.exceptions import HTTPException
-from fastapi.openapi.models import OAuth2 as OAuth2Model
-from fastapi.openapi.models import OAuthFlows as OAuthFlowsModel
-from fastapi.param_functions import Form
-from fastapi.security.base import SecurityBase
-from fastapi.security.utils import get_authorization_scheme_param
-from starlette.requests import Request
-from starlette.status import HTTP_401_UNAUTHORIZED, HTTP_403_FORBIDDEN
-
-
-class OAuth2PasswordRequestForm:
- """
- This is a dependency class, use it like:
-
- @app.post("/login")
- def login(form_data: OAuth2PasswordRequestForm = Depends()):
-            print(form_data.username)
-            print(form_data.password)
-            for scope in form_data.scopes:
-                print(scope)
-            if form_data.client_id:
-                print(form_data.client_id)
-            if form_data.client_secret:
-                print(form_data.client_secret)
-            return form_data
-
-
- It creates the following Form request parameters in your endpoint:
-
- grant_type: the OAuth2 spec says it is required and MUST be the fixed string "password".
- Nevertheless, this dependency class is permissive and allows not passing it. If you want to enforce it,
- use instead the OAuth2PasswordRequestFormStrict dependency.
- username: username string. The OAuth2 spec requires the exact field name "username".
- password: password string. The OAuth2 spec requires the exact field name "password".
- scope: Optional string. Several scopes (each one a string) separated by spaces. E.g.
- "items:read items:write users:read profile openid"
- client_id: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
- using HTTP Basic auth, as: client_id:client_secret
- client_secret: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
- using HTTP Basic auth, as: client_id:client_secret
- """
-
- def __init__(
- self,
- grant_type: str = Form(default=None, regex="password"),
- username: str = Form(),
- password: str = Form(),
- scope: str = Form(default=""),
- client_id: Optional[str] = Form(default=None),
- client_secret: Optional[str] = Form(default=None),
- ):
- self.grant_type = grant_type
- self.username = username
- self.password = password
- self.scopes = scope.split()
- self.client_id = client_id
- self.client_secret = client_secret
-
-
-class OAuth2PasswordRequestFormStrict(OAuth2PasswordRequestForm):
- """
- This is a dependency class, use it like:
-
- @app.post("/login")
- def login(form_data: OAuth2PasswordRequestFormStrict = Depends()):
-            print(form_data.username)
-            print(form_data.password)
-            for scope in form_data.scopes:
-                print(scope)
-            if form_data.client_id:
-                print(form_data.client_id)
-            if form_data.client_secret:
-                print(form_data.client_secret)
-            return form_data
-
-
- It creates the following Form request parameters in your endpoint:
-
- grant_type: the OAuth2 spec says it is required and MUST be the fixed string "password".
- This dependency is strict about it. If you want to be permissive, use instead the
- OAuth2PasswordRequestForm dependency class.
- username: username string. The OAuth2 spec requires the exact field name "username".
- password: password string. The OAuth2 spec requires the exact field name "password".
- scope: Optional string. Several scopes (each one a string) separated by spaces. E.g.
- "items:read items:write users:read profile openid"
- client_id: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
- using HTTP Basic auth, as: client_id:client_secret
- client_secret: optional string. OAuth2 recommends sending the client_id and client_secret (if any)
- using HTTP Basic auth, as: client_id:client_secret
- """
-
- def __init__(
- self,
- grant_type: str = Form(regex="password"),
- username: str = Form(),
- password: str = Form(),
- scope: str = Form(default=""),
- client_id: Optional[str] = Form(default=None),
- client_secret: Optional[str] = Form(default=None),
- ):
- super().__init__(
- grant_type=grant_type,
- username=username,
- password=password,
- scope=scope,
- client_id=client_id,
- client_secret=client_secret,
- )
-
-
-class OAuth2(SecurityBase):
- def __init__(
- self,
- *,
- flows: Union[OAuthFlowsModel, Dict[str, Dict[str, Any]]] = OAuthFlowsModel(),
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True
- ):
- self.model = OAuth2Model(flows=flows, description=description)
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(self, request: Request) -> Optional[str]:
- authorization: str = request.headers.get("Authorization")
- if not authorization:
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- return authorization
-
-
-class OAuth2PasswordBearer(OAuth2):
- def __init__(
- self,
- tokenUrl: str,
- scheme_name: Optional[str] = None,
- scopes: Optional[Dict[str, str]] = None,
- description: Optional[str] = None,
- auto_error: bool = True,
- ):
- if not scopes:
- scopes = {}
- flows = OAuthFlowsModel(password={"tokenUrl": tokenUrl, "scopes": scopes})
- super().__init__(
- flows=flows,
- scheme_name=scheme_name,
- description=description,
- auto_error=auto_error,
- )
-
- async def __call__(self, request: Request) -> Optional[str]:
- authorization: str = request.headers.get("Authorization")
- scheme, param = get_authorization_scheme_param(authorization)
- if not authorization or scheme.lower() != "bearer":
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_401_UNAUTHORIZED,
- detail="Not authenticated",
- headers={"WWW-Authenticate": "Bearer"},
- )
- else:
- return None
- return param
-
-
-class OAuth2AuthorizationCodeBearer(OAuth2):
- def __init__(
- self,
- authorizationUrl: str,
- tokenUrl: str,
- refreshUrl: Optional[str] = None,
- scheme_name: Optional[str] = None,
- scopes: Optional[Dict[str, str]] = None,
- description: Optional[str] = None,
- auto_error: bool = True,
- ):
- if not scopes:
- scopes = {}
- flows = OAuthFlowsModel(
- authorizationCode={
- "authorizationUrl": authorizationUrl,
- "tokenUrl": tokenUrl,
- "refreshUrl": refreshUrl,
- "scopes": scopes,
- }
- )
- super().__init__(
- flows=flows,
- scheme_name=scheme_name,
- description=description,
- auto_error=auto_error,
- )
-
- async def __call__(self, request: Request) -> Optional[str]:
- authorization: str = request.headers.get("Authorization")
- scheme, param = get_authorization_scheme_param(authorization)
- if not authorization or scheme.lower() != "bearer":
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_401_UNAUTHORIZED,
- detail="Not authenticated",
- headers={"WWW-Authenticate": "Bearer"},
- )
- else:
- return None # pragma: nocover
- return param
-
-
-class SecurityScopes:
- def __init__(self, scopes: Optional[List[str]] = None):
- self.scopes = scopes or []
- self.scope_str = " ".join(self.scopes)
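Putting `OAuth2PasswordBearer` and `OAuth2PasswordRequestForm` together, the canonical password flow looks roughly like this; the token issuing and verification steps are placeholders:

```python
from fastapi import Depends, FastAPI
from fastapi.security import OAuth2PasswordBearer, OAuth2PasswordRequestForm

app = FastAPI()
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

@app.post("/token")
def login(form_data: OAuth2PasswordRequestForm = Depends()):
    # Placeholder: validate form_data.username / form_data.password here
    # and mint a real token.
    return {"access_token": form_data.username, "token_type": "bearer"}

@app.get("/items")
def read_items(token: str = Depends(oauth2_scheme)):
    # Placeholder: decode and verify the bearer token here.
    return {"token": token}
```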
diff --git a/env/lib/python3.9/site-packages/fastapi/security/open_id_connect_url.py b/env/lib/python3.9/site-packages/fastapi/security/open_id_connect_url.py
deleted file mode 100644
index dfe9f7b..0000000
--- a/env/lib/python3.9/site-packages/fastapi/security/open_id_connect_url.py
+++ /dev/null
@@ -1,34 +0,0 @@
-from typing import Optional
-
-from fastapi.openapi.models import OpenIdConnect as OpenIdConnectModel
-from fastapi.security.base import SecurityBase
-from starlette.exceptions import HTTPException
-from starlette.requests import Request
-from starlette.status import HTTP_403_FORBIDDEN
-
-
-class OpenIdConnect(SecurityBase):
- def __init__(
- self,
- *,
- openIdConnectUrl: str,
- scheme_name: Optional[str] = None,
- description: Optional[str] = None,
- auto_error: bool = True
- ):
- self.model = OpenIdConnectModel(
- openIdConnectUrl=openIdConnectUrl, description=description
- )
- self.scheme_name = scheme_name or self.__class__.__name__
- self.auto_error = auto_error
-
- async def __call__(self, request: Request) -> Optional[str]:
- authorization: str = request.headers.get("Authorization")
- if not authorization:
- if self.auto_error:
- raise HTTPException(
- status_code=HTTP_403_FORBIDDEN, detail="Not authenticated"
- )
- else:
- return None
- return authorization
diff --git a/env/lib/python3.9/site-packages/fastapi/security/utils.py b/env/lib/python3.9/site-packages/fastapi/security/utils.py
deleted file mode 100644
index 2da0dd2..0000000
--- a/env/lib/python3.9/site-packages/fastapi/security/utils.py
+++ /dev/null
@@ -1,8 +0,0 @@
-from typing import Tuple
-
-
-def get_authorization_scheme_param(authorization_header_value: str) -> Tuple[str, str]:
- if not authorization_header_value:
- return "", ""
- scheme, _, param = authorization_header_value.partition(" ")
- return scheme, param
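The helper's behaviour, built on `str.partition`, shown as assertions (assuming the module is importable as `fastapi.security.utils`):

```python
from fastapi.security.utils import get_authorization_scheme_param

assert get_authorization_scheme_param("Bearer abc.def") == ("Bearer", "abc.def")
assert get_authorization_scheme_param("Token") == ("Token", "")  # no parameter
assert get_authorization_scheme_param("") == ("", "")            # missing header
```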
diff --git a/env/lib/python3.9/site-packages/fastapi/staticfiles.py b/env/lib/python3.9/site-packages/fastapi/staticfiles.py
deleted file mode 100644
index 299015d..0000000
--- a/env/lib/python3.9/site-packages/fastapi/staticfiles.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.staticfiles import StaticFiles as StaticFiles # noqa
diff --git a/env/lib/python3.9/site-packages/fastapi/templating.py b/env/lib/python3.9/site-packages/fastapi/templating.py
deleted file mode 100644
index 0cb8684..0000000
--- a/env/lib/python3.9/site-packages/fastapi/templating.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.templating import Jinja2Templates as Jinja2Templates # noqa
diff --git a/env/lib/python3.9/site-packages/fastapi/testclient.py b/env/lib/python3.9/site-packages/fastapi/testclient.py
deleted file mode 100644
index 4012406..0000000
--- a/env/lib/python3.9/site-packages/fastapi/testclient.py
+++ /dev/null
@@ -1 +0,0 @@
-from starlette.testclient import TestClient as TestClient # noqa
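Thanks to this re-export, tests can import `TestClient` from fastapi rather than starlette; a minimal sketch:

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/ping")
def ping():
    return {"ping": "pong"}

client = TestClient(app)
assert client.get("/ping").json() == {"ping": "pong"}
```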
diff --git a/env/lib/python3.9/site-packages/fastapi/types.py b/env/lib/python3.9/site-packages/fastapi/types.py
deleted file mode 100644
index e0bca46..0000000
--- a/env/lib/python3.9/site-packages/fastapi/types.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from typing import Any, Callable, TypeVar
-
-DecoratedCallable = TypeVar("DecoratedCallable", bound=Callable[..., Any])
diff --git a/env/lib/python3.9/site-packages/fastapi/utils.py b/env/lib/python3.9/site-packages/fastapi/utils.py
deleted file mode 100644
index 89f5445..0000000
--- a/env/lib/python3.9/site-packages/fastapi/utils.py
+++ /dev/null
@@ -1,191 +0,0 @@
-import functools
-import re
-import warnings
-from dataclasses import is_dataclass
-from enum import Enum
-from typing import TYPE_CHECKING, Any, Dict, Optional, Set, Type, Union, cast
-
-import fastapi
-from fastapi.datastructures import DefaultPlaceholder, DefaultType
-from fastapi.openapi.constants import REF_PREFIX
-from pydantic import BaseConfig, BaseModel, create_model
-from pydantic.class_validators import Validator
-from pydantic.fields import FieldInfo, ModelField, UndefinedType
-from pydantic.schema import model_process_schema
-from pydantic.utils import lenient_issubclass
-
-if TYPE_CHECKING: # pragma: nocover
- from .routing import APIRoute
-
-
-def is_body_allowed_for_status_code(status_code: Union[int, str, None]) -> bool:
- if status_code is None:
- return True
- current_status_code = int(status_code)
- return not (current_status_code < 200 or current_status_code in {204, 304})
-
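The rule above encodes the HTTP cases that must not carry a body; as assertions:

```python
from fastapi.utils import is_body_allowed_for_status_code

assert is_body_allowed_for_status_code(None) is True   # unset: allow a body
assert is_body_allowed_for_status_code(200) is True
assert is_body_allowed_for_status_code(101) is False   # 1xx informational
assert is_body_allowed_for_status_code(204) is False   # No Content
assert is_body_allowed_for_status_code(304) is False   # Not Modified
```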
-
-def get_model_definitions(
- *,
- flat_models: Set[Union[Type[BaseModel], Type[Enum]]],
- model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str],
-) -> Dict[str, Any]:
- definitions: Dict[str, Dict[str, Any]] = {}
- for model in flat_models:
- m_schema, m_definitions, m_nested_models = model_process_schema(
- model, model_name_map=model_name_map, ref_prefix=REF_PREFIX
- )
- definitions.update(m_definitions)
- model_name = model_name_map[model]
- if "description" in m_schema:
- m_schema["description"] = m_schema["description"].split("\f")[0]
- definitions[model_name] = m_schema
- return definitions
-
-
-def get_path_param_names(path: str) -> Set[str]:
- return set(re.findall("{(.*?)}", path))
-
-
-def create_response_field(
- name: str,
- type_: Type[Any],
- class_validators: Optional[Dict[str, Validator]] = None,
- default: Optional[Any] = None,
- required: Union[bool, UndefinedType] = True,
- model_config: Type[BaseConfig] = BaseConfig,
- field_info: Optional[FieldInfo] = None,
- alias: Optional[str] = None,
-) -> ModelField:
- """
- Create a new response field. Raises if type_ is invalid.
- """
- class_validators = class_validators or {}
- field_info = field_info or FieldInfo()
-
- response_field = functools.partial(
- ModelField,
- name=name,
- type_=type_,
- class_validators=class_validators,
- default=default,
- required=required,
- model_config=model_config,
- alias=alias,
- )
-
- try:
- return response_field(field_info=field_info)
- except RuntimeError:
- raise fastapi.exceptions.FastAPIError(
- f"Invalid args for response field! Hint: check that {type_} is a valid pydantic field type"
- )
-
-
-def create_cloned_field(
- field: ModelField,
- *,
- cloned_types: Optional[Dict[Type[BaseModel], Type[BaseModel]]] = None,
-) -> ModelField:
- # cloned_types caches already cloned types, to support recursive models
- if cloned_types is None:
- cloned_types = {}
- original_type = field.type_
- if is_dataclass(original_type) and hasattr(original_type, "__pydantic_model__"):
- original_type = original_type.__pydantic_model__
- use_type = original_type
- if lenient_issubclass(original_type, BaseModel):
- original_type = cast(Type[BaseModel], original_type)
- use_type = cloned_types.get(original_type)
- if use_type is None:
- use_type = create_model(original_type.__name__, __base__=original_type)
- cloned_types[original_type] = use_type
- for f in original_type.__fields__.values():
- use_type.__fields__[f.name] = create_cloned_field(
- f, cloned_types=cloned_types
- )
- new_field = create_response_field(name=field.name, type_=use_type)
- new_field.has_alias = field.has_alias
- new_field.alias = field.alias
- new_field.class_validators = field.class_validators
- new_field.default = field.default
- new_field.required = field.required
- new_field.model_config = field.model_config
- new_field.field_info = field.field_info
- new_field.allow_none = field.allow_none
- new_field.validate_always = field.validate_always
- if field.sub_fields:
- new_field.sub_fields = [
- create_cloned_field(sub_field, cloned_types=cloned_types)
- for sub_field in field.sub_fields
- ]
- if field.key_field:
- new_field.key_field = create_cloned_field(
- field.key_field, cloned_types=cloned_types
- )
- new_field.validators = field.validators
- new_field.pre_validators = field.pre_validators
- new_field.post_validators = field.post_validators
- new_field.parse_json = field.parse_json
- new_field.shape = field.shape
- new_field.populate_validators()
- return new_field
-
-
-def generate_operation_id_for_path(
- *, name: str, path: str, method: str
-) -> str: # pragma: nocover
- warnings.warn(
- "fastapi.utils.generate_operation_id_for_path() was deprecated, "
- "it is not used internally, and will be removed soon",
- DeprecationWarning,
- stacklevel=2,
- )
- operation_id = name + path
- operation_id = re.sub(r"\W", "_", operation_id)
- operation_id = operation_id + "_" + method.lower()
- return operation_id
-
-
-def generate_unique_id(route: "APIRoute") -> str:
- operation_id = route.name + route.path_format
- operation_id = re.sub(r"\W", "_", operation_id)
- assert route.methods
- operation_id = operation_id + "_" + list(route.methods)[0].lower()
- return operation_id
-
-
-def deep_dict_update(main_dict: Dict[Any, Any], update_dict: Dict[Any, Any]) -> None:
- for key, value in update_dict.items():
- if (
- key in main_dict
- and isinstance(main_dict[key], dict)
- and isinstance(value, dict)
- ):
- deep_dict_update(main_dict[key], value)
- elif (
- key in main_dict
- and isinstance(main_dict[key], list)
- and isinstance(update_dict[key], list)
- ):
- main_dict[key] = main_dict[key] + update_dict[key]
- else:
- main_dict[key] = value
-
-
-def get_value_or_default(
- first_item: Union[DefaultPlaceholder, DefaultType],
- *extra_items: Union[DefaultPlaceholder, DefaultType],
-) -> Union[DefaultPlaceholder, DefaultType]:
- """
- Pass items or `DefaultPlaceholder`s by descending priority.
-
- The first one to _not_ be a `DefaultPlaceholder` will be returned.
-
- Otherwise, the first item (a `DefaultPlaceholder`) will be returned.
- """
- items = (first_item,) + extra_items
- for item in items:
- if not isinstance(item, DefaultPlaceholder):
- return item
- return first_item
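
Before the deletion above, `fastapi/utils.py` carried two small general-purpose helpers worth noting: `deep_dict_update` (recursive in-place dict merge that also concatenates lists) and `get_value_or_default` (pick the first non-placeholder argument). A minimal behavioral sketch — not part of the diff — assuming the `fastapi` package itself is installed at a version matching this vendored copy:

```python
from fastapi.datastructures import Default
from fastapi.utils import deep_dict_update, get_value_or_default

# Nested dicts are merged in place; lists under the same key are concatenated.
main = {"a": {"x": 1}, "tags": ["one"]}
deep_dict_update(main, {"a": {"y": 2}, "tags": ["two"]})
assert main == {"a": {"x": 1, "y": 2}, "tags": ["one", "two"]}

# The first argument that is not a DefaultPlaceholder wins; if every
# argument is a placeholder, the first placeholder is returned as-is.
assert get_value_or_default(Default("fallback"), "explicit") == "explicit"
assert get_value_or_default(Default("fallback")).value == "fallback"
```
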
diff --git a/env/lib/python3.9/site-packages/fastapi/websockets.py b/env/lib/python3.9/site-packages/fastapi/websockets.py
deleted file mode 100644
index 55a4ac4..0000000
--- a/env/lib/python3.9/site-packages/fastapi/websockets.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from starlette.websockets import WebSocket as WebSocket # noqa
-from starlette.websockets import WebSocketDisconnect as WebSocketDisconnect # noqa
-from starlette.websockets import WebSocketState as WebSocketState # noqa
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/AUTHORS b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/AUTHORS
deleted file mode 100644
index 42a5c22..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/AUTHORS
+++ /dev/null
@@ -1,51 +0,0 @@
-Original Authors
-----------------
-* Armin Rigo
-* Christian Tismer
-
-Contributors
-------------
-* Al Stone
-* Alexander Schmidt
-* Alexey Borzenkov
-* Andreas Schwab
-* Armin Ronacher
-* Bin Wang
-* Bob Ippolito
-* ChangBo Guo
-* Christoph Gohlke
-* Denis Bilenko
-* Dirk Mueller
-* Donovan Preston
-* Fantix King
-* Floris Bruynooghe
-* Fredrik Fornwall
-* Gerd Woetzel
-* Giel van Schijndel
-* Gökhan Karabulut
-* Gustavo Niemeyer
-* Guy Rozendorn
-* Hye-Shik Chang
-* Jared Kuolt
-* Jason Madden
-* Josh Snyder
-* Kyle Ambroff
-* Laszlo Boszormenyi
-* Mao Han
-* Marc Abramowitz
-* Marc Schlaich
-* Marcin Bachry
-* Matt Madison
-* Matt Turner
-* Michael Ellerman
-* Michael Matz
-* Ralf Schmitt
-* Robie Basak
-* Ronny Pfannschmidt
-* Samual M. Rushing
-* Tony Bowles
-* Tony Breeds
-* Trevor Bowen
-* Tulio Magno Quites Machado Filho
-* Ulrich Weigand
-* Victor Stinner
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/INSTALLER b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/INSTALLER
deleted file mode 100644
index a1b589e..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/INSTALLER
+++ /dev/null
@@ -1 +0,0 @@
-pip
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/LICENSE b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/LICENSE
deleted file mode 100644
index b73a4a1..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/LICENSE
+++ /dev/null
@@ -1,30 +0,0 @@
-The following files are derived from Stackless Python and are subject to the
-same license as Stackless Python:
-
- src/greenlet/slp_platformselect.h
- files in src/greenlet/platform/ directory
-
-See LICENSE.PSF and http://www.stackless.com/ for details.
-
-Unless otherwise noted, the files in greenlet have been released under the
-following MIT license:
-
-Copyright (c) Armin Rigo, Christian Tismer and contributors
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
-THE SOFTWARE.
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/LICENSE.PSF b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/LICENSE.PSF
deleted file mode 100644
index d3b509a..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/LICENSE.PSF
+++ /dev/null
@@ -1,47 +0,0 @@
-PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
---------------------------------------------
-
-1. This LICENSE AGREEMENT is between the Python Software Foundation
-("PSF"), and the Individual or Organization ("Licensee") accessing and
-otherwise using this software ("Python") in source or binary form and
-its associated documentation.
-
-2. Subject to the terms and conditions of this License Agreement, PSF hereby
-grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
-analyze, test, perform and/or display publicly, prepare derivative works,
-distribute, and otherwise use Python alone or in any derivative version,
-provided, however, that PSF's License Agreement and PSF's notice of copyright,
-i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
-2011 Python Software Foundation; All Rights Reserved" are retained in Python
-alone or in any derivative version prepared by Licensee.
-
-3. In the event Licensee prepares a derivative work that is based on
-or incorporates Python or any part thereof, and wants to make
-the derivative work available to others as provided herein, then
-Licensee hereby agrees to include in any such work a brief summary of
-the changes made to Python.
-
-4. PSF is making Python available to Licensee on an "AS IS"
-basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
-IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
-DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
-FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
-INFRINGE ANY THIRD PARTY RIGHTS.
-
-5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
-FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
-A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
-OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
-
-6. This License Agreement will automatically terminate upon a material
-breach of its terms and conditions.
-
-7. Nothing in this License Agreement shall be deemed to create any
-relationship of agency, partnership, or joint venture between PSF and
-Licensee. This License Agreement does not grant permission to use PSF
-trademarks or trade name in a trademark sense to endorse or promote
-products or services of Licensee, or any third party.
-
-8. By copying, installing or otherwise using Python, Licensee
-agrees to be bound by the terms and conditions of this License
-Agreement.
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/METADATA b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/METADATA
deleted file mode 100644
index 23b5f72..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/METADATA
+++ /dev/null
@@ -1,103 +0,0 @@
-Metadata-Version: 2.1
-Name: greenlet
-Version: 1.1.3
-Summary: Lightweight in-process concurrent programming
-Home-page: https://greenlet.readthedocs.io/
-Author: Alexey Borzenkov
-Author-email: snaury@gmail.com
-Maintainer: Jason Madden
-Maintainer-email: jason@nextthought.com
-License: MIT License
-Project-URL: Bug Tracker, https://github.com/python-greenlet/greenlet/issues
-Project-URL: Source Code, https://github.com/python-greenlet/greenlet/
-Project-URL: Documentation, https://greenlet.readthedocs.io/
-Keywords: greenlet coroutine concurrency threads cooperative
-Platform: any
-Classifier: Development Status :: 5 - Production/Stable
-Classifier: Intended Audience :: Developers
-Classifier: License :: OSI Approved :: MIT License
-Classifier: Natural Language :: English
-Classifier: Programming Language :: C
-Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 2
-Classifier: Programming Language :: Python :: 2.7
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.5
-Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Classifier: Programming Language :: Python :: 3.10
-Classifier: Programming Language :: Python :: 3.11
-Classifier: Operating System :: OS Independent
-Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Requires-Python: >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*
-Description-Content-Type: text/x-rst
-License-File: LICENSE
-License-File: LICENSE.PSF
-License-File: AUTHORS
-Provides-Extra: docs
-Requires-Dist: Sphinx ; extra == 'docs'
-Provides-Extra: test
-
-.. This file is included into docs/history.rst
-
-.. image:: https://github.com/python-greenlet/greenlet/workflows/tests/badge.svg
- :target: https://github.com/python-greenlet/greenlet/actions
-
-Greenlets are lightweight coroutines for in-process concurrent
-programming.
-
-The "greenlet" package is a spin-off of `Stackless`_, a version of
-CPython that supports micro-threads called "tasklets". Tasklets run
-pseudo-concurrently (typically in a single or a few OS-level threads)
-and are synchronized with data exchanges on "channels".
-
-A "greenlet", on the other hand, is a still more primitive notion of
-micro-thread with no implicit scheduling; coroutines, in other words.
-This is useful when you want to control exactly when your code runs.
-You can build custom scheduled micro-threads on top of greenlet;
-however, it seems that greenlets are useful on their own as a way to
-make advanced control flow structures. For example, we can recreate
-generators; the difference with Python's own generators is that our
-generators can call nested functions and the nested functions can
-yield values too. (Additionally, you don't need a "yield" keyword. See
-the example in `test_generator.py
-<https://github.com/python-greenlet/greenlet/blob/master/src/greenlet/tests/test_generator.py>`_).
-
-Greenlets are provided as a C extension module for the regular unmodified
-interpreter.
-
-.. _`Stackless`: http://www.stackless.com
-
-
-Who is using Greenlet?
-======================
-
-There are several libraries that use Greenlet as a more flexible
-alternative to Python's built-in coroutine support:
-
- - `Concurrence`_
- - `Eventlet`_
- - `Gevent`_
-
-.. _Concurrence: http://opensource.hyves.org/concurrence/
-.. _Eventlet: http://eventlet.net/
-.. _Gevent: http://www.gevent.org/
-
-Getting Greenlet
-================
-
-The easiest way to get Greenlet is to install it with pip::
-
- pip install greenlet
-
-
-Source code archives and binary distributions are available on the
-python package index at https://pypi.org/project/greenlet
-
-The source code repository is hosted on github:
-https://github.com/python-greenlet/greenlet
-
-Documentation is available on readthedocs.org:
-https://greenlet.readthedocs.io
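
The "generators whose nested functions can yield too" idea from the description above can be sketched in a few lines. This is a hedged, minimal version (not the actual `test_generator.py` code), assuming `greenlet` is installed and that iteration happens in the same greenlet that created the generator:

```python
from greenlet import getcurrent, greenlet

class Gen:
    """A tiny generator built on greenlets: emit() acts like `yield`,
    but may be called from arbitrarily nested function calls."""
    def __init__(self, func):
        self.consumer = getcurrent()              # greenlet that iterates us
        self.producer = greenlet(lambda: func(self.emit))

    def emit(self, value):
        self.consumer.switch(value)               # hand one value back

    def __iter__(self):
        return self

    def __next__(self):
        value = self.producer.switch()            # resume the producer
        if self.producer.dead:                    # func returned: exhausted
            raise StopIteration
        return value

def count(emit):
    def nested(n):
        emit(n)                                   # "yields" from a nested call
    for i in range(3):
        nested(i)

assert list(Gen(count)) == [0, 1, 2]
```
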
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/RECORD b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/RECORD
deleted file mode 100644
index e89aea7..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/RECORD
+++ /dev/null
@@ -1,71 +0,0 @@
-../../../include/site/python3.9/greenlet/greenlet.h,sha256=muQGuDPNWzBVjWoObFXddpDP_DLeE2GtdnF41cyYgy0,4648
-greenlet-1.1.3.dist-info/AUTHORS,sha256=swW28t2knVRxRkaEQNZtO7MP9Sgnompb7B6cNgJM8Gk,849
-greenlet-1.1.3.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
-greenlet-1.1.3.dist-info/LICENSE,sha256=dpgx1uXfrywggC-sz_H6-0wgJd2PYlPfpH_K1Z1NCXk,1434
-greenlet-1.1.3.dist-info/LICENSE.PSF,sha256=5f88I8EQ5JTNfXNsEP2W1GJFe6_soxCEDbZScpjH1Gs,2424
-greenlet-1.1.3.dist-info/METADATA,sha256=DQAWGnxur5YBtMAo1zxHoCE1xEBsLwFP-df-a0A7oWU,3930
-greenlet-1.1.3.dist-info/RECORD,,
-greenlet-1.1.3.dist-info/WHEEL,sha256=1aIKcOiZwCaey2y-OH_J-W7IhsRHMsIvvidnI8v_CqQ,110
-greenlet-1.1.3.dist-info/top_level.txt,sha256=YSnRsCRoO61JGlP57o8iKL6rdLWDWuiyKD8ekpWUsDc,9
-greenlet/__init__.py,sha256=f2pBI8kauTC7tFFi8r-JUUPXuthYvspSRCNENiqAH8k,1323
-greenlet/__pycache__/__init__.cpython-39.pyc,,
-greenlet/_greenlet.cpython-39-darwin.so,sha256=tw6Bh25bdHgBA5VpJYeLu3HPu6NM4JA_eDIQdzTub8c,60112
-greenlet/greenlet.c,sha256=tTKIwaPu9MhiGwhtlSkWWbSTXPbddgY-Xoq7xUfvfpA,67295
-greenlet/greenlet.h,sha256=muQGuDPNWzBVjWoObFXddpDP_DLeE2GtdnF41cyYgy0,4648
-greenlet/platform/setup_switch_x64_masm.cmd,sha256=ZpClUJeU0ujEPSTWNSepP0W2f9XiYQKA8QKSoVou8EU,143
-greenlet/platform/switch_aarch64_gcc.h,sha256=TRH22e9TNRA_mys8hhLbNwz3efZk7BtKZhyhK7ucgyM,2385
-greenlet/platform/switch_alpha_unix.h,sha256=T6kOBiHy3hLmy1vrmFrxbnOnRu0EJkoG_yuWy7fykZ4,689
-greenlet/platform/switch_amd64_unix.h,sha256=KWB4PB2wcAaWvWbMzcq8tYBe02vEGPBCRMnHnfeI7gE,2610
-greenlet/platform/switch_arm32_gcc.h,sha256=wflI2cGZBfLzM_GGgYx3OrFeoOq7OTsJP53dKLsrxS0,2488
-greenlet/platform/switch_arm32_ios.h,sha256=yQZXCa0AZbyAIS9tKceyTCrRYlihpFBKDbiPCn_3im0,1901
-greenlet/platform/switch_csky_gcc.h,sha256=GHlaVXrzQuSkrDqgL7-Ji9YwZnprpFhjPznNyp0NnvU,1340
-greenlet/platform/switch_m68k_gcc.h,sha256=VSa6NpZhvyyvF-Q58CTIWSpEDo4FKygOyTz00whctlw,928
-greenlet/platform/switch_mips_unix.h,sha256=9ptMGEBXafee15RxOm5NrxiC2bEnwM9AkxJ7ktVatU8,1444
-greenlet/platform/switch_ppc64_aix.h,sha256=ADpifLPlr6pTdT76bt6ozcqPjHrfPsJ93lQfc1VNaug,3878
-greenlet/platform/switch_ppc64_linux.h,sha256=jqPKpTg09FzmCn59Kt6OJi2-40aoazFVJcf1YETLlwA,3833
-greenlet/platform/switch_ppc_aix.h,sha256=nClVVlsRlFAI-I3fmivSJyJK7Xzx3_8l3Wf8QNJ9FMU,2959
-greenlet/platform/switch_ppc_linux.h,sha256=J4eKMA73WbPYSaq0yAedzHB6J6ZKE8tIIzkqYxlaA2c,2777
-greenlet/platform/switch_ppc_macosx.h,sha256=bnL2MqIUm9--NHizb5NYijvSrqutvuJx4auYCdqXllM,2642
-greenlet/platform/switch_ppc_unix.h,sha256=5UW9c71NGJh6xksEbAOButBFH168QRyZ5O53yXdXGxg,2670
-greenlet/platform/switch_riscv_unix.h,sha256=c3v3GRDMooslDKQLM75IqokWivtelbAj3-XZK31vWlE,758
-greenlet/platform/switch_s390_unix.h,sha256=9oJkYnyUovPvXOAsVLXoj-Unl_Rr_DidkXYMaRXLS0w,2781
-greenlet/platform/switch_sparc_sun_gcc.h,sha256=0vHXNNCdz-1ioQsw-OtK0ridnBVIzErYWiK7bBu6OgM,2815
-greenlet/platform/switch_x32_unix.h,sha256=ie7Nxo6Cf_x4UVOSA_a3bJYPlRKZ1BvLWsclyQle_SY,1527
-greenlet/platform/switch_x64_masm.asm,sha256=nu6n2sWyXuXfpPx40d9YmLfHXUc1sHgeTvX1kUzuvEM,1841
-greenlet/platform/switch_x64_masm.obj,sha256=GNtTNxYdo7idFUYsQv-mrXWgyT5EJ93-9q90lN6svtQ,1078
-greenlet/platform/switch_x64_msvc.h,sha256=LIeasyKo_vHzspdMzMHbosRhrBfKI4BkQOh4qcTHyJw,1805
-greenlet/platform/switch_x86_msvc.h,sha256=hi0dgp-k14IhMCxwtJtcI_ciPnMGd37uMnMaHaeQVWg,2481
-greenlet/platform/switch_x86_unix.h,sha256=WvY2sNMFIEfoFVNVakl-osygJui3pSnlVj5jBrdaU08,3068
-greenlet/slp_platformselect.h,sha256=-J5Px9Yk7Ths4hQTecC3iadxfte1CYaFoeqfg1lUl-A,3095
-greenlet/tests/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-greenlet/tests/__pycache__/__init__.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_contextvars.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_cpp.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_extension_interface.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_gc.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_generator.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_generator_nested.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_greenlet.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_leaks.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_stack_saved.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_throw.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_tracing.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_version.cpython-39.pyc,,
-greenlet/tests/__pycache__/test_weakref.cpython-39.pyc,,
-greenlet/tests/_test_extension.c,sha256=Tceb6kMFPSvAPW2LJ_zUlj--Wz_DtLzIPmgZcqkqAEU,5402
-greenlet/tests/_test_extension.cpython-39-darwin.so,sha256=dUh5SFVuTZ-kUlQZfcX8sT_mIb1D4a9SX1W5-cJ5wVs,51720
-greenlet/tests/_test_extension_cpp.cpp,sha256=zKfz0FxBXicq-53rItZ_NP8M406OBtyQFdH5bv_pRmk,3212
-greenlet/tests/_test_extension_cpp.cpython-39-darwin.so,sha256=acI6RvJ7_W41xwk3kpEt-Q1adXFM3QbaRmn2e55-qjY,51640
-greenlet/tests/test_contextvars.py,sha256=d69XSuRrdU80xAPmzdObLjrjXnbTQChG0MgsvBF_nGM,9205
-greenlet/tests/test_cpp.py,sha256=SXMuqsHTYTxFPBrasdbx5Sgplc89wvYEuPZvwafD-3k,488
-greenlet/tests/test_extension_interface.py,sha256=1FhUkxL-NrxmQV_sxUdlt8tvIWpDcGi27JcdQ6VyvFc,2521
-greenlet/tests/test_gc.py,sha256=oATPCmEAagdf1dZBYfZ0aiDklovLo_pQt5HZNTygCzk,2892
-greenlet/tests/test_generator.py,sha256=_MLDA1kBtZQR-9a74AOZZQECQCIFljMa7vbucE0cOxw,1280
-greenlet/tests/test_generator_nested.py,sha256=pGYRpNn_WjdhY_5ZHHBuBw10wskG_7mjJjR8IqleY3M,3579
-greenlet/tests/test_greenlet.py,sha256=SVDi0e1RrJtJhiOFggmoWTZL1sFdxRpdALFRCie-n60,23427
-greenlet/tests/test_leaks.py,sha256=STvFoZsFsZ_E24kYFaIASGBx97TRgTIur6uJXnoevWc,6677
-greenlet/tests/test_stack_saved.py,sha256=SyIHZycTBfm1TxFsq1VLCAgVm02t5GSke8tT28qwi7c,450
-greenlet/tests/test_throw.py,sha256=OOWfgcEaymvGVJQ3d4xDGzC5IVH0rZAiazWuyZV9270,2755
-greenlet/tests/test_tracing.py,sha256=hZ6Cl5NMq9IaeH7NGqWYl8aQ0_5nFUSYuo6TeSXvrKw,7455
-greenlet/tests/test_version.py,sha256=lHDe3qcLvfsOHcFKFW8yrcl5wBvy6UIxaNkZZzNlpHE,1229
-greenlet/tests/test_weakref.py,sha256=gqAQunjVzbwF6qEUZijhv6UqhH4apWNIRHeoWLUo9tM,884
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/WHEEL b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/WHEEL
deleted file mode 100644
index 331c029..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/WHEEL
+++ /dev/null
@@ -1,5 +0,0 @@
-Wheel-Version: 1.0
-Generator: bdist_wheel (0.37.1)
-Root-Is-Purelib: false
-Tag: cp39-cp39-macosx_10_15_x86_64
-
diff --git a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/top_level.txt b/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/top_level.txt
deleted file mode 100644
index 46725be..0000000
--- a/env/lib/python3.9/site-packages/greenlet-1.1.3.dist-info/top_level.txt
+++ /dev/null
@@ -1 +0,0 @@
-greenlet
diff --git a/env/lib/python3.9/site-packages/greenlet/__init__.py b/env/lib/python3.9/site-packages/greenlet/__init__.py
deleted file mode 100644
index 22db798..0000000
--- a/env/lib/python3.9/site-packages/greenlet/__init__.py
+++ /dev/null
@@ -1,63 +0,0 @@
-# -*- coding: utf-8 -*-
-"""
-The root of the greenlet package.
-"""
-from __future__ import absolute_import
-from __future__ import division
-from __future__ import print_function
-
-__all__ = [
- '__version__',
- '_C_API',
-
- 'GreenletExit',
- 'error',
-
- 'getcurrent',
- 'greenlet',
-
- 'gettrace',
- 'settrace',
-]
-
-# pylint:disable=no-name-in-module
-
-###
-# Metadata
-###
-__version__ = '1.1.3'
-from ._greenlet import _C_API # pylint:disable=no-name-in-module
-
-###
-# Exceptions
-###
-from ._greenlet import GreenletExit
-from ._greenlet import error
-
-###
-# greenlets
-###
-from ._greenlet import getcurrent
-from ._greenlet import greenlet
-
-###
-# tracing
-###
-try:
- from ._greenlet import gettrace
- from ._greenlet import settrace
-except ImportError:
- # Tracing wasn't supported.
- # XXX: The option to disable it was removed in 1.0,
- # so this branch should be dead code.
- pass
-
-###
-# Constants
-# These constants aren't documented and aren't recommended.
-# In 1.0, USE_GC and USE_TRACING are always true, and USE_CONTEXT_VARS
-# is the same as ``sys.version_info[:2] >= 3.7``
-###
-from ._greenlet import GREENLET_USE_CONTEXT_VARS # pylint:disable=unused-import
-from ._greenlet import GREENLET_USE_GC # pylint:disable=unused-import
-from ._greenlet import GREENLET_USE_TRACING # pylint:disable=unused-import
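
The `settrace`/`gettrace` pair re-exported above installs a per-thread callback fired on every greenlet switch; the C implementation later in this diff (`g_calltrace`) invokes it as `callback(event, (origin, target))`, where `event` is `"switch"` or `"throw"`. A minimal usage sketch, assuming `greenlet` is installed:

```python
import greenlet

def tracer(event, args):
    origin, target = args                  # the greenlets involved
    print(f"{event}: {origin!r} -> {target!r}")

previous = greenlet.settrace(tracer)       # returns the previously set tracer
try:
    child = greenlet.greenlet(lambda: None)
    child.switch()                         # one "switch" event each direction
finally:
    greenlet.settrace(previous)            # restore the prior tracer, if any
```
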
diff --git a/env/lib/python3.9/site-packages/greenlet/_greenlet.cpython-39-darwin.so b/env/lib/python3.9/site-packages/greenlet/_greenlet.cpython-39-darwin.so
deleted file mode 100755
index 4791a68..0000000
Binary files a/env/lib/python3.9/site-packages/greenlet/_greenlet.cpython-39-darwin.so and /dev/null differ
diff --git a/env/lib/python3.9/site-packages/greenlet/greenlet.c b/env/lib/python3.9/site-packages/greenlet/greenlet.c
deleted file mode 100644
index 2f3ad6e..0000000
--- a/env/lib/python3.9/site-packages/greenlet/greenlet.c
+++ /dev/null
@@ -1,2170 +0,0 @@
-/* -*- indent-tabs-mode: nil; tab-width: 4; -*- */
-/* Format with:
- * clang-format -i --style=file src/greenlet/greenlet.c
- *
- *
- * Fix missing braces with:
- * clang-tidy src/greenlet/greenlet.c -fix -checks="readability-braces-around-statements"
-*/
-#define GREENLET_MODULE
-
-#include "greenlet.h"
-
-#include "structmember.h"
-
-#ifdef __clang__
-# pragma clang diagnostic push
-# pragma clang diagnostic ignored "-Wunused-parameter"
-# pragma clang diagnostic ignored "-Wmissing-field-initializers"
-#endif
-
-/***********************************************************
-
-A PyGreenlet is a range of C stack addresses that must be
-saved and restored in such a way that the full range of the
-stack contains valid data when we switch to it.
-
-Stack layout for a greenlet:
-
- | ^^^ |
- | older data |
- | |
- stack_stop . |_______________|
- . | |
- . | greenlet data |
- . | in stack |
- . * |_______________| . . _____________ stack_copy + stack_saved
- . | | | |
- . | data | |greenlet data|
- . | unrelated | | saved |
- . | to | | in heap |
- stack_start . | this | . . |_____________| stack_copy
- | greenlet |
- | |
- | newer data |
- | vvv |
-
-
-Note that a greenlet's stack data is typically partly at its correct
-place in the stack, and partly saved away in the heap, but always in
-the above configuration: two blocks, the more recent one in the heap
-and the older one still in the stack (either block may be empty).
-
-Greenlets are chained: each points to the previous greenlet, which is
-the one that owns the data currently in the C stack above my
-stack_stop. The currently running greenlet is the first element of
-this chain. The main (initial) greenlet is the last one. Greenlets
-whose stack is entirely in the heap can be skipped from the chain.
-
-The chain is not related to execution order, but only to the order
-in which bits of C stack happen to belong to greenlets at a particular
-point in time.
-
-The main greenlet doesn't have a stack_stop: it is responsible for the
-complete rest of the C stack, and we don't know where it begins. We
-use (char*) -1, the largest possible address.
-
-States:
- stack_stop == NULL && stack_start == NULL: did not start yet
- stack_stop != NULL && stack_start == NULL: already finished
- stack_stop != NULL && stack_start != NULL: active
-
-The running greenlet's stack_start is undefined but not NULL.
-
- ***********************************************************/
-
-/*** global state ***/
-
-/* In the presence of multithreading, this is a bit tricky:
-
- - ts_current always stores a reference to a greenlet, but it is
- not really the current greenlet after a thread switch occurred.
-
- - each *running* greenlet uses its run_info field to know which
- thread it is attached to. A greenlet can only run in the thread
- where it was created. This run_info is a ref to tstate->dict.
-
- - the thread state dict is used to save and restore ts_current,
- using the dictionary key 'ts_curkey'.
-*/
-
-extern PyTypeObject PyGreenlet_Type;
-
-#if PY_VERSION_HEX >= 0x030700A3
-# define GREENLET_PY37 1
-#else
-# define GREENLET_PY37 0
-#endif
-
-#if PY_VERSION_HEX >= 0x30A00B1
-/*
-Python 3.10 beta 1 changed tstate->use_tracing to a nested cframe member.
-See https://github.com/python/cpython/pull/25276
-We have to save and restore this as well.
-*/
-#define TSTATE_USE_TRACING(tstate) (tstate->cframe->use_tracing)
-#define GREENLET_USE_CFRAME 1
-#else
-#define TSTATE_USE_TRACING(tstate) (tstate->use_tracing)
-#define GREENLET_USE_CFRAME 0
-#endif
-
-#ifndef Py_SET_REFCNT
-/* Py_REFCNT and Py_SIZE macros are converted to functions
-https://bugs.python.org/issue39573 */
-# define Py_SET_REFCNT(obj, refcnt) Py_REFCNT(obj) = (refcnt)
-#endif
-
-#ifndef _Py_DEC_REFTOTAL
-/* _Py_DEC_REFTOTAL macro has been removed from Python 3.9 by:
- https://github.com/python/cpython/commit/49932fec62c616ec88da52642339d83ae719e924
-*/
-# ifdef Py_REF_DEBUG
-# define _Py_DEC_REFTOTAL _Py_RefTotal--
-# else
-# define _Py_DEC_REFTOTAL
-# endif
-#endif
-
-/* Weak reference to the switching-to greenlet during the slp switch */
-static PyGreenlet* volatile ts_target = NULL;
-/* Strong reference to the switching from greenlet after the switch */
-static PyGreenlet* volatile ts_origin = NULL;
-/* Strong reference to the current greenlet in this thread state */
-static PyGreenlet* volatile ts_current = NULL;
-/* NULL if error, otherwise args tuple to pass around during slp switch */
-static PyObject* volatile ts_passaround_args = NULL;
-static PyObject* volatile ts_passaround_kwargs = NULL;
-
-/* Used internally in ``g_switchstack()`` */
-#if GREENLET_USE_CFRAME
-static int volatile ts__g_switchstack_use_tracing = 0;
-#endif
-
-/***********************************************************/
-/* Thread-aware routines, switching global variables when needed */
-
-#define STATE_OK \
- (ts_current->run_info == PyThreadState_GET()->dict || \
- !green_updatecurrent())
-
-static PyObject* ts_curkey;
-static PyObject* ts_delkey;
-static PyObject* ts_tracekey;
-static PyObject* ts_event_switch;
-static PyObject* ts_event_throw;
-static PyObject* PyExc_GreenletError;
-static PyObject* PyExc_GreenletExit;
-static PyObject* ts_empty_tuple;
-static PyObject* ts_empty_dict;
-
-#define GREENLET_GC_FLAGS Py_TPFLAGS_HAVE_GC
-#define GREENLET_tp_alloc PyType_GenericAlloc
-#define GREENLET_tp_free PyObject_GC_Del
-#define GREENLET_tp_traverse green_traverse
-#define GREENLET_tp_clear green_clear
-#define GREENLET_tp_is_gc green_is_gc
-
-static void
-green_clear_exc(PyGreenlet* g)
-{
-#if GREENLET_PY37
- g->exc_info = NULL;
- g->exc_state.exc_value = NULL;
-#if !GREENLET_PY311
- g->exc_state.exc_type = NULL;
- g->exc_state.exc_traceback = NULL;
-#endif
- g->exc_state.previous_item = NULL;
-#else
- g->exc_type = NULL;
- g->exc_value = NULL;
- g->exc_traceback = NULL;
-#endif
-}
-
-static PyGreenlet*
-green_create_main(void)
-{
- PyGreenlet* gmain;
- PyObject* dict = PyThreadState_GetDict();
- if (dict == NULL) {
- if (!PyErr_Occurred()) {
- PyErr_NoMemory();
- }
- return NULL;
- }
-
- /* create the main greenlet for this thread */
- gmain = (PyGreenlet*)PyType_GenericAlloc(&PyGreenlet_Type, 0);
- if (gmain == NULL) {
- return NULL;
- }
- gmain->stack_start = (char*)1;
- gmain->stack_stop = (char*)-1;
- /* GetDict() returns a borrowed reference. Make it strong. */
- gmain->run_info = dict;
- Py_INCREF(dict);
- return gmain;
-}
-
-static int
-green_updatecurrent(void)
-{
- PyObject *exc, *val, *tb;
- PyThreadState* tstate;
- PyGreenlet* current;
- PyGreenlet* previous;
- PyObject* deleteme;
-
-green_updatecurrent_restart:
- /* save current exception */
- PyErr_Fetch(&exc, &val, &tb);
-
- /* get ts_current from the active tstate */
- tstate = PyThreadState_GET();
- if (tstate->dict &&
- (current = (PyGreenlet*)PyDict_GetItem(tstate->dict, ts_curkey))) {
- /* found -- remove it, to avoid keeping a ref */
- Py_INCREF(current);
- PyDict_DelItem(tstate->dict, ts_curkey);
- }
- else {
- /* first time we see this tstate */
- current = green_create_main();
- if (current == NULL) {
- Py_XDECREF(exc);
- Py_XDECREF(val);
- Py_XDECREF(tb);
- return -1;
- }
- }
- assert(current->run_info == tstate->dict);
-
-green_updatecurrent_retry:
- /* update ts_current as soon as possible, in case of nested switches */
- Py_INCREF(current);
- previous = ts_current;
- ts_current = current;
-
- /* save ts_current as the current greenlet of its own thread */
- if (PyDict_SetItem(previous->run_info, ts_curkey, (PyObject*)previous)) {
- Py_DECREF(previous);
- Py_DECREF(current);
- Py_XDECREF(exc);
- Py_XDECREF(val);
- Py_XDECREF(tb);
- return -1;
- }
- Py_DECREF(previous);
-
- /* green_dealloc() cannot delete greenlets from other threads, so
- it stores them in the thread dict; delete them now. */
- deleteme = PyDict_GetItem(tstate->dict, ts_delkey);
- if (deleteme != NULL) {
- /* The only reference to these greenlets should be in this list, so
- clearing the list should let them be deleted again, triggering
- calls to green_dealloc() in the correct thread. This may run
- arbitrary Python code?
- */
- PyList_SetSlice(deleteme, 0, INT_MAX, NULL);
- }
-
- if (ts_current != current) {
- /* some Python code executed above and there was a thread switch,
- * so ts_current points to some other thread again. We need to
- * delete ts_curkey (it's likely there) and retry. */
- PyDict_DelItem(tstate->dict, ts_curkey);
- goto green_updatecurrent_retry;
- }
-
- /* release an extra reference */
- Py_DECREF(current);
- /* restore current exception */
- PyErr_Restore(exc, val, tb);
-
- /* thread switch could happen during PyErr_Restore, in that
- case there's nothing to do except restart from scratch. */
- if (ts_current->run_info != tstate->dict) {
- goto green_updatecurrent_restart;
- }
- return 0;
-}
-
-static PyObject*
-green_statedict(PyGreenlet* g)
-{
- while (!PyGreenlet_STARTED(g)) {
- g = g->parent;
- if (g == NULL) {
- /* garbage collected greenlet in chain */
- return NULL;
- }
- }
- return g->run_info;
-}
-
-/***********************************************************/
-
-/* Some functions must not be inlined:
- * slp_restore_state, when inlined into slp_switch might cause
- it to restore stack over its own local variables
- * slp_save_state, when inlined would add its own local
- variables to the saved stack, wasting space
- * slp_switch, cannot be inlined for obvious reasons
- * g_initialstub, when inlined would receive a pointer into its
- own stack frame, leading to incomplete stack save/restore
-*/
-
-#if defined(__GNUC__) && \
- (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4))
-# define GREENLET_NOINLINE_SUPPORTED
-# define GREENLET_NOINLINE(name) __attribute__((noinline)) name
-#elif defined(_MSC_VER) && (_MSC_VER >= 1300)
-# define GREENLET_NOINLINE_SUPPORTED
-# define GREENLET_NOINLINE(name) __declspec(noinline) name
-#endif
-
-#ifdef GREENLET_NOINLINE_SUPPORTED
-/* add forward declarations */
-static void GREENLET_NOINLINE(slp_restore_state)(void);
-static int GREENLET_NOINLINE(slp_save_state)(char*);
-# if !(defined(MS_WIN64) && defined(_M_X64))
-static int GREENLET_NOINLINE(slp_switch)(void);
-# endif
-static int GREENLET_NOINLINE(g_initialstub)(void*);
-# define GREENLET_NOINLINE_INIT() \
- do { \
- } while (0)
-#else
-/* force compiler to call functions via pointers */
-static void (*slp_restore_state)(void);
-static int (*slp_save_state)(char*);
-static int (*slp_switch)(void);
-static int (*g_initialstub)(void*);
-# define GREENLET_NOINLINE(name) cannot_inline_##name
-# define GREENLET_NOINLINE_INIT() \
- do { \
- slp_restore_state = GREENLET_NOINLINE(slp_restore_state); \
- slp_save_state = GREENLET_NOINLINE(slp_save_state); \
- slp_switch = GREENLET_NOINLINE(slp_switch); \
- g_initialstub = GREENLET_NOINLINE(g_initialstub); \
- } while (0)
-#endif
-
-/*
- * the following macros are spliced into the OS/compiler
- * specific code, in order to simplify maintenance.
- */
-
-#define SLP_SAVE_STATE(stackref, stsizediff) \
- stackref += STACK_MAGIC; \
- if (slp_save_state((char*)stackref)) \
- return -1; \
- if (!PyGreenlet_ACTIVE(ts_target)) \
- return 1; \
- stsizediff = ts_target->stack_start - (char*)stackref
-
-#define SLP_RESTORE_STATE() slp_restore_state()
-
-#define SLP_EVAL
-#define slp_switch GREENLET_NOINLINE(slp_switch)
-#include "slp_platformselect.h"
-#undef slp_switch
-
-#ifndef STACK_MAGIC
-# error \
- "greenlet needs to be ported to this platform, or taught how to detect your compiler properly."
-#endif /* !STACK_MAGIC */
-
-#ifdef EXTERNAL_ASM
-/* CCP addition: Make these functions, to be called from assembler.
- * The token include file for the given platform should enable the
- * EXTERNAL_ASM define so that this is included.
- */
-
-intptr_t
-slp_save_state_asm(intptr_t* ref)
-{
- intptr_t diff;
- SLP_SAVE_STATE(ref, diff);
- return diff;
-}
-
-void
-slp_restore_state_asm(void)
-{
- SLP_RESTORE_STATE();
-}
-
-extern int
-slp_switch(void);
-
-#endif
-
-/***********************************************************/
-
-static int
-g_save(PyGreenlet* g, char* stop)
-{
- /* Save more of g's stack into the heap -- at least up to 'stop'
-
- g->stack_stop |________|
- | |
- | __ stop . . . . .
- | | ==> . .
- |________| _______
- | | | |
- | | | |
- g->stack_start | | |_______| g->stack_copy
-
- */
- intptr_t sz1 = g->stack_saved;
- intptr_t sz2 = stop - g->stack_start;
- assert(g->stack_start != NULL);
- if (sz2 > sz1) {
- char* c = (char*)PyMem_Realloc(g->stack_copy, sz2);
- if (!c) {
- PyErr_NoMemory();
- return -1;
- }
- memcpy(c + sz1, g->stack_start + sz1, sz2 - sz1);
- g->stack_copy = c;
- g->stack_saved = sz2;
- }
- return 0;
-}
-
-static void GREENLET_NOINLINE(slp_restore_state)(void)
-{
- PyGreenlet* g = ts_target;
- PyGreenlet* owner = ts_current;
-
-#ifdef SLP_BEFORE_RESTORE_STATE
- SLP_BEFORE_RESTORE_STATE();
-#endif
-
- /* Restore the heap copy back into the C stack */
- if (g->stack_saved != 0) {
- memcpy(g->stack_start, g->stack_copy, g->stack_saved);
- PyMem_Free(g->stack_copy);
- g->stack_copy = NULL;
- g->stack_saved = 0;
- }
- if (owner->stack_start == NULL) {
- owner = owner->stack_prev; /* greenlet is dying, skip it */
- }
- while (owner && owner->stack_stop <= g->stack_stop) {
- owner = owner->stack_prev; /* find greenlet with more stack */
- }
- g->stack_prev = owner;
-}
-
-static int GREENLET_NOINLINE(slp_save_state)(char* stackref)
-{
- /* must free all the C stack up to target_stop */
- char* target_stop = ts_target->stack_stop;
- PyGreenlet* owner = ts_current;
- assert(owner->stack_saved == 0);
- if (owner->stack_start == NULL) {
- owner = owner->stack_prev; /* not saved if dying */
- }
- else {
- owner->stack_start = stackref;
- }
-
-#ifdef SLP_BEFORE_SAVE_STATE
- SLP_BEFORE_SAVE_STATE();
-#endif
-
- while (owner->stack_stop < target_stop) {
- /* ts_current is entirely within the area to free */
- if (g_save(owner, owner->stack_stop)) {
- return -1; /* XXX */
- }
- owner = owner->stack_prev;
- }
- if (owner != ts_target) {
- if (g_save(owner, target_stop)) {
- return -1; /* XXX */
- }
- }
- return 0;
-}
-
-/**
- Perform a stack switch according to some global variables
- that must be set before calling this function. Those variables
- are:
-
- - ts_current: current greenlet (holds a reference)
- - ts_target: greenlet to switch to (weak reference)
- - ts_passaround_args: NULL if PyErr_Occurred(),
- else a tuple of args sent to ts_target (holds a reference)
- - ts_passaround_kwargs: switch kwargs (holds a reference)
-
- Because the stack switch happens in this function, this function can't use
- its own stack (local) variables, set before the switch, and then accessed after the
- switch. Global variables beginning with ``ts__g_switchstack`` are used
- internally instead.
-
- On return results are passed via global variables as well:
-
- - ts_origin: originating greenlet (holds a reference)
- - ts_current: current greenlet (holds a reference)
- - ts_passaround_args: NULL if PyErr_Occurred(),
- else a tuple of args sent to ts_current (holds a reference)
- - ts_passaround_kwargs: switch kwargs (holds a reference)
-
- It is very important that stack switch is 'atomic', i.e. no
- calls into other Python code allowed (except very few that
- are safe), because global variables are very fragile.
-*/
-static int
-g_switchstack(void)
-{
- int err;
- { /* save state */
- PyGreenlet* current = ts_current;
- PyThreadState* tstate = PyThreadState_GET();
-#if GREENLET_PY311
- current->recursion_depth = (tstate->recursion_limit
- - tstate->recursion_remaining);
-#else
- current->recursion_depth = tstate->recursion_depth;
- current->top_frame = tstate->frame;
-#endif
-#if GREENLET_PY37
- current->context = tstate->context;
-#endif
-#if GREENLET_PY37
- current->exc_info = tstate->exc_info;
- current->exc_state = tstate->exc_state;
-#else
- current->exc_type = tstate->exc_type;
- current->exc_value = tstate->exc_value;
- current->exc_traceback = tstate->exc_traceback;
-#endif
-#if GREENLET_USE_CFRAME
- /*
- IMPORTANT: ``cframe`` is a pointer into the STACK.
- Thus, because the call to ``slp_switch()``
- changes the contents of the stack, you cannot read from
- ``ts_current->cframe`` after that call and necessarily
- get the same values you get from reading it here. Anything
- you need to restore from now to then must be saved
- in a global variable (because we can't use stack variables
- here either).
- */
- current->cframe = tstate->cframe;
- ts__g_switchstack_use_tracing = tstate->cframe->use_tracing;
-#if GREENLET_PY311
- current->current_frame = tstate->cframe->current_frame;
- current->datastack_chunk = tstate->datastack_chunk;
- current->datastack_top = tstate->datastack_top;
- current->datastack_limit = tstate->datastack_limit;
- PyFrameObject *frame = PyThreadState_GetFrame(tstate);
- Py_XDECREF(frame); /* PyThreadState_GetFrame gives us a new reference. */
- current->top_frame = frame;
-#endif
-#endif
- }
-
- err = slp_switch();
-
- if (err < 0) { /* error */
- PyGreenlet* current = ts_current;
- current->top_frame = NULL;
-#if GREENLET_PY37
- green_clear_exc(current);
-#else
- current->exc_type = NULL;
- current->exc_value = NULL;
- current->exc_traceback = NULL;
-#endif
-
- assert(ts_origin == NULL);
- ts_target = NULL;
- }
- else {
- PyGreenlet* target = ts_target;
- PyGreenlet* origin = ts_current;
- PyThreadState* tstate = PyThreadState_GET();
-
-#if GREENLET_PY37
- tstate->context = target->context;
- target->context = NULL;
- /* Incrementing this value invalidates the contextvars cache,
- which would otherwise remain valid across switches */
- tstate->context_ver++;
-#endif
-
-#if GREENLET_PY37
- tstate->exc_state = target->exc_state;
- tstate->exc_info =
- target->exc_info ? target->exc_info : &tstate->exc_state;
-#else
- tstate->exc_type = target->exc_type;
- tstate->exc_value = target->exc_value;
- tstate->exc_traceback = target->exc_traceback;
-#endif
- green_clear_exc(target);
-
-#if GREENLET_USE_CFRAME
- tstate->cframe = target->cframe;
- /*
- If we were tracing, we need to keep tracing.
- There should never be the possibility of hitting the
- root_cframe here. See note above about why we can't
- just copy this from ``origin->cframe->use_tracing``.
- */
- tstate->cframe->use_tracing = ts__g_switchstack_use_tracing;
-#endif
-#if GREENLET_PY311
- tstate->recursion_remaining = (tstate->recursion_limit
- - target->recursion_depth);
- tstate->cframe->current_frame = target->current_frame;
- tstate->datastack_chunk = target->datastack_chunk;
- tstate->datastack_top = target->datastack_top;
- tstate->datastack_limit = target->datastack_limit;
-#else
- tstate->recursion_depth = target->recursion_depth;
- tstate->frame = target->top_frame;
-#endif
- target->top_frame = NULL;
- assert(ts_origin == NULL);
- Py_INCREF(target);
- ts_current = target;
- ts_origin = origin;
- ts_target = NULL;
- }
- return err;
-}
-
-static int
-g_calltrace(PyObject* tracefunc, PyObject* event, PyGreenlet* origin,
- PyGreenlet* target)
-{
- PyObject* retval;
- PyObject *exc_type, *exc_val, *exc_tb;
- PyThreadState* tstate;
- PyErr_Fetch(&exc_type, &exc_val, &exc_tb);
- tstate = PyThreadState_GET();
- tstate->tracing++;
- TSTATE_USE_TRACING(tstate) = 0;
- retval = PyObject_CallFunction(tracefunc, "O(OO)", event, origin, target);
- tstate->tracing--;
- TSTATE_USE_TRACING(tstate) =
- (tstate->tracing <= 0 &&
- ((tstate->c_tracefunc != NULL) || (tstate->c_profilefunc != NULL)));
- if (retval == NULL) {
- /* In case of exceptions trace function is removed */
- if (PyDict_GetItem(tstate->dict, ts_tracekey)) {
- PyDict_DelItem(tstate->dict, ts_tracekey);
- }
- Py_XDECREF(exc_type);
- Py_XDECREF(exc_val);
- Py_XDECREF(exc_tb);
- return -1;
- }
- else {
- Py_DECREF(retval);
- }
- PyErr_Restore(exc_type, exc_val, exc_tb);
- return 0;
-}
-
-static PyObject*
-g_switch(PyGreenlet* target, PyObject* args, PyObject* kwargs)
-{
- /* _consumes_ a reference to the args tuple and kwargs dict,
- and returns a new tuple reference */
- int err = 0;
- PyObject* run_info;
-
- /* check ts_current */
- if (!STATE_OK) {
- Py_XDECREF(args);
- Py_XDECREF(kwargs);
- return NULL;
- }
- run_info = green_statedict(target);
- if (run_info == NULL || run_info != ts_current->run_info) {
- Py_XDECREF(args);
- Py_XDECREF(kwargs);
- PyErr_SetString(PyExc_GreenletError,
- run_info ?
- "cannot switch to a different thread" :
- "cannot switch to a garbage collected greenlet");
- return NULL;
- }
-
- ts_passaround_args = args;
- ts_passaround_kwargs = kwargs;
-
- /* find the real target by ignoring dead greenlets,
- and if necessary starting a greenlet. */
- while (target) {
- if (PyGreenlet_ACTIVE(target)) {
- ts_target = target;
- err = g_switchstack();
- break;
- }
- if (!PyGreenlet_STARTED(target)) {
- void* dummymarker;
- ts_target = target;
- err = g_initialstub(&dummymarker);
- if (err == 1) {
- continue; /* retry the switch */
- }
- break;
- }
- target = target->parent;
- }
-
- /* For a very short time, immediately after the 'atomic'
- g_switchstack() call, global variables are in a known state.
- We need to save everything we need, before it is destroyed
- by calls into arbitrary Python code. */
- args = ts_passaround_args;
- ts_passaround_args = NULL;
- kwargs = ts_passaround_kwargs;
- ts_passaround_kwargs = NULL;
- if (err < 0) {
- /* Turn switch errors into switch throws */
- assert(ts_origin == NULL);
- Py_CLEAR(kwargs);
- Py_CLEAR(args);
- }
- else {
- PyGreenlet* origin;
- PyGreenlet* current;
- PyObject* tracefunc;
- origin = ts_origin;
- ts_origin = NULL;
- current = ts_current;
- if ((tracefunc = PyDict_GetItem(current->run_info, ts_tracekey)) != NULL) {
- Py_INCREF(tracefunc);
- if (g_calltrace(tracefunc,
- args ? ts_event_switch : ts_event_throw,
- origin,
- current) < 0) {
- /* Turn trace errors into switch throws */
- Py_CLEAR(kwargs);
- Py_CLEAR(args);
- }
- Py_DECREF(tracefunc);
- }
-
- Py_DECREF(origin);
- }
-
- /* We need to figure out what values to pass to the target greenlet
- based on the arguments that have been passed to greenlet.switch(). If
- switch() was just passed an arg tuple, then we'll just return that.
- If only keyword arguments were passed, then we'll pass the keyword
- argument dict. Otherwise, we'll create a tuple of (args, kwargs) and
- return both. */
- if (kwargs == NULL) {
- return args;
- }
- else if (PyDict_Size(kwargs) == 0) {
- Py_DECREF(kwargs);
- return args;
- }
- else if (PySequence_Length(args) == 0) {
- Py_DECREF(args);
- return kwargs;
- }
- else {
- PyObject* tuple = PyTuple_New(2);
- if (tuple == NULL) {
- Py_DECREF(args);
- Py_DECREF(kwargs);
- return NULL;
- }
- PyTuple_SET_ITEM(tuple, 0, args);
- PyTuple_SET_ITEM(tuple, 1, kwargs);
- return tuple;
- }
-}
-
-static PyObject*
-g_handle_exit(PyObject* result)
-{
- if (result == NULL && PyErr_ExceptionMatches(PyExc_GreenletExit)) {
- /* catch and ignore GreenletExit */
- PyObject *exc, *val, *tb;
- PyErr_Fetch(&exc, &val, &tb);
- if (val == NULL) {
- Py_INCREF(Py_None);
- val = Py_None;
- }
- result = val;
- Py_DECREF(exc);
- Py_XDECREF(tb);
- }
- if (result != NULL) {
- /* package the result into a 1-tuple */
- PyObject* r = result;
- result = PyTuple_New(1);
- if (result) {
- PyTuple_SET_ITEM(result, 0, r);
- }
- else {
- Py_DECREF(r);
- }
- }
- return result;
-}
-
-static int GREENLET_NOINLINE(g_initialstub)(void* mark)
-{
- int err;
- PyObject *o, *run;
- PyObject *exc, *val, *tb;
- PyObject* run_info;
- PyGreenlet* self = ts_target;
- PyObject* args = ts_passaround_args;
- PyObject* kwargs = ts_passaround_kwargs;
-#if GREENLET_USE_CFRAME
- /*
- See green_new(). This is a stack-allocated variable used
- while *self* is in PyObject_Call().
- We want to defer copying the state info until we're sure
- we need it and are in a stable place to do so.
- */
- _PyCFrame trace_info;
-#endif
- /* save exception in case getattr clears it */
- PyErr_Fetch(&exc, &val, &tb);
- /* self.run is the object to call in the new greenlet */
- run = PyObject_GetAttrString((PyObject*)self, "run");
- if (run == NULL) {
- Py_XDECREF(exc);
- Py_XDECREF(val);
- Py_XDECREF(tb);
- return -1;
- }
- /* restore saved exception */
- PyErr_Restore(exc, val, tb);
-
- /* recheck the state in case getattr caused thread switches */
- if (!STATE_OK) {
- Py_DECREF(run);
- return -1;
- }
-
- /* recheck run_info in case greenlet reparented anywhere above */
- run_info = green_statedict(self);
- if (run_info == NULL || run_info != ts_current->run_info) {
- Py_DECREF(run);
- PyErr_SetString(PyExc_GreenletError,
- run_info ?
- "cannot switch to a different thread" :
- "cannot switch to a garbage collected greenlet");
- return -1;
- }
-
- /* by the time we get here, another start could already have
- * happened elsewhere; in that case this should now be a regular switch
- */
- if (PyGreenlet_STARTED(self)) {
- Py_DECREF(run);
- ts_passaround_args = args;
- ts_passaround_kwargs = kwargs;
- return 1;
- }
-
-#if GREENLET_USE_CFRAME
- /* OK, we need it, we're about to switch greenlets, save the state. */
- trace_info = *PyThreadState_GET()->cframe;
- /* Make the target greenlet refer to the stack value. */
- self->cframe = &trace_info;
- /*
- And restore the link to the previous frame so this one gets
- unlinked appropriately.
- */
- self->cframe->previous = &PyThreadState_GET()->root_cframe;
-#endif
- /* start the greenlet */
- self->stack_start = NULL;
- self->stack_stop = (char*)mark;
- if (ts_current->stack_start == NULL) {
- /* ts_current is dying */
- self->stack_prev = ts_current->stack_prev;
- }
- else {
- self->stack_prev = ts_current;
- }
- self->top_frame = NULL;
- green_clear_exc(self);
-#if GREENLET_PY311
- self->recursion_depth = (PyThreadState_GET()->recursion_limit
- - PyThreadState_GET()->recursion_remaining);
-#else
- self->recursion_depth = PyThreadState_GET()->recursion_depth;
-#endif
-
- /* restore arguments in case they are clobbered */
- ts_target = self;
- ts_passaround_args = args;
- ts_passaround_kwargs = kwargs;
-
- /* perform the initial switch */
- err = g_switchstack();
-
- /* returns twice!
- The 1st time with ``err == 1``: we are in the new greenlet
- The 2nd time with ``err <= 0``: back in the caller's greenlet
- */
- if (err == 1) {
- /* in the new greenlet */
- PyGreenlet* origin;
- PyObject* tracefunc;
- PyObject* result;
- PyGreenlet* parent;
- self->stack_start = (char*)1; /* running */
-
- /* grab origin while we still can */
- origin = ts_origin;
- ts_origin = NULL;
-
- /* now use run_info to store the statedict */
- o = self->run_info;
- self->run_info = green_statedict(self->parent);
- Py_INCREF(self->run_info);
- Py_XDECREF(o);
-
- if ((tracefunc = PyDict_GetItem(self->run_info, ts_tracekey)) != NULL) {
- Py_INCREF(tracefunc);
- if (g_calltrace(tracefunc,
- args ? ts_event_switch : ts_event_throw,
- origin,
- self) < 0) {
- /* Turn trace errors into switch throws */
- Py_CLEAR(kwargs);
- Py_CLEAR(args);
- }
- Py_DECREF(tracefunc);
- }
-
- Py_DECREF(origin);
-
- if (args == NULL) {
- /* pending exception */
- result = NULL;
- }
- else {
- /* call g.run(*args, **kwargs) */
- result = PyObject_Call(run, args, kwargs);
- Py_DECREF(args);
- Py_XDECREF(kwargs);
- }
- Py_DECREF(run);
- result = g_handle_exit(result);
-
- /* jump back to parent */
- self->stack_start = NULL; /* dead */
- for (parent = self->parent; parent != NULL; parent = parent->parent) {
- result = g_switch(parent, result, NULL);
- /* Return here means switch to parent failed,
- * in which case we throw *current* exception
- * to the next parent in chain.
- */
- assert(result == NULL);
- }
- /* We ran out of parents, cannot continue */
- PyErr_WriteUnraisable((PyObject*)self);
- Py_FatalError("greenlets cannot continue");
- }
- /* back in the parent */
- if (err < 0) {
- /* start failed badly, restore greenlet state */
- self->stack_start = NULL;
- self->stack_stop = NULL;
- self->stack_prev = NULL;
- }
- return err;
-}
-
-/***********************************************************/
-
-static PyObject*
-green_new(PyTypeObject* type, PyObject* args, PyObject* kwds)
-{
- PyObject* o =
- PyBaseObject_Type.tp_new(type, ts_empty_tuple, ts_empty_dict);
- if (o != NULL) {
- if (!STATE_OK) {
- Py_DECREF(o);
- return NULL;
- }
- Py_INCREF(ts_current);
- ((PyGreenlet*)o)->parent = ts_current;
-#if GREENLET_USE_CFRAME
- /*
- The PyThreadState->cframe pointer usually points to memory on the
- stack, allocated in a call into PyEval_EvalFrameDefault.
-
- Initially, before any evaluation begins, it points to the initial
- PyThreadState object's ``root_cframe`` object, which is statically
- allocated for the lifetime of the thread.
-
- A greenlet can last for longer than a call to
- PyEval_EvalFrameDefault, so we can't set its ``cframe`` pointer to
- be the current ``PyThreadState->cframe``; nor could we use one from
- the greenlet parent for the same reason. Yet a further no: we can't
- allocate one scoped to the greenlet and then destroy it when the
- greenlet is deallocated, because inside the interpreter the CFrame
- objects form a linked list, and that too can result in accessing
- memory beyond its dynamic lifetime (if the greenlet doesn't actually
- finish before it dies, its entry could still be in the list).
-
- Using the ``root_cframe`` is problematic, though, because its
- members are never modified by the interpreter and are set to 0,
- meaning that its ``use_tracing`` flag is never updated. We don't
- want to modify that value in the ``root_cframe`` ourself: it
- *shouldn't* matter much because we should probably never get back to
- the point where that's the only cframe on the stack; even if it did
- matter, the major consequence of an incorrect value for
- ``use_tracing`` is that if it's true the interpreter does some extra
- work --- however, it's just good code hygiene.
-
- Our solution: before a greenlet runs, after its initial creation,
- it uses the ``root_cframe`` just to have something to put there.
- However, once the greenlet is actually switched to for the first
- time, ``g_initialstub`` (which doesn't actually "return" while the
- greenlet is running) stores a new _PyCFrame on its local stack, and
- copies the appropriate values from the currently running CFrame;
- this is then made the _PyCFrame for the newly-minted greenlet.
- ``g_initialstub`` then proceeds to call ``glet.run()``, which
- results in ``PyEval_...`` adding the _PyCFrame to the list. Switches
- continue as normal. Finally, when the greenlet finishes, the call to
- ``glet.run()`` returns and the _PyCFrame is taken out of the linked
- list and the stack value is now unused and free to expire.
- */
- ((PyGreenlet*)o)->cframe = &PyThreadState_GET()->root_cframe;
-#endif
- }
- return o;
-}
-
-static int
-green_setrun(PyGreenlet* self, PyObject* nrun, void* c);
-static int
-green_setparent(PyGreenlet* self, PyObject* nparent, void* c);
-
-static int
-green_init(PyGreenlet* self, PyObject* args, PyObject* kwargs)
-{
- PyObject* run = NULL;
- PyObject* nparent = NULL;
- static char* kwlist[] = {"run", "parent", 0};
- if (!PyArg_ParseTupleAndKeywords(
- args, kwargs, "|OO:green", kwlist, &run, &nparent)) {
- return -1;
- }
-
- if (run != NULL) {
- if (green_setrun(self, run, NULL)) {
- return -1;
- }
- }
- if (nparent != NULL && nparent != Py_None) {
- return green_setparent(self, nparent, NULL);
- }
- return 0;
-}
-
-static int
-kill_greenlet(PyGreenlet* self)
-{
- /* Cannot raise an exception to kill the greenlet if
- it is not running in the same thread! */
- if (self->run_info == PyThreadState_GET()->dict) {
- /* The dying greenlet cannot be a parent of ts_current
- because the 'parent' field chain would hold a
- reference */
- PyObject* result;
- PyGreenlet* oldparent;
- PyGreenlet* tmp;
- if (!STATE_OK) {
- return -1;
- }
- oldparent = self->parent;
- self->parent = ts_current;
- Py_INCREF(self->parent);
- /* Send the greenlet a GreenletExit exception. */
- PyErr_SetNone(PyExc_GreenletExit);
- result = g_switch(self, NULL, NULL);
- tmp = self->parent;
- self->parent = oldparent;
- Py_XDECREF(tmp);
- if (result == NULL) {
- return -1;
- }
- Py_DECREF(result);
- return 0;
- }
- else {
- /* Not the same thread! Temporarily save the greenlet
- into its thread's ts_delkey list. */
- PyObject* lst;
- lst = PyDict_GetItem(self->run_info, ts_delkey);
- if (lst == NULL) {
- lst = PyList_New(0);
- if (lst == NULL
- || PyDict_SetItem(self->run_info, ts_delkey, lst) < 0) {
- return -1;
- }
- /* PyDict_SetItem now holds a strong reference. PyList_New also
- returned a fresh reference. We need to DECREF it now and let
- the dictionary keep sole ownership. From now on, we're working
- with a borrowed reference that will go away when the thread
- dies. */
- Py_DECREF(lst);
- }
- if (PyList_Append(lst, (PyObject*)self) < 0) {
- return -1;
- }
- if (!STATE_OK) { /* to force ts_delkey to be reconsidered */
- return -1;
- }
- return 0;
- }
-}
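-
-/* Note on the cross-thread branch above (an inference from this file,
- not a documented contract): greenlets parked under ts_delkey are not
- killed here; they are killed later, on their owning thread, the next
- time that thread's greenlet state is examined. Raising GreenletExit
- directly across threads is never attempted. */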
-
-static int
-green_traverse(PyGreenlet* self, visitproc visit, void* arg)
-{
- /* We must only visit referenced objects, i.e. only objects
- Py_INCREF'ed by this greenlet (directly or indirectly):
- - stack_prev is not visited: holds previous stack pointer, but it's not
- referenced
- - frames are not visited: alive greenlets are not garbage collected
- anyway */
- Py_VISIT((PyObject*)self->parent);
- Py_VISIT(self->run_info);
-#if GREENLET_PY37
- Py_VISIT(self->context);
-#endif
-#if GREENLET_PY37
- Py_VISIT(self->exc_state.exc_value);
-#if !GREENLET_PY311
- Py_VISIT(self->exc_state.exc_type);
- Py_VISIT(self->exc_state.exc_traceback);
-#endif
-#else
- Py_VISIT(self->exc_type);
- Py_VISIT(self->exc_value);
- Py_VISIT(self->exc_traceback);
-#endif
- Py_VISIT(self->dict);
- return 0;
-}
-
-static int
-green_is_gc(PyGreenlet* self)
-{
- /* The main greenlet can be garbage collected, since it can only
- become unreachable if the underlying thread exited.
- An active greenlet, however, cannot be garbage collected. */
- if (PyGreenlet_MAIN(self) || !PyGreenlet_ACTIVE(self)) {
- return 1;
- }
- return 0;
-}
-
-static int
-green_clear(PyGreenlet* self)
-{
- /* A greenlet is only cleared if it is about to be collected.
- Since active greenlets are not garbage collectable, we can
- be sure that, even if they are deallocated during clear,
- nothing they reference is in the unreachable set or among the
- finalizers, so even if it switches we are relatively safe. */
- Py_CLEAR(self->parent);
- Py_CLEAR(self->run_info);
-#if GREENLET_PY37
- Py_CLEAR(self->context);
-#endif
-#if GREENLET_PY37
- Py_CLEAR(self->exc_state.exc_value);
-#if !GREENLET_PY311
- Py_CLEAR(self->exc_state.exc_type);
- Py_CLEAR(self->exc_state.exc_traceback);
-#endif
-#else
- Py_CLEAR(self->exc_type);
- Py_CLEAR(self->exc_value);
- Py_CLEAR(self->exc_traceback);
-#endif
- Py_CLEAR(self->dict);
- return 0;
-}
-
-static void
-green_dealloc(PyGreenlet* self)
-{
- PyObject *error_type, *error_value, *error_traceback;
- Py_ssize_t refcnt;
-
- PyObject_GC_UnTrack(self);
-
- if (PyGreenlet_ACTIVE(self) && self->run_info != NULL &&
- !PyGreenlet_MAIN(self)) {
- /* Hacks hacks hacks copied from instance_dealloc() */
- /* Temporarily resurrect the greenlet. */
- assert(Py_REFCNT(self) == 0);
- Py_SET_REFCNT(self, 1);
- /* Save the current exception, if any. */
- PyErr_Fetch(&error_type, &error_value, &error_traceback);
- if (kill_greenlet(self) < 0) {
- PyErr_WriteUnraisable((PyObject*)self);
- /* XXX what else should we do? */
- }
- /* The check for resurrection must be done while we still hold
- * our internal reference; otherwise PyFile_WriteObject
- * causes recursion when using Py_INCREF/Py_DECREF
- */
- if (Py_REFCNT(self) == 1 && PyGreenlet_ACTIVE(self)) {
- /* Not resurrected, but still not dead!
- XXX what else should we do? we complain. */
- PyObject* f = PySys_GetObject("stderr");
- Py_INCREF(self); /* leak! */
- if (f != NULL) {
- PyFile_WriteString("GreenletExit did not kill ", f);
- PyFile_WriteObject((PyObject*)self, f, 0);
- PyFile_WriteString("\n", f);
- }
- }
- /* Restore the saved exception. */
- PyErr_Restore(error_type, error_value, error_traceback);
- /* Undo the temporary resurrection; can't use DECREF here,
- * it would cause a recursive call.
- */
- assert(Py_REFCNT(self) > 0);
-
- refcnt = Py_REFCNT(self) - 1;
- Py_SET_REFCNT(self, refcnt);
- if (refcnt != 0) {
- /* Resurrected! */
- _Py_NewReference((PyObject*)self);
- Py_SET_REFCNT(self, refcnt);
- /* Better to use tp_finalizer slot (PEP 442)
- * and call ``PyObject_CallFinalizerFromDealloc``,
- * but that's only supported in Python 3.4+; see
- * Modules/_io/iobase.c for an example.
- *
- * The following approach is copied from iobase.c in CPython 2.7.
- * (along with much of this function in general). Here's their
- * comment:
- *
- * When called from a heap type's dealloc, the type will be
- * decref'ed on return (see e.g. subtype_dealloc in typeobject.c). */
- if (PyType_HasFeature(Py_TYPE(self), Py_TPFLAGS_HEAPTYPE)) {
- Py_INCREF(Py_TYPE(self));
- }
-
- PyObject_GC_Track((PyObject*)self);
-
- _Py_DEC_REFTOTAL;
-#ifdef COUNT_ALLOCS
- --Py_TYPE(self)->tp_frees;
- --Py_TYPE(self)->tp_allocs;
-#endif /* COUNT_ALLOCS */
- return;
- }
- }
- if (self->weakreflist != NULL) {
- PyObject_ClearWeakRefs((PyObject*)self);
- }
- Py_CLEAR(self->parent);
- Py_CLEAR(self->run_info);
-#if GREENLET_PY37
- Py_CLEAR(self->context);
-#endif
-#if GREENLET_PY37
- Py_CLEAR(self->exc_state.exc_value);
-#if !GREENLET_PY311
- Py_CLEAR(self->exc_state.exc_type);
- Py_CLEAR(self->exc_state.exc_traceback);
-#endif
-#else
- Py_CLEAR(self->exc_type);
- Py_CLEAR(self->exc_value);
- Py_CLEAR(self->exc_traceback);
-#endif
- Py_CLEAR(self->dict);
- Py_TYPE(self)->tp_free((PyObject*)self);
-}
-
-static PyObject*
-single_result(PyObject* results)
-{
- if (results != NULL && PyTuple_Check(results) &&
- PyTuple_GET_SIZE(results) == 1) {
- PyObject* result = PyTuple_GET_ITEM(results, 0);
- Py_INCREF(result);
- Py_DECREF(results);
- return result;
- }
- else {
- return results;
- }
-}
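-
-/* (Why unwrap one-element tuples: per green_switch_doc below, switch()
- returns the bare value when exactly one argument is given, and the
- whole args tuple when several are; this helper applies that rule on
- every switch path.) */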
-
-static PyObject*
-throw_greenlet(PyGreenlet* self, PyObject* typ, PyObject* val, PyObject* tb)
-{
- /* Note: _consumes_ a reference to typ, val, tb */
- PyObject* result = NULL;
- PyErr_Restore(typ, val, tb);
- if (PyGreenlet_STARTED(self) && !PyGreenlet_ACTIVE(self)) {
- /* dead greenlet: turn GreenletExit into a regular return */
- result = g_handle_exit(result);
- }
- return single_result(g_switch(self, result, NULL));
-}
-
-PyDoc_STRVAR(
- green_switch_doc,
- "switch(*args, **kwargs)\n"
- "\n"
- "Switch execution to this greenlet.\n"
- "\n"
- "If this greenlet has never been run, then this greenlet\n"
- "will be switched to using the body of ``self.run(*args, **kwargs)``.\n"
- "\n"
- "If the greenlet is active (has been run, but was switch()'ed\n"
- "out before leaving its run function), then this greenlet will\n"
- "be resumed and the return value to its switch call will be\n"
- "None if no arguments are given, the given argument if one\n"
- "argument is given, or the args tuple and keyword args dict if\n"
- "multiple arguments are given.\n"
- "\n"
- "If the greenlet is dead, or is the current greenlet then this\n"
- "function will simply return the arguments using the same rules as\n"
- "above.\n");
-
-static PyObject*
-green_switch(PyGreenlet* self, PyObject* args, PyObject* kwargs)
-{
- Py_INCREF(args);
- Py_XINCREF(kwargs);
- return single_result(g_switch(self, args, kwargs));
-}
-
-PyDoc_STRVAR(
- green_throw_doc,
- "Switches execution to this greenlet, but immediately raises the\n"
- "given exception in this greenlet. If no argument is provided, the "
- "exception\n"
- "defaults to `greenlet.GreenletExit`. The normal exception\n"
- "propagation rules apply, as described for `switch`. Note that calling "
- "this\n"
- "method is almost equivalent to the following::\n"
- "\n"
- " def raiser():\n"
- " raise typ, val, tb\n"
- " g_raiser = greenlet(raiser, parent=g)\n"
- " g_raiser.switch()\n"
- "\n"
- "except that this trick does not work for the\n"
- "`greenlet.GreenletExit` exception, which would not propagate\n"
- "from ``g_raiser`` to ``g``.\n");
-
-static PyObject*
-green_throw(PyGreenlet* self, PyObject* args)
-{
- PyObject* typ = PyExc_GreenletExit;
- PyObject* val = NULL;
- PyObject* tb = NULL;
-
- if (!PyArg_ParseTuple(args, "|OOO:throw", &typ, &val, &tb)) {
- return NULL;
- }
-
- /* First, check the traceback argument, replacing None, with NULL */
- if (tb == Py_None) {
- tb = NULL;
- }
- else if (tb != NULL && !PyTraceBack_Check(tb)) {
- PyErr_SetString(PyExc_TypeError,
- "throw() third argument must be a traceback object");
- return NULL;
- }
-
- Py_INCREF(typ);
- Py_XINCREF(val);
- Py_XINCREF(tb);
-
- if (PyExceptionClass_Check(typ)) {
- PyErr_NormalizeException(&typ, &val, &tb);
- }
- else if (PyExceptionInstance_Check(typ)) {
- /* Raising an instance. The value should be a dummy. */
- if (val && val != Py_None) {
- PyErr_SetString(
- PyExc_TypeError,
- "instance exception may not have a separate value");
- goto failed_throw;
- }
- else {
- /* Normalize to raise <class>, <instance> */
- Py_XDECREF(val);
- val = typ;
- typ = PyExceptionInstance_Class(typ);
- Py_INCREF(typ);
- }
- }
- else {
- /* Not something you can raise. throw() fails. */
- PyErr_Format(PyExc_TypeError,
- "exceptions must be classes, or instances, not %s",
- Py_TYPE(typ)->tp_name);
- goto failed_throw;
- }
-
- return throw_greenlet(self, typ, val, tb);
-
-failed_throw:
- /* Didn't use our arguments, so restore their original refcounts */
- Py_DECREF(typ);
- Py_XDECREF(val);
- Py_XDECREF(tb);
- return NULL;
-}
-
-static int
-green_bool(PyGreenlet* self)
-{
- return PyGreenlet_ACTIVE(self);
-}
-
-static PyObject*
-green_getdict(PyGreenlet* self, void* c)
-{
- if (self->dict == NULL) {
- self->dict = PyDict_New();
- if (self->dict == NULL) {
- return NULL;
- }
- }
- Py_INCREF(self->dict);
- return self->dict;
-}
-
-static int
-green_setdict(PyGreenlet* self, PyObject* val, void* c)
-{
- PyObject* tmp;
-
- if (val == NULL) {
- PyErr_SetString(PyExc_TypeError, "__dict__ may not be deleted");
- return -1;
- }
- if (!PyDict_Check(val)) {
- PyErr_SetString(PyExc_TypeError, "__dict__ must be a dictionary");
- return -1;
- }
- tmp = self->dict;
- Py_INCREF(val);
- self->dict = val;
- Py_XDECREF(tmp);
- return 0;
-}
-
-static int
-_green_not_dead(PyGreenlet* self)
-{
- return PyGreenlet_ACTIVE(self) || !PyGreenlet_STARTED(self);
-}
-
-
-static PyObject*
-green_getdead(PyGreenlet* self, void* c)
-{
- if (_green_not_dead(self)) {
- Py_RETURN_FALSE;
- }
- else {
- Py_RETURN_TRUE;
- }
-}
-
-static PyObject*
-green_get_stack_saved(PyGreenlet* self, void* c)
-{
- return PyLong_FromSsize_t(self->stack_saved);
-}
-
-static PyObject*
-green_getrun(PyGreenlet* self, void* c)
-{
- if (PyGreenlet_STARTED(self) || self->run_info == NULL) {
- PyErr_SetString(PyExc_AttributeError, "run");
- return NULL;
- }
- Py_INCREF(self->run_info);
- return self->run_info;
-}
-
-static int
-green_setrun(PyGreenlet* self, PyObject* nrun, void* c)
-{
- PyObject* o;
- if (PyGreenlet_STARTED(self)) {
- PyErr_SetString(PyExc_AttributeError,
- "run cannot be set "
- "after the start of the greenlet");
- return -1;
- }
- o = self->run_info;
- self->run_info = nrun;
- Py_XINCREF(nrun);
- Py_XDECREF(o);
- return 0;
-}
-
-static PyObject*
-green_getparent(PyGreenlet* self, void* c)
-{
- PyObject* result = self->parent ? (PyObject*)self->parent : Py_None;
- Py_INCREF(result);
- return result;
-}
-
-static int
-green_setparent(PyGreenlet* self, PyObject* nparent, void* c)
-{
- PyGreenlet* p;
- PyObject* run_info = NULL;
- if (nparent == NULL) {
- PyErr_SetString(PyExc_AttributeError, "can't delete attribute");
- return -1;
- }
- if (!PyGreenlet_Check(nparent)) {
- PyErr_SetString(PyExc_TypeError, "parent must be a greenlet");
- return -1;
- }
- for (p = (PyGreenlet*)nparent; p; p = p->parent) {
- if (p == self) {
- PyErr_SetString(PyExc_ValueError, "cyclic parent chain");
- return -1;
- }
- run_info = PyGreenlet_ACTIVE(p) ? p->run_info : NULL;
- }
- if (run_info == NULL) {
- PyErr_SetString(PyExc_ValueError,
- "parent must not be garbage collected");
- return -1;
- }
- if (PyGreenlet_STARTED(self) && self->run_info != run_info) {
- PyErr_SetString(PyExc_ValueError,
- "parent cannot be on a different thread");
- return -1;
- }
- p = self->parent;
- self->parent = (PyGreenlet*)nparent;
- Py_INCREF(nparent);
- Py_XDECREF(p);
- return 0;
-}
-
-#ifdef Py_CONTEXT_H
-# define GREENLET_NO_CONTEXTVARS_REASON "This build of greenlet"
-#else
-# define GREENLET_NO_CONTEXTVARS_REASON "This Python interpreter"
-#endif
-
-static PyObject*
-green_getcontext(PyGreenlet* self, void* c)
-{
-#if GREENLET_PY37
- PyThreadState* tstate = PyThreadState_GET();
- PyObject* result;
-
- if (!STATE_OK) {
- return NULL;
- }
- if (PyGreenlet_ACTIVE(self) && self->top_frame == NULL) {
- /* Currently running greenlet: context is stored in the thread state,
- not the greenlet object. */
- if (self == ts_current) {
- result = tstate->context;
- }
- else {
- PyErr_SetString(PyExc_ValueError,
- "cannot get context of a "
- "greenlet that is running in a different thread");
- return NULL;
- }
- }
- else {
- /* Greenlet is not running: just return context. */
- result = self->context;
- }
- if (result == NULL) {
- result = Py_None;
- }
- Py_INCREF(result);
- return result;
-#else
- PyErr_SetString(PyExc_AttributeError,
- GREENLET_NO_CONTEXTVARS_REASON
- " does not support context variables");
- return NULL;
-#endif
-}
-
-static int
-green_setcontext(PyGreenlet* self, PyObject* nctx, void* c)
-{
-#if GREENLET_PY37
- PyThreadState* tstate;
- PyObject* octx = NULL;
- if (!STATE_OK) {
- return -1;
- }
- if (nctx == NULL) {
- PyErr_SetString(PyExc_AttributeError, "can't delete attribute");
- return -1;
- }
- if (nctx == Py_None) {
- /* "Empty context" is stored as NULL, not None. */
- nctx = NULL;
- }
- else if (!PyContext_CheckExact(nctx)) {
- PyErr_SetString(PyExc_TypeError,
- "greenlet context must be a "
- "contextvars.Context or None");
- return -1;
- }
- tstate = PyThreadState_GET();
- if (PyGreenlet_ACTIVE(self) && self->top_frame == NULL) {
- /* Currently running greenlet: context is stored in the thread state,
- not the greenlet object. */
- if (self == ts_current) {
- octx = tstate->context;
- tstate->context = nctx;
- tstate->context_ver++;
- Py_XINCREF(nctx);
- }
- else {
- PyErr_SetString(PyExc_ValueError,
- "cannot set context of a "
- "greenlet that is running in a different thread");
- return -1;
- }
- }
- else {
- /* Greenlet is not running: just set context. */
- octx = self->context;
- self->context = nctx;
- Py_XINCREF(nctx);
- }
- Py_XDECREF(octx);
- return 0;
-#else
- PyErr_SetString(PyExc_AttributeError,
- GREENLET_NO_CONTEXTVARS_REASON
- " does not support context variables");
- return -1;
-#endif
-}
-
-#undef GREENLET_NO_CONTEXTVARS_REASON
-
-static PyObject*
-green_getframe(PyGreenlet* self, void* c)
-{
- PyObject* result = self->top_frame ? (PyObject*)self->top_frame : Py_None;
- Py_INCREF(result);
- return result;
-}
-
-static PyObject*
-green_getstate(PyGreenlet* self)
-{
- PyErr_Format(PyExc_TypeError,
- "cannot serialize '%s' object",
- Py_TYPE(self)->tp_name);
- return NULL;
-}
-
-static PyObject*
-green_repr(PyGreenlet* self)
-{
- /*
- Return a string like
- <greenlet.greenlet object at 0xdeadbeef (otid=0xbadf00d) current active started main>
-
- The handling of greenlets across threads is not super good.
- We mostly use the internal definitions of these terms, but they
- generally should make sense to users as well.
- */
- PyObject* result;
- int never_started = !PyGreenlet_STARTED(self) && !PyGreenlet_ACTIVE(self);
-
- if (!STATE_OK) {
- return NULL;
- }
-
-#if PY_MAJOR_VERSION >= 3
-# define GNative_FromFormat PyUnicode_FromFormat
-#else
-# define GNative_FromFormat PyString_FromFormat
-#endif
-
- if (_green_not_dead(self)) {
- /* XXX: The otid= is almost useless because you can't correlate it to
- any thread identifier exposed to Python. We could use
- PyThreadState_GET()->thread_id, but we'd need to save that in the
- greenlet, or save the whole PyThreadState object itself.
-
- As it stands, it's only useful for identifying greenlets from the same thread.
- */
- result = GNative_FromFormat(
- "<%s object at %p (otid=%p)%s%s%s%s>",
- Py_TYPE(self)->tp_name,
- self,
- self->run_info,
- ts_current == self
- ? " current"
- : (PyGreenlet_STARTED(self) ? " suspended" : ""),
- PyGreenlet_ACTIVE(self) ? " active" : "",
- never_started ? " pending" : " started",
- PyGreenlet_MAIN(self) ? " main" : ""
- );
- }
- else {
- /* main greenlets never really appear dead. */
- result = GNative_FromFormat(
- "<%s object at %p (otid=%p) dead>",
- Py_TYPE(self)->tp_name,
- self,
- self->run_info
- );
- }
-#undef GNative_FromFormat
-
- return result;
-}
-
-/*****************************************************************************
- * C interface
- *
- * These are exported using the CObject API
- */
-
-static PyGreenlet*
-PyGreenlet_GetCurrent(void)
-{
- if (!STATE_OK) {
- return NULL;
- }
- Py_INCREF(ts_current);
- return ts_current;
-}
-
-static int
-PyGreenlet_SetParent(PyGreenlet* g, PyGreenlet* nparent)
-{
- if (!PyGreenlet_Check(g)) {
- PyErr_SetString(PyExc_TypeError, "parent must be a greenlet");
- return -1;
- }
-
- return green_setparent((PyGreenlet*)g, (PyObject*)nparent, NULL);
-}
-
-static PyGreenlet*
-PyGreenlet_New(PyObject* run, PyGreenlet* parent)
-{
- /* XXX: Why doesn't this call green_new()? There's some duplicate
- code. */
- PyGreenlet* g = NULL;
- g = (PyGreenlet*)PyType_GenericAlloc(&PyGreenlet_Type, 0);
- if (g == NULL) {
- return NULL;
- }
-
- if (run != NULL) {
- Py_INCREF(run);
- g->run_info = run;
- }
-
- if (parent != NULL) {
- if (PyGreenlet_SetParent(g, parent)) {
- Py_DECREF(g);
- return NULL;
- }
- }
- else {
- if ((g->parent = PyGreenlet_GetCurrent()) == NULL) {
- Py_DECREF(g);
- return NULL;
- }
- }
-#if GREENLET_USE_CFRAME
- g->cframe = &PyThreadState_GET()->root_cframe;
-#endif
- return g;
-}
-
-static PyObject*
-PyGreenlet_Switch(PyGreenlet* g, PyObject* args, PyObject* kwargs)
-{
- PyGreenlet* self = (PyGreenlet*)g;
-
- if (!PyGreenlet_Check(self)) {
- PyErr_BadArgument();
- return NULL;
- }
-
- if (args == NULL) {
- args = Py_BuildValue("()");
- }
- else {
- Py_INCREF(args);
- }
-
- if (kwargs != NULL && PyDict_Check(kwargs)) {
- Py_INCREF(kwargs);
- }
- else {
- kwargs = NULL;
- }
-
- return single_result(g_switch(self, args, kwargs));
-}
-
-static PyObject*
-PyGreenlet_Throw(PyGreenlet* self, PyObject* typ, PyObject* val, PyObject* tb)
-{
- if (!PyGreenlet_Check(self)) {
- PyErr_BadArgument();
- return NULL;
- }
- Py_INCREF(typ);
- Py_XINCREF(val);
- Py_XINCREF(tb);
- return throw_greenlet(self, typ, val, tb);
-}
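-
-/* A minimal sketch of consuming these entry points from *another*
- extension module, via the capsule and the macros in greenlet.h (the
- helper name ``spawn_and_run`` is hypothetical; error handling is
- abbreviated):
-
- #include "greenlet.h"
-
- static PyObject*
- spawn_and_run(PyObject* run)
- {
- PyGreenlet* g;
- PyGreenlet_Import(); // fills _PyGreenlet_API from "greenlet._C_API"
- if (_PyGreenlet_API == NULL) {
- return NULL; // PyCapsule_Import already set an exception
- }
- g = PyGreenlet_New(run, NULL); // NULL parent => current greenlet
- if (g == NULL) {
- return NULL;
- }
- return PyGreenlet_Switch(g, NULL, NULL); // like g.switch()
- }
-*/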
-
-/** End C API ****************************************************************/
-
-static PyMethodDef green_methods[] = {
- {"switch",
- (PyCFunction)green_switch,
- METH_VARARGS | METH_KEYWORDS,
- green_switch_doc},
- {"throw", (PyCFunction)green_throw, METH_VARARGS, green_throw_doc},
- {"__getstate__", (PyCFunction)green_getstate, METH_NOARGS, NULL},
- {NULL, NULL} /* sentinel */
-};
-
-static PyGetSetDef green_getsets[] = {
- {"__dict__", (getter)green_getdict, (setter)green_setdict, /*XXX*/ NULL},
- {"run", (getter)green_getrun, (setter)green_setrun, /*XXX*/ NULL},
- {"parent", (getter)green_getparent, (setter)green_setparent, /*XXX*/ NULL},
- {"gr_frame", (getter)green_getframe, NULL, /*XXX*/ NULL},
- {"gr_context",
- (getter)green_getcontext,
- (setter)green_setcontext,
- /*XXX*/ NULL},
- {"dead", (getter)green_getdead, NULL, /*XXX*/ NULL},
- {"_stack_saved", (getter)green_get_stack_saved, NULL, /*XXX*/ NULL},
- {NULL}};
-
-static PyNumberMethods green_as_number = {
- NULL, /* nb_add */
- NULL, /* nb_subtract */
- NULL, /* nb_multiply */
-#if PY_MAJOR_VERSION < 3
- NULL, /* nb_divide */
-#endif
- NULL, /* nb_remainder */
- NULL, /* nb_divmod */
- NULL, /* nb_power */
- NULL, /* nb_negative */
- NULL, /* nb_positive */
- NULL, /* nb_absolute */
- (inquiry)green_bool, /* nb_bool */
-};
-
-PyTypeObject PyGreenlet_Type = {
- PyVarObject_HEAD_INIT(NULL, 0)
- "greenlet.greenlet", /* tp_name */
- sizeof(PyGreenlet), /* tp_basicsize */
- 0, /* tp_itemsize */
- /* methods */
- (destructor)green_dealloc, /* tp_dealloc */
- 0, /* tp_print */
- 0, /* tp_getattr */
- 0, /* tp_setattr */
- 0, /* tp_compare */
- (reprfunc)green_repr, /* tp_repr */
- &green_as_number, /* tp_as_number */
- 0, /* tp_as_sequence */
- 0, /* tp_as_mapping */
- 0, /* tp_hash */
- 0, /* tp_call */
- 0, /* tp_str */
- 0, /* tp_getattro */
- 0, /* tp_setattro */
- 0, /* tp_as_buffer */
- Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE |
- GREENLET_GC_FLAGS, /* tp_flags */
- "greenlet(run=None, parent=None) -> greenlet\n\n"
- "Creates a new greenlet object (without running it).\n\n"
- " - *run* -- The callable to invoke.\n"
- " - *parent* -- The parent greenlet. The default is the current "
- "greenlet.", /* tp_doc */
- (traverseproc)GREENLET_tp_traverse, /* tp_traverse */
- (inquiry)GREENLET_tp_clear, /* tp_clear */
- 0, /* tp_richcompare */
- offsetof(PyGreenlet, weakreflist), /* tp_weaklistoffset */
- 0, /* tp_iter */
- 0, /* tp_iternext */
- green_methods, /* tp_methods */
- 0, /* tp_members */
- green_getsets, /* tp_getset */
- 0, /* tp_base */
- 0, /* tp_dict */
- 0, /* tp_descr_get */
- 0, /* tp_descr_set */
- offsetof(PyGreenlet, dict), /* tp_dictoffset */
- (initproc)green_init, /* tp_init */
- GREENLET_tp_alloc, /* tp_alloc */
- green_new, /* tp_new */
- GREENLET_tp_free, /* tp_free */
- (inquiry)GREENLET_tp_is_gc, /* tp_is_gc */
-};
-
-PyDoc_STRVAR(mod_getcurrent_doc,
- "getcurrent() -> greenlet\n"
- "\n"
- "Returns the current greenlet (i.e. the one which called this "
- "function).\n");
-
-static PyObject*
-mod_getcurrent(PyObject* self)
-{
- if (!STATE_OK) {
- return NULL;
- }
- Py_INCREF(ts_current);
- return (PyObject*)ts_current;
-}
-
-PyDoc_STRVAR(mod_settrace_doc,
- "settrace(callback) -> object\n"
- "\n"
- "Sets a new tracing function and returns the previous one.\n");
-static PyObject*
-mod_settrace(PyObject* self, PyObject* args)
-{
- int err;
- PyObject* previous;
- PyObject* tracefunc;
- PyGreenlet* current;
- if (!PyArg_ParseTuple(args, "O", &tracefunc)) {
- return NULL;
- }
- if (!STATE_OK) {
- return NULL;
- }
- current = ts_current;
- previous = PyDict_GetItem(current->run_info, ts_tracekey);
- if (previous == NULL) {
- previous = Py_None;
- }
- Py_INCREF(previous);
- if (tracefunc == Py_None) {
- err = previous != Py_None ?
- PyDict_DelItem(current->run_info, ts_tracekey) :
- 0;
- }
- else {
- err = PyDict_SetItem(current->run_info, ts_tracekey, tracefunc);
- }
- if (err < 0) {
- Py_CLEAR(previous);
- }
- return previous;
-}
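-
-/* (Shape of the callback, per the documented tracing API: it is called
- as ``callback(event, (origin, target))``, where ``event`` is one of
- the interned strings "switch" or "throw" created in module init
- below.) */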
-
-PyDoc_STRVAR(mod_gettrace_doc,
- "gettrace() -> object\n"
- "\n"
- "Returns the currently set tracing function, or None.\n");
-
-static PyObject*
-mod_gettrace(PyObject* self)
-{
- PyObject* tracefunc;
- if (!STATE_OK) {
- return NULL;
- }
- tracefunc = PyDict_GetItem(ts_current->run_info, ts_tracekey);
- if (tracefunc == NULL) {
- tracefunc = Py_None;
- }
- Py_INCREF(tracefunc);
- return tracefunc;
-}
-
-static PyMethodDef GreenMethods[] = {
- {"getcurrent",
- (PyCFunction)mod_getcurrent,
- METH_NOARGS,
- mod_getcurrent_doc},
- {"settrace", (PyCFunction)mod_settrace, METH_VARARGS, mod_settrace_doc},
- {"gettrace", (PyCFunction)mod_gettrace, METH_NOARGS, mod_gettrace_doc},
- {NULL, NULL} /* Sentinel */
-};
-
-static char* copy_on_greentype[] = {
- "getcurrent", "error", "GreenletExit", "settrace", "gettrace", NULL};
-
-#if PY_MAJOR_VERSION >= 3
-# define INITERROR return NULL
-
-static struct PyModuleDef greenlet_module_def = {
- PyModuleDef_HEAD_INIT,
- "greenlet._greenlet",
- NULL,
- -1,
- GreenMethods,
-};
-
-PyMODINIT_FUNC
-PyInit__greenlet(void)
-#else
-# define INITERROR return
-
-PyMODINIT_FUNC
-init_greenlet(void)
-#endif
-{
- PyObject* m = NULL;
- char** p = NULL;
- PyObject* c_api_object;
- static void* _PyGreenlet_API[PyGreenlet_API_pointers];
-
- GREENLET_NOINLINE_INIT();
-
-#if PY_MAJOR_VERSION >= 3
- m = PyModule_Create(&greenlet_module_def);
-#else
- m = Py_InitModule("greenlet._greenlet", GreenMethods);
-#endif
- if (m == NULL) {
- INITERROR;
- }
-
-#if PY_MAJOR_VERSION >= 3
-# define Greenlet_Intern PyUnicode_InternFromString
-#else
-# define Greenlet_Intern PyString_InternFromString
-#endif
- ts_curkey = Greenlet_Intern("__greenlet_ts_curkey");
- ts_delkey = Greenlet_Intern("__greenlet_ts_delkey");
- ts_tracekey = Greenlet_Intern("__greenlet_ts_tracekey");
- ts_event_switch = Greenlet_Intern("switch");
- ts_event_throw = Greenlet_Intern("throw");
-#undef Greenlet_Intern
-
- if (ts_curkey == NULL || ts_delkey == NULL) {
- INITERROR;
- }
- if (PyType_Ready(&PyGreenlet_Type) < 0) {
- INITERROR;
- }
- PyExc_GreenletError = PyErr_NewException("greenlet.error", NULL, NULL);
- if (PyExc_GreenletError == NULL) {
- INITERROR;
- }
- PyExc_GreenletExit =
- PyErr_NewException("greenlet.GreenletExit", PyExc_BaseException, NULL);
- if (PyExc_GreenletExit == NULL) {
- INITERROR;
- }
-
- ts_empty_tuple = PyTuple_New(0);
- if (ts_empty_tuple == NULL) {
- INITERROR;
- }
-
- ts_empty_dict = PyDict_New();
- if (ts_empty_dict == NULL) {
- INITERROR;
- }
-
- ts_current = green_create_main();
- if (ts_current == NULL) {
- INITERROR;
- }
-
- Py_INCREF(&PyGreenlet_Type);
- PyModule_AddObject(m, "greenlet", (PyObject*)&PyGreenlet_Type);
- Py_INCREF(PyExc_GreenletError);
- PyModule_AddObject(m, "error", PyExc_GreenletError);
- Py_INCREF(PyExc_GreenletExit);
- PyModule_AddObject(m, "GreenletExit", PyExc_GreenletExit);
-
- PyModule_AddObject(m, "GREENLET_USE_GC", PyBool_FromLong(1));
- PyModule_AddObject(m, "GREENLET_USE_TRACING", PyBool_FromLong(1));
- PyModule_AddObject(
- m, "GREENLET_USE_CONTEXT_VARS", PyBool_FromLong(GREENLET_PY37));
-
- /* also publish module-level data as attributes of the greentype. */
- /* XXX: Why? */
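- /* (A guess: so that ``greenlet.greenlet.getcurrent`` and friends keep
- working for code that imports only the type object.) */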
- for (p = copy_on_greentype; *p; p++) {
- PyObject* o = PyObject_GetAttrString(m, *p);
- if (!o) {
- continue;
- }
- PyDict_SetItemString(PyGreenlet_Type.tp_dict, *p, o);
- Py_DECREF(o);
- }
-
- /*
- * Expose C API
- */
-
- /* types */
- _PyGreenlet_API[PyGreenlet_Type_NUM] = (void*)&PyGreenlet_Type;
-
- /* exceptions */
- _PyGreenlet_API[PyExc_GreenletError_NUM] = (void*)PyExc_GreenletError;
- _PyGreenlet_API[PyExc_GreenletExit_NUM] = (void*)PyExc_GreenletExit;
-
- /* methods */
- _PyGreenlet_API[PyGreenlet_New_NUM] = (void*)PyGreenlet_New;
- _PyGreenlet_API[PyGreenlet_GetCurrent_NUM] = (void*)PyGreenlet_GetCurrent;
- _PyGreenlet_API[PyGreenlet_Throw_NUM] = (void*)PyGreenlet_Throw;
- _PyGreenlet_API[PyGreenlet_Switch_NUM] = (void*)PyGreenlet_Switch;
- _PyGreenlet_API[PyGreenlet_SetParent_NUM] = (void*)PyGreenlet_SetParent;
-
- /* XXX: Note that our module name is ``greenlet._greenlet``, but for
- backwards compatibility with existing C code, we need the _C_API to
- be directly in greenlet.
- */
- c_api_object =
- PyCapsule_New((void*)_PyGreenlet_API, "greenlet._C_API", NULL);
- if (c_api_object != NULL) {
- PyModule_AddObject(m, "_C_API", c_api_object);
- }
-
-#if PY_MAJOR_VERSION >= 3
- return m;
-#endif
-}
-
-#ifdef __clang__
-# pragma clang diagnostic pop
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/greenlet.h b/env/lib/python3.9/site-packages/greenlet/greenlet.h
deleted file mode 100644
index c788b2f..0000000
--- a/env/lib/python3.9/site-packages/greenlet/greenlet.h
+++ /dev/null
@@ -1,161 +0,0 @@
-/* -*- indent-tabs-mode: nil; tab-width: 4; -*- */
-
-/* Greenlet object interface */
-
-#ifndef Py_GREENLETOBJECT_H
-#define Py_GREENLETOBJECT_H
-
-#include <Python.h>
-
-#ifdef __cplusplus
-extern "C" {
-#endif
-
-/* This is deprecated and undocumented. It does not change. */
-#define GREENLET_VERSION "1.0.0"
-
-#if PY_VERSION_HEX >= 0x30B00A6
-# define GREENLET_PY311 1
- /* _PyInterpreterFrame moved to the internal C API in Python 3.11 */
-# include <internal/pycore_frame.h>
-#else
-# define GREENLET_PY311 0
-# define _PyCFrame CFrame
-#endif
-
-typedef struct _greenlet {
- PyObject_HEAD
- char* stack_start;
- char* stack_stop;
- char* stack_copy;
- intptr_t stack_saved;
- struct _greenlet* stack_prev;
- struct _greenlet* parent;
- PyObject* run_info;
- struct _frame* top_frame;
- int recursion_depth;
-#if GREENLET_PY311
- _PyInterpreterFrame *current_frame;
- _PyStackChunk *datastack_chunk;
- PyObject **datastack_top;
- PyObject **datastack_limit;
-#endif
- PyObject* weakreflist;
-#if PY_VERSION_HEX >= 0x030700A3
- _PyErr_StackItem* exc_info;
- _PyErr_StackItem exc_state;
-#else
- PyObject* exc_type;
- PyObject* exc_value;
- PyObject* exc_traceback;
-#endif
- PyObject* dict;
-#if PY_VERSION_HEX >= 0x030700A3
- PyObject* context;
-#endif
-#if PY_VERSION_HEX >= 0x30A00B1
- _PyCFrame* cframe;
-#endif
-} PyGreenlet;
-
-#define PyGreenlet_Check(op) PyObject_TypeCheck(op, &PyGreenlet_Type)
-#define PyGreenlet_MAIN(op) (((PyGreenlet*)(op))->stack_stop == (char*)-1)
-#define PyGreenlet_STARTED(op) (((PyGreenlet*)(op))->stack_stop != NULL)
-#define PyGreenlet_ACTIVE(op) (((PyGreenlet*)(op))->stack_start != NULL)
-#define PyGreenlet_GET_PARENT(op) (((PyGreenlet*)(op))->parent)
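-
-/* (Reading the macros above, as a sketch of the state model: the main
- greenlet marks stack_stop with the sentinel (char*)-1; a greenlet is
- "started" once stack_stop is set, and "active" while stack_start is
- non-NULL.) */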
-
-/* C API functions */
-
-/* Total number of symbols that are exported */
-#define PyGreenlet_API_pointers 8
-
-#define PyGreenlet_Type_NUM 0
-#define PyExc_GreenletError_NUM 1
-#define PyExc_GreenletExit_NUM 2
-
-#define PyGreenlet_New_NUM 3
-#define PyGreenlet_GetCurrent_NUM 4
-#define PyGreenlet_Throw_NUM 5
-#define PyGreenlet_Switch_NUM 6
-#define PyGreenlet_SetParent_NUM 7
-
-#ifndef GREENLET_MODULE
-/* This section is used by modules that use the greenlet C API */
-static void** _PyGreenlet_API = NULL;
-
-# define PyGreenlet_Type \
- (*(PyTypeObject*)_PyGreenlet_API[PyGreenlet_Type_NUM])
-
-# define PyExc_GreenletError \
- ((PyObject*)_PyGreenlet_API[PyExc_GreenletError_NUM])
-
-# define PyExc_GreenletExit \
- ((PyObject*)_PyGreenlet_API[PyExc_GreenletExit_NUM])
-
-/*
- * PyGreenlet_New(PyObject *args)
- *
- * greenlet.greenlet(run, parent=None)
- */
-# define PyGreenlet_New \
- (*(PyGreenlet * (*)(PyObject * run, PyGreenlet * parent)) \
- _PyGreenlet_API[PyGreenlet_New_NUM])
-
-/*
- * PyGreenlet_GetCurrent(void)
- *
- * greenlet.getcurrent()
- */
-# define PyGreenlet_GetCurrent \
- (*(PyGreenlet * (*)(void)) _PyGreenlet_API[PyGreenlet_GetCurrent_NUM])
-
-/*
- * PyGreenlet_Throw(
- * PyGreenlet *greenlet,
- * PyObject *typ,
- * PyObject *val,
- * PyObject *tb)
- *
- * g.throw(...)
- */
-# define PyGreenlet_Throw \
- (*(PyObject * (*)(PyGreenlet * self, \
- PyObject * typ, \
- PyObject * val, \
- PyObject * tb)) \
- _PyGreenlet_API[PyGreenlet_Throw_NUM])
-
-/*
- * PyGreenlet_Switch(PyGreenlet *greenlet, PyObject *args)
- *
- * g.switch(*args, **kwargs)
- */
-# define PyGreenlet_Switch \
- (*(PyObject * \
- (*)(PyGreenlet * greenlet, PyObject * args, PyObject * kwargs)) \
- _PyGreenlet_API[PyGreenlet_Switch_NUM])
-
-/*
- * PyGreenlet_SetParent(PyObject *greenlet, PyObject *new_parent)
- *
- * g.parent = new_parent
- */
-# define PyGreenlet_SetParent \
- (*(int (*)(PyGreenlet * greenlet, PyGreenlet * nparent)) \
- _PyGreenlet_API[PyGreenlet_SetParent_NUM])
-
-/* Macro that imports greenlet and initializes C API */
-/* NOTE: This has actually moved to ``greenlet._greenlet._C_API``, but we
- keep the older definition to be sure older code that might have a copy of
- the header still works. */
-# define PyGreenlet_Import() \
- { \
- _PyGreenlet_API = (void**)PyCapsule_Import("greenlet._C_API", 0); \
- }
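-
-/* A typical use from a module init function (a sketch; PyCapsule_Import
- sets an exception and stores NULL on failure):
-
- PyGreenlet_Import();
- if (_PyGreenlet_API == NULL) {
- return NULL;
- }
-*/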
-
-#endif /* GREENLET_MODULE */
-
-#ifdef __cplusplus
-}
-#endif
-#endif /* !Py_GREENLETOBJECT_H */
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/setup_switch_x64_masm.cmd b/env/lib/python3.9/site-packages/greenlet/platform/setup_switch_x64_masm.cmd
deleted file mode 100644
index 0928595..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/setup_switch_x64_masm.cmd
+++ /dev/null
@@ -1,2 +0,0 @@
-call "C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\vcvarsall.bat" amd64
-ml64 /nologo /c /Fo switch_x64_masm.obj switch_x64_masm.asm
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_aarch64_gcc.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_aarch64_gcc.h
deleted file mode 100644
index 0b9d556..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_aarch64_gcc.h
+++ /dev/null
@@ -1,69 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 07-Sep-16 Add clang support using x register naming. Fredrik Fornwall
- * 13-Apr-13 Add support for strange GCC caller-save decisions
- * 08-Apr-13 File creation. Michael Matz
- *
- * NOTES
- *
- * Simply save all callee saved registers
- *
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-#define STACK_MAGIC 0
-#define REGS_TO_SAVE "x19", "x20", "x21", "x22", "x23", "x24", "x25", "x26", \
- "x27", "x28", "x30" /* aka lr */, \
- "v8", "v9", "v10", "v11", \
- "v12", "v13", "v14", "v15"
-
-static int
-slp_switch(void)
-{
- int err;
- void *fp;
- register long *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("str x29, %0" : "=m"(fp) : : );
- __asm__ ("mov %0, sp" : "=r" (stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "add sp,sp,%0\n"
- "add x29,x29,%0\n"
- :
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- /* SLP_SAVE_STATE macro contains some return statements
- (of -1 and 1). It falls through only when
- the return value of slp_save_state() is zero, which
- is placed in x0.
- In that case we (slp_switch) also want to return zero
- (also in x0 of course).
- Now, some GCC versions (seen with 4.8) think it's a
- good idea to save/restore x0 around the call to
- slp_restore_state(), instead of simply zeroing it
- at the return below. But slp_restore_state
- writes random values to the stack slot used for this
- save/restore (from when it once was saved above in
- SLP_SAVE_STATE, when it was still uninitialized), so
- "restoring" that precious zero actually makes us
- return random values. There are some ways to make
- GCC not use that zero value in the normal return path
- (e.g. making err volatile, but that costs a little
- stack space), and the simplest is to call a function
- that returns an unknown value (which happens to be zero),
- so the saved/restored value is unused. */
- __asm__ volatile ("mov %0, #0" : "=r" (err));
- }
- __asm__ volatile ("ldr x29, %0" : : "m" (fp) :);
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- return err;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_alpha_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_alpha_unix.h
deleted file mode 100644
index 216619f..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_alpha_unix.h
+++ /dev/null
@@ -1,30 +0,0 @@
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-#define STACK_MAGIC 0
-
-#define REGS_TO_SAVE "$9", "$10", "$11", "$12", "$13", "$14", "$15", \
- "$f2", "$f3", "$f4", "$f5", "$f6", "$f7", "$f8", "$f9"
-
-static int
-slp_switch(void)
-{
- register int ret;
- register long *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("mov $30, %0" : "=r" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "addq $30, %0, $30\n\t"
- : /* no outputs */
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("mov $31, %0" : "=r" (ret) : );
- return ret;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_amd64_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_amd64_unix.h
deleted file mode 100644
index 16b99b7..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_amd64_unix.h
+++ /dev/null
@@ -1,84 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 3-May-13 Ralf Schmitt
- * Add support for strange GCC caller-save decisions
- * (ported from switch_aarch64_gcc.h)
- * 18-Aug-11 Alexey Borzenkov
- * Correctly save rbp, csr and cw
- * 01-Apr-04 Hye-Shik Chang
- * Ported from i386 to amd64.
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 31-Apr-02 Armin Rigo
- * Added ebx, esi and edi register-saves.
- * 01-Mar-02 Samual M. Rushing
- * Ported from i386.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-/* #define STACK_MAGIC 3 */
-/* the above works fine with gcc 2.96, but 2.95.3 wants this */
-#define STACK_MAGIC 0
-
-#define REGS_TO_SAVE "r12", "r13", "r14", "r15"
-
-static int
-slp_switch(void)
-{
- int err;
- void* rbp;
- void* rbx;
- unsigned int csr;
- unsigned short cw;
- register long *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("fstcw %0" : "=m" (cw));
- __asm__ volatile ("stmxcsr %0" : "=m" (csr));
- __asm__ volatile ("movq %%rbp, %0" : "=m" (rbp));
- __asm__ volatile ("movq %%rbx, %0" : "=m" (rbx));
- __asm__ ("movq %%rsp, %0" : "=g" (stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "addq %0, %%rsp\n"
- "addq %0, %%rbp\n"
- :
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- __asm__ volatile ("xorq %%rax, %%rax" : "=a" (err));
- }
- __asm__ volatile ("movq %0, %%rbx" : : "m" (rbx));
- __asm__ volatile ("movq %0, %%rbp" : : "m" (rbp));
- __asm__ volatile ("ldmxcsr %0" : : "m" (csr));
- __asm__ volatile ("fldcw %0" : : "m" (cw));
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_arm32_gcc.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_arm32_gcc.h
deleted file mode 100644
index 035d6b9..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_arm32_gcc.h
+++ /dev/null
@@ -1,79 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 14-Aug-06 File creation. Ported from Arm Thumb. Sylvain Baro
- * 3-Sep-06 Commented out saving of r1-r3 (r4 already commented out) as I
- * read that these do not need to be saved. Also added notes and
- * errors related to the frame pointer. Richard Tew.
- *
- * NOTES
- *
- * It is not possible to detect if fp is used or not, so the supplied
- * switch function needs to support it, so that you can remove it if
- * it does not apply to you.
- *
- * POSSIBLE ERRORS
- *
- * "fp cannot be used in asm here"
- *
- * - Try commenting out "fp" in REGS_TO_SAVE.
- *
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-#define STACK_MAGIC 0
-#define REG_SP "sp"
-#define REG_SPSP "sp,sp"
-#ifdef __thumb__
-#define REG_FP "r7"
-#define REG_FPFP "r7,r7"
-#define REGS_TO_SAVE_GENERAL "r4", "r5", "r6", "r8", "r9", "r10", "r11", "lr"
-#else
-#define REG_FP "fp"
-#define REG_FPFP "fp,fp"
-#define REGS_TO_SAVE_GENERAL "r4", "r5", "r6", "r7", "r8", "r9", "r10", "lr"
-#endif
-#if defined(__SOFTFP__)
-#define REGS_TO_SAVE REGS_TO_SAVE_GENERAL
-#elif defined(__VFP_FP__)
-#define REGS_TO_SAVE REGS_TO_SAVE_GENERAL, "d8", "d9", "d10", "d11", \
- "d12", "d13", "d14", "d15"
-#elif defined(__MAVERICK__)
-#define REGS_TO_SAVE REGS_TO_SAVE_GENERAL, "mvf4", "mvf5", "mvf6", "mvf7", \
- "mvf8", "mvf9", "mvf10", "mvf11", \
- "mvf12", "mvf13", "mvf14", "mvf15"
-#else
-#define REGS_TO_SAVE REGS_TO_SAVE_GENERAL, "f4", "f5", "f6", "f7"
-#endif
-
-static int
-#ifdef __GNUC__
-__attribute__((optimize("no-omit-frame-pointer")))
-#endif
-slp_switch(void)
-{
- void *fp;
- register int *stackref, stsizediff;
- int result;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("mov r0," REG_FP "\n\tstr r0,%0" : "=m" (fp) : : "r0");
- __asm__ ("mov %0," REG_SP : "=r" (stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "add " REG_SPSP ",%0\n"
- "add " REG_FPFP ",%0\n"
- :
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("ldr r0,%1\n\tmov " REG_FP ",r0\n\tmov %0, #0" : "=r" (result) : "m" (fp) : "r0");
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- return result;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_arm32_ios.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_arm32_ios.h
deleted file mode 100644
index e993707..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_arm32_ios.h
+++ /dev/null
@@ -1,67 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 31-May-15 iOS support. Ported from arm32. Proton
- *
- * NOTES
- *
- * It is not possible to detect if fp is used or not, so the supplied
- * switch function needs to support it, so that you can remove it if
- * it does not apply to you.
- *
- * POSSIBLE ERRORS
- *
- * "fp cannot be used in asm here"
- *
- * - Try commenting out "fp" in REGS_TO_SAVE.
- *
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 0
-#define REG_SP "sp"
-#define REG_SPSP "sp,sp"
-#define REG_FP "r7"
-#define REG_FPFP "r7,r7"
-#define REGS_TO_SAVE_GENERAL "r4", "r5", "r6", "r8", "r10", "r11", "lr"
-#define REGS_TO_SAVE REGS_TO_SAVE_GENERAL, "d8", "d9", "d10", "d11", \
- "d12", "d13", "d14", "d15"
-
-static int
-#ifdef __GNUC__
-__attribute__((optimize("no-omit-frame-pointer")))
-#endif
-slp_switch(void)
-{
- void *fp;
- register int *stackref, stsizediff, result;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("str " REG_FP ",%0" : "=m" (fp));
- __asm__ ("mov %0," REG_SP : "=r" (stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "add " REG_SPSP ",%0\n"
- "add " REG_FPFP ",%0\n"
- :
- : "r" (stsizediff)
- : REGS_TO_SAVE /* Clobber registers, force compiler to
- * recalculate address of void *fp from REG_SP or REG_FP */
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile (
- "ldr " REG_FP ", %1\n\t"
- "mov %0, #0"
- : "=r" (result)
- : "m" (fp)
- : REGS_TO_SAVE /* Force compiler to restore saved registers after this */
- );
- return result;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_csky_gcc.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_csky_gcc.h
deleted file mode 100644
index 7486b94..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_csky_gcc.h
+++ /dev/null
@@ -1,48 +0,0 @@
-#ifdef SLP_EVAL
-#define STACK_MAGIC 0
-#define REG_FP "r8"
-#ifdef __CSKYABIV2__
-#define REGS_TO_SAVE_GENERAL "r4", "r5", "r6", "r7", "r9", "r10", "r11", "r15",\
- "r16", "r17", "r18", "r19", "r20", "r21", "r22",\
- "r23", "r24", "r25"
-
-#if defined (__CSKY_HARD_FLOAT__) || (__CSKY_VDSP__)
-#define REGS_TO_SAVE REGS_TO_SAVE_GENERAL, "vr8", "vr9", "vr10", "vr11", "vr12",\
- "vr13", "vr14", "vr15"
-#else
-#define REGS_TO_SAVE REGS_TO_SAVE_GENERAL
-#endif
-#else
-#define REGS_TO_SAVE "r9", "r10", "r11", "r12", "r13", "r15"
-#endif
-
-
-static int
-#ifdef __GNUC__
-__attribute__((optimize("no-omit-frame-pointer")))
-#endif
-slp_switch(void)
-{
- register int *stackref, stsizediff;
- int result;
-
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ ("mov %0, sp" : "=r" (stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "addu sp,%0\n"
- "addu "REG_FP",%0\n"
- :
- : "r" (stsizediff)
- );
-
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("movi %0, 0" : "=r" (result));
- __asm__ volatile ("" : : : REGS_TO_SAVE);
-
- return result;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_m68k_gcc.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_m68k_gcc.h
deleted file mode 100644
index da761c2..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_m68k_gcc.h
+++ /dev/null
@@ -1,38 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 2014-01-06 Andreas Schwab
- * File created.
- */
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 0
-
-#define REGS_TO_SAVE "%d2", "%d3", "%d4", "%d5", "%d6", "%d7", \
- "%a2", "%a3", "%a4"
-
-static int
-slp_switch(void)
-{
- int err;
- int *stackref, stsizediff;
- void *fp, *a5;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("move.l %%fp, %0" : "=m"(fp));
- __asm__ volatile ("move.l %%a5, %0" : "=m"(a5));
- __asm__ ("move.l %%sp, %0" : "=r"(stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile ("add.l %0, %%sp; add.l %0, %%fp" : : "r"(stsizediff));
- SLP_RESTORE_STATE();
- __asm__ volatile ("clr.l %0" : "=g" (err));
- }
- __asm__ volatile ("move.l %0, %%a5" : : "m"(a5));
- __asm__ volatile ("move.l %0, %%fp" : : "m"(fp));
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- return err;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_mips_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_mips_unix.h
deleted file mode 100644
index 1916b26..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_mips_unix.h
+++ /dev/null
@@ -1,64 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 20-Sep-14 Matt Madison
- * Re-code the saving of the gp register for MIPS64.
- * 05-Jan-08 Thiemo Seufer
- * Ported from ppc.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 0
-
-#define REGS_TO_SAVE "$16", "$17", "$18", "$19", "$20", "$21", "$22", \
- "$23", "$30"
-static int
-slp_switch(void)
-{
- register int err;
- register int *stackref, stsizediff;
-#ifdef __mips64
- uint64_t gpsave;
-#endif
- __asm__ __volatile__ ("" : : : REGS_TO_SAVE);
-#ifdef __mips64
- __asm__ __volatile__ ("sd $28,%0" : "=m" (gpsave) : : );
-#endif
- __asm__ ("move %0, $29" : "=r" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ __volatile__ (
-#ifdef __mips64
- "daddu $29, %0\n"
-#else
- "addu $29, %0\n"
-#endif
- : /* no outputs */
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- }
-#ifdef __mips64
- __asm__ __volatile__ ("ld $28,%0" : : "m" (gpsave) : );
-#endif
- __asm__ __volatile__ ("" : : : REGS_TO_SAVE);
- __asm__ __volatile__ ("move %0, $0" : "=r" (err));
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc64_aix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc64_aix.h
deleted file mode 100644
index e07b8de..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc64_aix.h
+++ /dev/null
@@ -1,103 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 16-Oct-20 Jesse Gorzinski
- * Copied from Linux PPC64 implementation
- * 04-Sep-18 Alexey Borzenkov
- * Workaround a gcc bug using manual save/restore of r30
- * 21-Mar-18 Tulio Magno Quites Machado Filho
- * Added r30 to the list of saved registers in order to fully comply with
- * both ppc64 ELFv1 ABI and the ppc64le ELFv2 ABI, that classify this
- * register as a nonvolatile register used for local variables.
- * 21-Mar-18 Laszlo Boszormenyi
- * Save r2 (TOC pointer) manually.
- * 10-Dec-13 Ulrich Weigand
- * Support ELFv2 ABI. Save float/vector registers.
- * 09-Mar-12 Michael Ellerman
- * 64-bit implementation, copied from 32-bit.
- * 07-Sep-05 (py-dev mailing list discussion)
- * removed 'r31' from the register-saved. !!!! WARNING !!!!
- * It means that this file can no longer be compiled statically!
- * It is now only suitable as part of a dynamic library!
- * 14-Jan-04 Bob Ippolito
- * added cr2-cr4 to the registers to be saved.
- * Open questions: Should we save FP registers?
- * What about vector registers?
- * Differences between darwin and unix?
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 04-Oct-02 Gustavo Niemeyer
- * Ported from MacOS version.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 29-Jun-02 Christian Tismer
- * Added register 13-29, 31 saves. The same way as
- * Armin Rigo did for the x86_unix version.
- * This seems to be now fully functional!
- * 04-Mar-02 Hye-Shik Chang
- * Ported from i386.
- * 31-Jul-12 Trevor Bowen
- * Changed memory constraints to register only.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 6
-
-#if defined(__ALTIVEC__)
-#define ALTIVEC_REGS \
- "v20", "v21", "v22", "v23", "v24", "v25", "v26", "v27", \
- "v28", "v29", "v30", "v31",
-#else
-#define ALTIVEC_REGS
-#endif
-
-#define REGS_TO_SAVE "r14", "r15", "r16", "r17", "r18", "r19", "r20", \
- "r21", "r22", "r23", "r24", "r25", "r26", "r27", "r28", "r29", \
- "r31", \
- "fr14", "fr15", "fr16", "fr17", "fr18", "fr19", "fr20", "fr21", \
- "fr22", "fr23", "fr24", "fr25", "fr26", "fr27", "fr28", "fr29", \
- "fr30", "fr31", \
- ALTIVEC_REGS \
- "cr2", "cr3", "cr4"
-
-static int
-slp_switch(void)
-{
- register int err;
- register long *stackref, stsizediff;
- void * toc;
- void * r30;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("std 2, %0" : "=m" (toc));
- __asm__ volatile ("std 30, %0" : "=m" (r30));
- __asm__ ("mr %0, 1" : "=r" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "mr 11, %0\n"
- "add 1, 1, 11\n"
- : /* no outputs */
- : "r" (stsizediff)
- : "11"
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("ld 30, %0" : : "m" (r30));
- __asm__ volatile ("ld 2, %0" : : "m" (toc));
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("li %0, 0" : "=r" (err));
- return err;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc64_linux.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc64_linux.h
deleted file mode 100644
index 88e6847..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc64_linux.h
+++ /dev/null
@@ -1,105 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 04-Sep-18 Alexey Borzenkov
- * Workaround a gcc bug using manual save/restore of r30
- * 21-Mar-18 Tulio Magno Quites Machado Filho
- * Added r30 to the list of saved registers in order to fully comply with
- * both ppc64 ELFv1 ABI and the ppc64le ELFv2 ABI, that classify this
- * register as a nonvolatile register used for local variables.
- * 21-Mar-18 Laszlo Boszormenyi
- * Save r2 (TOC pointer) manually.
- * 10-Dec-13 Ulrich Weigand
- * Support ELFv2 ABI. Save float/vector registers.
- * 09-Mar-12 Michael Ellerman
- * 64-bit implementation, copied from 32-bit.
- * 07-Sep-05 (py-dev mailing list discussion)
- * removed 'r31' from the register-saved. !!!! WARNING !!!!
- * It means that this file can no longer be compiled statically!
- * It is now only suitable as part of a dynamic library!
- * 14-Jan-04 Bob Ippolito
- * added cr2-cr4 to the registers to be saved.
- * Open questions: Should we save FP registers?
- * What about vector registers?
- * Differences between darwin and unix?
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 04-Oct-02 Gustavo Niemeyer
- * Ported from MacOS version.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 29-Jun-02 Christian Tismer
- * Added register 13-29, 31 saves. The same way as
- * Armin Rigo did for the x86_unix version.
- * This seems to be now fully functional!
- * 04-Mar-02 Hye-Shik Chang
- * Ported from i386.
- * 31-Jul-12 Trevor Bowen
- * Changed memory constraints to register only.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#if _CALL_ELF == 2
-#define STACK_MAGIC 4
-#else
-#define STACK_MAGIC 6
-#endif
-
-#if defined(__ALTIVEC__)
-#define ALTIVEC_REGS \
- "v20", "v21", "v22", "v23", "v24", "v25", "v26", "v27", \
- "v28", "v29", "v30", "v31",
-#else
-#define ALTIVEC_REGS
-#endif
-
-#define REGS_TO_SAVE "r14", "r15", "r16", "r17", "r18", "r19", "r20", \
- "r21", "r22", "r23", "r24", "r25", "r26", "r27", "r28", "r29", \
- "r31", \
- "fr14", "fr15", "fr16", "fr17", "fr18", "fr19", "fr20", "fr21", \
- "fr22", "fr23", "fr24", "fr25", "fr26", "fr27", "fr28", "fr29", \
- "fr30", "fr31", \
- ALTIVEC_REGS \
- "cr2", "cr3", "cr4"
-
-static int
-slp_switch(void)
-{
- register int err;
- register long *stackref, stsizediff;
- void * toc;
- void * r30;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("std 2, %0" : "=m" (toc));
- __asm__ volatile ("std 30, %0" : "=m" (r30));
- __asm__ ("mr %0, 1" : "=r" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "mr 11, %0\n"
- "add 1, 1, 11\n"
- : /* no outputs */
- : "r" (stsizediff)
- : "11"
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("ld 30, %0" : : "m" (r30));
- __asm__ volatile ("ld 2, %0" : : "m" (toc));
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("li %0, 0" : "=r" (err));
- return err;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_aix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_aix.h
deleted file mode 100644
index c7d476f..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_aix.h
+++ /dev/null
@@ -1,87 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 07-Mar-11 Floris Bruynooghe
- * Do not add stsizediff to general purpose
- * register (GPR) 30 as this is a non-volatile and
- * unused by the PowerOpen Environment, therefore
- * this was modifying a user register instead of the
- * frame pointer (which does not seem to exist).
- * 07-Sep-05 (py-dev mailing list discussion)
- * removed 'r31' from the register-saved. !!!! WARNING !!!!
- * It means that this file can no longer be compiled statically!
- * It is now only suitable as part of a dynamic library!
- * 14-Jan-04 Bob Ippolito
- * added cr2-cr4 to the registers to be saved.
- * Open questions: Should we save FP registers?
- * What about vector registers?
- * Differences between darwin and unix?
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 04-Oct-02 Gustavo Niemeyer
- * Ported from MacOS version.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 29-Jun-02 Christian Tismer
- * Added register 13-29, 31 saves. The same way as
- * Armin Rigo did for the x86_unix version.
- * This seems to be now fully functional!
- * 04-Mar-02 Hye-Shik Chang
- * Ported from i386.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 3
-
-/* !!!!WARNING!!!! need to add "r31" in the next line if this header file
- * is meant to be compiled non-dynamically!
- */
-#define REGS_TO_SAVE "r13", "r14", "r15", "r16", "r17", "r18", "r19", "r20", \
- "r21", "r22", "r23", "r24", "r25", "r26", "r27", "r28", "r29", \
- "cr2", "cr3", "cr4"
-static int
-slp_switch(void)
-{
- register int err;
- register int *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ ("mr %0, 1" : "=r" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "mr 11, %0\n"
- "add 1, 1, 11\n"
- : /* no outputs */
- : "r" (stsizediff)
- : "11"
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("li %0, 0" : "=r" (err));
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_linux.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_linux.h
deleted file mode 100644
index 0a71255..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_linux.h
+++ /dev/null
@@ -1,84 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 07-Sep-05 (py-dev mailing list discussion)
- * removed 'r31' from the register-saved. !!!! WARNING !!!!
- * It means that this file can no longer be compiled statically!
- * It is now only suitable as part of a dynamic library!
- * 14-Jan-04 Bob Ippolito
- * added cr2-cr4 to the registers to be saved.
- * Open questions: Should we save FP registers?
- * What about vector registers?
- * Differences between darwin and unix?
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 04-Oct-02 Gustavo Niemeyer
- * Ported from MacOS version.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 29-Jun-02 Christian Tismer
- * Added register 13-29, 31 saves. The same way as
- * Armin Rigo did for the x86_unix version.
- * This seems to be now fully functional!
- * 04-Mar-02 Hye-Shik Chang
- * Ported from i386.
- * 31-Jul-12 Trevor Bowen
- * Changed memory constraints to register only.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 3
-
-/* !!!!WARNING!!!! need to add "r31" in the next line if this header file
- * is meant to be compiled non-dynamically!
- */
-#define REGS_TO_SAVE "r13", "r14", "r15", "r16", "r17", "r18", "r19", "r20", \
- "r21", "r22", "r23", "r24", "r25", "r26", "r27", "r28", "r29", \
- "cr2", "cr3", "cr4"
-static int
-slp_switch(void)
-{
- register int err;
- register int *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ ("mr %0, 1" : "=r" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "mr 11, %0\n"
- "add 1, 1, 11\n"
- "add 30, 30, 11\n"
- : /* no outputs */
- : "r" (stsizediff)
- : "11"
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("li %0, 0" : "=r" (err));
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_macosx.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_macosx.h
deleted file mode 100644
index 56e573f..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_macosx.h
+++ /dev/null
@@ -1,82 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 07-Sep-05 (py-dev mailing list discussion)
- * removed 'r31' from the registers to be saved. !!!! WARNING !!!!
- * It means that this file can no longer be compiled statically!
- * It is now only suitable as part of a dynamic library!
- * 14-Jan-04 Bob Ippolito
- * added cr2-cr4 to the registers to be saved.
- * Open questions: Should we save FP registers?
- * What about vector registers?
- * Differences between darwin and unix?
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 29-Jun-02 Christian Tismer
- * Added register 13-29, 31 saves. The same way as
- * Armin Rigo did for the x86_unix version.
- * This seems to be now fully functional!
- * 04-Mar-02 Hye-Shik Chang
- * Ported from i386.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 3
-
-/* !!!!WARNING!!!! need to add "r31" in the next line if this header file
- * is meant to be compiled non-dynamically!
- */
-#define REGS_TO_SAVE "r13", "r14", "r15", "r16", "r17", "r18", "r19", "r20", \
- "r21", "r22", "r23", "r24", "r25", "r26", "r27", "r28", "r29", \
- "cr2", "cr3", "cr4"
-
-static int
-slp_switch(void)
-{
- register int err;
- register int *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ ("; asm block 2\n\tmr %0, r1" : "=g" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "; asm block 3\n"
- "\tmr r11, %0\n"
- "\tadd r1, r1, r11\n"
- "\tadd r30, r30, r11\n"
- : /* no outputs */
- : "g" (stsizediff)
- : "r11"
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("li %0, 0" : "=r" (err));
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_unix.h
deleted file mode 100644
index 2b3d307..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_ppc_unix.h
+++ /dev/null
@@ -1,82 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 07-Sep-05 (py-dev mailing list discussion)
- * removed 'r31' from the registers to be saved. !!!! WARNING !!!!
- * It means that this file can no longer be compiled statically!
- * It is now only suitable as part of a dynamic library!
- * 14-Jan-04 Bob Ippolito
- * added cr2-cr4 to the registers to be saved.
- * Open questions: Should we save FP registers?
- * What about vector registers?
- * Differences between darwin and unix?
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 04-Oct-02 Gustavo Niemeyer
- * Ported from MacOS version.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 29-Jun-02 Christian Tismer
- * Added register 13-29, 31 saves. The same way as
- * Armin Rigo did for the x86_unix version.
- * This seems to be now fully functional!
- * 04-Mar-02 Hye-Shik Chang
- * Ported from i386.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 3
-
-/* !!!!WARNING!!!! need to add "r31" in the next line if this header file
- * is meant to be compiled non-dynamically!
- */
-#define REGS_TO_SAVE "r13", "r14", "r15", "r16", "r17", "r18", "r19", "r20", \
- "r21", "r22", "r23", "r24", "r25", "r26", "r27", "r28", "r29", \
- "cr2", "cr3", "cr4"
-static int
-slp_switch(void)
-{
- register int err;
- register int *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ ("mr %0, 1" : "=g" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "mr 11, %0\n"
- "add 1, 1, 11\n"
- "add 30, 30, 11\n"
- : /* no outputs */
- : "g" (stsizediff)
- : "11"
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("li %0, 0" : "=r" (err));
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_riscv_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_riscv_unix.h
deleted file mode 100644
index 5b5ea98..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_riscv_unix.h
+++ /dev/null
@@ -1,32 +0,0 @@
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-#define STACK_MAGIC 0
-
-#define REGS_TO_SAVE "s0", "s1", "s2", "s3", "s4", "s5", \
- "s6", "s7", "s8", "s9", "s10", "s11", "fs0", "fs1", \
- "fs2", "fs3", "fs4", "fs5", "fs6", "fs7", "fs8", "fs9", \
- "fs10", "fs11"
-
-static int
-slp_switch(void)
-{
- register int ret;
- register long *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("mv %0, sp" : "=r" (stackref) : );
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "add sp, sp, %0\n\t"
- : /* no outputs */
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("mv %0, zero" : "=r" (ret) : );
- return ret;
-}
-
-#endif
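
The RISC-V port above is the tersest statement of the shared structure: SLP_SAVE_STATE computes `stsizediff`, a single asm instruction slides the stack pointer, and SLP_RESTORE_STATE copies the target stack back in. A toy model of that contract, with heap arrays standing in for real stacks (every name and size below is a simplified stand-in, not greenlet's actual implementation):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

static char heap_save[64];            /* per-greenlet save buffer (toy) */
static char stack_a[64], stack_b[64]; /* two "stacks" (toy) */

int main(void)
{
    char *sp = stack_a + 32; /* current stack pointer (toy) */
    /* SLP_SAVE_STATE: spill the live slice, then compute the distance to
     * the target stack -- the value the asm above calls stsizediff. */
    memcpy(heap_save, sp, 32);
    intptr_t stsizediff = (intptr_t)(stack_b + 32) - (intptr_t)sp;
    /* The one real instruction: "add sp, sp, %0". */
    sp += stsizediff;
    /* SLP_RESTORE_STATE: copy the target greenlet's saved slice back in. */
    memcpy(sp, heap_save, 32);
    printf("sp now points into stack_b: %d\n", sp == stack_b + 32);
    return 0;
}
```
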
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_s390_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_s390_unix.h
deleted file mode 100644
index 6641854..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_s390_unix.h
+++ /dev/null
@@ -1,87 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 25-Jan-12 Alexey Borzenkov
- * Fixed Linux/S390 port to work correctly with
- * different optimization options both on 31-bit
- * and 64-bit. Thanks to Stefan Raabe for lots
- * of testing.
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 06-Oct-02 Gustavo Niemeyer
- * Ported to Linux/S390.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#ifdef __s390x__
-#define STACK_MAGIC 20 /* 20 * 8 = 160 bytes of function call area */
-#else
-#define STACK_MAGIC 24 /* 24 * 4 = 96 bytes of function call area */
-#endif
-
-/* Technically, r11-r13 also need saving, but the function prolog starts
- with stm(g), and since there are so many saved registers already,
- it won't be optimized, resulting in all of r6-r15 being saved */
-#define REGS_TO_SAVE "r6", "r7", "r8", "r9", "r10", "r14", \
- "f0", "f1", "f2", "f3", "f4", "f5", "f6", "f7", \
- "f8", "f9", "f10", "f11", "f12", "f13", "f14", "f15"
-
-static int
-slp_switch(void)
-{
- register int ret;
- register long *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
-#ifdef __s390x__
- __asm__ volatile ("lgr %0, 15" : "=r" (stackref) : );
-#else
- __asm__ volatile ("lr %0, 15" : "=r" (stackref) : );
-#endif
- {
- SLP_SAVE_STATE(stackref, stsizediff);
-/* N.B.
- r11 may be used as the frame pointer, and in that case it cannot be
- clobbered and needs offsetting just like the stack pointer (but in cases
- where frame pointer isn't used we might clobber it accidentally). What's
- scary is that r11 is the 2nd (and even the 1st when the GOT is used)
- callee-saved register that gcc would choose for surviving function calls.
- However, since r6-r10 are clobbered above, their cost for reuse is reduced,
- so gcc IRA will choose them over r11 (not seeing that r11 is implicitly saved),
- making it relatively safe to offset in all cases. :) */
- __asm__ volatile (
-#ifdef __s390x__
- "agr 15, %0\n\t"
- "agr 11, %0"
-#else
- "ar 15, %0\n\t"
- "ar 11, %0"
-#endif
- : /* no outputs */
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("lhi %0, 0" : "=r" (ret) : );
- return ret;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
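
The paired `agr 15, %0` / `agr 11, %0` above (like `add 30, 30, 11` in the PPC Linux port) reflects one rule: when the compiler materializes a frame pointer, it points into the very region being relocated, so it must be slid by the same delta as the stack pointer. A small runnable illustration, assuming GCC or Clang (`__builtin_frame_address` is a compiler extension):

```c
#include <stdio.h>

int main(void)
{
    void *fp = __builtin_frame_address(0); /* frame pointer of main, if any */
    int anchor = 0;                        /* lives in main's stack frame */
    printf("frame pointer %p, local at %p\n", fp, (void *)&anchor);
    /* Were this frame copied to a new address, every pointer into it --
     * including the frame pointer -- would need the same stsizediff added. */
    return 0;
}
```
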
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_sparc_sun_gcc.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_sparc_sun_gcc.h
deleted file mode 100644
index 652b57f..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_sparc_sun_gcc.h
+++ /dev/null
@@ -1,92 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 16-May-15 Alexey Borzenkov
- * Move stack spilling code inside save/restore functions
- * 30-Aug-13 Floris Bruynooghe
- * Clean the register windows again before returning.
- * This does not clobber the PIC register as it leaves
- * the current window intact and is required for multi-
- * threaded code to work correctly.
- * 08-Mar-11 Floris Bruynooghe
- * No need to set return value register explicitly
- * before the stack and framepointer are adjusted
- * as none of the other registers are influenced by
- * this. Also don't needlessly clean the windows
- * ('ta %0" :: "i" (ST_CLEAN_WINDOWS)') as that
- * clobbers the gcc PIC register (%l7).
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * added support for SunOS sparc with gcc
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-
-#define STACK_MAGIC 0
-
-
-#if defined(__sparcv9)
-#define SLP_FLUSHW __asm__ volatile ("flushw")
-#else
-#define SLP_FLUSHW __asm__ volatile ("ta 3") /* ST_FLUSH_WINDOWS */
-#endif
-
-/* On sparc we need to spill register windows inside save/restore functions */
-#define SLP_BEFORE_SAVE_STATE() SLP_FLUSHW
-#define SLP_BEFORE_RESTORE_STATE() SLP_FLUSHW
-
-
-static int
-slp_switch(void)
-{
- register int err;
- register int *stackref, stsizediff;
-
- /* Put current stack pointer into stackref.
- * Register spilling is done in save/restore.
- */
- __asm__ volatile ("mov %%sp, %0" : "=r" (stackref));
-
- {
- /* Thou shalt put SLP_SAVE_STATE into a local block */
- /* Copy the current stack onto the heap */
- SLP_SAVE_STATE(stackref, stsizediff);
-
- /* Increment stack and frame pointer by stsizediff */
- __asm__ volatile (
- "add %0, %%sp, %%sp\n\t"
- "add %0, %%fp, %%fp"
- : : "r" (stsizediff));
-
- /* Copy the new stack from its save store on the heap */
- SLP_RESTORE_STATE();
-
- __asm__ volatile ("mov %1, %0" : "=r" (err) : "i" (0));
- return err;
- }
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
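
SPARC is the only port here that defines SLP_BEFORE_SAVE_STATE / SLP_BEFORE_RESTORE_STATE, which the shared save/restore code presumably invokes so the register-window flush happens without special-casing the common path. A runnable sketch of that hook-macro pattern (the printf stands in for the real `ta 3`/`flushw` instruction):

```c
#include <stdio.h>

/* A platform that needs extra work before the stack is copied defines the
 * hook (SPARC: spill its register windows onto the stack)... */
#define SLP_BEFORE_SAVE_STATE() printf("flush register windows\n")

/* ...and the shared code falls back to a no-op everywhere else. */
#ifndef SLP_BEFORE_SAVE_STATE
# define SLP_BEFORE_SAVE_STATE() ((void)0)
#endif

static void save_state(void)
{
    SLP_BEFORE_SAVE_STATE();
    /* ...copy the live stack slice to the heap... */
}

int main(void)
{
    save_state();
    return 0;
}
```
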
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_x32_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_x32_unix.h
deleted file mode 100644
index cb14ec1..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_x32_unix.h
+++ /dev/null
@@ -1,63 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 17-Aug-12 Fantix King
- * Ported from amd64.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 0
-
-#define REGS_TO_SAVE "r12", "r13", "r14", "r15"
-
-
-static int
-slp_switch(void)
-{
- void* ebp;
- void* ebx;
- unsigned int csr;
- unsigned short cw;
- register int err;
- register int *stackref, stsizediff;
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("fstcw %0" : "=m" (cw));
- __asm__ volatile ("stmxcsr %0" : "=m" (csr));
- __asm__ volatile ("movl %%ebp, %0" : "=m" (ebp));
- __asm__ volatile ("movl %%ebx, %0" : "=m" (ebx));
- __asm__ ("movl %%esp, %0" : "=g" (stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "addl %0, %%esp\n"
- "addl %0, %%ebp\n"
- :
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- }
- __asm__ volatile ("movl %0, %%ebx" : : "m" (ebx));
- __asm__ volatile ("movl %0, %%ebp" : : "m" (ebp));
- __asm__ volatile ("ldmxcsr %0" : : "m" (csr));
- __asm__ volatile ("fldcw %0" : : "m" (cw));
- __asm__ volatile ("" : : : REGS_TO_SAVE);
- __asm__ volatile ("xorl %%eax, %%eax" : "=a" (err));
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
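
Beyond the general-purpose registers, the x86-family ports also save the x87 control word and MXCSR: those are thread state rather than stack state, so copying the stack alone would not carry rounding and exception modes across a switch. The same two instruction pairs in a runnable form, assuming GCC or Clang on x86 or x86-64:

```c
#include <stdio.h>

int main(void)
{
    unsigned short cw;
    unsigned int csr;
    __asm__ volatile ("fstcw %0" : "=m" (cw));    /* save x87 control word */
    __asm__ volatile ("stmxcsr %0" : "=m" (csr)); /* save SSE control/status */
    printf("fpu cw=0x%04x mxcsr=0x%08x\n", cw, csr);
    __asm__ volatile ("ldmxcsr %0" : : "m" (csr)); /* restore after the switch */
    __asm__ volatile ("fldcw %0" : : "m" (cw));
    return 0;
}
```
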
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_masm.asm b/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_masm.asm
deleted file mode 100644
index f5c72a2..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_masm.asm
+++ /dev/null
@@ -1,111 +0,0 @@
-;
-; stack switching code for MASM on x64
-; Kristjan Valur Jonsson, sept 2005
-;
-
-
-;prototypes for our calls
-slp_save_state_asm PROTO
-slp_restore_state_asm PROTO
-
-
-pushxmm MACRO reg
- sub rsp, 16
- .allocstack 16
- movaps [rsp], reg ; faster than movups, but we must be aligned
- ; .savexmm128 reg, offset (don't know what offset is, no documentation)
-ENDM
-popxmm MACRO reg
- movaps reg, [rsp] ; faster than movups, but we must be aligned
- add rsp, 16
-ENDM
-
-pushreg MACRO reg
- push reg
- .pushreg reg
-ENDM
-popreg MACRO reg
- pop reg
-ENDM
-
-
-.code
-slp_switch PROC FRAME
- ;realign stack to 16 bytes after return address push, makes the following faster
- sub rsp,8
- .allocstack 8
-
- pushxmm xmm15
- pushxmm xmm14
- pushxmm xmm13
- pushxmm xmm12
- pushxmm xmm11
- pushxmm xmm10
- pushxmm xmm9
- pushxmm xmm8
- pushxmm xmm7
- pushxmm xmm6
-
- pushreg r15
- pushreg r14
- pushreg r13
- pushreg r12
-
- pushreg rbp
- pushreg rbx
- pushreg rdi
- pushreg rsi
-
- sub rsp, 10h ;allocate the single function argument (must be multiple of 16)
- .allocstack 10h
-.endprolog
-
- lea rcx, [rsp+10h] ;load stack base that we are saving
- call slp_save_state_asm ;pass stackpointer, return offset in rax
- cmp rax, 1
- je EXIT1
- cmp rax, -1
- je EXIT2
- ;actual stack switch:
- add rsp, rax
- call slp_restore_state_asm
- xor rax, rax ;return 0
-
-EXIT:
-
- add rsp, 10h
- popreg rsi
- popreg rdi
- popreg rbx
- popreg rbp
-
- popreg r12
- popreg r13
- popreg r14
- popreg r15
-
- popxmm xmm6
- popxmm xmm7
- popxmm xmm8
- popxmm xmm9
- popxmm xmm10
- popxmm xmm11
- popxmm xmm12
- popxmm xmm13
- popxmm xmm14
- popxmm xmm15
-
- add rsp, 8
- ret
-
-EXIT1:
- mov rax, 1
- jmp EXIT
-
-EXIT2:
- sar rax, 1
- jmp EXIT
-
-slp_switch ENDP
-
-END
\ No newline at end of file
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_masm.obj b/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_masm.obj
deleted file mode 100644
index 64e3e6b..0000000
Binary files a/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_masm.obj and /dev/null differ
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_msvc.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_msvc.h
deleted file mode 100644
index 601ea56..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_x64_msvc.h
+++ /dev/null
@@ -1,60 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 26-Sep-02 Christian Tismer
- * again as a result of virtualized stack access,
- * the compiler used less registers. Needed to
- * explicit mention registers in order to get them saved.
- * Thanks to Jeff Senn for pointing this out and help.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 01-Mar-02 Christian Tismer
- * Initial final version after lots of iterations for i386.
- */
-
-/* Avoid alloca redefined warning on mingw64 */
-#ifndef alloca
-#define alloca _alloca
-#endif
-
-#define STACK_REFPLUS 1
-#define STACK_MAGIC 0
-
-/* Use the generic support for an external assembly language slp_switch function. */
-#define EXTERNAL_ASM
-
-#ifdef SLP_EVAL
-/* This always uses the external masm assembly file. */
-#endif
-
-/*
- * further self-processing support
- */
-
-/* we have IsBadReadPtr available, so we can peek at objects */
-/*
-#define STACKLESS_SPY
-
-#ifdef IMPLEMENT_STACKLESSMODULE
-#include "Windows.h"
-#define CANNOT_READ_MEM(p, bytes) IsBadReadPtr(p, bytes)
-
-static int IS_ON_STACK(void*p)
-{
- int stackref;
- intptr_t stackbase = ((intptr_t)&stackref) & 0xfffff000;
- return (intptr_t)p >= stackbase && (intptr_t)p < stackbase + 0x00100000;
-}
-
-#endif
-*/
\ No newline at end of file
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_x86_msvc.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_x86_msvc.h
deleted file mode 100644
index 010a22c..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_x86_msvc.h
+++ /dev/null
@@ -1,88 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 26-Sep-02 Christian Tismer
- * again as a result of virtualized stack access,
- * the compiler used less registers. Needed to
- * explicit mention registers in order to get them saved.
- * Thanks to Jeff Senn for pointing this out and help.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 01-Mar-02 Christian Tismer
- * Initial final version after lots of iterations for i386.
- */
-
-#define alloca _alloca
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-#define STACK_MAGIC 0
-
-/* Some magic to quell warnings and keep slp_switch() from crashing when built
- with VC90. Disable global optimizations, and the warning: frame pointer
- register 'ebp' modified by inline assembly code */
-#pragma optimize("g", off)
-#pragma warning(disable:4731)
-
-static int
-slp_switch(void)
-{
- void* seh;
- register int *stackref, stsizediff;
- __asm mov eax, fs:[0]
- __asm mov [seh], eax
- __asm mov stackref, esp;
- /* modify EBX, ESI and EDI in order to get them preserved */
- __asm mov ebx, ebx;
- __asm xchg esi, edi;
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm {
- mov eax, stsizediff
- add esp, eax
- add ebp, eax
- }
- SLP_RESTORE_STATE();
- }
- __asm mov eax, [seh]
- __asm mov fs:[0], eax
- return 0;
-}
-
-/* re-enable ebp warning and global optimizations. */
-#pragma optimize("g", on)
-#pragma warning(default:4731)
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/* we have IsBadReadPtr available, so we can peek at objects */
-#define STACKLESS_SPY
-
-#ifdef IMPLEMENT_STACKLESSMODULE
-#include "Windows.h"
-#define CANNOT_READ_MEM(p, bytes) IsBadReadPtr(p, bytes)
-
-static int IS_ON_STACK(void*p)
-{
- int stackref;
- int stackbase = ((int)&stackref) & 0xfffff000;
- return (int)p >= stackbase && (int)p < stackbase + 0x00100000;
-}
-
-#endif
diff --git a/env/lib/python3.9/site-packages/greenlet/platform/switch_x86_unix.h b/env/lib/python3.9/site-packages/greenlet/platform/switch_x86_unix.h
deleted file mode 100644
index 3a95186..0000000
--- a/env/lib/python3.9/site-packages/greenlet/platform/switch_x86_unix.h
+++ /dev/null
@@ -1,105 +0,0 @@
-/*
- * this is the internal transfer function.
- *
- * HISTORY
- * 3-May-13 Ralf Schmitt
- * Add support for strange GCC caller-save decisions
- * (ported from switch_aarch64_gcc.h)
- * 19-Aug-11 Alexey Borzenkov
- * Correctly save ebp, ebx and cw
- * 07-Sep-05 (py-dev mailing list discussion)
- * removed 'ebx' from the registers to be saved. !!!! WARNING !!!!
- * It means that this file can no longer be compiled statically!
- * It is now only suitable as part of a dynamic library!
- * 24-Nov-02 Christian Tismer
- * needed to add another magic constant to ensure
- * that f in slp_eval_frame(PyFrameObject *f)
- * gets included into the saved stack area.
- * STACK_REFPLUS will probably be 1 in most cases.
- * 17-Sep-02 Christian Tismer
- * after virtualizing stack save/restore, the
- * stack size shrunk a bit. Needed to introduce
- * an adjustment STACK_MAGIC per platform.
- * 15-Sep-02 Gerd Woetzel
- * slightly changed framework for sparc
- * 31-Apr-02 Armin Rigo
- * Added ebx, esi and edi register-saves.
- * 01-Mar-02 Samual M. Rushing
- * Ported from i386.
- */
-
-#define STACK_REFPLUS 1
-
-#ifdef SLP_EVAL
-
-/* #define STACK_MAGIC 3 */
-/* the above works fine with gcc 2.96, but 2.95.3 wants this */
-#define STACK_MAGIC 0
-
-#if __GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 5)
-# define ATTR_NOCLONE __attribute__((noclone))
-#else
-# define ATTR_NOCLONE
-#endif
-
-static int
-slp_switch(void)
-{
- int err;
-#ifdef _WIN32
- void *seh;
-#endif
- void *ebp, *ebx;
- unsigned short cw;
- register int *stackref, stsizediff;
- __asm__ volatile ("" : : : "esi", "edi");
- __asm__ volatile ("fstcw %0" : "=m" (cw));
- __asm__ volatile ("movl %%ebp, %0" : "=m" (ebp));
- __asm__ volatile ("movl %%ebx, %0" : "=m" (ebx));
-#ifdef _WIN32
- __asm__ volatile (
- "movl %%fs:0x0, %%eax\n"
- "movl %%eax, %0\n"
- : "=m" (seh)
- :
- : "eax");
-#endif
- __asm__ ("movl %%esp, %0" : "=g" (stackref));
- {
- SLP_SAVE_STATE(stackref, stsizediff);
- __asm__ volatile (
- "addl %0, %%esp\n"
- "addl %0, %%ebp\n"
- :
- : "r" (stsizediff)
- );
- SLP_RESTORE_STATE();
- __asm__ volatile ("xorl %%eax, %%eax" : "=a" (err));
- }
-#ifdef _WIN32
- __asm__ volatile (
- "movl %0, %%eax\n"
- "movl %%eax, %%fs:0x0\n"
- :
- : "m" (seh)
- : "eax");
-#endif
- __asm__ volatile ("movl %0, %%ebx" : : "m" (ebx));
- __asm__ volatile ("movl %0, %%ebp" : : "m" (ebp));
- __asm__ volatile ("fldcw %0" : : "m" (cw));
- __asm__ volatile ("" : : : "esi", "edi");
- return err;
-}
-
-#endif
-
-/*
- * further self-processing support
- */
-
-/*
- * if you want to add self-inspection tools, place them
- * here. See the x86_msvc for the necessary defines.
- * These features are highly experimental and not
- * essential yet.
- */
diff --git a/env/lib/python3.9/site-packages/greenlet/slp_platformselect.h b/env/lib/python3.9/site-packages/greenlet/slp_platformselect.h
deleted file mode 100644
index b5e8eb6..0000000
--- a/env/lib/python3.9/site-packages/greenlet/slp_platformselect.h
+++ /dev/null
@@ -1,58 +0,0 @@
-/*
- * Platform Selection for Stackless Python
- */
-
-#if defined(MS_WIN32) && !defined(MS_WIN64) && defined(_M_IX86) && defined(_MSC_VER)
-#include "platform/switch_x86_msvc.h" /* MS Visual Studio on X86 */
-#elif defined(MS_WIN64) && defined(_M_X64) && defined(_MSC_VER) || defined(__MINGW64__)
-#include "platform/switch_x64_msvc.h" /* MS Visual Studio on X64 */
-#elif defined(__GNUC__) && defined(__amd64__) && defined(__ILP32__)
-#include "platform/switch_x32_unix.h" /* gcc on amd64 with x32 ABI */
-#elif defined(__GNUC__) && defined(__amd64__)
-#include "platform/switch_amd64_unix.h" /* gcc on amd64 */
-#elif defined(__GNUC__) && defined(__i386__)
-#include "platform/switch_x86_unix.h" /* gcc on X86 */
-#elif defined(__GNUC__) && defined(__powerpc64__) && (defined(__linux__) || defined(__FreeBSD__))
-#include "platform/switch_ppc64_linux.h" /* gcc on PowerPC 64-bit */
-#elif defined(__GNUC__) && defined(__PPC__) && (defined(__linux__) || defined(__FreeBSD__))
-#include "platform/switch_ppc_linux.h" /* gcc on PowerPC */
-#elif defined(__GNUC__) && defined(__ppc__) && defined(__APPLE__)
-#include "platform/switch_ppc_macosx.h" /* Apple MacOS X on PowerPC */
-#elif defined(__GNUC__) && defined(__powerpc64__) && defined(_AIX)
-#include "platform/switch_ppc64_aix.h" /* gcc on AIX/PowerPC 64-bit */
-#elif defined(__GNUC__) && defined(_ARCH_PPC) && defined(_AIX)
-#include "platform/switch_ppc_aix.h" /* gcc on AIX/PowerPC */
-#elif defined(__GNUC__) && defined(sparc)
-#include "platform/switch_sparc_sun_gcc.h" /* SunOS sparc with gcc */
-#elif defined(__SUNPRO_C) && defined(sparc) && defined(sun)
-#include "platform/switch_sparc_sun_gcc.h" /* SunStudio on amd64 */
-#elif defined(__SUNPRO_C) && defined(__amd64__) && defined(sun)
-#include "platform/switch_amd64_unix.h" /* SunStudio on amd64 */
-#elif defined(__SUNPRO_C) && defined(__i386__) && defined(sun)
-#include "platform/switch_x86_unix.h" /* SunStudio on x86 */
-#elif defined(__GNUC__) && defined(__s390__) && defined(__linux__)
-#include "platform/switch_s390_unix.h" /* Linux/S390 */
-#elif defined(__GNUC__) && defined(__s390x__) && defined(__linux__)
-#include "platform/switch_s390_unix.h" /* Linux/S390 zSeries (64-bit) */
-#elif defined(__GNUC__) && defined(__arm__)
-#ifdef __APPLE__
-#include <TargetConditionals.h>
-#endif
-#if TARGET_OS_IPHONE
-#include "platform/switch_arm32_ios.h" /* iPhone OS on arm32 */
-#else
-#include "platform/switch_arm32_gcc.h" /* gcc using arm32 */
-#endif
-#elif defined(__GNUC__) && defined(__mips__) && defined(__linux__)
-#include "platform/switch_mips_unix.h" /* Linux/MIPS */
-#elif defined(__GNUC__) && defined(__aarch64__)
-#include "platform/switch_aarch64_gcc.h" /* Aarch64 ABI */
-#elif defined(__GNUC__) && defined(__mc68000__)
-#include "platform/switch_m68k_gcc.h" /* gcc on m68k */
-#elif defined(__GNUC__) && defined(__csky__)
-#include "platform/switch_csky_gcc.h" /* gcc on csky */
-#elif defined(__GNUC__) && defined(__riscv)
-#include "platform/switch_riscv_unix.h" /* gcc on RISC-V */
-#elif defined(__GNUC__) && defined(__alpha__)
-#include "platform/switch_alpha_unix.h" /* gcc on DEC Alpha */
-#endif
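
slp_platformselect.h is one long `#elif` chain over compiler-predefined macros; exactly one switch_*.h header wins at compile time. The same dispatch idea in a self-contained, runnable form (the macros tested below are standard compiler-defined ones; the strings are illustrative):

```c
#include <stdio.h>

#if defined(__x86_64__) || defined(_M_X64)
# define PLATFORM "amd64"   /* would pick switch_amd64_unix.h / x64_msvc.h */
#elif defined(__i386__) || defined(_M_IX86)
# define PLATFORM "x86"     /* would pick switch_x86_unix.h / x86_msvc.h */
#elif defined(__aarch64__)
# define PLATFORM "aarch64" /* would pick switch_aarch64_gcc.h */
#else
# define PLATFORM "unknown" /* no switching support selected */
#endif

int main(void)
{
    printf("selected platform: %s\n", PLATFORM);
    return 0;
}
```
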
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/__init__.py b/env/lib/python3.9/site-packages/greenlet/tests/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension.c b/env/lib/python3.9/site-packages/greenlet/tests/_test_extension.c
deleted file mode 100644
index 4fe087d..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension.c
+++ /dev/null
@@ -1,216 +0,0 @@
-/* This is a set of functions used by test_extension_interface.py to test the
- * Greenlet C API.
- */
-
-#include "../greenlet.h"
-
-#ifndef Py_RETURN_NONE
-# define Py_RETURN_NONE return Py_INCREF(Py_None), Py_None
-#endif
-
-#define TEST_MODULE_NAME "_test_extension"
-
-static PyObject*
-test_switch(PyObject* self, PyObject* greenlet)
-{
- PyObject* result = NULL;
-
- if (greenlet == NULL || !PyGreenlet_Check(greenlet)) {
- PyErr_BadArgument();
- return NULL;
- }
-
- result = PyGreenlet_Switch((PyGreenlet*)greenlet, NULL, NULL);
- if (result == NULL) {
- if (!PyErr_Occurred()) {
- PyErr_SetString(PyExc_AssertionError,
- "greenlet.switch() failed for some reason.");
- }
- return NULL;
- }
- Py_INCREF(result);
- return result;
-}
-
-static PyObject*
-test_switch_kwargs(PyObject* self, PyObject* args, PyObject* kwargs)
-{
- PyGreenlet* g = NULL;
- PyObject* result = NULL;
-
- PyArg_ParseTuple(args, "O!", &PyGreenlet_Type, &g);
-
- if (g == NULL || !PyGreenlet_Check(g)) {
- PyErr_BadArgument();
- return NULL;
- }
-
- result = PyGreenlet_Switch(g, NULL, kwargs);
- if (result == NULL) {
- if (!PyErr_Occurred()) {
- PyErr_SetString(PyExc_AssertionError,
- "greenlet.switch() failed for some reason.");
- }
- return NULL;
- }
- Py_XINCREF(result);
- return result;
-}
-
-static PyObject*
-test_getcurrent(PyObject* self)
-{
- PyGreenlet* g = PyGreenlet_GetCurrent();
- if (g == NULL || !PyGreenlet_Check(g) || !PyGreenlet_ACTIVE(g)) {
- PyErr_SetString(PyExc_AssertionError,
- "getcurrent() returned an invalid greenlet");
- Py_XDECREF(g);
- return NULL;
- }
- Py_DECREF(g);
- Py_RETURN_NONE;
-}
-
-static PyObject*
-test_setparent(PyObject* self, PyObject* arg)
-{
- PyGreenlet* current;
- PyGreenlet* greenlet = NULL;
-
- if (arg == NULL || !PyGreenlet_Check(arg)) {
- PyErr_BadArgument();
- return NULL;
- }
- if ((current = PyGreenlet_GetCurrent()) == NULL) {
- return NULL;
- }
- greenlet = (PyGreenlet*)arg;
- if (PyGreenlet_SetParent(greenlet, current)) {
- Py_DECREF(current);
- return NULL;
- }
- Py_DECREF(current);
- if (PyGreenlet_Switch(greenlet, NULL, NULL) == NULL) {
- return NULL;
- }
- Py_RETURN_NONE;
-}
-
-static PyObject*
-test_new_greenlet(PyObject* self, PyObject* callable)
-{
- PyObject* result = NULL;
- PyGreenlet* greenlet = PyGreenlet_New(callable, NULL);
-
- if (!greenlet) {
- return NULL;
- }
-
- result = PyGreenlet_Switch(greenlet, NULL, NULL);
- if (result == NULL) {
- return NULL;
- }
-
- Py_INCREF(result);
- return result;
-}
-
-static PyObject*
-test_raise_dead_greenlet(PyObject* self)
-{
- PyErr_SetString(PyExc_GreenletExit, "test GreenletExit exception.");
- return NULL;
-}
-
-static PyObject*
-test_raise_greenlet_error(PyObject* self)
-{
- PyErr_SetString(PyExc_GreenletError, "test greenlet.error exception");
- return NULL;
-}
-
-static PyObject*
-test_throw(PyObject* self, PyGreenlet* g)
-{
- const char msg[] = "take that sucka!";
- PyObject* msg_obj = Py_BuildValue("s", msg);
- PyGreenlet_Throw(g, PyExc_ValueError, msg_obj, NULL);
- Py_DECREF(msg_obj);
- Py_RETURN_NONE;
-}
-
-static PyMethodDef test_methods[] = {
- {"test_switch",
- (PyCFunction)test_switch,
- METH_O,
- "Switch to the provided greenlet sending provided arguments, and \n"
- "return the results."},
- {"test_switch_kwargs",
- (PyCFunction)test_switch_kwargs,
- METH_VARARGS | METH_KEYWORDS,
- "Switch to the provided greenlet sending the provided keyword args."},
- {"test_getcurrent",
- (PyCFunction)test_getcurrent,
- METH_NOARGS,
- "Test PyGreenlet_GetCurrent()"},
- {"test_setparent",
- (PyCFunction)test_setparent,
- METH_O,
- "Se the parent of the provided greenlet and switch to it."},
- {"test_new_greenlet",
- (PyCFunction)test_new_greenlet,
- METH_O,
- "Test PyGreenlet_New()"},
- {"test_raise_dead_greenlet",
- (PyCFunction)test_raise_dead_greenlet,
- METH_NOARGS,
- "Just raise greenlet.GreenletExit"},
- {"test_raise_greenlet_error",
- (PyCFunction)test_raise_greenlet_error,
- METH_NOARGS,
- "Just raise greenlet.error"},
- {"test_throw",
- (PyCFunction)test_throw,
- METH_O,
- "Throw a ValueError at the provided greenlet"},
- {NULL, NULL, 0, NULL}};
-
-#if PY_MAJOR_VERSION >= 3
-# define INITERROR return NULL
-
-static struct PyModuleDef moduledef = {PyModuleDef_HEAD_INIT,
- TEST_MODULE_NAME,
- NULL,
- 0,
- test_methods,
- NULL,
- NULL,
- NULL,
- NULL};
-
-PyMODINIT_FUNC
-PyInit__test_extension(void)
-#else
-# define INITERROR return
-PyMODINIT_FUNC
-init_test_extension(void)
-#endif
-{
- PyObject* module = NULL;
-
-#if PY_MAJOR_VERSION >= 3
- module = PyModule_Create(&moduledef);
-#else
- module = Py_InitModule(TEST_MODULE_NAME, test_methods);
-#endif
-
- if (module == NULL) {
- INITERROR;
- }
-
- PyGreenlet_Import();
-
-#if PY_MAJOR_VERSION >= 3
- return module;
-#endif
-}
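
The deleted test extension doubles as a reference for greenlet's public C API: import the capsule once at module initialization, then create and switch greenlets with PyGreenlet_New and PyGreenlet_Switch. A trimmed sketch along the same lines (the module name, include path, and error handling here are assumptions for illustration, not greenlet's documented boilerplate):

```c
#include <Python.h>
#include "greenlet.h" /* assumes the installed greenlet package dir is on the include path */

static PyObject *
run_in_greenlet(PyObject *self, PyObject *callable)
{
    PyGreenlet *g = PyGreenlet_New(callable, NULL); /* parent defaults to current */
    if (g == NULL) {
        return NULL;
    }
    PyObject *result = PyGreenlet_Switch(g, NULL, NULL);
    Py_XINCREF(result); /* the tests above treat the result as borrowed */
    Py_DECREF(g);
    return result;
}

static PyMethodDef methods[] = {
    {"run_in_greenlet", (PyCFunction)run_in_greenlet, METH_O,
     "Run a callable in a fresh greenlet and return its result."},
    {NULL, NULL, 0, NULL}};

static struct PyModuleDef moduledef = {
    PyModuleDef_HEAD_INIT, "_sketch", NULL, 0, methods,
    NULL, NULL, NULL, NULL};

PyMODINIT_FUNC
PyInit__sketch(void)
{
    PyObject *module = PyModule_Create(&moduledef);
    if (module == NULL) {
        return NULL;
    }
    PyGreenlet_Import(); /* required before any other PyGreenlet_* call */
    if (PyErr_Occurred()) {
        Py_DECREF(module);
        return NULL;
    }
    return module;
}
```
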
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension.cpython-39-darwin.so b/env/lib/python3.9/site-packages/greenlet/tests/_test_extension.cpython-39-darwin.so
deleted file mode 100755
index 00dc1f3..0000000
Binary files a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension.cpython-39-darwin.so and /dev/null differ
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension_cpp.cpp b/env/lib/python3.9/site-packages/greenlet/tests/_test_extension_cpp.cpp
deleted file mode 100644
index 72e3d81..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension_cpp.cpp
+++ /dev/null
@@ -1,121 +0,0 @@
-/* This is a set of functions used to test C++ exceptions are not
- * broken during greenlet switches
- */
-
-#include "../greenlet.h"
-
-struct exception_t {
- int depth;
- exception_t(int depth) : depth(depth) {}
-};
-
-/* Functions are called via pointers to prevent inlining */
-static void (*p_test_exception_throw)(int depth);
-static PyObject* (*p_test_exception_switch_recurse)(int depth, int left);
-
-static void
-test_exception_throw(int depth)
-{
- throw exception_t(depth);
-}
-
-static PyObject*
-test_exception_switch_recurse(int depth, int left)
-{
- if (left > 0) {
- return p_test_exception_switch_recurse(depth, left - 1);
- }
-
- PyObject* result = NULL;
- PyGreenlet* self = PyGreenlet_GetCurrent();
- if (self == NULL)
- return NULL;
-
- try {
- PyGreenlet_Switch(self->parent, NULL, NULL);
- p_test_exception_throw(depth);
- PyErr_SetString(PyExc_RuntimeError,
- "throwing C++ exception didn't work");
- }
- catch (exception_t& e) {
- if (e.depth != depth)
- PyErr_SetString(PyExc_AssertionError, "depth mismatch");
- else
- result = PyLong_FromLong(depth);
- }
- catch (...) {
- PyErr_SetString(PyExc_RuntimeError, "unexpected C++ exception");
- }
-
- Py_DECREF(self);
- return result;
-}
-
-/* test_exception_switch(int depth)
- * - recurses depth times
- * - switches to parent inside try/catch block
- * - throws an exception (expected to be caught in the same function)
- * - verifies depth matches (exceptions shouldn't be caught in other greenlets)
- */
-static PyObject*
-test_exception_switch(PyObject* self, PyObject* args)
-{
- int depth;
- if (!PyArg_ParseTuple(args, "i", &depth))
- return NULL;
- return p_test_exception_switch_recurse(depth, depth);
-}
-
-static PyMethodDef test_methods[] = {
- {"test_exception_switch",
- (PyCFunction)&test_exception_switch,
- METH_VARARGS,
- "Switches to parent twice, to test exception handling and greenlet "
- "switching."},
- {NULL, NULL, 0, NULL}};
-
-#if PY_MAJOR_VERSION >= 3
-# define INITERROR return NULL
-
-static struct PyModuleDef moduledef = {PyModuleDef_HEAD_INIT,
- "greenlet.tests._test_extension_cpp",
- NULL,
- 0,
- test_methods,
- NULL,
- NULL,
- NULL,
- NULL};
-
-PyMODINIT_FUNC
-PyInit__test_extension_cpp(void)
-#else
-# define INITERROR return
-PyMODINIT_FUNC
-init_test_extension_cpp(void)
-#endif
-{
- PyObject* module = NULL;
-
-#if PY_MAJOR_VERSION >= 3
- module = PyModule_Create(&moduledef);
-#else
- module = Py_InitModule("greenlet.tests._test_extension_cpp", test_methods);
-#endif
-
- if (module == NULL) {
- INITERROR;
- }
-
- PyGreenlet_Import();
- if (_PyGreenlet_API == NULL) {
- INITERROR;
- }
-
- p_test_exception_throw = test_exception_throw;
- p_test_exception_switch_recurse = test_exception_switch_recurse;
-
-#if PY_MAJOR_VERSION >= 3
- return module;
-#endif
-}
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension_cpp.cpython-39-darwin.so b/env/lib/python3.9/site-packages/greenlet/tests/_test_extension_cpp.cpython-39-darwin.so
deleted file mode 100755
index 64f5f18..0000000
Binary files a/env/lib/python3.9/site-packages/greenlet/tests/_test_extension_cpp.cpython-39-darwin.so and /dev/null differ
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_contextvars.py b/env/lib/python3.9/site-packages/greenlet/tests/test_contextvars.py
deleted file mode 100644
index 49b7c0d..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_contextvars.py
+++ /dev/null
@@ -1,266 +0,0 @@
-import unittest
-import gc
-import sys
-
-from functools import partial
-
-from greenlet import greenlet
-from greenlet import getcurrent
-
-
-try:
- from contextvars import Context
- from contextvars import ContextVar
- from contextvars import copy_context
-except ImportError:
- Context = ContextVar = copy_context = None
-
-# We don't support testing if greenlet's built-in context var support is disabled.
-@unittest.skipUnless(Context is not None, "ContextVar not supported")
-class ContextVarsTests(unittest.TestCase):
- def _new_ctx_run(self, *args, **kwargs):
- return copy_context().run(*args, **kwargs)
-
- def _increment(self, greenlet_id, ctx_var, callback, counts, expect):
- if expect is None:
- self.assertIsNone(ctx_var.get())
- else:
- self.assertEqual(ctx_var.get(), expect)
- ctx_var.set(greenlet_id)
- for _ in range(2):
- counts[ctx_var.get()] += 1
- callback()
-
- def _test_context(self, propagate_by):
- id_var = ContextVar("id", default=None)
- id_var.set(0)
-
- callback = getcurrent().switch
- counts = dict((i, 0) for i in range(5))
-
- lets = [
- greenlet(partial(
- partial(
- copy_context().run,
- self._increment
- ) if propagate_by == "run" else self._increment,
- greenlet_id=i,
- ctx_var=id_var,
- callback=callback,
- counts=counts,
- expect=(
- i - 1 if propagate_by == "share" else
- 0 if propagate_by in ("set", "run") else None
- )
- ))
- for i in range(1, 5)
- ]
-
- for let in lets:
- if propagate_by == "set":
- let.gr_context = copy_context()
- elif propagate_by == "share":
- let.gr_context = getcurrent().gr_context
-
- for i in range(2):
- counts[id_var.get()] += 1
- for let in lets:
- let.switch()
-
- if propagate_by == "run":
- # Must leave each context.run() in reverse order of entry
- for let in reversed(lets):
- let.switch()
- else:
- # No context.run(), so fine to exit in any order.
- for let in lets:
- let.switch()
-
- for let in lets:
- self.assertTrue(let.dead)
- # When using run(), we leave the run() as the greenlet dies,
- # and there's no context "underneath". When not using run(),
- # gr_context still reflects the context the greenlet was
- # running in.
- self.assertEqual(let.gr_context is None, propagate_by == "run")
-
- if propagate_by == "share":
- self.assertEqual(counts, {0: 1, 1: 1, 2: 1, 3: 1, 4: 6})
- else:
- self.assertEqual(set(counts.values()), set([2]))
-
- def test_context_propagated_by_context_run(self):
- self._new_ctx_run(self._test_context, "run")
-
- def test_context_propagated_by_setting_attribute(self):
- self._new_ctx_run(self._test_context, "set")
-
- def test_context_not_propagated(self):
- self._new_ctx_run(self._test_context, None)
-
- def test_context_shared(self):
- self._new_ctx_run(self._test_context, "share")
-
- def test_break_ctxvars(self):
- let1 = greenlet(copy_context().run)
- let2 = greenlet(copy_context().run)
- let1.switch(getcurrent().switch)
- let2.switch(getcurrent().switch)
- # Since let2 entered the current context and let1 exits its own, the
- # interpreter emits:
- # RuntimeError: cannot exit context: thread state references a different context object
- let1.switch()
-
- def test_not_broken_if_using_attribute_instead_of_context_run(self):
- let1 = greenlet(getcurrent().switch)
- let2 = greenlet(getcurrent().switch)
- let1.gr_context = copy_context()
- let2.gr_context = copy_context()
- let1.switch()
- let2.switch()
- let1.switch()
- let2.switch()
-
- def test_context_assignment_while_running(self):
- id_var = ContextVar("id", default=None)
-
- def target():
- self.assertIsNone(id_var.get())
- self.assertIsNone(gr.gr_context)
-
- # Context is created on first use
- id_var.set(1)
- self.assertIsInstance(gr.gr_context, Context)
- self.assertEqual(id_var.get(), 1)
- self.assertEqual(gr.gr_context[id_var], 1)
-
- # Clearing the context makes it get re-created as another
- # empty context when next used
- old_context = gr.gr_context
- gr.gr_context = None # assign None while running
- self.assertIsNone(id_var.get())
- self.assertIsNone(gr.gr_context)
- id_var.set(2)
- self.assertIsInstance(gr.gr_context, Context)
- self.assertEqual(id_var.get(), 2)
- self.assertEqual(gr.gr_context[id_var], 2)
-
- new_context = gr.gr_context
- getcurrent().parent.switch((old_context, new_context))
- # parent switches us back to old_context
-
- self.assertEqual(id_var.get(), 1)
- gr.gr_context = new_context # assign non-None while running
- self.assertEqual(id_var.get(), 2)
-
- getcurrent().parent.switch()
- # parent switches us back to no context
- self.assertIsNone(id_var.get())
- self.assertIsNone(gr.gr_context)
- gr.gr_context = old_context
- self.assertEqual(id_var.get(), 1)
-
- getcurrent().parent.switch()
- # parent switches us back to no context
- self.assertIsNone(id_var.get())
- self.assertIsNone(gr.gr_context)
-
- gr = greenlet(target)
-
- with self.assertRaisesRegex(AttributeError, "can't delete attr"):
- del gr.gr_context
-
- self.assertIsNone(gr.gr_context)
- old_context, new_context = gr.switch()
- self.assertIs(new_context, gr.gr_context)
- self.assertEqual(old_context[id_var], 1)
- self.assertEqual(new_context[id_var], 2)
- self.assertEqual(new_context.run(id_var.get), 2)
- gr.gr_context = old_context # assign non-None while suspended
- gr.switch()
- self.assertIs(gr.gr_context, new_context)
- gr.gr_context = None # assign None while suspended
- gr.switch()
- self.assertIs(gr.gr_context, old_context)
- gr.gr_context = None
- gr.switch()
- self.assertIsNone(gr.gr_context)
-
- # Make sure there are no reference leaks
- gr = None
- gc.collect()
- self.assertEqual(sys.getrefcount(old_context), 2)
- self.assertEqual(sys.getrefcount(new_context), 2)
-
- def test_context_assignment_different_thread(self):
- import threading
-
- ctx = Context()
- var = ContextVar("var", default=None)
- is_running = threading.Event()
- should_suspend = threading.Event()
- did_suspend = threading.Event()
- should_exit = threading.Event()
- holder = []
-
- def greenlet_in_thread_fn():
- var.set(1)
- is_running.set()
- should_suspend.wait()
- var.set(2)
- getcurrent().parent.switch()
- holder.append(var.get())
-
- def thread_fn():
- gr = greenlet(greenlet_in_thread_fn)
- gr.gr_context = ctx
- holder.append(gr)
- gr.switch()
- did_suspend.set()
- should_exit.wait()
- gr.switch()
-
- thread = threading.Thread(target=thread_fn, daemon=True)
- thread.start()
- is_running.wait()
- gr = holder[0]
-
- # Can't access or modify context if the greenlet is running
- # in a different thread
- with self.assertRaisesRegex(ValueError, "running in a different"):
- getattr(gr, 'gr_context')
- with self.assertRaisesRegex(ValueError, "running in a different"):
- gr.gr_context = None
-
- should_suspend.set()
- did_suspend.wait()
-
- # OK to access and modify context if greenlet is suspended
- self.assertIs(gr.gr_context, ctx)
- self.assertEqual(gr.gr_context[var], 2)
- gr.gr_context = None
-
- should_exit.set()
- thread.join()
-
- self.assertEqual(holder, [gr, None])
-
- # Context can still be accessed/modified when greenlet is dead:
- self.assertIsNone(gr.gr_context)
- gr.gr_context = ctx
- self.assertIs(gr.gr_context, ctx)
-
-@unittest.skipIf(Context is not None, "ContextVar supported")
-class NoContextVarsTests(unittest.TestCase):
- def test_contextvars_errors(self):
- let1 = greenlet(getcurrent().switch)
- self.assertFalse(hasattr(let1, 'gr_context'))
- with self.assertRaises(AttributeError):
- getattr(let1, 'gr_context')
- with self.assertRaises(AttributeError):
- let1.gr_context = None
- let1.switch()
- with self.assertRaises(AttributeError):
- getattr(let1, 'gr_context')
- with self.assertRaises(AttributeError):
- let1.gr_context = None
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_cpp.py b/env/lib/python3.9/site-packages/greenlet/tests/test_cpp.py
deleted file mode 100644
index 741ea10..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_cpp.py
+++ /dev/null
@@ -1,18 +0,0 @@
-from __future__ import print_function
-from __future__ import absolute_import
-
-import unittest
-
-import greenlet
-from . import _test_extension_cpp
-
-
-class CPPTests(unittest.TestCase):
- def test_exception_switch(self):
- greenlets = []
- for i in range(4):
- g = greenlet.greenlet(_test_extension_cpp.test_exception_switch)
- g.switch(i)
- greenlets.append(g)
- for i, g in enumerate(greenlets):
- self.assertEqual(g.switch(), i)
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_extension_interface.py b/env/lib/python3.9/site-packages/greenlet/tests/test_extension_interface.py
deleted file mode 100644
index a92ea1f..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_extension_interface.py
+++ /dev/null
@@ -1,77 +0,0 @@
-from __future__ import print_function
-from __future__ import absolute_import
-
-import sys
-import unittest
-
-import greenlet
-from . import _test_extension
-
-
-class CAPITests(unittest.TestCase):
- def test_switch(self):
- self.assertEqual(
- 50, _test_extension.test_switch(greenlet.greenlet(lambda: 50)))
-
- def test_switch_kwargs(self):
- def foo(x, y):
- return x * y
- g = greenlet.greenlet(foo)
- self.assertEqual(6, _test_extension.test_switch_kwargs(g, x=3, y=2))
-
- def test_setparent(self):
- def foo():
- def bar():
- greenlet.getcurrent().parent.switch()
-
- # This final switch should go back to the main greenlet, since
- # the test_setparent() function in the C extension should have
- # reparented this greenlet.
- greenlet.getcurrent().parent.switch()
- raise AssertionError("Should never have reached this code")
- child = greenlet.greenlet(bar)
- child.switch()
- greenlet.getcurrent().parent.switch(child)
- greenlet.getcurrent().parent.throw(
- AssertionError("Should never reach this code"))
- foo_child = greenlet.greenlet(foo).switch()
- self.assertEqual(None, _test_extension.test_setparent(foo_child))
-
- def test_getcurrent(self):
- _test_extension.test_getcurrent()
-
- def test_new_greenlet(self):
- self.assertEqual(-15, _test_extension.test_new_greenlet(lambda: -15))
-
- def test_raise_greenlet_dead(self):
- self.assertRaises(
- greenlet.GreenletExit, _test_extension.test_raise_dead_greenlet)
-
- def test_raise_greenlet_error(self):
- self.assertRaises(
- greenlet.error, _test_extension.test_raise_greenlet_error)
-
- def test_throw(self):
- seen = []
-
- def foo():
- try:
- greenlet.getcurrent().parent.switch()
- except ValueError:
- seen.append(sys.exc_info()[1])
- except greenlet.GreenletExit:
- raise AssertionError
- g = greenlet.greenlet(foo)
- g.switch()
- _test_extension.test_throw(g)
- self.assertEqual(len(seen), 1)
- self.assertTrue(
- isinstance(seen[0], ValueError),
- "ValueError was not raised in foo()")
- self.assertEqual(
- str(seen[0]),
- 'take that sucka!',
- "message doesn't match")
-
-if __name__ == '__main__':
- unittest.main()
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_gc.py b/env/lib/python3.9/site-packages/greenlet/tests/test_gc.py
deleted file mode 100644
index a2a41ca..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_gc.py
+++ /dev/null
@@ -1,77 +0,0 @@
-import gc
-import sys
-import unittest
-import weakref
-
-import greenlet
-
-
-class GCTests(unittest.TestCase):
- def test_dead_circular_ref(self):
- o = weakref.ref(greenlet.greenlet(greenlet.getcurrent).switch())
- gc.collect()
- self.assertTrue(o() is None)
- self.assertFalse(gc.garbage, gc.garbage)
-
- if greenlet.GREENLET_USE_GC:
- # These only work with greenlet gc support
-
- def test_circular_greenlet(self):
- class circular_greenlet(greenlet.greenlet):
- pass
- o = circular_greenlet()
- o.self = o
- o = weakref.ref(o)
- gc.collect()
- self.assertTrue(o() is None)
- self.assertFalse(gc.garbage, gc.garbage)
-
- def test_inactive_ref(self):
- class inactive_greenlet(greenlet.greenlet):
- def __init__(self):
- greenlet.greenlet.__init__(self, run=self.run)
-
- def run(self):
- pass
- o = inactive_greenlet()
- o = weakref.ref(o)
- gc.collect()
- self.assertTrue(o() is None)
- self.assertFalse(gc.garbage, gc.garbage)
-
- def test_finalizer_crash(self):
- # This test is designed to crash when active greenlets
- # are made garbage collectable, until the underlying
- # problem is resolved. How does it work:
- # - order of object creation is important
- # - array is created first, so it is moved to unreachable first
- # - we create a cycle between a greenlet and this array
- # - we create an object that participates in gc, is only
- # referenced by a greenlet, and would corrupt gc lists
- # on destruction, the easiest is to use an object with
- # a finalizer
- # - because array is the first object in unreachable it is
- # cleared first, which causes all references to greenlet
- # to disappear and causes greenlet to be destroyed, but since
- # it is still live it causes a switch during gc, which causes
- # an object with finalizer to be destroyed, which causes stack
- # corruption and then a crash
- class object_with_finalizer(object):
- def __del__(self):
- pass
- array = []
- parent = greenlet.getcurrent()
- def greenlet_body():
- greenlet.getcurrent().object = object_with_finalizer()
- try:
- parent.switch()
- finally:
- del greenlet.getcurrent().object
- g = greenlet.greenlet(greenlet_body)
- g.array = array
- array.append(g)
- g.switch()
- del array
- del g
- greenlet.getcurrent()
- gc.collect()
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_generator.py b/env/lib/python3.9/site-packages/greenlet/tests/test_generator.py
deleted file mode 100644
index 62f9f26..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_generator.py
+++ /dev/null
@@ -1,59 +0,0 @@
-import unittest
-from greenlet import greenlet
-
-
-class genlet(greenlet):
-
- def __init__(self, *args, **kwds):
- self.args = args
- self.kwds = kwds
-
- def run(self):
- fn, = self.fn
- fn(*self.args, **self.kwds)
-
- def __iter__(self):
- return self
-
- def __next__(self):
- self.parent = greenlet.getcurrent()
- result = self.switch()
- if self:
- return result
- else:
- raise StopIteration
-
- # Hack: Python < 2.6 compatibility
- next = __next__
-
-
-def Yield(value):
- g = greenlet.getcurrent()
- while not isinstance(g, genlet):
- if g is None:
- raise RuntimeError('yield outside a genlet')
- g = g.parent
- g.parent.switch(value)
-
-
-def generator(func):
- class generator(genlet):
- fn = (func,)
- return generator
-
-# ____________________________________________________________
-
-
-class GeneratorTests(unittest.TestCase):
- def test_generator(self):
- seen = []
-
- def g(n):
- for i in range(n):
- seen.append(i)
- Yield(i)
- g = generator(g)
- for k in range(3):
- for j in g(5):
- seen.append(j)
- self.assertEqual(seen, 3 * [0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_generator_nested.py b/env/lib/python3.9/site-packages/greenlet/tests/test_generator_nested.py
deleted file mode 100644
index 6b4f023..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_generator_nested.py
+++ /dev/null
@@ -1,165 +0,0 @@
-import unittest
-from greenlet import greenlet
-
-
-class genlet(greenlet):
-
- def __init__(self, *args, **kwds):
- self.args = args
- self.kwds = kwds
- self.child = None
-
- def run(self):
- fn, = self.fn
- fn(*self.args, **self.kwds)
-
- def __iter__(self):
- return self
-
- def set_child(self, child):
- self.child = child
-
- def __next__(self):
- if self.child:
- child = self.child
- while child.child:
- tmp = child
- child = child.child
- tmp.child = None
-
- result = child.switch()
- else:
- self.parent = greenlet.getcurrent()
- result = self.switch()
-
- if self:
- return result
- else:
- raise StopIteration
-
- # Hack: Python < 2.6 compatibility
- next = __next__
-
-
-def Yield(value, level=1):
- g = greenlet.getcurrent()
-
- while level != 0:
- if not isinstance(g, genlet):
- raise RuntimeError('yield outside a genlet')
- if level > 1:
- g.parent.set_child(g)
- g = g.parent
- level -= 1
-
- g.switch(value)
-
-
-def Genlet(func):
- class Genlet(genlet):
- fn = (func,)
- return Genlet
-
-# ____________________________________________________________
-
-
-def g1(n, seen):
- for i in range(n):
- seen.append(i + 1)
- yield i
-
-
-def g2(n, seen):
- for i in range(n):
- seen.append(i + 1)
- Yield(i)
-
-g2 = Genlet(g2)
-
-
-def nested(i):
- Yield(i)
-
-
-def g3(n, seen):
- for i in range(n):
- seen.append(i + 1)
- nested(i)
-g3 = Genlet(g3)
-
-
-def a(n):
- if n == 0:
- return
- for ii in ax(n - 1):
- Yield(ii)
- Yield(n)
-ax = Genlet(a)
-
-
-def perms(l):
- if len(l) > 1:
- for e in l:
- # No syntactic sugar for generator expressions
- [Yield([e] + p) for p in perms([x for x in l if x != e])]
- else:
- Yield(l)
-perms = Genlet(perms)
-
-
-def gr1(n):
- for ii in range(1, n):
- Yield(ii)
- Yield(ii * ii, 2)
-
-gr1 = Genlet(gr1)
-
-
-def gr2(n, seen):
- for ii in gr1(n):
- seen.append(ii)
-
-gr2 = Genlet(gr2)
-
-
-class NestedGeneratorTests(unittest.TestCase):
- def test_layered_genlets(self):
- seen = []
- for ii in gr2(5, seen):
- seen.append(ii)
- self.assertEqual(seen, [1, 1, 2, 4, 3, 9, 4, 16])
-
- def test_permutations(self):
- gen_perms = perms(list(range(4)))
- permutations = list(gen_perms)
- self.assertEqual(len(permutations), 4 * 3 * 2 * 1)
- self.assertTrue([0, 1, 2, 3] in permutations)
- self.assertTrue([3, 2, 1, 0] in permutations)
- res = []
- for ii in zip(perms(list(range(4))), perms(list(range(3)))):
- res.append(ii)
- self.assertEqual(
- res,
- [([0, 1, 2, 3], [0, 1, 2]), ([0, 1, 3, 2], [0, 2, 1]),
- ([0, 2, 1, 3], [1, 0, 2]), ([0, 2, 3, 1], [1, 2, 0]),
- ([0, 3, 1, 2], [2, 0, 1]), ([0, 3, 2, 1], [2, 1, 0])])
- # XXX Test to make sure we are working as a generator expression
-
- def test_genlet_simple(self):
- for g in [g1, g2, g3]:
- seen = []
- for k in range(3):
- for j in g(5, seen):
- seen.append(j)
- self.assertEqual(seen, 3 * [1, 0, 2, 1, 3, 2, 4, 3, 5, 4])
-
- def test_genlet_bad(self):
- try:
- Yield(10)
- except RuntimeError:
- pass
-
- def test_nested_genlets(self):
- seen = []
- for ii in ax(5):
- seen.append(ii)
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_greenlet.py b/env/lib/python3.9/site-packages/greenlet/tests/test_greenlet.py
deleted file mode 100644
index 5509a8b..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_greenlet.py
+++ /dev/null
@@ -1,728 +0,0 @@
-import gc
-import sys
-import time
-import threading
-import unittest
-from abc import ABCMeta, abstractmethod
-
-from greenlet import greenlet
-
-# We manually manage locks in many tests
-# pylint:disable=consider-using-with
-
-class SomeError(Exception):
- pass
-
-
-def fmain(seen):
- try:
- greenlet.getcurrent().parent.switch()
- except:
- seen.append(sys.exc_info()[0])
- raise
- raise SomeError
-
-
-def send_exception(g, exc):
-    # note: send_exception(g, exc) can now be done with g.throw(exc).
-    # the purpose of this test is to explicitly check the propagation rules.
- def crasher(exc):
- raise exc
- g1 = greenlet(crasher, parent=g)
- g1.switch(exc)
-
-
-class TestGreenlet(unittest.TestCase):
- def test_simple(self):
- lst = []
-
- def f():
- lst.append(1)
- greenlet.getcurrent().parent.switch()
- lst.append(3)
- g = greenlet(f)
- lst.append(0)
- g.switch()
- lst.append(2)
- g.switch()
- lst.append(4)
- self.assertEqual(lst, list(range(5)))
-
- def test_parent_equals_None(self):
- g = greenlet(parent=None)
- self.assertIsNotNone(g)
- self.assertIs(g.parent, greenlet.getcurrent())
-
- def test_run_equals_None(self):
- g = greenlet(run=None)
- self.assertIsNotNone(g)
- self.assertIsNone(g.run)
-
- def test_two_children(self):
- lst = []
-
- def f():
- lst.append(1)
- greenlet.getcurrent().parent.switch()
- lst.extend([1, 1])
- g = greenlet(f)
- h = greenlet(f)
- g.switch()
- self.assertEqual(len(lst), 1)
- h.switch()
- self.assertEqual(len(lst), 2)
- h.switch()
- self.assertEqual(len(lst), 4)
- self.assertEqual(h.dead, True)
- g.switch()
- self.assertEqual(len(lst), 6)
- self.assertEqual(g.dead, True)
-
- def test_two_recursive_children(self):
- lst = []
-
- def f():
- lst.append(1)
- greenlet.getcurrent().parent.switch()
-
- def g():
- lst.append(1)
- g = greenlet(f)
- g.switch()
- lst.append(1)
- g = greenlet(g)
- g.switch()
- self.assertEqual(len(lst), 3)
- self.assertEqual(sys.getrefcount(g), 2)
-
- def test_threads(self):
- success = []
-
- def f():
- self.test_simple()
- success.append(True)
- ths = [threading.Thread(target=f) for i in range(10)]
- for th in ths:
- th.start()
- for th in ths:
- th.join()
- self.assertEqual(len(success), len(ths))
-
- def test_exception(self):
- seen = []
- g1 = greenlet(fmain)
- g2 = greenlet(fmain)
- g1.switch(seen)
- g2.switch(seen)
- g2.parent = g1
- self.assertEqual(seen, [])
- self.assertRaises(SomeError, g2.switch)
- self.assertEqual(seen, [SomeError])
- g2.switch()
- self.assertEqual(seen, [SomeError])
-
- def test_send_exception(self):
- seen = []
- g1 = greenlet(fmain)
- g1.switch(seen)
- self.assertRaises(KeyError, send_exception, g1, KeyError)
- self.assertEqual(seen, [KeyError])
-
- def test_dealloc(self):
- seen = []
- g1 = greenlet(fmain)
- g2 = greenlet(fmain)
- g1.switch(seen)
- g2.switch(seen)
- self.assertEqual(seen, [])
- del g1
- gc.collect()
- self.assertEqual(seen, [greenlet.GreenletExit])
- del g2
- gc.collect()
- self.assertEqual(seen, [greenlet.GreenletExit, greenlet.GreenletExit])
-
- def test_dealloc_other_thread(self):
- seen = []
- someref = []
- lock = threading.Lock()
- lock.acquire()
- lock2 = threading.Lock()
- lock2.acquire()
-
- def f():
- g1 = greenlet(fmain)
- g1.switch(seen)
- someref.append(g1)
- del g1
- gc.collect()
- lock.release()
- lock2.acquire()
- greenlet() # trigger release
- lock.release()
- lock2.acquire()
- t = threading.Thread(target=f)
- t.start()
- lock.acquire()
- self.assertEqual(seen, [])
- self.assertEqual(len(someref), 1)
- del someref[:]
- gc.collect()
- # g1 is not released immediately because it's from another thread
- self.assertEqual(seen, [])
- lock2.release()
- lock.acquire()
- self.assertEqual(seen, [greenlet.GreenletExit])
- lock2.release()
- t.join()
-
- def test_frame(self):
- def f1():
- f = sys._getframe(0) # pylint:disable=protected-access
- self.assertEqual(f.f_back, None)
- greenlet.getcurrent().parent.switch(f)
- return "meaning of life"
- g = greenlet(f1)
- frame = g.switch()
- self.assertTrue(frame is g.gr_frame)
- self.assertTrue(g)
-
- from_g = g.switch()
- self.assertFalse(g)
- self.assertEqual(from_g, 'meaning of life')
- self.assertEqual(g.gr_frame, None)
-
- def test_thread_bug(self):
- def runner(x):
- g = greenlet(lambda: time.sleep(x))
- g.switch()
- t1 = threading.Thread(target=runner, args=(0.2,))
- t2 = threading.Thread(target=runner, args=(0.3,))
- t1.start()
- t2.start()
- t1.join()
- t2.join()
-
- def test_switch_kwargs(self):
- def run(a, b):
- self.assertEqual(a, 4)
- self.assertEqual(b, 2)
- return 42
- x = greenlet(run).switch(a=4, b=2)
- self.assertEqual(x, 42)
-
- def test_switch_kwargs_to_parent(self):
- def run(x):
- greenlet.getcurrent().parent.switch(x=x)
- greenlet.getcurrent().parent.switch(2, x=3)
- return x, x ** 2
- g = greenlet(run)
- self.assertEqual({'x': 3}, g.switch(3))
- self.assertEqual(((2,), {'x': 3}), g.switch())
- self.assertEqual((3, 9), g.switch())
-
- def test_switch_to_another_thread(self):
- data = {}
- error = None
- created_event = threading.Event()
- done_event = threading.Event()
-
- def run():
- data['g'] = greenlet(lambda: None)
- created_event.set()
- done_event.wait()
- thread = threading.Thread(target=run)
- thread.start()
- created_event.wait()
- try:
- data['g'].switch()
- except greenlet.error:
- error = sys.exc_info()[1]
- self.assertIsNotNone(error, "greenlet.error was not raised!")
- done_event.set()
- thread.join()
-
- def test_exc_state(self):
- def f():
- try:
- raise ValueError('fun')
- except: # pylint:disable=bare-except
- exc_info = sys.exc_info()
- greenlet(h).switch()
- self.assertEqual(exc_info, sys.exc_info())
-
- def h():
- self.assertEqual(sys.exc_info(), (None, None, None))
-
- greenlet(f).switch()
-
- def test_instance_dict(self):
- def f():
- greenlet.getcurrent().test = 42
- def deldict(g):
- del g.__dict__
- def setdict(g, value):
- g.__dict__ = value
- g = greenlet(f)
- self.assertEqual(g.__dict__, {})
- g.switch()
- self.assertEqual(g.test, 42)
- self.assertEqual(g.__dict__, {'test': 42})
- g.__dict__ = g.__dict__
- self.assertEqual(g.__dict__, {'test': 42})
- self.assertRaises(TypeError, deldict, g)
- self.assertRaises(TypeError, setdict, g, 42)
-
- def test_threaded_reparent(self):
- data = {}
- created_event = threading.Event()
- done_event = threading.Event()
-
- def run():
- data['g'] = greenlet(lambda: None)
- created_event.set()
- done_event.wait()
-
- def blank():
- greenlet.getcurrent().parent.switch()
-
- def setparent(g, value):
- g.parent = value
-
- thread = threading.Thread(target=run)
- thread.start()
- created_event.wait()
- g = greenlet(blank)
- g.switch()
- self.assertRaises(ValueError, setparent, g, data['g'])
- done_event.set()
- thread.join()
-
- def test_deepcopy(self):
- import copy
- self.assertRaises(TypeError, copy.copy, greenlet())
- self.assertRaises(TypeError, copy.deepcopy, greenlet())
-
- def test_parent_restored_on_kill(self):
- hub = greenlet(lambda: None)
- main = greenlet.getcurrent()
- result = []
- def worker():
- try:
- # Wait to be killed
- main.switch()
- except greenlet.GreenletExit:
- # Resurrect and switch to parent
- result.append(greenlet.getcurrent().parent)
- result.append(greenlet.getcurrent())
- hub.switch()
- g = greenlet(worker, parent=hub)
- g.switch()
- del g
- self.assertTrue(result)
- self.assertEqual(result[0], main)
- self.assertEqual(result[1].parent, hub)
-
- def test_parent_return_failure(self):
- # No run causes AttributeError on switch
- g1 = greenlet()
- # Greenlet that implicitly switches to parent
- g2 = greenlet(lambda: None, parent=g1)
- # AttributeError should propagate to us, no fatal errors
- self.assertRaises(AttributeError, g2.switch)
-
- def test_throw_exception_not_lost(self):
- class mygreenlet(greenlet):
- def __getattribute__(self, name):
- try:
- raise Exception()
- except: # pylint:disable=bare-except
- pass
- return greenlet.__getattribute__(self, name)
- g = mygreenlet(lambda: None)
- self.assertRaises(SomeError, g.throw, SomeError())
-
- def test_throw_doesnt_crash(self):
- result = []
- def worker():
- greenlet.getcurrent().parent.switch()
- def creator():
- g = greenlet(worker)
- g.switch()
- result.append(g)
- t = threading.Thread(target=creator)
- t.start()
- t.join()
- self.assertRaises(greenlet.error, result[0].throw, SomeError())
-
- def test_recursive_startup(self):
- class convoluted(greenlet):
- def __init__(self):
- greenlet.__init__(self)
- self.count = 0
- def __getattribute__(self, name):
- if name == 'run' and self.count == 0:
- self.count = 1
- self.switch(43)
- return greenlet.__getattribute__(self, name)
- def run(self, value):
- while True:
- self.parent.switch(value)
- g = convoluted()
- self.assertEqual(g.switch(42), 43)
-
- def test_unexpected_reparenting(self):
- another = []
- def worker():
- g = greenlet(lambda: None)
- another.append(g)
- g.switch()
- t = threading.Thread(target=worker)
- t.start()
- t.join()
- class convoluted(greenlet):
- def __getattribute__(self, name):
- if name == 'run':
- self.parent = another[0] # pylint:disable=attribute-defined-outside-init
- return greenlet.__getattribute__(self, name)
- g = convoluted(lambda: None)
- self.assertRaises(greenlet.error, g.switch)
-
- def test_threaded_updatecurrent(self):
- # released when main thread should execute
- lock1 = threading.Lock()
- lock1.acquire()
- # released when another thread should execute
- lock2 = threading.Lock()
- lock2.acquire()
- class finalized(object):
- def __del__(self):
-            # Happens while in green_updatecurrent() in the main greenlet;
-            # we should be very careful not to accidentally call it again.
-            # At the same time we must make sure another thread executes.
- lock2.release()
- lock1.acquire()
- # now ts_current belongs to another thread
- def deallocator():
- greenlet.getcurrent().parent.switch()
- def fthread():
- lock2.acquire()
- greenlet.getcurrent()
- del g[0]
- lock1.release()
- lock2.acquire()
- greenlet.getcurrent()
- lock1.release()
- main = greenlet.getcurrent()
- g = [greenlet(deallocator)]
- g[0].bomb = finalized()
- g[0].switch()
- t = threading.Thread(target=fthread)
- t.start()
- # let another thread grab ts_current and deallocate g[0]
- lock2.release()
- lock1.acquire()
-        # This is the cornerstone:
-        # getcurrent() will notice that ts_current belongs to another thread
-        # and start the update process, which will notice that g[0] should
-        # be deallocated and execute that object's finalizer. The finalizer
-        # lets another thread run so it can grab ts_current again, which
-        # would likely crash the interpreter if there were no check for this
-        # case at the end of green_updatecurrent(). The test passes if
-        # getcurrent() returns the correct result, but without the check it
-        # is likely to crash randomly anyway.
- self.assertEqual(greenlet.getcurrent(), main)
- # wait for another thread to complete, just in case
- t.join()
-
- def test_dealloc_switch_args_not_lost(self):
- seen = []
- def worker():
- # wait for the value
- value = greenlet.getcurrent().parent.switch()
- # delete all references to ourself
- del worker[0]
- initiator.parent = greenlet.getcurrent().parent
- # switch to main with the value, but because
- # ts_current is the last reference to us we
- # return immediately
- try:
- greenlet.getcurrent().parent.switch(value)
- finally:
- seen.append(greenlet.getcurrent())
- def initiator():
- return 42 # implicitly falls thru to parent
- worker = [greenlet(worker)]
- worker[0].switch() # prime worker
- initiator = greenlet(initiator, worker[0])
- value = initiator.switch()
- self.assertTrue(seen)
- self.assertEqual(value, 42)
-
-
- def test_tuple_subclass(self):
- if sys.version_info[0] > 2:
- # There's no apply in Python 3.x
- def _apply(func, a, k):
- func(*a, **k)
- else:
- _apply = apply # pylint:disable=undefined-variable
-
- class mytuple(tuple):
- def __len__(self):
- greenlet.getcurrent().switch()
- return tuple.__len__(self)
- args = mytuple()
- kwargs = dict(a=42)
- def switchapply():
- _apply(greenlet.getcurrent().parent.switch, args, kwargs)
- g = greenlet(switchapply)
- self.assertEqual(g.switch(), kwargs)
-
- def test_abstract_subclasses(self):
- AbstractSubclass = ABCMeta(
- 'AbstractSubclass',
- (greenlet,),
- {'run': abstractmethod(lambda self: None)})
-
- class BadSubclass(AbstractSubclass):
- pass
-
- class GoodSubclass(AbstractSubclass):
- def run(self):
- pass
-
- GoodSubclass() # should not raise
- self.assertRaises(TypeError, BadSubclass)
-
- def test_implicit_parent_with_threads(self):
- if not gc.isenabled():
- return # cannot test with disabled gc
- N = gc.get_threshold()[0]
- if N < 50:
- return # cannot test with such a small N
- def attempt():
- lock1 = threading.Lock()
- lock1.acquire()
- lock2 = threading.Lock()
- lock2.acquire()
- recycled = [False]
- def another_thread():
- lock1.acquire() # wait for gc
- greenlet.getcurrent() # update ts_current
- lock2.release() # release gc
- t = threading.Thread(target=another_thread)
- t.start()
- class gc_callback(object):
- def __del__(self):
- lock1.release()
- lock2.acquire()
- recycled[0] = True
- class garbage(object):
- def __init__(self):
- self.cycle = self
- self.callback = gc_callback()
- l = []
- x = range(N*2)
- current = greenlet.getcurrent()
- g = garbage()
- for _ in x:
- g = None # lose reference to garbage
- if recycled[0]:
- # gc callback called prematurely
- t.join()
- return False
- last = greenlet()
- if recycled[0]:
- break # yes! gc called in green_new
- l.append(last) # increase allocation counter
- else:
- # gc callback not called when expected
- gc.collect()
- if recycled[0]:
- t.join()
- return False
- self.assertEqual(last.parent, current)
- for g in l:
- self.assertEqual(g.parent, current)
- return True
- for _ in range(5):
- if attempt():
- break
-
- def test_issue_245_reference_counting_subclass_no_threads(self):
- # https://github.com/python-greenlet/greenlet/issues/245
- # Before the fix, this crashed pretty reliably on
- # Python 3.10, at least on macOS; but much less reliably on other
- # interpreters (memory layout must have changed).
- # The threaded test crashed more reliably on more interpreters.
- from greenlet import getcurrent
- from greenlet import GreenletExit
-
- class Greenlet(greenlet):
- pass
-
- initial_refs = sys.getrefcount(Greenlet)
- # This has to be an instance variable because
- # Python 2 raises a SyntaxError if we delete a local
- # variable referenced in an inner scope.
- self.glets = [] # pylint:disable=attribute-defined-outside-init
-
- def greenlet_main():
- try:
- getcurrent().parent.switch()
- except GreenletExit:
- self.glets.append(getcurrent())
-
-        # Before the fix, this leaked references to the subclass (and could crash).
- for _ in range(10):
- Greenlet(greenlet_main).switch()
-
- del self.glets
- self.assertEqual(sys.getrefcount(Greenlet), initial_refs)
-
- def test_issue_245_reference_counting_subclass_threads(self):
- # https://github.com/python-greenlet/greenlet/issues/245
- from threading import Thread
- from threading import Event
-
- from greenlet import getcurrent
-
- class MyGreenlet(greenlet):
- pass
-
- glets = []
- ref_cleared = Event()
-
- def greenlet_main():
- getcurrent().parent.switch()
-
- def thread_main(greenlet_running_event):
- mine = MyGreenlet(greenlet_main)
- glets.append(mine)
- # The greenlets being deleted must be active
- mine.switch()
- # Don't keep any reference to it in this thread
- del mine
- # Let main know we published our greenlet.
- greenlet_running_event.set()
- # Wait for main to let us know the references are
- # gone and the greenlet objects no longer reachable
- ref_cleared.wait()
- # The creating thread must call getcurrent() (or a few other
- # greenlet APIs) because that's when the thread-local list of dead
- # greenlets gets cleared.
- getcurrent()
-
- # We start with 3 references to the subclass:
- # - This module
- # - Its __mro__
-        # - The __subclasses__ attribute of greenlet
- # - (If we call gc.get_referents(), we find four entries, including
- # some other tuple ``(greenlet)`` that I'm not sure about but must be part
- # of the machinery.)
- #
- # On Python 3.10 it's often enough to just run 3 threads; on Python 2.7,
- # more threads are needed, and the results are still
- # non-deterministic. Presumably the memory layouts are different
- initial_refs = sys.getrefcount(MyGreenlet)
- thread_ready_events = []
- for _ in range(
- initial_refs + 45
- ):
- event = Event()
- thread = Thread(target=thread_main, args=(event,))
- thread_ready_events.append(event)
- thread.start()
-
-
- for done_event in thread_ready_events:
- done_event.wait()
-
-
- del glets[:]
- ref_cleared.set()
- # Let any other thread run; it will crash the interpreter
- # if not fixed (or silently corrupt memory and we possibly crash
- # later).
- time.sleep(1)
- self.assertEqual(sys.getrefcount(MyGreenlet), initial_refs)
-
-
-class TestRepr(unittest.TestCase):
-
- def assertEndsWith(self, got, suffix):
- self.assertTrue(got.endswith(suffix), (got, suffix))
-
- def test_main_while_running(self):
- r = repr(greenlet.getcurrent())
- self.assertEndsWith(r, " current active started main>")
-
- def test_main_in_background(self):
- main = greenlet.getcurrent()
- def run():
- return repr(main)
-
- g = greenlet(run)
- r = g.switch()
- self.assertEndsWith(r, ' suspended active started main>')
-
- def test_initial(self):
- r = repr(greenlet())
- self.assertEndsWith(r, ' pending>')
-
- def test_main_from_other_thread(self):
- main = greenlet.getcurrent()
-
- class T(threading.Thread):
- original_main = thread_main = None
- main_glet = None
- def run(self):
- self.original_main = repr(main)
- self.main_glet = greenlet.getcurrent()
- self.thread_main = repr(self.main_glet)
-
- t = T()
- t.start()
- t.join(10)
-
- self.assertEndsWith(t.original_main, ' suspended active started main>')
- self.assertEndsWith(t.thread_main, ' current active started main>')
-
- r = repr(t.main_glet)
- # main greenlets, even from dead threads, never really appear dead
- # TODO: Can we find a better way to differentiate that?
- assert not t.main_glet.dead
- self.assertEndsWith(r, ' suspended active started main>')
-
- def test_dead(self):
- g = greenlet(lambda: None)
- g.switch()
- self.assertEndsWith(repr(g), ' dead>')
- self.assertNotIn('suspended', repr(g))
- self.assertNotIn('started', repr(g))
- self.assertNotIn('active', repr(g))
-
- def test_formatting_produces_native_str(self):
- # https://github.com/python-greenlet/greenlet/issues/218
- # %s formatting on Python 2 was producing unicode, not str.
-
- g_dead = greenlet(lambda: None)
- g_not_started = greenlet(lambda: None)
- g_cur = greenlet.getcurrent()
-
- for g in g_dead, g_not_started, g_cur:
-
- self.assertIsInstance(
- '%s' % (g,),
- str
- )
- self.assertIsInstance(
- '%r' % (g,),
- str,
- )
-
-
-if __name__ == '__main__':
- unittest.main()
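The suite deleted above pins down greenlet's core switching contract. As a reminder of those semantics, a minimal sketch (assumes the `greenlet` package is installed):

```python
from greenlet import greenlet

def child():
    # Switching to the parent suspends this greenlet mid-function.
    greenlet.getcurrent().parent.switch("intermediate")
    return "done"

g = greenlet(child)
print(g.switch())  # "intermediate" -- child is suspended, g is still alive
print(g.switch())  # "done" -- child ran to completion, g.dead is now True
```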
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_leaks.py b/env/lib/python3.9/site-packages/greenlet/tests/test_leaks.py
deleted file mode 100644
index 2b02bfd..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_leaks.py
+++ /dev/null
@@ -1,178 +0,0 @@
-import unittest
-import sys
-import gc
-
-import time
-import weakref
-import threading
-
-import greenlet
-
-class TestLeaks(unittest.TestCase):
-
- def test_arg_refs(self):
- args = ('a', 'b', 'c')
- refcount_before = sys.getrefcount(args)
- # pylint:disable=unnecessary-lambda
- g = greenlet.greenlet(
- lambda *args: greenlet.getcurrent().parent.switch(*args))
- for _ in range(100):
- g.switch(*args)
- self.assertEqual(sys.getrefcount(args), refcount_before)
-
- def test_kwarg_refs(self):
- kwargs = {}
- # pylint:disable=unnecessary-lambda
- g = greenlet.greenlet(
- lambda **kwargs: greenlet.getcurrent().parent.switch(**kwargs))
- for _ in range(100):
- g.switch(**kwargs)
- self.assertEqual(sys.getrefcount(kwargs), 2)
-
- assert greenlet.GREENLET_USE_GC # Option to disable this was removed in 1.0
-
- def recycle_threads(self):
- # By introducing a thread that does sleep we allow other threads,
- # that have triggered their __block condition, but did not have a
- # chance to deallocate their thread state yet, to finally do so.
- # The way it works is by requiring a GIL switch (different thread),
- # which does a GIL release (sleep), which might do a GIL switch
- # to finished threads and allow them to clean up.
- def worker():
- time.sleep(0.001)
- t = threading.Thread(target=worker)
- t.start()
- time.sleep(0.001)
- t.join()
-
- def test_threaded_leak(self):
- gg = []
- def worker():
- # only main greenlet present
- gg.append(weakref.ref(greenlet.getcurrent()))
- for _ in range(2):
- t = threading.Thread(target=worker)
- t.start()
- t.join()
- del t
- greenlet.getcurrent() # update ts_current
- self.recycle_threads()
- greenlet.getcurrent() # update ts_current
- gc.collect()
- greenlet.getcurrent() # update ts_current
- for g in gg:
- self.assertIsNone(g())
-
- def test_threaded_adv_leak(self):
- gg = []
- def worker():
- # main and additional *finished* greenlets
- ll = greenlet.getcurrent().ll = []
- def additional():
- ll.append(greenlet.getcurrent())
- for _ in range(2):
- greenlet.greenlet(additional).switch()
- gg.append(weakref.ref(greenlet.getcurrent()))
- for _ in range(2):
- t = threading.Thread(target=worker)
- t.start()
- t.join()
- del t
- greenlet.getcurrent() # update ts_current
- self.recycle_threads()
- greenlet.getcurrent() # update ts_current
- gc.collect()
- greenlet.getcurrent() # update ts_current
- for g in gg:
- self.assertIsNone(g())
-
- def test_issue251_killing_cross_thread_leaks_list(self, manually_collect_background=True):
- # See https://github.com/python-greenlet/greenlet/issues/251
- # Killing a greenlet (probably not the main one)
- # in one thread from another thread would
- # result in leaking a list (the ts_delkey list).
-
- # For the test to be valid, even empty lists have to be tracked by the
- # GC
- assert gc.is_tracked([])
-
- def count_objects(kind=list):
- # pylint:disable=unidiomatic-typecheck
- # Collect the garbage.
- for _ in range(3):
- gc.collect()
- gc.collect()
- return sum(
- 1
- for x in gc.get_objects()
- if type(x) is kind
- )
-
- # XXX: The main greenlet of a dead thread is only released
- # when one of the proper greenlet APIs is used from a different
- # running thread. See #252 (https://github.com/python-greenlet/greenlet/issues/252)
- greenlet.getcurrent()
- greenlets_before = count_objects(greenlet.greenlet)
-
- background_glet_running = threading.Event()
- background_glet_killed = threading.Event()
- background_greenlets = []
- def background_greenlet():
- # Throw control back to the main greenlet.
- greenlet.getcurrent().parent.switch()
-
- def background_thread():
- glet = greenlet.greenlet(background_greenlet)
- background_greenlets.append(glet)
- glet.switch() # Be sure it's active.
- # Control is ours again.
- del glet # Delete one reference from the thread it runs in.
- background_glet_running.set()
- background_glet_killed.wait()
- # To trigger the background collection of the dead
- # greenlet, thus clearing out the contents of the list, we
- # need to run some APIs. See issue 252.
- if manually_collect_background:
- greenlet.getcurrent()
-
-
- t = threading.Thread(target=background_thread)
- t.start()
- background_glet_running.wait()
-
- lists_before = count_objects()
-
- assert len(background_greenlets) == 1
- self.assertFalse(background_greenlets[0].dead)
- # Delete the last reference to the background greenlet
- # from a different thread. This puts it in the background thread's
- # ts_delkey list.
- del background_greenlets[:]
- background_glet_killed.set()
-
- # Now wait for the background thread to die.
- t.join(10)
- del t
-
- # Free the background main greenlet by forcing greenlet to notice a difference.
- greenlet.getcurrent()
- greenlets_after = count_objects(greenlet.greenlet)
-
- lists_after = count_objects()
- # On 2.7, we observe that lists_after is smaller than
- # lists_before. No idea what lists got cleaned up. All the
- # Python 3 versions match exactly.
- self.assertLessEqual(lists_after, lists_before)
-
- self.assertEqual(greenlets_before, greenlets_after)
-
- @unittest.expectedFailure
- def test_issue251_issue252_need_to_collect_in_background(self):
- # This still fails because the leak of the list
- # still exists when we don't call a greenlet API before exiting the
- # thread. The proximate cause is that neither of the two greenlets
- # from the background thread are actually being destroyed, even though
- # the GC is in fact visiting both objects.
-        # It's not clear where that leak is; for some reason the thread-local
-        # dict holding it isn't being cleaned up.
- self.test_issue251_killing_cross_thread_leaks_list(manually_collect_background=False)
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_stack_saved.py b/env/lib/python3.9/site-packages/greenlet/tests/test_stack_saved.py
deleted file mode 100644
index 6c7353b..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_stack_saved.py
+++ /dev/null
@@ -1,19 +0,0 @@
-import greenlet
-import unittest
-
-
-class Test(unittest.TestCase):
-
- def test_stack_saved(self):
- main = greenlet.getcurrent()
- self.assertEqual(main._stack_saved, 0)
-
- def func():
- main.switch(main._stack_saved)
-
- g = greenlet.greenlet(func)
- x = g.switch()
- assert x > 0, x
- assert g._stack_saved > 0, g._stack_saved
- g.switch()
- assert g._stack_saved == 0, g._stack_saved
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_throw.py b/env/lib/python3.9/site-packages/greenlet/tests/test_throw.py
deleted file mode 100644
index a2014a9..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_throw.py
+++ /dev/null
@@ -1,100 +0,0 @@
-import sys
-import unittest
-
-from greenlet import greenlet
-
-
-def switch(*args):
- return greenlet.getcurrent().parent.switch(*args)
-
-
-class ThrowTests(unittest.TestCase):
- def test_class(self):
- def f():
- try:
- switch("ok")
- except RuntimeError:
- switch("ok")
- return
- switch("fail")
- g = greenlet(f)
- res = g.switch()
- self.assertEqual(res, "ok")
- res = g.throw(RuntimeError)
- self.assertEqual(res, "ok")
-
- def test_val(self):
- def f():
- try:
- switch("ok")
- except RuntimeError:
- val = sys.exc_info()[1]
- if str(val) == "ciao":
- switch("ok")
- return
- switch("fail")
-
- g = greenlet(f)
- res = g.switch()
- self.assertEqual(res, "ok")
- res = g.throw(RuntimeError("ciao"))
- self.assertEqual(res, "ok")
-
- g = greenlet(f)
- res = g.switch()
- self.assertEqual(res, "ok")
- res = g.throw(RuntimeError, "ciao")
- self.assertEqual(res, "ok")
-
- def test_kill(self):
- def f():
- switch("ok")
- switch("fail")
- g = greenlet(f)
- res = g.switch()
- self.assertEqual(res, "ok")
- res = g.throw()
- self.assertTrue(isinstance(res, greenlet.GreenletExit))
- self.assertTrue(g.dead)
- res = g.throw() # immediately eaten by the already-dead greenlet
- self.assertTrue(isinstance(res, greenlet.GreenletExit))
-
- def test_throw_goes_to_original_parent(self):
- main = greenlet.getcurrent()
-
- def f1():
- try:
- main.switch("f1 ready to catch")
- except IndexError:
- return "caught"
- else:
- return "normal exit"
-
- def f2():
- main.switch("from f2")
-
- g1 = greenlet(f1)
- g2 = greenlet(f2, parent=g1)
- self.assertRaises(IndexError, g2.throw, IndexError)
- self.assertTrue(g2.dead)
- self.assertTrue(g1.dead)
-
- g1 = greenlet(f1)
- g2 = greenlet(f2, parent=g1)
- res = g1.switch()
- self.assertEqual(res, "f1 ready to catch")
- res = g2.throw(IndexError)
- self.assertEqual(res, "caught")
- self.assertTrue(g2.dead)
- self.assertTrue(g1.dead)
-
- g1 = greenlet(f1)
- g2 = greenlet(f2, parent=g1)
- res = g1.switch()
- self.assertEqual(res, "f1 ready to catch")
- res = g2.switch()
- self.assertEqual(res, "from f2")
- res = g2.throw(IndexError)
- self.assertEqual(res, "caught")
- self.assertTrue(g2.dead)
- self.assertTrue(g1.dead)
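For context, `throw()` resumes a suspended greenlet by raising the exception at its switch point, and the value the greenlet then returns travels back to the caller of `throw()`. A minimal sketch of that round trip:

```python
from greenlet import greenlet

def worker():
    try:
        greenlet.getcurrent().parent.switch("ready")
    except ValueError as exc:
        return "caught %s" % exc

g = greenlet(worker)
print(g.switch())                   # "ready" -- worker is now suspended
print(g.throw(ValueError("boom")))  # "caught boom" -- raised at the switch point
```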
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_tracing.py b/env/lib/python3.9/site-packages/greenlet/tests/test_tracing.py
deleted file mode 100644
index 2ab4d71..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_tracing.py
+++ /dev/null
@@ -1,267 +0,0 @@
-import sys
-import unittest
-import greenlet
-
-class SomeError(Exception):
- pass
-
-class GreenletTracer(object):
- oldtrace = None
-
- def __init__(self, error_on_trace=False):
- self.actions = []
- self.error_on_trace = error_on_trace
-
- def __call__(self, *args):
- self.actions.append(args)
- if self.error_on_trace:
- raise SomeError
-
- def __enter__(self):
- self.oldtrace = greenlet.settrace(self)
- return self.actions
-
- def __exit__(self, *args):
- greenlet.settrace(self.oldtrace)
-
-
-class TestGreenletTracing(unittest.TestCase):
- """
- Tests of ``greenlet.settrace()``
- """
-
- def test_greenlet_tracing(self):
- main = greenlet.getcurrent()
- def dummy():
- pass
- def dummyexc():
- raise SomeError()
-
- with GreenletTracer() as actions:
- g1 = greenlet.greenlet(dummy)
- g1.switch()
- g2 = greenlet.greenlet(dummyexc)
- self.assertRaises(SomeError, g2.switch)
-
- self.assertEqual(actions, [
- ('switch', (main, g1)),
- ('switch', (g1, main)),
- ('switch', (main, g2)),
- ('throw', (g2, main)),
- ])
-
- def test_exception_disables_tracing(self):
- main = greenlet.getcurrent()
- def dummy():
- main.switch()
- g = greenlet.greenlet(dummy)
- g.switch()
- with GreenletTracer(error_on_trace=True) as actions:
- self.assertRaises(SomeError, g.switch)
- self.assertEqual(greenlet.gettrace(), None)
-
- self.assertEqual(actions, [
- ('switch', (main, g)),
- ])
-
-
-class PythonTracer(object):
- oldtrace = None
-
- def __init__(self):
- self.actions = []
-
- def __call__(self, frame, event, arg):
- # Record the co_name so we have an idea what function we're in.
- self.actions.append((event, frame.f_code.co_name))
-
- def __enter__(self):
- self.oldtrace = sys.setprofile(self)
- return self.actions
-
- def __exit__(self, *args):
- sys.setprofile(self.oldtrace)
-
-def tpt_callback():
- return 42
-
-class TestPythonTracing(unittest.TestCase):
- """
-    Tests of the interaction of ``sys.setprofile()``
-    with greenlet facilities.
-
- NOTE: Most of this is probably CPython specific.
- """
-
- maxDiff = None
-
- def test_trace_events_trivial(self):
- with PythonTracer() as actions:
- tpt_callback()
- # If we use the sys.settrace instead of setprofile, we get
- # this:
-
- # self.assertEqual(actions, [
- # ('call', 'tpt_callback'),
- # ('call', '__exit__'),
- # ])
-
- self.assertEqual(actions, [
- ('return', '__enter__'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('call', '__exit__'),
- ('c_call', '__exit__'),
- ])
-
- def _trace_switch(self, glet):
- with PythonTracer() as actions:
- glet.switch()
- return actions
-
- def _check_trace_events_func_already_set(self, glet):
- actions = self._trace_switch(glet)
- self.assertEqual(actions, [
- ('return', '__enter__'),
- ('c_call', '_trace_switch'),
- ('call', 'run'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('return', 'run'),
- ('c_return', '_trace_switch'),
- ('call', '__exit__'),
- ('c_call', '__exit__'),
- ])
-
- def test_trace_events_into_greenlet_func_already_set(self):
- def run():
- return tpt_callback()
-
- self._check_trace_events_func_already_set(greenlet.greenlet(run))
-
- def test_trace_events_into_greenlet_subclass_already_set(self):
- class X(greenlet.greenlet):
- def run(self):
- return tpt_callback()
- self._check_trace_events_func_already_set(X())
-
- def _check_trace_events_from_greenlet_sets_profiler(self, g, tracer):
- g.switch()
- tpt_callback()
- tracer.__exit__()
- self.assertEqual(tracer.actions, [
- ('return', '__enter__'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('return', 'run'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('call', '__exit__'),
- ('c_call', '__exit__'),
- ])
-
-
- def test_trace_events_from_greenlet_func_sets_profiler(self):
- tracer = PythonTracer()
- def run():
- tracer.__enter__()
- return tpt_callback()
-
- self._check_trace_events_from_greenlet_sets_profiler(greenlet.greenlet(run),
- tracer)
-
- def test_trace_events_from_greenlet_subclass_sets_profiler(self):
- tracer = PythonTracer()
- class X(greenlet.greenlet):
- def run(self):
- tracer.__enter__()
- return tpt_callback()
-
- self._check_trace_events_from_greenlet_sets_profiler(X(), tracer)
-
-
- def test_trace_events_multiple_greenlets_switching(self):
- tracer = PythonTracer()
-
- g1 = None
- g2 = None
-
- def g1_run():
- tracer.__enter__()
- tpt_callback()
- g2.switch()
- tpt_callback()
- return 42
-
- def g2_run():
- tpt_callback()
- tracer.__exit__()
- tpt_callback()
- g1.switch()
-
- g1 = greenlet.greenlet(g1_run)
- g2 = greenlet.greenlet(g2_run)
-
- x = g1.switch()
- self.assertEqual(x, 42)
- tpt_callback() # ensure not in the trace
- self.assertEqual(tracer.actions, [
- ('return', '__enter__'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('c_call', 'g1_run'),
- ('call', 'g2_run'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('call', '__exit__'),
- ('c_call', '__exit__'),
- ])
-
- def test_trace_events_multiple_greenlets_switching_siblings(self):
- # Like the first version, but get both greenlets running first
- # as "siblings" and then establish the tracing.
- tracer = PythonTracer()
-
- g1 = None
- g2 = None
-
- def g1_run():
- greenlet.getcurrent().parent.switch()
- tracer.__enter__()
- tpt_callback()
- g2.switch()
- tpt_callback()
- return 42
-
- def g2_run():
- greenlet.getcurrent().parent.switch()
-
- tpt_callback()
- tracer.__exit__()
- tpt_callback()
- g1.switch()
-
- g1 = greenlet.greenlet(g1_run)
- g2 = greenlet.greenlet(g2_run)
-
- # Start g1
- g1.switch()
- # And it immediately returns control to us.
- # Start g2
- g2.switch()
-        # Which also returns. Now kick off the real part of the
-        # test.
- x = g1.switch()
- self.assertEqual(x, 42)
-
- tpt_callback() # ensure not in the trace
- self.assertEqual(tracer.actions, [
- ('return', '__enter__'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('c_call', 'g1_run'),
- ('call', 'tpt_callback'),
- ('return', 'tpt_callback'),
- ('call', '__exit__'),
- ('c_call', '__exit__'),
- ])
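The tracing hook tested above is installed with `greenlet.settrace()`, which returns the previous trace function; the callback receives an event name (`"switch"` or `"throw"`) and an `(origin, target)` pair. A minimal sketch:

```python
import greenlet

def tracer(event, args):
    origin, target = args
    print("%s: %r -> %r" % (event, origin, target))

old = greenlet.settrace(tracer)      # install, keeping the previous hook
g = greenlet.greenlet(lambda: None)
g.switch()                           # tracer reports the switch in and out
greenlet.settrace(old)               # restore the previous trace function
```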
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_version.py b/env/lib/python3.9/site-packages/greenlet/tests/test_version.py
deleted file mode 100644
index 0c9a497..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_version.py
+++ /dev/null
@@ -1,39 +0,0 @@
-#! /usr/bin/env python
-from __future__ import absolute_import
-from __future__ import print_function
-
-import sys
-import os
-import unittest
-
-import greenlet
-
-class VersionTests(unittest.TestCase):
- def test_version(self):
- def find_dominating_file(name):
- if os.path.exists(name):
- return name
-
- tried = []
- here = os.path.abspath(os.path.dirname(__file__))
- for i in range(10):
- up = ['..'] * i
- path = [here] + up + [name]
- fname = os.path.join(*path)
- fname = os.path.abspath(fname)
- tried.append(fname)
- if os.path.exists(fname):
- return fname
- raise AssertionError("Could not find file " + name + "; checked " + str(tried))
-
- try:
- setup_py = find_dominating_file('setup.py')
- except AssertionError as e:
- raise unittest.SkipTest("Unable to find setup.py; must be out of tree. " + str(e))
-
-
- invoke_setup = "%s %s --version" % (sys.executable, setup_py)
- with os.popen(invoke_setup) as f:
- sversion = f.read().strip()
-
- self.assertEqual(sversion, greenlet.__version__)
diff --git a/env/lib/python3.9/site-packages/greenlet/tests/test_weakref.py b/env/lib/python3.9/site-packages/greenlet/tests/test_weakref.py
deleted file mode 100644
index 6a2ff06..0000000
--- a/env/lib/python3.9/site-packages/greenlet/tests/test_weakref.py
+++ /dev/null
@@ -1,34 +0,0 @@
-import gc
-import greenlet
-import weakref
-import unittest
-
-
-class WeakRefTests(unittest.TestCase):
- def test_dead_weakref(self):
- def _dead_greenlet():
- g = greenlet.greenlet(lambda: None)
- g.switch()
- return g
- o = weakref.ref(_dead_greenlet())
- gc.collect()
- self.assertEqual(o(), None)
-
- def test_inactive_weakref(self):
- o = weakref.ref(greenlet.greenlet())
- gc.collect()
- self.assertEqual(o(), None)
-
- def test_dealloc_weakref(self):
- seen = []
- def worker():
- try:
- greenlet.getcurrent().parent.switch()
- finally:
- seen.append(g())
- g = greenlet.greenlet(worker)
- g.switch()
- g2 = greenlet.greenlet(lambda: None, g)
- g = weakref.ref(g2)
- g2 = None
- self.assertEqual(seen, [None])
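The behavior those tests rely on: greenlets support weak references, and a dead or never-started greenlet with no remaining strong references is collectible. A minimal sketch:

```python
import gc
import weakref
import greenlet

g = greenlet.greenlet(lambda: None)
g.switch()            # run to completion; g.dead is True
ref = weakref.ref(g)
del g
gc.collect()
print(ref())          # None -- the dead greenlet was collected
```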
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/INSTALLER b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/INSTALLER
deleted file mode 100644
index a1b589e..0000000
--- a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/INSTALLER
+++ /dev/null
@@ -1 +0,0 @@
-pip
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/LICENSE b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/LICENSE
deleted file mode 100644
index 65865a9..0000000
--- a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/LICENSE
+++ /dev/null
@@ -1,23 +0,0 @@
-2009-2018 (c) Benoît Chesneau
-2009-2015 (c) Paul J. Davis
-
-Permission is hereby granted, free of charge, to any person
-obtaining a copy of this software and associated documentation
-files (the "Software"), to deal in the Software without
-restriction, including without limitation the rights to use,
-copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the
-Software is furnished to do so, subject to the following
-conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
-OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
-HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
-WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
-OTHER DEALINGS IN THE SOFTWARE.
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/METADATA b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/METADATA
deleted file mode 100644
index 39eb40f..0000000
--- a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/METADATA
+++ /dev/null
@@ -1,120 +0,0 @@
-Metadata-Version: 2.1
-Name: gunicorn
-Version: 20.1.0
-Summary: WSGI HTTP Server for UNIX
-Home-page: https://gunicorn.org
-Author: Benoit Chesneau
-Author-email: benoitc@e-engura.com
-License: MIT
-Project-URL: Documentation, https://docs.gunicorn.org
-Project-URL: Homepage, https://gunicorn.org
-Project-URL: Issue tracker, https://github.com/benoitc/gunicorn/issues
-Project-URL: Source code, https://github.com/benoitc/gunicorn
-Platform: UNKNOWN
-Classifier: Development Status :: 5 - Production/Stable
-Classifier: Environment :: Other Environment
-Classifier: Intended Audience :: Developers
-Classifier: License :: OSI Approved :: MIT License
-Classifier: Operating System :: MacOS :: MacOS X
-Classifier: Operating System :: POSIX
-Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.5
-Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Classifier: Programming Language :: Python :: 3 :: Only
-Classifier: Programming Language :: Python :: Implementation :: CPython
-Classifier: Programming Language :: Python :: Implementation :: PyPy
-Classifier: Topic :: Internet
-Classifier: Topic :: Utilities
-Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Classifier: Topic :: Internet :: WWW/HTTP
-Classifier: Topic :: Internet :: WWW/HTTP :: WSGI
-Classifier: Topic :: Internet :: WWW/HTTP :: WSGI :: Server
-Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
-Requires-Python: >=3.5
-Requires-Dist: setuptools (>=3.0)
-Provides-Extra: eventlet
-Requires-Dist: eventlet (>=0.24.1) ; extra == 'eventlet'
-Provides-Extra: gevent
-Requires-Dist: gevent (>=1.4.0) ; extra == 'gevent'
-Provides-Extra: gthread
-Provides-Extra: setproctitle
-Requires-Dist: setproctitle ; extra == 'setproctitle'
-Provides-Extra: tornado
-Requires-Dist: tornado (>=0.2) ; extra == 'tornado'
-
-Gunicorn
---------
-
-.. image:: https://img.shields.io/pypi/v/gunicorn.svg?style=flat
- :alt: PyPI version
- :target: https://pypi.python.org/pypi/gunicorn
-
-.. image:: https://img.shields.io/pypi/pyversions/gunicorn.svg
- :alt: Supported Python versions
- :target: https://pypi.python.org/pypi/gunicorn
-
-.. image:: https://travis-ci.org/benoitc/gunicorn.svg?branch=master
- :alt: Build Status
- :target: https://travis-ci.org/benoitc/gunicorn
-
-Gunicorn 'Green Unicorn' is a Python WSGI HTTP Server for UNIX. It's a pre-fork
-worker model ported from Ruby's Unicorn_ project. The Gunicorn server is broadly
-compatible with various web frameworks, simply implemented, light on server
-resource usage, and fairly speedy.
-
-Feel free to join us in `#gunicorn`_ on Freenode_.
-
-Documentation
--------------
-
-The documentation is hosted at https://docs.gunicorn.org.
-
-Installation
-------------
-
-Gunicorn requires **Python 3.x >= 3.5**.
-
-Install from PyPI::
-
- $ pip install gunicorn
-
-
-Usage
------
-
-Basic usage::
-
- $ gunicorn [OPTIONS] APP_MODULE
-
-Where ``APP_MODULE`` is of the pattern ``$(MODULE_NAME):$(VARIABLE_NAME)``. The
-module name can be a full dotted path. The variable name refers to a WSGI
-callable that should be found in the specified module.
-
-Example with test app::
-
- $ cd examples
- $ gunicorn --workers=2 test:app
-
-
-Contributing
-------------
-
-See `our complete contributor's guide `_ for more details.
-
-
-License
--------
-
-Gunicorn is released under the MIT License. See the LICENSE_ file for more
-details.
-
-.. _Unicorn: https://bogomips.org/unicorn/
-.. _`#gunicorn`: https://webchat.freenode.net/?channels=gunicorn
-.. _Freenode: https://freenode.net/
-.. _LICENSE: https://github.com/benoitc/gunicorn/blob/master/LICENSE
-
-
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/RECORD b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/RECORD
deleted file mode 100644
index 08c3609..0000000
--- a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/RECORD
+++ /dev/null
@@ -1,77 +0,0 @@
-../../../bin/gunicorn,sha256=h8H8driGBoqBHDJnXhMl_Ls-u-5AmY8zWY_325UBtis,257
-gunicorn-20.1.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
-gunicorn-20.1.0.dist-info/LICENSE,sha256=eJ_hG5Lhyr-890S1_MOSyb1cZ5hgOk6J-SW2M3mE0d8,1136
-gunicorn-20.1.0.dist-info/METADATA,sha256=-0kZuLv3CwPyNDUH40lI3VZN4CbFt3YCalVUprINtfs,3771
-gunicorn-20.1.0.dist-info/RECORD,,
-gunicorn-20.1.0.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-gunicorn-20.1.0.dist-info/WHEEL,sha256=OqRkF0eY5GHssMorFjlbTIq072vpHpF60fIQA6lS9xA,92
-gunicorn-20.1.0.dist-info/entry_points.txt,sha256=iKqKVTg4RzByDFxtGUiHcSAFATmYSYMO0f7S5RPLK6o,130
-gunicorn-20.1.0.dist-info/top_level.txt,sha256=cdMaa2yhxb8do-WioY9qRHUCfwf55YztjwQCncaInoE,9
-gunicorn/__init__.py,sha256=cMSZ4dqEfhR1eghYtGfJ9Fw4Xw9RN6U7BzH3tRpT4fQ,279
-gunicorn/__main__.py,sha256=kv-LQeOm8rXRw_NQTj8Tg3l3jv9eKMAm6jyKzYY0Hs8,171
-gunicorn/__pycache__/__init__.cpython-39.pyc,,
-gunicorn/__pycache__/__main__.cpython-39.pyc,,
-gunicorn/__pycache__/arbiter.cpython-39.pyc,,
-gunicorn/__pycache__/config.cpython-39.pyc,,
-gunicorn/__pycache__/debug.cpython-39.pyc,,
-gunicorn/__pycache__/errors.cpython-39.pyc,,
-gunicorn/__pycache__/glogging.cpython-39.pyc,,
-gunicorn/__pycache__/pidfile.cpython-39.pyc,,
-gunicorn/__pycache__/reloader.cpython-39.pyc,,
-gunicorn/__pycache__/sock.cpython-39.pyc,,
-gunicorn/__pycache__/systemd.cpython-39.pyc,,
-gunicorn/__pycache__/util.cpython-39.pyc,,
-gunicorn/app/__init__.py,sha256=GuqstqdkizeV4HRbd8aGMBn0Q8IDOyRU1wMMNqNe5GY,127
-gunicorn/app/__pycache__/__init__.cpython-39.pyc,,
-gunicorn/app/__pycache__/base.cpython-39.pyc,,
-gunicorn/app/__pycache__/pasterapp.cpython-39.pyc,,
-gunicorn/app/__pycache__/wsgiapp.cpython-39.pyc,,
-gunicorn/app/base.py,sha256=wIDHzndfzyTcKySUMJmW_mscgLVj_K9w7UCOsUNcVFo,7150
-gunicorn/app/pasterapp.py,sha256=Bb0JwQNqZxmZ-gvvZUGWAEc9RX2BdhdhfhJ2a12Xafo,2038
-gunicorn/app/wsgiapp.py,sha256=Ktb5z0GPkCpDqQ0zS8zccYCvqJi8Su4zOekwKJulwBA,1926
-gunicorn/arbiter.py,sha256=0U6C550IKETMLzTXe1scCcNfayXPKo0ZB0nJkvRWxVA,20521
-gunicorn/config.py,sha256=IxV1P9X41D2_1tTkuOR093SUKxXf5tbYVUYpfymaygU,61423
-gunicorn/debug.py,sha256=UUw-eteLEm_OQ98D6K3XtDjx4Dya2H35zdiu8z7F7uc,2289
-gunicorn/errors.py,sha256=JlDBjag90gMiRwLHG3xzEJzDOntSl1iM32R277-U6j0,919
-gunicorn/glogging.py,sha256=k_bt1mrTczN0El0rWq9FE1pwi5cTYFJeg9xHBj_d-ZE,14913
-gunicorn/http/__init__.py,sha256=b4TF3x5F0VYOPTOeNYwRGR1EYHBaPMhZRMoNeuD5-n0,277
-gunicorn/http/__pycache__/__init__.cpython-39.pyc,,
-gunicorn/http/__pycache__/body.cpython-39.pyc,,
-gunicorn/http/__pycache__/errors.cpython-39.pyc,,
-gunicorn/http/__pycache__/message.cpython-39.pyc,,
-gunicorn/http/__pycache__/parser.cpython-39.pyc,,
-gunicorn/http/__pycache__/unreader.cpython-39.pyc,,
-gunicorn/http/__pycache__/wsgi.cpython-39.pyc,,
-gunicorn/http/body.py,sha256=X1vbGcTSM3-2UI2ubtavuTS4yOd0fpTyfeFaQZ_x92o,7297
-gunicorn/http/errors.py,sha256=sNjF2lm4m2qyZ9l95_U33FRxPXpxXzjnZyYqWS-hxd4,2850
-gunicorn/http/message.py,sha256=hmSmf8DOHkRNstcYYkhuw0eg065pTDL8BeybtPftTVc,11759
-gunicorn/http/parser.py,sha256=6eNGDUMEURYqzCXsftv3a4hYuD_fBvttZxOJuRbdKNg,1364
-gunicorn/http/unreader.py,sha256=pXVde3fNCUIO2FLOSJ0iNtEEpA0m8GH6_R2Sl-cB-J8,1943
-gunicorn/http/wsgi.py,sha256=25Q6VZlBFpt-Wqmsxwt6FLw7-ckk1dU5XBkeY7i5mmc,12328
-gunicorn/instrument/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-gunicorn/instrument/__pycache__/__init__.cpython-39.pyc,,
-gunicorn/instrument/__pycache__/statsd.cpython-39.pyc,,
-gunicorn/instrument/statsd.py,sha256=-_DKM8T-3CHaq3pyxhdS6UoV70oD21tTXyJ4bVTrVFc,4633
-gunicorn/pidfile.py,sha256=U3TpoE5_05wQxonGS4pV-aLkq8BMSvql142XJnE2olw,2367
-gunicorn/reloader.py,sha256=jDxzT3Mn2NdcKD9Jiex6HNh-XSjAlK-iw7L4R36h-L0,3777
-gunicorn/sock.py,sha256=dIpBDeH2X-pzMB94VRmc-MbMrbE_ZFTAhuasOo7QebM,6110
-gunicorn/systemd.py,sha256=k2qJb6wAEv9Vk-k8zuTr9OyHJW6K2GkqWrSNoR3zTrs,2511
-gunicorn/util.py,sha256=supyIhToKSH4QONMMqTwChHwFHovRSInR92xxURSPQg,18516
-gunicorn/workers/__init__.py,sha256=Gv_JJXKofikyiPbRAUQ0IXIchKxgt0Gu-8y-nYRN9vY,594
-gunicorn/workers/__pycache__/__init__.cpython-39.pyc,,
-gunicorn/workers/__pycache__/base.cpython-39.pyc,,
-gunicorn/workers/__pycache__/base_async.cpython-39.pyc,,
-gunicorn/workers/__pycache__/geventlet.cpython-39.pyc,,
-gunicorn/workers/__pycache__/ggevent.cpython-39.pyc,,
-gunicorn/workers/__pycache__/gthread.cpython-39.pyc,,
-gunicorn/workers/__pycache__/gtornado.cpython-39.pyc,,
-gunicorn/workers/__pycache__/sync.cpython-39.pyc,,
-gunicorn/workers/__pycache__/workertmp.cpython-39.pyc,,
-gunicorn/workers/base.py,sha256=jNF8BnkHhaFNEmvfKrH0DI2-LiOs9UbKFGAPOXoFH30,9103
-gunicorn/workers/base_async.py,sha256=Eyb-zHt6bhaVfsCVygauVGVbw6WrX0KKvk5kIK-2yZ4,5693
-gunicorn/workers/geventlet.py,sha256=DDlj1MGimp-dpovHVOJB0eEvqD45_O0xcFG06v5vnEg,5713
-gunicorn/workers/ggevent.py,sha256=qgrz1Lsfcnjh8pthi6FW8BQrbT5KqLZhGxMjC1fNtEc,5733
-gunicorn/workers/gthread.py,sha256=bteTEQkeEKJMgJbtf3GcP2oCRV8HNGx4_lfzaWblTjE,12194
-gunicorn/workers/gtornado.py,sha256=0d_MoAXbLsy1LnbKc3C2joo0AwQLYnmpCJCQupl-n-Q,5988
-gunicorn/workers/sync.py,sha256=HvyNnCDlAFH3o2Ynm6W_F3IXnYQLBAT1oSn_uQ3LhCA,7327
-gunicorn/workers/workertmp.py,sha256=4sygTmNodn5vZ5qUnSSB0dUwtfetgAxrTrhhYxgEObY,1649
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/REQUESTED b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/REQUESTED
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/WHEEL b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/WHEEL
deleted file mode 100644
index 385faab..0000000
--- a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/WHEEL
+++ /dev/null
@@ -1,5 +0,0 @@
-Wheel-Version: 1.0
-Generator: bdist_wheel (0.36.2)
-Root-Is-Purelib: true
-Tag: py3-none-any
-
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/entry_points.txt b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/entry_points.txt
deleted file mode 100644
index f70a5be..0000000
--- a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/entry_points.txt
+++ /dev/null
@@ -1,7 +0,0 @@
-
- [console_scripts]
- gunicorn=gunicorn.app.wsgiapp:run
-
- [paste.server_runner]
- main=gunicorn.app.pasterapp:serve
-
\ No newline at end of file
diff --git a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/top_level.txt b/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/top_level.txt
deleted file mode 100644
index 8f22dcc..0000000
--- a/env/lib/python3.9/site-packages/gunicorn-20.1.0.dist-info/top_level.txt
+++ /dev/null
@@ -1 +0,0 @@
-gunicorn
diff --git a/env/lib/python3.9/site-packages/gunicorn/__init__.py b/env/lib/python3.9/site-packages/gunicorn/__init__.py
deleted file mode 100644
index 29edada..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/__init__.py
+++ /dev/null
@@ -1,9 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-version_info = (20, 1, 0)
-__version__ = ".".join([str(v) for v in version_info])
-SERVER = "gunicorn"
-SERVER_SOFTWARE = "%s/%s" % (SERVER, __version__)
diff --git a/env/lib/python3.9/site-packages/gunicorn/__main__.py b/env/lib/python3.9/site-packages/gunicorn/__main__.py
deleted file mode 100644
index 49ba696..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/__main__.py
+++ /dev/null
@@ -1,7 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-from gunicorn.app.wsgiapp import run
-run()
diff --git a/env/lib/python3.9/site-packages/gunicorn/app/__init__.py b/env/lib/python3.9/site-packages/gunicorn/app/__init__.py
deleted file mode 100644
index 87f0611..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/app/__init__.py
+++ /dev/null
@@ -1,4 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
diff --git a/env/lib/python3.9/site-packages/gunicorn/app/base.py b/env/lib/python3.9/site-packages/gunicorn/app/base.py
deleted file mode 100644
index df8c666..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/app/base.py
+++ /dev/null
@@ -1,231 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-import importlib.util
-import importlib.machinery
-import os
-import sys
-import traceback
-
-from gunicorn import util
-from gunicorn.arbiter import Arbiter
-from gunicorn.config import Config, get_default_config_file
-from gunicorn import debug
-
-
-class BaseApplication(object):
- """
- An application interface for configuring and loading
- the various necessities for any given web framework.
- """
- def __init__(self, usage=None, prog=None):
- self.usage = usage
- self.cfg = None
- self.callable = None
- self.prog = prog
- self.logger = None
- self.do_load_config()
-
- def do_load_config(self):
- """
- Loads the configuration
- """
- try:
- self.load_default_config()
- self.load_config()
- except Exception as e:
- print("\nError: %s" % str(e), file=sys.stderr)
- sys.stderr.flush()
- sys.exit(1)
-
- def load_default_config(self):
- # init configuration
- self.cfg = Config(self.usage, prog=self.prog)
-
- def init(self, parser, opts, args):
- raise NotImplementedError
-
- def load(self):
- raise NotImplementedError
-
- def load_config(self):
- """
-        Loads the configuration from one or several inputs:
-        custom command line, configuration file.
-        You have to override this method in your subclass.
- """
- raise NotImplementedError
-
- def reload(self):
- self.do_load_config()
- if self.cfg.spew:
- debug.spew()
-
- def wsgi(self):
- if self.callable is None:
- self.callable = self.load()
- return self.callable
-
- def run(self):
- try:
- Arbiter(self).run()
- except RuntimeError as e:
- print("\nError: %s\n" % e, file=sys.stderr)
- sys.stderr.flush()
- sys.exit(1)
-
-
-class Application(BaseApplication):
-
- # 'init' and 'load' methods are implemented by WSGIApplication.
- # pylint: disable=abstract-method
-
- def chdir(self):
- # chdir to the configured path before loading,
- # default is the current dir
- os.chdir(self.cfg.chdir)
-
- # add the path to sys.path
- if self.cfg.chdir not in sys.path:
- sys.path.insert(0, self.cfg.chdir)
-
- def get_config_from_filename(self, filename):
-
- if not os.path.exists(filename):
- raise RuntimeError("%r doesn't exist" % filename)
-
- ext = os.path.splitext(filename)[1]
-
- try:
- module_name = '__config__'
- if ext in [".py", ".pyc"]:
- spec = importlib.util.spec_from_file_location(module_name, filename)
- else:
- msg = "configuration file should have a valid Python extension.\n"
- util.warn(msg)
- loader_ = importlib.machinery.SourceFileLoader(module_name, filename)
- spec = importlib.util.spec_from_file_location(module_name, filename, loader=loader_)
- mod = importlib.util.module_from_spec(spec)
- sys.modules[module_name] = mod
- spec.loader.exec_module(mod)
- except Exception:
- print("Failed to read config file: %s" % filename, file=sys.stderr)
- traceback.print_exc()
- sys.stderr.flush()
- sys.exit(1)
-
- return vars(mod)
-
- def get_config_from_module_name(self, module_name):
- return vars(importlib.import_module(module_name))
-
- def load_config_from_module_name_or_filename(self, location):
- """
-        Loads the configuration file: the file must be a Python file, otherwise a
-        RuntimeError is raised; the process stops if the file contains a syntax error.
- """
-
- if location.startswith("python:"):
- module_name = location[len("python:"):]
- cfg = self.get_config_from_module_name(module_name)
- else:
- if location.startswith("file:"):
- filename = location[len("file:"):]
- else:
- filename = location
- cfg = self.get_config_from_filename(filename)
-
- for k, v in cfg.items():
- # Ignore unknown names
- if k not in self.cfg.settings:
- continue
- try:
- self.cfg.set(k.lower(), v)
- except Exception:
- print("Invalid value for %s: %s\n" % (k, v), file=sys.stderr)
- sys.stderr.flush()
- raise
-
- return cfg
-
- def load_config_from_file(self, filename):
- return self.load_config_from_module_name_or_filename(location=filename)
-
- def load_config(self):
- # parse console args
- parser = self.cfg.parser()
- args = parser.parse_args()
-
- # optional settings from apps
- cfg = self.init(parser, args, args.args)
-
- # set up import paths and follow symlinks
- self.chdir()
-
-        # Load up any app-specific configuration
- if cfg:
- for k, v in cfg.items():
- self.cfg.set(k.lower(), v)
-
- env_args = parser.parse_args(self.cfg.get_cmd_args_from_env())
-
- if args.config:
- self.load_config_from_file(args.config)
- elif env_args.config:
- self.load_config_from_file(env_args.config)
- else:
- default_config = get_default_config_file()
- if default_config is not None:
- self.load_config_from_file(default_config)
-
- # Load up environment configuration
- for k, v in vars(env_args).items():
- if v is None:
- continue
- if k == "args":
- continue
- self.cfg.set(k.lower(), v)
-
- # Lastly, update the configuration with any command line settings.
- for k, v in vars(args).items():
- if v is None:
- continue
- if k == "args":
- continue
- self.cfg.set(k.lower(), v)
-
- # current directory might be changed by the config now
- # set up import paths and follow symlinks
- self.chdir()
-
- def run(self):
- if self.cfg.print_config:
- print(self.cfg)
-
- if self.cfg.print_config or self.cfg.check_config:
- try:
- self.load()
- except Exception:
- msg = "\nError while loading the application:\n"
- print(msg, file=sys.stderr)
- traceback.print_exc()
- sys.stderr.flush()
- sys.exit(1)
- sys.exit(0)
-
- if self.cfg.spew:
- debug.spew()
-
- if self.cfg.daemon:
- util.daemonize(self.cfg.enable_stdio_inheritance)
-
- # set python paths
- if self.cfg.pythonpath:
- paths = self.cfg.pythonpath.split(",")
- for path in paths:
- pythonpath = os.path.abspath(path)
- if pythonpath not in sys.path:
- sys.path.insert(0, pythonpath)
-
- super().run()
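The `load_config()`/`load()` hooks above are the basis of gunicorn's documented custom-application pattern for embedding the server. A sketch under that pattern (the `handler_app` callable and the option values are illustrative, not taken from this diff):

```python
import gunicorn.app.base

def handler_app(environ, start_response):
    # Illustrative WSGI callable.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello, world!\n']

class StandaloneApplication(gunicorn.app.base.BaseApplication):
    def __init__(self, app, options=None):
        self.options = options or {}
        self.application = app
        super().__init__()

    def load_config(self):
        # Push only known, non-None settings into the Config object.
        for key, value in self.options.items():
            if key in self.cfg.settings and value is not None:
                self.cfg.set(key.lower(), value)

    def load(self):
        return self.application

if __name__ == '__main__':
    StandaloneApplication(handler_app, {'bind': '127.0.0.1:8080', 'workers': 2}).run()
```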
diff --git a/env/lib/python3.9/site-packages/gunicorn/app/pasterapp.py b/env/lib/python3.9/site-packages/gunicorn/app/pasterapp.py
deleted file mode 100644
index 4c9fc7d..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/app/pasterapp.py
+++ /dev/null
@@ -1,75 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import configparser
-import os
-
-from paste.deploy import loadapp
-
-from gunicorn.app.wsgiapp import WSGIApplication
-from gunicorn.config import get_default_config_file
-
-
-def get_wsgi_app(config_uri, name=None, defaults=None):
- if ':' not in config_uri:
- config_uri = "config:%s" % config_uri
-
- return loadapp(
- config_uri,
- name=name,
- relative_to=os.getcwd(),
- global_conf=defaults,
- )
-
-
-def has_logging_config(config_file):
- parser = configparser.ConfigParser()
- parser.read([config_file])
- return parser.has_section('loggers')
-
-
-def serve(app, global_conf, **local_conf):
- """\
- A Paste Deployment server runner.
-
- Example configuration:
-
- [server:main]
- use = egg:gunicorn#main
- host = 127.0.0.1
- port = 5000
- """
- config_file = global_conf['__file__']
- gunicorn_config_file = local_conf.pop('config', None)
-
- host = local_conf.pop('host', '')
- port = local_conf.pop('port', '')
- if host and port:
- local_conf['bind'] = '%s:%s' % (host, port)
- elif host:
- local_conf['bind'] = host.split(',')
-
- class PasterServerApplication(WSGIApplication):
- def load_config(self):
- self.cfg.set("default_proc_name", config_file)
-
- if has_logging_config(config_file):
- self.cfg.set("logconfig", config_file)
-
- if gunicorn_config_file:
- self.load_config_from_file(gunicorn_config_file)
- else:
- default_gunicorn_config_file = get_default_config_file()
- if default_gunicorn_config_file is not None:
- self.load_config_from_file(default_gunicorn_config_file)
-
- for k, v in local_conf.items():
- if v is not None:
- self.cfg.set(k.lower(), v)
-
- def load(self):
- return app
-
- PasterServerApplication().run()
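
`get_wsgi_app()` above normalizes a bare path into a `config:` URI before handing it to PasteDeploy's `loadapp()`, so both `production.ini` and `config:production.ini#admin` forms work. A short usage sketch, assuming PasteDeploy is installed and a hypothetical `production.ini` with an `[app:main]` section exists in the working directory:

```python
# "production.ini" is a hypothetical PasteDeploy config file.
from gunicorn.app.pasterapp import get_wsgi_app

app = get_wsgi_app("production.ini", name="main")
```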
diff --git a/env/lib/python3.9/site-packages/gunicorn/app/wsgiapp.py b/env/lib/python3.9/site-packages/gunicorn/app/wsgiapp.py
deleted file mode 100644
index 36cfba9..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/app/wsgiapp.py
+++ /dev/null
@@ -1,71 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import os
-
-from gunicorn.errors import ConfigError
-from gunicorn.app.base import Application
-from gunicorn import util
-
-
-class WSGIApplication(Application):
- def init(self, parser, opts, args):
- self.app_uri = None
-
- if opts.paste:
- from .pasterapp import has_logging_config
-
- config_uri = os.path.abspath(opts.paste)
- config_file = config_uri.split('#')[0]
-
- if not os.path.exists(config_file):
- raise ConfigError("%r not found" % config_file)
-
- self.cfg.set("default_proc_name", config_file)
- self.app_uri = config_uri
-
- if has_logging_config(config_file):
- self.cfg.set("logconfig", config_file)
-
- return
-
- if len(args) > 0:
- self.cfg.set("default_proc_name", args[0])
- self.app_uri = args[0]
-
- def load_config(self):
- super().load_config()
-
- if self.app_uri is None:
- if self.cfg.wsgi_app is not None:
- self.app_uri = self.cfg.wsgi_app
- else:
- raise ConfigError("No application module specified.")
-
- def load_wsgiapp(self):
- return util.import_app(self.app_uri)
-
- def load_pasteapp(self):
- from .pasterapp import get_wsgi_app
- return get_wsgi_app(self.app_uri, defaults=self.cfg.paste_global_conf)
-
- def load(self):
- if self.cfg.paste is not None:
- return self.load_pasteapp()
- else:
- return self.load_wsgiapp()
-
-
-def run():
- """\
- The ``gunicorn`` command line runner for launching Gunicorn with
- generic WSGI applications.
- """
- from gunicorn.app.wsgiapp import WSGIApplication
- WSGIApplication("%(prog)s [OPTIONS] [APP_MODULE]").run()
-
-
-if __name__ == '__main__':
- run()
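
`init()` above resolves the application source: a `--paste` config takes priority; otherwise the first positional argument (or the `wsgi_app` setting) becomes `app_uri`, which `load()` imports via `util.import_app`. A sketch of driving the same entry point from Python instead of the `gunicorn` console script, where the module path is a hypothetical example:

```python
# Roughly equivalent to `gunicorn --workers 2 myapp.wsgi:application`;
# "myapp.wsgi:application" is a hypothetical module path.
import sys

from gunicorn.app.wsgiapp import WSGIApplication

sys.argv = ["gunicorn", "--workers", "2", "myapp.wsgi:application"]
WSGIApplication("%(prog)s [OPTIONS] [APP_MODULE]").run()
```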
diff --git a/env/lib/python3.9/site-packages/gunicorn/arbiter.py b/env/lib/python3.9/site-packages/gunicorn/arbiter.py
deleted file mode 100644
index 24ec387..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/arbiter.py
+++ /dev/null
@@ -1,652 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-import errno
-import os
-import random
-import select
-import signal
-import sys
-import time
-import traceback
-
-from gunicorn.errors import HaltServer, AppImportError
-from gunicorn.pidfile import Pidfile
-from gunicorn import sock, systemd, util
-
-from gunicorn import __version__, SERVER_SOFTWARE
-
-
-class Arbiter(object):
- """
-    The Arbiter keeps the worker processes alive, launching or
-    killing them as needed. It also manages application reloading
-    via SIGHUP/USR2.
- """
-
-    # A flag indicating if a worker failed to boot.
-    # If a worker process exits with this error code,
-    # the arbiter will terminate.
- WORKER_BOOT_ERROR = 3
-
- # A flag indicating if an application failed to be loaded
- APP_LOAD_ERROR = 4
-
- START_CTX = {}
-
- LISTENERS = []
- WORKERS = {}
- PIPE = []
-
- # I love dynamic languages
- SIG_QUEUE = []
- SIGNALS = [getattr(signal, "SIG%s" % x)
- for x in "HUP QUIT INT TERM TTIN TTOU USR1 USR2 WINCH".split()]
- SIG_NAMES = dict(
- (getattr(signal, name), name[3:].lower()) for name in dir(signal)
- if name[:3] == "SIG" and name[3] != "_"
- )
-
- def __init__(self, app):
- os.environ["SERVER_SOFTWARE"] = SERVER_SOFTWARE
-
- self._num_workers = None
- self._last_logged_active_worker_count = None
- self.log = None
-
- self.setup(app)
-
- self.pidfile = None
- self.systemd = False
- self.worker_age = 0
- self.reexec_pid = 0
- self.master_pid = 0
- self.master_name = "Master"
-
- cwd = util.getcwd()
-
- args = sys.argv[:]
- args.insert(0, sys.executable)
-
- # init start context
- self.START_CTX = {
- "args": args,
- "cwd": cwd,
- 0: sys.executable
- }
-
- def _get_num_workers(self):
- return self._num_workers
-
- def _set_num_workers(self, value):
- old_value = self._num_workers
- self._num_workers = value
- self.cfg.nworkers_changed(self, value, old_value)
- num_workers = property(_get_num_workers, _set_num_workers)
-
- def setup(self, app):
- self.app = app
- self.cfg = app.cfg
-
- if self.log is None:
- self.log = self.cfg.logger_class(app.cfg)
-
- # reopen files
- if 'GUNICORN_FD' in os.environ:
- self.log.reopen_files()
-
- self.worker_class = self.cfg.worker_class
- self.address = self.cfg.address
- self.num_workers = self.cfg.workers
- self.timeout = self.cfg.timeout
- self.proc_name = self.cfg.proc_name
-
- self.log.debug('Current configuration:\n{0}'.format(
- '\n'.join(
- ' {0}: {1}'.format(config, value.value)
- for config, value
- in sorted(self.cfg.settings.items(),
- key=lambda setting: setting[1]))))
-
-        # set environment variables
- if self.cfg.env:
- for k, v in self.cfg.env.items():
- os.environ[k] = v
-
- if self.cfg.preload_app:
- self.app.wsgi()
-
- def start(self):
- """\
- Initialize the arbiter. Start listening and set pidfile if needed.
- """
- self.log.info("Starting gunicorn %s", __version__)
-
- if 'GUNICORN_PID' in os.environ:
- self.master_pid = int(os.environ.get('GUNICORN_PID'))
- self.proc_name = self.proc_name + ".2"
- self.master_name = "Master.2"
-
- self.pid = os.getpid()
- if self.cfg.pidfile is not None:
- pidname = self.cfg.pidfile
- if self.master_pid != 0:
- pidname += ".2"
- self.pidfile = Pidfile(pidname)
- self.pidfile.create(self.pid)
- self.cfg.on_starting(self)
-
- self.init_signals()
-
- if not self.LISTENERS:
- fds = None
- listen_fds = systemd.listen_fds()
- if listen_fds:
- self.systemd = True
- fds = range(systemd.SD_LISTEN_FDS_START,
- systemd.SD_LISTEN_FDS_START + listen_fds)
-
- elif self.master_pid:
- fds = []
- for fd in os.environ.pop('GUNICORN_FD').split(','):
- fds.append(int(fd))
-
- self.LISTENERS = sock.create_sockets(self.cfg, self.log, fds)
-
- listeners_str = ",".join([str(l) for l in self.LISTENERS])
- self.log.debug("Arbiter booted")
- self.log.info("Listening at: %s (%s)", listeners_str, self.pid)
- self.log.info("Using worker: %s", self.cfg.worker_class_str)
- systemd.sd_notify("READY=1\nSTATUS=Gunicorn arbiter booted", self.log)
-
- # check worker class requirements
- if hasattr(self.worker_class, "check_config"):
- self.worker_class.check_config(self.cfg, self.log)
-
- self.cfg.when_ready(self)
-
- def init_signals(self):
- """\
- Initialize master signal handling. Most of the signals
- are queued. Child signals only wake up the master.
- """
- # close old PIPE
- for p in self.PIPE:
- os.close(p)
-
- # initialize the pipe
- self.PIPE = pair = os.pipe()
- for p in pair:
- util.set_non_blocking(p)
- util.close_on_exec(p)
-
- self.log.close_on_exec()
-
- # initialize all signals
- for s in self.SIGNALS:
- signal.signal(s, self.signal)
- signal.signal(signal.SIGCHLD, self.handle_chld)
-
- def signal(self, sig, frame):
- if len(self.SIG_QUEUE) < 5:
- self.SIG_QUEUE.append(sig)
- self.wakeup()
-
- def run(self):
- "Main master loop."
- self.start()
- util._setproctitle("master [%s]" % self.proc_name)
-
- try:
- self.manage_workers()
-
- while True:
- self.maybe_promote_master()
-
- sig = self.SIG_QUEUE.pop(0) if self.SIG_QUEUE else None
- if sig is None:
- self.sleep()
- self.murder_workers()
- self.manage_workers()
- continue
-
- if sig not in self.SIG_NAMES:
- self.log.info("Ignoring unknown signal: %s", sig)
- continue
-
- signame = self.SIG_NAMES.get(sig)
- handler = getattr(self, "handle_%s" % signame, None)
- if not handler:
- self.log.error("Unhandled signal: %s", signame)
- continue
- self.log.info("Handling signal: %s", signame)
- handler()
- self.wakeup()
- except (StopIteration, KeyboardInterrupt):
- self.halt()
- except HaltServer as inst:
- self.halt(reason=inst.reason, exit_status=inst.exit_status)
- except SystemExit:
- raise
- except Exception:
- self.log.info("Unhandled exception in main loop",
- exc_info=True)
- self.stop(False)
- if self.pidfile is not None:
- self.pidfile.unlink()
- sys.exit(-1)
-
- def handle_chld(self, sig, frame):
- "SIGCHLD handling"
- self.reap_workers()
- self.wakeup()
-
- def handle_hup(self):
- """\
- HUP handling.
- - Reload configuration
- - Start the new worker processes with a new configuration
- - Gracefully shutdown the old worker processes
- """
- self.log.info("Hang up: %s", self.master_name)
- self.reload()
-
- def handle_term(self):
- "SIGTERM handling"
- raise StopIteration
-
- def handle_int(self):
- "SIGINT handling"
- self.stop(False)
- raise StopIteration
-
- def handle_quit(self):
- "SIGQUIT handling"
- self.stop(False)
- raise StopIteration
-
- def handle_ttin(self):
- """\
- SIGTTIN handling.
- Increases the number of workers by one.
- """
- self.num_workers += 1
- self.manage_workers()
-
- def handle_ttou(self):
- """\
- SIGTTOU handling.
- Decreases the number of workers by one.
- """
- if self.num_workers <= 1:
- return
- self.num_workers -= 1
- self.manage_workers()
-
- def handle_usr1(self):
- """\
- SIGUSR1 handling.
- Kill all workers by sending them a SIGUSR1
- """
- self.log.reopen_files()
- self.kill_workers(signal.SIGUSR1)
-
- def handle_usr2(self):
- """\
- SIGUSR2 handling.
- Creates a new arbiter/worker set as a fork of the current
- arbiter without affecting old workers. Use this to do live
-        deployment with the ability to back out a change.
- """
- self.reexec()
-
- def handle_winch(self):
- """SIGWINCH handling"""
- if self.cfg.daemon:
- self.log.info("graceful stop of workers")
- self.num_workers = 0
- self.kill_workers(signal.SIGTERM)
- else:
- self.log.debug("SIGWINCH ignored. Not daemonized")
-
- def maybe_promote_master(self):
- if self.master_pid == 0:
- return
-
- if self.master_pid != os.getppid():
- self.log.info("Master has been promoted.")
-            # reset master info
- self.master_name = "Master"
- self.master_pid = 0
- self.proc_name = self.cfg.proc_name
- del os.environ['GUNICORN_PID']
- # rename the pidfile
- if self.pidfile is not None:
- self.pidfile.rename(self.cfg.pidfile)
- # reset proctitle
- util._setproctitle("master [%s]" % self.proc_name)
-
- def wakeup(self):
- """\
- Wake up the arbiter by writing to the PIPE
- """
- try:
- os.write(self.PIPE[1], b'.')
- except IOError as e:
- if e.errno not in [errno.EAGAIN, errno.EINTR]:
- raise
-
- def halt(self, reason=None, exit_status=0):
- """ halt arbiter """
- self.stop()
- self.log.info("Shutting down: %s", self.master_name)
- if reason is not None:
- self.log.info("Reason: %s", reason)
- if self.pidfile is not None:
- self.pidfile.unlink()
- self.cfg.on_exit(self)
- sys.exit(exit_status)
-
- def sleep(self):
- """\
- Sleep until PIPE is readable or we timeout.
- A readable PIPE means a signal occurred.
- """
- try:
- ready = select.select([self.PIPE[0]], [], [], 1.0)
- if not ready[0]:
- return
- while os.read(self.PIPE[0], 1):
- pass
- except (select.error, OSError) as e:
- # TODO: select.error is a subclass of OSError since Python 3.3.
- error_number = getattr(e, 'errno', e.args[0])
- if error_number not in [errno.EAGAIN, errno.EINTR]:
- raise
- except KeyboardInterrupt:
- sys.exit()
-
- def stop(self, graceful=True):
- """\
- Stop workers
-
-        :attr graceful: boolean, if True (the default) workers will be
-        killed gracefully (i.e. waiting for the current connection to finish)
- """
- unlink = (
- self.reexec_pid == self.master_pid == 0
- and not self.systemd
- and not self.cfg.reuse_port
- )
- sock.close_sockets(self.LISTENERS, unlink)
-
- self.LISTENERS = []
- sig = signal.SIGTERM
- if not graceful:
- sig = signal.SIGQUIT
- limit = time.time() + self.cfg.graceful_timeout
- # instruct the workers to exit
- self.kill_workers(sig)
- # wait until the graceful timeout
- while self.WORKERS and time.time() < limit:
- time.sleep(0.1)
-
- self.kill_workers(signal.SIGKILL)
-
- def reexec(self):
- """\
- Relaunch the master and workers.
- """
- if self.reexec_pid != 0:
- self.log.warning("USR2 signal ignored. Child exists.")
- return
-
- if self.master_pid != 0:
- self.log.warning("USR2 signal ignored. Parent exists.")
- return
-
- master_pid = os.getpid()
- self.reexec_pid = os.fork()
- if self.reexec_pid != 0:
- return
-
- self.cfg.pre_exec(self)
-
- environ = self.cfg.env_orig.copy()
- environ['GUNICORN_PID'] = str(master_pid)
-
- if self.systemd:
- environ['LISTEN_PID'] = str(os.getpid())
- environ['LISTEN_FDS'] = str(len(self.LISTENERS))
- else:
- environ['GUNICORN_FD'] = ','.join(
- str(l.fileno()) for l in self.LISTENERS)
-
- os.chdir(self.START_CTX['cwd'])
-
- # exec the process using the original environment
- os.execvpe(self.START_CTX[0], self.START_CTX['args'], environ)
-
- def reload(self):
- old_address = self.cfg.address
-
- # reset old environment
- for k in self.cfg.env:
- if k in self.cfg.env_orig:
- # reset the key to the value it had before
- # we launched gunicorn
- os.environ[k] = self.cfg.env_orig[k]
- else:
- # delete the value set by gunicorn
- try:
- del os.environ[k]
- except KeyError:
- pass
-
- # reload conf
- self.app.reload()
- self.setup(self.app)
-
- # reopen log files
- self.log.reopen_files()
-
-        # do we need to change the listeners?
- if old_address != self.cfg.address:
- # close all listeners
- for l in self.LISTENERS:
- l.close()
- # init new listeners
- self.LISTENERS = sock.create_sockets(self.cfg, self.log)
- listeners_str = ",".join([str(l) for l in self.LISTENERS])
- self.log.info("Listening at: %s", listeners_str)
-
- # do some actions on reload
- self.cfg.on_reload(self)
-
- # unlink pidfile
- if self.pidfile is not None:
- self.pidfile.unlink()
-
- # create new pidfile
- if self.cfg.pidfile is not None:
- self.pidfile = Pidfile(self.cfg.pidfile)
- self.pidfile.create(self.pid)
-
- # set new proc_name
- util._setproctitle("master [%s]" % self.proc_name)
-
- # spawn new workers
- for _ in range(self.cfg.workers):
- self.spawn_worker()
-
- # manage workers
- self.manage_workers()
-
- def murder_workers(self):
- """\
- Kill unused/idle workers
- """
- if not self.timeout:
- return
- workers = list(self.WORKERS.items())
- for (pid, worker) in workers:
- try:
- if time.time() - worker.tmp.last_update() <= self.timeout:
- continue
- except (OSError, ValueError):
- continue
-
- if not worker.aborted:
- self.log.critical("WORKER TIMEOUT (pid:%s)", pid)
- worker.aborted = True
- self.kill_worker(pid, signal.SIGABRT)
- else:
- self.kill_worker(pid, signal.SIGKILL)
-
- def reap_workers(self):
- """\
- Reap workers to avoid zombie processes
- """
- try:
- while True:
- wpid, status = os.waitpid(-1, os.WNOHANG)
- if not wpid:
- break
- if self.reexec_pid == wpid:
- self.reexec_pid = 0
- else:
- # A worker was terminated. If the termination reason was
- # that it could not boot, we'll shut it down to avoid
- # infinite start/stop cycles.
- exitcode = status >> 8
- if exitcode == self.WORKER_BOOT_ERROR:
- reason = "Worker failed to boot."
- raise HaltServer(reason, self.WORKER_BOOT_ERROR)
- if exitcode == self.APP_LOAD_ERROR:
- reason = "App failed to load."
- raise HaltServer(reason, self.APP_LOAD_ERROR)
- if os.WIFSIGNALED(status):
- self.log.warning(
- "Worker with pid %s was terminated due to signal %s",
- wpid,
- os.WTERMSIG(status)
- )
-
- worker = self.WORKERS.pop(wpid, None)
- if not worker:
- continue
- worker.tmp.close()
- self.cfg.child_exit(self, worker)
- except OSError as e:
- if e.errno != errno.ECHILD:
- raise
-
- def manage_workers(self):
- """\
- Maintain the number of workers by spawning or killing
- as required.
- """
- if len(self.WORKERS) < self.num_workers:
- self.spawn_workers()
-
- workers = self.WORKERS.items()
- workers = sorted(workers, key=lambda w: w[1].age)
- while len(workers) > self.num_workers:
- (pid, _) = workers.pop(0)
- self.kill_worker(pid, signal.SIGTERM)
-
- active_worker_count = len(workers)
- if self._last_logged_active_worker_count != active_worker_count:
- self._last_logged_active_worker_count = active_worker_count
- self.log.debug("{0} workers".format(active_worker_count),
- extra={"metric": "gunicorn.workers",
- "value": active_worker_count,
- "mtype": "gauge"})
-
- def spawn_worker(self):
- self.worker_age += 1
- worker = self.worker_class(self.worker_age, self.pid, self.LISTENERS,
- self.app, self.timeout / 2.0,
- self.cfg, self.log)
- self.cfg.pre_fork(self, worker)
- pid = os.fork()
- if pid != 0:
- worker.pid = pid
- self.WORKERS[pid] = worker
- return pid
-
- # Do not inherit the temporary files of other workers
- for sibling in self.WORKERS.values():
- sibling.tmp.close()
-
- # Process Child
- worker.pid = os.getpid()
- try:
- util._setproctitle("worker [%s]" % self.proc_name)
- self.log.info("Booting worker with pid: %s", worker.pid)
- self.cfg.post_fork(self, worker)
- worker.init_process()
- sys.exit(0)
- except SystemExit:
- raise
- except AppImportError as e:
- self.log.debug("Exception while loading the application",
- exc_info=True)
- print("%s" % e, file=sys.stderr)
- sys.stderr.flush()
- sys.exit(self.APP_LOAD_ERROR)
- except Exception:
- self.log.exception("Exception in worker process")
- if not worker.booted:
- sys.exit(self.WORKER_BOOT_ERROR)
- sys.exit(-1)
- finally:
- self.log.info("Worker exiting (pid: %s)", worker.pid)
- try:
- worker.tmp.close()
- self.cfg.worker_exit(self, worker)
- except Exception:
- self.log.warning("Exception during worker exit:\n%s",
- traceback.format_exc())
-
- def spawn_workers(self):
- """\
- Spawn new workers as needed.
-
- This is where a worker process leaves the main loop
- of the master process.
- """
-
- for _ in range(self.num_workers - len(self.WORKERS)):
- self.spawn_worker()
- time.sleep(0.1 * random.random())
-
- def kill_workers(self, sig):
- """\
- Kill all workers with the signal `sig`
- :attr sig: `signal.SIG*` value
- """
- worker_pids = list(self.WORKERS.keys())
- for pid in worker_pids:
- self.kill_worker(pid, sig)
-
- def kill_worker(self, pid, sig):
- """\
- Kill a worker
-
- :attr pid: int, worker pid
- :attr sig: `signal.SIG*` value
- """
- try:
- os.kill(pid, sig)
- except OSError as e:
- if e.errno == errno.ESRCH:
- try:
- worker = self.WORKERS.pop(pid)
- worker.tmp.close()
- self.cfg.worker_exit(self, worker)
- return
- except (KeyError, OSError):
- return
- raise
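
The `handle_*` methods above define the master's runtime signal protocol: `TTIN`/`TTOU` grow and shrink the worker pool, `HUP` reloads the configuration and workers, `USR1` reopens log files, `USR2` re-execs a new master, and `TERM`/`INT`/`QUIT` shut the server down. A small control sketch, assuming the master's pid was written to `gunicorn.pid` via the `pidfile` setting:

```python
# Drive a running master via signals; the "gunicorn.pid" path is an
# assumption based on the `pidfile` setting.
import os
import signal

with open("gunicorn.pid") as f:
    master_pid = int(f.read().strip())

os.kill(master_pid, signal.SIGTTIN)  # handle_ttin: add one worker
os.kill(master_pid, signal.SIGTTOU)  # handle_ttou: remove one worker
os.kill(master_pid, signal.SIGHUP)   # handle_hup: reload configuration
```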
diff --git a/env/lib/python3.9/site-packages/gunicorn/config.py b/env/lib/python3.9/site-packages/gunicorn/config.py
deleted file mode 100644
index bc24b70..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/config.py
+++ /dev/null
@@ -1,2190 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-# Please remember to run "make -C docs html" after updating "desc" attributes.
-
-import argparse
-import copy
-import grp
-import inspect
-import os
-import pwd
-import re
-import shlex
-import ssl
-import sys
-import textwrap
-
-from gunicorn import __version__, util
-from gunicorn.errors import ConfigError
-from gunicorn.reloader import reloader_engines
-
-KNOWN_SETTINGS = []
-PLATFORM = sys.platform
-
-
-def make_settings(ignore=None):
- settings = {}
- ignore = ignore or ()
- for s in KNOWN_SETTINGS:
- setting = s()
- if setting.name in ignore:
- continue
- settings[setting.name] = setting.copy()
- return settings
-
-
-def auto_int(_, x):
-    # for compatibility with octal numbers in Python 3
- if re.match(r'0(\d)', x, re.IGNORECASE):
- x = x.replace('0', '0o', 1)
- return int(x, 0)
-
-
-class Config(object):
-
- def __init__(self, usage=None, prog=None):
- self.settings = make_settings()
- self.usage = usage
- self.prog = prog or os.path.basename(sys.argv[0])
- self.env_orig = os.environ.copy()
-
- def __str__(self):
- lines = []
- kmax = max(len(k) for k in self.settings)
- for k in sorted(self.settings):
- v = self.settings[k].value
- if callable(v):
- v = "<{}()>".format(v.__qualname__)
- lines.append("{k:{kmax}} = {v}".format(k=k, v=v, kmax=kmax))
- return "\n".join(lines)
-
- def __getattr__(self, name):
- if name not in self.settings:
- raise AttributeError("No configuration setting for: %s" % name)
- return self.settings[name].get()
-
- def __setattr__(self, name, value):
- if name != "settings" and name in self.settings:
- raise AttributeError("Invalid access!")
- super().__setattr__(name, value)
-
- def set(self, name, value):
- if name not in self.settings:
- raise AttributeError("No configuration setting for: %s" % name)
- self.settings[name].set(value)
-
- def get_cmd_args_from_env(self):
- if 'GUNICORN_CMD_ARGS' in self.env_orig:
- return shlex.split(self.env_orig['GUNICORN_CMD_ARGS'])
- return []
-
- def parser(self):
- kwargs = {
- "usage": self.usage,
- "prog": self.prog
- }
- parser = argparse.ArgumentParser(**kwargs)
- parser.add_argument("-v", "--version",
- action="version", default=argparse.SUPPRESS,
- version="%(prog)s (version " + __version__ + ")\n",
- help="show program's version number and exit")
- parser.add_argument("args", nargs="*", help=argparse.SUPPRESS)
-
- keys = sorted(self.settings, key=self.settings.__getitem__)
- for k in keys:
- self.settings[k].add_option(parser)
-
- return parser
-
- @property
- def worker_class_str(self):
- uri = self.settings['worker_class'].get()
-
- # are we using a threaded worker?
- is_sync = uri.endswith('SyncWorker') or uri == 'sync'
- if is_sync and self.threads > 1:
- return "gthread"
- return uri
-
- @property
- def worker_class(self):
- uri = self.settings['worker_class'].get()
-
- # are we using a threaded worker?
- is_sync = uri.endswith('SyncWorker') or uri == 'sync'
- if is_sync and self.threads > 1:
- uri = "gunicorn.workers.gthread.ThreadWorker"
-
- worker_class = util.load_class(uri)
- if hasattr(worker_class, "setup"):
- worker_class.setup()
- return worker_class
-
- @property
- def address(self):
- s = self.settings['bind'].get()
- return [util.parse_address(util.bytes_to_str(bind)) for bind in s]
-
- @property
- def uid(self):
- return self.settings['user'].get()
-
- @property
- def gid(self):
- return self.settings['group'].get()
-
- @property
- def proc_name(self):
- pn = self.settings['proc_name'].get()
- if pn is not None:
- return pn
- else:
- return self.settings['default_proc_name'].get()
-
- @property
- def logger_class(self):
- uri = self.settings['logger_class'].get()
- if uri == "simple":
- # support the default
- uri = LoggerClass.default
-
- # if default logger is in use, and statsd is on, automagically switch
- # to the statsd logger
- if uri == LoggerClass.default:
- if 'statsd_host' in self.settings and self.settings['statsd_host'].value is not None:
- uri = "gunicorn.instrument.statsd.Statsd"
-
- logger_class = util.load_class(
- uri,
- default="gunicorn.glogging.Logger",
- section="gunicorn.loggers")
-
- if hasattr(logger_class, "install"):
- logger_class.install()
- return logger_class
-
- @property
- def is_ssl(self):
- return self.certfile or self.keyfile
-
- @property
- def ssl_options(self):
- opts = {}
- for name, value in self.settings.items():
- if value.section == 'SSL':
- opts[name] = value.get()
- return opts
-
- @property
- def env(self):
- raw_env = self.settings['raw_env'].get()
- env = {}
-
- if not raw_env:
- return env
-
- for e in raw_env:
- s = util.bytes_to_str(e)
- try:
- k, v = s.split('=', 1)
- except ValueError:
- raise RuntimeError("environment setting %r invalid" % s)
-
- env[k] = v
-
- return env
-
- @property
- def sendfile(self):
- if self.settings['sendfile'].get() is not None:
- return False
-
- if 'SENDFILE' in os.environ:
- sendfile = os.environ['SENDFILE'].lower()
- return sendfile in ['y', '1', 'yes', 'true']
-
- return True
-
- @property
- def reuse_port(self):
- return self.settings['reuse_port'].get()
-
- @property
- def paste_global_conf(self):
- raw_global_conf = self.settings['raw_paste_global_conf'].get()
- if raw_global_conf is None:
- return None
-
- global_conf = {}
- for e in raw_global_conf:
- s = util.bytes_to_str(e)
- try:
-                k, v = re.split(r'(?<!\\)=', s, 1)
-            except ValueError:
-                raise RuntimeError("environment setting %r invalid" % s)
-            k = k.replace('\\=', '=')
-            v = v.replace('\\=', '=')
-            global_conf[k] = v
-
-        return global_conf
-
-
-class SettingMeta(type):
-    def __new__(cls, name, bases, attrs):
-        super_new = super().__new__
-        parent = [b for b in bases if isinstance(b, SettingMeta)]
-        if not parent:
-            return super_new(cls, name, bases, attrs)
-
-        attrs["order"] = len(KNOWN_SETTINGS)
-        attrs["validator"] = staticmethod(attrs["validator"])
-
-        new_class = super_new(cls, name, bases, attrs)
-        new_class.fmt_desc(attrs.get("desc", ""))
-        KNOWN_SETTINGS.append(new_class)
-        return new_class
-
-    def fmt_desc(cls, desc):
-        desc = textwrap.dedent(desc).strip()
-        setattr(cls, "desc", desc)
-        setattr(cls, "short", desc.splitlines()[0])
-
-
-class Setting(object):
-    name = None
-    value = None
-    section = None
-    cli = None
-    validator = None
-    type = None
-    meta = None
-    action = None
-    default = None
-    short = None
-    desc = None
-    nargs = None
-    const = None
-
-    def __init__(self):
-        if self.default is not None:
-            self.set(self.default)
-
-    def add_option(self, parser):
-        if not self.cli:
-            return
-        args = tuple(self.cli)
-
-        help_txt = "%s [%s]" % (self.short, self.default)
-        help_txt = help_txt.replace("%", "%%")
-
-        kwargs = {
-            "dest": self.name,
-            "action": self.action or "store",
-            "type": self.type or str,
-            "default": None,
-            "help": help_txt
-        }
-
-        if self.meta is not None:
-            kwargs['metavar'] = self.meta
-
-        if kwargs["action"] != "store":
-            kwargs.pop("type")
-
-        if self.nargs is not None:
-            kwargs["nargs"] = self.nargs
-
-        if self.const is not None:
-            kwargs["const"] = self.const
-
-        parser.add_argument(*args, **kwargs)
-
-    def copy(self):
-        return copy.copy(self)
-
-    def get(self):
-        return self.value
-
-    def set(self, val):
-        if not callable(self.validator):
-            raise TypeError('Invalid validator: %s' % self.name)
-        self.value = self.validator(val)
-
-    def __lt__(self, other):
-        return (self.section == other.section and
-                self.order < other.order)
-    __cmp__ = __lt__
-
-    def __repr__(self):
-        return "<%s.%s object at %x with value %r>" % (
-            self.__class__.__module__,
-            self.__class__.__name__,
-            id(self),
-            self.value,
-        )
-
-
-Setting = SettingMeta('Setting', (Setting,), {})
-
-
-def validate_bool(val):
- if val is None:
- return
-
- if isinstance(val, bool):
- return val
- if not isinstance(val, str):
- raise TypeError("Invalid type for casting: %s" % val)
- if val.lower().strip() == "true":
- return True
- elif val.lower().strip() == "false":
- return False
- else:
- raise ValueError("Invalid boolean: %s" % val)
-
-
-def validate_dict(val):
- if not isinstance(val, dict):
- raise TypeError("Value is not a dictionary: %s " % val)
- return val
-
-
-def validate_pos_int(val):
- if not isinstance(val, int):
- val = int(val, 0)
- else:
- # Booleans are ints!
- val = int(val)
- if val < 0:
- raise ValueError("Value must be positive: %s" % val)
- return val
-
-
-def validate_ssl_version(val):
- ssl_versions = {}
- for protocol in [p for p in dir(ssl) if p.startswith("PROTOCOL_")]:
- ssl_versions[protocol[9:]] = getattr(ssl, protocol)
- if val in ssl_versions:
- # string matching PROTOCOL_...
- return ssl_versions[val]
-
- try:
- intval = validate_pos_int(val)
- if intval in ssl_versions.values():
- # positive int matching a protocol int constant
- return intval
- except (ValueError, TypeError):
- # negative integer or not an integer
- # drop this in favour of the more descriptive ValueError below
- pass
-
- raise ValueError("Invalid ssl_version: %s. Valid options: %s"
- % (val, ', '.join(ssl_versions)))
-
-
-def validate_string(val):
- if val is None:
- return None
- if not isinstance(val, str):
- raise TypeError("Not a string: %s" % val)
- return val.strip()
-
-
-def validate_file_exists(val):
- if val is None:
- return None
- if not os.path.exists(val):
- raise ValueError("File %s does not exists." % val)
- return val
-
-
-def validate_list_string(val):
- if not val:
- return []
-
- # legacy syntax
- if isinstance(val, str):
- val = [val]
-
- return [validate_string(v) for v in val]
-
-
-def validate_list_of_existing_files(val):
- return [validate_file_exists(v) for v in validate_list_string(val)]
-
-
-def validate_string_to_list(val):
- val = validate_string(val)
-
- if not val:
- return []
-
- return [v.strip() for v in val.split(",") if v]
-
-
-def validate_class(val):
- if inspect.isfunction(val) or inspect.ismethod(val):
- val = val()
- if inspect.isclass(val):
- return val
- return validate_string(val)
-
-
-def validate_callable(arity):
- def _validate_callable(val):
- if isinstance(val, str):
- try:
- mod_name, obj_name = val.rsplit(".", 1)
- except ValueError:
- raise TypeError("Value '%s' is not import string. "
- "Format: module[.submodules...].object" % val)
- try:
- mod = __import__(mod_name, fromlist=[obj_name])
- val = getattr(mod, obj_name)
- except ImportError as e:
- raise TypeError(str(e))
- except AttributeError:
- raise TypeError("Can not load '%s' from '%s'"
- "" % (obj_name, mod_name))
- if not callable(val):
- raise TypeError("Value is not callable: %s" % val)
- if arity != -1 and arity != util.get_arity(val):
- raise TypeError("Value must have an arity of: %s" % arity)
- return val
- return _validate_callable
-
-
-def validate_user(val):
- if val is None:
- return os.geteuid()
- if isinstance(val, int):
- return val
- elif val.isdigit():
- return int(val)
- else:
- try:
- return pwd.getpwnam(val).pw_uid
- except KeyError:
- raise ConfigError("No such user: '%s'" % val)
-
-
-def validate_group(val):
- if val is None:
- return os.getegid()
-
- if isinstance(val, int):
- return val
- elif val.isdigit():
- return int(val)
- else:
- try:
- return grp.getgrnam(val).gr_gid
- except KeyError:
- raise ConfigError("No such group: '%s'" % val)
-
-
-def validate_post_request(val):
- val = validate_callable(-1)(val)
-
- largs = util.get_arity(val)
- if largs == 4:
- return val
- elif largs == 3:
- return lambda worker, req, env, _r: val(worker, req, env)
- elif largs == 2:
- return lambda worker, req, _e, _r: val(worker, req)
- else:
- raise TypeError("Value must have an arity of: 4")
-
-
-def validate_chdir(val):
- # valid if the value is a string
- val = validate_string(val)
-
- # transform relative paths
- path = os.path.abspath(os.path.normpath(os.path.join(util.getcwd(), val)))
-
- # test if the path exists
- if not os.path.exists(path):
- raise ConfigError("can't chdir to %r" % val)
-
- return path
-
-
-def validate_hostport(val):
- val = validate_string(val)
- if val is None:
- return None
- elements = val.split(":")
- if len(elements) == 2:
- return (elements[0], int(elements[1]))
- else:
- raise TypeError("Value must consist of: hostname:port")
-
-
-def validate_reload_engine(val):
- if val not in reloader_engines:
- raise ConfigError("Invalid reload_engine: %r" % val)
-
- return val
-
-
-def get_default_config_file():
- config_path = os.path.join(os.path.abspath(os.getcwd()),
- 'gunicorn.conf.py')
- if os.path.exists(config_path):
- return config_path
- return None
-
-
-class ConfigFile(Setting):
- name = "config"
- section = "Config File"
- cli = ["-c", "--config"]
- meta = "CONFIG"
- validator = validate_string
- default = "./gunicorn.conf.py"
- desc = """\
- The Gunicorn config file.
-
- A string of the form ``PATH``, ``file:PATH``, or ``python:MODULE_NAME``.
-
- Only has an effect when specified on the command line or as part of an
- application specific configuration.
-
- By default, a file named ``gunicorn.conf.py`` will be read from the same
- directory where gunicorn is being run.
-
- .. versionchanged:: 19.4
- Loading the config from a Python module requires the ``python:``
- prefix.
- """
-
-class WSGIApp(Setting):
- name = "wsgi_app"
- section = "Config File"
- meta = "STRING"
- validator = validate_string
- default = None
- desc = """\
- A WSGI application path in pattern ``$(MODULE_NAME):$(VARIABLE_NAME)``.
-
- .. versionadded:: 20.1.0
- """
-
-class Bind(Setting):
- name = "bind"
- action = "append"
- section = "Server Socket"
- cli = ["-b", "--bind"]
- meta = "ADDRESS"
- validator = validate_list_string
-
- if 'PORT' in os.environ:
- default = ['0.0.0.0:{0}'.format(os.environ.get('PORT'))]
- else:
- default = ['127.0.0.1:8000']
-
- desc = """\
- The socket to bind.
-
- A string of the form: ``HOST``, ``HOST:PORT``, ``unix:PATH``,
- ``fd://FD``. An IP is a valid ``HOST``.
-
- .. versionchanged:: 20.0
- Support for ``fd://FD`` got added.
-
- Multiple addresses can be bound. ex.::
-
- $ gunicorn -b 127.0.0.1:8000 -b [::1]:8000 test:app
-
- will bind the `test:app` application on localhost both on ipv6
- and ipv4 interfaces.
-
- If the ``PORT`` environment variable is defined, the default
- is ``['0.0.0.0:$PORT']``. If it is not defined, the default
- is ``['127.0.0.1:8000']``.
- """
-
-
-class Backlog(Setting):
- name = "backlog"
- section = "Server Socket"
- cli = ["--backlog"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 2048
- desc = """\
- The maximum number of pending connections.
-
- This refers to the number of clients that can be waiting to be served.
- Exceeding this number results in the client getting an error when
- attempting to connect. It should only affect servers under significant
- load.
-
- Must be a positive integer. Generally set in the 64-2048 range.
- """
-
-
-class Workers(Setting):
- name = "workers"
- section = "Worker Processes"
- cli = ["-w", "--workers"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = int(os.environ.get("WEB_CONCURRENCY", 1))
- desc = """\
- The number of worker processes for handling requests.
-
- A positive integer generally in the ``2-4 x $(NUM_CORES)`` range.
- You'll want to vary this a bit to find the best for your particular
- application's work load.
-
- By default, the value of the ``WEB_CONCURRENCY`` environment variable,
- which is set by some Platform-as-a-Service providers such as Heroku. If
- it is not defined, the default is ``1``.
- """
-
-
-class WorkerClass(Setting):
- name = "worker_class"
- section = "Worker Processes"
- cli = ["-k", "--worker-class"]
- meta = "STRING"
- validator = validate_class
- default = "sync"
- desc = """\
- The type of workers to use.
-
- The default class (``sync``) should handle most "normal" types of
- workloads. You'll want to read :doc:`design` for information on when
- you might want to choose one of the other worker classes. Required
- libraries may be installed using setuptools' ``extras_require`` feature.
-
- A string referring to one of the following bundled classes:
-
- * ``sync``
- * ``eventlet`` - Requires eventlet >= 0.24.1 (or install it via
- ``pip install gunicorn[eventlet]``)
- * ``gevent`` - Requires gevent >= 1.4 (or install it via
- ``pip install gunicorn[gevent]``)
- * ``tornado`` - Requires tornado >= 0.2 (or install it via
- ``pip install gunicorn[tornado]``)
- * ``gthread`` - Python 2 requires the futures package to be installed
- (or install it via ``pip install gunicorn[gthread]``)
-
- Optionally, you can provide your own worker by giving Gunicorn a
- Python path to a subclass of ``gunicorn.workers.base.Worker``.
- This alternative syntax will load the gevent class:
- ``gunicorn.workers.ggevent.GeventWorker``.
- """
-
-
-class WorkerThreads(Setting):
- name = "threads"
- section = "Worker Processes"
- cli = ["--threads"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 1
- desc = """\
- The number of worker threads for handling requests.
-
- Run each worker with the specified number of threads.
-
- A positive integer generally in the ``2-4 x $(NUM_CORES)`` range.
- You'll want to vary this a bit to find the best for your particular
- application's work load.
-
- If it is not defined, the default is ``1``.
-
- This setting only affects the Gthread worker type.
-
- .. note::
- If you try to use the ``sync`` worker type and set the ``threads``
- setting to more than 1, the ``gthread`` worker type will be used
- instead.
- """
-
-
-class WorkerConnections(Setting):
- name = "worker_connections"
- section = "Worker Processes"
- cli = ["--worker-connections"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 1000
- desc = """\
- The maximum number of simultaneous clients.
-
- This setting only affects the Eventlet and Gevent worker types.
- """
-
-
-class MaxRequests(Setting):
- name = "max_requests"
- section = "Worker Processes"
- cli = ["--max-requests"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 0
- desc = """\
- The maximum number of requests a worker will process before restarting.
-
- Any value greater than zero will limit the number of requests a worker
- will process before automatically restarting. This is a simple method
- to help limit the damage of memory leaks.
-
- If this is set to zero (the default) then the automatic worker
- restarts are disabled.
- """
-
-
-class MaxRequestsJitter(Setting):
- name = "max_requests_jitter"
- section = "Worker Processes"
- cli = ["--max-requests-jitter"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 0
- desc = """\
- The maximum jitter to add to the *max_requests* setting.
-
- The jitter causes the restart per worker to be randomized by
- ``randint(0, max_requests_jitter)``. This is intended to stagger worker
- restarts to avoid all workers restarting at the same time.
-
- .. versionadded:: 19.2
- """
-
-
-class Timeout(Setting):
- name = "timeout"
- section = "Worker Processes"
- cli = ["-t", "--timeout"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 30
- desc = """\
- Workers silent for more than this many seconds are killed and restarted.
-
- Value is a positive number or 0. Setting it to 0 has the effect of
- infinite timeouts by disabling timeouts for all workers entirely.
-
- Generally, the default of thirty seconds should suffice. Only set this
- noticeably higher if you're sure of the repercussions for sync workers.
-        For non-sync workers, it just means that the worker process is still
-        communicating and is not tied to the length of time required to handle
-        a single request.
- """
-
-
-class GracefulTimeout(Setting):
- name = "graceful_timeout"
- section = "Worker Processes"
- cli = ["--graceful-timeout"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 30
- desc = """\
- Timeout for graceful workers restart.
-
- After receiving a restart signal, workers have this much time to finish
- serving requests. Workers still alive after the timeout (starting from
- the receipt of the restart signal) are force killed.
- """
-
-
-class Keepalive(Setting):
- name = "keepalive"
- section = "Worker Processes"
- cli = ["--keep-alive"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 2
- desc = """\
- The number of seconds to wait for requests on a Keep-Alive connection.
-
- Generally set in the 1-5 seconds range for servers with direct connection
-        to the client (e.g. when you don't have a separate load balancer). When
- Gunicorn is deployed behind a load balancer, it often makes sense to
- set this to a higher value.
-
- .. note::
- ``sync`` worker does not support persistent connections and will
- ignore this option.
- """
-
-
-class LimitRequestLine(Setting):
- name = "limit_request_line"
- section = "Security"
- cli = ["--limit-request-line"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 4094
- desc = """\
- The maximum size of HTTP request line in bytes.
-
- This parameter is used to limit the allowed size of a client's
- HTTP request-line. Since the request-line consists of the HTTP
- method, URI, and protocol version, this directive places a
- restriction on the length of a request-URI allowed for a request
- on the server. A server needs this value to be large enough to
- hold any of its resource names, including any information that
- might be passed in the query part of a GET request. Value is a number
- from 0 (unlimited) to 8190.
-
-        This parameter can be used to help prevent DDoS attacks.
- """
-
-
-class LimitRequestFields(Setting):
- name = "limit_request_fields"
- section = "Security"
- cli = ["--limit-request-fields"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 100
- desc = """\
-        Limit the number of HTTP header fields in a request.
-
-        This parameter is used to limit the number of headers in a request to
-        help prevent DDoS attacks. Used together with *limit_request_field_size*,
-        it provides more safety. By default this value is 100 and can't be
-        larger than 32768.
- """
-
-
-class LimitRequestFieldSize(Setting):
- name = "limit_request_field_size"
- section = "Security"
- cli = ["--limit-request-field_size"]
- meta = "INT"
- validator = validate_pos_int
- type = int
- default = 8190
- desc = """\
- Limit the allowed size of an HTTP request header field.
-
- Value is a positive number or 0. Setting it to 0 will allow unlimited
- header field sizes.
-
- .. warning::
-            Setting this parameter to a very high or unlimited value can open
-            the door to DDoS attacks.
- """
-
-
-class Reload(Setting):
- name = "reload"
- section = 'Debugging'
- cli = ['--reload']
- validator = validate_bool
- action = 'store_true'
- default = False
-
- desc = '''\
- Restart workers when code changes.
-
- This setting is intended for development. It will cause workers to be
- restarted whenever application code changes.
-
- The reloader is incompatible with application preloading. When using a
- paste configuration be sure that the server block does not import any
- application code or the reload will not work as designed.
-
- The default behavior is to attempt inotify with a fallback to file
- system polling. Generally, inotify should be preferred if available
- because it consumes less system resources.
-
- .. note::
- In order to use the inotify reloader, you must have the ``inotify``
- package installed.
- '''
-
-
-class ReloadEngine(Setting):
- name = "reload_engine"
- section = "Debugging"
- cli = ["--reload-engine"]
- meta = "STRING"
- validator = validate_reload_engine
- default = "auto"
- desc = """\
- The implementation that should be used to power :ref:`reload`.
-
- Valid engines are:
-
- * ``'auto'``
- * ``'poll'``
- * ``'inotify'`` (requires inotify)
-
- .. versionadded:: 19.7
- """
-
-
-class ReloadExtraFiles(Setting):
- name = "reload_extra_files"
- action = "append"
- section = "Debugging"
- cli = ["--reload-extra-file"]
- meta = "FILES"
- validator = validate_list_of_existing_files
- default = []
- desc = """\
- Extends :ref:`reload` option to also watch and reload on additional files
- (e.g., templates, configurations, specifications, etc.).
-
- .. versionadded:: 19.8
- """
-
-
-class Spew(Setting):
- name = "spew"
- section = "Debugging"
- cli = ["--spew"]
- validator = validate_bool
- action = "store_true"
- default = False
- desc = """\
- Install a trace function that spews every line executed by the server.
-
- This is the nuclear option.
- """
-
-
-class ConfigCheck(Setting):
- name = "check_config"
- section = "Debugging"
- cli = ["--check-config"]
- validator = validate_bool
- action = "store_true"
- default = False
- desc = """\
- Check the configuration and exit. The exit status is 0 if the
- configuration is correct, and 1 if the configuration is incorrect.
- """
-
-
-class PrintConfig(Setting):
- name = "print_config"
- section = "Debugging"
- cli = ["--print-config"]
- validator = validate_bool
- action = "store_true"
- default = False
- desc = """\
- Print the configuration settings as fully resolved. Implies :ref:`check-config`.
- """
-
-
-class PreloadApp(Setting):
- name = "preload_app"
- section = "Server Mechanics"
- cli = ["--preload"]
- validator = validate_bool
- action = "store_true"
- default = False
- desc = """\
- Load application code before the worker processes are forked.
-
- By preloading an application you can save some RAM resources as well as
-        speed up server boot times. If you defer application loading to each
-        worker process, however, you can reload your application code easily
-        by restarting workers.
- """
-
-
-class Sendfile(Setting):
- name = "sendfile"
- section = "Server Mechanics"
- cli = ["--no-sendfile"]
- validator = validate_bool
- action = "store_const"
- const = False
-
- desc = """\
- Disables the use of ``sendfile()``.
-
- If not set, the value of the ``SENDFILE`` environment variable is used
- to enable or disable its usage.
-
- .. versionadded:: 19.2
- .. versionchanged:: 19.4
- Swapped ``--sendfile`` with ``--no-sendfile`` to actually allow
- disabling.
- .. versionchanged:: 19.6
- added support for the ``SENDFILE`` environment variable
- """
-
-
-class ReusePort(Setting):
- name = "reuse_port"
- section = "Server Mechanics"
- cli = ["--reuse-port"]
- validator = validate_bool
- action = "store_true"
- default = False
-
- desc = """\
- Set the ``SO_REUSEPORT`` flag on the listening socket.
-
- .. versionadded:: 19.8
- """
-
-
-class Chdir(Setting):
- name = "chdir"
- section = "Server Mechanics"
- cli = ["--chdir"]
- validator = validate_chdir
- default = util.getcwd()
- desc = """\
- Change directory to specified directory before loading apps.
- """
-
-
-class Daemon(Setting):
- name = "daemon"
- section = "Server Mechanics"
- cli = ["-D", "--daemon"]
- validator = validate_bool
- action = "store_true"
- default = False
- desc = """\
- Daemonize the Gunicorn process.
-
- Detaches the server from the controlling terminal and enters the
- background.
- """
-
-
-class Env(Setting):
- name = "raw_env"
- action = "append"
- section = "Server Mechanics"
- cli = ["-e", "--env"]
- meta = "ENV"
- validator = validate_list_string
- default = []
-
- desc = """\
- Set environment variables in the execution environment.
-
- Should be a list of strings in the ``key=value`` format.
-
- For example on the command line:
-
- .. code-block:: console
-
- $ gunicorn -b 127.0.0.1:8000 --env FOO=1 test:app
-
- Or in the configuration file:
-
- .. code-block:: python
-
- raw_env = ["FOO=1"]
- """
-
-
-class Pidfile(Setting):
- name = "pidfile"
- section = "Server Mechanics"
- cli = ["-p", "--pid"]
- meta = "FILE"
- validator = validate_string
- default = None
- desc = """\
- A filename to use for the PID file.
-
- If not set, no PID file will be written.
- """
-
-
-class WorkerTmpDir(Setting):
- name = "worker_tmp_dir"
- section = "Server Mechanics"
- cli = ["--worker-tmp-dir"]
- meta = "DIR"
- validator = validate_string
- default = None
- desc = """\
- A directory to use for the worker heartbeat temporary file.
-
- If not set, the default temporary directory will be used.
-
- .. note::
- The current heartbeat system involves calling ``os.fchmod`` on
- temporary file handlers and may block a worker for arbitrary time
- if the directory is on a disk-backed filesystem.
-
- See :ref:`blocking-os-fchmod` for more detailed information
- and a solution for avoiding this problem.
- """
-
-
-class User(Setting):
- name = "user"
- section = "Server Mechanics"
- cli = ["-u", "--user"]
- meta = "USER"
- validator = validate_user
- default = os.geteuid()
- desc = """\
- Switch worker processes to run as this user.
-
- A valid user id (as an integer) or the name of a user that can be
- retrieved with a call to ``pwd.getpwnam(value)`` or ``None`` to not
- change the worker process user.
- """
-
-
-class Group(Setting):
- name = "group"
- section = "Server Mechanics"
- cli = ["-g", "--group"]
- meta = "GROUP"
- validator = validate_group
- default = os.getegid()
- desc = """\
- Switch worker process to run as this group.
-
-        A valid group id (as an integer) or the name of a group that can be
-        retrieved with a call to ``grp.getgrnam(value)`` or ``None`` to not
-        change the worker processes group.
- """
-
-
-class Umask(Setting):
- name = "umask"
- section = "Server Mechanics"
- cli = ["-m", "--umask"]
- meta = "INT"
- validator = validate_pos_int
- type = auto_int
- default = 0
- desc = """\
- A bit mask for the file mode on files written by Gunicorn.
-
- Note that this affects unix socket permissions.
-
- A valid value for the ``os.umask(mode)`` call or a string compatible
- with ``int(value, 0)`` (``0`` means Python guesses the base, so values
- like ``0``, ``0xFF``, ``0022`` are valid for decimal, hex, and octal
- representations)
- """
-
-
-class Initgroups(Setting):
- name = "initgroups"
- section = "Server Mechanics"
- cli = ["--initgroups"]
- validator = validate_bool
- action = 'store_true'
- default = False
-
- desc = """\
- If true, set the worker process's group access list with all of the
- groups of which the specified username is a member, plus the specified
- group id.
-
- .. versionadded:: 19.7
- """
-
-
-class TmpUploadDir(Setting):
- name = "tmp_upload_dir"
- section = "Server Mechanics"
- meta = "DIR"
- validator = validate_string
- default = None
- desc = """\
- Directory to store temporary request data as they are read.
-
- This may disappear in the near future.
-
- This path should be writable by the process permissions set for Gunicorn
- workers. If not specified, Gunicorn will choose a system generated
- temporary directory.
- """
-
-
-class SecureSchemeHeader(Setting):
- name = "secure_scheme_headers"
- section = "Server Mechanics"
- validator = validate_dict
- default = {
- "X-FORWARDED-PROTOCOL": "ssl",
- "X-FORWARDED-PROTO": "https",
- "X-FORWARDED-SSL": "on"
- }
- desc = """\
-
- A dictionary containing headers and values that the front-end proxy
- uses to indicate HTTPS requests. If the source IP is permitted by
- ``forwarded-allow-ips`` (below), *and* at least one request header matches
- a key-value pair listed in this dictionary, then Gunicorn will set
- ``wsgi.url_scheme`` to ``https``, so your application can tell that the
- request is secure.
-
-        Headers listed in this dictionary that are not present in the request
-        are ignored, but if a listed header is present and does not match the
-        provided value, the request will fail to parse. See the note below for
-        more detailed examples of this behaviour.
-
- The dictionary should map upper-case header names to exact string
- values. The value comparisons are case-sensitive, unlike the header
- names, so make sure they're exactly what your front-end proxy sends
- when handling HTTPS requests.
-
- It is important that your front-end proxy configuration ensures that
- the headers defined here can not be passed directly from the client.
- """
-
-
-class ForwardedAllowIPS(Setting):
- name = "forwarded_allow_ips"
- section = "Server Mechanics"
- cli = ["--forwarded-allow-ips"]
- meta = "STRING"
- validator = validate_string_to_list
- default = os.environ.get("FORWARDED_ALLOW_IPS", "127.0.0.1")
- desc = """\
-        Front-end's IPs from which to allow handling of secure headers
-        (comma separated).
-
- Set to ``*`` to disable checking of Front-end IPs (useful for setups
- where you don't know in advance the IP address of Front-end, but
- you still trust the environment).
-
- By default, the value of the ``FORWARDED_ALLOW_IPS`` environment
- variable. If it is not defined, the default is ``"127.0.0.1"``.
-
- .. note::
-
- The interplay between the request headers, the value of ``forwarded_allow_ips``, and the value of
- ``secure_scheme_headers`` is complex. Various scenarios are documented below to further elaborate. In each case, we
- have a request from the remote address 134.213.44.18, and the default value of ``secure_scheme_headers``:
-
- .. code::
-
- secure_scheme_headers = {
- 'X-FORWARDED-PROTOCOL': 'ssl',
- 'X-FORWARDED-PROTO': 'https',
- 'X-FORWARDED-SSL': 'on'
- }
-
-
- .. list-table::
- :header-rows: 1
- :align: center
- :widths: auto
-
- * - ``forwarded-allow-ips``
- - Secure Request Headers
- - Result
- - Explanation
- * - .. code::
-
- ["127.0.0.1"]
- - .. code::
-
- X-Forwarded-Proto: https
- - .. code::
-
- wsgi.url_scheme = "http"
- - IP address was not allowed
- * - .. code::
-
- "*"
- -
- - .. code::
-
- wsgi.url_scheme = "http"
- - IP address allowed, but no secure headers provided
- * - .. code::
-
- "*"
- - .. code::
-
- X-Forwarded-Proto: https
- - .. code::
-
- wsgi.url_scheme = "https"
- - IP address allowed, one request header matched
- * - .. code::
-
- ["134.213.44.18"]
- - .. code::
-
- X-Forwarded-Ssl: on
- X-Forwarded-Proto: http
- - ``InvalidSchemeHeaders()`` raised
- - IP address allowed, but the two secure headers disagreed on if HTTPS was used
-
-
- """
-
-
-class AccessLog(Setting):
- name = "accesslog"
- section = "Logging"
- cli = ["--access-logfile"]
- meta = "FILE"
- validator = validate_string
- default = None
- desc = """\
- The Access log file to write to.
-
- ``'-'`` means log to stdout.
- """
-
-
-class DisableRedirectAccessToSyslog(Setting):
- name = "disable_redirect_access_to_syslog"
- section = "Logging"
- cli = ["--disable-redirect-access-to-syslog"]
- validator = validate_bool
- action = 'store_true'
- default = False
- desc = """\
- Disable redirect access logs to syslog.
-
- .. versionadded:: 19.8
- """
-
-
-class AccessLogFormat(Setting):
- name = "access_log_format"
- section = "Logging"
- cli = ["--access-logformat"]
- meta = "STRING"
- validator = validate_string
- default = '%(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"'
- desc = """\
- The access log format.
-
- =========== ===========
- Identifier Description
- =========== ===========
- h remote address
- l ``'-'``
- u user name
- t date of the request
- r status line (e.g. ``GET / HTTP/1.1``)
- m request method
- U URL path without query string
- q query string
- H protocol
- s status
- B response length
- b response length or ``'-'`` (CLF format)
- f referer
- a user agent
- T request time in seconds
- M request time in milliseconds
- D request time in microseconds
- L request time in decimal seconds
- p process ID
- {header}i request header
- {header}o response header
- {variable}e environment variable
- =========== ===========
-
- Use lowercase for header and environment variable names, and put
- ``{...}x`` names inside ``%(...)s``. For example::
-
- %({x-forwarded-for}i)s
- """
-
-
-class ErrorLog(Setting):
- name = "errorlog"
- section = "Logging"
- cli = ["--error-logfile", "--log-file"]
- meta = "FILE"
- validator = validate_string
- default = '-'
- desc = """\
- The Error log file to write to.
-
- Using ``'-'`` for FILE makes gunicorn log to stderr.
-
- .. versionchanged:: 19.2
- Log to stderr by default.
-
- """
-
-
-class Loglevel(Setting):
- name = "loglevel"
- section = "Logging"
- cli = ["--log-level"]
- meta = "LEVEL"
- validator = validate_string
- default = "info"
- desc = """\
- The granularity of Error log outputs.
-
- Valid level names are:
-
- * ``'debug'``
- * ``'info'``
- * ``'warning'``
- * ``'error'``
- * ``'critical'``
- """
-
-
-class CaptureOutput(Setting):
- name = "capture_output"
- section = "Logging"
- cli = ["--capture-output"]
- validator = validate_bool
- action = 'store_true'
- default = False
- desc = """\
- Redirect stdout/stderr to specified file in :ref:`errorlog`.
-
- .. versionadded:: 19.6
- """
-
-
-class LoggerClass(Setting):
- name = "logger_class"
- section = "Logging"
- cli = ["--logger-class"]
- meta = "STRING"
- validator = validate_class
- default = "gunicorn.glogging.Logger"
- desc = """\
- The logger you want to use to log events in Gunicorn.
-
- The default class (``gunicorn.glogging.Logger``) handles most
- normal usages in logging. It provides error and access logging.
-
- You can provide your own logger by giving Gunicorn a Python path to a
- class that quacks like ``gunicorn.glogging.Logger``.
- """
-
-
-class LogConfig(Setting):
- name = "logconfig"
- section = "Logging"
- cli = ["--log-config"]
- meta = "FILE"
- validator = validate_string
- default = None
- desc = """\
- The log config file to use.
- Gunicorn uses the standard Python logging module's Configuration
- file format.
- """
-
-
-class LogConfigDict(Setting):
- name = "logconfig_dict"
- section = "Logging"
- validator = validate_dict
- default = {}
- desc = """\
- The log config dictionary to use, using the standard Python
- logging module's dictionary configuration format. This option
- takes precedence over the :ref:`logconfig` option, which uses the
- older file configuration format.
-
- Format: https://docs.python.org/3/library/logging.config.html#logging.config.dictConfig
-
- .. versionadded:: 19.8
- """
-
-
-class SyslogTo(Setting):
- name = "syslog_addr"
- section = "Logging"
- cli = ["--log-syslog-to"]
- meta = "SYSLOG_ADDR"
- validator = validate_string
-
- if PLATFORM == "darwin":
- default = "unix:///var/run/syslog"
- elif PLATFORM in ('freebsd', 'dragonfly', ):
- default = "unix:///var/run/log"
- elif PLATFORM == "openbsd":
- default = "unix:///dev/log"
- else:
- default = "udp://localhost:514"
-
- desc = """\
- Address to send syslog messages.
-
- Address is a string of the form:
-
- * ``unix://PATH#TYPE`` : for unix domain socket. ``TYPE`` can be ``stream``
- for the stream driver or ``dgram`` for the dgram driver.
- ``stream`` is the default.
- * ``udp://HOST:PORT`` : for UDP sockets
- * ``tcp://HOST:PORT`` : for TCP sockets
-
- """
-
-
-class Syslog(Setting):
- name = "syslog"
- section = "Logging"
- cli = ["--log-syslog"]
- validator = validate_bool
- action = 'store_true'
- default = False
- desc = """\
- Send *Gunicorn* logs to syslog.
-
- .. versionchanged:: 19.8
- You can now disable sending access logs by using the
- :ref:`disable-redirect-access-to-syslog` setting.
- """
-
-
-class SyslogPrefix(Setting):
- name = "syslog_prefix"
- section = "Logging"
- cli = ["--log-syslog-prefix"]
- meta = "SYSLOG_PREFIX"
- validator = validate_string
- default = None
- desc = """\
- Makes Gunicorn use the parameter as the program name in syslog entries.
-
- All entries will be prefixed by ``gunicorn.``. By default the
- program name is the name of the process.
- """
-
-
-class SyslogFacility(Setting):
- name = "syslog_facility"
- section = "Logging"
- cli = ["--log-syslog-facility"]
- meta = "SYSLOG_FACILITY"
- validator = validate_string
- default = "user"
- desc = """\
- Syslog facility name
- """
-
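Taken together, the syslog settings above compose like this in a config file; a hedged sketch (the address and facility are examples, and platform defaults differ as shown in ``SyslogTo``):

```python
# gunicorn.conf.py -- illustrative syslog configuration
syslog = True                        # send Gunicorn logs to syslog
syslog_addr = "udp://localhost:514"  # also unix://PATH#TYPE or tcp://HOST:PORT
syslog_prefix = "myapp"              # entries tagged gunicorn.myapp.error / .access
syslog_facility = "local1"           # any facility name syslog knows about
```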
-
-class EnableStdioInheritance(Setting):
- name = "enable_stdio_inheritance"
- section = "Logging"
- cli = ["-R", "--enable-stdio-inheritance"]
- validator = validate_bool
- default = False
- action = "store_true"
- desc = """\
- Enable stdio inheritance.
-
- Enable inheritance for stdio file descriptors in daemon mode.
-
- Note: To disable Python's stdout buffering, you can set the
- ``PYTHONUNBUFFERED`` environment variable.
- """
-
-
-# statsD monitoring
-class StatsdHost(Setting):
- name = "statsd_host"
- section = "Logging"
- cli = ["--statsd-host"]
- meta = "STATSD_ADDR"
- default = None
- validator = validate_hostport
- desc = """\
- ``host:port`` of the statsd server to log to.
-
- .. versionadded:: 19.1
- """
-
-# Datadog Statsd (dogstatsd) tags. https://docs.datadoghq.com/developers/dogstatsd/
-class DogstatsdTags(Setting):
- name = "dogstatsd_tags"
- section = "Logging"
- cli = ["--dogstatsd-tags"]
- meta = "DOGSTATSD_TAGS"
- default = ""
- validator = validate_string
- desc = """\
- A comma-delimited list of datadog statsd (dogstatsd) tags to append to
- statsd metrics.
-
- .. versionadded:: 20
- """
-
-class StatsdPrefix(Setting):
- name = "statsd_prefix"
- section = "Logging"
- cli = ["--statsd-prefix"]
- meta = "STATSD_PREFIX"
- default = ""
- validator = validate_string
- desc = """\
- Prefix to use when emitting statsd metrics (a trailing ``.`` is added,
- if not provided).
-
- .. versionadded:: 19.2
- """
-
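As an illustration, the three statsd settings above are usually configured together; a sketch with placeholder values:

```python
# gunicorn.conf.py -- illustrative statsd configuration
statsd_host = "localhost:8125"         # host:port of the statsd daemon
statsd_prefix = "myapp"                # a trailing "." is appended if missing
dogstatsd_tags = "env:prod,team:web"   # comma-delimited Datadog tags
```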
-
-class Procname(Setting):
- name = "proc_name"
- section = "Process Naming"
- cli = ["-n", "--name"]
- meta = "STRING"
- validator = validate_string
- default = None
- desc = """\
- A base to use with setproctitle for process naming.
-
- This affects things like ``ps`` and ``top``. If you're going to be
- running more than one instance of Gunicorn you'll probably want to set a
- name to tell them apart. This requires that you install the setproctitle
- module.
-
- If not set, the *default_proc_name* setting will be used.
- """
-
-
-class DefaultProcName(Setting):
- name = "default_proc_name"
- section = "Process Naming"
- validator = validate_string
- default = "gunicorn"
- desc = """\
- Internal setting that is adjusted for each type of application.
- """
-
-
-class PythonPath(Setting):
- name = "pythonpath"
- section = "Server Mechanics"
- cli = ["--pythonpath"]
- meta = "STRING"
- validator = validate_string
- default = None
- desc = """\
- A comma-separated list of directories to add to the Python path.
-
- e.g.
- ``'/home/djangoprojects/myproject,/home/python/mylibrary'``.
- """
-
-
-class Paste(Setting):
- name = "paste"
- section = "Server Mechanics"
- cli = ["--paste", "--paster"]
- meta = "STRING"
- validator = validate_string
- default = None
- desc = """\
- Load a PasteDeploy config file. The argument may contain a ``#``
- symbol followed by the name of an app section from the config file,
- e.g. ``production.ini#admin``.
-
- At this time, using alternate server blocks is not supported. Use the
- command line arguments to control server configuration instead.
- """
-
-
-class OnStarting(Setting):
- name = "on_starting"
- section = "Server Hooks"
- validator = validate_callable(1)
- type = callable
-
- def on_starting(server):
- pass
- default = staticmethod(on_starting)
- desc = """\
- Called just before the master process is initialized.
-
- The callable needs to accept a single instance variable for the Arbiter.
- """
-
-
-class OnReload(Setting):
- name = "on_reload"
- section = "Server Hooks"
- validator = validate_callable(1)
- type = callable
-
- def on_reload(server):
- pass
- default = staticmethod(on_reload)
- desc = """\
- Called to recycle workers during a reload via SIGHUP.
-
- The callable needs to accept a single instance variable for the Arbiter.
- """
-
-
-class WhenReady(Setting):
- name = "when_ready"
- section = "Server Hooks"
- validator = validate_callable(1)
- type = callable
-
- def when_ready(server):
- pass
- default = staticmethod(when_ready)
- desc = """\
- Called just after the server is started.
-
- The callable needs to accept a single instance variable for the Arbiter.
- """
-
-
-class Prefork(Setting):
- name = "pre_fork"
- section = "Server Hooks"
- validator = validate_callable(2)
- type = callable
-
- def pre_fork(server, worker):
- pass
- default = staticmethod(pre_fork)
- desc = """\
- Called just before a worker is forked.
-
- The callable needs to accept two instance variables for the Arbiter and
- new Worker.
- """
-
-
-class Postfork(Setting):
- name = "post_fork"
- section = "Server Hooks"
- validator = validate_callable(2)
- type = callable
-
- def post_fork(server, worker):
- pass
- default = staticmethod(post_fork)
- desc = """\
- Called just after a worker has been forked.
-
- The callable needs to accept two instance variables for the Arbiter and
- new Worker.
- """
-
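The server hooks in this section are plain module-level callables in the config file. A minimal sketch matching the signatures shown above (the logged attributes such as ``worker.pid`` follow Gunicorn's own example config, but the messages are illustrative):

```python
# gunicorn.conf.py -- illustrative server hooks
def on_starting(server):
    # `server` is the Arbiter (master process)
    server.log.info("master is starting")

def when_ready(server):
    server.log.info("server is ready")

def pre_fork(server, worker):
    server.log.info("forking a new worker")

def post_fork(server, worker):
    server.log.info("worker spawned (pid: %s)", worker.pid)
```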
-
-class PostWorkerInit(Setting):
- name = "post_worker_init"
- section = "Server Hooks"
- validator = validate_callable(1)
- type = callable
-
- def post_worker_init(worker):
- pass
-
- default = staticmethod(post_worker_init)
- desc = """\
- Called just after a worker has initialized the application.
-
- The callable needs to accept one instance variable for the initialized
- Worker.
- """
-
-
-class WorkerInt(Setting):
- name = "worker_int"
- section = "Server Hooks"
- validator = validate_callable(1)
- type = callable
-
- def worker_int(worker):
- pass
-
- default = staticmethod(worker_int)
- desc = """\
- Called just after a worker exited on SIGINT or SIGQUIT.
-
- The callable needs to accept one instance variable for the initialized
- Worker.
- """
-
-
-class WorkerAbort(Setting):
- name = "worker_abort"
- section = "Server Hooks"
- validator = validate_callable(1)
- type = callable
-
- def worker_abort(worker):
- pass
-
- default = staticmethod(worker_abort)
- desc = """\
- Called when a worker received the SIGABRT signal.
-
- This call generally happens on timeout.
-
- The callable needs to accept one instance variable for the initialized
- Worker.
- """
-
-
-class PreExec(Setting):
- name = "pre_exec"
- section = "Server Hooks"
- validator = validate_callable(1)
- type = callable
-
- def pre_exec(server):
- pass
- default = staticmethod(pre_exec)
- desc = """\
- Called just before a new master process is forked.
-
- The callable needs to accept a single instance variable for the Arbiter.
- """
-
-
-class PreRequest(Setting):
- name = "pre_request"
- section = "Server Hooks"
- validator = validate_callable(2)
- type = callable
-
- def pre_request(worker, req):
- worker.log.debug("%s %s" % (req.method, req.path))
- default = staticmethod(pre_request)
- desc = """\
- Called just before a worker processes the request.
-
- The callable needs to accept two instance variables for the Worker and
- the Request.
- """
-
-
-class PostRequest(Setting):
- name = "post_request"
- section = "Server Hooks"
- validator = validate_post_request
- type = callable
-
- def post_request(worker, req, environ, resp):
- pass
- default = staticmethod(post_request)
- desc = """\
- Called after a worker processes the request.
-
- The callable needs to accept four arguments, as in the default above:
- the Worker, the Request, the WSGI environ and the Response.
- """
-
-
-class ChildExit(Setting):
- name = "child_exit"
- section = "Server Hooks"
- validator = validate_callable(2)
- type = callable
-
- def child_exit(server, worker):
- pass
- default = staticmethod(child_exit)
- desc = """\
- Called just after a worker has exited, in the master process.
-
- The callable needs to accept two instance variables for the Arbiter and
- the just-exited Worker.
-
- .. versionadded:: 19.7
- """
-
-
-class WorkerExit(Setting):
- name = "worker_exit"
- section = "Server Hooks"
- validator = validate_callable(2)
- type = callable
-
- def worker_exit(server, worker):
- pass
- default = staticmethod(worker_exit)
- desc = """\
- Called just after a worker has exited, in the worker process.
-
- The callable needs to accept two instance variables for the Arbiter and
- the just-exited Worker.
- """
-
-
-class NumWorkersChanged(Setting):
- name = "nworkers_changed"
- section = "Server Hooks"
- validator = validate_callable(3)
- type = callable
-
- def nworkers_changed(server, new_value, old_value):
- pass
- default = staticmethod(nworkers_changed)
- desc = """\
- Called just after *num_workers* has been changed.
-
- The callable needs to accept an instance variable of the Arbiter and
- two integers: the number of workers after and before the change.
-
- If the number of workers is set for the first time, *old_value* would
- be ``None``.
- """
-
-
-class OnExit(Setting):
- name = "on_exit"
- section = "Server Hooks"
- validator = validate_callable(1)
-
- def on_exit(server):
- pass
-
- default = staticmethod(on_exit)
- desc = """\
- Called just before exiting Gunicorn.
-
- The callable needs to accept a single instance variable for the Arbiter.
- """
-
-
-class ProxyProtocol(Setting):
- name = "proxy_protocol"
- section = "Server Mechanics"
- cli = ["--proxy-protocol"]
- validator = validate_bool
- default = False
- action = "store_true"
- desc = """\
- Enable detection of the PROXY protocol (PROXY mode).
-
- This allows using HTTP and the PROXY protocol together. It can be
- useful when running stunnel as an HTTPS frontend with Gunicorn as
- the HTTP server.
-
- PROXY protocol: http://haproxy.1wt.eu/download/1.5/doc/proxy-protocol.txt
-
- Example for stunnel config::
-
- [https]
- protocol = proxy
- accept = 443
- connect = 80
- cert = /etc/ssl/certs/stunnel.pem
- key = /etc/ssl/certs/stunnel.key
- """
-
-
-class ProxyAllowFrom(Setting):
- name = "proxy_allow_ips"
- section = "Server Mechanics"
- cli = ["--proxy-allow-from"]
- validator = validate_string_to_list
- default = "127.0.0.1"
- desc = """\
- Front-end IPs from which proxy requests are accepted (comma separated).
-
- Set to ``*`` to disable checking of front-end IPs. This is useful
- for setups where you don't know the front-end IP address in advance,
- but still trust the environment.
- """
-
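The two settings above are normally enabled as a pair; for example (addresses are placeholders):

```python
# gunicorn.conf.py -- accept PROXY protocol lines only from known front ends
proxy_protocol = True
proxy_allow_ips = "127.0.0.1,10.0.0.5"  # parsed into a list; "*" disables the check
```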
-
-class KeyFile(Setting):
- name = "keyfile"
- section = "SSL"
- cli = ["--keyfile"]
- meta = "FILE"
- validator = validate_string
- default = None
- desc = """\
- SSL key file
- """
-
-
-class CertFile(Setting):
- name = "certfile"
- section = "SSL"
- cli = ["--certfile"]
- meta = "FILE"
- validator = validate_string
- default = None
- desc = """\
- SSL certificate file
- """
-
-
-class SSLVersion(Setting):
- name = "ssl_version"
- section = "SSL"
- cli = ["--ssl-version"]
- validator = validate_ssl_version
-
- if hasattr(ssl, "PROTOCOL_TLS"):
- default = ssl.PROTOCOL_TLS
- else:
- default = ssl.PROTOCOL_SSLv23
-
- desc = """\
- SSL version to use (see the stdlib ``ssl`` module).
-
- ============= ============
- --ssl-version Description
- ============= ============
- SSLv3 SSLv3 is not-secure and is strongly discouraged.
- SSLv23 Alias for TLS. Deprecated in Python 3.6, use TLS.
- TLS Negotiate highest possible version between client/server.
- Can yield SSL. (Python 3.6+)
- TLSv1 TLS 1.0
- TLSv1_1 TLS 1.1 (Python 3.4+)
- TLSv1_2 TLS 1.2 (Python 3.4+)
- TLS_SERVER Auto-negotiate the highest protocol version like TLS,
- but only support server-side SSLSocket connections.
- (Python 3.6+)
- ============= ============
-
- .. versionchanged:: 19.7
- The default value has been changed from ``ssl.PROTOCOL_TLSv1`` to
- ``ssl.PROTOCOL_SSLv23``.
- .. versionchanged:: 20.0
- This setting now accepts string names based on ``ssl.PROTOCOL_``
- constants.
- .. versionchanged:: 20.0.1
- The default value has been changed from ``ssl.PROTOCOL_SSLv23`` to
- ``ssl.PROTOCOL_TLS`` when Python >= 3.6.
- """
-
-
-class CertReqs(Setting):
- name = "cert_reqs"
- section = "SSL"
- cli = ["--cert-reqs"]
- validator = validate_pos_int
- default = ssl.CERT_NONE
- desc = """\
- Whether a client certificate is required (see the stdlib ``ssl`` module).
- """
-
-
-class CACerts(Setting):
- name = "ca_certs"
- section = "SSL"
- cli = ["--ca-certs"]
- meta = "FILE"
- validator = validate_string
- default = None
- desc = """\
- CA certificates file
- """
-
-
-class SuppressRaggedEOFs(Setting):
- name = "suppress_ragged_eofs"
- section = "SSL"
- cli = ["--suppress-ragged-eofs"]
- action = "store_true"
- default = True
- validator = validate_bool
- desc = """\
- Suppress ragged EOFs (see the stdlib ``ssl`` module).
- """
-
-
-class DoHandshakeOnConnect(Setting):
- name = "do_handshake_on_connect"
- section = "SSL"
- cli = ["--do-handshake-on-connect"]
- validator = validate_bool
- action = "store_true"
- default = False
- desc = """\
- Whether to perform the SSL handshake on socket connect (see the stdlib ``ssl`` module).
- """
-
-
-class Ciphers(Setting):
- name = "ciphers"
- section = "SSL"
- cli = ["--ciphers"]
- validator = validate_string
- default = None
- desc = """\
- SSL Cipher suite to use, in the format of an OpenSSL cipher list.
-
- By default we use the default cipher list from Python's ``ssl`` module,
- which contains ciphers considered strong at the time of each Python
- release.
-
- As a recommended alternative, the Open Web App Security Project (OWASP)
- offers `a vetted set of strong cipher strings rated A+ to C-
- <https://www.owasp.org/index.php/TLS_Cipher_String_Cheat_Sheet>`_.
- OWASP provides details on user-agent compatibility at each security level.
-
- See the `OpenSSL Cipher List Format Documentation
- <https://www.openssl.org/docs/manmaster/man1/ciphers.html#CIPHER-LIST-FORMAT>`_
- for details on the format of an OpenSSL cipher list.
- """
-
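The SSL settings in this section combine as follows; a hedged sketch with placeholder paths:

```python
# gunicorn.conf.py -- illustrative TLS configuration (paths are placeholders)
import ssl

certfile = "/etc/ssl/certs/example.pem"
keyfile = "/etc/ssl/private/example.key"
ca_certs = "/etc/ssl/certs/ca-bundle.crt"
cert_reqs = ssl.CERT_REQUIRED        # require and verify a client certificate
ciphers = "ECDHE+AESGCM"             # OpenSSL cipher-list syntax
```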
-
-class PasteGlobalConf(Setting):
- name = "raw_paste_global_conf"
- action = "append"
- section = "Server Mechanics"
- cli = ["--paste-global"]
- meta = "CONF"
- validator = validate_list_string
- default = []
-
- desc = """\
- Set a PasteDeploy global config variable in ``key=value`` form.
-
- The option can be specified multiple times.
-
- The variables are passed to the PasteDeploy entrypoint. Example::
-
- $ gunicorn -b 127.0.0.1:8000 --paste development.ini --paste-global FOO=1 --paste-global BAR=2
-
- .. versionadded:: 19.7
- """
-
-
-class StripHeaderSpaces(Setting):
- name = "strip_header_spaces"
- section = "Server Mechanics"
- cli = ["--strip-header-spaces"]
- validator = validate_bool
- action = "store_true"
- default = False
- desc = """\
- Strip spaces present between the header name and the ``:``.
-
- This is known to induce vulnerabilities and is not compliant with the HTTP/1.1 standard.
- See https://portswigger.net/research/http-desync-attacks-request-smuggling-reborn.
-
- Use with care and only if necessary.
- """
diff --git a/env/lib/python3.9/site-packages/gunicorn/debug.py b/env/lib/python3.9/site-packages/gunicorn/debug.py
deleted file mode 100644
index 996fe1b..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/debug.py
+++ /dev/null
@@ -1,69 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-"""The debug module contains utilities and functions for better
-debugging Gunicorn."""
-
-import sys
-import linecache
-import re
-import inspect
-
-__all__ = ['spew', 'unspew']
-
-_token_splitter = re.compile(r'\W+')
-
-
-class Spew(object):
-
- def __init__(self, trace_names=None, show_values=True):
- self.trace_names = trace_names
- self.show_values = show_values
-
- def __call__(self, frame, event, arg):
- if event == 'line':
- lineno = frame.f_lineno
- if '__file__' in frame.f_globals:
- filename = frame.f_globals['__file__']
- if (filename.endswith('.pyc') or
- filename.endswith('.pyo')):
- filename = filename[:-1]
- name = frame.f_globals['__name__']
- line = linecache.getline(filename, lineno)
- else:
- name = '[unknown]'
- try:
- # getsourcelines() returns (lines, first_lineno); for module-level
- # frames first_lineno is 0 while line numbers are 1-based
- lines, first = inspect.getsourcelines(frame)
- line = lines[lineno - (first or 1)]
- except (IOError, IndexError):
- line = 'Unknown code named [%s]. VM instruction #%d' % (
- frame.f_code.co_name, frame.f_lasti)
- if self.trace_names is None or name in self.trace_names:
- print('%s:%s: %s' % (name, lineno, line.rstrip()))
- if not self.show_values:
- return self
- details = []
- tokens = _token_splitter.split(line)
- for tok in tokens:
- if tok in frame.f_globals:
- details.append('%s=%r' % (tok, frame.f_globals[tok]))
- if tok in frame.f_locals:
- details.append('%s=%r' % (tok, frame.f_locals[tok]))
- if details:
- print("\t%s" % ' '.join(details))
- return self
-
-
-def spew(trace_names=None, show_values=False):
- """Install a trace hook which writes incredibly detailed logs
- about what code is being executed to stdout.
- """
- sys.settrace(Spew(trace_names, show_values))
-
-
-def unspew():
- """Remove the trace hook installed by spew.
- """
- sys.settrace(None)
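For reference, assuming the module above is importable as ``gunicorn.debug``, usage is a matched ``spew()``/``unspew()`` pair (a sketch, not part of the diff):

```python
from gunicorn.debug import spew, unspew

def busy():
    total = 0
    for i in range(3):
        total += i
    return total

spew(show_values=True)  # install the trace hook; each executed line is printed
busy()
unspew()                # remove the hook again
```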
diff --git a/env/lib/python3.9/site-packages/gunicorn/errors.py b/env/lib/python3.9/site-packages/gunicorn/errors.py
deleted file mode 100644
index 727d336..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/errors.py
+++ /dev/null
@@ -1,29 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-# We don't need to call super() in __init__ methods of our
-# BaseException and Exception classes because we also define
-# our own __str__ methods so there is no need to pass 'message'
-# to the base class to get a meaningful output from 'str(exc)'.
-# pylint: disable=super-init-not-called
-
-
-# we inherit from BaseException here to make sure to not be caught
-# at application level
-class HaltServer(BaseException):
- def __init__(self, reason, exit_status=1):
- self.reason = reason
- self.exit_status = exit_status
-
- def __str__(self):
- return "<HaltServer %r %d>" % (self.reason, self.exit_status)
-
-
-class ConfigError(Exception):
- """ Exception raised on config error """
-
-
-class AppImportError(Exception):
- """ Exception raised when loading an application """
diff --git a/env/lib/python3.9/site-packages/gunicorn/glogging.py b/env/lib/python3.9/site-packages/gunicorn/glogging.py
deleted file mode 100644
index 08bc121..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/glogging.py
+++ /dev/null
@@ -1,464 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import base64
-import binascii
-import time
-import logging
-logging.Logger.manager.emittedNoHandlerWarning = 1 # noqa
-import logging.handlers # SysLogHandler is referenced via logging.handlers below
-from logging.config import dictConfig
-from logging.config import fileConfig
-import os
-import socket
-import sys
-import threading
-import traceback
-
-from gunicorn import util
-
-
-# syslog facility codes
-SYSLOG_FACILITIES = {
- "auth": 4,
- "authpriv": 10,
- "cron": 9,
- "daemon": 3,
- "ftp": 11,
- "kern": 0,
- "lpr": 6,
- "mail": 2,
- "news": 7,
- "security": 4, # DEPRECATED
- "syslog": 5,
- "user": 1,
- "uucp": 8,
- "local0": 16,
- "local1": 17,
- "local2": 18,
- "local3": 19,
- "local4": 20,
- "local5": 21,
- "local6": 22,
- "local7": 23
-}
-
-
-CONFIG_DEFAULTS = dict(
- version=1,
- disable_existing_loggers=False,
-
- root={"level": "INFO", "handlers": ["console"]},
- loggers={
- "gunicorn.error": {
- "level": "INFO",
- "handlers": ["error_console"],
- "propagate": True,
- "qualname": "gunicorn.error"
- },
-
- "gunicorn.access": {
- "level": "INFO",
- "handlers": ["console"],
- "propagate": True,
- "qualname": "gunicorn.access"
- }
- },
- handlers={
- "console": {
- "class": "logging.StreamHandler",
- "formatter": "generic",
- "stream": "ext://sys.stdout"
- },
- "error_console": {
- "class": "logging.StreamHandler",
- "formatter": "generic",
- "stream": "ext://sys.stderr"
- },
- },
- formatters={
- "generic": {
- "format": "%(asctime)s [%(process)d] [%(levelname)s] %(message)s",
- "datefmt": "[%Y-%m-%d %H:%M:%S %z]",
- "class": "logging.Formatter"
- }
- }
-)
-
-
-def loggers():
- """ get list of all loggers """
- root = logging.root
- existing = root.manager.loggerDict.keys()
- return [logging.getLogger(name) for name in existing]
-
-
-class SafeAtoms(dict):
-
- def __init__(self, atoms):
- dict.__init__(self)
- for key, value in atoms.items():
- if isinstance(value, str):
- self[key] = value.replace('"', '\\"')
- else:
- self[key] = value
-
- def __getitem__(self, k):
- if k.startswith("{"):
- kl = k.lower()
- if kl in self:
- return super().__getitem__(kl)
- else:
- return "-"
- if k in self:
- return super().__getitem__(k)
- else:
- return '-'
-
-
-def parse_syslog_address(addr):
-
- # unix domain socket type depends on backend
- # SysLogHandler will try both when given None
- if addr.startswith("unix://"):
- sock_type = None
-
- # set socket type only if explicitly requested
- parts = addr.split("#", 1)
- if len(parts) == 2:
- addr = parts[0]
- if parts[1] == "dgram":
- sock_type = socket.SOCK_DGRAM
-
- return (sock_type, addr.split("unix://")[1])
-
- if addr.startswith("udp://"):
- addr = addr.split("udp://")[1]
- socktype = socket.SOCK_DGRAM
- elif addr.startswith("tcp://"):
- addr = addr.split("tcp://")[1]
- socktype = socket.SOCK_STREAM
- else:
- raise RuntimeError("invalid syslog address")
-
- if '[' in addr and ']' in addr:
- host = addr.split(']')[0][1:].lower()
- elif ':' in addr:
- host = addr.split(':')[0].lower()
- elif addr == "":
- host = "localhost"
- else:
- host = addr.lower()
-
- addr = addr.split(']')[-1]
- if ":" in addr:
- port = addr.split(':', 1)[1]
- if not port.isdigit():
- raise RuntimeError("%r is not a valid port number." % port)
- port = int(port)
- else:
- port = 514
-
- return (socktype, (host, port))
-
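To make the address grammar concrete, this is what ``parse_syslog_address`` returns for each supported form (a sketch assuming gunicorn is importable):

```python
import socket
from gunicorn.glogging import parse_syslog_address

assert parse_syslog_address("udp://localhost:514") == \
    (socket.SOCK_DGRAM, ("localhost", 514))
assert parse_syslog_address("tcp://10.0.0.1:601") == \
    (socket.SOCK_STREAM, ("10.0.0.1", 601))
# unix sockets: the socket type stays None unless "#dgram" is appended
assert parse_syslog_address("unix:///dev/log#dgram") == \
    (socket.SOCK_DGRAM, "/dev/log")
```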
-
-class Logger(object):
-
- LOG_LEVELS = {
- "critical": logging.CRITICAL,
- "error": logging.ERROR,
- "warning": logging.WARNING,
- "info": logging.INFO,
- "debug": logging.DEBUG
- }
- loglevel = logging.INFO
-
- error_fmt = r"%(asctime)s [%(process)d] [%(levelname)s] %(message)s"
- datefmt = r"[%Y-%m-%d %H:%M:%S %z]"
-
- access_fmt = "%(message)s"
- syslog_fmt = "[%(process)d] %(message)s"
-
- atoms_wrapper_class = SafeAtoms
-
- def __init__(self, cfg):
- self.error_log = logging.getLogger("gunicorn.error")
- self.error_log.propagate = False
- self.access_log = logging.getLogger("gunicorn.access")
- self.access_log.propagate = False
- self.error_handlers = []
- self.access_handlers = []
- self.logfile = None
- self.lock = threading.Lock()
- self.cfg = cfg
- self.setup(cfg)
-
- def setup(self, cfg):
- self.loglevel = self.LOG_LEVELS.get(cfg.loglevel.lower(), logging.INFO)
- self.error_log.setLevel(self.loglevel)
- self.access_log.setLevel(logging.INFO)
-
- # set gunicorn.error handler
- if self.cfg.capture_output and cfg.errorlog != "-":
- for stream in sys.stdout, sys.stderr:
- stream.flush()
-
- self.logfile = open(cfg.errorlog, 'a+')
- os.dup2(self.logfile.fileno(), sys.stdout.fileno())
- os.dup2(self.logfile.fileno(), sys.stderr.fileno())
-
- self._set_handler(self.error_log, cfg.errorlog,
- logging.Formatter(self.error_fmt, self.datefmt))
-
- # set gunicorn.access handler
- if cfg.accesslog is not None:
- self._set_handler(
- self.access_log, cfg.accesslog,
- fmt=logging.Formatter(self.access_fmt), stream=sys.stdout
- )
-
- # set syslog handler
- if cfg.syslog:
- self._set_syslog_handler(
- self.error_log, cfg, self.syslog_fmt, "error"
- )
- if not cfg.disable_redirect_access_to_syslog:
- self._set_syslog_handler(
- self.access_log, cfg, self.syslog_fmt, "access"
- )
-
- if cfg.logconfig_dict:
- config = CONFIG_DEFAULTS.copy()
- config.update(cfg.logconfig_dict)
- try:
- dictConfig(config)
- except (
- AttributeError,
- ImportError,
- ValueError,
- TypeError
- ) as exc:
- raise RuntimeError(str(exc))
- elif cfg.logconfig:
- if os.path.exists(cfg.logconfig):
- defaults = CONFIG_DEFAULTS.copy()
- defaults['__file__'] = cfg.logconfig
- defaults['here'] = os.path.dirname(cfg.logconfig)
- fileConfig(cfg.logconfig, defaults=defaults,
- disable_existing_loggers=False)
- else:
- msg = "Error: log config '%s' not found"
- raise RuntimeError(msg % cfg.logconfig)
-
- def critical(self, msg, *args, **kwargs):
- self.error_log.critical(msg, *args, **kwargs)
-
- def error(self, msg, *args, **kwargs):
- self.error_log.error(msg, *args, **kwargs)
-
- def warning(self, msg, *args, **kwargs):
- self.error_log.warning(msg, *args, **kwargs)
-
- def info(self, msg, *args, **kwargs):
- self.error_log.info(msg, *args, **kwargs)
-
- def debug(self, msg, *args, **kwargs):
- self.error_log.debug(msg, *args, **kwargs)
-
- def exception(self, msg, *args, **kwargs):
- self.error_log.exception(msg, *args, **kwargs)
-
- def log(self, lvl, msg, *args, **kwargs):
- if isinstance(lvl, str):
- lvl = self.LOG_LEVELS.get(lvl.lower(), logging.INFO)
- self.error_log.log(lvl, msg, *args, **kwargs)
-
- def atoms(self, resp, req, environ, request_time):
- """ Gets atoms for log formatting.
- """
- status = resp.status
- if isinstance(status, str):
- status = status.split(None, 1)[0]
- atoms = {
- 'h': environ.get('REMOTE_ADDR', '-'),
- 'l': '-',
- 'u': self._get_user(environ) or '-',
- 't': self.now(),
- 'r': "%s %s %s" % (environ['REQUEST_METHOD'],
- environ['RAW_URI'],
- environ["SERVER_PROTOCOL"]),
- 's': status,
- 'm': environ.get('REQUEST_METHOD'),
- 'U': environ.get('PATH_INFO'),
- 'q': environ.get('QUERY_STRING'),
- 'H': environ.get('SERVER_PROTOCOL'),
- 'b': getattr(resp, 'sent', None) is not None and str(resp.sent) or '-',
- 'B': getattr(resp, 'sent', None),
- 'f': environ.get('HTTP_REFERER', '-'),
- 'a': environ.get('HTTP_USER_AGENT', '-'),
- 'T': request_time.seconds,
- 'D': (request_time.seconds * 1000000) + request_time.microseconds,
- 'M': (request_time.seconds * 1000) + int(request_time.microseconds/1000),
- 'L': "%d.%06d" % (request_time.seconds, request_time.microseconds),
- 'p': "<%s>" % os.getpid()
- }
-
- # add request headers
- if hasattr(req, 'headers'):
- req_headers = req.headers
- else:
- req_headers = req
-
- if hasattr(req_headers, "items"):
- req_headers = req_headers.items()
-
- atoms.update({"{%s}i" % k.lower(): v for k, v in req_headers})
-
- resp_headers = resp.headers
- if hasattr(resp_headers, "items"):
- resp_headers = resp_headers.items()
-
- # add response headers
- atoms.update({"{%s}o" % k.lower(): v for k, v in resp_headers})
-
- # add environ variables
- environ_variables = environ.items()
- atoms.update({"{%s}e" % k.lower(): v for k, v in environ_variables})
-
- return atoms
-
- def access(self, resp, req, environ, request_time):
- """ See http://httpd.apache.org/docs/2.0/logs.html#combined
- for format details
- """
-
- if not (self.cfg.accesslog or self.cfg.logconfig or
- self.cfg.logconfig_dict or
- (self.cfg.syslog and not self.cfg.disable_redirect_access_to_syslog)):
- return
-
- # wrap atoms:
- # - make sure atoms will be tested case insensitively
- # - if an atom doesn't exist, replace it by '-'
- safe_atoms = self.atoms_wrapper_class(
- self.atoms(resp, req, environ, request_time)
- )
-
- try:
- self.access_log.info(self.cfg.access_log_format, safe_atoms)
- except Exception:
- self.error(traceback.format_exc())
-
- def now(self):
- """ return date in Apache Common Log Format """
- return time.strftime('[%d/%b/%Y:%H:%M:%S %z]')
-
- def reopen_files(self):
- if self.cfg.capture_output and self.cfg.errorlog != "-":
- for stream in sys.stdout, sys.stderr:
- stream.flush()
-
- with self.lock:
- if self.logfile is not None:
- self.logfile.close()
- self.logfile = open(self.cfg.errorlog, 'a+')
- os.dup2(self.logfile.fileno(), sys.stdout.fileno())
- os.dup2(self.logfile.fileno(), sys.stderr.fileno())
-
- for log in loggers():
- for handler in log.handlers:
- if isinstance(handler, logging.FileHandler):
- handler.acquire()
- try:
- if handler.stream:
- handler.close()
- handler.stream = handler._open()
- finally:
- handler.release()
-
- def close_on_exec(self):
- for log in loggers():
- for handler in log.handlers:
- if isinstance(handler, logging.FileHandler):
- handler.acquire()
- try:
- if handler.stream:
- util.close_on_exec(handler.stream.fileno())
- finally:
- handler.release()
-
- def _get_gunicorn_handler(self, log):
- for h in log.handlers:
- if getattr(h, "_gunicorn", False):
- return h
-
- def _set_handler(self, log, output, fmt, stream=None):
- # remove previous gunicorn log handler
- h = self._get_gunicorn_handler(log)
- if h:
- log.handlers.remove(h)
-
- if output is not None:
- if output == "-":
- h = logging.StreamHandler(stream)
- else:
- util.check_is_writeable(output)
- h = logging.FileHandler(output)
- # make sure the user can reopen the file
- try:
- os.chown(h.baseFilename, self.cfg.user, self.cfg.group)
- except OSError:
- # it's probably OK here; we assume the user has given
- # /dev/null as a parameter.
- pass
-
- h.setFormatter(fmt)
- h._gunicorn = True
- log.addHandler(h)
-
- def _set_syslog_handler(self, log, cfg, fmt, name):
- # setup format
- prefix = cfg.syslog_prefix or cfg.proc_name.replace(":", ".")
-
- prefix = "gunicorn.%s.%s" % (prefix, name)
-
- # set format
- fmt = logging.Formatter(r"%s: %s" % (prefix, fmt))
-
- # syslog facility
- try:
- facility = SYSLOG_FACILITIES[cfg.syslog_facility.lower()]
- except KeyError:
- raise RuntimeError("unknown facility name")
-
- # parse syslog address
- socktype, addr = parse_syslog_address(cfg.syslog_addr)
-
- # finally setup the syslog handler
- h = logging.handlers.SysLogHandler(address=addr,
- facility=facility, socktype=socktype)
-
- h.setFormatter(fmt)
- h._gunicorn = True
- log.addHandler(h)
-
- def _get_user(self, environ):
- user = None
- http_auth = environ.get("HTTP_AUTHORIZATION")
- if http_auth and http_auth.lower().startswith('basic'):
- auth = http_auth.split(" ", 1)
- if len(auth) == 2:
- try:
- # b64decode doesn't accept unicode in Python < 3.3
- # so we need to convert it to a byte string
- auth = base64.b64decode(auth[1].strip().encode('utf-8'))
- # b64decode returns a byte string
- auth = auth.decode('utf-8')
- auth = auth.split(":", 1)
- except (TypeError, binascii.Error, UnicodeDecodeError) as exc:
- self.debug("Couldn't get username: %s", exc)
- return user
- if len(auth) == 2:
- user = auth[0]
- return user
diff --git a/env/lib/python3.9/site-packages/gunicorn/http/__init__.py b/env/lib/python3.9/site-packages/gunicorn/http/__init__.py
deleted file mode 100644
index 1da6f3e..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/http/__init__.py
+++ /dev/null
@@ -1,9 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-from gunicorn.http.message import Message, Request
-from gunicorn.http.parser import RequestParser
-
-__all__ = ['Message', 'Request', 'RequestParser']
diff --git a/env/lib/python3.9/site-packages/gunicorn/http/body.py b/env/lib/python3.9/site-packages/gunicorn/http/body.py
deleted file mode 100644
index afde368..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/http/body.py
+++ /dev/null
@@ -1,262 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import io
-import sys
-
-from gunicorn.http.errors import (NoMoreData, ChunkMissingTerminator,
- InvalidChunkSize)
-
-
-class ChunkedReader(object):
- def __init__(self, req, unreader):
- self.req = req
- self.parser = self.parse_chunked(unreader)
- self.buf = io.BytesIO()
-
- def read(self, size):
- if not isinstance(size, int):
- raise TypeError("size must be an integral type")
- if size < 0:
- raise ValueError("Size must be non-negative.")
- if size == 0:
- return b""
-
- if self.parser:
- while self.buf.tell() < size:
- try:
- self.buf.write(next(self.parser))
- except StopIteration:
- self.parser = None
- break
-
- data = self.buf.getvalue()
- ret, rest = data[:size], data[size:]
- self.buf = io.BytesIO()
- self.buf.write(rest)
- return ret
-
- def parse_trailers(self, unreader, data):
- buf = io.BytesIO()
- buf.write(data)
-
- idx = buf.getvalue().find(b"\r\n\r\n")
- done = buf.getvalue()[:2] == b"\r\n"
- while idx < 0 and not done:
- self.get_data(unreader, buf)
- idx = buf.getvalue().find(b"\r\n\r\n")
- done = buf.getvalue()[:2] == b"\r\n"
- if done:
- unreader.unread(buf.getvalue()[2:])
- return b""
- self.req.trailers = self.req.parse_headers(buf.getvalue()[:idx])
- unreader.unread(buf.getvalue()[idx + 4:])
-
- def parse_chunked(self, unreader):
- (size, rest) = self.parse_chunk_size(unreader)
- while size > 0:
- while size > len(rest):
- size -= len(rest)
- yield rest
- rest = unreader.read()
- if not rest:
- raise NoMoreData()
- yield rest[:size]
- # Remove \r\n after chunk
- rest = rest[size:]
- while len(rest) < 2:
- rest += unreader.read()
- if rest[:2] != b'\r\n':
- raise ChunkMissingTerminator(rest[:2])
- (size, rest) = self.parse_chunk_size(unreader, data=rest[2:])
-
- def parse_chunk_size(self, unreader, data=None):
- buf = io.BytesIO()
- if data is not None:
- buf.write(data)
-
- idx = buf.getvalue().find(b"\r\n")
- while idx < 0:
- self.get_data(unreader, buf)
- idx = buf.getvalue().find(b"\r\n")
-
- data = buf.getvalue()
- line, rest_chunk = data[:idx], data[idx + 2:]
-
- chunk_size = line.split(b";", 1)[0].strip()
- try:
- chunk_size = int(chunk_size, 16)
- except ValueError:
- raise InvalidChunkSize(chunk_size)
-
- if chunk_size == 0:
- try:
- self.parse_trailers(unreader, rest_chunk)
- except NoMoreData:
- pass
- return (0, None)
- return (chunk_size, rest_chunk)
-
- def get_data(self, unreader, buf):
- data = unreader.read()
- if not data:
- raise NoMoreData()
- buf.write(data)
-
-
-class LengthReader(object):
- def __init__(self, unreader, length):
- self.unreader = unreader
- self.length = length
-
- def read(self, size):
- if not isinstance(size, int):
- raise TypeError("size must be an integral type")
-
- size = min(self.length, size)
- if size < 0:
- raise ValueError("Size must be non-negative.")
- if size == 0:
- return b""
-
- buf = io.BytesIO()
- data = self.unreader.read()
- while data:
- buf.write(data)
- if buf.tell() >= size:
- break
- data = self.unreader.read()
-
- buf = buf.getvalue()
- ret, rest = buf[:size], buf[size:]
- self.unreader.unread(rest)
- self.length -= size
- return ret
-
-
-class EOFReader(object):
- def __init__(self, unreader):
- self.unreader = unreader
- self.buf = io.BytesIO()
- self.finished = False
-
- def read(self, size):
- if not isinstance(size, int):
- raise TypeError("size must be an integral type")
- if size < 0:
- raise ValueError("Size must be non-negative.")
- if size == 0:
- return b""
-
- if self.finished:
- data = self.buf.getvalue()
- ret, rest = data[:size], data[size:]
- self.buf = io.BytesIO()
- self.buf.write(rest)
- return ret
-
- data = self.unreader.read()
- while data:
- self.buf.write(data)
- if self.buf.tell() > size:
- break
- data = self.unreader.read()
-
- if not data:
- self.finished = True
-
- data = self.buf.getvalue()
- ret, rest = data[:size], data[size:]
- self.buf = io.BytesIO()
- self.buf.write(rest)
- return ret
-
-
-class Body(object):
- def __init__(self, reader):
- self.reader = reader
- self.buf = io.BytesIO()
-
- def __iter__(self):
- return self
-
- def __next__(self):
- ret = self.readline()
- if not ret:
- raise StopIteration()
- return ret
-
- next = __next__
-
- def getsize(self, size):
- if size is None:
- return sys.maxsize
- elif not isinstance(size, int):
- raise TypeError("size must be an integral type")
- elif size < 0:
- return sys.maxsize
- return size
-
- def read(self, size=None):
- size = self.getsize(size)
- if size == 0:
- return b""
-
- if size < self.buf.tell():
- data = self.buf.getvalue()
- ret, rest = data[:size], data[size:]
- self.buf = io.BytesIO()
- self.buf.write(rest)
- return ret
-
- while size > self.buf.tell():
- data = self.reader.read(1024)
- if not data:
- break
- self.buf.write(data)
-
- data = self.buf.getvalue()
- ret, rest = data[:size], data[size:]
- self.buf = io.BytesIO()
- self.buf.write(rest)
- return ret
-
- def readline(self, size=None):
- size = self.getsize(size)
- if size == 0:
- return b""
-
- data = self.buf.getvalue()
- self.buf = io.BytesIO()
-
- ret = []
- while 1:
- idx = data.find(b"\n", 0, size)
- idx = idx + 1 if idx >= 0 else size if len(data) >= size else 0
- if idx:
- ret.append(data[:idx])
- self.buf.write(data[idx:])
- break
-
- ret.append(data)
- size -= len(data)
- data = self.reader.read(min(1024, size))
- if not data:
- break
-
- return b"".join(ret)
-
- def readlines(self, size=None):
- ret = []
- data = self.read()
- while data:
- pos = data.find(b"\n")
- if pos < 0:
- ret.append(data)
- data = b""
- else:
- line, data = data[:pos + 1], data[pos + 1:]
- ret.append(line)
- return ret
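As a concrete reference for the parsing above, the wire format ``ChunkedReader`` consumes looks like this; the size line is hex and may carry extensions after ``;`` (standalone sketch, no gunicorn imports needed):

```python
# a chunked body: hex size, CRLF, payload, CRLF, ..., terminated by a "0" chunk
body = b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n"

# size-line handling, mirroring ChunkedReader.parse_chunk_size()
line = b"1a;ext=1"
size = int(line.split(b";", 1)[0].strip(), 16)
assert size == 26
```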
diff --git a/env/lib/python3.9/site-packages/gunicorn/http/errors.py b/env/lib/python3.9/site-packages/gunicorn/http/errors.py
deleted file mode 100644
index 7839ef0..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/http/errors.py
+++ /dev/null
@@ -1,120 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-# We don't need to call super() in __init__ methods of our
-# BaseException and Exception classes because we also define
-# our own __str__ methods so there is no need to pass 'message'
-# to the base class to get a meaningful output from 'str(exc)'.
-# pylint: disable=super-init-not-called
-
-
-class ParseException(Exception):
- pass
-
-
-class NoMoreData(IOError):
- def __init__(self, buf=None):
- self.buf = buf
-
- def __str__(self):
- return "No more data after: %r" % self.buf
-
-
-class InvalidRequestLine(ParseException):
- def __init__(self, req):
- self.req = req
- self.code = 400
-
- def __str__(self):
- return "Invalid HTTP request line: %r" % self.req
-
-
-class InvalidRequestMethod(ParseException):
- def __init__(self, method):
- self.method = method
-
- def __str__(self):
- return "Invalid HTTP method: %r" % self.method
-
-
-class InvalidHTTPVersion(ParseException):
- def __init__(self, version):
- self.version = version
-
- def __str__(self):
- return "Invalid HTTP Version: %r" % self.version
-
-
-class InvalidHeader(ParseException):
- def __init__(self, hdr, req=None):
- self.hdr = hdr
- self.req = req
-
- def __str__(self):
- return "Invalid HTTP Header: %r" % self.hdr
-
-
-class InvalidHeaderName(ParseException):
- def __init__(self, hdr):
- self.hdr = hdr
-
- def __str__(self):
- return "Invalid HTTP header name: %r" % self.hdr
-
-
-class InvalidChunkSize(IOError):
- def __init__(self, data):
- self.data = data
-
- def __str__(self):
- return "Invalid chunk size: %r" % self.data
-
-
-class ChunkMissingTerminator(IOError):
- def __init__(self, term):
- self.term = term
-
- def __str__(self):
- return "Invalid chunk terminator is not '\\r\\n': %r" % self.term
-
-
-class LimitRequestLine(ParseException):
- def __init__(self, size, max_size):
- self.size = size
- self.max_size = max_size
-
- def __str__(self):
- return "Request Line is too large (%s > %s)" % (self.size, self.max_size)
-
-
-class LimitRequestHeaders(ParseException):
- def __init__(self, msg):
- self.msg = msg
-
- def __str__(self):
- return self.msg
-
-
-class InvalidProxyLine(ParseException):
- def __init__(self, line):
- self.line = line
- self.code = 400
-
- def __str__(self):
- return "Invalid PROXY line: %r" % self.line
-
-
-class ForbiddenProxyRequest(ParseException):
- def __init__(self, host):
- self.host = host
- self.code = 403
-
- def __str__(self):
- return "Proxy request from %r not allowed" % self.host
-
-
-class InvalidSchemeHeaders(ParseException):
- def __str__(self):
- return "Contradictory scheme headers"
diff --git a/env/lib/python3.9/site-packages/gunicorn/http/message.py b/env/lib/python3.9/site-packages/gunicorn/http/message.py
deleted file mode 100644
index 17d2240..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/http/message.py
+++ /dev/null
@@ -1,356 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import io
-import re
-import socket
-
-from gunicorn.http.body import ChunkedReader, LengthReader, EOFReader, Body
-from gunicorn.http.errors import (
- InvalidHeader, InvalidHeaderName, NoMoreData,
- InvalidRequestLine, InvalidRequestMethod, InvalidHTTPVersion,
- LimitRequestLine, LimitRequestHeaders,
-)
-from gunicorn.http.errors import InvalidProxyLine, ForbiddenProxyRequest
-from gunicorn.http.errors import InvalidSchemeHeaders
-from gunicorn.util import bytes_to_str, split_request_uri
-
-MAX_REQUEST_LINE = 8190
-MAX_HEADERS = 32768
-DEFAULT_MAX_HEADERFIELD_SIZE = 8190
-
-HEADER_RE = re.compile(r"[\x00-\x1F\x7F()<>@,;:\[\]={} \t\\\"]")
-METH_RE = re.compile(r"[A-Z0-9$-_.]{3,20}")
-VERSION_RE = re.compile(r"HTTP/(\d+)\.(\d+)")
-
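These three patterns gate request parsing; a quick standalone illustration of what two of them accept (patterns copied from above, not imported):

```python
import re

METH_RE = re.compile(r"[A-Z0-9$-_.]{3,20}")
VERSION_RE = re.compile(r"HTTP/(\d+)\.(\d+)")

assert METH_RE.match("GET")
assert METH_RE.match("OPTIONS")
match = VERSION_RE.match("HTTP/1.1")
assert (int(match.group(1)), int(match.group(2))) == (1, 1)
```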
-
-class Message(object):
- def __init__(self, cfg, unreader, peer_addr):
- self.cfg = cfg
- self.unreader = unreader
- self.peer_addr = peer_addr
- self.version = None
- self.headers = []
- self.trailers = []
- self.body = None
- self.scheme = "https" if cfg.is_ssl else "http"
-
- # set headers limits
- self.limit_request_fields = cfg.limit_request_fields
- if (self.limit_request_fields <= 0
- or self.limit_request_fields > MAX_HEADERS):
- self.limit_request_fields = MAX_HEADERS
- self.limit_request_field_size = cfg.limit_request_field_size
- if self.limit_request_field_size < 0:
- self.limit_request_field_size = DEFAULT_MAX_HEADERFIELD_SIZE
-
- # set max header buffer size
- max_header_field_size = self.limit_request_field_size or DEFAULT_MAX_HEADERFIELD_SIZE
- self.max_buffer_headers = self.limit_request_fields * \
- (max_header_field_size + 2) + 4
-
- unused = self.parse(self.unreader)
- self.unreader.unread(unused)
- self.set_body_reader()
-
- def parse(self, unreader):
- raise NotImplementedError()
-
- def parse_headers(self, data):
- cfg = self.cfg
- headers = []
-
- # Split lines on \r\n keeping the \r\n on each line
- lines = [bytes_to_str(line) + "\r\n" for line in data.split(b"\r\n")]
-
- # handle scheme headers
- scheme_header = False
- secure_scheme_headers = {}
- if ('*' in cfg.forwarded_allow_ips or
- not isinstance(self.peer_addr, tuple)
- or self.peer_addr[0] in cfg.forwarded_allow_ips):
- secure_scheme_headers = cfg.secure_scheme_headers
-
- # Parse headers into key/value pairs paying attention
- # to continuation lines.
- while lines:
- if len(headers) >= self.limit_request_fields:
- raise LimitRequestHeaders("limit request headers fields")
-
- # Parse initial header name : value pair.
- curr = lines.pop(0)
- header_length = len(curr)
- if curr.find(":") < 0:
- raise InvalidHeader(curr.strip())
- name, value = curr.split(":", 1)
- if self.cfg.strip_header_spaces:
- name = name.rstrip(" \t").upper()
- else:
- name = name.upper()
- if HEADER_RE.search(name):
- raise InvalidHeaderName(name)
-
- name, value = name.strip(), [value.lstrip()]
-
- # Consume value continuation lines
- while lines and lines[0].startswith((" ", "\t")):
- curr = lines.pop(0)
- header_length += len(curr)
- if header_length > self.limit_request_field_size > 0:
- raise LimitRequestHeaders("limit request headers "
- "fields size")
- value.append(curr)
- value = ''.join(value).rstrip()
-
- if header_length > self.limit_request_field_size > 0:
- raise LimitRequestHeaders("limit request headers fields size")
-
- if name in secure_scheme_headers:
- secure = value == secure_scheme_headers[name]
- scheme = "https" if secure else "http"
- if scheme_header:
- if scheme != self.scheme:
- raise InvalidSchemeHeaders()
- else:
- scheme_header = True
- self.scheme = scheme
-
- headers.append((name, value))
-
- return headers
-
- def set_body_reader(self):
- chunked = False
- content_length = None
-
- for (name, value) in self.headers:
- if name == "CONTENT-LENGTH":
- if content_length is not None:
- raise InvalidHeader("CONTENT-LENGTH", req=self)
- content_length = value
- elif name == "TRANSFER-ENCODING":
- if value.lower() == "chunked":
- chunked = True
-
- if chunked:
- self.body = Body(ChunkedReader(self, self.unreader))
- elif content_length is not None:
- try:
- content_length = int(content_length)
- except ValueError:
- raise InvalidHeader("CONTENT-LENGTH", req=self)
-
- if content_length < 0:
- raise InvalidHeader("CONTENT-LENGTH", req=self)
-
- self.body = Body(LengthReader(self.unreader, content_length))
- else:
- self.body = Body(EOFReader(self.unreader))
-
- def should_close(self):
- for (h, v) in self.headers:
- if h == "CONNECTION":
- v = v.lower().strip()
- if v == "close":
- return True
- elif v == "keep-alive":
- return False
- break
- return self.version <= (1, 0)
-
-
-class Request(Message):
- def __init__(self, cfg, unreader, peer_addr, req_number=1):
- self.method = None
- self.uri = None
- self.path = None
- self.query = None
- self.fragment = None
-
- # get max request line size
- self.limit_request_line = cfg.limit_request_line
- if (self.limit_request_line < 0
- or self.limit_request_line >= MAX_REQUEST_LINE):
- self.limit_request_line = MAX_REQUEST_LINE
-
- self.req_number = req_number
- self.proxy_protocol_info = None
- super().__init__(cfg, unreader, peer_addr)
-
- def get_data(self, unreader, buf, stop=False):
- data = unreader.read()
- if not data:
- if stop:
- raise StopIteration()
- raise NoMoreData(buf.getvalue())
- buf.write(data)
-
- def parse(self, unreader):
- buf = io.BytesIO()
- self.get_data(unreader, buf, stop=True)
-
- # get request line
- line, rbuf = self.read_line(unreader, buf, self.limit_request_line)
-
- # proxy protocol
- if self.proxy_protocol(bytes_to_str(line)):
- # get next request line
- buf = io.BytesIO()
- buf.write(rbuf)
- line, rbuf = self.read_line(unreader, buf, self.limit_request_line)
-
- self.parse_request_line(line)
- buf = io.BytesIO()
- buf.write(rbuf)
-
- # Headers
- data = buf.getvalue()
- idx = data.find(b"\r\n\r\n")
-
- done = data[:2] == b"\r\n"
- while True:
- idx = data.find(b"\r\n\r\n")
- done = data[:2] == b"\r\n"
-
- if idx < 0 and not done:
- self.get_data(unreader, buf)
- data = buf.getvalue()
- if len(data) > self.max_buffer_headers:
- raise LimitRequestHeaders("max buffer headers")
- else:
- break
-
- if done:
- self.unreader.unread(data[2:])
- return b""
-
- self.headers = self.parse_headers(data[:idx])
-
- ret = data[idx + 4:]
- buf = None
- return ret
-
- def read_line(self, unreader, buf, limit=0):
- data = buf.getvalue()
-
- while True:
- idx = data.find(b"\r\n")
- if idx >= 0:
- # check if the request line is too large
- if idx > limit > 0:
- raise LimitRequestLine(idx, limit)
- break
- if len(data) - 2 > limit > 0:
- raise LimitRequestLine(len(data), limit)
- self.get_data(unreader, buf)
- data = buf.getvalue()
-
- return (data[:idx], # request line,
- data[idx + 2:]) # residue in the buffer, skip \r\n
-
- def proxy_protocol(self, line):
- """\
- Detect, check and parse proxy protocol.
-
- :raises: ForbiddenProxyRequest, InvalidProxyLine.
- :return: True for proxy protocol line else False
- """
- if not self.cfg.proxy_protocol:
- return False
-
- if self.req_number != 1:
- return False
-
- if not line.startswith("PROXY"):
- return False
-
- self.proxy_protocol_access_check()
- self.parse_proxy_protocol(line)
-
- return True
-
- def proxy_protocol_access_check(self):
- # check in allow list
- if ("*" not in self.cfg.proxy_allow_ips and
- isinstance(self.peer_addr, tuple) and
- self.peer_addr[0] not in self.cfg.proxy_allow_ips):
- raise ForbiddenProxyRequest(self.peer_addr[0])
-
- def parse_proxy_protocol(self, line):
- bits = line.split()
-
- if len(bits) != 6:
- raise InvalidProxyLine(line)
-
- # Extract data
- proto = bits[1]
- s_addr = bits[2]
- d_addr = bits[3]
-
- # Validation
- if proto not in ["TCP4", "TCP6"]:
- raise InvalidProxyLine("protocol '%s' not supported" % proto)
- if proto == "TCP4":
- try:
- socket.inet_pton(socket.AF_INET, s_addr)
- socket.inet_pton(socket.AF_INET, d_addr)
- except socket.error:
- raise InvalidProxyLine(line)
- elif proto == "TCP6":
- try:
- socket.inet_pton(socket.AF_INET6, s_addr)
- socket.inet_pton(socket.AF_INET6, d_addr)
- except socket.error:
- raise InvalidProxyLine(line)
-
- try:
- s_port = int(bits[4])
- d_port = int(bits[5])
- except ValueError:
- raise InvalidProxyLine("invalid port %s" % line)
-
- if not ((0 <= s_port <= 65535) and (0 <= d_port <= 65535)):
- raise InvalidProxyLine("invalid port %s" % line)
-
- # Set data
- self.proxy_protocol_info = {
- "proxy_protocol": proto,
- "client_addr": s_addr,
- "client_port": s_port,
- "proxy_addr": d_addr,
- "proxy_port": d_port
- }
-
- def parse_request_line(self, line_bytes):
- bits = [bytes_to_str(bit) for bit in line_bytes.split(None, 2)]
- if len(bits) != 3:
- raise InvalidRequestLine(bytes_to_str(line_bytes))
-
- # Method
- if not METH_RE.match(bits[0]):
- raise InvalidRequestMethod(bits[0])
- self.method = bits[0].upper()
-
- # URI
- self.uri = bits[1]
-
- try:
- parts = split_request_uri(self.uri)
- except ValueError:
- raise InvalidRequestLine(bytes_to_str(line_bytes))
- self.path = parts.path or ""
- self.query = parts.query or ""
- self.fragment = parts.fragment or ""
-
- # Version
- match = VERSION_RE.match(bits[2])
- if match is None:
- raise InvalidHTTPVersion(bits[2])
- self.version = (int(match.group(1)), int(match.group(2)))
-
- def set_body_reader(self):
- super().set_body_reader()
- if isinstance(self.body.reader, EOFReader):
- self.body = Body(LengthReader(self.unreader, 0))
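For reference, ``parse_proxy_protocol`` above consumes version-1 PROXY lines of this shape (the example line comes from the PROXY protocol spec; standalone sketch):

```python
line = "PROXY TCP4 192.168.0.1 192.168.0.11 56324 443"
proto, s_addr, d_addr, s_port, d_port = line.split()[1:]
assert proto in ("TCP4", "TCP6")
assert (int(s_port), int(d_port)) == (56324, 443)
```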
diff --git a/env/lib/python3.9/site-packages/gunicorn/http/parser.py b/env/lib/python3.9/site-packages/gunicorn/http/parser.py
deleted file mode 100644
index 5d689f0..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/http/parser.py
+++ /dev/null
@@ -1,52 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-from gunicorn.http.message import Request
-from gunicorn.http.unreader import SocketUnreader, IterUnreader
-
-
-class Parser(object):
-
- mesg_class = None
-
- def __init__(self, cfg, source, source_addr):
- self.cfg = cfg
- if hasattr(source, "recv"):
- self.unreader = SocketUnreader(source)
- else:
- self.unreader = IterUnreader(source)
- self.mesg = None
- self.source_addr = source_addr
-
- # request counter (for keepalive connections)
- self.req_count = 0
-
- def __iter__(self):
- return self
-
- def __next__(self):
- # Stop if HTTP dictates a stop.
- if self.mesg and self.mesg.should_close():
- raise StopIteration()
-
- # Discard any unread body of the previous message
- if self.mesg:
- data = self.mesg.body.read(8192)
- while data:
- data = self.mesg.body.read(8192)
-
- # Parse the next request
- self.req_count += 1
- self.mesg = self.mesg_class(self.cfg, self.unreader, self.source_addr, self.req_count)
- if not self.mesg:
- raise StopIteration()
- return self.mesg
-
- next = __next__
-
-
-class RequestParser(Parser):
-
- mesg_class = Request
diff --git a/env/lib/python3.9/site-packages/gunicorn/http/unreader.py b/env/lib/python3.9/site-packages/gunicorn/http/unreader.py
deleted file mode 100644
index 273bfc3..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/http/unreader.py
+++ /dev/null
@@ -1,79 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import io
-import os
-
-# Classes that can undo reading data from
-# a given type of data source.
-
-
-class Unreader(object):
- def __init__(self):
- self.buf = io.BytesIO()
-
- def chunk(self):
- raise NotImplementedError()
-
- def read(self, size=None):
- if size is not None and not isinstance(size, int):
- raise TypeError("size parameter must be an int or long.")
-
- if size is not None:
- if size == 0:
- return b""
- if size < 0:
- size = None
-
- self.buf.seek(0, os.SEEK_END)
-
- if size is None and self.buf.tell():
- ret = self.buf.getvalue()
- self.buf = io.BytesIO()
- return ret
- if size is None:
- d = self.chunk()
- return d
-
- while self.buf.tell() < size:
- chunk = self.chunk()
- if not chunk:
- ret = self.buf.getvalue()
- self.buf = io.BytesIO()
- return ret
- self.buf.write(chunk)
- data = self.buf.getvalue()
- self.buf = io.BytesIO()
- self.buf.write(data[size:])
- return data[:size]
-
- def unread(self, data):
- self.buf.seek(0, os.SEEK_END)
- self.buf.write(data)
-
-
-class SocketUnreader(Unreader):
- def __init__(self, sock, max_chunk=8192):
- super().__init__()
- self.sock = sock
- self.mxchunk = max_chunk
-
- def chunk(self):
- return self.sock.recv(self.mxchunk)
-
-
-class IterUnreader(Unreader):
- def __init__(self, iterable):
- super().__init__()
- self.iter = iter(iterable)
-
- def chunk(self):
- if not self.iter:
- return b""
- try:
- return next(self.iter)
- except StopIteration:
- self.iter = None
- return b""
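The read/unread contract above is easiest to see with ``IterUnreader`` (a sketch assuming gunicorn is importable):

```python
from gunicorn.http.unreader import IterUnreader

u = IterUnreader([b"hello", b"world"])
assert u.read(3) == b"hel"  # pulls one chunk, buffers the remainder
u.unread(b"xy")             # pushed-back data is appended to the buffer
assert u.read() == b"loxy"  # size=None drains the buffer first
```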
diff --git a/env/lib/python3.9/site-packages/gunicorn/http/wsgi.py b/env/lib/python3.9/site-packages/gunicorn/http/wsgi.py
deleted file mode 100644
index 478677f..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/http/wsgi.py
+++ /dev/null
@@ -1,393 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import io
-import logging
-import os
-import re
-import sys
-
-from gunicorn.http.message import HEADER_RE
-from gunicorn.http.errors import InvalidHeader, InvalidHeaderName
-from gunicorn import SERVER_SOFTWARE, SERVER
-import gunicorn.util as util
-
-# Send files in at most 1GB blocks as some operating systems can have problems
-# with sending files in blocks over 2GB.
-BLKSIZE = 0x3FFFFFFF
-
-HEADER_VALUE_RE = re.compile(r'[\x00-\x1F\x7F]')
-
-log = logging.getLogger(__name__)
-
-
-class FileWrapper(object):
-
- def __init__(self, filelike, blksize=8192):
- self.filelike = filelike
- self.blksize = blksize
- if hasattr(filelike, 'close'):
- self.close = filelike.close
-
- def __getitem__(self, key):
- data = self.filelike.read(self.blksize)
- if data:
- return data
- raise IndexError
-
-
-class WSGIErrorsWrapper(io.RawIOBase):
-
- def __init__(self, cfg):
- # There is no public __init__ method for RawIOBase so
- # we don't need to call super() in the __init__ method.
- # pylint: disable=super-init-not-called
- errorlog = logging.getLogger("gunicorn.error")
- handlers = errorlog.handlers
- self.streams = []
-
- if cfg.errorlog == "-":
- self.streams.append(sys.stderr)
- handlers = handlers[1:]
-
- for h in handlers:
- if hasattr(h, "stream"):
- self.streams.append(h.stream)
-
- def write(self, data):
- for stream in self.streams:
- try:
- stream.write(data)
- except UnicodeError:
- stream.write(data.encode("UTF-8"))
- stream.flush()
-
-
-def base_environ(cfg):
- return {
- "wsgi.errors": WSGIErrorsWrapper(cfg),
- "wsgi.version": (1, 0),
- "wsgi.multithread": False,
- "wsgi.multiprocess": (cfg.workers > 1),
- "wsgi.run_once": False,
- "wsgi.file_wrapper": FileWrapper,
- "wsgi.input_terminated": True,
- "SERVER_SOFTWARE": SERVER_SOFTWARE,
- }
-
-
-def default_environ(req, sock, cfg):
- env = base_environ(cfg)
- env.update({
- "wsgi.input": req.body,
- "gunicorn.socket": sock,
- "REQUEST_METHOD": req.method,
- "QUERY_STRING": req.query,
- "RAW_URI": req.uri,
- "SERVER_PROTOCOL": "HTTP/%s" % ".".join([str(v) for v in req.version])
- })
- return env
-
-
-def proxy_environ(req):
- info = req.proxy_protocol_info
-
- if not info:
- return {}
-
- return {
- "PROXY_PROTOCOL": info["proxy_protocol"],
- "REMOTE_ADDR": info["client_addr"],
- "REMOTE_PORT": str(info["client_port"]),
- "PROXY_ADDR": info["proxy_addr"],
- "PROXY_PORT": str(info["proxy_port"]),
- }
-
-
-def create(req, sock, client, server, cfg):
- resp = Response(req, sock, cfg)
-
- # set initial environ
- environ = default_environ(req, sock, cfg)
-
- # default variables
- host = None
- script_name = os.environ.get("SCRIPT_NAME", "")
-
- # add the headers to the environ
- for hdr_name, hdr_value in req.headers:
- if hdr_name == "EXPECT":
- # handle expect
- if hdr_value.lower() == "100-continue":
- sock.send(b"HTTP/1.1 100 Continue\r\n\r\n")
- elif hdr_name == 'HOST':
- host = hdr_value
- elif hdr_name == "SCRIPT_NAME":
- script_name = hdr_value
- elif hdr_name == "CONTENT-TYPE":
- environ['CONTENT_TYPE'] = hdr_value
- continue
- elif hdr_name == "CONTENT-LENGTH":
- environ['CONTENT_LENGTH'] = hdr_value
- continue
-
- key = 'HTTP_' + hdr_name.replace('-', '_')
- if key in environ:
- hdr_value = "%s,%s" % (environ[key], hdr_value)
- environ[key] = hdr_value
-
- # set the url scheme
- environ['wsgi.url_scheme'] = req.scheme
-
- # set the REMOTE_* keys in environ
- # authors should be aware that REMOTE_HOST and REMOTE_ADDR
- # may not qualify the remote addr:
- # http://www.ietf.org/rfc/rfc3875
- if isinstance(client, str):
- environ['REMOTE_ADDR'] = client
- elif isinstance(client, bytes):
- environ['REMOTE_ADDR'] = client.decode()
- else:
- environ['REMOTE_ADDR'] = client[0]
- environ['REMOTE_PORT'] = str(client[1])
-
- # handle the SERVER_*
- # Normally only the application should use the Host header but since the
- # WSGI spec doesn't support unix sockets, we are using it to create
- # viable SERVER_* if possible.
- if isinstance(server, str):
- server = server.split(":")
- if len(server) == 1:
- # unix socket
- if host:
- server = host.split(':')
- if len(server) == 1:
- if req.scheme == "http":
- server.append(80)
- elif req.scheme == "https":
- server.append(443)
- else:
- server.append('')
- else:
- # no host header given which means that we are not behind a
- # proxy, so append an empty port.
- server.append('')
- environ['SERVER_NAME'] = server[0]
- environ['SERVER_PORT'] = str(server[1])
-
- # set the path and script name
- path_info = req.path
- if script_name:
- path_info = path_info.split(script_name, 1)[1]
- environ['PATH_INFO'] = util.unquote_to_wsgi_str(path_info)
- environ['SCRIPT_NAME'] = script_name
-
- # override the environ with the correct remote and server address if
- # we are behind a proxy using the proxy protocol.
- environ.update(proxy_environ(req))
- return resp, environ
-
-
-class Response(object):
-
- def __init__(self, req, sock, cfg):
- self.req = req
- self.sock = sock
- self.version = SERVER
- self.status = None
- self.chunked = False
- self.must_close = False
- self.headers = []
- self.headers_sent = False
- self.response_length = None
- self.sent = 0
- self.upgrade = False
- self.cfg = cfg
-
- def force_close(self):
- self.must_close = True
-
- def should_close(self):
- if self.must_close or self.req.should_close():
- return True
- if self.response_length is not None or self.chunked:
- return False
- if self.req.method == 'HEAD':
- return False
- if self.status_code < 200 or self.status_code in (204, 304):
- return False
- return True
-
- def start_response(self, status, headers, exc_info=None):
- if exc_info:
- try:
- if self.status and self.headers_sent:
- util.reraise(exc_info[0], exc_info[1], exc_info[2])
- finally:
- exc_info = None
- elif self.status is not None:
- raise AssertionError("Response headers already set!")
-
- self.status = status
-
- # get the status code from the response here so we can use it to check
- # the need for the connection header later without parsing the string
- # each time.
- try:
- self.status_code = int(self.status.split()[0])
- except ValueError:
- self.status_code = None
-
- self.process_headers(headers)
- self.chunked = self.is_chunked()
- return self.write
-
- def process_headers(self, headers):
- for name, value in headers:
- if not isinstance(name, str):
- raise TypeError('%r is not a string' % name)
-
- if HEADER_RE.search(name):
- raise InvalidHeaderName('%r' % name)
-
- if not isinstance(value, str):
- raise TypeError('%r is not a string' % value)
-
- if HEADER_VALUE_RE.search(value):
- raise InvalidHeader('%r' % value)
-
- value = value.strip()
- lname = name.lower().strip()
- if lname == "content-length":
- self.response_length = int(value)
- elif util.is_hoppish(name):
- if lname == "connection":
- # handle websocket
- if value.lower().strip() == "upgrade":
- self.upgrade = True
- elif lname == "upgrade":
- if value.lower().strip() == "websocket":
- self.headers.append((name.strip(), value))
-
- # ignore hopbyhop headers
- continue
- self.headers.append((name.strip(), value))
-
- def is_chunked(self):
- # Only use chunked responses when the client is
- # speaking HTTP/1.1 or newer and there was
- # no Content-Length header set.
- if self.response_length is not None:
- return False
- elif self.req.version <= (1, 0):
- return False
- elif self.req.method == 'HEAD':
- # Responses to a HEAD request MUST NOT contain a response body.
- return False
- elif self.status_code in (204, 304):
- # Do not use chunked responses when the response is guaranteed to
- # not have a response body.
- return False
- return True
-
- def default_headers(self):
- # set the connection header
- if self.upgrade:
- connection = "upgrade"
- elif self.should_close():
- connection = "close"
- else:
- connection = "keep-alive"
-
- headers = [
- "HTTP/%s.%s %s\r\n" % (self.req.version[0],
- self.req.version[1], self.status),
- "Server: %s\r\n" % self.version,
- "Date: %s\r\n" % util.http_date(),
- "Connection: %s\r\n" % connection
- ]
- if self.chunked:
- headers.append("Transfer-Encoding: chunked\r\n")
- return headers
-
- def send_headers(self):
- if self.headers_sent:
- return
- tosend = self.default_headers()
- tosend.extend(["%s: %s\r\n" % (k, v) for k, v in self.headers])
-
- header_str = "%s\r\n" % "".join(tosend)
- util.write(self.sock, util.to_bytestring(header_str, "latin-1"))
- self.headers_sent = True
-
- def write(self, arg):
- self.send_headers()
- if not isinstance(arg, bytes):
- raise TypeError('%r is not a byte' % arg)
- arglen = len(arg)
- tosend = arglen
- if self.response_length is not None:
- if self.sent >= self.response_length:
- # Never write more than self.response_length bytes
- return
-
- tosend = min(self.response_length - self.sent, tosend)
- if tosend < arglen:
- arg = arg[:tosend]
-
- # Sending an empty chunk signals the end of the
- # response and prematurely closes the response
- if self.chunked and tosend == 0:
- return
-
- self.sent += tosend
- util.write(self.sock, arg, self.chunked)
-
- def can_sendfile(self):
- return self.cfg.sendfile is not False
-
- def sendfile(self, respiter):
- if self.cfg.is_ssl or not self.can_sendfile():
- return False
-
- if not util.has_fileno(respiter.filelike):
- return False
-
- fileno = respiter.filelike.fileno()
- try:
- offset = os.lseek(fileno, 0, os.SEEK_CUR)
- if self.response_length is None:
- filesize = os.fstat(fileno).st_size
- nbytes = filesize - offset
- else:
- nbytes = self.response_length
- except (OSError, io.UnsupportedOperation):
- return False
-
- self.send_headers()
-
- if self.is_chunked():
- chunk_size = "%X\r\n" % nbytes
- self.sock.sendall(chunk_size.encode('utf-8'))
-
- self.sock.sendfile(respiter.filelike, count=nbytes)
-
- if self.is_chunked():
- self.sock.sendall(b"\r\n")
-
- os.lseek(fileno, offset, os.SEEK_SET)
-
- return True
-
- def write_file(self, respiter):
- if not self.sendfile(respiter):
- for item in respiter:
- self.write(item)
-
- def close(self):
- if not self.headers_sent:
- self.send_headers()
- if self.chunked:
- util.write_chunk(self.sock, b"")
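
`FileWrapper` above implements the old `__getitem__` iteration protocol, and `base_environ()` exposes it to applications as `wsgi.file_wrapper`. A minimal sketch of an application consuming it; the `app` function is hypothetical, and the fallback iterator covers servers that offer no wrapper:

```python
import io

def app(environ, start_response):
    # Hypothetical WSGI app; the environ keys match base_environ() above.
    body = io.BytesIO(b"hello from a file-like object")
    start_response("200 OK", [("Content-Type", "text/plain")])
    wrapper = environ.get("wsgi.file_wrapper")
    if wrapper is not None:
        # Gunicorn's FileWrapper reads blksize-sized chunks until EOF.
        return wrapper(body, 8192)
    # Portable fallback: iterate the file manually.
    return iter(lambda: body.read(8192), b"")
```
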
diff --git a/env/lib/python3.9/site-packages/gunicorn/instrument/__init__.py b/env/lib/python3.9/site-packages/gunicorn/instrument/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/gunicorn/instrument/statsd.py b/env/lib/python3.9/site-packages/gunicorn/instrument/statsd.py
deleted file mode 100644
index afbfd7b..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/instrument/statsd.py
+++ /dev/null
@@ -1,130 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-"Bare-bones implementation of statsD's protocol, client-side"
-
-import logging
-import socket
-from re import sub
-
-from gunicorn.glogging import Logger
-
-# Instrumentation constants
-METRIC_VAR = "metric"
-VALUE_VAR = "value"
-MTYPE_VAR = "mtype"
-GAUGE_TYPE = "gauge"
-COUNTER_TYPE = "counter"
-HISTOGRAM_TYPE = "histogram"
-
-
-class Statsd(Logger):
- """statsD-based instrumentation, that passes as a logger
- """
- def __init__(self, cfg):
- """host, port: statsD server
- """
- Logger.__init__(self, cfg)
- self.prefix = sub(r"^(.+[^.]+)\.*$", "\\g<1>.", cfg.statsd_prefix)
- try:
- host, port = cfg.statsd_host
- self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
- self.sock.connect((host, int(port)))
- except Exception:
- self.sock = None
-
- self.dogstatsd_tags = cfg.dogstatsd_tags
-
- # Log errors and warnings
- def critical(self, msg, *args, **kwargs):
- Logger.critical(self, msg, *args, **kwargs)
- self.increment("gunicorn.log.critical", 1)
-
- def error(self, msg, *args, **kwargs):
- Logger.error(self, msg, *args, **kwargs)
- self.increment("gunicorn.log.error", 1)
-
- def warning(self, msg, *args, **kwargs):
- Logger.warning(self, msg, *args, **kwargs)
- self.increment("gunicorn.log.warning", 1)
-
- def exception(self, msg, *args, **kwargs):
- Logger.exception(self, msg, *args, **kwargs)
- self.increment("gunicorn.log.exception", 1)
-
- # Special treatment for info, the most common log level
- def info(self, msg, *args, **kwargs):
- self.log(logging.INFO, msg, *args, **kwargs)
-
- # skip the run-of-the-mill logs
- def debug(self, msg, *args, **kwargs):
- self.log(logging.DEBUG, msg, *args, **kwargs)
-
- def log(self, lvl, msg, *args, **kwargs):
- """Log a given statistic if metric, value and type are present
- """
- try:
- extra = kwargs.get("extra", None)
- if extra is not None:
- metric = extra.get(METRIC_VAR, None)
- value = extra.get(VALUE_VAR, None)
- typ = extra.get(MTYPE_VAR, None)
- if metric and value and typ:
- if typ == GAUGE_TYPE:
- self.gauge(metric, value)
- elif typ == COUNTER_TYPE:
- self.increment(metric, value)
- elif typ == HISTOGRAM_TYPE:
- self.histogram(metric, value)
- else:
- pass
-
- # Log to parent logger only if there is something to say
- if msg:
- Logger.log(self, lvl, msg, *args, **kwargs)
- except Exception:
- Logger.warning(self, "Failed to log to statsd", exc_info=True)
-
- # access logging
- def access(self, resp, req, environ, request_time):
- """Measure request duration
- request_time is a datetime.timedelta
- """
- Logger.access(self, resp, req, environ, request_time)
- duration_in_ms = request_time.seconds * 1000 + float(request_time.microseconds) / 10 ** 3
- status = resp.status
- if isinstance(status, str):
- status = int(status.split(None, 1)[0])
- self.histogram("gunicorn.request.duration", duration_in_ms)
- self.increment("gunicorn.requests", 1)
- self.increment("gunicorn.request.status.%d" % status, 1)
-
- # statsD methods
- # you can use those directly if you want
- def gauge(self, name, value):
- self._sock_send("{0}{1}:{2}|g".format(self.prefix, name, value))
-
- def increment(self, name, value, sampling_rate=1.0):
- self._sock_send("{0}{1}:{2}|c|@{3}".format(self.prefix, name, value, sampling_rate))
-
- def decrement(self, name, value, sampling_rate=1.0):
- self._sock_send("{0}{1}:-{2}|c|@{3}".format(self.prefix, name, value, sampling_rate))
-
- def histogram(self, name, value):
- self._sock_send("{0}{1}:{2}|ms".format(self.prefix, name, value))
-
- def _sock_send(self, msg):
- try:
- if isinstance(msg, str):
- msg = msg.encode("ascii")
-
- # http://docs.datadoghq.com/guides/dogstatsd/#datagram-format
- if self.dogstatsd_tags:
- msg = msg + b"|#" + self.dogstatsd_tags.encode('ascii')
-
- if self.sock:
- self.sock.send(msg)
- except Exception:
- Logger.warning(self, "Error sending message to statsd", exc_info=True)
diff --git a/env/lib/python3.9/site-packages/gunicorn/pidfile.py b/env/lib/python3.9/site-packages/gunicorn/pidfile.py
deleted file mode 100644
index 585b02a..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/pidfile.py
+++ /dev/null
@@ -1,86 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import errno
-import os
-import tempfile
-
-
-class Pidfile(object):
- """\
- Manage a PID file. If a specific name is provided
- it and '"%s.oldpid" % name' will be used. Otherwise
- we create a temp file using os.mkstemp.
- """
-
- def __init__(self, fname):
- self.fname = fname
- self.pid = None
-
- def create(self, pid):
- oldpid = self.validate()
- if oldpid:
- if oldpid == os.getpid():
- return
- msg = "Already running on PID %s (or pid file '%s' is stale)"
- raise RuntimeError(msg % (oldpid, self.fname))
-
- self.pid = pid
-
- # Write pidfile
- fdir = os.path.dirname(self.fname)
- if fdir and not os.path.isdir(fdir):
- raise RuntimeError("%s doesn't exist. Can't create pidfile." % fdir)
- fd, fname = tempfile.mkstemp(dir=fdir)
- os.write(fd, ("%s\n" % self.pid).encode('utf-8'))
- if self.fname:
- os.rename(fname, self.fname)
- else:
- self.fname = fname
- os.close(fd)
-
- # set permissions to -rw-r--r--
- os.chmod(self.fname, 420)
-
- def rename(self, path):
- self.unlink()
- self.fname = path
- self.create(self.pid)
-
- def unlink(self):
- """ delete pidfile"""
- try:
- with open(self.fname, "r") as f:
- pid1 = int(f.read() or 0)
-
- if pid1 == self.pid:
- os.unlink(self.fname)
- except Exception:
- pass
-
- def validate(self):
- """ Validate pidfile and make it stale if needed"""
- if not self.fname:
- return
- try:
- with open(self.fname, "r") as f:
- try:
- wpid = int(f.read())
- except ValueError:
- return
-
- try:
- os.kill(wpid, 0)
- return wpid
- except OSError as e:
- if e.args[0] == errno.EPERM:
- return wpid
- if e.args[0] == errno.ESRCH:
- return
- raise
- except IOError as e:
- if e.args[0] == errno.ENOENT:
- return
- raise
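
`validate()` above probes liveness with signal 0, which performs permission and existence checks without delivering anything. A standalone sketch of that probe:

```python
import errno
import os

def pid_alive(pid):
    """Return True if `pid` refers to a running process."""
    try:
        os.kill(pid, 0)  # signal 0: error checking only, nothing delivered
    except OSError as e:
        if e.errno == errno.ESRCH:  # no such process -> pidfile is stale
            return False
        if e.errno == errno.EPERM:  # alive, but owned by another user
            return True
        raise
    return True

print(pid_alive(os.getpid()))  # True: the current process is alive
```
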
diff --git a/env/lib/python3.9/site-packages/gunicorn/reloader.py b/env/lib/python3.9/site-packages/gunicorn/reloader.py
deleted file mode 100644
index c196478..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/reloader.py
+++ /dev/null
@@ -1,132 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-# pylint: disable=no-else-continue
-
-import os
-import os.path
-import re
-import sys
-import time
-import threading
-
-COMPILED_EXT_RE = re.compile(r'py[co]$')
-
-
-class Reloader(threading.Thread):
- def __init__(self, extra_files=None, interval=1, callback=None):
- super().__init__()
- self.setDaemon(True)
- self._extra_files = set(extra_files or ())
- self._interval = interval
- self._callback = callback
-
- def add_extra_file(self, filename):
- self._extra_files.add(filename)
-
- def get_files(self):
- fnames = [
- COMPILED_EXT_RE.sub('py', module.__file__)
- for module in tuple(sys.modules.values())
- if getattr(module, '__file__', None)
- ]
-
- fnames.extend(self._extra_files)
-
- return fnames
-
- def run(self):
- mtimes = {}
- while True:
- for filename in self.get_files():
- try:
- mtime = os.stat(filename).st_mtime
- except OSError:
- continue
- old_time = mtimes.get(filename)
- if old_time is None:
- mtimes[filename] = mtime
- continue
- elif mtime > old_time:
- if self._callback:
- self._callback(filename)
- time.sleep(self._interval)
-
-
-has_inotify = False
-if sys.platform.startswith('linux'):
- try:
- from inotify.adapters import Inotify
- import inotify.constants
- has_inotify = True
- except ImportError:
- pass
-
-
-if has_inotify:
-
- class InotifyReloader(threading.Thread):
- event_mask = (inotify.constants.IN_CREATE | inotify.constants.IN_DELETE
- | inotify.constants.IN_DELETE_SELF | inotify.constants.IN_MODIFY
- | inotify.constants.IN_MOVE_SELF | inotify.constants.IN_MOVED_FROM
- | inotify.constants.IN_MOVED_TO)
-
- def __init__(self, extra_files=None, callback=None):
- super().__init__()
- self.setDaemon(True)
- self._callback = callback
- self._dirs = set()
- self._watcher = Inotify()
-
- for extra_file in extra_files:
- self.add_extra_file(extra_file)
-
- def add_extra_file(self, filename):
- dirname = os.path.dirname(filename)
-
- if dirname in self._dirs:
- return
-
- self._watcher.add_watch(dirname, mask=self.event_mask)
- self._dirs.add(dirname)
-
- def get_dirs(self):
- fnames = [
- os.path.dirname(os.path.abspath(COMPILED_EXT_RE.sub('py', module.__file__)))
- for module in tuple(sys.modules.values())
- if getattr(module, '__file__', None)
- ]
-
- return set(fnames)
-
- def run(self):
- self._dirs = self.get_dirs()
-
- for dirname in self._dirs:
- if os.path.isdir(dirname):
- self._watcher.add_watch(dirname, mask=self.event_mask)
-
- for event in self._watcher.event_gen():
- if event is None:
- continue
-
- filename = event[3]
-
- self._callback(filename)
-
-else:
-
- class InotifyReloader(object):
- def __init__(self, callback=None):
- raise ImportError('You must have the inotify module installed to '
- 'use the inotify reloader')
-
-
-preferred_reloader = InotifyReloader if has_inotify else Reloader
-
-reloader_engines = {
- 'auto': preferred_reloader,
- 'poll': Reloader,
- 'inotify': InotifyReloader,
-}
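
A minimal sketch of driving the polling `Reloader` above directly; in gunicorn the callback exits the worker, here it only reports the change, and the extra file path is an assumption:

```python
import time

def on_change(filename):
    # gunicorn's callback logs and exits the worker; this one just reports.
    print("modified:", filename)

# Poll imported modules plus one assumed config file, once per second.
reloader = Reloader(extra_files=["/tmp/app.cfg"], interval=1,
                    callback=on_change)
reloader.start()  # daemon thread; dies with the main thread
time.sleep(3)     # touch a watched file in the meantime to see the callback
```
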
diff --git a/env/lib/python3.9/site-packages/gunicorn/sock.py b/env/lib/python3.9/site-packages/gunicorn/sock.py
deleted file mode 100644
index d458677..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/sock.py
+++ /dev/null
@@ -1,212 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import errno
-import os
-import socket
-import stat
-import sys
-import time
-
-from gunicorn import util
-
-
-class BaseSocket(object):
-
- def __init__(self, address, conf, log, fd=None):
- self.log = log
- self.conf = conf
-
- self.cfg_addr = address
- if fd is None:
- sock = socket.socket(self.FAMILY, socket.SOCK_STREAM)
- bound = False
- else:
- sock = socket.fromfd(fd, self.FAMILY, socket.SOCK_STREAM)
- os.close(fd)
- bound = True
-
- self.sock = self.set_options(sock, bound=bound)
-
- def __str__(self):
- return "" % self.sock.fileno()
-
- def __getattr__(self, name):
- return getattr(self.sock, name)
-
- def set_options(self, sock, bound=False):
- sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
- if (self.conf.reuse_port
- and hasattr(socket, 'SO_REUSEPORT')): # pragma: no cover
- try:
- sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
- except socket.error as err:
- if err.errno not in (errno.ENOPROTOOPT, errno.EINVAL):
- raise
- if not bound:
- self.bind(sock)
- sock.setblocking(0)
-
- # make sure that the socket can be inherited
- if hasattr(sock, "set_inheritable"):
- sock.set_inheritable(True)
-
- sock.listen(self.conf.backlog)
- return sock
-
- def bind(self, sock):
- sock.bind(self.cfg_addr)
-
- def close(self):
- if self.sock is None:
- return
-
- try:
- self.sock.close()
- except socket.error as e:
- self.log.info("Error while closing socket %s", str(e))
-
- self.sock = None
-
-
-class TCPSocket(BaseSocket):
-
- FAMILY = socket.AF_INET
-
- def __str__(self):
- if self.conf.is_ssl:
- scheme = "https"
- else:
- scheme = "http"
-
- addr = self.sock.getsockname()
- return "%s://%s:%d" % (scheme, addr[0], addr[1])
-
- def set_options(self, sock, bound=False):
- sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
- return super().set_options(sock, bound=bound)
-
-
-class TCP6Socket(TCPSocket):
-
- FAMILY = socket.AF_INET6
-
- def __str__(self):
- (host, port, _, _) = self.sock.getsockname()
- return "http://[%s]:%d" % (host, port)
-
-
-class UnixSocket(BaseSocket):
-
- FAMILY = socket.AF_UNIX
-
- def __init__(self, addr, conf, log, fd=None):
- if fd is None:
- try:
- st = os.stat(addr)
- except OSError as e:
- if e.args[0] != errno.ENOENT:
- raise
- else:
- if stat.S_ISSOCK(st.st_mode):
- os.remove(addr)
- else:
- raise ValueError("%r is not a socket" % addr)
- super().__init__(addr, conf, log, fd=fd)
-
- def __str__(self):
- return "unix:%s" % self.cfg_addr
-
- def bind(self, sock):
- old_umask = os.umask(self.conf.umask)
- sock.bind(self.cfg_addr)
- util.chown(self.cfg_addr, self.conf.uid, self.conf.gid)
- os.umask(old_umask)
-
-
-def _sock_type(addr):
- if isinstance(addr, tuple):
- if util.is_ipv6(addr[0]):
- sock_type = TCP6Socket
- else:
- sock_type = TCPSocket
- elif isinstance(addr, (str, bytes)):
- sock_type = UnixSocket
- else:
- raise TypeError("Unable to create socket from: %r" % addr)
- return sock_type
-
-
-def create_sockets(conf, log, fds=None):
- """
- Create a new socket for the configured addresses or file descriptors.
-
- If a configured address is a tuple then a TCP socket is created.
- If it is a string, a Unix socket is created. Otherwise, a TypeError is
- raised.
- """
- listeners = []
-
- # get it only once
- addr = conf.address
- fdaddr = [bind for bind in addr if isinstance(bind, int)]
- if fds:
- fdaddr += list(fds)
- laddr = [bind for bind in addr if not isinstance(bind, int)]
-
- # check ssl config early to raise the error on startup
-    # only the certfile is needed since it can contain the keyfile
- if conf.certfile and not os.path.exists(conf.certfile):
- raise ValueError('certfile "%s" does not exist' % conf.certfile)
-
- if conf.keyfile and not os.path.exists(conf.keyfile):
- raise ValueError('keyfile "%s" does not exist' % conf.keyfile)
-
- # sockets are already bound
- if fdaddr:
- for fd in fdaddr:
- sock = socket.fromfd(fd, socket.AF_UNIX, socket.SOCK_STREAM)
- sock_name = sock.getsockname()
- sock_type = _sock_type(sock_name)
- listener = sock_type(sock_name, conf, log, fd=fd)
- listeners.append(listener)
-
- return listeners
-
-    # no sockets are bound, first initialization of gunicorn in this env.
- for addr in laddr:
- sock_type = _sock_type(addr)
- sock = None
- for i in range(5):
- try:
- sock = sock_type(addr, conf, log)
- except socket.error as e:
- if e.args[0] == errno.EADDRINUSE:
- log.error("Connection in use: %s", str(addr))
- if e.args[0] == errno.EADDRNOTAVAIL:
- log.error("Invalid address: %s", str(addr))
- if i < 5:
- msg = "connection to {addr} failed: {error}"
- log.debug(msg.format(addr=str(addr), error=str(e)))
- log.error("Retrying in 1 second.")
- time.sleep(1)
- else:
- break
-
- if sock is None:
- log.error("Can't connect to %s", str(addr))
- sys.exit(1)
-
- listeners.append(sock)
-
- return listeners
-
-
-def close_sockets(listeners, unlink=True):
- for sock in listeners:
- sock_name = sock.getsockname()
- sock.close()
- if unlink and _sock_type(sock_name) is UnixSocket:
- os.unlink(sock_name)
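
A small sketch of the address-to-class dispatch `_sock_type()` above performs, using the classes from this file:

```python
# Each configured bind address maps onto exactly one listener class.
for addr in [("127.0.0.1", 8000),    # IPv4 tuple   -> TCPSocket
             ("::1", 8000),          # IPv6 tuple   -> TCP6Socket
             "/tmp/gunicorn.sock"]:  # string path  -> UnixSocket
    print(addr, "->", _sock_type(addr).__name__)
```
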
diff --git a/env/lib/python3.9/site-packages/gunicorn/systemd.py b/env/lib/python3.9/site-packages/gunicorn/systemd.py
deleted file mode 100644
index cea4822..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/systemd.py
+++ /dev/null
@@ -1,77 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import os
-import socket
-
-SD_LISTEN_FDS_START = 3
-
-
-def listen_fds(unset_environment=True):
- """
- Get the number of sockets inherited from systemd socket activation.
-
- :param unset_environment: clear systemd environment variables unless False
- :type unset_environment: bool
- :return: the number of sockets to inherit from systemd socket activation
- :rtype: int
-
- Returns zero immediately if $LISTEN_PID is not set to the current pid.
- Otherwise, returns the number of systemd activation sockets specified by
- $LISTEN_FDS.
-
- When $LISTEN_PID matches the current pid, unsets the environment variables
- unless the ``unset_environment`` flag is ``False``.
-
- .. note::
- Unlike the sd_listen_fds C function, this implementation does not set
- the FD_CLOEXEC flag because the gunicorn arbiter never needs to do this.
-
- .. seealso::
-        `<https://www.freedesktop.org/software/systemd/man/sd_listen_fds.html>`_
-
- """
- fds = int(os.environ.get('LISTEN_FDS', 0))
- listen_pid = int(os.environ.get('LISTEN_PID', 0))
-
- if listen_pid != os.getpid():
- return 0
-
- if unset_environment:
- os.environ.pop('LISTEN_PID', None)
- os.environ.pop('LISTEN_FDS', None)
-
- return fds
-
-
-def sd_notify(state, logger, unset_environment=False):
- """Send a notification to systemd. state is a string; see
- the man page of sd_notify (http://www.freedesktop.org/software/systemd/man/sd_notify.html)
- for a description of the allowable values.
-
- If the unset_environment parameter is True, sd_notify() will unset
- the $NOTIFY_SOCKET environment variable before returning (regardless of
- whether the function call itself succeeded or not). Further calls to
- sd_notify() will then fail, but the variable is no longer inherited by
- child processes.
- """
-
-
- addr = os.environ.get('NOTIFY_SOCKET')
- if addr is None:
- # not run in a service, just a noop
- return
- try:
- sock = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM | socket.SOCK_CLOEXEC)
- if addr[0] == '@':
- addr = '\0' + addr[1:]
- sock.connect(addr)
- sock.sendall(state.encode('utf-8'))
- except:
- logger.debug("Exception while invoking sd_notify()", exc_info=True)
- finally:
- if unset_environment:
- os.environ.pop('NOTIFY_SOCKET')
- sock.close()
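
`listen_fds()` pairs with `SD_LISTEN_FDS_START`: systemd passes activated sockets as consecutive descriptors starting at 3. A sketch of rebuilding socket objects from them; outside a socket-activated service `listen_fds()` returns 0 and the list stays empty:

```python
import socket

count = listen_fds(unset_environment=True)
listeners = [
    # family/type are auto-detected from the inherited descriptor
    socket.socket(fileno=SD_LISTEN_FDS_START + i)
    for i in range(count)
]
```
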
diff --git a/env/lib/python3.9/site-packages/gunicorn/util.py b/env/lib/python3.9/site-packages/gunicorn/util.py
deleted file mode 100644
index a821e35..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/util.py
+++ /dev/null
@@ -1,639 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-import ast
-import email.utils
-import errno
-import fcntl
-import html
-import importlib
-import inspect
-import io
-import logging
-import os
-import pwd
-import random
-import re
-import socket
-import sys
-import textwrap
-import time
-import traceback
-import warnings
-
-import pkg_resources
-
-from gunicorn.errors import AppImportError
-from gunicorn.workers import SUPPORTED_WORKERS
-import urllib.parse
-
-REDIRECT_TO = getattr(os, 'devnull', '/dev/null')
-
-# Server and Date aren't technically hop-by-hop
-# headers, but they are in the purview of the
-# origin server which the WSGI spec says we should
-# act like. So we drop them and add our own.
-#
-# In the future, concatenation server header values
-# might be better, but nothing else does it and
-# dropping them is easier.
-hop_headers = set("""
- connection keep-alive proxy-authenticate proxy-authorization
- te trailers transfer-encoding upgrade
- server date
- """.split())
-
-try:
- from setproctitle import setproctitle
-
- def _setproctitle(title):
- setproctitle("gunicorn: %s" % title)
-except ImportError:
- def _setproctitle(title):
- pass
-
-
-def load_class(uri, default="gunicorn.workers.sync.SyncWorker",
- section="gunicorn.workers"):
- if inspect.isclass(uri):
- return uri
- if uri.startswith("egg:"):
- # uses entry points
- entry_str = uri.split("egg:")[1]
- try:
- dist, name = entry_str.rsplit("#", 1)
- except ValueError:
- dist = entry_str
- name = default
-
- try:
- return pkg_resources.load_entry_point(dist, section, name)
- except Exception:
- exc = traceback.format_exc()
- msg = "class uri %r invalid or not found: \n\n[%s]"
- raise RuntimeError(msg % (uri, exc))
- else:
- components = uri.split('.')
- if len(components) == 1:
- while True:
- if uri.startswith("#"):
- uri = uri[1:]
-
- if uri in SUPPORTED_WORKERS:
- components = SUPPORTED_WORKERS[uri].split(".")
- break
-
- try:
- return pkg_resources.load_entry_point(
- "gunicorn", section, uri
- )
- except Exception:
- exc = traceback.format_exc()
- msg = "class uri %r invalid or not found: \n\n[%s]"
- raise RuntimeError(msg % (uri, exc))
-
- klass = components.pop(-1)
-
- try:
- mod = importlib.import_module('.'.join(components))
- except:
- exc = traceback.format_exc()
- msg = "class uri %r invalid or not found: \n\n[%s]"
- raise RuntimeError(msg % (uri, exc))
- return getattr(mod, klass)
-
-
-positionals = (
- inspect.Parameter.POSITIONAL_ONLY,
- inspect.Parameter.POSITIONAL_OR_KEYWORD,
-)
-
-
-def get_arity(f):
- sig = inspect.signature(f)
- arity = 0
-
- for param in sig.parameters.values():
- if param.kind in positionals:
- arity += 1
-
- return arity
-
-
-def get_username(uid):
- """ get the username for a user id"""
- return pwd.getpwuid(uid).pw_name
-
-
-def set_owner_process(uid, gid, initgroups=False):
- """ set user and group of workers processes """
-
- if gid:
- if uid:
- try:
- username = get_username(uid)
- except KeyError:
- initgroups = False
-
- # versions of python < 2.6.2 don't manage unsigned int for
- # groups like on osx or fedora
- gid = abs(gid) & 0x7FFFFFFF
-
- if initgroups:
- os.initgroups(username, gid)
- elif gid != os.getgid():
- os.setgid(gid)
-
- if uid:
- os.setuid(uid)
-
-
-def chown(path, uid, gid):
- os.chown(path, uid, gid)
-
-
-if sys.platform.startswith("win"):
- def _waitfor(func, pathname, waitall=False):
- # Perform the operation
- func(pathname)
- # Now setup the wait loop
- if waitall:
- dirname = pathname
- else:
- dirname, name = os.path.split(pathname)
- dirname = dirname or '.'
- # Check for `pathname` to be removed from the filesystem.
- # The exponential backoff of the timeout amounts to a total
- # of ~1 second after which the deletion is probably an error
- # anyway.
-        # Testing on an i7@4.3GHz shows that usually only 1 iteration is
- # required when contention occurs.
- timeout = 0.001
- while timeout < 1.0:
- # Note we are only testing for the existence of the file(s) in
- # the contents of the directory regardless of any security or
- # access rights. If we have made it this far, we have sufficient
- # permissions to do that much using Python's equivalent of the
- # Windows API FindFirstFile.
- # Other Windows APIs can fail or give incorrect results when
- # dealing with files that are pending deletion.
- L = os.listdir(dirname)
-            if not (L if waitall else name in L):
- return
- # Increase the timeout and try again
- time.sleep(timeout)
- timeout *= 2
- warnings.warn('tests may fail, delete still pending for ' + pathname,
- RuntimeWarning, stacklevel=4)
-
- def _unlink(filename):
- _waitfor(os.unlink, filename)
-else:
- _unlink = os.unlink
-
-
-def unlink(filename):
- try:
- _unlink(filename)
- except OSError as error:
- # The filename need not exist.
- if error.errno not in (errno.ENOENT, errno.ENOTDIR):
- raise
-
-
-def is_ipv6(addr):
- try:
- socket.inet_pton(socket.AF_INET6, addr)
- except socket.error: # not a valid address
- return False
- except ValueError: # ipv6 not supported on this platform
- return False
- return True
-
-
-def parse_address(netloc, default_port='8000'):
- if re.match(r'unix:(//)?', netloc):
- return re.split(r'unix:(//)?', netloc)[-1]
-
- if netloc.startswith("fd://"):
- fd = netloc[5:]
- try:
- return int(fd)
- except ValueError:
- raise RuntimeError("%r is not a valid file descriptor." % fd) from None
-
- if netloc.startswith("tcp://"):
- netloc = netloc.split("tcp://")[1]
- host, port = netloc, default_port
-
- if '[' in netloc and ']' in netloc:
- host = netloc.split(']')[0][1:]
- port = (netloc.split(']:') + [default_port])[1]
- elif ':' in netloc:
- host, port = (netloc.split(':') + [default_port])[:2]
- elif netloc == "":
- host, port = "0.0.0.0", default_port
-
- try:
- port = int(port)
- except ValueError:
- raise RuntimeError("%r is not a valid port number." % port)
-
- return host.lower(), port
-
-
-def close_on_exec(fd):
- flags = fcntl.fcntl(fd, fcntl.F_GETFD)
- flags |= fcntl.FD_CLOEXEC
- fcntl.fcntl(fd, fcntl.F_SETFD, flags)
-
-
-def set_non_blocking(fd):
- flags = fcntl.fcntl(fd, fcntl.F_GETFL) | os.O_NONBLOCK
- fcntl.fcntl(fd, fcntl.F_SETFL, flags)
-
-
-def close(sock):
- try:
- sock.close()
- except socket.error:
- pass
-
-
-try:
- from os import closerange
-except ImportError:
- def closerange(fd_low, fd_high):
- # Iterate through and close all file descriptors.
- for fd in range(fd_low, fd_high):
- try:
- os.close(fd)
- except OSError: # ERROR, fd wasn't open to begin with (ignored)
- pass
-
-
-def write_chunk(sock, data):
- if isinstance(data, str):
- data = data.encode('utf-8')
- chunk_size = "%X\r\n" % len(data)
- chunk = b"".join([chunk_size.encode('utf-8'), data, b"\r\n"])
- sock.sendall(chunk)
-
-
-def write(sock, data, chunked=False):
- if chunked:
- return write_chunk(sock, data)
- sock.sendall(data)
-
-
-def write_nonblock(sock, data, chunked=False):
- timeout = sock.gettimeout()
- if timeout != 0.0:
- try:
- sock.setblocking(0)
- return write(sock, data, chunked)
- finally:
- sock.setblocking(1)
- else:
- return write(sock, data, chunked)
-
-
-def write_error(sock, status_int, reason, mesg):
- html_error = textwrap.dedent("""\
-
-
- %(reason)s
-
-
-
%(reason)s
- %(mesg)s
-
-
- """) % {"reason": reason, "mesg": html.escape(mesg)}
-
- http = textwrap.dedent("""\
- HTTP/1.1 %s %s\r
- Connection: close\r
- Content-Type: text/html\r
- Content-Length: %d\r
- \r
- %s""") % (str(status_int), reason, len(html_error), html_error)
- write_nonblock(sock, http.encode('latin1'))
-
-
-def _called_with_wrong_args(f):
- """Check whether calling a function raised a ``TypeError`` because
- the call failed or because something in the function raised the
- error.
-
- :param f: The function that was called.
- :return: ``True`` if the call failed.
- """
- tb = sys.exc_info()[2]
-
- try:
- while tb is not None:
- if tb.tb_frame.f_code is f.__code__:
- # In the function, it was called successfully.
- return False
-
- tb = tb.tb_next
-
- # Didn't reach the function.
- return True
- finally:
- # Delete tb to break a circular reference in Python 2.
- # https://docs.python.org/2/library/sys.html#sys.exc_info
- del tb
-
-
-def import_app(module):
- parts = module.split(":", 1)
- if len(parts) == 1:
- obj = "application"
- else:
- module, obj = parts[0], parts[1]
-
- try:
- mod = importlib.import_module(module)
- except ImportError:
- if module.endswith(".py") and os.path.exists(module):
- msg = "Failed to find application, did you mean '%s:%s'?"
- raise ImportError(msg % (module.rsplit(".", 1)[0], obj))
- raise
-
- # Parse obj as a single expression to determine if it's a valid
- # attribute name or function call.
- try:
- expression = ast.parse(obj, mode="eval").body
- except SyntaxError:
- raise AppImportError(
- "Failed to parse %r as an attribute name or function call." % obj
- )
-
- if isinstance(expression, ast.Name):
- name = expression.id
- args = kwargs = None
- elif isinstance(expression, ast.Call):
- # Ensure the function name is an attribute name only.
- if not isinstance(expression.func, ast.Name):
- raise AppImportError("Function reference must be a simple name: %r" % obj)
-
- name = expression.func.id
-
- # Parse the positional and keyword arguments as literals.
- try:
- args = [ast.literal_eval(arg) for arg in expression.args]
- kwargs = {kw.arg: ast.literal_eval(kw.value) for kw in expression.keywords}
- except ValueError:
- # literal_eval gives cryptic error messages, show a generic
- # message with the full expression instead.
- raise AppImportError(
- "Failed to parse arguments as literal values: %r" % obj
- )
- else:
- raise AppImportError(
- "Failed to parse %r as an attribute name or function call." % obj
- )
-
- is_debug = logging.root.level == logging.DEBUG
- try:
- app = getattr(mod, name)
- except AttributeError:
- if is_debug:
- traceback.print_exception(*sys.exc_info())
- raise AppImportError("Failed to find attribute %r in %r." % (name, module))
-
- # If the expression was a function call, call the retrieved object
- # to get the real application.
- if args is not None:
- try:
- app = app(*args, **kwargs)
- except TypeError as e:
- # If the TypeError was due to bad arguments to the factory
- # function, show Python's nice error message without a
- # traceback.
- if _called_with_wrong_args(app):
- raise AppImportError(
- "".join(traceback.format_exception_only(TypeError, e)).strip()
- )
-
- # Otherwise it was raised from within the function, show the
- # full traceback.
- raise
-
- if app is None:
- raise AppImportError("Failed to find application object: %r" % obj)
-
- if not callable(app):
- raise AppImportError("Application object must be callable.")
- return app
-
-
-def getcwd():
- # get current path, try to use PWD env first
- try:
- a = os.stat(os.environ['PWD'])
- b = os.stat(os.getcwd())
- if a.st_ino == b.st_ino and a.st_dev == b.st_dev:
- cwd = os.environ['PWD']
- else:
- cwd = os.getcwd()
- except Exception:
- cwd = os.getcwd()
- return cwd
-
-
-def http_date(timestamp=None):
- """Return the current date and time formatted for a message header."""
- if timestamp is None:
- timestamp = time.time()
- s = email.utils.formatdate(timestamp, localtime=False, usegmt=True)
- return s
-
-
-def is_hoppish(header):
- return header.lower().strip() in hop_headers
-
-
-def daemonize(enable_stdio_inheritance=False):
- """\
- Standard daemonization of a process.
- http://www.svbug.com/documentation/comp.unix.programmer-FAQ/faq_2.html#SEC16
- """
- if 'GUNICORN_FD' not in os.environ:
- if os.fork():
- os._exit(0)
- os.setsid()
-
- if os.fork():
- os._exit(0)
-
- os.umask(0o22)
-
-        # In both of the following cases, any file descriptors above
-        # stdin, stdout and stderr are left untouched. The inheritance
-        # option simply allows one to have output go to a file specified
-        # by way of shell redirection when not wanting to use the
-        # --error-log option.
-
- if not enable_stdio_inheritance:
- # Remap all of stdin, stdout and stderr on to
- # /dev/null. The expectation is that users have
- # specified the --error-log option.
-
- closerange(0, 3)
-
- fd_null = os.open(REDIRECT_TO, os.O_RDWR)
-
- if fd_null != 0:
- os.dup2(fd_null, 0)
-
- os.dup2(fd_null, 1)
- os.dup2(fd_null, 2)
-
- else:
- fd_null = os.open(REDIRECT_TO, os.O_RDWR)
-
- # Always redirect stdin to /dev/null as we would
- # never expect to need to read interactive input.
-
- if fd_null != 0:
- os.close(0)
- os.dup2(fd_null, 0)
-
- # If stdout and stderr are still connected to
- # their original file descriptors we check to see
- # if they are associated with terminal devices.
-            # When they are, we map them to /dev/null so that they
-            # are still properly detached from any controlling
-            # terminal. If not, we preserve them as they are.
- #
- # If stdin and stdout were not hooked up to the
- # original file descriptors, then all bets are
- # off and all we can really do is leave them as
- # they were.
- #
- # This will allow 'gunicorn ... > output.log 2>&1'
- # to work with stdout/stderr going to the file
- # as expected.
- #
- # Note that if using --error-log option, the log
- # file specified through shell redirection will
- # only be used up until the log file specified
- # by the option takes over. As it replaces stdout
- # and stderr at the file descriptor level, then
- # anything using stdout or stderr, including having
- # cached a reference to them, will still work.
-
- def redirect(stream, fd_expect):
- try:
- fd = stream.fileno()
- if fd == fd_expect and stream.isatty():
- os.close(fd)
- os.dup2(fd_null, fd)
- except AttributeError:
- pass
-
- redirect(sys.stdout, 1)
- redirect(sys.stderr, 2)
-
-
-def seed():
- try:
- random.seed(os.urandom(64))
- except NotImplementedError:
- random.seed('%s.%s' % (time.time(), os.getpid()))
-
-
-def check_is_writeable(path):
- try:
- f = open(path, 'a')
- except IOError as e:
- raise RuntimeError("Error: '%s' isn't writable [%r]" % (path, e))
- f.close()
-
-
-def to_bytestring(value, encoding="utf8"):
- """Converts a string argument to a byte string"""
- if isinstance(value, bytes):
- return value
- if not isinstance(value, str):
- raise TypeError('%r is not a string' % value)
-
- return value.encode(encoding)
-
-
-def has_fileno(obj):
- if not hasattr(obj, "fileno"):
- return False
-
- # check BytesIO case and maybe others
- try:
- obj.fileno()
- except (AttributeError, IOError, io.UnsupportedOperation):
- return False
-
- return True
-
-
-def warn(msg):
- print("!!!", file=sys.stderr)
-
- lines = msg.splitlines()
- for i, line in enumerate(lines):
- if i == 0:
- line = "WARNING: %s" % line
- print("!!! %s" % line, file=sys.stderr)
-
- print("!!!\n", file=sys.stderr)
- sys.stderr.flush()
-
-
-def make_fail_app(msg):
- msg = to_bytestring(msg)
-
- def app(environ, start_response):
- start_response("500 Internal Server Error", [
- ("Content-Type", "text/plain"),
- ("Content-Length", str(len(msg)))
- ])
- return [msg]
-
- return app
-
-
-def split_request_uri(uri):
- if uri.startswith("//"):
- # When the path starts with //, urlsplit considers it as a
- # relative uri while the RFC says we should consider it as abs_path
- # http://www.w3.org/Protocols/rfc2616/rfc2616-sec5.html#sec5.1.2
-        # We use a temporary dot prefix to work around this behaviour
- parts = urllib.parse.urlsplit("." + uri)
- return parts._replace(path=parts.path[1:])
-
- return urllib.parse.urlsplit(uri)
-
-
-# From six.reraise
-def reraise(tp, value, tb=None):
- try:
- if value is None:
- value = tp()
- if value.__traceback__ is not tb:
- raise value.with_traceback(tb)
- raise value
- finally:
- value = None
- tb = None
-
-
-def bytes_to_str(b):
- if isinstance(b, str):
- return b
- return str(b, 'latin1')
-
-
-def unquote_to_wsgi_str(string):
- return urllib.parse.unquote_to_bytes(string).decode('latin-1')
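
The `parse_address()` helper above accepts several bind syntaxes and returns differently shaped values for each. A few illustrative calls:

```python
for netloc in ("127.0.0.1:9000", "[::1]:9000",
               "unix:///tmp/g.sock", "fd://3", ""):
    print(repr(netloc), "->", repr(parse_address(netloc)))
# '127.0.0.1:9000'     -> ('127.0.0.1', 9000)
# '[::1]:9000'         -> ('::1', 9000)
# 'unix:///tmp/g.sock' -> '/tmp/g.sock'     (a path, not a host/port pair)
# 'fd://3'             -> 3                 (an inherited file descriptor)
# ''                   -> ('0.0.0.0', 8000) (the default port)
```
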
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/__init__.py b/env/lib/python3.9/site-packages/gunicorn/workers/__init__.py
deleted file mode 100644
index ae753e1..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/__init__.py
+++ /dev/null
@@ -1,15 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-# supported gunicorn workers.
-SUPPORTED_WORKERS = {
- "sync": "gunicorn.workers.sync.SyncWorker",
- "eventlet": "gunicorn.workers.geventlet.EventletWorker",
- "gevent": "gunicorn.workers.ggevent.GeventWorker",
- "gevent_wsgi": "gunicorn.workers.ggevent.GeventPyWSGIWorker",
- "gevent_pywsgi": "gunicorn.workers.ggevent.GeventPyWSGIWorker",
- "tornado": "gunicorn.workers.gtornado.TornadoWorker",
- "gthread": "gunicorn.workers.gthread.ThreadWorker",
-}
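
These short names are what `util.load_class()` (see util.py above) resolves when the worker-class setting is a single bare word; anything else is treated as a dotted path or an `egg:` entry point. A sketch, assuming the matching extras (e.g. gevent) are installed:

```python
from gunicorn import util

util.load_class("sync")    # -> gunicorn.workers.sync.SyncWorker
util.load_class("gevent")  # -> gunicorn.workers.ggevent.GeventWorker
util.load_class("myapp.workers.CustomWorker")  # hypothetical dotted path
```
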
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/base.py b/env/lib/python3.9/site-packages/gunicorn/workers/base.py
deleted file mode 100644
index a6d84bd..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/base.py
+++ /dev/null
@@ -1,273 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import io
-import os
-import signal
-import sys
-import time
-import traceback
-from datetime import datetime
-from random import randint
-from ssl import SSLError
-
-from gunicorn import util
-from gunicorn.http.errors import (
- ForbiddenProxyRequest, InvalidHeader,
- InvalidHeaderName, InvalidHTTPVersion,
- InvalidProxyLine, InvalidRequestLine,
- InvalidRequestMethod, InvalidSchemeHeaders,
- LimitRequestHeaders, LimitRequestLine,
-)
-from gunicorn.http.wsgi import Response, default_environ
-from gunicorn.reloader import reloader_engines
-from gunicorn.workers.workertmp import WorkerTmp
-
-
-class Worker(object):
-
- SIGNALS = [getattr(signal, "SIG%s" % x) for x in (
- "ABRT HUP QUIT INT TERM USR1 USR2 WINCH CHLD".split()
- )]
-
- PIPE = []
-
- def __init__(self, age, ppid, sockets, app, timeout, cfg, log):
- """\
- This is called pre-fork so it shouldn't do anything to the
- current process. If there's a need to make process wide
- changes you'll want to do that in ``self.init_process()``.
- """
- self.age = age
- self.pid = "[booting]"
- self.ppid = ppid
- self.sockets = sockets
- self.app = app
- self.timeout = timeout
- self.cfg = cfg
- self.booted = False
- self.aborted = False
- self.reloader = None
-
- self.nr = 0
-
- if cfg.max_requests > 0:
- jitter = randint(0, cfg.max_requests_jitter)
- self.max_requests = cfg.max_requests + jitter
- else:
- self.max_requests = sys.maxsize
-
- self.alive = True
- self.log = log
- self.tmp = WorkerTmp(cfg)
-
- def __str__(self):
- return "" % self.pid
-
- def notify(self):
- """\
- Your worker subclass must arrange to have this method called
- once every ``self.timeout`` seconds. If you fail in accomplishing
- this task, the master process will murder your workers.
- """
- self.tmp.notify()
-
- def run(self):
- """\
- This is the mainloop of a worker process. You should override
- this method in a subclass to provide the intended behaviour
- for your particular evil schemes.
- """
- raise NotImplementedError()
-
- def init_process(self):
- """\
- If you override this method in a subclass, the last statement
- in the function should be to call this method with
- super().init_process() so that the ``run()`` loop is initiated.
- """
-
-        # set environment variables
- if self.cfg.env:
- for k, v in self.cfg.env.items():
- os.environ[k] = v
-
- util.set_owner_process(self.cfg.uid, self.cfg.gid,
- initgroups=self.cfg.initgroups)
-
- # Reseed the random number generator
- util.seed()
-
- # For waking ourselves up
- self.PIPE = os.pipe()
- for p in self.PIPE:
- util.set_non_blocking(p)
- util.close_on_exec(p)
-
- # Prevent fd inheritance
- for s in self.sockets:
- util.close_on_exec(s)
- util.close_on_exec(self.tmp.fileno())
-
- self.wait_fds = self.sockets + [self.PIPE[0]]
-
- self.log.close_on_exec()
-
- self.init_signals()
-
- # start the reloader
- if self.cfg.reload:
- def changed(fname):
- self.log.info("Worker reloading: %s modified", fname)
- self.alive = False
- os.write(self.PIPE[1], b"1")
- self.cfg.worker_int(self)
- time.sleep(0.1)
- sys.exit(0)
-
- reloader_cls = reloader_engines[self.cfg.reload_engine]
- self.reloader = reloader_cls(extra_files=self.cfg.reload_extra_files,
- callback=changed)
-
- self.load_wsgi()
- if self.reloader:
- self.reloader.start()
-
- self.cfg.post_worker_init(self)
-
- # Enter main run loop
- self.booted = True
- self.run()
-
- def load_wsgi(self):
- try:
- self.wsgi = self.app.wsgi()
- except SyntaxError as e:
- if not self.cfg.reload:
- raise
-
- self.log.exception(e)
-
- # fix from PR #1228
- # storing the traceback into exc_tb will create a circular reference.
- # per https://docs.python.org/2/library/sys.html#sys.exc_info warning,
- # delete the traceback after use.
- try:
- _, exc_val, exc_tb = sys.exc_info()
- self.reloader.add_extra_file(exc_val.filename)
-
- tb_string = io.StringIO()
- traceback.print_tb(exc_tb, file=tb_string)
- self.wsgi = util.make_fail_app(tb_string.getvalue())
- finally:
- del exc_tb
-
- def init_signals(self):
- # reset signaling
- for s in self.SIGNALS:
- signal.signal(s, signal.SIG_DFL)
- # init new signaling
- signal.signal(signal.SIGQUIT, self.handle_quit)
- signal.signal(signal.SIGTERM, self.handle_exit)
- signal.signal(signal.SIGINT, self.handle_quit)
- signal.signal(signal.SIGWINCH, self.handle_winch)
- signal.signal(signal.SIGUSR1, self.handle_usr1)
- signal.signal(signal.SIGABRT, self.handle_abort)
-
- # Don't let SIGTERM and SIGUSR1 disturb active requests
- # by interrupting system calls
- signal.siginterrupt(signal.SIGTERM, False)
- signal.siginterrupt(signal.SIGUSR1, False)
-
- if hasattr(signal, 'set_wakeup_fd'):
- signal.set_wakeup_fd(self.PIPE[1])
-
- def handle_usr1(self, sig, frame):
- self.log.reopen_files()
-
- def handle_exit(self, sig, frame):
- self.alive = False
-
- def handle_quit(self, sig, frame):
- self.alive = False
- # worker_int callback
- self.cfg.worker_int(self)
- time.sleep(0.1)
- sys.exit(0)
-
- def handle_abort(self, sig, frame):
- self.alive = False
- self.cfg.worker_abort(self)
- sys.exit(1)
-
- def handle_error(self, req, client, addr, exc):
- request_start = datetime.now()
- addr = addr or ('', -1) # unix socket case
- if isinstance(exc, (
- InvalidRequestLine, InvalidRequestMethod,
- InvalidHTTPVersion, InvalidHeader, InvalidHeaderName,
- LimitRequestLine, LimitRequestHeaders,
- InvalidProxyLine, ForbiddenProxyRequest,
- InvalidSchemeHeaders,
- SSLError,
- )):
-
- status_int = 400
- reason = "Bad Request"
-
- if isinstance(exc, InvalidRequestLine):
- mesg = "Invalid Request Line '%s'" % str(exc)
- elif isinstance(exc, InvalidRequestMethod):
- mesg = "Invalid Method '%s'" % str(exc)
- elif isinstance(exc, InvalidHTTPVersion):
- mesg = "Invalid HTTP Version '%s'" % str(exc)
- elif isinstance(exc, (InvalidHeaderName, InvalidHeader,)):
- mesg = "%s" % str(exc)
- if not req and hasattr(exc, "req"):
- req = exc.req # for access log
- elif isinstance(exc, LimitRequestLine):
- mesg = "%s" % str(exc)
- elif isinstance(exc, LimitRequestHeaders):
- mesg = "Error parsing headers: '%s'" % str(exc)
- elif isinstance(exc, InvalidProxyLine):
- mesg = "'%s'" % str(exc)
- elif isinstance(exc, ForbiddenProxyRequest):
- reason = "Forbidden"
- mesg = "Request forbidden"
- status_int = 403
- elif isinstance(exc, InvalidSchemeHeaders):
- mesg = "%s" % str(exc)
- elif isinstance(exc, SSLError):
- reason = "Forbidden"
- mesg = "'%s'" % str(exc)
- status_int = 403
-
- msg = "Invalid request from ip={ip}: {error}"
- self.log.debug(msg.format(ip=addr[0], error=str(exc)))
- else:
- if hasattr(req, "uri"):
- self.log.exception("Error handling request %s", req.uri)
- status_int = 500
- reason = "Internal Server Error"
- mesg = ""
-
- if req is not None:
- request_time = datetime.now() - request_start
- environ = default_environ(req, client, self.cfg)
- environ['REMOTE_ADDR'] = addr[0]
- environ['REMOTE_PORT'] = str(addr[1])
- resp = Response(req, client, self.cfg)
- resp.status = "%s %s" % (status_int, reason)
- resp.response_length = len(mesg)
- self.log.access(resp, req, environ, request_time)
-
- try:
- util.write_error(client, status_int, reason, mesg)
- except Exception:
- self.log.debug("Failed to send error message.")
-
- def handle_winch(self, sig, fname):
- # Ignore SIGWINCH in worker. Fixes a crash on OpenBSD.
- self.log.debug("worker: SIGWINCH ignored.")
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/base_async.py b/env/lib/python3.9/site-packages/gunicorn/workers/base_async.py
deleted file mode 100644
index 73c3f6c..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/base_async.py
+++ /dev/null
@@ -1,148 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-from datetime import datetime
-import errno
-import socket
-import ssl
-import sys
-
-import gunicorn.http as http
-import gunicorn.http.wsgi as wsgi
-import gunicorn.util as util
-import gunicorn.workers.base as base
-
-ALREADY_HANDLED = object()
-
-
-class AsyncWorker(base.Worker):
-
- def __init__(self, *args, **kwargs):
- super().__init__(*args, **kwargs)
- self.worker_connections = self.cfg.worker_connections
-
- def timeout_ctx(self):
- raise NotImplementedError()
-
- def is_already_handled(self, respiter):
- # some workers will need to overload this function to raise a StopIteration
- return respiter == ALREADY_HANDLED
-
- def handle(self, listener, client, addr):
- req = None
- try:
- parser = http.RequestParser(self.cfg, client, addr)
- try:
- listener_name = listener.getsockname()
- if not self.cfg.keepalive:
- req = next(parser)
- self.handle_request(listener_name, req, client, addr)
- else:
- # keepalive loop
- proxy_protocol_info = {}
- while True:
- req = None
- with self.timeout_ctx():
- req = next(parser)
- if not req:
- break
- if req.proxy_protocol_info:
- proxy_protocol_info = req.proxy_protocol_info
- else:
- req.proxy_protocol_info = proxy_protocol_info
- self.handle_request(listener_name, req, client, addr)
- except http.errors.NoMoreData as e:
- self.log.debug("Ignored premature client disconnection. %s", e)
- except StopIteration as e:
- self.log.debug("Closing connection. %s", e)
- except ssl.SSLError:
- # pass to next try-except level
- util.reraise(*sys.exc_info())
- except EnvironmentError:
- # pass to next try-except level
- util.reraise(*sys.exc_info())
- except Exception as e:
- self.handle_error(req, client, addr, e)
- except ssl.SSLError as e:
- if e.args[0] == ssl.SSL_ERROR_EOF:
- self.log.debug("ssl connection closed")
- client.close()
- else:
- self.log.debug("Error processing SSL request.")
- self.handle_error(req, client, addr, e)
- except EnvironmentError as e:
- if e.errno not in (errno.EPIPE, errno.ECONNRESET, errno.ENOTCONN):
- self.log.exception("Socket error processing request.")
- else:
- if e.errno == errno.ECONNRESET:
- self.log.debug("Ignoring connection reset")
- elif e.errno == errno.ENOTCONN:
- self.log.debug("Ignoring socket not connected")
- else:
- self.log.debug("Ignoring EPIPE")
- except Exception as e:
- self.handle_error(req, client, addr, e)
- finally:
- util.close(client)
-
- def handle_request(self, listener_name, req, sock, addr):
- request_start = datetime.now()
- environ = {}
- resp = None
- try:
- self.cfg.pre_request(self, req)
- resp, environ = wsgi.create(req, sock, addr,
- listener_name, self.cfg)
- environ["wsgi.multithread"] = True
- self.nr += 1
- if self.nr >= self.max_requests:
- if self.alive:
- self.log.info("Autorestarting worker after current request.")
- self.alive = False
-
- if not self.alive or not self.cfg.keepalive:
- resp.force_close()
-
- respiter = self.wsgi(environ, resp.start_response)
- if self.is_already_handled(respiter):
- return False
- try:
- if isinstance(respiter, environ['wsgi.file_wrapper']):
- resp.write_file(respiter)
- else:
- for item in respiter:
- resp.write(item)
- resp.close()
- request_time = datetime.now() - request_start
- self.log.access(resp, req, environ, request_time)
- finally:
- if hasattr(respiter, "close"):
- respiter.close()
- if resp.should_close():
- raise StopIteration()
- except StopIteration:
- raise
- except EnvironmentError:
- # If the original exception was a socket.error we delegate
- # handling it to the caller (where handle() might ignore it)
- util.reraise(*sys.exc_info())
- except Exception:
- if resp and resp.headers_sent:
-                # If the response headers have already been sent, we should
-                # close the connection to indicate the error.
- self.log.exception("Error handling request")
- try:
- sock.shutdown(socket.SHUT_RDWR)
- sock.close()
- except EnvironmentError:
- pass
- raise StopIteration()
- raise
- finally:
- try:
- self.cfg.post_request(self, req, environ, resp)
- except Exception:
- self.log.exception("Exception in post_request hook")
- return True
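
Subclasses plug into the keepalive loop above through `timeout_ctx()`, which must return a context manager bounding the wait for the next request. A sketch using `contextlib.nullcontext` as a stand-in for the gevent/eventlet `Timeout` objects:

```python
import contextlib

class ToyAsyncWorker(AsyncWorker):
    def timeout_ctx(self):
        # Real workers return gevent.Timeout / eventlet.Timeout configured
        # with cfg.keepalive; nullcontext means "wait forever".
        return contextlib.nullcontext()
```
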
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/geventlet.py b/env/lib/python3.9/site-packages/gunicorn/workers/geventlet.py
deleted file mode 100644
index ffdb206..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/geventlet.py
+++ /dev/null
@@ -1,179 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-from functools import partial
-import sys
-
-try:
- import eventlet
-except ImportError:
- raise RuntimeError("eventlet worker requires eventlet 0.24.1 or higher")
-else:
- from pkg_resources import parse_version
- if parse_version(eventlet.__version__) < parse_version('0.24.1'):
- raise RuntimeError("eventlet worker requires eventlet 0.24.1 or higher")
-
-from eventlet import hubs, greenthread
-from eventlet.greenio import GreenSocket
-from eventlet.wsgi import ALREADY_HANDLED as EVENTLET_ALREADY_HANDLED
-import greenlet
-
-from gunicorn.workers.base_async import AsyncWorker
-
-
-def _eventlet_socket_sendfile(self, file, offset=0, count=None):
- # Based on the implementation in gevent which in turn is slightly
- # modified from the standard library implementation.
- if self.gettimeout() == 0:
- raise ValueError("non-blocking sockets are not supported")
- if offset:
- file.seek(offset)
- blocksize = min(count, 8192) if count else 8192
- total_sent = 0
- # localize variable access to minimize overhead
- file_read = file.read
- sock_send = self.send
- try:
- while True:
- if count:
- blocksize = min(count - total_sent, blocksize)
- if blocksize <= 0:
- break
- data = memoryview(file_read(blocksize))
- if not data:
- break # EOF
- while True:
- try:
- sent = sock_send(data)
- except BlockingIOError:
- continue
- else:
- total_sent += sent
- if sent < len(data):
- data = data[sent:]
- else:
- break
- return total_sent
- finally:
- if total_sent > 0 and hasattr(file, 'seek'):
- file.seek(offset + total_sent)
-
-
-
-def _eventlet_serve(sock, handle, concurrency):
- """
- Serve requests forever.
-
- This code is nearly identical to ``eventlet.convenience.serve`` except
- that it attempts to join the pool at the end, which allows for gunicorn
- graceful shutdowns.
- """
- pool = eventlet.greenpool.GreenPool(concurrency)
- server_gt = eventlet.greenthread.getcurrent()
-
- while True:
- try:
- conn, addr = sock.accept()
- gt = pool.spawn(handle, conn, addr)
- gt.link(_eventlet_stop, server_gt, conn)
- conn, addr, gt = None, None, None
- except eventlet.StopServe:
- sock.close()
- pool.waitall()
- return
-
-
-def _eventlet_stop(client, server, conn):
- """
- Stop a greenlet handling a request and close its connection.
-
- This code is lifted from eventlet so as not to depend on undocumented
- functions in the library.
- """
- try:
- try:
- client.wait()
- finally:
- conn.close()
- except greenlet.GreenletExit:
- pass
- except Exception:
- greenthread.kill(server, *sys.exc_info())
-
-
-def patch_sendfile():
- # As of eventlet 0.25.1, GreenSocket.sendfile doesn't exist,
- # meaning the native implementations of socket.sendfile will be used.
- # If os.sendfile exists, it will attempt to use that, failing explicitly
- # if the socket is in non-blocking mode, which the underlying
- # socket object /is/. Even the regular _sendfile_use_send will
- # fail in that way; plus, it would use the underlying socket.send which isn't
- # properly cooperative. So we have to monkey-patch a working socket.sendfile()
- # into GreenSocket; in this method, `self.send` will be the GreenSocket's
- # send method which is properly cooperative.
- if not hasattr(GreenSocket, 'sendfile'):
- GreenSocket.sendfile = _eventlet_socket_sendfile
-
-
-class EventletWorker(AsyncWorker):
-
- def patch(self):
- hubs.use_hub()
- eventlet.monkey_patch()
- patch_sendfile()
-
- def is_already_handled(self, respiter):
- if respiter == EVENTLET_ALREADY_HANDLED:
- raise StopIteration()
- return super().is_already_handled(respiter)
-
- def init_process(self):
- self.patch()
- super().init_process()
-
- def handle_quit(self, sig, frame):
- eventlet.spawn(super().handle_quit, sig, frame)
-
- def handle_usr1(self, sig, frame):
- eventlet.spawn(super().handle_usr1, sig, frame)
-
- def timeout_ctx(self):
- return eventlet.Timeout(self.cfg.keepalive or None, False)
-
- def handle(self, listener, client, addr):
- if self.cfg.is_ssl:
- client = eventlet.wrap_ssl(client, server_side=True,
- **self.cfg.ssl_options)
-
- super().handle(listener, client, addr)
-
- def run(self):
- acceptors = []
- for sock in self.sockets:
- gsock = GreenSocket(sock)
- gsock.setblocking(1)
- hfun = partial(self.handle, gsock)
- acceptor = eventlet.spawn(_eventlet_serve, gsock, hfun,
- self.worker_connections)
-
- acceptors.append(acceptor)
- eventlet.sleep(0.0)
-
- while self.alive:
- self.notify()
- eventlet.sleep(1.0)
-
- self.notify()
- try:
- with eventlet.Timeout(self.cfg.graceful_timeout) as t:
- for a in acceptors:
- a.kill(eventlet.StopServe())
- for a in acceptors:
- a.wait()
- except eventlet.Timeout as te:
- if te != t:
- raise
- for a in acceptors:
- a.kill()
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/ggevent.py b/env/lib/python3.9/site-packages/gunicorn/workers/ggevent.py
deleted file mode 100644
index 3941814..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/ggevent.py
+++ /dev/null
@@ -1,189 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import os
-import sys
-from datetime import datetime
-from functools import partial
-import time
-
-try:
- import gevent
-except ImportError:
- raise RuntimeError("gevent worker requires gevent 1.4 or higher")
-else:
- from pkg_resources import parse_version
- if parse_version(gevent.__version__) < parse_version('1.4'):
- raise RuntimeError("gevent worker requires gevent 1.4 or higher")
-
-from gevent.pool import Pool
-from gevent.server import StreamServer
-from gevent import hub, monkey, socket, pywsgi
-
-import gunicorn
-from gunicorn.http.wsgi import base_environ
-from gunicorn.workers.base_async import AsyncWorker
-
-VERSION = "gevent/%s gunicorn/%s" % (gevent.__version__, gunicorn.__version__)
-
-
-class GeventWorker(AsyncWorker):
-
- server_class = None
- wsgi_handler = None
-
- def patch(self):
- monkey.patch_all()
-
- # patch sockets: re-wrap the inherited listener FDs as gevent
- # sockets so accept() cooperates with the gevent hub
- sockets = []
- for s in self.sockets:
- sockets.append(socket.socket(s.FAMILY, socket.SOCK_STREAM,
- fileno=s.sock.fileno()))
- self.sockets = sockets
-
- def notify(self):
- super().notify()
- if self.ppid != os.getppid():
- self.log.info("Parent changed, shutting down: %s", self)
- sys.exit(0)
-
- def timeout_ctx(self):
- return gevent.Timeout(self.cfg.keepalive, False)
-
- def run(self):
- servers = []
- ssl_args = {}
-
- if self.cfg.is_ssl:
- ssl_args = dict(server_side=True, **self.cfg.ssl_options)
-
- for s in self.sockets:
- s.setblocking(1)
- pool = Pool(self.worker_connections)
- if self.server_class is not None:
- environ = base_environ(self.cfg)
- environ.update({
- "wsgi.multithread": True,
- "SERVER_SOFTWARE": VERSION,
- })
- server = self.server_class(
- s, application=self.wsgi, spawn=pool, log=self.log,
- handler_class=self.wsgi_handler, environ=environ,
- **ssl_args)
- else:
- hfun = partial(self.handle, s)
- server = StreamServer(s, handle=hfun, spawn=pool, **ssl_args)
- if self.cfg.workers > 1:
- server.max_accept = 1
-
- server.start()
- servers.append(server)
-
- while self.alive:
- self.notify()
- gevent.sleep(1.0)
-
- try:
- # Stop accepting requests
- for server in servers:
- if hasattr(server, 'close'): # gevent 1.0
- server.close()
- if hasattr(server, 'kill'): # gevent < 1.0
- server.kill()
-
- # Handle current requests until graceful_timeout
- ts = time.time()
- while time.time() - ts <= self.cfg.graceful_timeout:
- accepting = 0
- for server in servers:
- if server.pool.free_count() != server.pool.size:
- accepting += 1
-
- # if no server is accepting a connection, we can exit
- if not accepting:
- return
-
- self.notify()
- gevent.sleep(1.0)
-
- # Force kill all the active handlers
- self.log.warning("Worker graceful timeout (pid:%s)" % self.pid)
- for server in servers:
- server.stop(timeout=1)
- except Exception:
- pass
-
- def handle(self, listener, client, addr):
- # A connected socket's timeout defaults to socket.getdefaulttimeout(),
- # so force it into blocking mode here.
- client.setblocking(1)
- super().handle(listener, client, addr)
-
- def handle_request(self, listener_name, req, sock, addr):
- try:
- super().handle_request(listener_name, req, sock, addr)
- except gevent.GreenletExit:
- pass
- except SystemExit:
- pass
-
- def handle_quit(self, sig, frame):
- # Move this out of the signal handler so we can use
- # blocking calls. See #1126
- gevent.spawn(super().handle_quit, sig, frame)
-
- def handle_usr1(self, sig, frame):
- # Make the gevent workers handle the usr1 signal
- # by deferring to a new greenlet. See #1645
- gevent.spawn(super().handle_usr1, sig, frame)
-
- def init_process(self):
- self.patch()
- hub.reinit()
- super().init_process()
-
-
-class GeventResponse(object):
-
- status = None
- headers = None
- sent = None
-
- def __init__(self, status, headers, clength):
- self.status = status
- self.headers = headers
- self.sent = clength
-
-
-class PyWSGIHandler(pywsgi.WSGIHandler):
-
- def log_request(self):
- start = datetime.fromtimestamp(self.time_start)
- finish = datetime.fromtimestamp(self.time_finish)
- response_time = finish - start
- resp_headers = getattr(self, 'response_headers', {})
- resp = GeventResponse(self.status, resp_headers, self.response_length)
- if hasattr(self, 'headers'):
- req_headers = self.headers.items()
- else:
- req_headers = []
- self.server.log.access(resp, req_headers, self.environ, response_time)
-
- def get_environ(self):
- env = super().get_environ()
- env['gunicorn.sock'] = self.socket
- env['RAW_URI'] = self.path
- return env
-
-
-class PyWSGIServer(pywsgi.WSGIServer):
- pass
-
-
-class GeventPyWSGIWorker(GeventWorker):
- "The Gevent StreamServer based workers."
- server_class = PyWSGIServer
- wsgi_handler = PyWSGIHandler
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/gthread.py b/env/lib/python3.9/site-packages/gunicorn/workers/gthread.py
deleted file mode 100644
index d531811..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/gthread.py
+++ /dev/null
@@ -1,362 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-# design:
-# A threaded worker accepts connections in the main loop, accepted
-# connections are added to the thread pool as a connection job.
-# Keepalive connections are put back in the loop waiting for an event.
-# If no event happens after the keepalive timeout, the connection is
-# closed.
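-#
-# A condensed, illustrative sketch of that loop (the names mirror the
-# real attributes defined below):
-#
-#   while alive:
-#       for key, _ in poller.select(1.0):
-#           key.data(key.fileobj)          # accept() -> TConn -> tpool.submit
-#       done, _ = futures.wait(fs, timeout=0)
-#       for fut in done:
-#           keepalive, conn = fut.result()
-#           if keepalive:
-#               _keep.append(conn)         # park until the socket is readable
-#           else:
-#               conn.close()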
-# pylint: disable=no-else-break
-
-import concurrent.futures as futures
-import errno
-import os
-import selectors
-import socket
-import ssl
-import sys
-import time
-from collections import deque
-from datetime import datetime
-from functools import partial
-from threading import RLock
-
-from . import base
-from .. import http
-from .. import util
-from ..http import wsgi
-
-
-class TConn(object):
-
- def __init__(self, cfg, sock, client, server):
- self.cfg = cfg
- self.sock = sock
- self.client = client
- self.server = server
-
- self.timeout = None
- self.parser = None
-
- # set the socket to non blocking
- self.sock.setblocking(False)
-
- def init(self):
- self.sock.setblocking(True)
- if self.parser is None:
- # wrap the socket if needed
- if self.cfg.is_ssl:
- self.sock = ssl.wrap_socket(self.sock, server_side=True,
- **self.cfg.ssl_options)
-
- # initialize the parser
- self.parser = http.RequestParser(self.cfg, self.sock, self.client)
-
- def set_timeout(self):
- # set the timeout
- self.timeout = time.time() + self.cfg.keepalive
-
- def close(self):
- util.close(self.sock)
-
-
-class ThreadWorker(base.Worker):
-
- def __init__(self, *args, **kwargs):
- super().__init__(*args, **kwargs)
- self.worker_connections = self.cfg.worker_connections
- self.max_keepalived = self.cfg.worker_connections - self.cfg.threads
- # initialise the pool
- self.tpool = None
- self.poller = None
- self._lock = None
- self.futures = deque()
- self._keep = deque()
- self.nr_conns = 0
-
- @classmethod
- def check_config(cls, cfg, log):
- max_keepalived = cfg.worker_connections - cfg.threads
-
- if max_keepalived <= 0 and cfg.keepalive:
- log.warning("No keepalived connections can be handled. " +
- "Check the number of worker connections and threads.")
-
- def init_process(self):
- self.tpool = self.get_thread_pool()
- self.poller = selectors.DefaultSelector()
- self._lock = RLock()
- super().init_process()
-
- def get_thread_pool(self):
- """Override this method to customize how the thread pool is created"""
- return futures.ThreadPoolExecutor(max_workers=self.cfg.threads)
-
- def handle_quit(self, sig, frame):
- self.alive = False
- # worker_int callback
- self.cfg.worker_int(self)
- self.tpool.shutdown(False)
- time.sleep(0.1)
- sys.exit(0)
-
- def _wrap_future(self, fs, conn):
- fs.conn = conn
- self.futures.append(fs)
- fs.add_done_callback(self.finish_request)
-
- def enqueue_req(self, conn):
- conn.init()
- # submit the connection to a worker
- fs = self.tpool.submit(self.handle, conn)
- self._wrap_future(fs, conn)
-
- def accept(self, server, listener):
- try:
- sock, client = listener.accept()
- # initialize the connection object
- conn = TConn(self.cfg, sock, client, server)
- self.nr_conns += 1
- # enqueue the job
- self.enqueue_req(conn)
- except EnvironmentError as e:
- if e.errno not in (errno.EAGAIN, errno.ECONNABORTED,
- errno.EWOULDBLOCK):
- raise
-
- def reuse_connection(self, conn, client):
- with self._lock:
- # unregister the client from the poller
- self.poller.unregister(client)
- # remove the connection from keepalive
- try:
- self._keep.remove(conn)
- except ValueError:
- # race condition
- return
-
- # submit the connection to a worker
- self.enqueue_req(conn)
-
- def murder_keepalived(self):
- now = time.time()
- while True:
- with self._lock:
- try:
- # remove the connection from the queue
- conn = self._keep.popleft()
- except IndexError:
- break
-
- delta = conn.timeout - now
- if delta > 0:
- # add the connection back to the queue
- with self._lock:
- self._keep.appendleft(conn)
- break
- else:
- self.nr_conns -= 1
- # remove the socket from the poller
- with self._lock:
- try:
- self.poller.unregister(conn.sock)
- except EnvironmentError as e:
- if e.errno != errno.EBADF:
- raise
- except KeyError:
- # already removed by the system, continue
- pass
-
- # close the socket
- conn.close()
-
- def is_parent_alive(self):
- # If our parent changed then we shut down.
- if self.ppid != os.getppid():
- self.log.info("Parent changed, shutting down: %s", self)
- return False
- return True
-
- def run(self):
- # init listeners, add them to the event loop
- for sock in self.sockets:
- sock.setblocking(False)
- # a race condition during graceful shutdown may make the listener
- # name unavailable in the request handler so capture it once here
- server = sock.getsockname()
- acceptor = partial(self.accept, server)
- self.poller.register(sock, selectors.EVENT_READ, acceptor)
-
- while self.alive:
- # notify the arbiter we are alive
- self.notify()
-
- # can we accept more connections?
- if self.nr_conns < self.worker_connections:
- # wait for an event
- events = self.poller.select(1.0)
- for key, _ in events:
- callback = key.data
- callback(key.fileobj)
-
- # check (but do not wait) for finished requests
- result = futures.wait(self.futures, timeout=0,
- return_when=futures.FIRST_COMPLETED)
- else:
- # wait for a request to finish
- result = futures.wait(self.futures, timeout=1.0,
- return_when=futures.FIRST_COMPLETED)
-
- # clean up finished requests
- for fut in result.done:
- self.futures.remove(fut)
-
- if not self.is_parent_alive():
- break
-
- # handle keepalive timeouts
- self.murder_keepalived()
-
- self.tpool.shutdown(False)
- self.poller.close()
-
- for s in self.sockets:
- s.close()
-
- futures.wait(self.futures, timeout=self.cfg.graceful_timeout)
-
- def finish_request(self, fs):
- if fs.cancelled():
- self.nr_conns -= 1
- fs.conn.close()
- return
-
- try:
- (keepalive, conn) = fs.result()
- # if the connection should be kept alive, add it
- # to the event loop and record it
- if keepalive and self.alive:
- # flag the socket as non-blocking
- conn.sock.setblocking(False)
-
- # register the connection
- conn.set_timeout()
- with self._lock:
- self._keep.append(conn)
-
- # add the socket to the event loop
- self.poller.register(conn.sock, selectors.EVENT_READ,
- partial(self.reuse_connection, conn))
- else:
- self.nr_conns -= 1
- conn.close()
- except Exception:
- # an exception happened, make sure to close the
- # socket.
- self.nr_conns -= 1
- fs.conn.close()
-
- def handle(self, conn):
- keepalive = False
- req = None
- try:
- req = next(conn.parser)
- if not req:
- return (False, conn)
-
- # handle the request
- keepalive = self.handle_request(req, conn)
- if keepalive:
- return (keepalive, conn)
- except http.errors.NoMoreData as e:
- self.log.debug("Ignored premature client disconnection. %s", e)
-
- except StopIteration as e:
- self.log.debug("Closing connection. %s", e)
- except ssl.SSLError as e:
- if e.args[0] == ssl.SSL_ERROR_EOF:
- self.log.debug("ssl connection closed")
- conn.sock.close()
- else:
- self.log.debug("Error processing SSL request.")
- self.handle_error(req, conn.sock, conn.client, e)
-
- except EnvironmentError as e:
- if e.errno not in (errno.EPIPE, errno.ECONNRESET, errno.ENOTCONN):
- self.log.exception("Socket error processing request.")
- else:
- if e.errno == errno.ECONNRESET:
- self.log.debug("Ignoring connection reset")
- elif e.errno == errno.ENOTCONN:
- self.log.debug("Ignoring socket not connected")
- else:
- self.log.debug("Ignoring connection epipe")
- except Exception as e:
- self.handle_error(req, conn.sock, conn.client, e)
-
- return (False, conn)
-
- def handle_request(self, req, conn):
- environ = {}
- resp = None
- try:
- self.cfg.pre_request(self, req)
- request_start = datetime.now()
- resp, environ = wsgi.create(req, conn.sock, conn.client,
- conn.server, self.cfg)
- environ["wsgi.multithread"] = True
- self.nr += 1
- if self.nr >= self.max_requests:
- if self.alive:
- self.log.info("Autorestarting worker after current request.")
- self.alive = False
- resp.force_close()
-
- if not self.alive or not self.cfg.keepalive:
- resp.force_close()
- elif len(self._keep) >= self.max_keepalived:
- resp.force_close()
-
- respiter = self.wsgi(environ, resp.start_response)
- try:
- if isinstance(respiter, environ['wsgi.file_wrapper']):
- resp.write_file(respiter)
- else:
- for item in respiter:
- resp.write(item)
-
- resp.close()
- request_time = datetime.now() - request_start
- self.log.access(resp, req, environ, request_time)
- finally:
- if hasattr(respiter, "close"):
- respiter.close()
-
- if resp.should_close():
- self.log.debug("Closing connection.")
- return False
- except EnvironmentError:
- # pass to next try-except level
- util.reraise(*sys.exc_info())
- except Exception:
- if resp and resp.headers_sent:
- # If the response headers have already been sent, we should
- # close the connection to signal the error.
- self.log.exception("Error handling request")
- try:
- conn.sock.shutdown(socket.SHUT_RDWR)
- conn.sock.close()
- except EnvironmentError:
- pass
- raise StopIteration()
- raise
- finally:
- try:
- self.cfg.post_request(self, req, environ, resp)
- except Exception:
- self.log.exception("Exception in post_request hook")
-
- return True
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/gtornado.py b/env/lib/python3.9/site-packages/gunicorn/workers/gtornado.py
deleted file mode 100644
index 9dd3d7b..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/gtornado.py
+++ /dev/null
@@ -1,171 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import copy
-import os
-import sys
-
-try:
- import tornado
-except ImportError:
- raise RuntimeError("You need tornado installed to use this worker.")
-import tornado.web
-import tornado.httpserver
-from tornado.ioloop import IOLoop, PeriodicCallback
-from tornado.wsgi import WSGIContainer
-from gunicorn.workers.base import Worker
-from gunicorn import __version__ as gversion
-
-
-# Tornado 5.0 updated its IOLoop, and the `io_loop` arguments to many
-# Tornado functions have been removed in Tornado 5.0. Also, they no
-# longer store PeriodicCallbacks in ioloop._callbacks. Instead we store
-# them on our side, and use stop() on them when stopping the worker.
-# See https://www.tornadoweb.org/en/stable/releases/v5.0.0.html#backwards-compatibility-notes
-# for more details.
-TORNADO5 = tornado.version_info >= (5, 0, 0)
-
-
-class TornadoWorker(Worker):
-
- @classmethod
- def setup(cls):
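- # Advertise gunicorn in tornado's Server response header by wrapping
- # RequestHandler.clear, which (re)initialises the default headers for
- # every request.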
- web = sys.modules.pop("tornado.web")
- old_clear = web.RequestHandler.clear
-
- def clear(self):
- old_clear(self)
- if "Gunicorn" not in self._headers["Server"]:
- self._headers["Server"] += " (Gunicorn/%s)" % gversion
- web.RequestHandler.clear = clear
- sys.modules["tornado.web"] = web
-
- def handle_exit(self, sig, frame):
- if self.alive:
- super().handle_exit(sig, frame)
-
- def handle_request(self):
- self.nr += 1
- if self.alive and self.nr >= self.max_requests:
- self.log.info("Autorestarting worker after current request.")
- self.alive = False
-
- def watchdog(self):
- if self.alive:
- self.notify()
-
- if self.ppid != os.getppid():
- self.log.info("Parent changed, shutting down: %s", self)
- self.alive = False
-
- def heartbeat(self):
- if not self.alive:
- if self.server_alive:
- if hasattr(self, 'server'):
- try:
- self.server.stop()
- except Exception:
- pass
- self.server_alive = False
- else:
- if TORNADO5:
- for callback in self.callbacks:
- callback.stop()
- self.ioloop.stop()
- else:
- if not self.ioloop._callbacks:
- self.ioloop.stop()
-
- def init_process(self):
- # IOLoop cannot survive a fork or be shared across processes
- # in any way. When multiple processes are being used, each process
- # should create its own IOLoop. We should clear the current
- # IOLoop, if one exists, before os.fork.
- IOLoop.clear_current()
- super().init_process()
-
- def run(self):
- self.ioloop = IOLoop.instance()
- self.alive = True
- self.server_alive = False
-
- if TORNADO5:
- self.callbacks = []
- self.callbacks.append(PeriodicCallback(self.watchdog, 1000))
- self.callbacks.append(PeriodicCallback(self.heartbeat, 1000))
- for callback in self.callbacks:
- callback.start()
- else:
- PeriodicCallback(self.watchdog, 1000, io_loop=self.ioloop).start()
- PeriodicCallback(self.heartbeat, 1000, io_loop=self.ioloop).start()
-
- # Assume the app is a WSGI callable if it's not an
- # instance of tornado.web.Application or is an
- # instance of tornado.wsgi.WSGIApplication
- app = self.wsgi
-
- if tornado.version_info[0] < 6:
- if not isinstance(app, tornado.web.Application) or \
- isinstance(app, tornado.wsgi.WSGIApplication):
- app = WSGIContainer(app)
- elif not isinstance(app, WSGIContainer):
- app = WSGIContainer(app)
-
- # Monkey-patching HTTPConnection.finish to count the
- # number of requests being handled by Tornado. This
- # will help gunicorn shutdown the worker if max_requests
- # is exceeded.
- httpserver = sys.modules["tornado.httpserver"]
- if hasattr(httpserver, 'HTTPConnection'):
- old_connection_finish = httpserver.HTTPConnection.finish
-
- def finish(other):
- self.handle_request()
- old_connection_finish(other)
- httpserver.HTTPConnection.finish = finish
- sys.modules["tornado.httpserver"] = httpserver
-
- server_class = tornado.httpserver.HTTPServer
- else:
-
- class _HTTPServer(tornado.httpserver.HTTPServer):
-
- def on_close(instance, server_conn):
- self.handle_request()
- super(_HTTPServer, instance).on_close(server_conn)
-
- server_class = _HTTPServer
-
- if self.cfg.is_ssl:
- _ssl_opt = copy.deepcopy(self.cfg.ssl_options)
- # tornado refuses initialization if ssl_options contains the
- # following options
- del _ssl_opt["do_handshake_on_connect"]
- del _ssl_opt["suppress_ragged_eofs"]
- if TORNADO5:
- server = server_class(app, ssl_options=_ssl_opt)
- else:
- server = server_class(app, io_loop=self.ioloop,
- ssl_options=_ssl_opt)
- else:
- if TORNADO5:
- server = server_class(app)
- else:
- server = server_class(app, io_loop=self.ioloop)
-
- self.server = server
- self.server_alive = True
-
- for s in self.sockets:
- s.setblocking(0)
- if hasattr(server, "add_socket"): # tornado > 2.0
- server.add_socket(s)
- elif hasattr(server, "_sockets"): # tornado 2.0
- server._sockets[s.fileno()] = s
-
- server.no_keep_alive = self.cfg.keepalive <= 0
- server.start(num_processes=1)
-
- self.ioloop.start()
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/sync.py b/env/lib/python3.9/site-packages/gunicorn/workers/sync.py
deleted file mode 100644
index eeb7f63..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/sync.py
+++ /dev/null
@@ -1,211 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-#
-
-from datetime import datetime
-import errno
-import os
-import select
-import socket
-import ssl
-import sys
-
-import gunicorn.http as http
-import gunicorn.http.wsgi as wsgi
-import gunicorn.util as util
-import gunicorn.workers.base as base
-
-
-class StopWaiting(Exception):
- """ exception raised to stop waiting for a connection """
-
-
-class SyncWorker(base.Worker):
-
- def accept(self, listener):
- client, addr = listener.accept()
- client.setblocking(1)
- util.close_on_exec(client)
- self.handle(listener, client, addr)
-
- def wait(self, timeout):
- try:
- self.notify()
- ret = select.select(self.wait_fds, [], [], timeout)
- if ret[0]:
- if self.PIPE[0] in ret[0]:
- os.read(self.PIPE[0], 1)
- return ret[0]
-
- except select.error as e:
- if e.args[0] == errno.EINTR:
- return self.sockets
- if e.args[0] == errno.EBADF:
- if self.nr < 0:
- return self.sockets
- else:
- raise StopWaiting
- raise
-
- def is_parent_alive(self):
- # If our parent changed then we shut down.
- if self.ppid != os.getppid():
- self.log.info("Parent changed, shutting down: %s", self)
- return False
- return True
-
- def run_for_one(self, timeout):
- listener = self.sockets[0]
- while self.alive:
- self.notify()
-
- # Accept a connection. If we get an error telling us
- # that no connection is waiting we fall down to the
- # select which is where we'll wait for a bit for new
- # workers to come give us some love.
- try:
- self.accept(listener)
- # Keep processing clients until no one is waiting. This
- # prevents the need to select() for every client that we
- # process.
- continue
-
- except EnvironmentError as e:
- if e.errno not in (errno.EAGAIN, errno.ECONNABORTED,
- errno.EWOULDBLOCK):
- raise
-
- if not self.is_parent_alive():
- return
-
- try:
- self.wait(timeout)
- except StopWaiting:
- return
-
- def run_for_multiple(self, timeout):
- while self.alive:
- self.notify()
-
- try:
- ready = self.wait(timeout)
- except StopWaiting:
- return
-
- if ready is not None:
- for listener in ready:
- if listener == self.PIPE[0]:
- continue
-
- try:
- self.accept(listener)
- except EnvironmentError as e:
- if e.errno not in (errno.EAGAIN, errno.ECONNABORTED,
- errno.EWOULDBLOCK):
- raise
-
- if not self.is_parent_alive():
- return
-
- def run(self):
- # If no timeout is given the worker will never wait and will
- # use the CPU for nothing. This minimal timeout prevents it.
- timeout = self.timeout or 0.5
-
- # self.socket appears to lose its blocking status after
- # we fork in the arbiter. Reset it here.
- for s in self.sockets:
- s.setblocking(0)
-
- if len(self.sockets) > 1:
- self.run_for_multiple(timeout)
- else:
- self.run_for_one(timeout)
-
- def handle(self, listener, client, addr):
- req = None
- try:
- if self.cfg.is_ssl:
- client = ssl.wrap_socket(client, server_side=True,
- **self.cfg.ssl_options)
-
- parser = http.RequestParser(self.cfg, client, addr)
- req = next(parser)
- self.handle_request(listener, req, client, addr)
- except http.errors.NoMoreData as e:
- self.log.debug("Ignored premature client disconnection. %s", e)
- except StopIteration as e:
- self.log.debug("Closing connection. %s", e)
- except ssl.SSLError as e:
- if e.args[0] == ssl.SSL_ERROR_EOF:
- self.log.debug("ssl connection closed")
- client.close()
- else:
- self.log.debug("Error processing SSL request.")
- self.handle_error(req, client, addr, e)
- except EnvironmentError as e:
- if e.errno not in (errno.EPIPE, errno.ECONNRESET, errno.ENOTCONN):
- self.log.exception("Socket error processing request.")
- else:
- if e.errno == errno.ECONNRESET:
- self.log.debug("Ignoring connection reset")
- elif e.errno == errno.ENOTCONN:
- self.log.debug("Ignoring socket not connected")
- else:
- self.log.debug("Ignoring EPIPE")
- except Exception as e:
- self.handle_error(req, client, addr, e)
- finally:
- util.close(client)
-
- def handle_request(self, listener, req, client, addr):
- environ = {}
- resp = None
- try:
- self.cfg.pre_request(self, req)
- request_start = datetime.now()
- resp, environ = wsgi.create(req, client, addr,
- listener.getsockname(), self.cfg)
- # Force the connection closed until someone shows
- # a buffering proxy that supports Keep-Alive to
- # the backend.
- resp.force_close()
- self.nr += 1
- if self.nr >= self.max_requests:
- self.log.info("Autorestarting worker after current request.")
- self.alive = False
- respiter = self.wsgi(environ, resp.start_response)
- try:
- if isinstance(respiter, environ['wsgi.file_wrapper']):
- resp.write_file(respiter)
- else:
- for item in respiter:
- resp.write(item)
- resp.close()
- request_time = datetime.now() - request_start
- self.log.access(resp, req, environ, request_time)
- finally:
- if hasattr(respiter, "close"):
- respiter.close()
- except EnvironmentError:
- # pass to next try-except level
- util.reraise(*sys.exc_info())
- except Exception:
- if resp and resp.headers_sent:
- # If the response headers have already been sent, we should
- # close the connection to signal the error.
- self.log.exception("Error handling request")
- try:
- client.shutdown(socket.SHUT_RDWR)
- client.close()
- except EnvironmentError:
- pass
- raise StopIteration()
- raise
- finally:
- try:
- self.cfg.post_request(self, req, environ, resp)
- except Exception:
- self.log.exception("Exception in post_request hook")
diff --git a/env/lib/python3.9/site-packages/gunicorn/workers/workertmp.py b/env/lib/python3.9/site-packages/gunicorn/workers/workertmp.py
deleted file mode 100644
index 65bbe54..0000000
--- a/env/lib/python3.9/site-packages/gunicorn/workers/workertmp.py
+++ /dev/null
@@ -1,55 +0,0 @@
-# -*- coding: utf-8 -
-#
-# This file is part of gunicorn released under the MIT license.
-# See the NOTICE for more information.
-
-import os
-import platform
-import tempfile
-
-from gunicorn import util
-
-PLATFORM = platform.system()
-IS_CYGWIN = PLATFORM.startswith('CYGWIN')
-
-
-class WorkerTmp(object):
-
- def __init__(self, cfg):
- old_umask = os.umask(cfg.umask)
- fdir = cfg.worker_tmp_dir
- if fdir and not os.path.isdir(fdir):
- raise RuntimeError("%s doesn't exist. Can't create workertmp." % fdir)
- fd, name = tempfile.mkstemp(prefix="wgunicorn-", dir=fdir)
- os.umask(old_umask)
-
- # change the owner and group of the file if the worker will run as
- # a different user or group, so that the worker can modify the file
- if cfg.uid != os.geteuid() or cfg.gid != os.getegid():
- util.chown(name, cfg.uid, cfg.gid)
-
- # unlink the file so we don't leak temporary files
- try:
- if not IS_CYGWIN:
- util.unlink(name)
- # In Python 3.8, open() emits RuntimeWarning if buffering=1 for binary mode.
- # Because we never write to this file, pass 0 to switch buffering off.
- self._tmp = os.fdopen(fd, 'w+b', 0)
- except Exception:
- os.close(fd)
- raise
-
- self.spinner = 0
-
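- # Heartbeat trick: the arbiter watches this file's ctime to detect a
- # stuck worker. Each notify() flips the file mode between 0 and 1,
- # which bumps st_ctime without writing any data, and last_update()
- # reports that timestamp back.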
- def notify(self):
- self.spinner = (self.spinner + 1) % 2
- os.fchmod(self._tmp.fileno(), self.spinner)
-
- def last_update(self):
- return os.fstat(self._tmp.fileno()).st_ctime
-
- def fileno(self):
- return self._tmp.fileno()
-
- def close(self):
- return self._tmp.close()
diff --git a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/INSTALLER b/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/INSTALLER
deleted file mode 100644
index a1b589e..0000000
--- a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/INSTALLER
+++ /dev/null
@@ -1 +0,0 @@
-pip
diff --git a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/LICENSE.txt b/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/LICENSE.txt
deleted file mode 100644
index 8f080ea..0000000
--- a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/LICENSE.txt
+++ /dev/null
@@ -1,22 +0,0 @@
-The MIT License (MIT)
-
-Copyright (c) 2016 Nathaniel J. Smith and other contributors
-
-Permission is hereby granted, free of charge, to any person obtaining
-a copy of this software and associated documentation files (the
-"Software"), to deal in the Software without restriction, including
-without limitation the rights to use, copy, modify, merge, publish,
-distribute, sublicense, and/or sell copies of the Software, and to
-permit persons to whom the Software is furnished to do so, subject to
-the following conditions:
-
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
-LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
-OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
-WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/METADATA b/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/METADATA
deleted file mode 100644
index 8c77b40..0000000
--- a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/METADATA
+++ /dev/null
@@ -1,197 +0,0 @@
-Metadata-Version: 2.1
-Name: h11
-Version: 0.13.0
-Summary: A pure-Python, bring-your-own-I/O implementation of HTTP/1.1
-Home-page: https://github.com/python-hyper/h11
-Author: Nathaniel J. Smith
-Author-email: njs@pobox.com
-License: MIT
-Platform: UNKNOWN
-Classifier: Development Status :: 3 - Alpha
-Classifier: Intended Audience :: Developers
-Classifier: License :: OSI Approved :: MIT License
-Classifier: Programming Language :: Python :: Implementation :: CPython
-Classifier: Programming Language :: Python :: Implementation :: PyPy
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3 :: Only
-Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Classifier: Topic :: Internet :: WWW/HTTP
-Classifier: Topic :: System :: Networking
-Requires-Python: >=3.6
-License-File: LICENSE.txt
-Requires-Dist: dataclasses ; python_version < "3.7"
-Requires-Dist: typing-extensions ; python_version < "3.8"
-
-h11
-===
-
-.. image:: https://travis-ci.org/python-hyper/h11.svg?branch=master
- :target: https://travis-ci.org/python-hyper/h11
- :alt: Automated test status
-
-.. image:: https://codecov.io/gh/python-hyper/h11/branch/master/graph/badge.svg
- :target: https://codecov.io/gh/python-hyper/h11
- :alt: Test coverage
-
-.. image:: https://readthedocs.org/projects/h11/badge/?version=latest
- :target: http://h11.readthedocs.io/en/latest/?badge=latest
- :alt: Documentation Status
-
-This is a little HTTP/1.1 library written from scratch in Python,
-heavily inspired by `hyper-h2 `_.
-
-It's a "bring-your-own-I/O" library; h11 contains no IO code
-whatsoever. This means you can hook h11 up to your favorite network
-API, and that could be anything you want: synchronous, threaded,
-asynchronous, or your own implementation of `RFC 6214
-`_ -- h11 won't judge you.
-(Compare this to the current state of the art, where every time a `new
-network API `_ comes along then someone
-gets to start over reimplementing the entire HTTP protocol from
-scratch.) Cory Benfield made an `excellent blog post describing the
-benefits of this approach
-`_, or if you like video
-then here's his `PyCon 2016 talk on the same theme
-`_.
-
-This also means that h11 is not immediately useful out of the box:
-it's a toolkit for building programs that speak HTTP, not something
-that could directly replace ``requests`` or ``twisted.web`` or
-whatever. But h11 makes it much easier to implement something like
-``requests`` or ``twisted.web``.
-
-At a high level, working with h11 goes like this:
-
-1) First, create an ``h11.Connection`` object to track the state of a
- single HTTP/1.1 connection.
-
-2) When you read data off the network, pass it to
- ``conn.receive_data(...)``; then call ``conn.next_event()`` to pull
- out the resulting high-level HTTP "event" objects one at a time.
-
-3) When you want to send a high-level HTTP event, create the
- corresponding "event" object and pass it to ``conn.send(...)``;
- this will give you back some bytes that you can then push out
- through the network.
-
-For example, a client might instantiate and then send a
-``h11.Request`` object, then zero or more ``h11.Data`` objects for the
-request body (e.g., if this is a POST), and then a
-``h11.EndOfMessage`` to indicate the end of the message. The server
-would then send back a ``h11.Response``, some ``h11.Data``, and
-its own ``h11.EndOfMessage``. If either side violates the protocol,
-you'll get a ``h11.ProtocolError`` exception.
-
-h11 is suitable for implementing both servers and clients, and has a
-pleasantly symmetric API: the events you send as a client are exactly
-the ones that you receive as a server and vice-versa.
-
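-As a sketch (assuming ``sock`` is an already-connected socket, with all
-error handling omitted), the client side of the exchange described above
-looks roughly like:
-
-.. code-block:: python
-
-    import h11
-
-    conn = h11.Connection(our_role=h11.CLIENT)
-    sock.sendall(conn.send(h11.Request(
-        method="GET", target="/", headers=[("Host", "example.com")])))
-    sock.sendall(conn.send(h11.EndOfMessage()))
-
-    while True:
-        event = conn.next_event()
-        if event is h11.NEED_DATA:
-            conn.receive_data(sock.recv(4096))
-        elif type(event) is h11.EndOfMessage:
-            break
-        else:
-            print(event)  # an h11.Response, then zero or more h11.Data
-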
-`Here's an example of a tiny HTTP client
-`_
-
-It also has `a fine manual `_.
-
-FAQ
----
-
-*Whyyyyy?*
-
-I wanted to play with HTTP in `Curio
-`__ and `Trio
-`__, which at the time didn't have any
-HTTP libraries. So I thought, no big deal, Python has, like, a dozen
-different implementations of HTTP, surely I can find one that's
-reusable. I didn't find one, but I did find Cory's call-to-arms
-blog-post. So I figured, well, fine, if I have to implement HTTP from
-scratch, at least I can make sure no-one *else* has to ever again.
-
-*Should I use it?*
-
-Maybe. You should be aware that it's a very young project. But, it's
-feature complete and has an exhaustive test-suite and complete docs,
-so the next step is for people to try using it and see how it goes
-:-). If you do then please let us know -- if nothing else we'll want
-to talk to you before making any incompatible changes!
-
-*What are the features/limitations?*
-
-Roughly speaking, it's trying to be a robust, complete, and non-hacky
-implementation of the first "chapter" of the HTTP/1.1 spec: `RFC 7230:
-HTTP/1.1 Message Syntax and Routing
-`_. That is, it mostly focuses on
-implementing HTTP at the level of taking bytes on and off the wire,
-and the headers related to that, and tries to be anal about spec
-conformance. It doesn't know about higher-level concerns like URL
-routing, conditional GETs, cross-origin cookie policies, or content
-negotiation. But it does know how to take care of framing,
-cross-version differences in keep-alive handling, and the "obsolete
-line folding" rule, so you can focus your energies on the hard /
-interesting parts for your application, and it tries to support the
-full specification in the sense that any useful HTTP/1.1 conformant
-application should be able to use h11.
-
-It's pure Python, and has no dependencies outside of the standard
-library.
-
-It has a test suite with 100.0% coverage for both statements and
-branches.
-
-Currently it supports Python 3 (testing on 3.6-3.9) and PyPy 3.
-The last Python 2-compatible version was h11 0.11.x.
-(Originally it had a Cython wrapper for `http-parser
-`_ and a beautiful nested state
-machine implemented with ``yield from`` to postprocess the output. But
-I had to take these out -- the new *parser* needs fewer lines-of-code
-than the old *parser wrapper*, is written in pure Python, uses no
-exotic language syntax, and has more features. It's sad, really; that
-old state machine was really slick. I just need a few sentences here
-to mourn that.)
-
-I don't know how fast it is. I haven't benchmarked or profiled it yet,
-so it's probably got a few pointless hot spots, and I've been trying
-to err on the side of simplicity and robustness instead of
-micro-optimization. But at the architectural level I tried hard to
-avoid fundamentally bad decisions, e.g., I believe that all the
-parsing algorithms remain linear-time even in the face of pathological
-input like slowloris, and there are no byte-by-byte loops. (I also
-believe that it maintains bounded memory usage in the face of
-arbitrary/pathological input.)
-
-The whole library is ~800 lines-of-code. You can read and understand
-the whole thing in less than an hour. Most of the energy invested in
-this so far has been spent on trying to keep things simple by
-minimizing special-cases and ad hoc state manipulation; even though it
-is now quite small and simple, I'm still annoyed that I haven't
-figured out how to make it even smaller and simpler. (Unfortunately,
-HTTP does not lend itself to simplicity.)
-
-The API is ~feature complete and I don't expect the general outlines
-to change much, but you can't judge an API's ergonomics until you
-actually document and use it, so I'd expect some changes in the
-details.
-
-*How do I try it?*
-
-.. code-block:: sh
-
- $ pip install h11
- $ git clone git@github.com:python-hyper/h11
- $ cd h11/examples
- $ python basic-client.py
-
-and go from there.
-
-*License?*
-
-MIT
-
-*Code of conduct?*
-
-Contributors are requested to follow our `code of conduct
-`_ in
-all project spaces.
-
-
diff --git a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/RECORD b/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/RECORD
deleted file mode 100644
index 965ac4f..0000000
--- a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/RECORD
+++ /dev/null
@@ -1,52 +0,0 @@
-h11-0.13.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
-h11-0.13.0.dist-info/LICENSE.txt,sha256=N9tbuFkm2yikJ6JYZ_ELEjIAOuob5pzLhRE4rbjm82E,1124
-h11-0.13.0.dist-info/METADATA,sha256=Fd9foEJycn0gUB9YsXul6neMlYnEU0MRQ8IUBsSOHxE,8245
-h11-0.13.0.dist-info/RECORD,,
-h11-0.13.0.dist-info/WHEEL,sha256=ewwEueio1C2XeHTvT17n8dZUJgOvyCWCt0WVNLClP9o,92
-h11-0.13.0.dist-info/top_level.txt,sha256=F7dC4jl3zeh8TGHEPaWJrMbeuoWbS379Gwdi-Yvdcis,4
-h11/__init__.py,sha256=iO1KzkSO42yZ6ffg-VMgbx_ZVTWGUY00nRYEWn-s3kY,1507
-h11/__pycache__/__init__.cpython-39.pyc,,
-h11/__pycache__/_abnf.cpython-39.pyc,,
-h11/__pycache__/_connection.cpython-39.pyc,,
-h11/__pycache__/_events.cpython-39.pyc,,
-h11/__pycache__/_headers.cpython-39.pyc,,
-h11/__pycache__/_readers.cpython-39.pyc,,
-h11/__pycache__/_receivebuffer.cpython-39.pyc,,
-h11/__pycache__/_state.cpython-39.pyc,,
-h11/__pycache__/_util.cpython-39.pyc,,
-h11/__pycache__/_version.cpython-39.pyc,,
-h11/__pycache__/_writers.cpython-39.pyc,,
-h11/_abnf.py,sha256=tMKqgOEkTHHp8sPd_gmU9Qowe_yXXrihct63RX2zJsg,4637
-h11/_connection.py,sha256=udHjqEO1fOcQUKa3hYIw88DMeoyG4fxiIXKjgE4DwJw,26480
-h11/_events.py,sha256=LEfuvg1AbhHaVRwxCd0I-pFn9-ezUOaoL8o2Kvy1PBA,11816
-h11/_headers.py,sha256=tRwZuFy5Wj4Yi9VVad_s7EqwCgeN6O3TIbcHd5CN_GI,10230
-h11/_readers.py,sha256=TWWoSbLVBfYGzD5dunReTd2QCxz466wjwu-4Fkzk_sQ,8370
-h11/_receivebuffer.py,sha256=xrspsdsNgWFxRfQcTXxR8RrdjRXXTK0Io5cQYWpJ1Ws,5252
-h11/_state.py,sha256=F8MPHIFMJV3kUPYR3YjrjqjJ1AYp_FZ38UwGr0855lE,13184
-h11/_util.py,sha256=LWkkjXyJaFlAy6Lt39w73UStklFT5ovcvo0TkY7RYuk,4888
-h11/_version.py,sha256=ye-8iNs3P1TB71VRGlNQe2OxnAe-RupjozAMywAS5z8,686
-h11/_writers.py,sha256=7WBTXyJqFAUqqmLl5adGF8_7UVQdOVa2phL8s8csljI,5063
-h11/py.typed,sha256=sow9soTwP9T_gEAQSVh7Gb8855h04Nwmhs2We-JRgZM,7
-h11/tests/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-h11/tests/__pycache__/__init__.cpython-39.pyc,,
-h11/tests/__pycache__/helpers.cpython-39.pyc,,
-h11/tests/__pycache__/test_against_stdlib_http.cpython-39.pyc,,
-h11/tests/__pycache__/test_connection.cpython-39.pyc,,
-h11/tests/__pycache__/test_events.cpython-39.pyc,,
-h11/tests/__pycache__/test_headers.cpython-39.pyc,,
-h11/tests/__pycache__/test_helpers.cpython-39.pyc,,
-h11/tests/__pycache__/test_io.cpython-39.pyc,,
-h11/tests/__pycache__/test_receivebuffer.cpython-39.pyc,,
-h11/tests/__pycache__/test_state.cpython-39.pyc,,
-h11/tests/__pycache__/test_util.cpython-39.pyc,,
-h11/tests/data/test-file,sha256=ZJ03Rqs98oJw29OHzJg7LlMzyGQaRAY0r3AqBeM2wVU,65
-h11/tests/helpers.py,sha256=a1EVG_p7xU4wRsa3tMPTRxuaKCmretok9sxXWvqfmQA,3355
-h11/tests/test_against_stdlib_http.py,sha256=cojCHgHXFQ8gWhNlEEwl3trmOpN-5uDukRoHnElqo3A,3995
-h11/tests/test_connection.py,sha256=ZbPLDPclKvjgjAhgk-WlCPBaf17c4XUIV2tpaW08jOI,38720
-h11/tests/test_events.py,sha256=LPVLbcV-NvPNK9fW3rraR6Bdpz1hAlsWubMtNaJ5gHg,4657
-h11/tests/test_headers.py,sha256=qd8T1Zenuz5GbD6wklSJ5G8VS7trrYgMV0jT-SMvqg8,5612
-h11/tests/test_helpers.py,sha256=kAo0CEM4LGqmyyP2ZFmhsyq3UFJqoFfAbzu3hbWreRM,794
-h11/tests/test_io.py,sha256=gXFSKpcx6n3-Ld0Y8w5kBkom1LZsCq3uHtqdotQ3S2c,16243
-h11/tests/test_receivebuffer.py,sha256=3jGbeJM36Akqg_pAhPb7XzIn2NS6RhPg-Ryg8Eu6ytk,3454
-h11/tests/test_state.py,sha256=rqll9WqFsJPE0zSrtCn9LH659mPKsDeXZ-DwXwleuBQ,8928
-h11/tests/test_util.py,sha256=ZWdRng_P-JP-cnvmcBhazBxfyWmEKBB0NLrDy5eq3h0,2970
diff --git a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/WHEEL b/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/WHEEL
deleted file mode 100644
index 5bad85f..0000000
--- a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/WHEEL
+++ /dev/null
@@ -1,5 +0,0 @@
-Wheel-Version: 1.0
-Generator: bdist_wheel (0.37.0)
-Root-Is-Purelib: true
-Tag: py3-none-any
-
diff --git a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/top_level.txt b/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/top_level.txt
deleted file mode 100644
index 0d24def..0000000
--- a/env/lib/python3.9/site-packages/h11-0.13.0.dist-info/top_level.txt
+++ /dev/null
@@ -1 +0,0 @@
-h11
diff --git a/env/lib/python3.9/site-packages/h11/__init__.py b/env/lib/python3.9/site-packages/h11/__init__.py
deleted file mode 100644
index 989e92c..0000000
--- a/env/lib/python3.9/site-packages/h11/__init__.py
+++ /dev/null
@@ -1,62 +0,0 @@
-# A highish-level implementation of the HTTP/1.1 wire protocol (RFC 7230),
-# containing no networking code at all, loosely modelled on hyper-h2's generic
-# implementation of HTTP/2 (and in particular the h2.connection.H2Connection
-# class). There's still a bunch of subtle details you need to get right if you
-# want to make this actually useful, because it doesn't implement all the
-# semantics to check that what you're asking to write to the wire is sensible,
-# but at least it gets you out of dealing with the wire itself.
-
-from h11._connection import Connection, NEED_DATA, PAUSED
-from h11._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from h11._state import (
- CLIENT,
- CLOSED,
- DONE,
- ERROR,
- IDLE,
- MIGHT_SWITCH_PROTOCOL,
- MUST_CLOSE,
- SEND_BODY,
- SEND_RESPONSE,
- SERVER,
- SWITCHED_PROTOCOL,
-)
-from h11._util import LocalProtocolError, ProtocolError, RemoteProtocolError
-from h11._version import __version__
-
-PRODUCT_ID = "python-h11/" + __version__
-
-
-__all__ = (
- "Connection",
- "NEED_DATA",
- "PAUSED",
- "ConnectionClosed",
- "Data",
- "EndOfMessage",
- "Event",
- "InformationalResponse",
- "Request",
- "Response",
- "CLIENT",
- "CLOSED",
- "DONE",
- "ERROR",
- "IDLE",
- "MUST_CLOSE",
- "SEND_BODY",
- "SEND_RESPONSE",
- "SERVER",
- "SWITCHED_PROTOCOL",
- "ProtocolError",
- "LocalProtocolError",
- "RemoteProtocolError",
-)
diff --git a/env/lib/python3.9/site-packages/h11/_abnf.py b/env/lib/python3.9/site-packages/h11/_abnf.py
deleted file mode 100644
index e6d49e1..0000000
--- a/env/lib/python3.9/site-packages/h11/_abnf.py
+++ /dev/null
@@ -1,129 +0,0 @@
-# We use native strings for all the re patterns, to take advantage of string
-# formatting, and then convert to bytestrings when compiling the final re
-# objects.
-
-# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#whitespace
-# OWS = *( SP / HTAB )
-# ; optional whitespace
-OWS = r"[ \t]*"
-
-# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#rule.token.separators
-# token = 1*tchar
-#
-# tchar = "!" / "#" / "$" / "%" / "&" / "'" / "*"
-# / "+" / "-" / "." / "^" / "_" / "`" / "|" / "~"
-# / DIGIT / ALPHA
-# ; any VCHAR, except delimiters
-token = r"[-!#$%&'*+.^_`|~0-9a-zA-Z]+"
-
-# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#header.fields
-# field-name = token
-field_name = token
-
-# The standard says:
-#
-# field-value = *( field-content / obs-fold )
-# field-content = field-vchar [ 1*( SP / HTAB ) field-vchar ]
-# field-vchar = VCHAR / obs-text
-# obs-fold = CRLF 1*( SP / HTAB )
-# ; obsolete line folding
-# ; see Section 3.2.4
-#
-# https://tools.ietf.org/html/rfc5234#appendix-B.1
-#
-# VCHAR = %x21-7E
-# ; visible (printing) characters
-#
-# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#rule.quoted-string
-# obs-text = %x80-FF
-#
-# However, the standard definition of field-content is WRONG! It disallows
-# fields containing a single visible character surrounded by whitespace,
-# e.g. "foo a bar".
-#
-# See: https://www.rfc-editor.org/errata_search.php?rfc=7230&eid=4189
-#
-# So our definition of field_content attempts to fix it up...
-#
-# Also, we allow lots of control characters, because apparently people assume
-# that they're legal in practice (e.g., google analytics makes cookies with
-# \x01 in them!):
-# https://github.com/python-hyper/h11/issues/57
-# We still don't allow NUL or whitespace, because those are often treated as
-# meta-characters and letting them through can lead to nasty issues like SSRF.
-vchar = r"[\x21-\x7e]"
-vchar_or_obs_text = r"[^\x00\s]"
-field_vchar = vchar_or_obs_text
-field_content = r"{field_vchar}+(?:[ \t]+{field_vchar}+)*".format(**globals())
-
-# We handle obs-fold at a different level, and our fixed-up field_content
-# already grows to swallow the whole value, so ? instead of *
-field_value = r"({field_content})?".format(**globals())
-
-# header-field = field-name ":" OWS field-value OWS
-header_field = (
- r"(?P{field_name})"
- r":"
- r"{OWS}"
- r"(?P{field_value})"
- r"{OWS}".format(**globals())
-)
-
-# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#request.line
-#
-# request-line = method SP request-target SP HTTP-version CRLF
-# method = token
-# HTTP-version = HTTP-name "/" DIGIT "." DIGIT
-# HTTP-name = %x48.54.54.50 ; "HTTP", case-sensitive
-#
-# request-target is complicated (see RFC 7230 sec 5.3) -- could be path, full
-# URL, host+port (for connect), or even "*", but in any case we are guaranteed
-# that it consists of the visible printing characters.
-method = token
-request_target = r"{vchar}+".format(**globals())
-http_version = r"HTTP/(?P[0-9]\.[0-9])"
-request_line = (
- r"(?P{method})"
- r" "
- r"(?P{request_target})"
- r" "
- r"{http_version}".format(**globals())
-)
-
-# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#status.line
-#
-# status-line = HTTP-version SP status-code SP reason-phrase CRLF
-# status-code = 3DIGIT
-# reason-phrase = *( HTAB / SP / VCHAR / obs-text )
-status_code = r"[0-9]{3}"
-reason_phrase = r"([ \t]|{vchar_or_obs_text})*".format(**globals())
-status_line = (
- r"{http_version}"
- r" "
- r"(?P{status_code})"
- # However, there are apparently a few too many servers out there that just
- # leave out the reason phrase:
- # https://github.com/scrapy/scrapy/issues/345#issuecomment-281756036
- # https://github.com/seanmonstar/httparse/issues/29
- # so make it optional. ?: is a non-capturing group.
- r"(?: (?P{reason_phrase}))?".format(**globals())
-)
-
-HEXDIG = r"[0-9A-Fa-f]"
-# Actually
-#
-# chunk-size = 1*HEXDIG
-#
-# but we impose an upper-limit to avoid ridiculosity. len(str(2**64)) == 20
-chunk_size = r"({HEXDIG}){{1,20}}".format(**globals())
-# Actually
-#
-# chunk-ext = *( ";" chunk-ext-name [ "=" chunk-ext-val ] )
-#
-# but we aren't parsing the things so we don't really care.
-chunk_ext = r";.*"
-chunk_header = (
- r"(?P{chunk_size})"
- r"(?P{chunk_ext})?"
- r"\r\n".format(**globals())
-)
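-
-# A sketch of how these patterns are consumed downstream (the names below
-# are illustrative): per the note at the top of this file, they get
-# encoded to ASCII and compiled into bytes regexes, e.g.:
-#
-#   import re
-#   header_field_re = re.compile(header_field.encode("ascii"))
-#   m = header_field_re.fullmatch(b"Content-Type: text/html")
-#   m.group("field_name")   # => b"Content-Type"
-#   m.group("field_value")  # => b"text/html"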
diff --git a/env/lib/python3.9/site-packages/h11/_connection.py b/env/lib/python3.9/site-packages/h11/_connection.py
deleted file mode 100644
index d11386f..0000000
--- a/env/lib/python3.9/site-packages/h11/_connection.py
+++ /dev/null
@@ -1,631 +0,0 @@
-# This contains the main Connection class. Everything in h11 revolves around
-# this.
-from typing import Any, Callable, cast, Dict, List, Optional, Tuple, Type, Union
-
-from ._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from ._headers import get_comma_header, has_expect_100_continue, set_comma_header
-from ._readers import READERS, ReadersType
-from ._receivebuffer import ReceiveBuffer
-from ._state import (
- _SWITCH_CONNECT,
- _SWITCH_UPGRADE,
- CLIENT,
- ConnectionState,
- DONE,
- ERROR,
- MIGHT_SWITCH_PROTOCOL,
- SEND_BODY,
- SERVER,
- SWITCHED_PROTOCOL,
-)
-from ._util import ( # Import the internal things we need
- LocalProtocolError,
- RemoteProtocolError,
- Sentinel,
-)
-from ._writers import WRITERS, WritersType
-
-# Everything in __all__ gets re-exported as part of the h11 public API.
-__all__ = ["Connection", "NEED_DATA", "PAUSED"]
-
-
-class NEED_DATA(Sentinel, metaclass=Sentinel):
- pass
-
-
-class PAUSED(Sentinel, metaclass=Sentinel):
- pass
-
-
-# If we ever have this much buffered without it making a complete parseable
-# event, we error out. The only time we really buffer is when reading the
-# request/response line + headers together, so this is effectively the limit on
-# the size of that.
-#
-# Some precedents for defaults:
-# - node.js: 80 * 1024
-# - tomcat: 8 * 1024
-# - IIS: 16 * 1024
-# - Apache: <8 KiB per line>
-DEFAULT_MAX_INCOMPLETE_EVENT_SIZE = 16 * 1024
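-# (With the default, a peer that sends more than 16 KiB of request line
-# and headers without completing them gets a RemoteProtocolError out of
-# next_event().)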
-
-# RFC 7230's rules for connection lifecycles:
-# - If either side says they want to close the connection, then the connection
-# must close.
-# - HTTP/1.1 defaults to keep-alive unless someone says Connection: close
-# - HTTP/1.0 defaults to close unless both sides say Connection: keep-alive
-# (and even this is a mess -- e.g. if you're implementing a proxy then
-# sending Connection: keep-alive is forbidden).
-#
-# We simplify life by simply not supporting keep-alive with HTTP/1.0 peers. So
-# our rule is:
-# - If someone says Connection: close, we will close
-# - If someone uses HTTP/1.0, we will close.
-def _keep_alive(event: Union[Request, Response]) -> bool:
- connection = get_comma_header(event.headers, b"connection")
- if b"close" in connection:
- return False
- if getattr(event, "http_version", b"1.1") < b"1.1":
- return False
- return True
-
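-# For instance: any Request/Response carrying "Connection: close" gives
-# False; any HTTP/1.0 peer gives False; a plain HTTP/1.1 exchange with no
-# Connection header gives True.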
-
-def _body_framing(
- request_method: bytes, event: Union[Request, Response]
-) -> Tuple[str, Union[Tuple[()], Tuple[int]]]:
- # Called when we enter SEND_BODY to figure out framing information for
- # this body.
- #
- # These are the only two events that can trigger a SEND_BODY state:
- assert type(event) in (Request, Response)
- # Returns one of:
- #
- # ("content-length", count)
- # ("chunked", ())
- # ("http/1.0", ())
- #
- # which are (lookup key, *args) for constructing body reader/writer
- # objects.
- #
- # Reference: https://tools.ietf.org/html/rfc7230#section-3.3.3
- #
- # Step 1: some responses always have an empty body, regardless of what the
- # headers say.
- if type(event) is Response:
- if (
- event.status_code in (204, 304)
- or request_method == b"HEAD"
- or (request_method == b"CONNECT" and 200 <= event.status_code < 300)
- ):
- return ("content-length", (0,))
- # Section 3.3.3 also lists another case -- responses with status_code
- # < 200. For us these are InformationalResponses, not Responses, so
- # they can't get into this function in the first place.
- assert event.status_code >= 200
-
- # Step 2: check for Transfer-Encoding (T-E beats C-L):
- transfer_encodings = get_comma_header(event.headers, b"transfer-encoding")
- if transfer_encodings:
- assert transfer_encodings == [b"chunked"]
- return ("chunked", ())
-
- # Step 3: check for Content-Length
- content_lengths = get_comma_header(event.headers, b"content-length")
- if content_lengths:
- return ("content-length", (int(content_lengths[0]),))
-
- # Step 4: no applicable headers; fallback/default depends on type
- if type(event) is Request:
- return ("content-length", (0,))
- else:
- return ("http/1.0", ())
-
-
-################################################################
-#
-# The main Connection class
-#
-################################################################
-
-
-class Connection:
- """An object encapsulating the state of an HTTP connection.
-
- Args:
- our_role: If you're implementing a client, pass :data:`h11.CLIENT`. If
- you're implementing a server, pass :data:`h11.SERVER`.
-
- max_incomplete_event_size (int):
- The maximum number of bytes we're willing to buffer of an
- incomplete event. In practice this mostly sets a limit on the
- maximum size of the request/response line + headers. If this is
- exceeded, then :meth:`next_event` will raise
- :exc:`RemoteProtocolError`.
-
- """
-
- def __init__(
- self,
- our_role: Type[Sentinel],
- max_incomplete_event_size: int = DEFAULT_MAX_INCOMPLETE_EVENT_SIZE,
- ) -> None:
- self._max_incomplete_event_size = max_incomplete_event_size
- # State and role tracking
- if our_role not in (CLIENT, SERVER):
- raise ValueError("expected CLIENT or SERVER, not {!r}".format(our_role))
- self.our_role = our_role
- self.their_role: Type[Sentinel]
- if our_role is CLIENT:
- self.their_role = SERVER
- else:
- self.their_role = CLIENT
- self._cstate = ConnectionState()
-
- # Callables for converting data->events or vice-versa given the
- # current state
- self._writer = self._get_io_object(self.our_role, None, WRITERS)
- self._reader = self._get_io_object(self.their_role, None, READERS)
-
- # Holds any unprocessed received data
- self._receive_buffer = ReceiveBuffer()
- # If this is true, then it indicates that the incoming connection was
- # closed *after* the end of whatever's in self._receive_buffer:
- self._receive_buffer_closed = False
-
- # Extra bits of state that don't fit into the state machine.
- #
- # These two are only used to interpret framing headers for figuring
- # out how to read/write response bodies. their_http_version is also
- # made available as a convenient public API.
- self.their_http_version: Optional[bytes] = None
- self._request_method: Optional[bytes] = None
- # This is pure flow-control and doesn't at all affect the set of legal
- # transitions, so no need to bother ConnectionState with it:
- self.client_is_waiting_for_100_continue = False
-
- @property
- def states(self) -> Dict[Type[Sentinel], Type[Sentinel]]:
- """A dictionary like::
-
- {CLIENT: <client state>, SERVER: <server state>}
-
- See :ref:`state-machine` for details.
-
- """
- return dict(self._cstate.states)
-
- @property
- def our_state(self) -> Type[Sentinel]:
- """The current state of whichever role we are playing. See
- :ref:`state-machine` for details.
- """
- return self._cstate.states[self.our_role]
-
- @property
- def their_state(self) -> Type[Sentinel]:
- """The current state of whichever role we are NOT playing. See
- :ref:`state-machine` for details.
- """
- return self._cstate.states[self.their_role]
-
- @property
- def they_are_waiting_for_100_continue(self) -> bool:
- return self.their_role is CLIENT and self.client_is_waiting_for_100_continue
-
- def start_next_cycle(self) -> None:
- """Attempt to reset our connection state for a new request/response
- cycle.
-
- If both client and server are in :data:`DONE` state, then resets them
- both to :data:`IDLE` state in preparation for a new request/response
- cycle on this same connection. Otherwise, raises a
- :exc:`LocalProtocolError`.
-
- See :ref:`keepalive-and-pipelining`.
-
- """
- old_states = dict(self._cstate.states)
- self._cstate.start_next_cycle()
- self._request_method = None
- # self.their_http_version gets left alone, since it presumably lasts
- # beyond a single request/response cycle
- assert not self.client_is_waiting_for_100_continue
- self._respond_to_state_changes(old_states)
-
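As an illustrative sketch using only the public `h11` API, here is one complete request/response cycle followed by a reset; after `start_next_cycle()` both roles are back in `IDLE` and the connection can be reused:

```python
import h11

client = h11.Connection(our_role=h11.CLIENT)
server = h11.Connection(our_role=h11.SERVER)

# Client sends a complete request; server parses it.
server.receive_data(client.send(h11.Request(
    method="GET", target="/", headers=[("Host", "example.com")])))
server.receive_data(client.send(h11.EndOfMessage()))
assert type(server.next_event()) is h11.Request
assert type(server.next_event()) is h11.EndOfMessage

# Server sends a complete response; client parses it.
client.receive_data(server.send(h11.Response(
    status_code=200, headers=[("Content-Length", "0")])))
client.receive_data(server.send(h11.EndOfMessage()))
assert type(client.next_event()) is h11.Response
assert type(client.next_event()) is h11.EndOfMessage

# Both roles are now DONE, so the cycle can be reset for keep-alive reuse.
assert client.states == {h11.CLIENT: h11.DONE, h11.SERVER: h11.DONE}
client.start_next_cycle()
server.start_next_cycle()
assert client.states == {h11.CLIENT: h11.IDLE, h11.SERVER: h11.IDLE}
```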
- def _process_error(self, role: Type[Sentinel]) -> None:
- old_states = dict(self._cstate.states)
- self._cstate.process_error(role)
- self._respond_to_state_changes(old_states)
-
- def _server_switch_event(self, event: Event) -> Optional[Type[Sentinel]]:
- if type(event) is InformationalResponse and event.status_code == 101:
- return _SWITCH_UPGRADE
- if type(event) is Response:
- if (
- _SWITCH_CONNECT in self._cstate.pending_switch_proposals
- and 200 <= event.status_code < 300
- ):
- return _SWITCH_CONNECT
- return None
-
- # All events go through here
- def _process_event(self, role: Type[Sentinel], event: Event) -> None:
- # First, pass the event through the state machine to make sure it
- # succeeds.
- old_states = dict(self._cstate.states)
- if role is CLIENT and type(event) is Request:
- if event.method == b"CONNECT":
- self._cstate.process_client_switch_proposal(_SWITCH_CONNECT)
- if get_comma_header(event.headers, b"upgrade"):
- self._cstate.process_client_switch_proposal(_SWITCH_UPGRADE)
- server_switch_event = None
- if role is SERVER:
- server_switch_event = self._server_switch_event(event)
- self._cstate.process_event(role, type(event), server_switch_event)
-
- # Then perform the updates triggered by it.
-
- if type(event) is Request:
- self._request_method = event.method
-
- if role is self.their_role and type(event) in (
- Request,
- Response,
- InformationalResponse,
- ):
- event = cast(Union[Request, Response, InformationalResponse], event)
- self.their_http_version = event.http_version
-
- # Keep alive handling
- #
- # RFC 7230 doesn't really say what one should do if Connection: close
- # shows up on a 1xx InformationalResponse. I think the idea is that
- # this is not supposed to happen. In any case, if it does happen, we
- # ignore it.
- if type(event) in (Request, Response) and not _keep_alive(
- cast(Union[Request, Response], event)
- ):
- self._cstate.process_keep_alive_disabled()
-
- # 100-continue
- if type(event) is Request and has_expect_100_continue(event):
- self.client_is_waiting_for_100_continue = True
- if type(event) in (InformationalResponse, Response):
- self.client_is_waiting_for_100_continue = False
- if role is CLIENT and type(event) in (Data, EndOfMessage):
- self.client_is_waiting_for_100_continue = False
-
- self._respond_to_state_changes(old_states, event)
-
- def _get_io_object(
- self,
- role: Type[Sentinel],
- event: Optional[Event],
- io_dict: Union[ReadersType, WritersType],
- ) -> Optional[Callable[..., Any]]:
- # event may be None; it's only used when entering SEND_BODY
- state = self._cstate.states[role]
- if state is SEND_BODY:
- # Special case: the io_dict has a dict of reader/writer factories
- # that depend on the request/response framing.
- framing_type, args = _body_framing(
- cast(bytes, self._request_method), cast(Union[Request, Response], event)
- )
- return io_dict[SEND_BODY][framing_type](*args) # type: ignore[index]
- else:
- # General case: the io_dict just has the appropriate reader/writer
- # for this state
- return io_dict.get((role, state)) # type: ignore
-
- # This must be called after any action that might have caused
- # self._cstate.states to change.
- def _respond_to_state_changes(
- self,
- old_states: Dict[Type[Sentinel], Type[Sentinel]],
- event: Optional[Event] = None,
- ) -> None:
- # Update reader/writer
- if self.our_state != old_states[self.our_role]:
- self._writer = self._get_io_object(self.our_role, event, WRITERS)
- if self.their_state != old_states[self.their_role]:
- self._reader = self._get_io_object(self.their_role, event, READERS)
-
- @property
- def trailing_data(self) -> Tuple[bytes, bool]:
- """Data that has been received, but not yet processed, represented as
- a tuple with two elements, where the first is a byte-string containing
- the unprocessed data itself, and the second is a bool that is True if
- the receive connection was closed.
-
- See :ref:`switching-protocols` for discussion of why you'd want this.
- """
- return (bytes(self._receive_buffer), self._receive_buffer_closed)
-
- def receive_data(self, data: bytes) -> None:
- """Add data to our internal receive buffer.
-
- This does not actually do any processing on the data, just stores
- it. To trigger processing, you have to call :meth:`next_event`.
-
- Args:
- data (:term:`bytes-like object`):
- The new data that was just received.
-
- Special case: If *data* is an empty byte-string like ``b""``,
- then this indicates that the remote side has closed the
- connection (end of file). Normally this is convenient, because
- standard Python APIs like :meth:`file.read` or
- :meth:`socket.recv` use ``b""`` to indicate end-of-file, while
- other failures to read are indicated using other mechanisms
- like raising :exc:`TimeoutError`. When using such an API you
- can just blindly pass through whatever you get from ``read``
- to :meth:`receive_data`, and everything will work.
-
- But, if you have an API where reading an empty string is a
- valid non-EOF condition, then you need to be aware of this and
- make sure to check for such strings and avoid passing them to
- :meth:`receive_data`.
-
- Returns:
- Nothing, but after calling this you should call :meth:`next_event`
- to parse the newly received data.
-
- Raises:
- RuntimeError:
- Raised if you pass an empty *data*, indicating EOF, and then
- pass a non-empty *data*, indicating more data that somehow
- arrived after the EOF.
-
- (Calling ``receive_data(b"")`` multiple times is fine,
- and equivalent to calling it once.)
-
- """
- if data:
- if self._receive_buffer_closed:
- raise RuntimeError("received close, then received more data?")
- self._receive_buffer += data
- else:
- self._receive_buffer_closed = True
-
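A small sketch of the EOF convention described in the docstring above: an empty byte-string marks end-of-file, repeating it is harmless, and data arriving after EOF is an error:

```python
import h11

conn = h11.Connection(our_role=h11.SERVER)
conn.receive_data(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
conn.receive_data(b"")        # peer closed their end (EOF)
conn.receive_data(b"")        # repeated EOF is fine, equivalent to one
try:
    conn.receive_data(b"x")   # but data after EOF is an error
except RuntimeError as exc:
    print(exc)                # "received close, then received more data?"
```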
- def _extract_next_receive_event(self) -> Union[Event, Type[Sentinel]]:
- state = self.their_state
- # We don't pause immediately when they enter DONE, because even in
- # DONE state we can still process a ConnectionClosed() event. But
- # if we have data in our buffer, then we definitely aren't getting
- # a ConnectionClosed() immediately and we need to pause.
- if state is DONE and self._receive_buffer:
- return PAUSED
- if state is MIGHT_SWITCH_PROTOCOL or state is SWITCHED_PROTOCOL:
- return PAUSED
- assert self._reader is not None
- event = self._reader(self._receive_buffer)
- if event is None:
- if not self._receive_buffer and self._receive_buffer_closed:
- # In some unusual cases (basically just HTTP/1.0 bodies), EOF
- # triggers an actual protocol event; in that case, we want to
- # return that event, and then the state will change and we'll
- # get called again to generate the actual ConnectionClosed().
- if hasattr(self._reader, "read_eof"):
- event = self._reader.read_eof() # type: ignore[attr-defined]
- else:
- event = ConnectionClosed()
- if event is None:
- event = NEED_DATA
- return event # type: ignore[no-any-return]
-
- def next_event(self) -> Union[Event, Type[Sentinel]]:
- """Parse the next event out of our receive buffer, update our internal
- state, and return it.
-
- This is a mutating operation -- think of it like calling :func:`next`
- on an iterator.
-
- Returns:
- : One of three things:
-
- 1) An event object -- see :ref:`events`.
-
- 2) The special constant :data:`NEED_DATA`, which indicates that
- you need to read more data from your socket and pass it to
- :meth:`receive_data` before this method will be able to return
- any more events.
-
- 3) The special constant :data:`PAUSED`, which indicates that we
- are not in a state where we can process incoming data (usually
- because the peer has finished their part of the current
- request/response cycle, and you have not yet called
- :meth:`start_next_cycle`). See :ref:`flow-control` for details.
-
- Raises:
- RemoteProtocolError:
- The peer has misbehaved. You should close the connection
- (possibly after sending some kind of 4xx response).
-
- Once this method returns :class:`ConnectionClosed` once, then all
- subsequent calls will also return :class:`ConnectionClosed`.
-
- If this method raises any exception besides :exc:`RemoteProtocolError`
- then that's a bug -- if it happens please file a bug report!
-
- If this method raises any exception then it also sets
- :attr:`Connection.their_state` to :data:`ERROR` -- see
- :ref:`error-handling` for discussion.
-
- """
-
- if self.their_state is ERROR:
- raise RemoteProtocolError("Can't receive data when peer state is ERROR")
- try:
- event = self._extract_next_receive_event()
- if event not in [NEED_DATA, PAUSED]:
- self._process_event(self.their_role, cast(Event, event))
- if event is NEED_DATA:
- if len(self._receive_buffer) > self._max_incomplete_event_size:
- # 431 is "Request header fields too large" which is pretty
- # much the only situation where we can get here
- raise RemoteProtocolError(
- "Receive buffer too long", error_status_hint=431
- )
- if self._receive_buffer_closed:
- # We're still trying to complete some event, but that's
- # never going to happen because no more data is coming
- raise RemoteProtocolError("peer unexpectedly closed connection")
- return event
- except BaseException as exc:
- self._process_error(self.their_role)
- if isinstance(exc, LocalProtocolError):
- exc._reraise_as_remote_protocol_error()
- else:
- raise
-
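In practice `next_event()` is driven in a loop. Here is a minimal sketch of such a loop over a blocking socket; `read_events` is a hypothetical helper for illustration, not part of h11:

```python
import socket

import h11


def read_events(sock: socket.socket, conn: h11.Connection):
    """Hypothetical helper: yield events until the message ends or h11 pauses us."""
    while True:
        event = conn.next_event()
        if event is h11.NEED_DATA:
            conn.receive_data(sock.recv(4096))   # b"" here signals EOF
            continue
        if event is h11.PAUSED:
            return   # time to respond and/or call start_next_cycle()
        yield event
        if type(event) in (h11.EndOfMessage, h11.ConnectionClosed):
            return
```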
- def send(self, event: Event) -> Optional[bytes]:
- """Convert a high-level event into bytes that can be sent to the peer,
- while updating our internal state machine.
-
- Args:
- event: The :ref:`event <events>` to send.
-
- Returns:
- If ``type(event) is ConnectionClosed``, then returns
- ``None``. Otherwise, returns a :term:`bytes-like object`.
-
- Raises:
- LocalProtocolError:
- Sending this event at this time would violate our
- understanding of the HTTP/1.1 protocol.
-
- If this method raises any exception then it also sets
- :attr:`Connection.our_state` to :data:`ERROR` -- see
- :ref:`error-handling` for discussion.
-
- """
- data_list = self.send_with_data_passthrough(event)
- if data_list is None:
- return None
- else:
- return b"".join(data_list)
-
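A short sketch of `send()` in action: events serialize to bytes, except `ConnectionClosed`, which returns `None` — there is no wire representation for closing, you just close your socket:

```python
import h11

conn = h11.Connection(our_role=h11.CLIENT)
wire = conn.send(h11.Request(method="HEAD", target="/",
                             headers=[("Host", "example.com")]))
print(wire)   # request line + headers, e.g. b'HEAD / HTTP/1.1\r\nHost: ...'
conn.send(h11.EndOfMessage())
assert conn.send(h11.ConnectionClosed()) is None   # now close the socket
```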
- def send_with_data_passthrough(self, event: Event) -> Optional[List[bytes]]:
- """Identical to :meth:`send`, except that in situations where
- :meth:`send` returns a single :term:`bytes-like object`, this instead
- returns a list of them -- and when sending a :class:`Data` event, this
- list is guaranteed to contain the exact object you passed in as
- :attr:`Data.data`. See :ref:`sendfile` for discussion.
-
- """
- if self.our_state is ERROR:
- raise LocalProtocolError("Can't send data when our state is ERROR")
- try:
- if type(event) is Response:
- event = self._clean_up_response_headers_for_sending(event)
- # We want to call _process_event before calling the writer,
- # because if someone tries to do something invalid then this will
- # give a sensible error message, while our writers all just assume
- # they will only receive valid events. But, _process_event might
- # change self._writer. So we have to do a little dance:
- writer = self._writer
- self._process_event(self.our_role, event)
- if type(event) is ConnectionClosed:
- return None
- else:
- # In any situation where writer is None, process_event should
- # have raised ProtocolError
- assert writer is not None
- data_list: List[bytes] = []
- writer(event, data_list.append)
- return data_list
- except:
- self._process_error(self.our_role)
- raise
-
- def send_failed(self) -> None:
- """Notify the state machine that we failed to send the data it gave
- us.
-
- This causes :attr:`Connection.our_state` to immediately become
- :data:`ERROR` -- see :ref:`error-handling` for discussion.
-
- """
- self._process_error(self.our_role)
-
- # When sending a Response, we take responsibility for a few things:
- #
- # - Sometimes you MUST set Connection: close. We take care of those
- # times. (You can also set it yourself if you want, and if you do then
- # we'll respect that and close the connection at the right time. But you
- # don't have to worry about that unless you want to.)
- #
- # - The user has to set Content-Length if they want it. Otherwise, for
- #   responses that have bodies (e.g. not HEAD), we will automatically
- #   select the right mechanism for streaming a body of unknown length,
- #   which depends on the peer's HTTP version.
- #
- # This function's *only* responsibility is making sure headers are set up
- # right -- everything downstream just looks at the headers. There are no
- # side channels.
- def _clean_up_response_headers_for_sending(self, response: Response) -> Response:
- assert type(response) is Response
-
- headers = response.headers
- need_close = False
-
- # HEAD requests need some special handling: they always act like they
- # have Content-Length: 0, and that's how _body_framing treats
- # them. But their headers are supposed to match what we would send if
- # the request was a GET. (Technically there is one deviation allowed:
- # we're allowed to leave out the framing headers -- see
- # https://tools.ietf.org/html/rfc7231#section-4.3.2 . But it's just as
- # easy to get them right.)
- method_for_choosing_headers = cast(bytes, self._request_method)
- if method_for_choosing_headers == b"HEAD":
- method_for_choosing_headers = b"GET"
- framing_type, _ = _body_framing(method_for_choosing_headers, response)
- if framing_type in ("chunked", "http/1.0"):
- # This response has a body of unknown length.
- # If our peer is HTTP/1.1, we use Transfer-Encoding: chunked
- # If our peer is HTTP/1.0, we use no framing headers, and close the
- # connection afterwards.
- #
- # Make sure to clear Content-Length (in principle user could have
- # set both and then we ignored Content-Length b/c
- # Transfer-Encoding overwrote it -- this would be naughty of them,
- # but the HTTP spec says that if our peer does this then we have
- # to fix it instead of erroring out, so we'll accord the user the
- # same respect).
- headers = set_comma_header(headers, b"content-length", [])
- if self.their_http_version is None or self.their_http_version < b"1.1":
- # Either we never got a valid request and are sending back an
- # error (their_http_version is None), so we assume the worst;
- # or else we did get a valid HTTP/1.0 request, so we know that
- # they don't understand chunked encoding.
- headers = set_comma_header(headers, b"transfer-encoding", [])
- # This is actually redundant ATM, since currently we
- # unconditionally disable keep-alive when talking to HTTP/1.0
- # peers. But let's be defensive just in case we add
- # Connection: keep-alive support later:
- if self._request_method != b"HEAD":
- need_close = True
- else:
- headers = set_comma_header(headers, b"transfer-encoding", [b"chunked"])
-
- if not self._cstate.keep_alive or need_close:
- # Make sure Connection: close is set
- connection = set(get_comma_header(headers, b"connection"))
- connection.discard(b"keep-alive")
- connection.add(b"close")
- headers = set_comma_header(headers, b"connection", sorted(connection))
-
- return Response(
- headers=headers,
- status_code=response.status_code,
- http_version=response.http_version,
- reason=response.reason,
- )
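A sketch of this cleanup as observed through the public API: a response with no framing headers, sent to an HTTP/1.1 peer, automatically picks up `Transfer-Encoding: chunked`:

```python
import h11

client = h11.Connection(our_role=h11.CLIENT)
server = h11.Connection(our_role=h11.SERVER)

server.receive_data(client.send(h11.Request(
    method="GET", target="/", headers=[("Host", "example.com")])))
server.receive_data(client.send(h11.EndOfMessage()))
while type(server.next_event()) is not h11.EndOfMessage:
    pass   # consume the Request and EndOfMessage events

# No Content-Length given, peer is HTTP/1.1: h11 selects chunked framing.
wire = server.send(h11.Response(status_code=200, headers=[]))
assert b"transfer-encoding: chunked" in wire.lower()
```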
diff --git a/env/lib/python3.9/site-packages/h11/_events.py b/env/lib/python3.9/site-packages/h11/_events.py
deleted file mode 100644
index 075bf8a..0000000
--- a/env/lib/python3.9/site-packages/h11/_events.py
+++ /dev/null
@@ -1,369 +0,0 @@
-# High level events that make up HTTP/1.1 conversations. Loosely inspired by
-# the corresponding events in hyper-h2:
-#
-# http://python-hyper.org/h2/en/stable/api.html#events
-#
-# Don't subclass these. Stuff will break.
-
-import re
-from abc import ABC
-from dataclasses import dataclass, field
-from typing import Any, cast, Dict, List, Tuple, Union
-
-from ._abnf import method, request_target
-from ._headers import Headers, normalize_and_validate
-from ._util import bytesify, LocalProtocolError, validate
-
-# Everything in __all__ gets re-exported as part of the h11 public API.
-__all__ = [
- "Event",
- "Request",
- "InformationalResponse",
- "Response",
- "Data",
- "EndOfMessage",
- "ConnectionClosed",
-]
-
-method_re = re.compile(method.encode("ascii"))
-request_target_re = re.compile(request_target.encode("ascii"))
-
-
-class Event(ABC):
- """
- Base class for h11 events.
- """
-
- __slots__ = ()
-
-
-@dataclass(init=False, frozen=True)
-class Request(Event):
- """The beginning of an HTTP request.
-
- Fields:
-
- .. attribute:: method
-
- An HTTP method, e.g. ``b"GET"`` or ``b"POST"``. Always a byte
- string. :term:`Bytes-like objects <bytes-like object>` and native
- strings containing only ascii characters will be automatically
- converted to byte strings.
-
- .. attribute:: target
-
- The target of an HTTP request, e.g. ``b"/index.html"``, or one of the
- more exotic formats described in `RFC 7230, section 5.3
- <https://tools.ietf.org/html/rfc7230#section-5.3>`_. Always a byte
- string. :term:`Bytes-like objects <bytes-like object>` and native
- strings containing only ascii characters will be automatically
- converted to byte strings.
-
- .. attribute:: headers
-
- Request headers, represented as a list of (name, value) pairs. See
- :ref:`the header normalization rules <headers-format>` for details.
-
- .. attribute:: http_version
-
- The HTTP protocol version, represented as a byte string like
- ``b"1.1"``. See :ref:`the HTTP version normalization rules
- <http_version-format>` for details.
-
- """
-
- __slots__ = ("method", "headers", "target", "http_version")
-
- method: bytes
- headers: Headers
- target: bytes
- http_version: bytes
-
- def __init__(
- self,
- *,
- method: Union[bytes, str],
- headers: Union[Headers, List[Tuple[bytes, bytes]], List[Tuple[str, str]]],
- target: Union[bytes, str],
- http_version: Union[bytes, str] = b"1.1",
- _parsed: bool = False,
- ) -> None:
- super().__init__()
- if isinstance(headers, Headers):
- object.__setattr__(self, "headers", headers)
- else:
- object.__setattr__(
- self, "headers", normalize_and_validate(headers, _parsed=_parsed)
- )
- if not _parsed:
- object.__setattr__(self, "method", bytesify(method))
- object.__setattr__(self, "target", bytesify(target))
- object.__setattr__(self, "http_version", bytesify(http_version))
- else:
- object.__setattr__(self, "method", method)
- object.__setattr__(self, "target", target)
- object.__setattr__(self, "http_version", http_version)
-
- # "A server MUST respond with a 400 (Bad Request) status code to any
- # HTTP/1.1 request message that lacks a Host header field and to any
- # request message that contains more than one Host header field or a
- # Host header field with an invalid field-value."
- # -- https://tools.ietf.org/html/rfc7230#section-5.4
- host_count = 0
- for name, value in self.headers:
- if name == b"host":
- host_count += 1
- if self.http_version == b"1.1" and host_count == 0:
- raise LocalProtocolError("Missing mandatory Host: header")
- if host_count > 1:
- raise LocalProtocolError("Found multiple Host: headers")
-
- validate(method_re, self.method, "Illegal method characters")
- validate(request_target_re, self.target, "Illegal target characters")
-
- # This is an unhashable type.
- __hash__ = None # type: ignore
-
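A small sketch of constructing a `Request`, showing the automatic conversion to byte strings and the Host-header check performed by the constructor above:

```python
import h11

req = h11.Request(method="GET", target="/index.html",
                  headers=[("Host", "example.com"), ("Accept", "*/*")])
assert req.method == b"GET"    # native strings are converted to bytes
assert req.headers == [(b"host", b"example.com"), (b"accept", b"*/*")]

try:
    h11.Request(method="GET", target="/", headers=[])   # HTTP/1.1, no Host
except h11.LocalProtocolError as exc:
    print(exc)   # "Missing mandatory Host: header"
```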
-
-@dataclass(init=False, frozen=True)
-class _ResponseBase(Event):
- __slots__ = ("headers", "http_version", "reason", "status_code")
-
- headers: Headers
- http_version: bytes
- reason: bytes
- status_code: int
-
- def __init__(
- self,
- *,
- headers: Union[Headers, List[Tuple[bytes, bytes]], List[Tuple[str, str]]],
- status_code: int,
- http_version: Union[bytes, str] = b"1.1",
- reason: Union[bytes, str] = b"",
- _parsed: bool = False,
- ) -> None:
- super().__init__()
- if isinstance(headers, Headers):
- object.__setattr__(self, "headers", headers)
- else:
- object.__setattr__(
- self, "headers", normalize_and_validate(headers, _parsed=_parsed)
- )
- if not _parsed:
- object.__setattr__(self, "reason", bytesify(reason))
- object.__setattr__(self, "http_version", bytesify(http_version))
- if not isinstance(status_code, int):
- raise LocalProtocolError("status code must be integer")
- # Because IntEnum objects are instances of int, but aren't
- # duck-compatible (sigh), see gh-72.
- object.__setattr__(self, "status_code", int(status_code))
- else:
- object.__setattr__(self, "reason", reason)
- object.__setattr__(self, "http_version", http_version)
- object.__setattr__(self, "status_code", status_code)
-
- self.__post_init__()
-
- def __post_init__(self) -> None:
- pass
-
- # This is an unhashable type.
- __hash__ = None # type: ignore
-
-
-@dataclass(init=False, frozen=True)
-class InformationalResponse(_ResponseBase):
- """An HTTP informational response.
-
- Fields:
-
- .. attribute:: status_code
-
- The status code of this response, as an integer. For an
- :class:`InformationalResponse`, this is always in the range [100,
- 200).
-
- .. attribute:: headers
-
- Response headers, represented as a list of (name, value) pairs. See
- :ref:`the header normalization rules <headers-format>` for
- details.
-
- .. attribute:: http_version
-
- The HTTP protocol version, represented as a byte string like
- ``b"1.1"``. See :ref:`the HTTP version normalization rules
- <http_version-format>` for details.
-
- .. attribute:: reason
-
- The reason phrase of this response, as a byte string. For example:
- ``b"OK"``, or ``b"Not Found"``.
-
- """
-
- def __post_init__(self) -> None:
- if not (100 <= self.status_code < 200):
- raise LocalProtocolError(
- "InformationalResponse status_code should be in range "
- "[100, 200), not {}".format(self.status_code)
- )
-
- # This is an unhashable type.
- __hash__ = None # type: ignore
-
-
-@dataclass(init=False, frozen=True)
-class Response(_ResponseBase):
- """The beginning of an HTTP response.
-
- Fields:
-
- .. attribute:: status_code
-
- The status code of this response, as an integer. For a
- :class:`Response`, this is always in the range [200,
- 1000).
-
- .. attribute:: headers
-
- Response headers, represented as a list of (name, value) pairs. See
- :ref:`the header normalization rules <headers-format>` for details.
-
- .. attribute:: http_version
-
- The HTTP protocol version, represented as a byte string like
- ``b"1.1"``. See :ref:`the HTTP version normalization rules
- <http_version-format>` for details.
-
- .. attribute:: reason
-
- The reason phrase of this response, as a byte string. For example:
- ``b"OK"``, or ``b"Not Found"``.
-
- """
-
- def __post_init__(self) -> None:
- if not (200 <= self.status_code < 1000):
- raise LocalProtocolError(
- "Response status_code should be in range [200, 1000), not {}".format(
- self.status_code
- )
- )
-
- # This is an unhashable type.
- __hash__ = None # type: ignore
-
-
-@dataclass(init=False, frozen=True)
-class Data(Event):
- """Part of an HTTP message body.
-
- Fields:
-
- .. attribute:: data
-
- A :term:`bytes-like object` containing part of a message body. Or, if
- using the ``combine=False`` argument to :meth:`Connection.send`, then
- any object that your socket writing code knows what to do with, and for
- which calling :func:`len` returns the number of bytes that will be
- written -- see :ref:`sendfile` for details.
-
- .. attribute:: chunk_start
-
- A marker that indicates whether this data object is from the start of a
- chunked transfer encoding chunk. This field is ignored when a Data
- event is provided to :meth:`Connection.send`: it is only valid on
- events emitted from :meth:`Connection.next_event`. You probably
- shouldn't use this attribute at all; see
- :ref:`chunk-delimiters-are-bad` for details.
-
- .. attribute:: chunk_end
-
- A marker that indicates whether this data object is the last for a
- given chunked transfer encoding chunk. This field is ignored when
- a Data event is provided to :meth:`Connection.send`: it is only valid
- on events emitted from :meth:`Connection.next_event`. You probably
- shouldn't use this attribute at all; see
- :ref:`chunk-delimiters-are-bad` for details.
-
- """
-
- __slots__ = ("data", "chunk_start", "chunk_end")
-
- data: bytes
- chunk_start: bool
- chunk_end: bool
-
- def __init__(
- self, data: bytes, chunk_start: bool = False, chunk_end: bool = False
- ) -> None:
- object.__setattr__(self, "data", data)
- object.__setattr__(self, "chunk_start", chunk_start)
- object.__setattr__(self, "chunk_end", chunk_end)
-
- # This is an unhashable type.
- __hash__ = None # type: ignore
-
-
-# XX FIXME: "A recipient MUST ignore (or consider as an error) any fields that
-# are forbidden to be sent in a trailer, since processing them as if they were
-# present in the header section might bypass external security filters."
-# https://svn.tools.ietf.org/svn/wg/httpbis/specs/rfc7230.html#chunked.trailer.part
-# Unfortunately, the list of forbidden fields is long and vague :-/
-@dataclass(init=False, frozen=True)
-class EndOfMessage(Event):
- """The end of an HTTP message.
-
- Fields:
-
- .. attribute:: headers
-
- Default value: ``[]``
-
- Any trailing headers attached to this message, represented as a list of
- (name, value) pairs. See :ref:`the header normalization rules
- <headers-format>` for details.
-
- Must be empty unless ``Transfer-Encoding: chunked`` is in use.
-
- """
-
- __slots__ = ("headers",)
-
- headers: Headers
-
- def __init__(
- self,
- *,
- headers: Union[
- Headers, List[Tuple[bytes, bytes]], List[Tuple[str, str]], None
- ] = None,
- _parsed: bool = False,
- ) -> None:
- super().__init__()
- if headers is None:
- headers = Headers([])
- elif not isinstance(headers, Headers):
- headers = normalize_and_validate(headers, _parsed=_parsed)
-
- object.__setattr__(self, "headers", headers)
-
- # This is an unhashable type.
- __hash__ = None # type: ignore
-
-
-@dataclass(frozen=True)
-class ConnectionClosed(Event):
- """This event indicates that the sender has closed their outgoing
- connection.
-
- Note that this does not necessarily mean that they can't *receive* further
- data, because TCP connections are composed of two one-way channels which
- can be closed independently. See :ref:`closing` for details.
-
- No fields.
- """
-
- pass
diff --git a/env/lib/python3.9/site-packages/h11/_headers.py b/env/lib/python3.9/site-packages/h11/_headers.py
deleted file mode 100644
index acc4596..0000000
--- a/env/lib/python3.9/site-packages/h11/_headers.py
+++ /dev/null
@@ -1,278 +0,0 @@
-import re
-from typing import AnyStr, cast, List, overload, Sequence, Tuple, TYPE_CHECKING, Union
-
-from ._abnf import field_name, field_value
-from ._util import bytesify, LocalProtocolError, validate
-
-if TYPE_CHECKING:
- from ._events import Request
-
-try:
- from typing import Literal
-except ImportError:
- from typing_extensions import Literal # type: ignore
-
-
-# Facts
-# -----
-#
-# Headers are:
-# keys: case-insensitive ascii
-# values: mixture of ascii and raw bytes
-#
-# "Historically, HTTP has allowed field content with text in the ISO-8859-1
-# charset [ISO-8859-1], supporting other charsets only through use of
-# [RFC2047] encoding. In practice, most HTTP header field values use only a
-# subset of the US-ASCII charset [USASCII]. Newly defined header fields SHOULD
-# limit their field values to US-ASCII octets. A recipient SHOULD treat other
-# octets in field content (obs-text) as opaque data."
-# And it deprecates all non-ascii values
-#
-# Leading/trailing whitespace in header names is forbidden
-#
-# Values get leading/trailing whitespace stripped
-#
-# Content-Disposition actually needs to contain unicode semantically; to
-# accomplish this it has a terrifically weird way of encoding the filename
-# itself as ascii (and even this still has lots of cross-browser
-# incompatibilities)
-#
-# Order is important:
-# "a proxy MUST NOT change the order of these field values when forwarding a
-# message"
-# (and there are several headers where the order indicates a preference)
-#
- # Multiple occurrences of the same header:
-# "A sender MUST NOT generate multiple header fields with the same field name
-# in a message unless either the entire field value for that header field is
-# defined as a comma-separated list [or the header is Set-Cookie which gets a
-# special exception]" - RFC 7230. (cookies are in RFC 6265)
-#
-# So every header aside from Set-Cookie can be merged by b", ".join if it
-# occurs repeatedly. But, of course, they can't necessarily be split by
-# .split(b","), because quoting.
-#
-# Given all this mess (case insensitive, duplicates allowed, order is
-# important, ...), there doesn't appear to be any standard way to handle
-# headers in Python -- they're almost like dicts, but... actually just
-# aren't. For now we punt and just use a super simple representation: headers
-# are a list of pairs
-#
-# [(name1, value1), (name2, value2), ...]
-#
-# where all entries are bytestrings, names are lowercase and have no
-# leading/trailing whitespace, and values are bytestrings with no
-# leading/trailing whitespace. Searching and updating are done via naive O(n)
-# methods.
-#
-# Maybe a dict-of-lists would be better?
-
-_content_length_re = re.compile(br"[0-9]+")
-_field_name_re = re.compile(field_name.encode("ascii"))
-_field_value_re = re.compile(field_value.encode("ascii"))
-
-
-class Headers(Sequence[Tuple[bytes, bytes]]):
- """
- A list-like interface that allows iterating over headers as byte-pairs
- of (lowercased-name, value).
-
- Internally we actually store the representation as three-tuples,
- including both the raw original casing, in order to preserve casing
- over-the-wire, and the lowercased name, for case-insensitive comparisons.
-
- r = Request(
- method="GET",
- target="/",
- headers=[("Host", "example.org"), ("Connection", "keep-alive")],
- http_version="1.1",
- )
- assert r.headers == [
- (b"host", b"example.org"),
- (b"connection", b"keep-alive")
- ]
- assert r.headers.raw_items() == [
- (b"Host", b"example.org"),
- (b"Connection", b"keep-alive")
- ]
- """
-
- __slots__ = "_full_items"
-
- def __init__(self, full_items: List[Tuple[bytes, bytes, bytes]]) -> None:
- self._full_items = full_items
-
- def __bool__(self) -> bool:
- return bool(self._full_items)
-
- def __eq__(self, other: object) -> bool:
- return list(self) == list(other) # type: ignore
-
- def __len__(self) -> int:
- return len(self._full_items)
-
- def __repr__(self) -> str:
- return "" % repr(list(self))
-
- def __getitem__(self, idx: int) -> Tuple[bytes, bytes]: # type: ignore[override]
- _, name, value = self._full_items[idx]
- return (name, value)
-
- def raw_items(self) -> List[Tuple[bytes, bytes]]:
- return [(raw_name, value) for raw_name, _, value in self._full_items]
-
-
-HeaderTypes = Union[
- List[Tuple[bytes, bytes]],
- List[Tuple[bytes, str]],
- List[Tuple[str, bytes]],
- List[Tuple[str, str]],
-]
-
-
-@overload
-def normalize_and_validate(headers: Headers, _parsed: Literal[True]) -> Headers:
- ...
-
-
-@overload
-def normalize_and_validate(headers: HeaderTypes, _parsed: Literal[False]) -> Headers:
- ...
-
-
-@overload
-def normalize_and_validate(
- headers: Union[Headers, HeaderTypes], _parsed: bool = False
-) -> Headers:
- ...
-
-
-def normalize_and_validate(
- headers: Union[Headers, HeaderTypes], _parsed: bool = False
-) -> Headers:
- new_headers = []
- seen_content_length = None
- saw_transfer_encoding = False
- for name, value in headers:
- # For headers coming out of the parser, we can safely skip some steps,
- # because it always returns bytes and has already run these regexes
- # over the data:
- if not _parsed:
- name = bytesify(name)
- value = bytesify(value)
- validate(_field_name_re, name, "Illegal header name {!r}", name)
- validate(_field_value_re, value, "Illegal header value {!r}", value)
- assert isinstance(name, bytes)
- assert isinstance(value, bytes)
-
- raw_name = name
- name = name.lower()
- if name == b"content-length":
- lengths = {length.strip() for length in value.split(b",")}
- if len(lengths) != 1:
- raise LocalProtocolError("conflicting Content-Length headers")
- value = lengths.pop()
- validate(_content_length_re, value, "bad Content-Length")
- if seen_content_length is None:
- seen_content_length = value
- new_headers.append((raw_name, name, value))
- elif seen_content_length != value:
- raise LocalProtocolError("conflicting Content-Length headers")
- elif name == b"transfer-encoding":
- # "A server that receives a request message with a transfer coding
- # it does not understand SHOULD respond with 501 (Not
- # Implemented)."
- # https://tools.ietf.org/html/rfc7230#section-3.3.1
- if saw_transfer_encoding:
- raise LocalProtocolError(
- "multiple Transfer-Encoding headers", error_status_hint=501
- )
- # "All transfer-coding names are case-insensitive"
- # -- https://tools.ietf.org/html/rfc7230#section-4
- value = value.lower()
- if value != b"chunked":
- raise LocalProtocolError(
- "Only Transfer-Encoding: chunked is supported",
- error_status_hint=501,
- )
- saw_transfer_encoding = True
- new_headers.append((raw_name, name, value))
- else:
- new_headers.append((raw_name, name, value))
- return Headers(new_headers)
-
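As a sketch of the Content-Length handling above, observed through public event construction (events run their headers through `normalize_and_validate`): duplicate but identical values collapse to one, while conflicting values are rejected:

```python
import h11

r = h11.Response(status_code=200,
                 headers=[("Content-Length", "10"),
                          ("Content-Length", "10")])
assert r.headers == [(b"content-length", b"10")]   # duplicates merged

try:
    h11.Response(status_code=200,
                 headers=[("Content-Length", "10"),
                          ("Content-Length", "20")])
except h11.LocalProtocolError as exc:
    print(exc)   # "conflicting Content-Length headers"
```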
-
-def get_comma_header(headers: Headers, name: bytes) -> List[bytes]:
- # Should only be used for headers whose value is a list of
- # comma-separated, case-insensitive values.
- #
- # The header name `name` is expected to be lower-case bytes.
- #
- # Connection: meets these criteria (including case insensitivity).
- #
- # Content-Length: technically is just a single value (1*DIGIT), but the
- # standard makes reference to implementations that do multiple values, and
- # using this doesn't hurt. Ditto, case insensitivity doesn't hurt things
- # either way.
- #
- # Transfer-Encoding: is more complex (allows for quoted strings), so
- # splitting on , is actually wrong. For example, this is legal:
- #
- # Transfer-Encoding: foo; options="1,2", chunked
- #
- # and should be parsed as
- #
- # foo; options="1,2"
- # chunked
- #
- # but this naive function will parse it as
- #
- # foo; options="1
- # 2"
- # chunked
- #
- # However, this is okay because the only thing we are going to do with
- # any Transfer-Encoding is reject ones that aren't just "chunked", so
- # both of these will be treated the same anyway.
- #
- # Expect: the only legal value is the literal string
- # "100-continue". Splitting on commas is harmless. Case insensitive.
- #
- out: List[bytes] = []
- for _, found_name, found_raw_value in headers._full_items:
- if found_name == name:
- found_raw_value = found_raw_value.lower()
- for found_split_value in found_raw_value.split(b","):
- found_split_value = found_split_value.strip()
- if found_split_value:
- out.append(found_split_value)
- return out
-
-
-def set_comma_header(headers: Headers, name: bytes, new_values: List[bytes]) -> Headers:
- # The header name `name` is expected to be lower-case bytes.
- #
- # Note that when we store the header we use title casing for the header
- # names, in order to match the conventional HTTP header style.
- #
- # Simply calling `.title()` is a blunt approach, but it's correct
- # here given the cases where we're using `set_comma_header`...
- #
- # Connection, Content-Length, Transfer-Encoding.
- new_headers: List[Tuple[bytes, bytes]] = []
- for found_raw_name, found_name, found_raw_value in headers._full_items:
- if found_name != name:
- new_headers.append((found_raw_name, found_raw_value))
- for new_value in new_values:
- new_headers.append((name.title(), new_value))
- return normalize_and_validate(new_headers)
-
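A quick sketch of these two helpers (note they are private internals of `h11._headers`, not public API):

```python
from h11._headers import get_comma_header, normalize_and_validate, set_comma_header

headers = normalize_and_validate([("Connection", "keep-alive, Upgrade")])
# Values are lowercased, split on commas, and stripped:
assert get_comma_header(headers, b"connection") == [b"keep-alive", b"upgrade"]

# Replacing the list re-writes the header with title-cased naming:
headers = set_comma_header(headers, b"connection", [b"close"])
assert headers == [(b"connection", b"close")]
```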
-
-def has_expect_100_continue(request: "Request") -> bool:
- # https://tools.ietf.org/html/rfc7231#section-5.1.1
- # "A server that receives a 100-continue expectation in an HTTP/1.0 request
- # MUST ignore that expectation."
- if request.http_version < b"1.1":
- return False
- expect = get_comma_header(request.headers, b"expect")
- return b"100-continue" in expect
diff --git a/env/lib/python3.9/site-packages/h11/_readers.py b/env/lib/python3.9/site-packages/h11/_readers.py
deleted file mode 100644
index a036d79..0000000
--- a/env/lib/python3.9/site-packages/h11/_readers.py
+++ /dev/null
@@ -1,249 +0,0 @@
-# Code to read HTTP data
-#
-# Strategy: each reader is a callable which takes a ReceiveBuffer object, and
-# either:
-# 1) consumes some of it and returns an Event
-# 2) raises a LocalProtocolError (for consistency -- e.g. we call validate()
-# and it might raise a LocalProtocolError, so simpler just to always use
-# this)
-# 3) returns None, meaning "I need more data"
-#
-# If they have a .read_eof attribute, then this will be called if an EOF is
-# received -- but this is optional. Either way, the actual ConnectionClosed
-# event will be generated afterwards.
-#
-# READERS is a dict describing how to pick a reader. It maps states to either:
-# - a reader
-# - or, for body readers, a dict of per-framing reader factories
-
-import re
-from typing import Any, Callable, Dict, Iterable, NoReturn, Optional, Tuple, Type, Union
-
-from ._abnf import chunk_header, header_field, request_line, status_line
-from ._events import Data, EndOfMessage, InformationalResponse, Request, Response
-from ._receivebuffer import ReceiveBuffer
-from ._state import (
- CLIENT,
- CLOSED,
- DONE,
- IDLE,
- MUST_CLOSE,
- SEND_BODY,
- SEND_RESPONSE,
- SERVER,
-)
-from ._util import LocalProtocolError, RemoteProtocolError, Sentinel, validate
-
-__all__ = ["READERS"]
-
-header_field_re = re.compile(header_field.encode("ascii"))
-
-# Remember that this has to run in O(n) time -- so e.g. the bytearray cast is
-# critical.
-obs_fold_re = re.compile(br"[ \t]+")
-
-
-def _obsolete_line_fold(lines: Iterable[bytes]) -> Iterable[bytes]:
- it = iter(lines)
- last: Optional[bytes] = None
- for line in it:
- match = obs_fold_re.match(line)
- if match:
- if last is None:
- raise LocalProtocolError("continuation line at start of headers")
- if not isinstance(last, bytearray):
- last = bytearray(last)
- last += b" "
- last += line[match.end() :]
- else:
- if last is not None:
- yield last
- last = line
- if last is not None:
- yield last
-
-
-def _decode_header_lines(
- lines: Iterable[bytes],
-) -> Iterable[Tuple[bytes, bytes]]:
- for line in _obsolete_line_fold(lines):
- matches = validate(header_field_re, line, "illegal header line: {!r}", line)
- yield (matches["field_name"], matches["field_value"])
-
-
-request_line_re = re.compile(request_line.encode("ascii"))
-
-
-def maybe_read_from_IDLE_client(buf: ReceiveBuffer) -> Optional[Request]:
- lines = buf.maybe_extract_lines()
- if lines is None:
- if buf.is_next_line_obviously_invalid_request_line():
- raise LocalProtocolError("illegal request line")
- return None
- if not lines:
- raise LocalProtocolError("no request line received")
- matches = validate(
- request_line_re, lines[0], "illegal request line: {!r}", lines[0]
- )
- return Request(
- headers=list(_decode_header_lines(lines[1:])), _parsed=True, **matches
- )
-
-
-status_line_re = re.compile(status_line.encode("ascii"))
-
-
-def maybe_read_from_SEND_RESPONSE_server(
- buf: ReceiveBuffer,
-) -> Union[InformationalResponse, Response, None]:
- lines = buf.maybe_extract_lines()
- if lines is None:
- if buf.is_next_line_obviously_invalid_request_line():
- raise LocalProtocolError("illegal request line")
- return None
- if not lines:
- raise LocalProtocolError("no response line received")
- matches = validate(status_line_re, lines[0], "illegal status line: {!r}", lines[0])
- http_version = (
- b"1.1" if matches["http_version"] is None else matches["http_version"]
- )
- reason = b"" if matches["reason"] is None else matches["reason"]
- status_code = int(matches["status_code"])
- class_: Union[Type[InformationalResponse], Type[Response]] = (
- InformationalResponse if status_code < 200 else Response
- )
- return class_(
- headers=list(_decode_header_lines(lines[1:])),
- _parsed=True,
- status_code=status_code,
- reason=reason,
- http_version=http_version,
- )
-
-
-class ContentLengthReader:
- def __init__(self, length: int) -> None:
- self._length = length
- self._remaining = length
-
- def __call__(self, buf: ReceiveBuffer) -> Union[Data, EndOfMessage, None]:
- if self._remaining == 0:
- return EndOfMessage()
- data = buf.maybe_extract_at_most(self._remaining)
- if data is None:
- return None
- self._remaining -= len(data)
- return Data(data=data)
-
- def read_eof(self) -> NoReturn:
- raise RemoteProtocolError(
- "peer closed connection without sending complete message body "
- "(received {} bytes, expected {})".format(
- self._length - self._remaining, self._length
- )
- )
-
-
-chunk_header_re = re.compile(chunk_header.encode("ascii"))
-
-
-class ChunkedReader:
- def __init__(self) -> None:
- self._bytes_in_chunk = 0
- # After reading a chunk, we have to throw away the trailing \r\n; if
- # this is >0 then we discard that many bytes before resuming regular
- # de-chunkification.
- self._bytes_to_discard = 0
- self._reading_trailer = False
-
- def __call__(self, buf: ReceiveBuffer) -> Union[Data, EndOfMessage, None]:
- if self._reading_trailer:
- lines = buf.maybe_extract_lines()
- if lines is None:
- return None
- return EndOfMessage(headers=list(_decode_header_lines(lines)))
- if self._bytes_to_discard > 0:
- data = buf.maybe_extract_at_most(self._bytes_to_discard)
- if data is None:
- return None
- self._bytes_to_discard -= len(data)
- if self._bytes_to_discard > 0:
- return None
- # else, fall through and read some more
- assert self._bytes_to_discard == 0
- if self._bytes_in_chunk == 0:
- # We need to refill our chunk count
- chunk_header = buf.maybe_extract_next_line()
- if chunk_header is None:
- return None
- matches = validate(
- chunk_header_re,
- chunk_header,
- "illegal chunk header: {!r}",
- chunk_header,
- )
- # XX FIXME: we discard chunk extensions. Does anyone care?
- self._bytes_in_chunk = int(matches["chunk_size"], base=16)
- if self._bytes_in_chunk == 0:
- self._reading_trailer = True
- return self(buf)
- chunk_start = True
- else:
- chunk_start = False
- assert self._bytes_in_chunk > 0
- data = buf.maybe_extract_at_most(self._bytes_in_chunk)
- if data is None:
- return None
- self._bytes_in_chunk -= len(data)
- if self._bytes_in_chunk == 0:
- self._bytes_to_discard = 2
- chunk_end = True
- else:
- chunk_end = False
- return Data(data=data, chunk_start=chunk_start, chunk_end=chunk_end)
-
- def read_eof(self) -> NoReturn:
- raise RemoteProtocolError(
- "peer closed connection without sending complete message body "
- "(incomplete chunked read)"
- )
-
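As a sketch of the chunked path end-to-end (via the public API rather than calling `ChunkedReader` directly): a one-chunk body comes out as a single `Data` event with both chunk markers set:

```python
import h11

server = h11.Connection(our_role=h11.SERVER)
server.receive_data(
    b"POST /upload HTTP/1.1\r\n"
    b"Host: example.com\r\n"
    b"Transfer-Encoding: chunked\r\n"
    b"\r\n"
    b"5\r\nhello\r\n0\r\n\r\n"
)
assert type(server.next_event()) is h11.Request
assert server.next_event() == h11.Data(data=b"hello",
                                       chunk_start=True, chunk_end=True)
assert server.next_event() == h11.EndOfMessage()
```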
-
-class Http10Reader:
- def __call__(self, buf: ReceiveBuffer) -> Optional[Data]:
- data = buf.maybe_extract_at_most(999999999)
- if data is None:
- return None
- return Data(data=data)
-
- def read_eof(self) -> EndOfMessage:
- return EndOfMessage()
-
-
-def expect_nothing(buf: ReceiveBuffer) -> None:
- if buf:
- raise LocalProtocolError("Got data when expecting EOF")
- return None
-
-
-ReadersType = Dict[
- Union[Sentinel, Tuple[Sentinel, Sentinel]],
- Union[Callable[..., Any], Dict[str, Callable[..., Any]]],
-]
-
-READERS: ReadersType = {
- (CLIENT, IDLE): maybe_read_from_IDLE_client,
- (SERVER, IDLE): maybe_read_from_SEND_RESPONSE_server,
- (SERVER, SEND_RESPONSE): maybe_read_from_SEND_RESPONSE_server,
- (CLIENT, DONE): expect_nothing,
- (CLIENT, MUST_CLOSE): expect_nothing,
- (CLIENT, CLOSED): expect_nothing,
- (SERVER, DONE): expect_nothing,
- (SERVER, MUST_CLOSE): expect_nothing,
- (SERVER, CLOSED): expect_nothing,
- SEND_BODY: {
- "chunked": ChunkedReader,
- "content-length": ContentLengthReader,
- "http/1.0": Http10Reader,
- },
-}
diff --git a/env/lib/python3.9/site-packages/h11/_receivebuffer.py b/env/lib/python3.9/site-packages/h11/_receivebuffer.py
deleted file mode 100644
index e5c4e08..0000000
--- a/env/lib/python3.9/site-packages/h11/_receivebuffer.py
+++ /dev/null
@@ -1,153 +0,0 @@
-import re
-import sys
-from typing import List, Optional, Union
-
-__all__ = ["ReceiveBuffer"]
-
-
-# Operations we want to support:
-# - find next \r\n or \r\n\r\n (\n or \n\n are also acceptable),
-# or wait until there is one
-# - read at-most-N bytes
-# Goals:
-# - on average, do this fast
-# - worst case, do this in O(n) where n is the number of bytes processed
-# Plan:
-# - store bytearray, offset, how far we've searched for a separator token
-# - use the how-far-we've-searched data to avoid rescanning
-# - while doing a stream of uninterrupted processing, advance offset instead
-# of constantly copying
-# WARNING:
-# - I haven't benchmarked or profiled any of this yet.
-#
-# Note that starting in Python 3.4, deleting the initial n bytes from a
-# bytearray is amortized O(n), thanks to some excellent work by Antoine
-# Martin:
-#
-# https://bugs.python.org/issue19087
-#
-# This means that if we only supported 3.4+, we could get rid of the code here
-# involving self._start and self.compress, because it's doing exactly the same
-# thing that bytearray now does internally.
-#
-# BUT unfortunately, we still support 2.7, and reading short segments out of a
-# long buffer MUST be O(bytes read) to avoid DoS issues, so we can't actually
-# delete this code. Yet:
-#
-# https://pythonclock.org/
-#
-# (Two things to double-check first though: make sure PyPy also has the
-# optimization, and benchmark to make sure it's a win, since we do have a
-# slightly clever thing where we delay calling compress() until we've
-# processed a whole event, which could in theory be slightly more efficient
-# than the internal bytearray support.)
-blank_line_regex = re.compile(b"\n\r?\n", re.MULTILINE)
-
-
-class ReceiveBuffer:
- def __init__(self) -> None:
- self._data = bytearray()
- self._next_line_search = 0
- self._multiple_lines_search = 0
-
- def __iadd__(self, byteslike: Union[bytes, bytearray]) -> "ReceiveBuffer":
- self._data += byteslike
- return self
-
- def __bool__(self) -> bool:
- return bool(len(self))
-
- def __len__(self) -> int:
- return len(self._data)
-
- # for @property unprocessed_data
- def __bytes__(self) -> bytes:
- return bytes(self._data)
-
- def _extract(self, count: int) -> bytearray:
- # Extract an initial slice of the data buffer and return it.
- out = self._data[:count]
- del self._data[:count]
-
- self._next_line_search = 0
- self._multiple_lines_search = 0
-
- return out
-
- def maybe_extract_at_most(self, count: int) -> Optional[bytearray]:
- """
- Extract a fixed number of bytes from the buffer.
- """
- out = self._data[:count]
- if not out:
- return None
-
- return self._extract(count)
-
- def maybe_extract_next_line(self) -> Optional[bytearray]:
- """
- Extract the first line, if it is completed in the buffer.
- """
- # Only search in buffer space that we've not already looked at.
- search_start_index = max(0, self._next_line_search - 1)
- partial_idx = self._data.find(b"\r\n", search_start_index)
-
- if partial_idx == -1:
- self._next_line_search = len(self._data)
- return None
-
- # + 2 is to compensate for len(b"\r\n")
- idx = partial_idx + 2
-
- return self._extract(idx)
-
- def maybe_extract_lines(self) -> Optional[List[bytearray]]:
- """
- Extract everything up to the first blank line, and return a list of lines.
- """
- # Handle the case where we have an immediate empty line.
- if self._data[:1] == b"\n":
- self._extract(1)
- return []
-
- if self._data[:2] == b"\r\n":
- self._extract(2)
- return []
-
- # Only search in buffer space that we've not already looked at.
- match = blank_line_regex.search(self._data, self._multiple_lines_search)
- if match is None:
- self._multiple_lines_search = max(0, len(self._data) - 2)
- return None
-
- # Truncate the buffer and return it.
- idx = match.span(0)[-1]
- out = self._extract(idx)
- lines = out.split(b"\n")
-
- for line in lines:
- if line.endswith(b"\r"):
- del line[-1]
-
- assert lines[-2] == lines[-1] == b""
-
- del lines[-2:]
-
- return lines
-
- # In theory we should wait until `\r\n` before starting to validate
- # incoming data. However, it's useful to detect (very) invalid data
- # early, since it might not contain `\r\n` at all (in which case only a
- # timeout would get rid of it).
- # This is not a 100% effective detection, but more of a cheap sanity check
- # allowing for early abort in some useful cases.
- # It is especially helpful when a peer gets HTTP and HTTPS mixed up and
- # sends us a TLS stream where we were expecting plain HTTP: all versions
- # of TLS so far start the handshake with a 0x16 message type code.
- def is_next_line_obviously_invalid_request_line(self) -> bool:
- try:
- # HTTP header line must not contain non-printable characters
- # and should not start with a space
- return self._data[0] < 0x21
- except IndexError:
- return False
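A small sketch of the buffer in action (again a private internal, `h11._receivebuffer`): lines are only extracted once the blank line arrives, and whatever follows stays buffered:

```python
from h11._receivebuffer import ReceiveBuffer

buf = ReceiveBuffer()
buf += b"HTTP/1.1 200 OK\r\nContent-"
assert buf.maybe_extract_lines() is None     # no blank line seen yet

buf += b"Length: 0\r\n\r\nleftover"
assert buf.maybe_extract_lines() == [
    b"HTTP/1.1 200 OK",
    b"Content-Length: 0",
]
assert bytes(buf) == b"leftover"             # trailing data stays put
```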
diff --git a/env/lib/python3.9/site-packages/h11/_state.py b/env/lib/python3.9/site-packages/h11/_state.py
deleted file mode 100644
index 2790768..0000000
--- a/env/lib/python3.9/site-packages/h11/_state.py
+++ /dev/null
@@ -1,363 +0,0 @@
-################################################################
-# The core state machine
-################################################################
-#
-# Rule 1: everything that affects the state machine and state transitions must
-# live here in this file. As much as possible goes into the table-based
-# representation, but for the bits that don't quite fit, the actual code and
-# state must nonetheless live here.
-#
-# Rule 2: this file does not know about what role we're playing; it only knows
-# about HTTP request/response cycles in the abstract. This ensures that we
-# don't cheat and apply different rules to local and remote parties.
-#
-#
-# Theory of operation
-# ===================
-#
-# Possibly the simplest way to think about this is that we actually have 5
-# different state machines here. Yes, 5. These are:
-#
-# 1) The client state, with its complicated automaton (see the docs)
-# 2) The server state, with its complicated automaton (see the docs)
-# 3) The keep-alive state, with possible states {True, False}
-# 4) The SWITCH_CONNECT state, with possible states {False, True}
-# 5) The SWITCH_UPGRADE state, with possible states {False, True}
-#
-# For (3)-(5), the first state listed is the initial state.
-#
-# (1)-(3) are stored explicitly in member variables. The last
-# two are stored implicitly in the pending_switch_proposals set as:
-# (state of 4) == (_SWITCH_CONNECT in pending_switch_proposals)
-# (state of 5) == (_SWITCH_UPGRADE in pending_switch_proposals)
-#
-# And each of these machines has two different kinds of transitions:
-#
-# a) Event-triggered
-# b) State-triggered
-#
-# Event triggered is the obvious thing that you'd think it is: some event
-# happens, and if it's the right event at the right time then a transition
-# happens. But there are somewhat complicated rules for which machines can
-# "see" which events. (As a rule of thumb, if a machine "sees" an event, this
-# means two things: the event can affect the machine, and if the machine is
-# not in a state where it expects that event then it's an error.) These rules
-# are:
-#
-# 1) The client machine sees all h11.events objects emitted by the client.
-#
-# 2) The server machine sees all h11.events objects emitted by the server.
-#
-# It also sees the client's Request event.
-#
-# And sometimes, server events are annotated with a _SWITCH_* event. For
-# example, we can have a (Response, _SWITCH_CONNECT) event, which is
-# different from a regular Response event.
-#
-# 3) The keep-alive machine sees the process_keep_alive_disabled() event
-# (which is derived from Request/Response events), and this event
-# transitions it from True -> False, or from False -> False. There's no way
-# to transition back.
-#
-# 4&5) The _SWITCH_* machines transition from False->True when we get a
-# Request that proposes the relevant type of switch (via
- process_client_switch_proposal), and they go from True->False when we
-# get a Response that has no _SWITCH_* annotation.
-#
-# So that's event-triggered transitions.
-#
-# State-triggered transitions are less standard. What they do here is couple
-# the machines together. The way this works is, when certain *joint*
-# configurations of states are achieved, then we automatically transition to a
-# new *joint* state. So, for example, if we're ever in a joint state with
-#
-# client: DONE
-# keep-alive: False
-#
-# then the client state immediately transitions to:
-#
-# client: MUST_CLOSE
-#
-# This is fundamentally different from an event-based transition, because it
-# doesn't matter how we arrived at the {client: DONE, keep-alive: False} state
-# -- maybe the client transitioned SEND_BODY -> DONE, or keep-alive
-# transitioned True -> False. Either way, once this precondition is satisfied,
-# this transition is immediately triggered.
-#
-# What if two conflicting state-based transitions get enabled at the same
-# time? In practice there's only one case where this arises (client DONE ->
-# MIGHT_SWITCH_PROTOCOL versus DONE -> MUST_CLOSE), and we resolve it by
-# explicitly prioritizing the DONE -> MIGHT_SWITCH_PROTOCOL transition.
-#
-# Implementation
-# --------------
-#
-# The event-triggered transitions for the server and client machines are all
-# stored explicitly in a table. Ditto for the state-triggered transitions that
-# involve just the server and client state.
-#
-# The transitions for the other machines, and the state-triggered transitions
-# that involve the other machines, are written out as explicit Python code.
-#
-# It'd be nice if there were some cleaner way to do all this. This isn't
-# *too* terrible, but I feel like it could probably be better.
-#
-# WARNING
-# -------
-#
-# The script that generates the state machine diagrams for the docs knows how
-# to read out the EVENT_TRIGGERED_TRANSITIONS and STATE_TRIGGERED_TRANSITIONS
-# tables. But it can't automatically read the transitions that are written
-# directly in Python code. So if you touch those, you need to also update the
-# script to keep it in sync!
-from typing import cast, Dict, Optional, Set, Tuple, Type, Union
-
-from ._events import *
-from ._util import LocalProtocolError, Sentinel
-
-# Everything in __all__ gets re-exported as part of the h11 public API.
-__all__ = [
- "CLIENT",
- "SERVER",
- "IDLE",
- "SEND_RESPONSE",
- "SEND_BODY",
- "DONE",
- "MUST_CLOSE",
- "CLOSED",
- "MIGHT_SWITCH_PROTOCOL",
- "SWITCHED_PROTOCOL",
- "ERROR",
-]
-
-
-class CLIENT(Sentinel, metaclass=Sentinel):
- pass
-
-
-class SERVER(Sentinel, metaclass=Sentinel):
- pass
-
-
-# States
-class IDLE(Sentinel, metaclass=Sentinel):
- pass
-
-
-class SEND_RESPONSE(Sentinel, metaclass=Sentinel):
- pass
-
-
-class SEND_BODY(Sentinel, metaclass=Sentinel):
- pass
-
-
-class DONE(Sentinel, metaclass=Sentinel):
- pass
-
-
-class MUST_CLOSE(Sentinel, metaclass=Sentinel):
- pass
-
-
-class CLOSED(Sentinel, metaclass=Sentinel):
- pass
-
-
-class ERROR(Sentinel, metaclass=Sentinel):
- pass
-
-
-# Switch types
-class MIGHT_SWITCH_PROTOCOL(Sentinel, metaclass=Sentinel):
- pass
-
-
-class SWITCHED_PROTOCOL(Sentinel, metaclass=Sentinel):
- pass
-
-
-class _SWITCH_UPGRADE(Sentinel, metaclass=Sentinel):
- pass
-
-
-class _SWITCH_CONNECT(Sentinel, metaclass=Sentinel):
- pass
-
-
-EventTransitionType = Dict[
- Type[Sentinel],
- Dict[
- Type[Sentinel],
- Dict[Union[Type[Event], Tuple[Type[Event], Type[Sentinel]]], Type[Sentinel]],
- ],
-]
-
-EVENT_TRIGGERED_TRANSITIONS: EventTransitionType = {
- CLIENT: {
- IDLE: {Request: SEND_BODY, ConnectionClosed: CLOSED},
- SEND_BODY: {Data: SEND_BODY, EndOfMessage: DONE},
- DONE: {ConnectionClosed: CLOSED},
- MUST_CLOSE: {ConnectionClosed: CLOSED},
- CLOSED: {ConnectionClosed: CLOSED},
- MIGHT_SWITCH_PROTOCOL: {},
- SWITCHED_PROTOCOL: {},
- ERROR: {},
- },
- SERVER: {
- IDLE: {
- ConnectionClosed: CLOSED,
- Response: SEND_BODY,
- # Special case: server sees client Request events, in this form
- (Request, CLIENT): SEND_RESPONSE,
- },
- SEND_RESPONSE: {
- InformationalResponse: SEND_RESPONSE,
- Response: SEND_BODY,
- (InformationalResponse, _SWITCH_UPGRADE): SWITCHED_PROTOCOL,
- (Response, _SWITCH_CONNECT): SWITCHED_PROTOCOL,
- },
- SEND_BODY: {Data: SEND_BODY, EndOfMessage: DONE},
- DONE: {ConnectionClosed: CLOSED},
- MUST_CLOSE: {ConnectionClosed: CLOSED},
- CLOSED: {ConnectionClosed: CLOSED},
- SWITCHED_PROTOCOL: {},
- ERROR: {},
- },
-}
-
-# NB: there are also some special-case state-triggered transitions hard-coded
-# into _fire_state_triggered_transitions below.
-STATE_TRIGGERED_TRANSITIONS = {
- # (Client state, Server state) -> new states
- # Protocol negotiation
- (MIGHT_SWITCH_PROTOCOL, SWITCHED_PROTOCOL): {CLIENT: SWITCHED_PROTOCOL},
- # Socket shutdown
- (CLOSED, DONE): {SERVER: MUST_CLOSE},
- (CLOSED, IDLE): {SERVER: MUST_CLOSE},
- (ERROR, DONE): {SERVER: MUST_CLOSE},
- (DONE, CLOSED): {CLIENT: MUST_CLOSE},
- (IDLE, CLOSED): {CLIENT: MUST_CLOSE},
- (DONE, ERROR): {CLIENT: MUST_CLOSE},
-}
-
-
-class ConnectionState:
- def __init__(self) -> None:
- # Extra bits of state that don't quite fit into the state model.
-
-        # If this is False, the automatic DONE -> MUST_CLOSE transition is
-        # enabled. Don't set this directly; call .process_keep_alive_disabled()
- self.keep_alive = True
-
- # This is a subset of {UPGRADE, CONNECT}, containing the proposals
- # made by the client for switching protocols.
- self.pending_switch_proposals: Set[Type[Sentinel]] = set()
-
- self.states: Dict[Type[Sentinel], Type[Sentinel]] = {CLIENT: IDLE, SERVER: IDLE}
-
- def process_error(self, role: Type[Sentinel]) -> None:
- self.states[role] = ERROR
- self._fire_state_triggered_transitions()
-
- def process_keep_alive_disabled(self) -> None:
- self.keep_alive = False
- self._fire_state_triggered_transitions()
-
- def process_client_switch_proposal(self, switch_event: Type[Sentinel]) -> None:
- self.pending_switch_proposals.add(switch_event)
- self._fire_state_triggered_transitions()
-
- def process_event(
- self,
- role: Type[Sentinel],
- event_type: Type[Event],
- server_switch_event: Optional[Type[Sentinel]] = None,
- ) -> None:
- _event_type: Union[Type[Event], Tuple[Type[Event], Type[Sentinel]]] = event_type
- if server_switch_event is not None:
- assert role is SERVER
- if server_switch_event not in self.pending_switch_proposals:
- raise LocalProtocolError(
- "Received server {} event without a pending proposal".format(
- server_switch_event
- )
- )
- _event_type = (event_type, server_switch_event)
- if server_switch_event is None and _event_type is Response:
- self.pending_switch_proposals = set()
- self._fire_event_triggered_transitions(role, _event_type)
- # Special case: the server state does get to see Request
- # events.
- if _event_type is Request:
- assert role is CLIENT
- self._fire_event_triggered_transitions(SERVER, (Request, CLIENT))
- self._fire_state_triggered_transitions()
-
- def _fire_event_triggered_transitions(
- self,
- role: Type[Sentinel],
- event_type: Union[Type[Event], Tuple[Type[Event], Type[Sentinel]]],
- ) -> None:
- state = self.states[role]
- try:
- new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
- except KeyError:
- event_type = cast(Type[Event], event_type)
- raise LocalProtocolError(
- "can't handle event type {} when role={} and state={}".format(
- event_type.__name__, role, self.states[role]
- )
- ) from None
- self.states[role] = new_state
-
- def _fire_state_triggered_transitions(self) -> None:
- # We apply these rules repeatedly until converging on a fixed point
- while True:
- start_states = dict(self.states)
-
- # It could happen that both these special-case transitions are
- # enabled at the same time:
- #
- # DONE -> MIGHT_SWITCH_PROTOCOL
- # DONE -> MUST_CLOSE
- #
-        # For example, this will always be true of an HTTP/1.0 client
- # requesting CONNECT. If this happens, the protocol switch takes
- # priority. From there the client will either go to
- # SWITCHED_PROTOCOL, in which case it's none of our business when
- # they close the connection, or else the server will deny the
- # request, in which case the client will go back to DONE and then
- # from there to MUST_CLOSE.
- if self.pending_switch_proposals:
- if self.states[CLIENT] is DONE:
- self.states[CLIENT] = MIGHT_SWITCH_PROTOCOL
-
- if not self.pending_switch_proposals:
- if self.states[CLIENT] is MIGHT_SWITCH_PROTOCOL:
- self.states[CLIENT] = DONE
-
- if not self.keep_alive:
- for role in (CLIENT, SERVER):
- if self.states[role] is DONE:
- self.states[role] = MUST_CLOSE
-
- # Tabular state-triggered transitions
- joint_state = (self.states[CLIENT], self.states[SERVER])
- changes = STATE_TRIGGERED_TRANSITIONS.get(joint_state, {})
- self.states.update(changes) # type: ignore
-
- if self.states == start_states:
- # Fixed point reached
- return
-
- def start_next_cycle(self) -> None:
- if self.states != {CLIENT: DONE, SERVER: DONE}:
- raise LocalProtocolError(
- "not in a reusable state. self.states={}".format(self.states)
- )
- # Can't reach DONE/DONE with any of these active, but still, let's be
- # sure.
- assert self.keep_alive
- assert not self.pending_switch_proposals
- self.states = {CLIENT: IDLE, SERVER: IDLE}
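The state machinery above is easiest to see in motion. Below is a minimal sketch (not part of the diff) that drives the removed `ConnectionState` class through one request/response cycle; it assumes an installed copy of h11, and note that `ConnectionState` and the role/state sentinels are private API.

```python
# Sketch only: exercises h11's private ConnectionState directly.
from h11._events import EndOfMessage, Request, Response
from h11._state import CLIENT, DONE, IDLE, SERVER, ConnectionState

cs = ConnectionState()
assert cs.states == {CLIENT: IDLE, SERVER: IDLE}

cs.process_event(CLIENT, Request)       # also moves SERVER to SEND_RESPONSE
cs.process_event(CLIENT, EndOfMessage)  # client body finished
cs.process_event(SERVER, Response)
cs.process_event(SERVER, EndOfMessage)  # server body finished

assert cs.states == {CLIENT: DONE, SERVER: DONE}
cs.start_next_cycle()                   # DONE/DONE means the connection is reusable
assert cs.states == {CLIENT: IDLE, SERVER: IDLE}
```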
diff --git a/env/lib/python3.9/site-packages/h11/_util.py b/env/lib/python3.9/site-packages/h11/_util.py
deleted file mode 100644
index 6718445..0000000
--- a/env/lib/python3.9/site-packages/h11/_util.py
+++ /dev/null
@@ -1,135 +0,0 @@
-from typing import Any, Dict, NoReturn, Pattern, Tuple, Type, TypeVar, Union
-
-__all__ = [
- "ProtocolError",
- "LocalProtocolError",
- "RemoteProtocolError",
- "validate",
- "bytesify",
-]
-
-
-class ProtocolError(Exception):
- """Exception indicating a violation of the HTTP/1.1 protocol.
-
-    This is an abstract base class, with two concrete subclasses:
- :exc:`LocalProtocolError`, which indicates that you tried to do something
- that HTTP/1.1 says is illegal, and :exc:`RemoteProtocolError`, which
- indicates that the remote peer tried to do something that HTTP/1.1 says is
- illegal. See :ref:`error-handling` for details.
-
- In addition to the normal :exc:`Exception` features, it has one attribute:
-
- .. attribute:: error_status_hint
-
- This gives a suggestion as to what status code a server might use if
- this error occurred as part of a request.
-
- For a :exc:`RemoteProtocolError`, this is useful as a suggestion for
- how you might want to respond to a misbehaving peer, if you're
- implementing a server.
-
- For a :exc:`LocalProtocolError`, this can be taken as a suggestion for
- how your peer might have responded to *you* if h11 had allowed you to
- continue.
-
- The default is 400 Bad Request, a generic catch-all for protocol
- violations.
-
- """
-
- def __init__(self, msg: str, error_status_hint: int = 400) -> None:
- if type(self) is ProtocolError:
- raise TypeError("tried to directly instantiate ProtocolError")
- Exception.__init__(self, msg)
- self.error_status_hint = error_status_hint
-
-
-# Strategy: there are a number of public APIs where a LocalProtocolError can
-# be raised (send(), all the different event constructors, ...), and only one
-# public API where RemoteProtocolError can be raised
-# (receive_data()). Therefore we always raise LocalProtocolError internally,
-# and then receive_data will translate this into a RemoteProtocolError.
-#
-# Internally:
-# LocalProtocolError is the generic "ProtocolError".
-# Externally:
-# LocalProtocolError is for local errors and RemoteProtocolError is for
-# remote errors.
-class LocalProtocolError(ProtocolError):
- def _reraise_as_remote_protocol_error(self) -> NoReturn:
- # After catching a LocalProtocolError, use this method to re-raise it
- # as a RemoteProtocolError. This method must be called from inside an
- # except: block.
- #
- # An easy way to get an equivalent RemoteProtocolError is just to
- # modify 'self' in place.
- self.__class__ = RemoteProtocolError # type: ignore
-    # But the re-raising is somewhat non-trivial -- you might think that,
-    # now that we've modified the in-flight exception object, just doing
-    # 'raise' to re-raise it would be enough. But it turns out that
- # this doesn't work, because Python tracks the exception type
- # (exc_info[0]) separately from the exception object (exc_info[1]),
- # and we only modified the latter. So we really do need to re-raise
- # the new type explicitly.
- # On py3, the traceback is part of the exception object, so our
- # in-place modification preserved it and we can just re-raise:
- raise self
-
-
-class RemoteProtocolError(ProtocolError):
- pass
-
-
-def validate(
- regex: Pattern[bytes], data: bytes, msg: str = "malformed data", *format_args: Any
-) -> Dict[str, bytes]:
- match = regex.fullmatch(data)
- if not match:
- if format_args:
- msg = msg.format(*format_args)
- raise LocalProtocolError(msg)
- return match.groupdict()
-
-
-# Sentinel values
-#
-# - Inherit identity-based comparison and hashing from object
-# - Have a nice repr
-# - Have a *bonus property*: type(sentinel) is sentinel
-#
-# The bonus property is useful if you want to take the return value from
-# next_event() and do some sort of dispatch based on type(event).
-
-_T_Sentinel = TypeVar("_T_Sentinel", bound="Sentinel")
-
-
-class Sentinel(type):
- def __new__(
- cls: Type[_T_Sentinel],
- name: str,
- bases: Tuple[type, ...],
- namespace: Dict[str, Any],
- **kwds: Any
- ) -> _T_Sentinel:
- assert bases == (Sentinel,)
- v = super().__new__(cls, name, bases, namespace, **kwds)
- v.__class__ = v # type: ignore
- return v
-
- def __repr__(self) -> str:
- return self.__name__
-
-
-# Used for methods, request targets, HTTP versions, header names, and header
-# values. Accepts ascii-strings, or bytes/bytearray/memoryview/..., and always
-# returns bytes.
-def bytesify(s: Union[bytes, bytearray, memoryview, int, str]) -> bytes:
- # Fast-path:
- if type(s) is bytes:
- return s
- if isinstance(s, str):
- s = s.encode("ascii")
- if isinstance(s, int):
- raise TypeError("expected bytes-like object, not int")
- return bytes(s)
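The Sentinel "bonus property" above is what lets callers dispatch on `type(event)` without caring whether `next_event()` returned a real event instance or a sentinel like `NEED_DATA`. A quick illustration against the public API, assuming an installed copy of h11:

```python
import h11

# The bonus property: a sentinel is an instance of itself.
assert type(h11.NEED_DATA) is h11.NEED_DATA

# So sentinels and event instances can share one type-keyed dispatch table:
handlers = {
    h11.NEED_DATA: lambda e: "need more bytes",
    h11.Data: lambda e: "%d body bytes" % len(e.data),
}

for event in [h11.NEED_DATA, h11.Data(data=b"hi")]:
    print(handlers[type(event)](event))  # "need more bytes", then "2 body bytes"
```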
diff --git a/env/lib/python3.9/site-packages/h11/_version.py b/env/lib/python3.9/site-packages/h11/_version.py
deleted file mode 100644
index 75d4288..0000000
--- a/env/lib/python3.9/site-packages/h11/_version.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# This file must be kept very simple, because it is consumed from several
-# places -- it is imported by h11/__init__.py, execfile'd by setup.py, etc.
-
-# We use a simple scheme:
-# 1.0.0 -> 1.0.0+dev -> 1.1.0 -> 1.1.0+dev
-# where the +dev versions are never released into the wild, they're just what
-# we stick into the VCS in between releases.
-#
-# This is compatible with PEP 440:
-# http://legacy.python.org/dev/peps/pep-0440/
-# via the use of the "local suffix" "+dev", which is disallowed on index
-# servers and causes 1.0.0+dev to sort after plain 1.0.0, which is what we
-# want. (Contrast with the special suffix 1.0.0.dev, which sorts *before*
-# 1.0.0.)
-
-__version__ = "0.13.0"
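The ordering that comment relies on can be checked with the third-party `packaging` library (an assumption for illustration; h11 itself does not depend on it):

```python
# Sketch only; requires `pip install packaging`.
from packaging.version import Version

# "+dev" is a PEP 440 local suffix: it sorts *after* the plain release...
assert Version("1.0.0") < Version("1.0.0+dev") < Version("1.1.0")
# ...whereas a ".dev" pre-release suffix sorts *before* it.
assert Version("1.0.0.dev0") < Version("1.0.0")
```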
diff --git a/env/lib/python3.9/site-packages/h11/_writers.py b/env/lib/python3.9/site-packages/h11/_writers.py
deleted file mode 100644
index 90a8dc0..0000000
--- a/env/lib/python3.9/site-packages/h11/_writers.py
+++ /dev/null
@@ -1,145 +0,0 @@
-# Code to write HTTP data
-#
-# Strategy: each writer takes an event + a write-some-bytes function, which it
-# calls.
-#
-# WRITERS is a dict describing how to pick a writer. It maps states to either:
-# - a writer
-# - or, for body writers, a dict of framing-dependent writer factories
-
-from typing import Any, Callable, Dict, List, Tuple, Type, Union
-
-from ._events import Data, EndOfMessage, Event, InformationalResponse, Request, Response
-from ._headers import Headers
-from ._state import CLIENT, IDLE, SEND_BODY, SEND_RESPONSE, SERVER
-from ._util import LocalProtocolError, Sentinel
-
-__all__ = ["WRITERS"]
-
-Writer = Callable[[bytes], Any]
-
-
-def write_headers(headers: Headers, write: Writer) -> None:
- # "Since the Host field-value is critical information for handling a
- # request, a user agent SHOULD generate Host as the first header field
- # following the request-line." - RFC 7230
- raw_items = headers._full_items
- for raw_name, name, value in raw_items:
- if name == b"host":
- write(b"%s: %s\r\n" % (raw_name, value))
- for raw_name, name, value in raw_items:
- if name != b"host":
- write(b"%s: %s\r\n" % (raw_name, value))
- write(b"\r\n")
-
-
-def write_request(request: Request, write: Writer) -> None:
- if request.http_version != b"1.1":
- raise LocalProtocolError("I only send HTTP/1.1")
- write(b"%s %s HTTP/1.1\r\n" % (request.method, request.target))
- write_headers(request.headers, write)
-
-
-# Shared between InformationalResponse and Response
-def write_any_response(
- response: Union[InformationalResponse, Response], write: Writer
-) -> None:
- if response.http_version != b"1.1":
- raise LocalProtocolError("I only send HTTP/1.1")
- status_bytes = str(response.status_code).encode("ascii")
- # We don't bother sending ascii status messages like "OK"; they're
- # optional and ignored by the protocol. (But the space after the numeric
- # status code is mandatory.)
- #
- # XX FIXME: could at least make an effort to pull out the status message
- # from stdlib's http.HTTPStatus table. Or maybe just steal their enums
- # (either by import or copy/paste). We already accept them as status codes
- # since they're of type IntEnum < int.
- write(b"HTTP/1.1 %s %s\r\n" % (status_bytes, response.reason))
- write_headers(response.headers, write)
-
-
-class BodyWriter:
- def __call__(self, event: Event, write: Writer) -> None:
- if type(event) is Data:
- self.send_data(event.data, write)
- elif type(event) is EndOfMessage:
- self.send_eom(event.headers, write)
- else: # pragma: no cover
- assert False
-
- def send_data(self, data: bytes, write: Writer) -> None:
- pass
-
- def send_eom(self, headers: Headers, write: Writer) -> None:
- pass
-
-
-#
-# These are all careful not to do anything to 'data' except call len(data) and
-# write(data). This allows us to transparently pass through funny objects,
-# like placeholder objects referring to files on disk that will be sent via
-# sendfile(2).
-#
-class ContentLengthWriter(BodyWriter):
- def __init__(self, length: int) -> None:
- self._length = length
-
- def send_data(self, data: bytes, write: Writer) -> None:
- self._length -= len(data)
- if self._length < 0:
- raise LocalProtocolError("Too much data for declared Content-Length")
- write(data)
-
- def send_eom(self, headers: Headers, write: Writer) -> None:
- if self._length != 0:
- raise LocalProtocolError("Too little data for declared Content-Length")
- if headers:
- raise LocalProtocolError("Content-Length and trailers don't mix")
-
-
-class ChunkedWriter(BodyWriter):
- def send_data(self, data: bytes, write: Writer) -> None:
- # if we encoded 0-length data in the naive way, it would look like an
- # end-of-message.
- if not data:
- return
- write(b"%x\r\n" % len(data))
- write(data)
- write(b"\r\n")
-
- def send_eom(self, headers: Headers, write: Writer) -> None:
- write(b"0\r\n")
- write_headers(headers, write)
-
-
-class Http10Writer(BodyWriter):
- def send_data(self, data: bytes, write: Writer) -> None:
- write(data)
-
- def send_eom(self, headers: Headers, write: Writer) -> None:
- if headers:
- raise LocalProtocolError("can't send trailers to HTTP/1.0 client")
- # no need to close the socket ourselves, that will be taken care of by
- # Connection: close machinery
-
-
-WritersType = Dict[
- Union[Tuple[Sentinel, Sentinel], Sentinel],
- Union[
- Dict[str, Type[BodyWriter]],
- Callable[[Union[InformationalResponse, Response], Writer], None],
- Callable[[Request, Writer], None],
- ],
-]
-
-WRITERS: WritersType = {
- (CLIENT, IDLE): write_request,
- (SERVER, IDLE): write_any_response,
- (SERVER, SEND_RESPONSE): write_any_response,
- SEND_BODY: {
- "chunked": ChunkedWriter,
- "content-length": ContentLengthWriter,
- "http/1.0": Http10Writer,
- },
-}
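The wire format produced by `ChunkedWriter` is easiest to observe through the public API, which selects the writer from the WRITERS table above. A minimal sketch, assuming an installed copy of h11:

```python
import h11

c = h11.Connection(our_role=h11.CLIENT)
c.send(h11.Request(
    method="POST",
    target="/upload",
    headers=[("Host", "example.com"), ("Transfer-Encoding", "chunked")],
))
# ChunkedWriter frames each Data event as b"<hex length>\r\n<payload>\r\n":
print(c.send(h11.Data(data=b"hello")))  # b'5\r\nhello\r\n'
print(c.send(h11.Data(data=b"")))       # b'' -- zero-length data is not framed
print(c.send(h11.EndOfMessage()))       # b'0\r\n\r\n' -- terminating chunk
```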
diff --git a/env/lib/python3.9/site-packages/h11/py.typed b/env/lib/python3.9/site-packages/h11/py.typed
deleted file mode 100644
index f5642f7..0000000
--- a/env/lib/python3.9/site-packages/h11/py.typed
+++ /dev/null
@@ -1 +0,0 @@
-Marker
diff --git a/env/lib/python3.9/site-packages/h11/tests/__init__.py b/env/lib/python3.9/site-packages/h11/tests/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/h11/tests/data/test-file b/env/lib/python3.9/site-packages/h11/tests/data/test-file
deleted file mode 100644
index d0be0a6..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/data/test-file
+++ /dev/null
@@ -1 +0,0 @@
-92b12bc045050b55b848d37167a1a63947c364579889ce1d39788e45e9fac9e5
diff --git a/env/lib/python3.9/site-packages/h11/tests/helpers.py b/env/lib/python3.9/site-packages/h11/tests/helpers.py
deleted file mode 100644
index 571be44..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/helpers.py
+++ /dev/null
@@ -1,101 +0,0 @@
-from typing import cast, List, Type, Union, ValuesView
-
-from .._connection import Connection, NEED_DATA, PAUSED
-from .._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from .._state import CLIENT, CLOSED, DONE, MUST_CLOSE, SERVER
-from .._util import Sentinel
-
-try:
- from typing import Literal
-except ImportError:
- from typing_extensions import Literal # type: ignore
-
-
-def get_all_events(conn: Connection) -> List[Event]:
- got_events = []
- while True:
- event = conn.next_event()
- if event in (NEED_DATA, PAUSED):
- break
- event = cast(Event, event)
- got_events.append(event)
- if type(event) is ConnectionClosed:
- break
- return got_events
-
-
-def receive_and_get(conn: Connection, data: bytes) -> List[Event]:
- conn.receive_data(data)
- return get_all_events(conn)
-
-
-# Merges adjacent Data events, converts payloads to bytestrings, and removes
-# chunk boundaries.
-def normalize_data_events(in_events: List[Event]) -> List[Event]:
- out_events: List[Event] = []
- for event in in_events:
- if type(event) is Data:
- event = Data(data=bytes(event.data), chunk_start=False, chunk_end=False)
- if out_events and type(out_events[-1]) is type(event) is Data:
- out_events[-1] = Data(
- data=out_events[-1].data + event.data,
- chunk_start=out_events[-1].chunk_start,
- chunk_end=out_events[-1].chunk_end,
- )
- else:
- out_events.append(event)
- return out_events
-
-
-# Given that we want to write tests that push some events through a Connection
-# and check that its state updates appropriately... we might as well make a habit
-# of pushing them through two Connections with a fake network link in
-# between.
-class ConnectionPair:
- def __init__(self) -> None:
- self.conn = {CLIENT: Connection(CLIENT), SERVER: Connection(SERVER)}
- self.other = {CLIENT: SERVER, SERVER: CLIENT}
-
- @property
- def conns(self) -> ValuesView[Connection]:
- return self.conn.values()
-
- # expect="match" if expect=send_events; expect=[...] to say what expected
- def send(
- self,
- role: Type[Sentinel],
- send_events: Union[List[Event], Event],
- expect: Union[List[Event], Event, Literal["match"]] = "match",
- ) -> bytes:
- if not isinstance(send_events, list):
- send_events = [send_events]
- data = b""
- closed = False
- for send_event in send_events:
- new_data = self.conn[role].send(send_event)
- if new_data is None:
- closed = True
- else:
- data += new_data
- # send uses b"" to mean b"", and None to mean closed
- # receive uses b"" to mean closed, and None to mean "try again"
- # so we have to translate between the two conventions
- if data:
- self.conn[self.other[role]].receive_data(data)
- if closed:
- self.conn[self.other[role]].receive_data(b"")
- got_events = get_all_events(self.conn[self.other[role]])
- if expect == "match":
- expect = send_events
- if not isinstance(expect, list):
- expect = [expect]
- assert got_events == expect
- return data
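As a quick illustration of `normalize_data_events`, here is a sketch assuming the test helpers above are still importable as `h11.tests.helpers`:

```python
from h11 import Data
from h11.tests.helpers import normalize_data_events

# Adjacent chunks collapse into a single Data event, with the chunk
# boundary metadata stripped:
events = [
    Data(data=b"12", chunk_start=True, chunk_end=False),
    Data(data=b"345", chunk_start=False, chunk_end=True),
]
assert normalize_data_events(events) == [Data(data=b"12345")]
```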
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_against_stdlib_http.py b/env/lib/python3.9/site-packages/h11/tests/test_against_stdlib_http.py
deleted file mode 100644
index d2ee131..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_against_stdlib_http.py
+++ /dev/null
@@ -1,115 +0,0 @@
-import json
-import os.path
-import socket
-import socketserver
-import threading
-from contextlib import closing, contextmanager
-from http.server import SimpleHTTPRequestHandler
-from typing import Callable, Generator
-from urllib.request import urlopen
-
-import h11
-
-
-@contextmanager
-def socket_server(
- handler: Callable[..., socketserver.BaseRequestHandler]
-) -> Generator[socketserver.TCPServer, None, None]:
- httpd = socketserver.TCPServer(("127.0.0.1", 0), handler)
- thread = threading.Thread(
- target=httpd.serve_forever, kwargs={"poll_interval": 0.01}
- )
- thread.daemon = True
- try:
- thread.start()
- yield httpd
- finally:
- httpd.shutdown()
-
-
-test_file_path = os.path.join(os.path.dirname(__file__), "data/test-file")
-with open(test_file_path, "rb") as f:
- test_file_data = f.read()
-
-
-class SingleMindedRequestHandler(SimpleHTTPRequestHandler):
- def translate_path(self, path: str) -> str:
- return test_file_path
-
-
-def test_h11_as_client() -> None:
- with socket_server(SingleMindedRequestHandler) as httpd:
- with closing(socket.create_connection(httpd.server_address)) as s:
- c = h11.Connection(h11.CLIENT)
-
- s.sendall(
- c.send( # type: ignore[arg-type]
- h11.Request(
- method="GET", target="/foo", headers=[("Host", "localhost")]
- )
- )
- )
- s.sendall(c.send(h11.EndOfMessage())) # type: ignore[arg-type]
-
- data = bytearray()
- while True:
- event = c.next_event()
- print(event)
- if event is h11.NEED_DATA:
- # Use a small read buffer to make things more challenging
- # and exercise more paths :-)
- c.receive_data(s.recv(10))
- continue
- if type(event) is h11.Response:
- assert event.status_code == 200
- if type(event) is h11.Data:
- data += event.data
- if type(event) is h11.EndOfMessage:
- break
- assert bytes(data) == test_file_data
-
-
-class H11RequestHandler(socketserver.BaseRequestHandler):
- def handle(self) -> None:
- with closing(self.request) as s:
- c = h11.Connection(h11.SERVER)
- request = None
- while True:
- event = c.next_event()
- if event is h11.NEED_DATA:
- # Use a small read buffer to make things more challenging
- # and exercise more paths :-)
- c.receive_data(s.recv(10))
- continue
- if type(event) is h11.Request:
- request = event
- if type(event) is h11.EndOfMessage:
- break
- assert request is not None
- info = json.dumps(
- {
- "method": request.method.decode("ascii"),
- "target": request.target.decode("ascii"),
- "headers": {
- name.decode("ascii"): value.decode("ascii")
- for (name, value) in request.headers
- },
- }
- )
- s.sendall(c.send(h11.Response(status_code=200, headers=[]))) # type: ignore[arg-type]
- s.sendall(c.send(h11.Data(data=info.encode("ascii"))))
- s.sendall(c.send(h11.EndOfMessage()))
-
-
-def test_h11_as_server() -> None:
- with socket_server(H11RequestHandler) as httpd:
- host, port = httpd.server_address
- url = "http://{}:{}/some-path".format(host, port)
- with closing(urlopen(url)) as f:
- assert f.getcode() == 200
- data = f.read()
- info = json.loads(data.decode("ascii"))
- print(info)
- assert info["method"] == "GET"
- assert info["target"] == "/some-path"
- assert "urllib" in info["headers"]["user-agent"]
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_connection.py b/env/lib/python3.9/site-packages/h11/tests/test_connection.py
deleted file mode 100644
index 73a27b9..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_connection.py
+++ /dev/null
@@ -1,1122 +0,0 @@
-from typing import Any, cast, Dict, List, Optional, Tuple, Type
-
-import pytest
-
-from .._connection import _body_framing, _keep_alive, Connection, NEED_DATA, PAUSED
-from .._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from .._state import (
- CLIENT,
- CLOSED,
- DONE,
- ERROR,
- IDLE,
- MIGHT_SWITCH_PROTOCOL,
- MUST_CLOSE,
- SEND_BODY,
- SEND_RESPONSE,
- SERVER,
- SWITCHED_PROTOCOL,
-)
-from .._util import LocalProtocolError, RemoteProtocolError, Sentinel
-from .helpers import ConnectionPair, get_all_events, receive_and_get
-
-
-def test__keep_alive() -> None:
- assert _keep_alive(
- Request(method="GET", target="/", headers=[("Host", "Example.com")])
- )
- assert not _keep_alive(
- Request(
- method="GET",
- target="/",
- headers=[("Host", "Example.com"), ("Connection", "close")],
- )
- )
- assert not _keep_alive(
- Request(
- method="GET",
- target="/",
- headers=[("Host", "Example.com"), ("Connection", "a, b, cLOse, foo")],
- )
- )
- assert not _keep_alive(
- Request(method="GET", target="/", headers=[], http_version="1.0") # type: ignore[arg-type]
- )
-
- assert _keep_alive(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- assert not _keep_alive(Response(status_code=200, headers=[("Connection", "close")]))
- assert not _keep_alive(
- Response(status_code=200, headers=[("Connection", "a, b, cLOse, foo")])
- )
- assert not _keep_alive(Response(status_code=200, headers=[], http_version="1.0")) # type: ignore[arg-type]
-
-
-def test__body_framing() -> None:
- def headers(cl: Optional[int], te: bool) -> List[Tuple[str, str]]:
- headers = []
- if cl is not None:
- headers.append(("Content-Length", str(cl)))
- if te:
- headers.append(("Transfer-Encoding", "chunked"))
- return headers
-
- def resp(
- status_code: int = 200, cl: Optional[int] = None, te: bool = False
- ) -> Response:
- return Response(status_code=status_code, headers=headers(cl, te))
-
- def req(cl: Optional[int] = None, te: bool = False) -> Request:
- h = headers(cl, te)
- h += [("Host", "example.com")]
- return Request(method="GET", target="/", headers=h)
-
- # Special cases where the headers are ignored:
- for kwargs in [{}, {"cl": 100}, {"te": True}, {"cl": 100, "te": True}]:
- kwargs = cast(Dict[str, Any], kwargs)
- for meth, r in [
- (b"HEAD", resp(**kwargs)),
- (b"GET", resp(status_code=204, **kwargs)),
- (b"GET", resp(status_code=304, **kwargs)),
- ]:
- assert _body_framing(meth, r) == ("content-length", (0,))
-
- # Transfer-encoding
- for kwargs in [{"te": True}, {"cl": 100, "te": True}]:
- kwargs = cast(Dict[str, Any], kwargs)
- for meth, r in [(None, req(**kwargs)), (b"GET", resp(**kwargs))]: # type: ignore
- assert _body_framing(meth, r) == ("chunked", ())
-
- # Content-Length
- for meth, r in [(None, req(cl=100)), (b"GET", resp(cl=100))]: # type: ignore
- assert _body_framing(meth, r) == ("content-length", (100,))
-
- # No headers
- assert _body_framing(None, req()) == ("content-length", (0,)) # type: ignore
- assert _body_framing(b"GET", resp()) == ("http/1.0", ())
-
-
-def test_Connection_basics_and_content_length() -> None:
- with pytest.raises(ValueError):
- Connection("CLIENT") # type: ignore
-
- p = ConnectionPair()
- assert p.conn[CLIENT].our_role is CLIENT
- assert p.conn[CLIENT].their_role is SERVER
- assert p.conn[SERVER].our_role is SERVER
- assert p.conn[SERVER].their_role is CLIENT
-
- data = p.send(
- CLIENT,
- Request(
- method="GET",
- target="/",
- headers=[("Host", "example.com"), ("Content-Length", "10")],
- ),
- )
- assert data == (
- b"GET / HTTP/1.1\r\n" b"Host: example.com\r\n" b"Content-Length: 10\r\n\r\n"
- )
-
- for conn in p.conns:
- assert conn.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
- assert p.conn[CLIENT].our_state is SEND_BODY
- assert p.conn[CLIENT].their_state is SEND_RESPONSE
- assert p.conn[SERVER].our_state is SEND_RESPONSE
- assert p.conn[SERVER].their_state is SEND_BODY
-
- assert p.conn[CLIENT].their_http_version is None
- assert p.conn[SERVER].their_http_version == b"1.1"
-
- data = p.send(SERVER, InformationalResponse(status_code=100, headers=[])) # type: ignore[arg-type]
- assert data == b"HTTP/1.1 100 \r\n\r\n"
-
- data = p.send(SERVER, Response(status_code=200, headers=[("Content-Length", "11")]))
- assert data == b"HTTP/1.1 200 \r\nContent-Length: 11\r\n\r\n"
-
- for conn in p.conns:
- assert conn.states == {CLIENT: SEND_BODY, SERVER: SEND_BODY}
-
- assert p.conn[CLIENT].their_http_version == b"1.1"
- assert p.conn[SERVER].their_http_version == b"1.1"
-
- data = p.send(CLIENT, Data(data=b"12345"))
- assert data == b"12345"
- data = p.send(
- CLIENT, Data(data=b"67890"), expect=[Data(data=b"67890"), EndOfMessage()]
- )
- assert data == b"67890"
- data = p.send(CLIENT, EndOfMessage(), expect=[])
- assert data == b""
-
- for conn in p.conns:
- assert conn.states == {CLIENT: DONE, SERVER: SEND_BODY}
-
- data = p.send(SERVER, Data(data=b"1234567890"))
- assert data == b"1234567890"
- data = p.send(SERVER, Data(data=b"1"), expect=[Data(data=b"1"), EndOfMessage()])
- assert data == b"1"
- data = p.send(SERVER, EndOfMessage(), expect=[])
- assert data == b""
-
- for conn in p.conns:
- assert conn.states == {CLIENT: DONE, SERVER: DONE}
-
-
-def test_chunked() -> None:
- p = ConnectionPair()
-
- p.send(
- CLIENT,
- Request(
- method="GET",
- target="/",
- headers=[("Host", "example.com"), ("Transfer-Encoding", "chunked")],
- ),
- )
- data = p.send(CLIENT, Data(data=b"1234567890", chunk_start=True, chunk_end=True))
- assert data == b"a\r\n1234567890\r\n"
- data = p.send(CLIENT, Data(data=b"abcde", chunk_start=True, chunk_end=True))
- assert data == b"5\r\nabcde\r\n"
- data = p.send(CLIENT, Data(data=b""), expect=[])
- assert data == b""
- data = p.send(CLIENT, EndOfMessage(headers=[("hello", "there")]))
- assert data == b"0\r\nhello: there\r\n\r\n"
-
- p.send(
- SERVER, Response(status_code=200, headers=[("Transfer-Encoding", "chunked")])
- )
- p.send(SERVER, Data(data=b"54321", chunk_start=True, chunk_end=True))
- p.send(SERVER, Data(data=b"12345", chunk_start=True, chunk_end=True))
- p.send(SERVER, EndOfMessage())
-
- for conn in p.conns:
- assert conn.states == {CLIENT: DONE, SERVER: DONE}
-
-
-def test_chunk_boundaries() -> None:
- conn = Connection(our_role=SERVER)
-
- request = (
- b"POST / HTTP/1.1\r\n"
- b"Host: example.com\r\n"
- b"Transfer-Encoding: chunked\r\n"
- b"\r\n"
- )
- conn.receive_data(request)
- assert conn.next_event() == Request(
- method="POST",
- target="/",
- headers=[("Host", "example.com"), ("Transfer-Encoding", "chunked")],
- )
- assert conn.next_event() is NEED_DATA
-
- conn.receive_data(b"5\r\nhello\r\n")
- assert conn.next_event() == Data(data=b"hello", chunk_start=True, chunk_end=True)
-
- conn.receive_data(b"5\r\nhel")
- assert conn.next_event() == Data(data=b"hel", chunk_start=True, chunk_end=False)
-
- conn.receive_data(b"l")
- assert conn.next_event() == Data(data=b"l", chunk_start=False, chunk_end=False)
-
- conn.receive_data(b"o\r\n")
- assert conn.next_event() == Data(data=b"o", chunk_start=False, chunk_end=True)
-
- conn.receive_data(b"5\r\nhello")
- assert conn.next_event() == Data(data=b"hello", chunk_start=True, chunk_end=True)
-
- conn.receive_data(b"\r\n")
- assert conn.next_event() == NEED_DATA
-
- conn.receive_data(b"0\r\n\r\n")
- assert conn.next_event() == EndOfMessage()
-
-
-def test_client_talking_to_http10_server() -> None:
- c = Connection(CLIENT)
- c.send(Request(method="GET", target="/", headers=[("Host", "example.com")]))
- c.send(EndOfMessage())
- assert c.our_state is DONE
- # No content-length, so Http10 framing for body
- assert receive_and_get(c, b"HTTP/1.0 200 OK\r\n\r\n") == [
- Response(status_code=200, headers=[], http_version="1.0", reason=b"OK") # type: ignore[arg-type]
- ]
- assert c.our_state is MUST_CLOSE
- assert receive_and_get(c, b"12345") == [Data(data=b"12345")]
- assert receive_and_get(c, b"67890") == [Data(data=b"67890")]
- assert receive_and_get(c, b"") == [EndOfMessage(), ConnectionClosed()]
- assert c.their_state is CLOSED
-
-
-def test_server_talking_to_http10_client() -> None:
- c = Connection(SERVER)
- # No content-length, so no body
- # NB: no host header
- assert receive_and_get(c, b"GET / HTTP/1.0\r\n\r\n") == [
- Request(method="GET", target="/", headers=[], http_version="1.0"), # type: ignore[arg-type]
- EndOfMessage(),
- ]
- assert c.their_state is MUST_CLOSE
-
- # We automatically Connection: close back at them
- assert (
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- == b"HTTP/1.1 200 \r\nConnection: close\r\n\r\n"
- )
-
- assert c.send(Data(data=b"12345")) == b"12345"
- assert c.send(EndOfMessage()) == b""
- assert c.our_state is MUST_CLOSE
-
- # Check that it works if they do send Content-Length
- c = Connection(SERVER)
- # NB: no host header
- assert receive_and_get(c, b"POST / HTTP/1.0\r\nContent-Length: 10\r\n\r\n1") == [
- Request(
- method="POST",
- target="/",
- headers=[("Content-Length", "10")],
- http_version="1.0",
- ),
- Data(data=b"1"),
- ]
- assert receive_and_get(c, b"234567890") == [Data(data=b"234567890"), EndOfMessage()]
- assert c.their_state is MUST_CLOSE
- assert receive_and_get(c, b"") == [ConnectionClosed()]
-
-
-def test_automatic_transfer_encoding_in_response() -> None:
- # Check that in responses, the user can specify either Transfer-Encoding:
- # chunked or no framing at all, and in both cases we automatically select
- # the right option depending on whether the peer speaks HTTP/1.0 or
- # HTTP/1.1
- for user_headers in [
- [("Transfer-Encoding", "chunked")],
- [],
- # In fact, this even works if Content-Length is set,
- # because if both are set then Transfer-Encoding wins
- [("Transfer-Encoding", "chunked"), ("Content-Length", "100")],
- ]:
- user_headers = cast(List[Tuple[str, str]], user_headers)
- p = ConnectionPair()
- p.send(
- CLIENT,
- [
- Request(method="GET", target="/", headers=[("Host", "example.com")]),
- EndOfMessage(),
- ],
- )
- # When speaking to HTTP/1.1 client, all of the above cases get
- # normalized to Transfer-Encoding: chunked
- p.send(
- SERVER,
- Response(status_code=200, headers=user_headers),
- expect=Response(
- status_code=200, headers=[("Transfer-Encoding", "chunked")]
- ),
- )
-
- # When speaking to HTTP/1.0 client, all of the above cases get
- # normalized to no-framing-headers
- c = Connection(SERVER)
- receive_and_get(c, b"GET / HTTP/1.0\r\n\r\n")
- assert (
- c.send(Response(status_code=200, headers=user_headers))
- == b"HTTP/1.1 200 \r\nConnection: close\r\n\r\n"
- )
- assert c.send(Data(data=b"12345")) == b"12345"
-
-
-def test_automagic_connection_close_handling() -> None:
- p = ConnectionPair()
- # If the user explicitly sets Connection: close, then we notice and
- # respect it
- p.send(
- CLIENT,
- [
- Request(
- method="GET",
- target="/",
- headers=[("Host", "example.com"), ("Connection", "close")],
- ),
- EndOfMessage(),
- ],
- )
- for conn in p.conns:
- assert conn.states[CLIENT] is MUST_CLOSE
- # And if the client sets it, the server automatically echoes it back
- p.send(
- SERVER,
- # no header here...
- [Response(status_code=204, headers=[]), EndOfMessage()], # type: ignore[arg-type]
- # ...but oh look, it arrived anyway
- expect=[
- Response(status_code=204, headers=[("connection", "close")]),
- EndOfMessage(),
- ],
- )
- for conn in p.conns:
- assert conn.states == {CLIENT: MUST_CLOSE, SERVER: MUST_CLOSE}
-
-
-def test_100_continue() -> None:
- def setup() -> ConnectionPair:
- p = ConnectionPair()
- p.send(
- CLIENT,
- Request(
- method="GET",
- target="/",
- headers=[
- ("Host", "example.com"),
- ("Content-Length", "100"),
- ("Expect", "100-continue"),
- ],
- ),
- )
- for conn in p.conns:
- assert conn.client_is_waiting_for_100_continue
- assert not p.conn[CLIENT].they_are_waiting_for_100_continue
- assert p.conn[SERVER].they_are_waiting_for_100_continue
- return p
-
- # Disabled by 100 Continue
- p = setup()
- p.send(SERVER, InformationalResponse(status_code=100, headers=[])) # type: ignore[arg-type]
- for conn in p.conns:
- assert not conn.client_is_waiting_for_100_continue
- assert not conn.they_are_waiting_for_100_continue
-
- # Disabled by a real response
- p = setup()
- p.send(
- SERVER, Response(status_code=200, headers=[("Transfer-Encoding", "chunked")])
- )
- for conn in p.conns:
- assert not conn.client_is_waiting_for_100_continue
- assert not conn.they_are_waiting_for_100_continue
-
- # Disabled by the client going ahead and sending stuff anyway
- p = setup()
- p.send(CLIENT, Data(data=b"12345"))
- for conn in p.conns:
- assert not conn.client_is_waiting_for_100_continue
- assert not conn.they_are_waiting_for_100_continue
-
-
-def test_max_incomplete_event_size_countermeasure() -> None:
- # Infinitely long headers are definitely not okay
- c = Connection(SERVER)
- c.receive_data(b"GET / HTTP/1.0\r\nEndless: ")
- assert c.next_event() is NEED_DATA
- with pytest.raises(RemoteProtocolError):
- while True:
- c.receive_data(b"a" * 1024)
- c.next_event()
-
- # Checking that the same header is accepted / rejected depending on the
- # max_incomplete_event_size setting:
- c = Connection(SERVER, max_incomplete_event_size=5000)
- c.receive_data(b"GET / HTTP/1.0\r\nBig: ")
- c.receive_data(b"a" * 4000)
- c.receive_data(b"\r\n\r\n")
- assert get_all_events(c) == [
- Request(
- method="GET", target="/", http_version="1.0", headers=[("big", "a" * 4000)]
- ),
- EndOfMessage(),
- ]
-
- c = Connection(SERVER, max_incomplete_event_size=4000)
- c.receive_data(b"GET / HTTP/1.0\r\nBig: ")
- c.receive_data(b"a" * 4000)
- with pytest.raises(RemoteProtocolError):
- c.next_event()
-
-    # Temporarily exceeding the size limit is fine, as long as it's done with
- # complete events:
- c = Connection(SERVER, max_incomplete_event_size=5000)
- c.receive_data(b"GET / HTTP/1.0\r\nContent-Length: 10000")
- c.receive_data(b"\r\n\r\n" + b"a" * 10000)
- assert get_all_events(c) == [
- Request(
- method="GET",
- target="/",
- http_version="1.0",
- headers=[("Content-Length", "10000")],
- ),
- Data(data=b"a" * 10000),
- EndOfMessage(),
- ]
-
- c = Connection(SERVER, max_incomplete_event_size=100)
- # Two pipelined requests to create a way-too-big receive buffer... but
- # it's fine because we're not checking
- c.receive_data(
- b"GET /1 HTTP/1.1\r\nHost: a\r\n\r\n"
- b"GET /2 HTTP/1.1\r\nHost: b\r\n\r\n" + b"X" * 1000
- )
- assert get_all_events(c) == [
- Request(method="GET", target="/1", headers=[("host", "a")]),
- EndOfMessage(),
- ]
- # Even more data comes in, still no problem
- c.receive_data(b"X" * 1000)
- # We can respond and reuse to get the second pipelined request
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- c.send(EndOfMessage())
- c.start_next_cycle()
- assert get_all_events(c) == [
- Request(method="GET", target="/2", headers=[("host", "b")]),
- EndOfMessage(),
- ]
- # But once we unpause and try to read the next message, and find that it's
- # incomplete and the buffer is *still* way too large, then *that's* a
- # problem:
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- c.send(EndOfMessage())
- c.start_next_cycle()
- with pytest.raises(RemoteProtocolError):
- c.next_event()
-
-
-def test_reuse_simple() -> None:
- p = ConnectionPair()
- p.send(
- CLIENT,
- [Request(method="GET", target="/", headers=[("Host", "a")]), EndOfMessage()],
- )
- p.send(
- SERVER,
- [
- Response(status_code=200, headers=[(b"transfer-encoding", b"chunked")]),
- EndOfMessage(),
- ],
- )
- for conn in p.conns:
- assert conn.states == {CLIENT: DONE, SERVER: DONE}
- conn.start_next_cycle()
-
- p.send(
- CLIENT,
- [
- Request(method="DELETE", target="/foo", headers=[("Host", "a")]),
- EndOfMessage(),
- ],
- )
- p.send(
- SERVER,
- [
- Response(status_code=404, headers=[(b"transfer-encoding", b"chunked")]),
- EndOfMessage(),
- ],
- )
-
-
-def test_pipelining() -> None:
- # Client doesn't support pipelining, so we have to do this by hand
- c = Connection(SERVER)
- assert c.next_event() is NEED_DATA
- # 3 requests all bunched up
- c.receive_data(
- b"GET /1 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
- b"12345"
- b"GET /2 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
- b"67890"
- b"GET /3 HTTP/1.1\r\nHost: a.com\r\n\r\n"
- )
- assert get_all_events(c) == [
- Request(
- method="GET",
- target="/1",
- headers=[("Host", "a.com"), ("Content-Length", "5")],
- ),
- Data(data=b"12345"),
- EndOfMessage(),
- ]
- assert c.their_state is DONE
- assert c.our_state is SEND_RESPONSE
-
- assert c.next_event() is PAUSED
-
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- c.send(EndOfMessage())
- assert c.their_state is DONE
- assert c.our_state is DONE
-
- c.start_next_cycle()
-
- assert get_all_events(c) == [
- Request(
- method="GET",
- target="/2",
- headers=[("Host", "a.com"), ("Content-Length", "5")],
- ),
- Data(data=b"67890"),
- EndOfMessage(),
- ]
- assert c.next_event() is PAUSED
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- c.send(EndOfMessage())
- c.start_next_cycle()
-
- assert get_all_events(c) == [
- Request(method="GET", target="/3", headers=[("Host", "a.com")]),
- EndOfMessage(),
- ]
- # Doesn't pause this time, no trailing data
- assert c.next_event() is NEED_DATA
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- c.send(EndOfMessage())
-
- # Arrival of more data triggers pause
- assert c.next_event() is NEED_DATA
- c.receive_data(b"SADF")
- assert c.next_event() is PAUSED
- assert c.trailing_data == (b"SADF", False)
- # If EOF arrives while paused, we don't see that either:
- c.receive_data(b"")
- assert c.trailing_data == (b"SADF", True)
- assert c.next_event() is PAUSED
- c.receive_data(b"")
- assert c.next_event() is PAUSED
- # Can't call receive_data with non-empty buf after closing it
- with pytest.raises(RuntimeError):
- c.receive_data(b"FDSA")
-
-
-def test_protocol_switch() -> None:
- for (req, deny, accept) in [
- (
- Request(
- method="CONNECT",
- target="example.com:443",
- headers=[("Host", "foo"), ("Content-Length", "1")],
- ),
- Response(status_code=404, headers=[(b"transfer-encoding", b"chunked")]),
- Response(status_code=200, headers=[(b"transfer-encoding", b"chunked")]),
- ),
- (
- Request(
- method="GET",
- target="/",
- headers=[("Host", "foo"), ("Content-Length", "1"), ("Upgrade", "a, b")],
- ),
- Response(status_code=200, headers=[(b"transfer-encoding", b"chunked")]),
- InformationalResponse(status_code=101, headers=[("Upgrade", "a")]),
- ),
- (
- Request(
- method="CONNECT",
- target="example.com:443",
- headers=[("Host", "foo"), ("Content-Length", "1"), ("Upgrade", "a, b")],
- ),
- Response(status_code=404, headers=[(b"transfer-encoding", b"chunked")]),
- # Accept CONNECT, not upgrade
- Response(status_code=200, headers=[(b"transfer-encoding", b"chunked")]),
- ),
- (
- Request(
- method="CONNECT",
- target="example.com:443",
- headers=[("Host", "foo"), ("Content-Length", "1"), ("Upgrade", "a, b")],
- ),
- Response(status_code=404, headers=[(b"transfer-encoding", b"chunked")]),
- # Accept Upgrade, not CONNECT
- InformationalResponse(status_code=101, headers=[("Upgrade", "b")]),
- ),
- ]:
-
- def setup() -> ConnectionPair:
- p = ConnectionPair()
- p.send(CLIENT, req)
- # No switch-related state change stuff yet; the client has to
- # finish the request before that kicks in
- for conn in p.conns:
- assert conn.states[CLIENT] is SEND_BODY
- p.send(CLIENT, [Data(data=b"1"), EndOfMessage()])
- for conn in p.conns:
- assert conn.states[CLIENT] is MIGHT_SWITCH_PROTOCOL
- assert p.conn[SERVER].next_event() is PAUSED
- return p
-
- # Test deny case
- p = setup()
- p.send(SERVER, deny)
- for conn in p.conns:
- assert conn.states == {CLIENT: DONE, SERVER: SEND_BODY}
- p.send(SERVER, EndOfMessage())
- # Check that re-use is still allowed after a denial
- for conn in p.conns:
- conn.start_next_cycle()
-
- # Test accept case
- p = setup()
- p.send(SERVER, accept)
- for conn in p.conns:
- assert conn.states == {CLIENT: SWITCHED_PROTOCOL, SERVER: SWITCHED_PROTOCOL}
- conn.receive_data(b"123")
- assert conn.next_event() is PAUSED
- conn.receive_data(b"456")
- assert conn.next_event() is PAUSED
- assert conn.trailing_data == (b"123456", False)
-
- # Pausing in might-switch, then recovery
- # (weird artificial case where the trailing data actually is valid
- # HTTP for some reason, because this makes it easier to test the state
- # logic)
- p = setup()
- sc = p.conn[SERVER]
- sc.receive_data(b"GET / HTTP/1.0\r\n\r\n")
- assert sc.next_event() is PAUSED
- assert sc.trailing_data == (b"GET / HTTP/1.0\r\n\r\n", False)
- sc.send(deny)
- assert sc.next_event() is PAUSED
- sc.send(EndOfMessage())
- sc.start_next_cycle()
- assert get_all_events(sc) == [
- Request(method="GET", target="/", headers=[], http_version="1.0"), # type: ignore[arg-type]
- EndOfMessage(),
- ]
-
- # When we're DONE, have no trailing data, and the connection gets
- # closed, we report ConnectionClosed(). When we're in might-switch or
- # switched, we don't.
- p = setup()
- sc = p.conn[SERVER]
- sc.receive_data(b"")
- assert sc.next_event() is PAUSED
- assert sc.trailing_data == (b"", True)
- p.send(SERVER, accept)
- assert sc.next_event() is PAUSED
-
- p = setup()
- sc = p.conn[SERVER]
- sc.receive_data(b"")
- assert sc.next_event() is PAUSED
- sc.send(deny)
- assert sc.next_event() == ConnectionClosed()
-
- # You can't send after switching protocols, or while waiting for a
- # protocol switch
- p = setup()
- with pytest.raises(LocalProtocolError):
- p.conn[CLIENT].send(
- Request(method="GET", target="/", headers=[("Host", "a")])
- )
- p = setup()
- p.send(SERVER, accept)
- with pytest.raises(LocalProtocolError):
- p.conn[SERVER].send(Data(data=b"123"))
-
-
-def test_close_simple() -> None:
- # Just immediately closing a new connection without anything having
- # happened yet.
- for (who_shot_first, who_shot_second) in [(CLIENT, SERVER), (SERVER, CLIENT)]:
-
- def setup() -> ConnectionPair:
- p = ConnectionPair()
- p.send(who_shot_first, ConnectionClosed())
- for conn in p.conns:
- assert conn.states == {
- who_shot_first: CLOSED,
- who_shot_second: MUST_CLOSE,
- }
- return p
-
- # You can keep putting b"" into a closed connection, and you keep
- # getting ConnectionClosed() out:
- p = setup()
- assert p.conn[who_shot_second].next_event() == ConnectionClosed()
- assert p.conn[who_shot_second].next_event() == ConnectionClosed()
- p.conn[who_shot_second].receive_data(b"")
- assert p.conn[who_shot_second].next_event() == ConnectionClosed()
- # Second party can close...
- p = setup()
- p.send(who_shot_second, ConnectionClosed())
- for conn in p.conns:
- assert conn.our_state is CLOSED
- assert conn.their_state is CLOSED
- # But trying to receive new data on a closed connection is a
- # RuntimeError (not ProtocolError, because the problem here isn't
- # violation of HTTP, it's violation of physics)
- p = setup()
- with pytest.raises(RuntimeError):
- p.conn[who_shot_second].receive_data(b"123")
- # And receiving new data on a MUST_CLOSE connection is a ProtocolError
- p = setup()
- p.conn[who_shot_first].receive_data(b"GET")
- with pytest.raises(RemoteProtocolError):
- p.conn[who_shot_first].next_event()
-
-
-def test_close_different_states() -> None:
- req = [
- Request(method="GET", target="/foo", headers=[("Host", "a")]),
- EndOfMessage(),
- ]
- resp = [
- Response(status_code=200, headers=[(b"transfer-encoding", b"chunked")]),
- EndOfMessage(),
- ]
-
- # Client before request
- p = ConnectionPair()
- p.send(CLIENT, ConnectionClosed())
- for conn in p.conns:
- assert conn.states == {CLIENT: CLOSED, SERVER: MUST_CLOSE}
-
- # Client after request
- p = ConnectionPair()
- p.send(CLIENT, req)
- p.send(CLIENT, ConnectionClosed())
- for conn in p.conns:
- assert conn.states == {CLIENT: CLOSED, SERVER: SEND_RESPONSE}
-
- # Server after request -> not allowed
- p = ConnectionPair()
- p.send(CLIENT, req)
- with pytest.raises(LocalProtocolError):
- p.conn[SERVER].send(ConnectionClosed())
- p.conn[CLIENT].receive_data(b"")
- with pytest.raises(RemoteProtocolError):
- p.conn[CLIENT].next_event()
-
- # Server after response
- p = ConnectionPair()
- p.send(CLIENT, req)
- p.send(SERVER, resp)
- p.send(SERVER, ConnectionClosed())
- for conn in p.conns:
- assert conn.states == {CLIENT: MUST_CLOSE, SERVER: CLOSED}
-
- # Both after closing (ConnectionClosed() is idempotent)
- p = ConnectionPair()
- p.send(CLIENT, req)
- p.send(SERVER, resp)
- p.send(CLIENT, ConnectionClosed())
- p.send(SERVER, ConnectionClosed())
- p.send(CLIENT, ConnectionClosed())
- p.send(SERVER, ConnectionClosed())
-
- # In the middle of sending -> not allowed
- p = ConnectionPair()
- p.send(
- CLIENT,
- Request(
- method="GET", target="/", headers=[("Host", "a"), ("Content-Length", "10")]
- ),
- )
- with pytest.raises(LocalProtocolError):
- p.conn[CLIENT].send(ConnectionClosed())
- p.conn[SERVER].receive_data(b"")
- with pytest.raises(RemoteProtocolError):
- p.conn[SERVER].next_event()
-
-
-# Receive several requests and then the client shuts down their side of the
-# connection; we can respond to each
-def test_pipelined_close() -> None:
- c = Connection(SERVER)
- # 2 requests then a close
- c.receive_data(
- b"GET /1 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
- b"12345"
- b"GET /2 HTTP/1.1\r\nHost: a.com\r\nContent-Length: 5\r\n\r\n"
- b"67890"
- )
- c.receive_data(b"")
- assert get_all_events(c) == [
- Request(
- method="GET",
- target="/1",
- headers=[("host", "a.com"), ("content-length", "5")],
- ),
- Data(data=b"12345"),
- EndOfMessage(),
- ]
- assert c.states[CLIENT] is DONE
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- c.send(EndOfMessage())
- assert c.states[SERVER] is DONE
- c.start_next_cycle()
- assert get_all_events(c) == [
- Request(
- method="GET",
- target="/2",
- headers=[("host", "a.com"), ("content-length", "5")],
- ),
- Data(data=b"67890"),
- EndOfMessage(),
- ConnectionClosed(),
- ]
- assert c.states == {CLIENT: CLOSED, SERVER: SEND_RESPONSE}
- c.send(Response(status_code=200, headers=[])) # type: ignore[arg-type]
- c.send(EndOfMessage())
- assert c.states == {CLIENT: CLOSED, SERVER: MUST_CLOSE}
- c.send(ConnectionClosed())
- assert c.states == {CLIENT: CLOSED, SERVER: CLOSED}
-
-
-def test_sendfile() -> None:
- class SendfilePlaceholder:
- def __len__(self) -> int:
- return 10
-
- placeholder = SendfilePlaceholder()
-
- def setup(
- header: Tuple[str, str], http_version: str
- ) -> Tuple[Connection, Optional[List[bytes]]]:
- c = Connection(SERVER)
- receive_and_get(
- c, "GET / HTTP/{}\r\nHost: a\r\n\r\n".format(http_version).encode("ascii")
- )
- headers = []
- if header:
- headers.append(header)
- c.send(Response(status_code=200, headers=headers))
- return c, c.send_with_data_passthrough(Data(data=placeholder)) # type: ignore
-
- c, data = setup(("Content-Length", "10"), "1.1")
- assert data == [placeholder] # type: ignore
- # Raises an error if the connection object doesn't think we've sent
- # exactly 10 bytes
- c.send(EndOfMessage())
-
- _, data = setup(("Transfer-Encoding", "chunked"), "1.1")
- assert placeholder in data # type: ignore
- data[data.index(placeholder)] = b"x" * 10 # type: ignore
- assert b"".join(data) == b"a\r\nxxxxxxxxxx\r\n" # type: ignore
-
- c, data = setup(None, "1.0") # type: ignore
- assert data == [placeholder] # type: ignore
- assert c.our_state is SEND_BODY
-
-
-def test_errors() -> None:
- # After a receive error, you can't receive
- for role in [CLIENT, SERVER]:
- c = Connection(our_role=role)
- c.receive_data(b"gibberish\r\n\r\n")
- with pytest.raises(RemoteProtocolError):
- c.next_event()
- # Now any attempt to receive continues to raise
- assert c.their_state is ERROR
- assert c.our_state is not ERROR
- print(c._cstate.states)
- with pytest.raises(RemoteProtocolError):
- c.next_event()
- # But we can still yell at the client for sending us gibberish
- if role is SERVER:
- assert (
- c.send(Response(status_code=400, headers=[])) # type: ignore[arg-type]
- == b"HTTP/1.1 400 \r\nConnection: close\r\n\r\n"
- )
-
- # After an error sending, you can no longer send
- # (This is especially important for things like content-length errors,
- # where there's complex internal state being modified)
- def conn(role: Type[Sentinel]) -> Connection:
- c = Connection(our_role=role)
- if role is SERVER:
- # Put it into the state where it *could* send a response...
- receive_and_get(c, b"GET / HTTP/1.0\r\n\r\n")
- assert c.our_state is SEND_RESPONSE
- return c
-
- for role in [CLIENT, SERVER]:
- if role is CLIENT:
- # This HTTP/1.0 request won't be detected as bad until after we go
- # through the state machine and hit the writing code
- good = Request(method="GET", target="/", headers=[("Host", "example.com")])
- bad = Request(
- method="GET",
- target="/",
- headers=[("Host", "example.com")],
- http_version="1.0",
- )
- elif role is SERVER:
- good = Response(status_code=200, headers=[]) # type: ignore[arg-type,assignment]
- bad = Response(status_code=200, headers=[], http_version="1.0") # type: ignore[arg-type,assignment]
- # Make sure 'good' actually is good
- c = conn(role)
- c.send(good)
- assert c.our_state is not ERROR
- # Do that again, but this time sending 'bad' first
- c = conn(role)
- with pytest.raises(LocalProtocolError):
- c.send(bad)
- assert c.our_state is ERROR
- assert c.their_state is not ERROR
- # Now 'good' is not so good
- with pytest.raises(LocalProtocolError):
- c.send(good)
-
- # And check send_failed() too
- c = conn(role)
- c.send_failed()
- assert c.our_state is ERROR
- assert c.their_state is not ERROR
- # This is idempotent
- c.send_failed()
- assert c.our_state is ERROR
- assert c.their_state is not ERROR
-
-
-def test_idle_receive_nothing() -> None:
- # At one point this incorrectly raised an error
- for role in [CLIENT, SERVER]:
- c = Connection(role)
- assert c.next_event() is NEED_DATA
-
-
-def test_connection_drop() -> None:
- c = Connection(SERVER)
- c.receive_data(b"GET /")
- assert c.next_event() is NEED_DATA
- c.receive_data(b"")
- with pytest.raises(RemoteProtocolError):
- c.next_event()
-
-
-def test_408_request_timeout() -> None:
- # Should be able to send this spontaneously as a server without seeing
- # anything from client
- p = ConnectionPair()
- p.send(SERVER, Response(status_code=408, headers=[(b"connection", b"close")]))
-
-
-# This used to raise IndexError
-def test_empty_request() -> None:
- c = Connection(SERVER)
- c.receive_data(b"\r\n")
- with pytest.raises(RemoteProtocolError):
- c.next_event()
-
-
-# This used to raise IndexError
-def test_empty_response() -> None:
- c = Connection(CLIENT)
- c.send(Request(method="GET", target="/", headers=[("Host", "a")]))
- c.receive_data(b"\r\n")
- with pytest.raises(RemoteProtocolError):
- c.next_event()
-
-
-@pytest.mark.parametrize(
- "data",
- [
- b"\x00",
- b"\x20",
- b"\x16\x03\x01\x00\xa5", # Typical start of a TLS Client Hello
- ],
-)
-def test_early_detection_of_invalid_request(data: bytes) -> None:
- c = Connection(SERVER)
- # Early detection should occur before even receiving a `\r\n`
- c.receive_data(data)
- with pytest.raises(RemoteProtocolError):
- c.next_event()
-
-
-@pytest.mark.parametrize(
- "data",
- [
- b"\x00",
- b"\x20",
- b"\x16\x03\x03\x00\x31", # Typical start of a TLS Server Hello
- ],
-)
-def test_early_detection_of_invalid_response(data: bytes) -> None:
- c = Connection(CLIENT)
- # Early detection should occur before even receiving a `\r\n`
- c.receive_data(data)
- with pytest.raises(RemoteProtocolError):
- c.next_event()
-
-
-# This used to give different headers for HEAD and GET.
-# The correct way to handle HEAD is to put whatever headers we *would* have
-# put if it were a GET -- even though we know that for HEAD, those headers
-# will be ignored.
-def test_HEAD_framing_headers() -> None:
- def setup(method: bytes, http_version: bytes) -> Connection:
- c = Connection(SERVER)
- c.receive_data(
- method + b" / HTTP/" + http_version + b"\r\n" + b"Host: example.com\r\n\r\n"
- )
- assert type(c.next_event()) is Request
- assert type(c.next_event()) is EndOfMessage
- return c
-
- for method in [b"GET", b"HEAD"]:
- # No Content-Length, HTTP/1.1 peer, should use chunked
- c = setup(method, b"1.1")
- assert (
- c.send(Response(status_code=200, headers=[])) == b"HTTP/1.1 200 \r\n" # type: ignore[arg-type]
- b"Transfer-Encoding: chunked\r\n\r\n"
- )
-
- # No Content-Length, HTTP/1.0 peer, frame with connection: close
- c = setup(method, b"1.0")
- assert (
- c.send(Response(status_code=200, headers=[])) == b"HTTP/1.1 200 \r\n" # type: ignore[arg-type]
- b"Connection: close\r\n\r\n"
- )
-
- # Content-Length + Transfer-Encoding, TE wins
- c = setup(method, b"1.1")
- assert (
- c.send(
- Response(
- status_code=200,
- headers=[
- ("Content-Length", "100"),
- ("Transfer-Encoding", "chunked"),
- ],
- )
- )
- == b"HTTP/1.1 200 \r\n"
- b"Transfer-Encoding: chunked\r\n\r\n"
- )
-
-
-def test_special_exceptions_for_lost_connection_in_message_body() -> None:
- c = Connection(SERVER)
- c.receive_data(
- b"POST / HTTP/1.1\r\n" b"Host: example.com\r\n" b"Content-Length: 100\r\n\r\n"
- )
- assert type(c.next_event()) is Request
- assert c.next_event() is NEED_DATA
- c.receive_data(b"12345")
- assert c.next_event() == Data(data=b"12345")
- c.receive_data(b"")
- with pytest.raises(RemoteProtocolError) as excinfo:
- c.next_event()
- assert "received 5 bytes" in str(excinfo.value)
- assert "expected 100" in str(excinfo.value)
-
- c = Connection(SERVER)
- c.receive_data(
- b"POST / HTTP/1.1\r\n"
- b"Host: example.com\r\n"
- b"Transfer-Encoding: chunked\r\n\r\n"
- )
- assert type(c.next_event()) is Request
- assert c.next_event() is NEED_DATA
- c.receive_data(b"8\r\n012345")
- assert c.next_event().data == b"012345" # type: ignore
- c.receive_data(b"")
- with pytest.raises(RemoteProtocolError) as excinfo:
- c.next_event()
- assert "incomplete chunked read" in str(excinfo.value)
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_events.py b/env/lib/python3.9/site-packages/h11/tests/test_events.py
deleted file mode 100644
index bc6c313..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_events.py
+++ /dev/null
@@ -1,150 +0,0 @@
-from http import HTTPStatus
-
-import pytest
-
-from .. import _events
-from .._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from .._util import LocalProtocolError
-
-
-def test_events() -> None:
- with pytest.raises(LocalProtocolError):
- # Missing Host:
- req = Request(
- method="GET", target="/", headers=[("a", "b")], http_version="1.1"
- )
- # But this is okay (HTTP/1.0)
- req = Request(method="GET", target="/", headers=[("a", "b")], http_version="1.0")
- # fields are normalized
- assert req.method == b"GET"
- assert req.target == b"/"
- assert req.headers == [(b"a", b"b")]
- assert req.http_version == b"1.0"
-
- # This is also okay -- has a Host (with weird capitalization, which is ok)
- req = Request(
- method="GET",
- target="/",
- headers=[("a", "b"), ("hOSt", "example.com")],
- http_version="1.1",
- )
- # we normalize header capitalization
- assert req.headers == [(b"a", b"b"), (b"host", b"example.com")]
-
- # Multiple host is bad too
- with pytest.raises(LocalProtocolError):
- req = Request(
- method="GET",
- target="/",
- headers=[("Host", "a"), ("Host", "a")],
- http_version="1.1",
- )
- # Even for HTTP/1.0
- with pytest.raises(LocalProtocolError):
- req = Request(
- method="GET",
- target="/",
- headers=[("Host", "a"), ("Host", "a")],
- http_version="1.0",
- )
-
- # Header values are validated
- for bad_char in "\x00\r\n\f\v":
- with pytest.raises(LocalProtocolError):
- req = Request(
- method="GET",
- target="/",
- headers=[("Host", "a"), ("Foo", "asd" + bad_char)],
- http_version="1.0",
- )
-
- # But for compatibility we allow non-whitespace control characters, even
- # though they're forbidden by the spec.
- Request(
- method="GET",
- target="/",
- headers=[("Host", "a"), ("Foo", "asd\x01\x02\x7f")],
- http_version="1.0",
- )
-
- # Request target is validated
- for bad_byte in b"\x00\x20\x7f\xee":
- target = bytearray(b"/")
- target.append(bad_byte)
- with pytest.raises(LocalProtocolError):
- Request(
- method="GET", target=target, headers=[("Host", "a")], http_version="1.1"
- )
-
- # Request method is validated
- with pytest.raises(LocalProtocolError):
- Request(
- method="GET / HTTP/1.1",
- target=target,
- headers=[("Host", "a")],
- http_version="1.1",
- )
-
- ir = InformationalResponse(status_code=100, headers=[("Host", "a")])
- assert ir.status_code == 100
- assert ir.headers == [(b"host", b"a")]
- assert ir.http_version == b"1.1"
-
- with pytest.raises(LocalProtocolError):
- InformationalResponse(status_code=200, headers=[("Host", "a")])
-
- resp = Response(status_code=204, headers=[], http_version="1.0") # type: ignore[arg-type]
- assert resp.status_code == 204
- assert resp.headers == []
- assert resp.http_version == b"1.0"
-
- with pytest.raises(LocalProtocolError):
- resp = Response(status_code=100, headers=[], http_version="1.0") # type: ignore[arg-type]
-
- with pytest.raises(LocalProtocolError):
- Response(status_code="100", headers=[], http_version="1.0") # type: ignore[arg-type]
-
- with pytest.raises(LocalProtocolError):
- InformationalResponse(status_code=b"100", headers=[], http_version="1.0") # type: ignore[arg-type]
-
- d = Data(data=b"asdf")
- assert d.data == b"asdf"
-
- eom = EndOfMessage()
- assert eom.headers == []
-
- cc = ConnectionClosed()
- assert repr(cc) == "ConnectionClosed()"
-
-
-def test_intenum_status_code() -> None:
- # https://github.com/python-hyper/h11/issues/72
-
- r = Response(status_code=HTTPStatus.OK, headers=[], http_version="1.0") # type: ignore[arg-type]
- assert r.status_code == HTTPStatus.OK
- assert type(r.status_code) is not type(HTTPStatus.OK)
- assert type(r.status_code) is int
-
-
-def test_header_casing() -> None:
- r = Request(
- method="GET",
- target="/",
- headers=[("Host", "example.org"), ("Connection", "keep-alive")],
- http_version="1.1",
- )
- assert len(r.headers) == 2
- assert r.headers[0] == (b"host", b"example.org")
- assert r.headers == [(b"host", b"example.org"), (b"connection", b"keep-alive")]
- assert r.headers.raw_items() == [
- (b"Host", b"example.org"),
- (b"Connection", b"keep-alive"),
- ]
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_headers.py b/env/lib/python3.9/site-packages/h11/tests/test_headers.py
deleted file mode 100644
index ba53d08..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_headers.py
+++ /dev/null
@@ -1,157 +0,0 @@
-import pytest
-
-from .._events import Request
-from .._headers import (
- get_comma_header,
- has_expect_100_continue,
- Headers,
- normalize_and_validate,
- set_comma_header,
-)
-from .._util import LocalProtocolError
-
-
-def test_normalize_and_validate() -> None:
- assert normalize_and_validate([("foo", "bar")]) == [(b"foo", b"bar")]
- assert normalize_and_validate([(b"foo", b"bar")]) == [(b"foo", b"bar")]
-
- # no leading/trailing whitespace in names
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([(b"foo ", "bar")])
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([(b" foo", "bar")])
-
- # no weird characters in names
- with pytest.raises(LocalProtocolError) as excinfo:
- normalize_and_validate([(b"foo bar", b"baz")])
- assert "foo bar" in str(excinfo.value)
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([(b"foo\x00bar", b"baz")])
- # Not even 8-bit characters:
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([(b"foo\xffbar", b"baz")])
- # And not even the control characters we allow in values:
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([(b"foo\x01bar", b"baz")])
-
- # no return or NUL characters in values
- with pytest.raises(LocalProtocolError) as excinfo:
- normalize_and_validate([("foo", "bar\rbaz")])
- assert "bar\\rbaz" in str(excinfo.value)
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("foo", "bar\nbaz")])
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("foo", "bar\x00baz")])
- # no leading/trailing whitespace
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("foo", "barbaz ")])
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("foo", " barbaz")])
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("foo", "barbaz\t")])
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("foo", "\tbarbaz")])
-
- # content-length
- assert normalize_and_validate([("Content-Length", "1")]) == [
- (b"content-length", b"1")
- ]
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("Content-Length", "asdf")])
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("Content-Length", "1x")])
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("Content-Length", "1"), ("Content-Length", "2")])
- assert normalize_and_validate(
- [("Content-Length", "0"), ("Content-Length", "0")]
- ) == [(b"content-length", b"0")]
- assert normalize_and_validate([("Content-Length", "0 , 0")]) == [
- (b"content-length", b"0")
- ]
- with pytest.raises(LocalProtocolError):
- normalize_and_validate(
- [("Content-Length", "1"), ("Content-Length", "1"), ("Content-Length", "2")]
- )
- with pytest.raises(LocalProtocolError):
- normalize_and_validate([("Content-Length", "1 , 1,2")])
-
- # transfer-encoding
- assert normalize_and_validate([("Transfer-Encoding", "chunked")]) == [
- (b"transfer-encoding", b"chunked")
- ]
- assert normalize_and_validate([("Transfer-Encoding", "cHuNkEd")]) == [
- (b"transfer-encoding", b"chunked")
- ]
- with pytest.raises(LocalProtocolError) as excinfo:
- normalize_and_validate([("Transfer-Encoding", "gzip")])
- assert excinfo.value.error_status_hint == 501 # Not Implemented
- with pytest.raises(LocalProtocolError) as excinfo:
- normalize_and_validate(
- [("Transfer-Encoding", "chunked"), ("Transfer-Encoding", "gzip")]
- )
- assert excinfo.value.error_status_hint == 501 # Not Implemented
-
-
-def test_get_set_comma_header() -> None:
- headers = normalize_and_validate(
- [
- ("Connection", "close"),
- ("whatever", "something"),
- ("connectiON", "fOo,, , BAR"),
- ]
- )
-
- assert get_comma_header(headers, b"connection") == [b"close", b"foo", b"bar"]
-
- headers = set_comma_header(headers, b"newthing", ["a", "b"]) # type: ignore
-
- with pytest.raises(LocalProtocolError):
- set_comma_header(headers, b"newthing", [" a", "b"]) # type: ignore
-
- assert headers == [
- (b"connection", b"close"),
- (b"whatever", b"something"),
- (b"connection", b"fOo,, , BAR"),
- (b"newthing", b"a"),
- (b"newthing", b"b"),
- ]
-
- headers = set_comma_header(headers, b"whatever", ["different thing"]) # type: ignore
-
- assert headers == [
- (b"connection", b"close"),
- (b"connection", b"fOo,, , BAR"),
- (b"newthing", b"a"),
- (b"newthing", b"b"),
- (b"whatever", b"different thing"),
- ]
-
-
-def test_has_100_continue() -> None:
- assert has_expect_100_continue(
- Request(
- method="GET",
- target="/",
- headers=[("Host", "example.com"), ("Expect", "100-continue")],
- )
- )
- assert not has_expect_100_continue(
- Request(method="GET", target="/", headers=[("Host", "example.com")])
- )
- # Case insensitive
- assert has_expect_100_continue(
- Request(
- method="GET",
- target="/",
- headers=[("Host", "example.com"), ("Expect", "100-Continue")],
- )
- )
- # Doesn't work in HTTP/1.0
- assert not has_expect_100_continue(
- Request(
- method="GET",
- target="/",
- headers=[("Host", "example.com"), ("Expect", "100-continue")],
- http_version="1.0",
- )
- )
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_helpers.py b/env/lib/python3.9/site-packages/h11/tests/test_helpers.py
deleted file mode 100644
index c329c76..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_helpers.py
+++ /dev/null
@@ -1,32 +0,0 @@
-from .._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from .helpers import normalize_data_events
-
-
-def test_normalize_data_events() -> None:
- assert normalize_data_events(
- [
- Data(data=bytearray(b"1")),
- Data(data=b"2"),
- Response(status_code=200, headers=[]), # type: ignore[arg-type]
- Data(data=b"3"),
- Data(data=b"4"),
- EndOfMessage(),
- Data(data=b"5"),
- Data(data=b"6"),
- Data(data=b"7"),
- ]
- ) == [
- Data(data=b"12"),
- Response(status_code=200, headers=[]), # type: ignore[arg-type]
- Data(data=b"34"),
- EndOfMessage(),
- Data(data=b"567"),
- ]
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_io.py b/env/lib/python3.9/site-packages/h11/tests/test_io.py
deleted file mode 100644
index e9c01bd..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_io.py
+++ /dev/null
@@ -1,566 +0,0 @@
-from typing import Any, Callable, Generator, List
-
-import pytest
-
-from .._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from .._headers import Headers, normalize_and_validate
-from .._readers import (
- _obsolete_line_fold,
- ChunkedReader,
- ContentLengthReader,
- Http10Reader,
- READERS,
-)
-from .._receivebuffer import ReceiveBuffer
-from .._state import (
- CLIENT,
- CLOSED,
- DONE,
- IDLE,
- MIGHT_SWITCH_PROTOCOL,
- MUST_CLOSE,
- SEND_BODY,
- SEND_RESPONSE,
- SERVER,
- SWITCHED_PROTOCOL,
-)
-from .._util import LocalProtocolError
-from .._writers import (
- ChunkedWriter,
- ContentLengthWriter,
- Http10Writer,
- write_any_response,
- write_headers,
- write_request,
- WRITERS,
-)
-from .helpers import normalize_data_events
-
-SIMPLE_CASES = [
- (
- (CLIENT, IDLE),
- Request(
- method="GET",
- target="/a",
- headers=[("Host", "foo"), ("Connection", "close")],
- ),
- b"GET /a HTTP/1.1\r\nHost: foo\r\nConnection: close\r\n\r\n",
- ),
- (
- (SERVER, SEND_RESPONSE),
- Response(status_code=200, headers=[("Connection", "close")], reason=b"OK"),
- b"HTTP/1.1 200 OK\r\nConnection: close\r\n\r\n",
- ),
- (
- (SERVER, SEND_RESPONSE),
- Response(status_code=200, headers=[], reason=b"OK"), # type: ignore[arg-type]
- b"HTTP/1.1 200 OK\r\n\r\n",
- ),
- (
- (SERVER, SEND_RESPONSE),
- InformationalResponse(
- status_code=101, headers=[("Upgrade", "websocket")], reason=b"Upgrade"
- ),
- b"HTTP/1.1 101 Upgrade\r\nUpgrade: websocket\r\n\r\n",
- ),
- (
- (SERVER, SEND_RESPONSE),
- InformationalResponse(status_code=101, headers=[], reason=b"Upgrade"), # type: ignore[arg-type]
- b"HTTP/1.1 101 Upgrade\r\n\r\n",
- ),
-]
-
-
-def dowrite(writer: Callable[..., None], obj: Any) -> bytes:
- got_list: List[bytes] = []
- writer(obj, got_list.append)
- return b"".join(got_list)
-
-
-def tw(writer: Any, obj: Any, expected: Any) -> None:
- got = dowrite(writer, obj)
- assert got == expected
-
-
-def makebuf(data: bytes) -> ReceiveBuffer:
- buf = ReceiveBuffer()
- buf += data
- return buf
-
-
-def tr(reader: Any, data: bytes, expected: Any) -> None:
- def check(got: Any) -> None:
- assert got == expected
- # Headers should always be returned as bytes, not e.g. bytearray
- # https://github.com/python-hyper/wsproto/pull/54#issuecomment-377709478
- for name, value in getattr(got, "headers", []):
- assert type(name) is bytes
- assert type(value) is bytes
-
- # Simple: consume whole thing
- buf = makebuf(data)
- check(reader(buf))
- assert not buf
-
- # Incrementally growing buffer
- buf = ReceiveBuffer()
- for i in range(len(data)):
- assert reader(buf) is None
- buf += data[i : i + 1]
- check(reader(buf))
-
- # Trailing data
- buf = makebuf(data)
- buf += b"trailing"
- check(reader(buf))
- assert bytes(buf) == b"trailing"
-
-
-def test_writers_simple() -> None:
- for ((role, state), event, binary) in SIMPLE_CASES:
- tw(WRITERS[role, state], event, binary)
-
-
-def test_readers_simple() -> None:
- for ((role, state), event, binary) in SIMPLE_CASES:
- tr(READERS[role, state], binary, event)
-
-
-def test_writers_unusual() -> None:
- # Simple test of the write_headers utility routine
- tw(
- write_headers,
- normalize_and_validate([("foo", "bar"), ("baz", "quux")]),
- b"foo: bar\r\nbaz: quux\r\n\r\n",
- )
- tw(write_headers, Headers([]), b"\r\n")
-
- # We understand HTTP/1.0, but we don't speak it
- with pytest.raises(LocalProtocolError):
- tw(
- write_request,
- Request(
- method="GET",
- target="/",
- headers=[("Host", "foo"), ("Connection", "close")],
- http_version="1.0",
- ),
- None,
- )
- with pytest.raises(LocalProtocolError):
- tw(
- write_any_response,
- Response(
- status_code=200, headers=[("Connection", "close")], http_version="1.0"
- ),
- None,
- )
-
-
-def test_readers_unusual() -> None:
- # Reading HTTP/1.0
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.0\r\nSome: header\r\n\r\n",
- Request(
- method="HEAD",
- target="/foo",
- headers=[("Some", "header")],
- http_version="1.0",
- ),
- )
-
- # check no-headers, since it's only legal with HTTP/1.0
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.0\r\n\r\n",
- Request(method="HEAD", target="/foo", headers=[], http_version="1.0"), # type: ignore[arg-type]
- )
-
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.0 200 OK\r\nSome: header\r\n\r\n",
- Response(
- status_code=200,
- headers=[("Some", "header")],
- http_version="1.0",
- reason=b"OK",
- ),
- )
-
- # single-character header values (actually disallowed by the ABNF in RFC
- # 7230 -- this is a bug in the standard that we originally copied...)
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.0 200 OK\r\n" b"Foo: a a a a a \r\n\r\n",
- Response(
- status_code=200,
- headers=[("Foo", "a a a a a")],
- http_version="1.0",
- reason=b"OK",
- ),
- )
-
- # Empty headers -- also legal
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.0 200 OK\r\n" b"Foo:\r\n\r\n",
- Response(
- status_code=200, headers=[("Foo", "")], http_version="1.0", reason=b"OK"
- ),
- )
-
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.0 200 OK\r\n" b"Foo: \t \t \r\n\r\n",
- Response(
- status_code=200, headers=[("Foo", "")], http_version="1.0", reason=b"OK"
- ),
- )
-
-    # Tolerate broken servers that leave off the reason phrase
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.0 200\r\n" b"Foo: bar\r\n\r\n",
- Response(
- status_code=200, headers=[("Foo", "bar")], http_version="1.0", reason=b""
- ),
- )
-
-    # Tolerate header line endings (\r\n and \n)
-    # \n\r\n between headers and body
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.1 200 OK\r\nSomeHeader: val\n\r\n",
- Response(
- status_code=200,
- headers=[("SomeHeader", "val")],
- http_version="1.1",
- reason="OK",
- ),
- )
-
- # delimited only with \n
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.1 200 OK\nSomeHeader1: val1\nSomeHeader2: val2\n\n",
- Response(
- status_code=200,
- headers=[("SomeHeader1", "val1"), ("SomeHeader2", "val2")],
- http_version="1.1",
- reason="OK",
- ),
- )
-
- # mixed \r\n and \n
- tr(
- READERS[SERVER, SEND_RESPONSE],
- b"HTTP/1.1 200 OK\r\nSomeHeader1: val1\nSomeHeader2: val2\n\r\n",
- Response(
- status_code=200,
- headers=[("SomeHeader1", "val1"), ("SomeHeader2", "val2")],
- http_version="1.1",
- reason="OK",
- ),
- )
-
- # obsolete line folding
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1\r\n"
- b"Host: example.com\r\n"
- b"Some: multi-line\r\n"
- b" header\r\n"
- b"\tnonsense\r\n"
- b" \t \t\tI guess\r\n"
- b"Connection: close\r\n"
- b"More-nonsense: in the\r\n"
- b" last header \r\n\r\n",
- Request(
- method="HEAD",
- target="/foo",
- headers=[
- ("Host", "example.com"),
- ("Some", "multi-line header nonsense I guess"),
- ("Connection", "close"),
- ("More-nonsense", "in the last header"),
- ],
- ),
- )
-
- with pytest.raises(LocalProtocolError):
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1\r\n" b" folded: line\r\n\r\n",
- None,
- )
-
- with pytest.raises(LocalProtocolError):
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1\r\n" b"foo : line\r\n\r\n",
- None,
- )
- with pytest.raises(LocalProtocolError):
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1\r\n" b"foo\t: line\r\n\r\n",
- None,
- )
- with pytest.raises(LocalProtocolError):
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1\r\n" b"foo\t: line\r\n\r\n",
- None,
- )
- with pytest.raises(LocalProtocolError):
- tr(READERS[CLIENT, IDLE], b"HEAD /foo HTTP/1.1\r\n" b": line\r\n\r\n", None)
-
-
-def test__obsolete_line_fold_bytes() -> None:
- # _obsolete_line_fold has a defensive cast to bytearray, which is
- # necessary to protect against O(n^2) behavior in case anyone ever passes
- # in regular bytestrings... but right now we never pass in regular
- # bytestrings. so this test just exists to get some coverage on that
- # defensive cast.
- assert list(_obsolete_line_fold([b"aaa", b"bbb", b" ccc", b"ddd"])) == [
- b"aaa",
- bytearray(b"bbb ccc"),
- b"ddd",
- ]
-
-
-def _run_reader_iter(
- reader: Any, buf: bytes, do_eof: bool
-) -> Generator[Any, None, None]:
- while True:
- event = reader(buf)
- if event is None:
- break
- yield event
- # body readers have undefined behavior after returning EndOfMessage,
- # because this changes the state so they don't get called again
- if type(event) is EndOfMessage:
- break
- if do_eof:
- assert not buf
- yield reader.read_eof()
-
-
-def _run_reader(*args: Any) -> List[Event]:
- events = list(_run_reader_iter(*args))
- return normalize_data_events(events)
-
-
-def t_body_reader(thunk: Any, data: bytes, expected: Any, do_eof: bool = False) -> None:
- # Simple: consume whole thing
- print("Test 1")
- buf = makebuf(data)
- assert _run_reader(thunk(), buf, do_eof) == expected
-
- # Incrementally growing buffer
- print("Test 2")
- reader = thunk()
- buf = ReceiveBuffer()
- events = []
- for i in range(len(data)):
- events += _run_reader(reader, buf, False)
- buf += data[i : i + 1]
- events += _run_reader(reader, buf, do_eof)
- assert normalize_data_events(events) == expected
-
- is_complete = any(type(event) is EndOfMessage for event in expected)
- if is_complete and not do_eof:
- buf = makebuf(data + b"trailing")
- assert _run_reader(thunk(), buf, False) == expected
-
-
-def test_ContentLengthReader() -> None:
- t_body_reader(lambda: ContentLengthReader(0), b"", [EndOfMessage()])
-
- t_body_reader(
- lambda: ContentLengthReader(10),
- b"0123456789",
- [Data(data=b"0123456789"), EndOfMessage()],
- )
-
-
-def test_Http10Reader() -> None:
- t_body_reader(Http10Reader, b"", [EndOfMessage()], do_eof=True)
- t_body_reader(Http10Reader, b"asdf", [Data(data=b"asdf")], do_eof=False)
- t_body_reader(
- Http10Reader, b"asdf", [Data(data=b"asdf"), EndOfMessage()], do_eof=True
- )
-
-
-def test_ChunkedReader() -> None:
- t_body_reader(ChunkedReader, b"0\r\n\r\n", [EndOfMessage()])
-
- t_body_reader(
- ChunkedReader,
- b"0\r\nSome: header\r\n\r\n",
- [EndOfMessage(headers=[("Some", "header")])],
- )
-
- t_body_reader(
- ChunkedReader,
- b"5\r\n01234\r\n"
- + b"10\r\n0123456789abcdef\r\n"
- + b"0\r\n"
- + b"Some: header\r\n\r\n",
- [
- Data(data=b"012340123456789abcdef"),
- EndOfMessage(headers=[("Some", "header")]),
- ],
- )
-
- t_body_reader(
- ChunkedReader,
- b"5\r\n01234\r\n" + b"10\r\n0123456789abcdef\r\n" + b"0\r\n\r\n",
- [Data(data=b"012340123456789abcdef"), EndOfMessage()],
- )
-
- # handles upper and lowercase hex
- t_body_reader(
- ChunkedReader,
- b"aA\r\n" + b"x" * 0xAA + b"\r\n" + b"0\r\n\r\n",
- [Data(data=b"x" * 0xAA), EndOfMessage()],
- )
-
- # refuses arbitrarily long chunk integers
- with pytest.raises(LocalProtocolError):
- # Technically this is legal HTTP/1.1, but we refuse to process chunk
- # sizes that don't fit into 20 characters of hex
- t_body_reader(ChunkedReader, b"9" * 100 + b"\r\nxxx", [Data(data=b"xxx")])
-
- # refuses garbage in the chunk count
- with pytest.raises(LocalProtocolError):
- t_body_reader(ChunkedReader, b"10\x00\r\nxxx", None)
-
- # handles (and discards) "chunk extensions" omg wtf
- t_body_reader(
- ChunkedReader,
- b"5; hello=there\r\n"
- + b"xxxxx"
- + b"\r\n"
- + b'0; random="junk"; some=more; canbe=lonnnnngg\r\n\r\n',
- [Data(data=b"xxxxx"), EndOfMessage()],
- )
-
-
-def test_ContentLengthWriter() -> None:
- w = ContentLengthWriter(5)
- assert dowrite(w, Data(data=b"123")) == b"123"
- assert dowrite(w, Data(data=b"45")) == b"45"
- assert dowrite(w, EndOfMessage()) == b""
-
- w = ContentLengthWriter(5)
- with pytest.raises(LocalProtocolError):
- dowrite(w, Data(data=b"123456"))
-
- w = ContentLengthWriter(5)
- dowrite(w, Data(data=b"123"))
- with pytest.raises(LocalProtocolError):
- dowrite(w, Data(data=b"456"))
-
- w = ContentLengthWriter(5)
- dowrite(w, Data(data=b"123"))
- with pytest.raises(LocalProtocolError):
- dowrite(w, EndOfMessage())
-
- w = ContentLengthWriter(5)
-    assert dowrite(w, Data(data=b"123")) == b"123"
-    assert dowrite(w, Data(data=b"45")) == b"45"
- with pytest.raises(LocalProtocolError):
- dowrite(w, EndOfMessage(headers=[("Etag", "asdf")]))
-
-
-def test_ChunkedWriter() -> None:
- w = ChunkedWriter()
- assert dowrite(w, Data(data=b"aaa")) == b"3\r\naaa\r\n"
- assert dowrite(w, Data(data=b"a" * 20)) == b"14\r\n" + b"a" * 20 + b"\r\n"
-
- assert dowrite(w, Data(data=b"")) == b""
-
- assert dowrite(w, EndOfMessage()) == b"0\r\n\r\n"
-
- assert (
- dowrite(w, EndOfMessage(headers=[("Etag", "asdf"), ("a", "b")]))
- == b"0\r\nEtag: asdf\r\na: b\r\n\r\n"
- )
-
-
-def test_Http10Writer() -> None:
- w = Http10Writer()
- assert dowrite(w, Data(data=b"1234")) == b"1234"
- assert dowrite(w, EndOfMessage()) == b""
-
- with pytest.raises(LocalProtocolError):
- dowrite(w, EndOfMessage(headers=[("Etag", "asdf")]))
-
-
-def test_reject_garbage_after_request_line() -> None:
- with pytest.raises(LocalProtocolError):
- tr(READERS[SERVER, SEND_RESPONSE], b"HTTP/1.0 200 OK\x00xxxx\r\n\r\n", None)
-
-
-def test_reject_garbage_after_response_line() -> None:
- with pytest.raises(LocalProtocolError):
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1 xxxxxx\r\n" b"Host: a\r\n\r\n",
- None,
- )
-
-
-def test_reject_garbage_in_header_line() -> None:
- with pytest.raises(LocalProtocolError):
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1\r\n" b"Host: foo\x00bar\r\n\r\n",
- None,
- )
-
-
-def test_reject_non_vchar_in_path() -> None:
- for bad_char in b"\x00\x20\x7f\xee":
- message = bytearray(b"HEAD /")
- message.append(bad_char)
- message.extend(b" HTTP/1.1\r\nHost: foobar\r\n\r\n")
- with pytest.raises(LocalProtocolError):
- tr(READERS[CLIENT, IDLE], message, None)
-
-
-# https://github.com/python-hyper/h11/issues/57
-def test_allow_some_garbage_in_cookies() -> None:
- tr(
- READERS[CLIENT, IDLE],
- b"HEAD /foo HTTP/1.1\r\n"
- b"Host: foo\r\n"
- b"Set-Cookie: ___utmvafIumyLc=kUd\x01UpAt; path=/; Max-Age=900\r\n"
- b"\r\n",
- Request(
- method="HEAD",
- target="/foo",
- headers=[
- ("Host", "foo"),
- ("Set-Cookie", "___utmvafIumyLc=kUd\x01UpAt; path=/; Max-Age=900"),
- ],
- ),
- )
-
-
-def test_host_comes_first() -> None:
- tw(
- write_headers,
- normalize_and_validate([("foo", "bar"), ("Host", "example.com")]),
- b"Host: example.com\r\nfoo: bar\r\n\r\n",
- )
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_receivebuffer.py b/env/lib/python3.9/site-packages/h11/tests/test_receivebuffer.py
deleted file mode 100644
index 21a3870..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_receivebuffer.py
+++ /dev/null
@@ -1,135 +0,0 @@
-import re
-from typing import Tuple
-
-import pytest
-
-from .._receivebuffer import ReceiveBuffer
-
-
-def test_receivebuffer() -> None:
- b = ReceiveBuffer()
- assert not b
- assert len(b) == 0
- assert bytes(b) == b""
-
- b += b"123"
- assert b
- assert len(b) == 3
- assert bytes(b) == b"123"
-
- assert bytes(b) == b"123"
-
- assert b.maybe_extract_at_most(2) == b"12"
- assert b
- assert len(b) == 1
- assert bytes(b) == b"3"
-
- assert bytes(b) == b"3"
-
- assert b.maybe_extract_at_most(10) == b"3"
- assert bytes(b) == b""
-
- assert b.maybe_extract_at_most(10) is None
- assert not b
-
- ################################################################
-    # maybe_extract_next_line
- ################################################################
-
- b += b"123\n456\r\n789\r\n"
-
- assert b.maybe_extract_next_line() == b"123\n456\r\n"
- assert bytes(b) == b"789\r\n"
-
- assert b.maybe_extract_next_line() == b"789\r\n"
- assert bytes(b) == b""
-
- b += b"12\r"
- assert b.maybe_extract_next_line() is None
- assert bytes(b) == b"12\r"
-
- b += b"345\n\r"
- assert b.maybe_extract_next_line() is None
- assert bytes(b) == b"12\r345\n\r"
-
-    # here we stopped in the middle of the b"\r\n" delimiter
-
- b += b"\n6789aaa123\r\n"
- assert b.maybe_extract_next_line() == b"12\r345\n\r\n"
- assert b.maybe_extract_next_line() == b"6789aaa123\r\n"
- assert b.maybe_extract_next_line() is None
- assert bytes(b) == b""
-
- ################################################################
- # maybe_extract_lines
- ################################################################
-
- b += b"123\r\na: b\r\nfoo:bar\r\n\r\ntrailing"
- lines = b.maybe_extract_lines()
- assert lines == [b"123", b"a: b", b"foo:bar"]
- assert bytes(b) == b"trailing"
-
- assert b.maybe_extract_lines() is None
-
- b += b"\r\n\r"
- assert b.maybe_extract_lines() is None
-
- assert b.maybe_extract_at_most(100) == b"trailing\r\n\r"
- assert not b
-
-    # Empty body case (as happens, e.g., at the end of chunked encoding if
-    # there are no trailing headers)
- b += b"\r\ntrailing"
- assert b.maybe_extract_lines() == []
- assert bytes(b) == b"trailing"
-
-
-@pytest.mark.parametrize(
- "data",
- [
- pytest.param(
- (
- b"HTTP/1.1 200 OK\r\n",
- b"Content-type: text/plain\r\n",
- b"Connection: close\r\n",
- b"\r\n",
- b"Some body",
- ),
- id="with_crlf_delimiter",
- ),
- pytest.param(
- (
- b"HTTP/1.1 200 OK\n",
- b"Content-type: text/plain\n",
- b"Connection: close\n",
- b"\n",
- b"Some body",
- ),
- id="with_lf_only_delimiter",
- ),
- pytest.param(
- (
- b"HTTP/1.1 200 OK\n",
- b"Content-type: text/plain\r\n",
- b"Connection: close\n",
- b"\n",
- b"Some body",
- ),
- id="with_mixed_crlf_and_lf",
- ),
- ],
-)
-def test_receivebuffer_for_invalid_delimiter(data: Tuple[bytes]) -> None:
- b = ReceiveBuffer()
-
- for line in data:
- b += line
-
- lines = b.maybe_extract_lines()
-
- assert lines == [
- b"HTTP/1.1 200 OK",
- b"Content-type: text/plain",
- b"Connection: close",
- ]
- assert bytes(b) == b"Some body"
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_state.py b/env/lib/python3.9/site-packages/h11/tests/test_state.py
deleted file mode 100644
index bc974e6..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_state.py
+++ /dev/null
@@ -1,271 +0,0 @@
-import pytest
-
-from .._events import (
- ConnectionClosed,
- Data,
- EndOfMessage,
- Event,
- InformationalResponse,
- Request,
- Response,
-)
-from .._state import (
- _SWITCH_CONNECT,
- _SWITCH_UPGRADE,
- CLIENT,
- CLOSED,
- ConnectionState,
- DONE,
- IDLE,
- MIGHT_SWITCH_PROTOCOL,
- MUST_CLOSE,
- SEND_BODY,
- SEND_RESPONSE,
- SERVER,
- SWITCHED_PROTOCOL,
-)
-from .._util import LocalProtocolError
-
-
-def test_ConnectionState() -> None:
- cs = ConnectionState()
-
- # Basic event-triggered transitions
-
- assert cs.states == {CLIENT: IDLE, SERVER: IDLE}
-
- cs.process_event(CLIENT, Request)
- # The SERVER-Request special case:
- assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
-
- # Illegal transitions raise an error and nothing happens
- with pytest.raises(LocalProtocolError):
- cs.process_event(CLIENT, Request)
- assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
-
- cs.process_event(SERVER, InformationalResponse)
- assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
-
- cs.process_event(SERVER, Response)
- assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_BODY}
-
- cs.process_event(CLIENT, EndOfMessage)
- cs.process_event(SERVER, EndOfMessage)
- assert cs.states == {CLIENT: DONE, SERVER: DONE}
-
- # State-triggered transition
-
- cs.process_event(SERVER, ConnectionClosed)
- assert cs.states == {CLIENT: MUST_CLOSE, SERVER: CLOSED}
-
-
-def test_ConnectionState_keep_alive() -> None:
- # keep_alive = False
- cs = ConnectionState()
- cs.process_event(CLIENT, Request)
- cs.process_keep_alive_disabled()
- cs.process_event(CLIENT, EndOfMessage)
- assert cs.states == {CLIENT: MUST_CLOSE, SERVER: SEND_RESPONSE}
-
- cs.process_event(SERVER, Response)
- cs.process_event(SERVER, EndOfMessage)
- assert cs.states == {CLIENT: MUST_CLOSE, SERVER: MUST_CLOSE}
-
-
-def test_ConnectionState_keep_alive_in_DONE() -> None:
- # Check that if keep_alive is disabled when the CLIENT is already in DONE,
- # then this is sufficient to immediately trigger the DONE -> MUST_CLOSE
- # transition
- cs = ConnectionState()
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, EndOfMessage)
- assert cs.states[CLIENT] is DONE
- cs.process_keep_alive_disabled()
- assert cs.states[CLIENT] is MUST_CLOSE
-
-
-def test_ConnectionState_switch_denied() -> None:
- for switch_type in (_SWITCH_CONNECT, _SWITCH_UPGRADE):
- for deny_early in (True, False):
- cs = ConnectionState()
- cs.process_client_switch_proposal(switch_type)
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, Data)
- assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
-
- assert switch_type in cs.pending_switch_proposals
-
- if deny_early:
- # before client reaches DONE
- cs.process_event(SERVER, Response)
- assert not cs.pending_switch_proposals
-
- cs.process_event(CLIENT, EndOfMessage)
-
- if deny_early:
- assert cs.states == {CLIENT: DONE, SERVER: SEND_BODY}
- else:
- assert cs.states == {
- CLIENT: MIGHT_SWITCH_PROTOCOL,
- SERVER: SEND_RESPONSE,
- }
-
- cs.process_event(SERVER, InformationalResponse)
- assert cs.states == {
- CLIENT: MIGHT_SWITCH_PROTOCOL,
- SERVER: SEND_RESPONSE,
- }
-
- cs.process_event(SERVER, Response)
- assert cs.states == {CLIENT: DONE, SERVER: SEND_BODY}
- assert not cs.pending_switch_proposals
-
-
-_response_type_for_switch = {
- _SWITCH_UPGRADE: InformationalResponse,
- _SWITCH_CONNECT: Response,
- None: Response,
-}
-
-
-def test_ConnectionState_protocol_switch_accepted() -> None:
- for switch_event in [_SWITCH_UPGRADE, _SWITCH_CONNECT]:
- cs = ConnectionState()
- cs.process_client_switch_proposal(switch_event)
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, Data)
- assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
-
- cs.process_event(CLIENT, EndOfMessage)
- assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
-
- cs.process_event(SERVER, InformationalResponse)
- assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
-
- cs.process_event(SERVER, _response_type_for_switch[switch_event], switch_event)
- assert cs.states == {CLIENT: SWITCHED_PROTOCOL, SERVER: SWITCHED_PROTOCOL}
-
-
-def test_ConnectionState_double_protocol_switch() -> None:
- # CONNECT + Upgrade is legal! Very silly, but legal. So we support
- # it. Because sometimes doing the silly thing is easier than not.
- for server_switch in [None, _SWITCH_UPGRADE, _SWITCH_CONNECT]:
- cs = ConnectionState()
- cs.process_client_switch_proposal(_SWITCH_UPGRADE)
- cs.process_client_switch_proposal(_SWITCH_CONNECT)
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, EndOfMessage)
- assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
- cs.process_event(
- SERVER, _response_type_for_switch[server_switch], server_switch
- )
- if server_switch is None:
- assert cs.states == {CLIENT: DONE, SERVER: SEND_BODY}
- else:
- assert cs.states == {CLIENT: SWITCHED_PROTOCOL, SERVER: SWITCHED_PROTOCOL}
-
-
-def test_ConnectionState_inconsistent_protocol_switch() -> None:
- for client_switches, server_switch in [
- ([], _SWITCH_CONNECT),
- ([], _SWITCH_UPGRADE),
- ([_SWITCH_UPGRADE], _SWITCH_CONNECT),
- ([_SWITCH_CONNECT], _SWITCH_UPGRADE),
- ]:
- cs = ConnectionState()
- for client_switch in client_switches: # type: ignore[attr-defined]
- cs.process_client_switch_proposal(client_switch)
- cs.process_event(CLIENT, Request)
- with pytest.raises(LocalProtocolError):
- cs.process_event(SERVER, Response, server_switch)
-
-
-def test_ConnectionState_keepalive_protocol_switch_interaction() -> None:
- # keep_alive=False + pending_switch_proposals
- cs = ConnectionState()
- cs.process_client_switch_proposal(_SWITCH_UPGRADE)
- cs.process_event(CLIENT, Request)
- cs.process_keep_alive_disabled()
- cs.process_event(CLIENT, Data)
- assert cs.states == {CLIENT: SEND_BODY, SERVER: SEND_RESPONSE}
-
- # the protocol switch "wins"
- cs.process_event(CLIENT, EndOfMessage)
- assert cs.states == {CLIENT: MIGHT_SWITCH_PROTOCOL, SERVER: SEND_RESPONSE}
-
- # but when the server denies the request, keep_alive comes back into play
- cs.process_event(SERVER, Response)
- assert cs.states == {CLIENT: MUST_CLOSE, SERVER: SEND_BODY}
-
-
-def test_ConnectionState_reuse() -> None:
- cs = ConnectionState()
-
- with pytest.raises(LocalProtocolError):
- cs.start_next_cycle()
-
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, EndOfMessage)
-
- with pytest.raises(LocalProtocolError):
- cs.start_next_cycle()
-
- cs.process_event(SERVER, Response)
- cs.process_event(SERVER, EndOfMessage)
-
- cs.start_next_cycle()
- assert cs.states == {CLIENT: IDLE, SERVER: IDLE}
-
- # No keepalive
-
- cs.process_event(CLIENT, Request)
- cs.process_keep_alive_disabled()
- cs.process_event(CLIENT, EndOfMessage)
- cs.process_event(SERVER, Response)
- cs.process_event(SERVER, EndOfMessage)
-
- with pytest.raises(LocalProtocolError):
- cs.start_next_cycle()
-
- # One side closed
-
- cs = ConnectionState()
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, EndOfMessage)
- cs.process_event(CLIENT, ConnectionClosed)
- cs.process_event(SERVER, Response)
- cs.process_event(SERVER, EndOfMessage)
-
- with pytest.raises(LocalProtocolError):
- cs.start_next_cycle()
-
-    # Successful protocol switch
-
- cs = ConnectionState()
- cs.process_client_switch_proposal(_SWITCH_UPGRADE)
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, EndOfMessage)
- cs.process_event(SERVER, InformationalResponse, _SWITCH_UPGRADE)
-
- with pytest.raises(LocalProtocolError):
- cs.start_next_cycle()
-
- # Failed protocol switch
-
- cs = ConnectionState()
- cs.process_client_switch_proposal(_SWITCH_UPGRADE)
- cs.process_event(CLIENT, Request)
- cs.process_event(CLIENT, EndOfMessage)
- cs.process_event(SERVER, Response)
- cs.process_event(SERVER, EndOfMessage)
-
- cs.start_next_cycle()
- assert cs.states == {CLIENT: IDLE, SERVER: IDLE}
-
-
-def test_server_request_is_illegal() -> None:
- # There used to be a bug in how we handled the Request special case that
- # made this allowed...
- cs = ConnectionState()
- with pytest.raises(LocalProtocolError):
- cs.process_event(SERVER, Request)
diff --git a/env/lib/python3.9/site-packages/h11/tests/test_util.py b/env/lib/python3.9/site-packages/h11/tests/test_util.py
deleted file mode 100644
index 1637919..0000000
--- a/env/lib/python3.9/site-packages/h11/tests/test_util.py
+++ /dev/null
@@ -1,112 +0,0 @@
-import re
-import sys
-import traceback
-from typing import NoReturn
-
-import pytest
-
-from .._util import (
- bytesify,
- LocalProtocolError,
- ProtocolError,
- RemoteProtocolError,
- Sentinel,
- validate,
-)
-
-
-def test_ProtocolError() -> None:
- with pytest.raises(TypeError):
- ProtocolError("abstract base class")
-
-
-def test_LocalProtocolError() -> None:
- try:
- raise LocalProtocolError("foo")
- except LocalProtocolError as e:
- assert str(e) == "foo"
- assert e.error_status_hint == 400
-
- try:
- raise LocalProtocolError("foo", error_status_hint=418)
- except LocalProtocolError as e:
- assert str(e) == "foo"
- assert e.error_status_hint == 418
-
- def thunk() -> NoReturn:
- raise LocalProtocolError("a", error_status_hint=420)
-
- try:
- try:
- thunk()
- except LocalProtocolError as exc1:
- orig_traceback = "".join(traceback.format_tb(sys.exc_info()[2]))
- exc1._reraise_as_remote_protocol_error()
- except RemoteProtocolError as exc2:
- assert type(exc2) is RemoteProtocolError
- assert exc2.args == ("a",)
- assert exc2.error_status_hint == 420
- new_traceback = "".join(traceback.format_tb(sys.exc_info()[2]))
- assert new_traceback.endswith(orig_traceback)
-
-
-def test_validate() -> None:
-    my_re = re.compile(br"(?P<group1>[0-9]+)\.(?P<group2>[0-9]+)")
- with pytest.raises(LocalProtocolError):
- validate(my_re, b"0.")
-
- groups = validate(my_re, b"0.1")
- assert groups == {"group1": b"0", "group2": b"1"}
-
- # successful partial matches are an error - must match whole string
- with pytest.raises(LocalProtocolError):
- validate(my_re, b"0.1xx")
- with pytest.raises(LocalProtocolError):
- validate(my_re, b"0.1\n")
-
-
-def test_validate_formatting() -> None:
- my_re = re.compile(br"foo")
-
- with pytest.raises(LocalProtocolError) as excinfo:
- validate(my_re, b"", "oops")
- assert "oops" in str(excinfo.value)
-
- with pytest.raises(LocalProtocolError) as excinfo:
- validate(my_re, b"", "oops {}")
- assert "oops {}" in str(excinfo.value)
-
- with pytest.raises(LocalProtocolError) as excinfo:
- validate(my_re, b"", "oops {} xx", 10)
- assert "oops 10 xx" in str(excinfo.value)
-
-
-def test_make_sentinel() -> None:
- class S(Sentinel, metaclass=Sentinel):
- pass
-
- assert repr(S) == "S"
- assert S == S
- assert type(S).__name__ == "S"
- assert S in {S}
- assert type(S) is S
-
- class S2(Sentinel, metaclass=Sentinel):
- pass
-
- assert repr(S2) == "S2"
- assert S != S2
- assert S not in {S2}
- assert type(S) is not type(S2)
-
-
-def test_bytesify() -> None:
- assert bytesify(b"123") == b"123"
- assert bytesify(bytearray(b"123")) == b"123"
- assert bytesify("123") == b"123"
-
- with pytest.raises(UnicodeEncodeError):
- bytesify("\u1234")
-
- with pytest.raises(TypeError):
- bytesify(10)
diff --git a/env/lib/python3.9/site-packages/idna-3.3.dist-info/INSTALLER b/env/lib/python3.9/site-packages/idna-3.3.dist-info/INSTALLER
deleted file mode 100644
index a1b589e..0000000
--- a/env/lib/python3.9/site-packages/idna-3.3.dist-info/INSTALLER
+++ /dev/null
@@ -1 +0,0 @@
-pip
diff --git a/env/lib/python3.9/site-packages/idna-3.3.dist-info/LICENSE.md b/env/lib/python3.9/site-packages/idna-3.3.dist-info/LICENSE.md
deleted file mode 100644
index b6f8732..0000000
--- a/env/lib/python3.9/site-packages/idna-3.3.dist-info/LICENSE.md
+++ /dev/null
@@ -1,29 +0,0 @@
-BSD 3-Clause License
-
-Copyright (c) 2013-2021, Kim Davies
-All rights reserved.
-
-Redistribution and use in source and binary forms, with or without
-modification, are permitted provided that the following conditions are met:
-
-1. Redistributions of source code must retain the above copyright notice, this
- list of conditions and the following disclaimer.
-
-2. Redistributions in binary form must reproduce the above copyright notice,
- this list of conditions and the following disclaimer in the documentation
- and/or other materials provided with the distribution.
-
-3. Neither the name of the copyright holder nor the names of its
- contributors may be used to endorse or promote products derived from
- this software without specific prior written permission.
-
-THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
-AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
-DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
-FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
-DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
-SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
-CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
-OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
-OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/env/lib/python3.9/site-packages/idna-3.3.dist-info/METADATA b/env/lib/python3.9/site-packages/idna-3.3.dist-info/METADATA
deleted file mode 100644
index 6446805..0000000
--- a/env/lib/python3.9/site-packages/idna-3.3.dist-info/METADATA
+++ /dev/null
@@ -1,236 +0,0 @@
-Metadata-Version: 2.1
-Name: idna
-Version: 3.3
-Summary: Internationalized Domain Names in Applications (IDNA)
-Home-page: https://github.com/kjd/idna
-Author: Kim Davies
-Author-email: kim@cynosure.com.au
-License: BSD-3-Clause
-Platform: UNKNOWN
-Classifier: Development Status :: 5 - Production/Stable
-Classifier: Intended Audience :: Developers
-Classifier: Intended Audience :: System Administrators
-Classifier: License :: OSI Approved :: BSD License
-Classifier: Operating System :: OS Independent
-Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3 :: Only
-Classifier: Programming Language :: Python :: 3.5
-Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.7
-Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Classifier: Programming Language :: Python :: 3.10
-Classifier: Programming Language :: Python :: Implementation :: CPython
-Classifier: Programming Language :: Python :: Implementation :: PyPy
-Classifier: Topic :: Internet :: Name Service (DNS)
-Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Classifier: Topic :: Utilities
-Requires-Python: >=3.5
-License-File: LICENSE.md
-
-Internationalized Domain Names in Applications (IDNA)
-=====================================================
-
-Support for the Internationalised Domain Names in Applications
-(IDNA) protocol as specified in `RFC 5891 <https://tools.ietf.org/html/rfc5891>`_.
-This is the latest version of the protocol and is sometimes referred to as
-“IDNA 2008”.
-
-This library also provides support for Unicode Technical Standard 46,
-`Unicode IDNA Compatibility Processing <https://unicode.org/reports/tr46/>`_.
-
-This acts as a suitable replacement for the “encodings.idna” module that
-comes with the Python standard library, but which only supports the
-older superseded IDNA specification (`RFC 3490 <https://tools.ietf.org/html/rfc3490>`_).
-
-Basic functions are simply executed:
-
-.. code-block:: pycon
-
- >>> import idna
- >>> idna.encode('ドメイン.テスト')
- b'xn--eckwd4c7c.xn--zckzah'
- >>> print(idna.decode('xn--eckwd4c7c.xn--zckzah'))
- ドメイン.テスト
-
-
-Installation
-------------
-
-To install this library, you can use pip:
-
-.. code-block:: bash
-
- $ pip install idna
-
-Alternatively, you can install the package using the bundled setup script:
-
-.. code-block:: bash
-
- $ python setup.py install
-
-
-Usage
------
-
-For typical usage, the ``encode`` and ``decode`` functions will take a domain
-name argument and perform a conversion to A-labels or U-labels respectively.
-
-.. code-block:: pycon
-
- >>> import idna
- >>> idna.encode('ドメイン.テスト')
- b'xn--eckwd4c7c.xn--zckzah'
- >>> print(idna.decode('xn--eckwd4c7c.xn--zckzah'))
- ドメイン.テスト
-
-You may use the codec encoding and decoding methods via the
-``idna.codec`` module:
-
-.. code-block:: pycon
-
- >>> import idna.codec
- >>> print('домен.испытание'.encode('idna'))
- b'xn--d1acufc.xn--80akhbyknj4f'
- >>> print(b'xn--d1acufc.xn--80akhbyknj4f'.decode('idna'))
- домен.испытание
-
-Conversions can be applied at a per-label basis using the ``ulabel`` or ``alabel``
-functions if necessary:
-
-.. code-block:: pycon
-
- >>> idna.alabel('测试')
- b'xn--0zwm56d'
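-
-The reverse, per-label conversion is available with ``ulabel`` (the output
-shown here is what one would expect given the ``alabel`` example above):
-
-.. code-block:: pycon
-
-    >>> idna.ulabel(b'xn--0zwm56d')
-    '测试'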
-
-Compatibility Mapping (UTS #46)
-+++++++++++++++++++++++++++++++
-
-As described in `RFC 5895 <https://tools.ietf.org/html/rfc5895>`_, the IDNA
-specification does not normalize input from different potential ways a user
-may input a domain name. This functionality, known as a “mapping”, is
-considered by the specification to be a local user-interface issue distinct
-from IDNA conversion functionality.
-
-This library provides one such mapping, that was developed by the Unicode
-Consortium. Known as `Unicode IDNA Compatibility Processing <https://unicode.org/reports/tr46/>`_,
-it provides for both a regular mapping for typical applications, as well as
-a transitional mapping to help migrate from older IDNA 2003 applications.
-
-For example, “Königsgäßchen” is not a permissible label as *LATIN CAPITAL
-LETTER K* is not allowed (nor are capital letters in general). UTS 46 will
-convert this into lower case prior to applying the IDNA conversion.
-
-.. code-block:: pycon
-
- >>> import idna
- >>> idna.encode('Königsgäßchen')
- ...
- idna.core.InvalidCodepoint: Codepoint U+004B at position 1 of 'Königsgäßchen' not allowed
- >>> idna.encode('Königsgäßchen', uts46=True)
- b'xn--knigsgchen-b4a3dun'
- >>> print(idna.decode('xn--knigsgchen-b4a3dun'))
- königsgäßchen
-
-Transitional processing provides conversions to help transition from the older
-2003 standard to the current standard. For example, in the original IDNA
-specification, the *LATIN SMALL LETTER SHARP S* (ß) was converted into two
-*LATIN SMALL LETTER S* (ss), whereas in the current IDNA specification this
-conversion is not performed.
-
-.. code-block:: pycon
-
- >>> idna.encode('Königsgäßchen', uts46=True, transitional=True)
-    b'xn--knigsgsschen-lcb0w'
-
-Implementors should use transitional processing with caution, only in rare
-cases where conversion from legacy labels to current labels must be performed
-(i.e. IDNA implementations that pre-date 2008). For typical applications
-that just need to convert labels, transitional processing is unlikely to be
-beneficial and could produce unexpected incompatible results.
-
-``encodings.idna`` Compatibility
-++++++++++++++++++++++++++++++++
-
-Function calls from the Python built-in ``encodings.idna`` module are
-mapped to their IDNA 2008 equivalents using the ``idna.compat`` module.
-Simply substitute the ``import`` clause in your code to refer to the
-new module name.
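-
-As a short sketch of that substitution (``ToASCII`` and ``ToUnicode`` are the
-wrappers defined in ``idna/compat.py``; the expected values simply mirror the
-``encode``/``decode`` examples above):
-
-.. code-block:: pycon
-
-    >>> from idna import compat
-    >>> compat.ToASCII('ドメイン.テスト')
-    b'xn--eckwd4c7c.xn--zckzah'
-    >>> compat.ToUnicode(b'xn--eckwd4c7c.xn--zckzah')
-    'ドメイン.テスト'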
-
-Exceptions
-----------
-
-All errors raised during conversion under the specification derive from the
-``idna.IDNAError`` base class.
-
-More specific exceptions may be generated: ``idna.IDNABidiError``
-when the error reflects an illegal combination of left-to-right and
-right-to-left characters in a label; ``idna.InvalidCodepoint`` when
-a specific codepoint is an illegal character in an IDN label (i.e.
-INVALID); and ``idna.InvalidCodepointContext`` when the codepoint is
-illegal based on its positional context (i.e. it is CONTEXTO or CONTEXTJ
-but the contextual requirements are not satisfied.)
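-
-A minimal sketch of catching these (assuming ``*`` is rejected as a
-disallowed codepoint, so ``InvalidCodepoint`` is the subclass raised):
-
-.. code-block:: pycon
-
-    >>> import idna
-    >>> try:
-    ...     idna.encode('*.example.com')
-    ... except idna.IDNAError as exc:  # common base class for all of the above
-    ...     print(type(exc).__name__)
-    InvalidCodepoint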
-
-Building and Diagnostics
-------------------------
-
-The IDNA and UTS 46 functionality relies upon pre-calculated lookup
-tables for performance. These tables are derived from computing against
-eligibility criteria in the respective standards. These tables are
-computed using the command-line script ``tools/idna-data``.
-
-This tool will fetch relevant codepoint data from the Unicode repository
-and perform the required calculations to identify eligibility. There are
-three main modes:
-
-* ``idna-data make-libdata``. Generates ``idnadata.py`` and ``uts46data.py``,
-  the pre-calculated lookup tables used for IDNA and UTS 46 conversions. Implementors
- who wish to track this library against a different Unicode version may use this tool
- to manually generate a different version of the ``idnadata.py`` and ``uts46data.py``
- files.
-
-* ``idna-data make-table``. Generate a table of the IDNA disposition
- (e.g. PVALID, CONTEXTJ, CONTEXTO) in the format found in Appendix B.1 of RFC
-  5892 and the pre-computed tables published by `IANA <https://www.iana.org/assignments/idna-tables-6.3.0/idna-tables-6.3.0.xhtml>`_.
-
-* ``idna-data U+0061``. Prints debugging output on the various properties
- associated with an individual Unicode codepoint (in this case, U+0061), that are
- used to assess the IDNA and UTS 46 status of a codepoint. This is helpful in debugging
- or analysis.
-
-The tool accepts a number of arguments, described using ``idna-data -h``. Most notably,
-the ``--version`` argument allows the specification of the version of Unicode to use
-in computing the table data. For example, ``idna-data --version 9.0.0 make-libdata``
-will generate library data against Unicode 9.0.0.
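-
-For reference, the three modes described above would be invoked roughly as
-follows (the flags and subcommands are exactly the ones documented here):
-
-.. code-block:: bash
-
-    $ idna-data --version 9.0.0 make-libdata   # regenerate idnadata.py / uts46data.py
-    $ idna-data make-table                     # Appendix B.1-style disposition table
-    $ idna-data U+0061                         # per-codepoint diagnostic output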
-
-
-Additional Notes
-----------------
-
-* **Packages**. The latest tagged release version is published in the
-  `Python Package Index <https://pypi.org/project/idna/>`_.
-
-* **Version support**. This library supports Python 3.5 and higher. As this library
- serves as a low-level toolkit for a variety of applications, many of which strive
- for broad compatibility with older Python versions, there is no rush to remove
- older intepreter support. Removing support for older versions should be well
-  older interpreter support. Removing support for older versions should be well
-
-* **Python 2**. Python 2 is supported by version 2.x of this library. While active
- development of the version 2.x series has ended, notable issues being corrected
- may be backported to 2.x. Use "idna<3" in your requirements file if you need this
- library for a Python 2 application.
-
-* **Testing**. The library has a test suite based on each rule of the IDNA specification, as
- well as tests that are provided as part of the Unicode Technical Standard 46,
-  `Unicode IDNA Compatibility Processing <https://unicode.org/reports/tr46/>`_.
-
-* **Emoji**. It is an occasional request to support emoji domains in this library. Encoding
- of symbols like emoji is expressly prohibited by the technical standard IDNA 2008 and
- emoji domains are broadly phased out across the domain industry due to associated security
-  risks. For now, applications that need to support these non-compliant labels may
-  wish to consider trying the encode/decode operation in this library first, and then falling
-  back to using ``encodings.idna``. See `the Github project <https://github.com/kjd/idna>`_
- for more discussion.
-
diff --git a/env/lib/python3.9/site-packages/idna-3.3.dist-info/RECORD b/env/lib/python3.9/site-packages/idna-3.3.dist-info/RECORD
deleted file mode 100644
index 2c9f0dd..0000000
--- a/env/lib/python3.9/site-packages/idna-3.3.dist-info/RECORD
+++ /dev/null
@@ -1,23 +0,0 @@
-idna-3.3.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
-idna-3.3.dist-info/LICENSE.md,sha256=otbk2UC9JNvnuWRc3hmpeSzFHbeuDVrNMBrIYMqj6DY,1523
-idna-3.3.dist-info/METADATA,sha256=BdqiAf8ou4x1nzIHp2_sDfXWjl7BrSUGpOeVzbYHQuQ,9765
-idna-3.3.dist-info/RECORD,,
-idna-3.3.dist-info/WHEEL,sha256=ewwEueio1C2XeHTvT17n8dZUJgOvyCWCt0WVNLClP9o,92
-idna-3.3.dist-info/top_level.txt,sha256=jSag9sEDqvSPftxOQy-ABfGV_RSy7oFh4zZJpODV8k0,5
-idna/__init__.py,sha256=KJQN1eQBr8iIK5SKrJ47lXvxG0BJ7Lm38W4zT0v_8lk,849
-idna/__pycache__/__init__.cpython-39.pyc,,
-idna/__pycache__/codec.cpython-39.pyc,,
-idna/__pycache__/compat.cpython-39.pyc,,
-idna/__pycache__/core.cpython-39.pyc,,
-idna/__pycache__/idnadata.cpython-39.pyc,,
-idna/__pycache__/intranges.cpython-39.pyc,,
-idna/__pycache__/package_data.cpython-39.pyc,,
-idna/__pycache__/uts46data.cpython-39.pyc,,
-idna/codec.py,sha256=6ly5odKfqrytKT9_7UrlGklHnf1DSK2r9C6cSM4sa28,3374
-idna/compat.py,sha256=0_sOEUMT4CVw9doD3vyRhX80X19PwqFoUBs7gWsFME4,321
-idna/core.py,sha256=RFIkY-HhFZaDoBEFjGwyGd_vWI04uOAQjnzueMWqwOU,12795
-idna/idnadata.py,sha256=fzMzkCea2xieVxcrjngJ-2pLsKQNejPCZFlBajIuQdw,44025
-idna/intranges.py,sha256=YBr4fRYuWH7kTKS2tXlFjM24ZF1Pdvcir-aywniInqg,1881
-idna/package_data.py,sha256=szxQhV0ZD0nKJ84Kuobw3l8q4_KeCyXjFRdpwIpKZmw,21
-idna/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-idna/uts46data.py,sha256=o-D7V-a0fOLZNd7tvxof6MYfUd0TBZzE2bLR5XO67xU,204400
diff --git a/env/lib/python3.9/site-packages/idna-3.3.dist-info/WHEEL b/env/lib/python3.9/site-packages/idna-3.3.dist-info/WHEEL
deleted file mode 100644
index 5bad85f..0000000
--- a/env/lib/python3.9/site-packages/idna-3.3.dist-info/WHEEL
+++ /dev/null
@@ -1,5 +0,0 @@
-Wheel-Version: 1.0
-Generator: bdist_wheel (0.37.0)
-Root-Is-Purelib: true
-Tag: py3-none-any
-
diff --git a/env/lib/python3.9/site-packages/idna-3.3.dist-info/top_level.txt b/env/lib/python3.9/site-packages/idna-3.3.dist-info/top_level.txt
deleted file mode 100644
index c40472e..0000000
--- a/env/lib/python3.9/site-packages/idna-3.3.dist-info/top_level.txt
+++ /dev/null
@@ -1 +0,0 @@
-idna
diff --git a/env/lib/python3.9/site-packages/idna/__init__.py b/env/lib/python3.9/site-packages/idna/__init__.py
deleted file mode 100644
index a40eeaf..0000000
--- a/env/lib/python3.9/site-packages/idna/__init__.py
+++ /dev/null
@@ -1,44 +0,0 @@
-from .package_data import __version__
-from .core import (
- IDNABidiError,
- IDNAError,
- InvalidCodepoint,
- InvalidCodepointContext,
- alabel,
- check_bidi,
- check_hyphen_ok,
- check_initial_combiner,
- check_label,
- check_nfc,
- decode,
- encode,
- ulabel,
- uts46_remap,
- valid_contextj,
- valid_contexto,
- valid_label_length,
- valid_string_length,
-)
-from .intranges import intranges_contain
-
-__all__ = [
- "IDNABidiError",
- "IDNAError",
- "InvalidCodepoint",
- "InvalidCodepointContext",
- "alabel",
- "check_bidi",
- "check_hyphen_ok",
- "check_initial_combiner",
- "check_label",
- "check_nfc",
- "decode",
- "encode",
- "intranges_contain",
- "ulabel",
- "uts46_remap",
- "valid_contextj",
- "valid_contexto",
- "valid_label_length",
- "valid_string_length",
-]
diff --git a/env/lib/python3.9/site-packages/idna/codec.py b/env/lib/python3.9/site-packages/idna/codec.py
deleted file mode 100644
index 1ca9ba6..0000000
--- a/env/lib/python3.9/site-packages/idna/codec.py
+++ /dev/null
@@ -1,112 +0,0 @@
-from .core import encode, decode, alabel, ulabel, IDNAError
-import codecs
-import re
-from typing import Tuple, Optional
-
-_unicode_dots_re = re.compile('[\u002e\u3002\uff0e\uff61]')
-
-class Codec(codecs.Codec):
-
- def encode(self, data: str, errors: str = 'strict') -> Tuple[bytes, int]:
- if errors != 'strict':
- raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
-
- if not data:
- return b"", 0
-
- return encode(data), len(data)
-
- def decode(self, data: bytes, errors: str = 'strict') -> Tuple[str, int]:
- if errors != 'strict':
- raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
-
- if not data:
- return '', 0
-
- return decode(data), len(data)
-
-class IncrementalEncoder(codecs.BufferedIncrementalEncoder):
- def _buffer_encode(self, data: str, errors: str, final: bool) -> Tuple[str, int]: # type: ignore
- if errors != 'strict':
- raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
-
- if not data:
- return "", 0
-
- labels = _unicode_dots_re.split(data)
- trailing_dot = ''
- if labels:
- if not labels[-1]:
- trailing_dot = '.'
- del labels[-1]
- elif not final:
- # Keep potentially unfinished label until the next call
- del labels[-1]
- if labels:
- trailing_dot = '.'
-
- result = []
- size = 0
- for label in labels:
- result.append(alabel(label))
- if size:
- size += 1
- size += len(label)
-
- # Join with U+002E
- result_str = '.'.join(result) + trailing_dot # type: ignore
- size += len(trailing_dot)
- return result_str, size
-
-class IncrementalDecoder(codecs.BufferedIncrementalDecoder):
- def _buffer_decode(self, data: str, errors: str, final: bool) -> Tuple[str, int]: # type: ignore
- if errors != 'strict':
- raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
-
- if not data:
- return ('', 0)
-
- labels = _unicode_dots_re.split(data)
- trailing_dot = ''
- if labels:
- if not labels[-1]:
- trailing_dot = '.'
- del labels[-1]
- elif not final:
- # Keep potentially unfinished label until the next call
- del labels[-1]
- if labels:
- trailing_dot = '.'
-
- result = []
- size = 0
- for label in labels:
- result.append(ulabel(label))
- if size:
- size += 1
- size += len(label)
-
- result_str = '.'.join(result) + trailing_dot
- size += len(trailing_dot)
- return (result_str, size)
-
-
-class StreamWriter(Codec, codecs.StreamWriter):
- pass
-
-
-class StreamReader(Codec, codecs.StreamReader):
- pass
-
-
-def getregentry() -> codecs.CodecInfo:
- # Compatibility as a search_function for codecs.register()
- return codecs.CodecInfo(
- name='idna',
- encode=Codec().encode, # type: ignore
- decode=Codec().decode, # type: ignore
- incrementalencoder=IncrementalEncoder,
- incrementaldecoder=IncrementalDecoder,
- streamwriter=StreamWriter,
- streamreader=StreamReader,
- )
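
`idna/codec.py` wraps the same `encode`/`decode` machinery in the stdlib `codecs` protocol, including buffered incremental variants that hold back a potentially unfinished trailing label between calls. Note that the file only defines `getregentry()`; it never calls `codecs.register()` itself, so a consumer has to supply the search function. A hedged sketch of that wiring, using the hypothetical alias `idna2008` to avoid colliding with the stdlib's built-in `idna` (IDNA 2003) codec:

```python
import codecs
from idna.codec import getregentry  # assumes the idna package is installed

def _search(name: str):
    # Answer only for our hypothetical alias; return None otherwise
    # so lookup falls through to the other registered codecs.
    if name == 'idna2008':
        return getregentry()
    return None

codecs.register(_search)
print('bücher.example'.encode('idna2008'))  # b'xn--bcher-kva.example'
```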
diff --git a/env/lib/python3.9/site-packages/idna/compat.py b/env/lib/python3.9/site-packages/idna/compat.py
deleted file mode 100644
index 786e6bd..0000000
--- a/env/lib/python3.9/site-packages/idna/compat.py
+++ /dev/null
@@ -1,13 +0,0 @@
-from .core import *
-from .codec import *
-from typing import Any, Union
-
-def ToASCII(label: str) -> bytes:
- return encode(label)
-
-def ToUnicode(label: Union[bytes, bytearray]) -> str:
- return decode(label)
-
-def nameprep(s: Any) -> None:
- raise NotImplementedError('IDNA 2008 does not utilise nameprep protocol')
-
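
The `compat` module just forwards the older IDNA 2003-style names onto the IDNA 2008 codepaths, and deliberately refuses `nameprep`, which has no equivalent in the newer protocol. For instance:

```python
from idna.compat import ToASCII, ToUnicode

assert ToASCII('bücher') == b'xn--bcher-kva'
assert ToUnicode(b'xn--bcher-kva') == 'bücher'
```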
diff --git a/env/lib/python3.9/site-packages/idna/core.py b/env/lib/python3.9/site-packages/idna/core.py
deleted file mode 100644
index 55ab967..0000000
--- a/env/lib/python3.9/site-packages/idna/core.py
+++ /dev/null
@@ -1,397 +0,0 @@
-from . import idnadata
-import bisect
-import unicodedata
-import re
-from typing import Union, Optional
-from .intranges import intranges_contain
-
-_virama_combining_class = 9
-_alabel_prefix = b'xn--'
-_unicode_dots_re = re.compile('[\u002e\u3002\uff0e\uff61]')
-
-class IDNAError(UnicodeError):
- """ Base exception for all IDNA-encoding related problems """
- pass
-
-
-class IDNABidiError(IDNAError):
- """ Exception when bidirectional requirements are not satisfied """
- pass
-
-
-class InvalidCodepoint(IDNAError):
- """ Exception when a disallowed or unallocated codepoint is used """
- pass
-
-
-class InvalidCodepointContext(IDNAError):
- """ Exception when the codepoint is not valid in the context it is used """
- pass
-
-
-def _combining_class(cp: int) -> int:
- v = unicodedata.combining(chr(cp))
- if v == 0:
- if not unicodedata.name(chr(cp)):
- raise ValueError('Unknown character in unicodedata')
- return v
-
-def _is_script(cp: str, script: str) -> bool:
- return intranges_contain(ord(cp), idnadata.scripts[script])
-
-def _punycode(s: str) -> bytes:
- return s.encode('punycode')
-
-def _unot(s: int) -> str:
- return 'U+{:04X}'.format(s)
-
-
-def valid_label_length(label: Union[bytes, str]) -> bool:
- if len(label) > 63:
- return False
- return True
-
-
-def valid_string_length(label: Union[bytes, str], trailing_dot: bool) -> bool:
- if len(label) > (254 if trailing_dot else 253):
- return False
- return True
-
-
-def check_bidi(label: str, check_ltr: bool = False) -> bool:
- # Bidi rules should only be applied if string contains RTL characters
- bidi_label = False
- for (idx, cp) in enumerate(label, 1):
- direction = unicodedata.bidirectional(cp)
- if direction == '':
- # String likely comes from a newer version of Unicode
- raise IDNABidiError('Unknown directionality in label {} at position {}'.format(repr(label), idx))
- if direction in ['R', 'AL', 'AN']:
- bidi_label = True
- if not bidi_label and not check_ltr:
- return True
-
- # Bidi rule 1
- direction = unicodedata.bidirectional(label[0])
- if direction in ['R', 'AL']:
- rtl = True
- elif direction == 'L':
- rtl = False
- else:
- raise IDNABidiError('First codepoint in label {} must be directionality L, R or AL'.format(repr(label)))
-
- valid_ending = False
- number_type = None # type: Optional[str]
- for (idx, cp) in enumerate(label, 1):
- direction = unicodedata.bidirectional(cp)
-
- if rtl:
- # Bidi rule 2
- if not direction in ['R', 'AL', 'AN', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
- raise IDNABidiError('Invalid direction for codepoint at position {} in a right-to-left label'.format(idx))
- # Bidi rule 3
- if direction in ['R', 'AL', 'EN', 'AN']:
- valid_ending = True
- elif direction != 'NSM':
- valid_ending = False
- # Bidi rule 4
- if direction in ['AN', 'EN']:
- if not number_type:
- number_type = direction
- else:
- if number_type != direction:
- raise IDNABidiError('Can not mix numeral types in a right-to-left label')
- else:
- # Bidi rule 5
- if not direction in ['L', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
- raise IDNABidiError('Invalid direction for codepoint at position {} in a left-to-right label'.format(idx))
- # Bidi rule 6
- if direction in ['L', 'EN']:
- valid_ending = True
- elif direction != 'NSM':
- valid_ending = False
-
- if not valid_ending:
- raise IDNABidiError('Label ends with illegal codepoint directionality')
-
- return True
-
-
-def check_initial_combiner(label: str) -> bool:
- if unicodedata.category(label[0])[0] == 'M':
- raise IDNAError('Label begins with an illegal combining character')
- return True
-
-
-def check_hyphen_ok(label: str) -> bool:
- if label[2:4] == '--':
- raise IDNAError('Label has disallowed hyphens in 3rd and 4th position')
- if label[0] == '-' or label[-1] == '-':
- raise IDNAError('Label must not start or end with a hyphen')
- return True
-
-
-def check_nfc(label: str) -> None:
- if unicodedata.normalize('NFC', label) != label:
- raise IDNAError('Label must be in Normalization Form C')
-
-
-def valid_contextj(label: str, pos: int) -> bool:
- cp_value = ord(label[pos])
-
- if cp_value == 0x200c:
-
- if pos > 0:
- if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
- return True
-
- ok = False
- for i in range(pos-1, -1, -1):
- joining_type = idnadata.joining_types.get(ord(label[i]))
- if joining_type == ord('T'):
- continue
- if joining_type in [ord('L'), ord('D')]:
- ok = True
- break
-
- if not ok:
- return False
-
- ok = False
- for i in range(pos+1, len(label)):
- joining_type = idnadata.joining_types.get(ord(label[i]))
- if joining_type == ord('T'):
- continue
- if joining_type in [ord('R'), ord('D')]:
- ok = True
- break
- return ok
-
- if cp_value == 0x200d:
-
- if pos > 0:
- if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
- return True
- return False
-
- else:
-
- return False
-
-
-def valid_contexto(label: str, pos: int, exception: bool = False) -> bool:
- cp_value = ord(label[pos])
-
- if cp_value == 0x00b7:
- if 0 < pos < len(label)-1:
- if ord(label[pos - 1]) == 0x006c and ord(label[pos + 1]) == 0x006c:
- return True
- return False
-
- elif cp_value == 0x0375:
- if pos < len(label)-1 and len(label) > 1:
- return _is_script(label[pos + 1], 'Greek')
- return False
-
- elif cp_value == 0x05f3 or cp_value == 0x05f4:
- if pos > 0:
- return _is_script(label[pos - 1], 'Hebrew')
- return False
-
- elif cp_value == 0x30fb:
- for cp in label:
- if cp == '\u30fb':
- continue
- if _is_script(cp, 'Hiragana') or _is_script(cp, 'Katakana') or _is_script(cp, 'Han'):
- return True
- return False
-
- elif 0x660 <= cp_value <= 0x669:
- for cp in label:
- if 0x6f0 <= ord(cp) <= 0x06f9:
- return False
- return True
-
- elif 0x6f0 <= cp_value <= 0x6f9:
- for cp in label:
- if 0x660 <= ord(cp) <= 0x0669:
- return False
- return True
-
- return False
-
-
-def check_label(label: Union[str, bytes, bytearray]) -> None:
- if isinstance(label, (bytes, bytearray)):
- label = label.decode('utf-8')
- if len(label) == 0:
- raise IDNAError('Empty Label')
-
- check_nfc(label)
- check_hyphen_ok(label)
- check_initial_combiner(label)
-
- for (pos, cp) in enumerate(label):
- cp_value = ord(cp)
- if intranges_contain(cp_value, idnadata.codepoint_classes['PVALID']):
- continue
- elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTJ']):
- try:
- if not valid_contextj(label, pos):
- raise InvalidCodepointContext('Joiner {} not allowed at position {} in {}'.format(
- _unot(cp_value), pos+1, repr(label)))
- except ValueError:
- raise IDNAError('Unknown codepoint adjacent to joiner {} at position {} in {}'.format(
- _unot(cp_value), pos+1, repr(label)))
- elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTO']):
- if not valid_contexto(label, pos):
- raise InvalidCodepointContext('Codepoint {} not allowed at position {} in {}'.format(_unot(cp_value), pos+1, repr(label)))
- else:
- raise InvalidCodepoint('Codepoint {} at position {} of {} not allowed'.format(_unot(cp_value), pos+1, repr(label)))
-
- check_bidi(label)
-
-
-def alabel(label: str) -> bytes:
- try:
- label_bytes = label.encode('ascii')
- ulabel(label_bytes)
- if not valid_label_length(label_bytes):
- raise IDNAError('Label too long')
- return label_bytes
- except UnicodeEncodeError:
- pass
-
- if not label:
- raise IDNAError('No Input')
-
- label = str(label)
- check_label(label)
- label_bytes = _punycode(label)
- label_bytes = _alabel_prefix + label_bytes
-
- if not valid_label_length(label_bytes):
- raise IDNAError('Label too long')
-
- return label_bytes
-
-
-def ulabel(label: Union[str, bytes, bytearray]) -> str:
- if not isinstance(label, (bytes, bytearray)):
- try:
- label_bytes = label.encode('ascii')
- except UnicodeEncodeError:
- check_label(label)
- return label
- else:
- label_bytes = label
-
- label_bytes = label_bytes.lower()
- if label_bytes.startswith(_alabel_prefix):
- label_bytes = label_bytes[len(_alabel_prefix):]
- if not label_bytes:
- raise IDNAError('Malformed A-label, no Punycode eligible content found')
- if label_bytes.decode('ascii')[-1] == '-':
- raise IDNAError('A-label must not end with a hyphen')
- else:
- check_label(label_bytes)
- return label_bytes.decode('ascii')
-
- try:
- label = label_bytes.decode('punycode')
- except UnicodeError:
- raise IDNAError('Invalid A-label')
- check_label(label)
- return label
-
-
-def uts46_remap(domain: str, std3_rules: bool = True, transitional: bool = False) -> str:
- """Re-map the characters in the string according to UTS46 processing."""
- from .uts46data import uts46data
- output = ''
-
- for pos, char in enumerate(domain):
- code_point = ord(char)
- try:
- uts46row = uts46data[code_point if code_point < 256 else
- bisect.bisect_left(uts46data, (code_point, 'Z')) - 1]
- status = uts46row[1]
- replacement = None # type: Optional[str]
- if len(uts46row) == 3:
- replacement = uts46row[2] # type: ignore
- if (status == 'V' or
- (status == 'D' and not transitional) or
- (status == '3' and not std3_rules and replacement is None)):
- output += char
- elif replacement is not None and (status == 'M' or
- (status == '3' and not std3_rules) or
- (status == 'D' and transitional)):
- output += replacement
- elif status != 'I':
- raise IndexError()
- except IndexError:
- raise InvalidCodepoint(
- 'Codepoint {} not allowed at position {} in {}'.format(
- _unot(code_point), pos + 1, repr(domain)))
-
- return unicodedata.normalize('NFC', output)
-
-
-def encode(s: Union[str, bytes, bytearray], strict: bool = False, uts46: bool = False, std3_rules: bool = False, transitional: bool = False) -> bytes:
- if isinstance(s, (bytes, bytearray)):
- s = s.decode('ascii')
- if uts46:
- s = uts46_remap(s, std3_rules, transitional)
- trailing_dot = False
- result = []
- if strict:
- labels = s.split('.')
- else:
- labels = _unicode_dots_re.split(s)
- if not labels or labels == ['']:
- raise IDNAError('Empty domain')
- if labels[-1] == '':
- del labels[-1]
- trailing_dot = True
- for label in labels:
- s = alabel(label)
- if s:
- result.append(s)
- else:
- raise IDNAError('Empty label')
- if trailing_dot:
- result.append(b'')
- s = b'.'.join(result)
- if not valid_string_length(s, trailing_dot):
- raise IDNAError('Domain too long')
- return s
-
-
-def decode(s: Union[str, bytes, bytearray], strict: bool = False, uts46: bool = False, std3_rules: bool = False) -> str:
- try:
- if isinstance(s, (bytes, bytearray)):
- s = s.decode('ascii')
- except UnicodeDecodeError:
- raise IDNAError('Invalid ASCII in A-label')
- if uts46:
- s = uts46_remap(s, std3_rules, False)
- trailing_dot = False
- result = []
- if not strict:
- labels = _unicode_dots_re.split(s)
- else:
- labels = s.split('.')
- if not labels or labels == ['']:
- raise IDNAError('Empty domain')
- if not labels[-1]:
- del labels[-1]
- trailing_dot = True
- for label in labels:
- s = ulabel(label)
- if s:
- result.append(s)
- else:
- raise IDNAError('Empty label')
- if trailing_dot:
- result.append('')
- return '.'.join(result)
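
`core.py` is the heart of the deleted package: `check_label()` chains the NFC, hyphen-placement, initial-combiner, codepoint-class, and bidi checks from RFC 5891/5893, while `alabel()`/`ulabel()` convert between U-labels and `xn--`-prefixed A-labels on top of Python's built-in `punycode` codec. A hedged walk through those same stages (the function names are the module's own; assumes the `idna` package is installed):

```python
from idna.core import alabel, ulabel, check_label, IDNAError

label = 'bücher'
check_label(label)         # NFC, hyphen, combiner, class, and bidi checks

a = alabel(label)          # b'xn--' prefix + punycode-encoded body
assert a == b'xn--' + label.encode('punycode')
assert ulabel(a) == label  # round-trips back to the U-label

try:
    alabel('ab--cd')       # hyphens in positions 3 and 4 are rejected
except IDNAError as exc:
    print(exc)             # Label has disallowed hyphens in 3rd and 4th position
```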
diff --git a/env/lib/python3.9/site-packages/idna/idnadata.py b/env/lib/python3.9/site-packages/idna/idnadata.py
deleted file mode 100644
index 1b5805d..0000000
--- a/env/lib/python3.9/site-packages/idna/idnadata.py
+++ /dev/null
@@ -1,2137 +0,0 @@
-# This file is automatically generated by tools/idna-data
-
-__version__ = '14.0.0'
-scripts = {
- 'Greek': (
- 0x37000000374,
- 0x37500000378,
- 0x37a0000037e,
- 0x37f00000380,
- 0x38400000385,
- 0x38600000387,
- 0x3880000038b,
- 0x38c0000038d,
- 0x38e000003a2,
- 0x3a3000003e2,
- 0x3f000000400,
- 0x1d2600001d2b,
- 0x1d5d00001d62,
- 0x1d6600001d6b,
- 0x1dbf00001dc0,
- 0x1f0000001f16,
- 0x1f1800001f1e,
- 0x1f2000001f46,
- 0x1f4800001f4e,
- 0x1f5000001f58,
- 0x1f5900001f5a,
- 0x1f5b00001f5c,
- 0x1f5d00001f5e,
- 0x1f5f00001f7e,
- 0x1f8000001fb5,
- 0x1fb600001fc5,
- 0x1fc600001fd4,
- 0x1fd600001fdc,
- 0x1fdd00001ff0,
- 0x1ff200001ff5,
- 0x1ff600001fff,
- 0x212600002127,
- 0xab650000ab66,
- 0x101400001018f,
- 0x101a0000101a1,
- 0x1d2000001d246,
- ),
- 'Han': (
- 0x2e8000002e9a,
- 0x2e9b00002ef4,
- 0x2f0000002fd6,
- 0x300500003006,
- 0x300700003008,
- 0x30210000302a,
- 0x30380000303c,
- 0x340000004dc0,
- 0x4e000000a000,
- 0xf9000000fa6e,
- 0xfa700000fada,
- 0x16fe200016fe4,
- 0x16ff000016ff2,
- 0x200000002a6e0,
- 0x2a7000002b739,
- 0x2b7400002b81e,
- 0x2b8200002cea2,
- 0x2ceb00002ebe1,
- 0x2f8000002fa1e,
- 0x300000003134b,
- ),
- 'Hebrew': (
- 0x591000005c8,
- 0x5d0000005eb,
- 0x5ef000005f5,
- 0xfb1d0000fb37,
- 0xfb380000fb3d,
- 0xfb3e0000fb3f,
- 0xfb400000fb42,
- 0xfb430000fb45,
- 0xfb460000fb50,
- ),
- 'Hiragana': (
- 0x304100003097,
- 0x309d000030a0,
- 0x1b0010001b120,
- 0x1b1500001b153,
- 0x1f2000001f201,
- ),
- 'Katakana': (
- 0x30a1000030fb,
- 0x30fd00003100,
- 0x31f000003200,
- 0x32d0000032ff,
- 0x330000003358,
- 0xff660000ff70,
- 0xff710000ff9e,
- 0x1aff00001aff4,
- 0x1aff50001affc,
- 0x1affd0001afff,
- 0x1b0000001b001,
- 0x1b1200001b123,
- 0x1b1640001b168,
- ),
-}
-joining_types = {
- 0x600: 85,
- 0x601: 85,
- 0x602: 85,
- 0x603: 85,
- 0x604: 85,
- 0x605: 85,
- 0x608: 85,
- 0x60b: 85,
- 0x620: 68,
- 0x621: 85,
- 0x622: 82,
- 0x623: 82,
- 0x624: 82,
- 0x625: 82,
- 0x626: 68,
- 0x627: 82,
- 0x628: 68,
- 0x629: 82,
- 0x62a: 68,
- 0x62b: 68,
- 0x62c: 68,
- 0x62d: 68,
- 0x62e: 68,
- 0x62f: 82,
- 0x630: 82,
- 0x631: 82,
- 0x632: 82,
- 0x633: 68,
- 0x634: 68,
- 0x635: 68,
- 0x636: 68,
- 0x637: 68,
- 0x638: 68,
- 0x639: 68,
- 0x63a: 68,
- 0x63b: 68,
- 0x63c: 68,
- 0x63d: 68,
- 0x63e: 68,
- 0x63f: 68,
- 0x640: 67,
- 0x641: 68,
- 0x642: 68,
- 0x643: 68,
- 0x644: 68,
- 0x645: 68,
- 0x646: 68,
- 0x647: 68,
- 0x648: 82,
- 0x649: 68,
- 0x64a: 68,
- 0x66e: 68,
- 0x66f: 68,
- 0x671: 82,
- 0x672: 82,
- 0x673: 82,
- 0x674: 85,
- 0x675: 82,
- 0x676: 82,
- 0x677: 82,
- 0x678: 68,
- 0x679: 68,
- 0x67a: 68,
- 0x67b: 68,
- 0x67c: 68,
- 0x67d: 68,
- 0x67e: 68,
- 0x67f: 68,
- 0x680: 68,
- 0x681: 68,
- 0x682: 68,
- 0x683: 68,
- 0x684: 68,
- 0x685: 68,
- 0x686: 68,
- 0x687: 68,
- 0x688: 82,
- 0x689: 82,
- 0x68a: 82,
- 0x68b: 82,
- 0x68c: 82,
- 0x68d: 82,
- 0x68e: 82,
- 0x68f: 82,
- 0x690: 82,
- 0x691: 82,
- 0x692: 82,
- 0x693: 82,
- 0x694: 82,
- 0x695: 82,
- 0x696: 82,
- 0x697: 82,
- 0x698: 82,
- 0x699: 82,
- 0x69a: 68,
- 0x69b: 68,
- 0x69c: 68,
- 0x69d: 68,
- 0x69e: 68,
- 0x69f: 68,
- 0x6a0: 68,
- 0x6a1: 68,
- 0x6a2: 68,
- 0x6a3: 68,
- 0x6a4: 68,
- 0x6a5: 68,
- 0x6a6: 68,
- 0x6a7: 68,
- 0x6a8: 68,
- 0x6a9: 68,
- 0x6aa: 68,
- 0x6ab: 68,
- 0x6ac: 68,
- 0x6ad: 68,
- 0x6ae: 68,
- 0x6af: 68,
- 0x6b0: 68,
- 0x6b1: 68,
- 0x6b2: 68,
- 0x6b3: 68,
- 0x6b4: 68,
- 0x6b5: 68,
- 0x6b6: 68,
- 0x6b7: 68,
- 0x6b8: 68,
- 0x6b9: 68,
- 0x6ba: 68,
- 0x6bb: 68,
- 0x6bc: 68,
- 0x6bd: 68,
- 0x6be: 68,
- 0x6bf: 68,
- 0x6c0: 82,
- 0x6c1: 68,
- 0x6c2: 68,
- 0x6c3: 82,
- 0x6c4: 82,
- 0x6c5: 82,
- 0x6c6: 82,
- 0x6c7: 82,
- 0x6c8: 82,
- 0x6c9: 82,
- 0x6ca: 82,
- 0x6cb: 82,
- 0x6cc: 68,
- 0x6cd: 82,
- 0x6ce: 68,
- 0x6cf: 82,
- 0x6d0: 68,
- 0x6d1: 68,
- 0x6d2: 82,
- 0x6d3: 82,
- 0x6d5: 82,
- 0x6dd: 85,
- 0x6ee: 82,
- 0x6ef: 82,
- 0x6fa: 68,
- 0x6fb: 68,
- 0x6fc: 68,
- 0x6ff: 68,
- 0x70f: 84,
- 0x710: 82,
- 0x712: 68,
- 0x713: 68,
- 0x714: 68,
- 0x715: 82,
- 0x716: 82,
- 0x717: 82,
- 0x718: 82,
- 0x719: 82,
- 0x71a: 68,
- 0x71b: 68,
- 0x71c: 68,
- 0x71d: 68,
- 0x71e: 82,
- 0x71f: 68,
- 0x720: 68,
- 0x721: 68,
- 0x722: 68,
- 0x723: 68,
- 0x724: 68,
- 0x725: 68,
- 0x726: 68,
- 0x727: 68,
- 0x728: 82,
- 0x729: 68,
- 0x72a: 82,
- 0x72b: 68,
- 0x72c: 82,
- 0x72d: 68,
- 0x72e: 68,
- 0x72f: 82,
- 0x74d: 82,
- 0x74e: 68,
- 0x74f: 68,
- 0x750: 68,
- 0x751: 68,
- 0x752: 68,
- 0x753: 68,
- 0x754: 68,
- 0x755: 68,
- 0x756: 68,
- 0x757: 68,
- 0x758: 68,
- 0x759: 82,
- 0x75a: 82,
- 0x75b: 82,
- 0x75c: 68,
- 0x75d: 68,
- 0x75e: 68,
- 0x75f: 68,
- 0x760: 68,
- 0x761: 68,
- 0x762: 68,
- 0x763: 68,
- 0x764: 68,
- 0x765: 68,
- 0x766: 68,
- 0x767: 68,
- 0x768: 68,
- 0x769: 68,
- 0x76a: 68,
- 0x76b: 82,
- 0x76c: 82,
- 0x76d: 68,
- 0x76e: 68,
- 0x76f: 68,
- 0x770: 68,
- 0x771: 82,
- 0x772: 68,
- 0x773: 82,
- 0x774: 82,
- 0x775: 68,
- 0x776: 68,
- 0x777: 68,
- 0x778: 82,
- 0x779: 82,
- 0x77a: 68,
- 0x77b: 68,
- 0x77c: 68,
- 0x77d: 68,
- 0x77e: 68,
- 0x77f: 68,
- 0x7ca: 68,
- 0x7cb: 68,
- 0x7cc: 68,
- 0x7cd: 68,
- 0x7ce: 68,
- 0x7cf: 68,
- 0x7d0: 68,
- 0x7d1: 68,
- 0x7d2: 68,
- 0x7d3: 68,
- 0x7d4: 68,
- 0x7d5: 68,
- 0x7d6: 68,
- 0x7d7: 68,
- 0x7d8: 68,
- 0x7d9: 68,
- 0x7da: 68,
- 0x7db: 68,
- 0x7dc: 68,
- 0x7dd: 68,
- 0x7de: 68,
- 0x7df: 68,
- 0x7e0: 68,
- 0x7e1: 68,
- 0x7e2: 68,
- 0x7e3: 68,
- 0x7e4: 68,
- 0x7e5: 68,
- 0x7e6: 68,
- 0x7e7: 68,
- 0x7e8: 68,
- 0x7e9: 68,
- 0x7ea: 68,
- 0x7fa: 67,
- 0x840: 82,
- 0x841: 68,
- 0x842: 68,
- 0x843: 68,
- 0x844: 68,
- 0x845: 68,
- 0x846: 82,
- 0x847: 82,
- 0x848: 68,
- 0x849: 82,
- 0x84a: 68,
- 0x84b: 68,
- 0x84c: 68,
- 0x84d: 68,
- 0x84e: 68,
- 0x84f: 68,
- 0x850: 68,
- 0x851: 68,
- 0x852: 68,
- 0x853: 68,
- 0x854: 82,
- 0x855: 68,
- 0x856: 82,
- 0x857: 82,
- 0x858: 82,
- 0x860: 68,
- 0x861: 85,
- 0x862: 68,
- 0x863: 68,
- 0x864: 68,
- 0x865: 68,
- 0x866: 85,
- 0x867: 82,
- 0x868: 68,
- 0x869: 82,
- 0x86a: 82,
- 0x870: 82,
- 0x871: 82,
- 0x872: 82,
- 0x873: 82,
- 0x874: 82,
- 0x875: 82,
- 0x876: 82,
- 0x877: 82,
- 0x878: 82,
- 0x879: 82,
- 0x87a: 82,
- 0x87b: 82,
- 0x87c: 82,
- 0x87d: 82,
- 0x87e: 82,
- 0x87f: 82,
- 0x880: 82,
- 0x881: 82,
- 0x882: 82,
- 0x883: 67,
- 0x884: 67,
- 0x885: 67,
- 0x886: 68,
- 0x887: 85,
- 0x888: 85,
- 0x889: 68,
- 0x88a: 68,
- 0x88b: 68,
- 0x88c: 68,
- 0x88d: 68,
- 0x88e: 82,
- 0x890: 85,
- 0x891: 85,
- 0x8a0: 68,
- 0x8a1: 68,
- 0x8a2: 68,
- 0x8a3: 68,
- 0x8a4: 68,
- 0x8a5: 68,
- 0x8a6: 68,
- 0x8a7: 68,
- 0x8a8: 68,
- 0x8a9: 68,
- 0x8aa: 82,
- 0x8ab: 82,
- 0x8ac: 82,
- 0x8ad: 85,
- 0x8ae: 82,
- 0x8af: 68,
- 0x8b0: 68,
- 0x8b1: 82,
- 0x8b2: 82,
- 0x8b3: 68,
- 0x8b4: 68,
- 0x8b5: 68,
- 0x8b6: 68,
- 0x8b7: 68,
- 0x8b8: 68,
- 0x8b9: 82,
- 0x8ba: 68,
- 0x8bb: 68,
- 0x8bc: 68,
- 0x8bd: 68,
- 0x8be: 68,
- 0x8bf: 68,
- 0x8c0: 68,
- 0x8c1: 68,
- 0x8c2: 68,
- 0x8c3: 68,
- 0x8c4: 68,
- 0x8c5: 68,
- 0x8c6: 68,
- 0x8c7: 68,
- 0x8c8: 68,
- 0x8e2: 85,
- 0x1806: 85,
- 0x1807: 68,
- 0x180a: 67,
- 0x180e: 85,
- 0x1820: 68,
- 0x1821: 68,
- 0x1822: 68,
- 0x1823: 68,
- 0x1824: 68,
- 0x1825: 68,
- 0x1826: 68,
- 0x1827: 68,
- 0x1828: 68,
- 0x1829: 68,
- 0x182a: 68,
- 0x182b: 68,
- 0x182c: 68,
- 0x182d: 68,
- 0x182e: 68,
- 0x182f: 68,
- 0x1830: 68,
- 0x1831: 68,
- 0x1832: 68,
- 0x1833: 68,
- 0x1834: 68,
- 0x1835: 68,
- 0x1836: 68,
- 0x1837: 68,
- 0x1838: 68,
- 0x1839: 68,
- 0x183a: 68,
- 0x183b: 68,
- 0x183c: 68,
- 0x183d: 68,
- 0x183e: 68,
- 0x183f: 68,
- 0x1840: 68,
- 0x1841: 68,
- 0x1842: 68,
- 0x1843: 68,
- 0x1844: 68,
- 0x1845: 68,
- 0x1846: 68,
- 0x1847: 68,
- 0x1848: 68,
- 0x1849: 68,
- 0x184a: 68,
- 0x184b: 68,
- 0x184c: 68,
- 0x184d: 68,
- 0x184e: 68,
- 0x184f: 68,
- 0x1850: 68,
- 0x1851: 68,
- 0x1852: 68,
- 0x1853: 68,
- 0x1854: 68,
- 0x1855: 68,
- 0x1856: 68,
- 0x1857: 68,
- 0x1858: 68,
- 0x1859: 68,
- 0x185a: 68,
- 0x185b: 68,
- 0x185c: 68,
- 0x185d: 68,
- 0x185e: 68,
- 0x185f: 68,
- 0x1860: 68,
- 0x1861: 68,
- 0x1862: 68,
- 0x1863: 68,
- 0x1864: 68,
- 0x1865: 68,
- 0x1866: 68,
- 0x1867: 68,
- 0x1868: 68,
- 0x1869: 68,
- 0x186a: 68,
- 0x186b: 68,
- 0x186c: 68,
- 0x186d: 68,
- 0x186e: 68,
- 0x186f: 68,
- 0x1870: 68,
- 0x1871: 68,
- 0x1872: 68,
- 0x1873: 68,
- 0x1874: 68,
- 0x1875: 68,
- 0x1876: 68,
- 0x1877: 68,
- 0x1878: 68,
- 0x1880: 85,
- 0x1881: 85,
- 0x1882: 85,
- 0x1883: 85,
- 0x1884: 85,
- 0x1885: 84,
- 0x1886: 84,
- 0x1887: 68,
- 0x1888: 68,
- 0x1889: 68,
- 0x188a: 68,
- 0x188b: 68,
- 0x188c: 68,
- 0x188d: 68,
- 0x188e: 68,
- 0x188f: 68,
- 0x1890: 68,
- 0x1891: 68,
- 0x1892: 68,
- 0x1893: 68,
- 0x1894: 68,
- 0x1895: 68,
- 0x1896: 68,
- 0x1897: 68,
- 0x1898: 68,
- 0x1899: 68,
- 0x189a: 68,
- 0x189b: 68,
- 0x189c: 68,
- 0x189d: 68,
- 0x189e: 68,
- 0x189f: 68,
- 0x18a0: 68,
- 0x18a1: 68,
- 0x18a2: 68,
- 0x18a3: 68,
- 0x18a4: 68,
- 0x18a5: 68,
- 0x18a6: 68,
- 0x18a7: 68,
- 0x18a8: 68,
- 0x18aa: 68,
- 0x200c: 85,
- 0x200d: 67,
- 0x202f: 85,
- 0x2066: 85,
- 0x2067: 85,
- 0x2068: 85,
- 0x2069: 85,
- 0xa840: 68,
- 0xa841: 68,
- 0xa842: 68,
- 0xa843: 68,
- 0xa844: 68,
- 0xa845: 68,
- 0xa846: 68,
- 0xa847: 68,
- 0xa848: 68,
- 0xa849: 68,
- 0xa84a: 68,
- 0xa84b: 68,
- 0xa84c: 68,
- 0xa84d: 68,
- 0xa84e: 68,
- 0xa84f: 68,
- 0xa850: 68,
- 0xa851: 68,
- 0xa852: 68,
- 0xa853: 68,
- 0xa854: 68,
- 0xa855: 68,
- 0xa856: 68,
- 0xa857: 68,
- 0xa858: 68,
- 0xa859: 68,
- 0xa85a: 68,
- 0xa85b: 68,
- 0xa85c: 68,
- 0xa85d: 68,
- 0xa85e: 68,
- 0xa85f: 68,
- 0xa860: 68,
- 0xa861: 68,
- 0xa862: 68,
- 0xa863: 68,
- 0xa864: 68,
- 0xa865: 68,
- 0xa866: 68,
- 0xa867: 68,
- 0xa868: 68,
- 0xa869: 68,
- 0xa86a: 68,
- 0xa86b: 68,
- 0xa86c: 68,
- 0xa86d: 68,
- 0xa86e: 68,
- 0xa86f: 68,
- 0xa870: 68,
- 0xa871: 68,
- 0xa872: 76,
- 0xa873: 85,
- 0x10ac0: 68,
- 0x10ac1: 68,
- 0x10ac2: 68,
- 0x10ac3: 68,
- 0x10ac4: 68,
- 0x10ac5: 82,
- 0x10ac6: 85,
- 0x10ac7: 82,
- 0x10ac8: 85,
- 0x10ac9: 82,
- 0x10aca: 82,
- 0x10acb: 85,
- 0x10acc: 85,
- 0x10acd: 76,
- 0x10ace: 82,
- 0x10acf: 82,
- 0x10ad0: 82,
- 0x10ad1: 82,
- 0x10ad2: 82,
- 0x10ad3: 68,
- 0x10ad4: 68,
- 0x10ad5: 68,
- 0x10ad6: 68,
- 0x10ad7: 76,
- 0x10ad8: 68,
- 0x10ad9: 68,
- 0x10ada: 68,
- 0x10adb: 68,
- 0x10adc: 68,
- 0x10add: 82,
- 0x10ade: 68,
- 0x10adf: 68,
- 0x10ae0: 68,
- 0x10ae1: 82,
- 0x10ae2: 85,
- 0x10ae3: 85,
- 0x10ae4: 82,
- 0x10aeb: 68,
- 0x10aec: 68,
- 0x10aed: 68,
- 0x10aee: 68,
- 0x10aef: 82,
- 0x10b80: 68,
- 0x10b81: 82,
- 0x10b82: 68,
- 0x10b83: 82,
- 0x10b84: 82,
- 0x10b85: 82,
- 0x10b86: 68,
- 0x10b87: 68,
- 0x10b88: 68,
- 0x10b89: 82,
- 0x10b8a: 68,
- 0x10b8b: 68,
- 0x10b8c: 82,
- 0x10b8d: 68,
- 0x10b8e: 82,
- 0x10b8f: 82,
- 0x10b90: 68,
- 0x10b91: 82,
- 0x10ba9: 82,
- 0x10baa: 82,
- 0x10bab: 82,
- 0x10bac: 82,
- 0x10bad: 68,
- 0x10bae: 68,
- 0x10baf: 85,
- 0x10d00: 76,
- 0x10d01: 68,
- 0x10d02: 68,
- 0x10d03: 68,
- 0x10d04: 68,
- 0x10d05: 68,
- 0x10d06: 68,
- 0x10d07: 68,
- 0x10d08: 68,
- 0x10d09: 68,
- 0x10d0a: 68,
- 0x10d0b: 68,
- 0x10d0c: 68,
- 0x10d0d: 68,
- 0x10d0e: 68,
- 0x10d0f: 68,
- 0x10d10: 68,
- 0x10d11: 68,
- 0x10d12: 68,
- 0x10d13: 68,
- 0x10d14: 68,
- 0x10d15: 68,
- 0x10d16: 68,
- 0x10d17: 68,
- 0x10d18: 68,
- 0x10d19: 68,
- 0x10d1a: 68,
- 0x10d1b: 68,
- 0x10d1c: 68,
- 0x10d1d: 68,
- 0x10d1e: 68,
- 0x10d1f: 68,
- 0x10d20: 68,
- 0x10d21: 68,
- 0x10d22: 82,
- 0x10d23: 68,
- 0x10f30: 68,
- 0x10f31: 68,
- 0x10f32: 68,
- 0x10f33: 82,
- 0x10f34: 68,
- 0x10f35: 68,
- 0x10f36: 68,
- 0x10f37: 68,
- 0x10f38: 68,
- 0x10f39: 68,
- 0x10f3a: 68,
- 0x10f3b: 68,
- 0x10f3c: 68,
- 0x10f3d: 68,
- 0x10f3e: 68,
- 0x10f3f: 68,
- 0x10f40: 68,
- 0x10f41: 68,
- 0x10f42: 68,
- 0x10f43: 68,
- 0x10f44: 68,
- 0x10f45: 85,
- 0x10f51: 68,
- 0x10f52: 68,
- 0x10f53: 68,
- 0x10f54: 82,
- 0x10f70: 68,
- 0x10f71: 68,
- 0x10f72: 68,
- 0x10f73: 68,
- 0x10f74: 82,
- 0x10f75: 82,
- 0x10f76: 68,
- 0x10f77: 68,
- 0x10f78: 68,
- 0x10f79: 68,
- 0x10f7a: 68,
- 0x10f7b: 68,
- 0x10f7c: 68,
- 0x10f7d: 68,
- 0x10f7e: 68,
- 0x10f7f: 68,
- 0x10f80: 68,
- 0x10f81: 68,
- 0x10fb0: 68,
- 0x10fb1: 85,
- 0x10fb2: 68,
- 0x10fb3: 68,
- 0x10fb4: 82,
- 0x10fb5: 82,
- 0x10fb6: 82,
- 0x10fb7: 85,
- 0x10fb8: 68,
- 0x10fb9: 82,
- 0x10fba: 82,
- 0x10fbb: 68,
- 0x10fbc: 68,
- 0x10fbd: 82,
- 0x10fbe: 68,
- 0x10fbf: 68,
- 0x10fc0: 85,
- 0x10fc1: 68,
- 0x10fc2: 82,
- 0x10fc3: 82,
- 0x10fc4: 68,
- 0x10fc5: 85,
- 0x10fc6: 85,
- 0x10fc7: 85,
- 0x10fc8: 85,
- 0x10fc9: 82,
- 0x10fca: 68,
- 0x10fcb: 76,
- 0x110bd: 85,
- 0x110cd: 85,
- 0x1e900: 68,
- 0x1e901: 68,
- 0x1e902: 68,
- 0x1e903: 68,
- 0x1e904: 68,
- 0x1e905: 68,
- 0x1e906: 68,
- 0x1e907: 68,
- 0x1e908: 68,
- 0x1e909: 68,
- 0x1e90a: 68,
- 0x1e90b: 68,
- 0x1e90c: 68,
- 0x1e90d: 68,
- 0x1e90e: 68,
- 0x1e90f: 68,
- 0x1e910: 68,
- 0x1e911: 68,
- 0x1e912: 68,
- 0x1e913: 68,
- 0x1e914: 68,
- 0x1e915: 68,
- 0x1e916: 68,
- 0x1e917: 68,
- 0x1e918: 68,
- 0x1e919: 68,
- 0x1e91a: 68,
- 0x1e91b: 68,
- 0x1e91c: 68,
- 0x1e91d: 68,
- 0x1e91e: 68,
- 0x1e91f: 68,
- 0x1e920: 68,
- 0x1e921: 68,
- 0x1e922: 68,
- 0x1e923: 68,
- 0x1e924: 68,
- 0x1e925: 68,
- 0x1e926: 68,
- 0x1e927: 68,
- 0x1e928: 68,
- 0x1e929: 68,
- 0x1e92a: 68,
- 0x1e92b: 68,
- 0x1e92c: 68,
- 0x1e92d: 68,
- 0x1e92e: 68,
- 0x1e92f: 68,
- 0x1e930: 68,
- 0x1e931: 68,
- 0x1e932: 68,
- 0x1e933: 68,
- 0x1e934: 68,
- 0x1e935: 68,
- 0x1e936: 68,
- 0x1e937: 68,
- 0x1e938: 68,
- 0x1e939: 68,
- 0x1e93a: 68,
- 0x1e93b: 68,
- 0x1e93c: 68,
- 0x1e93d: 68,
- 0x1e93e: 68,
- 0x1e93f: 68,
- 0x1e940: 68,
- 0x1e941: 68,
- 0x1e942: 68,
- 0x1e943: 68,
- 0x1e94b: 84,
-}
-codepoint_classes = {
- 'PVALID': (
- 0x2d0000002e,
- 0x300000003a,
- 0x610000007b,
- 0xdf000000f7,
- 0xf800000100,
- 0x10100000102,
- 0x10300000104,
- 0x10500000106,
- 0x10700000108,
- 0x1090000010a,
- 0x10b0000010c,
- 0x10d0000010e,
- 0x10f00000110,
- 0x11100000112,
- 0x11300000114,
- 0x11500000116,
- 0x11700000118,
- 0x1190000011a,
- 0x11b0000011c,
- 0x11d0000011e,
- 0x11f00000120,
- 0x12100000122,
- 0x12300000124,
- 0x12500000126,
- 0x12700000128,
- 0x1290000012a,
- 0x12b0000012c,
- 0x12d0000012e,
- 0x12f00000130,
- 0x13100000132,
- 0x13500000136,
- 0x13700000139,
- 0x13a0000013b,
- 0x13c0000013d,
- 0x13e0000013f,
- 0x14200000143,
- 0x14400000145,
- 0x14600000147,
- 0x14800000149,
- 0x14b0000014c,
- 0x14d0000014e,
- 0x14f00000150,
- 0x15100000152,
- 0x15300000154,
- 0x15500000156,
- 0x15700000158,
- 0x1590000015a,
- 0x15b0000015c,
- 0x15d0000015e,
- 0x15f00000160,
- 0x16100000162,
- 0x16300000164,
- 0x16500000166,
- 0x16700000168,
- 0x1690000016a,
- 0x16b0000016c,
- 0x16d0000016e,
- 0x16f00000170,
- 0x17100000172,
- 0x17300000174,
- 0x17500000176,
- 0x17700000178,
- 0x17a0000017b,
- 0x17c0000017d,
- 0x17e0000017f,
- 0x18000000181,
- 0x18300000184,
- 0x18500000186,
- 0x18800000189,
- 0x18c0000018e,
- 0x19200000193,
- 0x19500000196,
- 0x1990000019c,
- 0x19e0000019f,
- 0x1a1000001a2,
- 0x1a3000001a4,
- 0x1a5000001a6,
- 0x1a8000001a9,
- 0x1aa000001ac,
- 0x1ad000001ae,
- 0x1b0000001b1,
- 0x1b4000001b5,
- 0x1b6000001b7,
- 0x1b9000001bc,
- 0x1bd000001c4,
- 0x1ce000001cf,
- 0x1d0000001d1,
- 0x1d2000001d3,
- 0x1d4000001d5,
- 0x1d6000001d7,
- 0x1d8000001d9,
- 0x1da000001db,
- 0x1dc000001de,
- 0x1df000001e0,
- 0x1e1000001e2,
- 0x1e3000001e4,
- 0x1e5000001e6,
- 0x1e7000001e8,
- 0x1e9000001ea,
- 0x1eb000001ec,
- 0x1ed000001ee,
- 0x1ef000001f1,
- 0x1f5000001f6,
- 0x1f9000001fa,
- 0x1fb000001fc,
- 0x1fd000001fe,
- 0x1ff00000200,
- 0x20100000202,
- 0x20300000204,
- 0x20500000206,
- 0x20700000208,
- 0x2090000020a,
- 0x20b0000020c,
- 0x20d0000020e,
- 0x20f00000210,
- 0x21100000212,
- 0x21300000214,
- 0x21500000216,
- 0x21700000218,
- 0x2190000021a,
- 0x21b0000021c,
- 0x21d0000021e,
- 0x21f00000220,
- 0x22100000222,
- 0x22300000224,
- 0x22500000226,
- 0x22700000228,
- 0x2290000022a,
- 0x22b0000022c,
- 0x22d0000022e,
- 0x22f00000230,
- 0x23100000232,
- 0x2330000023a,
- 0x23c0000023d,
- 0x23f00000241,
- 0x24200000243,
- 0x24700000248,
- 0x2490000024a,
- 0x24b0000024c,
- 0x24d0000024e,
- 0x24f000002b0,
- 0x2b9000002c2,
- 0x2c6000002d2,
- 0x2ec000002ed,
- 0x2ee000002ef,
- 0x30000000340,
- 0x34200000343,
- 0x3460000034f,
- 0x35000000370,
- 0x37100000372,
- 0x37300000374,
- 0x37700000378,
- 0x37b0000037e,
- 0x39000000391,
- 0x3ac000003cf,
- 0x3d7000003d8,
- 0x3d9000003da,
- 0x3db000003dc,
- 0x3dd000003de,
- 0x3df000003e0,
- 0x3e1000003e2,
- 0x3e3000003e4,
- 0x3e5000003e6,
- 0x3e7000003e8,
- 0x3e9000003ea,
- 0x3eb000003ec,
- 0x3ed000003ee,
- 0x3ef000003f0,
- 0x3f3000003f4,
- 0x3f8000003f9,
- 0x3fb000003fd,
- 0x43000000460,
- 0x46100000462,
- 0x46300000464,
- 0x46500000466,
- 0x46700000468,
- 0x4690000046a,
- 0x46b0000046c,
- 0x46d0000046e,
- 0x46f00000470,
- 0x47100000472,
- 0x47300000474,
- 0x47500000476,
- 0x47700000478,
- 0x4790000047a,
- 0x47b0000047c,
- 0x47d0000047e,
- 0x47f00000480,
- 0x48100000482,
- 0x48300000488,
- 0x48b0000048c,
- 0x48d0000048e,
- 0x48f00000490,
- 0x49100000492,
- 0x49300000494,
- 0x49500000496,
- 0x49700000498,
- 0x4990000049a,
- 0x49b0000049c,
- 0x49d0000049e,
- 0x49f000004a0,
- 0x4a1000004a2,
- 0x4a3000004a4,
- 0x4a5000004a6,
- 0x4a7000004a8,
- 0x4a9000004aa,
- 0x4ab000004ac,
- 0x4ad000004ae,
- 0x4af000004b0,
- 0x4b1000004b2,
- 0x4b3000004b4,
- 0x4b5000004b6,
- 0x4b7000004b8,
- 0x4b9000004ba,
- 0x4bb000004bc,
- 0x4bd000004be,
- 0x4bf000004c0,
- 0x4c2000004c3,
- 0x4c4000004c5,
- 0x4c6000004c7,
- 0x4c8000004c9,
- 0x4ca000004cb,
- 0x4cc000004cd,
- 0x4ce000004d0,
- 0x4d1000004d2,
- 0x4d3000004d4,
- 0x4d5000004d6,
- 0x4d7000004d8,
- 0x4d9000004da,
- 0x4db000004dc,
- 0x4dd000004de,
- 0x4df000004e0,
- 0x4e1000004e2,
- 0x4e3000004e4,
- 0x4e5000004e6,
- 0x4e7000004e8,
- 0x4e9000004ea,
- 0x4eb000004ec,
- 0x4ed000004ee,
- 0x4ef000004f0,
- 0x4f1000004f2,
- 0x4f3000004f4,
- 0x4f5000004f6,
- 0x4f7000004f8,
- 0x4f9000004fa,
- 0x4fb000004fc,
- 0x4fd000004fe,
- 0x4ff00000500,
- 0x50100000502,
- 0x50300000504,
- 0x50500000506,
- 0x50700000508,
- 0x5090000050a,
- 0x50b0000050c,
- 0x50d0000050e,
- 0x50f00000510,
- 0x51100000512,
- 0x51300000514,
- 0x51500000516,
- 0x51700000518,
- 0x5190000051a,
- 0x51b0000051c,
- 0x51d0000051e,
- 0x51f00000520,
- 0x52100000522,
- 0x52300000524,
- 0x52500000526,
- 0x52700000528,
- 0x5290000052a,
- 0x52b0000052c,
- 0x52d0000052e,
- 0x52f00000530,
- 0x5590000055a,
- 0x56000000587,
- 0x58800000589,
- 0x591000005be,
- 0x5bf000005c0,
- 0x5c1000005c3,
- 0x5c4000005c6,
- 0x5c7000005c8,
- 0x5d0000005eb,
- 0x5ef000005f3,
- 0x6100000061b,
- 0x62000000640,
- 0x64100000660,
- 0x66e00000675,
- 0x679000006d4,
- 0x6d5000006dd,
- 0x6df000006e9,
- 0x6ea000006f0,
- 0x6fa00000700,
- 0x7100000074b,
- 0x74d000007b2,
- 0x7c0000007f6,
- 0x7fd000007fe,
- 0x8000000082e,
- 0x8400000085c,
- 0x8600000086b,
- 0x87000000888,
- 0x8890000088f,
- 0x898000008e2,
- 0x8e300000958,
- 0x96000000964,
- 0x96600000970,
- 0x97100000984,
- 0x9850000098d,
- 0x98f00000991,
- 0x993000009a9,
- 0x9aa000009b1,
- 0x9b2000009b3,
- 0x9b6000009ba,
- 0x9bc000009c5,
- 0x9c7000009c9,
- 0x9cb000009cf,
- 0x9d7000009d8,
- 0x9e0000009e4,
- 0x9e6000009f2,
- 0x9fc000009fd,
- 0x9fe000009ff,
- 0xa0100000a04,
- 0xa0500000a0b,
- 0xa0f00000a11,
- 0xa1300000a29,
- 0xa2a00000a31,
- 0xa3200000a33,
- 0xa3500000a36,
- 0xa3800000a3a,
- 0xa3c00000a3d,
- 0xa3e00000a43,
- 0xa4700000a49,
- 0xa4b00000a4e,
- 0xa5100000a52,
- 0xa5c00000a5d,
- 0xa6600000a76,
- 0xa8100000a84,
- 0xa8500000a8e,
- 0xa8f00000a92,
- 0xa9300000aa9,
- 0xaaa00000ab1,
- 0xab200000ab4,
- 0xab500000aba,
- 0xabc00000ac6,
- 0xac700000aca,
- 0xacb00000ace,
- 0xad000000ad1,
- 0xae000000ae4,
- 0xae600000af0,
- 0xaf900000b00,
- 0xb0100000b04,
- 0xb0500000b0d,
- 0xb0f00000b11,
- 0xb1300000b29,
- 0xb2a00000b31,
- 0xb3200000b34,
- 0xb3500000b3a,
- 0xb3c00000b45,
- 0xb4700000b49,
- 0xb4b00000b4e,
- 0xb5500000b58,
- 0xb5f00000b64,
- 0xb6600000b70,
- 0xb7100000b72,
- 0xb8200000b84,
- 0xb8500000b8b,
- 0xb8e00000b91,
- 0xb9200000b96,
- 0xb9900000b9b,
- 0xb9c00000b9d,
- 0xb9e00000ba0,
- 0xba300000ba5,
- 0xba800000bab,
- 0xbae00000bba,
- 0xbbe00000bc3,
- 0xbc600000bc9,
- 0xbca00000bce,
- 0xbd000000bd1,
- 0xbd700000bd8,
- 0xbe600000bf0,
- 0xc0000000c0d,
- 0xc0e00000c11,
- 0xc1200000c29,
- 0xc2a00000c3a,
- 0xc3c00000c45,
- 0xc4600000c49,
- 0xc4a00000c4e,
- 0xc5500000c57,
- 0xc5800000c5b,
- 0xc5d00000c5e,
- 0xc6000000c64,
- 0xc6600000c70,
- 0xc8000000c84,
- 0xc8500000c8d,
- 0xc8e00000c91,
- 0xc9200000ca9,
- 0xcaa00000cb4,
- 0xcb500000cba,
- 0xcbc00000cc5,
- 0xcc600000cc9,
- 0xcca00000cce,
- 0xcd500000cd7,
- 0xcdd00000cdf,
- 0xce000000ce4,
- 0xce600000cf0,
- 0xcf100000cf3,
- 0xd0000000d0d,
- 0xd0e00000d11,
- 0xd1200000d45,
- 0xd4600000d49,
- 0xd4a00000d4f,
- 0xd5400000d58,
- 0xd5f00000d64,
- 0xd6600000d70,
- 0xd7a00000d80,
- 0xd8100000d84,
- 0xd8500000d97,
- 0xd9a00000db2,
- 0xdb300000dbc,
- 0xdbd00000dbe,
- 0xdc000000dc7,
- 0xdca00000dcb,
- 0xdcf00000dd5,
- 0xdd600000dd7,
- 0xdd800000de0,
- 0xde600000df0,
- 0xdf200000df4,
- 0xe0100000e33,
- 0xe3400000e3b,
- 0xe4000000e4f,
- 0xe5000000e5a,
- 0xe8100000e83,
- 0xe8400000e85,
- 0xe8600000e8b,
- 0xe8c00000ea4,
- 0xea500000ea6,
- 0xea700000eb3,
- 0xeb400000ebe,
- 0xec000000ec5,
- 0xec600000ec7,
- 0xec800000ece,
- 0xed000000eda,
- 0xede00000ee0,
- 0xf0000000f01,
- 0xf0b00000f0c,
- 0xf1800000f1a,
- 0xf2000000f2a,
- 0xf3500000f36,
- 0xf3700000f38,
- 0xf3900000f3a,
- 0xf3e00000f43,
- 0xf4400000f48,
- 0xf4900000f4d,
- 0xf4e00000f52,
- 0xf5300000f57,
- 0xf5800000f5c,
- 0xf5d00000f69,
- 0xf6a00000f6d,
- 0xf7100000f73,
- 0xf7400000f75,
- 0xf7a00000f81,
- 0xf8200000f85,
- 0xf8600000f93,
- 0xf9400000f98,
- 0xf9900000f9d,
- 0xf9e00000fa2,
- 0xfa300000fa7,
- 0xfa800000fac,
- 0xfad00000fb9,
- 0xfba00000fbd,
- 0xfc600000fc7,
- 0x10000000104a,
- 0x10500000109e,
- 0x10d0000010fb,
- 0x10fd00001100,
- 0x120000001249,
- 0x124a0000124e,
- 0x125000001257,
- 0x125800001259,
- 0x125a0000125e,
- 0x126000001289,
- 0x128a0000128e,
- 0x1290000012b1,
- 0x12b2000012b6,
- 0x12b8000012bf,
- 0x12c0000012c1,
- 0x12c2000012c6,
- 0x12c8000012d7,
- 0x12d800001311,
- 0x131200001316,
- 0x13180000135b,
- 0x135d00001360,
- 0x138000001390,
- 0x13a0000013f6,
- 0x14010000166d,
- 0x166f00001680,
- 0x16810000169b,
- 0x16a0000016eb,
- 0x16f1000016f9,
- 0x170000001716,
- 0x171f00001735,
- 0x174000001754,
- 0x17600000176d,
- 0x176e00001771,
- 0x177200001774,
- 0x1780000017b4,
- 0x17b6000017d4,
- 0x17d7000017d8,
- 0x17dc000017de,
- 0x17e0000017ea,
- 0x18100000181a,
- 0x182000001879,
- 0x1880000018ab,
- 0x18b0000018f6,
- 0x19000000191f,
- 0x19200000192c,
- 0x19300000193c,
- 0x19460000196e,
- 0x197000001975,
- 0x1980000019ac,
- 0x19b0000019ca,
- 0x19d0000019da,
- 0x1a0000001a1c,
- 0x1a2000001a5f,
- 0x1a6000001a7d,
- 0x1a7f00001a8a,
- 0x1a9000001a9a,
- 0x1aa700001aa8,
- 0x1ab000001abe,
- 0x1abf00001acf,
- 0x1b0000001b4d,
- 0x1b5000001b5a,
- 0x1b6b00001b74,
- 0x1b8000001bf4,
- 0x1c0000001c38,
- 0x1c4000001c4a,
- 0x1c4d00001c7e,
- 0x1cd000001cd3,
- 0x1cd400001cfb,
- 0x1d0000001d2c,
- 0x1d2f00001d30,
- 0x1d3b00001d3c,
- 0x1d4e00001d4f,
- 0x1d6b00001d78,
- 0x1d7900001d9b,
- 0x1dc000001e00,
- 0x1e0100001e02,
- 0x1e0300001e04,
- 0x1e0500001e06,
- 0x1e0700001e08,
- 0x1e0900001e0a,
- 0x1e0b00001e0c,
- 0x1e0d00001e0e,
- 0x1e0f00001e10,
- 0x1e1100001e12,
- 0x1e1300001e14,
- 0x1e1500001e16,
- 0x1e1700001e18,
- 0x1e1900001e1a,
- 0x1e1b00001e1c,
- 0x1e1d00001e1e,
- 0x1e1f00001e20,
- 0x1e2100001e22,
- 0x1e2300001e24,
- 0x1e2500001e26,
- 0x1e2700001e28,
- 0x1e2900001e2a,
- 0x1e2b00001e2c,
- 0x1e2d00001e2e,
- 0x1e2f00001e30,
- 0x1e3100001e32,
- 0x1e3300001e34,
- 0x1e3500001e36,
- 0x1e3700001e38,
- 0x1e3900001e3a,
- 0x1e3b00001e3c,
- 0x1e3d00001e3e,
- 0x1e3f00001e40,
- 0x1e4100001e42,
- 0x1e4300001e44,
- 0x1e4500001e46,
- 0x1e4700001e48,
- 0x1e4900001e4a,
- 0x1e4b00001e4c,
- 0x1e4d00001e4e,
- 0x1e4f00001e50,
- 0x1e5100001e52,
- 0x1e5300001e54,
- 0x1e5500001e56,
- 0x1e5700001e58,
- 0x1e5900001e5a,
- 0x1e5b00001e5c,
- 0x1e5d00001e5e,
- 0x1e5f00001e60,
- 0x1e6100001e62,
- 0x1e6300001e64,
- 0x1e6500001e66,
- 0x1e6700001e68,
- 0x1e6900001e6a,
- 0x1e6b00001e6c,
- 0x1e6d00001e6e,
- 0x1e6f00001e70,
- 0x1e7100001e72,
- 0x1e7300001e74,
- 0x1e7500001e76,
- 0x1e7700001e78,
- 0x1e7900001e7a,
- 0x1e7b00001e7c,
- 0x1e7d00001e7e,
- 0x1e7f00001e80,
- 0x1e8100001e82,
- 0x1e8300001e84,
- 0x1e8500001e86,
- 0x1e8700001e88,
- 0x1e8900001e8a,
- 0x1e8b00001e8c,
- 0x1e8d00001e8e,
- 0x1e8f00001e90,
- 0x1e9100001e92,
- 0x1e9300001e94,
- 0x1e9500001e9a,
- 0x1e9c00001e9e,
- 0x1e9f00001ea0,
- 0x1ea100001ea2,
- 0x1ea300001ea4,
- 0x1ea500001ea6,
- 0x1ea700001ea8,
- 0x1ea900001eaa,
- 0x1eab00001eac,
- 0x1ead00001eae,
- 0x1eaf00001eb0,
- 0x1eb100001eb2,
- 0x1eb300001eb4,
- 0x1eb500001eb6,
- 0x1eb700001eb8,
- 0x1eb900001eba,
- 0x1ebb00001ebc,
- 0x1ebd00001ebe,
- 0x1ebf00001ec0,
- 0x1ec100001ec2,
- 0x1ec300001ec4,
- 0x1ec500001ec6,
- 0x1ec700001ec8,
- 0x1ec900001eca,
- 0x1ecb00001ecc,
- 0x1ecd00001ece,
- 0x1ecf00001ed0,
- 0x1ed100001ed2,
- 0x1ed300001ed4,
- 0x1ed500001ed6,
- 0x1ed700001ed8,
- 0x1ed900001eda,
- 0x1edb00001edc,
- 0x1edd00001ede,
- 0x1edf00001ee0,
- 0x1ee100001ee2,
- 0x1ee300001ee4,
- 0x1ee500001ee6,
- 0x1ee700001ee8,
- 0x1ee900001eea,
- 0x1eeb00001eec,
- 0x1eed00001eee,
- 0x1eef00001ef0,
- 0x1ef100001ef2,
- 0x1ef300001ef4,
- 0x1ef500001ef6,
- 0x1ef700001ef8,
- 0x1ef900001efa,
- 0x1efb00001efc,
- 0x1efd00001efe,
- 0x1eff00001f08,
- 0x1f1000001f16,
- 0x1f2000001f28,
- 0x1f3000001f38,
- 0x1f4000001f46,
- 0x1f5000001f58,
- 0x1f6000001f68,
- 0x1f7000001f71,
- 0x1f7200001f73,
- 0x1f7400001f75,
- 0x1f7600001f77,
- 0x1f7800001f79,
- 0x1f7a00001f7b,
- 0x1f7c00001f7d,
- 0x1fb000001fb2,
- 0x1fb600001fb7,
- 0x1fc600001fc7,
- 0x1fd000001fd3,
- 0x1fd600001fd8,
- 0x1fe000001fe3,
- 0x1fe400001fe8,
- 0x1ff600001ff7,
- 0x214e0000214f,
- 0x218400002185,
- 0x2c3000002c60,
- 0x2c6100002c62,
- 0x2c6500002c67,
- 0x2c6800002c69,
- 0x2c6a00002c6b,
- 0x2c6c00002c6d,
- 0x2c7100002c72,
- 0x2c7300002c75,
- 0x2c7600002c7c,
- 0x2c8100002c82,
- 0x2c8300002c84,
- 0x2c8500002c86,
- 0x2c8700002c88,
- 0x2c8900002c8a,
- 0x2c8b00002c8c,
- 0x2c8d00002c8e,
- 0x2c8f00002c90,
- 0x2c9100002c92,
- 0x2c9300002c94,
- 0x2c9500002c96,
- 0x2c9700002c98,
- 0x2c9900002c9a,
- 0x2c9b00002c9c,
- 0x2c9d00002c9e,
- 0x2c9f00002ca0,
- 0x2ca100002ca2,
- 0x2ca300002ca4,
- 0x2ca500002ca6,
- 0x2ca700002ca8,
- 0x2ca900002caa,
- 0x2cab00002cac,
- 0x2cad00002cae,
- 0x2caf00002cb0,
- 0x2cb100002cb2,
- 0x2cb300002cb4,
- 0x2cb500002cb6,
- 0x2cb700002cb8,
- 0x2cb900002cba,
- 0x2cbb00002cbc,
- 0x2cbd00002cbe,
- 0x2cbf00002cc0,
- 0x2cc100002cc2,
- 0x2cc300002cc4,
- 0x2cc500002cc6,
- 0x2cc700002cc8,
- 0x2cc900002cca,
- 0x2ccb00002ccc,
- 0x2ccd00002cce,
- 0x2ccf00002cd0,
- 0x2cd100002cd2,
- 0x2cd300002cd4,
- 0x2cd500002cd6,
- 0x2cd700002cd8,
- 0x2cd900002cda,
- 0x2cdb00002cdc,
- 0x2cdd00002cde,
- 0x2cdf00002ce0,
- 0x2ce100002ce2,
- 0x2ce300002ce5,
- 0x2cec00002ced,
- 0x2cee00002cf2,
- 0x2cf300002cf4,
- 0x2d0000002d26,
- 0x2d2700002d28,
- 0x2d2d00002d2e,
- 0x2d3000002d68,
- 0x2d7f00002d97,
- 0x2da000002da7,
- 0x2da800002daf,
- 0x2db000002db7,
- 0x2db800002dbf,
- 0x2dc000002dc7,
- 0x2dc800002dcf,
- 0x2dd000002dd7,
- 0x2dd800002ddf,
- 0x2de000002e00,
- 0x2e2f00002e30,
- 0x300500003008,
- 0x302a0000302e,
- 0x303c0000303d,
- 0x304100003097,
- 0x30990000309b,
- 0x309d0000309f,
- 0x30a1000030fb,
- 0x30fc000030ff,
- 0x310500003130,
- 0x31a0000031c0,
- 0x31f000003200,
- 0x340000004dc0,
- 0x4e000000a48d,
- 0xa4d00000a4fe,
- 0xa5000000a60d,
- 0xa6100000a62c,
- 0xa6410000a642,
- 0xa6430000a644,
- 0xa6450000a646,
- 0xa6470000a648,
- 0xa6490000a64a,
- 0xa64b0000a64c,
- 0xa64d0000a64e,
- 0xa64f0000a650,
- 0xa6510000a652,
- 0xa6530000a654,
- 0xa6550000a656,
- 0xa6570000a658,
- 0xa6590000a65a,
- 0xa65b0000a65c,
- 0xa65d0000a65e,
- 0xa65f0000a660,
- 0xa6610000a662,
- 0xa6630000a664,
- 0xa6650000a666,
- 0xa6670000a668,
- 0xa6690000a66a,
- 0xa66b0000a66c,
- 0xa66d0000a670,
- 0xa6740000a67e,
- 0xa67f0000a680,
- 0xa6810000a682,
- 0xa6830000a684,
- 0xa6850000a686,
- 0xa6870000a688,
- 0xa6890000a68a,
- 0xa68b0000a68c,
- 0xa68d0000a68e,
- 0xa68f0000a690,
- 0xa6910000a692,
- 0xa6930000a694,
- 0xa6950000a696,
- 0xa6970000a698,
- 0xa6990000a69a,
- 0xa69b0000a69c,
- 0xa69e0000a6e6,
- 0xa6f00000a6f2,
- 0xa7170000a720,
- 0xa7230000a724,
- 0xa7250000a726,
- 0xa7270000a728,
- 0xa7290000a72a,
- 0xa72b0000a72c,
- 0xa72d0000a72e,
- 0xa72f0000a732,
- 0xa7330000a734,
- 0xa7350000a736,
- 0xa7370000a738,
- 0xa7390000a73a,
- 0xa73b0000a73c,
- 0xa73d0000a73e,
- 0xa73f0000a740,
- 0xa7410000a742,
- 0xa7430000a744,
- 0xa7450000a746,
- 0xa7470000a748,
- 0xa7490000a74a,
- 0xa74b0000a74c,
- 0xa74d0000a74e,
- 0xa74f0000a750,
- 0xa7510000a752,
- 0xa7530000a754,
- 0xa7550000a756,
- 0xa7570000a758,
- 0xa7590000a75a,
- 0xa75b0000a75c,
- 0xa75d0000a75e,
- 0xa75f0000a760,
- 0xa7610000a762,
- 0xa7630000a764,
- 0xa7650000a766,
- 0xa7670000a768,
- 0xa7690000a76a,
- 0xa76b0000a76c,
- 0xa76d0000a76e,
- 0xa76f0000a770,
- 0xa7710000a779,
- 0xa77a0000a77b,
- 0xa77c0000a77d,
- 0xa77f0000a780,
- 0xa7810000a782,
- 0xa7830000a784,
- 0xa7850000a786,
- 0xa7870000a789,
- 0xa78c0000a78d,
- 0xa78e0000a790,
- 0xa7910000a792,
- 0xa7930000a796,
- 0xa7970000a798,
- 0xa7990000a79a,
- 0xa79b0000a79c,
- 0xa79d0000a79e,
- 0xa79f0000a7a0,
- 0xa7a10000a7a2,
- 0xa7a30000a7a4,
- 0xa7a50000a7a6,
- 0xa7a70000a7a8,
- 0xa7a90000a7aa,
- 0xa7af0000a7b0,
- 0xa7b50000a7b6,
- 0xa7b70000a7b8,
- 0xa7b90000a7ba,
- 0xa7bb0000a7bc,
- 0xa7bd0000a7be,
- 0xa7bf0000a7c0,
- 0xa7c10000a7c2,
- 0xa7c30000a7c4,
- 0xa7c80000a7c9,
- 0xa7ca0000a7cb,
- 0xa7d10000a7d2,
- 0xa7d30000a7d4,
- 0xa7d50000a7d6,
- 0xa7d70000a7d8,
- 0xa7d90000a7da,
- 0xa7f20000a7f5,
- 0xa7f60000a7f8,
- 0xa7fa0000a828,
- 0xa82c0000a82d,
- 0xa8400000a874,
- 0xa8800000a8c6,
- 0xa8d00000a8da,
- 0xa8e00000a8f8,
- 0xa8fb0000a8fc,
- 0xa8fd0000a92e,
- 0xa9300000a954,
- 0xa9800000a9c1,
- 0xa9cf0000a9da,
- 0xa9e00000a9ff,
- 0xaa000000aa37,
- 0xaa400000aa4e,
- 0xaa500000aa5a,
- 0xaa600000aa77,
- 0xaa7a0000aac3,
- 0xaadb0000aade,
- 0xaae00000aaf0,
- 0xaaf20000aaf7,
- 0xab010000ab07,
- 0xab090000ab0f,
- 0xab110000ab17,
- 0xab200000ab27,
- 0xab280000ab2f,
- 0xab300000ab5b,
- 0xab600000ab6a,
- 0xabc00000abeb,
- 0xabec0000abee,
- 0xabf00000abfa,
- 0xac000000d7a4,
- 0xfa0e0000fa10,
- 0xfa110000fa12,
- 0xfa130000fa15,
- 0xfa1f0000fa20,
- 0xfa210000fa22,
- 0xfa230000fa25,
- 0xfa270000fa2a,
- 0xfb1e0000fb1f,
- 0xfe200000fe30,
- 0xfe730000fe74,
- 0x100000001000c,
- 0x1000d00010027,
- 0x100280001003b,
- 0x1003c0001003e,
- 0x1003f0001004e,
- 0x100500001005e,
- 0x10080000100fb,
- 0x101fd000101fe,
- 0x102800001029d,
- 0x102a0000102d1,
- 0x102e0000102e1,
- 0x1030000010320,
- 0x1032d00010341,
- 0x103420001034a,
- 0x103500001037b,
- 0x103800001039e,
- 0x103a0000103c4,
- 0x103c8000103d0,
- 0x104280001049e,
- 0x104a0000104aa,
- 0x104d8000104fc,
- 0x1050000010528,
- 0x1053000010564,
- 0x10597000105a2,
- 0x105a3000105b2,
- 0x105b3000105ba,
- 0x105bb000105bd,
- 0x1060000010737,
- 0x1074000010756,
- 0x1076000010768,
- 0x1078000010786,
- 0x10787000107b1,
- 0x107b2000107bb,
- 0x1080000010806,
- 0x1080800010809,
- 0x1080a00010836,
- 0x1083700010839,
- 0x1083c0001083d,
- 0x1083f00010856,
- 0x1086000010877,
- 0x108800001089f,
- 0x108e0000108f3,
- 0x108f4000108f6,
- 0x1090000010916,
- 0x109200001093a,
- 0x10980000109b8,
- 0x109be000109c0,
- 0x10a0000010a04,
- 0x10a0500010a07,
- 0x10a0c00010a14,
- 0x10a1500010a18,
- 0x10a1900010a36,
- 0x10a3800010a3b,
- 0x10a3f00010a40,
- 0x10a6000010a7d,
- 0x10a8000010a9d,
- 0x10ac000010ac8,
- 0x10ac900010ae7,
- 0x10b0000010b36,
- 0x10b4000010b56,
- 0x10b6000010b73,
- 0x10b8000010b92,
- 0x10c0000010c49,
- 0x10cc000010cf3,
- 0x10d0000010d28,
- 0x10d3000010d3a,
- 0x10e8000010eaa,
- 0x10eab00010ead,
- 0x10eb000010eb2,
- 0x10f0000010f1d,
- 0x10f2700010f28,
- 0x10f3000010f51,
- 0x10f7000010f86,
- 0x10fb000010fc5,
- 0x10fe000010ff7,
- 0x1100000011047,
- 0x1106600011076,
- 0x1107f000110bb,
- 0x110c2000110c3,
- 0x110d0000110e9,
- 0x110f0000110fa,
- 0x1110000011135,
- 0x1113600011140,
- 0x1114400011148,
- 0x1115000011174,
- 0x1117600011177,
- 0x11180000111c5,
- 0x111c9000111cd,
- 0x111ce000111db,
- 0x111dc000111dd,
- 0x1120000011212,
- 0x1121300011238,
- 0x1123e0001123f,
- 0x1128000011287,
- 0x1128800011289,
- 0x1128a0001128e,
- 0x1128f0001129e,
- 0x1129f000112a9,
- 0x112b0000112eb,
- 0x112f0000112fa,
- 0x1130000011304,
- 0x113050001130d,
- 0x1130f00011311,
- 0x1131300011329,
- 0x1132a00011331,
- 0x1133200011334,
- 0x113350001133a,
- 0x1133b00011345,
- 0x1134700011349,
- 0x1134b0001134e,
- 0x1135000011351,
- 0x1135700011358,
- 0x1135d00011364,
- 0x113660001136d,
- 0x1137000011375,
- 0x114000001144b,
- 0x114500001145a,
- 0x1145e00011462,
- 0x11480000114c6,
- 0x114c7000114c8,
- 0x114d0000114da,
- 0x11580000115b6,
- 0x115b8000115c1,
- 0x115d8000115de,
- 0x1160000011641,
- 0x1164400011645,
- 0x116500001165a,
- 0x11680000116b9,
- 0x116c0000116ca,
- 0x117000001171b,
- 0x1171d0001172c,
- 0x117300001173a,
- 0x1174000011747,
- 0x118000001183b,
- 0x118c0000118ea,
- 0x118ff00011907,
- 0x119090001190a,
- 0x1190c00011914,
- 0x1191500011917,
- 0x1191800011936,
- 0x1193700011939,
- 0x1193b00011944,
- 0x119500001195a,
- 0x119a0000119a8,
- 0x119aa000119d8,
- 0x119da000119e2,
- 0x119e3000119e5,
- 0x11a0000011a3f,
- 0x11a4700011a48,
- 0x11a5000011a9a,
- 0x11a9d00011a9e,
- 0x11ab000011af9,
- 0x11c0000011c09,
- 0x11c0a00011c37,
- 0x11c3800011c41,
- 0x11c5000011c5a,
- 0x11c7200011c90,
- 0x11c9200011ca8,
- 0x11ca900011cb7,
- 0x11d0000011d07,
- 0x11d0800011d0a,
- 0x11d0b00011d37,
- 0x11d3a00011d3b,
- 0x11d3c00011d3e,
- 0x11d3f00011d48,
- 0x11d5000011d5a,
- 0x11d6000011d66,
- 0x11d6700011d69,
- 0x11d6a00011d8f,
- 0x11d9000011d92,
- 0x11d9300011d99,
- 0x11da000011daa,
- 0x11ee000011ef7,
- 0x11fb000011fb1,
- 0x120000001239a,
- 0x1248000012544,
- 0x12f9000012ff1,
- 0x130000001342f,
- 0x1440000014647,
- 0x1680000016a39,
- 0x16a4000016a5f,
- 0x16a6000016a6a,
- 0x16a7000016abf,
- 0x16ac000016aca,
- 0x16ad000016aee,
- 0x16af000016af5,
- 0x16b0000016b37,
- 0x16b4000016b44,
- 0x16b5000016b5a,
- 0x16b6300016b78,
- 0x16b7d00016b90,
- 0x16e6000016e80,
- 0x16f0000016f4b,
- 0x16f4f00016f88,
- 0x16f8f00016fa0,
- 0x16fe000016fe2,
- 0x16fe300016fe5,
- 0x16ff000016ff2,
- 0x17000000187f8,
- 0x1880000018cd6,
- 0x18d0000018d09,
- 0x1aff00001aff4,
- 0x1aff50001affc,
- 0x1affd0001afff,
- 0x1b0000001b123,
- 0x1b1500001b153,
- 0x1b1640001b168,
- 0x1b1700001b2fc,
- 0x1bc000001bc6b,
- 0x1bc700001bc7d,
- 0x1bc800001bc89,
- 0x1bc900001bc9a,
- 0x1bc9d0001bc9f,
- 0x1cf000001cf2e,
- 0x1cf300001cf47,
- 0x1da000001da37,
- 0x1da3b0001da6d,
- 0x1da750001da76,
- 0x1da840001da85,
- 0x1da9b0001daa0,
- 0x1daa10001dab0,
- 0x1df000001df1f,
- 0x1e0000001e007,
- 0x1e0080001e019,
- 0x1e01b0001e022,
- 0x1e0230001e025,
- 0x1e0260001e02b,
- 0x1e1000001e12d,
- 0x1e1300001e13e,
- 0x1e1400001e14a,
- 0x1e14e0001e14f,
- 0x1e2900001e2af,
- 0x1e2c00001e2fa,
- 0x1e7e00001e7e7,
- 0x1e7e80001e7ec,
- 0x1e7ed0001e7ef,
- 0x1e7f00001e7ff,
- 0x1e8000001e8c5,
- 0x1e8d00001e8d7,
- 0x1e9220001e94c,
- 0x1e9500001e95a,
- 0x1fbf00001fbfa,
- 0x200000002a6e0,
- 0x2a7000002b739,
- 0x2b7400002b81e,
- 0x2b8200002cea2,
- 0x2ceb00002ebe1,
- 0x300000003134b,
- ),
- 'CONTEXTJ': (
- 0x200c0000200e,
- ),
- 'CONTEXTO': (
- 0xb7000000b8,
- 0x37500000376,
- 0x5f3000005f5,
- 0x6600000066a,
- 0x6f0000006fa,
- 0x30fb000030fc,
- ),
-}
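
The generated tables above pack every half-open codepoint range into a single integer, `(start << 32) | end`, which is why the script entries read as long hex constants. The `joining_types` values are just `ord()` of the joining-type letters (68 is `'D'`, 82 is `'R'`, 85 is `'U'`, 76 is `'L'`, 67 is `'C'`, 84 is `'T'`), matching the `ord('T')`/`ord('D')` comparisons in `valid_contextj()` earlier in this diff. Unpacking one entry by hand:

```python
# 0x37000000374 is (0x370 << 32) | 0x374: the half-open run
# U+0370..U+0373 at the head of the 'Greek' script table.
packed = 0x37000000374
start, end = packed >> 32, packed & 0xFFFFFFFF
assert (start, end) == (0x370, 0x374)
print([hex(cp) for cp in range(start, end)])  # ['0x370', '0x371', '0x372', '0x373']
```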
diff --git a/env/lib/python3.9/site-packages/idna/intranges.py b/env/lib/python3.9/site-packages/idna/intranges.py
deleted file mode 100644
index 6a43b04..0000000
--- a/env/lib/python3.9/site-packages/idna/intranges.py
+++ /dev/null
@@ -1,54 +0,0 @@
-"""
-Given a list of integers, made up of (hopefully) a small number of long runs
-of consecutive integers, compute a representation of the form
-((start1, end1), (start2, end2) ...). Then answer the question "was x present
-in the original list?" in time O(log(# runs)).
-"""
-
-import bisect
-from typing import List, Tuple
-
-def intranges_from_list(list_: List[int]) -> Tuple[int, ...]:
- """Represent a list of integers as a sequence of ranges:
- ((start_0, end_0), (start_1, end_1), ...), such that the original
- integers are exactly those x such that start_i <= x < end_i for some i.
-
- Ranges are encoded as single integers (start << 32 | end), not as tuples.
- """
-
- sorted_list = sorted(list_)
- ranges = []
- last_write = -1
- for i in range(len(sorted_list)):
- if i+1 < len(sorted_list):
- if sorted_list[i] == sorted_list[i+1]-1:
- continue
- current_range = sorted_list[last_write+1:i+1]
- ranges.append(_encode_range(current_range[0], current_range[-1] + 1))
- last_write = i
-
- return tuple(ranges)
-
-def _encode_range(start: int, end: int) -> int:
- return (start << 32) | end
-
-def _decode_range(r: int) -> Tuple[int, int]:
- return (r >> 32), (r & ((1 << 32) - 1))
-
-
-def intranges_contain(int_: int, ranges: Tuple[int, ...]) -> bool:
- """Determine if `int_` falls into one of the ranges in `ranges`."""
- tuple_ = _encode_range(int_, 0)
- pos = bisect.bisect_left(ranges, tuple_)
- # we could be immediately ahead of a tuple (start, end)
- # with start < int_ <= end
- if pos > 0:
- left, right = _decode_range(ranges[pos-1])
- if left <= int_ < right:
- return True
- # or we could be immediately behind a tuple (int_, end)
- if pos < len(ranges):
- left, _ = _decode_range(ranges[pos])
- if left == int_:
- return True
- return False
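
The docstring above promises O(log(# runs)) membership tests over those packed runs; exercising the two public helpers on a toy list shows the encoding end to end:

```python
from idna.intranges import intranges_from_list, intranges_contain

ranges = intranges_from_list([1, 2, 3, 7, 8])    # runs 1..3 and 7..8
assert ranges == ((1 << 32) | 4, (7 << 32) | 9)  # packed (start << 32) | end

assert intranges_contain(2, ranges)
assert intranges_contain(8, ranges)
assert not intranges_contain(5, ranges)
```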
diff --git a/env/lib/python3.9/site-packages/idna/package_data.py b/env/lib/python3.9/site-packages/idna/package_data.py
deleted file mode 100644
index f5ea87c..0000000
--- a/env/lib/python3.9/site-packages/idna/package_data.py
+++ /dev/null
@@ -1,2 +0,0 @@
-__version__ = '3.3'
-
diff --git a/env/lib/python3.9/site-packages/idna/py.typed b/env/lib/python3.9/site-packages/idna/py.typed
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/idna/uts46data.py b/env/lib/python3.9/site-packages/idna/uts46data.py
deleted file mode 100644
index 8f65705..0000000
--- a/env/lib/python3.9/site-packages/idna/uts46data.py
+++ /dev/null
@@ -1,8512 +0,0 @@
-# This file is automatically generated by tools/idna-data
-# vim: set fileencoding=utf-8 :
-
-from typing import List, Tuple, Union
-
-
-"""IDNA Mapping Table from UTS46."""
-
-
-__version__ = '14.0.0'
-def _seg_0() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x0, '3'),
- (0x1, '3'),
- (0x2, '3'),
- (0x3, '3'),
- (0x4, '3'),
- (0x5, '3'),
- (0x6, '3'),
- (0x7, '3'),
- (0x8, '3'),
- (0x9, '3'),
- (0xA, '3'),
- (0xB, '3'),
- (0xC, '3'),
- (0xD, '3'),
- (0xE, '3'),
- (0xF, '3'),
- (0x10, '3'),
- (0x11, '3'),
- (0x12, '3'),
- (0x13, '3'),
- (0x14, '3'),
- (0x15, '3'),
- (0x16, '3'),
- (0x17, '3'),
- (0x18, '3'),
- (0x19, '3'),
- (0x1A, '3'),
- (0x1B, '3'),
- (0x1C, '3'),
- (0x1D, '3'),
- (0x1E, '3'),
- (0x1F, '3'),
- (0x20, '3'),
- (0x21, '3'),
- (0x22, '3'),
- (0x23, '3'),
- (0x24, '3'),
- (0x25, '3'),
- (0x26, '3'),
- (0x27, '3'),
- (0x28, '3'),
- (0x29, '3'),
- (0x2A, '3'),
- (0x2B, '3'),
- (0x2C, '3'),
- (0x2D, 'V'),
- (0x2E, 'V'),
- (0x2F, '3'),
- (0x30, 'V'),
- (0x31, 'V'),
- (0x32, 'V'),
- (0x33, 'V'),
- (0x34, 'V'),
- (0x35, 'V'),
- (0x36, 'V'),
- (0x37, 'V'),
- (0x38, 'V'),
- (0x39, 'V'),
- (0x3A, '3'),
- (0x3B, '3'),
- (0x3C, '3'),
- (0x3D, '3'),
- (0x3E, '3'),
- (0x3F, '3'),
- (0x40, '3'),
- (0x41, 'M', 'a'),
- (0x42, 'M', 'b'),
- (0x43, 'M', 'c'),
- (0x44, 'M', 'd'),
- (0x45, 'M', 'e'),
- (0x46, 'M', 'f'),
- (0x47, 'M', 'g'),
- (0x48, 'M', 'h'),
- (0x49, 'M', 'i'),
- (0x4A, 'M', 'j'),
- (0x4B, 'M', 'k'),
- (0x4C, 'M', 'l'),
- (0x4D, 'M', 'm'),
- (0x4E, 'M', 'n'),
- (0x4F, 'M', 'o'),
- (0x50, 'M', 'p'),
- (0x51, 'M', 'q'),
- (0x52, 'M', 'r'),
- (0x53, 'M', 's'),
- (0x54, 'M', 't'),
- (0x55, 'M', 'u'),
- (0x56, 'M', 'v'),
- (0x57, 'M', 'w'),
- (0x58, 'M', 'x'),
- (0x59, 'M', 'y'),
- (0x5A, 'M', 'z'),
- (0x5B, '3'),
- (0x5C, '3'),
- (0x5D, '3'),
- (0x5E, '3'),
- (0x5F, '3'),
- (0x60, '3'),
- (0x61, 'V'),
- (0x62, 'V'),
- (0x63, 'V'),
- ]
-
-def _seg_1() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x64, 'V'),
- (0x65, 'V'),
- (0x66, 'V'),
- (0x67, 'V'),
- (0x68, 'V'),
- (0x69, 'V'),
- (0x6A, 'V'),
- (0x6B, 'V'),
- (0x6C, 'V'),
- (0x6D, 'V'),
- (0x6E, 'V'),
- (0x6F, 'V'),
- (0x70, 'V'),
- (0x71, 'V'),
- (0x72, 'V'),
- (0x73, 'V'),
- (0x74, 'V'),
- (0x75, 'V'),
- (0x76, 'V'),
- (0x77, 'V'),
- (0x78, 'V'),
- (0x79, 'V'),
- (0x7A, 'V'),
- (0x7B, '3'),
- (0x7C, '3'),
- (0x7D, '3'),
- (0x7E, '3'),
- (0x7F, '3'),
- (0x80, 'X'),
- (0x81, 'X'),
- (0x82, 'X'),
- (0x83, 'X'),
- (0x84, 'X'),
- (0x85, 'X'),
- (0x86, 'X'),
- (0x87, 'X'),
- (0x88, 'X'),
- (0x89, 'X'),
- (0x8A, 'X'),
- (0x8B, 'X'),
- (0x8C, 'X'),
- (0x8D, 'X'),
- (0x8E, 'X'),
- (0x8F, 'X'),
- (0x90, 'X'),
- (0x91, 'X'),
- (0x92, 'X'),
- (0x93, 'X'),
- (0x94, 'X'),
- (0x95, 'X'),
- (0x96, 'X'),
- (0x97, 'X'),
- (0x98, 'X'),
- (0x99, 'X'),
- (0x9A, 'X'),
- (0x9B, 'X'),
- (0x9C, 'X'),
- (0x9D, 'X'),
- (0x9E, 'X'),
- (0x9F, 'X'),
- (0xA0, '3', ' '),
- (0xA1, 'V'),
- (0xA2, 'V'),
- (0xA3, 'V'),
- (0xA4, 'V'),
- (0xA5, 'V'),
- (0xA6, 'V'),
- (0xA7, 'V'),
- (0xA8, '3', ' ̈'),
- (0xA9, 'V'),
- (0xAA, 'M', 'a'),
- (0xAB, 'V'),
- (0xAC, 'V'),
- (0xAD, 'I'),
- (0xAE, 'V'),
- (0xAF, '3', ' ̄'),
- (0xB0, 'V'),
- (0xB1, 'V'),
- (0xB2, 'M', '2'),
- (0xB3, 'M', '3'),
- (0xB4, '3', ' ́'),
- (0xB5, 'M', 'μ'),
- (0xB6, 'V'),
- (0xB7, 'V'),
- (0xB8, '3', ' ̧'),
- (0xB9, 'M', '1'),
- (0xBA, 'M', 'o'),
- (0xBB, 'V'),
- (0xBC, 'M', '1⁄4'),
- (0xBD, 'M', '1⁄2'),
- (0xBE, 'M', '3⁄4'),
- (0xBF, 'V'),
- (0xC0, 'M', 'à'),
- (0xC1, 'M', 'á'),
- (0xC2, 'M', 'â'),
- (0xC3, 'M', 'ã'),
- (0xC4, 'M', 'ä'),
- (0xC5, 'M', 'å'),
- (0xC6, 'M', 'æ'),
- (0xC7, 'M', 'ç'),
- ]
-
-def _seg_2() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xC8, 'M', 'è'),
- (0xC9, 'M', 'é'),
- (0xCA, 'M', 'ê'),
- (0xCB, 'M', 'ë'),
- (0xCC, 'M', 'ì'),
- (0xCD, 'M', 'í'),
- (0xCE, 'M', 'î'),
- (0xCF, 'M', 'ï'),
- (0xD0, 'M', 'ð'),
- (0xD1, 'M', 'ñ'),
- (0xD2, 'M', 'ò'),
- (0xD3, 'M', 'ó'),
- (0xD4, 'M', 'ô'),
- (0xD5, 'M', 'õ'),
- (0xD6, 'M', 'ö'),
- (0xD7, 'V'),
- (0xD8, 'M', 'ø'),
- (0xD9, 'M', 'ù'),
- (0xDA, 'M', 'ú'),
- (0xDB, 'M', 'û'),
- (0xDC, 'M', 'ü'),
- (0xDD, 'M', 'ý'),
- (0xDE, 'M', 'þ'),
- (0xDF, 'D', 'ss'),
- (0xE0, 'V'),
- (0xE1, 'V'),
- (0xE2, 'V'),
- (0xE3, 'V'),
- (0xE4, 'V'),
- (0xE5, 'V'),
- (0xE6, 'V'),
- (0xE7, 'V'),
- (0xE8, 'V'),
- (0xE9, 'V'),
- (0xEA, 'V'),
- (0xEB, 'V'),
- (0xEC, 'V'),
- (0xED, 'V'),
- (0xEE, 'V'),
- (0xEF, 'V'),
- (0xF0, 'V'),
- (0xF1, 'V'),
- (0xF2, 'V'),
- (0xF3, 'V'),
- (0xF4, 'V'),
- (0xF5, 'V'),
- (0xF6, 'V'),
- (0xF7, 'V'),
- (0xF8, 'V'),
- (0xF9, 'V'),
- (0xFA, 'V'),
- (0xFB, 'V'),
- (0xFC, 'V'),
- (0xFD, 'V'),
- (0xFE, 'V'),
- (0xFF, 'V'),
- (0x100, 'M', 'ā'),
- (0x101, 'V'),
- (0x102, 'M', 'ă'),
- (0x103, 'V'),
- (0x104, 'M', 'ą'),
- (0x105, 'V'),
- (0x106, 'M', 'ć'),
- (0x107, 'V'),
- (0x108, 'M', 'ĉ'),
- (0x109, 'V'),
- (0x10A, 'M', 'ċ'),
- (0x10B, 'V'),
- (0x10C, 'M', 'č'),
- (0x10D, 'V'),
- (0x10E, 'M', 'ď'),
- (0x10F, 'V'),
- (0x110, 'M', 'đ'),
- (0x111, 'V'),
- (0x112, 'M', 'ē'),
- (0x113, 'V'),
- (0x114, 'M', 'ĕ'),
- (0x115, 'V'),
- (0x116, 'M', 'ė'),
- (0x117, 'V'),
- (0x118, 'M', 'ę'),
- (0x119, 'V'),
- (0x11A, 'M', 'ě'),
- (0x11B, 'V'),
- (0x11C, 'M', 'ĝ'),
- (0x11D, 'V'),
- (0x11E, 'M', 'ğ'),
- (0x11F, 'V'),
- (0x120, 'M', 'ġ'),
- (0x121, 'V'),
- (0x122, 'M', 'ģ'),
- (0x123, 'V'),
- (0x124, 'M', 'ĥ'),
- (0x125, 'V'),
- (0x126, 'M', 'ħ'),
- (0x127, 'V'),
- (0x128, 'M', 'ĩ'),
- (0x129, 'V'),
- (0x12A, 'M', 'ī'),
- (0x12B, 'V'),
- ]
-
-def _seg_3() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x12C, 'M', 'ĭ'),
- (0x12D, 'V'),
- (0x12E, 'M', 'į'),
- (0x12F, 'V'),
- (0x130, 'M', 'i̇'),
- (0x131, 'V'),
- (0x132, 'M', 'ij'),
- (0x134, 'M', 'ĵ'),
- (0x135, 'V'),
- (0x136, 'M', 'ķ'),
- (0x137, 'V'),
- (0x139, 'M', 'ĺ'),
- (0x13A, 'V'),
- (0x13B, 'M', 'ļ'),
- (0x13C, 'V'),
- (0x13D, 'M', 'ľ'),
- (0x13E, 'V'),
- (0x13F, 'M', 'l·'),
- (0x141, 'M', 'ł'),
- (0x142, 'V'),
- (0x143, 'M', 'ń'),
- (0x144, 'V'),
- (0x145, 'M', 'ņ'),
- (0x146, 'V'),
- (0x147, 'M', 'ň'),
- (0x148, 'V'),
- (0x149, 'M', 'ʼn'),
- (0x14A, 'M', 'ŋ'),
- (0x14B, 'V'),
- (0x14C, 'M', 'ō'),
- (0x14D, 'V'),
- (0x14E, 'M', 'ŏ'),
- (0x14F, 'V'),
- (0x150, 'M', 'ő'),
- (0x151, 'V'),
- (0x152, 'M', 'œ'),
- (0x153, 'V'),
- (0x154, 'M', 'ŕ'),
- (0x155, 'V'),
- (0x156, 'M', 'ŗ'),
- (0x157, 'V'),
- (0x158, 'M', 'ř'),
- (0x159, 'V'),
- (0x15A, 'M', 'ś'),
- (0x15B, 'V'),
- (0x15C, 'M', 'ŝ'),
- (0x15D, 'V'),
- (0x15E, 'M', 'ş'),
- (0x15F, 'V'),
- (0x160, 'M', 'š'),
- (0x161, 'V'),
- (0x162, 'M', 'ţ'),
- (0x163, 'V'),
- (0x164, 'M', 'ť'),
- (0x165, 'V'),
- (0x166, 'M', 'ŧ'),
- (0x167, 'V'),
- (0x168, 'M', 'ũ'),
- (0x169, 'V'),
- (0x16A, 'M', 'ū'),
- (0x16B, 'V'),
- (0x16C, 'M', 'ŭ'),
- (0x16D, 'V'),
- (0x16E, 'M', 'ů'),
- (0x16F, 'V'),
- (0x170, 'M', 'ű'),
- (0x171, 'V'),
- (0x172, 'M', 'ų'),
- (0x173, 'V'),
- (0x174, 'M', 'ŵ'),
- (0x175, 'V'),
- (0x176, 'M', 'ŷ'),
- (0x177, 'V'),
- (0x178, 'M', 'ÿ'),
- (0x179, 'M', 'ź'),
- (0x17A, 'V'),
- (0x17B, 'M', 'ż'),
- (0x17C, 'V'),
- (0x17D, 'M', 'ž'),
- (0x17E, 'V'),
- (0x17F, 'M', 's'),
- (0x180, 'V'),
- (0x181, 'M', 'ɓ'),
- (0x182, 'M', 'ƃ'),
- (0x183, 'V'),
- (0x184, 'M', 'ƅ'),
- (0x185, 'V'),
- (0x186, 'M', 'ɔ'),
- (0x187, 'M', 'ƈ'),
- (0x188, 'V'),
- (0x189, 'M', 'ɖ'),
- (0x18A, 'M', 'ɗ'),
- (0x18B, 'M', 'ƌ'),
- (0x18C, 'V'),
- (0x18E, 'M', 'ǝ'),
- (0x18F, 'M', 'ə'),
- (0x190, 'M', 'ɛ'),
- (0x191, 'M', 'ƒ'),
- (0x192, 'V'),
- (0x193, 'M', 'ɠ'),
- ]
-
-def _seg_4() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x194, 'M', 'ɣ'),
- (0x195, 'V'),
- (0x196, 'M', 'ɩ'),
- (0x197, 'M', 'ɨ'),
- (0x198, 'M', 'ƙ'),
- (0x199, 'V'),
- (0x19C, 'M', 'ɯ'),
- (0x19D, 'M', 'ɲ'),
- (0x19E, 'V'),
- (0x19F, 'M', 'ɵ'),
- (0x1A0, 'M', 'ơ'),
- (0x1A1, 'V'),
- (0x1A2, 'M', 'ƣ'),
- (0x1A3, 'V'),
- (0x1A4, 'M', 'ƥ'),
- (0x1A5, 'V'),
- (0x1A6, 'M', 'ʀ'),
- (0x1A7, 'M', 'ƨ'),
- (0x1A8, 'V'),
- (0x1A9, 'M', 'ʃ'),
- (0x1AA, 'V'),
- (0x1AC, 'M', 'ƭ'),
- (0x1AD, 'V'),
- (0x1AE, 'M', 'ʈ'),
- (0x1AF, 'M', 'ư'),
- (0x1B0, 'V'),
- (0x1B1, 'M', 'ʊ'),
- (0x1B2, 'M', 'ʋ'),
- (0x1B3, 'M', 'ƴ'),
- (0x1B4, 'V'),
- (0x1B5, 'M', 'ƶ'),
- (0x1B6, 'V'),
- (0x1B7, 'M', 'ʒ'),
- (0x1B8, 'M', 'ƹ'),
- (0x1B9, 'V'),
- (0x1BC, 'M', 'ƽ'),
- (0x1BD, 'V'),
- (0x1C4, 'M', 'dž'),
- (0x1C7, 'M', 'lj'),
- (0x1CA, 'M', 'nj'),
- (0x1CD, 'M', 'ǎ'),
- (0x1CE, 'V'),
- (0x1CF, 'M', 'ǐ'),
- (0x1D0, 'V'),
- (0x1D1, 'M', 'ǒ'),
- (0x1D2, 'V'),
- (0x1D3, 'M', 'ǔ'),
- (0x1D4, 'V'),
- (0x1D5, 'M', 'ǖ'),
- (0x1D6, 'V'),
- (0x1D7, 'M', 'ǘ'),
- (0x1D8, 'V'),
- (0x1D9, 'M', 'ǚ'),
- (0x1DA, 'V'),
- (0x1DB, 'M', 'ǜ'),
- (0x1DC, 'V'),
- (0x1DE, 'M', 'ǟ'),
- (0x1DF, 'V'),
- (0x1E0, 'M', 'ǡ'),
- (0x1E1, 'V'),
- (0x1E2, 'M', 'ǣ'),
- (0x1E3, 'V'),
- (0x1E4, 'M', 'ǥ'),
- (0x1E5, 'V'),
- (0x1E6, 'M', 'ǧ'),
- (0x1E7, 'V'),
- (0x1E8, 'M', 'ǩ'),
- (0x1E9, 'V'),
- (0x1EA, 'M', 'ǫ'),
- (0x1EB, 'V'),
- (0x1EC, 'M', 'ǭ'),
- (0x1ED, 'V'),
- (0x1EE, 'M', 'ǯ'),
- (0x1EF, 'V'),
- (0x1F1, 'M', 'dz'),
- (0x1F4, 'M', 'ǵ'),
- (0x1F5, 'V'),
- (0x1F6, 'M', 'ƕ'),
- (0x1F7, 'M', 'ƿ'),
- (0x1F8, 'M', 'ǹ'),
- (0x1F9, 'V'),
- (0x1FA, 'M', 'ǻ'),
- (0x1FB, 'V'),
- (0x1FC, 'M', 'ǽ'),
- (0x1FD, 'V'),
- (0x1FE, 'M', 'ǿ'),
- (0x1FF, 'V'),
- (0x200, 'M', 'ȁ'),
- (0x201, 'V'),
- (0x202, 'M', 'ȃ'),
- (0x203, 'V'),
- (0x204, 'M', 'ȅ'),
- (0x205, 'V'),
- (0x206, 'M', 'ȇ'),
- (0x207, 'V'),
- (0x208, 'M', 'ȉ'),
- (0x209, 'V'),
- (0x20A, 'M', 'ȋ'),
- (0x20B, 'V'),
- (0x20C, 'M', 'ȍ'),
- ]
-
-def _seg_5() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x20D, 'V'),
- (0x20E, 'M', 'ȏ'),
- (0x20F, 'V'),
- (0x210, 'M', 'ȑ'),
- (0x211, 'V'),
- (0x212, 'M', 'ȓ'),
- (0x213, 'V'),
- (0x214, 'M', 'ȕ'),
- (0x215, 'V'),
- (0x216, 'M', 'ȗ'),
- (0x217, 'V'),
- (0x218, 'M', 'ș'),
- (0x219, 'V'),
- (0x21A, 'M', 'ț'),
- (0x21B, 'V'),
- (0x21C, 'M', 'ȝ'),
- (0x21D, 'V'),
- (0x21E, 'M', 'ȟ'),
- (0x21F, 'V'),
- (0x220, 'M', 'ƞ'),
- (0x221, 'V'),
- (0x222, 'M', 'ȣ'),
- (0x223, 'V'),
- (0x224, 'M', 'ȥ'),
- (0x225, 'V'),
- (0x226, 'M', 'ȧ'),
- (0x227, 'V'),
- (0x228, 'M', 'ȩ'),
- (0x229, 'V'),
- (0x22A, 'M', 'ȫ'),
- (0x22B, 'V'),
- (0x22C, 'M', 'ȭ'),
- (0x22D, 'V'),
- (0x22E, 'M', 'ȯ'),
- (0x22F, 'V'),
- (0x230, 'M', 'ȱ'),
- (0x231, 'V'),
- (0x232, 'M', 'ȳ'),
- (0x233, 'V'),
- (0x23A, 'M', 'ⱥ'),
- (0x23B, 'M', 'ȼ'),
- (0x23C, 'V'),
- (0x23D, 'M', 'ƚ'),
- (0x23E, 'M', 'ⱦ'),
- (0x23F, 'V'),
- (0x241, 'M', 'ɂ'),
- (0x242, 'V'),
- (0x243, 'M', 'ƀ'),
- (0x244, 'M', 'ʉ'),
- (0x245, 'M', 'ʌ'),
- (0x246, 'M', 'ɇ'),
- (0x247, 'V'),
- (0x248, 'M', 'ɉ'),
- (0x249, 'V'),
- (0x24A, 'M', 'ɋ'),
- (0x24B, 'V'),
- (0x24C, 'M', 'ɍ'),
- (0x24D, 'V'),
- (0x24E, 'M', 'ɏ'),
- (0x24F, 'V'),
- (0x2B0, 'M', 'h'),
- (0x2B1, 'M', 'ɦ'),
- (0x2B2, 'M', 'j'),
- (0x2B3, 'M', 'r'),
- (0x2B4, 'M', 'ɹ'),
- (0x2B5, 'M', 'ɻ'),
- (0x2B6, 'M', 'ʁ'),
- (0x2B7, 'M', 'w'),
- (0x2B8, 'M', 'y'),
- (0x2B9, 'V'),
- (0x2D8, '3', ' ̆'),
- (0x2D9, '3', ' ̇'),
- (0x2DA, '3', ' ̊'),
- (0x2DB, '3', ' ̨'),
- (0x2DC, '3', ' ̃'),
- (0x2DD, '3', ' ̋'),
- (0x2DE, 'V'),
- (0x2E0, 'M', 'ɣ'),
- (0x2E1, 'M', 'l'),
- (0x2E2, 'M', 's'),
- (0x2E3, 'M', 'x'),
- (0x2E4, 'M', 'ʕ'),
- (0x2E5, 'V'),
- (0x340, 'M', '̀'),
- (0x341, 'M', '́'),
- (0x342, 'V'),
- (0x343, 'M', '̓'),
- (0x344, 'M', '̈́'),
- (0x345, 'M', 'ι'),
- (0x346, 'V'),
- (0x34F, 'I'),
- (0x350, 'V'),
- (0x370, 'M', 'ͱ'),
- (0x371, 'V'),
- (0x372, 'M', 'ͳ'),
- (0x373, 'V'),
- (0x374, 'M', 'ʹ'),
- (0x375, 'V'),
- (0x376, 'M', 'ͷ'),
- (0x377, 'V'),
- ]
-
-def _seg_6() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x378, 'X'),
- (0x37A, '3', ' ι'),
- (0x37B, 'V'),
- (0x37E, '3', ';'),
- (0x37F, 'M', 'ϳ'),
- (0x380, 'X'),
- (0x384, '3', ' ́'),
- (0x385, '3', ' ̈́'),
- (0x386, 'M', 'ά'),
- (0x387, 'M', '·'),
- (0x388, 'M', 'έ'),
- (0x389, 'M', 'ή'),
- (0x38A, 'M', 'ί'),
- (0x38B, 'X'),
- (0x38C, 'M', 'ό'),
- (0x38D, 'X'),
- (0x38E, 'M', 'ύ'),
- (0x38F, 'M', 'ώ'),
- (0x390, 'V'),
- (0x391, 'M', 'α'),
- (0x392, 'M', 'β'),
- (0x393, 'M', 'γ'),
- (0x394, 'M', 'δ'),
- (0x395, 'M', 'ε'),
- (0x396, 'M', 'ζ'),
- (0x397, 'M', 'η'),
- (0x398, 'M', 'θ'),
- (0x399, 'M', 'ι'),
- (0x39A, 'M', 'κ'),
- (0x39B, 'M', 'λ'),
- (0x39C, 'M', 'μ'),
- (0x39D, 'M', 'ν'),
- (0x39E, 'M', 'ξ'),
- (0x39F, 'M', 'ο'),
- (0x3A0, 'M', 'π'),
- (0x3A1, 'M', 'ρ'),
- (0x3A2, 'X'),
- (0x3A3, 'M', 'σ'),
- (0x3A4, 'M', 'τ'),
- (0x3A5, 'M', 'υ'),
- (0x3A6, 'M', 'φ'),
- (0x3A7, 'M', 'χ'),
- (0x3A8, 'M', 'ψ'),
- (0x3A9, 'M', 'ω'),
- (0x3AA, 'M', 'ϊ'),
- (0x3AB, 'M', 'ϋ'),
- (0x3AC, 'V'),
- (0x3C2, 'D', 'σ'),
- (0x3C3, 'V'),
- (0x3CF, 'M', 'ϗ'),
- (0x3D0, 'M', 'β'),
- (0x3D1, 'M', 'θ'),
- (0x3D2, 'M', 'υ'),
- (0x3D3, 'M', 'ύ'),
- (0x3D4, 'M', 'ϋ'),
- (0x3D5, 'M', 'φ'),
- (0x3D6, 'M', 'π'),
- (0x3D7, 'V'),
- (0x3D8, 'M', 'ϙ'),
- (0x3D9, 'V'),
- (0x3DA, 'M', 'ϛ'),
- (0x3DB, 'V'),
- (0x3DC, 'M', 'ϝ'),
- (0x3DD, 'V'),
- (0x3DE, 'M', 'ϟ'),
- (0x3DF, 'V'),
- (0x3E0, 'M', 'ϡ'),
- (0x3E1, 'V'),
- (0x3E2, 'M', 'ϣ'),
- (0x3E3, 'V'),
- (0x3E4, 'M', 'ϥ'),
- (0x3E5, 'V'),
- (0x3E6, 'M', 'ϧ'),
- (0x3E7, 'V'),
- (0x3E8, 'M', 'ϩ'),
- (0x3E9, 'V'),
- (0x3EA, 'M', 'ϫ'),
- (0x3EB, 'V'),
- (0x3EC, 'M', 'ϭ'),
- (0x3ED, 'V'),
- (0x3EE, 'M', 'ϯ'),
- (0x3EF, 'V'),
- (0x3F0, 'M', 'κ'),
- (0x3F1, 'M', 'ρ'),
- (0x3F2, 'M', 'σ'),
- (0x3F3, 'V'),
- (0x3F4, 'M', 'θ'),
- (0x3F5, 'M', 'ε'),
- (0x3F6, 'V'),
- (0x3F7, 'M', 'ϸ'),
- (0x3F8, 'V'),
- (0x3F9, 'M', 'σ'),
- (0x3FA, 'M', 'ϻ'),
- (0x3FB, 'V'),
- (0x3FD, 'M', 'ͻ'),
- (0x3FE, 'M', 'ͼ'),
- (0x3FF, 'M', 'ͽ'),
- (0x400, 'M', 'ѐ'),
- (0x401, 'M', 'ё'),
- (0x402, 'M', 'ђ'),
- ]
-
-def _seg_7() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x403, 'M', 'ѓ'),
- (0x404, 'M', 'є'),
- (0x405, 'M', 'ѕ'),
- (0x406, 'M', 'і'),
- (0x407, 'M', 'ї'),
- (0x408, 'M', 'ј'),
- (0x409, 'M', 'љ'),
- (0x40A, 'M', 'њ'),
- (0x40B, 'M', 'ћ'),
- (0x40C, 'M', 'ќ'),
- (0x40D, 'M', 'ѝ'),
- (0x40E, 'M', 'ў'),
- (0x40F, 'M', 'џ'),
- (0x410, 'M', 'а'),
- (0x411, 'M', 'б'),
- (0x412, 'M', 'в'),
- (0x413, 'M', 'г'),
- (0x414, 'M', 'д'),
- (0x415, 'M', 'е'),
- (0x416, 'M', 'ж'),
- (0x417, 'M', 'з'),
- (0x418, 'M', 'и'),
- (0x419, 'M', 'й'),
- (0x41A, 'M', 'к'),
- (0x41B, 'M', 'л'),
- (0x41C, 'M', 'м'),
- (0x41D, 'M', 'н'),
- (0x41E, 'M', 'о'),
- (0x41F, 'M', 'п'),
- (0x420, 'M', 'р'),
- (0x421, 'M', 'с'),
- (0x422, 'M', 'т'),
- (0x423, 'M', 'у'),
- (0x424, 'M', 'ф'),
- (0x425, 'M', 'х'),
- (0x426, 'M', 'ц'),
- (0x427, 'M', 'ч'),
- (0x428, 'M', 'ш'),
- (0x429, 'M', 'щ'),
- (0x42A, 'M', 'ъ'),
- (0x42B, 'M', 'ы'),
- (0x42C, 'M', 'ь'),
- (0x42D, 'M', 'э'),
- (0x42E, 'M', 'ю'),
- (0x42F, 'M', 'я'),
- (0x430, 'V'),
- (0x460, 'M', 'ѡ'),
- (0x461, 'V'),
- (0x462, 'M', 'ѣ'),
- (0x463, 'V'),
- (0x464, 'M', 'ѥ'),
- (0x465, 'V'),
- (0x466, 'M', 'ѧ'),
- (0x467, 'V'),
- (0x468, 'M', 'ѩ'),
- (0x469, 'V'),
- (0x46A, 'M', 'ѫ'),
- (0x46B, 'V'),
- (0x46C, 'M', 'ѭ'),
- (0x46D, 'V'),
- (0x46E, 'M', 'ѯ'),
- (0x46F, 'V'),
- (0x470, 'M', 'ѱ'),
- (0x471, 'V'),
- (0x472, 'M', 'ѳ'),
- (0x473, 'V'),
- (0x474, 'M', 'ѵ'),
- (0x475, 'V'),
- (0x476, 'M', 'ѷ'),
- (0x477, 'V'),
- (0x478, 'M', 'ѹ'),
- (0x479, 'V'),
- (0x47A, 'M', 'ѻ'),
- (0x47B, 'V'),
- (0x47C, 'M', 'ѽ'),
- (0x47D, 'V'),
- (0x47E, 'M', 'ѿ'),
- (0x47F, 'V'),
- (0x480, 'M', 'ҁ'),
- (0x481, 'V'),
- (0x48A, 'M', 'ҋ'),
- (0x48B, 'V'),
- (0x48C, 'M', 'ҍ'),
- (0x48D, 'V'),
- (0x48E, 'M', 'ҏ'),
- (0x48F, 'V'),
- (0x490, 'M', 'ґ'),
- (0x491, 'V'),
- (0x492, 'M', 'ғ'),
- (0x493, 'V'),
- (0x494, 'M', 'ҕ'),
- (0x495, 'V'),
- (0x496, 'M', 'җ'),
- (0x497, 'V'),
- (0x498, 'M', 'ҙ'),
- (0x499, 'V'),
- (0x49A, 'M', 'қ'),
- (0x49B, 'V'),
- (0x49C, 'M', 'ҝ'),
- (0x49D, 'V'),
- ]
-
-def _seg_8() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x49E, 'M', 'ҟ'),
- (0x49F, 'V'),
- (0x4A0, 'M', 'ҡ'),
- (0x4A1, 'V'),
- (0x4A2, 'M', 'ң'),
- (0x4A3, 'V'),
- (0x4A4, 'M', 'ҥ'),
- (0x4A5, 'V'),
- (0x4A6, 'M', 'ҧ'),
- (0x4A7, 'V'),
- (0x4A8, 'M', 'ҩ'),
- (0x4A9, 'V'),
- (0x4AA, 'M', 'ҫ'),
- (0x4AB, 'V'),
- (0x4AC, 'M', 'ҭ'),
- (0x4AD, 'V'),
- (0x4AE, 'M', 'ү'),
- (0x4AF, 'V'),
- (0x4B0, 'M', 'ұ'),
- (0x4B1, 'V'),
- (0x4B2, 'M', 'ҳ'),
- (0x4B3, 'V'),
- (0x4B4, 'M', 'ҵ'),
- (0x4B5, 'V'),
- (0x4B6, 'M', 'ҷ'),
- (0x4B7, 'V'),
- (0x4B8, 'M', 'ҹ'),
- (0x4B9, 'V'),
- (0x4BA, 'M', 'һ'),
- (0x4BB, 'V'),
- (0x4BC, 'M', 'ҽ'),
- (0x4BD, 'V'),
- (0x4BE, 'M', 'ҿ'),
- (0x4BF, 'V'),
- (0x4C0, 'X'),
- (0x4C1, 'M', 'ӂ'),
- (0x4C2, 'V'),
- (0x4C3, 'M', 'ӄ'),
- (0x4C4, 'V'),
- (0x4C5, 'M', 'ӆ'),
- (0x4C6, 'V'),
- (0x4C7, 'M', 'ӈ'),
- (0x4C8, 'V'),
- (0x4C9, 'M', 'ӊ'),
- (0x4CA, 'V'),
- (0x4CB, 'M', 'ӌ'),
- (0x4CC, 'V'),
- (0x4CD, 'M', 'ӎ'),
- (0x4CE, 'V'),
- (0x4D0, 'M', 'ӑ'),
- (0x4D1, 'V'),
- (0x4D2, 'M', 'ӓ'),
- (0x4D3, 'V'),
- (0x4D4, 'M', 'ӕ'),
- (0x4D5, 'V'),
- (0x4D6, 'M', 'ӗ'),
- (0x4D7, 'V'),
- (0x4D8, 'M', 'ә'),
- (0x4D9, 'V'),
- (0x4DA, 'M', 'ӛ'),
- (0x4DB, 'V'),
- (0x4DC, 'M', 'ӝ'),
- (0x4DD, 'V'),
- (0x4DE, 'M', 'ӟ'),
- (0x4DF, 'V'),
- (0x4E0, 'M', 'ӡ'),
- (0x4E1, 'V'),
- (0x4E2, 'M', 'ӣ'),
- (0x4E3, 'V'),
- (0x4E4, 'M', 'ӥ'),
- (0x4E5, 'V'),
- (0x4E6, 'M', 'ӧ'),
- (0x4E7, 'V'),
- (0x4E8, 'M', 'ө'),
- (0x4E9, 'V'),
- (0x4EA, 'M', 'ӫ'),
- (0x4EB, 'V'),
- (0x4EC, 'M', 'ӭ'),
- (0x4ED, 'V'),
- (0x4EE, 'M', 'ӯ'),
- (0x4EF, 'V'),
- (0x4F0, 'M', 'ӱ'),
- (0x4F1, 'V'),
- (0x4F2, 'M', 'ӳ'),
- (0x4F3, 'V'),
- (0x4F4, 'M', 'ӵ'),
- (0x4F5, 'V'),
- (0x4F6, 'M', 'ӷ'),
- (0x4F7, 'V'),
- (0x4F8, 'M', 'ӹ'),
- (0x4F9, 'V'),
- (0x4FA, 'M', 'ӻ'),
- (0x4FB, 'V'),
- (0x4FC, 'M', 'ӽ'),
- (0x4FD, 'V'),
- (0x4FE, 'M', 'ӿ'),
- (0x4FF, 'V'),
- (0x500, 'M', 'ԁ'),
- (0x501, 'V'),
- (0x502, 'M', 'ԃ'),
- ]
-
-def _seg_9() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x503, 'V'),
- (0x504, 'M', 'ԅ'),
- (0x505, 'V'),
- (0x506, 'M', 'ԇ'),
- (0x507, 'V'),
- (0x508, 'M', 'ԉ'),
- (0x509, 'V'),
- (0x50A, 'M', 'ԋ'),
- (0x50B, 'V'),
- (0x50C, 'M', 'ԍ'),
- (0x50D, 'V'),
- (0x50E, 'M', 'ԏ'),
- (0x50F, 'V'),
- (0x510, 'M', 'ԑ'),
- (0x511, 'V'),
- (0x512, 'M', 'ԓ'),
- (0x513, 'V'),
- (0x514, 'M', 'ԕ'),
- (0x515, 'V'),
- (0x516, 'M', 'ԗ'),
- (0x517, 'V'),
- (0x518, 'M', 'ԙ'),
- (0x519, 'V'),
- (0x51A, 'M', 'ԛ'),
- (0x51B, 'V'),
- (0x51C, 'M', 'ԝ'),
- (0x51D, 'V'),
- (0x51E, 'M', 'ԟ'),
- (0x51F, 'V'),
- (0x520, 'M', 'ԡ'),
- (0x521, 'V'),
- (0x522, 'M', 'ԣ'),
- (0x523, 'V'),
- (0x524, 'M', 'ԥ'),
- (0x525, 'V'),
- (0x526, 'M', 'ԧ'),
- (0x527, 'V'),
- (0x528, 'M', 'ԩ'),
- (0x529, 'V'),
- (0x52A, 'M', 'ԫ'),
- (0x52B, 'V'),
- (0x52C, 'M', 'ԭ'),
- (0x52D, 'V'),
- (0x52E, 'M', 'ԯ'),
- (0x52F, 'V'),
- (0x530, 'X'),
- (0x531, 'M', 'ա'),
- (0x532, 'M', 'բ'),
- (0x533, 'M', 'գ'),
- (0x534, 'M', 'դ'),
- (0x535, 'M', 'ե'),
- (0x536, 'M', 'զ'),
- (0x537, 'M', 'է'),
- (0x538, 'M', 'ը'),
- (0x539, 'M', 'թ'),
- (0x53A, 'M', 'ժ'),
- (0x53B, 'M', 'ի'),
- (0x53C, 'M', 'լ'),
- (0x53D, 'M', 'խ'),
- (0x53E, 'M', 'ծ'),
- (0x53F, 'M', 'կ'),
- (0x540, 'M', 'հ'),
- (0x541, 'M', 'ձ'),
- (0x542, 'M', 'ղ'),
- (0x543, 'M', 'ճ'),
- (0x544, 'M', 'մ'),
- (0x545, 'M', 'յ'),
- (0x546, 'M', 'ն'),
- (0x547, 'M', 'շ'),
- (0x548, 'M', 'ո'),
- (0x549, 'M', 'չ'),
- (0x54A, 'M', 'պ'),
- (0x54B, 'M', 'ջ'),
- (0x54C, 'M', 'ռ'),
- (0x54D, 'M', 'ս'),
- (0x54E, 'M', 'վ'),
- (0x54F, 'M', 'տ'),
- (0x550, 'M', 'ր'),
- (0x551, 'M', 'ց'),
- (0x552, 'M', 'ւ'),
- (0x553, 'M', 'փ'),
- (0x554, 'M', 'ք'),
- (0x555, 'M', 'օ'),
- (0x556, 'M', 'ֆ'),
- (0x557, 'X'),
- (0x559, 'V'),
- (0x587, 'M', 'եւ'),
- (0x588, 'V'),
- (0x58B, 'X'),
- (0x58D, 'V'),
- (0x590, 'X'),
- (0x591, 'V'),
- (0x5C8, 'X'),
- (0x5D0, 'V'),
- (0x5EB, 'X'),
- (0x5EF, 'V'),
- (0x5F5, 'X'),
- (0x606, 'V'),
- (0x61C, 'X'),
- (0x61D, 'V'),
- ]
-
-def _seg_10() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x675, 'M', 'اٴ'),
- (0x676, 'M', 'وٴ'),
- (0x677, 'M', 'ۇٴ'),
- (0x678, 'M', 'يٴ'),
- (0x679, 'V'),
- (0x6DD, 'X'),
- (0x6DE, 'V'),
- (0x70E, 'X'),
- (0x710, 'V'),
- (0x74B, 'X'),
- (0x74D, 'V'),
- (0x7B2, 'X'),
- (0x7C0, 'V'),
- (0x7FB, 'X'),
- (0x7FD, 'V'),
- (0x82E, 'X'),
- (0x830, 'V'),
- (0x83F, 'X'),
- (0x840, 'V'),
- (0x85C, 'X'),
- (0x85E, 'V'),
- (0x85F, 'X'),
- (0x860, 'V'),
- (0x86B, 'X'),
- (0x870, 'V'),
- (0x88F, 'X'),
- (0x898, 'V'),
- (0x8E2, 'X'),
- (0x8E3, 'V'),
- (0x958, 'M', 'क़'),
- (0x959, 'M', 'ख़'),
- (0x95A, 'M', 'ग़'),
- (0x95B, 'M', 'ज़'),
- (0x95C, 'M', 'ड़'),
- (0x95D, 'M', 'ढ़'),
- (0x95E, 'M', 'फ़'),
- (0x95F, 'M', 'य़'),
- (0x960, 'V'),
- (0x984, 'X'),
- (0x985, 'V'),
- (0x98D, 'X'),
- (0x98F, 'V'),
- (0x991, 'X'),
- (0x993, 'V'),
- (0x9A9, 'X'),
- (0x9AA, 'V'),
- (0x9B1, 'X'),
- (0x9B2, 'V'),
- (0x9B3, 'X'),
- (0x9B6, 'V'),
- (0x9BA, 'X'),
- (0x9BC, 'V'),
- (0x9C5, 'X'),
- (0x9C7, 'V'),
- (0x9C9, 'X'),
- (0x9CB, 'V'),
- (0x9CF, 'X'),
- (0x9D7, 'V'),
- (0x9D8, 'X'),
- (0x9DC, 'M', 'ড়'),
- (0x9DD, 'M', 'ঢ়'),
- (0x9DE, 'X'),
- (0x9DF, 'M', 'য়'),
- (0x9E0, 'V'),
- (0x9E4, 'X'),
- (0x9E6, 'V'),
- (0x9FF, 'X'),
- (0xA01, 'V'),
- (0xA04, 'X'),
- (0xA05, 'V'),
- (0xA0B, 'X'),
- (0xA0F, 'V'),
- (0xA11, 'X'),
- (0xA13, 'V'),
- (0xA29, 'X'),
- (0xA2A, 'V'),
- (0xA31, 'X'),
- (0xA32, 'V'),
- (0xA33, 'M', 'ਲ਼'),
- (0xA34, 'X'),
- (0xA35, 'V'),
- (0xA36, 'M', 'ਸ਼'),
- (0xA37, 'X'),
- (0xA38, 'V'),
- (0xA3A, 'X'),
- (0xA3C, 'V'),
- (0xA3D, 'X'),
- (0xA3E, 'V'),
- (0xA43, 'X'),
- (0xA47, 'V'),
- (0xA49, 'X'),
- (0xA4B, 'V'),
- (0xA4E, 'X'),
- (0xA51, 'V'),
- (0xA52, 'X'),
- (0xA59, 'M', 'ਖ਼'),
- (0xA5A, 'M', 'ਗ਼'),
- (0xA5B, 'M', 'ਜ਼'),
- (0xA5C, 'V'),
- (0xA5D, 'X'),
- ]
-
-def _seg_11() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xA5E, 'M', 'ਫ਼'),
- (0xA5F, 'X'),
- (0xA66, 'V'),
- (0xA77, 'X'),
- (0xA81, 'V'),
- (0xA84, 'X'),
- (0xA85, 'V'),
- (0xA8E, 'X'),
- (0xA8F, 'V'),
- (0xA92, 'X'),
- (0xA93, 'V'),
- (0xAA9, 'X'),
- (0xAAA, 'V'),
- (0xAB1, 'X'),
- (0xAB2, 'V'),
- (0xAB4, 'X'),
- (0xAB5, 'V'),
- (0xABA, 'X'),
- (0xABC, 'V'),
- (0xAC6, 'X'),
- (0xAC7, 'V'),
- (0xACA, 'X'),
- (0xACB, 'V'),
- (0xACE, 'X'),
- (0xAD0, 'V'),
- (0xAD1, 'X'),
- (0xAE0, 'V'),
- (0xAE4, 'X'),
- (0xAE6, 'V'),
- (0xAF2, 'X'),
- (0xAF9, 'V'),
- (0xB00, 'X'),
- (0xB01, 'V'),
- (0xB04, 'X'),
- (0xB05, 'V'),
- (0xB0D, 'X'),
- (0xB0F, 'V'),
- (0xB11, 'X'),
- (0xB13, 'V'),
- (0xB29, 'X'),
- (0xB2A, 'V'),
- (0xB31, 'X'),
- (0xB32, 'V'),
- (0xB34, 'X'),
- (0xB35, 'V'),
- (0xB3A, 'X'),
- (0xB3C, 'V'),
- (0xB45, 'X'),
- (0xB47, 'V'),
- (0xB49, 'X'),
- (0xB4B, 'V'),
- (0xB4E, 'X'),
- (0xB55, 'V'),
- (0xB58, 'X'),
- (0xB5C, 'M', 'ଡ଼'),
- (0xB5D, 'M', 'ଢ଼'),
- (0xB5E, 'X'),
- (0xB5F, 'V'),
- (0xB64, 'X'),
- (0xB66, 'V'),
- (0xB78, 'X'),
- (0xB82, 'V'),
- (0xB84, 'X'),
- (0xB85, 'V'),
- (0xB8B, 'X'),
- (0xB8E, 'V'),
- (0xB91, 'X'),
- (0xB92, 'V'),
- (0xB96, 'X'),
- (0xB99, 'V'),
- (0xB9B, 'X'),
- (0xB9C, 'V'),
- (0xB9D, 'X'),
- (0xB9E, 'V'),
- (0xBA0, 'X'),
- (0xBA3, 'V'),
- (0xBA5, 'X'),
- (0xBA8, 'V'),
- (0xBAB, 'X'),
- (0xBAE, 'V'),
- (0xBBA, 'X'),
- (0xBBE, 'V'),
- (0xBC3, 'X'),
- (0xBC6, 'V'),
- (0xBC9, 'X'),
- (0xBCA, 'V'),
- (0xBCE, 'X'),
- (0xBD0, 'V'),
- (0xBD1, 'X'),
- (0xBD7, 'V'),
- (0xBD8, 'X'),
- (0xBE6, 'V'),
- (0xBFB, 'X'),
- (0xC00, 'V'),
- (0xC0D, 'X'),
- (0xC0E, 'V'),
- (0xC11, 'X'),
- (0xC12, 'V'),
- (0xC29, 'X'),
- (0xC2A, 'V'),
- ]
-
-def _seg_12() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xC3A, 'X'),
- (0xC3C, 'V'),
- (0xC45, 'X'),
- (0xC46, 'V'),
- (0xC49, 'X'),
- (0xC4A, 'V'),
- (0xC4E, 'X'),
- (0xC55, 'V'),
- (0xC57, 'X'),
- (0xC58, 'V'),
- (0xC5B, 'X'),
- (0xC5D, 'V'),
- (0xC5E, 'X'),
- (0xC60, 'V'),
- (0xC64, 'X'),
- (0xC66, 'V'),
- (0xC70, 'X'),
- (0xC77, 'V'),
- (0xC8D, 'X'),
- (0xC8E, 'V'),
- (0xC91, 'X'),
- (0xC92, 'V'),
- (0xCA9, 'X'),
- (0xCAA, 'V'),
- (0xCB4, 'X'),
- (0xCB5, 'V'),
- (0xCBA, 'X'),
- (0xCBC, 'V'),
- (0xCC5, 'X'),
- (0xCC6, 'V'),
- (0xCC9, 'X'),
- (0xCCA, 'V'),
- (0xCCE, 'X'),
- (0xCD5, 'V'),
- (0xCD7, 'X'),
- (0xCDD, 'V'),
- (0xCDF, 'X'),
- (0xCE0, 'V'),
- (0xCE4, 'X'),
- (0xCE6, 'V'),
- (0xCF0, 'X'),
- (0xCF1, 'V'),
- (0xCF3, 'X'),
- (0xD00, 'V'),
- (0xD0D, 'X'),
- (0xD0E, 'V'),
- (0xD11, 'X'),
- (0xD12, 'V'),
- (0xD45, 'X'),
- (0xD46, 'V'),
- (0xD49, 'X'),
- (0xD4A, 'V'),
- (0xD50, 'X'),
- (0xD54, 'V'),
- (0xD64, 'X'),
- (0xD66, 'V'),
- (0xD80, 'X'),
- (0xD81, 'V'),
- (0xD84, 'X'),
- (0xD85, 'V'),
- (0xD97, 'X'),
- (0xD9A, 'V'),
- (0xDB2, 'X'),
- (0xDB3, 'V'),
- (0xDBC, 'X'),
- (0xDBD, 'V'),
- (0xDBE, 'X'),
- (0xDC0, 'V'),
- (0xDC7, 'X'),
- (0xDCA, 'V'),
- (0xDCB, 'X'),
- (0xDCF, 'V'),
- (0xDD5, 'X'),
- (0xDD6, 'V'),
- (0xDD7, 'X'),
- (0xDD8, 'V'),
- (0xDE0, 'X'),
- (0xDE6, 'V'),
- (0xDF0, 'X'),
- (0xDF2, 'V'),
- (0xDF5, 'X'),
- (0xE01, 'V'),
- (0xE33, 'M', 'ํา'),
- (0xE34, 'V'),
- (0xE3B, 'X'),
- (0xE3F, 'V'),
- (0xE5C, 'X'),
- (0xE81, 'V'),
- (0xE83, 'X'),
- (0xE84, 'V'),
- (0xE85, 'X'),
- (0xE86, 'V'),
- (0xE8B, 'X'),
- (0xE8C, 'V'),
- (0xEA4, 'X'),
- (0xEA5, 'V'),
- (0xEA6, 'X'),
- (0xEA7, 'V'),
- (0xEB3, 'M', 'ໍາ'),
- (0xEB4, 'V'),
- ]
-
-def _seg_13() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xEBE, 'X'),
- (0xEC0, 'V'),
- (0xEC5, 'X'),
- (0xEC6, 'V'),
- (0xEC7, 'X'),
- (0xEC8, 'V'),
- (0xECE, 'X'),
- (0xED0, 'V'),
- (0xEDA, 'X'),
- (0xEDC, 'M', 'ຫນ'),
- (0xEDD, 'M', 'ຫມ'),
- (0xEDE, 'V'),
- (0xEE0, 'X'),
- (0xF00, 'V'),
- (0xF0C, 'M', '་'),
- (0xF0D, 'V'),
- (0xF43, 'M', 'གྷ'),
- (0xF44, 'V'),
- (0xF48, 'X'),
- (0xF49, 'V'),
- (0xF4D, 'M', 'ཌྷ'),
- (0xF4E, 'V'),
- (0xF52, 'M', 'དྷ'),
- (0xF53, 'V'),
- (0xF57, 'M', 'བྷ'),
- (0xF58, 'V'),
- (0xF5C, 'M', 'ཛྷ'),
- (0xF5D, 'V'),
- (0xF69, 'M', 'ཀྵ'),
- (0xF6A, 'V'),
- (0xF6D, 'X'),
- (0xF71, 'V'),
- (0xF73, 'M', 'ཱི'),
- (0xF74, 'V'),
- (0xF75, 'M', 'ཱུ'),
- (0xF76, 'M', 'ྲྀ'),
- (0xF77, 'M', 'ྲཱྀ'),
- (0xF78, 'M', 'ླྀ'),
- (0xF79, 'M', 'ླཱྀ'),
- (0xF7A, 'V'),
- (0xF81, 'M', 'ཱྀ'),
- (0xF82, 'V'),
- (0xF93, 'M', 'ྒྷ'),
- (0xF94, 'V'),
- (0xF98, 'X'),
- (0xF99, 'V'),
- (0xF9D, 'M', 'ྜྷ'),
- (0xF9E, 'V'),
- (0xFA2, 'M', 'ྡྷ'),
- (0xFA3, 'V'),
- (0xFA7, 'M', 'ྦྷ'),
- (0xFA8, 'V'),
- (0xFAC, 'M', 'ྫྷ'),
- (0xFAD, 'V'),
- (0xFB9, 'M', 'ྐྵ'),
- (0xFBA, 'V'),
- (0xFBD, 'X'),
- (0xFBE, 'V'),
- (0xFCD, 'X'),
- (0xFCE, 'V'),
- (0xFDB, 'X'),
- (0x1000, 'V'),
- (0x10A0, 'X'),
- (0x10C7, 'M', 'ⴧ'),
- (0x10C8, 'X'),
- (0x10CD, 'M', 'ⴭ'),
- (0x10CE, 'X'),
- (0x10D0, 'V'),
- (0x10FC, 'M', 'ნ'),
- (0x10FD, 'V'),
- (0x115F, 'X'),
- (0x1161, 'V'),
- (0x1249, 'X'),
- (0x124A, 'V'),
- (0x124E, 'X'),
- (0x1250, 'V'),
- (0x1257, 'X'),
- (0x1258, 'V'),
- (0x1259, 'X'),
- (0x125A, 'V'),
- (0x125E, 'X'),
- (0x1260, 'V'),
- (0x1289, 'X'),
- (0x128A, 'V'),
- (0x128E, 'X'),
- (0x1290, 'V'),
- (0x12B1, 'X'),
- (0x12B2, 'V'),
- (0x12B6, 'X'),
- (0x12B8, 'V'),
- (0x12BF, 'X'),
- (0x12C0, 'V'),
- (0x12C1, 'X'),
- (0x12C2, 'V'),
- (0x12C6, 'X'),
- (0x12C8, 'V'),
- (0x12D7, 'X'),
- (0x12D8, 'V'),
- (0x1311, 'X'),
- (0x1312, 'V'),
- ]
-
-def _seg_14() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1316, 'X'),
- (0x1318, 'V'),
- (0x135B, 'X'),
- (0x135D, 'V'),
- (0x137D, 'X'),
- (0x1380, 'V'),
- (0x139A, 'X'),
- (0x13A0, 'V'),
- (0x13F6, 'X'),
- (0x13F8, 'M', 'Ᏸ'),
- (0x13F9, 'M', 'Ᏹ'),
- (0x13FA, 'M', 'Ᏺ'),
- (0x13FB, 'M', 'Ᏻ'),
- (0x13FC, 'M', 'Ᏼ'),
- (0x13FD, 'M', 'Ᏽ'),
- (0x13FE, 'X'),
- (0x1400, 'V'),
- (0x1680, 'X'),
- (0x1681, 'V'),
- (0x169D, 'X'),
- (0x16A0, 'V'),
- (0x16F9, 'X'),
- (0x1700, 'V'),
- (0x1716, 'X'),
- (0x171F, 'V'),
- (0x1737, 'X'),
- (0x1740, 'V'),
- (0x1754, 'X'),
- (0x1760, 'V'),
- (0x176D, 'X'),
- (0x176E, 'V'),
- (0x1771, 'X'),
- (0x1772, 'V'),
- (0x1774, 'X'),
- (0x1780, 'V'),
- (0x17B4, 'X'),
- (0x17B6, 'V'),
- (0x17DE, 'X'),
- (0x17E0, 'V'),
- (0x17EA, 'X'),
- (0x17F0, 'V'),
- (0x17FA, 'X'),
- (0x1800, 'V'),
- (0x1806, 'X'),
- (0x1807, 'V'),
- (0x180B, 'I'),
- (0x180E, 'X'),
- (0x180F, 'I'),
- (0x1810, 'V'),
- (0x181A, 'X'),
- (0x1820, 'V'),
- (0x1879, 'X'),
- (0x1880, 'V'),
- (0x18AB, 'X'),
- (0x18B0, 'V'),
- (0x18F6, 'X'),
- (0x1900, 'V'),
- (0x191F, 'X'),
- (0x1920, 'V'),
- (0x192C, 'X'),
- (0x1930, 'V'),
- (0x193C, 'X'),
- (0x1940, 'V'),
- (0x1941, 'X'),
- (0x1944, 'V'),
- (0x196E, 'X'),
- (0x1970, 'V'),
- (0x1975, 'X'),
- (0x1980, 'V'),
- (0x19AC, 'X'),
- (0x19B0, 'V'),
- (0x19CA, 'X'),
- (0x19D0, 'V'),
- (0x19DB, 'X'),
- (0x19DE, 'V'),
- (0x1A1C, 'X'),
- (0x1A1E, 'V'),
- (0x1A5F, 'X'),
- (0x1A60, 'V'),
- (0x1A7D, 'X'),
- (0x1A7F, 'V'),
- (0x1A8A, 'X'),
- (0x1A90, 'V'),
- (0x1A9A, 'X'),
- (0x1AA0, 'V'),
- (0x1AAE, 'X'),
- (0x1AB0, 'V'),
- (0x1ACF, 'X'),
- (0x1B00, 'V'),
- (0x1B4D, 'X'),
- (0x1B50, 'V'),
- (0x1B7F, 'X'),
- (0x1B80, 'V'),
- (0x1BF4, 'X'),
- (0x1BFC, 'V'),
- (0x1C38, 'X'),
- (0x1C3B, 'V'),
- (0x1C4A, 'X'),
- (0x1C4D, 'V'),
- (0x1C80, 'M', 'в'),
- ]
-
-def _seg_15() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1C81, 'M', 'д'),
- (0x1C82, 'M', 'о'),
- (0x1C83, 'M', 'с'),
- (0x1C84, 'M', 'т'),
- (0x1C86, 'M', 'ъ'),
- (0x1C87, 'M', 'ѣ'),
- (0x1C88, 'M', 'ꙋ'),
- (0x1C89, 'X'),
- (0x1C90, 'M', 'ა'),
- (0x1C91, 'M', 'ბ'),
- (0x1C92, 'M', 'გ'),
- (0x1C93, 'M', 'დ'),
- (0x1C94, 'M', 'ე'),
- (0x1C95, 'M', 'ვ'),
- (0x1C96, 'M', 'ზ'),
- (0x1C97, 'M', 'თ'),
- (0x1C98, 'M', 'ი'),
- (0x1C99, 'M', 'კ'),
- (0x1C9A, 'M', 'ლ'),
- (0x1C9B, 'M', 'მ'),
- (0x1C9C, 'M', 'ნ'),
- (0x1C9D, 'M', 'ო'),
- (0x1C9E, 'M', 'პ'),
- (0x1C9F, 'M', 'ჟ'),
- (0x1CA0, 'M', 'რ'),
- (0x1CA1, 'M', 'ს'),
- (0x1CA2, 'M', 'ტ'),
- (0x1CA3, 'M', 'უ'),
- (0x1CA4, 'M', 'ფ'),
- (0x1CA5, 'M', 'ქ'),
- (0x1CA6, 'M', 'ღ'),
- (0x1CA7, 'M', 'ყ'),
- (0x1CA8, 'M', 'შ'),
- (0x1CA9, 'M', 'ჩ'),
- (0x1CAA, 'M', 'ც'),
- (0x1CAB, 'M', 'ძ'),
- (0x1CAC, 'M', 'წ'),
- (0x1CAD, 'M', 'ჭ'),
- (0x1CAE, 'M', 'ხ'),
- (0x1CAF, 'M', 'ჯ'),
- (0x1CB0, 'M', 'ჰ'),
- (0x1CB1, 'M', 'ჱ'),
- (0x1CB2, 'M', 'ჲ'),
- (0x1CB3, 'M', 'ჳ'),
- (0x1CB4, 'M', 'ჴ'),
- (0x1CB5, 'M', 'ჵ'),
- (0x1CB6, 'M', 'ჶ'),
- (0x1CB7, 'M', 'ჷ'),
- (0x1CB8, 'M', 'ჸ'),
- (0x1CB9, 'M', 'ჹ'),
- (0x1CBA, 'M', 'ჺ'),
- (0x1CBB, 'X'),
- (0x1CBD, 'M', 'ჽ'),
- (0x1CBE, 'M', 'ჾ'),
- (0x1CBF, 'M', 'ჿ'),
- (0x1CC0, 'V'),
- (0x1CC8, 'X'),
- (0x1CD0, 'V'),
- (0x1CFB, 'X'),
- (0x1D00, 'V'),
- (0x1D2C, 'M', 'a'),
- (0x1D2D, 'M', 'æ'),
- (0x1D2E, 'M', 'b'),
- (0x1D2F, 'V'),
- (0x1D30, 'M', 'd'),
- (0x1D31, 'M', 'e'),
- (0x1D32, 'M', 'ǝ'),
- (0x1D33, 'M', 'g'),
- (0x1D34, 'M', 'h'),
- (0x1D35, 'M', 'i'),
- (0x1D36, 'M', 'j'),
- (0x1D37, 'M', 'k'),
- (0x1D38, 'M', 'l'),
- (0x1D39, 'M', 'm'),
- (0x1D3A, 'M', 'n'),
- (0x1D3B, 'V'),
- (0x1D3C, 'M', 'o'),
- (0x1D3D, 'M', 'ȣ'),
- (0x1D3E, 'M', 'p'),
- (0x1D3F, 'M', 'r'),
- (0x1D40, 'M', 't'),
- (0x1D41, 'M', 'u'),
- (0x1D42, 'M', 'w'),
- (0x1D43, 'M', 'a'),
- (0x1D44, 'M', 'ɐ'),
- (0x1D45, 'M', 'ɑ'),
- (0x1D46, 'M', 'ᴂ'),
- (0x1D47, 'M', 'b'),
- (0x1D48, 'M', 'd'),
- (0x1D49, 'M', 'e'),
- (0x1D4A, 'M', 'ə'),
- (0x1D4B, 'M', 'ɛ'),
- (0x1D4C, 'M', 'ɜ'),
- (0x1D4D, 'M', 'g'),
- (0x1D4E, 'V'),
- (0x1D4F, 'M', 'k'),
- (0x1D50, 'M', 'm'),
- (0x1D51, 'M', 'ŋ'),
- (0x1D52, 'M', 'o'),
- (0x1D53, 'M', 'ɔ'),
- ]
-
-def _seg_16() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D54, 'M', 'ᴖ'),
- (0x1D55, 'M', 'ᴗ'),
- (0x1D56, 'M', 'p'),
- (0x1D57, 'M', 't'),
- (0x1D58, 'M', 'u'),
- (0x1D59, 'M', 'ᴝ'),
- (0x1D5A, 'M', 'ɯ'),
- (0x1D5B, 'M', 'v'),
- (0x1D5C, 'M', 'ᴥ'),
- (0x1D5D, 'M', 'β'),
- (0x1D5E, 'M', 'γ'),
- (0x1D5F, 'M', 'δ'),
- (0x1D60, 'M', 'φ'),
- (0x1D61, 'M', 'χ'),
- (0x1D62, 'M', 'i'),
- (0x1D63, 'M', 'r'),
- (0x1D64, 'M', 'u'),
- (0x1D65, 'M', 'v'),
- (0x1D66, 'M', 'β'),
- (0x1D67, 'M', 'γ'),
- (0x1D68, 'M', 'ρ'),
- (0x1D69, 'M', 'φ'),
- (0x1D6A, 'M', 'χ'),
- (0x1D6B, 'V'),
- (0x1D78, 'M', 'н'),
- (0x1D79, 'V'),
- (0x1D9B, 'M', 'ɒ'),
- (0x1D9C, 'M', 'c'),
- (0x1D9D, 'M', 'ɕ'),
- (0x1D9E, 'M', 'ð'),
- (0x1D9F, 'M', 'ɜ'),
- (0x1DA0, 'M', 'f'),
- (0x1DA1, 'M', 'ɟ'),
- (0x1DA2, 'M', 'ɡ'),
- (0x1DA3, 'M', 'ɥ'),
- (0x1DA4, 'M', 'ɨ'),
- (0x1DA5, 'M', 'ɩ'),
- (0x1DA6, 'M', 'ɪ'),
- (0x1DA7, 'M', 'ᵻ'),
- (0x1DA8, 'M', 'ʝ'),
- (0x1DA9, 'M', 'ɭ'),
- (0x1DAA, 'M', 'ᶅ'),
- (0x1DAB, 'M', 'ʟ'),
- (0x1DAC, 'M', 'ɱ'),
- (0x1DAD, 'M', 'ɰ'),
- (0x1DAE, 'M', 'ɲ'),
- (0x1DAF, 'M', 'ɳ'),
- (0x1DB0, 'M', 'ɴ'),
- (0x1DB1, 'M', 'ɵ'),
- (0x1DB2, 'M', 'ɸ'),
- (0x1DB3, 'M', 'ʂ'),
- (0x1DB4, 'M', 'ʃ'),
- (0x1DB5, 'M', 'ƫ'),
- (0x1DB6, 'M', 'ʉ'),
- (0x1DB7, 'M', 'ʊ'),
- (0x1DB8, 'M', 'ᴜ'),
- (0x1DB9, 'M', 'ʋ'),
- (0x1DBA, 'M', 'ʌ'),
- (0x1DBB, 'M', 'z'),
- (0x1DBC, 'M', 'ʐ'),
- (0x1DBD, 'M', 'ʑ'),
- (0x1DBE, 'M', 'ʒ'),
- (0x1DBF, 'M', 'θ'),
- (0x1DC0, 'V'),
- (0x1E00, 'M', 'ḁ'),
- (0x1E01, 'V'),
- (0x1E02, 'M', 'ḃ'),
- (0x1E03, 'V'),
- (0x1E04, 'M', 'ḅ'),
- (0x1E05, 'V'),
- (0x1E06, 'M', 'ḇ'),
- (0x1E07, 'V'),
- (0x1E08, 'M', 'ḉ'),
- (0x1E09, 'V'),
- (0x1E0A, 'M', 'ḋ'),
- (0x1E0B, 'V'),
- (0x1E0C, 'M', 'ḍ'),
- (0x1E0D, 'V'),
- (0x1E0E, 'M', 'ḏ'),
- (0x1E0F, 'V'),
- (0x1E10, 'M', 'ḑ'),
- (0x1E11, 'V'),
- (0x1E12, 'M', 'ḓ'),
- (0x1E13, 'V'),
- (0x1E14, 'M', 'ḕ'),
- (0x1E15, 'V'),
- (0x1E16, 'M', 'ḗ'),
- (0x1E17, 'V'),
- (0x1E18, 'M', 'ḙ'),
- (0x1E19, 'V'),
- (0x1E1A, 'M', 'ḛ'),
- (0x1E1B, 'V'),
- (0x1E1C, 'M', 'ḝ'),
- (0x1E1D, 'V'),
- (0x1E1E, 'M', 'ḟ'),
- (0x1E1F, 'V'),
- (0x1E20, 'M', 'ḡ'),
- (0x1E21, 'V'),
- (0x1E22, 'M', 'ḣ'),
- (0x1E23, 'V'),
- ]
-
-def _seg_17() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1E24, 'M', 'ḥ'),
- (0x1E25, 'V'),
- (0x1E26, 'M', 'ḧ'),
- (0x1E27, 'V'),
- (0x1E28, 'M', 'ḩ'),
- (0x1E29, 'V'),
- (0x1E2A, 'M', 'ḫ'),
- (0x1E2B, 'V'),
- (0x1E2C, 'M', 'ḭ'),
- (0x1E2D, 'V'),
- (0x1E2E, 'M', 'ḯ'),
- (0x1E2F, 'V'),
- (0x1E30, 'M', 'ḱ'),
- (0x1E31, 'V'),
- (0x1E32, 'M', 'ḳ'),
- (0x1E33, 'V'),
- (0x1E34, 'M', 'ḵ'),
- (0x1E35, 'V'),
- (0x1E36, 'M', 'ḷ'),
- (0x1E37, 'V'),
- (0x1E38, 'M', 'ḹ'),
- (0x1E39, 'V'),
- (0x1E3A, 'M', 'ḻ'),
- (0x1E3B, 'V'),
- (0x1E3C, 'M', 'ḽ'),
- (0x1E3D, 'V'),
- (0x1E3E, 'M', 'ḿ'),
- (0x1E3F, 'V'),
- (0x1E40, 'M', 'ṁ'),
- (0x1E41, 'V'),
- (0x1E42, 'M', 'ṃ'),
- (0x1E43, 'V'),
- (0x1E44, 'M', 'ṅ'),
- (0x1E45, 'V'),
- (0x1E46, 'M', 'ṇ'),
- (0x1E47, 'V'),
- (0x1E48, 'M', 'ṉ'),
- (0x1E49, 'V'),
- (0x1E4A, 'M', 'ṋ'),
- (0x1E4B, 'V'),
- (0x1E4C, 'M', 'ṍ'),
- (0x1E4D, 'V'),
- (0x1E4E, 'M', 'ṏ'),
- (0x1E4F, 'V'),
- (0x1E50, 'M', 'ṑ'),
- (0x1E51, 'V'),
- (0x1E52, 'M', 'ṓ'),
- (0x1E53, 'V'),
- (0x1E54, 'M', 'ṕ'),
- (0x1E55, 'V'),
- (0x1E56, 'M', 'ṗ'),
- (0x1E57, 'V'),
- (0x1E58, 'M', 'ṙ'),
- (0x1E59, 'V'),
- (0x1E5A, 'M', 'ṛ'),
- (0x1E5B, 'V'),
- (0x1E5C, 'M', 'ṝ'),
- (0x1E5D, 'V'),
- (0x1E5E, 'M', 'ṟ'),
- (0x1E5F, 'V'),
- (0x1E60, 'M', 'ṡ'),
- (0x1E61, 'V'),
- (0x1E62, 'M', 'ṣ'),
- (0x1E63, 'V'),
- (0x1E64, 'M', 'ṥ'),
- (0x1E65, 'V'),
- (0x1E66, 'M', 'ṧ'),
- (0x1E67, 'V'),
- (0x1E68, 'M', 'ṩ'),
- (0x1E69, 'V'),
- (0x1E6A, 'M', 'ṫ'),
- (0x1E6B, 'V'),
- (0x1E6C, 'M', 'ṭ'),
- (0x1E6D, 'V'),
- (0x1E6E, 'M', 'ṯ'),
- (0x1E6F, 'V'),
- (0x1E70, 'M', 'ṱ'),
- (0x1E71, 'V'),
- (0x1E72, 'M', 'ṳ'),
- (0x1E73, 'V'),
- (0x1E74, 'M', 'ṵ'),
- (0x1E75, 'V'),
- (0x1E76, 'M', 'ṷ'),
- (0x1E77, 'V'),
- (0x1E78, 'M', 'ṹ'),
- (0x1E79, 'V'),
- (0x1E7A, 'M', 'ṻ'),
- (0x1E7B, 'V'),
- (0x1E7C, 'M', 'ṽ'),
- (0x1E7D, 'V'),
- (0x1E7E, 'M', 'ṿ'),
- (0x1E7F, 'V'),
- (0x1E80, 'M', 'ẁ'),
- (0x1E81, 'V'),
- (0x1E82, 'M', 'ẃ'),
- (0x1E83, 'V'),
- (0x1E84, 'M', 'ẅ'),
- (0x1E85, 'V'),
- (0x1E86, 'M', 'ẇ'),
- (0x1E87, 'V'),
- ]
-
-def _seg_18() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1E88, 'M', 'ẉ'),
- (0x1E89, 'V'),
- (0x1E8A, 'M', 'ẋ'),
- (0x1E8B, 'V'),
- (0x1E8C, 'M', 'ẍ'),
- (0x1E8D, 'V'),
- (0x1E8E, 'M', 'ẏ'),
- (0x1E8F, 'V'),
- (0x1E90, 'M', 'ẑ'),
- (0x1E91, 'V'),
- (0x1E92, 'M', 'ẓ'),
- (0x1E93, 'V'),
- (0x1E94, 'M', 'ẕ'),
- (0x1E95, 'V'),
- (0x1E9A, 'M', 'aʾ'),
- (0x1E9B, 'M', 'ṡ'),
- (0x1E9C, 'V'),
- (0x1E9E, 'M', 'ss'),
- (0x1E9F, 'V'),
- (0x1EA0, 'M', 'ạ'),
- (0x1EA1, 'V'),
- (0x1EA2, 'M', 'ả'),
- (0x1EA3, 'V'),
- (0x1EA4, 'M', 'ấ'),
- (0x1EA5, 'V'),
- (0x1EA6, 'M', 'ầ'),
- (0x1EA7, 'V'),
- (0x1EA8, 'M', 'ẩ'),
- (0x1EA9, 'V'),
- (0x1EAA, 'M', 'ẫ'),
- (0x1EAB, 'V'),
- (0x1EAC, 'M', 'ậ'),
- (0x1EAD, 'V'),
- (0x1EAE, 'M', 'ắ'),
- (0x1EAF, 'V'),
- (0x1EB0, 'M', 'ằ'),
- (0x1EB1, 'V'),
- (0x1EB2, 'M', 'ẳ'),
- (0x1EB3, 'V'),
- (0x1EB4, 'M', 'ẵ'),
- (0x1EB5, 'V'),
- (0x1EB6, 'M', 'ặ'),
- (0x1EB7, 'V'),
- (0x1EB8, 'M', 'ẹ'),
- (0x1EB9, 'V'),
- (0x1EBA, 'M', 'ẻ'),
- (0x1EBB, 'V'),
- (0x1EBC, 'M', 'ẽ'),
- (0x1EBD, 'V'),
- (0x1EBE, 'M', 'ế'),
- (0x1EBF, 'V'),
- (0x1EC0, 'M', 'ề'),
- (0x1EC1, 'V'),
- (0x1EC2, 'M', 'ể'),
- (0x1EC3, 'V'),
- (0x1EC4, 'M', 'ễ'),
- (0x1EC5, 'V'),
- (0x1EC6, 'M', 'ệ'),
- (0x1EC7, 'V'),
- (0x1EC8, 'M', 'ỉ'),
- (0x1EC9, 'V'),
- (0x1ECA, 'M', 'ị'),
- (0x1ECB, 'V'),
- (0x1ECC, 'M', 'ọ'),
- (0x1ECD, 'V'),
- (0x1ECE, 'M', 'ỏ'),
- (0x1ECF, 'V'),
- (0x1ED0, 'M', 'ố'),
- (0x1ED1, 'V'),
- (0x1ED2, 'M', 'ồ'),
- (0x1ED3, 'V'),
- (0x1ED4, 'M', 'ổ'),
- (0x1ED5, 'V'),
- (0x1ED6, 'M', 'ỗ'),
- (0x1ED7, 'V'),
- (0x1ED8, 'M', 'ộ'),
- (0x1ED9, 'V'),
- (0x1EDA, 'M', 'ớ'),
- (0x1EDB, 'V'),
- (0x1EDC, 'M', 'ờ'),
- (0x1EDD, 'V'),
- (0x1EDE, 'M', 'ở'),
- (0x1EDF, 'V'),
- (0x1EE0, 'M', 'ỡ'),
- (0x1EE1, 'V'),
- (0x1EE2, 'M', 'ợ'),
- (0x1EE3, 'V'),
- (0x1EE4, 'M', 'ụ'),
- (0x1EE5, 'V'),
- (0x1EE6, 'M', 'ủ'),
- (0x1EE7, 'V'),
- (0x1EE8, 'M', 'ứ'),
- (0x1EE9, 'V'),
- (0x1EEA, 'M', 'ừ'),
- (0x1EEB, 'V'),
- (0x1EEC, 'M', 'ử'),
- (0x1EED, 'V'),
- (0x1EEE, 'M', 'ữ'),
- (0x1EEF, 'V'),
- (0x1EF0, 'M', 'ự'),
- ]
-
-def _seg_19() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1EF1, 'V'),
- (0x1EF2, 'M', 'ỳ'),
- (0x1EF3, 'V'),
- (0x1EF4, 'M', 'ỵ'),
- (0x1EF5, 'V'),
- (0x1EF6, 'M', 'ỷ'),
- (0x1EF7, 'V'),
- (0x1EF8, 'M', 'ỹ'),
- (0x1EF9, 'V'),
- (0x1EFA, 'M', 'ỻ'),
- (0x1EFB, 'V'),
- (0x1EFC, 'M', 'ỽ'),
- (0x1EFD, 'V'),
- (0x1EFE, 'M', 'ỿ'),
- (0x1EFF, 'V'),
- (0x1F08, 'M', 'ἀ'),
- (0x1F09, 'M', 'ἁ'),
- (0x1F0A, 'M', 'ἂ'),
- (0x1F0B, 'M', 'ἃ'),
- (0x1F0C, 'M', 'ἄ'),
- (0x1F0D, 'M', 'ἅ'),
- (0x1F0E, 'M', 'ἆ'),
- (0x1F0F, 'M', 'ἇ'),
- (0x1F10, 'V'),
- (0x1F16, 'X'),
- (0x1F18, 'M', 'ἐ'),
- (0x1F19, 'M', 'ἑ'),
- (0x1F1A, 'M', 'ἒ'),
- (0x1F1B, 'M', 'ἓ'),
- (0x1F1C, 'M', 'ἔ'),
- (0x1F1D, 'M', 'ἕ'),
- (0x1F1E, 'X'),
- (0x1F20, 'V'),
- (0x1F28, 'M', 'ἠ'),
- (0x1F29, 'M', 'ἡ'),
- (0x1F2A, 'M', 'ἢ'),
- (0x1F2B, 'M', 'ἣ'),
- (0x1F2C, 'M', 'ἤ'),
- (0x1F2D, 'M', 'ἥ'),
- (0x1F2E, 'M', 'ἦ'),
- (0x1F2F, 'M', 'ἧ'),
- (0x1F30, 'V'),
- (0x1F38, 'M', 'ἰ'),
- (0x1F39, 'M', 'ἱ'),
- (0x1F3A, 'M', 'ἲ'),
- (0x1F3B, 'M', 'ἳ'),
- (0x1F3C, 'M', 'ἴ'),
- (0x1F3D, 'M', 'ἵ'),
- (0x1F3E, 'M', 'ἶ'),
- (0x1F3F, 'M', 'ἷ'),
- (0x1F40, 'V'),
- (0x1F46, 'X'),
- (0x1F48, 'M', 'ὀ'),
- (0x1F49, 'M', 'ὁ'),
- (0x1F4A, 'M', 'ὂ'),
- (0x1F4B, 'M', 'ὃ'),
- (0x1F4C, 'M', 'ὄ'),
- (0x1F4D, 'M', 'ὅ'),
- (0x1F4E, 'X'),
- (0x1F50, 'V'),
- (0x1F58, 'X'),
- (0x1F59, 'M', 'ὑ'),
- (0x1F5A, 'X'),
- (0x1F5B, 'M', 'ὓ'),
- (0x1F5C, 'X'),
- (0x1F5D, 'M', 'ὕ'),
- (0x1F5E, 'X'),
- (0x1F5F, 'M', 'ὗ'),
- (0x1F60, 'V'),
- (0x1F68, 'M', 'ὠ'),
- (0x1F69, 'M', 'ὡ'),
- (0x1F6A, 'M', 'ὢ'),
- (0x1F6B, 'M', 'ὣ'),
- (0x1F6C, 'M', 'ὤ'),
- (0x1F6D, 'M', 'ὥ'),
- (0x1F6E, 'M', 'ὦ'),
- (0x1F6F, 'M', 'ὧ'),
- (0x1F70, 'V'),
- (0x1F71, 'M', 'ά'),
- (0x1F72, 'V'),
- (0x1F73, 'M', 'έ'),
- (0x1F74, 'V'),
- (0x1F75, 'M', 'ή'),
- (0x1F76, 'V'),
- (0x1F77, 'M', 'ί'),
- (0x1F78, 'V'),
- (0x1F79, 'M', 'ό'),
- (0x1F7A, 'V'),
- (0x1F7B, 'M', 'ύ'),
- (0x1F7C, 'V'),
- (0x1F7D, 'M', 'ώ'),
- (0x1F7E, 'X'),
- (0x1F80, 'M', 'ἀι'),
- (0x1F81, 'M', 'ἁι'),
- (0x1F82, 'M', 'ἂι'),
- (0x1F83, 'M', 'ἃι'),
- (0x1F84, 'M', 'ἄι'),
- (0x1F85, 'M', 'ἅι'),
- (0x1F86, 'M', 'ἆι'),
- (0x1F87, 'M', 'ἇι'),
- ]
-
-def _seg_20() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1F88, 'M', 'ἀι'),
- (0x1F89, 'M', 'ἁι'),
- (0x1F8A, 'M', 'ἂι'),
- (0x1F8B, 'M', 'ἃι'),
- (0x1F8C, 'M', 'ἄι'),
- (0x1F8D, 'M', 'ἅι'),
- (0x1F8E, 'M', 'ἆι'),
- (0x1F8F, 'M', 'ἇι'),
- (0x1F90, 'M', 'ἠι'),
- (0x1F91, 'M', 'ἡι'),
- (0x1F92, 'M', 'ἢι'),
- (0x1F93, 'M', 'ἣι'),
- (0x1F94, 'M', 'ἤι'),
- (0x1F95, 'M', 'ἥι'),
- (0x1F96, 'M', 'ἦι'),
- (0x1F97, 'M', 'ἧι'),
- (0x1F98, 'M', 'ἠι'),
- (0x1F99, 'M', 'ἡι'),
- (0x1F9A, 'M', 'ἢι'),
- (0x1F9B, 'M', 'ἣι'),
- (0x1F9C, 'M', 'ἤι'),
- (0x1F9D, 'M', 'ἥι'),
- (0x1F9E, 'M', 'ἦι'),
- (0x1F9F, 'M', 'ἧι'),
- (0x1FA0, 'M', 'ὠι'),
- (0x1FA1, 'M', 'ὡι'),
- (0x1FA2, 'M', 'ὢι'),
- (0x1FA3, 'M', 'ὣι'),
- (0x1FA4, 'M', 'ὤι'),
- (0x1FA5, 'M', 'ὥι'),
- (0x1FA6, 'M', 'ὦι'),
- (0x1FA7, 'M', 'ὧι'),
- (0x1FA8, 'M', 'ὠι'),
- (0x1FA9, 'M', 'ὡι'),
- (0x1FAA, 'M', 'ὢι'),
- (0x1FAB, 'M', 'ὣι'),
- (0x1FAC, 'M', 'ὤι'),
- (0x1FAD, 'M', 'ὥι'),
- (0x1FAE, 'M', 'ὦι'),
- (0x1FAF, 'M', 'ὧι'),
- (0x1FB0, 'V'),
- (0x1FB2, 'M', 'ὰι'),
- (0x1FB3, 'M', 'αι'),
- (0x1FB4, 'M', 'άι'),
- (0x1FB5, 'X'),
- (0x1FB6, 'V'),
- (0x1FB7, 'M', 'ᾶι'),
- (0x1FB8, 'M', 'ᾰ'),
- (0x1FB9, 'M', 'ᾱ'),
- (0x1FBA, 'M', 'ὰ'),
- (0x1FBB, 'M', 'ά'),
- (0x1FBC, 'M', 'αι'),
- (0x1FBD, '3', ' ̓'),
- (0x1FBE, 'M', 'ι'),
- (0x1FBF, '3', ' ̓'),
- (0x1FC0, '3', ' ͂'),
- (0x1FC1, '3', ' ̈͂'),
- (0x1FC2, 'M', 'ὴι'),
- (0x1FC3, 'M', 'ηι'),
- (0x1FC4, 'M', 'ήι'),
- (0x1FC5, 'X'),
- (0x1FC6, 'V'),
- (0x1FC7, 'M', 'ῆι'),
- (0x1FC8, 'M', 'ὲ'),
- (0x1FC9, 'M', 'έ'),
- (0x1FCA, 'M', 'ὴ'),
- (0x1FCB, 'M', 'ή'),
- (0x1FCC, 'M', 'ηι'),
- (0x1FCD, '3', ' ̓̀'),
- (0x1FCE, '3', ' ̓́'),
- (0x1FCF, '3', ' ̓͂'),
- (0x1FD0, 'V'),
- (0x1FD3, 'M', 'ΐ'),
- (0x1FD4, 'X'),
- (0x1FD6, 'V'),
- (0x1FD8, 'M', 'ῐ'),
- (0x1FD9, 'M', 'ῑ'),
- (0x1FDA, 'M', 'ὶ'),
- (0x1FDB, 'M', 'ί'),
- (0x1FDC, 'X'),
- (0x1FDD, '3', ' ̔̀'),
- (0x1FDE, '3', ' ̔́'),
- (0x1FDF, '3', ' ̔͂'),
- (0x1FE0, 'V'),
- (0x1FE3, 'M', 'ΰ'),
- (0x1FE4, 'V'),
- (0x1FE8, 'M', 'ῠ'),
- (0x1FE9, 'M', 'ῡ'),
- (0x1FEA, 'M', 'ὺ'),
- (0x1FEB, 'M', 'ύ'),
- (0x1FEC, 'M', 'ῥ'),
- (0x1FED, '3', ' ̈̀'),
- (0x1FEE, '3', ' ̈́'),
- (0x1FEF, '3', '`'),
- (0x1FF0, 'X'),
- (0x1FF2, 'M', 'ὼι'),
- (0x1FF3, 'M', 'ωι'),
- (0x1FF4, 'M', 'ώι'),
- (0x1FF5, 'X'),
- (0x1FF6, 'V'),
- ]
-
-def _seg_21() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1FF7, 'M', 'ῶι'),
- (0x1FF8, 'M', 'ὸ'),
- (0x1FF9, 'M', 'ό'),
- (0x1FFA, 'M', 'ὼ'),
- (0x1FFB, 'M', 'ώ'),
- (0x1FFC, 'M', 'ωι'),
- (0x1FFD, '3', ' ́'),
- (0x1FFE, '3', ' ̔'),
- (0x1FFF, 'X'),
- (0x2000, '3', ' '),
- (0x200B, 'I'),
- (0x200C, 'D', ''),
- (0x200E, 'X'),
- (0x2010, 'V'),
- (0x2011, 'M', '‐'),
- (0x2012, 'V'),
- (0x2017, '3', ' ̳'),
- (0x2018, 'V'),
- (0x2024, 'X'),
- (0x2027, 'V'),
- (0x2028, 'X'),
- (0x202F, '3', ' '),
- (0x2030, 'V'),
- (0x2033, 'M', '′′'),
- (0x2034, 'M', '′′′'),
- (0x2035, 'V'),
- (0x2036, 'M', '‵‵'),
- (0x2037, 'M', '‵‵‵'),
- (0x2038, 'V'),
- (0x203C, '3', '!!'),
- (0x203D, 'V'),
- (0x203E, '3', ' ̅'),
- (0x203F, 'V'),
- (0x2047, '3', '??'),
- (0x2048, '3', '?!'),
- (0x2049, '3', '!?'),
- (0x204A, 'V'),
- (0x2057, 'M', '′′′′'),
- (0x2058, 'V'),
- (0x205F, '3', ' '),
- (0x2060, 'I'),
- (0x2061, 'X'),
- (0x2064, 'I'),
- (0x2065, 'X'),
- (0x2070, 'M', '0'),
- (0x2071, 'M', 'i'),
- (0x2072, 'X'),
- (0x2074, 'M', '4'),
- (0x2075, 'M', '5'),
- (0x2076, 'M', '6'),
- (0x2077, 'M', '7'),
- (0x2078, 'M', '8'),
- (0x2079, 'M', '9'),
- (0x207A, '3', '+'),
- (0x207B, 'M', '−'),
- (0x207C, '3', '='),
- (0x207D, '3', '('),
- (0x207E, '3', ')'),
- (0x207F, 'M', 'n'),
- (0x2080, 'M', '0'),
- (0x2081, 'M', '1'),
- (0x2082, 'M', '2'),
- (0x2083, 'M', '3'),
- (0x2084, 'M', '4'),
- (0x2085, 'M', '5'),
- (0x2086, 'M', '6'),
- (0x2087, 'M', '7'),
- (0x2088, 'M', '8'),
- (0x2089, 'M', '9'),
- (0x208A, '3', '+'),
- (0x208B, 'M', '−'),
- (0x208C, '3', '='),
- (0x208D, '3', '('),
- (0x208E, '3', ')'),
- (0x208F, 'X'),
- (0x2090, 'M', 'a'),
- (0x2091, 'M', 'e'),
- (0x2092, 'M', 'o'),
- (0x2093, 'M', 'x'),
- (0x2094, 'M', 'ə'),
- (0x2095, 'M', 'h'),
- (0x2096, 'M', 'k'),
- (0x2097, 'M', 'l'),
- (0x2098, 'M', 'm'),
- (0x2099, 'M', 'n'),
- (0x209A, 'M', 'p'),
- (0x209B, 'M', 's'),
- (0x209C, 'M', 't'),
- (0x209D, 'X'),
- (0x20A0, 'V'),
- (0x20A8, 'M', 'rs'),
- (0x20A9, 'V'),
- (0x20C1, 'X'),
- (0x20D0, 'V'),
- (0x20F1, 'X'),
- (0x2100, '3', 'a/c'),
- (0x2101, '3', 'a/s'),
- (0x2102, 'M', 'c'),
- (0x2103, 'M', '°c'),
- (0x2104, 'V'),
- ]
-
-def _seg_22() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2105, '3', 'c/o'),
- (0x2106, '3', 'c/u'),
- (0x2107, 'M', 'ɛ'),
- (0x2108, 'V'),
- (0x2109, 'M', '°f'),
- (0x210A, 'M', 'g'),
- (0x210B, 'M', 'h'),
- (0x210F, 'M', 'ħ'),
- (0x2110, 'M', 'i'),
- (0x2112, 'M', 'l'),
- (0x2114, 'V'),
- (0x2115, 'M', 'n'),
- (0x2116, 'M', 'no'),
- (0x2117, 'V'),
- (0x2119, 'M', 'p'),
- (0x211A, 'M', 'q'),
- (0x211B, 'M', 'r'),
- (0x211E, 'V'),
- (0x2120, 'M', 'sm'),
- (0x2121, 'M', 'tel'),
- (0x2122, 'M', 'tm'),
- (0x2123, 'V'),
- (0x2124, 'M', 'z'),
- (0x2125, 'V'),
- (0x2126, 'M', 'ω'),
- (0x2127, 'V'),
- (0x2128, 'M', 'z'),
- (0x2129, 'V'),
- (0x212A, 'M', 'k'),
- (0x212B, 'M', 'å'),
- (0x212C, 'M', 'b'),
- (0x212D, 'M', 'c'),
- (0x212E, 'V'),
- (0x212F, 'M', 'e'),
- (0x2131, 'M', 'f'),
- (0x2132, 'X'),
- (0x2133, 'M', 'm'),
- (0x2134, 'M', 'o'),
- (0x2135, 'M', 'א'),
- (0x2136, 'M', 'ב'),
- (0x2137, 'M', 'ג'),
- (0x2138, 'M', 'ד'),
- (0x2139, 'M', 'i'),
- (0x213A, 'V'),
- (0x213B, 'M', 'fax'),
- (0x213C, 'M', 'π'),
- (0x213D, 'M', 'γ'),
- (0x213F, 'M', 'π'),
- (0x2140, 'M', '∑'),
- (0x2141, 'V'),
- (0x2145, 'M', 'd'),
- (0x2147, 'M', 'e'),
- (0x2148, 'M', 'i'),
- (0x2149, 'M', 'j'),
- (0x214A, 'V'),
- (0x2150, 'M', '1⁄7'),
- (0x2151, 'M', '1⁄9'),
- (0x2152, 'M', '1⁄10'),
- (0x2153, 'M', '1⁄3'),
- (0x2154, 'M', '2⁄3'),
- (0x2155, 'M', '1⁄5'),
- (0x2156, 'M', '2⁄5'),
- (0x2157, 'M', '3⁄5'),
- (0x2158, 'M', '4⁄5'),
- (0x2159, 'M', '1⁄6'),
- (0x215A, 'M', '5⁄6'),
- (0x215B, 'M', '1⁄8'),
- (0x215C, 'M', '3⁄8'),
- (0x215D, 'M', '5⁄8'),
- (0x215E, 'M', '7⁄8'),
- (0x215F, 'M', '1⁄'),
- (0x2160, 'M', 'i'),
- (0x2161, 'M', 'ii'),
- (0x2162, 'M', 'iii'),
- (0x2163, 'M', 'iv'),
- (0x2164, 'M', 'v'),
- (0x2165, 'M', 'vi'),
- (0x2166, 'M', 'vii'),
- (0x2167, 'M', 'viii'),
- (0x2168, 'M', 'ix'),
- (0x2169, 'M', 'x'),
- (0x216A, 'M', 'xi'),
- (0x216B, 'M', 'xii'),
- (0x216C, 'M', 'l'),
- (0x216D, 'M', 'c'),
- (0x216E, 'M', 'd'),
- (0x216F, 'M', 'm'),
- (0x2170, 'M', 'i'),
- (0x2171, 'M', 'ii'),
- (0x2172, 'M', 'iii'),
- (0x2173, 'M', 'iv'),
- (0x2174, 'M', 'v'),
- (0x2175, 'M', 'vi'),
- (0x2176, 'M', 'vii'),
- (0x2177, 'M', 'viii'),
- (0x2178, 'M', 'ix'),
- (0x2179, 'M', 'x'),
- (0x217A, 'M', 'xi'),
- (0x217B, 'M', 'xii'),
- (0x217C, 'M', 'l'),
- ]
-
-def _seg_23() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x217D, 'M', 'c'),
- (0x217E, 'M', 'd'),
- (0x217F, 'M', 'm'),
- (0x2180, 'V'),
- (0x2183, 'X'),
- (0x2184, 'V'),
- (0x2189, 'M', '0⁄3'),
- (0x218A, 'V'),
- (0x218C, 'X'),
- (0x2190, 'V'),
- (0x222C, 'M', '∫∫'),
- (0x222D, 'M', '∫∫∫'),
- (0x222E, 'V'),
- (0x222F, 'M', '∮∮'),
- (0x2230, 'M', '∮∮∮'),
- (0x2231, 'V'),
- (0x2260, '3'),
- (0x2261, 'V'),
- (0x226E, '3'),
- (0x2270, 'V'),
- (0x2329, 'M', '〈'),
- (0x232A, 'M', '〉'),
- (0x232B, 'V'),
- (0x2427, 'X'),
- (0x2440, 'V'),
- (0x244B, 'X'),
- (0x2460, 'M', '1'),
- (0x2461, 'M', '2'),
- (0x2462, 'M', '3'),
- (0x2463, 'M', '4'),
- (0x2464, 'M', '5'),
- (0x2465, 'M', '6'),
- (0x2466, 'M', '7'),
- (0x2467, 'M', '8'),
- (0x2468, 'M', '9'),
- (0x2469, 'M', '10'),
- (0x246A, 'M', '11'),
- (0x246B, 'M', '12'),
- (0x246C, 'M', '13'),
- (0x246D, 'M', '14'),
- (0x246E, 'M', '15'),
- (0x246F, 'M', '16'),
- (0x2470, 'M', '17'),
- (0x2471, 'M', '18'),
- (0x2472, 'M', '19'),
- (0x2473, 'M', '20'),
- (0x2474, '3', '(1)'),
- (0x2475, '3', '(2)'),
- (0x2476, '3', '(3)'),
- (0x2477, '3', '(4)'),
- (0x2478, '3', '(5)'),
- (0x2479, '3', '(6)'),
- (0x247A, '3', '(7)'),
- (0x247B, '3', '(8)'),
- (0x247C, '3', '(9)'),
- (0x247D, '3', '(10)'),
- (0x247E, '3', '(11)'),
- (0x247F, '3', '(12)'),
- (0x2480, '3', '(13)'),
- (0x2481, '3', '(14)'),
- (0x2482, '3', '(15)'),
- (0x2483, '3', '(16)'),
- (0x2484, '3', '(17)'),
- (0x2485, '3', '(18)'),
- (0x2486, '3', '(19)'),
- (0x2487, '3', '(20)'),
- (0x2488, 'X'),
- (0x249C, '3', '(a)'),
- (0x249D, '3', '(b)'),
- (0x249E, '3', '(c)'),
- (0x249F, '3', '(d)'),
- (0x24A0, '3', '(e)'),
- (0x24A1, '3', '(f)'),
- (0x24A2, '3', '(g)'),
- (0x24A3, '3', '(h)'),
- (0x24A4, '3', '(i)'),
- (0x24A5, '3', '(j)'),
- (0x24A6, '3', '(k)'),
- (0x24A7, '3', '(l)'),
- (0x24A8, '3', '(m)'),
- (0x24A9, '3', '(n)'),
- (0x24AA, '3', '(o)'),
- (0x24AB, '3', '(p)'),
- (0x24AC, '3', '(q)'),
- (0x24AD, '3', '(r)'),
- (0x24AE, '3', '(s)'),
- (0x24AF, '3', '(t)'),
- (0x24B0, '3', '(u)'),
- (0x24B1, '3', '(v)'),
- (0x24B2, '3', '(w)'),
- (0x24B3, '3', '(x)'),
- (0x24B4, '3', '(y)'),
- (0x24B5, '3', '(z)'),
- (0x24B6, 'M', 'a'),
- (0x24B7, 'M', 'b'),
- (0x24B8, 'M', 'c'),
- (0x24B9, 'M', 'd'),
- (0x24BA, 'M', 'e'),
- (0x24BB, 'M', 'f'),
- (0x24BC, 'M', 'g'),
- ]
-
-def _seg_24() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x24BD, 'M', 'h'),
- (0x24BE, 'M', 'i'),
- (0x24BF, 'M', 'j'),
- (0x24C0, 'M', 'k'),
- (0x24C1, 'M', 'l'),
- (0x24C2, 'M', 'm'),
- (0x24C3, 'M', 'n'),
- (0x24C4, 'M', 'o'),
- (0x24C5, 'M', 'p'),
- (0x24C6, 'M', 'q'),
- (0x24C7, 'M', 'r'),
- (0x24C8, 'M', 's'),
- (0x24C9, 'M', 't'),
- (0x24CA, 'M', 'u'),
- (0x24CB, 'M', 'v'),
- (0x24CC, 'M', 'w'),
- (0x24CD, 'M', 'x'),
- (0x24CE, 'M', 'y'),
- (0x24CF, 'M', 'z'),
- (0x24D0, 'M', 'a'),
- (0x24D1, 'M', 'b'),
- (0x24D2, 'M', 'c'),
- (0x24D3, 'M', 'd'),
- (0x24D4, 'M', 'e'),
- (0x24D5, 'M', 'f'),
- (0x24D6, 'M', 'g'),
- (0x24D7, 'M', 'h'),
- (0x24D8, 'M', 'i'),
- (0x24D9, 'M', 'j'),
- (0x24DA, 'M', 'k'),
- (0x24DB, 'M', 'l'),
- (0x24DC, 'M', 'm'),
- (0x24DD, 'M', 'n'),
- (0x24DE, 'M', 'o'),
- (0x24DF, 'M', 'p'),
- (0x24E0, 'M', 'q'),
- (0x24E1, 'M', 'r'),
- (0x24E2, 'M', 's'),
- (0x24E3, 'M', 't'),
- (0x24E4, 'M', 'u'),
- (0x24E5, 'M', 'v'),
- (0x24E6, 'M', 'w'),
- (0x24E7, 'M', 'x'),
- (0x24E8, 'M', 'y'),
- (0x24E9, 'M', 'z'),
- (0x24EA, 'M', '0'),
- (0x24EB, 'V'),
- (0x2A0C, 'M', '∫∫∫∫'),
- (0x2A0D, 'V'),
- (0x2A74, '3', '::='),
- (0x2A75, '3', '=='),
- (0x2A76, '3', '==='),
- (0x2A77, 'V'),
- (0x2ADC, 'M', '⫝̸'),
- (0x2ADD, 'V'),
- (0x2B74, 'X'),
- (0x2B76, 'V'),
- (0x2B96, 'X'),
- (0x2B97, 'V'),
- (0x2C00, 'M', 'ⰰ'),
- (0x2C01, 'M', 'ⰱ'),
- (0x2C02, 'M', 'ⰲ'),
- (0x2C03, 'M', 'ⰳ'),
- (0x2C04, 'M', 'ⰴ'),
- (0x2C05, 'M', 'ⰵ'),
- (0x2C06, 'M', 'ⰶ'),
- (0x2C07, 'M', 'ⰷ'),
- (0x2C08, 'M', 'ⰸ'),
- (0x2C09, 'M', 'ⰹ'),
- (0x2C0A, 'M', 'ⰺ'),
- (0x2C0B, 'M', 'ⰻ'),
- (0x2C0C, 'M', 'ⰼ'),
- (0x2C0D, 'M', 'ⰽ'),
- (0x2C0E, 'M', 'ⰾ'),
- (0x2C0F, 'M', 'ⰿ'),
- (0x2C10, 'M', 'ⱀ'),
- (0x2C11, 'M', 'ⱁ'),
- (0x2C12, 'M', 'ⱂ'),
- (0x2C13, 'M', 'ⱃ'),
- (0x2C14, 'M', 'ⱄ'),
- (0x2C15, 'M', 'ⱅ'),
- (0x2C16, 'M', 'ⱆ'),
- (0x2C17, 'M', 'ⱇ'),
- (0x2C18, 'M', 'ⱈ'),
- (0x2C19, 'M', 'ⱉ'),
- (0x2C1A, 'M', 'ⱊ'),
- (0x2C1B, 'M', 'ⱋ'),
- (0x2C1C, 'M', 'ⱌ'),
- (0x2C1D, 'M', 'ⱍ'),
- (0x2C1E, 'M', 'ⱎ'),
- (0x2C1F, 'M', 'ⱏ'),
- (0x2C20, 'M', 'ⱐ'),
- (0x2C21, 'M', 'ⱑ'),
- (0x2C22, 'M', 'ⱒ'),
- (0x2C23, 'M', 'ⱓ'),
- (0x2C24, 'M', 'ⱔ'),
- (0x2C25, 'M', 'ⱕ'),
- (0x2C26, 'M', 'ⱖ'),
- (0x2C27, 'M', 'ⱗ'),
- (0x2C28, 'M', 'ⱘ'),
- ]
-
-def _seg_25() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2C29, 'M', 'ⱙ'),
- (0x2C2A, 'M', 'ⱚ'),
- (0x2C2B, 'M', 'ⱛ'),
- (0x2C2C, 'M', 'ⱜ'),
- (0x2C2D, 'M', 'ⱝ'),
- (0x2C2E, 'M', 'ⱞ'),
- (0x2C2F, 'M', 'ⱟ'),
- (0x2C30, 'V'),
- (0x2C60, 'M', 'ⱡ'),
- (0x2C61, 'V'),
- (0x2C62, 'M', 'ɫ'),
- (0x2C63, 'M', 'ᵽ'),
- (0x2C64, 'M', 'ɽ'),
- (0x2C65, 'V'),
- (0x2C67, 'M', 'ⱨ'),
- (0x2C68, 'V'),
- (0x2C69, 'M', 'ⱪ'),
- (0x2C6A, 'V'),
- (0x2C6B, 'M', 'ⱬ'),
- (0x2C6C, 'V'),
- (0x2C6D, 'M', 'ɑ'),
- (0x2C6E, 'M', 'ɱ'),
- (0x2C6F, 'M', 'ɐ'),
- (0x2C70, 'M', 'ɒ'),
- (0x2C71, 'V'),
- (0x2C72, 'M', 'ⱳ'),
- (0x2C73, 'V'),
- (0x2C75, 'M', 'ⱶ'),
- (0x2C76, 'V'),
- (0x2C7C, 'M', 'j'),
- (0x2C7D, 'M', 'v'),
- (0x2C7E, 'M', 'ȿ'),
- (0x2C7F, 'M', 'ɀ'),
- (0x2C80, 'M', 'ⲁ'),
- (0x2C81, 'V'),
- (0x2C82, 'M', 'ⲃ'),
- (0x2C83, 'V'),
- (0x2C84, 'M', 'ⲅ'),
- (0x2C85, 'V'),
- (0x2C86, 'M', 'ⲇ'),
- (0x2C87, 'V'),
- (0x2C88, 'M', 'ⲉ'),
- (0x2C89, 'V'),
- (0x2C8A, 'M', 'ⲋ'),
- (0x2C8B, 'V'),
- (0x2C8C, 'M', 'ⲍ'),
- (0x2C8D, 'V'),
- (0x2C8E, 'M', 'ⲏ'),
- (0x2C8F, 'V'),
- (0x2C90, 'M', 'ⲑ'),
- (0x2C91, 'V'),
- (0x2C92, 'M', 'ⲓ'),
- (0x2C93, 'V'),
- (0x2C94, 'M', 'ⲕ'),
- (0x2C95, 'V'),
- (0x2C96, 'M', 'ⲗ'),
- (0x2C97, 'V'),
- (0x2C98, 'M', 'ⲙ'),
- (0x2C99, 'V'),
- (0x2C9A, 'M', 'ⲛ'),
- (0x2C9B, 'V'),
- (0x2C9C, 'M', 'ⲝ'),
- (0x2C9D, 'V'),
- (0x2C9E, 'M', 'ⲟ'),
- (0x2C9F, 'V'),
- (0x2CA0, 'M', 'ⲡ'),
- (0x2CA1, 'V'),
- (0x2CA2, 'M', 'ⲣ'),
- (0x2CA3, 'V'),
- (0x2CA4, 'M', 'ⲥ'),
- (0x2CA5, 'V'),
- (0x2CA6, 'M', 'ⲧ'),
- (0x2CA7, 'V'),
- (0x2CA8, 'M', 'ⲩ'),
- (0x2CA9, 'V'),
- (0x2CAA, 'M', 'ⲫ'),
- (0x2CAB, 'V'),
- (0x2CAC, 'M', 'ⲭ'),
- (0x2CAD, 'V'),
- (0x2CAE, 'M', 'ⲯ'),
- (0x2CAF, 'V'),
- (0x2CB0, 'M', 'ⲱ'),
- (0x2CB1, 'V'),
- (0x2CB2, 'M', 'ⲳ'),
- (0x2CB3, 'V'),
- (0x2CB4, 'M', 'ⲵ'),
- (0x2CB5, 'V'),
- (0x2CB6, 'M', 'ⲷ'),
- (0x2CB7, 'V'),
- (0x2CB8, 'M', 'ⲹ'),
- (0x2CB9, 'V'),
- (0x2CBA, 'M', 'ⲻ'),
- (0x2CBB, 'V'),
- (0x2CBC, 'M', 'ⲽ'),
- (0x2CBD, 'V'),
- (0x2CBE, 'M', 'ⲿ'),
- (0x2CBF, 'V'),
- (0x2CC0, 'M', 'ⳁ'),
- (0x2CC1, 'V'),
- (0x2CC2, 'M', 'ⳃ'),
- ]
-
-def _seg_26() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2CC3, 'V'),
- (0x2CC4, 'M', 'ⳅ'),
- (0x2CC5, 'V'),
- (0x2CC6, 'M', 'ⳇ'),
- (0x2CC7, 'V'),
- (0x2CC8, 'M', 'ⳉ'),
- (0x2CC9, 'V'),
- (0x2CCA, 'M', 'ⳋ'),
- (0x2CCB, 'V'),
- (0x2CCC, 'M', 'ⳍ'),
- (0x2CCD, 'V'),
- (0x2CCE, 'M', 'ⳏ'),
- (0x2CCF, 'V'),
- (0x2CD0, 'M', 'ⳑ'),
- (0x2CD1, 'V'),
- (0x2CD2, 'M', 'ⳓ'),
- (0x2CD3, 'V'),
- (0x2CD4, 'M', 'ⳕ'),
- (0x2CD5, 'V'),
- (0x2CD6, 'M', 'ⳗ'),
- (0x2CD7, 'V'),
- (0x2CD8, 'M', 'ⳙ'),
- (0x2CD9, 'V'),
- (0x2CDA, 'M', 'ⳛ'),
- (0x2CDB, 'V'),
- (0x2CDC, 'M', 'ⳝ'),
- (0x2CDD, 'V'),
- (0x2CDE, 'M', 'ⳟ'),
- (0x2CDF, 'V'),
- (0x2CE0, 'M', 'ⳡ'),
- (0x2CE1, 'V'),
- (0x2CE2, 'M', 'ⳣ'),
- (0x2CE3, 'V'),
- (0x2CEB, 'M', 'ⳬ'),
- (0x2CEC, 'V'),
- (0x2CED, 'M', 'ⳮ'),
- (0x2CEE, 'V'),
- (0x2CF2, 'M', 'ⳳ'),
- (0x2CF3, 'V'),
- (0x2CF4, 'X'),
- (0x2CF9, 'V'),
- (0x2D26, 'X'),
- (0x2D27, 'V'),
- (0x2D28, 'X'),
- (0x2D2D, 'V'),
- (0x2D2E, 'X'),
- (0x2D30, 'V'),
- (0x2D68, 'X'),
- (0x2D6F, 'M', 'ⵡ'),
- (0x2D70, 'V'),
- (0x2D71, 'X'),
- (0x2D7F, 'V'),
- (0x2D97, 'X'),
- (0x2DA0, 'V'),
- (0x2DA7, 'X'),
- (0x2DA8, 'V'),
- (0x2DAF, 'X'),
- (0x2DB0, 'V'),
- (0x2DB7, 'X'),
- (0x2DB8, 'V'),
- (0x2DBF, 'X'),
- (0x2DC0, 'V'),
- (0x2DC7, 'X'),
- (0x2DC8, 'V'),
- (0x2DCF, 'X'),
- (0x2DD0, 'V'),
- (0x2DD7, 'X'),
- (0x2DD8, 'V'),
- (0x2DDF, 'X'),
- (0x2DE0, 'V'),
- (0x2E5E, 'X'),
- (0x2E80, 'V'),
- (0x2E9A, 'X'),
- (0x2E9B, 'V'),
- (0x2E9F, 'M', '母'),
- (0x2EA0, 'V'),
- (0x2EF3, 'M', '龟'),
- (0x2EF4, 'X'),
- (0x2F00, 'M', '一'),
- (0x2F01, 'M', '丨'),
- (0x2F02, 'M', '丶'),
- (0x2F03, 'M', '丿'),
- (0x2F04, 'M', '乙'),
- (0x2F05, 'M', '亅'),
- (0x2F06, 'M', '二'),
- (0x2F07, 'M', '亠'),
- (0x2F08, 'M', '人'),
- (0x2F09, 'M', '儿'),
- (0x2F0A, 'M', '入'),
- (0x2F0B, 'M', '八'),
- (0x2F0C, 'M', '冂'),
- (0x2F0D, 'M', '冖'),
- (0x2F0E, 'M', '冫'),
- (0x2F0F, 'M', '几'),
- (0x2F10, 'M', '凵'),
- (0x2F11, 'M', '刀'),
- (0x2F12, 'M', '力'),
- (0x2F13, 'M', '勹'),
- (0x2F14, 'M', '匕'),
- (0x2F15, 'M', '匚'),
- ]
-
-def _seg_27() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2F16, 'M', '匸'),
- (0x2F17, 'M', '十'),
- (0x2F18, 'M', '卜'),
- (0x2F19, 'M', '卩'),
- (0x2F1A, 'M', '厂'),
- (0x2F1B, 'M', '厶'),
- (0x2F1C, 'M', '又'),
- (0x2F1D, 'M', '口'),
- (0x2F1E, 'M', '囗'),
- (0x2F1F, 'M', '土'),
- (0x2F20, 'M', '士'),
- (0x2F21, 'M', '夂'),
- (0x2F22, 'M', '夊'),
- (0x2F23, 'M', '夕'),
- (0x2F24, 'M', '大'),
- (0x2F25, 'M', '女'),
- (0x2F26, 'M', '子'),
- (0x2F27, 'M', '宀'),
- (0x2F28, 'M', '寸'),
- (0x2F29, 'M', '小'),
- (0x2F2A, 'M', '尢'),
- (0x2F2B, 'M', '尸'),
- (0x2F2C, 'M', '屮'),
- (0x2F2D, 'M', '山'),
- (0x2F2E, 'M', '巛'),
- (0x2F2F, 'M', '工'),
- (0x2F30, 'M', '己'),
- (0x2F31, 'M', '巾'),
- (0x2F32, 'M', '干'),
- (0x2F33, 'M', '幺'),
- (0x2F34, 'M', '广'),
- (0x2F35, 'M', '廴'),
- (0x2F36, 'M', '廾'),
- (0x2F37, 'M', '弋'),
- (0x2F38, 'M', '弓'),
- (0x2F39, 'M', '彐'),
- (0x2F3A, 'M', '彡'),
- (0x2F3B, 'M', '彳'),
- (0x2F3C, 'M', '心'),
- (0x2F3D, 'M', '戈'),
- (0x2F3E, 'M', '戶'),
- (0x2F3F, 'M', '手'),
- (0x2F40, 'M', '支'),
- (0x2F41, 'M', '攴'),
- (0x2F42, 'M', '文'),
- (0x2F43, 'M', '斗'),
- (0x2F44, 'M', '斤'),
- (0x2F45, 'M', '方'),
- (0x2F46, 'M', '无'),
- (0x2F47, 'M', '日'),
- (0x2F48, 'M', '曰'),
- (0x2F49, 'M', '月'),
- (0x2F4A, 'M', '木'),
- (0x2F4B, 'M', '欠'),
- (0x2F4C, 'M', '止'),
- (0x2F4D, 'M', '歹'),
- (0x2F4E, 'M', '殳'),
- (0x2F4F, 'M', '毋'),
- (0x2F50, 'M', '比'),
- (0x2F51, 'M', '毛'),
- (0x2F52, 'M', '氏'),
- (0x2F53, 'M', '气'),
- (0x2F54, 'M', '水'),
- (0x2F55, 'M', '火'),
- (0x2F56, 'M', '爪'),
- (0x2F57, 'M', '父'),
- (0x2F58, 'M', '爻'),
- (0x2F59, 'M', '爿'),
- (0x2F5A, 'M', '片'),
- (0x2F5B, 'M', '牙'),
- (0x2F5C, 'M', '牛'),
- (0x2F5D, 'M', '犬'),
- (0x2F5E, 'M', '玄'),
- (0x2F5F, 'M', '玉'),
- (0x2F60, 'M', '瓜'),
- (0x2F61, 'M', '瓦'),
- (0x2F62, 'M', '甘'),
- (0x2F63, 'M', '生'),
- (0x2F64, 'M', '用'),
- (0x2F65, 'M', '田'),
- (0x2F66, 'M', '疋'),
- (0x2F67, 'M', '疒'),
- (0x2F68, 'M', '癶'),
- (0x2F69, 'M', '白'),
- (0x2F6A, 'M', '皮'),
- (0x2F6B, 'M', '皿'),
- (0x2F6C, 'M', '目'),
- (0x2F6D, 'M', '矛'),
- (0x2F6E, 'M', '矢'),
- (0x2F6F, 'M', '石'),
- (0x2F70, 'M', '示'),
- (0x2F71, 'M', '禸'),
- (0x2F72, 'M', '禾'),
- (0x2F73, 'M', '穴'),
- (0x2F74, 'M', '立'),
- (0x2F75, 'M', '竹'),
- (0x2F76, 'M', '米'),
- (0x2F77, 'M', '糸'),
- (0x2F78, 'M', '缶'),
- (0x2F79, 'M', '网'),
- ]
-
-def _seg_28() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2F7A, 'M', '羊'),
- (0x2F7B, 'M', '羽'),
- (0x2F7C, 'M', '老'),
- (0x2F7D, 'M', '而'),
- (0x2F7E, 'M', '耒'),
- (0x2F7F, 'M', '耳'),
- (0x2F80, 'M', '聿'),
- (0x2F81, 'M', '肉'),
- (0x2F82, 'M', '臣'),
- (0x2F83, 'M', '自'),
- (0x2F84, 'M', '至'),
- (0x2F85, 'M', '臼'),
- (0x2F86, 'M', '舌'),
- (0x2F87, 'M', '舛'),
- (0x2F88, 'M', '舟'),
- (0x2F89, 'M', '艮'),
- (0x2F8A, 'M', '色'),
- (0x2F8B, 'M', '艸'),
- (0x2F8C, 'M', '虍'),
- (0x2F8D, 'M', '虫'),
- (0x2F8E, 'M', '血'),
- (0x2F8F, 'M', '行'),
- (0x2F90, 'M', '衣'),
- (0x2F91, 'M', '襾'),
- (0x2F92, 'M', '見'),
- (0x2F93, 'M', '角'),
- (0x2F94, 'M', '言'),
- (0x2F95, 'M', '谷'),
- (0x2F96, 'M', '豆'),
- (0x2F97, 'M', '豕'),
- (0x2F98, 'M', '豸'),
- (0x2F99, 'M', '貝'),
- (0x2F9A, 'M', '赤'),
- (0x2F9B, 'M', '走'),
- (0x2F9C, 'M', '足'),
- (0x2F9D, 'M', '身'),
- (0x2F9E, 'M', '車'),
- (0x2F9F, 'M', '辛'),
- (0x2FA0, 'M', '辰'),
- (0x2FA1, 'M', '辵'),
- (0x2FA2, 'M', '邑'),
- (0x2FA3, 'M', '酉'),
- (0x2FA4, 'M', '釆'),
- (0x2FA5, 'M', '里'),
- (0x2FA6, 'M', '金'),
- (0x2FA7, 'M', '長'),
- (0x2FA8, 'M', '門'),
- (0x2FA9, 'M', '阜'),
- (0x2FAA, 'M', '隶'),
- (0x2FAB, 'M', '隹'),
- (0x2FAC, 'M', '雨'),
- (0x2FAD, 'M', '靑'),
- (0x2FAE, 'M', '非'),
- (0x2FAF, 'M', '面'),
- (0x2FB0, 'M', '革'),
- (0x2FB1, 'M', '韋'),
- (0x2FB2, 'M', '韭'),
- (0x2FB3, 'M', '音'),
- (0x2FB4, 'M', '頁'),
- (0x2FB5, 'M', '風'),
- (0x2FB6, 'M', '飛'),
- (0x2FB7, 'M', '食'),
- (0x2FB8, 'M', '首'),
- (0x2FB9, 'M', '香'),
- (0x2FBA, 'M', '馬'),
- (0x2FBB, 'M', '骨'),
- (0x2FBC, 'M', '高'),
- (0x2FBD, 'M', '髟'),
- (0x2FBE, 'M', '鬥'),
- (0x2FBF, 'M', '鬯'),
- (0x2FC0, 'M', '鬲'),
- (0x2FC1, 'M', '鬼'),
- (0x2FC2, 'M', '魚'),
- (0x2FC3, 'M', '鳥'),
- (0x2FC4, 'M', '鹵'),
- (0x2FC5, 'M', '鹿'),
- (0x2FC6, 'M', '麥'),
- (0x2FC7, 'M', '麻'),
- (0x2FC8, 'M', '黃'),
- (0x2FC9, 'M', '黍'),
- (0x2FCA, 'M', '黑'),
- (0x2FCB, 'M', '黹'),
- (0x2FCC, 'M', '黽'),
- (0x2FCD, 'M', '鼎'),
- (0x2FCE, 'M', '鼓'),
- (0x2FCF, 'M', '鼠'),
- (0x2FD0, 'M', '鼻'),
- (0x2FD1, 'M', '齊'),
- (0x2FD2, 'M', '齒'),
- (0x2FD3, 'M', '龍'),
- (0x2FD4, 'M', '龜'),
- (0x2FD5, 'M', '龠'),
- (0x2FD6, 'X'),
- (0x3000, '3', ' '),
- (0x3001, 'V'),
- (0x3002, 'M', '.'),
- (0x3003, 'V'),
- (0x3036, 'M', '〒'),
- (0x3037, 'V'),
- (0x3038, 'M', '十'),
- ]
-
-def _seg_29() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x3039, 'M', '卄'),
- (0x303A, 'M', '卅'),
- (0x303B, 'V'),
- (0x3040, 'X'),
- (0x3041, 'V'),
- (0x3097, 'X'),
- (0x3099, 'V'),
- (0x309B, '3', ' ゙'),
- (0x309C, '3', ' ゚'),
- (0x309D, 'V'),
- (0x309F, 'M', 'より'),
- (0x30A0, 'V'),
- (0x30FF, 'M', 'コト'),
- (0x3100, 'X'),
- (0x3105, 'V'),
- (0x3130, 'X'),
- (0x3131, 'M', 'ᄀ'),
- (0x3132, 'M', 'ᄁ'),
- (0x3133, 'M', 'ᆪ'),
- (0x3134, 'M', 'ᄂ'),
- (0x3135, 'M', 'ᆬ'),
- (0x3136, 'M', 'ᆭ'),
- (0x3137, 'M', 'ᄃ'),
- (0x3138, 'M', 'ᄄ'),
- (0x3139, 'M', 'ᄅ'),
- (0x313A, 'M', 'ᆰ'),
- (0x313B, 'M', 'ᆱ'),
- (0x313C, 'M', 'ᆲ'),
- (0x313D, 'M', 'ᆳ'),
- (0x313E, 'M', 'ᆴ'),
- (0x313F, 'M', 'ᆵ'),
- (0x3140, 'M', 'ᄚ'),
- (0x3141, 'M', 'ᄆ'),
- (0x3142, 'M', 'ᄇ'),
- (0x3143, 'M', 'ᄈ'),
- (0x3144, 'M', 'ᄡ'),
- (0x3145, 'M', 'ᄉ'),
- (0x3146, 'M', 'ᄊ'),
- (0x3147, 'M', 'ᄋ'),
- (0x3148, 'M', 'ᄌ'),
- (0x3149, 'M', 'ᄍ'),
- (0x314A, 'M', 'ᄎ'),
- (0x314B, 'M', 'ᄏ'),
- (0x314C, 'M', 'ᄐ'),
- (0x314D, 'M', 'ᄑ'),
- (0x314E, 'M', 'ᄒ'),
- (0x314F, 'M', 'ᅡ'),
- (0x3150, 'M', 'ᅢ'),
- (0x3151, 'M', 'ᅣ'),
- (0x3152, 'M', 'ᅤ'),
- (0x3153, 'M', 'ᅥ'),
- (0x3154, 'M', 'ᅦ'),
- (0x3155, 'M', 'ᅧ'),
- (0x3156, 'M', 'ᅨ'),
- (0x3157, 'M', 'ᅩ'),
- (0x3158, 'M', 'ᅪ'),
- (0x3159, 'M', 'ᅫ'),
- (0x315A, 'M', 'ᅬ'),
- (0x315B, 'M', 'ᅭ'),
- (0x315C, 'M', 'ᅮ'),
- (0x315D, 'M', 'ᅯ'),
- (0x315E, 'M', 'ᅰ'),
- (0x315F, 'M', 'ᅱ'),
- (0x3160, 'M', 'ᅲ'),
- (0x3161, 'M', 'ᅳ'),
- (0x3162, 'M', 'ᅴ'),
- (0x3163, 'M', 'ᅵ'),
- (0x3164, 'X'),
- (0x3165, 'M', 'ᄔ'),
- (0x3166, 'M', 'ᄕ'),
- (0x3167, 'M', 'ᇇ'),
- (0x3168, 'M', 'ᇈ'),
- (0x3169, 'M', 'ᇌ'),
- (0x316A, 'M', 'ᇎ'),
- (0x316B, 'M', 'ᇓ'),
- (0x316C, 'M', 'ᇗ'),
- (0x316D, 'M', 'ᇙ'),
- (0x316E, 'M', 'ᄜ'),
- (0x316F, 'M', 'ᇝ'),
- (0x3170, 'M', 'ᇟ'),
- (0x3171, 'M', 'ᄝ'),
- (0x3172, 'M', 'ᄞ'),
- (0x3173, 'M', 'ᄠ'),
- (0x3174, 'M', 'ᄢ'),
- (0x3175, 'M', 'ᄣ'),
- (0x3176, 'M', 'ᄧ'),
- (0x3177, 'M', 'ᄩ'),
- (0x3178, 'M', 'ᄫ'),
- (0x3179, 'M', 'ᄬ'),
- (0x317A, 'M', 'ᄭ'),
- (0x317B, 'M', 'ᄮ'),
- (0x317C, 'M', 'ᄯ'),
- (0x317D, 'M', 'ᄲ'),
- (0x317E, 'M', 'ᄶ'),
- (0x317F, 'M', 'ᅀ'),
- (0x3180, 'M', 'ᅇ'),
- (0x3181, 'M', 'ᅌ'),
- (0x3182, 'M', 'ᇱ'),
- (0x3183, 'M', 'ᇲ'),
- (0x3184, 'M', 'ᅗ'),
- ]
-
-def _seg_30() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x3185, 'M', 'ᅘ'),
- (0x3186, 'M', 'ᅙ'),
- (0x3187, 'M', 'ᆄ'),
- (0x3188, 'M', 'ᆅ'),
- (0x3189, 'M', 'ᆈ'),
- (0x318A, 'M', 'ᆑ'),
- (0x318B, 'M', 'ᆒ'),
- (0x318C, 'M', 'ᆔ'),
- (0x318D, 'M', 'ᆞ'),
- (0x318E, 'M', 'ᆡ'),
- (0x318F, 'X'),
- (0x3190, 'V'),
- (0x3192, 'M', '一'),
- (0x3193, 'M', '二'),
- (0x3194, 'M', '三'),
- (0x3195, 'M', '四'),
- (0x3196, 'M', '上'),
- (0x3197, 'M', '中'),
- (0x3198, 'M', '下'),
- (0x3199, 'M', '甲'),
- (0x319A, 'M', '乙'),
- (0x319B, 'M', '丙'),
- (0x319C, 'M', '丁'),
- (0x319D, 'M', '天'),
- (0x319E, 'M', '地'),
- (0x319F, 'M', '人'),
- (0x31A0, 'V'),
- (0x31E4, 'X'),
- (0x31F0, 'V'),
- (0x3200, '3', '(ᄀ)'),
- (0x3201, '3', '(ᄂ)'),
- (0x3202, '3', '(ᄃ)'),
- (0x3203, '3', '(ᄅ)'),
- (0x3204, '3', '(ᄆ)'),
- (0x3205, '3', '(ᄇ)'),
- (0x3206, '3', '(ᄉ)'),
- (0x3207, '3', '(ᄋ)'),
- (0x3208, '3', '(ᄌ)'),
- (0x3209, '3', '(ᄎ)'),
- (0x320A, '3', '(ᄏ)'),
- (0x320B, '3', '(ᄐ)'),
- (0x320C, '3', '(ᄑ)'),
- (0x320D, '3', '(ᄒ)'),
- (0x320E, '3', '(가)'),
- (0x320F, '3', '(나)'),
- (0x3210, '3', '(다)'),
- (0x3211, '3', '(라)'),
- (0x3212, '3', '(마)'),
- (0x3213, '3', '(바)'),
- (0x3214, '3', '(사)'),
- (0x3215, '3', '(아)'),
- (0x3216, '3', '(자)'),
- (0x3217, '3', '(차)'),
- (0x3218, '3', '(카)'),
- (0x3219, '3', '(타)'),
- (0x321A, '3', '(파)'),
- (0x321B, '3', '(하)'),
- (0x321C, '3', '(주)'),
- (0x321D, '3', '(오전)'),
- (0x321E, '3', '(오후)'),
- (0x321F, 'X'),
- (0x3220, '3', '(一)'),
- (0x3221, '3', '(二)'),
- (0x3222, '3', '(三)'),
- (0x3223, '3', '(四)'),
- (0x3224, '3', '(五)'),
- (0x3225, '3', '(六)'),
- (0x3226, '3', '(七)'),
- (0x3227, '3', '(八)'),
- (0x3228, '3', '(九)'),
- (0x3229, '3', '(十)'),
- (0x322A, '3', '(月)'),
- (0x322B, '3', '(火)'),
- (0x322C, '3', '(水)'),
- (0x322D, '3', '(木)'),
- (0x322E, '3', '(金)'),
- (0x322F, '3', '(土)'),
- (0x3230, '3', '(日)'),
- (0x3231, '3', '(株)'),
- (0x3232, '3', '(有)'),
- (0x3233, '3', '(社)'),
- (0x3234, '3', '(名)'),
- (0x3235, '3', '(特)'),
- (0x3236, '3', '(財)'),
- (0x3237, '3', '(祝)'),
- (0x3238, '3', '(労)'),
- (0x3239, '3', '(代)'),
- (0x323A, '3', '(呼)'),
- (0x323B, '3', '(学)'),
- (0x323C, '3', '(監)'),
- (0x323D, '3', '(企)'),
- (0x323E, '3', '(資)'),
- (0x323F, '3', '(協)'),
- (0x3240, '3', '(祭)'),
- (0x3241, '3', '(休)'),
- (0x3242, '3', '(自)'),
- (0x3243, '3', '(至)'),
- (0x3244, 'M', '問'),
- (0x3245, 'M', '幼'),
- (0x3246, 'M', '文'),
- ]
-
-def _seg_31() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x3247, 'M', '箏'),
- (0x3248, 'V'),
- (0x3250, 'M', 'pte'),
- (0x3251, 'M', '21'),
- (0x3252, 'M', '22'),
- (0x3253, 'M', '23'),
- (0x3254, 'M', '24'),
- (0x3255, 'M', '25'),
- (0x3256, 'M', '26'),
- (0x3257, 'M', '27'),
- (0x3258, 'M', '28'),
- (0x3259, 'M', '29'),
- (0x325A, 'M', '30'),
- (0x325B, 'M', '31'),
- (0x325C, 'M', '32'),
- (0x325D, 'M', '33'),
- (0x325E, 'M', '34'),
- (0x325F, 'M', '35'),
- (0x3260, 'M', 'ᄀ'),
- (0x3261, 'M', 'ᄂ'),
- (0x3262, 'M', 'ᄃ'),
- (0x3263, 'M', 'ᄅ'),
- (0x3264, 'M', 'ᄆ'),
- (0x3265, 'M', 'ᄇ'),
- (0x3266, 'M', 'ᄉ'),
- (0x3267, 'M', 'ᄋ'),
- (0x3268, 'M', 'ᄌ'),
- (0x3269, 'M', 'ᄎ'),
- (0x326A, 'M', 'ᄏ'),
- (0x326B, 'M', 'ᄐ'),
- (0x326C, 'M', 'ᄑ'),
- (0x326D, 'M', 'ᄒ'),
- (0x326E, 'M', '가'),
- (0x326F, 'M', '나'),
- (0x3270, 'M', '다'),
- (0x3271, 'M', '라'),
- (0x3272, 'M', '마'),
- (0x3273, 'M', '바'),
- (0x3274, 'M', '사'),
- (0x3275, 'M', '아'),
- (0x3276, 'M', '자'),
- (0x3277, 'M', '차'),
- (0x3278, 'M', '카'),
- (0x3279, 'M', '타'),
- (0x327A, 'M', '파'),
- (0x327B, 'M', '하'),
- (0x327C, 'M', '참고'),
- (0x327D, 'M', '주의'),
- (0x327E, 'M', '우'),
- (0x327F, 'V'),
- (0x3280, 'M', '一'),
- (0x3281, 'M', '二'),
- (0x3282, 'M', '三'),
- (0x3283, 'M', '四'),
- (0x3284, 'M', '五'),
- (0x3285, 'M', '六'),
- (0x3286, 'M', '七'),
- (0x3287, 'M', '八'),
- (0x3288, 'M', '九'),
- (0x3289, 'M', '十'),
- (0x328A, 'M', '月'),
- (0x328B, 'M', '火'),
- (0x328C, 'M', '水'),
- (0x328D, 'M', '木'),
- (0x328E, 'M', '金'),
- (0x328F, 'M', '土'),
- (0x3290, 'M', '日'),
- (0x3291, 'M', '株'),
- (0x3292, 'M', '有'),
- (0x3293, 'M', '社'),
- (0x3294, 'M', '名'),
- (0x3295, 'M', '特'),
- (0x3296, 'M', '財'),
- (0x3297, 'M', '祝'),
- (0x3298, 'M', '労'),
- (0x3299, 'M', '秘'),
- (0x329A, 'M', '男'),
- (0x329B, 'M', '女'),
- (0x329C, 'M', '適'),
- (0x329D, 'M', '優'),
- (0x329E, 'M', '印'),
- (0x329F, 'M', '注'),
- (0x32A0, 'M', '項'),
- (0x32A1, 'M', '休'),
- (0x32A2, 'M', '写'),
- (0x32A3, 'M', '正'),
- (0x32A4, 'M', '上'),
- (0x32A5, 'M', '中'),
- (0x32A6, 'M', '下'),
- (0x32A7, 'M', '左'),
- (0x32A8, 'M', '右'),
- (0x32A9, 'M', '医'),
- (0x32AA, 'M', '宗'),
- (0x32AB, 'M', '学'),
- (0x32AC, 'M', '監'),
- (0x32AD, 'M', '企'),
- (0x32AE, 'M', '資'),
- (0x32AF, 'M', '協'),
- (0x32B0, 'M', '夜'),
- (0x32B1, 'M', '36'),
- ]
-
-def _seg_32() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x32B2, 'M', '37'),
- (0x32B3, 'M', '38'),
- (0x32B4, 'M', '39'),
- (0x32B5, 'M', '40'),
- (0x32B6, 'M', '41'),
- (0x32B7, 'M', '42'),
- (0x32B8, 'M', '43'),
- (0x32B9, 'M', '44'),
- (0x32BA, 'M', '45'),
- (0x32BB, 'M', '46'),
- (0x32BC, 'M', '47'),
- (0x32BD, 'M', '48'),
- (0x32BE, 'M', '49'),
- (0x32BF, 'M', '50'),
- (0x32C0, 'M', '1月'),
- (0x32C1, 'M', '2月'),
- (0x32C2, 'M', '3月'),
- (0x32C3, 'M', '4月'),
- (0x32C4, 'M', '5月'),
- (0x32C5, 'M', '6月'),
- (0x32C6, 'M', '7月'),
- (0x32C7, 'M', '8月'),
- (0x32C8, 'M', '9月'),
- (0x32C9, 'M', '10月'),
- (0x32CA, 'M', '11月'),
- (0x32CB, 'M', '12月'),
- (0x32CC, 'M', 'hg'),
- (0x32CD, 'M', 'erg'),
- (0x32CE, 'M', 'ev'),
- (0x32CF, 'M', 'ltd'),
- (0x32D0, 'M', 'ア'),
- (0x32D1, 'M', 'イ'),
- (0x32D2, 'M', 'ウ'),
- (0x32D3, 'M', 'エ'),
- (0x32D4, 'M', 'オ'),
- (0x32D5, 'M', 'カ'),
- (0x32D6, 'M', 'キ'),
- (0x32D7, 'M', 'ク'),
- (0x32D8, 'M', 'ケ'),
- (0x32D9, 'M', 'コ'),
- (0x32DA, 'M', 'サ'),
- (0x32DB, 'M', 'シ'),
- (0x32DC, 'M', 'ス'),
- (0x32DD, 'M', 'セ'),
- (0x32DE, 'M', 'ソ'),
- (0x32DF, 'M', 'タ'),
- (0x32E0, 'M', 'チ'),
- (0x32E1, 'M', 'ツ'),
- (0x32E2, 'M', 'テ'),
- (0x32E3, 'M', 'ト'),
- (0x32E4, 'M', 'ナ'),
- (0x32E5, 'M', 'ニ'),
- (0x32E6, 'M', 'ヌ'),
- (0x32E7, 'M', 'ネ'),
- (0x32E8, 'M', 'ノ'),
- (0x32E9, 'M', 'ハ'),
- (0x32EA, 'M', 'ヒ'),
- (0x32EB, 'M', 'フ'),
- (0x32EC, 'M', 'ヘ'),
- (0x32ED, 'M', 'ホ'),
- (0x32EE, 'M', 'マ'),
- (0x32EF, 'M', 'ミ'),
- (0x32F0, 'M', 'ム'),
- (0x32F1, 'M', 'メ'),
- (0x32F2, 'M', 'モ'),
- (0x32F3, 'M', 'ヤ'),
- (0x32F4, 'M', 'ユ'),
- (0x32F5, 'M', 'ヨ'),
- (0x32F6, 'M', 'ラ'),
- (0x32F7, 'M', 'リ'),
- (0x32F8, 'M', 'ル'),
- (0x32F9, 'M', 'レ'),
- (0x32FA, 'M', 'ロ'),
- (0x32FB, 'M', 'ワ'),
- (0x32FC, 'M', 'ヰ'),
- (0x32FD, 'M', 'ヱ'),
- (0x32FE, 'M', 'ヲ'),
- (0x32FF, 'M', '令和'),
- (0x3300, 'M', 'アパート'),
- (0x3301, 'M', 'アルファ'),
- (0x3302, 'M', 'アンペア'),
- (0x3303, 'M', 'アール'),
- (0x3304, 'M', 'イニング'),
- (0x3305, 'M', 'インチ'),
- (0x3306, 'M', 'ウォン'),
- (0x3307, 'M', 'エスクード'),
- (0x3308, 'M', 'エーカー'),
- (0x3309, 'M', 'オンス'),
- (0x330A, 'M', 'オーム'),
- (0x330B, 'M', 'カイリ'),
- (0x330C, 'M', 'カラット'),
- (0x330D, 'M', 'カロリー'),
- (0x330E, 'M', 'ガロン'),
- (0x330F, 'M', 'ガンマ'),
- (0x3310, 'M', 'ギガ'),
- (0x3311, 'M', 'ギニー'),
- (0x3312, 'M', 'キュリー'),
- (0x3313, 'M', 'ギルダー'),
- (0x3314, 'M', 'キロ'),
- (0x3315, 'M', 'キログラム'),
- ]
-
-def _seg_33() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x3316, 'M', 'キロメートル'),
- (0x3317, 'M', 'キロワット'),
- (0x3318, 'M', 'グラム'),
- (0x3319, 'M', 'グラムトン'),
- (0x331A, 'M', 'クルゼイロ'),
- (0x331B, 'M', 'クローネ'),
- (0x331C, 'M', 'ケース'),
- (0x331D, 'M', 'コルナ'),
- (0x331E, 'M', 'コーポ'),
- (0x331F, 'M', 'サイクル'),
- (0x3320, 'M', 'サンチーム'),
- (0x3321, 'M', 'シリング'),
- (0x3322, 'M', 'センチ'),
- (0x3323, 'M', 'セント'),
- (0x3324, 'M', 'ダース'),
- (0x3325, 'M', 'デシ'),
- (0x3326, 'M', 'ドル'),
- (0x3327, 'M', 'トン'),
- (0x3328, 'M', 'ナノ'),
- (0x3329, 'M', 'ノット'),
- (0x332A, 'M', 'ハイツ'),
- (0x332B, 'M', 'パーセント'),
- (0x332C, 'M', 'パーツ'),
- (0x332D, 'M', 'バーレル'),
- (0x332E, 'M', 'ピアストル'),
- (0x332F, 'M', 'ピクル'),
- (0x3330, 'M', 'ピコ'),
- (0x3331, 'M', 'ビル'),
- (0x3332, 'M', 'ファラッド'),
- (0x3333, 'M', 'フィート'),
- (0x3334, 'M', 'ブッシェル'),
- (0x3335, 'M', 'フラン'),
- (0x3336, 'M', 'ヘクタール'),
- (0x3337, 'M', 'ペソ'),
- (0x3338, 'M', 'ペニヒ'),
- (0x3339, 'M', 'ヘルツ'),
- (0x333A, 'M', 'ペンス'),
- (0x333B, 'M', 'ページ'),
- (0x333C, 'M', 'ベータ'),
- (0x333D, 'M', 'ポイント'),
- (0x333E, 'M', 'ボルト'),
- (0x333F, 'M', 'ホン'),
- (0x3340, 'M', 'ポンド'),
- (0x3341, 'M', 'ホール'),
- (0x3342, 'M', 'ホーン'),
- (0x3343, 'M', 'マイクロ'),
- (0x3344, 'M', 'マイル'),
- (0x3345, 'M', 'マッハ'),
- (0x3346, 'M', 'マルク'),
- (0x3347, 'M', 'マンション'),
- (0x3348, 'M', 'ミクロン'),
- (0x3349, 'M', 'ミリ'),
- (0x334A, 'M', 'ミリバール'),
- (0x334B, 'M', 'メガ'),
- (0x334C, 'M', 'メガトン'),
- (0x334D, 'M', 'メートル'),
- (0x334E, 'M', 'ヤード'),
- (0x334F, 'M', 'ヤール'),
- (0x3350, 'M', 'ユアン'),
- (0x3351, 'M', 'リットル'),
- (0x3352, 'M', 'リラ'),
- (0x3353, 'M', 'ルピー'),
- (0x3354, 'M', 'ルーブル'),
- (0x3355, 'M', 'レム'),
- (0x3356, 'M', 'レントゲン'),
- (0x3357, 'M', 'ワット'),
- (0x3358, 'M', '0点'),
- (0x3359, 'M', '1点'),
- (0x335A, 'M', '2点'),
- (0x335B, 'M', '3点'),
- (0x335C, 'M', '4点'),
- (0x335D, 'M', '5点'),
- (0x335E, 'M', '6点'),
- (0x335F, 'M', '7点'),
- (0x3360, 'M', '8点'),
- (0x3361, 'M', '9点'),
- (0x3362, 'M', '10点'),
- (0x3363, 'M', '11点'),
- (0x3364, 'M', '12点'),
- (0x3365, 'M', '13点'),
- (0x3366, 'M', '14点'),
- (0x3367, 'M', '15点'),
- (0x3368, 'M', '16点'),
- (0x3369, 'M', '17点'),
- (0x336A, 'M', '18点'),
- (0x336B, 'M', '19点'),
- (0x336C, 'M', '20点'),
- (0x336D, 'M', '21点'),
- (0x336E, 'M', '22点'),
- (0x336F, 'M', '23点'),
- (0x3370, 'M', '24点'),
- (0x3371, 'M', 'hpa'),
- (0x3372, 'M', 'da'),
- (0x3373, 'M', 'au'),
- (0x3374, 'M', 'bar'),
- (0x3375, 'M', 'ov'),
- (0x3376, 'M', 'pc'),
- (0x3377, 'M', 'dm'),
- (0x3378, 'M', 'dm2'),
- (0x3379, 'M', 'dm3'),
- ]
-
-def _seg_34() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x337A, 'M', 'iu'),
- (0x337B, 'M', '平成'),
- (0x337C, 'M', '昭和'),
- (0x337D, 'M', '大正'),
- (0x337E, 'M', '明治'),
- (0x337F, 'M', '株式会社'),
- (0x3380, 'M', 'pa'),
- (0x3381, 'M', 'na'),
- (0x3382, 'M', 'μa'),
- (0x3383, 'M', 'ma'),
- (0x3384, 'M', 'ka'),
- (0x3385, 'M', 'kb'),
- (0x3386, 'M', 'mb'),
- (0x3387, 'M', 'gb'),
- (0x3388, 'M', 'cal'),
- (0x3389, 'M', 'kcal'),
- (0x338A, 'M', 'pf'),
- (0x338B, 'M', 'nf'),
- (0x338C, 'M', 'μf'),
- (0x338D, 'M', 'μg'),
- (0x338E, 'M', 'mg'),
- (0x338F, 'M', 'kg'),
- (0x3390, 'M', 'hz'),
- (0x3391, 'M', 'khz'),
- (0x3392, 'M', 'mhz'),
- (0x3393, 'M', 'ghz'),
- (0x3394, 'M', 'thz'),
- (0x3395, 'M', 'μl'),
- (0x3396, 'M', 'ml'),
- (0x3397, 'M', 'dl'),
- (0x3398, 'M', 'kl'),
- (0x3399, 'M', 'fm'),
- (0x339A, 'M', 'nm'),
- (0x339B, 'M', 'μm'),
- (0x339C, 'M', 'mm'),
- (0x339D, 'M', 'cm'),
- (0x339E, 'M', 'km'),
- (0x339F, 'M', 'mm2'),
- (0x33A0, 'M', 'cm2'),
- (0x33A1, 'M', 'm2'),
- (0x33A2, 'M', 'km2'),
- (0x33A3, 'M', 'mm3'),
- (0x33A4, 'M', 'cm3'),
- (0x33A5, 'M', 'm3'),
- (0x33A6, 'M', 'km3'),
- (0x33A7, 'M', 'm∕s'),
- (0x33A8, 'M', 'm∕s2'),
- (0x33A9, 'M', 'pa'),
- (0x33AA, 'M', 'kpa'),
- (0x33AB, 'M', 'mpa'),
- (0x33AC, 'M', 'gpa'),
- (0x33AD, 'M', 'rad'),
- (0x33AE, 'M', 'rad∕s'),
- (0x33AF, 'M', 'rad∕s2'),
- (0x33B0, 'M', 'ps'),
- (0x33B1, 'M', 'ns'),
- (0x33B2, 'M', 'μs'),
- (0x33B3, 'M', 'ms'),
- (0x33B4, 'M', 'pv'),
- (0x33B5, 'M', 'nv'),
- (0x33B6, 'M', 'μv'),
- (0x33B7, 'M', 'mv'),
- (0x33B8, 'M', 'kv'),
- (0x33B9, 'M', 'mv'),
- (0x33BA, 'M', 'pw'),
- (0x33BB, 'M', 'nw'),
- (0x33BC, 'M', 'μw'),
- (0x33BD, 'M', 'mw'),
- (0x33BE, 'M', 'kw'),
- (0x33BF, 'M', 'mw'),
- (0x33C0, 'M', 'kω'),
- (0x33C1, 'M', 'mω'),
- (0x33C2, 'X'),
- (0x33C3, 'M', 'bq'),
- (0x33C4, 'M', 'cc'),
- (0x33C5, 'M', 'cd'),
- (0x33C6, 'M', 'c∕kg'),
- (0x33C7, 'X'),
- (0x33C8, 'M', 'db'),
- (0x33C9, 'M', 'gy'),
- (0x33CA, 'M', 'ha'),
- (0x33CB, 'M', 'hp'),
- (0x33CC, 'M', 'in'),
- (0x33CD, 'M', 'kk'),
- (0x33CE, 'M', 'km'),
- (0x33CF, 'M', 'kt'),
- (0x33D0, 'M', 'lm'),
- (0x33D1, 'M', 'ln'),
- (0x33D2, 'M', 'log'),
- (0x33D3, 'M', 'lx'),
- (0x33D4, 'M', 'mb'),
- (0x33D5, 'M', 'mil'),
- (0x33D6, 'M', 'mol'),
- (0x33D7, 'M', 'ph'),
- (0x33D8, 'X'),
- (0x33D9, 'M', 'ppm'),
- (0x33DA, 'M', 'pr'),
- (0x33DB, 'M', 'sr'),
- (0x33DC, 'M', 'sv'),
- (0x33DD, 'M', 'wb'),
- ]
-
-def _seg_35() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x33DE, 'M', 'v∕m'),
- (0x33DF, 'M', 'a∕m'),
- (0x33E0, 'M', '1日'),
- (0x33E1, 'M', '2日'),
- (0x33E2, 'M', '3日'),
- (0x33E3, 'M', '4日'),
- (0x33E4, 'M', '5日'),
- (0x33E5, 'M', '6日'),
- (0x33E6, 'M', '7日'),
- (0x33E7, 'M', '8日'),
- (0x33E8, 'M', '9日'),
- (0x33E9, 'M', '10日'),
- (0x33EA, 'M', '11日'),
- (0x33EB, 'M', '12日'),
- (0x33EC, 'M', '13日'),
- (0x33ED, 'M', '14日'),
- (0x33EE, 'M', '15日'),
- (0x33EF, 'M', '16日'),
- (0x33F0, 'M', '17日'),
- (0x33F1, 'M', '18日'),
- (0x33F2, 'M', '19日'),
- (0x33F3, 'M', '20日'),
- (0x33F4, 'M', '21日'),
- (0x33F5, 'M', '22日'),
- (0x33F6, 'M', '23日'),
- (0x33F7, 'M', '24日'),
- (0x33F8, 'M', '25日'),
- (0x33F9, 'M', '26日'),
- (0x33FA, 'M', '27日'),
- (0x33FB, 'M', '28日'),
- (0x33FC, 'M', '29日'),
- (0x33FD, 'M', '30日'),
- (0x33FE, 'M', '31日'),
- (0x33FF, 'M', 'gal'),
- (0x3400, 'V'),
- (0xA48D, 'X'),
- (0xA490, 'V'),
- (0xA4C7, 'X'),
- (0xA4D0, 'V'),
- (0xA62C, 'X'),
- (0xA640, 'M', 'ꙁ'),
- (0xA641, 'V'),
- (0xA642, 'M', 'ꙃ'),
- (0xA643, 'V'),
- (0xA644, 'M', 'ꙅ'),
- (0xA645, 'V'),
- (0xA646, 'M', 'ꙇ'),
- (0xA647, 'V'),
- (0xA648, 'M', 'ꙉ'),
- (0xA649, 'V'),
- (0xA64A, 'M', 'ꙋ'),
- (0xA64B, 'V'),
- (0xA64C, 'M', 'ꙍ'),
- (0xA64D, 'V'),
- (0xA64E, 'M', 'ꙏ'),
- (0xA64F, 'V'),
- (0xA650, 'M', 'ꙑ'),
- (0xA651, 'V'),
- (0xA652, 'M', 'ꙓ'),
- (0xA653, 'V'),
- (0xA654, 'M', 'ꙕ'),
- (0xA655, 'V'),
- (0xA656, 'M', 'ꙗ'),
- (0xA657, 'V'),
- (0xA658, 'M', 'ꙙ'),
- (0xA659, 'V'),
- (0xA65A, 'M', 'ꙛ'),
- (0xA65B, 'V'),
- (0xA65C, 'M', 'ꙝ'),
- (0xA65D, 'V'),
- (0xA65E, 'M', 'ꙟ'),
- (0xA65F, 'V'),
- (0xA660, 'M', 'ꙡ'),
- (0xA661, 'V'),
- (0xA662, 'M', 'ꙣ'),
- (0xA663, 'V'),
- (0xA664, 'M', 'ꙥ'),
- (0xA665, 'V'),
- (0xA666, 'M', 'ꙧ'),
- (0xA667, 'V'),
- (0xA668, 'M', 'ꙩ'),
- (0xA669, 'V'),
- (0xA66A, 'M', 'ꙫ'),
- (0xA66B, 'V'),
- (0xA66C, 'M', 'ꙭ'),
- (0xA66D, 'V'),
- (0xA680, 'M', 'ꚁ'),
- (0xA681, 'V'),
- (0xA682, 'M', 'ꚃ'),
- (0xA683, 'V'),
- (0xA684, 'M', 'ꚅ'),
- (0xA685, 'V'),
- (0xA686, 'M', 'ꚇ'),
- (0xA687, 'V'),
- (0xA688, 'M', 'ꚉ'),
- (0xA689, 'V'),
- (0xA68A, 'M', 'ꚋ'),
- (0xA68B, 'V'),
- (0xA68C, 'M', 'ꚍ'),
- (0xA68D, 'V'),
- ]
-
-def _seg_36() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xA68E, 'M', 'ꚏ'),
- (0xA68F, 'V'),
- (0xA690, 'M', 'ꚑ'),
- (0xA691, 'V'),
- (0xA692, 'M', 'ꚓ'),
- (0xA693, 'V'),
- (0xA694, 'M', 'ꚕ'),
- (0xA695, 'V'),
- (0xA696, 'M', 'ꚗ'),
- (0xA697, 'V'),
- (0xA698, 'M', 'ꚙ'),
- (0xA699, 'V'),
- (0xA69A, 'M', 'ꚛ'),
- (0xA69B, 'V'),
- (0xA69C, 'M', 'ъ'),
- (0xA69D, 'M', 'ь'),
- (0xA69E, 'V'),
- (0xA6F8, 'X'),
- (0xA700, 'V'),
- (0xA722, 'M', 'ꜣ'),
- (0xA723, 'V'),
- (0xA724, 'M', 'ꜥ'),
- (0xA725, 'V'),
- (0xA726, 'M', 'ꜧ'),
- (0xA727, 'V'),
- (0xA728, 'M', 'ꜩ'),
- (0xA729, 'V'),
- (0xA72A, 'M', 'ꜫ'),
- (0xA72B, 'V'),
- (0xA72C, 'M', 'ꜭ'),
- (0xA72D, 'V'),
- (0xA72E, 'M', 'ꜯ'),
- (0xA72F, 'V'),
- (0xA732, 'M', 'ꜳ'),
- (0xA733, 'V'),
- (0xA734, 'M', 'ꜵ'),
- (0xA735, 'V'),
- (0xA736, 'M', 'ꜷ'),
- (0xA737, 'V'),
- (0xA738, 'M', 'ꜹ'),
- (0xA739, 'V'),
- (0xA73A, 'M', 'ꜻ'),
- (0xA73B, 'V'),
- (0xA73C, 'M', 'ꜽ'),
- (0xA73D, 'V'),
- (0xA73E, 'M', 'ꜿ'),
- (0xA73F, 'V'),
- (0xA740, 'M', 'ꝁ'),
- (0xA741, 'V'),
- (0xA742, 'M', 'ꝃ'),
- (0xA743, 'V'),
- (0xA744, 'M', 'ꝅ'),
- (0xA745, 'V'),
- (0xA746, 'M', 'ꝇ'),
- (0xA747, 'V'),
- (0xA748, 'M', 'ꝉ'),
- (0xA749, 'V'),
- (0xA74A, 'M', 'ꝋ'),
- (0xA74B, 'V'),
- (0xA74C, 'M', 'ꝍ'),
- (0xA74D, 'V'),
- (0xA74E, 'M', 'ꝏ'),
- (0xA74F, 'V'),
- (0xA750, 'M', 'ꝑ'),
- (0xA751, 'V'),
- (0xA752, 'M', 'ꝓ'),
- (0xA753, 'V'),
- (0xA754, 'M', 'ꝕ'),
- (0xA755, 'V'),
- (0xA756, 'M', 'ꝗ'),
- (0xA757, 'V'),
- (0xA758, 'M', 'ꝙ'),
- (0xA759, 'V'),
- (0xA75A, 'M', 'ꝛ'),
- (0xA75B, 'V'),
- (0xA75C, 'M', 'ꝝ'),
- (0xA75D, 'V'),
- (0xA75E, 'M', 'ꝟ'),
- (0xA75F, 'V'),
- (0xA760, 'M', 'ꝡ'),
- (0xA761, 'V'),
- (0xA762, 'M', 'ꝣ'),
- (0xA763, 'V'),
- (0xA764, 'M', 'ꝥ'),
- (0xA765, 'V'),
- (0xA766, 'M', 'ꝧ'),
- (0xA767, 'V'),
- (0xA768, 'M', 'ꝩ'),
- (0xA769, 'V'),
- (0xA76A, 'M', 'ꝫ'),
- (0xA76B, 'V'),
- (0xA76C, 'M', 'ꝭ'),
- (0xA76D, 'V'),
- (0xA76E, 'M', 'ꝯ'),
- (0xA76F, 'V'),
- (0xA770, 'M', 'ꝯ'),
- (0xA771, 'V'),
- (0xA779, 'M', 'ꝺ'),
- (0xA77A, 'V'),
- (0xA77B, 'M', 'ꝼ'),
- ]
-
-def _seg_37() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xA77C, 'V'),
- (0xA77D, 'M', 'ᵹ'),
- (0xA77E, 'M', 'ꝿ'),
- (0xA77F, 'V'),
- (0xA780, 'M', 'ꞁ'),
- (0xA781, 'V'),
- (0xA782, 'M', 'ꞃ'),
- (0xA783, 'V'),
- (0xA784, 'M', 'ꞅ'),
- (0xA785, 'V'),
- (0xA786, 'M', 'ꞇ'),
- (0xA787, 'V'),
- (0xA78B, 'M', 'ꞌ'),
- (0xA78C, 'V'),
- (0xA78D, 'M', 'ɥ'),
- (0xA78E, 'V'),
- (0xA790, 'M', 'ꞑ'),
- (0xA791, 'V'),
- (0xA792, 'M', 'ꞓ'),
- (0xA793, 'V'),
- (0xA796, 'M', 'ꞗ'),
- (0xA797, 'V'),
- (0xA798, 'M', 'ꞙ'),
- (0xA799, 'V'),
- (0xA79A, 'M', 'ꞛ'),
- (0xA79B, 'V'),
- (0xA79C, 'M', 'ꞝ'),
- (0xA79D, 'V'),
- (0xA79E, 'M', 'ꞟ'),
- (0xA79F, 'V'),
- (0xA7A0, 'M', 'ꞡ'),
- (0xA7A1, 'V'),
- (0xA7A2, 'M', 'ꞣ'),
- (0xA7A3, 'V'),
- (0xA7A4, 'M', 'ꞥ'),
- (0xA7A5, 'V'),
- (0xA7A6, 'M', 'ꞧ'),
- (0xA7A7, 'V'),
- (0xA7A8, 'M', 'ꞩ'),
- (0xA7A9, 'V'),
- (0xA7AA, 'M', 'ɦ'),
- (0xA7AB, 'M', 'ɜ'),
- (0xA7AC, 'M', 'ɡ'),
- (0xA7AD, 'M', 'ɬ'),
- (0xA7AE, 'M', 'ɪ'),
- (0xA7AF, 'V'),
- (0xA7B0, 'M', 'ʞ'),
- (0xA7B1, 'M', 'ʇ'),
- (0xA7B2, 'M', 'ʝ'),
- (0xA7B3, 'M', 'ꭓ'),
- (0xA7B4, 'M', 'ꞵ'),
- (0xA7B5, 'V'),
- (0xA7B6, 'M', 'ꞷ'),
- (0xA7B7, 'V'),
- (0xA7B8, 'M', 'ꞹ'),
- (0xA7B9, 'V'),
- (0xA7BA, 'M', 'ꞻ'),
- (0xA7BB, 'V'),
- (0xA7BC, 'M', 'ꞽ'),
- (0xA7BD, 'V'),
- (0xA7BE, 'M', 'ꞿ'),
- (0xA7BF, 'V'),
- (0xA7C0, 'M', 'ꟁ'),
- (0xA7C1, 'V'),
- (0xA7C2, 'M', 'ꟃ'),
- (0xA7C3, 'V'),
- (0xA7C4, 'M', 'ꞔ'),
- (0xA7C5, 'M', 'ʂ'),
- (0xA7C6, 'M', 'ᶎ'),
- (0xA7C7, 'M', 'ꟈ'),
- (0xA7C8, 'V'),
- (0xA7C9, 'M', 'ꟊ'),
- (0xA7CA, 'V'),
- (0xA7CB, 'X'),
- (0xA7D0, 'M', 'ꟑ'),
- (0xA7D1, 'V'),
- (0xA7D2, 'X'),
- (0xA7D3, 'V'),
- (0xA7D4, 'X'),
- (0xA7D5, 'V'),
- (0xA7D6, 'M', 'ꟗ'),
- (0xA7D7, 'V'),
- (0xA7D8, 'M', 'ꟙ'),
- (0xA7D9, 'V'),
- (0xA7DA, 'X'),
- (0xA7F2, 'M', 'c'),
- (0xA7F3, 'M', 'f'),
- (0xA7F4, 'M', 'q'),
- (0xA7F5, 'M', 'ꟶ'),
- (0xA7F6, 'V'),
- (0xA7F8, 'M', 'ħ'),
- (0xA7F9, 'M', 'œ'),
- (0xA7FA, 'V'),
- (0xA82D, 'X'),
- (0xA830, 'V'),
- (0xA83A, 'X'),
- (0xA840, 'V'),
- (0xA878, 'X'),
- (0xA880, 'V'),
- (0xA8C6, 'X'),
- ]
-
-def _seg_38() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xA8CE, 'V'),
- (0xA8DA, 'X'),
- (0xA8E0, 'V'),
- (0xA954, 'X'),
- (0xA95F, 'V'),
- (0xA97D, 'X'),
- (0xA980, 'V'),
- (0xA9CE, 'X'),
- (0xA9CF, 'V'),
- (0xA9DA, 'X'),
- (0xA9DE, 'V'),
- (0xA9FF, 'X'),
- (0xAA00, 'V'),
- (0xAA37, 'X'),
- (0xAA40, 'V'),
- (0xAA4E, 'X'),
- (0xAA50, 'V'),
- (0xAA5A, 'X'),
- (0xAA5C, 'V'),
- (0xAAC3, 'X'),
- (0xAADB, 'V'),
- (0xAAF7, 'X'),
- (0xAB01, 'V'),
- (0xAB07, 'X'),
- (0xAB09, 'V'),
- (0xAB0F, 'X'),
- (0xAB11, 'V'),
- (0xAB17, 'X'),
- (0xAB20, 'V'),
- (0xAB27, 'X'),
- (0xAB28, 'V'),
- (0xAB2F, 'X'),
- (0xAB30, 'V'),
- (0xAB5C, 'M', 'ꜧ'),
- (0xAB5D, 'M', 'ꬷ'),
- (0xAB5E, 'M', 'ɫ'),
- (0xAB5F, 'M', 'ꭒ'),
- (0xAB60, 'V'),
- (0xAB69, 'M', 'ʍ'),
- (0xAB6A, 'V'),
- (0xAB6C, 'X'),
- (0xAB70, 'M', 'Ꭰ'),
- (0xAB71, 'M', 'Ꭱ'),
- (0xAB72, 'M', 'Ꭲ'),
- (0xAB73, 'M', 'Ꭳ'),
- (0xAB74, 'M', 'Ꭴ'),
- (0xAB75, 'M', 'Ꭵ'),
- (0xAB76, 'M', 'Ꭶ'),
- (0xAB77, 'M', 'Ꭷ'),
- (0xAB78, 'M', 'Ꭸ'),
- (0xAB79, 'M', 'Ꭹ'),
- (0xAB7A, 'M', 'Ꭺ'),
- (0xAB7B, 'M', 'Ꭻ'),
- (0xAB7C, 'M', 'Ꭼ'),
- (0xAB7D, 'M', 'Ꭽ'),
- (0xAB7E, 'M', 'Ꭾ'),
- (0xAB7F, 'M', 'Ꭿ'),
- (0xAB80, 'M', 'Ꮀ'),
- (0xAB81, 'M', 'Ꮁ'),
- (0xAB82, 'M', 'Ꮂ'),
- (0xAB83, 'M', 'Ꮃ'),
- (0xAB84, 'M', 'Ꮄ'),
- (0xAB85, 'M', 'Ꮅ'),
- (0xAB86, 'M', 'Ꮆ'),
- (0xAB87, 'M', 'Ꮇ'),
- (0xAB88, 'M', 'Ꮈ'),
- (0xAB89, 'M', 'Ꮉ'),
- (0xAB8A, 'M', 'Ꮊ'),
- (0xAB8B, 'M', 'Ꮋ'),
- (0xAB8C, 'M', 'Ꮌ'),
- (0xAB8D, 'M', 'Ꮍ'),
- (0xAB8E, 'M', 'Ꮎ'),
- (0xAB8F, 'M', 'Ꮏ'),
- (0xAB90, 'M', 'Ꮐ'),
- (0xAB91, 'M', 'Ꮑ'),
- (0xAB92, 'M', 'Ꮒ'),
- (0xAB93, 'M', 'Ꮓ'),
- (0xAB94, 'M', 'Ꮔ'),
- (0xAB95, 'M', 'Ꮕ'),
- (0xAB96, 'M', 'Ꮖ'),
- (0xAB97, 'M', 'Ꮗ'),
- (0xAB98, 'M', 'Ꮘ'),
- (0xAB99, 'M', 'Ꮙ'),
- (0xAB9A, 'M', 'Ꮚ'),
- (0xAB9B, 'M', 'Ꮛ'),
- (0xAB9C, 'M', 'Ꮜ'),
- (0xAB9D, 'M', 'Ꮝ'),
- (0xAB9E, 'M', 'Ꮞ'),
- (0xAB9F, 'M', 'Ꮟ'),
- (0xABA0, 'M', 'Ꮠ'),
- (0xABA1, 'M', 'Ꮡ'),
- (0xABA2, 'M', 'Ꮢ'),
- (0xABA3, 'M', 'Ꮣ'),
- (0xABA4, 'M', 'Ꮤ'),
- (0xABA5, 'M', 'Ꮥ'),
- (0xABA6, 'M', 'Ꮦ'),
- (0xABA7, 'M', 'Ꮧ'),
- (0xABA8, 'M', 'Ꮨ'),
- (0xABA9, 'M', 'Ꮩ'),
- (0xABAA, 'M', 'Ꮪ'),
- ]
-
-def _seg_39() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xABAB, 'M', 'Ꮫ'),
- (0xABAC, 'M', 'Ꮬ'),
- (0xABAD, 'M', 'Ꮭ'),
- (0xABAE, 'M', 'Ꮮ'),
- (0xABAF, 'M', 'Ꮯ'),
- (0xABB0, 'M', 'Ꮰ'),
- (0xABB1, 'M', 'Ꮱ'),
- (0xABB2, 'M', 'Ꮲ'),
- (0xABB3, 'M', 'Ꮳ'),
- (0xABB4, 'M', 'Ꮴ'),
- (0xABB5, 'M', 'Ꮵ'),
- (0xABB6, 'M', 'Ꮶ'),
- (0xABB7, 'M', 'Ꮷ'),
- (0xABB8, 'M', 'Ꮸ'),
- (0xABB9, 'M', 'Ꮹ'),
- (0xABBA, 'M', 'Ꮺ'),
- (0xABBB, 'M', 'Ꮻ'),
- (0xABBC, 'M', 'Ꮼ'),
- (0xABBD, 'M', 'Ꮽ'),
- (0xABBE, 'M', 'Ꮾ'),
- (0xABBF, 'M', 'Ꮿ'),
- (0xABC0, 'V'),
- (0xABEE, 'X'),
- (0xABF0, 'V'),
- (0xABFA, 'X'),
- (0xAC00, 'V'),
- (0xD7A4, 'X'),
- (0xD7B0, 'V'),
- (0xD7C7, 'X'),
- (0xD7CB, 'V'),
- (0xD7FC, 'X'),
- (0xF900, 'M', '豈'),
- (0xF901, 'M', '更'),
- (0xF902, 'M', '車'),
- (0xF903, 'M', '賈'),
- (0xF904, 'M', '滑'),
- (0xF905, 'M', '串'),
- (0xF906, 'M', '句'),
- (0xF907, 'M', '龜'),
- (0xF909, 'M', '契'),
- (0xF90A, 'M', '金'),
- (0xF90B, 'M', '喇'),
- (0xF90C, 'M', '奈'),
- (0xF90D, 'M', '懶'),
- (0xF90E, 'M', '癩'),
- (0xF90F, 'M', '羅'),
- (0xF910, 'M', '蘿'),
- (0xF911, 'M', '螺'),
- (0xF912, 'M', '裸'),
- (0xF913, 'M', '邏'),
- (0xF914, 'M', '樂'),
- (0xF915, 'M', '洛'),
- (0xF916, 'M', '烙'),
- (0xF917, 'M', '珞'),
- (0xF918, 'M', '落'),
- (0xF919, 'M', '酪'),
- (0xF91A, 'M', '駱'),
- (0xF91B, 'M', '亂'),
- (0xF91C, 'M', '卵'),
- (0xF91D, 'M', '欄'),
- (0xF91E, 'M', '爛'),
- (0xF91F, 'M', '蘭'),
- (0xF920, 'M', '鸞'),
- (0xF921, 'M', '嵐'),
- (0xF922, 'M', '濫'),
- (0xF923, 'M', '藍'),
- (0xF924, 'M', '襤'),
- (0xF925, 'M', '拉'),
- (0xF926, 'M', '臘'),
- (0xF927, 'M', '蠟'),
- (0xF928, 'M', '廊'),
- (0xF929, 'M', '朗'),
- (0xF92A, 'M', '浪'),
- (0xF92B, 'M', '狼'),
- (0xF92C, 'M', '郎'),
- (0xF92D, 'M', '來'),
- (0xF92E, 'M', '冷'),
- (0xF92F, 'M', '勞'),
- (0xF930, 'M', '擄'),
- (0xF931, 'M', '櫓'),
- (0xF932, 'M', '爐'),
- (0xF933, 'M', '盧'),
- (0xF934, 'M', '老'),
- (0xF935, 'M', '蘆'),
- (0xF936, 'M', '虜'),
- (0xF937, 'M', '路'),
- (0xF938, 'M', '露'),
- (0xF939, 'M', '魯'),
- (0xF93A, 'M', '鷺'),
- (0xF93B, 'M', '碌'),
- (0xF93C, 'M', '祿'),
- (0xF93D, 'M', '綠'),
- (0xF93E, 'M', '菉'),
- (0xF93F, 'M', '錄'),
- (0xF940, 'M', '鹿'),
- (0xF941, 'M', '論'),
- (0xF942, 'M', '壟'),
- (0xF943, 'M', '弄'),
- (0xF944, 'M', '籠'),
- (0xF945, 'M', '聾'),
- ]
-
-def _seg_40() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xF946, 'M', '牢'),
- (0xF947, 'M', '磊'),
- (0xF948, 'M', '賂'),
- (0xF949, 'M', '雷'),
- (0xF94A, 'M', '壘'),
- (0xF94B, 'M', '屢'),
- (0xF94C, 'M', '樓'),
- (0xF94D, 'M', '淚'),
- (0xF94E, 'M', '漏'),
- (0xF94F, 'M', '累'),
- (0xF950, 'M', '縷'),
- (0xF951, 'M', '陋'),
- (0xF952, 'M', '勒'),
- (0xF953, 'M', '肋'),
- (0xF954, 'M', '凜'),
- (0xF955, 'M', '凌'),
- (0xF956, 'M', '稜'),
- (0xF957, 'M', '綾'),
- (0xF958, 'M', '菱'),
- (0xF959, 'M', '陵'),
- (0xF95A, 'M', '讀'),
- (0xF95B, 'M', '拏'),
- (0xF95C, 'M', '樂'),
- (0xF95D, 'M', '諾'),
- (0xF95E, 'M', '丹'),
- (0xF95F, 'M', '寧'),
- (0xF960, 'M', '怒'),
- (0xF961, 'M', '率'),
- (0xF962, 'M', '異'),
- (0xF963, 'M', '北'),
- (0xF964, 'M', '磻'),
- (0xF965, 'M', '便'),
- (0xF966, 'M', '復'),
- (0xF967, 'M', '不'),
- (0xF968, 'M', '泌'),
- (0xF969, 'M', '數'),
- (0xF96A, 'M', '索'),
- (0xF96B, 'M', '參'),
- (0xF96C, 'M', '塞'),
- (0xF96D, 'M', '省'),
- (0xF96E, 'M', '葉'),
- (0xF96F, 'M', '說'),
- (0xF970, 'M', '殺'),
- (0xF971, 'M', '辰'),
- (0xF972, 'M', '沈'),
- (0xF973, 'M', '拾'),
- (0xF974, 'M', '若'),
- (0xF975, 'M', '掠'),
- (0xF976, 'M', '略'),
- (0xF977, 'M', '亮'),
- (0xF978, 'M', '兩'),
- (0xF979, 'M', '凉'),
- (0xF97A, 'M', '梁'),
- (0xF97B, 'M', '糧'),
- (0xF97C, 'M', '良'),
- (0xF97D, 'M', '諒'),
- (0xF97E, 'M', '量'),
- (0xF97F, 'M', '勵'),
- (0xF980, 'M', '呂'),
- (0xF981, 'M', '女'),
- (0xF982, 'M', '廬'),
- (0xF983, 'M', '旅'),
- (0xF984, 'M', '濾'),
- (0xF985, 'M', '礪'),
- (0xF986, 'M', '閭'),
- (0xF987, 'M', '驪'),
- (0xF988, 'M', '麗'),
- (0xF989, 'M', '黎'),
- (0xF98A, 'M', '力'),
- (0xF98B, 'M', '曆'),
- (0xF98C, 'M', '歷'),
- (0xF98D, 'M', '轢'),
- (0xF98E, 'M', '年'),
- (0xF98F, 'M', '憐'),
- (0xF990, 'M', '戀'),
- (0xF991, 'M', '撚'),
- (0xF992, 'M', '漣'),
- (0xF993, 'M', '煉'),
- (0xF994, 'M', '璉'),
- (0xF995, 'M', '秊'),
- (0xF996, 'M', '練'),
- (0xF997, 'M', '聯'),
- (0xF998, 'M', '輦'),
- (0xF999, 'M', '蓮'),
- (0xF99A, 'M', '連'),
- (0xF99B, 'M', '鍊'),
- (0xF99C, 'M', '列'),
- (0xF99D, 'M', '劣'),
- (0xF99E, 'M', '咽'),
- (0xF99F, 'M', '烈'),
- (0xF9A0, 'M', '裂'),
- (0xF9A1, 'M', '說'),
- (0xF9A2, 'M', '廉'),
- (0xF9A3, 'M', '念'),
- (0xF9A4, 'M', '捻'),
- (0xF9A5, 'M', '殮'),
- (0xF9A6, 'M', '簾'),
- (0xF9A7, 'M', '獵'),
- (0xF9A8, 'M', '令'),
- (0xF9A9, 'M', '囹'),
- ]
-
-def _seg_41() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xF9AA, 'M', '寧'),
- (0xF9AB, 'M', '嶺'),
- (0xF9AC, 'M', '怜'),
- (0xF9AD, 'M', '玲'),
- (0xF9AE, 'M', '瑩'),
- (0xF9AF, 'M', '羚'),
- (0xF9B0, 'M', '聆'),
- (0xF9B1, 'M', '鈴'),
- (0xF9B2, 'M', '零'),
- (0xF9B3, 'M', '靈'),
- (0xF9B4, 'M', '領'),
- (0xF9B5, 'M', '例'),
- (0xF9B6, 'M', '禮'),
- (0xF9B7, 'M', '醴'),
- (0xF9B8, 'M', '隸'),
- (0xF9B9, 'M', '惡'),
- (0xF9BA, 'M', '了'),
- (0xF9BB, 'M', '僚'),
- (0xF9BC, 'M', '寮'),
- (0xF9BD, 'M', '尿'),
- (0xF9BE, 'M', '料'),
- (0xF9BF, 'M', '樂'),
- (0xF9C0, 'M', '燎'),
- (0xF9C1, 'M', '療'),
- (0xF9C2, 'M', '蓼'),
- (0xF9C3, 'M', '遼'),
- (0xF9C4, 'M', '龍'),
- (0xF9C5, 'M', '暈'),
- (0xF9C6, 'M', '阮'),
- (0xF9C7, 'M', '劉'),
- (0xF9C8, 'M', '杻'),
- (0xF9C9, 'M', '柳'),
- (0xF9CA, 'M', '流'),
- (0xF9CB, 'M', '溜'),
- (0xF9CC, 'M', '琉'),
- (0xF9CD, 'M', '留'),
- (0xF9CE, 'M', '硫'),
- (0xF9CF, 'M', '紐'),
- (0xF9D0, 'M', '類'),
- (0xF9D1, 'M', '六'),
- (0xF9D2, 'M', '戮'),
- (0xF9D3, 'M', '陸'),
- (0xF9D4, 'M', '倫'),
- (0xF9D5, 'M', '崙'),
- (0xF9D6, 'M', '淪'),
- (0xF9D7, 'M', '輪'),
- (0xF9D8, 'M', '律'),
- (0xF9D9, 'M', '慄'),
- (0xF9DA, 'M', '栗'),
- (0xF9DB, 'M', '率'),
- (0xF9DC, 'M', '隆'),
- (0xF9DD, 'M', '利'),
- (0xF9DE, 'M', '吏'),
- (0xF9DF, 'M', '履'),
- (0xF9E0, 'M', '易'),
- (0xF9E1, 'M', '李'),
- (0xF9E2, 'M', '梨'),
- (0xF9E3, 'M', '泥'),
- (0xF9E4, 'M', '理'),
- (0xF9E5, 'M', '痢'),
- (0xF9E6, 'M', '罹'),
- (0xF9E7, 'M', '裏'),
- (0xF9E8, 'M', '裡'),
- (0xF9E9, 'M', '里'),
- (0xF9EA, 'M', '離'),
- (0xF9EB, 'M', '匿'),
- (0xF9EC, 'M', '溺'),
- (0xF9ED, 'M', '吝'),
- (0xF9EE, 'M', '燐'),
- (0xF9EF, 'M', '璘'),
- (0xF9F0, 'M', '藺'),
- (0xF9F1, 'M', '隣'),
- (0xF9F2, 'M', '鱗'),
- (0xF9F3, 'M', '麟'),
- (0xF9F4, 'M', '林'),
- (0xF9F5, 'M', '淋'),
- (0xF9F6, 'M', '臨'),
- (0xF9F7, 'M', '立'),
- (0xF9F8, 'M', '笠'),
- (0xF9F9, 'M', '粒'),
- (0xF9FA, 'M', '狀'),
- (0xF9FB, 'M', '炙'),
- (0xF9FC, 'M', '識'),
- (0xF9FD, 'M', '什'),
- (0xF9FE, 'M', '茶'),
- (0xF9FF, 'M', '刺'),
- (0xFA00, 'M', '切'),
- (0xFA01, 'M', '度'),
- (0xFA02, 'M', '拓'),
- (0xFA03, 'M', '糖'),
- (0xFA04, 'M', '宅'),
- (0xFA05, 'M', '洞'),
- (0xFA06, 'M', '暴'),
- (0xFA07, 'M', '輻'),
- (0xFA08, 'M', '行'),
- (0xFA09, 'M', '降'),
- (0xFA0A, 'M', '見'),
- (0xFA0B, 'M', '廓'),
- (0xFA0C, 'M', '兀'),
- (0xFA0D, 'M', '嗀'),
- ]
-
-def _seg_42() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFA0E, 'V'),
- (0xFA10, 'M', '塚'),
- (0xFA11, 'V'),
- (0xFA12, 'M', '晴'),
- (0xFA13, 'V'),
- (0xFA15, 'M', '凞'),
- (0xFA16, 'M', '猪'),
- (0xFA17, 'M', '益'),
- (0xFA18, 'M', '礼'),
- (0xFA19, 'M', '神'),
- (0xFA1A, 'M', '祥'),
- (0xFA1B, 'M', '福'),
- (0xFA1C, 'M', '靖'),
- (0xFA1D, 'M', '精'),
- (0xFA1E, 'M', '羽'),
- (0xFA1F, 'V'),
- (0xFA20, 'M', '蘒'),
- (0xFA21, 'V'),
- (0xFA22, 'M', '諸'),
- (0xFA23, 'V'),
- (0xFA25, 'M', '逸'),
- (0xFA26, 'M', '都'),
- (0xFA27, 'V'),
- (0xFA2A, 'M', '飯'),
- (0xFA2B, 'M', '飼'),
- (0xFA2C, 'M', '館'),
- (0xFA2D, 'M', '鶴'),
- (0xFA2E, 'M', '郞'),
- (0xFA2F, 'M', '隷'),
- (0xFA30, 'M', '侮'),
- (0xFA31, 'M', '僧'),
- (0xFA32, 'M', '免'),
- (0xFA33, 'M', '勉'),
- (0xFA34, 'M', '勤'),
- (0xFA35, 'M', '卑'),
- (0xFA36, 'M', '喝'),
- (0xFA37, 'M', '嘆'),
- (0xFA38, 'M', '器'),
- (0xFA39, 'M', '塀'),
- (0xFA3A, 'M', '墨'),
- (0xFA3B, 'M', '層'),
- (0xFA3C, 'M', '屮'),
- (0xFA3D, 'M', '悔'),
- (0xFA3E, 'M', '慨'),
- (0xFA3F, 'M', '憎'),
- (0xFA40, 'M', '懲'),
- (0xFA41, 'M', '敏'),
- (0xFA42, 'M', '既'),
- (0xFA43, 'M', '暑'),
- (0xFA44, 'M', '梅'),
- (0xFA45, 'M', '海'),
- (0xFA46, 'M', '渚'),
- (0xFA47, 'M', '漢'),
- (0xFA48, 'M', '煮'),
- (0xFA49, 'M', '爫'),
- (0xFA4A, 'M', '琢'),
- (0xFA4B, 'M', '碑'),
- (0xFA4C, 'M', '社'),
- (0xFA4D, 'M', '祉'),
- (0xFA4E, 'M', '祈'),
- (0xFA4F, 'M', '祐'),
- (0xFA50, 'M', '祖'),
- (0xFA51, 'M', '祝'),
- (0xFA52, 'M', '禍'),
- (0xFA53, 'M', '禎'),
- (0xFA54, 'M', '穀'),
- (0xFA55, 'M', '突'),
- (0xFA56, 'M', '節'),
- (0xFA57, 'M', '練'),
- (0xFA58, 'M', '縉'),
- (0xFA59, 'M', '繁'),
- (0xFA5A, 'M', '署'),
- (0xFA5B, 'M', '者'),
- (0xFA5C, 'M', '臭'),
- (0xFA5D, 'M', '艹'),
- (0xFA5F, 'M', '著'),
- (0xFA60, 'M', '褐'),
- (0xFA61, 'M', '視'),
- (0xFA62, 'M', '謁'),
- (0xFA63, 'M', '謹'),
- (0xFA64, 'M', '賓'),
- (0xFA65, 'M', '贈'),
- (0xFA66, 'M', '辶'),
- (0xFA67, 'M', '逸'),
- (0xFA68, 'M', '難'),
- (0xFA69, 'M', '響'),
- (0xFA6A, 'M', '頻'),
- (0xFA6B, 'M', '恵'),
- (0xFA6C, 'M', '𤋮'),
- (0xFA6D, 'M', '舘'),
- (0xFA6E, 'X'),
- (0xFA70, 'M', '並'),
- (0xFA71, 'M', '况'),
- (0xFA72, 'M', '全'),
- (0xFA73, 'M', '侀'),
- (0xFA74, 'M', '充'),
- (0xFA75, 'M', '冀'),
- (0xFA76, 'M', '勇'),
- (0xFA77, 'M', '勺'),
- (0xFA78, 'M', '喝'),
- ]
-
-def _seg_43() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFA79, 'M', '啕'),
- (0xFA7A, 'M', '喙'),
- (0xFA7B, 'M', '嗢'),
- (0xFA7C, 'M', '塚'),
- (0xFA7D, 'M', '墳'),
- (0xFA7E, 'M', '奄'),
- (0xFA7F, 'M', '奔'),
- (0xFA80, 'M', '婢'),
- (0xFA81, 'M', '嬨'),
- (0xFA82, 'M', '廒'),
- (0xFA83, 'M', '廙'),
- (0xFA84, 'M', '彩'),
- (0xFA85, 'M', '徭'),
- (0xFA86, 'M', '惘'),
- (0xFA87, 'M', '慎'),
- (0xFA88, 'M', '愈'),
- (0xFA89, 'M', '憎'),
- (0xFA8A, 'M', '慠'),
- (0xFA8B, 'M', '懲'),
- (0xFA8C, 'M', '戴'),
- (0xFA8D, 'M', '揄'),
- (0xFA8E, 'M', '搜'),
- (0xFA8F, 'M', '摒'),
- (0xFA90, 'M', '敖'),
- (0xFA91, 'M', '晴'),
- (0xFA92, 'M', '朗'),
- (0xFA93, 'M', '望'),
- (0xFA94, 'M', '杖'),
- (0xFA95, 'M', '歹'),
- (0xFA96, 'M', '殺'),
- (0xFA97, 'M', '流'),
- (0xFA98, 'M', '滛'),
- (0xFA99, 'M', '滋'),
- (0xFA9A, 'M', '漢'),
- (0xFA9B, 'M', '瀞'),
- (0xFA9C, 'M', '煮'),
- (0xFA9D, 'M', '瞧'),
- (0xFA9E, 'M', '爵'),
- (0xFA9F, 'M', '犯'),
- (0xFAA0, 'M', '猪'),
- (0xFAA1, 'M', '瑱'),
- (0xFAA2, 'M', '甆'),
- (0xFAA3, 'M', '画'),
- (0xFAA4, 'M', '瘝'),
- (0xFAA5, 'M', '瘟'),
- (0xFAA6, 'M', '益'),
- (0xFAA7, 'M', '盛'),
- (0xFAA8, 'M', '直'),
- (0xFAA9, 'M', '睊'),
- (0xFAAA, 'M', '着'),
- (0xFAAB, 'M', '磌'),
- (0xFAAC, 'M', '窱'),
- (0xFAAD, 'M', '節'),
- (0xFAAE, 'M', '类'),
- (0xFAAF, 'M', '絛'),
- (0xFAB0, 'M', '練'),
- (0xFAB1, 'M', '缾'),
- (0xFAB2, 'M', '者'),
- (0xFAB3, 'M', '荒'),
- (0xFAB4, 'M', '華'),
- (0xFAB5, 'M', '蝹'),
- (0xFAB6, 'M', '襁'),
- (0xFAB7, 'M', '覆'),
- (0xFAB8, 'M', '視'),
- (0xFAB9, 'M', '調'),
- (0xFABA, 'M', '諸'),
- (0xFABB, 'M', '請'),
- (0xFABC, 'M', '謁'),
- (0xFABD, 'M', '諾'),
- (0xFABE, 'M', '諭'),
- (0xFABF, 'M', '謹'),
- (0xFAC0, 'M', '變'),
- (0xFAC1, 'M', '贈'),
- (0xFAC2, 'M', '輸'),
- (0xFAC3, 'M', '遲'),
- (0xFAC4, 'M', '醙'),
- (0xFAC5, 'M', '鉶'),
- (0xFAC6, 'M', '陼'),
- (0xFAC7, 'M', '難'),
- (0xFAC8, 'M', '靖'),
- (0xFAC9, 'M', '韛'),
- (0xFACA, 'M', '響'),
- (0xFACB, 'M', '頋'),
- (0xFACC, 'M', '頻'),
- (0xFACD, 'M', '鬒'),
- (0xFACE, 'M', '龜'),
- (0xFACF, 'M', '𢡊'),
- (0xFAD0, 'M', '𢡄'),
- (0xFAD1, 'M', '𣏕'),
- (0xFAD2, 'M', '㮝'),
- (0xFAD3, 'M', '䀘'),
- (0xFAD4, 'M', '䀹'),
- (0xFAD5, 'M', '𥉉'),
- (0xFAD6, 'M', '𥳐'),
- (0xFAD7, 'M', '𧻓'),
- (0xFAD8, 'M', '齃'),
- (0xFAD9, 'M', '龎'),
- (0xFADA, 'X'),
- (0xFB00, 'M', 'ff'),
- (0xFB01, 'M', 'fi'),
- ]
-
-def _seg_44() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFB02, 'M', 'fl'),
- (0xFB03, 'M', 'ffi'),
- (0xFB04, 'M', 'ffl'),
- (0xFB05, 'M', 'st'),
- (0xFB07, 'X'),
- (0xFB13, 'M', 'մն'),
- (0xFB14, 'M', 'մե'),
- (0xFB15, 'M', 'մի'),
- (0xFB16, 'M', 'վն'),
- (0xFB17, 'M', 'մխ'),
- (0xFB18, 'X'),
- (0xFB1D, 'M', 'יִ'),
- (0xFB1E, 'V'),
- (0xFB1F, 'M', 'ײַ'),
- (0xFB20, 'M', 'ע'),
- (0xFB21, 'M', 'א'),
- (0xFB22, 'M', 'ד'),
- (0xFB23, 'M', 'ה'),
- (0xFB24, 'M', 'כ'),
- (0xFB25, 'M', 'ל'),
- (0xFB26, 'M', 'ם'),
- (0xFB27, 'M', 'ר'),
- (0xFB28, 'M', 'ת'),
- (0xFB29, '3', '+'),
- (0xFB2A, 'M', 'שׁ'),
- (0xFB2B, 'M', 'שׂ'),
- (0xFB2C, 'M', 'שּׁ'),
- (0xFB2D, 'M', 'שּׂ'),
- (0xFB2E, 'M', 'אַ'),
- (0xFB2F, 'M', 'אָ'),
- (0xFB30, 'M', 'אּ'),
- (0xFB31, 'M', 'בּ'),
- (0xFB32, 'M', 'גּ'),
- (0xFB33, 'M', 'דּ'),
- (0xFB34, 'M', 'הּ'),
- (0xFB35, 'M', 'וּ'),
- (0xFB36, 'M', 'זּ'),
- (0xFB37, 'X'),
- (0xFB38, 'M', 'טּ'),
- (0xFB39, 'M', 'יּ'),
- (0xFB3A, 'M', 'ךּ'),
- (0xFB3B, 'M', 'כּ'),
- (0xFB3C, 'M', 'לּ'),
- (0xFB3D, 'X'),
- (0xFB3E, 'M', 'מּ'),
- (0xFB3F, 'X'),
- (0xFB40, 'M', 'נּ'),
- (0xFB41, 'M', 'סּ'),
- (0xFB42, 'X'),
- (0xFB43, 'M', 'ףּ'),
- (0xFB44, 'M', 'פּ'),
- (0xFB45, 'X'),
- (0xFB46, 'M', 'צּ'),
- (0xFB47, 'M', 'קּ'),
- (0xFB48, 'M', 'רּ'),
- (0xFB49, 'M', 'שּ'),
- (0xFB4A, 'M', 'תּ'),
- (0xFB4B, 'M', 'וֹ'),
- (0xFB4C, 'M', 'בֿ'),
- (0xFB4D, 'M', 'כֿ'),
- (0xFB4E, 'M', 'פֿ'),
- (0xFB4F, 'M', 'אל'),
- (0xFB50, 'M', 'ٱ'),
- (0xFB52, 'M', 'ٻ'),
- (0xFB56, 'M', 'پ'),
- (0xFB5A, 'M', 'ڀ'),
- (0xFB5E, 'M', 'ٺ'),
- (0xFB62, 'M', 'ٿ'),
- (0xFB66, 'M', 'ٹ'),
- (0xFB6A, 'M', 'ڤ'),
- (0xFB6E, 'M', 'ڦ'),
- (0xFB72, 'M', 'ڄ'),
- (0xFB76, 'M', 'ڃ'),
- (0xFB7A, 'M', 'چ'),
- (0xFB7E, 'M', 'ڇ'),
- (0xFB82, 'M', 'ڍ'),
- (0xFB84, 'M', 'ڌ'),
- (0xFB86, 'M', 'ڎ'),
- (0xFB88, 'M', 'ڈ'),
- (0xFB8A, 'M', 'ژ'),
- (0xFB8C, 'M', 'ڑ'),
- (0xFB8E, 'M', 'ک'),
- (0xFB92, 'M', 'گ'),
- (0xFB96, 'M', 'ڳ'),
- (0xFB9A, 'M', 'ڱ'),
- (0xFB9E, 'M', 'ں'),
- (0xFBA0, 'M', 'ڻ'),
- (0xFBA4, 'M', 'ۀ'),
- (0xFBA6, 'M', 'ہ'),
- (0xFBAA, 'M', 'ھ'),
- (0xFBAE, 'M', 'ے'),
- (0xFBB0, 'M', 'ۓ'),
- (0xFBB2, 'V'),
- (0xFBC3, 'X'),
- (0xFBD3, 'M', 'ڭ'),
- (0xFBD7, 'M', 'ۇ'),
- (0xFBD9, 'M', 'ۆ'),
- (0xFBDB, 'M', 'ۈ'),
- (0xFBDD, 'M', 'ۇٴ'),
- (0xFBDE, 'M', 'ۋ'),
- ]
-
-def _seg_45() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFBE0, 'M', 'ۅ'),
- (0xFBE2, 'M', 'ۉ'),
- (0xFBE4, 'M', 'ې'),
- (0xFBE8, 'M', 'ى'),
- (0xFBEA, 'M', 'ئا'),
- (0xFBEC, 'M', 'ئە'),
- (0xFBEE, 'M', 'ئو'),
- (0xFBF0, 'M', 'ئۇ'),
- (0xFBF2, 'M', 'ئۆ'),
- (0xFBF4, 'M', 'ئۈ'),
- (0xFBF6, 'M', 'ئې'),
- (0xFBF9, 'M', 'ئى'),
- (0xFBFC, 'M', 'ی'),
- (0xFC00, 'M', 'ئج'),
- (0xFC01, 'M', 'ئح'),
- (0xFC02, 'M', 'ئم'),
- (0xFC03, 'M', 'ئى'),
- (0xFC04, 'M', 'ئي'),
- (0xFC05, 'M', 'بج'),
- (0xFC06, 'M', 'بح'),
- (0xFC07, 'M', 'بخ'),
- (0xFC08, 'M', 'بم'),
- (0xFC09, 'M', 'بى'),
- (0xFC0A, 'M', 'بي'),
- (0xFC0B, 'M', 'تج'),
- (0xFC0C, 'M', 'تح'),
- (0xFC0D, 'M', 'تخ'),
- (0xFC0E, 'M', 'تم'),
- (0xFC0F, 'M', 'تى'),
- (0xFC10, 'M', 'تي'),
- (0xFC11, 'M', 'ثج'),
- (0xFC12, 'M', 'ثم'),
- (0xFC13, 'M', 'ثى'),
- (0xFC14, 'M', 'ثي'),
- (0xFC15, 'M', 'جح'),
- (0xFC16, 'M', 'جم'),
- (0xFC17, 'M', 'حج'),
- (0xFC18, 'M', 'حم'),
- (0xFC19, 'M', 'خج'),
- (0xFC1A, 'M', 'خح'),
- (0xFC1B, 'M', 'خم'),
- (0xFC1C, 'M', 'سج'),
- (0xFC1D, 'M', 'سح'),
- (0xFC1E, 'M', 'سخ'),
- (0xFC1F, 'M', 'سم'),
- (0xFC20, 'M', 'صح'),
- (0xFC21, 'M', 'صم'),
- (0xFC22, 'M', 'ضج'),
- (0xFC23, 'M', 'ضح'),
- (0xFC24, 'M', 'ضخ'),
- (0xFC25, 'M', 'ضم'),
- (0xFC26, 'M', 'طح'),
- (0xFC27, 'M', 'طم'),
- (0xFC28, 'M', 'ظم'),
- (0xFC29, 'M', 'عج'),
- (0xFC2A, 'M', 'عم'),
- (0xFC2B, 'M', 'غج'),
- (0xFC2C, 'M', 'غم'),
- (0xFC2D, 'M', 'فج'),
- (0xFC2E, 'M', 'فح'),
- (0xFC2F, 'M', 'فخ'),
- (0xFC30, 'M', 'فم'),
- (0xFC31, 'M', 'فى'),
- (0xFC32, 'M', 'في'),
- (0xFC33, 'M', 'قح'),
- (0xFC34, 'M', 'قم'),
- (0xFC35, 'M', 'قى'),
- (0xFC36, 'M', 'قي'),
- (0xFC37, 'M', 'كا'),
- (0xFC38, 'M', 'كج'),
- (0xFC39, 'M', 'كح'),
- (0xFC3A, 'M', 'كخ'),
- (0xFC3B, 'M', 'كل'),
- (0xFC3C, 'M', 'كم'),
- (0xFC3D, 'M', 'كى'),
- (0xFC3E, 'M', 'كي'),
- (0xFC3F, 'M', 'لج'),
- (0xFC40, 'M', 'لح'),
- (0xFC41, 'M', 'لخ'),
- (0xFC42, 'M', 'لم'),
- (0xFC43, 'M', 'لى'),
- (0xFC44, 'M', 'لي'),
- (0xFC45, 'M', 'مج'),
- (0xFC46, 'M', 'مح'),
- (0xFC47, 'M', 'مخ'),
- (0xFC48, 'M', 'مم'),
- (0xFC49, 'M', 'مى'),
- (0xFC4A, 'M', 'مي'),
- (0xFC4B, 'M', 'نج'),
- (0xFC4C, 'M', 'نح'),
- (0xFC4D, 'M', 'نخ'),
- (0xFC4E, 'M', 'نم'),
- (0xFC4F, 'M', 'نى'),
- (0xFC50, 'M', 'ني'),
- (0xFC51, 'M', 'هج'),
- (0xFC52, 'M', 'هم'),
- (0xFC53, 'M', 'هى'),
- (0xFC54, 'M', 'هي'),
- (0xFC55, 'M', 'يج'),
- (0xFC56, 'M', 'يح'),
- ]
-
-def _seg_46() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFC57, 'M', 'يخ'),
- (0xFC58, 'M', 'يم'),
- (0xFC59, 'M', 'يى'),
- (0xFC5A, 'M', 'يي'),
- (0xFC5B, 'M', 'ذٰ'),
- (0xFC5C, 'M', 'رٰ'),
- (0xFC5D, 'M', 'ىٰ'),
- (0xFC5E, '3', ' ٌّ'),
- (0xFC5F, '3', ' ٍّ'),
- (0xFC60, '3', ' َّ'),
- (0xFC61, '3', ' ُّ'),
- (0xFC62, '3', ' ِّ'),
- (0xFC63, '3', ' ّٰ'),
- (0xFC64, 'M', 'ئر'),
- (0xFC65, 'M', 'ئز'),
- (0xFC66, 'M', 'ئم'),
- (0xFC67, 'M', 'ئن'),
- (0xFC68, 'M', 'ئى'),
- (0xFC69, 'M', 'ئي'),
- (0xFC6A, 'M', 'بر'),
- (0xFC6B, 'M', 'بز'),
- (0xFC6C, 'M', 'بم'),
- (0xFC6D, 'M', 'بن'),
- (0xFC6E, 'M', 'بى'),
- (0xFC6F, 'M', 'بي'),
- (0xFC70, 'M', 'تر'),
- (0xFC71, 'M', 'تز'),
- (0xFC72, 'M', 'تم'),
- (0xFC73, 'M', 'تن'),
- (0xFC74, 'M', 'تى'),
- (0xFC75, 'M', 'تي'),
- (0xFC76, 'M', 'ثر'),
- (0xFC77, 'M', 'ثز'),
- (0xFC78, 'M', 'ثم'),
- (0xFC79, 'M', 'ثن'),
- (0xFC7A, 'M', 'ثى'),
- (0xFC7B, 'M', 'ثي'),
- (0xFC7C, 'M', 'فى'),
- (0xFC7D, 'M', 'في'),
- (0xFC7E, 'M', 'قى'),
- (0xFC7F, 'M', 'قي'),
- (0xFC80, 'M', 'كا'),
- (0xFC81, 'M', 'كل'),
- (0xFC82, 'M', 'كم'),
- (0xFC83, 'M', 'كى'),
- (0xFC84, 'M', 'كي'),
- (0xFC85, 'M', 'لم'),
- (0xFC86, 'M', 'لى'),
- (0xFC87, 'M', 'لي'),
- (0xFC88, 'M', 'ما'),
- (0xFC89, 'M', 'مم'),
- (0xFC8A, 'M', 'نر'),
- (0xFC8B, 'M', 'نز'),
- (0xFC8C, 'M', 'نم'),
- (0xFC8D, 'M', 'نن'),
- (0xFC8E, 'M', 'نى'),
- (0xFC8F, 'M', 'ني'),
- (0xFC90, 'M', 'ىٰ'),
- (0xFC91, 'M', 'ير'),
- (0xFC92, 'M', 'يز'),
- (0xFC93, 'M', 'يم'),
- (0xFC94, 'M', 'ين'),
- (0xFC95, 'M', 'يى'),
- (0xFC96, 'M', 'يي'),
- (0xFC97, 'M', 'ئج'),
- (0xFC98, 'M', 'ئح'),
- (0xFC99, 'M', 'ئخ'),
- (0xFC9A, 'M', 'ئم'),
- (0xFC9B, 'M', 'ئه'),
- (0xFC9C, 'M', 'بج'),
- (0xFC9D, 'M', 'بح'),
- (0xFC9E, 'M', 'بخ'),
- (0xFC9F, 'M', 'بم'),
- (0xFCA0, 'M', 'به'),
- (0xFCA1, 'M', 'تج'),
- (0xFCA2, 'M', 'تح'),
- (0xFCA3, 'M', 'تخ'),
- (0xFCA4, 'M', 'تم'),
- (0xFCA5, 'M', 'ته'),
- (0xFCA6, 'M', 'ثم'),
- (0xFCA7, 'M', 'جح'),
- (0xFCA8, 'M', 'جم'),
- (0xFCA9, 'M', 'حج'),
- (0xFCAA, 'M', 'حم'),
- (0xFCAB, 'M', 'خج'),
- (0xFCAC, 'M', 'خم'),
- (0xFCAD, 'M', 'سج'),
- (0xFCAE, 'M', 'سح'),
- (0xFCAF, 'M', 'سخ'),
- (0xFCB0, 'M', 'سم'),
- (0xFCB1, 'M', 'صح'),
- (0xFCB2, 'M', 'صخ'),
- (0xFCB3, 'M', 'صم'),
- (0xFCB4, 'M', 'ضج'),
- (0xFCB5, 'M', 'ضح'),
- (0xFCB6, 'M', 'ضخ'),
- (0xFCB7, 'M', 'ضم'),
- (0xFCB8, 'M', 'طح'),
- (0xFCB9, 'M', 'ظم'),
- (0xFCBA, 'M', 'عج'),
- ]
-
-def _seg_47() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFCBB, 'M', 'عم'),
- (0xFCBC, 'M', 'غج'),
- (0xFCBD, 'M', 'غم'),
- (0xFCBE, 'M', 'فج'),
- (0xFCBF, 'M', 'فح'),
- (0xFCC0, 'M', 'فخ'),
- (0xFCC1, 'M', 'فم'),
- (0xFCC2, 'M', 'قح'),
- (0xFCC3, 'M', 'قم'),
- (0xFCC4, 'M', 'كج'),
- (0xFCC5, 'M', 'كح'),
- (0xFCC6, 'M', 'كخ'),
- (0xFCC7, 'M', 'كل'),
- (0xFCC8, 'M', 'كم'),
- (0xFCC9, 'M', 'لج'),
- (0xFCCA, 'M', 'لح'),
- (0xFCCB, 'M', 'لخ'),
- (0xFCCC, 'M', 'لم'),
- (0xFCCD, 'M', 'له'),
- (0xFCCE, 'M', 'مج'),
- (0xFCCF, 'M', 'مح'),
- (0xFCD0, 'M', 'مخ'),
- (0xFCD1, 'M', 'مم'),
- (0xFCD2, 'M', 'نج'),
- (0xFCD3, 'M', 'نح'),
- (0xFCD4, 'M', 'نخ'),
- (0xFCD5, 'M', 'نم'),
- (0xFCD6, 'M', 'نه'),
- (0xFCD7, 'M', 'هج'),
- (0xFCD8, 'M', 'هم'),
- (0xFCD9, 'M', 'هٰ'),
- (0xFCDA, 'M', 'يج'),
- (0xFCDB, 'M', 'يح'),
- (0xFCDC, 'M', 'يخ'),
- (0xFCDD, 'M', 'يم'),
- (0xFCDE, 'M', 'يه'),
- (0xFCDF, 'M', 'ئم'),
- (0xFCE0, 'M', 'ئه'),
- (0xFCE1, 'M', 'بم'),
- (0xFCE2, 'M', 'به'),
- (0xFCE3, 'M', 'تم'),
- (0xFCE4, 'M', 'ته'),
- (0xFCE5, 'M', 'ثم'),
- (0xFCE6, 'M', 'ثه'),
- (0xFCE7, 'M', 'سم'),
- (0xFCE8, 'M', 'سه'),
- (0xFCE9, 'M', 'شم'),
- (0xFCEA, 'M', 'شه'),
- (0xFCEB, 'M', 'كل'),
- (0xFCEC, 'M', 'كم'),
- (0xFCED, 'M', 'لم'),
- (0xFCEE, 'M', 'نم'),
- (0xFCEF, 'M', 'نه'),
- (0xFCF0, 'M', 'يم'),
- (0xFCF1, 'M', 'يه'),
- (0xFCF2, 'M', 'ـَّ'),
- (0xFCF3, 'M', 'ـُّ'),
- (0xFCF4, 'M', 'ـِّ'),
- (0xFCF5, 'M', 'طى'),
- (0xFCF6, 'M', 'طي'),
- (0xFCF7, 'M', 'عى'),
- (0xFCF8, 'M', 'عي'),
- (0xFCF9, 'M', 'غى'),
- (0xFCFA, 'M', 'غي'),
- (0xFCFB, 'M', 'سى'),
- (0xFCFC, 'M', 'سي'),
- (0xFCFD, 'M', 'شى'),
- (0xFCFE, 'M', 'شي'),
- (0xFCFF, 'M', 'حى'),
- (0xFD00, 'M', 'حي'),
- (0xFD01, 'M', 'جى'),
- (0xFD02, 'M', 'جي'),
- (0xFD03, 'M', 'خى'),
- (0xFD04, 'M', 'خي'),
- (0xFD05, 'M', 'صى'),
- (0xFD06, 'M', 'صي'),
- (0xFD07, 'M', 'ضى'),
- (0xFD08, 'M', 'ضي'),
- (0xFD09, 'M', 'شج'),
- (0xFD0A, 'M', 'شح'),
- (0xFD0B, 'M', 'شخ'),
- (0xFD0C, 'M', 'شم'),
- (0xFD0D, 'M', 'شر'),
- (0xFD0E, 'M', 'سر'),
- (0xFD0F, 'M', 'صر'),
- (0xFD10, 'M', 'ضر'),
- (0xFD11, 'M', 'طى'),
- (0xFD12, 'M', 'طي'),
- (0xFD13, 'M', 'عى'),
- (0xFD14, 'M', 'عي'),
- (0xFD15, 'M', 'غى'),
- (0xFD16, 'M', 'غي'),
- (0xFD17, 'M', 'سى'),
- (0xFD18, 'M', 'سي'),
- (0xFD19, 'M', 'شى'),
- (0xFD1A, 'M', 'شي'),
- (0xFD1B, 'M', 'حى'),
- (0xFD1C, 'M', 'حي'),
- (0xFD1D, 'M', 'جى'),
- (0xFD1E, 'M', 'جي'),
- ]
-
-def _seg_48() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFD1F, 'M', 'خى'),
- (0xFD20, 'M', 'خي'),
- (0xFD21, 'M', 'صى'),
- (0xFD22, 'M', 'صي'),
- (0xFD23, 'M', 'ضى'),
- (0xFD24, 'M', 'ضي'),
- (0xFD25, 'M', 'شج'),
- (0xFD26, 'M', 'شح'),
- (0xFD27, 'M', 'شخ'),
- (0xFD28, 'M', 'شم'),
- (0xFD29, 'M', 'شر'),
- (0xFD2A, 'M', 'سر'),
- (0xFD2B, 'M', 'صر'),
- (0xFD2C, 'M', 'ضر'),
- (0xFD2D, 'M', 'شج'),
- (0xFD2E, 'M', 'شح'),
- (0xFD2F, 'M', 'شخ'),
- (0xFD30, 'M', 'شم'),
- (0xFD31, 'M', 'سه'),
- (0xFD32, 'M', 'شه'),
- (0xFD33, 'M', 'طم'),
- (0xFD34, 'M', 'سج'),
- (0xFD35, 'M', 'سح'),
- (0xFD36, 'M', 'سخ'),
- (0xFD37, 'M', 'شج'),
- (0xFD38, 'M', 'شح'),
- (0xFD39, 'M', 'شخ'),
- (0xFD3A, 'M', 'طم'),
- (0xFD3B, 'M', 'ظم'),
- (0xFD3C, 'M', 'اً'),
- (0xFD3E, 'V'),
- (0xFD50, 'M', 'تجم'),
- (0xFD51, 'M', 'تحج'),
- (0xFD53, 'M', 'تحم'),
- (0xFD54, 'M', 'تخم'),
- (0xFD55, 'M', 'تمج'),
- (0xFD56, 'M', 'تمح'),
- (0xFD57, 'M', 'تمخ'),
- (0xFD58, 'M', 'جمح'),
- (0xFD5A, 'M', 'حمي'),
- (0xFD5B, 'M', 'حمى'),
- (0xFD5C, 'M', 'سحج'),
- (0xFD5D, 'M', 'سجح'),
- (0xFD5E, 'M', 'سجى'),
- (0xFD5F, 'M', 'سمح'),
- (0xFD61, 'M', 'سمج'),
- (0xFD62, 'M', 'سمم'),
- (0xFD64, 'M', 'صحح'),
- (0xFD66, 'M', 'صمم'),
- (0xFD67, 'M', 'شحم'),
- (0xFD69, 'M', 'شجي'),
- (0xFD6A, 'M', 'شمخ'),
- (0xFD6C, 'M', 'شمم'),
- (0xFD6E, 'M', 'ضحى'),
- (0xFD6F, 'M', 'ضخم'),
- (0xFD71, 'M', 'طمح'),
- (0xFD73, 'M', 'طمم'),
- (0xFD74, 'M', 'طمي'),
- (0xFD75, 'M', 'عجم'),
- (0xFD76, 'M', 'عمم'),
- (0xFD78, 'M', 'عمى'),
- (0xFD79, 'M', 'غمم'),
- (0xFD7A, 'M', 'غمي'),
- (0xFD7B, 'M', 'غمى'),
- (0xFD7C, 'M', 'فخم'),
- (0xFD7E, 'M', 'قمح'),
- (0xFD7F, 'M', 'قمم'),
- (0xFD80, 'M', 'لحم'),
- (0xFD81, 'M', 'لحي'),
- (0xFD82, 'M', 'لحى'),
- (0xFD83, 'M', 'لجج'),
- (0xFD85, 'M', 'لخم'),
- (0xFD87, 'M', 'لمح'),
- (0xFD89, 'M', 'محج'),
- (0xFD8A, 'M', 'محم'),
- (0xFD8B, 'M', 'محي'),
- (0xFD8C, 'M', 'مجح'),
- (0xFD8D, 'M', 'مجم'),
- (0xFD8E, 'M', 'مخج'),
- (0xFD8F, 'M', 'مخم'),
- (0xFD90, 'X'),
- (0xFD92, 'M', 'مجخ'),
- (0xFD93, 'M', 'همج'),
- (0xFD94, 'M', 'همم'),
- (0xFD95, 'M', 'نحم'),
- (0xFD96, 'M', 'نحى'),
- (0xFD97, 'M', 'نجم'),
- (0xFD99, 'M', 'نجى'),
- (0xFD9A, 'M', 'نمي'),
- (0xFD9B, 'M', 'نمى'),
- (0xFD9C, 'M', 'يمم'),
- (0xFD9E, 'M', 'بخي'),
- (0xFD9F, 'M', 'تجي'),
- (0xFDA0, 'M', 'تجى'),
- (0xFDA1, 'M', 'تخي'),
- (0xFDA2, 'M', 'تخى'),
- (0xFDA3, 'M', 'تمي'),
- (0xFDA4, 'M', 'تمى'),
- (0xFDA5, 'M', 'جمي'),
- (0xFDA6, 'M', 'جحى'),
- ]
-
-def _seg_49() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFDA7, 'M', 'جمى'),
- (0xFDA8, 'M', 'سخى'),
- (0xFDA9, 'M', 'صحي'),
- (0xFDAA, 'M', 'شحي'),
- (0xFDAB, 'M', 'ضحي'),
- (0xFDAC, 'M', 'لجي'),
- (0xFDAD, 'M', 'لمي'),
- (0xFDAE, 'M', 'يحي'),
- (0xFDAF, 'M', 'يجي'),
- (0xFDB0, 'M', 'يمي'),
- (0xFDB1, 'M', 'ممي'),
- (0xFDB2, 'M', 'قمي'),
- (0xFDB3, 'M', 'نحي'),
- (0xFDB4, 'M', 'قمح'),
- (0xFDB5, 'M', 'لحم'),
- (0xFDB6, 'M', 'عمي'),
- (0xFDB7, 'M', 'كمي'),
- (0xFDB8, 'M', 'نجح'),
- (0xFDB9, 'M', 'مخي'),
- (0xFDBA, 'M', 'لجم'),
- (0xFDBB, 'M', 'كمم'),
- (0xFDBC, 'M', 'لجم'),
- (0xFDBD, 'M', 'نجح'),
- (0xFDBE, 'M', 'جحي'),
- (0xFDBF, 'M', 'حجي'),
- (0xFDC0, 'M', 'مجي'),
- (0xFDC1, 'M', 'فمي'),
- (0xFDC2, 'M', 'بحي'),
- (0xFDC3, 'M', 'كمم'),
- (0xFDC4, 'M', 'عجم'),
- (0xFDC5, 'M', 'صمم'),
- (0xFDC6, 'M', 'سخي'),
- (0xFDC7, 'M', 'نجي'),
- (0xFDC8, 'X'),
- (0xFDCF, 'V'),
- (0xFDD0, 'X'),
- (0xFDF0, 'M', 'صلے'),
- (0xFDF1, 'M', 'قلے'),
- (0xFDF2, 'M', 'الله'),
- (0xFDF3, 'M', 'اكبر'),
- (0xFDF4, 'M', 'محمد'),
- (0xFDF5, 'M', 'صلعم'),
- (0xFDF6, 'M', 'رسول'),
- (0xFDF7, 'M', 'عليه'),
- (0xFDF8, 'M', 'وسلم'),
- (0xFDF9, 'M', 'صلى'),
- (0xFDFA, '3', 'صلى الله عليه وسلم'),
- (0xFDFB, '3', 'جل جلاله'),
- (0xFDFC, 'M', 'ریال'),
- (0xFDFD, 'V'),
- (0xFE00, 'I'),
- (0xFE10, '3', ','),
- (0xFE11, 'M', '、'),
- (0xFE12, 'X'),
- (0xFE13, '3', ':'),
- (0xFE14, '3', ';'),
- (0xFE15, '3', '!'),
- (0xFE16, '3', '?'),
- (0xFE17, 'M', '〖'),
- (0xFE18, 'M', '〗'),
- (0xFE19, 'X'),
- (0xFE20, 'V'),
- (0xFE30, 'X'),
- (0xFE31, 'M', '—'),
- (0xFE32, 'M', '–'),
- (0xFE33, '3', '_'),
- (0xFE35, '3', '('),
- (0xFE36, '3', ')'),
- (0xFE37, '3', '{'),
- (0xFE38, '3', '}'),
- (0xFE39, 'M', '〔'),
- (0xFE3A, 'M', '〕'),
- (0xFE3B, 'M', '【'),
- (0xFE3C, 'M', '】'),
- (0xFE3D, 'M', '《'),
- (0xFE3E, 'M', '》'),
- (0xFE3F, 'M', '〈'),
- (0xFE40, 'M', '〉'),
- (0xFE41, 'M', '「'),
- (0xFE42, 'M', '」'),
- (0xFE43, 'M', '『'),
- (0xFE44, 'M', '』'),
- (0xFE45, 'V'),
- (0xFE47, '3', '['),
- (0xFE48, '3', ']'),
- (0xFE49, '3', ' ̅'),
- (0xFE4D, '3', '_'),
- (0xFE50, '3', ','),
- (0xFE51, 'M', '、'),
- (0xFE52, 'X'),
- (0xFE54, '3', ';'),
- (0xFE55, '3', ':'),
- (0xFE56, '3', '?'),
- (0xFE57, '3', '!'),
- (0xFE58, 'M', '—'),
- (0xFE59, '3', '('),
- (0xFE5A, '3', ')'),
- (0xFE5B, '3', '{'),
- (0xFE5C, '3', '}'),
- (0xFE5D, 'M', '〔'),
- ]
-
-def _seg_50() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFE5E, 'M', '〕'),
- (0xFE5F, '3', '#'),
- (0xFE60, '3', '&'),
- (0xFE61, '3', '*'),
- (0xFE62, '3', '+'),
- (0xFE63, 'M', '-'),
- (0xFE64, '3', '<'),
- (0xFE65, '3', '>'),
- (0xFE66, '3', '='),
- (0xFE67, 'X'),
- (0xFE68, '3', '\\'),
- (0xFE69, '3', '$'),
- (0xFE6A, '3', '%'),
- (0xFE6B, '3', '@'),
- (0xFE6C, 'X'),
- (0xFE70, '3', ' ً'),
- (0xFE71, 'M', 'ـً'),
- (0xFE72, '3', ' ٌ'),
- (0xFE73, 'V'),
- (0xFE74, '3', ' ٍ'),
- (0xFE75, 'X'),
- (0xFE76, '3', ' َ'),
- (0xFE77, 'M', 'ـَ'),
- (0xFE78, '3', ' ُ'),
- (0xFE79, 'M', 'ـُ'),
- (0xFE7A, '3', ' ِ'),
- (0xFE7B, 'M', 'ـِ'),
- (0xFE7C, '3', ' ّ'),
- (0xFE7D, 'M', 'ـّ'),
- (0xFE7E, '3', ' ْ'),
- (0xFE7F, 'M', 'ـْ'),
- (0xFE80, 'M', 'ء'),
- (0xFE81, 'M', 'آ'),
- (0xFE83, 'M', 'أ'),
- (0xFE85, 'M', 'ؤ'),
- (0xFE87, 'M', 'إ'),
- (0xFE89, 'M', 'ئ'),
- (0xFE8D, 'M', 'ا'),
- (0xFE8F, 'M', 'ب'),
- (0xFE93, 'M', 'ة'),
- (0xFE95, 'M', 'ت'),
- (0xFE99, 'M', 'ث'),
- (0xFE9D, 'M', 'ج'),
- (0xFEA1, 'M', 'ح'),
- (0xFEA5, 'M', 'خ'),
- (0xFEA9, 'M', 'د'),
- (0xFEAB, 'M', 'ذ'),
- (0xFEAD, 'M', 'ر'),
- (0xFEAF, 'M', 'ز'),
- (0xFEB1, 'M', 'س'),
- (0xFEB5, 'M', 'ش'),
- (0xFEB9, 'M', 'ص'),
- (0xFEBD, 'M', 'ض'),
- (0xFEC1, 'M', 'ط'),
- (0xFEC5, 'M', 'ظ'),
- (0xFEC9, 'M', 'ع'),
- (0xFECD, 'M', 'غ'),
- (0xFED1, 'M', 'ف'),
- (0xFED5, 'M', 'ق'),
- (0xFED9, 'M', 'ك'),
- (0xFEDD, 'M', 'ل'),
- (0xFEE1, 'M', 'م'),
- (0xFEE5, 'M', 'ن'),
- (0xFEE9, 'M', 'ه'),
- (0xFEED, 'M', 'و'),
- (0xFEEF, 'M', 'ى'),
- (0xFEF1, 'M', 'ي'),
- (0xFEF5, 'M', 'لآ'),
- (0xFEF7, 'M', 'لأ'),
- (0xFEF9, 'M', 'لإ'),
- (0xFEFB, 'M', 'لا'),
- (0xFEFD, 'X'),
- (0xFEFF, 'I'),
- (0xFF00, 'X'),
- (0xFF01, '3', '!'),
- (0xFF02, '3', '"'),
- (0xFF03, '3', '#'),
- (0xFF04, '3', '$'),
- (0xFF05, '3', '%'),
- (0xFF06, '3', '&'),
- (0xFF07, '3', '\''),
- (0xFF08, '3', '('),
- (0xFF09, '3', ')'),
- (0xFF0A, '3', '*'),
- (0xFF0B, '3', '+'),
- (0xFF0C, '3', ','),
- (0xFF0D, 'M', '-'),
- (0xFF0E, 'M', '.'),
- (0xFF0F, '3', '/'),
- (0xFF10, 'M', '0'),
- (0xFF11, 'M', '1'),
- (0xFF12, 'M', '2'),
- (0xFF13, 'M', '3'),
- (0xFF14, 'M', '4'),
- (0xFF15, 'M', '5'),
- (0xFF16, 'M', '6'),
- (0xFF17, 'M', '7'),
- (0xFF18, 'M', '8'),
- (0xFF19, 'M', '9'),
- (0xFF1A, '3', ':'),
- ]
-
-def _seg_51() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFF1B, '3', ';'),
- (0xFF1C, '3', '<'),
- (0xFF1D, '3', '='),
- (0xFF1E, '3', '>'),
- (0xFF1F, '3', '?'),
- (0xFF20, '3', '@'),
- (0xFF21, 'M', 'a'),
- (0xFF22, 'M', 'b'),
- (0xFF23, 'M', 'c'),
- (0xFF24, 'M', 'd'),
- (0xFF25, 'M', 'e'),
- (0xFF26, 'M', 'f'),
- (0xFF27, 'M', 'g'),
- (0xFF28, 'M', 'h'),
- (0xFF29, 'M', 'i'),
- (0xFF2A, 'M', 'j'),
- (0xFF2B, 'M', 'k'),
- (0xFF2C, 'M', 'l'),
- (0xFF2D, 'M', 'm'),
- (0xFF2E, 'M', 'n'),
- (0xFF2F, 'M', 'o'),
- (0xFF30, 'M', 'p'),
- (0xFF31, 'M', 'q'),
- (0xFF32, 'M', 'r'),
- (0xFF33, 'M', 's'),
- (0xFF34, 'M', 't'),
- (0xFF35, 'M', 'u'),
- (0xFF36, 'M', 'v'),
- (0xFF37, 'M', 'w'),
- (0xFF38, 'M', 'x'),
- (0xFF39, 'M', 'y'),
- (0xFF3A, 'M', 'z'),
- (0xFF3B, '3', '['),
- (0xFF3C, '3', '\\'),
- (0xFF3D, '3', ']'),
- (0xFF3E, '3', '^'),
- (0xFF3F, '3', '_'),
- (0xFF40, '3', '`'),
- (0xFF41, 'M', 'a'),
- (0xFF42, 'M', 'b'),
- (0xFF43, 'M', 'c'),
- (0xFF44, 'M', 'd'),
- (0xFF45, 'M', 'e'),
- (0xFF46, 'M', 'f'),
- (0xFF47, 'M', 'g'),
- (0xFF48, 'M', 'h'),
- (0xFF49, 'M', 'i'),
- (0xFF4A, 'M', 'j'),
- (0xFF4B, 'M', 'k'),
- (0xFF4C, 'M', 'l'),
- (0xFF4D, 'M', 'm'),
- (0xFF4E, 'M', 'n'),
- (0xFF4F, 'M', 'o'),
- (0xFF50, 'M', 'p'),
- (0xFF51, 'M', 'q'),
- (0xFF52, 'M', 'r'),
- (0xFF53, 'M', 's'),
- (0xFF54, 'M', 't'),
- (0xFF55, 'M', 'u'),
- (0xFF56, 'M', 'v'),
- (0xFF57, 'M', 'w'),
- (0xFF58, 'M', 'x'),
- (0xFF59, 'M', 'y'),
- (0xFF5A, 'M', 'z'),
- (0xFF5B, '3', '{'),
- (0xFF5C, '3', '|'),
- (0xFF5D, '3', '}'),
- (0xFF5E, '3', '~'),
- (0xFF5F, 'M', '⦅'),
- (0xFF60, 'M', '⦆'),
- (0xFF61, 'M', '.'),
- (0xFF62, 'M', '「'),
- (0xFF63, 'M', '」'),
- (0xFF64, 'M', '、'),
- (0xFF65, 'M', '・'),
- (0xFF66, 'M', 'ヲ'),
- (0xFF67, 'M', 'ァ'),
- (0xFF68, 'M', 'ィ'),
- (0xFF69, 'M', 'ゥ'),
- (0xFF6A, 'M', 'ェ'),
- (0xFF6B, 'M', 'ォ'),
- (0xFF6C, 'M', 'ャ'),
- (0xFF6D, 'M', 'ュ'),
- (0xFF6E, 'M', 'ョ'),
- (0xFF6F, 'M', 'ッ'),
- (0xFF70, 'M', 'ー'),
- (0xFF71, 'M', 'ア'),
- (0xFF72, 'M', 'イ'),
- (0xFF73, 'M', 'ウ'),
- (0xFF74, 'M', 'エ'),
- (0xFF75, 'M', 'オ'),
- (0xFF76, 'M', 'カ'),
- (0xFF77, 'M', 'キ'),
- (0xFF78, 'M', 'ク'),
- (0xFF79, 'M', 'ケ'),
- (0xFF7A, 'M', 'コ'),
- (0xFF7B, 'M', 'サ'),
- (0xFF7C, 'M', 'シ'),
- (0xFF7D, 'M', 'ス'),
- (0xFF7E, 'M', 'セ'),
- ]
-
-def _seg_52() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFF7F, 'M', 'ソ'),
- (0xFF80, 'M', 'タ'),
- (0xFF81, 'M', 'チ'),
- (0xFF82, 'M', 'ツ'),
- (0xFF83, 'M', 'テ'),
- (0xFF84, 'M', 'ト'),
- (0xFF85, 'M', 'ナ'),
- (0xFF86, 'M', 'ニ'),
- (0xFF87, 'M', 'ヌ'),
- (0xFF88, 'M', 'ネ'),
- (0xFF89, 'M', 'ノ'),
- (0xFF8A, 'M', 'ハ'),
- (0xFF8B, 'M', 'ヒ'),
- (0xFF8C, 'M', 'フ'),
- (0xFF8D, 'M', 'ヘ'),
- (0xFF8E, 'M', 'ホ'),
- (0xFF8F, 'M', 'マ'),
- (0xFF90, 'M', 'ミ'),
- (0xFF91, 'M', 'ム'),
- (0xFF92, 'M', 'メ'),
- (0xFF93, 'M', 'モ'),
- (0xFF94, 'M', 'ヤ'),
- (0xFF95, 'M', 'ユ'),
- (0xFF96, 'M', 'ヨ'),
- (0xFF97, 'M', 'ラ'),
- (0xFF98, 'M', 'リ'),
- (0xFF99, 'M', 'ル'),
- (0xFF9A, 'M', 'レ'),
- (0xFF9B, 'M', 'ロ'),
- (0xFF9C, 'M', 'ワ'),
- (0xFF9D, 'M', 'ン'),
- (0xFF9E, 'M', '゙'),
- (0xFF9F, 'M', '゚'),
- (0xFFA0, 'X'),
- (0xFFA1, 'M', 'ᄀ'),
- (0xFFA2, 'M', 'ᄁ'),
- (0xFFA3, 'M', 'ᆪ'),
- (0xFFA4, 'M', 'ᄂ'),
- (0xFFA5, 'M', 'ᆬ'),
- (0xFFA6, 'M', 'ᆭ'),
- (0xFFA7, 'M', 'ᄃ'),
- (0xFFA8, 'M', 'ᄄ'),
- (0xFFA9, 'M', 'ᄅ'),
- (0xFFAA, 'M', 'ᆰ'),
- (0xFFAB, 'M', 'ᆱ'),
- (0xFFAC, 'M', 'ᆲ'),
- (0xFFAD, 'M', 'ᆳ'),
- (0xFFAE, 'M', 'ᆴ'),
- (0xFFAF, 'M', 'ᆵ'),
- (0xFFB0, 'M', 'ᄚ'),
- (0xFFB1, 'M', 'ᄆ'),
- (0xFFB2, 'M', 'ᄇ'),
- (0xFFB3, 'M', 'ᄈ'),
- (0xFFB4, 'M', 'ᄡ'),
- (0xFFB5, 'M', 'ᄉ'),
- (0xFFB6, 'M', 'ᄊ'),
- (0xFFB7, 'M', 'ᄋ'),
- (0xFFB8, 'M', 'ᄌ'),
- (0xFFB9, 'M', 'ᄍ'),
- (0xFFBA, 'M', 'ᄎ'),
- (0xFFBB, 'M', 'ᄏ'),
- (0xFFBC, 'M', 'ᄐ'),
- (0xFFBD, 'M', 'ᄑ'),
- (0xFFBE, 'M', 'ᄒ'),
- (0xFFBF, 'X'),
- (0xFFC2, 'M', 'ᅡ'),
- (0xFFC3, 'M', 'ᅢ'),
- (0xFFC4, 'M', 'ᅣ'),
- (0xFFC5, 'M', 'ᅤ'),
- (0xFFC6, 'M', 'ᅥ'),
- (0xFFC7, 'M', 'ᅦ'),
- (0xFFC8, 'X'),
- (0xFFCA, 'M', 'ᅧ'),
- (0xFFCB, 'M', 'ᅨ'),
- (0xFFCC, 'M', 'ᅩ'),
- (0xFFCD, 'M', 'ᅪ'),
- (0xFFCE, 'M', 'ᅫ'),
- (0xFFCF, 'M', 'ᅬ'),
- (0xFFD0, 'X'),
- (0xFFD2, 'M', 'ᅭ'),
- (0xFFD3, 'M', 'ᅮ'),
- (0xFFD4, 'M', 'ᅯ'),
- (0xFFD5, 'M', 'ᅰ'),
- (0xFFD6, 'M', 'ᅱ'),
- (0xFFD7, 'M', 'ᅲ'),
- (0xFFD8, 'X'),
- (0xFFDA, 'M', 'ᅳ'),
- (0xFFDB, 'M', 'ᅴ'),
- (0xFFDC, 'M', 'ᅵ'),
- (0xFFDD, 'X'),
- (0xFFE0, 'M', '¢'),
- (0xFFE1, 'M', '£'),
- (0xFFE2, 'M', '¬'),
- (0xFFE3, '3', ' ̄'),
- (0xFFE4, 'M', '¦'),
- (0xFFE5, 'M', '¥'),
- (0xFFE6, 'M', '₩'),
- (0xFFE7, 'X'),
- (0xFFE8, 'M', '│'),
- (0xFFE9, 'M', '←'),
- ]
-
-def _seg_53() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0xFFEA, 'M', '↑'),
- (0xFFEB, 'M', '→'),
- (0xFFEC, 'M', '↓'),
- (0xFFED, 'M', '■'),
- (0xFFEE, 'M', '○'),
- (0xFFEF, 'X'),
- (0x10000, 'V'),
- (0x1000C, 'X'),
- (0x1000D, 'V'),
- (0x10027, 'X'),
- (0x10028, 'V'),
- (0x1003B, 'X'),
- (0x1003C, 'V'),
- (0x1003E, 'X'),
- (0x1003F, 'V'),
- (0x1004E, 'X'),
- (0x10050, 'V'),
- (0x1005E, 'X'),
- (0x10080, 'V'),
- (0x100FB, 'X'),
- (0x10100, 'V'),
- (0x10103, 'X'),
- (0x10107, 'V'),
- (0x10134, 'X'),
- (0x10137, 'V'),
- (0x1018F, 'X'),
- (0x10190, 'V'),
- (0x1019D, 'X'),
- (0x101A0, 'V'),
- (0x101A1, 'X'),
- (0x101D0, 'V'),
- (0x101FE, 'X'),
- (0x10280, 'V'),
- (0x1029D, 'X'),
- (0x102A0, 'V'),
- (0x102D1, 'X'),
- (0x102E0, 'V'),
- (0x102FC, 'X'),
- (0x10300, 'V'),
- (0x10324, 'X'),
- (0x1032D, 'V'),
- (0x1034B, 'X'),
- (0x10350, 'V'),
- (0x1037B, 'X'),
- (0x10380, 'V'),
- (0x1039E, 'X'),
- (0x1039F, 'V'),
- (0x103C4, 'X'),
- (0x103C8, 'V'),
- (0x103D6, 'X'),
- (0x10400, 'M', '𐐨'),
- (0x10401, 'M', '𐐩'),
- (0x10402, 'M', '𐐪'),
- (0x10403, 'M', '𐐫'),
- (0x10404, 'M', '𐐬'),
- (0x10405, 'M', '𐐭'),
- (0x10406, 'M', '𐐮'),
- (0x10407, 'M', '𐐯'),
- (0x10408, 'M', '𐐰'),
- (0x10409, 'M', '𐐱'),
- (0x1040A, 'M', '𐐲'),
- (0x1040B, 'M', '𐐳'),
- (0x1040C, 'M', '𐐴'),
- (0x1040D, 'M', '𐐵'),
- (0x1040E, 'M', '𐐶'),
- (0x1040F, 'M', '𐐷'),
- (0x10410, 'M', '𐐸'),
- (0x10411, 'M', '𐐹'),
- (0x10412, 'M', '𐐺'),
- (0x10413, 'M', '𐐻'),
- (0x10414, 'M', '𐐼'),
- (0x10415, 'M', '𐐽'),
- (0x10416, 'M', '𐐾'),
- (0x10417, 'M', '𐐿'),
- (0x10418, 'M', '𐑀'),
- (0x10419, 'M', '𐑁'),
- (0x1041A, 'M', '𐑂'),
- (0x1041B, 'M', '𐑃'),
- (0x1041C, 'M', '𐑄'),
- (0x1041D, 'M', '𐑅'),
- (0x1041E, 'M', '𐑆'),
- (0x1041F, 'M', '𐑇'),
- (0x10420, 'M', '𐑈'),
- (0x10421, 'M', '𐑉'),
- (0x10422, 'M', '𐑊'),
- (0x10423, 'M', '𐑋'),
- (0x10424, 'M', '𐑌'),
- (0x10425, 'M', '𐑍'),
- (0x10426, 'M', '𐑎'),
- (0x10427, 'M', '𐑏'),
- (0x10428, 'V'),
- (0x1049E, 'X'),
- (0x104A0, 'V'),
- (0x104AA, 'X'),
- (0x104B0, 'M', '𐓘'),
- (0x104B1, 'M', '𐓙'),
- (0x104B2, 'M', '𐓚'),
- (0x104B3, 'M', '𐓛'),
- (0x104B4, 'M', '𐓜'),
- (0x104B5, 'M', '𐓝'),
- ]
-
-def _seg_54() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x104B6, 'M', '𐓞'),
- (0x104B7, 'M', '𐓟'),
- (0x104B8, 'M', '𐓠'),
- (0x104B9, 'M', '𐓡'),
- (0x104BA, 'M', '𐓢'),
- (0x104BB, 'M', '𐓣'),
- (0x104BC, 'M', '𐓤'),
- (0x104BD, 'M', '𐓥'),
- (0x104BE, 'M', '𐓦'),
- (0x104BF, 'M', '𐓧'),
- (0x104C0, 'M', '𐓨'),
- (0x104C1, 'M', '𐓩'),
- (0x104C2, 'M', '𐓪'),
- (0x104C3, 'M', '𐓫'),
- (0x104C4, 'M', '𐓬'),
- (0x104C5, 'M', '𐓭'),
- (0x104C6, 'M', '𐓮'),
- (0x104C7, 'M', '𐓯'),
- (0x104C8, 'M', '𐓰'),
- (0x104C9, 'M', '𐓱'),
- (0x104CA, 'M', '𐓲'),
- (0x104CB, 'M', '𐓳'),
- (0x104CC, 'M', '𐓴'),
- (0x104CD, 'M', '𐓵'),
- (0x104CE, 'M', '𐓶'),
- (0x104CF, 'M', '𐓷'),
- (0x104D0, 'M', '𐓸'),
- (0x104D1, 'M', '𐓹'),
- (0x104D2, 'M', '𐓺'),
- (0x104D3, 'M', '𐓻'),
- (0x104D4, 'X'),
- (0x104D8, 'V'),
- (0x104FC, 'X'),
- (0x10500, 'V'),
- (0x10528, 'X'),
- (0x10530, 'V'),
- (0x10564, 'X'),
- (0x1056F, 'V'),
- (0x10570, 'M', '𐖗'),
- (0x10571, 'M', '𐖘'),
- (0x10572, 'M', '𐖙'),
- (0x10573, 'M', '𐖚'),
- (0x10574, 'M', '𐖛'),
- (0x10575, 'M', '𐖜'),
- (0x10576, 'M', '𐖝'),
- (0x10577, 'M', '𐖞'),
- (0x10578, 'M', '𐖟'),
- (0x10579, 'M', '𐖠'),
- (0x1057A, 'M', '𐖡'),
- (0x1057B, 'X'),
- (0x1057C, 'M', '𐖣'),
- (0x1057D, 'M', '𐖤'),
- (0x1057E, 'M', '𐖥'),
- (0x1057F, 'M', '𐖦'),
- (0x10580, 'M', '𐖧'),
- (0x10581, 'M', '𐖨'),
- (0x10582, 'M', '𐖩'),
- (0x10583, 'M', '𐖪'),
- (0x10584, 'M', '𐖫'),
- (0x10585, 'M', '𐖬'),
- (0x10586, 'M', '𐖭'),
- (0x10587, 'M', '𐖮'),
- (0x10588, 'M', '𐖯'),
- (0x10589, 'M', '𐖰'),
- (0x1058A, 'M', '𐖱'),
- (0x1058B, 'X'),
- (0x1058C, 'M', '𐖳'),
- (0x1058D, 'M', '𐖴'),
- (0x1058E, 'M', '𐖵'),
- (0x1058F, 'M', '𐖶'),
- (0x10590, 'M', '𐖷'),
- (0x10591, 'M', '𐖸'),
- (0x10592, 'M', '𐖹'),
- (0x10593, 'X'),
- (0x10594, 'M', '𐖻'),
- (0x10595, 'M', '𐖼'),
- (0x10596, 'X'),
- (0x10597, 'V'),
- (0x105A2, 'X'),
- (0x105A3, 'V'),
- (0x105B2, 'X'),
- (0x105B3, 'V'),
- (0x105BA, 'X'),
- (0x105BB, 'V'),
- (0x105BD, 'X'),
- (0x10600, 'V'),
- (0x10737, 'X'),
- (0x10740, 'V'),
- (0x10756, 'X'),
- (0x10760, 'V'),
- (0x10768, 'X'),
- (0x10780, 'V'),
- (0x10781, 'M', 'ː'),
- (0x10782, 'M', 'ˑ'),
- (0x10783, 'M', 'æ'),
- (0x10784, 'M', 'ʙ'),
- (0x10785, 'M', 'ɓ'),
- (0x10786, 'X'),
- (0x10787, 'M', 'ʣ'),
- (0x10788, 'M', 'ꭦ'),
- ]
-
-def _seg_55() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x10789, 'M', 'ʥ'),
- (0x1078A, 'M', 'ʤ'),
- (0x1078B, 'M', 'ɖ'),
- (0x1078C, 'M', 'ɗ'),
- (0x1078D, 'M', 'ᶑ'),
- (0x1078E, 'M', 'ɘ'),
- (0x1078F, 'M', 'ɞ'),
- (0x10790, 'M', 'ʩ'),
- (0x10791, 'M', 'ɤ'),
- (0x10792, 'M', 'ɢ'),
- (0x10793, 'M', 'ɠ'),
- (0x10794, 'M', 'ʛ'),
- (0x10795, 'M', 'ħ'),
- (0x10796, 'M', 'ʜ'),
- (0x10797, 'M', 'ɧ'),
- (0x10798, 'M', 'ʄ'),
- (0x10799, 'M', 'ʪ'),
- (0x1079A, 'M', 'ʫ'),
- (0x1079B, 'M', 'ɬ'),
- (0x1079C, 'M', '𝼄'),
- (0x1079D, 'M', 'ꞎ'),
- (0x1079E, 'M', 'ɮ'),
- (0x1079F, 'M', '𝼅'),
- (0x107A0, 'M', 'ʎ'),
- (0x107A1, 'M', '𝼆'),
- (0x107A2, 'M', 'ø'),
- (0x107A3, 'M', 'ɶ'),
- (0x107A4, 'M', 'ɷ'),
- (0x107A5, 'M', 'q'),
- (0x107A6, 'M', 'ɺ'),
- (0x107A7, 'M', '𝼈'),
- (0x107A8, 'M', 'ɽ'),
- (0x107A9, 'M', 'ɾ'),
- (0x107AA, 'M', 'ʀ'),
- (0x107AB, 'M', 'ʨ'),
- (0x107AC, 'M', 'ʦ'),
- (0x107AD, 'M', 'ꭧ'),
- (0x107AE, 'M', 'ʧ'),
- (0x107AF, 'M', 'ʈ'),
- (0x107B0, 'M', 'ⱱ'),
- (0x107B1, 'X'),
- (0x107B2, 'M', 'ʏ'),
- (0x107B3, 'M', 'ʡ'),
- (0x107B4, 'M', 'ʢ'),
- (0x107B5, 'M', 'ʘ'),
- (0x107B6, 'M', 'ǀ'),
- (0x107B7, 'M', 'ǁ'),
- (0x107B8, 'M', 'ǂ'),
- (0x107B9, 'M', '𝼊'),
- (0x107BA, 'M', '𝼞'),
- (0x107BB, 'X'),
- (0x10800, 'V'),
- (0x10806, 'X'),
- (0x10808, 'V'),
- (0x10809, 'X'),
- (0x1080A, 'V'),
- (0x10836, 'X'),
- (0x10837, 'V'),
- (0x10839, 'X'),
- (0x1083C, 'V'),
- (0x1083D, 'X'),
- (0x1083F, 'V'),
- (0x10856, 'X'),
- (0x10857, 'V'),
- (0x1089F, 'X'),
- (0x108A7, 'V'),
- (0x108B0, 'X'),
- (0x108E0, 'V'),
- (0x108F3, 'X'),
- (0x108F4, 'V'),
- (0x108F6, 'X'),
- (0x108FB, 'V'),
- (0x1091C, 'X'),
- (0x1091F, 'V'),
- (0x1093A, 'X'),
- (0x1093F, 'V'),
- (0x10940, 'X'),
- (0x10980, 'V'),
- (0x109B8, 'X'),
- (0x109BC, 'V'),
- (0x109D0, 'X'),
- (0x109D2, 'V'),
- (0x10A04, 'X'),
- (0x10A05, 'V'),
- (0x10A07, 'X'),
- (0x10A0C, 'V'),
- (0x10A14, 'X'),
- (0x10A15, 'V'),
- (0x10A18, 'X'),
- (0x10A19, 'V'),
- (0x10A36, 'X'),
- (0x10A38, 'V'),
- (0x10A3B, 'X'),
- (0x10A3F, 'V'),
- (0x10A49, 'X'),
- (0x10A50, 'V'),
- (0x10A59, 'X'),
- (0x10A60, 'V'),
- (0x10AA0, 'X'),
- (0x10AC0, 'V'),
- ]
-
-def _seg_56() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x10AE7, 'X'),
- (0x10AEB, 'V'),
- (0x10AF7, 'X'),
- (0x10B00, 'V'),
- (0x10B36, 'X'),
- (0x10B39, 'V'),
- (0x10B56, 'X'),
- (0x10B58, 'V'),
- (0x10B73, 'X'),
- (0x10B78, 'V'),
- (0x10B92, 'X'),
- (0x10B99, 'V'),
- (0x10B9D, 'X'),
- (0x10BA9, 'V'),
- (0x10BB0, 'X'),
- (0x10C00, 'V'),
- (0x10C49, 'X'),
- (0x10C80, 'M', '𐳀'),
- (0x10C81, 'M', '𐳁'),
- (0x10C82, 'M', '𐳂'),
- (0x10C83, 'M', '𐳃'),
- (0x10C84, 'M', '𐳄'),
- (0x10C85, 'M', '𐳅'),
- (0x10C86, 'M', '𐳆'),
- (0x10C87, 'M', '𐳇'),
- (0x10C88, 'M', '𐳈'),
- (0x10C89, 'M', '𐳉'),
- (0x10C8A, 'M', '𐳊'),
- (0x10C8B, 'M', '𐳋'),
- (0x10C8C, 'M', '𐳌'),
- (0x10C8D, 'M', '𐳍'),
- (0x10C8E, 'M', '𐳎'),
- (0x10C8F, 'M', '𐳏'),
- (0x10C90, 'M', '𐳐'),
- (0x10C91, 'M', '𐳑'),
- (0x10C92, 'M', '𐳒'),
- (0x10C93, 'M', '𐳓'),
- (0x10C94, 'M', '𐳔'),
- (0x10C95, 'M', '𐳕'),
- (0x10C96, 'M', '𐳖'),
- (0x10C97, 'M', '𐳗'),
- (0x10C98, 'M', '𐳘'),
- (0x10C99, 'M', '𐳙'),
- (0x10C9A, 'M', '𐳚'),
- (0x10C9B, 'M', '𐳛'),
- (0x10C9C, 'M', '𐳜'),
- (0x10C9D, 'M', '𐳝'),
- (0x10C9E, 'M', '𐳞'),
- (0x10C9F, 'M', '𐳟'),
- (0x10CA0, 'M', '𐳠'),
- (0x10CA1, 'M', '𐳡'),
- (0x10CA2, 'M', '𐳢'),
- (0x10CA3, 'M', '𐳣'),
- (0x10CA4, 'M', '𐳤'),
- (0x10CA5, 'M', '𐳥'),
- (0x10CA6, 'M', '𐳦'),
- (0x10CA7, 'M', '𐳧'),
- (0x10CA8, 'M', '𐳨'),
- (0x10CA9, 'M', '𐳩'),
- (0x10CAA, 'M', '𐳪'),
- (0x10CAB, 'M', '𐳫'),
- (0x10CAC, 'M', '𐳬'),
- (0x10CAD, 'M', '𐳭'),
- (0x10CAE, 'M', '𐳮'),
- (0x10CAF, 'M', '𐳯'),
- (0x10CB0, 'M', '𐳰'),
- (0x10CB1, 'M', '𐳱'),
- (0x10CB2, 'M', '𐳲'),
- (0x10CB3, 'X'),
- (0x10CC0, 'V'),
- (0x10CF3, 'X'),
- (0x10CFA, 'V'),
- (0x10D28, 'X'),
- (0x10D30, 'V'),
- (0x10D3A, 'X'),
- (0x10E60, 'V'),
- (0x10E7F, 'X'),
- (0x10E80, 'V'),
- (0x10EAA, 'X'),
- (0x10EAB, 'V'),
- (0x10EAE, 'X'),
- (0x10EB0, 'V'),
- (0x10EB2, 'X'),
- (0x10F00, 'V'),
- (0x10F28, 'X'),
- (0x10F30, 'V'),
- (0x10F5A, 'X'),
- (0x10F70, 'V'),
- (0x10F8A, 'X'),
- (0x10FB0, 'V'),
- (0x10FCC, 'X'),
- (0x10FE0, 'V'),
- (0x10FF7, 'X'),
- (0x11000, 'V'),
- (0x1104E, 'X'),
- (0x11052, 'V'),
- (0x11076, 'X'),
- (0x1107F, 'V'),
- (0x110BD, 'X'),
- (0x110BE, 'V'),
- ]
-
-def _seg_57() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x110C3, 'X'),
- (0x110D0, 'V'),
- (0x110E9, 'X'),
- (0x110F0, 'V'),
- (0x110FA, 'X'),
- (0x11100, 'V'),
- (0x11135, 'X'),
- (0x11136, 'V'),
- (0x11148, 'X'),
- (0x11150, 'V'),
- (0x11177, 'X'),
- (0x11180, 'V'),
- (0x111E0, 'X'),
- (0x111E1, 'V'),
- (0x111F5, 'X'),
- (0x11200, 'V'),
- (0x11212, 'X'),
- (0x11213, 'V'),
- (0x1123F, 'X'),
- (0x11280, 'V'),
- (0x11287, 'X'),
- (0x11288, 'V'),
- (0x11289, 'X'),
- (0x1128A, 'V'),
- (0x1128E, 'X'),
- (0x1128F, 'V'),
- (0x1129E, 'X'),
- (0x1129F, 'V'),
- (0x112AA, 'X'),
- (0x112B0, 'V'),
- (0x112EB, 'X'),
- (0x112F0, 'V'),
- (0x112FA, 'X'),
- (0x11300, 'V'),
- (0x11304, 'X'),
- (0x11305, 'V'),
- (0x1130D, 'X'),
- (0x1130F, 'V'),
- (0x11311, 'X'),
- (0x11313, 'V'),
- (0x11329, 'X'),
- (0x1132A, 'V'),
- (0x11331, 'X'),
- (0x11332, 'V'),
- (0x11334, 'X'),
- (0x11335, 'V'),
- (0x1133A, 'X'),
- (0x1133B, 'V'),
- (0x11345, 'X'),
- (0x11347, 'V'),
- (0x11349, 'X'),
- (0x1134B, 'V'),
- (0x1134E, 'X'),
- (0x11350, 'V'),
- (0x11351, 'X'),
- (0x11357, 'V'),
- (0x11358, 'X'),
- (0x1135D, 'V'),
- (0x11364, 'X'),
- (0x11366, 'V'),
- (0x1136D, 'X'),
- (0x11370, 'V'),
- (0x11375, 'X'),
- (0x11400, 'V'),
- (0x1145C, 'X'),
- (0x1145D, 'V'),
- (0x11462, 'X'),
- (0x11480, 'V'),
- (0x114C8, 'X'),
- (0x114D0, 'V'),
- (0x114DA, 'X'),
- (0x11580, 'V'),
- (0x115B6, 'X'),
- (0x115B8, 'V'),
- (0x115DE, 'X'),
- (0x11600, 'V'),
- (0x11645, 'X'),
- (0x11650, 'V'),
- (0x1165A, 'X'),
- (0x11660, 'V'),
- (0x1166D, 'X'),
- (0x11680, 'V'),
- (0x116BA, 'X'),
- (0x116C0, 'V'),
- (0x116CA, 'X'),
- (0x11700, 'V'),
- (0x1171B, 'X'),
- (0x1171D, 'V'),
- (0x1172C, 'X'),
- (0x11730, 'V'),
- (0x11747, 'X'),
- (0x11800, 'V'),
- (0x1183C, 'X'),
- (0x118A0, 'M', '𑣀'),
- (0x118A1, 'M', '𑣁'),
- (0x118A2, 'M', '𑣂'),
- (0x118A3, 'M', '𑣃'),
- (0x118A4, 'M', '𑣄'),
- (0x118A5, 'M', '𑣅'),
- (0x118A6, 'M', '𑣆'),
- ]
-
-def _seg_58() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x118A7, 'M', '𑣇'),
- (0x118A8, 'M', '𑣈'),
- (0x118A9, 'M', '𑣉'),
- (0x118AA, 'M', '𑣊'),
- (0x118AB, 'M', '𑣋'),
- (0x118AC, 'M', '𑣌'),
- (0x118AD, 'M', '𑣍'),
- (0x118AE, 'M', '𑣎'),
- (0x118AF, 'M', '𑣏'),
- (0x118B0, 'M', '𑣐'),
- (0x118B1, 'M', '𑣑'),
- (0x118B2, 'M', '𑣒'),
- (0x118B3, 'M', '𑣓'),
- (0x118B4, 'M', '𑣔'),
- (0x118B5, 'M', '𑣕'),
- (0x118B6, 'M', '𑣖'),
- (0x118B7, 'M', '𑣗'),
- (0x118B8, 'M', '𑣘'),
- (0x118B9, 'M', '𑣙'),
- (0x118BA, 'M', '𑣚'),
- (0x118BB, 'M', '𑣛'),
- (0x118BC, 'M', '𑣜'),
- (0x118BD, 'M', '𑣝'),
- (0x118BE, 'M', '𑣞'),
- (0x118BF, 'M', '𑣟'),
- (0x118C0, 'V'),
- (0x118F3, 'X'),
- (0x118FF, 'V'),
- (0x11907, 'X'),
- (0x11909, 'V'),
- (0x1190A, 'X'),
- (0x1190C, 'V'),
- (0x11914, 'X'),
- (0x11915, 'V'),
- (0x11917, 'X'),
- (0x11918, 'V'),
- (0x11936, 'X'),
- (0x11937, 'V'),
- (0x11939, 'X'),
- (0x1193B, 'V'),
- (0x11947, 'X'),
- (0x11950, 'V'),
- (0x1195A, 'X'),
- (0x119A0, 'V'),
- (0x119A8, 'X'),
- (0x119AA, 'V'),
- (0x119D8, 'X'),
- (0x119DA, 'V'),
- (0x119E5, 'X'),
- (0x11A00, 'V'),
- (0x11A48, 'X'),
- (0x11A50, 'V'),
- (0x11AA3, 'X'),
- (0x11AB0, 'V'),
- (0x11AF9, 'X'),
- (0x11C00, 'V'),
- (0x11C09, 'X'),
- (0x11C0A, 'V'),
- (0x11C37, 'X'),
- (0x11C38, 'V'),
- (0x11C46, 'X'),
- (0x11C50, 'V'),
- (0x11C6D, 'X'),
- (0x11C70, 'V'),
- (0x11C90, 'X'),
- (0x11C92, 'V'),
- (0x11CA8, 'X'),
- (0x11CA9, 'V'),
- (0x11CB7, 'X'),
- (0x11D00, 'V'),
- (0x11D07, 'X'),
- (0x11D08, 'V'),
- (0x11D0A, 'X'),
- (0x11D0B, 'V'),
- (0x11D37, 'X'),
- (0x11D3A, 'V'),
- (0x11D3B, 'X'),
- (0x11D3C, 'V'),
- (0x11D3E, 'X'),
- (0x11D3F, 'V'),
- (0x11D48, 'X'),
- (0x11D50, 'V'),
- (0x11D5A, 'X'),
- (0x11D60, 'V'),
- (0x11D66, 'X'),
- (0x11D67, 'V'),
- (0x11D69, 'X'),
- (0x11D6A, 'V'),
- (0x11D8F, 'X'),
- (0x11D90, 'V'),
- (0x11D92, 'X'),
- (0x11D93, 'V'),
- (0x11D99, 'X'),
- (0x11DA0, 'V'),
- (0x11DAA, 'X'),
- (0x11EE0, 'V'),
- (0x11EF9, 'X'),
- (0x11FB0, 'V'),
- (0x11FB1, 'X'),
- (0x11FC0, 'V'),
- ]
-
-def _seg_59() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x11FF2, 'X'),
- (0x11FFF, 'V'),
- (0x1239A, 'X'),
- (0x12400, 'V'),
- (0x1246F, 'X'),
- (0x12470, 'V'),
- (0x12475, 'X'),
- (0x12480, 'V'),
- (0x12544, 'X'),
- (0x12F90, 'V'),
- (0x12FF3, 'X'),
- (0x13000, 'V'),
- (0x1342F, 'X'),
- (0x14400, 'V'),
- (0x14647, 'X'),
- (0x16800, 'V'),
- (0x16A39, 'X'),
- (0x16A40, 'V'),
- (0x16A5F, 'X'),
- (0x16A60, 'V'),
- (0x16A6A, 'X'),
- (0x16A6E, 'V'),
- (0x16ABF, 'X'),
- (0x16AC0, 'V'),
- (0x16ACA, 'X'),
- (0x16AD0, 'V'),
- (0x16AEE, 'X'),
- (0x16AF0, 'V'),
- (0x16AF6, 'X'),
- (0x16B00, 'V'),
- (0x16B46, 'X'),
- (0x16B50, 'V'),
- (0x16B5A, 'X'),
- (0x16B5B, 'V'),
- (0x16B62, 'X'),
- (0x16B63, 'V'),
- (0x16B78, 'X'),
- (0x16B7D, 'V'),
- (0x16B90, 'X'),
- (0x16E40, 'M', '𖹠'),
- (0x16E41, 'M', '𖹡'),
- (0x16E42, 'M', '𖹢'),
- (0x16E43, 'M', '𖹣'),
- (0x16E44, 'M', '𖹤'),
- (0x16E45, 'M', '𖹥'),
- (0x16E46, 'M', '𖹦'),
- (0x16E47, 'M', '𖹧'),
- (0x16E48, 'M', '𖹨'),
- (0x16E49, 'M', '𖹩'),
- (0x16E4A, 'M', '𖹪'),
- (0x16E4B, 'M', '𖹫'),
- (0x16E4C, 'M', '𖹬'),
- (0x16E4D, 'M', '𖹭'),
- (0x16E4E, 'M', '𖹮'),
- (0x16E4F, 'M', '𖹯'),
- (0x16E50, 'M', '𖹰'),
- (0x16E51, 'M', '𖹱'),
- (0x16E52, 'M', '𖹲'),
- (0x16E53, 'M', '𖹳'),
- (0x16E54, 'M', '𖹴'),
- (0x16E55, 'M', '𖹵'),
- (0x16E56, 'M', '𖹶'),
- (0x16E57, 'M', '𖹷'),
- (0x16E58, 'M', '𖹸'),
- (0x16E59, 'M', '𖹹'),
- (0x16E5A, 'M', '𖹺'),
- (0x16E5B, 'M', '𖹻'),
- (0x16E5C, 'M', '𖹼'),
- (0x16E5D, 'M', '𖹽'),
- (0x16E5E, 'M', '𖹾'),
- (0x16E5F, 'M', '𖹿'),
- (0x16E60, 'V'),
- (0x16E9B, 'X'),
- (0x16F00, 'V'),
- (0x16F4B, 'X'),
- (0x16F4F, 'V'),
- (0x16F88, 'X'),
- (0x16F8F, 'V'),
- (0x16FA0, 'X'),
- (0x16FE0, 'V'),
- (0x16FE5, 'X'),
- (0x16FF0, 'V'),
- (0x16FF2, 'X'),
- (0x17000, 'V'),
- (0x187F8, 'X'),
- (0x18800, 'V'),
- (0x18CD6, 'X'),
- (0x18D00, 'V'),
- (0x18D09, 'X'),
- (0x1AFF0, 'V'),
- (0x1AFF4, 'X'),
- (0x1AFF5, 'V'),
- (0x1AFFC, 'X'),
- (0x1AFFD, 'V'),
- (0x1AFFF, 'X'),
- (0x1B000, 'V'),
- (0x1B123, 'X'),
- (0x1B150, 'V'),
- (0x1B153, 'X'),
- (0x1B164, 'V'),
- ]
-
-def _seg_60() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1B168, 'X'),
- (0x1B170, 'V'),
- (0x1B2FC, 'X'),
- (0x1BC00, 'V'),
- (0x1BC6B, 'X'),
- (0x1BC70, 'V'),
- (0x1BC7D, 'X'),
- (0x1BC80, 'V'),
- (0x1BC89, 'X'),
- (0x1BC90, 'V'),
- (0x1BC9A, 'X'),
- (0x1BC9C, 'V'),
- (0x1BCA0, 'I'),
- (0x1BCA4, 'X'),
- (0x1CF00, 'V'),
- (0x1CF2E, 'X'),
- (0x1CF30, 'V'),
- (0x1CF47, 'X'),
- (0x1CF50, 'V'),
- (0x1CFC4, 'X'),
- (0x1D000, 'V'),
- (0x1D0F6, 'X'),
- (0x1D100, 'V'),
- (0x1D127, 'X'),
- (0x1D129, 'V'),
- (0x1D15E, 'M', '𝅗𝅥'),
- (0x1D15F, 'M', '𝅘𝅥'),
- (0x1D160, 'M', '𝅘𝅥𝅮'),
- (0x1D161, 'M', '𝅘𝅥𝅯'),
- (0x1D162, 'M', '𝅘𝅥𝅰'),
- (0x1D163, 'M', '𝅘𝅥𝅱'),
- (0x1D164, 'M', '𝅘𝅥𝅲'),
- (0x1D165, 'V'),
- (0x1D173, 'X'),
- (0x1D17B, 'V'),
- (0x1D1BB, 'M', '𝆹𝅥'),
- (0x1D1BC, 'M', '𝆺𝅥'),
- (0x1D1BD, 'M', '𝆹𝅥𝅮'),
- (0x1D1BE, 'M', '𝆺𝅥𝅮'),
- (0x1D1BF, 'M', '𝆹𝅥𝅯'),
- (0x1D1C0, 'M', '𝆺𝅥𝅯'),
- (0x1D1C1, 'V'),
- (0x1D1EB, 'X'),
- (0x1D200, 'V'),
- (0x1D246, 'X'),
- (0x1D2E0, 'V'),
- (0x1D2F4, 'X'),
- (0x1D300, 'V'),
- (0x1D357, 'X'),
- (0x1D360, 'V'),
- (0x1D379, 'X'),
- (0x1D400, 'M', 'a'),
- (0x1D401, 'M', 'b'),
- (0x1D402, 'M', 'c'),
- (0x1D403, 'M', 'd'),
- (0x1D404, 'M', 'e'),
- (0x1D405, 'M', 'f'),
- (0x1D406, 'M', 'g'),
- (0x1D407, 'M', 'h'),
- (0x1D408, 'M', 'i'),
- (0x1D409, 'M', 'j'),
- (0x1D40A, 'M', 'k'),
- (0x1D40B, 'M', 'l'),
- (0x1D40C, 'M', 'm'),
- (0x1D40D, 'M', 'n'),
- (0x1D40E, 'M', 'o'),
- (0x1D40F, 'M', 'p'),
- (0x1D410, 'M', 'q'),
- (0x1D411, 'M', 'r'),
- (0x1D412, 'M', 's'),
- (0x1D413, 'M', 't'),
- (0x1D414, 'M', 'u'),
- (0x1D415, 'M', 'v'),
- (0x1D416, 'M', 'w'),
- (0x1D417, 'M', 'x'),
- (0x1D418, 'M', 'y'),
- (0x1D419, 'M', 'z'),
- (0x1D41A, 'M', 'a'),
- (0x1D41B, 'M', 'b'),
- (0x1D41C, 'M', 'c'),
- (0x1D41D, 'M', 'd'),
- (0x1D41E, 'M', 'e'),
- (0x1D41F, 'M', 'f'),
- (0x1D420, 'M', 'g'),
- (0x1D421, 'M', 'h'),
- (0x1D422, 'M', 'i'),
- (0x1D423, 'M', 'j'),
- (0x1D424, 'M', 'k'),
- (0x1D425, 'M', 'l'),
- (0x1D426, 'M', 'm'),
- (0x1D427, 'M', 'n'),
- (0x1D428, 'M', 'o'),
- (0x1D429, 'M', 'p'),
- (0x1D42A, 'M', 'q'),
- (0x1D42B, 'M', 'r'),
- (0x1D42C, 'M', 's'),
- (0x1D42D, 'M', 't'),
- (0x1D42E, 'M', 'u'),
- (0x1D42F, 'M', 'v'),
- (0x1D430, 'M', 'w'),
- ]
-
-def _seg_61() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D431, 'M', 'x'),
- (0x1D432, 'M', 'y'),
- (0x1D433, 'M', 'z'),
- (0x1D434, 'M', 'a'),
- (0x1D435, 'M', 'b'),
- (0x1D436, 'M', 'c'),
- (0x1D437, 'M', 'd'),
- (0x1D438, 'M', 'e'),
- (0x1D439, 'M', 'f'),
- (0x1D43A, 'M', 'g'),
- (0x1D43B, 'M', 'h'),
- (0x1D43C, 'M', 'i'),
- (0x1D43D, 'M', 'j'),
- (0x1D43E, 'M', 'k'),
- (0x1D43F, 'M', 'l'),
- (0x1D440, 'M', 'm'),
- (0x1D441, 'M', 'n'),
- (0x1D442, 'M', 'o'),
- (0x1D443, 'M', 'p'),
- (0x1D444, 'M', 'q'),
- (0x1D445, 'M', 'r'),
- (0x1D446, 'M', 's'),
- (0x1D447, 'M', 't'),
- (0x1D448, 'M', 'u'),
- (0x1D449, 'M', 'v'),
- (0x1D44A, 'M', 'w'),
- (0x1D44B, 'M', 'x'),
- (0x1D44C, 'M', 'y'),
- (0x1D44D, 'M', 'z'),
- (0x1D44E, 'M', 'a'),
- (0x1D44F, 'M', 'b'),
- (0x1D450, 'M', 'c'),
- (0x1D451, 'M', 'd'),
- (0x1D452, 'M', 'e'),
- (0x1D453, 'M', 'f'),
- (0x1D454, 'M', 'g'),
- (0x1D455, 'X'),
- (0x1D456, 'M', 'i'),
- (0x1D457, 'M', 'j'),
- (0x1D458, 'M', 'k'),
- (0x1D459, 'M', 'l'),
- (0x1D45A, 'M', 'm'),
- (0x1D45B, 'M', 'n'),
- (0x1D45C, 'M', 'o'),
- (0x1D45D, 'M', 'p'),
- (0x1D45E, 'M', 'q'),
- (0x1D45F, 'M', 'r'),
- (0x1D460, 'M', 's'),
- (0x1D461, 'M', 't'),
- (0x1D462, 'M', 'u'),
- (0x1D463, 'M', 'v'),
- (0x1D464, 'M', 'w'),
- (0x1D465, 'M', 'x'),
- (0x1D466, 'M', 'y'),
- (0x1D467, 'M', 'z'),
- (0x1D468, 'M', 'a'),
- (0x1D469, 'M', 'b'),
- (0x1D46A, 'M', 'c'),
- (0x1D46B, 'M', 'd'),
- (0x1D46C, 'M', 'e'),
- (0x1D46D, 'M', 'f'),
- (0x1D46E, 'M', 'g'),
- (0x1D46F, 'M', 'h'),
- (0x1D470, 'M', 'i'),
- (0x1D471, 'M', 'j'),
- (0x1D472, 'M', 'k'),
- (0x1D473, 'M', 'l'),
- (0x1D474, 'M', 'm'),
- (0x1D475, 'M', 'n'),
- (0x1D476, 'M', 'o'),
- (0x1D477, 'M', 'p'),
- (0x1D478, 'M', 'q'),
- (0x1D479, 'M', 'r'),
- (0x1D47A, 'M', 's'),
- (0x1D47B, 'M', 't'),
- (0x1D47C, 'M', 'u'),
- (0x1D47D, 'M', 'v'),
- (0x1D47E, 'M', 'w'),
- (0x1D47F, 'M', 'x'),
- (0x1D480, 'M', 'y'),
- (0x1D481, 'M', 'z'),
- (0x1D482, 'M', 'a'),
- (0x1D483, 'M', 'b'),
- (0x1D484, 'M', 'c'),
- (0x1D485, 'M', 'd'),
- (0x1D486, 'M', 'e'),
- (0x1D487, 'M', 'f'),
- (0x1D488, 'M', 'g'),
- (0x1D489, 'M', 'h'),
- (0x1D48A, 'M', 'i'),
- (0x1D48B, 'M', 'j'),
- (0x1D48C, 'M', 'k'),
- (0x1D48D, 'M', 'l'),
- (0x1D48E, 'M', 'm'),
- (0x1D48F, 'M', 'n'),
- (0x1D490, 'M', 'o'),
- (0x1D491, 'M', 'p'),
- (0x1D492, 'M', 'q'),
- (0x1D493, 'M', 'r'),
- (0x1D494, 'M', 's'),
- ]
-
-def _seg_62() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D495, 'M', 't'),
- (0x1D496, 'M', 'u'),
- (0x1D497, 'M', 'v'),
- (0x1D498, 'M', 'w'),
- (0x1D499, 'M', 'x'),
- (0x1D49A, 'M', 'y'),
- (0x1D49B, 'M', 'z'),
- (0x1D49C, 'M', 'a'),
- (0x1D49D, 'X'),
- (0x1D49E, 'M', 'c'),
- (0x1D49F, 'M', 'd'),
- (0x1D4A0, 'X'),
- (0x1D4A2, 'M', 'g'),
- (0x1D4A3, 'X'),
- (0x1D4A5, 'M', 'j'),
- (0x1D4A6, 'M', 'k'),
- (0x1D4A7, 'X'),
- (0x1D4A9, 'M', 'n'),
- (0x1D4AA, 'M', 'o'),
- (0x1D4AB, 'M', 'p'),
- (0x1D4AC, 'M', 'q'),
- (0x1D4AD, 'X'),
- (0x1D4AE, 'M', 's'),
- (0x1D4AF, 'M', 't'),
- (0x1D4B0, 'M', 'u'),
- (0x1D4B1, 'M', 'v'),
- (0x1D4B2, 'M', 'w'),
- (0x1D4B3, 'M', 'x'),
- (0x1D4B4, 'M', 'y'),
- (0x1D4B5, 'M', 'z'),
- (0x1D4B6, 'M', 'a'),
- (0x1D4B7, 'M', 'b'),
- (0x1D4B8, 'M', 'c'),
- (0x1D4B9, 'M', 'd'),
- (0x1D4BA, 'X'),
- (0x1D4BB, 'M', 'f'),
- (0x1D4BC, 'X'),
- (0x1D4BD, 'M', 'h'),
- (0x1D4BE, 'M', 'i'),
- (0x1D4BF, 'M', 'j'),
- (0x1D4C0, 'M', 'k'),
- (0x1D4C1, 'M', 'l'),
- (0x1D4C2, 'M', 'm'),
- (0x1D4C3, 'M', 'n'),
- (0x1D4C4, 'X'),
- (0x1D4C5, 'M', 'p'),
- (0x1D4C6, 'M', 'q'),
- (0x1D4C7, 'M', 'r'),
- (0x1D4C8, 'M', 's'),
- (0x1D4C9, 'M', 't'),
- (0x1D4CA, 'M', 'u'),
- (0x1D4CB, 'M', 'v'),
- (0x1D4CC, 'M', 'w'),
- (0x1D4CD, 'M', 'x'),
- (0x1D4CE, 'M', 'y'),
- (0x1D4CF, 'M', 'z'),
- (0x1D4D0, 'M', 'a'),
- (0x1D4D1, 'M', 'b'),
- (0x1D4D2, 'M', 'c'),
- (0x1D4D3, 'M', 'd'),
- (0x1D4D4, 'M', 'e'),
- (0x1D4D5, 'M', 'f'),
- (0x1D4D6, 'M', 'g'),
- (0x1D4D7, 'M', 'h'),
- (0x1D4D8, 'M', 'i'),
- (0x1D4D9, 'M', 'j'),
- (0x1D4DA, 'M', 'k'),
- (0x1D4DB, 'M', 'l'),
- (0x1D4DC, 'M', 'm'),
- (0x1D4DD, 'M', 'n'),
- (0x1D4DE, 'M', 'o'),
- (0x1D4DF, 'M', 'p'),
- (0x1D4E0, 'M', 'q'),
- (0x1D4E1, 'M', 'r'),
- (0x1D4E2, 'M', 's'),
- (0x1D4E3, 'M', 't'),
- (0x1D4E4, 'M', 'u'),
- (0x1D4E5, 'M', 'v'),
- (0x1D4E6, 'M', 'w'),
- (0x1D4E7, 'M', 'x'),
- (0x1D4E8, 'M', 'y'),
- (0x1D4E9, 'M', 'z'),
- (0x1D4EA, 'M', 'a'),
- (0x1D4EB, 'M', 'b'),
- (0x1D4EC, 'M', 'c'),
- (0x1D4ED, 'M', 'd'),
- (0x1D4EE, 'M', 'e'),
- (0x1D4EF, 'M', 'f'),
- (0x1D4F0, 'M', 'g'),
- (0x1D4F1, 'M', 'h'),
- (0x1D4F2, 'M', 'i'),
- (0x1D4F3, 'M', 'j'),
- (0x1D4F4, 'M', 'k'),
- (0x1D4F5, 'M', 'l'),
- (0x1D4F6, 'M', 'm'),
- (0x1D4F7, 'M', 'n'),
- (0x1D4F8, 'M', 'o'),
- (0x1D4F9, 'M', 'p'),
- (0x1D4FA, 'M', 'q'),
- (0x1D4FB, 'M', 'r'),
- ]
-
-def _seg_63() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D4FC, 'M', 's'),
- (0x1D4FD, 'M', 't'),
- (0x1D4FE, 'M', 'u'),
- (0x1D4FF, 'M', 'v'),
- (0x1D500, 'M', 'w'),
- (0x1D501, 'M', 'x'),
- (0x1D502, 'M', 'y'),
- (0x1D503, 'M', 'z'),
- (0x1D504, 'M', 'a'),
- (0x1D505, 'M', 'b'),
- (0x1D506, 'X'),
- (0x1D507, 'M', 'd'),
- (0x1D508, 'M', 'e'),
- (0x1D509, 'M', 'f'),
- (0x1D50A, 'M', 'g'),
- (0x1D50B, 'X'),
- (0x1D50D, 'M', 'j'),
- (0x1D50E, 'M', 'k'),
- (0x1D50F, 'M', 'l'),
- (0x1D510, 'M', 'm'),
- (0x1D511, 'M', 'n'),
- (0x1D512, 'M', 'o'),
- (0x1D513, 'M', 'p'),
- (0x1D514, 'M', 'q'),
- (0x1D515, 'X'),
- (0x1D516, 'M', 's'),
- (0x1D517, 'M', 't'),
- (0x1D518, 'M', 'u'),
- (0x1D519, 'M', 'v'),
- (0x1D51A, 'M', 'w'),
- (0x1D51B, 'M', 'x'),
- (0x1D51C, 'M', 'y'),
- (0x1D51D, 'X'),
- (0x1D51E, 'M', 'a'),
- (0x1D51F, 'M', 'b'),
- (0x1D520, 'M', 'c'),
- (0x1D521, 'M', 'd'),
- (0x1D522, 'M', 'e'),
- (0x1D523, 'M', 'f'),
- (0x1D524, 'M', 'g'),
- (0x1D525, 'M', 'h'),
- (0x1D526, 'M', 'i'),
- (0x1D527, 'M', 'j'),
- (0x1D528, 'M', 'k'),
- (0x1D529, 'M', 'l'),
- (0x1D52A, 'M', 'm'),
- (0x1D52B, 'M', 'n'),
- (0x1D52C, 'M', 'o'),
- (0x1D52D, 'M', 'p'),
- (0x1D52E, 'M', 'q'),
- (0x1D52F, 'M', 'r'),
- (0x1D530, 'M', 's'),
- (0x1D531, 'M', 't'),
- (0x1D532, 'M', 'u'),
- (0x1D533, 'M', 'v'),
- (0x1D534, 'M', 'w'),
- (0x1D535, 'M', 'x'),
- (0x1D536, 'M', 'y'),
- (0x1D537, 'M', 'z'),
- (0x1D538, 'M', 'a'),
- (0x1D539, 'M', 'b'),
- (0x1D53A, 'X'),
- (0x1D53B, 'M', 'd'),
- (0x1D53C, 'M', 'e'),
- (0x1D53D, 'M', 'f'),
- (0x1D53E, 'M', 'g'),
- (0x1D53F, 'X'),
- (0x1D540, 'M', 'i'),
- (0x1D541, 'M', 'j'),
- (0x1D542, 'M', 'k'),
- (0x1D543, 'M', 'l'),
- (0x1D544, 'M', 'm'),
- (0x1D545, 'X'),
- (0x1D546, 'M', 'o'),
- (0x1D547, 'X'),
- (0x1D54A, 'M', 's'),
- (0x1D54B, 'M', 't'),
- (0x1D54C, 'M', 'u'),
- (0x1D54D, 'M', 'v'),
- (0x1D54E, 'M', 'w'),
- (0x1D54F, 'M', 'x'),
- (0x1D550, 'M', 'y'),
- (0x1D551, 'X'),
- (0x1D552, 'M', 'a'),
- (0x1D553, 'M', 'b'),
- (0x1D554, 'M', 'c'),
- (0x1D555, 'M', 'd'),
- (0x1D556, 'M', 'e'),
- (0x1D557, 'M', 'f'),
- (0x1D558, 'M', 'g'),
- (0x1D559, 'M', 'h'),
- (0x1D55A, 'M', 'i'),
- (0x1D55B, 'M', 'j'),
- (0x1D55C, 'M', 'k'),
- (0x1D55D, 'M', 'l'),
- (0x1D55E, 'M', 'm'),
- (0x1D55F, 'M', 'n'),
- (0x1D560, 'M', 'o'),
- (0x1D561, 'M', 'p'),
- (0x1D562, 'M', 'q'),
- ]
-
-def _seg_64() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D563, 'M', 'r'),
- (0x1D564, 'M', 's'),
- (0x1D565, 'M', 't'),
- (0x1D566, 'M', 'u'),
- (0x1D567, 'M', 'v'),
- (0x1D568, 'M', 'w'),
- (0x1D569, 'M', 'x'),
- (0x1D56A, 'M', 'y'),
- (0x1D56B, 'M', 'z'),
- (0x1D56C, 'M', 'a'),
- (0x1D56D, 'M', 'b'),
- (0x1D56E, 'M', 'c'),
- (0x1D56F, 'M', 'd'),
- (0x1D570, 'M', 'e'),
- (0x1D571, 'M', 'f'),
- (0x1D572, 'M', 'g'),
- (0x1D573, 'M', 'h'),
- (0x1D574, 'M', 'i'),
- (0x1D575, 'M', 'j'),
- (0x1D576, 'M', 'k'),
- (0x1D577, 'M', 'l'),
- (0x1D578, 'M', 'm'),
- (0x1D579, 'M', 'n'),
- (0x1D57A, 'M', 'o'),
- (0x1D57B, 'M', 'p'),
- (0x1D57C, 'M', 'q'),
- (0x1D57D, 'M', 'r'),
- (0x1D57E, 'M', 's'),
- (0x1D57F, 'M', 't'),
- (0x1D580, 'M', 'u'),
- (0x1D581, 'M', 'v'),
- (0x1D582, 'M', 'w'),
- (0x1D583, 'M', 'x'),
- (0x1D584, 'M', 'y'),
- (0x1D585, 'M', 'z'),
- (0x1D586, 'M', 'a'),
- (0x1D587, 'M', 'b'),
- (0x1D588, 'M', 'c'),
- (0x1D589, 'M', 'd'),
- (0x1D58A, 'M', 'e'),
- (0x1D58B, 'M', 'f'),
- (0x1D58C, 'M', 'g'),
- (0x1D58D, 'M', 'h'),
- (0x1D58E, 'M', 'i'),
- (0x1D58F, 'M', 'j'),
- (0x1D590, 'M', 'k'),
- (0x1D591, 'M', 'l'),
- (0x1D592, 'M', 'm'),
- (0x1D593, 'M', 'n'),
- (0x1D594, 'M', 'o'),
- (0x1D595, 'M', 'p'),
- (0x1D596, 'M', 'q'),
- (0x1D597, 'M', 'r'),
- (0x1D598, 'M', 's'),
- (0x1D599, 'M', 't'),
- (0x1D59A, 'M', 'u'),
- (0x1D59B, 'M', 'v'),
- (0x1D59C, 'M', 'w'),
- (0x1D59D, 'M', 'x'),
- (0x1D59E, 'M', 'y'),
- (0x1D59F, 'M', 'z'),
- (0x1D5A0, 'M', 'a'),
- (0x1D5A1, 'M', 'b'),
- (0x1D5A2, 'M', 'c'),
- (0x1D5A3, 'M', 'd'),
- (0x1D5A4, 'M', 'e'),
- (0x1D5A5, 'M', 'f'),
- (0x1D5A6, 'M', 'g'),
- (0x1D5A7, 'M', 'h'),
- (0x1D5A8, 'M', 'i'),
- (0x1D5A9, 'M', 'j'),
- (0x1D5AA, 'M', 'k'),
- (0x1D5AB, 'M', 'l'),
- (0x1D5AC, 'M', 'm'),
- (0x1D5AD, 'M', 'n'),
- (0x1D5AE, 'M', 'o'),
- (0x1D5AF, 'M', 'p'),
- (0x1D5B0, 'M', 'q'),
- (0x1D5B1, 'M', 'r'),
- (0x1D5B2, 'M', 's'),
- (0x1D5B3, 'M', 't'),
- (0x1D5B4, 'M', 'u'),
- (0x1D5B5, 'M', 'v'),
- (0x1D5B6, 'M', 'w'),
- (0x1D5B7, 'M', 'x'),
- (0x1D5B8, 'M', 'y'),
- (0x1D5B9, 'M', 'z'),
- (0x1D5BA, 'M', 'a'),
- (0x1D5BB, 'M', 'b'),
- (0x1D5BC, 'M', 'c'),
- (0x1D5BD, 'M', 'd'),
- (0x1D5BE, 'M', 'e'),
- (0x1D5BF, 'M', 'f'),
- (0x1D5C0, 'M', 'g'),
- (0x1D5C1, 'M', 'h'),
- (0x1D5C2, 'M', 'i'),
- (0x1D5C3, 'M', 'j'),
- (0x1D5C4, 'M', 'k'),
- (0x1D5C5, 'M', 'l'),
- (0x1D5C6, 'M', 'm'),
- ]
-
-def _seg_65() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D5C7, 'M', 'n'),
- (0x1D5C8, 'M', 'o'),
- (0x1D5C9, 'M', 'p'),
- (0x1D5CA, 'M', 'q'),
- (0x1D5CB, 'M', 'r'),
- (0x1D5CC, 'M', 's'),
- (0x1D5CD, 'M', 't'),
- (0x1D5CE, 'M', 'u'),
- (0x1D5CF, 'M', 'v'),
- (0x1D5D0, 'M', 'w'),
- (0x1D5D1, 'M', 'x'),
- (0x1D5D2, 'M', 'y'),
- (0x1D5D3, 'M', 'z'),
- (0x1D5D4, 'M', 'a'),
- (0x1D5D5, 'M', 'b'),
- (0x1D5D6, 'M', 'c'),
- (0x1D5D7, 'M', 'd'),
- (0x1D5D8, 'M', 'e'),
- (0x1D5D9, 'M', 'f'),
- (0x1D5DA, 'M', 'g'),
- (0x1D5DB, 'M', 'h'),
- (0x1D5DC, 'M', 'i'),
- (0x1D5DD, 'M', 'j'),
- (0x1D5DE, 'M', 'k'),
- (0x1D5DF, 'M', 'l'),
- (0x1D5E0, 'M', 'm'),
- (0x1D5E1, 'M', 'n'),
- (0x1D5E2, 'M', 'o'),
- (0x1D5E3, 'M', 'p'),
- (0x1D5E4, 'M', 'q'),
- (0x1D5E5, 'M', 'r'),
- (0x1D5E6, 'M', 's'),
- (0x1D5E7, 'M', 't'),
- (0x1D5E8, 'M', 'u'),
- (0x1D5E9, 'M', 'v'),
- (0x1D5EA, 'M', 'w'),
- (0x1D5EB, 'M', 'x'),
- (0x1D5EC, 'M', 'y'),
- (0x1D5ED, 'M', 'z'),
- (0x1D5EE, 'M', 'a'),
- (0x1D5EF, 'M', 'b'),
- (0x1D5F0, 'M', 'c'),
- (0x1D5F1, 'M', 'd'),
- (0x1D5F2, 'M', 'e'),
- (0x1D5F3, 'M', 'f'),
- (0x1D5F4, 'M', 'g'),
- (0x1D5F5, 'M', 'h'),
- (0x1D5F6, 'M', 'i'),
- (0x1D5F7, 'M', 'j'),
- (0x1D5F8, 'M', 'k'),
- (0x1D5F9, 'M', 'l'),
- (0x1D5FA, 'M', 'm'),
- (0x1D5FB, 'M', 'n'),
- (0x1D5FC, 'M', 'o'),
- (0x1D5FD, 'M', 'p'),
- (0x1D5FE, 'M', 'q'),
- (0x1D5FF, 'M', 'r'),
- (0x1D600, 'M', 's'),
- (0x1D601, 'M', 't'),
- (0x1D602, 'M', 'u'),
- (0x1D603, 'M', 'v'),
- (0x1D604, 'M', 'w'),
- (0x1D605, 'M', 'x'),
- (0x1D606, 'M', 'y'),
- (0x1D607, 'M', 'z'),
- (0x1D608, 'M', 'a'),
- (0x1D609, 'M', 'b'),
- (0x1D60A, 'M', 'c'),
- (0x1D60B, 'M', 'd'),
- (0x1D60C, 'M', 'e'),
- (0x1D60D, 'M', 'f'),
- (0x1D60E, 'M', 'g'),
- (0x1D60F, 'M', 'h'),
- (0x1D610, 'M', 'i'),
- (0x1D611, 'M', 'j'),
- (0x1D612, 'M', 'k'),
- (0x1D613, 'M', 'l'),
- (0x1D614, 'M', 'm'),
- (0x1D615, 'M', 'n'),
- (0x1D616, 'M', 'o'),
- (0x1D617, 'M', 'p'),
- (0x1D618, 'M', 'q'),
- (0x1D619, 'M', 'r'),
- (0x1D61A, 'M', 's'),
- (0x1D61B, 'M', 't'),
- (0x1D61C, 'M', 'u'),
- (0x1D61D, 'M', 'v'),
- (0x1D61E, 'M', 'w'),
- (0x1D61F, 'M', 'x'),
- (0x1D620, 'M', 'y'),
- (0x1D621, 'M', 'z'),
- (0x1D622, 'M', 'a'),
- (0x1D623, 'M', 'b'),
- (0x1D624, 'M', 'c'),
- (0x1D625, 'M', 'd'),
- (0x1D626, 'M', 'e'),
- (0x1D627, 'M', 'f'),
- (0x1D628, 'M', 'g'),
- (0x1D629, 'M', 'h'),
- (0x1D62A, 'M', 'i'),
- ]
-
-def _seg_66() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D62B, 'M', 'j'),
- (0x1D62C, 'M', 'k'),
- (0x1D62D, 'M', 'l'),
- (0x1D62E, 'M', 'm'),
- (0x1D62F, 'M', 'n'),
- (0x1D630, 'M', 'o'),
- (0x1D631, 'M', 'p'),
- (0x1D632, 'M', 'q'),
- (0x1D633, 'M', 'r'),
- (0x1D634, 'M', 's'),
- (0x1D635, 'M', 't'),
- (0x1D636, 'M', 'u'),
- (0x1D637, 'M', 'v'),
- (0x1D638, 'M', 'w'),
- (0x1D639, 'M', 'x'),
- (0x1D63A, 'M', 'y'),
- (0x1D63B, 'M', 'z'),
- (0x1D63C, 'M', 'a'),
- (0x1D63D, 'M', 'b'),
- (0x1D63E, 'M', 'c'),
- (0x1D63F, 'M', 'd'),
- (0x1D640, 'M', 'e'),
- (0x1D641, 'M', 'f'),
- (0x1D642, 'M', 'g'),
- (0x1D643, 'M', 'h'),
- (0x1D644, 'M', 'i'),
- (0x1D645, 'M', 'j'),
- (0x1D646, 'M', 'k'),
- (0x1D647, 'M', 'l'),
- (0x1D648, 'M', 'm'),
- (0x1D649, 'M', 'n'),
- (0x1D64A, 'M', 'o'),
- (0x1D64B, 'M', 'p'),
- (0x1D64C, 'M', 'q'),
- (0x1D64D, 'M', 'r'),
- (0x1D64E, 'M', 's'),
- (0x1D64F, 'M', 't'),
- (0x1D650, 'M', 'u'),
- (0x1D651, 'M', 'v'),
- (0x1D652, 'M', 'w'),
- (0x1D653, 'M', 'x'),
- (0x1D654, 'M', 'y'),
- (0x1D655, 'M', 'z'),
- (0x1D656, 'M', 'a'),
- (0x1D657, 'M', 'b'),
- (0x1D658, 'M', 'c'),
- (0x1D659, 'M', 'd'),
- (0x1D65A, 'M', 'e'),
- (0x1D65B, 'M', 'f'),
- (0x1D65C, 'M', 'g'),
- (0x1D65D, 'M', 'h'),
- (0x1D65E, 'M', 'i'),
- (0x1D65F, 'M', 'j'),
- (0x1D660, 'M', 'k'),
- (0x1D661, 'M', 'l'),
- (0x1D662, 'M', 'm'),
- (0x1D663, 'M', 'n'),
- (0x1D664, 'M', 'o'),
- (0x1D665, 'M', 'p'),
- (0x1D666, 'M', 'q'),
- (0x1D667, 'M', 'r'),
- (0x1D668, 'M', 's'),
- (0x1D669, 'M', 't'),
- (0x1D66A, 'M', 'u'),
- (0x1D66B, 'M', 'v'),
- (0x1D66C, 'M', 'w'),
- (0x1D66D, 'M', 'x'),
- (0x1D66E, 'M', 'y'),
- (0x1D66F, 'M', 'z'),
- (0x1D670, 'M', 'a'),
- (0x1D671, 'M', 'b'),
- (0x1D672, 'M', 'c'),
- (0x1D673, 'M', 'd'),
- (0x1D674, 'M', 'e'),
- (0x1D675, 'M', 'f'),
- (0x1D676, 'M', 'g'),
- (0x1D677, 'M', 'h'),
- (0x1D678, 'M', 'i'),
- (0x1D679, 'M', 'j'),
- (0x1D67A, 'M', 'k'),
- (0x1D67B, 'M', 'l'),
- (0x1D67C, 'M', 'm'),
- (0x1D67D, 'M', 'n'),
- (0x1D67E, 'M', 'o'),
- (0x1D67F, 'M', 'p'),
- (0x1D680, 'M', 'q'),
- (0x1D681, 'M', 'r'),
- (0x1D682, 'M', 's'),
- (0x1D683, 'M', 't'),
- (0x1D684, 'M', 'u'),
- (0x1D685, 'M', 'v'),
- (0x1D686, 'M', 'w'),
- (0x1D687, 'M', 'x'),
- (0x1D688, 'M', 'y'),
- (0x1D689, 'M', 'z'),
- (0x1D68A, 'M', 'a'),
- (0x1D68B, 'M', 'b'),
- (0x1D68C, 'M', 'c'),
- (0x1D68D, 'M', 'd'),
- (0x1D68E, 'M', 'e'),
- ]
-
-def _seg_67() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D68F, 'M', 'f'),
- (0x1D690, 'M', 'g'),
- (0x1D691, 'M', 'h'),
- (0x1D692, 'M', 'i'),
- (0x1D693, 'M', 'j'),
- (0x1D694, 'M', 'k'),
- (0x1D695, 'M', 'l'),
- (0x1D696, 'M', 'm'),
- (0x1D697, 'M', 'n'),
- (0x1D698, 'M', 'o'),
- (0x1D699, 'M', 'p'),
- (0x1D69A, 'M', 'q'),
- (0x1D69B, 'M', 'r'),
- (0x1D69C, 'M', 's'),
- (0x1D69D, 'M', 't'),
- (0x1D69E, 'M', 'u'),
- (0x1D69F, 'M', 'v'),
- (0x1D6A0, 'M', 'w'),
- (0x1D6A1, 'M', 'x'),
- (0x1D6A2, 'M', 'y'),
- (0x1D6A3, 'M', 'z'),
- (0x1D6A4, 'M', 'ı'),
- (0x1D6A5, 'M', 'ȷ'),
- (0x1D6A6, 'X'),
- (0x1D6A8, 'M', 'α'),
- (0x1D6A9, 'M', 'β'),
- (0x1D6AA, 'M', 'γ'),
- (0x1D6AB, 'M', 'δ'),
- (0x1D6AC, 'M', 'ε'),
- (0x1D6AD, 'M', 'ζ'),
- (0x1D6AE, 'M', 'η'),
- (0x1D6AF, 'M', 'θ'),
- (0x1D6B0, 'M', 'ι'),
- (0x1D6B1, 'M', 'κ'),
- (0x1D6B2, 'M', 'λ'),
- (0x1D6B3, 'M', 'μ'),
- (0x1D6B4, 'M', 'ν'),
- (0x1D6B5, 'M', 'ξ'),
- (0x1D6B6, 'M', 'ο'),
- (0x1D6B7, 'M', 'π'),
- (0x1D6B8, 'M', 'ρ'),
- (0x1D6B9, 'M', 'θ'),
- (0x1D6BA, 'M', 'σ'),
- (0x1D6BB, 'M', 'τ'),
- (0x1D6BC, 'M', 'υ'),
- (0x1D6BD, 'M', 'φ'),
- (0x1D6BE, 'M', 'χ'),
- (0x1D6BF, 'M', 'ψ'),
- (0x1D6C0, 'M', 'ω'),
- (0x1D6C1, 'M', '∇'),
- (0x1D6C2, 'M', 'α'),
- (0x1D6C3, 'M', 'β'),
- (0x1D6C4, 'M', 'γ'),
- (0x1D6C5, 'M', 'δ'),
- (0x1D6C6, 'M', 'ε'),
- (0x1D6C7, 'M', 'ζ'),
- (0x1D6C8, 'M', 'η'),
- (0x1D6C9, 'M', 'θ'),
- (0x1D6CA, 'M', 'ι'),
- (0x1D6CB, 'M', 'κ'),
- (0x1D6CC, 'M', 'λ'),
- (0x1D6CD, 'M', 'μ'),
- (0x1D6CE, 'M', 'ν'),
- (0x1D6CF, 'M', 'ξ'),
- (0x1D6D0, 'M', 'ο'),
- (0x1D6D1, 'M', 'π'),
- (0x1D6D2, 'M', 'ρ'),
- (0x1D6D3, 'M', 'σ'),
- (0x1D6D5, 'M', 'τ'),
- (0x1D6D6, 'M', 'υ'),
- (0x1D6D7, 'M', 'φ'),
- (0x1D6D8, 'M', 'χ'),
- (0x1D6D9, 'M', 'ψ'),
- (0x1D6DA, 'M', 'ω'),
- (0x1D6DB, 'M', '∂'),
- (0x1D6DC, 'M', 'ε'),
- (0x1D6DD, 'M', 'θ'),
- (0x1D6DE, 'M', 'κ'),
- (0x1D6DF, 'M', 'φ'),
- (0x1D6E0, 'M', 'ρ'),
- (0x1D6E1, 'M', 'π'),
- (0x1D6E2, 'M', 'α'),
- (0x1D6E3, 'M', 'β'),
- (0x1D6E4, 'M', 'γ'),
- (0x1D6E5, 'M', 'δ'),
- (0x1D6E6, 'M', 'ε'),
- (0x1D6E7, 'M', 'ζ'),
- (0x1D6E8, 'M', 'η'),
- (0x1D6E9, 'M', 'θ'),
- (0x1D6EA, 'M', 'ι'),
- (0x1D6EB, 'M', 'κ'),
- (0x1D6EC, 'M', 'λ'),
- (0x1D6ED, 'M', 'μ'),
- (0x1D6EE, 'M', 'ν'),
- (0x1D6EF, 'M', 'ξ'),
- (0x1D6F0, 'M', 'ο'),
- (0x1D6F1, 'M', 'π'),
- (0x1D6F2, 'M', 'ρ'),
- (0x1D6F3, 'M', 'θ'),
- (0x1D6F4, 'M', 'σ'),
- ]
-
-def _seg_68() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D6F5, 'M', 'τ'),
- (0x1D6F6, 'M', 'υ'),
- (0x1D6F7, 'M', 'φ'),
- (0x1D6F8, 'M', 'χ'),
- (0x1D6F9, 'M', 'ψ'),
- (0x1D6FA, 'M', 'ω'),
- (0x1D6FB, 'M', '∇'),
- (0x1D6FC, 'M', 'α'),
- (0x1D6FD, 'M', 'β'),
- (0x1D6FE, 'M', 'γ'),
- (0x1D6FF, 'M', 'δ'),
- (0x1D700, 'M', 'ε'),
- (0x1D701, 'M', 'ζ'),
- (0x1D702, 'M', 'η'),
- (0x1D703, 'M', 'θ'),
- (0x1D704, 'M', 'ι'),
- (0x1D705, 'M', 'κ'),
- (0x1D706, 'M', 'λ'),
- (0x1D707, 'M', 'μ'),
- (0x1D708, 'M', 'ν'),
- (0x1D709, 'M', 'ξ'),
- (0x1D70A, 'M', 'ο'),
- (0x1D70B, 'M', 'π'),
- (0x1D70C, 'M', 'ρ'),
- (0x1D70D, 'M', 'σ'),
- (0x1D70F, 'M', 'τ'),
- (0x1D710, 'M', 'υ'),
- (0x1D711, 'M', 'φ'),
- (0x1D712, 'M', 'χ'),
- (0x1D713, 'M', 'ψ'),
- (0x1D714, 'M', 'ω'),
- (0x1D715, 'M', '∂'),
- (0x1D716, 'M', 'ε'),
- (0x1D717, 'M', 'θ'),
- (0x1D718, 'M', 'κ'),
- (0x1D719, 'M', 'φ'),
- (0x1D71A, 'M', 'ρ'),
- (0x1D71B, 'M', 'π'),
- (0x1D71C, 'M', 'α'),
- (0x1D71D, 'M', 'β'),
- (0x1D71E, 'M', 'γ'),
- (0x1D71F, 'M', 'δ'),
- (0x1D720, 'M', 'ε'),
- (0x1D721, 'M', 'ζ'),
- (0x1D722, 'M', 'η'),
- (0x1D723, 'M', 'θ'),
- (0x1D724, 'M', 'ι'),
- (0x1D725, 'M', 'κ'),
- (0x1D726, 'M', 'λ'),
- (0x1D727, 'M', 'μ'),
- (0x1D728, 'M', 'ν'),
- (0x1D729, 'M', 'ξ'),
- (0x1D72A, 'M', 'ο'),
- (0x1D72B, 'M', 'π'),
- (0x1D72C, 'M', 'ρ'),
- (0x1D72D, 'M', 'θ'),
- (0x1D72E, 'M', 'σ'),
- (0x1D72F, 'M', 'τ'),
- (0x1D730, 'M', 'υ'),
- (0x1D731, 'M', 'φ'),
- (0x1D732, 'M', 'χ'),
- (0x1D733, 'M', 'ψ'),
- (0x1D734, 'M', 'ω'),
- (0x1D735, 'M', '∇'),
- (0x1D736, 'M', 'α'),
- (0x1D737, 'M', 'β'),
- (0x1D738, 'M', 'γ'),
- (0x1D739, 'M', 'δ'),
- (0x1D73A, 'M', 'ε'),
- (0x1D73B, 'M', 'ζ'),
- (0x1D73C, 'M', 'η'),
- (0x1D73D, 'M', 'θ'),
- (0x1D73E, 'M', 'ι'),
- (0x1D73F, 'M', 'κ'),
- (0x1D740, 'M', 'λ'),
- (0x1D741, 'M', 'μ'),
- (0x1D742, 'M', 'ν'),
- (0x1D743, 'M', 'ξ'),
- (0x1D744, 'M', 'ο'),
- (0x1D745, 'M', 'π'),
- (0x1D746, 'M', 'ρ'),
- (0x1D747, 'M', 'σ'),
- (0x1D749, 'M', 'τ'),
- (0x1D74A, 'M', 'υ'),
- (0x1D74B, 'M', 'φ'),
- (0x1D74C, 'M', 'χ'),
- (0x1D74D, 'M', 'ψ'),
- (0x1D74E, 'M', 'ω'),
- (0x1D74F, 'M', '∂'),
- (0x1D750, 'M', 'ε'),
- (0x1D751, 'M', 'θ'),
- (0x1D752, 'M', 'κ'),
- (0x1D753, 'M', 'φ'),
- (0x1D754, 'M', 'ρ'),
- (0x1D755, 'M', 'π'),
- (0x1D756, 'M', 'α'),
- (0x1D757, 'M', 'β'),
- (0x1D758, 'M', 'γ'),
- (0x1D759, 'M', 'δ'),
- (0x1D75A, 'M', 'ε'),
- ]
-
-def _seg_69() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D75B, 'M', 'ζ'),
- (0x1D75C, 'M', 'η'),
- (0x1D75D, 'M', 'θ'),
- (0x1D75E, 'M', 'ι'),
- (0x1D75F, 'M', 'κ'),
- (0x1D760, 'M', 'λ'),
- (0x1D761, 'M', 'μ'),
- (0x1D762, 'M', 'ν'),
- (0x1D763, 'M', 'ξ'),
- (0x1D764, 'M', 'ο'),
- (0x1D765, 'M', 'π'),
- (0x1D766, 'M', 'ρ'),
- (0x1D767, 'M', 'θ'),
- (0x1D768, 'M', 'σ'),
- (0x1D769, 'M', 'τ'),
- (0x1D76A, 'M', 'υ'),
- (0x1D76B, 'M', 'φ'),
- (0x1D76C, 'M', 'χ'),
- (0x1D76D, 'M', 'ψ'),
- (0x1D76E, 'M', 'ω'),
- (0x1D76F, 'M', '∇'),
- (0x1D770, 'M', 'α'),
- (0x1D771, 'M', 'β'),
- (0x1D772, 'M', 'γ'),
- (0x1D773, 'M', 'δ'),
- (0x1D774, 'M', 'ε'),
- (0x1D775, 'M', 'ζ'),
- (0x1D776, 'M', 'η'),
- (0x1D777, 'M', 'θ'),
- (0x1D778, 'M', 'ι'),
- (0x1D779, 'M', 'κ'),
- (0x1D77A, 'M', 'λ'),
- (0x1D77B, 'M', 'μ'),
- (0x1D77C, 'M', 'ν'),
- (0x1D77D, 'M', 'ξ'),
- (0x1D77E, 'M', 'ο'),
- (0x1D77F, 'M', 'π'),
- (0x1D780, 'M', 'ρ'),
- (0x1D781, 'M', 'σ'),
- (0x1D783, 'M', 'τ'),
- (0x1D784, 'M', 'υ'),
- (0x1D785, 'M', 'φ'),
- (0x1D786, 'M', 'χ'),
- (0x1D787, 'M', 'ψ'),
- (0x1D788, 'M', 'ω'),
- (0x1D789, 'M', '∂'),
- (0x1D78A, 'M', 'ε'),
- (0x1D78B, 'M', 'θ'),
- (0x1D78C, 'M', 'κ'),
- (0x1D78D, 'M', 'φ'),
- (0x1D78E, 'M', 'ρ'),
- (0x1D78F, 'M', 'π'),
- (0x1D790, 'M', 'α'),
- (0x1D791, 'M', 'β'),
- (0x1D792, 'M', 'γ'),
- (0x1D793, 'M', 'δ'),
- (0x1D794, 'M', 'ε'),
- (0x1D795, 'M', 'ζ'),
- (0x1D796, 'M', 'η'),
- (0x1D797, 'M', 'θ'),
- (0x1D798, 'M', 'ι'),
- (0x1D799, 'M', 'κ'),
- (0x1D79A, 'M', 'λ'),
- (0x1D79B, 'M', 'μ'),
- (0x1D79C, 'M', 'ν'),
- (0x1D79D, 'M', 'ξ'),
- (0x1D79E, 'M', 'ο'),
- (0x1D79F, 'M', 'π'),
- (0x1D7A0, 'M', 'ρ'),
- (0x1D7A1, 'M', 'θ'),
- (0x1D7A2, 'M', 'σ'),
- (0x1D7A3, 'M', 'τ'),
- (0x1D7A4, 'M', 'υ'),
- (0x1D7A5, 'M', 'φ'),
- (0x1D7A6, 'M', 'χ'),
- (0x1D7A7, 'M', 'ψ'),
- (0x1D7A8, 'M', 'ω'),
- (0x1D7A9, 'M', '∇'),
- (0x1D7AA, 'M', 'α'),
- (0x1D7AB, 'M', 'β'),
- (0x1D7AC, 'M', 'γ'),
- (0x1D7AD, 'M', 'δ'),
- (0x1D7AE, 'M', 'ε'),
- (0x1D7AF, 'M', 'ζ'),
- (0x1D7B0, 'M', 'η'),
- (0x1D7B1, 'M', 'θ'),
- (0x1D7B2, 'M', 'ι'),
- (0x1D7B3, 'M', 'κ'),
- (0x1D7B4, 'M', 'λ'),
- (0x1D7B5, 'M', 'μ'),
- (0x1D7B6, 'M', 'ν'),
- (0x1D7B7, 'M', 'ξ'),
- (0x1D7B8, 'M', 'ο'),
- (0x1D7B9, 'M', 'π'),
- (0x1D7BA, 'M', 'ρ'),
- (0x1D7BB, 'M', 'σ'),
- (0x1D7BD, 'M', 'τ'),
- (0x1D7BE, 'M', 'υ'),
- (0x1D7BF, 'M', 'φ'),
- (0x1D7C0, 'M', 'χ'),
- ]
-
-def _seg_70() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1D7C1, 'M', 'ψ'),
- (0x1D7C2, 'M', 'ω'),
- (0x1D7C3, 'M', '∂'),
- (0x1D7C4, 'M', 'ε'),
- (0x1D7C5, 'M', 'θ'),
- (0x1D7C6, 'M', 'κ'),
- (0x1D7C7, 'M', 'φ'),
- (0x1D7C8, 'M', 'ρ'),
- (0x1D7C9, 'M', 'π'),
- (0x1D7CA, 'M', 'ϝ'),
- (0x1D7CC, 'X'),
- (0x1D7CE, 'M', '0'),
- (0x1D7CF, 'M', '1'),
- (0x1D7D0, 'M', '2'),
- (0x1D7D1, 'M', '3'),
- (0x1D7D2, 'M', '4'),
- (0x1D7D3, 'M', '5'),
- (0x1D7D4, 'M', '6'),
- (0x1D7D5, 'M', '7'),
- (0x1D7D6, 'M', '8'),
- (0x1D7D7, 'M', '9'),
- (0x1D7D8, 'M', '0'),
- (0x1D7D9, 'M', '1'),
- (0x1D7DA, 'M', '2'),
- (0x1D7DB, 'M', '3'),
- (0x1D7DC, 'M', '4'),
- (0x1D7DD, 'M', '5'),
- (0x1D7DE, 'M', '6'),
- (0x1D7DF, 'M', '7'),
- (0x1D7E0, 'M', '8'),
- (0x1D7E1, 'M', '9'),
- (0x1D7E2, 'M', '0'),
- (0x1D7E3, 'M', '1'),
- (0x1D7E4, 'M', '2'),
- (0x1D7E5, 'M', '3'),
- (0x1D7E6, 'M', '4'),
- (0x1D7E7, 'M', '5'),
- (0x1D7E8, 'M', '6'),
- (0x1D7E9, 'M', '7'),
- (0x1D7EA, 'M', '8'),
- (0x1D7EB, 'M', '9'),
- (0x1D7EC, 'M', '0'),
- (0x1D7ED, 'M', '1'),
- (0x1D7EE, 'M', '2'),
- (0x1D7EF, 'M', '3'),
- (0x1D7F0, 'M', '4'),
- (0x1D7F1, 'M', '5'),
- (0x1D7F2, 'M', '6'),
- (0x1D7F3, 'M', '7'),
- (0x1D7F4, 'M', '8'),
- (0x1D7F5, 'M', '9'),
- (0x1D7F6, 'M', '0'),
- (0x1D7F7, 'M', '1'),
- (0x1D7F8, 'M', '2'),
- (0x1D7F9, 'M', '3'),
- (0x1D7FA, 'M', '4'),
- (0x1D7FB, 'M', '5'),
- (0x1D7FC, 'M', '6'),
- (0x1D7FD, 'M', '7'),
- (0x1D7FE, 'M', '8'),
- (0x1D7FF, 'M', '9'),
- (0x1D800, 'V'),
- (0x1DA8C, 'X'),
- (0x1DA9B, 'V'),
- (0x1DAA0, 'X'),
- (0x1DAA1, 'V'),
- (0x1DAB0, 'X'),
- (0x1DF00, 'V'),
- (0x1DF1F, 'X'),
- (0x1E000, 'V'),
- (0x1E007, 'X'),
- (0x1E008, 'V'),
- (0x1E019, 'X'),
- (0x1E01B, 'V'),
- (0x1E022, 'X'),
- (0x1E023, 'V'),
- (0x1E025, 'X'),
- (0x1E026, 'V'),
- (0x1E02B, 'X'),
- (0x1E100, 'V'),
- (0x1E12D, 'X'),
- (0x1E130, 'V'),
- (0x1E13E, 'X'),
- (0x1E140, 'V'),
- (0x1E14A, 'X'),
- (0x1E14E, 'V'),
- (0x1E150, 'X'),
- (0x1E290, 'V'),
- (0x1E2AF, 'X'),
- (0x1E2C0, 'V'),
- (0x1E2FA, 'X'),
- (0x1E2FF, 'V'),
- (0x1E300, 'X'),
- (0x1E7E0, 'V'),
- (0x1E7E7, 'X'),
- (0x1E7E8, 'V'),
- (0x1E7EC, 'X'),
- (0x1E7ED, 'V'),
- (0x1E7EF, 'X'),
- (0x1E7F0, 'V'),
- ]
-
-def _seg_71() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1E7FF, 'X'),
- (0x1E800, 'V'),
- (0x1E8C5, 'X'),
- (0x1E8C7, 'V'),
- (0x1E8D7, 'X'),
- (0x1E900, 'M', '𞤢'),
- (0x1E901, 'M', '𞤣'),
- (0x1E902, 'M', '𞤤'),
- (0x1E903, 'M', '𞤥'),
- (0x1E904, 'M', '𞤦'),
- (0x1E905, 'M', '𞤧'),
- (0x1E906, 'M', '𞤨'),
- (0x1E907, 'M', '𞤩'),
- (0x1E908, 'M', '𞤪'),
- (0x1E909, 'M', '𞤫'),
- (0x1E90A, 'M', '𞤬'),
- (0x1E90B, 'M', '𞤭'),
- (0x1E90C, 'M', '𞤮'),
- (0x1E90D, 'M', '𞤯'),
- (0x1E90E, 'M', '𞤰'),
- (0x1E90F, 'M', '𞤱'),
- (0x1E910, 'M', '𞤲'),
- (0x1E911, 'M', '𞤳'),
- (0x1E912, 'M', '𞤴'),
- (0x1E913, 'M', '𞤵'),
- (0x1E914, 'M', '𞤶'),
- (0x1E915, 'M', '𞤷'),
- (0x1E916, 'M', '𞤸'),
- (0x1E917, 'M', '𞤹'),
- (0x1E918, 'M', '𞤺'),
- (0x1E919, 'M', '𞤻'),
- (0x1E91A, 'M', '𞤼'),
- (0x1E91B, 'M', '𞤽'),
- (0x1E91C, 'M', '𞤾'),
- (0x1E91D, 'M', '𞤿'),
- (0x1E91E, 'M', '𞥀'),
- (0x1E91F, 'M', '𞥁'),
- (0x1E920, 'M', '𞥂'),
- (0x1E921, 'M', '𞥃'),
- (0x1E922, 'V'),
- (0x1E94C, 'X'),
- (0x1E950, 'V'),
- (0x1E95A, 'X'),
- (0x1E95E, 'V'),
- (0x1E960, 'X'),
- (0x1EC71, 'V'),
- (0x1ECB5, 'X'),
- (0x1ED01, 'V'),
- (0x1ED3E, 'X'),
- (0x1EE00, 'M', 'ا'),
- (0x1EE01, 'M', 'ب'),
- (0x1EE02, 'M', 'ج'),
- (0x1EE03, 'M', 'د'),
- (0x1EE04, 'X'),
- (0x1EE05, 'M', 'و'),
- (0x1EE06, 'M', 'ز'),
- (0x1EE07, 'M', 'ح'),
- (0x1EE08, 'M', 'ط'),
- (0x1EE09, 'M', 'ي'),
- (0x1EE0A, 'M', 'ك'),
- (0x1EE0B, 'M', 'ل'),
- (0x1EE0C, 'M', 'م'),
- (0x1EE0D, 'M', 'ن'),
- (0x1EE0E, 'M', 'س'),
- (0x1EE0F, 'M', 'ع'),
- (0x1EE10, 'M', 'ف'),
- (0x1EE11, 'M', 'ص'),
- (0x1EE12, 'M', 'ق'),
- (0x1EE13, 'M', 'ر'),
- (0x1EE14, 'M', 'ش'),
- (0x1EE15, 'M', 'ت'),
- (0x1EE16, 'M', 'ث'),
- (0x1EE17, 'M', 'خ'),
- (0x1EE18, 'M', 'ذ'),
- (0x1EE19, 'M', 'ض'),
- (0x1EE1A, 'M', 'ظ'),
- (0x1EE1B, 'M', 'غ'),
- (0x1EE1C, 'M', 'ٮ'),
- (0x1EE1D, 'M', 'ں'),
- (0x1EE1E, 'M', 'ڡ'),
- (0x1EE1F, 'M', 'ٯ'),
- (0x1EE20, 'X'),
- (0x1EE21, 'M', 'ب'),
- (0x1EE22, 'M', 'ج'),
- (0x1EE23, 'X'),
- (0x1EE24, 'M', 'ه'),
- (0x1EE25, 'X'),
- (0x1EE27, 'M', 'ح'),
- (0x1EE28, 'X'),
- (0x1EE29, 'M', 'ي'),
- (0x1EE2A, 'M', 'ك'),
- (0x1EE2B, 'M', 'ل'),
- (0x1EE2C, 'M', 'م'),
- (0x1EE2D, 'M', 'ن'),
- (0x1EE2E, 'M', 'س'),
- (0x1EE2F, 'M', 'ع'),
- (0x1EE30, 'M', 'ف'),
- (0x1EE31, 'M', 'ص'),
- (0x1EE32, 'M', 'ق'),
- (0x1EE33, 'X'),
- ]
-
-def _seg_72() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1EE34, 'M', 'ش'),
- (0x1EE35, 'M', 'ت'),
- (0x1EE36, 'M', 'ث'),
- (0x1EE37, 'M', 'خ'),
- (0x1EE38, 'X'),
- (0x1EE39, 'M', 'ض'),
- (0x1EE3A, 'X'),
- (0x1EE3B, 'M', 'غ'),
- (0x1EE3C, 'X'),
- (0x1EE42, 'M', 'ج'),
- (0x1EE43, 'X'),
- (0x1EE47, 'M', 'ح'),
- (0x1EE48, 'X'),
- (0x1EE49, 'M', 'ي'),
- (0x1EE4A, 'X'),
- (0x1EE4B, 'M', 'ل'),
- (0x1EE4C, 'X'),
- (0x1EE4D, 'M', 'ن'),
- (0x1EE4E, 'M', 'س'),
- (0x1EE4F, 'M', 'ع'),
- (0x1EE50, 'X'),
- (0x1EE51, 'M', 'ص'),
- (0x1EE52, 'M', 'ق'),
- (0x1EE53, 'X'),
- (0x1EE54, 'M', 'ش'),
- (0x1EE55, 'X'),
- (0x1EE57, 'M', 'خ'),
- (0x1EE58, 'X'),
- (0x1EE59, 'M', 'ض'),
- (0x1EE5A, 'X'),
- (0x1EE5B, 'M', 'غ'),
- (0x1EE5C, 'X'),
- (0x1EE5D, 'M', 'ں'),
- (0x1EE5E, 'X'),
- (0x1EE5F, 'M', 'ٯ'),
- (0x1EE60, 'X'),
- (0x1EE61, 'M', 'ب'),
- (0x1EE62, 'M', 'ج'),
- (0x1EE63, 'X'),
- (0x1EE64, 'M', 'ه'),
- (0x1EE65, 'X'),
- (0x1EE67, 'M', 'ح'),
- (0x1EE68, 'M', 'ط'),
- (0x1EE69, 'M', 'ي'),
- (0x1EE6A, 'M', 'ك'),
- (0x1EE6B, 'X'),
- (0x1EE6C, 'M', 'م'),
- (0x1EE6D, 'M', 'ن'),
- (0x1EE6E, 'M', 'س'),
- (0x1EE6F, 'M', 'ع'),
- (0x1EE70, 'M', 'ف'),
- (0x1EE71, 'M', 'ص'),
- (0x1EE72, 'M', 'ق'),
- (0x1EE73, 'X'),
- (0x1EE74, 'M', 'ش'),
- (0x1EE75, 'M', 'ت'),
- (0x1EE76, 'M', 'ث'),
- (0x1EE77, 'M', 'خ'),
- (0x1EE78, 'X'),
- (0x1EE79, 'M', 'ض'),
- (0x1EE7A, 'M', 'ظ'),
- (0x1EE7B, 'M', 'غ'),
- (0x1EE7C, 'M', 'ٮ'),
- (0x1EE7D, 'X'),
- (0x1EE7E, 'M', 'ڡ'),
- (0x1EE7F, 'X'),
- (0x1EE80, 'M', 'ا'),
- (0x1EE81, 'M', 'ب'),
- (0x1EE82, 'M', 'ج'),
- (0x1EE83, 'M', 'د'),
- (0x1EE84, 'M', 'ه'),
- (0x1EE85, 'M', 'و'),
- (0x1EE86, 'M', 'ز'),
- (0x1EE87, 'M', 'ح'),
- (0x1EE88, 'M', 'ط'),
- (0x1EE89, 'M', 'ي'),
- (0x1EE8A, 'X'),
- (0x1EE8B, 'M', 'ل'),
- (0x1EE8C, 'M', 'م'),
- (0x1EE8D, 'M', 'ن'),
- (0x1EE8E, 'M', 'س'),
- (0x1EE8F, 'M', 'ع'),
- (0x1EE90, 'M', 'ف'),
- (0x1EE91, 'M', 'ص'),
- (0x1EE92, 'M', 'ق'),
- (0x1EE93, 'M', 'ر'),
- (0x1EE94, 'M', 'ش'),
- (0x1EE95, 'M', 'ت'),
- (0x1EE96, 'M', 'ث'),
- (0x1EE97, 'M', 'خ'),
- (0x1EE98, 'M', 'ذ'),
- (0x1EE99, 'M', 'ض'),
- (0x1EE9A, 'M', 'ظ'),
- (0x1EE9B, 'M', 'غ'),
- (0x1EE9C, 'X'),
- (0x1EEA1, 'M', 'ب'),
- (0x1EEA2, 'M', 'ج'),
- (0x1EEA3, 'M', 'د'),
- (0x1EEA4, 'X'),
- (0x1EEA5, 'M', 'و'),
- ]
-
-def _seg_73() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1EEA6, 'M', 'ز'),
- (0x1EEA7, 'M', 'ح'),
- (0x1EEA8, 'M', 'ط'),
- (0x1EEA9, 'M', 'ي'),
- (0x1EEAA, 'X'),
- (0x1EEAB, 'M', 'ل'),
- (0x1EEAC, 'M', 'م'),
- (0x1EEAD, 'M', 'ن'),
- (0x1EEAE, 'M', 'س'),
- (0x1EEAF, 'M', 'ع'),
- (0x1EEB0, 'M', 'ف'),
- (0x1EEB1, 'M', 'ص'),
- (0x1EEB2, 'M', 'ق'),
- (0x1EEB3, 'M', 'ر'),
- (0x1EEB4, 'M', 'ش'),
- (0x1EEB5, 'M', 'ت'),
- (0x1EEB6, 'M', 'ث'),
- (0x1EEB7, 'M', 'خ'),
- (0x1EEB8, 'M', 'ذ'),
- (0x1EEB9, 'M', 'ض'),
- (0x1EEBA, 'M', 'ظ'),
- (0x1EEBB, 'M', 'غ'),
- (0x1EEBC, 'X'),
- (0x1EEF0, 'V'),
- (0x1EEF2, 'X'),
- (0x1F000, 'V'),
- (0x1F02C, 'X'),
- (0x1F030, 'V'),
- (0x1F094, 'X'),
- (0x1F0A0, 'V'),
- (0x1F0AF, 'X'),
- (0x1F0B1, 'V'),
- (0x1F0C0, 'X'),
- (0x1F0C1, 'V'),
- (0x1F0D0, 'X'),
- (0x1F0D1, 'V'),
- (0x1F0F6, 'X'),
- (0x1F101, '3', '0,'),
- (0x1F102, '3', '1,'),
- (0x1F103, '3', '2,'),
- (0x1F104, '3', '3,'),
- (0x1F105, '3', '4,'),
- (0x1F106, '3', '5,'),
- (0x1F107, '3', '6,'),
- (0x1F108, '3', '7,'),
- (0x1F109, '3', '8,'),
- (0x1F10A, '3', '9,'),
- (0x1F10B, 'V'),
- (0x1F110, '3', '(a)'),
- (0x1F111, '3', '(b)'),
- (0x1F112, '3', '(c)'),
- (0x1F113, '3', '(d)'),
- (0x1F114, '3', '(e)'),
- (0x1F115, '3', '(f)'),
- (0x1F116, '3', '(g)'),
- (0x1F117, '3', '(h)'),
- (0x1F118, '3', '(i)'),
- (0x1F119, '3', '(j)'),
- (0x1F11A, '3', '(k)'),
- (0x1F11B, '3', '(l)'),
- (0x1F11C, '3', '(m)'),
- (0x1F11D, '3', '(n)'),
- (0x1F11E, '3', '(o)'),
- (0x1F11F, '3', '(p)'),
- (0x1F120, '3', '(q)'),
- (0x1F121, '3', '(r)'),
- (0x1F122, '3', '(s)'),
- (0x1F123, '3', '(t)'),
- (0x1F124, '3', '(u)'),
- (0x1F125, '3', '(v)'),
- (0x1F126, '3', '(w)'),
- (0x1F127, '3', '(x)'),
- (0x1F128, '3', '(y)'),
- (0x1F129, '3', '(z)'),
- (0x1F12A, 'M', '〔s〕'),
- (0x1F12B, 'M', 'c'),
- (0x1F12C, 'M', 'r'),
- (0x1F12D, 'M', 'cd'),
- (0x1F12E, 'M', 'wz'),
- (0x1F12F, 'V'),
- (0x1F130, 'M', 'a'),
- (0x1F131, 'M', 'b'),
- (0x1F132, 'M', 'c'),
- (0x1F133, 'M', 'd'),
- (0x1F134, 'M', 'e'),
- (0x1F135, 'M', 'f'),
- (0x1F136, 'M', 'g'),
- (0x1F137, 'M', 'h'),
- (0x1F138, 'M', 'i'),
- (0x1F139, 'M', 'j'),
- (0x1F13A, 'M', 'k'),
- (0x1F13B, 'M', 'l'),
- (0x1F13C, 'M', 'm'),
- (0x1F13D, 'M', 'n'),
- (0x1F13E, 'M', 'o'),
- (0x1F13F, 'M', 'p'),
- (0x1F140, 'M', 'q'),
- (0x1F141, 'M', 'r'),
- (0x1F142, 'M', 's'),
- (0x1F143, 'M', 't'),
- ]
-
-def _seg_74() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1F144, 'M', 'u'),
- (0x1F145, 'M', 'v'),
- (0x1F146, 'M', 'w'),
- (0x1F147, 'M', 'x'),
- (0x1F148, 'M', 'y'),
- (0x1F149, 'M', 'z'),
- (0x1F14A, 'M', 'hv'),
- (0x1F14B, 'M', 'mv'),
- (0x1F14C, 'M', 'sd'),
- (0x1F14D, 'M', 'ss'),
- (0x1F14E, 'M', 'ppv'),
- (0x1F14F, 'M', 'wc'),
- (0x1F150, 'V'),
- (0x1F16A, 'M', 'mc'),
- (0x1F16B, 'M', 'md'),
- (0x1F16C, 'M', 'mr'),
- (0x1F16D, 'V'),
- (0x1F190, 'M', 'dj'),
- (0x1F191, 'V'),
- (0x1F1AE, 'X'),
- (0x1F1E6, 'V'),
- (0x1F200, 'M', 'ほか'),
- (0x1F201, 'M', 'ココ'),
- (0x1F202, 'M', 'サ'),
- (0x1F203, 'X'),
- (0x1F210, 'M', '手'),
- (0x1F211, 'M', '字'),
- (0x1F212, 'M', '双'),
- (0x1F213, 'M', 'デ'),
- (0x1F214, 'M', '二'),
- (0x1F215, 'M', '多'),
- (0x1F216, 'M', '解'),
- (0x1F217, 'M', '天'),
- (0x1F218, 'M', '交'),
- (0x1F219, 'M', '映'),
- (0x1F21A, 'M', '無'),
- (0x1F21B, 'M', '料'),
- (0x1F21C, 'M', '前'),
- (0x1F21D, 'M', '後'),
- (0x1F21E, 'M', '再'),
- (0x1F21F, 'M', '新'),
- (0x1F220, 'M', '初'),
- (0x1F221, 'M', '終'),
- (0x1F222, 'M', '生'),
- (0x1F223, 'M', '販'),
- (0x1F224, 'M', '声'),
- (0x1F225, 'M', '吹'),
- (0x1F226, 'M', '演'),
- (0x1F227, 'M', '投'),
- (0x1F228, 'M', '捕'),
- (0x1F229, 'M', '一'),
- (0x1F22A, 'M', '三'),
- (0x1F22B, 'M', '遊'),
- (0x1F22C, 'M', '左'),
- (0x1F22D, 'M', '中'),
- (0x1F22E, 'M', '右'),
- (0x1F22F, 'M', '指'),
- (0x1F230, 'M', '走'),
- (0x1F231, 'M', '打'),
- (0x1F232, 'M', '禁'),
- (0x1F233, 'M', '空'),
- (0x1F234, 'M', '合'),
- (0x1F235, 'M', '満'),
- (0x1F236, 'M', '有'),
- (0x1F237, 'M', '月'),
- (0x1F238, 'M', '申'),
- (0x1F239, 'M', '割'),
- (0x1F23A, 'M', '営'),
- (0x1F23B, 'M', '配'),
- (0x1F23C, 'X'),
- (0x1F240, 'M', '〔本〕'),
- (0x1F241, 'M', '〔三〕'),
- (0x1F242, 'M', '〔二〕'),
- (0x1F243, 'M', '〔安〕'),
- (0x1F244, 'M', '〔点〕'),
- (0x1F245, 'M', '〔打〕'),
- (0x1F246, 'M', '〔盗〕'),
- (0x1F247, 'M', '〔勝〕'),
- (0x1F248, 'M', '〔敗〕'),
- (0x1F249, 'X'),
- (0x1F250, 'M', '得'),
- (0x1F251, 'M', '可'),
- (0x1F252, 'X'),
- (0x1F260, 'V'),
- (0x1F266, 'X'),
- (0x1F300, 'V'),
- (0x1F6D8, 'X'),
- (0x1F6DD, 'V'),
- (0x1F6ED, 'X'),
- (0x1F6F0, 'V'),
- (0x1F6FD, 'X'),
- (0x1F700, 'V'),
- (0x1F774, 'X'),
- (0x1F780, 'V'),
- (0x1F7D9, 'X'),
- (0x1F7E0, 'V'),
- (0x1F7EC, 'X'),
- (0x1F7F0, 'V'),
- (0x1F7F1, 'X'),
- (0x1F800, 'V'),
- ]
-
-def _seg_75() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x1F80C, 'X'),
- (0x1F810, 'V'),
- (0x1F848, 'X'),
- (0x1F850, 'V'),
- (0x1F85A, 'X'),
- (0x1F860, 'V'),
- (0x1F888, 'X'),
- (0x1F890, 'V'),
- (0x1F8AE, 'X'),
- (0x1F8B0, 'V'),
- (0x1F8B2, 'X'),
- (0x1F900, 'V'),
- (0x1FA54, 'X'),
- (0x1FA60, 'V'),
- (0x1FA6E, 'X'),
- (0x1FA70, 'V'),
- (0x1FA75, 'X'),
- (0x1FA78, 'V'),
- (0x1FA7D, 'X'),
- (0x1FA80, 'V'),
- (0x1FA87, 'X'),
- (0x1FA90, 'V'),
- (0x1FAAD, 'X'),
- (0x1FAB0, 'V'),
- (0x1FABB, 'X'),
- (0x1FAC0, 'V'),
- (0x1FAC6, 'X'),
- (0x1FAD0, 'V'),
- (0x1FADA, 'X'),
- (0x1FAE0, 'V'),
- (0x1FAE8, 'X'),
- (0x1FAF0, 'V'),
- (0x1FAF7, 'X'),
- (0x1FB00, 'V'),
- (0x1FB93, 'X'),
- (0x1FB94, 'V'),
- (0x1FBCB, 'X'),
- (0x1FBF0, 'M', '0'),
- (0x1FBF1, 'M', '1'),
- (0x1FBF2, 'M', '2'),
- (0x1FBF3, 'M', '3'),
- (0x1FBF4, 'M', '4'),
- (0x1FBF5, 'M', '5'),
- (0x1FBF6, 'M', '6'),
- (0x1FBF7, 'M', '7'),
- (0x1FBF8, 'M', '8'),
- (0x1FBF9, 'M', '9'),
- (0x1FBFA, 'X'),
- (0x20000, 'V'),
- (0x2A6E0, 'X'),
- (0x2A700, 'V'),
- (0x2B739, 'X'),
- (0x2B740, 'V'),
- (0x2B81E, 'X'),
- (0x2B820, 'V'),
- (0x2CEA2, 'X'),
- (0x2CEB0, 'V'),
- (0x2EBE1, 'X'),
- (0x2F800, 'M', '丽'),
- (0x2F801, 'M', '丸'),
- (0x2F802, 'M', '乁'),
- (0x2F803, 'M', '𠄢'),
- (0x2F804, 'M', '你'),
- (0x2F805, 'M', '侮'),
- (0x2F806, 'M', '侻'),
- (0x2F807, 'M', '倂'),
- (0x2F808, 'M', '偺'),
- (0x2F809, 'M', '備'),
- (0x2F80A, 'M', '僧'),
- (0x2F80B, 'M', '像'),
- (0x2F80C, 'M', '㒞'),
- (0x2F80D, 'M', '𠘺'),
- (0x2F80E, 'M', '免'),
- (0x2F80F, 'M', '兔'),
- (0x2F810, 'M', '兤'),
- (0x2F811, 'M', '具'),
- (0x2F812, 'M', '𠔜'),
- (0x2F813, 'M', '㒹'),
- (0x2F814, 'M', '內'),
- (0x2F815, 'M', '再'),
- (0x2F816, 'M', '𠕋'),
- (0x2F817, 'M', '冗'),
- (0x2F818, 'M', '冤'),
- (0x2F819, 'M', '仌'),
- (0x2F81A, 'M', '冬'),
- (0x2F81B, 'M', '况'),
- (0x2F81C, 'M', '𩇟'),
- (0x2F81D, 'M', '凵'),
- (0x2F81E, 'M', '刃'),
- (0x2F81F, 'M', '㓟'),
- (0x2F820, 'M', '刻'),
- (0x2F821, 'M', '剆'),
- (0x2F822, 'M', '割'),
- (0x2F823, 'M', '剷'),
- (0x2F824, 'M', '㔕'),
- (0x2F825, 'M', '勇'),
- (0x2F826, 'M', '勉'),
- (0x2F827, 'M', '勤'),
- (0x2F828, 'M', '勺'),
- (0x2F829, 'M', '包'),
- ]
-
-def _seg_76() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2F82A, 'M', '匆'),
- (0x2F82B, 'M', '北'),
- (0x2F82C, 'M', '卉'),
- (0x2F82D, 'M', '卑'),
- (0x2F82E, 'M', '博'),
- (0x2F82F, 'M', '即'),
- (0x2F830, 'M', '卽'),
- (0x2F831, 'M', '卿'),
- (0x2F834, 'M', '𠨬'),
- (0x2F835, 'M', '灰'),
- (0x2F836, 'M', '及'),
- (0x2F837, 'M', '叟'),
- (0x2F838, 'M', '𠭣'),
- (0x2F839, 'M', '叫'),
- (0x2F83A, 'M', '叱'),
- (0x2F83B, 'M', '吆'),
- (0x2F83C, 'M', '咞'),
- (0x2F83D, 'M', '吸'),
- (0x2F83E, 'M', '呈'),
- (0x2F83F, 'M', '周'),
- (0x2F840, 'M', '咢'),
- (0x2F841, 'M', '哶'),
- (0x2F842, 'M', '唐'),
- (0x2F843, 'M', '啓'),
- (0x2F844, 'M', '啣'),
- (0x2F845, 'M', '善'),
- (0x2F847, 'M', '喙'),
- (0x2F848, 'M', '喫'),
- (0x2F849, 'M', '喳'),
- (0x2F84A, 'M', '嗂'),
- (0x2F84B, 'M', '圖'),
- (0x2F84C, 'M', '嘆'),
- (0x2F84D, 'M', '圗'),
- (0x2F84E, 'M', '噑'),
- (0x2F84F, 'M', '噴'),
- (0x2F850, 'M', '切'),
- (0x2F851, 'M', '壮'),
- (0x2F852, 'M', '城'),
- (0x2F853, 'M', '埴'),
- (0x2F854, 'M', '堍'),
- (0x2F855, 'M', '型'),
- (0x2F856, 'M', '堲'),
- (0x2F857, 'M', '報'),
- (0x2F858, 'M', '墬'),
- (0x2F859, 'M', '𡓤'),
- (0x2F85A, 'M', '売'),
- (0x2F85B, 'M', '壷'),
- (0x2F85C, 'M', '夆'),
- (0x2F85D, 'M', '多'),
- (0x2F85E, 'M', '夢'),
- (0x2F85F, 'M', '奢'),
- (0x2F860, 'M', '𡚨'),
- (0x2F861, 'M', '𡛪'),
- (0x2F862, 'M', '姬'),
- (0x2F863, 'M', '娛'),
- (0x2F864, 'M', '娧'),
- (0x2F865, 'M', '姘'),
- (0x2F866, 'M', '婦'),
- (0x2F867, 'M', '㛮'),
- (0x2F868, 'X'),
- (0x2F869, 'M', '嬈'),
- (0x2F86A, 'M', '嬾'),
- (0x2F86C, 'M', '𡧈'),
- (0x2F86D, 'M', '寃'),
- (0x2F86E, 'M', '寘'),
- (0x2F86F, 'M', '寧'),
- (0x2F870, 'M', '寳'),
- (0x2F871, 'M', '𡬘'),
- (0x2F872, 'M', '寿'),
- (0x2F873, 'M', '将'),
- (0x2F874, 'X'),
- (0x2F875, 'M', '尢'),
- (0x2F876, 'M', '㞁'),
- (0x2F877, 'M', '屠'),
- (0x2F878, 'M', '屮'),
- (0x2F879, 'M', '峀'),
- (0x2F87A, 'M', '岍'),
- (0x2F87B, 'M', '𡷤'),
- (0x2F87C, 'M', '嵃'),
- (0x2F87D, 'M', '𡷦'),
- (0x2F87E, 'M', '嵮'),
- (0x2F87F, 'M', '嵫'),
- (0x2F880, 'M', '嵼'),
- (0x2F881, 'M', '巡'),
- (0x2F882, 'M', '巢'),
- (0x2F883, 'M', '㠯'),
- (0x2F884, 'M', '巽'),
- (0x2F885, 'M', '帨'),
- (0x2F886, 'M', '帽'),
- (0x2F887, 'M', '幩'),
- (0x2F888, 'M', '㡢'),
- (0x2F889, 'M', '𢆃'),
- (0x2F88A, 'M', '㡼'),
- (0x2F88B, 'M', '庰'),
- (0x2F88C, 'M', '庳'),
- (0x2F88D, 'M', '庶'),
- (0x2F88E, 'M', '廊'),
- (0x2F88F, 'M', '𪎒'),
- (0x2F890, 'M', '廾'),
- (0x2F891, 'M', '𢌱'),
- ]
-
-def _seg_77() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2F893, 'M', '舁'),
- (0x2F894, 'M', '弢'),
- (0x2F896, 'M', '㣇'),
- (0x2F897, 'M', '𣊸'),
- (0x2F898, 'M', '𦇚'),
- (0x2F899, 'M', '形'),
- (0x2F89A, 'M', '彫'),
- (0x2F89B, 'M', '㣣'),
- (0x2F89C, 'M', '徚'),
- (0x2F89D, 'M', '忍'),
- (0x2F89E, 'M', '志'),
- (0x2F89F, 'M', '忹'),
- (0x2F8A0, 'M', '悁'),
- (0x2F8A1, 'M', '㤺'),
- (0x2F8A2, 'M', '㤜'),
- (0x2F8A3, 'M', '悔'),
- (0x2F8A4, 'M', '𢛔'),
- (0x2F8A5, 'M', '惇'),
- (0x2F8A6, 'M', '慈'),
- (0x2F8A7, 'M', '慌'),
- (0x2F8A8, 'M', '慎'),
- (0x2F8A9, 'M', '慌'),
- (0x2F8AA, 'M', '慺'),
- (0x2F8AB, 'M', '憎'),
- (0x2F8AC, 'M', '憲'),
- (0x2F8AD, 'M', '憤'),
- (0x2F8AE, 'M', '憯'),
- (0x2F8AF, 'M', '懞'),
- (0x2F8B0, 'M', '懲'),
- (0x2F8B1, 'M', '懶'),
- (0x2F8B2, 'M', '成'),
- (0x2F8B3, 'M', '戛'),
- (0x2F8B4, 'M', '扝'),
- (0x2F8B5, 'M', '抱'),
- (0x2F8B6, 'M', '拔'),
- (0x2F8B7, 'M', '捐'),
- (0x2F8B8, 'M', '𢬌'),
- (0x2F8B9, 'M', '挽'),
- (0x2F8BA, 'M', '拼'),
- (0x2F8BB, 'M', '捨'),
- (0x2F8BC, 'M', '掃'),
- (0x2F8BD, 'M', '揤'),
- (0x2F8BE, 'M', '𢯱'),
- (0x2F8BF, 'M', '搢'),
- (0x2F8C0, 'M', '揅'),
- (0x2F8C1, 'M', '掩'),
- (0x2F8C2, 'M', '㨮'),
- (0x2F8C3, 'M', '摩'),
- (0x2F8C4, 'M', '摾'),
- (0x2F8C5, 'M', '撝'),
- (0x2F8C6, 'M', '摷'),
- (0x2F8C7, 'M', '㩬'),
- (0x2F8C8, 'M', '敏'),
- (0x2F8C9, 'M', '敬'),
- (0x2F8CA, 'M', '𣀊'),
- (0x2F8CB, 'M', '旣'),
- (0x2F8CC, 'M', '書'),
- (0x2F8CD, 'M', '晉'),
- (0x2F8CE, 'M', '㬙'),
- (0x2F8CF, 'M', '暑'),
- (0x2F8D0, 'M', '㬈'),
- (0x2F8D1, 'M', '㫤'),
- (0x2F8D2, 'M', '冒'),
- (0x2F8D3, 'M', '冕'),
- (0x2F8D4, 'M', '最'),
- (0x2F8D5, 'M', '暜'),
- (0x2F8D6, 'M', '肭'),
- (0x2F8D7, 'M', '䏙'),
- (0x2F8D8, 'M', '朗'),
- (0x2F8D9, 'M', '望'),
- (0x2F8DA, 'M', '朡'),
- (0x2F8DB, 'M', '杞'),
- (0x2F8DC, 'M', '杓'),
- (0x2F8DD, 'M', '𣏃'),
- (0x2F8DE, 'M', '㭉'),
- (0x2F8DF, 'M', '柺'),
- (0x2F8E0, 'M', '枅'),
- (0x2F8E1, 'M', '桒'),
- (0x2F8E2, 'M', '梅'),
- (0x2F8E3, 'M', '𣑭'),
- (0x2F8E4, 'M', '梎'),
- (0x2F8E5, 'M', '栟'),
- (0x2F8E6, 'M', '椔'),
- (0x2F8E7, 'M', '㮝'),
- (0x2F8E8, 'M', '楂'),
- (0x2F8E9, 'M', '榣'),
- (0x2F8EA, 'M', '槪'),
- (0x2F8EB, 'M', '檨'),
- (0x2F8EC, 'M', '𣚣'),
- (0x2F8ED, 'M', '櫛'),
- (0x2F8EE, 'M', '㰘'),
- (0x2F8EF, 'M', '次'),
- (0x2F8F0, 'M', '𣢧'),
- (0x2F8F1, 'M', '歔'),
- (0x2F8F2, 'M', '㱎'),
- (0x2F8F3, 'M', '歲'),
- (0x2F8F4, 'M', '殟'),
- (0x2F8F5, 'M', '殺'),
- (0x2F8F6, 'M', '殻'),
- (0x2F8F7, 'M', '𣪍'),
- ]
-
-def _seg_78() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2F8F8, 'M', '𡴋'),
- (0x2F8F9, 'M', '𣫺'),
- (0x2F8FA, 'M', '汎'),
- (0x2F8FB, 'M', '𣲼'),
- (0x2F8FC, 'M', '沿'),
- (0x2F8FD, 'M', '泍'),
- (0x2F8FE, 'M', '汧'),
- (0x2F8FF, 'M', '洖'),
- (0x2F900, 'M', '派'),
- (0x2F901, 'M', '海'),
- (0x2F902, 'M', '流'),
- (0x2F903, 'M', '浩'),
- (0x2F904, 'M', '浸'),
- (0x2F905, 'M', '涅'),
- (0x2F906, 'M', '𣴞'),
- (0x2F907, 'M', '洴'),
- (0x2F908, 'M', '港'),
- (0x2F909, 'M', '湮'),
- (0x2F90A, 'M', '㴳'),
- (0x2F90B, 'M', '滋'),
- (0x2F90C, 'M', '滇'),
- (0x2F90D, 'M', '𣻑'),
- (0x2F90E, 'M', '淹'),
- (0x2F90F, 'M', '潮'),
- (0x2F910, 'M', '𣽞'),
- (0x2F911, 'M', '𣾎'),
- (0x2F912, 'M', '濆'),
- (0x2F913, 'M', '瀹'),
- (0x2F914, 'M', '瀞'),
- (0x2F915, 'M', '瀛'),
- (0x2F916, 'M', '㶖'),
- (0x2F917, 'M', '灊'),
- (0x2F918, 'M', '災'),
- (0x2F919, 'M', '灷'),
- (0x2F91A, 'M', '炭'),
- (0x2F91B, 'M', '𠔥'),
- (0x2F91C, 'M', '煅'),
- (0x2F91D, 'M', '𤉣'),
- (0x2F91E, 'M', '熜'),
- (0x2F91F, 'X'),
- (0x2F920, 'M', '爨'),
- (0x2F921, 'M', '爵'),
- (0x2F922, 'M', '牐'),
- (0x2F923, 'M', '𤘈'),
- (0x2F924, 'M', '犀'),
- (0x2F925, 'M', '犕'),
- (0x2F926, 'M', '𤜵'),
- (0x2F927, 'M', '𤠔'),
- (0x2F928, 'M', '獺'),
- (0x2F929, 'M', '王'),
- (0x2F92A, 'M', '㺬'),
- (0x2F92B, 'M', '玥'),
- (0x2F92C, 'M', '㺸'),
- (0x2F92E, 'M', '瑇'),
- (0x2F92F, 'M', '瑜'),
- (0x2F930, 'M', '瑱'),
- (0x2F931, 'M', '璅'),
- (0x2F932, 'M', '瓊'),
- (0x2F933, 'M', '㼛'),
- (0x2F934, 'M', '甤'),
- (0x2F935, 'M', '𤰶'),
- (0x2F936, 'M', '甾'),
- (0x2F937, 'M', '𤲒'),
- (0x2F938, 'M', '異'),
- (0x2F939, 'M', '𢆟'),
- (0x2F93A, 'M', '瘐'),
- (0x2F93B, 'M', '𤾡'),
- (0x2F93C, 'M', '𤾸'),
- (0x2F93D, 'M', '𥁄'),
- (0x2F93E, 'M', '㿼'),
- (0x2F93F, 'M', '䀈'),
- (0x2F940, 'M', '直'),
- (0x2F941, 'M', '𥃳'),
- (0x2F942, 'M', '𥃲'),
- (0x2F943, 'M', '𥄙'),
- (0x2F944, 'M', '𥄳'),
- (0x2F945, 'M', '眞'),
- (0x2F946, 'M', '真'),
- (0x2F948, 'M', '睊'),
- (0x2F949, 'M', '䀹'),
- (0x2F94A, 'M', '瞋'),
- (0x2F94B, 'M', '䁆'),
- (0x2F94C, 'M', '䂖'),
- (0x2F94D, 'M', '𥐝'),
- (0x2F94E, 'M', '硎'),
- (0x2F94F, 'M', '碌'),
- (0x2F950, 'M', '磌'),
- (0x2F951, 'M', '䃣'),
- (0x2F952, 'M', '𥘦'),
- (0x2F953, 'M', '祖'),
- (0x2F954, 'M', '𥚚'),
- (0x2F955, 'M', '𥛅'),
- (0x2F956, 'M', '福'),
- (0x2F957, 'M', '秫'),
- (0x2F958, 'M', '䄯'),
- (0x2F959, 'M', '穀'),
- (0x2F95A, 'M', '穊'),
- (0x2F95B, 'M', '穏'),
- (0x2F95C, 'M', '𥥼'),
- (0x2F95D, 'M', '𥪧'),
- ]
-
-def _seg_79() -> List[Union[Tuple[int, str], Tuple[int, str, str]]]:
- return [
- (0x2F95F, 'X'),
- (0x2F960, 'M', '䈂'),
- (0x2F961, 'M', '𥮫'),
- (0x2F962, 'M', '篆'),
- (0x2F963, 'M', '築'),
- (0x2F964, 'M', '䈧'),
- (0x2F965, 'M', '𥲀'),
- (0x2F966, 'M', '糒'),
- (0x2F967, 'M', '䊠'),
- (0x2F968, 'M', '糨'),
- (0x2F969, 'M', '糣'),
- (0x2F96A, 'M', '紀'),
- (0x2F96B, 'M', '𥾆'),
- (0x2F96C, 'M', '絣'),
- (0x2F96D, 'M', '䌁'),
- (0x2F96E, 'M', '緇'),
- (0x2F96F, 'M', '縂'),
- (0x2F970, 'M', '繅'),
- (0x2F971, 'M', '䌴'),
- (0x2F972, 'M', '𦈨'),
- (0x2F973, 'M', '𦉇'),
- (0x2F974, 'M', '䍙'),
- (0x2F975, 'M', '𦋙'),
- (0x2F976, 'M', '罺'),
- (0x2F977, 'M', '𦌾'),
- (0x2F978, 'M', '羕'),
- (0x2F979, 'M', '翺'),
- (0x2F97A, 'M', '者'),
- (0x2F97B, 'M', '𦓚'),
- (0x2F97C, 'M', '𦔣'),
- (0x2F97D, 'M', '聠'),
- (0x2F97E, 'M', '𦖨'),
- (0x2F97F, 'M', '聰'),
- (0x2F980, 'M', '𣍟'),
- (0x2F981, 'M', '䏕'),
- (0x2F982, 'M', '育'),
- (0x2F983, 'M', '脃'),
- (0x2F984, 'M', '䐋'),
- (0x2F985, 'M', '脾'),
- (0x2F986, 'M', '媵'),
- (0x2F987, 'M', '𦞧'),
- (0x2F988, 'M', '𦞵'),
- (0x2F989, 'M', '𣎓'),
- (0x2F98A, 'M', '𣎜'),
- (0x2F98B, 'M', '舁'),
- (0x2F98C, 'M', '舄'),
- (0x2F98D, 'M', '辞'),
- (0x2F98E, 'M', '䑫'),
- (0x2F98F, 'M', '芑'),
-pip/_vendor/pygments/formatters/_mapping.py,sha256=3A1rYSjYN9MLduCFWy2_mYhllPVpwlw55anRYnPXX8w,6516
-pip/_vendor/pygments/formatters/bbcode.py,sha256=cSKMOioUnE4TzvCCsK4IbJ6G78W07ZwHtkz4V1Wte0U,3314
-pip/_vendor/pygments/formatters/groff.py,sha256=ULgMKvGeLswX0KZn3IBp0p0U3rruiSHBtpl6O5qbqLs,5005
-pip/_vendor/pygments/formatters/html.py,sha256=0jM7Jc4xA4tsjmPq35uklm_En_OVdcNb0__SEXp2pDQ,35330
-pip/_vendor/pygments/formatters/img.py,sha256=r4iag_jCfyv_LhIt-1fRDeVEEoAfVJzkD9nZChIwiS8,21819
-pip/_vendor/pygments/formatters/irc.py,sha256=gi_IeIZeNaTfTMtvseLigZdS6lNicN7r7O7rnI6myo0,5871
-pip/_vendor/pygments/formatters/latex.py,sha256=qZUerrHt2Nn2aB4gJcdqj99qBkIxl_1v1ukYsf230Gk,18930
-pip/_vendor/pygments/formatters/other.py,sha256=Q01LtkqPZ8m_EYdgMVzXPUGjHoL00lXI3By97wzytYU,5073
-pip/_vendor/pygments/formatters/pangomarkup.py,sha256=ZpjALTSuGFwviJd5kOYwr-1NgqxCX3XRJrjXC7x1UbQ,2212
-pip/_vendor/pygments/formatters/rtf.py,sha256=qh7-z_wbUsTY6z7fZUGrYECYBlWB0wEdBwIZVEVybL0,5014
-pip/_vendor/pygments/formatters/svg.py,sha256=T7Jj004I3JUPOr48aAhQ368K2qWCciUyMQ2tdU-LB-4,7335
-pip/_vendor/pygments/formatters/terminal.py,sha256=cRD5hitINOkYlGZo9ma252vpJYPSGNgLivrsm6zGyec,4674
-pip/_vendor/pygments/formatters/terminal256.py,sha256=Bvz9zZL3UWc94TDm1GhKMI4x0BTit0XplhyRL0zmtkw,11753
-pip/_vendor/pygments/lexer.py,sha256=ECXWlEsbRnKs_njozZns6BGQ4riTMzct_BzAr3zV6dY,31937
-pip/_vendor/pygments/lexers/__init__.py,sha256=6Ds0GVBP3jrIU02wmjRdpoL4eFGhwT2IVD1zf3cV5_Y,11307
-pip/_vendor/pygments/lexers/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/pygments/lexers/__pycache__/_mapping.cpython-39.pyc,,
-pip/_vendor/pygments/lexers/__pycache__/python.cpython-39.pyc,,
-pip/_vendor/pygments/lexers/_mapping.py,sha256=jAxmvh5wvNkD-p3Fh6E7hY_B0sGbcxWRfseT6iq7ex4,70032
-pip/_vendor/pygments/lexers/python.py,sha256=LXnk43Lcngqn9xj6eRqdk2f73oF4kHZWiwgHMM_RlVM,52776
-pip/_vendor/pygments/modeline.py,sha256=37fen3cf1moCz4vMVJqX41eAQCmj8pzUchikgPcHp-U,986
-pip/_vendor/pygments/plugin.py,sha256=zGSig3S7QX-3o6RDxd4_Uvice_t25l_BN9aQQ9k8vmU,1727
-pip/_vendor/pygments/regexopt.py,sha256=mj8Fgu3sT0d5PZwRwDLexEvVOQbuHeosubQnqVwgiqs,3072
-pip/_vendor/pygments/scanner.py,sha256=nGoHy-Npk2ylUd4bws_CJN1hK785Xqo8e0teRmNX2jo,3091
-pip/_vendor/pygments/sphinxext.py,sha256=FZ2puvLe2Bztqtj6UJvQd7D8TvtOZ1GsfRJObvH59tE,4630
-pip/_vendor/pygments/style.py,sha256=lGyan5bU42q1kGMfFqafwL3g1j5EurTvfkv8vdP7NzQ,6257
-pip/_vendor/pygments/styles/__init__.py,sha256=Qx2zq6ufbDNE2cTp51M-s9zW-sDE-KLIqFw31qr3Bhg,3252
-pip/_vendor/pygments/styles/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/pygments/token.py,sha256=lNPgeaQTzu2DEUi6n_lxAIU7uy4DVj8LMI3nSVnTjks,6143
-pip/_vendor/pygments/unistring.py,sha256=Xs0FzOzE0l0iWRoTlcgi-Q_kAMdF5Gt5FL_goGKJc98,63188
-pip/_vendor/pygments/util.py,sha256=s9n8BQXIxG3lIwCPWv5-ci8yhaqq5JbEVK9v8Z-8_3I,9123
-pip/_vendor/pyparsing/__init__.py,sha256=qY88Z_HpaQVYsK2hMI_jJVmnaMI05c0Nx2_n7dRUpHk,9171
-pip/_vendor/pyparsing/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/actions.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/common.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/core.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/exceptions.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/helpers.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/results.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/testing.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/unicode.cpython-39.pyc,,
-pip/_vendor/pyparsing/__pycache__/util.cpython-39.pyc,,
-pip/_vendor/pyparsing/actions.py,sha256=60v7mETOBzc01YPH_qQD5isavgcSJpAfIKpzgjM3vaU,6429
-pip/_vendor/pyparsing/common.py,sha256=lFL97ooIeR75CmW5hjURZqwDCTgruqltcTCZ-ulLO2Q,12936
-pip/_vendor/pyparsing/core.py,sha256=OoDgtdSt6wjRyHe6VN7vjKOicIn0Gn6H4ZmUQLDPG9I,213312
-pip/_vendor/pyparsing/diagram/__init__.py,sha256=rUja7jHEHCAmP3j9XEIELP3izw2u6JHVtgvaWLcklJc,23032
-pip/_vendor/pyparsing/diagram/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/pyparsing/exceptions.py,sha256=H4D9gqMavqmAFSsdrU_J6bO-jA-T-A7yvtXWZpooIUA,9030
-pip/_vendor/pyparsing/helpers.py,sha256=EyjpgDOc3ivwRsU4VXxAWdgIs5gaqMDaLWcwRh5mqxc,39007
-pip/_vendor/pyparsing/results.py,sha256=Hd6FAAh5sF8zGXpwsamdVqFUblIwyQf0FH0t7FCb1OY,25353
-pip/_vendor/pyparsing/testing.py,sha256=szs8AKZREZMhL0y0vsMfaTVAnpqPHetg6VKJBNmc4QY,13388
-pip/_vendor/pyparsing/unicode.py,sha256=IR-ioeGY29cZ49tG8Ts7ITPWWNP5G2DcZs58oa8zn44,10381
-pip/_vendor/pyparsing/util.py,sha256=kq772O5YSeXOSdP-M31EWpbH_ayj7BMHImBYo9xPD5M,6805
-pip/_vendor/requests/__init__.py,sha256=6IUFQM6K9V2NIu4fe4LtUsN21-TFbw_w3EfPpdUN-qc,5130
-pip/_vendor/requests/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/__version__.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/_internal_utils.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/adapters.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/api.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/auth.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/certs.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/compat.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/cookies.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/exceptions.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/help.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/hooks.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/models.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/packages.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/sessions.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/status_codes.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/structures.cpython-39.pyc,,
-pip/_vendor/requests/__pycache__/utils.cpython-39.pyc,,
-pip/_vendor/requests/__version__.py,sha256=q8miOQaomOv3S74lK4eQs1zZ5jwcnOusyEU-M2idhts,441
-pip/_vendor/requests/_internal_utils.py,sha256=Zx3PnEUccyfsB-ie11nZVAW8qClJy0gx1qNME7rgT18,1096
-pip/_vendor/requests/adapters.py,sha256=WazYJQ_b2LHhNDb_y0hscNlWVsSe5ca5I3pymPrer5w,21861
-pip/_vendor/requests/api.py,sha256=hjuoP79IAEmX6Dysrw8t032cLfwLHxbI_wM4gC5G9t0,6402
-pip/_vendor/requests/auth.py,sha256=OMoJIVKyRLy9THr91y8rxysZuclwPB-K1Xg1zBomUhQ,10207
-pip/_vendor/requests/certs.py,sha256=nXRVq9DtGmv_1AYbwjTu9UrgAcdJv05ZvkNeaoLOZxY,465
-pip/_vendor/requests/compat.py,sha256=N1281mkcTluMjKqCSLf88LR6HNOygEhS1TbR9LLsoVY,2114
-pip/_vendor/requests/cookies.py,sha256=Y-bKX6TvW3FnYlE6Au0SXtVVWcaNdFvuAwQxw-G0iTI,18430
-pip/_vendor/requests/exceptions.py,sha256=VcpBXOL-9JYhNbK8OZxCIImBgpQSXJlUelDPf1f-pmM,3446
-pip/_vendor/requests/help.py,sha256=dyhe3lcmHXnFCzDiZVjcGmVvvO_jtsfAm-AC542ndw8,3972
-pip/_vendor/requests/hooks.py,sha256=QReGyy0bRcr5rkwCuObNakbYsc7EkiKeBwG4qHekr2Q,757
-pip/_vendor/requests/models.py,sha256=7pzscX_47qxx7-zEaBWGxMoB33Vdf6HLoUKZh1ktEvM,35116
-pip/_vendor/requests/packages.py,sha256=njJmVifY4aSctuW3PP5EFRCxjEwMRDO6J_feG2dKWsI,695
-pip/_vendor/requests/sessions.py,sha256=Zu-Y9YPlwTIsyFx1hvIrc3ziyeFpuFPqcOuSuz8BNWs,29835
-pip/_vendor/requests/status_codes.py,sha256=gT79Pbs_cQjBgp-fvrUgg1dn2DQO32bDj4TInjnMPSc,4188
-pip/_vendor/requests/structures.py,sha256=msAtr9mq1JxHd-JRyiILfdFlpbJwvvFuP3rfUQT_QxE,3005
-pip/_vendor/requests/utils.py,sha256=siud-FQ6xgKFbL49DRvAb3PMQMMHoeCL_TCmuHh9AUU,33301
-pip/_vendor/resolvelib/__init__.py,sha256=UL-B2BDI0_TRIqkfGwLHKLxY-LjBlomz7941wDqzB1I,537
-pip/_vendor/resolvelib/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/resolvelib/__pycache__/providers.cpython-39.pyc,,
-pip/_vendor/resolvelib/__pycache__/reporters.cpython-39.pyc,,
-pip/_vendor/resolvelib/__pycache__/resolvers.cpython-39.pyc,,
-pip/_vendor/resolvelib/__pycache__/structs.cpython-39.pyc,,
-pip/_vendor/resolvelib/compat/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-pip/_vendor/resolvelib/compat/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/resolvelib/compat/__pycache__/collections_abc.cpython-39.pyc,,
-pip/_vendor/resolvelib/compat/collections_abc.py,sha256=uy8xUZ-NDEw916tugUXm8HgwCGiMO0f-RcdnpkfXfOs,156
-pip/_vendor/resolvelib/providers.py,sha256=roVmFBItQJ0TkhNua65h8LdNny7rmeqVEXZu90QiP4o,5872
-pip/_vendor/resolvelib/reporters.py,sha256=fW91NKf-lK8XN7i6Yd_rczL5QeOT3sc6AKhpaTEnP3E,1583
-pip/_vendor/resolvelib/resolvers.py,sha256=2wYzVGBGerbmcIpH8cFmgSKgLSETz8jmwBMGjCBMHG4,17592
-pip/_vendor/resolvelib/structs.py,sha256=IVIYof6sA_N4ZEiE1C1UhzTX495brCNnyCdgq6CYq28,4794
-pip/_vendor/rich/__init__.py,sha256=oE1WmC8HOKuh1hFK1qTx1vKG_ng4TrFBK1jYf_vxvTk,5843
-pip/_vendor/rich/__main__.py,sha256=BmTmBWI93ytq75IEPi1uAAdeRYzFfDbgaAXjsX1ogig,8808
-pip/_vendor/rich/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/__main__.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_cell_widths.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_emoji_codes.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_emoji_replace.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_extension.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_inspect.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_log_render.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_loop.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_lru_cache.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_palettes.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_pick.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_ratio.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_spinners.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_stack.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_timer.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_win32_console.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_windows.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_windows_renderer.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/_wrap.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/abc.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/align.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/ansi.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/bar.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/box.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/cells.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/color.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/color_triplet.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/columns.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/console.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/constrain.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/containers.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/control.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/default_styles.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/diagnose.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/emoji.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/errors.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/file_proxy.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/filesize.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/highlighter.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/json.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/jupyter.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/layout.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/live.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/live_render.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/logging.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/markup.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/measure.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/padding.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/pager.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/palette.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/panel.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/pretty.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/progress.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/progress_bar.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/prompt.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/protocol.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/region.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/repr.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/rule.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/scope.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/screen.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/segment.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/spinner.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/status.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/style.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/styled.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/syntax.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/table.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/terminal_theme.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/text.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/theme.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/themes.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/traceback.cpython-39.pyc,,
-pip/_vendor/rich/__pycache__/tree.cpython-39.pyc,,
-pip/_vendor/rich/_cell_widths.py,sha256=2n4EiJi3X9sqIq0O16kUZ_zy6UYMd3xFfChlKfnW1Hc,10096
-pip/_vendor/rich/_emoji_codes.py,sha256=hu1VL9nbVdppJrVoijVshRlcRRe_v3dju3Mmd2sKZdY,140235
-pip/_vendor/rich/_emoji_replace.py,sha256=n-kcetsEUx2ZUmhQrfeMNc-teeGhpuSQ5F8VPBsyvDo,1064
-pip/_vendor/rich/_extension.py,sha256=Xt47QacCKwYruzjDi-gOBq724JReDj9Cm9xUi5fr-34,265
-pip/_vendor/rich/_inspect.py,sha256=8nKlSSFO24I43aStJQx2XD1BParboxbWP7yTZLANkzM,7817
-pip/_vendor/rich/_log_render.py,sha256=1ByI0PA1ZpxZY3CGJOK54hjlq4X-Bz_boIjIqCd8Kns,3225
-pip/_vendor/rich/_loop.py,sha256=hV_6CLdoPm0va22Wpw4zKqM0RYsz3TZxXj0PoS-9eDQ,1236
-pip/_vendor/rich/_lru_cache.py,sha256=yQ6ErnRu0O5lq6UcP9c-kN9jSYpN3Yc8PAJJtsb_T64,1218
-pip/_vendor/rich/_palettes.py,sha256=cdev1JQKZ0JvlguV9ipHgznTdnvlIzUFDBb0It2PzjI,7063
-pip/_vendor/rich/_pick.py,sha256=evDt8QN4lF5CiwrUIXlOJCntitBCOsI3ZLPEIAVRLJU,423
-pip/_vendor/rich/_ratio.py,sha256=2lLSliL025Y-YMfdfGbutkQDevhcyDqc-DtUYW9mU70,5472
-pip/_vendor/rich/_spinners.py,sha256=U2r1_g_1zSjsjiUdAESc2iAMc3i4ri_S8PYP6kQ5z1I,19919
-pip/_vendor/rich/_stack.py,sha256=-C8OK7rxn3sIUdVwxZBBpeHhIzX0eI-VM3MemYfaXm0,351
-pip/_vendor/rich/_timer.py,sha256=zelxbT6oPFZnNrwWPpc1ktUeAT-Vc4fuFcRZLQGLtMI,417
-pip/_vendor/rich/_win32_console.py,sha256=LxoB5BDZnNHGlMz_h37tuX55cYfCGS2yLkz577I3bcA,21608
-pip/_vendor/rich/_windows.py,sha256=dvNl9TmfPzNVxiKk5WDFihErZ5796g2UC9-KGGyfXmk,1926
-pip/_vendor/rich/_windows_renderer.py,sha256=ycLsblXTK_DN09z7CmeJysQqMDn_fntciNhQ0ZixtSU,2599
-pip/_vendor/rich/_wrap.py,sha256=OtnSxnERkuNlSM1d_MYtNg8KIYTcTBk3peg16dCZH_U,1804
-pip/_vendor/rich/abc.py,sha256=ON-E-ZqSSheZ88VrKX2M3PXpFbGEUUZPMa_Af0l-4f0,890
-pip/_vendor/rich/align.py,sha256=FV6_GS-8uhIyViMng3hkIWSFaTgMohK1Oqyjl8I8mGE,10368
-pip/_vendor/rich/ansi.py,sha256=HtaPG7dvgL6_yo0sQmx5CM05DJ4_1goY5SWXXOYNaKs,6820
-pip/_vendor/rich/bar.py,sha256=a7UD303BccRCrEhGjfMElpv5RFYIinaAhAuqYqhUvmw,3264
-pip/_vendor/rich/box.py,sha256=o0ywz1iW0WjGLPrRVDAZPh1CVPEgAOaWsn8Bf3sf43g,9069
-pip/_vendor/rich/cells.py,sha256=d2-hlFshem63kNqukoRsP2WTtfLaebxzEbg94rpgSIk,4091
-pip/_vendor/rich/color.py,sha256=kp87L8V4-3qayE6CUxtW_nP8Ujfew_-DAhNwYMXBMOY,17957
-pip/_vendor/rich/color_triplet.py,sha256=3lhQkdJbvWPoLDO-AnYImAWmJvV5dlgYNCVZ97ORaN4,1054
-pip/_vendor/rich/columns.py,sha256=HUX0KcMm9dsKNi11fTbiM_h2iDtl8ySCaVcxlalEzq8,7131
-pip/_vendor/rich/console.py,sha256=10JPWOEiykzAy_XpVxYE0adcN2IloznJwpT350Kq4Uo,94759
-pip/_vendor/rich/constrain.py,sha256=1VIPuC8AgtKWrcncQrjBdYqA3JVWysu6jZo1rrh7c7Q,1288
-pip/_vendor/rich/containers.py,sha256=aKgm5UDHn5Nmui6IJaKdsZhbHClh_X7D-_Wg8Ehrr7s,5497
-pip/_vendor/rich/control.py,sha256=NxfWdYelgEyIcwQClH_rCkSdbY4n8CCe_k6yt9LQLRM,5293
-pip/_vendor/rich/default_styles.py,sha256=I_evueuzWmuBD1ERl62oBVHypCEQP5sTs8AN_bPuzp8,7675
-pip/_vendor/rich/diagnose.py,sha256=6QBVGOg1_8LFQRhkPbPA2GFMw9J6n-q38QDRgygtT1E,920
-pip/_vendor/rich/emoji.py,sha256=omTF9asaAnsM4yLY94eR_9dgRRSm1lHUszX20D1yYCQ,2501
-pip/_vendor/rich/errors.py,sha256=5pP3Kc5d4QJ_c0KFsxrfyhjiPVe7J1zOqSFbFAzcV-Y,642
-pip/_vendor/rich/file_proxy.py,sha256=fHeReSO3VJ7IbH_9ri-OrPYbFC3UYOzeTNjngiiWOcY,1613
-pip/_vendor/rich/filesize.py,sha256=yShoVpARafJBreyZFaAhC4OhnJ6ydC1WXR-Ez4wU_YQ,2507
-pip/_vendor/rich/highlighter.py,sha256=v1R6BZIVUzd3PIw9EVc0d59Spr9cI-RPIKqF3OXGOvM,5695
-pip/_vendor/rich/json.py,sha256=RCm4lXBXrjvXHpqrWPH8wdGP0jEo4IohLmkddlhRY18,5051
-pip/_vendor/rich/jupyter.py,sha256=P7IScv6bwoSKpj7_9LFygBsVTYQX3tZm0YoKGCKGffE,3164
-pip/_vendor/rich/layout.py,sha256=E3xJ4fomizUADwime3VA0lBXoMSPl9blEokIzVBjO0Q,14074
-pip/_vendor/rich/live.py,sha256=OKxMaFU5sFfuR--cJftGYjSvg1VPQri1U_DNZUjCsvI,13711
-pip/_vendor/rich/live_render.py,sha256=zElm3PrfSIvjOce28zETHMIUf9pFYSUA5o0AflgUP64,3667
-pip/_vendor/rich/logging.py,sha256=10j13lPr-QuYqEEBz_2aRJp8gNYvSN2wmCUlUqJcPLM,11471
-pip/_vendor/rich/markup.py,sha256=X0yMUpkQajdz24btnELBfDdRWTT-jmwHX4zTfDWrrkM,8192
-pip/_vendor/rich/measure.py,sha256=Bn7ycJmASksRWLkiudhxcKvbl_aSriYSlOtVgwiB1RI,5305
-pip/_vendor/rich/padding.py,sha256=kTFGsdGe0os7tXLnHKpwTI90CXEvrceeZGCshmJy5zw,4970
-pip/_vendor/rich/pager.py,sha256=SO_ETBFKbg3n_AgOzXm41Sv36YxXAyI3_R-KOY2_uSc,828
-pip/_vendor/rich/palette.py,sha256=lInvR1ODDT2f3UZMfL1grq7dY_pDdKHw4bdUgOGaM4Y,3396
-pip/_vendor/rich/panel.py,sha256=CzdojkDAjxAKgvDxis47nWzUh1V2NniOqkJJQajosG8,8744
-pip/_vendor/rich/pretty.py,sha256=V7fC5AUuRYR6KuhvSdaJNZAC6YAPzP61M3A1OlLZIxg,35831
-pip/_vendor/rich/progress.py,sha256=1scaQfT2Kt2bWAXSDwexsQzKv06hix88vxej0Eh1cvw,55822
-pip/_vendor/rich/progress_bar.py,sha256=ELiBaxJOgsRYKpNIrot7BC0bFXvmf8cTd6nxI02BbK0,7762
-pip/_vendor/rich/prompt.py,sha256=x0mW-pIPodJM4ry6grgmmLrl8VZp99kqcmdnBe70YYA,11303
-pip/_vendor/rich/protocol.py,sha256=5hHHDDNHckdk8iWH5zEbi-zuIVSF5hbU2jIo47R7lTE,1391
-pip/_vendor/rich/region.py,sha256=rNT9xZrVZTYIXZC0NYn41CJQwYNbR-KecPOxTgQvB8Y,166
-pip/_vendor/rich/repr.py,sha256=Je91CIrZN_av9L3FRCKCs5yoX2LvczrCNKqUbVsjUvQ,4449
-pip/_vendor/rich/rule.py,sha256=cPK6NYo4kzh-vM_8a-rXajXplsbaHa6ahErYvGSsrJ0,4197
-pip/_vendor/rich/scope.py,sha256=HX13XsJfqzQHpPfw4Jn9JmJjCsRj9uhHxXQEqjkwyLA,2842
-pip/_vendor/rich/screen.py,sha256=YoeReESUhx74grqb0mSSb9lghhysWmFHYhsbMVQjXO8,1591
-pip/_vendor/rich/segment.py,sha256=dfWPgAsF15BaP5kxr7Qvq_JHVYAguB4aVUlWm3Fq8vY,23890
-pip/_vendor/rich/spinner.py,sha256=V6dW0jIk5IO0_2MyxyftQf5VjCHI0T2cRhJ4F31hPIQ,4312
-pip/_vendor/rich/status.py,sha256=gJsIXIZeSo3urOyxRUjs6VrhX5CZrA0NxIQ-dxhCnwo,4425
-pip/_vendor/rich/style.py,sha256=AD1I7atfclsFCtGeL8ronH1Jj-02WLp9ZQ2VYqmpBjM,26469
-pip/_vendor/rich/styled.py,sha256=eZNnzGrI4ki_54pgY3Oj0T-x3lxdXTYh4_ryDB24wBU,1258
-pip/_vendor/rich/syntax.py,sha256=e-IZvu3I9Y-wQuOCcszxwxNgM4BsBZ2veQcezeFo6FI,28731
-pip/_vendor/rich/table.py,sha256=Kv2CO3cDMhU1jRPAktga77JlkXe5x-t_3H_AOJK_H1c,39455
-pip/_vendor/rich/terminal_theme.py,sha256=Iu4pemtVB5ULlHldKmaHd5OlkkPQ30zdQkyCk1yoSUQ,3371
-pip/_vendor/rich/text.py,sha256=7jY1hBB4LF-g6QmuhltffWns8ylkV5w25SOiWBl6_xg,44563
-pip/_vendor/rich/theme.py,sha256=GKNtQhDBZKAzDaY0vQVQQFzbc0uWfFe6CJXA-syT7zQ,3627
-pip/_vendor/rich/themes.py,sha256=0xgTLozfabebYtcJtDdC5QkX5IVUEaviqDUJJh4YVFk,102
-pip/_vendor/rich/traceback.py,sha256=hF34nWNJr8hFOghqPB-yBYXFAViGPx4CFNBAVugrR_g,25944
-pip/_vendor/rich/tree.py,sha256=qCOh1U5-Hxc5yGH-OnaXEYS8aKTwwwD8igWCPmPTvqc,9159
-pip/_vendor/six.py,sha256=TOOfQi7nFGfMrIvtdr6wX4wyHH8M7aknmuLfo2cBBrM,34549
-pip/_vendor/tenacity/__init__.py,sha256=GLLsTFD4Bd5VDgTR6mU_FxyOsrxc48qONorVaRebeD4,18257
-pip/_vendor/tenacity/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/_asyncio.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/_utils.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/after.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/before.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/before_sleep.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/nap.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/retry.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/stop.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/tornadoweb.cpython-39.pyc,,
-pip/_vendor/tenacity/__pycache__/wait.cpython-39.pyc,,
-pip/_vendor/tenacity/_asyncio.py,sha256=HEb0BVJEeBJE9P-m9XBxh1KcaF96BwoeqkJCL5sbVcQ,3314
-pip/_vendor/tenacity/_utils.py,sha256=-y68scDcyoqvTJuJJ0GTfjdSCljEYlbCYvgk7nM4NdM,1944
-pip/_vendor/tenacity/after.py,sha256=dlmyxxFy2uqpLXDr838DiEd7jgv2AGthsWHGYcGYsaI,1496
-pip/_vendor/tenacity/before.py,sha256=7XtvRmO0dRWUp8SVn24OvIiGFj8-4OP5muQRUiWgLh0,1376
-pip/_vendor/tenacity/before_sleep.py,sha256=ThyDvqKU5yle_IvYQz_b6Tp6UjUS0PhVp6zgqYl9U6Y,1908
-pip/_vendor/tenacity/nap.py,sha256=fRWvnz1aIzbIq9Ap3gAkAZgDH6oo5zxMrU6ZOVByq0I,1383
-pip/_vendor/tenacity/retry.py,sha256=62R71W59bQjuNyFKsDM7hE2aEkEPtwNBRA0tnsEvgSk,6645
-pip/_vendor/tenacity/stop.py,sha256=sKHmHaoSaW6sKu3dTxUVKr1-stVkY7lw4Y9yjZU30zQ,2790
-pip/_vendor/tenacity/tornadoweb.py,sha256=E8lWO2nwe6dJgoB-N2HhQprYLDLB_UdSgFnv-EN6wKE,2145
-pip/_vendor/tenacity/wait.py,sha256=e_Saa6I2tsNLpCL1t9897wN2fGb0XQMQlE4bU2t9V2w,6691
-pip/_vendor/tomli/__init__.py,sha256=JhUwV66DB1g4Hvt1UQCVMdfCu-IgAV8FXmvDU9onxd4,396
-pip/_vendor/tomli/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/tomli/__pycache__/_parser.cpython-39.pyc,,
-pip/_vendor/tomli/__pycache__/_re.cpython-39.pyc,,
-pip/_vendor/tomli/__pycache__/_types.cpython-39.pyc,,
-pip/_vendor/tomli/_parser.py,sha256=g9-ENaALS-B8dokYpCuzUFalWlog7T-SIYMjLZSWrtM,22633
-pip/_vendor/tomli/_re.py,sha256=dbjg5ChZT23Ka9z9DHOXfdtSpPwUfdgMXnj8NOoly-w,2943
-pip/_vendor/tomli/_types.py,sha256=-GTG2VUqkpxwMqzmVO4F7ybKddIbAnuAHXfmWQcTi3Q,254
-pip/_vendor/typing_extensions.py,sha256=m4WkE0dv9dPqte7Xqlbdb9I_hVUvziwRnfRhP3uS8Ak,70649
-pip/_vendor/urllib3/__init__.py,sha256=j3yzHIbmW7CS-IKQJ9-PPQf_YKO8EOAey_rMW0UR7us,2763
-pip/_vendor/urllib3/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/_collections.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/_version.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/connection.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/connectionpool.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/exceptions.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/fields.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/filepost.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/poolmanager.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/request.cpython-39.pyc,,
-pip/_vendor/urllib3/__pycache__/response.cpython-39.pyc,,
-pip/_vendor/urllib3/_collections.py,sha256=Rp1mVyBgc_UlAcp6M3at1skJBXR5J43NawRTvW2g_XY,10811
-pip/_vendor/urllib3/_version.py,sha256=WE7GLYd0IVwgk-1gQZ-7jw00bCUYjYTIlcWIk7NOhEM,63
-pip/_vendor/urllib3/connection.py,sha256=mMuCIjdG01kRpFUENwJRoDKmYer7CZO56pfTbBCS7cw,20070
-pip/_vendor/urllib3/connectionpool.py,sha256=qz-ICrW6g4TZVCbDQ8fRe68BMpXkskkR9vAVY9zUWtA,39013
-pip/_vendor/urllib3/contrib/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-pip/_vendor/urllib3/contrib/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/__pycache__/_appengine_environ.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/__pycache__/appengine.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/__pycache__/ntlmpool.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/__pycache__/pyopenssl.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/__pycache__/securetransport.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/__pycache__/socks.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/_appengine_environ.py,sha256=bDbyOEhW2CKLJcQqAKAyrEHN-aklsyHFKq6vF8ZFsmk,957
-pip/_vendor/urllib3/contrib/_securetransport/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-pip/_vendor/urllib3/contrib/_securetransport/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/_securetransport/__pycache__/bindings.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/_securetransport/__pycache__/low_level.cpython-39.pyc,,
-pip/_vendor/urllib3/contrib/_securetransport/bindings.py,sha256=4Xk64qIkPBt09A5q-RIFUuDhNc9mXilVapm7WnYnzRw,17632
-pip/_vendor/urllib3/contrib/_securetransport/low_level.py,sha256=B2JBB2_NRP02xK6DCa1Pa9IuxrPwxzDzZbixQkb7U9M,13922
-pip/_vendor/urllib3/contrib/appengine.py,sha256=lfzpHFmJiO82shClLEm3QB62SYgHWnjpZOH_2JhU5Tc,11034
-pip/_vendor/urllib3/contrib/ntlmpool.py,sha256=ej9gGvfAb2Gt00lafFp45SIoRz-QwrQ4WChm6gQmAlM,4538
-pip/_vendor/urllib3/contrib/pyopenssl.py,sha256=DD4pInv_3OEEGffEFynBoirc8ldR789sLmGSKukzA0E,16900
-pip/_vendor/urllib3/contrib/securetransport.py,sha256=4qUKo7PUV-vVIqXmr2BD-sH7qplB918jiD5eNsRI9vU,34449
-pip/_vendor/urllib3/contrib/socks.py,sha256=aRi9eWXo9ZEb95XUxef4Z21CFlnnjbEiAo9HOseoMt4,7097
-pip/_vendor/urllib3/exceptions.py,sha256=0Mnno3KHTNfXRfY7638NufOPkUb6mXOm-Lqj-4x2w8A,8217
-pip/_vendor/urllib3/fields.py,sha256=kvLDCg_JmH1lLjUUEY_FLS8UhY7hBvDPuVETbY8mdrM,8579
-pip/_vendor/urllib3/filepost.py,sha256=5b_qqgRHVlL7uLtdAYBzBh-GHmU5AfJVt_2N0XS3PeY,2440
-pip/_vendor/urllib3/packages/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-pip/_vendor/urllib3/packages/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/urllib3/packages/__pycache__/six.cpython-39.pyc,,
-pip/_vendor/urllib3/packages/backports/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-pip/_vendor/urllib3/packages/backports/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/urllib3/packages/backports/__pycache__/makefile.cpython-39.pyc,,
-pip/_vendor/urllib3/packages/backports/makefile.py,sha256=nbzt3i0agPVP07jqqgjhaYjMmuAi_W5E0EywZivVO8E,1417
-pip/_vendor/urllib3/packages/six.py,sha256=1LVW7ljqRirFlfExjwl-v1B7vSAUNTmzGMs-qays2zg,34666
-pip/_vendor/urllib3/poolmanager.py,sha256=0KOOJECoeLYVjUHvv-0h4Oq3FFQQ2yb-Fnjkbj8gJO0,19786
-pip/_vendor/urllib3/request.py,sha256=ZFSIqX0C6WizixecChZ3_okyu7BEv0lZu1VT0s6h4SM,5985
-pip/_vendor/urllib3/response.py,sha256=36JUM28H4dHsuCQgIPeN91LNcK8r1wBUJGFLk3ALfJc,28156
-pip/_vendor/urllib3/util/__init__.py,sha256=JEmSmmqqLyaw8P51gUImZh8Gwg9i1zSe-DoqAitn2nc,1155
-pip/_vendor/urllib3/util/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/connection.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/proxy.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/queue.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/request.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/response.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/retry.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/ssl_.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/ssl_match_hostname.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/ssltransport.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/timeout.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/url.cpython-39.pyc,,
-pip/_vendor/urllib3/util/__pycache__/wait.cpython-39.pyc,,
-pip/_vendor/urllib3/util/connection.py,sha256=5Lx2B1PW29KxBn2T0xkN1CBgRBa3gGVJBKoQoRogEVk,4901
-pip/_vendor/urllib3/util/proxy.py,sha256=zUvPPCJrp6dOF0N4GAVbOcl6o-4uXKSrGiTkkr5vUS4,1605
-pip/_vendor/urllib3/util/queue.py,sha256=nRgX8_eX-_VkvxoX096QWoz8Ps0QHUAExILCY_7PncM,498
-pip/_vendor/urllib3/util/request.py,sha256=C0OUt2tcU6LRiQJ7YYNP9GvPrSvl7ziIBekQ-5nlBZk,3997
-pip/_vendor/urllib3/util/response.py,sha256=GJpg3Egi9qaJXRwBh5wv-MNuRWan5BIu40oReoxWP28,3510
-pip/_vendor/urllib3/util/retry.py,sha256=iESg2PvViNdXBRY4MpL4h0kqwOOkHkxmLn1kkhFHPU8,22001
-pip/_vendor/urllib3/util/ssl_.py,sha256=X4-AqW91aYPhPx6-xbf66yHFQKbqqfC_5Zt4WkLX1Hc,17177
-pip/_vendor/urllib3/util/ssl_match_hostname.py,sha256=Ir4cZVEjmAk8gUAIHWSi7wtOO83UCYABY2xFD1Ql_WA,5758
-pip/_vendor/urllib3/util/ssltransport.py,sha256=NA-u5rMTrDFDFC8QzRKUEKMG0561hOD4qBTr3Z4pv6E,6895
-pip/_vendor/urllib3/util/timeout.py,sha256=QSbBUNOB9yh6AnDn61SrLQ0hg5oz0I9-uXEG91AJuIg,10003
-pip/_vendor/urllib3/util/url.py,sha256=QVEzcbHipbXyCWwH6R4K4TR-N8T4LM55WEMwNUTBmLE,14047
-pip/_vendor/urllib3/util/wait.py,sha256=3MUKRSAUJDB2tgco7qRUskW0zXGAWYvRRE4Q1_6xlLs,5404
-pip/_vendor/vendor.txt,sha256=6ghRsqZy1gdYozHzRGFwwvZMmd4LdEl0e8AGnTGR3fo,482
-pip/_vendor/webencodings/__init__.py,sha256=qOBJIuPy_4ByYH6W_bNgJF-qYQ2DoU-dKsDu5yRWCXg,10579
-pip/_vendor/webencodings/__pycache__/__init__.cpython-39.pyc,,
-pip/_vendor/webencodings/__pycache__/labels.cpython-39.pyc,,
-pip/_vendor/webencodings/__pycache__/mklabels.cpython-39.pyc,,
-pip/_vendor/webencodings/__pycache__/tests.cpython-39.pyc,,
-pip/_vendor/webencodings/__pycache__/x_user_defined.cpython-39.pyc,,
-pip/_vendor/webencodings/labels.py,sha256=4AO_KxTddqGtrL9ns7kAPjb0CcN6xsCIxbK37HY9r3E,8979
-pip/_vendor/webencodings/mklabels.py,sha256=GYIeywnpaLnP0GSic8LFWgd0UVvO_l1Nc6YoF-87R_4,1305
-pip/_vendor/webencodings/tests.py,sha256=OtGLyjhNY1fvkW1GvLJ_FV9ZoqC9Anyjr7q3kxTbzNs,6563
-pip/_vendor/webencodings/x_user_defined.py,sha256=yOqWSdmpytGfUgh_Z6JYgDNhoc-BAHyyeeT15Fr42tM,4307
-pip/py.typed,sha256=EBVvvPRTn_eIpz5e5QztSCdrMX7Qwd7VP93RSoIlZ2I,286
diff --git a/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/REQUESTED b/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/REQUESTED
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/WHEEL b/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/WHEEL
deleted file mode 100644
index becc9a6..0000000
--- a/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/WHEEL
+++ /dev/null
@@ -1,5 +0,0 @@
-Wheel-Version: 1.0
-Generator: bdist_wheel (0.37.1)
-Root-Is-Purelib: true
-Tag: py3-none-any
-
diff --git a/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/entry_points.txt b/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/entry_points.txt
deleted file mode 100644
index 9609f72..0000000
--- a/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/entry_points.txt
+++ /dev/null
@@ -1,5 +0,0 @@
-[console_scripts]
-pip = pip._internal.cli.main:main
-pip3 = pip._internal.cli.main:main
-pip3.9 = pip._internal.cli.main:main
-
diff --git a/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/top_level.txt b/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/top_level.txt
deleted file mode 100644
index a1b589e..0000000
--- a/env/lib/python3.9/site-packages/pip-22.1.1.dist-info/top_level.txt
+++ /dev/null
@@ -1 +0,0 @@
-pip
diff --git a/env/lib/python3.9/site-packages/pip/__init__.py b/env/lib/python3.9/site-packages/pip/__init__.py
deleted file mode 100644
index 3ff15c9..0000000
--- a/env/lib/python3.9/site-packages/pip/__init__.py
+++ /dev/null
@@ -1,13 +0,0 @@
-from typing import List, Optional
-
-__version__ = "22.1.1"
-
-
-def main(args: Optional[List[str]] = None) -> int:
- """This is an internal API only meant for use by pip's own console scripts.
-
- For additional details, see https://github.com/pypa/pip/issues/7498.
- """
- from pip._internal.utils.entrypoints import _wrapper
-
- return _wrapper(args)
diff --git a/env/lib/python3.9/site-packages/pip/__main__.py b/env/lib/python3.9/site-packages/pip/__main__.py
deleted file mode 100644
index fe34a7b..0000000
--- a/env/lib/python3.9/site-packages/pip/__main__.py
+++ /dev/null
@@ -1,31 +0,0 @@
-import os
-import sys
-import warnings
-
-# Remove '' and current working directory from the first entry
-# of sys.path, if present, to avoid using the current directory
-# in pip commands check, freeze, install, list and show,
-# when invoked as python -m pip
-if sys.path[0] in ("", os.getcwd()):
- sys.path.pop(0)
-
-# If we are running from a wheel, add the wheel to sys.path
-# This allows usage like: python pip-*.whl/pip install pip-*.whl
-if __package__ == "":
- # __file__ is pip-*.whl/pip/__main__.py
- # first dirname call strips off '/__main__.py', second strips off '/pip'
- # Resulting path is the name of the wheel itself
- # Add that to sys.path so we can import pip
- path = os.path.dirname(os.path.dirname(__file__))
- sys.path.insert(0, path)
-
-if __name__ == "__main__":
- # Work around the error reported in #9540, pending a proper fix.
- # Note: It is essential the warning filter is set *before* importing
- # pip, as the deprecation happens at import time, not runtime.
- warnings.filterwarnings(
- "ignore", category=DeprecationWarning, module=".*packaging\\.version"
- )
- from pip._internal.cli.main import main as _main
-
- sys.exit(_main())
diff --git a/env/lib/python3.9/site-packages/pip/_internal/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/__init__.py
deleted file mode 100644
index 6afb5c6..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/__init__.py
+++ /dev/null
@@ -1,19 +0,0 @@
-from typing import List, Optional
-
-import pip._internal.utils.inject_securetransport # noqa
-from pip._internal.utils import _log
-
-# init_logging() must be called before any call to logging.getLogger()
-# which happens at import of most modules.
-_log.init_logging()
-
-
-def main(args: Optional[List[str]] = None) -> int:
- """This is preserved for old console scripts that may still be referencing
- it.
-
- For additional details, see https://github.com/pypa/pip/issues/7498.
- """
- from pip._internal.utils.entrypoints import _wrapper
-
- return _wrapper(args)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/build_env.py b/env/lib/python3.9/site-packages/pip/_internal/build_env.py
deleted file mode 100644
index ccf2b4b..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/build_env.py
+++ /dev/null
@@ -1,304 +0,0 @@
-"""Build Environment used for isolation during sdist building
-"""
-
-import contextlib
-import logging
-import os
-import pathlib
-import sys
-import textwrap
-import zipfile
-from collections import OrderedDict
-from sysconfig import get_paths
-from types import TracebackType
-from typing import TYPE_CHECKING, Generator, Iterable, List, Optional, Set, Tuple, Type
-
-from pip._vendor.certifi import where
-from pip._vendor.packaging.requirements import Requirement
-from pip._vendor.packaging.version import Version
-
-from pip import __file__ as pip_location
-from pip._internal.cli.spinners import open_spinner
-from pip._internal.locations import get_platlib, get_prefixed_libs, get_purelib
-from pip._internal.metadata import get_default_environment, get_environment
-from pip._internal.utils.subprocess import call_subprocess
-from pip._internal.utils.temp_dir import TempDirectory, tempdir_kinds
-
-if TYPE_CHECKING:
- from pip._internal.index.package_finder import PackageFinder
-
-logger = logging.getLogger(__name__)
-
-
-class _Prefix:
- def __init__(self, path: str) -> None:
- self.path = path
- self.setup = False
- self.bin_dir = get_paths(
- "nt" if os.name == "nt" else "posix_prefix",
- vars={"base": path, "platbase": path},
- )["scripts"]
- self.lib_dirs = get_prefixed_libs(path)
-
-
-@contextlib.contextmanager
-def _create_standalone_pip() -> Generator[str, None, None]:
- """Create a "standalone pip" zip file.
-
- The zip file's content is identical to the currently-running pip.
- It will be used to install requirements into the build environment.
- """
- source = pathlib.Path(pip_location).resolve().parent
-
- # Return the current instance if `source` is not a directory. We can't build
- # a zip from this, and it likely means the instance is already standalone.
- if not source.is_dir():
- yield str(source)
- return
-
- with TempDirectory(kind="standalone-pip") as tmp_dir:
- pip_zip = os.path.join(tmp_dir.path, "__env_pip__.zip")
- kwargs = {}
- if sys.version_info >= (3, 8):
- kwargs["strict_timestamps"] = False
- with zipfile.ZipFile(pip_zip, "w", **kwargs) as zf:
- for child in source.rglob("*"):
- zf.write(child, child.relative_to(source.parent).as_posix())
- yield os.path.join(pip_zip, "pip")
-
-
-class BuildEnvironment:
- """Creates and manages an isolated environment to install build deps"""
-
- def __init__(self) -> None:
- temp_dir = TempDirectory(kind=tempdir_kinds.BUILD_ENV, globally_managed=True)
-
- self._prefixes = OrderedDict(
- (name, _Prefix(os.path.join(temp_dir.path, name)))
- for name in ("normal", "overlay")
- )
-
- self._bin_dirs: List[str] = []
- self._lib_dirs: List[str] = []
- for prefix in reversed(list(self._prefixes.values())):
- self._bin_dirs.append(prefix.bin_dir)
- self._lib_dirs.extend(prefix.lib_dirs)
-
- # Customize site to:
- # - ensure .pth files are honored
- # - prevent access to system site packages
- system_sites = {
- os.path.normcase(site) for site in (get_purelib(), get_platlib())
- }
- self._site_dir = os.path.join(temp_dir.path, "site")
- if not os.path.exists(self._site_dir):
- os.mkdir(self._site_dir)
- with open(
- os.path.join(self._site_dir, "sitecustomize.py"), "w", encoding="utf-8"
- ) as fp:
- fp.write(
- textwrap.dedent(
- """
- import os, site, sys
-
- # First, drop system-sites related paths.
- original_sys_path = sys.path[:]
- known_paths = set()
- for path in {system_sites!r}:
- site.addsitedir(path, known_paths=known_paths)
- system_paths = set(
- os.path.normcase(path)
- for path in sys.path[len(original_sys_path):]
- )
- original_sys_path = [
- path for path in original_sys_path
- if os.path.normcase(path) not in system_paths
- ]
- sys.path = original_sys_path
-
- # Second, add lib directories,
- # ensuring .pth files are processed.
- for path in {lib_dirs!r}:
- assert path not in sys.path
- site.addsitedir(path)
- """
- ).format(system_sites=system_sites, lib_dirs=self._lib_dirs)
- )
-
- def __enter__(self) -> None:
- self._save_env = {
- name: os.environ.get(name, None)
- for name in ("PATH", "PYTHONNOUSERSITE", "PYTHONPATH")
- }
-
- path = self._bin_dirs[:]
- old_path = self._save_env["PATH"]
- if old_path:
- path.extend(old_path.split(os.pathsep))
-
- pythonpath = [self._site_dir]
-
- os.environ.update(
- {
- "PATH": os.pathsep.join(path),
- "PYTHONNOUSERSITE": "1",
- "PYTHONPATH": os.pathsep.join(pythonpath),
- }
- )
-
- def __exit__(
- self,
- exc_type: Optional[Type[BaseException]],
- exc_val: Optional[BaseException],
- exc_tb: Optional[TracebackType],
- ) -> None:
- for varname, old_value in self._save_env.items():
- if old_value is None:
- os.environ.pop(varname, None)
- else:
- os.environ[varname] = old_value
-
- def check_requirements(
- self, reqs: Iterable[str]
- ) -> Tuple[Set[Tuple[str, str]], Set[str]]:
- """Return 2 sets:
- - conflicting requirements: set of (installed, wanted) reqs tuples
- - missing requirements: set of reqs
- """
- missing = set()
- conflicting = set()
- if reqs:
- env = (
- get_environment(self._lib_dirs)
- if hasattr(self, "_lib_dirs")
- else get_default_environment()
- )
- for req_str in reqs:
- req = Requirement(req_str)
- # We're explicitly evaluating with an empty extra value, since build
- # environments are not provided any mechanism to select specific extras.
- if req.marker is not None and not req.marker.evaluate({"extra": ""}):
- continue
- dist = env.get_distribution(req.name)
- if not dist:
- missing.add(req_str)
- continue
- if isinstance(dist.version, Version):
- installed_req_str = f"{req.name}=={dist.version}"
- else:
- installed_req_str = f"{req.name}==={dist.version}"
- if not req.specifier.contains(dist.version, prereleases=True):
- conflicting.add((installed_req_str, req_str))
- # FIXME: Consider direct URL?
- return conflicting, missing
-
- def install_requirements(
- self,
- finder: "PackageFinder",
- requirements: Iterable[str],
- prefix_as_string: str,
- *,
- kind: str,
- ) -> None:
- prefix = self._prefixes[prefix_as_string]
- assert not prefix.setup
- prefix.setup = True
- if not requirements:
- return
- with contextlib.ExitStack() as ctx:
- pip_runnable = ctx.enter_context(_create_standalone_pip())
- self._install_requirements(
- pip_runnable,
- finder,
- requirements,
- prefix,
- kind=kind,
- )
-
- @staticmethod
- def _install_requirements(
- pip_runnable: str,
- finder: "PackageFinder",
- requirements: Iterable[str],
- prefix: _Prefix,
- *,
- kind: str,
- ) -> None:
- args: List[str] = [
- sys.executable,
- pip_runnable,
- "install",
- "--ignore-installed",
- "--no-user",
- "--prefix",
- prefix.path,
- "--no-warn-script-location",
- ]
- if logger.getEffectiveLevel() <= logging.DEBUG:
- args.append("-v")
- for format_control in ("no_binary", "only_binary"):
- formats = getattr(finder.format_control, format_control)
- args.extend(
- (
- "--" + format_control.replace("_", "-"),
- ",".join(sorted(formats or {":none:"})),
- )
- )
-
- index_urls = finder.index_urls
- if index_urls:
- args.extend(["-i", index_urls[0]])
- for extra_index in index_urls[1:]:
- args.extend(["--extra-index-url", extra_index])
- else:
- args.append("--no-index")
- for link in finder.find_links:
- args.extend(["--find-links", link])
-
- for host in finder.trusted_hosts:
- args.extend(["--trusted-host", host])
- if finder.allow_all_prereleases:
- args.append("--pre")
- if finder.prefer_binary:
- args.append("--prefer-binary")
- args.append("--")
- args.extend(requirements)
- extra_environ = {"_PIP_STANDALONE_CERT": where()}
- with open_spinner(f"Installing {kind}") as spinner:
- call_subprocess(
- args,
- command_desc=f"pip subprocess to install {kind}",
- spinner=spinner,
- extra_environ=extra_environ,
- )
-
-
-class NoOpBuildEnvironment(BuildEnvironment):
- """A no-op drop-in replacement for BuildEnvironment"""
-
- def __init__(self) -> None:
- pass
-
- def __enter__(self) -> None:
- pass
-
- def __exit__(
- self,
- exc_type: Optional[Type[BaseException]],
- exc_val: Optional[BaseException],
- exc_tb: Optional[TracebackType],
- ) -> None:
- pass
-
- def cleanup(self) -> None:
- pass
-
- def install_requirements(
- self,
- finder: "PackageFinder",
- requirements: Iterable[str],
- prefix_as_string: str,
- *,
- kind: str,
- ) -> None:
- raise NotImplementedError()
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cache.py b/env/lib/python3.9/site-packages/pip/_internal/cache.py
deleted file mode 100644
index 1d6df22..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cache.py
+++ /dev/null
@@ -1,264 +0,0 @@
-"""Cache Management
-"""
-
-import hashlib
-import json
-import logging
-import os
-from typing import Any, Dict, List, Optional, Set
-
-from pip._vendor.packaging.tags import Tag, interpreter_name, interpreter_version
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.exceptions import InvalidWheelFilename
-from pip._internal.models.format_control import FormatControl
-from pip._internal.models.link import Link
-from pip._internal.models.wheel import Wheel
-from pip._internal.utils.temp_dir import TempDirectory, tempdir_kinds
-from pip._internal.utils.urls import path_to_url
-
-logger = logging.getLogger(__name__)
-
-
-def _hash_dict(d: Dict[str, str]) -> str:
- """Return a stable sha224 of a dictionary."""
- s = json.dumps(d, sort_keys=True, separators=(",", ":"), ensure_ascii=True)
- return hashlib.sha224(s.encode("ascii")).hexdigest()
-
-
-class Cache:
- """An abstract class - provides cache directories for data from links
-
-
- :param cache_dir: The root of the cache.
- :param format_control: An object of FormatControl class to limit
- binaries being read from the cache.
- :param allowed_formats: which formats of files the cache should store.
- ('binary' and 'source' are the only allowed values)
- """
-
- def __init__(
- self, cache_dir: str, format_control: FormatControl, allowed_formats: Set[str]
- ) -> None:
- super().__init__()
- assert not cache_dir or os.path.isabs(cache_dir)
- self.cache_dir = cache_dir or None
- self.format_control = format_control
- self.allowed_formats = allowed_formats
-
- _valid_formats = {"source", "binary"}
- assert self.allowed_formats.union(_valid_formats) == _valid_formats
-
- def _get_cache_path_parts(self, link: Link) -> List[str]:
- """Get parts of part that must be os.path.joined with cache_dir"""
-
- # We want to generate a URL to use as our cache key; we don't want to
- # just re-use the URL because it might have other items in the fragment
- # that we don't care about.
- key_parts = {"url": link.url_without_fragment}
- if link.hash_name is not None and link.hash is not None:
- key_parts[link.hash_name] = link.hash
- if link.subdirectory_fragment:
- key_parts["subdirectory"] = link.subdirectory_fragment
-
- # Include interpreter name, major and minor version in cache key
- # to cope with ill-behaved sdists that build a different wheel
- # depending on the python version their setup.py is being run on,
- # and don't encode the difference in compatibility tags.
- # https://github.com/pypa/pip/issues/7296
- key_parts["interpreter_name"] = interpreter_name()
- key_parts["interpreter_version"] = interpreter_version()
-
- # Encode our key URL with sha224; we use it because it has similar
- # security properties to sha256, but with a shorter total output (and
- # is thus less secure). However, the differences don't matter much
- # for our use case here.
- hashed = _hash_dict(key_parts)
-
- # We want to nest the directories somewhat to prevent having a ton of
- # top-level directories, where we might run out of subdirectories on
- # some filesystems.
- parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]
-
- return parts
-
- def _get_candidates(self, link: Link, canonical_package_name: str) -> List[Any]:
- can_not_cache = not self.cache_dir or not canonical_package_name or not link
- if can_not_cache:
- return []
-
- formats = self.format_control.get_allowed_formats(canonical_package_name)
- if not self.allowed_formats.intersection(formats):
- return []
-
- candidates = []
- path = self.get_path_for_link(link)
- if os.path.isdir(path):
- for candidate in os.listdir(path):
- candidates.append((candidate, path))
- return candidates
-
- def get_path_for_link(self, link: Link) -> str:
- """Return a directory to store cached items in for link."""
- raise NotImplementedError()
-
- def get(
- self,
- link: Link,
- package_name: Optional[str],
- supported_tags: List[Tag],
- ) -> Link:
- """Returns a link to a cached item if it exists, otherwise returns the
- passed link.
- """
- raise NotImplementedError()
-
-
-class SimpleWheelCache(Cache):
- """A cache of wheels for future installs."""
-
- def __init__(self, cache_dir: str, format_control: FormatControl) -> None:
- super().__init__(cache_dir, format_control, {"binary"})
-
- def get_path_for_link(self, link: Link) -> str:
- """Return a directory to store cached wheels for link
-
- Because there are M wheels for any one sdist, we provide a directory
- to cache them in, and then consult that directory when looking up
- cache hits.
-
- We only insert things into the cache if they have plausible version
- numbers, so that we don't contaminate the cache with things that were
- not unique. E.g. ./package might have dozens of installs done for it
- and build a version of 0.0...and if we built and cached a wheel, we'd
- end up using the same wheel even if the source has been edited.
-
- :param link: The link of the sdist for which this will cache wheels.
- """
- parts = self._get_cache_path_parts(link)
- assert self.cache_dir
- # Store wheels within the root cache_dir
- return os.path.join(self.cache_dir, "wheels", *parts)
-
- def get(
- self,
- link: Link,
- package_name: Optional[str],
- supported_tags: List[Tag],
- ) -> Link:
- candidates = []
-
- if not package_name:
- return link
-
- canonical_package_name = canonicalize_name(package_name)
- for wheel_name, wheel_dir in self._get_candidates(link, canonical_package_name):
- try:
- wheel = Wheel(wheel_name)
- except InvalidWheelFilename:
- continue
- if canonicalize_name(wheel.name) != canonical_package_name:
- logger.debug(
- "Ignoring cached wheel %s for %s as it "
- "does not match the expected distribution name %s.",
- wheel_name,
- link,
- package_name,
- )
- continue
- if not wheel.supported(supported_tags):
- # Built for a different python/arch/etc
- continue
- candidates.append(
- (
- wheel.support_index_min(supported_tags),
- wheel_name,
- wheel_dir,
- )
- )
-
- if not candidates:
- return link
-
- _, wheel_name, wheel_dir = min(candidates)
- return Link(path_to_url(os.path.join(wheel_dir, wheel_name)))
-
-
-class EphemWheelCache(SimpleWheelCache):
- """A SimpleWheelCache that creates it's own temporary cache directory"""
-
- def __init__(self, format_control: FormatControl) -> None:
- self._temp_dir = TempDirectory(
- kind=tempdir_kinds.EPHEM_WHEEL_CACHE,
- globally_managed=True,
- )
-
- super().__init__(self._temp_dir.path, format_control)
-
-
-class CacheEntry:
- def __init__(
- self,
- link: Link,
- persistent: bool,
- ):
- self.link = link
- self.persistent = persistent
-
-
-class WheelCache(Cache):
- """Wraps EphemWheelCache and SimpleWheelCache into a single Cache
-
- This Cache allows for graceful degradation: when a link is not found in
- the simple wheel cache, the ephem wheel cache is consulted instead.
- """
-
- def __init__(self, cache_dir: str, format_control: FormatControl) -> None:
- super().__init__(cache_dir, format_control, {"binary"})
- self._wheel_cache = SimpleWheelCache(cache_dir, format_control)
- self._ephem_cache = EphemWheelCache(format_control)
-
- def get_path_for_link(self, link: Link) -> str:
- return self._wheel_cache.get_path_for_link(link)
-
- def get_ephem_path_for_link(self, link: Link) -> str:
- return self._ephem_cache.get_path_for_link(link)
-
- def get(
- self,
- link: Link,
- package_name: Optional[str],
- supported_tags: List[Tag],
- ) -> Link:
- cache_entry = self.get_cache_entry(link, package_name, supported_tags)
- if cache_entry is None:
- return link
- return cache_entry.link
-
- def get_cache_entry(
- self,
- link: Link,
- package_name: Optional[str],
- supported_tags: List[Tag],
- ) -> Optional[CacheEntry]:
- """Returns a CacheEntry with a link to a cached item if it exists or
- None. The cache entry indicates if the item was found in the persistent
- or ephemeral cache.
- """
- retval = self._wheel_cache.get(
- link=link,
- package_name=package_name,
- supported_tags=supported_tags,
- )
- if retval is not link:
- return CacheEntry(retval, persistent=True)
-
- retval = self._ephem_cache.get(
- link=link,
- package_name=package_name,
- supported_tags=supported_tags,
- )
- if retval is not link:
- return CacheEntry(retval, persistent=False)
-
- return None
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/cli/__init__.py
deleted file mode 100644
index e589bb9..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/__init__.py
+++ /dev/null
@@ -1,4 +0,0 @@
-"""Subpackage containing all of pip's command line interface related code
-"""
-
-# This file intentionally does not import submodules
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/autocompletion.py b/env/lib/python3.9/site-packages/pip/_internal/cli/autocompletion.py
deleted file mode 100644
index 226fe84..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/autocompletion.py
+++ /dev/null
@@ -1,171 +0,0 @@
-"""Logic that powers autocompletion installed by ``pip completion``.
-"""
-
-import optparse
-import os
-import sys
-from itertools import chain
-from typing import Any, Iterable, List, Optional
-
-from pip._internal.cli.main_parser import create_main_parser
-from pip._internal.commands import commands_dict, create_command
-from pip._internal.metadata import get_default_environment
-
-
-def autocomplete() -> None:
- """Entry Point for completion of main and subcommand options."""
- # Don't complete if user hasn't sourced bash_completion file.
- if "PIP_AUTO_COMPLETE" not in os.environ:
- return
- cwords = os.environ["COMP_WORDS"].split()[1:]
- cword = int(os.environ["COMP_CWORD"])
- try:
- current = cwords[cword - 1]
- except IndexError:
- current = ""
-
- parser = create_main_parser()
- subcommands = list(commands_dict)
- options = []
-
- # subcommand
- subcommand_name: Optional[str] = None
- for word in cwords:
- if word in subcommands:
- subcommand_name = word
- break
- # subcommand options
- if subcommand_name is not None:
- # special case: 'help' subcommand has no options
- if subcommand_name == "help":
- sys.exit(1)
- # special case: list locally installed dists for show and uninstall
- should_list_installed = not current.startswith("-") and subcommand_name in [
- "show",
- "uninstall",
- ]
- if should_list_installed:
- env = get_default_environment()
- lc = current.lower()
- installed = [
- dist.canonical_name
- for dist in env.iter_installed_distributions(local_only=True)
- if dist.canonical_name.startswith(lc)
- and dist.canonical_name not in cwords[1:]
- ]
- # if there are no dists installed, fall back to option completion
- if installed:
- for dist in installed:
- print(dist)
- sys.exit(1)
-
- should_list_installables = (
- not current.startswith("-") and subcommand_name == "install"
- )
- if should_list_installables:
- for path in auto_complete_paths(current, "path"):
- print(path)
- sys.exit(1)
-
- subcommand = create_command(subcommand_name)
-
- for opt in subcommand.parser.option_list_all:
- if opt.help != optparse.SUPPRESS_HELP:
- for opt_str in opt._long_opts + opt._short_opts:
- options.append((opt_str, opt.nargs))
-
- # filter out previously specified options from available options
- prev_opts = [x.split("=")[0] for x in cwords[1 : cword - 1]]
- options = [(x, v) for (x, v) in options if x not in prev_opts]
- # filter options by current input
- options = [(k, v) for k, v in options if k.startswith(current)]
- # get completion type given cwords and available subcommand options
- completion_type = get_path_completion_type(
- cwords,
- cword,
- subcommand.parser.option_list_all,
- )
- # get completion files and directories if ``completion_type`` is
- # ``<file>``, ``<dir>`` or ``<path>``
- if completion_type:
- paths = auto_complete_paths(current, completion_type)
- options = [(path, 0) for path in paths]
- for option in options:
- opt_label = option[0]
- # append '=' to options which require args
- if option[1] and option[0][:2] == "--":
- opt_label += "="
- print(opt_label)
- else:
- # show main parser options only when necessary
-
- opts = [i.option_list for i in parser.option_groups]
- opts.append(parser.option_list)
- flattened_opts = chain.from_iterable(opts)
- if current.startswith("-"):
- for opt in flattened_opts:
- if opt.help != optparse.SUPPRESS_HELP:
- subcommands += opt._long_opts + opt._short_opts
- else:
- # get completion type given cwords and all available options
- completion_type = get_path_completion_type(cwords, cword, flattened_opts)
- if completion_type:
- subcommands = list(auto_complete_paths(current, completion_type))
-
- print(" ".join([x for x in subcommands if x.startswith(current)]))
- sys.exit(1)
-
-
-def get_path_completion_type(
- cwords: List[str], cword: int, opts: Iterable[Any]
-) -> Optional[str]:
- """Get the type of path completion (``file``, ``dir``, ``path`` or None)
-
-    :param cwords: same as the environment variable ``COMP_WORDS``
-    :param cword: same as the environment variable ``COMP_CWORD``
- :param opts: The available options to check
- :return: path completion type (``file``, ``dir``, ``path`` or None)
- """
- if cword < 2 or not cwords[cword - 2].startswith("-"):
- return None
- for opt in opts:
- if opt.help == optparse.SUPPRESS_HELP:
- continue
- for o in str(opt).split("/"):
- if cwords[cword - 2].split("=")[0] == o:
- if not opt.metavar or any(
- x in ("path", "file", "dir") for x in opt.metavar.split("/")
- ):
- return opt.metavar
- return None
-
-
-def auto_complete_paths(current: str, completion_type: str) -> Iterable[str]:
- """If ``completion_type`` is ``file`` or ``path``, list all regular files
- and directories starting with ``current``; otherwise only list directories
- starting with ``current``.
-
- :param current: The word to be completed
- :param completion_type: path completion type(``file``, ``path`` or ``dir``)
- :return: A generator of regular files and/or directories
- """
- directory, filename = os.path.split(current)
- current_path = os.path.abspath(directory)
- # Don't complete paths if they can't be accessed
- if not os.access(current_path, os.R_OK):
- return
- filename = os.path.normcase(filename)
- # list all files that start with ``filename``
- file_list = (
- x for x in os.listdir(current_path) if os.path.normcase(x).startswith(filename)
- )
- for f in file_list:
- opt = os.path.join(current_path, f)
- comp_file = os.path.normcase(os.path.join(directory, f))
-        # complete regular files when there is not ``<dir>`` after option
-        # complete directories when there is ``<file>``, ``<path>`` or
-        # ``<dir>`` after option
- if completion_type != "dir" and os.path.isfile(opt):
- yield comp_file
- elif os.path.isdir(opt):
- yield os.path.join(comp_file, "")
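Aside: the completion machinery above is driven purely by environment variables. A minimal sketch of that protocol (the partial command line is hypothetical, and this assumes `pip` is on `PATH`):

```python
# Sketch of what the shell function generated by ``pip completion`` does:
# it sets COMP_WORDS/COMP_CWORD, re-invokes pip with PIP_AUTO_COMPLETE=1,
# and splits whatever pip prints into completion candidates.
import os
import subprocess

env = dict(os.environ)
env["PIP_AUTO_COMPLETE"] = "1"
env["COMP_WORDS"] = "pip ins"  # the partial command line being completed
env["COMP_CWORD"] = "1"        # index of the word under the cursor

# pip prints matching subcommands (e.g. "install") and, by design,
# exits with status 1 via sys.exit(1).
result = subprocess.run(["pip"], env=env, capture_output=True, text=True)
print(result.stdout.split())
```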
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/base_command.py b/env/lib/python3.9/site-packages/pip/_internal/cli/base_command.py
deleted file mode 100644
index 0774f26..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/base_command.py
+++ /dev/null
@@ -1,223 +0,0 @@
-"""Base Command class, and related routines"""
-
-import functools
-import logging
-import logging.config
-import optparse
-import os
-import sys
-import traceback
-from optparse import Values
-from typing import Any, Callable, List, Optional, Tuple
-
-from pip._vendor.rich import traceback as rich_traceback
-
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.command_context import CommandContextMixIn
-from pip._internal.cli.parser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
-from pip._internal.cli.status_codes import (
- ERROR,
- PREVIOUS_BUILD_DIR_ERROR,
- UNKNOWN_ERROR,
- VIRTUALENV_NOT_FOUND,
-)
-from pip._internal.exceptions import (
- BadCommand,
- CommandError,
- DiagnosticPipError,
- InstallationError,
- NetworkConnectionError,
- PreviousBuildDirError,
- UninstallationError,
-)
-from pip._internal.utils.filesystem import check_path_owner
-from pip._internal.utils.logging import BrokenStdoutLoggingError, setup_logging
-from pip._internal.utils.misc import get_prog, normalize_path
-from pip._internal.utils.temp_dir import TempDirectoryTypeRegistry as TempDirRegistry
-from pip._internal.utils.temp_dir import global_tempdir_manager, tempdir_registry
-from pip._internal.utils.virtualenv import running_under_virtualenv
-
-__all__ = ["Command"]
-
-logger = logging.getLogger(__name__)
-
-
-class Command(CommandContextMixIn):
- usage: str = ""
- ignore_require_venv: bool = False
-
- def __init__(self, name: str, summary: str, isolated: bool = False) -> None:
- super().__init__()
-
- self.name = name
- self.summary = summary
- self.parser = ConfigOptionParser(
- usage=self.usage,
- prog=f"{get_prog()} {name}",
- formatter=UpdatingDefaultsHelpFormatter(),
- add_help_option=False,
- name=name,
- description=self.__doc__,
- isolated=isolated,
- )
-
- self.tempdir_registry: Optional[TempDirRegistry] = None
-
- # Commands should add options to this option group
- optgroup_name = f"{self.name.capitalize()} Options"
- self.cmd_opts = optparse.OptionGroup(self.parser, optgroup_name)
-
- # Add the general options
- gen_opts = cmdoptions.make_option_group(
- cmdoptions.general_group,
- self.parser,
- )
- self.parser.add_option_group(gen_opts)
-
- self.add_options()
-
- def add_options(self) -> None:
- pass
-
- def handle_pip_version_check(self, options: Values) -> None:
- """
- This is a no-op so that commands by default do not do the pip version
- check.
- """
- # Make sure we do the pip version check if the index_group options
- # are present.
- assert not hasattr(options, "no_index")
-
- def run(self, options: Values, args: List[str]) -> int:
- raise NotImplementedError
-
- def parse_args(self, args: List[str]) -> Tuple[Values, List[str]]:
- # factored out for testability
- return self.parser.parse_args(args)
-
- def main(self, args: List[str]) -> int:
- try:
- with self.main_context():
- return self._main(args)
- finally:
- logging.shutdown()
-
- def _main(self, args: List[str]) -> int:
- # We must initialize this before the tempdir manager, otherwise the
- # configuration would not be accessible by the time we clean up the
- # tempdir manager.
- self.tempdir_registry = self.enter_context(tempdir_registry())
- # Intentionally set as early as possible so globally-managed temporary
- # directories are available to the rest of the code.
- self.enter_context(global_tempdir_manager())
-
- options, args = self.parse_args(args)
-
- # Set verbosity so that it can be used elsewhere.
- self.verbosity = options.verbose - options.quiet
-
- level_number = setup_logging(
- verbosity=self.verbosity,
- no_color=options.no_color,
- user_log_file=options.log,
- )
-
- # TODO: Try to get these passing down from the command?
- # without resorting to os.environ to hold these.
- # This also affects isolated builds and it should.
-
- if options.no_input:
- os.environ["PIP_NO_INPUT"] = "1"
-
- if options.exists_action:
- os.environ["PIP_EXISTS_ACTION"] = " ".join(options.exists_action)
-
- if options.require_venv and not self.ignore_require_venv:
- # If a venv is required check if it can really be found
- if not running_under_virtualenv():
- logger.critical("Could not find an activated virtualenv (required).")
- sys.exit(VIRTUALENV_NOT_FOUND)
-
- if options.cache_dir:
- options.cache_dir = normalize_path(options.cache_dir)
- if not check_path_owner(options.cache_dir):
- logger.warning(
- "The directory '%s' or its parent directory is not owned "
- "or is not writable by the current user. The cache "
- "has been disabled. Check the permissions and owner of "
- "that directory. If executing pip with sudo, you should "
- "use sudo's -H flag.",
- options.cache_dir,
- )
- options.cache_dir = None
-
- if "2020-resolver" in options.features_enabled:
- logger.warning(
- "--use-feature=2020-resolver no longer has any effect, "
- "since it is now the default dependency resolver in pip. "
- "This will become an error in pip 21.0."
- )
-
- def intercepts_unhandled_exc(
- run_func: Callable[..., int]
- ) -> Callable[..., int]:
- @functools.wraps(run_func)
- def exc_logging_wrapper(*args: Any) -> int:
- try:
- status = run_func(*args)
- assert isinstance(status, int)
- return status
- except DiagnosticPipError as exc:
- logger.error("[present-rich] %s", exc)
- logger.debug("Exception information:", exc_info=True)
-
- return ERROR
- except PreviousBuildDirError as exc:
- logger.critical(str(exc))
- logger.debug("Exception information:", exc_info=True)
-
- return PREVIOUS_BUILD_DIR_ERROR
- except (
- InstallationError,
- UninstallationError,
- BadCommand,
- NetworkConnectionError,
- ) as exc:
- logger.critical(str(exc))
- logger.debug("Exception information:", exc_info=True)
-
- return ERROR
- except CommandError as exc:
- logger.critical("%s", exc)
- logger.debug("Exception information:", exc_info=True)
-
- return ERROR
- except BrokenStdoutLoggingError:
- # Bypass our logger and write any remaining messages to
- # stderr because stdout no longer works.
- print("ERROR: Pipe to stdout was broken", file=sys.stderr)
- if level_number <= logging.DEBUG:
- traceback.print_exc(file=sys.stderr)
-
- return ERROR
- except KeyboardInterrupt:
- logger.critical("Operation cancelled by user")
- logger.debug("Exception information:", exc_info=True)
-
- return ERROR
- except BaseException:
- logger.critical("Exception:", exc_info=True)
-
- return UNKNOWN_ERROR
-
- return exc_logging_wrapper
-
- try:
- if not options.debug_mode:
- run = intercepts_unhandled_exc(self.run)
- else:
- run = self.run
- rich_traceback.install(show_locals=True)
- return run(options, args)
- finally:
- self.handle_pip_version_check(options)
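For orientation, a minimal hypothetical subclass sketches the contract `Command` expects: options are declared in `add_options()` against `self.cmd_opts`, and `run()` returns a status code. `GreetCommand` and its `--name` option are illustrative, not part of pip:

```python
from optparse import Values
from typing import List

from pip._internal.cli.base_command import Command
from pip._internal.cli.status_codes import SUCCESS


class GreetCommand(Command):
    """Print a greeting."""  # __doc__ becomes the parser description

    usage = "%prog [options]"

    def add_options(self) -> None:
        self.cmd_opts.add_option(
            "--name", dest="name", default="world", help="Who to greet."
        )
        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options: Values, args: List[str]) -> int:
        print(f"Hello, {options.name}!")
        return SUCCESS


# GreetCommand("greet", summary="Say hello").main(["--name", "pip"]) would
# parse options, set up logging, and invoke run() inside main_context().
```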
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/cmdoptions.py b/env/lib/python3.9/site-packages/pip/_internal/cli/cmdoptions.py
deleted file mode 100644
index e96b8e5..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/cmdoptions.py
+++ /dev/null
@@ -1,1064 +0,0 @@
-"""
-shared options and groups
-
-The principle here is to define options once, but *not* instantiate them
-globally. One reason is that options with action='append' can carry state
-between parses. pip parses general options twice internally, and shouldn't
-pass on state. To be consistent, all options will follow this design.
-"""
-
-# The following comment should be removed at some point in the future.
-# mypy: strict-optional=False
-
-import importlib.util
-import logging
-import os
-import textwrap
-from functools import partial
-from optparse import SUPPRESS_HELP, Option, OptionGroup, OptionParser, Values
-from textwrap import dedent
-from typing import Any, Callable, Dict, Optional, Tuple
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.cli.parser import ConfigOptionParser
-from pip._internal.exceptions import CommandError
-from pip._internal.locations import USER_CACHE_DIR, get_src_prefix
-from pip._internal.models.format_control import FormatControl
-from pip._internal.models.index import PyPI
-from pip._internal.models.target_python import TargetPython
-from pip._internal.utils.hashes import STRONG_HASHES
-from pip._internal.utils.misc import strtobool
-
-logger = logging.getLogger(__name__)
-
-
-def raise_option_error(parser: OptionParser, option: Option, msg: str) -> None:
- """
- Raise an option parsing error using parser.error().
-
- Args:
- parser: an OptionParser instance.
- option: an Option instance.
- msg: the error text.
- """
- msg = f"{option} error: {msg}"
- msg = textwrap.fill(" ".join(msg.split()))
- parser.error(msg)
-
-
-def make_option_group(group: Dict[str, Any], parser: ConfigOptionParser) -> OptionGroup:
- """
- Return an OptionGroup object
- group -- assumed to be dict with 'name' and 'options' keys
- parser -- an optparse Parser
- """
- option_group = OptionGroup(parser, group["name"])
- for option in group["options"]:
- option_group.add_option(option())
- return option_group
-
-
-def check_install_build_global(
- options: Values, check_options: Optional[Values] = None
-) -> None:
- """Disable wheels if per-setup.py call options are set.
-
- :param options: The OptionParser options to update.
-    :param check_options: The options to check; if not supplied, defaults to
- options.
- """
- if check_options is None:
- check_options = options
-
- def getname(n: str) -> Optional[Any]:
- return getattr(check_options, n, None)
-
- names = ["build_options", "global_options", "install_options"]
- if any(map(getname, names)):
- control = options.format_control
- control.disallow_binaries()
- logger.warning(
- "Disabling all use of wheels due to the use of --build-option "
- "/ --global-option / --install-option.",
- )
-
-
-def check_dist_restriction(options: Values, check_target: bool = False) -> None:
- """Function for determining if custom platform options are allowed.
-
- :param options: The OptionParser options.
- :param check_target: Whether or not to check if --target is being used.
- """
- dist_restriction_set = any(
- [
- options.python_version,
- options.platforms,
- options.abis,
- options.implementation,
- ]
- )
-
- binary_only = FormatControl(set(), {":all:"})
- sdist_dependencies_allowed = (
- options.format_control != binary_only and not options.ignore_dependencies
- )
-
- # Installations or downloads using dist restrictions must not combine
- # source distributions and dist-specific wheels, as they are not
- # guaranteed to be locally compatible.
- if dist_restriction_set and sdist_dependencies_allowed:
- raise CommandError(
- "When restricting platform and interpreter constraints using "
- "--python-version, --platform, --abi, or --implementation, "
- "either --no-deps must be set, or --only-binary=:all: must be "
- "set and --no-binary must not be set (or must be set to "
- ":none:)."
- )
-
- if check_target:
- if dist_restriction_set and not options.target_dir:
- raise CommandError(
- "Can not use any platform or abi specific options unless "
- "installing via '--target'"
- )
-
-
-def _path_option_check(option: Option, opt: str, value: str) -> str:
- return os.path.expanduser(value)
-
-
-def _package_name_option_check(option: Option, opt: str, value: str) -> str:
- return canonicalize_name(value)
-
-
-class PipOption(Option):
- TYPES = Option.TYPES + ("path", "package_name")
- TYPE_CHECKER = Option.TYPE_CHECKER.copy()
- TYPE_CHECKER["package_name"] = _package_name_option_check
- TYPE_CHECKER["path"] = _path_option_check
-
-
-###########
-# options #
-###########
-
-help_: Callable[..., Option] = partial(
- Option,
- "-h",
- "--help",
- dest="help",
- action="help",
- help="Show help.",
-)
-
-debug_mode: Callable[..., Option] = partial(
- Option,
- "--debug",
- dest="debug_mode",
- action="store_true",
- default=False,
- help=(
- "Let unhandled exceptions propagate outside the main subroutine, "
- "instead of logging them to stderr."
- ),
-)
-
-isolated_mode: Callable[..., Option] = partial(
- Option,
- "--isolated",
- dest="isolated_mode",
- action="store_true",
- default=False,
- help=(
- "Run pip in an isolated mode, ignoring environment variables and user "
- "configuration."
- ),
-)
-
-require_virtualenv: Callable[..., Option] = partial(
- Option,
- "--require-virtualenv",
- "--require-venv",
- dest="require_venv",
- action="store_true",
- default=False,
- help=(
- "Allow pip to only run in a virtual environment; "
- "exit with an error otherwise."
- ),
-)
-
-verbose: Callable[..., Option] = partial(
- Option,
- "-v",
- "--verbose",
- dest="verbose",
- action="count",
- default=0,
- help="Give more output. Option is additive, and can be used up to 3 times.",
-)
-
-no_color: Callable[..., Option] = partial(
- Option,
- "--no-color",
- dest="no_color",
- action="store_true",
- default=False,
- help="Suppress colored output.",
-)
-
-version: Callable[..., Option] = partial(
- Option,
- "-V",
- "--version",
- dest="version",
- action="store_true",
- help="Show version and exit.",
-)
-
-quiet: Callable[..., Option] = partial(
- Option,
- "-q",
- "--quiet",
- dest="quiet",
- action="count",
- default=0,
- help=(
- "Give less output. Option is additive, and can be used up to 3"
- " times (corresponding to WARNING, ERROR, and CRITICAL logging"
- " levels)."
- ),
-)
-
-progress_bar: Callable[..., Option] = partial(
- Option,
- "--progress-bar",
- dest="progress_bar",
- type="choice",
- choices=["on", "off"],
- default="on",
- help="Specify whether the progress bar should be used [on, off] (default: on)",
-)
-
-log: Callable[..., Option] = partial(
- PipOption,
- "--log",
- "--log-file",
- "--local-log",
- dest="log",
- metavar="path",
- type="path",
- help="Path to a verbose appending log.",
-)
-
-no_input: Callable[..., Option] = partial(
- Option,
- # Don't ask for input
- "--no-input",
- dest="no_input",
- action="store_true",
- default=False,
- help="Disable prompting for input.",
-)
-
-proxy: Callable[..., Option] = partial(
- Option,
- "--proxy",
- dest="proxy",
- type="str",
- default="",
- help="Specify a proxy in the form scheme://[user:passwd@]proxy.server:port.",
-)
-
-retries: Callable[..., Option] = partial(
- Option,
- "--retries",
- dest="retries",
- type="int",
- default=5,
- help="Maximum number of retries each connection should attempt "
- "(default %default times).",
-)
-
-timeout: Callable[..., Option] = partial(
- Option,
- "--timeout",
- "--default-timeout",
- metavar="sec",
- dest="timeout",
- type="float",
- default=15,
- help="Set the socket timeout (default %default seconds).",
-)
-
-
-def exists_action() -> Option:
- return Option(
- # Option when path already exist
- "--exists-action",
- dest="exists_action",
- type="choice",
- choices=["s", "i", "w", "b", "a"],
- default=[],
- action="append",
- metavar="action",
- help="Default action when a path already exists: "
- "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.",
- )
-
-
-cert: Callable[..., Option] = partial(
- PipOption,
- "--cert",
- dest="cert",
- type="path",
- metavar="path",
- help=(
- "Path to PEM-encoded CA certificate bundle. "
- "If provided, overrides the default. "
- "See 'SSL Certificate Verification' in pip documentation "
- "for more information."
- ),
-)
-
-client_cert: Callable[..., Option] = partial(
- PipOption,
- "--client-cert",
- dest="client_cert",
- type="path",
- default=None,
- metavar="path",
- help="Path to SSL client certificate, a single file containing the "
- "private key and the certificate in PEM format.",
-)
-
-index_url: Callable[..., Option] = partial(
- Option,
- "-i",
- "--index-url",
- "--pypi-url",
- dest="index_url",
- metavar="URL",
- default=PyPI.simple_url,
- help="Base URL of the Python Package Index (default %default). "
- "This should point to a repository compliant with PEP 503 "
- "(the simple repository API) or a local directory laid out "
- "in the same format.",
-)
-
-
-def extra_index_url() -> Option:
- return Option(
- "--extra-index-url",
- dest="extra_index_urls",
- metavar="URL",
- action="append",
- default=[],
- help="Extra URLs of package indexes to use in addition to "
- "--index-url. Should follow the same rules as "
- "--index-url.",
- )
-
-
-no_index: Callable[..., Option] = partial(
- Option,
- "--no-index",
- dest="no_index",
- action="store_true",
- default=False,
- help="Ignore package index (only looking at --find-links URLs instead).",
-)
-
-
-def find_links() -> Option:
- return Option(
- "-f",
- "--find-links",
- dest="find_links",
- action="append",
- default=[],
- metavar="url",
- help="If a URL or path to an html file, then parse for links to "
- "archives such as sdist (.tar.gz) or wheel (.whl) files. "
- "If a local path or file:// URL that's a directory, "
- "then look for archives in the directory listing. "
- "Links to VCS project URLs are not supported.",
- )
-
-
-def trusted_host() -> Option:
- return Option(
- "--trusted-host",
- dest="trusted_hosts",
- action="append",
- metavar="HOSTNAME",
- default=[],
- help="Mark this host or host:port pair as trusted, even though it "
- "does not have valid or any HTTPS.",
- )
-
-
-def constraints() -> Option:
- return Option(
- "-c",
- "--constraint",
- dest="constraints",
- action="append",
- default=[],
- metavar="file",
- help="Constrain versions using the given constraints file. "
- "This option can be used multiple times.",
- )
-
-
-def requirements() -> Option:
- return Option(
- "-r",
- "--requirement",
- dest="requirements",
- action="append",
- default=[],
- metavar="file",
- help="Install from the given requirements file. "
- "This option can be used multiple times.",
- )
-
-
-def editable() -> Option:
- return Option(
- "-e",
- "--editable",
- dest="editables",
- action="append",
- default=[],
- metavar="path/url",
- help=(
- "Install a project in editable mode (i.e. setuptools "
- '"develop mode") from a local project path or a VCS url.'
- ),
- )
-
-
-def _handle_src(option: Option, opt_str: str, value: str, parser: OptionParser) -> None:
- value = os.path.abspath(value)
- setattr(parser.values, option.dest, value)
-
-
-src: Callable[..., Option] = partial(
- PipOption,
- "--src",
- "--source",
- "--source-dir",
- "--source-directory",
- dest="src_dir",
- type="path",
- metavar="dir",
- default=get_src_prefix(),
- action="callback",
- callback=_handle_src,
- help="Directory to check out editable projects into. "
-    'The default in a virtualenv is "<venv path>/src". '
-    'The default for global installs is "<current dir>/src".',
-)
-
-
-def _get_format_control(values: Values, option: Option) -> Any:
- """Get a format_control object."""
- return getattr(values, option.dest)
-
-
-def _handle_no_binary(
- option: Option, opt_str: str, value: str, parser: OptionParser
-) -> None:
- existing = _get_format_control(parser.values, option)
- FormatControl.handle_mutual_excludes(
- value,
- existing.no_binary,
- existing.only_binary,
- )
-
-
-def _handle_only_binary(
- option: Option, opt_str: str, value: str, parser: OptionParser
-) -> None:
- existing = _get_format_control(parser.values, option)
- FormatControl.handle_mutual_excludes(
- value,
- existing.only_binary,
- existing.no_binary,
- )
-
-
-def no_binary() -> Option:
- format_control = FormatControl(set(), set())
- return Option(
- "--no-binary",
- dest="format_control",
- action="callback",
- callback=_handle_no_binary,
- type="str",
- default=format_control,
- help="Do not use binary packages. Can be supplied multiple times, and "
- 'each time adds to the existing value. Accepts either ":all:" to '
- 'disable all binary packages, ":none:" to empty the set (notice '
- "the colons), or one or more package names with commas between "
- "them (no colons). Note that some packages are tricky to compile "
- "and may fail to install when this option is used on them.",
- )
-
-
-def only_binary() -> Option:
- format_control = FormatControl(set(), set())
- return Option(
- "--only-binary",
- dest="format_control",
- action="callback",
- callback=_handle_only_binary,
- type="str",
- default=format_control,
- help="Do not use source packages. Can be supplied multiple times, and "
- 'each time adds to the existing value. Accepts either ":all:" to '
- 'disable all source packages, ":none:" to empty the set, or one '
- "or more package names with commas between them. Packages "
- "without binary distributions will fail to install when this "
- "option is used on them.",
- )
-
-
-platforms: Callable[..., Option] = partial(
- Option,
- "--platform",
- dest="platforms",
- metavar="platform",
- action="append",
- default=None,
- help=(
- "Only use wheels compatible with . Defaults to the "
- "platform of the running system. Use this option multiple times to "
- "specify multiple platforms supported by the target interpreter."
- ),
-)
-
-
-# This was made a separate function for unit-testing purposes.
-def _convert_python_version(value: str) -> Tuple[Tuple[int, ...], Optional[str]]:
- """
- Convert a version string like "3", "37", or "3.7.3" into a tuple of ints.
-
- :return: A 2-tuple (version_info, error_msg), where `error_msg` is
- non-None if and only if there was a parsing error.
- """
- if not value:
- # The empty string is the same as not providing a value.
- return (None, None)
-
- parts = value.split(".")
- if len(parts) > 3:
- return ((), "at most three version parts are allowed")
-
- if len(parts) == 1:
- # Then we are in the case of "3" or "37".
- value = parts[0]
- if len(value) > 1:
- parts = [value[0], value[1:]]
-
- try:
- version_info = tuple(int(part) for part in parts)
- except ValueError:
- return ((), "each version part must be an integer")
-
- return (version_info, None)
-
-
-def _handle_python_version(
- option: Option, opt_str: str, value: str, parser: OptionParser
-) -> None:
- """
- Handle a provided --python-version value.
- """
- version_info, error_msg = _convert_python_version(value)
- if error_msg is not None:
- msg = "invalid --python-version value: {!r}: {}".format(
- value,
- error_msg,
- )
- raise_option_error(parser, option=option, msg=msg)
-
- parser.values.python_version = version_info
-
-
-python_version: Callable[..., Option] = partial(
- Option,
- "--python-version",
- dest="python_version",
- metavar="python_version",
- action="callback",
- callback=_handle_python_version,
- type="str",
- default=None,
- help=dedent(
- """\
- The Python interpreter version to use for wheel and "Requires-Python"
- compatibility checks. Defaults to a version derived from the running
- interpreter. The version can be specified using up to three dot-separated
- integers (e.g. "3" for 3.0.0, "3.7" for 3.7.0, or "3.7.3"). A major-minor
- version can also be given as a string without dots (e.g. "37" for 3.7.0).
- """
- ),
-)
-
-
-implementation: Callable[..., Option] = partial(
- Option,
- "--implementation",
- dest="implementation",
- metavar="implementation",
- default=None,
- help=(
- "Only use wheels compatible with Python "
- "implementation , e.g. 'pp', 'jy', 'cp', "
- " or 'ip'. If not specified, then the current "
- "interpreter implementation is used. Use 'py' to force "
- "implementation-agnostic wheels."
- ),
-)
-
-
-abis: Callable[..., Option] = partial(
- Option,
- "--abi",
- dest="abis",
- metavar="abi",
- action="append",
- default=None,
- help=(
- "Only use wheels compatible with Python abi , e.g. 'pypy_41'. "
- "If not specified, then the current interpreter abi tag is used. "
- "Use this option multiple times to specify multiple abis supported "
- "by the target interpreter. Generally you will need to specify "
- "--implementation, --platform, and --python-version when using this "
- "option."
- ),
-)
-
-
-def add_target_python_options(cmd_opts: OptionGroup) -> None:
- cmd_opts.add_option(platforms())
- cmd_opts.add_option(python_version())
- cmd_opts.add_option(implementation())
- cmd_opts.add_option(abis())
-
-
-def make_target_python(options: Values) -> TargetPython:
- target_python = TargetPython(
- platforms=options.platforms,
- py_version_info=options.python_version,
- abis=options.abis,
- implementation=options.implementation,
- )
-
- return target_python
-
-
-def prefer_binary() -> Option:
- return Option(
- "--prefer-binary",
- dest="prefer_binary",
- action="store_true",
- default=False,
- help="Prefer older binary packages over newer source packages.",
- )
-
-
-cache_dir: Callable[..., Option] = partial(
- PipOption,
- "--cache-dir",
- dest="cache_dir",
- default=USER_CACHE_DIR,
- metavar="dir",
- type="path",
- help="Store the cache data in .",
-)
-
-
-def _handle_no_cache_dir(
- option: Option, opt: str, value: str, parser: OptionParser
-) -> None:
- """
- Process a value provided for the --no-cache-dir option.
-
- This is an optparse.Option callback for the --no-cache-dir option.
- """
- # The value argument will be None if --no-cache-dir is passed via the
- # command-line, since the option doesn't accept arguments. However,
- # the value can be non-None if the option is triggered e.g. by an
- # environment variable, like PIP_NO_CACHE_DIR=true.
- if value is not None:
- # Then parse the string value to get argument error-checking.
- try:
- strtobool(value)
- except ValueError as exc:
- raise_option_error(parser, option=option, msg=str(exc))
-
- # Originally, setting PIP_NO_CACHE_DIR to a value that strtobool()
- # converted to 0 (like "false" or "no") caused cache_dir to be disabled
- # rather than enabled (logic would say the latter). Thus, we disable
- # the cache directory not just on values that parse to True, but (for
- # backwards compatibility reasons) also on values that parse to False.
- # In other words, always set it to False if the option is provided in
- # some (valid) form.
- parser.values.cache_dir = False
-
-
-no_cache: Callable[..., Option] = partial(
- Option,
- "--no-cache-dir",
- dest="cache_dir",
- action="callback",
- callback=_handle_no_cache_dir,
- help="Disable the cache.",
-)
-
-no_deps: Callable[..., Option] = partial(
- Option,
- "--no-deps",
- "--no-dependencies",
- dest="ignore_dependencies",
- action="store_true",
- default=False,
- help="Don't install package dependencies.",
-)
-
-ignore_requires_python: Callable[..., Option] = partial(
- Option,
- "--ignore-requires-python",
- dest="ignore_requires_python",
- action="store_true",
- help="Ignore the Requires-Python information.",
-)
-
-no_build_isolation: Callable[..., Option] = partial(
- Option,
- "--no-build-isolation",
- dest="build_isolation",
- action="store_false",
- default=True,
- help="Disable isolation when building a modern source distribution. "
- "Build dependencies specified by PEP 518 must be already installed "
- "if this option is used.",
-)
-
-check_build_deps: Callable[..., Option] = partial(
- Option,
- "--check-build-dependencies",
- dest="check_build_deps",
- action="store_true",
- default=False,
- help="Check the build dependencies when PEP517 is used.",
-)
-
-
-def _handle_no_use_pep517(
- option: Option, opt: str, value: str, parser: OptionParser
-) -> None:
- """
- Process a value provided for the --no-use-pep517 option.
-
- This is an optparse.Option callback for the no_use_pep517 option.
- """
- # Since --no-use-pep517 doesn't accept arguments, the value argument
- # will be None if --no-use-pep517 is passed via the command-line.
- # However, the value can be non-None if the option is triggered e.g.
- # by an environment variable, for example "PIP_NO_USE_PEP517=true".
- if value is not None:
- msg = """A value was passed for --no-use-pep517,
- probably using either the PIP_NO_USE_PEP517 environment variable
- or the "no-use-pep517" config file option. Use an appropriate value
- of the PIP_USE_PEP517 environment variable or the "use-pep517"
- config file option instead.
- """
- raise_option_error(parser, option=option, msg=msg)
-
- # If user doesn't wish to use pep517, we check if setuptools is installed
- # and raise error if it is not.
- if not importlib.util.find_spec("setuptools"):
- msg = "It is not possible to use --no-use-pep517 without setuptools installed."
- raise_option_error(parser, option=option, msg=msg)
-
- # Otherwise, --no-use-pep517 was passed via the command-line.
- parser.values.use_pep517 = False
-
-
-use_pep517: Any = partial(
- Option,
- "--use-pep517",
- dest="use_pep517",
- action="store_true",
- default=None,
- help="Use PEP 517 for building source distributions "
- "(use --no-use-pep517 to force legacy behaviour).",
-)
-
-no_use_pep517: Any = partial(
- Option,
- "--no-use-pep517",
- dest="use_pep517",
- action="callback",
- callback=_handle_no_use_pep517,
- default=None,
- help=SUPPRESS_HELP,
-)
-
-
-def _handle_config_settings(
- option: Option, opt_str: str, value: str, parser: OptionParser
-) -> None:
- key, sep, val = value.partition("=")
- if sep != "=":
- parser.error(f"Arguments to {opt_str} must be of the form KEY=VAL") # noqa
- dest = getattr(parser.values, option.dest)
- if dest is None:
- dest = {}
- setattr(parser.values, option.dest, dest)
- dest[key] = val
-
-
-config_settings: Callable[..., Option] = partial(
- Option,
- "--config-settings",
- dest="config_settings",
- type=str,
- action="callback",
- callback=_handle_config_settings,
- metavar="settings",
- help="Configuration settings to be passed to the PEP 517 build backend. "
- "Settings take the form KEY=VALUE. Use multiple --config-settings options "
- "to pass multiple keys to the backend.",
-)
-
-install_options: Callable[..., Option] = partial(
- Option,
- "--install-option",
- dest="install_options",
- action="append",
- metavar="options",
- help="Extra arguments to be supplied to the setup.py install "
- 'command (use like --install-option="--install-scripts=/usr/local/'
- 'bin"). Use multiple --install-option options to pass multiple '
- "options to setup.py install. If you are using an option with a "
- "directory path, be sure to use absolute path.",
-)
-
-build_options: Callable[..., Option] = partial(
- Option,
- "--build-option",
- dest="build_options",
- metavar="options",
- action="append",
- help="Extra arguments to be supplied to 'setup.py bdist_wheel'.",
-)
-
-global_options: Callable[..., Option] = partial(
- Option,
- "--global-option",
- dest="global_options",
- action="append",
- metavar="options",
- help="Extra global options to be supplied to the setup.py "
- "call before the install or bdist_wheel command.",
-)
-
-no_clean: Callable[..., Option] = partial(
- Option,
- "--no-clean",
- action="store_true",
- default=False,
- help="Don't clean up build directories.",
-)
-
-pre: Callable[..., Option] = partial(
- Option,
- "--pre",
- action="store_true",
- default=False,
- help="Include pre-release and development versions. By default, "
- "pip only finds stable versions.",
-)
-
-disable_pip_version_check: Callable[..., Option] = partial(
- Option,
- "--disable-pip-version-check",
- dest="disable_pip_version_check",
- action="store_true",
- default=False,
- help="Don't periodically check PyPI to determine whether a new version "
- "of pip is available for download. Implied with --no-index.",
-)
-
-root_user_action: Callable[..., Option] = partial(
- Option,
- "--root-user-action",
- dest="root_user_action",
- default="warn",
- choices=["warn", "ignore"],
- help="Action if pip is run as a root user. By default, a warning message is shown.",
-)
-
-
-def _handle_merge_hash(
- option: Option, opt_str: str, value: str, parser: OptionParser
-) -> None:
- """Given a value spelled "algo:digest", append the digest to a list
- pointed to in a dict by the algo name."""
- if not parser.values.hashes:
- parser.values.hashes = {}
- try:
- algo, digest = value.split(":", 1)
- except ValueError:
- parser.error(
- "Arguments to {} must be a hash name " # noqa
- "followed by a value, like --hash=sha256:"
- "abcde...".format(opt_str)
- )
- if algo not in STRONG_HASHES:
- parser.error(
- "Allowed hash algorithms for {} are {}.".format( # noqa
- opt_str, ", ".join(STRONG_HASHES)
- )
- )
- parser.values.hashes.setdefault(algo, []).append(digest)
-
-
-hash: Callable[..., Option] = partial(
- Option,
- "--hash",
- # Hash values eventually end up in InstallRequirement.hashes due to
- # __dict__ copying in process_line().
- dest="hashes",
- action="callback",
- callback=_handle_merge_hash,
- type="string",
- help="Verify that the package's archive matches this "
- "hash before installing. Example: --hash=sha256:abcdef...",
-)
-
-
-require_hashes: Callable[..., Option] = partial(
- Option,
- "--require-hashes",
- dest="require_hashes",
- action="store_true",
- default=False,
- help="Require a hash to check each requirement against, for "
- "repeatable installs. This option is implied when any package in a "
- "requirements file has a --hash option.",
-)
-
-
-list_path: Callable[..., Option] = partial(
- PipOption,
- "--path",
- dest="path",
- type="path",
- action="append",
- help="Restrict to the specified installation path for listing "
- "packages (can be used multiple times).",
-)
-
-
-def check_list_path_option(options: Values) -> None:
- if options.path and (options.user or options.local):
- raise CommandError("Cannot combine '--path' with '--user' or '--local'")
-
-
-list_exclude: Callable[..., Option] = partial(
- PipOption,
- "--exclude",
- dest="excludes",
- action="append",
- metavar="package",
- type="package_name",
- help="Exclude specified package from the output",
-)
-
-
-no_python_version_warning: Callable[..., Option] = partial(
- Option,
- "--no-python-version-warning",
- dest="no_python_version_warning",
- action="store_true",
- default=False,
- help="Silence deprecation warnings for upcoming unsupported Pythons.",
-)
-
-
-use_new_feature: Callable[..., Option] = partial(
- Option,
- "--use-feature",
- dest="features_enabled",
- metavar="feature",
- action="append",
- default=[],
- choices=["2020-resolver", "fast-deps"],
- help="Enable new functionality, that may be backward incompatible.",
-)
-
-use_deprecated_feature: Callable[..., Option] = partial(
- Option,
- "--use-deprecated",
- dest="deprecated_features_enabled",
- metavar="feature",
- action="append",
- default=[],
- choices=[
- "legacy-resolver",
- "backtrack-on-build-failures",
- "html5lib",
- ],
- help=("Enable deprecated functionality, that will be removed in the future."),
-)
-
-
-##########
-# groups #
-##########
-
-general_group: Dict[str, Any] = {
- "name": "General Options",
- "options": [
- help_,
- debug_mode,
- isolated_mode,
- require_virtualenv,
- verbose,
- version,
- quiet,
- log,
- no_input,
- proxy,
- retries,
- timeout,
- exists_action,
- trusted_host,
- cert,
- client_cert,
- cache_dir,
- no_cache,
- disable_pip_version_check,
- no_color,
- no_python_version_warning,
- use_new_feature,
- use_deprecated_feature,
- ],
-}
-
-index_group: Dict[str, Any] = {
- "name": "Package Index Options",
- "options": [
- index_url,
- extra_index_url,
- no_index,
- find_links,
- ],
-}
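The docstring's "define options once, but *not* instantiate them globally" principle is easy to demonstrate in isolation; a small sketch using plain `optparse`:

```python
# Each factory call returns a fresh Option, so stateful actions such as
# 'count' or 'append' cannot leak values from one parse into the next.
from functools import partial
from optparse import Option, OptionParser

verbose = partial(
    Option, "-v", "--verbose", dest="verbose", action="count", default=0
)

first, second = OptionParser(), OptionParser()
first.add_option(verbose())   # independent Option instance
second.add_option(verbose())  # another independent instance

print(first.parse_args(["-vv"])[0].verbose)  # 2
print(second.parse_args([])[0].verbose)      # 0 -- no carried-over state
```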
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/command_context.py b/env/lib/python3.9/site-packages/pip/_internal/cli/command_context.py
deleted file mode 100644
index 139995a..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/command_context.py
+++ /dev/null
@@ -1,27 +0,0 @@
-from contextlib import ExitStack, contextmanager
-from typing import ContextManager, Generator, TypeVar
-
-_T = TypeVar("_T", covariant=True)
-
-
-class CommandContextMixIn:
- def __init__(self) -> None:
- super().__init__()
- self._in_main_context = False
- self._main_context = ExitStack()
-
- @contextmanager
- def main_context(self) -> Generator[None, None, None]:
- assert not self._in_main_context
-
- self._in_main_context = True
- try:
- with self._main_context:
- yield
- finally:
- self._in_main_context = False
-
- def enter_context(self, context_provider: ContextManager[_T]) -> _T:
- assert self._in_main_context
-
- return self._main_context.enter_context(context_provider)
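The mix-in above is small enough to exercise directly; a sketch of its contract (the `resource` context manager is illustrative):

```python
# enter_context() is only valid while main_context() is active; everything
# entered is unwound in LIFO order when the main context exits.
from contextlib import contextmanager

from pip._internal.cli.command_context import CommandContextMixIn


@contextmanager
def resource(name: str):
    print("open", name)
    try:
        yield name
    finally:
        print("close", name)


cmd = CommandContextMixIn()
with cmd.main_context():
    cmd.enter_context(resource("a"))
    cmd.enter_context(resource("b"))
# prints: open a / open b / close b / close a
```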
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/main.py b/env/lib/python3.9/site-packages/pip/_internal/cli/main.py
deleted file mode 100644
index 0e31221..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/main.py
+++ /dev/null
@@ -1,70 +0,0 @@
-"""Primary application entrypoint.
-"""
-import locale
-import logging
-import os
-import sys
-from typing import List, Optional
-
-from pip._internal.cli.autocompletion import autocomplete
-from pip._internal.cli.main_parser import parse_command
-from pip._internal.commands import create_command
-from pip._internal.exceptions import PipError
-from pip._internal.utils import deprecation
-
-logger = logging.getLogger(__name__)
-
-
-# Do not import and use main() directly! Using it directly is actively
-# discouraged by pip's maintainers. The name, location and behavior of
-# this function are subject to change, so calling it directly is not
-# portable across different pip versions.
-
-# In addition, running pip in-process is unsupported and unsafe. This is
-# elaborated in detail at
-# https://pip.pypa.io/en/stable/user_guide/#using-pip-from-your-program.
-# That document also provides suggestions that should work for nearly
-# all users that are considering importing and using main() directly.
-
-# However, we know that certain users will still want to invoke pip
-# in-process. If you understand and accept the implications of using pip
-# in an unsupported manner, the best approach is to use runpy to avoid
-# depending on the exact location of this entry point.
-
-# The following example shows how to use runpy to invoke pip in that
-# case:
-#
-# sys.argv = ["pip", your, args, here]
-# runpy.run_module("pip", run_name="__main__")
-#
-# Note that this will exit the process after running, unlike a direct
-# call to main. As it is not safe to do any processing after calling
-# main, this should not be an issue in practice.
-
-
-def main(args: Optional[List[str]] = None) -> int:
- if args is None:
- args = sys.argv[1:]
-
- # Configure our deprecation warnings to be sent through loggers
- deprecation.install_warning_logger()
-
- autocomplete()
-
- try:
- cmd_name, cmd_args = parse_command(args)
- except PipError as exc:
- sys.stderr.write(f"ERROR: {exc}")
- sys.stderr.write(os.linesep)
- sys.exit(1)
-
- # Needed for locale.getpreferredencoding(False) to work
- # in pip._internal.utils.encoding.auto_decode
- try:
- locale.setlocale(locale.LC_ALL, "")
- except locale.Error as e:
-        # setlocale can apparently crash if locales are uninitialized
- logger.debug("Ignoring error %s when setting locale", e)
- command = create_command(cmd_name, isolated=("--isolated" in cmd_args))
-
- return command.main(cmd_args)
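Making the runpy recipe from the comments above concrete:

```python
# Equivalent to running "pip --version" in a subprocess, but in-process.
# Note this exits the interpreter when pip finishes, unlike calling main().
import runpy
import sys

sys.argv = ["pip", "--version"]
runpy.run_module("pip", run_name="__main__")
```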
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/main_parser.py b/env/lib/python3.9/site-packages/pip/_internal/cli/main_parser.py
deleted file mode 100644
index 3666ab0..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/main_parser.py
+++ /dev/null
@@ -1,87 +0,0 @@
-"""A single place for constructing and exposing the main parser
-"""
-
-import os
-import sys
-from typing import List, Tuple
-
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.parser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
-from pip._internal.commands import commands_dict, get_similar_commands
-from pip._internal.exceptions import CommandError
-from pip._internal.utils.misc import get_pip_version, get_prog
-
-__all__ = ["create_main_parser", "parse_command"]
-
-
-def create_main_parser() -> ConfigOptionParser:
- """Creates and returns the main parser for pip's CLI"""
-
- parser = ConfigOptionParser(
- usage="\n%prog [options]",
- add_help_option=False,
- formatter=UpdatingDefaultsHelpFormatter(),
- name="global",
- prog=get_prog(),
- )
- parser.disable_interspersed_args()
-
- parser.version = get_pip_version()
-
- # add the general options
- gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser)
- parser.add_option_group(gen_opts)
-
- # so the help formatter knows
- parser.main = True # type: ignore
-
- # create command listing for description
- description = [""] + [
- f"{name:27} {command_info.summary}"
- for name, command_info in commands_dict.items()
- ]
- parser.description = "\n".join(description)
-
- return parser
-
-
-def parse_command(args: List[str]) -> Tuple[str, List[str]]:
- parser = create_main_parser()
-
- # Note: parser calls disable_interspersed_args(), so the result of this
- # call is to split the initial args into the general options before the
- # subcommand and everything else.
- # For example:
- # args: ['--timeout=5', 'install', '--user', 'INITools']
- # general_options: ['--timeout==5']
- # args_else: ['install', '--user', 'INITools']
- general_options, args_else = parser.parse_args(args)
-
- # --version
- if general_options.version:
- sys.stdout.write(parser.version)
- sys.stdout.write(os.linesep)
- sys.exit()
-
- # pip || pip help -> print_help()
- if not args_else or (args_else[0] == "help" and len(args_else) == 1):
- parser.print_help()
- sys.exit()
-
- # the subcommand name
- cmd_name = args_else[0]
-
- if cmd_name not in commands_dict:
- guess = get_similar_commands(cmd_name)
-
- msg = [f'unknown command "{cmd_name}"']
- if guess:
- msg.append(f'maybe you meant "{guess}"')
-
- raise CommandError(" - ".join(msg))
-
- # all the args without the subcommand
- cmd_args = args[:]
- cmd_args.remove(cmd_name)
-
- return cmd_name, cmd_args
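The worked example in the comments of `parse_command()` can be checked directly (assuming pip is importable):

```python
from pip._internal.cli.main_parser import parse_command

# General options before the subcommand stay in the remaining args;
# only the subcommand name itself is removed.
cmd_name, cmd_args = parse_command(
    ["--timeout=5", "install", "--user", "INITools"]
)
print(cmd_name)  # 'install'
print(cmd_args)  # ['--timeout=5', '--user', 'INITools']
```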
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/parser.py b/env/lib/python3.9/site-packages/pip/_internal/cli/parser.py
deleted file mode 100644
index c762cf2..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/parser.py
+++ /dev/null
@@ -1,294 +0,0 @@
-"""Base option parser setup"""
-
-import logging
-import optparse
-import shutil
-import sys
-import textwrap
-from contextlib import suppress
-from typing import Any, Dict, Generator, List, Tuple
-
-from pip._internal.cli.status_codes import UNKNOWN_ERROR
-from pip._internal.configuration import Configuration, ConfigurationError
-from pip._internal.utils.misc import redact_auth_from_url, strtobool
-
-logger = logging.getLogger(__name__)
-
-
-class PrettyHelpFormatter(optparse.IndentedHelpFormatter):
- """A prettier/less verbose help formatter for optparse."""
-
- def __init__(self, *args: Any, **kwargs: Any) -> None:
- # help position must be aligned with __init__.parseopts.description
- kwargs["max_help_position"] = 30
- kwargs["indent_increment"] = 1
- kwargs["width"] = shutil.get_terminal_size()[0] - 2
- super().__init__(*args, **kwargs)
-
- def format_option_strings(self, option: optparse.Option) -> str:
- return self._format_option_strings(option)
-
- def _format_option_strings(
- self, option: optparse.Option, mvarfmt: str = " <{}>", optsep: str = ", "
- ) -> str:
- """
- Return a comma-separated list of option strings and metavars.
-
- :param option: tuple of (short opt, long opt), e.g: ('-f', '--format')
- :param mvarfmt: metavar format string
- :param optsep: separator
- """
- opts = []
-
- if option._short_opts:
- opts.append(option._short_opts[0])
- if option._long_opts:
- opts.append(option._long_opts[0])
- if len(opts) > 1:
- opts.insert(1, optsep)
-
- if option.takes_value():
- assert option.dest is not None
- metavar = option.metavar or option.dest.lower()
- opts.append(mvarfmt.format(metavar.lower()))
-
- return "".join(opts)
-
- def format_heading(self, heading: str) -> str:
- if heading == "Options":
- return ""
- return heading + ":\n"
-
- def format_usage(self, usage: str) -> str:
- """
- Ensure there is only one newline between usage and the first heading
- if there is no description.
- """
- msg = "\nUsage: {}\n".format(self.indent_lines(textwrap.dedent(usage), " "))
- return msg
-
- def format_description(self, description: str) -> str:
- # leave full control over description to us
- if description:
- if hasattr(self.parser, "main"):
- label = "Commands"
- else:
- label = "Description"
- # some doc strings have initial newlines, some don't
- description = description.lstrip("\n")
- # some doc strings have final newlines and spaces, some don't
- description = description.rstrip()
- # dedent, then reindent
- description = self.indent_lines(textwrap.dedent(description), " ")
- description = f"{label}:\n{description}\n"
- return description
- else:
- return ""
-
- def format_epilog(self, epilog: str) -> str:
- # leave full control over epilog to us
- if epilog:
- return epilog
- else:
- return ""
-
- def indent_lines(self, text: str, indent: str) -> str:
- new_lines = [indent + line for line in text.split("\n")]
- return "\n".join(new_lines)
-
-
-class UpdatingDefaultsHelpFormatter(PrettyHelpFormatter):
- """Custom help formatter for use in ConfigOptionParser.
-
-    This updates the defaults before expanding them, allowing
-    them to show up correctly in the help listing.
-
-    Also redacts auth from URL-type options.
- """
-
- def expand_default(self, option: optparse.Option) -> str:
- default_values = None
- if self.parser is not None:
- assert isinstance(self.parser, ConfigOptionParser)
- self.parser._update_defaults(self.parser.defaults)
- assert option.dest is not None
- default_values = self.parser.defaults.get(option.dest)
- help_text = super().expand_default(option)
-
- if default_values and option.metavar == "URL":
- if isinstance(default_values, str):
- default_values = [default_values]
-
-            # If it's not a list, we should abort and just return the help text
- if not isinstance(default_values, list):
- default_values = []
-
- for val in default_values:
- help_text = help_text.replace(val, redact_auth_from_url(val))
-
- return help_text
-
-
-class CustomOptionParser(optparse.OptionParser):
- def insert_option_group(
- self, idx: int, *args: Any, **kwargs: Any
- ) -> optparse.OptionGroup:
- """Insert an OptionGroup at a given position."""
- group = self.add_option_group(*args, **kwargs)
-
- self.option_groups.pop()
- self.option_groups.insert(idx, group)
-
- return group
-
- @property
- def option_list_all(self) -> List[optparse.Option]:
- """Get a list of all options, including those in option groups."""
- res = self.option_list[:]
- for i in self.option_groups:
- res.extend(i.option_list)
-
- return res
-
-
-class ConfigOptionParser(CustomOptionParser):
- """Custom option parser which updates its defaults by checking the
-    configuration files and environment variables"""
-
- def __init__(
- self,
- *args: Any,
- name: str,
- isolated: bool = False,
- **kwargs: Any,
- ) -> None:
- self.name = name
- self.config = Configuration(isolated)
-
- assert self.name
- super().__init__(*args, **kwargs)
-
- def check_default(self, option: optparse.Option, key: str, val: Any) -> Any:
- try:
- return option.check_value(key, val)
- except optparse.OptionValueError as exc:
- print(f"An error occurred during configuration: {exc}")
- sys.exit(3)
-
- def _get_ordered_configuration_items(
- self,
- ) -> Generator[Tuple[str, Any], None, None]:
- # Configuration gives keys in an unordered manner. Order them.
- override_order = ["global", self.name, ":env:"]
-
- # Pool the options into different groups
- section_items: Dict[str, List[Tuple[str, Any]]] = {
- name: [] for name in override_order
- }
- for section_key, val in self.config.items():
- # ignore empty values
- if not val:
- logger.debug(
- "Ignoring configuration key '%s' as it's value is empty.",
- section_key,
- )
- continue
-
- section, key = section_key.split(".", 1)
- if section in override_order:
- section_items[section].append((key, val))
-
- # Yield each group in their override order
- for section in override_order:
- for key, val in section_items[section]:
- yield key, val
-
- def _update_defaults(self, defaults: Dict[str, Any]) -> Dict[str, Any]:
- """Updates the given defaults with values from the config files and
- the environ. Does a little special handling for certain types of
- options (lists)."""
-
- # Accumulate complex default state.
- self.values = optparse.Values(self.defaults)
- late_eval = set()
- # Then set the options with those values
- for key, val in self._get_ordered_configuration_items():
- # '--' because configuration supports only long names
- option = self.get_option("--" + key)
-
- # Ignore options not present in this parser. E.g. non-globals put
- # in [global] by users that want them to apply to all applicable
- # commands.
- if option is None:
- continue
-
- assert option.dest is not None
-
- if option.action in ("store_true", "store_false"):
- try:
- val = strtobool(val)
- except ValueError:
- self.error(
- "{} is not a valid value for {} option, " # noqa
- "please specify a boolean value like yes/no, "
- "true/false or 1/0 instead.".format(val, key)
- )
- elif option.action == "count":
- with suppress(ValueError):
- val = strtobool(val)
- with suppress(ValueError):
- val = int(val)
- if not isinstance(val, int) or val < 0:
- self.error(
- "{} is not a valid value for {} option, " # noqa
- "please instead specify either a non-negative integer "
- "or a boolean value like yes/no or false/true "
- "which is equivalent to 1/0.".format(val, key)
- )
- elif option.action == "append":
- val = val.split()
- val = [self.check_default(option, key, v) for v in val]
- elif option.action == "callback":
- assert option.callback is not None
- late_eval.add(option.dest)
- opt_str = option.get_opt_string()
- val = option.convert_value(opt_str, val)
- # From take_action
- args = option.callback_args or ()
- kwargs = option.callback_kwargs or {}
- option.callback(option, opt_str, val, self, *args, **kwargs)
- else:
- val = self.check_default(option, key, val)
-
- defaults[option.dest] = val
-
- for key in late_eval:
- defaults[key] = getattr(self.values, key)
- self.values = None
- return defaults
-
- def get_default_values(self) -> optparse.Values:
- """Overriding to make updating the defaults after instantiation of
- the option parser possible, _update_defaults() does the dirty work."""
- if not self.process_default_values:
- # Old, pre-Optik 1.5 behaviour.
- return optparse.Values(self.defaults)
-
- # Load the configuration, or error out in case of an error
- try:
- self.config.load()
- except ConfigurationError as err:
- self.exit(UNKNOWN_ERROR, str(err))
-
- defaults = self._update_defaults(self.defaults.copy()) # ours
- for option in self._get_all_options():
- assert option.dest is not None
- default = defaults.get(option.dest)
- if isinstance(default, str):
- opt_str = option.get_opt_string()
- defaults[option.dest] = option.check_value(opt_str, default)
- return optparse.Values(defaults)
-
- def error(self, msg: str) -> None:
- self.print_usage(sys.stderr)
- self.exit(UNKNOWN_ERROR, f"{msg}\n")
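A sketch of what `_update_defaults()` buys: `ConfigOptionParser` folds pip's config files and `PIP_*` environment variables into option defaults before parsing. The parser name `"example"` here is arbitrary:

```python
import os

from pip._internal.cli import cmdoptions
from pip._internal.cli.parser import ConfigOptionParser

os.environ["PIP_TIMEOUT"] = "60"  # surfaces through the ':env:' section

parser = ConfigOptionParser(name="example", usage="%prog [options]")
parser.add_option_group(
    cmdoptions.make_option_group(cmdoptions.general_group, parser)
)
options, _ = parser.parse_args([])
print(options.timeout)  # 60.0 -- the environment overrode the default of 15
```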
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/progress_bars.py b/env/lib/python3.9/site-packages/pip/_internal/cli/progress_bars.py
deleted file mode 100644
index 0ad1403..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/progress_bars.py
+++ /dev/null
@@ -1,68 +0,0 @@
-import functools
-from typing import Callable, Generator, Iterable, Iterator, Optional, Tuple
-
-from pip._vendor.rich.progress import (
- BarColumn,
- DownloadColumn,
- FileSizeColumn,
- Progress,
- ProgressColumn,
- SpinnerColumn,
- TextColumn,
- TimeElapsedColumn,
- TimeRemainingColumn,
- TransferSpeedColumn,
-)
-
-from pip._internal.utils.logging import get_indentation
-
-DownloadProgressRenderer = Callable[[Iterable[bytes]], Iterator[bytes]]
-
-
-def _rich_progress_bar(
- iterable: Iterable[bytes],
- *,
- bar_type: str,
- size: int,
-) -> Generator[bytes, None, None]:
- assert bar_type == "on", "This should only be used in the default mode."
-
- if not size:
- total = float("inf")
- columns: Tuple[ProgressColumn, ...] = (
- TextColumn("[progress.description]{task.description}"),
- SpinnerColumn("line", speed=1.5),
- FileSizeColumn(),
- TransferSpeedColumn(),
- TimeElapsedColumn(),
- )
- else:
- total = size
- columns = (
- TextColumn("[progress.description]{task.description}"),
- BarColumn(),
- DownloadColumn(),
- TransferSpeedColumn(),
- TextColumn("eta"),
- TimeRemainingColumn(),
- )
-
- progress = Progress(*columns, refresh_per_second=30)
- task_id = progress.add_task(" " * (get_indentation() + 2), total=total)
- with progress:
- for chunk in iterable:
- yield chunk
- progress.update(task_id, advance=len(chunk))
-
-
-def get_download_progress_renderer(
- *, bar_type: str, size: Optional[int] = None
-) -> DownloadProgressRenderer:
- """Get an object that can be used to render the download progress.
-
-    Returns a callable that takes an iterable to "wrap".
- """
- if bar_type == "on":
- return functools.partial(_rich_progress_bar, bar_type=bar_type, size=size)
- else:
-        return iter  # no-op when passed an iterator
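A usage sketch for the renderer factory above: it wraps any iterable of byte chunks and advances a bar as chunks pass through (the synthetic `chunks` generator is illustrative):

```python
from pip._internal.cli.progress_bars import get_download_progress_renderer

chunks = (b"x" * 1024 for _ in range(100))
renderer = get_download_progress_renderer(bar_type="on", size=100 * 1024)

# Iterating the wrapped stream yields the same chunks while the rich
# progress bar advances by len(chunk) on each step.
total = sum(len(chunk) for chunk in renderer(chunks))
print(total)  # 102400
```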
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/req_command.py b/env/lib/python3.9/site-packages/pip/_internal/cli/req_command.py
deleted file mode 100644
index 539d21d..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/req_command.py
+++ /dev/null
@@ -1,488 +0,0 @@
-"""Contains the Command base classes that depend on PipSession.
-
-The classes in this module are in a separate module so the commands not
-needing download / PackageFinder capability don't unnecessarily import the
-PackageFinder machinery and all its vendored dependencies, etc.
-"""
-
-import logging
-import os
-import sys
-from functools import partial
-from optparse import Values
-from typing import Any, List, Optional, Tuple
-
-from pip._internal.cache import WheelCache
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.command_context import CommandContextMixIn
-from pip._internal.exceptions import CommandError, PreviousBuildDirError
-from pip._internal.index.collector import LinkCollector
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.models.selection_prefs import SelectionPreferences
-from pip._internal.models.target_python import TargetPython
-from pip._internal.network.session import PipSession
-from pip._internal.operations.build.build_tracker import BuildTracker
-from pip._internal.operations.prepare import RequirementPreparer
-from pip._internal.req.constructors import (
- install_req_from_editable,
- install_req_from_line,
- install_req_from_parsed_requirement,
- install_req_from_req_string,
-)
-from pip._internal.req.req_file import parse_requirements
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.resolution.base import BaseResolver
-from pip._internal.self_outdated_check import pip_self_version_check
-from pip._internal.utils.deprecation import deprecated
-from pip._internal.utils.temp_dir import (
- TempDirectory,
- TempDirectoryTypeRegistry,
- tempdir_kinds,
-)
-from pip._internal.utils.virtualenv import running_under_virtualenv
-
-logger = logging.getLogger(__name__)
-
-
-class SessionCommandMixin(CommandContextMixIn):
-
- """
- A class mixin for command classes needing _build_session().
- """
-
- def __init__(self) -> None:
- super().__init__()
- self._session: Optional[PipSession] = None
-
- @classmethod
- def _get_index_urls(cls, options: Values) -> Optional[List[str]]:
- """Return a list of index urls from user-provided options."""
- index_urls = []
- if not getattr(options, "no_index", False):
- url = getattr(options, "index_url", None)
- if url:
- index_urls.append(url)
- urls = getattr(options, "extra_index_urls", None)
- if urls:
- index_urls.extend(urls)
- # Return None rather than an empty list
- return index_urls or None
-
- def get_default_session(self, options: Values) -> PipSession:
- """Get a default-managed session."""
- if self._session is None:
- self._session = self.enter_context(self._build_session(options))
- # there's no type annotation on requests.Session, so it's
- # automatically ContextManager[Any] and self._session becomes Any,
- # then https://github.com/python/mypy/issues/7696 kicks in
- assert self._session is not None
- return self._session
-
- def _build_session(
- self,
- options: Values,
- retries: Optional[int] = None,
- timeout: Optional[int] = None,
- ) -> PipSession:
- assert not options.cache_dir or os.path.isabs(options.cache_dir)
- session = PipSession(
- cache=(
- os.path.join(options.cache_dir, "http") if options.cache_dir else None
- ),
- retries=retries if retries is not None else options.retries,
- trusted_hosts=options.trusted_hosts,
- index_urls=self._get_index_urls(options),
- )
-
- # Handle custom ca-bundles from the user
- if options.cert:
- session.verify = options.cert
-
- # Handle SSL client certificate
- if options.client_cert:
- session.cert = options.client_cert
-
- # Handle timeouts
- if options.timeout or timeout:
- session.timeout = timeout if timeout is not None else options.timeout
-
- # Handle configured proxies
- if options.proxy:
- session.proxies = {
- "http": options.proxy,
- "https": options.proxy,
- }
-
- # Determine if we can prompt the user for authentication or not
- session.auth.prompting = not options.no_input
-
- return session
-
-
-class IndexGroupCommand(Command, SessionCommandMixin):
-
- """
- Abstract base class for commands with the index_group options.
-
- This also corresponds to the commands that permit the pip version check.
- """
-
- def handle_pip_version_check(self, options: Values) -> None:
- """
- Do the pip version check if not disabled.
-
- This overrides the default behavior of not doing the check.
- """
- # Make sure the index_group options are present.
- assert hasattr(options, "no_index")
-
- if options.disable_pip_version_check or options.no_index:
- return
-
- # Otherwise, check if we're using the latest version of pip available.
- session = self._build_session(
- options, retries=0, timeout=min(5, options.timeout)
- )
- with session:
- pip_self_version_check(session, options)
-
-
-KEEPABLE_TEMPDIR_TYPES = [
- tempdir_kinds.BUILD_ENV,
- tempdir_kinds.EPHEM_WHEEL_CACHE,
- tempdir_kinds.REQ_BUILD,
-]
-
-
-def warn_if_run_as_root() -> None:
- """Output a warning for sudo users on Unix.
-
- In a virtual environment, sudo pip still writes to virtualenv.
- On Windows, users may run pip as Administrator without issues.
- This warning only applies to Unix root users outside of virtualenv.
- """
- if running_under_virtualenv():
- return
- if not hasattr(os, "getuid"):
- return
- # On Windows, there are no "system managed" Python packages. Installing as
- # Administrator via pip is the correct way of updating system environments.
- #
- # We choose sys.platform over utils.compat.WINDOWS here to enable Mypy platform
- # checks: https://mypy.readthedocs.io/en/stable/common_issues.html
- if sys.platform == "win32" or sys.platform == "cygwin":
- return
-
- if os.getuid() != 0:
- return
-
- logger.warning(
- "Running pip as the 'root' user can result in broken permissions and "
- "conflicting behaviour with the system package manager. "
- "It is recommended to use a virtual environment instead: "
- "https://pip.pypa.io/warnings/venv"
- )
-
-
-def with_cleanup(func: Any) -> Any:
- """Decorator for common logic related to managing temporary
- directories.
- """
-
- def configure_tempdir_registry(registry: TempDirectoryTypeRegistry) -> None:
- for t in KEEPABLE_TEMPDIR_TYPES:
- registry.set_delete(t, False)
-
- def wrapper(
- self: RequirementCommand, options: Values, args: List[Any]
- ) -> Optional[int]:
- assert self.tempdir_registry is not None
- if options.no_clean:
- configure_tempdir_registry(self.tempdir_registry)
-
- try:
- return func(self, options, args)
- except PreviousBuildDirError:
- # This kind of conflict can occur when the user passes an explicit
- # build directory with a pre-existing folder. In that case we do
- # not want to accidentally remove it.
- configure_tempdir_registry(self.tempdir_registry)
- raise
-
- return wrapper
-
-
-class RequirementCommand(IndexGroupCommand):
- def __init__(self, *args: Any, **kw: Any) -> None:
- super().__init__(*args, **kw)
-
- self.cmd_opts.add_option(cmdoptions.no_clean())
-
- @staticmethod
- def determine_resolver_variant(options: Values) -> str:
- """Determines which resolver should be used, based on the given options."""
- if "legacy-resolver" in options.deprecated_features_enabled:
- return "legacy"
-
- return "2020-resolver"
-
- @staticmethod
- def determine_build_failure_suppression(options: Values) -> bool:
- """Determines whether build failures should be suppressed and backtracked on."""
- if "backtrack-on-build-failures" not in options.deprecated_features_enabled:
- return False
-
- if "legacy-resolver" in options.deprecated_features_enabled:
- raise CommandError("Cannot backtrack with legacy resolver.")
-
- deprecated(
- reason=(
- "Backtracking on build failures can mask issues related to how "
- "a package generates metadata or builds a wheel. This flag will "
- "be removed in pip 22.2."
- ),
- gone_in=None,
- replacement=(
- "avoiding known-bad versions by explicitly telling pip to ignore them "
- "(either directly as requirements, or via a constraints file)"
- ),
- feature_flag=None,
- issue=10655,
- )
- return True
-
- @classmethod
- def make_requirement_preparer(
- cls,
- temp_build_dir: TempDirectory,
- options: Values,
- build_tracker: BuildTracker,
- session: PipSession,
- finder: PackageFinder,
- use_user_site: bool,
- download_dir: Optional[str] = None,
- verbosity: int = 0,
- ) -> RequirementPreparer:
- """
- Create a RequirementPreparer instance for the given parameters.
- """
- temp_build_dir_path = temp_build_dir.path
- assert temp_build_dir_path is not None
-
- resolver_variant = cls.determine_resolver_variant(options)
- if resolver_variant == "2020-resolver":
- lazy_wheel = "fast-deps" in options.features_enabled
- if lazy_wheel:
- logger.warning(
- "pip is using lazily downloaded wheels using HTTP "
- "range requests to obtain dependency information. "
- "This experimental feature is enabled through "
- "--use-feature=fast-deps and it is not ready for "
- "production."
- )
- else:
- lazy_wheel = False
- if "fast-deps" in options.features_enabled:
- logger.warning(
- "fast-deps has no effect when used with the legacy resolver."
- )
-
- return RequirementPreparer(
- build_dir=temp_build_dir_path,
- src_dir=options.src_dir,
- download_dir=download_dir,
- build_isolation=options.build_isolation,
- check_build_deps=options.check_build_deps,
- build_tracker=build_tracker,
- session=session,
- progress_bar=options.progress_bar,
- finder=finder,
- require_hashes=options.require_hashes,
- use_user_site=use_user_site,
- lazy_wheel=lazy_wheel,
- verbosity=verbosity,
- )
-
- @classmethod
- def make_resolver(
- cls,
- preparer: RequirementPreparer,
- finder: PackageFinder,
- options: Values,
- wheel_cache: Optional[WheelCache] = None,
- use_user_site: bool = False,
- ignore_installed: bool = True,
- ignore_requires_python: bool = False,
- force_reinstall: bool = False,
- upgrade_strategy: str = "to-satisfy-only",
- use_pep517: Optional[bool] = None,
- py_version_info: Optional[Tuple[int, ...]] = None,
- ) -> BaseResolver:
- """
- Create a Resolver instance for the given parameters.
- """
- make_install_req = partial(
- install_req_from_req_string,
- isolated=options.isolated_mode,
- use_pep517=use_pep517,
- config_settings=getattr(options, "config_settings", None),
- )
- suppress_build_failures = cls.determine_build_failure_suppression(options)
- resolver_variant = cls.determine_resolver_variant(options)
- # The long import name and duplicated invocation is needed to convince
- # Mypy into correctly typechecking. Otherwise it would complain the
- # "Resolver" class being redefined.
- if resolver_variant == "2020-resolver":
- import pip._internal.resolution.resolvelib.resolver
-
- return pip._internal.resolution.resolvelib.resolver.Resolver(
- preparer=preparer,
- finder=finder,
- wheel_cache=wheel_cache,
- make_install_req=make_install_req,
- use_user_site=use_user_site,
- ignore_dependencies=options.ignore_dependencies,
- ignore_installed=ignore_installed,
- ignore_requires_python=ignore_requires_python,
- force_reinstall=force_reinstall,
- upgrade_strategy=upgrade_strategy,
- py_version_info=py_version_info,
- suppress_build_failures=suppress_build_failures,
- )
- import pip._internal.resolution.legacy.resolver
-
- return pip._internal.resolution.legacy.resolver.Resolver(
- preparer=preparer,
- finder=finder,
- wheel_cache=wheel_cache,
- make_install_req=make_install_req,
- use_user_site=use_user_site,
- ignore_dependencies=options.ignore_dependencies,
- ignore_installed=ignore_installed,
- ignore_requires_python=ignore_requires_python,
- force_reinstall=force_reinstall,
- upgrade_strategy=upgrade_strategy,
- py_version_info=py_version_info,
- )
-
- def get_requirements(
- self,
- args: List[str],
- options: Values,
- finder: PackageFinder,
- session: PipSession,
- ) -> List[InstallRequirement]:
- """
- Parse command-line arguments into the corresponding requirements.
- """
- requirements: List[InstallRequirement] = []
- for filename in options.constraints:
- for parsed_req in parse_requirements(
- filename,
- constraint=True,
- finder=finder,
- options=options,
- session=session,
- ):
- req_to_add = install_req_from_parsed_requirement(
- parsed_req,
- isolated=options.isolated_mode,
- user_supplied=False,
- )
- requirements.append(req_to_add)
-
- for req in args:
- req_to_add = install_req_from_line(
- req,
- None,
- isolated=options.isolated_mode,
- use_pep517=options.use_pep517,
- user_supplied=True,
- config_settings=getattr(options, "config_settings", None),
- )
- requirements.append(req_to_add)
-
- for req in options.editables:
- req_to_add = install_req_from_editable(
- req,
- user_supplied=True,
- isolated=options.isolated_mode,
- use_pep517=options.use_pep517,
- config_settings=getattr(options, "config_settings", None),
- )
- requirements.append(req_to_add)
-
- # NOTE: options.require_hashes may be set if --require-hashes is True
- for filename in options.requirements:
- for parsed_req in parse_requirements(
- filename, finder=finder, options=options, session=session
- ):
- req_to_add = install_req_from_parsed_requirement(
- parsed_req,
- isolated=options.isolated_mode,
- use_pep517=options.use_pep517,
- user_supplied=True,
- )
- requirements.append(req_to_add)
-
- # If any requirement has hash options, enable hash checking.
- if any(req.has_hash_options for req in requirements):
- options.require_hashes = True
-
- if not (args or options.editables or options.requirements):
- opts = {"name": self.name}
- if options.find_links:
- raise CommandError(
- "You must give at least one requirement to {name} "
- '(maybe you meant "pip {name} {links}"?)'.format(
- **dict(opts, links=" ".join(options.find_links))
- )
- )
- else:
- raise CommandError(
- "You must give at least one requirement to {name} "
- '(see "pip help {name}")'.format(**opts)
- )
-
- return requirements
-
- @staticmethod
- def trace_basic_info(finder: PackageFinder) -> None:
- """
- Trace basic information about the provided objects.
- """
- # Display where finder is looking for packages
- search_scope = finder.search_scope
- locations = search_scope.get_formatted_locations()
- if locations:
- logger.info(locations)
-
- def _build_package_finder(
- self,
- options: Values,
- session: PipSession,
- target_python: Optional[TargetPython] = None,
- ignore_requires_python: Optional[bool] = None,
- ) -> PackageFinder:
- """
- Create a package finder appropriate to this requirement command.
-
- :param ignore_requires_python: Whether to ignore incompatible
- "Requires-Python" values in links. Defaults to False.
- """
- link_collector = LinkCollector.create(session, options=options)
- selection_prefs = SelectionPreferences(
- allow_yanked=True,
- format_control=options.format_control,
- allow_all_prereleases=options.pre,
- prefer_binary=options.prefer_binary,
- ignore_requires_python=ignore_requires_python,
- )
-
- return PackageFinder.create(
- link_collector=link_collector,
- selection_prefs=selection_prefs,
- target_python=target_python,
- use_deprecated_html5lib="html5lib" in options.deprecated_features_enabled,
- )
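
Most of `_build_session` is a straight translation of command-line options onto session attributes. A hedged sketch of that translation using a plain `requests.Session` (PipSession also layers on retries, caching, and a `timeout` attribute that plain sessions lack; the parameter names here are invented for illustration):

```python
from typing import Optional

import requests


def build_session(
    proxy: Optional[str] = None,
    ca_bundle: Optional[str] = None,
    client_cert: Optional[str] = None,
) -> requests.Session:
    session = requests.Session()
    if ca_bundle:
        session.verify = ca_bundle  # custom CA bundle, like --cert
    if client_cert:
        session.cert = client_cert  # SSL client certificate, like --client-cert
    if proxy:
        # One proxy URL for both schemes, like --proxy
        session.proxies = {"http": proxy, "https": proxy}
    return session


session = build_session()
```
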
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/spinners.py b/env/lib/python3.9/site-packages/pip/_internal/cli/spinners.py
deleted file mode 100644
index a50e6ad..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/spinners.py
+++ /dev/null
@@ -1,159 +0,0 @@
-import contextlib
-import itertools
-import logging
-import sys
-import time
-from typing import IO, Generator
-
-from pip._internal.utils.compat import WINDOWS
-from pip._internal.utils.logging import get_indentation
-
-logger = logging.getLogger(__name__)
-
-
-class SpinnerInterface:
- def spin(self) -> None:
- raise NotImplementedError()
-
- def finish(self, final_status: str) -> None:
- raise NotImplementedError()
-
-
-class InteractiveSpinner(SpinnerInterface):
- def __init__(
- self,
- message: str,
- file: IO[str] = None,
- spin_chars: str = "-\\|/",
- # Empirically, 8 updates/second looks nice
- min_update_interval_seconds: float = 0.125,
- ):
- self._message = message
- if file is None:
- file = sys.stdout
- self._file = file
- self._rate_limiter = RateLimiter(min_update_interval_seconds)
- self._finished = False
-
- self._spin_cycle = itertools.cycle(spin_chars)
-
- self._file.write(" " * get_indentation() + self._message + " ... ")
- self._width = 0
-
- def _write(self, status: str) -> None:
- assert not self._finished
- # Erase what we wrote before by backspacing to the beginning, writing
- # spaces to overwrite the old text, and then backspacing again
- backup = "\b" * self._width
- self._file.write(backup + " " * self._width + backup)
- # Now we have a blank slate to add our status
- self._file.write(status)
- self._width = len(status)
- self._file.flush()
- self._rate_limiter.reset()
-
- def spin(self) -> None:
- if self._finished:
- return
- if not self._rate_limiter.ready():
- return
- self._write(next(self._spin_cycle))
-
- def finish(self, final_status: str) -> None:
- if self._finished:
- return
- self._write(final_status)
- self._file.write("\n")
- self._file.flush()
- self._finished = True
-
-
-# Used for dumb terminals, non-interactive installs (no tty), etc.
-# We still print updates occasionally (once every 60 seconds by default) to
-# act as a keep-alive for systems like Travis-CI that take lack-of-output as
-# an indication that a task has frozen.
-class NonInteractiveSpinner(SpinnerInterface):
- def __init__(self, message: str, min_update_interval_seconds: float = 60.0) -> None:
- self._message = message
- self._finished = False
- self._rate_limiter = RateLimiter(min_update_interval_seconds)
- self._update("started")
-
- def _update(self, status: str) -> None:
- assert not self._finished
- self._rate_limiter.reset()
- logger.info("%s: %s", self._message, status)
-
- def spin(self) -> None:
- if self._finished:
- return
- if not self._rate_limiter.ready():
- return
- self._update("still running...")
-
- def finish(self, final_status: str) -> None:
- if self._finished:
- return
- self._update(f"finished with status '{final_status}'")
- self._finished = True
-
-
-class RateLimiter:
- def __init__(self, min_update_interval_seconds: float) -> None:
- self._min_update_interval_seconds = min_update_interval_seconds
- self._last_update: float = 0
-
- def ready(self) -> bool:
- now = time.time()
- delta = now - self._last_update
- return delta >= self._min_update_interval_seconds
-
- def reset(self) -> None:
- self._last_update = time.time()
-
-
-@contextlib.contextmanager
-def open_spinner(message: str) -> Generator[SpinnerInterface, None, None]:
- # Interactive spinner goes directly to sys.stdout rather than being routed
- # through the logging system, but it acts like it has level INFO,
- # i.e. it's only displayed if we're at level INFO or better.
- # Non-interactive spinner goes through the logging system, so it is always
- # in sync with logging configuration.
- if sys.stdout.isatty() and logger.getEffectiveLevel() <= logging.INFO:
- spinner: SpinnerInterface = InteractiveSpinner(message)
- else:
- spinner = NonInteractiveSpinner(message)
- try:
- with hidden_cursor(sys.stdout):
- yield spinner
- except KeyboardInterrupt:
- spinner.finish("canceled")
- raise
- except Exception:
- spinner.finish("error")
- raise
- else:
- spinner.finish("done")
-
-
-HIDE_CURSOR = "\x1b[?25l"
-SHOW_CURSOR = "\x1b[?25h"
-
-
-@contextlib.contextmanager
-def hidden_cursor(file: IO[str]) -> Generator[None, None, None]:
- # The Windows terminal does not support the hide/show cursor ANSI codes,
- # even via colorama. So don't even try.
- if WINDOWS:
- yield
- # We don't want to clutter the output with control characters if we're
- # writing to a file, or if the user is running with --quiet.
- # See https://github.com/pypa/pip/issues/3418
- elif not file.isatty() or logger.getEffectiveLevel() > logging.INFO:
- yield
- else:
- file.write(HIDE_CURSOR)
- try:
- yield
- finally:
- file.write(SHOW_CURSOR)
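
The spinner above combines two small ideas: backspace-and-rewrite for in-place terminal updates, and a time-based rate limiter capping redraws at roughly eight per second. A self-contained sketch with the `RateLimiter` logic inlined (the function name and durations are illustrative):

```python
import itertools
import sys
import time


def spin_for(seconds: float, min_interval: float = 0.125) -> None:
    sys.stdout.write("working ... ")
    cycle = itertools.cycle("-\\|/")
    last_update = 0.0  # RateLimiter state
    width = 0
    deadline = time.time() + seconds
    while time.time() < deadline:
        if time.time() - last_update >= min_interval:  # RateLimiter.ready()
            sys.stdout.write("\b" * width + next(cycle))
            sys.stdout.flush()
            width = 1
            last_update = time.time()  # RateLimiter.reset()
        time.sleep(0.01)
    sys.stdout.write("\b" * width + "done\n")


spin_for(1.0)
```
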
diff --git a/env/lib/python3.9/site-packages/pip/_internal/cli/status_codes.py b/env/lib/python3.9/site-packages/pip/_internal/cli/status_codes.py
deleted file mode 100644
index 5e29502..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/cli/status_codes.py
+++ /dev/null
@@ -1,6 +0,0 @@
-SUCCESS = 0
-ERROR = 1
-UNKNOWN_ERROR = 2
-VIRTUALENV_NOT_FOUND = 3
-PREVIOUS_BUILD_DIR_ERROR = 4
-NO_MATCHES_FOUND = 23
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/commands/__init__.py
deleted file mode 100644
index c72f24f..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/__init__.py
+++ /dev/null
@@ -1,127 +0,0 @@
-"""
-Package containing all pip commands
-"""
-
-import importlib
-from collections import namedtuple
-from typing import Any, Dict, Optional
-
-from pip._internal.cli.base_command import Command
-
-CommandInfo = namedtuple("CommandInfo", "module_path, class_name, summary")
-
-# This dictionary does a bunch of heavy lifting for help output:
-# - Enables avoiding additional (costly) imports for presenting `--help`.
-# - The ordering matters for help display.
-#
-# Even though the module path starts with the same "pip._internal.commands"
-# prefix, the full path makes testing easier (specifically when modifying
-# `commands_dict` in test setup / teardown).
-commands_dict: Dict[str, CommandInfo] = {
- "install": CommandInfo(
- "pip._internal.commands.install",
- "InstallCommand",
- "Install packages.",
- ),
- "download": CommandInfo(
- "pip._internal.commands.download",
- "DownloadCommand",
- "Download packages.",
- ),
- "uninstall": CommandInfo(
- "pip._internal.commands.uninstall",
- "UninstallCommand",
- "Uninstall packages.",
- ),
- "freeze": CommandInfo(
- "pip._internal.commands.freeze",
- "FreezeCommand",
- "Output installed packages in requirements format.",
- ),
- "list": CommandInfo(
- "pip._internal.commands.list",
- "ListCommand",
- "List installed packages.",
- ),
- "show": CommandInfo(
- "pip._internal.commands.show",
- "ShowCommand",
- "Show information about installed packages.",
- ),
- "check": CommandInfo(
- "pip._internal.commands.check",
- "CheckCommand",
- "Verify installed packages have compatible dependencies.",
- ),
- "config": CommandInfo(
- "pip._internal.commands.configuration",
- "ConfigurationCommand",
- "Manage local and global configuration.",
- ),
- "search": CommandInfo(
- "pip._internal.commands.search",
- "SearchCommand",
- "Search PyPI for packages.",
- ),
- "cache": CommandInfo(
- "pip._internal.commands.cache",
- "CacheCommand",
- "Inspect and manage pip's wheel cache.",
- ),
- "index": CommandInfo(
- "pip._internal.commands.index",
- "IndexCommand",
- "Inspect information available from package indexes.",
- ),
- "wheel": CommandInfo(
- "pip._internal.commands.wheel",
- "WheelCommand",
- "Build wheels from your requirements.",
- ),
- "hash": CommandInfo(
- "pip._internal.commands.hash",
- "HashCommand",
- "Compute hashes of package archives.",
- ),
- "completion": CommandInfo(
- "pip._internal.commands.completion",
- "CompletionCommand",
- "A helper command used for command completion.",
- ),
- "debug": CommandInfo(
- "pip._internal.commands.debug",
- "DebugCommand",
- "Show information useful for debugging.",
- ),
- "help": CommandInfo(
- "pip._internal.commands.help",
- "HelpCommand",
- "Show help for commands.",
- ),
-}
-
-
-def create_command(name: str, **kwargs: Any) -> Command:
- """
- Create an instance of the Command class with the given name.
- """
- module_path, class_name, summary = commands_dict[name]
- module = importlib.import_module(module_path)
- command_class = getattr(module, class_name)
- command = command_class(name=name, summary=summary, **kwargs)
-
- return command
-
-
-def get_similar_commands(name: str) -> Optional[str]:
- """Command name auto-correct."""
- from difflib import get_close_matches
-
- name = name.lower()
-
- close_commands = get_close_matches(name, commands_dict.keys())
-
- if close_commands:
- return close_commands[0]
- else:
- return None
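
`commands_dict` keeps `--help` cheap: a command's module is imported only when that command is actually created, and `difflib` supplies the "maybe you meant" guess. The same shape in miniature (the `mytool.*` module paths are hypothetical, so only the suggestion half is exercised):

```python
import importlib
from difflib import get_close_matches
from typing import NamedTuple, Optional


class CommandInfo(NamedTuple):
    module_path: str
    class_name: str
    summary: str


commands = {
    "install": CommandInfo("mytool.commands.install", "InstallCommand", "Install things."),
    "uninstall": CommandInfo("mytool.commands.uninstall", "UninstallCommand", "Remove things."),
}


def create_command(name: str):
    info = commands[name]
    module = importlib.import_module(info.module_path)  # deferred until needed
    return getattr(module, info.class_name)(name=name, summary=info.summary)


def get_similar_commands(name: str) -> Optional[str]:
    matches = get_close_matches(name.lower(), list(commands))
    return matches[0] if matches else None


assert get_similar_commands("instal") == "install"
```
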
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/cache.py b/env/lib/python3.9/site-packages/pip/_internal/commands/cache.py
deleted file mode 100644
index f1a489d..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/cache.py
+++ /dev/null
@@ -1,223 +0,0 @@
-import os
-import textwrap
-from optparse import Values
-from typing import Any, List
-
-import pip._internal.utils.filesystem as filesystem
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import ERROR, SUCCESS
-from pip._internal.exceptions import CommandError, PipError
-from pip._internal.utils.logging import getLogger
-
-logger = getLogger(__name__)
-
-
-class CacheCommand(Command):
- """
- Inspect and manage pip's wheel cache.
-
- Subcommands:
-
- - dir: Show the cache directory.
- - info: Show information about the cache.
- - list: List filenames of packages stored in the cache.
- - remove: Remove one or more package from the cache.
- - purge: Remove all items from the cache.
-
- ``<pattern>`` can be a glob expression or a package name.
- """
-
- ignore_require_venv = True
- usage = """
- %prog dir
- %prog info
- %prog list [<pattern>] [--format=[human, abspath]]
- %prog remove <pattern>
- %prog purge
- """
-
- def add_options(self) -> None:
-
- self.cmd_opts.add_option(
- "--format",
- action="store",
- dest="list_format",
- default="human",
- choices=("human", "abspath"),
- help="Select the output format among: human (default) or abspath",
- )
-
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- handlers = {
- "dir": self.get_cache_dir,
- "info": self.get_cache_info,
- "list": self.list_cache_items,
- "remove": self.remove_cache_items,
- "purge": self.purge_cache,
- }
-
- if not options.cache_dir:
- logger.error("pip cache commands can not function since cache is disabled.")
- return ERROR
-
- # Determine action
- if not args or args[0] not in handlers:
- logger.error(
- "Need an action (%s) to perform.",
- ", ".join(sorted(handlers)),
- )
- return ERROR
-
- action = args[0]
-
- # Error handling happens here, not in the action-handlers.
- try:
- handlers[action](options, args[1:])
- except PipError as e:
- logger.error(e.args[0])
- return ERROR
-
- return SUCCESS
-
- def get_cache_dir(self, options: Values, args: List[Any]) -> None:
- if args:
- raise CommandError("Too many arguments")
-
- logger.info(options.cache_dir)
-
- def get_cache_info(self, options: Values, args: List[Any]) -> None:
- if args:
- raise CommandError("Too many arguments")
-
- num_http_files = len(self._find_http_files(options))
- num_packages = len(self._find_wheels(options, "*"))
-
- http_cache_location = self._cache_dir(options, "http")
- wheels_cache_location = self._cache_dir(options, "wheels")
- http_cache_size = filesystem.format_directory_size(http_cache_location)
- wheels_cache_size = filesystem.format_directory_size(wheels_cache_location)
-
- message = (
- textwrap.dedent(
- """
- Package index page cache location: {http_cache_location}
- Package index page cache size: {http_cache_size}
- Number of HTTP files: {num_http_files}
- Wheels location: {wheels_cache_location}
- Wheels size: {wheels_cache_size}
- Number of wheels: {package_count}
- """
- )
- .format(
- http_cache_location=http_cache_location,
- http_cache_size=http_cache_size,
- num_http_files=num_http_files,
- wheels_cache_location=wheels_cache_location,
- package_count=num_packages,
- wheels_cache_size=wheels_cache_size,
- )
- .strip()
- )
-
- logger.info(message)
-
- def list_cache_items(self, options: Values, args: List[Any]) -> None:
- if len(args) > 1:
- raise CommandError("Too many arguments")
-
- if args:
- pattern = args[0]
- else:
- pattern = "*"
-
- files = self._find_wheels(options, pattern)
- if options.list_format == "human":
- self.format_for_human(files)
- else:
- self.format_for_abspath(files)
-
- def format_for_human(self, files: List[str]) -> None:
- if not files:
- logger.info("Nothing cached.")
- return
-
- results = []
- for filename in files:
- wheel = os.path.basename(filename)
- size = filesystem.format_file_size(filename)
- results.append(f" - {wheel} ({size})")
- logger.info("Cache contents:\n")
- logger.info("\n".join(sorted(results)))
-
- def format_for_abspath(self, files: List[str]) -> None:
- if not files:
- return
-
- results = []
- for filename in files:
- results.append(filename)
-
- logger.info("\n".join(sorted(results)))
-
- def remove_cache_items(self, options: Values, args: List[Any]) -> None:
- if len(args) > 1:
- raise CommandError("Too many arguments")
-
- if not args:
- raise CommandError("Please provide a pattern")
-
- files = self._find_wheels(options, args[0])
-
- no_matching_msg = "No matching packages"
- if args[0] == "*":
- # Only fetch http files if no specific pattern given
- files += self._find_http_files(options)
- else:
- # Add the pattern to the log message
- no_matching_msg += ' for pattern "{}"'.format(args[0])
-
- if not files:
- logger.warning(no_matching_msg)
-
- for filename in files:
- os.unlink(filename)
- logger.verbose("Removed %s", filename)
- logger.info("Files removed: %s", len(files))
-
- def purge_cache(self, options: Values, args: List[Any]) -> None:
- if args:
- raise CommandError("Too many arguments")
-
- return self.remove_cache_items(options, ["*"])
-
- def _cache_dir(self, options: Values, subdir: str) -> str:
- return os.path.join(options.cache_dir, subdir)
-
- def _find_http_files(self, options: Values) -> List[str]:
- http_dir = self._cache_dir(options, "http")
- return filesystem.find_files(http_dir, "*")
-
- def _find_wheels(self, options: Values, pattern: str) -> List[str]:
- wheel_dir = self._cache_dir(options, "wheels")
-
- # The wheel filename format, as specified in PEP 427, is:
- # {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl
- #
- # Additionally, non-alphanumeric values in the distribution are
- # normalized to underscores (_), meaning hyphens can never occur
- # before `-{version}`.
- #
- # Given that information:
- # - If the pattern we're given contains a hyphen (-), the user is
- # providing at least the version. Thus, we can just append `*.whl`
- # to match the rest of it.
- # - If the pattern we're given doesn't contain a hyphen (-), the
- # user is only providing the name. Thus, we append `-*.whl` to
- # match the hyphen before the version, followed by anything else.
- #
- # PEP 427: https://www.python.org/dev/peps/pep-0427/
- pattern = pattern + ("*.whl" if "-" in pattern else "-*.whl")
-
- return filesystem.find_files(wheel_dir, pattern)
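
The comment in `_find_wheels` encodes a single rule: PEP 427 wheel names normalize hyphens out of the distribution name, so a bare name needs `-*.whl` appended while a pattern that already contains a hyphen needs only `*.whl`. A quick standalone check of that rule (the filenames are made up):

```python
import fnmatch


def wheel_glob(pattern: str) -> str:
    # Mirror of the pattern construction in _find_wheels above.
    return pattern + ("*.whl" if "-" in pattern else "-*.whl")


files = [
    "requests-2.27.1-py3-none-any.whl",
    "rich-12.0.0-py3-none-any.whl",
]
assert fnmatch.filter(files, wheel_glob("requests")) == [files[0]]
assert fnmatch.filter(files, wheel_glob("rich-12.0.0")) == [files[1]]
```
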
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/check.py b/env/lib/python3.9/site-packages/pip/_internal/commands/check.py
deleted file mode 100644
index 3864220..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/check.py
+++ /dev/null
@@ -1,53 +0,0 @@
-import logging
-from optparse import Values
-from typing import List
-
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import ERROR, SUCCESS
-from pip._internal.operations.check import (
- check_package_set,
- create_package_set_from_installed,
-)
-from pip._internal.utils.misc import write_output
-
-logger = logging.getLogger(__name__)
-
-
-class CheckCommand(Command):
- """Verify installed packages have compatible dependencies."""
-
- usage = """
- %prog [options]"""
-
- def run(self, options: Values, args: List[str]) -> int:
-
- package_set, parsing_probs = create_package_set_from_installed()
- missing, conflicting = check_package_set(package_set)
-
- for project_name in missing:
- version = package_set[project_name].version
- for dependency in missing[project_name]:
- write_output(
- "%s %s requires %s, which is not installed.",
- project_name,
- version,
- dependency[0],
- )
-
- for project_name in conflicting:
- version = package_set[project_name].version
- for dep_name, dep_version, req in conflicting[project_name]:
- write_output(
- "%s %s has requirement %s, but you have %s %s.",
- project_name,
- version,
- req,
- dep_name,
- dep_version,
- )
-
- if missing or conflicting or parsing_probs:
- return ERROR
- else:
- write_output("No broken requirements found.")
- return SUCCESS
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/completion.py b/env/lib/python3.9/site-packages/pip/_internal/commands/completion.py
deleted file mode 100644
index deaa308..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/completion.py
+++ /dev/null
@@ -1,126 +0,0 @@
-import sys
-import textwrap
-from optparse import Values
-from typing import List
-
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.utils.misc import get_prog
-
-BASE_COMPLETION = """
-# pip {shell} completion start{script}# pip {shell} completion end
-"""
-
-COMPLETION_SCRIPTS = {
- "bash": """
- _pip_completion()
- {{
- COMPREPLY=( $( COMP_WORDS="${{COMP_WORDS[*]}}" \\
- COMP_CWORD=$COMP_CWORD \\
- PIP_AUTO_COMPLETE=1 $1 2>/dev/null ) )
- }}
- complete -o default -F _pip_completion {prog}
- """,
- "zsh": """
- function _pip_completion {{
- local words cword
- read -Ac words
- read -cn cword
- reply=( $( COMP_WORDS="$words[*]" \\
- COMP_CWORD=$(( cword-1 )) \\
- PIP_AUTO_COMPLETE=1 $words[1] 2>/dev/null ))
- }}
- compctl -K _pip_completion {prog}
- """,
- "fish": """
- function __fish_complete_pip
- set -lx COMP_WORDS (commandline -o) ""
- set -lx COMP_CWORD ( \\
- math (contains -i -- (commandline -t) $COMP_WORDS)-1 \\
- )
- set -lx PIP_AUTO_COMPLETE 1
- string split \\ -- (eval $COMP_WORDS[1])
- end
- complete -fa "(__fish_complete_pip)" -c {prog}
- """,
- "powershell": """
- if ((Test-Path Function:\\TabExpansion) -and -not `
- (Test-Path Function:\\_pip_completeBackup)) {{
- Rename-Item Function:\\TabExpansion _pip_completeBackup
- }}
- function TabExpansion($line, $lastWord) {{
- $lastBlock = [regex]::Split($line, '[|;]')[-1].TrimStart()
- if ($lastBlock.StartsWith("{prog} ")) {{
- $Env:COMP_WORDS=$lastBlock
- $Env:COMP_CWORD=$lastBlock.Split().Length - 1
- $Env:PIP_AUTO_COMPLETE=1
- (& {prog}).Split()
- Remove-Item Env:COMP_WORDS
- Remove-Item Env:COMP_CWORD
- Remove-Item Env:PIP_AUTO_COMPLETE
- }}
- elseif (Test-Path Function:\\_pip_completeBackup) {{
- # Fall back on existing tab expansion
- _pip_completeBackup $line $lastWord
- }}
- }}
- """,
-}
-
-
-class CompletionCommand(Command):
- """A helper command to be used for command completion."""
-
- ignore_require_venv = True
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "--bash",
- "-b",
- action="store_const",
- const="bash",
- dest="shell",
- help="Emit completion code for bash",
- )
- self.cmd_opts.add_option(
- "--zsh",
- "-z",
- action="store_const",
- const="zsh",
- dest="shell",
- help="Emit completion code for zsh",
- )
- self.cmd_opts.add_option(
- "--fish",
- "-f",
- action="store_const",
- const="fish",
- dest="shell",
- help="Emit completion code for fish",
- )
- self.cmd_opts.add_option(
- "--powershell",
- "-p",
- action="store_const",
- const="powershell",
- dest="shell",
- help="Emit completion code for powershell",
- )
-
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- """Prints the completion code of the given shell"""
- shells = COMPLETION_SCRIPTS.keys()
- shell_options = ["--" + shell for shell in sorted(shells)]
- if options.shell in shells:
- script = textwrap.dedent(
- COMPLETION_SCRIPTS.get(options.shell, "").format(prog=get_prog())
- )
- print(BASE_COMPLETION.format(script=script, shell=options.shell))
- return SUCCESS
- else:
- sys.stderr.write(
- "ERROR: You must pass {}\n".format(" or ".join(shell_options))
- )
- return SUCCESS
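
All of the scripts above speak one protocol: re-invoke the program with `PIP_AUTO_COMPLETE=1` and the current command line in `COMP_WORDS`/`COMP_CWORD`, then read candidate completions from stdout. A sketch of driving that protocol by hand, assuming `pip` is on `PATH`:

```python
import os
import subprocess

env = dict(
    os.environ,
    COMP_WORDS="pip ins",   # the command line so far
    COMP_CWORD="1",         # index of the word being completed
    PIP_AUTO_COMPLETE="1",  # switch pip into completion mode
)
out = subprocess.run(["pip"], env=env, capture_output=True, text=True).stdout
print(out.split())  # expected to include 'install'
```
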
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/configuration.py b/env/lib/python3.9/site-packages/pip/_internal/commands/configuration.py
deleted file mode 100644
index e383732..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/configuration.py
+++ /dev/null
@@ -1,276 +0,0 @@
-import logging
-import os
-import subprocess
-from optparse import Values
-from typing import Any, List, Optional
-
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import ERROR, SUCCESS
-from pip._internal.configuration import (
- Configuration,
- Kind,
- get_configuration_files,
- kinds,
-)
-from pip._internal.exceptions import PipError
-from pip._internal.utils.logging import indent_log
-from pip._internal.utils.misc import get_prog, write_output
-
-logger = logging.getLogger(__name__)
-
-
-class ConfigurationCommand(Command):
- """
- Manage local and global configuration.
-
- Subcommands:
-
- - list: List the active configuration (or from the file specified)
- - edit: Edit the configuration file in an editor
- - get: Get the value associated with command.option
- - set: Set the command.option=value
- - unset: Unset the value associated with command.option
- - debug: List the configuration files and values defined under them
-
- Configuration keys should be dot separated command and option name,
- with the special prefix "global" affecting any command. For example,
- "pip config set global.index-url https://example.org/" would configure
- the index url for all commands, but "pip config set download.timeout 10"
- would configure a 10 second timeout only for "pip download" commands.
-
- If none of --user, --global and --site are passed, a virtual
- environment configuration file is used if one is active and the file
- exists. Otherwise, all modifications happen to the user file by
- default.
- """
-
- ignore_require_venv = True
- usage = """
- %prog [<file-option>] list
- %prog [<file-option>] [--editor <editor-path>] edit
-
- %prog [<file-option>] get command.option
- %prog [<file-option>] set command.option value
- %prog [<file-option>] unset command.option
- %prog [<file-option>] debug
- """
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "--editor",
- dest="editor",
- action="store",
- default=None,
- help=(
- "Editor to use to edit the file. Uses VISUAL or EDITOR "
- "environment variables if not provided."
- ),
- )
-
- self.cmd_opts.add_option(
- "--global",
- dest="global_file",
- action="store_true",
- default=False,
- help="Use the system-wide configuration file only",
- )
-
- self.cmd_opts.add_option(
- "--user",
- dest="user_file",
- action="store_true",
- default=False,
- help="Use the user configuration file only",
- )
-
- self.cmd_opts.add_option(
- "--site",
- dest="site_file",
- action="store_true",
- default=False,
- help="Use the current environment configuration file only",
- )
-
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- handlers = {
- "list": self.list_values,
- "edit": self.open_in_editor,
- "get": self.get_name,
- "set": self.set_name_value,
- "unset": self.unset_name,
- "debug": self.list_config_values,
- }
-
- # Determine action
- if not args or args[0] not in handlers:
- logger.error(
- "Need an action (%s) to perform.",
- ", ".join(sorted(handlers)),
- )
- return ERROR
-
- action = args[0]
-
- # Determine which configuration files are to be loaded
- # Depends on whether the command is modifying.
- try:
- load_only = self._determine_file(
- options, need_value=(action in ["get", "set", "unset", "edit"])
- )
- except PipError as e:
- logger.error(e.args[0])
- return ERROR
-
- # Load a new configuration
- self.configuration = Configuration(
- isolated=options.isolated_mode, load_only=load_only
- )
- self.configuration.load()
-
- # Error handling happens here, not in the action-handlers.
- try:
- handlers[action](options, args[1:])
- except PipError as e:
- logger.error(e.args[0])
- return ERROR
-
- return SUCCESS
-
- def _determine_file(self, options: Values, need_value: bool) -> Optional[Kind]:
- file_options = [
- key
- for key, value in (
- (kinds.USER, options.user_file),
- (kinds.GLOBAL, options.global_file),
- (kinds.SITE, options.site_file),
- )
- if value
- ]
-
- if not file_options:
- if not need_value:
- return None
- # Default to user, unless there's a site file.
- elif any(
- os.path.exists(site_config_file)
- for site_config_file in get_configuration_files()[kinds.SITE]
- ):
- return kinds.SITE
- else:
- return kinds.USER
- elif len(file_options) == 1:
- return file_options[0]
-
- raise PipError(
- "Need exactly one file to operate upon "
- "(--user, --site, --global) to perform."
- )
-
- def list_values(self, options: Values, args: List[str]) -> None:
- self._get_n_args(args, "list", n=0)
-
- for key, value in sorted(self.configuration.items()):
- write_output("%s=%r", key, value)
-
- def get_name(self, options: Values, args: List[str]) -> None:
- key = self._get_n_args(args, "get [name]", n=1)
- value = self.configuration.get_value(key)
-
- write_output("%s", value)
-
- def set_name_value(self, options: Values, args: List[str]) -> None:
- key, value = self._get_n_args(args, "set [name] [value]", n=2)
- self.configuration.set_value(key, value)
-
- self._save_configuration()
-
- def unset_name(self, options: Values, args: List[str]) -> None:
- key = self._get_n_args(args, "unset [name]", n=1)
- self.configuration.unset_value(key)
-
- self._save_configuration()
-
- def list_config_values(self, options: Values, args: List[str]) -> None:
- """List config key-value pairs across different config files"""
- self._get_n_args(args, "debug", n=0)
-
- self.print_env_var_values()
- # Iterate over config files and print if they exist, and the
- # key-value pairs present in them if they do
- for variant, files in sorted(self.configuration.iter_config_files()):
- write_output("%s:", variant)
- for fname in files:
- with indent_log():
- file_exists = os.path.exists(fname)
- write_output("%s, exists: %r", fname, file_exists)
- if file_exists:
- self.print_config_file_values(variant)
-
- def print_config_file_values(self, variant: Kind) -> None:
- """Get key-value pairs from the file of a variant"""
- for name, value in self.configuration.get_values_in_config(variant).items():
- with indent_log():
- write_output("%s: %s", name, value)
-
- def print_env_var_values(self) -> None:
- """Get key-values pairs present as environment variables"""
- write_output("%s:", "env_var")
- with indent_log():
- for key, value in sorted(self.configuration.get_environ_vars()):
- env_var = f"PIP_{key.upper()}"
- write_output("%s=%r", env_var, value)
-
- def open_in_editor(self, options: Values, args: List[str]) -> None:
- editor = self._determine_editor(options)
-
- fname = self.configuration.get_file_to_edit()
- if fname is None:
- raise PipError("Could not determine appropriate file.")
-
- try:
- subprocess.check_call([editor, fname])
- except FileNotFoundError as e:
- if not e.filename:
- e.filename = editor
- raise
- except subprocess.CalledProcessError as e:
- raise PipError(
- "Editor Subprocess exited with exit code {}".format(e.returncode)
- )
-
- def _get_n_args(self, args: List[str], example: str, n: int) -> Any:
- """Helper to make sure the command got the right number of arguments"""
- if len(args) != n:
- msg = (
- "Got unexpected number of arguments, expected {}. "
- '(example: "{} config {}")'
- ).format(n, get_prog(), example)
- raise PipError(msg)
-
- if n == 1:
- return args[0]
- else:
- return args
-
- def _save_configuration(self) -> None:
- # We successfully ran a modifying command. Need to save the
- # configuration.
- try:
- self.configuration.save()
- except Exception:
- logger.exception(
- "Unable to save configuration. Please report this as a bug."
- )
- raise PipError("Internal Error.")
-
- def _determine_editor(self, options: Values) -> str:
- if options.editor is not None:
- return options.editor
- elif "VISUAL" in os.environ:
- return os.environ["VISUAL"]
- elif "EDITOR" in os.environ:
- return os.environ["EDITOR"]
- else:
- raise PipError("Could not determine editor to use.")
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/debug.py b/env/lib/python3.9/site-packages/pip/_internal/commands/debug.py
deleted file mode 100644
index 084d7fa..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/debug.py
+++ /dev/null
@@ -1,203 +0,0 @@
-import locale
-import logging
-import os
-import sys
-from optparse import Values
-from types import ModuleType
-from typing import Any, Dict, List, Optional
-
-import pip._vendor
-from pip._vendor.certifi import where
-from pip._vendor.packaging.version import parse as parse_version
-
-from pip import __file__ as pip_location
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.cmdoptions import make_target_python
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.configuration import Configuration
-from pip._internal.metadata import get_environment
-from pip._internal.utils.logging import indent_log
-from pip._internal.utils.misc import get_pip_version
-
-logger = logging.getLogger(__name__)
-
-
-def show_value(name: str, value: Any) -> None:
- logger.info("%s: %s", name, value)
-
-
-def show_sys_implementation() -> None:
- logger.info("sys.implementation:")
- implementation_name = sys.implementation.name
- with indent_log():
- show_value("name", implementation_name)
-
-
-def create_vendor_txt_map() -> Dict[str, str]:
- vendor_txt_path = os.path.join(
- os.path.dirname(pip_location), "_vendor", "vendor.txt"
- )
-
- with open(vendor_txt_path) as f:
- # Purge non version specifying lines.
- # Also, remove any space prefix or suffixes (including comments).
- lines = [
- line.strip().split(" ", 1)[0] for line in f.readlines() if "==" in line
- ]
-
- # Transform into "module" -> version dict.
- return dict(line.split("==", 1) for line in lines)
-
-
-def get_module_from_module_name(module_name: str) -> ModuleType:
- # Module name can be uppercase in vendor.txt for some reason...
- module_name = module_name.lower()
- # PATCH: setuptools is actually only pkg_resources.
- if module_name == "setuptools":
- module_name = "pkg_resources"
-
- __import__(f"pip._vendor.{module_name}", globals(), locals(), level=0)
- return getattr(pip._vendor, module_name)
-
-
-def get_vendor_version_from_module(module_name: str) -> Optional[str]:
- module = get_module_from_module_name(module_name)
- version = getattr(module, "__version__", None)
-
- if not version:
- # Try to find version in debundled module info.
- assert module.__file__ is not None
- env = get_environment([os.path.dirname(module.__file__)])
- dist = env.get_distribution(module_name)
- if dist:
- version = str(dist.version)
-
- return version
-
-
-def show_actual_vendor_versions(vendor_txt_versions: Dict[str, str]) -> None:
- """Log the actual version and print extra info if there is
- a conflict or if the actual version could not be imported.
- """
- for module_name, expected_version in vendor_txt_versions.items():
- extra_message = ""
- actual_version = get_vendor_version_from_module(module_name)
- if not actual_version:
- extra_message = (
- " (Unable to locate actual module version, using"
- " vendor.txt specified version)"
- )
- actual_version = expected_version
- elif parse_version(actual_version) != parse_version(expected_version):
- extra_message = (
- " (CONFLICT: vendor.txt suggests version should"
- " be {})".format(expected_version)
- )
- logger.info("%s==%s%s", module_name, actual_version, extra_message)
-
-
-def show_vendor_versions() -> None:
- logger.info("vendored library versions:")
-
- vendor_txt_versions = create_vendor_txt_map()
- with indent_log():
- show_actual_vendor_versions(vendor_txt_versions)
-
-
-def show_tags(options: Values) -> None:
- tag_limit = 10
-
- target_python = make_target_python(options)
- tags = target_python.get_tags()
-
- # Display the target options that were explicitly provided.
- formatted_target = target_python.format_given()
- suffix = ""
- if formatted_target:
- suffix = f" (target: {formatted_target})"
-
- msg = "Compatible tags: {}{}".format(len(tags), suffix)
- logger.info(msg)
-
- if options.verbose < 1 and len(tags) > tag_limit:
- tags_limited = True
- tags = tags[:tag_limit]
- else:
- tags_limited = False
-
- with indent_log():
- for tag in tags:
- logger.info(str(tag))
-
- if tags_limited:
- msg = (
- "...\n[First {tag_limit} tags shown. Pass --verbose to show all.]"
- ).format(tag_limit=tag_limit)
- logger.info(msg)
-
-
-def ca_bundle_info(config: Configuration) -> str:
- levels = set()
- for key, _ in config.items():
- levels.add(key.split(".")[0])
-
- if not levels:
- return "Not specified"
-
- levels_that_override_global = ["install", "wheel", "download"]
- global_overriding_level = [
- level for level in levels if level in levels_that_override_global
- ]
- if not global_overriding_level:
- return "global"
-
- if "global" in levels:
- levels.remove("global")
- return ", ".join(levels)
-
-
-class DebugCommand(Command):
- """
- Display debug information.
- """
-
- usage = """
- %prog """
- ignore_require_venv = True
-
- def add_options(self) -> None:
- cmdoptions.add_target_python_options(self.cmd_opts)
- self.parser.insert_option_group(0, self.cmd_opts)
- self.parser.config.load()
-
- def run(self, options: Values, args: List[str]) -> int:
- logger.warning(
- "This command is only meant for debugging. "
- "Do not use this with automation for parsing and getting these "
- "details, since the output and options of this command may "
- "change without notice."
- )
- show_value("pip version", get_pip_version())
- show_value("sys.version", sys.version)
- show_value("sys.executable", sys.executable)
- show_value("sys.getdefaultencoding", sys.getdefaultencoding())
- show_value("sys.getfilesystemencoding", sys.getfilesystemencoding())
- show_value(
- "locale.getpreferredencoding",
- locale.getpreferredencoding(),
- )
- show_value("sys.platform", sys.platform)
- show_sys_implementation()
-
- show_value("'cert' config value", ca_bundle_info(self.parser.config))
- show_value("REQUESTS_CA_BUNDLE", os.environ.get("REQUESTS_CA_BUNDLE"))
- show_value("CURL_CA_BUNDLE", os.environ.get("CURL_CA_BUNDLE"))
- show_value("pip._vendor.certifi.where()", where())
- show_value("pip._vendor.DEBUNDLED", pip._vendor.DEBUNDLED)
-
- show_vendor_versions()
-
- show_tags(options)
-
- return SUCCESS
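
`create_vendor_txt_map` reduces to two passes over the pin file: keep `==` lines with comments and padding stripped, then split each pin once. The same transformation on inline data:

```python
lines = [
    "CacheControl==0.12.10",
    "  requests==2.27.1  # vendored HTTP client",
    "# no pin on this line",
]
# Keep only version-pinning lines, dropping trailing comments.
pins = [line.strip().split(" ", 1)[0] for line in lines if "==" in line]
# Split each "name==version" pin into a dict entry.
versions = dict(pin.split("==", 1) for pin in pins)
assert versions == {"CacheControl": "0.12.10", "requests": "2.27.1"}
```
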
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/download.py b/env/lib/python3.9/site-packages/pip/_internal/commands/download.py
deleted file mode 100644
index d70ce4f..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/download.py
+++ /dev/null
@@ -1,141 +0,0 @@
-import logging
-import os
-from optparse import Values
-from typing import List
-
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.cmdoptions import make_target_python
-from pip._internal.cli.req_command import RequirementCommand, with_cleanup
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.operations.build.build_tracker import get_build_tracker
-from pip._internal.utils.misc import ensure_dir, normalize_path, write_output
-from pip._internal.utils.temp_dir import TempDirectory
-
-logger = logging.getLogger(__name__)
-
-
-class DownloadCommand(RequirementCommand):
- """
- Download packages from:
-
- - PyPI (and other indexes) using requirement specifiers.
- - VCS project urls.
- - Local project directories.
- - Local or remote source archives.
-
- pip also supports downloading from "requirements files", which provide
- an easy way to specify a whole environment to be downloaded.
- """
-
- usage = """
- %prog [options] <requirement specifier> [package-index-options] ...
- %prog [options] -r <requirements file> [package-index-options] ...
- %prog [options] <vcs project url> ...
- %prog [options] <local project path> ...
- %prog [options] <archive url/path> ..."""
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(cmdoptions.constraints())
- self.cmd_opts.add_option(cmdoptions.requirements())
- self.cmd_opts.add_option(cmdoptions.no_deps())
- self.cmd_opts.add_option(cmdoptions.global_options())
- self.cmd_opts.add_option(cmdoptions.no_binary())
- self.cmd_opts.add_option(cmdoptions.only_binary())
- self.cmd_opts.add_option(cmdoptions.prefer_binary())
- self.cmd_opts.add_option(cmdoptions.src())
- self.cmd_opts.add_option(cmdoptions.pre())
- self.cmd_opts.add_option(cmdoptions.require_hashes())
- self.cmd_opts.add_option(cmdoptions.progress_bar())
- self.cmd_opts.add_option(cmdoptions.no_build_isolation())
- self.cmd_opts.add_option(cmdoptions.use_pep517())
- self.cmd_opts.add_option(cmdoptions.no_use_pep517())
- self.cmd_opts.add_option(cmdoptions.check_build_deps())
- self.cmd_opts.add_option(cmdoptions.ignore_requires_python())
-
- self.cmd_opts.add_option(
- "-d",
- "--dest",
- "--destination-dir",
- "--destination-directory",
- dest="download_dir",
- metavar="dir",
- default=os.curdir,
- help="Download packages into .",
- )
-
- cmdoptions.add_target_python_options(self.cmd_opts)
-
- index_opts = cmdoptions.make_option_group(
- cmdoptions.index_group,
- self.parser,
- )
-
- self.parser.insert_option_group(0, index_opts)
- self.parser.insert_option_group(0, self.cmd_opts)
-
- @with_cleanup
- def run(self, options: Values, args: List[str]) -> int:
-
- options.ignore_installed = True
- # editable doesn't really make sense for `pip download`, but the bowels
- # of the RequirementSet code require that property.
- options.editables = []
-
- cmdoptions.check_dist_restriction(options)
-
- options.download_dir = normalize_path(options.download_dir)
- ensure_dir(options.download_dir)
-
- session = self.get_default_session(options)
-
- target_python = make_target_python(options)
- finder = self._build_package_finder(
- options=options,
- session=session,
- target_python=target_python,
- ignore_requires_python=options.ignore_requires_python,
- )
-
- build_tracker = self.enter_context(get_build_tracker())
-
- directory = TempDirectory(
- delete=not options.no_clean,
- kind="download",
- globally_managed=True,
- )
-
- reqs = self.get_requirements(args, options, finder, session)
-
- preparer = self.make_requirement_preparer(
- temp_build_dir=directory,
- options=options,
- build_tracker=build_tracker,
- session=session,
- finder=finder,
- download_dir=options.download_dir,
- use_user_site=False,
- verbosity=self.verbosity,
- )
-
- resolver = self.make_resolver(
- preparer=preparer,
- finder=finder,
- options=options,
- ignore_requires_python=options.ignore_requires_python,
- py_version_info=options.python_version,
- )
-
- self.trace_basic_info(finder)
-
- requirement_set = resolver.resolve(reqs, check_supported_wheels=True)
-
- downloaded: List[str] = []
- for req in requirement_set.requirements.values():
- if req.satisfied_by is None:
- assert req.name is not None
- preparer.save_linked_requirement(req)
- downloaded.append(req.name)
- if downloaded:
- write_output("Successfully downloaded %s", " ".join(downloaded))
-
- return SUCCESS
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/freeze.py b/env/lib/python3.9/site-packages/pip/_internal/commands/freeze.py
deleted file mode 100644
index 5fa6d39..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/freeze.py
+++ /dev/null
@@ -1,97 +0,0 @@
-import sys
-from optparse import Values
-from typing import List
-
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.operations.freeze import freeze
-from pip._internal.utils.compat import stdlib_pkgs
-
-DEV_PKGS = {"pip", "setuptools", "distribute", "wheel"}
-
-
-class FreezeCommand(Command):
- """
- Output installed packages in requirements format.
-
- packages are listed in a case-insensitive sorted order.
- """
-
- usage = """
- %prog [options]"""
- log_streams = ("ext://sys.stderr", "ext://sys.stderr")
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "-r",
- "--requirement",
- dest="requirements",
- action="append",
- default=[],
- metavar="file",
- help=(
- "Use the order in the given requirements file and its "
- "comments when generating output. This option can be "
- "used multiple times."
- ),
- )
- self.cmd_opts.add_option(
- "-l",
- "--local",
- dest="local",
- action="store_true",
- default=False,
- help=(
- "If in a virtualenv that has global access, do not output "
- "globally-installed packages."
- ),
- )
- self.cmd_opts.add_option(
- "--user",
- dest="user",
- action="store_true",
- default=False,
- help="Only output packages installed in user-site.",
- )
- self.cmd_opts.add_option(cmdoptions.list_path())
- self.cmd_opts.add_option(
- "--all",
- dest="freeze_all",
- action="store_true",
- help=(
- "Do not skip these packages in the output:"
- " {}".format(", ".join(DEV_PKGS))
- ),
- )
- self.cmd_opts.add_option(
- "--exclude-editable",
- dest="exclude_editable",
- action="store_true",
- help="Exclude editable package from output.",
- )
- self.cmd_opts.add_option(cmdoptions.list_exclude())
-
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- skip = set(stdlib_pkgs)
- if not options.freeze_all:
- skip.update(DEV_PKGS)
-
- if options.excludes:
- skip.update(options.excludes)
-
- cmdoptions.check_list_path_option(options)
-
- for line in freeze(
- requirement=options.requirements,
- local_only=options.local,
- user_only=options.user,
- paths=options.path,
- isolated=options.isolated_mode,
- skip=skip,
- exclude_editable=options.exclude_editable,
- ):
- sys.stdout.write(line + "\n")
- return SUCCESS
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/hash.py b/env/lib/python3.9/site-packages/pip/_internal/commands/hash.py
deleted file mode 100644
index 042dac8..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/hash.py
+++ /dev/null
@@ -1,59 +0,0 @@
-import hashlib
-import logging
-import sys
-from optparse import Values
-from typing import List
-
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import ERROR, SUCCESS
-from pip._internal.utils.hashes import FAVORITE_HASH, STRONG_HASHES
-from pip._internal.utils.misc import read_chunks, write_output
-
-logger = logging.getLogger(__name__)
-
-
-class HashCommand(Command):
- """
- Compute a hash of a local package archive.
-
- These can be used with --hash in a requirements file to do repeatable
- installs.
- """
-
- usage = "%prog [options] ..."
- ignore_require_venv = True
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "-a",
- "--algorithm",
- dest="algorithm",
- choices=STRONG_HASHES,
- action="store",
- default=FAVORITE_HASH,
- help="The hash algorithm to use: one of {}".format(
- ", ".join(STRONG_HASHES)
- ),
- )
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- if not args:
- self.parser.print_usage(sys.stderr)
- return ERROR
-
- algorithm = options.algorithm
- for path in args:
- write_output(
- "%s:\n--hash=%s:%s", path, algorithm, _hash_of_file(path, algorithm)
- )
- return SUCCESS
-
-
-def _hash_of_file(path: str, algorithm: str) -> str:
- """Return the hash digest of a file."""
- with open(path, "rb") as archive:
- hash = hashlib.new(algorithm)
- for chunk in read_chunks(archive):
- hash.update(chunk)
- return hash.hexdigest()
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/help.py b/env/lib/python3.9/site-packages/pip/_internal/commands/help.py
deleted file mode 100644
index 6206631..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/help.py
+++ /dev/null
@@ -1,41 +0,0 @@
-from optparse import Values
-from typing import List
-
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.exceptions import CommandError
-
-
-class HelpCommand(Command):
- """Show help for commands"""
-
- usage = """
- %prog """
- ignore_require_venv = True
-
- def run(self, options: Values, args: List[str]) -> int:
- from pip._internal.commands import (
- commands_dict,
- create_command,
- get_similar_commands,
- )
-
- try:
- # 'pip help' with no args is handled by pip.__init__.parseopt()
- cmd_name = args[0] # the command we need help for
- except IndexError:
- return SUCCESS
-
- if cmd_name not in commands_dict:
- guess = get_similar_commands(cmd_name)
-
- msg = [f'unknown command "{cmd_name}"']
- if guess:
- msg.append(f'maybe you meant "{guess}"')
-
- raise CommandError(" - ".join(msg))
-
- command = create_command(cmd_name)
- command.parser.print_help()
-
- return SUCCESS
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/index.py b/env/lib/python3.9/site-packages/pip/_internal/commands/index.py
deleted file mode 100644
index 9d8aae3..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/index.py
+++ /dev/null
@@ -1,139 +0,0 @@
-import logging
-from optparse import Values
-from typing import Any, Iterable, List, Optional, Union
-
-from pip._vendor.packaging.version import LegacyVersion, Version
-
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.req_command import IndexGroupCommand
-from pip._internal.cli.status_codes import ERROR, SUCCESS
-from pip._internal.commands.search import print_dist_installation_info
-from pip._internal.exceptions import CommandError, DistributionNotFound, PipError
-from pip._internal.index.collector import LinkCollector
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.models.selection_prefs import SelectionPreferences
-from pip._internal.models.target_python import TargetPython
-from pip._internal.network.session import PipSession
-from pip._internal.utils.misc import write_output
-
-logger = logging.getLogger(__name__)
-
-
-class IndexCommand(IndexGroupCommand):
- """
- Inspect information available from package indexes.
- """
-
- usage = """
- %prog versions <package>
- """
-
- def add_options(self) -> None:
- cmdoptions.add_target_python_options(self.cmd_opts)
-
- self.cmd_opts.add_option(cmdoptions.ignore_requires_python())
- self.cmd_opts.add_option(cmdoptions.pre())
- self.cmd_opts.add_option(cmdoptions.no_binary())
- self.cmd_opts.add_option(cmdoptions.only_binary())
-
- index_opts = cmdoptions.make_option_group(
- cmdoptions.index_group,
- self.parser,
- )
-
- self.parser.insert_option_group(0, index_opts)
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- handlers = {
- "versions": self.get_available_package_versions,
- }
-
- logger.warning(
- "pip index is currently an experimental command. "
- "It may be removed/changed in a future release "
- "without prior warning."
- )
-
- # Determine action
- if not args or args[0] not in handlers:
- logger.error(
- "Need an action (%s) to perform.",
- ", ".join(sorted(handlers)),
- )
- return ERROR
-
- action = args[0]
-
- # Error handling happens here, not in the action-handlers.
- try:
- handlers[action](options, args[1:])
- except PipError as e:
- logger.error(e.args[0])
- return ERROR
-
- return SUCCESS
-
- def _build_package_finder(
- self,
- options: Values,
- session: PipSession,
- target_python: Optional[TargetPython] = None,
- ignore_requires_python: Optional[bool] = None,
- ) -> PackageFinder:
- """
- Create a package finder appropriate to the index command.
- """
- link_collector = LinkCollector.create(session, options=options)
-
- # Pass allow_yanked=False to ignore yanked versions.
- selection_prefs = SelectionPreferences(
- allow_yanked=False,
- allow_all_prereleases=options.pre,
- ignore_requires_python=ignore_requires_python,
- )
-
- return PackageFinder.create(
- link_collector=link_collector,
- selection_prefs=selection_prefs,
- target_python=target_python,
- use_deprecated_html5lib="html5lib" in options.deprecated_features_enabled,
- )
-
- def get_available_package_versions(self, options: Values, args: List[Any]) -> None:
- if len(args) != 1:
- raise CommandError("You need to specify exactly one argument")
-
- target_python = cmdoptions.make_target_python(options)
- query = args[0]
-
- with self._build_session(options) as session:
- finder = self._build_package_finder(
- options=options,
- session=session,
- target_python=target_python,
- ignore_requires_python=options.ignore_requires_python,
- )
-
- versions: Iterable[Union[LegacyVersion, Version]] = (
- candidate.version for candidate in finder.find_all_candidates(query)
- )
-
- if not options.pre:
- # Remove prereleases
- versions = (
- version for version in versions if not version.is_prerelease
- )
- versions = set(versions)
-
- if not versions:
- raise DistributionNotFound(
- "No matching distribution found for {}".format(query)
- )
-
- formatted_versions = [str(ver) for ver in sorted(versions, reverse=True)]
- latest = formatted_versions[0]
-
- write_output("{} ({})".format(query, latest))
- write_output("Available versions: {}".format(", ".join(formatted_versions)))
- print_dist_installation_info(query, latest)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/install.py b/env/lib/python3.9/site-packages/pip/_internal/commands/install.py
deleted file mode 100644
index 3634ea0..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/install.py
+++ /dev/null
@@ -1,773 +0,0 @@
-import errno
-import operator
-import os
-import shutil
-import site
-from optparse import SUPPRESS_HELP, Values
-from typing import Iterable, List, Optional
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.cache import WheelCache
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.cmdoptions import make_target_python
-from pip._internal.cli.req_command import (
- RequirementCommand,
- warn_if_run_as_root,
- with_cleanup,
-)
-from pip._internal.cli.status_codes import ERROR, SUCCESS
-from pip._internal.exceptions import CommandError, InstallationError
-from pip._internal.locations import get_scheme
-from pip._internal.metadata import get_environment
-from pip._internal.models.format_control import FormatControl
-from pip._internal.operations.build.build_tracker import get_build_tracker
-from pip._internal.operations.check import ConflictDetails, check_install_conflicts
-from pip._internal.req import install_given_reqs
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.utils.compat import WINDOWS
-from pip._internal.utils.distutils_args import parse_distutils_args
-from pip._internal.utils.filesystem import test_writable_dir
-from pip._internal.utils.logging import getLogger
-from pip._internal.utils.misc import (
- ensure_dir,
- get_pip_version,
- protect_pip_from_modification_on_windows,
- write_output,
-)
-from pip._internal.utils.temp_dir import TempDirectory
-from pip._internal.utils.virtualenv import (
- running_under_virtualenv,
- virtualenv_no_global,
-)
-from pip._internal.wheel_builder import (
- BinaryAllowedPredicate,
- build,
- should_build_for_install_command,
-)
-
-logger = getLogger(__name__)
-
-
-def get_check_binary_allowed(format_control: FormatControl) -> BinaryAllowedPredicate:
- def check_binary_allowed(req: InstallRequirement) -> bool:
- canonical_name = canonicalize_name(req.name or "")
- allowed_formats = format_control.get_allowed_formats(canonical_name)
- return "binary" in allowed_formats
-
- return check_binary_allowed
-
-
-class InstallCommand(RequirementCommand):
- """
- Install packages from:
-
- - PyPI (and other indexes) using requirement specifiers.
- - VCS project urls.
- - Local project directories.
- - Local or remote source archives.
-
- pip also supports installing from "requirements files", which provide
- an easy way to specify a whole environment to be installed.
- """
-
- usage = """
- %prog [options] <requirement specifier> [package-index-options] ...
- %prog [options] -r <requirements file> [package-index-options] ...
- %prog [options] [-e] <vcs project url> ...
- %prog [options] [-e] <local project path> ...
- %prog [options] <archive url/path> ..."""
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(cmdoptions.requirements())
- self.cmd_opts.add_option(cmdoptions.constraints())
- self.cmd_opts.add_option(cmdoptions.no_deps())
- self.cmd_opts.add_option(cmdoptions.pre())
-
- self.cmd_opts.add_option(cmdoptions.editable())
- self.cmd_opts.add_option(
- "-t",
- "--target",
- dest="target_dir",
- metavar="dir",
- default=None,
- help=(
- "Install packages into . "
- "By default this will not replace existing files/folders in "
- ". Use --upgrade to replace existing packages in "
- "with new versions."
- ),
- )
- cmdoptions.add_target_python_options(self.cmd_opts)
-
- self.cmd_opts.add_option(
- "--user",
- dest="use_user_site",
- action="store_true",
- help=(
- "Install to the Python user install directory for your "
- "platform. Typically ~/.local/, or %APPDATA%\\Python on "
- "Windows. (See the Python documentation for site.USER_BASE "
- "for full details.)"
- ),
- )
- self.cmd_opts.add_option(
- "--no-user",
- dest="use_user_site",
- action="store_false",
- help=SUPPRESS_HELP,
- )
- self.cmd_opts.add_option(
- "--root",
- dest="root_path",
- metavar="dir",
- default=None,
- help="Install everything relative to this alternate root directory.",
- )
- self.cmd_opts.add_option(
- "--prefix",
- dest="prefix_path",
- metavar="dir",
- default=None,
- help=(
- "Installation prefix where lib, bin and other top-level "
- "folders are placed"
- ),
- )
-
- self.cmd_opts.add_option(cmdoptions.src())
-
- self.cmd_opts.add_option(
- "-U",
- "--upgrade",
- dest="upgrade",
- action="store_true",
- help=(
- "Upgrade all specified packages to the newest available "
- "version. The handling of dependencies depends on the "
- "upgrade-strategy used."
- ),
- )
-
- self.cmd_opts.add_option(
- "--upgrade-strategy",
- dest="upgrade_strategy",
- default="only-if-needed",
- choices=["only-if-needed", "eager"],
- help=(
- "Determines how dependency upgrading should be handled "
- "[default: %default]. "
- '"eager" - dependencies are upgraded regardless of '
- "whether the currently installed version satisfies the "
- "requirements of the upgraded package(s). "
- '"only-if-needed" - are upgraded only when they do not '
- "satisfy the requirements of the upgraded package(s)."
- ),
- )
-
- self.cmd_opts.add_option(
- "--force-reinstall",
- dest="force_reinstall",
- action="store_true",
- help="Reinstall all packages even if they are already up-to-date.",
- )
-
- self.cmd_opts.add_option(
- "-I",
- "--ignore-installed",
- dest="ignore_installed",
- action="store_true",
- help=(
- "Ignore the installed packages, overwriting them. "
- "This can break your system if the existing package "
- "is of a different version or was installed "
- "with a different package manager!"
- ),
- )
-
- self.cmd_opts.add_option(cmdoptions.ignore_requires_python())
- self.cmd_opts.add_option(cmdoptions.no_build_isolation())
- self.cmd_opts.add_option(cmdoptions.use_pep517())
- self.cmd_opts.add_option(cmdoptions.no_use_pep517())
- self.cmd_opts.add_option(cmdoptions.check_build_deps())
-
- self.cmd_opts.add_option(cmdoptions.config_settings())
- self.cmd_opts.add_option(cmdoptions.install_options())
- self.cmd_opts.add_option(cmdoptions.global_options())
-
- self.cmd_opts.add_option(
- "--compile",
- action="store_true",
- dest="compile",
- default=True,
- help="Compile Python source files to bytecode",
- )
-
- self.cmd_opts.add_option(
- "--no-compile",
- action="store_false",
- dest="compile",
- help="Do not compile Python source files to bytecode",
- )
-
- self.cmd_opts.add_option(
- "--no-warn-script-location",
- action="store_false",
- dest="warn_script_location",
- default=True,
- help="Do not warn when installing scripts outside PATH",
- )
- self.cmd_opts.add_option(
- "--no-warn-conflicts",
- action="store_false",
- dest="warn_about_conflicts",
- default=True,
- help="Do not warn about broken dependencies",
- )
- self.cmd_opts.add_option(cmdoptions.no_binary())
- self.cmd_opts.add_option(cmdoptions.only_binary())
- self.cmd_opts.add_option(cmdoptions.prefer_binary())
- self.cmd_opts.add_option(cmdoptions.require_hashes())
- self.cmd_opts.add_option(cmdoptions.progress_bar())
- self.cmd_opts.add_option(cmdoptions.root_user_action())
-
- index_opts = cmdoptions.make_option_group(
- cmdoptions.index_group,
- self.parser,
- )
-
- self.parser.insert_option_group(0, index_opts)
- self.parser.insert_option_group(0, self.cmd_opts)
-
- @with_cleanup
- def run(self, options: Values, args: List[str]) -> int:
- if options.use_user_site and options.target_dir is not None:
- raise CommandError("Can not combine '--user' and '--target'")
-
- cmdoptions.check_install_build_global(options)
- upgrade_strategy = "to-satisfy-only"
- if options.upgrade:
- upgrade_strategy = options.upgrade_strategy
-
- cmdoptions.check_dist_restriction(options, check_target=True)
-
- install_options = options.install_options or []
-
- logger.verbose("Using %s", get_pip_version())
- options.use_user_site = decide_user_install(
- options.use_user_site,
- prefix_path=options.prefix_path,
- target_dir=options.target_dir,
- root_path=options.root_path,
- isolated_mode=options.isolated_mode,
- )
-
- target_temp_dir: Optional[TempDirectory] = None
- target_temp_dir_path: Optional[str] = None
- if options.target_dir:
- options.ignore_installed = True
- options.target_dir = os.path.abspath(options.target_dir)
- if (
- # fmt: off
- os.path.exists(options.target_dir) and
- not os.path.isdir(options.target_dir)
- # fmt: on
- ):
- raise CommandError(
- "Target path exists but is not a directory, will not continue."
- )
-
- # Create a target directory for using with the target option
- target_temp_dir = TempDirectory(kind="target")
- target_temp_dir_path = target_temp_dir.path
- self.enter_context(target_temp_dir)
-
- global_options = options.global_options or []
-
- session = self.get_default_session(options)
-
- target_python = make_target_python(options)
- finder = self._build_package_finder(
- options=options,
- session=session,
- target_python=target_python,
- ignore_requires_python=options.ignore_requires_python,
- )
- wheel_cache = WheelCache(options.cache_dir, options.format_control)
-
- build_tracker = self.enter_context(get_build_tracker())
-
- directory = TempDirectory(
- delete=not options.no_clean,
- kind="install",
- globally_managed=True,
- )
-
- try:
- reqs = self.get_requirements(args, options, finder, session)
-
- # Only when installing is it permitted to use PEP 660.
- # In other circumstances (pip wheel, pip download) we generate
- # regular (i.e. non editable) metadata and wheels.
- for req in reqs:
- req.permit_editable_wheels = True
-
- reject_location_related_install_options(reqs, options.install_options)
-
- preparer = self.make_requirement_preparer(
- temp_build_dir=directory,
- options=options,
- build_tracker=build_tracker,
- session=session,
- finder=finder,
- use_user_site=options.use_user_site,
- verbosity=self.verbosity,
- )
- resolver = self.make_resolver(
- preparer=preparer,
- finder=finder,
- options=options,
- wheel_cache=wheel_cache,
- use_user_site=options.use_user_site,
- ignore_installed=options.ignore_installed,
- ignore_requires_python=options.ignore_requires_python,
- force_reinstall=options.force_reinstall,
- upgrade_strategy=upgrade_strategy,
- use_pep517=options.use_pep517,
- )
-
- self.trace_basic_info(finder)
-
- requirement_set = resolver.resolve(
- reqs, check_supported_wheels=not options.target_dir
- )
-
- try:
- pip_req = requirement_set.get_requirement("pip")
- except KeyError:
- modifying_pip = False
- else:
- # If we're not replacing an already installed pip,
- # we're not modifying it.
- modifying_pip = pip_req.satisfied_by is None
- protect_pip_from_modification_on_windows(modifying_pip=modifying_pip)
-
- check_binary_allowed = get_check_binary_allowed(finder.format_control)
-
- reqs_to_build = [
- r
- for r in requirement_set.requirements.values()
- if should_build_for_install_command(r, check_binary_allowed)
- ]
-
- _, build_failures = build(
- reqs_to_build,
- wheel_cache=wheel_cache,
- verify=True,
- build_options=[],
- global_options=[],
- )
-
- # If we're using PEP 517, we cannot do a legacy setup.py install
- # so we fail here.
- pep517_build_failure_names: List[str] = [
- r.name for r in build_failures if r.use_pep517 # type: ignore
- ]
- if pep517_build_failure_names:
- raise InstallationError(
- "Could not build wheels for {}, which is required to "
- "install pyproject.toml-based projects".format(
- ", ".join(pep517_build_failure_names)
- )
- )
-
- # For now, we just warn about failures building legacy
- # requirements, as we'll fall through to a setup.py install for
- # those.
- for r in build_failures:
- if not r.use_pep517:
- r.legacy_install_reason = 8368
-
- to_install = resolver.get_installation_order(requirement_set)
-
- # Check for conflicts in the package set we're installing.
- conflicts: Optional[ConflictDetails] = None
- should_warn_about_conflicts = (
- not options.ignore_dependencies and options.warn_about_conflicts
- )
- if should_warn_about_conflicts:
- conflicts = self._determine_conflicts(to_install)
-
- # Don't warn about script install locations if
- # --target or --prefix has been specified
- warn_script_location = options.warn_script_location
- if options.target_dir or options.prefix_path:
- warn_script_location = False
-
- installed = install_given_reqs(
- to_install,
- install_options,
- global_options,
- root=options.root_path,
- home=target_temp_dir_path,
- prefix=options.prefix_path,
- warn_script_location=warn_script_location,
- use_user_site=options.use_user_site,
- pycompile=options.compile,
- )
-
- lib_locations = get_lib_location_guesses(
- user=options.use_user_site,
- home=target_temp_dir_path,
- root=options.root_path,
- prefix=options.prefix_path,
- isolated=options.isolated_mode,
- )
- env = get_environment(lib_locations)
-
- installed.sort(key=operator.attrgetter("name"))
- items = []
- for result in installed:
- item = result.name
- try:
- installed_dist = env.get_distribution(item)
- if installed_dist is not None:
- item = f"{item}-{installed_dist.version}"
- except Exception:
- pass
- items.append(item)
-
- if conflicts is not None:
- self._warn_about_conflicts(
- conflicts,
- resolver_variant=self.determine_resolver_variant(options),
- )
-
- installed_desc = " ".join(items)
- if installed_desc:
- write_output(
- "Successfully installed %s",
- installed_desc,
- )
- except OSError as error:
- show_traceback = self.verbosity >= 1
-
- message = create_os_error_message(
- error,
- show_traceback,
- options.use_user_site,
- )
- logger.error(message, exc_info=show_traceback) # noqa
-
- return ERROR
-
- if options.target_dir:
- assert target_temp_dir
- self._handle_target_dir(
- options.target_dir, target_temp_dir, options.upgrade
- )
- if options.root_user_action == "warn":
- warn_if_run_as_root()
- return SUCCESS
-
- def _handle_target_dir(
- self, target_dir: str, target_temp_dir: TempDirectory, upgrade: bool
- ) -> None:
- ensure_dir(target_dir)
-
- # Checking both purelib and platlib directories for installed
- # packages to be moved to target directory
- lib_dir_list = []
-
- # Checking both purelib and platlib directories for installed
- # packages to be moved to target directory
- scheme = get_scheme("", home=target_temp_dir.path)
- purelib_dir = scheme.purelib
- platlib_dir = scheme.platlib
- data_dir = scheme.data
-
- if os.path.exists(purelib_dir):
- lib_dir_list.append(purelib_dir)
- if os.path.exists(platlib_dir) and platlib_dir != purelib_dir:
- lib_dir_list.append(platlib_dir)
- if os.path.exists(data_dir):
- lib_dir_list.append(data_dir)
-
- for lib_dir in lib_dir_list:
- for item in os.listdir(lib_dir):
- if lib_dir == data_dir:
- ddir = os.path.join(data_dir, item)
- if any(s.startswith(ddir) for s in lib_dir_list[:-1]):
- continue
- target_item_dir = os.path.join(target_dir, item)
- if os.path.exists(target_item_dir):
- if not upgrade:
- logger.warning(
- "Target directory %s already exists. Specify "
- "--upgrade to force replacement.",
- target_item_dir,
- )
- continue
- if os.path.islink(target_item_dir):
- logger.warning(
- "Target directory %s already exists and is "
- "a link. pip will not automatically replace "
- "links, please remove if replacement is "
- "desired.",
- target_item_dir,
- )
- continue
- if os.path.isdir(target_item_dir):
- shutil.rmtree(target_item_dir)
- else:
- os.remove(target_item_dir)
-
- shutil.move(os.path.join(lib_dir, item), target_item_dir)
-
- def _determine_conflicts(
- self, to_install: List[InstallRequirement]
- ) -> Optional[ConflictDetails]:
- try:
- return check_install_conflicts(to_install)
- except Exception:
- logger.exception(
- "Error while checking for conflicts. Please file an issue on "
- "pip's issue tracker: https://github.com/pypa/pip/issues/new"
- )
- return None
-
- def _warn_about_conflicts(
- self, conflict_details: ConflictDetails, resolver_variant: str
- ) -> None:
- package_set, (missing, conflicting) = conflict_details
- if not missing and not conflicting:
- return
-
- parts: List[str] = []
- if resolver_variant == "legacy":
- parts.append(
- "pip's legacy dependency resolver does not consider dependency "
- "conflicts when selecting packages. This behaviour is the "
- "source of the following dependency conflicts."
- )
- else:
- assert resolver_variant == "2020-resolver"
- parts.append(
- "pip's dependency resolver does not currently take into account "
- "all the packages that are installed. This behaviour is the "
- "source of the following dependency conflicts."
- )
-
- # NOTE: There is some duplication here, with commands/check.py
- for project_name in missing:
- version = package_set[project_name][0]
- for dependency in missing[project_name]:
- message = (
- "{name} {version} requires {requirement}, "
- "which is not installed."
- ).format(
- name=project_name,
- version=version,
- requirement=dependency[1],
- )
- parts.append(message)
-
- for project_name in conflicting:
- version = package_set[project_name][0]
- for dep_name, dep_version, req in conflicting[project_name]:
- message = (
- "{name} {version} requires {requirement}, but {you} have "
- "{dep_name} {dep_version} which is incompatible."
- ).format(
- name=project_name,
- version=version,
- requirement=req,
- dep_name=dep_name,
- dep_version=dep_version,
- you=("you" if resolver_variant == "2020-resolver" else "you'll"),
- )
- parts.append(message)
-
- logger.critical("\n".join(parts))
-
-
-def get_lib_location_guesses(
- user: bool = False,
- home: Optional[str] = None,
- root: Optional[str] = None,
- isolated: bool = False,
- prefix: Optional[str] = None,
-) -> List[str]:
- scheme = get_scheme(
- "",
- user=user,
- home=home,
- root=root,
- isolated=isolated,
- prefix=prefix,
- )
- return [scheme.purelib, scheme.platlib]
-
-
-def site_packages_writable(root: Optional[str], isolated: bool) -> bool:
- return all(
- test_writable_dir(d)
- for d in set(get_lib_location_guesses(root=root, isolated=isolated))
- )
-
-
-def decide_user_install(
- use_user_site: Optional[bool],
- prefix_path: Optional[str] = None,
- target_dir: Optional[str] = None,
- root_path: Optional[str] = None,
- isolated_mode: bool = False,
-) -> bool:
- """Determine whether to do a user install based on the input options.
-
- If use_user_site is False, no additional checks are done.
- If use_user_site is True, it is checked for compatibility with other
- options.
- If use_user_site is None, the default behaviour depends on the environment,
- which is provided by the other arguments.
- """
- # In some cases (config from tox), use_user_site can be set to an integer
- # rather than a bool, which 'use_user_site is False' wouldn't catch.
- if (use_user_site is not None) and (not use_user_site):
- logger.debug("Non-user install by explicit request")
- return False
-
- if use_user_site:
- if prefix_path:
- raise CommandError(
- "Can not combine '--user' and '--prefix' as they imply "
- "different installation locations"
- )
- if virtualenv_no_global():
- raise InstallationError(
- "Can not perform a '--user' install. User site-packages "
- "are not visible in this virtualenv."
- )
- logger.debug("User install by explicit request")
- return True
-
- # If we are here, user installs have not been explicitly requested/avoided
- assert use_user_site is None
-
- # user install incompatible with --prefix/--target
- if prefix_path or target_dir:
- logger.debug("Non-user install due to --prefix or --target option")
- return False
-
- # If user installs are not enabled, choose a non-user install
- if not site.ENABLE_USER_SITE:
- logger.debug("Non-user install because user site-packages disabled")
- return False
-
- # If we have permission for a non-user install, do that,
- # otherwise do a user install.
- if site_packages_writable(root=root_path, isolated=isolated_mode):
- logger.debug("Non-user install because site-packages writeable")
- return False
-
- logger.info(
- "Defaulting to user installation because normal site-packages "
- "is not writeable"
- )
- return True
-
-
-def reject_location_related_install_options(
- requirements: List[InstallRequirement], options: Optional[List[str]]
-) -> None:
- """If any location-changing --install-option arguments were passed for
- requirements or on the command-line, then show a deprecation warning.
- """
-
- def format_options(option_names: Iterable[str]) -> List[str]:
- return ["--{}".format(name.replace("_", "-")) for name in option_names]
-
- offenders = []
-
- for requirement in requirements:
- install_options = requirement.install_options
- location_options = parse_distutils_args(install_options)
- if location_options:
- offenders.append(
- "{!r} from {}".format(
- format_options(location_options.keys()), requirement
- )
- )
-
- if options:
- location_options = parse_distutils_args(options)
- if location_options:
- offenders.append(
- "{!r} from command line".format(format_options(location_options.keys()))
- )
-
- if not offenders:
- return
-
- raise CommandError(
- "Location-changing options found in --install-option: {}."
- " This is unsupported, use pip-level options like --user,"
- " --prefix, --root, and --target instead.".format("; ".join(offenders))
- )
-
-
-def create_os_error_message(
- error: OSError, show_traceback: bool, using_user_site: bool
-) -> str:
- """Format an error message for an OSError
-
- It may occur anytime during the execution of the install command.
- """
- parts = []
-
- # Mention the error if we are not going to show a traceback
- parts.append("Could not install packages due to an OSError")
- if not show_traceback:
- parts.append(": ")
- parts.append(str(error))
- else:
- parts.append(".")
-
- # Split the error indication from a helper message (if any)
- parts[-1] += "\n"
-
- # Suggest useful actions to the user:
- # (1) using user site-packages or (2) verifying the permissions
- if error.errno == errno.EACCES:
- user_option_part = "Consider using the `--user` option"
- permissions_part = "Check the permissions"
-
- if not running_under_virtualenv() and not using_user_site:
- parts.extend(
- [
- user_option_part,
- " or ",
- permissions_part.lower(),
- ]
- )
- else:
- parts.append(permissions_part)
- parts.append(".\n")
-
- # Suggest the user to enable Long Paths if path length is
- # more than 260
- if (
- WINDOWS
- and error.errno == errno.ENOENT
- and error.filename
- and len(error.filename) > 260
- ):
- parts.append(
- "HINT: This error might have occurred since "
- "this system does not have Windows Long Path "
- "support enabled. You can find information on "
- "how to enable this at "
- "https://pip.pypa.io/warnings/enable-long-paths\n"
- )
-
- return "".join(parts).strip() + "\n"
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/list.py b/env/lib/python3.9/site-packages/pip/_internal/commands/list.py
deleted file mode 100644
index fc229ef..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/list.py
+++ /dev/null
@@ -1,361 +0,0 @@
-import json
-import logging
-from optparse import Values
-from typing import TYPE_CHECKING, Generator, List, Optional, Sequence, Tuple, cast
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.req_command import IndexGroupCommand
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.exceptions import CommandError
-from pip._internal.index.collector import LinkCollector
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.metadata import BaseDistribution, get_environment
-from pip._internal.models.selection_prefs import SelectionPreferences
-from pip._internal.network.session import PipSession
-from pip._internal.utils.compat import stdlib_pkgs
-from pip._internal.utils.misc import tabulate, write_output
-
-if TYPE_CHECKING:
- from pip._internal.metadata.base import DistributionVersion
-
- class _DistWithLatestInfo(BaseDistribution):
- """Give the distribution object a couple of extra fields.
-
- These will be populated during ``get_outdated()``. This is dirty but
- makes the rest of the code much cleaner.
- """
-
- latest_version: DistributionVersion
- latest_filetype: str
-
- _ProcessedDists = Sequence[_DistWithLatestInfo]
-
-
-logger = logging.getLogger(__name__)
-
-
-class ListCommand(IndexGroupCommand):
- """
- List installed packages, including editables.
-
- Packages are listed in a case-insensitive sorted order.
- """
-
- ignore_require_venv = True
- usage = """
- %prog [options]"""
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "-o",
- "--outdated",
- action="store_true",
- default=False,
- help="List outdated packages",
- )
- self.cmd_opts.add_option(
- "-u",
- "--uptodate",
- action="store_true",
- default=False,
- help="List uptodate packages",
- )
- self.cmd_opts.add_option(
- "-e",
- "--editable",
- action="store_true",
- default=False,
- help="List editable projects.",
- )
- self.cmd_opts.add_option(
- "-l",
- "--local",
- action="store_true",
- default=False,
- help=(
- "If in a virtualenv that has global access, do not list "
- "globally-installed packages."
- ),
- )
- self.cmd_opts.add_option(
- "--user",
- dest="user",
- action="store_true",
- default=False,
- help="Only output packages installed in user-site.",
- )
- self.cmd_opts.add_option(cmdoptions.list_path())
- self.cmd_opts.add_option(
- "--pre",
- action="store_true",
- default=False,
- help=(
- "Include pre-release and development versions. By default, "
- "pip only finds stable versions."
- ),
- )
-
- self.cmd_opts.add_option(
- "--format",
- action="store",
- dest="list_format",
- default="columns",
- choices=("columns", "freeze", "json"),
- help="Select the output format among: columns (default), freeze, or json",
- )
-
- self.cmd_opts.add_option(
- "--not-required",
- action="store_true",
- dest="not_required",
- help="List packages that are not dependencies of installed packages.",
- )
-
- self.cmd_opts.add_option(
- "--exclude-editable",
- action="store_false",
- dest="include_editable",
- help="Exclude editable package from output.",
- )
- self.cmd_opts.add_option(
- "--include-editable",
- action="store_true",
- dest="include_editable",
- help="Include editable package from output.",
- default=True,
- )
- self.cmd_opts.add_option(cmdoptions.list_exclude())
- index_opts = cmdoptions.make_option_group(cmdoptions.index_group, self.parser)
-
- self.parser.insert_option_group(0, index_opts)
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def _build_package_finder(
- self, options: Values, session: PipSession
- ) -> PackageFinder:
- """
- Create a package finder appropriate to this list command.
- """
- link_collector = LinkCollector.create(session, options=options)
-
- # Pass allow_yanked=False to ignore yanked versions.
- selection_prefs = SelectionPreferences(
- allow_yanked=False,
- allow_all_prereleases=options.pre,
- )
-
- return PackageFinder.create(
- link_collector=link_collector,
- selection_prefs=selection_prefs,
- use_deprecated_html5lib="html5lib" in options.deprecated_features_enabled,
- )
-
- def run(self, options: Values, args: List[str]) -> int:
- if options.outdated and options.uptodate:
- raise CommandError("Options --outdated and --uptodate cannot be combined.")
-
- cmdoptions.check_list_path_option(options)
-
- skip = set(stdlib_pkgs)
- if options.excludes:
- skip.update(canonicalize_name(n) for n in options.excludes)
-
- packages: "_ProcessedDists" = [
- cast("_DistWithLatestInfo", d)
- for d in get_environment(options.path).iter_installed_distributions(
- local_only=options.local,
- user_only=options.user,
- editables_only=options.editable,
- include_editables=options.include_editable,
- skip=skip,
- )
- ]
-
- # get_not_required must be called first in order to find and
- # filter out all dependencies correctly. Otherwise a package
- # can't be identified as a requirement because some parent
- # packages could be filtered out before.
- if options.not_required:
- packages = self.get_not_required(packages, options)
-
- if options.outdated:
- packages = self.get_outdated(packages, options)
- elif options.uptodate:
- packages = self.get_uptodate(packages, options)
-
- self.output_package_listing(packages, options)
- return SUCCESS
-
- def get_outdated(
- self, packages: "_ProcessedDists", options: Values
- ) -> "_ProcessedDists":
- return [
- dist
- for dist in self.iter_packages_latest_infos(packages, options)
- if dist.latest_version > dist.version
- ]
-
- def get_uptodate(
- self, packages: "_ProcessedDists", options: Values
- ) -> "_ProcessedDists":
- return [
- dist
- for dist in self.iter_packages_latest_infos(packages, options)
- if dist.latest_version == dist.version
- ]
-
- def get_not_required(
- self, packages: "_ProcessedDists", options: Values
- ) -> "_ProcessedDists":
- dep_keys = {
- canonicalize_name(dep.name)
- for dist in packages
- for dep in (dist.iter_dependencies() or ())
- }
-
- # Create a set to remove duplicate packages, and cast it to a list
- # to keep the return type consistent with get_outdated and
- # get_uptodate
- return list({pkg for pkg in packages if pkg.canonical_name not in dep_keys})
-
- def iter_packages_latest_infos(
- self, packages: "_ProcessedDists", options: Values
- ) -> Generator["_DistWithLatestInfo", None, None]:
- with self._build_session(options) as session:
- finder = self._build_package_finder(options, session)
-
- def latest_info(
- dist: "_DistWithLatestInfo",
- ) -> Optional["_DistWithLatestInfo"]:
- all_candidates = finder.find_all_candidates(dist.canonical_name)
- if not options.pre:
- # Remove prereleases
- all_candidates = [
- candidate
- for candidate in all_candidates
- if not candidate.version.is_prerelease
- ]
-
- evaluator = finder.make_candidate_evaluator(
- project_name=dist.canonical_name,
- )
- best_candidate = evaluator.sort_best_candidate(all_candidates)
- if best_candidate is None:
- return None
-
- remote_version = best_candidate.version
- if best_candidate.link.is_wheel:
- typ = "wheel"
- else:
- typ = "sdist"
- dist.latest_version = remote_version
- dist.latest_filetype = typ
- return dist
-
- for dist in map(latest_info, packages):
- if dist is not None:
- yield dist
-
- def output_package_listing(
- self, packages: "_ProcessedDists", options: Values
- ) -> None:
- packages = sorted(
- packages,
- key=lambda dist: dist.canonical_name,
- )
- if options.list_format == "columns" and packages:
- data, header = format_for_columns(packages, options)
- self.output_package_listing_columns(data, header)
- elif options.list_format == "freeze":
- for dist in packages:
- if options.verbose >= 1:
- write_output(
- "%s==%s (%s)", dist.raw_name, dist.version, dist.location
- )
- else:
- write_output("%s==%s", dist.raw_name, dist.version)
- elif options.list_format == "json":
- write_output(format_for_json(packages, options))
-
- def output_package_listing_columns(
- self, data: List[List[str]], header: List[str]
- ) -> None:
- # insert the header first: we need to know the size of column names
- if len(data) > 0:
- data.insert(0, header)
-
- pkg_strings, sizes = tabulate(data)
-
- # Create and add a separator.
- if len(data) > 0:
- pkg_strings.insert(1, " ".join(map(lambda x: "-" * x, sizes)))
-
- for val in pkg_strings:
- write_output(val)
-
-
-def format_for_columns(
- pkgs: "_ProcessedDists", options: Values
-) -> Tuple[List[List[str]], List[str]]:
- """
- Convert the package data into something usable
- by output_package_listing_columns.
- """
- header = ["Package", "Version"]
-
- running_outdated = options.outdated
- if running_outdated:
- header.extend(["Latest", "Type"])
-
- has_editables = any(x.editable for x in pkgs)
- if has_editables:
- header.append("Editable project location")
-
- if options.verbose >= 1:
- header.append("Location")
- if options.verbose >= 1:
- header.append("Installer")
-
- data = []
- for proj in pkgs:
- # if we're working on the 'outdated' list, separate out the
- # latest_version and type
- row = [proj.raw_name, str(proj.version)]
-
- if running_outdated:
- row.append(str(proj.latest_version))
- row.append(proj.latest_filetype)
-
- if has_editables:
- row.append(proj.editable_project_location or "")
-
- if options.verbose >= 1:
- row.append(proj.location or "")
- if options.verbose >= 1:
- row.append(proj.installer)
-
- data.append(row)
-
- return data, header
-
-
-def format_for_json(packages: "_ProcessedDists", options: Values) -> str:
- data = []
- for dist in packages:
- info = {
- "name": dist.raw_name,
- "version": str(dist.version),
- }
- if options.verbose >= 1:
- info["location"] = dist.location or ""
- info["installer"] = dist.installer
- if options.outdated:
- info["latest_version"] = str(dist.latest_version)
- info["latest_filetype"] = dist.latest_filetype
- editable_project_location = dist.editable_project_location
- if editable_project_location:
- info["editable_project_location"] = editable_project_location
- data.append(info)
- return json.dumps(data)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/search.py b/env/lib/python3.9/site-packages/pip/_internal/commands/search.py
deleted file mode 100644
index 03ed925..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/search.py
+++ /dev/null
@@ -1,174 +0,0 @@
-import logging
-import shutil
-import sys
-import textwrap
-import xmlrpc.client
-from collections import OrderedDict
-from optparse import Values
-from typing import TYPE_CHECKING, Dict, List, Optional
-
-from pip._vendor.packaging.version import parse as parse_version
-
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.req_command import SessionCommandMixin
-from pip._internal.cli.status_codes import NO_MATCHES_FOUND, SUCCESS
-from pip._internal.exceptions import CommandError
-from pip._internal.metadata import get_default_environment
-from pip._internal.models.index import PyPI
-from pip._internal.network.xmlrpc import PipXmlrpcTransport
-from pip._internal.utils.logging import indent_log
-from pip._internal.utils.misc import write_output
-
-if TYPE_CHECKING:
- from typing import TypedDict
-
- class TransformedHit(TypedDict):
- name: str
- summary: str
- versions: List[str]
-
-
-logger = logging.getLogger(__name__)
-
-
-class SearchCommand(Command, SessionCommandMixin):
- """Search for PyPI packages whose name or summary contains ."""
-
- usage = """
- %prog [options] """
- ignore_require_venv = True
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "-i",
- "--index",
- dest="index",
- metavar="URL",
- default=PyPI.pypi_url,
- help="Base URL of Python Package Index (default %default)",
- )
-
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- if not args:
- raise CommandError("Missing required argument (search query).")
- query = args
- pypi_hits = self.search(query, options)
- hits = transform_hits(pypi_hits)
-
- terminal_width = None
- if sys.stdout.isatty():
- terminal_width = shutil.get_terminal_size()[0]
-
- print_results(hits, terminal_width=terminal_width)
- if pypi_hits:
- return SUCCESS
- return NO_MATCHES_FOUND
-
- def search(self, query: List[str], options: Values) -> List[Dict[str, str]]:
- index_url = options.index
-
- session = self.get_default_session(options)
-
- transport = PipXmlrpcTransport(index_url, session)
- pypi = xmlrpc.client.ServerProxy(index_url, transport)
- try:
- hits = pypi.search({"name": query, "summary": query}, "or")
- except xmlrpc.client.Fault as fault:
- message = "XMLRPC request failed [code: {code}]\n{string}".format(
- code=fault.faultCode,
- string=fault.faultString,
- )
- raise CommandError(message)
- assert isinstance(hits, list)
- return hits
-
-
-def transform_hits(hits: List[Dict[str, str]]) -> List["TransformedHit"]:
- """
- The list from pypi is really a list of versions. We want a list of
- packages with the list of versions stored inline. This converts the
- list from pypi into one we can use.
- """
- packages: Dict[str, "TransformedHit"] = OrderedDict()
- for hit in hits:
- name = hit["name"]
- summary = hit["summary"]
- version = hit["version"]
-
- if name not in packages.keys():
- packages[name] = {
- "name": name,
- "summary": summary,
- "versions": [version],
- }
- else:
- packages[name]["versions"].append(version)
-
- # if this is the highest version, replace summary and score
- if version == highest_version(packages[name]["versions"]):
- packages[name]["summary"] = summary
-
- return list(packages.values())
-
-
-def print_dist_installation_info(name: str, latest: str) -> None:
- env = get_default_environment()
- dist = env.get_distribution(name)
- if dist is not None:
- with indent_log():
- if dist.version == latest:
- write_output("INSTALLED: %s (latest)", dist.version)
- else:
- write_output("INSTALLED: %s", dist.version)
- if parse_version(latest).pre:
- write_output(
- "LATEST: %s (pre-release; install"
- " with `pip install --pre`)",
- latest,
- )
- else:
- write_output("LATEST: %s", latest)
-
-
-def print_results(
- hits: List["TransformedHit"],
- name_column_width: Optional[int] = None,
- terminal_width: Optional[int] = None,
-) -> None:
- if not hits:
- return
- if name_column_width is None:
- name_column_width = (
- max(
- [
- len(hit["name"]) + len(highest_version(hit.get("versions", ["-"])))
- for hit in hits
- ]
- )
- + 4
- )
-
- for hit in hits:
- name = hit["name"]
- summary = hit["summary"] or ""
- latest = highest_version(hit.get("versions", ["-"]))
- if terminal_width is not None:
- target_width = terminal_width - name_column_width - 5
- if target_width > 10:
- # wrap and indent summary to fit terminal
- summary_lines = textwrap.wrap(summary, target_width)
- summary = ("\n" + " " * (name_column_width + 3)).join(summary_lines)
-
- name_latest = f"{name} ({latest})"
- line = f"{name_latest:{name_column_width}} - {summary}"
- try:
- write_output(line)
- print_dist_installation_info(name, latest)
- except UnicodeEncodeError:
- pass
-
-
-def highest_version(versions: List[str]) -> str:
- return max(versions, key=parse_version)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/show.py b/env/lib/python3.9/site-packages/pip/_internal/commands/show.py
deleted file mode 100644
index 212167c..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/show.py
+++ /dev/null
@@ -1,183 +0,0 @@
-import logging
-from optparse import Values
-from typing import Generator, Iterable, Iterator, List, NamedTuple, Optional
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.status_codes import ERROR, SUCCESS
-from pip._internal.metadata import BaseDistribution, get_default_environment
-from pip._internal.utils.misc import write_output
-
-logger = logging.getLogger(__name__)
-
-
-class ShowCommand(Command):
- """
- Show information about one or more installed packages.
-
- The output is in RFC-compliant mail header format.
- """
-
- usage = """
- %prog [options] <package> ..."""
- ignore_require_venv = True
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "-f",
- "--files",
- dest="files",
- action="store_true",
- default=False,
- help="Show the full list of installed files for each package.",
- )
-
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- if not args:
- logger.warning("ERROR: Please provide a package name or names.")
- return ERROR
- query = args
-
- results = search_packages_info(query)
- if not print_results(
- results, list_files=options.files, verbose=options.verbose
- ):
- return ERROR
- return SUCCESS
-
-
-class _PackageInfo(NamedTuple):
- name: str
- version: str
- location: str
- requires: List[str]
- required_by: List[str]
- installer: str
- metadata_version: str
- classifiers: List[str]
- summary: str
- homepage: str
- project_urls: List[str]
- author: str
- author_email: str
- license: str
- entry_points: List[str]
- files: Optional[List[str]]
-
-
-def search_packages_info(query: List[str]) -> Generator[_PackageInfo, None, None]:
- """
- Gather details from installed distributions. Print distribution name,
- version, location, and installed files. Installed files requires a
- pip generated 'installed-files.txt' in the distributions '.egg-info'
- directory.
- """
- env = get_default_environment()
-
- installed = {dist.canonical_name: dist for dist in env.iter_all_distributions()}
- query_names = [canonicalize_name(name) for name in query]
- missing = sorted(
- [name for name, pkg in zip(query, query_names) if pkg not in installed]
- )
- if missing:
- logger.warning("Package(s) not found: %s", ", ".join(missing))
-
- def _get_requiring_packages(current_dist: BaseDistribution) -> Iterator[str]:
- return (
- dist.metadata["Name"] or "UNKNOWN"
- for dist in installed.values()
- if current_dist.canonical_name
- in {canonicalize_name(d.name) for d in dist.iter_dependencies()}
- )
-
- for query_name in query_names:
- try:
- dist = installed[query_name]
- except KeyError:
- continue
-
- requires = sorted((req.name for req in dist.iter_dependencies()), key=str.lower)
- required_by = sorted(_get_requiring_packages(dist), key=str.lower)
-
- try:
- entry_points_text = dist.read_text("entry_points.txt")
- entry_points = entry_points_text.splitlines(keepends=False)
- except FileNotFoundError:
- entry_points = []
-
- files_iter = dist.iter_declared_entries()
- if files_iter is None:
- files: Optional[List[str]] = None
- else:
- files = sorted(files_iter)
-
- metadata = dist.metadata
-
- yield _PackageInfo(
- name=dist.raw_name,
- version=str(dist.version),
- location=dist.location or "",
- requires=requires,
- required_by=required_by,
- installer=dist.installer,
- metadata_version=dist.metadata_version or "",
- classifiers=metadata.get_all("Classifier", []),
- summary=metadata.get("Summary", ""),
- homepage=metadata.get("Home-page", ""),
- project_urls=metadata.get_all("Project-URL", []),
- author=metadata.get("Author", ""),
- author_email=metadata.get("Author-email", ""),
- license=metadata.get("License", ""),
- entry_points=entry_points,
- files=files,
- )
-
-
-def print_results(
- distributions: Iterable[_PackageInfo],
- list_files: bool,
- verbose: bool,
-) -> bool:
- """
- Print the information from installed distributions found.
- """
- results_printed = False
- for i, dist in enumerate(distributions):
- results_printed = True
- if i > 0:
- write_output("---")
-
- write_output("Name: %s", dist.name)
- write_output("Version: %s", dist.version)
- write_output("Summary: %s", dist.summary)
- write_output("Home-page: %s", dist.homepage)
- write_output("Author: %s", dist.author)
- write_output("Author-email: %s", dist.author_email)
- write_output("License: %s", dist.license)
- write_output("Location: %s", dist.location)
- write_output("Requires: %s", ", ".join(dist.requires))
- write_output("Required-by: %s", ", ".join(dist.required_by))
-
- if verbose:
- write_output("Metadata-Version: %s", dist.metadata_version)
- write_output("Installer: %s", dist.installer)
- write_output("Classifiers:")
- for classifier in dist.classifiers:
- write_output(" %s", classifier)
- write_output("Entry-points:")
- for entry in dist.entry_points:
- write_output(" %s", entry.strip())
- write_output("Project-URLs:")
- for project_url in dist.project_urls:
- write_output(" %s", project_url)
- if list_files:
- write_output("Files:")
- if dist.files is None:
- write_output("Cannot locate RECORD or installed-files.txt")
- else:
- for line in dist.files:
- write_output(" %s", line.strip())
- return results_printed
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/uninstall.py b/env/lib/python3.9/site-packages/pip/_internal/commands/uninstall.py
deleted file mode 100644
index dea8077..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/uninstall.py
+++ /dev/null
@@ -1,106 +0,0 @@
-import logging
-from optparse import Values
-from typing import List
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.base_command import Command
-from pip._internal.cli.req_command import SessionCommandMixin, warn_if_run_as_root
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.exceptions import InstallationError
-from pip._internal.req import parse_requirements
-from pip._internal.req.constructors import (
- install_req_from_line,
- install_req_from_parsed_requirement,
-)
-from pip._internal.utils.misc import protect_pip_from_modification_on_windows
-
-logger = logging.getLogger(__name__)
-
-
-class UninstallCommand(Command, SessionCommandMixin):
- """
- Uninstall packages.
-
- pip is able to uninstall most installed packages. Known exceptions are:
-
- - Pure distutils packages installed with ``python setup.py install``, which
- leave behind no metadata to determine what files were installed.
- - Script wrappers installed by ``python setup.py develop``.
- """
-
- usage = """
- %prog [options] <package> ...
- %prog [options] -r <requirements file> ..."""
-
- def add_options(self) -> None:
- self.cmd_opts.add_option(
- "-r",
- "--requirement",
- dest="requirements",
- action="append",
- default=[],
- metavar="file",
- help=(
- "Uninstall all the packages listed in the given requirements "
- "file. This option can be used multiple times."
- ),
- )
- self.cmd_opts.add_option(
- "-y",
- "--yes",
- dest="yes",
- action="store_true",
- help="Don't ask for confirmation of uninstall deletions.",
- )
- self.cmd_opts.add_option(cmdoptions.root_user_action())
- self.parser.insert_option_group(0, self.cmd_opts)
-
- def run(self, options: Values, args: List[str]) -> int:
- session = self.get_default_session(options)
-
- reqs_to_uninstall = {}
- for name in args:
- req = install_req_from_line(
- name,
- isolated=options.isolated_mode,
- )
- if req.name:
- reqs_to_uninstall[canonicalize_name(req.name)] = req
- else:
- logger.warning(
- "Invalid requirement: %r ignored -"
- " the uninstall command expects named"
- " requirements.",
- name,
- )
- for filename in options.requirements:
- for parsed_req in parse_requirements(
- filename, options=options, session=session
- ):
- req = install_req_from_parsed_requirement(
- parsed_req, isolated=options.isolated_mode
- )
- if req.name:
- reqs_to_uninstall[canonicalize_name(req.name)] = req
- if not reqs_to_uninstall:
- raise InstallationError(
- f"You must give at least one requirement to {self.name} (see "
- f'"pip help {self.name}")'
- )
-
- protect_pip_from_modification_on_windows(
- modifying_pip="pip" in reqs_to_uninstall
- )
-
- for req in reqs_to_uninstall.values():
- uninstall_pathset = req.uninstall(
- auto_confirm=options.yes,
- verbose=self.verbosity > 0,
- )
- if uninstall_pathset:
- uninstall_pathset.commit()
- if options.root_user_action == "warn":
- warn_if_run_as_root()
- return SUCCESS
diff --git a/env/lib/python3.9/site-packages/pip/_internal/commands/wheel.py b/env/lib/python3.9/site-packages/pip/_internal/commands/wheel.py
deleted file mode 100644
index 9dd6c82..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/commands/wheel.py
+++ /dev/null
@@ -1,178 +0,0 @@
-import logging
-import os
-import shutil
-from optparse import Values
-from typing import List
-
-from pip._internal.cache import WheelCache
-from pip._internal.cli import cmdoptions
-from pip._internal.cli.req_command import RequirementCommand, with_cleanup
-from pip._internal.cli.status_codes import SUCCESS
-from pip._internal.exceptions import CommandError
-from pip._internal.operations.build.build_tracker import get_build_tracker
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.utils.misc import ensure_dir, normalize_path
-from pip._internal.utils.temp_dir import TempDirectory
-from pip._internal.wheel_builder import build, should_build_for_wheel_command
-
-logger = logging.getLogger(__name__)
-
-
-class WheelCommand(RequirementCommand):
- """
- Build Wheel archives for your requirements and dependencies.
-
- Wheel is a built-package format, and offers the advantage of not
- recompiling your software during every install. For more details, see the
- wheel docs: https://wheel.readthedocs.io/en/latest/
-
- 'pip wheel' uses the build system interface as described here:
- https://pip.pypa.io/en/stable/reference/build-system/
-
- """
-
- usage = """
- %prog [options] <requirement specifier> ...
- %prog [options] -r <requirements file> ...
- %prog [options] [-e] <vcs project url> ...
- %prog [options] [-e] <local project path> ...
- %prog [options] <archive url/path> ..."""
-
- def add_options(self) -> None:
-
- self.cmd_opts.add_option(
- "-w",
- "--wheel-dir",
- dest="wheel_dir",
- metavar="dir",
- default=os.curdir,
- help=(
- "Build wheels into , where the default is the "
- "current working directory."
- ),
- )
- self.cmd_opts.add_option(cmdoptions.no_binary())
- self.cmd_opts.add_option(cmdoptions.only_binary())
- self.cmd_opts.add_option(cmdoptions.prefer_binary())
- self.cmd_opts.add_option(cmdoptions.no_build_isolation())
- self.cmd_opts.add_option(cmdoptions.use_pep517())
- self.cmd_opts.add_option(cmdoptions.no_use_pep517())
- self.cmd_opts.add_option(cmdoptions.check_build_deps())
- self.cmd_opts.add_option(cmdoptions.constraints())
- self.cmd_opts.add_option(cmdoptions.editable())
- self.cmd_opts.add_option(cmdoptions.requirements())
- self.cmd_opts.add_option(cmdoptions.src())
- self.cmd_opts.add_option(cmdoptions.ignore_requires_python())
- self.cmd_opts.add_option(cmdoptions.no_deps())
- self.cmd_opts.add_option(cmdoptions.progress_bar())
-
- self.cmd_opts.add_option(
- "--no-verify",
- dest="no_verify",
- action="store_true",
- default=False,
- help="Don't verify if built wheel is valid.",
- )
-
- self.cmd_opts.add_option(cmdoptions.config_settings())
- self.cmd_opts.add_option(cmdoptions.build_options())
- self.cmd_opts.add_option(cmdoptions.global_options())
-
- self.cmd_opts.add_option(
- "--pre",
- action="store_true",
- default=False,
- help=(
- "Include pre-release and development versions. By default, "
- "pip only finds stable versions."
- ),
- )
-
- self.cmd_opts.add_option(cmdoptions.require_hashes())
-
- index_opts = cmdoptions.make_option_group(
- cmdoptions.index_group,
- self.parser,
- )
-
- self.parser.insert_option_group(0, index_opts)
- self.parser.insert_option_group(0, self.cmd_opts)
-
- @with_cleanup
- def run(self, options: Values, args: List[str]) -> int:
- cmdoptions.check_install_build_global(options)
-
- session = self.get_default_session(options)
-
- finder = self._build_package_finder(options, session)
- wheel_cache = WheelCache(options.cache_dir, options.format_control)
-
- options.wheel_dir = normalize_path(options.wheel_dir)
- ensure_dir(options.wheel_dir)
-
- build_tracker = self.enter_context(get_build_tracker())
-
- directory = TempDirectory(
- delete=not options.no_clean,
- kind="wheel",
- globally_managed=True,
- )
-
- reqs = self.get_requirements(args, options, finder, session)
-
- preparer = self.make_requirement_preparer(
- temp_build_dir=directory,
- options=options,
- build_tracker=build_tracker,
- session=session,
- finder=finder,
- download_dir=options.wheel_dir,
- use_user_site=False,
- verbosity=self.verbosity,
- )
-
- resolver = self.make_resolver(
- preparer=preparer,
- finder=finder,
- options=options,
- wheel_cache=wheel_cache,
- ignore_requires_python=options.ignore_requires_python,
- use_pep517=options.use_pep517,
- )
-
- self.trace_basic_info(finder)
-
- requirement_set = resolver.resolve(reqs, check_supported_wheels=True)
-
- reqs_to_build: List[InstallRequirement] = []
- for req in requirement_set.requirements.values():
- if req.is_wheel:
- preparer.save_linked_requirement(req)
- elif should_build_for_wheel_command(req):
- reqs_to_build.append(req)
-
- # build wheels
- build_successes, build_failures = build(
- reqs_to_build,
- wheel_cache=wheel_cache,
- verify=(not options.no_verify),
- build_options=options.build_options or [],
- global_options=options.global_options or [],
- )
- for req in build_successes:
- assert req.link and req.link.is_wheel
- assert req.local_file_path
- # copy from cache to target directory
- try:
- shutil.copy(req.local_file_path, options.wheel_dir)
- except OSError as e:
- logger.warning(
- "Building wheel for %s failed: %s",
- req.name,
- e,
- )
- build_failures.append(req)
- if len(build_failures) != 0:
- raise CommandError("Failed to build one or more wheels")
-
- return SUCCESS
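
A detail worth calling out in the `run()` method above: a wheel that builds successfully but cannot be copied into the wheel directory is demoted to a build failure, so the final `CommandError` covers both build and delivery problems. A minimal standalone sketch of that pattern, using plain paths instead of pip's `InstallRequirement` objects (`copy_built_wheels` and its arguments are illustrative names, not pip API):

```python
import logging
import shutil
from pathlib import Path
from typing import List, Tuple

logger = logging.getLogger(__name__)


def copy_built_wheels(
    built: List[Path], wheel_dir: Path
) -> Tuple[List[Path], List[Path]]:
    """Copy built wheels into wheel_dir, demoting copy errors to failures."""
    copied: List[Path] = []
    failed: List[Path] = []
    for wheel in built:
        try:
            shutil.copy(wheel, wheel_dir)
            copied.append(wheel)
        except OSError as exc:
            # Mirrors the command above: a wheel we cannot deliver counts
            # as a failure even though the build itself succeeded.
            logger.warning("Building wheel for %s failed: %s", wheel.stem, exc)
            failed.append(wheel)
    return copied, failed
```
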
diff --git a/env/lib/python3.9/site-packages/pip/_internal/configuration.py b/env/lib/python3.9/site-packages/pip/_internal/configuration.py
deleted file mode 100644
index 8fd46c9..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/configuration.py
+++ /dev/null
@@ -1,374 +0,0 @@
-"""Configuration management setup
-
-Some terminology:
-- name
- As written in config files.
-- value
- Value associated with a name
-- key
- Name combined with its section (section.name)
-- variant
- A single word describing where the configuration key-value pair came from
-"""
-
-import configparser
-import locale
-import os
-import sys
-from typing import Any, Dict, Iterable, List, NewType, Optional, Tuple
-
-from pip._internal.exceptions import (
- ConfigurationError,
- ConfigurationFileCouldNotBeLoaded,
-)
-from pip._internal.utils import appdirs
-from pip._internal.utils.compat import WINDOWS
-from pip._internal.utils.logging import getLogger
-from pip._internal.utils.misc import ensure_dir, enum
-
-RawConfigParser = configparser.RawConfigParser # Shorthand
-Kind = NewType("Kind", str)
-
-CONFIG_BASENAME = "pip.ini" if WINDOWS else "pip.conf"
-ENV_NAMES_IGNORED = "version", "help"
-
-# The kinds of configurations there are.
-kinds = enum(
- USER="user", # User Specific
- GLOBAL="global", # System Wide
- SITE="site", # [Virtual] Environment Specific
- ENV="env", # from PIP_CONFIG_FILE
- ENV_VAR="env-var", # from Environment Variables
-)
-OVERRIDE_ORDER = kinds.GLOBAL, kinds.USER, kinds.SITE, kinds.ENV, kinds.ENV_VAR
-VALID_LOAD_ONLY = kinds.USER, kinds.GLOBAL, kinds.SITE
-
-logger = getLogger(__name__)
-
-
-# NOTE: Maybe use the optionxform attribute to normalize key names.
-def _normalize_name(name: str) -> str:
- """Make a name consistent regardless of source (environment or file)"""
- name = name.lower().replace("_", "-")
- if name.startswith("--"):
- name = name[2:] # only prefer long opts
- return name
-
-
-def _disassemble_key(name: str) -> List[str]:
- if "." not in name:
- error_message = (
- "Key does not contain dot separated section and key. "
- "Perhaps you wanted to use 'global.{}' instead?"
- ).format(name)
- raise ConfigurationError(error_message)
- return name.split(".", 1)
-
-
-def get_configuration_files() -> Dict[Kind, List[str]]:
- global_config_files = [
- os.path.join(path, CONFIG_BASENAME) for path in appdirs.site_config_dirs("pip")
- ]
-
- site_config_file = os.path.join(sys.prefix, CONFIG_BASENAME)
- legacy_config_file = os.path.join(
- os.path.expanduser("~"),
- "pip" if WINDOWS else ".pip",
- CONFIG_BASENAME,
- )
- new_config_file = os.path.join(appdirs.user_config_dir("pip"), CONFIG_BASENAME)
- return {
- kinds.GLOBAL: global_config_files,
- kinds.SITE: [site_config_file],
- kinds.USER: [legacy_config_file, new_config_file],
- }
-
-
-class Configuration:
- """Handles management of configuration.
-
- Provides an interface to accessing and managing configuration files.
-
- This class provides an API that takes "section.key-name" style
- keys and stores the value associated with it as "key-name" under the
- section "section".
-
- This allows for a clean interface wherein both the section and the
- key-name are preserved in an easy-to-manage form in the configuration
- files, and the stored data stays readable.
- """
-
- def __init__(self, isolated: bool, load_only: Optional[Kind] = None) -> None:
- super().__init__()
-
- if load_only is not None and load_only not in VALID_LOAD_ONLY:
- raise ConfigurationError(
- "Got invalid value for load_only - should be one of {}".format(
- ", ".join(map(repr, VALID_LOAD_ONLY))
- )
- )
- self.isolated = isolated
- self.load_only = load_only
-
- # Because we keep track of where we got the data from
- self._parsers: Dict[Kind, List[Tuple[str, RawConfigParser]]] = {
- variant: [] for variant in OVERRIDE_ORDER
- }
- self._config: Dict[Kind, Dict[str, Any]] = {
- variant: {} for variant in OVERRIDE_ORDER
- }
- self._modified_parsers: List[Tuple[str, RawConfigParser]] = []
-
- def load(self) -> None:
- """Loads configuration from configuration files and environment"""
- self._load_config_files()
- if not self.isolated:
- self._load_environment_vars()
-
- def get_file_to_edit(self) -> Optional[str]:
- """Returns the file with highest priority in configuration"""
- assert self.load_only is not None, "Need to be specified a file to be editing"
-
- try:
- return self._get_parser_to_modify()[0]
- except IndexError:
- return None
-
- def items(self) -> Iterable[Tuple[str, Any]]:
- """Returns key-value pairs like dict.items() representing the loaded
- configuration
- """
- return self._dictionary.items()
-
- def get_value(self, key: str) -> Any:
- """Get a value from the configuration."""
- orig_key = key
- key = _normalize_name(key)
- try:
- return self._dictionary[key]
- except KeyError:
- # disassembling triggers a more useful error message than simply
- # "No such key" in the case that the key isn't in the form command.option
- _disassemble_key(key)
- raise ConfigurationError(f"No such key - {orig_key}")
-
- def set_value(self, key: str, value: Any) -> None:
- """Modify a value in the configuration."""
- key = _normalize_name(key)
- self._ensure_have_load_only()
-
- assert self.load_only
- fname, parser = self._get_parser_to_modify()
-
- if parser is not None:
- section, name = _disassemble_key(key)
-
- # Modify the parser and the configuration
- if not parser.has_section(section):
- parser.add_section(section)
- parser.set(section, name, value)
-
- self._config[self.load_only][key] = value
- self._mark_as_modified(fname, parser)
-
- def unset_value(self, key: str) -> None:
- """Unset a value in the configuration."""
- orig_key = key
- key = _normalize_name(key)
- self._ensure_have_load_only()
-
- assert self.load_only
- if key not in self._config[self.load_only]:
- raise ConfigurationError(f"No such key - {orig_key}")
-
- fname, parser = self._get_parser_to_modify()
-
- if parser is not None:
- section, name = _disassemble_key(key)
- if not (
- parser.has_section(section) and parser.remove_option(section, name)
- ):
- # The option was not removed.
- raise ConfigurationError(
- "Fatal Internal error [id=1]. Please report as a bug."
- )
-
- # The section may be empty after the option was removed.
- if not parser.items(section):
- parser.remove_section(section)
- self._mark_as_modified(fname, parser)
-
- del self._config[self.load_only][key]
-
- def save(self) -> None:
- """Save the current in-memory state."""
- self._ensure_have_load_only()
-
- for fname, parser in self._modified_parsers:
- logger.info("Writing to %s", fname)
-
- # Ensure directory exists.
- ensure_dir(os.path.dirname(fname))
-
- with open(fname, "w") as f:
- parser.write(f)
-
- #
- # Private routines
- #
-
- def _ensure_have_load_only(self) -> None:
- if self.load_only is None:
- raise ConfigurationError("Needed a specific file to be modifying.")
- logger.debug("Will be working with %s variant only", self.load_only)
-
- @property
- def _dictionary(self) -> Dict[str, Any]:
- """A dictionary representing the loaded configuration."""
- # NOTE: Dictionaries are not populated if not loaded. So, conditionals
- # are not needed here.
- retval = {}
-
- for variant in OVERRIDE_ORDER:
- retval.update(self._config[variant])
-
- return retval
-
- def _load_config_files(self) -> None:
- """Loads configuration from configuration files"""
- config_files = dict(self.iter_config_files())
- if config_files[kinds.ENV][0:1] == [os.devnull]:
- logger.debug(
- "Skipping loading configuration files due to "
- "environment's PIP_CONFIG_FILE being os.devnull"
- )
- return
-
- for variant, files in config_files.items():
- for fname in files:
- # If there's a specific variant set in `load_only`, load only
- # that variant, not the others.
- if self.load_only is not None and variant != self.load_only:
- logger.debug("Skipping file '%s' (variant: %s)", fname, variant)
- continue
-
- parser = self._load_file(variant, fname)
-
- # Keeping track of the parsers used
- self._parsers[variant].append((fname, parser))
-
- def _load_file(self, variant: Kind, fname: str) -> RawConfigParser:
- logger.verbose("For variant '%s', will try loading '%s'", variant, fname)
- parser = self._construct_parser(fname)
-
- for section in parser.sections():
- items = parser.items(section)
- self._config[variant].update(self._normalized_keys(section, items))
-
- return parser
-
- def _construct_parser(self, fname: str) -> RawConfigParser:
- parser = configparser.RawConfigParser()
- # If there is no such file, don't bother reading it but create the
- # parser anyway, to hold the data.
- # Doing this is useful when modifying and saving files, where we don't
- # need to construct a parser.
- if os.path.exists(fname):
- locale_encoding = locale.getpreferredencoding(False)
- try:
- parser.read(fname, encoding=locale_encoding)
- except UnicodeDecodeError:
- # See https://github.com/pypa/pip/issues/4963
- raise ConfigurationFileCouldNotBeLoaded(
- reason=f"contains invalid {locale_encoding} characters",
- fname=fname,
- )
- except configparser.Error as error:
- # See https://github.com/pypa/pip/issues/4893
- raise ConfigurationFileCouldNotBeLoaded(error=error)
- return parser
-
- def _load_environment_vars(self) -> None:
- """Loads configuration from environment variables"""
- self._config[kinds.ENV_VAR].update(
- self._normalized_keys(":env:", self.get_environ_vars())
- )
-
- def _normalized_keys(
- self, section: str, items: Iterable[Tuple[str, Any]]
- ) -> Dict[str, Any]:
- """Normalizes items to construct a dictionary with normalized keys.
-
- This routine is where the names become keys and are made the same
- regardless of source - configuration files or environment.
- """
- normalized = {}
- for name, val in items:
- key = section + "." + _normalize_name(name)
- normalized[key] = val
- return normalized
-
- def get_environ_vars(self) -> Iterable[Tuple[str, str]]:
- """Returns a generator with all environmental vars with prefix PIP_"""
- for key, val in os.environ.items():
- if key.startswith("PIP_"):
- name = key[4:].lower()
- if name not in ENV_NAMES_IGNORED:
- yield name, val
-
- # XXX: This is patched in the tests.
- def iter_config_files(self) -> Iterable[Tuple[Kind, List[str]]]:
- """Yields variant and configuration files associated with it.
-
- This should be treated like items of a dictionary.
- """
- # SMELL: Move the conditions out of this function
-
- # environment variables have the lowest priority
- config_file = os.environ.get("PIP_CONFIG_FILE", None)
- if config_file is not None:
- yield kinds.ENV, [config_file]
- else:
- yield kinds.ENV, []
-
- config_files = get_configuration_files()
-
- # at the base we have any global configuration
- yield kinds.GLOBAL, config_files[kinds.GLOBAL]
-
- # per-user configuration next
- should_load_user_config = not self.isolated and not (
- config_file and os.path.exists(config_file)
- )
- if should_load_user_config:
- # The legacy config file is overridden by the new config file
- yield kinds.USER, config_files[kinds.USER]
-
- # finally, the virtualenv (site) configuration, which trumps the others
- yield kinds.SITE, config_files[kinds.SITE]
-
- def get_values_in_config(self, variant: Kind) -> Dict[str, Any]:
- """Get values present in a config file"""
- return self._config[variant]
-
- def _get_parser_to_modify(self) -> Tuple[str, RawConfigParser]:
- # Determine which parser to modify
- assert self.load_only
- parsers = self._parsers[self.load_only]
- if not parsers:
- # This should not happen if everything works correctly.
- raise ConfigurationError(
- "Fatal Internal error [id=2]. Please report as a bug."
- )
-
- # Use the highest priority parser.
- return parsers[-1]
-
- # XXX: This is patched in the tests.
- def _mark_as_modified(self, fname: str, parser: RawConfigParser) -> None:
- file_parser_tuple = (fname, parser)
- if file_parser_tuple not in self._modified_parsers:
- self._modified_parsers.append(file_parser_tuple)
-
- def __repr__(self) -> str:
- return f"{self.__class__.__name__}({self._dictionary!r})"
diff --git a/env/lib/python3.9/site-packages/pip/_internal/distributions/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/distributions/__init__.py
deleted file mode 100644
index 9a89a83..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/distributions/__init__.py
+++ /dev/null
@@ -1,21 +0,0 @@
-from pip._internal.distributions.base import AbstractDistribution
-from pip._internal.distributions.sdist import SourceDistribution
-from pip._internal.distributions.wheel import WheelDistribution
-from pip._internal.req.req_install import InstallRequirement
-
-
-def make_distribution_for_install_requirement(
- install_req: InstallRequirement,
-) -> AbstractDistribution:
- """Returns a Distribution for the given InstallRequirement"""
- # Editable requirements will always be source distributions. They use the
- # legacy logic until we create a modern standard for them.
- if install_req.editable:
- return SourceDistribution(install_req)
-
- # If it's a wheel, it's a WheelDistribution
- if install_req.is_wheel:
- return WheelDistribution(install_req)
-
- # Otherwise, a SourceDistribution
- return SourceDistribution(install_req)
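
The factory above boils down to a two-flag dispatch in which `editable` wins over `is_wheel`: editable requirements and plain sdists both become `SourceDistribution`, and only non-editable wheels become `WheelDistribution`. A dependency-free sketch of the same decision (the `Req` dataclass is an illustrative stand-in for `InstallRequirement`):

```python
from dataclasses import dataclass


@dataclass
class Req:
    """Illustrative stand-in for InstallRequirement."""
    editable: bool
    is_wheel: bool


def distribution_kind(req: Req) -> str:
    if req.editable:
        return "source"  # editable installs always take the source path
    if req.is_wheel:
        return "wheel"
    return "source"


assert distribution_kind(Req(editable=True, is_wheel=True)) == "source"
assert distribution_kind(Req(editable=False, is_wheel=True)) == "wheel"
assert distribution_kind(Req(editable=False, is_wheel=False)) == "source"
```
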
diff --git a/env/lib/python3.9/site-packages/pip/_internal/distributions/base.py b/env/lib/python3.9/site-packages/pip/_internal/distributions/base.py
deleted file mode 100644
index 75ce2dc..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/distributions/base.py
+++ /dev/null
@@ -1,39 +0,0 @@
-import abc
-
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.metadata.base import BaseDistribution
-from pip._internal.req import InstallRequirement
-
-
-class AbstractDistribution(metaclass=abc.ABCMeta):
- """A base class for handling installable artifacts.
-
- The requirements for anything installable are as follows:
-
- - we must be able to determine the requirement name
- (or we can't correctly handle the non-upgrade case).
-
- - for packages with setup requirements, we must also be able
- to determine their requirements without installing additional
- packages (for the same reason as run-time dependencies)
-
- - we must be able to create a Distribution object exposing the
- above metadata.
- """
-
- def __init__(self, req: InstallRequirement) -> None:
- super().__init__()
- self.req = req
-
- @abc.abstractmethod
- def get_metadata_distribution(self) -> BaseDistribution:
- raise NotImplementedError()
-
- @abc.abstractmethod
- def prepare_distribution_metadata(
- self,
- finder: PackageFinder,
- build_isolation: bool,
- check_build_deps: bool,
- ) -> None:
- raise NotImplementedError()
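
Because `AbstractDistribution` is declared with `metaclass=abc.ABCMeta`, a subclass is only instantiable once both abstract methods are implemented; the `InstalledDistribution`, `SourceDistribution`, and `WheelDistribution` classes below each do so. A generic illustration of that contract (class names here are invented for the example, not pip's):

```python
import abc


class Distribution(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def get_metadata_distribution(self) -> str:
        raise NotImplementedError()


class Incomplete(Distribution):
    pass  # does not implement the abstract method


class Complete(Distribution):
    def get_metadata_distribution(self) -> str:
        return "metadata"


try:
    Incomplete()
except TypeError as exc:
    print(exc)  # can't instantiate abstract class Incomplete ...

print(Complete().get_metadata_distribution())  # metadata
```
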
diff --git a/env/lib/python3.9/site-packages/pip/_internal/distributions/installed.py b/env/lib/python3.9/site-packages/pip/_internal/distributions/installed.py
deleted file mode 100644
index edb38aa..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/distributions/installed.py
+++ /dev/null
@@ -1,23 +0,0 @@
-from pip._internal.distributions.base import AbstractDistribution
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.metadata import BaseDistribution
-
-
-class InstalledDistribution(AbstractDistribution):
- """Represents an installed package.
-
- This does not need any preparation as the required information has already
- been computed.
- """
-
- def get_metadata_distribution(self) -> BaseDistribution:
- assert self.req.satisfied_by is not None, "not actually installed"
- return self.req.satisfied_by
-
- def prepare_distribution_metadata(
- self,
- finder: PackageFinder,
- build_isolation: bool,
- check_build_deps: bool,
- ) -> None:
- pass
diff --git a/env/lib/python3.9/site-packages/pip/_internal/distributions/sdist.py b/env/lib/python3.9/site-packages/pip/_internal/distributions/sdist.py
deleted file mode 100644
index 4c25647..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/distributions/sdist.py
+++ /dev/null
@@ -1,150 +0,0 @@
-import logging
-from typing import Iterable, Set, Tuple
-
-from pip._internal.build_env import BuildEnvironment
-from pip._internal.distributions.base import AbstractDistribution
-from pip._internal.exceptions import InstallationError
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.metadata import BaseDistribution
-from pip._internal.utils.subprocess import runner_with_spinner_message
-
-logger = logging.getLogger(__name__)
-
-
-class SourceDistribution(AbstractDistribution):
- """Represents a source distribution.
-
- The preparation step for these needs metadata for the packages to be
- generated, either using PEP 517 or using the legacy `setup.py egg_info`.
- """
-
- def get_metadata_distribution(self) -> BaseDistribution:
- return self.req.get_dist()
-
- def prepare_distribution_metadata(
- self,
- finder: PackageFinder,
- build_isolation: bool,
- check_build_deps: bool,
- ) -> None:
- # Load pyproject.toml, to determine whether PEP 517 is to be used
- self.req.load_pyproject_toml()
-
- # Set up the build isolation, if this requirement should be isolated
- should_isolate = self.req.use_pep517 and build_isolation
- if should_isolate:
- # Set up an isolated environment and install the build backend's static
- # requirements in it.
- self._prepare_build_backend(finder)
- # Check that if the requirement is editable, it either supports PEP 660 or
- # has a setup.py or a setup.cfg. This cannot be done earlier because we need
- # to set up the build backend to verify it supports build_editable, nor can
- # it be done later, because we want to avoid installing build requirements
- # needlessly. Doing it here also works around setuptools generating
- # UNKNOWN.egg-info when running get_requires_for_build_wheel on a directory
- # without setup.py nor setup.cfg.
- self.req.isolated_editable_sanity_check()
- # Install the dynamic build requirements.
- self._install_build_reqs(finder)
- # Check if the current environment provides build dependencies
- should_check_deps = self.req.use_pep517 and check_build_deps
- if should_check_deps:
- pyproject_requires = self.req.pyproject_requires
- assert pyproject_requires is not None
- conflicting, missing = self.req.build_env.check_requirements(
- pyproject_requires
- )
- if conflicting:
- self._raise_conflicts("the backend dependencies", conflicting)
- if missing:
- self._raise_missing_reqs(missing)
- self.req.prepare_metadata()
-
- def _prepare_build_backend(self, finder: PackageFinder) -> None:
- # Isolate in a BuildEnvironment and install the build-time
- # requirements.
- pyproject_requires = self.req.pyproject_requires
- assert pyproject_requires is not None
-
- self.req.build_env = BuildEnvironment()
- self.req.build_env.install_requirements(
- finder, pyproject_requires, "overlay", kind="build dependencies"
- )
- conflicting, missing = self.req.build_env.check_requirements(
- self.req.requirements_to_check
- )
- if conflicting:
- self._raise_conflicts("PEP 517/518 supported requirements", conflicting)
- if missing:
- logger.warning(
- "Missing build requirements in pyproject.toml for %s.",
- self.req,
- )
- logger.warning(
- "The project does not specify a build backend, and "
- "pip cannot fall back to setuptools without %s.",
- " and ".join(map(repr, sorted(missing))),
- )
-
- def _get_build_requires_wheel(self) -> Iterable[str]:
- with self.req.build_env:
- runner = runner_with_spinner_message("Getting requirements to build wheel")
- backend = self.req.pep517_backend
- assert backend is not None
- with backend.subprocess_runner(runner):
- return backend.get_requires_for_build_wheel()
-
- def _get_build_requires_editable(self) -> Iterable[str]:
- with self.req.build_env:
- runner = runner_with_spinner_message(
- "Getting requirements to build editable"
- )
- backend = self.req.pep517_backend
- assert backend is not None
- with backend.subprocess_runner(runner):
- return backend.get_requires_for_build_editable()
-
- def _install_build_reqs(self, finder: PackageFinder) -> None:
- # Install any extra build dependencies that the backend requests.
- # This must be done in a second pass, as the pyproject.toml
- # dependencies must be installed before we can call the backend.
- if (
- self.req.editable
- and self.req.permit_editable_wheels
- and self.req.supports_pyproject_editable()
- ):
- build_reqs = self._get_build_requires_editable()
- else:
- build_reqs = self._get_build_requires_wheel()
- conflicting, missing = self.req.build_env.check_requirements(build_reqs)
- if conflicting:
- self._raise_conflicts("the backend dependencies", conflicting)
- self.req.build_env.install_requirements(
- finder, missing, "normal", kind="backend dependencies"
- )
-
- def _raise_conflicts(
- self, conflicting_with: str, conflicting_reqs: Set[Tuple[str, str]]
- ) -> None:
- format_string = (
- "Some build dependencies for {requirement} "
- "conflict with {conflicting_with}: {description}."
- )
- error_message = format_string.format(
- requirement=self.req,
- conflicting_with=conflicting_with,
- description=", ".join(
- f"{installed} is incompatible with {wanted}"
- for installed, wanted in sorted(conflicting_reqs)
- ),
- )
- raise InstallationError(error_message)
-
- def _raise_missing_reqs(self, missing: Set[str]) -> None:
- format_string = (
- "Some build dependencies for {requirement} are missing: {missing}."
- )
- error_message = format_string.format(
- requirement=self.req, missing=", ".join(map(repr, sorted(missing)))
- )
- raise InstallationError(error_message)
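
The repeated `check_requirements(...)` / `(conflicting, missing)` pattern above splits a set of requirement strings against what the build environment already contains. A standalone sketch of that split using the `packaging` library that pip vendors (the `installed` mapping and the `check` helper are made up for the example):

```python
from packaging.requirements import Requirement
from packaging.version import Version

# Hypothetical snapshot of an isolated build environment.
installed = {"setuptools": Version("65.0.0"), "wheel": Version("0.37.1")}


def check(req_strings):
    """Split requirements into (conflicting, missing), like check_requirements."""
    conflicting, missing = set(), set()
    for line in req_strings:
        req = Requirement(line)
        version = installed.get(req.name)
        if version is None:
            missing.add(line)
        elif not req.specifier.contains(version, prereleases=True):
            conflicting.add((f"{req.name}=={version}", line))
    return conflicting, missing


conflicting, missing = check(["setuptools>=40.8.0", "wheel<0.30", "flit_core>=3.2"])
print(conflicting)  # {('wheel==0.37.1', 'wheel<0.30')}
print(missing)      # {'flit_core>=3.2'}
```
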
diff --git a/env/lib/python3.9/site-packages/pip/_internal/distributions/wheel.py b/env/lib/python3.9/site-packages/pip/_internal/distributions/wheel.py
deleted file mode 100644
index 03aac77..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/distributions/wheel.py
+++ /dev/null
@@ -1,34 +0,0 @@
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.distributions.base import AbstractDistribution
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.metadata import (
- BaseDistribution,
- FilesystemWheel,
- get_wheel_distribution,
-)
-
-
-class WheelDistribution(AbstractDistribution):
- """Represents a wheel distribution.
-
- This does not need any preparation as wheels can be directly unpacked.
- """
-
- def get_metadata_distribution(self) -> BaseDistribution:
- """Loads the metadata from the wheel file into memory and returns a
- Distribution that uses it, not relying on the wheel file or
- requirement.
- """
- assert self.req.local_file_path, "Set as part of preparation during download"
- assert self.req.name, "Wheels are never unnamed"
- wheel = FilesystemWheel(self.req.local_file_path)
- return get_wheel_distribution(wheel, canonicalize_name(self.req.name))
-
- def prepare_distribution_metadata(
- self,
- finder: PackageFinder,
- build_isolation: bool,
- check_build_deps: bool,
- ) -> None:
- pass
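
`get_metadata_distribution` keys the wheel's metadata by the canonicalized project name, making lookups insensitive to case and to `-`/`_`/`.` differences. `canonicalize_name` comes from the `packaging` project that pip vendors:

```python
from packaging.utils import canonicalize_name

assert canonicalize_name("Typing_Extensions") == "typing-extensions"
assert canonicalize_name("zope.interface") == "zope-interface"
assert canonicalize_name("pip") == "pip"
```
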
diff --git a/env/lib/python3.9/site-packages/pip/_internal/exceptions.py b/env/lib/python3.9/site-packages/pip/_internal/exceptions.py
deleted file mode 100644
index 97b9612..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/exceptions.py
+++ /dev/null
@@ -1,658 +0,0 @@
-"""Exceptions used throughout package.
-
-This module MUST NOT try to import from anything within `pip._internal` to
-operate. This is expected to be importable from any/all files within the
-subpackage and, thus, should not depend on them.
-"""
-
-import configparser
-import re
-from itertools import chain, groupby, repeat
-from typing import TYPE_CHECKING, Dict, List, Optional, Union
-
-from pip._vendor.requests.models import Request, Response
-from pip._vendor.rich.console import Console, ConsoleOptions, RenderResult
-from pip._vendor.rich.markup import escape
-from pip._vendor.rich.text import Text
-
-if TYPE_CHECKING:
- from hashlib import _Hash
- from typing import Literal
-
- from pip._internal.metadata import BaseDistribution
- from pip._internal.req.req_install import InstallRequirement
-
-
-#
-# Scaffolding
-#
-def _is_kebab_case(s: str) -> bool:
- return re.match(r"^[a-z]+(-[a-z]+)*$", s) is not None
-
-
-def _prefix_with_indent(
- s: Union[Text, str],
- console: Console,
- *,
- prefix: str,
- indent: str,
-) -> Text:
- if isinstance(s, Text):
- text = s
- else:
- text = console.render_str(s)
-
- return console.render_str(prefix, overflow="ignore") + console.render_str(
- f"\n{indent}", overflow="ignore"
- ).join(text.split(allow_blank=True))
-
-
-class PipError(Exception):
- """The base pip error."""
-
-
-class DiagnosticPipError(PipError):
- """An error, that presents diagnostic information to the user.
-
- This contains a bunch of logic, to enable pretty presentation of our error
- messages. Each error gets a unique reference. Each error can also include
- additional context, a hint and/or a note -- which are presented with the
- main error message in a consistent style.
-
- This is adapted from the error output styling in `sphinx-theme-builder`.
- """
-
- reference: str
-
- def __init__(
- self,
- *,
- kind: 'Literal["error", "warning"]' = "error",
- reference: Optional[str] = None,
- message: Union[str, Text],
- context: Optional[Union[str, Text]],
- hint_stmt: Optional[Union[str, Text]],
- note_stmt: Optional[Union[str, Text]] = None,
- link: Optional[str] = None,
- ) -> None:
- # Ensure a proper reference is provided.
- if reference is None:
- assert hasattr(self, "reference"), "error reference not provided!"
- reference = self.reference
- assert _is_kebab_case(reference), "error reference must be kebab-case!"
-
- self.kind = kind
- self.reference = reference
-
- self.message = message
- self.context = context
-
- self.note_stmt = note_stmt
- self.hint_stmt = hint_stmt
-
- self.link = link
-
- super().__init__(f"<{self.__class__.__name__}: {self.reference}>")
-
- def __repr__(self) -> str:
- return (
- f"<{self.__class__.__name__}("
- f"reference={self.reference!r}, "
- f"message={self.message!r}, "
- f"context={self.context!r}, "
- f"note_stmt={self.note_stmt!r}, "
- f"hint_stmt={self.hint_stmt!r}"
- ")>"
- )
-
- def __rich_console__(
- self,
- console: Console,
- options: ConsoleOptions,
- ) -> RenderResult:
- colour = "red" if self.kind == "error" else "yellow"
-
- yield f"[{colour} bold]{self.kind}[/]: [bold]{self.reference}[/]"
- yield ""
-
- if not options.ascii_only:
- # Present the main message, with relevant context indented.
- if self.context is not None:
- yield _prefix_with_indent(
- self.message,
- console,
- prefix=f"[{colour}]×[/] ",
- indent=f"[{colour}]│[/] ",
- )
- yield _prefix_with_indent(
- self.context,
- console,
- prefix=f"[{colour}]╰─>[/] ",
- indent=f"[{colour}] [/] ",
- )
- else:
- yield _prefix_with_indent(
- self.message,
- console,
- prefix="[red]×[/] ",
- indent=" ",
- )
- else:
- yield self.message
- if self.context is not None:
- yield ""
- yield self.context
-
- if self.note_stmt is not None or self.hint_stmt is not None:
- yield ""
-
- if self.note_stmt is not None:
- yield _prefix_with_indent(
- self.note_stmt,
- console,
- prefix="[magenta bold]note[/]: ",
- indent=" ",
- )
- if self.hint_stmt is not None:
- yield _prefix_with_indent(
- self.hint_stmt,
- console,
- prefix="[cyan bold]hint[/]: ",
- indent=" ",
- )
-
- if self.link is not None:
- yield ""
- yield f"Link: {self.link}"
-
-
-#
-# Actual Errors
-#
-class ConfigurationError(PipError):
- """General exception in configuration"""
-
-
-class InstallationError(PipError):
- """General exception during installation"""
-
-
-class UninstallationError(PipError):
- """General exception during uninstallation"""
-
-
-class MissingPyProjectBuildRequires(DiagnosticPipError):
- """Raised when pyproject.toml has `build-system`, but no `build-system.requires`."""
-
- reference = "missing-pyproject-build-system-requires"
-
- def __init__(self, *, package: str) -> None:
- super().__init__(
- message=f"Can not process {escape(package)}",
- context=Text(
- "This package has an invalid pyproject.toml file.\n"
- "The [build-system] table is missing the mandatory `requires` key."
- ),
- note_stmt="This is an issue with the package mentioned above, not pip.",
- hint_stmt=Text("See PEP 518 for the detailed specification."),
- )
-
-
-class InvalidPyProjectBuildRequires(DiagnosticPipError):
- """Raised when pyproject.toml an invalid `build-system.requires`."""
-
- reference = "invalid-pyproject-build-system-requires"
-
- def __init__(self, *, package: str, reason: str) -> None:
- super().__init__(
- message=f"Can not process {escape(package)}",
- context=Text(
- "This package has an invalid `build-system.requires` key in "
- f"pyproject.toml.\n{reason}"
- ),
- note_stmt="This is an issue with the package mentioned above, not pip.",
- hint_stmt=Text("See PEP 518 for the detailed specification."),
- )
-
-
-class NoneMetadataError(PipError):
- """Raised when accessing a Distribution's "METADATA" or "PKG-INFO".
-
- This signifies an inconsistency: the Distribution claims to have
- the metadata file (if not, raise ``FileNotFoundError`` instead), but is
- not actually able to produce its content. This may be due to permission
- errors.
- """
-
- def __init__(
- self,
- dist: "BaseDistribution",
- metadata_name: str,
- ) -> None:
- """
- :param dist: A Distribution object.
- :param metadata_name: The name of the metadata being accessed
- (can be "METADATA" or "PKG-INFO").
- """
- self.dist = dist
- self.metadata_name = metadata_name
-
- def __str__(self) -> str:
- # Use `dist` in the error message because its stringification
- # includes more information, like the version and location.
- return "None {} metadata found for distribution: {}".format(
- self.metadata_name,
- self.dist,
- )
-
-
-class UserInstallationInvalid(InstallationError):
- """A --user install is requested on an environment without user site."""
-
- def __str__(self) -> str:
- return "User base directory is not specified"
-
-
-class InvalidSchemeCombination(InstallationError):
- def __str__(self) -> str:
- before = ", ".join(str(a) for a in self.args[:-1])
- return f"Cannot set {before} and {self.args[-1]} together"
-
-
-class DistributionNotFound(InstallationError):
- """Raised when a distribution cannot be found to satisfy a requirement"""
-
-
-class RequirementsFileParseError(InstallationError):
- """Raised when a general error occurs parsing a requirements file line."""
-
-
-class BestVersionAlreadyInstalled(PipError):
- """Raised when the most up-to-date version of a package is already
- installed."""
-
-
-class BadCommand(PipError):
- """Raised when virtualenv or a command is not found"""
-
-
-class CommandError(PipError):
- """Raised when there is an error in command-line arguments"""
-
-
-class PreviousBuildDirError(PipError):
- """Raised when there's a previous conflicting build directory"""
-
-
-class NetworkConnectionError(PipError):
- """HTTP connection error"""
-
- def __init__(
- self, error_msg: str, response: Optional[Response] = None, request: Optional[Request] = None
- ) -> None:
- """
- Initialize NetworkConnectionError with `request` and `response`
- objects.
- """
- self.response = response
- self.request = request
- self.error_msg = error_msg
- if (
- self.response is not None
- and not self.request
- and hasattr(response, "request")
- ):
- self.request = self.response.request
- super().__init__(error_msg, response, request)
-
- def __str__(self) -> str:
- return str(self.error_msg)
-
-
-class InvalidWheelFilename(InstallationError):
- """Invalid wheel filename."""
-
-
-class UnsupportedWheel(InstallationError):
- """Unsupported wheel."""
-
-
-class InvalidWheel(InstallationError):
- """Invalid (e.g. corrupt) wheel."""
-
- def __init__(self, location: str, name: str):
- self.location = location
- self.name = name
-
- def __str__(self) -> str:
- return f"Wheel '{self.name}' located at {self.location} is invalid."
-
-
-class MetadataInconsistent(InstallationError):
- """Built metadata contains inconsistent information.
-
- This is raised when the metadata contains values (e.g. name and version)
- that do not match the information previously obtained from sdist filename
- or user-supplied ``#egg=`` value.
- """
-
- def __init__(
- self, ireq: "InstallRequirement", field: str, f_val: str, m_val: str
- ) -> None:
- self.ireq = ireq
- self.field = field
- self.f_val = f_val
- self.m_val = m_val
-
- def __str__(self) -> str:
- template = (
- "Requested {} has inconsistent {}: "
- "filename has {!r}, but metadata has {!r}"
- )
- return template.format(self.ireq, self.field, self.f_val, self.m_val)
-
-
-class LegacyInstallFailure(DiagnosticPipError):
- """Error occurred while executing `setup.py install`"""
-
- reference = "legacy-install-failure"
-
- def __init__(self, package_details: str) -> None:
- super().__init__(
- message="Encountered error while trying to install package.",
- context=package_details,
- hint_stmt="See above for output from the failure.",
- note_stmt="This is an issue with the package mentioned above, not pip.",
- )
-
-
-class InstallationSubprocessError(DiagnosticPipError, InstallationError):
- """A subprocess call failed."""
-
- reference = "subprocess-exited-with-error"
-
- def __init__(
- self,
- *,
- command_description: str,
- exit_code: int,
- output_lines: Optional[List[str]],
- ) -> None:
- if output_lines is None:
- output_prompt = Text("See above for output.")
- else:
- output_prompt = (
- Text.from_markup(f"[red][{len(output_lines)} lines of output][/]\n")
- + Text("".join(output_lines))
- + Text.from_markup(R"[red]\[end of output][/]")
- )
-
- super().__init__(
- message=(
- f"[green]{escape(command_description)}[/] did not run successfully.\n"
- f"exit code: {exit_code}"
- ),
- context=output_prompt,
- hint_stmt=None,
- note_stmt=(
- "This error originates from a subprocess, and is likely not a "
- "problem with pip."
- ),
- )
-
- self.command_description = command_description
- self.exit_code = exit_code
-
- def __str__(self) -> str:
- return f"{self.command_description} exited with {self.exit_code}"
-
-
-class MetadataGenerationFailed(InstallationSubprocessError, InstallationError):
- reference = "metadata-generation-failed"
-
- def __init__(
- self,
- *,
- package_details: str,
- ) -> None:
- super(InstallationSubprocessError, self).__init__(
- message="Encountered error while generating package metadata.",
- context=escape(package_details),
- hint_stmt="See above for details.",
- note_stmt="This is an issue with the package mentioned above, not pip.",
- )
-
- def __str__(self) -> str:
- return "metadata generation failed"
-
-
-class HashErrors(InstallationError):
- """Multiple HashError instances rolled into one for reporting"""
-
- def __init__(self) -> None:
- self.errors: List["HashError"] = []
-
- def append(self, error: "HashError") -> None:
- self.errors.append(error)
-
- def __str__(self) -> str:
- lines = []
- self.errors.sort(key=lambda e: e.order)
- for cls, errors_of_cls in groupby(self.errors, lambda e: e.__class__):
- lines.append(cls.head)
- lines.extend(e.body() for e in errors_of_cls)
- if lines:
- return "\n".join(lines)
- return ""
-
- def __bool__(self) -> bool:
- return bool(self.errors)
-
-
-class HashError(InstallationError):
- """
- A failure to verify a package against known-good hashes
-
- :cvar order: An int sorting hash exception classes by difficulty of
- recovery (lower being harder), so the user doesn't bother fretting
- about unpinned packages when he has deeper issues, like VCS
- dependencies, to deal with. Also keeps error reports in a
- deterministic order.
- :cvar head: A section heading for display above potentially many
- exceptions of this kind
- :ivar req: The InstallRequirement that triggered this error. This is
- pasted on after the exception is instantiated, because it's not
- typically available earlier.
-
- """
-
- req: Optional["InstallRequirement"] = None
- head = ""
- order: int = -1
-
- def body(self) -> str:
- """Return a summary of me for display under the heading.
-
- This default implementation simply prints a description of the
- triggering requirement.
-
- :param req: The InstallRequirement that provoked this error, with
- its link already populated by the resolver's _populate_link().
-
- """
- return f" {self._requirement_name()}"
-
- def __str__(self) -> str:
- return f"{self.head}\n{self.body()}"
-
- def _requirement_name(self) -> str:
- """Return a description of the requirement that triggered me.
-
- This default implementation returns the long description of the req,
- including line numbers.
-
- """
- return str(self.req) if self.req else "unknown package"
-
-
-class VcsHashUnsupported(HashError):
- """A hash was provided for a version-control-system-based requirement, but
- we don't have a method for hashing those."""
-
- order = 0
- head = (
- "Can't verify hashes for these requirements because we don't "
- "have a way to hash version control repositories:"
- )
-
-
-class DirectoryUrlHashUnsupported(HashError):
- """A hash was provided for a version-control-system-based requirement, but
- we don't have a method for hashing those."""
-
- order = 1
- head = (
- "Can't verify hashes for these file:// requirements because they "
- "point to directories:"
- )
-
-
-class HashMissing(HashError):
- """A hash was needed for a requirement but is absent."""
-
- order = 2
- head = (
- "Hashes are required in --require-hashes mode, but they are "
- "missing from some requirements. Here is a list of those "
- "requirements along with the hashes their downloaded archives "
- "actually had. Add lines like these to your requirements files to "
- "prevent tampering. (If you did not enable --require-hashes "
- "manually, note that it turns on automatically when any package "
- "has a hash.)"
- )
-
- def __init__(self, gotten_hash: str) -> None:
- """
- :param gotten_hash: The hash of the (possibly malicious) archive we
- just downloaded
- """
- self.gotten_hash = gotten_hash
-
- def body(self) -> str:
- # Dodge circular import.
- from pip._internal.utils.hashes import FAVORITE_HASH
-
- package = None
- if self.req:
- # In the case of URL-based requirements, display the original URL
- # seen in the requirements file rather than the package name,
- # so the output can be directly copied into the requirements file.
- package = (
- self.req.original_link
- if self.req.original_link
- # In case someone feeds something downright stupid
- # to InstallRequirement's constructor.
- else getattr(self.req, "req", None)
- )
- return " {} --hash={}:{}".format(
- package or "unknown package", FAVORITE_HASH, self.gotten_hash
- )
-
-
-class HashUnpinned(HashError):
- """A requirement had a hash specified but was not pinned to a specific
- version."""
-
- order = 3
- head = (
- "In --require-hashes mode, all requirements must have their "
- "versions pinned with ==. These do not:"
- )
-
-
-class HashMismatch(HashError):
- """
- Distribution file hash values don't match.
-
- :ivar package_name: The name of the package that triggered the hash
- mismatch. Feel free to write to this after the exception is raised to
- improve its error message.
-
- """
-
- order = 4
- head = (
- "THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS "
- "FILE. If you have updated the package versions, please update "
- "the hashes. Otherwise, examine the package contents carefully; "
- "someone may have tampered with them."
- )
-
- def __init__(self, allowed: Dict[str, List[str]], gots: Dict[str, "_Hash"]) -> None:
- """
- :param allowed: A dict of algorithm names pointing to lists of allowed
- hex digests
- :param gots: A dict of algorithm names pointing to hashes we
- actually got from the files under suspicion
- """
- self.allowed = allowed
- self.gots = gots
-
- def body(self) -> str:
- return " {}:\n{}".format(self._requirement_name(), self._hash_comparison())
-
- def _hash_comparison(self) -> str:
- """
- Return a comparison of actual and expected hash values.
-
- Example::
-
- Expected sha256 abcdeabcdeabcdeabcdeabcdeabcdeabcdeabcdeabcde
- or 123451234512345123451234512345123451234512345
- Got bcdefbcdefbcdefbcdefbcdefbcdefbcdefbcdefbcdef
-
- """
-
- def hash_then_or(hash_name: str) -> "chain[str]":
- # For now, all the decent hashes have 6-char names, so we can get
- # away with hard-coding space literals.
- return chain([hash_name], repeat(" or"))
-
- lines: List[str] = []
- for hash_name, expecteds in self.allowed.items():
- prefix = hash_then_or(hash_name)
- lines.extend(
- (" Expected {} {}".format(next(prefix), e)) for e in expecteds
- )
- lines.append(
- " Got {}\n".format(self.gots[hash_name].hexdigest())
- )
- return "\n".join(lines)
-
-
-class UnsupportedPythonVersion(InstallationError):
- """Unsupported python version according to Requires-Python package
- metadata."""
-
-
-class ConfigurationFileCouldNotBeLoaded(ConfigurationError):
- """When there are errors while loading a configuration file"""
-
- def __init__(
- self,
- reason: str = "could not be loaded",
- fname: Optional[str] = None,
- error: Optional[configparser.Error] = None,
- ) -> None:
- super().__init__(error)
- self.reason = reason
- self.fname = fname
- self.error = error
-
- def __str__(self) -> str:
- if self.fname is not None:
- message_part = f" in {self.fname}."
- else:
- assert self.error is not None
- message_part = f".\n{self.error}\n"
- return f"Configuration file {self.reason}{message_part}"
diff --git a/env/lib/python3.9/site-packages/pip/_internal/index/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/index/__init__.py
deleted file mode 100644
index 7a17b7b..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/index/__init__.py
+++ /dev/null
@@ -1,2 +0,0 @@
-"""Index interaction code
-"""
diff --git a/env/lib/python3.9/site-packages/pip/_internal/index/collector.py b/env/lib/python3.9/site-packages/pip/_internal/index/collector.py
deleted file mode 100644
index e6e9469..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/index/collector.py
+++ /dev/null
@@ -1,610 +0,0 @@
-"""
-The main purpose of this module is to expose LinkCollector.collect_sources().
-"""
-
-import cgi
-import collections
-import functools
-import itertools
-import logging
-import os
-import re
-import urllib.parse
-import urllib.request
-import xml.etree.ElementTree
-from html.parser import HTMLParser
-from optparse import Values
-from typing import (
- TYPE_CHECKING,
- Callable,
- Dict,
- Iterable,
- List,
- MutableMapping,
- NamedTuple,
- Optional,
- Sequence,
- Tuple,
- Union,
-)
-
-from pip._vendor import html5lib, requests
-from pip._vendor.requests import Response
-from pip._vendor.requests.exceptions import RetryError, SSLError
-
-from pip._internal.exceptions import NetworkConnectionError
-from pip._internal.models.link import Link
-from pip._internal.models.search_scope import SearchScope
-from pip._internal.network.session import PipSession
-from pip._internal.network.utils import raise_for_status
-from pip._internal.utils.filetypes import is_archive_file
-from pip._internal.utils.misc import pairwise, redact_auth_from_url
-from pip._internal.vcs import vcs
-
-from .sources import CandidatesFromPage, LinkSource, build_source
-
-if TYPE_CHECKING:
- from typing import Protocol
-else:
- Protocol = object
-
-logger = logging.getLogger(__name__)
-
-HTMLElement = xml.etree.ElementTree.Element
-ResponseHeaders = MutableMapping[str, str]
-
-
-def _match_vcs_scheme(url: str) -> Optional[str]:
- """Look for VCS schemes in the URL.
-
- Returns the matched VCS scheme, or None if there's no match.
- """
- for scheme in vcs.schemes:
- if url.lower().startswith(scheme) and url[len(scheme)] in "+:":
- return scheme
- return None
-
-
-class _NotHTML(Exception):
- def __init__(self, content_type: str, request_desc: str) -> None:
- super().__init__(content_type, request_desc)
- self.content_type = content_type
- self.request_desc = request_desc
-
-
-def _ensure_html_header(response: Response) -> None:
- """Check the Content-Type header to ensure the response contains HTML.
-
- Raises `_NotHTML` if the content type is not text/html.
- """
- content_type = response.headers.get("Content-Type", "")
- if not content_type.lower().startswith("text/html"):
- raise _NotHTML(content_type, response.request.method)
-
-
-class _NotHTTP(Exception):
- pass
-
-
-def _ensure_html_response(url: str, session: PipSession) -> None:
- """Send a HEAD request to the URL, and ensure the response contains HTML.
-
- Raises `_NotHTTP` if the URL is not available for a HEAD request, or
- `_NotHTML` if the content type is not text/html.
- """
- scheme, netloc, path, query, fragment = urllib.parse.urlsplit(url)
- if scheme not in {"http", "https"}:
- raise _NotHTTP()
-
- resp = session.head(url, allow_redirects=True)
- raise_for_status(resp)
-
- _ensure_html_header(resp)
-
-
-def _get_html_response(url: str, session: PipSession) -> Response:
- """Access an HTML page with GET, and return the response.
-
- This consists of three parts:
-
- 1. If the URL looks suspiciously like an archive, send a HEAD first to
- check the Content-Type is HTML, to avoid downloading a large file.
- Raise `_NotHTTP` if the content type cannot be determined, or
- `_NotHTML` if it is not HTML.
- 2. Actually perform the request. Raise HTTP exceptions on network failures.
- 3. Check the Content-Type header to make sure we got HTML, and raise
- `_NotHTML` otherwise.
- """
- if is_archive_file(Link(url).filename):
- _ensure_html_response(url, session=session)
-
- logger.debug("Getting page %s", redact_auth_from_url(url))
-
- resp = session.get(
- url,
- headers={
- "Accept": "text/html",
- # We don't want to blindly return cached data for
- # /simple/, because authors generally expect that
- # twine upload && pip install will function, but if
- # they've done a pip install in the last ~10 minutes
- # it won't. Thus by setting this to zero we will not
- # blindly use any cached data, however the benefit of
- # using max-age=0 instead of no-cache, is that we will
- # still support conditional requests, so we will still
- # minimize traffic sent in cases where the page hasn't
- # changed at all, we will just always incur the round
- # trip for the conditional GET now instead of only
- # once per 10 minutes.
- # For more information, please see pypa/pip#5670.
- "Cache-Control": "max-age=0",
- },
- )
- raise_for_status(resp)
-
- # The check for archives above only works if the URL ends with
- # something that looks like an archive. However, that is not a
- # requirement of a URL. Unless we issue a HEAD request on every
- # URL we cannot know ahead of time for sure if something is HTML
- # or not. However we can check after we've downloaded it.
- _ensure_html_header(resp)
-
- return resp
-
-
-def _get_encoding_from_headers(headers: ResponseHeaders) -> Optional[str]:
- """Determine if we have any encoding information in our headers."""
- if headers and "Content-Type" in headers:
- content_type, params = cgi.parse_header(headers["Content-Type"])
- if "charset" in params:
- return params["charset"]
- return None
-
-
-def _determine_base_url(document: HTMLElement, page_url: str) -> str:
- """Determine the HTML document's base URL.
-
- This looks for a ``<base>`` tag in the HTML document. If present, its href
- attribute denotes the base URL of anchor tags in the document. If there is
- no such tag (or if it does not have a valid href attribute), the HTML
- file's URL is used as the base URL.
-
- :param document: An HTML document representation. The current
- implementation expects the result of ``html5lib.parse()``.
- :param page_url: The URL of the HTML document.
-
- TODO: Remove when `html5lib` is dropped.
- """
- for base in document.findall(".//base"):
- href = base.get("href")
- if href is not None:
- return href
- return page_url
-
-
-def _clean_url_path_part(part: str) -> str:
- """
- Clean a "part" of a URL path (i.e. after splitting on "@" characters).
- """
- # We unquote prior to quoting to make sure nothing is double quoted.
- return urllib.parse.quote(urllib.parse.unquote(part))
-
-
-def _clean_file_url_path(part: str) -> str:
- """
- Clean the first part of a URL path that corresponds to a local
- filesystem path (i.e. the first part after splitting on "@" characters).
- """
- # We unquote prior to quoting to make sure nothing is double quoted.
- # Also, on Windows the path part might contain a drive letter which
- # should not be quoted. On Linux where drive letters do not
- # exist, the colon should be quoted. We rely on urllib.request
- # to do the right thing here.
- return urllib.request.pathname2url(urllib.request.url2pathname(part))
-
-
-# Split on "@" and on "%2F" (the percent-encoding of "/").
-_reserved_chars_re = re.compile("(@|%2F)", re.IGNORECASE)
-
-
-def _clean_url_path(path: str, is_local_path: bool) -> str:
- """
- Clean the path portion of a URL.
- """
- if is_local_path:
- clean_func = _clean_file_url_path
- else:
- clean_func = _clean_url_path_part
-
- # Split on the reserved characters prior to cleaning so that
- # revision strings in VCS URLs are properly preserved.
- parts = _reserved_chars_re.split(path)
-
- cleaned_parts = []
- for to_clean, reserved in pairwise(itertools.chain(parts, [""])):
- cleaned_parts.append(clean_func(to_clean))
- # Normalize %xx escapes (e.g. %2f -> %2F)
- cleaned_parts.append(reserved.upper())
-
- return "".join(cleaned_parts)
-
-
-def _clean_link(url: str) -> str:
- """
- Make sure a link is fully quoted.
- For example, if ' ' occurs in the URL, it will be replaced with "%20"
- without double-quoting other characters.
- """
- # Split the URL into parts according to the general structure
- # `scheme://netloc/path;parameters?query#fragment`.
- result = urllib.parse.urlparse(url)
- # If the netloc is empty, then the URL refers to a local filesystem path.
- is_local_path = not result.netloc
- path = _clean_url_path(result.path, is_local_path=is_local_path)
- return urllib.parse.urlunparse(result._replace(path=path))
-
-
-def _create_link_from_element(
- element_attribs: Dict[str, Optional[str]],
- page_url: str,
- base_url: str,
-) -> Optional[Link]:
- """
- Convert an anchor element's attributes in a simple repository page to a Link.
- """
- href = element_attribs.get("href")
- if not href:
- return None
-
- url = _clean_link(urllib.parse.urljoin(base_url, href))
- pyrequire = element_attribs.get("data-requires-python")
- yanked_reason = element_attribs.get("data-yanked")
-
- link = Link(
- url,
- comes_from=page_url,
- requires_python=pyrequire,
- yanked_reason=yanked_reason,
- )
-
- return link
-
-
-class CacheablePageContent:
- def __init__(self, page: "HTMLPage") -> None:
- assert page.cache_link_parsing
- self.page = page
-
- def __eq__(self, other: object) -> bool:
- return isinstance(other, type(self)) and self.page.url == other.page.url
-
- def __hash__(self) -> int:
- return hash(self.page.url)
-
-
-class ParseLinks(Protocol):
- def __call__(
- self, page: "HTMLPage", use_deprecated_html5lib: bool
- ) -> Iterable[Link]:
- ...
-
-
-def with_cached_html_pages(fn: ParseLinks) -> ParseLinks:
- """
- Given a function that parses an Iterable[Link] from an HTMLPage, cache the
- function's result (keyed by CacheablePageContent), unless the HTMLPage
- `page` has `page.cache_link_parsing == False`.
- """
-
- @functools.lru_cache(maxsize=None)
- def wrapper(
- cacheable_page: CacheablePageContent, use_deprecated_html5lib: bool
- ) -> List[Link]:
- return list(fn(cacheable_page.page, use_deprecated_html5lib))
-
- @functools.wraps(fn)
- def wrapper_wrapper(page: "HTMLPage", use_deprecated_html5lib: bool) -> List[Link]:
- if page.cache_link_parsing:
- return wrapper(CacheablePageContent(page), use_deprecated_html5lib)
- return list(fn(page, use_deprecated_html5lib))
-
- return wrapper_wrapper
-
-
-def _parse_links_html5lib(page: "HTMLPage") -> Iterable[Link]:
- """
- Parse an HTML document, and yield its anchor elements as Link objects.
-
- TODO: Remove when `html5lib` is dropped.
- """
- document = html5lib.parse(
- page.content,
- transport_encoding=page.encoding,
- namespaceHTMLElements=False,
- )
-
- url = page.url
- base_url = _determine_base_url(document, url)
- for anchor in document.findall(".//a"):
- link = _create_link_from_element(
- anchor.attrib,
- page_url=url,
- base_url=base_url,
- )
- if link is None:
- continue
- yield link
-
-
-@with_cached_html_pages
-def parse_links(page: "HTMLPage", use_deprecated_html5lib: bool) -> Iterable[Link]:
- """
- Parse an HTML document, and yield its anchor elements as Link objects.
- """
-
- if use_deprecated_html5lib:
- yield from _parse_links_html5lib(page)
- return
-
- parser = HTMLLinkParser(page.url)
- encoding = page.encoding or "utf-8"
- parser.feed(page.content.decode(encoding))
-
- url = page.url
- base_url = parser.base_url or url
- for anchor in parser.anchors:
- link = _create_link_from_element(
- anchor,
- page_url=url,
- base_url=base_url,
- )
- if link is None:
- continue
- yield link
-
-
-class HTMLPage:
- """Represents one page, along with its URL"""
-
- def __init__(
- self,
- content: bytes,
- encoding: Optional[str],
- url: str,
- cache_link_parsing: bool = True,
- ) -> None:
- """
- :param encoding: the encoding to decode the given content.
- :param url: the URL from which the HTML was downloaded.
- :param cache_link_parsing: whether links parsed from this page's URL
- should be cached. PyPI index URLs should
- have this set to False, for example.
- """
- self.content = content
- self.encoding = encoding
- self.url = url
- self.cache_link_parsing = cache_link_parsing
-
- def __str__(self) -> str:
- return redact_auth_from_url(self.url)
-
-
-class HTMLLinkParser(HTMLParser):
- """
- HTMLParser that keeps the first base HREF and a list of all anchor
- elements' attributes.
- """
-
- def __init__(self, url: str) -> None:
- super().__init__(convert_charrefs=True)
-
- self.url: str = url
- self.base_url: Optional[str] = None
- self.anchors: List[Dict[str, Optional[str]]] = []
-
- def handle_starttag(self, tag: str, attrs: List[Tuple[str, Optional[str]]]) -> None:
- if tag == "base" and self.base_url is None:
- href = self.get_href(attrs)
- if href is not None:
- self.base_url = href
- elif tag == "a":
- self.anchors.append(dict(attrs))
-
- def get_href(self, attrs: List[Tuple[str, Optional[str]]]) -> Optional[str]:
- for name, value in attrs:
- if name == "href":
- return value
- return None
-
-
-def _handle_get_page_fail(
- link: Link,
- reason: Union[str, Exception],
- meth: Optional[Callable[..., None]] = None,
-) -> None:
- if meth is None:
- meth = logger.debug
- meth("Could not fetch URL %s: %s - skipping", link, reason)
-
-
-def _make_html_page(response: Response, cache_link_parsing: bool = True) -> HTMLPage:
- encoding = _get_encoding_from_headers(response.headers)
- return HTMLPage(
- response.content,
- encoding=encoding,
- url=response.url,
- cache_link_parsing=cache_link_parsing,
- )
-
-
-def _get_html_page(
- link: Link, session: Optional[PipSession] = None
-) -> Optional["HTMLPage"]:
- if session is None:
- raise TypeError(
- "_get_html_page() missing 1 required keyword argument: 'session'"
- )
-
- url = link.url.split("#", 1)[0]
-
- # Check for VCS schemes that do not support lookup as web pages.
- vcs_scheme = _match_vcs_scheme(url)
- if vcs_scheme:
- logger.warning(
- "Cannot look at %s URL %s because it does not support lookup as web pages.",
- vcs_scheme,
- link,
- )
- return None
-
- # Tack index.html onto file:// URLs that point to directories
- scheme, _, path, _, _, _ = urllib.parse.urlparse(url)
- if scheme == "file" and os.path.isdir(urllib.request.url2pathname(path)):
- # add trailing slash if not present so urljoin doesn't trim
- # final segment
- if not url.endswith("/"):
- url += "/"
- url = urllib.parse.urljoin(url, "index.html")
- logger.debug(" file: URL is directory, getting %s", url)
-
- try:
- resp = _get_html_response(url, session=session)
- except _NotHTTP:
- logger.warning(
- "Skipping page %s because it looks like an archive, and cannot "
- "be checked by a HTTP HEAD request.",
- link,
- )
- except _NotHTML as exc:
- logger.warning(
- "Skipping page %s because the %s request got Content-Type: %s."
- "The only supported Content-Type is text/html",
- link,
- exc.request_desc,
- exc.content_type,
- )
- except NetworkConnectionError as exc:
- _handle_get_page_fail(link, exc)
- except RetryError as exc:
- _handle_get_page_fail(link, exc)
- except SSLError as exc:
- reason = "There was a problem confirming the ssl certificate: "
- reason += str(exc)
- _handle_get_page_fail(link, reason, meth=logger.info)
- except requests.ConnectionError as exc:
- _handle_get_page_fail(link, f"connection error: {exc}")
- except requests.Timeout:
- _handle_get_page_fail(link, "timed out")
- else:
- return _make_html_page(resp, cache_link_parsing=link.cache_link_parsing)
- return None
-
-
-class CollectedSources(NamedTuple):
- find_links: Sequence[Optional[LinkSource]]
- index_urls: Sequence[Optional[LinkSource]]
-
-
-class LinkCollector:
-
- """
- Responsible for collecting Link objects from all configured locations,
- making network requests as needed.
-
- The class's main method is its collect_sources() method.
- """
-
- def __init__(
- self,
- session: PipSession,
- search_scope: SearchScope,
- ) -> None:
- self.search_scope = search_scope
- self.session = session
-
- @classmethod
- def create(
- cls,
- session: PipSession,
- options: Values,
- suppress_no_index: bool = False,
- ) -> "LinkCollector":
- """
- :param session: The Session to use to make requests.
- :param suppress_no_index: Whether to ignore the --no-index option
- when constructing the SearchScope object.
- """
- index_urls = [options.index_url] + options.extra_index_urls
- if options.no_index and not suppress_no_index:
- logger.debug(
- "Ignoring indexes: %s",
- ",".join(redact_auth_from_url(url) for url in index_urls),
- )
- index_urls = []
-
- # Make sure find_links is a list before passing to create().
- find_links = options.find_links or []
-
- search_scope = SearchScope.create(
- find_links=find_links,
- index_urls=index_urls,
- )
- link_collector = LinkCollector(
- session=session,
- search_scope=search_scope,
- )
- return link_collector
-
- @property
- def find_links(self) -> List[str]:
- return self.search_scope.find_links
-
- def fetch_page(self, location: Link) -> Optional[HTMLPage]:
- """
- Fetch an HTML page containing package links.
- """
- return _get_html_page(location, session=self.session)
-
- def collect_sources(
- self,
- project_name: str,
- candidates_from_page: CandidatesFromPage,
- ) -> CollectedSources:
- # The OrderedDict calls deduplicate sources by URL.
- index_url_sources = collections.OrderedDict(
- build_source(
- loc,
- candidates_from_page=candidates_from_page,
- page_validator=self.session.is_secure_origin,
- expand_dir=False,
- cache_link_parsing=False,
- )
- for loc in self.search_scope.get_index_urls_locations(project_name)
- ).values()
- find_links_sources = collections.OrderedDict(
- build_source(
- loc,
- candidates_from_page=candidates_from_page,
- page_validator=self.session.is_secure_origin,
- expand_dir=True,
- cache_link_parsing=True,
- )
- for loc in self.find_links
- ).values()
-
- if logger.isEnabledFor(logging.DEBUG):
- lines = [
- f"* {s.link}"
- for s in itertools.chain(find_links_sources, index_url_sources)
- if s is not None and s.link is not None
- ]
- lines = [
- f"{len(lines)} location(s) to search "
- f"for versions of {project_name}:"
- ] + lines
- logger.debug("\n".join(lines))
-
- return CollectedSources(
- find_links=list(find_links_sources),
- index_urls=list(index_url_sources),
- )
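
The `HTMLLinkParser` deleted above is a thin subclass of the standard library's `html.parser.HTMLParser`: it records the first `<base href>` it sees plus the attributes of every anchor, and `parse_links()` then turns each attribute dict into a `Link`. A minimal standalone sketch of the same idea, using only the standard library (the `AnchorCollector` name is illustrative, not pip's API):

```python
from html.parser import HTMLParser
from typing import Dict, List, Optional, Tuple


class AnchorCollector(HTMLParser):
    """Keep the first <base href> and every <a> element's attributes."""

    def __init__(self) -> None:
        super().__init__(convert_charrefs=True)
        self.base_url: Optional[str] = None
        self.anchors: List[Dict[str, Optional[str]]] = []

    def handle_starttag(
        self, tag: str, attrs: List[Tuple[str, Optional[str]]]
    ) -> None:
        if tag == "base" and self.base_url is None:
            # dict(attrs) keeps the last value for duplicate attributes.
            self.base_url = dict(attrs).get("href")
        elif tag == "a":
            self.anchors.append(dict(attrs))


parser = AnchorCollector()
parser.feed(
    '<base href="https://example.org/simple/pkg/">'
    '<a href="pkg-1.0.tar.gz" data-requires-python="&gt;=3.7">pkg-1.0</a>'
)
print(parser.base_url)                            # https://example.org/simple/pkg/
print(parser.anchors[0]["href"])                  # pkg-1.0.tar.gz
print(parser.anchors[0]["data-requires-python"])  # >=3.7 (entities are decoded)
```

In the deleted module, these attribute dicts are resolved against the base URL by `_create_link_from_element`, which is where `data-requires-python` and yank metadata get attached to each `Link`.
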
diff --git a/env/lib/python3.9/site-packages/pip/_internal/index/package_finder.py b/env/lib/python3.9/site-packages/pip/_internal/index/package_finder.py
deleted file mode 100644
index f70f74b..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/index/package_finder.py
+++ /dev/null
@@ -1,1030 +0,0 @@
-"""Routines related to PyPI, indexes"""
-
-# The following comment should be removed at some point in the future.
-# mypy: strict-optional=False
-
-import enum
-import functools
-import itertools
-import logging
-import re
-from typing import FrozenSet, Iterable, List, Optional, Set, Tuple, Union
-
-from pip._vendor.packaging import specifiers
-from pip._vendor.packaging.tags import Tag
-from pip._vendor.packaging.utils import canonicalize_name
-from pip._vendor.packaging.version import _BaseVersion
-from pip._vendor.packaging.version import parse as parse_version
-
-from pip._internal.exceptions import (
- BestVersionAlreadyInstalled,
- DistributionNotFound,
- InvalidWheelFilename,
- UnsupportedWheel,
-)
-from pip._internal.index.collector import LinkCollector, parse_links
-from pip._internal.models.candidate import InstallationCandidate
-from pip._internal.models.format_control import FormatControl
-from pip._internal.models.link import Link
-from pip._internal.models.search_scope import SearchScope
-from pip._internal.models.selection_prefs import SelectionPreferences
-from pip._internal.models.target_python import TargetPython
-from pip._internal.models.wheel import Wheel
-from pip._internal.req import InstallRequirement
-from pip._internal.utils._log import getLogger
-from pip._internal.utils.filetypes import WHEEL_EXTENSION
-from pip._internal.utils.hashes import Hashes
-from pip._internal.utils.logging import indent_log
-from pip._internal.utils.misc import build_netloc
-from pip._internal.utils.packaging import check_requires_python
-from pip._internal.utils.unpacking import SUPPORTED_EXTENSIONS
-
-__all__ = ["FormatControl", "BestCandidateResult", "PackageFinder"]
-
-
-logger = getLogger(__name__)
-
-BuildTag = Union[Tuple[()], Tuple[int, str]]
-CandidateSortingKey = Tuple[int, int, int, _BaseVersion, Optional[int], BuildTag]
-
-
-def _check_link_requires_python(
- link: Link,
- version_info: Tuple[int, int, int],
- ignore_requires_python: bool = False,
-) -> bool:
- """
- Return whether the given Python version is compatible with a link's
- "Requires-Python" value.
-
- :param version_info: A 3-tuple of ints representing the Python
- major-minor-micro version to check.
- :param ignore_requires_python: Whether to ignore the "Requires-Python"
- value if the given Python version isn't compatible.
- """
- try:
- is_compatible = check_requires_python(
- link.requires_python,
- version_info=version_info,
- )
- except specifiers.InvalidSpecifier:
- logger.debug(
- "Ignoring invalid Requires-Python (%r) for link: %s",
- link.requires_python,
- link,
- )
- else:
- if not is_compatible:
- version = ".".join(map(str, version_info))
- if not ignore_requires_python:
- logger.verbose(
- "Link requires a different Python (%s not in: %r): %s",
- version,
- link.requires_python,
- link,
- )
- return False
-
- logger.debug(
- "Ignoring failed Requires-Python check (%s not in: %r) for link: %s",
- version,
- link.requires_python,
- link,
- )
-
- return True
-
-
-class LinkType(enum.Enum):
- candidate = enum.auto()
- different_project = enum.auto()
- yanked = enum.auto()
- format_unsupported = enum.auto()
- format_invalid = enum.auto()
- platform_mismatch = enum.auto()
- requires_python_mismatch = enum.auto()
-
-
-class LinkEvaluator:
-
- """
- Responsible for evaluating links for a particular project.
- """
-
- _py_version_re = re.compile(r"-py([123]\.?[0-9]?)$")
-
- # Don't include an allow_yanked default value to make sure each call
- # site considers whether yanked releases are allowed. This also causes
- # that decision to be made explicit in the calling code, which helps
- # people when reading the code.
- def __init__(
- self,
- project_name: str,
- canonical_name: str,
- formats: FrozenSet[str],
- target_python: TargetPython,
- allow_yanked: bool,
- ignore_requires_python: Optional[bool] = None,
- ) -> None:
- """
- :param project_name: The user supplied package name.
- :param canonical_name: The canonical package name.
- :param formats: The formats allowed for this package. Should be a set
- with 'binary' or 'source' or both in it.
- :param target_python: The target Python interpreter to use when
- evaluating link compatibility. This is used, for example, to
- check wheel compatibility, as well as when checking the Python
- version, e.g. the Python version embedded in a link filename
- (or egg fragment) and against an HTML link's optional PEP 503
- "data-requires-python" attribute.
- :param allow_yanked: Whether files marked as yanked (in the sense
- of PEP 592) are permitted to be candidates for install.
- :param ignore_requires_python: Whether to ignore incompatible
- PEP 503 "data-requires-python" values in HTML links. Defaults
- to False.
- """
- if ignore_requires_python is None:
- ignore_requires_python = False
-
- self._allow_yanked = allow_yanked
- self._canonical_name = canonical_name
- self._ignore_requires_python = ignore_requires_python
- self._formats = formats
- self._target_python = target_python
-
- self.project_name = project_name
-
- def evaluate_link(self, link: Link) -> Tuple[LinkType, str]:
- """
- Determine whether a link is a candidate for installation.
-
- :return: A tuple (result, detail), where *result* is an enum
- representing whether the evaluation found a candidate, or the reason
- why one is not found. If a candidate is found, *detail* will be the
- candidate's version string; if one is not found, it contains the
- reason the link fails to qualify.
- """
- version = None
- if link.is_yanked and not self._allow_yanked:
- reason = link.yanked_reason or ""
- return (LinkType.yanked, f"yanked for reason: {reason}")
-
- if link.egg_fragment:
- egg_info = link.egg_fragment
- ext = link.ext
- else:
- egg_info, ext = link.splitext()
- if not ext:
- return (LinkType.format_unsupported, "not a file")
- if ext not in SUPPORTED_EXTENSIONS:
- return (
- LinkType.format_unsupported,
- f"unsupported archive format: {ext}",
- )
- if "binary" not in self._formats and ext == WHEEL_EXTENSION:
- reason = f"No binaries permitted for {self.project_name}"
- return (LinkType.format_unsupported, reason)
- if "macosx10" in link.path and ext == ".zip":
- return (LinkType.format_unsupported, "macosx10 one")
- if ext == WHEEL_EXTENSION:
- try:
- wheel = Wheel(link.filename)
- except InvalidWheelFilename:
- return (
- LinkType.format_invalid,
- "invalid wheel filename",
- )
- if canonicalize_name(wheel.name) != self._canonical_name:
- reason = f"wrong project name (not {self.project_name})"
- return (LinkType.different_project, reason)
-
- supported_tags = self._target_python.get_tags()
- if not wheel.supported(supported_tags):
- # Include the wheel's tags in the reason string to
- # simplify troubleshooting compatibility issues.
- file_tags = ", ".join(wheel.get_formatted_file_tags())
- reason = (
- f"none of the wheel's tags ({file_tags}) are compatible "
- f"(run pip debug --verbose to show compatible tags)"
- )
- return (LinkType.platform_mismatch, reason)
-
- version = wheel.version
-
- # This should be up by the self.ok_binary check, but see issue 2700.
- if "source" not in self._formats and ext != WHEEL_EXTENSION:
- reason = f"No sources permitted for {self.project_name}"
- return (LinkType.format_unsupported, reason)
-
- if not version:
- version = _extract_version_from_fragment(
- egg_info,
- self._canonical_name,
- )
- if not version:
- reason = f"Missing project version for {self.project_name}"
- return (LinkType.format_invalid, reason)
-
- match = self._py_version_re.search(version)
- if match:
- version = version[: match.start()]
- py_version = match.group(1)
- if py_version != self._target_python.py_version:
- return (
- LinkType.platform_mismatch,
- "Python version is incorrect",
- )
-
- supports_python = _check_link_requires_python(
- link,
- version_info=self._target_python.py_version_info,
- ignore_requires_python=self._ignore_requires_python,
- )
- if not supports_python:
- reason = f"{version} Requires-Python {link.requires_python}"
- return (LinkType.requires_python_mismatch, reason)
-
- logger.debug("Found link %s, version: %s", link, version)
-
- return (LinkType.candidate, version)
-
-
-def filter_unallowed_hashes(
- candidates: List[InstallationCandidate],
- hashes: Hashes,
- project_name: str,
-) -> List[InstallationCandidate]:
- """
- Filter out candidates whose hashes aren't allowed, and return a new
- list of candidates.
-
- If at least one candidate has an allowed hash, then all candidates with
- either an allowed hash or no hash specified are returned. Otherwise,
- the given candidates are returned.
-
- Including the candidates with no hash specified when there is a match
- allows a warning to be logged if there is a more preferred candidate
- with no hash specified. Returning all candidates in the case of no
- matches lets pip report the hash of the candidate that would otherwise
- have been installed (e.g. permitting the user to more easily update
- their requirements file with the desired hash).
- """
- if not hashes:
- logger.debug(
- "Given no hashes to check %s links for project %r: "
- "discarding no candidates",
- len(candidates),
- project_name,
- )
- # Make sure we're not returning back the given value.
- return list(candidates)
-
- matches_or_no_digest = []
- # Collect the non-matches for logging purposes.
- non_matches = []
- match_count = 0
- for candidate in candidates:
- link = candidate.link
- if not link.has_hash:
- pass
- elif link.is_hash_allowed(hashes=hashes):
- match_count += 1
- else:
- non_matches.append(candidate)
- continue
-
- matches_or_no_digest.append(candidate)
-
- if match_count:
- filtered = matches_or_no_digest
- else:
- # Make sure we're not returning back the given value.
- filtered = list(candidates)
-
- if len(filtered) == len(candidates):
- discard_message = "discarding no candidates"
- else:
- discard_message = "discarding {} non-matches:\n {}".format(
- len(non_matches),
- "\n ".join(str(candidate.link) for candidate in non_matches),
- )
-
- logger.debug(
- "Checked %s links for project %r against %s hashes "
- "(%s matches, %s no digest): %s",
- len(candidates),
- project_name,
- hashes.digest_count,
- match_count,
- len(matches_or_no_digest) - match_count,
- discard_message,
- )
-
- return filtered
-
-
-class CandidatePreferences:
-
- """
- Encapsulates some of the preferences for filtering and sorting
- InstallationCandidate objects.
- """
-
- def __init__(
- self,
- prefer_binary: bool = False,
- allow_all_prereleases: bool = False,
- ) -> None:
- """
- :param allow_all_prereleases: Whether to allow all pre-releases.
- """
- self.allow_all_prereleases = allow_all_prereleases
- self.prefer_binary = prefer_binary
-
-
-class BestCandidateResult:
- """A collection of candidates, returned by `PackageFinder.find_best_candidate`.
-
- This class is only intended to be instantiated by CandidateEvaluator's
- `compute_best_candidate()` method.
- """
-
- def __init__(
- self,
- candidates: List[InstallationCandidate],
- applicable_candidates: List[InstallationCandidate],
- best_candidate: Optional[InstallationCandidate],
- ) -> None:
- """
- :param candidates: A sequence of all available candidates found.
- :param applicable_candidates: The applicable candidates.
- :param best_candidate: The most preferred candidate found, or None
- if no applicable candidates were found.
- """
- assert set(applicable_candidates) <= set(candidates)
-
- if best_candidate is None:
- assert not applicable_candidates
- else:
- assert best_candidate in applicable_candidates
-
- self._applicable_candidates = applicable_candidates
- self._candidates = candidates
-
- self.best_candidate = best_candidate
-
- def iter_all(self) -> Iterable[InstallationCandidate]:
- """Iterate through all candidates."""
- return iter(self._candidates)
-
- def iter_applicable(self) -> Iterable[InstallationCandidate]:
- """Iterate through the applicable candidates."""
- return iter(self._applicable_candidates)
-
-
-class CandidateEvaluator:
-
- """
- Responsible for filtering and sorting candidates for installation based
- on what tags are valid.
- """
-
- @classmethod
- def create(
- cls,
- project_name: str,
- target_python: Optional[TargetPython] = None,
- prefer_binary: bool = False,
- allow_all_prereleases: bool = False,
- specifier: Optional[specifiers.BaseSpecifier] = None,
- hashes: Optional[Hashes] = None,
- ) -> "CandidateEvaluator":
- """Create a CandidateEvaluator object.
-
- :param target_python: The target Python interpreter to use when
- checking compatibility. If None (the default), a TargetPython
- object will be constructed from the running Python.
- :param specifier: An optional object implementing `filter`
- (e.g. `packaging.specifiers.SpecifierSet`) to filter applicable
- versions.
- :param hashes: An optional collection of allowed hashes.
- """
- if target_python is None:
- target_python = TargetPython()
- if specifier is None:
- specifier = specifiers.SpecifierSet()
-
- supported_tags = target_python.get_tags()
-
- return cls(
- project_name=project_name,
- supported_tags=supported_tags,
- specifier=specifier,
- prefer_binary=prefer_binary,
- allow_all_prereleases=allow_all_prereleases,
- hashes=hashes,
- )
-
- def __init__(
- self,
- project_name: str,
- supported_tags: List[Tag],
- specifier: specifiers.BaseSpecifier,
- prefer_binary: bool = False,
- allow_all_prereleases: bool = False,
- hashes: Optional[Hashes] = None,
- ) -> None:
- """
- :param supported_tags: The PEP 425 tags supported by the target
- Python in order of preference (most preferred first).
- """
- self._allow_all_prereleases = allow_all_prereleases
- self._hashes = hashes
- self._prefer_binary = prefer_binary
- self._project_name = project_name
- self._specifier = specifier
- self._supported_tags = supported_tags
- # Since the index of the tag in the _supported_tags list is used
- # as a priority, precompute a map from tag to index/priority to be
- # used in wheel.find_most_preferred_tag.
- self._wheel_tag_preferences = {
- tag: idx for idx, tag in enumerate(supported_tags)
- }
-
- def get_applicable_candidates(
- self,
- candidates: List[InstallationCandidate],
- ) -> List[InstallationCandidate]:
- """
- Return the applicable candidates from a list of candidates.
- """
- # Using None infers from the specifier instead.
- allow_prereleases = self._allow_all_prereleases or None
- specifier = self._specifier
- versions = {
- str(v)
- for v in specifier.filter(
- # We turn the version object into a str here because otherwise
- # when we're debundled but setuptools isn't, Python will see
- # packaging.version.Version and
- # pkg_resources._vendor.packaging.version.Version as different
- # types. This way we'll use a str as a common data interchange
- # format. If we stop using the pkg_resources provided specifier
- # and start using our own, we can drop the cast to str().
- (str(c.version) for c in candidates),
- prereleases=allow_prereleases,
- )
- }
-
- # Again, converting version to str to deal with debundling.
- applicable_candidates = [c for c in candidates if str(c.version) in versions]
-
- filtered_applicable_candidates = filter_unallowed_hashes(
- candidates=applicable_candidates,
- hashes=self._hashes,
- project_name=self._project_name,
- )
-
- return sorted(filtered_applicable_candidates, key=self._sort_key)
-
- def _sort_key(self, candidate: InstallationCandidate) -> CandidateSortingKey:
- """
- Function to pass as the `key` argument to a call to sorted() to sort
- InstallationCandidates by preference.
-
- Returns a tuple such that tuples sorting as greater using Python's
- default comparison operator are more preferred.
-
- The preference is as follows:
-
- First and foremost, candidates with allowed (matching) hashes are
- always preferred over candidates without matching hashes. This is
- because e.g. if the only candidate with an allowed hash is yanked,
- we still want to use that candidate.
-
- Second, excepting hash considerations, candidates that have been
- yanked (in the sense of PEP 592) are always less preferred than
- candidates that haven't been yanked. Then:
-
- If not finding wheels, they are sorted by version only.
- If finding wheels, then the sort order is by version, then:
- 1. existing installs
- 2. wheels ordered via Wheel.support_index_min(self._supported_tags)
- 3. source archives
- If prefer_binary was set, then all wheels are sorted above sources.
-
- Note: it was considered to embed this logic into the Link
- comparison operators, but then different sdist links
- with the same version would have to be considered equal
- """
- valid_tags = self._supported_tags
- support_num = len(valid_tags)
- build_tag: BuildTag = ()
- binary_preference = 0
- link = candidate.link
- if link.is_wheel:
- # can raise InvalidWheelFilename
- wheel = Wheel(link.filename)
- try:
- pri = -(
- wheel.find_most_preferred_tag(
- valid_tags, self._wheel_tag_preferences
- )
- )
- except ValueError:
- raise UnsupportedWheel(
- "{} is not a supported wheel for this platform. It "
- "can't be sorted.".format(wheel.filename)
- )
- if self._prefer_binary:
- binary_preference = 1
- if wheel.build_tag is not None:
- match = re.match(r"^(\d+)(.*)$", wheel.build_tag)
- build_tag_groups = match.groups()
- build_tag = (int(build_tag_groups[0]), build_tag_groups[1])
- else: # sdist
- pri = -(support_num)
- has_allowed_hash = int(link.is_hash_allowed(self._hashes))
- yank_value = -1 * int(link.is_yanked) # -1 for yanked.
- return (
- has_allowed_hash,
- yank_value,
- binary_preference,
- candidate.version,
- pri,
- build_tag,
- )
-
- def sort_best_candidate(
- self,
- candidates: List[InstallationCandidate],
- ) -> Optional[InstallationCandidate]:
- """
- Return the best candidate per the instance's sort order, or None if
- no candidate is acceptable.
- """
- if not candidates:
- return None
- best_candidate = max(candidates, key=self._sort_key)
- return best_candidate
-
- def compute_best_candidate(
- self,
- candidates: List[InstallationCandidate],
- ) -> BestCandidateResult:
- """
- Compute and return a `BestCandidateResult` instance.
- """
- applicable_candidates = self.get_applicable_candidates(candidates)
-
- best_candidate = self.sort_best_candidate(applicable_candidates)
-
- return BestCandidateResult(
- candidates,
- applicable_candidates=applicable_candidates,
- best_candidate=best_candidate,
- )
-
-
-class PackageFinder:
- """This finds packages.
-
- This is meant to match easy_install's technique for looking for
- packages, by reading pages and looking for appropriate links.
- """
-
- def __init__(
- self,
- link_collector: LinkCollector,
- target_python: TargetPython,
- allow_yanked: bool,
- use_deprecated_html5lib: bool,
- format_control: Optional[FormatControl] = None,
- candidate_prefs: Optional[CandidatePreferences] = None,
- ignore_requires_python: Optional[bool] = None,
- ) -> None:
- """
- This constructor is primarily meant to be used by the create() class
- method and from tests.
-
- :param format_control: A FormatControl object, used to control
- the selection of source packages / binary packages when consulting
- the index and links.
- :param candidate_prefs: Options to use when creating a
- CandidateEvaluator object.
- """
- if candidate_prefs is None:
- candidate_prefs = CandidatePreferences()
-
- format_control = format_control or FormatControl(set(), set())
-
- self._allow_yanked = allow_yanked
- self._candidate_prefs = candidate_prefs
- self._ignore_requires_python = ignore_requires_python
- self._link_collector = link_collector
- self._target_python = target_python
- self._use_deprecated_html5lib = use_deprecated_html5lib
-
- self.format_control = format_control
-
- # These are boring links that have already been logged somehow.
- self._logged_links: Set[Tuple[Link, LinkType, str]] = set()
-
- # Don't include an allow_yanked default value to make sure each call
- # site considers whether yanked releases are allowed. This also causes
- # that decision to be made explicit in the calling code, which helps
- # people when reading the code.
- @classmethod
- def create(
- cls,
- link_collector: LinkCollector,
- selection_prefs: SelectionPreferences,
- target_python: Optional[TargetPython] = None,
- *,
- use_deprecated_html5lib: bool,
- ) -> "PackageFinder":
- """Create a PackageFinder.
-
- :param selection_prefs: The candidate selection preferences, as a
- SelectionPreferences object.
- :param target_python: The target Python interpreter to use when
- checking compatibility. If None (the default), a TargetPython
- object will be constructed from the running Python.
- """
- if target_python is None:
- target_python = TargetPython()
-
- candidate_prefs = CandidatePreferences(
- prefer_binary=selection_prefs.prefer_binary,
- allow_all_prereleases=selection_prefs.allow_all_prereleases,
- )
-
- return cls(
- candidate_prefs=candidate_prefs,
- link_collector=link_collector,
- target_python=target_python,
- allow_yanked=selection_prefs.allow_yanked,
- format_control=selection_prefs.format_control,
- ignore_requires_python=selection_prefs.ignore_requires_python,
- use_deprecated_html5lib=use_deprecated_html5lib,
- )
-
- @property
- def target_python(self) -> TargetPython:
- return self._target_python
-
- @property
- def search_scope(self) -> SearchScope:
- return self._link_collector.search_scope
-
- @search_scope.setter
- def search_scope(self, search_scope: SearchScope) -> None:
- self._link_collector.search_scope = search_scope
-
- @property
- def find_links(self) -> List[str]:
- return self._link_collector.find_links
-
- @property
- def index_urls(self) -> List[str]:
- return self.search_scope.index_urls
-
- @property
- def trusted_hosts(self) -> Iterable[str]:
- for host_port in self._link_collector.session.pip_trusted_origins:
- yield build_netloc(*host_port)
-
- @property
- def allow_all_prereleases(self) -> bool:
- return self._candidate_prefs.allow_all_prereleases
-
- def set_allow_all_prereleases(self) -> None:
- self._candidate_prefs.allow_all_prereleases = True
-
- @property
- def prefer_binary(self) -> bool:
- return self._candidate_prefs.prefer_binary
-
- def set_prefer_binary(self) -> None:
- self._candidate_prefs.prefer_binary = True
-
- def requires_python_skipped_reasons(self) -> List[str]:
- reasons = {
- detail
- for _, result, detail in self._logged_links
- if result == LinkType.requires_python_mismatch
- }
- return sorted(reasons)
-
- def make_link_evaluator(self, project_name: str) -> LinkEvaluator:
- canonical_name = canonicalize_name(project_name)
- formats = self.format_control.get_allowed_formats(canonical_name)
-
- return LinkEvaluator(
- project_name=project_name,
- canonical_name=canonical_name,
- formats=formats,
- target_python=self._target_python,
- allow_yanked=self._allow_yanked,
- ignore_requires_python=self._ignore_requires_python,
- )
-
- def _sort_links(self, links: Iterable[Link]) -> List[Link]:
- """
- Returns elements of links in order, non-egg links first, egg links
- second, while eliminating duplicates
- """
- eggs, no_eggs = [], []
- seen: Set[Link] = set()
- for link in links:
- if link not in seen:
- seen.add(link)
- if link.egg_fragment:
- eggs.append(link)
- else:
- no_eggs.append(link)
- return no_eggs + eggs
-
- def _log_skipped_link(self, link: Link, result: LinkType, detail: str) -> None:
- entry = (link, result, detail)
- if entry not in self._logged_links:
- # Put the link at the end so the reason is more visible and because
- # the link string is usually very long.
- logger.debug("Skipping link: %s: %s", detail, link)
- self._logged_links.add(entry)
-
- def get_install_candidate(
- self, link_evaluator: LinkEvaluator, link: Link
- ) -> Optional[InstallationCandidate]:
- """
- If the link is a candidate for install, convert it to an
- InstallationCandidate and return it. Otherwise, return None.
- """
- result, detail = link_evaluator.evaluate_link(link)
- if result != LinkType.candidate:
- self._log_skipped_link(link, result, detail)
- return None
-
- return InstallationCandidate(
- name=link_evaluator.project_name,
- link=link,
- version=detail,
- )
-
- def evaluate_links(
- self, link_evaluator: LinkEvaluator, links: Iterable[Link]
- ) -> List[InstallationCandidate]:
- """
- Convert links that are candidates to InstallationCandidate objects.
- """
- candidates = []
- for link in self._sort_links(links):
- candidate = self.get_install_candidate(link_evaluator, link)
- if candidate is not None:
- candidates.append(candidate)
-
- return candidates
-
- def process_project_url(
- self, project_url: Link, link_evaluator: LinkEvaluator
- ) -> List[InstallationCandidate]:
- logger.debug(
- "Fetching project page and analyzing links: %s",
- project_url,
- )
- html_page = self._link_collector.fetch_page(project_url)
- if html_page is None:
- return []
-
- page_links = list(parse_links(html_page, self._use_deprecated_html5lib))
-
- with indent_log():
- package_links = self.evaluate_links(
- link_evaluator,
- links=page_links,
- )
-
- return package_links
-
- @functools.lru_cache(maxsize=None)
- def find_all_candidates(self, project_name: str) -> List[InstallationCandidate]:
- """Find all available InstallationCandidate for project_name
-
- This checks index_urls and find_links.
- All versions found are returned as an InstallationCandidate list.
-
- See LinkEvaluator.evaluate_link() for details on which files
- are accepted.
- """
- link_evaluator = self.make_link_evaluator(project_name)
-
- collected_sources = self._link_collector.collect_sources(
- project_name=project_name,
- candidates_from_page=functools.partial(
- self.process_project_url,
- link_evaluator=link_evaluator,
- ),
- )
-
- page_candidates_it = itertools.chain.from_iterable(
- source.page_candidates()
- for sources in collected_sources
- for source in sources
- if source is not None
- )
- page_candidates = list(page_candidates_it)
-
- file_links_it = itertools.chain.from_iterable(
- source.file_links()
- for sources in collected_sources
- for source in sources
- if source is not None
- )
- file_candidates = self.evaluate_links(
- link_evaluator,
- sorted(file_links_it, reverse=True),
- )
-
- if logger.isEnabledFor(logging.DEBUG) and file_candidates:
- paths = []
- for candidate in file_candidates:
- assert candidate.link.url # we need to have a URL
- try:
- paths.append(candidate.link.file_path)
- except Exception:
- paths.append(candidate.link.url) # it's not a local file
-
- logger.debug("Local files found: %s", ", ".join(paths))
-
- # This is an intentional priority ordering
- return file_candidates + page_candidates
-
- def make_candidate_evaluator(
- self,
- project_name: str,
- specifier: Optional[specifiers.BaseSpecifier] = None,
- hashes: Optional[Hashes] = None,
- ) -> CandidateEvaluator:
- """Create a CandidateEvaluator object to use."""
- candidate_prefs = self._candidate_prefs
- return CandidateEvaluator.create(
- project_name=project_name,
- target_python=self._target_python,
- prefer_binary=candidate_prefs.prefer_binary,
- allow_all_prereleases=candidate_prefs.allow_all_prereleases,
- specifier=specifier,
- hashes=hashes,
- )
-
- @functools.lru_cache(maxsize=None)
- def find_best_candidate(
- self,
- project_name: str,
- specifier: Optional[specifiers.BaseSpecifier] = None,
- hashes: Optional[Hashes] = None,
- ) -> BestCandidateResult:
- """Find matches for the given project and specifier.
-
- :param specifier: An optional object implementing `filter`
- (e.g. `packaging.specifiers.SpecifierSet`) to filter applicable
- versions.
-
- :return: A `BestCandidateResult` instance.
- """
- candidates = self.find_all_candidates(project_name)
- candidate_evaluator = self.make_candidate_evaluator(
- project_name=project_name,
- specifier=specifier,
- hashes=hashes,
- )
- return candidate_evaluator.compute_best_candidate(candidates)
-
- def find_requirement(
- self, req: InstallRequirement, upgrade: bool
- ) -> Optional[InstallationCandidate]:
- """Try to find a Link matching req
-
- Expects req, an InstallRequirement and upgrade, a boolean
- Returns a InstallationCandidate if found,
- Raises DistributionNotFound or BestVersionAlreadyInstalled otherwise
- """
- hashes = req.hashes(trust_internet=False)
- best_candidate_result = self.find_best_candidate(
- req.name,
- specifier=req.specifier,
- hashes=hashes,
- )
- best_candidate = best_candidate_result.best_candidate
-
- installed_version: Optional[_BaseVersion] = None
- if req.satisfied_by is not None:
- installed_version = req.satisfied_by.version
-
- def _format_versions(cand_iter: Iterable[InstallationCandidate]) -> str:
- # This repeated parse_version and str() conversion is needed to
- # handle different vendoring sources from pip and pkg_resources.
- # If we stop using the pkg_resources provided specifier and start
- # using our own, we can drop the cast to str().
- return (
- ", ".join(
- sorted(
- {str(c.version) for c in cand_iter},
- key=parse_version,
- )
- )
- or "none"
- )
-
- if installed_version is None and best_candidate is None:
- logger.critical(
- "Could not find a version that satisfies the requirement %s "
- "(from versions: %s)",
- req,
- _format_versions(best_candidate_result.iter_all()),
- )
-
- raise DistributionNotFound(
- "No matching distribution found for {}".format(req)
- )
-
- best_installed = False
- if installed_version and (
- best_candidate is None or best_candidate.version <= installed_version
- ):
- best_installed = True
-
- if not upgrade and installed_version is not None:
- if best_installed:
- logger.debug(
- "Existing installed version (%s) is most up-to-date and "
- "satisfies requirement",
- installed_version,
- )
- else:
- logger.debug(
- "Existing installed version (%s) satisfies requirement "
- "(most up-to-date version is %s)",
- installed_version,
- best_candidate.version,
- )
- return None
-
- if best_installed:
- # We have an existing version, and it's the best version
- logger.debug(
- "Installed version (%s) is most up-to-date (past versions: %s)",
- installed_version,
- _format_versions(best_candidate_result.iter_applicable()),
- )
- raise BestVersionAlreadyInstalled
-
- logger.debug(
- "Using version %s (newest of versions: %s)",
- best_candidate.version,
- _format_versions(best_candidate_result.iter_applicable()),
- )
- return best_candidate
-
-
-def _find_name_version_sep(fragment: str, canonical_name: str) -> int:
- """Find the separator's index based on the package's canonical name.
-
- :param fragment: A <package>+<version> filename "fragment" (stem) or
- egg fragment.
- :param canonical_name: The package's canonical name.
-
- This function is needed since the canonicalized name does not necessarily
- have the same length as the egg info's name part. An example::
-
- >>> fragment = 'foo__bar-1.0'
- >>> canonical_name = 'foo-bar'
- >>> _find_name_version_sep(fragment, canonical_name)
- 8
- """
- # Project name and version must be separated by one single dash. Find all
- # occurrences of dashes; if the string in front of it matches the canonical
- # name, this is the one separating the name and version parts.
- for i, c in enumerate(fragment):
- if c != "-":
- continue
- if canonicalize_name(fragment[:i]) == canonical_name:
- return i
- raise ValueError(f"{fragment} does not match {canonical_name}")
-
-
-def _extract_version_from_fragment(fragment: str, canonical_name: str) -> Optional[str]:
- """Parse the version string from a + filename
- "fragment" (stem) or egg fragment.
-
- :param fragment: The string to parse. E.g. foo-2.1
- :param canonical_name: The canonicalized name of the package this
- belongs to.
- """
- try:
- version_start = _find_name_version_sep(fragment, canonical_name) + 1
- except ValueError:
- return None
- version = fragment[version_start:]
- if not version:
- return None
- return version
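
The two fragment helpers that close this module are pure string scanning plus PEP 503 name canonicalization: find the one dash whose prefix canonicalizes to the project name, and everything after it is the version. A self-contained sketch of the same algorithm, with a simplified stand-in for `packaging.utils.canonicalize_name` (function names here are illustrative):

```python
import re
from typing import Optional


def canonicalize(name: str) -> str:
    # Simplified PEP 503 normalization; stands in for
    # packaging.utils.canonicalize_name.
    return re.sub(r"[-_.]+", "-", name).lower()


def extract_version(fragment: str, canonical_name: str) -> Optional[str]:
    # Walk every dash; the first prefix that canonicalizes to the project
    # name marks the name/version separator.
    for i, ch in enumerate(fragment):
        if ch == "-" and canonicalize(fragment[:i]) == canonical_name:
            return fragment[i + 1:] or None
    return None


print(extract_version("foo__bar-1.0", "foo-bar"))  # 1.0
print(extract_version("foo-bar", "foo-bar"))       # None (no version part)
```

This scan is needed because canonicalization can change the name's length (`foo__bar` vs `foo-bar`), so the separator cannot be found by simple slicing.
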
diff --git a/env/lib/python3.9/site-packages/pip/_internal/index/sources.py b/env/lib/python3.9/site-packages/pip/_internal/index/sources.py
deleted file mode 100644
index eec3f12..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/index/sources.py
+++ /dev/null
@@ -1,224 +0,0 @@
-import logging
-import mimetypes
-import os
-import pathlib
-from typing import Callable, Iterable, Optional, Tuple
-
-from pip._internal.models.candidate import InstallationCandidate
-from pip._internal.models.link import Link
-from pip._internal.utils.urls import path_to_url, url_to_path
-from pip._internal.vcs import is_url
-
-logger = logging.getLogger(__name__)
-
-FoundCandidates = Iterable[InstallationCandidate]
-FoundLinks = Iterable[Link]
-CandidatesFromPage = Callable[[Link], Iterable[InstallationCandidate]]
-PageValidator = Callable[[Link], bool]
-
-
-class LinkSource:
- @property
- def link(self) -> Optional[Link]:
- """Returns the underlying link, if there's one."""
- raise NotImplementedError()
-
- def page_candidates(self) -> FoundCandidates:
- """Candidates found by parsing an archive listing HTML file."""
- raise NotImplementedError()
-
- def file_links(self) -> FoundLinks:
- """Links found by specifying archives directly."""
- raise NotImplementedError()
-
-
-def _is_html_file(file_url: str) -> bool:
- return mimetypes.guess_type(file_url, strict=False)[0] == "text/html"
-
-
-class _FlatDirectorySource(LinkSource):
- """Link source specified by ``--find-links=``.
-
- This looks the content of the directory, and returns:
-
- * ``page_candidates``: Links listed on each HTML file in the directory.
- * ``file_links``: Archives in the directory.
- """
-
- def __init__(
- self,
- candidates_from_page: CandidatesFromPage,
- path: str,
- ) -> None:
- self._candidates_from_page = candidates_from_page
- self._path = pathlib.Path(os.path.realpath(path))
-
- @property
- def link(self) -> Optional[Link]:
- return None
-
- def page_candidates(self) -> FoundCandidates:
- for path in self._path.iterdir():
- url = path_to_url(str(path))
- if not _is_html_file(url):
- continue
- yield from self._candidates_from_page(Link(url))
-
- def file_links(self) -> FoundLinks:
- for path in self._path.iterdir():
- url = path_to_url(str(path))
- if _is_html_file(url):
- continue
- yield Link(url)
-
-
-class _LocalFileSource(LinkSource):
- """``--find-links=`` or ``--[extra-]index-url=``.
-
- If a URL is supplied, it must be a ``file:`` URL. If a path is supplied to
- the option, it is converted to a URL first. This returns:
-
- * ``page_candidates``: Links listed on an HTML file.
- * ``file_links``: The non-HTML file.
- """
-
- def __init__(
- self,
- candidates_from_page: CandidatesFromPage,
- link: Link,
- ) -> None:
- self._candidates_from_page = candidates_from_page
- self._link = link
-
- @property
- def link(self) -> Optional[Link]:
- return self._link
-
- def page_candidates(self) -> FoundCandidates:
- if not _is_html_file(self._link.url):
- return
- yield from self._candidates_from_page(self._link)
-
- def file_links(self) -> FoundLinks:
- if _is_html_file(self._link.url):
- return
- yield self._link
-
-
-class _RemoteFileSource(LinkSource):
- """``--find-links=`` or ``--[extra-]index-url=``.
-
- This returns:
-
- * ``page_candidates``: Links listed on an HTML file.
- * ``file_links``: The non-HTML file.
- """
-
- def __init__(
- self,
- candidates_from_page: CandidatesFromPage,
- page_validator: PageValidator,
- link: Link,
- ) -> None:
- self._candidates_from_page = candidates_from_page
- self._page_validator = page_validator
- self._link = link
-
- @property
- def link(self) -> Optional[Link]:
- return self._link
-
- def page_candidates(self) -> FoundCandidates:
- if not self._page_validator(self._link):
- return
- yield from self._candidates_from_page(self._link)
-
- def file_links(self) -> FoundLinks:
- yield self._link
-
-
-class _IndexDirectorySource(LinkSource):
- """``--[extra-]index-url=``.
-
- This is treated like a remote URL; ``candidates_from_page`` contains logic
- for this by appending ``index.html`` to the link.
- """
-
- def __init__(
- self,
- candidates_from_page: CandidatesFromPage,
- link: Link,
- ) -> None:
- self._candidates_from_page = candidates_from_page
- self._link = link
-
- @property
- def link(self) -> Optional[Link]:
- return self._link
-
- def page_candidates(self) -> FoundCandidates:
- yield from self._candidates_from_page(self._link)
-
- def file_links(self) -> FoundLinks:
- return ()
-
-
-def build_source(
- location: str,
- *,
- candidates_from_page: CandidatesFromPage,
- page_validator: PageValidator,
- expand_dir: bool,
- cache_link_parsing: bool,
-) -> Tuple[Optional[str], Optional[LinkSource]]:
-
- path: Optional[str] = None
- url: Optional[str] = None
- if os.path.exists(location): # Is a local path.
- url = path_to_url(location)
- path = location
- elif location.startswith("file:"): # A file: URL.
- url = location
- path = url_to_path(location)
- elif is_url(location):
- url = location
-
- if url is None:
- msg = (
- "Location '%s' is ignored: "
- "it is either a non-existing path or lacks a specific scheme."
- )
- logger.warning(msg, location)
- return (None, None)
-
- if path is None:
- source: LinkSource = _RemoteFileSource(
- candidates_from_page=candidates_from_page,
- page_validator=page_validator,
- link=Link(url, cache_link_parsing=cache_link_parsing),
- )
- return (url, source)
-
- if os.path.isdir(path):
- if expand_dir:
- source = _FlatDirectorySource(
- candidates_from_page=candidates_from_page,
- path=path,
- )
- else:
- source = _IndexDirectorySource(
- candidates_from_page=candidates_from_page,
- link=Link(url, cache_link_parsing=cache_link_parsing),
- )
- return (url, source)
- elif os.path.isfile(path):
- source = _LocalFileSource(
- candidates_from_page=candidates_from_page,
- link=Link(url, cache_link_parsing=cache_link_parsing),
- )
- return (url, source)
- logger.warning(
- "Location '%s' is ignored: it is neither a file nor a directory.",
- location,
- )
- return (url, None)
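
`build_source` above dispatches on two cheap checks: whether the location resolves to something on disk (a directory or a single file), and whether a URL "looks like" an HTML page by MIME type. The HTML heuristic is nothing more than an extension guess via the standard library; a runnable sketch of the same check (names are illustrative):

```python
import mimetypes


def is_html_file(file_url: str) -> bool:
    # Same heuristic as the deleted _is_html_file: guess by extension only.
    return mimetypes.guess_type(file_url, strict=False)[0] == "text/html"


print(is_html_file("file:///srv/links/index.html"))      # True  -> page_candidates
print(is_html_file("file:///srv/links/pkg-1.0.tar.gz"))  # False -> file_links
```

Anything that guesses as HTML is fed through `candidates_from_page` (link parsing); everything else is treated as a directly specified archive.
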
diff --git a/env/lib/python3.9/site-packages/pip/_internal/locations/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/locations/__init__.py
deleted file mode 100644
index 99312d7..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/locations/__init__.py
+++ /dev/null
@@ -1,520 +0,0 @@
-import functools
-import logging
-import os
-import pathlib
-import sys
-import sysconfig
-from typing import Any, Dict, Generator, List, Optional, Tuple
-
-from pip._internal.models.scheme import SCHEME_KEYS, Scheme
-from pip._internal.utils.compat import WINDOWS
-from pip._internal.utils.deprecation import deprecated
-from pip._internal.utils.virtualenv import running_under_virtualenv
-
-from . import _distutils, _sysconfig
-from .base import (
- USER_CACHE_DIR,
- get_major_minor_version,
- get_src_prefix,
- is_osx_framework,
- site_packages,
- user_site,
-)
-
-__all__ = [
- "USER_CACHE_DIR",
- "get_bin_prefix",
- "get_bin_user",
- "get_major_minor_version",
- "get_platlib",
- "get_prefixed_libs",
- "get_purelib",
- "get_scheme",
- "get_src_prefix",
- "site_packages",
- "user_site",
-]
-
-
-logger = logging.getLogger(__name__)
-
-
-_PLATLIBDIR: str = getattr(sys, "platlibdir", "lib")
-
-_USE_SYSCONFIG_DEFAULT = sys.version_info >= (3, 10)
-
-
-def _should_use_sysconfig() -> bool:
- """This function determines the value of _USE_SYSCONFIG.
-
- By default, pip uses sysconfig on Python 3.10+.
- But Python distributors can override this decision by setting:
- sysconfig._PIP_USE_SYSCONFIG = True / False
- Rationale in https://github.com/pypa/pip/issues/10647
-
- This is a function for testability, but should be constant during any one
- run.
- """
- return bool(getattr(sysconfig, "_PIP_USE_SYSCONFIG", _USE_SYSCONFIG_DEFAULT))
-
-
-_USE_SYSCONFIG = _should_use_sysconfig()
-
-# Be noisy about incompatibilities if this platform "should" be using
-# sysconfig, but is explicitly opting out and using distutils instead.
-if _USE_SYSCONFIG_DEFAULT and not _USE_SYSCONFIG:
- _MISMATCH_LEVEL = logging.WARNING
-else:
- _MISMATCH_LEVEL = logging.DEBUG
-
-
-def _looks_like_bpo_44860() -> bool:
- """The resolution to bpo-44860 will change this incorrect platlib.
-
- See <https://bugs.python.org/issue44860>.
- """
- from distutils.command.install import INSTALL_SCHEMES
-
- try:
- unix_user_platlib = INSTALL_SCHEMES["unix_user"]["platlib"]
- except KeyError:
- return False
- return unix_user_platlib == "$usersite"
-
-
-def _looks_like_red_hat_patched_platlib_purelib(scheme: Dict[str, str]) -> bool:
- platlib = scheme["platlib"]
- if "/$platlibdir/" in platlib:
- platlib = platlib.replace("/$platlibdir/", f"/{_PLATLIBDIR}/")
- if "/lib64/" not in platlib:
- return False
- unpatched = platlib.replace("/lib64/", "/lib/")
- return unpatched.replace("$platbase/", "$base/") == scheme["purelib"]
-
-
-@functools.lru_cache(maxsize=None)
-def _looks_like_red_hat_lib() -> bool:
- """Red Hat patches platlib in unix_prefix and unix_home, but not purelib.
-
- This is the only way I can see to tell a Red Hat-patched Python.
- """
- from distutils.command.install import INSTALL_SCHEMES
-
- return all(
- k in INSTALL_SCHEMES
- and _looks_like_red_hat_patched_platlib_purelib(INSTALL_SCHEMES[k])
- for k in ("unix_prefix", "unix_home")
- )
-
-
-@functools.lru_cache(maxsize=None)
-def _looks_like_debian_scheme() -> bool:
- """Debian adds two additional schemes."""
- from distutils.command.install import INSTALL_SCHEMES
-
- return "deb_system" in INSTALL_SCHEMES and "unix_local" in INSTALL_SCHEMES
-
-
-@functools.lru_cache(maxsize=None)
-def _looks_like_red_hat_scheme() -> bool:
- """Red Hat patches ``sys.prefix`` and ``sys.exec_prefix``.
-
- Red Hat's ``00251-change-user-install-location.patch`` changes the install
- command's ``prefix`` and ``exec_prefix`` to append ``"/local"``. This is
- (fortunately?) done quite unconditionally, so we create a default command
- object without any configuration to detect this.
- """
- from distutils.command.install import install
- from distutils.dist import Distribution
-
- cmd: Any = install(Distribution())
- cmd.finalize_options()
- return (
- cmd.exec_prefix == f"{os.path.normpath(sys.exec_prefix)}/local"
- and cmd.prefix == f"{os.path.normpath(sys.prefix)}/local"
- )
-
-
-@functools.lru_cache(maxsize=None)
-def _looks_like_slackware_scheme() -> bool:
- """Slackware patches sysconfig but fails to patch distutils and site.
-
- Slackware changes sysconfig's user scheme to use ``"lib64"`` for the lib
- path, but does not do the same to the site module.
- """
- if user_site is None: # User-site not available.
- return False
- try:
- paths = sysconfig.get_paths(scheme="posix_user", expand=False)
- except KeyError: # User-site not available.
- return False
- return "/lib64/" in paths["purelib"] and "/lib64/" not in user_site
-
-
-@functools.lru_cache(maxsize=None)
-def _looks_like_msys2_mingw_scheme() -> bool:
- """MSYS2 patches distutils and sysconfig to use a UNIX-like scheme.
-
- However, MSYS2 incorrectly patches the sysconfig ``nt`` scheme. The fix is
- likely going to be included in their 3.10 release, so we ignore the warning.
- See msys2/MINGW-packages#9319.
-
- MSYS2 MINGW's patch uses lowercase ``"lib"`` instead of the usual uppercase,
- and is missing the final ``"site-packages"``.
- """
- paths = sysconfig.get_paths("nt", expand=False)
- return all(
- "Lib" not in p and "lib" in p and not p.endswith("site-packages")
- for p in (paths[key] for key in ("platlib", "purelib"))
- )
-
-
-def _fix_abiflags(parts: Tuple[str, ...]) -> Generator[str, None, None]:
- ldversion = sysconfig.get_config_var("LDVERSION")
- abiflags = getattr(sys, "abiflags", None)
-
- # LDVERSION does not end with sys.abiflags. Just return the path unchanged.
- if not ldversion or not abiflags or not ldversion.endswith(abiflags):
- yield from parts
- return
-
- # Strip sys.abiflags from LDVERSION-based path components.
- for part in parts:
- if part.endswith(ldversion):
- part = part[: (0 - len(abiflags))]
- yield part
-
-
-@functools.lru_cache(maxsize=None)
-def _warn_mismatched(old: pathlib.Path, new: pathlib.Path, *, key: str) -> None:
- issue_url = "https://github.com/pypa/pip/issues/10151"
- message = (
- "Value for %s does not match. Please report this to <%s>"
- "\ndistutils: %s"
- "\nsysconfig: %s"
- )
- logger.log(_MISMATCH_LEVEL, message, key, issue_url, old, new)
-
-
-def _warn_if_mismatch(old: pathlib.Path, new: pathlib.Path, *, key: str) -> bool:
- if old == new:
- return False
- _warn_mismatched(old, new, key=key)
- return True
-
-
-@functools.lru_cache(maxsize=None)
-def _log_context(
- *,
- user: bool = False,
- home: Optional[str] = None,
- root: Optional[str] = None,
- prefix: Optional[str] = None,
-) -> None:
- parts = [
- "Additional context:",
- "user = %r",
- "home = %r",
- "root = %r",
- "prefix = %r",
- ]
-
- logger.log(_MISMATCH_LEVEL, "\n".join(parts), user, home, root, prefix)
-
-
-def get_scheme(
- dist_name: str,
- user: bool = False,
- home: Optional[str] = None,
- root: Optional[str] = None,
- isolated: bool = False,
- prefix: Optional[str] = None,
-) -> Scheme:
- new = _sysconfig.get_scheme(
- dist_name,
- user=user,
- home=home,
- root=root,
- isolated=isolated,
- prefix=prefix,
- )
- if _USE_SYSCONFIG:
- return new
-
- old = _distutils.get_scheme(
- dist_name,
- user=user,
- home=home,
- root=root,
- isolated=isolated,
- prefix=prefix,
- )
-
- warning_contexts = []
- for k in SCHEME_KEYS:
- old_v = pathlib.Path(getattr(old, k))
- new_v = pathlib.Path(getattr(new, k))
-
- if old_v == new_v:
- continue
-
- # distutils incorrectly put PyPy packages under ``site-packages/python``
- # in the ``posix_home`` scheme, but PyPy devs said they expect the
- # directory name to be ``pypy`` instead. So we treat this as a bug fix
- # and not warn about it. See bpo-43307 and python/cpython#24628.
- skip_pypy_special_case = (
- sys.implementation.name == "pypy"
- and home is not None
- and k in ("platlib", "purelib")
- and old_v.parent == new_v.parent
- and old_v.name.startswith("python")
- and new_v.name.startswith("pypy")
- )
- if skip_pypy_special_case:
- continue
-
- # sysconfig's ``osx_framework_user`` does not include ``pythonX.Y`` in
- # the ``include`` value, but distutils's ``headers`` does. We'll let
- # CPython decide whether this is a bug or feature. See bpo-43948.
- skip_osx_framework_user_special_case = (
- user
- and is_osx_framework()
- and k == "headers"
- and old_v.parent.parent == new_v.parent
- and old_v.parent.name.startswith("python")
- )
- if skip_osx_framework_user_special_case:
- continue
-
- # On Red Hat and derived Linux distributions, distutils is patched to
- # use "lib64" instead of "lib" for platlib.
- if k == "platlib" and _looks_like_red_hat_lib():
- continue
-
- # On Python 3.9+, sysconfig's posix_user scheme sets platlib against
- # sys.platlibdir, but distutils's unix_user incorrectly continues
- # using the same $usersite for both platlib and purelib. This creates a
- # mismatch when sys.platlibdir is not "lib".
- skip_bpo_44860 = (
- user
- and k == "platlib"
- and not WINDOWS
- and sys.version_info >= (3, 9)
- and _PLATLIBDIR != "lib"
- and _looks_like_bpo_44860()
- )
- if skip_bpo_44860:
- continue
-
- # Slackware incorrectly patches posix_user to use lib64 instead of lib,
- # but not usersite to match the location.
- skip_slackware_user_scheme = (
- user
- and k in ("platlib", "purelib")
- and not WINDOWS
- and _looks_like_slackware_scheme()
- )
- if skip_slackware_user_scheme:
- continue
-
- # Both Debian and Red Hat patch Python to place the system site under
- # /usr/local instead of /usr. Debian also places lib in dist-packages
- # instead of site-packages, but the /usr/local check should cover it.
- skip_linux_system_special_case = (
- not (user or home or prefix or running_under_virtualenv())
- and old_v.parts[1:3] == ("usr", "local")
- and len(new_v.parts) > 1
- and new_v.parts[1] == "usr"
- and (len(new_v.parts) < 3 or new_v.parts[2] != "local")
- and (_looks_like_red_hat_scheme() or _looks_like_debian_scheme())
- )
- if skip_linux_system_special_case:
- continue
-
- # On Python 3.7 and earlier, sysconfig does not include sys.abiflags in
- # the "pythonX.Y" part of the path, but distutils does.
- skip_sysconfig_abiflag_bug = (
- sys.version_info < (3, 8)
- and not WINDOWS
- and k in ("headers", "platlib", "purelib")
- and tuple(_fix_abiflags(old_v.parts)) == new_v.parts
- )
- if skip_sysconfig_abiflag_bug:
- continue
-
- # MSYS2 MINGW's sysconfig patch does not include the "site-packages"
- # part of the path. This is incorrect and will be fixed in MSYS.
- skip_msys2_mingw_bug = (
- WINDOWS and k in ("platlib", "purelib") and _looks_like_msys2_mingw_scheme()
- )
- if skip_msys2_mingw_bug:
- continue
-
- # CPython's POSIX install script invokes pip (via ensurepip) against the
- # interpreter located in the source tree, not the install site. This
- # triggers special logic in sysconfig that's not present in distutils.
- # https://github.com/python/cpython/blob/8c21941ddaf/Lib/sysconfig.py#L178-L194
- skip_cpython_build = (
- sysconfig.is_python_build(check_home=True)
- and not WINDOWS
- and k in ("headers", "include", "platinclude")
- )
- if skip_cpython_build:
- continue
-
- warning_contexts.append((old_v, new_v, f"scheme.{k}"))
-
- if not warning_contexts:
- return old
-
- # Check if this path mismatch is caused by distutils config files. Those
- # files will no longer work once we switch to sysconfig, so this raises a
- # deprecation message for them.
- default_old = _distutils.distutils_scheme(
- dist_name,
- user,
- home,
- root,
- isolated,
- prefix,
- ignore_config_files=True,
- )
- if any(default_old[k] != getattr(old, k) for k in SCHEME_KEYS):
- deprecated(
- reason=(
- "Configuring installation scheme with distutils config files "
- "is deprecated and will no longer work in the near future. If you "
- "are using a Homebrew or Linuxbrew Python, please see discussion "
- "at https://github.com/Homebrew/homebrew-core/issues/76621"
- ),
- replacement=None,
- gone_in=None,
- )
- return old
-
- # Post warnings about this mismatch so the user can report them back.
- for old_v, new_v, key in warning_contexts:
- _warn_mismatched(old_v, new_v, key=key)
- _log_context(user=user, home=home, root=root, prefix=prefix)
-
- return old
-
-
-def get_bin_prefix() -> str:
- new = _sysconfig.get_bin_prefix()
- if _USE_SYSCONFIG:
- return new
-
- old = _distutils.get_bin_prefix()
- if _warn_if_mismatch(pathlib.Path(old), pathlib.Path(new), key="bin_prefix"):
- _log_context()
- return old
-
-
-def get_bin_user() -> str:
- return _sysconfig.get_scheme("", user=True).scripts
-
-
-def _looks_like_deb_system_dist_packages(value: str) -> bool:
- """Check if the value is Debian's APT-controlled dist-packages.
-
- Debian's ``distutils.sysconfig.get_python_lib()`` implementation returns the
- default package path controlled by APT, but does not patch ``sysconfig`` to
- do the same. This is similar to the bug worked around in ``get_scheme()``,
- but here the default is ``deb_system`` instead of ``unix_local``. Ultimately
- we can't do anything about this Debian bug, and this detection allows us to
- skip the warning when needed.
- """
- if not _looks_like_debian_scheme():
- return False
- if value == "/usr/lib/python3/dist-packages":
- return True
- return False
-
-
-def get_purelib() -> str:
- """Return the default pure-Python lib location."""
- new = _sysconfig.get_purelib()
- if _USE_SYSCONFIG:
- return new
-
- old = _distutils.get_purelib()
- if _looks_like_deb_system_dist_packages(old):
- return old
- if _warn_if_mismatch(pathlib.Path(old), pathlib.Path(new), key="purelib"):
- _log_context()
- return old
-
-
-def get_platlib() -> str:
- """Return the default platform-shared lib location."""
- new = _sysconfig.get_platlib()
- if _USE_SYSCONFIG:
- return new
-
- old = _distutils.get_platlib()
- if _looks_like_deb_system_dist_packages(old):
- return old
- if _warn_if_mismatch(pathlib.Path(old), pathlib.Path(new), key="platlib"):
- _log_context()
- return old
-
-
-def _deduplicated(v1: str, v2: str) -> List[str]:
- """Deduplicate values from a list."""
- if v1 == v2:
- return [v1]
- return [v1, v2]
-
-
-def _looks_like_apple_library(path: str) -> bool:
- """Apple patches sysconfig to *always* look under */Library/Python*."""
- if sys.platform[:6] != "darwin":
- return False
- return path == f"/Library/Python/{get_major_minor_version()}/site-packages"
-
-
-def get_prefixed_libs(prefix: str) -> List[str]:
- """Return the lib locations under ``prefix``."""
- new_pure, new_plat = _sysconfig.get_prefixed_libs(prefix)
- if _USE_SYSCONFIG:
- return _deduplicated(new_pure, new_plat)
-
- old_pure, old_plat = _distutils.get_prefixed_libs(prefix)
- old_lib_paths = _deduplicated(old_pure, old_plat)
-
-    # Apple's Python (shipped with Xcode and Command Line Tools) hard-codes
- # platlib and purelib to '/Library/Python/X.Y/site-packages'. This will
- # cause serious build isolation bugs when Apple starts shipping 3.10 because
- # pip will install build backends to the wrong location. This tells users
- # who is at fault so Apple may notice it and fix the issue in time.
- if all(_looks_like_apple_library(p) for p in old_lib_paths):
- deprecated(
- reason=(
- "Python distributed by Apple's Command Line Tools incorrectly "
- "patches sysconfig to always point to '/Library/Python'. This "
- "will cause build isolation to operate incorrectly on Python "
- "3.10 or later. Please help report this to Apple so they can "
- "fix this. https://developer.apple.com/bug-reporting/"
- ),
- replacement=None,
- gone_in=None,
- )
- return old_lib_paths
-
- warned = [
- _warn_if_mismatch(
- pathlib.Path(old_pure),
- pathlib.Path(new_pure),
- key="prefixed-purelib",
- ),
- _warn_if_mismatch(
- pathlib.Path(old_plat),
- pathlib.Path(new_plat),
- key="prefixed-platlib",
- ),
- ]
- if any(warned):
- _log_context(prefix=prefix)
-
- return old_lib_paths
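A minimal sketch of the idea this module implements, outside pip: compute the same location with both the distutils and sysconfig backends and report when they disagree. This is not pip's API, and distutils itself is deprecated (removed in Python 3.12), so treat it as illustration only.

```python
# Sketch (assumption: Python <= 3.11, where distutils still exists).
import sysconfig
from distutils.sysconfig import get_python_lib  # deprecated backend

old = get_python_lib(plat_specific=False)  # distutils' answer for purelib
new = sysconfig.get_paths()["purelib"]     # sysconfig's answer
if old != new:
    print(f"purelib mismatch: distutils={old!r} sysconfig={new!r}")
```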
diff --git a/env/lib/python3.9/site-packages/pip/_internal/locations/_distutils.py b/env/lib/python3.9/site-packages/pip/_internal/locations/_distutils.py
deleted file mode 100644
index aac8218..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/locations/_distutils.py
+++ /dev/null
@@ -1,169 +0,0 @@
-"""Locations where we look for configs, install stuff, etc"""
-
-# The following comment should be removed at some point in the future.
-# mypy: strict-optional=False
-
-import logging
-import os
-import sys
-from distutils.cmd import Command as DistutilsCommand
-from distutils.command.install import SCHEME_KEYS
-from distutils.command.install import install as distutils_install_command
-from distutils.sysconfig import get_python_lib
-from typing import Dict, List, Optional, Tuple, Union, cast
-
-from pip._internal.models.scheme import Scheme
-from pip._internal.utils.compat import WINDOWS
-from pip._internal.utils.virtualenv import running_under_virtualenv
-
-from .base import get_major_minor_version
-
-logger = logging.getLogger(__name__)
-
-
-def distutils_scheme(
- dist_name: str,
- user: bool = False,
- home: str = None,
- root: str = None,
- isolated: bool = False,
- prefix: str = None,
- *,
- ignore_config_files: bool = False,
-) -> Dict[str, str]:
- """
- Return a distutils install scheme
- """
- from distutils.dist import Distribution
-
- dist_args: Dict[str, Union[str, List[str]]] = {"name": dist_name}
- if isolated:
- dist_args["script_args"] = ["--no-user-cfg"]
-
- d = Distribution(dist_args)
- if not ignore_config_files:
- try:
- d.parse_config_files()
- except UnicodeDecodeError:
- # Typeshed does not include find_config_files() for some reason.
- paths = d.find_config_files() # type: ignore
- logger.warning(
- "Ignore distutils configs in %s due to encoding errors.",
- ", ".join(os.path.basename(p) for p in paths),
- )
- obj: Optional[DistutilsCommand] = None
- obj = d.get_command_obj("install", create=True)
- assert obj is not None
- i = cast(distutils_install_command, obj)
- # NOTE: setting user or home has the side-effect of creating the home dir
- # or user base for installations during finalize_options()
- # ideally, we'd prefer a scheme class that has no side-effects.
- assert not (user and prefix), f"user={user} prefix={prefix}"
- assert not (home and prefix), f"home={home} prefix={prefix}"
- i.user = user or i.user
- if user or home:
- i.prefix = ""
- i.prefix = prefix or i.prefix
- i.home = home or i.home
- i.root = root or i.root
- i.finalize_options()
-
- scheme = {}
- for key in SCHEME_KEYS:
- scheme[key] = getattr(i, "install_" + key)
-
- # install_lib specified in setup.cfg should install *everything*
- # into there (i.e. it takes precedence over both purelib and
-    # platlib). Note that i.install_lib is *always* set after
-    # finalize_options(); we only want to override here if the user
-    # has explicitly requested it, hence going back to the config.
- if "install_lib" in d.get_option_dict("install"):
- scheme.update(dict(purelib=i.install_lib, platlib=i.install_lib))
-
- if running_under_virtualenv():
- if home:
- prefix = home
- elif user:
- prefix = i.install_userbase
- else:
- prefix = i.prefix
- scheme["headers"] = os.path.join(
- prefix,
- "include",
- "site",
- f"python{get_major_minor_version()}",
- dist_name,
- )
-
- if root is not None:
- path_no_drive = os.path.splitdrive(os.path.abspath(scheme["headers"]))[1]
- scheme["headers"] = os.path.join(root, path_no_drive[1:])
-
- return scheme
-
-
-def get_scheme(
- dist_name: str,
- user: bool = False,
- home: Optional[str] = None,
- root: Optional[str] = None,
- isolated: bool = False,
- prefix: Optional[str] = None,
-) -> Scheme:
- """
- Get the "scheme" corresponding to the input parameters. The distutils
- documentation provides the context for the available schemes:
- https://docs.python.org/3/install/index.html#alternate-installation
-
- :param dist_name: the name of the package to retrieve the scheme for, used
- in the headers scheme path
- :param user: indicates to use the "user" scheme
- :param home: indicates to use the "home" scheme and provides the base
- directory for the same
- :param root: root under which other directories are re-based
- :param isolated: equivalent to --no-user-cfg, i.e. do not consider
- ~/.pydistutils.cfg (posix) or ~/pydistutils.cfg (non-posix) for
- scheme paths
- :param prefix: indicates to use the "prefix" scheme and provides the
- base directory for the same
- """
- scheme = distutils_scheme(dist_name, user, home, root, isolated, prefix)
- return Scheme(
- platlib=scheme["platlib"],
- purelib=scheme["purelib"],
- headers=scheme["headers"],
- scripts=scheme["scripts"],
- data=scheme["data"],
- )
-
-
-def get_bin_prefix() -> str:
- # XXX: In old virtualenv versions, sys.prefix can contain '..' components,
- # so we need to call normpath to eliminate them.
- prefix = os.path.normpath(sys.prefix)
- if WINDOWS:
- bin_py = os.path.join(prefix, "Scripts")
- # buildout uses 'bin' on Windows too?
- if not os.path.exists(bin_py):
- bin_py = os.path.join(prefix, "bin")
- return bin_py
- # Forcing to use /usr/local/bin for standard macOS framework installs
- # Also log to ~/Library/Logs/ for use with the Console.app log viewer
- if sys.platform[:6] == "darwin" and prefix[:16] == "/System/Library/":
- return "/usr/local/bin"
- return os.path.join(prefix, "bin")
-
-
-def get_purelib() -> str:
- return get_python_lib(plat_specific=False)
-
-
-def get_platlib() -> str:
- return get_python_lib(plat_specific=True)
-
-
-def get_prefixed_libs(prefix: str) -> Tuple[str, str]:
- return (
- get_python_lib(plat_specific=False, prefix=prefix),
- get_python_lib(plat_specific=True, prefix=prefix),
- )
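For reference, the ``install`` command machinery that ``distutils_scheme()`` wraps above can be exercised directly. A hedged sketch (again assuming Python <= 3.11 with distutils in the stdlib; ``"example"`` is a placeholder name):

```python
# Sketch: derive an install scheme the way distutils_scheme() does, by
# finalizing a distutils "install" command and reading its install_*
# attributes. finalize_options() has side effects (may create dirs).
from distutils.dist import Distribution

d = Distribution({"name": "example"})
cmd = d.get_command_obj("install", create=True)
cmd.finalize_options()
for key in ("purelib", "platlib", "headers", "scripts", "data"):
    print(key, "=", getattr(cmd, "install_" + key))
```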
diff --git a/env/lib/python3.9/site-packages/pip/_internal/locations/_sysconfig.py b/env/lib/python3.9/site-packages/pip/_internal/locations/_sysconfig.py
deleted file mode 100644
index 5e141aa..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/locations/_sysconfig.py
+++ /dev/null
@@ -1,219 +0,0 @@
-import distutils.util # FIXME: For change_root.
-import logging
-import os
-import sys
-import sysconfig
-import typing
-
-from pip._internal.exceptions import InvalidSchemeCombination, UserInstallationInvalid
-from pip._internal.models.scheme import SCHEME_KEYS, Scheme
-from pip._internal.utils.virtualenv import running_under_virtualenv
-
-from .base import get_major_minor_version, is_osx_framework
-
-logger = logging.getLogger(__name__)
-
-
-# Notes on _infer_* functions.
-# Unfortunately ``get_default_scheme()`` didn't exist before 3.10, so there's no
-# way to ask things like "what is the '_prefix' scheme on this platform". These
-# functions try to answer that with some heuristics while accounting for ad-hoc
-# platforms not covered by CPython's default sysconfig implementation. If the
-# ad-hoc implementation does not fully implement sysconfig, we'll fall back to
-# a POSIX scheme.
-
-_AVAILABLE_SCHEMES = set(sysconfig.get_scheme_names())
-
-_PREFERRED_SCHEME_API = getattr(sysconfig, "get_preferred_scheme", None)
-
-
-def _should_use_osx_framework_prefix() -> bool:
- """Check for Apple's ``osx_framework_library`` scheme.
-
- Python distributed by Apple's Command Line Tools has this special scheme
- that's used when:
-
- * This is a framework build.
- * We are installing into the system prefix.
-
- This does not account for ``pip install --prefix`` (also means we're not
- installing to the system prefix), which should use ``posix_prefix``, but
- logic here means ``_infer_prefix()`` outputs ``osx_framework_library``. But
- since ``prefix`` is not available for ``sysconfig.get_default_scheme()``,
- which is the stdlib replacement for ``_infer_prefix()``, presumably Apple
- wouldn't be able to magically switch between ``osx_framework_library`` and
- ``posix_prefix``. ``_infer_prefix()`` returning ``osx_framework_library``
- means its behavior is consistent whether we use the stdlib implementation
- or our own, and we deal with this special case in ``get_scheme()`` instead.
- """
- return (
- "osx_framework_library" in _AVAILABLE_SCHEMES
- and not running_under_virtualenv()
- and is_osx_framework()
- )
-
-
-def _infer_prefix() -> str:
- """Try to find a prefix scheme for the current platform.
-
- This tries:
-
- * A special ``osx_framework_library`` for Python distributed by Apple's
- Command Line Tools, when not running in a virtual environment.
- * Implementation + OS, used by PyPy on Windows (``pypy_nt``).
- * Implementation without OS, used by PyPy on POSIX (``pypy``).
- * OS + "prefix", used by CPython on POSIX (``posix_prefix``).
- * Just the OS name, used by CPython on Windows (``nt``).
-
- If none of the above works, fall back to ``posix_prefix``.
- """
- if _PREFERRED_SCHEME_API:
- return _PREFERRED_SCHEME_API("prefix")
- if _should_use_osx_framework_prefix():
- return "osx_framework_library"
- implementation_suffixed = f"{sys.implementation.name}_{os.name}"
- if implementation_suffixed in _AVAILABLE_SCHEMES:
- return implementation_suffixed
- if sys.implementation.name in _AVAILABLE_SCHEMES:
- return sys.implementation.name
- suffixed = f"{os.name}_prefix"
- if suffixed in _AVAILABLE_SCHEMES:
- return suffixed
-    if os.name in _AVAILABLE_SCHEMES: # On Windows, prefix is just called "nt".
- return os.name
- return "posix_prefix"
-
-
-def _infer_user() -> str:
- """Try to find a user scheme for the current platform."""
- if _PREFERRED_SCHEME_API:
- return _PREFERRED_SCHEME_API("user")
- if is_osx_framework() and not running_under_virtualenv():
- suffixed = "osx_framework_user"
- else:
- suffixed = f"{os.name}_user"
- if suffixed in _AVAILABLE_SCHEMES:
- return suffixed
- if "posix_user" not in _AVAILABLE_SCHEMES: # User scheme unavailable.
- raise UserInstallationInvalid()
- return "posix_user"
-
-
-def _infer_home() -> str:
- """Try to find a home for the current platform."""
- if _PREFERRED_SCHEME_API:
- return _PREFERRED_SCHEME_API("home")
- suffixed = f"{os.name}_home"
- if suffixed in _AVAILABLE_SCHEMES:
- return suffixed
- return "posix_home"
-
-
-# Update these keys if the user sets a custom home.
-_HOME_KEYS = [
- "installed_base",
- "base",
- "installed_platbase",
- "platbase",
- "prefix",
- "exec_prefix",
-]
-if sysconfig.get_config_var("userbase") is not None:
- _HOME_KEYS.append("userbase")
-
-
-def get_scheme(
- dist_name: str,
- user: bool = False,
- home: typing.Optional[str] = None,
- root: typing.Optional[str] = None,
- isolated: bool = False,
- prefix: typing.Optional[str] = None,
-) -> Scheme:
- """
- Get the "scheme" corresponding to the input parameters.
-
- :param dist_name: the name of the package to retrieve the scheme for, used
- in the headers scheme path
- :param user: indicates to use the "user" scheme
- :param home: indicates to use the "home" scheme
- :param root: root under which other directories are re-based
- :param isolated: ignored, but kept for distutils compatibility (where
- this controls whether the user-site pydistutils.cfg is honored)
- :param prefix: indicates to use the "prefix" scheme and provides the
- base directory for the same
- """
- if user and prefix:
- raise InvalidSchemeCombination("--user", "--prefix")
- if home and prefix:
- raise InvalidSchemeCombination("--home", "--prefix")
-
- if home is not None:
- scheme_name = _infer_home()
- elif user:
- scheme_name = _infer_user()
- else:
- scheme_name = _infer_prefix()
-
- # Special case: When installing into a custom prefix, use posix_prefix
- # instead of osx_framework_library. See _should_use_osx_framework_prefix()
- # docstring for details.
- if prefix is not None and scheme_name == "osx_framework_library":
- scheme_name = "posix_prefix"
-
- if home is not None:
- variables = {k: home for k in _HOME_KEYS}
- elif prefix is not None:
- variables = {k: prefix for k in _HOME_KEYS}
- else:
- variables = {}
-
- paths = sysconfig.get_paths(scheme=scheme_name, vars=variables)
-
- # Logic here is very arbitrary, we're doing it for compatibility, don't ask.
- # 1. Pip historically uses a special header path in virtual environments.
- # 2. If the distribution name is not known, distutils uses 'UNKNOWN'. We
- # only do the same when not running in a virtual environment because
- # pip's historical header path logic (see point 1) did not do this.
- if running_under_virtualenv():
- if user:
- base = variables.get("userbase", sys.prefix)
- else:
- base = variables.get("base", sys.prefix)
- python_xy = f"python{get_major_minor_version()}"
- paths["include"] = os.path.join(base, "include", "site", python_xy)
- elif not dist_name:
- dist_name = "UNKNOWN"
-
- scheme = Scheme(
- platlib=paths["platlib"],
- purelib=paths["purelib"],
- headers=os.path.join(paths["include"], dist_name),
- scripts=paths["scripts"],
- data=paths["data"],
- )
- if root is not None:
- for key in SCHEME_KEYS:
- value = distutils.util.change_root(root, getattr(scheme, key))
- setattr(scheme, key, value)
- return scheme
-
-
-def get_bin_prefix() -> str:
- # Forcing to use /usr/local/bin for standard macOS framework installs.
- if sys.platform[:6] == "darwin" and sys.prefix[:16] == "/System/Library/":
- return "/usr/local/bin"
- return sysconfig.get_paths()["scripts"]
-
-
-def get_purelib() -> str:
- return sysconfig.get_paths()["purelib"]
-
-
-def get_platlib() -> str:
- return sysconfig.get_paths()["platlib"]
-
-
-def get_prefixed_libs(prefix: str) -> typing.Tuple[str, str]:
- paths = sysconfig.get_paths(vars={"base": prefix, "platbase": prefix})
- return (paths["purelib"], paths["platlib"])
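The sysconfig backend is driven entirely by named schemes plus variable substitution; ``get_scheme()`` above re-roots paths by overriding ``base``/``platbase`` (or the ``_HOME_KEYS``). A standalone sketch of that trick (``/opt/app`` is an arbitrary example prefix):

```python
# Sketch: list available schemes, then recompute paths under a custom
# prefix by overriding the "base"/"platbase" variables, as get_scheme()
# does for --prefix installs.
import sysconfig

print(sorted(sysconfig.get_scheme_names()))
paths = sysconfig.get_paths(vars={"base": "/opt/app", "platbase": "/opt/app"})
print(paths["purelib"])  # e.g. /opt/app/lib/python3.9/site-packages
print(paths["scripts"])  # e.g. /opt/app/bin
```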
diff --git a/env/lib/python3.9/site-packages/pip/_internal/locations/base.py b/env/lib/python3.9/site-packages/pip/_internal/locations/base.py
deleted file mode 100644
index 86dad4a..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/locations/base.py
+++ /dev/null
@@ -1,52 +0,0 @@
-import functools
-import os
-import site
-import sys
-import sysconfig
-import typing
-
-from pip._internal.utils import appdirs
-from pip._internal.utils.virtualenv import running_under_virtualenv
-
-# Application Directories
-USER_CACHE_DIR = appdirs.user_cache_dir("pip")
-
-# FIXME doesn't account for venv linked to global site-packages
-site_packages: typing.Optional[str] = sysconfig.get_path("purelib")
-
-
-def get_major_minor_version() -> str:
- """
- Return the major-minor version of the current Python as a string, e.g.
- "3.7" or "3.10".
- """
- return "{}.{}".format(*sys.version_info)
-
-
-def get_src_prefix() -> str:
- if running_under_virtualenv():
- src_prefix = os.path.join(sys.prefix, "src")
- else:
- # FIXME: keep src in cwd for now (it is not a temporary folder)
- try:
- src_prefix = os.path.join(os.getcwd(), "src")
- except OSError:
- # In case the current working directory has been renamed or deleted
- sys.exit("The folder you are executing pip from can no longer be found.")
-
- # under macOS + virtualenv sys.prefix is not properly resolved
- # it is something like /path/to/python/bin/..
- return os.path.abspath(src_prefix)
-
-
-try:
- # Use getusersitepackages if this is present, as it ensures that the
- # value is initialised properly.
- user_site: typing.Optional[str] = site.getusersitepackages()
-except AttributeError:
- user_site = site.USER_SITE
-
-
-@functools.lru_cache(maxsize=None)
-def is_osx_framework() -> bool:
- return bool(sysconfig.get_config_var("PYTHONFRAMEWORK"))
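Both helpers at the bottom of this module are thin wrappers over stdlib calls, e.g.:

```python
# Sketch of the version and framework checks above, runnable anywhere.
import sys
import sysconfig

print("{}.{}".format(*sys.version_info))                  # e.g. "3.9"
print(bool(sysconfig.get_config_var("PYTHONFRAMEWORK")))  # framework build?
```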
diff --git a/env/lib/python3.9/site-packages/pip/_internal/main.py b/env/lib/python3.9/site-packages/pip/_internal/main.py
deleted file mode 100644
index 33c6d24..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/main.py
+++ /dev/null
@@ -1,12 +0,0 @@
-from typing import List, Optional
-
-
-def main(args: Optional[List[str]] = None) -> int:
- """This is preserved for old console scripts that may still be referencing
- it.
-
- For additional details, see https://github.com/pypa/pip/issues/7498.
- """
- from pip._internal.utils.entrypoints import _wrapper
-
- return _wrapper(args)
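``main()`` survives only for stale console-script entry points; the supported way to drive pip programmatically is as a subprocess of the current interpreter rather than importing internals:

```python
# Sketch: invoke pip the supported way, via "python -m pip".
import subprocess
import sys

subprocess.run([sys.executable, "-m", "pip", "--version"], check=True)
```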
diff --git a/env/lib/python3.9/site-packages/pip/_internal/metadata/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/metadata/__init__.py
deleted file mode 100644
index 01c35f9..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/metadata/__init__.py
+++ /dev/null
@@ -1,105 +0,0 @@
-import contextlib
-import functools
-import os
-import sys
-from typing import TYPE_CHECKING, List, Optional, Type, cast
-
-from pip._internal.utils.misc import strtobool
-
-from .base import BaseDistribution, BaseEnvironment, FilesystemWheel, MemoryWheel, Wheel
-
-if TYPE_CHECKING:
- from typing import Protocol
-else:
- Protocol = object
-
-__all__ = [
- "BaseDistribution",
- "BaseEnvironment",
- "FilesystemWheel",
- "MemoryWheel",
- "Wheel",
- "get_default_environment",
- "get_environment",
- "get_wheel_distribution",
- "select_backend",
-]
-
-
-def _should_use_importlib_metadata() -> bool:
- """Whether to use the ``importlib.metadata`` or ``pkg_resources`` backend.
-
- By default, pip uses ``importlib.metadata`` on Python 3.11+, and
-    ``pkg_resources`` otherwise. This can be overridden in a couple of ways:
-
- * If environment variable ``_PIP_USE_IMPORTLIB_METADATA`` is set, it
- dictates whether ``importlib.metadata`` is used, regardless of Python
- version.
- * On Python 3.11+, Python distributors can patch ``importlib.metadata``
- to add a global constant ``_PIP_USE_IMPORTLIB_METADATA = False``. This
- makes pip use ``pkg_resources`` (unless the user set the aforementioned
- environment variable to *True*).
- """
- with contextlib.suppress(KeyError, ValueError):
- return bool(strtobool(os.environ["_PIP_USE_IMPORTLIB_METADATA"]))
- if sys.version_info < (3, 11):
- return False
- import importlib.metadata
-
- return bool(getattr(importlib.metadata, "_PIP_USE_IMPORTLIB_METADATA", True))
-
-
-class Backend(Protocol):
- Distribution: Type[BaseDistribution]
- Environment: Type[BaseEnvironment]
-
-
-@functools.lru_cache(maxsize=None)
-def select_backend() -> Backend:
- if _should_use_importlib_metadata():
- from . import importlib
-
- return cast(Backend, importlib)
- from . import pkg_resources
-
- return cast(Backend, pkg_resources)
-
-
-def get_default_environment() -> BaseEnvironment:
- """Get the default representation for the current environment.
-
- This returns an Environment instance from the chosen backend. The default
- Environment instance should be built from ``sys.path`` and may use caching
-    to share instance state across calls.
- """
- return select_backend().Environment.default()
-
-
-def get_environment(paths: Optional[List[str]]) -> BaseEnvironment:
- """Get a representation of the environment specified by ``paths``.
-
- This returns an Environment instance from the chosen backend based on the
- given import paths. The backend must build a fresh instance representing
- the state of installed distributions when this function is called.
- """
- return select_backend().Environment.from_paths(paths)
-
-
-def get_directory_distribution(directory: str) -> BaseDistribution:
- """Get the distribution metadata representation in the specified directory.
-
- This returns a Distribution instance from the chosen backend based on
- the given on-disk ``.dist-info`` directory.
- """
- return select_backend().Distribution.from_directory(directory)
-
-
-def get_wheel_distribution(wheel: Wheel, canonical_name: str) -> BaseDistribution:
- """Get the representation of the specified wheel's distribution metadata.
-
- This returns a Distribution instance from the chosen backend based on
- the given wheel's ``.dist-info`` directory.
-
- :param canonical_name: Normalized project name of the given wheel.
- """
- return select_backend().Distribution.from_wheel(wheel, canonical_name)
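Putting the module together, ``get_default_environment()`` is how the rest of pip enumerates installed packages. A sketch using these entry points, with the caveat that everything under ``pip._internal`` is unsupported and may change between releases:

```python
# Sketch using pip's *internal* metadata API (unstable, illustration only).
from pip._internal.metadata import get_default_environment

env = get_default_environment()
for dist in env.iter_installed_distributions():
    print(dist.canonical_name, dist.version)
```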
diff --git a/env/lib/python3.9/site-packages/pip/_internal/metadata/base.py b/env/lib/python3.9/site-packages/pip/_internal/metadata/base.py
deleted file mode 100644
index f1a1ee6..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/metadata/base.py
+++ /dev/null
@@ -1,561 +0,0 @@
-import csv
-import email.message
-import json
-import logging
-import pathlib
-import re
-import zipfile
-from typing import (
- IO,
- TYPE_CHECKING,
- Collection,
- Container,
- Iterable,
- Iterator,
- List,
- Optional,
- Tuple,
- Union,
-)
-
-from pip._vendor.packaging.requirements import Requirement
-from pip._vendor.packaging.specifiers import InvalidSpecifier, SpecifierSet
-from pip._vendor.packaging.utils import NormalizedName
-from pip._vendor.packaging.version import LegacyVersion, Version
-
-from pip._internal.exceptions import NoneMetadataError
-from pip._internal.locations import site_packages, user_site
-from pip._internal.models.direct_url import (
- DIRECT_URL_METADATA_NAME,
- DirectUrl,
- DirectUrlValidationError,
-)
-from pip._internal.utils.compat import stdlib_pkgs # TODO: Move definition here.
-from pip._internal.utils.egg_link import egg_link_path_from_sys_path
-from pip._internal.utils.misc import is_local, normalize_path
-from pip._internal.utils.urls import url_to_path
-
-if TYPE_CHECKING:
- from typing import Protocol
-else:
- Protocol = object
-
-DistributionVersion = Union[LegacyVersion, Version]
-
-InfoPath = Union[str, pathlib.PurePath]
-
-logger = logging.getLogger(__name__)
-
-
-class BaseEntryPoint(Protocol):
- @property
- def name(self) -> str:
- raise NotImplementedError()
-
- @property
- def value(self) -> str:
- raise NotImplementedError()
-
- @property
- def group(self) -> str:
- raise NotImplementedError()
-
-
-def _convert_installed_files_path(
- entry: Tuple[str, ...],
- info: Tuple[str, ...],
-) -> str:
- """Convert a legacy installed-files.txt path into modern RECORD path.
-
- The legacy format stores paths relative to the info directory, while the
- modern format stores paths relative to the package root, e.g. the
- site-packages directory.
-
- :param entry: Path parts of the installed-files.txt entry.
- :param info: Path parts of the egg-info directory relative to package root.
- :returns: The converted entry.
-
- For best compatibility with symlinks, this does not use ``abspath()`` or
- ``Path.resolve()``, but tries to work with path parts:
-
- 1. While ``entry`` starts with ``..``, remove the equal amounts of parts
- from ``info``; if ``info`` is empty, start appending ``..`` instead.
- 2. Join the two directly.
- """
- while entry and entry[0] == "..":
- if not info or info[-1] == "..":
- info += ("..",)
- else:
- info = info[:-1]
- entry = entry[1:]
- return str(pathlib.Path(*info, *entry))
-
-
-class BaseDistribution(Protocol):
- @classmethod
- def from_directory(cls, directory: str) -> "BaseDistribution":
- """Load the distribution from a metadata directory.
-
- :param directory: Path to a metadata directory, e.g. ``.dist-info``.
- """
- raise NotImplementedError()
-
- @classmethod
- def from_wheel(cls, wheel: "Wheel", name: str) -> "BaseDistribution":
- """Load the distribution from a given wheel.
-
- :param wheel: A concrete wheel definition.
- :param name: File name of the wheel.
-
- :raises InvalidWheel: Whenever loading of the wheel causes a
- :py:exc:`zipfile.BadZipFile` exception to be thrown.
- :raises UnsupportedWheel: If the wheel is a valid zip, but malformed
- internally.
- """
- raise NotImplementedError()
-
- def __repr__(self) -> str:
- return f"{self.raw_name} {self.version} ({self.location})"
-
- def __str__(self) -> str:
- return f"{self.raw_name} {self.version}"
-
- @property
- def location(self) -> Optional[str]:
- """Where the distribution is loaded from.
-
- A string value is not necessarily a filesystem path, since distributions
- can be loaded from other sources, e.g. arbitrary zip archives. ``None``
- means the distribution is created in-memory.
-
- Do not canonicalize this value with e.g. ``pathlib.Path.resolve()``. If
- this is a symbolic link, we want to preserve the relative path between
- it and files in the distribution.
- """
- raise NotImplementedError()
-
- @property
- def editable_project_location(self) -> Optional[str]:
- """The project location for editable distributions.
-
- This is the directory where pyproject.toml or setup.py is located.
- None if the distribution is not installed in editable mode.
- """
- # TODO: this property is relatively costly to compute, memoize it ?
- direct_url = self.direct_url
- if direct_url:
- if direct_url.is_local_editable():
- return url_to_path(direct_url.url)
- else:
- # Search for an .egg-link file by walking sys.path, as it was
- # done before by dist_is_editable().
- egg_link_path = egg_link_path_from_sys_path(self.raw_name)
- if egg_link_path:
- # TODO: get project location from second line of egg_link file
- # (https://github.com/pypa/pip/issues/10243)
- return self.location
- return None
-
- @property
- def installed_location(self) -> Optional[str]:
- """The distribution's "installed" location.
-
- This should generally be a ``site-packages`` directory. This is
- usually ``dist.location``, except for legacy develop-installed packages,
- where ``dist.location`` is the source code location, and this is where
- the ``.egg-link`` file is.
-
- The returned location is normalized (in particular, with symlinks removed).
- """
- raise NotImplementedError()
-
- @property
- def info_location(self) -> Optional[str]:
- """Location of the .[egg|dist]-info directory or file.
-
- Similarly to ``location``, a string value is not necessarily a
- filesystem path. ``None`` means the distribution is created in-memory.
-
- For a modern .dist-info installation on disk, this should be something
- like ``{location}/{raw_name}-{version}.dist-info``.
-
- Do not canonicalize this value with e.g. ``pathlib.Path.resolve()``. If
- this is a symbolic link, we want to preserve the relative path between
- it and other files in the distribution.
- """
- raise NotImplementedError()
-
- @property
- def installed_by_distutils(self) -> bool:
- """Whether this distribution is installed with legacy distutils format.
-
- A distribution installed with "raw" distutils not patched by setuptools
- uses one single file at ``info_location`` to store metadata. We need to
- treat this specially on uninstallation.
- """
- info_location = self.info_location
- if not info_location:
- return False
- return pathlib.Path(info_location).is_file()
-
- @property
- def installed_as_egg(self) -> bool:
- """Whether this distribution is installed as an egg.
-
- This usually indicates the distribution was installed by (older versions
- of) easy_install.
- """
- location = self.location
- if not location:
- return False
- return location.endswith(".egg")
-
- @property
- def installed_with_setuptools_egg_info(self) -> bool:
- """Whether this distribution is installed with the ``.egg-info`` format.
-
- This usually indicates the distribution was installed with setuptools
- with an old pip version or with ``single-version-externally-managed``.
-
-        Note that this ensures the metadata store is a directory. distutils can
-        also install an ``.egg-info``, but as a file, not a directory. This
- property is *False* for that case. Also see ``installed_by_distutils``.
- """
- info_location = self.info_location
- if not info_location:
- return False
- if not info_location.endswith(".egg-info"):
- return False
- return pathlib.Path(info_location).is_dir()
-
- @property
- def installed_with_dist_info(self) -> bool:
- """Whether this distribution is installed with the "modern format".
-
- This indicates a "modern" installation, e.g. storing metadata in the
- ``.dist-info`` directory. This applies to installations made by
- setuptools (but through pip, not directly), or anything using the
- standardized build backend interface (PEP 517).
- """
- info_location = self.info_location
- if not info_location:
- return False
- if not info_location.endswith(".dist-info"):
- return False
- return pathlib.Path(info_location).is_dir()
-
- @property
- def canonical_name(self) -> NormalizedName:
- raise NotImplementedError()
-
- @property
- def version(self) -> DistributionVersion:
- raise NotImplementedError()
-
- @property
- def setuptools_filename(self) -> str:
- """Convert a project name to its setuptools-compatible filename.
-
- This is a copy of ``pkg_resources.to_filename()`` for compatibility.
- """
- return self.raw_name.replace("-", "_")
-
- @property
- def direct_url(self) -> Optional[DirectUrl]:
- """Obtain a DirectUrl from this distribution.
-
- Returns None if the distribution has no `direct_url.json` metadata,
- or if `direct_url.json` is invalid.
- """
- try:
- content = self.read_text(DIRECT_URL_METADATA_NAME)
- except FileNotFoundError:
- return None
- try:
- return DirectUrl.from_json(content)
- except (
- UnicodeDecodeError,
- json.JSONDecodeError,
- DirectUrlValidationError,
- ) as e:
- logger.warning(
- "Error parsing %s for %s: %s",
- DIRECT_URL_METADATA_NAME,
- self.canonical_name,
- e,
- )
- return None
-
- @property
- def installer(self) -> str:
- try:
- installer_text = self.read_text("INSTALLER")
- except (OSError, ValueError, NoneMetadataError):
- return "" # Fail silently if the installer file cannot be read.
- for line in installer_text.splitlines():
- cleaned_line = line.strip()
- if cleaned_line:
- return cleaned_line
- return ""
-
- @property
- def editable(self) -> bool:
- return bool(self.editable_project_location)
-
- @property
- def local(self) -> bool:
- """If distribution is installed in the current virtual environment.
-
- Always True if we're not in a virtualenv.
- """
- if self.installed_location is None:
- return False
- return is_local(self.installed_location)
-
- @property
- def in_usersite(self) -> bool:
- if self.installed_location is None or user_site is None:
- return False
- return self.installed_location.startswith(normalize_path(user_site))
-
- @property
- def in_site_packages(self) -> bool:
- if self.installed_location is None or site_packages is None:
- return False
- return self.installed_location.startswith(normalize_path(site_packages))
-
- def is_file(self, path: InfoPath) -> bool:
- """Check whether an entry in the info directory is a file."""
- raise NotImplementedError()
-
- def iter_distutils_script_names(self) -> Iterator[str]:
- """Find distutils 'scripts' entries metadata.
-
- If 'scripts' is supplied in ``setup.py``, distutils records those in the
- installed distribution's ``scripts`` directory, a file for each script.
- """
- raise NotImplementedError()
-
- def read_text(self, path: InfoPath) -> str:
- """Read a file in the info directory.
-
- :raise FileNotFoundError: If ``path`` does not exist in the directory.
- :raise NoneMetadataError: If ``path`` exists in the info directory, but
- cannot be read.
- """
- raise NotImplementedError()
-
- def iter_entry_points(self) -> Iterable[BaseEntryPoint]:
- raise NotImplementedError()
-
- @property
- def metadata(self) -> email.message.Message:
- """Metadata of distribution parsed from e.g. METADATA or PKG-INFO.
-
- This should return an empty message if the metadata file is unavailable.
-
- :raises NoneMetadataError: If the metadata file is available, but does
- not contain valid metadata.
- """
- raise NotImplementedError()
-
- @property
- def metadata_version(self) -> Optional[str]:
- """Value of "Metadata-Version:" in distribution metadata, if available."""
- return self.metadata.get("Metadata-Version")
-
- @property
- def raw_name(self) -> str:
- """Value of "Name:" in distribution metadata."""
- # The metadata should NEVER be missing the Name: key, but if it somehow
- # does, fall back to the known canonical name.
- return self.metadata.get("Name", self.canonical_name)
-
- @property
- def requires_python(self) -> SpecifierSet:
- """Value of "Requires-Python:" in distribution metadata.
-
- If the key does not exist or contains an invalid value, an empty
- SpecifierSet should be returned.
- """
- value = self.metadata.get("Requires-Python")
- if value is None:
- return SpecifierSet()
- try:
- # Convert to str to satisfy the type checker; this can be a Header object.
- spec = SpecifierSet(str(value))
- except InvalidSpecifier as e:
- message = "Package %r has an invalid Requires-Python: %s"
- logger.warning(message, self.raw_name, e)
- return SpecifierSet()
- return spec
-
- def iter_dependencies(self, extras: Collection[str] = ()) -> Iterable[Requirement]:
- """Dependencies of this distribution.
-
- For modern .dist-info distributions, this is the collection of
- "Requires-Dist:" entries in distribution metadata.
- """
- raise NotImplementedError()
-
- def iter_provided_extras(self) -> Iterable[str]:
- """Extras provided by this distribution.
-
- For modern .dist-info distributions, this is the collection of
- "Provides-Extra:" entries in distribution metadata.
- """
- raise NotImplementedError()
-
- def _iter_declared_entries_from_record(self) -> Optional[Iterator[str]]:
- try:
- text = self.read_text("RECORD")
- except FileNotFoundError:
- return None
- # This extra Path-str cast normalizes entries.
- return (str(pathlib.Path(row[0])) for row in csv.reader(text.splitlines()))
-
- def _iter_declared_entries_from_legacy(self) -> Optional[Iterator[str]]:
- try:
- text = self.read_text("installed-files.txt")
- except FileNotFoundError:
- return None
- paths = (p for p in text.splitlines(keepends=False) if p)
- root = self.location
- info = self.info_location
- if root is None or info is None:
- return paths
- try:
- info_rel = pathlib.Path(info).relative_to(root)
- except ValueError: # info is not relative to root.
- return paths
- if not info_rel.parts: # info *is* root.
- return paths
- return (
- _convert_installed_files_path(pathlib.Path(p).parts, info_rel.parts)
- for p in paths
- )
-
- def iter_declared_entries(self) -> Optional[Iterator[str]]:
- """Iterate through file entires declared in this distribution.
-
- For modern .dist-info distributions, this is the files listed in the
- ``RECORD`` metadata file. For legacy setuptools distributions, this
- comes from ``installed-files.txt``, with entries normalized to be
- compatible with the format used by ``RECORD``.
-
- :return: An iterator for listed entries, or None if the distribution
- contains neither ``RECORD`` nor ``installed-files.txt``.
- """
- return (
- self._iter_declared_entries_from_record()
- or self._iter_declared_entries_from_legacy()
- )
-
-
-class BaseEnvironment:
- """An environment containing distributions to introspect."""
-
- @classmethod
- def default(cls) -> "BaseEnvironment":
- raise NotImplementedError()
-
- @classmethod
- def from_paths(cls, paths: Optional[List[str]]) -> "BaseEnvironment":
- raise NotImplementedError()
-
- def get_distribution(self, name: str) -> Optional["BaseDistribution"]:
- """Given a requirement name, return the installed distributions.
-
- The name may not be normalized. The implementation must canonicalize
- it for lookup.
- """
- raise NotImplementedError()
-
- def _iter_distributions(self) -> Iterator["BaseDistribution"]:
- """Iterate through installed distributions.
-
-        This function should be implemented by subclasses, but never called
-        directly. Use the public ``iter_all_distributions()`` instead, which
- implements additional logic to make sure the distributions are valid.
- """
- raise NotImplementedError()
-
- def iter_all_distributions(self) -> Iterator[BaseDistribution]:
- """Iterate through all installed distributions without any filtering."""
- for dist in self._iter_distributions():
- # Make sure the distribution actually comes from a valid Python
- # packaging distribution. Pip's AdjacentTempDirectory leaves folders
- # e.g. ``~atplotlib.dist-info`` if cleanup was interrupted. The
- # valid project name pattern is taken from PEP 508.
- project_name_valid = re.match(
- r"^([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$",
- dist.canonical_name,
- flags=re.IGNORECASE,
- )
- if not project_name_valid:
- logger.warning(
- "Ignoring invalid distribution %s (%s)",
- dist.canonical_name,
- dist.location,
- )
- continue
- yield dist
-
- def iter_installed_distributions(
- self,
- local_only: bool = True,
- skip: Container[str] = stdlib_pkgs,
- include_editables: bool = True,
- editables_only: bool = False,
- user_only: bool = False,
- ) -> Iterator[BaseDistribution]:
- """Return a list of installed distributions.
-
- This is based on ``iter_all_distributions()`` with additional filtering
- options. Note that ``iter_installed_distributions()`` without arguments
- is *not* equal to ``iter_all_distributions()``, since some of the
- configurations exclude packages by default.
-
- :param local_only: If True (default), only return installations
- local to the current virtualenv, if in a virtualenv.
- :param skip: An iterable of canonicalized project names to ignore;
- defaults to ``stdlib_pkgs``.
- :param include_editables: If False, don't report editables.
- :param editables_only: If True, only report editables.
- :param user_only: If True, only report installations in the user
- site directory.
- """
- it = self.iter_all_distributions()
- if local_only:
- it = (d for d in it if d.local)
- if not include_editables:
- it = (d for d in it if not d.editable)
- if editables_only:
- it = (d for d in it if d.editable)
- if user_only:
- it = (d for d in it if d.in_usersite)
- return (d for d in it if d.canonical_name not in skip)
-
-
-class Wheel(Protocol):
- location: str
-
- def as_zipfile(self) -> zipfile.ZipFile:
- raise NotImplementedError()
-
-
-class FilesystemWheel(Wheel):
- def __init__(self, location: str) -> None:
- self.location = location
-
- def as_zipfile(self) -> zipfile.ZipFile:
- return zipfile.ZipFile(self.location, allowZip64=True)
-
-
-class MemoryWheel(Wheel):
- def __init__(self, location: str, stream: IO[bytes]) -> None:
- self.location = location
- self.stream = stream
-
- def as_zipfile(self) -> zipfile.ZipFile:
- return zipfile.ZipFile(self.stream, allowZip64=True)
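The subtlest helper in this module is ``_convert_installed_files_path()``; its ``..``-walking is easier to see with a concrete input. A standalone restatement (``mypkg`` is a made-up example; output shown with POSIX separators):

```python
# Restatement of _convert_installed_files_path(): entries in
# installed-files.txt are relative to the egg-info directory, and we
# rebase them onto the package root without resolving symlinks.
import pathlib

def convert(entry: tuple, info: tuple) -> str:
    while entry and entry[0] == "..":
        if not info or info[-1] == "..":
            info += ("..",)   # already escaped the root; keep climbing
        else:
            info = info[:-1]  # one ".." cancels one info component
        entry = entry[1:]
    return str(pathlib.Path(*info, *entry))

print(convert(("..", "mypkg", "__init__.py"), ("mypkg-1.0.egg-info",)))
# -> mypkg/__init__.py
```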
diff --git a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/__init__.py
deleted file mode 100644
index 5e7af9f..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/__init__.py
+++ /dev/null
@@ -1,4 +0,0 @@
-from ._dists import Distribution
-from ._envs import Environment
-
-__all__ = ["Distribution", "Environment"]
diff --git a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_compat.py b/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_compat.py
deleted file mode 100644
index 2bc6bfd..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_compat.py
+++ /dev/null
@@ -1,41 +0,0 @@
-import importlib.metadata
-from typing import Any, Optional, Protocol, cast
-
-
-class BasePath(Protocol):
- """A protocol that various path objects conform.
-
- This exists because importlib.metadata uses both ``pathlib.Path`` and
- ``zipfile.Path``, and we need a common base for type hints (Union does not
- work well since ``zipfile.Path`` is too new for our linter setup).
-
-    This is not meant to be exhaustive; it only contains things that are
-    present in both classes *that we need*.
- """
-
- name: str
-
- @property
- def parent(self) -> "BasePath":
- raise NotImplementedError()
-
-
-def get_info_location(d: importlib.metadata.Distribution) -> Optional[BasePath]:
- """Find the path to the distribution's metadata directory.
-
- HACK: This relies on importlib.metadata's private ``_path`` attribute. Not
- all distributions exist on disk, so importlib.metadata is correct to not
- expose the attribute as public. But pip's code base is old and not as clean,
- so we do this to avoid having to rewrite too many things. Hopefully we can
- eliminate this some day.
- """
- return getattr(d, "_path", None)
-
-
-def get_dist_name(dist: importlib.metadata.Distribution) -> str:
- """Get the distribution's project name.
-
- The ``name`` attribute is only available in Python 3.10 or later. We are
- targeting exactly that, but Mypy does not know this.
- """
- return cast(Any, dist).name
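The ``_path`` peek above can be reproduced in two lines; hedged sketch, since ``_path`` is private and may legitimately be absent:

```python
# Sketch: locate a distribution's metadata directory the way
# get_info_location() does, via the private ``_path`` attribute.
import importlib.metadata

dist = importlib.metadata.distribution("pip")
print(getattr(dist, "_path", None))  # e.g. .../pip-<version>.dist-info
```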
diff --git a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_dists.py b/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_dists.py
deleted file mode 100644
index cf66de5..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_dists.py
+++ /dev/null
@@ -1,274 +0,0 @@
-import email.message
-import importlib.metadata
-import os
-import pathlib
-import zipfile
-from typing import (
- Collection,
- Dict,
- Iterable,
- Iterator,
- Mapping,
- NamedTuple,
- Optional,
- Sequence,
-)
-
-from pip._vendor.packaging.requirements import Requirement
-from pip._vendor.packaging.utils import NormalizedName, canonicalize_name
-from pip._vendor.packaging.version import parse as parse_version
-
-from pip._internal.exceptions import InvalidWheel, UnsupportedWheel
-from pip._internal.metadata.base import (
- BaseDistribution,
- BaseEntryPoint,
- DistributionVersion,
- InfoPath,
- Wheel,
-)
-from pip._internal.utils.misc import normalize_path
-from pip._internal.utils.packaging import safe_extra
-from pip._internal.utils.wheel import parse_wheel, read_wheel_metadata_file
-
-from ._compat import BasePath, get_dist_name
-
-
-class WheelDistribution(importlib.metadata.Distribution):
- """An ``importlib.metadata.Distribution`` read from a wheel.
-
- Although ``importlib.metadata.PathDistribution`` accepts ``zipfile.Path``,
- its implementation is too "lazy" for pip's needs (we can't keep the ZipFile
- handle open for the entire lifetime of the distribution object).
-
-    This implementation eagerly reads the entire metadata directory into
-    memory instead, and operates from that.
- """
-
- def __init__(
- self,
- files: Mapping[pathlib.PurePosixPath, bytes],
- info_location: pathlib.PurePosixPath,
- ) -> None:
- self._files = files
- self.info_location = info_location
-
- @classmethod
- def from_zipfile(
- cls,
- zf: zipfile.ZipFile,
- name: str,
- location: str,
- ) -> "WheelDistribution":
- info_dir, _ = parse_wheel(zf, name)
- paths = (
- (name, pathlib.PurePosixPath(name.split("/", 1)[-1]))
- for name in zf.namelist()
- if name.startswith(f"{info_dir}/")
- )
- files = {
- relpath: read_wheel_metadata_file(zf, fullpath)
- for fullpath, relpath in paths
- }
- info_location = pathlib.PurePosixPath(location, info_dir)
- return cls(files, info_location)
-
- def iterdir(self, path: InfoPath) -> Iterator[pathlib.PurePosixPath]:
- # Only allow iterating through the metadata directory.
- if pathlib.PurePosixPath(str(path)) in self._files:
- return iter(self._files)
- raise FileNotFoundError(path)
-
- def read_text(self, filename: str) -> Optional[str]:
- try:
- data = self._files[pathlib.PurePosixPath(filename)]
- except KeyError:
- return None
- try:
- text = data.decode("utf-8")
- except UnicodeDecodeError as e:
- wheel = self.info_location.parent
- error = f"Error decoding metadata for {wheel}: {e} in {filename} file"
- raise UnsupportedWheel(error)
- return text
-
-
-class RequiresEntry(NamedTuple):
- requirement: str
- extra: str
- marker: str
-
-
-class Distribution(BaseDistribution):
- def __init__(
- self,
- dist: importlib.metadata.Distribution,
- info_location: Optional[BasePath],
- installed_location: Optional[BasePath],
- ) -> None:
- self._dist = dist
- self._info_location = info_location
- self._installed_location = installed_location
-
- @classmethod
- def from_directory(cls, directory: str) -> BaseDistribution:
- info_location = pathlib.Path(directory)
- dist = importlib.metadata.Distribution.at(info_location)
- return cls(dist, info_location, info_location.parent)
-
- @classmethod
- def from_wheel(cls, wheel: Wheel, name: str) -> BaseDistribution:
- try:
- with wheel.as_zipfile() as zf:
- dist = WheelDistribution.from_zipfile(zf, name, wheel.location)
- except zipfile.BadZipFile as e:
- raise InvalidWheel(wheel.location, name) from e
- except UnsupportedWheel as e:
- raise UnsupportedWheel(f"{name} has an invalid wheel, {e}")
- return cls(dist, dist.info_location, pathlib.PurePosixPath(wheel.location))
-
- @property
- def location(self) -> Optional[str]:
- if self._info_location is None:
- return None
- return str(self._info_location.parent)
-
- @property
- def info_location(self) -> Optional[str]:
- if self._info_location is None:
- return None
- return str(self._info_location)
-
- @property
- def installed_location(self) -> Optional[str]:
- if self._installed_location is None:
- return None
- return normalize_path(str(self._installed_location))
-
- def _get_dist_name_from_location(self) -> Optional[str]:
- """Try to get the name from the metadata directory name.
-
- This is much faster than reading metadata.
- """
- if self._info_location is None:
- return None
- stem, suffix = os.path.splitext(self._info_location.name)
- if suffix not in (".dist-info", ".egg-info"):
- return None
- return stem.split("-", 1)[0]
-
- @property
- def canonical_name(self) -> NormalizedName:
- name = self._get_dist_name_from_location() or get_dist_name(self._dist)
- return canonicalize_name(name)
-
- @property
- def version(self) -> DistributionVersion:
- return parse_version(self._dist.version)
-
- def is_file(self, path: InfoPath) -> bool:
- return self._dist.read_text(str(path)) is not None
-
- def iter_distutils_script_names(self) -> Iterator[str]:
- # A distutils installation is always "flat" (not in e.g. egg form), so
- # if this distribution's info location is NOT a pathlib.Path (but e.g.
- # zipfile.Path), it can never contain any distutils scripts.
- if not isinstance(self._info_location, pathlib.Path):
- return
- for child in self._info_location.joinpath("scripts").iterdir():
- yield child.name
-
- def read_text(self, path: InfoPath) -> str:
- content = self._dist.read_text(str(path))
- if content is None:
- raise FileNotFoundError(path)
- return content
-
- def iter_entry_points(self) -> Iterable[BaseEntryPoint]:
-        # importlib.metadata's EntryPoint structure satisfies BaseEntryPoint.
- return self._dist.entry_points
-
- @property
- def metadata(self) -> email.message.Message:
- return self._dist.metadata
-
- def _iter_requires_txt_entries(self) -> Iterator[RequiresEntry]:
- """Parse a ``requires.txt`` in an egg-info directory.
-
- This is an INI-ish format where an egg-info stores dependencies. A
-        section name describes an extra and/or environment markers, while each entry
- is an arbitrary string (not a key-value pair) representing a dependency
- as a requirement string (no markers).
-
- There is a construct in ``importlib.metadata`` called ``Sectioned`` that
- does mostly the same, but the format is currently considered private.
- """
- content = self._dist.read_text("requires.txt")
- if content is None:
- return
- extra = marker = "" # Section-less entries don't have markers.
- for line in content.splitlines():
- line = line.strip()
- if not line or line.startswith("#"): # Comment; ignored.
- continue
- if line.startswith("[") and line.endswith("]"): # A section header.
- extra, _, marker = line.strip("[]").partition(":")
- continue
- yield RequiresEntry(requirement=line, extra=extra, marker=marker)
-
- def _iter_egg_info_extras(self) -> Iterable[str]:
- """Get extras from the egg-info directory."""
- known_extras = {""}
- for entry in self._iter_requires_txt_entries():
- if entry.extra in known_extras:
- continue
- known_extras.add(entry.extra)
- yield entry.extra
-
- def iter_provided_extras(self) -> Iterable[str]:
- iterator = (
- self._dist.metadata.get_all("Provides-Extra")
- or self._iter_egg_info_extras()
- )
- return (safe_extra(extra) for extra in iterator)
-
- def _iter_egg_info_dependencies(self) -> Iterable[str]:
- """Get distribution dependencies from the egg-info directory.
-
- To ease parsing, this converts a legacy dependency entry into a PEP 508
- requirement string. Like ``_iter_requires_txt_entries()``, there is code
-        in ``importlib.metadata`` that does mostly the same, but does not do exactly
- what we need.
-
- Namely, ``importlib.metadata`` does not normalize the extra name before
- putting it into the requirement string, which causes marker comparison
-        to fail because the dist-info format does normalize. This is consistent in
- all currently available PEP 517 backends, although not standardized.
- """
- for entry in self._iter_requires_txt_entries():
- if entry.extra and entry.marker:
- marker = f'({entry.marker}) and extra == "{safe_extra(entry.extra)}"'
- elif entry.extra:
- marker = f'extra == "{safe_extra(entry.extra)}"'
- elif entry.marker:
- marker = entry.marker
- else:
- marker = ""
- if marker:
- yield f"{entry.requirement} ; {marker}"
- else:
- yield entry.requirement
-
- def iter_dependencies(self, extras: Collection[str] = ()) -> Iterable[Requirement]:
- req_string_iterator = (
- self._dist.metadata.get_all("Requires-Dist")
- or self._iter_egg_info_dependencies()
- )
- contexts: Sequence[Dict[str, str]] = [{"extra": safe_extra(e)} for e in extras]
- for req_string in req_string_iterator:
- req = Requirement(req_string)
- if not req.marker:
- yield req
- elif not extras and req.marker.evaluate({"extra": ""}):
- yield req
- elif any(req.marker.evaluate(context) for context in contexts):
- yield req
diff --git a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_envs.py b/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_envs.py
deleted file mode 100644
index 25dbdea..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/metadata/importlib/_envs.py
+++ /dev/null
@@ -1,163 +0,0 @@
-import functools
-import importlib.metadata
-import os
-import pathlib
-import sys
-import zipfile
-import zipimport
-from typing import Iterator, List, Optional, Sequence, Set, Tuple
-
-from pip._vendor.packaging.utils import NormalizedName, canonicalize_name
-
-from pip._internal.metadata.base import BaseDistribution, BaseEnvironment
-from pip._internal.utils.deprecation import deprecated
-
-from ._compat import BasePath, get_dist_name, get_info_location
-from ._dists import Distribution
-
-
-class _DistributionFinder:
- """Finder to locate distributions.
-
- The main purpose of this class is to memoize found distributions' names, so
- only one distribution is returned for each package name. At lot of pip code
- assumes this (because it is setuptools's behavior), and not doing the same
- can potentially cause a distribution in lower precedence path to override a
- higher precedence one if the caller is not careful.
-
- Eventually we probably want to make it possible to see lower precedence
-    installations as well. It's a useful feature, after all.
- """
-
- FoundResult = Tuple[importlib.metadata.Distribution, Optional[BasePath]]
-
- def __init__(self) -> None:
- self._found_names: Set[NormalizedName] = set()
-
- def _find_impl(self, location: str) -> Iterator[FoundResult]:
- """Find distributions in a location."""
- # To know exactly where we find a distribution, we have to feed in the
- # paths one by one, instead of dumping the list to importlib.metadata.
- for dist in importlib.metadata.distributions(path=[location]):
- normalized_name = canonicalize_name(get_dist_name(dist))
- if normalized_name in self._found_names:
- continue
- self._found_names.add(normalized_name)
- info_location = get_info_location(dist)
- yield dist, info_location
-
- def find(self, location: str) -> Iterator[BaseDistribution]:
- """Find distributions in a location.
-
- The path can be either a directory, or a ZIP archive.
- """
- for dist, info_location in self._find_impl(location):
- if info_location is None:
- installed_location: Optional[BasePath] = None
- else:
- installed_location = info_location.parent
- yield Distribution(dist, info_location, installed_location)
-
- def find_linked(self, location: str) -> Iterator[BaseDistribution]:
- """Read location in egg-link files and return distributions in there.
-
- The path should be a directory; otherwise this returns nothing. This
- follows how setuptools does this for compatibility. The first non-empty
- line in the egg-link is read as a path (resolved against the egg-link's
- containing directory if relative). Distributions found at that linked
- location are returned.
- """
- path = pathlib.Path(location)
- if not path.is_dir():
- return
- for child in path.iterdir():
- if child.suffix != ".egg-link":
- continue
- with child.open() as f:
- lines = (line.strip() for line in f)
- target_rel = next((line for line in lines if line), "")
- if not target_rel:
- continue
- target_location = str(path.joinpath(target_rel))
- for dist, info_location in self._find_impl(target_location):
- yield Distribution(dist, info_location, path)
-
- def _find_eggs_in_dir(self, location: str) -> Iterator[BaseDistribution]:
- from pip._vendor.pkg_resources import find_distributions
-
- from pip._internal.metadata import pkg_resources as legacy
-
- with os.scandir(location) as it:
- for entry in it:
- if not entry.name.endswith(".egg"):
- continue
- for dist in find_distributions(entry.path):
- yield legacy.Distribution(dist)
-
- def _find_eggs_in_zip(self, location: str) -> Iterator[BaseDistribution]:
- from pip._vendor.pkg_resources import find_eggs_in_zip
-
- from pip._internal.metadata import pkg_resources as legacy
-
- try:
- importer = zipimport.zipimporter(location)
- except zipimport.ZipImportError:
- return
- for dist in find_eggs_in_zip(importer, location):
- yield legacy.Distribution(dist)
-
- def find_eggs(self, location: str) -> Iterator[BaseDistribution]:
- """Find eggs in a location.
-
- This actually uses the old *pkg_resources* backend. We likely want to
- deprecate this so we can eventually remove the *pkg_resources*
- dependency entirely. Before that, this should first emit a deprecation
-        warning for some versions when using the fallback, since importing
- *pkg_resources* is slow for those who don't need it.
- """
- if os.path.isdir(location):
- yield from self._find_eggs_in_dir(location)
- if zipfile.is_zipfile(location):
- yield from self._find_eggs_in_zip(location)
-
-
-@functools.lru_cache(maxsize=None) # Warn about each distribution exactly once.
-def _emit_egg_deprecation(location: Optional[str]) -> None:
- deprecated(
- reason=f"Loading egg at {location} is deprecated.",
- replacement="to use pip for package installation.",
- gone_in=None,
- )
-
-
-class Environment(BaseEnvironment):
- def __init__(self, paths: Sequence[str]) -> None:
- self._paths = paths
-
- @classmethod
- def default(cls) -> BaseEnvironment:
- return cls(sys.path)
-
- @classmethod
- def from_paths(cls, paths: Optional[List[str]]) -> BaseEnvironment:
- if paths is None:
- return cls(sys.path)
- return cls(paths)
-
- def _iter_distributions(self) -> Iterator[BaseDistribution]:
- finder = _DistributionFinder()
- for location in self._paths:
- yield from finder.find(location)
- for dist in finder.find_eggs(location):
- # _emit_egg_deprecation(dist.location) # TODO: Enable this.
- yield dist
- # This must go last because that's how pkg_resources tie-breaks.
- yield from finder.find_linked(location)
-
- def get_distribution(self, name: str) -> Optional[BaseDistribution]:
- matches = (
- distribution
- for distribution in self.iter_all_distributions()
- if distribution.canonical_name == canonicalize_name(name)
- )
- return next(matches, None)
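The one-location-at-a-time scan in ``_find_impl()`` is what lets pip attribute each distribution to a specific path entry. A minimal standalone version, without pip's name canonicalization:

```python
# Sketch: feed sys.path entries to importlib.metadata one at a time so
# we know where each distribution came from, keeping only the first
# (highest precedence) hit per name.
import importlib.metadata
import sys

seen = set()
for location in sys.path:
    for dist in importlib.metadata.distributions(path=[location]):
        name = dist.metadata["Name"]
        if name in seen:
            continue
        seen.add(name)
        print(f"{name} -> {location}")
```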
diff --git a/env/lib/python3.9/site-packages/pip/_internal/metadata/pkg_resources.py b/env/lib/python3.9/site-packages/pip/_internal/metadata/pkg_resources.py
deleted file mode 100644
index ffde8c7..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/metadata/pkg_resources.py
+++ /dev/null
@@ -1,254 +0,0 @@
-import email.message
-import email.parser
-import logging
-import os
-import zipfile
-from typing import Collection, Iterable, Iterator, List, Mapping, NamedTuple, Optional
-
-from pip._vendor import pkg_resources
-from pip._vendor.packaging.requirements import Requirement
-from pip._vendor.packaging.utils import NormalizedName, canonicalize_name
-from pip._vendor.packaging.version import parse as parse_version
-
-from pip._internal.exceptions import InvalidWheel, NoneMetadataError, UnsupportedWheel
-from pip._internal.utils.egg_link import egg_link_path_from_location
-from pip._internal.utils.misc import display_path, normalize_path
-from pip._internal.utils.wheel import parse_wheel, read_wheel_metadata_file
-
-from .base import (
- BaseDistribution,
- BaseEntryPoint,
- BaseEnvironment,
- DistributionVersion,
- InfoPath,
- Wheel,
-)
-
-logger = logging.getLogger(__name__)
-
-
-class EntryPoint(NamedTuple):
- name: str
- value: str
- group: str
-
-
-class WheelMetadata:
- """IMetadataProvider that reads metadata files from a dictionary.
-
- This also maps metadata decoding exceptions to our internal exception type.
- """
-
- def __init__(self, metadata: Mapping[str, bytes], wheel_name: str) -> None:
- self._metadata = metadata
- self._wheel_name = wheel_name
-
- def has_metadata(self, name: str) -> bool:
- return name in self._metadata
-
- def get_metadata(self, name: str) -> str:
- try:
- return self._metadata[name].decode()
- except UnicodeDecodeError as e:
- # Augment the default error with the origin of the file.
- raise UnsupportedWheel(
- f"Error decoding metadata for {self._wheel_name}: {e} in {name} file"
- )
-
- def get_metadata_lines(self, name: str) -> Iterable[str]:
- return pkg_resources.yield_lines(self.get_metadata(name))
-
- def metadata_isdir(self, name: str) -> bool:
- return False
-
- def metadata_listdir(self, name: str) -> List[str]:
- return []
-
- def run_script(self, script_name: str, namespace: str) -> None:
- pass
-
-
-class Distribution(BaseDistribution):
- def __init__(self, dist: pkg_resources.Distribution) -> None:
- self._dist = dist
-
- @classmethod
- def from_directory(cls, directory: str) -> BaseDistribution:
- dist_dir = directory.rstrip(os.sep)
-
- # Build a PathMetadata object, from path to metadata. :wink:
- base_dir, dist_dir_name = os.path.split(dist_dir)
- metadata = pkg_resources.PathMetadata(base_dir, dist_dir)
-
- # Determine the correct Distribution object type.
- if dist_dir.endswith(".egg-info"):
- dist_cls = pkg_resources.Distribution
- dist_name = os.path.splitext(dist_dir_name)[0]
- else:
- assert dist_dir.endswith(".dist-info")
- dist_cls = pkg_resources.DistInfoDistribution
- dist_name = os.path.splitext(dist_dir_name)[0].split("-")[0]
-
- dist = dist_cls(base_dir, project_name=dist_name, metadata=metadata)
- return cls(dist)
-
- @classmethod
- def from_wheel(cls, wheel: Wheel, name: str) -> BaseDistribution:
- try:
- with wheel.as_zipfile() as zf:
- info_dir, _ = parse_wheel(zf, name)
- metadata_text = {
- path.split("/", 1)[-1]: read_wheel_metadata_file(zf, path)
- for path in zf.namelist()
- if path.startswith(f"{info_dir}/")
- }
- except zipfile.BadZipFile as e:
- raise InvalidWheel(wheel.location, name) from e
- except UnsupportedWheel as e:
- raise UnsupportedWheel(f"{name} has an invalid wheel, {e}")
- dist = pkg_resources.DistInfoDistribution(
- location=wheel.location,
- metadata=WheelMetadata(metadata_text, wheel.location),
- project_name=name,
- )
- return cls(dist)
-
- @property
- def location(self) -> Optional[str]:
- return self._dist.location
-
- @property
- def installed_location(self) -> Optional[str]:
- egg_link = egg_link_path_from_location(self.raw_name)
- if egg_link:
- location = egg_link
- elif self.location:
- location = self.location
- else:
- return None
- return normalize_path(location)
-
- @property
- def info_location(self) -> Optional[str]:
- return self._dist.egg_info
-
- @property
- def installed_by_distutils(self) -> bool:
- # A distutils-installed distribution is provided by FileMetadata. This
- # provider has a "path" attribute not present anywhere else. Not the
- # best introspection logic, but pip has been doing this for a long time.
- try:
- return bool(self._dist._provider.path)
- except AttributeError:
- return False
-
- @property
- def canonical_name(self) -> NormalizedName:
- return canonicalize_name(self._dist.project_name)
-
- @property
- def version(self) -> DistributionVersion:
- return parse_version(self._dist.version)
-
- def is_file(self, path: InfoPath) -> bool:
- return self._dist.has_metadata(str(path))
-
- def iter_distutils_script_names(self) -> Iterator[str]:
- yield from self._dist.metadata_listdir("scripts")
-
- def read_text(self, path: InfoPath) -> str:
- name = str(path)
- if not self._dist.has_metadata(name):
- raise FileNotFoundError(name)
- content = self._dist.get_metadata(name)
- if content is None:
- raise NoneMetadataError(self, name)
- return content
-
- def iter_entry_points(self) -> Iterable[BaseEntryPoint]:
- for group, entries in self._dist.get_entry_map().items():
- for name, entry_point in entries.items():
- name, _, value = str(entry_point).partition("=")
- yield EntryPoint(name=name.strip(), value=value.strip(), group=group)
-
- @property
- def metadata(self) -> email.message.Message:
- """
- :raises NoneMetadataError: if the distribution reports `has_metadata()`
- True but `get_metadata()` returns None.
- """
- if isinstance(self._dist, pkg_resources.DistInfoDistribution):
- metadata_name = "METADATA"
- else:
- metadata_name = "PKG-INFO"
- try:
- metadata = self.read_text(metadata_name)
- except FileNotFoundError:
- if self.location:
- displaying_path = display_path(self.location)
- else:
- displaying_path = repr(self.location)
- logger.warning("No metadata found in %s", displaying_path)
- metadata = ""
- feed_parser = email.parser.FeedParser()
- feed_parser.feed(metadata)
- return feed_parser.close()
-
- def iter_dependencies(self, extras: Collection[str] = ()) -> Iterable[Requirement]:
- if extras: # pkg_resources raises on invalid extras, so we sanitize.
- extras = frozenset(extras).intersection(self._dist.extras)
- return self._dist.requires(extras)
-
- def iter_provided_extras(self) -> Iterable[str]:
- return self._dist.extras
-
-
-class Environment(BaseEnvironment):
- def __init__(self, ws: pkg_resources.WorkingSet) -> None:
- self._ws = ws
-
- @classmethod
- def default(cls) -> BaseEnvironment:
- return cls(pkg_resources.working_set)
-
- @classmethod
- def from_paths(cls, paths: Optional[List[str]]) -> BaseEnvironment:
- return cls(pkg_resources.WorkingSet(paths))
-
- def _iter_distributions(self) -> Iterator[BaseDistribution]:
- for dist in self._ws:
- yield Distribution(dist)
-
- def _search_distribution(self, name: str) -> Optional[BaseDistribution]:
- """Find a distribution matching the ``name`` in the environment.
-
- This searches from *all* distributions available in the environment, to
- match the behavior of ``pkg_resources.get_distribution()``.
- """
- canonical_name = canonicalize_name(name)
- for dist in self.iter_all_distributions():
- if dist.canonical_name == canonical_name:
- return dist
- return None
-
- def get_distribution(self, name: str) -> Optional[BaseDistribution]:
- # Search the distribution by looking through the working set.
- dist = self._search_distribution(name)
- if dist:
- return dist
-
- # If distribution could not be found, call working_set.require to
- # update the working set, and try to find the distribution again.
- # This might happen for e.g. when you install a package twice, once
- # using setup.py develop and again using setup.py install. Now when
- # running pip uninstall twice, the package gets removed from the
- # working set in the first uninstall, so we have to populate the
- # working set again so that pip knows about it and the packages gets
- # picked up and is successfully uninstalled the second time too.
- try:
- # We didn't pass in any version specifiers, so this can never
- # raise pkg_resources.VersionConflict.
- self._ws.require(name)
- except pkg_resources.DistributionNotFound:
- return None
- return self._search_distribution(name)
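
The `metadata` property above feeds raw `METADATA`/`PKG-INFO` text into the stdlib `email` parser, because core metadata uses an RFC 822-style key/value format. A self-contained illustration:

```python
# Core metadata is RFC 822-style text; email.parser handles it directly.
import email.parser

raw = "Metadata-Version: 2.1\nName: example\nVersion: 1.0\n"
feed_parser = email.parser.FeedParser()
feed_parser.feed(raw)
message = feed_parser.close()
print(message["Name"], message["Version"])  # -> example 1.0
```
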
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/models/__init__.py
deleted file mode 100644
index 7855226..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/__init__.py
+++ /dev/null
@@ -1,2 +0,0 @@
-"""A package that contains models that represent entities.
-"""
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/candidate.py b/env/lib/python3.9/site-packages/pip/_internal/models/candidate.py
deleted file mode 100644
index a4963ae..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/candidate.py
+++ /dev/null
@@ -1,34 +0,0 @@
-from pip._vendor.packaging.version import parse as parse_version
-
-from pip._internal.models.link import Link
-from pip._internal.utils.models import KeyBasedCompareMixin
-
-
-class InstallationCandidate(KeyBasedCompareMixin):
- """Represents a potential "candidate" for installation."""
-
- __slots__ = ["name", "version", "link"]
-
- def __init__(self, name: str, version: str, link: Link) -> None:
- self.name = name
- self.version = parse_version(version)
- self.link = link
-
- super().__init__(
- key=(self.name, self.version, self.link),
- defining_class=InstallationCandidate,
- )
-
- def __repr__(self) -> str:
- return "".format(
- self.name,
- self.version,
- self.link,
- )
-
- def __str__(self) -> str:
- return "{!r} candidate (version {} at {})".format(
- self.name,
- self.version,
- self.link,
- )
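
`KeyBasedCompareMixin` is pip-internal; the idea is simply to delegate every comparison to a single key tuple. A hypothetical, simplified stand-in:

```python
# Hypothetical stand-in for KeyBasedCompareMixin (not pip's code): one key
# drives all comparisons, so candidates order by (name, version, link).
import functools
from typing import Any, Tuple


@functools.total_ordering
class KeyCompared:
    def __init__(self, key: Tuple[Any, ...]) -> None:
        self._key = key

    def __eq__(self, other: object) -> bool:
        if not isinstance(other, KeyCompared):
            return NotImplemented
        return self._key == other._key

    def __lt__(self, other: "KeyCompared") -> bool:
        return self._key < other._key


print(KeyCompared(("pip", (21, 2))) < KeyCompared(("pip", (21, 3))))  # -> True
```
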
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/direct_url.py b/env/lib/python3.9/site-packages/pip/_internal/models/direct_url.py
deleted file mode 100644
index e75feda..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/direct_url.py
+++ /dev/null
@@ -1,212 +0,0 @@
-""" PEP 610 """
-import json
-import re
-import urllib.parse
-from typing import Any, Dict, Iterable, Optional, Type, TypeVar, Union
-
-__all__ = [
- "DirectUrl",
- "DirectUrlValidationError",
- "DirInfo",
- "ArchiveInfo",
- "VcsInfo",
-]
-
-T = TypeVar("T")
-
-DIRECT_URL_METADATA_NAME = "direct_url.json"
-ENV_VAR_RE = re.compile(r"^\$\{[A-Za-z0-9-_]+\}(:\$\{[A-Za-z0-9-_]+\})?$")
-
-
-class DirectUrlValidationError(Exception):
- pass
-
-
-def _get(
- d: Dict[str, Any], expected_type: Type[T], key: str, default: Optional[T] = None
-) -> Optional[T]:
- """Get value from dictionary and verify expected type."""
- if key not in d:
- return default
- value = d[key]
- if not isinstance(value, expected_type):
- raise DirectUrlValidationError(
- "{!r} has unexpected type for {} (expected {})".format(
- value, key, expected_type
- )
- )
- return value
-
-
-def _get_required(
- d: Dict[str, Any], expected_type: Type[T], key: str, default: Optional[T] = None
-) -> T:
- value = _get(d, expected_type, key, default)
- if value is None:
- raise DirectUrlValidationError(f"{key} must have a value")
- return value
-
-
-def _exactly_one_of(infos: Iterable[Optional["InfoType"]]) -> "InfoType":
- infos = [info for info in infos if info is not None]
- if not infos:
- raise DirectUrlValidationError(
- "missing one of archive_info, dir_info, vcs_info"
- )
- if len(infos) > 1:
- raise DirectUrlValidationError(
- "more than one of archive_info, dir_info, vcs_info"
- )
- assert infos[0] is not None
- return infos[0]
-
-
-def _filter_none(**kwargs: Any) -> Dict[str, Any]:
- """Make dict excluding None values."""
- return {k: v for k, v in kwargs.items() if v is not None}
-
-
-class VcsInfo:
- name = "vcs_info"
-
- def __init__(
- self,
- vcs: str,
- commit_id: str,
- requested_revision: Optional[str] = None,
- ) -> None:
- self.vcs = vcs
- self.requested_revision = requested_revision
- self.commit_id = commit_id
-
- @classmethod
- def _from_dict(cls, d: Optional[Dict[str, Any]]) -> Optional["VcsInfo"]:
- if d is None:
- return None
- return cls(
- vcs=_get_required(d, str, "vcs"),
- commit_id=_get_required(d, str, "commit_id"),
- requested_revision=_get(d, str, "requested_revision"),
- )
-
- def _to_dict(self) -> Dict[str, Any]:
- return _filter_none(
- vcs=self.vcs,
- requested_revision=self.requested_revision,
- commit_id=self.commit_id,
- )
-
-
-class ArchiveInfo:
- name = "archive_info"
-
- def __init__(
- self,
- hash: Optional[str] = None,
- ) -> None:
- self.hash = hash
-
- @classmethod
- def _from_dict(cls, d: Optional[Dict[str, Any]]) -> Optional["ArchiveInfo"]:
- if d is None:
- return None
- return cls(hash=_get(d, str, "hash"))
-
- def _to_dict(self) -> Dict[str, Any]:
- return _filter_none(hash=self.hash)
-
-
-class DirInfo:
- name = "dir_info"
-
- def __init__(
- self,
- editable: bool = False,
- ) -> None:
- self.editable = editable
-
- @classmethod
- def _from_dict(cls, d: Optional[Dict[str, Any]]) -> Optional["DirInfo"]:
- if d is None:
- return None
- return cls(editable=_get_required(d, bool, "editable", default=False))
-
- def _to_dict(self) -> Dict[str, Any]:
- return _filter_none(editable=self.editable or None)
-
-
-InfoType = Union[ArchiveInfo, DirInfo, VcsInfo]
-
-
-class DirectUrl:
- def __init__(
- self,
- url: str,
- info: InfoType,
- subdirectory: Optional[str] = None,
- ) -> None:
- self.url = url
- self.info = info
- self.subdirectory = subdirectory
-
- def _remove_auth_from_netloc(self, netloc: str) -> str:
- if "@" not in netloc:
- return netloc
- user_pass, netloc_no_user_pass = netloc.split("@", 1)
- if (
- isinstance(self.info, VcsInfo)
- and self.info.vcs == "git"
- and user_pass == "git"
- ):
- return netloc
- if ENV_VAR_RE.match(user_pass):
- return netloc
- return netloc_no_user_pass
-
- @property
- def redacted_url(self) -> str:
- """url with user:password part removed unless it is formed with
- environment variables as specified in PEP 610, or it is ``git``
- in the case of a git URL.
- """
- purl = urllib.parse.urlsplit(self.url)
- netloc = self._remove_auth_from_netloc(purl.netloc)
- surl = urllib.parse.urlunsplit(
- (purl.scheme, netloc, purl.path, purl.query, purl.fragment)
- )
- return surl
-
- def validate(self) -> None:
- self.from_dict(self.to_dict())
-
- @classmethod
- def from_dict(cls, d: Dict[str, Any]) -> "DirectUrl":
- return DirectUrl(
- url=_get_required(d, str, "url"),
- subdirectory=_get(d, str, "subdirectory"),
- info=_exactly_one_of(
- [
- ArchiveInfo._from_dict(_get(d, dict, "archive_info")),
- DirInfo._from_dict(_get(d, dict, "dir_info")),
- VcsInfo._from_dict(_get(d, dict, "vcs_info")),
- ]
- ),
- )
-
- def to_dict(self) -> Dict[str, Any]:
- res = _filter_none(
- url=self.redacted_url,
- subdirectory=self.subdirectory,
- )
- res[self.info.name] = self.info._to_dict()
- return res
-
- @classmethod
- def from_json(cls, s: str) -> "DirectUrl":
- return cls.from_dict(json.loads(s))
-
- def to_json(self) -> str:
- return json.dumps(self.to_dict(), sort_keys=True)
-
- def is_local_editable(self) -> bool:
- return isinstance(self.info, DirInfo) and self.info.editable
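
For reference, the PEP 610 document shape `DirectUrl.from_dict` above accepts: exactly one of `archive_info`, `dir_info`, or `vcs_info` must be present. A minimal sketch of such a payload (all values illustrative):

```python
# A minimal direct_url.json payload with the vcs_info variant.
import json

doc = {
    "url": "https://github.com/pypa/pip.git",
    "vcs_info": {
        "vcs": "git",
        "requested_revision": "main",
        "commit_id": "0123456789abcdef0123456789abcdef01234567",
    },
}
print(json.dumps(doc, sort_keys=True))
```
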
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/format_control.py b/env/lib/python3.9/site-packages/pip/_internal/models/format_control.py
deleted file mode 100644
index db3995e..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/format_control.py
+++ /dev/null
@@ -1,80 +0,0 @@
-from typing import FrozenSet, Optional, Set
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.exceptions import CommandError
-
-
-class FormatControl:
- """Helper for managing formats from which a package can be installed."""
-
- __slots__ = ["no_binary", "only_binary"]
-
- def __init__(
- self,
- no_binary: Optional[Set[str]] = None,
- only_binary: Optional[Set[str]] = None,
- ) -> None:
- if no_binary is None:
- no_binary = set()
- if only_binary is None:
- only_binary = set()
-
- self.no_binary = no_binary
- self.only_binary = only_binary
-
- def __eq__(self, other: object) -> bool:
- if not isinstance(other, self.__class__):
- return NotImplemented
-
- if self.__slots__ != other.__slots__:
- return False
-
- return all(getattr(self, k) == getattr(other, k) for k in self.__slots__)
-
- def __repr__(self) -> str:
- return "{}({}, {})".format(
- self.__class__.__name__, self.no_binary, self.only_binary
- )
-
- @staticmethod
- def handle_mutual_excludes(value: str, target: Set[str], other: Set[str]) -> None:
- if value.startswith("-"):
- raise CommandError(
- "--no-binary / --only-binary option requires 1 argument."
- )
- new = value.split(",")
- while ":all:" in new:
- other.clear()
- target.clear()
- target.add(":all:")
- del new[: new.index(":all:") + 1]
- # Without a following :none:, everything else can be discarded, as :all: covers it
- if ":none:" not in new:
- return
- for name in new:
- if name == ":none:":
- target.clear()
- continue
- name = canonicalize_name(name)
- other.discard(name)
- target.add(name)
-
- def get_allowed_formats(self, canonical_name: str) -> FrozenSet[str]:
- result = {"binary", "source"}
- if canonical_name in self.only_binary:
- result.discard("source")
- elif canonical_name in self.no_binary:
- result.discard("binary")
- elif ":all:" in self.only_binary:
- result.discard("source")
- elif ":all:" in self.no_binary:
- result.discard("binary")
- return frozenset(result)
-
- def disallow_binaries(self) -> None:
- self.handle_mutual_excludes(
- ":all:",
- self.no_binary,
- self.only_binary,
- )
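
The precedence encoded in `get_allowed_formats` above is: per-package entries beat the catch-all `:all:`, and `only_binary` is consulted before `no_binary`. A standalone restatement of the same decision table (not pip's API):

```python
# Standalone sketch of get_allowed_formats() precedence.
from typing import FrozenSet, Set


def allowed_formats(
    name: str, no_binary: Set[str], only_binary: Set[str]
) -> FrozenSet[str]:
    result = {"binary", "source"}
    if name in only_binary:
        result.discard("source")
    elif name in no_binary:
        result.discard("binary")
    elif ":all:" in only_binary:
        result.discard("source")
    elif ":all:" in no_binary:
        result.discard("binary")
    return frozenset(result)


# The per-package only_binary entry wins over the global no_binary :all:.
print(allowed_formats("numpy", no_binary={":all:"}, only_binary={"numpy"}))
# -> frozenset({'binary'})
```
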
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/index.py b/env/lib/python3.9/site-packages/pip/_internal/models/index.py
deleted file mode 100644
index b94c325..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/index.py
+++ /dev/null
@@ -1,28 +0,0 @@
-import urllib.parse
-
-
-class PackageIndex:
- """Represents a Package Index and provides easier access to endpoints"""
-
- __slots__ = ["url", "netloc", "simple_url", "pypi_url", "file_storage_domain"]
-
- def __init__(self, url: str, file_storage_domain: str) -> None:
- super().__init__()
- self.url = url
- self.netloc = urllib.parse.urlsplit(url).netloc
- self.simple_url = self._url_for_path("simple")
- self.pypi_url = self._url_for_path("pypi")
-
- # This is part of a temporary hack used to block installs of PyPI
- # packages which depend on external URLs; it is only necessary until
- # PyPI can block such packages itself.
- self.file_storage_domain = file_storage_domain
-
- def _url_for_path(self, path: str) -> str:
- return urllib.parse.urljoin(self.url, path)
-
-
-PyPI = PackageIndex("https://pypi.org/", file_storage_domain="files.pythonhosted.org")
-TestPyPI = PackageIndex(
- "https://test.pypi.org/", file_storage_domain="test-files.pythonhosted.org"
-)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/link.py b/env/lib/python3.9/site-packages/pip/_internal/models/link.py
deleted file mode 100644
index 6069b27..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/link.py
+++ /dev/null
@@ -1,288 +0,0 @@
-import functools
-import logging
-import os
-import posixpath
-import re
-import urllib.parse
-from typing import TYPE_CHECKING, Dict, List, NamedTuple, Optional, Tuple, Union
-
-from pip._internal.utils.filetypes import WHEEL_EXTENSION
-from pip._internal.utils.hashes import Hashes
-from pip._internal.utils.misc import (
- redact_auth_from_url,
- split_auth_from_netloc,
- splitext,
-)
-from pip._internal.utils.models import KeyBasedCompareMixin
-from pip._internal.utils.urls import path_to_url, url_to_path
-
-if TYPE_CHECKING:
- from pip._internal.index.collector import HTMLPage
-
-logger = logging.getLogger(__name__)
-
-
-_SUPPORTED_HASHES = ("sha1", "sha224", "sha384", "sha256", "sha512", "md5")
-
-
-class Link(KeyBasedCompareMixin):
- """Represents a parsed link from a Package Index's simple URL"""
-
- __slots__ = [
- "_parsed_url",
- "_url",
- "comes_from",
- "requires_python",
- "yanked_reason",
- "cache_link_parsing",
- ]
-
- def __init__(
- self,
- url: str,
- comes_from: Optional[Union[str, "HTMLPage"]] = None,
- requires_python: Optional[str] = None,
- yanked_reason: Optional[str] = None,
- cache_link_parsing: bool = True,
- ) -> None:
- """
- :param url: url of the resource pointed to (href of the link)
- :param comes_from: instance of HTMLPage where the link was found,
- or string.
- :param requires_python: String containing the `Requires-Python`
- metadata field, specified in PEP 345. This may be specified by
- a data-requires-python attribute in the HTML link tag, as
- described in PEP 503.
- :param yanked_reason: the reason the file has been yanked, if the
- file has been yanked, or None if the file hasn't been yanked.
- This is the value of the "data-yanked" attribute, if present, in
- a simple repository HTML link. If the file has been yanked but
- no reason was provided, this should be the empty string. See
- PEP 592 for more information and the specification.
- :param cache_link_parsing: A flag that is used elsewhere to determine
- whether resources retrieved from this link
- should be cached. PyPI index urls should
- generally have this set to False, for
- example.
- """
-
- # url can be a UNC windows share
- if url.startswith("\\\\"):
- url = path_to_url(url)
-
- self._parsed_url = urllib.parse.urlsplit(url)
- # Store the url as a private attribute to prevent accidentally
- # trying to set a new value.
- self._url = url
-
- self.comes_from = comes_from
- self.requires_python = requires_python if requires_python else None
- self.yanked_reason = yanked_reason
-
- super().__init__(key=url, defining_class=Link)
-
- self.cache_link_parsing = cache_link_parsing
-
- def __str__(self) -> str:
- if self.requires_python:
- rp = f" (requires-python:{self.requires_python})"
- else:
- rp = ""
- if self.comes_from:
- return "{} (from {}){}".format(
- redact_auth_from_url(self._url), self.comes_from, rp
- )
- else:
- return redact_auth_from_url(str(self._url))
-
- def __repr__(self) -> str:
- return f""
-
- @property
- def url(self) -> str:
- return self._url
-
- @property
- def filename(self) -> str:
- path = self.path.rstrip("/")
- name = posixpath.basename(path)
- if not name:
- # Make sure we don't leak auth information if the netloc
- # includes a username and password.
- netloc, user_pass = split_auth_from_netloc(self.netloc)
- return netloc
-
- name = urllib.parse.unquote(name)
- assert name, f"URL {self._url!r} produced no filename"
- return name
-
- @property
- def file_path(self) -> str:
- return url_to_path(self.url)
-
- @property
- def scheme(self) -> str:
- return self._parsed_url.scheme
-
- @property
- def netloc(self) -> str:
- """
- This can contain auth information.
- """
- return self._parsed_url.netloc
-
- @property
- def path(self) -> str:
- return urllib.parse.unquote(self._parsed_url.path)
-
- def splitext(self) -> Tuple[str, str]:
- return splitext(posixpath.basename(self.path.rstrip("/")))
-
- @property
- def ext(self) -> str:
- return self.splitext()[1]
-
- @property
- def url_without_fragment(self) -> str:
- scheme, netloc, path, query, fragment = self._parsed_url
- return urllib.parse.urlunsplit((scheme, netloc, path, query, ""))
-
- _egg_fragment_re = re.compile(r"[#&]egg=([^&]*)")
-
- @property
- def egg_fragment(self) -> Optional[str]:
- match = self._egg_fragment_re.search(self._url)
- if not match:
- return None
- return match.group(1)
-
- _subdirectory_fragment_re = re.compile(r"[#&]subdirectory=([^&]*)")
-
- @property
- def subdirectory_fragment(self) -> Optional[str]:
- match = self._subdirectory_fragment_re.search(self._url)
- if not match:
- return None
- return match.group(1)
-
- _hash_re = re.compile(
- r"({choices})=([a-f0-9]+)".format(choices="|".join(_SUPPORTED_HASHES))
- )
-
- @property
- def hash(self) -> Optional[str]:
- match = self._hash_re.search(self._url)
- if match:
- return match.group(2)
- return None
-
- @property
- def hash_name(self) -> Optional[str]:
- match = self._hash_re.search(self._url)
- if match:
- return match.group(1)
- return None
-
- @property
- def show_url(self) -> str:
- return posixpath.basename(self._url.split("#", 1)[0].split("?", 1)[0])
-
- @property
- def is_file(self) -> bool:
- return self.scheme == "file"
-
- def is_existing_dir(self) -> bool:
- return self.is_file and os.path.isdir(self.file_path)
-
- @property
- def is_wheel(self) -> bool:
- return self.ext == WHEEL_EXTENSION
-
- @property
- def is_vcs(self) -> bool:
- from pip._internal.vcs import vcs
-
- return self.scheme in vcs.all_schemes
-
- @property
- def is_yanked(self) -> bool:
- return self.yanked_reason is not None
-
- @property
- def has_hash(self) -> bool:
- return self.hash_name is not None
-
- def is_hash_allowed(self, hashes: Optional[Hashes]) -> bool:
- """
- Return True if the link has a hash and it is allowed.
- """
- if hashes is None or not self.has_hash:
- return False
- # Assert non-None so mypy knows self.hash_name and self.hash are str.
- assert self.hash_name is not None
- assert self.hash is not None
-
- return hashes.is_hash_allowed(self.hash_name, hex_digest=self.hash)
-
-
-class _CleanResult(NamedTuple):
- """Convert link for equivalency check.
-
- This is used in the resolver to check whether two URL-specified requirements
- likely point to the same distribution and can be considered equivalent. This
- equivalency logic avoids comparing URLs literally, which can be too strict
- (e.g. "a=1&b=2" vs "b=2&a=1") and produce conflicts unexpecting to users.
-
- Currently this does three things:
-
- 1. Drop the basic auth part. This is technically wrong since a server can
- serve different content based on auth, but if it does that, it becomes
- impossible even to guarantee that two URLs without auth are equivalent,
- since the user can input different auth information when prompted. So the
- practical solution is to assume the auth doesn't affect the response.
- 2. Parse the query to avoid the key-ordering issue. Note that the ordering of
- values under the same key is NOT cleaned; i.e. "a=1&a=2" and "a=2&a=1" are
- still considered different.
- 3. Explicitly drop most of the fragment part, except ``subdirectory=`` and
- hash values, since it should have no impact on the downloaded content. Note
- that this drops the "egg=" part historically used to denote the requested
- project (and extras), which is wrong in the strictest sense, but enough
- people supply it inconsistently that it causes superfluous resolution
- conflicts, so we choose to ignore it as well.
- """
-
- parsed: urllib.parse.SplitResult
- query: Dict[str, List[str]]
- subdirectory: str
- hashes: Dict[str, str]
-
-
-def _clean_link(link: Link) -> _CleanResult:
- parsed = link._parsed_url
- netloc = parsed.netloc.rsplit("@", 1)[-1]
- # According to RFC 8089, an empty host in file: means localhost.
- if parsed.scheme == "file" and not netloc:
- netloc = "localhost"
- fragment = urllib.parse.parse_qs(parsed.fragment)
- if "egg" in fragment:
- logger.debug("Ignoring egg= fragment in %s", link)
- try:
- # If there are multiple subdirectory values, use the first one.
- # This matches the behavior of Link.subdirectory_fragment.
- subdirectory = fragment["subdirectory"][0]
- except (IndexError, KeyError):
- subdirectory = ""
- # If there are multiple hash values under the same algorithm, use the
- # first one. This matches the behavior of Link.hash.
- hashes = {k: fragment[k][0] for k in _SUPPORTED_HASHES if k in fragment}
- return _CleanResult(
- parsed=parsed._replace(netloc=netloc, query="", fragment=""),
- query=urllib.parse.parse_qs(parsed.query),
- subdirectory=subdirectory,
- hashes=hashes,
- )
-
-
-@functools.lru_cache(maxsize=None)
-def links_equivalent(link1: Link, link2: Link) -> bool:
- return _clean_link(link1) == _clean_link(link2)
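
A quick illustration of the fragment normalization `_clean_link` above performs: `parse_qs` turns the fragment into a dict, so key order stops mattering while repeated values under one key are preserved:

```python
import urllib.parse

url = "https://example.com/pkg.zip#egg=pkg&subdirectory=src&sha256=deadbeef"
fragment = urllib.parse.urlsplit(url).fragment
print(urllib.parse.parse_qs(fragment))
# -> {'egg': ['pkg'], 'subdirectory': ['src'], 'sha256': ['deadbeef']}
```
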
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/scheme.py b/env/lib/python3.9/site-packages/pip/_internal/models/scheme.py
deleted file mode 100644
index f51190a..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/scheme.py
+++ /dev/null
@@ -1,31 +0,0 @@
-"""
-For types associated with installation schemes.
-
-For a general overview of available schemes and their context, see
-https://docs.python.org/3/install/index.html#alternate-installation.
-"""
-
-
-SCHEME_KEYS = ["platlib", "purelib", "headers", "scripts", "data"]
-
-
-class Scheme:
- """A Scheme holds paths which are used as the base directories for
- artifacts associated with a Python package.
- """
-
- __slots__ = SCHEME_KEYS
-
- def __init__(
- self,
- platlib: str,
- purelib: str,
- headers: str,
- scripts: str,
- data: str,
- ) -> None:
- self.platlib = platlib
- self.purelib = purelib
- self.headers = headers
- self.scripts = scripts
- self.data = data
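
The five `SCHEME_KEYS` above mirror the interpreter's install-scheme vocabulary, with the caveat that `sysconfig` calls the headers directory `include`. To see concrete values for the running interpreter:

```python
import sysconfig

# sysconfig has no "headers" key; "include" is the closest equivalent.
for key in ("platlib", "purelib", "include", "scripts", "data"):
    print(key, "->", sysconfig.get_path(key))
```
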
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/search_scope.py b/env/lib/python3.9/site-packages/pip/_internal/models/search_scope.py
deleted file mode 100644
index e4e54c2..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/search_scope.py
+++ /dev/null
@@ -1,129 +0,0 @@
-import itertools
-import logging
-import os
-import posixpath
-import urllib.parse
-from typing import List
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.models.index import PyPI
-from pip._internal.utils.compat import has_tls
-from pip._internal.utils.misc import normalize_path, redact_auth_from_url
-
-logger = logging.getLogger(__name__)
-
-
-class SearchScope:
-
- """
- Encapsulates the locations that pip is configured to search.
- """
-
- __slots__ = ["find_links", "index_urls"]
-
- @classmethod
- def create(
- cls,
- find_links: List[str],
- index_urls: List[str],
- ) -> "SearchScope":
- """
- Create a SearchScope object after normalizing the `find_links`.
- """
- # Build find_links. If an argument starts with ~, it may be
- # a local file relative to a home directory. So try normalizing
- # it and if it exists, use the normalized version.
- # This is deliberately conservative - it might be fine just to
- # blindly normalize anything starting with a ~...
- built_find_links: List[str] = []
- for link in find_links:
- if link.startswith("~"):
- new_link = normalize_path(link)
- if os.path.exists(new_link):
- link = new_link
- built_find_links.append(link)
-
- # If we don't have TLS enabled, then WARN if anyplace we're looking
- # relies on TLS.
- if not has_tls():
- for link in itertools.chain(index_urls, built_find_links):
- parsed = urllib.parse.urlparse(link)
- if parsed.scheme == "https":
- logger.warning(
- "pip is configured with locations that require "
- "TLS/SSL, however the ssl module in Python is not "
- "available."
- )
- break
-
- return cls(
- find_links=built_find_links,
- index_urls=index_urls,
- )
-
- def __init__(
- self,
- find_links: List[str],
- index_urls: List[str],
- ) -> None:
- self.find_links = find_links
- self.index_urls = index_urls
-
- def get_formatted_locations(self) -> str:
- lines = []
- redacted_index_urls = []
- if self.index_urls and self.index_urls != [PyPI.simple_url]:
- for url in self.index_urls:
-
- redacted_index_url = redact_auth_from_url(url)
-
- # Parse the URL
- purl = urllib.parse.urlsplit(redacted_index_url)
-
- # A URL is generally invalid if both the scheme and netloc are
- # missing. There are issues with Python's URL parsing, so this test
- # is a bit crude; see bpo-20271 and bpo-23505. Python doesn't always
- # parse invalid URLs correctly, and should arguably raise exceptions
- # for malformed URLs.
- if not purl.scheme and not purl.netloc:
- logger.warning(
- 'The index url "%s" seems invalid, please provide a scheme.',
- redacted_index_url,
- )
-
- redacted_index_urls.append(redacted_index_url)
-
- lines.append(
- "Looking in indexes: {}".format(", ".join(redacted_index_urls))
- )
-
- if self.find_links:
- lines.append(
- "Looking in links: {}".format(
- ", ".join(redact_auth_from_url(url) for url in self.find_links)
- )
- )
- return "\n".join(lines)
-
- def get_index_urls_locations(self, project_name: str) -> List[str]:
- """Returns the locations found via self.index_urls
-
- Checks the url_name on the main (first in the list) index and
- use this url_name to produce all locations
- """
-
- def mkurl_pypi_url(url: str) -> str:
- loc = posixpath.join(
- url, urllib.parse.quote(canonicalize_name(project_name))
- )
- # For maximum compatibility with easy_install, ensure the path
- # ends in a trailing slash. Although this isn't in the spec
- # (and PyPI can handle it without the slash) some other index
- # implementations might break if they relied on easy_install's
- # behavior.
- if not loc.endswith("/"):
- loc = loc + "/"
- return loc
-
- return [mkurl_pypi_url(url) for url in self.index_urls]
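
The inner `mkurl_pypi_url` helper above boils down to: canonicalize, quote, join, and guarantee a trailing slash. A standalone sketch using the `packaging` library (pip uses its vendored copy):

```python
import posixpath
import urllib.parse

from packaging.utils import canonicalize_name

index_url = "https://pypi.org/simple"
loc = posixpath.join(index_url, urllib.parse.quote(canonicalize_name("My.Package")))
if not loc.endswith("/"):
    loc += "/"
print(loc)  # -> https://pypi.org/simple/my-package/
```
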
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/selection_prefs.py b/env/lib/python3.9/site-packages/pip/_internal/models/selection_prefs.py
deleted file mode 100644
index 977bc4c..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/selection_prefs.py
+++ /dev/null
@@ -1,51 +0,0 @@
-from typing import Optional
-
-from pip._internal.models.format_control import FormatControl
-
-
-class SelectionPreferences:
- """
- Encapsulates the candidate selection preferences for downloading
- and installing files.
- """
-
- __slots__ = [
- "allow_yanked",
- "allow_all_prereleases",
- "format_control",
- "prefer_binary",
- "ignore_requires_python",
- ]
-
- # Don't include an allow_yanked default value to make sure each call
- # site considers whether yanked releases are allowed. This also causes
- # that decision to be made explicit in the calling code, which helps
- # people when reading the code.
- def __init__(
- self,
- allow_yanked: bool,
- allow_all_prereleases: bool = False,
- format_control: Optional[FormatControl] = None,
- prefer_binary: bool = False,
- ignore_requires_python: Optional[bool] = None,
- ) -> None:
- """Create a SelectionPreferences object.
-
- :param allow_yanked: Whether files marked as yanked (in the sense
- of PEP 592) are permitted to be candidates for install.
- :param format_control: A FormatControl object or None. Used to control
- the selection of source packages / binary packages when consulting
- the index and links.
- :param prefer_binary: Whether to prefer an old, but valid, binary
- dist over a new source dist.
- :param ignore_requires_python: Whether to ignore incompatible
- "Requires-Python" values in links. Defaults to False.
- """
- if ignore_requires_python is None:
- ignore_requires_python = False
-
- self.allow_yanked = allow_yanked
- self.allow_all_prereleases = allow_all_prereleases
- self.format_control = format_control
- self.prefer_binary = prefer_binary
- self.ignore_requires_python = ignore_requires_python
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/target_python.py b/env/lib/python3.9/site-packages/pip/_internal/models/target_python.py
deleted file mode 100644
index 744bd7e..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/target_python.py
+++ /dev/null
@@ -1,110 +0,0 @@
-import sys
-from typing import List, Optional, Tuple
-
-from pip._vendor.packaging.tags import Tag
-
-from pip._internal.utils.compatibility_tags import get_supported, version_info_to_nodot
-from pip._internal.utils.misc import normalize_version_info
-
-
-class TargetPython:
-
- """
- Encapsulates the properties of a Python interpreter one is targeting
- for a package install, download, etc.
- """
-
- __slots__ = [
- "_given_py_version_info",
- "abis",
- "implementation",
- "platforms",
- "py_version",
- "py_version_info",
- "_valid_tags",
- ]
-
- def __init__(
- self,
- platforms: Optional[List[str]] = None,
- py_version_info: Optional[Tuple[int, ...]] = None,
- abis: Optional[List[str]] = None,
- implementation: Optional[str] = None,
- ) -> None:
- """
- :param platforms: A list of strings or None. If None, searches for
- packages that are supported by the current system. Otherwise, will
- find packages that can be built on the platforms passed in. These
- packages will only be downloaded for distribution: they will
- not be built locally.
- :param py_version_info: An optional tuple of ints representing the
- Python version information to use (e.g. `sys.version_info[:3]`).
- This can have length 1, 2, or 3 when provided.
- :param abis: A list of strings or None. This is passed to
- compatibility_tags.py's get_supported() function as is.
- :param implementation: A string or None. This is passed to
- compatibility_tags.py's get_supported() function as is.
- """
- # Store the given py_version_info for when we call get_supported().
- self._given_py_version_info = py_version_info
-
- if py_version_info is None:
- py_version_info = sys.version_info[:3]
- else:
- py_version_info = normalize_version_info(py_version_info)
-
- py_version = ".".join(map(str, py_version_info[:2]))
-
- self.abis = abis
- self.implementation = implementation
- self.platforms = platforms
- self.py_version = py_version
- self.py_version_info = py_version_info
-
- # This is used to cache the return value of get_tags().
- self._valid_tags: Optional[List[Tag]] = None
-
- def format_given(self) -> str:
- """
- Format the given, non-None attributes for display.
- """
- display_version = None
- if self._given_py_version_info is not None:
- display_version = ".".join(
- str(part) for part in self._given_py_version_info
- )
-
- key_values = [
- ("platforms", self.platforms),
- ("version_info", display_version),
- ("abis", self.abis),
- ("implementation", self.implementation),
- ]
- return " ".join(
- f"{key}={value!r}" for key, value in key_values if value is not None
- )
-
- def get_tags(self) -> List[Tag]:
- """
- Return the supported PEP 425 tags to check wheel candidates against.
-
- The tags are returned in order of preference (most preferred first).
- """
- if self._valid_tags is None:
- # Pass versions=None if no py_version_info was given since
- # versions=None uses special default logic.
- py_version_info = self._given_py_version_info
- if py_version_info is None:
- version = None
- else:
- version = version_info_to_nodot(py_version_info)
-
- tags = get_supported(
- version=version,
- platforms=self.platforms,
- abis=self.abis,
- impl=self.implementation,
- )
- self._valid_tags = tags
-
- return self._valid_tags
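
What `get_tags` above ultimately returns is a preference-ordered list of PEP 425 tags; the standalone `packaging` library exposes the same data for the running interpreter:

```python
from packaging.tags import sys_tags

# Most preferred first, e.g. cp39-cp39-manylinux_2_17_x86_64 on CPython 3.9/Linux.
for tag in list(sys_tags())[:3]:
    print(tag)
```
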
diff --git a/env/lib/python3.9/site-packages/pip/_internal/models/wheel.py b/env/lib/python3.9/site-packages/pip/_internal/models/wheel.py
deleted file mode 100644
index e091612..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/models/wheel.py
+++ /dev/null
@@ -1,89 +0,0 @@
-"""Represents a wheel file and provides access to the various parts of the
-name that have meaning.
-"""
-import re
-from typing import Dict, Iterable, List
-
-from pip._vendor.packaging.tags import Tag
-
-from pip._internal.exceptions import InvalidWheelFilename
-
-
-class Wheel:
- """A wheel file"""
-
- wheel_file_re = re.compile(
- r"""^(?P(?P.+?)-(?P.*?))
- ((-(?P\d[^-]*?))?-(?P.+?)-(?P.+?)-(?P.+?)
- \.whl|\.dist-info)$""",
- re.VERBOSE,
- )
-
- def __init__(self, filename: str) -> None:
- """
- :raises InvalidWheelFilename: when the filename is invalid for a wheel
- """
- wheel_info = self.wheel_file_re.match(filename)
- if not wheel_info:
- raise InvalidWheelFilename(f"{filename} is not a valid wheel filename.")
- self.filename = filename
- self.name = wheel_info.group("name").replace("_", "-")
- # we'll assume "_" means "-" due to wheel naming scheme
- # (https://github.com/pypa/pip/issues/1150)
- self.version = wheel_info.group("ver").replace("_", "-")
- self.build_tag = wheel_info.group("build")
- self.pyversions = wheel_info.group("pyver").split(".")
- self.abis = wheel_info.group("abi").split(".")
- self.plats = wheel_info.group("plat").split(".")
-
- # All the tag combinations from this file
- self.file_tags = {
- Tag(x, y, z) for x in self.pyversions for y in self.abis for z in self.plats
- }
-
- def get_formatted_file_tags(self) -> List[str]:
- """Return the wheel's tags as a sorted list of strings."""
- return sorted(str(tag) for tag in self.file_tags)
-
- def support_index_min(self, tags: List[Tag]) -> int:
- """Return the lowest index that one of the wheel's file_tag combinations
- achieves in the given list of supported tags.
-
- For example, if there are 8 supported tags and one of the file tags
- is first in the list, then return 0.
-
- :param tags: the PEP 425 tags to check the wheel against, in order
- with most preferred first.
-
- :raises ValueError: If none of the wheel's file tags match one of
- the supported tags.
- """
- return min(tags.index(tag) for tag in self.file_tags if tag in tags)
-
- def find_most_preferred_tag(
- self, tags: List[Tag], tag_to_priority: Dict[Tag, int]
- ) -> int:
- """Return the priority of the most preferred tag that one of the wheel's file
- tag combinations achieves in the given list of supported tags using the given
- tag_to_priority mapping, where lower priorities are more-preferred.
-
- This is used in place of support_index_min in some cases in order to avoid
- an expensive linear scan of a large list of tags.
-
- :param tags: the PEP 425 tags to check the wheel against.
- :param tag_to_priority: a mapping from tag to priority of that tag, where
- lower is more preferred.
-
- :raises ValueError: If none of the wheel's file tags match one of
- the supported tags.
- """
- return min(
- tag_to_priority[tag] for tag in self.file_tags if tag in tag_to_priority
- )
-
- def supported(self, tags: Iterable[Tag]) -> bool:
- """Return whether the wheel is compatible with one of the given tags.
-
- :param tags: the PEP 425 tags to check the wheel against.
- """
- return not self.file_tags.isdisjoint(tags)
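
The regex-based parsing in `Wheel.__init__` above has a standalone counterpart in the `packaging` library (assuming packaging >= 20.9), which is handy for checking what the named groups capture:

```python
from packaging.utils import parse_wheel_filename

name, version, build, tags = parse_wheel_filename("pip-21.3.1-py3-none-any.whl")
print(name, version, build, sorted(str(t) for t in tags))
# -> pip 21.3.1 () ['py3-none-any']
```
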
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/network/__init__.py
deleted file mode 100644
index b51bde9..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/__init__.py
+++ /dev/null
@@ -1,2 +0,0 @@
-"""Contains purely network-related utilities.
-"""
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/auth.py b/env/lib/python3.9/site-packages/pip/_internal/network/auth.py
deleted file mode 100644
index e40ebfb..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/auth.py
+++ /dev/null
@@ -1,324 +0,0 @@
-"""Network Authentication Helpers
-
-Contains interface (MultiDomainBasicAuth) and associated glue code for
-providing credentials in the context of network requests.
-"""
-
-import urllib.parse
-from typing import Any, Dict, List, Optional, Tuple
-
-from pip._vendor.requests.auth import AuthBase, HTTPBasicAuth
-from pip._vendor.requests.models import Request, Response
-from pip._vendor.requests.utils import get_netrc_auth
-
-from pip._internal.utils.logging import getLogger
-from pip._internal.utils.misc import (
- ask,
- ask_input,
- ask_password,
- remove_auth_from_url,
- split_auth_netloc_from_url,
-)
-from pip._internal.vcs.versioncontrol import AuthInfo
-
-logger = getLogger(__name__)
-
-Credentials = Tuple[str, str, str]
-
-try:
- import keyring
-except ImportError:
- keyring = None # type: ignore[assignment]
-except Exception as exc:
- logger.warning(
- "Keyring is skipped due to an exception: %s",
- str(exc),
- )
- keyring = None # type: ignore[assignment]
-
-
-def get_keyring_auth(url: Optional[str], username: Optional[str]) -> Optional[AuthInfo]:
- """Return the tuple auth for a given url from keyring."""
- global keyring
- if not url or not keyring:
- return None
-
- try:
- try:
- get_credential = keyring.get_credential
- except AttributeError:
- pass
- else:
- logger.debug("Getting credentials from keyring for %s", url)
- cred = get_credential(url, username)
- if cred is not None:
- return cred.username, cred.password
- return None
-
- if username:
- logger.debug("Getting password from keyring for %s", url)
- password = keyring.get_password(url, username)
- if password:
- return username, password
-
- except Exception as exc:
- logger.warning(
- "Keyring is skipped due to an exception: %s",
- str(exc),
- )
- keyring = None # type: ignore[assignment]
- return None
-
-
-class MultiDomainBasicAuth(AuthBase):
- def __init__(
- self, prompting: bool = True, index_urls: Optional[List[str]] = None
- ) -> None:
- self.prompting = prompting
- self.index_urls = index_urls
- self.passwords: Dict[str, AuthInfo] = {}
- # When the user is prompted to enter credentials and keyring is
- # available, we will offer to save them. If the user accepts,
- # this value is set to the credentials they entered. After the
- # request authenticates, the caller should call
- # ``save_credentials`` to save these.
- self._credentials_to_save: Optional[Credentials] = None
-
- def _get_index_url(self, url: str) -> Optional[str]:
- """Return the original index URL matching the requested URL.
-
- Cached or dynamically generated credentials may work against
- the original index URL rather than just the netloc.
-
- The provided url should have had its username and password
- removed already. If the original index url had credentials then
- they will be included in the return value.
-
- Returns None if no matching index was found, or if --no-index
- was specified by the user.
- """
- if not url or not self.index_urls:
- return None
-
- for u in self.index_urls:
- prefix = remove_auth_from_url(u).rstrip("/") + "/"
- if url.startswith(prefix):
- return u
- return None
-
- def _get_new_credentials(
- self,
- original_url: str,
- *,
- allow_netrc: bool = False,
- allow_keyring: bool = False,
- ) -> AuthInfo:
- """Find and return credentials for the specified URL."""
- # Split the credentials and netloc from the url.
- url, netloc, url_user_password = split_auth_netloc_from_url(
- original_url,
- )
-
- # Start with the credentials embedded in the url
- username, password = url_user_password
- if username is not None and password is not None:
- logger.debug("Found credentials in url for %s", netloc)
- return url_user_password
-
- # Find a matching index url for this request
- index_url = self._get_index_url(url)
- if index_url:
- # Split the credentials from the url.
- index_info = split_auth_netloc_from_url(index_url)
- if index_info:
- index_url, _, index_url_user_password = index_info
- logger.debug("Found index url %s", index_url)
-
- # If an index URL was found, try its embedded credentials
- if index_url and index_url_user_password[0] is not None:
- username, password = index_url_user_password
- if username is not None and password is not None:
- logger.debug("Found credentials in index url for %s", netloc)
- return index_url_user_password
-
- # Get creds from netrc if we still don't have them
- if allow_netrc:
- netrc_auth = get_netrc_auth(original_url)
- if netrc_auth:
- logger.debug("Found credentials in netrc for %s", netloc)
- return netrc_auth
-
- # If we don't have a password and keyring is available, use it.
- if allow_keyring:
- # The index url is more specific than the netloc, so try it first
- # fmt: off
- kr_auth = (
- get_keyring_auth(index_url, username) or
- get_keyring_auth(netloc, username)
- )
- # fmt: on
- if kr_auth:
- logger.debug("Found credentials in keyring for %s", netloc)
- return kr_auth
-
- return username, password
-
- def _get_url_and_credentials(
- self, original_url: str
- ) -> Tuple[str, Optional[str], Optional[str]]:
- """Return the credentials to use for the provided URL.
-
- If allowed, netrc and keyring may be used to obtain the
- correct credentials.
-
- Returns (url_without_credentials, username, password). Note
- that even if the original URL contains credentials, this
- function may return a different username and password.
- """
- url, netloc, _ = split_auth_netloc_from_url(original_url)
-
- # Try to get credentials from original url
- username, password = self._get_new_credentials(original_url)
-
- # If credentials not found, use any stored credentials for this netloc.
- # Do this if either the username or the password is missing.
- # This accounts for the situation in which the user has specified
- # the username in the index url, but the password comes from keyring.
- if (username is None or password is None) and netloc in self.passwords:
- un, pw = self.passwords[netloc]
- # It is possible that the cached credentials are for a different username,
- # in which case the cache should be ignored.
- if username is None or username == un:
- username, password = un, pw
-
- if username is not None or password is not None:
- # Convert the username and password if they're None, so that
- # this netloc will show up as "cached" in the conditional above.
- # Further, HTTPBasicAuth doesn't accept None, so it makes sense to
- # cache the value that is going to be used.
- username = username or ""
- password = password or ""
-
- # Store any acquired credentials.
- self.passwords[netloc] = (username, password)
-
- assert (
- # Credentials were found
- (username is not None and password is not None)
- # Credentials were not found
- or (username is None and password is None)
- ), f"Could not load credentials from url: {original_url}"
-
- return url, username, password
-
- def __call__(self, req: Request) -> Request:
- # Get credentials for this request
- url, username, password = self._get_url_and_credentials(req.url)
-
- # Set the url of the request to the url without any credentials
- req.url = url
-
- if username is not None and password is not None:
- # Send the basic auth with this request
- req = HTTPBasicAuth(username, password)(req)
-
- # Attach a hook to handle 401 responses
- req.register_hook("response", self.handle_401)
-
- return req
-
- # Factored out to allow for easy patching in tests
- def _prompt_for_password(
- self, netloc: str
- ) -> Tuple[Optional[str], Optional[str], bool]:
- username = ask_input(f"User for {netloc}: ")
- if not username:
- return None, None, False
- auth = get_keyring_auth(netloc, username)
- if auth and auth[0] is not None and auth[1] is not None:
- return auth[0], auth[1], False
- password = ask_password("Password: ")
- return username, password, True
-
- # Factored out to allow for easy patching in tests
- def _should_save_password_to_keyring(self) -> bool:
- if not keyring:
- return False
- return ask("Save credentials to keyring [y/N]: ", ["y", "n"]) == "y"
-
- def handle_401(self, resp: Response, **kwargs: Any) -> Response:
- # We only care about 401 responses; anything else we just pass
- # through as the actual response.
- if resp.status_code != 401:
- return resp
-
- # We are not able to prompt the user so simply return the response
- if not self.prompting:
- return resp
-
- parsed = urllib.parse.urlparse(resp.url)
-
- # Query the keyring for credentials:
- username, password = self._get_new_credentials(
- resp.url,
- allow_netrc=True,
- allow_keyring=True,
- )
-
- # Prompt the user for a new username and password
- save = False
- if not username and not password:
- username, password, save = self._prompt_for_password(parsed.netloc)
-
- # Store the new username and password to use for future requests
- self._credentials_to_save = None
- if username is not None and password is not None:
- self.passwords[parsed.netloc] = (username, password)
-
- # Prompt to save the password to keyring
- if save and self._should_save_password_to_keyring():
- self._credentials_to_save = (parsed.netloc, username, password)
-
- # Consume content and release the original connection to allow our new
- # request to reuse the same one.
- resp.content
- resp.raw.release_conn()
-
- # Add our new username and password to the request
- req = HTTPBasicAuth(username or "", password or "")(resp.request)
- req.register_hook("response", self.warn_on_401)
-
- # On successful request, save the credentials that were used to
- # keyring. (Note that if the user responded "no" above, this member
- # is not set and nothing will be saved.)
- if self._credentials_to_save:
- req.register_hook("response", self.save_credentials)
-
- # Send our new request
- new_resp = resp.connection.send(req, **kwargs)
- new_resp.history.append(resp)
-
- return new_resp
-
- def warn_on_401(self, resp: Response, **kwargs: Any) -> None:
- """Response callback to warn about incorrect credentials."""
- if resp.status_code == 401:
- logger.warning(
- "401 Error, Credentials not correct for %s",
- resp.request.url,
- )
-
- def save_credentials(self, resp: Response, **kwargs: Any) -> None:
- """Response callback to save credentials on success."""
- assert keyring is not None, "should never reach here without keyring"
- if not keyring:
- return
-
- creds = self._credentials_to_save
- self._credentials_to_save = None
- if creds and resp.status_code < 400:
- try:
- logger.info("Saving credentials to keyring")
- keyring.set_password(*creds)
- except Exception:
- logger.exception("Failed to save credentials")
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/cache.py b/env/lib/python3.9/site-packages/pip/_internal/network/cache.py
deleted file mode 100644
index a81a239..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/cache.py
+++ /dev/null
@@ -1,69 +0,0 @@
-"""HTTP cache implementation.
-"""
-
-import os
-from contextlib import contextmanager
-from typing import Generator, Optional
-
-from pip._vendor.cachecontrol.cache import BaseCache
-from pip._vendor.cachecontrol.caches import FileCache
-from pip._vendor.requests.models import Response
-
-from pip._internal.utils.filesystem import adjacent_tmp_file, replace
-from pip._internal.utils.misc import ensure_dir
-
-
-def is_from_cache(response: Response) -> bool:
- return getattr(response, "from_cache", False)
-
-
-@contextmanager
-def suppressed_cache_errors() -> Generator[None, None, None]:
- """If we can't access the cache then we can just skip caching and process
- requests as if caching wasn't enabled.
- """
- try:
- yield
- except OSError:
- pass
-
-
-class SafeFileCache(BaseCache):
- """
- A file based cache which is safe to use even when the target directory may
- not be accessible or writable.
- """
-
- def __init__(self, directory: str) -> None:
- assert directory is not None, "Cache directory must not be None."
- super().__init__()
- self.directory = directory
-
- def _get_cache_path(self, name: str) -> str:
- # From cachecontrol.caches.file_cache.FileCache._fn, brought into our
- # class for backwards-compatibility and to avoid using a non-public
- # method.
- hashed = FileCache.encode(name)
- parts = list(hashed[:5]) + [hashed]
- return os.path.join(self.directory, *parts)
-
- def get(self, key: str) -> Optional[bytes]:
- path = self._get_cache_path(key)
- with suppressed_cache_errors():
- with open(path, "rb") as f:
- return f.read()
-
- def set(self, key: str, value: bytes, expires: Optional[int] = None) -> None:
- path = self._get_cache_path(key)
- with suppressed_cache_errors():
- ensure_dir(os.path.dirname(path))
-
- with adjacent_tmp_file(path) as f:
- f.write(value)
-
- replace(f.name, path)
-
- def delete(self, key: str) -> None:
- path = self._get_cache_path(key)
- with suppressed_cache_errors():
- os.remove(path)
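
The path layout `_get_cache_path` above produces can be reproduced directly. Assuming the vendored cachecontrol's `FileCache.encode` hashes the key with SHA-224 (an assumption worth verifying against the vendored copy), the first five hex characters become nested directories so no single directory grows too large:

```python
import hashlib
import os

key = "https://example.com/simple/pip/"  # illustrative cache key
hashed = hashlib.sha224(key.encode()).hexdigest()  # assumed FileCache.encode()
parts = list(hashed[:5]) + [hashed]
# Five one-character directories, then the full digest as the filename.
print(os.path.join("/tmp/http-cache", *parts))
```
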
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/download.py b/env/lib/python3.9/site-packages/pip/_internal/network/download.py
deleted file mode 100644
index 35bc970..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/download.py
+++ /dev/null
@@ -1,185 +0,0 @@
-"""Download files with progress indicators.
-"""
-import cgi
-import logging
-import mimetypes
-import os
-from typing import Iterable, Optional, Tuple
-
-from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response
-
-from pip._internal.cli.progress_bars import get_download_progress_renderer
-from pip._internal.exceptions import NetworkConnectionError
-from pip._internal.models.index import PyPI
-from pip._internal.models.link import Link
-from pip._internal.network.cache import is_from_cache
-from pip._internal.network.session import PipSession
-from pip._internal.network.utils import HEADERS, raise_for_status, response_chunks
-from pip._internal.utils.misc import format_size, redact_auth_from_url, splitext
-
-logger = logging.getLogger(__name__)
-
-
-def _get_http_response_size(resp: Response) -> Optional[int]:
- try:
- return int(resp.headers["content-length"])
- except (ValueError, KeyError, TypeError):
- return None
-
-
-def _prepare_download(
- resp: Response,
- link: Link,
- progress_bar: str,
-) -> Iterable[bytes]:
- total_length = _get_http_response_size(resp)
-
- if link.netloc == PyPI.file_storage_domain:
- url = link.show_url
- else:
- url = link.url_without_fragment
-
- logged_url = redact_auth_from_url(url)
-
- if total_length:
- logged_url = "{} ({})".format(logged_url, format_size(total_length))
-
- if is_from_cache(resp):
- logger.info("Using cached %s", logged_url)
- else:
- logger.info("Downloading %s", logged_url)
-
- if logger.getEffectiveLevel() > logging.INFO:
- show_progress = False
- elif is_from_cache(resp):
- show_progress = False
- elif not total_length:
- show_progress = True
- elif total_length > (40 * 1000):
- show_progress = True
- else:
- show_progress = False
-
- chunks = response_chunks(resp, CONTENT_CHUNK_SIZE)
-
- if not show_progress:
- return chunks
-
- renderer = get_download_progress_renderer(bar_type=progress_bar, size=total_length)
- return renderer(chunks)
-
-
-def sanitize_content_filename(filename: str) -> str:
- """
- Sanitize the "filename" value from a Content-Disposition header.
- """
- return os.path.basename(filename)
-
-
-def parse_content_disposition(content_disposition: str, default_filename: str) -> str:
- """
- Parse the "filename" value from a Content-Disposition header, and
- return the default filename if the result is empty.
- """
- _type, params = cgi.parse_header(content_disposition)
- filename = params.get("filename")
- if filename:
- # We need to sanitize the filename to prevent directory traversal
- # in case the filename contains ".." path parts.
- filename = sanitize_content_filename(filename)
- return filename or default_filename
-
-
-def _get_http_response_filename(resp: Response, link: Link) -> str:
- """Get an ideal filename from the given HTTP response, falling back to
- the link filename if not provided.
- """
- filename = link.filename # fallback
- # Have a look at the Content-Disposition header for a better guess
- content_disposition = resp.headers.get("content-disposition")
- if content_disposition:
- filename = parse_content_disposition(content_disposition, filename)
- ext: Optional[str] = splitext(filename)[1]
- if not ext:
- ext = mimetypes.guess_extension(resp.headers.get("content-type", ""))
- if ext:
- filename += ext
- if not ext and link.url != resp.url:
- ext = os.path.splitext(resp.url)[1]
- if ext:
- filename += ext
- return filename
-
-
-def _http_get_download(session: PipSession, link: Link) -> Response:
- target_url = link.url.split("#", 1)[0]
- resp = session.get(target_url, headers=HEADERS, stream=True)
- raise_for_status(resp)
- return resp
-
-
-class Downloader:
- def __init__(
- self,
- session: PipSession,
- progress_bar: str,
- ) -> None:
- self._session = session
- self._progress_bar = progress_bar
-
- def __call__(self, link: Link, location: str) -> Tuple[str, str]:
- """Download the file given by link into location."""
- try:
- resp = _http_get_download(self._session, link)
- except NetworkConnectionError as e:
- assert e.response is not None
- logger.critical(
- "HTTP error %s while getting %s", e.response.status_code, link
- )
- raise
-
- filename = _get_http_response_filename(resp, link)
- filepath = os.path.join(location, filename)
-
- chunks = _prepare_download(resp, link, self._progress_bar)
- with open(filepath, "wb") as content_file:
- for chunk in chunks:
- content_file.write(chunk)
- content_type = resp.headers.get("Content-Type", "")
- return filepath, content_type
-
-
-class BatchDownloader:
- def __init__(
- self,
- session: PipSession,
- progress_bar: str,
- ) -> None:
- self._session = session
- self._progress_bar = progress_bar
-
- def __call__(
- self, links: Iterable[Link], location: str
- ) -> Iterable[Tuple[Link, Tuple[str, str]]]:
- """Download the files given by links into location."""
- for link in links:
- try:
- resp = _http_get_download(self._session, link)
- except NetworkConnectionError as e:
- assert e.response is not None
- logger.critical(
- "HTTP error %s while getting %s",
- e.response.status_code,
- link,
- )
- raise
-
- filename = _get_http_response_filename(resp, link)
- filepath = os.path.join(location, filename)
-
- chunks = _prepare_download(resp, link, self._progress_bar)
- with open(filepath, "wb") as content_file:
- for chunk in chunks:
- content_file.write(chunk)
- content_type = resp.headers.get("Content-Type", "")
- yield link, (filepath, content_type)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/lazy_wheel.py b/env/lib/python3.9/site-packages/pip/_internal/network/lazy_wheel.py
deleted file mode 100644
index b0de535..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/lazy_wheel.py
+++ /dev/null
@@ -1,210 +0,0 @@
-"""Lazy ZIP over HTTP"""
-
-__all__ = ["HTTPRangeRequestUnsupported", "dist_from_wheel_url"]
-
-from bisect import bisect_left, bisect_right
-from contextlib import contextmanager
-from tempfile import NamedTemporaryFile
-from typing import Any, Dict, Generator, List, Optional, Tuple
-from zipfile import BadZipfile, ZipFile
-
-from pip._vendor.packaging.utils import canonicalize_name
-from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response
-
-from pip._internal.metadata import BaseDistribution, MemoryWheel, get_wheel_distribution
-from pip._internal.network.session import PipSession
-from pip._internal.network.utils import HEADERS, raise_for_status, response_chunks
-
-
-class HTTPRangeRequestUnsupported(Exception):
- pass
-
-
-def dist_from_wheel_url(name: str, url: str, session: PipSession) -> BaseDistribution:
- """Return a distribution object from the given wheel URL.
-
- This uses HTTP range requests to fetch only the portion of the wheel
- containing metadata, just enough for the object to be constructed.
- If such requests are not supported, HTTPRangeRequestUnsupported
- is raised.
- """
- with LazyZipOverHTTP(url, session) as zf:
- # For read-only ZIP files, ZipFile only needs methods read,
- # seek, seekable and tell, not the whole IO protocol.
- wheel = MemoryWheel(zf.name, zf) # type: ignore
- # After the context manager exits, wheel.name
- # intentionally no longer points to a valid file.
- return get_wheel_distribution(wheel, canonicalize_name(name))
-
-
-class LazyZipOverHTTP:
- """File-like object mapped to a ZIP file over HTTP.
-
- This uses HTTP range requests to lazily fetch the file's content,
- which is supposed to be fed to ZipFile. If such requests are not
- supported by the server, raise HTTPRangeRequestUnsupported
- during initialization.
- """
-
- def __init__(
- self, url: str, session: PipSession, chunk_size: int = CONTENT_CHUNK_SIZE
- ) -> None:
- head = session.head(url, headers=HEADERS)
- raise_for_status(head)
- assert head.status_code == 200
- self._session, self._url, self._chunk_size = session, url, chunk_size
- self._length = int(head.headers["Content-Length"])
- self._file = NamedTemporaryFile()
- self.truncate(self._length)
- self._left: List[int] = []
- self._right: List[int] = []
- if "bytes" not in head.headers.get("Accept-Ranges", "none"):
- raise HTTPRangeRequestUnsupported("range request is not supported")
- self._check_zip()
-
- @property
- def mode(self) -> str:
- """Opening mode, which is always rb."""
- return "rb"
-
- @property
- def name(self) -> str:
- """Path to the underlying file."""
- return self._file.name
-
- def seekable(self) -> bool:
- """Return whether random access is supported, which is True."""
- return True
-
- def close(self) -> None:
- """Close the file."""
- self._file.close()
-
- @property
- def closed(self) -> bool:
- """Whether the file is closed."""
- return self._file.closed
-
- def read(self, size: int = -1) -> bytes:
- """Read up to size bytes from the object and return them.
-
- As a convenience, if size is unspecified or -1,
- all bytes until EOF are returned. Fewer than
- size bytes may be returned if EOF is reached.
- """
- download_size = max(size, self._chunk_size)
- start, length = self.tell(), self._length
- stop = length if size < 0 else min(start + download_size, length)
- start = max(0, stop - download_size)
- self._download(start, stop - 1)
- return self._file.read(size)
-
- def readable(self) -> bool:
- """Return whether the file is readable, which is True."""
- return True
-
- def seek(self, offset: int, whence: int = 0) -> int:
- """Change stream position and return the new absolute position.
-
- Seek to offset, relative to the position indicated by whence:
- * 0: Start of stream (the default); offset should be >= 0.
- * 1: Current position; offset may be negative.
- * 2: End of stream; offset is usually negative.
- """
- return self._file.seek(offset, whence)
-
- def tell(self) -> int:
- """Return the current position."""
- return self._file.tell()
-
- def truncate(self, size: Optional[int] = None) -> int:
- """Resize the stream to the given size in bytes.
-
- If size is unspecified resize to the current position.
- The current stream position isn't changed.
-
- Return the new file size.
- """
- return self._file.truncate(size)
-
- def writable(self) -> bool:
- """Return False."""
- return False
-
- def __enter__(self) -> "LazyZipOverHTTP":
- self._file.__enter__()
- return self
-
- def __exit__(self, *exc: Any) -> Optional[bool]:
- return self._file.__exit__(*exc)
-
- @contextmanager
- def _stay(self) -> Generator[None, None, None]:
- """Return a context manager keeping the position.
-
- At the end of the block, seek back to original position.
- """
- pos = self.tell()
- try:
- yield
- finally:
- self.seek(pos)
-
- def _check_zip(self) -> None:
- """Check and download until the file is a valid ZIP."""
- end = self._length - 1
- for start in reversed(range(0, end, self._chunk_size)):
- self._download(start, end)
- with self._stay():
- try:
- # For read-only ZIP files, ZipFile only needs
- # methods read, seek, seekable and tell.
- ZipFile(self) # type: ignore
- except BadZipfile:
- pass
- else:
- break
-
- def _stream_response(
- self, start: int, end: int, base_headers: Dict[str, str] = HEADERS
- ) -> Response:
- """Return HTTP response to a range request from start to end."""
- headers = base_headers.copy()
- headers["Range"] = f"bytes={start}-{end}"
- # TODO: Get range requests to be correctly cached
- headers["Cache-Control"] = "no-cache"
- return self._session.get(self._url, headers=headers, stream=True)
-
- def _merge(
- self, start: int, end: int, left: int, right: int
- ) -> Generator[Tuple[int, int], None, None]:
- """Return a generator of intervals to be fetched.
-
- Args:
- start (int): Start of needed interval
- end (int): End of needed interval
- left (int): Index of first overlapping downloaded data
- right (int): Index after last overlapping downloaded data
- """
- lslice, rslice = self._left[left:right], self._right[left:right]
- i = start = min([start] + lslice[:1])
- end = max([end] + rslice[-1:])
- for j, k in zip(lslice, rslice):
- if j > i:
- yield i, j - 1
- i = k + 1
- if i <= end:
- yield i, end
- self._left[left:right], self._right[left:right] = [start], [end]
-
- def _download(self, start: int, end: int) -> None:
- """Download bytes from start to end inclusively."""
- with self._stay():
- left = bisect_left(self._right, start)
- right = bisect_right(self._left, end)
- for start, end in self._merge(start, end, left, right):
- response = self._stream_response(start, end)
- response.raise_for_status()
- self.seek(start)
- for chunk in response_chunks(response, self._chunk_size):
- self._file.write(chunk)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/session.py b/env/lib/python3.9/site-packages/pip/_internal/network/session.py
deleted file mode 100644
index e2c8582..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/session.py
+++ /dev/null
@@ -1,456 +0,0 @@
-"""PipSession and supporting code, containing all pip-specific
-network request configuration and behavior.
-"""
-
-import email.utils
-import io
-import ipaddress
-import json
-import logging
-import mimetypes
-import os
-import platform
-import shutil
-import subprocess
-import sys
-import urllib.parse
-import warnings
-from typing import Any, Dict, Generator, List, Mapping, Optional, Sequence, Tuple, Union
-
-from pip._vendor import requests, urllib3
-from pip._vendor.cachecontrol import CacheControlAdapter
-from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter
-from pip._vendor.requests.models import PreparedRequest, Response
-from pip._vendor.requests.structures import CaseInsensitiveDict
-from pip._vendor.urllib3.connectionpool import ConnectionPool
-from pip._vendor.urllib3.exceptions import InsecureRequestWarning
-
-from pip import __version__
-from pip._internal.metadata import get_default_environment
-from pip._internal.models.link import Link
-from pip._internal.network.auth import MultiDomainBasicAuth
-from pip._internal.network.cache import SafeFileCache
-
-# Import ssl from compat so the initial import occurs in only one place.
-from pip._internal.utils.compat import has_tls
-from pip._internal.utils.glibc import libc_ver
-from pip._internal.utils.misc import build_url_from_netloc, parse_netloc
-from pip._internal.utils.urls import url_to_path
-
-logger = logging.getLogger(__name__)
-
-SecureOrigin = Tuple[str, str, Optional[Union[int, str]]]
-
-
-# Ignore warning raised when using --trusted-host.
-warnings.filterwarnings("ignore", category=InsecureRequestWarning)
-
-
-SECURE_ORIGINS: List[SecureOrigin] = [
- # protocol, hostname, port
- # Taken from Chrome's list of secure origins (See: http://bit.ly/1qrySKC)
- ("https", "*", "*"),
- ("*", "localhost", "*"),
- ("*", "127.0.0.0/8", "*"),
- ("*", "::1/128", "*"),
- ("file", "*", None),
- # ssh is always secure.
- ("ssh", "*", "*"),
-]
-
-
-# These are environment variables present when running under various
-# CI systems. For each variable, some CI systems that use the variable
-# are indicated. The collection was chosen so that for each of a number
-# of popular systems, at least one of the environment variables is used.
-# This list is used to provide some indication of and lower bound for
-# CI traffic to PyPI. Thus, it is okay if the list is not comprehensive.
-# For more background, see: https://github.com/pypa/pip/issues/5499
-CI_ENVIRONMENT_VARIABLES = (
- # Azure Pipelines
- "BUILD_BUILDID",
- # Jenkins
- "BUILD_ID",
- # AppVeyor, CircleCI, Codeship, Gitlab CI, Shippable, Travis CI
- "CI",
- # Explicit environment variable.
- "PIP_IS_CI",
-)
-
-
-def looks_like_ci() -> bool:
- """
- Return whether it looks like pip is running under CI.
- """
- # We don't use the method of checking for a tty (e.g. using isatty())
- # because some CI systems mimic a tty (e.g. Travis CI). Thus that
- # method doesn't provide definitive information in either direction.
- return any(name in os.environ for name in CI_ENVIRONMENT_VARIABLES)
-
-
-def user_agent() -> str:
- """
- Return a string representing the user agent.
- """
- data: Dict[str, Any] = {
- "installer": {"name": "pip", "version": __version__},
- "python": platform.python_version(),
- "implementation": {
- "name": platform.python_implementation(),
- },
- }
-
- if data["implementation"]["name"] == "CPython":
- data["implementation"]["version"] = platform.python_version()
- elif data["implementation"]["name"] == "PyPy":
- pypy_version_info = sys.pypy_version_info # type: ignore
- if pypy_version_info.releaselevel == "final":
- pypy_version_info = pypy_version_info[:3]
- data["implementation"]["version"] = ".".join(
- [str(x) for x in pypy_version_info]
- )
- elif data["implementation"]["name"] == "Jython":
- # Complete Guess
- data["implementation"]["version"] = platform.python_version()
- elif data["implementation"]["name"] == "IronPython":
- # Complete Guess
- data["implementation"]["version"] = platform.python_version()
-
- if sys.platform.startswith("linux"):
- from pip._vendor import distro
-
- linux_distribution = distro.name(), distro.version(), distro.codename()
- distro_infos: Dict[str, Any] = dict(
- filter(
- lambda x: x[1],
- zip(["name", "version", "id"], linux_distribution),
- )
- )
- libc = dict(
- filter(
- lambda x: x[1],
- zip(["lib", "version"], libc_ver()),
- )
- )
- if libc:
- distro_infos["libc"] = libc
- if distro_infos:
- data["distro"] = distro_infos
-
- if sys.platform.startswith("darwin") and platform.mac_ver()[0]:
- data["distro"] = {"name": "macOS", "version": platform.mac_ver()[0]}
-
- if platform.system():
- data.setdefault("system", {})["name"] = platform.system()
-
- if platform.release():
- data.setdefault("system", {})["release"] = platform.release()
-
- if platform.machine():
- data["cpu"] = platform.machine()
-
- if has_tls():
- import _ssl as ssl
-
- data["openssl_version"] = ssl.OPENSSL_VERSION
-
- setuptools_dist = get_default_environment().get_distribution("setuptools")
- if setuptools_dist is not None:
- data["setuptools_version"] = str(setuptools_dist.version)
-
- if shutil.which("rustc") is not None:
- # If for any reason `rustc --version` fails, silently ignore it
- try:
- rustc_output = subprocess.check_output(
- ["rustc", "--version"], stderr=subprocess.STDOUT, timeout=0.5
- )
- except Exception:
- pass
- else:
- if rustc_output.startswith(b"rustc "):
- # The format of `rustc --version` is:
- # `b'rustc 1.52.1 (9bc8c42bb 2021-05-09)\n'`
- # We extract just the middle (1.52.1) part
- data["rustc_version"] = rustc_output.split(b" ")[1].decode()
-
- # Use None rather than False so as not to give the impression that
- # pip knows it is not being run under CI. Rather, it is a null or
- # inconclusive result. Also, we include some value rather than no
- # value to make it easier to know that the check has been run.
- data["ci"] = True if looks_like_ci() else None
-
- user_data = os.environ.get("PIP_USER_AGENT_USER_DATA")
- if user_data is not None:
- data["user_data"] = user_data
-
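- # Produces e.g. 'pip/<version> {"ci":null,...}' where the payload is
- # compact, key-sorted JSON.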
- return "{data[installer][name]}/{data[installer][version]} {json}".format(
- data=data,
- json=json.dumps(data, separators=(",", ":"), sort_keys=True),
- )
-
-
-class LocalFSAdapter(BaseAdapter):
- def send(
- self,
- request: PreparedRequest,
- stream: bool = False,
- timeout: Optional[Union[float, Tuple[float, float]]] = None,
- verify: Union[bool, str] = True,
- cert: Optional[Union[str, Tuple[str, str]]] = None,
- proxies: Optional[Mapping[str, str]] = None,
- ) -> Response:
- pathname = url_to_path(request.url)
-
- resp = Response()
- resp.status_code = 200
- resp.url = request.url
-
- try:
- stats = os.stat(pathname)
- except OSError as exc:
- # wrap the exception message in an io.BytesIO object
- # so the response carries a useful error message:
- resp.status_code = 404
- resp.reason = type(exc).__name__
- resp.raw = io.BytesIO(f"{resp.reason}: {exc}".encode("utf8"))
- else:
- modified = email.utils.formatdate(stats.st_mtime, usegmt=True)
- content_type = mimetypes.guess_type(pathname)[0] or "text/plain"
- resp.headers = CaseInsensitiveDict(
- {
- "Content-Type": content_type,
- "Content-Length": stats.st_size,
- "Last-Modified": modified,
- }
- )
-
- resp.raw = open(pathname, "rb")
- resp.close = resp.raw.close
-
- return resp
-
- def close(self) -> None:
- pass
-
-
-class InsecureHTTPAdapter(HTTPAdapter):
- def cert_verify(
- self,
- conn: ConnectionPool,
- url: str,
- verify: Union[bool, str],
- cert: Optional[Union[str, Tuple[str, str]]],
- ) -> None:
- super().cert_verify(conn=conn, url=url, verify=False, cert=cert)
-
-
-class InsecureCacheControlAdapter(CacheControlAdapter):
- def cert_verify(
- self,
- conn: ConnectionPool,
- url: str,
- verify: Union[bool, str],
- cert: Optional[Union[str, Tuple[str, str]]],
- ) -> None:
- super().cert_verify(conn=conn, url=url, verify=False, cert=cert)
-
-
-class PipSession(requests.Session):
-
- timeout: Optional[int] = None
-
- def __init__(
- self,
- *args: Any,
- retries: int = 0,
- cache: Optional[str] = None,
- trusted_hosts: Sequence[str] = (),
- index_urls: Optional[List[str]] = None,
- **kwargs: Any,
- ) -> None:
- """
- :param trusted_hosts: Domains not to emit warnings for when not using
- HTTPS.
- """
- super().__init__(*args, **kwargs)
-
- # Namespace the attribute with "pip_" just in case to prevent
- # possible conflicts with the base class.
- self.pip_trusted_origins: List[Tuple[str, Optional[int]]] = []
-
- # Attach our User Agent to the request
- self.headers["User-Agent"] = user_agent()
-
- # Attach our Authentication handler to the session
- self.auth = MultiDomainBasicAuth(index_urls=index_urls)
-
- # Create our urllib3.Retry instance which will allow us to customize
- # how we handle retries.
- retries = urllib3.Retry(
- # Set the total number of retries that a particular request can
- # have.
- total=retries,
- # A 503 error from PyPI typically means that the Fastly -> Origin
- # connection got interrupted in some way. A 503 error in general
- # is typically considered a transient error so we'll go ahead and
- # retry it.
- # A 500 may indicate a transient error from Amazon S3.
- # A 520 or 527 may indicate a transient error from CloudFlare.
- status_forcelist=[500, 503, 520, 527],
- # Add a small amount of back off between failed requests in
- # order to prevent hammering the service.
- backoff_factor=0.25,
- ) # type: ignore
-
- # Our InsecureHTTPAdapter disables HTTPS validation. It does not
- # support caching, so we'll use it for all http:// URLs.
- # If caching is disabled, we will also use it for https:// hosts
- # that we've marked as trusted (i.e. hosts whose TLS errors we
- # ignore).
- insecure_adapter = InsecureHTTPAdapter(max_retries=retries)
-
- # We want to _only_ cache responses on securely fetched origins or when
- # the host is specified as trusted. We do this because
- # we can't validate the response of an insecurely/untrusted fetched
- # origin, and we don't want someone to be able to poison the cache and
- # require manual eviction from the cache to fix it.
- if cache:
- secure_adapter = CacheControlAdapter(
- cache=SafeFileCache(cache),
- max_retries=retries,
- )
- self._trusted_host_adapter = InsecureCacheControlAdapter(
- cache=SafeFileCache(cache),
- max_retries=retries,
- )
- else:
- secure_adapter = HTTPAdapter(max_retries=retries)
- self._trusted_host_adapter = insecure_adapter
-
- self.mount("https://", secure_adapter)
- self.mount("http://", insecure_adapter)
-
- # Enable file:// urls
- self.mount("file://", LocalFSAdapter())
-
- for host in trusted_hosts:
- self.add_trusted_host(host, suppress_logging=True)
-
- def update_index_urls(self, new_index_urls: List[str]) -> None:
- """
- :param new_index_urls: New index urls to update the authentication
- handler with.
- """
- self.auth.index_urls = new_index_urls
-
- def add_trusted_host(
- self, host: str, source: Optional[str] = None, suppress_logging: bool = False
- ) -> None:
- """
- :param host: It is okay to provide a host that has previously been
- added.
- :param source: An optional source string, for logging where the host
- string came from.
- """
- if not suppress_logging:
- msg = f"adding trusted host: {host!r}"
- if source is not None:
- msg += f" (from {source})"
- logger.info(msg)
-
- host_port = parse_netloc(host)
- if host_port not in self.pip_trusted_origins:
- self.pip_trusted_origins.append(host_port)
-
- self.mount(
- build_url_from_netloc(host, scheme="http") + "/", self._trusted_host_adapter
- )
- self.mount(build_url_from_netloc(host) + "/", self._trusted_host_adapter)
- if not host_port[1]:
- self.mount(
- build_url_from_netloc(host, scheme="http") + ":",
- self._trusted_host_adapter,
- )
- # Mount wildcard ports for the same host.
- self.mount(build_url_from_netloc(host) + ":", self._trusted_host_adapter)
-
- def iter_secure_origins(self) -> Generator[SecureOrigin, None, None]:
- yield from SECURE_ORIGINS
- for host, port in self.pip_trusted_origins:
- yield ("*", host, "*" if port is None else port)
-
- def is_secure_origin(self, location: Link) -> bool:
- # Determine if this url used a secure transport mechanism
- parsed = urllib.parse.urlparse(str(location))
- origin_protocol, origin_host, origin_port = (
- parsed.scheme,
- parsed.hostname,
- parsed.port,
- )
-
- # Determine the protocol to match against. Don't count the
- # repository type as part of the protocol: in cases such as
- # "git+ssh", only use "ssh" (i.e. only verify against the last
- # scheme).
- origin_protocol = origin_protocol.rsplit("+", 1)[-1]
-
- # Determine if our origin is a secure origin by looking through our
- # hardcoded list of secure origins, as well as any additional ones
- # configured on this PackageFinder instance.
- for secure_origin in self.iter_secure_origins():
- secure_protocol, secure_host, secure_port = secure_origin
- if origin_protocol != secure_protocol and secure_protocol != "*":
- continue
-
- try:
- addr = ipaddress.ip_address(origin_host)
- network = ipaddress.ip_network(secure_host)
- except ValueError:
- # We don't have both a valid address and a valid network, so
- # we'll check this origin against hostnames.
- if (
- origin_host
- and origin_host.lower() != secure_host.lower()
- and secure_host != "*"
- ):
- continue
- else:
- # We have a valid address and network, so see if the address
- # is contained within the network.
- if addr not in network:
- continue
-
- # Check to see if the port matches.
- if (
- origin_port != secure_port
- and secure_port != "*"
- and secure_port is not None
- ):
- continue
-
- # If we've gotten here, then this origin matches the current
- # secure origin and we should return True
- return True
-
- # If we've gotten to this point, then the origin isn't secure and we
- # will not accept it as a valid location to search. We will however
- # log a warning that we are ignoring it.
- logger.warning(
- "The repository located at %s is not a trusted or secure host and "
- "is being ignored. If this repository is available via HTTPS we "
- "recommend you use HTTPS instead, otherwise you may silence "
- "this warning and allow it anyway with '--trusted-host %s'.",
- origin_host,
- origin_host,
- )
-
- return False
-
- def request(self, method: str, url: str, *args: Any, **kwargs: Any) -> Response:
- # Allow setting a default timeout on a session
- kwargs.setdefault("timeout", self.timeout)
- # Allow setting a default proxies on a session
- kwargs.setdefault("proxies", self.proxies)
-
- # Dispatch the actual request
- return super().request(method, url, *args, **kwargs)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/utils.py b/env/lib/python3.9/site-packages/pip/_internal/network/utils.py
deleted file mode 100644
index 134848a..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/utils.py
+++ /dev/null
@@ -1,96 +0,0 @@
-from typing import Dict, Generator
-
-from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response
-
-from pip._internal.exceptions import NetworkConnectionError
-
-# The following comments and HTTP headers were originally added by
-# Donald Stufft in git commit 22c562429a61bb77172039e480873fb239dd8c03.
-#
-# We use Accept-Encoding: identity here because requests defaults to
-# accepting compressed responses. This breaks in a variety of ways
-# depending on how the server is configured.
-# - Some servers will notice that the file isn't a compressible file
-# and will leave the file alone and with an empty Content-Encoding
-# - Some servers will notice that the file is already compressed and
-# will leave the file alone, adding a Content-Encoding: gzip header
-# - Some servers won't notice anything at all and will take a file
-# that's already been compressed and compress it again, and set
-# the Content-Encoding: gzip header
- # By requesting only the identity encoding, we hope to eliminate the
- # third case. Hopefully no server exists that, on seeing an already
- # compressed file requested without compression, decompresses it
- # before sending; if one does, this can probably never be made to
- # work.
-HEADERS: Dict[str, str] = {"Accept-Encoding": "identity"}
-
-
-def raise_for_status(resp: Response) -> None:
- http_error_msg = ""
- if isinstance(resp.reason, bytes):
- # We attempt to decode utf-8 first because some servers
- # choose to localize their reason strings. If the string
- # isn't utf-8, we fall back to iso-8859-1 for all other
- # encodings.
- try:
- reason = resp.reason.decode("utf-8")
- except UnicodeDecodeError:
- reason = resp.reason.decode("iso-8859-1")
- else:
- reason = resp.reason
-
- if 400 <= resp.status_code < 500:
- http_error_msg = (
- f"{resp.status_code} Client Error: {reason} for url: {resp.url}"
- )
-
- elif 500 <= resp.status_code < 600:
- http_error_msg = (
- f"{resp.status_code} Server Error: {reason} for url: {resp.url}"
- )
-
- if http_error_msg:
- raise NetworkConnectionError(http_error_msg, response=resp)
-
-
-def response_chunks(
- response: Response, chunk_size: int = CONTENT_CHUNK_SIZE
-) -> Generator[bytes, None, None]:
- """Given a requests Response, provide the data chunks."""
- try:
- # Special case for urllib3.
- for chunk in response.raw.stream(
- chunk_size,
- # We use decode_content=False here because we don't
- # want urllib3 to mess with the raw bytes we get
- # from the server. If we decompress inside of
- # urllib3 then we cannot verify the checksum
- # because the checksum will be of the compressed
- # file. This breakage will only occur if the
- # server adds a Content-Encoding header, which
- # depends on how the server was configured:
- # - Some servers will notice that the file isn't a
- # compressible file and will leave the file alone
- # and with an empty Content-Encoding
- # - Some servers will notice that the file is
- # already compressed and will leave the file
- # alone and will add a Content-Encoding: gzip
- # header
- # - Some servers won't notice anything at all and
- # will take a file that's already been compressed
- # and compress it again and set the
- # Content-Encoding: gzip header
- #
- # By setting this not to decode automatically we
- # hope to eliminate problems with the second case.
- decode_content=False,
- ):
- yield chunk
- except AttributeError:
- # Standard file-like object.
- while True:
- chunk = response.raw.read(chunk_size)
- if not chunk:
- break
- yield chunk
diff --git a/env/lib/python3.9/site-packages/pip/_internal/network/xmlrpc.py b/env/lib/python3.9/site-packages/pip/_internal/network/xmlrpc.py
deleted file mode 100644
index 4a7d55d..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/network/xmlrpc.py
+++ /dev/null
@@ -1,60 +0,0 @@
-"""xmlrpclib.Transport implementation
-"""
-
-import logging
-import urllib.parse
-import xmlrpc.client
-from typing import TYPE_CHECKING, Tuple
-
-from pip._internal.exceptions import NetworkConnectionError
-from pip._internal.network.session import PipSession
-from pip._internal.network.utils import raise_for_status
-
-if TYPE_CHECKING:
- from xmlrpc.client import _HostType, _Marshallable
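- # Write to a temporary file next to the target, then replace it
- # atomically so readers never see a partially-written entry.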
-
-logger = logging.getLogger(__name__)
-
-
-class PipXmlrpcTransport(xmlrpc.client.Transport):
- """Provide a `xmlrpclib.Transport` implementation via a `PipSession`
- object.
- """
-
- def __init__(
- self, index_url: str, session: PipSession, use_datetime: bool = False
- ) -> None:
- super().__init__(use_datetime)
- index_parts = urllib.parse.urlparse(index_url)
- self._scheme = index_parts.scheme
- self._session = session
-
- def request(
- self,
- host: "_HostType",
- handler: str,
- request_body: bytes,
- verbose: bool = False,
- ) -> Tuple["_Marshallable", ...]:
- assert isinstance(host, str)
- parts = (self._scheme, host, handler, None, None, None)
- url = urllib.parse.urlunparse(parts)
- try:
- headers = {"Content-Type": "text/xml"}
- response = self._session.post(
- url,
- data=request_body,
- headers=headers,
- stream=True,
- )
- raise_for_status(response)
- self.verbose = verbose
- return self.parse_response(response.raw)
- except NetworkConnectionError as exc:
- assert exc.response
- logger.critical(
- "HTTP error %s while getting %s",
- exc.response.status_code,
- url,
- )
- raise
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/operations/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/build_tracker.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/build_tracker.py
deleted file mode 100644
index 6621549..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/build/build_tracker.py
+++ /dev/null
@@ -1,124 +0,0 @@
-import contextlib
-import hashlib
-import logging
-import os
-from types import TracebackType
-from typing import Dict, Generator, Optional, Set, Type, Union
-
-from pip._internal.models.link import Link
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.utils.temp_dir import TempDirectory
-
-logger = logging.getLogger(__name__)
-
-
-@contextlib.contextmanager
-def update_env_context_manager(**changes: str) -> Generator[None, None, None]:
- target = os.environ
-
- # Save values from the target and change them.
- non_existent_marker = object()
- saved_values: Dict[str, Union[object, str]] = {}
- for name, new_value in changes.items():
- try:
- saved_values[name] = target[name]
- except KeyError:
- saved_values[name] = non_existent_marker
- target[name] = new_value
-
- try:
- yield
- finally:
- # Restore original values in the target.
- for name, original_value in saved_values.items():
- if original_value is non_existent_marker:
- del target[name]
- else:
- assert isinstance(original_value, str) # for mypy
- target[name] = original_value
-
-
-@contextlib.contextmanager
-def get_build_tracker() -> Generator["BuildTracker", None, None]:
- root = os.environ.get("PIP_BUILD_TRACKER")
- with contextlib.ExitStack() as ctx:
- if root is None:
- root = ctx.enter_context(TempDirectory(kind="build-tracker")).path
- ctx.enter_context(update_env_context_manager(PIP_BUILD_TRACKER=root))
- logger.debug("Initialized build tracking at %s", root)
-
- with BuildTracker(root) as tracker:
- yield tracker
-
-
-class BuildTracker:
- def __init__(self, root: str) -> None:
- self._root = root
- self._entries: Set[InstallRequirement] = set()
- logger.debug("Created build tracker: %s", self._root)
-
- def __enter__(self) -> "BuildTracker":
- logger.debug("Entered build tracker: %s", self._root)
- return self
-
- def __exit__(
- self,
- exc_type: Optional[Type[BaseException]],
- exc_val: Optional[BaseException],
- exc_tb: Optional[TracebackType],
- ) -> None:
- self.cleanup()
-
- def _entry_path(self, link: Link) -> str:
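- # One entry file per requirement, named by a hash of its link URL.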
- hashed = hashlib.sha224(link.url_without_fragment.encode()).hexdigest()
- return os.path.join(self._root, hashed)
-
- def add(self, req: InstallRequirement) -> None:
- """Add an InstallRequirement to build tracking."""
-
- assert req.link
- # Get the file to write information about this requirement.
- entry_path = self._entry_path(req.link)
-
- # Try reading from the file. If it exists and can be read from, a build
- # is already in progress, so a LookupError is raised.
- try:
- with open(entry_path) as fp:
- contents = fp.read()
- except FileNotFoundError:
- pass
- else:
- message = "{} is already being built: {}".format(req.link, contents)
- raise LookupError(message)
-
- # If we're here, no build for req should already be in progress.
- assert req not in self._entries
-
- # Start tracking this requirement.
- with open(entry_path, "w", encoding="utf-8") as fp:
- fp.write(str(req))
- self._entries.add(req)
-
- logger.debug("Added %s to build tracker %r", req, self._root)
-
- def remove(self, req: InstallRequirement) -> None:
- """Remove an InstallRequirement from build tracking."""
-
- assert req.link
- # Delete the created file and the corresponding entries.
- os.unlink(self._entry_path(req.link))
- self._entries.remove(req)
-
- logger.debug("Removed %s from build tracker %r", req, self._root)
-
- def cleanup(self) -> None:
- for req in set(self._entries):
- self.remove(req)
-
- logger.debug("Removed build tracker: %r", self._root)
-
- @contextlib.contextmanager
- def track(self, req: InstallRequirement) -> Generator[None, None, None]:
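- # No try/finally here: if the block raises, cleanup() removes any
- # remaining entries when the tracker itself exits.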
- self.add(req)
- yield
- self.remove(req)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata.py
deleted file mode 100644
index e2b7b44..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata.py
+++ /dev/null
@@ -1,39 +0,0 @@
-"""Metadata generation logic for source distributions.
-"""
-
-import os
-
-from pip._vendor.pep517.wrappers import Pep517HookCaller
-
-from pip._internal.build_env import BuildEnvironment
-from pip._internal.exceptions import (
- InstallationSubprocessError,
- MetadataGenerationFailed,
-)
-from pip._internal.utils.subprocess import runner_with_spinner_message
-from pip._internal.utils.temp_dir import TempDirectory
-
-
-def generate_metadata(
- build_env: BuildEnvironment, backend: Pep517HookCaller, details: str
-) -> str:
- """Generate metadata using mechanisms described in PEP 517.
-
- Returns the generated metadata directory.
- """
- metadata_tmpdir = TempDirectory(kind="modern-metadata", globally_managed=True)
-
- metadata_dir = metadata_tmpdir.path
-
- with build_env:
- # Note that Pep517HookCaller implements a fallback for
- # prepare_metadata_for_build_wheel, so we don't have to
- # consider the possibility that this hook doesn't exist.
- runner = runner_with_spinner_message("Preparing metadata (pyproject.toml)")
- with backend.subprocess_runner(runner):
- try:
- distinfo_dir = backend.prepare_metadata_for_build_wheel(metadata_dir)
- except InstallationSubprocessError as error:
- raise MetadataGenerationFailed(package_details=details) from error
-
- return os.path.join(metadata_dir, distinfo_dir)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata_editable.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata_editable.py
deleted file mode 100644
index 4c3f48b..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata_editable.py
+++ /dev/null
@@ -1,41 +0,0 @@
-"""Metadata generation logic for source distributions.
-"""
-
-import os
-
-from pip._vendor.pep517.wrappers import Pep517HookCaller
-
-from pip._internal.build_env import BuildEnvironment
-from pip._internal.exceptions import (
- InstallationSubprocessError,
- MetadataGenerationFailed,
-)
-from pip._internal.utils.subprocess import runner_with_spinner_message
-from pip._internal.utils.temp_dir import TempDirectory
-
-
-def generate_editable_metadata(
- build_env: BuildEnvironment, backend: Pep517HookCaller, details: str
-) -> str:
- """Generate metadata using mechanisms described in PEP 660.
-
- Returns the generated metadata directory.
- """
- metadata_tmpdir = TempDirectory(kind="modern-metadata", globally_managed=True)
-
- metadata_dir = metadata_tmpdir.path
-
- with build_env:
- # Note that Pep517HookCaller implements a fallback for
- # prepare_metadata_for_build_wheel/editable, so we don't have to
- # consider the possibility that this hook doesn't exist.
- runner = runner_with_spinner_message(
- "Preparing editable metadata (pyproject.toml)"
- )
- with backend.subprocess_runner(runner):
- try:
- distinfo_dir = backend.prepare_metadata_for_build_editable(metadata_dir)
- except InstallationSubprocessError as error:
- raise MetadataGenerationFailed(package_details=details) from error
-
- return os.path.join(metadata_dir, distinfo_dir)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata_legacy.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata_legacy.py
deleted file mode 100644
index e60988d..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/build/metadata_legacy.py
+++ /dev/null
@@ -1,74 +0,0 @@
-"""Metadata generation logic for legacy source distributions.
-"""
-
-import logging
-import os
-
-from pip._internal.build_env import BuildEnvironment
-from pip._internal.cli.spinners import open_spinner
-from pip._internal.exceptions import (
- InstallationError,
- InstallationSubprocessError,
- MetadataGenerationFailed,
-)
-from pip._internal.utils.setuptools_build import make_setuptools_egg_info_args
-from pip._internal.utils.subprocess import call_subprocess
-from pip._internal.utils.temp_dir import TempDirectory
-
-logger = logging.getLogger(__name__)
-
-
-def _find_egg_info(directory: str) -> str:
- """Find an .egg-info subdirectory in `directory`."""
- filenames = [f for f in os.listdir(directory) if f.endswith(".egg-info")]
-
- if not filenames:
- raise InstallationError(f"No .egg-info directory found in {directory}")
-
- if len(filenames) > 1:
- raise InstallationError(
- "More than one .egg-info directory found in {}".format(directory)
- )
-
- return os.path.join(directory, filenames[0])
-
-
-def generate_metadata(
- build_env: BuildEnvironment,
- setup_py_path: str,
- source_dir: str,
- isolated: bool,
- details: str,
-) -> str:
- """Generate metadata using setup.py-based defacto mechanisms.
-
- Returns the generated metadata directory.
- """
- logger.debug(
- "Running setup.py (path:%s) egg_info for package %s",
- setup_py_path,
- details,
- )
-
- egg_info_dir = TempDirectory(kind="pip-egg-info", globally_managed=True).path
-
- args = make_setuptools_egg_info_args(
- setup_py_path,
- egg_info_dir=egg_info_dir,
- no_user_config=isolated,
- )
-
- with build_env:
- with open_spinner("Preparing metadata (setup.py)") as spinner:
- try:
- call_subprocess(
- args,
- cwd=source_dir,
- command_desc="python setup.py egg_info",
- spinner=spinner,
- )
- except InstallationSubprocessError as error:
- raise MetadataGenerationFailed(package_details=details) from error
-
- # Return the .egg-info directory.
- return _find_egg_info(egg_info_dir)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel.py
deleted file mode 100644
index b0d2fc9..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel.py
+++ /dev/null
@@ -1,37 +0,0 @@
-import logging
-import os
-from typing import Optional
-
-from pip._vendor.pep517.wrappers import Pep517HookCaller
-
-from pip._internal.utils.subprocess import runner_with_spinner_message
-
-logger = logging.getLogger(__name__)
-
-
-def build_wheel_pep517(
- name: str,
- backend: Pep517HookCaller,
- metadata_directory: str,
- tempd: str,
-) -> Optional[str]:
- """Build one InstallRequirement using the PEP 517 build process.
-
- Returns path to wheel if successfully built. Otherwise, returns None.
- """
- assert metadata_directory is not None
- try:
- logger.debug("Destination directory: %s", tempd)
-
- runner = runner_with_spinner_message(
- f"Building wheel for {name} (pyproject.toml)"
- )
- with backend.subprocess_runner(runner):
- wheel_name = backend.build_wheel(
- tempd,
- metadata_directory=metadata_directory,
- )
- except Exception:
- logger.error("Failed building wheel for %s", name)
- return None
- return os.path.join(tempd, wheel_name)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel_editable.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel_editable.py
deleted file mode 100644
index cf7b01a..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel_editable.py
+++ /dev/null
@@ -1,46 +0,0 @@
-import logging
-import os
-from typing import Optional
-
-from pip._vendor.pep517.wrappers import HookMissing, Pep517HookCaller
-
-from pip._internal.utils.subprocess import runner_with_spinner_message
-
-logger = logging.getLogger(__name__)
-
-
-def build_wheel_editable(
- name: str,
- backend: Pep517HookCaller,
- metadata_directory: str,
- tempd: str,
-) -> Optional[str]:
- """Build one InstallRequirement using the PEP 660 build process.
-
- Returns path to wheel if successfully built. Otherwise, returns None.
- """
- assert metadata_directory is not None
- try:
- logger.debug("Destination directory: %s", tempd)
-
- runner = runner_with_spinner_message(
- f"Building editable for {name} (pyproject.toml)"
- )
- with backend.subprocess_runner(runner):
- try:
- wheel_name = backend.build_editable(
- tempd,
- metadata_directory=metadata_directory,
- )
- except HookMissing as e:
- logger.error(
- "Cannot build editable %s because the build "
- "backend does not have the %s hook",
- name,
- e,
- )
- return None
- except Exception:
- logger.error("Failed building editable for %s", name)
- return None
- return os.path.join(tempd, wheel_name)
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel_legacy.py b/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel_legacy.py
deleted file mode 100644
index c5f0492..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/build/wheel_legacy.py
+++ /dev/null
@@ -1,102 +0,0 @@
-import logging
-import os.path
-from typing import List, Optional
-
-from pip._internal.cli.spinners import open_spinner
-from pip._internal.utils.setuptools_build import make_setuptools_bdist_wheel_args
-from pip._internal.utils.subprocess import call_subprocess, format_command_args
-
-logger = logging.getLogger(__name__)
-
-
-def format_command_result(
- command_args: List[str],
- command_output: str,
-) -> str:
- """Format command information for logging."""
- command_desc = format_command_args(command_args)
- text = f"Command arguments: {command_desc}\n"
-
- if not command_output:
- text += "Command output: None"
- elif logger.getEffectiveLevel() > logging.DEBUG:
- text += "Command output: [use --verbose to show]"
- else:
- if not command_output.endswith("\n"):
- command_output += "\n"
- text += f"Command output:\n{command_output}"
-
- return text
-
-
-def get_legacy_build_wheel_path(
- names: List[str],
- temp_dir: str,
- name: str,
- command_args: List[str],
- command_output: str,
-) -> Optional[str]:
- """Return the path to the wheel in the temporary build directory."""
- # Sort for determinism.
- names = sorted(names)
- if not names:
- msg = ("Legacy build of wheel for {!r} created no files.\n").format(name)
- msg += format_command_result(command_args, command_output)
- logger.warning(msg)
- return None
-
- if len(names) > 1:
- msg = (
- "Legacy build of wheel for {!r} created more than one file.\n"
- "Filenames (choosing first): {}\n"
- ).format(name, names)
- msg += format_command_result(command_args, command_output)
- logger.warning(msg)
-
- return os.path.join(temp_dir, names[0])
-
-
-def build_wheel_legacy(
- name: str,
- setup_py_path: str,
- source_dir: str,
- global_options: List[str],
- build_options: List[str],
- tempd: str,
-) -> Optional[str]:
- """Build one unpacked package using the "legacy" build process.
-
- Returns path to wheel if successfully built. Otherwise, returns None.
- """
- wheel_args = make_setuptools_bdist_wheel_args(
- setup_py_path,
- global_options=global_options,
- build_options=build_options,
- destination_dir=tempd,
- )
-
- spin_message = f"Building wheel for {name} (setup.py)"
- with open_spinner(spin_message) as spinner:
- logger.debug("Destination directory: %s", tempd)
-
- try:
- output = call_subprocess(
- wheel_args,
- command_desc="python setup.py bdist_wheel",
- cwd=source_dir,
- spinner=spinner,
- )
- except Exception:
- spinner.finish("error")
- logger.error("Failed building wheel for %s", name)
- return None
-
- names = os.listdir(tempd)
- wheel_path = get_legacy_build_wheel_path(
- names=names,
- temp_dir=tempd,
- name=name,
- command_args=wheel_args,
- command_output=output,
- )
- return wheel_path
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/check.py b/env/lib/python3.9/site-packages/pip/_internal/operations/check.py
deleted file mode 100644
index fb3ac8b..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/check.py
+++ /dev/null
@@ -1,149 +0,0 @@
-"""Validation of dependencies of packages
-"""
-
-import logging
-from typing import Callable, Dict, List, NamedTuple, Optional, Set, Tuple
-
-from pip._vendor.packaging.requirements import Requirement
-from pip._vendor.packaging.utils import NormalizedName, canonicalize_name
-
-from pip._internal.distributions import make_distribution_for_install_requirement
-from pip._internal.metadata import get_default_environment
-from pip._internal.metadata.base import DistributionVersion
-from pip._internal.req.req_install import InstallRequirement
-
-logger = logging.getLogger(__name__)
-
-
-class PackageDetails(NamedTuple):
- version: DistributionVersion
- dependencies: List[Requirement]
-
-
-# Shorthands
-PackageSet = Dict[NormalizedName, PackageDetails]
-Missing = Tuple[NormalizedName, Requirement]
-Conflicting = Tuple[NormalizedName, DistributionVersion, Requirement]
-
-MissingDict = Dict[NormalizedName, List[Missing]]
-ConflictingDict = Dict[NormalizedName, List[Conflicting]]
-CheckResult = Tuple[MissingDict, ConflictingDict]
-ConflictDetails = Tuple[PackageSet, CheckResult]
-
-
-def create_package_set_from_installed() -> Tuple[PackageSet, bool]:
- """Converts a list of distributions into a PackageSet."""
- package_set = {}
- problems = False
- env = get_default_environment()
- for dist in env.iter_installed_distributions(local_only=False, skip=()):
- name = dist.canonical_name
- try:
- dependencies = list(dist.iter_dependencies())
- package_set[name] = PackageDetails(dist.version, dependencies)
- except (OSError, ValueError) as e:
- # Don't crash on unreadable or broken metadata.
- logger.warning("Error parsing requirements for %s: %s", name, e)
- problems = True
- return package_set, problems
-
-
-def check_package_set(
- package_set: PackageSet, should_ignore: Optional[Callable[[str], bool]] = None
-) -> CheckResult:
- """Check if a package set is consistent
-
- If should_ignore is passed, it should be a callable that takes a
- package name and returns a boolean.
- """
-
- missing = {}
- conflicting = {}
-
- for package_name, package_detail in package_set.items():
- # Info about dependencies of package_name
- missing_deps: Set[Missing] = set()
- conflicting_deps: Set[Conflicting] = set()
-
- if should_ignore and should_ignore(package_name):
- continue
-
- for req in package_detail.dependencies:
- name = canonicalize_name(req.name)
-
- # Check if it's missing
- if name not in package_set:
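- # Only report the dependency as missing if its environment marker
- # (if any) applies to the current environment.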
- missed = True
- if req.marker is not None:
- missed = req.marker.evaluate()
- if missed:
- missing_deps.add((name, req))
- continue
-
- # Check if there's a conflict
- version = package_set[name].version
- if not req.specifier.contains(version, prereleases=True):
- conflicting_deps.add((name, version, req))
-
- if missing_deps:
- missing[package_name] = sorted(missing_deps, key=str)
- if conflicting_deps:
- conflicting[package_name] = sorted(conflicting_deps, key=str)
-
- return missing, conflicting
-
-
-def check_install_conflicts(to_install: List[InstallRequirement]) -> ConflictDetails:
- """For checking if the dependency graph would be consistent after \
- installing given requirements
- """
- # Start from the current state
- package_set, _ = create_package_set_from_installed()
- # Install packages
- would_be_installed = _simulate_installation_of(to_install, package_set)
-
- # Only warn about directly-dependent packages; create a whitelist of them
- whitelist = _create_whitelist(would_be_installed, package_set)
-
- return (
- package_set,
- check_package_set(
- package_set, should_ignore=lambda name: name not in whitelist
- ),
- )
-
-
-def _simulate_installation_of(
- to_install: List[InstallRequirement], package_set: PackageSet
-) -> Set[NormalizedName]:
- """Computes the version of packages after installing to_install."""
- # Keep track of packages that were installed
- installed = set()
-
- # Modify it as installing requirement_set would (assuming no errors)
- for inst_req in to_install:
- abstract_dist = make_distribution_for_install_requirement(inst_req)
- dist = abstract_dist.get_metadata_distribution()
- name = dist.canonical_name
- package_set[name] = PackageDetails(dist.version, list(dist.iter_dependencies()))
-
- installed.add(name)
-
- return installed
-
-
-def _create_whitelist(
- would_be_installed: Set[NormalizedName], package_set: PackageSet
-) -> Set[NormalizedName]:
- packages_affected = set(would_be_installed)
-
- for package_name in package_set:
- if package_name in packages_affected:
- continue
-
- for req in package_set[package_name].dependencies:
- if canonicalize_name(req.name) in packages_affected:
- packages_affected.add(package_name)
- break
-
- return packages_affected
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/freeze.py b/env/lib/python3.9/site-packages/pip/_internal/operations/freeze.py
deleted file mode 100644
index 930d4c6..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/freeze.py
+++ /dev/null
@@ -1,254 +0,0 @@
-import collections
-import logging
-import os
-from typing import Container, Dict, Generator, Iterable, List, NamedTuple, Optional, Set
-
-from pip._vendor.packaging.utils import canonicalize_name
-from pip._vendor.packaging.version import Version
-
-from pip._internal.exceptions import BadCommand, InstallationError
-from pip._internal.metadata import BaseDistribution, get_environment
-from pip._internal.req.constructors import (
- install_req_from_editable,
- install_req_from_line,
-)
-from pip._internal.req.req_file import COMMENT_RE
-from pip._internal.utils.direct_url_helpers import direct_url_as_pep440_direct_reference
-
-logger = logging.getLogger(__name__)
-
-
-class _EditableInfo(NamedTuple):
- requirement: str
- comments: List[str]
-
-
-def freeze(
- requirement: Optional[List[str]] = None,
- local_only: bool = False,
- user_only: bool = False,
- paths: Optional[List[str]] = None,
- isolated: bool = False,
- exclude_editable: bool = False,
- skip: Container[str] = (),
-) -> Generator[str, None, None]:
- installations: Dict[str, FrozenRequirement] = {}
-
- dists = get_environment(paths).iter_installed_distributions(
- local_only=local_only,
- skip=(),
- user_only=user_only,
- )
- for dist in dists:
- req = FrozenRequirement.from_dist(dist)
- if exclude_editable and req.editable:
- continue
- installations[req.canonical_name] = req
-
- if requirement:
- # Options that don't get turned into an InstallRequirement should
- # only be emitted once, even if the same option appears in multiple
- # requirements files, so track what has already been emitted.
- emitted_options: Set[str] = set()
- # keep track of which files a requirement is in so that we can
- # give an accurate warning if a requirement appears multiple times.
- req_files: Dict[str, List[str]] = collections.defaultdict(list)
- for req_file_path in requirement:
- with open(req_file_path) as req_file:
- for line in req_file:
- if (
- not line.strip()
- or line.strip().startswith("#")
- or line.startswith(
- (
- "-r",
- "--requirement",
- "-f",
- "--find-links",
- "-i",
- "--index-url",
- "--pre",
- "--trusted-host",
- "--process-dependency-links",
- "--extra-index-url",
- "--use-feature",
- )
- )
- ):
- line = line.rstrip()
- if line not in emitted_options:
- emitted_options.add(line)
- yield line
- continue
-
- if line.startswith("-e") or line.startswith("--editable"):
- if line.startswith("-e"):
- line = line[2:].strip()
- else:
- line = line[len("--editable") :].strip().lstrip("=")
- line_req = install_req_from_editable(
- line,
- isolated=isolated,
- )
- else:
- line_req = install_req_from_line(
- COMMENT_RE.sub("", line).strip(),
- isolated=isolated,
- )
-
- if not line_req.name:
- logger.info(
- "Skipping line in requirement file [%s] because "
- "it's not clear what it would install: %s",
- req_file_path,
- line.strip(),
- )
- logger.info(
- " (add #egg=PackageName to the URL to avoid"
- " this warning)"
- )
- else:
- line_req_canonical_name = canonicalize_name(line_req.name)
- if line_req_canonical_name not in installations:
- # either it's not installed, or it is installed
- # but has been processed already
- if not req_files[line_req.name]:
- logger.warning(
- "Requirement file [%s] contains %s, but "
- "package %r is not installed",
- req_file_path,
- COMMENT_RE.sub("", line).strip(),
- line_req.name,
- )
- else:
- req_files[line_req.name].append(req_file_path)
- else:
- yield str(installations[line_req_canonical_name]).rstrip()
- del installations[line_req_canonical_name]
- req_files[line_req.name].append(req_file_path)
-
- # Warn about requirements that were included multiple times (in a
- # single requirements file or in different requirements files).
- for name, files in req_files.items():
- if len(files) > 1:
- logger.warning(
- "Requirement %s included multiple times [%s]",
- name,
- ", ".join(sorted(set(files))),
- )
-
- yield ("## The following requirements were added by pip freeze:")
- for installation in sorted(installations.values(), key=lambda x: x.name.lower()):
- if installation.canonical_name not in skip:
- yield str(installation).rstrip()
-
-
-def _format_as_name_version(dist: BaseDistribution) -> str:
- if isinstance(dist.version, Version):
- return f"{dist.raw_name}=={dist.version}"
- return f"{dist.raw_name}==={dist.version}"
-
-
-def _get_editable_info(dist: BaseDistribution) -> _EditableInfo:
- """
- Compute and return values (req, comments) for use in
- FrozenRequirement.from_dist().
- """
- editable_project_location = dist.editable_project_location
- assert editable_project_location
- location = os.path.normcase(os.path.abspath(editable_project_location))
-
- from pip._internal.vcs import RemoteNotFoundError, RemoteNotValidError, vcs
-
- vcs_backend = vcs.get_backend_for_dir(location)
-
- if vcs_backend is None:
- display = _format_as_name_version(dist)
- logger.debug(
- 'No VCS found for editable requirement "%s" in: %r',
- display,
- location,
- )
- return _EditableInfo(
- requirement=location,
- comments=[f"# Editable install with no version control ({display})"],
- )
-
- vcs_name = type(vcs_backend).__name__
-
- try:
- req = vcs_backend.get_src_requirement(location, dist.raw_name)
- except RemoteNotFoundError:
- display = _format_as_name_version(dist)
- return _EditableInfo(
- requirement=location,
- comments=[f"# Editable {vcs_name} install with no remote ({display})"],
- )
- except RemoteNotValidError as ex:
- display = _format_as_name_version(dist)
- return _EditableInfo(
- requirement=location,
- comments=[
- f"# Editable {vcs_name} install ({display}) with either a deleted "
-            f"local remote or an invalid URI:",
- f"# '{ex.url}'",
- ],
- )
- except BadCommand:
- logger.warning(
- "cannot determine version of editable source in %s "
- "(%s command not found in path)",
- location,
- vcs_backend.name,
- )
- return _EditableInfo(requirement=location, comments=[])
- except InstallationError as exc:
-        logger.warning("Error when trying to get requirement for VCS system: %s", exc)
- else:
- return _EditableInfo(requirement=req, comments=[])
-
- logger.warning("Could not determine repository location of %s", location)
-
- return _EditableInfo(
- requirement=location,
- comments=["## !! Could not determine repository location"],
- )
-
-
-class FrozenRequirement:
- def __init__(
- self,
- name: str,
- req: str,
- editable: bool,
- comments: Iterable[str] = (),
- ) -> None:
- self.name = name
- self.canonical_name = canonicalize_name(name)
- self.req = req
- self.editable = editable
- self.comments = comments
-
- @classmethod
- def from_dist(cls, dist: BaseDistribution) -> "FrozenRequirement":
- editable = dist.editable
- if editable:
- req, comments = _get_editable_info(dist)
- else:
- comments = []
- direct_url = dist.direct_url
- if direct_url:
- # if PEP 610 metadata is present, use it
- req = direct_url_as_pep440_direct_reference(direct_url, dist.raw_name)
- else:
- # name==version requirement
- req = _format_as_name_version(dist)
-
- return cls(dist.raw_name, req, editable, comments=comments)
-
- def __str__(self) -> str:
- req = self.req
- if self.editable:
- req = f"-e {req}"
- return "\n".join(list(self.comments) + [str(req)]) + "\n"
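The `freeze()` generator above is what drives `pip freeze`: it walks installed distributions, reconciles them against any requirement files passed in, and yields one requirement line at a time. A minimal sketch of consuming it directly, with the caveat that `pip._internal` is not a stable public API:

```python
# Illustrative only: pip's _internal modules are not a supported interface.
from pip._internal.operations.freeze import freeze

# Yield "name==version" lines for the local environment, skipping build tools.
for line in freeze(local_only=True, skip={"pip", "setuptools", "wheel"}):
    print(line)
```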
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/install/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/operations/install/__init__.py
deleted file mode 100644
index 24d6a5d..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/install/__init__.py
+++ /dev/null
@@ -1,2 +0,0 @@
-"""For modules related to installing packages.
-"""
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/install/editable_legacy.py b/env/lib/python3.9/site-packages/pip/_internal/operations/install/editable_legacy.py
deleted file mode 100644
index bb548cd..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/install/editable_legacy.py
+++ /dev/null
@@ -1,47 +0,0 @@
-"""Legacy editable installation process, i.e. `setup.py develop`.
-"""
-import logging
-from typing import List, Optional, Sequence
-
-from pip._internal.build_env import BuildEnvironment
-from pip._internal.utils.logging import indent_log
-from pip._internal.utils.setuptools_build import make_setuptools_develop_args
-from pip._internal.utils.subprocess import call_subprocess
-
-logger = logging.getLogger(__name__)
-
-
-def install_editable(
- install_options: List[str],
- global_options: Sequence[str],
- prefix: Optional[str],
- home: Optional[str],
- use_user_site: bool,
- name: str,
- setup_py_path: str,
- isolated: bool,
- build_env: BuildEnvironment,
- unpacked_source_directory: str,
-) -> None:
- """Install a package in editable mode. Most arguments are pass-through
- to setuptools.
- """
- logger.info("Running setup.py develop for %s", name)
-
- args = make_setuptools_develop_args(
- setup_py_path,
- global_options=global_options,
- install_options=install_options,
- no_user_config=isolated,
- prefix=prefix,
- home=home,
- use_user_site=use_user_site,
- )
-
- with indent_log():
- with build_env:
- call_subprocess(
- args,
- command_desc="python setup.py develop",
- cwd=unpacked_source_directory,
- )
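`install_editable()` above is a thin wrapper: it assembles `setup.py develop` arguments via `make_setuptools_develop_args` and shells out. Roughly the command it ends up running, as a hedged sketch (the real flag set varies with the prefix/home/user-site options passed in):

```python
import subprocess
import sys

# Approximation of the subprocess launched by install_editable(); pip adds
# further flags depending on its inputs. The cwd is a hypothetical path.
subprocess.check_call(
    [sys.executable, "setup.py", "develop", "--no-deps"],
    cwd="/path/to/unpacked/source",
)
```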
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/install/legacy.py b/env/lib/python3.9/site-packages/pip/_internal/operations/install/legacy.py
deleted file mode 100644
index 5b7ef90..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/install/legacy.py
+++ /dev/null
@@ -1,120 +0,0 @@
-"""Legacy installation process, i.e. `setup.py install`.
-"""
-
-import logging
-import os
-from distutils.util import change_root
-from typing import List, Optional, Sequence
-
-from pip._internal.build_env import BuildEnvironment
-from pip._internal.exceptions import InstallationError, LegacyInstallFailure
-from pip._internal.models.scheme import Scheme
-from pip._internal.utils.misc import ensure_dir
-from pip._internal.utils.setuptools_build import make_setuptools_install_args
-from pip._internal.utils.subprocess import runner_with_spinner_message
-from pip._internal.utils.temp_dir import TempDirectory
-
-logger = logging.getLogger(__name__)
-
-
-def write_installed_files_from_setuptools_record(
- record_lines: List[str],
- root: Optional[str],
- req_description: str,
-) -> None:
- def prepend_root(path: str) -> str:
- if root is None or not os.path.isabs(path):
- return path
- else:
- return change_root(root, path)
-
- for line in record_lines:
- directory = os.path.dirname(line)
- if directory.endswith(".egg-info"):
- egg_info_dir = prepend_root(directory)
- break
- else:
- message = (
- "{} did not indicate that it installed an "
- ".egg-info directory. Only setup.py projects "
- "generating .egg-info directories are supported."
- ).format(req_description)
- raise InstallationError(message)
-
- new_lines = []
- for line in record_lines:
- filename = line.strip()
- if os.path.isdir(filename):
- filename += os.path.sep
- new_lines.append(os.path.relpath(prepend_root(filename), egg_info_dir))
- new_lines.sort()
- ensure_dir(egg_info_dir)
- inst_files_path = os.path.join(egg_info_dir, "installed-files.txt")
- with open(inst_files_path, "w") as f:
- f.write("\n".join(new_lines) + "\n")
-
-
-def install(
- install_options: List[str],
- global_options: Sequence[str],
- root: Optional[str],
- home: Optional[str],
- prefix: Optional[str],
- use_user_site: bool,
- pycompile: bool,
- scheme: Scheme,
- setup_py_path: str,
- isolated: bool,
- req_name: str,
- build_env: BuildEnvironment,
- unpacked_source_directory: str,
- req_description: str,
-) -> bool:
-
- header_dir = scheme.headers
-
- with TempDirectory(kind="record") as temp_dir:
- try:
- record_filename = os.path.join(temp_dir.path, "install-record.txt")
- install_args = make_setuptools_install_args(
- setup_py_path,
- global_options=global_options,
- install_options=install_options,
- record_filename=record_filename,
- root=root,
- prefix=prefix,
- header_dir=header_dir,
- home=home,
- use_user_site=use_user_site,
- no_user_config=isolated,
- pycompile=pycompile,
- )
-
- runner = runner_with_spinner_message(
- f"Running setup.py install for {req_name}"
- )
- with build_env:
- runner(
- cmd=install_args,
- cwd=unpacked_source_directory,
- )
-
- if not os.path.exists(record_filename):
- logger.debug("Record file %s not found", record_filename)
- # Signal to the caller that we didn't install the new package
- return False
-
- except Exception as e:
- # Signal to the caller that we didn't install the new package
- raise LegacyInstallFailure(package_details=req_name) from e
-
- # At this point, we have successfully installed the requirement.
-
- # We intentionally do not use any encoding to read the file because
- # setuptools writes the file using distutils.file_util.write_file,
- # which does not specify an encoding.
- with open(record_filename) as f:
- record_lines = f.read().splitlines()
-
- write_installed_files_from_setuptools_record(record_lines, root, req_description)
- return True
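`write_installed_files_from_setuptools_record()` converts the `--record` output of `setup.py install` into an `installed-files.txt` inside the project's `.egg-info` directory, with each path rewritten relative to that directory. A toy invocation with hypothetical paths, assuming the module is importable as it was before this removal:

```python
from pip._internal.operations.install.legacy import (
    write_installed_files_from_setuptools_record,
)

# Hypothetical record lines, as produced by `setup.py install --record`.
record_lines = [
    "/tmp/site/demo/__init__.py",
    "/tmp/site/demo-1.0-py3.9.egg-info/PKG-INFO",
]
write_installed_files_from_setuptools_record(
    record_lines, root=None, req_description="demo==1.0"
)
# Writes /tmp/site/demo-1.0-py3.9.egg-info/installed-files.txt containing
# entries such as "../demo/__init__.py".
```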
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/install/wheel.py b/env/lib/python3.9/site-packages/pip/_internal/operations/install/wheel.py
deleted file mode 100644
index 7e17656..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/install/wheel.py
+++ /dev/null
@@ -1,739 +0,0 @@
-"""Support for installing and building the "wheel" binary package format.
-"""
-
-import collections
-import compileall
-import contextlib
-import csv
-import importlib
-import logging
-import os.path
-import re
-import shutil
-import sys
-import warnings
-from base64 import urlsafe_b64encode
-from email.message import Message
-from itertools import chain, filterfalse, starmap
-from typing import (
- IO,
- TYPE_CHECKING,
- Any,
- BinaryIO,
- Callable,
- Dict,
- Generator,
- Iterable,
- Iterator,
- List,
- NewType,
- Optional,
- Sequence,
- Set,
- Tuple,
- Union,
- cast,
-)
-from zipfile import ZipFile, ZipInfo
-
-from pip._vendor.distlib.scripts import ScriptMaker
-from pip._vendor.distlib.util import get_export_entry
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.exceptions import InstallationError
-from pip._internal.locations import get_major_minor_version
-from pip._internal.metadata import (
- BaseDistribution,
- FilesystemWheel,
- get_wheel_distribution,
-)
-from pip._internal.models.direct_url import DIRECT_URL_METADATA_NAME, DirectUrl
-from pip._internal.models.scheme import SCHEME_KEYS, Scheme
-from pip._internal.utils.filesystem import adjacent_tmp_file, replace
-from pip._internal.utils.misc import captured_stdout, ensure_dir, hash_file, partition
-from pip._internal.utils.unpacking import (
- current_umask,
- is_within_directory,
- set_extracted_file_to_default_mode_plus_executable,
- zip_item_is_executable,
-)
-from pip._internal.utils.wheel import parse_wheel
-
-if TYPE_CHECKING:
- from typing import Protocol
-
- class File(Protocol):
- src_record_path: "RecordPath"
- dest_path: str
- changed: bool
-
- def save(self) -> None:
- pass
-
-
-logger = logging.getLogger(__name__)
-
-RecordPath = NewType("RecordPath", str)
-InstalledCSVRow = Tuple[RecordPath, str, Union[int, str]]
-
-
-def rehash(path: str, blocksize: int = 1 << 20) -> Tuple[str, str]:
- """Return (encoded_digest, length) for path using hashlib.sha256()"""
- h, length = hash_file(path, blocksize)
- digest = "sha256=" + urlsafe_b64encode(h.digest()).decode("latin1").rstrip("=")
- return (digest, str(length))
-
-
-def csv_io_kwargs(mode: str) -> Dict[str, Any]:
- """Return keyword arguments to properly open a CSV file
- in the given mode.
- """
- return {"mode": mode, "newline": "", "encoding": "utf-8"}
-
-
-def fix_script(path: str) -> bool:
- """Replace #!python with #!/path/to/python
- Return True if file was changed.
- """
- # XXX RECORD hashes will need to be updated
- assert os.path.isfile(path)
-
- with open(path, "rb") as script:
- firstline = script.readline()
- if not firstline.startswith(b"#!python"):
- return False
- exename = sys.executable.encode(sys.getfilesystemencoding())
- firstline = b"#!" + exename + os.linesep.encode("ascii")
- rest = script.read()
- with open(path, "wb") as script:
- script.write(firstline)
- script.write(rest)
- return True
-
-
-def wheel_root_is_purelib(metadata: Message) -> bool:
- return metadata.get("Root-Is-Purelib", "").lower() == "true"
-
-
-def get_entrypoints(dist: BaseDistribution) -> Tuple[Dict[str, str], Dict[str, str]]:
- console_scripts = {}
- gui_scripts = {}
- for entry_point in dist.iter_entry_points():
- if entry_point.group == "console_scripts":
- console_scripts[entry_point.name] = entry_point.value
- elif entry_point.group == "gui_scripts":
- gui_scripts[entry_point.name] = entry_point.value
- return console_scripts, gui_scripts
-
-
-def message_about_scripts_not_on_PATH(scripts: Sequence[str]) -> Optional[str]:
- """Determine if any scripts are not on PATH and format a warning.
- Returns a warning message if one or more scripts are not on PATH,
- otherwise None.
- """
- if not scripts:
- return None
-
- # Group scripts by the path they were installed in
- grouped_by_dir: Dict[str, Set[str]] = collections.defaultdict(set)
- for destfile in scripts:
- parent_dir = os.path.dirname(destfile)
- script_name = os.path.basename(destfile)
- grouped_by_dir[parent_dir].add(script_name)
-
- # We don't want to warn for directories that are on PATH.
- not_warn_dirs = [
- os.path.normcase(i).rstrip(os.sep)
- for i in os.environ.get("PATH", "").split(os.pathsep)
- ]
-    # If an executable sits alongside sys.executable, we don't warn for it.
- # This covers the case of venv invocations without activating the venv.
- not_warn_dirs.append(os.path.normcase(os.path.dirname(sys.executable)))
- warn_for: Dict[str, Set[str]] = {
- parent_dir: scripts
- for parent_dir, scripts in grouped_by_dir.items()
- if os.path.normcase(parent_dir) not in not_warn_dirs
- }
- if not warn_for:
- return None
-
- # Format a message
- msg_lines = []
- for parent_dir, dir_scripts in warn_for.items():
- sorted_scripts: List[str] = sorted(dir_scripts)
- if len(sorted_scripts) == 1:
- start_text = "script {} is".format(sorted_scripts[0])
- else:
- start_text = "scripts {} are".format(
- ", ".join(sorted_scripts[:-1]) + " and " + sorted_scripts[-1]
- )
-
- msg_lines.append(
- "The {} installed in '{}' which is not on PATH.".format(
- start_text, parent_dir
- )
- )
-
- last_line_fmt = (
- "Consider adding {} to PATH or, if you prefer "
- "to suppress this warning, use --no-warn-script-location."
- )
- if len(msg_lines) == 1:
- msg_lines.append(last_line_fmt.format("this directory"))
- else:
- msg_lines.append(last_line_fmt.format("these directories"))
-
- # Add a note if any directory starts with ~
- warn_for_tilde = any(
- i[0] == "~" for i in os.environ.get("PATH", "").split(os.pathsep) if i
- )
- if warn_for_tilde:
- tilde_warning_msg = (
- "NOTE: The current PATH contains path(s) starting with `~`, "
- "which may not be expanded by all applications."
- )
- msg_lines.append(tilde_warning_msg)
-
- # Returns the formatted multiline message
- return "\n".join(msg_lines)
-
-
-def _normalized_outrows(
- outrows: Iterable[InstalledCSVRow],
-) -> List[Tuple[str, str, str]]:
- """Normalize the given rows of a RECORD file.
-
- Items in each row are converted into str. Rows are then sorted to make
- the value more predictable for tests.
-
- Each row is a 3-tuple (path, hash, size) and corresponds to a record of
- a RECORD file (see PEP 376 and PEP 427 for details). For the rows
- passed to this function, the size can be an integer as an int or string,
- or the empty string.
- """
- # Normally, there should only be one row per path, in which case the
- # second and third elements don't come into play when sorting.
- # However, in cases in the wild where a path might happen to occur twice,
- # we don't want the sort operation to trigger an error (but still want
- # determinism). Since the third element can be an int or string, we
- # coerce each element to a string to avoid a TypeError in this case.
- # For additional background, see--
- # https://github.com/pypa/pip/issues/5868
- return sorted(
- (record_path, hash_, str(size)) for record_path, hash_, size in outrows
- )
-
-
-def _record_to_fs_path(record_path: RecordPath) -> str:
- return record_path
-
-
-def _fs_to_record_path(path: str, relative_to: Optional[str] = None) -> RecordPath:
- if relative_to is not None:
- # On Windows, do not handle relative paths if they belong to different
- # logical disks
- if (
- os.path.splitdrive(path)[0].lower()
- == os.path.splitdrive(relative_to)[0].lower()
- ):
- path = os.path.relpath(path, relative_to)
- path = path.replace(os.path.sep, "/")
- return cast("RecordPath", path)
-
-
-def get_csv_rows_for_installed(
- old_csv_rows: List[List[str]],
- installed: Dict[RecordPath, RecordPath],
- changed: Set[RecordPath],
- generated: List[str],
- lib_dir: str,
-) -> List[InstalledCSVRow]:
- """
- :param installed: A map from archive RECORD path to installation RECORD
- path.
- """
- installed_rows: List[InstalledCSVRow] = []
- for row in old_csv_rows:
- if len(row) > 3:
- logger.warning("RECORD line has more than three elements: %s", row)
- old_record_path = cast("RecordPath", row[0])
- new_record_path = installed.pop(old_record_path, old_record_path)
- if new_record_path in changed:
- digest, length = rehash(_record_to_fs_path(new_record_path))
- else:
- digest = row[1] if len(row) > 1 else ""
- length = row[2] if len(row) > 2 else ""
- installed_rows.append((new_record_path, digest, length))
- for f in generated:
- path = _fs_to_record_path(f, lib_dir)
- digest, length = rehash(f)
- installed_rows.append((path, digest, length))
- for installed_record_path in installed.values():
- installed_rows.append((installed_record_path, "", ""))
- return installed_rows
-
-
-def get_console_script_specs(console: Dict[str, str]) -> List[str]:
- """
- Given the mapping from entrypoint name to callable, return the relevant
- console script specs.
- """
- # Don't mutate caller's version
- console = console.copy()
-
- scripts_to_generate = []
-
- # Special case pip and setuptools to generate versioned wrappers
- #
- # The issue is that some projects (specifically, pip and setuptools) use
- # code in setup.py to create "versioned" entry points - pip2.7 on Python
- # 2.7, pip3.3 on Python 3.3, etc. But these entry points are baked into
- # the wheel metadata at build time, and so if the wheel is installed with
- # a *different* version of Python the entry points will be wrong. The
- # correct fix for this is to enhance the metadata to be able to describe
- # such versioned entry points, but that won't happen till Metadata 2.0 is
- # available.
- # In the meantime, projects using versioned entry points will either have
- # incorrect versioned entry points, or they will not be able to distribute
- # "universal" wheels (i.e., they will need a wheel per Python version).
- #
- # Because setuptools and pip are bundled with _ensurepip and virtualenv,
- # we need to use universal wheels. So, as a stopgap until Metadata 2.0, we
- # override the versioned entry points in the wheel and generate the
- # correct ones. This code is purely a short-term measure until Metadata 2.0
- # is available.
- #
-    # To add to the level of hack in this section of code: in order to support
-    # ensurepip, this code will look for an ``ENSUREPIP_OPTIONS`` environment
-    # variable which controls which versioned scripts get installed.
- #
- # ENSUREPIP_OPTIONS=altinstall
- # - Only pipX.Y and easy_install-X.Y will be generated and installed
- # ENSUREPIP_OPTIONS=install
- # - pipX.Y, pipX, easy_install-X.Y will be generated and installed. Note
- # that this option is technically if ENSUREPIP_OPTIONS is set and is
- # not altinstall
- # DEFAULT
- # - The default behavior is to install pip, pipX, pipX.Y, easy_install
- # and easy_install-X.Y.
- pip_script = console.pop("pip", None)
- if pip_script:
- if "ENSUREPIP_OPTIONS" not in os.environ:
- scripts_to_generate.append("pip = " + pip_script)
-
- if os.environ.get("ENSUREPIP_OPTIONS", "") != "altinstall":
- scripts_to_generate.append(
- "pip{} = {}".format(sys.version_info[0], pip_script)
- )
-
- scripts_to_generate.append(f"pip{get_major_minor_version()} = {pip_script}")
- # Delete any other versioned pip entry points
- pip_ep = [k for k in console if re.match(r"pip(\d(\.\d)?)?$", k)]
- for k in pip_ep:
- del console[k]
- easy_install_script = console.pop("easy_install", None)
- if easy_install_script:
- if "ENSUREPIP_OPTIONS" not in os.environ:
- scripts_to_generate.append("easy_install = " + easy_install_script)
-
- scripts_to_generate.append(
- "easy_install-{} = {}".format(
- get_major_minor_version(), easy_install_script
- )
- )
- # Delete any other versioned easy_install entry points
- easy_install_ep = [
- k for k in console if re.match(r"easy_install(-\d\.\d)?$", k)
- ]
- for k in easy_install_ep:
- del console[k]
-
- # Generate the console entry points specified in the wheel
- scripts_to_generate.extend(starmap("{} = {}".format, console.items()))
-
- return scripts_to_generate
-
-
-class ZipBackedFile:
- def __init__(
- self, src_record_path: RecordPath, dest_path: str, zip_file: ZipFile
- ) -> None:
- self.src_record_path = src_record_path
- self.dest_path = dest_path
- self._zip_file = zip_file
- self.changed = False
-
- def _getinfo(self) -> ZipInfo:
- return self._zip_file.getinfo(self.src_record_path)
-
- def save(self) -> None:
- # directory creation is lazy and after file filtering
- # to ensure we don't install empty dirs; empty dirs can't be
- # uninstalled.
- parent_dir = os.path.dirname(self.dest_path)
- ensure_dir(parent_dir)
-
- # When we open the output file below, any existing file is truncated
- # before we start writing the new contents. This is fine in most
- # cases, but can cause a segfault if pip has loaded a shared
- # object (e.g. from pyopenssl through its vendored urllib3)
- # Since the shared object is mmap'd an attempt to call a
- # symbol in it will then cause a segfault. Unlinking the file
- # allows writing of new contents while allowing the process to
- # continue to use the old copy.
- if os.path.exists(self.dest_path):
- os.unlink(self.dest_path)
-
- zipinfo = self._getinfo()
-
- with self._zip_file.open(zipinfo) as f:
- with open(self.dest_path, "wb") as dest:
- shutil.copyfileobj(f, dest)
-
- if zip_item_is_executable(zipinfo):
- set_extracted_file_to_default_mode_plus_executable(self.dest_path)
-
-
-class ScriptFile:
- def __init__(self, file: "File") -> None:
- self._file = file
- self.src_record_path = self._file.src_record_path
- self.dest_path = self._file.dest_path
- self.changed = False
-
- def save(self) -> None:
- self._file.save()
- self.changed = fix_script(self.dest_path)
-
-
-class MissingCallableSuffix(InstallationError):
- def __init__(self, entry_point: str) -> None:
- super().__init__(
- "Invalid script entry point: {} - A callable "
- "suffix is required. Cf https://packaging.python.org/"
- "specifications/entry-points/#use-for-scripts for more "
- "information.".format(entry_point)
- )
-
-
-def _raise_for_invalid_entrypoint(specification: str) -> None:
- entry = get_export_entry(specification)
- if entry is not None and entry.suffix is None:
- raise MissingCallableSuffix(str(entry))
-
-
-class PipScriptMaker(ScriptMaker):
-    def make(
-        self, specification: str, options: Optional[Dict[str, Any]] = None
-    ) -> List[str]:
- _raise_for_invalid_entrypoint(specification)
- return super().make(specification, options)
-
-
-def _install_wheel(
- name: str,
- wheel_zip: ZipFile,
- wheel_path: str,
- scheme: Scheme,
- pycompile: bool = True,
- warn_script_location: bool = True,
- direct_url: Optional[DirectUrl] = None,
- requested: bool = False,
-) -> None:
- """Install a wheel.
-
- :param name: Name of the project to install
- :param wheel_zip: open ZipFile for wheel being installed
- :param scheme: Distutils scheme dictating the install directories
- :param pycompile: Whether to byte-compile installed Python files
- :param warn_script_location: Whether to check that scripts are installed
- into a directory on PATH
- :raises UnsupportedWheel:
- * when the directory holds an unpacked wheel with incompatible
- Wheel-Version
- * when the .dist-info dir does not match the wheel
- """
- info_dir, metadata = parse_wheel(wheel_zip, name)
-
- if wheel_root_is_purelib(metadata):
- lib_dir = scheme.purelib
- else:
- lib_dir = scheme.platlib
-
- # Record details of the files moved
- # installed = files copied from the wheel to the destination
- # changed = files changed while installing (scripts #! line typically)
- # generated = files newly generated during the install (script wrappers)
- installed: Dict[RecordPath, RecordPath] = {}
- changed: Set[RecordPath] = set()
- generated: List[str] = []
-
- def record_installed(
- srcfile: RecordPath, destfile: str, modified: bool = False
- ) -> None:
- """Map archive RECORD paths to installation RECORD paths."""
- newpath = _fs_to_record_path(destfile, lib_dir)
- installed[srcfile] = newpath
- if modified:
- changed.add(_fs_to_record_path(destfile))
-
- def is_dir_path(path: RecordPath) -> bool:
- return path.endswith("/")
-
- def assert_no_path_traversal(dest_dir_path: str, target_path: str) -> None:
- if not is_within_directory(dest_dir_path, target_path):
- message = (
- "The wheel {!r} has a file {!r} trying to install"
- " outside the target directory {!r}"
- )
- raise InstallationError(
- message.format(wheel_path, target_path, dest_dir_path)
- )
-
- def root_scheme_file_maker(
- zip_file: ZipFile, dest: str
- ) -> Callable[[RecordPath], "File"]:
- def make_root_scheme_file(record_path: RecordPath) -> "File":
- normed_path = os.path.normpath(record_path)
- dest_path = os.path.join(dest, normed_path)
- assert_no_path_traversal(dest, dest_path)
- return ZipBackedFile(record_path, dest_path, zip_file)
-
- return make_root_scheme_file
-
- def data_scheme_file_maker(
- zip_file: ZipFile, scheme: Scheme
- ) -> Callable[[RecordPath], "File"]:
- scheme_paths = {key: getattr(scheme, key) for key in SCHEME_KEYS}
-
- def make_data_scheme_file(record_path: RecordPath) -> "File":
- normed_path = os.path.normpath(record_path)
- try:
- _, scheme_key, dest_subpath = normed_path.split(os.path.sep, 2)
- except ValueError:
-                message = (
-                    "Unexpected file in {}: {!r}. .data directory contents"
-                    " should be named like: '<scheme key>/<path>'."
-                ).format(wheel_path, record_path)
- raise InstallationError(message)
-
- try:
- scheme_path = scheme_paths[scheme_key]
- except KeyError:
- valid_scheme_keys = ", ".join(sorted(scheme_paths))
- message = (
- "Unknown scheme key used in {}: {} (for file {!r}). .data"
- " directory contents should be in subdirectories named"
- " with a valid scheme key ({})"
- ).format(wheel_path, scheme_key, record_path, valid_scheme_keys)
- raise InstallationError(message)
-
- dest_path = os.path.join(scheme_path, dest_subpath)
- assert_no_path_traversal(scheme_path, dest_path)
- return ZipBackedFile(record_path, dest_path, zip_file)
-
- return make_data_scheme_file
-
- def is_data_scheme_path(path: RecordPath) -> bool:
- return path.split("/", 1)[0].endswith(".data")
-
- paths = cast(List[RecordPath], wheel_zip.namelist())
- file_paths = filterfalse(is_dir_path, paths)
- root_scheme_paths, data_scheme_paths = partition(is_data_scheme_path, file_paths)
-
- make_root_scheme_file = root_scheme_file_maker(wheel_zip, lib_dir)
- files: Iterator[File] = map(make_root_scheme_file, root_scheme_paths)
-
- def is_script_scheme_path(path: RecordPath) -> bool:
- parts = path.split("/", 2)
- return len(parts) > 2 and parts[0].endswith(".data") and parts[1] == "scripts"
-
- other_scheme_paths, script_scheme_paths = partition(
- is_script_scheme_path, data_scheme_paths
- )
-
- make_data_scheme_file = data_scheme_file_maker(wheel_zip, scheme)
- other_scheme_files = map(make_data_scheme_file, other_scheme_paths)
- files = chain(files, other_scheme_files)
-
- # Get the defined entry points
- distribution = get_wheel_distribution(
- FilesystemWheel(wheel_path),
- canonicalize_name(name),
- )
- console, gui = get_entrypoints(distribution)
-
- def is_entrypoint_wrapper(file: "File") -> bool:
- # EP, EP.exe and EP-script.py are scripts generated for
- # entry point EP by setuptools
- path = file.dest_path
- name = os.path.basename(path)
- if name.lower().endswith(".exe"):
- matchname = name[:-4]
- elif name.lower().endswith("-script.py"):
- matchname = name[:-10]
- elif name.lower().endswith(".pya"):
- matchname = name[:-4]
- else:
- matchname = name
- # Ignore setuptools-generated scripts
- return matchname in console or matchname in gui
-
- script_scheme_files: Iterator[File] = map(
- make_data_scheme_file, script_scheme_paths
- )
- script_scheme_files = filterfalse(is_entrypoint_wrapper, script_scheme_files)
- script_scheme_files = map(ScriptFile, script_scheme_files)
- files = chain(files, script_scheme_files)
-
- for file in files:
- file.save()
- record_installed(file.src_record_path, file.dest_path, file.changed)
-
- def pyc_source_file_paths() -> Generator[str, None, None]:
- # We de-duplicate installation paths, since there can be overlap (e.g.
- # file in .data maps to same location as file in wheel root).
- # Sorting installation paths makes it easier to reproduce and debug
- # issues related to permissions on existing files.
- for installed_path in sorted(set(installed.values())):
- full_installed_path = os.path.join(lib_dir, installed_path)
- if not os.path.isfile(full_installed_path):
- continue
- if not full_installed_path.endswith(".py"):
- continue
- yield full_installed_path
-
- def pyc_output_path(path: str) -> str:
- """Return the path the pyc file would have been written to."""
- return importlib.util.cache_from_source(path)
-
- # Compile all of the pyc files for the installed files
- if pycompile:
- with captured_stdout() as stdout:
- with warnings.catch_warnings():
- warnings.filterwarnings("ignore")
- for path in pyc_source_file_paths():
- success = compileall.compile_file(path, force=True, quiet=True)
- if success:
- pyc_path = pyc_output_path(path)
- assert os.path.exists(pyc_path)
- pyc_record_path = cast(
- "RecordPath", pyc_path.replace(os.path.sep, "/")
- )
- record_installed(pyc_record_path, pyc_path)
- logger.debug(stdout.getvalue())
-
- maker = PipScriptMaker(None, scheme.scripts)
-
- # Ensure old scripts are overwritten.
- # See https://github.com/pypa/pip/issues/1800
- maker.clobber = True
-
- # Ensure we don't generate any variants for scripts because this is almost
- # never what somebody wants.
- # See https://bitbucket.org/pypa/distlib/issue/35/
- maker.variants = {""}
-
- # This is required because otherwise distlib creates scripts that are not
- # executable.
- # See https://bitbucket.org/pypa/distlib/issue/32/
- maker.set_mode = True
-
- # Generate the console and GUI entry points specified in the wheel
- scripts_to_generate = get_console_script_specs(console)
-
- gui_scripts_to_generate = list(starmap("{} = {}".format, gui.items()))
-
- generated_console_scripts = maker.make_multiple(scripts_to_generate)
- generated.extend(generated_console_scripts)
-
- generated.extend(maker.make_multiple(gui_scripts_to_generate, {"gui": True}))
-
- if warn_script_location:
- msg = message_about_scripts_not_on_PATH(generated_console_scripts)
- if msg is not None:
- logger.warning(msg)
-
- generated_file_mode = 0o666 & ~current_umask()
-
- @contextlib.contextmanager
- def _generate_file(path: str, **kwargs: Any) -> Generator[BinaryIO, None, None]:
- with adjacent_tmp_file(path, **kwargs) as f:
- yield f
- os.chmod(f.name, generated_file_mode)
- replace(f.name, path)
-
- dest_info_dir = os.path.join(lib_dir, info_dir)
-
- # Record pip as the installer
- installer_path = os.path.join(dest_info_dir, "INSTALLER")
- with _generate_file(installer_path) as installer_file:
- installer_file.write(b"pip\n")
- generated.append(installer_path)
-
- # Record the PEP 610 direct URL reference
- if direct_url is not None:
- direct_url_path = os.path.join(dest_info_dir, DIRECT_URL_METADATA_NAME)
- with _generate_file(direct_url_path) as direct_url_file:
- direct_url_file.write(direct_url.to_json().encode("utf-8"))
- generated.append(direct_url_path)
-
- # Record the REQUESTED file
- if requested:
- requested_path = os.path.join(dest_info_dir, "REQUESTED")
- with open(requested_path, "wb"):
- pass
- generated.append(requested_path)
-
- record_text = distribution.read_text("RECORD")
- record_rows = list(csv.reader(record_text.splitlines()))
-
- rows = get_csv_rows_for_installed(
- record_rows,
- installed=installed,
- changed=changed,
- generated=generated,
- lib_dir=lib_dir,
- )
-
- # Record details of all files installed
- record_path = os.path.join(dest_info_dir, "RECORD")
-
- with _generate_file(record_path, **csv_io_kwargs("w")) as record_file:
- # Explicitly cast to typing.IO[str] as a workaround for the mypy error:
- # "writer" has incompatible type "BinaryIO"; expected "_Writer"
- writer = csv.writer(cast("IO[str]", record_file))
- writer.writerows(_normalized_outrows(rows))
-
-
-@contextlib.contextmanager
-def req_error_context(req_description: str) -> Generator[None, None, None]:
- try:
- yield
- except InstallationError as e:
- message = "For req: {}. {}".format(req_description, e.args[0])
- raise InstallationError(message) from e
-
-
-def install_wheel(
- name: str,
- wheel_path: str,
- scheme: Scheme,
- req_description: str,
- pycompile: bool = True,
- warn_script_location: bool = True,
- direct_url: Optional[DirectUrl] = None,
- requested: bool = False,
-) -> None:
- with ZipFile(wheel_path, allowZip64=True) as z:
- with req_error_context(req_description):
- _install_wheel(
- name=name,
- wheel_zip=z,
- wheel_path=wheel_path,
- scheme=scheme,
- pycompile=pycompile,
- warn_script_location=warn_script_location,
- direct_url=direct_url,
- requested=requested,
- )
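Much of the bookkeeping in `_install_wheel()` exists to rebuild the wheel's `RECORD` file: every installed file gets a `(path, hash, size)` row, where the hash is the urlsafe-base64 SHA-256 digest with the `=` padding stripped, exactly as `rehash()` computes it. A self-contained sketch of one such row:

```python
import hashlib
from base64 import urlsafe_b64encode

data = b"print('hello')\n"  # hypothetical contents of an installed file
digest = urlsafe_b64encode(hashlib.sha256(data).digest()).decode("latin1")
row = ("demo/__init__.py", "sha256=" + digest.rstrip("="), str(len(data)))
print(row)  # ('demo/__init__.py', 'sha256=...', '15')
```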
diff --git a/env/lib/python3.9/site-packages/pip/_internal/operations/prepare.py b/env/lib/python3.9/site-packages/pip/_internal/operations/prepare.py
deleted file mode 100644
index df1016e..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/operations/prepare.py
+++ /dev/null
@@ -1,585 +0,0 @@
-"""Prepares a distribution for installation
-"""
-
-# The following comment should be removed at some point in the future.
-# mypy: strict-optional=False
-
-import logging
-import mimetypes
-import os
-import shutil
-from typing import Dict, Iterable, List, Optional
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.distributions import make_distribution_for_install_requirement
-from pip._internal.distributions.installed import InstalledDistribution
-from pip._internal.exceptions import (
- DirectoryUrlHashUnsupported,
- HashMismatch,
- HashUnpinned,
- InstallationError,
- NetworkConnectionError,
- PreviousBuildDirError,
- VcsHashUnsupported,
-)
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.metadata import BaseDistribution
-from pip._internal.models.link import Link
-from pip._internal.models.wheel import Wheel
-from pip._internal.network.download import BatchDownloader, Downloader
-from pip._internal.network.lazy_wheel import (
- HTTPRangeRequestUnsupported,
- dist_from_wheel_url,
-)
-from pip._internal.network.session import PipSession
-from pip._internal.operations.build.build_tracker import BuildTracker
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.utils.hashes import Hashes, MissingHashes
-from pip._internal.utils.logging import indent_log
-from pip._internal.utils.misc import display_path, hide_url, is_installable_dir
-from pip._internal.utils.temp_dir import TempDirectory
-from pip._internal.utils.unpacking import unpack_file
-from pip._internal.vcs import vcs
-
-logger = logging.getLogger(__name__)
-
-
-def _get_prepared_distribution(
- req: InstallRequirement,
- build_tracker: BuildTracker,
- finder: PackageFinder,
- build_isolation: bool,
- check_build_deps: bool,
-) -> BaseDistribution:
- """Prepare a distribution for installation."""
- abstract_dist = make_distribution_for_install_requirement(req)
- with build_tracker.track(req):
- abstract_dist.prepare_distribution_metadata(
- finder, build_isolation, check_build_deps
- )
- return abstract_dist.get_metadata_distribution()
-
-
-def unpack_vcs_link(link: Link, location: str, verbosity: int) -> None:
- vcs_backend = vcs.get_backend_for_scheme(link.scheme)
- assert vcs_backend is not None
- vcs_backend.unpack(location, url=hide_url(link.url), verbosity=verbosity)
-
-
-class File:
- def __init__(self, path: str, content_type: Optional[str]) -> None:
- self.path = path
- if content_type is None:
- self.content_type = mimetypes.guess_type(path)[0]
- else:
- self.content_type = content_type
-
-
-def get_http_url(
- link: Link,
- download: Downloader,
- download_dir: Optional[str] = None,
- hashes: Optional[Hashes] = None,
-) -> File:
- temp_dir = TempDirectory(kind="unpack", globally_managed=True)
- # If a download dir is specified, is the file already downloaded there?
- already_downloaded_path = None
- if download_dir:
- already_downloaded_path = _check_download_dir(link, download_dir, hashes)
-
- if already_downloaded_path:
- from_path = already_downloaded_path
- content_type = None
- else:
- # let's download to a tmp dir
- from_path, content_type = download(link, temp_dir.path)
- if hashes:
- hashes.check_against_path(from_path)
-
- return File(from_path, content_type)
-
-
-def get_file_url(
- link: Link, download_dir: Optional[str] = None, hashes: Optional[Hashes] = None
-) -> File:
- """Get file and optionally check its hash."""
- # If a download dir is specified, is the file already there and valid?
- already_downloaded_path = None
- if download_dir:
- already_downloaded_path = _check_download_dir(link, download_dir, hashes)
-
- if already_downloaded_path:
- from_path = already_downloaded_path
- else:
- from_path = link.file_path
-
- # If --require-hashes is off, `hashes` is either empty, the
- # link's embedded hash, or MissingHashes; it is required to
- # match. If --require-hashes is on, we are satisfied by any
- # hash in `hashes` matching: a URL-based or an option-based
- # one; no internet-sourced hash will be in `hashes`.
- if hashes:
- hashes.check_against_path(from_path)
- return File(from_path, None)
-
-
-def unpack_url(
- link: Link,
- location: str,
- download: Downloader,
- verbosity: int,
- download_dir: Optional[str] = None,
- hashes: Optional[Hashes] = None,
-) -> Optional[File]:
- """Unpack link into location, downloading if required.
-
- :param hashes: A Hashes object, one of whose embedded hashes must match,
- or HashMismatch will be raised. If the Hashes is empty, no matches are
- required, and unhashable types of requirements (like VCS ones, which
- would ordinarily raise HashUnsupported) are allowed.
- """
- # non-editable vcs urls
- if link.is_vcs:
- unpack_vcs_link(link, location, verbosity=verbosity)
- return None
-
- assert not link.is_existing_dir()
-
- # file urls
- if link.is_file:
- file = get_file_url(link, download_dir, hashes=hashes)
-
- # http urls
- else:
- file = get_http_url(
- link,
- download,
- download_dir,
- hashes=hashes,
- )
-
- # unpack the archive to the build dir location. even when only downloading
- # archives, they have to be unpacked to parse dependencies, except wheels
- if not link.is_wheel:
- unpack_file(file.path, location, file.content_type)
-
- return file
-
-
-def _check_download_dir(
- link: Link, download_dir: str, hashes: Optional[Hashes]
-) -> Optional[str]:
- """Check download_dir for previously downloaded file with correct hash
- If a correct file is found return its path else None
- """
- download_path = os.path.join(download_dir, link.filename)
-
- if not os.path.exists(download_path):
- return None
-
- # If already downloaded, does its hash match?
- logger.info("File was already downloaded %s", download_path)
- if hashes:
- try:
- hashes.check_against_path(download_path)
- except HashMismatch:
- logger.warning(
- "Previously-downloaded file %s has bad hash. Re-downloading.",
- download_path,
- )
- os.unlink(download_path)
- return None
- return download_path
-
-
-class RequirementPreparer:
- """Prepares a Requirement"""
-
- def __init__(
- self,
- build_dir: str,
- download_dir: Optional[str],
- src_dir: str,
- build_isolation: bool,
- check_build_deps: bool,
- build_tracker: BuildTracker,
- session: PipSession,
- progress_bar: str,
- finder: PackageFinder,
- require_hashes: bool,
- use_user_site: bool,
- lazy_wheel: bool,
- verbosity: int,
- ) -> None:
- super().__init__()
-
- self.src_dir = src_dir
- self.build_dir = build_dir
- self.build_tracker = build_tracker
- self._session = session
- self._download = Downloader(session, progress_bar)
- self._batch_download = BatchDownloader(session, progress_bar)
- self.finder = finder
-
- # Where still-packed archives should be written to. If None, they are
- # not saved, and are deleted immediately after unpacking.
- self.download_dir = download_dir
-
- # Is build isolation allowed?
- self.build_isolation = build_isolation
-
- # Should check build dependencies?
- self.check_build_deps = check_build_deps
-
- # Should hash-checking be required?
- self.require_hashes = require_hashes
-
- # Should install in user site-packages?
- self.use_user_site = use_user_site
-
- # Should wheels be downloaded lazily?
- self.use_lazy_wheel = lazy_wheel
-
- # How verbose should underlying tooling be?
- self.verbosity = verbosity
-
- # Memoized downloaded files, as mapping of url: path.
- self._downloaded: Dict[str, str] = {}
-
- # Previous "header" printed for a link-based InstallRequirement
- self._previous_requirement_header = ("", "")
-
- def _log_preparing_link(self, req: InstallRequirement) -> None:
- """Provide context for the requirement being prepared."""
- if req.link.is_file and not req.original_link_is_in_wheel_cache:
- message = "Processing %s"
- information = str(display_path(req.link.file_path))
- else:
- message = "Collecting %s"
- information = str(req.req or req)
-
- if (message, information) != self._previous_requirement_header:
- self._previous_requirement_header = (message, information)
- logger.info(message, information)
-
- if req.original_link_is_in_wheel_cache:
- with indent_log():
- logger.info("Using cached %s", req.link.filename)
-
- def _ensure_link_req_src_dir(
- self, req: InstallRequirement, parallel_builds: bool
- ) -> None:
- """Ensure source_dir of a linked InstallRequirement."""
-        # source_dir is only set for editable requirements at this point.
- if req.link.is_wheel:
- # We don't need to unpack wheels, so no need for a source
- # directory.
- return
- assert req.source_dir is None
- if req.link.is_existing_dir():
- # build local directories in-tree
- req.source_dir = req.link.file_path
- return
-
- # We always delete unpacked sdists after pip runs.
- req.ensure_has_source_dir(
- self.build_dir,
- autodelete=True,
- parallel_builds=parallel_builds,
- )
-
- # If a checkout exists, it's unwise to keep going. version
- # inconsistencies are logged later, but do not fail the
- # installation.
- # FIXME: this won't upgrade when there's an existing
- # package unpacked in `req.source_dir`
- # TODO: this check is now probably dead code
- if is_installable_dir(req.source_dir):
-            raise PreviousBuildDirError(
-                "pip can't proceed with requirements '{}' due to a "
-                "pre-existing build directory ({}). This is likely "
-                "due to a previous installation that failed. pip is "
- "being responsible and not assuming it can delete this. "
- "Please delete it and try again.".format(req, req.source_dir)
- )
-
- def _get_linked_req_hashes(self, req: InstallRequirement) -> Hashes:
- # By the time this is called, the requirement's link should have
- # been checked so we can tell what kind of requirements req is
- # and raise some more informative errors than otherwise.
- # (For example, we can raise VcsHashUnsupported for a VCS URL
- # rather than HashMissing.)
- if not self.require_hashes:
- return req.hashes(trust_internet=True)
-
- # We could check these first 2 conditions inside unpack_url
- # and save repetition of conditions, but then we would
- # report less-useful error messages for unhashable
- # requirements, complaining that there's no hash provided.
- if req.link.is_vcs:
- raise VcsHashUnsupported()
- if req.link.is_existing_dir():
- raise DirectoryUrlHashUnsupported()
-
- # Unpinned packages are asking for trouble when a new version
- # is uploaded. This isn't a security check, but it saves users
- # a surprising hash mismatch in the future.
- # file:/// URLs aren't pinnable, so don't complain about them
- # not being pinned.
- if req.original_link is None and not req.is_pinned:
- raise HashUnpinned()
-
- # If known-good hashes are missing for this requirement,
- # shim it with a facade object that will provoke hash
- # computation and then raise a HashMissing exception
- # showing the user what the hash should be.
- return req.hashes(trust_internet=False) or MissingHashes()
-
- def _fetch_metadata_using_lazy_wheel(
- self,
- link: Link,
- ) -> Optional[BaseDistribution]:
- """Fetch metadata using lazy wheel, if possible."""
- if not self.use_lazy_wheel:
- return None
- if self.require_hashes:
- logger.debug("Lazy wheel is not used as hash checking is required")
- return None
- if link.is_file or not link.is_wheel:
-            logger.debug(
-                "Lazy wheel is not used as %r does not point to a remote wheel",
-                link,
-            )
- return None
-
- wheel = Wheel(link.filename)
- name = canonicalize_name(wheel.name)
- logger.info(
- "Obtaining dependency information from %s %s",
- name,
- wheel.version,
- )
- url = link.url.split("#", 1)[0]
- try:
- return dist_from_wheel_url(name, url, self._session)
- except HTTPRangeRequestUnsupported:
- logger.debug("%s does not support range requests", url)
- return None
-
- def _complete_partial_requirements(
- self,
- partially_downloaded_reqs: Iterable[InstallRequirement],
- parallel_builds: bool = False,
- ) -> None:
- """Download any requirements which were only fetched by metadata."""
- # Download to a temporary directory. These will be copied over as
- # needed for downstream 'download', 'wheel', and 'install' commands.
- temp_dir = TempDirectory(kind="unpack", globally_managed=True).path
-
- # Map each link to the requirement that owns it. This allows us to set
- # `req.local_file_path` on the appropriate requirement after passing
- # all the links at once into BatchDownloader.
- links_to_fully_download: Dict[Link, InstallRequirement] = {}
- for req in partially_downloaded_reqs:
- assert req.link
- links_to_fully_download[req.link] = req
-
- batch_download = self._batch_download(
- links_to_fully_download.keys(),
- temp_dir,
- )
- for link, (filepath, _) in batch_download:
- logger.debug("Downloading link %s to %s", link, filepath)
- req = links_to_fully_download[link]
- req.local_file_path = filepath
-
- # This step is necessary to ensure all lazy wheels are processed
- # successfully by the 'download', 'wheel', and 'install' commands.
- for req in partially_downloaded_reqs:
- self._prepare_linked_requirement(req, parallel_builds)
-
- def prepare_linked_requirement(
- self, req: InstallRequirement, parallel_builds: bool = False
- ) -> BaseDistribution:
- """Prepare a requirement to be obtained from req.link."""
- assert req.link
- link = req.link
- self._log_preparing_link(req)
- with indent_log():
- # Check if the relevant file is already available
- # in the download directory
- file_path = None
- if self.download_dir is not None and link.is_wheel:
- hashes = self._get_linked_req_hashes(req)
- file_path = _check_download_dir(req.link, self.download_dir, hashes)
-
- if file_path is not None:
- # The file is already available, so mark it as downloaded
- self._downloaded[req.link.url] = file_path
- else:
- # The file is not available, attempt to fetch only metadata
- wheel_dist = self._fetch_metadata_using_lazy_wheel(link)
- if wheel_dist is not None:
- req.needs_more_preparation = True
- return wheel_dist
-
- # None of the optimizations worked, fully prepare the requirement
- return self._prepare_linked_requirement(req, parallel_builds)
-
- def prepare_linked_requirements_more(
- self, reqs: Iterable[InstallRequirement], parallel_builds: bool = False
- ) -> None:
- """Prepare linked requirements more, if needed."""
- reqs = [req for req in reqs if req.needs_more_preparation]
- for req in reqs:
- # Determine if any of these requirements were already downloaded.
- if self.download_dir is not None and req.link.is_wheel:
- hashes = self._get_linked_req_hashes(req)
- file_path = _check_download_dir(req.link, self.download_dir, hashes)
- if file_path is not None:
- self._downloaded[req.link.url] = file_path
- req.needs_more_preparation = False
-
- # Prepare requirements we found were already downloaded for some
- # reason. The other downloads will be completed separately.
- partially_downloaded_reqs: List[InstallRequirement] = []
- for req in reqs:
- if req.needs_more_preparation:
- partially_downloaded_reqs.append(req)
- else:
- self._prepare_linked_requirement(req, parallel_builds)
-
- # TODO: separate this part out from RequirementPreparer when the v1
- # resolver can be removed!
- self._complete_partial_requirements(
- partially_downloaded_reqs,
- parallel_builds=parallel_builds,
- )
-
- def _prepare_linked_requirement(
- self, req: InstallRequirement, parallel_builds: bool
- ) -> BaseDistribution:
- assert req.link
- link = req.link
-
- self._ensure_link_req_src_dir(req, parallel_builds)
- hashes = self._get_linked_req_hashes(req)
-
- if link.is_existing_dir():
- local_file = None
- elif link.url not in self._downloaded:
- try:
- local_file = unpack_url(
- link,
- req.source_dir,
- self._download,
- self.verbosity,
- self.download_dir,
- hashes,
- )
- except NetworkConnectionError as exc:
- raise InstallationError(
- "Could not install requirement {} because of HTTP "
- "error {} for URL {}".format(req, exc, link)
- )
- else:
- file_path = self._downloaded[link.url]
- if hashes:
- hashes.check_against_path(file_path)
- local_file = File(file_path, content_type=None)
-
- # For use in later processing,
- # preserve the file path on the requirement.
- if local_file:
- req.local_file_path = local_file.path
-
- dist = _get_prepared_distribution(
- req,
- self.build_tracker,
- self.finder,
- self.build_isolation,
- self.check_build_deps,
- )
- return dist
-
- def save_linked_requirement(self, req: InstallRequirement) -> None:
- assert self.download_dir is not None
- assert req.link is not None
- link = req.link
- if link.is_vcs or (link.is_existing_dir() and req.editable):
- # Make a .zip of the source_dir we already created.
- req.archive(self.download_dir)
- return
-
- if link.is_existing_dir():
- logger.debug(
- "Not copying link to destination directory "
- "since it is a directory: %s",
- link,
- )
- return
- if req.local_file_path is None:
- # No distribution was downloaded for this requirement.
- return
-
- download_location = os.path.join(self.download_dir, link.filename)
- if not os.path.exists(download_location):
- shutil.copy(req.local_file_path, download_location)
- download_path = display_path(download_location)
- logger.info("Saved %s", download_path)
-
- def prepare_editable_requirement(
- self,
- req: InstallRequirement,
- ) -> BaseDistribution:
- """Prepare an editable requirement."""
- assert req.editable, "cannot prepare a non-editable req as editable"
-
- logger.info("Obtaining %s", req)
-
- with indent_log():
- if self.require_hashes:
- raise InstallationError(
- "The editable requirement {} cannot be installed when "
- "requiring hashes, because there is no single file to "
- "hash.".format(req)
- )
- req.ensure_has_source_dir(self.src_dir)
- req.update_editable()
-
- dist = _get_prepared_distribution(
- req,
- self.build_tracker,
- self.finder,
- self.build_isolation,
- self.check_build_deps,
- )
-
- req.check_if_exists(self.use_user_site)
-
- return dist
-
- def prepare_installed_requirement(
- self,
- req: InstallRequirement,
- skip_reason: str,
- ) -> BaseDistribution:
- """Prepare an already-installed requirement."""
- assert req.satisfied_by, "req should have been satisfied but isn't"
-        assert skip_reason is not None, (
-            "did not get a skip reason even though req.satisfied_by "
-            "is set to {}".format(req.satisfied_by)
-        )
- logger.info(
- "Requirement %s: %s (%s)", skip_reason, req, req.satisfied_by.version
- )
- with indent_log():
- if self.require_hashes:
- logger.debug(
- "Since it is already installed, we are trusting this "
- "package without checking its hash. To ensure a "
- "completely repeatable environment, install into an "
- "empty virtualenv."
- )
- return InstalledDistribution(req).get_metadata_distribution()
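A recurring theme in `RequirementPreparer` is hash verification through `Hashes.check_against_path()`, which raises `HashMismatch` when no allowed digest matches (see `_check_download_dir()` above). A minimal sketch using pip's internal `Hashes` helper and a hypothetical file path:

```python
import hashlib
from pathlib import Path

from pip._internal.utils.hashes import Hashes  # internal API, illustrative only

path = "/tmp/downloads/pkg-1.0-py3-none-any.whl"  # hypothetical download
expected = hashlib.sha256(Path(path).read_bytes()).hexdigest()

# Raises HashMismatch if the file's SHA-256 is not among the allowed digests;
# here the digest was computed from the file itself, so the check passes.
Hashes({"sha256": [expected]}).check_against_path(path)
```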
diff --git a/env/lib/python3.9/site-packages/pip/_internal/pyproject.py b/env/lib/python3.9/site-packages/pip/_internal/pyproject.py
deleted file mode 100644
index 1e9119f..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/pyproject.py
+++ /dev/null
@@ -1,175 +0,0 @@
-import importlib.util
-import os
-from collections import namedtuple
-from typing import Any, List, Optional
-
-from pip._vendor import tomli
-from pip._vendor.packaging.requirements import InvalidRequirement, Requirement
-
-from pip._internal.exceptions import (
- InstallationError,
- InvalidPyProjectBuildRequires,
- MissingPyProjectBuildRequires,
-)
-
-
-def _is_list_of_str(obj: Any) -> bool:
- return isinstance(obj, list) and all(isinstance(item, str) for item in obj)
-
-
-def make_pyproject_path(unpacked_source_directory: str) -> str:
- return os.path.join(unpacked_source_directory, "pyproject.toml")
-
-
-BuildSystemDetails = namedtuple(
- "BuildSystemDetails", ["requires", "backend", "check", "backend_path"]
-)
-
-
-def load_pyproject_toml(
- use_pep517: Optional[bool], pyproject_toml: str, setup_py: str, req_name: str
-) -> Optional[BuildSystemDetails]:
- """Load the pyproject.toml file.
-
- Parameters:
- use_pep517 - Has the user requested PEP 517 processing? None
- means the user hasn't explicitly specified.
- pyproject_toml - Location of the project's pyproject.toml file
- setup_py - Location of the project's setup.py file
- req_name - The name of the requirement we're processing (for
- error reporting)
-
- Returns:
- None if we should use the legacy code path, otherwise a tuple
- (
- requirements from pyproject.toml,
- name of PEP 517 backend,
- requirements we should check are installed after setting
-            up the build environment,
- directory paths to import the backend from (backend-path),
- relative to the project root.
- )
- """
- has_pyproject = os.path.isfile(pyproject_toml)
- has_setup = os.path.isfile(setup_py)
-
- if not has_pyproject and not has_setup:
- raise InstallationError(
- f"{req_name} does not appear to be a Python project: "
- f"neither 'setup.py' nor 'pyproject.toml' found."
- )
-
- if has_pyproject:
- with open(pyproject_toml, encoding="utf-8") as f:
- pp_toml = tomli.loads(f.read())
- build_system = pp_toml.get("build-system")
- else:
- build_system = None
-
- # The following cases must use PEP 517
- # We check for use_pep517 being non-None and falsey because that means
- # the user explicitly requested --no-use-pep517. The value 0 as
- # opposed to False can occur when the value is provided via an
- # environment variable or config file option (due to the quirk of
- # strtobool() returning an integer in pip's configuration code).
- if has_pyproject and not has_setup:
- if use_pep517 is not None and not use_pep517:
- raise InstallationError(
- "Disabling PEP 517 processing is invalid: "
- "project does not have a setup.py"
- )
- use_pep517 = True
- elif build_system and "build-backend" in build_system:
- if use_pep517 is not None and not use_pep517:
- raise InstallationError(
- "Disabling PEP 517 processing is invalid: "
- "project specifies a build backend of {} "
- "in pyproject.toml".format(build_system["build-backend"])
- )
- use_pep517 = True
-
- # If we haven't worked out whether to use PEP 517 yet,
- # and the user hasn't explicitly stated a preference,
- # we do so if the project has a pyproject.toml file
- # or if we cannot import setuptools.
-
-    # We fall back to PEP 517 when setuptools is unavailable,
- # so setuptools can be installed as a default build backend.
- # For more info see:
- # https://discuss.python.org/t/pip-without-setuptools-could-the-experience-be-improved/11810/9
- elif use_pep517 is None:
- use_pep517 = has_pyproject or not importlib.util.find_spec("setuptools")
-
- # At this point, we know whether we're going to use PEP 517.
- assert use_pep517 is not None
-
- # If we're using the legacy code path, there is nothing further
- # for us to do here.
- if not use_pep517:
- return None
-
- if build_system is None:
- # Either the user has a pyproject.toml with no build-system
- # section, or the user has no pyproject.toml, but has opted in
- # explicitly via --use-pep517.
- # In the absence of any explicit backend specification, we
- # assume the setuptools backend that most closely emulates the
- # traditional direct setup.py execution, and require wheel and
- # a version of setuptools that supports that backend.
-
- build_system = {
- "requires": ["setuptools>=40.8.0", "wheel"],
- "build-backend": "setuptools.build_meta:__legacy__",
- }
-
- # If we're using PEP 517, we have build system information (either
- # from pyproject.toml, or defaulted by the code above).
- # Note that at this point, we do not know if the user has actually
- # specified a backend, though.
- assert build_system is not None
-
- # Ensure that the build-system section in pyproject.toml conforms
- # to PEP 518.
-
- # Specifying the build-system table but not the requires key is invalid
- if "requires" not in build_system:
- raise MissingPyProjectBuildRequires(package=req_name)
-
- # Error out if requires is not a list of strings
- requires = build_system["requires"]
- if not _is_list_of_str(requires):
- raise InvalidPyProjectBuildRequires(
- package=req_name,
- reason="It is not a list of strings.",
- )
-
- # Each requirement must be valid as per PEP 508
- for requirement in requires:
- try:
- Requirement(requirement)
- except InvalidRequirement as error:
- raise InvalidPyProjectBuildRequires(
- package=req_name,
- reason=f"It contains an invalid requirement: {requirement!r}",
- ) from error
-
- backend = build_system.get("build-backend")
- backend_path = build_system.get("backend-path", [])
- check: List[str] = []
- if backend is None:
- # If the user didn't specify a backend, we assume they want to use
- # the setuptools backend. But we can't be sure they have included
- # a version of setuptools which supplies the backend, or wheel
- # (which is needed by the backend) in their requirements. So we
- # make a note to check that those requirements are present once
- # we have set up the environment.
- # This is quite a lot of work to check for a very specific case. But
- # the problem is, that case is potentially quite common - projects that
- # adopted PEP 518 early for the ability to specify requirements to
- # execute setup.py, but never considered needing to mention the build
- # tools themselves. The original PEP 518 code had a similar check (but
- # implemented in a different way).
- backend = "setuptools.build_meta:__legacy__"
- check = ["setuptools>=40.8.0", "wheel"]
-
- return BuildSystemDetails(requires, backend, check, backend_path)
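-
-
-# A minimal usage sketch (not part of pip itself), for a project that
-# declares its backend in pyproject.toml; all values shown are illustrative:
-#
-#     >>> details = load_pyproject_toml(
-#     ...     use_pep517=None,
-#     ...     pyproject_toml="/path/to/proj/pyproject.toml",
-#     ...     setup_py="/path/to/proj/setup.py",
-#     ...     req_name="proj",
-#     ... )
-#     >>> details.backend
-#     'setuptools.build_meta'
-#     >>> details.requires
-#     ['setuptools>=40.8.0', 'wheel']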
diff --git a/env/lib/python3.9/site-packages/pip/_internal/req/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/req/__init__.py
deleted file mode 100644
index 8d56359..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/req/__init__.py
+++ /dev/null
@@ -1,94 +0,0 @@
-import collections
-import logging
-from typing import Generator, List, Optional, Sequence, Tuple
-
-from pip._internal.utils.logging import indent_log
-
-from .req_file import parse_requirements
-from .req_install import InstallRequirement
-from .req_set import RequirementSet
-
-__all__ = [
- "RequirementSet",
- "InstallRequirement",
- "parse_requirements",
- "install_given_reqs",
-]
-
-logger = logging.getLogger(__name__)
-
-
-class InstallationResult:
- def __init__(self, name: str) -> None:
- self.name = name
-
- def __repr__(self) -> str:
- return f"InstallationResult(name={self.name!r})"
-
-
-def _validate_requirements(
- requirements: List[InstallRequirement],
-) -> Generator[Tuple[str, InstallRequirement], None, None]:
- for req in requirements:
- assert req.name, f"invalid to-be-installed requirement: {req}"
- yield req.name, req
-
-
-def install_given_reqs(
- requirements: List[InstallRequirement],
- install_options: List[str],
- global_options: Sequence[str],
- root: Optional[str],
- home: Optional[str],
- prefix: Optional[str],
- warn_script_location: bool,
- use_user_site: bool,
- pycompile: bool,
-) -> List[InstallationResult]:
- """
- Install everything in the given list.
-
- (to be called after having downloaded and unpacked the packages)
- """
- to_install = collections.OrderedDict(_validate_requirements(requirements))
-
- if to_install:
- logger.info(
- "Installing collected packages: %s",
- ", ".join(to_install.keys()),
- )
-
- installed = []
-
- with indent_log():
- for req_name, requirement in to_install.items():
- if requirement.should_reinstall:
- logger.info("Attempting uninstall: %s", req_name)
- with indent_log():
- uninstalled_pathset = requirement.uninstall(auto_confirm=True)
- else:
- uninstalled_pathset = None
-
- try:
- requirement.install(
- install_options,
- global_options,
- root=root,
- home=home,
- prefix=prefix,
- warn_script_location=warn_script_location,
- use_user_site=use_user_site,
- pycompile=pycompile,
- )
- except Exception:
- # if install did not succeed, rollback previous uninstall
- if uninstalled_pathset and not requirement.install_succeeded:
- uninstalled_pathset.rollback()
- raise
- else:
- if uninstalled_pathset and requirement.install_succeeded:
- uninstalled_pathset.commit()
-
- installed.append(InstallationResult(req_name))
-
- return installed
diff --git a/env/lib/python3.9/site-packages/pip/_internal/req/constructors.py b/env/lib/python3.9/site-packages/pip/_internal/req/constructors.py
deleted file mode 100644
index dea7c3b..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/req/constructors.py
+++ /dev/null
@@ -1,501 +0,0 @@
-"""Backing implementation for InstallRequirement's various constructors
-
-The idea here is that these formed a major chunk of InstallRequirement's size,
-so moving them and the support code dedicated to them outside of that class
-makes the rest of the code easier to understand.
-
-These are meant to be used elsewhere within pip to create instances of
-InstallRequirement.
-"""
-
-import logging
-import os
-import re
-from typing import Any, Dict, Optional, Set, Tuple, Union
-
-from pip._vendor.packaging.markers import Marker
-from pip._vendor.packaging.requirements import InvalidRequirement, Requirement
-from pip._vendor.packaging.specifiers import Specifier
-
-from pip._internal.exceptions import InstallationError
-from pip._internal.models.index import PyPI, TestPyPI
-from pip._internal.models.link import Link
-from pip._internal.models.wheel import Wheel
-from pip._internal.req.req_file import ParsedRequirement
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.utils.filetypes import is_archive_file
-from pip._internal.utils.misc import is_installable_dir
-from pip._internal.utils.packaging import get_requirement
-from pip._internal.utils.urls import path_to_url
-from pip._internal.vcs import is_url, vcs
-
-__all__ = [
- "install_req_from_editable",
- "install_req_from_line",
- "parse_editable",
-]
-
-logger = logging.getLogger(__name__)
-operators = Specifier._operators.keys()
-
-
-def _strip_extras(path: str) -> Tuple[str, Optional[str]]:
- m = re.match(r"^(.+)(\[[^\]]+\])$", path)
- extras = None
- if m:
- path_no_extras = m.group(1)
- extras = m.group(2)
- else:
- path_no_extras = path
-
- return path_no_extras, extras
-
-
-def convert_extras(extras: Optional[str]) -> Set[str]:
- if not extras:
- return set()
- return get_requirement("placeholder" + extras.lower()).extras
-
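-# A small illustration (not from pip): extras are normalized to lower case
-# by parsing a placeholder requirement string:
-#
-#     >>> convert_extras("[dev,Test]") == {"dev", "test"}
-#     True
-#     >>> convert_extras(None)
-#     set()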
-
-def parse_editable(editable_req: str) -> Tuple[Optional[str], str, Set[str]]:
- """Parses an editable requirement into:
- - a requirement name
-        - a URL
-        - extras
- Accepted requirements:
- svn+http://blahblah@rev#egg=Foobar[baz]&subdirectory=version_subdir
- .[some_extra]
- """
-
- url = editable_req
-
- # If a file path is specified with extras, strip off the extras.
- url_no_extras, extras = _strip_extras(url)
-
- if os.path.isdir(url_no_extras):
- # Treating it as code that has already been checked out
- url_no_extras = path_to_url(url_no_extras)
-
- if url_no_extras.lower().startswith("file:"):
- package_name = Link(url_no_extras).egg_fragment
- if extras:
- return (
- package_name,
- url_no_extras,
- get_requirement("placeholder" + extras.lower()).extras,
- )
- else:
- return package_name, url_no_extras, set()
-
- for version_control in vcs:
- if url.lower().startswith(f"{version_control}:"):
- url = f"{version_control}+{url}"
- break
-
- link = Link(url)
-
- if not link.is_vcs:
- backends = ", ".join(vcs.all_schemes)
- raise InstallationError(
- f"{editable_req} is not a valid editable requirement. "
- f"It should either be a path to a local project or a VCS URL "
- f"(beginning with {backends})."
- )
-
- package_name = link.egg_fragment
- if not package_name:
- raise InstallationError(
- "Could not detect requirement name for '{}', please specify one "
- "with #egg=your_package_name".format(editable_req)
- )
- return package_name, url, set()
-
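-# Illustrative outcomes (not from pip; the URL is hypothetical):
-#
-#     >>> parse_editable("git+https://example.com/proj.git#egg=proj")
-#     ('proj', 'git+https://example.com/proj.git#egg=proj', set())
-#
-# and, assuming the current directory is a project checked out at /src/proj:
-#
-#     >>> parse_editable(".[dev]")
-#     (None, 'file:///src/proj', {'dev'})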
-
-def check_first_requirement_in_file(filename: str) -> None:
- """Check if file is parsable as a requirements file.
-
- This is heavily based on ``pkg_resources.parse_requirements``, but
- simplified to just check the first meaningful line.
-
- :raises InvalidRequirement: If the first meaningful line cannot be parsed
-    as a requirement.
- """
- with open(filename, encoding="utf-8", errors="ignore") as f:
- # Create a steppable iterator, so we can handle \-continuations.
- lines = (
- line
- for line in (line.strip() for line in f)
- if line and not line.startswith("#") # Skip blank lines/comments.
- )
-
- for line in lines:
- # Drop comments -- a hash without a space may be in a URL.
- if " #" in line:
- line = line[: line.find(" #")]
- # If there is a line continuation, drop it, and append the next line.
- if line.endswith("\\"):
- line = line[:-2].strip() + next(lines, "")
- Requirement(line)
- return
-
-
-def deduce_helpful_msg(req: str) -> str:
- """Returns helpful msg in case requirements file does not exist,
- or cannot be parsed.
-
- :params req: Requirements file path
- """
- if not os.path.exists(req):
- return f" File '{req}' does not exist."
- msg = " The path does exist. "
- # Try to parse and check if it is a requirements file.
- try:
- check_first_requirement_in_file(req)
- except InvalidRequirement:
- logger.debug("Cannot parse '%s' as requirements file", req)
- else:
- msg += (
- f"The argument you provided "
- f"({req}) appears to be a"
- f" requirements file. If that is the"
- f" case, use the '-r' flag to install"
- f" the packages specified within it."
- )
- return msg
-
-
-class RequirementParts:
- def __init__(
- self,
- requirement: Optional[Requirement],
- link: Optional[Link],
- markers: Optional[Marker],
- extras: Set[str],
- ):
- self.requirement = requirement
- self.link = link
- self.markers = markers
- self.extras = extras
-
-
-def parse_req_from_editable(editable_req: str) -> RequirementParts:
- name, url, extras_override = parse_editable(editable_req)
-
- if name is not None:
- try:
- req: Optional[Requirement] = Requirement(name)
- except InvalidRequirement:
- raise InstallationError(f"Invalid requirement: '{name}'")
- else:
- req = None
-
- link = Link(url)
-
- return RequirementParts(req, link, None, extras_override)
-
-
-# ---- The actual constructors follow ----
-
-
-def install_req_from_editable(
- editable_req: str,
- comes_from: Optional[Union[InstallRequirement, str]] = None,
- use_pep517: Optional[bool] = None,
- isolated: bool = False,
- options: Optional[Dict[str, Any]] = None,
- constraint: bool = False,
- user_supplied: bool = False,
- permit_editable_wheels: bool = False,
- config_settings: Optional[Dict[str, str]] = None,
-) -> InstallRequirement:
-
- parts = parse_req_from_editable(editable_req)
-
- return InstallRequirement(
- parts.requirement,
- comes_from=comes_from,
- user_supplied=user_supplied,
- editable=True,
- permit_editable_wheels=permit_editable_wheels,
- link=parts.link,
- constraint=constraint,
- use_pep517=use_pep517,
- isolated=isolated,
- install_options=options.get("install_options", []) if options else [],
- global_options=options.get("global_options", []) if options else [],
- hash_options=options.get("hashes", {}) if options else {},
- config_settings=config_settings,
- extras=parts.extras,
- )
-
-
-def _looks_like_path(name: str) -> bool:
- """Checks whether the string "looks like" a path on the filesystem.
-
-    This does not check whether the target actually exists; it only judges
-    from the appearance.
-
- Returns true if any of the following conditions is true:
- * a path separator is found (either os.path.sep or os.path.altsep);
- * a dot is found (which represents the current directory).
- """
- if os.path.sep in name:
- return True
- if os.path.altsep is not None and os.path.altsep in name:
- return True
- if name.startswith("."):
- return True
- return False
-
-
-def _get_url_from_path(path: str, name: str) -> Optional[str]:
- """
-    First, check whether the provided path is an installable directory
-    (e.g. it contains a setup.py or a pyproject.toml). If it is, return
-    its file URL.
-
-    Otherwise, check whether the path names an archive file (such as a
-    .whl). If it does not, return None. If the archive file exists, return
-    its file URL. If it does not exist but the name contains an '@' (and
-    the part before the '@' does not look like a path), return None so the
-    name is treated as a PEP 440 URL requirement; otherwise emit a warning
-    and return the file URL anyway.
- """
- if _looks_like_path(name) and os.path.isdir(path):
- if is_installable_dir(path):
- return path_to_url(path)
- # TODO: The is_installable_dir test here might not be necessary
- # now that it is done in load_pyproject_toml too.
- raise InstallationError(
- f"Directory {name!r} is not installable. Neither 'setup.py' "
- "nor 'pyproject.toml' found."
- )
- if not is_archive_file(path):
- return None
- if os.path.isfile(path):
- return path_to_url(path)
- urlreq_parts = name.split("@", 1)
- if len(urlreq_parts) >= 2 and not _looks_like_path(urlreq_parts[0]):
- # If the path contains '@' and the part before it does not look
- # like a path, try to treat it as a PEP 440 URL req instead.
- return None
- logger.warning(
- "Requirement %r looks like a filename, but the file does not exist",
- name,
- )
- return path_to_url(path)
-
-
-def parse_req_from_line(name: str, line_source: Optional[str]) -> RequirementParts:
- if is_url(name):
- marker_sep = "; "
- else:
- marker_sep = ";"
- if marker_sep in name:
- name, markers_as_string = name.split(marker_sep, 1)
- markers_as_string = markers_as_string.strip()
- if not markers_as_string:
- markers = None
- else:
- markers = Marker(markers_as_string)
- else:
- markers = None
- name = name.strip()
- req_as_string = None
- path = os.path.normpath(os.path.abspath(name))
- link = None
- extras_as_string = None
-
- if is_url(name):
- link = Link(name)
- else:
- p, extras_as_string = _strip_extras(path)
- url = _get_url_from_path(p, name)
- if url is not None:
- link = Link(url)
-
- # it's a local file, dir, or url
- if link:
- # Handle relative file URLs
- if link.scheme == "file" and re.search(r"\.\./", link.url):
- link = Link(path_to_url(os.path.normpath(os.path.abspath(link.path))))
- # wheel file
- if link.is_wheel:
- wheel = Wheel(link.filename) # can raise InvalidWheelFilename
- req_as_string = f"{wheel.name}=={wheel.version}"
- else:
- # set the req to the egg fragment. when it's not there, this
- # will become an 'unnamed' requirement
- req_as_string = link.egg_fragment
-
- # a requirement specifier
- else:
- req_as_string = name
-
- extras = convert_extras(extras_as_string)
-
- def with_source(text: str) -> str:
- if not line_source:
- return text
- return f"{text} (from {line_source})"
-
- def _parse_req_string(req_as_string: str) -> Requirement:
- try:
- req = get_requirement(req_as_string)
- except InvalidRequirement:
- if os.path.sep in req_as_string:
- add_msg = "It looks like a path."
- add_msg += deduce_helpful_msg(req_as_string)
- elif "=" in req_as_string and not any(
- op in req_as_string for op in operators
- ):
- add_msg = "= is not a valid operator. Did you mean == ?"
- else:
- add_msg = ""
- msg = with_source(f"Invalid requirement: {req_as_string!r}")
- if add_msg:
- msg += f"\nHint: {add_msg}"
- raise InstallationError(msg)
- else:
- # Deprecate extras after specifiers: "name>=1.0[extras]"
- # This currently works by accident because _strip_extras() parses
- # any extras in the end of the string and those are saved in
- # RequirementParts
- for spec in req.specifier:
- spec_str = str(spec)
- if spec_str.endswith("]"):
- msg = f"Extras after version '{spec_str}'."
- raise InstallationError(msg)
- return req
-
- if req_as_string is not None:
- req: Optional[Requirement] = _parse_req_string(req_as_string)
- else:
- req = None
-
- return RequirementParts(req, link, markers, extras)
-
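-# Illustrative results (not from pip; the URL is hypothetical):
-#
-#     >>> parts = parse_req_from_line("requests>=2.0; python_version<'3.11'", None)
-#     >>> parts.requirement, str(parts.markers), parts.link
-#     (<Requirement('requests>=2.0')>, 'python_version < "3.11"', None)
-#
-#     >>> parts = parse_req_from_line("https://example.com/pkg.tar.gz#egg=pkg", None)
-#     >>> parts.requirement.name, parts.link.url
-#     ('pkg', 'https://example.com/pkg.tar.gz#egg=pkg')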
-
-def install_req_from_line(
- name: str,
- comes_from: Optional[Union[str, InstallRequirement]] = None,
- use_pep517: Optional[bool] = None,
- isolated: bool = False,
- options: Optional[Dict[str, Any]] = None,
- constraint: bool = False,
- line_source: Optional[str] = None,
- user_supplied: bool = False,
- config_settings: Optional[Dict[str, str]] = None,
-) -> InstallRequirement:
- """Creates an InstallRequirement from a name, which might be a
- requirement, directory containing 'setup.py', filename, or URL.
-
- :param line_source: An optional string describing where the line is from,
- for logging purposes in case of an error.
- """
- parts = parse_req_from_line(name, line_source)
-
- return InstallRequirement(
- parts.requirement,
- comes_from,
- link=parts.link,
- markers=parts.markers,
- use_pep517=use_pep517,
- isolated=isolated,
- install_options=options.get("install_options", []) if options else [],
- global_options=options.get("global_options", []) if options else [],
- hash_options=options.get("hashes", {}) if options else {},
- config_settings=config_settings,
- constraint=constraint,
- extras=parts.extras,
- user_supplied=user_supplied,
- )
-
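-# A minimal usage sketch (not from pip itself):
-#
-#     >>> ireq = install_req_from_line("requests>=2.0", line_source="example")
-#     >>> ireq.name, str(ireq.specifier)
-#     ('requests', '>=2.0')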
-
-def install_req_from_req_string(
- req_string: str,
- comes_from: Optional[InstallRequirement] = None,
- isolated: bool = False,
- use_pep517: Optional[bool] = None,
- user_supplied: bool = False,
- config_settings: Optional[Dict[str, str]] = None,
-) -> InstallRequirement:
- try:
- req = get_requirement(req_string)
- except InvalidRequirement:
- raise InstallationError(f"Invalid requirement: '{req_string}'")
-
- domains_not_allowed = [
- PyPI.file_storage_domain,
- TestPyPI.file_storage_domain,
- ]
- if (
- req.url
- and comes_from
- and comes_from.link
- and comes_from.link.netloc in domains_not_allowed
- ):
- # Explicitly disallow pypi packages that depend on external urls
- raise InstallationError(
- "Packages installed from PyPI cannot depend on packages "
- "which are not also hosted on PyPI.\n"
- "{} depends on {} ".format(comes_from.name, req)
- )
-
- return InstallRequirement(
- req,
- comes_from,
- isolated=isolated,
- use_pep517=use_pep517,
- user_supplied=user_supplied,
- config_settings=config_settings,
- )
-
-
-def install_req_from_parsed_requirement(
- parsed_req: ParsedRequirement,
- isolated: bool = False,
- use_pep517: Optional[bool] = None,
- user_supplied: bool = False,
- config_settings: Optional[Dict[str, str]] = None,
-) -> InstallRequirement:
- if parsed_req.is_editable:
- req = install_req_from_editable(
- parsed_req.requirement,
- comes_from=parsed_req.comes_from,
- use_pep517=use_pep517,
- constraint=parsed_req.constraint,
- isolated=isolated,
- user_supplied=user_supplied,
- config_settings=config_settings,
- )
-
- else:
- req = install_req_from_line(
- parsed_req.requirement,
- comes_from=parsed_req.comes_from,
- use_pep517=use_pep517,
- isolated=isolated,
- options=parsed_req.options,
- constraint=parsed_req.constraint,
- line_source=parsed_req.line_source,
- user_supplied=user_supplied,
- config_settings=config_settings,
- )
- return req
-
-
-def install_req_from_link_and_ireq(
- link: Link, ireq: InstallRequirement
-) -> InstallRequirement:
- return InstallRequirement(
- req=ireq.req,
- comes_from=ireq.comes_from,
- editable=ireq.editable,
- link=link,
- markers=ireq.markers,
- use_pep517=ireq.use_pep517,
- isolated=ireq.isolated,
- install_options=ireq.install_options,
- global_options=ireq.global_options,
- hash_options=ireq.hash_options,
- config_settings=ireq.config_settings,
- user_supplied=ireq.user_supplied,
- )
diff --git a/env/lib/python3.9/site-packages/pip/_internal/req/req_file.py b/env/lib/python3.9/site-packages/pip/_internal/req/req_file.py
deleted file mode 100644
index 4550c72..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/req/req_file.py
+++ /dev/null
@@ -1,540 +0,0 @@
-"""
-Requirements file parsing
-"""
-
-import optparse
-import os
-import re
-import shlex
-import urllib.parse
-from optparse import Values
-from typing import (
- TYPE_CHECKING,
- Any,
- Callable,
- Dict,
- Generator,
- Iterable,
- List,
- Optional,
- Tuple,
-)
-
-from pip._internal.cli import cmdoptions
-from pip._internal.exceptions import InstallationError, RequirementsFileParseError
-from pip._internal.models.search_scope import SearchScope
-from pip._internal.network.session import PipSession
-from pip._internal.network.utils import raise_for_status
-from pip._internal.utils.encoding import auto_decode
-from pip._internal.utils.urls import get_url_scheme
-
-if TYPE_CHECKING:
- # NoReturn introduced in 3.6.2; imported only for type checking to maintain
- # pip compatibility with older patch versions of Python 3.6
- from typing import NoReturn
-
- from pip._internal.index.package_finder import PackageFinder
-
-__all__ = ["parse_requirements"]
-
-ReqFileLines = Iterable[Tuple[int, str]]
-
-LineParser = Callable[[str], Tuple[str, Values]]
-
-SCHEME_RE = re.compile(r"^(http|https|file):", re.I)
-COMMENT_RE = re.compile(r"(^|\s+)#.*$")
-
-# Matches environment variable-style values in '${MY_VARIABLE_1}' with the
-# variable name consisting of only uppercase letters, digits or the '_'
-# (underscore). This follows the POSIX standard defined in IEEE Std 1003.1,
-# 2013 Edition.
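-# For example, "${API_KEY}" matches, while "$API_KEY" and "${api_key}" do
-# not (see expand_env_variables below).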
-ENV_VAR_RE = re.compile(r"(?P<var>\$\{(?P<name>[A-Z0-9_]+)\})")
-
-SUPPORTED_OPTIONS: List[Callable[..., optparse.Option]] = [
- cmdoptions.index_url,
- cmdoptions.extra_index_url,
- cmdoptions.no_index,
- cmdoptions.constraints,
- cmdoptions.requirements,
- cmdoptions.editable,
- cmdoptions.find_links,
- cmdoptions.no_binary,
- cmdoptions.only_binary,
- cmdoptions.prefer_binary,
- cmdoptions.require_hashes,
- cmdoptions.pre,
- cmdoptions.trusted_host,
- cmdoptions.use_new_feature,
-]
-
-# options to be passed to requirements
-SUPPORTED_OPTIONS_REQ: List[Callable[..., optparse.Option]] = [
- cmdoptions.install_options,
- cmdoptions.global_options,
- cmdoptions.hash,
-]
-
-# the 'dest' string values
-SUPPORTED_OPTIONS_REQ_DEST = [str(o().dest) for o in SUPPORTED_OPTIONS_REQ]
-
-
-class ParsedRequirement:
- def __init__(
- self,
- requirement: str,
- is_editable: bool,
- comes_from: str,
- constraint: bool,
- options: Optional[Dict[str, Any]] = None,
- line_source: Optional[str] = None,
- ) -> None:
- self.requirement = requirement
- self.is_editable = is_editable
- self.comes_from = comes_from
- self.options = options
- self.constraint = constraint
- self.line_source = line_source
-
-
-class ParsedLine:
- def __init__(
- self,
- filename: str,
- lineno: int,
- args: str,
- opts: Values,
- constraint: bool,
- ) -> None:
- self.filename = filename
- self.lineno = lineno
- self.opts = opts
- self.constraint = constraint
-
- if args:
- self.is_requirement = True
- self.is_editable = False
- self.requirement = args
- elif opts.editables:
- self.is_requirement = True
- self.is_editable = True
- # We don't support multiple -e on one line
- self.requirement = opts.editables[0]
- else:
- self.is_requirement = False
-
-
-def parse_requirements(
- filename: str,
- session: PipSession,
- finder: Optional["PackageFinder"] = None,
- options: Optional[optparse.Values] = None,
- constraint: bool = False,
-) -> Generator[ParsedRequirement, None, None]:
- """Parse a requirements file and yield ParsedRequirement instances.
-
- :param filename: Path or url of requirements file.
- :param session: PipSession instance.
- :param finder: Instance of pip.index.PackageFinder.
- :param options: cli options.
- :param constraint: If true, parsing a constraint file rather than
- requirements file.
- """
- line_parser = get_line_parser(finder)
- parser = RequirementsFileParser(session, line_parser)
-
- for parsed_line in parser.parse(filename, constraint):
- parsed_req = handle_line(
- parsed_line, options=options, finder=finder, session=session
- )
- if parsed_req is not None:
- yield parsed_req
-
-
-def preprocess(content: str) -> ReqFileLines:
- """Split, filter, and join lines, and return a line iterator
-
- :param content: the content of the requirements file
- """
- lines_enum: ReqFileLines = enumerate(content.splitlines(), start=1)
- lines_enum = join_lines(lines_enum)
- lines_enum = ignore_comments(lines_enum)
- lines_enum = expand_env_variables(lines_enum)
- return lines_enum
-
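-# A small illustration (not from pip) of the combined stages:
-#
-#     >>> list(preprocess("pkg-a  # pinned\npkg-b"))
-#     [(1, 'pkg-a'), (2, 'pkg-b')]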
-
-def handle_requirement_line(
- line: ParsedLine,
- options: Optional[optparse.Values] = None,
-) -> ParsedRequirement:
-
- # preserve for the nested code path
- line_comes_from = "{} {} (line {})".format(
- "-c" if line.constraint else "-r",
- line.filename,
- line.lineno,
- )
-
- assert line.is_requirement
-
- if line.is_editable:
- # For editable requirements, we don't support per-requirement
- # options, so just return the parsed requirement.
- return ParsedRequirement(
- requirement=line.requirement,
- is_editable=line.is_editable,
- comes_from=line_comes_from,
- constraint=line.constraint,
- )
- else:
- if options:
- # Disable wheels if the user has specified build options
- cmdoptions.check_install_build_global(options, line.opts)
-
- # get the options that apply to requirements
- req_options = {}
- for dest in SUPPORTED_OPTIONS_REQ_DEST:
- if dest in line.opts.__dict__ and line.opts.__dict__[dest]:
- req_options[dest] = line.opts.__dict__[dest]
-
- line_source = f"line {line.lineno} of {line.filename}"
- return ParsedRequirement(
- requirement=line.requirement,
- is_editable=line.is_editable,
- comes_from=line_comes_from,
- constraint=line.constraint,
- options=req_options,
- line_source=line_source,
- )
-
-
-def handle_option_line(
- opts: Values,
- filename: str,
- lineno: int,
- finder: Optional["PackageFinder"] = None,
- options: Optional[optparse.Values] = None,
- session: Optional[PipSession] = None,
-) -> None:
-
- if options:
- # percolate options upward
- if opts.require_hashes:
- options.require_hashes = opts.require_hashes
- if opts.features_enabled:
- options.features_enabled.extend(
- f for f in opts.features_enabled if f not in options.features_enabled
- )
-
- # set finder options
- if finder:
- find_links = finder.find_links
- index_urls = finder.index_urls
- if opts.index_url:
- index_urls = [opts.index_url]
- if opts.no_index is True:
- index_urls = []
- if opts.extra_index_urls:
- index_urls.extend(opts.extra_index_urls)
- if opts.find_links:
- # FIXME: it would be nice to keep track of the source
- # of the find_links: support a find-links local path
- # relative to a requirements file.
- value = opts.find_links[0]
- req_dir = os.path.dirname(os.path.abspath(filename))
- relative_to_reqs_file = os.path.join(req_dir, value)
- if os.path.exists(relative_to_reqs_file):
- value = relative_to_reqs_file
- find_links.append(value)
-
- if session:
- # We need to update the auth urls in session
- session.update_index_urls(index_urls)
-
- search_scope = SearchScope(
- find_links=find_links,
- index_urls=index_urls,
- )
- finder.search_scope = search_scope
-
- if opts.pre:
- finder.set_allow_all_prereleases()
-
- if opts.prefer_binary:
- finder.set_prefer_binary()
-
- if session:
- for host in opts.trusted_hosts or []:
- source = f"line {lineno} of {filename}"
- session.add_trusted_host(host, source=source)
-
-
-def handle_line(
- line: ParsedLine,
- options: Optional[optparse.Values] = None,
- finder: Optional["PackageFinder"] = None,
- session: Optional[PipSession] = None,
-) -> Optional[ParsedRequirement]:
- """Handle a single parsed requirements line; This can result in
- creating/yielding requirements, or updating the finder.
-
- :param line: The parsed line to be processed.
- :param options: CLI options.
- :param finder: The finder - updated by non-requirement lines.
- :param session: The session - updated by non-requirement lines.
-
- Returns a ParsedRequirement object if the line is a requirement line,
- otherwise returns None.
-
- For lines that contain requirements, the only options that have an effect
- are from SUPPORTED_OPTIONS_REQ, and they are scoped to the
- requirement. Other options from SUPPORTED_OPTIONS may be present, but are
- ignored.
-
- For lines that do not contain requirements, the only options that have an
- effect are from SUPPORTED_OPTIONS. Options from SUPPORTED_OPTIONS_REQ may
- be present, but are ignored. These lines may contain multiple options
-    (although our docs imply only one is supported), and all are parsed and
-    affect the finder.
- """
-
- if line.is_requirement:
- parsed_req = handle_requirement_line(line, options)
- return parsed_req
- else:
- handle_option_line(
- line.opts,
- line.filename,
- line.lineno,
- finder,
- options,
- session,
- )
- return None
-
-
-class RequirementsFileParser:
- def __init__(
- self,
- session: PipSession,
- line_parser: LineParser,
- ) -> None:
- self._session = session
- self._line_parser = line_parser
-
- def parse(
- self, filename: str, constraint: bool
- ) -> Generator[ParsedLine, None, None]:
- """Parse a given file, yielding parsed lines."""
- yield from self._parse_and_recurse(filename, constraint)
-
- def _parse_and_recurse(
- self, filename: str, constraint: bool
- ) -> Generator[ParsedLine, None, None]:
- for line in self._parse_file(filename, constraint):
- if not line.is_requirement and (
- line.opts.requirements or line.opts.constraints
- ):
- # parse a nested requirements file
- if line.opts.requirements:
- req_path = line.opts.requirements[0]
- nested_constraint = False
- else:
- req_path = line.opts.constraints[0]
- nested_constraint = True
-
- # original file is over http
- if SCHEME_RE.search(filename):
- # do a url join so relative paths work
- req_path = urllib.parse.urljoin(filename, req_path)
- # original file and nested file are paths
- elif not SCHEME_RE.search(req_path):
- # do a join so relative paths work
- req_path = os.path.join(
- os.path.dirname(filename),
- req_path,
- )
-
- yield from self._parse_and_recurse(req_path, nested_constraint)
- else:
- yield line
-
- def _parse_file(
- self, filename: str, constraint: bool
- ) -> Generator[ParsedLine, None, None]:
- _, content = get_file_content(filename, self._session)
-
- lines_enum = preprocess(content)
-
- for line_number, line in lines_enum:
- try:
- args_str, opts = self._line_parser(line)
- except OptionParsingError as e:
- # add offending line
- msg = f"Invalid requirement: {line}\n{e.msg}"
- raise RequirementsFileParseError(msg)
-
- yield ParsedLine(
- filename,
- line_number,
- args_str,
- opts,
- constraint,
- )
-
-
-def get_line_parser(finder: Optional["PackageFinder"]) -> LineParser:
- def parse_line(line: str) -> Tuple[str, Values]:
- # Build new parser for each line since it accumulates appendable
- # options.
- parser = build_parser()
- defaults = parser.get_default_values()
- defaults.index_url = None
- if finder:
- defaults.format_control = finder.format_control
-
- args_str, options_str = break_args_options(line)
-
- opts, _ = parser.parse_args(shlex.split(options_str), defaults)
-
- return args_str, opts
-
- return parse_line
-
-
-def break_args_options(line: str) -> Tuple[str, str]:
- """Break up the line into an args and options string. We only want to shlex
- (and then optparse) the options, not the args. args can contain markers
- which are corrupted by shlex.
- """
- tokens = line.split(" ")
- args = []
- options = tokens[:]
- for token in tokens:
- if token.startswith("-") or token.startswith("--"):
- break
- else:
- args.append(token)
- options.pop(0)
- return " ".join(args), " ".join(options)
-
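-# For instance (illustrative, not from pip):
-#
-#     >>> break_args_options("SomeProject>=1.2 --hash=sha256:abcd")
-#     ('SomeProject>=1.2', '--hash=sha256:abcd')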
-
-class OptionParsingError(Exception):
- def __init__(self, msg: str) -> None:
- self.msg = msg
-
-
-def build_parser() -> optparse.OptionParser:
- """
- Return a parser for parsing requirement lines
- """
- parser = optparse.OptionParser(add_help_option=False)
-
- option_factories = SUPPORTED_OPTIONS + SUPPORTED_OPTIONS_REQ
- for option_factory in option_factories:
- option = option_factory()
- parser.add_option(option)
-
- # By default optparse sys.exits on parsing errors. We want to wrap
- # that in our own exception.
- def parser_exit(self: Any, msg: str) -> "NoReturn":
- raise OptionParsingError(msg)
-
- # NOTE: mypy disallows assigning to a method
- # https://github.com/python/mypy/issues/2427
- parser.exit = parser_exit # type: ignore
-
- return parser
-
-
-def join_lines(lines_enum: ReqFileLines) -> ReqFileLines:
- """Joins a line ending in '\' with the previous line (except when following
- comments). The joined line takes on the index of the first line.
- """
- primary_line_number = None
- new_line: List[str] = []
- for line_number, line in lines_enum:
- if not line.endswith("\\") or COMMENT_RE.match(line):
- if COMMENT_RE.match(line):
- # this ensures comments are always matched later
- line = " " + line
- if new_line:
- new_line.append(line)
- assert primary_line_number is not None
- yield primary_line_number, "".join(new_line)
- new_line = []
- else:
- yield line_number, line
- else:
- if not new_line:
- primary_line_number = line_number
- new_line.append(line.strip("\\"))
-
- # last line contains \
- if new_line:
- assert primary_line_number is not None
- yield primary_line_number, "".join(new_line)
-
- # TODO: handle space after '\'.
-
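-# For instance (illustrative, not from pip): a trailing backslash joins the
-# following line under the first line's number:
-#
-#     >>> list(join_lines(enumerate(["flask \\", "==2.0"], start=1)))
-#     [(1, 'flask ==2.0')]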
-
-def ignore_comments(lines_enum: ReqFileLines) -> ReqFileLines:
- """
- Strips comments and filter empty lines.
- """
- for line_number, line in lines_enum:
- line = COMMENT_RE.sub("", line)
- line = line.strip()
- if line:
- yield line_number, line
-
-
-def expand_env_variables(lines_enum: ReqFileLines) -> ReqFileLines:
- """Replace all environment variables that can be retrieved via `os.getenv`.
-
- The only allowed format for environment variables defined in the
- requirement file is `${MY_VARIABLE_1}` to ensure two things:
-
- 1. Strings that contain a `$` aren't accidentally (partially) expanded.
- 2. Ensure consistency across platforms for requirement files.
-
- These points are the result of a discussion on the `github pull
-    request #3514 <https://github.com/pypa/pip/pull/3514>`_.
-
- Valid characters in variable names follow the `POSIX standard
-    <http://pubs.opengroup.org/onlinepubs/9699919799/>`_ and are limited
-    to uppercase letters, digits and the `_` (underscore).
- """
- for line_number, line in lines_enum:
- for env_var, var_name in ENV_VAR_RE.findall(line):
- value = os.getenv(var_name)
- if not value:
- continue
-
- line = line.replace(env_var, value)
-
- yield line_number, line
-
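-# For instance (illustrative; the variable name is hypothetical), with
-# os.environ["INDEX_TOKEN"] set to "s3cret", the line
-#
-#     https://${INDEX_TOKEN}@example.com/simple
-#
-# becomes
-#
-#     https://s3cret@example.com/simple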
-
-def get_file_content(url: str, session: PipSession) -> Tuple[str, str]:
- """Gets the content of a file; it may be a filename, file: URL, or
- http: URL. Returns (location, content). Content is unicode.
- Respects # -*- coding: declarations on the retrieved files.
-
- :param url: File path or url.
- :param session: PipSession instance.
- """
- scheme = get_url_scheme(url)
-
- # Pip has special support for file:// URLs (LocalFSAdapter).
- if scheme in ["http", "https", "file"]:
- resp = session.get(url)
- raise_for_status(resp)
- return resp.url, resp.text
-
- # Assume this is a bare path.
- try:
- with open(url, "rb") as f:
- content = auto_decode(f.read())
- except OSError as exc:
- raise InstallationError(f"Could not open requirements file: {exc}")
- return url, content
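-
-
-# Both branches return a (location, text) pair; for example (illustrative,
-# hypothetical paths and contents):
-#
-#     >>> get_file_content("requirements.txt", session)   # bare path
-#     ('requirements.txt', 'requests>=2.0\n')
-#     >>> get_file_content("https://example.com/req.txt", session)
-#     ('https://example.com/req.txt', 'requests>=2.0\n')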
diff --git a/env/lib/python3.9/site-packages/pip/_internal/req/req_install.py b/env/lib/python3.9/site-packages/pip/_internal/req/req_install.py
deleted file mode 100644
index b40d9e2..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/req/req_install.py
+++ /dev/null
@@ -1,862 +0,0 @@
-# The following comment should be removed at some point in the future.
-# mypy: strict-optional=False
-
-import functools
-import logging
-import os
-import shutil
-import sys
-import uuid
-import zipfile
-from typing import Any, Collection, Dict, Iterable, List, Optional, Sequence, Union
-
-from pip._vendor.packaging.markers import Marker
-from pip._vendor.packaging.requirements import Requirement
-from pip._vendor.packaging.specifiers import SpecifierSet
-from pip._vendor.packaging.utils import canonicalize_name
-from pip._vendor.packaging.version import Version
-from pip._vendor.packaging.version import parse as parse_version
-from pip._vendor.pep517.wrappers import Pep517HookCaller
-
-from pip._internal.build_env import BuildEnvironment, NoOpBuildEnvironment
-from pip._internal.exceptions import InstallationError, LegacyInstallFailure
-from pip._internal.locations import get_scheme
-from pip._internal.metadata import (
- BaseDistribution,
- get_default_environment,
- get_directory_distribution,
-)
-from pip._internal.models.link import Link
-from pip._internal.operations.build.metadata import generate_metadata
-from pip._internal.operations.build.metadata_editable import generate_editable_metadata
-from pip._internal.operations.build.metadata_legacy import (
- generate_metadata as generate_metadata_legacy,
-)
-from pip._internal.operations.install.editable_legacy import (
- install_editable as install_editable_legacy,
-)
-from pip._internal.operations.install.legacy import install as install_legacy
-from pip._internal.operations.install.wheel import install_wheel
-from pip._internal.pyproject import load_pyproject_toml, make_pyproject_path
-from pip._internal.req.req_uninstall import UninstallPathSet
-from pip._internal.utils.deprecation import deprecated
-from pip._internal.utils.direct_url_helpers import (
- direct_url_for_editable,
- direct_url_from_link,
-)
-from pip._internal.utils.hashes import Hashes
-from pip._internal.utils.misc import (
- ConfiguredPep517HookCaller,
- ask_path_exists,
- backup_dir,
- display_path,
- hide_url,
- redact_auth_from_url,
-)
-from pip._internal.utils.packaging import safe_extra
-from pip._internal.utils.subprocess import runner_with_spinner_message
-from pip._internal.utils.temp_dir import TempDirectory, tempdir_kinds
-from pip._internal.utils.virtualenv import running_under_virtualenv
-from pip._internal.vcs import vcs
-
-logger = logging.getLogger(__name__)
-
-
-class InstallRequirement:
- """
-    Represents something that may be installed later on; it may have
-    information about where to fetch the relevant requirement and also
-    contains logic for installing that requirement.
- """
-
- def __init__(
- self,
- req: Optional[Requirement],
- comes_from: Optional[Union[str, "InstallRequirement"]],
- editable: bool = False,
- link: Optional[Link] = None,
- markers: Optional[Marker] = None,
- use_pep517: Optional[bool] = None,
- isolated: bool = False,
- install_options: Optional[List[str]] = None,
- global_options: Optional[List[str]] = None,
- hash_options: Optional[Dict[str, List[str]]] = None,
- config_settings: Optional[Dict[str, str]] = None,
- constraint: bool = False,
- extras: Collection[str] = (),
- user_supplied: bool = False,
- permit_editable_wheels: bool = False,
- ) -> None:
- assert req is None or isinstance(req, Requirement), req
- self.req = req
- self.comes_from = comes_from
- self.constraint = constraint
- self.editable = editable
- self.permit_editable_wheels = permit_editable_wheels
- self.legacy_install_reason: Optional[int] = None
-
- # source_dir is the local directory where the linked requirement is
- # located, or unpacked. In case unpacking is needed, creating and
- # populating source_dir is done by the RequirementPreparer. Note this
- # is not necessarily the directory where pyproject.toml or setup.py is
- # located - that one is obtained via unpacked_source_directory.
- self.source_dir: Optional[str] = None
- if self.editable:
- assert link
- if link.is_file:
- self.source_dir = os.path.normpath(os.path.abspath(link.file_path))
-
- if link is None and req and req.url:
- # PEP 508 URL requirement
- link = Link(req.url)
- self.link = self.original_link = link
- self.original_link_is_in_wheel_cache = False
-
- # Path to any downloaded or already-existing package.
- self.local_file_path: Optional[str] = None
- if self.link and self.link.is_file:
- self.local_file_path = self.link.file_path
-
- if extras:
- self.extras = extras
- elif req:
- self.extras = {safe_extra(extra) for extra in req.extras}
- else:
- self.extras = set()
- if markers is None and req:
- markers = req.marker
- self.markers = markers
-
- # This holds the Distribution object if this requirement is already installed.
- self.satisfied_by: Optional[BaseDistribution] = None
- # Whether the installation process should try to uninstall an existing
- # distribution before installing this requirement.
- self.should_reinstall = False
- # Temporary build location
- self._temp_build_dir: Optional[TempDirectory] = None
- # Set to True after successful installation
- self.install_succeeded: Optional[bool] = None
- # Supplied options
- self.install_options = install_options if install_options else []
- self.global_options = global_options if global_options else []
- self.hash_options = hash_options if hash_options else {}
- self.config_settings = config_settings
- # Set to True after successful preparation of this requirement
- self.prepared = False
- # User supplied requirement are explicitly requested for installation
- # by the user via CLI arguments or requirements files, as opposed to,
- # e.g. dependencies, extras or constraints.
- self.user_supplied = user_supplied
-
- self.isolated = isolated
- self.build_env: BuildEnvironment = NoOpBuildEnvironment()
-
- # For PEP 517, the directory where we request the project metadata
- # gets stored. We need this to pass to build_wheel, so the backend
- # can ensure that the wheel matches the metadata (see the PEP for
- # details).
- self.metadata_directory: Optional[str] = None
-
- # The static build requirements (from pyproject.toml)
- self.pyproject_requires: Optional[List[str]] = None
-
- # Build requirements that we will check are available
- self.requirements_to_check: List[str] = []
-
- # The PEP 517 backend we should use to build the project
- self.pep517_backend: Optional[Pep517HookCaller] = None
-
- # Are we using PEP 517 for this requirement?
- # After pyproject.toml has been loaded, the only valid values are True
- # and False. Before loading, None is valid (meaning "use the default").
- # Setting an explicit value before loading pyproject.toml is supported,
- # but after loading this flag should be treated as read only.
- self.use_pep517 = use_pep517
-
- # This requirement needs more preparation before it can be built
- self.needs_more_preparation = False
-
- def __str__(self) -> str:
- if self.req:
- s = str(self.req)
- if self.link:
- s += " from {}".format(redact_auth_from_url(self.link.url))
- elif self.link:
- s = redact_auth_from_url(self.link.url)
- else:
- s = ""
- if self.satisfied_by is not None:
- s += " in {}".format(display_path(self.satisfied_by.location))
- if self.comes_from:
- if isinstance(self.comes_from, str):
- comes_from: Optional[str] = self.comes_from
- else:
- comes_from = self.comes_from.from_path()
- if comes_from:
- s += f" (from {comes_from})"
- return s
-
- def __repr__(self) -> str:
- return "<{} object: {} editable={!r}>".format(
- self.__class__.__name__, str(self), self.editable
- )
-
- def format_debug(self) -> str:
- """An un-tested helper for getting state, for debugging."""
- attributes = vars(self)
- names = sorted(attributes)
-
- state = ("{}={!r}".format(attr, attributes[attr]) for attr in sorted(names))
- return "<{name} object: {{{state}}}>".format(
- name=self.__class__.__name__,
- state=", ".join(state),
- )
-
- # Things that are valid for all kinds of requirements?
- @property
- def name(self) -> Optional[str]:
- if self.req is None:
- return None
- return self.req.name
-
- @functools.lru_cache() # use cached_property in python 3.8+
- def supports_pyproject_editable(self) -> bool:
- if not self.use_pep517:
- return False
- assert self.pep517_backend
- with self.build_env:
- runner = runner_with_spinner_message(
- "Checking if build backend supports build_editable"
- )
- with self.pep517_backend.subprocess_runner(runner):
- return "build_editable" in self.pep517_backend._supported_features()
-
- @property
- def specifier(self) -> SpecifierSet:
- return self.req.specifier
-
- @property
- def is_pinned(self) -> bool:
- """Return whether I am pinned to an exact version.
-
- For example, some-package==1.2 is pinned; some-package>1.2 is not.
- """
- specifiers = self.specifier
- return len(specifiers) == 1 and next(iter(specifiers)).operator in {"==", "==="}
-
- def match_markers(self, extras_requested: Optional[Iterable[str]] = None) -> bool:
- if not extras_requested:
- # Provide an extra to safely evaluate the markers
- # without matching any extra
- extras_requested = ("",)
- if self.markers is not None:
- return any(
- self.markers.evaluate({"extra": extra}) for extra in extras_requested
- )
- else:
- return True
-
- @property
- def has_hash_options(self) -> bool:
- """Return whether any known-good hashes are specified as options.
-
- These activate --require-hashes mode; hashes specified as part of a
- URL do not.
-
- """
- return bool(self.hash_options)
-
- def hashes(self, trust_internet: bool = True) -> Hashes:
- """Return a hash-comparer that considers my option- and URL-based
- hashes to be known-good.
-
- Hashes in URLs--ones embedded in the requirements file, not ones
- downloaded from an index server--are almost peers with ones from
- flags. They satisfy --require-hashes (whether it was implicitly or
- explicitly activated) but do not activate it. md5 and sha224 are not
- allowed in flags, which should nudge people toward good algos. We
- always OR all hashes together, even ones from URLs.
-
- :param trust_internet: Whether to trust URL-based (#md5=...) hashes
- downloaded from the internet, as by populate_link()
-
- """
- good_hashes = self.hash_options.copy()
- link = self.link if trust_internet else self.original_link
- if link and link.hash:
- good_hashes.setdefault(link.hash_name, []).append(link.hash)
- return Hashes(good_hashes)
-
- def from_path(self) -> Optional[str]:
- """Format a nice indicator to show where this "comes from" """
- if self.req is None:
- return None
- s = str(self.req)
- if self.comes_from:
- if isinstance(self.comes_from, str):
- comes_from = self.comes_from
- else:
- comes_from = self.comes_from.from_path()
- if comes_from:
- s += "->" + comes_from
- return s
-
- def ensure_build_location(
- self, build_dir: str, autodelete: bool, parallel_builds: bool
- ) -> str:
- assert build_dir is not None
- if self._temp_build_dir is not None:
- assert self._temp_build_dir.path
- return self._temp_build_dir.path
- if self.req is None:
- # Some systems have /tmp as a symlink which confuses custom
- # builds (such as numpy). Thus, we ensure that the real path
- # is returned.
- self._temp_build_dir = TempDirectory(
- kind=tempdir_kinds.REQ_BUILD, globally_managed=True
- )
-
- return self._temp_build_dir.path
-
- # This is the only remaining place where we manually determine the path
- # for the temporary directory. It is only needed for editables where
- # it is the value of the --src option.
-
- # When parallel builds are enabled, add a UUID to the build directory
- # name so multiple builds do not interfere with each other.
- dir_name: str = canonicalize_name(self.name)
- if parallel_builds:
- dir_name = f"{dir_name}_{uuid.uuid4().hex}"
-
- # FIXME: Is there a better place to create the build_dir? (hg and bzr
- # need this)
- if not os.path.exists(build_dir):
- logger.debug("Creating directory %s", build_dir)
- os.makedirs(build_dir)
- actual_build_dir = os.path.join(build_dir, dir_name)
- # `None` indicates that we respect the globally-configured deletion
- # settings, which is what we actually want when auto-deleting.
- delete_arg = None if autodelete else False
- return TempDirectory(
- path=actual_build_dir,
- delete=delete_arg,
- kind=tempdir_kinds.REQ_BUILD,
- globally_managed=True,
- ).path
-
- def _set_requirement(self) -> None:
- """Set requirement after generating metadata."""
- assert self.req is None
- assert self.metadata is not None
- assert self.source_dir is not None
-
- # Construct a Requirement object from the generated metadata
- if isinstance(parse_version(self.metadata["Version"]), Version):
- op = "=="
- else:
- op = "==="
-
- self.req = Requirement(
- "".join(
- [
- self.metadata["Name"],
- op,
- self.metadata["Version"],
- ]
- )
- )
-
- def warn_on_mismatching_name(self) -> None:
- metadata_name = canonicalize_name(self.metadata["Name"])
- if canonicalize_name(self.req.name) == metadata_name:
- # Everything is fine.
- return
-
- # If we're here, there's a mismatch. Log a warning about it.
- logger.warning(
- "Generating metadata for package %s "
- "produced metadata for project name %s. Fix your "
- "#egg=%s fragments.",
- self.name,
- metadata_name,
- self.name,
- )
- self.req = Requirement(metadata_name)
-
- def check_if_exists(self, use_user_site: bool) -> None:
- """Find an installed distribution that satisfies or conflicts
- with this requirement, and set self.satisfied_by or
- self.should_reinstall appropriately.
- """
- if self.req is None:
- return
- existing_dist = get_default_environment().get_distribution(self.req.name)
- if not existing_dist:
- return
-
- version_compatible = self.req.specifier.contains(
- existing_dist.version,
- prereleases=True,
- )
- if not version_compatible:
- self.satisfied_by = None
- if use_user_site:
- if existing_dist.in_usersite:
- self.should_reinstall = True
- elif running_under_virtualenv() and existing_dist.in_site_packages:
- raise InstallationError(
- f"Will not install to the user site because it will "
- f"lack sys.path precedence to {existing_dist.raw_name} "
- f"in {existing_dist.location}"
- )
- else:
- self.should_reinstall = True
- else:
- if self.editable:
- self.should_reinstall = True
- # when installing editables, nothing pre-existing should ever
- # satisfy
- self.satisfied_by = None
- else:
- self.satisfied_by = existing_dist
-
- # Things valid for wheels
- @property
- def is_wheel(self) -> bool:
- if not self.link:
- return False
- return self.link.is_wheel
-
- # Things valid for sdists
- @property
- def unpacked_source_directory(self) -> str:
- return os.path.join(
- self.source_dir, self.link and self.link.subdirectory_fragment or ""
- )
-
- @property
- def setup_py_path(self) -> str:
- assert self.source_dir, f"No source dir for {self}"
- setup_py = os.path.join(self.unpacked_source_directory, "setup.py")
-
- return setup_py
-
- @property
- def setup_cfg_path(self) -> str:
- assert self.source_dir, f"No source dir for {self}"
- setup_cfg = os.path.join(self.unpacked_source_directory, "setup.cfg")
-
- return setup_cfg
-
- @property
- def pyproject_toml_path(self) -> str:
- assert self.source_dir, f"No source dir for {self}"
- return make_pyproject_path(self.unpacked_source_directory)
-
- def load_pyproject_toml(self) -> None:
- """Load the pyproject.toml file.
-
- After calling this routine, all of the attributes related to PEP 517
- processing for this requirement have been set. In particular, the
- use_pep517 attribute can be used to determine whether we should
- follow the PEP 517 or legacy (setup.py) code path.
- """
- pyproject_toml_data = load_pyproject_toml(
- self.use_pep517, self.pyproject_toml_path, self.setup_py_path, str(self)
- )
-
- if pyproject_toml_data is None:
- self.use_pep517 = False
- return
-
- self.use_pep517 = True
- requires, backend, check, backend_path = pyproject_toml_data
- self.requirements_to_check = check
- self.pyproject_requires = requires
- self.pep517_backend = ConfiguredPep517HookCaller(
- self,
- self.unpacked_source_directory,
- backend,
- backend_path=backend_path,
- )
-
- def isolated_editable_sanity_check(self) -> None:
- """Check that an editable requirement if valid for use with PEP 517/518.
-
- This verifies that an editable that has a pyproject.toml either supports PEP 660
- or as a setup.py or a setup.cfg
- """
- if (
- self.editable
- and self.use_pep517
- and not self.supports_pyproject_editable()
- and not os.path.isfile(self.setup_py_path)
- and not os.path.isfile(self.setup_cfg_path)
- ):
- raise InstallationError(
- f"Project {self} has a 'pyproject.toml' and its build "
- f"backend is missing the 'build_editable' hook. Since it does not "
- f"have a 'setup.py' nor a 'setup.cfg', "
- f"it cannot be installed in editable mode. "
- f"Consider using a build backend that supports PEP 660."
- )
-
- def prepare_metadata(self) -> None:
- """Ensure that project metadata is available.
-
- Under PEP 517 and PEP 660, call the backend hook to prepare the metadata.
- Under legacy processing, call setup.py egg-info.
- """
- assert self.source_dir
- details = self.name or f"from {self.link}"
-
- if self.use_pep517:
- assert self.pep517_backend is not None
- if (
- self.editable
- and self.permit_editable_wheels
- and self.supports_pyproject_editable()
- ):
- self.metadata_directory = generate_editable_metadata(
- build_env=self.build_env,
- backend=self.pep517_backend,
- details=details,
- )
- else:
- self.metadata_directory = generate_metadata(
- build_env=self.build_env,
- backend=self.pep517_backend,
- details=details,
- )
- else:
- self.metadata_directory = generate_metadata_legacy(
- build_env=self.build_env,
- setup_py_path=self.setup_py_path,
- source_dir=self.unpacked_source_directory,
- isolated=self.isolated,
- details=details,
- )
-
- # Act on the newly generated metadata, based on the name and version.
- if not self.name:
- self._set_requirement()
- else:
- self.warn_on_mismatching_name()
-
- self.assert_source_matches_version()
-
- @property
- def metadata(self) -> Any:
- if not hasattr(self, "_metadata"):
- self._metadata = self.get_dist().metadata
-
- return self._metadata
-
- def get_dist(self) -> BaseDistribution:
- return get_directory_distribution(self.metadata_directory)
-
- def assert_source_matches_version(self) -> None:
- assert self.source_dir
- version = self.metadata["version"]
- if self.req.specifier and version not in self.req.specifier:
- logger.warning(
- "Requested %s, but installing version %s",
- self,
- version,
- )
- else:
- logger.debug(
- "Source in %s has version %s, which satisfies requirement %s",
- display_path(self.source_dir),
- version,
- self,
- )
-
- # For both source distributions and editables
- def ensure_has_source_dir(
- self,
- parent_dir: str,
- autodelete: bool = False,
- parallel_builds: bool = False,
- ) -> None:
- """Ensure that a source_dir is set.
-
- This will create a temporary build dir if the name of the requirement
- isn't known yet.
-
- :param parent_dir: The ideal pip parent_dir for the source_dir.
- Generally src_dir for editables and build_dir for sdists.
- :return: self.source_dir
- """
- if self.source_dir is None:
- self.source_dir = self.ensure_build_location(
- parent_dir,
- autodelete=autodelete,
- parallel_builds=parallel_builds,
- )
-
- # For editable installations
- def update_editable(self) -> None:
- if not self.link:
- logger.debug(
- "Cannot update repository at %s; repository location is unknown",
- self.source_dir,
- )
- return
- assert self.editable
- assert self.source_dir
- if self.link.scheme == "file":
- # Static paths don't get updated
- return
- vcs_backend = vcs.get_backend_for_scheme(self.link.scheme)
- # Editable requirements are validated in Requirement constructors.
- # So here, if it's neither a path nor a valid VCS URL, it's a bug.
- assert vcs_backend, f"Unsupported VCS URL {self.link.url}"
- hidden_url = hide_url(self.link.url)
- vcs_backend.obtain(self.source_dir, url=hidden_url, verbosity=0)
-
- # Top-level Actions
- def uninstall(
- self, auto_confirm: bool = False, verbose: bool = False
- ) -> Optional[UninstallPathSet]:
- """
- Uninstall the distribution currently satisfying this requirement.
-
- Prompts before removing or modifying files unless
- ``auto_confirm`` is True.
-
- Refuses to delete or modify files outside of ``sys.prefix`` -
- thus uninstallation within a virtual environment can only
- modify that virtual environment, even if the virtualenv is
- linked to global site-packages.
-
- """
- assert self.req
- dist = get_default_environment().get_distribution(self.req.name)
- if not dist:
- logger.warning("Skipping %s as it is not installed.", self.name)
- return None
- logger.info("Found existing installation: %s", dist)
-
- uninstalled_pathset = UninstallPathSet.from_dist(dist)
- uninstalled_pathset.remove(auto_confirm, verbose)
- return uninstalled_pathset
-
- def _get_archive_name(self, path: str, parentdir: str, rootdir: str) -> str:
- def _clean_zip_name(name: str, prefix: str) -> str:
- assert name.startswith(
- prefix + os.path.sep
- ), f"name {name!r} doesn't start with prefix {prefix!r}"
- name = name[len(prefix) + 1 :]
- name = name.replace(os.path.sep, "/")
- return name
-
- path = os.path.join(parentdir, path)
- name = _clean_zip_name(path, rootdir)
- return self.name + "/" + name
-
- def archive(self, build_dir: Optional[str]) -> None:
- """Saves archive to provided build_dir.
-
- Used for saving downloaded VCS requirements as part of `pip download`.
- """
- assert self.source_dir
- if build_dir is None:
- return
-
- create_archive = True
- archive_name = "{}-{}.zip".format(self.name, self.metadata["version"])
- archive_path = os.path.join(build_dir, archive_name)
-
- if os.path.exists(archive_path):
- response = ask_path_exists(
- "The file {} exists. (i)gnore, (w)ipe, "
- "(b)ackup, (a)bort ".format(display_path(archive_path)),
- ("i", "w", "b", "a"),
- )
- if response == "i":
- create_archive = False
- elif response == "w":
- logger.warning("Deleting %s", display_path(archive_path))
- os.remove(archive_path)
- elif response == "b":
- dest_file = backup_dir(archive_path)
- logger.warning(
- "Backing up %s to %s",
- display_path(archive_path),
- display_path(dest_file),
- )
- shutil.move(archive_path, dest_file)
- elif response == "a":
- sys.exit(-1)
-
- if not create_archive:
- return
-
- zip_output = zipfile.ZipFile(
- archive_path,
- "w",
- zipfile.ZIP_DEFLATED,
- allowZip64=True,
- )
- with zip_output:
- dir = os.path.normcase(os.path.abspath(self.unpacked_source_directory))
- for dirpath, dirnames, filenames in os.walk(dir):
- for dirname in dirnames:
- dir_arcname = self._get_archive_name(
- dirname,
- parentdir=dirpath,
- rootdir=dir,
- )
- zipdir = zipfile.ZipInfo(dir_arcname + "/")
- zipdir.external_attr = 0x1ED << 16 # 0o755
- zip_output.writestr(zipdir, "")
- for filename in filenames:
- file_arcname = self._get_archive_name(
- filename,
- parentdir=dirpath,
- rootdir=dir,
- )
- filename = os.path.join(dirpath, filename)
- zip_output.write(filename, file_arcname)
-
- logger.info("Saved %s", display_path(archive_path))
-
- def install(
- self,
- install_options: List[str],
- global_options: Optional[Sequence[str]] = None,
- root: Optional[str] = None,
- home: Optional[str] = None,
- prefix: Optional[str] = None,
- warn_script_location: bool = True,
- use_user_site: bool = False,
- pycompile: bool = True,
- ) -> None:
- scheme = get_scheme(
- self.name,
- user=use_user_site,
- home=home,
- root=root,
- isolated=self.isolated,
- prefix=prefix,
- )
-
- global_options = global_options if global_options is not None else []
- if self.editable and not self.is_wheel:
- install_editable_legacy(
- install_options,
- global_options,
- prefix=prefix,
- home=home,
- use_user_site=use_user_site,
- name=self.name,
- setup_py_path=self.setup_py_path,
- isolated=self.isolated,
- build_env=self.build_env,
- unpacked_source_directory=self.unpacked_source_directory,
- )
- self.install_succeeded = True
- return
-
- if self.is_wheel:
- assert self.local_file_path
- direct_url = None
- if self.editable:
- direct_url = direct_url_for_editable(self.unpacked_source_directory)
- elif self.original_link:
- direct_url = direct_url_from_link(
- self.original_link,
- self.source_dir,
- self.original_link_is_in_wheel_cache,
- )
- install_wheel(
- self.name,
- self.local_file_path,
- scheme=scheme,
- req_description=str(self.req),
- pycompile=pycompile,
- warn_script_location=warn_script_location,
- direct_url=direct_url,
- requested=self.user_supplied,
- )
- self.install_succeeded = True
- return
-
- # TODO: Why don't we do this for editable installs?
-
- # Extend the list of global and install options passed on to
- # the setup.py call with the ones from the requirements file.
- # Options specified in requirements file override those
- # specified on the command line, since the last option given
- # to setup.py is the one that is used.
- global_options = list(global_options) + self.global_options
- install_options = list(install_options) + self.install_options
-
- try:
- success = install_legacy(
- install_options=install_options,
- global_options=global_options,
- root=root,
- home=home,
- prefix=prefix,
- use_user_site=use_user_site,
- pycompile=pycompile,
- scheme=scheme,
- setup_py_path=self.setup_py_path,
- isolated=self.isolated,
- req_name=self.name,
- build_env=self.build_env,
- unpacked_source_directory=self.unpacked_source_directory,
- req_description=str(self.req),
- )
- except LegacyInstallFailure as exc:
- self.install_succeeded = False
- raise exc
- except Exception:
- self.install_succeeded = True
- raise
-
- self.install_succeeded = success
-
- if success and self.legacy_install_reason == 8368:
- deprecated(
- reason=(
- "{} was installed using the legacy 'setup.py install' "
- "method, because a wheel could not be built for it.".format(
- self.name
- )
- ),
- replacement="to fix the wheel build issue reported above",
- gone_in=None,
- issue=8368,
- )
-
-
-def check_invalid_constraint_type(req: InstallRequirement) -> str:
-
- # Check for unsupported forms
- problem = ""
- if not req.name:
- problem = "Unnamed requirements are not allowed as constraints"
- elif req.editable:
- problem = "Editable requirements are not allowed as constraints"
- elif req.extras:
- problem = "Constraints cannot have extras"
-
- if problem:
- deprecated(
- reason=(
- "Constraints are only allowed to take the form of a package "
- "name and a version specifier. Other forms were originally "
- "permitted as an accident of the implementation, but were "
- "undocumented. The new implementation of the resolver no "
- "longer supports these forms."
- ),
- replacement="replacing the constraint with a requirement",
- # No plan yet for when the new resolver becomes default
- gone_in=None,
- issue=8210,
- )
-
- return problem
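
The deleted `req_install.py` ends with `check_invalid_constraint_type`, which rejects any constraint that is not a plain "name plus version specifier". Below is a minimal standalone sketch of the same rules, assuming only the third-party `packaging` library; the helper name and the `editable` flag here are illustrative, not pip's API:

```python
from packaging.requirements import InvalidRequirement, Requirement

def constraint_problem(line: str, editable: bool = False) -> str:
    """Return a human-readable problem, or "" if `line` is a valid constraint."""
    if editable:
        return "Editable requirements are not allowed as constraints"
    try:
        req = Requirement(line)
    except InvalidRequirement:
        # URL-only or otherwise unparseable lines carry no usable name.
        return "Unnamed requirements are not allowed as constraints"
    if req.extras:
        return "Constraints cannot have extras"
    return ""

print(constraint_problem("requests==2.28.0"))                # "" -> acceptable
print(constraint_problem("requests[socks]>=2.0"))            # extras -> rejected
print(constraint_problem("-e ./local/pkg", editable=True))   # editable -> rejected
```

Treating unparseable lines as "unnamed" is a simplification of pip's checks, but it captures why the deprecation above pushes users toward plain name-and-specifier constraints.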
diff --git a/env/lib/python3.9/site-packages/pip/_internal/req/req_set.py b/env/lib/python3.9/site-packages/pip/_internal/req/req_set.py
deleted file mode 100644
index 0f550bf..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/req/req_set.py
+++ /dev/null
@@ -1,69 +0,0 @@
-import logging
-from collections import OrderedDict
-from typing import Dict, List
-
-from pip._vendor.packaging.utils import canonicalize_name
-
-from pip._internal.req.req_install import InstallRequirement
-
-logger = logging.getLogger(__name__)
-
-
-class RequirementSet:
- def __init__(self, check_supported_wheels: bool = True) -> None:
- """Create a RequirementSet."""
-
- self.requirements: Dict[str, InstallRequirement] = OrderedDict()
- self.check_supported_wheels = check_supported_wheels
-
- self.unnamed_requirements: List[InstallRequirement] = []
-
- def __str__(self) -> str:
- requirements = sorted(
- (req for req in self.requirements.values() if not req.comes_from),
- key=lambda req: canonicalize_name(req.name or ""),
- )
- return " ".join(str(req.req) for req in requirements)
-
- def __repr__(self) -> str:
- requirements = sorted(
- self.requirements.values(),
- key=lambda req: canonicalize_name(req.name or ""),
- )
-
- format_string = "<{classname} object; {count} requirement(s): {reqs}>"
- return format_string.format(
- classname=self.__class__.__name__,
- count=len(requirements),
- reqs=", ".join(str(req.req) for req in requirements),
- )
-
- def add_unnamed_requirement(self, install_req: InstallRequirement) -> None:
- assert not install_req.name
- self.unnamed_requirements.append(install_req)
-
- def add_named_requirement(self, install_req: InstallRequirement) -> None:
- assert install_req.name
-
- project_name = canonicalize_name(install_req.name)
- self.requirements[project_name] = install_req
-
- def has_requirement(self, name: str) -> bool:
- project_name = canonicalize_name(name)
-
- return (
- project_name in self.requirements
- and not self.requirements[project_name].constraint
- )
-
- def get_requirement(self, name: str) -> InstallRequirement:
- project_name = canonicalize_name(name)
-
- if project_name in self.requirements:
- return self.requirements[project_name]
-
- raise KeyError(f"No project with the name {name!r}")
-
- @property
- def all_requirements(self) -> List[InstallRequirement]:
- return self.unnamed_requirements + list(self.requirements.values())
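
`RequirementSet` keys named requirements by `canonicalize_name`, so different spellings of a project name collapse to a single entry and lookups are spelling-insensitive. A small demonstration of that behavior, assuming the third-party `packaging` library:

```python
from packaging.utils import canonicalize_name

store = {}
for spelling in ("Foo_Bar", "foo-bar", "FOO.BAR"):
    # All three spellings normalize to the same key, "foo-bar",
    # so later entries overwrite earlier ones.
    store[canonicalize_name(spelling)] = spelling

print(store)                                  # {'foo-bar': 'FOO.BAR'}
print(canonicalize_name("Foo.Bar") in store)  # True
```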
diff --git a/env/lib/python3.9/site-packages/pip/_internal/req/req_uninstall.py b/env/lib/python3.9/site-packages/pip/_internal/req/req_uninstall.py
deleted file mode 100644
index 15b6738..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/req/req_uninstall.py
+++ /dev/null
@@ -1,640 +0,0 @@
-import functools
-import os
-import sys
-import sysconfig
-from importlib.util import cache_from_source
-from typing import Any, Callable, Dict, Generator, Iterable, List, Optional, Set, Tuple
-
-from pip._internal.exceptions import UninstallationError
-from pip._internal.locations import get_bin_prefix, get_bin_user
-from pip._internal.metadata import BaseDistribution
-from pip._internal.utils.compat import WINDOWS
-from pip._internal.utils.egg_link import egg_link_path_from_location
-from pip._internal.utils.logging import getLogger, indent_log
-from pip._internal.utils.misc import ask, is_local, normalize_path, renames, rmtree
-from pip._internal.utils.temp_dir import AdjacentTempDirectory, TempDirectory
-
-logger = getLogger(__name__)
-
-
-def _script_names(
- bin_dir: str, script_name: str, is_gui: bool
-) -> Generator[str, None, None]:
- """Create the fully qualified name of the files created by
- {console,gui}_scripts for the given ``dist``.
- Returns the list of file names
- """
- exe_name = os.path.join(bin_dir, script_name)
- yield exe_name
- if not WINDOWS:
- return
- yield f"{exe_name}.exe"
- yield f"{exe_name}.exe.manifest"
- if is_gui:
- yield f"{exe_name}-script.pyw"
- else:
- yield f"{exe_name}-script.py"
-
-
-def _unique(
- fn: Callable[..., Generator[Any, None, None]]
-) -> Callable[..., Generator[Any, None, None]]:
- @functools.wraps(fn)
- def unique(*args: Any, **kw: Any) -> Generator[Any, None, None]:
- seen: Set[Any] = set()
- for item in fn(*args, **kw):
- if item not in seen:
- seen.add(item)
- yield item
-
- return unique
-
-
-@_unique
-def uninstallation_paths(dist: BaseDistribution) -> Generator[str, None, None]:
- """
- Yield all the uninstallation paths for dist based on RECORD-without-.py[co]
-
- Yield paths to all the files in RECORD. For each .py file in RECORD, add
- the .pyc and .pyo in the same directory.
-
- UninstallPathSet.add() takes care of the __pycache__ .py[co].
-
- If RECORD is not found, raises UninstallationError,
- with possible information from the INSTALLER file.
-
- https://packaging.python.org/specifications/recording-installed-packages/
- """
- location = dist.location
- assert location is not None, "not installed"
-
- entries = dist.iter_declared_entries()
- if entries is None:
- msg = "Cannot uninstall {dist}, RECORD file not found.".format(dist=dist)
- installer = dist.installer
- if not installer or installer == "pip":
- dep = "{}=={}".format(dist.raw_name, dist.version)
- msg += (
- " You might be able to recover from this via: "
- "'pip install --force-reinstall --no-deps {}'.".format(dep)
- )
- else:
- msg += " Hint: The package was installed by {}.".format(installer)
- raise UninstallationError(msg)
-
- for entry in entries:
- path = os.path.join(location, entry)
- yield path
- if path.endswith(".py"):
- dn, fn = os.path.split(path)
- base = fn[:-3]
- path = os.path.join(dn, base + ".pyc")
- yield path
- path = os.path.join(dn, base + ".pyo")
- yield path
-
-
-def compact(paths: Iterable[str]) -> Set[str]:
- """Compact a path set to contain the minimal number of paths
- necessary to contain all paths in the set. If /a/path/ and
- /a/path/to/a/file.txt are both in the set, leave only the
- shorter path."""
-
- sep = os.path.sep
- short_paths: Set[str] = set()
- for path in sorted(paths, key=len):
- should_skip = any(
- path.startswith(shortpath.rstrip("*"))
- and path[len(shortpath.rstrip("*").rstrip(sep))] == sep
- for shortpath in short_paths
- )
- if not should_skip:
- short_paths.add(path)
- return short_paths
-
-
-def compress_for_rename(paths: Iterable[str]) -> Set[str]:
- """Returns a set containing the paths that need to be renamed.
-
- This set may include directories when the original sequence of paths
- included every file on disk.
- """
- case_map = {os.path.normcase(p): p for p in paths}
- remaining = set(case_map)
- unchecked = sorted({os.path.split(p)[0] for p in case_map.values()}, key=len)
- wildcards: Set[str] = set()
-
- def norm_join(*a: str) -> str:
- return os.path.normcase(os.path.join(*a))
-
- for root in unchecked:
- if any(os.path.normcase(root).startswith(w) for w in wildcards):
- # This directory has already been handled.
- continue
-
- all_files: Set[str] = set()
- all_subdirs: Set[str] = set()
- for dirname, subdirs, files in os.walk(root):
- all_subdirs.update(norm_join(root, dirname, d) for d in subdirs)
- all_files.update(norm_join(root, dirname, f) for f in files)
- # If all the files we found are in our remaining set of files to
- # remove, then remove them from the latter set and add a wildcard
- # for the directory.
- if not (all_files - remaining):
- remaining.difference_update(all_files)
- wildcards.add(root + os.sep)
-
- return set(map(case_map.__getitem__, remaining)) | wildcards
-
-
-def compress_for_output_listing(paths: Iterable[str]) -> Tuple[Set[str], Set[str]]:
- """Returns a tuple of 2 sets of which paths to display to user
-
- The first set contains paths that would be deleted. Files of a package
- are not added and the top-level directory of the package has a '*' added
- at the end - to signify that all it's contents are removed.
-
- The second set contains files that would have been skipped in the above
- folders.
- """
-
- will_remove = set(paths)
- will_skip = set()
-
- # Determine folders and files
- folders = set()
- files = set()
- for path in will_remove:
- if path.endswith(".pyc"):
- continue
- if path.endswith("__init__.py") or ".dist-info" in path:
- folders.add(os.path.dirname(path))
- files.add(path)
-
- # probably this one https://github.com/python/mypy/issues/390
- _normcased_files = set(map(os.path.normcase, files)) # type: ignore
-
- folders = compact(folders)
-
- # This walks the tree using os.walk to not miss extra folders
- # that might get added.
- for folder in folders:
- for dirpath, _, dirfiles in os.walk(folder):
- for fname in dirfiles:
- if fname.endswith(".pyc"):
- continue
-
- file_ = os.path.join(dirpath, fname)
- if (
- os.path.isfile(file_)
- and os.path.normcase(file_) not in _normcased_files
- ):
- # We are skipping this file. Add it to the set.
- will_skip.add(file_)
-
- will_remove = files | {os.path.join(folder, "*") for folder in folders}
-
- return will_remove, will_skip
-
-
-class StashedUninstallPathSet:
- """A set of file rename operations to stash files while
- tentatively uninstalling them."""
-
- def __init__(self) -> None:
- # Mapping from source file root to [Adjacent]TempDirectory
- # for files under that directory.
- self._save_dirs: Dict[str, TempDirectory] = {}
- # (old path, new path) tuples for each move that may need
- # to be undone.
- self._moves: List[Tuple[str, str]] = []
-
- def _get_directory_stash(self, path: str) -> str:
- """Stashes a directory.
-
- Directories are stashed adjacent to their original location if
- possible, or else moved/copied into the user's temp dir."""
-
- try:
- save_dir: TempDirectory = AdjacentTempDirectory(path)
- except OSError:
- save_dir = TempDirectory(kind="uninstall")
- self._save_dirs[os.path.normcase(path)] = save_dir
-
- return save_dir.path
-
- def _get_file_stash(self, path: str) -> str:
- """Stashes a file.
-
- If no root has been provided, one will be created for the directory
- in the user's temp directory."""
- path = os.path.normcase(path)
- head, old_head = os.path.dirname(path), None
- save_dir = None
-
- while head != old_head:
- try:
- save_dir = self._save_dirs[head]
- break
- except KeyError:
- pass
- head, old_head = os.path.dirname(head), head
- else:
- # Did not find any suitable root
- head = os.path.dirname(path)
- save_dir = TempDirectory(kind="uninstall")
- self._save_dirs[head] = save_dir
-
- relpath = os.path.relpath(path, head)
- if relpath and relpath != os.path.curdir:
- return os.path.join(save_dir.path, relpath)
- return save_dir.path
-
- def stash(self, path: str) -> str:
- """Stashes the directory or file and returns its new location.
- Handle symlinks as files to avoid modifying the symlink targets.
- """
- path_is_dir = os.path.isdir(path) and not os.path.islink(path)
- if path_is_dir:
- new_path = self._get_directory_stash(path)
- else:
- new_path = self._get_file_stash(path)
-
- self._moves.append((path, new_path))
- if path_is_dir and os.path.isdir(new_path):
- # If we're moving a directory, we need to
- # remove the destination first or else it will be
- # moved to inside the existing directory.
- # We just created new_path ourselves, so it will
- # be removable.
- os.rmdir(new_path)
- renames(path, new_path)
- return new_path
-
- def commit(self) -> None:
- """Commits the uninstall by removing stashed files."""
- for _, save_dir in self._save_dirs.items():
- save_dir.cleanup()
- self._moves = []
- self._save_dirs = {}
-
- def rollback(self) -> None:
- """Undoes the uninstall by moving stashed files back."""
- for p in self._moves:
- logger.info("Moving to %s\n from %s", *p)
-
- for new_path, path in self._moves:
- try:
- logger.debug("Replacing %s from %s", new_path, path)
- if os.path.isfile(new_path) or os.path.islink(new_path):
- os.unlink(new_path)
- elif os.path.isdir(new_path):
- rmtree(new_path)
- renames(path, new_path)
- except OSError as ex:
- logger.error("Failed to restore %s", new_path)
- logger.debug("Exception: %s", ex)
-
- self.commit()
-
- @property
- def can_rollback(self) -> bool:
- return bool(self._moves)
-
-
-class UninstallPathSet:
- """A set of file paths to be removed in the uninstallation of a
- requirement."""
-
- def __init__(self, dist: BaseDistribution) -> None:
- self._paths: Set[str] = set()
- self._refuse: Set[str] = set()
- self._pth: Dict[str, UninstallPthEntries] = {}
- self._dist = dist
- self._moved_paths = StashedUninstallPathSet()
-
- def _permitted(self, path: str) -> bool:
- """
- Return True if the given path is one we are permitted to
- remove/modify, False otherwise.
-
- """
- return is_local(path)
-
- def add(self, path: str) -> None:
- head, tail = os.path.split(path)
-
- # we normalize the head to resolve parent directory symlinks, but not
- # the tail, since we only want to uninstall symlinks, not their targets
- path = os.path.join(normalize_path(head), os.path.normcase(tail))
-
- if not os.path.exists(path):
- return
- if self._permitted(path):
- self._paths.add(path)
- else:
- self._refuse.add(path)
-
- # __pycache__ files can show up after 'installed-files.txt' is created,
- # due to imports
- if os.path.splitext(path)[1] == ".py":
- self.add(cache_from_source(path))
-
- def add_pth(self, pth_file: str, entry: str) -> None:
- pth_file = normalize_path(pth_file)
- if self._permitted(pth_file):
- if pth_file not in self._pth:
- self._pth[pth_file] = UninstallPthEntries(pth_file)
- self._pth[pth_file].add(entry)
- else:
- self._refuse.add(pth_file)
-
- def remove(self, auto_confirm: bool = False, verbose: bool = False) -> None:
- """Remove paths in ``self._paths`` with confirmation (unless
- ``auto_confirm`` is True)."""
-
- if not self._paths:
- logger.info(
- "Can't uninstall '%s'. No files were found to uninstall.",
- self._dist.raw_name,
- )
- return
-
- dist_name_version = f"{self._dist.raw_name}-{self._dist.version}"
- logger.info("Uninstalling %s:", dist_name_version)
-
- with indent_log():
- if auto_confirm or self._allowed_to_proceed(verbose):
- moved = self._moved_paths
-
- for_rename = compress_for_rename(self._paths)
-
- for path in sorted(compact(for_rename)):
- moved.stash(path)
- logger.verbose("Removing file or directory %s", path)
-
- for pth in self._pth.values():
- pth.remove()
-
- logger.info("Successfully uninstalled %s", dist_name_version)
-
- def _allowed_to_proceed(self, verbose: bool) -> bool:
- """Display which files would be deleted and prompt for confirmation"""
-
- def _display(msg: str, paths: Iterable[str]) -> None:
- if not paths:
- return
-
- logger.info(msg)
- with indent_log():
- for path in sorted(compact(paths)):
- logger.info(path)
-
- if not verbose:
- will_remove, will_skip = compress_for_output_listing(self._paths)
- else:
- # In verbose mode, display all the files that are going to be
- # deleted.
- will_remove = set(self._paths)
- will_skip = set()
-
- _display("Would remove:", will_remove)
- _display("Would not remove (might be manually added):", will_skip)
- _display("Would not remove (outside of prefix):", self._refuse)
- if verbose:
- _display("Will actually move:", compress_for_rename(self._paths))
-
- return ask("Proceed (Y/n)? ", ("y", "n", "")) != "n"
-
- def rollback(self) -> None:
- """Rollback the changes previously made by remove()."""
- if not self._moved_paths.can_rollback:
- logger.error(
- "Can't roll back %s; was not uninstalled",
- self._dist.raw_name,
- )
- return
- logger.info("Rolling back uninstall of %s", self._dist.raw_name)
- self._moved_paths.rollback()
- for pth in self._pth.values():
- pth.rollback()
-
- def commit(self) -> None:
- """Remove temporary save dir: rollback will no longer be possible."""
- self._moved_paths.commit()
-
- @classmethod
- def from_dist(cls, dist: BaseDistribution) -> "UninstallPathSet":
- dist_location = dist.location
- info_location = dist.info_location
- if dist_location is None:
- logger.info(
- "Not uninstalling %s since it is not installed",
- dist.canonical_name,
- )
- return cls(dist)
-
- normalized_dist_location = normalize_path(dist_location)
- if not dist.local:
- logger.info(
- "Not uninstalling %s at %s, outside environment %s",
- dist.canonical_name,
- normalized_dist_location,
- sys.prefix,
- )
- return cls(dist)
-
- if normalized_dist_location in {
- p
- for p in {sysconfig.get_path("stdlib"), sysconfig.get_path("platstdlib")}
- if p
- }:
- logger.info(
- "Not uninstalling %s at %s, as it is in the standard library.",
- dist.canonical_name,
- normalized_dist_location,
- )
- return cls(dist)
-
- paths_to_remove = cls(dist)
- develop_egg_link = egg_link_path_from_location(dist.raw_name)
-
- # Distribution is installed with metadata in a "flat" .egg-info
- # directory. This means it is not a modern .dist-info installation, an
- # egg, or legacy editable.
- setuptools_flat_installation = (
- dist.installed_with_setuptools_egg_info
- and info_location is not None
- and os.path.exists(info_location)
- # If dist is editable and the location points to a ``.egg-info``,
- # we are in fact in the legacy editable case.
- and not info_location.endswith(f"{dist.setuptools_filename}.egg-info")
- )
-
- # The order of these uninstall cases matters: when two installs of the
- # same package exist, pip needs to uninstall the currently detected one.
- if setuptools_flat_installation:
- if info_location is not None:
- paths_to_remove.add(info_location)
- installed_files = dist.iter_declared_entries()
- if installed_files is not None:
- for installed_file in installed_files:
- paths_to_remove.add(os.path.join(dist_location, installed_file))
- # FIXME: need a test for this elif block
- # occurs with --single-version-externally-managed/--record outside
- # of pip
- elif dist.is_file("top_level.txt"):
- try:
- namespace_packages = dist.read_text("namespace_packages.txt")
- except FileNotFoundError:
- namespaces = []
- else:
- namespaces = namespace_packages.splitlines(keepends=False)
- for top_level_pkg in [
- p
- for p in dist.read_text("top_level.txt").splitlines()
- if p and p not in namespaces
- ]:
- path = os.path.join(dist_location, top_level_pkg)
- paths_to_remove.add(path)
- paths_to_remove.add(f"{path}.py")
- paths_to_remove.add(f"{path}.pyc")
- paths_to_remove.add(f"{path}.pyo")
-
- elif dist.installed_by_distutils:
- raise UninstallationError(
- "Cannot uninstall {!r}. It is a distutils installed project "
- "and thus we cannot accurately determine which files belong "
- "to it which would lead to only a partial uninstall.".format(
- dist.raw_name,
- )
- )
-
- elif dist.installed_as_egg:
- # package installed by easy_install
- # We cannot match on dist.egg_name because it can slightly vary
- # i.e. setuptools-0.6c11-py2.6.egg vs setuptools-0.6rc11-py2.6.egg
- paths_to_remove.add(dist_location)
- easy_install_egg = os.path.split(dist_location)[1]
- easy_install_pth = os.path.join(
- os.path.dirname(dist_location),
- "easy-install.pth",
- )
- paths_to_remove.add_pth(easy_install_pth, "./" + easy_install_egg)
-
- elif dist.installed_with_dist_info:
- for path in uninstallation_paths(dist):
- paths_to_remove.add(path)
-
- elif develop_egg_link:
- # PEP 660 modern editable is handled in the ``.dist-info`` case
- # above, so this only covers the setuptools-style editable.
- with open(develop_egg_link) as fh:
- link_pointer = os.path.normcase(fh.readline().strip())
- normalized_link_pointer = normalize_path(link_pointer)
- assert os.path.samefile(
- normalized_link_pointer, normalized_dist_location
- ), (
- f"Egg-link {link_pointer} does not match installed location of "
- f"{dist.raw_name} (at {dist_location})"
- )
- paths_to_remove.add(develop_egg_link)
- easy_install_pth = os.path.join(
- os.path.dirname(develop_egg_link), "easy-install.pth"
- )
- paths_to_remove.add_pth(easy_install_pth, dist_location)
-
- else:
- logger.debug(
- "Not sure how to uninstall: %s - Check: %s",
- dist,
- dist_location,
- )
-
- if dist.in_usersite:
- bin_dir = get_bin_user()
- else:
- bin_dir = get_bin_prefix()
-
- # find distutils scripts= scripts
- try:
- for script in dist.iter_distutils_script_names():
- paths_to_remove.add(os.path.join(bin_dir, script))
- if WINDOWS:
- paths_to_remove.add(os.path.join(bin_dir, f"{script}.bat"))
- except (FileNotFoundError, NotADirectoryError):
- pass
-
- # find console_scripts and gui_scripts
- def iter_scripts_to_remove(
- dist: BaseDistribution,
- bin_dir: str,
- ) -> Generator[str, None, None]:
- for entry_point in dist.iter_entry_points():
- if entry_point.group == "console_scripts":
- yield from _script_names(bin_dir, entry_point.name, False)
- elif entry_point.group == "gui_scripts":
- yield from _script_names(bin_dir, entry_point.name, True)
-
- for s in iter_scripts_to_remove(dist, bin_dir):
- paths_to_remove.add(s)
-
- return paths_to_remove
-
-
-class UninstallPthEntries:
- def __init__(self, pth_file: str) -> None:
- self.file = pth_file
- self.entries: Set[str] = set()
- self._saved_lines: Optional[List[bytes]] = None
-
- def add(self, entry: str) -> None:
- entry = os.path.normcase(entry)
- # On Windows, os.path.normcase converts the entry to use
- # backslashes. This is correct for entries that describe absolute
- # paths outside of site-packages, but all the others use forward
- # slashes.
- # os.path.splitdrive is used instead of os.path.isabs because isabs
- # treats non-absolute paths with drive letter markings like c:foo\bar
- # as absolute paths. It also does not recognize UNC paths if they don't
- # have more than "\\server\share". Valid examples: "\\server\share\" or
- # "\\server\share\folder".
- if WINDOWS and not os.path.splitdrive(entry)[0]:
- entry = entry.replace("\\", "/")
- self.entries.add(entry)
-
- def remove(self) -> None:
- logger.verbose("Removing pth entries from %s:", self.file)
-
- # If the file doesn't exist, log a warning and return
- if not os.path.isfile(self.file):
- logger.warning("Cannot remove entries from nonexistent file %s", self.file)
- return
- with open(self.file, "rb") as fh:
- # A .pth file may use either '\r\n' or '\n' line endings; detect which.
- lines = fh.readlines()
- self._saved_lines = lines
- if any(b"\r\n" in line for line in lines):
- endline = "\r\n"
- else:
- endline = "\n"
- # handle missing trailing newline
- if lines and not lines[-1].endswith(endline.encode("utf-8")):
- lines[-1] = lines[-1] + endline.encode("utf-8")
- for entry in self.entries:
- try:
- logger.verbose("Removing entry: %s", entry)
- lines.remove((entry + endline).encode("utf-8"))
- except ValueError:
- pass
- with open(self.file, "wb") as fh:
- fh.writelines(lines)
-
- def rollback(self) -> bool:
- if self._saved_lines is None:
- logger.error("Cannot roll back changes to %s, none were made", self.file)
- return False
- logger.debug("Rolling %s back to previous state", self.file)
- with open(self.file, "wb") as fh:
- fh.writelines(self._saved_lines)
- return True
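
The `compact()` helper near the top of the deleted `req_uninstall.py` reduces a set of paths to the minimal covering set before anything is shown to the user. Here is a self-contained re-implementation to make the idea concrete; it assumes POSIX-style separators and is not importable from pip as shown:

```python
import os

def compact(paths):
    """Keep only the shortest paths that already cover everything else."""
    sep = os.path.sep
    short_paths = set()
    for path in sorted(paths, key=len):  # shortest candidates first
        covered = any(
            path.startswith(short.rstrip("*"))
            and path[len(short.rstrip("*").rstrip(sep))] == sep
            for short in short_paths
        )
        if not covered:
            short_paths.add(path)
    return short_paths

print(compact({"/a/path/", "/a/path/to/a/file.txt", "/b/other.txt"}))
# -> {'/a/path/', '/b/other.txt'}; the nested file is covered by /a/path/
```

Sorting by length first means any path that would cover another is considered before the paths it covers, which is what lets a single linear pass suffice.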
diff --git a/env/lib/python3.9/site-packages/pip/_internal/resolution/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/resolution/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/pip/_internal/resolution/base.py b/env/lib/python3.9/site-packages/pip/_internal/resolution/base.py
deleted file mode 100644
index 42dade1..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/resolution/base.py
+++ /dev/null
@@ -1,20 +0,0 @@
-from typing import Callable, List, Optional
-
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.req.req_set import RequirementSet
-
-InstallRequirementProvider = Callable[
- [str, Optional[InstallRequirement]], InstallRequirement
-]
-
-
-class BaseResolver:
- def resolve(
- self, root_reqs: List[InstallRequirement], check_supported_wheels: bool
- ) -> RequirementSet:
- raise NotImplementedError()
-
- def get_installation_order(
- self, req_set: RequirementSet
- ) -> List[InstallRequirement]:
- raise NotImplementedError()
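
`resolution/base.py` declares only the resolver contract: concrete resolvers implement `resolve()` and `get_installation_order()` behind a common interface, so the legacy and resolvelib backends are interchangeable at the call site. A toy illustration of the pattern, with illustrative names rather than pip's classes:

```python
from typing import List

class BaseResolver:
    """Interface only: concrete resolvers plug in behind these two methods."""

    def resolve(self, root_reqs: List[str]) -> List[str]:
        raise NotImplementedError()

    def get_installation_order(self, resolved: List[str]) -> List[str]:
        raise NotImplementedError()

class IdentityResolver(BaseResolver):
    """Trivial resolver: every root requirement resolves to itself."""

    def resolve(self, root_reqs: List[str]) -> List[str]:
        return list(dict.fromkeys(root_reqs))  # de-duplicate, keep order

    def get_installation_order(self, resolved: List[str]) -> List[str]:
        return resolved  # no dependencies discovered, so order is unchanged

resolver: BaseResolver = IdentityResolver()
print(resolver.get_installation_order(resolver.resolve(["requests", "idna", "requests"])))
# ['requests', 'idna']
```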
diff --git a/env/lib/python3.9/site-packages/pip/_internal/resolution/legacy/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/resolution/legacy/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/pip/_internal/resolution/legacy/resolver.py b/env/lib/python3.9/site-packages/pip/_internal/resolution/legacy/resolver.py
deleted file mode 100644
index 1225ae7..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/resolution/legacy/resolver.py
+++ /dev/null
@@ -1,591 +0,0 @@
-"""Dependency Resolution
-
-The dependency resolution in pip is performed as follows:
-
-for top-level requirements:
- a. only one spec is allowed per project, whether or not the specs
- conflict; otherwise a "double requirement" exception is raised
- b. they override sub-dependency requirements.
-for sub-dependencies
- a. "first found, wins" (where the order is breadth first)
-"""
-
-# The following comment should be removed at some point in the future.
-# mypy: strict-optional=False
-
-import logging
-import sys
-from collections import defaultdict
-from itertools import chain
-from typing import DefaultDict, Iterable, List, Optional, Set, Tuple
-
-from pip._vendor.packaging import specifiers
-from pip._vendor.packaging.requirements import Requirement
-
-from pip._internal.cache import WheelCache
-from pip._internal.exceptions import (
- BestVersionAlreadyInstalled,
- DistributionNotFound,
- HashError,
- HashErrors,
- InstallationError,
- NoneMetadataError,
- UnsupportedPythonVersion,
-)
-from pip._internal.index.package_finder import PackageFinder
-from pip._internal.metadata import BaseDistribution
-from pip._internal.models.link import Link
-from pip._internal.models.wheel import Wheel
-from pip._internal.operations.prepare import RequirementPreparer
-from pip._internal.req.req_install import (
- InstallRequirement,
- check_invalid_constraint_type,
-)
-from pip._internal.req.req_set import RequirementSet
-from pip._internal.resolution.base import BaseResolver, InstallRequirementProvider
-from pip._internal.utils import compatibility_tags
-from pip._internal.utils.compatibility_tags import get_supported
-from pip._internal.utils.logging import indent_log
-from pip._internal.utils.misc import normalize_version_info
-from pip._internal.utils.packaging import check_requires_python
-
-logger = logging.getLogger(__name__)
-
-DiscoveredDependencies = DefaultDict[str, List[InstallRequirement]]
-
-
-def _check_dist_requires_python(
- dist: BaseDistribution,
- version_info: Tuple[int, int, int],
- ignore_requires_python: bool = False,
-) -> None:
- """
- Check whether the given Python version is compatible with a distribution's
- "Requires-Python" value.
-
- :param version_info: A 3-tuple of ints representing the Python
- major-minor-micro version to check.
- :param ignore_requires_python: Whether to ignore the "Requires-Python"
- value if the given Python version isn't compatible.
-
- :raises UnsupportedPythonVersion: When the given Python version isn't
- compatible.
- """
- # This idiosyncratically converts the SpecifierSet to str and lets
- # check_requires_python parse it back into a SpecifierSet. But this
- # is the legacy resolver so I'm just not going to bother refactoring.
- try:
- requires_python = str(dist.requires_python)
- except FileNotFoundError as e:
- raise NoneMetadataError(dist, str(e))
- try:
- is_compatible = check_requires_python(
- requires_python,
- version_info=version_info,
- )
- except specifiers.InvalidSpecifier as exc:
- logger.warning(
- "Package %r has an invalid Requires-Python: %s", dist.raw_name, exc
- )
- return
-
- if is_compatible:
- return
-
- version = ".".join(map(str, version_info))
- if ignore_requires_python:
- logger.debug(
- "Ignoring failed Requires-Python check for package %r: %s not in %r",
- dist.raw_name,
- version,
- requires_python,
- )
- return
-
- raise UnsupportedPythonVersion(
- "Package {!r} requires a different Python: {} not in {!r}".format(
- dist.raw_name, version, requires_python
- )
- )
-
-
-class Resolver(BaseResolver):
- """Resolves which packages need to be installed/uninstalled to perform \
- the requested operation without breaking the requirements of any package.
- """
-
- _allowed_strategies = {"eager", "only-if-needed", "to-satisfy-only"}
-
- def __init__(
- self,
- preparer: RequirementPreparer,
- finder: PackageFinder,
- wheel_cache: Optional[WheelCache],
- make_install_req: InstallRequirementProvider,
- use_user_site: bool,
- ignore_dependencies: bool,
- ignore_installed: bool,
- ignore_requires_python: bool,
- force_reinstall: bool,
- upgrade_strategy: str,
- py_version_info: Optional[Tuple[int, ...]] = None,
- ) -> None:
- super().__init__()
- assert upgrade_strategy in self._allowed_strategies
-
- if py_version_info is None:
- py_version_info = sys.version_info[:3]
- else:
- py_version_info = normalize_version_info(py_version_info)
-
- self._py_version_info = py_version_info
-
- self.preparer = preparer
- self.finder = finder
- self.wheel_cache = wheel_cache
-
- self.upgrade_strategy = upgrade_strategy
- self.force_reinstall = force_reinstall
- self.ignore_dependencies = ignore_dependencies
- self.ignore_installed = ignore_installed
- self.ignore_requires_python = ignore_requires_python
- self.use_user_site = use_user_site
- self._make_install_req = make_install_req
-
- self._discovered_dependencies: DiscoveredDependencies = defaultdict(list)
-
- def resolve(
- self, root_reqs: List[InstallRequirement], check_supported_wheels: bool
- ) -> RequirementSet:
- """Resolve what operations need to be done
-
- As a side-effect of this method, the packages (and their dependencies)
- are downloaded, unpacked and prepared for installation. This
- preparation is done by ``pip.operations.prepare``.
-
- Once PyPI has static dependency metadata available, it would be
- possible to move the preparation to become a step separated from
- dependency resolution.
- """
- requirement_set = RequirementSet(check_supported_wheels=check_supported_wheels)
- for req in root_reqs:
- if req.constraint:
- check_invalid_constraint_type(req)
- self._add_requirement_to_set(requirement_set, req)
-
- # Actually prepare the files, and collect any exceptions. Most hash
- # exceptions cannot be checked ahead of time, because
- # _populate_link() needs to be called before we can make decisions
- # based on link type.
- discovered_reqs: List[InstallRequirement] = []
- hash_errors = HashErrors()
- for req in chain(requirement_set.all_requirements, discovered_reqs):
- try:
- discovered_reqs.extend(self._resolve_one(requirement_set, req))
- except HashError as exc:
- exc.req = req
- hash_errors.append(exc)
-
- if hash_errors:
- raise hash_errors
-
- return requirement_set
-
- def _add_requirement_to_set(
- self,
- requirement_set: RequirementSet,
- install_req: InstallRequirement,
- parent_req_name: Optional[str] = None,
- extras_requested: Optional[Iterable[str]] = None,
- ) -> Tuple[List[InstallRequirement], Optional[InstallRequirement]]:
- """Add install_req as a requirement to install.
-
- :param parent_req_name: The name of the requirement that needed this
- added. The name is used because when multiple unnamed requirements
- resolve to the same name, we could otherwise end up with dependency
- links that point outside the Requirements set. parent_req must
- already be added. Note that None implies that this is a user
- supplied requirement, vs an inferred one.
- :param extras_requested: an iterable of extras used to evaluate the
- environment markers.
- :return: Additional requirements to scan. That is either [] if
- the requirement is not applicable, or [install_req] if the
- requirement is applicable and has just been added.
- """
- # If the markers do not match, ignore this requirement.
- if not install_req.match_markers(extras_requested):
- logger.info(
- "Ignoring %s: markers '%s' don't match your environment",
- install_req.name,
- install_req.markers,
- )
- return [], None
-
- # If the wheel is not supported, raise an error.
- # Should check this after filtering out based on environment markers to
- # allow specifying different wheels based on the environment/OS, in a
- # single requirements file.
- if install_req.link and install_req.link.is_wheel:
- wheel = Wheel(install_req.link.filename)
- tags = compatibility_tags.get_supported()
- if requirement_set.check_supported_wheels and not wheel.supported(tags):
- raise InstallationError(
- "{} is not a supported wheel on this platform.".format(
- wheel.filename
- )
- )
-
- # This next bit is really a sanity check.
- assert (
- not install_req.user_supplied or parent_req_name is None
- ), "a user supplied req shouldn't have a parent"
-
- # Unnamed requirements are scanned again and the requirement won't be
- # added as a dependency until after scanning.
- if not install_req.name:
- requirement_set.add_unnamed_requirement(install_req)
- return [install_req], None
-
- try:
- existing_req: Optional[
- InstallRequirement
- ] = requirement_set.get_requirement(install_req.name)
- except KeyError:
- existing_req = None
-
- has_conflicting_requirement = (
- parent_req_name is None
- and existing_req
- and not existing_req.constraint
- and existing_req.extras == install_req.extras
- and existing_req.req
- and install_req.req
- and existing_req.req.specifier != install_req.req.specifier
- )
- if has_conflicting_requirement:
- raise InstallationError(
- "Double requirement given: {} (already in {}, name={!r})".format(
- install_req, existing_req, install_req.name
- )
- )
-
- # When no existing requirement exists, add the requirement as a
- # dependency and it will be scanned again after.
- if not existing_req:
- requirement_set.add_named_requirement(install_req)
- # We'd want to rescan this requirement later
- return [install_req], install_req
-
- # Assume there's no need to scan, and that we've already
- # encountered this for scanning.
- if install_req.constraint or not existing_req.constraint:
- return [], existing_req
-
- does_not_satisfy_constraint = install_req.link and not (
- existing_req.link and install_req.link.path == existing_req.link.path
- )
- if does_not_satisfy_constraint:
- raise InstallationError(
- "Could not satisfy constraints for '{}': "
- "installation from path or url cannot be "
- "constrained to a version".format(install_req.name)
- )
- # If we're now installing a constraint, mark the existing
- # object for real installation.
- existing_req.constraint = False
- # If we're now installing a user supplied requirement,
- # mark the existing object as such.
- if install_req.user_supplied:
- existing_req.user_supplied = True
- existing_req.extras = tuple(
- sorted(set(existing_req.extras) | set(install_req.extras))
- )
- logger.debug(
- "Setting %s extras to: %s",
- existing_req,
- existing_req.extras,
- )
- # Return the existing requirement for addition to the parent and
- # scanning again.
- return [existing_req], existing_req
-
- def _is_upgrade_allowed(self, req: InstallRequirement) -> bool:
- if self.upgrade_strategy == "to-satisfy-only":
- return False
- elif self.upgrade_strategy == "eager":
- return True
- else:
- assert self.upgrade_strategy == "only-if-needed"
- return req.user_supplied or req.constraint
-
- def _set_req_to_reinstall(self, req: InstallRequirement) -> None:
- """
- Set a requirement to be installed.
- """
- # Don't uninstall the conflict if doing a user install and the
- # conflict is not a user install.
- if not self.use_user_site or req.satisfied_by.in_usersite:
- req.should_reinstall = True
- req.satisfied_by = None
-
- def _check_skip_installed(
- self, req_to_install: InstallRequirement
- ) -> Optional[str]:
- """Check if req_to_install should be skipped.
-
- This will check if the req is installed, and whether we should upgrade
- or reinstall it, taking into account all the relevant user options.
-
- After calling this req_to_install will only have satisfied_by set to
- None if the req_to_install is to be upgraded/reinstalled etc. Any
- other value will be a dist recording the current thing installed that
- satisfies the requirement.
-
- Note that for VCS URLs and the like we can't assess skipping in this
- routine - we simply identify that the source needs to be pulled down,
- and later on it is downloaded and introspected to assess upgrades/
- reinstalls etc.
-
- :return: A text reason for why it was skipped, or None.
- """
- if self.ignore_installed:
- return None
-
- req_to_install.check_if_exists(self.use_user_site)
- if not req_to_install.satisfied_by:
- return None
-
- if self.force_reinstall:
- self._set_req_to_reinstall(req_to_install)
- return None
-
- if not self._is_upgrade_allowed(req_to_install):
- if self.upgrade_strategy == "only-if-needed":
- return "already satisfied, skipping upgrade"
- return "already satisfied"
-
- # Check for the possibility of an upgrade. For link-based
- # requirements we have to pull the tree down and inspect to assess
- # the version #, so it's handled way down.
- if not req_to_install.link:
- try:
- self.finder.find_requirement(req_to_install, upgrade=True)
- except BestVersionAlreadyInstalled:
- # Then the best version is installed.
- return "already up-to-date"
- except DistributionNotFound:
- # No distribution found, so we squash the error. It will
- # be raised later when we re-try later to do the install.
- # Why don't we just raise here?
- pass
-
- self._set_req_to_reinstall(req_to_install)
- return None
-
- def _find_requirement_link(self, req: InstallRequirement) -> Optional[Link]:
- upgrade = self._is_upgrade_allowed(req)
- best_candidate = self.finder.find_requirement(req, upgrade)
- if not best_candidate:
- return None
-
- # Log a warning per PEP 592 if necessary before returning.
- link = best_candidate.link
- if link.is_yanked:
- reason = link.yanked_reason or ""
- msg = (
- # Mark this as a unicode string to prevent
- # "UnicodeEncodeError: 'ascii' codec can't encode character"
- # in Python 2 when the reason contains non-ascii characters.
- "The candidate selected for download or install is a "
- "yanked version: {candidate}\n"
- "Reason for being yanked: {reason}"
- ).format(candidate=best_candidate, reason=reason)
- logger.warning(msg)
-
- return link
-
- def _populate_link(self, req: InstallRequirement) -> None:
- """Ensure that if a link can be found for this, that it is found.
-
- Note that req.link may still be None - if the requirement is already
- installed and not needed to be upgraded based on the return value of
- _is_upgrade_allowed().
-
- If preparer.require_hashes is True, don't use the wheel cache, because
- cached wheels, always built locally, have different hashes than the
- files downloaded from the index server and thus throw false hash
- mismatches. Furthermore, cached wheels at present have nondeterministic
- contents due to file modification times.
- """
- if req.link is None:
- req.link = self._find_requirement_link(req)
-
- if self.wheel_cache is None or self.preparer.require_hashes:
- return
- cache_entry = self.wheel_cache.get_cache_entry(
- link=req.link,
- package_name=req.name,
- supported_tags=get_supported(),
- )
- if cache_entry is not None:
- logger.debug("Using cached wheel link: %s", cache_entry.link)
- if req.link is req.original_link and cache_entry.persistent:
- req.original_link_is_in_wheel_cache = True
- req.link = cache_entry.link
-
- def _get_dist_for(self, req: InstallRequirement) -> BaseDistribution:
- """Takes a InstallRequirement and returns a single AbstractDist \
- representing a prepared variant of the same.
- """
- if req.editable:
- return self.preparer.prepare_editable_requirement(req)
-
- # satisfied_by is only evaluated by calling _check_skip_installed,
- # so it must be None here.
- assert req.satisfied_by is None
- skip_reason = self._check_skip_installed(req)
-
- if req.satisfied_by:
- return self.preparer.prepare_installed_requirement(req, skip_reason)
-
- # We eagerly populate the link, since that's our "legacy" behavior.
- self._populate_link(req)
- dist = self.preparer.prepare_linked_requirement(req)
-
- # NOTE
- # The following portion is for determining if a certain package is
- # going to be re-installed/upgraded or not and reporting to the user.
- # This should probably get cleaned up in a future refactor.
-
- # req.req is only available after unpacking for URL packages, so
- # check_if_exists is repeated here to support uninstall-on-upgrade (#14).
- if not self.ignore_installed:
- req.check_if_exists(self.use_user_site)
-
- if req.satisfied_by:
- should_modify = (
- self.upgrade_strategy != "to-satisfy-only"
- or self.force_reinstall
- or self.ignore_installed
- or req.link.scheme == "file"
- )
- if should_modify:
- self._set_req_to_reinstall(req)
- else:
- logger.info(
- "Requirement already satisfied (use --upgrade to upgrade): %s",
- req,
- )
- return dist
-
- def _resolve_one(
- self,
- requirement_set: RequirementSet,
- req_to_install: InstallRequirement,
- ) -> List[InstallRequirement]:
- """Prepare a single requirements file.
-
- :return: A list of additional InstallRequirements to also install.
- """
- # Tell user what we are doing for this requirement:
- # obtain (editable), skipping, processing (local url), collecting
- # (remote url or package name)
- if req_to_install.constraint or req_to_install.prepared:
- return []
-
- req_to_install.prepared = True
-
- # Parse and return dependencies
- dist = self._get_dist_for(req_to_install)
- # This will raise UnsupportedPythonVersion if the given Python
- # version isn't compatible with the distribution's Requires-Python.
- _check_dist_requires_python(
- dist,
- version_info=self._py_version_info,
- ignore_requires_python=self.ignore_requires_python,
- )
-
- more_reqs: List[InstallRequirement] = []
-
- def add_req(subreq: Requirement, extras_requested: Iterable[str]) -> None:
- # This idiosyncratically converts the Requirement to str and lets
- # make_install_req parse it back into a Requirement. But this is
- # the legacy resolver so I'm just not going to bother refactoring.
- sub_install_req = self._make_install_req(str(subreq), req_to_install)
- parent_req_name = req_to_install.name
- to_scan_again, add_to_parent = self._add_requirement_to_set(
- requirement_set,
- sub_install_req,
- parent_req_name=parent_req_name,
- extras_requested=extras_requested,
- )
- if parent_req_name and add_to_parent:
- self._discovered_dependencies[parent_req_name].append(add_to_parent)
- more_reqs.extend(to_scan_again)
-
- with indent_log():
- # We add req_to_install before its dependencies, so that we
- # can refer to it when adding dependencies.
- if not requirement_set.has_requirement(req_to_install.name):
- # 'unnamed' requirements will get added here
- # 'unnamed' requirements can only come from being directly
- # provided by the user.
- assert req_to_install.user_supplied
- self._add_requirement_to_set(
- requirement_set, req_to_install, parent_req_name=None
- )
-
- if not self.ignore_dependencies:
- if req_to_install.extras:
- logger.debug(
- "Installing extra requirements: %r",
- ",".join(req_to_install.extras),
- )
- missing_requested = sorted(
- set(req_to_install.extras) - set(dist.iter_provided_extras())
- )
- for missing in missing_requested:
- logger.warning(
- "%s %s does not provide the extra '%s'",
- dist.raw_name,
- dist.version,
- missing,
- )
-
- available_requested = sorted(
- set(dist.iter_provided_extras()) & set(req_to_install.extras)
- )
- for subreq in dist.iter_dependencies(available_requested):
- add_req(subreq, extras_requested=available_requested)
-
- return more_reqs
-
- def get_installation_order(
- self, req_set: RequirementSet
- ) -> List[InstallRequirement]:
- """Create the installation order.
-
- The installation order is topological - requirements are installed
- before the requiring thing. We break cycles at an arbitrary point,
- and make no other guarantees.
- """
- # The current implementation, which we may change at any point
- # installs the user specified things in the order given, except when
- # dependencies must come earlier to achieve topological order.
- order = []
- ordered_reqs: Set[InstallRequirement] = set()
-
- def schedule(req: InstallRequirement) -> None:
- if req.satisfied_by or req in ordered_reqs:
- return
- if req.constraint:
- return
- ordered_reqs.add(req)
- for dep in self._discovered_dependencies[req.name]:
- schedule(dep)
- order.append(req)
-
- for install_req in req_set.requirements.values():
- schedule(install_req)
- return order
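
`get_installation_order()` above is a depth-first post-order walk: a requirement is appended only after its discovered dependencies, and the visited set breaks cycles at an arbitrary point. A stripped-down sketch of the same traversal over a plain dependency mapping (names here are illustrative):

```python
from typing import Dict, List, Set

def installation_order(roots: List[str], deps: Dict[str, List[str]]) -> List[str]:
    order: List[str] = []
    seen: Set[str] = set()

    def schedule(name: str) -> None:
        if name in seen:  # already scheduled, or part of a cycle being broken
            return
        seen.add(name)
        for dep in deps.get(name, []):
            schedule(dep)
        order.append(name)  # appended only after all of its dependencies

    for root in roots:
        schedule(root)
    return order

deps = {"app": ["requests"], "requests": ["urllib3", "idna"], "urllib3": []}
print(installation_order(["app"], deps))
# ['urllib3', 'idna', 'requests', 'app']
```

Marking a node as seen before recursing is what makes cycle-breaking "arbitrary": whichever member of a cycle is visited first ends up earliest in the order.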
diff --git a/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/__init__.py b/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/base.py b/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/base.py
deleted file mode 100644
index b206692..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/base.py
+++ /dev/null
@@ -1,141 +0,0 @@
-from typing import FrozenSet, Iterable, Optional, Tuple, Union
-
-from pip._vendor.packaging.specifiers import SpecifierSet
-from pip._vendor.packaging.utils import NormalizedName, canonicalize_name
-from pip._vendor.packaging.version import LegacyVersion, Version
-
-from pip._internal.models.link import Link, links_equivalent
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.utils.hashes import Hashes
-
-CandidateLookup = Tuple[Optional["Candidate"], Optional[InstallRequirement]]
-CandidateVersion = Union[LegacyVersion, Version]
-
-
-def format_name(project: str, extras: FrozenSet[str]) -> str:
- if not extras:
- return project
- canonical_extras = sorted(canonicalize_name(e) for e in extras)
- return "{}[{}]".format(project, ",".join(canonical_extras))
-
-
-class Constraint:
- def __init__(
- self, specifier: SpecifierSet, hashes: Hashes, links: FrozenSet[Link]
- ) -> None:
- self.specifier = specifier
- self.hashes = hashes
- self.links = links
-
- @classmethod
- def empty(cls) -> "Constraint":
- return Constraint(SpecifierSet(), Hashes(), frozenset())
-
- @classmethod
- def from_ireq(cls, ireq: InstallRequirement) -> "Constraint":
- links = frozenset([ireq.link]) if ireq.link else frozenset()
- return Constraint(ireq.specifier, ireq.hashes(trust_internet=False), links)
-
- def __bool__(self) -> bool:
- return bool(self.specifier) or bool(self.hashes) or bool(self.links)
-
- def __and__(self, other: InstallRequirement) -> "Constraint":
- if not isinstance(other, InstallRequirement):
- return NotImplemented
- specifier = self.specifier & other.specifier
- hashes = self.hashes & other.hashes(trust_internet=False)
- links = self.links
- if other.link:
- links = links.union([other.link])
- return Constraint(specifier, hashes, links)
-
- def is_satisfied_by(self, candidate: "Candidate") -> bool:
- # Reject if there are any mismatched URL constraints on this package.
- if self.links and not all(_match_link(link, candidate) for link in self.links):
- return False
- # We can safely always allow prereleases here since PackageFinder
- # already implements the prerelease logic, and would have filtered out
- # prerelease candidates if the user does not expect them.
- return self.specifier.contains(candidate.version, prereleases=True)
-
-
-class Requirement:
- @property
- def project_name(self) -> NormalizedName:
- """The "project name" of a requirement.
-
- This is different from ``name`` if this requirement contains extras,
- in which case ``name`` would contain the ``[...]`` part, while this
- refers to the name of the project.
- """
- raise NotImplementedError("Subclass should override")
-
- @property
- def name(self) -> str:
- """The name identifying this requirement in the resolver.
-
- This is different from ``project_name`` if this requirement contains
- extras, where ``project_name`` would not contain the ``[...]`` part.
- """
- raise NotImplementedError("Subclass should override")
-
- def is_satisfied_by(self, candidate: "Candidate") -> bool:
- return False
-
- def get_candidate_lookup(self) -> CandidateLookup:
- raise NotImplementedError("Subclass should override")
-
- def format_for_error(self) -> str:
- raise NotImplementedError("Subclass should override")
-
-
-def _match_link(link: Link, candidate: "Candidate") -> bool:
- if candidate.source_link:
- return links_equivalent(link, candidate.source_link)
- return False
-
-
-class Candidate:
- @property
- def project_name(self) -> NormalizedName:
- """The "project name" of the candidate.
-
- This is different from ``name`` if this candidate contains extras,
- in which case ``name`` would contain the ``[...]`` part, while this
- refers to the name of the project.
- """
- raise NotImplementedError("Override in subclass")
-
- @property
- def name(self) -> str:
- """The name identifying this candidate in the resolver.
-
- This is different from ``project_name`` if this candidate contains
- extras, where ``project_name`` would not contain the ``[...]`` part.
- """
- raise NotImplementedError("Override in subclass")
-
- @property
- def version(self) -> CandidateVersion:
- raise NotImplementedError("Override in subclass")
-
- @property
- def is_installed(self) -> bool:
- raise NotImplementedError("Override in subclass")
-
- @property
- def is_editable(self) -> bool:
- raise NotImplementedError("Override in subclass")
-
- @property
- def source_link(self) -> Optional[Link]:
- raise NotImplementedError("Override in subclass")
-
- def iter_dependencies(self, with_requires: bool) -> Iterable[Optional[Requirement]]:
- raise NotImplementedError("Override in subclass")
-
- def get_install_requirement(self) -> Optional[InstallRequirement]:
- raise NotImplementedError("Override in subclass")
-
- def format_for_error(self) -> str:
- raise NotImplementedError("Subclass should override")
diff --git a/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/candidates.py b/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/candidates.py
deleted file mode 100644
index d1470ec..0000000
--- a/env/lib/python3.9/site-packages/pip/_internal/resolution/resolvelib/candidates.py
+++ /dev/null
@@ -1,550 +0,0 @@
-import logging
-import sys
-from typing import TYPE_CHECKING, Any, FrozenSet, Iterable, Optional, Tuple, Union, cast
-
-from pip._vendor.packaging.utils import NormalizedName, canonicalize_name
-from pip._vendor.packaging.version import Version
-
-from pip._internal.exceptions import (
- HashError,
- InstallationSubprocessError,
- MetadataInconsistent,
-)
-from pip._internal.metadata import BaseDistribution
-from pip._internal.models.link import Link, links_equivalent
-from pip._internal.models.wheel import Wheel
-from pip._internal.req.constructors import (
- install_req_from_editable,
- install_req_from_line,
-)
-from pip._internal.req.req_install import InstallRequirement
-from pip._internal.utils.misc import normalize_version_info
-
-from .base import Candidate, CandidateVersion, Requirement, format_name
-
-if TYPE_CHECKING:
- from .factory import Factory
-
-logger = logging.getLogger(__name__)
-
-BaseCandidate = Union[
- "AlreadyInstalledCandidate",
- "EditableCandidate",
- "LinkCandidate",
-]
-
-# Avoid conflicting with the PyPI package "Python".
-REQUIRES_PYTHON_IDENTIFIER = cast(NormalizedName, "