diff --git a/.gitignore b/.gitignore
index 9334f368..1b52a651 100644
--- a/.gitignore
+++ b/.gitignore
@@ -2,6 +2,7 @@
__pycache__/
*.py[cod]
*$py.class
+.idea

# C extensions
*.so

diff --git "a/docs/FAQ/miniconda\345\256\211\350\243\205\347\256\241\347\220\206python.md" "b/docs/FAQ/miniconda\345\256\211\350\243\205\347\256\241\347\220\206python.md"
new file mode 100644
index 00000000..18338f1a
--- /dev/null
+++ "b/docs/FAQ/miniconda\345\256\211\350\243\205\347\256\241\347\220\206python.md"
@@ -0,0 +1,70 @@
# Overview
This document explains how to install Miniconda and how to set up a Python environment on top of it.

# Official documentation
https://docs.anaconda.com/free/miniconda/

## Installing on Windows
```bash
curl https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe -o miniconda.exe
start /wait "" miniconda.exe /S
del miniconda.exe
```

## Installing on macOS
```bash
mkdir -p ~/miniconda3
curl https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-arm64.sh -o ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm -rf ~/miniconda3/miniconda.sh
```
After installation, run the following commands to initialize conda for the bash and zsh shells:
```bash
~/miniconda3/bin/conda init bash
~/miniconda3/bin/conda init zsh
```

## Installing on Linux
```bash
mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm -rf ~/miniconda3/miniconda.sh
```
After installation, run the following commands to initialize conda for the bash and zsh shells:
```bash
~/miniconda3/bin/conda init bash
~/miniconda3/bin/conda init zsh
```

# Managing Python environments with Miniconda
- Install Python 3.10.13

Suppose you want an environment named myenv (any other name works too), pinned to Python 3.10.13:
```bash
conda create --name myenv python=3.10.13
```

- Activate the environment and install the dependencies from requirements.txt
```bash
conda activate myenv
pip install -r requirements.txt
```
Note: if downloads time out or fail, switch to a mirror:
```bash
pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple
```

- Deactivate the environment
```bash
conda deactivate
```

- Remove the environment
```bash
conda remove --name myenv --all
```
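As an optional sanity check after creating the environment, the minimal sketch below verifies which interpreter is active and snapshots the environment so it can be recreated elsewhere. It assumes the myenv name used above; the environment.yml and requirements.lock.txt file names are just examples.

```bash
# Confirm the environment exists and is active
conda env list
python --version          # should report Python 3.10.13
which python              # should point into ~/miniconda3/envs/myenv

# Snapshot the environment so it can be recreated on another machine
conda env export --name myenv > environment.yml
pip freeze > requirements.lock.txt

# Recreate it later from the snapshot
conda env create -f environment.yml
```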
diff --git "a/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md" "b/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"
new file mode 100644
index 00000000..566231de
--- /dev/null
+++ "b/docs/FAQ/\345\246\202\344\275\225\350\256\277\351\227\256openAI\346\216\245\345\217\243.md"
@@ -0,0 +1,32 @@
# How to access the OpenAI API
This document is for learning purposes only. For commercial use, please go through a properly licensed domestic provider and the services it offers.

## Option 1: Use a domestic proxy API
Simply switch to a domestic proxy API by replacing the domain: api.openai.com -> api.openai-proxy.com.
Risk warning: because this is a third-party proxy, use it for testing and learning only. **Do not use it in production, or your key may be stolen.**

```python
from openai import OpenAI

client = OpenAI(
    api_key='sk-xxx',
    base_url='https://api.openai-proxy.com/v1'
)
```

## Option 2: Use your own SOCKS5 proxy
If the proxy you purchased is configured as a global (system-wide) proxy, it works out of the box. If not, look up the local port your proxy listens on, e.g. 18080.

Install the library first:
```bash
pip install PySocks
```

Then set up the SOCKS5 proxy at the top of your code:
```python
import socket
import socks

socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 18080)
socket.socket = socks.socksocket
```
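For completeness, here is a minimal end-to-end sketch that combines the two snippets above with an actual API call. It assumes the SOCKS5 proxy from the example (127.0.0.1:18080), the gpt-3.5-turbo model, and a key supplied via the OPENAI_API_KEY environment variable; depending on your HTTP stack you may prefer to configure the proxy on the HTTP client itself instead of patching the socket module.

```python
import os
import socket

import socks
from openai import OpenAI

# Monkey-patch the socket module so all outbound connections go through the SOCKS5 proxy.
socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 18080)
socket.socket = socks.socksocket

# The key is read from the OPENAI_API_KEY environment variable.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response.choices[0].message.content)
```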
diff --git a/openai-translator/README-CN.md b/openai-translator/README-CN.md
index 00d890c1..c88b4e0c 100644
--- a/openai-translator/README-CN.md
+++ b/openai-translator/README-CN.md
@@ -44,24 +44,40 @@ The OpenAI Translator is still in an early stage of development, and I am actively adding more
### Environment Setup
-1. Clone the repository `git clone git@github.com:DjangoPeng/openai-translator.git`.
+1. Clone the repository:
+```bash
+git clone git@github.com:DjangoPeng/openai-translator.git
+```
-2. OpenAI-Translator requires Python 3.6 or later. Install the dependencies with `pip install -r requirements.txt`.
+2. Prepare the Python environment.
+- Version requirement: **Python >= 3.10.13**
+  - If you do not yet have a dedicated environment, see [Install and Manage Python with Miniconda](../docs/FAQ/miniconda%E5%AE%89%E8%A3%85%E7%AE%A1%E7%90%86python.md)
+- Install the dependencies (make sure you have created the Python environment myenv with Miniconda and activated it):
+```bash
+pip install -r requirements.txt
+```
-3. Set your OpenAI API key (`$OPENAI_API_KEY`) or ChatGLM model URL (`$GLM_MODEL_URL`). You can add it to your environment variables or specify it in the config.yaml file.
+3. Start the translator

-### Usage Examples
+Choose any one of the following ways to start it.
+- Start from the command line (recommended), using the OpenAI model:
+```bash
+# Replace the value with your actual OPENAI_API_KEY
+export OPENAI_API_KEY="sk-xxx"
+python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
+```
-You can use OpenAI-Translator either by specifying a configuration file or by providing command-line arguments.
+PS: on Windows, use the `set` command instead of `export`.
-#### Using a configuration file
+You will then see the result:
+![sample_out](images/sample_image_1.png)
+- Start from a YAML configuration file, using the OpenAI model
Adjust the `config.yaml` file according to your settings:
-
```yaml
OpenAIModel:
  model: "gpt-3.5-turbo"
-  api_key: "your_openai_api_key"
+  api_key: "sk-xxx"

GLMModel:
  model_url: "your_chatglm_model_url"
@@ -71,33 +87,19 @@ common:
  book: "test/test.pdf"
  file_format: "markdown"
```
-
-Then run it directly from the command line:
-
+Execute the command:
```bash
-python ai_translator/main.py
+python ai_translator/main.py --config config.yaml --model_type OpenAIModel
```
-![sample_out](images/sample_image_1.png)
-
-#### Using command-line arguments
-
-You can also specify the settings directly on the command line. Here is an example using the OpenAI model:
-
+- Start from the command line, using the GLM model:
```bash
-# Set your api_key as an environment variable
-export OPENAI_API_KEY="sk-xxx"
-python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
-```
-
-Here is an example using the GLM model:
-
-```bash
-# Set your GLM model URL as an environment variable
export GLM_MODEL_URL="http://xxx:xx"
python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf
```
+PS: on Windows, use the `set` command instead of `export`.
+
## License
This project is licensed under GPL-3.0. See the [LICENSE](LICENSE) file for details.

diff --git a/openai-translator/README.md b/openai-translator/README.md
index 11a42c20..dbacc9d0 100644
--- a/openai-translator/README.md
+++ b/openai-translator/README.md
@@ -43,26 +43,39 @@ The OpenAI Translator is still in its early stages of development, and I'm actively adding more
## Getting Started
-### Environment Setup
+### Quick Start
-1. Clone the repository `git clone git@github.com:DjangoPeng/openai-translator.git`.
-
-2. The `OpenAI-Translator` requires Python 3.6 or later. Install the dependencies with `pip install -r requirements.txt`.
-
-3. Set up your OpenAI API key (`$OPENAI_API_KEY`) or ChatGLM Model URL (`$GLM_MODEL_URL`). You can either add it to your environment variables or specify it in the config.yaml file.
-
-### Usage
+1. Clone the repository:
+```bash
+git clone git@github.com:DjangoPeng/openai-translator.git
+```
-You can use OpenAI-Translator either by specifying a configuration file or by providing command-line arguments.
+2. Prepare the Python environment:
+- Required Python version: Python >= 3.10.13
+  - If you don't have a dedicated environment yet, please refer to [Install and Manage Python with Miniconda](../docs/FAQ/miniconda%E5%AE%89%E8%A3%85%E7%AE%A1%E7%90%86python.md)
+- Install dependencies (make sure you have created a Python environment named myenv with Miniconda and activated it):
+```bash
+pip install -r requirements.txt
+```
-#### Using a configuration file:
+3. Start the translation program
+Choose one of the following methods to start:
-Adapt `config.yaml` file with your settings:
+- Command-line startup (recommended), using the OpenAI model:
+PS: on Windows, use the `set` command instead of `export`.
+```bash
+# Replace 'sk-xxx' with your actual OPENAI_API_KEY
+export OPENAI_API_KEY="sk-xxx"
+python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
+```
+You will see the result:
+![sample_out](images/sample_image_1.png)
+- YAML configuration file startup, using the OpenAI model. Adjust the config.yaml file according to your settings:
```yaml
OpenAIModel:
  model: "gpt-3.5-turbo"
-  api_key: "your_openai_api_key"
+  api_key: "sk-xxx"

GLMModel:
  model_url: "your_chatglm_model_url"
@@ -73,31 +86,19 @@ common:
  file_format: "markdown"
```
-Then run the tool:
-
+Execute the command:
```bash
-python ai_translator/main.py
+python ai_translator/main.py --config config.yaml --model_type OpenAIModel
```
-![sample_out](images/sample_image_1.png)
-
-#### Using command-line arguments:
-
-You can also specify the settings directly on the command line. Here's an example of how to use the OpenAI model:
-
+- Command-line startup, using the GLM model:
+PS: on Windows, use the `set` command instead of `export`.
```bash
-# Set your api_key as an env variable
-export OPENAI_API_KEY="sk-xxx"
-python ai_translator/main.py --model_type OpenAIModel --openai_api_key $OPENAI_API_KEY --file_format markdown --book tests/test.pdf --openai_model gpt-3.5-turbo
+export GLM_MODEL_URL="http://xxx:xx"
+python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf
```
-And an example of how to use the GLM model:
-```bash
-# Set your GLM Model URL as an env variable
-export GLM_MODEL_URL="http://xxx:xx"
-python ai_translator/main.py --model_type GLMModel --glm_model_url $GLM_MODEL_URL --book tests/test.pdf
-```
+
## License

diff --git a/openai-translator/ai_translator/main.py b/openai-translator/ai_translator/main.py
index 6b8e0c9b..7b02699b 100644
--- a/openai-translator/ai_translator/main.py
+++ b/openai-translator/ai_translator/main.py
@@ -18,6 +18,8 @@
    api_key = args.openai_api_key if args.openai_api_key else config['OpenAIModel']['api_key']
+   if args.model_type == 'OpenAIModel' and not (model_name and api_key):
+       raise Exception("--openai_model and --openai_api_key are required when using OpenAIModel")
    model = OpenAIModel(model=model_name, api_key=api_key)

    pdf_file_path = args.book if args.book else config['common']['book']
    file_format = args.file_format if args.file_format else config['common']['file_format']

diff --git a/openai-translator/ai_translator/utils/argument_parser.py b/openai-translator/ai_translator/utils/argument_parser.py
index 95681dc1..ed1932cf 100644
--- a/openai-translator/ai_translator/utils/argument_parser.py
+++ b/openai-translator/ai_translator/utils/argument_parser.py
@@ -14,6 +14,4 @@ def __init__(self):
    def parse_arguments(self):
        args = self.parser.parse_args()
-       if args.model_type == 'OpenAIModel' and not args.openai_model and not args.openai_api_key:
-           self.parser.error("--openai_model and --openai_api_key is required when using OpenAIModel")
        return args
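The main.py and argument_parser.py changes above move the OpenAIModel requirement check from argument-parsing time to after the CLI flags have been merged with config.yaml, so that either source can satisfy it. The standalone sketch below illustrates that merge-then-validate pattern; resolve_openai_settings is a hypothetical helper for illustration, not the project's actual code.

```python
import argparse


def resolve_openai_settings(args: argparse.Namespace, config: dict) -> tuple[str, str]:
    """Merge CLI values with config.yaml values, then validate the merged result."""
    # CLI flags win; fall back to the OpenAIModel section of config.yaml.
    model_name = args.openai_model or config.get("OpenAIModel", {}).get("model")
    api_key = args.openai_api_key or config.get("OpenAIModel", {}).get("api_key")

    # Validate only after merging, so either the CLI or config.yaml can supply the values.
    if args.model_type == "OpenAIModel" and not (model_name and api_key):
        raise ValueError(
            "--openai_model and --openai_api_key (or OpenAIModel.model / OpenAIModel.api_key "
            "in config.yaml) are required when using OpenAIModel"
        )
    return model_name, api_key
```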
"gpt-3.5-turbo" - api_key: "your_openai_api_key" + api_key: "sk-xxx" GLMModel: - model_url: "your_chatglm_model_url" + model_url: "your_chatglm_model_url like http://xxx:xx" timeout: 300 common: diff --git a/openai-translator/requirements.txt b/openai-translator/requirements.txt index 3ad8bd4c..0016508a 100644 --- a/openai-translator/requirements.txt +++ b/openai-translator/requirements.txt @@ -1,9 +1,36 @@ -pdfplumber -simplejson -requests -PyYAML -pillow -reportlab -pandas -loguru -openai \ No newline at end of file +annotated-types==0.6.0 +anyio==4.3.0 +certifi==2024.2.2 +cffi==1.16.0 +chardet==5.2.0 +charset-normalizer==3.3.2 +cryptography==42.0.5 +distro==1.9.0 +exceptiongroup==1.2.0 +h11==0.14.0 +httpcore==1.0.4 +httpx==0.27.0 +idna==3.6 +loguru==0.7.2 +numpy==1.26.4 +openai==1.14.2 +pandas==2.2.1 +pdfminer.six==20231228 +pdfplumber==0.11.0 +pillow==10.2.0 +pycparser==2.21 +pydantic==2.6.4 +pydantic_core==2.16.3 +pypdfium2==4.28.0 +python-dateutil==2.9.0.post0 +pytz==2024.1 +PyYAML==6.0.1 +reportlab==4.1.0 +requests==2.31.0 +simplejson==3.19.2 +six==1.16.0 +sniffio==1.3.1 +tqdm==4.66.2 +typing_extensions==4.10.0 +tzdata==2024.1 +urllib3==2.2.1 \ No newline at end of file