
Conversation

@mmnsrti mmnsrti commented Jan 7, 2025

I made the LLM module configurable so it can use either local Ollama models, ChatGPT, or other backends.

@marcellodesales

If you make this PR backwards compatible with the previous version, it would be useful.

@JosueMonteiroUchoaAlves

You should create another file. You are using the GPT one and erasing the previous code.

Owner

@FujiwaraChoki FujiwaraChoki left a comment

This is not a good PR. Please improve it. Add support, don't remove the existing g4f implementation.
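One way to address this review (a sketch, not the PR's actual code) is to dispatch on the model name while keeping the existing g4f path as the default, so old callers keep working and new providers are purely additive. The backend names and stub callables below are hypothetical; in the real project they would wrap the g4f, OpenAI, and Ollama clients.

```python
# Sketch only: backwards-compatible backend dispatch.
# Unknown or unset model names fall back to g4f, preserving old behavior.
from typing import Callable, Dict


def generate_response(prompt: str, ai_model: str,
                      backends: Dict[str, Callable[[str], str]]) -> str:
    # Fall back to the original g4f implementation for any
    # unrecognized model name instead of raising an error.
    handler = backends.get(ai_model, backends["g4f"])
    return handler(prompt)


# Hypothetical stubs standing in for real g4f/OpenAI/Ollama clients.
backends = {
    "g4f": lambda p: f"[g4f] {p}",
    "gpt-4": lambda p: f"[openai] {p}",
    "ollama": lambda p: f"[ollama] {p}",
}
```

With this shape, adding Ollama or ChatGPT support means registering one more entry in `backends` rather than deleting the g4f code path.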

Owner

????

Comment on lines -3 to -19
import g4f
import json
import os
import openai
import google.generativeai as genai

from g4f.client import Client
import subprocess
from termcolor import colored
from dotenv import load_dotenv
from typing import Tuple, List
import sys

# Load environment variables
load_dotenv("../.env")

# Set API keys from the environment
OPENAI_API_KEY = os.getenv('OPENAI_API_KEY')
openai.api_key = OPENAI_API_KEY
GOOGLE_API_KEY = os.getenv('GOOGLE_API_KEY')
Owner

You are removing support for g4f. Why?

return search_terms


def generate_metadata(video_subject: str, script: str, ai_model: str) -> Tuple[str, str, List[str]]:
Owner

But we need metadata...
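Keeping metadata support alongside the new backends could look like the sketch below. The real function takes `ai_model: str`; here a hypothetical `llm` callable is injected instead, so the same code would work whether the backend is g4f, OpenAI, or Ollama. The prompts are illustrative assumptions, not the project's actual prompts.

```python
# Sketch only: backend-agnostic metadata generation.
from typing import Callable, List, Tuple


def generate_metadata(video_subject: str, script: str,
                      llm: Callable[[str], str]) -> Tuple[str, str, List[str]]:
    """Return (title, description, keywords) for the video.

    `llm` is any prompt -> completion function; injecting it keeps this
    code independent of which provider is configured.
    """
    title = llm(f"Write a short YouTube title about: {video_subject}")
    description = llm(f"Write a one-sentence description for: {script[:200]}")
    keywords_raw = llm(f"List comma-separated keywords for: {video_subject}")
    # Split the comma-separated reply into a clean keyword list.
    keywords = [k.strip() for k in keywords_raw.split(",") if k.strip()]
    return title, description, keywords
```

This keeps `generate_metadata` in place, addressing the "we need metadata" objection while still allowing the backend to be swapped.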
