Merge conflicts: `.gitignore`, `frontend/src/SearchPage.tsx`
hudsonhadley requested changes on Apr 21, 2026
> 2. Open the project in IntelliJ and execute the `application/run` gradle task from the right-hand menu.
> 3. Navigate to `http://localhost:7070`.
> 4. From a terminal inside the python-backend folder, run `python -m venv venv` to create the virtual environment.
Collaborator:
Is there a way to incorporate these steps into the build.gradle instead of having to run each one manually?
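One possible approach is registering `Exec` tasks in build.gradle that create the venv and install dependencies. This is only a sketch: the task names, the `python-backend` path, and the existence of a `requirements.txt` are assumptions, and the `venv/bin/pip` path would need a Windows variant (`venv\Scripts\pip`).

```groovy
// Hypothetical build.gradle additions for automating the Python setup
tasks.register('createVenv', Exec) {
    workingDir 'python-backend'
    commandLine 'python', '-m', 'venv', 'venv'
}

tasks.register('installPythonDeps', Exec) {
    dependsOn 'createVenv'
    workingDir 'python-backend'
    // use the venv's own pip so packages land inside the virtual environment
    commandLine 'venv/bin/pip', 'install', '-r', 'requirements.txt'
}
```

The `application/run` task could then `dependsOn 'installPythonDeps'` so a single Gradle invocation prepares both backends.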
Collaborator:
Is there a reason why we shouldn't put this under backend/main/python?
```python
from pydantic import BaseModel
from fastapi.middleware.cors import CORSMiddleware
import os
from openai import OpenAI
```
Collaborator:
When I ran it, it couldn't find the `openai` module, even though I had run the steps listed in the README.
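A quick way to diagnose this (a sketch, not part of the PR) is to confirm that the interpreter uvicorn is running under is actually the venv's interpreter and that it can resolve the package; if `sys.executable` does not point inside `python-backend/venv`, the virtual environment was never activated before launching the server.

```python
# Sanity check: which interpreter is this, and can it see a given package?
import importlib.util
import sys


def can_import(module_name: str) -> bool:
    """Return True if the current interpreter can resolve the module."""
    return importlib.util.find_spec(module_name) is not None


print(sys.executable)        # should point inside python-backend/venv
print(can_import("openai"))  # False here means pip installed elsewhere
```

Running this with the same interpreter that launches uvicorn separates "venv not activated" from "dependency never installed".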
amusingimpala75 (Owner) requested changes on Apr 21, 2026:
This file should not be in the repository; please add `__pycache__` to the `.gitignore`.
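A `.gitignore` addition along these lines would keep these files out (a sketch; the `venv/` entry assumes the virtual environment lives inside the repo as the README describes):

```
# Python bytecode caches and the local virtual environment
__pycache__/
*.py[cod]
venv/
```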
hudsonhadley (Collaborator) requested changes on Apr 22, 2026 and left a comment:
I get the following error:
```
Traceback (most recent call last):
  File "C:\Users\HADLEYHR23\AppData\Local\Programs\Python\Python312\Lib\multiprocessing\process.py", line 314, in _bootstrap
    self.run()
  File "C:\Users\HADLEYHR23\AppData\Local\Programs\Python\Python312\Lib\multiprocessing\process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\uvicorn\_subprocess.py", line 80, in subprocess_started
    target(sockets=sockets)
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\uvicorn\server.py", line 75, in run
    return asyncio_run(self.serve(sockets=sockets), loop_factory=self.config.get_loop_factory())
  File "C:\Users\HADLEYHR23\AppData\Local\Programs\Python\Python312\Lib\asyncio\runners.py", line 194, in run
    return runner.run(main)
  File "C:\Users\HADLEYHR23\AppData\Local\Programs\Python\Python312\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "C:\Users\HADLEYHR23\AppData\Local\Programs\Python\Python312\Lib\asyncio\base_events.py", line 684, in run_until_complete
    return future.result()
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\uvicorn\server.py", line 79, in serve
    await self._serve(sockets)
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\uvicorn\server.py", line 86, in _serve
    config.load()
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\uvicorn\config.py", line 441, in load
    self.loaded_app = import_from_string(self.app)
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\uvicorn\importer.py", line 19, in import_from_string
    module = importlib.import_module(module_str)
  File "C:\Users\HADLEYHR23\AppData\Local\Programs\Python\Python312\Lib\importlib\__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 994, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\main.py", line 14, in <module>
    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\openai\_client.py", line 122, in __init__
    super().__init__(
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\openai\_base_client.py", line 825, in __init__
    self._client = http_client or SyncHttpxClientWrapper(
  File "C:\Users\HADLEYHR23\comp-350\COMP350FRFR\python-backend\venv\Lib\site-packages\openai\_base_client.py", line 723, in __init__
    super().__init__(**kwargs)
TypeError: Client.__init__() got an unexpected keyword argument 'proxies'
```
An AI assistant said that the installed `openai` and `httpx` versions clash and recommended using an earlier version of `openai`. Does it work on your end?
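For context: `httpx` 0.28.0 removed the deprecated `proxies` argument, and older `openai` client versions still pass it to the httpx client, which produces exactly this `TypeError`. One commonly reported fix is pinning compatible versions in the dependency file (a sketch; the existence of `requirements.txt` and the exact pins are assumptions, so check the installed versions with `pip show openai httpx` first):

```
openai>=1.55.3   # releases from around here stopped passing `proxies`
# or, alternatively, hold httpx back below the release that removed it:
# httpx<0.28
```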
Added an AI chatbot feature using FastAPI and OpenAI: a Python backend connects to OpenAI's API to process user queries. The README has been updated with the additional steps required to run the new backend.