CloudLLM-ai/openai-rust
# openai-rust2

This is an unofficial library to interact with the OpenAI API. The goal of this crate is to support the entire API while matching the official documentation as closely as possible.

Current features:

## Example usage

```rust
// Here we use the chat completion endpoint, connecting to OpenAI's default base URL.
use openai_rust2 as openai_rust; // since this is a fork of openai_rust

#[tokio::main]
async fn main() {
    let client = openai_rust::Client::new(&std::env::var("OPENAI_API_KEY").unwrap());
    let args = openai_rust::chat::ChatArguments::new("gpt-3.5-turbo", vec![
        openai_rust::chat::Message {
            role: "user".to_owned(),
            content: "Hello GPT!".to_owned(),
        }
    ]);
    let res = client.create_chat(args).await.unwrap();
    println!("{}", res);
}
```

Here is another example connecting to a local LLM server (Ollama's default base URL):

```rust
use openai_rust2 as openai_rust; // since this is a fork of openai_rust
let client = openai_rust::Client::new_with_base_url(
    "", // no need for an API key when connecting to a default Ollama instance locally
    "http://localhost:11434",
);
```

You can run this code as an example with `OPENAI_API_KEY=(your key) cargo run --example chat`.

Check out the examples directory for more usage examples. You can find documentation on docs.rs.

## Projects using openai-rust
