```rust
// Assumes an async runtime (tokio here) since `create` is awaited.
use openai_responses::{Client, Request, types::{Input, Model}};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let response = Client::from_env()?.create(Request {
        model: Model::GPT4o,
        input: Input::Text("Are semicolons optional in JavaScript?".to_string()),
        instructions: Some("You are a coding assistant that talks like a pirate".to_string()),
        ..Default::default()
    }).await?;
    println!("{}", response.output_text());
    Ok(())
}
```
A Rust SDK for the OpenAI Responses API
OpenAI released a new API today that unifies all of their models under a single endpoint. They released Python and JS SDKs, but I wanted one for Rust, so I spent some time knocking one out.
I'm particularly happy with the way I managed to "Rust-ify" the types to make them feel more idiomatic. Will be building some cool stuff with it soon!
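As a rough illustration of what "Rust-ifying" an API's stringly-typed fields can look like (this is a hypothetical sketch, not the crate's actual definitions), a model identifier can be modeled as an enum with an escape hatch for unknown values, instead of passing raw strings around:

```rust
// Hypothetical sketch: wrapping the API's string "model" field in an enum.
// `Other` keeps the type forward-compatible with model names the SDK
// doesn't know about yet.
#[derive(Debug, PartialEq)]
enum Model {
    GPT4o,
    Other(String),
}

impl Model {
    // Convert back to the wire-format string the API expects.
    fn as_str(&self) -> &str {
        match self {
            Model::GPT4o => "gpt-4o",
            Model::Other(s) => s,
        }
    }
}

fn main() {
    // Known variants serialize to their canonical API names...
    println!("{}", Model::GPT4o.as_str());
    // ...while unrecognized models still round-trip cleanly.
    println!("{}", Model::Other("gpt-next".to_string()).as_str());
}
```

The payoff is that typos in model names become compile errors for the common cases, while the `Other` variant avoids the enum becoming a breaking-change hazard every time a new model ships.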