Commit 73cc980: Merge pull request alexrudall#197 from alexrudall/chat
(2 parents: 28d0397 + 2fc47a0)

Add ChatGPT endpoint

File tree: 8 files changed (+179, -4 lines)

CHANGELOG.md

Lines changed: 6 additions & 0 deletions

```diff
@@ -5,6 +5,12 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [3.4.0] - 2023-03-01
+
+### Added
+
+- Add Client#chat endpoint - ChatGPT over the wire!
+
 ## [3.3.0] - 2023-02-15
 
 ### Changed
```

Gemfile.lock

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,7 +1,7 @@
 PATH
   remote: .
   specs:
-    ruby-openai (3.3.0)
+    ruby-openai (3.4.0)
       httparty (>= 0.18.1)
 
 GEM
```

README.md

Lines changed: 17 additions & 2 deletions

````diff
@@ -7,7 +7,7 @@
 
 Use the [OpenAI API](https://openai.com/blog/openai-api/) with Ruby! 🤖❤️
 
-Generate text with GPT-3, create images with DALL·E, or write code with Codex...
+Generate text with ChatGPT, create images with DALL·E, or write code with Codex...
 
 ## Installation
 
@@ -89,9 +89,24 @@ There are different models that can be used to generate text. For a full list an
 - code-davinci-002
 - code-cushman-001
 
+### ChatGPT
+
+ChatGPT is a model that can be used to generate text in a conversational style. You can use it to generate a response to a sequence of [messages](https://platform.openai.com/docs/guides/chat/introduction):
+
+```ruby
+response = client.chat(
+    parameters: {
+        model: "gpt-3.5-turbo",
+        messages: [{ role: "user", content: "Hello!"}],
+    })
+puts response.dig("choices", 0, "message", "content")
+=> "Hello! How may I assist you today?"
+
+```
+
 ### Completions
 
-Hit the OpenAI API for a completion:
+Hit the OpenAI API for a completion using other GPT-3 models:
 
 ```ruby
 response = client.completions(
````
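The README snippet above sends a single user message, but the `messages` parameter is an ordered conversation history: multi-turn chat works by appending each reply before the next request. A minimal sketch in plain Ruby (no API call is made; the assistant reply here is made up for illustration):

```ruby
# Build a running conversation for the chat endpoint.
# Each turn is a hash with a role ("system", "user" or "assistant") and content.
messages = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello!" }
]

# After a (hypothetical) API call, append the model's reply so the next
# request carries the full history, then add the user's next turn.
assistant_reply = "Hello! How may I assist you today?"
messages << { role: "assistant", content: assistant_reply }
messages << { role: "user", content: "What can you do?" }

# The next client.chat call would receive this four-message history
# as parameters: { model: ..., messages: messages }.
```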

lib/openai/client.rb

Lines changed: 4 additions & 0 deletions

```diff
@@ -7,6 +7,10 @@ def initialize(access_token: nil, organization_id: nil)
     OpenAI.configuration.organization_id = organization_id if organization_id
   end
 
+  def chat(parameters: {})
+    OpenAI::Client.json_post(path: "/chat/completions", parameters: parameters)
+  end
+
   def completions(parameters: {})
     OpenAI::Client.json_post(path: "/completions", parameters: parameters)
   end
```
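The new `#chat` method is a thin wrapper: it forwards whatever hash it receives as the JSON body of a POST to `/chat/completions`, exactly mirroring `#completions`. A simplified sketch of that plumbing, with a hypothetical `fake_json_post` standing in for the gem's real `json_post` (which performs the HTTP request and adds auth headers):

```ruby
require "json"

# Hypothetical stand-in for OpenAI::Client.json_post: no network I/O,
# it just returns the path and JSON body the real helper would send.
def fake_json_post(path:, parameters:)
  { path: path, body: parameters.to_json }
end

# Same shape as the new Client#chat: delegate everything to the POST helper.
def chat(parameters: {})
  fake_json_post(path: "/chat/completions", parameters: parameters)
end

request = chat(parameters: { model: "gpt-3.5-turbo",
                             messages: [{ role: "user", content: "Hello!" }] })
puts request[:path]  # prints: /chat/completions
```

The design choice is notable: because the parameters hash is passed through untouched, new API options (temperature, n, etc.) work without any gem changes.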

lib/openai/version.rb

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,3 +1,3 @@
 module OpenAI
-  VERSION = "3.3.0".freeze
+  VERSION = "3.4.0".freeze
 end
```

spec/fixtures/cassettes/gpt-3_5-turbo-0301_chat.yml

Lines changed: 56 additions & 0 deletions (generated VCR cassette, not rendered)

spec/fixtures/cassettes/gpt-3_5-turbo_chat.yml

Lines changed: 56 additions & 0 deletions (generated VCR cassette, not rendered)

spec/openai/client/chat_spec.rb

Lines changed: 38 additions & 0 deletions

```diff
@@ -0,0 +1,38 @@
+RSpec.describe OpenAI::Client do
+  describe "#chat" do
+    context "with messages", :vcr do
+      let(:messages) { [{ role: "user", content: "Hello!" }] }
+
+      let(:response) do
+        OpenAI::Client.new.chat(
+          parameters: {
+            model: model,
+            messages: messages
+          }
+        )
+      end
+      let(:content) { JSON.parse(response.body).dig("choices", 0, "message", "content") }
+      let(:cassette) { "#{model} chat".downcase }
+
+      context "with model: gpt-3.5-turbo" do
+        let(:model) { "gpt-3.5-turbo" }
+
+        it "succeeds" do
+          VCR.use_cassette(cassette) do
+            expect(content.split.empty?).to eq(false)
+          end
+        end
+      end
+
+      context "with model: gpt-3.5-turbo-0301" do
+        let(:model) { "gpt-3.5-turbo-0301" }
+
+        it "succeeds" do
+          VCR.use_cassette(cassette) do
+            expect(content.split.empty?).to eq(false)
+          end
+        end
+      end
+    end
+  end
+end
```
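The spec's `content` helper digs the assistant text out of the raw HTTP body with `JSON.parse(response.body).dig("choices", 0, "message", "content")`. Against a sample response body (shape per the chat/completions API; the values here are illustrative), that same dig works like this:

```ruby
require "json"

# Sample chat/completions response body; values are made up for illustration.
body = <<~JSON
  {
    "choices": [
      {
        "index": 0,
        "message": { "role": "assistant", "content": "Hello! How may I assist you today?" },
        "finish_reason": "stop"
      }
    ]
  }
JSON

# Same extraction the spec performs: first choice, then its message content.
content = JSON.parse(body).dig("choices", 0, "message", "content")
puts content  # prints: Hello! How may I assist you today?
```

Using `dig` rather than chained `[]` means a missing key at any level yields `nil` instead of raising, which keeps the spec's non-empty assertion (`content.split.empty?`) meaningful as a failure signal.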
