Commit 5683eb9

Merge pull request alexrudall#234 from alexrudall/faraday
Add streaming with Faraday
2 parents: 9f60b79 + ef1e2bc

File tree: 60 files changed (+64536 / -1422 lines)

.rubocop.yml

Lines changed: 3 additions & 0 deletions

@@ -12,6 +12,9 @@ Layout/LineLength:
   Exclude:
     - "**/*.gemspec"
 
+Metrics/AbcSize:
+  Max: 20
+
 Metrics/BlockLength:
   Exclude:
     - "spec/**/*"

CHANGELOG.md

Lines changed: 12 additions & 0 deletions

@@ -5,6 +5,18 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [4.0.0] - 2023-04-25
+
+### Added
+
+- Add the ability to stream Chat responses from the API! Thanks to everyone who requested this and made suggestions.
+- Add instructions for streaming to the README.
+
+### Changed
+
+- Switch the HTTP library from HTTParty to Faraday to allow streaming and future feature and performance improvements.
+- [BREAKING] Endpoints now return parsed JSON rather than HTTParty objects. To upgrade, change `JSON.parse(response.body)["key"]` and `response.parsed_response["key"]` to just `response["key"]`.
+
 ## [3.7.0] - 2023-03-25
 
 ### Added
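The [BREAKING] change above is a one-line migration in most codebases. A hypothetical sketch (the `response` Hash below is a stand-in for what an endpoint now returns, not a real API payload):

```ruby
# Before 4.0.0, endpoints returned HTTParty objects, so callers wrote
# JSON.parse(response.body)["id"] or response.parsed_response["id"].
# From 4.0.0 the response is already a parsed Hash; index it directly.
# This `response` value is illustrative.
response = { "id" => "ft-123", "fine_tuned_model" => "ada:ft-personal" }

id = response["id"]
puts id # prints "ft-123"
```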

Gemfile.lock

Lines changed: 11 additions & 7 deletions

@@ -1,8 +1,9 @@
 PATH
   remote: .
   specs:
-    ruby-openai (3.7.0)
-      httparty (>= 0.18.1)
+    ruby-openai (4.0.0)
+      faraday (>= 1)
+      faraday-multipart (>= 1)
 
 GEM
   remote: https://rubygems.org/
@@ -15,13 +16,15 @@ GEM
       rexml
     diff-lcs (1.5.0)
     dotenv (2.8.1)
+    faraday (2.7.4)
+      faraday-net_http (>= 2.0, < 3.1)
+      ruby2_keywords (>= 0.0.4)
+    faraday-multipart (1.0.4)
+      multipart-post (~> 2)
+    faraday-net_http (3.0.2)
     hashdiff (1.0.1)
-    httparty (0.21.0)
-      mini_mime (>= 1.0.0)
-      multi_xml (>= 0.5.2)
     json (2.6.3)
-    mini_mime (1.1.2)
-    multi_xml (0.6.0)
+    multipart-post (2.3.0)
     parallel (1.22.1)
     parser (3.2.2.0)
       ast (~> 2.4.1)
@@ -56,6 +59,7 @@ GEM
     rubocop-ast (1.28.0)
       parser (>= 3.2.1.0)
     ruby-progressbar (1.13.0)
+    ruby2_keywords (0.0.5)
     unicode-display_width (2.4.2)
     vcr (6.1.0)
     webmock (3.18.1)

README.md

Lines changed: 32 additions & 19 deletions

@@ -6,7 +6,7 @@
 
 Use the [OpenAI API](https://openai.com/blog/openai-api/) with Ruby! 🤖❤️
 
-Generate text with ChatGPT, transcribe and translate audio with Whisper, or create images with DALL·E...
+Stream text with GPT-4, transcribe and translate audio with Whisper, or create images with DALL·E...
 
 [Ruby AI Builders Discord](https://discord.gg/k4Uc224xVD)
 
@@ -34,10 +34,6 @@ and require with:
 require "openai"
 ```
 
-## Upgrading
-
-The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`.
-
 ## Usage
 
 - Get your API key from [https://beta.openai.com/account/api-keys](https://beta.openai.com/account/api-keys)
@@ -57,8 +53,8 @@ For a more robust setup, you can configure the gem with your API keys, for examp
 
 ```ruby
 OpenAI.configure do |config|
-  config.access_token = ENV.fetch('OPENAI_ACCESS_TOKEN')
-  config.organization_id = ENV.fetch('OPENAI_ORGANIZATION_ID') # Optional.
+  config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
+  config.organization_id = ENV.fetch("OPENAI_ORGANIZATION_ID") # Optional.
 end
 ```
 
@@ -70,7 +66,7 @@ client = OpenAI::Client.new
 
 #### Custom timeout or base URI
 
-The default timeout for any OpenAI request is 120 seconds. You can change that passing the `request_timeout` when initializing the client. You can also change the base URI used for all requests, eg. to use observability tools like [Helicone](https://docs.helicone.ai/quickstart/integrate-in-one-line-of-code):
+The default timeout for any request using this library is 120 seconds. You can change that by passing a number of seconds to `request_timeout` when initializing the client. You can also change the base URI used for all requests, e.g. to use observability tools like [Helicone](https://docs.helicone.ai/quickstart/integrate-in-one-line-of-code):
 
 ```ruby
 client = OpenAI::Client.new(
@@ -130,6 +126,23 @@ puts response.dig("choices", 0, "message", "content")
 # => "Hello! How may I assist you today?"
 ```
 
+### Streaming ChatGPT
+
+You can stream from the API in realtime, which can be much faster and used to create a more engaging user experience. Pass a [Proc](https://ruby-doc.org/core-2.6/Proc.html) to the `stream` parameter to receive the stream of text chunks as they are generated. The Proc is called once for each chunk received, with the chunk parsed as a Hash. If OpenAI returns an error, `ruby-openai` will pass that to your Proc as a Hash.
+
+```ruby
+client.chat(
+    parameters: {
+        model: "gpt-3.5-turbo", # Required.
+        messages: [{ role: "user", content: "Describe a character called Anna!" }], # Required.
+        temperature: 0.7,
+        stream: proc do |chunk, _bytesize|
+            print chunk.dig("choices", 0, "delta", "content")
+        end
+    })
+# => "Anna is a young woman in her mid-twenties, with wavy chestnut hair that falls to her shoulders..."
+```
+
 ### Completions
 
 Hit the OpenAI API for a completion using other GPT-3 models:
@@ -188,9 +201,9 @@ and pass the path to `client.files.upload` to upload it to OpenAI, and then inte
 ```ruby
 client.files.upload(parameters: { file: "path/to/sentiment.jsonl", purpose: "fine-tune" })
 client.files.list
-client.files.retrieve(id: 123)
-client.files.content(id: 123)
-client.files.delete(id: 123)
+client.files.retrieve(id: "file-123")
+client.files.content(id: "file-123")
+client.files.delete(id: "file-123")
 ```
 
 ### Fine-tunes
@@ -208,9 +221,9 @@ You can then use this file ID to create a fine-tune model:
 response = client.finetunes.create(
     parameters: {
         training_file: file_id,
-        model: "text-ada-001"
+        model: "ada"
     })
-fine_tune_id = JSON.parse(response.body)["id"]
+fine_tune_id = response["id"]
 ```
 
 That will give you the fine-tune ID. If you made a mistake you can cancel the fine-tune model before it is processed:
@@ -224,7 +237,7 @@ You may need to wait a short time for processing to complete. Once processed, yo
 ```ruby
 client.finetunes.list
 response = client.finetunes.retrieve(id: fine_tune_id)
-fine_tuned_model = JSON.parse(response.body)["fine_tuned_model"]
+fine_tuned_model = response["fine_tuned_model"]
 ```
 
 This fine-tuned model name can then be used in completions:
@@ -236,7 +249,7 @@ response = client.completions(
     prompt: "I love Mondays!"
   }
 )
-JSON.parse(response.body)["choices"].map { |c| c["text"] }
+response.dig("choices", 0, "text")
 ```
 
 You can delete the fine-tuned model when you are done with it:
@@ -305,9 +318,9 @@ The translations API takes as input the audio file in any of the supported langu
 response = client.translate(
     parameters: {
         model: "whisper-1",
-        file: File.open('path_to_file', 'rb'),
+        file: File.open("path_to_file", "rb"),
     })
-puts response.parsed_response['text']
+puts response["text"]
 # => "Translation of the text"
 ```
 
@@ -319,9 +332,9 @@ The transcriptions API takes as input the audio file you want to transcribe and
 response = client.transcribe(
     parameters: {
         model: "whisper-1",
-        file: File.open('path_to_file', 'rb'),
+        file: File.open("path_to_file", "rb"),
     })
-puts response.parsed_response['text']
+puts response["text"]
 # => "Transcription of the text"
 ```
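The streaming section above notes that API errors are also passed to your Proc as a Hash. A minimal sketch of a handler that distinguishes data chunks from error chunks (the `handle_chunk` helper is illustrative, not part of the gem):

```ruby
# A data chunk carries the next text fragment under "choices"; an error
# chunk carries a message under "error". Both arrive as parsed Hashes.
def handle_chunk(chunk)
  if chunk["error"]
    "ERROR: #{chunk.dig("error", "message")}"
  else
    chunk.dig("choices", 0, "delta", "content").to_s
  end
end

data_chunk  = { "choices" => [{ "delta" => { "content" => "Anna" } }] }
error_chunk = { "error" => { "message" => "Rate limit exceeded" } }

puts handle_chunk(data_chunk)  # prints "Anna"
puts handle_chunk(error_chunk) # prints "ERROR: Rate limit exceeded"
```

A handler like this could be wrapped in a Proc and passed as the `stream:` parameter shown above.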

lib/openai.rb

Lines changed: 3 additions & 1 deletion

@@ -1,5 +1,7 @@
-require "httparty"
+require "faraday"
+require "faraday/multipart"
 
+require_relative "openai/http"
 require_relative "openai/client"
 require_relative "openai/files"
 require_relative "openai/finetunes"

lib/openai/client.rb

Lines changed: 2 additions & 50 deletions

@@ -1,5 +1,7 @@
 module OpenAI
   class Client
+    extend OpenAI::HTTP
+
     def initialize(access_token: nil, organization_id: nil, uri_base: nil, request_timeout: nil)
       OpenAI.configuration.access_token = access_token if access_token
       OpenAI.configuration.organization_id = organization_id if organization_id
@@ -50,55 +52,5 @@ def transcribe(parameters: {})
     def translate(parameters: {})
       OpenAI::Client.multipart_post(path: "/audio/translations", parameters: parameters)
     end
-
-    def self.get(path:)
-      HTTParty.get(
-        uri(path: path),
-        headers: headers,
-        timeout: request_timeout
-      )
-    end
-
-    def self.json_post(path:, parameters:)
-      HTTParty.post(
-        uri(path: path),
-        headers: headers,
-        body: parameters&.to_json,
-        timeout: request_timeout
-      )
-    end
-
-    def self.multipart_post(path:, parameters: nil)
-      HTTParty.post(
-        uri(path: path),
-        headers: headers.merge({ "Content-Type" => "multipart/form-data" }),
-        body: parameters,
-        timeout: request_timeout
-      )
-    end
-
-    def self.delete(path:)
-      HTTParty.delete(
-        uri(path: path),
-        headers: headers,
-        timeout: request_timeout
-      )
-    end
-
-    private_class_method def self.uri(path:)
-      OpenAI.configuration.uri_base + OpenAI.configuration.api_version + path
-    end
-
-    private_class_method def self.headers
-      {
-        "Content-Type" => "application/json",
-        "Authorization" => "Bearer #{OpenAI.configuration.access_token}",
-        "OpenAI-Organization" => OpenAI.configuration.organization_id
-      }
-    end
-
-    private_class_method def self.request_timeout
-      OpenAI.configuration.request_timeout
-    end
   end
 end
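The change above replaces the HTTParty class methods with `extend OpenAI::HTTP`, which mixes the module's instance methods in as class methods. A standalone sketch of that mechanism (the names here are illustrative, not from the gem):

```ruby
# `extend` adds a module's methods to the receiving object itself; on a
# class, that means they become class methods -- the same mechanism that
# makes OpenAI::Client.multipart_post available after `extend OpenAI::HTTP`.
module HTTPVerbs
  def get(path:)
    "GET #{path}"
  end
end

class DemoClient
  extend HTTPVerbs
end

puts DemoClient.get(path: "/v1/models") # prints "GET /v1/models"
```

This keeps the HTTP plumbing in one module while the client class stays focused on endpoint definitions.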

lib/openai/http.rb

Lines changed: 93 additions & 0 deletions

@@ -0,0 +1,93 @@
+module OpenAI
+  module HTTP
+    def get(path:)
+      to_json(conn.get(uri(path: path)) do |req|
+        req.headers = headers
+      end&.body)
+    end
+
+    def json_post(path:, parameters:)
+      to_json(conn.post(uri(path: path)) do |req|
+        if parameters[:stream].is_a?(Proc)
+          req.options.on_data = to_json_stream(user_proc: parameters[:stream])
+          parameters[:stream] = true # Necessary to tell OpenAI to stream.
+        end
+
+        req.headers = headers
+        req.body = parameters.to_json
+      end&.body)
+    end
+
+    def multipart_post(path:, parameters: nil)
+      to_json(conn(multipart: true).post(uri(path: path)) do |req|
+        req.headers = headers.merge({ "Content-Type" => "multipart/form-data" })
+        req.body = multipart_parameters(parameters)
+      end&.body)
+    end
+
+    def delete(path:)
+      to_json(conn.delete(uri(path: path)) do |req|
+        req.headers = headers
+      end&.body)
+    end
+
+    private
+
+    def to_json(string)
+      return unless string
+
+      JSON.parse(string)
+    rescue JSON::ParserError
+      # Convert a multiline string of JSON objects to a JSON array.
+      JSON.parse(string.gsub("}\n{", "},{").prepend("[").concat("]"))
+    end
+
+    # Given a proc, returns an outer proc that can be used to iterate over a JSON stream of chunks.
+    # For each chunk, the inner user_proc is called giving it the JSON object. The JSON object could
+    # be a data object or an error object as described in the OpenAI API documentation.
+    #
+    # If the JSON object for a given data or error message is invalid, it is ignored.
+    #
+    # @param user_proc [Proc] The inner proc to call for each JSON object in the chunk.
+    # @return [Proc] An outer proc that iterates over a raw stream, converting it to JSON.
+    def to_json_stream(user_proc:)
+      proc do |chunk, _|
+        chunk.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|
+          user_proc.call(JSON.parse(data))
+        rescue JSON::ParserError
+          # Ignore invalid JSON.
+        end
+      end
+    end
+
+    def conn(multipart: false)
+      Faraday.new do |f|
+        f.options[:timeout] = OpenAI.configuration.request_timeout
+        f.request(:multipart) if multipart
+      end
+    end
+
+    def uri(path:)
+      OpenAI.configuration.uri_base + OpenAI.configuration.api_version + path
+    end
+
+    def headers
+      {
+        "Content-Type" => "application/json",
+        "Authorization" => "Bearer #{OpenAI.configuration.access_token}",
+        "OpenAI-Organization" => OpenAI.configuration.organization_id
+      }
+    end
+
+    def multipart_parameters(parameters)
+      parameters&.transform_values do |value|
+        next value unless value.is_a?(File)
+
+        # Doesn't seem like OpenAI needs mime_type yet, so it's not worth
+        # having the library figure this out. Hence the empty string as
+        # the second argument.
+        Faraday::UploadIO.new(value, "", value.path)
+      end
+    end
+  end
+end
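The chunk handling in `to_json_stream` above can be exercised on its own. This sketch (the `extract_events` name is illustrative) applies the same regex and rescue to a sample raw chunk:

```ruby
require "json"

# Each raw chunk holds zero or more "data: {...}" (or "error: {...}") lines.
# The regex pulls out the JSON payloads; unparseable payloads are skipped,
# and non-JSON sentinels like "data: [DONE]" never match in the first place.
def extract_events(chunk)
  chunk.scan(/(?:data|error): (\{.*\})/i).flatten.filter_map do |data|
    JSON.parse(data)
  rescue JSON::ParserError
    nil # Ignore invalid JSON, as the library does.
  end
end

raw = "data: {\"id\": \"1\"}\n\ndata: {\"id\": \"2\"}\n\ndata: [DONE]\n"
events = extract_events(raw)
puts events.length # prints 2
```

In the gem itself, each parsed Hash is handed straight to the user's Proc rather than collected into an array.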

lib/openai/version.rb

Lines changed: 1 addition & 1 deletion

@@ -1,3 +1,3 @@
 module OpenAI
-  VERSION = "3.7.0".freeze
+  VERSION = "4.0.0".freeze
 end

ruby-openai.gemspec

Lines changed: 2 additions & 3 deletions

@@ -25,7 +25,6 @@ Gem::Specification.new do |spec|
   spec.executables = spec.files.grep(%r{^exe/}) { |f| File.basename(f) }
   spec.require_paths = ["lib"]
 
-  spec.add_dependency "httparty", ">= 0.18.1"
-
-  spec.post_install_message = "Note if upgrading: The `::Ruby::OpenAI` module has been removed and all classes have been moved under the top level `::OpenAI` module. To upgrade, change `require 'ruby/openai'` to `require 'openai'` and change all references to `Ruby::OpenAI` to `OpenAI`."
+  spec.add_dependency "faraday", ">= 1"
+  spec.add_dependency "faraday-multipart", ">= 1"
 end

0 commit comments
