Class: OmniAI::OpenAI::Client

Inherits: Client < Object

Defined in: lib/omniai/openai/client.rb

Overview

An OpenAI client implementation. Usage:

w/ `api_key`:

client = OmniAI::OpenAI::Client.new(api_key: '...')

w/ ENV:

ENV['OPENAI_API_KEY'] = '...'
client = OmniAI::OpenAI::Client.new

w/ config:

OmniAI::OpenAI.configure do |config|
  config.api_key = '...'
end

client = OmniAI::OpenAI::Client.new

Constant Summary

VERSION =
'v1'

Instance Method Summary

Constructor Details

#initialize(api_key: OmniAI::OpenAI.config.api_key, host: OmniAI::OpenAI.config.host, organization: OmniAI::OpenAI.config.organization, project: OmniAI::OpenAI.config.project, logger: OmniAI::OpenAI.config.logger, timeout: OmniAI::OpenAI.config.timeout) ⇒ Client

Returns a new instance of Client.

Parameters:

  • api_key (String, nil) (defaults to: OmniAI::OpenAI.config.api_key)

    optional - defaults to `OmniAI::OpenAI.config.api_key`

  • host (String) (defaults to: OmniAI::OpenAI.config.host)

    optional - defaults to `OmniAI::OpenAI.config.host`

  • project (String, nil) (defaults to: OmniAI::OpenAI.config.project)

    optional - defaults to `OmniAI::OpenAI.config.project`

  • organization (String, nil) (defaults to: OmniAI::OpenAI.config.organization)

    optional - defaults to `OmniAI::OpenAI.config.organization`

  • logger (Logger, nil) (defaults to: OmniAI::OpenAI.config.logger)

    optional - defaults to `OmniAI::OpenAI.config.logger`

  • timeout (Integer, nil) (defaults to: OmniAI::OpenAI.config.timeout)

    optional - defaults to `OmniAI::OpenAI.config.timeout`



# File 'lib/omniai/openai/client.rb', line 31

def initialize(
  api_key: OmniAI::OpenAI.config.api_key,
  host: OmniAI::OpenAI.config.host,
  organization: OmniAI::OpenAI.config.organization,
  project: OmniAI::OpenAI.config.project,
  logger: OmniAI::OpenAI.config.logger,
  timeout: OmniAI::OpenAI.config.timeout
)
  if api_key.nil? && host.eql?(Config::DEFAULT_HOST)
    raise(
      ArgumentError,
      %(ENV['OPENAI_API_KEY'] must be defined or `api_key` must be passed when using #{Config::DEFAULT_HOST})
    )
  end

  super(api_key:, host:, logger:, timeout:)

  @organization = organization
  @project = project
end
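
A minimal sketch of the guard above: the api_key check only applies when the default OpenAI host is used, so a client pointed at an OpenAI-compatible server (the URL below is illustrative) may omit the key.

client = OmniAI::OpenAI::Client.new(
  host: 'http://localhost:8080', # any OpenAI-compatible endpoint
  api_key: nil                   # only required for the default OpenAI host
)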

Instance Method Details

#assistants ⇒ OmniAI::OpenAI::Assistants



# File 'lib/omniai/openai/client.rb', line 129

def assistants
  Assistants.new(client: self)
end

#chat(messages = nil, model: Chat::DEFAULT_MODEL, temperature: nil, format: nil, stream: nil, tools: nil) {|prompt| ... } ⇒ OmniAI::Chat::Completion

Parameters:

  • messages (String) (defaults to: nil)

    optional

  • model (String) (defaults to: Chat::DEFAULT_MODEL)

    optional

  • format (Symbol) (defaults to: nil)

    optional :text or :json

  • temperature (Float, nil) (defaults to: nil)

    optional

  • stream (Proc, nil) (defaults to: nil)

    optional

  • tools (Array<OmniAI::Tool>, nil) (defaults to: nil)

    optional

Yields:

  • (prompt)

Yield Parameters:

  • prompt (OmniAI::Chat::Prompt)

Returns:

  • (OmniAI::Chat::Completion)

Raises:

  • (OmniAI::Error)


# File 'lib/omniai/openai/client.rb', line 76

def chat(messages = nil, model: Chat::DEFAULT_MODEL, temperature: nil, format: nil, stream: nil, tools: nil, &)
  Chat.process!(messages, model:, temperature:, format:, stream:, tools:, client: self, &)
end
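
A minimal usage sketch (the prompt text is illustrative; the `text` accessor and the prompt helpers follow OmniAI's chat interface and should be treated as assumptions here):

completion = client.chat('Tell me a joke.', temperature: 0.7)
completion.text

completion = client.chat do |prompt|
  prompt.system('You are a helpful assistant.')
  prompt.user('What is the capital of France?')
end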

#connection ⇒ HTTP::Client

Returns:

  • (HTTP::Client)


53
54
55
56
57
58
59
60
61
# File 'lib/omniai/openai/client.rb', line 53

def connection
  @connection ||= begin
    http = super
    http = http.auth("Bearer #{@api_key}") if @api_key
    http = http.headers('OpenAI-Organization': @organization) if @organization
    http = http.headers('OpenAI-Project': @project) if @project
    http
  end
end

#embed(input, model: Embed::DEFAULT_MODEL) ⇒ Object

Parameters:

  • input (String, Array<String>, Array<Integer>)

    required

  • model (String) (defaults to: Embed::DEFAULT_MODEL)

    optional

Raises:

  • (OmniAI::Error)


# File 'lib/omniai/openai/client.rb', line 84

def embed(input, model: Embed::DEFAULT_MODEL)
  Embed.process!(input, model:, client: self)
end
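
A short sketch (the input is illustrative; the `embedding` accessor follows OmniAI's embed interface and is an assumption here):

response = client.embed('The quick brown fox jumps over the lazy dog.')
response.embedding # => [0.0123, -0.0456, ...]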

#files ⇒ OmniAI::OpenAI::Files



# File 'lib/omniai/openai/client.rb', line 124

def files
  Files.new(client: self)
end

#speak(input, model: Speak::Model::TTS_1_HD, voice: Speak::Voice::ALLOY, speed: nil, format: nil) {|output| ... } ⇒ Tempfile

Parameters:

  • input (String)

    required

  • model (String) (defaults to: Speak::Model::TTS_1_HD)

    optional

  • voice (String) (defaults to: Speak::Voice::ALLOY)

    optional

  • speed (Float) (defaults to: nil)

    optional

  • format (String) (defaults to: nil)

    optional (default “aac”):

    • “aac”

    • “mp3”

    • “flac”

    • “opus”

    • “pcm”

    • “wav”

Yields:

  • (output)

    optional

Returns:

  • (Tempfile)

Raises:

  • (OmniAI::Error)


# File 'lib/omniai/openai/client.rb', line 119

def speak(input, model: Speak::Model::TTS_1_HD, voice: Speak::Voice::ALLOY, speed: nil, format: nil, &)
  Speak.process!(input, model:, voice:, speed:, format:, client: self, &)
end
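
A minimal sketch (the output filename is illustrative; "aac" is the default format per the parameter list above):

tempfile = client.speak('Sally sells seashells by the seashore.')
File.binwrite('speech.aac', tempfile.read)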

#threads ⇒ OmniAI::OpenAI::Threads



# File 'lib/omniai/openai/client.rb', line 134

def threads
  Threads.new(client: self)
end

#transcribe(path, model: Transcribe::Model::WHISPER, language: nil, prompt: nil, temperature: nil, format: nil) ⇒ OmniAI::Transcribe

Parameters:

  • path (String)
  • model (String) (defaults to: Transcribe::Model::WHISPER)
  • language (String, nil) (defaults to: nil)

    optional

  • prompt (String, nil) (defaults to: nil)

    optional

  • temperature (Float, nil) (defaults to: nil)

    optional

  • format (Symbol) (defaults to: nil)

    :text, :srt, :vtt, or :json (default)

Returns:

  • (OmniAI::Transcribe)

Raises:

  • (OmniAI::Error)


# File 'lib/omniai/openai/client.rb', line 98

def transcribe(path, model: Transcribe::Model::WHISPER, language: nil, prompt: nil, temperature: nil, format: nil)
  Transcribe.process!(path, model:, language:, prompt:, temperature:, format:, client: self)
end
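
A short sketch (the path is illustrative; the `text` accessor follows OmniAI's transcribe interface and is an assumption here):

transcription = client.transcribe('path/to/audio.wav', format: :text)
transcription.text # => "..."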