OmniAI::DeepSeek
A DeepSeek implementation of the OmniAI interface.
Installation
gem install omniai-deepseek
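Alternatively, with Bundler, add the gem to your Gemfile and run bundle install:

```ruby
# Gemfile
gem 'omniai-deepseek'
```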
Usage
Client
A client is set up as follows if ENV['DEEPSEEK_API_KEY']
exists:
client = OmniAI::DeepSeek::Client.new
A client may also be passed the following options:
api_key (required - default is ENV['DEEPSEEK_API_KEY'])
host (optional)
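For example, both options may be passed explicitly when constructing a client (the values below are placeholders, not real credentials):

```ruby
require 'omniai/deepseek'

# Client configuration sketch: an explicit API key and a custom host
# (e.g. a proxy in front of the DeepSeek API).
client = OmniAI::DeepSeek::Client.new(
  api_key: 'sk-...',
  host: 'https://api.deepseek.com'
)
```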
Configuration
Global configuration is supported for the following options:
OmniAI::DeepSeek.configure do |config|
config.api_key = 'sk-...' # default: ENV['DEEPSEEK_API_KEY']
config.host = '...' # default: 'https://api.deepseek.com'
end
Chat
A chat completion is generated by passing in a simple text prompt:
completion = client.chat('Tell me a joke!')
completion.content # 'Why did the chicken cross the road? To get to the other side.'
A chat completion may also be generated by using a prompt builder:
completion = client.chat do |prompt|
prompt.system('You are an expert in geography.')
prompt.user('What is the capital of Canada?')
end
completion.content # 'The capital of Canada is Ottawa.'
Model
model
takes an optional string (default is 'deepseek-chat'
):
completion = client.chat('How fast is a cheetah?', model: OmniAI::DeepSeek::Chat::Model::REASONER)
completion.content # 'A cheetah can reach speeds over 100 km/h.'
Temperature
temperature
takes an optional float between 0.0
and 2.0
(default is 0.7
):
completion = client.chat('Pick a number between 1 and 5', temperature: 2.0)
completion.content # '3'
DeepSeek API Reference temperature
Stream
stream
takes an optional proc to stream responses in real-time chunks instead of waiting for the complete response:
stream = proc do |chunk|
print(chunk.content) # 'Better', 'three', 'hours', ...
end
client.chat('Be poetic.', stream:)
Format
format
takes an optional symbol (:json
) that sets the response_format
to json_object
:
completion = client.chat(format: :json) do |prompt|
prompt.system(OmniAI::Chat::JSON_PROMPT)
prompt.user('What is the name of the drummer for the Beatles?')
end
JSON.parse(completion.content) # { "name": "Ringo" }
DeepSeek API Reference response_format
When using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message.