steamship.agents.llms package#
Submodules#
steamship.agents.llms.openai module#
- class steamship.agents.llms.openai.ChatOpenAI(client, model_name: str = 'gpt-4-0613', *args, generator: PluginInstance)[source]#
ChatLLM that uses Steamship’s OpenAI plugin to generate chat completions.
- class steamship.agents.llms.openai.OpenAI(client, model_name: str = 'gpt-3.5-turbo', temperature: float = 0.4, *args, generator: PluginInstance)[source]#
Bases: LLM
LLM that uses Steamship’s OpenAI plugin to generate completions.
NOTE: By default, this LLM uses the gpt-3.5-turbo model. Valid model choices are gpt-3.5-turbo and gpt-4.
- complete(prompt: str, stop: str | None = None, **kwargs) → List[Block][source]#
Completes the prompt, respecting the supplied stop sequence.
Supported kwargs include:
- max_tokens (controls the size of LLM responses)
- generator: PluginInstance#
steamship.agents.llms.steamship_llm module#
- class steamship.agents.llms.steamship_llm.SteamshipLLM(plugin_instance: PluginInstance)[source]#
Bases: object
A class wrapping LLM plugins.
- generate(messages: List[Block], capabilities: List[Capability] = None, assert_capabilities: bool = True, **kwargs) → List[Block][source]#
Call the LLM plugin’s generate method. Requests for plugin capabilities are generated based on the input parameters.
- Parameters:
messages – Messages to be passed to the LLM to construct the prompt.
capabilities –
Capabilities that will be asked of the LLM. See the docstring for steamship.plugins.capabilities.
If default_capabilities was provided at construction, those capabilities will be requested unless overridden by this parameter.
assert_capabilities – If True (the default), raise a SteamshipError if the LLM plugin did not respond with a block asserting at what level the requested capabilities were fulfilled.
kwargs – Additional options passed through to the LLM model as extra parameters.
- Returns:
A List of Blocks returned from the plugin.
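As a hedged sketch of the assert_capabilities contract: the method is expected to fail loudly when the plugin's response contains no block asserting which capabilities were fulfilled. The block shape and the "kind" key below are hypothetical, and RuntimeError stands in for SteamshipError so the snippet stays self-contained without the SDK.

```python
# Hypothetical call (needs a real PluginInstance), shown as comments:
#
#   llm = SteamshipLLM(plugin_instance)
#   blocks = llm.generate(messages=history, assert_capabilities=True)


def ensure_capability_assertion(blocks: list[dict], assert_capabilities: bool = True) -> list[dict]:
    """Illustrative stand-in for the assert_capabilities check. A real
    response is a list of Block objects; plain dicts with a hypothetical
    "kind" key are used here so the sketch runs without the SDK."""
    has_assertion = any(b.get("kind") == "capability-assertion" for b in blocks)
    if assert_capabilities and not has_assertion:
        # The SDK raises SteamshipError; RuntimeError keeps this self-contained.
        raise RuntimeError("Plugin response did not assert fulfilled capabilities.")
    return blocks
```

Passing assert_capabilities=False skips the check and returns the blocks unchanged, mirroring the documented parameter.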
Module contents#
- class steamship.agents.llms.OpenAI(client, model_name: str = 'gpt-3.5-turbo', temperature: float = 0.4, *args, generator: PluginInstance)[source]#
Bases: LLM
LLM that uses Steamship’s OpenAI plugin to generate completions.
NOTE: By default, this LLM uses the gpt-3.5-turbo model. Valid model choices are gpt-3.5-turbo and gpt-4.
- complete(prompt: str, stop: str | None = None, **kwargs) → List[Block][source]#
Completes the prompt, respecting the supplied stop sequence.
Supported kwargs include:
- max_tokens (controls the size of LLM responses)
- generator: PluginInstance#