public class OpenAI extends AbstractLLM<LLMRespOpenAI>

Fields inherited from class AbstractLLM: llmConfig, streamCallBack

| Modifier and Type | Method and Description |
|---|---|
| protected LLMRespOpenAI | getLLMResponse(List<ConversationMessage> chatMessages, LLMParamConfig paramConfig): calls the chat API. |
| protected LLMRespOpenAI | getLLMResponseFromCache(String cache): retrieves a result from the cache. |

Methods inherited from class AbstractLLM: ask, ask, ask, ask, askForObject, askForObject, getExistPrompt, getLLMConfig, needAddNewSystemMessage, setCache, setStreamCallback

public OpenAI(LLMConfig llmConfig)
Parameters:
llmConfig - LLM parameter configuration
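
The constructor and the inherited ask overloads are typically used together, as in the sketch below. Since this page does not document LLMConfig's fields or the exact ask signatures, the LLMConfig setters and the String-based ask overload shown here are assumptions for illustration only.

```java
public class OpenAIUsageSketch {
    public static void main(String[] args) {
        // Assumed: LLMConfig has a no-arg constructor plus setters; this page only
        // documents that OpenAI takes an LLMConfig, not how LLMConfig is built.
        LLMConfig llmConfig = new LLMConfig();                // assumption
        llmConfig.setModelName("gpt-3.5-turbo");              // hypothetical setter
        llmConfig.setApiKey(System.getenv("OPENAI_API_KEY")); // hypothetical setter

        OpenAI openAI = new OpenAI(llmConfig);                // documented: OpenAI(LLMConfig)

        // Assumed: one of the inherited ask overloads accepts a plain String prompt
        // and returns the model's reply as a String.
        String answer = openAI.ask("Explain what this class does in one sentence.");
        System.out.println(answer);
    }
}
```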

protected LLMRespOpenAI getLLMResponse(List<ConversationMessage> chatMessages, LLMParamConfig paramConfig)
Calls the chat API.
Overrides: getLLMResponse in class AbstractLLM<LLMRespOpenAI>
Parameters:
chatMessages - LLM model input
paramConfig - model parameter configuration

protected LLMRespOpenAI getLLMResponseFromCache(String cache)
Retrieves a result from the cache.
Overrides: getLLMResponseFromCache in class AbstractLLM<LLMRespOpenAI>
Parameters:
cache - the cached value
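
Taken together with the inherited ask and setCache methods, these two protected hooks suggest a template-method design: the abstract base class decides whether to serve a cached result, while OpenAI supplies only the provider-specific call. The sketch below illustrates that presumed flow; it is not the library's actual AbstractLLM source, and the lookupCache helper and the ask signature shown are hypothetical.

```java
import java.util.List;

// Illustrative-only sketch of how AbstractLLM's public ask methods could delegate
// to the two protected hooks documented above. Not the library's actual code.
abstract class AbstractLLMSketch<T> {

    // Documented hook: performs the actual chat API call.
    protected abstract T getLLMResponse(List<ConversationMessage> chatMessages,
                                        LLMParamConfig paramConfig);

    // Documented hook: rebuilds a response object from a cached string.
    protected abstract T getLLMResponseFromCache(String cache);

    public T ask(List<ConversationMessage> chatMessages, LLMParamConfig paramConfig) {
        String cached = lookupCache(chatMessages);        // hypothetical cache lookup
        if (cached != null) {
            return getLLMResponseFromCache(cached);
        }
        return getLLMResponse(chatMessages, paramConfig);
    }

    private String lookupCache(List<ConversationMessage> chatMessages) {
        // Placeholder: in the real class the cache is presumably installed via setCache.
        return null;
    }
}
```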