pangukitsappdev.llms package

Subpackages

Submodules

pangukitsappdev.llms.gallery module

class pangukitsappdev.llms.gallery.GalleryChatLLM(*, name: Optional[str] = None, cache: Union[BaseCache, bool, None] = None, verbose: bool = None, callbacks: Callbacks = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, custom_get_token_ids: Optional[Callable[[str], List[int]]] = None, callback_manager: Optional[BaseCallbackManager] = None, temperature: Optional[float] = None, max_tokens: Optional[int] = None, top_p: Optional[float] = None, gallery_url: str, token_getter: IAMTokenProvider, streaming: Optional[bool] = None, proxies: dict = {})

Bases: BaseChatModel

gallery_url: str
max_tokens: Optional[int]
proxies: dict
streaming: Optional[bool]
temperature: Optional[float]
token_getter: IAMTokenProvider
top_p: Optional[float]
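
A minimal usage sketch (not part of the generated reference), assuming the standard LangChain chat-model interface and an IAMTokenProvider configured elsewhere; the endpoint URL is a placeholder:

    from langchain.schema import HumanMessage, SystemMessage
    from pangukitsappdev.llms.gallery import GalleryChatLLM

    def ask_gallery_chat(token_getter, question: str) -> str:
        # token_getter: an already-configured IAMTokenProvider instance
        chat_llm = GalleryChatLLM(
            gallery_url="https://<gallery-endpoint>/v1/chat",  # placeholder URL
            token_getter=token_getter,
            temperature=0.7,
            max_tokens=512,
        )
        result = chat_llm.generate([[SystemMessage(content="You are a helpful assistant."),
                                     HumanMessage(content=question)]])
        return result.generations[0][0].text
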
class pangukitsappdev.llms.gallery.GalleryLLM(*, name: Optional[str] = None, cache: Union[BaseCache, bool, None] = None, verbose: bool = None, callbacks: Callbacks = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, custom_get_token_ids: Optional[Callable[[str], List[int]]] = None, callback_manager: Optional[BaseCallbackManager] = None, temperature: Optional[float] = None, max_tokens: Optional[int] = None, top_p: Optional[float] = None, gallery_url: str, token_getter: IAMTokenProvider, streaming: Optional[bool] = None, proxies: dict = {})

Bases: LLM

gallery_url: str
max_tokens: Optional[int]
proxies: dict
streaming: Optional[bool]
temperature: Optional[float]
token_getter: IAMTokenProvider
top_p: Optional[float]
class pangukitsappdev.llms.gallery.GalleryLLMApi(llm_config: LLMConfig, chat_llm: Optional[BaseChatModel] = None, cache: Optional[CacheApi] = None)

Bases: AbstractLLMApi

do_create_chat_llm(llm_config: LLMConfig) → BaseChatModel
do_create_llm(llm_config: LLMConfig) → BaseLLM
parse_llm_response(llm_result: LLMResult) → LLMRespGallery
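
A sketch of how the AbstractLLMApi hooks documented above fit together; it only calls the methods listed here plus LangChain's standard generate:

    from langchain.schema import HumanMessage
    from pangukitsappdev.llms.gallery import GalleryLLMApi

    def ask_gallery_api(llm_config, question: str):
        # llm_config: an LLMConfig built for the Gallery endpoint
        api = GalleryLLMApi(llm_config=llm_config)
        chat_llm = api.do_create_chat_llm(llm_config)   # documented hook -> BaseChatModel
        llm_result = chat_llm.generate([[HumanMessage(content=question)]])
        return api.parse_llm_response(llm_result)       # documented hook -> LLMRespGallery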

pangukitsappdev.llms.openai module

class pangukitsappdev.llms.openai.OpenAILLMApi(llm_config: LLMConfig, chat_llm: Optional[BaseChatModel] = None, cache: Optional[CacheApi] = None)

Bases: AbstractLLMApi

do_create_chat_llm(llm_config: LLMConfig) → BaseChatModel
do_create_llm(llm_config: LLMConfig) → BaseLLM
parse_llm_response(llm_result: LLMResult) → LLMRespOpenAI
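
OpenAILLMApi shares the AbstractLLMApi template with the Gallery and Pangu variants, so a helper written against the common hooks works with any of them; a hedged sketch:

    from pangukitsappdev.llms.openai import OpenAILLMApi
    from pangukitsappdev.llms.pangu import PanguLLMApi

    def complete(api, llm_config, prompt: str):
        # api: any of the *LLMApi classes in this package
        llm = api.do_create_llm(llm_config)        # documented hook -> BaseLLM
        llm_result = llm.generate([prompt])        # standard LangChain LLMResult
        return api.parse_llm_response(llm_result)  # provider-specific response, e.g. LLMRespOpenAI

    # e.g. complete(OpenAILLMApi(llm_config=cfg), cfg, "Hello")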

pangukitsappdev.llms.pangu module

class pangukitsappdev.llms.pangu.PanguChatLLM(*, name: Optional[str] = None, cache: Union[BaseCache, bool, None] = None, verbose: bool = None, callbacks: Callbacks = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, custom_get_token_ids: Optional[Callable[[str], List[int]]] = None, callback_manager: Optional[BaseCallbackManager] = None, temperature: Optional[float] = None, max_tokens: Optional[int] = None, top_p: Optional[float] = None, presence_penalty: Optional[float] = None, pangu_url: str, token_getter: IAMTokenProvider, streaming: Optional[bool] = None, with_prompt: Optional[bool] = None, proxies: dict = {})

Bases: BaseChatModel

get_num_tokens(text: str) → int

Get the number of tokens present in the text.

get_num_tokens_from_messages(messages: List[BaseMessage]) → int

Get the number of tokens in the messages.

get_token_ids(text: str) → List[int]

Get the token IDs present in the text.

max_tokens: Optional[int]
pangu_url: str
presence_penalty: Optional[float]
proxies: dict
streaming: Optional[bool]
temperature: Optional[float]
token_getter: IAMTokenProvider
top_p: Optional[float]
with_prompt: Optional[bool]
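
A sketch that uses the documented token-counting helpers to budget the completion length; the context-window size and endpoint URL are assumptions, not SDK values:

    from langchain.schema import HumanMessage
    from pangukitsappdev.llms.pangu import PanguChatLLM

    def budgeted_chat(token_getter, question: str, context_window: int = 8192) -> str:
        # token_getter: an already-configured IAMTokenProvider instance
        chat_llm = PanguChatLLM(
            pangu_url="https://<pangu-endpoint>/v1/chat",  # placeholder URL
            token_getter=token_getter,
        )
        messages = [HumanMessage(content=question)]
        prompt_tokens = chat_llm.get_num_tokens_from_messages(messages)
        # Reserve whatever the prompt does not use for the completion
        chat_llm.max_tokens = max(context_window - prompt_tokens, 1)
        result = chat_llm.generate([messages])
        return result.generations[0][0].text
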
class pangukitsappdev.llms.pangu.PanguLLM(*, name: Optional[str] = None, cache: Union[BaseCache, bool, None] = None, verbose: bool = None, callbacks: Callbacks = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, custom_get_token_ids: Optional[Callable[[str], List[int]]] = None, callback_manager: Optional[BaseCallbackManager] = None, temperature: Optional[float] = None, max_tokens: Optional[int] = None, top_p: Optional[float] = None, presence_penalty: Optional[float] = None, pangu_url: str, token_getter: IAMTokenProvider, streaming: Optional[bool] = None, proxies: dict = {})

Bases: LLM

max_tokens: Optional[int]
pangu_url: str
presence_penalty: Optional[float]
proxies: dict
streaming: Optional[bool]
temperature: Optional[float]
token_getter: IAMTokenProvider
top_p: Optional[float]
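
PanguLLM exposes a streaming flag; a sketch wiring it to LangChain's stdout streaming callback (the endpoint URL is a placeholder, and actual streaming support depends on the deployed service):

    from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
    from pangukitsappdev.llms.pangu import PanguLLM

    def stream_completion(token_getter, prompt: str) -> str:
        # token_getter: an already-configured IAMTokenProvider instance
        llm = PanguLLM(
            pangu_url="https://<pangu-endpoint>/v1/text",  # placeholder URL
            token_getter=token_getter,
            streaming=True,
            callbacks=[StreamingStdOutCallbackHandler()],  # prints tokens as they arrive
            temperature=0.3,
        )
        return llm(prompt)  # also returns the full completion string
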
class pangukitsappdev.llms.pangu.PanguLLMApi(llm_config: LLMConfig, chat_llm: Optional[BaseChatModel] = None, cache: Optional[CacheApi] = None)

Bases: AbstractLLMApi

do_create_chat_llm(llm_config: LLMConfig) → BaseChatModel
do_create_llm(llm_config: LLMConfig) → BaseLLM
parse_llm_response(llm_result: LLMResult) → LLMRespPangu

Module contents