quackamollie.model.langchain_simple.model_langchain_simple module

class quackamollie.model.langchain_simple.model_langchain_simple.SimpleLangchainQuackamollieModel(model_config: str | None = None)[source]

Bases: MetaLangchainQuackamollieModel

Simple Langchain model with chat history, managed by the LangchainQuackamollieModelManager

DEFAULT_OLLAMA_BASE_MODEL: str = 'llama3'
classmethod astream_answer(content: str, chat_history: List[Tuple[str, str]], model_config: str | None = None, **kwargs) AsyncIterable[Tuple[str, bool]][source]

Asynchronous iterator to stream the answer from a Langchain model

Parameters:
  • content (str) – The new message content

  • chat_history (List) – A list of past messages, formatted by the model manager

  • model_config (Optional[str]) – Additional configuration given as a string through CLI or Telegram App Settings and retrieved from the database

  • kwargs (Dict) – Additional streaming arguments

Returns:

An asynchronous iterator yielding tuples of the new answer chunk and a boolean indicating whether the model is done streaming

Return type:

AsyncIterable[Tuple[str, bool]]

model_families: List[ModelFamilyIcon] = [ModelFamilyIcon.LANGCHAIN]