#%env/templates/metas.template%# #%env/templates/header.template%# #%env/templates/submenuIndexImport.template%#

LLM Selection

Here you can pick models from an LLM model service and select them as production models. In the "Production Models" matrix below you can then assign each selected model a function inside YaCy.

Install your local LLM service first! You need either a local Ollama or LM Studio instance running on your local host or inside your intranet.
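If you are unsure whether the service is reachable, the following minimal Java sketch (YaCy itself is written in Java) probes an Ollama instance on its default port 11434 and lists the installed models via the /api/tags endpoint; the hoststub value here is an assumption and must match your own setup, and LM Studio typically listens on http://localhost:1234 with an OpenAI-compatible /v1 API instead.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaCheck {
    public static void main(String[] args) throws Exception {
        // Assumed default Ollama endpoint; pass your own hoststub as the first argument.
        String hoststub = args.length > 0 ? args[0] : "http://localhost:11434";

        // /api/tags returns a JSON list of the models installed in Ollama.
        HttpRequest request = HttpRequest.newBuilder(URI.create(hoststub + "/api/tags"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("HTTP " + response.statusCode());
        System.out.println(response.body());
    }
}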

Service Selection
service
  Selecting a service presets the hoststub value below.
hoststub
  You can usually leave this at the default value.
api_key
  (not required for Ollama or LM Studio)
max_tokens
  The context length configured in the LLM service must be large enough for your selected max_tokens; in Ollama you find a Context Length slider in the settings. See the request sketch after this list for how these values fit together.
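As an illustration of how hoststub, api_key, and max_tokens work together, here is a hedged sketch of a chat request against the OpenAI-compatible /v1/chat/completions endpoint that both Ollama and LM Studio expose; the model name and the token budget are placeholders, not values taken from this page.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LlmChatRequest {
    public static void main(String[] args) throws Exception {
        // Values as they appear on this settings page; adjust them to your setup.
        String hoststub  = "http://localhost:11434"; // Ollama default; LM Studio: http://localhost:1234
        String apiKey    = "";                       // not required for Ollama or LM Studio
        String model     = "llama3.2";               // placeholder; use a model you have installed
        int    maxTokens = 1024;                     // must fit into the context length set in the LLM service

        String body = "{"
                + "\"model\":\"" + model + "\","
                + "\"max_tokens\":" + maxTokens + ","
                + "\"messages\":[{\"role\":\"user\",\"content\":\"Say hello\"}]"
                + "}";

        HttpRequest.Builder builder = HttpRequest.newBuilder(URI.create(hoststub + "/v1/chat/completions"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body));
        if (!apiKey.isEmpty()) builder.header("Authorization", "Bearer " + apiKey);

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(builder.build(), HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}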
 
Production Models
#{productionmodels}# #{/productionmodels}#
service | model | hoststub | api_key | max_tokens | answers | chat | translation | qa-generation | classification | shortener | vision | Actions
#[service]# | #[model]# | #[hoststub]# | #[api_key]# | #[max_tokens]#
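Each selected production model is assigned to the function columns above (answers, chat, translation, qa-generation, classification, shortener, vision). The following Java sketch is not YaCy's internal data structure, just an illustration under assumed names of the idea: a map from each function column to the model row selected to serve it.

import java.util.EnumMap;
import java.util.Map;

public class ProductionModelMatrix {

    // The functions offered as columns in the "Production Models" matrix.
    enum LlmFunction { ANSWERS, CHAT, TRANSLATION, QA_GENERATION, CLASSIFICATION, SHORTENER, VISION }

    // Minimal description of one selected production model (the row fields of the matrix).
    record ProductionModel(String service, String model, String hoststub, String apiKey, int maxTokens) {}

    public static void main(String[] args) {
        ProductionModel ollamaModel = new ProductionModel(
                "ollama", "llama3.2", "http://localhost:11434", "", 1024);

        // Assumed mapping: each function points to the production model assigned to it.
        Map<LlmFunction, ProductionModel> assignment = new EnumMap<>(LlmFunction.class);
        assignment.put(LlmFunction.CHAT, ollamaModel);
        assignment.put(LlmFunction.TRANSLATION, ollamaModel);

        assignment.forEach((function, m) ->
                System.out.println(function + " -> " + m.model() + " @ " + m.hoststub()));
    }
}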
#%env/templates/footer.template%#