Problem/Motivation

It would be amazing to have Ollama support in AI Interpolator:

  • It is Open Source and can be self-hosted
  • It is easy to implement
  • It is widely useful and works well for classical LLM text prompting.

Steps to reproduce

Proposed resolution

Remaining tasks

User interface changes

API changes

Data model changes

Comments

ressa created an issue. See original summary.

marcus_johansson’s picture

Status: Active » Closed (won't fix)

This is already being worked on here: https://www.drupal.org/project/ollama

mindaugasd’s picture

Title: Add Ollama support in AI Interpolator » Add AI Interpolator support
Project: AI Interpolator » Ollama AI
Version: 1.0.x-dev »
Status: Closed (won't fix) » Active

Since this issue belongs in the Ollama issue queue, I changed the project.

@Orkut Murat Yılmaz, the integration can reuse some code from the Hugging Face module: https://www.drupal.org/project/huggingface
Module issue: #3420592: Add support in AI Interpolator for Hugging Face

ressa’s picture

Great idea, thanks @mindaugasd.

ressa’s picture

I see that @Marcus Johansson joined the project recently, and that @Orkut Murat Yılmaz made some commits, which is great! Is a release close, or can it perhaps already be tested now?

The llama3 model works well for me in Ollama (running locally on Debian 12), and it would be awesome if Drupal supported at least one self-hosted LLM.

mindaugasd’s picture

@ressa

would be awesome if Drupal supported at least one self-hosted LLM

I think this other module supports it already: https://www.drupal.org/project/lmstudio . Would you be up to testing it?

lmstudio is integrated with LLM Provider, and the module maintainer @seogow is also working on AI Interpolator support at the moment.

As per this issue #3426454: Collaboration with existing projects, Ollama could also be integrated with LLM Provider in the same way, and by extension, with AI Interpolator.

ressa’s picture

Thanks for the tip about https://www.drupal.org/project/lmstudio @mindaugasd, it looks great!

I guess I should run Ollama with ollama serve and then use Ollama's IP address to connect it to the LM Studio module?

And thanks for the link to the issue about collaboration between existing LLM projects, that would be very nice. There are a lot of moving parts to keep track of :)
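For reference, a minimal sketch of what that setup could look like on the command line. This is an assumption based on Ollama's standard defaults (port 11434, the /api/generate endpoint), not a verified recipe for the LM Studio module:

```shell
# Start Ollama, binding to all interfaces so the Drupal site can reach it
# (the OLLAMA_HOST variable and 0.0.0.0 binding are assumptions for a
# setup where Drupal runs on a different host or in a container):
OLLAMA_HOST=0.0.0.0 ollama serve

# In another terminal, check that the API answers on the default port:
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Hello",
  "stream": false
}'
```

If the curl call returns a JSON response, the server side is working, and the remaining question is only which base URL the Drupal module expects.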

ressa’s picture

I tried the LM Studio module, but it's not clear to me how to actually use it -- I can't add an LM Studio block, and there are no new LM Studio fields ...

mindaugasd’s picture

In case you are looking for AI Interpolator support with lmstudio, @seogow is developing it, but there is none yet. With this patch #3426452: Proposal for Implementing LLM Abstraction in OpenAI Drupal Modules, you may be able to get it working with the features of the OpenAI module.