Large language model (LLM)

Hello,

I was wondering if, or when, you are going to integrate LLMs. I already use Llama2 on my existing Elasticsearch instance, but it would be awesome to be able to do the same with my Siren instance as well.

Thanks.

Hello,

Thanks for reaching out! To help us understand your specific needs better, could you provide a bit more detail about your use case? For instance:

  1. How are you currently using Llama2 with Elasticsearch? What kind of data and queries are you working with?
  2. What specific functionality or benefits are you hoping to gain by integrating LLMs into your Siren instance?

Kind regards.

Hi,

  1. Yes, I am. I work with different models using Elasticsearch Machine Learning nodes.
    1.1. My data sets vary drastically based on the project I’m currently working on: anything from scraped websites and social media data to large research papers to patents. As for queries, right now I use a simple React app to interact with Elastic. I usually ask it to summarize information for me, or ask it more complicated questions instead of reading millions of pages myself (a rough sketch of that call pattern is below, after this list).
  2. I like the Siren UI much better than the Kibana UI. Plus, with the Federate plugin I can navigate between my different dashboards using what I believe you call associative search. I have installed my own NLP plugin, so now using the graph is even better. As for specific benefits, I use different ML models very heavily, and the fact that I can’t take advantage of them on my local Siren instance basically makes the whole thing moot.
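
For context, the call from the React app looks roughly like the sketch below. The endpoint, model ID, API key, and field names are simplified placeholders rather than my exact setup.

```typescript
// Rough sketch of the call from my React app to a model deployed on the ML nodes.
// Endpoint, model ID, and API key are placeholders, not my real configuration.
const ES_URL = "https://localhost:9200";
const MODEL_ID = "my_deployed_model";
const API_KEY = "<elasticsearch-api-key>";

async function askModel(text: string): Promise<unknown> {
  // Infer trained model API (Elasticsearch 8.x): POST _ml/trained_models/<model_id>/_infer
  const response = await fetch(`${ES_URL}/_ml/trained_models/${MODEL_ID}/_infer`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `ApiKey ${API_KEY}`,
    },
    // "text_field" is the default input field name for Elastic NLP models.
    body: JSON.stringify({ docs: [{ text_field: text }] }),
  });
  if (!response.ok) {
    throw new Error(`Inference request failed: ${response.status}`);
  }
  // The shape of inference_results depends on the task type the model was deployed with.
  const result = await response.json();
  return result.inference_results?.[0] ?? result;
}
```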

Thanks.

Hi Jheran,

Thanks for providing the details of your use case. To discuss it further, please reach out to sales@siren.io so that we can go through it in detail.

Regards,
Manu