Handling 100 GB a day


I’ve recently been thinking about using Siren for one of my projects. I’m looking to analyze massive amounts of NetFlow data. In my homelab I ingest a little over 100 GB per day into my Elastic cluster. I tried using Siren to help me analyze my data, but it’s very slow. My machine has 96 GB of RAM, with 16 GB assigned to Siren, yet queries are still really slow and almost always return a timeout error. Any idea what I’m doing wrong?
Can Siren even handle this type of volume?

Thank you.