cross-posted from: https://lemmit.online/post/5364920
just for good measure
This is an automated archive made by the Lemmit Bot.
The original was posted on /r/memes by /u/brylex1 on 2025-03-10 03:01:06+00:00.
I have an amazing solution for this: instead of screenshotting, I save it as .txt (type it out), so that later, when I have a self-hosted LLM assistant, I can feed it everything I've compiled so far, ask for the movies/songs/articles I saved, and just do a semantic search through it. Planning to make an open-source tool for this, but I'm not too good at ML.
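For illustration, a minimal sketch of that semantic-search-over-notes idea, assuming a folder of .txt files and the sentence-transformers library (the model name and paths are placeholders, not anything from the thread):

```python
# Minimal sketch: embed a folder of saved .txt notes and search them semantically.
# Assumes the sentence-transformers package; "notes/" and the model name are illustrative.
from pathlib import Path
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, CPU-friendly embedding model

# Load every note once and embed it.
notes = [p.read_text(encoding="utf-8") for p in Path("notes").glob("*.txt")]
note_embeddings = model.encode(notes, convert_to_tensor=True)

def search(query: str, top_k: int = 5):
    """Return the top_k notes most semantically similar to the query."""
    query_embedding = model.encode(query, convert_to_tensor=True)
    hits = util.semantic_search(query_embedding, note_embeddings, top_k=top_k)[0]
    return [(notes[h["corpus_id"]], h["score"]) for h in hits]

print(search("that sci-fi movie about dreams"))
```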
Oh, no need to wait for LLMs. Apache Solr should be really good at it. We used it at a company I was working at to build the most kickass search into our platform, one that would actually find the stuff you were looking for… and that was back in 2018 :D
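As a rough sketch of the Solr route, assuming a local Solr instance with a core named "notes" and the pysolr client (core name, URL, and documents are all illustrative):

```python
# Minimal sketch: index saved notes into Apache Solr and run a keyword search.
# Assumes a running Solr instance with a core called "notes" and the pysolr library.
import pysolr

solr = pysolr.Solr("http://localhost:8983/solr/notes", always_commit=True)

# Index a few screenshots-turned-text as documents.
solr.add([
    {"id": "1", "content": "Inception - sci-fi movie about dreams within dreams"},
    {"id": "2", "content": "Bohemian Rhapsody - Queen song to listen to later"},
])

# Lucene query syntax; Solr handles ranking, stemming, and fuzzy matching (~).
results = solr.search("content:dreams~")
for doc in results:
    print(doc["id"], doc["content"])
```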
Ayy, that’s nice. LLMs are truly overkill just for semantic search, though; I didn’t know there were other ways to achieve this. But we need intelligence too, right? (Somewhat.)
Don’t get me wrong though… throwing an LLM at it would be a lot easier and faster. Just a mind-boggling use of resources for a task that could probably be done more efficiently :D
Setting this up with Apache Solr and a suitable search frontend runs a high risk of becoming an abandoned side project itself^^
Yeah, an LLM seems like the go-to solution, and the best one. And speaking of resources, we can use barely smart models that can still generate coherent sentences, e.g. 0.5B-3B models offloaded to CPU-only inference.
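A minimal sketch of that CPU-only setup, assuming llama-cpp-python and a small quantized GGUF model (the file paths and model choice are placeholders):

```python
# Minimal sketch: run a small (~0.5B-3B) quantized model on CPU to answer over saved notes.
# Assumes llama-cpp-python; the GGUF path is illustrative - any small instruct model would do.
from llama_cpp import Llama

llm = Llama(model_path="models/qwen2.5-1.5b-instruct-q4_k_m.gguf", n_ctx=4096)

notes = open("notes/saved_stuff.txt", encoding="utf-8").read()
response = llm.create_chat_completion(messages=[
    {"role": "system", "content": "Answer using only the user's saved notes."},
    {"role": "user", "content": f"Notes:\n{notes}\n\nWhich movies did I save?"},
])
print(response["choices"][0]["message"]["content"])
```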
Yes, your own intelligence that you integrate into the structure of your database and queries ;)
For everyone else, here: https://rlama.dev/