Absurd: llm

20231203   How to Block LLM bots   ⇢  

With LLMs and LLM-powered services springing up all over the web with no regard for copyright, morality, or consent, I wanted to find a way to block these things from using any of my content. Normally, we would use `robots.txt` for this.
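A minimal `robots.txt` along those lines might look like the following. The user-agent tokens shown (GPTBot, CCBot, Google-Extended, anthropic-ai) are ones these crawlers have publicly documented, but the list changes as new bots appear, and compliance is voluntary on the crawler's part:

```
# Block known LLM/AI training crawlers; everyone else is unaffected
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /
```

Place the file at the root of the site (e.g. `https://example.com/robots.txt`); well-behaved crawlers fetch it before requesting anything else.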

20231205   Running an LLM locally   ⇢  

So, if you want to run an LLM locally for whatever reason, it's not too hard to do. Essentially, you first need to grab llama.cpp and compile it, then grab a model (and possibly convert it), and then run it. Before starting, make sure you have a C/C++ compiler installed.
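A rough sketch of those steps, assuming a recent llama.cpp that uses the CMake build (older versions used `make` and named the binary `main`); the model path is illustrative, and any model in GGUF format will do:

```shell
# Fetch and build llama.cpp (needs git, cmake, and a C/C++ toolchain)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Grab a GGUF model separately (e.g. from Hugging Face), then run it;
# -m points at the model file, -p is the prompt, -n caps the tokens generated
./build/bin/llama-cli -m /path/to/model.gguf -p "Hello" -n 64
```

If your model is only available as raw PyTorch or safetensors weights, the repo ships conversion scripts (e.g. `convert_hf_to_gguf.py`) to produce a GGUF file first.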


© MMXXV, Abort Retry Fail LLC
Licentiam Absurdum