ollama · 5 min read
Ollama — Run LLMs Locally on Your Mac
Run large language models locally using Ollama without internet or API keys.
For each query (xᵢ, mᵢ), find the maximum value of xᵢ XOR a over array elements a ≤ mᵢ. Offline approach: sort the array, and sort the queries by mᵢ; then process queries in increasing order of mᵢ, inserting each element a ≤ mᵢ into a binary trie before answering the query with a greedy bit-by-bit trie walk.
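A minimal sketch of that offline trie approach in Python (names like `maximize_xor` and the 30-bit width are assumptions, not from the source; queries with no eligible element return -1):

```python
class TrieNode:
    __slots__ = ("children",)

    def __init__(self):
        self.children = [None, None]  # children[0] / children[1] per bit

BITS = 30  # assumes all values fit in 30 bits

def insert(root, num):
    """Insert num into the binary trie, most significant bit first."""
    node = root
    for b in range(BITS - 1, -1, -1):
        bit = (num >> b) & 1
        if node.children[bit] is None:
            node.children[bit] = TrieNode()
        node = node.children[bit]

def max_xor(root, x):
    """Greedy walk: at each bit prefer the branch opposite to x's bit."""
    if root.children[0] is None and root.children[1] is None:
        return -1  # trie is empty: no element satisfied a <= m
    node, res = root, 0
    for b in range(BITS - 1, -1, -1):
        bit = (x >> b) & 1
        want = 1 - bit
        if node.children[want] is not None:
            res |= 1 << b
            node = node.children[want]
        else:
            node = node.children[bit]
    return res

def maximize_xor(nums, queries):
    """Answer queries (x, m) offline: max x ^ a over a in nums with a <= m."""
    nums = sorted(nums)
    order = sorted(range(len(queries)), key=lambda i: queries[i][1])
    ans = [-1] * len(queries)
    root = TrieNode()
    j = 0
    for i in order:  # queries in increasing order of m
        x, m = queries[i]
        while j < len(nums) and nums[j] <= m:
            insert(root, nums[j])  # add every element now eligible
            j += 1
        ans[i] = max_xor(root, x)
    return ans
```

Sorting both the elements and the queries means each element is inserted exactly once, giving O((n + q) · BITS) after the O(n log n + q log q) sorts.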