
Ask HN: Which LLMs can run locally on most consumer computers

Jan.ai · RecurseChat

  1. Jan.ai
    Run LLMs like Mistral or Llama 2 locally and offline on your computer, or connect to remote AI APIs like OpenAI's GPT-4 or Groq. (A minimal local-API sketch follows this list.)
    Pricing:
    • Open Source

    #LLM #ChatGPT #AI · 11 social mentions

  2. RecurseChat: Use Local AI as a Daily Driver
    I have been using a local LLM as a daily driver and built https://recurse.chat for it.

    #AI #Productivity #AI Tools · 15 social mentions
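
To make the thread's question concrete, here is a minimal sketch of querying a locally hosted model through an OpenAI-compatible endpoint, the pattern apps like Jan use to swap between local and remote backends. The base URL, port, placeholder API key, and model name below are assumptions for illustration, not documented defaults.

```python
# Minimal sketch: talk to a local OpenAI-compatible server with the standard
# openai client. URL, key, and model name are illustrative assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1337/v1",  # assumed address of the local server
    api_key="not-needed-locally",         # local servers typically ignore the key
)

# Pointing base_url at api.openai.com and model at "gpt-4" would hit the remote API instead.
response = client.chat.completions.create(
    model="mistral-7b-instruct",  # assumed identifier of a locally loaded model
    messages=[{"role": "user", "content": "What hardware do I need to run you locally?"}],
)
print(response.choices[0].message.content)
```

Because the client only sees a base URL and a model name, the same code can target a remote API or whatever local runtime happens to be serving the model.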

