Open Source JavaScript Large Language Models (LLM) for Mac


Browse free open source JavaScript Large Language Models (LLM) projects for Mac below.

  • 1
    SillyTavern

    LLM Frontend for Power Users

SillyTavern is a user interface you can install on your computer (and Android phones) that lets you interact with text generation AIs and chat or roleplay with characters you or the community create. It is mobile-friendly and multi-API (KoboldAI/CPP, Horde, NovelAI, Ooba, OpenAI, OpenRouter, Claude, Scale), with a VN-like Waifu Mode, Horde SD, system TTS, WorldInfo (lorebooks), a customizable UI, auto-translate, and more prompt options than you'd ever want or need. An optional Extras server adds more SD/TTS options plus ChromaDB/Summarize. SillyTavern is a fork of TavernAI 1.2.8 that is under more active development and has added many major features; at this point, the two can be thought of as completely independent programs.
    Downloads: 159 This Week
    See Project
  • 2
    Anything LLM

The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities

A full-stack application that lets you turn any document, resource, or piece of content into context that any LLM can use as a reference while chatting. You can pick and choose which LLM or vector database to use, and the app supports multi-user management and permissions. AnythingLLM works with commercial off-the-shelf LLMs as well as popular open-source LLMs and vector database solutions, so you can build a private ChatGPT with no compromises that runs locally or is hosted remotely and chats intelligently with any documents you provide. AnythingLLM divides your documents into objects called workspaces. A workspace functions a lot like a thread, but with the addition of containerization for your documents: workspaces can share documents, but they do not talk to each other, so the context of each workspace stays clean. (A minimal workspace-API sketch follows the listing.)
    Downloads: 34 This Week
    See Project
  • 3
    LLM Datasets

    Curated list of datasets and tools for post-training

LLM Datasets curates and standardizes datasets commonly used to train and fine-tune large language models, reducing the overhead of hunting down sources and normalizing formats. The repository aims to make datasets easy to inspect and transform, with scripts for downloading, deduplicating, cleaning, and converting to formats like JSONL that slot into training pipelines. It highlights instruction-tuning and conversation-style corpora while also pointing to code, math, and domain-specific sets for targeted capabilities. Quality is a recurring theme: examples and utilities help filter low-value samples, enforce length limits, and split train/validation consistently so results are comparable. Licensing and provenance are surfaced to encourage compliant usage and to guide dataset selection in commercial settings. For practitioners, the repo is a practical “starting pantry” that accelerates experimentation and keeps data wrangling from dominating the project timeline. (A JSONL-normalization sketch follows the listing.)
    Downloads: 6 This Week
    See Project
  • 4
    CSGHub

    CSGHub is a brand-new open-source platform for managing LLMs

CSGHub is an open-source platform for managing the assets used to build and serve large language models, such as models, datasets, and code. It lets teams host, version, and share these assets on their own infrastructure through a web interface and APIs, with permission controls that support collaboration across projects.
    Downloads: 1 This Week
    See Project
  • 5
    ChatLLM Web

Chat with LLMs like Vicuna entirely in your browser with WebGPU

Chat with LLMs like Vicuna entirely in your browser with WebGPU, safely, privately, and with no server, powered by web-llm. To use this app, you need a browser that supports WebGPU, such as Chrome 113 or Chrome Canary; Chrome versions 112 and earlier are not supported. You will also need a GPU with about 6.4 GB of memory; if your GPU has less, the app will still run, but responses will be slower. The first time you use the app, you will need to download the model. For the Vicuna-7B model currently in use, the download is about 4 GB; after the initial download, the model is loaded from the browser cache for faster startup. (A minimal web-llm usage sketch follows the listing.)
    Downloads: 1 This Week
    See Project
  • 6
    Superagent

    Build, deploy, and manage LLM-powered agents

Superagent is a powerful tool that simplifies the configuration and deployment of LLM (Large Language Model) agents to production. It provides a range of features that make it easier for developers to build, manage, and deploy AI agents, including built-in memory, document retrieval via vector databases, powerful tools, webhooks, and cron jobs.
    Downloads: 1 This Week
    See Project
  • 7
    LLM Course

    Course to get into Large Language Models (LLMs)

LLM Course is a hands-on, notebook-driven path for learning how large language models work in practice, from data curation to training, fine-tuning, evaluating, and deploying. It emphasizes reproducible experiments: each step is demonstrated with runnable code, clear dependencies, and references to commonly used open-source models and libraries. Learners get exposure to multiple adaptation strategies (LoRA/QLoRA, instruction fine-tuning, and alignment techniques) so they can choose approaches that fit their hardware and budgets. The materials also cover inference optimization and quantization to make serving LLMs feasible on commodity GPUs or even CPUs, which is crucial for side projects and startups. Evaluation is treated as a first-class topic, with examples of automatic and human-in-the-loop methods to catch regressions and verify quality beyond simple loss values. By the end, students have a mental model and a practical toolkit for iterating on datasets, training configurations, and evaluation setups.
    Downloads: 0 This Week
    See Project
  • 8
    Learn Prompting

    This website is a free, open-source guide on prompt engineering

This website is a free, open-source guide on prompt engineering. Contributions are welcome, and so is harsh criticism. We launched the first-ever prompt hacking competition, designed to enhance AI safety and education by challenging participants to outsmart large language models; it ran from May 5th to June 3rd, featured 10 increasingly difficult levels of prompt hacking defenses, and offered the chance to win over $35,000 in prizes. Coding is a great skill to learn alongside prompt engineering; we recommend learning Python, as it is a popular language for AI and machine learning. Be among the first to access the certification program as soon as it launches.
    Downloads: 0 This Week
    See Project
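
To make the workspace model from the AnythingLLM entry above concrete, here is a minimal sketch of creating a workspace and chatting against its documents over HTTP. The base URL, endpoint paths, payload fields, and response shapes are assumptions for illustration only; consult the AnythingLLM developer API documentation for the actual interface.

```javascript
// Sketch only: create a workspace, then chat against the documents embedded in it.
// Endpoint paths and response shapes are assumed, not taken from AnythingLLM docs.
const BASE = "http://localhost:3001/api/v1";            // assumed local AnythingLLM instance
const API_KEY = process.env.ANYTHINGLLM_API_KEY;        // assumed developer API key

async function post(path, body) {
  const res = await fetch(`${BASE}${path}`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

// One workspace per project keeps retrieval context isolated.
const ws = await post("/workspace/new", { name: "quarterly-reports" });
const answer = await post(`/workspace/${ws.workspace.slug}/chat`, {
  message: "Summarize the Q3 revenue drivers.",
  mode: "chat",
});
console.log(answer.textResponse);
```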
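
The LLM Datasets entry mentions scripts for deduplicating, cleaning, and converting corpora to JSONL. The sketch below shows that kind of pass in plain Node.js; the file names, field names, and length limit are illustrative choices, not conventions taken from the repository.

```javascript
// Sketch: normalize an instruction dataset to JSONL with exact-duplicate removal
// and a crude length filter. Input/output file names and fields are illustrative.
import { createHash } from "node:crypto";
import { readFileSync, writeFileSync } from "node:fs";

const records = JSON.parse(readFileSync("raw_dataset.json", "utf8")); // [{instruction, output}, ...]
const seen = new Set();
const lines = [];

for (const { instruction, output } of records) {
  if (!instruction || !output) continue;                    // drop incomplete rows
  if (instruction.length + output.length > 8000) continue;  // arbitrary length cap
  const key = createHash("sha256")
    .update(instruction.trim().toLowerCase())
    .digest("hex");
  if (seen.has(key)) continue;                              // skip exact-duplicate prompts
  seen.add(key);
  lines.push(JSON.stringify({ instruction, output }));
}

writeFileSync("train.jsonl", lines.join("\n") + "\n");      // one JSON object per line
console.log(`kept ${lines.length} of ${records.length} records`);
```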
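
The ChatLLM Web entry describes running a model entirely in the browser on top of web-llm. The sketch below shows the same pattern with the @mlc-ai/web-llm package's OpenAI-style API: check for WebGPU, download the weights on first use (cached by the browser afterwards), then chat. The model id here is illustrative; pick one from web-llm's prebuilt model list (ChatLLM Web itself currently uses Vicuna-7B).

```javascript
// Sketch: browser-only chat via @mlc-ai/web-llm (the engine ChatLLM Web builds on).
// The model id is illustrative; use an id from web-llm's prebuilt model list.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

if (!("gpu" in navigator)) {
  throw new Error("WebGPU is not available; use Chrome 113+ or another WebGPU-capable browser");
}

// The first call downloads the weights (several GB); later loads come from the browser cache.
const engine = await CreateMLCEngine("Llama-3.2-1B-Instruct-q4f16_1-MLC", {
  initProgressCallback: (report) => console.log(report.text), // surface download/compile progress
});

const reply = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Explain WebGPU in one sentence." }],
});
console.log(reply.choices[0].message.content);
```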