just-llm

Unified high-level interface for local LLMs – supports Ollama (remote/local server) and embedded node-llama-cpp (GGUF models) with chat sessions and tool calling

by just-node

Trust Score: 36.0/100 (Fair)
$ npm install just-llm

🚀 Why just-llm?

just-llm provides conversational chat sessions, embeddings, and tool calling through a single interface, with support for REST.
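just-llm's own API surface is not documented on this page, so the sketch below only illustrates the common chat-session shape (an OpenAI/Ollama-style message history) that a backend such as an Ollama server or an embedded node-llama-cpp model would receive on each turn; the `ChatSession` class and its method names are hypothetical, not just-llm's actual API.

```typescript
// Hypothetical sketch of a chat session: a running message history in the
// role/content format used by Ollama-compatible chat endpoints.
type Role = "system" | "user" | "assistant";

interface ChatMessage {
  role: Role;
  content: string;
}

class ChatSession {
  private history: ChatMessage[] = [];

  constructor(system?: string) {
    // An optional system prompt seeds the conversation.
    if (system) this.history.push({ role: "system", content: system });
  }

  addUser(content: string): void {
    this.history.push({ role: "user", content });
  }

  addAssistant(content: string): void {
    this.history.push({ role: "assistant", content });
  }

  // The full history is what gets sent to the model on every turn.
  messages(): ChatMessage[] {
    return [...this.history];
  }
}

const session = new ChatSession("You are a helpful assistant.");
session.addUser("Hello!");
```

A real client would send `session.messages()` to the model, append the reply with `addAssistant`, and repeat.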

🚀 Conversational: multi-turn chat sessions against an Ollama server or an embedded node-llama-cpp (GGUF) model

🔬 Embeddings: semantic understanding via vectors

🚀 Tool-Calling: let the model invoke registered functions during a conversation
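The page does not show just-llm's tool-calling API, so the following is a hedged sketch of the general pattern: declare a tool with a JSON-Schema-like parameter spec, then dispatch a model-emitted call to its handler. All names here (`registerTool`, `dispatchToolCall`, the `add` tool) are illustrative, not just-llm's.

```typescript
// Hypothetical tool registry: the model emits { name, arguments } and the
// runtime routes the call to the matching handler.
interface ToolDefinition {
  name: string;
  description: string;
  parameters: Record<string, { type: string; description: string }>;
  handler: (args: Record<string, unknown>) => unknown;
}

const tools: Record<string, ToolDefinition> = {};

function registerTool(tool: ToolDefinition): void {
  tools[tool.name] = tool;
}

function dispatchToolCall(name: string, args: Record<string, unknown>): unknown {
  const tool = tools[name];
  if (!tool) throw new Error(`unknown tool: ${name}`);
  return tool.handler(args);
}

// Example tool: a simple adder the model could call.
registerTool({
  name: "add",
  description: "Add two numbers",
  parameters: {
    a: { type: "number", description: "first operand" },
    b: { type: "number", description: "second operand" },
  },
  handler: (args) => (args.a as number) + (args.b as number),
});
```

In practice the tool's result would be appended to the chat history so the model can use it in its next turn.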

🔌 Protocols & Compatibility

REST
⚡ Capabilities
conversational, embeddings, tool-calling
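Embedding vectors are typically compared with cosine similarity, dot(a, b) / (|a| · |b|). This helper is independent of just-llm's actual embeddings API and works on any numeric vectors it returns:

```typescript
// Cosine similarity between two embedding vectors of equal dimension.
// Returns 1 for identical directions, 0 for orthogonal vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Typical use: embed a query and a set of documents, then rank the documents by similarity to the query.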

🔧 Technical Specifications

Type: Tool
Language: TypeScript
Trust Score: 36.0/100 (Fair)
Protocols: REST
Install: npm install just-llm
Source: https://www.npmjs.com/package/just-llm
Trust Score: 36.0/100 (growing presence, building community trust)
GitHub Stars: not available
Status: unverified (not yet claimed)
Tags: #llm #ollama #llama.cpp #local-llm #ai #chatbot #tool-calling #inference

🏷️ Embed Badge

Add a trust badge to your README:

[![Fushu](https://fushu.dev/badge/f61e02816e26/trust.svg)](https://fushu.dev/agent/f61e02816e26)

Get Started with just-llm

Install now and integrate into your workflow in minutes.

$ npm install just-llm