Documentation
Everything you need to integrate Sutraworks into your application.
1. Installation
# Install from GitHub Releases
npm install github:nranjan2code/sutraworks-clientAISDK

# Or clone and install locally
git clone https://github.com/nranjan2code/sutraworks-clientAISDK.git
cd sutraworks-clientAISDK && npm install && npm run build

# CDN (Browser) - use jsDelivr with GitHub
<script src="https://cdn.jsdelivr.net/gh/nranjan2code/sutraworks-clientAISDK@latest/dist/umd/sutra-ai.umd.js"></script>
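The UMD build attaches the client to a browser global instead of being imported as an ES module. A minimal sketch, assuming the global is named SutraAI (check the bundle's actual export name and shape):

<script>
  // Assumption: the UMD bundle exposes a `SutraAI` global; whether it is a
  // namespace object or the class itself depends on the build configuration.
  const ai = new SutraAI.SutraAI();

  ai.setKey('openai', 'sk-...').then(async () => {
    const response = await ai.chat({
      provider: 'openai',
      model: 'gpt-4o-mini',
      messages: [{ role: 'user', content: 'Hello from the browser!' }]
    });
    console.log(response.choices[0].message.content);
  });
</script>

2. Basic Usage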
import { SutraAI } from '@sutraworks/client-ai-sdk';
// Initialize the client
const ai = new SutraAI();
// Set your API key (stored locally, never sent to a server)
await ai.setKey('openai', 'sk-...');
// Make a chat request
const response = await ai.chat({
  provider: 'openai',
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello!' }]
});
console.log(response.choices[0].message.content);

3. Streaming Responses
for await (const chunk of ai.chatStream({
  provider: 'openai',
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Write a poem' }]
})) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
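process.stdout is only available in Node.js. In the browser, the same async-iterable stream can be rendered incrementally; a minimal sketch, assuming an element with id "output" exists on the page:

const output = document.getElementById('output');

for await (const chunk of ai.chatStream({
  provider: 'openai',
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Write a poem' }]
})) {
  // Append each streamed delta to the page as it arrives
  output.textContent += chunk.choices[0]?.delta?.content ?? '';
}

4. API Reference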
| Method | Description |
|---|---|
| setKey(provider, key) | Set API key for a provider |
| setKeys(keys) | Set multiple API keys at once |
| chat(request) | Execute chat completion |
| chatStream(request) | Stream chat completion |
| complete(prompt) | Simple text completion |
| embed(request) | Generate embeddings |
| batch(requests) | Process multiple requests |
| use(middleware) | Add middleware to pipeline |
| getUsageStats() | Get usage statistics |
| destroy() | Clean up all resources |
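The examples above cover chat and chatStream; the sketch below strings several of the remaining methods together. Only the method names come from the table, so the argument and return shapes shown here (key map, middleware signature, embed request, batch items) are assumptions rather than the documented contract:

import { SutraAI } from '@sutraworks/client-ai-sdk';

const ai = new SutraAI();

// Register keys for several providers at once (key map shape assumed;
// provider names other than 'openai' are illustrative)
await ai.setKeys({
  openai: 'sk-...',
  anthropic: 'sk-ant-...'
});

// Add a logging middleware to the request pipeline (signature assumed)
ai.use(async (request, next) => {
  console.log('outgoing request for model:', request.model);
  return next(request);
});

// Simple text completion from a bare prompt
const summary = await ai.complete('Summarize Hamlet in one sentence.');

// Generate embeddings (request shape assumed)
const embedding = await ai.embed({
  provider: 'openai',
  model: 'text-embedding-3-small',
  input: 'client-side AI'
});

// Process several requests in one call (item shape assumed to match chat())
const results = await ai.batch([
  { provider: 'openai', model: 'gpt-4o-mini', messages: [{ role: 'user', content: 'Hello' }] },
  { provider: 'openai', model: 'gpt-4o-mini', messages: [{ role: 'user', content: 'Goodbye' }] }
]);

// Inspect accumulated usage, then release all resources
console.log(ai.getUsageStats());
ai.destroy();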
★ Showcase Application
Smart Support Dashboard
Explore a production-ready reference implementation featuring intelligent ticket routing, sentiment analysis, and auto-response drafting — all powered by client-side AI.