NEW: Multi-Model AI Integration

FABRICON

AI agents with composable intelligence - the first framework combining GPT-4, Claude, and custom models for complex creation tasks. Production-ready, TypeScript-first architecture.

$ npm install @fabricon/core
$ fabricon --architecture
architecture.ts
// Fabricon's Multi-Model Architecture
// Intelligent routing based on task complexity

interface FabriconCore {
  // Model orchestration
  router: ModelRouter;
  
  // Context management
  contexts: ContextManager;
  
  // Memory persistence
  memory: PersistentMemory;
  
  // Action handlers
  actions: ActionRegistry;
}

// Automatic model selection
class ModelRouter {
  // Optional user-supplied model used as the fallback
  private customModel?: AIModel;

  selectModel(task: Task): AIModel {
    if (task.type === 'code_generation') return GPT4;
    if (task.type === 'analysis') return Claude;
    if (task.type === 'research') return GPT5;
    return this.customModel || GPT4;
  }
}

// Context isolation & composition
const fabricon = createFabricon({
  contexts: [
    codeContext.use([memoryContext, toolsContext]),
    analysisContext.use([documentContext])
  ]
});
$ fabricon --capabilities
[MULTI-MODEL]

Intelligent Model Selection

Automatically routes tasks to the optimal AI model - GPT-4 for code, Claude for analysis, custom models for specialized tasks

[STATEFUL]

Persistent Memory

Each agent remembers context across sessions - true stateful intelligence for long-running projects

[COMPOSABLE]

Context Composition

Combine isolated workspaces for modular agent behaviors - build complex systems from simple components

[TYPESCRIPT]

Type-Safe Throughout

Full IntelliSense for contexts, actions, memory, and schemas - catch errors before runtime

[STREAMING]

Real-Time Streaming

Stream responses token-by-token for immediate feedback - perfect for interactive applications

[PARALLEL]

Parallel Execution

Run multiple AI models simultaneously and combine their outputs for enhanced results - a short sketch follows below
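Both the streaming and parallel cards map onto AI SDK primitives that Fabricon's providers build on. A minimal sketch using the AI SDK's own streamText and generateText (this page does not document a Fabricon-specific streaming helper, so none is assumed here):

streaming-parallel.ts
// Sketch against the underlying AI SDK; no Fabricon-specific streaming API is assumed.
import { streamText, generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

// Real-time streaming: consume the response token-by-token
const stream = await streamText({
  model: openai('gpt-4o'),
  prompt: 'Explain context composition in two sentences.',
});
for await (const token of stream.textStream) {
  process.stdout.write(token);
}

// Parallel execution: run two models simultaneously and combine the outputs
const [draft, review] = await Promise.all([
  generateText({ model: openai('gpt-4o'), prompt: 'Draft an Express rate limiter.' }),
  generateText({ model: anthropic('claude-3-5-sonnet-latest'), prompt: 'List common rate-limiter pitfalls.' }),
]);
console.log(draft.text + '\n\n' + review.text);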

$ fabricon --quickstart
quickstart.ts
// Initialize Fabricon with GPT-4
import { context, createFabricon } from '@fabricon/core';
import { openai } from '@ai-sdk/openai';

// Simple agent (declared first so it can be registered below)
const chatContext = context({
  type: "chat",
  create: () => ({ messages: [] }),
  setActions: (state) => ({
    sendMessage: async (content: string) => {
      state.messages.push({ role: 'user', content });
      const response = await fabricon.generate(content);
      state.messages.push({ role: 'assistant', content: response });
      return response;
    }
  })
});

const fabricon = createFabricon({
  model: openai('gpt-4o'),
  contexts: [chatContext],
});

// Use the agent
await chatContext.actions.sendMessage("Create a React component");
💡

Context-Based Architecture

Fabricon uses isolated contexts that can be composed together. Each context has its own state, actions, and can leverage different AI models automatically.
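A small sketch of that composition, reusing the context() shape from the quickstart above; the memory and code contexts here are illustrative stand-ins, not contexts shipped with Fabricon:

composition.ts
// Illustrative contexts only; the context() shape follows the quickstart above.
import { context } from '@fabricon/core';

// An isolated memory workspace with its own state and actions
const memoryContext = context({
  type: "memory",
  create: () => ({ notes: [] as string[] }),
  setActions: (state) => ({
    remember: async (note: string) => { state.notes.push(note); },
  }),
});

// A code workspace that composes the memory context as a dependency
const codeContext = context({
  type: "code",
  create: () => ({ files: {} as Record<string, string> }),
  setActions: (state) => ({
    writeFile: async (path: string, contents: string) => { state.files[path] = contents; },
  }),
}).use([memoryContext]);

// Registering the composed context works as in the quickstart:
// createFabricon({ model, contexts: [codeContext] })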

$ fabricon --models
→ GPT-4 Turbo: Advanced reasoning and code generation. Primary model for complex development tasks.
→ Claude 3.5 Sonnet: Superior analysis and documentation. Excels at understanding existing codebases.
→ GPT-5 Preview: Next-generation capabilities. Early access for cutting-edge AI features.
→ Gemini Pro: Multimodal processing. Handles images, video, and complex data formats.
→ DeepSeek Coder: Specialized code model. Optimized for programming languages and algorithms.
→ Custom Models: Bring your own. Connect any AI SDK compatible model or fine-tuned variant.
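For reference, the model handles named in the architecture sketch are ordinary AI SDK handles. A sketch of how they might be constructed (how they are registered with Fabricon's ModelRouter is not spelled out on this page, so only the handles are shown):

models.ts
// Illustrative model handles built with real AI SDK providers; registration
// with Fabricon's router is assumed to follow the architecture sketch above.
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { google } from '@ai-sdk/google';

const GPT4 = openai('gpt-4-turbo');
const Claude = anthropic('claude-3-5-sonnet-latest');
const Gemini = google('gemini-1.5-pro');
// GPT-5 Preview, DeepSeek Coder, and fine-tuned variants attach the same way
// through their own AI SDK providers.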
$ fabricon --examples
→ Code Generation: Multi-model agent that combines GPT-4 for architecture, Claude for review
→ Smart Contract Builder: Solidity generation with automatic security audits and gas optimization
→ API Development: Complete REST API generation with authentication and documentation
→ Database Schema Designer: Intelligent database modeling with relationship optimization
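The Code Generation example's draft-then-review pattern, sketched here with plain AI SDK calls rather than the packaged Fabricon example:

code-generation.ts
// Bare-bones sketch of the draft/review pattern using the AI SDK directly.
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

// 1. GPT-4 drafts the architecture and implementation
const { text: draft } = await generateText({
  model: openai('gpt-4-turbo'),
  prompt: 'Design and implement a rate-limiting middleware for Express in TypeScript.',
});

// 2. Claude reviews the draft for bugs and edge cases
const { text: review } = await generateText({
  model: anthropic('claude-3-5-sonnet-latest'),
  system: 'You are a strict code reviewer. Point out bugs, edge cases, and security issues.',
  prompt: draft,
});

console.log(review);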
$ fabricon --api
createFabricon(config)

Initialize a new Fabricon instance with AI model configuration, contexts, and optional memory persistence.

config: FabriconConfig
context(definition)

Create an isolated context with state management, actions, and optional composition with other contexts.

definition: ContextDefinition
fabricon.generate(prompt, options)

Generate content using the configured AI model with automatic model selection based on task complexity.

prompt: string, options: GenerateOptions
context.use(dependencies)

Compose multiple contexts together. Child contexts inherit state and actions from parent contexts.

dependencies: Context[]
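The four calls above wired together in one place; only the signatures documented here are used, and the options argument to generate is omitted because GenerateOptions fields are not listed on this page:

api-usage.ts
// Combined usage of the documented API surface; contexts are illustrative.
import { context, createFabricon } from '@fabricon/core';
import { openai } from '@ai-sdk/openai';

// context(definition)
const docsContext = context({
  type: "docs",
  create: () => ({ pages: [] as string[] }),
  setActions: (state) => ({
    addPage: async (page: string) => { state.pages.push(page); },
  }),
});

// context.use(dependencies)
const researchContext = context({
  type: "research",
  create: () => ({ sources: [] as string[] }),
  setActions: () => ({}),
}).use([docsContext]);

// createFabricon(config)
const fabricon = createFabricon({
  model: openai('gpt-4o'),
  contexts: [researchContext],
});

// fabricon.generate(prompt, options)
const summary = await fabricon.generate('Summarize the collected sources.');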
$ fabricon --providers
OpenAI · Anthropic · Google · Groq · Mistral · DeepSeek · Cohere · Replicate

+ all AI SDK providers
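Swapping providers is a one-line change to the model handle. For example (the Groq and Mistral packages and model ids below are standard AI SDK usage, not Fabricon-specific):

providers.ts
// Any AI SDK provider drops into createFabricon's model field.
import { groq } from '@ai-sdk/groq';
import { mistral } from '@ai-sdk/mistral';

const fastModel = groq('llama-3.3-70b-versatile');
const mistralModel = mistral('mistral-large-latest');
// createFabricon({ model: fastModel, contexts: [...] }) works exactly like
// the OpenAI quickstart above.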