DominoIQ is an AI capability integrated into HCL Domino that works with Large Language Models (LLMs) and runs them locally, directly on the Domino server.
1. General Information
1.1 What is an LLM (Large Language Model)?
- An advanced AI system trained on massive amounts of text data
- Answers questions and summarizes texts
- Generates creative content
- Provides support for tasks through natural language processing
- Output quality depends heavily on choosing the right LLM for the task
1.2 EU AI Act - Important for Companies
- EU regulatory framework for controlling AI technologies
- In force since August 1, 2024
- AI literacy training for employees mandatory from February 2, 2025
- Risk reduction through better training measures
- Competitive advantages through competent use of AI
2. What is DominoIQ?
2.1 Core Features
- AI solution integrated into Domino
- Uses standard LLMs with local execution
- Works without an external connection at runtime
- Seamlessly integrated into Domino with a native Notes (NRPC) transaction
- Provides support for LotusScript and Java classes (NotesLLMRequest, NotesLLMResponse)
2.2 Technical Implementation
- DominoIQ server with NRPC "LLM" transaction
- llama.cpp server (the same open-source project that Ollama builds on)
- Connection between the Domino core and the llama.cpp server via localhost (127.0.0.1)
- Requires models in GGUF format
- The llama.cpp server exposes an OpenAI-compatible REST API (currently implemented request type: /v1/chat/completions)
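Because the endpoint is OpenAI-compatible, a request to it is an ordinary JSON POST against localhost. A minimal Java sketch of building such a request body (the model name, port 8080, and prompt texts are illustrative assumptions, not values mandated by DominoIQ; the actual port comes from the configuration document):

```java
// Sketch: assembling an OpenAI-style /v1/chat/completions request body
// for the local llama.cpp server. Strings are escaped by hand to keep
// the example dependency-free; a real agent would use a JSON library.
public class ChatRequestSketch {

    // Minimal JSON string escaping sufficient for these example values.
    static String esc(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    static String buildBody(String model, String systemPrompt, String userPrompt) {
        return "{"
            + "\"model\":\"" + esc(model) + "\","
            + "\"messages\":["
            + "{\"role\":\"system\",\"content\":\"" + esc(systemPrompt) + "\"},"
            + "{\"role\":\"user\",\"content\":\"" + esc(userPrompt) + "\"}"
            + "]}";
    }

    public static void main(String[] args) {
        // llama.cpp listens on localhost only; 8080 is a placeholder port.
        String endpoint = "http://127.0.0.1:8080/v1/chat/completions";
        String body = buildBody("Llama-3.2-3B-Instruct",
                "You are a concise assistant.",
                "Summarize this document in one sentence.");
        System.out.println(endpoint);
        System.out.println(body);
    }
}
```

The same payload shape is what the NRPC "LLM" transaction ultimately forwards to the local server.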
3. DominoIQ - Configuration (all details in the HCL documentation)
3.1 Domino Directory Profile Setup
- Central configuration of all DominoIQ servers within a Domino domain
- Setup directly in the Domino Directory Profile (names.nsf)
- Definition of the responsible administration server
- Registration of all servers running DominoIQ
3.2 Database Setup
- dominoiq.nsf is automatically created by Domino
- Serves as the configuration database for one or more servers
- Must be present on every DominoIQ server
- Contains all required configurations, commands, system prompts, and models
3.2.1 Model Configuration
- Automatic download and management of LLM models in GGUF format
- Done via an "LLM Model" document in dominoiq.nsf
3.2.2 System Prompts
- Creation of a "System Prompt" document
- Instructs the DominoIQ AI inference engine on how to formulate queries
- Helps generate precise answers
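As an illustration, the body of a System Prompt document might read like this (the wording is a hypothetical example, not taken from the HCL documentation):

```
You are an assistant for Domino applications. Answer in the language of
the question, keep answers under three sentences, and reply "I don't
know" when the provided text does not contain the answer.
```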
3.2.3 Commands
- Assignment of LLM system prompts to DominoIQ servers
- Creation of an "LLM Command" document
3.2.4 Configuration
- Creation of a "DominoIQ Configuration" document
- Links a Domino server with an LLM model
3.2.5 Remote Configuration
- Creation of a "DominoIQ Configuration" document in remote mode
- Allows connecting a Domino server to an externally hosted LLM endpoint
3.3 Model Download
- Download your preferred LLM (huggingface.co is the largest hub for LLMs)
- DominoIQ (llama.cpp server) requires models in GGUF format
- The user interface provides options for selecting LLMs available as GGUF files
- Recommended for initial tests: Llama-3.2-3B-Instruct-GGUF
3.3.1 Recommended Models for Different Use Cases
| Use Case | Recommended Model |
| --- | --- |
| Conversational Agents & Chatbots | WizardLM 2 7B |
| Content Creation & Creative Writing | phi-4 |
| Summarization & Abstraction | Meta-Llama 3.1 8B Instruct |
| Code Generation & Programming Assistance | CodeLlama 7B KStack |
| Translation & Localization | EuroLLM 9B Instruct |
3.4 Additional Configuration Steps
3.4.1 Model Configuration
- Name of the model
- Short description of the model
- GGUF model file name on the local drive
- Storage location of the GGUF files: "/DOMINODATA/llm_models" directory
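As a hypothetical illustration, an "LLM Model" document for the test model recommended above could carry values like these (the quantized file name is an assumption; use the GGUF file you actually downloaded):

```
Model name:      Llama-3.2-3B-Instruct
Description:     Small instruct-tuned model for initial tests
GGUF file name:  Llama-3.2-3B-Instruct-Q4_K_M.gguf  (in /DOMINODATA/llm_models)
```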
3.4.2 Server Configuration
- Creation of a configuration document for each DominoIQ server
- Definition of the LLM model to be used by the server
- Configuration of the port for the localhost connection
3.4.3 Remote Server Configuration (optional)
- URL of the external LLM server
- API key for connecting to the LLM server
- Manual generation of a secure key value (bearer token for the REST API)
3.4.4 TLS Configuration (optional)
- DominoIQ offers full TLS support
- Uses CertMgr TLS credentials
- CertMgr now supports "localhost"
- Localhost certificates can be created with MicroCA
4. Documentation and Troubleshooting
- Documentation: Official DominoIQ Server Documentation
- Troubleshooting: Troubleshooting Tips
- Development Reference: NotesLLMRequest Class