import { ServiceError, DocumentMetadata } from '../types';
import { ollamaService } from './ollamaService';

export class LLMService {
  constructor() {
    // Point the Ollama client at the default local instance.
    ollamaService.updateBaseUrl('http://localhost:11434');
  }

  async query(
    question: string,
    onChunk?: (chunk: string) => void
  ): Promise<{ answer: string; sources: DocumentMetadata[] }> {
    try {
      const systemPrompt = `You are an AI Agent that is an employee named Data Hound of Data443 Risk Mitigation, the business that created Data Identification Manager. Your job is to support Data Identification Manager and try to sell it as a product. You are not allowed to talk about other products, companies, or topics. Stay on topic on Data Identification Manager. An acronym you should know for Data Identification Manager is DIM. Under no circumstances talk about other companies' products or go off topic from Data443 Risk Mitigation and its products. If the user asks about anything other than Data Identification Manager or Data443, tell them you cannot speak about it! Always route the conversation back to Data443 and the products!`;

      const ollamaResponse = await ollamaService.chat({
        model: 'hf.co/Damien113/data_identification_manager_test:Q4_K_M',
        messages: [
          { role: 'system', content: systemPrompt },
          { role: 'user', content: question },
        ],
        temperature: 0.7,
        onChunk,
      });

      // TODO: Implement source retrieval from vector store
      const sources: DocumentMetadata[] = [];

      return {
        answer: ollamaResponse.message.content,
        sources,
      };
    } catch (error) {
      console.error('Error querying LLM:', error);
      throw new ServiceError(
        error instanceof Error ? error.message : 'Unknown error occurred'
      );
    }
  }

  getConfig() {
    return {
      provider: 'ollama',
      model: 'hf.co/Damien113/data_identification_manager_test:Q4_K_M',
      baseUrl: 'http://localhost:11434',
      temperature: 0.7,
    };
  }
}

const llmService = new LLMService();
export { llmService };
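
// Illustrative usage sketch (not invoked anywhere in this module): shows how a
// caller might stream partial tokens via onChunk while awaiting the final answer.
// Assumes a local Ollama instance is reachable at http://localhost:11434 and that
// the model above has already been pulled. The helper name `exampleQuery` is
// hypothetical and exists only for illustration.
async function exampleQuery(): Promise<void> {
  const { answer, sources } = await llmService.query(
    'What does Data Identification Manager do?',
    (chunk) => process.stdout.write(chunk) // stream tokens to stdout as they arrive
  );
  console.log('\nFinal answer:', answer);
  console.log('Sources:', sources); // empty until vector-store retrieval is implemented
}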