fix
This commit is contained in:
parent 086e311aa4
commit 240d8ece6a

DEVELOPMENT.md (261 lines, deleted)
@@ -1,261 +0,0 @@
# Data Hound Development Guide

This document provides detailed technical information for developers working on the Data Hound project. It covers architecture, development workflows, and best practices.

## Architecture Overview

### Main Process (Electron)

The main process (`electron/`) handles:

- File system operations (`services/fileSystem.ts`)
- LLM service integration (`services/llmService.ts`, `services/ollamaService.ts`)
- Vector store management (`services/vectorStore.ts`)
- IPC communication (`ipc/handlers.ts`)
- Application state persistence (`store.ts`)

#### Key Services

1. **File System Service** (`fileSystem.ts`)
   - Handles file indexing and monitoring
   - Manages file metadata extraction
   - Implements file type detection and parsing

2. **LLM Service** (`llmService.ts`)
   - Manages LLM provider connections
   - Handles prompt engineering
   - Implements response streaming

3. **Vector Store** (`vectorStore.ts`)
   - Manages ChromaDB integration
   - Handles document embeddings
   - Implements semantic search functionality

### Renderer Process (React)

The renderer process (`src/`) is organized into:

- Components (`components/`)
- Contexts (`contexts/`)
- Custom hooks (`hooks/`)
- Type definitions (`electron.d.ts`)

#### Key Components

1. **ChatPanel**
   - Handles user queries and LLM responses
   - Manages conversation history
   - Implements message rendering

2. **FileExplorer**
   - Directory selection and navigation
   - File list visualization
   - File metadata display

3. **ScanningPanel**
   - Progress visualization for file scanning
   - Status updates
   - Error handling

## Adding New Features

### Adding a New LLM Provider

1. Create a new service in `electron/services/`
2. Implement the provider interface:
   ```typescript
   interface LLMProvider {
     initialize(): Promise<void>;
     query(prompt: string): Promise<string>;
     streamResponse(prompt: string): AsyncGenerator<string>;
   }
   ```
3. Add provider configuration to `store.ts`
4. Update the settings UI in `SettingsPanel`
5. Add the provider to the LLM service factory (see the sketch below)
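
A rough illustration of step 5. The actual factory is not shown in this guide, so the module paths and class names here are hypothetical; only the `LLMProvider` interface comes from the snippet above:

```typescript
import { OllamaProvider } from './ollamaProvider'; // hypothetical modules
import { OpenAIProvider } from './openaiProvider';

// Map a provider id from the stored config to a concrete implementation.
// LLMProvider is the interface defined in step 2 above.
function createProvider(name: string): LLMProvider {
  switch (name) {
    case 'ollama':
      return new OllamaProvider();
    case 'openai':
      return new OpenAIProvider();
    default:
      throw new Error(`Unknown LLM provider: ${name}`);
  }
}
```
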
### Adding File Type Support

1. Update `fileSystem.ts` with new file type detection
2. Implement parsing logic in a new service (sketched below)
3. Add metadata extraction
4. Update the vector store schema if needed
5. Add UI support in FileExplorer
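
A minimal sketch of steps 2 and 3. The `FileParser` shape and the Markdown example are assumptions for illustration, not the project's actual API:

```typescript
import { promises as fs } from 'fs';

// Hypothetical parser contract for a new file type (Markdown as the example).
interface FileParser {
  extensions: string[];                      // extensions this parser claims
  parse(filePath: string): Promise<string>;  // extracted plain text
  extractMetadata(filePath: string): Promise<Record<string, unknown>>;
}

const markdownParser: FileParser = {
  extensions: ['.md', '.markdown'],
  async parse(filePath) {
    // Markdown is already text; a real parser might strip formatting here.
    return fs.readFile(filePath, 'utf-8');
  },
  async extractMetadata(filePath) {
    const stats = await fs.stat(filePath);
    return { size: stats.size, modified: stats.mtime.toISOString() };
  },
};
```
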
### Adding a New Panel

1. Create component in `src/components/` (see the sketch below)
2. Add routing in `App.tsx`
3. Implement required hooks
4. Add IPC handlers if needed
5. Update navigation
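
A minimal skeleton for step 1. The panel name and contents are placeholders; the MUI imports mirror what the existing panels use:

```tsx
import React from 'react';
import { Box, Typography } from '@mui/material';

// Hypothetical new panel; wire it into App.tsx routing and the navigation.
export default function ExamplePanel() {
  return (
    <Box sx={{ p: 3 }}>
      <Typography variant="h6" gutterBottom>Example Panel</Typography>
      {/* Panel content goes here */}
    </Box>
  );
}
```
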
## Development Workflows

### Local Development

1. Start Electron development:
   ```bash
   npm run dev
   ```
   This runs:
   - Vite dev server for React
   - Electron with hot reload
   - TypeScript compilation in watch mode

2. Debug main process:
   - Use VSCode launch configuration
   - Console logs appear in terminal
   - Breakpoints work in VSCode

3. Debug renderer process:
   - Use Chrome DevTools (Cmd/Ctrl+Shift+I)
   - React DevTools available
   - Network tab shows IPC calls

### Testing

1. Unit Tests:
   - Located in `__tests__` directories
   - Run with `npm test` (see the example after this list)
   - Focus on service logic

2. Integration Tests:
   - Test IPC communication
   - Verify file system operations
   - Check LLM integration

3. E2E Tests:
   - Use Playwright
   - Test full user workflows
   - Verify cross-platform behavior
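
The guide does not pin down a test runner, so the following unit-test sketch assumes a Jest-style setup; the file path and assertions are illustrative only:

```typescript
// __tests__/llmService.test.ts (hypothetical example, assuming Jest)
import { LLMService } from '../services/llmService';

describe('LLMService', () => {
  it('reports a usable configuration', () => {
    const service = new LLMService();
    const config = service.getConfig();
    expect(config.provider).toBeDefined();
    expect(typeof config.baseUrl).toBe('string');
  });
});
```
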
## Best Practices

### TypeScript

1. Use strict type checking
2. Define interfaces for all IPC messages
3. Avoid `any` - use proper types
4. Use discriminated unions for state (example below)
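
For item 4, a discriminated union lets the compiler enforce which fields exist in each state. `ScanState` here is illustrative, not a type from the codebase:

```typescript
// Each variant is discriminated by its `status` literal, so accessing
// `data` or `error` outside the matching branch is a compile error.
type ScanState =
  | { status: 'idle' }
  | { status: 'scanning'; progress: number }
  | { status: 'done'; data: string[] }
  | { status: 'error'; error: string };

function describeScan(state: ScanState): string {
  switch (state.status) {
    case 'idle': return 'Waiting to start';
    case 'scanning': return `Scanning... ${state.progress}%`;
    case 'done': return `Found ${state.data.length} files`;
    case 'error': return `Failed: ${state.error}`;
  }
}
```
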
### React Components

1. Use functional components
2. Implement proper error boundaries
3. Memoize expensive computations (example below)
4. Use proper prop types
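
Item 3 in practice: `useMemo` caches a derived value, and `React.memo` skips re-rendering a component whose props have not changed. Component names here are illustrative:

```tsx
import React, { useMemo } from 'react';

// Re-renders only when `files` actually changes.
const FileCount = React.memo(function FileCount({ files }: { files: string[] }) {
  return <span>{files.length} files</span>;
});

function FileList({ files, query }: { files: string[]; query: string }) {
  // Recomputed only when `files` or `query` changes, not on every render.
  const visible = useMemo(
    () => files.filter(f => f.includes(query)),
    [files, query]
  );
  return <ul>{visible.map(f => <li key={f}>{f}</li>)}</ul>;
}
```
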
### Electron

1. Validate IPC messages (example below)
2. Handle window state properly
3. Implement proper error handling
4. Use proper security practices
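
For item 1, handlers should treat renderer input as untrusted. A minimal guard might look like this; the channel name is illustrative:

```typescript
import { ipcMain } from 'electron';

ipcMain.handle('add-watched-directory', async (_event, dirPath: unknown) => {
  // Validate the payload before touching the file system.
  if (typeof dirPath !== 'string' || dirPath.length === 0) {
    throw new Error('add-watched-directory expects a non-empty string path');
  }
  // ...proceed with the validated path
  return { success: true };
});
```
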
### State Management

1. Use contexts for shared state
2. Implement proper loading states
3. Handle errors gracefully
4. Use proper TypeScript types

## Common Tasks

### Adding an IPC Handler

1. Define types in `preload-types.ts`:
   ```typescript
   interface IPCHandlers {
     newHandler: (arg: ArgType) => Promise<ReturnType>;
   }
   ```

2. Implement handler in `ipc/handlers.ts`:
   ```typescript
   ipcMain.handle('newHandler', async (event, arg: ArgType) => {
     // Implementation
   });
   ```

3. Add to preload script:
   ```typescript
   newHandler: (arg: ArgType) => ipcRenderer.invoke('newHandler', arg)
   ```

4. Use in renderer:
   ```typescript
   const result = await window.electron.newHandler(arg);
   ```

### Updating the Database Schema

1. Create migration in `electron/services/vectorStore.ts` (sketched below)
2. Update type definitions
3. Implement data migration
4. Update queries
5. Test migration
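
`vectorStore.ts` is not shown here, so this is only a shape for what steps 1 through 3 might look like, assuming documents carry a schema-version field:

```typescript
// Hypothetical migration step: bump a stored document from schema v1 to v2.
interface StoredDocument {
  id: string;
  schemaVersion: number;
  [key: string]: unknown;
}

function migrateToV2(doc: StoredDocument): StoredDocument {
  if (doc.schemaVersion >= 2) return doc;
  return {
    ...doc,
    schemaVersion: 2,
    // New field introduced by the v2 schema, defaulted for old records.
    language: 'unknown',
  };
}
```
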
### Adding Settings

1. Add to store schema in `store.ts` (sketched below)
2. Update settings component
3. Implement validation
4. Add migration if needed
5. Update relevant services
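
Assuming `store.ts` wraps electron-store, adding a setting amounts to a schema entry plus a typed accessor. The key name here is illustrative:

```typescript
import Store from 'electron-store';

// Hypothetical new setting: how often to rescan watched directories.
const store = new Store({
  schema: {
    rescanIntervalMinutes: {
      type: 'number',
      minimum: 1,
      default: 30,
    },
  },
});

export function getRescanInterval(): number {
  return store.get('rescanIntervalMinutes') as number;
}
```
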
## Troubleshooting

### Common Issues

1. **IPC Communication Failures**
   - Check handler registration
   - Verify type definitions
   - Check error handling

2. **File System Issues**
   - Verify permissions
   - Check path handling
   - Validate file operations

3. **LLM Integration**
   - Verify API keys
   - Check network connectivity
   - Validate response handling

### Performance Optimization

1. **Main Process**
   - Profile file system operations
   - Optimize database queries
   - Implement proper caching

2. **Renderer Process**
   - Use React.memo for expensive components
   - Implement virtual scrolling
   - Optimize re-renders

## Release Process

1. Update version in `package.json`
2. Run full test suite
3. Build production version
4. Test packaged application
5. Create release notes
6. Tag release in git
7. Build installers
8. Publish release

## Contributing

1. Fork the repository
2. Create a feature branch
3. Follow the code style
4. Add tests
5. Submit a pull request

Remember to:
- Follow TypeScript best practices
- Add proper documentation
- Include tests
- Update this guide as needed

Modelfile (3 lines, new file)
@@ -0,0 +1,3 @@
FROM deepseek-r1:8b
SYSTEM You are a helpful AI assistant.
PARAMETER num_gpu 0
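If this Modelfile is used directly, it would typically be registered as a local model with `ollama create <model-name> -f Modelfile`. Note that `PARAMETER num_gpu 0` asks Ollama to keep all layers on the CPU instead of offloading them to a GPU, trading speed for lower hardware requirements.
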
README.md (36 lines, deleted)
@@ -1,36 +0,0 @@
This is a [Next.js](https://nextjs.org) project bootstrapped with [`create-next-app`](https://nextjs.org/docs/app/api-reference/cli/create-next-app).

## Getting Started

First, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

This project uses [`next/font`](https://nextjs.org/docs/app/building-your-application/optimizing/fonts) to automatically optimize and load [Geist](https://vercel.com/font), a new font family for Vercel.

## Learn More

To learn more about Next.js, take a look at the following resources:

- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.

You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js) - your feedback and contributions are welcome!

## Deploy on Vercel

The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.

Check out our [Next.js deployment documentation](https://nextjs.org/docs/app/building-your-application/deploying) for more details.

ai-profile.webp (binary file, deleted; 398 KiB; binary file not shown)
@@ -67,17 +67,6 @@ export function setupIpcHandlers() {
     }
   });
 
-  ipcMain.handle('update-llm-config', async (_: unknown, config: LLMConfig) => {
-    try {
-      await (await import('../services/llmService')).llmService.updateConfig(config);
-      return { success: true };
-    } catch (error) {
-      const err = error as Error;
-      console.error('Error updating LLM config:', err);
-      return { success: false, error: err.message };
-    }
-  });
-
   ipcMain.handle('get-llm-config', async () => {
     try {
       const config = (await import('../services/llmService')).llmService.getConfig();
@@ -89,14 +78,52 @@ export function setupIpcHandlers() {
     }
   });
 
-  ipcMain.handle('get-ollama-models', async () => {
+  // Model Operations
+  ipcMain.handle('check-model', async (_: unknown, modelName: string) => {
     try {
-      const models = await (await import('../services/llmService')).llmService.getOllamaModels();
-      return { success: true, data: models };
+      const status = await (await import('../services/ollamaService')).ollamaService.checkModel(modelName);
+      return status;
     } catch (error) {
       const err = error as Error;
-      console.error('Error getting Ollama models:', err);
-      return { success: false, error: err.message };
+      console.error('Error checking model:', err);
+      throw err;
+    }
+  });
+
+  ipcMain.handle('pull-model', async (event, modelName: string) => {
+    try {
+      console.log('Starting model pull in IPC handler for:', modelName);
+      const webContents = event.sender;
+
+      // Create a wrapper for the progress callback that includes error handling
+      const progressCallback = (status: string) => {
+        try {
+          console.log('Sending progress update:', status);
+          webContents.send(`pull-model-progress-${modelName}`, status);
+        } catch (err) {
+          console.error('Error sending progress update:', err);
+        }
+      };
+
+      // Attempt to pull the model
+      await (await import('../services/ollamaService')).ollamaService.pullModel(
+        modelName,
+        progressCallback
+      );
+
+      // If we get here, the pull was successful and verified
+      console.log('Model pull completed successfully');
+      return { success: true };
+    } catch (error) {
+      const err = error as Error;
+      console.error('Error pulling model:', err);
+
+      // Return a structured error response
+      return {
+        success: false,
+        error: err.message || 'Failed to install model',
+        details: err instanceof Error ? err.stack : undefined
+      };
     }
   });
@@ -1,8 +1,9 @@
-import { app, BrowserWindow, ipcMain } from 'electron';
-import path from 'path';
-import os from 'os';
-import { store as electronStore } from './store';
-import { setupIpcHandlers } from './ipc/handlers';
+const { app, BrowserWindow, ipcMain } = require('electron');
+const path = require('path');
+const os = require('os');
+const spawn = require('child_process').spawn;
+const { store: electronStore } = require('./store');
+const { setupIpcHandlers } = require('./ipc/handlers');
 
 // Initialize IPC handlers immediately
 setupIpcHandlers();
@@ -20,7 +21,15 @@ function createWindow() {
       sandbox: false,
       webSecurity: true,
     },
-    show: false,
   });
+
+  // Enable logging
+  mainWindow.webContents.on('did-fail-load', (event, errorCode, errorDescription) => {
+    console.error('Failed to load:', errorCode, errorDescription);
+  });
+
+  mainWindow.webContents.on('console-message', (event, level, message) => {
+    console.log('Renderer Console:', message);
+  });
 
   // In development, use the Vite dev server
@@ -42,7 +51,44 @@ function createWindow() {
     pollDevServer();
   } else {
     // In production, load the built files
-    mainWindow.loadFile(path.join(__dirname, '../dist/index.html'));
+    const prodPath = path.join(__dirname, '../dist/index.html');
+    console.log('Production mode detected');
+    console.log('__dirname:', __dirname);
+    console.log('Attempting to load:', prodPath);
+
+    // Check if the file exists
+    try {
+      if (require('fs').existsSync(prodPath)) {
+        console.log('Found production build at:', prodPath);
+        mainWindow.loadFile(prodPath).catch(err => {
+          console.error('Failed to load production file:', err);
+          // Try alternative path
+          const altPath = path.join(process.cwd(), 'dist/index.html');
+          console.log('Trying alternative path:', altPath);
+          if (require('fs').existsSync(altPath)) {
+            mainWindow.loadFile(altPath).catch(err => {
+              console.error('Failed to load alternative path:', err);
+            });
+          } else {
+            console.error('Alternative path does not exist');
+          }
+        });
+      } else {
+        console.error('Production build not found at:', prodPath);
+        // Try alternative path
+        const altPath = path.join(process.cwd(), 'dist/index.html');
+        console.log('Trying alternative path:', altPath);
+        if (require('fs').existsSync(altPath)) {
+          mainWindow.loadFile(altPath).catch(err => {
+            console.error('Failed to load alternative path:', err);
+          });
+        } else {
+          console.error('Alternative path does not exist');
+        }
+      }
+    } catch (err) {
+      console.error('Error checking file existence:', err);
+    }
   }
 
   mainWindow.once('ready-to-show', () => {
@@ -115,4 +161,46 @@ ipcMain.handle('window-close', (event) => {
   window?.close();
 });
 
-export default app;
+ipcMain.handle('check-ollama', async () => {
+  const checkInstalled = () => {
+    return new Promise((resolve) => {
+      const check = spawn('ollama', ['--version']);
+      check.on('close', (code) => {
+        resolve(code === 0);
+      });
+    });
+  };
+
+  const checkRunning = async () => {
+    try {
+      const response = await fetch('http://localhost:11434/api/version');
+      return response.ok;
+    } catch (error) {
+      return false;
+    }
+  };
+
+  const startOllama = () => {
+    return new Promise<void>((resolve) => {
+      const start = spawn('ollama', ['serve']);
+      // Wait a bit for the server to start
+      setTimeout(resolve, 2000);
+    });
+  };
+
+  const installed = await checkInstalled();
+  let running = await checkRunning();
+
+  if (installed && !running) {
+    await startOllama();
+    running = await checkRunning();
+  }
+
+  return { installed, running };
+});
+
+ipcMain.handle('open-external', (_, url) => {
+  return require('electron').shell.openExternal(url);
+});
+
+module.exports = app;
@@ -1,4 +1,6 @@
-import { contextBridge, ipcRenderer } from 'electron';
+const { contextBridge, ipcRenderer } = require('electron');
+
+// Import types
 import type { LLMConfig, DocumentMetadata } from './types';
 
 interface Directory {
@@ -28,12 +30,8 @@ contextBridge.exposeInMainWorld('electron', {
     answer: string;
     sources: DocumentMetadata[];
   }> => ipcRenderer.invoke('query-llm', question),
-  updateLLMConfig: async (config: LLMConfig): Promise<void> =>
-    ipcRenderer.invoke('update-llm-config', config),
   getLLMConfig: async (): Promise<LLMConfig> =>
     ipcRenderer.invoke('get-llm-config'),
-  getOllamaModels: async (): Promise<IpcResponse<string[]>> =>
-    ipcRenderer.invoke('get-ollama-models'),
 
   // Vector Store Operations
   getDocuments: async (): Promise<IpcResponse<DocumentMetadata[]>> =>
@@ -67,10 +65,30 @@ contextBridge.exposeInMainWorld('electron', {
     ipcRenderer.removeListener(channel, callback);
   },
+
+  checkOllama: () => ipcRenderer.invoke('check-ollama'),
+  openExternal: (url: string) => ipcRenderer.invoke('open-external', url),
+
+  // Model Operations
+  checkModel: (modelName: string) => ipcRenderer.invoke('check-model', modelName),
+  pullModel: (modelName: string, onProgress: (status: string) => void) => {
+    const channel = `pull-model-progress-${modelName}`;
+    // Keep a reference to the registered wrapper so it can actually be removed
+    const listener = (_event: unknown, status: string) => onProgress(status);
+    ipcRenderer.on(channel, listener);
+    return ipcRenderer.invoke('pull-model', modelName).finally(() => {
+      ipcRenderer.removeListener(channel, listener);
+    });
+  },
 
   // Window Controls
   minimizeWindow: () => ipcRenderer.invoke('window-minimize'),
   maximizeWindow: () => ipcRenderer.invoke('window-maximize'),
   closeWindow: () => ipcRenderer.invoke('window-close'),
 });
 
+// Export types for TypeScript
 export type { Directory, IpcResponse };
+
+// For CommonJS compatibility
+if (typeof module !== 'undefined' && module.exports) {
+  module.exports = {};
+}
@@ -1,170 +1,28 @@
-import { ServiceError, LLMConfig, DocumentMetadata } from '../types';
-const { store } = require('../store');
-import OpenAI from 'openai';
-import { OpenRouter } from 'openrouter-client';
+import { ServiceError, DocumentMetadata } from '../types';
 import { ollamaService } from './ollamaService';
 
-type Message = { role: 'system' | 'user' | 'assistant'; content: string };
-
-interface OpenAIClient {
-  chat: {
-    completions: {
-      create: Function;
-    };
-  };
-}
-
-interface OpenRouterStreamResponse {
-  success: boolean;
-  data?: {
-    choices: Array<{ delta?: { content?: string }; message?: { content: string } }>;
-  };
-  errorCode?: number;
-  errorMessage?: string;
-}
-
-type OpenRouterConfig = {
-  temperature?: number;
-  model?: string;
-  stream?: boolean;
-};
-
-interface OpenRouterClient {
-  chat: (messages: Message[], config?: OpenRouterConfig) => Promise<OpenRouterStreamResponse>;
-}
-
 export class LLMService {
-  #config: LLMConfig;
-  #openaiClient: OpenAIClient | null;
-  #openrouterClient: OpenRouterClient | null;
-
   constructor() {
-    const storedConfig = store.get('llm_config');
-    this.#config = storedConfig || {
-      provider: 'ollama',
-      model: 'jimscard/blackhat-hacker:v2',
-      baseUrl: 'http://localhost:11434',
-      temperature: 0.7,
-      apiKey: null
-    };
-
-    // Ensure config is saved with defaults
-    store.set('llm_config', this.#config);
-    this.#openaiClient = null;
-    this.#openrouterClient = null;
-    this.#initializeClient();
-  }
-
-  /**
-   * @private
-   */
-  #initializeClient() {
-    switch (this.#config?.provider) {
-      case 'openai':
-        if (!this.#config.apiKey) {
-          throw new ServiceError('OpenAI API key is required');
-        }
-        this.#openaiClient = new OpenAI({
-          apiKey: this.#config.apiKey,
-          baseURL: this.#config.baseUrl,
-        });
-        break;
-      case 'openrouter':
-        if (!this.#config.apiKey) {
-          throw new ServiceError('OpenRouter API key is required');
-        }
-        this.#openrouterClient = new OpenRouter(this.#config.apiKey);
-        break;
-      case 'ollama':
-        if (this.#config.baseUrl) {
-          ollamaService.updateBaseUrl(this.#config.baseUrl);
-        }
-        break;
-    }
+    ollamaService.updateBaseUrl('http://localhost:11434');
   }
 
   async query(
     question: string,
     onChunk?: (chunk: string) => void
   ): Promise<{ answer: string, sources: DocumentMetadata[] }> {
-    if (!this.#config?.provider) {
-      throw new ServiceError('LLM provider not configured');
-    }
-
     try {
-      let response;
-
-      switch (this.#config.provider) {
-        case 'openai':
-          if (!this.#openaiClient) {
-            throw new ServiceError('OpenAI client not initialized');
-          }
-          const openaiResponse = await this.#openaiClient.chat.completions.create({
-            model: this.#config.model || 'gpt-3.5-turbo',
-            messages: [{ role: 'user', content: question }],
-            temperature: this.#config.temperature || 0.7,
-            stream: true,
-          });
-
-          let openaiText = '';
-          for await (const chunk of openaiResponse) {
-            const content = chunk.choices[0]?.delta?.content || '';
-            if (content) {
-              openaiText += content;
-              onChunk?.(content);
-            }
-          }
-          response = openaiText;
-          break;
-
-        case 'openrouter':
-          if (!this.#openrouterClient) {
-            throw new ServiceError('OpenRouter client not initialized');
-          }
-          const openrouterResponse = await this.#openrouterClient.chat(
-            [{ role: 'user', content: question }],
-            {
-              model: this.#config.model || 'openai/gpt-3.5-turbo',
-              temperature: this.#config.temperature || 0.7,
-              stream: true,
-            }
-          );
-          if (!openrouterResponse.success) {
-            throw new ServiceError(openrouterResponse.errorMessage || 'OpenRouter request failed');
-          }
-
-          let routerText = '';
-          for await (const chunk of openrouterResponse.data?.choices || []) {
-            const content = chunk.delta?.content || chunk.message?.content || '';
-            if (content) {
-              routerText += content;
-              onChunk?.(content);
-            }
-          }
-          response = routerText;
-          break;
-
-        case 'ollama':
-          const ollamaResponse = await ollamaService.chat({
-            model: this.#config.model || 'phi4:latest',
-            messages: [{ role: 'user', content: question }],
-            temperature: this.#config.temperature,
-            onChunk,
-          });
-          response = ollamaResponse.message.content;
-          break;
-
-        default:
-          throw new ServiceError(`Unsupported provider: ${this.#config.provider}`);
-      }
+      const ollamaResponse = await ollamaService.chat({
+        model: 'damien113/datahound:latest',
+        messages: [{ role: 'user', content: question }],
+        temperature: 0.7,
+        onChunk,
+      });
 
       /** @type {DocumentMetadata[]} */
       const sources = []; // TODO: Implement source retrieval from vector store
 
       return {
-        answer: response,
+        answer: ollamaResponse.message.content,
         sources,
       };
     } catch (error) {
@@ -175,60 +33,13 @@ export class LLMService {
     }
   }
 
-  /**
-   * @returns {LLMConfig}
-   */
   getConfig() {
-    return this.#config;
+    return {
+      provider: 'ollama',
+      model: 'damien113/datahound:latest',
+      baseUrl: 'http://localhost:11434',
+      temperature: 0.7
+    };
   }
-
-  /**
-   * @param {LLMConfig} newConfig - The new LLM configuration
-   */
-  async updateConfig(newConfig) {
-    // Validate required fields from schema
-    if (!newConfig.provider) {
-      throw new ServiceError('Provider is required');
-    }
-
-    // Clean config to only include allowed properties from schema
-    const cleanConfig = {
-      provider: newConfig.provider,
-      apiKey: newConfig.apiKey ?? null,
-      model: newConfig.model ?? (newConfig.provider === 'ollama' ? 'phi4' : null),
-      baseUrl: newConfig.provider === 'ollama' ? (newConfig.baseUrl ?? 'http://localhost:11434') : (newConfig.baseUrl ?? null),
-      temperature: typeof newConfig.temperature === 'number' ? newConfig.temperature : 0.7
-    };
-
-    // Validate provider-specific requirements
-    if (cleanConfig.provider !== 'ollama' && !cleanConfig.apiKey) {
-      throw new ServiceError(`${cleanConfig.provider} requires an API key`);
-    }
-
-    try {
-      store.set('llm_config', cleanConfig);
-      this.#config = cleanConfig;
-      this.#initializeClient();
-    } catch (error) {
-      throw new ServiceError(
-        error instanceof Error ? error.message : 'Failed to update config'
-      );
-    }
-  }
-
-  /**
-   * Get available models from Ollama server
-   * @returns {Promise<string[]>} List of model names
-   */
-  async getOllamaModels() {
-    try {
-      return await ollamaService.getModels();
-    } catch (error) {
-      console.error('Error fetching Ollama models:', error);
-      throw new ServiceError(
-        error instanceof Error ? error.message : 'Failed to fetch Ollama models'
-      );
-    }
-  }
 }
@@ -8,6 +8,11 @@ interface OllamaModel {
   digest: string;
 }
 
+interface OllamaModelStatus {
+  installed: boolean;
+  installing: boolean;
+}
+
 interface OllamaListResponse {
   models: Array<{
     name: string;
@@ -33,6 +38,7 @@ interface OllamaChatParams {
 
 class OllamaService {
   private baseUrl: string = 'http://127.0.0.1:11434';
+  private _lastProgress: number | null = null;
 
   private async makeRequest<T>(
     path: string,
@@ -70,8 +76,8 @@ class OllamaService {
       try {
         const chunkStr = chunk.toString();
 
-        if (path === '/api/chat') {
-          // Handle streaming chat response
+        if (path === '/api/chat' || path === '/api/pull') {
+          // Handle streaming responses
           streamBuffer += chunkStr;
           const lines = streamBuffer.split('\n');
 
@@ -82,18 +88,27 @@ class OllamaService {
 
           try {
             const parsed = JSON.parse(line);
-            if (parsed.message?.content && onChunk) {
+            if (path === '/api/chat' && parsed.message?.content && onChunk) {
               onChunk(parsed.message.content);
+            } else if (path === '/api/pull' && onChunk) {
+              if (parsed.status === 'success') {
+                onChunk('downloading: 100% complete');
+              } else if (parsed.total && parsed.completed !== undefined) {
+                const percentage = ((parsed.completed / parsed.total) * 100).toFixed(1);
+                onChunk(`downloading: ${percentage}% complete`);
+              } else if (parsed.status) {
+                onChunk(parsed.status);
+              }
             }
           } catch (e) {
-            console.warn('Failed to parse chat chunk:', { line, error: e });
+            console.warn('Failed to parse chunk:', { line, error: e });
          }
        }
 
        // Keep the last potentially incomplete line
        streamBuffer = lines[lines.length - 1];
      } else {
-        // For non-streaming endpoints
+        // For non-streaming endpoints, accumulate the entire response
        responseData += chunkStr;
      }
    } catch (e) {
@@ -103,26 +118,31 @@ class OllamaService {
 
     response.on('end', () => {
       try {
-        if (path === '/api/chat') {
-          // Handle any remaining data in the buffer
+        if (path === '/api/chat' || path === '/api/pull') {
+          // Handle any remaining data in the streaming buffer
           if (streamBuffer.trim()) {
             try {
               const parsed = JSON.parse(streamBuffer);
-              if (parsed.message?.content && onChunk) {
+              if (path === '/api/chat' && parsed.message?.content && onChunk) {
                 onChunk(parsed.message.content);
+              } else if (path === '/api/pull' && onChunk) {
+                if (parsed.status === 'success') {
+                  onChunk('downloading: 100% complete');
+                } else if (parsed.total && parsed.completed !== undefined) {
+                  const percentage = ((parsed.completed / parsed.total) * 100).toFixed(1);
+                  onChunk(`downloading: ${percentage}% complete`);
+                } else if (parsed.status) {
+                  onChunk(parsed.status);
+                }
               }
             } catch (e) {
-              console.warn('Failed to parse final chat chunk:', { buffer: streamBuffer, error: e });
+              console.warn('Failed to parse final chunk:', { buffer: streamBuffer, error: e });
             }
           }
-          resolve({
-            message: {
-              content: ''
-            }
-          } as T);
+          // Resolve streaming endpoints with success response
+          resolve({ success: true } as T);
         } else {
-          // For non-streaming endpoints
+          // For non-streaming endpoints, parse the accumulated response
           const trimmedResponse = responseData.trim();
           if (!trimmedResponse) {
             throw new Error('Empty response received');
@@ -217,6 +237,126 @@ class OllamaService {
   updateBaseUrl(baseUrl: string) {
     this.baseUrl = baseUrl;
   }
+
+  async checkModel(modelName: string): Promise<OllamaModelStatus> {
+    try {
+      const models = await this.getModels();
+      return {
+        installed: models.includes(modelName),
+        installing: false
+      };
+    } catch (error) {
+      console.error('Error checking model:', error);
+      throw new ServiceError(
+        error instanceof Error ? error.message : 'Failed to check model status'
+      );
+    }
+  }
+
+  async pullModel(modelName: string, onProgress?: (status: string) => void): Promise<void> {
+    try {
+      console.log('Starting model pull for:', modelName);
+
+      // Make a direct request using net.request to handle streaming properly
+      await new Promise<void>((resolve, reject) => {
+        const request = net.request({
+          url: `${this.baseUrl}/api/pull`,
+          method: 'POST',
+          headers: {
+            'Content-Type': 'application/json',
+          }
+        });
+
+        request.on('response', (response) => {
+          if (response.statusCode !== 200) {
+            reject(new Error(`HTTP error! status: ${response.statusCode}`));
+            return;
+          }
+
+          let buffer = '';
+          response.on('data', (chunk) => {
+            buffer += chunk.toString();
+            const lines = buffer.split('\n');
+            buffer = lines.pop() || ''; // Keep the last incomplete line
+
+            for (const line of lines) {
+              if (!line.trim()) continue;
+
+              try {
+                const data = JSON.parse(line);
+                console.log('Pull progress data:', data);
+
+                if (data.status === 'success') {
+                  if (onProgress) onProgress('downloading: 100% complete');
+                } else if (data.total && typeof data.completed === 'number') {
+                  // Round to nearest whole number to reduce update frequency
+                  const percentage = Math.round((data.completed / data.total) * 100);
+                  // Cache the last reported progress to avoid duplicate updates
+                  if (onProgress && (!this._lastProgress || percentage !== this._lastProgress)) {
+                    this._lastProgress = percentage;
+                    onProgress(`downloading: ${percentage}% complete`);
+                  }
+                } else if (data.status) {
+                  if (onProgress) onProgress(data.status);
+                }
+              } catch (e) {
+                console.warn('Failed to parse progress data:', e);
+              }
+            }
+          });
+
+          response.on('end', () => {
+            // Process any remaining data in buffer
+            if (buffer.trim()) {
+              try {
+                const data = JSON.parse(buffer);
+                if (data.status === 'success') {
+                  if (onProgress) onProgress('downloading: 100% complete');
+                }
+              } catch (e) {
+                console.warn('Failed to parse final data:', e);
+              }
+            }
+            resolve();
+          });
+
+          response.on('error', (error) => {
+            console.error('Response error:', error);
+            reject(error);
+          });
+        });
+
+        request.on('error', (error) => {
+          console.error('Request error:', error);
+          reject(error);
+        });
+
+        // Send the request with the model name
+        const body = JSON.stringify({ name: modelName });
+        console.log('Sending pull request with body:', body);
+        request.write(body);
+        request.end();
+      });
+
+      // After successful pull, verify the model exists
+      console.log('Pull completed, verifying model installation...');
+
+      // Give Ollama some time to process
+      await new Promise(resolve => setTimeout(resolve, 2000));
+
+      const models = await this.getModels();
+      console.log('Available models:', models);
+
+      if (!models.includes(modelName)) {
+        throw new ServiceError('Model pull completed but model is not available in Ollama');
+      }
+    } catch (error) {
+      console.error('Error pulling model:', error);
+      throw new ServiceError(
+        error instanceof Error ? error.message : 'Failed to pull model'
+      );
+    }
+  }
 }
 
 export const ollamaService = new OllamaService();
@@ -1,8 +1,8 @@
 {
   "compilerOptions": {
     "composite": true,
-    "target": "ESNext",
-    "module": "ESNext",
+    "target": "ES2020",
+    "module": "CommonJS",
     "moduleResolution": "node",
     "allowJs": true,
     "skipLibCheck": true,
@@ -4,6 +4,7 @@
     "composite": true,
     "noEmit": false,
     "target": "ES2020",
+    "module": "CommonJS",
     "allowJs": true,
     "skipLibCheck": true,
     "esModuleInterop": true,
@@ -1,9 +1,8 @@
 export interface LLMConfig {
-  provider: 'openai' | 'openrouter' | 'ollama';
-  apiKey?: string;
-  model?: string;
-  baseUrl?: string;
-  temperature?: number;
+  provider: 'ollama';
+  model: string;
+  baseUrl: string;
+  temperature: number;
 }
 
 export interface DocumentMetadata {
@@ -9,6 +9,7 @@ import HomePanel from './components/HomePanel';
 import ScanningPanel from './components/ScanningPanel';
 import ReportingPanel from './components/ReportingPanel';
 import { ElectronProvider } from './contexts/ElectronContext';
+import OllamaCheck from './components/OllamaCheck';
 
 const theme = createTheme({
   palette: {
@@ -277,9 +278,15 @@ function AppContent() {
 }
 
 function App() {
+  const [ollamaInstalled, setOllamaInstalled] = useState(false);
+
   return (
     <ElectronProvider>
-      <AppContent />
+      {!ollamaInstalled ? (
+        <OllamaCheck onInstalled={() => setOllamaInstalled(true)} />
+      ) : (
+        <AppContent />
+      )}
     </ElectronProvider>
   );
 }
electron-file-search/src/components/OllamaCheck.tsx (304 lines, new file)
@@ -0,0 +1,304 @@
import React, { useState, useEffect } from 'react';
import { Box, Button, Typography, CircularProgress, ThemeProvider, CssBaseline } from '@mui/material';
import { createTheme } from '@mui/material/styles';

const theme = createTheme({
  palette: {
    mode: 'dark',
    primary: {
      main: '#2196f3',
      light: '#64b5f6',
      dark: '#1976d2',
      contrastText: '#ffffff',
    },
    background: {
      default: '#121212',
      paper: '#1e1e1e',
    },
  },
});

interface OllamaCheckProps {
  onInstalled: () => void;
}

interface OllamaStatus {
  installed: boolean;
  running: boolean;
}

interface ModelStatus {
  installed: boolean;
  installing: boolean;
}

export function OllamaCheck({ onInstalled }: OllamaCheckProps) {
  const [ollamaStatus, setOllamaStatus] = useState<OllamaStatus | null>(null);
  const [modelStatus, setModelStatus] = useState<ModelStatus | null>(null);
  const [isChecking, setIsChecking] = useState(true);
  const [downloadProgress, setDownloadProgress] = useState<number>(0);
  const MODEL_NAME = 'damien113/datahound:latest';

  const checkOllama = async () => {
    try {
      const result = await window.electron.checkOllama();
      setOllamaStatus(result);
      if (result.installed && result.running) {
        checkModel();
      }
    } catch (error) {
      setOllamaStatus({ installed: false, running: false });
    } finally {
      setIsChecking(false);
    }
  };

  const checkModel = async () => {
    try {
      const status = await window.electron.checkModel(MODEL_NAME);
      setModelStatus(status);
      if (status.installed) {
        onInstalled();
      }
    } catch (error) {
      console.error('Error checking model:', error);
      setModelStatus({ installed: false, installing: false });
    }
  };

  const [installError, setInstallError] = useState<string | null>(null);

  const installModel = async () => {
    try {
      // Set initial installation state once at the start
      setModelStatus({ installed: false, installing: true });
      setInstallError(null);
      setDownloadProgress(0);
      let downloadComplete = false;

      await window.electron.pullModel(MODEL_NAME, (status) => {
        // Only update progress during download, avoid updating modelStatus
        const match = status.match(/downloading: ([\d.]+)%/);
        if (match) {
          const progress = parseFloat(match[1]);
          setDownloadProgress(progress);
        }

        if (status.includes('100% complete')) {
          downloadComplete = true;
          setDownloadProgress(100);
        }
      });

      // After pull completes, check the final status
      const status = await window.electron.checkModel(MODEL_NAME);
      if (status.installed) {
        // Show verification screen for a moment before transitioning
        await new Promise(resolve => setTimeout(resolve, 2000)); // 2 second delay
        setModelStatus({ installed: true, installing: false });
        onInstalled();
        return;
      }

      throw new Error(
        downloadComplete
          ? 'Model download completed but installation verification failed. Please check Ollama logs.'
          : 'Model installation failed. Please try again.'
      );
    } catch (error) {
      console.error('Error installing model:', error);
      setModelStatus({ installed: false, installing: false });
      setInstallError(error instanceof Error ? error.message : 'Unknown error occurred');
      setDownloadProgress(0);
    }
  };

  const openOllamaWebsite = () => {
    window.electron.openExternal('https://ollama.com');
  };

  useEffect(() => {
    // Initial check
    checkOllama();

    // Only poll if Ollama is not installed or not running
    const interval = setInterval(() => {
      if (!ollamaStatus?.installed || !ollamaStatus?.running) {
        checkOllama();
      }
    }, 5000);

    // Cleanup interval on unmount
    return () => clearInterval(interval);
  }, [ollamaStatus?.installed, ollamaStatus?.running]);

  // Debounce progress updates
  const [debouncedProgress, setDebouncedProgress] = useState(0);
  useEffect(() => {
    const timeoutId = setTimeout(() => {
      setDebouncedProgress(downloadProgress);
    }, 100); // Debounce time of 100ms
    return () => clearTimeout(timeoutId);
  }, [downloadProgress]);

  if (isChecking) {
    return (
      <Box sx={{
        display: 'flex',
        flexDirection: 'column',
        alignItems: 'center',
        justifyContent: 'center',
        height: '100vh',
        gap: 2,
        bgcolor: 'background.default',
        color: 'text.primary',
      }}>
        <CircularProgress sx={{ color: 'primary.main' }} />
        <Typography variant="h6" sx={{ color: 'text.primary' }}>
          Checking Ollama status...
        </Typography>
      </Box>
    );
  }

  if (!ollamaStatus?.installed) {
    return (
      <Box sx={{
        display: 'flex',
        flexDirection: 'column',
        alignItems: 'center',
        justifyContent: 'center',
        height: '100vh',
        gap: 2,
        bgcolor: 'background.default',
        color: 'text.primary',
      }}>
        <Typography variant="h5" sx={{ color: 'text.primary', mb: 1 }}>
          Ollama is not installed
        </Typography>
        <Typography sx={{ color: 'primary.main', mb: 2, textAlign: 'center' }}>
          Ollama is required to use the AI features of this application.
        </Typography>
        <Button
          variant="contained"
          onClick={openOllamaWebsite}
          color="primary"
          size="large"
          sx={{
            px: 4,
            py: 1,
            borderRadius: 2,
            textTransform: 'none',
            fontSize: '1.1rem',
          }}
        >
          Install Ollama
        </Button>
      </Box>
    );
  }

  if (!ollamaStatus.running) {
    return (
      <Box sx={{
        display: 'flex',
        flexDirection: 'column',
        alignItems: 'center',
        justifyContent: 'center',
        height: '100vh',
        gap: 2,
        bgcolor: 'background.default',
        color: 'text.primary',
      }}>
        <Typography variant="h5" sx={{ color: 'text.primary', mb: 1 }}>
          Starting Ollama
        </Typography>
        <Typography sx={{ color: 'primary.main', mb: 2, textAlign: 'center' }}>
          Attempting to start Ollama automatically...
          <br />
          This may take a few moments.
        </Typography>
        <CircularProgress size={24} sx={{ color: 'primary.main' }} />
      </Box>
    );
  }

  if (modelStatus === null || (!modelStatus.installed && !installError)) {
    return (
      <Box sx={{
        display: 'flex',
        flexDirection: 'column',
        alignItems: 'center',
        justifyContent: 'center',
        height: '100vh',
        gap: 2,
        bgcolor: 'background.default',
        color: 'text.primary',
      }}>
        <Typography variant="h5" sx={{ color: 'text.primary', mb: 1 }}>
          {modelStatus?.installing ? 'Installing Data Hound AI Model' : 'AI Model Required'}
        </Typography>
        {installError && (
          <Typography sx={{ color: 'error.main', mb: 2, textAlign: 'center' }}>
            {installError}
          </Typography>
        )}
        <Typography sx={{ color: 'primary.main', mb: 2, textAlign: 'center' }}>
          {modelStatus?.installing ? (
            <>
              Downloading {MODEL_NAME}...
              <br />
              {debouncedProgress.toFixed(1)}% complete
              {downloadProgress === 100 && (
                <Typography component="div" sx={{ color: 'text.secondary', mt: 1 }}>
                  Verifying installation...
                </Typography>
              )}
            </>
          ) : (
            <>
              The required model {MODEL_NAME} is not installed.
              <br />
              Click below to install it.
            </>
          )}
        </Typography>
        {modelStatus?.installing ? (
          <CircularProgress
            variant="determinate"
            value={debouncedProgress}
            size={50}
            sx={{ color: 'primary.main' }}
          />
        ) : (
          <Button
            variant="contained"
            onClick={installModel}
            color="primary"
            size="large"
            sx={{
              px: 4,
              py: 1,
              borderRadius: 2,
              textTransform: 'none',
              fontSize: '1.1rem',
            }}
          >
            Download Data Hound AI Model
          </Button>
        )}
      </Box>
    );
  }

  return null;
}

export default function OllamaCheckWithTheme(props: OllamaCheckProps) {
  return (
    <ThemeProvider theme={theme}>
      <CssBaseline />
      <OllamaCheck {...props} />
    </ThemeProvider>
  );
}
@@ -1,88 +1,14 @@
-import React, { useState, useEffect } from 'react';
+import React from 'react';
 import {
-  Button,
   TextField,
-  FormControl,
-  InputLabel,
-  Select,
-  MenuItem,
-  CircularProgress,
-  Alert,
   Box,
   Typography,
   Paper,
-  Divider,
   Switch,
   FormControlLabel,
 } from '@mui/material';
-import { useLLMConfig } from '../../hooks/useLLMConfig';
-import { useOllamaModels } from '../../hooks/useOllamaModels';
-import type { LLMConfig } from '../../../electron/types';
-
-const defaultConfig: LLMConfig = {
-  provider: 'ollama',
-  model: 'jimscard/blackhat-hacker:v2',
-  baseUrl: 'http://localhost:11434',
-  temperature: 0.7,
-  apiKey: undefined
-};
 
 export default function SettingsPanel() {
-  const { config, isLoading, error, updateConfig, reloadConfig } = useLLMConfig();
-  const ollamaModels = useOllamaModels();
-  const [formData, setFormData] = useState<LLMConfig>(defaultConfig);
-  const [isSaving, setIsSaving] = useState(false);
-
-  // Reload config when component becomes visible
-  useEffect(() => {
-    const handleVisibilityChange = () => {
-      if (document.visibilityState === 'visible') {
-        reloadConfig();
-      }
-    };
-
-    document.addEventListener('visibilitychange', handleVisibilityChange);
-    return () => {
-      document.removeEventListener('visibilitychange', handleVisibilityChange);
-    };
-  }, [reloadConfig]);
-
-  useEffect(() => {
-    if (!config) {
-      setFormData(defaultConfig);
-      return;
-    }
-
-    // Initialize form data with loaded config, only using defaults for missing values
-    setFormData({
-      provider: config.provider,
-      model: config.model || defaultConfig.model,
-      baseUrl: config.baseUrl || defaultConfig.baseUrl,
-      temperature: config.temperature ?? defaultConfig.temperature,
-      apiKey: config.apiKey
-    });
-  }, [config]);
-
-  const handleSubmit = async (e: React.FormEvent) => {
-    e.preventDefault();
-    try {
-      setIsSaving(true);
-      await updateConfig(formData);
-    } catch (error) {
-      console.error('Failed to save settings:', error);
-    } finally {
-      setIsSaving(false);
-    }
-  };
-
-  if (isLoading) {
-    return (
-      <Box sx={{ display: 'flex', justifyContent: 'center', p: 3 }}>
-        <CircularProgress />
-      </Box>
-    );
-  }
-
   return (
     <Box sx={{
       p: 3,
@@ -124,7 +50,7 @@ export default function SettingsPanel() {
       />
     </Paper>
 
-    {/* Search Settings */}
+    {/* Connections Settings */}
     <Paper sx={{ p: 3, mb: 3 }} >
       <Typography variant="h6" gutterBottom>Connections</Typography>
       <FormControlLabel
@@ -143,119 +69,6 @@ export default function SettingsPanel() {
         sx={{ mb: 2, display: 'block' }}
       />
     </Paper>
-
-    {/* LLM Settings */}
-    <Paper sx={{ p: 3, mb: 3 }}>
-      <Typography variant="h6" gutterBottom>LLM Settings</Typography>
-      <form onSubmit={handleSubmit}>
-        {error && (
-          <Alert severity="error" sx={{ mb: 2 }}>
-            {error}
-          </Alert>
-        )}
-        <FormControl fullWidth margin="normal">
-          <InputLabel>Provider</InputLabel>
-          <Select
-            value={formData.provider || defaultConfig.provider}
-            label="Provider"
-            onChange={(e) => setFormData(prev => ({
-              ...prev,
-              provider: e.target.value as LLMConfig['provider']
-            }))}
-          >
-            <MenuItem value="openai">OpenAI</MenuItem>
-            <MenuItem value="openrouter">OpenRouter</MenuItem>
-            <MenuItem value="ollama">Ollama</MenuItem>
-          </Select>
-        </FormControl>
-
-        {formData.provider !== 'ollama' && (
-          <TextField
-            fullWidth
-            margin="normal"
-            label="API Key"
-            type="password"
-            value={formData.apiKey ?? ''}
-            onChange={(e) => setFormData(prev => ({
-              ...prev,
-              apiKey: e.target.value
-            }))}
-          />
-        )}
-
-        {formData.provider === 'ollama' ? (
-          <FormControl fullWidth margin="normal">
-            <InputLabel>Model</InputLabel>
-            <Select
-              value={formData.model || defaultConfig.model}
-              label="Model"
-              onChange={(e) => setFormData(prev => ({
-                ...prev,
-                model: e.target.value
-              }))}
-              displayEmpty
-            >
-              <MenuItem value="jimscard/blackhat-hacker:v2">jimscard/blackhat-hacker:v2</MenuItem>
-              {ollamaModels.models.map((model) => (
-                model !== 'jimscard/blackhat-hacker:v2' && <MenuItem key={model} value={model}>{model}</MenuItem>
-              ))}
-            </Select>
-            {ollamaModels.error && (
-              <Alert severity="error" sx={{ mt: 1 }}>
-                {ollamaModels.error}
-              </Alert>
-            )}
-          </FormControl>
-        ) : (
-          <TextField
-            fullWidth
-            margin="normal"
-            label="Model"
-            value={formData.model ?? defaultConfig.model}
-            onChange={(e) => setFormData(prev => ({
-              ...prev,
-              model: e.target.value
-            }))}
-          />
-        )}
-
-        {formData.provider === 'ollama' && (
-          <TextField
-            fullWidth
-            margin="normal"
-            label="Base URL"
-            value={formData.baseUrl ?? defaultConfig.baseUrl}
-            onChange={(e) => setFormData(prev => ({
-              ...prev,
-              baseUrl: e.target.value
-            }))}
-          />
-        )}
-
-        <TextField
-          fullWidth
-          margin="normal"
-          label="Temperature"
-          type="number"
-          inputProps={{ min: 0, max: 1, step: 0.1 }}
-          value={formData.temperature ?? defaultConfig.temperature}
-          onChange={(e) => setFormData(prev => ({
-            ...prev,
-            temperature: parseFloat(e.target.value)
-          }))}
-        />
-        <Box sx={{ mt: 3 }}>
-          <Button
-            type="submit"
-            variant="contained"
|
|
||||||
disabled={isSaving}
|
|
||||||
fullWidth
|
|
||||||
>
|
|
||||||
{isSaving ? <CircularProgress size={24} /> : 'Save LLM Settings'}
|
|
||||||
</Button>
|
|
||||||
</Box>
|
|
||||||
</form>
|
|
||||||
</Paper>
|
|
||||||
</Box>
|
</Box>
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
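Net effect of this diff: the panel no longer renders or submits LLM settings, so the dead form state (`formData`, `isSaving`, the visibility-change reload) goes with it. A minimal sketch of the resulting component, assuming the hunk bodies elided above; the `Switch` wiring and its label are hypothetical:

```tsx
// Sketch only: the hunk bodies at @@ -124,7 +50,7 @@ and @@ -143,119 +69,6 @@
// are not shown in full, so the Switch label and control here are assumptions.
import React from 'react';
import { Box, Paper, Typography, Switch, FormControlLabel } from '@mui/material';

export default function SettingsPanel() {
  return (
    <Box sx={{ p: 3 }}>
      {/* Connections Settings */}
      <Paper sx={{ p: 3, mb: 3 }}>
        <Typography variant="h6" gutterBottom>Connections</Typography>
        <FormControlLabel
          control={<Switch />}
          label="Hypothetical connection toggle"
          sx={{ mb: 2, display: 'block' }}
        />
      </Paper>
    </Box>
  );
}
```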
electron-file-search/src/electron.d.ts (vendored)

@@ -18,9 +18,7 @@ declare global {
       answer: string;
       sources: DocumentMetadata[];
     }>;
-    updateLLMConfig: (config: LLMConfig) => Promise<void>;
     getLLMConfig: () => Promise<LLMConfig>;
-    getOllamaModels: () => Promise<{ success: boolean; data?: string[]; error?: string }>;
 
     // Vector Store Operations
     getDocuments: () => Promise<{ success: boolean; data?: DocumentMetadata[]; error?: string }>;
@@ -46,6 +44,14 @@ declare global {
     minimizeWindow: () => Promise<void>;
     maximizeWindow: () => Promise<void>;
     closeWindow: () => Promise<void>;
 
+    // Ollama Operations
+    checkOllama: () => Promise<{ installed: boolean; running: boolean }>;
+    openExternal: (url: string) => Promise<void>;
+
+    // Model Operations
+    checkModel: (modelName: string) => Promise<{ installed: boolean; installing: boolean }>;
+    pullModel: (modelName: string, onProgress: (status: string) => void) => Promise<void>;
   };
 }
 }
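The new typings describe a check-then-pull flow for Ollama. A sketch of how the renderer might drive it, assuming the preload script exposes this API as `window.electron` (as the surrounding declarations suggest); `ensureOllamaModel` and the URL are illustrative, not part of this commit:

```typescript
// Hypothetical helper built on the IPC surface declared above (sketch).
async function ensureOllamaModel(modelName: string): Promise<boolean> {
  const ollama = await window.electron.checkOllama();
  if (!ollama.installed || !ollama.running) {
    // Send the user to the Ollama site instead of failing silently.
    await window.electron.openExternal('https://ollama.com');
    return false;
  }
  const model = await window.electron.checkModel(modelName);
  if (!model.installed && !model.installing) {
    // Stream pull progress back through the callback from the declaration.
    await window.electron.pullModel(modelName, (status) => {
      console.log(`pull ${modelName}: ${status}`);
    });
  }
  return true;
}
```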
@@ -1,4 +1,4 @@
-import { useState, useEffect, useCallback } from 'react';
+import { useState, useEffect } from 'react';
 import { useElectron } from './useElectron';
 import type { LLMConfig } from '../../electron/types';
 
@@ -16,17 +16,8 @@ export function useLLMConfig() {
     try {
       setIsLoading(true);
       setError(null);
-      const config = await electron.getLLMConfig().catch(err => {
-        console.error('Error loading LLM config:', err);
-        return null;
-      });
-      setConfig(config || {
-        provider: 'ollama',
-        model: 'phi4',
-        baseUrl: 'http://localhost:11434',
-        temperature: 0.7,
-        apiKey: undefined
-      });
+      const config = await electron.getLLMConfig();
+      setConfig(config);
     } catch (err) {
       setError(err instanceof Error ? err.message : 'Failed to load LLM config');
       console.error('Error loading LLM config:', err);
@@ -35,26 +26,9 @@ export function useLLMConfig() {
     }
   };
 
-  const updateConfig = useCallback(async (newConfig: LLMConfig) => {
-    try {
-      setIsLoading(true);
-      setError(null);
-      await electron.updateLLMConfig(newConfig);
-      setConfig(newConfig);
-    } catch (err) {
-      setError(err instanceof Error ? err.message : 'Failed to update LLM config');
-      console.error('Error updating LLM config:', err);
-      throw err;
-    } finally {
-      setIsLoading(false);
-    }
-  }, [electron]);
-
   return {
     config,
     isLoading,
     error,
-    updateConfig,
-    reloadConfig: loadConfig,
   };
 }
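Read together, these hunks leave `useLLMConfig` as a read-only hook: the hard-coded fallback config (including the `phi4` default) is gone, so load failures now surface through `error`, and neither `updateConfig` nor `reloadConfig` is returned. A sketch of the whole hook after the change, filling in the unchanged lines by assumption:

```typescript
// Reconstruction of the post-change hook; the lines not shown in the diff
// (state declarations, the mount effect, the finally block) are assumed.
import { useState, useEffect } from 'react';
import { useElectron } from './useElectron';
import type { LLMConfig } from '../../electron/types';

export function useLLMConfig() {
  const electron = useElectron();
  const [config, setConfig] = useState<LLMConfig | null>(null);
  const [isLoading, setIsLoading] = useState(true);
  const [error, setError] = useState<string | null>(null);

  const loadConfig = async () => {
    try {
      setIsLoading(true);
      setError(null);
      const config = await electron.getLLMConfig();
      setConfig(config);
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Failed to load LLM config');
      console.error('Error loading LLM config:', err);
    } finally {
      setIsLoading(false);
    }
  };

  // Assumed: load once on mount; the diff does not show this effect.
  useEffect(() => {
    loadConfig();
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []);

  return { config, isLoading, error };
}
```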
@@ -5,14 +5,17 @@ import path from 'path';
 // https://vitejs.dev/config/
 export default defineConfig({
   plugins: [react()],
-  base: './',
+  base: process.env.ELECTRON_VITE_DEV_SERVER_URL ? '/' : './',
   build: {
     outDir: 'dist',
     emptyOutDir: true,
     target: 'esnext',
     rollupOptions: {
       external: ['http', 'https', 'path', 'fs', 'electron']
-    }
+    },
+    assetsDir: '.',
+    minify: true,
+    sourcemap: false
   },
   resolve: {
     alias: {
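The `base` switch makes dev builds resolve assets from the dev-server root while packaged builds resolve them relative to `index.html`, and `assetsDir: '.'` keeps emitted assets beside `index.html` so the relative base works from disk. A sketch of the main-process side this implies, assuming the conventional pattern (none of this loading code is shown in the commit, and the dist path is an assumption):

```typescript
// Assumed main-process loading logic matching the new `base` setting (sketch).
import { BrowserWindow } from 'electron';
import path from 'path';

function loadRenderer(win: BrowserWindow): void {
  const devServerUrl = process.env.ELECTRON_VITE_DEV_SERVER_URL;
  if (devServerUrl) {
    // Dev: assets come from the Vite dev server, so absolute base '/' works.
    win.loadURL(devServerUrl);
  } else {
    // Production: index.html is loaded from disk, so relative base './' is required.
    win.loadFile(path.join(__dirname, '../dist/index.html'));
  }
}
```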
@@ -1,16 +0,0 @@
-import { dirname } from "path";
-import { fileURLToPath } from "url";
-import { FlatCompat } from "@eslint/eslintrc";
-
-const __filename = fileURLToPath(import.meta.url);
-const __dirname = dirname(__filename);
-
-const compat = new FlatCompat({
-  baseDirectory: __dirname,
-});
-
-const eslintConfig = [
-  ...compat.extends("next/core-web-vitals", "next/typescript"),
-];
-
-export default eslintConfig;
@@ -1,7 +0,0 @@
-import type { NextConfig } from "next";
-
-const nextConfig: NextConfig = {
-  /* config options here */
-};
-
-export default nextConfig;
package-lock.json (generated, 5912 changed lines): diff suppressed because it is too large.

package.json (27 changed lines)

@@ -1,27 +0,0 @@
-{
-  "name": "test",
-  "version": "0.1.0",
-  "private": true,
-  "scripts": {
-    "dev": "next dev --turbopack",
-    "build": "next build",
-    "start": "next start",
-    "lint": "next lint"
-  },
-  "dependencies": {
-    "react": "^19.0.0",
-    "react-dom": "^19.0.0",
-    "next": "15.1.6"
-  },
-  "devDependencies": {
-    "typescript": "^5",
-    "@types/node": "^20",
-    "@types/react": "^19",
-    "@types/react-dom": "^19",
-    "postcss": "^8",
-    "tailwindcss": "^3.4.1",
-    "eslint": "^9",
-    "eslint-config-next": "15.1.6",
-    "@eslint/eslintrc": "^3"
-  }
-}
@@ -1,8 +0,0 @@
-/** @type {import('postcss-load-config').Config} */
-const config = {
-  plugins: {
-    tailwindcss: {},
-  },
-};
-
-export default config;
@@ -1,18 +0,0 @@
-import type { Config } from "tailwindcss";
-
-export default {
-  content: [
-    "./src/pages/**/*.{js,ts,jsx,tsx,mdx}",
-    "./src/components/**/*.{js,ts,jsx,tsx,mdx}",
-    "./src/app/**/*.{js,ts,jsx,tsx,mdx}",
-  ],
-  theme: {
-    extend: {
-      colors: {
-        background: "var(--background)",
-        foreground: "var(--foreground)",
-      },
-    },
-  },
-  plugins: [],
-} satisfies Config;
@@ -1,27 +0,0 @@
-{
-  "compilerOptions": {
-    "target": "ES2017",
-    "lib": ["dom", "dom.iterable", "esnext"],
-    "allowJs": true,
-    "skipLibCheck": true,
-    "strict": true,
-    "noEmit": true,
-    "esModuleInterop": true,
-    "module": "esnext",
-    "moduleResolution": "bundler",
-    "resolveJsonModule": true,
-    "isolatedModules": true,
-    "jsx": "preserve",
-    "incremental": true,
-    "plugins": [
-      {
-        "name": "next"
-      }
-    ],
-    "paths": {
-      "@/*": ["./src/*"]
-    }
-  },
-  "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
-  "exclude": ["node_modules"]
-}