Troubleshooting

This section collects the most common problems and their solutions. If you can't find an answer here, you can ask for help in the community channels.

Installation problems

"npm" is not recognized as a command

Cause: Node.js is not installed or the terminal wasn't restarted after installation.

Solution:

  1. Verify that Node.js is installed: search for "Node.js" in the Start menu
  2. Close and reopen Command Prompt
  3. If the problem persists, reinstall Node.js

"git" is not recognized as a command

Cause: Git is not installed or is not in the system PATH.

Solution:

  1. Reinstall Git from git-scm.com
  2. During installation, make sure the "Git from the command line" option is selected
  3. Restart the computer
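To confirm that Git is on the PATH after reinstalling, you can run a quick check from a freshly opened Command Prompt (a minimal sketch; on Windows, `where git` additionally shows which executable the PATH resolves to):

```shell
# Prints the installed Git version; an error here means git is still not on the PATH
git --version
```

If this still fails after a restart, the "Git from the command line" option was likely not selected during installation.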

Error during npm install

Cause: network or permission problems.

Solution:

  1. Try running Command Prompt as Administrator
  2. If you're behind a corporate proxy, configure npm (set both the HTTP and HTTPS proxy):

npm config set proxy http://proxy.company.com:8080
npm config set https-proxy http://proxy.company.com:8080

  3. Try clearing the cache and retrying:

npm cache clean --force
npm install

Startup problems

The application won't start

Possible cause 1: Ollama is not running.

Check: look for the Ollama icon in the notification area (near the clock). If it's not there:

  1. Search for "Ollama" in the Start menu and launch it
  2. Wait a few seconds for it to fully start
  3. Try launching Ollama Easy GUI again
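You can also check that Ollama's local API is reachable. This assumes the default port, 11434, has not been changed:

```shell
# The root endpoint replies "Ollama is running" when the service is up;
# a connection error means Ollama is not running (or listens on another port)
curl http://localhost:11434
```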

Possible cause 2: port 3003 is already in use.

Solution: close other instances of the application, or stop whatever other program is occupying the port.
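If no other instance of the application is open, you can find out which process is holding the port. A sketch for Windows, where `<pid>` is a placeholder for the process ID that netstat prints in its last column:

```shell
:: List the process listening on port 3003
netstat -ano | findstr :3003
:: End it by PID (only if you are sure it is safe to do so)
taskkill /PID <pid> /F
```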

Blank page in browser

Cause: the server is started but there's a frontend error.

Solution:

  1. Open the browser developer tools (F12)
  2. Check the "Console" tab for error messages
  3. Try clearing the browser cache and reloading

Model problems

No models available in the list

Cause: Ollama has no models installed or is not reachable.

Solution:

  1. Verify that Ollama is running
  2. Download at least one model:

ollama pull llama3.2

  3. Reload the web interface
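After the download finishes, you can confirm the model is installed before reloading the interface:

```shell
# Lists locally installed models; llama3.2 should appear in the output
ollama list
```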

The model responds very slowly

Cause: insufficient hardware resources or model too large.

Solutions:

  - Try a smaller model (e.g. switch from an 8b to a 3b variant)
  - Close other applications to free up RAM
  - If you have a GPU, verify that Ollama is using it
  - Increase OLLAMA_NUM_THREADS in the .bat file
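As an example, the last point could look like this inside the .bat launcher (a sketch; the exact contents of the file and a sensible thread count depend on your installation and CPU):

```shell
:: In the .bat file, before the application is started:
:: raise the number of CPU threads Ollama may use (example value)
set OLLAMA_NUM_THREADS=8
```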

The model doesn't understand my language

Cause: some models are trained primarily on English texts.

Solutions:

  - Use multilingual models like qwen2.5 or mistral
  - Add to the system prompt: "Always respond in [your language]"
  - Keep in mind that small models (<3b) may have limited capabilities in languages other than English

Attachment problems

The PDF is not read correctly

Cause: PDF with complex formatting or scanned.

Solutions:

  - If the PDF is a scan, the text is not extractable: run it through an OCR tool first
  - For PDFs with many tables, consider extracting the text manually
  - Try attaching a simpler version of the document

Images are not analyzed

Cause: the model is not multimodal.

Solution: use a model that supports images, for example:

  - llava
  - bakllava
  - llama3.2-vision
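Multimodal models are downloaded like any other:

```shell
# Download a vision-capable model, then select it in the model list
ollama pull llava
```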

MCP problems

MCP tools don't work

Checklist:

  1. Is MCP activated? (switch in the sidebar)
  2. Is the specific tool activated in the MCP panel?
  3. Does the model support function calling? (use llama3.1, qwen2.5, or mistral)
  4. Does the mcp-config.json file exist, and is it configured correctly?

"Tool not found" error

Cause: the MCP server didn't start correctly.

Solution:

  1. Verify that Node.js is installed (MCP servers require it)
  2. Check the logs for specific errors
  3. Try restarting the application
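Step 1 can be verified quickly from a terminal:

```shell
# Prints the Node.js version (e.g. v20.11.0);
# an error means Node.js is missing or not on the PATH
node --version
```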

Checking the logs

The application keeps track of errors in log files. To view them:

  1. Click the Log button in the interface footer
  2. Select the category to view:
     - App: general application errors
     - Chat: problems during conversations
     - MCP: external tool errors
     - Models: problems with downloading or loading models

Log viewer

You can search for specific text and filter by date.

Getting support

If the problem persists, you can ask for help in the community channels.

When asking for help, include:

  - A description of the problem
  - Your operating system and version
  - Error messages (from the logs or console)
  - Steps to reproduce the problem