Installation

With the prerequisites ready, you can now download and launch Ollama Easy GUI. The procedure takes just a few minutes.

Downloading the application

Choose a folder where you want to install the application: it can be an existing folder, or you can create a new one. Open it in File Explorer, type cmd in the address bar at the top, and press Enter: Command Prompt opens already positioned in the installation folder.

To open Command Prompt

Now download the application with Git:

git clone https://github.com/paolodalprato/ollama-easy-gui.git

Git will create a folder called ollama-easy-gui with all the necessary files.

Installing dependencies

Enter the newly created folder:

cd ollama-easy-gui

Install the libraries required by the application:

npm install

This command downloads the Node.js libraries the application depends on. The operation takes about a minute, and you only need to run it the first time.

Launching the application

You have two ways to launch Ollama Easy GUI: the quick method and the method with configuration.

Quick method (npm start)

The simplest way: open Command Prompt in the application folder and type:

npm start

The application starts and shows the address where you can reach it, typically http://localhost:3003. Open this address in your browser to use the interface.

Method with configuration (.bat file)

On Windows you can use a batch file that offers additional features: update checking and optimal configuration for your graphics card.

First create the configuration file: in the application folder you'll find the file start-ollama-easy-gui.bat.example. Copy it and rename it to start-ollama-easy-gui.bat (you can also give it another name, the important thing is that the extension is .bat).
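From Command Prompt, the copy and rename can be done in a single step (this assumes you are already inside the application folder and that the example file carries the name shipped with the repository):

```batch
:: Copy the example file to the name used in this guide
copy start-ollama-easy-gui.bat.example start-ollama-easy-gui.bat
```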

Then edit the file with a text editor (Notepad works fine) to adapt it to your hardware:

:: Number of layers to run on the GPU (higher = faster, but requires more VRAM)
set "OLLAMA_GPU_LAYERS=18"

:: Enable Flash Attention for faster responses
set "OLLAMA_FLASH_ATTENTION=1"

:: CPU threads to use for inference
set "OLLAMA_NUM_THREADS=24"

What values to use?

  • OLLAMA_GPU_LAYERS: with 6 GB of VRAM you can try 20-25, with 8 GB try 30-35
  • OLLAMA_NUM_THREADS: set it to the number of logical processors (threads) of your CPU, e.g. 8 for a quad-core with hyperthreading

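If you don't know how many logical processors your CPU has, you can ask Windows directly from Command Prompt; the built-in NUMBER_OF_PROCESSORS variable reports logical processors (threads), not physical cores:

```batch
:: Prints the number of logical processors seen by Windows
echo %NUMBER_OF_PROCESSORS%
```

On a quad-core CPU with hyperthreading, for example, this prints 8, which is the value to use for OLLAMA_NUM_THREADS.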
From now on, launch the application by double-clicking the start-ollama-easy-gui.bat file. At each startup the file automatically checks for updates and installs them.

Accessing the interface

Whichever method you used, the interface will be available in the browser at:

http://localhost:3003

Terminal with successful startup

If there are no error messages in the terminal window (as in the screenshot), the application has started correctly. Open the address in your browser to access the graphical interface.
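If the browser shows nothing, you can check from a second Command Prompt window whether the server is responding (curl ships with Windows 10 and later; the port assumes the default 3003 shown above):

```batch
:: Request the start page; HTML output means the server is running
curl http://localhost:3003
```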

Updating the application

If you use the .bat file, updates are automatically downloaded and installed at each startup.

If you launch with npm start instead, you can update manually:

  1. Open Command Prompt in the application folder
  2. Download updates:
    git pull
    
  3. Update dependencies:
    npm install
    

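The two update steps can also be chained in a single line from the application folder; the dependencies are refreshed only if the download succeeds:

```batch
:: Fetch the latest code, then update dependencies on success
git pull && npm install
```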
Your settings are safe

Updates don't overwrite your conversations, saved prompts or custom configurations. All your data is in the app/data folder which is not touched by updates.