How To Install DeepSeek R1 Locally On Ubuntu 24.04

What is DeepSeek?

DeepSeek is an artificial intelligence model based on machine learning, widely used in data analytics and natural language processing applications. Installing DeepSeek on a local Ubuntu system allows users to develop and test AI models independently. This article walks through the steps for installing DeepSeek on an Ubuntu system.

System Requirements

Before proceeding with the installation, ensure that the system meets the following requirements:

  • Ubuntu 20.04 or later
  • Minimum 8GB RAM (16GB or more recommended)
  • NVIDIA GPU with CUDA support (optional but recommended for better performance)
  • Python 3.8 or later
  • Pip and Virtual Environment
  • Git
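
Most of these requirements can be checked quickly from a terminal; the nvidia-smi command is only relevant if an NVIDIA GPU and its driver are installed:

lsb_release -a
free -h
python3 --version
nvidia-smi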

Installing Dependencies

The first step in the installation is to ensure that the system has all the necessary dependencies. DeepSeek requires Python 3.8 or higher, and the latest Python version, 3.12, is available in the default apt repositories of Ubuntu 24.04.

Python 3.12

To verify that Python 3 is installed on the system, run the following command:

python3 --version

If Python is not yet installed on the system, install it first with the following command:

sudo apt install python3 -y

PIP and Git

We also need Git and pip to install DeepSeek tooling on the system. To verify that they are available, run the following commands:

git --version
pip --version

If pip or Git is not yet installed on the system, install them first with the following commands:

sudo apt install git -y
sudo apt install python3-pip -y

Installing Ollama

Ollama is a lightweight tool for downloading, managing, and running large language models locally. It has gained significant traction among developers and researchers looking to integrate AI into their projects efficiently, and we will use it here to run the DeepSeek R1 model.

We will install Ollama using its official install script via curl and then check the installed version:

curl -fsSL https://ollama.com/install.sh | sh
ollama --version

After the Ollama installation, the Ollama service starts automatically. You can verify its status using the command below.

sudo systemctl status ollama.service
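
If the service is reported as inactive, it can be started and enabled at boot with systemctl:

sudo systemctl enable --now ollama.service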

In our tutorial, the Ollama service is already running properly, so we will continue with downloading the DeepSeek model.

Download DeepSeek Model

Different models are available; download the one that matches your requirements and hardware.

  • DeepSeek-R1-Distill-Qwen-1.5B
  • DeepSeek-R1-Distill-Qwen-7B
  • DeepSeek-R1-Distill-Qwen-14B
  • DeepSeek-R1-Distill-Qwen-32B
  • DeepSeek-R1-Distill-Llama-8B
  • DeepSeek-R1-Distill-Llama-70B

These models are based on two different model families:

  • Qwen-based models (ranging from 1.5B to 32B parameters)
  • Llama-based models (8B and 70B parameters)

In this tutorial we will use DeepSeek R1 7B. To download it, run the following command:

ollama pull deepseek-r1:7b

This command downloads the DeepSeek R1 7B model. The download size is approximately 4.7 GB, so the time taken will depend on your internet speed.
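
If your hardware is limited, one of the smaller distilled variants can be pulled instead, for example the 1.5B model (assuming the tag is published in the Ollama model library):

ollama pull deepseek-r1:1.5b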

Once the model is downloaded, we can list the locally available models by running the command.

ollama list
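
Details about the downloaded model, such as its parameter count and quantization, can be inspected with the show subcommand:

ollama show deepseek-r1:7b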

Running DeepSeek R1:7b

At this stage, we will run DeepSeek R1 and test it with a query. To start an interactive session, run the following command:

ollama run deepseek-r1:7b

In our example, we will ask “how to send file securely from unix to windows?” and the model will produce its answer.
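
The prompt can also be passed directly on the command line for a one-shot, non-interactive answer, for example with the same question:

ollama run deepseek-r1:7b "how to send file securely from unix to windows?"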

In our environment, the processor was not powerful enough to run DeepSeek properly, so the responses were very slow.
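
Besides the interactive CLI, Ollama also exposes a local REST API (by default on port 11434), which is what the Web UI in the next section talks to. A minimal query, assuming the default port, could look like this:

curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "how to send file securely from unix to windows?",
  "stream": false
}'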

Setting Up Web UI for DeepSeek

For more convenience, we will install Open WebUI so that we can interact with DeepSeek through a more intuitive web interface. To do this, we will follow the steps below.

We will install Open WebUI in a Python virtual environment:

sudo apt install python3-venv -y
python3 -m venv ~/open-webui-venv
source ~/open-webui-venv/bin/activate

Then install Open WebUI using pip:

pip install open-webui
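
To confirm that the package was installed into the virtual environment, pip can report its version:

pip show open-webui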

Testing DeepSeek R1 Web UI

After the Web UI installation is complete, we can start and use it. The web interface will be available at http://localhost:8080.

1. Starting Web UI

To start the Web UI, run the following command:

open-webui serve
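
By default the server listens on port 8080. If that port is already in use, it can usually be changed with a --port flag (check open-webui serve --help for the options available in your version):

open-webui serve --port 8090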

2. Accessing DeepSeek Web UI

We will open the DeepSeek Web UI at http://localhost:8080; in my environment, it is reachable at http://bckinfo:8080.

The DeepSeek Web UI is now ready and waiting to receive queries from the user. At this point, we have successfully installed DeepSeek R1 on the Ubuntu 24.04 LTS operating system and set up the Web UI for more convenient usage.

Conclusion

By following the above steps, DeepSeek can be installed and run on a local Ubuntu-based system. A local installation allows AI models to be developed and tested independently of external services.
