Quickstart

Development

To get started with development of this project, clone the repository from GitHub:

git clone https://github.com/twangodev/uw-coursemap.git
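
Then change into the repository directory (the directory name comes from the repository URL above):

cd uw-coursemap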

Next, create a .env file in the root directory of the project. The repository includes a .env.example file that works out of the box; copy it to create your own .env file and modify it to suit your needs.

cp .env.example .env

For reference, the contents of .env.example:

# This is an example .env file used for uw-coursemap-web, uw-coursemap-search, and elasticsearch.
# Please change the values to your own settings, especially the ELASTIC_PASSWORD for security reasons.

# uw-coursemap-web
PUBLIC_API_URL='https://cdn.jsdelivr.net/gh/twangodev/uw-coursemap-data@main' # Use https://raw.githubusercontent.com/twangodev/uw-coursemap-data/refs/heads/main to bypass the jsDelivr CDN
PUBLIC_SEARCH_API_URL='https://search.uwcourses.com'

# uw-coursemap-search and elasticsearch
ELASTIC_HOST='https://elasticsearch:9200' # This points to the elasticsearch service
ELASTIC_USERNAME='elastic'
ELASTIC_PASSWORD='CHANGEME' # Change this to a secure password
ELASTIC_VERIFY_CERTS='false' # Since we are using self-signed certificates, set this to false unless you have valid certs
DISCOVERY.TYPE='single-node'
DATA_DIR='/data'

Next, determine whether you want to run the frontend, search, generation, or all of them. If you're not sure what you want to run, read the architecture documentation to get a better understanding of the project.

Frontend is the easiest to get started with, so we recommend starting there.

Frontend

To begin development on the frontend, ensure you have Node.js installed. Then, navigate into the project directory and install the dependencies:

npm install

Next, start the development server:

npm run dev

You can now access the application in your browser at the URL printed by the dev server.

Search

The search service is a little trickier to set up, as it uses Pipenv and Elasticsearch.

First, you will need to initialize the git submodules if you have not done so already.

git submodule update --init --recursive

Setup Elasticsearch

Next, you will need to install Elasticsearch on your machine. We recommend using Docker to run Elasticsearch, as it is the easiest way to get started. If you don't have Docker installed, you can follow the instructions on the Docker website.

docker compose up -d
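
If you only need Elasticsearch for local development and plan to run the frontend and search server outside of Docker, you can start just that one service (the service name comes from the docker-compose.yml shown below):

docker compose up -d elasticsearch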

At this point, you will need to reconfigure your .env file to point to the Elasticsearch instance. As specified in the docker-compose.yml, the Docker container will run on localhost:9200, so you can use the following configuration for your .env file:

ELASTIC_HOST=https://localhost:9200

For reference, the docker-compose.yml:

services:

  elasticsearch:
    image: elasticsearch:8.18.0
    container_name: elasticsearch
    env_file: ".env"
    ports:
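      # 9200 is the HTTP/REST API; 9300 is node-to-node transport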
      - "9200:9200"
      - "9300:9300"

  search:
    container_name: uw-coursemap-search
    image: ghcr.io/twangodev/uw-coursemap-search:v1.0.7
    restart: unless-stopped
    env_file: ".env"
    ports:
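      # the Flask search API listens on 8000 inside the container and is published on host port 3001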
      - "3001:8000"
    volumes:
      - ./generation/data:/data
    depends_on:
      - elasticsearch

  web:
    container_name: uw-coursemap-web
    image: ghcr.io/twangodev/uw-coursemap-web:v1.0.7
    env_file: ".env"
    restart: unless-stopped
    ports:
      - "3000:3000"

Setup Flask

Finally, ensure you have Python installed. Install Pipenv with the following command:

pip install pipenv --user
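
You can verify the installation with:

pipenv --version

If the command is not found, the user-level scripts directory where pip placed the pipenv executable may not be on your PATH.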

Next, navigate into the search directory and install the dependencies:

pipenv install

And now you can run the search server:

pipenv run python app.py

TIP

The search server requires the environment variables defined in the .env file in the root directory. See Pipenv Shell for how to bring them into the shell.

If you use PyCharm, you can create a run configuration that loads the environment variables from the .env file.
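
For example, one way to point Pipenv at the root .env when launching the server from the search directory (PIPENV_DOTENV_LOCATION is a standard Pipenv setting; the relative path is an assumption based on the repository layout):

# Load the repository-root .env instead of a local one, then start the server
PIPENV_DOTENV_LOCATION=../.env pipenv run python app.py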

Generation

The generation process requires the same Python setup as the search. Install Pipenv and the dependencies as specified above. Then, navigate into the generation directory and run the following command:

pipenv run python main.py --help

For full details on how to run the generation process, see the Generation documentation.

Deployment

We recommend deploying this application with Docker for ease of use. We publish both the frontend and search images to Docker Hub and the GitHub Container Registry; the provided docker-compose.yml pulls from the GitHub Container Registry (ghcr.io) by default.

To deploy the application, you will need to create a .env file in the root directory of the project, just like in development. You can use the .env.example file as a template and copy it to create your own .env file.

cp .env.example .env

CAUTION

Ensure that you change ELASTIC_PASSWORD in the .env file to a secure password. This is the password for the elastic user in Elasticsearch, and it is used to authenticate the search server to Elasticsearch.
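
If you need a quick way to generate one, any sufficiently long random string works; for example, assuming openssl is available:

openssl rand -base64 32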

To run the application, run the following command:

docker compose up -d

This will start the application in detached mode. You can view the logs with the following command:

docker compose logs -f

To stop the application, run the following command:

docker compose down

To expose your application to the internet, you can use a production-grade reverse proxy such as NGINX, Caddy, or Traefik.
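
As a minimal sketch, Caddy can proxy the frontend with a single command. This assumes Caddy is installed on the host, example.com is replaced with your own domain, and the web container is published on port 3000 as in docker-compose.yml:

caddy reverse-proxy --from example.com --to localhost:3000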

Scaling

This application is designed to be horizontally scalable, meaning you can run multiple instances of the frontend and search servers to handle more traffic.

Currently, there isn't enough demand to warrant horizontally scaling the application. However, if you would like to scale it, you can use Docker Swarm or Kubernetes to orchestrate the process, or use an autoscaling group with your cloud provider of choice.
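
As a rough sketch with plain Docker Compose, you could run multiple replicas of a service. This assumes you first remove the fixed container_name entries and the fixed host-port mappings from docker-compose.yml (named containers and identical host ports cannot be replicated) and put a load balancer or reverse proxy in front of the replicas:

docker compose up -d --scale web=3 --scale search=2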
