The Server AI Hub Manual

Server AI Hub is a managed, pre-configured software stack that turns an NVIDIA GPU server into a polished workstation experience — without ever asking you to open a terminal.

This manual is the canonical reference for installing, configuring, and operating the Hub. It is intentionally short on jargon and heavy on screenshots; every page answers a question you would otherwise have asked someone over Slack.

Start here

  • Getting Started — the 15-minute first install. From bare server to a working dashboard.
  • Design Philosophy — why the product is built the way it is. Read this if you want to know what we will and will not ship.
  • User Guide — task-by-task walkthroughs of every feature.

Who this is for

  • A researcher who bought a DGX Spark and would rather not become a Linux sysadmin.
  • A small team running models in-house and tired of one person being "the Docker person".
  • An enterprise piloting GPU servers and looking for a managed stack instead of rolling its own.

If you are comfortable in Linux and want to run everything yourself, the Hub will still help — but the design philosophy was written for people who shouldn't have to.

How to navigate

The left sidebar mirrors the structure of the product. Admin topics describe operating the box; Settings topics describe configuring it; Reference is the dry technical material; Design Philosophy is the company story.

If you want to search, use the box at the top right — it searches the full text of the manual.