Release 0.5.1 (#9533)

Author: Lianmin Zheng
Date: 2025-08-23 07:09:26 -07:00
Committed by: GitHub
Parent commit: 86d10d220f
This commit: 97a38ee85b

6 changed files with 7 additions and 9 deletions


@@ -33,7 +33,7 @@ Add [performance optimization options](#performance-optimization-options) as nee
 ```bash
 # Installation
-pip install "sglang[all]>=0.5.0rc2"
+pip install "sglang[all]>=0.5.1"
 # Launch
 python3 -m sglang.launch_server --model deepseek-ai/DeepSeek-V3 --tp 8 --trust-remote-code


@@ -12,20 +12,19 @@ It is recommended to use uv for faster installation:
 ```bash
 pip install --upgrade pip
 pip install uv
-uv pip install "sglang[all]>=0.5.0rc2"
+uv pip install "sglang[all]>=0.5.1"
 ```
 **Quick fixes to common problems**
 - If you encounter `OSError: CUDA_HOME environment variable is not set`, set it to your CUDA install root with either of the following solutions:
   1. Use `export CUDA_HOME=/usr/local/cuda-<your-cuda-version>` to set the `CUDA_HOME` environment variable.
   2. Install FlashInfer first following the [FlashInfer installation doc](https://docs.flashinfer.ai/installation.html), then install SGLang as described above.
-- SGLang currently uses torch 2.8 and flashinfer for torch 2.8. If you want to install flashinfer separately, please refer to the [FlashInfer installation doc](https://docs.flashinfer.ai/installation.html). Please note that the FlashInfer PyPI package is called `flashinfer-python` instead of `flashinfer`.
 ## Method 2: From source
 ```bash
 # Use the last release branch
-git clone -b v0.5.0rc2 https://github.com/sgl-project/sglang.git
+git clone -b v0.5.1 https://github.com/sgl-project/sglang.git
 cd sglang
 # Install the python packages
@@ -35,7 +34,6 @@ pip install -e "python[all]"
 **Quick fixes to common problems**
 - If you want to develop SGLang, it is recommended to use docker. Please refer to [setup docker container](../developer_guide/development_guide_using_docker.md#setup-docker-container). The docker image is `lmsysorg/sglang:dev`.
-- SGLang currently uses torch 2.8 and flashinfer for torch 2.8. If you want to install flashinfer separately, please refer to the [FlashInfer installation doc](https://docs.flashinfer.ai/installation.html). Please note that the FlashInfer PyPI package is called `flashinfer-python` instead of `flashinfer`.
 ## Method 3: Using docker


@@ -44,7 +44,7 @@ You can install SGLang using one of the methods below.
 ```bash
 # Use the last release branch
-git clone -b v0.5.0rc2 https://github.com/sgl-project/sglang.git
+git clone -b v0.5.1 https://github.com/sgl-project/sglang.git
 cd sglang
 # Compile sgl-kernel


@@ -99,7 +99,7 @@ We are also providing a DeepEP-compatible Library as a drop-in replacement of de
 ```shell
 # Use the last release branch
-git clone -b v0.5.0rc2 https://github.com/sgl-project/sglang.git
+git clone -b v0.5.1 https://github.com/sgl-project/sglang.git
 cd sglang
 pip install --upgrade pip


@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 [project]
 name = "sglang"
-version = "0.5.0rc2"
+version = "0.5.1"
 description = "SGLang is yet another fast serving framework for large language models and vision language models."
 readme = "README.md"
 requires-python = ">=3.10"


@@ -1 +1 @@
-__version__ = "0.5.0rc2"
+__version__ = "0.5.1"
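The `>=0.5.1` pins in the install commands above work because pip orders versions per PEP 440, under which a pre-release such as `0.5.0rc2` sorts before its final release, so the new specifier excludes the release candidate. A minimal sketch of that ordering, assuming the third-party `packaging` library is installed (it implements PEP 440 and is what pip uses internally):

```python
from packaging.version import Version

# PEP 440: a release candidate precedes its final release, and both
# precede the next patch release, so ">=0.5.1" rejects 0.5.0rc2.
assert Version("0.5.0rc2") < Version("0.5.0") < Version("0.5.1")

print(Version("0.5.0rc2").is_prerelease)  # True
print(Version("0.5.1").is_prerelease)     # False
```

This is also why plain `pip install sglang` without `--pre` would never have resolved to `0.5.0rc2`: pip skips pre-releases unless they are explicitly requested.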