Initialize project; model provided by the ModelHub XC community

Model: mlx-community/GLM-4-9B-0414-bf16
Source: Original Platform
ModelHub XC
2026-04-12 13:58:55 +08:00
commit 5f2860ad3b
12 changed files with 843 additions and 0 deletions

README.md Normal file

@@ -0,0 +1,39 @@
---
license: mit
language:
- zh
- en
pipeline_tag: text-generation
library_name: mlx
tags:
- mlx
base_model: THUDM/GLM-4-9B-0414
---
# mlx-community/GLM-4-9B-0414-bf16
This model [mlx-community/GLM-4-9B-0414-bf16](https://huggingface.co/mlx-community/GLM-4-9B-0414-bf16) was
converted to MLX format from [THUDM/GLM-4-9B-0414](https://huggingface.co/THUDM/GLM-4-9B-0414)
using mlx-lm version **0.22.4**.
## Use with mlx
```bash
pip install mlx-lm
```
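For a quick smoke test without writing any Python, mlx-lm also installs a command-line generator. A minimal invocation, assuming the `mlx_lm.generate` entry point that ships with recent mlx-lm releases:
```bash
mlx_lm.generate --model mlx-community/GLM-4-9B-0414-bf16 --prompt "hello" --max-tokens 256
```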
```python
from mlx_lm import load, generate

# Download (if needed) and load the model and tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/GLM-4-9B-0414-bf16")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is available.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
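If you want tokens as they are produced rather than a single blocking call, recent mlx-lm versions also expose `stream_generate`. A minimal sketch, assuming a release where it yields response objects with a `.text` field:
```python
from mlx_lm import load, stream_generate

model, tokenizer = load("mlx-community/GLM-4-9B-0414-bf16")

messages = [{"role": "user", "content": "hello"}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Assumption: each yielded chunk carries the newly generated text in .text
# (true in recent mlx-lm releases). Print chunks as they arrive.
for chunk in stream_generate(model, tokenizer, prompt, max_tokens=256):
    print(chunk.text, end="", flush=True)
print()
```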