---
library_name: transformers
license: gemma
pipeline_tag: text-generation
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: >-
  To access Gemma on Hugging Face, you're required to review and agree to
  Google's usage license. To do this, please ensure you're logged in to
  Hugging Face and click below. Requests are processed immediately.
extra_gated_button_content: Acknowledge license
---
# mlx-community/gemma-2-2b-fp16

The model mlx-community/gemma-2-2b-fp16 was converted to MLX format from [google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) using mlx-lm version 0.16.1.
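For reference, a conversion like the one described above can be reproduced with the `mlx_lm.convert` entry point that ships with mlx-lm. This is a sketch, not the exact command used for this card; flag names and defaults can differ between mlx-lm versions, and the `--dtype` value here is an assumption based on the fp16 suffix:

```shell
# Sketch: convert the original Hugging Face checkpoint to MLX format.
# Requires mlx-lm to be installed and access to the gated google/gemma-2-2b repo.
python -m mlx_lm.convert --hf-path google/gemma-2-2b --dtype float16
```

The converted weights are written to a local directory (`mlx_model` by default in recent versions), which can then be uploaded to the Hub.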
## Use with mlx

```shell
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Download (on first use) and load the model and tokenizer from the Hub.
model, tokenizer = load("mlx-community/gemma-2-2b-fp16")

# Generate a completion; verbose=True streams tokens and prints timing stats.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```