---
base_model: Qwen/Qwen3-8B
datasets:
- manindra18/trascript-dataset
language:
- en
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
tags:
- tinker
- tinker-cookbook
- sft
- chat
---

# manindra18/Qwen3-8B

This model was fine-tuned from [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B) using [Tinker](https://thinkingmachines.ai/tinker) and [tinker-cookbook](https://github.com/thinking-machines-lab/tinker-cookbook).

## Model details

- **Base model:** [Qwen/Qwen3-8B](https://huggingface.co/Qwen/Qwen3-8B)
- **Format:** Merged model
## Usage
|
||
|
|
|
||
|
|
```python
|
||
|
|
from transformers import AutoModelForCausalLM
|
||
|
|
|
||
|
|
model = AutoModelForCausalLM.from_pretrained("manindra18/Qwen3-8B")
|
||
|
|
```
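Since the model is tagged `chat`, Qwen3-8B derivatives are normally prompted through the tokenizer's built-in chat template. A minimal generation sketch, assuming the standard `transformers` chat-template workflow (the prompt text, `dtype`, and generation settings below are illustrative assumptions, not part of this card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "manindra18/Qwen3-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# dtype="auto" loads the checkpoint in its native precision;
# device_map="auto" places weights on available accelerators.
model = AutoModelForCausalLM.from_pretrained(
    model_id, dtype="auto", device_map="auto"
)

# Format a single-turn conversation with the tokenizer's chat template.
messages = [{"role": "user", "content": "Summarize the key points of this transcript."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```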

## Framework versions

- tinker-cookbook: 0.1.0
- transformers: 5.1.0
- torch: 2.10.0