---
library_name: transformers
license: apache-2.0
base_model: Qwen/Qwen2.5-1.5B
---
# Qwen2.5-1.5B Biomedical Fine-Tuned Model

This model is a biomedical and bioinformatics fine-tuned version of Qwen/Qwen2.5-1.5B, prepared by Dr. YMG.
## Model Details

### Model Description
This model is a domain-adapted and instruction fine-tuned large language model specialized for biomedical and bioinformatics tasks.
- Developed by: Dr. YMG
- Model type: Causal Language Model (LLM)
- Language(s): English
- License: Apache 2.0
- Finetuned from model: Qwen/Qwen2.5-1.5B
### Model Sources
- Repository: https://huggingface.co/yashm/qwen25-15b-biomed-finetuned
- Base Model: https://huggingface.co/Qwen/Qwen2.5-1.5B
## Uses

### Direct Use
- Biomedical concept explanation
- Bioinformatics discussions
- Research assistance
- Literature summarization
- Gene expression & biomarker discussion
### Out-of-Scope Use
- Clinical diagnosis
- Medical treatment decisions
- Drug prescription
- Patient-specific advice
## Example Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
import torch

MODEL_ID = "yashm/qwen25-15b-biomed-finetuned"

# Load the tokenizer and model; device_map="auto" places weights on the
# available GPU(s), falling back to CPU.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

prompt = "Explain gene expression in simple terms."
out = pipe(prompt, max_new_tokens=200)
print(out[0]["generated_text"])
```
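If the fine-tune carries over Qwen2.5's chat template (an assumption; check the repository's `tokenizer_config.json`), chat-style prompting may follow instructions better than a raw text prompt. A minimal sketch, with an illustrative system message:

```python
# Hypothetical chat-style prompt; the role/content structure follows the
# transformers chat-template convention. The message texts are illustrative.
messages = [
    {"role": "system", "content": "You are a biomedical research assistant."},
    {"role": "user", "content": "Summarize the role of p53 in tumor suppression."},
]

# With the tokenizer and pipeline loaded as in the example above:
# prompt = tokenizer.apply_chat_template(
#     messages, tokenize=False, add_generation_prompt=True
# )
# out = pipe(prompt, max_new_tokens=200)
```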
## Training Details
- Base model: Qwen/Qwen2.5-1.5B
- Method: LoRA (PEFT)
- Precision: BF16
- Quantization: 4-bit QLoRA
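Since training used 4-bit QLoRA, the model can likely also be loaded in 4-bit for inference on modest hardware via bitsandbytes. A hedged sketch; the NF4 and double-quantization settings below are common QLoRA defaults, not values confirmed from this model's actual training configuration:

```python
# Typical QLoRA-style 4-bit settings for transformers' BitsAndBytesConfig;
# these are common defaults, not the author's confirmed training config.
quant_kwargs = dict(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",          # NormalFloat4 quantization
    bnb_4bit_compute_dtype="bfloat16",  # matches the BF16 training precision
    bnb_4bit_use_double_quant=True,     # also quantize the quantization constants
)

# With transformers + bitsandbytes installed:
# from transformers import AutoModelForCausalLM, BitsAndBytesConfig
# model = AutoModelForCausalLM.from_pretrained(
#     "yashm/qwen25-15b-biomed-finetuned",
#     quantization_config=BitsAndBytesConfig(**quant_kwargs),
#     device_map="auto",
# )
```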
## Limitations
- May hallucinate plausible-sounding but incorrect biomedical facts
- Has not been medically or clinically validated
- Knowledge is limited to what was covered in its training data
## Disclaimer
For research and educational use only. Not for clinical decision-making.
## Author
Fine-tuned by Dr. YMG