Initialize project; model provided by the ModelHub XC community
Model: ik-ram28/MedMistralInstruct-CPT-SFT-7B Source: Original Platform
---
library_name: transformers
tags:
- medical
license: apache-2.0
language:
- fr
- en
base_model:
- ik-ram28/MedMistralInstruct-CPT-7B
- mistralai/Mistral-7B-Instruct-v0.1
---

## MedMistralInstruct-CPT-SFT-7B

### Model Description

MedMistralInstruct-CPT-SFT-7B is a French medical language model based on Mistral-7B-Instruct-v0.1, adapted through Continual Pre-Training (CPT) followed by Supervised Fine-Tuning (SFT).

### Model Details

- **Model Type**: Causal language model
- **Base Model**: Mistral-7B-Instruct-v0.1
- **Language**: French
- **Domain**: Medical/Healthcare
- **Parameters**: 7 billion
- **License**: Apache 2.0

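Since the card declares `transformers` as its library, here is a minimal usage sketch. The function names are illustrative, not part of the released model; heavy imports are deferred so the prompt helper runs without the large dependency, and outputs still require professional verification as noted under Ethical Considerations.

```python
def build_prompt(question: str) -> str:
    """Wrap a question in the Mistral-Instruct [INST] chat format.
    The tokenizer prepends the <s> BOS token itself, so it is omitted here."""
    return f"[INST] {question} [/INST]"


def generate_answer(question: str, max_new_tokens: int = 256) -> str:
    """Download the checkpoint (roughly 15 GB in fp16) and generate an answer.
    Illustrative helper; requires the `transformers` and `torch` packages."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ik-ram28/MedMistralInstruct-CPT-SFT-7B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens before decoding the generated continuation.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)
```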
### Training Details

**Continual Pre-Training (CPT)**
- **Dataset**: NACHOS corpus (7.4 GB of French medical texts)
- **Training Duration**: 2.8 epochs
- **Hardware**: 32 NVIDIA A100 80GB GPUs
- **Training Time**: ~40 hours

**Supervised Fine-Tuning (SFT)**
- **Dataset**: 30K French medical question-answer pairs
- **Method**: DoRA (Weight-Decomposed Low-Rank Adaptation)
- **Training Duration**: 10 epochs
- **Hardware**: 1 NVIDIA H100 80GB GPU
- **Training Time**: ~42 hours

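The SFT stage names DoRA as the adaptation method, which the `peft` library exposes via the `use_dora` flag on `LoraConfig`. The configuration fragment below is a hedged sketch: the rank, alpha, and target modules are assumptions for illustration, not the hyperparameters used to train this model.

```python
from peft import LoraConfig

# Illustrative DoRA configuration; r, lora_alpha, and target_modules are
# assumed values, not the ones used for MedMistralInstruct-CPT-SFT-7B.
dora_config = LoraConfig(
    r=16,                                 # low-rank dimension (assumed)
    lora_alpha=32,                        # scaling factor (assumed)
    target_modules=["q_proj", "v_proj"],  # attention projections (assumed)
    use_dora=True,                        # weight-decomposed LoRA (DoRA)
    task_type="CAUSAL_LM",
)
```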
### Computational Requirements

- **Carbon Emissions**: 33.96 kgCO2e (CPT + SFT)
- **Total Training Time**: 82 hours (CPT + SFT)

### Ethical Considerations

- **Medical Accuracy**: Intended for research and educational purposes only
- **Professional Oversight**: Outputs require verification by qualified medical professionals
- **Bias Awareness**: May contain biases present in the training data
- **Privacy**: Do not input private health information

### Citation

```bibtex

```

### Contact

For questions about these models, please contact: ikram.belmadani@lis-lab.fr