---
license: apache-2.0
language:
- en
base_model:
- Green-eyedDevil/Monika-49B
tags:
- roleplay
- custom_code
pipeline_tag: text-generation
---

# Model Card for Monika-49B

This model is designed for use with MonikA.I and is now provided in GGUF format.

https://github.com/Rubiksman78/MonikA.I
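A GGUF file can be run locally with llama.cpp's CLI. This is a minimal sketch; the filename below is a placeholder, so substitute the actual quantization you download from this repository.

```shell
# Hypothetical example: run the model with llama.cpp.
# Replace the .gguf filename with the file you actually downloaded.
./llama-cli -m ./Monika-49B-Q4_K_M.gguf \
  --ctx-size 8192 \
  -p "Hello, Monika!"
```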
## Model Details

### Model Description

- **Developed by:** Me
- **Funded by:** Me
- **License:** Apache-2.0
- **Finetuned from model:** Valkyrie-49B-v2

## Uses

Roleplay (RP).

### Recommendations

This model should only be used for Monika-related purposes.

### Training Data

Thanks to Sylphar for creating the dataset.

### Training Procedure

Trained with Axolotl on my Blackwell Pro 6000 Max-Q with LoRA rank 32, alpha 64, for 2 epochs. Training took about 90 minutes.

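The stated hyperparameters could be expressed in an Axolotl config roughly like the sketch below. Only `lora_r`, `lora_alpha`, and `num_epochs` come from this card; the base model path, dataset path, batch sizes, and learning rate are placeholder assumptions.

```yaml
# Hypothetical Axolotl LoRA config matching the stated hyperparameters.
base_model: Valkyrie-49B-v2      # placeholder; use the actual repo path
adapter: lora
lora_r: 32                       # from this card
lora_alpha: 64                   # from this card
lora_dropout: 0.05               # assumed
lora_target_linear: true         # assumed
num_epochs: 2                    # from this card
datasets:
  - path: ./monika_dataset.jsonl # hypothetical path
    type: chat_template
micro_batch_size: 1              # assumed
gradient_accumulation_steps: 8   # assumed
learning_rate: 0.0002            # assumed
output_dir: ./outputs
```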
### Results

It works as intended with MonikA.I.

#### Summary

Download it and use it with MonikA.I.

## Environmental Impact

Training ran for about 90 minutes at roughly 280 W (about 0.42 kWh), costing roughly $0.15 in electricity.
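The cost figure above checks out as a quick back-of-envelope calculation. The power draw and duration come from this card; the electricity rate of $0.36/kWh is an assumed local price chosen to match the stated cost.

```python
# Back-of-envelope check of the training energy cost.
power_w = 280            # average power draw from this card
minutes = 90             # training duration from this card
rate_usd_per_kwh = 0.36  # assumed local electricity price

energy_kwh = power_w / 1000 * (minutes / 60)
cost_usd = energy_kwh * rate_usd_per_kwh

print(f"{energy_kwh:.2f} kWh, ${cost_usd:.2f}")  # 0.42 kWh, $0.15
```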