Upload nishka-gkc-phi3-merged - GKC merged model with modeling_phi3.py for RunPod
This commit is contained in:
50  README.md  Normal file
@@ -0,0 +1,50 @@
---
license: mit
language:
- en
tags:
- pql
- compliance
- governance
- phi-3
base_model: microsoft/Phi-3-mini-4k-instruct
---

# NISHKA GKC

Governance Knowledge Corpus model trained on 1.12M tokens of regulatory content across 15 compliance frameworks.

## Model Details

- **Base Model**: microsoft/Phi-3-mini-4k-instruct
- **Architecture**: Phi-3 (3.8B parameters)
- **Training**: LoRA adapter merged into base model
- **Format**: Full model weights (no adapter needed)

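The details above imply a sizeable download; a rough back-of-the-envelope footprint for the merged weights (a sketch, assuming the stated 3.8B parameters stored in float16 at 2 bytes each):

```python
# Rough disk/VRAM footprint for the merged checkpoint.
# Assumptions: 3.8e9 parameters (per Model Details above), float16 weights.
params = 3.8e9
bytes_per_param = 2  # float16
gib = params * bytes_per_param / 1024**3
print(f"~{gib:.1f} GiB of weights")  # → ~7.1 GiB of weights
```

Actual size on disk varies with the saved dtype and shard overhead, and inference needs additional memory for the KV cache on top of the weights.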
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "openpql/nishka-gkc",
    device_map="auto",
    torch_dtype="auto",
    trust_remote_code=True,  # required: the repo ships a custom modeling_phi3.py
)
tokenizer = AutoTokenizer.from_pretrained("openpql/nishka-gkc")

# Generate
inputs = tokenizer("Your prompt here", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
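Phi-3 instruct checkpoints are trained on a specific chat markup, so raw prompts work best when wrapped in it. In real code prefer `tokenizer.apply_chat_template`; as a sketch of what that template produces (assuming this merged model keeps the standard upstream Phi-3 format — verify against the repo's `tokenizer_config.json`):

```python
def build_phi3_prompt(user_message: str) -> str:
    """Wrap a user message in the standard Phi-3 instruct chat markup.

    Assumption: this merged model keeps the upstream Phi-3 template;
    prefer tokenizer.apply_chat_template(...) in production code.
    """
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>\n"

prompt = build_phi3_prompt("Which compliance frameworks does GKC cover?")
print(prompt)
```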

## Deployment

This model is ready for deployment with vLLM, TGI, or other inference servers.

```bash
# vLLM example (--trust-remote-code is needed for the custom modeling file)
vllm serve openpql/nishka-gkc --dtype float16 --trust-remote-code
```

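Once vLLM is serving the model it exposes an OpenAI-compatible HTTP API; a minimal client sketch using only the standard library (assumes the default `localhost:8000` endpoint — the prompt text and sampling values are illustrative, not from this repo):

```python
import json
import urllib.request

# Request payload for vLLM's OpenAI-compatible /v1/completions endpoint.
payload = {
    "model": "openpql/nishka-gkc",
    "prompt": "List three controls commonly required for SOC 2 compliance.",
    "max_tokens": 256,
    "temperature": 0.2,
}

def query_server(url: str = "http://localhost:8000/v1/completions") -> str:
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["text"]

# Call query_server() once the server above is running.
```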
@@ -29,7 +29,6 @@
  "rope_theta": 10000.0,
  "sliding_window": 2047,
  "tie_word_embeddings": false,
  "transformers_version": "4.57.3",
  "use_cache": true,
  "vocab_size": 32064
}
1563  modeling_phi3.py  Normal file
File diff suppressed because it is too large