Compare commits

...

10 Commits

| Author | SHA1 | Message | Date |
| ---- | ---- | ---- | ---- |
| Richard Erkhov | 299638fb31 | Update model metadata to set pipeline tag to the new `text-ranking` (#1; commit 086f17f9419f4f1165940437bad1ed822d52d0ce, co-authored by Tom Aarsen) | 2025-04-02 15:17:43 +00:00 |
| Richard Erkhov | 2d0a592209 | uploaded readme | 2024-10-16 00:42:30 +00:00 |
| Richard Erkhov | 6ea157ed78 | uploaded model | 2024-10-16 00:42:28 +00:00 |
| Richard Erkhov | 75928ce392 | uploaded model | 2024-10-16 00:40:48 +00:00 |
| Richard Erkhov | 71c38d56d7 | uploaded model | 2024-10-16 00:39:29 +00:00 |
| Richard Erkhov | 9f27d16153 | uploaded model | 2024-10-16 00:38:10 +00:00 |
| Richard Erkhov | 2058fd67ed | uploaded model | 2024-10-16 00:37:20 +00:00 |
| Richard Erkhov | e450eeda24 | uploaded model | 2024-10-16 00:36:05 +00:00 |
| Richard Erkhov | bfcda0414c | uploaded model | 2024-10-16 00:35:00 +00:00 |
| Richard Erkhov | 1eeec91e22 | uploaded model | 2024-10-16 00:33:55 +00:00 |
10 changed files with 170 additions and 0 deletions

.gitattributes (vendored): 8 additions

@@ -47,3 +47,11 @@ TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_NL.gguf filter=lfs diff=lf
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_1.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
TinyLlama-ContextQuestionPair-Classifier-Reranker.Q8_0.gguf filter=lfs diff=lfs merge=lfs -text

README.md (new file): 138 additions

@@ -0,0 +1,138 @@
---
pipeline_tag: text-ranking
---
Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
TinyLlama-ContextQuestionPair-Classifier-Reranker - GGUF
- Model creator: https://huggingface.co/cnmoro/
- Original model: https://huggingface.co/cnmoro/TinyLlama-ContextQuestionPair-Classifier-Reranker/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q2_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q2_K.gguf) | Q2_K | 0.4GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_XS.gguf) | IQ3_XS | 0.44GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_S.gguf) | IQ3_S | 0.47GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_S.gguf) | Q3_K_S | 0.47GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_M.gguf) | IQ3_M | 0.48GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K.gguf) | Q3_K | 0.51GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_M.gguf) | Q3_K_M | 0.51GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_L.gguf) | Q3_K_L | 0.55GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_XS.gguf) | IQ4_XS | 0.57GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_0.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_0.gguf) | Q4_0 | 0.59GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_NL.gguf) | IQ4_NL | 0.6GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_S.gguf) | Q4_K_S | 0.6GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K.gguf) | Q4_K | 0.62GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_M.gguf) | Q4_K_M | 0.62GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_1.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_1.gguf) | Q4_1 | 0.65GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_0.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_0.gguf) | Q5_0 | 0.71GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_S.gguf) | Q5_K_S | 0.71GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K.gguf) | Q5_K | 0.73GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_M.gguf) | Q5_K_M | 0.73GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_1.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_1.gguf) | Q5_1 | 0.77GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q6_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q6_K.gguf) | Q6_K | 0.84GB |
| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q8_0.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q8_0.gguf) | Q8_0 | 1.09GB |
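
The files in the table above can also be fetched programmatically. A minimal sketch using the `huggingface_hub` client (assumed installed); the choice of `Q4_K_M` here is just an illustration:

```python
# Sketch: download one quant file from this repo with huggingface_hub.
# REPO_ID and the filename pattern come from the table above.
REPO_ID = "RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf"
MODEL_BASE = "TinyLlama-ContextQuestionPair-Classifier-Reranker"

def quant_filename(quant: str) -> str:
    """Build the .gguf filename for a quant method from the table, e.g. 'Q4_K_M'."""
    return f"{MODEL_BASE}.{quant}.gguf"

def download(quant: str = "Q4_K_M") -> str:
    # Network call; returns the local path of the cached file.
    from huggingface_hub import hf_hub_download
    return hf_hub_download(repo_id=REPO_ID, filename=quant_filename(quant))
```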
Original model description:
---
license: cc-by-nc-2.0
language:
- en
- pt
tags:
- classification
- llama
- tinyllama
- rag
- rerank
---
```python
template = """<s><|system|>
You are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question</s>
<|user|>
Context:
{Text}
Question:
{Prompt}</s>
<|assistant|>
"""
# Output should be one of:
#   {"relevant": true}
#   {"relevant": false}
```
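
The `{Text}` and `{Prompt}` placeholders can be filled with `str.format`. A minimal sketch (the context and question below are invented examples, not from the model card):

```python
# Fill the prompt template above with a context/question pair.
template = """<s><|system|>
You are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question</s>
<|user|>
Context:
{Text}
Question:
{Prompt}</s>
<|assistant|>
"""

prompt = template.format(
    Text="Paris is the capital of France.",
    Prompt="What is the capital of France?",
)
```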
Example:
```text
<s><|system|>
You are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question</s>
<|user|>
Context:
old. NFT were observed in almost all patients over 60 years of age, but the incidence was low.
Many ubiquitin-positive small-sized granules were observed in the second and third layer of the parahippocampal gyrus of aged patients,
and the incidence rose with increasing age. On the other hand, few of these granules were in patients with Alzheimer's type dementia.
Granulovacuolar degeneration was examined. Many centrally-located granules were positive for ubiquitin. Based on electron microscopic
observation of these granules at several stages, the granules were thought to be a type of autophagosome. During the first stage of
granulovacuolar degeneration, electron-dense materials appeared in the cytoplasm, following which they were surrounded by smooth cytoplasm,
following which they were surrounded by smooth endoplasmic reticulum. Analytical electron microscopy disclosed that the granules contained
some aluminium. Several senile changes in the central nervous system in cadavers were examined. The pattern of extension of Alzheimer's
neurofibrillary tangles (NFT) and senile plaques (SP) in the olfactory bulbs of 100 specimens was examined during routine autopsy by
immunohistochemical staining. NFT were first observed in the anterior olfactory nucleus after the age of 60, and incidence rose with
increasing age. Senile plaques were found in the nucleus when there were many SP in the cerebral cortex. Of 25 non-demented amyotrophic
lateral sclerosis patients, SP were found in the cerebral cortices of 10, and 9 of 10 were over 60 years old. NFT were observed in almost
all patients over
Question:
What is granulovacuolar degeneration and what was its observation on electron microscopy?</s>
<|assistant|>
{"relevant": true}</s>
```
Recommended request parameters when serving the model with vLLM:
```python
prompt = "<s><|system|>\nYou are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question</s>\n<|user|>\nContext:\nConhecida como missão de imagem de raios-x e espectroscopia (da sigla em inglês XRISM), a estratégia é utilizar o telescópio para ampliar os estudos da humanidade a níveis celestiais com uma fração dos pixels da tela de um Gameboy original, lançado em 1989. Isso é possível por meio de uma ferramenta chamada “Resolve”. Apesar de utilizar a medição em pixels, a tecnologia é bastante diferente de uma câmera. Com um conjunto de microcalorímetros de seis pixels quadrados que mede 0,5 cm², ela detecta a temperatura de cada raio-x que o atinge. Como funciona o Resolve do telescópio XRISM? Cientista do projeto XRISM da NASA, Brian Williams explicou em um comunicado o funcionamento do telescópio. “Chamamos o Resolve de espectrômetro de microcalorímetros porque cada um de seus 36 pixels está medindo pequenas quantidades de calor entregues por cada raio-x recebido, nos permitindo ver as impressões digitais químicas dos elementos que compõem as fontes com detalhes sem precedentes”.\n\nQuestion:\nQual é a sigla em alemão mencionada?</s>\n<|assistant|>\n{\"relevant\":"
import requests

# Deployment-specific values (hypothetical): point these at your vLLM
# OpenAI-compatible server and the name the model is served under.
base_uri = "http://localhost:8000/v1/completions"
model = "cnmoro/TinyLlama-ContextQuestionPair-Classifier-Reranker"

headers = {
    "Accept": "text/event-stream",
    "Authorization": "Bearer EMPTY"
}
body = {
    "model": model,
    "prompt": [prompt],
    "best_of": 5,
    "max_tokens": 1,
    "temperature": 0,
    "top_p": 1,
    "use_beam_search": True,
    "top_k": -1,
    "min_p": 0,
    "repetition_penalty": 1,
    "length_penalty": 1,
    "min_tokens": 1,
    "logprobs": 1
}
result = requests.post(base_uri, headers=headers, json=body).json()
# The prompt ends with '{"relevant":', so the single generated token is ' true' or ' false'.
boolean_response = result['choices'][0]['text'].strip().lower() == "true"
print(boolean_response)
```


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7b55178dbdafadd17365b26ee05bb991c08251f35c8614e6aed35bd73c9d784a
size 701378112


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:80b8ef4068565bdc12e7382d7225cda0d6be258c36525387d087fd5a08ae33d4
size 766029376


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3a526b74cc43cc5a7747d0177d66909c21459c6b571ec4689dee3013251788a3
size 830680640


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:35c32294b30d4f784ab2efb28b1466b9e097d9ccb149dd781c8c35726499e348
size 782044736


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:35c32294b30d4f784ab2efb28b1466b9e097d9ccb149dd781c8c35726499e348
size 782044736


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fcf70998b9d9bb8c967ddd664a8ebcad3d9e09fff1780de749d841d3ea8ee2ad
size 766029376


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9cdd8f0dab429f8ce4f28b656134ec37f215939cc4cca43b32859de0ad2f15df
size 903413312


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:417fcad2199f40ec690c05d0d8dda02bb3338f608967f0db50781933f728ed4e
size 1169808960