diff --git a/README.md b/README.md
new file mode 100644
index 0000000..ac267fa
--- /dev/null
+++ b/README.md
@@ -0,0 +1,135 @@
+Quantized by Richard Erkhov.
+
+[Github](https://github.com/RichardErkhov)
+
+[Discord](https://discord.gg/pvy7H8DZMG)
+
+[Request more models](https://github.com/RichardErkhov/quant_request)
+
+
+TinyLlama-ContextQuestionPair-Classifier-Reranker - GGUF
+- Model creator: https://huggingface.co/cnmoro/
+- Original model: https://huggingface.co/cnmoro/TinyLlama-ContextQuestionPair-Classifier-Reranker/
+
+
+| Name | Quant method | Size |
+| ---- | ---- | ---- |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q2_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q2_K.gguf) | Q2_K | 0.4GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_XS.gguf) | IQ3_XS | 0.44GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_S.gguf) | IQ3_S | 0.47GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_S.gguf) | Q3_K_S | 0.47GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ3_M.gguf) | IQ3_M | 0.48GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K.gguf) | Q3_K | 0.51GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_M.gguf) | Q3_K_M | 0.51GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q3_K_L.gguf) | Q3_K_L | 0.55GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_XS.gguf) | IQ4_XS | 0.57GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_0.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_0.gguf) | Q4_0 | 0.59GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.IQ4_NL.gguf) | IQ4_NL | 0.6GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_S.gguf) | Q4_K_S | 0.6GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K.gguf) | Q4_K | 0.62GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_M.gguf) | Q4_K_M | 0.62GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_1.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_1.gguf) | Q4_1 | 0.65GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_0.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_0.gguf) | Q5_0 | 0.71GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_S.gguf) | Q5_K_S | 0.71GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K.gguf) | Q5_K | 0.73GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_K_M.gguf) | Q5_K_M | 0.73GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_1.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q5_1.gguf) | Q5_1 | 0.77GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q6_K.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q6_K.gguf) | Q6_K | 0.84GB |
+| [TinyLlama-ContextQuestionPair-Classifier-Reranker.Q8_0.gguf](https://huggingface.co/RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf/blob/main/TinyLlama-ContextQuestionPair-Classifier-Reranker.Q8_0.gguf) | Q8_0 | 1.09GB |
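+
+Any of the files above can be fetched programmatically. A minimal sketch using the `huggingface_hub` package (an assumption, not part of the original card; the actual download call is left commented out, since the Q4_K_M file is about 0.62GB):
+
+```python
+# Sketch of fetching one quant file; assumes `pip install huggingface_hub`.
+repo_id = "RichardErkhov/cnmoro_-_TinyLlama-ContextQuestionPair-Classifier-Reranker-gguf"
+filename = "TinyLlama-ContextQuestionPair-Classifier-Reranker.Q4_K_M.gguf"
+
+# from huggingface_hub import hf_hub_download
+# local_path = hf_hub_download(repo_id=repo_id, filename=filename)
+print(f"would download {filename} from {repo_id}")
+```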
+
+
+
+
+Original model description:
+---
+license: cc-by-nc-2.0
+language:
+- en
+- pt
+tags:
+- classification
+- llama
+- tinyllama
+- rag
+- rerank
+---
+```python
+template = """<|system|>
+You are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question
+<|user|>
+Context:
+{Text}
+
+Question:
+{Prompt}
+<|assistant|>
+"""
+
+# The model's reply will be one of:
+#
+#   {"relevant": true}
+#
+#   {"relevant": false}
+```
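+
+Filling the placeholders is plain `str.format`. A minimal, self-contained sketch (the context and question below are made up):
+
+```python
+template = """<|system|>
+You are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question
+<|user|>
+Context:
+{Text}
+
+Question:
+{Prompt}
+<|assistant|>
+"""
+
+# Build the final prompt from a (context, question) pair.
+prompt = template.format(
+    Text="The Eiffel Tower is 330 metres tall.",
+    Prompt="How tall is the Eiffel Tower?",
+)
+print(prompt)
+```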
+
+Example:
+```text
+<|system|>
+You are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question
+<|user|>
+Context:
+old. NFT were observed in almost all patients over 60 years of age, but the incidence was low.
+Many ubiquitin-positive small-sized granules were observed in the second and third layer of the parahippocampal gyrus of aged patients,
+and the incidence rose with increasing age. On the other hand, few of these granules were in patients with Alzheimer's type dementia.
+Granulovacuolar degeneration was examined. Many centrally-located granules were positive for ubiquitin. Based on electron microscopic
+observation of these granules at several stages, the granules were thought to be a type of autophagosome. During the first stage of
+granulovacuolar degeneration, electron-dense materials appeared in the cytoplasm, following which they were surrounded by smooth
+endoplasmic reticulum. Analytical electron microscopy disclosed that the granules contained
+some aluminium. Several senile changes in the central nervous system in cadavers were examined. The pattern of extension of Alzheimer's
+neurofibrillary tangles (NFT) and senile plaques (SP) in the olfactory bulbs of 100 specimens was examined during routine autopsy by
+immunohistochemical staining. NFT were first observed in the anterior olfactory nucleus after the age of 60, and incidence rose with
+increasing age. Senile plaques were found in the nucleus when there were many SP in the cerebral cortex. Of 25 non-demented amyotrophic
+lateral sclerosis patients, SP were found in the cerebral cortices of 10, and 9 of 10 were over 60 years old. NFT were observed in almost
+all patients over
+
+Question:
+What is granulovacuolar degeneration and what was its observation on electron microscopy?
+<|assistant|>
+{"relevant": true}
+```
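+
+The reply is strict JSON, so it can be parsed with the standard `json` module; note that JSON `true`/`false` map to Python `True`/`False`:
+
+```python
+import json
+
+# Both possible completions from the model, parsed into Python booleans.
+parsed_true = json.loads('{"relevant": true}')["relevant"]
+parsed_false = json.loads('{"relevant": false}')["relevant"]
+assert parsed_true is True and parsed_false is False
+print("parsed OK")
+```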
+
+Recommended request parameters when querying this model through vLLM:
+
+```python
+import requests
+
+# Placeholders: point these at your own vLLM deployment.
+model = "cnmoro/TinyLlama-ContextQuestionPair-Classifier-Reranker"
+base_uri = "http://localhost:8000/v1/completions"
+
+prompt = "<|system|>\nYou are a chatbot who always responds in JSON format indicating if the context contains relevant information to answer the question\n<|user|>\nContext:\nConhecida como missão de imagem de raios-x e espectroscopia (da sigla em inglês XRISM), a estratégia é utilizar o telescópio para ampliar os estudos da humanidade a níveis celestiais com uma fração dos pixels da tela de um Gameboy original, lançado em 1989. Isso é possível por meio de uma ferramenta chamada “Resolve”. Apesar de utilizar a medição em pixels, a tecnologia é bastante diferente de uma câmera. Com um conjunto de microcalorímetros de seis pixels quadrados que mede 0,5 cm², ela detecta a temperatura de cada raio-x que o atinge. Como funciona o Resolve do telescópio XRISM? Cientista do projeto XRISM da NASA, Brian Williams explicou em um comunicado o funcionamento do telescópio. “Chamamos o Resolve de espectrômetro de microcalorímetros porque cada um de seus 36 pixels está medindo pequenas quantidades de calor entregues por cada raio-x recebido, nos permitindo ver as impressões digitais químicas dos elementos que compõem as fontes com detalhes sem precedentes”.\n\nQuestion:\nQual é a sigla em alemão mencionada?\n<|assistant|>\n{\"relevant\":"
+
+headers = {
+ "Accept": "text/event-stream",
+ "Authorization": "Bearer EMPTY"
+}
+
+body = {
+ "model": model,
+ "prompt": [prompt],
+ "best_of": 5,
+ "max_tokens": 1,
+ "temperature": 0,
+ "top_p": 1,
+ "use_beam_search": True,
+ "top_k": -1,
+ "min_p": 0,
+ "repetition_penalty": 1,
+ "length_penalty": 1,
+ "min_tokens": 1,
+ "logprobs": 1
+}
+
+result = requests.post(base_uri, headers=headers, json=body)
+result = result.json()
+
+# The single generated token completes the prompt's {"relevant": prefix,
+# so the stripped completion text is "true" or "false".
+boolean_response = result['choices'][0]['text'].strip() == "true"
+print(boolean_response)
+```
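+
+Because the request also asks for `logprobs`, the generated token's log-probability can be turned into a soft relevance score, which is useful for ranking several candidate contexts rather than getting a bare yes/no. A sketch over a hypothetical response payload (the `logprobs` field names follow the OpenAI-style completions schema that vLLM mirrors; the numbers are made up):
+
+```python
+import math
+
+# Hypothetical response payload shaped like an OpenAI-style completions
+# reply with logprobs; field names and values are assumptions.
+response = {
+    "choices": [{
+        "text": " true",
+        "logprobs": {"tokens": [" true"], "token_logprobs": [-0.05]},
+    }]
+}
+
+choice = response["choices"][0]
+logprob = choice["logprobs"]["token_logprobs"][0]
+p = math.exp(logprob)  # probability mass of the sampled token
+
+# If the token is " false", the relevance score is the complement.
+score = p if choice["text"].strip() == "true" else 1.0 - p
+print(round(score, 3))
+```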
+