Update examples in Quickstart

ai-modelscope
2025-03-19 22:27:49 +08:00
parent ce352a1910
commit b0b57963f5
2 changed files with 23 additions and 5 deletions

@@ -60,8 +60,27 @@ model = AutoModelForCausalLM.from_pretrained(
 )
 tokenizer = AutoTokenizer.from_pretrained(model_name)
+# Choose your prompt:
+# Math example (AIME 2024)
+prompt = r"""Let $x,y$ and $z$ be positive real numbers that satisfy the following system of equations:
+\[\log_2\left({x \over yz}\right) = {1 \over 2}\]\[\log_2\left({y \over xz}\right) = {1 \over 3}\]\[\log_2\left({z \over xy}\right) = {1 \over 4}\]
+Then the value of $\left|\log_2(x^4y^3z^2)\right|$ is $\tfrac{m}{n}$ where $m$ and $n$ are relatively prime positive integers. Find $m+n$.
+Please reason step by step, and put your final answer within \boxed{}."""
+# Korean MCQA example (CSAT Math 2025)
+prompt = r"""Question : $a_1 = 2$인 수열 $\{a_n\}$과 $b_1 = 2$인 등차수열 $\{b_n\}$이 모든 자연수 $n$에 대하여\[\sum_{k=1}^{n} \frac{a_k}{b_{k+1}} = \frac{1}{2} n^2\]을 만족시킬 때, $\sum_{k=1}^{5} a_k$의 값을 구하여라.
+Options :
+A) 120
+B) 125
+C) 130
+D) 135
+E) 140
+Please reason step by step, and you should write the correct option letter (A, B, C, D or E) within \boxed{}."""
 messages = [
-    {"role": "user", "content": "How many golf balls can fit in a school bus?"}
+    {"role": "user", "content": prompt}
 ]
 input_ids = tokenizer.apply_chat_template(
     messages,
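Note that the hunk adds two `prompt = ...` assignments under a single `# Choose your prompt:` comment, so a user who uncomments neither gets the second (Korean MCQA) prompt by default. A small helper — hypothetical, not part of the commit; the keys and the abbreviated prompt strings are our assumptions — makes the selection explicit:

```python
# Hypothetical helper illustrating the "choose your prompt" pattern from the
# Quickstart diff above; prompt bodies are abbreviated here.
PROMPTS = {
    "aime-2024": (
        "Let $x,y$ and $z$ be positive real numbers ... Find $m+n$.\n"
        "Please reason step by step, and put your final answer within \\boxed{}."
    ),
    "csat-2025": (
        "Question : ...\nOptions :\nA) 120\nB) 125\nC) 130\nD) 135\nE) 140\n"
        "Please reason step by step, and you should write the correct option "
        "letter (A, B, C, D or E) within \\boxed{}."
    ),
}

def build_messages(task: str) -> list[dict]:
    """Return a single-turn chat in the shape expected by apply_chat_template."""
    return [{"role": "user", "content": PROMPTS[task]}]

print(build_messages("aime-2024")[0]["role"])  # → user
```

The returned list can be passed straight to `tokenizer.apply_chat_template(...)` as in the diff above.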
@@ -136,7 +155,7 @@ The following table shows the evaluation results of reasoning tasks such as math
 <tr>
 <td>QwQ-32B</td>
 <td>95.5</td>
-<td><strong>79.5</strong> / 86.7</td>
+<td>79.5 / 86.7</td>
 <td><strong>67.1</strong> / 76.7</td>
 <td>94.4</td>
 <td>63.3</td>
@@ -154,7 +173,7 @@ The following table shows the evaluation results of reasoning tasks such as math
 <tr>
 <td>DeepSeek-R1 (671B)</td>
 <td><strong>97.3</strong></td>
-<td>79.8 / 86.7</td>
+<td><strong>79.8</strong> / 86.7</td>
 <td>66.8 / <strong>80.0</strong></td>
 <td>89.9</td>
 <td><strong>71.5</strong></td>
@@ -236,7 +255,7 @@ Please refer to our [EXAONE Deep GitHub](https://github.com/LG-AI-EXAONE/EXAONE-
 ## Quantization
-We provide the pre-quantized EXAONE 3.5 models with **AWQ** and several quantization types in **GGUF** format. Please refer to our [EXAONE Deep collection](https://huggingface.co/collections/LGAI-EXAONE/exaone-deep-67d119918816ec6efa79a4aa) to find corresponding quantized models.
+We provide the pre-quantized EXAONE Deep models with **AWQ** and several quantization types in **GGUF** format. Please refer to our [EXAONE Deep collection](https://huggingface.co/collections/LGAI-EXAONE/exaone-deep-67d119918816ec6efa79a4aa) to find corresponding quantized models.
 ## Usage Guideline

@@ -160,7 +160,6 @@ class ExaoneConfig(PretrainedConfig):
         self.hidden_size = hidden_size
         self.num_layers = num_layers
         self.num_attention_heads = num_attention_heads
-        self.num_layers = num_layers
         if num_key_value_heads is None:
             num_key_value_heads = num_attention_heads
         self.num_key_value_heads = num_key_value_heads