```yaml
models:
  - model: Gille/StrangeMerges_51-7B-dare_ties
    # No parameters necessary for base model
  - model: WizardLM/WizardMath-7B-V1.1
    parameters:
      density: 0.66
      weight: 0.2
  - model: AurelPx/Percival_01-7b-slerp
    parameters:
      density: 0.55
      weight: 0.2
  - model: Weyaxi/Einstein-v4-7B
    parameters:
      density: 0.55
      weight: 0.2
  - model: Kukedlc/NeuralMaths-Experiment-7b
    parameters:
      density: 0.44
      weight: 0.2
  - model: Gille/StrangeMerges_35-7B-slerp
    parameters:
      density: 0.66
      weight: 0.2
merge_method: dare_ties
base_model: Gille/StrangeMerges_51-7B-dare_ties
parameters:
  int8_mask: true
dtype: bfloat16
```
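To give an intuition for what the `density` and `weight` parameters above do: in DARE-style merging, each model's task vector (its delta from the base model) is randomly sparsified, keeping roughly a `density` fraction of entries and rescaling the survivors by `1/density` to preserve the expected magnitude, then blended into the base with the given `weight`. A minimal NumPy sketch of this idea (not the mergekit implementation; `dare_sparsify` and the toy tensors are illustrative only):

```python
import numpy as np

def dare_sparsify(delta, density, seed=0):
    # Keep ~`density` fraction of delta entries at random,
    # rescale kept entries by 1/density so the expected sum is unchanged.
    rng = np.random.default_rng(seed)
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

# Toy example: one parameter tensor from base and fine-tuned models.
base = np.zeros(8)
finetuned = np.array([0.4, -0.2, 0.1, 0.3, -0.5, 0.2, 0.0, 0.6])
delta = finetuned - base

sparse = dare_sparsify(delta, density=0.66)
merged = base + 0.2 * sparse  # weight: 0.2, as in the config above
```

In the real merge, mergekit applies this per-tensor across all six models and resolves sign conflicts TIES-style before summing.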
## 💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "Gille/StrangeMerges_52-7B-dare_ties"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the chat messages with the model's chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```