```yaml
models:
  - model: udkai/Turdus
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: decruz07/kellemar-DPO-Orca-Distilled-7B-SLERP
    parameters:
      density: 0.5
      weight: [0, 0.3, 0.7, 1] # weight gradient
  - model: liminerity/Blur-7b-v1.2
    parameters:
      density: 0.33
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: fblgit/UNA-TheBeagle-7b-v1
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
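To reproduce the merge, the configuration above can be fed to mergekit. Below is a minimal sketch, assuming mergekit is installed (`pip install mergekit`) and the config is saved as `config.yaml`; the `mergekit-yaml` entry point and the `--copy-tokenizer` flag are assumptions that may vary by mergekit version, so check `mergekit-yaml --help` on your install.

```python
import subprocess

# Run the TIES merge described by config.yaml and write the merged
# weights to ./merge (add --cuda if a GPU is available).
subprocess.run(
    ["mergekit-yaml", "config.yaml", "merge", "--copy-tokenizer"],
    check=True,
)
```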
💻 Usage
```python
# Install dependencies (quiet, upgrade if already present)
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "liminerity/Blur-7b-v1.21"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the chat messages with the model's chat template
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Build a text-generation pipeline and sample a response
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```