```yaml
slices:
  - sources:
      - model: eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2
        layer_range: [0, 32]
      - model: yam-peleg/Experiment26-7B
        layer_range: [0, 32]
base_model: yam-peleg/Experiment26-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
merge_method: slerp
dtype: bfloat16
```
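The `t` values above control how far each layer's weights are interpolated from the base model toward the other model along a spherical path. As a rough illustration (not mergekit's actual implementation, which handles tensors per-layer and in bfloat16), spherical linear interpolation between two weight tensors can be sketched like this:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Illustrative spherical linear interpolation between two tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move along
    the arc between the two (normalized) directions rather than a
    straight line, which is the idea behind the `slerp` merge method.
    """
    v0_flat, v1_flat = v0.ravel(), v1.ravel()
    # Angle between the two weight directions
    v0_n = v0_flat / (np.linalg.norm(v0_flat) + eps)
    v1_n = v1_flat / (np.linalg.norm(v1_flat) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if abs(theta) < eps:
        # Nearly colinear: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)
```

With the config above, `self_attn` tensors use a per-layer-group schedule `[0, 0.5, 0.3, 0.7, 1]` (early layers stay close to the base model, late layers close to the other model), `mlp` tensors use the mirrored schedule, and all remaining tensors use a flat `t = 0.5`.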
## 💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "mayacinka/yam-jom-7B-slerp"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Build a chat-formatted prompt from the messages
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(
    prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95
)
print(outputs[0]["generated_text"])
```