## 🧩 Configuration

```yaml
models:
  - model: yam-peleg/Experiment27-7B
    # No parameters necessary for base model
  - model: cognitivecomputations/samantha-mistral-7b
    parameters:
      weight: 0.3
      density: 0.8
  - model: CorticalStack/shadow-clown-7B-dare
    parameters:
      weight: 0.1
      density: 0.8
  - model: yam-peleg/Experiment26-7B
    parameters:
      weight: 0.6
      density: 0.8
merge_method: dare_ties
base_model: yam-peleg/Experiment27-7B
parameters:
  int8_mask: true
dtype: bfloat16
random_seed: 0
```
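The merge itself can be reproduced with the mergekit CLI. A minimal sketch, assuming the YAML above is saved as `config.yaml` (the file name and output directory `./yam-sam-7B` are illustrative, not part of the original card):

```python
# Run in a notebook. Assumes the merge config above was saved as config.yaml;
# the output path ./yam-sam-7B is an illustrative choice.
!pip install -qU mergekit
!mergekit-yaml config.yaml ./yam-sam-7B --copy-tokenizer
```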
## 💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "mayacinka/yam-sam-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the conversation with the model's chat template
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Build a text-generation pipeline and sample a response
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
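With `do_sample=True`, the `temperature`, `top_k`, and `top_p` arguments control sampling randomness; for more deterministic answers you can lower `temperature` or set `do_sample=False`. Note that loading a 7B model in `float16` takes roughly 14 GB of memory, so with `device_map="auto"` accelerate may offload part of the model to CPU on smaller GPUs.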