```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: shadowml/BeagleSempra-7B
    parameters:
      density: 0.65
      weight: 0.4
  - model: shadowml/BeagSake-7B
    parameters:
      density: 0.6
      weight: 0.35
  - model: shadowml/WestBeagle-7B
    parameters:
      density: 0.6
      weight: 0.35
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: float16
```
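To give an intuition for what the `density` and `weight` parameters above control, here is a toy NumPy sketch of the drop-and-rescale step that DARE applies to each model's task vector (its delta from the base model) before the weighted sum. It is illustrative only: the arrays stand in for real weight tensors, and the TIES-style sign-election step that `dare_ties` also performs is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def dare(delta, density, rng):
    # Keep each delta parameter with probability `density`, drop the rest,
    # and rescale the survivors by 1/density so the expected value is preserved.
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

# Toy 1-D "weights": a base model and two fine-tuned variants.
base = np.zeros(8)
ft_a = base + rng.normal(size=8)  # stands in for one fine-tuned model
ft_b = base + rng.normal(size=8)  # stands in for another

# Sparsify each task vector, then add the weighted sum back to the base,
# mirroring the density/weight pairs in the config above.
merged = base + 0.4 * dare(ft_a - base, 0.65, rng) + 0.35 * dare(ft_b - base, 0.6, rng)
```

Because the surviving deltas are rescaled by `1/density`, each merged tensor keeps roughly the same magnitude of contribution from every source model even though most of its delta parameters are zeroed out.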
💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "mlabonne/OmniBeagle-7B"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```