```yaml
models:
  - model: Kukedlc/Neural-Cosmic-7B-slerp
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: Kukedlc/NeuralLogic-7B-V
    parameters:
      density: 0.5
      weight: [0, 0.3, 0.7, 1] # weight gradient
  - model: Kukedlc/SuperCombo
    parameters:
      density: 0.33
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
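Before handing the config to mergekit, it can be worth parsing it once to confirm the structure is valid YAML and the per-layer gradients are what you intended. A minimal sketch, assuming PyYAML is installed (the `CONFIG` string simply mirrors the config above and is not a separate file):

```python
# Sanity-check the merge config: parse it and inspect the gradient lists.
# Assumption: PyYAML is available (pip install pyyaml).
import yaml

CONFIG = """
models:
  - model: Kukedlc/Neural-Cosmic-7B-slerp
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: Kukedlc/NeuralLogic-7B-V
    parameters:
      density: 0.5
      weight: [0, 0.3, 0.7, 1] # weight gradient
  - model: Kukedlc/SuperCombo
    parameters:
      density: 0.33
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
  int8_mask: true
dtype: float16
"""

cfg = yaml.safe_load(CONFIG)
print(cfg["merge_method"])
print([m["model"] for m in cfg["models"]])
print(cfg["models"][0]["parameters"]["density"])
```

A list value such as `density: [1, 0.7, 0.1]` is a gradient: mergekit interpolates it across the model's layers, so early layers keep more of the source model than later ones.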
## 💻 Usage
```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "Kukedlc/Neural-Cosmic-Boy-7B-slerp"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```