---
license: other
license_name: chameleon-research-license
license_link: https://ai.meta.com/resources/models-and-libraries/chameleon-license/
extra_gated_prompt: "### META CHAMELEON RESEARCH LICENSE AGREEMENT"
extra_gated_fields:
extra_gated_description: Meta Chameleon Research License and Acceptable Use Policy
extra_gated_button_content: I Accept Meta Chameleon Research License and AUP
pipeline_tag: image-text-to-text
---
# Meta Chameleon 7B
Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR. See the Chameleon paper for more information.
The Chameleon collection on Hugging Face contains 7-billion-parameter and 30-billion-parameter model checkpoints.
[more details and usage examples coming soon]
## Citation

To cite the paper, model, or software, please use the following:
```bibtex
@article{Chameleon_Team_Chameleon_Mixed-Modal_Early-Fusion_2024,
  author  = {Chameleon Team},
  doi     = {10.48550/arXiv.2405.09818},
  journal = {arXiv preprint arXiv:2405.09818},
  title   = {Chameleon: Mixed-Modal Early-Fusion Foundation Models},
  url     = {https://github.com/facebookresearch/chameleon},
  year    = {2024}
}
```
## License
Use of this repository and related resources is governed by the Chameleon Research License and this repository's LICENSE file.