ModelHub XC ddf6c4ec19 Initial commit; model provided by the ModelHub XC community
Model: facebook/chameleon-7b
Source: Original Platform
2026-04-09 13:28:19 +08:00

---
license: other
license_name: chameleon-research-license
license_link: https://ai.meta.com/resources/models-and-libraries/chameleon-license/
extra_gated_prompt: "### META CHAMELEON RESEARCH LICENSE AGREEMENT"
extra_gated_fields:
  First Name: text
  Last Name: text
  Date of birth: date_picker
  Country: country
  Affiliation: text
  I accept the terms and conditions: checkbox
  geo: ip_location
extra_gated_description: Meta Chameleon Research License and Acceptable Use Policy
extra_gated_button_content: I Accept Meta Chameleon Research License and AUP
pipeline_tag: image-text-to-text
---

# Meta Chameleon 7B

This is the repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR. See the Chameleon paper for more information.

The Chameleon collection on HuggingFace contains 7 billion parameter and 30 billion parameter model checkpoints.

[more details and usage examples coming soon]
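In the meantime, here is a minimal, unofficial sketch of running image-text-to-text generation with the Hugging Face `transformers` Chameleon integration (`ChameleonProcessor` / `ChameleonForConditionalGeneration`). It assumes a recent `transformers` release with Chameleon support, a GPU with bfloat16 support, and that you have accepted the gated license for `facebook/chameleon-7b`; the image path and generation settings are illustrative.

```python
# Unofficial usage sketch: assumes transformers with Chameleon support,
# accepted license gating for facebook/chameleon-7b, and a bfloat16-capable GPU.
import torch
from PIL import Image
from transformers import ChameleonForConditionalGeneration, ChameleonProcessor

processor = ChameleonProcessor.from_pretrained("facebook/chameleon-7b")
model = ChameleonForConditionalGeneration.from_pretrained(
    "facebook/chameleon-7b",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Chameleon is early-fusion: image and text tokens share one token sequence,
# so the prompt interleaves an <image> placeholder with the text.
image = Image.open("example.jpg")  # hypothetical local RGB image
inputs = processor(
    text="What is shown in this image?<image>",
    images=image,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(**inputs, max_new_tokens=40)
print(processor.decode(output_ids[0], skip_special_tokens=True))
```

Because the checkpoint is gated, `from_pretrained` will only succeed after the license form on the model page has been accepted with an authenticated Hugging Face account.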

## Citation

To cite the paper, model, or software, please use the following:

```bibtex
@article{Chameleon_Team_Chameleon_Mixed-Modal_Early-Fusion_2024,
  author  = {Chameleon Team},
  doi     = {10.48550/arXiv.2405.09818},
  journal = {arXiv preprint arXiv:2405.09818},
  title   = {Chameleon: Mixed-Modal Early-Fusion Foundation Models},
  url     = {https://github.com/facebookresearch/chameleon},
  year    = {2024}
}
```

## License

Use of this repository and related resources is governed by the Chameleon Research License and this repository's LICENSE file.
