<!--Copyright 2025 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.

⚠️ Note that this file is in Markdown but contains specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.

-->
*This model was released on 2023-10-14 and added to Hugging Face Transformers on 2025-04-16.*

# TimesFM

<div class="flex flex-wrap space-x-1">
<img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
</div>

## Overview

TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model proposed in [A decoder-only foundation model for time-series forecasting](https://huggingface.co/papers/2310.10688) by Abhimanyu Das, Weihao Kong, Rajat Sen, and Yichen Zhou. It is a decoder-only model that takes non-overlapping patches of a time series as input and predicts future patches of a fixed output length autoregressively.

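As a rough illustration of the patching idea (not the model's exact implementation; the patch length below is made up, while the released checkpoints use their own configured input/output patch sizes), a series can be split into fixed-length, non-overlapping patches like this:

```python
import numpy as np

# Hypothetical patch length, for illustration only
patch_len = 32

series = np.sin(np.linspace(0, 20, 128))

# Split into non-overlapping patches of shape (num_patches, patch_len)
patches = series.reshape(-1, patch_len)
print(patches.shape)  # (4, 32)
```

Each such patch plays a role analogous to a token in a language model: the decoder attends over the input patches and emits output patches one step at a time.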
The abstract from the paper is the following:

*Motivated by recent advances in large language models for Natural Language Processing (NLP), we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models for each individual dataset. Our model is based on pretraining a patched-decoder style attention model on a large time-series corpus, and can work well across different forecasting history lengths, prediction lengths and temporal granularities.*

This model was contributed by [kashif](https://huggingface.co/kashif).
The original code can be found [here](https://github.com/google-research/timesfm).

To use the model:

```python
import numpy as np
import torch
from transformers import TimesFmModelForPrediction

model = TimesFmModelForPrediction.from_pretrained(
    "google/timesfm-2.0-500m-pytorch",
    dtype=torch.bfloat16,
    attn_implementation="sdpa",
    device_map="auto",
)

# Create dummy inputs
forecast_input = [
    np.sin(np.linspace(0, 20, 100)),
    np.sin(np.linspace(0, 20, 200)),
    np.sin(np.linspace(0, 20, 400)),
]
# One frequency category per series: 0 = high, 1 = medium, 2 = low frequency
frequency_input = [0, 1, 2]

# Convert inputs to a sequence of tensors
forecast_input_tensor = [
    torch.tensor(ts, dtype=torch.bfloat16).to(model.device)
    for ts in forecast_input
]
frequency_input_tensor = torch.tensor(frequency_input, dtype=torch.long).to(model.device)

# Get predictions from the pretrained model
with torch.no_grad():
    outputs = model(past_values=forecast_input_tensor, freq=frequency_input_tensor, return_dict=True)
    point_forecast_conv = outputs.mean_predictions.float().cpu().numpy()
    quantile_forecast_conv = outputs.full_predictions.float().cpu().numpy()
```

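The arrays returned above can be sliced like any NumPy forecast. The sketch below uses a random stand-in for `quantile_forecast_conv`, since the real shapes depend on the checkpoint's configured horizon, and the index convention (mean first, then the 0.1–0.9 quantiles along the last axis) is an assumption here, following how the original TimesFM quantile head is described:

```python
import numpy as np

# Stand-in for quantile_forecast_conv: (batch, horizon, outputs).
# Assumed layout: index 0 is the mean, indices 1..9 the 0.1..0.9 quantiles.
full = np.random.rand(3, 128, 10)

mean_forecast = full[..., 0]                # point forecast
median_forecast = full[..., 5]              # 0.5 quantile (assumed index)
lower, upper = full[..., 1], full[..., 9]   # 80% interval (0.1 / 0.9 quantiles)

print(mean_forecast.shape)  # (3, 128)
```

If in doubt, check the last dimension of `full_predictions` against the model config rather than relying on this index convention.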
## TimesFmConfig

[[autodoc]] TimesFmConfig

## TimesFmModel

[[autodoc]] TimesFmModel
    - forward

## TimesFmModelForPrediction

[[autodoc]] TimesFmModelForPrediction
    - forward