timesfm
A Claude skill for time-series forecasting, leveraging Google Research's TimesFM foundation model to predict future data points. It provides a powerful, pre-trained model for various forecasting tasks, supporting long context lengths and continuous quantile forecasts. The model can be fine-tuned with LoRA and integrates with Google Cloud products such as BigQuery ML and Google Sheets.
- Forecasts time series data using Google's TimesFM 2.5 model
- Supports up to 16k context length for long-range predictions
- Provides continuous quantile forecasts for uncertainty estimation
- Fine-tuning example with HuggingFace Transformers and PEFT (LoRA)
- Integrates with Claude Code via a dedicated SKILL.md definition
README
TimesFM
TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.
- Paper: A decoder-only foundation model for time-series forecasting, ICML 2024.
- All checkpoints: TimesFM Hugging Face Collection.
- Google Research blog.
- TimesFM in Google 1P Products:
- BigQuery ML: Enterprise-level SQL queries for scalability and reliability.
- Google Sheets: For your daily spreadsheet.
- Vertex Model Garden: Dockerized endpoint for agentic calling.
This open version is not an officially supported Google product.
Latest Model Version: TimesFM 2.5
Archived Model Versions:
- 1.0 and 2.0: relevant code archived in the subdirectory `v1`. You can `pip install timesfm==1.3.0` to install an older version of this package to load them.
Update - Apr. 9, 2026
Added fine-tuning example using HuggingFace Transformers + PEFT (LoRA) — see
timesfm-forecasting/examples/finetuning/.
Also added unit tests (tests/) and incorporated several community fixes.
Shoutout to @kashif and @darkpowerxo.
Update - Mar. 19, 2026
Huge shoutout to @borealBytes for adding the support for AGENTS! TimesFM SKILL.md is out.
Update - Oct. 29, 2025
Added back the covariate support through XReg for TimesFM 2.5.
Update - Sept. 15, 2025
TimesFM 2.5 is out!
Compared to TimesFM 2.0, the new 2.5 model:
- uses 200M parameters, down from 500M.
- supports up to 16k context length, up from 2048.
- supports continuous quantile forecast up to 1k horizon via an optional 30M quantile head.
- gets rid of the `frequency` indicator.
- has a couple of new forecasting flags.
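With a 16k context window, most histories can be passed to the model directly; for longer series, only the most recent `max_context` points would be kept. A minimal numpy sketch of that truncation step (illustrative only, not the library's internal logic):

```python
import numpy as np

MAX_CONTEXT = 16_384  # TimesFM 2.5's maximum context length

def trim_context(series: np.ndarray, max_context: int = MAX_CONTEXT) -> np.ndarray:
    """Keep only the most recent `max_context` points of a 1-D series."""
    return series[-max_context:]

long_series = np.arange(20_000, dtype=np.float32)
context = trim_context(long_series)
print(context.shape)  # (16384,)
print(context[0])     # 3616.0 -- the oldest point that survives truncation
```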
Since the Sept. 2025 launch, the following improvements have been completed:
- ✅ Flax version of the model for faster inference.
- ✅ Covariate support via XReg (see Oct. 2025 update).
- ✅ Documentation, examples, and agent skill (see `timesfm-forecasting/`).
- ✅ Fine-tuning example with LoRA via HuggingFace Transformers + PEFT (see `timesfm-forecasting/examples/finetuning/`).
- ✅ Unit tests for core layers, configs, and utilities (see `tests/`).
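The fine-tuning example uses HuggingFace PEFT, whose LoRA adapters leave a pretrained weight matrix W frozen and learn a low-rank update (alpha / r) * B @ A on top of it. A minimal numpy sketch of that idea (shapes and names are illustrative; this is not the repo's training code):

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r, alpha = 64, 64, 8, 16  # illustrative sizes; rank r << d

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection, zero-init

def lora_forward(x: np.ndarray) -> np.ndarray:
    """y = W x + (alpha / r) * B (A x); only A and B receive gradients."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
# With B zero-initialized, the adapter starts as an exact no-op:
assert np.allclose(lora_forward(x), W @ x)
```

Zero-initializing B is the standard LoRA trick: training starts from the pretrained model's behavior and only gradually departs from it.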
Install
Clone the repository:

```shell
git clone https://github.com/google-research/timesfm.git
cd timesfm
```

Create a virtual environment and install dependencies using `uv`:

```shell
# Create a virtual environment
uv venv
# Activate the environment
source .venv/bin/activate
# Install the package in editable mode with torch
uv pip install -e .[torch]
# Or with flax
uv pip install -e .[flax]
# Or if XReg is needed
uv pip install -e .[xreg]
```

[Optional] Install your preferred `torch`/`jax` backend based on your OS and accelerators (CPU, GPU, TPU, or Apple Silicon):
- Install PyTorch.
- Install Jax for Flax.
Code Example
```python
import torch
import numpy as np
import timesfm

torch.set_float32_matmul_precision("high")
model = timesfm.TimesFM_2p5_200M_torch.from_pretrained("google/timesfm-2.5-200m-pytorch")
model.compile(
    timesfm.ForecastConfig(
        max_context=1024,
        max_horizon=256,
        normalize_inputs=True,
        use_continuous_quantile_head=True,
        force_flip_invariance=True,
        infer_is_positive=True,
        fix_quantile_crossing=True,
    )
)
point_forecast, quantile_forecast = model.forecast(
    horizon=12,
    inputs=[
        np.linspace(0, 1, 100),
        np.sin(np.linspace(0, 20, 67)),
    ],  # Two dummy inputs
)
point_forecast.shape     # (2, 12)
quantile_forecast.shape  # (2, 12, 10): mean, then 10th to 90th quantiles.
```
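Given the shape comment above, the last axis of `quantile_forecast` stacks the mean followed by the 10th through 90th percentiles, so a central 80% prediction interval is just the first and last quantile slices. A small numpy sketch on dummy data of the same shape (the indexing convention is inferred from that comment, not quoted from the library's docs):

```python
import numpy as np

# Dummy stand-in for quantile_forecast, shape (batch=2, horizon=12, 10):
# index 0 = mean, indices 1..9 = 10th..90th percentiles.
rng = np.random.default_rng(0)
mean = rng.normal(size=(2, 12, 1))
offsets = np.linspace(-1.0, 1.0, 9)                  # fixed percentile offsets
q = np.concatenate([mean, mean + offsets], axis=-1)  # (2, 12, 10)

point = q[..., 0]   # mean forecast per series and step
lo_10 = q[..., 1]   # 10th percentile
hi_90 = q[..., 9]   # 90th percentile

# Central 80% prediction interval width per series and step:
width = hi_90 - lo_10
print(point.shape, width.shape)  # (2, 12) (2, 12)
```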