lostella committed on
Commit: d93a81f
Parent(s): 7ed1189

add example, fix table

Files changed (1)
  1. README.md +44 -4
README.md CHANGED
@@ -26,13 +26,53 @@ Chronos-T5 uses 4096 different tokens, compared to 32128 of the original T5 mode
  Model | Parameters | Based on
  ----------------|-------------------|----------------------
  [chronos-t5-mini](https://huggingface.co/amazon/chronos-t5-mini) | 20M | [t5-efficient-mini](https://huggingface.co/google/t5-efficient-mini)
- [chronos-t5-small](https://huggingface.co/amazon/chronos-t5-small) | 46M | [flan-t5-small](https://huggingface.co/google/flan-t5-small)
- [chronos-t5-base](https://huggingface.co/amazon/chronos-t5-base) | 200M | [flan-t5-base](https://huggingface.co/google/flan-t5-base)
- [chronos-t5-large](https://huggingface.co/amazon/chronos-t5-large) | 710M | [flan-t5-large](https://huggingface.co/google/flan-t5-large)
+ [chronos-t5-small](https://huggingface.co/amazon/chronos-t5-small) | 46M | [t5-efficient-small](https://huggingface.co/google/t5-efficient-small)
+ [chronos-t5-base](https://huggingface.co/amazon/chronos-t5-base) | 200M | [t5-efficient-base](https://huggingface.co/google/t5-efficient-base)
+ [chronos-t5-large](https://huggingface.co/amazon/chronos-t5-large) | 710M | [t5-efficient-large](https://huggingface.co/google/t5-efficient-large)

  ## Usage

- To do inference with Chronos models, refer to the code and examples in the [companion GitHub repo](https://www.example.com/).
+ To do inference with Chronos models, you will need to install the code from the [companion GitHub repo](https://www.example.com/).
+
+ ```bash
+ pip install git+https://github.com/amazon-science/chronos-forecasting.git
+ ```
+
+ A minimal example:
+
+ ```python
+ import numpy as np
+ import pandas as pd
+ import matplotlib.pyplot as plt
+ import torch
+ from chronos import ChronosPipeline
+
+ pipeline = ChronosPipeline.from_pretrained("amazon/chronos-t5-base")
+
+ df = pd.read_csv(
+     "https://raw.githubusercontent.com/AileenNielsen/"
+     "TimeSeriesAnalysisWithPython/master/data/AirPassengers.csv",
+     index_col=0,
+     parse_dates=True,
+ )
+
+ context = torch.Tensor(df["#Passengers"].values)
+ forecast = pipeline.predict(context, prediction_length=12)
+
+ forecast_steps = range(len(df), len(df) + 12)
+ forecast_np = forecast.numpy()[0].T
+ low = np.quantile(forecast_np, 0.1, axis=1)
+ median = np.quantile(forecast_np, 0.5, axis=1)
+ high = np.quantile(forecast_np, 0.9, axis=1)
+
+ plt.plot(range(len(df)), df["#Passengers"], color="royalblue", label="historical data")
+ plt.plot(forecast_steps, forecast_np, color="grey", alpha=0.1)
+ plt.fill_between(forecast_steps, low, high, color="tomato", alpha=0.4, label="80% interval")
+ plt.plot(forecast_steps, median, color="tomato", label="median")
+ plt.legend()
+ plt.grid()
+ plt.show()
+ ```

  ## References

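As a follow-up to the example added in this commit: the indexing `forecast.numpy()[0].T` suggests that `pipeline.predict` returns sample paths laid out as `[series, sample, step]`, and the 0.1/0.5/0.9 quantiles across those samples are what produce the plotted median and "80% interval" band. Below is a minimal sketch along those lines using the `chronos-t5-mini` checkpoint from the table above; the `num_samples` argument and the exact output layout are assumptions inferred from the example, not something this commit confirms.

```python
import numpy as np
import torch
from chronos import ChronosPipeline

# Smallest checkpoint from the table above (20M parameters).
pipeline = ChronosPipeline.from_pretrained("amazon/chronos-t5-mini")

# Toy context series standing in for real historical data.
context = torch.tensor(np.sin(np.arange(120) / 6.0), dtype=torch.float32)

# Assumption: predict() accepts num_samples and returns sample paths
# indexed as [series, sample, step], matching the README example above.
forecast = pipeline.predict(context, prediction_length=12, num_samples=20)

samples = forecast.numpy()[0]        # shape: [num_samples, prediction_length]
low, median, high = np.quantile(samples, [0.1, 0.5, 0.9], axis=0)
print(median.round(2))               # one point forecast per future step
```

The other checkpoints in the table should be drop-in replacements here; only the model name passed to `from_pretrained` changes.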