---
library_name: transformers
tags:
- time series
- multimodal
- TimeSeries-Text-to-Text
license: apache-2.0
---

# Mists-7B-v0.1-not-trained

Mists (**Mis**tral **T**ime **S**eries) is a multimodal model that combines a language model with a time series model; a conceptual sketch of how the two are connected follows the list of base models below.

This model is based on the following models:

- [mistralai/Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3)
- [AutonLab/MOMENT-1-large](https://huggingface.co/AutonLab/MOMENT-1-large)
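
The exact wiring is defined in the Mists repository used below, but conceptually the model appears to follow a LLaVA-style recipe: the MOMENT encoder turns the raw series into patch embeddings, a small projector maps them into Mistral's hidden size, and the projected embeddings are placed at the `<time_series>` token in the prompt. The sketch below only illustrates that idea; the module name `TimeSeriesProjector` and the hidden sizes (1024 for MOMENT-1-large, 4096 for Mistral-7B) are assumptions, not the actual API.

```Python
# Hypothetical sketch of the time-series-to-LLM connector; the real
# implementation lives in Mists/modeling_mists.py and may differ.
import torch
import torch.nn as nn


class TimeSeriesProjector(nn.Module):
    """Maps time series encoder features into the language model's embedding space."""

    def __init__(self, ts_hidden_size: int = 1024, lm_hidden_size: int = 4096):
        # 1024 ~ MOMENT-1-large hidden size, 4096 ~ Mistral-7B hidden size (assumed).
        super().__init__()
        self.linear_1 = nn.Linear(ts_hidden_size, lm_hidden_size)
        self.act = nn.GELU()
        self.linear_2 = nn.Linear(lm_hidden_size, lm_hidden_size)

    def forward(self, ts_features: torch.Tensor) -> torch.Tensor:
        # (batch, num_patches, ts_hidden) -> (batch, num_patches, lm_hidden);
        # these embeddings stand in for the <time_series> placeholder token.
        return self.linear_2(self.act(self.linear_1(ts_features)))
```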

This is an experimental model. It has some limitations and is not yet suitable for practical use.

## How to load the model

```Python
# Install the MOMENT time series encoder package and an up-to-date transformers,
# then clone the Mists modeling code.
!pip install git+https://github.com/Hajime-Y/moment.git
!pip install -U transformers
!git clone https://github.com/Hajime-Y/Mists.git
```

```Python
import torch

# These imports assume the working directory contains the `Mists/` repository
# cloned above.
from Mists.configuration_mists import MistsConfig
from Mists.modeling_mists import MistsForConditionalGeneration
from Mists.processing_mists import MistsProcessor

model_id = "HachiML/Mists-7B-v0.1-not-trained"

# Load the multimodal model in bfloat16 and move it to the GPU.
model = MistsForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
).to("cuda")

# The processor prepares both the text prompt and the raw time series tensor.
processor = MistsProcessor.from_pretrained(model_id)
```
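
As a quick, optional sanity check (this assumes `MistsForConditionalGeneration` follows the standard transformers `PreTrainedModel` interface, which provides these attributes):

```Python
# Optional sanity check after loading (assumes the standard
# transformers PreTrainedModel interface).
print(model.config.model_type)                        # registered architecture name
print(f"{model.num_parameters() / 1e9:.2f}B parameters")
print(next(model.parameters()).device, next(model.parameters()).dtype)
```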

```Python
import pandas as pd
import torch

# Load the NASDAQ price history and keep the Open/High/Low/Close/Volume columns.
hist_ndaq_512 = pd.read_csv("nasdaq_price_history.csv")
time_series_data = torch.tensor(hist_ndaq_512[["Open", "High", "Low", "Close", "Volume"]].values, dtype=torch.float)
# Reshape to (batch, n_channels, seq_len): here (1, 5, num_rows).
time_series_data = time_series_data.t().unsqueeze(0)

prompt = "USER: <time_series>\nWhat are the features of this data?\nASSISTANT:"
inputs = processor(prompt, time_series_data, return_tensors='pt').to("cuda", torch.float32)

output = model.generate(**inputs, max_new_tokens=200, do_sample=False)
print(processor.decode(output[0], skip_special_tokens=True))
```