I followed the script in https://towardsdatascience.com/time-llm-reprogram-an-llm-for-time-series-forecasting-e2558087b8ac to predict with Time-LLM using GPT-2. My code is exactly the same as the script's, but I got a terrible result.
My result:
script: https://towardsdatascience.com/time-llm-reprogram-an-llm-for-time-series-forecasting-e2558087b8ac
code:
```python
import time
import numpy as np
import pandas as pd
import pytorch_lightning as pl
import matplotlib.pyplot as plt

from neuralforecast import NeuralForecast
from neuralforecast.models import TimeLLM
from neuralforecast.losses.pytorch import MAE
from neuralforecast.tsdataset import TimeSeriesDataset
from neuralforecast.utils import AirPassengers, AirPassengersPanel, AirPassengersStatic, augment_calendar_df

from transformers import GPT2Config, GPT2Model, GPT2Tokenizer

AirPassengersPanel, calendar_cols = augment_calendar_df(df=AirPassengersPanel, freq='M')

Y_train_df = AirPassengersPanel[AirPassengersPanel.ds < AirPassengersPanel['ds'].values[-12]]
Y_test_df = AirPassengersPanel[AirPassengersPanel.ds >= AirPassengersPanel['ds'].values[-12]].reset_index(drop=True)

gpt2_config = GPT2Config.from_pretrained('openai-community/gpt2')
gpt2 = GPT2Model.from_pretrained('openai-community/gpt2', config=gpt2_config)
gpt2_tokenizer = GPT2Tokenizer.from_pretrained('openai-community/gpt2')

prompt_prefix = "The dataset contains data on monthly air passengers. There is a yearly seasonality"

timellm = TimeLLM(h=12,
                  input_size=36,
                  llm=gpt2,
                  llm_config=gpt2_config,
                  llm_tokenizer=gpt2_tokenizer,
                  prompt_prefix=prompt_prefix,
                  max_steps=100,
                  batch_size=24,
                  windows_batch_size=24)

nf = NeuralForecast(models=[timellm], freq='M')
nf.fit(df=Y_train_df, val_size=12)
forecasts = nf.predict(futr_df=Y_test_df)
```
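To quantify how far off the forecasts are, a numeric error metric is more useful than eyeballing a plot. A minimal sketch of mean absolute error, with only numpy (the commented usage line assumes `nf.predict` returns a DataFrame with a `TimeLLM` column, which is an assumption about the output naming):

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error between two equal-length sequences."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_true - y_pred)))

# Hypothetical usage with the variables above (column name is assumed):
# score = mae(Y_test_df['y'], forecasts['TimeLLM'])

print(mae([100, 110, 120], [105, 115, 110]))  # mean of |5|, |5|, |10|
```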
Can somebody tell me why my code produces such an awful outcome?