In regression models, we may also have additional covariates that are measured sequentially over time. We may want to regress the y_t time series on other covariates that are also measured over time. The simplest possible case is the dynamic simple regression model. Here I have a single covariate x_t, and the observation equation is y_t = beta_{t,0} + beta_{t,1} x_t + epsilon_t, with epsilon_t ~ N(0, v_t). This is just the usual simple linear regression, with an intercept and a slope, except that both the intercept and the slope are time-varying.

I also need to specify the evolution of the two components, and we are going to use a random walk: beta_{t,0} = beta_{t-1,0} + omega_{t,0} and beta_{t,1} = beta_{t-1,1} + omega_{t,1}. We could use other structures, but again, in the normal linear case, these are the evolution equations we will use. I can then collect the omegas into a single vector omega_t with those two components, so that omega_t ~ N(0, W_t), where W_t is a two-by-two variance-covariance matrix.

Writing this as a DLM, F is now time-varying: it changes depending on the value of x_t, and I can write F_t' = (1, x_t). My state vector is theta_t = (beta_{t,0}, beta_{t,1})', so I have those two components. The G matrix is the identity: the first component at time t is related to the first component at time t-1, and the second component at time t is related to the second component at time t-1.

Therefore, the forecast function in the simple dynamic regression case is f_t(h) = F_{t+h}' G^h E(theta_t | D_t). Since G is the identity and E(theta_t | D_t) is a two-dimensional vector, I can write this down as f_t(h) = K_{t,0} + K_{t,1} x_{t+h}. Note that the covariate enters at time t+h: we are evaluating the forecast function h steps ahead, so we need the covariate evaluated at t+h. The forecast function again has a form that depends on the covariate at that future time.

In the general dynamic regression model, we have a set of covariates, say p of them: x_{t,1}, ..., x_{t,p}. The observation equation is y_t = beta_{t,0} + beta_{t,1} x_{t,1} + ... + beta_{t,p} x_{t,p} + epsilon_t. Instead of a single covariate, I now have p of them, each with its own coefficient, and I may also have the intercept beta_{t,0}. The parameter vector theta_t = (beta_{t,0}, beta_{t,1}, ..., beta_{t,p})' is therefore (p+1)-dimensional, and G is again the identity, now of dimension (p+1) by (p+1). F_t is also a (p+1)-dimensional vector: the first entry is one, followed by x_{t,1} through x_{t,p}. The forecast function is similar to the one before, but now with more than one covariate: f_t(h) = K_{t,0} + K_{t,1} x_{t+h,1} + ... + K_{t,p} x_{t+h,p}. This is the case of the general dynamic regression.
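To make this concrete, here is a minimal sketch in Python (assuming only numpy, which is not part of the lecture) that simulates from the dynamic simple regression model and evaluates the forecast function. The variances v and W, the seed, and the future covariate value are arbitrary illustrative choices.

```python
import numpy as np

# Dynamic simple regression DLM:
#   y_t    = beta_{t,0} + beta_{t,1} x_t + eps_t,  eps_t   ~ N(0, v)
#   beta_t = beta_{t-1} + omega_t,                 omega_t ~ N(0, W)
# i.e. F_t' = (1, x_t), theta_t = (beta_{t,0}, beta_{t,1})', G = I.
rng = np.random.default_rng(42)

T = 200
x = rng.normal(size=T)           # single covariate observed over time
v = 0.25                         # observational variance (held constant here)
W = np.diag([0.01, 0.01])        # 2x2 evolution covariance (held constant here)

beta = np.zeros((T, 2))
beta[0] = np.array([1.0, 2.0])   # initial intercept and slope
y = np.zeros(T)

for t in range(T):
    if t > 0:
        # random walk evolution: theta_t = theta_{t-1} + omega_t
        beta[t] = beta[t - 1] + rng.multivariate_normal(np.zeros(2), W)
    F_t = np.array([1.0, x[t]])  # F_t' = (1, x_t)
    y[t] = F_t @ beta[t] + rng.normal(scale=np.sqrt(v))

# Forecast function: with G = I, f_t(h) = K_{t,0} + K_{t,1} x_{t+h},
# where K_t = E(theta_t | D_t). Here the final simulated beta stands in
# for that posterior mean, and x_future is a hypothetical x_{t+h}.
K = beta[-1]
x_future = 0.5
print("f_t(h) =", K[0] + K[1] * x_future)
```

Note that, because G is the identity, h enters the forecast function only through the future covariate value x_{t+h}: the coefficients themselves are not expected to move from their current estimates.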
One particular example of a dynamic regression model is the case of a time-varying autoregressive process. This brings us back to the autoregressive processes that we were discussing earlier in the course. When each of the covariates corresponds to past values of the series itself, you have a regression model that we call a time-varying AR(p).

In this case, the observation equation is going to have the AR coefficients, but the AR coefficients are varying over time: y_t = phi_{t,1} y_{t-1} + ... + phi_{t,p} y_{t-p} + epsilon_t. We assume that we put all the coefficients together and use a random walk evolution equation for them. If I call phi_t the vector that contains all the coefficients from one to p, phi_t = (phi_{t,1}, ..., phi_{t,p})', then the evolution equation is phi_t = phi_{t-1} + omega_t. Here omega_t is a p-dimensional vector with omega_t ~ N(0, W_t), and epsilon_t ~ N(0, v_t). This defines a time-varying AR(p).

It's the same structure that we had before; the only difference is that my covariates are just past values of the time series. Therefore, the forecast function for the time-varying AR has a form where everything depends on past values of the series. We will study this model in particular and make connections with the AR processes that we studied earlier in the class.
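As before, a minimal simulation sketch may help; this one again assumes only numpy, with p = 2 and illustrative choices for the initial coefficients and variances.

```python
import numpy as np

# Time-varying AR(p) (TVAR):
#   y_t   = phi_{t,1} y_{t-1} + ... + phi_{t,p} y_{t-p} + eps_t,  eps_t   ~ N(0, v)
#   phi_t = phi_{t-1} + omega_t,                                  omega_t ~ N(0, W)
rng = np.random.default_rng(0)

p, T = 2, 300
v = 1.0
W = 1e-4 * np.eye(p)             # small evolution variance: slowly drifting phi_t

# Random-walk evolution of the AR coefficient vector phi_t
phi = np.zeros((T, p))
phi[0] = np.array([0.9, -0.2])   # initial AR coefficients (illustrative)
for t in range(1, T):
    phi[t] = phi[t - 1] + rng.multivariate_normal(np.zeros(p), W)

# Observation equation: the "covariates" are past values of the series
y = np.zeros(T)
y[:p] = rng.normal(size=p)       # arbitrary starting values
for t in range(p, T):
    F_t = y[t - p : t][::-1]     # F_t' = (y_{t-1}, ..., y_{t-p})
    y[t] = F_t @ phi[t] + rng.normal(scale=np.sqrt(v))
```

The only change from the dynamic regression sketch above is F_t: instead of external covariates, it is built from the lagged values of y itself.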