Hello, my name is Raquel Prado and I am a professor of statistics in the Baskin School of Engineering at the University of California, Santa Cruz. Welcome to my course on Bayesian time series.

Many statistical models work with the assumption that the observations are realizations of independent random variables. In contrast, time series analysis is concerned with modeling the dependency among elements of a sequence of temporally related random variables. Building models that can adequately describe such temporal dependency, and developing statistical tools for inference and forecasting within such classes of models, is key in many applied settings where the data are recorded sequentially over time.

For instance, consider the time series of daily average temperature values taken at a given location in the Northern Hemisphere. You may notice that the average temperature value for a given day and year in the month of, say, February is generally close to the average daily temperature values of the previous few days for the same location. In addition, the average temperature values for this particular location, day, and year in the month of February are usually closer to the average daily temperature values during the month of February of the previous year than to those during the month of August of the previous year. This indicates that there is a temporal dependency structure underlying the process, possibly with several components: one component describing short-term temporal dependency, meaning that temperature values today are related to temperature values on previous days, and another component describing a longer-term temporal dependency with seasonal behavior, such as winter and summer seasonal effects.

In this class, you will learn how to build models that can describe these and other types of temporal dependency. You will also learn how to perform Bayesian inference and forecasting for such models. I will begin by introducing some basic concepts and ideas for modeling temporally dependent data. I will then focus on describing and implementing some specific classes of models, including the class of autoregressive models for stationary time series and the class of normal dynamic linear models for non-stationary time series.

To succeed in this course, you should be familiar with the basics of calculus-based probability and the principles of maximum likelihood estimation and Bayesian inference, including simulation-based methods for posterior inference. If you are not familiar with these topics, I suggest taking two other courses offered on this platform: Bayesian Statistics: From Concept to Data Analysis, by Herbert Lee, and Bayesian Statistics: Techniques and Models, by Matthew Heiner. You will also need some knowledge of the R language. If you have not worked with R before, I recommend that you take a tutorial or class in R before starting this course. There are many different alternatives; I recommend the R Programming course on Coursera by Roger Peng.

The expression time series data, or simply time series, usually refers to a set of observations collected sequentially over time. If these observations are collected at equally spaced time points, we say that the time series is equally spaced. In this course, we will only work with equally spaced time series. Note also that in practical settings, a time series can be a scalar quantity measured over time, or a k-dimensional vector containing k scalar quantities at each time.
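As a small illustration of an equally spaced, univariate time series, the following R sketch builds a daily series from a numeric vector. The simulated temperature values, the starting date, and the object names are hypothetical choices for illustration only, not data from the course.

# A minimal sketch in R: representing an equally spaced, univariate time series.
# The values below are simulated stand-ins for daily average temperatures (hypothetical).
set.seed(1)
n <- 60                                                # 60 consecutive days
temps <- 10 + 5 * sin(2 * pi * (1:n) / 365) + rnorm(n, sd = 1.5)

# ts() stores the observations together with an equally spaced time index;
# frequency = 365 indicates one observation per day within a yearly cycle.
y <- ts(temps, start = c(2020, 32), frequency = 365)   # starting on day 32 (early February)

plot(y, xlab = "time", ylab = "average daily temperature",
     main = "Equally spaced univariate time series")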
In this course, we will only discuss approaches for modeling, inference, and forecasting of scalar time series, which are also called univariate time series.

Some of the goals of time series analysis in applied settings include analysis and inference to obtain a characterization of the time-domain or frequency-domain features of the observed time series. For instance, brain signals such as electroencephalograms are time series that contain information about the electrical activity in the brain of a given subject. These signals are typically recorded during a period of time under certain clinical or experimental conditions. We may be interested, for example, in analyzing electroencephalogram recordings from a subject during sleep. Understanding the time and frequency features of these electroencephalograms can provide useful information about the sleep patterns of an individual.

Another key goal is forecasting, or predicting future values of a time series process based on the information provided by the time series data collected up to a certain time. For example, we can use the information given by the daily average temperature values measured at a specific location during, say, the last 30 days to predict the average daily temperature values for the next five days.

We may also be interested in monitoring, that is, in detecting changes of a process over time using the information provided sequentially by the data. For example, in change-point detection we are interested in identifying features in the time series that allow us to determine whether there have been structural changes in the underlying temporal process. This is important in some practical settings, such as real-time or online monitoring of an industrial process.

Finally, another goal of time series analysis is clustering. We may want to group a collection of time series in terms of their time-domain or frequency-domain characteristics. This could allow us to discover whether there is an underlying group structure among this collection of time series.

In this course, we will focus on the first two goals: time series analysis and forecasting. We will study some widely used time-domain models for analysis and forecasting of stationary time series. In particular, we will study the class of autoregressive models and their properties, and we will also explore frequency-domain representations of these models. We will then study a flexible class of time-domain models for analysis and forecasting of non-stationary time series, the class of normal dynamic linear models. We will learn how to build relatively sophisticated normal dynamic linear models that incorporate several components, including, among others, polynomial trend components, seasonal components, and regression components. We will learn and implement tools for Bayesian inference and forecasting within the classes of models just mentioned. Finally, we will apply the modeling, inference, and forecasting tools that we learn in this course in many practical settings.
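As a small preview of the forecasting goal described above, here is a hedged R sketch that simulates a stationary autoregressive process of order one and produces a five-step-ahead forecast. The AR coefficient, sample size, and forecast horizon are arbitrary choices, and the classical functions arima.sim(), arima(), and predict() are used only as generic stand-ins; they are not the Bayesian inference and forecasting procedures developed in this course.

# A minimal sketch in R (not the course's Bayesian procedure): simulate a
# stationary AR(1) process and forecast a few steps ahead.
set.seed(42)
phi <- 0.8                                  # assumed AR(1) coefficient, |phi| < 1 for stationarity
y <- arima.sim(model = list(ar = phi), n = 200, sd = 1)

# Fit an AR(1) by maximum likelihood and forecast the next 5 time points.
fit <- arima(y, order = c(1, 0, 0))
fc  <- predict(fit, n.ahead = 5)

fc$pred                                     # point forecasts
fc$pred + 1.96 * fc$se                      # approximate upper 95% prediction bounds
fc$pred - 1.96 * fc$se                      # approximate lower 95% prediction bounds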