How do you calculate the value that is 2 Standard Deviations from a Trendline? For instance, suppose you have 1000 daily data points and you fit a Linear Regression trendline over the last 100 days of that data. When plotting prediction bands around the trendline, do you calculate 2 Standard Deviations of your underlying data points, or 2 Standard Deviations of the deviations from the Linear Regression trendline itself?

Also, what about Standard Error bands? Are they calculated using the underlying data or the trendline values?

Finally, if you multiply a Standard Error over 100 days by its corresponding 95% value from the Student's T-Distribution table, will that give you an accurate value for prediction bands around your Linear Regression trendline?
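
For concreteness, here is a minimal sketch of what I have in mind, assuming the textbook OLS prediction-interval formula; the simulated data, variable names, and use of numpy/scipy are my own placeholders, not part of any established charting convention I am asking about:

    # Sketch: prediction bands around an OLS trendline fitted to the
    # last 100 of 1000 daily points (data here is simulated placeholder data)
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    prices = np.cumsum(rng.normal(0.1, 1.0, 1000))  # 1000 daily data points
    y = prices[-100:]                                # last 100 days
    x = np.arange(len(y), dtype=float)
    n = len(y)

    # OLS fit: slope, intercept, and fitted trendline values
    slope, intercept = np.polyfit(x, y, 1)
    y_hat = intercept + slope * x

    # Standard error of the regression: residuals around the trendline,
    # not the raw standard deviation of the underlying data
    resid = y - y_hat
    s = np.sqrt(np.sum(resid**2) / (n - 2))

    # Prediction bands: widen s for each x's distance from the mean of x,
    # then scale by the two-sided 95% Student's t value with n-2 df
    t95 = stats.t.ppf(0.975, df=n - 2)
    sxx = np.sum((x - x.mean())**2)
    se_pred = s * np.sqrt(1 + 1/n + (x - x.mean())**2 / sxx)
    upper = y_hat + t95 * se_pred
    lower = y_hat - t95 * se_pred

Is this the right construction, or should the bands simply be the trendline plus/minus 2 standard deviations (or standard errors) of something else?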