
CST383 - Week 6

  • Writer: YZ
  • Aug 29, 2021

This week we delved into linear regression. First, we learned the formulas and variables for finding a line that fits our data. Although we have usually reached for kNN, linear regression is less sensitive to noisy predictors in our data. The values of the coefficients give insight into the data: a coefficient of 0 means the feature contributes nothing to the prediction, while the sign and magnitude of a nonzero coefficient show how the predicted value changes as that feature changes. When making predictions with test data, the general steps are to fit a linear model to the training data, make predictions, plot predicted vs. actual values, and compute the root mean squared error (RMSE); a perfect model's points would fall on the line where predicted equals actual. We also learned about feature engineering and interactions between features. To get more reliable estimates of prediction error, cross-validation is sometimes used, and you must be selective about which features and predictors to include.
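The fit/predict/RMSE workflow above can be sketched roughly as follows. This is a minimal illustration using scikit-learn and synthetic data, not the actual course datasets; the feature values and coefficients are invented for the example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic data: the first predictor matters, the second is pure noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))
y = 3.0 * X[:, 0] + rng.normal(0, 1, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)  # fit a linear model
preds = model.predict(X_test)                     # make predictions
rmse = np.sqrt(np.mean((preds - y_test) ** 2))    # root mean squared error

# The learned coefficient for the noise feature should be near 0,
# matching the interpretation that a 0 coefficient means no contribution.
print(model.coef_)
print(rmse)
```

A scatter plot of `preds` against `y_test` (e.g. with matplotlib) would show points hugging the diagonal when the model fits well.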

After the lectures, we completed 4 labs to help us practice linear regression and see its effect on predictions. In this week's homework we also experimented with kNN regression, last week's material, using data about student housing.
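For comparison, kNN regression as practiced in the homework looks something like the sketch below. The student-housing data isn't reproduced here, so synthetic one-feature data stands in, and the choice of k=5 is illustrative.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Synthetic stand-in data: one predictor with a roughly linear trend.
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.0 * X[:, 0] + rng.normal(0, 0.5, 100)

# kNN regression predicts by averaging the targets of the k nearest
# training points, rather than fitting global coefficients.
knn = KNeighborsRegressor(n_neighbors=5).fit(X, y)
pred = knn.predict([[5.0]])
print(pred)  # near 2.0 * 5.0 = 10 for this data
```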

We were also required to submit a progress report for our project, and our team submitted the details of our project on the effects of student alcohol consumption. We did some preprocessing and data exploration to start the project. Lastly, we took a quiz on probability and data visualization. It was a busy week, and we are nearing the end of the class!




©2020 by yz-learningjournal-csumb. Proudly created with Wix.com
