Week 4 was the first week back after winter break, so it was a bit hard to get back into the swing of things, even though I had been diligently (mostly) working on my project.
On Monday, we were introduced to Logistic Regression, or ‘hipster regression’ as our instructor put it. Not exactly sure why it’s called that…
Anyway, logistic regression is a machine learning method for classification that gets used just about everywhere, from fraud detection to medical diagnosis to customer churn prediction. I was already familiar with the method before the lesson, and it’s one I plan to implement for my current project, but this was the first time I learned it formally. Working through the math (the model pushes a weighted sum of the features through the sigmoid function to turn it into a probability between 0 and 1) was incredibly helpful for understanding its strengths and weaknesses as a predictive method.
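
To make that concrete for myself, here’s a minimal sketch using scikit-learn’s LogisticRegression on made-up data. The feature names and numbers are placeholders I invented for practice, not my actual project data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy example: predict whether a customer churns (1) or stays (0)
# from two fake, standardized features (e.g. monthly charges, support calls).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression()
model.fit(X_train, y_train)

# predict_proba is the sigmoid output: estimated P(churn) for each customer
probs = model.predict_proba(X_test)[:, 1]
preds = model.predict(X_test)

print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("accuracy:", accuracy_score(y_test, preds))
```

What I like is that the coefficients are interpretable: each one tells you how a feature nudges the log-odds of the positive class up or down, which is a big part of why the method shows up everywhere.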

On Wednesday we discussed another method called Naive Bayes Classification. This was completely new material to me, and I’m honestly still daunted by the math and all the definitions: the “prior probability” (how likely each class is before you look at any features), the “posterior probability” (the updated probability after the evidence comes in), and why it’s even called naive in the first place (because it assumes the features are independent of one another, which may not actually be true).
The instructor seemed to sense that many of us were stumped and reminded us that it’s okay!
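
To make the vocabulary less scary, I wrote out a tiny made-up spam-filter example for myself. All the numbers here are invented purely for illustration, not from class, but it helped me see the prior, the likelihood, and the posterior in one place.

```python
# Prior: what fraction of all emails are spam, before looking at any words.
p_spam = 0.30
p_ham = 1 - p_spam

# Likelihoods: probability of seeing the word "free" in each class.
p_free_given_spam = 0.60
p_free_given_ham = 0.05

# Bayes' theorem: posterior = likelihood * prior / evidence,
# where the evidence is the total probability of seeing "free" at all.
evidence = p_free_given_spam * p_spam + p_free_given_ham * p_ham
p_spam_given_free = p_free_given_spam * p_spam / evidence

print(f"P(spam | 'free') = {p_spam_given_free:.3f}")  # ~0.837

# The "naive" part: with several words in an email, the model just multiplies
# their likelihoods together, as if each word appeared independently given the class.
```

Seeing a 30% prior jump to an 84% posterior after one piece of evidence made the whole “update your belief” idea click a little more, even if the independence assumption still feels like cheating.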

Well, I’m definitely outside of my comfort zone now. Homework this week was challenging, but I’m excited about what I’ve learned and how I could apply it. I just hope I’m doing it correctly… time to go to office hours.