Learn Probability in Computer Science with Stanford University for FREE
Probability is one of the foundational elements of computer science. Some bootcamps skim over the topic; however, it is integral to your computer science knowledge.
For those diving into the world of computer science or needing a touch-up on their probability knowledge, you’re in for a treat. Stanford University has recently updated its YouTube playlist on its CS109 course with new content!
The playlist comprises 29 lectures that provide gold-standard coverage of the basics of probability theory, essential concepts in probability, mathematical tools for analyzing probabilities, and, to finish, data analysis and machine learning.
So let’s get straight into it…
Lecture 1: Counting
Link: Counting
Learn about the history of probability and how it has helped us achieve modern AI, with real-life examples of developing AI systems. Understand the two core counting paradigms: counting with ‘steps’ and counting with ‘or’. This includes areas such as artificial neural networks and how researchers use probability to build intelligent machines.
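As a quick taste of those two paradigms, here is a minimal sketch. The PIN and dessert scenarios are my own illustrations, not examples from the lecture:

```python
# Counting with "steps" (the product rule): independent choices multiply.
# A PIN made of 4 digits followed by 2 letters.
pins = 10**4 * 26**2
print(pins)  # 6760000 possible PINs

# Counting with "or" (the sum rule): mutually exclusive options add.
# Choose a dessert that is one of 3 cakes OR one of 4 ice creams.
desserts = 3 + 4
print(desserts)  # 7 possible desserts
```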
Lecture 2: Combinatorics
Link: Combinatorics
The second lecture takes counting to the next level of seriousness: combinatorics, the mathematics of counting and arranging. Dive into counting tasks on n objects: sorting objects (permutations), choosing k objects (combinations), and putting objects in r buckets.
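Python’s standard library covers these counts directly. A minimal sketch of all three tasks, with my own choice of n, k, and r:

```python
import math

n, k, r = 10, 3, 4

# Permutations: ways to order all n distinct objects is n!.
print(math.factorial(n))            # 3628800

# Combinations: ways to choose k of n objects, order ignored.
print(math.comb(n, k))              # 120

# Buckets ("stars and bars"): ways to put n identical objects
# into r distinct buckets is C(n + r - 1, r - 1).
print(math.comb(n + r - 1, r - 1))  # 286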
Lecture 3: What is Probability?
Link: What is Probability?
This is where the course really starts to dive into Probability. Learn about the core rules of probability with a wide range of examples and a touch on the Python programming language and its use with probability.
Lecture 4: Probability and Bayes
Link: Probability and Bayes
In this lecture, you will dive into conditional probabilities, the chain rule, the law of total probability, and Bayes’ theorem.
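To see how the law of total probability and Bayes’ theorem fit together, here is a sketch using the classic diagnostic-test example; the numbers are my own, not necessarily the lecture’s:

```python
p_disease = 0.01            # prior P(D)
p_pos_given_disease = 0.99  # sensitivity, P(+|D)
p_pos_given_healthy = 0.05  # false-positive rate, P(+|not D)

# Law of total probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(D|+) = P(+|D)P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ~0.167 -- surprisingly low!
```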
Lecture 5: Independence
Link: Independence
In this lecture, you will learn about mutually exclusive and independent events, and how to compute probabilities using AND and OR. The lecture works through a variety of examples so you can get a good grasp.
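The AND/OR rules are easy to verify with dice. A minimal sketch (my own example, not the lecture’s):

```python
from fractions import Fraction

# Two fair dice are independent, so AND multiplies:
p_six = Fraction(1, 6)
p_both_six = p_six * p_six  # P(A and B) = P(A)P(B)
print(p_both_six)           # 1/36

# Rolling a 1 or a 2 on ONE die: mutually exclusive, so OR adds.
p_one_or_two = Fraction(1, 6) + Fraction(1, 6)
print(p_one_or_two)         # 1/3

# General OR (events can overlap): subtract the AND term.
p_either_six = p_six + p_six - p_both_six
print(p_either_six)         # 11/36
```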
Lecture 6: Random Variables and Expectations
Link: Random Variables and Expectations
Building on the previous lectures and your knowledge of conditional probabilities and independence, this lecture dives into random variables: you will learn to produce and use the probability mass function of a random variable, and to calculate expectations.
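Here is a minimal sketch of both ideas, using the sum of two fair dice as my own worked example: build its PMF, then compute the expectation from it.

```python
from collections import Counter
from fractions import Fraction

# PMF of X = sum of two fair dice, built by enumerating all 36 outcomes.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

print(pmf[7])  # 1/6, the most likely sum

# Expectation: E[X] = sum over x of x * P(X = x)
expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 7
```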
Lecture 7: Variance Bernoulli Binomial
Link: Variance Bernoulli Binomial
You will now use your knowledge to solve harder and harder problems. Your goal for this lecture will be to recognise and use Bernoulli and Binomial random variables, and to calculate the variance of random variables.
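A quick sketch of the key facts, with my own n and p: a Bernoulli(p) has variance p(1 − p), and a Binomial(n, p) is a sum of n independent Bernoullis, so its variance is np(1 − p).

```python
import math

n, p = 20, 0.3
mean = n * p                # E[X] = np = 6.0
variance = n * p * (1 - p)  # Var(X) = np(1 - p) = 4.2

# Cross-check the mean directly from the binomial PMF.
def pmf(k: int) -> float:
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

print(sum(k * pmf(k) for k in range(n + 1)), mean, variance)
```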
Lecture 8: Poisson
Link: Poisson
Poisson is great when you have a rate and you care about the number of occurrences. You will learn about how it can be used in different settings, along with Python code examples.
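For a flavour of the rate-to-occurrences idea, here is a minimal sketch; the server scenario and rate are my own illustration:

```python
import math

# Poisson PMF: P(X = k) = e^(-lam) * lam^k / k!
# Hypothetical rate: a server averages lam = 4 requests per second.
lam = 4

def poisson_pmf(k: int) -> float:
    return math.exp(-lam) * lam**k / math.factorial(k)

print(poisson_pmf(0))                         # chance of a quiet second
print(sum(poisson_pmf(k) for k in range(7)))  # P(at most 6 requests)
```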
Lecture 9: Continuous Random Variables
Link: Continuous Random Variables
The goals of this lecture include being comfortable using continuous random variables, integrating a density function to get a probability, and using a cumulative distribution function to get a probability.
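Here is a minimal sketch of both routes to a probability, using a toy density of my own choosing (f(x) = 2x on [0, 1], so the CDF is F(x) = x²):

```python
def f(x: float) -> float:
    """Toy density: f(x) = 2x on [0, 1]."""
    return 2 * x

def F(x: float) -> float:
    """Its CDF: F(x) = x**2 on [0, 1]."""
    return x * x

a, b = 0.25, 0.75
print(F(b) - F(a))  # exact: P(a <= X <= b) = 0.5

# Same probability via a crude Riemann sum of the density.
n = 100_000
dx = (b - a) / n
approx = sum(f(a + (i + 0.5) * dx) * dx for i in range(n))
print(approx)       # ~0.5
```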
Lecture 10: Normal Distribution
Link: Normal Distribution
You may have heard of the normal distribution before. In this lecture, you will go through a brief history of the normal distribution, what it is, why it is important, and practical examples.
Lecture 11: Joint Distributions
Link: Joint Distributions
In the previous lectures, you will have worked with at most two random variables; the next step is to work with any given number of random variables.
Lecture 12: Inference
Link: Inference
The learning goals of this lecture are to use multinomials, appreciate the utility of log probabilities, and be able to use Bayes’ theorem with random variables.
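The utility of log probabilities is easy to demonstrate: multiplying many small probabilities underflows to zero in floating point, while summing their logs stays well-behaved. A minimal sketch with my own toy numbers:

```python
import math

probs = [0.01] * 200  # 200 independent events, each with probability 0.01

naive_product = math.prod(probs)
print(naive_product)  # 0.0 -- floating-point underflow

log_sum = sum(math.log(p) for p in probs)
print(log_sum)        # ~-921.0, still perfectly usable for comparisons
```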
Lecture 13: Inference II
Link: Inference II
This lecture continues the learning goal from the last one: combining Bayes’ theorem with random variables.
Lecture 14: Modeling
Link: Modelling
In this lecture, you will take everything you have learned so far and apply it to real-life problems through probabilistic modelling: working with a whole collection of random variables that are random together.
Lecture 15: General Inference
Link: General Inference
You will dive into general inference and, in particular, learn about an algorithm called rejection sampling.
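The idea behind rejection sampling is simple: propose from an easy distribution, then accept each proposal with probability proportional to the target density. A minimal sketch with a toy target of my own (f(x) = 2x on [0, 1], enveloped by M = 2), not the lecture’s example:

```python
import random

M = 2.0  # envelope constant: f(x) <= M for all x in [0, 1]

def target_density(x: float) -> float:
    return 2 * x

def rejection_sample() -> float:
    while True:
        x = random.random()  # propose x ~ Uniform(0, 1)
        if random.random() * M <= target_density(x):
            return x         # accept with probability f(x) / M

samples = [rejection_sample() for _ in range(100_000)]
print(sum(samples) / len(samples))  # ~2/3, the true mean of f
```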
Lecture 16: Beta
Link: Beta
This lecture goes into random variables over probabilities, which are used to solve real-world problems. Beta is a distribution over probabilities, so its values range between 0 and 1.
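A common way to see Beta as “a distribution over probabilities” is coin flipping: after observing heads and tails under a flat Beta(1, 1) prior, the coin’s unknown bias follows a Beta posterior. A minimal sketch with my own counts:

```python
import random

heads, tails = 7, 3
a, b = 1 + heads, 1 + tails  # Beta(1,1) prior updated by the data

draws = [random.betavariate(a, b) for _ in range(100_000)]
print(sum(draws) / len(draws))  # ~a / (a + b) = 8/12 ~ 0.667
print(min(draws), max(draws))   # always inside (0, 1)
```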
Lecture 17: Adding Random Variables
Link: Adding Random Variables I
At this point in the course, you will be learning deeper theory; adding random variables is an introduction to how such results in probability theory are obtained.
Lecture 18: Central Limit Theorem
Link: Central Limit Theorem
In this lecture, you will dive into the central limit theorem, which is an important element of probability. You will go through practical examples so that you can grasp the concept.
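The theorem is easy to watch in action: averages of dice rolls concentrate around the true mean and look normal, even though a single die is uniform. A minimal sketch of my own:

```python
import random
import statistics

def sample_mean(n_rolls: int = 50) -> float:
    return statistics.mean(random.randint(1, 6) for _ in range(n_rolls))

means = [sample_mean() for _ in range(10_000)]
print(statistics.mean(means))   # ~3.5, the mean of one die
print(statistics.stdev(means))  # ~sigma/sqrt(50) ~ 1.71/7.07 ~ 0.24
```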
Lecture 19: Bootstrapping and P-Values
Link: Bootstrapping and P-Values I
You will now move into uncertainty theory, sampling, and bootstrapping, which is inspired by the central limit theorem. You will go through practical examples.
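The core bootstrap move is resampling your data with replacement to estimate the uncertainty of a statistic. A minimal sketch with toy data of my own:

```python
import random
import statistics

data = [12, 15, 9, 14, 21, 13, 17, 11, 16, 10]  # hypothetical sample

boot_means = []
for _ in range(10_000):
    resample = random.choices(data, k=len(data))  # sample WITH replacement
    boot_means.append(statistics.mean(resample))

print(statistics.mean(boot_means))   # close to the sample mean, 13.8
print(statistics.stdev(boot_means))  # bootstrap standard error of the mean
```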
Lecture 20: Algorithmic Analysis
Link: Algorithmic Analysis
In this lecture, you will dive a bit more into computer science with an in-depth understanding of the analysis of algorithms, which is the process of finding the computational complexity of algorithms.
Lecture 21: M.L.E.
Link: M.L.E.
This lecture will dive into parameter estimation, which will provide you with more knowledge on machine learning. This is where you take your knowledge of probability and apply it to machine learning and artificial intelligence.
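Maximum likelihood estimation picks the parameter value that makes the observed data most probable. A minimal sketch for coin flips (my own counts): the closed form is p̂ = heads/n, and a brute-force grid search over the log-likelihood agrees.

```python
import math

heads, n = 7, 10

def log_likelihood(p: float) -> float:
    return heads * math.log(p) + (n - heads) * math.log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]  # avoid p = 0 and p = 1
p_hat = max(grid, key=log_likelihood)
print(p_hat, heads / n)  # both 0.7
```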
Lecture 22: M.A.P.
Link: M.A.P.
We’re still at the stage of taking core principles of probability and applying them to machine learning. In this lecture, you will focus on estimating parameters in machine learning using probability and random variables.
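Maximum a posteriori (MAP) estimation adds a prior on the parameter and takes the posterior mode. A minimal sketch, reusing the Beta-prior coin from lecture 16 with my own hypothetical prior:

```python
# With a Beta(a, b) prior on a coin's bias and `heads` out of n flips,
# the posterior is Beta(a + heads, b + n - heads); its mode is the MAP.
a, b = 2, 2   # hypothetical prior that gently favours fair coins
heads, n = 7, 10

map_estimate = (a + heads - 1) / (a + b + n - 2)
mle_estimate = heads / n

print(map_estimate)  # ~0.667 -- pulled toward 0.5 by the prior
print(mle_estimate)  # 0.7    -- the data alone
```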
Lecture 23: Naive Bayes
Link: Naive Bayes
Naive Bayes is the first machine learning algorithm you will learn about in depth. You will have learnt about the theory of parameter estimation, and will now move on to how core algorithms such as Naive Bayes lead to ideas such as neural networks.
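Here is a tiny Naive Bayes sketch to show the mechanics: pick the class maximizing log-prior plus summed log-likelihoods of the features. The spam/ham setup and word probabilities are my own hypothetical numbers, not the lecture’s data.

```python
import math

priors = {"spam": 0.4, "ham": 0.6}
p_word = {  # P(word appears | class) for binary word features
    "spam": {"free": 0.8, "meeting": 0.1},
    "ham":  {"free": 0.2, "meeting": 0.7},
}

def classify(words_present: set) -> str:
    scores = {}
    for label in priors:
        score = math.log(priors[label])  # log-prior
        for word, p in p_word[label].items():
            # "naive" assumption: features independent given the class
            score += math.log(p if word in words_present else 1 - p)
        scores[label] = score
    return max(scores, key=scores.get)

print(classify({"free"}))     # spam
print(classify({"meeting"}))  # ham
```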
Lecture 24: Logistic Regression
Link: Logistic Regression
In this lecture, you will dive into a second algorithm, logistic regression, which is used for classification tasks; you will learn more about classification itself too.
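For a feel of the mechanics, here is a minimal sketch on toy one-dimensional data of my own: a sigmoid turns a linear score into a probability, and gradient descent on the log-loss fits the weight and bias.

```python
import math

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]  # label flips around x = 2.5

def sigmoid(z: float) -> float:
    return 1 / (1 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(5_000):
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)
        w -= lr * (p - y) * x  # gradient of log-loss w.r.t. w
        b -= lr * (p - y)      # gradient of log-loss w.r.t. b

print(sigmoid(w * 1.0 + b))  # ~0: predicts class 0
print(sigmoid(w * 4.0 + b))  # ~1: predicts class 1
```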
Lecture 25: Deep Learning
Link: Deep Learning
As you’ve started to dive into machine learning, this lecture will go into further detail about deep learning based on what you have already learned.
Lecture 26: Fairness
Link: Fairness
We live in a world where machine learning is being implemented in our day-to-day lives. In this lecture, you will look into fairness in machine learning, with a focus on ethics.
Lecture 27: Advanced Probability
Link: Advanced Probability
You have learnt a lot about the basics of probability, applied it in different scenarios, and seen how it relates to machine learning algorithms. The next step is to go a bit more advanced with probability.
Lecture 28: Future of Probability
Link: Future of Probability
The learning goal for this lecture is to understand the uses of probability and the variety of problems it can be applied to solve.
Lecture 29: Final Review
Link: Final Review
Last but not least, the final lecture: a review of the other 28 lectures that touches on any remaining uncertainties.
Wrapping it up
Finding good material for your learning journey can be difficult. This probability course for computer science is excellent material and can help you grasp the concepts of probability that you were unsure of or needed to brush up on.
Nisha Arya is a data scientist, freelance technical writer, and an editor and community manager for KDnuggets. She is particularly interested in providing data science career advice or tutorials and theory-based knowledge around data science. Nisha covers a wide range of topics and wishes to explore the different ways artificial intelligence can benefit the longevity of human life. A keen learner, Nisha seeks to broaden her tech knowledge and writing skills, while helping guide others.