# MA252: Introduction to Probability Theory

## Unit 2: Introduction to Probability Distributions

This unit introduces probability distributions and several of their most important properties. You will learn how to identify discrete and continuous probability distributions and calculate their expected values and variances. You will also learn how to calculate the cumulative distribution function for a given probability distribution, and how to recover a distribution from its cumulative distribution function.

Completing this unit should take approximately 26 hours.

☐    Subunit 2.1: 7.5 hours

☐    Subunit 2.2: 11.5 hours

☐    Subunit 2.3: 2 hours

☐    Subunit 2.4: 5 hours

### Unit 2 Learning Outcomes

Upon successful completion of this unit, you will be able to:

- define random variables and probability distributions;
- calculate the expected values of discrete and continuous distributions;
- calculate the probabilities of sums of random variables; and
- calculate cumulative distributions and marginal distributions.

### 2.1 Random Variables and Distributions

• Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 8: Random Variables and Distributions” (PDF)

Instructions: Read this lecture to gain an understanding of random variables, which map the outcomes of an experiment to real numbers. The reading also introduces the concepts of discrete and continuous distributions and offers several examples of each.
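
As a quick supplementary illustration (not part of the assigned reading), the mapping can be written out directly: each outcome of an experiment is assigned a real number, and the distribution of the random variable collects the probabilities of those values. Here X indicates whether a fair die shows an even number:

```python
from fractions import Fraction

# Sample space: the six faces of a fair die, each with probability 1/6.
outcomes = [1, 2, 3, 4, 5, 6]

def X(w):
    """A random variable: maps each outcome w to a real number (1 if even, 0 if odd)."""
    return 1 if w % 2 == 0 else 0

# Distribution of X: P(X = x), accumulated over the outcomes mapping to each value x.
pmf = {}
for w in outcomes:
    pmf[X(w)] = pmf.get(X(w), Fraction(0)) + Fraction(1, 6)

print(pmf)  # X takes the values 0 and 1, each with probability 1/2
```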

Reading this lecture and taking notes should take approximately 2 hours.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

• Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 8: Random Variables” (YouTube)

Instructions: Watch this video. It will introduce you to discrete random variables, probability mass functions, cumulative distributions, and expected values. This video will reinforce the earlier readings you did in this subunit and show you more examples.

Watching this video and taking notes should take approximately 1 hour and 30 minutes.

• Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 2, Section 2.2: Continuous Density Functions” (PDF)

Instructions: Read pages 55 - 68 of Section 2.2 “Continuous Density Functions” in Chapter 2. This reading will introduce you to continuous random variables and their density functions.
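
As a supplementary sketch (the density below is a standard textbook example, not one drawn from the assigned pages): for a continuous random variable, probabilities come from integrating the density. With density f(x) = 2x on [0, 1], the antiderivative F(x) = x² turns those integrals into simple differences:

```python
# Density f(x) = 2x on [0, 1]: nonnegative, and its integral over [0, 1] is 1.
# Its antiderivative F(x) = x**2 gives P(a <= X <= b) = F(b) - F(a).
def F(t):
    return t ** 2

total = F(1) - F(0)      # integral of the density over its whole support: 1
p_half = F(0.5) - F(0)   # P(X <= 1/2) = 0.25
```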

Reading this chapter and taking notes should take approximately 2 hours.

• Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 2, Section 2.2: Exercises” (PDF)

Instructions: Go to pages 71 - 73 and complete exercises 1, 2, 3, 4, and 5. You can then check your answers to odd-numbered questions here.

Completing this assessment should take you approximately 2 hours.

### 2.2 Expected Values, Variance, and Standard Deviation

• Lecture: YouTube: UCLA: Professor Herbert Enderton’s Math 3C: Math and Probability for Life Sciences: “Lecture 9: Expected Values” (YouTube)

Instructions: Watch this video to learn how to calculate the expected value, variance, and standard deviation of a probability distribution. You will see that the expected value is a measure of the central tendency of a random variable, while the variance and standard deviation are measures of its spread. The random variables here are discrete, and several worked examples are given.
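
All three quantities are easy to compute by hand for a small discrete distribution. A minimal sketch (the distribution below is made up for illustration, not taken from the lecture):

```python
import math

# A discrete probability distribution: value -> probability (probabilities sum to 1).
pmf = {1: 0.1, 2: 0.1, 3: 0.2, 4: 0.2, 5: 0.2, 6: 0.2}

mean = sum(x * p for x, p in pmf.items())               # E[X], the central tendency
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var(X) = E[(X - E[X])^2]
sd = math.sqrt(var)                                     # standard deviation
```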

Watching this video and taking notes should take approximately 1 hour and 30 minutes.

• Reading: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 6, Section 6.1: Expected Value of Discrete Random Variables,” “Chapter 6, Section 6.2: Variance of Discrete Random Variables,” and “Chapter 6, Section 6.3: Continuous Random Variables” (PDF)

Instructions: Read the following pages in Sections 6.1, 6.2, and 6.3 of “Chapter 6: Expected Value and Variance”:

- Section 6.1: pages 225 - 240;
- Section 6.2: pages 257 - 263; and
- Section 6.3: pages 268 - 275.

In this reading, you will learn how to compute the expected value and variance of discrete and continuous random variables. You will also learn how to compute the expected value and variance of sums of random variables. Several examples will be given, including special random variables that will be studied in the coming units.
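
One result from this reading, the linearity of expectation, can be checked by brute-force enumeration. A small sketch using two independent fair dice (my own example, not one from the text):

```python
from itertools import product
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 36)  # probability of each (x, y) pair for two independent fair dice

E_X = sum(x * Fraction(1, 6) for x in faces)                # E[X] = 7/2 for one die
E_sum = sum((x + y) * p for x, y in product(faces, faces))  # E[X + Y], by enumeration

# Linearity of expectation: E[X + Y] = E[X] + E[Y] = 7 (independence is not needed).
```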

Reading these textbook sections and taking notes should take approximately 4 hours.

• Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 16: Expectation, Chebyshev’s Inequality” (PDF) and “Lecture 17: Properties of Expectation, Variance, Standard Deviation” (PDF)

Instructions: Read these lectures to get a good review of expected values, variance, and standard deviation. You will see more examples that illustrate the concepts.

Reading these lectures and taking notes should take approximately 3 hours.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

• Assessment: Dartmouth College: Professors Charles M. Grinstead and J. Laurie Snell’s Introduction to Probability: “Chapter 6: Expected Value and Variance”: “Section 6.1: Exercises,” “Section 6.2: Exercises,” and “Section 6.3: Exercises” (PDF)

Instructions: Go to the Section 6.1 exercises on pages 247 - 250 and complete exercises 1, 3, 5, 8, 13, and 23. Then go to the Section 6.2 exercises on pages 263 - 264 and complete exercises 1, 2, 7, 8, and 14. Finally, go to the Section 6.3 exercises on page 278 and complete exercises 1 and 3. You can then check your answers to the odd-numbered questions here.

Completing this assessment should take approximately 3 hours.

### 2.3 Cumulative Distribution

• Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 9: Cumulative Distribution Function” (PDF)

Instructions: Read this lecture to learn how to calculate cumulative distributions for discrete and continuous probability distributions. Given a probability mass function of a discrete random variable X or a density function of a continuous random variable X, you should be able to find the cumulative distribution function F(x) = P(X ≤ x). You will also learn how to use the cumulative distribution function to compute probabilities. Finally, you will be introduced to joint distributions, which will be studied further in the next subunit.
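
For a discrete random variable, the cumulative distribution is just a running sum of the mass function, and interval probabilities fall out as differences. A minimal sketch with a fair die (an assumed example, not one from the lecture):

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # a fair die

def F(x):
    """Cumulative distribution function: F(x) = P(X <= x)."""
    return sum(p for v, p in pmf.items() if v <= x)

# The CDF recovers interval probabilities: P(2 < X <= 5) = F(5) - F(2) = 1/2.
prob = F(5) - F(2)
```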

Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

• Assessment: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Practice Test 1” (PDF)

Instructions: Solve problem 4 in this practice test. Read the problem carefully and then try to solve it yourself before looking up the solution, which you can find here (PDF).

Completing this assessment should take approximately 30 minutes.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

### 2.4 Joint Probability Distributions

• Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 10: Marginal Distributions” (PDF)

Instructions: Read this lecture for an introduction to marginal distributions. You will first learn about joint distributions, f(x,y), of two discrete or continuous random variables X and Y. Several examples will be given, and marginal distributions will be defined.
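
The defining computation is simple: the marginal distribution of X is obtained by summing the joint probabilities over all values of y (and symmetrically for Y). A sketch with a made-up joint table (my own example, not one from the lecture):

```python
from fractions import Fraction

# A joint distribution f(x, y) of two discrete random variables X and Y.
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8),
}

marg_X, marg_Y = {}, {}
for (x, y), p in joint.items():
    marg_X[x] = marg_X.get(x, Fraction(0)) + p  # sum over y
    marg_Y[y] = marg_Y.get(y, Fraction(0)) + p  # sum over x
```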

Reading this lecture and taking notes should take approximately 2 hours.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

• Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 11: Conditional Distributions, Multivariate Distributions” (PDF)

Instructions: Read this lecture for an introduction to conditional distributions.

Reading this lecture and taking notes should take approximately 1 hour and 30 minutes.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

• Reading: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Lecture 19: Covariance and Correlation, Cauchy-Schwartz Inequality” (PDF)

Instructions: Read this lecture for an introduction to covariance and correlation.
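
Both quantities can be computed directly from a joint table using Cov(X, Y) = E[XY] - E[X]E[Y]. A sketch with a made-up joint distribution in which X and Y tend to agree, so the covariance comes out positive:

```python
import math

# Joint pmf of (X, Y): mass concentrated on the agreeing pairs (0,0) and (1,1).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

cov = E_XY - E_X * E_Y  # Cov(X, Y) = E[XY] - E[X]E[Y]
var_X = sum((x - E_X) ** 2 * p for (x, y), p in joint.items())
var_Y = sum((y - E_Y) ** 2 * p for (x, y), p in joint.items())
corr = cov / math.sqrt(var_X * var_Y)  # lies in [-1, 1] by the Cauchy-Schwarz inequality
```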

Reading this lecture and taking notes should take approximately 1 hour.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.

• Assessment: Massachusetts Institute of Technology OpenCourseWare: Professor Dmitry Panchenko’s Math 18.05: Introduction to Probability and Statistics: “Practice Test 1” (PDF)

Instructions: Solve problem 3 in this practice test. Read the problem carefully and then try to solve it yourself before looking up the solution, which you can find here (PDF).

Completing this assessment should take approximately 30 minutes.

Terms of Use: The above material is released under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. It is attributed to Dmitry Panchenko and Massachusetts Institute of Technology OpenCourseWare.