Table of Contents

“Exercises of Advanced Statistics”

INTRODUCTION

THEORETICAL OUTLINE

EXERCISES

“Exercises of Advanced Statistics”

SIMONE MALACRIDA

This book presents worked exercises on the following mathematical topics:

random variables, mean, variance, covariance

marginal and joint probability distributions

continuous and discrete probability distributions

remarkable theorems and inequalities of statistics

A brief theoretical outline is also provided to make the solutions to the exercises understandable.

Simone Malacrida (1977)

An engineer and writer, he has worked in research, finance, energy policy, and industrial plants.

ANALYTICAL INDEX

––––––––

INTRODUCTION

––––––––

I – THEORETICAL OUTLINE

Random variables, distributions and properties

Notable inequalities

Convergence

Discrete distributions

Continuous distributions

––––––––

II – EXERCISES

Exercise 1

Exercise 2

Exercise 3

Exercise 4

Exercise 5

Exercise 6

Exercise 7

Exercise 8

Exercise 9

Exercise 10

Exercise 11

Exercise 12

Exercise 13

Exercise 14

Exercise 15

Exercise 16

Exercise 17

Exercise 18

Exercise 19

Exercise 20

Exercise 21

Exercise 22

Exercise 23

Exercise 24

Exercise 25

Exercise 26

Exercise 27

Exercise 28

Exercise 29

Exercise 30

Exercise 31

Exercise 32

Exercise 33

Exercise 34

Exercise 35

Exercise 36

INTRODUCTION

This exercise book works through examples of calculations in advanced statistics.

Furthermore, the main theorems and inequalities used in statistics are presented.

The notion of a random variable requires a review of elementary statistics, defining the probability distributions of such a variable in the various cases.

Discrete and continuous distributions make it possible to model mathematically a large number of problems that would otherwise remain intractable, even with the finest tools of mathematical analysis.

To make the solutions to the exercises easier to follow, the first chapter recalls the relevant theoretical background.

The material in this workbook is generally covered in university statistics courses.

I

THEORETICAL OUTLINE

Random variables, distributions and properties

––––––––

A random variable is a measurable function defined on a sample space equipped with a probability measure.

Such a variable can take values in R, and therefore be one-dimensional, or take values in several dimensions, in which case we speak of a multivariate random variable.

Each random variable X can be associated with a distribution, or probability law, which assigns to each subset A of possible values of X the probability that the random variable takes a value in that subset, and is defined as follows:

P_X(A) = P(X ∈ A)

where P is the probability measure defined on the sample space.

If the random variable is discrete, the discrete probability function (probability mass function) is defined as follows:

p(x_i) = P(X = x_i)

If instead it is continuous, the probability density function f is given by:

P(X ∈ A) = ∫_A f(x) dx

where A is a subset of the sample space and the integral is understood in the Lebesgue sense.
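As a numerical sanity check of this definition, the integral of a density over an interval can be approximated by a Riemann sum. The sketch below assumes a standard exponential density, chosen purely for illustration, and compares the approximation with the closed form.

```python
import math

# Midpoint-rule approximation of P(X in [a, b]) = integral of f over [a, b],
# for a hypothetical exponential density f(x) = lam * exp(-lam * x), x >= 0.
def density(x, lam=1.0):
    return lam * math.exp(-lam * x)

def prob_interval(a, b, lam=1.0, steps=100_000):
    h = (b - a) / steps
    return sum(density(a + (k + 0.5) * h, lam) * h for k in range(steps))

approx = prob_interval(0.0, 1.0)
exact = 1.0 - math.exp(-1.0)  # closed-form P(X <= 1) for lam = 1
print(approx, exact)
```

Over the whole support the sum approaches 1, as any density must.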

For multivariate random variables, the following extension holds for the probability density function:

P((X_1, …, X_n) ∈ A) = ∫_A f(x_1, …, x_n) dx_1 … dx_n

This f is called the joint probability density function.

On the other hand, the probability density of a single component, called the marginal density, is defined as follows:

f_{X_1}(x_1) = ∫ f(x_1, x_2, …, x_n) dx_2 … dx_n

In the case of multivariate discrete variables, the following definitions apply for the joint and marginal probability functions:

p(x_i, y_j) = P(X = x_i, Y = y_j),   p_X(x_i) = ∑_j p(x_i, y_j)

The distribution function F, instead, is a non-decreasing, right-continuous function with the following properties:

lim_{x→−∞} F(x) = 0,   lim_{x→+∞} F(x) = 1

and is such that:

F(x) = P(X ≤ x)

The relations between the distribution function and the probability function are given by the following formulas, in the continuous and discrete cases respectively:

F(x) = ∫_{−∞}^{x} f(t) dt,   F(x) = ∑_{x_i ≤ x} p(x_i)
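For a concrete discrete case, the distribution function of a fair six-sided die (a standard illustrative example, not taken from the exercises) can be built directly by summing its probability function:

```python
# Probability function of a fair die and the distribution function
# F(x) = P(X <= x), obtained by summing p(x_i) over all x_i <= x.
pmf = {k: 1 / 6 for k in range(1, 7)}

def cdf(x):
    return sum(p for xi, p in pmf.items() if xi <= x)

print(cdf(0), cdf(3), cdf(6))
```

Note that F is a step function here: it jumps by p(x_i) at each support point and is flat in between.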

The following probability function (continuous case and discrete case, respectively) is called the conditional distribution:

f_{X|Y}(x | y) = f(x, y) / f_Y(y),   p_{X|Y}(x_i | y_j) = p(x_i, y_j) / p_Y(y_j)

If the two random variables are independent, the joint distribution factorizes into the product of the marginals, so the conditional distribution reduces to the marginal distribution of X.
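These definitions can be illustrated on a small, entirely hypothetical joint table for two binary variables; the probability values are made up for the example.

```python
# Hypothetical joint probability function p(x, y) for X, Y in {0, 1}.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def marginal_x(x):
    return sum(p for (xi, yj), p in joint.items() if xi == x)

def marginal_y(y):
    return sum(p for (xi, yj), p in joint.items() if yj == y)

def cond_x_given_y(x, y):
    # p(x | y) = p(x, y) / p_Y(y)
    return joint[(x, y)] / marginal_y(y)

print(marginal_x(0), marginal_y(1), cond_x_given_y(0, 0))
```

Summing a marginal over its support recovers 1, and each conditional is just a renormalized slice of the joint table.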

––––––––

The expected value of a random variable is defined as follows in the discrete and continuous cases:

E[X] = ∑_i x_i p(x_i),   E[X] = ∫ x f(x) dx

The expected value of a constant is the constant itself. The expected value is linear, and the expected value of a sum of random variables equals the sum of the expected values of the individual variables (a result that does not require independence).

Furthermore, the expected value is monotonic: if one random variable is everywhere at least as large as another, then its expected value is at least as large as well.

The conditional expected value of a random variable is the expected value with respect to a conditional probability distribution, expressed as follows in the discrete and continuous cases respectively:

E[X | Y = y_j] = ∑_i x_i p_{X|Y}(x_i | y_j),   E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx
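The linearity of the expected value is easy to verify numerically on a small hypothetical discrete distribution:

```python
# Hypothetical discrete distribution; E[g(X)] = sum of g(x) * p(x).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def expect(g=lambda x: x):
    return sum(g(x) * p for x, p in pmf.items())

mu = expect()                      # E[X]
lhs = expect(lambda x: 2 * x + 3)  # E[2X + 3] computed directly
rhs = 2 * mu + 3                   # 2 E[X] + 3 via linearity
print(mu, lhs, rhs)
```

The two computations of E[2X + 3] agree, as linearity guarantees.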

We define the variance as the following quantity:

Var(X) = E[(X − E[X])²]

The variance is never negative, and it is zero only when the variable takes a single value with probability one.

The variance has the following properties:

Var(X) = E[X²] − (E[X])²,   Var(aX + b) = a² Var(X)

Furthermore, for two independent random variables:

Var(X + Y) = Var(X) + Var(Y)

The variance of discrete and continuous random variables is given, respectively, by:

Var(X) = ∑_i (x_i − E[X])² p(x_i),   Var(X) = ∫ (x − E[X])² f(x) dx
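Both computations of the variance, together with the scaling property Var(aX + b) = a²Var(X), can be checked on the same kind of hypothetical discrete distribution:

```python
# Variance of a hypothetical discrete distribution, computed both from the
# definition E[(X - mu)^2] and from the equivalent form E[X^2] - mu^2.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def expect(g=lambda x: x):
    return sum(g(x) * p for x, p in pmf.items())

mu = expect()
var_def = expect(lambda x: (x - mu) ** 2)
var_alt = expect(lambda x: x * x) - mu ** 2

# Var(aX + b) = a^2 Var(X): transform the support, keep the probabilities.
a, b = 3.0, 7.0
pmf_t = {a * x + b: p for x, p in pmf.items()}
mu_t = sum(x * p for x, p in pmf_t.items())
var_t = sum((x - mu_t) ** 2 * p for x, p in pmf_t.items())
print(var_def, var_alt, var_t)
```

The additive shift b drops out entirely, while the scale factor a enters squared.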

A measure of the (linear) dependence between two random variables is given by the covariance:

Cov(X, Y) = E[(X − E[X])(Y − E[Y])]

which can also be expressed as:

Cov(X, Y) = E[XY] − E[X] E[Y]

Two independent random variables always have zero covariance (the converse does not hold: variables with zero covariance may still be dependent).

The covariance has the following properties:

Cov(X, X) = Var(X),   Cov(X, Y) = Cov(Y, X),   Cov(aX + b, cY + d) = ac Cov(X, Y)

Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
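The caveat about zero covariance deserves a concrete check. For X uniform on {−1, 0, 1} and Y = X² (a classic illustrative example), the covariance vanishes even though Y is a deterministic function of X:

```python
# X uniform on {-1, 0, 1}, Y = X^2: Cov(X, Y) = E[XY] - E[X]E[Y] = 0,
# yet Y is completely determined by X, so the variables are not independent.
xs = [-1, 0, 1]
p = 1 / 3

ex = sum(x * p for x in xs)              # E[X] = 0 by symmetry
ey = sum(x * x * p for x in xs)          # E[Y] = E[X^2] = 2/3
exy = sum(x * (x * x) * p for x in xs)   # E[XY] = E[X^3] = 0
cov = exy - ex * ey

# Independence would require P(X=1, Y=1) == P(X=1) * P(Y=1), which fails.
p_joint = p            # P(X = 1, Y = 1) = P(X = 1) = 1/3
p_prod = p * (2 * p)   # P(X = 1) * P(Y = 1) = 1/3 * 2/3 = 2/9
print(cov, p_joint, p_prod)
```

So zero covariance rules out only linear dependence, not dependence in general.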

Imprint

Publisher: BookRix GmbH & Co. KG

Date of publication: 23.04.2023
ISBN: 978-3-7554-4002-4

All rights reserved
