



An Introduction to Bayesian Inference, Methods and Computation

Nick Heard
Imperial College London
London, UK

ISBN 978-3-030-82807-3 ISBN 978-3-030-82808-0 (eBook)


https://doi.org/10.1007/978-3-030-82808-0

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature
Switzerland AG 2021
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse
of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and
transmission or information storage and retrieval, electronic adaptation, computer software, or by similar
or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

The aim of writing this text was to provide a fast, accessible introduction to Bayesian statistical inference. The content is directed at postgraduate students with a background in a numerate discipline, including some experience in basic probability theory and statistical estimation. The text accompanies a module of the same name, Bayesian Methods and Computation, which forms part of the online Master of Machine Learning and Data Science degree programme at Imperial College London.
Starting from an introduction to the fundamentals of subjective probability, the
course quickly advances to modelling principles, computational approaches and then
advanced modelling techniques. Whilst this rapid development necessitates a light
treatment of some advanced theoretical concepts, the benefit is to fast track the reader
to an exciting wealth of modelling possibilities whilst still providing a key grounding
in the fundamental principles.
To make possible this rapid transition from basic principles to advanced modelling, the text makes extensive use of the probabilistic programming language Stan, which is the product of a worldwide initiative to make Bayesian inference on user-defined statistical models more accessible. Stan is written in C++, meaning it is computationally fast and can be run in parallel, but the interface is modular and simple. The future of applied Bayesian inference arguably relies on the broadening development of such software platforms.
Chapter 1 introduces the core ideas of Bayesian reasoning: decision-making under uncertainty, specifying subjective probabilities and utility functions, and identifying optimal decisions as those which maximise expected utility. Prediction and estimation, the two core tasks in statistical inference, are shown to be special cases of this broader decision-making framework. The application-driven reader may choose to skip this chapter, although philosophically it sets the foundation for everything that follows.
Chapter 2 presents representation theorems which justify the prior × likelihood formulation synonymous with Bayesian probability models. Simply believing that unknown variables are exchangeable, meaning probability beliefs are invariant to relabelling of the variables, is sufficient to guarantee that this construction must hold. The prior distribution distinguishes Bayesian inference from frequentist statistical methods, and several approaches to specifying prior distributions are discussed. The prior and likelihood construction leads naturally to consideration of the posterior distribution, including useful results on asymptotic consistency and normality which suggest large-sample robustness to the choice of prior distribution.
Chapter 3 shows how graphical models can be used to specify dependencies in probability distributions. Graphical representations are most useful when the dependency structure is a primary target of inferential interest. Different types of graphical model are introduced, including belief networks and Markov networks, highlighting that the same graph structure can have different interpretations for different models.
Chapter 4 discusses parametric statistical models. Attention is focused on conjugate models, which present the most mathematically convenient parametric approximations of true, but possibly hard to specify, underlying beliefs. Although these models might appear relatively simplistic, later chapters will show how these basic models can form the basis of very flexible modelling frameworks.
Chapter 5 introduces the computational techniques which revolutionised the application of Bayesian statistical modelling, enabling routine performance of inferential tasks which had previously appeared infeasible. Relatively simple Markov chain Monte Carlo methods were at the heart of this development, and these are explained in some detail. A higher-level description of Hamiltonian Monte Carlo methods is also provided, since these methods are becoming increasingly popular for performing simulation-based computational inference more efficiently. For high-dimensional inference problems, some useful analytic approximations are presented which sacrifice the theoretical accuracy of Monte Carlo methods for computational speed.
Chapter 6 discusses probabilistic programming languages specifically designed for easing some of the complexities of implementing Bayesian inferential methods. Particular attention is given to Stan, which has experienced rapid growth in deployment. Stan automates parallel Hamiltonian Monte Carlo sampling for statistical inference on any suitably specified Bayesian model on a continuous parameter space. In the subsequent chapters which introduce more advanced statistical models, Stan is used for demonstration wherever possible.
Chapter 7 is concerned with model checking. A subjective probability model is not expected to be correct, but it can still be useful to consider how well observed data appear to fit with an assumed model before making any further predictions using the same model assumptions; it may make sense to reconsider alternatives. Posterior predictive checking provides one framework for model checking in the Bayesian framework, and its application is easily demonstrated in Stan. For comparing rival models, Bayes factors are shown to be a well-calibrated statistic for quantifying the evidence in favour of one model or the other, providing a vital Bayesian analogue to Neyman-Pearson likelihood ratios.
Chapter 8 presents the Bayesian linear model as the cornerstone of regression
modelling. Extensions from the standard linear model to other basis functions such
as polynomial and spline regression highlight the flexibility of this fundamental
model structure. Further extensions to generalised linear models, such as logistic
and Poisson regression, are demonstrated through implementation in Stan.
Chapter 9 characterises nonparametric models as more flexible parametric models with a potentially infinite number of parameters, distributing probability mass across larger function spaces. Dirichlet process and Polya tree models are presented as respective nonparametric models for discrete and continuous random probability measures. Partition models such as Bayesian histograms are also included in this class of models.
Chapter 10 covers nonparametric regression. Particular attention is given to Gaussian processes, which can be regarded as generalisations of the Bayes linear model. Spline models and partition models are also re-examined in this context.
Chapter 11 combines clustering and latent factor models. Both classes of model assume a latent underlying structure, which is either discrete or continuous, respectively. Finite and infinite mixture models are considered for clustering data into homogeneous groupings. Topic modelling of text and other unstructured data is considered as both a finite and infinite mixture problem. Finally, continuous latent factor models are presented as an extension of linear regression modelling, through the inclusion of unobserved covariates. Again, example Stan code is used to illustrate this class of models.
Throughout the text, there are exercises which should form an important component of following this course. Exercises which require access to a computer are indicated with a symbol; these become increasingly prevalent as the chapters progress, reflecting the transition within the text from laying fundamental principles to applied practice.

London, UK Nick Heard


June 2021
Contents

1 Uncertainty and Decisions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1


1.1 Subjective Uncertainty and Possibilities . . . . . . . . . . . . . . . . . . . . . 1
1.1.1 Subjectivism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.2 Subjective Uncertainty . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.1.3 Possible Outcomes and Events . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Decisions: Actions, Outcomes, Consequences . . . . . . . . . . . . . . . . 3
1.2.1 Elements of a Decision Problem . . . . . . . . . . . . . . . . . . . . 3
1.2.2 Preferences on Actions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3 Subjective Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.1 Standard Events . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3.2 Equivalent Standard Events . . . . . . . . . . . . . . . . . . . . . . . . 6
1.3.3 Definition of Subjective Probability . . . . . . . . . . . . . . . . . 6
1.3.4 Contrast with Frequentist Probability . . . . . . . . . . . . . . . . 7
1.3.5 Conditional Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.3.6 Updating Beliefs: Bayes Theorem . . . . . . . . . . . . . . . . . . . 8
1.4 Utility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.4.1 Principle of Maximising Expected Utility . . . . . . . . . . . . 9
1.4.2 Utilities for Bounded Decision Problems . . . . . . . . . . . . . 10
1.4.3 Utilities for Unbounded Decision Problems . . . . . . . . . . . 10
1.4.4 Randomised Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.4.5 Conditional Probability as a Consequence
of Coherence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.5 Estimation and Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.5.1 Continuous Random Variables and Decision
Spaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.5.2 Estimation and Loss Functions . . . . . . . . . . . . . . . . . . . . . . 12
1.5.3 Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2 Prior and Likelihood Representation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.1 Exchangeability and Infinite Exchangeability . . . . . . . . . . . . . . . . . 15


2.2 De Finetti’s Representation Theorem . . . . . . . . . . . . . . . . . . . . . . . . 16


2.3 Prior, Likelihood and Posterior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.3.1 Prior Elicitation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.3.2 Non-informative Priors . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.3.3 Hyperpriors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.3.4 Mixture Priors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.3.5 Bayesian Paradigm for Prior to Posterior Reporting . . . . 20
2.3.6 Asymptotic Consistency . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
2.3.7 Asymptotic Normality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3 Graphical Modelling and Hierarchical Models . . . . . . . . . . . . . . . . . . . 23
3.1 Graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.1.1 Specifying a Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.1.2 Neighbourhoods of Graph Nodes . . . . . . . . . . . . . . . . . . . 24
3.1.3 Paths, Cycles and Directed Acyclic Graphs . . . . . . . . . . . 25
3.1.4 Cliques and Separation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.2 Graphical Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.2.1 Belief Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.2.2 Markov Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.2.3 Factor Graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
3.3 Hierarchical Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
4 Parametric Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
4.1 Parametric Modelling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
4.2 Conjugate Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
4.3 Exponential Families . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.4 Non-conjugate Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.5 Posterior Summaries for Parametric Models . . . . . . . . . . . . . . . . . . 37
4.5.1 Marginal Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
4.5.2 Credible Regions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
5 Computational Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
5.1 Intractable Integrals in Bayesian Inference . . . . . . . . . . . . . . . . . . . 39
5.2 Monte Carlo Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.2.1 Standard Error . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
5.2.2 Estimation Under a Loss Function . . . . . . . . . . . . . . . . . . . 41
5.2.3 Importance Sampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
5.2.4 Normalising Constant Estimation . . . . . . . . . . . . . . . . . . . 43
5.3 Markov Chain Monte Carlo . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
5.3.1 Technical Requirements of Markov Chains
in MCMC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
5.3.2 Gibbs Sampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
5.3.3 Metropolis-Hastings Algorithm . . . . . . . . . . . . . . . . . . . . . 48
5.4 Hamiltonian Markov Chain Monte Carlo . . . . . . . . . . . . . . . . . . . . 50
5.5 Analytic Approximations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
5.5.1 Normal Approximation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52

5.5.2 Laplace Approximations . . . . . . . . . . . . . . . . . . . . . . . . . . . 53


5.5.3 Variational Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
5.6 Further Topics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
6 Bayesian Software Packages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
6.1 Illustrative Statistical Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
6.2 Stan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
6.2.1 PyStan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
6.3 Other Software Libraries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.3.1 PyMC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.3.2 Edward . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
7 Criticism and Model Choice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
7.1 Model Uncertainty . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
7.2 Model Averaging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
7.3 Model Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
7.3.1 Selecting From a Set of Models . . . . . . . . . . . . . . . . . . . . . 69
7.3.2 Pairwise Comparisons: Bayes Factors . . . . . . . . . . . . . . . . 70
7.3.3 Bayesian Information Criterion . . . . . . . . . . . . . . . . . . . . . 72
7.4 Posterior Predictive Checking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
7.4.1 Posterior Predictive p-Values . . . . . . . . . . . . . . . . . . . . . . . 73
7.4.2 Monte Carlo Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
7.4.3 PPC with Stan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
8 Linear Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
8.1 Parametric Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
8.2 Bayes Linear Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
8.2.1 Conjugate Prior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
8.2.2 Reference Prior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
8.3 Generalisation of the Linear Model . . . . . . . . . . . . . . . . . . . . . . . . . 86
8.3.1 General Basis Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
8.4 Generalised Linear Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
8.4.1 Poisson Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
8.4.2 Logistic Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
9 Nonparametric Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
9.1 Random Probability Measures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
9.2 Dirichlet Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
9.2.1 Discrete Base Measure . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
9.3 Polya Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
9.3.1 Continuous Random Measures . . . . . . . . . . . . . . . . . . . . . . 101
9.4 Partition Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
9.4.1 Partition Models: Bayesian Histograms . . . . . . . . . . . . . . 102
9.4.2 Bayesian Histograms with Equal Bin Widths . . . . . . . . . 104

10 Nonparametric Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107


10.1 Nonparametric Regression Modelling . . . . . . . . . . . . . . . . . . . . . . . 107
10.2 Gaussian Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
10.2.1 Normal Errors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
10.2.2 Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
10.3 Spline Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
10.3.1 Spline Regression with Equally Spaced Knots . . . . . . . . 114
10.4 Partition Regression Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
10.4.1 Changepoint Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
10.4.2 Classification and Regression Trees . . . . . . . . . . . . . . . . . 119
11 Clustering and Latent Factor Models . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
11.1 Mixture Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
11.1.1 Finite Mixture Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
11.1.2 Dirichlet Process Mixture Models . . . . . . . . . . . . . . . . . . . 126
11.2 Mixed-Membership Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
11.2.1 Latent Dirichlet Allocation . . . . . . . . . . . . . . . . . . . . . . . . . 129
11.2.2 Hierarchical Dirichlet Processes . . . . . . . . . . . . . . . . . . . . 131
11.3 Latent Factor Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
11.3.1 Stan Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134

Appendix A: Conjugate Parametric Models . . . . . . . . . . . . . . . . . . . . . . . . . . 137


Appendix B: Solutions to Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
Chapter 1
Uncertainty and Decisions

1.1 Subjective Uncertainty and Possibilities

1.1.1 Subjectivism

In the seminal work of de Finetti (see the English translation of de Finetti 2017),
the central idea for the Bayesian paradigm is to address decision-making in the
face of uncertainty from a subjective viewpoint. Given the same set of uncertain
circumstances, two decision-makers could differ in the following ways:
• How desirable different potential outcomes might seem to them.
• How likely they consider the various outcomes to be.
• How they feel their actions might affect the eventual outcome.
The Bayesian decision-making paradigm is most easily viewed through the lens of
an individual making choices (“decisions”) in the face of (personal) uncertainty. For
this reason, certain illustrative elements of this section will be purposefully written
in the first person.
This decision-theoretic view of the Bayesian paradigm represents a mathematical ideal of how a coherent, non-self-contradictory individual should aspire to behave. This is a non-trivial requirement, made easier with various mathematical formalisms which will be introduced in the modelling sections of this text. Whilst these formalisms might not exactly match my beliefs for specific decision problems, the aim is to present sufficiently many classes of models that one of them might adequately reflect my opinions up to some acceptable level of approximation.
Coherence is also the most that will be expected from a decision-maker; there will be no requirement for me to choose in any sense the right decisions from any perspective other than my own at that time. Everything within the paradigm is subjective, even apparently absolute concepts such as truth. Statements of certainty such as “The true value of the parameter is x” should be considered shorthand for “It is my understanding that the true value of the parameter is x”. This might seem pedantic, but crucially it allows contradictions between individuals, and between perspectives and reality: the decision-making machinery will still function.

1.1.2 Subjective Uncertainty

There are numerous sources of individual uncertainty which can complicate decision-making. These could include:
• Events which have not yet happened, but might happen some time in the future
• Events which have happened which I have not yet learnt about
• Facts which may yet be undiscovered, such as the truth of some mathematical conjecture
• Facts which may have been discovered elsewhere, but remain unknown to me
• Facts which I have partially or completely forgotten
In the Bayesian paradigm, these and other sources of uncertainty are treated equally. If there are matters on which I am unsure, then these uncertainties must be acknowledged and incorporated into a rational decision process. Whether or not I perhaps should know them is immaterial.

1.1.3 Possible Outcomes and Events

Suppose I, the decision-maker, am interested in a currently unknown outcome ω, and believe that it will eventually assume a single realised value from an exhaustive set of possibilities Ω. When considering uncertain outcomes, the assumed set of possibilities will also be chosen subjectively, as illustrated in the following example.

Example 1.1 If rolling a die, I might understandably assume that the outcome will be in Ω = {1, 2, 3, 4, 5, 6}. Alternatively, I could take a more conservative viewpoint and extend the space of outcomes to include some unintended or potentially unforeseen outcomes; for example, Ω = {Dice roll does not take place, No valid outcome, 1, 2, 3, 4, 5, 6}.
Neither viewpoint in Example 1.1 could irrefutably be said to be right or wrong. But if I am making a decision which I consider to be affected by the future outcome of the intended dice roll, I would possibly adopt different positions according to which set of possible outcomes I chose to focus on. The only requirement for Ω is that it should contain every outcome I currently conceive to be possible and meaningful to the decision problem under consideration.

Definition 1.1 (Event) An event is a subset of the possible outcomes. An event E ⊆ Ω is said to occur if and only if the realised outcome ω ∈ E.
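The outcome space of Example 1.1 and the event of Definition 1.1 translate directly into a few lines of code. The following sketch is purely illustrative (the set contents and names are assumptions, not from the text): Ω is represented as a Python set, an event as a subset, and occurrence as membership of the realised outcome.

```python
# Outcome space for the dice roll of Example 1.1 (conservative version).
omega = {"no valid outcome", 1, 2, 3, 4, 5, 6}

# An event is any subset of the possible outcomes (Definition 1.1).
even_roll = {2, 4, 6}
assert even_roll <= omega  # an event must be a subset of omega

def occurs(event, realised_outcome):
    """An event E occurs if and only if the realised outcome lies in E."""
    return realised_outcome in event

print(occurs(even_roll, 4))                   # True: 4 is an even roll
print(occurs(even_roll, "no valid outcome"))  # False
```

Note that the conservative Ω simply adds extra elements; any event defined on the smaller outcome space remains a valid event on the larger one.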

1.2 Decisions: Actions, Outcomes, Consequences

1.2.1 Elements of a Decision Problem

Definition 1.2 (Decision problem) Following Bernardo and Smith (1994), a decision problem will be composed of three elements:
1. An action a, to be chosen from a set A of possible actions.
2. An uncertain outcome ω, thought to lie within a set Ω of envisaged possible outcomes.
3. An identifiable consequence, assumed to lie within a set C of possible consequences, resulting from the combination of both the action taken and the ensuing outcome which occurs.
Axioms 1 C will be totally ordered, meaning there exists an ordering relation ≤C on C such that for any pair of consequences c1, c2 ∈ C, necessarily c1 ≤C c2 or c2 ≤C c1.
If both c1 ≤C c2 and c2 ≤C c1, then we write c1 =C c2. This provides definitions of (subjective) preference and indifference between consequences.
Remark 1.1 Crucially, the ordering ≤C is assumed to be subjective; my perceived
ordering of the different consequences must be allowed to differ from that of other
decision-makers.
Definition 1.3 (Preferences on consequences) Suppose c1, c2 ∈ C. If c1 ≤C c2 and c1 ≠C c2, then c2 is said to be a preferable consequence to c1, written c1 <C c2. If c1 =C c2, then I am indifferent between the two consequences.
Definition 1.4 (Action) An action defines a function which maps outcomes to consequences. For simplicity of presentation, until Section 1.5.1 the actions in A will be assumed to be discrete, meaning that each can be represented by a generic form a = {(E1, c1), (E2, c2), . . .}, where c1, c2, . . . ∈ C, and E1, E2, . . . are referred to as fundamental events which form a partition of Ω, meaning Ω = ∪i Ei and Ei ∩ Ej = ∅ for i ≠ j. Then, for example, if I take action a, then I anticipate that any outcome ω ∈ E1 would lead to consequence c1, and so on.
Remark 1.2 When actions are identified, in this way, by the perceived consequences
they will lead to under different outcomes, they are subjective.
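Definition 1.4 can be made concrete with a short sketch. Here an action is stored as a list of (event, consequence) pairs, and a helper verifies the partition requirement that the fundamental events are exhaustive and pairwise disjoint. The event sets, consequence labels, and function names are illustrative assumptions, not from the text.

```python
from itertools import combinations

omega = {1, 2, 3, 4, 5, 6}

# An action: fundamental events (a partition of omega) paired with consequences.
action = [({1, 2, 3}, "lose"), ({4, 5}, "draw"), ({6}, "win")]

def is_valid_action(action, omega):
    """Check the fundamental events form a partition of omega."""
    events = [e for e, _ in action]
    covers = set().union(*events) == omega  # exhaustive: union is omega
    disjoint = all(e1.isdisjoint(e2)        # pairwise disjoint events
                   for e1, e2 in combinations(events, 2))
    return covers and disjoint

def consequence(action, outcome):
    """The consequence an action leads to under a realised outcome."""
    for event, c in action:
        if outcome in event:
            return c

print(is_valid_action(action, omega))  # True
print(consequence(action, 5))          # draw
```

Because the events partition Ω, `consequence` is well defined: every realised outcome falls in exactly one fundamental event.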

1.2.2 Preferences on Actions

Rational decision-making requires well-founded preferences between possible


actions. Let a, a ∈ A be two possible actions, which for illustration could be written
as
4 1 Uncertainty and Decisions

a = {(E 1 , c1 ), (E 2 , c2 ), . . .},
a = {(E 1 , c1 ), (E 2 , c2 ), . . .}.

The overall desirability of each action will depend entirely on the uncertainty sur-
rounding the fundamental events E 1 , E 2 , . . . and E 1 , E 2 , . . . and the desirability of
the corresponding consequences c1 , c2 , . . . and c1 , c2 , . . .. This can be exploited in
two ways, which will be developed in later sections:
1. If I innately prefer action a to a , then this preference can be used to quantify my
beliefs about the uncertainty surrounding the fundamental events characterising
each action. This will form the basis for eliciting subjective probabilities (see
Sect. 1.3).
2. Reversing the same argument, once I have elicited my probabilities for certain
events, these can be used to obtain preferences between corresponding actions
through the principle of maximising expected utility (see Sect. 1.4.1).
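The second route can be sketched numerically. The probabilities and utility values below are illustrative assumptions of mine, not values from the text; the mechanics of expected utility are developed properly in Sect. 1.4.1.

```python
def expected_utility(action, prob, utility):
    """Expected utility of a discrete action {(E_i, c_i)}:
    sum over i of P(E_i) * u(c_i)."""
    return sum(prob[event] * utility[c] for event, c in action)

# Hypothetical elicited probabilities for the fundamental events of
# two actions a and a', and hypothetical utilities of the consequences.
prob = {"E1": 0.3, "E2": 0.7, "E1'": 0.5, "E2'": 0.5}
utility = {"c1": 0.0, "c2": 1.0, "c1'": 0.4, "c2'": 0.6}

a = [("E1", "c1"), ("E2", "c2")]
a_prime = [("E1'", "c1'"), ("E2'", "c2'")]

# Prefer whichever action has the larger expected utility.
best = max([a, a_prime], key=lambda act: expected_utility(act, prob, utility))
```

With these invented numbers, a has expected utility 0.7 and a′ has 0.5, so maximising expected utility would rank a above a′.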

Definition 1.5 (Preferences on actions) For actions a, a′ ∈ A, a subjective decision-
maker regarding a not to be a preferable action to a′ is written a ≤ a′. For actions
a, a′ ∈ A, if both a ≤ a′ and a′ ≤ a, then a and a′ are said to be equivalent actions,
written a ∼ a′.

Axioms 2 Preferences on actions must be compatible with preferences on conse-
quences. Let E, F be events such that ∅ ⊆ E ⊆ F ⊆ Ω, and let c1, c2 ∈ C be such that
c1 ≤C c2. Then the following preference on actions must hold:

{(F, c1), (Fᶜ, c2)} ≤ {(E, c1), (Eᶜ, c2)}.

Remark 1.3 The two actions {(F, c1), (Fᶜ, c2)} and {(E, c1), (Eᶜ, c2)} only differ in
the consequences anticipated from any ω ∈ Eᶜ ∩ F; that is, the event Eᶜ ∩ F would
lead to a consequence of c1 under the first action and c2 under the second.

Remark 1.4 By Axiom 2, for ∅ ⊆ E ⊆ Ω and c1, c2 ∈ C, if c1 ≤C c2 then

{(Ω, c1)} ≤ {(E, c1), (Eᶜ, c2)} ≤ {(Ω, c2)}.

That is, if consequence c2 is preferable to consequence c1, then I should prefer a
strategy which guarantees consequence c2 over one carrying any risk of exposure to
consequence c1 through the occurrence of event E. Similarly, rather than guarantee-
ing the lesser consequence c1, I should prefer a strategy whereby the occurrence of
event Eᶜ will improve the consequence to c2.
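Anticipating the expected-utility development of Sect. 1.4.1, the sandwich ordering in Remark 1.4 can be checked numerically: for any P(E) and any utilities u(c1) ≤ u(c2), the expected utility of {(E, c1), (Eᶜ, c2)} lies between u(c1) and u(c2). The grid of probabilities and utilities below is my own arbitrary choice for illustration.

```python
def eu_two_event(p_e, u1, u2):
    """Expected utility of the action {(E, c1), (E-complement, c2)}
    when P(E) = p_e, u(c1) = u1 and u(c2) = u2."""
    return p_e * u1 + (1 - p_e) * u2

# For u(c1) <= u(c2), the expected utility is sandwiched between the
# two guaranteed actions: u(c1) <= EU <= u(c2), matching Remark 1.4.
for p in [0.0, 0.25, 0.5, 0.75, 1.0]:
    for u1, u2 in [(0.0, 1.0), (0.2, 0.8), (0.5, 0.5)]:
        eu = eu_two_event(p, u1, u2)
        assert u1 - 1e-12 <= eu <= u2 + 1e-12

# Enlarging the event carrying the worse consequence (E subset of F,
# so P(E) <= P(F)) can only lower expected utility, matching Axiom 2.
assert eu_two_event(0.8, 0.0, 1.0) <= eu_two_event(0.3, 0.0, 1.0)
```

The same inequality holds symbolically: p·u1 + (1 − p)·u2 is a convex combination of u1 and u2, so it can never leave the interval [u1, u2].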