David Paper

Hands-on Scikit-Learn for Machine Learning Applications
Data Science Fundamentals with Python
David Paper
Logan, UT, USA

Any source code or other supplementary material referenced by the author in this book is available to readers on GitHub via the book’s product page, located at www.apress.com/9781484253724. For more detailed information, please visit http://www.apress.com/source-code.

ISBN 978-1-4842-5372-4 e-ISBN 978-1-4842-5373-1


https://doi.org/10.1007/978-1-4842-5373-1

© David Paper 2020

This work is subject to copyright. All rights are reserved by the


Publisher, whether the whole or part of the material is concerned,
specifically the rights of translation, reprinting, reuse of illustrations,
recitation, broadcasting, reproduction on microfilms or in any other
physical way, and transmission or information storage and retrieval,
electronic adaptation, computer software, or by similar or dissimilar
methodology now known or hereafter developed.

Trademarked names, logos, and images may appear in this book. Rather
than use a trademark symbol with every occurrence of a trademarked
name, logo, or image we use the names, logos, and images only in an
editorial fashion and to the benefit of the trademark owner, with no
intention of infringement of the trademark. The use in this publication
of trade names, trademarks, service marks, and similar terms, even if
they are not identified as such, is not to be taken as an expression of
opinion as to whether or not they are subject to proprietary rights.

While the advice and information in this book are believed to be true
and accurate at the date of publication, neither the authors nor the
editors nor the publisher can accept any legal responsibility for any
errors or omissions that may be made. The publisher makes no
warranty, express or implied, with respect to the material contained
herein.

Distributed to the book trade worldwide by Springer Science+Business


Media New York, 233 Spring Street, 6th Floor, New York, NY 10013.
Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail orders-
[email protected], or visit www.springeronline.com. Apress Media,
LLC is a California LLC and the sole member (owner) is Springer
Science + Business Media Finance Inc (SSBM Finance Inc). SSBM
Finance Inc is a Delaware corporation.
For my mother, brothers, and friends.
Introduction
We apply the popular Scikit-Learn library to demonstrate machine
learning exercises with Python code to help readers solve machine
learning problems. The book is designed for those with intermediate
programming skills and some experience with machine learning
algorithms. We focus on application of the algorithms rather than theory, so readers should consult online or other sources for the theory as appropriate. The reader should also be willing to spend a lot of time working through the code examples because they are fairly deep. But the effort will pay off because the examples are intended to help the reader tackle complex problems.
The book is organized into eight chapters. Chapter 1 introduces the
topic of machine learning, Anaconda, and Scikit-Learn. Chapters 2 and 3
introduce algorithmic classification. Chapter 2 classifies simple data
sets and Chapter 3 classifies complex ones. Chapter 4 introduces
predictive modeling with regression. Chapters 5 and 6 introduce
classification tuning. Chapter 5 tunes simple data sets and Chapter 6
tunes complex ones. Chapter 7 introduces regression tuning for predictive modeling. Chapter 8 puts all of the knowledge together to review and present findings in a holistic manner.
Download this book’s example data by clicking the Download source code button found on the book’s catalog page at https://www.apress.com/us/book/9781484253724.
Table of Contents
Chapter 1: Introduction to Scikit-Learn
Machine Learning
Anaconda
Scikit-Learn
Data Sets
Characterize Data
Simple Classification Data
Complex Classification Data
Regression Data
Feature Scaling
Dimensionality Reduction
Chapter 2: Classification from Simple Training Sets
Simple Data Sets
Classifying Wine Data
Classifying Digits
Classifying Bank Data
Classifying make_moons
Chapter 3: Classification from Complex Training Sets
Complex Data Sets
Classifying fetch_20newsgroups
Classifying MNIST
Classifying fetch_lfw_people
Chapter 4: Predictive Modeling Through Regression
Regression Data Sets
Regressing tips
Regressing boston
Regressing wine data
Chapter 5: Scikit-Learn Classifier Tuning from Simple Training Sets
Tuning Data Sets
Tuning Iris Data
Tuning Digits Data
Tuning Bank Data
Tuning Wine Data
Chapter 6: Scikit-Learn Classifier Tuning from Complex Training Sets
Tuning Data Sets
Tuning fetch_lfw_people
Tuning MNIST
Tuning fetch_20newsgroups
Chapter 7: Scikit-Learn Regression Tuning
Tuning Data Sets
Tuning tips
Tuning boston
Tuning wine
Chapter 8: Putting It All Together
The Journey
Value and Cost
MNIST Value and Cost
Explaining MNIST to Money People
Explaining Output to Money People
Explaining the Confusion Matrix to Money People
Explaining Visualizations to Money People
Value and Cost
fetch_lfw_people Value and Cost
Explaining fetch_lfw_people to Money People
Explaining Output to Money People
Explaining Visualizations to Money People
Value and Cost
fetch_20newsgroups Value and Cost
Explaining fetch_20newsgroups to Money People
Explaining Output to Money People
Explaining the Confusion Matrix to Money People
Value and Cost
Index
About the Author and About the Technical
Reviewer

About the Author


David Paper
is a professor at Utah State University in the Management Information Systems department. He is the author of two books, Web Programming for Business: PHP Object-Oriented Programming with Oracle and Data Science Fundamentals for Python and MongoDB. He has over 70 publications in refereed journals such as Organizational Research Methods, Communications of the ACM, Information & Management, Information Resource Management Journal, Communications of the AIS, Journal of Information Technology Case and Application Research, and Long Range Planning. He has also served on several editorial boards in various capacities, including associate editor. Besides growing up in family businesses, Dr. Paper has worked for Texas Instruments, DLS, Inc., and the Phoenix Small Business Administration. He has performed IS consulting work for IBM, AT&T, Octel, Utah Department of Transportation, and the Space Dynamics Laboratory. Dr. Paper’s teaching and research interests include data science, machine learning, process reengineering, object-oriented programming, and change management.

About the Technical Reviewer


Jojo Moolayil
is an artificial intelligence, deep learning, machine learning, and decision science professional and published author of three books: Smarter Decisions – The Intersection of Internet of Things and Decision Science, Learn Keras for Deep Neural Networks, and Applied Supervised Learning with R. He has worked with industry leaders on several high-impact and critical data science and machine learning projects across multiple verticals. He is currently associated with Amazon Web Services as a research scientist – AI.
Jojo was born and raised in Pune, India, and graduated from the University of Pune with a major in Information Technology Engineering. He started his career with Mu Sigma Inc., the world’s largest pure-play analytics provider, and worked with the leaders of many Fortune 50 clients. He later worked with Flutura – an IoT analytics start-up – and GE, the pioneer and leader in Industrial AI.
He currently resides in Vancouver, BC. Apart from authoring books on deep learning, decision science, and IoT, Jojo has also been a technical reviewer for various books on the same subjects with Apress and Packt publications. He is an active data science tutor and maintains a blog at http://blog.jojomoolayil.com.
Jojo’s personal web site: www.jojomoolayil.com
Business e-mail: [email protected]
© David Paper 2020
D. Paper, Hands-on Scikit-Learn for Machine Learning Applications
https://doi.org/10.1007/978-1-4842-5373-1_1

1. Introduction to Scikit-Learn
David Paper1

(1) Logan, UT, USA

Scikit-Learn is a Python library that provides simple and efficient tools for implementing supervised and unsupervised machine learning algorithms. The library is accessible to everyone because it is open source and commercially usable. It is built on the NumPy, SciPy, and Matplotlib libraries, which means it is reliable, robust, and core to the Python language.
Scikit-Learn is focused on data modeling rather than data loading, cleansing, munging, or manipulating. It is also very easy to use and relatively free of programming bugs.

Machine Learning
Machine learning is getting computers to program themselves. We use algorithms to make this happen. An algorithm is a set of rules used to calculate or solve problems with a computer. Machine learning advocates create, study, and apply algorithms to improve performance on data-driven tasks. They use tools and technology to answer questions about data by training a machine how to learn.
The goal is to build robust algorithms that can manipulate input data to predict an output
while continually updating outputs as new data becomes available. Any information or data sent
to a computer is considered input. Data produced by a computer is considered output.
In the machine learning community, input data is referred to as the feature set and output data
is referred to as the target. The feature set is also referred to as the feature space. Sample data is
typically referred to as training data. Once the algorithm is trained with sample data, it can make
predictions on new data. New data is typically referred to as test data.
Machine learning is divided into two main areas: supervised and unsupervised learning. Since
machine learning typically focuses on prediction based on known properties learned from
training data, our focus is on supervised learning.
Supervised learning is when the data set contains both inputs (or the feature set) and desired
outputs (or targets). That is, we know the properties of the data. The goal is to make predictions.
This ability to supervise algorithm training is a big part of why machine learning has become so
popular.
To classify or regress new data, we must train on data with known outcomes. We classify data
by organizing it into relevant categories. We regress data by finding the relationship between
feature set data and target data.
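The train-then-predict workflow above can be sketched with Scikit-Learn. This is a minimal illustration rather than one of the book's listings; the data set (Iris) and classifier (logistic regression) are chosen here only for brevity:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# training data with known outcomes: feature set X and target y
X, y = load_iris(return_X_y=True)

# hold out a test set to stand in for "new" data
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0)

# train on data with known outcomes ...
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# ... then classify new data and check accuracy
print(model.predict(X_test[:5]))
print('test accuracy:', model.score(X_test, y_test))
```

The same fit/predict pattern applies to regression; only the algorithm and the target type change.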
With unsupervised learning, the data set contains only inputs but no desired outputs (or
targets). The goal is to explore the data and find some structure or way to organize it. Although
not the focus of the book, we will explore a few unsupervised learning scenarios.

Anaconda
You can use any Python installation, but I recommend installing Python with Anaconda for several
reasons. First, it has over 15 million users. Second, Anaconda allows easy installation of the
desired version of Python. Third, it preinstalls many useful libraries for machine learning
including Scikit-Learn. Follow this link to see the Anaconda package lists for your operating
system and Python version: https://docs.anaconda.com/anaconda/packages/pkg-
docs/. Fourth, it includes several very popular editors including IDLE, Spyder, and Jupyter
Notebooks. Fifth, Anaconda is reliable and well-maintained and removes compatibility
bottlenecks.
You can easily download and install Anaconda with this link: https://www.anaconda.com/download/. You can update with this link: https://docs.anaconda.com/anaconda/install/update-version/. Just open Anaconda and follow the instructions. I recommend updating to the current version.

Scikit-Learn
Python’s Scikit-Learn is one of the most popular machine learning libraries. It is built on Python
libraries NumPy, SciPy, and Matplotlib. The library is well-documented, open source,
commercially usable, and a great vehicle to get started with machine learning. It is also very
reliable and well-maintained, and its vast collection of algorithms can be easily incorporated into
your projects. Scikit-Learn is focused on modeling data rather than loading, manipulating,
visualizing, and summarizing data. For such activities, other libraries such as NumPy, pandas,
Matplotlib, and seaborn are covered as encountered. The Scikit-Learn library is imported into a
Python script as sklearn.
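A minimal sketch of the import convention (the version string will vary with your installation):

```python
import sklearn

# the package installs as scikit-learn but imports as sklearn
print(sklearn.__version__)

# individual algorithms are imported from sklearn subpackages
from sklearn.ensemble import RandomForestClassifier
print(RandomForestClassifier.__module__)
```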

Data Sets
A great way to understand machine learning applications is by working through Python data-driven code examples. We use Scikit-Learn, UCI Machine Learning Repository, or seaborn data sets for
all examples. The Scikit-Learn data sets package embeds some small data sets for getting started
and helpers to fetch larger data sets commonly used in the machine learning library to benchmark
algorithms on data from the world at large. The UCI Machine Learning Repository maintains 468
data sets to serve the machine learning community. Seaborn provides an API on top of Matplotlib
that offers simplicity when working with plot styles, color defaults, and high-level functions for
common statistical plot types that facilitate visualization. It also integrates nicely with Pandas
DataFrame functionality.
We chose the data sets for our examples because the machine learning community uses them
for learning, exploring, benchmarking, and validating, so we can compare our results to others
while learning how to apply machine learning algorithms.
Our data sets are categorized as either classification or regression data. Classification data
complexity ranges from simple to relatively complex. Simple classification data sets include
load_iris, load_wine, bank.csv, and load_digits. Complex classification data sets include fetch_20newsgroups, MNIST, and fetch_lfw_people. Regression data sets include tips, redwine.csv, whitewine.csv, and load_boston.

Characterize Data
Before working with algorithms, it is best to understand the data characterization. Each data set
was carefully chosen to help you gain experience with the most common aspects of machine
learning. We begin by describing the characteristics of each data set to better understand its
composition and purpose. Data sets are organized by classification and regression data.
Classification data is further organized by complexity. That is, we begin with simple
classification data sets that are not complex so that the reader can focus on the machine learning
content rather than on the data. We then move onto more complex data sets.

Simple Classification Data


Classification is a machine learning technique for predicting the class to which a dependent variable belongs. A class is a discrete response. In machine learning, a dependent variable is typically referred to as the target. A class is predicted based upon the independent variables of a data set. Independent variables are typically referred to as the feature set or feature space. Feature space is the collection of features used to characterize the data.
Simple data sets are those with a limited number of features. Such a data set is referred to as
one with a low-dimensional feature space.

Iris Data
The first data set we characterize is load_iris, which consists of Iris flower data. Iris is a
multivariate data set consisting of 50 samples from each of three species of iris (Iris setosa, Iris
virginica, and Iris versicolor). Each sample contains four features, namely, length and width of
sepals and petals in centimeters. Iris is a typical test case for machine learning classification. It is
also one of the best known data sets in the data science literature, which means you can test your
results against many other verifiable examples.
The first code example shown in Listing 1-1 loads Iris data, displays its keys, shape of the
feature set and target, feature and target names, a slice from the DESCR key, and feature
importance (from most to least).

from sklearn import datasets
from sklearn.ensemble import RandomForestClassifier

if __name__ == "__main__":
    br = '\n'
    iris = datasets.load_iris()
    keys = iris.keys()
    print (keys, br)
    X = iris.data
    y = iris.target
    print ('features shape:', X.shape)
    print ('target shape:', y.shape, br)
    features = iris.feature_names
    targets = iris.target_names
    print ('feature set:')
    print (features, br)
    print ('targets:')
    print (targets, br)
    print (iris.DESCR[525:900], br)
    rnd_clf = RandomForestClassifier(random_state=0,
                                     n_estimators=100)
    rnd_clf.fit(X, y)
    rnd_name = rnd_clf.__class__.__name__
    feature_importances = rnd_clf.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    print ('most important features' + ' (' + rnd_name + '):')
    [print (row) for i, row in enumerate(importance)]
Listing 1-1 Characterize the Iris data set
Go ahead and execute the code from Listing 1-1. Remember that you can find the example in the book’s example download. You don’t need to type the example by hand. It’s easier to access the example download and copy/paste.
Your output from executing Listing 1-1 should resemble the following:

dict_keys(['data', 'target', 'target_names', 'DESCR',


'feature_names', 'filename'])

features shape: (150, 4)


target shape: (150,)

feature set:
['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)', 'petal
width (cm)']

targets:
['setosa' 'versicolor' 'virginica']

============== ==== ==== ======= ===== ====================


Min Max Mean SD Class Correlation
============== ==== ==== ======= ===== ====================
sepal length: 4.3 7.9 5.84 0.83 0.7826
sepal width: 2.0 4.4 3.05 0.43 -0.4194
petal length: 1.0 6.9 3.76 1.76 0.9490 (high!)
petal width:

most important features (RandomForestClassifier):


(0.4604447396171521, 'petal length (cm)')
(0.4241162651271012, 'petal width (cm)')
(0.09090795402103086, 'sepal length (cm)')
(0.024531041234715754, 'sepal width (cm)')

The code begins by importing datasets and RandomForestClassifier packages.


RandomForestClassifier is an ensemble learning method that constructs a multitude of decision trees at training time and outputs the class that is the mode of the classes predicted by the individual trees.
In this example, we are only using it to return feature importance. The main block begins by
loading data and displaying its characteristics. Loading feature set data into variable X and target
data into variable y is convention in the machine learning community.
The code concludes by training RandomForestClassifier on the data so that it can return feature importance. When actually modeling data, we convert pandas data to NumPy arrays for optimum performance. Keep in mind that the keys are available because the data set is embedded in Scikit-Learn.
Notice that we only took a small slice from DESCR, which holds a lot of information about the
data set. I always recommend displaying at least the shape of the original data set before
embarking on any machine learning experiment.

Tip RandomForestClassifier is a powerful machine learning algorithm that not only models training data but also returns feature importance.

Wine Data
The next data set we characterize is load_wine. The load_wine data set consists of 178 data
elements. Each element has thirteen features that describe three target classes. It is considered a
classic in the machine learning community and offers an easy multi-classification data set.
The next code example shown in Listing 1-2 loads wine data and displays its keys, shape of the
feature set and target, feature and target names, a slice from the DESCR key, and feature
importance (from most to least).

from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier

if __name__ == "__main__":
    br = '\n'
    data = load_wine()
    keys = data.keys()
    print (keys, br)
    X, y = data.data, data.target
    print ('features:', X.shape)
    print ('targets', y.shape, br)
    print (X[0], br)
    features = data.feature_names
    targets = data.target_names
    print ('feature set:')
    print (features, br)
    print ('targets:')
    print (targets, br)
    rnd_clf = RandomForestClassifier(random_state=0,
                                     n_estimators=100)
    rnd_clf.fit(X, y)
    rnd_name = rnd_clf.__class__.__name__
    feature_importances = rnd_clf.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    n = 6
    print (n, 'most important features' + ' (' + rnd_name + '):')
    [print (row) for i, row in enumerate(importance) if i < n]
Listing 1-2 Characterize load_wine

After executing code from Listing 1-2, your output should resemble the following:

dict_keys(['data', 'target', 'target_names', 'DESCR',


'feature_names'])

features: (178, 13)


targets (178,)

[1.423e+01 1.710e+00 2.430e+00 1.560e+01 1.270e+02 2.800e+00


3.060e+00
2.800e-01 2.290e+00 5.640e+00 1.040e+00 3.920e+00 1.065e+03]

feature set:
['alcohol', 'malic_acid', 'ash', 'alcalinity_of_ash', 'magnesium',
'total_phenols', 'flavanoids', 'nonflavanoid_phenols',
'proanthocyanins', 'color_intensity', 'hue',
'od280/od315_of_diluted_wines', 'proline']

targets:
['class_0' 'class_1' 'class_2']

6 most important features (RandomForestClassifier):


(0.19399882779940295, 'proline')
(0.16095401215681593, 'flavanoids')
(0.1452667364559143, 'color_intensity')
(0.11070045042456281, 'alcohol')
(0.1097465262717493, 'od280/od315_of_diluted_wines')
(0.08968972021098301, 'hue')

Tip To create (instantiate) a machine learning algorithm (model), just assign it to a variable
(e.g., model = algorithm()). To train based on the model, just fit it to the data (e.g., model.fit(X,
y)).
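As a concrete instance of the tip (the algorithm and data set here are illustrative choices, not from the book's listings):

```python
from sklearn.datasets import load_wine
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)

# instantiate: assign the algorithm to a variable
model = DecisionTreeClassifier(random_state=0)

# train: fit the model to the data
model.fit(X, y)

# a fitted model can then score or predict
print('training accuracy:', model.score(X, y))
```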

The code begins by importing load_wine and RandomForestClassifier. The main block displays
keys, loads data into X and y, displays the first vector from feature set X, displays shapes, and
displays feature set and target information. The code concludes by training X with
RandomForestClassifier, so we can display the six most important features. Notice that we display
the first vector from feature set X to verify that all features are numeric.

Bank Data
The next code example shown in Listing 1-3 works with bank data. The bank.csv data set is
composed of direct marketing campaigns from a Portuguese banking institution. The target is
described by whether a client will subscribe (yes/no) to a term deposit (target label y). It consists
of 41188 data elements with 20 features for each element. A 10% random sample of 4119 data elements is also available from the UCI Machine Learning Repository site for more computationally expensive algorithms such as svm and KNeighborsClassifier.

import pandas as pd

if __name__ == "__main__":
    br = '\n'
    f = 'data/bank.csv'
    bank = pd.read_csv(f)
    features = list(bank)
    print (features, br)
    X = bank.drop(['y'], axis=1).values
    y = bank['y'].values
    print (X.shape, y.shape, br)
    print (bank[['job', 'education', 'age', 'housing',
                 'marital', 'duration']].head())
Listing 1-3 Characterize bank data
After executing code from Listing 1-3, your output should resemble the following:

['age', 'job', 'marital', 'education', 'default', 'housing', 'loan',


'contact', 'month', 'day_of_week', 'duration', 'campaign', 'pdays',
'previous', 'poutcome', 'emp.var.rate', 'cons.price.idx',
'cons.conf.idx', 'euribor3m', 'nr.employed', 'y']

(41188, 20) (41188,)

job education age housing marital duration


0 housemaid basic.4y 56 no married 261
1 services high.school 57 no married 149
2 services high.school 37 yes married 226
3 admin. basic.6y 40 no married 151
4 services high.school 56 no married 307

The code example begins by importing the pandas package. The main block loads bank data
from a CSV file into a Pandas DataFrame and displays the column names (or features). To retrieve
column names from pandas, all we need to do is make the DataFrame a list and assign the result
to a variable. Next, feature set X and target y are created. Finally, X and y shapes are displayed as
well as a few choice features.
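The column-name retrieval works because iterating a DataFrame yields its column labels. A small stand-in DataFrame illustrates this (the book's example reads data/bank.csv instead):

```python
import pandas as pd

# a toy stand-in for the bank DataFrame
bank = pd.DataFrame({'age': [56, 57],
                     'job': ['housemaid', 'services'],
                     'y': ['no', 'no']})

# making the DataFrame a list yields the column names
features = list(bank)
print(features)  # ['age', 'job', 'y']

# dropping the target column leaves the feature set
X = bank.drop(['y'], axis=1).values
y = bank['y'].values
print(X.shape, y.shape)  # (2, 2) (2,)
```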

Digits Data
The final code example in this subsection is load_digits. The load_digits data set consists of 1797 8 × 8 handwritten images. Each image is represented by 64 pixels (based on an 8 × 8 matrix), which make up the feature set. The ten targets represent the digits zero through nine.
Listing 1-4 contains the code that characterizes load_digits.

import numpy as np
from sklearn.datasets import load_digits
import matplotlib.pyplot as plt

if __name__ == "__main__":
    br = '\n'
    digits = load_digits()
    print (digits.keys(), br)
    print ('2D shape of digits data:', digits.images.shape, br)
    X = digits.data
    y = digits.target
    print ('X shape (8x8 flattened to 64 pixels):', end=' ')
    print (X.shape)
    print ('y shape:', end=' ')
    print (y.shape, br)
    i = 500
    print ('vector (flattened matrix) of "feature" image:')
    print (X[i], br)
    print ('matrix (transformed vector) of a "feature" image:')
    X_i = np.array(X[i]).reshape(8, 8)
    print (X_i, br)
    print ('target:', y[i], br)
    print ('original "digits" image matrix:')
    print (digits.images[i])
    plt.figure(1, figsize=(3, 3))
    plt.title('reshaped flattened vector')
    plt.imshow(X_i, cmap='gray', interpolation='gaussian')
    plt.figure(2, figsize=(3, 3))
    plt.title('original images dataset')
    plt.imshow(digits.images[i], cmap='gray',
               interpolation='gaussian')
    plt.show()
Listing 1-4 Characterize load_digits
After executing code from Listing 1-4, your output should resemble the following:

dict_keys(['data', 'target', 'target_names', 'images', 'DESCR'])

2D shape of digits data: (1797, 8, 8)

X shape (8x8 flattened to 64 pixels): (1797, 64)


y shape: (1797,)

vector (flattened matrix) of "feature" image:


[ 0.  0.  3. 10. 14.  3.  0.  0.  0.  8. 16. 11. 10. 13.  0.  0.
  0.  7. 14.  0.  1. 15.  2.  0.  0.  2. 16.  9. 16. 16.  1.  0.
  0.  0. 12. 16. 15. 15.  2.  0.  0.  0. 12. 10.  0.  8.  8.  0.
  0.  0.  9. 12.  4.  7. 12.  0.  0.  0.  2. 11. 16. 16.  9.  0.]

matrix (transformed vector) of a "feature" image:


[[ 0. 0. 3. 10. 14. 3. 0. 0.]
[ 0. 8. 16. 11. 10. 13. 0. 0.]
[ 0. 7. 14. 0. 1. 15. 2. 0.]
[ 0. 2. 16. 9. 16. 16. 1. 0.]
[ 0. 0. 12. 16. 15. 15. 2. 0.]
[ 0. 0. 12. 10. 0. 8. 8. 0.]
[ 0. 0. 9. 12. 4. 7. 12. 0.]
[ 0. 0. 2. 11. 16. 16. 9. 0.]]

target: 8

original "digits" image matrix:


[[ 0. 0. 3. 10. 14. 3. 0. 0.]
[ 0. 8. 16. 11. 10. 13. 0. 0.]
[ 0. 7. 14. 0. 1. 15. 2. 0.]
[ 0. 2. 16. 9. 16. 16. 1. 0.]
[ 0. 0. 12. 16. 15. 15. 2. 0.]
[ 0. 0. 12. 10. 0. 8. 8. 0.]
[ 0. 0. 9. 12. 4. 7. 12. 0.]
[ 0. 0. 2. 11. 16. 16. 9. 0.]]
Listing 1-4 also displays Figures 1-1 and 1-2. Figure 1-1 is a reshaped flattened vector of the 500th image in the data set. Each data element in feature set X is represented as a flattened vector of 64 pixels because Scikit-Learn cannot recognize an 8 × 8 image matrix, so we must reshape the 500th vector to an 8 × 8 image matrix to visualize it. Figure 1-2 is the 500th image taken directly from the images data set that is available when we load the data into variable digits.

Figure 1-1 Reshaped flattened vector of the 500th data element

Figure 1-2 Original image matrix of the 500th data element

The code begins by importing numpy, load_digits, and matplotlib packages. The main block
places load_digits into the digits variable and displays its keys: data, target, target_names, images,
and DESCR. It continues by displaying the two-dimensional (2D) shape of images contained in
images. Data in images are represented by 1797 8 × 8 matrices. Next, feature data (represented as
vectors) are placed in X and target data in y.
A feature vector is one that contains information about an object’s important characteristics.
Data in data are represented by 1797 64-pixel feature vectors. A simple feature representation of
an image is the raw intensity value of each pixel. So, an 8 × 8 image is represented by 64 pixels.
Machine learning algorithms process feature data as vectors, so each element in data must be a
one-dimensional (1D) vector representation of its 2D image matrix.

Tip Feature data must be composed of vectors to work with machine learning algorithms.

The code continues by displaying the feature vector of the 500th image. Next, the 500th feature
vector is transformed from its flattened 1D vector shape into a 2D image matrix and displayed
with the NumPy reshape function. The code continues by displaying the target value y of the
500th image. Next, the 500th image matrix is displayed by referencing images.
The reason we transformed the image from its 1D flattened vector state to the 2D image
matrix is that most data sets don't include an images object like load_digits does. So, to visualize
and process data with machine learning algorithms, we must be able to manually flatten images
and transform flattened images back to their original 2D matrix shape.
The code concludes by visualizing the 500th image in two ways. First, we use the flattened
vector X_i. Second, we reference images. While machine learning algorithms require feature
vectors, function imshow requires 2D image matrices to visualize.
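Because this flatten/reshape round trip recurs throughout the book, it is worth seeing in isolation. Here is a minimal sketch using a synthetic 8 × 8 array in place of a digits image (no data download is assumed):

```python
import numpy as np

# a synthetic 8 x 8 "image" standing in for one digits matrix
image = np.arange(64).reshape(8, 8)

# flatten to the 1D feature vector machine learning algorithms expect
vector = image.reshape(-1)          # shape (64,)

# transform the flattened vector back to a 2D matrix for imshow
restored = vector.reshape(8, 8)     # shape (8, 8)

print(vector.shape, restored.shape)          # (64,) (8, 8)
print(np.array_equal(image, restored))       # True: no information is lost
```

The round trip is lossless because reshape only changes how the same 64 values are indexed, not the values themselves.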

Complex Classification Data


Now let’s work with more complex data sets. Complex data sets are those with a very high
number of features. Such a data set is referred to as one with a high-dimensional feature space.

Newsgroup Data
The first data set we characterize is fetch_20newsgroups, which consists of approximately 18000
posts on 20 topics. Data is split into train-test subsets. The split is based on messages posted
before and after a specific date.
Listing 1-5 contains the code that characterizes fetch_20newsgroups.

from sklearn.datasets import fetch_20newsgroups

if __name__ == "__main__":
    br = '\n'
    train = fetch_20newsgroups(subset='train')
    test = fetch_20newsgroups(subset='test')
    print ('data:')
    print (train.target.shape, 'shape of train data')
    print (test.target.shape, 'shape of test data', br)
    targets = test.target_names
    print (targets, br)
    categories = ['rec.autos', 'rec.motorcycles', 'sci.space',
                  'sci.med']
    train = fetch_20newsgroups(subset='train',
                               categories=categories)
    test = fetch_20newsgroups(subset='test',
                              categories=categories)
    print ('data subset:')
    print (train.target.shape, 'shape of train data')
    print (test.target.shape, 'shape of test data', br)
    targets = train.target_names
    print (targets)
Listing 1-5 Characterize fetch_20newsgroups
After executing code from Listing 1-5, your output should resemble the following:

data:
(11314,) shape of train data
(7532,) shape of test data

['alt.atheism', 'comp.graphics', 'comp.os.ms-windows.misc',


'comp.sys.ibm.pc.hardware', 'comp.sys.mac.hardware',
'comp.windows.x', 'misc.forsale', 'rec.autos', 'rec.motorcycles',
'rec.sport.baseball', 'rec.sport.hockey', 'sci.crypt',
'sci.electronics', 'sci.med', 'sci.space', 'soc.religion.christian',
'talk.politics.guns', 'talk.politics.mideast', 'talk.politics.misc',
'talk.religion.misc']

data subset:
(2379,) shape of train data
(1584,) shape of test data

['rec.autos', 'rec.motorcycles', 'sci.med', 'sci.space']

The code begins by importing fetch_20newsgroups. The main block begins by loading train
and test data and displaying their shapes. Training data consists of 11314 postings, while test
data consists of 7532 postings. The code continues by displaying target names and categories.
Next, train and test data are created from a subset of categories. The code concludes by displaying
shapes and target names of the subset.
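The newsgroup posts themselves are raw text, so before any classifier can use them they must be converted into numeric feature vectors, which is where the high-dimensional feature space comes from. A minimal sketch with Scikit-Learn's CountVectorizer on a toy corpus (the three posts below are invented for illustration, not taken from the data set):

```python
from sklearn.feature_extraction.text import CountVectorizer

# a toy corpus standing in for newsgroup posts
posts = ['the engine stalls at idle',
         'new helmet for my motorcycle',
         'orbital mechanics of the space station']

# build a document-term matrix: one row per post,
# one column per distinct word in the corpus
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(posts)
print(X.shape)   # (3, 15): 3 posts, 15 distinct terms
```

With thousands of real posts the vocabulary grows to tens of thousands of columns, producing the very high number of features that defines a complex data set.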

MNIST Data
The next data set we characterize is MNIST. MNIST (Modified National Institute of Standards and
Technology) is a large database of handwritten digits commonly used for training and testing in
the machine learning community and other industrial image processing applications. MNIST
contains 70000 examples of handwritten digit images labeled from 0 to 9 of size 28 × 28. Each
target (or label) is stored as a digit value. The feature set is a matrix of 70000 28 × 28 images
automatically flattened to 784 pixels each. So, each of the 70000 data elements is a vector of
length 784. The target set is a vector of 70000 digit values.
Listing 1-6 contains the code that characterizes MNIST.

import numpy as np
from random import randint
import matplotlib.pyplot as plt

def find_image(data, labels, d):
    for i, row in enumerate(labels):
        if d == row:
            target = row
            X_pixels = np.array(data[i])
            return (target, X_pixels)

if __name__ == "__main__":
    br = '\n'
    X = np.load('data/X_mnist.npy')
    y = np.load('data/y_mnist.npy')
    target = np.load('data/mnist_targets.npy')
    print ('labels (targets):')
    print (target, br)
    print ('feature set shape:')
    print (X.shape, br)
    print ('target set shape:')
    print (y.shape, br)
    indx = randint(0, y.shape[0]-1)
    target = y[indx]
    X_pixels = np.array(X[indx])
    print ('the feature image consists of', len(X_pixels),
           'pixels')
    X_image = X_pixels.reshape(28, 28)
    plt.figure(1, figsize=(3, 3))
    title = 'image @ indx ' + str(indx) + ' is digit ' \
            + str(int(target))
    plt.title(title)
    plt.imshow(X_image, cmap="gray")
    digit = 7
    target, X_pixels = find_image(X, y, digit)
    X_image = X_pixels.reshape(28, 28)
    plt.figure(2, figsize=(3, 3))
    title = 'find first ' + str(int(target)) + ' in dataset'
    plt.title(title)
    plt.imshow(X_image, cmap="gray")
    plt.show()
Listing 1-6 Characterize MNIST
After executing code from Listing 1-6, your output should resemble the following:

labels (targets):
[0. 1. 2. 3. 4. 5. 6. 7. 8. 9.]

feature set shape:


(70000, 784)

target set shape:


(70000,)

the feature image consists of 784 pixels

Listing 1-6 also displays Figures 1-3 and 1-4. Figure 1-3 is the reshaped image of digit 1 at
index 6969. Figure 1-4 is the first image of digit 7 in the data set.
Figure 1-3 Reshaped flattened vector of image at index 6969

Figure 1-4 Image of first digit 7 in the data set


The code begins by importing randint and the other requisite packages. Function find_image
locates the first occurrence of an image with a given digit label. The main block loads data from
NumPy files into feature set X, target vector y, and variable target, which holds the distinct target
labels. It continues by displaying the shapes of X and y. Feature set X consists of 70000 784-pixel
vectors, so each image consists of 28 × 28 pixels.
Target y consists of 70000 labels. Next, a random integer between 0 and 69999 is generated
so we can display a random image from the data set. The random integer in our case is 6969, and
the image at index 6969 is digit 1. The size of the image is displayed to verify that it is 784 pixels.
We then reshape the vector at index 6969 to a 28 × 28 matrix so we can visualize it with function
imshow. The code concludes by finding the first digit 7 and displaying it.
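Function find_image scans the labels with an explicit loop; NumPy can perform the same first-occurrence lookup in vectorized form. A sketch on synthetic arrays whose shapes mimic MNIST (the values are fabricated; note that np.argmax returns 0 when the digit is absent, so the hit should be verified):

```python
import numpy as np

# synthetic stand-ins: 100 feature vectors of 784 pixels,
# labels 0..9 repeated so each digit is present
X = np.arange(100 * 784, dtype=float).reshape(100, 784)
y = np.tile(np.arange(10.0), 10)

digit = 7
indx = int(np.argmax(y == digit))   # index of the first occurrence
assert y[indx] == digit             # guard: argmax gives 0 if digit is absent
X_image = X[indx].reshape(28, 28)   # ready for plt.imshow
print(indx)                          # -> 7
```

On 70000 labels the vectorized comparison avoids the Python-level loop entirely, which matters once lookups happen inside larger pipelines.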
Faces Data
The final data set characterized in this subsection is fetch_lfw_people. The fetch_lfw_people data
set is used for classifying labeled faces. It contains 1288 face images and seven targets. Each
image is represented by a 50 × 37 matrix of pixels, so the feature set has 1850 features (50 × 37
flattened). In all, the data consists of 1288 labeled faces composed of 1850 pixels each
predicting seven targets.
Listing 1-7 contains the code that characterizes fetch_lfw_people.

import numpy as np
import matplotlib.pyplot as plt

if __name__ == "__main__":
    br = '\n'
    X = np.load('data/X_faces.npy')
    y = np.load('data/y_faces.npy')
    targets = np.load('data/faces_targets.npy')
    print ('shape of feature and target data:')
    print (X.shape)
    print (y.shape, br)
    print ('target faces:')
    print (targets)
    X_i = np.array(X[0]).reshape(50, 37)
    image_name = targets[y[0]]
    fig, ax = plt.subplots()
    image = ax.imshow(X_i, cmap="bone")
    plt.title(image_name)
    plt.show()
Listing 1-7 Characterize fetch_lfw_people

After executing code from Listing 1-7, your output should resemble the following:

shape of feature and target data:


(1288, 1850)
(1288,)

target faces:
['Ariel Sharon' 'Colin Powell' 'Donald Rumsfeld' 'George W Bush'
'Gerhard Schroeder' 'Hugo Chavez' 'Tony Blair']

Listing 1-7 also displays Figure 1-5. Figure 1-5 is the reshaped image of the first data element
in the data set.
Figure 1-5 Reshaped image of the first data element in the data set
The code begins by importing requisite packages. The main block loads data into X, y, and
targets from NumPy files. The code continues by printing shapes of X and y. X contains 1288
1850-pixel vectors and y contains 1288 target values. Target labels are then displayed. The code
concludes by reshaping the first feature vector to a 50 × 37 image and displaying it with function
imshow.

Regression Data
We now change gears away from classification and move into regression. Regression is a machine
learning technique for predicting a numerical value based on the independent variables (or
feature set) of a data set. That is, we are measuring the impact of the feature set on a numerical
output. The first data set we characterize for regression is tips.

Tips Data
The tips data set ships with the Seaborn library. It consists of food server tips collected in
restaurants together with related factors such as the price of the meal and the time of day. Specifically,
features include total_bill (price of the meal in US dollars), tip (gratuity in US dollars), sex (male or
female), smoker (yes or no), day (Thursday, Friday, Saturday, or Sunday), time (lunch or dinner), and
size of the party. In numerically coded form, the categorical features map as follows: sex (0=male,
1=female), smoker (0=no, 1=yes), day (3=Thur, 4=Fri, 5=Sat, 6=Sun). Tips data consists of 244
elements with six features predicting one target: the tip received from customers.
Listing 1-8 characterizes tips data.

import numpy as np, pandas as pd, seaborn as sns

if __name__ == "__main__":
    br = '\n'
    sns.set(color_codes=True)
    tips = sns.load_dataset('tips')
    print (tips.head(), br)
    X = tips.drop(['tip'], axis=1).values
    y = tips['tip'].values
    print (X.shape, y.shape)
Listing 1-8 Characterize the tips data set
After executing code from Listing 1-8, your output should resemble the following:

total_bill tip sex smoker day time size


0 16.99 1.01 Female No Sun Dinner 2
1 10.34 1.66 Male No Sun Dinner 3
2 21.01 3.50 Male No Sun Dinner 3
3 23.68 3.31 Male No Sun Dinner 2
4 24.59 3.61 Female No Sun Dinner 4

(244, 6) (244,)

The code begins by loading tips as a Pandas DataFrame, displaying the first five records,
converting the data to NumPy arrays, and displaying the feature set and target shapes. Seaborn
data loads automatically as a Pandas DataFrame. We couldn't compute feature importance here
because RandomForestRegressor expects numeric feature data, and several tips features are
categorical. It takes a great deal of data wrangling to get the data set into numeric form. We will
transform categorical data to numeric in later chapters.
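One simple way to wrangle the categorical columns into numeric form is one-hot encoding with the pandas get_dummies function. This differs from the ordinal coding described earlier (0=male, 1=female, and so on), and the tiny DataFrame below is hand-built for illustration rather than the real tips data:

```python
import pandas as pd

# a tiny frame mimicking the tips columns (values are illustrative)
tips = pd.DataFrame({'total_bill': [16.99, 10.34],
                     'sex': ['Female', 'Male'],
                     'smoker': ['No', 'No'],
                     'tip': [1.01, 1.66]})

X = tips.drop(['tip'], axis=1)

# get_dummies expands each categorical column into indicator columns,
# leaving numeric columns (total_bill) untouched
X_numeric = pd.get_dummies(X)
print(sorted(X_numeric.columns))
# ['sex_Female', 'sex_Male', 'smoker_No', 'total_bill']
```

After this transformation every column is numeric, so an ensemble algorithm such as a random forest can consume the feature set directly.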

Red and White Wine


The next two data sets we characterize are redwine.csv and whitewine.csv. Data sets redwine.csv
and whitewine.csv relate to red and white wine quality, respectively. Both wines are composed of
variants of the Portuguese Vinho Verde wine.
The feature set consists of eleven attributes. The input attributes are based on objective tests
like pH (acidity or basicity of a substance) and alcohol (percent by volume). Output quality is
based on sensory data reported as the median of at least three wine expert evaluations. Each
expert graded wine quality on a scale from 0 (very bad) to 10 (very excellent). The red wine data
set has 1599 instances while the white wine data set has 4898.
Listing 1-9 characterizes redwine.csv.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor

if __name__ == "__main__":
    br = '\n'
    f = 'data/redwine.csv'
    red_wine = pd.read_csv(f)
    X = red_wine.drop(['quality'], axis=1)
    y = red_wine['quality']
    print (X.shape)
    print (y.shape, br)
    features = list(X)
    rfr = RandomForestRegressor(random_state=0,
                                n_estimators=100)
    rfr_name = rfr.__class__.__name__
    rfr.fit(X, y)
    feature_importances = rfr.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    n = 3
    print (n, 'most important features' + ' (' + rfr_name + '):')
    [print (row) for i, row in enumerate(importance) if i < n]
    for row in importance:
        print (row)
    print ()
    print (red_wine[['alcohol', 'sulphates', 'volatile acidity',
                     'total sulfur dioxide', 'quality']].head())
Listing 1-9 Characterize redwine
After executing code from Listing 1-9, your output should resemble the following:

(1599, 11)
(1599,)

3 most important features (RandomForestRegressor):


(0.27432500255956216, 'alcohol')
(0.13700073893077233, 'sulphates')
(0.13053941311188708, 'volatile acidity')
(0.27432500255956216, 'alcohol')
(0.13700073893077233, 'sulphates')
(0.13053941311188708, 'volatile acidity')
(0.08068199773601588, 'total sulfur dioxide')
(0.06294612644261727, 'chlorides')
(0.057730976351602854, 'pH')
(0.055499749756166, 'residual sugar')
(0.05198192402458334, 'density')
(0.05114079873500658, 'fixed acidity')
(0.049730883807319035, 'free sulfur dioxide')
(0.04842238854446754, 'citric acid')

alcohol sulphates volatile acidity total sulfur dioxide quality


0 9.4 0.56 0.70 34.0 5.0
1 9.8 0.68 0.88 67.0 5.0
2 9.8 0.65 0.76 54.0 5.0
3 9.8 0.58 0.28 60.0 6.0
4 9.4 0.56 0.70 34.0 5.0

The code example begins by importing the pandas and RandomForestRegressor packages. The main
block loads redwine.csv into a Pandas DataFrame and displays the feature and target shapes. The
code concludes by fitting RandomForestRegressor to the data, displaying the three most important
features followed by the full importance ranking, and displaying the first five records from the data
set. RandomForestRegressor is also an ensemble algorithm, but it is used when the target is numeric
or continuous.
Tip Always hard-code random_state (e.g., random_state=0) for algorithms that use this
parameter to stabilize results.
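The effect of hard-coding random_state can be checked directly: two RandomForestRegressor instances built with the same seed on the same data produce identical feature importances. A small sketch on synthetic data (the shapes and the single informative feature are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# synthetic regression data where only feature 0 carries signal
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = 5 * X[:, 0] + rng.normal(scale=0.1, size=200)

rfr_a = RandomForestRegressor(random_state=0, n_estimators=50).fit(X, y)
rfr_b = RandomForestRegressor(random_state=0, n_estimators=50).fit(X, y)

# identical random_state -> identical importances run to run
print(np.allclose(rfr_a.feature_importances_,
                  rfr_b.feature_importances_))   # True
print(int(np.argmax(rfr_a.feature_importances_)))  # 0: feature 0 dominates
```

Without a fixed random_state, the bootstrap sampling and feature subsetting inside the forest vary between runs, so importance scores (and the sorted rankings derived from them) can shuffle slightly each time.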

The white wine example follows the exact same logic, but output differs in terms of data set size
and feature importance.
Listing 1-10 characterizes whitewine.csv.

import numpy as np, pandas as pd
from sklearn.ensemble import RandomForestRegressor

if __name__ == "__main__":
    br = '\n'
    f = 'data/whitewine.csv'
    white_wine = pd.read_csv(f)
    X = white_wine.drop(['quality'], axis=1)
    y = white_wine['quality']
    print (X.shape)
    print (y.shape, br)
    features = list(X)
    rfr = RandomForestRegressor(random_state=0,
                                n_estimators=100)
    rfr_name = rfr.__class__.__name__
    rfr.fit(X, y)
    feature_importances = rfr.feature_importances_
    importance = sorted(zip(feature_importances, features),
                        reverse=True)
    n = 3
    print (n, 'most important features' + ' (' + rfr_name + '):')
    [print (row) for i, row in enumerate(importance) if i < n]
    print ()
    print (white_wine[['alcohol', 'sulphates',
                       'volatile acidity',
                       'total sulfur dioxide',
                       'quality']].head())
Listing 1-10 Characterize whitewine

After executing code from Listing 1-10, your output should resemble the following:

(4898, 11)
(4898,)

3 most important features (RandomForestRegressor):


(0.24186185906056268, 'alcohol')
(0.1251626059551235, 'volatile acidity')
(0.11524332271725685, 'free sulfur dioxide')

alcohol sulphates volatile acidity total sulfur dioxide quality


0 8.8 0.45 0.27 170.0 6.0
1 9.5 0.49 0.30 132.0 6.0
2 10.1 0.44 0.28 97.0 6.0
Exploring the Variety of Random
Documents with Different Content
The Project Gutenberg eBook of Translations from
Lucretius
This ebook is for the use of anyone anywhere in the United
States and most other parts of the world at no cost and with
almost no restrictions whatsoever. You may copy it, give it away
or re-use it under the terms of the Project Gutenberg License
included with this ebook or online at www.gutenberg.org. If you
are not located in the United States, you will have to check the
laws of the country where you are located before using this
eBook.

Title: Translations from Lucretius

Author: Titus Lucretius Carus

Translator: R. C. Trevelyan

Release date: December 12, 2020 [eBook #64024]

Language: English

Credits: Sonya Schermann, Chuck Greif and the Online


Distributed
Proofreading Team at https://www.pgdp.net (This file
was
produced from images generously made available by
The Internet
Archive)

*** START OF THE PROJECT GUTENBERG EBOOK TRANSLATIONS


FROM LUCRETIUS ***
TRANSLATIONS FROM
LUCRETIUS
By the same Author.

The Foolishness of Solomon 3s. 6d.


Lucretius on Death 2s. 6d.
The Pterodamozels 2s.
The New Parsifal 3s. 6d.
The Bride of Dionysus 3s. 6d.
Sisyphus 5s.
Polyphemus 7s. 6d.
The Birth of Parsival 3s. 6d.
Cecilia Gonzaga 2s. 6d.
Mallow and Asphodel 2s. 6d.
The Ajax of Sophocles 2s.

TRANSLATIONS FROM
LUCRETIUS
BY
R. C. TREVELYAN

LONDON: GEORGE ALLEN & UNWIN LTD.


RUSKIN HOUSE 40 MUSEUM STREET, W.C. 1

First published in 1920.

All rights reserved.

TO
G. LOWES DICKINSON

TRANSLATIONS FROM
LUCRETIUS
BOOK I, lines 1-328
BOOK II, lines 991-1174
BOOK III, lines 1-160
BOOK III, lines 830-1094
BOOK IV, lines 962-1287
BOOK V
BOOK VI, lines 1-95
BOOK I, lines 1-328
Thou mother of the Aenead race, delight
Of men and deities, bountiful Venus, thou
Who under the sky’s gliding constellations
Fillest ship-carrying ocean with thy presence
And the corn-bearing lands, since through thy power
Each kind of living creature is conceived
Then riseth and beholdeth the sun’s light:
Before thee and thine advent the winds and clouds
Of heaven take flight, O goddess: daedal earth
Puts forth sweet-scented flowers beneath thy feet:
Beholding thee the smooth deep laughs, the sky
Grows calm and shines with wide-outspreading light.
For soon as the day’s vernal countenance
Has been revealed, and fresh from wintry bonds
Blows the birth-giving breeze of the West wind,
First do the birds of air give sign of thee,
Goddess, and thine approach, as through their hearts
Thine influence smites. Next the wild herds of beasts
Bound over the rich pastures and swim through
The rapid streams, as captured by thy charm
Each one with eager longing follows thee
Whithersoever thou wouldst lure them on.
And thus through seas, mountains and rushing rivers,
Through the birds’ leafy homes and the green plains,
Striking bland love into the hearts of all,
Thou art the cause that following his lust
Each should renew his race after his kind.
Therefore since thou alone art nature’s mistress,
And since without thine aid naught can rise forth
Into the glorious regions of the light,
Nor aught grow to be gladsome and delectable,
Thee would I win to help me while I write
These verses, wherein I labour to describe
The nature of things in honour of my friend
This scion of the Memmian house, whom thou
Hast willed to be found peerless all his days
In every grace. Therefore the more, great deity,
Grant to my words eternal loveliness:
Cause meanwhile that the savage works of warfare
Over all seas and lands sink hushed to rest.
For thou alone hast power to bless mankind
With tranquil peace; since of war’s savage works
Mavors mighty in battle hath control,
Who oft flings himself back upon thy lap,
Quite vanquished by love’s never-healing wound;
And so with upturned face and shapely neck
Thrown backward, feeds with love his hungry looks,
Gazing on thee, goddess, while thus he lies
Supine, and on thy lips his spirit hangs.
O’er him thus couched upon thy holy body
Do thou bend down to enfold him, and from thy lips
Pour tender speech, petitioning calm peace,
O glorious divinity, for thy Romans.
For nor can we in our country’s hour of trouble
Toil with a mind untroubled at our task,
Nor yet may the famed child of Memmius
Be spared from public service in such times.

For the rest,[A] leisured ears and a keen mind


Withdrawn from cares, lend to true reasoning,
Lest my gifts, which with loving diligence
I set out for you, ere they be understood
You should reject disdainfully. For now
About the most high theory of the heavens
And of the deities, I will undertake
To tell you in my discourse, and will reveal
The first beginnings of existing things,
Out of which nature gives birth and increase
And nourishment to all things; into which
Nature likewise, when they have been destroyed,
Resolves them back in turn. These we are wont,
I i f h ll
In setting forth our argument, to call
Matter, or else begetting particles,
Or to name them the seeds of things: again
As primal atoms we shall speak of them,
Because from them first everything is formed.

When prostrate upon earth lay human life


Visibly trampled down and foully crushed
Beneath religion’s cruelty, who meanwhile
Forth from the regions of the heavens above
Showed forth her face, lowering down on men
With horrible aspect, first did a man of Greece[B]
Dare to lift up his mortal eyes against her;
The first was he to stand up and defy her.
Him neither stories of the gods, nor lightnings,
Nor heaven with muttering menaces could quell,
But all the more did they arouse his soul’s
Keen valour, till he longed to be the first
To break through the fast-bolted doors of nature.
Therefore his fervent energy of mind
Prevailed, and he passed onward, voyaging far
Beyond the flaming ramparts of the world,
Ranging in mind and spirit far and wide
Throughout the unmeasured universe; and thence
A conqueror he returns to us, bringing back
Knowledge both of what can and what cannot
Rise into being, teaching us in fine
Upon what principle each thing has its powers
Limited, and its deep-set boundary stone.
Therefore now has religion been cast down
Beneath men’s feet, and trampled on in turn:
Ourselves heaven-high his victory exalts.

Herein this fear assails me, lest perchance


You should suppose I would initiate you
Into a school of reasoning unholy,
And set your feet upon a path of sin:
And set your feet upon a path of sin:
Whereas in truth often has this religion
Given birth to sinful and unholy deeds.
So once at Aulis did those chosen chiefs
Of Hellas, those most eminent among heros,
Foully defile the Trivian Virgin’s altar
With Iphianassa’s lifeblood. For so soon
As the fillet wreathed around her maiden locks
Streamed down in equal lengths from either cheek,
And soon as she was aware of her father standing
Sorrowful by the altar, and at his side
The priestly ministers hiding the knife,
And the folk shedding tears at sight of her,
Speechless in terror, dropping on her knees
To the earth she sank down. Nor in that hour
Of anguish might it avail her that she first
Had given the name of father to the king;
For by the hands of men lifted on high
Shuddering to the altar she was borne,
Not that, when the due ceremonial rites
Had been accomplished, she might be escorted
By the clear-sounding hymenaeal song,
But that a stainless maiden foully stained,
In the very season of marriage she might fall
A sorrowful victim by a father’s stroke,
That so there might be granted to the fleet
A happy and hallowed sailing. Such the crimes
Whereto religion has had power to prompt.

Yet there may come a time when you yourself,


Surrendering to the terror-breathing tales
Of seers and bards, will seek to abandon us.
Ay verily, how many dreams even now
May they be forging for you, which might well
Overturn your philosophy of life,
And trouble all your happiness with fear!
And with good cause: for if men could perceive
And with good cause: for if men could perceive
That there was a fixed limit to their sorrows,
By some means they would find strength to withstand
The hallowed lies and threatenings of these seers.
But as it is, men have no means, no power
To make a stand, since everlasting seem
The penalties that they must fear in death.
For none knows what is the nature of the soul,
Whether ’tis born, or on the contrary
Enters into our bodies at their birth:
Whether, when torn from us by death, it perishes
Together with us, or thereafter goes
To visit Orcus’ glooms and the vast chasms;
Or penetrates by ordinance divine
Into brutes in man’s stead, as sang our own
Ennius, who first from pleasant Helicon
Brought down a garland of unfading leaf,
Destined among Italian tribes of men
To win bright glory. And yet in spite of this
Ennius sets forth in immortal verse
That none the less there does exist a realm
Of Acheron, though neither do our souls
Nor bodies penetrate thither, but a kind
Of phantom images, pale in wondrous wise:
And thence it was, so he relates, that once
The ghost of ever-living Homer rose
Before him, shedding salt tears, and began
To unfold in discourse the nature of things.
Therefore not only must we grasp the truth
Concerning things on high, what principle
Controls the courses of the sun and moon,
And by what force all that takes place on earth
Is governed, but above all by keen thought
We must investigate whereof consists
The soul and the mind’s nature, and what it is
That comes before us when we wake, if then
We are preyed on by disease or when we lie
We are preyed on by disease, or when we lie
Buried in sleep, and terrifies our minds,
So that we seem face to face to behold
And hear those speaking to us who are dead,
Whose bones the earth now holds in its embrace.

Nor am I unaware how hard my task


In Latin verses to set clearly forth
The obscure truths discovered by the Greeks,
Chiefly because so much will need new terms
To deal with it, owing to our poverty
Of language, and the novelty of the themes.
Nevertheless your worth and the delight
Of your sweet friendship, which I hope to win,
Prompt me to bear the burden of any toil,
And lead me on to watch the calm nights through,
Seeking by means of what words and what measures
I may attain my end, and shed so clear
A light upon your spirit, that thereby
Your gaze may search the depths of hidden things.

This terror, then, and darkness of the mind


Must needs be scattered not by the sun’s beams
And day’s bright arrows, but by contemplation
Of nature’s aspect and her inward law.
And this first principle of her design
Shall be our starting point: nothing is ever
By divine will begotten out of nothing.
In truth the reason fear so dominates
All mortals, is that they behold on earth
And in the sky many things happening,
Yet of these operations by no means
Can they perceive the causes, and so fancy
That they must come to pass by power divine.
Therefore when we have understood that nothing
Can be born out of nothing, we shall then
Win juster knowledge of the truth we seek,
Both from what elements each thing can be formed,
And in what way all things can come to pass
Without the intervention of the gods.

For if things came from nothing, any kind


Might be born out of anything; naught then
Would require seed. Thus men might rise from ocean
The scaly race out of the land, while birds
Might suddenly be hatched forth from the sky:
Cattle and other herds and every kind
Of wild beast, bred by no fixed law of birth,
Would roam o’er tilth and wilderness alike.
No fruit would remain constant to its tree,
But would change; every tree would bear all kinds.
For if there were not for each thing its own
Begetting particles, how could they have
A fixed unvarying mother? But in fact
Since all are formed from fixed seeds, each is born
And issues into the borders of the light
From that alone wherein resides its substance
And its first bodies. And for this cause all things
Cannot be generated out of all,
Since in each dwells its own particular power.
Again why do we see in spring the rose,
Corn in the summer’s heat, vines bursting forth
When autumn summons them, if not because
When in their own time the fixed seeds of things
Have flowed together, there is then revealed
Whatever has been born, while the due seasons
Are present, and the quickened earth brings forth
Safely into the borders of the light
Its tender nurslings? But if they were formed
From nothing, they would suddenly spring up
At unfixed periods and hostile times,
Since there would then be no fixed particles
To be kept from a begetting union
By the unpropitious season of the year.
Nor yet after the meeting of the seed
Would lapse of time be needed for their increase,
If they could grow from nothing. Suddenly
Small babes would become youths; trees would arise
Shooting up in a moment from the ground.
But nothing of the kind, ’tis plain, takes place,
Seeing that all things grow little by little,
As befits, from determined seed, and growing
Preserve their kind: so that you may perceive
That all things become greater and are nourished
Out of their own material. Furthermore
Without fixed annual seasons for the rain
Earth could not put her gladdening produce forth,
Nor yet, if kept apart from nourishment,
Could living creatures propagate their kind
Or sustain life: so that with greater reason
You may think many things have many atoms
In common, as we see that different words
Have common letters, than that anything
Can come to being without first elements.
Again, why could not nature have produced
Men of such mighty bulk, that they could wade
Through the deep places of the sea, or rend
Huge mountains with their hands, or in one life
Overpass many living generations,
If not because there has been set apart
A changeless substance for begetting things,
And what can thence arise is predetermined?
Therefore we must confess this truth, that nothing
Can come from nothing, since seed is required
For each thing, out of which it may be born
And lift itself into the air’s soft breezes.
Lastly, since it is evident that tilled lands
Excel the untilled, and yield to labouring hands
A richer harvest, we may thence infer
That in the earth there must be primal atoms,
And these, labouring its soil, we stimulate
To rise, when with the coulter we turn up
The fertile clods. But if none such existed,
We should see all things without toil of ours
Spring forth far richer of their own accord.

Furthermore nature dissolves each form back


Into its own first particles, nor ever
Annihilates things. For if aught could be mortal
In all its parts, then it might from our eyes
Be snatched away to perish suddenly.
For there would be no need of any force
To cause disruption of its parts, and loosen
Their fastenings. But in fact each is composed
Of everlasting seeds; so till some force
Arrives that with a blow can shatter things
To pieces, or can penetrate within
Their empty spaces, and so break them up,
Nature will not permit the dissolution
Of anything to be seen. Again, if time
Utterly destroys, consuming all the substance
Of whatsoever it removes from sight
Through lapse of ages, out of what does Venus
Bring back into the light of life the race
Of living creatures each after its kind?
Or, once brought back, whence does the daedal earth
Feed and increase them, giving nourishment
To each after its kind? Whence do its own
Fountains and far-drawn rivers from without
Keep full the sea? Whence does the ether feed
The stars? For infinite time and lapse of days
Surely must long since have devoured all things
Formed of a body that must die. But if
Throughout that period of time long past
Those atoms have existed out of which
Thi i f thi h b d
This universe of things has been composed
And recomposed, ’tis plain they are possessed
Of an immortal nature: none of them
Therefore can turn to nothing. Then again
The same force and the same cause would destroy
All things without distinction, were it not
That an eternal substance held them fast,
A substance interwoven part with part
By bonds more or less close. For without doubt
A mere touch would be cause enough for death,
Seeing that any least amount of force
Must needs dissolve the texture of such things,
No one of which had an eternal body.
But in fact since the mutual fastenings
Between first elements are dissimilar,
And their substance eternal, things endure
With body uninjured, till some force arrives
Strong enough to dissolve the texture of each.
Therefore no single thing ever returns
To nothing, but at their disruption all
Pass back into the elements of matter.
Lastly the rain showers perish, when the sky father
Has flung them into the lap of mother earth.
But then bright crops spring up luxuriantly;
Boughs on the trees are green; the trees themselves
Grow, and with fruits are laden: from this source
Moreover both our own race and the race
Of beasts are nourished; for this cause we see
Glad towns teeming with children, leafy woods
With young birds’ voices singing on all sides;
For this cause cattle about the fertile meadows
Wearied with fatness lay their bodies down,
And from their swollen udders oozing falls
The white milk stream; for this cause a new brood
Bounds on weak limbs over the soft grass, frisking
And gamboling, their young hearts with pure milk thrilled.
None therefore of those things that seem to perish
Utterly perishes, since nature forms
One thing out of another, and permits
Nothing to be begotten, unless first
She has been recruited by another’s death.

Now listen: since I have proved to you that things
Cannot be formed from nothing, lest you yet
Should tend in any way to doubt my words,
Because the primal particles of things
Can never be distinguished by the eyes,
I will proceed to give you instances
Of bodies which yourself you must admit
Are real things, yet cannot be perceived.
First the wind’s wakened force scourges the sea,
Whelming huge ships and scattering the clouds;
And sometimes with impetuous hurricane
Scouring the plains, it strews them with great trees,
And ravages with forest-rending blasts
The mountain-tops: with such rude savagery
Does the wind howl and bluster and wreak its rage
With menacing uproar. Therefore past all doubt
Winds must be formed of unseen particles
That sweep the seas, the lands, the clouds of heaven,
Ravaging and dishevelling them all
With fitful hurricane gusts. Onward they stream
Multiplying destruction, just as when
The soft nature of water suddenly
Swoops forward in one overwhelming flood
Swelled with abundant rains by a mighty spate
Of water rushing down from the high hills,
Hurtling together broken forest boughs
And entire trees: nor can the sturdy bridges
Sustain the oncoming water’s sudden force:
In such wise turbulent with much rain the river
Flings its whole mighty strength against the piles.
With a loud crashing roar it then deals havoc,
And rolls the huge stones on beneath its waves,
Sweeping before it all that stems its flood.
In this way then wind-blasts must likewise move;
And when like a strong stream they have hurled themselves
Towards any quarter, they thrust things along
And with repeated onslaughts overwhelm them,
Often in writhing eddy seizing them
To bear them away in swiftly circling swirl.
Therefore beyond all doubt winds are composed
Of unseen atoms, since in their works and ways
We find that they resemble mighty rivers
Which are of visible substance. Then again
We can perceive the various scents of things,
Yet never see them coming to our nostrils:
Heat too we see not, nor can we observe
Cold with our eyes, nor ever behold words:
Yet must all these be of a bodily nature,
Since they are able to act upon our senses.
For naught can touch or be touched except body.
Clothes also, hung up on a shore where waves
Are breaking, become moist, and then grow dry
If spread out in the sun. Yet in what way
The water’s moisture has soaked into them,
Has not been seen, nor again in what way
The heat has driven it out. The moisture therefore
Is dispersed into tiny particles,
Which our eyes have no power to see at all.
Furthermore after many revolutions
Of the sun’s year, a finger-ring is thinned
On the under side by being worn: the fall
Of dripping eave-drops hollows out a stone:
The bent ploughshare of iron insensibly
Grows smaller in the fields; and we behold
The paving stones of roads worn down at length
By the footsteps of the people. Then again
The brazen statues at the city gates
Show right hands wearing thinner by the touch
Of those who greet them ever as they pass by.
Thus we perceive that all such things grow less
Because they have been worn down: and yet what atoms
Are leaving them each moment, that the jealous
Nature of vision has quite shut us out
From seeing. Finally whatever time
And nature gradually add to things,
Obliging them to grow in due proportion,
No effort of our eyesight can behold.
So too whenever things grow old by age
Or through corruption, and wherever rocks
That overhang the sea are gnawed away
By the corroding brine, you cannot discern
What they are losing at any single moment.
Thus nature operates by unseen atoms.
BOOK II, lines 991-1174
Moreover we are sprung, all we that live,
From heavenly seed: there is, for all, that same
One father[C]; from whom when the bounteous Earth,
Our mother, has drunk in the liquid drops
Of moisture, then by him impregnated
She bears bright crops and glad trees and the race
Of men, bears every species of wild beast,
Furnishing food with which all feed their bodies,
And lead a pleasant life, and propagate
Their offspring. Wherefore justly she has won
The name of mother. Also that which once
Came from the earth, sinks back into the earth,
And what was sent down from the coasts of aether,
Returning thither, is received once more
Into the mansions of the sky. So death
Does not demolish things in such a way
As to destroy the particles of matter,
But only dissipates their union,
Then recombines one element with another,
And so brings it to pass that all things change
Their shapes, alter their colours, and receive
Sensations, then in a moment yield them up.
Thus you may learn how greatly it signifies
Both with what others and in what positions
The same primordial atoms are held bound;
Also what motions they are mutually
Imparting and receiving: and thus too
You need no more suppose that what we see
Hovering upon the surfaces of things,
Or now being born, then suddenly perishing,[D]
Can be inherent qualities in atoms
That are eternal. Nay, in my verses even
It is of moment with what other letters
And in what order each one has been placed.
If not all, yet by far the greater part
Are similar letters: but as their position
Varies, so do the words sound different.
Thus too with actual things, whenever change
Takes place in the collisions motions order
Shape and position of their material atoms,
Then also must the things themselves be changed.

Now to true reasoning turn your mind, I pray;
For a new theme is struggling urgently
To reach your ears, a new aspect of things
Would now reveal itself. But there is naught
So easy, that at first it will not seem
Difficult of belief, and likewise naught
So mighty, naught so wondrous, but that all
Little by little abate their wonder at it.
Consider first the colour of the heavens,
So bright and pure, and all that they contain,
The stars wandering everywhere, the moon
And the surpassing radiance of the sun;
If all these sights were now for the first time
To be revealed to mortals suddenly
And without warning, what could have been described
That would have seemed more marvellous than such things,
Or that humanity could less have dared
Beforehand to believe might come to pass?
Nothing, I think: so wonderful had been
This spectacle. Yet think how no one now,
Wearied to satiety at the sight,
Deigns to look up at the sky’s shining quarters.
Cease therefore to cast reason from your mind
Terrified by mere novelty, but rather
Weigh facts with eager judgment; and if then
They appear true, surrender; if they seem
A falsehood, gird yourself to prove them so.
For since the sum of space outside, beyond
This world’s walls, must be infinite, the mind seeks
To reason as to what may else exist
Yonder in regions whither the intellect
Is constantly desiring to prospect,
And whither the projection of our thought
Reaches in free flight of its own accord.

Now first of all we find that everywhere
In all directions, horizontally,
Below and above throughout the universe
There is no limit, as I have demonstrated.
Indeed the facts themselves proclaim the truth,
And the deep void reveals its nature clearly.
Since then on all sides vacant space extends
Illimitably, and seeds in countless number
And sum immeasurable flit to and fro
Eternally driven on in manifold modes
Of motion, we must deem it in no wise
Probable that this single globe of earth
And this one heaven alone have been created,
While outside all those particles of matter
Are doing nothing: the more so that this world
Was formed by nature, as the seeds of things,
Casually colliding of their own
Spontaneous motion, flocked in manifold ways
Together, vainly, without aim or result,
Until at last such particles combined
As, suddenly thrown together, might become
From time to time the rudiments of great things,
Earth, sea, sky, and the race of living creatures.
Therefore beyond all question we are bound
To admit that elsewhere other aggregates
Of matter must exist, resembling this
Which in its greedy embrace our aether holds.
Moreover, when much matter is at hand,
And space is there, nor any obstacle
Nor cause of hindrance, then you may be sure
Things must be forming and dissolving there.
Now if there be so vast a store of seeds
That the whole lifetime of all conscious beings
Would fail to count them, and if likewise nature
Abides the same, and so can throw together
The seeds of things each into its own place,
In the same manner as they were thrown together
Into our world, then you must needs admit
That in other regions there are other earths,
And diverse stocks of men and kinds of beasts.

Besides in the whole universe there exists
No one thing that is born unique, and grows
Unique and sole; but it must needs belong
To one class, and there must be many others
Of the same kind. Consider first of all
Live creatures: you will find that thus are born
The mountain-ranging breeds of savage beasts,
Thus the human race, thus also the dumb shoals
Of scaly fish and every flying fowl.
Therefore by a like reasoning you must grant
That sky and earth and sun, moon, sea and all
That else exists, are not unique, but rather
Of number innumerable; since life’s deep-fixed
Boundary stone as surely awaits these,
And they are of a body that has birth
As much as any species here on earth
Abounding in examples of its kind.

If you learn well and keep these truths in mind,
Nature, forthwith enfranchised and released
From her proud lords, is seen then to be acting
In all things of herself spontaneously
Without the interference of the gods.
For by the holy breasts of those divinities,
Who in calm peace are passing tranquil days
Of life untroubled, who, I ask, has power
To rule the sum of space immeasurable?
Or who to hold in his controlling hand
The strong reins of the deep? Who can at once
Make all those various firmaments revolve
And with the fires of aether warm each one
Of all those fruitful earths, or at all times
Be present in all places, so to cause
Darkness by clouds, and shake the calms of heaven
With thunder, to hurl lightnings, and ofttimes
Shatter down his own temples, or withdraw
To desert regions, there to spend his fury
And exercise his bolt, which often indeed
Passes the guilty by, and strikes with death
The unoffending who deserve it least.

Now since the birth-time of the world, since sea
And earth’s first natal day and the sun’s origin,
Many atoms have been added from without,
Many seeds from all round, which, shooting them
Hither and thither, the great universe
Has brought together: and by means of these
Sea and land have been able to increase;
Thus too the mansion of the sky has gained
New spaciousness, and lifted its high roof
Far above earth, and the air has risen with it.
For to each thing its own appropriate atoms
Are all distributed by blows from all
Regions of space, so that they separate
Into their proper elements. Moisture joins
With moisture: earth from earthy substance grows;
Fires generate fire, and ether ether,
Till Nature, the creatress, consummating
Her labour, has brought all things to their last
Limit of growth; as happens, when at length
That which is entering the veins of life
Is now no more than what is flowing away
And ebbing thence. In all things at this point
The age of growth must halt: at this point nature
Curbs increase by her powers. For all such things
As you may see waxing with joyous growth,
And climbing step by step to matured age
Receive into themselves more particles
Than they discharge, so long as food is passing
Easily into all their veins, and while
They are not so widely spread as to throw off
Too many atoms and to cause more waste
Than what their life requires for nourishment.
For we must surely grant that many atoms
Are flowing away from things and leaving them:
But still more must be added, till at length
They have attained the highest pitch of growth.
Then age little by little breaks their powers
And their mature strength, as it wastes away
On the worse side of life. And out of doubt
The bulkier and the wider a thing is,
Once its growth ceases, the more particles
Does it now shed around it and discharge
On all sides: nor is food distributed
Easily into all its veins, nor yet
In quantity sufficient that therefrom
A supply may continually rise up
To compensate the copious emanations
Which it exhales. For there is need of food
To preserve all things by renewing them:
Food must uphold, food sustain everything:
Yet all is to no purpose, since the veins
Fail to convey what should suffice, nor yet
Does nature furnish all that is required.
There is good reason therefore why all forms
Should perish, when they are rarefied by flux
Of atoms, and succumb to external blows,
Since food must fail advanced age in the end,
And atoms cease not ever from outside
To buffet each thing till they wear it out
And overpower it by beleaguering blows.
In this way then it is that the walls too
Of the great world from all sides shall be stormed
And so collapsing crumble away to ruins.
And even now already this world’s age
Is broken, and the worn-out earth can scarce
Create the tiniest animals, she who once
Created every kind, and brought to birth
The huge shapes of wild beasts. For, as I think,
Neither did any golden rope let down
The tribes of mortal creatures from the heights
Of heaven on to the fields, nor did the sea
Nor its waves beating on the rocks create them,
But the same earth gave birth to them, which now
Feeds them from her own breast. At first moreover
Herself spontaneously did she create
Flourishing crops and rich vines for mankind,
Herself gave them sweet fruits and joyous pastures;
Which now, though aided by our toil, scarce grow
To any size. Thus we wear out our oxen
And the strength of our peasants: we use up
Our iron tools; yet hardly do we win
A sustenance from the fields, so niggardly
They grudge their produce and increase our toil.
And now shaking his head the aged ploughman
Sighs ever and anon, when he beholds
The labours of his hands all spent in vain;
And when with times past he compares the present,
He praises often the fortune of his sire,
Harping upon that ancient race of men
Who rich in piety supported life
Upon their narrow plots contentedly,
Seeing the land allotted to each man
Was far less in those days than now. So too
The planter of the worn-out shrivelled vine
Disconsolately inveighs against the march
Of time, wearying heaven with complaints,
And understands not how all things are wasting
Little by little, and passing to the grave
Tired out by lengthening age and lapse of days.