
Commit ee7a669

Anipik authored and TomFinley committed
Enables FastTreeBinaryClassificationCategoricalSplitTest and BinaryClassifierTesterThresholdingTest (dotnet#255)
* Tests Enabled & Dataset Moved to correct place in test\BaselineOutput
* Correcting path for adult data set for autoInference class, and removing @ from path
1 parent 20c36ca commit ee7a669

File tree

87 files changed (+273909, -29 lines)

@@ -0,0 +1,56 @@
+maml.exe TrainTest test=%Data% tr=FastTreeBinaryClassification{nl=5 mil=5 lr=0.25 iter=20 mb=255} dout=%Output% loader=Text{sep=, header+ col=Label:14 col=Cat:TX:1,3,5-9,13} data=%Data% out=%Output% seed=1 xf=Cat{col=Cat} xf=Concat{col=Features:Cat}
+Not adding a normalizer.
+Making per-feature arrays
+Changing data from row-wise to column-wise
+Processed 32561 instances
+Binning and forming Feature objects
+Reserved memory for tree learner: 4980 bytes
+Starting to train ...
+Not training a calibrator because it is not needed.
+TEST POSITIVE RATIO: 0.2362 (3846.0/(3846.0+12435.0))
+Confusion table
+          ||======================
+PREDICTED || positive | negative | Recall
+TRUTH     ||======================
+ positive ||    1,982 |    1,864 | 0.5153
+ negative ||      895 |   11,540 | 0.9280
+          ||======================
+Precision ||   0.6889 |   0.8609 |
+OVERALL 0/1 ACCURACY: 0.830539
+LOG LOSS/instance: 0.537244
+Test-set entropy (prior Log-Loss/instance): 0.788708
+LOG-LOSS REDUCTION (RIG): 31.883066
+AUC: 0.871960
+
+OVERALL RESULTS
+---------------------------------------
+AUC: 0.871960 (0.0000)
+Accuracy: 0.830539 (0.0000)
+Positive precision: 0.688912 (0.0000)
+Positive recall: 0.515341 (0.0000)
+Negative precision: 0.860937 (0.0000)
+Negative recall: 0.928026 (0.0000)
+Log-loss: 0.537244 (0.0000)
+Log-loss reduction: 31.883066 (0.0000)
+F1 Score: 0.589618 (0.0000)
+AUPRC: 0.670582 (0.0000)
+
+---------------------------------------
+Physical memory usage(MB): %Number%
+Virtual memory usage(MB): %Number%
+%DateTime% Time elapsed(s): %Number%
+
+--- Progress log ---
+[1] 'Building term dictionary' started.
+[1] (%Time%) 32561 examples Total Terms: 100
+[1] 'Building term dictionary' finished in %Time%.
+[2] 'FastTree data preparation' started.
+[2] 'FastTree data preparation' finished in %Time%.
+[3] 'FastTree in-memory bins initialization' started.
+[3] 'FastTree in-memory bins initialization' finished in %Time%.
+[4] 'FastTree feature conversion' started.
+[4] 'FastTree feature conversion' finished in %Time%.
+[5] 'FastTree training' started.
+[5] 'FastTree training' finished in %Time%.
+[6] 'Saving model' started.
+[6] 'Saving model' finished in %Time%.
@@ -0,0 +1,4 @@
+FastTreeBinaryClassification
+AUC Accuracy Positive precision Positive recall Negative precision Negative recall Log-loss Log-loss reduction F1 Score AUPRC /lr /nl /mil /iter Learner Name Train Dataset Test Dataset Results File Run Time Physical Memory Virtual Memory Command Line Settings
+0.87196 0.830539 0.688912 0.515341 0.860937 0.928026 0.537244 31.88307 0.589618 0.670582 0.25 5 5 20 FastTreeBinaryClassification %Data% %Data% %Output% 99 0 0 maml.exe TrainTest test=%Data% tr=FastTreeBinaryClassification{nl=5 mil=5 lr=0.25 iter=20 mb=255} dout=%Output% loader=Text{sep=, header+ col=Label:14 col=Cat:TX:1,3,5-9,13} data=%Data% out=%Output% seed=1 xf=Cat{col=Cat} xf=Concat{col=Features:Cat} /lr:0.25;/nl:5;/mil:5;/iter:20
+
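The summary row above repeats the full maml.exe command: the Text loader reads the comma-separated data with a header, takes the label from column 14 and the text columns 1,3,5-9,13 as Cat, then xf=Cat one-hot encodes them and xf=Concat assembles the Features vector for FastTree. For orientation only, here is a rough scikit-learn analogue of that pipeline; the file name is assumed, and GradientBoostingClassifier merely stands in for FastTree, so it will not reproduce the baseline numbers.

```python
# Rough analogue of the maml.exe pipeline above, for illustration only.
# "adult.csv" is an assumed file name; the real test uses %Data%.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

df = pd.read_csv("adult.csv", sep=",", header=0)        # loader=Text{sep=, header+}
cat_cols = [1, 3, 5, 6, 7, 8, 9, 13]                     # col=Cat:TX:1,3,5-9,13
X, y = df.iloc[:, cat_cols], df.iloc[:, 14]              # col=Label:14

model = Pipeline([
    ("onehot", OneHotEncoder(handle_unknown="ignore")),  # xf=Cat + xf=Concat
    ("gbm", GradientBoostingClassifier(
        n_estimators=20,        # iter=20
        learning_rate=0.25,     # lr=0.25
        max_leaf_nodes=5,       # nl=5
        min_samples_leaf=5)),   # mil=5
])
model.fit(X, y)
```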
@@ -0,0 +1,31 @@
+
+Per-feature gain summary for the boosted tree ensemble:
+marital-status.Married-civ-spouse 1
+occupation.Exec-managerial 0.399117896915284
+occupation.Prof-specialty 0.391227805213273
+education.Bachelors 0.334466450228397
+education.Masters 0.287494503306969
+education.Prof-school 0.204600035294214
+education.Doctorate 0.187985462587772
+occupation.Sales 0.150856624152445
+occupation.Other-service 0.147997280910012
+relationship.Own-child 0.137529286784067
+marital-status.Never-married 0.131366568817433
+education.7th-8th 0.122497278808477
+workclass.Self-emp-inc 0.119940186278871
+education.HS-grad 0.113890497534289
+occupation.Tech-support 0.104801559978143
+workclass.Self-emp-not-inc 0.0911869726068162
+native-country.Mexico 0.081461599208144
+workclass.Federal-gov 0.0785782301236086
+sex.Male 0.0783006369313957
+relationship.Wife 0.0749144815167656
+education.11th 0.0746051436850124
+occupation.Handlers-cleaners 0.0601209002480329
+native-country.United-States 0.0593295291546631
+occupation.Farming-fishing 0.0583149634881263
+education.10th 0.0544727793656118
+education.9th 0.052272828491541
+workclass.Local-gov 0.0519635659099365
+occupation.? 0.0479530108171804
+occupation.Protective-serv 0.0451534478142547
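Note on reading this summary: the gains appear to be scaled so that the feature with the largest total split gain (marital-status.Married-civ-spouse) reports as 1 and every other feature as a fraction of it. A minimal sketch of that kind of max-normalization, with invented raw gain values, is:

```python
# Max-normalize raw per-feature split gains so the top feature reports as 1.
# The raw values below are invented for illustration; only the scaling matters.
raw_gain = {
    "marital-status.Married-civ-spouse": 412.7,
    "occupation.Exec-managerial": 164.7,
    "occupation.Prof-specialty": 161.4,
}
top = max(raw_gain.values())
for name, gain in sorted(raw_gain.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}\t{gain / top:.15g}")
```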
