Commit acfe388

[SYSTEMDS-118] New generic gridSearch builtin function
This patch adds a new generic grid search function for hyper-parameter optimization of arbitrary ML algorithms and parameter combinations. The function takes train and eval functions by name, as well as a list of parameter names and vectors of their candidate values, and returns the parameter combination and model that gave the best results. So far, the hyper-parameter enumeration is working, but the core training/scoring part still needs additional features on list data types (e.g., list-list append, and eval function calls with lists of unnamed and named parameters). Also, before it can be applied in practice, it needs to be integrated with cross validation.
1 parent 4bbba40 commit acfe388
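
To illustrate the call pattern described in the commit message, here is a minimal DML sketch that mirrors the GridSearchLM test script included further below. Xtrain and ytrain stand for an assumed feature matrix and label vector; lm/lmPredict and the parameters reg, tol, and maxi are the builtins and names used in that test:

# candidate hyper-parameters: names and value vectors (log-scale grids)
params = list("reg", "tol", "maxi");
paramRanges = list(10^seq(0,-4), 10^seq(-5,-9), 10^seq(1,3));

# train and eval functions are passed by name; gridSearch returns the
# best model B and the corresponding hyper-parameter values opt
[B, opt] = gridSearch(Xtrain, ytrain, "lm", "lmPredict", params, paramRanges, TRUE);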

File tree

5 files changed, +208 -1 lines changed

docs/Tasks.txt

Lines changed: 1 addition & 1 deletion
@@ -91,7 +91,7 @@ SYSTEMDS-110 New Builtin Functions
 * 115 Builtin function for model debugging (slice finder) OK
 * 116 Builtin function for kmeans OK
 * 117 Builtin function for lm cross validation OK
- * 118 Builtin function for hyperparameter grid search with CVlm
+ * 118 Builtin function for hyperparameter grid search
 * 119 Builtin functions for l2svm and msvm OK

 SYSTEMDS-120 Performance Features

scripts/builtin/gridSearch.dml

Lines changed: 80 additions & 0 deletions
@@ -0,0 +1,80 @@
#-------------------------------------------------------------
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
#
#-------------------------------------------------------------

m_gridSearch = function(Matrix[Double] X, Matrix[Double] y, String train, String predict,
    List[String] params, List[Unknown] paramValues, Boolean verbose = TRUE)
  return (Matrix[Double] B, Frame[Unknown] opt)
{
  # Step 0) preparation of parameters, lengths, and values in convenient form
  numParams = length(params);
  paramLens = matrix(0, numParams, 1);
  for( j in 1:numParams ) {
    vect = as.matrix(paramValues[j,1]);
    paramLens[j,1] = nrow(vect);
  }
  paramVals = matrix(0, numParams, max(paramLens));
  for( j in 1:numParams ) {
    vect = as.matrix(paramValues[j,1]);
    paramVals[j,1:nrow(vect)] = t(vect);
  }
  cumLens = rev(cumprod(rev(paramLens))/rev(paramLens));
  numConfigs = prod(paramLens);

  # Step 1) materialize hyper-parameter combinations
  # (simplifies debugging; negligible cost compared to the compute)
  HP = matrix(0, numConfigs, numParams);
  parfor( i in 1:nrow(HP) ) {
    for( j in 1:numParams )
      HP[i,j] = paramVals[j,as.scalar(((i-1)/cumLens[j,1])%%paramLens[j,1]+1)];
  }

  if( verbose )
    print("GridSearch: Hyper-parameter combinations: \n"+toString(HP));

  # Step 2) training/scoring of parameter combinations
  # TODO integrate cross validation
  Rbeta = matrix(0, nrow(HP), ncol(X));
  Rloss = matrix(0, nrow(HP), 1);
  arguments = list(X=X, y=y);

  parfor( i in 1:nrow(HP) ) {
    # a) prepare training arguments
    largs = arguments;
    for( j in 1:numParams ) {
      key = as.scalar(params[j]);
      value = as.scalar(HP[i,j]);
      largs = append(largs, list(key=value));
    }

    # b) core training/scoring
    lbeta = eval(train, largs);
    lloss = eval(predict, list(X, y, lbeta));

    # c) write models and loss back to output
    Rbeta[i,] = lbeta;
    Rloss[i,] = lloss;
  }

  # Step 3) select best parameter combination
  ix = as.scalar(rowIndexMin(t(Rloss)));
  B = Rbeta[ix,]; # optimal model
  opt = as.frame(HP[ix,]); # optimal hyper-parameters
}
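
Side note on Step 1 above: cumLens holds the stride of each parameter in the linear enumeration, so row i of HP is, conceptually, a mixed-radix decoding of i-1 with integer division. A small worked example, assuming three parameters with 5, 5, and 3 candidate values (hypothetical lengths):

# assumed: paramLens = (5, 5, 3)
#   cumLens    = rev(cumprod(rev(paramLens)) / rev(paramLens)) = (15, 3, 1)
#   numConfigs = prod(paramLens) = 75
# intended decoding of row i = 23 (i.e., i-1 = 22):
#   j=1: floor(22/15) %% 5 + 1 = 1 %% 5 + 1  = 2  -> 2nd value of parameter 1
#   j=2: floor(22/3)  %% 5 + 1 = 7 %% 5 + 1  = 3  -> 3rd value of parameter 2
#   j=3: floor(22/1)  %% 3 + 1 = 22 %% 3 + 1 = 2  -> 2nd value of parameter 3
# i.e., HP[23,] = (paramVals[1,2], paramVals[2,3], paramVals[3,2])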

src/main/java/org/apache/sysds/common/Builtins.java

Lines changed: 1 addition & 0 deletions
@@ -90,6 +90,7 @@ public enum Builtins {
 	EVAL("eval", false),
 	FLOOR("floor", false),
 	GNMF("gnmf", true),
+	GRID_SEARCH("gridSearch", true),
 	IFELSE("ifelse", false),
 	IMG_MIRROR("img_mirror", true),
 	IMG_BRIGHTNESS("img_brightness", true),

src/test/java/org/apache/sysds/test/functions/builtin/BuiltinGridSearchTest.java

Lines changed: 82 additions & 0 deletions
@@ -0,0 +1,82 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied.  See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

package org.apache.sysds.test.functions.builtin;

import org.junit.Assert;
import org.junit.Test;

import org.apache.sysds.common.Types.ExecMode;
import org.apache.sysds.lops.LopProperties.ExecType;
import org.apache.sysds.test.AutomatedTestBase;
import org.apache.sysds.test.TestConfiguration;
import org.apache.sysds.test.TestUtils;

public class BuiltinGridSearchTest extends AutomatedTestBase
{
	private final static String TEST_NAME = "GridSearchLM";
	private final static String TEST_DIR = "functions/builtin/";
	private final static String TEST_CLASS_DIR = TEST_DIR + BuiltinGridSearchTest.class.getSimpleName() + "/";

	private final static int rows = 300;
	private final static int cols = 20;

	@Override
	public void setUp() {
		addTestConfiguration(TEST_NAME, new TestConfiguration(TEST_CLASS_DIR, TEST_NAME, new String[]{"R"}));
	}

	@Test
	public void testGridSearchCP() {
		//TODO additional list features needed
		//runGridSearch(ExecType.CP);
	}

	@Test
	public void testGridSearchSpark() {
		//TODO additional list features needed
		//runGridSearch(ExecType.SPARK);
	}

	@SuppressWarnings("unused")
	private void runGridSearch(ExecType et)
	{
		ExecMode modeOld = setExecMode(et);
		try {
			loadTestConfiguration(getTestConfiguration(TEST_NAME));
			String HOME = SCRIPT_DIR + TEST_DIR;

			fullDMLScriptName = HOME + TEST_NAME + ".dml";
			programArgs = new String[] {"-args", input("X"), input("y"), output("R")};
			double[][] X = getRandomMatrix(rows, cols, 0, 1, 0.8, -1);
			double[][] y = getRandomMatrix(rows, 1, 0, 1, 0.8, -1);
			writeInputMatrixWithMTD("X", X, true);
			writeInputMatrixWithMTD("y", y, true);

			runTest(true, EXCEPTION_NOT_EXPECTED, null, -1);

			//expected loss smaller than default invocation
			Assert.assertTrue(TestUtils.readDMLBoolean(output("R")));
		}
		finally {
			resetExecMode(modeOld);
		}
	}
}

src/test/scripts/functions/builtin/GridSearchLM.dml

Lines changed: 44 additions & 0 deletions
@@ -0,0 +1,44 @@
#-------------------------------------------------------------
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
#
#-------------------------------------------------------------

l2norm = function(Matrix[Double] X, Matrix[Double] y, Matrix[Double] B) return (Double loss) {
  loss = sum((y - X%*%B)^2);
}

X = read($1);
y = read($2);

N = 200;
Xtrain = X[1:N,];
ytrain = y[1:N,];
Xtest = X[(N+1):nrow(X),];
ytest = y[(N+1):nrow(X),];

params = list("reg", "tol", "maxi");
paramRanges = list(10^seq(0,-4), 10^seq(-5,-9), 10^seq(1,3));
[B1, opt] = gridSearch(Xtrain, ytrain, "lm", "lmPredict", params, paramRanges, TRUE);
B2 = lm(X=Xtrain, y=ytrain, verbose=FALSE);

l1 = l2norm(Xtest, ytest, B1);
l2 = l2norm(Xtest, ytest, B2);
R = l1 <= l2;

write(R, $3)
