Major bug reported by vicaya and fixed by crystal_gaoyu (security)<br>
<b>Fix UGI for IBM JDK running on Windows</b><br>
<blockquote>The login module and user principal classes are different for 32-bit and 64-bit Windows in IBM J9 JDK 6 SR10. Hadoop 1.0.3 runs on neither, because it uses the 32-bit login module together with the 64-bit user principal class.</blockquote></li>
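The fix amounts to selecting the login module and principal class from the JVM's reported architecture instead of hard-coding the 32-bit names. A minimal sketch of that selection logic, assuming placeholder class names (the real IBM J9 class names differ and are not quoted here):

```java
public class PrincipalSelector {
    // Placeholder class names, NOT the actual IBM JDK classes.
    static final String PRINCIPAL_32 = "example.auth.Principal32";
    static final String PRINCIPAL_64 = "example.auth.Principal64";

    /** Pick a principal class name from the os.arch system property. */
    static String principalClassFor(String osArch) {
        boolean is64Bit = osArch.contains("64");
        return is64Bit ? PRINCIPAL_64 : PRINCIPAL_32;
    }

    public static void main(String[] args) {
        // On a 64-bit JVM this prints the 64-bit placeholder name.
        System.out.println(principalClassFor(System.getProperty("os.arch")));
    }
}
```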
Major improvement reported by vicaya and fixed by crystal_gaoyu (security)<br>
<b>Introduce HADOOP_PROXY_USER for secure impersonation in child hadoop client processes</b><br>
<blockquote>To let an authenticated user run hadoop shell commands from a web console, we can introduce a HADOOP_PROXY_USER environment variable to allow proper impersonation in the child hadoop client processes.</blockquote></li>
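The intended resolution order can be sketched as follows: if HADOOP_PROXY_USER is set in the child process's environment, the client impersonates that user; otherwise it acts as the login user. This is an illustrative distillation only; the real resolution lives inside Hadoop's UserGroupInformation:

```java
public class ProxyUserResolver {
    /**
     * Resolve the effective user for a child hadoop client process.
     * A set, non-empty HADOOP_PROXY_USER wins; otherwise fall back
     * to the user the process is actually running as.
     */
    static String effectiveUser(String proxyUserEnv, String loginUser) {
        return (proxyUserEnv != null && !proxyUserEnv.isEmpty())
                ? proxyUserEnv
                : loginUser;
    }

    public static void main(String[] args) {
        String proxy = System.getenv("HADOOP_PROXY_USER");
        System.out.println(effectiveUser(proxy, System.getProperty("user.name")));
    }
}
```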
Minor test reported by [email protected] and fixed by vicaya (test)<br>
<b>"ant test" build fails when trying to delete a file</b><br>
<blockquote>Run "ant test" on branch-1 of hadoop-common.<br>When the test process reaches "test-core-excluding-commit-and-smoke",<br><br>it invokes the "macro-test-runner" to clear and rebuild the test environment.<br>The ant task command &lt;delete dir="@{test.dir}/logs" /&gt;<br>then fails when trying to delete a non-existent file.<br><br>The following is the test result log:<br>test-core-excluding-commit-and-smoke:<br> [delete] Deleting: /home/jdu/bdc/hadoop-topology-branch1-new/hadoop-common/build/test/testsfailed<br> [delete] Dele...</blockquote></li>
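One common way to keep a cleanup step like this from aborting the build is to tell Ant's delete task to tolerate errors. This is a hypothetical sketch of such a change, not a quote of the committed fix:

```xml
<!-- Hypothetical fix: do not fail the build if the logs dir is absent. -->
<delete dir="@{test.dir}/logs" failonerror="false" quiet="true"/>
```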
<blockquote>Some very small percentage of tasks fail with a "Text file busy" error.<br><br>The following was the original diagnosis:<br><i>Our use of PrintWriter in TaskController.writeCommand is unsafe, since that class swallows all IO exceptions. We're not currently checking for errors, which I'm seeing result in occasional task failures with the message "Text file busy" - assumedly because the close() call is failing silently for some reason.</i><br>.. but turned out to be another issue as well (see below)</blockquote></li>
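The "swallows all IO exceptions" behavior is easy to reproduce with the stock JDK: PrintWriter catches IOExceptions internally and only records them, so a failed close() is invisible unless the caller polls checkError(). A minimal, self-contained demonstration (the FailingWriter stands in for a busy script file; TaskController itself is not involved):

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.io.Writer;

public class PrintWriterDemo {
    /** A Writer whose close() always fails, standing in for a busy file. */
    static class FailingWriter extends Writer {
        public void write(char[] buf, int off, int len) {}
        public void flush() {}
        public void close() throws IOException {
            throw new IOException("Text file busy");
        }
    }

    /** PrintWriter swallows the IOException; checkError() is the only signal. */
    static boolean closeAndCheck() {
        PrintWriter pw = new PrintWriter(new FailingWriter());
        pw.println("#!/bin/sh");
        pw.close();              // no exception escapes here
        return pw.checkError();  // true: the close failed silently
    }

    public static void main(String[] args) {
        System.out.println("error detected: " + closeAndCheck());
    }
}
```

Checking checkError() after writing (or using a plain Writer so exceptions propagate) is the usual remedy.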
Major bug reported by vicaya and fixed by crystal_gaoyu (task)<br>
<b>SortedRanges.Range#compareTo is not spec compliant</b><br>
<blockquote>SortedRanges.Range#compareTo does not satisfy the requirement of Comparable#compareTo, where "the implementor must ensure sgn(x.compareTo(y)) == -sgn(y.compareTo(x)) for all x and y."<br><br>This is manifested as TestStreamingBadRecords failures in alternative JDKs.</blockquote></li>
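The contract violation is easiest to see on equal inputs: a comparator that never returns 0 makes both x.compareTo(y) and y.compareTo(x) positive, so the signs cannot be opposites. A distilled illustration of the bug class (comparing bare longs, not Hadoop's actual Range fields):

```java
public class CompareToDemo {
    /** Broken: equal inputs compare as "greater", so both directions return +1. */
    static int brokenCompare(long a, long b) {
        return a < b ? -1 : 1;
    }

    /** Compliant: equal inputs compare as 0 and the sign flips on swap. */
    static int fixedCompare(long a, long b) {
        return Long.compare(a, b);
    }

    public static void main(String[] args) {
        // Broken version: sgn(x cmp y) == sgn(y cmp x) == 1 when x == y.
        System.out.println(brokenCompare(5, 5) + " vs " + -brokenCompare(5, 5));
        // Fixed version: both sides are 0.
        System.out.println(fixedCompare(5, 5) + " vs " + -fixedCompare(5, 5));
    }
}
```

Sorted collections such as TreeSet rely on this antisymmetry, which is why the failure only surfaces under JDKs whose sort internals exercise both comparison directions.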
Major improvement reported by vicaya and fixed by crystal_gaoyu (task-controller)<br>
<b>Introduce HADOOP_SECURITY_CONF_DIR for task-controller</b><br>
<blockquote>The linux task controller currently hard codes the directory in which to look for its config file at compile time (via the HADOOP_CONF_DIR macro). Adding a new environment variable to look for task-controller's conf dir (with strict permission checks) would make installation much more flexible.</blockquote></li>
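The "strict permission checks" matter because an attacker-writable conf dir would let anyone redirect the setuid task-controller. A rough sketch of such a check, in Java for illustration only (the real task-controller is C code and also verifies ownership, not just permission bits):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.PosixFilePermission;
import java.util.Set;

public class ConfDirCheck {
    /**
     * Accept a conf dir only if it exists and is not writable by "other".
     * Illustrative stand-in for the stricter native checks.
     */
    static boolean acceptable(Path dir) throws IOException {
        if (!Files.isDirectory(dir)) {
            return false;
        }
        Set<PosixFilePermission> perms = Files.getPosixFilePermissions(dir);
        return !perms.contains(PosixFilePermission.OTHERS_WRITE);
    }

    public static void main(String[] args) throws IOException {
        // Prefer the env var; "/etc/hadoop" is an assumed compile-time default.
        String env = System.getenv("HADOOP_SECURITY_CONF_DIR");
        Path dir = Paths.get(env != null ? env : "/etc/hadoop");
        System.out.println(dir + " acceptable: " + acceptable(dir));
    }
}
```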
Blocker bug reported by revans2 and fixed by vinodkv (mrv1)<br>
<b>NLineInputFormat drops data in 1.1 and beyond</b><br>
<blockquote>When trying to root cause why MAPREDUCE-4782 did not cause us issues on 1.0.2, I found out that HADOOP-7823 introduced essentially the exact same error into org.apache.hadoop.mapred.lib.NLineInputFormat.<br><br>In 1.X org.apache.hadoop.mapred.lib.NLineInputFormat and org.apache.hadoop.mapreduce.lib.input.NLineInputFormat are separate implementations. The latter had an off by one error in it until MAPREDUCE-4782 fixed it. The former had no error in it until HADOOP-7823 introduced it in 1.1 and MAPR...</blockquote></li>
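The bug class here is an off-by-one when carving the input into N-line splits, which silently drops records. The invariant the fix restores can be sketched generically (this is not Hadoop's actual split code, which works on byte offsets in a file):

```java
import java.util.ArrayList;
import java.util.List;

public class NLineSplitDemo {
    /** Partition lines into splits of at most n lines; no line may be dropped. */
    static List<List<String>> split(List<String> lines, int n) {
        List<List<String>> splits = new ArrayList<>();
        for (int i = 0; i < lines.size(); i += n) {
            // The final, possibly short, split must be emitted too;
            // losing it is exactly the data loss described above.
            int end = Math.min(i + n, lines.size());
            splits.add(new ArrayList<>(lines.subList(i, end)));
        }
        return splits;
    }

    public static void main(String[] args) {
        List<String> lines = List.of("a", "b", "c", "d", "e");
        List<List<String>> splits = split(lines, 2);
        int total = splits.stream().mapToInt(List::size).sum();
        System.out.println(splits.size() + " splits, " + total + " lines");
    }
}
```

Summing the split sizes against the input size is the cheap sanity check that catches this whole family of errors.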