
Commit b29f07f: [log] tensorlayer --> [TL]

1 parent: 56e682f

File tree: 1 file changed, +12 -12 lines

docs/user/tutorial.rst (12 additions, 12 deletions)
@@ -70,15 +70,15 @@ TensorFlow's methods like ``sess.run()``, see ``tutorial_mnist.py`` for more det
 network = tl.layers.DenseLayer(network, n_units=800,
                                 act = tf.nn.relu, name='relu2')
 network = tl.layers.DropoutLayer(network, keep=0.5, name='drop3')
-# the softmax is implemented internally in tl.cost.cross_entropy(y, y_) to
+# the softmax is implemented internally in tl.cost.cross_entropy(y, y_, 'cost') to
 # speed up computation, so we use identity here.
 # see tf.nn.sparse_softmax_cross_entropy_with_logits()
 network = tl.layers.DenseLayer(network, n_units=10,
                                 act = tf.identity,
                                 name='output_layer')
 # define cost function and metric.
 y = network.outputs
-cost = tl.cost.cross_entropy(y, y_)
+cost = tl.cost.cross_entropy(y, y_, 'cost')
 correct_prediction = tf.equal(tf.argmax(y, 1), y_)
 acc = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
 y_op = tf.argmax(tf.nn.softmax(y), 1)
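
For context, a minimal self-contained sketch of how the changed snippet fits together. The placeholders x / y_, the InputLayer, and the first DropoutLayer are assumptions inferred from the instantiation log below ((?, 784) input, drop1 with keep 0.8); they are not part of this hunk.

# Minimal sketch, assuming TensorFlow 1.x and TensorLayer 1.x as used in this tutorial.
import tensorflow as tf
import tensorlayer as tl

# assumed placeholders for flattened 28x28 MNIST images and integer labels
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
y_ = tf.placeholder(tf.int64, shape=[None], name='y_')

network = tl.layers.InputLayer(x, name='input_layer')
network = tl.layers.DropoutLayer(network, keep=0.8, name='drop1')
network = tl.layers.DenseLayer(network, n_units=800, act=tf.nn.relu, name='relu1')
network = tl.layers.DropoutLayer(network, keep=0.5, name='drop2')
network = tl.layers.DenseLayer(network, n_units=800, act=tf.nn.relu, name='relu2')
network = tl.layers.DropoutLayer(network, keep=0.5, name='drop3')
# softmax is applied inside tl.cost.cross_entropy, so the output layer stays linear
network = tl.layers.DenseLayer(network, n_units=10, act=tf.identity, name='output_layer')

# cost and metrics, using the named cross_entropy call from this hunk
y = network.outputs
cost = tl.cost.cross_entropy(y, y_, 'cost')
correct_prediction = tf.equal(tf.argmax(y, 1), y_)
acc = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
y_op = tf.argmax(tf.nn.softmax(y), 1)
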
@@ -149,13 +149,13 @@ If everything is set up correctly, you will get an output like the following:
 y_test.shape (10000,)
 X float32 y int64

-tensorlayer:Instantiate InputLayer input_layer (?, 784)
-tensorlayer:Instantiate DropoutLayer drop1: keep: 0.800000
-tensorlayer:Instantiate DenseLayer relu1: 800, relu
-tensorlayer:Instantiate DropoutLayer drop2: keep: 0.500000
-tensorlayer:Instantiate DenseLayer relu2: 800, relu
-tensorlayer:Instantiate DropoutLayer drop3: keep: 0.500000
-tensorlayer:Instantiate DenseLayer output_layer: 10, identity
+[TL] InputLayer input_layer (?, 784)
+[TL] DropoutLayer drop1: keep: 0.800000
+[TL] DenseLayer relu1: 800, relu
+[TL] DropoutLayer drop2: keep: 0.500000
+[TL] DenseLayer relu2: 800, relu
+[TL] DropoutLayer drop3: keep: 0.500000
+[TL] DenseLayer output_layer: 10, identity

 param 0: (784, 800) (mean: -0.000053, median: -0.000043 std: 0.035558)
 param 1: (800,) (mean: 0.000000, median: 0.000000 std: 0.000000)
@@ -591,9 +591,9 @@ If everything is set up correctly, you will get an output like the following:
 .. code-block:: text

 [2016-07-12 09:31:59,760] Making new env: Pong-v0
-tensorlayer:Instantiate InputLayer input_layer (?, 6400)
-tensorlayer:Instantiate DenseLayer relu1: 200, relu
-tensorlayer:Instantiate DenseLayer output_layer: 3, identity
+[TL] InputLayer input_layer (?, 6400)
+[TL] DenseLayer relu1: 200, relu
+[TL] DenseLayer output_layer: 3, identity
 param 0: (6400, 200) (mean: -0.000009, median: -0.000018 std: 0.017393)
 param 1: (200,) (mean: 0.000000, median: 0.000000 std: 0.000000)
 param 2: (200, 3) (mean: 0.002239, median: 0.003122 std: 0.096611)
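
For reference, a minimal sketch of a policy network that would produce the log above. Only the layer sizes (6400 -> 200 relu -> 3 identity) come from the log; the placeholder name and the 80x80 frame preprocessing are assumptions.

# Minimal sketch, assuming TensorFlow 1.x / TensorLayer 1.x and Pong frames
# preprocessed to 80x80 and flattened to 6400 floats; the placeholder name is hypothetical.
import tensorflow as tf
import tensorlayer as tl

D = 80 * 80  # flattened frame size, matching the (?, 6400) InputLayer in the log
states_batch_pl = tf.placeholder(tf.float32, shape=[None, D])

network = tl.layers.InputLayer(states_batch_pl, name='input_layer')
network = tl.layers.DenseLayer(network, n_units=200, act=tf.nn.relu, name='relu1')
network = tl.layers.DenseLayer(network, n_units=3, act=tf.identity, name='output_layer')

# 3 logits -> probabilities over the 3 discrete Pong actions
probs = tf.nn.softmax(network.outputs)
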
