Deep Learning with TensorFlow

DOCUMENT INFORMATION

Pages: 50
Size: 13.13 MB


CONTENTS


Page 1

An Introduction To Artificial Neural Networks

By Brian Pugh, CMU Crash Course, 1/28/2017

Pages 4-8

[Slide figures: "DEEP LEARNING". A network classifies CAT and DOG images, outputting a yes/no certainty for each; across the slides the outputs range from confident (YES 99% / NO 1%) to uncertain (YES 55% / NO 45%, YES 75% / NO 25%)]

Page 10

Source: http://webspace.ship.edu/cgboer/neuron.gif

Page 13

[Slide figure: a single neuron. Inputs x1, x2, x3 are each multiplied by a randomly initialized scalar weight w1, w2, w3, summed, and passed through a nonlinear activation function, yielding a single scalar value; an output decision function may then be applied]

relu(x1*w1 + x2*w2 + x3*w3)
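The single-neuron computation above can be sketched in a few lines of NumPy. This is an illustration only; the slide gives no concrete input values, so the inputs below are made up.

```python
import numpy as np

def relu(z):
    # Nonlinear activation: pass positives through, clamp negatives to 0.
    return np.maximum(0.0, z)

def neuron(x, w):
    # Multiply each input by its scalar weight, sum, then apply relu.
    return relu(np.dot(x, w))

x = np.array([0.5, -1.0, 2.0])   # inputs x1, x2, x3 (made-up values)
w = np.random.randn(3)           # weights w1, w2, w3, initialized randomly
out = neuron(x, w)               # a single scalar value
```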

Page 14

Combining Neurons Into Layers
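A layer is many such neurons sharing the same inputs, which collapses to a matrix-vector product. A minimal NumPy sketch, with illustrative sizes (784 inputs, 128 neurons) not taken from the slide:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def layer(x, W):
    # Each row of W holds one neuron's weights; W @ x computes
    # every neuron's weighted sum at once.
    return relu(W @ x)

rng = np.random.default_rng(0)
x = rng.standard_normal(784)         # e.g. a flattened 28x28 image
W = rng.standard_normal((128, 784))  # 128 neurons, 784 weights each
h = layer(x, W)                      # 128 scalar outputs
```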

Pages 17-21

[Slide figures: "DEEP LEARNING" with a "% Trained" gauge. As training progresses, the network's certainties on the CAT images sharpen, from near-chance (YES 52% / NO 48%, YES 53% / NO 47%) toward confident (YES 94% / NO 6%, YES 92% / NO 8%)]

Page 21

DO THIS TENS/HUNDREDS OF THOUSANDS OF TIMES

Page 25

import tensorflow as tf

• TensorFlow functions can now be called using the tf prefix. Example: tf.Session()

Page 26

from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

• This downloads and loads in the MNIST digits dataset.

• Note: 784 = 28 x 28

• Image pixel values ∈ [0, 1]

mnist
  train:      images 55000 x 784, labels 55000 x 10
  validation: images  5000 x 784, labels  5000 x 10
  test:       images 10000 x 784, labels 10000 x 10

Page 28

Softmax and One-Hot Encoding

• We want the network to output a percent certainty it believes some image belongs to some label
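Both ideas can be sketched in NumPy. This is an illustration of the concepts, not code from the deck:

```python
import numpy as np

def softmax(z):
    # Exponentiate, then normalize so the outputs sum to 1, turning
    # raw scores into percent certainties.
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

def one_hot(label, num_classes=10):
    # A vector of zeros with a single 1 at the true label's index.
    v = np.zeros(num_classes)
    v[label] = 1.0
    return v

probs = softmax(np.array([2.0, 1.0, 0.1]))  # certainties for 3 classes
```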

Pages 32-40

[Slide figures, built up over several pages: each output neuron multiplies its inputs by randomly initialized scalar weights, adds a bias weight b (a scalar, multiplied by literally the value one), sums, and emits a scalar value < 1, the output percentage for one class (e.g. "six"); there are 10 of these neurons, one per digit]

Page 42

loss = tf.reduce_mean(-tf.reduce_sum(yTruth * tf.log(y),
                                     reduction_indices=1))

• tf.log(y) maps values close to 1 to values close to 0, and values close to 0 to values approaching -infinity

• tf.reduce_sum sums along the class dimension (mostly 0's, since yTruth is one-hot); the leading minus sign fixes the sign

Page 43

[Slide figure: for each example, -sum across labels of yTruth * log(y) gives one loss value, forming a vector, e.g. [0.288, 0.105, 0.916]; averaging these gives the final loss, 0.4363]
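The loss the slide computes (per-example cross-entropy, then an average over the batch) can be reproduced in NumPy. A sketch of what the tf.reduce_mean/tf.reduce_sum line does, not the deck's code:

```python
import numpy as np

def cross_entropy(y, yTruth):
    # -sum(yTruth * log(y)) along the class dimension (mostly 0's,
    # since yTruth is one-hot), then the mean over the batch.
    return float(np.mean(-np.sum(yTruth * np.log(y), axis=1)))

# Averaging the slide's per-example losses gives its final loss value.
per_example = np.array([0.288, 0.105, 0.916])
loss = per_example.mean()  # ~0.4363
```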

Page 44

lr = 0.5 # learning rate

trainStep = tf.train.GradientDescentOptimizer(lr).minimize(loss)

• The learning rate controls how much the weights change, in proportion to the gradient, at each training step

• Minimize the loss function
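What one gradient-descent step does can be shown on a toy loss. This is not the network's loss, just an illustrative quadratic, L(w) = (w - 3)^2, whose minimum sits at w = 3:

```python
def grad(w):
    # dL/dw for L(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

lr = 0.1    # learning rate: how far to step along the negative gradient
w = 0.0     # starting weight (a real network initializes randomly)
for step in range(100):
    w -= lr * grad(w)   # w moves toward the minimum at w = 3
```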

Page 45

Begin the TensorFlow Session

• Up to this point, we have just been laying down a blueprint for TensorFlow to follow, but it hasn’t “built” anything yet
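The "blueprint first, build later" idea can be mimicked with plain Python closures. A loose analogy for TF1's deferred graph execution, not TensorFlow code:

```python
def constant(v):
    # Building a node only records what to do; nothing is computed yet.
    return lambda: v

def add(a, b):
    return lambda: a() + b()

x = constant(2.0)
y = constant(3.0)
z = add(x, y)    # blueprint only: no addition has happened
result = z()     # the "session run": now the graph executes
```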

Page 46

batchSize = 100

for i in range(1000):
    # get some images and their labels
    xBatches, yBatches = mnist.train.next_batch(batchSize)
    # run one training step on this batch
    sess.run(trainStep, feed_dict={x: xBatches, yTruth: yBatches})
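What next_batch does, roughly (sample a random mini-batch, keeping each image paired with its label), can be sketched in NumPy. This is an illustration, not the mnist helper's actual implementation, and the zero arrays stand in for real data:

```python
import numpy as np

def next_batch(images, labels, batch_size, rng):
    # Pick batch_size random rows; each image keeps its matching label.
    idx = rng.integers(0, len(images), size=batch_size)
    return images[idx], labels[idx]

rng = np.random.default_rng(0)
images = np.zeros((550, 784))   # stand-in for mnist.train.images
labels = np.zeros((550, 10))    # stand-in for mnist.train.labels
xB, yB = next_batch(images, labels, 100, rng)
```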

Page 47

correctPred = tf.equal(tf.argmax(y, 1), tf.argmax(yTruth, 1))
accuracy = tf.reduce_mean(tf.cast(correctPred, tf.float32))

resultAcc = sess.run(accuracy, feed_dict=
    {x: mnist.test.images, yTruth: mnist.test.labels})
print("Trained Acc: %f" % resultAcc)
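The accuracy computation (argmax of the prediction vs. argmax of the one-hot truth, then a mean over the cast booleans) in NumPy, as a sketch of what the two tf lines compute; the small arrays are made-up examples:

```python
import numpy as np

def accuracy(y, yTruth):
    # A prediction is correct when the most certain class matches
    # the one-hot truth; cast the booleans and average them.
    correct = np.argmax(y, axis=1) == np.argmax(yTruth, axis=1)
    return float(np.mean(correct))

y_pred  = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
y_truth = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
acc = accuracy(y_pred, y_truth)   # 2 of the 3 predictions are correct
```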

Page 49

Questions?

Posted: 13/04/2019, 01:25
