Dissecting tf.function to discover AutoGraph strengths and subtleties - PowerPoint PPT Presentation


12/7/2019 - Dissecting tf.function to discover AutoGraph strengths and subtleties. How to correctly write graph-convertible functions in TensorFlow 2.0.


SLIDE 1

12/7/2019 Dissecting tf.function to discover AutoGraph strengths and subtleties file:///home/paolo/projects/tf-function-talk/index.html#slide=1 1/36

Dissecting tf.function to discover AutoGraph strengths and subtleties

How to correctly write graph-convertible functions in TensorFlow 2.0.

SLIDE 2

About me

Computer engineer | Head of ML & CV @ ZURU Tech Italy | Machine Learning GDE Blog: https://pgaleone.eu/ Github: https://github.com/galeone/ Twitter: @paolo_galeone

SLIDE 3

TensorFlow 2.0 & DataFlow Graphs

  • Execution Speed
  • Language Agnostic Representation
  • Easy to replicate and distribute
  • Automatic Differentiation

SLIDE 4

tf.function and AutoGraph

# tf.function signature: it is a decorator.
def function(func=None,
             input_signature=None,
             autograph=True,
             experimental_autograph_options=None)

tf.function uses AutoGraph. AutoGraph converts Python control flow statements into the appropriate TensorFlow graph ops.
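As a minimal sketch of this conversion (the function name and loop are illustrative, not from the talk), a plain Python while loop on a tensor becomes a graph-level loop when traced, and the generated source can be inspected:

```python
import tensorflow as tf

# Illustrative sketch: AutoGraph rewrites the Python `while` below into
# graph loop ops when tf.function traces the function.
@tf.function
def count_down(n):
    while n > 0:  # becomes a graph-level loop, not a Python loop
        n -= 1
    return n

# The generated source is available for the wrapped Python function.
print(tf.autograph.to_code(count_down.python_function))
print(count_down(tf.constant(5)))  # a tf.Tensor with value 0
```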

SLIDE 5

tf.function and AutoGraph

SLIDE 6

The problem

Given the constant matrices A and x, and the scalar variable b, compute y = Ax + b.

SLIDE 7

TensorFlow 1.x solution

g = tf.Graph()
with g.as_default():
    a = tf.constant([[10, 10], [11., 1.]])
    x = tf.constant([[1., 0.], [0., 1.]])
    b = tf.Variable(12.)
    y = tf.matmul(a, x) + b
    init_op = tf.global_variables_initializer()
    with tf.Session() as sess:
        sess.run(init_op)
        print(sess.run(y))

SLIDE 8

TensorFlow 2.0 solution: eager execution

def f():
    a = tf.constant([[10, 10], [11., 1.]])
    x = tf.constant([[1., 0.], [0., 1.]])
    b = tf.Variable(12.)
    y = tf.matmul(a, x) + b
    return y

print([f().numpy() for _ in range(10)])

Every tf.* op produces a tf.Tensor object

SLIDE 9

From eager function to tf.function

@tf.function
def f():
    a = tf.constant([[10, 10], [11., 1.]])
    x = tf.constant([[1., 0.], [0., 1.]])
    b = tf.Variable(12.)
    y = tf.matmul(a, x) + b
    print("PRINT: ", y)
    tf.print("TF-PRINT: ", y)
    return y

f()

SLIDE 10

From eager function to tf.function

PRINT: Tensor("add:0", shape=(2, 2), dtype=float32)

SLIDE 11

From eager function to tf.function

ValueError: tf.function-decorated function tried to create variables on non-first call.

SLIDE 12

Lesson #1: functions that create a state

A tf.Variable object in eager mode is just a Python object that gets destroyed as soon as it goes out of scope. A tf.Variable object inside a tf.function-decorated function is the definition of a node in a persistent graph (eager execution is disabled there).

SLIDE 13

The solution

class F():
    def __init__(self):
        self._b = None

    @tf.function
    def __call__(self):
        a = tf.constant([[10, 10], [11., 1.]])
        x = tf.constant([[1., 0.], [0., 1.]])
        if self._b is None:
            self._b = tf.Variable(12.)
        y = tf.matmul(a, x) + self._b
        print("PRINT: ", y)
        tf.print("TF-PRINT: ", y)
        return y

f = F()
f()

SLIDE 14

Lesson #2: an eager function is not graph-convertible as is

There is no 1:1 match between eager execution and the graph built by @tf.function. Define the function thinking about the graph that is being built.

SLIDE 15

Change the input type

Python is a dynamically-typed language. TensorFlow is a strictly statically-typed framework.
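This mismatch can be made visible with a sketch (assuming the TF 2.x get_concrete_function API): tf.function traces and caches one statically typed graph per input signature, so different dtypes yield different concrete graphs.

```python
import tensorflow as tf

@tf.function
def double(x):
    return x * 2

# One statically typed graph is traced and cached per input signature:
# a float32 input and an int32 input get two distinct concrete functions.
c_float = double.get_concrete_function(tf.TensorSpec([], tf.float32))
c_int = double.get_concrete_function(tf.TensorSpec([], tf.int32))
print(c_float is c_int)  # False: two separate graphs
```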

SLIDE 16

The function

@tf.function
def f(x):
    print("Python execution: ", x)
    tf.print("Graph execution: ", x)
    return x

The types of the function parameters are used to create a graph (a statically typed object) and to assign it an ID

SLIDE 17

tf.Tensor as input

print("##### float32 test #####")
a = tf.constant(1, dtype=tf.float32)
print("first call")
f(a)
a = tf.constant(1.1, dtype=tf.float32)
print("second call")
f(a)

print("##### uint8 test #####")
b = tf.constant(2, dtype=tf.uint8)
print("first call")
f(b)
b = tf.constant(3, dtype=tf.uint8)
print("second call")
f(b)

SLIDE 18

tf.Tensor as input

##### float32 test #####
first call
Python execution:  Tensor("x:0", shape=(), dtype=float32)
Graph execution:  1
second call
Graph execution:  1.1
##### uint8 test #####
first call
Python execution:  Tensor("x:0", shape=(), dtype=uint8)
Graph execution:  2
second call
Graph execution:  3

Everything works as expected

SLIDE 19

Inspecting the function

tf.autograph.to_code(f.python_function)

def tf__f(x):
    try:
        with ag__.function_scope('f'):
            do_return = False
            retval_ = None
            with ag__.utils.control_dependency_on_returns(ag__.converted_call(print, None, ag__.ConversionOptions(re
            tf_1, x_1 = ag__.utils.alias_tensors(tf, x)
            with ag__.utils.control_dependency_on_returns(ag__.converted_call('print', tf_1, ag__.ConversionOption
            x_2 = ag__.utils.alias_tensors(x_1)
            do_return = True
            retval_ = x_1
            return retval_
    except:
        ag__.rewrite_graph_construction_error(ag_source_map__)

SLIDE 20

Inspecting the function

with ag__.utils.control_dependency_on_returns(
    ag__.converted_call(
        print, None,
        ag__.ConversionOptions(
            recursive=True,
            force_conversion=False,
            optional_features=ag__.Feature.ALL,
            internal_convert_user_code=True),
        ('Python execution: ', x), {})):

The converted_call owner parameter is None: this line becomes a tf.no_op()

SLIDE 21

Python native type as input

def printinfo(x):
    print("Type: ", type(x), " value: ", x)

print("##### int test #####")
print("first call")
a = 1
printinfo(a)
f(a)
print("second call")
b = 2
printinfo(b)
f(b)

print("##### float test #####")
print("first call")
a = 1.0
printinfo(a)
f(a)
print("second call")
b = 2.0
printinfo(b)
f(b)

SLIDE 22

Python native type as input

Call with Python int

##### int test #####
first call
Type:  <class 'int'>  value:  1
Python execution:  1
Graph execution:  1
second call
Type:  <class 'int'>  value:  2
Python execution:  2
Graph execution:  2

The graph is being recreated at every function invocation

SLIDE 23

Python native type as input

Call with Python float

##### float test #####
first call
Type:  <class 'float'>  value:  1.0
Graph execution:  1
second call
Type:  <class 'float'>  value:  2.0
Graph execution:  2

The return type is WRONG. The graphs built for the integers 1 and 2 are reused for the floats 1.0 and 2.0

SLIDE 24

SLIDE 25

Lesson #3: no autoboxing

That's why the graph built for f(1) was reused for f(1.0): 1 == 1.0. tf.function does not automatically convert a Python integer to a tf.Tensor with dtype tf.int64, and so on. When the input is not a tf.Tensor object, the graph ID is built using the variable value.
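One way to sidestep the value-based graph ID is a sketch using the input_signature parameter from the decorator signature shown earlier (behavior assumed from TF 2.x): with an explicit signature, Python scalars are converted to tensors of the declared dtype, so calls with 1 and 1.0 share one graph.

```python
import tensorflow as tf

# Sketch: with an explicit input_signature, tf.function converts the
# Python scalar argument to a float32 tensor, so f(1) and f(1.0)
# share a single graph instead of being keyed on the raw Python value.
@tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
def f(x):
    return x

print(f(1).dtype)  # float32: the int was boxed by the signature
print(f(1.0))      # same graph reused
```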

SLIDE 26

No autoboxing: performance measurement

@tf.function
def g(x):
    return x

start = time.time()
for i in tf.range(1000):
    g(i)
print("tf.Tensor time elapsed: ", (time.time() - start))

start = time.time()
for i in range(1000):
    g(i)
print("Native type time elapsed: ", (time.time() - start))

tf.Tensor time elapsed:  0.41594886779785156
Native type time elapsed:  5.189513444900513

SLIDE 27

Lesson #4: tf.Tensor everywhere

Use tf.Tensor everywhere.
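The cost of ignoring this lesson can be made visible with a sketch (the side-effect counter is illustrative): Python-side code in the function body runs only while tracing, so counting its executions counts the traces.

```python
import tensorflow as tf

traces = []

@tf.function
def g(x):
    traces.append(1)  # Python side effect: executed only at trace time
    return x

# Tensor inputs of the same dtype reuse one cached graph...
for i in range(5):
    g(tf.constant(i))
print(len(traces))  # 1 trace

# ...while each distinct Python value triggers a new trace.
for i in range(3):
    g(i)
print(len(traces))  # 1 + 3 = 4 traces
```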

SLIDE 28

Python Operators

@tf.function
def if_elif(a, b):
    if a > b:
        tf.print("a > b", a, b)
    elif a == b:
        tf.print("a == b", a, b)
    elif a < b:
        tf.print("a < b", a, b)
    else:
        tf.print("wat")

x = tf.constant(1)
if_elif(x, x)

SLIDE 29

Python operators

wat

SLIDE 30

SLIDE 31

Problems

Python __eq__ is not converted to tf.equal (not even in eager mode!): it checks for Python object identity. AutoGraph converts the if, elif, and else statements, but it does not convert the Python built-in operators (__eq__, __lt__, __gt__).

SLIDE 32

Python operators

@tf.function
def if_elif(a, b):
    if tf.math.greater(a, b):
        tf.print("a > b", a, b)
    elif tf.math.equal(a, b):
        tf.print("a == b", a, b)
    elif tf.math.less(a, b):
        tf.print("a < b", a, b)
    else:
        tf.print("wat")

SLIDE 33

Lesson 5: operators

Use the TensorFlow operators explicitly everywhere.

SLIDE 34

Recap

  • 1. tf.Variable objects need special treatment.
  • 2. Think about the graph while designing the function: the eager-to-graph conversion is not straightforward.
  • 3. There is no autoboxing of Python native types, therefore
  • 4. Use tf.Tensor everywhere.
  • 5. Use the TensorFlow operators explicitly everywhere.
SLIDE 35

Stay updated: https://pgaleone.eu/subscribe

Hands-On Neural Networks with TensorFlow 2.0

Understanding the TensorFlow architecture, from static graphs to eager execution, and designing Deep Neural Networks.

SLIDE 36

The End

Thank you 😅 Blog: https://pgaleone.eu/ Twitter: https://twitter.com/paolo_galeone GitHub: https://github.com/galeone (the slides of this talk are here!)