Matthew Leung
1 min read · Nov 22, 2020

I just compared the performance of the tensorflow_macos package with the standard TensorFlow 2.0 Python package. I ran a very simple TensorFlow test program on my 2020 Mac (x86) with each of the two packages. I found that tensorflow_macos is much slower. I'm not sure whether the Mac-optimized package only benefits the new M1 ARM Macs, or whether the problem lies in my test program.

Here are the results (the Mac-optimized build was roughly 7× slower):

standard TensorFlow: 10.87s

tensorflow_macos: 79.41s

The test code:

```
from datetime import datetime

import numpy as np
import tensorflow as tf

# Import the mlcompute module to use the optional set_mlc_device API for
# device selection with ML Compute (this module ships with tensorflow_macos).
from tensorflow.python.compiler.mlcompute import mlcompute

# Let ML Compute choose the device. Available options are 'cpu', 'gpu', and 'any'.
mlcompute.set_mlc_device(device_name="any")

print("start", datetime.now())

X_raw = np.array([2013, 2014, 2015, 2016, 2017, 2018], dtype=np.float32)
y_raw = np.array([12000, 14000, 15000, 16500, 17500, 19000], dtype=np.float32)

# Min-max normalize the inputs and targets to [0, 1].
X = (X_raw - X_raw.min()) / (X_raw.max() - X_raw.min())
y = (y_raw - y_raw.min()) / (y_raw.max() - y_raw.min())

X = tf.constant(X)
y = tf.constant(y)

a = tf.Variable(initial_value=0.)
b = tf.Variable(initial_value=0.)
variables = [a, b]

num_epoch = 10000
optimizer = tf.keras.optimizers.SGD(learning_rate=1e-3)

for e in range(num_epoch):
    # Use tf.GradientTape() to record information about the gradient of the loss function.
    with tf.GradientTape() as tape:
        y_pred = a * X + b
        loss = 0.5 * tf.reduce_sum(tf.square(y_pred - y))
    # TensorFlow computes the gradients of the loss with respect to the model parameters automatically.
    grads = tape.gradient(loss, variables)
    # TensorFlow updates the parameters according to the gradients automatically.
    optimizer.apply_gradients(grads_and_vars=zip(grads, variables))

print(a, b)
print("end", datetime.now())
```
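For reference, here is a minimal sketch of how the same script can be run under both installs, since the mlcompute module only exists in the tensorflow_macos fork: the import is guarded so the standard wheel simply skips it. It also shows how elapsed seconds can be computed from the start/end timestamps; I assume the 10.87s and 79.41s figures above were derived this way, and the comments below mark that assumption.

```
from datetime import datetime

import tensorflow as tf

# mlcompute is only present in the tensorflow_macos package, so guard the
# import: under the standard TensorFlow 2.x wheel this block is skipped.
try:
    from tensorflow.python.compiler.mlcompute import mlcompute
    mlcompute.set_mlc_device(device_name="any")
    print("tensorflow_macos with ML Compute device selection")
except ImportError:
    print("standard TensorFlow", tf.__version__)

start = datetime.now()
# ... run the training loop from the listing above ...
end = datetime.now()

# Elapsed wall-clock time in seconds (assumed to be how the timings in the
# results above were obtained from the printed start/end datetimes).
print("elapsed:", (end - start).total_seconds(), "s")
```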


Responses (1)


I published some preliminary tests on the Apple TensorFlow accelerator here: https://github.com/apple/tensorflow_macos/issues/10
