Tensorflow - ofithcheallaigh/masters_project GitHub Wiki
This section of the documentation provides an overview of the TensorFlow system, which was developed by the Google Brain team, initially for internal use within Google. TensorFlow has a flexible architecture and has become the most popular open-source Python machine learning library. It was released in 2015 under the Apache licence. In the years since its release, it has evolved into a full ecosystem of tools for model development and deployment. Over that time, a number of APIs have also been developed with the specific aim of handling tasks such as data ingestion, transformation, feature engineering and model development. One such API is the Keras API, which we will get to later.
As the name suggests, TensorFlow is built on tensors. TensorFlow is imported into Python as follows: `import tensorflow as tf`. Please note, the `as tf` alias is a convention, and is not required.
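A quick sketch showing that the alias is purely a convenience; both names refer to the same module:

```python
import tensorflow as tf  # "tf" is the conventional alias
import tensorflow        # importing without the alias also works

# Both names point at the same module object
print(tensorflow is tf)  # True

# The alias simply makes later references shorter, e.g. tf.constant(1)
```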
A tensor is a multidimensional array with a uniform data type; the list of available data types can be found at [1]. To keep things relatively simple, a tensor can be thought of as something similar to a `numpy.array`. There are a number of basic types of tensors:
Rank | Type |
---|---|
0 | Scalar |
1 | Vector |
2 | Matrix |
3 | Cube |
4+ | n-dimensional |
- The scalar, or rank-0 tensor, contains a single value, and as such has no axes. It can be constructed as `rank0Tensor = tf.constant(1)`. By default, this will be an `int32` data type.
- The vector, or rank-1 tensor, is a list of values, and as such has one axis. It can be constructed as `rank1Tensor = tf.constant([5.6, 3.0, 9.0])`.
- The matrix, or rank-2 tensor, has two axes and can be constructed as `rank2Tensor = tf.constant([[6.0, 2.0], [1.0, 5.0], [9.0, 1.0]])`.
- A rank-3 tensor, or cube tensor, can be constructed as:

```python
rank3Tensor = tf.constant([
    [[0, 1, 2, 3, 4],
     [5, 6, 7, 8, 9]],
    [[10, 11, 12, 13, 14],
     [15, 16, 17, 18, 19]],
    [[20, 21, 22, 23, 24],
     [25, 26, 27, 28, 29]],
])
```
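As the `numpy.array` analogy suggests, tensors convert readily to and from NumPy arrays. A minimal sketch of the round trip:

```python
import numpy as np
import tensorflow as tf

# A tensor can be built directly from a NumPy array...
npArray = np.array([[6.0, 2.0], [1.0, 5.0], [9.0, 1.0]])
rank2Tensor = tf.constant(npArray)

# ...and converted back to a NumPy array with .numpy()
roundTrip = rank2Tensor.numpy()
print(roundTrip.shape)  # (3, 2)
```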
Visualisations of the rank-0, rank-1 and rank-2 tensors, and of the cube tensor, can be found in the TensorFlow tensor guide [2].
Of course, higher-rank tensors are not easily displayed; however, we can give some details on how they are constructed, using a rank-4 tensor as an example. One of the more important bits of information to grasp is the axes. If we create a rank-4 tensor as follows:

```python
rank4Tensor = tf.zeros([7, 3, 2, 5])
```

the tensor will have four axes, of sizes 7, 3, 2 and 5 respectively. In the convention used in the TensorFlow tensor guide [2], the first axis is the batch, the last axis holds the features, and the axes in between hold spatial dimensions such as width and height.
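Continuing with the `rank4Tensor` example, the number of axes, the size of each axis and the total element count can all be inspected directly:

```python
import tensorflow as tf

rank4Tensor = tf.zeros([7, 3, 2, 5])

print(rank4Tensor.ndim)              # number of axes (the rank): 4
print(rank4Tensor.shape)             # size of each axis: (7, 3, 2, 5)
print(tf.size(rank4Tensor).numpy())  # total elements: 7*3*2*5 = 210
print(rank4Tensor.dtype)             # float32, the default for tf.zeros
```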
Tensors will typically contain floats or integers; however, it is not uncommon to see tensors containing complex numbers or strings.
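A short sketch of these data types; when no dtype is given, `tf.constant` infers it from the values passed in:

```python
import tensorflow as tf

floatTensor = tf.constant([1.5, 2.5])           # float32 by default
intTensor = tf.constant([1, 2])                 # int32 by default
complexTensor = tf.constant([1 + 2j, 3 - 4j])   # complex128
stringTensor = tf.constant(["tensor", "flow"])  # tf.string

print(floatTensor.dtype, intTensor.dtype,
      complexTensor.dtype, stringTensor.dtype)
```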
Keras, as mentioned above, is an open-source deep learning framework. It is also referred to as a high-level API. When the term "high level" is used, it implies that at the lower level there is a body of code — functions, classes and so on — which actually executes the computations required to generate a model. Backends at this lower level have included TensorFlow, Theano and the Microsoft Cognitive Toolkit (CNTK) [3]. So, essentially, Keras provides an easier way to access the lower-level functionality.
The Keras functionality can be imported into Python as follows:

```python
import tensorflow
from tensorflow import keras
```

The Keras API provides a number of model classes, namely:

- The `Model` class
- The `Sequential` class
The `Sequential` model is a basic model which consists of a sequence of layers, one after the other. Once the Keras API has been imported into Python, the `Sequential` class can be accessed as follows:

```python
from tensorflow.keras import Sequential
```
When using the `Sequential` model within Keras, there are a number of metrics we can use. A metric is a parameter, or a function, that can be used to evaluate the performance of a model. The choice of metric shapes how a model's performance is judged, and as such, care needs to be taken in its selection.
[1] TensorFlow Org, "Module: tf.dtypes," [Online]. Available: https://www.tensorflow.org/api_docs/python/tf/dtypes.
[2] TensorFlow Org, "Introduction to Tensors," [Online]. Available: https://www.tensorflow.org/guide/tensor.
[3] K. Tung, "Chapter 1. Introduction to TensorFlow 2," in TensorFlow 2 Pocket Reference, O'Reilly Media, Inc., 2021.
[4] J. Brownlee, "A Gentle Introduction to the Rectified Linear Unit (ReLU)," 20 August 2020. [Online]. Available: https://machinelearningmastery.com/rectified-linear-activation-function-for-deep-learning-neural-networks/.