Developing a basic TensorFlow-like framework

We'll be building the skeleton of a TensorFlow-like framework using just basic Python.

Important components:

1) Graph: holds information about the order of computations

2) Operations: the nodes in the graph

Gradient computations are described in later sections
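
Before diving in, here is a preview of the API we end up with. Everything in this snippet is built step by step below, so treat it purely as a roadmap:

g = Graph()
g.set_as_default()
x = Placeholder("x")
z = add(multiply(Variable("A",10),x),Variable("B",1))
Session().run(z,feed_dict={x:3})   # evaluates A*x + B = 10*3 + 1 = 31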

In [73]:
import numpy as np

Graph

In [182]:
class Graph():
    
    def __init__(self):
        
        self.operations = []
        self.placeholders = []
        self.variables = []
        
    def set_as_default(self):
        global _default_graph
        _default_graph = self
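
set_as_default simply stores the graph in a module-level global called _default_graph, so every node created afterwards can register itself without an explicit graph argument. A quick illustration (a sketch, not an original cell):

g = Graph()
g.set_as_default()
print(_default_graph is g)   # True: nodes built from here on attach themselves to g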

Operation

In [183]:
class Operation():
    
    def __init__(self,input_nodes=[]):
            
        self.input_nodes = input_nodes
        self.output_nodes = []
        
        for node in input_nodes:
            #add this operation to the output_nodes of each of its input nodes
            node.output_nodes.append(self)
            
        _default_graph.operations.append(self)
            
    #This is called when executing the graph with Session class
    def compute(self):
        #should be implemented by child class
        pass
In [184]:
class add(Operation):
    def __init__(self,x,y):
        self.name = "add"
        super().__init__([x,y])
        
    def compute(self,x_var,y_var):
        self.inputs = [x_var,y_var]
        return x_var + y_var
        
In [185]:
class multiply(Operation):
    def __init__(self,x,y):
        self.name = "multiply"
        super().__init__([x,y])
        
    def compute(self,x_var,y_var):
        self.inputs = [x_var,y_var]
        return x_var * y_var
In [186]:
class matmul(Operation):
    def __init__(self,x,y):
        self.name = "matmul"
        super().__init__([x,y])
        
    def compute(self,x_var,y_var):
        self.inputs = [x_var,y_var]
        #assuming they are numpy array
        return x_var.dot(y_var)
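
Any new operation follows the same pattern: set a name, hand the input nodes to the base class, and implement compute. For example, a subtract operation (shown only as a sketch; it is not used below) would be:

class subtract(Operation):
    def __init__(self,x,y):
        self.name = "subtract"
        super().__init__([x,y])
        
    def compute(self,x_var,y_var):
        self.inputs = [x_var,y_var]
        return x_var - y_var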

Variable

NOTE: In TensorFlow, variables are the nodes in a graph for which gradients have to be computed. We'll get to gradient computation in a later section.

In [187]:
class Variable():
    def __init__(self,name,initial_value=None):
        self.name = name
        self.value = initial_value
        self.output_nodes = []
        _default_graph.variables.append(self)

Placeholder

These are the data inputs to the graph; their values are supplied at run time through feed_dict.

In [188]:
class Placeholder():
    def __init__(self,name):
        self.name = name
        self.output_nodes = []
        _default_graph.placeholders.append(self)

Example:

Consider the computation

z = Ax + B

where A and B are Variables and x is a Placeholder.

Building the Graph

In [189]:
g = Graph()
g.set_as_default()
In [190]:
A = Variable("A",10)
In [191]:
B = Variable("B",1)
In [192]:
x = Placeholder("x")
In [193]:
y = multiply(A,x)
z = add(y,B)

In [194]:
[node.name for node in A.output_nodes] 
Out[194]:
['multiply']
In [195]:
[node.name for node in x.output_nodes]
Out[195]:
['multiply']
In [196]:
[node.name for node in y.input_nodes]
Out[196]:
['A', 'x']
In [197]:
[node.name for node in y.output_nodes]
Out[197]:
['add']
In [198]:
[node.name for node in B.output_nodes]
Out[198]:
['add']
In [199]:
[node.name for node in z.input_nodes]
Out[199]:
['multiply', 'B']
In [200]:
[node.name for node in z.output_nodes]
Out[200]:
[]
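
We can also confirm that both operations registered themselves with the default graph in creation order (a quick check, not an original cell):

[op.name for op in g.operations]   # ['multiply', 'add']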

Build the order of execution

A post-order traversal visits a node's inputs before the node itself, so every operation appears in the list only after everything it depends on.

In [203]:
def traverse_postorder(operation):
    
    nodes_postorder = []
    
    def recurse(node):
        if isinstance(node,Operation):
            for input_node in node.input_nodes:
                recurse(input_node)
        nodes_postorder.append({node.name : node})
            
    recurse(operation)
    
    return nodes_postorder
In [204]:
tree = traverse_postorder(z)
tree
Out[204]:
[{'A': <__main__.Variable at 0x7effea6a56a0>},
 {'x': <__main__.Placeholder at 0x7effea6a5828>},
 {'multiply': <__main__.multiply at 0x7effea6a5cc0>},
 {'B': <__main__.Variable at 0x7effea6a59e8>},
 {'add': <__main__.add at 0x7effea6a5cf8>}]
In [205]:
list(tree[0].values())[0]
Out[205]:
<__main__.Variable at 0x7effea6a56a0>

Execute the graph with Session class

In [206]:
class Session():
    
    def run(self,operation,feed_dict={}):
        #Get the order of execution
        nodes_postorder = traverse_postorder(operation)
        
        for node_d in nodes_postorder:
            
            node = list(node_d.values())[0]
            print(node.name)
            
            if isinstance(node,Placeholder):
                #Placeholder node
                node.value = feed_dict[node]
            elif isinstance(node, Operation):
                #Operation node
                node.inputs = [input_node.value for input_node in node.input_nodes]
                #compute: * unpacks the inputs list so each element is passed as a separate argument
                node.value = node.compute(*node.inputs)
                print(node.name, " Inputs :", node.inputs, "Out: ", node.value)
                
                if type(node.value) == list:
                    #convert plain Python lists to numpy arrays so downstream matmul keeps working
                    node.value = np.array(node.value)
                
        return operation.value
        
In [207]:
sess = Session()
In [208]:
result = sess.run(z,feed_dict={x:3})
A
x
multiply
multiply  Inputs : [10, 3] Out:  30
B
add
add  Inputs : [30, 1] Out:  31
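
As a quick sanity check (not an original cell), the result should match evaluating z = Ax + B by hand:

result == 10*3 + 1   # True, since 10*3 + 1 = 31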

Another example

In [209]:
g = Graph()
g.set_as_default()

A = Variable("A",np.array([[10,20],[30,40]]))
b = Variable("b",np.array([1,2]))
x = Placeholder("x")

z = add(matmul(A,x),b)
In [210]:
sess = Session()
sess.run(z,feed_dict={x:10})
A
x
matmul
matmul  Inputs : [array([[10, 20],
       [30, 40]]), 10] Out:  [[100 200]
 [300 400]]
b
add
add  Inputs : [array([[100, 200],
       [300, 400]]), array([1, 2])] Out:  [[101 202]
 [301 402]]
Out[210]:
array([[101, 202],
       [301, 402]])
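
The same result can be verified directly with NumPy (a quick check, not an original cell); note that the final add broadcasts b across the rows of the matrix product:

np.array([[10,20],[30,40]]).dot(10) + np.array([1,2])
# array([[101, 202],
#        [301, 402]])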

Activation function

In [211]:
import matplotlib.pyplot as plt
%matplotlib inline
In [212]:
def sigmoid(z):
    return 1/(1+np.exp(-z))
In [213]:
sample_z = np.linspace(-10,10,100)
sample_a = sigmoid(sample_z)
plt.plot(sample_z,sample_a)
Out[213]:
[<matplotlib.lines.Line2D at 0x7effea645c88>]

An activation is just another type of Operation

In [214]:
class Sigmoid(Operation):
    
    def __init__(self,z):
        self.name = "sigmoid"
        super().__init__([z])
        
    def compute(self,z_val):
        return sigmoid(z_val)
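
Other activation functions can be wrapped in exactly the same way. For instance, a ReLU operation (shown only as a sketch; it is not used below) could look like:

class Relu(Operation):
    
    def __init__(self,z):
        self.name = "relu"
        super().__init__([z])
        
    def compute(self,z_val):
        return np.maximum(0,z_val)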

Create a random dataset with sklearn

In [215]:
from sklearn.datasets import make_blobs
In [216]:
data = make_blobs(n_samples=50,n_features=2,centers=2,random_state=75) #centers = number of clusters (one per class)
In [217]:
data
Out[217]:
(array([[  7.3402781 ,   9.36149154],
        [  9.13332743,   8.74906102],
        [  1.99243535,  -8.85885722],
        [  7.38443759,   7.72520389],
        [  7.97613887,   8.80878209],
        [  7.76974352,   9.50899462],
        [  8.3186688 ,  10.1026025 ],
        [  8.79588546,   7.28046702],
        [  9.81270381,   9.46968531],
        [  1.57961049,  -8.17089971],
        [  0.06441546,  -9.04982817],
        [  7.2075117 ,   7.04533624],
        [  9.10704928,   9.0272212 ],
        [  1.82921897,  -9.86956281],
        [  7.85036314,   7.986659  ],
        [  3.04605603,  -7.50486114],
        [  1.85582689,  -6.74473432],
        [  2.88603902,  -8.85261704],
        [ -1.20046211,  -9.55928542],
        [  2.00890845,  -9.78471782],
        [  7.68945113,   9.01706723],
        [  6.42356167,   8.33356412],
        [  8.15467319,   7.87489634],
        [  1.92000795,  -7.50953708],
        [  1.90073973,  -7.24386675],
        [  7.7605855 ,   7.05124418],
        [  6.90561582,   9.23493842],
        [  0.65582768,  -9.5920878 ],
        [  1.41804346,  -8.10517372],
        [  9.65371965,   9.35409538],
        [  1.23053506,  -7.98873571],
        [  1.96322881,  -9.50169117],
        [  6.11644251,   9.26709393],
        [  7.70630321,  10.78862346],
        [  0.79580385,  -9.00301023],
        [  3.13114921,  -8.6849493 ],
        [  1.3970852 ,  -7.25918415],
        [  7.27808709,   7.15201886],
        [  1.06965742,  -8.1648251 ],
        [  6.37298915,   9.77705761],
        [  7.24898455,   8.85834104],
        [  2.09335725,  -7.66278316],
        [  1.05865542,  -8.43841416],
        [  6.43807502,   7.85483418],
        [  6.94948313,   8.75248232],
        [ -0.07326715, -11.69999644],
        [  0.61463602,  -9.51908883],
        [  1.31977821,  -7.2710667 ],
        [  2.72532584,  -7.51956557],
        [  8.20949206,  11.90419283]]),
 array([1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 1,
        1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1,
        1, 0, 0, 0, 0, 1]))
In [218]:
features = data[0]
labels = data[1]
In [219]:
#plt.plot(features[:,0],features[:,1],'r*') #works too, but doesn't let us color points according to their labels
plt.scatter(features[:,0],features[:,1],c=labels) 
Out[219]:
<matplotlib.collections.PathCollection at 0x7effea6029e8>

Hand-coded classifier

In [220]:
x = np.linspace(0,10,10)
y = -x+5
plt.scatter(features[:,0],features[:,1],c=labels) 
plt.plot(x,y)
Out[220]:
[<matplotlib.lines.Line2D at 0x7effea619b70>]

Equation of the classifier

y = -x + 5

Substitute the features f0 and f1 for x and y and check on which side of the line the result lies:

f1 = -f0 + 5

f1 + f0 - 5 = 0

(1,1)*F - 5 = 0

(1,1) is the weight vector and -5 is the bias

E.g., consider the point (8,10):

In [221]:
res = np.array([1,1]).dot(np.array([[8],[10]])) - 5
print(res)
print(sigmoid(res))
[13]
[0.99999774]

13 > 0, which implies the point belongs to the yellow class.
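
As an extra sanity check (a sketch, not an original cell), the same weight-and-bias rule can be applied to every point in the dataset generated earlier; the fraction of agreements with the blob labels should be close to 1.0 if the hand-drawn line really separates the two classes:

preds = (features.dot(np.array([1,1])) - 5 > 0).astype(int)
print((preds == labels).mean())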

We build a graph for this classifier

In [222]:
g = Graph()
g.set_as_default()
F = Placeholder("F")
W = Variable("W",np.array([1,1]))
b = Variable("b",-5)
z = add(matmul(W,F),b)
a = Sigmoid(z)
In [223]:
sess = Session()

This point belongs to the yellow class

In [225]:
sess.run(a,feed_dict={F:np.array([8,10])})
W
F
matmul
matmul  Inputs : [array([1, 1]), array([ 8, 10])] Out:  18
b
add
add  Inputs : [18, -5] Out:  13
sigmoid
sigmoid  Inputs : [13] Out:  0.999997739675702
Out[225]:
0.999997739675702
In [226]:
traverse_postorder(a)
Out[226]:
[{'W': <__main__.Variable at 0x7effea6b4c18>},
 {'F': <__main__.Placeholder at 0x7effea5d2940>},
 {'matmul': <__main__.matmul at 0x7effea5d2b38>},
 {'b': <__main__.Variable at 0x7effea5d27b8>},
 {'add': <__main__.add at 0x7effea5d2710>},
 {'sigmoid': <__main__.Sigmoid at 0x7effea5d2668>}]

This point belongs to the violet class

In [227]:
sess.run(a,feed_dict={F:np.array([0,-5])})
W
F
matmul
matmul  Inputs : [array([1, 1]), array([ 0, -5])] Out:  -5
b
add
add  Inputs : [-5, -5] Out:  -10
sigmoid
sigmoid  Inputs : [-10] Out:  4.5397868702434395e-05
Out[227]:
4.5397868702434395e-05
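
Finally, as a sketch that is not part of the original notebook, the same classifier graph can be evaluated over the whole dataset; note that Session.run prints every node it visits, so this produces a lot of output:

graph_preds = np.array([sess.run(a,feed_dict={F:f}) for f in features])
print(((graph_preds > 0.5).astype(int) == labels).mean())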