
Miniflow

What is miniflow?

Miniflow is a study project that implements TensorFlow-like functions and utilities. It is a very simple, most-basic version of a deep-learning framework.

How to try

git clone https://github.com/smilu97/miniflow
cd miniflow
virtualenv venv -p python3
source venv/bin/activate
pip install -r requirements.txt
python test.py sin
# python test.py xor  # If you want to see the XOR test

Learning XOR Test

[Figure: xor_test, result of the XOR learning test]

A simple two-layer network of sigmoid units:

  • 2 inputs (x)
  • 2 hidden units (S0 = sigmoid(x * V0 + b0))
  • 1 output (S1 = sigmoid(S0 * V1 + b1))

import numpy as np
import miniflow as fl  # assuming the package is importable as `miniflow`

sess = fl.Session(lr=0.1)  # session with learning rate 0.1

# XOR truth table
train_x = np.array([[0, 0],
                    [0, 1],
                    [1, 0],
                    [1, 1]])
train_y = np.array([[0], [1], [1], [0]])

x = fl.Placeholder(sess, train_x, 'x')
y = fl.Placeholder(sess, train_y, 'y')

# Hidden layer: 2 -> 2, sigmoid activation
V0 = fl.Variable(sess, fl.xavier(2, 2))
b0 = fl.Variable(sess, fl.xavier(2))
S0 = fl.sigmoid(fl.matmul(x, V0) + b0)

# Output layer: 2 -> 1, sigmoid activation
V1 = fl.Variable(sess, fl.xavier(2, 1))
b1 = fl.Variable(sess, fl.xavier(1))
S1 = fl.sigmoid(fl.matmul(S0, V1) + b1)

# Squared-error loss summed over the batch
E = fl.sum(fl.square(S1 - y), axis=0)
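
For reference, here is a minimal pure-NumPy sketch of the same 2-2-1 sigmoid network: the forward pass, the summed squared-error loss from above, and a hand-written gradient-descent step. It only illustrates the computation the graph encodes; it does not use the miniflow API, and the explicit training loop is an assumption about how the session would drive updates.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

# Xavier-style initialization: scale ~ sqrt(1 / fan_in)
V0 = rng.normal(0, np.sqrt(1 / 2), size=(2, 2))
b0 = np.zeros(2)
V1 = rng.normal(0, np.sqrt(1 / 2), size=(2, 1))
b1 = np.zeros(1)

lr = 0.1
for step in range(20000):
    # Forward pass (same structure as the graph above)
    S0 = sigmoid(X @ V0 + b0)          # hidden layer
    S1 = sigmoid(S0 @ V1 + b1)         # output layer
    E = np.sum((S1 - Y) ** 2, axis=0)  # squared-error loss

    # Backward pass (manual chain rule through sigmoid and matmul)
    dS1 = 2 * (S1 - Y) * S1 * (1 - S1)
    dV1 = S0.T @ dS1
    db1 = dS1.sum(axis=0)
    dS0 = (dS1 @ V1.T) * S0 * (1 - S0)
    dV0 = X.T @ dS0
    db0 = dS0.sum(axis=0)

    # Gradient-descent update
    V0 -= lr * dV0; b0 -= lr * db0
    V1 -= lr * dV1; b1 -= lr * db1

    if step % 5000 == 0:
        print(step, E[0])

print(np.round(S1, 3))  # should approach [[0], [1], [1], [0]]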

Learning Sin Test

[Figures: sin_test and sin_complete, results of the sin learning test]

A simple feed-forward regression network:

  • 1 input (x)
  • 1000 hidden units (S0 = tanh(x * W0 + b0))
  • 1000 hidden units (S1 = tanh(S0 * W1 + b1))
  • 1 output (S2 = S1 * W2 + b2)

import numpy as np
import miniflow as fl  # assuming the package is importable as `miniflow`

input_size = 1    # constant: single scalar input
h1 = 1000         # first hidden layer width
h2 = 1000         # second hidden layer width
output_size = 1   # constant: single scalar output
batch_size = 200

sess = fl.Session(lr=0.1)   # learning rate taken from the XOR example
sess.fan_in = input_size    # for the xavier initializer
sess.fan_out = output_size

x = fl.Placeholder(sess, np.zeros((batch_size, 1)), 'x')
y = fl.Placeholder(sess, np.zeros((batch_size, 1)), 'y')

# Two tanh hidden layers followed by a linear output layer
S0, W0, b0 = fl.fully_conntected(x, h1, activation=fl.tanh, initializer=fl.xavier_initializer())
S1, W1, b1 = fl.fully_conntected(S0, h2, activation=fl.tanh, initializer=fl.xavier_initializer())
S2, W2, b2 = fl.fully_conntected(S1, output_size, activation=None, initializer=fl.xavier_initializer())

y_ = S2
# Mean squared error, averaged over the batch and output dimensions
E = fl.avg(fl.avg(fl.square(y - y_), 0), 0)
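
The fully-connected helper calls above hide the layer construction. The sketch below shows, in plain NumPy, what a Xavier-initialized dense layer computes for this 1 -> 1000 -> 1000 -> 1 network. The `xavier` and `fully_connected` functions here are illustrative stand-ins written for this example, not miniflow's own implementation (its `fl.xavier` / `fl.xavier_initializer` may use a different formula).

import numpy as np

rng = np.random.default_rng(0)

def xavier(fan_in, fan_out):
    # Xavier/Glorot uniform init: U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def fully_connected(inputs, out_dim, activation=np.tanh):
    # Build one dense layer and return (output, W, b), mirroring the triple above
    in_dim = inputs.shape[1]
    W = xavier(in_dim, out_dim)
    b = np.zeros(out_dim)
    z = inputs @ W + b
    return (activation(z) if activation is not None else z), W, b

# Forward pass on a batch of 200 points in [-pi, pi]
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
S0, W0, b0 = fully_connected(x, 1000, activation=np.tanh)
S1, W1, b1 = fully_connected(S0, 1000, activation=np.tanh)
S2, W2, b2 = fully_connected(S1, 1, activation=None)

E = np.mean((np.sin(x) - S2) ** 2)  # mean squared error against sin(x)
print(E)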

TODO

  • Basic Graph Node
  • XOR learning test
  • Concat, Select - not tested
  • Transpose
  • Shape validations
  • Convolution2D
  • Optimize back-propagation algorithm
  • MaxPool, AvgPool, etc.
