Posts

Showing posts with the label alphabet recognition

96.24% accuracy with higher epoch numbers for Convolutional Neural Network

For a Convolutional Neural Network, many factors affect model accuracy: the network structure, the hyperparameter values, over-fitting, and so on. Only a correct or suitable network structure can produce high model accuracy; with the wrong structure, no amount of hyperparameter tuning will help. So only after we have found the right network structure should we start tuning the hyperparameters. Even then, over-fitting may still hurt the model's accuracy. There are several methods to prevent it. One is the dropout function: dropout forces the model to find alternative paths through the network by randomly removing units during training, according to a given ratio. Another way to prevent over-fitting is to increase the number of training samples. We can also simplify the network structure, since an overly complicated network can itself cause over-fitting.  Fi...
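As a quick illustration of the dropout idea described above, here is a minimal sketch in plain NumPy (not the Keras Dropout layer itself): a fraction of the activations is zeroed at random, and the survivors are rescaled so the expected activation is unchanged.

```python
import numpy as np

def dropout(activations, rate, rng):
    # Inverted dropout: zero roughly `rate` of the units and rescale
    # the survivors by 1/keep_prob so the expectation stays the same.
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

rng = np.random.default_rng(0)
a = np.ones((4, 8))                  # toy layer of activations
dropped = dropout(a, rate=0.2, rng=rng)
# Each entry is now either 0.0 (dropped) or 1/0.8 = 1.25 (kept, rescaled).
```

At test time dropout is simply switched off, which is why the rescaling during training matters.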

Accuracy 96% with Simple Deep NN using Keras Theano backend for nonMNist alphabet recognition

Here is a first try at building a simple deep neural network using Keras with the Theano backend. Using the notMNIST data pickle we saved previously, here is the complete code. The network can be summarized as a stack of Dense layers: a common dense layer with 512 units, ReLU activation, a dropout ratio of 0.2, a softmax output, and the RMSprop optimizer.

from __future__ import print_function
import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import RMSprop
from keras.utils import np_utils
import os
from six.moves import cPickle as pickle

batch_size = 128
num_classes = 10
epochs = 20

def load_data():
    test_filename = "notMNIST.pickle"
    if os.path.exists(test_filename):
        with open(test_filename, 'rb') as f:
            letter_dataset = pickle.load(f)
        return letter_dataset

lt_dt = load_data()
train_dataset = lt_dt['train_...
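The listing above is cut off before the model definition. As a hedged sketch of what the described layers compute (random weights stand in for trained Keras parameters, and the sizes are the post's: 512 hidden units, 10 classes), the forward pass can be written in plain NumPy:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
num_features, hidden_units, num_classes = 784, 512, 10

# Randomly initialised stand-ins for the trained parameters.
W1 = rng.standard_normal((num_features, hidden_units)) * 0.01
b1 = np.zeros(hidden_units)
W2 = rng.standard_normal((hidden_units, num_classes)) * 0.01
b2 = np.zeros(num_classes)

x = rng.standard_normal((128, num_features))  # one batch of flattened 28x28 images
hidden = relu(x @ W1 + b1)                    # like Dense(512, activation='relu')
probs = softmax(hidden @ W2 + b2)             # like Dense(10, activation='softmax')
```

Each row of `probs` sums to 1, so it can be read as a distribution over the ten letter classes.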

Using mini batch SGD Neural Network in Alphabet recognition!

Here we use the mini-batch stochastic gradient descent (SGD) neural network method for alphabet recognition, with the data set mentioned previously in the nonMNIST data set post. First, we build the neural network part. The number of hidden neurons, layers, epochs, and mini-batch size are all adjustable, which lets us tune the accuracy later.

# network part
import random
import numpy as np

class Network(object):
    def __init__(self, sizes):
        # sizes: number of neurons per layer (e.g. (2, 3, 1))
        # biases and weights are initialized randomly
        self.num_layers = len(sizes)
        self.sizes = sizes
        self.biases = [np.random.randn(y, 1) for y in sizes[1:]]
        self.weights = [np.random.randn(y, x)
                        for x, y in zip(sizes[:-1], sizes[1:])]

    def feedforward(self, a):
        # return the output of the network if a is the input
        for b, w in zip(self.biases, self.weights):
            a = sigmoid(np.dot(w, a) + b)
        return a
...
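The excerpt stops before the training loop, so here is a rough sketch (the names below are illustrative, not the post's actual code) of the "mini-batch" part of mini-batch SGD: shuffle the training pairs each epoch, then slice them into fixed-size batches, each of which drives one weight update. The `sigmoid` that `feedforward` relies on is included as well.

```python
import random
import numpy as np

def sigmoid(z):
    # Squashes any real input into (0, 1); used by feedforward above.
    return 1.0 / (1.0 + np.exp(-z))

def iterate_minibatches(training_data, mini_batch_size, rng):
    # Shuffle once per epoch, then yield consecutive slices.
    data = list(training_data)
    rng.shuffle(data)
    for k in range(0, len(data), mini_batch_size):
        yield data[k:k + mini_batch_size]

rng = random.Random(0)
# Ten toy (input, target) pairs standing in for the alphabet samples.
samples = [(np.zeros((2, 1)), np.zeros((1, 1))) for _ in range(10)]
batches = list(iterate_minibatches(samples, mini_batch_size=4, rng=rng))
# 10 samples at batch size 4 gives batches of sizes 4, 4, and 2.
```

Averaging the gradient over each batch before updating is what distinguishes this from plain per-sample SGD.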

Alphabet recognition by using LogisticRegression model from sklearn.linear_model. (Udacity Assignment 1)

Continuing from the previous assignment, where we saved pickled data from some sample alphabet images, we now extract the data and train on it so that we can recognize the alphabet.

from __future__ import print_function
import matplotlib.pyplot as plt
import numpy as np
import os
import sys
import tarfile
import random
import hashlib
from IPython.display import display, Image
from sklearn.linear_model import LogisticRegression
from six.moves.urllib.request import urlretrieve
from six.moves import cPickle as pickle

test_filename = "notMNIST.pickle"

def read_data(picklename, data_name):
    if os.path.exists(picklename):
        with open(picklename, 'rb') as f:
            letter_set = pickle.load(f)
        return letter_set[data_name]

train_dataset = read_data(test_filename, 'train_dataset')
train_labels = read_data(test_filename, 'train_labels')
valid_dataset = read_data(test_filename, 'valid_dataset')
valid_labels = read_data(test_filename, 'valid_labels...
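The listing is truncated before the training step. As a hedged sketch of how the sklearn model is fitted (using small synthetic arrays in place of the real notMNIST pickles, with the sizes chosen purely for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins for the pickled arrays: each "image" is a
# flattened 28x28 block, labels run 0..9 for letters A..J.
rng = np.random.default_rng(0)
num_train, image_size, num_classes = 200, 28, 10
train_dataset = rng.standard_normal((num_train, image_size * image_size))
train_labels = rng.integers(0, num_classes, size=num_train)

# sklearn expects 2-D inputs, so the real 3-D image arrays must be
# flattened first, e.g. dataset.reshape(len(dataset), -1).
model = LogisticRegression(max_iter=200)
model.fit(train_dataset, train_labels)
predictions = model.predict(train_dataset)
```

On the real data, `model.score(valid_dataset_flat, valid_labels)` would give the validation accuracy.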