Image Classification with Deep Learning and Comparison between Different Convolutional Neural Network Structures using Tensorflow and Keras
| Author(s) | : | Karan Chauhan, Shrwan Ram |
| Institution | : | Computer Science Department, M.B.M. Engineering College Jodhpur |
| Published In | : | Vol. 5, Issue 2 — February 2018 |
| Page No. | : | 533-538 |
| Domain | : | Engineering |
| Type | : | Research Paper |
| ISSN (Online) | : | 2348-4470 |
| ISSN (Print) | : | 2348-6406 |
Deep learning technologies are becoming the major approaches for natural signal and information processing tasks such as image classification and speech recognition. Deep learning is a technology inspired by the functioning of the human brain: networks of artificial neurons analyze large datasets to automatically discover underlying patterns, without human intervention, in unstructured data such as images, sound, video, and text. Convolutional neural networks (CNNs) have become very popular for image classification in deep learning, and CNNs perform better than human subjects on many image classification datasets. In this paper, a deep convolutional network based on Keras and TensorFlow is deployed using Python for binary image classification. A large number of images of two types of animals, namely cats and dogs, are used for classification. Four different CNN structures are compared on a CPU system, with four different combinations of classifiers and activation functions. It is shown that, for binary image classification, the combination of a sigmoid classifier and the ReLU activation function gives higher classification accuracy than any other combination of classifier and activation function tested.
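The abstract describes a Keras/TensorFlow CNN for binary cat-vs-dog classification with ReLU hidden activations and a sigmoid output. The paper does not list its exact layer configuration, so the sketch below is a hypothetical minimal architecture of that kind; the filter counts, input size (64×64 RGB), and layer depths are illustrative assumptions, not the authors' reported structure.

```python
# Minimal sketch of a binary-classification CNN in Keras/TensorFlow.
# Architecture details (filter counts, 64x64 input) are illustrative
# assumptions; only the ReLU-hidden / sigmoid-output pattern follows
# the paper's reported best combination.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),          # 64x64 RGB input (assumed size)
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),     # ReLU in hidden layers
    layers.Dense(1, activation="sigmoid"),    # sigmoid for binary output
])

# Binary cross-entropy pairs naturally with the sigmoid output unit.
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# A single forward pass yields one probability per image (cat vs. dog).
probs = model.predict(np.zeros((1, 64, 64, 3), dtype="float32"), verbose=0)
```

The sigmoid unit squashes the final logit into [0, 1], so a threshold of 0.5 separates the two classes; with more than two classes one would swap in a softmax output and categorical cross-entropy instead.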
Karan Chauhan, Shrwan Ram, “Image Classification with Deep Learning and Comparison between Different Convolutional Neural Network Structures using Tensorflow and Keras”, International Journal of Advance Engineering and Research Development (IJAERD), Vol. 5, Issue 2, pp. 533-538, February 2018.