JumaRubea committed
Commit f504975
1 Parent(s): ec24a39

Updated ReadMe

Files changed (1):
1. README.md (+68 -3)
README.md CHANGED
@@ -1,3 +1,68 @@
- ---
- license: mit
- ---
---
license: mit
datasets:
- ylecun/mnist
language:
- en
- sw
base_model:
- ChufanSuki/LeNet5
---
# lenet-5_architecture_model
# Handwritten Digit Recognition

This project implements the LeNet-5 neural network architecture to recognize handwritten digits using the MNIST dataset.

## Table of Contents

- [Introduction](#introduction)
- [Prerequisites](#prerequisites)
- [Installation](#installation)
- [Model Architecture](#model-architecture)
- [Results](#results)
- [Contributing](#contributing)

## Introduction

LeNet-5 is a classic convolutional neural network (CNN) architecture designed by Yann LeCun ([learn more](https://en.wikipedia.org/wiki/LeNet)), primarily for handwritten digit classification. This project uses [TensorFlow](https://www.tensorflow.org/guide/keras/functional_api) and Keras to build and train the LeNet-5 model on the MNIST dataset. The dataset is already included in this project.

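As a quick illustration (the dataset already ships with this project, so this is not necessarily the loading code used here), MNIST can also be pulled directly through Keras and scaled for training:

```python
from tensorflow import keras

# Load MNIST: 60,000 training and 10,000 test images of 28x28 grayscale digits
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Scale pixel values to [0, 1] and add the channel axis expected by Conv2D
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0
x_train = x_train[..., None]   # (60000, 28, 28, 1)
x_test = x_test[..., None]     # (10000, 28, 28, 1)
```
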
## Prerequisites

Make sure you have the following installed:

- [Python 3.6+](https://www.python.org/downloads/)
- [TensorFlow 2.x](https://www.tensorflow.org/install)
- [NumPy](https://numpy.org/install/)
- [Matplotlib](https://matplotlib.org/stable/install/index.html)
- [Pandas](https://pandas.pydata.org/docs/getting_started/install.html) (optional)

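If any of these are missing, one common way to get them is with pip (assuming a working Python environment; pin versions as you see fit):

```bash
pip install tensorflow numpy matplotlib pandas
```
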
## Installation

Clone this repository:

```bash
git clone https://github.com/jumarubea/lenet-5_architecture_model.git
cd lenet-5_architecture_model
```

## Model Architecture

LeNet-5 consists of the following layers:

- Convolutional Layer: 6 filters of size 5x5, activation function: tanh
- Average Pooling Layer: pool size 2x2
- Convolutional Layer: 16 filters of size 5x5, activation function: tanh
- Average Pooling Layer: pool size 2x2
- Convolutional Layer: 120 filters of size 5x5, activation function: tanh
- Flatten Layer
- Dense Layer: 84 units, activation function: tanh
- Output Layer: 10 units, activation function: softmax

Note: for better accuracy, this implementation uses the `relu` activation instead of `tanh`, except for the 84-unit dense layer.

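For reference, a minimal Keras sketch of this layer stack could look like the following. This is an illustration rather than the exact code in this repository: `padding="same"` on the first convolution is an assumption that lets the 28x28 MNIST images stand in for LeNet-5's original 32x32 inputs, and `relu` replaces `tanh` as noted above, except in the 84-unit dense layer.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    # "same" padding keeps the 28x28 resolution, mirroring the classic 32x32 input
    layers.Conv2D(6, kernel_size=5, padding="same", activation="relu"),
    layers.AveragePooling2D(pool_size=2),                   # 28x28 -> 14x14
    layers.Conv2D(16, kernel_size=5, activation="relu"),    # -> 10x10
    layers.AveragePooling2D(pool_size=2),                   # -> 5x5
    layers.Conv2D(120, kernel_size=5, activation="relu"),   # -> 1x1x120
    layers.Flatten(),
    layers.Dense(84, activation="tanh"),                    # kept as tanh, per the note
    layers.Dense(10, activation="softmax"),
])
model.summary()
```
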
## Results

The trained LeNet-5 model achieves a test accuracy of approximately 98% on the MNIST dataset.

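A rough way to reproduce a figure in this range, assuming the `model` and the preprocessed MNIST arrays from the sketches above (the optimizer, epoch count, and batch size used in this project are not stated here and are illustrative):

```python
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)

# Evaluate on the held-out test set
test_loss, test_acc = model.evaluate(x_test, y_test, verbose=0)
print(f"Test accuracy: {test_acc:.4f}")
```
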
## Contributing

If you want to contribute to this project, please fork the repository and submit a pull request.