Classifier construction
The PNN is guaranteed to converge to a Bayes-optimal classifier, so it has excellent potential for making accurate classification decisions and for providing probability and reliability measures for each classification. In addition, the training procedure of the PNN needs only one epoch to adjust the weights and biases of the network. The most important advantage of using the PNN is therefore its high speed of learning. Typically, a PNN consists of an input layer, a pattern layer, a summation layer, and a decision layer.
The function of the neurons in each layer of the PNN is defined as follows.
Fig 4: Topology of a PNN classifier
1) Layer 1: The first layer is the input layer, and this layer performs no computation. The neurons of this layer convey the input features x to the neurons of the second layer directly,
where p is the number of the extracted features.
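The input-layer equation referenced above appears to have been lost in extraction; given the where-clause, it is presumably the p-dimensional feature vector itself:

```latex
\mathbf{x} = [x_1, x_2, \ldots, x_p]^{T}
```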
2) Layer 2: The second layer is the pattern layer, and the number of neurons in this layer is equal to NL. When a pattern vector x from the input layer arrives, the output of the neurons of the pattern layer is calculated as follows:
where xki is the neuron vector, σ is a smoothing parameter, d is the dimension of the pattern vector x, and ϕki is the output of the pattern layer.
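The pattern-layer equation itself is missing from the extracted text. In the standard PNN formulation (Specht), with Gaussian kernels and the symbols defined above, it takes the form:

```latex
\phi_{ki}(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\,\sigma^{d}}
  \exp\!\left[-\frac{(\mathbf{x}-\mathbf{x}_{ki})^{T}(\mathbf{x}-\mathbf{x}_{ki})}{2\sigma^{2}}\right]
```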
3) Layer 3: The third layer is the summation layer. The contributions for each class of inputs are summed in this layer to produce the output as a vector of probabilities. Each neuron in the summation layer represents the active status of one class. The output of the kth neuron is
where Nk is the total number of training samples in the kth class.
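The summation-layer equation is also lost in extraction. In the standard PNN it is the class-conditional average of the pattern-layer outputs; writing N_k for the number of training samples of class k:

```latex
f_{k}(\mathbf{x}) = \frac{1}{N_{k}} \sum_{i=1}^{N_{k}} \phi_{ki}(\mathbf{x})
```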
4) Layer 4: The fourth layer is the decision layer
where m denotes the number of classes in the training samples and c(x) is the estimated class of the pattern x.
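The decision-layer equation is missing as well; with the symbols defined above, the standard PNN decision rule is:

```latex
c(\mathbf{x}) = \arg\max_{1 \le k \le m} f_{k}(\mathbf{x})
```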
In this paper, the output of the PNN is represented as the label of the desired outcome defined by users. For example, in our handwritten digit recognition, the labels “1,” “2,” “3,” “4,” “5,” “6,” “7,” “8,” “9,” and “10” are used to represent the handwritten digits 1, 2, . . ., 9, and 0, respectively.
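The four layers described above can be sketched compactly. The following is a minimal illustration, not the paper's implementation; the class name, the default σ, and the omission of the kernel's normalizing constant (which is identical across classes and so does not affect the argmax) are all assumptions of this sketch.

```python
import numpy as np

class PNN:
    """Sketch of the four-layer PNN described above.

    Layer 1 (input) passes the feature vector through unchanged;
    layer 2 (pattern) evaluates one Gaussian kernel per stored
    training sample; layer 3 (summation) averages the kernel
    outputs per class; layer 4 (decision) picks the class with
    the largest score.
    """

    def __init__(self, sigma=0.5):
        self.sigma = sigma  # smoothing parameter of the Gaussian kernels

    def fit(self, X, y):
        # Training takes a single pass: store the samples grouped by class.
        self.classes_ = np.unique(y)
        self.groups_ = [X[y == c] for c in self.classes_]
        return self

    def _scores(self, x):
        # Summation-layer outputs f_k(x) for one pattern vector x.
        # The factor 1/((2*pi)^(d/2) * sigma^d) is the same for every
        # class, so it is dropped before the argmax.
        out = []
        for Xk in self.groups_:
            d2 = np.sum((Xk - x) ** 2, axis=1)  # squared distances to class-k samples
            out.append(np.mean(np.exp(-d2 / (2.0 * self.sigma ** 2))))
        return np.array(out)

    def predict(self, X):
        # Decision layer: c(x) = argmax_k f_k(x).
        return np.array([self.classes_[np.argmax(self._scores(x))] for x in X])
```

In the digit-recognition setting of this paper, the entries of y would simply be the user-defined labels “1” through “10”.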
IV. RESULTS AND DISCUSSION
Table 1: Recognition rate for different combinations of feature selection and extraction methods for online recognition.
Table 2: Recognition rate for different combinations of feature selection and extraction methods for offline recognition.
Table 3: Online recognition results for 10 samples with PNN
Table 4: Offline recognition results for 10 samples with PNN
Table 5: Offline confusion matrix
Fig 5: Training data generated in .txt format and converted to .xls format
Fig 6: Training data generated
Fig 7: Output of feature generation
Fig 8: Final output display on screen using PNN classifier
Fig 9: Graph for all coordinates of digit ‘5’
Fig 10: After applying moving average filter for digit ‘5’
Fig 11: After applying high pass filter for digit ‘5’
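Figs. 9–11 show a digit trace before and after the moving-average and high-pass filtering steps of the preprocessing stage. A minimal sketch of such filters follows; the function names and the window length are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def moving_average(signal, window=5):
    """Moving-average (low-pass) filter over a 1-D trace."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def high_pass(signal, window=5):
    """Crude high-pass filter: subtract the moving-average baseline,
    keeping only the fast variations of the trace."""
    return signal - moving_average(signal, window)
```

Such filters would be applied per axis to the coordinate or acceleration traces of a digit (as in Figs. 10 and 11) before feature generation.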
Fig 12: Mean of ‘0’ and ‘6’
Fig 13: Average recognition rate vs. feature dimension of the PNN classifier using KBCS
Fig 14: Average recognition rate vs. feature dimension of the PNN classifier using KBCS
VII. CONCLUSION
This paper has presented a systematic trajectory recognition algorithm framework that can construct effective classifiers for acceleration-based handwriting and gesture recognition. The proposed trajectory recognition algorithm consists of acceleration acquisition, signal preprocessing, feature generation, feature selection, and feature extraction. With the reduced features, a PNN can be quickly trained as an effective classifier. In the experiments, we used 3-D hand gestures to validate the effectiveness of the proposed device and algorithm. The overall handwritten digit recognition rate was 90% for offline and 71% for online recognition using the PNN, and lower with an ANN. This result encourages us to further investigate the possibility of using our digital pen as an effective tool for HCI applications.