Institute of Technology Management
National Chung Hsing University

John Sum, Professor
BEng, MPhil, PhD, IEEE Senior Member
IJCA Associate Editor, APNNA


Rm 821, College of Social Sci. & Mgt Building
National Chung Hsing University
250 Kuo Kuang Road, Taichung 402, Taiwan.
pfsum@nchu.edu.tw

Representative Publication

John Sum, Chi-sing Leung, and Kevin Ho, "On-line node fault injection training algorithm for MLP networks: Objective function and convergence analysis," IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 2, pp. 211-222, 2012.

I. My Contributions

Background Survey and Problem Formulation

  1. I was involved in the discussions and problem formulation during the early stage of the research (dating back to 2007). At that time, I conducted a comprehensive survey of the literature on the convergence of learning algorithms based on injecting noise or faults during training. Several noise/fault injection algorithms with incomplete theoretical analyses were thereby identified for further investigation.
  2. Simulations were then conducted to investigate their convergence behaviors, fault tolerance, and generalization abilities. The simulation results showed that injecting multiplicative or additive weight noise is incapable of improving fault tolerance or generalization, whereas on-line node fault injection training improves both (a sketch of this training scheme is given after this list).
  3. I therefore conjectured that the on-line node fault injection training algorithm for MLPs converges with probability one.
  4. An unpublished manuscript (Version 1), summarizing these preliminary findings together with a sketch of the theoretical proof, was compiled and sent to the collaborators Chi-sing Leung and Kevin Ho for discussion.
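
The following is a minimal sketch of such an on-line node fault injection training scheme for a one-hidden-layer MLP with a single linear output node, written in Python with NumPy. It is an illustration under stated assumptions, not the paper's exact algorithm: the function and variable names are hypothetical, the fault model assumes each hidden node is independently stuck at zero with probability p_fault at every update step, and the decreasing step-size schedule mu = 1/(100 + t) is an arbitrary choice satisfying the usual stochastic approximation conditions.

    import numpy as np

    def train_node_fault_injection(X, y, n_hidden=10, p_fault=0.05,
                                   n_epochs=100, seed=0):
        """On-line (per-sample) gradient descent for a one-hidden-layer MLP
        with a single linear output node; each hidden node is independently
        stuck at zero with probability p_fault at every update step."""
        rng = np.random.default_rng(seed)
        W = rng.normal(scale=0.1, size=(n_hidden, X.shape[1]))  # input-to-hidden
        v = rng.normal(scale=0.1, size=n_hidden)                # hidden-to-output
        t = 0
        for _ in range(n_epochs):
            for i in rng.permutation(len(X)):
                t += 1
                mu = 1.0 / (100.0 + t)                   # decreasing step size
                h = np.tanh(W @ X[i])                    # fault-free hidden output
                beta = (rng.random(n_hidden) >= p_fault).astype(float)
                h_f = beta * h                           # faulty hidden output
                e = y[i] - v @ h_f                       # output error
                # stochastic gradient step on the faulty network's squared error
                v_old = v.copy()
                v = v + mu * e * h_f
                W = W + mu * e * np.outer(v_old * beta * (1.0 - h**2), X[i])
        return W, v

A hypothetical call would be W, v = train_node_fault_injection(X, y), with X of shape (N, d) and y of shape (N,).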

Objective Function Derivation and Convergence Analyses

  1. A survey of techniques for convergence analysis was conducted. Results such as Ljung's theorem, the Kushner-Clark lemma, Gladyshev's theorem, and the martingale convergence theorem were revisited.
  2. The objective function of the on-line node fault injection training algorithm for an MLP with a single linear output node was derived (a generic sketch of the setting is given after this list).
  3. The boundedness condition on the weight vector was then proved.
  4. By applying a convergence theorem due to H. White, I proved that the on-line node fault injection training algorithm for an MLP with a single linear output node converges with probability one.
  5. Computer simulations were conducted to validate the theoretical results. (This part was not included in the published paper.)
  6. An unpublished paper (Version 2), including the background survey and the analytical proof, was compiled and circulated to the collaborators Chi-sing Leung and Kevin Ho for discussion.
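
For orientation, the following LaTeX sketch shows the generic stochastic approximation view behind this kind of analysis; it is a simplified, assumed form, not the exact objective function or theorem statement in the paper. Here f(x, w, beta) denotes the faulty network output for input x, weight vector w, and random node fault pattern beta, and mu_t is the step size at step t.

    \[
      w_{t+1} = w_t + \mu_t \bigl( y_t - f(x_t, w_t, \beta_t) \bigr)
                \nabla_w f(x_t, w_t, \beta_t)
    \]
    % A Ljung/White-type argument relates this on-line update to the
    % expected objective (expectation over both data and fault patterns)
    \[
      V(w) = \frac{1}{2} \, \mathrm{E}_{(x,y)} \mathrm{E}_{\beta}
             \bigl[ ( y - f(x, w, \beta) )^2 \bigr]
    \]
    % under standard decreasing step-size conditions such as
    \[
      \sum_{t \ge 1} \mu_t = \infty, \qquad \sum_{t \ge 1} \mu_t^2 < \infty .
    \]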

Extended Works

  1. The objective function of the on-line node fault injection training algorithm for an MLP with multiple linear output nodes was derived.
  2. By applying a convergence theorem due to H. White, I proved that the on-line node fault injection training algorithm for an MLP with multiple linear output nodes converges with probability one.
  3. The objective function of the on-line node fault injection training algorithm for an MLP with a single sigmoid output node was derived.
  4. By applying the same convergence theorem, I proved that the on-line node fault injection training algorithm for an MLP with a single sigmoid output node converges with probability one (a sketch of how the generic form above adapts to these cases follows this list).
  5. An unpublished paper (Version 3), including the background survey and the complete convergence analyses, was compiled and circulated to the collaborators Chi-sing Leung and Kevin Ho for discussion.
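
Again as an assumed illustration only (generic forms, not the paper's exact statements): with K linear output nodes the squared error in the sketch above is summed over the outputs, and with a single sigmoid output node the linear output is passed through a squashing function sigma. Here h(x, w) is the hidden-layer output vector, beta the fault mask, and \circ the elementwise product.

    \[
      V(w) = \frac{1}{2} \, \mathrm{E}_{(x,y)} \mathrm{E}_{\beta}
             \Bigl[ \sum_{k=1}^{K} \bigl( y_k - f_k(x, w, \beta) \bigr)^2 \Bigr],
      \qquad
      f(x, w, \beta) = \sigma \bigl( v^{\top} ( \beta \circ h(x, w) ) \bigr),
      \quad \sigma(s) = \frac{1}{1 + e^{-s}} .
    \]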

Drafting the Final Manuscript

  1. The manuscript (Version 4) was drafted for submission to the IEEE Transactions on Neural Networks.

Revision and Rewriting

  1. After submission, the paper went through two major revisions and two minor revisions (Reject -> Reject & Resubmit -> Reject & Resubmit -> Minor Revision -> Minor Revision -> Accept). I went through all the reviewers' comments and revised the paper accordingly.
  2. After each revision, I prepared the response letters elucidating our responses to the reviewers' comments.
  3. The revised manuscripts and response letters were sent to the collaborators Chi-sing Leung and Kevin Ho for comments.

II. Collaborators' Contributions

While I completed almost all of the work, the collaborators Chi-sing Leung and Kevin Ho provided much valuable advice and many comments, particularly during the initial discussions, the problem formulation, the revision of the mathematical proofs, and the planning of future work along this direction. Their contributions were invaluable.

Background Survey and Problem Formulation

  1. CSL and KH repeated the simulations of the node fault injection training algorithm and reconfirmed that it improves fault tolerance and generalization. The convergence behavior of the algorithm was also reconfirmed.
  2. CSL and KH reviewed Manuscript Version 1 and commented on the wording describing our contributions and the incompleteness of existing work in the literature.

Objective Function Derivation and Convergence Analyses

  1. CSL and KH reviewed all the proofs and suggested techniques to simplify them. (Some of the proof steps in the final version were shortened as a result of their comments.)
  2. CSL and KH reviewed Manuscript Version 2, and we discussed whether its contribution was significant. We concluded that the work should be extended to MLPs with multiple linear output nodes and MLPs with a single sigmoid output node.

Extended Works

  1. While I carried out all the derivations, CSL and KH reviewed the derivation steps and repeated the simulations to reconfirm the findings.
  2. CSL and KH reviewed Manuscript Version 3. After discussion, we decided to submit a paper to the IEEE Transactions on Neural Networks (TNN).

Drafting the Final Manuscript

  1. CSL and KH reviewed Manuscript Version 4 before its submission to IEEE TNN.

Revision and Rewriting

  1. CSL and KH reviewed all the revised manuscripts and the response letters.