23rd December, 2019
Syed Ali Hasnain is currently completing his PhD in computer engineering at Texas A&M University, USA. He holds a Bachelor’s in electrical engineering from the National University of Sciences and Technology (NUST) in Pakistan. His areas of interest include photonic computing, computer architecture, machine learning and reservoir computing. He has gained industry experience working in Intel’s iCDG and CSG groups, and is a student member of IEEE.
Mr. Hasnain came to Habib to introduce his research in machine learning, specifically the rise of Reservoir Computing for Recurrent Neural Networks (RNNs). In machine learning, unlike conventional programming, both the inputs and the desired outputs are supplied by the programmer, and the result is the function that maps one to the other. In a neural network, this “black box” is made up of layers through which the input is passed to produce an output, and the mapping is encoded in weights attached to the network’s connections. Since the correct output is known and what is sought is the function that produces it, the weights are repeatedly adjusted until the network yields that output; this process is known as the training phase. In an RNN, the output at each step is fed back into the network as part of the next input, so training must account for these recurrent connections as well. This makes training very time consuming, and it poses further challenges for hardware implementation.
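As a rough illustration (not taken from the talk), the idea of repeatedly adjusting weights until a network reproduces known outputs can be sketched as a tiny gradient-descent loop; the data, learning rate and iteration count here are all illustrative choices:

```python
import numpy as np

# Toy "training phase": learn weights w so that x @ w matches known outputs y.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))        # given inputs
true_w = np.array([1.5, -2.0, 0.5])  # the hidden "correct" function
y = x @ true_w                       # given (known) outputs

w = np.zeros(3)                      # untrained weights
lr = 0.1                             # illustrative learning rate
for _ in range(500):                 # repeatedly adjust the weights...
    pred = x @ w                     # ...pass the input through the network
    grad = x.T @ (pred - y) / len(x) # ...measure the error signal
    w -= lr * grad                   # ...nudge weights toward the correct output

# After training, the learned weights recover the target function.
print(np.allclose(w, true_w, atol=1e-3))
```

The loop captures the trial-and-error character of training: every weight is touched on every iteration, which is why the process becomes expensive for large recurrent networks.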
As an alternative to the complications of a traditional RNN, Mr. Hasnain worked on a new design for Reservoir Computing, a subset of RNNs. Instead of training every weight through repetitive trial and error, Reservoir Computing trains only the last layer, while the rest of the weights are randomly initialised and left fixed. Although optimisation may be slightly affected, the results differ little, and Reservoir Computing is found to be much faster in its functioning. Difficulties have remained, however, in hardware implementation, and these are what the speaker aimed to resolve. Mr. Hasnain designed a network model that includes hardware accelerators for Reservoir Computing, a feat of design architecture and the first such model to employ multiple layers in its hardware. Its greatest feature is that it makes machine learning speedier and more efficient, minimising the hardware challenges faced by an RNN. Innovative solutions like Mr. Hasnain’s are imperative in the world of advancing science.
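A minimal echo state network, one common form of reservoir computing, shows the core idea described above: the input and recurrent weights stay random and fixed, and only the final readout layer is trained. This is a generic software sketch of the technique, not Mr. Hasnain’s hardware design; the reservoir size, spectral radius and sine-prediction task are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_res = 1, 100

# Random, fixed weights: never trained.
W_in = rng.normal(size=(n_res, n_in)) * 0.5
W_res = rng.normal(size=(n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # keep dynamics stable

# Toy task: predict the next value of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]

# Drive the reservoir with the input and record its states.
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for i, inp in enumerate(u):
    x = np.tanh(W_in @ inp + W_res @ x)
    states[i] = x

# Train ONLY the readout, with one ridge-regression solve
# (no repeated trial-and-error over the inner weights).
X, Y = states[:-1], u[1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ Y)

pred = X @ W_out
mse = float(np.mean((pred - Y) ** 2))
print(mse)  # very small for this toy task
```

Because training reduces to a single linear solve on the readout, the expensive inner loop of RNN training disappears, which is what makes the approach attractive for fast, hardware-friendly implementations.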