Greetings!

Welcome to my personal website, where I present my research, projects, and experience.

If you like the site and want to explore or use it yourself, view the source here on GitHub.


View My Portfolio

If you would like to get in touch, feel free to contact me.


View My Projects

Browse some of the projects I have worked on.


Recent Works

Deep Positron: A Deep Neural Network Using Posit Number System

In this work, we address the knowledge gap on how low-precision operations can be realized for deep neural network (DNN) inference. In doing so, we propose a DNN architecture, Deep Positron, with the posit numerical format, operating successfully at 8 bits and below for inference. We propose a precision-adaptable FPGA soft core for exact multiply-and-accumulate, enabling a uniform comparison across three numerical formats: fixed-point, floating-point, and posit. Preliminary results demonstrate that 8-bit posit has better accuracy than 8-bit fixed-point or floating-point for three different low-dimensional datasets. Moreover, the accuracy is comparable to 32-bit floating-point on a Xilinx Virtex-7 FPGA device. The trade-offs between DNN performance and hardware resources, i.e., latency, power, and resource utilization, show that posit outperforms the other formats in accuracy and latency at 8 bits and below. The paper was accepted to the 2019 IEEE Conference and Exhibition on Design, Automation and Test in Europe (DATE).
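
For readers unfamiliar with posits, here is a minimal sketch that decodes an 8-bit posit into a float, assuming the standard posit<8, es> layout (sign, regime, exponent, fraction). It only illustrates the number format itself; the function name and structure are my own, and the paper's FPGA multiply-and-accumulate core is not reproduced here.

```python
# A sketch of decoding an 8-bit posit (posit<8, es>) into a float.
# Layout: sign bit, regime (run of identical bits plus a terminator),
# up to `es` exponent bits, then fraction bits.

def decode_posit8(bits: int, es: int = 0) -> float:
    """Decode an 8-bit posit, given as an integer in 0..255."""
    n = 8
    if bits == 0:
        return 0.0
    if bits == 0x80:                      # 1000_0000 encodes NaR ("not a real")
        return float("nan")

    sign = bits >> (n - 1)
    if sign:                              # negative posits are two's complement
        bits = (-bits) & 0xFF

    # Regime: count the run of identical bits that follows the sign bit.
    rest = (bits << 1) & 0xFF             # drop the sign bit
    first = rest >> (n - 1)
    run = 0
    while run < n - 1 and (rest >> (n - 1 - run)) & 1 == first:
        run += 1
    k = run - 1 if first == 1 else -run   # regime value

    # Bits left after the sign, the regime run, and its terminating bit.
    remaining = max(0, n - (1 + run + 1))
    tail = bits & ((1 << remaining) - 1)

    # Exponent uses up to `es` of the remaining bits (missing bits are zero);
    # the fraction takes whatever is left.
    e_bits = min(es, remaining)
    exponent = (tail >> (remaining - e_bits)) << (es - e_bits) if e_bits else 0
    f_bits = remaining - e_bits
    fraction = tail & ((1 << f_bits) - 1)

    useed = 2 ** (2 ** es)
    value = (useed ** k) * (2 ** exponent) * (1 + fraction / (1 << f_bits))
    return -value if sign else value
```

For example, with es = 0, decode_posit8(0x40) yields 1.0 and decode_posit8(0x60) yields 2.0; the variable-length regime is what gives posits their tapered precision, densest around 1.0.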


Mod-DeepESN: Modular Deep Echo State Network

In this work, we explore the impact of topology and connectivity in deep echo state networks (ESNs). Xavier initialization and intrinsic plasticity are utilized in the model initialization and training phases, respectively, and a genetic algorithm is used to tune model hyperparameters. We propose the Mod-DeepESN architecture and show that it performs on par with or outperforms baseline ESN models on multi-scale time series tasks. The paper was accepted to the 2018 Cognitive Computational Neuroscience Conference.
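
As a rough illustration of the underlying building block, the sketch below runs a single leaky-integrator ESN reservoir with Xavier-style input initialization, as the abstract mentions. The Mod-DeepESN topology, intrinsic plasticity, and genetic-algorithm tuning from the paper are not reproduced, and all names and hyperparameters are illustrative.

```python
# A minimal single-reservoir leaky ESN in NumPy. Sizes, rates, and the
# input signal are illustrative, not the paper's settings.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100           # input and reservoir dimensions
leak = 0.3                     # leaky-integration rate
rho = 0.9                      # target spectral radius (echo state property)

# Xavier-style initialization of the input weights.
W_in = rng.normal(0.0, np.sqrt(2.0 / (n_in + n_res)), size=(n_res, n_in))

# Sparse random recurrent weights, rescaled to spectral radius `rho`.
W = rng.normal(size=(n_res, n_res)) * (rng.random((n_res, n_res)) < 0.1)
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))

def step(x, u):
    """One update: x(t+1) = (1 - a) * x(t) + a * tanh(W_in u(t) + W x(t))."""
    return (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)

# Drive the reservoir with a toy signal and collect states for a readout.
x = np.zeros(n_res)
states = []
for t in range(200):
    u = np.array([np.sin(0.2 * t)])
    x = step(x, u)
    states.append(x.copy())
```

A linear readout (e.g., ridge regression) trained on the collected states would complete the model; only the readout is trained, which is what makes ESNs inexpensive to fit compared with fully trained recurrent networks.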