When machines learn, what do they actually learn? How can we know for sure? Can there be a better way to learn?
For my dissertation research, I have been trying to answer questions like these. If you are interested, join me at the IUI Workshop on Explainable Smart Systems in L.A. this March, where I'm presenting learn2Sign: a feedback-driven technique for learning sign language.
Within Computer Science, my interests are in Artificial Intelligence, Computer Vision, Gesture Recognition, Sign Language Recognition, and Natural Language Processing. I have started learning some American Sign Language myself; now that is fun!
Outside of Computer Science, I like to stay informed on topics in Economics, Neuroscience, and Social Psychology, among others (Goodreads). I also like to think of myself as an avid hiker and an amateur photographer (Flickr).
Currently the projects I am actively involved in are:
- DyFAV: Dynamic Feature Selection and Voting for Real-time Recognition of Fingerspelled Alphabet using Wearables (pdf)
- GRUSI: Recognizing human actions and gestures using representative, human-understandable images
- MirrorGen: Using human movement models in Unity for accurate gesture recognition using sensors
Some of these are still in the pipeline for publication, so more details to come! But let me know if you are curious.
Here's an article in MIT Tech Review from back when I started with Sign Language Recognition!
If you have a project idea, or would like to collaborate on my existing work, send me an email.
ppaudyal at asu dot edu
- SCEPTRE: A Pervasive, Non-Invasive, and Programmable Gesture Recognition Technology. Prajwal Paudyal, Ayan Banerjee, and Sandeep K. S. Gupta. Proceedings of the 21st International Conference on Intelligent User Interfaces (IUI), ACM, March 2016. (pdf)
- DyFAV: Dynamic Feature Selection and Voting for Real-Time Recognition of Fingerspelled Alphabet Using Wearables. Prajwal Paudyal, Junghyo Lee, Ayan Banerjee, and Sandeep K. S. Gupta. Proceedings of the 22nd International Conference on Intelligent User Interfaces (IUI), 2017.
- FIT-EVE&ADAM: Estimation of Velocity & Energy for Automated Diet Activity Monitoring. Junghyo Lee, Prajwal Paudyal, Ayan Banerjee, and Sandeep K. S. Gupta. 16th IEEE International Conference on Machine Learning and Applications (ICMLA), 2017.
- MT-Diet: Automated Diet Assessment Using Myo and Thermal. Junghyo Lee, Ayan Banerjee, Prajwal Paudyal, and Sandeep K. S. Gupta. Late-Breaking Research Abstract, Conference on Wireless Health.
- IDEA: Instant Detection of Eating Action Using Wrist-Worn Sensors in Absence of User-Specific Model. Junghyo Lee, Prajwal Paudyal, Ayan Banerjee, and Sandeep K. S. Gupta. Proceedings of the 26th Conference on User Modeling, Adaptation and Personalization (UMAP).
- MT-Diet: Automated Smartphone-Based Diet Assessment with Infrared Images. Junghyo Lee, Ayan Banerjee, and Sandeep K. S. Gupta. 2016 IEEE International Conference on Pervasive Computing and Communications (PerCom).
This list may not be up to date, so here is a link to my Google Scholar page.
- GPSA Research Grant, Summer 2019
- CIDSE Graduate Fellowship (2016, 2017, 2018, 2019)
- Best Student Paper, IUI 2017
- Jumpstart Research Grant, 2017
- Outstanding Research Award, GPSA 2016
- Caukin's Communication Award, 2016
- Travel awards from ACM, SIGCHI, SIGAI, and CIDSE (ASU)