
Aspiration Statement
“I aspire to advance automation and intelligent systems through academic research, exploring their parallels with the human brain. My expertise in robotics, AI, and sensor fusion aligns with this vision.”
Core Skills
- C++
- MATLAB
- Python
- ROS
Academic Awards / Achievements
- Dean's List, Spring 2022 – Fall 2024
- President's List, 2023
- High Achievement Scholarship, Spring 2022, Fall 2022
Experience
Leadership / Meta-curricular
- Club President, Natural Science Club
- Member, Academic Affairs Cabinet
Internship / Volunteer Work
- Part-time, StingRay (March 2025 – May 2025)
- Research Intern (STRP), Habib University (June 2023)
- Intern, Systems Ltd (June 2022 – July 2022)
Publications / Creative Projects
- Enhanced Camouflaged Object Detection for Agricultural Pest Management: Insights from Unified Benchmark Dataset Analysis (IEEE Xplore). Used YOLOv8 for camouflaged agricultural pest detection and segmentation, achieving strong performance on large-scale datasets and contributing to open-source agricultural technology. Implemented CNN and transformer architectures, including YOLO, RCNN, MirrorNet, ANet, DETR, and ResNet, to improve detection accuracy, and compiled existing COD datasets to strengthen training.
- Integrating Ensemble Learning into Remote Health Monitoring for Accurate Prediction of Oral and Maxillofacial Diseases (IEEE Xplore). Developed a deep learning approach for diagnosing oral diseases, including mouth ulcers, hypodontia, and dental caries, from over 6,000 annotated RGB images. Achieved up to 97% accuracy by combining VGG16, MobileNet, and InceptionV3 in an ensemble, outperforming traditional X-ray-based methods.
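A minimal sketch of the ensemble idea behind the second publication, assuming Keras backbones, 224x224 RGB inputs, and simple softmax averaging; the class count, fine-tuning strategy, and preprocessing are illustrative assumptions, not the exact published pipeline.

```python
# Sketch of soft-voting ensemble over VGG16, MobileNet, and InceptionV3.
# Assumptions: 224x224 RGB inputs, 3 classes, frozen ImageNet backbones;
# per-backbone preprocess_input is omitted for brevity.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # e.g. mouth ulcer, hypodontia, dental caries

def build_branch(backbone_fn, name):
    """Wrap a pretrained backbone with a small classification head."""
    base = backbone_fn(include_top=False, weights="imagenet",
                       input_shape=(224, 224, 3), pooling="avg")
    base.trainable = False  # freezing the backbone is an assumption here
    inputs = layers.Input((224, 224, 3))
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(base(inputs))
    return models.Model(inputs, outputs, name=name)

branches = [
    build_branch(tf.keras.applications.VGG16, "vgg16"),
    build_branch(tf.keras.applications.MobileNet, "mobilenet"),
    build_branch(tf.keras.applications.InceptionV3, "inception_v3"),
]

def ensemble_predict(images):
    """Average the softmax outputs of all branches (soft voting) and pick the argmax."""
    probs = np.mean([m.predict(images, verbose=0) for m in branches], axis=0)
    return probs.argmax(axis=1)
```

Soft voting like this tends to smooth out individual-model errors, which is the usual motivation for combining heterogeneous backbones rather than relying on any single one.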
Final Year Project
Project Title
Automation of Toyota Tow Truck: Vision
Description
I am developing the perception module of the project. I implemented obstacle detection with YOLOv5 Nano under varying lighting conditions and integrated it with ROS2 for automatic braking and speed control. The module was tested successfully in Gazebo and on real-world hardware; I resolved deployment challenges on the Jetson Nano and reduced latency for real-time performance. I also enhanced localization with camera data and worked on path-delineation detection from road markings to improve navigation and braking control for safe autonomous operation. I am currently integrating the perception module with the rest of the system through the Nav Stack. Overall, the module improves obstacle detection, braking, localization, and path following, and is optimized for real-time operation on the Jetson Nano.
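A minimal sketch of how the perception-to-braking link could look, assuming a ROS 2 (rclpy) node, a /camera/image_raw topic, a /cmd_vel output, and YOLOv5 Nano loaded via torch.hub; the topic names, thresholds, cruise speed, and braking rule are illustrative assumptions, not the project's actual interfaces.

```python
# Illustrative ROS 2 node: YOLOv5 Nano detections gate the commanded speed.
# Topic names, thresholds, and the braking rule are assumptions for this sketch.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist
from cv_bridge import CvBridge
import torch

class ObstacleBrakeNode(Node):
    def __init__(self):
        super().__init__("obstacle_brake_node")
        self.bridge = CvBridge()
        # YOLOv5 Nano via torch.hub; on a Jetson this would typically be replaced
        # by a TensorRT-optimized engine to cut latency.
        self.model = torch.hub.load("ultralytics/yolov5", "yolov5n", pretrained=True)
        self.sub = self.create_subscription(Image, "/camera/image_raw", self.on_image, 10)
        self.pub = self.create_publisher(Twist, "/cmd_vel", 10)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        detections = self.model(frame[:, :, ::-1]).xyxy[0]  # BGR -> RGB; rows: [x1, y1, x2, y2, conf, cls]
        h = frame.shape[0]
        cmd = Twist()
        cmd.linear.x = 0.5  # nominal cruise speed (illustrative)
        for *box, conf, cls in detections.tolist():
            # Brake if a confident detection reaches the lower part of the image,
            # i.e. the obstacle is close to the vehicle.
            if conf > 0.5 and box[3] > 0.7 * h:
                cmd.linear.x = 0.0
                break
        self.pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(ObstacleBrakeNode())
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```

In a full system the simple image-height heuristic would be replaced by proper distance estimation and handed to the Nav Stack's controllers, but the subscribe-detect-publish structure above is the core of the perception-to-braking loop.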