Welcome to My Research Space!
My research journey is rooted in curiosity, innovation, and a constant drive to explore the potential of emerging technologies. Throughout my academic and professional career, I have focused on building intelligent systems, advancing AI-driven solutions, and contributing to impactful research in the fields of Machine Learning, Deep Learning, and Data Science.
During my time at Iowa State University, I worked on various research projects and publications that reflect my passion for solving complex problems and pushing the boundaries of technology. This space highlights some of my key research contributions, collaborative projects, and the skills I’ve developed along the way.
Research
Graduate Research Thesis – Iowa State University
Program: M.S. in Computer Engineering (Computing and Networks)
Committee: Dr. Ravikumar Gelli (Advisor), Dr. Manimaran Govindarasu, Dr. Julie Dickerson
Thesis Title: Scenario-Based Auto Data Generation Framework Using Hardware-in-the-Loop Testbed for Deep Learning-Based Application
Summary:
This thesis introduces the architecture and deployment of a scenario-driven autonomous data generation framework purpose-built for deep learning (DL) applications within Distributed Energy Resource (DER)-integrated distribution networks. With the accelerating integration of DERs, modern power systems have become increasingly complex, facing challenges such as stochastic power injection, dynamic load conditions, and evolving cyber-physical vulnerabilities. To overcome the inherent limitations of conventional dataset acquisition, the proposed framework automates the synthesis of high-resolution, feature-rich datasets spanning a diverse set of operating conditions, including transient line faults, steady-state anomalies, and cyberattack-induced perturbations, all emulated in real time on a Hardware-in-the-Loop (HIL) Cyber-Physical Systems (CPS) testbed powered by OPAL-RT.
The system exposes a Python-based configuration interface, enabling dynamic control over parameters such as voltage magnitude, current phase, fault injection types, attack vectors, and time-domain characteristics. An integrated API orchestrates the full data pipeline, from event triggering and waveform acquisition to structured dataset generation, ensuring reproducibility, traceability, and scalability for ML model development. The framework generated a corpus of labeled datasets tailored for anomaly detection tasks, validated using a Long Short-Term Memory (LSTM) neural network optimized for temporal pattern recognition. The model attained 99.01% classification accuracy, affirming the quality and utility of the synthetic datasets for critical grid intelligence applications.
This research not only addresses the data scarcity problem in smart grid ML workflows but also lays the groundwork for next-generation grid analytics, including real-time diagnostic systems, federated learning architectures, and edge AI deployment in operational substations.
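The thesis code itself is not reproduced here. As a minimal sketch of the orchestration idea, with all names hypothetical (Scenario, run_scenario, and generate_dataset are illustrative, not the actual API), a parameterized scenario description can drive automated generation of labeled records:

```python
import random
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Scenario:
    """Hypothetical description of one simulation run."""
    label: str                      # e.g. "normal", "line_fault", "fdi_attack"
    fault_bus: Optional[int] = None
    duration_s: float = 1.0
    params: dict = field(default_factory=dict)

def run_scenario(scenario: Scenario, n_samples: int = 100) -> list:
    """Stand-in for triggering the HIL testbed and logging waveforms.

    A real deployment would command the OPAL-RT simulation and stream
    back measurements; here we synthesize placeholder records.
    """
    rng = random.Random(42)
    return [
        {"t": i * scenario.duration_s / n_samples,
         "v_mag": 1.0 + rng.gauss(0, 0.01),   # per-unit voltage magnitude
         "label": scenario.label}             # label travels with the sample
        for i in range(n_samples)
    ]

def generate_dataset(scenarios: list) -> list:
    """Run every scenario and concatenate the labeled records."""
    records = []
    for sc in scenarios:
        records.extend(run_scenario(sc))
    return records

dataset = generate_dataset([
    Scenario(label="normal"),
    Scenario(label="line_fault", fault_bus=632, params={"type": "SLG"}),
])
```

Because every record carries the label of the scenario that produced it, the output is directly usable as supervised training data, which is the property the framework exploits.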
Tech Stack
- Languages & Libraries: Python, NumPy, Pandas, Scikit-learn, TensorFlow/Keras
- Simulation & Systems: OPAL-RT, RT-LAB, MATLAB
- Data Engineering: Custom Feature Engineering Scripts, Real-Time Signal Logging, Data Normalization Pipelines
- Modeling: LSTM (Time Series), Hyperparameter Tuning, Performance Evaluation (Precision, Recall, F1)
- Automation: Python Scripting, API-driven Test Configuration, Dataset Orchestration Framework
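As one concrete illustration of the data-engineering entries above (a sketch, not the thesis code): min-max normalization followed by sliding-window segmentation, the usual shaping step before a time series is fed to an LSTM:

```python
import numpy as np

def normalize(signal: np.ndarray) -> np.ndarray:
    """Min-max scale a 1-D signal into [0, 1]."""
    lo, hi = signal.min(), signal.max()
    return (signal - lo) / (hi - lo)

def to_windows(signal: np.ndarray, window: int, stride: int = 1) -> np.ndarray:
    """Segment a 1-D signal into overlapping windows shaped (n, window, 1),
    the (samples, timesteps, features) layout LSTM layers expect."""
    n = (len(signal) - window) // stride + 1
    idx = np.arange(window)[None, :] + stride * np.arange(n)[:, None]
    return signal[idx][..., None]

sig = normalize(np.sin(np.linspace(0, 10, 200)))
X = to_windows(sig, window=50, stride=10)
print(X.shape)  # (16, 50, 1)
```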
What the Framework Offers
- Plug-and-play simulation control: Automate grid events like faults or attacks using parameterized Python scripts.
- Custom dataset generation: Create large-scale, annotated datasets tailored for ML/DL workflows.
- Realistic test scenarios: Use Hardware-in-the-Loop for high-fidelity signal simulation.
- Rapid prototyping: Evaluate ML models quickly using freshly generated, labeled datasets.
- High-performance validation: Achieved 99.01% anomaly detection accuracy using LSTM on generated data.
- Reproducible & extensible: Easily integrate into other AI pipelines or smart grid applications.
Publications
HIL Testbed-based Auto Feature Extraction and Data Generation Framework for ML/DL-based Anomaly Detection and Classification
Authors: Aditya Akilesh Mantha, Arif Hussain, Ravikumar Gelli
Summary:
This paper presents AFEDG—an Auto Feature Extraction and Data Generation framework—designed to overcome data scarcity and imbalance in ML/DL-based anomaly detection within smart grids. The framework features a Python-based API that automates event simulation, feature extraction, and dataset generation by integrating virtual sensors and modulating real-time parameters within a power system model.
Deployed on a Hardware-in-the-Loop (HIL) Cyber-Physical Testbed using OPAL-RT, RT-LAB, and MATLAB, the system enables scalable, unbiased, and high-fidelity dataset generation for fault and cyberattack classification tasks. The framework was validated on a modified SSN-based distribution bus model and supports seamless integration across multi-bus systems.
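The paper defines its own feature set; the sketch below only illustrates the kind of per-window waveform statistics (RMS, peak, crest factor) an auto feature-extraction stage might compute, with `extract_features` being a hypothetical name:

```python
import numpy as np

def extract_features(window: np.ndarray) -> dict:
    """Compute a small feature vector from one measurement window.

    These features (RMS, peak, crest factor) are generic waveform
    statistics chosen for illustration; AFEDG's actual feature set is
    defined in the paper.
    """
    rms = float(np.sqrt(np.mean(window ** 2)))
    peak = float(np.max(np.abs(window)))
    return {
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms if rms > 0 else 0.0,
    }

# One cycle of a clean 60 Hz sinusoid sampled at 3 kHz.
t = np.arange(0, 1 / 60, 1 / 3000)
features = extract_features(np.sin(2 * np.pi * 60 * t))
```

For a clean sinusoid the crest factor is close to sqrt(2); deviations from that value are exactly the sort of cue an anomaly classifier can pick up.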
Conference: IEEE Power & Energy Society Innovative Smart Grid Technologies (ISGT), 2024
Focus Areas: HIL simulation, Smart Grid AI, Auto data generation, Anomaly classification, Python API integration
Tech Stack
- Languages & APIs: Python, MATLAB
- Simulation Tools: OPAL-RT, RT-LAB, Simulink
- Automation & Control: Custom Python API for signal control, feature extraction, and event orchestration
- Power System Modeling: SSN-based distribution grid (multi-bus architecture)
- Data Engineering: Real-time data acquisition, virtual sensor integration, feature vector generation
- Machine Learning Tools: Scikit-learn, Pandas (for post-processing and preparation for ML/DL models)
- Cyber-Physical System (CPS) Integration: HIL testbed with dynamic scenario injection for smart grid applications
Bayesian Optimization for Deep Reinforcement Learning for Robust Volt-Var Control
Authors: Kundan Kumar, Aditya Akilesh Mantha, Ravikumar Gelli
Summary:
This paper introduces a Bayesian Optimization-enhanced Deep Reinforcement Learning (BO-DRL) framework for robust, adaptive Volt-Var Control (VVC) in active distribution networks with high Renewable Energy Source (RES) penetration. The volatility and intermittency introduced by RES complicate voltage regulation and reactive power coordination, demanding intelligent, data-driven control strategies.
To address these challenges, we integrate Bayesian Optimization (BO) into an actor-critic DRL architecture, enabling automated hyperparameter tuning and accelerating policy convergence. The BO module employs a Gaussian Process surrogate model with acquisition functions to iteratively refine DRL parameters, improving sample efficiency and control-policy robustness.
The proposed BO-DRL system was validated on the IEEE 13-bus and IEEE 123-bus test feeders using a power-flow co-simulation environment. Experimental results showed that the optimized DRL agent achieved up to an 81.81% performance gain over baseline models in voltage profile adherence and reactive power minimization. The model consistently maintained voltage magnitudes within ANSI limits across diverse operating conditions, demonstrating superior generalization and control stability.
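The paper's BO loop tunes DRL hyperparameters during training; the one-dimensional sketch below shows the same mechanism (Gaussian Process surrogate plus Expected Improvement) on a toy objective standing in for "loss after training a DRL agent with learning rate lr". The objective, search range, and iteration counts are illustrative only:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(lr: float) -> float:
    """Toy stand-in for a DRL training loss; minimum at lr = 0.3."""
    return (lr - 0.3) ** 2

def expected_improvement(x, gp, y_best, xi=0.01):
    """EI acquisition: expected reduction below the best observed value."""
    mu, sigma = gp.predict(x.reshape(-1, 1), return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu - xi) / sigma
    return (y_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(4, 1))     # initial random trials
y = np.array([objective(x[0]) for x in X])
grid = np.linspace(0.0, 1.0, 501)          # candidate learning rates

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                              alpha=1e-6, normalize_y=True)
for _ in range(10):                        # BO iterations
    gp.fit(X, y)                           # refit surrogate to all trials
    ei = expected_improvement(grid, gp, y.min())
    x_next = grid[np.argmax(ei)]           # most promising candidate
    X = np.vstack([X, [[x_next]]])
    y = np.append(y, objective(x_next))

best_lr = X[np.argmin(y), 0]
```

Each iteration spends one (expensive) objective evaluation where the surrogate predicts the most improvement, which is what gives BO its sample efficiency relative to grid or random search.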
Conference: 2024 IEEE Power & Energy Society General Meeting (PESGM)
Focus Areas: Deep Reinforcement Learning, Bayesian Optimization, Volt-Var Control, Smart Grid Automation, Hyperparameter Search, Distributed Energy Systems
Tech Stack
- ML Frameworks: PyTorch, TensorFlow (Custom Actor-Critic DRL Implementation)
- Optimization Algorithms: Bayesian Optimization with Gaussian Processes & UCB/Expected Improvement
- Grid Simulation Tools: OpenDSS, GridLAB-D (for power flow modeling and control simulation)
- Distribution System Models: IEEE-13 and IEEE-123 Node Test Feeders
- Control Objectives: Voltage Deviation Minimization, Reactive Power Dispatch, Fast Policy Convergence
- Evaluation Metrics: Voltage Profile Deviation (ΔV), Cumulative Reward, Convergence Iterations, Generalization Performance
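As a small illustration of the first metric (a sketch; the paper's exact formulation may differ), per-bus deviation from nominal voltage and an ANSI C84.1 Range A check (0.95–1.05 p.u.) reduce to:

```python
import numpy as np

# Per-bus voltage magnitudes in per-unit (hypothetical snapshot).
v = np.array([1.02, 0.97, 1.04, 0.96, 1.06])

delta_v = np.abs(v - 1.0)                 # deviation from nominal, ~0.06 p.u. worst case
within_ansi = (v >= 0.95) & (v <= 1.05)   # ANSI C84.1 Range A band

print(within_ansi.all())  # False: the 1.06 p.u. bus violates the upper limit
```

In a DRL reward, the deviation vector is typically aggregated (e.g., summed or penalized when outside the band) so the agent is pushed toward keeping every bus inside the ANSI limits.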