
Projects

Automation and coding have always been my passions, and I naturally gravitate toward tasks in these fields. Over time, I have continually learned new and more efficient ways to handle complex scenarios. As my work gained recognition, what started as a passion evolved into a career path and became my primary job role. Below is a selection of my work, including projects completed during my master's program and others undertaken during my professional career.

Master's Projects

Conditional Forecasting of Bitcoin Prices Using Exogenous Variables

Research Project

Bitcoin's high volatility makes price prediction challenging. In my study, I aimed to forecast Bitcoin prices one month ahead by incorporating exogenous variables, specifically interest rates and recession probabilities. My primary objective was to examine whether these variables positively impact Bitcoin price predictions. I utilized two popular time series forecasting models: Long Short-Term Memory (LSTM) and Facebook Prophet. My approach involved assessing the influence of these exogenous variables on the models' performance and comparing the outcomes through plots and cross-validation. I trained the models using historical Bitcoin price data along with the additional variables and evaluated their performance on a test dataset. The results indicated that LSTM surpassed Facebook Prophet in predicting Bitcoin prices, likely due to its ability to learn complex patterns. Importantly, I found that including interest rates and recession probabilities significantly enhanced the models' predictive capabilities, suggesting that these variables influence Bitcoin prices and underscoring their value in improving prediction accuracy.
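The data preparation step can be sketched as follows; the function name and the three-feature layout are illustrative, not the study's actual code:

```python
import numpy as np

def make_windows(price, rates, recession_prob, lookback):
    """Stack the price series with the two exogenous series and cut
    sliding windows suitable for an LSTM.

    Returns X of shape (samples, lookback, 3) and y of shape (samples,),
    where y is the price one step after each window.
    """
    features = np.column_stack([price, rates, recession_prob])
    X, y = [], []
    for i in range(len(price) - lookback):
        X.append(features[i : i + lookback])
        y.append(price[i + lookback])
    return np.array(X), np.array(y)
```

Each window then carries the exogenous context alongside the price history, so the model can learn how the extra variables relate to the next price.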

Handwritten Character Recognition using Deep Learning Concepts

Individual Class Project

This project focuses on developing a model capable of recognizing handwritten English letters and digits, utilizing the EMNIST dataset. Unlike widely used image classification applications, this study emphasizes creating a specialized model from scratch, targeting a specific problem set. Pre-trained architectures, though versatile, can be computationally intensive for certain tasks, making them impractical for CPU-only training. Building a custom model offers advantages in terms of efficiency and cost, potentially eliminating the need for GPUs by reducing parameter count. However, designing such a model is challenging, as there is no set formula for approximating the necessary hyperparameters from the problem set alone. A structured approach to model development is crucial to address issues like poor accuracy, overfitting, and high variance in the learning curves. Ultimately, I compare my custom model's performance, efficiency, and parameter count with popular architectures like ResNet50V2 and MobileNetV2. My model achieves comparable accuracy with roughly 50% fewer parameters and faster processing.
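When sizing a custom model by hand, the standard layer parameter-count formulas are the main budgeting tool. A small helper of my own (not project code) shows the arithmetic:

```python
def conv2d_params(kernel_h, kernel_w, in_channels, out_channels, bias=True):
    """Trainable parameters in a standard 2-D convolution layer:
    each filter has kernel_h * kernel_w * in_channels weights plus a bias."""
    per_filter = kernel_h * kernel_w * in_channels + (1 if bias else 0)
    return per_filter * out_channels

def dense_params(in_features, out_features, bias=True):
    """Trainable parameters in a fully connected layer."""
    return in_features * out_features + (out_features if bias else 0)
```

For example, a 3x3 convolution from 1 input channel to 32 filters costs only 320 parameters, while dense layers at the end of a network typically dominate the budget, which is where most of the reduction against the pre-trained architectures comes from.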

Electro-cardiogram Diagnosis Classifier for Healthcare Applications

Group Class Project

The primary objective of our project was to develop a web portal that serves as a vital tool in the realm of telehealth. It allows doctors to upload Electrocardiogram (ECG) scans, which are essential for measuring the heart's electrical activity, for analysis. The core functionality of the portal is an AI model that I personally developed. This model classifies the ECG scans as normal or abnormal, thereby facilitating quicker diagnosis and timely recommendations for follow-up treatments.

In this group project, my specific role was to architect and fine-tune the machine learning model that is the backbone of our system. I meticulously handled the data analysis phase, ensuring the model was trained effectively to recognize the nuances in ECG scans. After rigorous training and optimization, I saved the model with its trained weights. This crucial step ensures that the model operates efficiently during runtime, mirroring real-world production environments.
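The save-once, load-at-runtime pattern can be sketched framework-agnostically; the helper names and the NumPy archive format here are my own illustration, not the project's actual serialization code:

```python
import numpy as np

def save_weights(path, weights):
    """Persist the trained weight arrays once, after training finishes."""
    np.savez(path, **{f"w{i}": w for i, w in enumerate(weights)})

def load_weights(path):
    """Load the weights at startup, so inference never repeats training."""
    with np.load(path) as data:
        return [data[k] for k in sorted(data.files, key=lambda k: int(k[1:]))]
```

At runtime the web portal only ever calls the load path, which is what keeps classification fast enough for a production-like environment.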

Professional Projects

Auto-Messaging API

Design and Code

In my project, the overarching goal was to minimize the Mean Time to Restore (MTTR), a crucial factor in customer satisfaction. We identified that certain critical faults leading to outages required a swifter notification process for the responsible engineers than traditional human-mediated communication could provide.

To address this, I developed an SMPP (Short Message Peer-to-Peer) protocol-based solution that automatically sends SMS alerts to the designated engineer upon detecting system abnormalities. Understanding the need for versatility in communication, I further integrated a Telegram bot. This addition ensured that notifications could also be dispatched through the internet, providing an alternative channel for reaching engineers.
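The dual-channel dispatch logic can be sketched as follows; the transport callables are stand-ins for the real SMPP and Telegram senders, whose APIs I am not reproducing here:

```python
def notify(engineer, message, transports):
    """Try each notification channel in order and return the name of the
    first one that succeeds.

    `transports` is an ordered list of (name, send_fn) pairs, where send_fn
    returns True on success. Raises if every channel fails.
    """
    for name, send in transports:
        try:
            if send(engineer, message):
                return name
        except Exception:
            continue  # fall through to the next channel
    raise RuntimeError("all notification channels failed")
```

In use, the list would look like `[("smpp", send_sms), ("telegram", send_telegram)]` (both names hypothetical), so an SMS failure automatically falls back to the internet channel.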

Furthermore, I created an API that facilitates the sending of these notifications. To streamline the management of this entire system, I designed and implemented a front-end CRUD (Create, Read, Update, Delete) application. This application serves as a centralized platform for managing both the notifications system and the underlying back-end database. My contribution thus lies in developing a comprehensive, automated notification system that accelerates the response time to critical faults, thereby significantly reducing MTTR and enhancing overall customer satisfaction.

User Experience Automation

Design and Code

The project I undertook was driven by the need to address customer dissatisfaction stemming from internet latency, an issue inadequately represented by traditional ping-based latency checks. To provide a more accurate reflection of real user experience, I developed a system that simulates human-like browsing and activity patterns. This system periodically performs a range of activities on the internet, closely mimicking typical user behavior, and measures the latency for each specific use case.
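The measurement loop can be sketched as follows; the labels and activity callables are illustrative:

```python
import time

def measure_use_cases(use_cases):
    """Run each simulated user activity and record its wall-clock latency.

    `use_cases` maps a label (e.g. "load_homepage") to a zero-argument
    callable that performs the activity; returns label -> seconds.
    """
    results = {}
    for label, action in use_cases.items():
        start = time.perf_counter()
        action()
        results[label] = time.perf_counter() - start
    return results
```

Because the timer wraps the whole activity rather than a single packet round trip, the number reflects what a user actually waits for, which is exactly what a ping cannot capture.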

To ensure comprehensive coverage, these tests are conducted across various virtual machines located in different areas where internet quality assessment is crucial. All the data collected from these tests are then sent back to a central server for analysis and aggregation.
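A minimal sketch of the central aggregation step (the structure of the per-VM reports is my assumption):

```python
from statistics import median, quantiles

def aggregate(reports):
    """Combine per-VM latency reports into per-use-case summary statistics.

    `reports` maps vm_name -> {use_case: [latencies_in_seconds]}.
    Returns use_case -> {"median": ..., "p95": ...} pooled across all VMs.
    """
    pooled = {}
    for vm_samples in reports.values():
        for use_case, samples in vm_samples.items():
            pooled.setdefault(use_case, []).extend(samples)
    return {
        uc: {"median": median(s), "p95": quantiles(s, n=20)[-1]}
        for uc, s in pooled.items()
    }
```

Reporting a high percentile alongside the median matters here: latency complaints usually come from the tail, not the typical case.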

A significant aspect of my contribution was the development of a dashboard that presents a graphical comparison of internet performance across these machines. This visualization not only made it easier to understand and interpret the data but also brought the actual user experience directly to our attention. Consequently, this led to the identification of specific issues and enabled targeted rectifications, thereby significantly improving customer satisfaction regarding internet latency. My system effectively bridged the gap between technical metrics and real user experience, providing invaluable insights for enhancing internet service quality.

JSON Decoder for a Large, Streaming File

Code

To address the challenge our peripheral team faced with parsing a massive JSON file (~5 GB+), continuously updated with logs from thousands of base stations, I devised an efficient PHP-based solution. By employing a pointer method, my code tracks only the position where it last stopped decoding, rather than loading the entire file into memory. This strategy significantly minimizes server memory usage, preventing lag and memory bottlenecks. The code also selectively parses JSON data as required, seamlessly updating the team's database. This development reduced the processing time from over an hour to just a few seconds, effectively handling the substantial file size and streamlining the parsing process.
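The original implementation was in PHP; here is the same pointer technique as a Python sketch, assuming newline-terminated JSON records are appended to the log (the function name is mine):

```python
import json

def read_new_records(path, offset):
    """Parse only the records appended since the last pass.

    Instead of loading the whole multi-gigabyte file, seek straight to
    `offset` (the 'pointer' persisted after the previous pass), decode the
    new lines, and return them with the updated offset to save for next time.
    """
    records = []
    with open(path, "rb") as f:
        f.seek(offset)
        while True:
            raw = f.readline()
            if not raw.endswith(b"\n"):
                break  # EOF, or a record still being written
            line = raw.strip()
            if line:
                records.append(json.loads(line))
            offset = f.tell()  # advance the pointer past complete records only
    return records, offset
```

Each run touches only the bytes written since the previous run, which is why the pass takes seconds regardless of how large the file has grown.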


Convergent Tool and XML Decoder

Design and Code

To resolve a critical network service malfunction that was causing outages and revenue loss, I initiated a project to create an innovative solution. Recognizing that all other network services communicated with the affected device in XML format, I set up a server mirroring the communication matrix of the malfunctioning device. My approach was to activate this alternate route whenever the primary service was down. This server would then receive data in XML format from the other network devices.

My code played a pivotal role in this system. It decoded the incoming XML data, applied necessary mathematical calculations as per our business logic, and generated new XMLs for the remaining network elements in the network. These XMLs were then dispatched accordingly. This strategy not only maintained operational continuity but also effectively eliminated the revenue loss previously experienced due to this specific outage. The project was a testament to the power of creative problem-solving in critical network infrastructure management.
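The decode-calculate-re-emit step can be sketched with Python's standard `xml.etree` module; the element names and the rate calculation are illustrative, not the actual business logic:

```python
import xml.etree.ElementTree as ET

def transform(xml_in, rate):
    """Decode an incoming usage record, apply a business calculation,
    and emit a new XML document for the downstream network element."""
    record = ET.fromstring(xml_in)
    units = float(record.findtext("units"))
    out = ET.Element("record")
    ET.SubElement(out, "subscriber").text = record.findtext("subscriber")
    ET.SubElement(out, "charge").text = f"{units * rate:.2f}"
    return ET.tostring(out, encoding="unicode")
```

Because every surrounding service already spoke XML, a transformation of this shape was all the mirror server needed to keep the rest of the network unaware that the primary device was down.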


Business Configuration Sync

Design and Code

Faced with the challenge of configuring short codes for businesses across more than 75 network elements nationwide, a task involving around 525 commands and requiring a week's effort from eight personnel, I spearheaded an automation project. This initiative featured a user-friendly dashboard displaying the end-to-end workflow and historical logs.

The process began with users uploading an Excel sheet containing the necessary configurations. I utilized Python to decode the Excel data, and then implemented PHP socket programming for the backend workflows. This automation significantly streamlined the previously labor-intensive process, enhancing efficiency and accuracy in network configuration management.
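The expansion from decoded spreadsheet rows to per-element command batches can be sketched as follows; the command syntax is illustrative (real commands are vendor-specific), and the Excel decoding itself would come from a spreadsheet library such as openpyxl:

```python
def build_commands(rows, elements):
    """Expand parsed spreadsheet rows into a command batch per network element.

    `rows` is the decoded sheet as a list of dicts; the same short-code
    configuration is fanned out to every element nationwide.
    """
    batch = {}
    for element in elements:
        batch[element] = [
            f"ADD SHORTCODE: CODE={row['code']}, DEST={row['destination']};"
            for row in rows
        ]
    return batch
```

Generating the batches this way is what turned roughly 525 hand-typed commands across 75+ elements into a single upload, with the PHP socket layer pushing each batch to its element.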
