How a high school student built an AI to save energy
Large AI models consume massive amounts of energy. A single ChatGPT query is estimated to use roughly 10 times as much energy as a Google search, and many prompts contain unnecessary words that increase computational load without adding value. As AI usage grows, so does its environmental impact.
Green Prompts Optimizer uses a fine-tuned T5 transformer model to reduce prompt length while preserving meaning. By removing filler words and tightening sentence structure, it reduces token count by 30-50% on average, which translates directly into energy savings.
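The core idea can be illustrated with a simplified, rule-based sketch. This is not the actual T5 model, just a crude stand-in: the filler-word list and whitespace tokenization below are assumptions for illustration only, while the real system learns its edits from training data.

```python
# Hypothetical filler words; the real T5 model learns what to remove from data.
FILLERS = {"please", "kindly", "basically", "actually", "really", "very", "just", "quite"}

def naive_optimize(prompt: str) -> str:
    """Remove filler words from a prompt (crude stand-in for the trained model)."""
    return " ".join(
        word for word in prompt.split()
        if word.lower().strip(",.!?") not in FILLERS
    )

def token_reduction(original: str, optimized: str) -> float:
    """Percent reduction in whitespace-token count after optimization."""
    before = len(original.split())
    after = len(optimized.split())
    return 100 * (before - after) / before

prompt = "Could you please just quickly summarize this really very long article"
short = naive_optimize(prompt)
print(short)
print(f"{token_reduction(prompt, short):.0f}% fewer tokens")
```

Fewer tokens in the prompt means fewer tokens the model has to process, which is where the energy savings come from.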
Data Collection — Created a dataset of 127 prompt pairs across different domains: coding, writing, and analysis.
Model Training — Fine-tuned Google's T5-small model using PyTorch and Hugging Face Transformers.
Energy Measurement — Integrated Zeus library to measure actual energy consumption during optimization.
Web Deployment — Built a static frontend hosted on GitHub Pages that calls the Hugging Face Inference API directly.
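The deployment step above can be sketched as the request the frontend would send to the Hugging Face Inference API. The model ID below is a placeholder assumption (the real fine-tuned model would live under the author's Hugging Face account), and an actual call requires a valid API token.

```python
import json

# Hypothetical model ID; substitute the actual fine-tuned model's repo name.
MODEL_ID = "your-username/green-prompts-t5-small"
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"

def build_request(prompt: str, token: str) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and JSON body for a Hugging Face Inference API call."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # The Inference API expects a JSON body with an "inputs" field.
    body = json.dumps({"inputs": prompt}).encode("utf-8")
    return API_URL, headers, body

url, headers, body = build_request("please summarize this very long text", "hf_xxx")
print(url)
print(json.loads(body))
```

Because the frontend is a static site, there is no backend server of its own: the browser sends this request straight to the Inference API and renders the optimized prompt from the response.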
Hi, I'm Srinesh Toranala, a high school student interested in AI and environmental sustainability. This project was developed as part of my ISM (Independent Study Mentorship) Original Work for 2025-2026. I wanted to combine my interests in machine learning and climate action to create something that makes a real difference.
Through this project, I learned about transformer architectures, energy-efficient computing, and full-stack development. Most importantly, I discovered how technology can be a force for positive environmental change.
Optimize prompts directly in your browser. Works on ChatGPT, Claude, and other AI platforms.
To install the extension manually, open chrome://extensions/, enable Developer mode, and use "Load unpacked" to select the extension folder.
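A minimal Manifest V3 sketch of what such an extension's manifest might look like; the file name, version, and matched sites below are assumptions, not the project's actual configuration:

```json
{
  "manifest_version": 3,
  "name": "Green Prompts Optimizer",
  "version": "1.0",
  "description": "Shortens AI prompts to reduce energy use.",
  "content_scripts": [
    {
      "matches": ["https://chatgpt.com/*", "https://claude.ai/*"],
      "js": ["content.js"]
    }
  ]
}
```

The content script listed here would run on the matched AI chat pages and rewrite prompts before they are submitted.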