How a high school student built an AI to save energy
Large AI models consume massive amounts of energy; a single ChatGPT query is estimated to use roughly ten times the energy of a Google search. Many prompts contain unnecessary words that increase computational load without adding value, and as AI usage grows rapidly, so does its environmental impact.
Green Prompts Optimizer uses a custom fine-tuned T5 transformer model to shorten prompts while preserving their meaning. By removing filler words and tightening sentence structure, it reduces token count by 30-50% on average, which translates directly into energy savings.
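The actual optimizer is the fine-tuned T5 model described above; as a stdlib-only illustration of the underlying idea, here is a minimal rule-based sketch of filler-word removal, with word count standing in for token count. All function names and the filler list are illustrative, not the project's code.

```python
# Minimal sketch of the filler-removal idea (the real project uses a
# fine-tuned T5 model; this rule-based version is only illustrative).
FILLER_WORDS = {
    "please", "kindly", "basically", "actually", "really",
    "very", "just", "perhaps", "maybe", "quite",
}

def strip_fillers(prompt: str) -> str:
    """Drop common filler words while keeping the rest of the prompt."""
    words = prompt.split()
    kept = [w for w in words if w.lower().strip(".,!?") not in FILLER_WORDS]
    return " ".join(kept)

def reduction_percent(original: str, optimized: str) -> float:
    """Approximate token savings, using whitespace words as a token proxy."""
    before, after = len(original.split()), len(optimized.split())
    return 100.0 * (before - after) / before if before else 0.0

prompt = "Could you please just quickly summarize this really long article?"
short = strip_fillers(prompt)
print(short)                                   # shortened prompt
print(f"{reduction_percent(prompt, short):.0f}% fewer words")
```

Fewer input tokens means fewer operations at inference time, which is where the energy saving comes from.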
- Created a custom dataset of 127 prompt pairs across different domains (coding, writing, analysis)
- Fine-tuned Google's T5-small model using PyTorch and Hugging Face Transformers
- Integrated the Zeus library to measure actual energy consumption during optimization
- Built a Gradio interface and deployed it to Hugging Face Spaces with a GitHub Pages frontend
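The source does not specify how the 127 prompt pairs are stored; a plausible layout for a seq2seq fine-tuning dataset is JSONL, one verbose/concise pair per line. The field names ("domain", "verbose", "concise") and file name below are assumptions, sketched with a stdlib-only loader.

```python
import json
from pathlib import Path

# Assumed layout for the prompt-pair dataset: one JSON object per line,
# pairing a verbose prompt with its concise rewrite plus a domain tag.
# Field names are illustrative, not taken from the project.
SAMPLE = """\
{"domain": "coding", "verbose": "Could you please write me some Python code that sorts a list?", "concise": "Write Python code to sort a list."}
{"domain": "writing", "verbose": "I would really like you to draft a short formal email.", "concise": "Draft a short formal email."}
"""

def load_pairs(path: Path) -> list[dict]:
    """Read prompt pairs from a JSONL file, skipping blank lines."""
    with path.open() as f:
        return [json.loads(line) for line in f if line.strip()]

path = Path("prompt_pairs.jsonl")
path.write_text(SAMPLE)
pairs = load_pairs(path)
print(len(pairs), "pairs loaded; first domain:", pairs[0]["domain"])
```

A file in this shape maps cleanly onto Hugging Face's seq2seq training utilities, with the verbose text as model input and the concise text as the target.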
Hi! I'm Srinesh Toranala, a high school student passionate about AI and environmental sustainability. This project was developed as part of my ISM (Independent Study Mentorship) Original Work for 2025-2026. I wanted to combine my interests in machine learning and climate action to create something that makes a real difference.
Through this project, I learned about transformer architectures, energy-efficient computing, and full-stack development. Most importantly, I discovered how technology can be a force for positive environmental change.