📖 About the Project

How a high school student built an AI to save energy

🎯 The Problem

Large AI models consume massive amounts of energy; by some estimates, a single ChatGPT query uses roughly ten times the energy of a Google search. Many prompts contain unnecessary words that increase computational load without adding value, and as AI usage keeps growing, so does its environmental impact.

💡 The Solution

Green Prompts Optimizer uses a custom fine-tuned T5 transformer model to shorten prompts while preserving their meaning. By stripping filler words and tightening sentence structure, it cuts token count by 30-50% on average; since the computation a model performs scales with the number of tokens it processes, fewer tokens translate directly into energy savings.
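
As a rough illustration, here is a minimal sketch of how a fine-tuned T5 model can be called to rewrite a prompt. The checkpoint directory and the "optimize:" task prefix are assumptions for the sketch, not the project's actual names.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Path to the fine-tuned checkpoint (placeholder; the real location
# depends on where training saved the model).
MODEL_DIR = "./green-prompts-t5"

tokenizer = T5Tokenizer.from_pretrained(MODEL_DIR)
model = T5ForConditionalGeneration.from_pretrained(MODEL_DIR)

def optimize_prompt(prompt: str) -> str:
    # T5 is text-to-text: the verbose prompt goes in, a shorter
    # equivalent comes out. The "optimize:" prefix is hypothetical and
    # must match whatever task prefix was used during fine-tuning.
    inputs = tokenizer("optimize: " + prompt, return_tensors="pt",
                       truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

verbose = ("I was wondering if you could possibly help me by writing "
           "a short Python function that reverses a string, please?")
print(optimize_prompt(verbose))
```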

🔬 How It Works

1. Data Collection

Created a custom dataset of 127 prompt pairs across different domains (coding, writing, analysis)
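
The dataset's exact format isn't documented here; one plausible layout is a JSON Lines file of verbose/optimized pairs, sketched below with invented examples.

```python
import json

# Hypothetical record layout: each line pairs a verbose prompt with a
# hand-written shorter equivalent, tagged by domain.
pairs = [
    {"domain": "coding",
     "verbose": "Could you please write me some code that sorts a list of numbers?",
     "optimized": "Write code that sorts a list of numbers."},
    {"domain": "writing",
     "verbose": "I would really appreciate it if you could summarize this article for me.",
     "optimized": "Summarize this article."},
]

with open("prompt_pairs.jsonl", "w") as f:
    for pair in pairs:
        f.write(json.dumps(pair) + "\n")
```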

2. Model Training

Fine-tuned Google's T5-small model using PyTorch and Hugging Face Transformers
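
A condensed fine-tuning sketch using Hugging Face's Seq2SeqTrainer is shown below. The hyperparameters and the prompt_pairs.jsonl file (from the previous sketch) are illustrative assumptions, not the project's actual training setup.

```python
from datasets import load_dataset
from transformers import (DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments,
                          T5ForConditionalGeneration, T5Tokenizer)

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Load the prompt pairs (field names follow the dataset sketch above).
dataset = load_dataset("json", data_files="prompt_pairs.jsonl")["train"]

def preprocess(example):
    # Inputs are the verbose prompts; labels are the optimized versions.
    model_inputs = tokenizer("optimize: " + example["verbose"],
                             truncation=True, max_length=128)
    labels = tokenizer(text_target=example["optimized"],
                       truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, remove_columns=dataset.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="./green-prompts-t5",
    num_train_epochs=10,               # a small dataset tolerates more epochs
    per_device_train_batch_size=8,
    learning_rate=3e-4,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
trainer.save_model("./green-prompts-t5")
```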

3. Energy Measurement

Integrated the Zeus library to measure actual energy consumption during optimization
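
Zeus exposes a window-based monitor API; the sketch below shows the general pattern, assuming an NVIDIA GPU and the optimize_prompt function from the earlier sketch.

```python
from zeus.monitor import ZeusMonitor

# ZeusMonitor reads GPU energy counters through NVML; gpu_indices picks
# which GPUs to track (here, the first one).
monitor = ZeusMonitor(gpu_indices=[0])

monitor.begin_window("optimize")
result = optimize_prompt(verbose)          # from the earlier sketch
measurement = monitor.end_window("optimize")

print(f"Optimized in {measurement.time:.2f} s "
      f"using {measurement.total_energy:.2f} J")
```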

4. Web Deployment

Built a Gradio interface, deployed it to Hugging Face Spaces, and connected it to a GitHub Pages frontend
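
A minimal Gradio app of the following shape is enough to run on a Hugging Face Space (a single app file plus a requirements.txt is typically all a Space needs). The layout below is a sketch, not the project's exact interface.

```python
import gradio as gr

# Wrap the optimizer in a simple two-textbox interface. On a Hugging
# Face Space this file would typically be named app.py.
demo = gr.Interface(
    fn=optimize_prompt,                    # from the earlier sketch
    inputs=gr.Textbox(lines=4, label="Original prompt"),
    outputs=gr.Textbox(label="Optimized prompt"),
    title="Green Prompts Optimizer",
)

if __name__ == "__main__":
    demo.launch()
```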

🛠️ Technology Stack

🤖 T5 Transformer
🔥 PyTorch
🤗 Hugging Face
⚡ Gradio
📊 Zeus Energy
🌐 GitHub Pages

👨‍💻 About the Developer

Hi! I'm Srinesh Toranala, a high school student passionate about AI and environmental sustainability. This project was developed as part of my ISM (Independent Study Mentorship) Original Work for 2025-2026. I wanted to combine my interests in machine learning and climate action to create something that makes a real difference.

Through this project, I learned about transformer architectures, energy-efficient computing, and full-stack development. Most importantly, I discovered how technology can be a force for positive environmental change.

🌍 Join the Movement

Every optimized prompt helps reduce AI's carbon footprint

Try the Optimizer →