Green AI 🌱

A curated overview of resources for reducing the environmental footprint of AI development and usage.

Contributions and pull requests are welcome!

Tools

Tools for measuring and quantifying footprint

Tools for calculation/estimation of footprint

The following tools estimate the footprint based on information about the chosen algorithms, configuration, and hardware.
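Most of these calculators share the same basic accounting: energy is the product of average power draw, runtime, and data-centre overhead (PUE), and emissions follow from multiplying that energy by the carbon intensity of the local electricity grid. A minimal sketch of this calculation (the PUE and grid-intensity defaults below are illustrative placeholders, not values taken from any of the tools listed here):

```python
# Minimal operational-footprint estimate, as used by most calculators:
#   energy (kWh)      = average power draw (kW) x runtime (h) x PUE
#   emissions (gCO2e) = energy (kWh) x grid carbon intensity (gCO2e/kWh)
def estimate_co2e_grams(runtime_hours: float,
                        avg_power_watts: float,
                        pue: float = 1.5,
                        grid_intensity_g_per_kwh: float = 475.0) -> float:
    # The PUE and grid-intensity defaults are illustrative only; real tools
    # look these values up per data centre and per electricity region.
    energy_kwh = (avg_power_watts / 1000.0) * runtime_hours * pue
    return energy_kwh * grid_intensity_g_per_kwh

# Example: 36 h of training on a GPU drawing ~300 W on average.
print(f"{estimate_co2e_grams(36, 300) / 1000:.2f} kg CO2e")
```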

Tools for AI/ML development with integrated carbon footprint reporting

  • d2m [Website] [Source code] – a machine learning pipeline with automatic monitoring and tracking of the carbon footprint of model development (see the sketch below)
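As an illustration of what integrated reporting looks like in practice, the sketch below wraps a training loop with the carbontracker package (Anthony et al. 2020, listed under Papers). It is only an example of this style of tooling and assumes carbontracker's documented epoch-based API; d2m and other pipelines may expose a different interface.

```python
from carbontracker.tracker import CarbonTracker

max_epochs = 10
# Monitors energy use during the first epochs and predicts the total footprint.
tracker = CarbonTracker(epochs=max_epochs)

for epoch in range(max_epochs):
    tracker.epoch_start()
    # ... one epoch of model training goes here ...
    tracker.epoch_end()

# Ensure the consumption report is written even on early termination.
tracker.stop()
```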

Papers

Particularly important papers are highlighted.

  • Energy and Policy Considerations for Deep Learning in NLP (Strubell et al. 2019) [Paper]
  • Quantifying the Carbon Emissions of Machine Learning (Lacoste et al. 2019) [Paper]
  • Green AI (Schwartz et al. 2020) [Paper] [Notes]
  • Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models (Anthony et al. 2020) [Paper]
  • Carbon Emissions and Large Neural Network Training (Patterson et al. 2021) [Paper]
  • Chasing Carbon: The Elusive Environmental Footprint of Computing (Gupta et al. 2020) [Paper]
  • Green Algorithms: Quantifying the Carbon Footprint of Computation (Lannelongue et al. 2021) [Paper]
  • A Practical Guide to Quantifying Carbon Emissions for Machine Learning researchers and practitioners (Ligozat et al. 2021) [Paper]
  • A framework for energy and carbon footprint analysis of distributed and federated edge learning (Savazzi et al. 2021) [Paper] [Notes]
  • Aligning artificial intelligence with climate change mitigation (Kaack et al. 2021) [Paper]
  • New universal sustainability metrics to assess edge intelligence (Lenherr et al. 2021) [Paper]
  • Unraveling the Hidden Environmental Impacts of AI Solutions for Environment: Life Cycle Assessment of AI Solutions (Ligozat et al. 2022) [Paper]
  • Measuring the Carbon Intensity of AI in Cloud Instances (Dodge et al. 2022) [Paper]
  • Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model (Luccioni et al. 2022) [Paper]
  • Bridging Fairness and Environmental Sustainability in Natural Language Processing (Hessenthaler et al. 2022) [Paper]
  • Eco2AI: carbon emissions tracking of machine learning models as the first step towards sustainable AI (Budennyy et al. 2022) [Paper]
  • Environmental assessment of projects involving AI methods (Lefèvre et al. 2022) [Paper]
  • Sustainable AI: Environmental Implications, Challenges and Opportunities (Wu et al. 2022) [Paper]
  • A first look into the carbon footprint of federated learning (Qiu et al. 2022) [Paper] [Notes]
  • The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink (Patterson et al. 2022) [Paper]
  • Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning (Henderson et al. 2022) [Paper]
  • Towards Sustainable Artificial Intelligence: An Overview of Environmental Protection Uses and Issues (Pachot et al. 2022) [Paper]
  • Measuring the Environmental Impacts of Artificial Intelligence Compute and Applications (OECD 2022) [Paper]
  • Method and evaluations of the effective gain of artificial intelligence models for reducing CO2 emissions (Delanoë et al. 2023) [Paper]
  • Making AI Less "Thirsty": Uncovering and Addressing the Secret Water Footprint of AI Models (Li et al. 2023) [Paper]
  • Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training (You et al. 2023) [Paper]
  • Chasing Low-Carbon Electricity for Practical and Sustainable DNN Training (Yang et al. 2023) [Paper]
  • LLMCarbon: Modeling the End-To-End Carbon Footprint of Large Language Models (Faiz et al. 2023) [Paper]
  • Power Hungry Processing: Watts Driving the Cost of AI Deployment? (Luccioni et al. 2023) [Paper]
  • A Synthesis of Green Architectural Tactics for ML-Enabled Systems (Järvenpää et al. 2023) [Paper]
  • Toward Sustainable HPC: Carbon Footprint Estimation and Environmental Implications of HPC Systems (Li et al. 2023) [Paper]
  • Exploring the Carbon Footprint of Hugging Face's ML Models: A Repository Mining Study (Castaño et al. 2023) [Paper]
  • Estimating the environmental impact of Generative-AI services using an LCA-based methodology (Berthelot et al. 2023) [Paper]
  • From Words to Watts: Benchmarking the Energy Costs of Large Language Model Inference (Samsi et al. 2023) [Paper]
  • Perseus: Reducing Energy Bloat in Large Model Training (Chung et al. 2024) [Paper]
  • Timeshifting strategies for carbon-efficient long-running large language model training (Jagannadharao et al. 2024) [Paper]
  • Engineering Carbon Emissions Aware Machine Learning Pipelines (Husom et al. 2024) [Paper]
  • Measuring and Improving the Energy Efficiency of Large Language Models Inference (Argerich et al. 2024) [Paper] [GitHub]
  • Green AI: Exploring Carbon Footprints, Mitigation Strategies, and Trade Offs in Large Language Model Training (Liu et al. 2024) [Paper]
  • A simplified machine learning product carbon footprint evaluation tool (Lang et al.) [Paper]
  • Beyond Efficiency: Scaling AI Sustainably (Wu et al. 2024) [Paper]
  • Towards Efficient Generative Large Language Model Serving: A Survey from Algorithms to Systems (Miao et al. 2024) [Paper]
  • Towards Greener LLMs: Bringing Energy-Efficiency to the Forefront of LLM Inference (Stojkovic et al. 2024) [Paper]
  • The Price of Prompting: Profiling Energy Use in Large Language Models Inference (Husom et al. 2024) [Paper]
  • Hybrid Heterogeneous Clusters Can Lower the Energy Consumption of LLM Inference Workloads (Wilkins et al. 2024) [Paper]
  • Offline Energy-Optimal LLM Serving: Workload-Based Energy Models for LLM Inference on Heterogeneous Systems (Wilkins et al. 2024) [Paper]
  • AI, Climate, and Regulation: From Data Centers to the AI Act (Erbert et al. 2024) [Paper]
  • LLMCO2: Advancing Accurate Carbon Footprint Prediction for LLM Inferences (Fu et al. 2024) [Paper]
  • Addition is all you need for energy-efficient language models (Luo et al. 2024) [Paper]
  • Artificial Intelligence in Climate Change Mitigation: A Review of Predictive Modeling and Data-Driven Solutions for Reducing Greenhouse Gas Emissions (Adegbite et al. 2024) [Paper]

Survey papers

  • Evaluating the carbon footprint of NLP methods: a survey and analysis of existing tools (Bannour et al. 2021) [Paper]
  • A Survey on Green Deep Learning (Xu et al. 2021) [Paper] [Notes]
  • A Systematic Review of Green AI (Verdecchia et al. 2023) [Paper]
  • Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning (Luccioni et al. 2023) [Paper]

Leaderboards

Organizations, projects and foundations

  • Green Software Foundation – non-profit foundation promoting software development with sustainability as a core priority [Website]
  • ENFIELD: European Lighthouse to Manifest Trustworthy and Green AI – project for creating a European Centre of Excellence with Green AI as one of the pillars [Website]

Other resources