While the transformative capabilities of AI have captured global attention, the sustainability of its use has been slower to draw the same scrutiny.
Training and running AI models demands intense computational power, and the resulting energy consumption is immense. One study published by Cornell University estimated that training a single large model such as GPT-3 emitted up to 500 metric tons of CO2, comparable to a coal-fired power plant running for ten hours.