Gaël Duez invites two distinguished professors on stage live from the Green IO Conference Singapore 2025 to discuss the environmental footprint of AI and data centers in this cosmopolitan city-state. Professor PS Lee, from the Energy Studies Institute and the Department of Mechanical Engineering, National University of Singapore, whose current research focuses on data center liquid cooling, is joined by Professor Heng Wang, Associate Dean and Professor of Law at Singapore Management University, who works at the intersection of sustainability and the governance of digitalization.
AI & data centre development in Singapore
Singapore is a complex city-state which relies heavily on imports from around the world due to its small geographical size, dense population and limited in-situ resources (listen to Episode 36 for more info). Given such restrictions, Singapore is acutely aware of resource consumption and waste processing limitations. Its strategic position in South East Asia, coupled with high-speed connectivity and world-class telecoms, has led to a very mature data center market, and growth in the sector is closely scrutinized. A moratorium on new data centers was imposed (2019–2022), and although it has since been lifted, development remains tightly regulated as the AI boom continues.
Greening data centers
Although not new, liquid cooling is now very much associated with the demand for AI, and the need to reduce carbon footprints. The switch from conventional air cooling to liquid cooling can result in significant energy savings due to precision targeting and better heat transfer. But in order to take advantage of the various technologies to manage the carbon footprint of the data center industry, PS Lee stresses how we need to equip practitioners and the industry as a whole with both the necessary know-how and the appropriate tools to design and operate the liquid cooling infrastructure.
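As a rough illustration of why cooling efficiency matters at facility scale, the sketch below compares annual facility energy under two hypothetical PUE (Power Usage Effectiveness) values. The PUE figures, the IT load and the `facility_energy_kwh` helper are assumptions made for the example, not numbers from the episode.

```python
# Illustrative sketch (hypothetical numbers): how a PUE improvement from
# switching air cooling to liquid cooling translates into facility-level
# energy savings. Total facility energy = IT load x PUE x hours.

def facility_energy_kwh(it_load_kw: float, pue: float, hours: float) -> float:
    """Annual facility energy for a given IT load and PUE."""
    return it_load_kw * pue * hours

IT_LOAD_KW = 1_000       # assumed 1 MW of IT load
HOURS_PER_YEAR = 8_760
PUE_AIR = 1.6            # hypothetical air-cooled facility
PUE_LIQUID = 1.2         # hypothetical liquid-cooled facility

air = facility_energy_kwh(IT_LOAD_KW, PUE_AIR, HOURS_PER_YEAR)
liquid = facility_energy_kwh(IT_LOAD_KW, PUE_LIQUID, HOURS_PER_YEAR)
savings_pct = 100 * (air - liquid) / air

print(f"Air-cooled:    {air:,.0f} kWh/year")     # 14,016,000 kWh/year
print(f"Liquid-cooled: {liquid:,.0f} kWh/year")  # 10,512,000 kWh/year
print(f"Savings:       {savings_pct:.0f}%")      # 25%
```

Even a modest PUE improvement compounds into millions of kilowatt-hours per year at megawatt scale, which is why the cooling conversation dominates data center sustainability discussions.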
Existing vs. future data centers
As new data centers come online, recent technology can be designed in from the start, though the question remains how to retrofit existing data centers in a cost-effective fashion.
Air cooling can still be useful in low power density operations
One size does not fit all, so low-cost, low-power and low-resource solutions need to be employed wherever possible. The idea is one of ‘fit for purpose’, or as Prof. PS Lee says, a ‘proper engineering’ approach, where actual power, rather than potential power consumption, should dictate the solution – e.g. inferencing can rely on air-based cooling, whereas liquid cooling could target high power usage in AI training.
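The ‘fit for purpose’ idea above can be sketched as a simple decision rule driven by measured rack power rather than nameplate capacity. The thresholds and labels below are illustrative assumptions for the example, not industry standards or figures from the episode.

```python
# Hypothetical 'fit for purpose' cooling selector: measured rack power
# (not potential/nameplate power) drives the choice of cooling technology.
# Thresholds are made-up assumptions for illustration only.

def choose_cooling(measured_rack_kw: float) -> str:
    """Pick a cooling approach for a rack based on its measured power draw."""
    if measured_rack_kw < 15:
        # storage, networking and other low-density equipment
        return "perimeter air cooling (CRAC/CRAH)"
    if measured_rack_kw < 40:
        # e.g. AI inferencing racks: denser, but still air-coolable
        return "row-based air cooling (in-row coolers, rear-door heat exchangers)"
    # e.g. AI training racks with high-power accelerators
    return "direct-to-chip liquid cooling"

print(choose_cooling(8))   # perimeter air cooling (CRAC/CRAH)
print(choose_cooling(30))  # row-based air cooling (in-row coolers, rear-door heat exchangers)
print(choose_cooling(80))  # direct-to-chip liquid cooling
```

The point is not the exact numbers but the shape of the decision: liquid cooling is reserved for the workloads that actually need it, rather than applied wholesale.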
Room for improvement
According to PS Lee, the current lack of standards for AI infrastructure is problematic. Tech developments should be closely monitored to ensure that appropriate governance, regulations and standards keep pace. Heng Wang thinks that altering the discourse around AI resource use could also have far-reaching effects: for example, framing AI use in terms of drinking water supplies could lead to very different outcomes in terms of the technology used, or whether AI is actually deployed in the first place.
AI for good
Knowledge and foresight are crucial, as is the need for an inclusive network of stakeholders to ensure that solutions are found and applied across the industry. AI is clearly here to stay, so we need to channel its use toward positive projects such as efficient allocation of energy and resources, orchestration of computing, power and cooling, or even apply it across the entire data center life cycle, from design and operation to end of life. The very essence of digital sobriety.
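The orchestration idea above, shifting deferrable compute toward the hours when low-carbon electricity is available, can be sketched as follows. The hourly carbon-intensity values and the `pick_greenest_hours` function are illustrative assumptions for the example, not data or tooling from the episode.

```python
# Illustrative carbon-aware scheduling sketch: place a deferrable workload
# into the hours with the lowest grid carbon intensity. The intensity
# values (gCO2/kWh per hour) below are made-up assumptions.

def pick_greenest_hours(intensity_by_hour: list[float], hours_needed: int) -> list[int]:
    """Return the indices of the `hours_needed` lowest-intensity hours."""
    ranked = sorted(range(len(intensity_by_hour)),
                    key=lambda h: intensity_by_hour[h])
    return sorted(ranked[:hours_needed])

# Hypothetical 8-hour window of grid carbon intensity
intensity = [520, 480, 450, 400, 380, 390, 430, 500]

print(pick_greenest_hours(intensity, 3))  # → [3, 4, 5]
```

A real orchestrator would also weigh cooling capacity and power constraints alongside carbon intensity, which is exactly the kind of cross-silo coordination the guests argue AI could provide.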
Want to know more? Sign up for the newsletter here.
Prof PS Lee (00:01)
It's always a case of fit for purpose, or what we call a proper engineering approach. For high-power AI workloads, I think going with liquid cooling is almost becoming the standard solution. But then, not to forget, you also have storage, you have networking equipment. These are actually typically the lower power…
Gaël Duez (00:24)
Hello everyone, welcome to Green IO. I'm Gaël Duez, and in this podcast, we empower responsible technologists to build a greener digital world, one byte at a time. Twice a month on a Tuesday, our guests from across the globe share insights, tools and alternative approaches, enabling people within the tech sector and beyond to boost digital sustainability. And because accessible and transparent information is in the DNA of Green IO, all the references mentioned in this episode, as well as the full transcript, are in the show notes. You can find these notes on your favorite podcast platform and, of course, on the website greenio.tech. One last thing. This episode is a bit special because it was recorded live from Green IO Singapore two weeks ago. The sound quality isn't as good as you have been used to, but the quality of the guests is as good as ever. Enjoy the episode.
Gael Duez (01:33)
We’re live and we're doing this fireside chat session at Green IO Singapore, second edition, and we're trying something new, which is recording live, with all the technical hiccups that happened for the last 20 minutes. So my dear listeners, you didn't experience them, but the participants did. And I'm delighted to be joined today by Professor PS Lee and Professor Heng Wang to discuss the environmental footprint of AI, but from a very hands-on perspective, based on the city-state of Singapore, because both of them are experts: Professor PS Lee from an infrastructure perspective, Professor Heng Wang from a governance perspective. So how is this rise in energy and resource consumption caused by the AI boom concretely impacting the infrastructure of Singapore and the way Singapore governs its digitalization? Without further ado, I would love to give the floor to Professor PS Lee. Can you briefly introduce yourself?
Pr PS Lee (02:34)
First, thanks for having me. I'm Poh Seng, or PS, Lee, from both the Energy Studies Institute and Mechanical Engineering at the National University of Singapore. My personal research centers around data center liquid cooling; I've been working on this for the past three to five years. Liquid cooling is not new, but associated with the demand for AI, it has now become almost a requirement in order to unleash AI performance. But more importantly, the switch from conventional air cooling to liquid cooling can actually result in very significant energy savings as well as carbon footprint reduction. So I think that, to me, is the more important question: how do we take advantage of various technologies to manage the carbon footprint of the industry, so that we can allow growth in the most sustainable fashion?
Gael Duez (03:30)
Okay, thanks a lot. Professor Wang.
Prof Heng Wang (03:38)
Thank you for having me here. I'm from Singapore Management University, Yong Pung How School of Law. Before that, I was a professor at the University of New South Wales in Sydney. I work on the governance of digitalisation and sustainability. So the issue is how we align digitalisation with sustainability, and one of the core issues is how we navigate through the uncertainties, because we do not necessarily have all the knowledge about that: how do we use regulation, governance, different tools to align the two together? I also contribute to the World Economic Forum's AI Governance Alliance from the perspective of responsible use.
Gaël Duez (04:10)
Quite a lot to deal with. So without further ado, let's jump right into the main question. What are, according to both of you, the main issues, the main pitfalls, of this current AI boom in your fields of expertise?
Prof PS Lee (04:34)
I think on the infrastructure side, the industry has been operating air-based systems for the longest time, the past three, four decades. So while liquid cooling is not new, I think in general the industry does not have a lot of practitioners who are familiar with the design as well as the operations of liquid cooling infrastructure.

So I think there will be a period where quite a bit of learning as well as training is necessary, so that we equip industry practitioners with the necessary know-how as well as tools to design and operate the liquid cooling infrastructure. Then obviously one of the natural questions is: what about the CAPEX implication? With the switch from air cooling to liquid cooling, the demand, while it's picking up, is certainly not at the same volume as air-based systems are now. That's why there is still a perceived delta in terms of CAPEX. Then I think even more challenging is existing infrastructure: how do we allow a green retrofit of existing data centers in a cost-effective fashion? So I think these are some of the issues.

The other is standards. I think currently there is a lack of standards. While there have been various guidelines, for example from ASHRAE or from OCP, I think there isn't a very well-established standard when it comes to the new AI infrastructure. So I think there really needs to be close tracking of both the technology development as well as the governance, including the standards.
Pr. Heng Wang (06:20)
I agree with what you have said: standards are one issue. From a governance perspective, or from a knowledge perspective, I think we have a number of major issues. First, about knowledge. AI is fast developing, so we have an issue of imperfect knowledge about AI; it's a kind of beta version. And secondly, we do not have complete knowledge, because, as you mentioned, there is cooling, energy transition, e-waste, and different contexts, finance and other sectors, so you have silos. Knowledge is not really distributed as much as we want. There is also an issue of unpredictability: the social response to AI. If you think AI will compete with your drinking water, the feedback to that will be different. So that's the knowledge side: imperfect knowledge, incomplete knowledge and unpredictability.

From a governance perspective, we have mismatches. One, if you look horizontally, is between short-term and long-term considerations: short-term, you want to roll out fast, and you have bottom-up decisions and use cases; long-term, you have sustainability. And if you look at the governance levels, you have domestic ones, sub-national ones, regional ones, international ones, which has been pretty tricky nowadays.

And if you look at the issue of siloed solutions mentioned earlier, you also have to break that. So that comes back to the questions: how do we build the knowledge, and how do we engage with stakeholders at different levels to forge the standards and to evolve over time?
Gael Duez (07:58)
Two questions to dive a bit into what you said. First of all, Professor Lee, it seems to me that you made the assumption that liquid cooling is the de facto solution to solve, maybe partially, the energy crisis caused by AI. But is it the only one? I mean, if you take a step back and position yourself as an almighty father of Singapore's infrastructure, would you say, okay, we need to go all in on liquid cooling and retrofit everything, or are there different ways, different approaches, that we should follow?
Pr PS Lee (08:38)
It's always a case of fit for purpose, or what we call a proper engineering approach. For high-power AI workloads, I think going with liquid cooling is almost becoming the standard solution. But then, not to forget, you also have storage, you have networking equipment. These are actually typically lower power, so you don't really need to go to liquid cooling.

But even in AI, if you look at, for example, training versus inferencing, especially the latter, inferencing probably doesn't require super high-power chips. So in that regard, you can probably go with some of the air-based solutions, but maybe moving from the conventional, what we call perimeter cooling, involving CRAC or CRAH units, to a row-based solution like in-row coolers or rear-door heat exchangers, which allow you to cool at a higher airflow rate for such AI inferencing hardware. So I think it's always a case of fit for purpose.
Gael Duez (09:41)
And just for the listeners who are not super familiar with the wording: air-based cooling. So, for a city like Singapore, did you manage to estimate the share of the current infrastructure that should be retrofitted toward liquid cooling, and the share that should stay as it is today?
Pr PS Lee (10:04)
I think Singapore is actually quite unique, because we had a moratorium about three years ago. Back then, the data center industry was consuming 7% of electricity, and obviously 7% is huge. The government has its net zero commitment, and obviously it's also concerned about the energy demand, so that's why a moratorium was put in place, which was finally lifted.

Thereafter, we actually had a data center call for applications. But if you look at the licenses that were awarded, they are just averaging about 20 megawatts. So for a mature data center market like Singapore, the reality is that you have a bigger stock of existing data centers, or you can call them legacy or brownfield data centers. So I think the solutions we should be looking at, whether liquid cooling or more efficient air-based solutions, will be quite different compared to, for example, Malaysia, in particular Johor Bahru, because there are a lot of new data centers coming online there. For them, I suppose, right from day one they can start by designing or incorporating liquid cooling into the infrastructure. But for a mature market like Singapore, I think we'll likely have to do it in phases, without undue interruptions to ongoing operations, as well as without an undue delta in terms of CAPEX. But I feel that for mature data center markets like Singapore, green retrofits of data centers will in fact be a lot more needle-moving in terms of bringing down the carbon footprint, compared to emerging markets like Malaysia.
Gael Duez (11:53)
That's interesting, the connection with the governance issue, when you mentioned the moratorium that happened three years ago. What is today, in your view, Professor Wang, the balance? You mentioned short term versus long term. How do you believe Singapore should position itself in this fast rhythm of AI adoption, and even of investment in new infrastructure and retrofitting infrastructure?
Pr. Heng Wang (12:21)
That is a great question. It involves understanding the costs and benefits. It's a trade-off, and a very difficult one. So I think it's very important to understand the risks and to build the knowledge; it's only in this way that you can make an informed decision. So it will be very important for governance arrangements to understand what's happening and what's likely to happen, and you need foresight.

The AI landscape now moves much faster. If you think of a highway, in the past you had 100 or 200 kilometers; now it's 1,000 or 20,000, depending on the speed of the development. So I think the first important step is that we have to build a network of stakeholders to understand the risks, and also the short-term and long-term considerations.

More specifically, you probably need to take more concrete measures, carrot and stick, to move towards that. For example, for understanding risks, whether you want to encourage disclosure: disclosure of CO2 emissions, water consumption, e-waste and other things. Or efficiency labeling, as we have done before; you utilize and adapt existing tools for that, or an index, or verification. There could also be an industry use case for verification: it's not only about cost, you can make money from it, you can be an ethical leader in that regard. Or you want to think of monitoring, forecasting, and also early warning, for example.

On the technology side, which I looked at earlier, there is also the question of whether it's useful to train AI models with those kinds of principles, so that even the answers to prompts bear those issues in mind. And there are things you can do now, like tendering and offering requirements: if you have a project and you reward energy efficiency, you have a race to the top instead of a race to the bottom. There are other things you can do; you can think about whether to regulate AI models and others. So I think a coherent approach, engaging with different stakeholders and using different tools, will be very important to make that happen.
Gael Duez (14:50)
And if I can wrap up what you said, it's all about gathering the different stakeholders and using a mix of carrots and sticks. Could you maybe provide one or two examples of each in Singapore? Have the Singapore authorities, maybe the government or governmental bodies, started to use the stick? I mean, they used a pretty big stick when they forbade any new data center facilities three years ago, but it has been lifted now. So what sort of sticks do you see from the Singapore government, and what sort of carrots?
Pr. Heng Wang (15:27)
As you mentioned, the government may take measures on the development of new data centers; that's one example. But Singapore also has a green data center roadmap. That's also a way they are trying to promote water and energy efficiency, because if you have energy consumption over a certain threshold, then you are expected to have that kind of monitoring of energy or water usage.

I think there are probably more things to be done, because this is a very new area. That's the reason why, as mentioned earlier, you can think of adapting existing tools for that, instead of starting everything from scratch. So when you talk about carrot and stick, energy labeling, as mentioned earlier, will be one way you can give a reward, because consumers will know what it means. Or for large language models, you could have requirements naming what their energy consumption is, so consumers may choose which one is actually more environmentally friendly. The point I'm trying to make is that we have to think creatively and adapt. But not everything will work that way. One example is environmental impact assessment; this is a tool which is difficult to apply here.

So I just want to say that you have to adapt existing tools, but also think about developing new ones.
Gael Duez (16:51)
I think the Singapore green data center roadmap was mentioned earlier today by the IMDA representative, Dr. Lawrence Wee. But it has been in existence prior to the AI boom. Is that correct?
Pr PS Lee (17:09)
There was an earlier version; I think they called it the Green Data Centre Research Roadmap. Even back then, I think they mentioned things like liquid cooling, but the latest version, which was launched last year, is really about pushing for accelerated adoption of various solutions. Like what Prof. Wang mentioned, it really has to be done holistically. It's not just one singular solution like liquid cooling. You also need to look at the details: are you still rejecting the heat, are you still consuming a lot of water operating the wet cooling towers? So I think the latest edition of the Green Data Centre Roadmap puts up a holistic framework, so that operators as well as their partners, the equipment vendors and consultants, can really adopt a holistic mindset, looking at different aspects all the way from the procurement of your energy sources, to the adoption of some of the latest energy-efficient solutions including liquid cooling, to possibly recovering waste heat and putting it to productive end-use. So I see that as a progression.

But I think the next step, in my view, is really to come up with a holistic green or sustainable data center standard. We have different pieces, including one of the new standards that my colleagues among the audience and I are developing, essentially a liquid cooling standard. But it's more than just liquid cooling. So I really look forward to Singapore showing the leadership to establish a holistic standard for green or sustainable data centers.
Gael Duez (19:05)
And just for our understanding, the roadmap as it is today is a set of best practices and things like that, but nothing is compulsory at the moment, nothing like what the European Union did with the Energy Efficiency Directive, where, if you're above a certain power capacity for your data center, you need to report on waste heat, as you mentioned, the share of renewable energy, and so on. Is this sort of reporting already in place in Singapore or not yet?
Pr PS Lee (19:38)
Not to the extent that the EU is doing, but what we can see is that IMDA and EDB are, for example, tying the PCCF8 to the green data center roadmap as well as to the Green Mark for Data Centres scheme. So I think it still serves the purpose of really compelling the industry to be very intentional, as well as very holistic, in assessing the different aspects that lead to the more sustainable operation of data centers.
Gael Duez (20:14)
And maybe to close this fireside chat, can we reverse the question? What do you both see as the greatest opportunity for leveraging AI for sustainability purposes in Singapore?
Pr. Heng Wang (20:30)
Yeah, I have been seeing a lot of new ways in which AI provides possible means to deal with that. Something I'm thinking of: you can use AI to simulate. If you have different governance or regulatory arrangements, you could have agents, and then you can use AI to simulate what the outcome would be. That will probably inform your decisions, and also bring the different positions of the actors into it.

And of course, AI can be used in the allocation of energy resources, and even the allocation of human resources for energy generation, and so on and so forth. So there are many ways that we need to explore.
Gael Duez (21:10)
And just a side question, because I didn't mention it in the introduction, but you're an expert in blockchain techniques, not necessarily cryptocurrency but blockchain use. Do you see any related use between the AI boom, which is a machine learning boom (now let's be honest, it's just a marketing trick to name it AI), and blockchain today, or are they for you still two separate topics?
Pr. Heng Wang (21:36)
Yeah, I'm not sure I am a blockchain technology expert, but I think blockchain will be one way that nowadays we are thinking of synergizing, or working, with AI. People even in areas like finance, for example, or FinTech, are also looking into more important uses of AI, including how you leverage technologies for compliance, regulatory compliance. For example, if you have environmental regulatory requirements, whether we can use blockchain and other tools to make that happen, with AI to assist that process. So that's also a possible way that we try to explore. But we have to see what the rules around that are. For example, people may say we want regulation by design: we require the designer to build green sustainability into your AI, or similar arrangements, to make sure you comply with the requirements.
Pr PS Lee (22:27)
Prof. Wang already mentioned that we can obviously leverage AI for simulating performance, and we can use it for optimizing the operations of data centers. But I think the other aspect where AI can really be leveraged is to orchestrate your computing, your power and your cooling. Currently it's done in silos.

But if you are able to time the workload to match the availability of green electrons, then obviously you can better reduce the operational carbon. So I see AI has a lot of potential in various aspects. In fact, we can even leverage AI for the entire life cycle of the data center, all the way from design and operation to end of life.
Gael Duez (23:15)
Thanks a lot, both of you. That was the first time we recorded a podcast live. I mean, the sound obviously will not be studio quality, but I hope it will be understandable. Thanks a lot to the audience for attending. It was a great exercise. And yes, let's meet next year.
Pr PS Lee (23:32)
Thank you.
Gaël Duez (23:36)
Thank you for listening to this live episode from Green IO Singapore. Don't forget to share it on social media or directly with other data center practitioners. Our next episode will not be about the AI energy score (we had to postpone that recording) but will instead feature Letitia Bornes, talking about the carbon emissions avoided thanks to digital technologies. It will be based on a study she co-authored about Vinted, the massive second-hand platform. Stay tuned.
By the way, we decided to open our Slack workspace to listeners willing to get involved in the making and promotion of the Green IO podcast and its newsletter. The link is in the show notes, and you're more than welcome to join. One last thing: visit greenio.tech to check our next conferences. New York is in two weeks, and the lineup is a blast. As usual, you can get a free ticket using the voucher GREENIOVIP. Just make sure to get one before the 10 remaining tickets are all gone. I'm looking forward to meeting you there, to help you, fellow responsible technologists, build a greener digital world.