What does real estate have to do with AI?
To shed some light on the potential of artificial intelligence (AI), and discuss the role of the supporting infrastructure enabling this boom, we were delighted to leverage the expertise of Eric Rothman, Portfolio Manager, Real Estate Securities with CenterSquare. CenterSquare is a dedicated real estate investment manager, with around $14 billion under management, and Eric has been with the company for 17 years.
Before we explore the Nvidia story and the relationship between AI, data centres, and ‘new economy real estate’, let’s define what that latter phrase means.
New economy real estate is supporting technological advancements, like AI
What is ‘new economy real estate’? Eric noted that there is so much beyond the traditional ‘four food groups’ of real estate:
1) retail
2) office
3) residential
4) industrial
When CenterSquare defines the ‘new economy real estate’ space, Eric noted that the larger components include data centres, cell phone towers, and warehouses dedicated to new economy logistics, such as e-commerce fulfilment. This is far from traditional industrial real estate.
Some of the smaller segments include life sciences, cold storage, and office space that is uniquely tailored to technology tenants, typically located in specific cities with focused pools of technology talent. Such cities might be Seattle, San Francisco or New York. These types of ‘real estate’, most notably data centres, are vital to support growing technologies like AI.
The Nvidia story—$1 trillion to be spent?
There has been a huge amount of excitement and discussion around Nvidia as the stock has enjoyed overnight success on the coattails of the AI boom. ‘$1 trillion’ is a big number (and a nice headline), but it’s very difficult to forecast where generative AI will take us. Some people say it is like inventing the wheel or the personal computer. This is a big claim, and only time will tell.
If people are thinking about ‘data centre REITs’ as an investment, they have to understand that data centres provide the power, cooling, and connectivity. The data centre REITs do not actually own the computers; the tenants invest in the computers. One thing that is absolutely true, however, is that as an owner, you love to see the tenants putting money into the space that they are renting. Why? It makes it less likely that they are going to leave. Therefore, greater investment in AI technology and computing power may be a positive signal for the supporting real estate (like data centres).
Eric’s conclusion, whether thinking about the impact of generative AI on data centre REITs or cell phone tower REITs, was that the move in share prices hasn’t reflected where we could be going yet. Connectivity and data centres will be vital components for artificial intelligence, but it’s not yet clear how or when investors are going to reflect that in the real estate prices. Eric noted that investors frequently forget about the buildings until later in a cycle or a trend.
Greater computing power = greater energy consumption?
Another aspect that we discussed was energy usage. Eric noted that newer AI-focused semiconductors do not draw just a little more power; they represent a step change in power consumption.
A chart from the ‘Decadal Plan for Semiconductors’, a research report by the Semiconductor Research Corporation, allows us to compare compute energy with the world’s energy production. A critical point to keep in mind is that ‘something has to give’: simply continuing to add computational capacity without thinking about efficiency or energy resources will eventually hit a wall. However, if history is any guide, we should expect that, as demand for and investment in computational resources increase, there will be the potential for gains in efficiency, improved model design, and even different energy resources that may not yet exist today.
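To make the ‘something has to give’ point concrete, the short sketch below compounds a hypothetical compute energy figure against a slowly growing figure for world energy production. Every starting value and growth rate here is an assumption chosen for illustration, not data from the Decadal Plan.

```python
# Illustrative sketch only: the starting values and growth rates below are
# assumptions chosen to show the shape of the argument, not figures taken
# from the Decadal Plan for Semiconductors.

def years_until_exceeds(compute_twh, world_twh, compute_growth, world_growth):
    """Years until compute energy demand would exceed world energy production."""
    years = 0
    while compute_twh < world_twh:
        compute_twh *= 1 + compute_growth   # compute demand compounds quickly
        world_twh *= 1 + world_growth       # energy production grows slowly
        years += 1
    return years

# Hypothetical inputs: compute starts as a small share of output but grows far faster.
print(years_until_exceeds(compute_twh=200.0, world_twh=27_000.0,
                          compute_growth=0.40, world_growth=0.02))  # -> 16
```

Under these assumed rates the lines cross within a couple of decades, which is why efficiency gains, better model design, or new energy sources have to be part of the picture.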
Since many investors may be less familiar with cell phone towers, Eric made sure to mention just how strong a business model he believes this to be. It’s true that these REITs have not performed well over the past 18 months, but we are right in the middle of the current 5G rollout. Tenants sign long leases, there is plenty of demand, and there are even consumer price index (CPI) escalators that increase the rent collected over time.
Conclusion: a different way to think about real estate
It was great to be able to spend some time speaking with Eric and to learn about what’s happening both in the broader real estate market and in the more specific, new economy, ‘tech-focused’ market. The full discussion is available on the Behind the Markets podcast.
This material is prepared by WisdomTree and its affiliates and is not intended to be relied upon as a forecast, research or investment advice, and is not a recommendation, offer or solicitation to buy or sell any securities or to adopt any investment strategy. The opinions expressed are as of the date of production and may change as subsequent conditions vary. The information and opinions contained in this material are derived from proprietary and non-proprietary sources. As such, no warranty of accuracy or reliability is given and no responsibility arising in any other way for errors and omissions (including responsibility to any person by reason of negligence) is accepted by WisdomTree, nor any affiliate, nor any of their officers, employees or agents. Reliance upon information in this material is at the sole discretion of the reader. Past performance is not a reliable indicator of future performance.
The environmental impact of AI: a case study
In our previous blog, Will AI workloads consume all the world’s energy?, we looked at the relationship between increasing processing power and an increase in energy demand, and what this means for artificial intelligence (AI) from an environmental standpoint. In this latest blog, we aim to further illuminate this discussion with a case study of the world’s biggest large language model (LLM), BLOOM.
Case study on environmental impact: BLOOM
Accurately estimating the environmental impact of running an LLM is far from a simple exercise. One must first understand that there is a general ‘model life cycle’. Broadly, the model life cycle can be thought of as three phases1:
Inference: This is the phase when a given model is said to be ‘up and running’. If one is thinking of Google’s machine translation system, for example, inference is happening when the system is providing translations for users. The energy usage for any single request is small, but if the overall system is processing 100 billion words per day, the overall energy usage could still be quite large (a small worked example follows this list).
Training: This is the phase when a model’s design has been fixed and the system is exposed to data from which it learns, such that its outputs in the inference phase are judged to be ‘accurate’. There are cases where the greenhouse gas emissions from training large, cutting-edge models can be comparable to the lifetime emissions of a car.
Model development: This is the phase when developers and researchers are seeking to build the model and will tend to experiment with all sorts of different options. It is easier to measure the impact of training a finished model that becomes public, as opposed to seeking to measure the impact of the research and development process, which might have included many different paths prior to getting to the finished model that the public actually sees.
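As flagged in the inference item above, here is a minimal worked example of how a tiny per-request cost adds up at scale. The per-request energy figure is purely an assumption, not a measured value for any real translation system.

```python
# Illustrative only: the per-request energy figure is an assumption, not a
# measured value for any real translation system.
WH_PER_THOUSAND_WORDS = 0.1        # assumed inference energy per 1,000 words
WORDS_PER_DAY = 100_000_000_000    # 100 billion words per day, as above

daily_kwh = WORDS_PER_DAY / 1_000 * WH_PER_THOUSAND_WORDS / 1_000
print(f"{daily_kwh:,.0f} kWh per day")  # tiny per request, ~10,000 kWh in aggregate
```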
Therefore, the BLOOM case study focuses on the impact from training the model.
BLOOM is trained on 1.6 terabytes of data in 46 natural languages and 13 programming languages.
Note that, at the time of the study, Nvidia did not disclose the carbon intensity of the specific chips used, so the researchers needed to compile data from a closely equivalent setup. This is an important detail to keep in mind: an accurate depiction of the carbon impact of training a single model requires a lot of information and, where certain data along the way is not disclosed, more and more estimates and approximations are needed (which will affect the final figures).
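For readers who want to see the shape of such an estimate, below is a minimal sketch of the general accounting approach used in studies of this kind: operational energy scaled by data centre overhead (PUE) and grid carbon intensity, plus an approximated embodied-hardware term for exactly the kind of undisclosed data mentioned above. All inputs are placeholders, not BLOOM’s actual figures.

```python
# Sketch of the general accounting approach used in studies of this kind:
# operational emissions = hardware energy x data-centre overhead (PUE)
#                         x carbon intensity of the local grid,
# plus an embodied-emissions term that has to be approximated when the chip
# manufacturer does not disclose it. All numbers below are placeholders.

def training_emissions_tco2e(gpu_hours, watts_per_gpu, pue,
                             grid_kgco2_per_kwh, embodied_kgco2_per_gpu_hour):
    energy_kwh = gpu_hours * watts_per_gpu / 1_000 * pue   # energy incl. overhead
    operational = energy_kwh * grid_kgco2_per_kwh          # grid-dependent emissions
    embodied = gpu_hours * embodied_kgco2_per_gpu_hour     # approximated hardware footprint
    return (operational + embodied) / 1_000                # kilograms -> tonnes

print(training_emissions_tco2e(gpu_hours=1_000_000, watts_per_gpu=400, pue=1.2,
                               grid_kgco2_per_kwh=0.06,
                               embodied_kgco2_per_gpu_hour=0.01))  # -> ~38.8 tonnes
```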
If AI workloads are always increasing, does that mean carbon emissions are also always increasing2?
Considering all data centres, data transmission networks, and connected devices, it is estimated that emissions were about 700 million tonnes of carbon dioxide equivalent in 2020, roughly 1.4% of global emissions. About two-thirds of those emissions came from operational energy use. Even if 1.4% is not yet a significant share of the world’s total, growth in this area can be fast.
Currently, it is not possible to know exactly how much of this 700 million tonne total comes directly from AI and machine learning. One possible assumption to make, in order to arrive at a figure, is that AI and machine learning workloads were occurring almost entirely in hyperscale data centres. These specific data centres contributed roughly 0.1% to 0.2% of global greenhouse gas emissions.
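As a rough cross-check, the figures quoted above can be tied together with simple arithmetic:

```python
# Back-of-envelope check tying together the figures quoted above (2020 estimates).
ICT_MT = 700          # data centres, networks and devices, in MtCO2e
ICT_SHARE = 0.014     # roughly 1.4% of global emissions

global_mt = ICT_MT / ICT_SHARE        # implied global total: ~50,000 MtCO2e
hyperscale_low = global_mt * 0.001    # hyperscale data centres at 0.1% of global...
hyperscale_high = global_mt * 0.002   # ...to 0.2% of global
print(global_mt, hyperscale_low, hyperscale_high)  # 50000.0 50.0 100.0
```

In other words, the hyperscale assumption points to something on the order of 50 to 100 million tonnes of carbon dioxide equivalent.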
Some of the world’s largest firms directly disclose certain statistics to show that they are environmentally conscious. Meta Platforms represents a case in point. If we consider its specific activities:
Overall data centre energy use was increasing 40% per year from 2016.
Overall training activity in machine learning was growing roughly 150% per year.
Overall inference activity was growing 105% per year.
But Meta Platforms’ overall greenhouse gas emissions footprint was down 90% from 2016 due to its renewable energy purchases.
The bottom line is, if companies just increased their compute usage to develop, train and run models—increasing these activities all the time—then it would make sense to surmise that their greenhouse gas emissions would always be rising. However, the world’s biggest companies want to be seen as ‘environmentally conscious’, and they frequently buy renewable energy and even carbon credits. This makes the total picture less clear; whilst there is more AI and it may be more energy intensive in certain respects, if more and more of the energy is coming from renewable sources, then the environmental impact may not increase at anywhere near the same rate.
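A small sketch may help illustrate that dynamic. With assumed growth rates and an assumed renewable ramp-up (not Meta Platforms’ actual figures), energy use can keep compounding while the portion driving emissions shrinks:

```python
# Illustrative only: the growth rate and renewable ramp-up below are assumptions,
# not Meta Platforms' actual figures.
energy = 100.0          # indexed energy use in year 0
renewable_share = 0.20  # fraction of energy matched by renewables in year 0

for year in range(1, 6):
    energy *= 1.40                                      # demand keeps compounding
    renewable_share = min(1.0, renewable_share + 0.20)  # renewable purchases ramp up
    unmatched = energy * (1 - renewable_share)          # only this share drives emissions
    print(year, round(energy, 1), round(unmatched, 1))
```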
Conclusion—a fruitful area for ongoing analysis
One of the interesting areas for future analysis will be to gauge the impact of internet search with generative AI versus the current, more standard search process. There are estimates that the carbon footprint of generative AI search could be four or five times higher, but looking solely at this one datapoint could be misleading. For instance, if generative AI search actually saves time or reduces the overall number of searches, in the long run, more efficient generative AI search may help the picture more than it hurts3.
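A rough break-even sketch shows why a single per-search multiple can mislead. The per-query multiple reflects the four-to-five-times estimate above; the number of conventional searches a generative query might replace is purely an assumption for illustration.

```python
# Rough break-even sketch: the per-query multiple reflects the 4-5x estimate above;
# the number of conventional searches a generative query replaces is an assumption.
STANDARD_SEARCH = 1.0   # indexed footprint of one conventional search
GENAI_SEARCH = 4.5      # roughly 4-5x higher per query

searches_per_task_standard = 6  # assumed: several queries to answer one question
searches_per_task_genai = 1     # assumed: one generative query answers it

print(STANDARD_SEARCH * searches_per_task_standard)  # 6.0 footprint units per task
print(GENAI_SEARCH * searches_per_task_genai)        # 4.5 footprint units per task
```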
Just as we are currently learning how and where generative AI will help businesses, we are constantly learning more about the environmental impacts.
Sources
1 Source: Kaack et al. “Aligning artificial intelligence with climate change mitigation.” Nature Climate Change. Volume 12, June 2022.
2 Source: Kaack et al., June 2022.
3 Source: Saenko, Kate. “Is generative AI bad for the environment? A computer scientist explains the carbon footprint of ChatGPT and its cousins.” The Conversation. 23 May 2023.