
This article first appeared in Digital Edge, The Edge Malaysia Weekly on June 28, 2021 - July 4, 2021

Most of us do not realise it, but we send and receive copious amounts of data every day, whether through emails, texts, gaming or mobile apps. Some 15 years ago, text messages were a few kilobytes in size but, today, text messages that include images and videos can go up to a few megabytes. 

If you are reading this on www.theedgemarkets.com or The Edge Markets e-paper, cloud data centres are to thank for the quick loading. Because of the support and computing power of these data centres, websites are able to transfer data quickly, so you can receive information on your device within seconds.

But there are instances when a single webpage can take an excruciatingly long time to load, or not load at all, as seen during the rush to sign up for the AstraZeneca Covid-19 vaccination on May 26. The website was not able to scale and expand quickly enough to manage the high traffic.

The vaccination sign-up scramble is just the tip of the iceberg. At the rate the country is moving towards a digitalised world by incorporating sophisticated deep tech applications, regular data centres may not be fully capable of supporting it.

In February this year, the government demonstrated its commitment to the country’s digitalisation journey and digital economy by announcing the MyDIGITAL initiative. While this is good news for the country, it also fuels demand for greater cloud capacity to support these digitalisation efforts.

Vertiv Malaysia country manager Teoh Wooi Keat is excited about MyDIGITAL’s unfolding over the next 10 years and hopes to see an increased adoption of technology for sustainability and economic growth. However, he notes that there needs to be a strategic approach to IT infrastructure that forms the backbone of many technology deployments and applications.

Regular data centres will definitely need an upgrade, as deep tech such as artificial intelligence (AI) and big data crunching require greater cloud computing capacity to reduce latency (that is, delays in data transfer) and provide good service to everyday users.

SilTerra vice-president of strategic management Tan Eng Tong says digital growth today — which encompasses websites, movies, music and on-demand apps such as Grab — is nothing compared to what is yet to come with the evolution of AI, the Internet of Things (IoT) and 5G.

This is where a hyperscale data centre comes into play. Hyperscale means the ability to rapidly achieve massive scale in data and cloud computing, using large and expandable data centres.

AIMS Group CEO Chiew Kok Hin explains the difference between a data centre and a hyperscale data centre with a simple analogy: “A normal car can accelerate and give you high power if needed when put under pressure, but it will take some time. A supercar, however, can accelerate and reach that speed faster because it has the capacity to do so.”

The difference in size, capacity and computing power between a regular and a hyperscale data centre is significant. The latter allows companies to scale up quickly and focus on what matters most: their business, their customers and delivering the right services.

Technologies such as AI, virtual reality (VR) and autonomous vehicles require crunching vast amounts of data to produce analytics, Chiew explains, which calls for powerful servers and resources. Hyperscale data centres exist to make these computing processes far more efficient.

“This means putting more power in a single footprint. For example, if you compare the first generation of computers to laptops today, we can do a lot more things in a smaller space as compared to 20 years ago. That is the goal with data centres too,” he says. “But this doesn’t mean that a regular data centre cannot do whatever a hyperscale data centre does. They can always upgrade themselves to have the same capacity and computing power as a hyperscale data centre.”

The human and tech challenges

Players in this space face many challenges when building a data centre — from technical ones, such as developing the technologies needed for robust, cost-effective centres, to human ones, such as finding the right talent and shifting mindsets.

In January, Synergy Research Group revealed that the total number of large data centres operated by hyperscale providers increased to 597 at the end of 2020, having more than doubled since the end of 2015. The report adds that Amazon, Microsoft and Google collectively account for over half of all major data centres around the world. 

The industry’s major hardware challenge is heat generation and the massive power consumption required for cooling. In 2018, data centres around the world collectively consumed about 1% (205TWh) of all electricity generated globally.

SilTerra’s Tan explains that energy is needed not just to power the data centres themselves, but also the air-conditioning and cooling systems that dissipate the heat they generate. Silicon photonics is a technology that can tackle excessive heat generation, he says, adding that SilTerra has been developing such chips to be incorporated into data centre hardware.

Silicon photonics chips are made of silicon but conduct photons instead of electrons. Structures on the chip, known as waveguides, are adapted to guide photons, much as fibre-optic cables transmit light, but at the chip level. SilTerra has developed the capability to combine waveguides and semiconductors on the same wafer.

Tan believes silicon photonics will be utilised by hyperscale data centre operators first as they have an existing problem to solve. This technology can potentially reduce the power consumption of the servers in these centres by 10%.  

“The electricity used and the heat generated can actually power a small village, so if we save even 10% of the heat emission, it’s a big deal,” says Tan.
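To put that 10% figure in perspective, a back-of-the-envelope calculation helps. The facility size and utilisation below are hypothetical assumptions for illustration, not figures from SilTerra:

```python
# Back-of-the-envelope estimate of a 10% power saving at a
# hypothetical hyperscale facility (all figures illustrative).

facility_mw = 30            # assumed continuous IT load, in megawatts
hours_per_year = 24 * 365   # 8,760 hours in a year

annual_mwh = facility_mw * hours_per_year   # total annual consumption, in MWh
saving_mwh = annual_mwh * 0.10              # the 10% saving Tan describes

print(f"Annual consumption: {annual_mwh:,} MWh")
print(f"10% saving: {saving_mwh:,.0f} MWh")
```

At roughly 26,000 MWh a year under these assumptions, the saving is on the scale of a small town’s annual electricity use, which tracks Tan’s “small village” comparison.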

AIMS’ Chiew says another way to manage generated heat is ambient cooling. Cold locations such as Finland and Greenland, with their low ambient temperatures, offer a cost benefit to data centre operators.

However, tropical countries like Malaysia need to consider other technologies, such as liquid cooling. Chiew explains that this method involves circulating liquids or gels through the data centre to carry away the heat generated.

Microsoft has taken liquid cooling to the next level with Project Natick. A data centre was deployed 117ft deep on the seafloor off Scotland’s Northern Isles in June 2018 and retrieved in July last year for analysis. The research project aims to inform Microsoft’s data centre sustainability strategy around energy, waste and water, as well as how to make land data centres more reliable.

By understanding the benefits and difficulties of deploying subsea data centres worldwide, Microsoft hopes to better serve customers in areas near large bodies of water, where nearly 50% of the world’s population resides.

Microsoft Malaysia managing director K Raman says the team is still reviewing the data, but the project has already demonstrated that providers can economically manufacture full-scale undersea data centre modules powered by renewable energy.

“The team found that Microsoft’s undersea data centres are eight times more reliable than those on land because of their sealed and purged environment, with all oxygen and humidity displaced by nitrogen. Nitrogen is less corrosive than oxygen and the absence of people to bump and jostle components are the primary reasons for the difference,” he says. “If the analysis proves this correct, the team may be able to translate the findings into land data centres.”

A big part of Microsoft’s goal of going carbon negative involves completely changing the way data centres operate. Raman says Microsoft’s engineers are focused on developing next-generation technologies, including liquid immersion cooling.

The fluid inside the liquid cooling tank is harmless to electronic equipment and engineered to boil at 122°F, 90°F lower than the boiling point of water. The boiling effect, which is generated by working servers, carries heat away from labouring computer processors. The low-temperature boil enables the servers to operate continuously at full power without risk of failure due to overheating.

Inside the tank, vapour rising from the boiling fluid contacts a cooled condenser in the tank lid, which causes the vapour to change to liquid and rain back onto the immersed servers, creating a closed loop cooling system.
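The temperature figures quoted above are easy to verify with a standard Fahrenheit-to-Celsius conversion:

```python
# Check the immersion-cooling temperatures quoted above.

def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

fluid_boil_f = 122   # boiling point of the engineered fluid, in °F
water_boil_f = 212   # boiling point of water at sea level, in °F

print(f"Fluid boils at {f_to_c(fluid_boil_f):.0f}°C")  # 50°C
print(f"{water_boil_f - fluid_boil_f}°F below water's boiling point")  # 90°F
```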

“This is a new server cooling method predicted to lower server energy consumption in the future by 5% to 15% at a minimum while greatly minimising overall water use in data centres,” says Raman.

The company is also utilising grid-interactive uninterruptible power supply (UPS) batteries to balance supply and lower energy demand on the grid by directing microbursts of electricity to data centres or the grid as needed.

“These batteries store energy at close to 90% efficiency and smooth out intermittency from renewables. As we continue to explore this technology further, we could potentially extend the duration of the batteries from a few minutes to several hours, potentially using these long-duration batteries as a replacement for traditional back-up generators,” says Raman.
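As a simple illustration of what “close to 90% efficiency” means for such a battery, consider the round-trip arithmetic below. The battery capacity is a made-up example, not a Microsoft figure:

```python
# Round-trip arithmetic for a grid-interactive UPS battery
# (the battery capacity is a hypothetical example).

stored_kwh = 100.0        # energy charged into the battery
round_trip_eff = 0.90     # "close to 90% efficiency", as quoted

delivered_kwh = stored_kwh * round_trip_eff   # energy returned to the grid
lost_kwh = stored_kwh - delivered_kwh         # lost, mostly as heat, in conversion

print(f"Delivered: {delivered_kwh:.0f} kWh, lost: {lost_kwh:.0f} kWh")
```

In other words, for every 100kWh stored, about 10kWh is lost on the way back out — far better than the losses typical of spinning up diesel back-up generators.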

Computing power is another challenge. Ryan Yang, director of technical marketing at Lightelligence, says the company uses silicon manufacturing technology to construct an optical device to carry out fast data transferring and signal processing. Silicon electronics is currently doing this, he says, but the company is changing the domain from electronics to optics.

“We do this because light is very fast. The signals and data are transferred at the speed of light, which is faster than electronics. It also generates less heat, which means it can consume much less power while doing data transferring and computing. These are the fundamental reasons people are interested in silicon photonics and optical computing,” he says.

Yang says a data centre has three pieces — the computing unit, the storage unit and the networking between them. Lightelligence focuses on the networking aspect, that is, how data is structured and how it is transferred.

“It is mature technology that is being explored with data centres and, going forward, the focus will be on how to leverage optical technology to build more efficient networking inside the data centre.”

To tap into a variety of cutting-edge technologies — from 5G to AI, VR and more — Vertiv’s Teoh says business leaders need to pursue major improvements across data infrastructure, computing capabilities and bandwidth. He says the lack of readiness to upgrade is one of the many problems data centres face today.

“Most IT professionals say it is difficult to manage today’s bandwidth needs with the existing infrastructure. Both business leaders and engineers say bandwidth is at the very heart of future business success. However, according to our research, only 11% of C-suite executives and just 1% of data centre engineers believe their data centres are updated to accommodate current and future needs,” he says.

Future data centres will inevitably require adequate processing power in the cloud and for edge computing to manage new bandwidth challenges effectively, says Teoh. Decentralisation and edge computing are not as widespread as they should be, as both involve moving and processing data and resources away from the organisation’s local data centre or corporate hub. 

Edge computing is computing done at or near the source of the data, instead of relying on the cloud in a handful of centralised data centres.

“Overall, the top challenges that hinder wider utilisation of edge computing are the cybersecurity of devices and equipment, followed by managing costs,” says Teoh.

A talent shortage and a lack of resources for data centre operations are also problems for companies in the industry. Teoh says that, according to Vertiv’s recent report, organisations could lose up to 16% of their infrastructure workforce over the next five years. Companies face a staffing challenge as they transform data centres to adapt to disruptive forces.

“Developing solutions across security, bandwidth, processing power and more (while helping internal talent build fluency in evolving environments and unfamiliar technologies) will be essential as data centres change. However, the speed of change leaves most IT organisations lacking in expertise or talent to handle complexities ranging from processing power to energy management,” he says.

The future of hyperscale data centres

As part of MyDIGITAL, the government hopes to build data centres in Malaysia as well as attract industry pioneers to set up facilities here. In April, Microsoft partnered with the government to establish its first data centre region in Malaysia. Prime Minister Tan Sri Muhyiddin Yassin said the initiative will see investments amounting to US$1 billion (RM4.15 billion) over the next five years.

SilTerra’s Tan believes Malaysia is an ideal location for hyperscale data centres, one that would serve not only Southeast Asia but the world. His reasoning rests on the routes of the undersea submarine cables that carry telecommunication signals across stretches of ocean, most of which pass through the Straits of Malacca. This is good news, especially as data centres are expected to grow and adapt to evolving consumer demand for more data.

Vertiv’s Teoh says as we look towards a smarter future with 5G, smart cities and IoT, data centres will likewise continue to evolve to cope with the computing demands. In Malaysia, the internet has been dubbed the “third utility” and the government is allocating considerable investments to boost connectivity. 

Sustainability and environmental impact pose a major challenge for the data centre sector, says Teoh. To address sustainability issues, the use of renewable energy, particularly solar technology, will advance considerably over the next 10 years.

“High-efficiency air-conditioning UPS systems can be the primary back-up energy source in the future. UPS systems have made significant innovations in energy efficiency in recent years. With the introduction of eco-mode and intelligent paralleling, UPS can now achieve efficiencies approaching 99%, making them less likely to be displaced by competing technologies,” he says.

AIMS’ Chiew says the company has considered renewable energy sources, but notes that building a data centre is a challenging task in itself, as it requires approvals from different governmental and regulatory bodies. These include registering with the relevant digital authorities as well as obtaining approvals for the building, land, power and water, among other things.

“We have not seen a very concerted effort to bring all these agencies together like in Singapore. There, when you want to build a data centre, operators just need to submit a proposal and it will be ready,” he explains.

“In Malaysia, there needs to be constant follow-ups with all the agencies one by one. It can get very frustrating,” he says, adding that he hopes with MyDIGITAL, the government will look into and address this particular stumbling block.
