Saturday 07 Dec 2024

(APRIL 8): Wherever you stand on the development of AI, one thing is certain: in its various forms, AI will have the biggest impact across industries since the emergence of the internet. It will define modernisation, becoming a de facto tool for everything from research and development through to production and post-sales services. It will, as the International Monetary Fund recently suggested, “transform the global economy”.

We are already seeing this with GenAI. In fact, enterprise spending on GenAI alone will, according to IDC, double this year and grow to a whopping US$151 billion by 2027. Large language models (LLMs) have clearly captured the imagination of organisations and are now accelerating interest in GenAI capabilities, both internally and within third-party applications. They are driving strategic thinking.

Organisations see the technology as key to innovation but also efficiency — as McKinsey said last year, GenAI is the next productivity frontier. It’s boom time — or at least it should be. But what if an organisation cannot connect all of its data or cannot scale its infrastructure? What impact does that have on its ability to take advantage of GenAI?

Organisations need connected data and the flexibility to meet the fluctuating demands of modernisation. They also need it to be affordable. One of the biggest challenges for GenAI development and applications is cost, as many businesses are cautious about increasing spending on new AI services.

Cost-effective cloud for long-term AI growth

The cloud, of course, becomes crucial here. Demand for cloud services is only increasing (Canalys expects global cloud infrastructure services spending to grow by 20% in 2024, up from 18% in 2023), but how many businesses are actually being held back by budget constraints or the complexity of managing and updating disparate systems?

Not all cloud infrastructure is equal. There needs to be a levelling of the playing field to give as many organisations as possible access to the technologies that will undoubtedly shape all our futures.

As a McKinsey report, “In search of cloud value: Can generative AI transform cloud ROI?”, reveals: “Getting value from public cloud, it turns out, is complicated. Companies have spent the past several decades building enterprise technology organisations, processes, and architectures designed to work for on-premises environments. Much of that needs to change.”

We agree totally. If organisations are going to reap the benefits of GenAI, there needs to be a more open, sustainable cloud that lowers the barriers to entry, both in terms of cost and in terms of flexibility and access.

In fact, we are already doing this, bringing our own LLMs to customers through our open cloud infrastructure. These have already enabled world-leading consumer health companies such as Haleon, whose AI nutritionist builds customer trust by improving the accuracy of its nutritional database and the relevance of its recommendations.

It has also helped rinna, a Japanese startup specialising in the development of pre-trained foundation models adept at processing Japanese, to innovate with new products and services.

The point is that these organisations are able to accelerate innovation through affordable access to GenAI capabilities in the cloud. This is their ROI.

This is also reflected in Alibaba Cloud’s latest pricing strategy, which offers significant discounts to long-term subscribers while giving businesses a stable foundation for planning and building their long-term AI applications.

Democratisation for the GenAI boom

Over the next few years, we will see an ongoing shift towards AI computing: infrastructure designed for AI, with embedded GenAI capabilities, that drives innovation and action with clear cost structures and scalability.

That’s why Alibaba Cloud has built ModelScope, a leading open-source AI model community for models and related tools and services that is popular among developers. The community hosts more than 3,000 AI models, including Meta’s recently released Llama 2 and our own open-source Qwen LLMs, with parameter sizes of 1.8 billion, 7 billion, 14 billion and 72 billion, as well as multimodal LLMs with audio and visual understanding.

While closed-source and open-source LLMs will co-exist, the ability of open-source solutions to democratise AI should accelerate its adoption. Open-source LLMs can drive the growth of AI model communities, which prioritise collaboration for improved AI interpretability. It means organisations of all sizes and resources can start to dream of enhancing products and services with the help of GenAI.

For example, our SeaLLMs mark a significant stride towards inclusivity by providing enhanced support for local languages in Southeast Asia, embracing the cultural diversity of the region.

The democratisation of AI and the delivery of open, GenAI-ready cloud services also mean businesses can put more resources into organisational data, making sure it is synthesised and usable by the LLM. After all, GenAI is great at summarising and synthesising data, but less impressive when it comes to gaining insights from unstructured data. Organisations need to be able to address data issues without worrying about the underlying infrastructure. It should not be a trade-off. For organisations to truly innovate, cloud infrastructure has to become a de facto standard: a baseline on which organisations can operate LLMs, experiment, innovate and grow.

As we move towards constructing an AI computing infrastructure, this will become even more apparent. The demands on IT resources are only going to increase, so we need infrastructures capable of supporting energy-intensive model training while ensuring operational efficiency, cost-effectiveness and minimal environmental impact. This is an industry challenge: not only to democratise GenAI but also to encourage collaboration. For that to happen, it needs to be open.

Selina Yuan is President of the International Business of Alibaba Cloud Intelligence.
