
Let’s talk about open source, AI and cloud infrastructure at GITEX 2024

October 14 – 18, 2024. Dubai. Hall 26, Booth C40

The world's largest tech event, GITEX 2024, is taking place in Dubai next week. The event is a great opportunity for Canonical to connect with leaders across a wide range of industries, share expert opinions, and help make your cloud journey easier and more cost-effective.

In 2023 Canonical presented GITEX attendees with the latest news on predictive analytics, generative AI, and large language models (LLMs). This year we aim to delve deeper into these topics to help you innovate at speed with open source AI.


Canonical, the publisher of Ubuntu, provides open source security, support and services. Our portfolio covers critical systems, from the kernel to containers, from databases to AI. With customers that include top tech brands, emerging startups, governments and home users, Canonical delivers trusted open source for everyone. 


Join us in Hall 26, Booth C40 to explore how to scale your ML projects with us. Our team will be happy to shed more light on enterprise AI solutions that support you in developing artificial intelligence projects in any environment.

Don’t miss this opportunity to dive into the world of open-source innovation with Canonical. We can’t wait to meet you at GITEX 2024!

Dive into GenAI with a Retrieval Augmented Generation (RAG) demo

Retrieval-Augmented Generation (RAG) remains one of the key discussion points in enterprises' generative AI efforts. Our team will be showing a demo of how to build your own LLM solution with RAG in Kubeflow, using OpenSearch with Juju. It can be deployed on any public or private cloud.

In more detail, this demo shows how to prepare your infrastructure for an end-to-end solution, from data collection and cleaning through to training and inference, built around an open-source large language model integrated with an open-source vector database using the RAG technique. It shows how to scrape information from your publicly available company website, embed it into the vector database, and have it consumed by the LLM.
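The demo itself is built with Kubeflow, OpenSearch and Juju; purely as an illustration of the retrieval step of the RAG technique, here is a minimal sketch of how scraped website passages could be embedded into an OpenSearch vector index and pulled back as context for an LLM prompt. It assumes a local OpenSearch instance with the k-NN plugin enabled and the opensearch-py and sentence-transformers Python packages; the index name, embedding model and host are illustrative placeholders, not the demo's actual configuration.

# Minimal RAG retrieval sketch (assumed setup: local OpenSearch with k-NN plugin,
# opensearch-py and sentence-transformers installed; names below are hypothetical).
from opensearchpy import OpenSearch
from sentence_transformers import SentenceTransformer

INDEX = "company-site"                           # hypothetical index name
model = SentenceTransformer("all-MiniLM-L6-v2")  # produces 384-dimensional embeddings

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

# Create a vector index so OpenSearch can answer approximate k-NN queries.
client.indices.create(
    index=INDEX,
    body={
        "settings": {"index": {"knn": True}},
        "mappings": {
            "properties": {
                "text": {"type": "text"},
                "embedding": {"type": "knn_vector", "dimension": 384},
            }
        },
    },
)

# Embed scraped website passages and store them alongside their vectors.
passages = [
    "Canonical publishes Ubuntu and provides open source security and support.",
    "Charmed Kubeflow lets you run end-to-end MLOps pipelines on any cloud.",
]
for i, passage in enumerate(passages):
    client.index(
        index=INDEX,
        id=str(i),
        body={"text": passage, "embedding": model.encode(passage).tolist()},
        refresh=True,
    )

# At question time, retrieve the most similar passages and prepend them to the
# prompt, so the LLM answers from your own content rather than from memory alone.
question = "What does Canonical provide?"
hits = client.search(
    index=INDEX,
    body={
        "size": 3,
        "query": {
            "knn": {"embedding": {"vector": model.encode(question).tolist(), "k": 3}}
        },
    },
)["hits"]["hits"]

context = "\n".join(hit["_source"]["text"] for hit in hits)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# `prompt` is then passed to the LLM served from the deployment.

In the demo, roughly the same embed, index and retrieve loop runs on Kubeflow, with Juju handling the deployment of OpenSearch and the supporting components on the cloud of your choice.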

Join us in Hall 26 if you:

  • Are curious and passionate about AI and MLOps
  • Seek to deliver AI at scale securely
  • Are interested in IT infrastructure and enterprise AI solutions

Here is all the information about GITEX 2024:

  • Location: Dubai World Trade Centre, Sheikh Zayed Rd – Trade Centre 2
  • Dates: October 14 – 18, 2024
  • Hours: Monday 11:00 AM – 5:00 PM, Tuesday – Friday 10:00 AM – 5:00 PM
