Andreea Munteanu
on 14 September 2023
The Canonical AI Roadshow has started. Meet us around the globe.
Date: 11-12 October 2023
Location: Taets Art & Event Park, Amsterdam, Netherlands
Booth: A24
The Canonical AI Roadshow is taking off. Generative AI, large language models (LLMs) and predictive analytics are shaping the future of technology. Experience the latest advances in these areas at the World AI Summit, one of the largest global AI events.
At the conference, AI leaders, developers, creators and students will discover how to use open source technologies to elevate their AI story. From getting started to enabling large teams to build reproducible AI projects, we will cover a variety of topics during the event.
Canonical AI Roadshow at World AI Summit
The World AI Summit gathers the global AI ecosystem to discuss the latest innovations from the industry, the biggest challenges that enterprises face, and groundbreaking AI stories.
Our team of experts will travel to Amsterdam as part of the Canonical AI Roadshow to give talks about large language models, deliver a joint workshop with NVIDIA and answer your questions about AI, machine learning operations (MLOps) and open source. From generative AI to predictive analytics, we will showcase a wide variety of demos across different industries.
Bonus: everyone who visits our booth can interact with a conversational assistant similar to ChatGPT, built entirely with open source components, to learn more about Ubuntu and Canonical.
Build your LLM factory with NVIDIA and Canonical
Building an AI factory requires both powerful hardware and an integrated suite of tools. An AI factory should empower enterprises to run AI at scale, enabling professionals to collaboratively develop and deploy machine learning models. NVIDIA and Canonical put together an end-to-end solution to simplify these activities, addressing the entire ML lifecycle.
Michael Balint, Senior Manager at NVIDIA, Maciej Mazur, AI/ML Principal Engineer, and I have put together an exciting workshop on building an AI factory using open source tooling. We will zoom in on a use case that showcases large language models on DGX, with Charmed Kubeflow and NGC containers.
LLMs from 0 to hero
Large language models (LLMs) are gaining popularity. Maciej Mazur and I will deliver a session on automating LLM fine-tuning and developing chat-based and multimodal model assistants. Maciej will highlight the potential of LLMs across different industries, touching on common pitfalls and how to overcome them.
This will be a technical deep dive into large language models, referencing open source tooling such as Kubeflow, MLflow and Spark. By the end of the talk, you will know how to use LLMs with open source tools and have a better understanding of the major challenges they present.
What is the Canonical AI Roadshow?
The Canonical AI Roadshow is a series of events highlighting generative AI use cases powered by open source software. Between mid-September and mid-November, the roadshow will take us around the globe to talk about the latest innovations from the industry, demo some of the most exciting use cases from various industries, including financial services, telco, and oil and gas, and answer questions about AI, MLOps, big data and more. We will stop in:
- Europe
  - Amsterdam, Netherlands
  - Bilbao, Spain
  - Riga, Latvia
  - Paris, France
- North America
  - Austin, Texas
  - Chicago, Illinois
  - Las Vegas, Nevada
- Middle East and Africa
- Central and South America
  - São Paulo, Brazil