A three-day workshop exploring the intersection of artificial intelligence and multimodal scientific research
AIMS2026 (AI for Multimodal Science) is a three-day interdisciplinary workshop that brings together Schmidt Science Fellows from around the world and UC San Diego postdoctoral researchers to explore how multimodal artificial intelligence can accelerate discovery across scientific domains.
Multimodal AI sits at the forefront of modern research, enabling the integration of heterogeneous data types—such as images, time series, text, graphs, and simulations—to address complex scientific questions. These methods cut across biology, medicine, climate science, physics, materials science, and computer vision, where insights increasingly emerge from the fusion of multiple data modalities rather than from any single source alone.
While scientific datasets are often domain-specific, the core principles of multimodal fusion are broadly transferable. AIMS2026 emphasizes both data-level multimodality and architecture-level multimodality. By focusing on generalizable fusion patterns and model designs, the workshop equips participants with approaches that can be readily adapted across disciplines—simply by swapping in their own domain's data, tasks, and evaluation metrics.
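To make the idea of a generalizable fusion pattern concrete, here is a minimal sketch (illustrative only, not workshop material) of an architecture-level fusion block in PyTorch. The modality names, feature dimensions, and concatenation-based fusion are assumptions for illustration; adapting it to another domain means swapping in different encoders, inputs, and a different task head.

```python
# Minimal sketch of a two-modality fusion model: each modality gets its own
# encoder, and the fused representation feeds a shared task head.
import torch
import torch.nn as nn

class SimpleFusionModel(nn.Module):
    def __init__(self, dim_image=512, dim_text=256, dim_hidden=128, n_classes=10):
        super().__init__()
        # Modality-specific encoders project heterogeneous inputs to a shared size.
        self.image_encoder = nn.Sequential(nn.Linear(dim_image, dim_hidden), nn.ReLU())
        self.text_encoder = nn.Sequential(nn.Linear(dim_text, dim_hidden), nn.ReLU())
        # Architecture-level fusion: here, simple concatenation followed by an MLP head.
        self.head = nn.Sequential(
            nn.Linear(2 * dim_hidden, dim_hidden),
            nn.ReLU(),
            nn.Linear(dim_hidden, n_classes),
        )

    def forward(self, image_feats, text_feats):
        z_img = self.image_encoder(image_feats)
        z_txt = self.text_encoder(text_feats)
        fused = torch.cat([z_img, z_txt], dim=-1)  # swap in another fusion strategy here
        return self.head(fused)

# Adapting to a new domain amounts to changing the encoders, inputs, and task head.
model = SimpleFusionModel()
logits = model(torch.randn(4, 512), torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 10])
```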
Through a combination of lectures, hands-on sessions, and collaborative discussions, AIMS2026 aims to foster a shared technical language for multimodal AI in science, build cross-domain connections among early-career researchers, and empower participants to apply state-of-the-art multimodal methods to their own research challenges.
Light breakfast, badge and gift distribution
Day 1: Three long-format talks (45 min + 5 min Q&A each), with short breaks.
Days 2 & 3: Two long-format talks, followed by four lightning talks by participants (15 min each).
Catered lunch and networking
Example topics:
Coffee, tea, juice, and light snacks
Hackathon (max ~25 participants) or Alternative Activities (max ~25 participants)
Room layout: Alternative Activities will take place in Room 15A, and the Hackathon in Room 15B.
See details below
Maximum ~25 participants
Purpose: Fun, hands-on, educational; light prizes for all participating teams.
Teams: 4–5 teams (4–5 participants each; max ~25). Teams may self-organize, or organizers can assist.
Resources: Starter datasets with train/test splits, baseline code to lower barriers (see the illustrative sketch after this list), and volunteer mentors for troubleshooting.
Objectives: Extend or improve an existing model.
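For illustration only, the following hedged sketch shows the flavor of baseline starter code a team might receive and then extend. The data, model choice, and metric are placeholders, not the actual hackathon materials.

```python
# Hypothetical hackathon baseline: train a simple classifier on a provided
# train split of fused modality features and report accuracy on the test split.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in for a provided train/test split of concatenated modality features,
# e.g. 32-dim image features + 8-dim sensor features per sample.
X_train = rng.normal(size=(200, 40))
y_train = rng.integers(0, 2, size=200)
X_test = rng.normal(size=(50, 40))
y_test = rng.integers(0, 2, size=50)

# Baseline teams could extend: better fusion, an added modality, a stronger model.
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("baseline test accuracy:", accuracy_score(y_test, baseline.predict(X_test)))
```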
Maximum ~25 participants
Bring-Your-Own Paper (participant-driven): Each participant selects a multimodal AI–relevant paper, drafts brief notes/questions (e.g., on fusion strategies, evaluation, failure modes), and shares at round tables. A moderator (organizer) ensures balanced discussion.
Led by a professional writing coach. Structured exercises, peer feedback, and short coaching rounds help participants draft or revise sections of proposals and research statements, with optional prompts focused on multimodal AI research (e.g., framing interdisciplinary datasets, highlighting integration of methods, or articulating broader impacts).
Participants draw prepared question cards with prompts on multimodal systems (e.g., reliability, architectures, applications). Each participant gives a brief response, followed by moderator input with accurate context and examples. This keeps everyone engaged, sparks peer learning, and ensures valuable takeaways.
Step inside the Goeddel Family Technology Sandbox and see state-of-the-art imaging and multimodal AI workflows in action. This tour is designed to spark cross‑university and cross‑department collaboration by showcasing shared infrastructure, open research challenges, and concrete pathways for joint multimodal systems projects. Refreshments will be served.
Informal sunset walk and ice cream at La Jolla beach.
The final day of AIMS2026 will conclude with a hackathon awards ceremony, featuring dynamic short demos, results presentations, and live audience voting, followed by the presentation of awards and special gifts. The workshop will then close with an evening dinner reception, creating a welcoming and celebratory atmosphere for participants to network, exchange ideas, and strengthen cross-disciplinary connections formed throughout the workshop.
Distinguished researchers and practitioners in AI and multimodal science
To be determined
To be determined
To be determined
To be determined
To be determined
To be determined
To be determined
University of California, San Diego
Seventh College – Tower West, 15th Floor, Room 15A
10176 Scholars Drive
La Jolla, CA 92093
USA
The workshop will be held at UC San Diego, one of the world's leading public research universities. The campus provides state-of-the-art facilities and a vibrant academic environment for interdisciplinary collaboration.
Explore the workshop venue with our interactive 360° virtual tour
Executive Lead Organizer
Scientific Program, Workshop Vision & Overall Coordination
Schmidt AI in Science Postdoctoral Fellow
Co-Vice Chair of Exposure to Industry Program, PDA
University of California, San Diego
Senior Scientific Advisor, Faculty Engagement & Partnerships Lead
Assistant Professor
Dr. David V. Goeddel Chancellor’s Endowed Chair in Biological Sciences
University of California, San Diego
Communications & Registration Lead
Schmidt AI in Science Postdoctoral Fellow
University of California, San Diego
Tutorials & Materials Lead
Schmidt AI in Science Postdoctoral Fellow
University of California, San Diego
Hackathon & Data Lead
Schmidt AI in Science Postdoctoral Fellow
University of California, San Diego
Event Logistics & Administrative Coordination
University of California, San Diego Staff
This workshop is sponsored by Schmidt Sciences in partnership with the University of California, San Diego.
This workshop is supported by the UC San Diego Postdoctoral Association (PDA) Executive Board and the Goeddel Family Technology Sandbox.