Protocol Learning: Towards True Open-Source AI


Abstract: In the current AI landscape, foundational models remain effectively closed-source, with innovation tightly controlled by a few well-funded corporations and dependent on their continued release of base model weight sets. Protocol Learning represents a bold new direction, enabling truly open-source AI through decentralized, low-bandwidth, multi-party model-parallel training. In this talk, Alexander Long, founder of Pluralis Research, will discuss the core research behind Protocol Learning, how it allows, for the first time, truly community-created and owned models, and the significant economic and geopolitical implications this has for the foundational model layer. He will also share insights from his journey founding Pluralis, including raising over AUD 11M from prominent US investors in the first funding round and assembling a tier-1 research group in Australia, and offer his perspective on the current state of venture capital in Australia, how founders can navigate it, and Australia's strategic position within the global AI ecosystem.



  Date and Time

  • Date: 17 Apr 2025
  • Time: 06:00 AM UTC to 07:00 AM UTC

  Location

  • University of Queensland, St Lucia Campus
  • Brisbane, Queensland, Australia 4072
  • Building: 63
  • Room Number: 348

  Registration

  • Starts 08 April 2025 02:00 PM UTC
  • Ends 17 April 2025 02:00 AM UTC
  • No Admission Charge


  Speakers

Alexander Long

Biography:

Alexander Long spent three years gaining familiarity with large-scale training dynamics at Amazon as an Applied Scientist. He spearheaded research on the structure of representation spaces in multimodal foundation models, work that was both published in tier-1 venues and applied directly at Amazon scale. His PhD focused on Nonparametric External Memory in Deep Learning and was the school nominee for the Malcolm Chaikin Prize (best thesis). He holds an M.Sc. from the Technical University of Munich and a BE from UQ, both awarded with highest honors.