Women in AI Series 2025 - Empowering LLMs for Scalable and Efficient Machine Programming: Guixiang Ma

#WIE #AI #LLM #lehigh #lehighvalleysection

Large language models (LLMs) have shown remarkable potential in various domains, particularly in code optimization and generation for machine programming. However, their performance in low-resource scenarios, such as parallel code generation, remains limited. In this talk, I will discuss our recent advances in enhancing LLMs' capabilities for addressing these challenges through agent-based approaches, fine-tuning techniques, and strategic data curation. Additionally, I will introduce our work on representation learning for code data, focusing on tasks such as compiler optimization. I will also demonstrate the use of machine learning methods for efficient workload partitioning in system optimization, which can in turn facilitate scalable distributed training and inference of large-scale machine learning models. Finally, I will share insights into future research directions in these areas.



  Date and Time

  • Date: 23 Sep 2025
  • Time: 11:00 PM UTC to 12:00 AM UTC

  Location

  • Virtual (attendance details available to registered attendees)

  Hosts

  • Contact Event Hosts

  Registration

  • Starts: 06 March 2025, 05:00 AM UTC
  • Ends: 23 September 2025, 04:00 AM UTC
  • No admission charge


  Speakers

Guixiang Ma of Intel

Biography:

Guixiang Ma is a Senior AI Research Scientist at Intel. She has six years of industrial research experience in artificial intelligence and machine learning and has led multiple initiatives on machine learning and systems research at Intel Labs. Her research focuses on the foundations of large-scale machine learning algorithms, with emphasis on the modeling and analysis of relational graph data, text data, dynamic/spatio-temporal data, and multimodal data, as well as system performance optimization. She has published over 40 peer-reviewed research papers in leading AI conferences and journals, including NeurIPS, MLSys, KDD, AAAI, ICML, and Nature Communications. She also holds two patents.