
About
ML Researcher making LLMs practical for real tasks. I have the good fortune of playing with AI models under the guidance of Professor Ameet Talwalkar and Valerie Chen.
Current: Research Scientist @ Apple
Email: rshar@cs.cmu.edu
Education
- MS Machine Learning, 2025
Carnegie Mellon University
Overall GPA: 4.0/4.0
Research: Human-AI Interactions for Programming Tasks
Relevant Coursework: Graduate Machine Learning, ML in Practice, Convex Optimization
- BSc Honours Computer Science, 2024
University of British Columbia
Overall GPA: 4.0/4.0
Honours Thesis: Improving the weighted error of sparse decision trees
Relevant Coursework: Advanced Machine Learning, Applied Machine Learning, NLP, Intelligent Systems, Advanced Databases, Machine Learning and Data Mining, Stochastic Processes
Papers
- When Benchmarks Talk: Re-Evaluating Code LLMs with Interactive Feedback (arXiv)
Jane Pan*, Ryan Shar*, Jacob Pfau, Ameet Talwalkar, He He, Valerie Chen
ACL, 2025
Experience
- Research Scientist
Apple
2025-Present
- Graduate Student Researcher
Carnegie Mellon University
2024-Present
- Teaching Assistant
University of British Columbia
2019-2024
- Undergraduate Research Assistant
University of British Columbia
2023-2024
- Firmware Developer
Motorola Solutions
2021-2022
Projects
- User Intent Benchmark
Carnegie Mellon University
- Analyzing effects of user ambiguity on LLM performance via simulated human interaction (methods to be published)
- Implemented a novel feedback technique with LLMs using proprietary models (GPT-4o, Claude Sonnet) and open-source Hugging Face models (Llama, Gemma, Qwen), achieving 25% higher task correctness (see the sketch after this list)
- Designed and conducted experiments to analyze the efficacy of simulated human feedback, showing statistically significant differences in outcomes
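Below is a minimal sketch of the kind of simulated-user feedback loop this project works with: one LLM proposes code, a second LLM plays the user and critiques it, and the first revises. The model name, prompts, and helper functions here are illustrative assumptions, not the to-be-published method.

    # Illustrative sketch only: a simulated "user" LLM critiques a coding
    # model's answer and the coding model revises it. Model names, prompts,
    # and round count are placeholder assumptions.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def chat(messages, model="gpt-4o"):
        resp = client.chat.completions.create(model=model, messages=messages)
        return resp.choices[0].message.content

    def solve_with_feedback(task, rounds=3):
        # Initial attempt at the task.
        solution = chat([{"role": "user", "content": f"Write Python code for: {task}"}])
        for _ in range(rounds):
            # A second LLM plays the (possibly ambiguous) user and gives feedback.
            feedback = chat([{"role": "user",
                              "content": f"You asked for: {task}\nYou received:\n{solution}\n"
                                         "As the user, give one short piece of feedback."}])
            # The coding model revises its solution in light of the feedback.
            solution = chat([{"role": "user",
                              "content": f"Task: {task}\nCurrent code:\n{solution}\n"
                                         f"User feedback: {feedback}\nRevise the code."}])
        return solution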
- Weighted Sparse Decision Trees
University of British Columbia
- Improved the weighted loss of decision trees (GOSDT model) with a novel sampling method using SciPy and NumPy (see the sketch after this list)
- Experimentally showed a 15% reduction in weighted loss and reduced loss variance compared to models without our sampling method
- Designed synthetic, weighted datasets with imbalanced and sparse distributions to represent real data seen in rare diseases and underrepresented populations
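As a rough illustration of weighted sampling ahead of tree fitting, the snippet below resamples training points in proportion to their weights and evaluates a weighted 0-1 loss. It uses scikit-learn's DecisionTreeClassifier as a stand-in for GOSDT, and the generic importance-style resampler is an assumption, not the thesis's sampling method.

    # Illustrative sketch: resample a training set according to per-example
    # weights before fitting a tree, then score weighted 0-1 loss.
    # sklearn's DecisionTreeClassifier stands in for GOSDT here.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def weighted_resample_fit(X, y, sample_weights, seed=0):
        rng = np.random.default_rng(seed)
        probs = sample_weights / sample_weights.sum()
        # Draw examples with probability proportional to their weight.
        idx = rng.choice(len(X), size=len(X), replace=True, p=probs)
        clf = DecisionTreeClassifier(max_depth=3, random_state=seed)
        clf.fit(X[idx], y[idx])
        return clf

    def weighted_loss(clf, X, y, sample_weights):
        # Weighted 0-1 loss: each misclassified example contributes its weight.
        errors = (clf.predict(X) != y).astype(float)
        return float(np.dot(errors, sample_weights) / sample_weights.sum())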
- Class-Based VAE (GitHub)
University of British Columbia
- Created a novel statistical model for CPSC 440 (Advanced Machine Learning) (see the sketch below)
- Achieved improved reconstruction error for scarce labels with lower variance
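For context, here is a compact sketch of a class-conditional VAE in PyTorch, in the spirit of the project above; the architecture, layer sizes, and one-hot conditioning scheme are assumptions for illustration, not the actual CPSC 440 model.

    # Illustrative sketch: a VAE whose encoder and decoder are conditioned on
    # the class label (concatenated as a one-hot vector). Sizes are arbitrary.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ClassConditionalVAE(nn.Module):
        def __init__(self, x_dim=784, n_classes=10, z_dim=16, h_dim=256):
            super().__init__()
            self.n_classes = n_classes
            self.enc = nn.Sequential(nn.Linear(x_dim + n_classes, h_dim), nn.ReLU())
            self.mu = nn.Linear(h_dim, z_dim)
            self.logvar = nn.Linear(h_dim, z_dim)
            self.dec = nn.Sequential(nn.Linear(z_dim + n_classes, h_dim), nn.ReLU(),
                                     nn.Linear(h_dim, x_dim))

        def forward(self, x, y):
            # Condition both encoder and decoder on the one-hot class label.
            y1h = F.one_hot(y, self.n_classes).float()
            h = self.enc(torch.cat([x, y1h], dim=1))
            mu, logvar = self.mu(h), self.logvar(h)
            # Reparameterization trick.
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            x_hat = self.dec(torch.cat([z, y1h], dim=1))
            recon = F.binary_cross_entropy_with_logits(x_hat, x, reduction="sum")
            kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
            return recon + kl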