Curriculum Vitae (Rev 2025.1)

IVAN LEON

whoami: Cognitive Scientist / Engineer
Location: San Francisco, CA
Experience
Datacurve - Principal Engineer

I joined Datacurve as a principal engineer to build its infrastructure and sandboxing environments for user-submitted code and projects, along with swarms of agent systems that evaluate and judge submissions. I designed and implemented secure sandboxes for arbitrary code execution, giving users around the world real-time compilation and runtime feedback, and enabled custom agents to run in those sandboxes to evaluate and correct submissions across several axes. The result is frontier coding data that foundation model labs use to improve the quality of LLMs on programming tasks.

Monoql - Founder | CEO

I built Monoql to empower mental health providers with AI tools to save time, reduce burnout, and improve the quality of patient care. The healthcare industry has historically lacked automation tools that facilitate patient workflows. Monoql aims to change that by integrating directly into electronic health records (EHRs) and augmenting visibility into a patient's mental health history.

Products
Oracle - Patient History Summarization
Copilot - Reporting Automation Tools
OmniGraph - Patient Relationship Graph
Guidance Analytics - Principal Engineer

I joined Guidance Analytics to design and build its healthcare analytics platform. I hired and worked with a team of talented engineers to build the Abacus platform and the DataLux data visualization tool. Our goal was to give healthcare organizations a platform to analyze and visualize their data and make better decisions, and to guide them on their journey toward improving value-based care.

Abacus is a Rust project built with Cargo Workspaces, enabling us to build the core analytics platform as well as various auxiliary tools such as a web server and a remote CLI tool to control it. It handles all large-scale data processing for Guidance Analytics and has modules for time-series analysis, cluster analysis, and natural language processing.
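To give a flavor of those analytics modules, here is a minimal, self-contained Rust sketch of a moving-average helper, the kind of time-series primitive a core crate like this builds on; the function name and signature are illustrative, not the actual Abacus code.

    /// Simple moving average over a sliding window: a basic
    /// building block for a time-series analysis module.
    fn moving_average(values: &[f64], window: usize) -> Vec<f64> {
        values
            .windows(window) // overlapping slices of length `window`
            .map(|w| w.iter().sum::<f64>() / w.len() as f64)
            .collect()
    }

    fn main() {
        let series = [1.0, 2.0, 3.0, 4.0, 5.0];
        // Prints [2.0, 3.0, 4.0]
        println!("{:?}", moving_average(&series, 3));
    }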

Meta - Data Scientist

I joined Meta as a data scientist on the Messenger Notifications platform. I was responsible for building and maintaining machine learning models that predict user engagement with notifications, along with the data infrastructure that supports those models and the pipelines that collect and process the underlying data.

I worked with a team of talented data scientists, engineers, and designers to redefine the way we interact with notifications, integrating data-driven insights to make them less intrusive and more engaging.

WeWork - Applied Science Researcher

I joined WeWork pre-IPO to build tools for generative space planning, automating its design process whenever a property was acquired. We worked closely with architects and designers to build AI tools that integrated directly into their workflow and augmented their ability to design spaces that not only deterministically met building-code requirements and business projections, but also felt human-centric and good to be in.

I'm proud to say that if you've worked at a WeWork building, you've likely been in a space my tools have helped build.