Alyssa Mia Taliotis
Mathematician & AI Researcher
Alyssa Mia Taliotis is a mathematician and AI researcher advancing the frontiers of machine learning, agentic systems, and embodied intelligence. She is pursuing a Master’s in Data Science at Harvard University and deepening her AI research at MIT, where she focuses on building intelligent systems that reason, adapt, and act across industrial, scientific, and infrastructure domains. She also serves as a Teaching Fellow in Statistics at Harvard.
Alyssa’s work spans deep learning, statistical inference, reinforcement learning, computer vision, and computational modeling. Her projects range from autonomous robotic architectures and CAD-driven manufacturing intelligence to multimodal neural networks and privacy-preserving AI. She has developed scalable memory layers for agentic systems, engineered full-stack AI platforms, and applied advanced modeling across logistics, finance, and industrial automation.
Previously, Alyssa graduated as Valedictorian from the University of Manchester with a BSc (Hons) in Mathematics, receiving the Outstanding Academic Achievement Award from Dame Nancy Rothwell and the Mathematics Excellence Award. She is also an Exness Fintech Scholar, recognized for innovation at the intersection of mathematics, finance, and AI.
Originally from Paphos, Cyprus, Alyssa brings a disciplined, high-performance mindset shaped by classical ballet and international leadership. Her mission is to build transformative, globally scalable AI systems driving the next generation of industrial and economic progress.
Developed a method for adaptive boundary detection in neural fields, enhancing edge preservation and discontinuity modeling in 2D representations. The approach eliminates the need for explicit meshing, improving the efficiency and accuracy of neural implicit representations in computer vision tasks.
Designed a multimodal neural network architecture that integrates diverse medical imaging modalities (e.g., MRI, CT, X-ray) to improve diagnostic accuracy and interpretability. The framework leverages cross-modal attention mechanisms to enhance feature extraction and representation learning across heterogeneous data sources.
Evaluated privacy-utility-fairness trade-offs in ICU mortality models using output perturbation, DP-XGBoost, and DP-SGD to determine the clinical feasibility of differential privacy (DP).
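Of the three mechanisms, output perturbation is the simplest to illustrate. A minimal sketch of the Laplace mechanism, with a toy mortality-rate statistic standing in for the actual model outputs (the sensitivity and epsilon values here are illustrative assumptions, not the study's settings):

```python
import numpy as np

def laplace_output_perturbation(value, sensitivity, epsilon, rng=None):
    """Add Laplace noise with scale sensitivity/epsilon to satisfy epsilon-DP."""
    rng = rng or np.random.default_rng(0)
    return value + rng.laplace(0.0, sensitivity / epsilon)

# Toy example: privatize a mortality-rate estimate over n patients.
# For a mean of 0/1 outcomes, the L1 sensitivity is 1/n.
n = 1000
true_rate = 0.12
noisy = laplace_output_perturbation(true_rate, sensitivity=1.0 / n, epsilon=1.0)
```

Larger epsilon (or larger cohorts) shrinks the noise scale, which is exactly the privacy-utility trade-off the evaluation measures.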
Developed an interpretable multimodal model to support clinicians in diagnosing pediatric appendicitis, leveraging concept-based reasoning to enhance transparency, reduce diagnostic uncertainty, and build trust in AI-assisted clinical decision-making.
Researched agentic AI-driven medical intelligence to optimize treatment strategies for patients with complex comorbidities, enabling cross-specialty coordination and personalized care. Conducted under the mentorship of the Mitsubishi Electric Innovation Centre.
Applied Deep Q-Networks (DQN) and Proximal Policy Optimization (PPO) to train AI agents for strategic gameplay in Gomoku. Designed custom reward shaping and training environments to improve long-term decision-making, enabling the agent to learn competitive, human-level strategies through self-play.
[Python, PyTorch, Gym, Deep Q-Network (DQN), Proximal Policy Optimization (PPO), NumPy]
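The project trained function-approximated agents in PyTorch; the core DQN ideas can be sketched dependency-light in NumPy, with an array of Q-values standing in for the network's output on one Gomoku board state (the numbers below are illustrative):

```python
import numpy as np

def epsilon_greedy(q_values, epsilon, rng):
    """Explore with probability epsilon, otherwise take the greedy action."""
    if rng.random() < epsilon:
        return int(rng.integers(len(q_values)))
    return int(np.argmax(q_values))

def dqn_target(reward, next_q_values, gamma, done):
    """Bellman target the online Q-network is regressed toward."""
    return reward if done else reward + gamma * float(np.max(next_q_values))

rng = np.random.default_rng(42)
q = np.array([0.1, 0.5, 0.2])        # Q-values for the legal moves in one state
action = epsilon_greedy(q, epsilon=0.0, rng=rng)   # greedy choice
target = dqn_target(reward=1.0, next_q_values=q, gamma=0.99, done=False)
```

Reward shaping, as used in the project, amounts to replacing the raw win/loss `reward` with denser intermediate signals so the Bellman targets propagate useful gradients earlier in self-play.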
Designed a multimodal deep learning framework to automate CAD-based prosthetic socket modifications by learning complex geometric adjustments from raw anatomical data. The system leverages 3D point clouds, clinician-generated annotations, and multimodal inputs to produce scalable, personalized prosthetic designs. Conducted in collaboration with Rise Bionics to accelerate high-quality, affordable prosthetic manufacturing.
Core developer on the MIT team behind NANDA (Networked Agents and Decentralized AI), the world’s first Internet of AI Agents. NANDA builds on the Model Context Protocol (MCP) and a decentralized memory infrastructure to enable autonomous, modular, and interoperable agent ecosystems. The initiative pioneers AI-native internet architectures supporting real-time collaboration, memory composition, and identity persistence.
[JavaScript, TypeScript, Python, FastMCP, SSE, Starlette, Uvicorn, Claude, LangChain, JSON Routing]
Built an AI-powered logistics planning platform for real-time, multimodal freight routing. Tavi lets users input origin, destination, product, and priority (e.g., fastest, cheapest, most sustainable) and returns optimized shipment routes using a graph-powered backend. The system integrates live transport data (e.g., port status, rail availability) and supports disruption-aware rerouting while preserving original route plans for comparison.
[React, Tailwind CSS, React Leaflet, FastAPI, CrewAI, GeoJSON, OpenStreetMap, Multimodal Graph Routing]
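The priority-aware routing step can be sketched as shortest-path search over a multimodal graph whose edges carry one weight per metric; the hub names and figures below are hypothetical, and the production system's graph backend is more elaborate than this Dijkstra sketch:

```python
import heapq

def best_route(graph, origin, dest, weight="cost"):
    """Dijkstra over a multimodal graph; each edge stores per-metric weights."""
    pq, seen = [(0.0, origin, [origin])], set()
    while pq:
        total, node, path = heapq.heappop(pq)
        if node == dest:
            return total, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, metrics in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(pq, (total + metrics[weight], nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical network: a sea leg into Hamburg, then a rail leg onward.
graph = {
    "Rotterdam": {"Hamburg": {"cost": 120.0, "hours": 30.0}},
    "Hamburg":   {"Munich":  {"cost": 80.0,  "hours": 7.0}},
}
cost, path = best_route(graph, "Rotterdam", "Munich", weight="cost")
```

Switching `weight` between `"cost"` and `"hours"` is how a single graph serves the cheapest/fastest priorities; sustainability would be one more edge metric.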
Built a lightweight, modular memory operating system that supports agent interoperability across local and distributed settings. Enables persistent, queryable memory containers with identity tracking and customizable data retrieval for both LLMs and agentic workflows.
[Python, FastAPI, SQLite, JSON]
Developed multiple plug-and-play MCP-compliant servers demonstrating specialized agent workflows.
[Python, FastAPI, Starlette, SSE, Claude, JSON Routing, REST APIs]
Developed a full-stack, open-source web application that enables patients to input their medical history and receive real-time feedback on potential medication interactions. Designed to improve medication safety and accessibility through an intuitive interface and intelligent backend processing.
[Next.js, React, TypeScript, Python, FastAPI, SQLite, Ngrok]
Engineered the backend architecture for an agentic AI system that analyzes patient histories and clinical reports to identify medical conflicts across comorbid conditions. Enabled nuanced detection of drug-condition conflicts and specialty-level insights using fine-tuned language models and structured EMR data.
[Python, CrewAI, LangChain, OpenAIEmbeddings, FAISS, RAG, SQLite, Pandas, JSON Caching]
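The RAG retrieval step reduces to nearest-neighbor search over embeddings. A toy sketch with 3-d vectors standing in for OpenAIEmbeddings output and FAISS replaced by brute-force cosine similarity (the real index scales this same ranking):

```python
import numpy as np

def top_k(query_vec, doc_vecs, k=2):
    """Rank document embeddings by cosine similarity to the query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = d @ q
    order = np.argsort(-sims)[:k]
    return order.tolist(), sims[order].tolist()

# Toy 3-d "embeddings" for three clinical-note chunks.
docs = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.9, 0.1, 0.0]])
idx, scores = top_k(np.array([1.0, 0.0, 0.0]), docs, k=2)
```

The retrieved chunks are then passed to the fine-tuned language model as context, which is what grounds conflict detection in the patient's actual EMR rather than the model's priors.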
Developed an agentic AI system for autonomous ultrasound triage and diagnostic guidance in space environments. The system leverages real-time reasoning, multimodal perception, and context-aware agents to assist astronauts with non-invasive diagnostics when direct medical supervision is unavailable. Designed for zero-gravity usability and resilient communication protocols.
[Python, OpenCV, Ultrasound Imaging, LangChain, Multi-Agent Orchestration, MCP, JSON Routing]
Partially developed at Harvard x Anthropic Hackathon 2025
Built an ensemble learning framework to predict heart disease risk across the European population, incorporating geographical and demographic data. Leveraged model interpretability techniques to identify region-specific risk factors and key clinical predictors, enabling more targeted public health strategies.
[Python, Scikit-learn, XGBoost, Random Forests, Pandas, Matplotlib, Geopandas]
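The ensembling step can be illustrated with a soft-voting combiner; the per-model probabilities below are made up, standing in for calibrated outputs of the XGBoost, random forest, and baseline models:

```python
import numpy as np

def soft_vote(prob_list, weights=None):
    """Weighted average of per-model risk probabilities, thresholded at 0.5."""
    probs = np.asarray(prob_list)                 # shape (n_models, n_samples)
    w = np.ones(len(probs)) if weights is None else np.asarray(weights)
    avg = (w[:, None] * probs).sum(axis=0) / w.sum()
    return avg, (avg >= 0.5).astype(int)

# Hypothetical heart-disease risk scores from three models on three patients.
p_models = [[0.80, 0.30, 0.55],
            [0.70, 0.20, 0.40],
            [0.90, 0.10, 0.60]]
avg, labels = soft_vote(p_models)
```

Averaging probabilities rather than hard votes keeps the ensemble's output a risk score, which is what the interpretability analysis of region-specific predictors operates on.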
Applied causal inference techniques to assess the efficacy of melatonin in a double-blind, placebo-controlled randomized clinical trial (RCT). Estimated average treatment effects while adjusting for potential confounders, enabling a robust evaluation of melatonin's impact on sleep outcomes in patients with primary insomnia.
Conducted time series analysis on Australian dry white wine sales using the Box-Jenkins methodology. Performed log transformation, seasonal differencing, and model identification to fit ARIMA/SARIMA models. Evaluated model assumptions via ACF/PACF and residual diagnostics, and validated forecasts against 1985 holdout data to assess predictive performance.
[R, ARIMA, SARIMA, ACF/PACF, Residual Diagnostics, Time Series Decomposition]
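The project was done in R; the pre-processing stage translates directly to a short NumPy sketch, shown here on synthetic monthly data with a purely multiplicative seasonal pattern (so the log transform plus lag-12 difference removes it exactly):

```python
import numpy as np

def seasonal_difference(series, season=12, log=True):
    """Box-Jenkins pre-processing: log-transform, then lag-`season` difference."""
    x = np.log(series) if log else np.asarray(series, dtype=float)
    return x[season:] - x[:-season]

# Two years of toy monthly sales repeating one multiplicative seasonal pattern.
pattern = np.array([1.0, 1.2, 0.9, 1.1, 1.3, 0.8,
                    1.0, 1.2, 0.9, 1.1, 1.3, 0.8])
sales = np.concatenate([100 * pattern, 100 * pattern])
diffed = seasonal_difference(sales, season=12)
```

On real sales data the differenced series is stationary rather than zero, and its ACF/PACF shapes guide the choice of SARIMA orders.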
Analyzed the Old Faithful geyser dataset using Gaussian Mixture Models (GMMs) to uncover latent eruption patterns based on eruption duration and waiting times. Estimated mixture model parameters using the Expectation-Maximization (EM) algorithm and determined the optimal number of clusters using Bayesian Information Criterion (BIC). Compared GMM results with k-means clustering to evaluate modeling differences and cluster interpretation.
[R, mclust, MASS, Gaussian Mixture Models, EM Algorithm, BIC, K-Means Clustering, Data Visualization]
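The analysis used R's mclust; the EM iteration itself fits in a few lines of NumPy. A sketch for a two-component 1-D mixture, run on synthetic data mimicking the short/long eruption-duration modes (the real dataset is 2-D, over duration and waiting time):

```python
import numpy as np

def em_gmm_1d(x, iters=50):
    """EM for a two-component 1-D Gaussian mixture: means, stds, weights."""
    x = np.asarray(x, dtype=float)
    mu = np.array([x.min(), x.max()])            # spread-out initialization
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities (normalization cancels the 1/sqrt(2*pi)).
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the soft assignments.
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        pi = nk / len(x)
    return mu, sigma, pi

# Toy bimodal sample: short (~2.0 min) vs. long (~4.3 min) eruptions.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(2.0, 0.2, 150), rng.normal(4.3, 0.3, 200)])
mu, sigma, pi = em_gmm_1d(x)
```

Unlike k-means, the soft responsibilities let the two clusters have different widths and weights, which is exactly the modeling difference the project's comparison highlights; BIC then scores candidate component counts.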