About Me
I am a results-driven Analyst and Engineer specialising in building automated data solutions. With a First Class Honours degree in Mathematics from Imperial College London, I combine rigorous quantitative skills with full-stack development expertise in Python and Flask and hands-on experience with modern AI models such as Gemini and Claude. I excel at transforming complex manual workflows into efficient, scalable systems that deliver measurable value and significant cost savings for clients in the finance, legal, and corporate sectors.
My approach pairs mathematical rigour with practical engineering. I turn time-intensive manual processes into intelligent automated systems, typically cutting processing time by 80-95% while maintaining or improving accuracy. Whether working with investment firms on ESG compliance or with legal teams on regulatory analysis, I focus on solutions that scale efficiently and deliver immediate, measurable ROI.
How I Work
Analyse & Understand
Deep dive into existing workflows to identify bottlenecks and automation opportunities
Build & Iterate
Develop modular solutions with continuous testing and stakeholder feedback
Deploy & Scale
Implement robust systems with monitoring and documentation for long-term value
Featured Projects
Modern Slavery Benchmark Automation
Live · CCLA Investment Management
Developed a Gemini-powered scoring system that achieved 95% accuracy against manual assessments, reducing a half-year benchmarking process to just weeks per project.
Claims Assessment Web Tool
In Development · Corporates
Created a web application that analyses corporate documents for unsubstantiated ESG claims, generating risk assessment reports with actionable mitigation steps.
Maritime Regulation Compliance Benchmark
Completed · Law Firms
Built an automated Python system to analyse c.500 regulations, extracting and structuring over 5,000 key compliance insights for clients within two weeks.
Net Zero Regulatory Analysis System
Completed · UN PRI
Implemented an LLM-based classification system to analyse over 1,000 climate policy instruments, reducing processing costs by 80% through advanced caching algorithms.
Core Skills
Programming & Development
AI / ML
Data Engineering & Infrastructure
Technical Showcase
# Example: Efficient document processing pipeline
from concurrent.futures import ThreadPoolExecutor, as_completed
from typing import List

def process_documents(docs: List[Document]) -> AnalysisResult:
    """
    Demonstrates multi-threaded processing with intelligent caching.
    Reduces API calls by 80% through smart deduplication.
    """
    with ThreadPoolExecutor(max_workers=8) as executor:
        # Process documents in parallel; cached results skip the API entirely
        futures = [executor.submit(analyse_with_cache, doc) for doc in docs]
        results = [f.result() for f in as_completed(futures)]
    return aggregate_results(results)
This pattern achieves 8x speedup for document processing while maintaining thread safety and reducing API costs.
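To illustrate the deduplication side of the pattern, here is a minimal sketch of what a helper like analyse_with_cache could look like. The doc.text attribute, the call_analysis_api function, and the in-memory cache are assumptions made for the example: identical document content hashes to the same key, so repeated content never triggers a second API call.

import hashlib
from typing import Dict

# In-memory cache keyed by content hash (a persistent store could be swapped in)
_cache: Dict[str, AnalysisResult] = {}

def analyse_with_cache(doc: Document) -> AnalysisResult:
    # Hash the document text so identical content maps to the same cache key
    key = hashlib.sha256(doc.text.encode("utf-8")).hexdigest()
    if key in _cache:
        return _cache[key]  # deduplicated: no API call made
    result = call_analysis_api(doc)  # hypothetical external model call
    _cache[key] = result
    return result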
Career & Education
Dec 2023 - Present
Analyst
Canbury Insights, London
Nov 2023
Professional Certifications
- Financial Engineering and Risk Management - Columbia University
- Financial Markets (with Honours) - Yale University
Oct 2019 - Oct 2023
MSci Mathematics, First Class Honours
Imperial College London