Analytical and detail-oriented professional with a strong foundation in research reporting, data analysis, and performance measurement. Experienced in interpreting complex datasets to support strategic planning and enhance institutional research outcomes. Proficient in Power BI, SQL, and Google Cloud Platform (GCP), with expertise in data visualization and performance metrics. Skilled at collaborating with cross-functional teams and academic leadership to ensure compliance, benchmarking, and continuous improvement. Deep understanding of higher education trends and research evaluation frameworks. Highly organized, self-motivated, and dedicated to delivering actionable insights that drive innovation and strategic excellence.
At Nido Early School, I support children’s learning and development by creating meaningful, child-centred experiences inspired by the Reggio Emilia philosophy. I foster a nurturing and inclusive environment where children feel confident to express themselves, explore their surroundings, and build strong, respectful relationships. Through thoughtful observation and documentation, I tailor learning opportunities to each child’s interests, strengths, and developmental needs, encouraging their natural curiosity and sense of agency. I actively engage in setting up aesthetically pleasing and purposeful learning spaces, both indoors and outdoors, that invite discovery and collaboration. I work closely with families, recognising them as partners in their child’s learning journey, and communicate regularly to ensure consistent support. I also collaborate with my team to reflect on our practices, contribute to planning, and uphold the Early Years Learning Framework, National Quality Standards, and centre values.
Built a machine learning classification system for a digital archaeology archive (Trendall archive) using Python.
Parsed and cleaned 40,000+ records from unstructured text files, achieving 80% accuracy in classification (illustrative sketch below).
Migrated data processing to GCP Compute Engine for improved scalability and utilized Cloud Storage for artifact storage.
Implemented ML models using GCP AI Platform and deployed classification APIs using Cloud Functions (illustrative sketch below).
Set up automated data pipelines with GCP Dataflow to continuously process new archaeological records (illustrative sketch below).
Employed Agile methodology and collaborated with stakeholders using Jira and GitHub.
Delivered a research portal interface for historians to explore dataset classifications, hosted on GCP App Engine.
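The parsing and classification work can be illustrated with a minimal sketch: cleaning raw record text, then training a text classifier with scikit-learn. The file name, column names, and model choice below are assumptions for illustration, not the Trendall archive's actual schema or code.

```python
# Illustrative sketch only: clean unstructured records and train a text classifier.
# "records.csv", its column names, and the model choice are hypothetical.
import csv
import re

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline


def clean_record(raw: str) -> str:
    """Normalise an unstructured record: strip stray markup, collapse whitespace."""
    text = re.sub(r"<[^>]+>", " ", raw)        # drop any HTML/XML remnants
    text = re.sub(r"\s+", " ", text).strip()   # collapse runs of whitespace
    return text.lower()


def load_records(path: str) -> tuple[list[str], list[str]]:
    """Load (text, label) pairs from a CSV with 'description' and 'category' columns."""
    texts, labels = [], []
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            texts.append(clean_record(row["description"]))
            labels.append(row["category"])
    return texts, labels


if __name__ == "__main__":
    texts, labels = load_records("records.csv")  # hypothetical export of the archive
    X_train, X_test, y_train, y_test = train_test_split(
        texts, labels, test_size=0.2, random_state=42
    )
    model = Pipeline([
        ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=2)),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    model.fit(X_train, y_train)
    print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```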
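A classification API on Cloud Functions could look roughly like the following sketch (Python runtime with functions-framework). The model file name and request format are assumptions, not the project's actual API.

```python
# Illustrative sketch only: an HTTP classification endpoint deployable to Cloud Functions.
# "model.joblib" and the {"text": ...} payload shape are hypothetical.
import json

import functions_framework
import joblib

# Load a previously trained scikit-learn pipeline once per instance (at cold start),
# e.g. a model file bundled with the deployment or fetched from Cloud Storage.
_model = joblib.load("model.joblib")


@functions_framework.http
def classify(request):
    """Accept JSON {"text": "..."} and return the predicted category as JSON."""
    payload = request.get_json(silent=True) or {}
    text = payload.get("text", "")
    if not text:
        return json.dumps({"error": "missing 'text' field"}), 400
    label = _model.predict([text])[0]
    return json.dumps({"category": label}), 200
```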
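The continuous processing pipeline can be sketched with Apache Beam, which is the SDK Dataflow runs; the project id, bucket paths, and the classify_record helper are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal Apache Beam pipeline for Dataflow that reads new
# record files, classifies each line, and writes the results back to Cloud Storage.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def classify_record(line: str) -> str:
    """Placeholder classification step; a real pipeline would apply the trained model."""
    category = "red-figure" if "red" in line.lower() else "unclassified"
    return f"{category}\t{line.strip()}"


def run():
    options = PipelineOptions(
        runner="DataflowRunner",             # use "DirectRunner" to test locally
        project="my-gcp-project",            # hypothetical project id
        region="australia-southeast1",
        temp_location="gs://my-bucket/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRecords" >> beam.io.ReadFromText("gs://my-bucket/incoming/*.txt")
            | "Classify" >> beam.Map(classify_record)
            | "WriteResults" >> beam.io.WriteToText("gs://my-bucket/classified/records")
        )


if __name__ == "__main__":
    run()
```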
Python
SQL (strong)
Power BI
MS Excel
Tableau
JIRA
Statistical analysis
Analytical thinking
Time management