Curriculum Vitae

George Flint

georgeflint@berkeley.edu | georgeflint.com

Education

University of California, Berkeley

Bachelor of Arts in Cognitive Science, GPA: 3.9 / 4.0

Berkeley, CA, Class of 2026

Relevant Coursework: Advanced Syntax, Compositionality Seminar, Computational Models of Cognition, Linear Algebra and Differential Equations, Biological Psychology, Cognitive Neuroscience, Computational Neuroscience, Probability and Statistics, Advanced Programming in R

Publications

George Flint, Anna Ivanova*. Primitive linguistic compositionality in a Hebbian Neural Network. Cognitive Science (Proceedings). [Link]

Aalok Sathe, George Flint, Evelina Fedorenko*, Noga Zaslavsky*. Language use is only sparsely compositional: The case of English adjective-noun phrases in humans and large language models. Cognitive Science (Proceedings). [Link]

George Flint, Anna Ivanova. Testing a Distributional Semantics Account of Grammatical Gender Effects on Semantic Gender Perception. Cognitive Science (Proceedings). [Link]

Dominic Domingo, Aryan Bandi, Arya Kunisetty, Ahan Banerjee, George Flint*, Kevin Zhu*. Testing Evolutionary and Reinforcement Learning Approaches to Traffic Flow Optimization in SUMO. AAAI Workshop on AI for Urban Planning. [Link]

* = senior author

Preprints

George Flint, Kaustubh Kislay. Quantifying Phonosemantic Iconicity Distributionally in 6 Languages. In review (AACL). [Link]

Kaustubh Kislay, Shlok Singh, Soham Joshi, Rohan Dutta, Jay Shim, George Flint*, Kevin Zhu*. Evaluating K-Fold Cross Validation for Transformer Based Symbolic Regression Models. arXiv. [Link]

Research Experience

Lead Machine Learning Researcher — Language, Intelligence, and Thought Lab, Georgia Institute of Technology (PI: Anna Ivanova)

Remote, May 2024 – Present

Sole theorist and developer of a computational Hebbian (associational) neural-network model of primitive linguistic mappings and compositionality.

Achieved high performance on label-to-image and image-to-label reconstruction; with compositional images and labels, demonstrated proof-of-concept reconstruction of out-of-distribution (OOD) compositions.

Currently developing a primitive recurrent language model with a purely predictive-coding architecture.

Project Manager, Lead Machine Learning Researcher — Launchpad (UC Berkeley)

Berkeley, CA, August 2024 – January 2025

Lead theorist and developer for a quantum machine learning project with 8 engineers; developed a theoretical model and preliminary implementation of an entanglement-based attention operation with subquadratic time complexity.

Organized work distribution, community events, and education on transformer architectures and quantum computing concepts for a team with no prior background in either area.

Lead Computational Linguistics Researcher — EvLab, Massachusetts Institute of Technology (PI: Evelina Fedorenko)

Cambridge, MA, May 2023 – August 2024

Lead theorist and developer on a study investigating linguistic relativity in humans and in distributional semantic space.

Designed behavioral experiments (n=500), implemented distributional experiments, and found positive relativity effects in Spanish and German (distributional) and in Spanish (behavioral).

Teaching Experience

Head of Education — Launchpad (UC Berkeley)

Berkeley, CA, May 2024 – December 2024

Designed and taught ML lectures and workshops covering NLP, GANs, CNNs/ResNets, autoencoders, RNNs, transformers, LLMs/CLIP, MLPs, NeRFs, deep RL, evolutionary algorithms, diffusion models, MCTS, alignment, ethics, and interpretability.

Designed and administered a technical interview on Haar cascades for object detection.

Mentor, Machine Learning Researcher — Algoverse

Remote, June 2024 – Present

Mentor to 20 student teams (6 current) on ML research projects targeting conference/workshop publications.

Taught core ML concepts and research practices to high school and college students with limited prior experience.

Course Instructor — UC Berkeley

Berkeley, CA, January 2023 – May 2023

Created, taught, and managed a new Linguistics 198 course on linguistic relativity for 50+ students.

Technical Skills

Programming Languages: Python, Taichi Lang, R, JavaScript, HTML/CSS, PHP, Ruby, SQL, Bash

Frameworks & Libraries: PyTorch, NeuroTorch, TensorFlow, Qiskit, PennyLane, TensorFlow Quantum, Optuna, pandas, NumPy, Matplotlib, seaborn, scikit-learn, FastAPI, FastText, Gensim, jQuery, praat-parselmouth, Rails