I'm a Computer Science Ph.D. student at the University of Memphis.
I work in the Human-SE Lab with Scott Fleming, researching software engineering and human-computer interaction. In particular, I investigate how tools can provide better cognitive support to software engineers while debugging.
Previously, I worked as a research intern at Microsoft Research in 2016 and at National Instruments in 2015 and 2014. I also worked as a software engineering intern at First Tennessee Bank in 2012. I graduated with an M.S. from the University of Memphis in 2013 and a B.S. from Austin Peay State University in 2011. During undergrad, I made Flash games that you can still find all over the web.
Information foraging theory adapts biological models of how animals forage for food to explain how humans seek information. We aim to apply these models to software development tools. This work is done in collaboration with Oregon State University and IBM Research.
Very little is known about how programmers change and refactor visual code. After performing initial interviews and an exploratory user study, we designed Yestercode to help programmers make manual code changes, and our evaluation showed a number of benefits.
Programmers spend a lot of time navigating large bodies of code. We developed a new code editor, Patchworks, that lets you easily juxtapose code fragments on a never-ending ribbon. Two user studies and a simulation study showed that Patchworks improves navigation.
CHI preview video (30 seconds), supplementary video (1:55)
These are projects that I did for fun or school. Most of them can be found on my GitHub.
Content management systems and static blog generators usually provide more features than I need. This generator creates a website given a folder of Markdown files and a Bootstrap theme, while also allowing you to insert dynamic scripts and a comment system. (source)
W3C's HTML Validator reports any warnings or errors with a page's HTML. After trying several C# APIs, I made my own. (source)
For a class project, we were assigned to write a program that simply crawls a few hundred sites. I designed mine to crawl hundreds of thousands of pages instead: it uses threading, detects spider traps and dynamic URLs, parses HTML correctly, analyzes HTML errors, tokenizes and stems the text, keeps relevant documents and images, and computes TF-IDF and PageRank. (source available on request)
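TF-IDF is the piece of that pipeline most easily shown in a few lines. As a rough illustration (not the project's actual code), the weight of a term in a document is its frequency in that document scaled by how rare it is across the corpus:

```python
import math

def tf_idf(term, doc, corpus):
    """Weight of `term` in `doc`, where doc and corpus entries are token lists."""
    # term frequency: share of the document's tokens that are this term
    tf = doc.count(term) / len(doc)
    # inverse document frequency: rare terms across the corpus score higher
    df = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / df)
    return tf * idf
```

A term that appears in every document gets an IDF of log(1) = 0, so ubiquitous words contribute nothing to ranking.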
A friend of mine needed dynamic copy and paste: each time you paste, the content is different. You set variables on your clipboard and define how they change with each paste. (source)
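The mechanism can be sketched as a template plus mutable variables that advance on every paste. This is a hypothetical, simplified model (the class name and the increment-on-paste rule are illustrative, not the tool's real behavior):

```python
class DynamicClipboard:
    """Clipboard whose pasted text changes on each paste."""

    def __init__(self, template, variables):
        self.template = template
        self.vars = dict(variables)

    def paste(self):
        # Render the current state, then advance each variable for next time.
        # Here the update rule simply increments integer variables.
        text = self.template.format(**self.vars)
        self.vars = {k: v + 1 if isinstance(v, int) else v
                     for k, v in self.vars.items()}
        return text
```

So pasting `"item {n}"` with `n = 1` yields `item 1`, then `item 2`, and so on, without touching the clipboard between pastes.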
I developed a system that uses neural networks to determine the current state of 9 independent facial features from a video feed. This information is encoded as a series of bits, which is then transferred over a network to another application that reconstructs the facial expression in real time as a cartoon avatar.
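Encoding 9 on/off feature states as bits keeps each network message tiny. A minimal sketch of that packing step (illustrative names, assuming one boolean per feature; not the system's actual wire format):

```python
def pack_features(states):
    """Pack a sequence of 9 booleans (one per facial feature) into 2 bytes."""
    value = 0
    for i, on in enumerate(states):
        if on:
            value |= 1 << i
    return value.to_bytes(2, "big")  # 9 bits fit comfortably in 2 bytes

def unpack_features(data, n=9):
    """Recover the n boolean feature states from the packed bytes."""
    value = int.from_bytes(data, "big")
    return [bool(value >> i & 1) for i in range(n)]
```

Two bytes per frame is small enough that even a high frame rate produces negligible network traffic, which is what makes real-time avatar reconstruction practical.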
video demo (52 seconds)
As my undergraduate senior project, I implemented multiple handwritten parsers for a dialect of BASIC. The interpreter supports simultaneous text and graphical output. Additionally, I compared the implementation details and runtime differences for various parsing algorithms. (source)
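One of the simplest parsers of that kind is a handwritten recursive-descent parser, where each grammar rule becomes a function. As a generic illustration (an arithmetic-expression evaluator, not the BASIC interpreter's actual code):

```python
import re

def evaluate(source):
    """Recursive-descent evaluator for +, -, *, / over integers and parens."""
    tokens = re.findall(r"\d+|[-+*/()]", source)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def factor():          # factor := NUMBER | "(" expr ")"
        if peek() == "(":
            eat()
            value = expr()
            eat()          # consume ")"
            return value
        return int(eat())

    def term():            # term := factor (("*" | "/") factor)*
        value = factor()
        while peek() in ("*", "/"):
            op = eat()
            value = value * factor() if op == "*" else value / factor()
        return value

    def expr():            # expr := term (("+" | "-") term)*
        value = term()
        while peek() in ("+", "-"):
            op = eat()
            value = value + term() if op == "+" else value - term()
        return value

    return expr()
```

Because `term` calls `factor` and `expr` calls `term`, operator precedence falls directly out of the grammar's structure, which is one reason recursive descent is a common baseline when comparing parsing algorithms.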
From 2009 to 2011, I developed 8 Flash games. Cumulatively, these games have been played over 20 million times and spread to over 1000 websites. Most of these games were created in less than 72 hours during "game jams."