Tweet Sentiment


A small AI project that implements and visualises several versions of attention backed by conventional RNNs.
[Screenshots: cover image, prediction screen, attention visualisation]

I originally started this small project to get my head around attention mechanisms; it implements several versions of attention backed by conventional RNNs. I created a complete demo that lets you explore the attention mechanism by dropping selected tokens from the input, making it simple to observe how including or excluding different words affects both the attention calculation and the final prediction.
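The token-dropping idea can be sketched as a toy softmax-attention step in plain Python. This is not the project's actual code; the tokens and relevance scores below are made-up values for illustration:

```python
import math

def attention_weights(scores, dropped=()):
    """Softmax over per-token scores, with dropped token indices masked out."""
    masked = [s if i not in dropped else float("-inf")
              for i, s in enumerate(scores)]
    # Subtract the max for numerical stability before exponentiating.
    m = max(s for s in masked if s != float("-inf"))
    exps = [math.exp(s - m) if s != float("-inf") else 0.0 for s in masked]
    total = sum(exps)
    return [e / total for e in exps]

tokens = ["not", "a", "good", "movie"]
scores = [2.0, 0.1, 1.5, 0.3]  # hypothetical relevance scores

full = attention_weights(scores)
without_not = attention_weights(scores, dropped=(0,))  # drop "not"
```

Dropping "not" zeroes its weight and renormalises the rest, so the remaining mass shifts onto words like "good" — the kind of change the demo makes visible.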

What I Did

This research project, built in Python and TensorFlow, includes three different model types and a preprocessing stage, all fed by an optimized TFRecord-based input pipeline. I also designed and built a functional demo that deepens understanding of the attention mechanism by letting you investigate its workings interactively.
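A TFRecord-based input pipeline along these lines can be sketched with the standard tf.train.Example and tf.data APIs. This is a minimal sketch, not the project's actual code; the feature names ("tokens", "label") and sample values are assumptions:

```python
import os
import tempfile
import tensorflow as tf

def write_examples(path, records):
    """Serialise (token_ids, label) pairs into a TFRecord file."""
    with tf.io.TFRecordWriter(path) as writer:
        for token_ids, label in records:
            example = tf.train.Example(features=tf.train.Features(feature={
                "tokens": tf.train.Feature(int64_list=tf.train.Int64List(value=token_ids)),
                "label": tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
            }))
            writer.write(example.SerializeToString())

def load_dataset(path, batch_size=2):
    """Parse variable-length token sequences, then pad and batch them."""
    spec = {
        "tokens": tf.io.VarLenFeature(tf.int64),
        "label": tf.io.FixedLenFeature([], tf.int64),
    }
    def parse(raw):
        parsed = tf.io.parse_single_example(raw, spec)
        return tf.sparse.to_dense(parsed["tokens"]), parsed["label"]
    return (tf.data.TFRecordDataset(path)
            .map(parse)
            .padded_batch(batch_size))

path = os.path.join(tempfile.mkdtemp(), "tweets.tfrecord")
write_examples(path, [([3, 14, 15], 1), ([9, 2], 0)])
batch_tokens, batch_labels = next(iter(load_dataset(path)))
```

Reading from TFRecord files lets tf.data stream, shuffle, and batch examples without re-running tokenisation each epoch, which is the usual motivation for this kind of pipeline.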


Working on a project that spans multiple disciplines is always a fulfilling experience. Building the demo let me combine knowledge from two different fields into a visualization that is intuitive and easy to use. I am pleased with the outcome of this project, and I hope that if you decide to build and run it, you will feel the same way.