Use large datasets to train machine learning models that can be exported to TensorFlow for a developer to integrate.
Platform: Google Cloud AutoML
I trained a machine learning model to filter, from a subject hashtag, tweets posted by real humans about their personal experiences. The model is useful for surfacing research insights, especially for topics dominated by noise, chatter, and clickbait.
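To illustrate the filtering task (not the trained AutoML model itself), here is a minimal rule-based baseline: keep tweets that use first-person language and drop obvious promotional noise. The marker lists and function names are assumptions for demonstration only; the real model learned these distinctions from labeled data.

```python
import re

# Hypothetical baseline heuristics -- illustrative, not the AutoML model.
FIRST_PERSON = re.compile(r"\b(i|my|me|we|our)\b", re.IGNORECASE)
SPAM_MARKERS = ("click here", "giveaway", "follow back")

def looks_personal(tweet: str) -> bool:
    """Crude filter: first-person language and no obvious spam markers."""
    text = tweet.lower()
    if any(marker in text for marker in SPAM_MARKERS):
        return False
    return bool(FIRST_PERSON.search(text))

tweets = [
    "My migraine treatment finally worked after two years #migraine",
    "CLICK HERE for a migraine cure giveaway!!! #migraine",
    "10 shocking migraine facts doctors won't tell you",
]
personal = [t for t in tweets if looks_personal(t)]
```

A rule-based filter like this plateaus quickly, which is why a trained classifier is worth the effort: it can pick up patterns (tone, phrasing, context) that hand-written rules miss.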
Trained and progressively tweaked a machine learning model to distinguish healthy from infected individuals using a large dataset of chest X-rays. (This exercise was part of the Udacity AI Product Manager Nanodegree course.)
I used tabular machine learning to identify which questions in a survey were most predictive of the outcome. This could reduce the total number of questions asked in future surveys, with the least impact on the result.
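The idea behind trimming the survey can be sketched without AutoML: score each question by how strongly its answers align with the outcome, then keep only the top-scoring questions. The toy data, question names, and scoring function below are assumptions standing in for the feature importances a tabular model reports.

```python
# Illustrative sketch, not the AutoML tabular model.
def association_score(answers, outcomes):
    """Fraction of respondents whose binary answer agrees with their
    outcome -- a crude stand-in for a model's feature importance."""
    agree = sum(1 for a, o in zip(answers, outcomes) if a == o)
    return max(agree, len(answers) - agree) / len(answers)

# Toy survey: three yes/no questions for six respondents, plus an outcome.
responses = {
    "q1_exercises_daily": [1, 1, 0, 0, 1, 0],
    "q2_owns_a_pet":      [1, 0, 1, 0, 0, 1],
    "q3_sleeps_8_hours":  [1, 1, 0, 0, 1, 1],
}
outcome = [1, 1, 0, 0, 1, 0]

# Rank questions by predictive value; keep the shortlist for the next survey.
ranked = sorted(responses,
                key=lambda q: association_score(responses[q], outcome),
                reverse=True)
shortlist = ranked[:2]
```

On this toy data the ranking drops the least informative question, which is exactly the trade-off described above: fewer questions asked, minimal loss of signal.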