- Who: In this story, there are three main characters: 1) the people/the community who need help, 2) the data scientist (that is, you), and 3) AI. – The main character is the user: whoever views the application to gain knowledge of the present situation.
- How much does the data scientist understand Assignment 1 (domain) and Assignment 2 (data)? – By preprocessing the data with Hive and Spark, we visualized it and developed an understanding of its context and nature.
- What models and analysis did the data scientist and AI apply to fulfill the need of the people or the community? – We used the Tweepy API to stream tweets via Spark, Python ML libraries to perform sentiment analysis, and the AWS cloud to feed the data into the UI.
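A minimal sketch of the sentiment-classification step is shown below. The real pipeline used Python ML libraries on streamed tweets; the tiny hand-written lexicon here is a simplified stand-in, assumed purely for illustration:

```python
# Tiny illustrative sentiment lexicon; a real pipeline would use a
# trained model or an ML library rather than a hand-picked word list.
POSITIVE = {"great", "win", "hope", "good", "love"}
NEGATIVE = {"bad", "lose", "fear", "hate", "worst"}

def sentiment(text: str) -> str:
    """Classify a tweet as positive, negative, or neutral by word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I hope we win"))  # positive
```

Each tweet arriving from the stream would be passed through such a classifier, and the aggregated counts surfaced in the UI.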
- Can the data scientist estimate and select data for their goals from Assignment 1? Can they map data sets from Assignment 2 onto appropriate ML models? – Yes, the data scientist can estimate and select data for their goals and map the data sets onto appropriate models.
- Can the data scientist connect Story 1 with ML models/stories about what a ML model can do? To perform good ML research, what in-depth knowledge and experience with ML algorithms and ML stories does a data scientist need? – Yes, the data scientist can connect the story with ML models; this requires working knowledge of the available ML algorithms and experience mapping data sets onto them.
- When has to do with the iterations (Calibration 2). How much time did it take for experimentation? How efficient is the modeling/algorithm? – During the 2020 US election, the data was streamed from Twitter, mostly in real time. Because it was collected only at that particular point in time, the data set is cross-sectional rather than longitudinal. The approach generalizes to other events whenever required: the hashtags used during collection simply need to be changed.
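The hashtag-driven generalization described above can be sketched as follows. The function and the hashtag sets are hypothetical names for illustration; in practice the tracked terms would be passed to the streaming filter (e.g., Tweepy's `track` parameter):

```python
def matches_tracked_tags(text, tracked):
    """Check whether a tweet mentions any of the tracked hashtags."""
    tags = {w.lower() for w in text.split() if w.startswith("#")}
    return bool(tags & {t.lower() for t in tracked})

# Swapping the hashtag set re-targets the same pipeline at another event.
ELECTION_TAGS = {"#Election2020", "#Vote"}
print(matches_tracked_tags("Polls open now #Vote", ELECTION_TAGS))  # True
print(matches_tracked_tags("Nice weather today", ELECTION_TAGS))   # False
```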
- Can the data scientist determine the acceptance level of the model (validation with accuracy and runtime performance) considering the targeted users? – Yes, the data scientist can determine the model's acceptance level by validating its accuracy and runtime performance against the needs of the targeted users.
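An acceptance check combining accuracy and runtime, as described above, might look like the sketch below. The thresholds and the toy model are illustrative assumptions, not the project's actual acceptance criteria:

```python
import time

def validate(model, samples, labels, min_accuracy=0.8, max_seconds=1.0):
    """Accept a model only if it is both accurate enough and fast enough.
    Thresholds here are illustrative, not real acceptance criteria."""
    start = time.perf_counter()
    predictions = [model(s) for s in samples]
    elapsed = time.perf_counter() - start
    accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
    return accuracy >= min_accuracy and elapsed <= max_seconds

# Toy classifier standing in for the real sentiment model.
model = lambda s: "positive" if "win" in s else "negative"
print(validate(model, ["we win", "we lost"], ["positive", "negative"]))  # True
```

In practice, the thresholds would be set from what the targeted users can tolerate: how often the sentiment label may be wrong, and how stale a real-time dashboard may be.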
- Where has to do with the learning environment. Where did this experiential learning process take place? For example, it was part of an online Deep Learning course. – It took place in a Spark environment, with sentiment analysis in Python, as part of a Big Data course.
- Why explains the modeling. Explainable ML models. – The data was collected to raise awareness among the people about the US elections by considering various attributes of the tweets.