- Who will be impacted? Society at large could be affected by software able to determine a person's age, gender, and ethnicity from an image. The implications, from how governments utilize this technology to how businesses leverage it, are far-reaching. Who was sampled? The images come from a Kaggle challenge, and the organizers did not indicate how the set of images was collected. Who was oversampled or undersampled? The data was not evenly distributed across the categories. Men were more represented than women. People between 20 and 30 and newborns were more represented than other age groups, which were underrepresented. Caucasians were more represented than the other ethnicities, with Asians being underrepresented. Who were the data scientists? Debsankar Mukhopadhyay, Vasim Saikh, Pravanjan Trivedi, and William Yerkes, students at UMKC studying Big Data Analytics and Applications.
- What are the social and cultural impacts? The possibility of discrimination and racism exists when software can determine a person's gender and ethnicity from an image. Software that can either promote or reduce racial profiling can have a positive or negative impact on society accordingly. What are the concerns about data privacy, security, and fairness? There are concerns about how the data and technology will be used. China, for example, currently uses facial recognition to issue jaywalking tickets.
- When will the social and cultural impacts take place? The cultural impacts of AI being able to categorize images to determine race, gender, and age will most likely occur within the next 20 years. When should people be concerned about data privacy, security, and fairness? People should be concerned about their data privacy, security, and fairness now. People who live in countries with democratically elected governments should petition their representatives with these concerns.
- Where will the social and cultural impacts take place? These potential impacts can occur anywhere in society where CCTV cameras are in use, which is increasingly prevalent. Where are data privacy, security, and fairness issues, such as data breaches and evaluative bias, likely to happen? As the technology becomes more available and cheaper to implement, it will spread to smaller entities that may not have robust data security measures. As the use of the technology becomes more prevalent, the possibility of data breaches increases.
- Why are the social and cultural impacts important or consequential to the people and/or the community? Giving AI the ability to determine age, gender, and ethnicity can have major consequences once it is adopted by governments. Governments already use facial recognition to pursue possible suspects for crimes; could it also be used to monitor businesses to see if they are selling restricted products to underage clients? Why should we be concerned about data privacy, security, and fairness issues? Once the images are captured and classified, who will have access to them? Companies currently sell the client information they possess; what is to stop them from selling images of their clients categorized by age, gender, and ethnicity?
- How can we address these societal issues in ML using a community-in-the-loop approach? Our ML models, which predict the demographic diversity of the population at the clothing store under consideration, will help the clothing manufacturer and the service providers associated with that store understand the relative differences within that diverse population and tune their businesses so that this diversity translates into customer satisfaction and business profitability. Our model and its outcomes should enable them to promote inclusion and diversity in their products.
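The sampling imbalance noted in the first point above can be quantified, and partially mitigated, before training. A minimal Python sketch, using hypothetical placeholder labels rather than the actual Kaggle annotations, with inverse-frequency class weighting as one common mitigation:

```python
from collections import Counter

# Hypothetical labels standing in for the dataset's gender annotations;
# the real labels would be parsed from the Kaggle image metadata.
gender_labels = ["male"] * 60 + ["female"] * 40

def class_distribution(labels):
    """Return each class's share of the dataset."""
    counts = Counter(labels)
    total = len(labels)
    return {cls: n / total for cls, n in counts.items()}

def inverse_frequency_weights(labels):
    """Weight each class inversely to its frequency, so that
    underrepresented groups count more heavily during training."""
    counts = Counter(labels)
    total = len(labels)
    n_classes = len(counts)
    return {cls: total / (n_classes * n) for cls, n in counts.items()}

dist = class_distribution(gender_labels)
weights = inverse_frequency_weights(gender_labels)
print(dist)     # male: 0.6, female: 0.4
print(weights)  # the majority class is weighted below 1, the minority above 1
```

The same check applies to the age and ethnicity categories; the resulting weights could be passed to a loss function so the model is not dominated by the overrepresented groups.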