Spaces: Running on Zero
Update app.py
app.py CHANGED
@@ -174,14 +174,14 @@ def ageplot(agelist):
 
 def is_nsfw(image):
     """
-    A function that checks if the input image is not
+    A function that checks if the input image is not for all audiences (NFAA) by classifying it using
     an image classification pipeline and returning the label with the highest score.
 
     Args:
         image: The input image to be classified.
 
     Returns:
-        str: The label of the
+        str: The label of the NFAA category with the highest score.
     """
     classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")
     result = classifier(image)
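
The hunk ends before is_nsfw returns anything, so the "label with the highest score" step is not visible here. A minimal sketch of what that selection could look like, assuming the standard transformers image-classification output (a list of {"label": ..., "score": ...} dicts); the exact return line in app.py may differ:

```python
from transformers import pipeline

def is_nsfw(image):
    """Return the NFAA label ("normal" or "nsfw") with the highest score."""
    classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")
    # Example output: [{"label": "normal", "score": 0.98}, {"label": "nsfw", "score": 0.02}]
    result = classifier(image)
    # Pick the top-scoring label; a sketch, not necessarily the line used in app.py.
    return max(result, key=lambda r: r["score"])["label"]
```
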
@@ -190,7 +190,7 @@ def is_nsfw(image):
 
 def nsfwplot(nsfwlist):
     """
-    Generates a plot of
+    Generates a plot of NFAA categories based on a list of NFAA labels.
 
     Args:
         nsfwlist (list): A list of NSFW labels ("normal" or "nsfw").
@@ -201,9 +201,9 @@ def nsfwplot(nsfwlist):
     Raises:
         None
 
-    This function takes a list of
+    This function takes a list of NFAA labels and generates a plot with a grid of 2 rows and 5 columns.
     Each label is sorted based on a predefined order and assigned a color. The plot is then created using matplotlib,
-    with each cell representing an
+    with each cell representing an NFAA label. The color of each cell is determined by the corresponding label's color.
     The function returns the generated figure object.
     """
     order = ["normal", "nsfw"]
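
The docstring above describes a 2-row by 5-column grid colored per label, but most of the plotting body falls outside this hunk. A rough sketch of that approach, reusing the order list from the diff; the hex colors and exact matplotlib calls are assumptions (light red for "normal", dark red for "nsfw", matching the grid legend further down):

```python
import matplotlib.pyplot as plt

def nsfwplot(nsfwlist):
    """Plot a 2x5 grid in which each cell's color encodes an NFAA label."""
    order = ["normal", "nsfw"]
    colors = {"normal": "#f4cccc", "nsfw": "#990000"}  # assumed light red / dark red
    # Sort labels by the predefined order so cells with the same label group together
    sorted_labels = sorted(nsfwlist, key=order.index)
    fig, axes = plt.subplots(2, 5, figsize=(10, 4))
    for ax, label in zip(axes.flat, sorted_labels):
        ax.set_facecolor(colors[label])
        ax.set_xticks([])
        ax.set_yticks([])
    return fig
```
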
@@ -243,16 +243,16 @@ def generate_images_plots(prompt, model_name):
         nsfws.append(nsfw)
     return images, skintoneplot(skintones), genderplot(genders), ageplot(ages), nsfwplot(nsfws)
 
-with gr.Blocks(title="
-    gr.Markdown("#
+with gr.Blocks(title="Demographic bias in Text-to-Image Generation Models") as demo:
+    gr.Markdown("# Demographic bias in Text to Image Models")
     gr.Markdown('''
-In this demo, we explore the potential biases in text-to-image models by generating multiple images based on user prompts and analyzing the gender
+In this demo, we explore the potential biases in text-to-image models by generating multiple images based on user prompts and analyzing the gender, skin tone, age, and potential sexual nature of the generated subjects. Here's how the analysis works:
 
 1. **Image Generation**: For each prompt, 10 images are generated using the selected model.
 2. **Gender Detection**: The [BLIP caption generator](https://huggingface.co/Salesforce/blip-image-captioning-large) is used to elicit gender markers by identifying words like "man," "boy," "woman," and "girl" in the captions.
 3. **Skin Tone Classification**: The [skin-tone-classifier library](https://github.com/ChenglongMa/SkinToneClassifier) is used to extract the skin tones of the generated subjects.
 4. **Age Detection**: The [Faces Age Detection model](https://huggingface.co/dima806/faces_age_detection) is used to identify the age of the generated subjects.
-5. **
+5. **NFAA Detection**: The [Falconsai/nsfw_image_detection](https://huggingface.co/Falconsai/nsfw_image_detection) model is used to identify whether the generated images are NFAA (not for all audiences).
 
 #### Visualization
 
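
Step 2 in the list above infers gender markers by looking for specific words in BLIP captions. A simplified, hypothetical sketch of that kind of caption parsing; the helper name and exact matching rules are assumptions, since the actual parsing code is not part of this diff:

```python
def gender_from_caption(caption: str) -> str:
    """Map a BLIP caption to a coarse gender marker via keyword matching (illustrative only)."""
    words = caption.lower().replace(",", " ").split()
    if any(w in words for w in ("man", "boy")):
        return "man"
    if any(w in words for w in ("woman", "girl")):
        return "woman"
    return "unspecified"  # rendered as grey in the gender grid

# e.g. gender_from_caption("a photo of a young woman smiling") -> "woman"
```
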
@@ -261,7 +261,7 @@ We create visual grids to represent the data:
 - **Skin Tone Grids**: Skin tones are plotted as exact hex codes rather than using the Fitzpatrick scale, which can be [problematic and limiting for darker skin tones](https://arxiv.org/pdf/2309.05148).
 - **Gender Grids**: Light green denotes men, dark green denotes women, and grey denotes cases where the BLIP caption did not specify a binary gender.
 - **Age Grids**: Light blue denotes people between 18 and 30, blue denotes people between 30 and 50, and dark blue denotes people older than 50.
-- **
+- **NFAA Grids**: Light red denotes FAA images, and dark red denotes NFAA images.
 
 This demo provides an insightful look into how current text-to-image models handle sensitive attributes, shedding light on areas for improvement and further study.
 [Here is an article](https://medium.com/@evijit/analysis-of-ai-generated-images-of-indian-people-for-colorism-and-sexism-b80ff946759f) showing how this space can be used to perform such analyses, using colorism and sexism in India as an example.
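
The grid bullets above are effectively label-to-color legends. As one concrete illustration, a gender-grid palette consistent with that description; the hex values are assumptions, since app.py's actual palette is not shown in this diff:

```python
# Assumed palette matching the gender-grid description above
GENDER_COLORS = {
    "man": "#90ee90",          # light green
    "woman": "#006400",        # dark green
    "unspecified": "#808080",  # grey: BLIP caption gave no binary gender marker
}
```
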
@@ -294,7 +294,7 @@ This demo provides an insightful look into how current text-to-image models hand
         genplot = gr.Plot(label="Gender")
     with gr.Row(equal_height=True):
         agesplot = gr.Plot(label="Age")
-        nsfwsplot = gr.Plot(label="
+        nsfwsplot = gr.Plot(label="NFAA")
     btn.click(generate_images_plots, inputs=[prompt, model_dropdown], outputs=[gallery, skinplot, genplot, agesplot, nsfwsplot])
 
 demo.launch(debug=True)
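
For orientation, a minimal, self-contained sketch of the Gradio wiring pattern used in the last hunk: a button whose click handler returns one value per output component. This mirrors the structure of the demo rather than reproducing app.py; the input components and placeholder handler are assumptions:

```python
import gradio as gr

def fake_handler(prompt, model_name):
    # Placeholder: return one value per output component, like generate_images_plots does.
    return [], None, None, None, None

with gr.Blocks(title="Demographic bias in Text-to-Image Generation Models") as sketch:
    prompt = gr.Textbox(label="Prompt")
    model_dropdown = gr.Dropdown(["model-a", "model-b"], label="Model")  # assumed choices
    btn = gr.Button("Generate")
    gallery = gr.Gallery(label="Images")
    with gr.Row(equal_height=True):
        skinplot = gr.Plot(label="Skin Tone")
        genplot = gr.Plot(label="Gender")
    with gr.Row(equal_height=True):
        agesplot = gr.Plot(label="Age")
        nsfwsplot = gr.Plot(label="NFAA")
    btn.click(fake_handler, inputs=[prompt, model_dropdown],
              outputs=[gallery, skinplot, genplot, agesplot, nsfwsplot])

sketch.launch()
```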