Hello, first things first, I hope this post isn't offensive; if it is, the mods should wipe it.
It's a genuine question for black women. If it's out of the scope of the forum (it is a forum about dicks, after all), apologies, and again, the mods can wipe the thread.
I come from a country with a very, very small black community, so there isn't much information around.
The question is: what is going on with wigs and black women? It seems to me that the vast majority of black women (at least in movies, series, etc.) have a "removing the wig" scene. Are wigs that popular among black women in real life? I was under the impression that the afro was celebrated. Obviously anyone can do whatever they like with their hair, but it seems like quite a drastic direction that has taken hold only in the black community. Why? Is afro hair impossible to straighten properly? Or is it just an unusual cultural phenomenon? Was it always a thing? Is it a new trend of the past couple of decades? What makes a young black girl decide she doesn't want to show her natural hair anymore and start purchasing wigs?