Do people still do this? I'm putting this in this forum because I see this as discrimination. I've seen adverts in some Black women's magazines for creams to make the skin whiter. Why is this still around? Is there pressure on Black women to have lighter skin? Does this pressure come from just white people, or from both sides?