History
rivaldifirmansyah

What right did women in the United States win by 1930?
- the right to work
- the right to vote
- the right to join the military
- the right to run for office

(2) Answers
TiffaniMehner26

By 1930, women in the United States had won the right to vote, granted by the Nineteenth Amendment in 1920. Even so, the right was not exercised as widely as it might have been at first, since many women were not immediately prepared to vote once it became possible to do so.

DestinedForDestiny

The right to work was a right that women had in the United States in the 1930s. During the Great Depression, many married women were forced into working when their husbands lost their jobs or when their husbands' wages were too low to support the needs of the family.
