In the 28 years since winning the very first Women’s World Cup, the U.S. women’s soccer team has dominated the game on the global stage, taking home four Women’s World Cups in all, including the 2019 title captured this month in a 2-0 victory over the Netherlands.
The U.S. men haven’t come close to the women’s success. Not only have the men never won a World Cup, they even failed to qualify for the most recent men’s World Cup in 2018.
To understand why U.S. women’s soccer dominates on the world stage while the men’s game continues to falter, you might just have to go back to the beginning, to the time when future world-class players — female and male — first start showing athletic promise.
“Soccer has never really been part of the national lexicon. It’s always been kind of this underground, kind of foreign game,” says Eileen Narcotta-Welp, an assistant professor of sport management at the University of Wisconsin-La Crosse. “Not only has it been a foreign game, but it’s been seen as a less masculine sport. So if a child has to choose, or their parents have to choose, which sport a child is going to go into, ultimately it’s going to be basketball, baseball, [or] football.”