Soccer in the United States

Association football (soccer) as played in the United States

Soccer in the United States is one of the most widely played sports in the country. It is governed by the United States Soccer Federation, commonly known as U.S. Soccer.

Quotes

You were kind of an outlier if you even liked football and you were a girl in England. So to come over here and have that opportunity? I've always said America is the land of opportunity. It certainly was for me. ~ Jill Ellis