the West Coast
noun /ðə ˌwest ˈkəʊst/
[singular] the states on the west coast of the US, especially California
the West Coast of the US (especially California)
Culture: To many people the West Coast suggests a place where the sun shines most of the time, where the people have a relaxed way of life and often invent or follow new fashions, particularly those involving physical fitness or psychology.