East Coast Stereotypes About The West Coast That Are Plain Wrong

For many East Coast folks, the West Coast is a mythical land of nonstop sunshine, healthy people, and endless sandy beaches.

And yes, you can find those things in California, Arizona, and other western states. But there are some things about the West Coast that folks back east just don't understand. These stereotypes should be put to rest!

Surfing Isn’t A Mandatory Sport

Mark Rightmire/MediaNews Group/Orange County Register via Getty Images

Contrary to popular belief, not everyone on the West Coast owns a surfboard or knows how to ride the waves. While California does have some of the best surf spots in the continental US, the sport isn't a mandatory activity for people who live out west.

Some people enjoy lying on the sand with a good book!