Hey non-Westcoasters, how do you view the West Coast?

I mean, we all have these visions of the West Coast that are both shallow and iconic. For those of us from other parts of the country, the West Coast is as much an idea as it is a place. I know that when most Northeasterners describe California, they say, "You can see why everyone is so happy," partly because Californians don't slog through the long winters. And no small part of the world's view of California comes from the fact that it's the center of the film and television industry, which sells a certain vision of the West Coast. Or I know so many people who love Seattle, drawn by the tech industry or the coffee or the food or the '90s music scene. And finally, isn't Portland the place where young people go to retire? Most people from the West Coast likely know that these are superficial stereotypes that capture only the smallest and simplest aspects of these places, just as I know that New England isn't just chowder and angry drivers and the Red Sox. It's also pine trees and coffee brandy.

So I wonder, for those of you from the rest of the country, those of us writing about a part of the country that is not ours: what are your visions of the West Coast? And would you ever want to live there? (I've gone back and forth about trying to move out there for years. My sister's mission in life is to convince us all to move to California; she loves it, especially Southern California.) And when you visited, what surprised you the most? For me, it was that all of it really is as beautiful as it looks on film.