Dear Friends,
Nostalgically speaking, Hawaii was once the epitome of romance and luxury, sort of an exotic dream for many, a place of pineapples and volcanoes and missionaries.
These days, you can go anywhere you like by hopping on a jet plane.
Therefore, why do you choose to live in Thailand rather than in Hawaii?
Is it the present-day culture of America that causes you to steer clear of these magnificent volcanic islands?
Or is it the culture of Thailand and the Southeast Asian food that keeps you here, in one of the longest countries, north to south, on Earth?
Thailand these days is not what it was 50 years ago, and much of its allure has been diluted by ever-intruding influences from the West.
If Thailand, like so many other countries, is becoming ever more Westernized, then why not just live in Hawaii, a once equally exotic place that has now been totally taken over by Western culture?
As you can see, the guys above are very Western.
BUT, in Hawaii, the food is terrible.
By contrast, the food in Thailand is AMAZING!
And so, my friends...
In my opinion, one should come to Thailand instead of Hawaii if one wishes to witness a still-vibrant culture with amazing food from all around the world: a gastronomic delight, an amalgam of every possible flavor and spice.
What is your opinion, comparing Hawaii to Thailand?
Is it not like comparing Pablum to stir-fried pumpkin?
Even the lowliest dish in Thailand is worth two conch-shell-blowing guys in Hawaii.
If you love Thai food as much as I do,
then leaving Thailand is unthinkable.
There are many other reasons why Thailand is far better than Hawaii, except perhaps for a short island hop.
What are they?
Other than the weather?
Regards,
GammaGlobbulin (sp)