According to Wikipedia, Thailand was never colonized by a European power or any other country, so it has no colonial era with a defined beginning or end. However, the kingdom, then known as Siam, absorbed considerable Western influence, which shaped aspects of the country's modern identity.