Florida

Is Florida A Word?

Is Florida an English word?

Yes, as a proper noun. Ponce de León claimed the land for Spain, calling it La Florida, Spanish for "flowery," "covered with flowers," or "abounding in flowers."

What does Florida mean?

Florida / (ˈflɒrɪdə) / noun: a state of the southeastern US, between the Atlantic and the Gulf of …
