colonialism


The word colony comes from the Latin word colonus, meaning farmer, indicating the transfer of people to land. Colonialism is the exercise of power and domination by one nation through acquiring or maintaining full or partial political control over another sovereign nation. The country or nation that comes under the control of the foreign nation is known as a colony of the dominating country. While the two are related, colonialism should not be confused with imperialism, which involves the outward use of military and economic power and aims for continued expansion and collective domination.

For example, the outward expansion of the British Empire was imperialism; the territories it acquired were then colonies of the British (colonialism was the occupation of the land after the imperial expansion). When the thirteen American colonies gained independence and became politically and economically sovereign, they collectively became known as the United States. Colonialism remains in practice within the United States, however, as many indigenous nations are present within the same territory.

To learn more about colonialism, see this Florida A&M University Law Review article.

See also: terra nullius, doctrine of discovery

[Last updated in April of 2022 by the Wex Definitions Team]