Answer:

Some time after the War of 1812.
After America won the American Revolution, the English agreed to recognize the United States as independent.
