
A war that Americans are taught in school that they won. Basically, it was a pathetic attempt to annex Canada from the British, and it ultimately got their capital burned to the ground.
American: We won the war of 1812!
Smart Person: Can you smell something burning? Oh yeah, it's the White House.
by A Brit May 17, 2004