Edited on Thu Nov-29-07 11:30 PM by Selatius
The term "liberal" was co-opted by FDR and the New Deal Democrats in the 1930s in an attempt to appeal to centrist voters of the era, many of whom were already suffering the disastrous effects of the Great Depression. Since then, the original definition of "liberal" in American parlance has been lost. Prior to FDR, a liberal was somebody who generally favored laissez-faire economic policies, a hands-off approach to managing the economy. Before the 1930s, the general catch-all term for the left in the US was "progressive."
In the UK, the Liberal Democrats are, indeed, more left-wing than the Labour Party on economic issues; however, compared to the Greens, they are a center-right political party.
If you're in Europe and talking with left-wingers, the general rule is to avoid calling them "liberal." For many, it can come across as an insult: the running joke in places like France or Belgium is that a "liberal" is a false friend of the working man.