The liberals have been terrorizing people for long enough. Proud citizens of the USA are standing up against them as they try to destroy our country.
Political correctness is ruining American culture and our identity.
Some states, including Arizona, Arkansas, Colorado, and many others, have taken their preference for English further by mandating that only English be used to teach students.
English is our first language, and it should be taught in schools as a first language to everyone, especially foreigners who come here.
What are your thoughts on this?