Just read this in the National Review. Got me thinking…
One thing I am beginning to believe in is an increase in states' rights. The spectre of slavery was vanquished, and racism has been LEGALLY removed from the law. So why are we trying to impose views on each other state by state? Maybe it's time to shift power back toward the states... maybe dismantling much of the federal government is the right idea?
Yes, it's scary to think what could happen in the more conservative states, and yes, it's scary to think how far left things could get in the more progressive states, but isn't that the purpose of the states? They are different from each other. Shouldn't they be allowed to be different?
US history has always featured the battle between states' rights and the federal government. Slavery was the massive battle... then the Great Depression, then World War II, and then the Civil Rights Acts. World War II and the Civil Rights era cemented the federal government as the central power in our lives, and there has been a steady increase in federal power ever since.
The power of the federal government is what is making everyone angry: the imposition of federal mandates on other people's lives. Views out of Massachusetts imposed on Alabama. Views on religion from Arkansas imposed on New York. I think we have to let a state's culture be a state's culture... maybe that is our way out? I fear that if we don't do that... we'll end up in some version of the Divided States of America. There is no foregone conclusion that the United States has to be one country....