I know that a lot of people have said this, but I legitimately don't feel safe in America, and I'm thinking about leaving. Something about Trump's "America" is really unnerving. It's more than just a terrible president; it feels like something at the very core of the country is changing. Even under Bush, things felt safe, or at least somewhat stable. But now I feel really scared.