I ask this having been to events featuring national/ethnic dress, food, and other cultural traditions. What can a white American say their culture is? It feels like, for better or worse, it's all been melted together.
Trying to trace things back to European roots feels disingenuous, because I've been disconnected from those roots for a few generations.
This also makes me wonder whether there was any political motive in making white American culture into everything and nothing.
Having a problem with a state and belonging to a certain group are not the same thing. Did Germans or Japanese people stop being themselves after WW2? Your opinion is nonsense.