Who in school told you Britain and France are socialist? That was never the case. They were literally always allies, so why would any school teach this? I know Fox News viewers believe it for some reason, but that certainly never impacted my curriculum
Elementary school through high school, and they taught that socialism is basically when the government runs services, or really does anything someone else could be making money off of
I think healthcare was a big sticking point for the label, but they basically described any social safety net as socialist.
It may not have been in your curriculum, but teachers watch mainstream media too…
Why are they allowed to teach that, though? No wonder they think “the libs” want to indoctrinate people
Because they’re just teaching their own understanding of the world; it’s not like they’re lying. Propaganda works
And it’s not like it comes up often. And even if it does, very few people are able to explain why that’s wrong on the spot