We call the entire American continent "Western," and "Western Europe" refers to the western part of Europe.
Alright, calm your tits. It was just a question.
so "the West" and "the East" are classically defined by whichever empires or civilizations held the most dominance. within the past few hundred years, the British Empire had the most sway over the widest regions of the world, and to them, Asia and practically anything east of the Ural Mountains was "the East."
likewise, Russia is part of both the West and the East (mostly the East), and the Arabian regions are considered the "Middle East." while it's essentially based on geographical location overall, there's no natural reference point for where the middle of the world sits on a globe, so the dividing line was chosen arbitrarily, i guess.
so yeah. Western society is considered to be most of Europe (north and west of Turkey??) up to and including the Americas. the East is everything east of that.
(someone feel free to correct me if i'm misinformed)