What Happens When We Die
What happens when you die?
While most of us accept that death is a biological fact, many of us still believe in some form of life after death. In fact, according to a 2014 survey reported by The Telegraph, just under 60% of UK citizens believe that "a part of us" lives on.

In the United States, a predominantly Christian country, Pew Research asked people in 2015 what they believed happens after death. The results showed that 72% of Americans believed in heaven, described as a place "where people who have lived good lives are rewarded forever." 54% of U.S. adults said they believed in hell, described as "a place where people who have lived bad lives and died without regret are punished forever."