Here we go again with "the media". Personally, I agree that the public has the right to know, but should the media be allowed to put their own twist on it?
To me it's like telling you a lie and then saying "well, you don't have to believe it" when it's obvious they are "doctoring" the story.
We remember the darkened photos of OJ that surfaced when he was accused of murder. We remember the airplanes hitting the Twin Towers and going to war with Iraq. We've seen the Vanity Fair photos of Tiger Woods with no shirt, wearing a toboggan and looking rough. There are countless events that the media has "altered" to make a story more interesting or profitable, but is that right?
I think the media should report world affairs, but it should be as unbiased as possible. People rely heavily on information, and the worst kind of information is biased information. I think we should be able to draw our own conclusions first instead of being "led" to feel one way or another.
What do you all think?
dac