Is organic food really better for a person than non-organic food? I'm rather health-conscious about what I eat, and I'm wondering if organic food (fresh fruit and veggies, etc.) is really any better than regular food. Take bananas, for instance. The organic ones cost a lot more than the regular ones, and they're often smaller and have more dark spots and bruises on them. So, is there any solid evidence that organic produce is better and healthier to eat than non-organic? Thanks for your input!