A study that followed roughly 70,000 adults offers some evidence that an organic diet may benefit your health. Adults who ate the most organic food were about 25% less likely to be diagnosed with cancer, with the biggest reductions seen in lymphomas and breast cancer. The study was observational, so it can't prove cause and effect, but it's still a good excuse to eat your organic greens!