Lately I have been working on a lot of upgrades to 11g. I know 12c is just around the corner, but 10gR2 was a pretty good release and there is still a large install base.
One of the major concerns when upgrading a database, besides ensuring basic functionality, is performance regressions, mainly caused by execution plan changes from the new optimizer. Since the introduction of the cost-based optimizer years ago, statistics about the data have been extremely important for the optimizer to find the right plan for a query.
Prior to 10g, collecting stats was a manual process or a cron job; starting with 10g, a nightly stats job was scheduled automatically when you created a database. The default stats job was OK, but many people simply collected stats on 10g the same way they had on 9i, or tried the 10g default stats job, had a bad experience, and wrote it off completely. Now, upgrading again to 11g and remembering that bad experience, they are writing off the 11g stats job and collecting stats the same way they did in 10g or even 9i.
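It is worth knowing that in 11g the nightly job is no longer a plain scheduler job; it runs as an automated maintenance task. As a rough sketch (assuming DBA privileges; the view and package names are the standard 11g ones), you can check whether the task is enabled, and re-enable it if it was switched off:

```sql
-- Check whether the 11g auto stats task is enabled
SELECT client_name, status
  FROM dba_autotask_client
 WHERE client_name = 'auto optimizer stats collection';

-- Re-enable it for all maintenance windows if it was disabled
BEGIN
  DBMS_AUTO_TASK_ADMIN.ENABLE(
    client_name => 'auto optimizer stats collection',
    operation   => NULL,
    window_name => NULL);
END;
/
```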
When learning to ride a bike, did you ever fall at first? Have you ever tried a new food, disliked it, and a few years later found your tastes had changed? Have you ever tried software that had issues, stopped using it, and then found a later release to be much better? The default stats job in 11g is greatly improved. I am not going to say it is perfect, or that there are no cases where manually collecting stats will be required. What I will say is: when upgrading to 11g, don't write off the default stats job without first testing it. The auto sample size in particular is greatly improved. An old but good post from Greg Rahn explains why the new DBMS_STATS is worth a try.
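If you do gather stats manually while testing, you can still take advantage of the improved auto sample size by letting ESTIMATE_PERCENT default rather than hard-coding a percentage the way many 9i/10g scripts did. A minimal sketch (SCOTT and EMP are placeholder names):

```sql
-- Check the current default; in 11g it is AUTO_SAMPLE_SIZE out of the box
SELECT DBMS_STATS.GET_PREFS('ESTIMATE_PERCENT') FROM dual;

-- Gather stats on one table using the improved auto sample size
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'SCOTT',
    tabname          => 'EMP',
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE);
END;
/
```

The point of testing with AUTO_SAMPLE_SIZE is that the 11g implementation gives accuracy close to a full compute at a fraction of the cost, which is exactly the improvement the old 10g-style fixed-percentage scripts never get to exercise.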