In this video at Site Pro News, Vanessa Fox from the Google Sitemaps team clarifies, to some extent, how duplicate content issues are handled. Google doesn't apply a "penalty" to the site; it's more like a "filter": they simply choose one version to index. I think the other versions of the articles or pages end up in hell, aka the Supplemental Index. I don't know how they choose which version to index, but I still wouldn't leave it up to Google to pick which version of the page gets indexed. The best way to tackle this problem is with redirects. If you have pages that are similar, where only a few words differ among them, redirects won't help; you need to make the pages more unique.
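To show what "using redirects" means in practice, here is a minimal sketch of URL canonicalization: pick one canonical form of each URL and answer every duplicate variant with a permanent (301) redirect to it. The hostname `www.example.com` and the specific normalization rules are hypothetical examples, not anything from the video; your own canonical rules will differ.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical canonical hostname -- substitute your own preferred form.
CANONICAL_HOST = "www.example.com"

def canonical_redirect(url):
    """Return (status, location): (301, canonical URL) when the requested
    URL is a duplicate variant, or (200, url) when it is already canonical."""
    scheme, host, path, query, frag = urlsplit(url)
    # Collapse the bare domain and the www domain into one host.
    if host == "example.com":
        host = CANONICAL_HOST
    # Treat /dir/index.html and /dir/ as the same page.
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]
    canonical = urlunsplit((scheme, host, path, query, frag))
    if canonical != url:
        return 301, canonical
    return 200, url
```

With rules like these, every duplicate variant funnels search engines (and link equity) to the single version you chose, instead of letting Google choose for you.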
She didn't clarify how duplicate content is treated when the content is similar on different sites.