Consumer Reports now recommends the new MacBook Pro

“Consumer Reports has now finished retesting the battery life on Apple’s new MacBook Pro laptops, and our results show that a software update released by Apple on January 9 fixed problems we’d encountered in earlier testing,” Consumer Reports writes. “With the updated software, the three MacBook Pros in our labs all performed well, with one model running 18.75 hours on a charge.”

“Now that we’ve factored in the new battery-life measurements, the laptops’ overall scores have risen, and all three machines now fall well within the recommended range in Consumer Reports ratings,” Consumer Reports writes. “The three MacBook Pros in our labs include two 13-inch models, one with Apple’s new Touch Bar and one without the Touch Bar; and a 15-inch model. (All 15-inch MacBook Pros come with the Touch Bar.) The new average battery-life results are, in order, 15.75 hours, 18.75 hours, and 17.25 hours.”

“Apple’s updated software is available through Apple’s Beta Software Program now, and will be rolled out in a full Software Update to all users in several weeks,” Consumer Reports writes. “According to Apple, the new software fixes a bug in Safari that caused the poor battery-life results in Consumer Reports testing.”

Read more in the full article here.

MacDailyNews Take: The problem was Consumer Reports’ testing methodology, to which the disingenuous Consumer Reports will not own up. Consumer Reports’ continued duplicitous attempt to blame an obscure bug is a copout.

Again, Consumer Reports is, was, and has always been a joke when it comes to testing anything remotely associated with tech (to say nothing of devices running the world’s most advanced operating systems). The rag is nothing more than an anachronism for grandma to use to reassure herself that she bought the right vacuum cleaner (even though she didn’t unless she bought a Miele – which she almost certainly did not since she’s an unfortunate Consumer Reports subscriber).

SEE ALSO:
Consumer Reports’ deck-stacking, or incompetence, exposed – January 11, 2017
Consumer Reports’ weird MacBook Pro battery test results due to use of obscure Safari developer setting – January 10, 2017
Consumer Reports stands by its weird MacBook Pro battery test results – December 29, 2016
Consumer Reports says do not buy Apple’s new MacBook Pro, citing erratic battery life – December 23, 2016
Consumer Reports evaluates iTunes Store movie streaming, confusion ensues – August 13, 2012
Is Consumer Reports having its revenge against Apple? – July 10, 2012
How Apple and facts killed Consumer Reports – March 29, 2012
Consumer Reports was no iPhone killer and they’re no iPad killer, either – March 28, 2012
Tests prove Apple’s new iPad heat levels comparable to Android tablets – March 26, 2012
Expert: iPad heat claims overblown, not a real issue – March 22, 2012
What’s the deal with Consumer Reports and Apple? – March 21, 2012
Consumer Reports’ bombshell: New iPad runs hotter than predecessor but ‘not especially uncomfortable’ – March 20, 2012
FUD Alert: Consumer Reports to ‘investigate’ reports of iPad and ‘excess heat’ – March 20, 2012
Consumer Reports hops off free PR gravy train, officially recommends Apple iPhone 4S – November 8, 2011
Why does anyone believe Consumer Reports? – April 6, 2011
Consumer Reports on iPad 2: We didn’t notice any significant speed improvement – March 15, 2011
Consumer Reports was wrong on Verizon iPhone 4; so-called ‘death grip’ fixed by Apple – March 2, 2011
Consumer Reports: Verizon iPhone 4 has antenna ‘problem’; not recommended – February 25, 2011
Consumer Reports continues laughable vendetta against iPhone 4 – January 14, 2011
Android sweeps Consumer Reports’ rankings as iPhone 4 is omitted – November 17, 2010
All of Consumer Reports’ ‘recommended’ smartphones suffer attenuation when held – July 19, 2010
Consumer Reports: Apple’s free Bumper case does not earn iPhone 4 our recommendation – July 16, 2010
Consumer Reports: Apple’s Bumper case fixes iPhone 4 signal-loss issue – July 15, 2010
Consumer Reports continues harping on iPhone 4 attenuation issue – July 14, 2010
Electromagnetic engineer: Consumer Reports’ iPhone 4 study flawed – July 13, 2010
The Consumer Reports – Apple iPhone 4 fiasco – July 13, 2010
Consumer Reports: Oh yeah, almost forgot, Apple iPhone 4 is also the best smartphone on the market – July 12, 2010
Consumer Reports: We cannot recommend Apple iPhone 4 – July 12, 2010
Consumer Reports does their readership a disservice, says viruses target Apple Macs – December 13, 2005
Consumer Reports: Apple’s new iPod screens scratch-prone like iPod nanos – October 28, 2005
Consumer Reports dubiously finds 20-percent of Mac users ‘detected’ virus in last two years – UPDATED – August 10, 2005

21 Comments

    1. All Apple has to do is say, “we screwed up our Safari software and undersized the battery so Jony could shave off another 0.1 inch.”

      CR used the identical settings that every other brand of laptop was tested with. The test was not flawed. It may not be representative of YOUR use case, but no repeatable scientific test is.

      1. No Mike, the test is flawed on all platforms because they used different browsers in ways they are not intended to be used by end users. If they had simply used Chrome on all machines, or Firefox, neither of which has deep system integration like Safari and Edge, then it would have been a fair example of usage. But by artificially modifying the browsers’ performance, i.e., disabling caching, they put the system in a state it is not meant to compensate for in normal use. Therefore the test is not even remotely close to anyone’s use case, let alone mine…
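
        To make the point concrete, here is a toy sketch of how a cache changes the work a “load the same pages repeatedly” test actually performs. This is hypothetical Python against an illustrative URL, not CR’s actual harness:

        import functools
        import time
        import urllib.request

        URL = "https://example.com/"  # illustrative target, not CR's page list

        def fetch_uncached(url):
            # Every call goes over the network, like a browser with caching disabled.
            with urllib.request.urlopen(url) as response:
                return response.read()

        @functools.lru_cache(maxsize=None)
        def fetch_cached(url):
            # First call hits the network; repeats come from memory, like a normal browser cache.
            return fetch_uncached(url)

        for name, fetch in [("uncached", fetch_uncached), ("cached", fetch_cached)]:
            start = time.time()
            for _ in range(10):
                fetch(URL)
            print(f"{name}: 10 loads in {time.time() - start:.2f}s")

        Same ten page loads, but the uncached run makes ten network round trips while the cached run makes one. That difference is exactly what CR’s setting removed.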

    2. What’s with you, King? Do you ever have anything positive to report, or are you always in complaint mode? Nothing Apple does ever seems to be good enough for you. Are you a troll for the other side? I wonder.

  1. I have no problem with CR using a non-real-world testing scenario, as long as they are open about why they have done so and also test with factory-recommended settings.

    To repeat some of what I said a few days ago:

    “CR should have reported that their testing procedures don’t involve typical user conditions. Furthermore, I believe they should test computers as designed by their manufacturers to operate. That’s how the VAST majority of their readers would use them.

    If they want to test under both normal and artificial but very consistent conditions, that’s OK, but they need to explain to readers what they were doing.”

    It appears they were unaware that their non-real-world settings could cause such divergent results. I’m disappointed but not surprised that they didn’t admit their engineers hadn’t realized the settings they used were not recommended by Apple and were responsible for the poor performance. I read the article, and they are rather sanctimonious in their justification. They do make the point that they test ALL laptops using the same settings, but nowhere do they explain what that really means or that those settings would rarely be used by a consumer.

  2. Agreed that it’s important for a consumer review to test using typical consumer usage patterns.

    However, their methodology (purportedly identical for all platforms) revealed a serious power-management flaw in the MacBooks, which has since been addressed.

    This is the kind of edge-case bug that will drive a small number of people insane, and that Apple would deny to the ends of the earth if it weren’t a publication citing it. Spend 10 minutes in the Apple customer forums to hear those stories. Or just call Apple and ask them about their double-secret, unpublished recall of the trashcan Mac Pro GPUs.

  3. Read the article. They didn’t admit an error; they justified their non-real-world but standardized testing procedure. They never actually explained what that procedure was or why it produced such inconsistent results.

    1. If you look into it, you will see that CR has explained their test procedure quite thoroughly, including why they use this setting for a standardized test.

      Has Apple explained their test method?

  4. Make a “Pro” laptop worthy of the title. Don’t just put some ports, a Touch Bar, and some performance into a RAM-limited, plain-MacBook-style housing. MAKE A PRO VERSION! 15 or 17 inches. Give the pros something to spend their money on. They ~want~ to spend it.

    Put some power in it and then give it a big battery to power it for hours.

    PC World nailed it with this.

    Apple, bring back the MacBook Pro 17 and make your laptops great again
    Make it bigger and more powerful, and give it a big, fat battery and screen.
    http://www.pcworld.com/article/3144054/macs/apple-bring-back-the-macbook-pro-17-and-make-your-laptops-great-again.html

  5. I hope this is the last of MDN’s coverage of the Consumer Reports tests for a long time. More than anything, this episode exposed how fanboys have taken over the internet. When the dust settles, the Consumer Reports test was shown to be as good as any artificial objective test, and Apple has shown three things:

    1. Its software had flaws.
    2. Battery life of the MBP with Touch Bar is not great compared to the competition: poor for something labeled “Pro,” whatever that means at Apple these days.
    3. Apple has a horrible PR team.

    I await a proper 17″ MacBook Pro with true all-day battery life under rigorous use. Enough with the thinness games. Isn’t that what the Air was supposed to be?

  6. If you’re going to change settings away from normal, you’re not doing a real-world test. In that case, why use web browsing as a test at all? Why not run scripts that cycle screen images, do calculations, move files around, etc.? Couldn’t that create a more consistent test? Not real-world, but the tests weren’t anyway, and this would allow better control and consistency without relying on browsers, for example. The test is meant to test the hardware only, not the software, so drop the software except for custom-written test software. Errors in that software are irrelevant, as they are consistent across machines.

    1. Then you can’t have test comparisons between manufacturers, because every manufacturer will tweak its default settings for its own purposes.

      Can anyone recommend another source for independent battery life tests?

      MDN probably doesn’t like The Verge, but Verge editor Jake Kastrenakes reported 6 hours of battery life in his usage. Apple sent him a new MacBook Pro, which of course didn’t solve anything.

      Apple’s first response, removing the battery time-remaining estimate to hide the obvious performance inconsistencies that many people with different settings were seeing, was a slap in the face. Totally the wrong answer to the problem.

      1. The whole point of my test suggestion was to make settings irrelevant. If you’re playing a movie, the display won’t sleep. If you’re also copying terabytes of data and calculating Pi or 42 or whatever, it’s going to stay awake. Do things that settings cannot affect. That would give worst-case battery life, but it would be comparable between machines, although it would unfairly ignore any advances one machine may have in power-saving techniques. Maybe a lightweight test could cover that: just compute Pi, for example, and let the machine sleep the GPU and display or whatever. Then you have best and worst cases for all the machines, directly comparable.
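
        Something like this rough sketch is what I have in mind. It’s hypothetical Python, using macOS’s pmset utility for the battery readout; the workload size is made up for illustration:

        import re
        import subprocess
        import time

        def battery_percent():
            # Parse the charge level out of `pmset -g batt` (macOS only).
            out = subprocess.run(["pmset", "-g", "batt"],
                                 capture_output=True, text=True).stdout
            match = re.search(r"(\d+)%", out)
            return int(match.group(1)) if match else -1

        def burn_cpu(terms=5_000_000):
            # Fixed, repeatable workload: a Leibniz-series Pi approximation.
            # No user-facing setting changes how much work this is.
            total = 0.0
            for k in range(terms):
                total += (-1.0) ** k / (2 * k + 1)
            return 4.0 * total

        start = time.time()
        while battery_percent() > 0:
            burn_cpu()
            hours = (time.time() - start) / 3600
            print(f"{hours:.2f} h elapsed, battery at {battery_percent()}%", flush=True)

        Run it from a full charge, and the last line printed before the machine dies is your worst-case figure, identical in definition on every laptop you point it at.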
