>The new guts are getting you better performance but also better battery life with what Apple says is 10 hours of web browsing or 11 hours of iTunes movie playback.
Movie playback used to be the de facto stress test for a computer's power consumption. Spinning DVDs and hard drives have been replaced with SSDs, and hardware-accelerated decoding of video has replaced maxing out your CPU.
On the other hand, web browsing used to be considered a light use of power: pull some network content into memory, parse some basic HTML, etc. Now with JavaScript EVERYWHERE and the rising complexity of web pages, web browsing has become one of the most taxing things you can do as far as power use is concerned. In fact, on my MacBook Pro, now that OS X tells you which processes are using the most power, web browsers like Safari and Chrome are the only things I ever see show up under "Apps using significant energy".
It's almost purely a function of CPU usage, especially if you have an SSD.
I don't think the Mac power-usage readout takes GPU usage into account, but for that you'd probably want a third-party tool that checks the wattage itself.
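If you do want raw numbers, OS X ships a command-line power sampler; a minimal example (run as root; which samplers are available varies by machine and OS version):

    # Sample CPU and GPU power once a second; `powermetrics -h`
    # lists the samplers your machine actually supports.
    sudo powermetrics --samplers cpu_power,gpu_power -i 1000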
That's because Chrome's permission model isn't very granular. If you're doing something that needs to be able to interact with any website (like that tab suspender thing), you get that scary warning regardless of whether it's actually reading or changing data on all websites, because the permission indicates that it can, and there's no more granular option available to declare.
The other option for extensions is to specify certain sites that they work with (this is the "read and change all data on 'x.com'" permission prompt). That only works for site-specific extensions (like Reddit Enhancement Suite, or CamelCamelCamel).
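For illustration, here is roughly what the site-specific declaration looks like in an extension's manifest.json (a hedged sketch in manifest v2 layout; the extension name is made up). Swapping the origin pattern for "<all_urls>" is what triggers the scary all-websites warning:

    {
      "manifest_version": 2,
      "name": "example-site-specific-extension",
      "version": "1.0",
      "permissions": ["https://x.com/*"]
    }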
Edge is very power efficient on Windows 10, assuming your drivers allow it to do full hardware acceleration. Their rendering stack and JS engine are very competitive.
I find Chrome with uBlock Origin and Ghostery installed is pretty low power. It seems to run at about 1% CPU if I don't watch video. Without the extensions it uses way more.
None. For one, Safari's rendering engine has been open source from the beginning.
You can compile WebKit from scratch, note the similar battery efficiency AND check its code.
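A minimal sketch of doing that, assuming a checkout of the upstream tree (build-webkit and run-safari are helper scripts that ship in the WebKit repo):

    # Grab the WebKit source and build a release configuration.
    git clone https://github.com/WebKit/webkit.git
    cd webkit
    Tools/Scripts/build-webkit --release
    # Run Safari against the freshly built frameworks:
    Tools/Scripts/run-safari --release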
What is it with Apple that brings out the conspiracy theorists in people? (Other pet peeves: "they purposefully remove ports to sell more adapters", "they purposefully cripple mobile web apps", etc.)
Adding to your list: "they released the newest OS for my old iPhone for free in order to slow it down and make me buy a new iPhone".
It's not limited to Apple, but the pattern is the same: it goes immediately from conjecture to absolute certainty of malicious intent (but they didn't count on the speaker being clever enough to see through the scam). And if disproved, it's just the exception that proves the rule, which the speaker then has to loudly reassert.
We need a new rule: Never ascribe to malice what's adequately explained by others having different priorities than you.
Safari's rendering engine has, from the very beginning, incorporated closed-source binary blobs with ambiguous licensing that call into undocumented OS X APIs. See http://arstechnica.com/apple/2008/02/finding-a-worm-in-the-a... It used to be that critical functionality like font rendering went through the binary blobs too.
Apple really does do nasty things to drive short term profits while hurting user experience. Why does Apple charge $130 for a $5 LTE antenna on iPads or $100 for a $3 64GB SSD chip on iPhones? Why do they cripple entry level iPhones with only 16GB? Why do they have a long history (recently ameliorated) of selling Macs with far too little RAM to be useful and charging ten times the market price for upgrades?
Just because they aren't guilty here doesn't mean Apple hasn't behaved badly at some point. And when a company prizes short-term profits from loyal customers over long-term profits and gains in market share, that isn't a conspiracy -- it's a strategy.
>Apple really does do nasty things to drive short term profits while hurting user experience. Why does Apple charge $130 for a $5 LTE antenna on iPads or $100 for a $3 64GB SSD chip on iPhones? Why do they cripple entry level iPhones with only 16GB?
Because market segmentation. It's a concept in sales as old as Adam Smith.
When you think about it, either the 64GB iPhone is overpriced or the 32GB is underpriced, or both.
Possibly, the people buying the 64GB iPhone for slightly more than they should are actually subsidising the people getting the 32GB for less than otherwise.
> either the 64GB iPhone is overpriced or the 32GB is underpriced, or both
The problem is that there isn't a 32GB iPhone 6s, though it would only cost Apple a few pennies per unit to offer one. Instead, it has a 16GB version (too little storage) and then jumps to a 64GB version (which is comparatively overpriced).
Either way, purely in terms of manufacturing costs, it's a choice between "overpriced" and "even more overpriced" -- or, if it makes you feel better, "premium priced" and "even more premium priced". I don't think "subsidising" comes into it.
Well, obviously Apple has to keep driving up short-term profits because having only $200 billion stowed away in cash (mostly overseas, to avoid taxes) is not good enough. If only Apple had $300 billion, or $500 billion, it could really change the world... though I doubt whether paying Foxconn or Pegatron workers a decent basic wage would be part of it ;-)
> None. For one, Safari's rendering engine has been open source from the beginning.
Well, it's not that simple. Safari's WebKit started as a fork of KHTML, which was already open source.
WebKit's development was conducted behind closed doors for about a year. Then Apple released it as open source. The KHTML devs were faced with the options of merging a giant patch, throwing away a year's worth of work and rebasing on WebKit, or continuing on their own.
So:
1) WebKit has not been open source from the start, but only from about a year later.
2) WebKit is open source because it's a fork of KHTML. A wild fork, though: an aggressive, non-cooperative fork. WebKit effectively hijacked and killed KHTML.
3) Energy efficiency is not something that only concerns the rendering engine. There is also the JavaScript engine, the plugins and so on. You're making it a little too simple there.
>Well, it's not that simple. Safari's WebKit started as a fork of KHTML, which was already open source.
I know, I was using Konqueror before it was cool. That's beside the point though.
>1) WebKit has not been open source from the start, but only from about a year later.
Still irrelevant to our discussion.
>2) WebKit is open source because it's a fork of KHTML. A wild fork, though: an aggressive, non-cooperative fork. WebKit effectively hijacked and killed KHTML.
It was also a fork so far advanced from KHTML (even in that first year) that it might as well have been a totally different project. And I should know: I'm one of the (I presume) few in here who used Konqueror and KHTML in KDE 2.0 back in the day (circa 2000-2003) for my then-modest browsing needs. And I know the whole backstory, as I was reading "the dot" (KDE's news portal) at the time.
The thing is, WebKit, the fork, saw far more success than KHTML as an open source project, and became one of the largest open source successes itself. In fact it's so far from being just an Apple thing that its code also powers Chrome, the most popular browser today (and Opera), and numerous other projects.
And, like with KHTML, Google forked WebKit to create Blink. When you want the freedom to shape a project as you please, that's what you do. If you can keep 'em up, forks are nothing to be ashamed of. Some of the most successful projects have been forks (and sometimes they even merged back after many years, e.g. XEmacs).
But still, this is again all beside the point. Your whole comment up to here merely repeats: "This Apple is not that benevolent -- they only made WebKit open source because they based it on an open source project". Nobody argued otherwise, and it's not what's under discussion.
>3) Energy efficiency is not something that only concerns the rendering engine. There is also the JavaScript engine, the plugins and so on. You're making it a little too simple there.
The JavaScript engine is also open source.
And you can check battery efficiency without the plugins -- which aren't any secret either.
You've highlighted one of the many advantages of browsing with JavaScript off by default. I think at this point it should really only be enabled for must-have cases.
Yes, but that's why I use an extension like NoScript. If a site is too badly broken I enable domains one by one until I get enough content; sometimes websites just look or behave oddly but still give you what you came for. It also helps to have FlashBlock, which will stop plenty of Flash adverts from loading in the background.
Maybe, but the typical person doesn't browse the entire web, they browse a tiny part of it. Could be the GP visits sites (such as this one) that don't all depend on JS and there's a positive feedback loop for them.
I've permanently whitelisted about 150 domains, and temporarily whitelist domains as I need them. It's not very problematic and pays dividends in general web browsing speed.
Which also raises the question of why these websites don't offer versions that at least function without all the JS sugarcoating. I miss the early web with its simpler designs.
Because designing, building, maintaining and supporting multiple versions of a site costs more. Imagine a tech support call where the first thing you need to determine is whether the user is on your full-featured site or your reduced-functionality site, and then explaining to them that they're on entirely the wrong one. I'd guess the cost/benefit analysis just doesn't justify the effort in most cases.
The main issue is that people expect the modern web to behave more like an application, which generally requires client-side code. JS is not the only thing to blame, as some CSS can cause the graphics card to switch from 2D to 3D, causing a vast increase in power consumption.
It would be much better if there were a good way to have the server issue partial updates of the DOM in response to user action without needing JS glue to make it happen.
Web developers can do this now if they don't need to easily react to user input: they can use HTTP multipart messages and slowly stream in HTML as needed. It's fairly buggy in current browsers, though, since websites stopped using it once XMLHttpRequest came out.
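A minimal sketch of the chunked-streaming variant in Node/TypeScript (the page content and port are made up; true multipart/x-mixed-replace works similarly but with part boundaries):

    import { createServer } from "http";

    // Stream an HTML page in pieces: the browser renders each chunk as
    // it arrives, so the server can append DOM content with no client JS.
    createServer((_req, res) => {
      res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
      res.write("<html><body><h1>Live updates</h1>");

      let n = 0;
      const timer = setInterval(() => {
        res.write(`<p>Update #${++n}</p>`); // flushed as one HTTP chunk
        if (n === 5) {
          clearInterval(timer);
          res.end("</body></html>");
        }
      }, 1000);
    }).listen(8080);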
I assume the videos available on iTunes are encoded in an H.264 profile that can be decoded entirely on the GPU, so the CPU would indeed be mostly idling during playback.
Yep, you'll also get H.264 served on Safari, while Chrome will get VP9 which isn't hardware decoded and will cause the fans to whirr.
There's an h264ify Chrome extension that forces YouTube to serve H.264 to Chrome on OS X so it can be hardware decoded. It significantly reduces battery drain.
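For the curious: YouTube picks a codec by probing support from JavaScript, so an extension can steer the choice by shimming that probe. A rough sketch of the idea (not h264ify's actual source; it would have to run in the page context before the player script loads):

    // Pretend VP9/WebM is unsupported so YouTube falls back to H.264,
    // which the Mac can decode in hardware.
    const realIsTypeSupported = MediaSource.isTypeSupported.bind(MediaSource);
    MediaSource.isTypeSupported = (mime: string): boolean =>
      /webm|vp9/i.test(mime) ? false : realIsTypeSupported(mime);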
You get accelerated H.264 in Chrome, which you paid for (through Apple), but YouTube prefers to serve VP9 if you can decode it, to save its own licensing fees. MS Edge only reports being able to decode VP9 if you have hardware acceleration for it, for the sake of battery life.
Not so much licensing fees as bandwidth: VP9, as a new-generation codec (H.265 being a rough equivalent), uses less of Google's bandwidth, which probably saves Google a lot of money at the "minor" cost of users' power draw :)
Most Apple devices don't have VP9-capable hardware decoders. Even those that could do it (new Intel chips) don't have that capability enabled or exposed in OS X.
I tried Safari for a while, but there were many annoying quirks, some of which may be fixed now (every nth new tab opening slowly, Safari sync slowing things down, some important-to-me sites not rendering properly, ...).
With Chrome, The Great Suspender[1] seems to work well at reducing background tab CPU.
Interestingly, the thing that drains my battery most is running a VM in VirtualBox, even when it's doing nothing.
I'm not really surprised by this, but it would be interesting to see how much VirtualBox hooks into the power-saving functionality on the host machine, and whether this can be optimized. I'd have thought that a VM that's pretty much idle would be able to somehow utilize the host computer's power-saving methods. Maybe it can, but I haven't enabled it in the VM settings... Any tips?
If you want to prolong battery life with a VM, use Parallels or Fusion, not VirtualBox [1]. I suppose VB just doesn't implement power-saving hooks, or perhaps implementing them would bring too much complexity into the software.
KVM on Linux has similar support for efficient idle: if the guest OS inside the VM handles idle sensibly, KVM itself will idle, allowing the host OS to idle.
I happen to use Go and a Makefile to build my servers, and my deployment process works on OS X and Linux, so I don't end up having to replicate an environment for development.
I looked into Docker on OS X, but I'm waiting for their latest beta, Docker for Mac, to come out. I think something like that, with a lower learning curve, would be more ideal for virtualization. It tightens the feedback loop for experimenting.
I run PostgreSQL on one of my VPS machines, and I dread having to upgrade it all the time.
You understand why a casual reader, when encountering the statement "I use 3% of my battery in 90 minutes on my laptop", will raise an eyebrow, right? The extrapolation to total battery life is a little mind-boggling.
Movie decoding is substantially offloaded to dedicated hardware circuits in the GPU that can do the job with minimum power. HTML rendering runs on the CPU.
> On the other hand, web browsing used to be considered a light use of power
It would make a better test if they reported battery life when web browsing both with and without ad blocking. It would certainly make a noticeable difference.
I feel like there should be a Law out there somewhere that, given multiple choices, marketing will invariably pick the scenario that results in the largest number.
I heard the sentiment from Conte at Georgia Tech with regard to Amdahl's Law and speedup, but it seems to hold generally whenever any kind of numeric performance measure is presented.
I have an rMB from 2015; I easily get 10-13 hours out of it unless I'm using Flash or Java. It's a great portable laptop but it BADLY needs 16GB of RAM. It would/could just be so much better if you weren't limited to 8GB, especially with browsers/websites the way they are these days; you chew that up in no time.