Feeling safer online with Firefox (astithas.com)
378 points by nachtigall on Jan 13, 2017 | hide | past | favorite | 97 comments


Awesome changes.

One suggestion: In the Control Center™, I would recommend using the past-tense for the current state. E.g.,

    Receive Notifications           Allowed X
    Access Your Location            Allowed X
    Maintain Offline Storage        Allowed X
As it exists in the screenshots, the present tense is used, and the X button seems to be associated with the word "Allow." Further clarification could be achieved by making the X button actually say "Disallow" and giving it a border separate from the word "Allowed." E.g.,

    Receive Notifications      Allowed   [Disallow]


> Further clarification could be achieved by making the X button actually say "Disallow" and giving it a border separate from the word "Allowed."

I agree. An X usually means "close" or "hide this thing," whereas here they're using it to change a setting. It really looks like there should be a toggle switch there.


It's exactly right that X means "close" or "hide this thing," and that is what it does in this case as well: it removes the non-default setting and hides the list item.


Thanks for the suggestion, this has now been fixed.


I am sure there are people who would love a browser that can control their car or pacemaker and report their bank balance on the welcome page.

But I personally would feel far more secure if there was a firefox-lite where no sensitive stuff (access camera, share screen) were included to start with. And I don't mean turned off by default, I want it removed at compile time.


As mido22 said, the alternative right now is a closed plugin (Flash) or installing software on your computer completely outside of the sandbox. Putting these features in the browser in a controllable way is a step forward. If you don't need them, Firefox is open source and I'm not just saying that to be glib. You can easily grab the source https://developer.mozilla.org/en-US/docs/Mozilla/Developer_g... and compile it with --disable-webrtc https://developer.mozilla.org/en-US/docs/Mozilla/Developer_g...
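To make that concrete, a minimal mozconfig along those lines might look like this (the --disable-webrtc flag comes from the linked MDN page; the objdir line is just a common convention, so verify both against your source tree):

```shell
# .mozconfig in the source tree root -- sketch, verify option names for your tree
ac_add_options --disable-webrtc

# optional: keep build artifacts out of the source directory
mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/obj-ff-nowebrtc
```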


Replying to myself: ok so I just tried this and it's not that easy. The bootstrap script installed a bunch of dependencies but didn't install rust for some reason. Also Ubuntu 14.04 doesn't have the right version of gcc available so you have to get that basically too. The hg clone operation took forever and timed out a few times. Then my computer ran out of RAM during the build and I had to close some programs and start it up again. Pretty much a giant pain, and I don't even want to try Windows where the first step is "install Visual Studio"!


FWIW, on a fresh ubuntu 16.04 install, this is what needs to be run:

    sudo apt update && sudo apt install python build-essential -y
    wget -O bootstrap.py https://hg.mozilla.org/mozilla-central/raw-file/default/python/mozboot/bin/bootstrap.py && python bootstrap.py
    cd mozilla-unified/
    ./mach build


That takes care of rust too?


re: hg clone timeouts, you can download a leaner hg bundle over HTTP and then "unbundle" it:

https://developer.mozilla.org/en-US/docs/Mozilla/Developer_g...

mozilla-central is just such a big repository that it pushes the usual cloning process to its limits.

The Rust dependency was added to Firefox very recently, so I guess with the holidays it hasn't made it into the bootstrap script and docs yet.


*basically = manually


But I am not familiar enough with the firefox code to do this. I might unknowingly create a whole new set of security issues by changing a few lines in a project of this complexity.


For FF, you first start by looking at their long guide for version control/build instructions: http://mozilla-version-control-tools.readthedocs.io/en/lates...

Then you make a custom mozconfig file to disable or enable features, or to add your own extensions (ie: windows): https://developer.mozilla.org/en-US/docs/Mozilla/Developer_g...

  ac_add_options --disable-activex  
  ac_add_options --disable-activex-scripting
  ac_add_options --disable-installer
  ac_add_options --disable-crashreporter
You can further abstract this with a mozconfig wrapper https://github.com/ahal/mozconfigwrapper

But I agree, I can't find any documentation on what exactly each feature is and why you would or wouldn't want to disable/enable it. The Chromium build process is much more straightforward, or you could just use a 3rd-party sandbox and regular release FF.


Your first link is misleading because it's targeted toward onboarding new Firefox contributors and includes a Mercurial tutorial and development workflow guide in addition to build instructions. Simple build instructions are here:

https://developer.mozilla.org/en-US/docs/Mozilla/Developer_g...

On Linux I clone the source and run "./mach build".


Forgot to add: there's a really good page by the Chrome security team detailing all kinds of functions somebody might wish to remove, such as DNS lookups that expose your IP (the Tails/Tor Browser design document also has this list).


do you have a link?


The suggestion is that you compile it with a flag to disable the feature. No need to change any lines of code.


I think their point is that they don't know enough about the inner workings (read: code) to be sure that turning off one feature doesn't enable far more security vulnerabilities than existed when the feature was on...

They aren't suggesting using compile flags requires code changes...


So get Firefox back to where it was, just a browser that supported extensions. Everything beyond core browsing should be an extension or plugin.

I have no use for WebRTC, so I would not install the addon/plugin. You may want it, so you would. When there was a problem with Mozilla's implementation, I wouldn't have to care, and neither would anyone else who didn't use or want it. Only those that chose to have the functionality would need to be concerned, and even then it could be updated without a full browser update.


If you want that you can use the Pale Moon browser. It's a very fast, non-bloated fork of Firefox that has significantly diverged from FF. See https://www.palemoon.org/technical.shtml for details and in particular why they don't support WebRTC.


See https://www.palemoon.org/technical.shtml for details and in particular why they don't support WebRTC.

The real reason is that they are based off of an old ESR and their code wouldn't be able to inter-operate with anything else. Aside from the issue that they're not getting security patches for it, either...

The recommendation to use an external PDF reader is also not quite something I can stand behind. (Unless that external reader is Chrome...)


Sounds like you're a bit behind the times. Pale Moon 27 is a rebase from FF 38, which has WebRTC. I'm not sure why we should trust your claims, already shown to be misinformed, when the developer's page states something else. The PM community is basically in consensus in not wanting WebRTC.

As for external PDF readers I'd argue that avoiding the push for integrating or redeveloping more and more external applications into a browser is very sane.


rebase from FF 38 which has WebRTC

This is a bit like saying "Firefox 4 has HTML5". The WebRTC stack in Firefox evolved significantly from Firefox 38 to 50, as did the WebRTC standards themselves. That's why I pointed to interoperability issues.


ESR38 stopped getting upstream security updates from Mozilla in April of 2016. But I'm sure the Palemoon folks have stayed on top of things since.


Recommending Pale Moon to folks who care about security is foolish. There is a much bigger story and a bit of internet drama around it, but essentially, when I asked the Pale Moon project lead about their security program on Twitter, he blocked me. The claim they make that they "inherit" Firefox's security properties is not valid, as they introduce significant code changes.


The catch with using something like this - while great from many perspectives - is that you risk vulnerabilities just due to it not being as popular and as commonly attacked.

So if your threat model includes targeted attacks, where an attacker might invest some (even a small) level of effort to find a 0-day vulnerability, I don't think I'd use it.


With Pale Moon the largest risk is that as far as I know the ESR branch they forked away from no longer gets security patches from Mozilla.

So you probably don't need to put in effort to find a 0-day; just browse old Mozilla CVE disclosures.


I'm sure they at least attempt to patch these, but it's often all too easy to screw up a patch and leave some part of the vulnerability still exposed. Look at what happened when Google tried to patch the stagefright vulnerability.


Again with the FUD? I've witnessed that the last couple of big 0-days discovered by the Tor people were patched in Pale Moon before Mozilla pushed out theirs for Firefox.

edit: I'd reply to your response below normally, but apparently I don't get to reply to any comments on HN anymore. The reply button has disappeared. When I log out of my 5-year-old/458-karma account, it's back. I guess my opinion isn't wanted here.

You have a good point there. I bet at least a couple of those are present. But you've also completely missed my point. When looking through the FF known vuln list, the vast majority are for things like WebRTC, WebGL, and other attack surfaces that Pale Moon intentionally avoids.


The whole point of my post, which you seem to have completely and utterly missed, was that you don't need 0-days for exploiting Pale Moon. Every single Mozilla CVE published from Firefox 38 to Firefox 50 is a potential issue for it. The number of 0-days in there is exceedingly low, but the number of CVEs is very high, because Mozilla publishes CVEs for security bugs they find themselves. AFAIK Google also does this, but Microsoft doesn't.

This isn't FUD. You can literally go read the list:

https://www.mozilla.org/en-US/security/known-vulnerabilities...

I count over 174 fixed vulnerabilities and stopped at version 48. Yes, some of these might not apply to Pale Moon because they're new vulnerabilities or it has the relevant feature disabled. You think anyone did the work to go through them all? Let alone backport the ones that are relevant?

Mozilla used to do this work for Pale Moon by virtue of still backporting the most important ones (i.e. not all) to ESR38. Not any more. Good luck!

the vast majority are for things like WebRTC, WebGl, and other attack surfaces that Pale Moon intentionally avoids

Pale Moon supports WebGL nowadays. It's needed for a few things like Google Maps to not suck. Of course, the implementation is outdated, which is perhaps what made you think it's not there...


Pale Moon does not completely ignore these, though, as you seem to be suggesting; they do indeed at least try to patch them.

And I'm betting most of the time they succeed. There may be a few weird ones with edge cases that they've screwed up though and some subset of the vulnerability is still possible.


Scroll through Mozilla's security announcements, pick ones at random, find the patch that fixed it and see if it ever got applied to Pale Moon. In many cases they haven't.

I have pointed out many of these and argued with Pale Moon devs about it. "Moon Child" believes they don't need to apply patches if they can't replicate the PoC from Mozilla's bugzilla.

These are things that are obviously vulnerable and need to be fixed (such as missing bound checks in the XML parser).

If someone ever cared enough to target Pale Moon users they would have an absolute field day with all the known Firefox vulnerabilities they could use.


The reply button has disappeared.

HN delays the visibility of the reply link on threads that seem to be getting too deep too quickly.


At that point you should basically only be using the hardened tor browser based off (currently) Firefox 38. https://blog.torproject.org/blog/tor-browser-65a6-hardened-r...


If your threat model includes targeted attacks I'd have serious doubts even about using Curl.


This ship has sailed, because pushing the Web Platform (which every year exposes more and more of this platform functionality in JS) is a vested interest of Google and Mozilla, for different but related-enough reasons to where they rapidly achieve parity.

You're going to have to dig up a browser whose authors don't subscribe to this vision, or browsers that may intend to do this down the road but haven't caught up yet. For the time being, out of graphical browsers, Dillo and Netsurf are in this category.


I feel far more secure doing a video call with Firefox and WebRTC than I do with the alternatives.


I disagree. I trust WebRTC enough for camera and screen access; the one real issue might be that it exposes your local IP.


It seems uBlock Origin has a setting (right on the main settings page) that can prevent the local IP from being leaked: https://github.com/gorhill/uBlock/wiki/Prevent-WebRTC-from-l...

You can test it here: https://browserleaks.com/webrtc

I tried that test page with the mentioned setting 1) enabled and 2) disabled. I did not see my local IP with 1); I did see it reported with 2) - so it seems the block works. NOTE: Whether uBlock Origin was enabled for the specific page did not matter; the WebRTC block setting seems to be global and independent of whether the extension blocks anything else on the site.


> And I don't mean turned off by default, I want it removed at compile time.

Many features can be turned off globally in about:config in a way that will not result in an opt-in prompt and will also reduce the javascript API surface.

Is that not good enough?
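For reference, these kinds of global opt-outs can also be set via a user.js file in the profile directory. A sketch (these pref names are the standard about:config ones from this era of Firefox, but double-check them in about:config for your version):

```javascript
// user.js -- global opt-outs; verify pref names in about:config for your version
user_pref("media.peerconnection.enabled", false);   // disable WebRTC entirely
user_pref("media.navigator.enabled", false);        // disable camera/microphone access
user_pref("geo.enabled", false);                    // disable geolocation
user_pref("dom.webnotifications.enabled", false);   // disable web notifications
```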


I'd imagine that the closest we have right now would be something like the Tor browser bundle. IIRC some things are outright removed, but many others are just disabled by default.


Let's patch dillo and make it an HTTP static document reader, and embed lua for everything else.


Don't downvote or I'll do it out of spite :)


This is a good point. Is there just a global variable? allowCameraAccess = false;


Yup, there are instructions on this test page: https://browserleaks.com/webrtc#webrtc-disable


The status "Use the Camera - Allow - X" can be confusing. Is the site currently allowed to use the camera, or not? The word Allow could either mean "currently allowed" or "click to allow." The X could mean either "currently blocked" or "click to block."



The X has a tooltip that says "Clear this permission and ask again" so it's quite clear to me.


Right, but it would be much more clear, without needing to read the tooltip, if it just said "Allowed - X", which IMHO is pretty clear that it is currently allowed and can be revoked with the 'X'.


Agree - this was the first thought I had and came here to comment on it.

When using the UI it's a bit more clear: you can't click "Allow". But just looking at the screenshot, I had the impression this was a request dialog somehow. I thought Allow was clickable to give permission, and the X was to dismiss being told the site wanted to know about the permission request.

The little X doesn't seem like a common UI element to indicate "block", but it's probably ugly to say "Allowed - [Block]".

Another idea might be to write "Currently allowed" or "This site has access to" under the word "Permissions".


The incorrect system time detection is a small feature, but actually pretty neat. I've run into that before, when testing time-sensitive features in my software and forgetting to change it back, then wondering why on Earth nothing secure works anymore.


Any reason why you would not try time-sensitive stuff in a VM?


Easier to change the system clock than set up a VM :P Especially for macOS.


Anyone know how it detects the system time is wrong?
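It isn't documented in the post, but one plausible mechanism (a guess, not Firefox's actual code) is that when a certificate validation error occurs, the browser checks whether the local clock falls outside the certificate's validity window. A toy sketch of that check:

```python
from datetime import datetime, timezone

def clock_outside_validity(now, not_before, not_after):
    """Return True if the local clock falls outside the certificate's
    validity window, which would suggest clock skew rather than a bad cert."""
    return now < not_before or now > not_after

# Hypothetical validity window, purely for illustration
nb = datetime(2016, 1, 1, tzinfo=timezone.utc)
na = datetime(2018, 1, 1, tzinfo=timezone.utc)

print(clock_outside_validity(datetime(2010, 6, 1, tzinfo=timezone.utc), nb, na))  # True
print(clock_outside_validity(datetime(2017, 6, 1, tzinfo=timezone.utc), nb, na))  # False
```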


I run Firefox on Linux for personal online banking. It's the only browser that I am able to run with Tomoyo Linux in enforcing mode (level 3). I'm sure, given enough time, I could build a Tomoyo policy for Chrome, but it's far more verbose than for Firefox, and the last few times I tried, I gave up.


If you don't mind answering: How usable is Tomoyo for you overall? How much extra time does it consume? Are there things that just don't run? Are there many bugs in applications that do run? (Also, which distro are you using?) Thanks!


Debian 8. I find Tomoyo to be the most usable of the LSMs. You can be up and running in a few days with rather complex policies. I could not get Chrome to work with it. It just reads and writes all over the place in ways that are hard to manage. No bugs yet.


Am I the only one who finds the "new" (well, it's been there for about a year, I believe) "Site Identity and Permissions Panel" to be literally useless for the "site identity" part?

It has no information on the CA, whether it's the first time you've seen this exact certificate, whether "weak" or "strong" ciphers are used (and if PFS is enabled), etc. - things one would really want to see if they care about their connection's encryption and authentication. It's all still available, but hidden behind a long sequence of button clicks. Heck, it would be useful to have client certificate and HTTP auth status there as well - it would actually bring those nice things closer to being usable.

I really fail to understand why it can't be displayed in a sanely concise manner - and why things that were there before were removed. Surely there's plenty of screen space, and it's not like it would scare Joe Sixpack off to Chrome, or confuse anyone. Or do analytics show otherwise?


There have been a number of user studies done by Mozilla and other browser vendors that clearly show end users only have a basic grasp of the security properties of the web. So yes, more details in main UI elements leads to confusion. It's just 2 clicks for those of us who know what a certificate is (Ctrl/Cmd-I is even faster).


> It's just 2 clicks

Three, actually. One on the i+lock icon pair, one on the right-pointing arrow, then one on "more info". If I need more details (like issue and expiry dates, which are pretty common things to be interested in), it's 1 more button, "view certificate". And if I happen to be interested in certificate public key properties (algorithm and key size), it's a really long story, 6 clicks away from the address bar.

It certainly makes sense to not show something right away, on the first click. But the current UI hides quite essential information (to those who can understand it) way too deep. I'm really not persuaded it would hurt usability and confuse users if such information would be 2 clicks away, rather than 4-6.

(And, really, it couldn't hurt to show at least "have I visited this page prior to today? yep, 234 times" on the very first click. And it probably won't confuse anyone much to also see something like "TLS 1.2, modern ciphers" or "TLS 1.0, legacy ciphers".)

> Ctrl/Cmd-I

Toggles bookmarks sidebar for me. I'm unaware of any shortcut to open page info.


I'm also not a fan of all the UI reduction and "simplification" just for the sake of a refresh. Needless to say, the only reason I'm still able to use Firefox is Classic Theme Restorer. Once that is unable to run, I'll be off to Opera or Ungoogled Chromium, or maybe Servo (hopefully we get a few years for "power" users before that gets gutted too).


The more features get added, the more the attack surface increases on Firefox and all other browsers.

Feeling safe, and being safe are two different things.

Same goes for self-signed (or expired) certificates and "not secure" connections; they are not by definition "not secure".


Not sure you can call these features; they're basically options to see what permissions you have given and what is being used by whom. As for the features question, any day I would prefer camera access through WebRTC over installing and using the Flash plug-in.


Could not agree more (see my other comment). Firefox should slim down to reduce the attack surface.


Reduced attack surface or standards compliance; pick one.


Nice post. It's hard to notice the progress made in secondary UI elements such as security panels.


Making it easy to see permissions for the current site is great, but why is there no way to see all the sites that have special permissions? Firefox used to have about:permissions but that was removed last year.


about:permissions was an incomplete experimental UI, but a fully functional replacement is high on our priority list. We are waiting on a new design from the UX team at the moment.


It's a step in the right direction, but I would certainly feel safer if, in addition to cookies/storage/geolocation permissions, Firefox allowed whitelisting JavaScript on certain domains out of the box, with no need to resort to NoScript. Using NoScript results in two different whitelist mechanisms with completely different UIs, which breaks the browsing experience.

Ironically, as far as "privacy-oriented browsers" go, Chrome has domain whitelisting of Cookies/JS/Plugins easily accessible from address bar and it works as expected.


> In the new design, permission prompts stay up even when you interact with the page.

I think this is going to be a nice improvement. It was way too easy to "lose" the permissions dialog in the older flow.


I love Firefox. My big problem as of late, though, is that sites stop loading intermittently and need to be refreshed, or I need to wait (and I don't have this problem on Chrome).

Also, I got kind of annoyed when one of their leaders came begging for donations by email while getting paid FAR beyond a normal wage.


Extensions still run in Private Tabs, unlike with Chrome, so they are free to phone home about your private browsing as much as they'd like. This is the real privacy hole that still needs to be fixed.


The only thing that would make me feel safer would be the sandbox.


The Windows version of Firefox is already sandboxed. (The other platforms are too, but not in release)


Mbox exists for this, or Firejail, Sandboxie (Windows), or OSX sandbox-exec: https://pdos.csail.mit.edu/archive/mbox/

https://wiki.mozilla.org/Security/Sandbox


Sandboxie is nonfree and after a trial period only allows sandboxing one app at a time. Firejail just had a local privilege escalation exploit, but I'm still using it (although more cautiously than before). Mbox appears unmaintained, and sandbox-exec isn't even for a platform I use.

I'm quite glad that Firefox implements sandboxing of its own.


Look at the sandbox used for SubgraphOS (Debian/Jessie only) https://github.com/subgraph/oz


The goal of browser sandboxing is to protect different sites from each other. It does me no good if some malicious ad uses some plugin bug or JavaScript heap spray or something, and my sandbox successfully prevents the exploit from escaping the browser, but the same browser also has my bank open in another tab.

Given that 95% of what I do on my personal computer is in the browser, sandboxing the rest of my computer from the browser is sort of a https://xkcd.com/1200/ situation.


I'm excited to see FF undertake security this way. It's the right thing to do.


Firefox wasn't even looked at in the Pwn2Own competition because it's too easy to hack and doesn't use good OS or sandbox protection: https://it.slashdot.org/story/16/02/12/034206/pwn2own-2016-w...


After an absolutely massive engineering effort to make the browser multi-process, Firefox is starting to roll out OS-level sandboxing with Firefox 50.

https://blog.mozilla.org/futurereleases/2016/12/21/update-on...

The statement quoted in Slashdot is really unfair.


Not at the time it was made. This blog post is from December, the slashdot post from February. The feature wasn't even being rolled out until August. We'll see what happens at the next Pwn2Own...


The underlying engineering effort (e10s) was well underway by that point in February.


E10s has been underway for years though.


So does Firefox have security sandboxing on my Mac nowadays? No...


OS-level sandboxing for the content process is enabled for macOS as of Firefox 52, which is the current release on the Developer Edition (Aurora) channel and will be rolling out to stable come March. It's currently in stable on Windows and Nightly on Linux. Sandboxing for media plugins is already stable on macOS, Windows, and Linux.


Good to know that this is finally there. I would rather use Safari than a browser without OS sandboxing (which is a no go for me personally).


How do you know with such certainty that it is because it is too easy to hack and not because Chrome is the current "big fish" among browsers and Google gives monetary rewards to white hats, both of which could reasonably fuel disproportionate interest in breaking Chrome?


For what it is worth, Firefox also has a security bounty program, one that is older than Chrome itself. :) Of course, the payouts are smaller than those from Pwn2Own, but on the other hand, reporters don't have to demonstrate a full exploit that can actually execute arbitrary code.


Reasons aside, the fact remains that when a group of white-hat hackers says they "won't bother" with a given target, it doesn't speak well to that target's security. There are no doubt tons of exploits in Firefox still waiting to be found, as there are in Chrome, Opera, Edge, etc. Given that the Tor Browser is built on top of Firefox, it's a huge loss for everyone that Firefox is not being included in the attacks; the vulnerabilities the white hats find would be reported and promptly plugged, rather than left open for nefarious three-letter agencies to exploit.


> Reasons aside

No. Reasons not aside. Reasons are very important.


If we don't know what the reason was, and they won't elaborate on it, what importance does it have? Pwn2Own 2016 already happened. Firefox wasn't included, for whatever reason. The damage is already done at this point, as its exclusion created the impression that it was "not worth attacking".

If the guys had said, "We didn't bother with Firefox because they weren't willing to pay us as much as Google or Microsoft", okay. But they didn't. What they said was:

"We wanted to focus on the browsers that have made serious security improvements in the last year," said Brian Gorenc, manager of Vulnerability Research at HPE.

And now Firefox looks weak by comparison.


> If the guys had said

Why would they have said that, even if it were true? What's the upside for them?

I agree the net result is the same whether they were honest or not, of course, which is again why there was no upside for them to say that if it were true.


Or maybe it's because Mozilla refused to pay money to the pwn2own organizers (unlike the other browsers)? Hard to say which one is more likely.


But the Safari browser is? And Tor Browser uses Firefox?


It wasn't looked at because no one paid a bounty for it. And the stated reason for that was that Firefox simply hadn't had any major architectural changes since the last Pwn2Own, i.e. they probably wouldn't find anything new. If they had expected to find many vulnerabilities, it would have made more sense to target it, as the whole point of these events is to find vulnerabilities so that they can be fixed.


>It wasn't looked at, because no one paid a bounty for it.

I don't think you understand how PWN2OWN works. And the poster you replied to was right, at the time of PWN2OWN 2016 Firefox was the only browser without a working sandbox. It would have been very easy to compromise it.



