This is the Connected Devices Meetup where we will have 3 speakers presenting their slides or demos and answering questions.
The following changes have been pushed to bugzilla.mozilla.org:
-  add support for the hellosplat tracker to ‘see also’
-  intermittent internal error: “file error – nav_link: not found” (also manifests as fields_lhs: not found)
-  backport upstream bug 1263923 to bmo/4.2 – X-Bugzilla-Who header is not set for flag mails
-  fix a grammatical error in section 2.6.1 of the user guide (2.6) in the BMO documentation
-  Don’t see a way to redirect a needinfo request (in Experimental UI)
-  clickjacking is possible on “view all” and “details” attachment pages
Discuss these changes on mozilla.tools.bmo.
Firefox: Mozilla and Canonical extend Ubuntu partnership
Mozilla and Canonical today announced the extension of their partnership. Firefox will thus remain the default browser on Ubuntu. In addition, starting this year, Mozilla will ship Firefox builds in the new snap format.
Hello, SUMO Nation!
Let’s get the big things out of the way – we met last week in Berlin to talk about where we are and what’s ahead of us. While you will see changes and updates appearing here and there around SUMO, the most crucial result is the start of a discussion about the future of our platform – a discussion about the technical future of SUMO. We need your feedback about it as soon as possible. Read more details here – and tell us what you think.
This is just the beginning of a discussion about one aspect of SUMO, so please – don’t panic and remember: we are not going away, no content will be lost (but we may be archiving some obsolete stuff), no user will be left behind, and (on a less serious note) no chickens will have to cross the road – we swear!
Now, let’s get to the updates…
Welcome, new contributors!
- Swarnava, for being his usual awesome self ;-)
- The forum supporters who helped users out for the last week.
- The writers of all languages who worked on the KB for the last week.
We salute you!
Don’t forget that if you are new to SUMO and someone helped you get started in a nice way, you can nominate them for the Buddy of the Month!
Most recent SUMO Community meeting
SUMO Community meeting…
- …is happening on WEDNESDAY the 27th of April – join us!
- Reminder: if you want to add a discussion topic to the upcoming meeting agenda:
- Start a thread in the Community Forums, so that everyone in the community can see what will be discussed and voice their opinion here before Wednesday (this will make it easier to have an efficient meeting).
- Please do so as soon as you can before the meeting, so that people have time to read, think, and reply (and also add it to the agenda).
- If you can, please attend the meeting in person (or via IRC), so we can follow up on your discussion topic during the meeting with your feedback.
- Once again – the most important news of the week (month?) – WE NEED YOUR FEEDBACK.
- The second most important update… at least today ;-) We are looking for more contributors to our blog. Do you write on the web about open source, communities, SUMO, Mozilla… and more? Do let us know!
- Want to know what’s going on with the Admins? Check this thread in the forum.
- Ongoing reminder: if you think you can benefit from getting a second-hand device to help you with contributing to SUMO, you know where to find us.
- Madalina is plotting and planning for Sprinklr’s future and looking at other tools that could support our valiant Army of Awesome!
- Ongoing reminder: We have a training out there for all those interested in Social Support – talk to Madalina or Costenslayer on #AoA (IRC) for more information.
- We have an ongoing QuickTime discussion, since Apple is no longer providing updates for the QuickTime Player and it has been blocked on Windows machines. Huge thanks to everyone who jumped in to help out with this hot potato!
- Please review the discussion about the upcoming 46 release – if you notice any major issues please let us know on that thread.
- Remember, if you’re new to the support forum, come over and say hi!
- Hackathons everywhere! Well, at least in Stockholm, Sweden (this Friday) and Prague, Czech Republic (next Friday). Contact information in the meeting notes!
- A guest post all about a certain group of our legendary localizers coming your way – it will be a great read, I guarantee!
- An update post about SUMO l10n coming over the weekend, because there ain’t no rest for the wicked.
…and that’s it for today! We hope you enjoyed the update and will stick around for more news this (and next) week. We are looking forward to seeing you all around SUMO – KEEP ROCKING THE HELPFUL WEB!
The chaotic early days of a new computing era are an extended period of product innovation and experimentation. But both the form and function of new products are still strongly influenced by the norms and transitional technologies of the waning era. New technologies are applied to new problems, but often those new technologies are not yet mature enough to support early expectations. The optimal form factors, conceptual metaphors, and usage idioms of the new era have yet to be fully explored and solidified. Looking back from the latter stages of a computing era, early-era products appear crude and naive.
This is a great time to be a product innovator or an enthusiastic early adopter. But don’t get too comfortable with the present. These are still the early days of the Ambient Computing Era and the big changes are likely still to come.
Once a month, web developers from across the Mozilla Project get together to talk about our side projects and drink, an occurrence we like to call “Beer and Tell”.
First up was emceeaich, who shared Memory Landscapes, a visual memoir of the life and career of artist and photographer Laurie Toby Edison. The project is presented as a non-linear collection of photographs, in contrast to the traditionally linear format of memoirs. The feature that emceeaich demoed was “Going to Brooklyn”, which gives any link a 1/5 chance of showing shadow pictures briefly before moving on to the linked photo.
lorchard: DIY Keyboard Prototype
Next was lorchard, who talked about the process of making a DIY keyboard using web-based tools. He used keyboard-layout-editor.com to generate a layout serialized in JSON, and then used Plate & Case Builder to generate a CAD file for use with a laser cutter.
A flickr album is available with photos of the process.
lorchard: Jupyter Notebooks in Space
lorchard also shared eve-market-fun, a Node.js-based service that pulls data from the EVE Online API and pre-digests useful information about it. He then uses a Jupyter notebook to pull data from the API and analyze it to guide his market activities in the game. Neat!
Pomax: React Circle-Tree Visualizer
Pomax was up next with a new React component: react-circletree! It depicts a tree structure using segmented concentric circles. The component is very configurable and can be styled with CSS, as it is generated via SVG. While built as a side project, the component can be seen in use on the Web Literacy Framework website.
Pomax: HTML5 Mahjong
Also presented by Pomax was an HTML5 multiplayer Mahjong game. It allows four players to play the classic Chinese game online by using socket.io and a Node.js server to connect the players. The frontend is built using React and Webpack.
groovecoder and John Dungan: Codesy
Last up was groovecoder and John Dungan, who shared codesy, an open-source startup addressing the problem of compensation for fixing bugs in open-source software. They provide a browser extension that allows users to bid on bugs as well as name their price for fixing a bug. Users may then provide proof that they fixed a bug, and once it is approved by the bidders, they receive a payout.
If you’re interested in attending the next Beer and Tell, sign up for the firstname.lastname@example.org mailing list. An email is sent out a week beforehand with connection details. You could even add yourself to the wiki and show off your side-project!
See you next month!
The Open Transport meetups are regular gatherings around open and collaborative initiatives in the mobility sector (projects based on open...
Mozilla's Web Literacy Map Teaches the Essential Web Skills Everyone Should ...
Reading, writing, and math are no longer the only essential subjects everyone should learn. Today's essential skills include navigating the web, writing code, and engaging with others online. This web literacy map from Mozilla presents activities that ...
Just over a year ago, in bug 1145270, we removed the root certificate of e-Guven (Elektronik Bilgi Guvenligi A.S.), a Turkish CA, because their audits were out of date. This is part of a larger program we have to make sure all the roots in our program have current audits and are in other ways properly included.
Now, we find that e-Guven has contrived to issue an X509 v1 certificate to one of their customers.
The latest version of the certificate standard X509 is v3, which has been in use since at least the last millennium. So this is ancient magic and requires spelunking in old, crufty RFCs that don’t use current terminology but as far as I can understand it, whether a certificate is a CA certificate or an end-entity certificate in X509v1 is down to client convention – there’s no way of saying so in the certificate. In other words, they’ve accidentally issued a CA certificate to one of their customers, much like TurkTrust did. This certificate could itself issue certificates, and they would be trusted in some subset of clients.
But not Firefox, fortunately, thanks to the hard work of Kathleen Wilson, the CA Certificates module owner. Neither current Firefox nor the current or previous ESR trust this root any more. If they had, we would have had to go into full misissuance mode. (This is less stressful than it used to be due to the existence of OneCRL, our system for pushing revocations out, but it’s still good to avoid.)
Now, we aren’t going to prevent all misissuance problems by removing old CAs, but there’s still a nice warm feeling when you avoid a problem due to forward-looking preventative action. So well done Kathleen.
This is our weekly gathering of Mozilla's Web QA team filled with discussion on our current and future projects, ideas, demos, and fun facts.
Former Mozilla VP Darren Herman is starting his own holding company to take ...
Darren Herman, Mozilla's former VP of content services, is launching a new holding company that aims to take on advertising stalwarts like WPP and Omnicom. The as-yet-unnamed company is described on Herman's LinkedIn profile as "the refugee camp ...
Mozilla Firefox Web Browser to Be Available as a Snap Package for Ubuntu 16.04
Mozilla today, April 21, 2016, announced the availability of future releases of their popular Firefox web browser in the snap package format for Ubuntu 16.04 LTS. Earlier today, Canonical unleashed the final release of the highly anticipated Ubuntu 16 ...
At Mozilla, we strive to offer users a great experience based on transparency, choice and trust, and to make Firefox available across many platforms, devices and operating systems. Today, Mozilla and Canonical are renewing their partnership to make Firefox the default browser for Ubuntu users. We are proud to have been a partner of choice for Ubuntu for over a decade. Canonical and Mozilla share a similar heritage as open-source and community-supported organizations.
Ubuntu version 16.04 introduces the snap infrastructure. With the snap format, we will be able to continually optimize Firefox on Ubuntu. In keeping with our rapid engineering release cycle, the snap format will allow us to provide Linux users with the most up-to-date features, in particular security patches, even after major operating system ship dates.
Previously, a static version of Firefox would ship with each new Operating System version for the lifecycle of that OS. With the snap format, new features can be released to users of older OS versions too. Later this year, we will offer Firefox in snap format making it easier to push the browser directly to users rather than relying on an intermediary to accept updates before they reach users.
This is my third Firefox release as release manager, and the fifth that I’ve followed closely from the beginning to the end of the release cycle. (31 and 36 as QA lead; 39, 43, and 46 as release manager.) This time I felt more than usually okay with things, even while there was a lot of change in our infrastructure and while we started triaging and following even more bugs than usual. No matter how on top of things I get, there is still chaos and things still come up at the last minute. Stuff breaks, and we never stop finding new issues!
I’m not going into all the details because that would take forever and would mostly be me complaining or blaming myself for things. Save it for the post-mortem meeting. This post is to record my feeling of accomplishment from today.
During the approximately 6 week beta cycle of Firefox development we release around 2 beta versions per week. I read through many bugs nominated as possibly important regressions, and many that need review and assessment to decide if the benefit of backporting warrants the risk of breaking something else.
During this 7 week beta cycle I have made some sort of decision about at least 480 bugs. That usually means that I’ve read many more bugs, since figuring out what’s going on in one may mean reading through its dependencies, duplicates, and see-alsos, or whatever someone randomly mentions in comment 45 of 96.
And today I got to a point I’ve never been at near the end of a beta cycle: Zarro Boogs found!
This is what Bugzilla says when you do a query and it returns 0. I think everyone likes saying (and seeing) “Zarro Boogs”. Its silliness expresses the happy feeling you get when you have burned down a giant list of bugs.
This particular query is for bugs that anyone at all has nominated for the release management team to pay attention to.
Here is the list of requests for uplift (or backporting, same thing) to the mozilla-beta repo:
Yes!! Also zarro boogs.
Since we build our release candidate a week (or a few days) from the mozilla-release repo, I check up on requests to uplift there too:
PEAK ZARRO BOOGS.
For the bugs that are unresolved and that I’m still tracking into the 46 release next week, it’s down to 4: Two fairly high volume crashes that may not be actionable yet, one minor issue in a system addon that will be resolved in a planned out-of-band upgrade, and one web compatibility issue that should be resolved soon by an external site. Really not bad!
Our overall regression tracking has a release health dashboard shown on displays in many Mozilla offices. Blockers: 0. Known new regressions that we are still working on and haven’t explicitly decided to wontfix: 1. (But this will be fixed by the system addon update once 46 ships.) Carryover regressions: 41; about 15 of them are actually fixed but not marked up correctly yet. The rest are known regressions we shipped with already that still aren’t fixed. Some of those are missed uplift opportunities. We will do better in the next release!
In context, I approved 196 bugs for uplift during beta, and 329 bugs for aurora. And we fix several thousand issues in every release during the approximately 12-week development cycle. Which of those should we pay the most attention to, and which can be backported? Release managers act as a sort of Maxwell’s Demon to let in only particular patches …
Will this grim activity level for the past 7 weeks and my current smug feeling of being on top of regression burndown translate to noticeably better “quality”… for Firefox users? That is hard to tell, but I feel hopeful that it will over time. I like the feeling of being caught up, even temporarily.
Here I am with drink in hand on a sunny afternoon, toasting all the hard-working developers, QA testers, beta users, release engineers, PMs, managers and product folks who did most of the actual work to fix this stuff and get it firmly into place in this excellent, free, open source browser. Cheers!
Some time around 4 weeks ago, a few of us got together to investigate what it would take to implement the Electron API on top of Gecko. Electron consists of two parts: a Node environment with a few additional Node modules, and a lightweight embedding API for opening windows that point to a local or remote web page in order to display UI. Project Positron tries to create an Electron-compatible runtime built on the Mozilla technology stack, that is, Gecko and SpiderMonkey.
While a few of my colleagues are busy working on Positron itself, I have been working on SpiderNode, which is intended to be used in Positron to implement the Node part of the Electron API. SpiderNode has been changing rapidly since 3 weeks ago when I made the initial commit.
SpiderNode is still in its early days, and is not yet complete. As such, we still can’t link the Node binary successfully since we’re missing quite a few V8 APIs, but we’re making rapid progress towards finishing the V8 APIs used in Node. If you’re curious to look at the parts of the V8 API that have been implemented so far, check out the existing tests for spidershim.
I have tried to fix the issues that new contributors to SpiderNode may face. As things stand right now, you should be able to clone the repository and build it on Linux and OS X (note that as I said earlier we still can’t link the node binary, so the build won’t finish successfully, see README.md for more details). We have continuous integration set up so that we don’t regress the current state of the builds and tests. I have also written some documentation that should help you get started!
Please see the current list of issues if you’re interested in contributing to SpiderNode. Note that SpiderNode is under active development, so if you’re considering contributing, it may be a good idea to get in touch with me to avoid working on something that is already being worked on!
mconley livehacks on real Firefox bugs while thinking aloud.
I post these updates every 3 weeks to inform add-on developers about the status of the review queues, add-on compatibility, and other happenings in the add-ons world.
The Review Queues
In the past 3 weeks, 902 add-ons were reviewed:
- 846 (94%) were reviewed in fewer than 5 days.
- 27 (3%) were reviewed between 5 and 10 days.
- 29 (3%) were reviewed after more than 10 days.
There are 73 listed add-ons awaiting review.
You can read about the recent improvements in the review queues here.
If you’re an add-on developer and are looking for contribution opportunities, please consider joining us. Add-on reviewers get invited to Mozilla events and earn cool gear with their work. Visit our wiki page for more information.
Compatibility Communications
Most of you should have received an email from us about the future compatibility of your add-ons. You can use the compatibility tool to enter your add-on ID and get some info on what we think is the best path forward for your add-on.
To ensure long-term compatibility, we suggest you start looking into WebExtensions, or use the Add-ons SDK and try to stick to the high-level APIs. There are many XUL add-ons that require APIs that aren’t available in either of these options, which is why we’re also asking you to fill out this survey, so we know which APIs we should look into adding to WebExtensions.
The compatibility blog post for 47 is up. The bulk validation will be run soon. Make sure that the compatibility metadata for your add-on is up to date, so you don’t miss these checks.
As always, we recommend that you test your add-ons on Beta and Firefox Developer Edition to make sure that they continue to work correctly. End users can install the Add-on Compatibility Reporter to identify and report any add-ons that aren’t working anymore.
Extension Signing
The wiki page on Extension Signing has information about the timeline, as well as responses to some frequently asked questions. The current plan is to remove the signing override preference in Firefox 47 (updated from 46).
This is the SUMO weekly call. We meet as a community every Wednesday, 17:00-17:30 UTC. The etherpad is here: https://public.etherpad-mozilla.org/p/sumo-2016-04-20
Mozilla designs new browser
Mozilla is working on a completely new version of Firefox, one with an entirely new core. For the time being, though, the old Firefox is not going away. Tofino is the name of the project, on which, in the coming months, a small team of ...
When I started writing my very own password generation extension I didn’t know much about the security aspects. In theory, any hash function should do in order to derive the password because hash functions cannot be reversed, right? Then I started reading and discovered that one is supposed to use PBKDF2. And not just that, you had to use a large number of iterations. But why?
Primary threat scenario: Giving away your master password
That’s the major threat with password generators: some website manages to deduce your master password from the password you used there. And once they have the master password they know all your other passwords as well. But how can this happen if hash functions cannot be reversed? Problem is, one can still guess your master password. They will try “password” as master password first — nope, this produces a different password for their site. Then they will try “password1” and get a match. Ok, now they know that your master password is most likely “password1” (it could still be something else but that’s quite unlikely).
Of course, a number of conditions have to be met for this scenario. First, a website where you have an account has to be malicious — or simply leak its user database, which isn’t too unlikely. Second, they need to know the algorithm you used to generate your password. However, in my case everybody knows now that I’m using Easy Passwords, no need to guess. And even for you, it’s generally better not to assume that they won’t figure it out. And third, your master password has to be guessable within “finite” time. Problem is, if people start guessing passwords with GPUs, most passwords fall way too quickly.
So, how does one address this issue? First, the master password clearly needs to be a strong one. But choosing the right hashing algorithm is also important. PBKDF2 makes guessing hard because it is computationally expensive — depending on the number of iterations generating a single password might take a second. A legitimate user won’t notice this delay, somebody who wants to test millions of guesses however will run out of time pretty quickly.
There are more algorithms, e.g. bcrypt and scrypt are even better. However, none of them found its way into Firefox so far. Since Easy Passwords is using the native (fast) PBKDF2 implementation in Firefox it can use a very high number of iterations without creating noticeable delays for the users. That makes guessing master passwords impractical on current hardware as long as the master password isn’t completely trivial.
Finally, it’s a good measure to use a random salt when hashing passwords – different salts would result in different generated passwords. A truly random salt would usually be unknown to potential attackers and make guessing master passwords impossible. However, that salt would also make recreating passwords on a different device complicated, one would need to back up the salt from the original device and transfer it to the new one. So for Easy Passwords I chose a compromise: the salt isn’t really random, instead the user-defined password name is used as salt. While an attacker will normally be able to guess the password’s name, it still makes his job significantly more complicated.
What about other password generators?
In order to check my assumptions I looked into what the other password generators were doing. I found more than twenty password generator extensions for Firefox, and most of them apparently didn’t think much about hashing functions. You have to keep in mind that none of them gained significant traction, most likely due to usability issues. The results outlined in the table below should be correct, but I didn’t spend much time figuring out how these extensions work. For a few of them I noticed issues beyond their choice of a hashing algorithm; for others I might have missed such issues.

| Extension | User count | Hashing algorithm | Security |
| --- | --- | --- | --- |
| Password Hasher | 2491 | SHA1 | Very weak |
| PwdHash | 2325 | HMAC+MD5 | Very weak¹ |
| Hash Password Generator | 291 | Custom (same as Magic Password Generator) | Very weak |
| Password Maker X | 276 | SHA256/SHA1/MD4/MD5/RIPEMD160, optionally with HMAC | Very weak |
| masterpassword for Firefox | 155 | scrypt, cost parameter 32768, user-defined salt | Medium² |
| uPassword | 115 | SHA1 | Very weak |
| vPass Password Generator | 88 | TEA, 10 iterations | Weak |
| Passwordgen For Firefox 1 | 77 | SHA256 | Very weak |
| Phashword | 57 | SHA-1 | Very weak |
| Passera | 52 | SHA-512 | Very weak |
| My Password | 51 | MD5 | Very weak |
| HashPass Firefox | 48 | MD5/SHA1/SHA256/SHA512 | Very weak |
| UniPass | 33 | SHA-256, 4,096 iterations | Weak |
| RndPhrase | 29 | CubeHash | Very weak |
| PasswordProtect | 28 | SHA1, 10,000 iterations | Weak |
| PswGen Toolbar v2.0 | 24 | SHA512 | Very weak |
| UniquePasswordBuilder Addon | 13 | scrypt, cost factor 1024 by default | Strong³ |
| hash0 | 9 | PBKDF2+HMAC+SHA256, 100,000 iterations, random salt | Very strong⁴ |
| MS Password Generator | 9 | SHA1 | Very weak |
| Vault | 9 | PBKDF2+HMAC+SHA1, 8 iterations, fixed salt | Weak |
| BPasswd2 | 8 | bcrypt, 64 iterations by default, user-defined salt | Weak⁵ |
| Persistent "Magic" Password Generator | 8 | MurmurHash | Very weak |
| BPasswd | 7 | bcrypt, 64 iterations | Weak |
| SecPassGen | 2 | PBKDF2+HMAC+SHA1, 10,000 iterations by default | Weak⁶ |
| Magic Password Generator | ? | Custom | Very weak |
1 The very weak hash function isn’t even the worst issue with PwdHash. It also requires you to enter the master password into a field on the web page. The half-hearted attempts to prevent the website from stealing that password are easily circumvented.
2 Security rating for masterpassword downgraded because (assuming that I understand the approach correctly) scrypt isn’t being applied correctly. The initial scrypt hash calculation only depends on the username and master password. The resulting key is combined with the site name via SHA-256 hashing then. This means that a website only needs to break the SHA-256 hashing and deduce the intermediate key — as long as the username doesn’t change this key can be used to generate passwords for other websites. This makes breaking scrypt unnecessary, security rating is still “medium” however because the intermediate key shouldn’t be as guessable as the master password itself.
3 Security rating for UniquePasswordBuilder downgraded because of low default cost factor which it mistakenly labels as “rounds.” Users can select cost factor 16384 manually which is very recommendable.
4 hash0 actually went as far as paying for a security audit. Most of the conclusions just reinforced what I already came up with by myself, others were new (e.g. the pointer to window.crypto.getRandomValues() which I didn’t know before).
5 BPasswd2 allows changing the number of iterations; anything up to 2^100 goes (the Sun will die sooner than this calculation completes). However, the default is merely 2^6 = 64 iterations, which is weak protection, and the extension neither indicates that changing the default is required nor gives useful hints towards choosing a better value.
6 Security rating for SecPassGen downgraded because the master password is stored in Firefox preferences as clear text.
Additional threats: Shoulder surfing & Co.
Websites aren’t the only threat, however; one classic is somebody looking over your shoulder and noting your password. Easy Passwords addresses this by never showing your passwords: it either fills them in automatically or copies them to the clipboard so that you can paste them into the password field yourself. In both scenarios the password never becomes visible.
And what if you leave your computer unattended? Easy Passwords remembers your master password once it has been entered; this is an important usability feature. The security concerns are addressed by “forgetting” the master password again after a given time, 10 minutes by default. And, of course, the master password is never saved to disk.
Usability vs. security: Validating master password
There is one more usability feature in Easy Passwords with the potential to compromise security. When you mistype your master password, Easy Passwords will notify you about it. That’s important because otherwise wrong passwords will be generated and you won’t know why. But how does one validate the master password without storing it?
My initial idea was storing a SHA hash of the master password. Then I realized that it opens the primary threat scenario again: somebody who can get their hands on this SHA hash (e.g. by walking past your computer when it is unattended) can use it to guess your master password. Only store a few characters of the SHA hash? Better but it will still allow an attacker who has both this SHA hash and a generated password to throw away a large number of guesses without having to spend time on calculating the expensive PBKDF2 hash. Wait, why treat this hash differently from other passwords at all?
And that’s the solution I went with. When the master password is set initially, it is used to generate a new password with a random salt, using the usual PBKDF2 algorithm. Then this salt and the first two characters of the resulting password are stored. The two characters are sufficient to recognize typos in most cases. They are not sufficient to guess the master password, however. And they won’t even provide a shortcut when guessing based on a known generated password — checking the master password hash is just as expensive as checking the generated password itself.
Encrypting legacy passwords
One requirement for Easy Passwords was dealing with “legacy passwords,” meaning existing passwords that cannot be changed for some reason. Instead of being generated, these passwords have to be stored securely. Luckily, there is a very straightforward solution: the PBKDF2 algorithm can be used to derive an encryption key. The password is then encrypted with AES-256.
My understanding is that AES-encrypted data currently cannot be decrypted without knowing the encryption key. And the encryption key is derived using the same algorithm as Easy Passwords uses for generating passwords, so the security of stored passwords is identical to that of generated ones. The only drawback of such legacy passwords currently seems to be a more complicated backup approach; moving the password from one device to another is also no longer trivial.
Phishing & Co.
Password generators will generally protect you nicely against phishing: a phishing website can look exactly like the original, a password generator will still produce a different password for it. But what about malicious scripts injected into a legitimate site? These will still be able to steal your password. On the bright side, they will only compromise your password for a single website.
Question is, how do malicious scripts get to run there in the first place? One vector is XSS vulnerabilities; not much can be done about those. But there are also plenty of websites showing password fields on pages that are transmitted unencrypted (plain HTTP, not HTTPS). These can then be manipulated by an attacker who is on the same network as you. The idea is that Easy Passwords could warn about such cases in the future. It should be possible to disable this warning for websites that absolutely don’t support HTTPS, but for others it will hopefully be helpful. Oh, and did I recommend using the Enforce Encryption extension already?
Finally, there is the worst-case scenario: your computer could be infected with a password sniffer. This is really bad because it could intercept your master password. Then again, it could also intercept all the individual passwords as you log into the respective websites; it would merely take a bit longer. I think that there is only one effective solution here: just don’t get infected.
Other threats?
There are probably more threats to consider that I didn’t think of. It might also be that I made a mistake in my conclusions somewhere. So feel free to post your own thoughts in the comments.