
Michael Kaply: Firefox ESR Only Changes

Mozilla planet - di, 24/03/2015 - 19:41

There are a few changes that are coming for Firefox that will be major headaches for enterprise, educational, government and other institutional deployments. These include the removal of the distribution/bundles directory as well as the requirement for all add-ons to be signed by Mozilla.

Given that these two changes are not needed for enterprise, there has been some discussion of not putting these changes into the Firefox ESR.

So I'm curious: besides these two changes, what other things do you think should be different between regular Firefox and the Firefox ESR? I'm not talking about creating new features for the ESR, I'm only talking about enabling and/or disabling features.

Put your suggestions in the comments. I'll put mine there as well.

Categorieën: Mozilla-nl planet

Daniel Pocock: The easiest way to run your own OpenID provider?

Mozilla planet - di, 24/03/2015 - 17:57

A few years ago, I was looking for a quick and easy way to run OpenID on a small web server.

A range of solutions were available but some appeared to be slightly more demanding than what I would like. For example, one solution required a servlet container such as Tomcat and another one required some manual configuration of Python with Apache.

I came across the SimpleID project. As the name implies, it is simple. It is written in PHP and works with the Apache/PHP environment on just about any Linux web server. It allows you to write your own plugin for a user/password database or just use flat files to get up and running quickly with no database at all.

This seemed like the level of simplicity I was hoping for so I created the Debian package of SimpleID. SimpleID is also available in Ubuntu.

Help needed

Thanks to a contribution from Jean-Michel Nirgal Vourgère, I've just whipped up a 0.8.1-14 package that should fix Apache 2.4 support in jessie. I also cleaned up a documentation bug and the control file URLs.

Nonetheless, it may be helpful to get feedback from other members of the community about the future of this package:

  • Is it considered secure enough?
  • Have other people found it relatively simple to install or was I just lucky when I tried it?
  • Are there other packages that now offer such a simple way to get OpenID for a vanilla Apache/PHP environment?
  • Would anybody else be interested in helping to maintain this package?
  • Would anybody like to see this packaged in other distributions such as Fedora?
  • Is anybody using it for any online community?
Works with HOTP one-time-passwords and LDAP servers

One reason I chose SimpleID is because of dynalogin, the two-factor authentication framework. I wanted a quick and easy way to use OTP with OpenID so I created the SimpleID plugin for dynalogin, also available as a package.

I also created the LDAP backend for SimpleID, which is available as a package too.

Works with Drupal

I tested SimpleID for logging in to a Drupal account with Drupal's OpenID support enabled, and it worked seamlessly. I've also tested it with a few public web sites that support OpenID.

Categorieën: Mozilla-nl planet

Adam Lofting: 2015 Mozilla Foundation Metrics Strategy(ish) & Roadmap(ish)

Mozilla planet - di, 24/03/2015 - 17:13

I wrote a version of this strategy in January but hadn’t published it as I was trying to remove those ‘ish’s from the title. But the ‘ish’ is actually a big part of my day-to-day work, so this version embraces the ‘ish’.

MoFo Metrics Measures of Success:

These are, ironically, more qualitative than quantitative.

  1. Every contributor (paid or volunteer) knows at any given time what number they (or we) are trying to move, where that number is right now, and how they hope to influence it.
  2. We consider metrics (i.e. measures of success) before, during and after each project.
  3. We articulate the stories behind the metrics we aim for, so their relevance isn’t lost in the numbers.
  4. A/B style testing practice has a significant impact on the performance of our ‘mass audience’ products and campaigns.
1. Every contributor (paid or volunteer) knows at any given time what number they (or we) are trying to move, where that number is right now, and how they hope to influence it.
  • “Every” is ambitious, but it sets the right tone.
  • This includes:
    • Public dashboards, like those at
    • Updates and storytelling throughout the year
    • Building feedback loops between the process, the work and the results (the impact)
2. We consider metrics (i.e. measures of success) before, during and after each piece of work.
  • This requires close integration into our organizational planning process
  • This work is underway, but it will take time (and many repetitions) before it becomes habit
3. We articulate the stories behind the metrics we aim for, so their relevance isn’t lost in the numbers.
  • The numbers should be for navigation, rather than fuel
4. A/B style testing practice has a significant impact on the performance of our ‘mass audience’ products and campaigns.
  • This is the growth hacking part of the plan
  • We’ve had some successes (e.g. Webmaker and Fundraising)
  • This needs to become a continuous process

Those are my goals.

In many cases, the ultimate measure of success is when this work is done by the team rather than by me for the team.

We’re working on Process AND Culture

Process and culture feed off of and influence each other. Processes must suit the culture being cultivated. A data-driven culture can blinker creativity – it doesn’t have to, but it can. And a culture that doesn’t care for data won’t care for processes related to data. This strategy aims to balance the needs of both.

A roadmap?

I tried to write one, but basically this strategy will respond to the roadmaps of each of the MoFo teams.

So, what does Metrics work look like in 2015?
  • Building the tools and dashboards to provide the organisational visibility we need for our KPIs
  • ‘Instrumenting’ our products so that we can accurately measure how they are being used
  • Running Optimization experiments against high profile campaigns
  • Running training and support for Google Analytics, Optimizely, and other tools
  • Running project level reporting and analysis to support iterative development
  • Consulting to the Community Development Team to plan experimental initiatives

Plus: supporting teams to implement our data practices, and of course, the unknown unknowns.


Categorieën: Mozilla-nl planet

Nigel Babu: Dino Cufflinks

Mozilla planet - di, 24/03/2015 - 17:06

Recently, in a moment of weakness, I placed an order on Etsy for custom cufflinks. I had no idea how they would turn out, so it was a huge leap of faith. I got them the other day and they look gorgeous!

They do look quite good!

For those of you wondering, I ordered them from LogiCuff. So, when can we get cufflinks on Mozilla Gear? :)

Categorieën: Mozilla-nl planet

Ben Kelly: Service Workers in Firefox Nightly

Mozilla planet - di, 24/03/2015 - 16:15

I’m pleased to announce that we now recommend normal Nightly builds for testing our implementation of Service Workers. We will not be posting any more custom builds here.

Now that bug 1110814 has landed in mozilla-central, Nightly has roughly the same functionality as the last sw-build. Just enable these preferences in about:config:

  • Set dom.caches.enabled to true.
  • Set dom.serviceWorkers.enabled to true.

Please note that on Firefox OS you must enable an additional preference as well. See bug 1125961 for details.
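If you find yourself flipping these preferences across multiple test profiles, they can also be set in the profile's user.js file, standard Firefox machinery that applies prefs on every startup. A small sketch using the two pref names listed above:

```
// <profile directory>/user.js, applied each time the profile starts
user_pref("dom.caches.enabled", true);
user_pref("dom.serviceWorkers.enabled", true);
```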

In addition, we’ve decided to move forward with enabling the Service Worker and Cache API preferences by default in non-release builds. We expect the Cache preference to be enabled in the tree today. The Service Worker preference should be enabled within the next week once bug 931249 is complete.

When Nightly merges to Aurora (Developer Edition), these preferences will also be enabled by default there. They will not, however, ride the trains to Beta or Release yet. We feel we need more time stabilizing the implementation before that can occur.

So, unfortunately, I cannot tell you exactly which Firefox release will ship with Service Workers yet. It will definitely not be Firefox 39. It's possible Service Workers will ship in Firefox 40, but it's more likely they will finally be enabled in Firefox 41.

Developer Edition 39, however, will have Cache enabled and will likely also have Service Workers enabled.

Finally, while the code is stabilizing you may see Service Worker registrations and Cache data deleted when you update the browser. If we find that the on-disk data format needs to change, we will simply be resetting the relevant storage area in your profile. Once the decision to ship is made, any future changes will properly migrate data without any loss. Again, this only affects Service Worker registrations and data stored in Cache.

As always we appreciate your help testing, reporting bugs, and implementing code.

Categorieën: Mozilla-nl planet

Google and Mozilla block bogus certificates from China -

Nieuws verzameld via Google - di, 24/03/2015 - 14:58

Google and Mozilla block bogus certificates from China
Google and Mozilla have blocked bogus certificates being distributed by a Chinese registrar that could be used to target Mac, Windows and Linux users. Google security engineer Adam Langley said in a threat advisory that the problem affects numerous ...
Google Warns Of Unauthorised Security Certificates In Latest Breach - TechWeekEurope UK
Google catches bad digital certificates from Egyptian company - InfoWorld

alle 22 nieuwsartikelen »
Categorieën: Mozilla-nl planet

Gervase Markham: How to Responsibly Publish a Misissued SSL Certificate

Mozilla planet - di, 24/03/2015 - 10:13

I woke up this morning wanting to write a blog post, then I found that someone else had already written it. Thank you, Andrew.

If you succeed in getting a certificate misissued to you, then that has the opportunity to be a great learning experience for the site, the CA, the CAB Forum, or all three. Testing security is, to my mind, generally a good thing. But publishing the private key turns it from a great learning experience into a browser emergency update situation (at least at the moment, in Firefox, although we are working to make this easier with OneCRL).

Friends don’t publish private keys for certs for friends’ domain names. Don’t be that guy. :-)

Categorieën: Mozilla-nl planet

QMO: Firefox 38 Aurora Testday Results

Mozilla planet - di, 24/03/2015 - 09:37

Hello everyone!

Last Friday, March 20th, we held the Firefox 38 Aurora Testday. We’d like to take this opportunity to thank everyone for getting involved in the proposed testing activities and in general, for your hard work in helping us make Firefox even better.

Many thanks to doublex, Aleksej, Hossain Al Ikram and kenkon for their efforts and contributions and to all our moderators. Your help is greatly appreciated!

We look forward to seeing you at the next Testday. Keep an eye on QMO for upcoming events and schedule announcements!

Categorieën: Mozilla-nl planet

Byron Jones: happy bmo push day!

Mozilla planet - di, 24/03/2015 - 07:27

the following changes have been pushed to

  • [1145502] cf_crash_signature missing in fresh BMO install
  • [1145689] Adding “Content Services” into Key Initiatives
  • [1146219] the ‘abuse’ comment tag should function like the ‘abusive’ tag and trigger automatic account disabling
  • [1141165] Enforce mandatory field on form.reps.mentorship
  • [1146434] renaming of “Distribution/Bundling” component broke one of the project kickoff form’s sub-bugs
  • [1096798] prototype modal show_bug view

discuss these changes on

Filed under: bmo, mozilla
Categorieën: Mozilla-nl planet

Chris Double: Contributing to Servo

Mozilla planet - di, 24/03/2015 - 04:00

Servo is a web browser engine written in the Rust programming language. It is being developed by Mozilla. Servo is open source and the project is developed on github.

I was looking for a small project to do some Rust programming and Servo being written in Rust seemed likely to have tasks that were small enough to do in my spare time yet be useful contributions to the project. This post outlines how I built Servo, found issues to work on, and got them merged.

Preparing Servo

The Servo README has details on the pre-requisites needed. Installing the pre-requisites and cloning the repository on Ubuntu was:

$ sudo apt-get install curl freeglut3-dev \
    libfreetype6-dev libgl1-mesa-dri libglib2.0-dev xorg-dev \
    msttcorefonts gperf g++ cmake python-virtualenv \
    libssl-dev libbz2-dev libosmesa6-dev
...
$ git clone

Building Rust

The Rust programming language has been fairly volatile in terms of language and library changes. Servo deals with this by requiring a specific git commit of the Rust compiler to build. The Servo source is periodically updated for new Rust versions. The commit id for Rust that is required to build is stored in the rust-snapshot-hash file in the Servo repository.

$ cat servo/rust-snapshot-hash
d3c49d2140fc65e8bb7d7cf25bfe74dda6ce5ecf/rustc-1.0.0-dev
$ git clone
$ cd rust
$ git checkout -b servo d3c49d2140fc65e8bb7d7cf25bfe74dda6ce5ecf
$ ./configure --prefix=/home/myuser/rust
$ make
$ make install
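Since the pinned commit id is just the first path component of that file, it can be extracted rather than copied by hand. A small sketch (the stand-in file below simply mirrors the layout shown above, so the example is self-contained):

```shell
# Extract the pinned Rust commit id (the text before the first '/').
# A stand-in rust-snapshot-hash file is created here for the demo.
printf 'd3c49d2140fc65e8bb7d7cf25bfe74dda6ce5ecf/rustc-1.0.0-dev\n' > rust-snapshot-hash
HASH=$(cut -d/ -f1 rust-snapshot-hash)
echo "$HASH"
# "$HASH" is then what you pass to: git checkout -b servo "$HASH"
```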

Note that I configure Rust to be installed in a directory off my home directory. I do this out of preference to enable managing different Rust versions. The build will take a long time and once built you need to add the prefix directories to the PATH:

$ export PATH=$PATH:/home/myuser/rust/bin
$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/myuser/rust/lib

Building Servo

There is a configuration file used by the Servo build system to store information on what Rust compiler to use, whether to use a system wide Cargo (Rust package manager) install and various paths. This file, .servobuild, should exist in the root of the Servo source that was cloned. There is a sample file that can be used as a template. The values I used were:

[tools]
system-rust = true
system-cargo = false

[build]
android = false
debug-mozjs = false

Servo uses the mach command line interface that is used to build Firefox. Once the .servobuild is created then Servo can be built with:

$ ./mach build

Servo can be run with:

$ ./mach run

To run the test suite:

$ ./mach test

Finding something to work on

The github issue list has three useful labels for finding work. They are:

For my first task I searched for E-easy issues that were not currently assigned (using the C-assigned label). I commented in the issue asking if I could work on it and it was then assigned to me by a Servo maintainer.

Submitting the Fix

Fixing the issue involved:

  • Forking the Servo repository on github.
  • Cloning my fork locally and making the changes required to the source in a branch created for the issue I was working on.
  • Committing the changes locally and pushing them to my fork on github.
  • Raising a pull request for my branch.

Raising the pull request runs a couple of automated actions on the Servo repository. The first is an automated response thanking you for the changes, followed by a link to the external Critic review system.


The Servo project uses the Critic review tool. This will contain data from your pull request and any reviews made by Servo reviewers.

To address reviews I made the required changes and committed them to my local branch as separate commits using the fixup flag to git commit. This associates each new commit with the original commit that contained the change, which allows easier squashing later.

$ git commit --fixup=<commit id of original commit>

The changes are then pushed to the github fork and the previously made pull request is automatically updated. The Critic review tool also automatically picks up the change and will associate the fix with the relevant lines in the review.

With some back and forth the changes get approved and a request might be made to squash the commits. If fixup was used to record the review changes then they will be squashed into the correct commits when you rebase:

$ git fetch origin $ git rebase --autosquash origin/master

Force pushing this to the fork will result in the pull request being updated. When the reviewer marks this as r+ the merge to master will start automatically, along with a build and test runs. If test failures happen these get added to the pull request and the review process starts again. If tests pass and it merges then it will be closed and the task is done.
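The fixup/autosquash flow can be tried end to end in a throwaway repository; everything below (paths, file names, commit messages) is hypothetical, and in a real Servo fork the final command would be the git rebase --autosquash origin/master shown above:

```shell
#!/bin/sh
set -e
# Throwaway repo demonstrating git commit --fixup and rebase --autosquash.
rm -rf /tmp/fixup-demo
git init -q /tmp/fixup-demo && cd /tmp/fixup-demo
git config user.email demo@example.com
git config user.name Demo
echo one > feature.txt && git add feature.txt && git commit -qm "add feature"
ORIG=$(git rev-parse HEAD)
echo base > other.txt && git add other.txt && git commit -qm "unrelated work"
# Review feedback arrives: record the correction as a fixup of the first commit.
echo two > feature.txt
git commit -qa --fixup="$ORIG"
# Non-interactive autosquash; GIT_SEQUENCE_EDITOR=: accepts the generated todo list.
GIT_SEQUENCE_EDITOR=: git rebase -i --autosquash --root
git log --format=%s   # the fixup! commit has been folded into "add feature"
```

After the rebase the history contains just the two original commits, with the review fix squashed into "add feature".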

A full overview of the process is available on the github wiki under Github and Critic PR handling 101.


The process overhead of committing to Servo is quite low. There are plenty of small tasks that don’t require a deep knowledge of Rust. The first task I worked on was basically a search/replace. The second was more involved, implementing the view-source protocol and text/plain handling. The latter allows the following to work in Servo:

$ ./mach run view-source:
$ ./mach run

The main issues I encountered working with Rust and Servo were:

  • Compiling Servo is quite slow. Even changing private functions in a module would result in other modules rebuilding. I assume this is due to cross module inlining.
  • I’d hoped to get away from intermittent test failures like there are in Gecko but there seems to be the occasional intermittent reftest failure.

The things I liked:

  • Very helpful Servo maintainers on IRC and in github/review comments.
  • Typechecking in Rust helped find errors early.
  • I found it easier comparing Servo code to HTML specifications and following them together than I do in Gecko.

I hope to contribute more as time permits.

Categorieën: Mozilla-nl planet

Mozilla Patches Firefox for Pwn2Own Security Flaws - eWeek

Nieuws verzameld via Google - ma, 23/03/2015 - 22:37

Mozilla Patches Firefox for Pwn2Own Security Flaws
As has been the case in prior years, Mozilla is the first vendor to patch its browser for vulnerabilities first disclosed at Hewlett-Packard's Pwn2Own browser-hacking contest. Every year, major browsers that the vendors have fully patched beforehand ...
Largest Payout Ever At Pwn2Own 2015 - iProgrammer

alle 2 nieuwsartikelen »
Categorieën: Mozilla-nl planet

Air Mozilla: Mozilla Weekly Project Meeting

Mozilla planet - ma, 23/03/2015 - 19:00

Mozilla Weekly Project Meeting: the Monday Project Meeting

Categorieën: Mozilla-nl planet

Dave Townsend: Making communicating with chrome from in-content pages easy

Mozilla planet - ma, 23/03/2015 - 16:34

As Firefox increasingly switches to support running in multiple processes we’ve been finding common problems. Where we can, we are designing nice APIs to make solving them easy. One problem is that we often want to run in-content pages like about:newtab and about:home in the child process without privileges, making them safer and less likely to bring down Firefox in the event of a crash. These pages still need to get information from and pass information to the main process though, so we have had to come up with ways to handle that. Often we use custom code in a frame script acting as a middle-man, using things like DOM events to listen for requests from the in-content page and then messaging the main process.

We recently added a new API to make this problem easier to solve. Instead of needing code in a frame script the RemotePageManager module allows special pages direct access to a message manager to communicate with the main process. This can be useful for any page running in the content area, regardless of whether it needs to be run at low privileges or in the content process since it takes care of listening for documents and hooking up the message listeners for you.

There is a low-level API available but the higher-level API is probably more useful in most cases. If your code wants to interact with a page like about:myaddon just do this from the main process:

Components.utils.import("resource://gre/modules/RemotePageManager.jsm");
let manager = new RemotePages("about:myaddon");

The manager object is now something resembling a regular process message manager. It has sendAsyncMessage and addMessageListener methods but unlike the regular e10s message managers it only communicates with about:myaddon pages. Unlike the regular message managers there is no option to send synchronous messages or pass cross-process wrapped objects.

When about:myaddon is loaded it has sendAsyncMessage and addMessageListener functions defined in its global scope for regular JavaScript to call. Anything that can be structured-cloned can be passed between the processes.

The module documentation has more in-depth examples showing message passing between the page and the main process.

The RemotePageManager module is available in nightlies now and you can see it in action with the simple change I landed to switch about:plugins to run in the content process. For the moment the APIs only support exact URL matching but it would be possible to add support for regular expressions in the future if that turns out to be useful.

Categorieën: Mozilla-nl planet

Mozilla Developer Network Fellowship Program - iProgrammer

Nieuws verzameld via Google - ma, 23/03/2015 - 16:31

Mozilla Developer Network Fellowship Program
Fellows won't be paid but Mozilla will cover travel and accommodation costs to attend an Orientation session in early June prior to the program's commencement on June 29. The number of fellows hasn't been specified but five specific projects are ...

Categorieën: Mozilla-nl planet

Mozilla Science Lab: Mozilla Science Lab Week in Review, March 16-22

Mozilla planet - ma, 23/03/2015 - 16:00

The Week in Review is our weekly roundup of what’s new in open science from the past week. If you have news or announcements you’d like passed on to the community, be sure to share on Twitter with @mozillascience and @billdoesphysics, or join our mailing list and get in touch there.

  • Andrew Nesbitt has launched, a project to help tackle the discoverability challenge in open source and open science software. By leveraging the PageRank algorithm, Nesbitt hopes to represent what is actually being used (rather than what is simply admired) to better represent the true workhorses of open source.
  • The Center for Open Science began composing a wiki on the arguments and motivations for open science as part of a “growing open source, open science” meeting that Titus Brown and Kaitlin Thaney co-organized last week. The new wiki explores ways we can work together better across open science initiatives – watch their space for developments, and get in touch there to contribute!
  • The Wikimedia Foundation has adopted an open access policy to support the free reuse of research produced with their support.
  • Stephanie Hampton et al have submitted a preprint of ‘The Tao of Open Science for Ecology‘, a paper outlining a roadmap to understanding and participating in open science. This paper got its start as a collaborative discussion at the NCEAS Codefest in 2014.
  • GitHub added PDF rendering to their services last week.
  • PLOS Biology published recommendations for the role of publishers in the dissemination of open data.
  • Jojo Scoble wrote a great blog post for Digital Science describing her experiences sharing her data openly, and why other researchers should consider it. On the common worry of whether a dataset is ‘good’ enough to publish, Scoble quoted her former supervisor:

    “You could spend years trying to collect the perfect data set when you should be publishing what you have, which is enough.”

  • The National Science Foundation in the US announced a plan to accommodate comprehensive public access to research results; in it, the “NSF will require that articles in peer-reviewed scholarly journals and papers in juried conference proceedings or transactions be deposited in a public access compliant repository and be available for download, reading and analysis within one year of publication.”
  • The Fair Access to Research and Technology (FASTR) act was reintroduced recently to the US Congress. Successor to the Federal Research Public Access Act, FASTR introduces, among other provisions, an emphasis on reuse and correct attribution.
  • PLOS Collections has showcased a collection of negative results, underscoring the importance of publishing such studies to the broader scientific community.
  • In a similar vein, submissions are open for the ERROR conference, highlighting negative results in Munich, Germany on 3-4 September.
  • UNESCO recently put its open access curriculum online; the content is targeted at librarians and researchers, and covers topics ranging from an introduction to open access, to intellectual property rights, to how to share your work in an open access model.
  • Also in partnership with UNESCO, Foster Open Science is hosting a two-day Open Science Workshop for European graduate school administrators in order to ‘construct a roadmap for making Open Science certifiable and standard training for future graduates‘.
  • Tom Baden et al recently published an article on 3D printing your own lab equipment, in order to mitigate the costs and hurdles to setting up a research program.
  • The Scholarly Kitchen recently interviewed two of the founders of Advancing Research and Communication Scholarship (ARCS), a new conference coming April 26-28 in Philadelphia ‘designed to provide a broad and collaborative forum for addressing and affecting scholarly and scientific communication.‘ (- Alice Meadows, Scholarly Kitchen)
  • Chris Parr wrote an article for Times Higher Education on Carol Goble’s work and comments on the hurdles created by ostentation in scholarly communication and questions raised by the failure to distribute both data and code.
  • The Su Lab is holding a hackathon on biomedical big data, May 7-9.
  • Finally, don’t miss our map of hacky hours and study groups – and if you know of anyone running a meetup about coding for researchers, let us know so we can add you to the map!


Categorieën: Mozilla-nl planet

Daniel Stenberg: Fixing the Func KB-460 ‘-key

Mozilla planet - ma, 23/03/2015 - 13:54

Func KB-460 keyboardI use a Func KB-460 keyboard with Nordic layout – that basically means it is a qwerty design with the Nordic keys for “åäö” on the right side as shown on the picture above. (yeah yeah Swedish has those letters fairly prominent in the language, don’t mock me now)

The most annoying part with this keyboard has been that the key repeat on the apostrophe key has been sort of broken. If you pressed it and then another key, it would immediately generate another (or more than one) apostrophe. I’ve sort of learned to work around it with some muscle memory and treating the key with care but it hasn’t been ideal.

This problem is apparently only happening on Linux, someone told me (I’ve never used it on anything else), and what do you know? Here’s how to fix it on a recent Debian machine that happens to run systemd, so your mileage will vary if you have something else:

1. Edit the file “/lib/udev/hwdb.d/60-keyboard.hwdb”. It contains keyboard mappings of scan codes to key codes for various keyboards. We will add a special line for a single scan code, and for this particular keyboard model only. The line includes the USB vendor and product IDs in uppercase; you can verify they are correct for your own keyboard with lsusb -v.

So, add something like this at the end of the file:

# func KB-460
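An entry in 60-keyboard.hwdb pairs a USB match line with a scan code mapping. A hypothetical sketch (vVVVV and pPPPP are placeholders for the uppercase hex vendor and product IDs from lsusb -v; 70031 is the apostrophe scan code this post maps to “reserved” in the non-systemd instructions further down):

```
keyboard:usb:vVVVVpPPPP*
 KEYBOARD_KEY_70031=reserved
```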

2. Now update the database:

$ udevadm hwdb --update

3. … and finally reload the tweaks:

$ udevadm trigger

4. Now you should have a better working key and life has improved!

With a slightly older Debian without systemd, here are instructions I was given; I have not tested them myself, but I include them here for the world:

1. Find the relevant input for the device by “cat /proc/bus/input/devices”

2. Make a very simple keymap: a file with only a single line like this:

$ cat /lib/udev/keymaps/func
0x70031 reserved

3. Map the key with ‘keymap’:

$ sudo /lib/udev/keymap -i /dev/input/eventX /lib/udev/keymaps/func

where X is the event number you figured out in step 1.

The related kernel issue.

Categorieën: Mozilla-nl planet

Pierros Papadeas: Multiple emails on

Mozilla planet - ma, 23/03/2015 - 12:51
tl;dr version: You can now associate multiple emails to your profile

Background
Since the start, users of were able to associate only one email per profile. This was used both as the email displayed on your profile (depending on your privacy settings) and, most importantly, as the email used to log in using Persona.

Rationale
Most of us own and use multiple emails every day: personal, business, aliases and any combination in between. Even within various Mozilla properties, people have associated different profiles with different emails (e.g. SuMo account vs. Bugzilla email). Although we need to recognize and respect the will of some people to use different emails as different (separate) online personas, we also need to find ways to make identity management and consolidation easier for people who choose to use multiple emails under the same name. Being able to associate multiple emails with one profile presents us with really interesting advantages. For one, you can log in on websites that check for your account using any email associated with your Persona account. Also, other mozillians will be able to look you up using any of your emails. Finally, from a metrics standpoint, we will be able to effectively deduplicate accounts and metrics/statistics across Mozilla's different systems.

Implementation
  • Main email is being used for communication with mozillians in
  • Alternate emails are mostly being used for identity deduplication
  • API v2 exposes alternate emails
What should I do?
  • Login to
  • Click “Edit your profile”
  • Click “Edit E-mail addresses”
There we provide all the functionality to manage your profile’s emails.
  • Add/delete alternate email address
  • Change your primary email address
  • Manage email visibility
Multiple Accounts?
We don’t expect many people to have multiple profiles in We cannot know for sure, only anecdotally. People with multiple accounts should contact us ( #commtools on IRC or open a bug here) for help merging, or they can choose to use one of them and delete the others.
What is next?

dev team is working tirelessly on new features and enhancements that would make even easier to use and more robust as a source of truth about all things mozillians. You can check out our roadmap here, follow our development and contribute on github, and join our discussions here.
Categorieën: Mozilla-nl planet

Cameron Kaiser: Pwn2Own this Power Mac (plus: IonPower's time is running out)

Mozilla planet - ma, 23/03/2015 - 06:52
All of the mighties fell at this year's Pwn2Own, including Firefox, where clever minds find gaping holes for fun and a small amount of profit (but then it also makes a great resume builder, which may be a more reliable paycheque). The holes last year were quickly patched, and after I determined we were also vulnerable we followed suit. As usual, for this year's naughtiness Mozilla already has patched versions available, including ESR 31.5.3.

However, the two holes used for this year's marvelous and somewhat alarming crack are not exploitable in TenFourFox directly: the SVG navigation fault cannot be effectively used to escalate privileges in TenFourFox's default configuration, and we don't even build the code that uses JavaScript bounds checking. The navigation fault may have other weaponizable vectors and we do want to roll that fix, but the good news is 31.6 will come out this weekend, so no urgent chemspill is necessary unless I discover some odd way of busting through it between now and then.

I lost about a week of hacking time to one of my occasional bouts of bronchitis, which is pushing IonPower's timing very close to the wire. We need two cycles for 38 to allow localizers to catch up and people to test, and of course somewhere in that timeframe we also have to finish the move from Eric Schmidt is a Poopypants Land Google Code. Over the weekend I got IonPower to pass the test suite in Baseline mode, which is very encouraging, but some of the same problems that doomed PPCBC's Ion work are starting to rear up again.

The biggest problem that recurred is an old one: Ion's allocator is not endian-safe. I get bad indices off it for stack slots and other in-memory boxed values, and all but the simplest scripts either assert deep within Ion's bowels (not our new PowerPC backend) or generate code that is verifiably wrong. Unfortunately, Mozilla doesn't really document Ion's guts anywhere, so I don't know where to start with fixing it, and all the extant Ion backends, even MIPS, are little-endian. Maybe some Mozilla JIT folks are reading and can comment? (See also the post in the JavaScript engine internals group.)

One old problem with bad bailout stack frames, however, is partially solved with IonPower. I say partially because even though the stack frame is sane now, it still crashes, but I have a few ideas about that. However, the acid test is getting Ion code to jump to Baseline, run a bit in Baseline, and then jump back to Ion to finish execution. PPCBC could never manage this without crashing. If IonPower can do no better, there is no point in continuing the porting effort.

Even if this effort runs aground again, that doesn't make IonPower useless. PPCBC may pass the test suite, but some reproducible bugs in Tenderapp indicate that it goes awry in certain extremely-tough-to-debug edge cases, and IonPower (in Baseline mode) does pass the test suite as well now. If I can get IonPower to be as fast or faster than PPCBC even if it can't execute Ion code either, we might ship it anyway as "PPCBC II" in Baseline-only mode to see if it fixes those problems -- I have higher confidence that it will, because it generates much more sane and "correct" output and doesn't rely on the hacks and fragile glue code that PPCBC does in 24 and 31. I have to make this decision sometime mid-April, though, because we're fast approaching EOL for 31.

Also, as of Firefox 38 Mozilla no longer supports gcc 4.6, the compiler which we build with. However, I'm not interested in forcing a compiler change so close to the next ESR, and it appears that we should still be able to get it working on 4.6 with some minor adjustments. That won't be the case for Fx39, if we're even going to bother with that, but fortunately there is a gcc 4.8 in MacPorts and we might even use Sevan's gcc from pkgsrc. Again, the decision to continue will be based on feasibility and how close Electrolysis is to becoming mandatory before 45ESR, which is the next jump after that. For now, TenFourFox 38 is the highest priority.

Categorieën: Mozilla-nl planet

This Week In Rust: This Week in Rust 75

Mozilla planet - ma, 23/03/2015 - 05:00

Hello and welcome to another issue of This Week in Rust! Rust is a systems language pursuing the trifecta: safety, concurrency, and speed. This is a weekly summary of its progress and community. Want something mentioned? Send me an email! Want to get involved? We love contributions.

This Week in Rust is openly developed on GitHub. If you find any errors or omissions in this week's issue, please submit a PR.

What's cooking on master?

79 pull requests were merged in the last week, and 9 RFC PRs.

Now you can follow breaking changes as they happen!

Breaking Changes

Other Changes

New Contributors
  • Johannes Oertel
  • kjpgit
  • Nicholas
  • Paul ADENOT
  • Sae-bom Kim
  • Tero Hänninen
Approved RFCs

New RFCs

Notable Links

Project Updates

Upcoming Events

If you are running a Rust event please add it to the calendar to get it mentioned here. Email Erick Tryzelaar or Brian Anderson for access.

Quote of the Week <mbrubeck> the 5 stages of loss and rust <mbrubeck> 1. type check. 2. borrow check. 3. anger. 4. acceptance. 5. rust upgrade

Thanks to jdm for the tip. Submit your quotes for next week!

Categorieën: Mozilla-nl planet

Brandon Eich ouster from Mozilla sparks push to protect workers from political ... - Washington Times

Nieuws verzameld via Google - ma, 23/03/2015 - 03:05

Brandon Eich ouster from Mozilla sparks push to protect workers from political ...
Washington Times
Because Mr. Eich resigned “voluntarily,” the law did not apply in his case despite reports of intense pressure from Mozilla's board. Mr. Eich, a Silicon Valley star who created the JavaScript programming language, stepped down as CEO of Mozilla a year ...

Google Nieuws
Categorieën: Mozilla-nl planet