Mozilla Nederland: the Dutch Mozilla community

Planet Mozilla
Updated: 14 hours 54 min ago

Luke Wagner: asm.js on

Thu, 18/09/2014 - 22:03

I was excited to see that asm.js has been added to as “Under Consideration”. Since asm.js isn’t a JS language extension like, say, generators, what this means is that Microsoft is currently considering adding optimizations to Chakra for the asm.js subset of JS. (As explained in my previous post, explicitly recognizing asm.js allows an engine to do a lot of exciting things.)

Going forward, we are hopeful that, after consideration, Microsoft will switch asm.js to “Under Development”, and we are quite happy to collaborate with them and any other JS engine vendors on the future evolution of asm.js.

On a more general note, it’s exciting to see that there have been across-the-board improvements on asm.js workloads in the last 6 months. You can see this by loading up the Dead Trigger 2 demo in Firefox, Chrome or (beta) Safari. Furthermore, with the recent release of iOS 8, WebGL is now shipping in all modern browsers. The future of gaming and high-performance applications on the web is looking good!

Categories: Mozilla-nl planet

Kartikaya Gupta: Maker Party shout-out

Thu, 18/09/2014 - 17:48

I've blogged before about the power of web scale; about how important it is to ensure that everybody can use the web and to keep it as level of a playing field as possible. That's why I love hearing about announcements like this one: 127K Makers, 2513 Events, 86 Countries, and One Party That Just Won't Quit. Getting more people all around the world to learn about how the web works and keeping that playing field level is one of the reasons I love working at Mozilla. Even though I'm not directly involved in Maker Party, it's great to see projects like this having such a huge impact!


Henrik Skupin: Memchaser 0.6 has been released

Thu, 18/09/2014 - 08:10

The Firefox Automation team would like to announce the release of Memchaser 0.6. After nearly a year with no real feature updates, and some weeks in which Memchaser could not run in Firefox Aurora (34.0a2) at all due to a regression, we decided to release the current state of development as a public release. We are aware that we still do not fully support the default Australis theme introduced in Firefox 29.0, but that is an issue that will take some more time to finish up.

Changes in 0.6
  • Upgrade to Add-on SDK 1.17 (#201)
  • Fix test_start_stop_logging for ‘File not found’ error (#199)
  • contentURL expects a string in Panel() and Widget() constructors (#198)
  • Bring the minimizeMemory() implementation up to date (#193)
  • Use ‘residentFast’ instead of ‘resident’ for memory reporter (#192)
  • Use postMessage for widget and panel communication instead of the port object (#185)
  • Support incremental cycle collection statistics (#187)
  • Added a usage section to the README (#178)
  • Change require statements and update the SDK to 1.14 (#174)
  • Simplify .travis.yml (#172)
  • Use . (dot) to include files while invoking /bin/sh (#169)
  • Use failonerror attribute in exec tasks (#168)

For all the details about the 0.6 release, please check our issue tracker on GitHub.


Matt Brubeck: Let's build a browser engine! Part 6: Block layout

Thu, 18/09/2014 - 06:30

Welcome back to my series on building a toy HTML rendering engine.

This article will continue the layout module that we started previously. This time, we’ll add the ability to lay out block boxes. These are boxes that are stacked vertically, such as headings and paragraphs.

To keep things simple, this code implements only normal flow: no floats, no absolute positioning, and no fixed positioning.

Traversing the Layout Tree

The entry point to this code is the layout function, which takes a LayoutBox and calculates its dimensions. We’ll break this function into three cases, and implement only one of them for now:

impl LayoutBox {
    /// Lay out a box and its descendants.
    fn layout(&mut self, containing_block: Dimensions) {
        match self.box_type {
            BlockNode(_) => self.layout_block(containing_block),
            InlineNode(_) => {} // TODO
            AnonymousBlock => {} // TODO
        }
    }

    // ...
}

A block’s layout depends on the dimensions of its containing block. For block boxes in normal flow, this is just the box’s parent. For the root element, it’s the size of the browser window (or “viewport”).

You may remember from the previous article that a block’s width depends on its parent, while its height depends on its children. This means that our code needs to traverse the tree top-down while calculating widths, so it can lay out the children after their parent’s width is known, and traverse bottom-up to calculate heights, so that a parent’s height is calculated after its children’s.

fn layout_block(&mut self, containing_block: Dimensions) {
    // Child width can depend on parent width, so we need to calculate
    // this box's width before laying out its children.
    self.calculate_block_width(containing_block);

    // Determine where the box is located within its container.
    self.calculate_block_position(containing_block);

    // Recursively lay out the children of this box.
    self.layout_block_children();

    // Parent height can depend on child height, so `calculate_height`
    // must be called *after* the children are laid out.
    self.calculate_block_height();
}

This function performs a single traversal of the layout tree, doing width calculations on the way down and height calculations on the way back up. A real layout engine might perform several tree traversals, some top-down and some bottom-up.

Calculating the Width

The width calculation is the first step in the block layout function, and also the most complicated. I’ll walk through it step by step.

To start, we need to know the values of the CSS width property and all the left and right edge size properties:

fn calculate_block_width(&mut self, containing_block: Dimensions) {
    let style = self.get_style_node();

    // `width` has initial value `auto`.
    let auto = Keyword("auto".to_string());
    let mut width = style.value("width").unwrap_or(auto.clone());

    // margin, border, and padding have initial value 0.
    let zero = Length(0.0, Px);

    let mut margin_left = style.lookup("margin-left", "margin", &zero);
    let mut margin_right = style.lookup("margin-right", "margin", &zero);

    let border_left = style.lookup("border-left-width", "border-width", &zero);
    let border_right = style.lookup("border-right-width", "border-width", &zero);

    let padding_left = style.lookup("padding-left", "padding", &zero);
    let padding_right = style.lookup("padding-right", "padding", &zero);

    // ...
}

This uses a helper function called lookup, which just tries a series of values in sequence. If the first property isn’t set, it tries the second one. If that’s not set either, it returns the given default value. This provides an incomplete (but simple) implementation of shorthand properties and initial values.

Note: This is similar to the following code in, say, JavaScript or Ruby:

margin_left = style["margin-left"] || style["margin"] || zero;
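For illustration, here is a rough JavaScript sketch of such a lookup helper. The names and shapes here are hypothetical, not the engine's actual Rust API:

```javascript
// Hypothetical sketch: try each property name in turn and
// fall back to a default value when none of them is set.
function lookup(style, properties, fallback) {
  for (var i = 0; i < properties.length; i++) {
    if (style[properties[i]] !== undefined) {
      return style[properties[i]];
    }
  }
  return fallback;
}

var style = { margin: "10px" };
lookup(style, ["margin-left", "margin"], "0px"); // -> "10px"
lookup({}, ["margin-left", "margin"], "0px");    // -> "0px"
```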

Since a child can’t change its parent’s width, it needs to make sure its own width fits the parent’s. The CSS spec expresses this as a set of constraints and an algorithm for solving them. The following code implements that algorithm.

First we add up the margin, padding, border, and content widths. The to_px helper method converts lengths to their numerical values. If a property is set to 'auto', it returns 0 so it doesn’t affect the sum.

let total = [&margin_left, &margin_right, &border_left, &border_right,
             &padding_left, &padding_right, &width]
    .iter().map(|v| v.to_px()).sum();

This is the minimum horizontal space needed for the box. If this isn’t equal to the container width, we’ll need to adjust something to make it equal.

If the width or margins are set to 'auto', they can expand or contract to fit the available space. Following the spec, we first check if the box is too big. If so, we set any expandable margins to zero.

// If width is not auto and the total is wider than the container,
// treat auto margins as 0.
if width != auto && total > containing_block.width {
    if margin_left == auto {
        margin_left = Length(0.0, Px);
    }
    if margin_right == auto {
        margin_right = Length(0.0, Px);
    }
}

If the box is too large for its container, it overflows the container. If it’s too small, it will underflow, leaving extra space. We’ll calculate the underflow—the amount of extra space left in the container. (If this number is negative, it is actually an overflow.)

let underflow = containing_block.width - total;

We now follow the spec’s algorithm for eliminating any overflow or underflow by adjusting the expandable dimensions. If there are no 'auto' dimensions, we adjust the right margin. (Yes, this means the margin may be negative in the case of an overflow!)

match (width == auto, margin_left == auto, margin_right == auto) {
    // If the values are overconstrained, calculate margin_right.
    (false, false, false) => {
        margin_right = Length(margin_right.to_px() + underflow, Px);
    }

    // If exactly one size is auto, its used value follows from the equality.
    (false, false, true) => { margin_right = Length(underflow, Px); }
    (false, true, false) => { margin_left = Length(underflow, Px); }

    // If width is set to auto, any other auto values become 0.
    (true, _, _) => {
        if margin_left == auto { margin_left = Length(0.0, Px); }
        if margin_right == auto { margin_right = Length(0.0, Px); }

        if underflow >= 0.0 {
            // Expand width to fill the underflow.
            width = Length(underflow, Px);
        } else {
            // Width can't be negative. Adjust the right margin instead.
            width = Length(0.0, Px);
            margin_right = Length(margin_right.to_px() + underflow, Px);
        }
    }

    // If margin-left and margin-right are both auto, their used values are equal.
    (false, true, true) => {
        margin_left = Length(underflow / 2.0, Px);
        margin_right = Length(underflow / 2.0, Px);
    }
}

At this point, the constraints are met and any 'auto' values have been converted to lengths. The results are the used values for the horizontal box dimensions, which we will store in the layout tree. You can see the final code in the project's layout module.


Positioning

The next step is simpler. This function looks up the remaining margin/padding/border styles, and uses these along with the containing block dimensions to determine this block’s position on the page.

fn calculate_block_position(&mut self, containing_block: Dimensions) {
    let style = self.get_style_node();
    let d = &mut self.dimensions;

    // margin, border, and padding have initial value 0.
    let zero = Length(0.0, Px);

    // If margin-top or margin-bottom is `auto`, the used value is zero. = style.lookup("margin-top", "margin", &zero).to_px();
    d.margin.bottom = style.lookup("margin-bottom", "margin", &zero).to_px(); = style.lookup("border-top-width", "border-width", &zero).to_px();
    d.border.bottom = style.lookup("border-bottom-width", "border-width", &zero).to_px(); = style.lookup("padding-top", "padding", &zero).to_px();
    d.padding.bottom = style.lookup("padding-bottom", "padding", &zero).to_px();

    d.x = containing_block.x + d.margin.left + d.border.left + d.padding.left;

    // Position the box below all the previous boxes in the container.
    d.y = containing_block.y + containing_block.height +
 + +;
}

Take a close look at that last statement, which sets the y position. This is what gives block layout its distinctive vertical stacking behavior. For this to work, we’ll need to make sure the parent’s height is updated after laying out each child.


Children

Here’s the code that recursively lays out the box’s contents. As it loops through the child boxes, it keeps track of the total content height. This is used by the positioning code (above) to find the vertical position of the next child.

fn layout_block_children(&mut self) {
    let d = &mut self.dimensions;
    for child in self.children.mut_iter() {
        child.layout(*d);
        // Track the height so each child is laid out below the previous content.
        d.height = d.height + child.dimensions.margin_box_height();
    }
}

The total vertical space taken up by each child is the height of its margin box, which we calculate by adding up all the vertical dimensions.

impl Dimensions {
    /// Total height of a box including its margins, border, and padding.
    fn margin_box_height(&self) -> f32 {
        self.height + + self.padding.bottom
                    + + self.border.bottom
                    + + self.margin.bottom
    }
}

For simplicity, this does not implement margin collapsing. A real layout engine would allow the bottom margin of one box to overlap the top margin of the next box, rather than placing each margin box completely below the previous one.

The ‘height’ Property

By default, the box’s height is equal to the height of its contents. But if the 'height' property is set to an explicit length, we’ll use that instead:

fn calculate_block_height(&mut self) {
    // If the height is set to an explicit length, use that exact length.
    match self.get_style_node().value("height") {
        Some(Length(h, Px)) => { self.dimensions.height = h; }
        _ => {}
    }
}

And that concludes the block layout algorithm. You can now call layout() on a styled HTML document, and it will spit out a bunch of rectangles with widths, heights, margins, etc. Cool, right?


Some extra ideas for the ambitious implementer:

  1. Collapsing vertical margins.

  2. Relative positioning.

  3. Parallelize the layout process, and measure the effect on performance.

If you try the parallelization project, you may want to separate the width calculation and the height calculation into two distinct passes. The top-down traversal for width is easy to parallelize just by spawning a separate task for each child. The height calculation is a little trickier, since you need to go back and adjust the y position of each child after its siblings are laid out.

To Be Continued…

Thank you to everyone who’s followed along this far!

These articles are taking longer and longer to write, as I journey further into unfamiliar areas of layout and rendering. There will be a longer hiatus before the next part as I experiment with font and graphics code, but I’ll resume the series as soon as I can.


James Long: Transducers.js: A JavaScript Library for Transformation of Data

Thu, 18/09/2014 - 02:00

If you didn't grab a few cups of coffee for my last post, you're going to want to for this one. While I was writing my last post about js-csp, a port of Clojure's core.async, the Clojure team announced transducers, which solve a key problem in the transformation of data. The technique works particularly well with channels (exactly what js-csp uses), so I dug into it.

What I discovered is mind-blowing. So I also ported it to JavaScript, and today I'm announcing transducers.js, a library for building transformations of data and applying them to any data type you can imagine.

Whoa, what did I just say? Let's take a step back for a second. If you haven't heard of transducers before, you can read about their history in Clojure's announcement. Additionally, there's an awesome post that explores these ideas in JavaScript and walks you through them from start to finish. I give a similar (but brief) walkthrough at the end of this post.

The word transduce is just a combination of transform and reduce. The reduce function is the base transformation; any other transformation (map, filter, etc.) can be expressed in terms of it:

var arr = [1, 2, 3, 4];
arr.reduce(function(result, x) {
  result.push(x + 1);
  return result;
}, []);
// -> [ 2, 3, 4, 5 ]

The function passed to reduce is a reducing function. It takes a result and an input and returns a new result. Transducers abstract this out so that you can compose transformations completely independently of the data structure. Here's the same call but with transduce:

function append(result, x) {
  result.push(x);
  return result;
}

transduce(map(x => x + 1), append, [], arr);

We created append to make it easier to work with arrays, and we're using ES6 arrow functions (you really should too; they are easy to cross-compile). The main difference is that the push call on the array has been moved out of the transformation. In JavaScript we always couple transformation with specific data structures, and we've got to stop doing that. We can reuse transformations across all data structures, even streams.

There are three main concerns in making reduce work. The first is iterating over the source data structure. The second is transforming each value. The third is building up a new result.

These are completely separate concerns, and yet most transformations in JavaScript are tightly coupled with specific data structures. Transducers decouple them, so you can apply all the available transformations to any data structure.


We have a small set of transformations that will solve most of your needs, like map, filter, dedupe, and more. Here's an example of composing transformations:

sequence(
  compose(
    cat,
    map(x => x + 1),
    dedupe(),
    drop(3)
  ),
  [[1, 2], [3, 4], [4, 5]]
)
// -> [ 5, 6 ]

The compose function combines transformations, and sequence just creates a new collection of the same type and runs the transformations. Note that nothing within the transformations assumes anything about the data structure it comes from or where it's going.

Most of the transformations that transducers.js provides can also simply take a collection, in which case they immediately run the transformation over the collection and return a new collection of the same type. This lets you do simple transformations the familiar way:

map(x => x + 1, [1, 2, 3, 4]);
filter(x => x % 2 === 0, [1, 2, 3, 4]);

These functions are highly optimized for built-in types like arrays, so the above map literally just runs a while loop and applies your function to each value.

Iterating and Building

These transformations aren't useful unless you can actually apply them. We've figured out the transform concern, but what about iterating and building?

First let's take a look at the available functions for applying transducers:

  • sequence(xform, coll) - get a collection of the same type and fill it with the results of applying xform over each item in coll
  • transduce(xform, f, init, coll) - reduce a collection starting with the initial value init, applying xform to each value and running the reducing function f
  • into(to, xform, from) - apply xform to each value in the collection from and append it to the collection to

Each of these makes a different level of assumptions. transduce is the lowest-level: it iterates over coll but lets you build up the result yourself. into assumes the result is a collection and automatically appends to it. Finally, sequence assumes you want a collection of the same type, so it creates one and fills it with the results of the transformation.
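To make those relationships concrete, here is a rough sketch (for plain arrays only, using a minimal map transducer like the one derived at the end of this post) of how into and sequence can be expressed in terms of transduce. This is a sketch of the idea, not the library's actual implementation:

```javascript
// Minimal `map` transducer and `append` helper (sketch only).
function map(f) {
  return function(combine) {
    return function(result, x) { return combine(result, f(x)); };
  };
}
function append(result, x) { result.push(x); return result; }

// transduce: iterate `coll`, apply `xform`, build with `f` starting from `init`.
function transduce(xform, f, init, coll) {
  return coll.reduce(xform(f), init);
}

// into: assume the target is a collection we can append to.
function into(to, xform, from) {
  return transduce(xform, append, to, from);
}

// sequence: assume we want a collection of the same type (an array here).
function sequence(xform, coll) {
  return into([], xform, coll);
}

sequence(map(x => x + 1), [1, 2, 3]); // -> [ 2, 3, 4 ]
```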

Ideally our library wouldn't care about the details of iteration or building either; otherwise it would defeat the point of generic transformations. Luckily ES6 has an iteration protocol, so we can use that for iteration.

But what about building? Unfortunately there is no protocol for that, so we need to create our own. transducers.js looks for @@append and @@empty methods on a collection for adding to it and creating new collections. (Of course, it works out of the box for native arrays and objects).

Let's drive this point home with an example. Say you wanted to use the immutable-js library. It already supports iteration, so you can automatically do this:

into([],
     compose(
       map(x => x * 2),
       filter(x => x > 5)
     ),
     Immutable.Vector(1, 2, 3, 4));
// -> [ 6, 8 ]

We really want to use immutable vectors all the way through, so let's augment the vector type to support "building":

Immutable.Vector.prototype['@@append'] = function(x) {
  return this.push(x);
};

Immutable.Vector.prototype['@@empty'] = function() {
  return Immutable.Vector();
};

Now we can just use sequence, and we get an immutable vector back:

sequence(compose(
  map(x => x * 2),
  filter(x => x > 5)
), Immutable.Vector(1, 2, 3, 4));
// -> Immutable.Vector(6, 8)

This is experimental, so I would wait a little while before using this in production, but so far this gives a surprising amount of power for a 500-line JavaScript library.

Implications

Works with Everything (including Streams and Channels)!

Let's play around with all the kinds of data structures we can use now. A type must at least be iterable to use with into or transduce; if it is also buildable, it can be used with sequence or as the target collection of into.

var xform = compose(map(x => x * 2),
                    filter(x => x > 5));

// arrays (iterable & buildable)
sequence(xform, [1, 2, 3, 4]);
// -> [ 6, 8 ]

// objects (iterable & buildable)
into([], compose(map(kv => kv[1]), xform),
     { x: 1, y: 2, z: 3, w: 4 });
// -> [ 6, 8 ]

sequence(map(kv => [kv[0], kv[1] + 1]),
         { x: 1, y: 2, z: 3, w: 4 });
// -> { x: 2, y: 3, z: 4, w: 5 }

// generators (iterable)
function *data() {
  yield 1;
  yield 2;
  yield 3;
  yield 4;
}
into([], xform, data());
// -> [ 6, 8 ]

// Sets and Maps (iterable)
into([], xform, new Set([1, 2, 3, 3]));
// -> [ 6 ]

into({}, map(kv => [kv[0], kv[1] * 2]),
     new Map([['x', 1], ['y', 2]]));
// -> { x: 2, y: 4 }

// or make them buildable
// (Maps have no `add`; append a [key, value] pair via `set`)
Map.prototype['@@append'] = function(kv) { return this.set(kv[0], kv[1]); };
Map.prototype['@@empty'] = function() { return new Map(); };
Set.prototype['@@append'] = Set.prototype.add;
Set.prototype['@@empty'] = function() { return new Set(); };

sequence(xform, new Set([1, 2, 3, 2]));
sequence(xform, new Map([['x', 1], ['y', 2]]));

// node lists (iterable)
into([], map(x => x.className),
     document.querySelectorAll('div'));

// custom types (iterable & buildable)
into([], xform, Immutable.Vector(1, 2, 3, 4));
into(MyCustomType(), xform, Immutable.Vector(1, 2, 3, 4));

// if @@append and @@empty are implemented:
sequence(xform, Immutable.Vector(1, 2, 3, 4));

// channels
var ch = chan(1, xform);

go(function*() {
  yield put(ch, 1);
  yield put(ch, 2);
  yield put(ch, 3);
  yield put(ch, 4);
});

go(function*() {
  while(!ch.closed) {
    console.log(yield take(ch));
  }
});
// output: 6 8

Now that we've decoupled the data that comes in, how it's transformed, and what comes out, we have an insane amount of power, and with a pretty simple API as well.

Did you notice that last example with channels? That's right: a js-csp channel, which I introduced in my last post, can now take a transducer to apply over each item that passes through the channel. This easily lets us write Rx-style (reactive) code by simply reusing all the same transformations.

A channel is basically just a stream. You can reuse all of your familiar transformations on streams. That's huge!

This is possible because transducers work differently: instead of applying each transformation to a collection one at a time (and creating multiple intermediate collections), they take each value separately and send it through the whole transformation pipeline. That leads us to the next point, in which there are...

No Intermediate Allocations!

Not only do we have a super generic way of transforming data, we also get good performance on large arrays, because transducers create no intermediate collections. If you apply several transformations the usual way, each one is performed in order, creating a new collection each time.

Transducers, however, take one item off the collection at a time and fire it through the whole transformation pipeline. So it doesn't need any intermediate collections; each value runs through the pipeline separately.

Think of it as favoring a computational burden over a memory burden. Since each value runs through the pipeline, there are several function calls per item but no allocations, instead of one function call per item but an allocation per transformation. For small arrays the difference is small, but for large arrays the computational burden easily wins out over the memory burden.
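To make that trade-off concrete, here is a plain-JavaScript sketch (no library involved) of the two strategies:

```javascript
var data = [1, 2, 3, 4, 5, 6];

// Chained style: each step allocates a new intermediate array.
var chained = data
  .map(function(x) { return x * 2; })      // allocates one array
  .filter(function(x) { return x > 5; });  // allocates another

// Fused style: one pass, no intermediate arrays. This is the shape
// of the loop that a transducer pipeline effectively runs.
var fused = [];
for (var i = 0; i < data.length; i++) {
  var doubled = data[i] * 2;
  if (doubled > 5) {
    fused.push(doubled);
  }
}

// Both produce [ 6, 8, 10, 12 ].
```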

To be frank, early benchmarks show that this doesn't win anything in V8 until you reach around 100,000 items (after that, it really wins out), so it only matters for very large arrays. It's too early to post benchmarks.

How a Transducer is Born

If you are interested in walking through how transducers generalize reduce into what you see above, read the following. Feel free to skip this part though, or read this post which also does a great job of that.

The reduce function is the base transformation; any other transformation can be expressed in terms of it (map, filter, etc.), so let's start with that. Here's an example call to reduce, which is available on native JS arrays:

var arr = [1, 2, 3, 4];
arr.reduce(function(result, x) {
  return result + x;
}, 0);
// -> 10

This sums up all the numbers in arr. Pretty simple, right? Hm, let's try to implement map in terms of reduce:

function map(f, coll) {
  return coll.reduce(function(result, x) {
    result.push(f(x));
    return result;
  }, []);
}

map(function(x) { return x + 1; }, arr);
// -> [2, 3, 4, 5]

That works. But our map only works with native JS arrays. It assumes a lot of knowledge: how to reduce, how to append an item, and what kind of collection to create. Shouldn't our map only be concerned with mapping? We've got to stop coupling transformations with data; every single collection is forced to completely re-implement map, filter, take, and all the other collection operations, with varying incompatible properties!

But how is that possible? Well, let's start with something simple: the mapping function that we meant to create. It's only concerned with mapping. The key is that reduce will always be at the bottom of our transformation, but there's nothing stopping us from abstracting the function we pass to reduce:

function mapper(f) {
  return function(result, x) {
    result.push(f(x));
    return result;
  }
}

That looks better. We would use this by doing arr.reduce(mapper(function(x) { return x + 1; }), []). Note that now mapper has no idea how the reduction is actually done, or how the initial value is created. Unfortunately, it still has result.push embedded, so it still only works with arrays. Let's abstract that out:

function mapper(f) {
  return function(combine) {
    return function(result, x) {
      return combine(result, f(x));
    }
  }
}

That looks crazy, but now we have a mapper function that is literally only concerned with mapping. It calls f with x before passing the result to combine. The above function may look daunting, but it's simple to use:

function append(arr, x) {
  arr.push(x);
  return arr;
}

arr.reduce(mapper(function(x) { return x + 1; })(append), []);
// -> [ 2, 3, 4, 5 ]

We create append to make it easy to functionally append to arrays. So that's about it, now we can just make this a little easi-- hold on, doesn't combine look a little like a reducing function?

If applying append to the result of mapper creates a reducing function, can't we pass that itself into another mapper?

arr.reduce(
  mapper(function(x) { return x * 2; })(
    mapper(function(x) { return x + 1; })(append)
  ),
  []
);
// -> [ 3, 5, 7, 9 ]

Wow! So now we can compose these super generic transformation functions. For example, let's create a filterer. You wouldn't normally apply two maps right next to each other, but you would certainly map and filter!

function filterer(f) {
  return function(combine) {
    return function(result, x) {
      return f(x) ? combine(result, x) : result;
    }
  }
}

arr.reduce(
  filterer(function(x) { return x > 2; })(
    mapper(function(x) { return x * 2; })(append)
  ),
  []
);
// -> [ 6, 8 ]

Nobody wants to write code like that, though. Let's make one more function, compose, which makes it easy to compose these, that's right, transducers. You just wrote transducers without even knowing it.

// All this does is transform
// `compose(x(), y(), z())(val)` into `x(y(z(val)))`
function compose() {
  var funcs =;
  return function(r) {
    var value = r;
    for(var i = funcs.length - 1; i >= 0; i--) {
      value = funcs[i](value);
    }
    return value;
  }
}

arr.reduce(
  compose(
    filterer(function(x) { return x > 2; }),
    mapper(function(x) { return x * 2; })
  )(append),
  []
);
// -> [ 6, 8 ]

Now we can write really clean, sequential-looking transformations! Hm, there's still that awkward syntax for passing in append. How about we make our own reduce function?

function transduce(xform, f, init, coll) {
  return coll.reduce(xform(f), init);
}

transduce(
  compose(
    filterer(function(x) { return x > 2; }),
    mapper(function(x) { return x * 2; })
  ),
  append,
  [],
  arr
);
// -> [ 6, 8 ]

Voila, you have transduce. Given a transformation, a function for appending data, an initial value, and a collection, it runs the whole process and returns the final result from whatever append builds up. Each of those arguments is a distinct piece of information that shouldn't care at all about the others. You could easily apply the same transformation to any data structure you can imagine, as you saw above.

This transduce is not completely correct in that it should not care how the collection reduces itself.

Final Notes

You might think that this is sort of lazy evaluation, but that's not true. If you want lazy sequences, you will still have to explicitly build a lazy sequence type that handles those semantics. This just makes transformations first-class values, but you still always have to eagerly apply them. Lazy sequences are something I think should be added to transducers.js in the future. (edit: well, this paragraph isn't exactly true, but we'll have to explain laziness more in the future)

Some of the examples may also feel similar to ES6 comprehensions, but comprehensions don't give you the ability to control what type is built up; you can only get a generator or an array back. They also aren't composable; you would still need to solve the problem of building up transformations that can be reused.

When you correctly separate concerns in a program, it breeds super simple APIs that allow you to build up all sorts of complex programs. This is a simple 500-line JavaScript library that, in my opinion, radically changes how I interact with data, all with just a few methods.

transducers.js is still early work and it will be improved a lot. Let me know if you find any bugs (or if it blows your mind).


Paul Rouget: (video) DevTools Timeline and Firefox OS

Thu, 18/09/2014 - 02:00

Just thought I'd share this (if you're a Firefox OS hacker, you'll want to read on).

We recently landed a very basic timeline in the Firefox DevTools (enable it in the DevTools options; Firefox and B2G nightly builds are both required). It is basic, but already useful, especially for debugging Firefox OS apps. For example, today we were looking at the System App and realized that the main thread was never idle. Looking at the timeline, we realized that a restyle was triggered many, many times per second, even though nothing was happening on the screen. By tweaking the DOM with the inspector, we figured out it was coming from a CSS animation on an element that was display:none (see bug 962594).

We are working hard to build new Firefox performance tools, so expect to see better tools in the coming months. More info about this new timeline tool is available here.


David Boswell: Learning more about the Mozilla community

Thu, 18/09/2014 - 01:09

We’ve learned a lot this year as we’ve been working on enabling communities that have impact. We discovered there is a high contributor churn rate, scaling the size of the community doesn’t meet the needs of all teams and there is a need to make community building more reliable and predictable.

There are still many more questions than answers right now though, and there is much more to learn. A contributor audit done in 2011 had useful findings and recommendations, but it has now been three years since that research was completed. It is time to do more.


Research has provided other community-driven organizations with lessons that help them be more successful. For example, Lego has an active community, and research has helped them develop a set of principles that promote successful interactions and provide value for both community members and the Lego organization.

We’ll be kicking off a new research project soon and we’d love to get your help. This will involve creating a survey to send out to Mozillians and will also dive into the contributor data we’ve started collecting. This won’t answer all the questions we have, but this will give us some insight and can provide a starting point for other research projects.


Some specific asks for helping with the survey include thinking about what questions we want to ask and thinking about the audience of people we want to reach out to. For the data analysis part, please comment here or contact me if you’re interested in helping.


Jean-Marc Valin: Opus beats all other codecs. Again.

Thu, 18/09/2014 - 00:14

Three years ago Opus got rated higher than HE-AAC and Vorbis in a 64 kb/s listening test. Now, the results of the recent 96 kb/s listening test are in and Opus got the best ratings, ahead of AAC-LC and Vorbis. Also interesting, Opus at 96 kb/s sounded better than MP3 at 128 kb/s.

Full plot


Henrik Skupin: mozdownload 1.12 has been released

Thu, 18/09/2014 - 00:11

The Firefox Automation team would like to announce the release of mozdownload 1.12. Without any other release of our universal download tool in the last 7 months, a couple of nice new features and bug fixes will make this release even more useful. You can upgrade your installation easily via pip, or by downloading it from PyPI.

Changes in 1.12
  • Display selected build when downloading (#149)
  • Add support for downloading B2G desktop builds (#104)
  • Download candidate builds from candidates/ and not nightly/ (#218)
  • Add Travis CI build status and PyPI version badges to README (#220)
  • Add Python 2.6 to test matrix (#210)
  • Fix broken download of mac64 tinderbox builds (#144)
  • Allow download even if content-length header is missing (#194)
  • Convert run_tests script to Python (#168)
  • Ensure that --date option is a valid date (#196)

Henrik Skupin: Mozmill 2.0.7 and 2.0.8 have been released

Thu, 18/09/2014 - 00:04

The Firefox Automation team would like to announce the release of Mozmill 2.0.7 and Mozmill 2.0.8. Both versions had to be released in quick succession to ensure continued support for Firefox. Some recent changes in Firefox Nightly broke Mozmill, or at least made it misbehave. If you run tests with Mozmill, be sure to upgrade to the latest version. You can do this via PyPI, or simply download the already pre-configured environment.

Changes in 2.0.7
  • Bug 1066295 – testMultipleLoads.js is failing due to new HTTP Cache v2
  • Bug 1000098 – Fix testPageLoad.js test for invalid cert page
  • Bug 1065436 – Disable e10s until full support landed
  • Bug 1062773 – Disconnect errors invalidate the report
  • Bug 999393 – Expose assert and expect by default in sub modules
  • Bug 970820 – Mozmill aborts with socket.timeout when trying to send the report
Changes in 2.0.8
  • Bug 1067939 – JSBridge and Mozmill broken due to ‘let’ changes in bug 1001090

Please keep in mind that Mozmill 2.0 does not support Electrolysis (e10s) builds of Firefox yet. We are working hard to get full support for e10s added, and hope it will be done by the next version bump in mid-October.

Thanks everyone who was helping with those releases!


Sean Martell: Mozilla ID Project: Palette Explorations

Wed, 17/09/2014 - 22:58

Churning along with our explorations into the Mozilla brand, we’ve been looking at nailing down an expanded base color palette to bring more choice and a bolder feel. We’ve injected the colors used in the Firefox color palette – derived from the logo itself – and added a few previously missing hues, such as greens and pinks.


Along with the straight expansion of a single set, we’ve picked a vibrant set to complement the base. With the two sets we then chose two stepped-down variants for each. This makes 6 variants for each of the named colors within the set.

All this is still in the works and not fully locked down, but we thought we’d share the thinking around this and offer a glimpse at where we’re going. We’re also updating the names of the base colors and adding a few fun labels in the process. Again, the labels aren’t finalized yet either, but thought we’d share. More to come on color and usage scenarios later!


Byron Jones: we’re changing the default search settings for advanced search

Wed, 17/09/2014 - 17:14

at the end of this week we will change the default search settings on the advanced search page:


the resolution DUPLICATE will no longer be selected by default – only open bugs will be searched if the resolution field is left unmodified.

this change will not impact any existing saved searches or queries.


DUPLICATE has been a part of our default query for as long as i can remember, and was included to accommodate using that form to search for existing bugs.

since the addition of “possible duplicates” to the bug creation workflow, the importance of searching duplicates has lessened, and returning duplicates by default to advanced users is more of a hindrance than a help.  the data reflects this – the logs indicate that over august less than 4% of advanced queries included DUPLICATE as a resolution.


this change is being tracked in bug 1068648.

Filed under: bmo

Kim Moir: Mozilla Releng: The ice cream

Wed, 17/09/2014 - 15:43
A week or so ago, I was commenting in IRC that I was really impressed that our interns had such amazing communication and presentation skills.  One of the interns, John Zeller, said something like "The cream rises to the top", to which I replied "Releng: the ice cream of CS".  From there, the conversation went on to discuss what would be the best ice cream flavour to make that would capture the spirit of Mozilla releng.  The consensus at the end was that Irish Coffee (coffee with whisky) with cookie dough chunks was the favourite.  Because a lot of people on the team like coffee, whisky makes it better, and who doesn't like cookie dough?

I made this recipe over the weekend with some modifications.  I used the coffee recipe from the Perfect Scoop.  After it was done churning in the ice cream maker,  instead of whisky, which I didn't have on hand, I added Kahlua for more coffee flavour.  I don't really like cookie dough in ice cream but cooked chocolate chip cookies cut up with a liberal sprinkling of Kahlua are tasty.

Diced cookies sprinkled with Kahlua
Ice cream ready to put in freezer
Finished product

I have to say, it's quite delicious :-) If open source ever stops being fun, I'm going to start a dairy empire.  Not really. Now back to bugzilla...

Mozilla Release Management Team: Firefox 33 beta3 to beta4

Wed, 17/09/2014 - 13:58

  • 43 changesets
  • 114 files changed
  • 6806 insertions
  • 3156 deletions

Changes by file extension: cpp (49), h (28), js (12), cc (9), build (4), xul (2), java (2), xml (1), sh (1), dat (1)

Changes by module: security (50), gfx (12), media (11), js (9), ipc (7), mobile (3), dom (3), content (3), browser (3), image (2), caps (2), accessible (2), netwerk (1), modules (1)

List of changesets:

  • Robert O'Callahan: Bug 1063052. NS_RUNTIMEABORT if a builtin stylesheet fails to load. r=heycam, a=lmandel - c43d3d833973
  • EKR: Bug 1063730 - Require HTTPS for Screen/window sharing. r=mt,sstamm a=lmandel - a706b85f6d4d
  • Jim Mathies: Bug 1066242 - Use a 'ui' chromium message loop/pump for the Windows compositor thread so that it can process native windowing events. r=Bas a=sylvestre - e3fe616ef9a2
  • Alexander Surkov: Bug 1020039 - Fix intermittent failures in relations/test_embeds.xul. a=test-only - 5007a59d2d92
  • Nick Alexander: Bug 1041770 - Update missed reference. r=mrbkap, a=lmandel - 40044a225ae7
  • Chia-hung Tai: Bug 1057174 - [WebRTC] |DesktopDeviceInfoImpl::initializ| in use wrong argument while calling snprintf. r=rjesup, a=sledru - 645d232705b3
  • Tim Abraldes: Bug 1027906 - Set delayed token level for GMP plugin processes to USER_RESTRICTED. Whitelist certain files and registry keys that are required for EME plugins to successfully load. r=bobowen, r=jesup, r=bent, a=lmandel - 0af2575571f3
  • Jim Mathies: Bug 1066242 - Use a 'ui' chromium message loop/pump for the Windows compositor thread so that it can process native windowing events. r=Bas a=sylvestre - a128f3f1ce1f
  • Jim Mathies: Bug 1060738 - Add support for webrtc ThreadWindowsUI for use by webrtc desktop capture thread. r=jesup a=sylvestre - 2c6a2069023a
  • Jim Mathies: Bug 1060738 - Implement MessagePumpForNonMainUIThreads for Windows, a xpcom compatible subclass of chromium's MessagePumpForUI. r=tabraldes a=sylvestre - b6a5a3973477
  • Jim Mathies: Bug 1060738 - Switch to using chromium's Thread/tasks in MediaManager. On Windows, use MessagePumpForNonMainUIThreads for the background media thread. r=jesup a=sylvestre - 1355bb2a2765
  • Jim Mathies: Bug 1060738 - Add IsGUIThread asserts in various webrtc capture related methods. r=jesup a=sylvestre - 84daded3719c
  • Gervase Markham: Bug 1065977 - Uplift recent PSL changes to the release branches. a=lmandel - 5b7a15b4fee2
  • Gian-Carlo Pascutto: Bug 1063547 - Return no available devices where not supported, disable on Android. r=jesup, a=lmandel - abbbaa040046
  • Michael Wu: Bug 1063733 - Optimize DataSourceSurface allocation. r=bas, r=seth, a=sledru - 5982da7a1215
  • Dão Gottwald: Bug 1061947 - Avoid flushing layout and making it dirty repeatedly in ToolbarIconColor.inferFromText. r=gijs, a=lmandel - 2938d6cea847
  • Lucas Rocha: Bug 1041448 - Fix crash when double-tapping on empty top site spot. r=bnicholson, a=sledru - e0c49c71cc55
  • Margaret Leibovic: Bug 1063518 - Hide MLS "Learn More" link when MLS is disabled. r=liuche, a=lmandel - 275330447f6d
  • Mark Finkle: Bug 887755 - Lightweight theme preview is broken. r=margaret, a=lmandel - c21b3ccb9c19
  • Nicolas Silva: Bug 1041744 - Don't crash if tile allocation fails. r=Bas, a=sledru - 9148cd599e9f
  • Nicolas Silva: Bug 1061696 - Don't crash release builds when failing to allocate a surface in AutoRestoreClippedOut::save. r=Bas, a=sledru - ff9cef7b2f9d
  • Richard Newman: Bug 1065531 - Crash in java.lang.NoSuchMethodError: android.os.Bundle.getString at org.mozilla.gecko.preferences.GeckoPreferences.setupPreferences. r=nalexander, a=lmandel - 430b3512f177
  • David Major: Bug 1058131 - Avoid getting a crashy hook from Avast 10 Beta. r=bzbarsky, a=sledru - cd04e5bf0fec
  • Bobby Holley: Bug 1061136 - Assume both http:// and https:// for schemeless URIs in CAPS prefs. r=bz, a=sledru - e608db37bafb
  • Bobby Holley: Bug 1053725 - When one domain is whitelisted for file:// URI access, whitelist all subdomains. r=bz, a=sledru - a91c79c7e64e
  • Bobby Holley: Bug 1008481 - Switch to the root dir instead of the profile dir. a=test-only - f58da8f6f47e
  • Michael Comella: Bug 1062338 - Remove useless ic_menu_back drawable xml. r=lucasr, a=sledru - 1c636d0e8ec1
  • Brian Smith: Bug 1039064 - Use strongly-typed enum instead of NSPR-style error handling. r=keeler a=lmandel - f3115a9f645c
  • David Keeler: Bug 1040446 - mozilla::pkix: add error code for CA cert used as end-entity cert. r=briansmith a=lmandel - 15c382469fd1
  • David Keeler: Bug 1034124 - Allow overrides when a CA cert is used as an end-entity cert. r=briansmith a=lmandel - 198d06258284
  • Ryan VanderMeulen: Bug 1037618 - Skip ice_unittest on OSX. a=test-only - 0225b61c4f71
  • Jan de Mooij: Bug 1057598 - Suppress the object metadata callback in RStringSplit::recover. r=nbp, a=sledru - 62f5d35f2210
  • Ryan VanderMeulen: Bug 1057598 - s/warmup/usecount on older release branches. rs=nbp, a=bustage - 3e6571e74e01
  • Dan Gohman: Bug 1054972 - IonMonkey: Truncation for phis. r=nbp, a=sledru - 94dc71a06159
  • Dan Gohman: Bug 1054972 - IonMonkey: GVN: More misc cleanups. r=nbp, a=sledru - c0d46e44a6cb
  • Dan Gohman: Bug 1054972 - IonMonkey: GVN: Avoid setting UseRemoved flags unnecessarily. r=nbp, a=sledru - 316374007734
  • Dan Gohman: Bug 1062612 - IonMonkey: Fix cast insertion for truncation of phi operands. r=nbp, a=lmandel - c5ee54bc44f8
  • Ryan VanderMeulen: Backed out 3 changesets (Bug 1039064, Bug 1040446, Bug 1034124) for ASAN xpcshell hangs. - b8c9b76b6585
  • Chris Cooper: Bug 1066403 - Replace empty blocklist. a=blocklist-update - 06300676d4cd
  • Gabriel Luong: Bug 1061003 - Add New Rule won't work in non-english locales. r=harth, a=lmandel - bacdfedd7241
  • Brian Smith: Bug 1039064 - Use strongly-typed enum instead of NSPR-style error handling. r=keeler, a=lmandel - 1f599d357743
  • David Keeler: Bug 1040446 - mozilla::pkix: add error code for CA cert used as end-entity cert. r=briansmith, a=lmandel - 93cd4a068e9d
  • David Keeler: Bug 1034124 - Allow overrides when a CA cert is used as an end-entity cert. r=briansmith, a=lmandel - a6856f90ce36


Daniel Stenberg: Snaxx delivers

Wed, 17/09/2014 - 09:15

A pint of Guinness

Late in the year 1999 I quit my job. I handed over a signed paper where I wrote that I quit and then I started my new job first thing in the year 2000. I had a bunch of friends at the work I left and together with my closest friends (who coincidentally also switched jobs at roughly the same time) we decided we needed a way to keep in touch with friends that isn’t associated with our current employer.

The fix, the “employer independent” social thing to help us keep in touch with friends and colleagues in the industry, started on the last day of February 2000: the 29th, since it was a leap year, and that fact alone must’ve been a subject of discussion at that meetup.

Snaxx was born.

Snaxx means getting a bunch of friends to a pub somewhere in Stockholm. Preferably a pub with lots of great beers and a sensible sound situation. That means as little music as possible and certainly no TVs or anything. We keep doing them at a pace of two or three per year.

Bishops Arms logo

Yesterday we had the 31st Snaxx and just under 30 guests showed up (that might actually be a new all-time high). We had many great beers and food, and we argued over bug reporting, discussed source code formats, electric car charging, C64 nostalgia, mentioned Linux kernel debugging methods, how to transition from Erlang to javascript development, and a whole load of other similarly very important topics. The Bishops Arms just happens to be a brand of pubs here that has a really sensible view on how to run pubs suitable for our events, so yesterday we once again visited one of their places.

Thanks for a great time yesterday, friends! I’ll be setting up a date for number 32 soon. I figure it’ll be in the January 2015 time frame… If you want to get notified with an email, sign up yourself on the snaxx mailing list.

A few pictures from yesterday can be found on the Snaxx-31 G+ event page.


Raniere Silva: MathML September Meeting

Wed, 17/09/2014 - 05:00

This is a report about the Mozilla MathML September Meeting. The topics of the meeting can be found in this PAD (local copy of the PAD). This meeting happens all at and because of that we don’t have a log.

The next meeting will be on October 10th at 8pm UTC (note that October 10th is a Friday; we change the day each time to try to make the meeting suitable for as many people as possible). Please add topics in the PAD.

Read more...


Ludovic Hirlimann: GnuPG / PGP key signing party in Mozilla's San Francisco space

Wed, 17/09/2014 - 02:35

I’m organizing a PGP keysigning party in the Mozilla San Francisco office on September 26th, 2014, from 6PM to 8PM.

For security and assurance reasons I need to count how many people will attend, so I’ve set up an Eventbrite for it (please take one ticket if you are thinking about attending, and if you change your mind, cancel so more people can come).

I will use the eventbrite tool to send reminders and I will try to make a list with keys and fingerprint before the event to make things more manageable (but I don’t promise).

For those using Lanyrd, you will be able to tweet the event to get more people in.


Clint Talbert: Getting Ready for Fall

Wed, 17/09/2014 - 01:43

As leaves begin to turn in some places, the Bay area gets a nice blast of heat in September. It’s like a week of summer that forgot to happen back when we were socked in with fog in July. But, that nice last bit of warmth before our leaves start to turn (yes we have some leaves that turn) is also the harbinger of the fourth quarter at Mozilla. This year, I’m excited to start putting into place some of the changes we identified at the work week. So, to that end, three quick notes.

First, we are starting off with a Quality and Automation Community Call. This is going to be in the style of the WebMaker Community Calls, and the WebMaker folks have been graciously teaching us how they work. Our first one will be next Tuesday at 8AM Pacific/15:00 UTC. Unlike most of our other meetings where we talk in glib Mozilla shorthand, this will be a call that will focus on fully explaining a few specific projects that people can help with Right Now, regardless of their skill level. And, this will also be a forum for folks in the community to tell us about what they are doing. Giving our community a way to easily let us know what they are doing, and giving them a forum to talk to one another is not something we do enough of at Mozilla. So, I’m extremely excited to see where this goes. Additionally, we are going to do this in conjunction with the Automation and Tools team and I’m extremely psyched about starting this process off. Many thanks to Lyre Calliope who helped nudge us in this direction.

Second, we are going to define our efforts in quality around some core principles, which I can talk more about later, but I’ll briefly introduce them here. They are probably best envisioned as a set of overlapping circles in a Venn diagram:

Dependability, Delight, Security & Privacy, Performance underpinned by the web platform

There are likely only two things in there that you weren’t expecting. One is Delight–in this ultra competitive world, it’s not enough for our products to be right and correct. We have to go the extra mile and delight our users with our products. Nothing short of that is good enough.

The other one is the web platform. At Mozilla, our mission to extend, enhance and empower the web platform while ensuring it remains open underpins everything we do. As an odd anachronism, our QA systems and metrics have not historically taken into account how a given feature does or does not help to move the web platform forward. Likewise, we’ve not been very involved in looking at how we are implementing and testing our support for various web standards in any kind of systemic way. While I don’t believe that there are problems in these areas, I do believe that this is a core piece of quality at Mozilla and it is an area we should work to get more involved in and be more cognizant of.

Third, time is unfortunately a zero sum game.  The amount of work doesn’t shrink, especially when attempting to venture into new areas. So in order to make room for our new directions, we are going to experiment with stopping and/or pausing some of our endeavors. That can be scary when you’re changing how things have been done for years, but it’s what we need to do to move Mozilla forward. The world we live in has changed, and there is no going back. There is constant iteration toward our vision of being a more technically astute, more data-driven, more community empowering team that propels quality forward.


Armen Zambrano: Which builders get added to buildbot?

Tue, 16/09/2014 - 17:26
To add or remove jobs, we have to modify buildbot-configs.

Making changes can be learnt by looking at previous patches, however, there's a bit of an art to it to get it right.

I just landed a script that sets up buildbot for you inside a virtualenv; you can pass it a buildbot-config patch and determine which builders get added or removed.

You can run this by checking out braindump and running something like this:
buildbot-related/ -j path_to_patch.diff

NOTE: This script does not check that the job has all the right parameters once live (e.g. you forgot to specify the mozharness config for it).

Happy hacking!

Creative Commons License
This work by Zambrano Gasparnian, Armen is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.

Tantek Çelik: IndieWebCampUK 2014 Hack Day Demos: HTTPS, #webactions, new & improved #indieweb sites

Tue, 16/09/2014 - 14:12

One weekend ago, 18 IndieWebCampUK participants (including 2 remote) showed 25 demos in just under 75 minutes of what they designed and built that weekend in 19 different interoperable projects. Every single demo exemplified an indieweb community member scratching their own personal site itch(es), helping each other do so, and together advancing the state of the indieweb. We can all say:

I'm building Indie Web Camp.

During the demos I took realtime notes in IRC, with some help from Barnaby Walters. Archived on the IndieWebCamp wiki, here's a summary of what each of us got working.

Glenn Jones

Glenn Jones built improvements to Transmat. (IRC notes)

He built a map view that shows the venues nearest to his current location (via GeoLocation API).

He also found an open source HTML5 JS pedometer and repurposed it into Transmat so that when running on his Android as a web app, it can detect when he's walking, and only do GPS lookups while he's walking, so it saves battery.

Now he has an HTML5 JS app that can auto-checkin for him while he's walking.

Barnaby and Pelle

Barnaby Walters and Pelle Wessman built cross-site reply webactions that work purely via their websites - no browser extension needed! This is the first time this has been done. (IRC notes)

Barnaby has set up registerProtocolHandler on Taproot to register a handler for the "web+indie:" (since updated to "web+action:") protocol when he loads a particular page on his website, so that his website is registered to handle webactions via the <indie-action> tag.

Barnaby demonstrates loading the page that calls registerProtocolHandler. The browser asks to confirm that he wants to handle "web+indie" URLs.

Then Barnaby goes to Pelle's website home page where he has a list of posts that he's written, now with "Reply", "Like", and "Tip" webactions next to each post, each webaction represented and wrapped by <indie-action> tags in the markup.

Pelle's site also has a web component (open sourced on GitHub) to handle his <indie-action> tags, which creates an iframe that uses that same protocol handler via a Promise, connecting the iframe to the handler that Taproot registered.

Thus without anything installed in the browser, Barnaby can go to Pelle's site, click the "Reply" button next to a post which automatically goes to Barnaby's site's Taproot UI to post a reply!
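In outline, the registration side of this looks something like the following (a hedged sketch: the handler URL is a hypothetical stand-in, not Taproot's actual endpoint):

```javascript
// Hypothetical sketch of claiming the "web+action:" protocol for a site.
// The handler URL below is a stand-in, not Taproot's real endpoint.
const HANDLER_TEMPLATE = '';

// Build the URL the browser would open for a given action URL; the browser
// substitutes the clicked URL for %s when invoking the registered handler.
const handlerUrlFor = (actionUrl) =>
  HANDLER_TEMPLATE.replace('%s', encodeURIComponent(actionUrl));

// In a browser, register the site as the protocol handler (guarded so the
// snippet also runs outside a browser environment).
if (typeof navigator !== 'undefined' && navigator.registerProtocolHandler) {
  navigator.registerProtocolHandler('web+action', HANDLER_TEMPLATE, 'Taproot');
}

console.log(handlerUrlFor('web+action:reply'));
```

In a real deployment the handler URL must be same-origin with the registering page, which is why loading a particular page on the site is what performs the registration.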

Barnaby Walters

Barnaby Walters also built a map-view post aggregator that shows icons for people at the locations embedded in their recent posts. (IRC notes)

The map-view aggregator is at a self-standing demo URL for now, but Barnaby plans to include this view as another column type in Shrewdness, so you can have a map view of recent posts from people you're following.

Grant Richmond

Grant Richmond got a fancy new domain and set up Glenn Jones's Transmat on it - which makes it the second installation of Transmat! (IRC notes)

Grant also built a contact page that has links for various methods of communication:

All of the links are text links for now, no icons yet.

Grant has implemented a people-focused communication UI on his site!

Jeremy Keith

Jeremy Keith added https to his site, and implemented <indie-action> tag webactions. (IRC notes)

adactio https

Jeremy took his site from no https support to https Level 4. All URLs redirect to https; however, subdomains are still http.

adactio webactions

Jeremy's also implemented the new <indie-action> tag for webactions around his existing Tweet action links, both on his post permalinks, and on his posts in-stream (e.g. on his home page or when paginated).
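The markup pattern being discussed looks roughly like this (a sketch based on the demo notes; the exact attribute names were still in flux at the time and are illustrative rather than a finished spec):

```html
<!-- Sketch of an <indie-action> webaction wrapping a fallback Tweet link.
     If nothing handles the custom element, the plain link still works. -->
<indie-action do="reply" with="">
  <a href="">Reply on Twitter</a>
</indie-action>
```

The graceful-degradation property is the design point: browsers without a registered handler or extension simply render the silo link inside the tag.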

Shane Hudson

Shane Hudson went from no SSL and no comments yesterday to https level 5! He also imported the contents of all his old comments from his WordPress blog to his Craft install (the CMS he's dogfooding, contributing plugins to, selfdogfooding). (IRC notes)

He was able to get SSL setup on his site with an A rating, and forward secrecy, and is thus https level 5.

Shane also wrote a script to do the import of comments from WordPress to Craft. It's "a bit crude, dealing with XML to CSV a few times".

Nat Welch

Nat Welch (AKA icco on IRC) got his blog (his own software, written in Go) hosted on AppEngine with SSL, achieving https Level 4! (IRC notes)

AppEngine does SSL for free if you're ok with SNI.

So now Nat has an SSL Labs rating of A-, and the automatic redirect from http to https works. Thus he has also achieved https Level 4!

Right now he's using AppEngine default auth, using his Google account. Eventually he wants to use indieauth to auth into his site.

Tim Retout

Tim Retout got new software running on his site and added support to it for POSSEing to Twitter. (IRC notes)

His goal is to add all the indieweb feature support too like webmentions, microformats etc. He has to run off to catch a train.

He is also too humble, as he helped numerous people in person at the camp get on SSL, https level 4 or 5 at that. A round of applause for Tim!

Tom Morris

Tom Morris added https to his site, made it responsive, and set up mf2py as a service. (IRC notes)


Tom showed his current site with different window sizes. His CSS is now "less sucky" and he has made his site more responsive on mobile / small display etc.

mf2py as a service

Tom also got the Python microformats2 parser (mf2py) running as a service that you can submit your URLs to and get back pretty-printed JSON.

tommorris https

Tom got his main site up to https Level 4 with an A- rating, but has not yet done so for his subdomains.

During the next demo, Tom got his SSL Labs rating from A- to A with some help from Aral, and during the demo after that he took his rating up to A+ thanks to this blog post.

Kevin Beynon

Kevin Beynon got IndieAuth login to his own site working! (IRC notes)

Kevin started by showing us his site home page using a tablet. We projected it by holding it up to the Talky HD camera.

He pointed out that there is no admin link on the home page, then went to his "secret" URL at /admin/, which has an IndieAuth login screen. He entered his own URL and chose to authenticate via RelMeAuth using Twitter, which redirected there and back with the message "Log-in Successful".

Kevin went to his home page again, and showed that it now has visible links to "admin" and "log out". Next he plans to bring his post creating and editing interface into his home page front end, so that he can do inline editing and post notes from his home page.

Joschi Kuphal

Joschi Kuphal got his site's https support to SSL rating A+, fixed his webmention implementation, and implemented webactions on permalinks. (IRC notes)

jkphl https A+

Joschi noted that his site was running with SSL before but had some flaws. He worked on it and improved his site's rating from F to A+.

jkphl webmentions fixed

He also fixed some flaws with his webmention implementation thanks to feedback from Ryan Barrett online.

jkphl permalinks webactions

Third, Joschi implemented webactions on permalinks, in particular he added <indie-action> markup around his default Twitter, G+, Facebook "share" links. He then demonstrated his site working with Barnaby Walters's Web Action Hero Toolkit browser extension.

Chris Asteriou

Chris Asteriou is fairly new to the IndieWeb and started with going through IndieMark, adding h-entry and h-card markup, and a notes section to his site. (IRC notes)

digitalbliss microformats

Chris showed his site and noted that he added h-entry on his page with entries. He clicked the "Play" link at top to show this. And then he marked up the info at the bottom of his home page with h-card.

digitalbliss notes

Chris added a notes section and used the verification tools to check it and verify that he reached IndieMark Level 2.


Tantek Çelik switched his permalink webactions from <action> tags to <indie-action> tags and researched the UX of webactions on posts in a stream (e.g. a home page).

tantek indie-action

Based on the webactions discussion session on the first day with Tantek, Jeremy, and Pelle, they concluded that the <indie-action> tag was more appropriate than the <action> tag.

Tantek initially publicly proposed the <action> tag for consideration in a session on Web Actions at Open Source Bridge 2012, and later implemented it at last year's IndieWebCampUK 2013, where it was demonstrated working with Barnaby Walters's browser extension.

Changing from <action> to <indie-action> at a minimum better fits with the web component model. Jeremy Keith pointed out that an <indie-action> tag in particular would be a good example of a web component, worthy as a case-study for web components.

Tantek updated his permalink webactions to use <indie-action> tags and Barnaby updated his browser extension to support them as well.

in-stream webactions

Tantek analyzed the UI of various silos, in particular Instagram and Twitter.

Instagram has a very minimal simple webaction UI, with just "Like", "Comment", and "..." (more) buttons, the first two with both icon and text labels, which makes sense since their primary content is large (relative to the UI) images/video (visual media). Instagram's webactions are identical on photos viewed on their own screen, and when in a stream of media. Deliberately designed consistency.

Twitter, on the other hand, is horribly inconsistent between different views of tweets, and even different streams; sometimes their webactions are:

  • on the right with text labels
  • on the left with text labels
  • on the left without text labels

Their trend seems to be icon only, likely because the text label distracts from the tweet text content around it, especially in a stream of tweets that are primarily (nearly all) just text.

Tantek walked through comparisons of Twitter's different webactions button icon/text usage/placements with Aral, who came to the same conclusions from the data.

It may be ok to use both icon and text labels on note/post permalink pages, as there is more distinction between the (single) content area, and the footer of webactions.

However, the conclusion is that in-stream webactions should use just icons (clear ones at that) when among posts that are primarily, mostly, or perhaps even often just text.

Next Tantek is working on implementing icon-only webactions on his home page posts stream. He made some progress but realized it will require him to rework some storage code first.

Aral Balkan

Aral Balkan upgraded his site's https support to SSL rating A+ and https Level 5, and his how-to blog post about it! (IRC notes)

Aral already supported https on his site beforehand. On IndieWebCampUK hack day he added support for forward secrecy, which raised its SSL rating from A- to A+ and thus he achieved https Level 5!

Apparently it took him only 2 lines of code to implement that change on nginx, and he noted that it's a bit harder on Apache.

After his demo, Aral also updated his blog post about SSL setup with nginx with what he learned and how to get to SSL rating A+.
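For context, the nginx side of such a change typically comes down to a couple of directives (a hedged sketch; Aral's exact cipher list isn't given in the post, so the one below is illustrative):

```nginx
# Sketch only: prefer the server's cipher order and put ephemeral
# Diffie-Hellman (ECDHE/DHE) suites first so forward-secret ciphers win.
ssl_prefer_server_ciphers on;
ssl_ciphers "EECDH+AESGCM:EDH+AESGCM:EECDH:EDH:!MD5:!RC4:!aNULL:!eNULL";
```

Both `ssl_prefer_server_ciphers` and `ssl_ciphers` are standard nginx ssl-module directives; the cipher string is the part each site tunes.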

Rosa Fox

Rosa Fox created a UI on her site for CRUD posting of projects. (IRC notes)

Rosa wanted to make her own CMS with support for posting images and tags. She demonstrated her local dev install of her new CMS with the following new features she built at Hack Day:

  • a UI for creating a new project
  • CRUD posting interface for projects
  • using Postgres to store data
Aaron Parecki

Aaron Parecki participated remotely, added support for posting bookmarks to his site, and added bookmark posting via micropub to his Quill app! (IRC notes)

Aaron had been publishing bookmarks for a long time to a separate WordPress install, and he wanted to integrate them into his main site.

Once Aaron got the bookmark post type implemented in his publishing software p3k and deployed to his site, he did a mass import from the WordPress XML export.

That was the last thing aaronpk was using WordPress for, so he's no longer using WordPress to publish any of his own content.

Now all of Aaron's bookmarks are on his own site, all marked up with microformats. Each bookmark is an h-entry, and embedded inside is an h-cite of the bookmark itself.

This also means you can comment, bookmark, and like his bookmarks themselves!
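The h-entry-wrapping-h-cite pattern Aaron describes looks roughly like this (a sketch: the URLs and permalink path are placeholders, but the class names follow the standard microformats2 vocabulary, including the `u-bookmark-of` property):

```html
<article class="h-entry">
  <time class="dt-published" datetime="2014-09-13T12:00:00Z">13 Sep 2014</time>
  <div class="e-content">Bookmarked:
    <cite class="h-cite u-bookmark-of">
      <a class="u-url p-name" href="https://example.com/article">Example Article</a>
    </cite>
  </div>
  <a class="u-url" href="/bookmarks/1">permalink</a>
</article>
```

Because the outer h-entry is itself a post, other sites can send comments, likes, and bookmarks in reply to it, which is what makes his bookmarks first-class content.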

During later demos, Aaron also updated his Quill app with a bookmark posting interface, as well as a bookmarklet so you can quickly open the Quill UI to make a bookmark.
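Quill talks to a site over Micropub, which creates posts with a simple form-encoded POST. A hedged sketch of building such a bookmark request in Python; the endpoint URL and token are placeholders, while the field shape (`h=entry`, `bookmark-of=...`) follows the Micropub protocol:

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_bookmark_request(endpoint, token, url, name=None):
    """Build a form-encoded Micropub request that creates a bookmark post."""
    fields = {"h": "entry", "bookmark-of": url}
    if name:
        fields["name"] = name
    return Request(
        endpoint,
        data=urlencode(fields).encode(),
        headers={
            "Authorization": "Bearer " + token,
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )

# Hypothetical endpoint and token, for illustration only:
req = build_bookmark_request(
    "https://example.com/micropub",
    "ACCESS_TOKEN",
    "https://indiewebcamp.com/2014/UK",
    name="IndieWebCamp UK 2014",
)
```

A client like Quill would then send this request (e.g. via `urllib.request.urlopen`) and the receiving site would create the bookmark post.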

Kevin Marks

Kevin Marks built a feed converter that takes legacy RSS/Atom feeds and produces a modern, readable, and usable h-entry page, including such niceties as inline playable audio elements in converted podcasts. (IRC notes)

Kevin noticed that people are building h-feed readers, so he built a tool that takes legacy RSS/Atom feeds, unmunges them, and produces nice clean h-entry feeds.

The converter lives at a URL he bought ages ago, and he set it up on Google App Engine.

E.g. if you put a feed URL into it, it generates a nice readable HTML page with h-entry, which you can then subscribe to in an indie reader like Barnaby's Shrewdness.

Kevin demonstrated using unmung to convert a podcast feed into an h-feed with embedded playable HTML5 <audio> elements, providing an actual useful interface, much better than the original feed.

Kevin made the point that no one wants to parse RSS or Atom any more. Now, by parsing the microformats JSON representation instead, you can consume any existing RSS or Atom feed.

You can now subscribe to iTunes podcasts etc. in your indieweb reader!
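A toy sketch of the conversion a tool like unmung performs: read an Atom entry and emit the equivalent microformats2 h-entry markup. This is not Kevin's implementation; real feeds need more robust handling, but the Atom namespace and mf2 class names shown are standard:

```python
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

def atom_entry_to_hentry(entry):
    """Render one Atom <entry> element as an h-entry <article> string."""
    title = entry.findtext(ATOM + "title", default="")
    updated = entry.findtext(ATOM + "updated", default="")
    link = entry.find(ATOM + "link")
    url = link.get("href") if link is not None else ""
    return (
        '<article class="h-entry">'
        '<a class="u-url p-name" href="{0}">{1}</a> '
        '<time class="dt-updated" datetime="{2}">{2}</time>'
        "</article>"
    ).format(url, title, updated)

# A minimal example feed (placeholder content):
feed_xml = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>Hello indieweb</title>
    <updated>2014-09-13T10:00:00Z</updated>
    <link href="https://example.com/2014/hello"/>
  </entry>
</feed>"""

root = ET.fromstring(feed_xml)
for entry in root.findall(ATOM + "entry"):
    print(atom_entry_to_hentry(entry))
```

An h-feed reader can then parse the resulting page with any microformats2 parser instead of juggling the many dialects of RSS and Atom.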

Robin Taylor

Robin Taylor added support for https (including forward secrecy, getting an SSL "A" rating) to his site and automatic redirects from http to https, achieving https Level 5! (IRC notes)

UK Homebrew Website Clubs

As we were wrapping up, Tom Morris asked openly if anyone would be interested in coming to a Homebrew Website Club in London. Jeremy Keith similarly asked the group for interest in a Homebrew Website Club Brighton.

Both had quite a bit of interest, so we can expect to start seeing more Homebrew Website Club meetups in more locations!

See also Join Us At The Next IndieWebCamp In Cambridge

IndieWebCamp Cambridge is next month on the East Coast.

Join us. Share ideas. Come work on your personal web site. Help grow and evolve the independent web. Be the change you want to see in the world wide web.

"The people I met at @indiewebcamp are the A-Team of the Internet. Give them some tape and an oxy-acetylene torch and they'll fix the web."