Planet Mozilla’s been a little mixed up for the past few days, claiming that I was the author of posts on the Mozilla Security Blog. The good news is that this problem appears to have been fixed, thanks to Mike Hoye.
However, it’s likely that very few people saw the post I made a few days ago about the new per-class measurements in about:memory. So please take a look if you’re interested in that sort of thing. Thanks.
JackPair is a small widget that fits between your headset and your phone using the 3.5mm jack and encrypts your voice calls when you are talking to another JackPair user. It seems a really good design: no secret-sauce crypto, open hardware and software, and they need another $7,500 in the next day and a half to build it. Go and back them on Kickstarter :-)
Alex Fink from the OKCast interviewed me earlier this week about Mozilla’s work. More specifically, we discussed Webmaker and my focus on the Web Literacy Map. It serves as a useful introduction to the space as well as the importance of what we’re doing at the Mozilla Foundation.
Click here to listen (54:02)
Weeks 6 and 7!
- ~60K new Webmaker account holders coming from the snippet (including the wildly successful hackable snippet) and the new landing pages (see Adam’s recent post on the month-long process of refining that entire funnel)
- new user experience for partners and others is in the works
- starting to see some results from paid media
- Contributors: 5996 (up from 5552 two weeks ago)
- Webmaker accounts: 174.4K (up from 124K two weeks ago)
- Events: 2238 (up from 1799 two weeks ago)
- Hosts: 605 (up from 493 two weeks ago)
- Expected attendees: 108,500 (up from 76,200 two weeks ago)
- Cities: 403 (up from 362 two weeks ago)
Engagement Strategy #1: PARTNER OUTREACH
Over the past couple weeks, we started focusing on a new experience for partners—this will allow us to walk a potential partner through the process of creating an event on the platform using a “choose your own adventure” style wizard with three event types. (The etherpad where the experience is being sketched out is here.)
Though we started this work with partners in mind, we’ve realized it can potentially be used by other Webmaker users (see Earned Media section below.)
Engagement Strategy #2: ACTIVE MOZILLIANS
Helping the FSA community managers track FSA involvement in Maker Party is in the works: https://bugzilla.mozilla.org/show_bug.cgi?id=1061753
Engagement Strategy #3: OWNED MEDIA
Adam has a fantastic post summarizing learnings from our snippet funnel work so far. His main takeaway regarding process: “Prioritize time and permission for testing, with a clear shared objective, and get just enough people together who can make the work happen.”
Adam and Andrea already reported on the success of the hackable snippet, which increased our end-to-end conversion rate significantly. The hackable snippet will be replaced soon (due to the typical “snippet fatigue” that we always see), but we are now motivated to try similar experiments in the future.
The new user experience for partners described in the Partners section above may be re-purposed as the end of this funnel. In the meantime, we released a temporary page that we hope will funnel people towards an immediate contribution. The page asks people to pledge to teach someone about the web. In the first three days, nearly 3,000 people made the pledge.
Suggestion: let’s be sure to begin this work earlier in the 2015 Maker Party campaign, so that we can best take advantage of the snippet traffic during the campaign.
Engagement Strategy #4: EARNED MEDIA
The hackable snippet got a mention from Cory Doctorow: http://boingboing.net/2014/08/27/firefoxs-new-start-page-is-a.html
The Buenos Aires Media Party got quite a bit of press. Here’s some of what we know about:
PAID MEDIA (we’ve never considered this a major strategy for Maker Party)
Several weeks ago, Adam made some changes to our Google AdWords campaign that seem to have had a bit of an impact on results. After weeks of seeing minuscule click-through rates, we finally saw some movement when we shifted the ad target to the United Kingdom and India. A generic ad promoting Webmaker (not Maker Party) has generated 4,679 click-throughs and 92 new accounts so far.
This is still a minor channel for us, but it’s good to know that we can generate some interest with our AdWords grant.
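For a sense of scale, the figures above work out to roughly a 2% click-to-account conversion rate:

```python
# Conversion from the AdWords figures quoted above:
# 4,679 click-throughs producing 92 new accounts.
clicks = 4679
accounts = 92

rate = accounts / clicks
print(f"{rate:.2%}")  # -> 1.97%
```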
A collection of semi-random notes from Wikimania London, published very late.
The conference generally
- Tone: Overall tone of the conference was very positive. It is possibly just a small sample size—any one person can only talk to a small number of the few thousand at the conference—but it seemed more upbeat/positive than last year.
- Tone, 2: The one recurring negative theme was concern about community tone, from many angles, including Jimmy. I’m very curious to see how that plays out. I agree, of course, and will do my part, both at WMF and when I’m editing. But that sort of social/cultural change is very hard.
- Speaker diversity: Heard a few complaints about gender balance and other diversity issues in the speaker lineup, and saw a lot of the same (wonderful!) faces as last year. I’m wondering whether procedural changes (like maybe blind submissions, or other things from this list) might bring some new blood and improve diversity.
- “Outsiders”: The conference seemed to have better representation than last year from “outside” our core community. In particular, it was great for me to see huge swathes of the open content/open access movements represented, as well as other free software projects like Mozilla. We should be a movement that works well with others, and Wikimania can/should be a key part of that, so this was a big plus for me.
- Types of talks: It would be interesting to see what the balance was of talks (and submissions) between “us learning about the world” (e.g., me talking about CC), “us learning about ourselves” (e.g., the self-research tracks), and “the world learning about us” (e.g., aimed at outsiders). Not sure there is any particular balance we should have between the three of them, but it might be revealing to see what the current balance is.
- Less speaking, more conversing: Next year I will probably propose mostly (only?) panels and workshops, and I wonder if I can convince others to do the same. I can do a talk+slides and stream it at any time; what I can only do in person is have deeper, higher-bandwidth conversations.
- Physical space and production values: The hackathon space was amazingly fun for me, though I got the sense not everyone agreed. The production values (and the rest of the space) for the conference were very good. I’m torn on whether or not the high production values are a plus for us, honestly. They raise the bar for participation (bad); make the whole event feel somewhat… un-community-ish(?); but they also make us much more accessible to people who aren’t yet ready for the full-on, super-intense Wikimedian Experience.
- LCA: Legal/Community Affairs was pretty awesome on many fronts—our talks, our work behind the scenes, our dealing with both the expected and unexpected, etc. Deeply proud to be part of this dedicated, creative team. Also very appreciative for everyone who thanked us—it means a lot when we hear from people we’ve helped.
- Maps: Great seeing so much interest in Open Street Map. Had a really enjoyable time at their 10th birthday meetup; was too bad I had to leave early. Now have a better understanding of some of the technical issues after a chat with Kolossos and Katie. Also had just plain fun geeking out about “hard choices” like map boundaries—I find how communities make decisions about problems like that fascinating.
- Software licensing: My licensing talk with Stephen went well, but probably should have been structured as part of the hackathon rather than for more general audiences. Ultimately this will only work out if engineering (WMF and volunteer) is on board, and will work best if engineering leads. (The question asked by Mako afterwards has already led to patches, which is cool.)
- Creative Commons: My CC talk with Kat went well, and got some good questions. Ultimately the rubber will meet the road when the translations are out and we start the discussion with the full community. Also great meeting User:Multichill; looking forward to working on license templates with him and May from design.
- Metadata: The multimedia metadata+licensing work is going to be really challenging, but very interesting and ultimately very empowering for everyone who wants to work with the material on commons. Look forward to working with a large/growing number of people on this project.
- Advocacy: Advocacy panel was challenging, in a good way. A variety of good, useful suggestions; but more than anything else, I took away that we should probably talk about how we talk when subjects are hard, and consensus may be difficult to reach. Examples would include when there is a short timeline for a letter, or when topics are deeply controversial for good, honest reasons.
- Lesson (1): Learned a lesson: never schedule a meeting for the day after Wikimania. Odds of being productive are basically zero, though we did get at least some things done.
- Lesson (2): I badly overbooked myself; it hurt my ability to enjoy the conference and meet everyone I wanted to meet. Next year I’ll try to be more focused in my commitments so I can benefit more from spontaneity, and get to see some slightly less day-job-related (but enjoyable or inspirational) talks/presentations.
- Research: Love that there is so much good/interesting research going on, and do deeply think that it is important to understand it so that I can apply it to my work. Did not get to see very much of it, though :/
- Arguing with love: As tweeted about by Phoebe, one of the highlights was a vigorous discussion (violent agreement :) with Mako over dinner about the four freedoms and how they relate to just/empowering software more broadly. Also started a good, vigorous discussion with SJ about communication and product quality, but we sadly never got to finish that.
- Recharging: Just like GUADEC in my previous life, I find these exhausting but also ultimately exhilarating and recharging. Can’t wait to get to Mexico City!
- London: I really enjoy London—the mix of history and modernity is amazing. Bonus: I think the beer scene has really improved since the last time I was there.
- Movies: I hardly ever watch movies anymore, even though I love them. Knocked out 10 movies in the 22 hours in flight. On the way to London:
- The Grand Budapest Hotel (the same movie as every other one of his movies, which is enjoyable)
- Jodorowsky’s Dune (awesome if you’re into scifi)
- Anchorman (finally)
- Stranger than Fiction (enjoyed it, but Adaptation was better)
- Captain America, Winter Soldier (not bad?)
- On the way back:
- All About Eve (finally – completely compelling)
- Appleseed: Alpha (weird; the awful dialogue and wooden “faces” of computer-animated actors clashed particularly badly with the classically great dialogue and acting of All About Eve)
- Mary Poppins (having just seen London; may explain my love of magico-realism?)
- The Philadelphia Story (great cast, didn’t engage me otherwise)
- Her (very good)
The 2nd video of the series (streamed live on August 19th) involves exploring options for what a potential living logo could look like. Based on these initial rough sketches, I’m very excited to say we’ll be working in partnership with Pitch Interactive to come up with data-driven visualizations as the main build of our logo system. More on that soon!
In addition, today on the Firefox Channel we released a small overview video of what we plan to achieve and how we’re going about making the project open. Enjoy!
The add-on review process on AMO is fairly complicated, and can get very overwhelming if you need to look at it closely enough that you must understand file and add-on statuses. AMO admins, devs, and reviewers are usually the ones who have to worry about this stuff, and there aren’t good docs for it.
Since the issue popped up again today, I decided to take a few minutes to create a chart that explains the AMO review cycle from a file and add-on status perspective. If you think this chart is pretty crazy, you should keep in mind it’s a simplified view of the process. It doesn’t take into account developers deleting versions or marking their add-ons as inactive, and a few repetitive connections were left out. Still, it should give a good idea of how add-on and file statuses interact during the review process, and should help admins figure out which status means what (to add more confusion to the mix, AMO has old unused statuses, as well as others that are only used in Marketplace).
Here’s the chart without the notes:
For the real deal, check out the doc.
Is this complexity necessary? Probably. We have two review levels because it allows us to list polished add-ons as well as experimental ones, giving developers and users more flexibility and choice. This in turn makes AMO more diverse and generally a better option than self-hosting.
We’ve made the decision today to postpone the Maker Party planned for Saturday 13th September at Campus North, Newcastle-upon-Tyne.
Although there were lots of people interested in the event, the timing proved problematic for potential mentors and attendees alike. We’re going to huddle next week and think about a more suitable time – perhaps in November.
Thanks to those who got in touch about the event and offered their support.
One of the hidden features of Firefox 29 was a unicorn that bounced around the Firefox menu when it was emptied. The LA Times covered it in their list of five great features of Firefox 29.
Building on the fun, Firefox 32 (released today) will now spin the unicorn when you press the mouse down in the area that unicorn is bouncing.
The unicorn is shown when the menu’s :empty pseudo-class is true. The direction and speed of the movement is controlled via a CSS animation that moves the unicorn in the X- and Y-direction, with both moving at different speeds. On :hover, the image of the unicorn gets swapped from grayscale to colorful. Finally, :active triggers the spinning.
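A rough sketch of how that combination of pseudo-classes and animations could be wired up in CSS follows. The selector, asset names, and timings here are illustrative only, not Firefox’s actual rules:

```css
/* Show the unicorn only while the menu is empty. */
#menu-panel:empty {
  background-image: url("unicorn-grayscale.png");
  background-repeat: no-repeat;
  /* Different durations on the X and Y axes produce the bouncing path. */
  animation: bounce-x 3.5s linear infinite alternate,
             bounce-y 2.1s linear infinite alternate;
}

/* Swap to the colorful image on hover... */
#menu-panel:empty:hover {
  background-image: url("unicorn-color.png");
}

/* ...and spin while the mouse button is held down. */
#menu-panel:empty:active {
  animation: spin 1s linear infinite;
}

@keyframes bounce-x { from { background-position-x: 0; } to { background-position-x: 100%; } }
@keyframes bounce-y { from { background-position-y: 0; } to { background-position-y: 100%; } }
@keyframes spin { to { transform: rotate(360deg); } }
```

Because the two bounce durations don’t divide evenly into each other, the unicorn traces a non-repeating-looking path around the panel.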
Tagged: australis, CSS, firefox, planet-mozilla
At around 06:49 CEST on the morning of August 27, 2014, Google deployed an HTTP/2 draft-14 implementation on the front-end servers that handle logins to Google accounts (and possibly others). Those servers at least take care of all the various login stuff you do with Google, G+, Gmail, etc.
The little problem with that was just that their implementation of HTTP2 is in disagreement with all existing client implementations of that same protocol at that draft level. Someone immediately noticed this problem and filed a bug against Firefox.
The Firefox Nightly and beta versions have HTTP/2 enabled by default, so users quickly started to notice this and a range of duplicate bug reports have been filed—and they keep being filed as more users run into this problem. As far as I know, Chrome does not have this enabled by default, so far fewer Chrome users get this ugly surprise.
The Google implementation has broken cookie handling (remnants from draft-13, it looks like, judging by how they do it). As I write this, we’re on the 7th day of this brokenness. We advise bleeding-edge users of Firefox to switch off HTTP/2 support in the meantime until Google wakes up and acts.
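If memory serves, the toggle lives in about:config under the spdy prefs; the exact pref name below is from memory and may differ by Nightly version, so check your build:

```
network.http.spdy.enabled.http2draft = false
```

Flipping it back to true re-enables HTTP/2 draft support once Google fixes their end.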
You can actually switch http2 support back on once you’ve logged in and it then continues to work fine. Below you can see what a lovely (wildly misleading) error message you get if you try http2 against Google right now with Firefox:
This post is being debated on hacker news.
the following changes have been pushed to bugzilla.mozilla.org:
-  changes made at the same time as a comment are no longer grouped with the comment
-  Only retrieve database fetched values if requested
-  add bit.ly support to bmo
-  Remove reporting of the firefox release channel from the guided bug entry (hasn’t been accurate since firefox 25)
discuss these changes on mozilla.tools.bmo.
Filed under: bmo, mozilla
After all the fun reading “Meanwhile in San Francisco”, I looked to see if this duo had co-written any other books. Sure enough, they had.
“Lost Cat” tells the true story of how an urban cat owner (one of the authors) loses her cat, then has the cat casually walk back in the door weeks later, healthy and well. The book details various experiments the authors did using GPS trackers and tiny “CatCam” cameras to figure out where her cat actually went. Overlaying that data onto Google Maps surprised them both – they never knew their cats roamed so far and wide across the city. The detective work they did to track down and then meet with “Cat Stealer A” and “Cat Stealer B” made for a fun read… Just like “Meanwhile in San Francisco”, the illustrations are all paintings. Literally. My all-time favorite painting of any cat ever is on page 7.
A fun read… and a great gift to any urban cat owners you know.
We’ve removed the rsync modules mozilla-current and mozilla-releases today, after calling for comment a few months ago and hearing no objections. Those modules were previously used to deliver Firefox and other Mozilla products to end users via a network of volunteer mirrors but we now use content delivery networks (CDN). If there’s a use case we haven’t considered then please get in touch in the comments or on the bug.
What if, when giving a patch r+ on Mozilla’s bugzilla, you were presented with the following checklist:
- I have considered:
- Memory leaks
- Null checks
- Security implications
- Mozilla Code Style
You could not actually submit an r+ unless you had checked an HTML checkbox next to each item. For patches where any of this is irrelevant, just check the box(es) – you considered it.
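The gating rule itself is trivial; here is a minimal sketch in Python (names are hypothetical, this is not Bugzilla code):

```python
# Hypothetical sketch of the proposed r+ gating rule -- not Bugzilla's
# actual implementation.
CHECKLIST = (
    "Memory leaks",
    "Null checks",
    "Security implications",
    "Mozilla Code Style",
)

def can_submit_rplus(checked_items):
    """An r+ may only be submitted once every checklist item is ticked."""
    return all(item in checked_items for item in CHECKLIST)

# The reviewer must tick every box, even when an item is irrelevant
# to this particular patch ("just check the box -- you considered it").
assert not can_submit_rplus({"Memory leaks", "Null checks"})
assert can_submit_rplus(set(CHECKLIST))
```

The point is not the code, of course, but the forced moment of reflection before the submit button works.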
Checklists like this are commonly used in industries that value safety, quality, and consistency (e.g. medicine, construction, aviation). I don’t see them as often as I’d expect in software development, despite our commitments to these values.
The idea here is to get people to think about the most common and/or serious classes of errors that can be introduced with nearly all patches. Reviewers tend to focus on whatever issue a patch addresses and pay less attention to the other myriad issues any patch might introduce. Example: a patch adds a null check, the reviewer focuses on pointer validity, and misses a leak being introduced.
Catching mistakes in code review is much, much more efficient than dealing with them after they make it into our code base. Once they’re in, fixing them requires a report, a regression range, debugging, a patch, another patch review, and another opportunity for further regressions. If a checklist like this spurred people to do some extra thinking and eliminated even one in twenty (5%) of preventable regressions in code review, we’d become a significantly more efficient organization.
For this to work, the checklist must be kept short. In fact, there is an art to creating effective checklists, involving more than just brevity, but I won’t get into anything else here. My list here has only four items. Are there items you would add or remove?
General thoughts on this or variations as a way to reduce regressions?
More than 20 years ago I first experienced virtual reality in one of those large-scale 3D rigs that traveled the country, setting up shop in the local multiplex cinema and charging you a small fortune to step into a 4-by-4-foot contraption, strap on a pair of 3D goggles, grab a plastic gun, and hunt down some aliens in an immersive 3D environment.
It’s funny – as unimpressive as the graphics were, and as much as the delay between movement and visual update was puke-inducing – I still have vivid memories of the game and the incredible experience of literally stepping into a new world.
Fast forward to last year: Oculus revived the whole Virtual Reality (VR) scene with their Rift headset – cobbled together with cheap off-the-shelf components and some clever hardware and software hacking. The first time I tried the Rift I was hooked. It was the exact same crazy experience I had some 20 years ago – I found myself in a new world, just this time with much better graphics, none of the nasty visual delays (which makes most people motion sick) and all delivered in a much more palatable device. And again, I can’t get that initial experience out of my head (in my case a rather boring walking experience of a Tuscan villa).
Since that experience, I joined Singularity University where we have a Rift in our Innovation Lab. Over the course of the last 8 weeks I must have demoed the Rift to at least 30 people – and they all react in the exact same way:
People giggle, laugh, scream, start moving with the motion they see in the headset… They are lost within the experience in less than 30 seconds of putting the goggles on. The last time I saw people react emotionally in a similar way to a new piece of technology was when Apple introduced the iPhone.
It’s rare to see a piece of technology create such a strong emotional reaction (delight!). And that’s precisely the reason why I believe VR will be huge. A game changer. The entry vector will be gaming – with serious applications following suit (think about use cases in the construction industry, engineering, visualization of complex information) and immersive storytelling being probably the biggest game changer. In the future you will not watch a movie or the news – you will be right in it. You will shop in these environments. You will not Skype but literally be with the other person.
And by being instead of just watching we will be able to tap much deeper into human empathy than ever before. To get a glimpse of this future, check out these panoramic pictures of the destruction in Gaza.
With prices for VR technology rapidly approaching zero (the developer version of Oculus Rift’s DK2 headset is a mere $350 and Google already introduced a cardboard (!) kit which turns your Android phone into a VR headset) and software development tools becoming much more accessible, we are rapidly approaching the point where the tools of production become so accessible that we will see an incredible variety of content being produced. And as VR is not bound to a specific hardware platform, I believe we will see a market more akin to the Internet than the closed ecosystems of traditional game consoles or mobile phone app stores.
The future of virtual reality is nigh. And it’s looking damn real.
Kingdom Code is a new initiative to gather together Christians who program, to direct their efforts towards hastening the eventual total triumph of God’s kingdom on earth. There’s a preparatory meet-up on Monday 15th September (tickets) and then a full get-together on Monday 13th October. Check out the website and sign up if you are interested.
(There’s also Code for the Kingdom in various cities in the US and India, if you live nearer those places than here.)