Mozilla Nederland
The Dutch Mozilla community

Joel Maher: lost in data – episode 1, tackling a bunch of alerts

Mozilla planet - fr, 24/07/2015 - 01:22

Today I recorded a session of me investigating Talos alerts. It is ~35 minutes, which is sort of long, but doable in the background whilst grabbing breakfast or lunch!

I am looking forward to more sessions and answering questions that come up.


Categories: Mozilla-nl planet

Air Mozilla: Lost in Data with Joel Maher - Episode 1

Mozilla planet - to, 23/07/2015 - 23:50

Lost in Data with Joel Maher - Episode 1: Taking a stab at the incoming alerts

Categories: Mozilla-nl planet

David Burns: Microsoft ship a WebDriver implementation

Mozilla planet - to, 23/07/2015 - 22:19

Microsoft, who people still claim to be evil (but who are actually big proponents of the Open Web), have... (wait for it...) SHIPPED. AN. IMPLEMENTATION. OF. WEBDRIVER!

At GTAC in California in 2011, Simon Stewart and I discussed that the Selenium project was at a crossroads (I am pretty sure beer was involved). We could, and should, move this to the browser vendors. We had seen how in April the Chrome team had shipped their implementation and the Selenium project then deleted its implementation. I am pretty sure Daniel (Wagner-Hall) was glad to see it go.

In the time since then, we have seen Mozilla get Marionette into Firefox, and into release branches since Firefox 24, while slowly working on it (as well as on Firefox OS support). We have seen BlackBerry ship a version for the browser on their devices. We have seen mobile implementations with iOS-Driver, Selendroid and Appium.

The Spec is on track to be put forward for Recommendation by the end of the year. All the dreams that we (the Selenium development team (my BFFs)) had are slowly coming true. This ship might be slow moving, but that is mostly because some companies haven't always seen the value.
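
To give a sense of what a shipped WebDriver implementation means for test authors, here is a minimal sketch (not from the original post) of driving Edge through Selenium's Java bindings. It assumes the Selenium client libraries and Microsoft's WebDriver server are installed; the class name and URL are purely illustrative.

    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.edge.EdgeDriver;

    public class EdgeSmokeTest {
        public static void main(String[] args) {
            // EdgeDriver speaks the WebDriver protocol to Microsoft's
            // WebDriver server, much as ChromeDriver does for Chrome.
            WebDriver driver = new EdgeDriver();
            try {
                driver.get("https://www.mozilla.org/");
                System.out.println("Page title: " + driver.getTitle());
            } finally {
                driver.quit();
            }
        }
    }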

so...

.@kylealden I raise a drink to the @MSEdgeDev team (esp @thejohnjansen) for helping make this happen @jimevansmusic pic.twitter.com/eBxdpNNqPC

— David Burns (@AutomatedTester) July 23, 2015

P.S. There is an open bug on the WebKit tracker for Safari support (and it is getting some internal push, so I am hopeful!)

Categories: Mozilla-nl planet

About:Community: Ten years of evolution of MDN

Mozilla planet - to, 23/07/2015 - 20:22

This week marks the tenth anniversary of Mozilla Developer Network (MDN) as a wiki. This post offers a deep dive into where MDN came from, how it has evolved in various ways, and where it may be going.

(This post is based in large part on a round table discussion about MDN that was held during the “Hack on MDN” weekend in Berlin in April 2015, and on Florian Scholz’s history of MDN’s JavaScript documentation.)

What is MDN today?

For many web developers, MDN is the reference manual for the Web, the place they go to look up or learn about open web technologies. MDN offers much more than that. It is a resource for learning about the Web, and a place for developers to share their skills and knowledge. MDN’s strength lies in its openness, where anybody can help make the resources incrementally better, or substantially better. MDN can also encourage the growth of web technologies into new spaces beyond where they’ve been in the past.

MDN is a community of programmers, writers, and localizers. A few of these are paid staff of Mozilla, but they are a subset of the larger community of people who make small or large contributions.

One of the coolest things for those who contribute to MDN is that every time we talk to developers, they tell us how much they love MDN. It’s not, “Oh MDN is kind of cool,” or “It’s great.” The response is: “I love MDN. It’s the best resource out there.” It’s tremendously gratifying to feel that you are part of something that people really love.

Who is MDN for?

MDN serves a variety of audiences:

  • Web developers, first and foremost
  • People who want to teach themselves about web development
  • Teachers of web development skills and concepts
  • Developers in the ecosystem of Mozilla products, such as Firefox add-ons, and Firefox OS apps
  • Developers working on the Mozilla codebase

Beginnings of MDN

The site that became MDN, developer.mozilla.org or “Devmo,” started as a redirect to a developer-oriented page on the main mozilla.org site. Later that content was moved over to Devmo; it contained primarily information for developers contributing to the Mozilla codebase.

Just as the Mozilla project emerged from the remains of Netscape, so MDN as we know it began with documentation originally written at Netscape. The site known as “Netscape DevEdge” documented web technologies, such as JavaScript, and other things that were implemented in Netscape products. After Netscape was acquired by AOL, the DevEdge site eventually was shut down, and that information disappeared from the Web.

Mitchell Baker (Chair of Mozilla) and others from Mozilla worked out an arrangement with AOL to release the DevEdge content, which she announced in February 2005. At the same time, Deb Richardson was hired to migrate and curate the DevEdge content into Devmo.

Mitchell and Deb made the decision to put the content into a wiki, to enable open contributions to maintain and update the content. Previously, the DevEdge content was in a CVS source control system, and published as a static site. Using a wiki for documentation was a novel concept at the time, and it took a while for some core developers within the Mozilla project to warm to this way of working. However, others quickly embraced the idea, and many of the earliest contributors to the Devmo wiki were developers who were active elsewhere in the Mozilla project.

Deb and some volunteers spent a few months mining and migrating the still-useful content from DevEdge, working on a test server. This effort was still in progress when the content was migrated to the Devmo site, now titled “Mozilla Developer Center” or “MDC” in July 2005. We mark this as the starting point of what we now call Mozilla Developer Network.

Platform evolution

MDN has lived on three different wiki platforms in the course of its history: first MediaWiki, then MindTouch DekiWiki, and now Kuma, a Mozilla-developed platform. Looking at the technical infrastructure is interesting not just from a technical point of view, but also because technology influences social structures like community.

MediaWiki

The wiki platform used for the first iteration of MDC was MediaWiki, the open source software that underlies Wikipedia. It was the most robust and widely-used wiki at the time. The Devmo project began to discover that software designed for writing a general-purpose encyclopedia was not necessarily ideal for writing developer-oriented technical documentation. For example, it did not handle code examples well, reformatting them to be unreadable. Mozilla tried to fix such issues by creating its own fork of MediaWiki, which then ended up being quite difficult to maintain.

On the level of contributions, using MediaWiki was initially an advantage, because many technical people were already familiar with how to use it. However, the project eventually reached a plateau, where it became difficult to keep contributors coming back. This, combined with the technical quirks, led to a search for a more user-friendly platform.

DekiWiki

After an evaluation process that looked at all the wiki products available on the market (not just open source ones), DekiWiki by MindTouch was chosen. One advantage of DekiWiki was that the source format for articles was HTML, rather than wiki markup. It seemed a logical choice for a site targeting web developers to have the source format be a standard web language. This required migrating all of the content from MediaWiki markup format to HTML, which was a major migration project. The choice of DekiWiki was announced in November 2007, and the site switched to it in August 2008.

While DekiWiki was a quality product, one way that the selection process was flawed was that it did not include a major group of stakeholders: volunteers who contributed to the site. The rate of contribution nose-dived, because the platform was not embraced by the contributor community. In particular, localization communities, who translated the content into various languages other than English, were severely disrupted. They had built tools and processes around working with MediaWiki, and these tools didn’t work with DekiWiki. After a few months, many of these groups simply disbanded and decided not to contribute any more, with the result that translated documentation started to become stale, and became more and more out of date over time.

DekiWiki was also written in C#, and designed to run in a Microsoft .NET environment. This was a mismatch with Mozilla’s technical infrastructure, which is Linux-based. Trying to run DekiWiki on Mono led to a great deal of instability, with the site being down for days and weeks at times.

These issues, after a couple of years, led to looking for another solution. The best candidates on the market were still MediaWiki and DekiWiki. Now that the content was all written in HTML, migrating back to MediaWiki markup syntax was not feasible. No product seemed suited to the specific needs of an open developer documentation site, so Mozilla decided to create its own.

Kuma

The current platform for MDN, known as Kuma, is written in Python with Django. It started as a fork of Kitsune, the platform for the Mozilla support site, and was adapted to the needs of a wiki site for developers rather than end users. (Also, “kitsune” means “fox” in Japanese, and “kuma” means “bear”. Because users : foxes :: developers : bears, right?)

Like DekiWiki, Kuma uses HTML as the source format for the content. The migration effort in this case was converting the scripts and macros used on the site. DekiWiki used “DekiScript,” based on Lua, while Kuma introduced KumaScript, which is based on JavaScript, using Node.js. KumaScript is the brainchild of developer Les Orchard. As with creating content in HTML, KumaScript means that MDN is implemented using the same technologies that it documents, and that its contributors are familiar with. It was possible to migrate about 70% of the existing macros automatically, but the rest had to be manually converted.

The goal when launching the Kuma platform was to achieve feature parity with the DekiWiki implementation of MDN. The content was migrated to the new system, and changes from the production server were periodically updated on the Kuma staging server. Thus, the Kuma instance was kept in sync with the production DekiWiki server. While the months leading up to the launch of Kuma involved a great deal of migration work, the actual launch was very smooth. A routing switch was flipped, and traffic shifted to the new site seamlessly, without even disrupting login sessions.

Community evolution

From the beginning, the community for the DevMo site grew organically, starting with contributors who were already active in other parts of the Mozilla project. As in other areas of Mozilla, communication happens through a mailing list and an IRC channel. By mid-2007, contributions were typically 250 per month. As mentioned before, the migration to DekiWiki led to a dramatic drop-off in localization contribution, and total contribution declined as well.

MDN doc sprint, Paris 2010


As part of an effort to engage the community more actively, I (Janet Swisher) was hired as a staff technical writer in mid-2010. I brought experience with open source developer documentation, and in particular, experience with the “book sprints” methodology used by the FLOSS Manuals project to produce manuals for free software in five days or less. The first MDN “doc sprint” took place in October 2010, in the Mozilla Paris office. Doc sprints bring together a number of MDN contributors, either physically or virtually, for focussed, collaborative work, typically over a weekend. These were held about once a quarter for about three years. More recently, they have evolved into less frequent but broader “Hack on MDN” events, whose scope includes hacking on the platform or tools, as well as on content, to make them more attractive to developers.

Idea pitches at HackOnMDN weekend, Berlin 2015


In addition, the MDN community holds a number of regular online meetings, both for general information, and for tracking specific projects. These community activities, as well as the migration to Kuma in 2012, have led to a significant increase in contributions, now around 1000 per month.

Branding evolution

In the beginning, the DevMo site was known as “Mozilla Developer Center.” At first, it simply sported that title, with a simple skin on MediaWiki. With the move to DekiWiki, the word “Mozilla” became the Mozilla wordmark, followed by “<developer center/>”, to convey slightly more webbiness.

Wordmark for Mozilla Developer Center

In September 2010, the name of the site was changed from “Mozilla Developer Center” to “Mozilla Developer Network” or MDN. This change was met with some skepticism from the developer audience at the time, though by now they simply accept MDN as MDN. The visual design of the site changed at the same time, to a darker theme, and MDN acquired a logo, the “robot dino,” which it had never had before.

MDN robot dino logo

Along with these visual changes, features were added to the site to broaden its scope beyond just documentation. One successful feature is known as “Demo Studio”, an area where developers can upload code demos, share them, and show them off.

When MDN migrated from DekiWiki to Kuma, the visual appearance was preserved, so there was very little visual difference between the pre- and post-migration sites. After six to eight months of bug-fixing on Kuma, a project was started to change not only the visual design, but also the content structure. These changes were rolled out using feature flags, to users who chose to be beta testers. Thus, most users continued to see the old design, while beta testers saw and tested the new visuals and structure. “Launch day” for the redesign consisted of simply flipping a switch in the database to make the new features visible to everybody.

The redesign brought not only a new logo, the dino-head-map that we see today, but also structural features like the navigation sidebar, which varies depending on which content area an article is in. In localized pages, items in the sidebar that are not yet translated link to their English versions, and show an invitation to translate them.

Content evolution

We mark the start of MDN “as we know it” from the acquisition and republication of the Netscape DevEdge content in 2005. But in the early days, the content was very slanted toward Mozilla products and technology. Not only was there documentation of XUL and internal Mozilla APIs (which are still there), but documentation of web technologies tended to be focused on Mozilla and Firefox, for example, with big banners like “works in Firefox 2.0” or explanations of Gecko’s support of a feature in the middle of an otherwise neutral article.

As Mozilla began engaging more actively with the MDN community in 2010, community members began to express a vision of MDN as a vendor-neutral resource for web developers, whatever browsers they are targeting. Adopting this as a strategy required a lot of clean-up effort to remove Firefox-specific content from articles about web standards, and to create the compatibility tables that exist now, with information about all major browsers. Not coincidentally, as the content on MDN became more browser-agnostic, MDN started seeing contributions from other organizations.

MDN today and tomorrow

Two current projects on MDN are having a major impact on the shape of MDN in the near to medium term: the Learning area, and the compatibility data project.

MDN’s information about web technologies has long been a resource for experienced web developers. But it has poorly supported beginners to web development. The aim of the Learning area is to change that by offering tutorials and other resources to people who want to teach themselves about web development. This effort is happening in response to surveys we’ve done of our audience, who reported basic learning material as a significant gap. The Learning area project has been underway for about a year, and in that time has created a large Glossary about web technology concepts, and a number of new tutorials, corresponding to the Web Literacy Map developed by the Mozilla Foundation. The Learning area is a great opportunity to get started in contributing to MDN, since learners and teachers are as needed as technical experts.

Currently, data on MDN about browsers’ compatibility with web technology features is maintained in tables on the relevant pages. The data is pretty good, thanks to many, many crowd-sourced contributions. But this approach is not very sustainable or maintainable; for example, every table must be replicated on all localized versions of the page. The compatibility data project aims to improve the quality of the data, make data contribution easier, make access to the data easier, and allow reuse of the data, through a centralized data store. This project is action-driven rather than time-bound; contributions and involvement are welcome.

MDN in 10 years?

MDN as it exists today is quite different from its beginnings ten years ago. The Web has evolved, Mozilla has evolved, and MDN has evolved. We can expect even greater changes in the next ten years. Perhaps the vision of a direct brain interface to virtual reality “cyberspace” will finally come to pass. We know for sure there will be many more web developers, many more types of devices, and many standards that are not yet written.

Some things won’t change: Mozilla’s mission will continue to be to work towards an Internet that is a global public resource, open and accessible for all. MDN will continue to be a means towards that mission, by providing resources to enable anyone to become a creator of the Web, and to develop on the Web as a primary platform. MDN’s content, no matter how it’s delivered, will continue to be contributed by a global community of people who are passionate about learning and sharing knowledge about the Web.

Categories: Mozilla-nl planet

Andreas Tolfsen: Optimised Rust code in Gecko

Mozilla planet - to, 23/07/2015 - 20:10
Rust wave by Glen Scott (CC BY-NC 2.0)

Following bug 1177608, Rust code in Gecko is now compiled with optimisation by default.

Compilation of Rust components can be enabled by adding ac_add_options --enable-rust to your mozconfig file. For now there’s only limited usage of Rust in Gecko, but you can take a look at the bindings for the MP4 encoder if you’re interested. This is in large part thanks to the work of Ralph Giles.

Since optimised compiles are the default, you will now also get optimised output from rustc. You can disable this by setting ac_add_options --disable-optimize, which disables optimisation for the cc, c++, and rustc compilers alike.

Next up is adding debug symbols and assertions, and adding some Rust autoconf macros.

sny.no/a
Categories: Mozilla-nl planet

Air Mozilla: Web QA Weekly Meeting

Mozilla planet - to, 23/07/2015 - 18:00

Web QA Weekly Meeting: This is our weekly gathering of Mozilla's Web QA team, filled with discussion on our current and future projects, ideas, demos, and fun facts.

Categories: Mozilla-nl planet

Air Mozilla: July Brantina on Prototyping with Tom Chi

Mozilla planet - to, 23/07/2015 - 18:00

July Brantina on Prototyping with Tom Chi: At this month's July 23 Brantina, Tom Chi (one of the founders of GoogleX) will share some best practices as well as things to avoid...

Categories: Mozilla-nl planet

The Mozilla Developer Network celebrates its tenth anniversary - Next INpact

News gathered via Google - to, 23/07/2015 - 17:34

The Mozilla Developer Network celebrates its tenth anniversary
Next INpact
The Mozilla Developer Network was born in February 2005, barely three months after the arrival of the very first official, finalised version of Firefox. At the time it was in fact called the Developer Center, but its mission has remained broadly ...

and more »
Categories: Mozilla-nl planet

Air Mozilla: Reps weekly

Mozilla planet - to, 23/07/2015 - 17:00

Reps weekly: Weekly Mozilla Reps call

Categories: Mozilla-nl planet

Mozilla Developer Network turns 10 - BetaNews

News gathered via Google - to, 23/07/2015 - 16:20

BetaNews

Mozilla Developer Network turns 10
BetaNews
The most viewed documents relate to JavaScript, CSS and Firefox. MDN isn't just for experienced developers, though: there are 90 articles for complete beginners and people learning the Web. There's more information on the Mozilla blog or in the facts ...

Categories: Mozilla-nl planet

Adobe Flash Player - Mozilla Jammed Adobe Flash Player - How Will the Other ... - Boosh Articles

News gathered via Google - to, 23/07/2015 - 16:13

Boosh Articles

Adobe Flash Player - Mozilla Jammed Adobe Flash Player - How Will the Other ...
Boosh Articles
Mozilla has completely jammed the Adobe Flash Player from running on its browser until any of the latest security updates of Flash satisfies Mozilla. Flash is popular software used to run videos and images on web content. However, it has seen a ...

and more »
Categories: Mozilla-nl planet

Go Faster, or Mozilla wants Firefox to be more flexible - Komputer Świat

News gathered via Google - to, 23/07/2015 - 15:35

Komputer Świat

Go Faster, or Mozilla wants Firefox to be more flexible
Komputer Świat
In this way Mozilla will try to determine which features attract users' interest and are worth putting more work into developing. We can see, then, that Go Faster will work on a principle similar to add-ons. New features are to ...

and more »
Categories: Mozilla-nl planet

Mozilla rails against the default-app rules in Windows 10 - WinFuture

News gathered via Google - to, 23/07/2015 - 13:31

WinFuture

Mozilla wettert gegen Standard-App-Regelung in Windows 10
WinFuture
Mozilla will seinen Browser Firefox wie bereits berichtet auch für Windows 10 anbieten. Jetzt haben die Entwickler des freien Browsers massive Kritik an Microsofts Entscheidung geübt, die Festlegung von Standard-Apps für bestimmte Aufgaben aus ihrer ...

Categories: Mozilla-nl planet

The Mozilla Blog: MDN celebrates 10 years of documenting YOUR Web

Mozilla planet - to, 23/07/2015 - 11:09

Today, Mozilla proudly celebrates the 10th anniversary of the Mozilla Developer Network, one of the richest and also one of the few multilingual documentation resources on the Web. It started in February 2005, when a small team dedicated to the open Web took DevEdge (Netscape’s developer materials) and set out to create an open, free, community-built online resource for all Web developers. Just a couple of months later, on 23 July 2005, the original MDN wiki site launched, and it has evolved steadily ever since for the convenience and the benefit of its users.

Today, ten years later, not only has the amount of documentation grown – 34,500 documents and climbing – but also MDN’s global volunteer community is bigger than ever. Currently, MDN has more than 4 million users and over 1000 volunteer editors per month creating and translating documentation, sample code, tutorials and other learning resources for all open Web technologies, including CSS, HTML, JavaScript and everything that makes the open Web as rich and versatile as it is.

For a wide range of Web developers, from learners to hobbyists to full-time professionals, MDN provides useful explanations for coding practice. It aims to inspire ideas, encourage collaboration and innovation and, ultimately, foster the growth of the open Web. Moreover, as the digital industry flourishes and the demand for coding skills at a young age rises, the importance of well-organized resources like MDN grows rapidly. That is why in 2014 MDN started to expand all its learning pages into a “Learn the Web” area for beginning web developers, including a web terminology glossary, which MDN’s technical writers and volunteers will continue to develop over the coming years.

All these efforts, which would not be possible without the active MDN volunteer base, are greatly appreciated by developers from all over the world who would not be doing what they do without MDN – or at least not as well.

Let’s hear it for MDN!

For more information:

Web: https://developer.mozilla.org/
MDN at 10: https://developer.mozilla.org/en-US/docs/MDN_at_ten
Twitter: https://twitter.com/MozDevNet

All graphics are also available in French, German, Italian, Spanish and Polish.

Categories: Mozilla-nl planet

Daniel Pocock: Unpaid work training Google's spam filters

Mozilla planet - to, 23/07/2015 - 10:49

This week, there has been increased discussion about the pain of spam filtering by large companies, especially Google.

It started with Google's announcement that they are offering a service for email senders to know if their messages are wrongly classified as spam. Two particular things caught my attention: the statement that less than 0.05% of genuine email goes to the spam folder by mistake and the statement that this new tool to understand misclassification is only available to "help qualified high-volume senders".

From there, discussion has proceeded with Linus Torvalds blogging about his own experience of Google misclassifying patches from Linux contributors as spam and that has been widely reported in places like Slashdot and The Register.

Personally, I've observed much the same thing from the other perspective. While Torvalds complains that he isn't receiving email, I've observed that my own emails are not always received when the recipient is a Gmail address.

It seems that Google expects their users to work a little bit every day going through every message in the spam folder and explicitly clicking the "Not Spam" button, so that Google can improve their proprietary algorithms for classifying mail. If you just read or reply to a message in the folder without clicking the button, or if you don't do this for every message, including mailing list posts and other trivial notifications that are not actually spam, more important messages from the same senders will also continue to be misclassified.

If you are not willing to volunteer your time to do this, or if you are simply one of those people who has better things to do, Google's Gmail service is going to have a corrosive effect on your relationships.

A few months ago, we visited Australia and I sent emails to many people who I wanted to catch up with, including invitations to a family event. Some people received the emails in their inboxes yet other people didn't see them because the systems at Google (and other companies, notably Hotmail) put them in a spam folder. The rate at which this appeared to happen was definitely higher than the 0.05% quoted in the Google article above. Maybe the Google spam filters noticed that I haven't sent email to some members of the extended family for a long time and this triggered the spam algorithm? Yet it was at that very moment, when we were visiting Australia, that email needed to work reliably with that type of contact, as we don't fly out there every year.

A little bit earlier in the year, I was corresponding with a few students who were applying for Google Summer of Code. Some of them observed the same thing: they sent me an email and didn't receive my response until they looked in their spam folder a few days later. Last year, a GSoC mentor I know lost track of a student for over a week because of Google silently discarding chat messages, so it appears Google has not just shot themselves in the foot, they have managed to shoot their foot twice.

What is remarkable is that in both cases, the email problems and the XMPP problems, Google doesn't send any error back to the sender so that they know their message didn't get through. Instead, it is silently discarded or left in a spam folder. This is the most corrosive form of communication problem as more time can pass before anybody realizes that something went wrong. After it happens a few times, people lose a lot of confidence in the technology itself and try other means of communication which may be more expensive, more synchronous and time intensive or less private.

When I discussed these issues with friends, some people replied by telling me I should send them things through Facebook or WhatsApp, but each of those services has a higher privacy cost and there are also many other people who don't use either of those services. This tends to fragment communications even more as people who use Facebook end up communicating with other people who use Facebook and excluding all the people who don't have time for Facebook. On top of that, it creates more tedious effort going to three or four different places to check for messages.

Despite all of this, the suggestion that Google's only response is to build a service to "help qualified high-volume senders" get their messages through leaves me feeling that things will get worse before they start to get better. There is no mention in the Google announcement about what they will offer to help the average person eliminate these problems, other than to stop using Gmail or spend unpaid time meticulously training the Google spam filter and hoping everybody else does the same thing.

Some more observations on the issue

Many spam filtering programs used in corporate networks, such as SpamAssassin, add headers to each email to suggest why it was classified as spam. Google's systems don't appear to give any such feedback to their users or message senders though, just a very basic set of recommendations for running a mail server.

Many chat protocols work with an explicit opt-in. Before you can exchange messages with somebody, you must add each other to your buddy lists. Once you do this, virtually all messages get through without filtering. Could this concept be adapted to email, maybe giving users a summary of messages from people they don't have in their contact list and asking them to explicitly accept or reject each contact?

If a message spends more than a week in the spam folder and Google detects that the user isn't ever looking in the spam folder, should Google send a bounce message back to the sender to indicate that Google refused to deliver it to the inbox?

I've personally heard that misclassification occurs with mailing list posts as well as private messages.

Categories: Mozilla-nl planet

Daniel Pocock: Recording live events like a pro (part 1: audio)

Mozilla planet - to, 23/07/2015 - 09:14

Whether it is a technical talk at a conference, a political rally or a budget-conscious wedding, many people now have most of the technology they need to record it and post-process the recording themselves.

For most events, audio is an essential part of the recording. There are exceptions: if you take many short clips from a wedding and mix them together, you could leave out the audio and just dub the couple's favourite song over it all. For a video of a conference presentation, though, the speaker's voice is essential.

These days, it is relatively easy to get extremely high quality audio using a lapel microphone attached to a smartphone. Let's have a closer look at the details.

Using a lavalier / lapel microphone

Full wireless microphone kits with microphone, transmitter and receiver are usually $US500 or more.

The lavalier / lapel microphone by itself, however, is relatively cheap, under $US100.

The lapel microphone is usually an omnidirectional microphone that will pick up the voices of everybody within a couple of meters of the person wearing it. It is useful for a speaker at an event and for some types of interviews where the participants are at a table together, and it may be suitable for a wedding, although you may want to remember to remove it from clothing during the photos.

There are two key features you need when using such a microphone with a smartphone:

  • TRRS connector (this is the type of socket most phones and many laptops have today)
  • Microphone impedance should be at least 1kΩ (that is one kilo Ohm) or the phone may not recognize when it is connected

Many leading microphone vendors have released lapel mics with these two features aimed specifically at smartphone users. I have personally been testing the Rode smartLav+.

Choice of phone

There are almost 10,000 varieties of smartphone just running Android, as well as iPhones, Blackberries and others. It is not practical for most people to test them all and compare audio recording quality.

It is probably best to test the phone you have and ask some friends if you can make test recordings with their phones too for comparison. You may not hear any difference but if one of the phones has a poor recording quality you will hopefully notice that and exclude it from further consideration.

A particularly important issue is being able to disable AGC (automatic gain control) in the phone. Android has a standard API for disabling AGC, but not all phones or Android variations respect this instruction.
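
As a rough illustration (not from the original post), the standard API in question is android.media.audiofx.AutomaticGainControl, which can be attached to a recording session and switched off. The helper class below is hypothetical, and whether the request is actually honoured still depends on the device.

    import android.media.AudioRecord;
    import android.media.audiofx.AutomaticGainControl;

    final class AgcHelper {
        // Best-effort attempt to disable AGC on an existing recording session.
        // Some devices ignore the request entirely, as noted above.
        static boolean tryDisableAgc(AudioRecord recorder) {
            if (!AutomaticGainControl.isAvailable()) {
                return false;               // no AGC effect exposed on this device
            }
            AutomaticGainControl agc =
                    AutomaticGainControl.create(recorder.getAudioSessionId());
            if (agc == null) {
                return false;
            }
            agc.setEnabled(false);
            return !agc.getEnabled();       // true only if AGC now reports disabled
        }
    }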

I have personally had positive experiences recording audio with a Samsung Galaxy Note III.

Choice of recording app

Most Android distributions have at least one pre-installed sound recording app. Look more closely and you will find not all apps are the same. For example, some of the apps have aggressive compression settings that compromise recording quality. Others don't work when you turn off the screen of your phone and put it in your pocket. I've even tried a few that were crashing intermittently.

The app I found most successful so far has been Diktofon, which is available on both F-Droid and Google Play. Diktofon has been designed not just for recording, but it also has some specific features for transcribing audio (currently only supporting Estonian) and organizing and indexing the text. I haven't used those features myself but they don't appear to cause any inconvenience for people who simply want to use it as a stable recording app.

As the app is completely free software, you can modify the source code if necessary. I recently contributed patches enabling 48kHz recording and disabling AGC. At the moment, the version with these fixes has just been released and appears on F-Droid but has not yet been uploaded to Google Play. The fixes are in version 0.9.83, and you need to go into the settings to make sure AGC is disabled and set the 48kHz sample rate.

Whatever app you choose, the following settings are recommended:

  • 16 bit or greater sample size
  • 48kHz sample rate
  • Disable AGC
  • WAV file format
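
For the curious, the sample size and sample rate settings above map onto Android's AudioRecord API roughly as sketched below. This is an illustration only, not code from any of the apps mentioned; the RecorderFactory class is hypothetical, writing the captured samples out as a WAV file is a separate step, and AGC handling is shown in the earlier sketch.

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    final class RecorderFactory {
        // 48kHz, 16-bit, mono capture, matching the recommended settings.
        static AudioRecord create48kRecorder() {
            int sampleRate = 48000;
            int channel = AudioFormat.CHANNEL_IN_MONO;
            int encoding = AudioFormat.ENCODING_PCM_16BIT;   // 16-bit samples
            int minBuffer = AudioRecord.getMinBufferSize(sampleRate, channel, encoding);
            return new AudioRecord(MediaRecorder.AudioSource.MIC,
                    sampleRate, channel, encoding, minBuffer * 4);
        }
    }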

Whatever app you choose, test it thoroughly with your phone and microphone. Make sure it works even when you turn off the screen and put it in your pocket while wearing the lapel mic for an hour. Observe the battery usage.

Gotchas

Now let's say you are recording a wedding and the groom has that smartphone in his pocket and the mic on his collar somewhere. What is the probability that some telemarketer calls just as the couple are exchanging vows? What is the impact on the recording?

Maybe some apps will automatically put the phone in silent mode when recording. More likely, you need to remember this yourself. These are things that are well worth testing though.
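
If you are building or patching a recording app yourself, silencing the ringer programmatically is straightforward. The sketch below is hypothetical (the RingerGuard class is not from any of the apps mentioned), and on some newer Android versions changing the ringer mode may require extra permissions.

    import android.content.Context;
    import android.media.AudioManager;

    final class RingerGuard {
        // Put the phone in silent mode before recording starts; remember to
        // restore the previous mode afterwards.
        static int silence(Context context) {
            AudioManager am =
                    (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            int previousMode = am.getRingerMode();
            am.setRingerMode(AudioManager.RINGER_MODE_SILENT);
            return previousMode;            // pass back to restore() when done
        }

        static void restore(Context context, int previousMode) {
            AudioManager am =
                    (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            am.setRingerMode(previousMode);
        }
    }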

Also keep in mind the need to have sufficient storage space and to check whether the app you use is writing to your SD card or internal memory. The battery is another consideration.

At a large event where smartphones are being used instead of wireless microphones, possibly for many talks in parallel, install a monitoring app like Ganglia on the phones to detect and alert if any phone has a weak wifi signal, low battery or a lack of memory.

Live broadcasts and streaming

Some time ago I tested RTP multicasting from Lumicall on Android. This type of app would enable a complete wireless microphone setup with live streaming to the internet at a fraction of the cost of a traditional wireless microphone kit. This type of live broadcast could also be done with WebRTC on the Firefox app.

Conclusion

If you research the topic thoroughly and spend some time practicing and testing your equipment, you can make great audio recordings with a smartphone and an inexpensive lapel microphone.

In subsequent blogs, I'll look at tips for recording video and doing post-production with free software.

Categories: Mozilla-nl planet

Nicholas Nethercote: “Thank you” is a wonderful phrase

Mozilla planet - to, 23/07/2015 - 06:29

Last year I contributed a number of patches to pdf.js. Most of my patches were reviewed and merged to the codebase by Yury Delendik. Every single time he merged one of my patches, Yury wrote “Thank you for the patch”. It was never “thanks” or “thank you!” or “thank you :)”. Nor was it “awesome” or “great work” or “+1″ or “\o/” or “\m/”. Just “Thank you for the patch”.

Oddly enough, this unadorned use of one of the most simple and important phrases in the English language struck me as quaint, slightly formal, and perhaps a little old-fashioned. Not a bad thing by any means, but… notable. And then, as Yury merged more of my patches, I started getting used to it. Then I started enjoying it. Each time he wrote it — I’m pretty sure he wrote it every time — it made me smile. I felt a small warm glow inside. All because of a single, simple, specific phrase.

So I started using it myself. (“Thank you for the patch.”) I tried to use it more often, in situations I previously wouldn’t have. (“Thank you for the fast review”.) I mostly kept to this simple form and eschewed variations. (“Thank you for the additional information.”) I even started using it each time somebody answered one of my questions on IRC. (“glandium: thank you”)

I’m still doing this. I don’t always use this exact form, and I don’t always remember to thank people who have helped me. But I do think it has made my gratitude to those around me more obvious, more consistent, and more sincere. It feels good.

Categories: Mozilla-nl planet

Morgan Phillips: git push origin taskcluster

Mozilla planet - to, 23/07/2015 - 06:16
If you've been around Mozilla during the past two years, you've probably heard talk about TaskCluster - the fancy task execution engine that a few awesome folks built for B2G - which will be used to power our entire CI/Build infrastructure soonish.

Up until now, there have been a few ways for developers to schedule custom CI jobs in TaskCluster, but nothing completely general. However, today I'd like to give a sneak peek at a project I've been working on to make the process of running jobs on our infra extremely simple: TaskCluster GitHub.

Why Should You Care?
1.) The service watches an entire organization at once: if your project lives in github/mozilla, drop a .taskclusterrc file in the base of your repository and the jobs will just start running after each pull request - dead simple.

2.) TaskCluster will give you more control over the platform/environment: you can choose your own docker container by default, but thanks to the generic worker we'll also be able to run your jobs on Windows XP-10 and OSX.

3.) Expect integration with other Mozilla services: For a mozilla developer, using this service over travis or circle should make sense, since it will continue to evolve and integrate with our infrastructure over time.

It's Not Ready Yet: Why Did You Post This? :(
Because today the prototype is working, and I'm very excited! I also feel that there's no harm in spreading the word about what's coming.

When this goes into production I'll do something more significant than a blog post, to let developers know they can start using the system. In the meantime, here it is handling a replay of this pull request. \o/ Note: the finished version will do nice things, like automatically leave a comment with a link to the job and its status.

Categories: Mozilla-nl planet
