For testing clubs this first quarter, we followed this process:
- Invite testers. We talked to allies about the opportunity and invited them to join the testing process. Each tester was given the dedicated support of a staff member to ensure they had direct and regular contact with the project.
- Kickoff call with testers. We initiated testing with a community call, which we continued to host fortnightly as an important check-in and reflection point. We used Vidyo and Etherpad for the calls.
- 1:1 interviews. To better understand our allies' needs, we conducted 40+ interviews with them. We collated and analyzed the data, which greatly informed our efforts.
- Affiliate comparison. In parallel, we also reviewed 10+ other organizations that have a club model or other form of local group organizing. This review gave us best practices to learn from.
- Curriculum curation. The testing process was two-part: curriculum curating and curriculum testing. To curate, we developed a curriculum arc (Reading the Web, Writing the Web, and Participating on the Web) and then sought existing activities to fill that out. Where there were gaps, we created or remixed new activities. This work was done on GitHub to great effect.
- Curriculum testing. Every two weeks, our testers were invited to try out the latest curriculum section. We shared reflections and questions in Discourse and used our fortnightly check-in call to discuss our experience and feedback on the sections.
- Assessment is hard. We know how important benchmarks are: we want to know how effective the curriculum is. We created brief questionnaires in Google Docs and made them part of the testing process, but response rates were low. This continues to be a challenge. How can we do friction-free assessment?
- Partner cultivation. As the testing was going on, we also drafted a partner engagement plan. What organizations would be ideal partners for clubs? What are we offering them, and how do we want to engage them? Next quarter we will put this plan into action with a number of wonderful organizations.
- Website development. Furthermore, we discussed with testers their needs for an online platform to showcase and connect this initiative. The first version of this new website will go live in April.
- Reflect early, reflect often. Throughout this quarter, we had conversations with testers, colleagues and other partners about this process. We constantly adjusted and improved. This is an essential practice. Going forward, I anticipate continual reflection and iteration as we develop clubs collaboratively and in the open. It was very beneficial meeting the team in person for several days of planning. I hope we can do that again, expanding to regional coordinators and testers, next quarter.
- Get out of the way. Once the framework is set up and a team is in place to support testing, it’s important to get out of the way! Smart people will innovate and remix the experience. Make sure there are ways to encourage and capture that. But allow beautiful and unexpected things to emerge, like Project Mile.
If you participated in this round of testing, or have related experiences, we’d love to hear your thoughts on the process!
During the testing of clubs, we identified the need for a community organizing model to recruit and support club leaders regionally.
We realized that this model must i) provide value to established mentors and ii) help aspiring educators find their audience and their pedagogical stance.

Understanding our allies
In other words, these are our two personas for clubs.

The mission-driven mentor:
- Motivation: Strong on MISSION. Cares about the web, issues like privacy.
- Needs: Help on how to teach. Seeks collaborators and a community.
- Incentives: Belonging, impact, recognition.
- Concerns: Feeling underappreciated; mixed to low teaching skills.
The structure-driven educator:

- Motivation: Strong on STRUCTURE. Wants good content for their learners.
- Needs: Curriculum & web tools, professional development, access to skilled educators
- Incentives: Engaged learners, professional development credentials.
- Concerns: Not drawn primarily by cause, narrative or brand.
To support these two personas, we established that intermediary volunteer leadership roles are needed. Inspired by Obama’s community organizing model, nicknamed “the snowflake”, we would like to pilot the following structure:
- Club Leader. Runs a local club.
- Regional Coordinator. Supports several local clubs.
- Staff Organizer. Supports several regional coordinators.
Starting in April, we'd like to work with a handful of beta-tester regional coordinators to test and grow this organizing model.

The Facilitative Teacher
Furthermore, we realized that community leaders would benefit from professional development and training. In parallel to the curriculum stream we have around web literacy, we will also develop modules around facilitative leadership and teaching.
This includes hands-on activities to teach how to use open practices, connectivism, digital making and general facilitation skills to empower your learners and grow your local community.
I’m quite excited about this area of development and plan to collaborate closely with Aspiration Tech and Mozilla Reps to build this out next quarter.
For clubs, we needed well-curated and field-tested curriculum informed by our pedagogy.

Our pedagogy
- Why we teach: This is our mission. We are dedicated to empowering others with web literacy so that they have agency on the web as creators, citizens and future leaders.
- How we teach: This is our pedagogy. Teaching and learning is how we achieve our mission. They are political as well as self-actualizing acts. We teach and learn by making projects together and openly reflecting on the process in an inclusive and locally relevant environment. Learning is social, production-centered, and open-ended. It is done best when facilitated in small groups meeting in-person.
- Who we teach: This is our audience. We teach our peers, so that we can reflect and improve together. We teach our local community, so we can give back and make a difference locally.
- What we teach: This is our subject. We teach web literacy, which encompasses the mechanics, culture and citizenship of the web. Our learners are more self-actualized as creators when they can use the web as a platform for creativity. They are better citizens when they can make more informed choices on the web. And they are economically more empowered with skills and practical knowledge of this public resource.
- Where we teach: This is our classroom. We teach locally, wherever we have our learners, be that in formal classrooms, libraries, coffee shops or at kitchen tables. We learn globally, as we connect with peers who inspire and mentor us to make local change that has a global impact.
For the last few years, web literacy pioneers have been developing open educational resources to teach the web.
Over the last two months, we curated some of the brightest examples of that work and sequenced it into a six-part introductory module.
They inspired and shared the foundational materials for the first module. Here's what the result looks like!

Web Literacy Basics
Learners get familiar with reading, writing and participating on the web in this six-part module. Discover the foundations of the web through production and collaboration. The learning objectives underpinning each activity are informed by Mozilla’s Web Literacy Map.
Complete the activities in sequence, or mix & match for your learners.

1. Reading the Web
A. Kraken the Code. Understanding credibility.
- Learners use the Internet to solve the mystery of The Kraken, a legendary sea creature, while also learning about search terms, keywords, and how to assess the validity and relevance of web sources.
B. Ping Kong. Understanding web mechanics.
- For many, “the Internet” is an abstract and overwhelming concept. This activity challenges learners to think concretely about how the internet communicates with a computer.
2. Writing the Web
A. Hack the News. Understanding remixing.
- Learners use X-Ray Goggles to remix a news website, learning about openly-licensed resources, different forms of media, and how to create something new on the Web through remix.
B. HTML Puzzle Boxes. Understanding composing for the web.
- Learners race to sequence the paper boxes labeled with HTML tags, becoming familiar with the most common HTML tags and how to structure a web page.
3. Participating on the Web
A. Web Chef. Understanding open practices.
- Learners teach their peers a skill and document the steps by making a web resource that includes properly attributed open content.
FINAL PROJECT: B. Story of Us. Understanding community participation.
- Learners tell their Story of Self, use it to reflect on what they have learned, and consider how they want to participate on the web and in their community going forward.
During a code review recently, I was looking at some code that was using the new keyword to create an instance, e.g. var inst = new MyConstructor();

The context object when new isn't used
The code contained a safeguard to bail if the new keyword wasn't used. That's a good idea, because if the new keyword is omitted, the context will be window.
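The original code block didn't survive publishing, so here is a minimal sketch of the behaviour being described: a constructor that logs its context in both call styles (the logging is illustrative, not the original code):

```javascript
// Hypothetical reconstruction: the original snippet is missing.
function MyConstructor() {
  // Without `new`, `this` is the global object (window in a browser);
  // with `new`, it is a freshly created instance.
  console.log(Object.prototype.toString.call(this));
}

MyConstructor();                // logs "[object Window]" in a browser
var inst = new MyConstructor(); // logs "[object Object]"
```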
This outputs: "[object Window]" "[object Object]"

Making a flexible constructor
To move away from requiring new, we ended up changing the constructor to allow instance creation both with and without new. This meant that we could create our objects more directly. In our case, the code was passing the objects as args, so getting rid of new simplified things quite a bit.
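The snippet itself was lost in publishing; a minimal sketch of such a flexible constructor (the foo parameter is an assumption based on the output quoted below) could be:

```javascript
// Hypothetical reconstruction of the flexible constructor.
function MyConstructor(foo) {
  // If called without `new`, re-invoke ourselves correctly.
  if (!(this instanceof MyConstructor)) {
    return new MyConstructor(foo);
  }
  this.foo = foo;
}

console.log(new MyConstructor("foo1").foo); // "foo1"
console.log(MyConstructor("foo2").foo);     // "foo2"
```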
This outputs: "foo1" "foo2"
So now var inst = MyConstructor(); and var inst = new MyConstructor(); are equivalent. At least if you forget the new keyword, you're not going to get burned by having the context object be window.

Using arguments to DRY up the signature
One issue that my colleague Kumar raised with this approach is that you need to update the signature in two places when it changes. Not a huge deal, but it would be nice to get away from that.
Immediately I thought of arguments. But you can't use apply and new together to create an instance.
So here's an alternative that deals with that:
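The original code block is missing here as well; reconstructed from the output quoted below (the foo/bar parameter names are assumptions), it might have looked something like this:

```javascript
// Hypothetical reconstruction; parameter names are illustrative.
function MyConstructor(foo, bar) {
  if (!(this instanceof MyConstructor)) {
    // Create an object with the right prototype, then run the
    // constructor against it, forwarding whatever arguments we got.
    var inst = Object.create(MyConstructor.prototype);
    MyConstructor.apply(inst, arguments);
    return inst;
  }
  this.foo = foo;
  this.bar = bar;
}

var a = new MyConstructor("foo1", "bar1");
var b = MyConstructor("foo2", "bar2");
console.log([a.foo, a.bar]); // ["foo1", "bar1"]
console.log([b.foo, b.bar]); // ["foo2", "bar2"]
a.foo = "whatever";
console.log([a.foo, a.bar]); // ["whatever", "bar1"]
console.log([b.foo, b.bar]); // ["foo2", "bar2"]
console.log(a instanceof MyConstructor); // true
console.log(b instanceof MyConstructor); // true
```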
Here we create an object with the MyConstructor prototype and then call the constructor using that object as the context. Because arguments is used, you avoid having two places to update the signature.
Here's the output: ["foo1", "bar1"] ["foo2", "bar2"] ["whatever", "bar1"] ["foo2", "bar2"] true true
Here's that example on JSBin so you can see it in action.
- 39 changesets
- 79 files changed
- 1070 insertions
- 508 deletions
Extensions (occurrences): js (32), cpp (17), h (6), xul (4), ini (4), xml (3), jsx (3), jsm (3), html (3), java (2), xhtml (1), mm (1)
Modules (occurrences): browser (38), gfx (13), dom (10), toolkit (8), services (2), netwerk (2), mobile (2), xpcom (1), widget (1), media (1), layout (1)
List of changesets:

- Randell Jesup: Bug 1130150: mSources update r=roc a=abillings - 21f52f25675a
- Tooru Fujisawa: Bug 949971 - Set longer timeout for test_input_typing_sanitization.html. r=RyanVM, a=test-only - 066ad2436c70
- Boris Zbarsky: Bug 453969 - Fix the race in test_bug382113.html so we don't set our child-onload-fired boolean to false _after_ the child onload has already fired. r=froydnj, a=test-only - 8fbda87f6b4d
- Chris: Bug 1106926 - Ensure that removing a hidden one click search provider also removes it from the browser.search.hiddenOneOffs pref. r=florian, a=lmandel - dc95247d23d7
- Chris: Bug 1121417 - Change hiddenOneOffs pref to use unichar type. r=gavin, a=lmandel - 90a33a9f129e
- J. Ryan Stinnett: Bug 1128027 - Clean up protocol.js pools after connection close. r=bgrins a=lsblakk - abce8eb1a75e
- J. Ryan Stinnett: Bug 1128027 - Repair sourceeditor test after protocol.js cleanup change. r=bgrins a=lsblakk - f21e8fa80469
- J. Ryan Stinnett: Bug 1128027 - Inspector destroy error was holding document alive. r=bgrins a=lsblakk - fa9b1bfa9f0e
- J. Ryan Stinnett: Bug 1128027 - Record protocol.js request headers for debugging. r=bgrins a=lsblakk - fbef1b8d36e0
- J. Ryan Stinnett: Bug 1128027 - Rework Console tests that click links. r=bgrins a=lsblakk - 9fb666f03801
- J. Ryan Stinnett: Bug 1128027 - Wait for inspector link in Web Console test. r=bgrins a=lsblakk - 7e2e728297e6
- Nicholas Nethercote: Bug 1134030 - Add WindowsAddressSpaceReporter. code=njn,dmajor. r=dmajor,njn. a=lsblakk. - 20306323469e
- Jean-Yves Avenard: Bug 1132342: Handle race should operation be aborted while reading metadata. r=karlt a=lsblakk - 803ed9fc9507
- Boris Zbarsky: Bug 1102042 - Fix the test to not have a race between the binding and the test bits. r=terrence, a=test-only - aa133901be39
- Ehsan Akhgari: Bug 922977 - Request a longer timeout when running test_reftests_with_caret.html. a=test-only - e41d9d701e13
- Jesse Ruderman: Bug 1133142 - Downgrade 'mTempFile not equal to mTargetFile' from assertion to warning. r=yoric, a=NPOTB - a8ffdc019a78
- David Anderson: Bug 1135883 - Implement GetMaxTextureSize in the basic compositor. r=mattwoodrow, a=lsblakk - 096b8eb2590d
- Neil Deakin: Bug 1015617 - Wait for panel to show before adding hidden listener. r=gijs, a=lsblakk - 916424218be4
- Karl Tomlinson: Bug 1138229 - GetOutputStreamInfo() after each SetOutputType(). r=cpearce, a=lsblakk - 14cc1f92c84c
- Jan-Ivar Bruaroey: Bug 1140363 - Fire recording-window-ended on gUM failures, like we do on deny. r=jesup, a=lsblakk - 0cd4e38e00b1
- James Willcox: Bug 1140830 - Don't try to use a null JSONObject in SiteIdentity.update(). r=rnewman, a=lsblakk - 7e93dd7c7feb
- Jeff Muizelaar: Bug 1136242 - Make sure we acquire the mutexes when copying the surfaces. r=jgilbert, a=lsblakk - 71e45360880e
- Robert Longson: Bug 1134561 - Use of the namespace when checking the tag name if HTMLPictureElement and HTMLSourceElement. r=jst, a=lsblakk - 97c57043b3fc
- Patrick McManus: Bug 1136140 - wss inside https proxy null deref. r=hurley, a=lsblakk - b8c7154fab60
- Benjamin Smedberg: Bug 1132192 - Enable org.mozilla.searches.engines by default in FHR. r=gps, a=lsblakk - 7949e470a547
- Ryan VanderMeulen: Backed out changeset 096b8eb2590d (Bug 1135883) for bustage. - b526678ba6d2
- Mark Banner: Bug 1106941 - Part 1: Firefox Hello doesn't work properly when no video camera is installed - fix rooms and outgoing conversations. r=mikedeboer, a=lsblakk - 245598f10dcd
- Mark Banner: Bug 1106941 - Part 2: Firefox Hello doesn't work properly when no video camera is installed - fix incoming conversations. r=mikedeboer, a=lsblakk - 9e52698fd237
- Cesar Guirao: Bug 1139132: Fix Chroma offset on WebRTC remote video when width is not even r=jesup a=lmandel - 19ac18d33c28
- Chris Pearce: Bug 1141241 - Add nullcheck for mDecoder in WMFMediaDataDecoder::ProcessDrain(). r=mattwoodrow a=lmandel - 18ecbc81b0e4
- Tim Taubert: Bug 1128928 - Fix intermittent browser_social_chatwindow_resize.js failures with ASAN builds by increasing the number of tries used by waitForCondition(). r=markh, a=test-only - f758eb029b69
- Neil Deakin: Bug 1002232 - Move tooltip test into a window to prevent the browser tooltip from interfering with it. r=neil, a=test-only - 09ac7b7f011a
- Matt Woodrow: Bug 1131808 - Avoid trying to allocate a buffer for 0 sized YCbCr images. r=nical, a=lmandel - 05df69e4ada6
- Nicolas Silva: Bug 1123080 - Use cairo's image backend as canvas fallback on windows. r=Bas a=lmandel - 45cc75aa62d9
- Nicolas Silva: Bug 1125848 - Reduce the likelyhood of a CompositorParent being destroyed without the proper shutdown sequence. r=sotaro a=lmandel - e15e6597d699
- Nicolas Silva: Bug 1125848 - Consolidate PCompositor's creation-destruction logic. r=sotaro a=lmandel - 81009105d11d
- Steven Michaud: Bug 1137229 - Fix breakage in IMEInputHandler::OnDestroyWidget(). r=masayuki a=lmandel - f8c988045bb5
- David Anderson: Bug 1135883 - Implement GetMaxTextureSize in the basic compositor. r=mattwoodrow, a=lsblakk - 2d58dd0bfaf7
- Ryan VanderMeulen: Backed out changesets 81009105d11d and e15e6597d699 (Bug 1125848) for Windows mochitest timeouts. - 17af3ddb4a24
This is part 2 of User Success in 2015. If you haven’t already, read part 1 first!
Mozilla planned things differently this year. All of Mozilla, including the Mozilla Corporation and the Mozilla Foundation, started back in late October and had the 2015 goals 90% finished in early December. As the humorous but insightful cliché goes, "the last 10% is the hardest 90%", which is why the goals weren't really 100% done until after the Christmas break.
We started with a three year vision and then moved onto the goals.
Behold – here is the User Success three year vision:
We will push the boundaries of what it means to give global community-powered support for a billion users with excellence and personality. We will enable users to help themselves and each other in ways never before seen.
We will surface the issues that product teams need to fix first to stop attrition, because we understand that the best service is no service. As a result, user satisfaction is skyrocketing.
Internally, we will become known as the team that truly understands our growing user base. Externally, we will become seen as thought-leaders in proactive customer care.
Let’s get a little more specific and talk about our specific plans for 2015. First, some assumptions we’re working under:
- Mozilla’s global, cross-product market share will roughly double in size (500M -> 1 Billion)
- The size of the paid staff on my team will remain largely the same (give or take a couple of hundred people – one can always dream)
- Our fantastic volunteer community continues to grow and thrive, aided by our focused community management efforts
With that out of the way, these are the specific things we’re doing in 2015:
1. Help make our products better to increase user happiness
- Increase the accuracy of user insights provided to the org so that product and engineering teams can more easily act on them.
- Get instrumentation in place to define and prioritize Top Attrition Risk issues (issues that are most devastating to user happiness and retention, such as data loss).
2. Help more users by moving our efforts up in the product/user lifecycle
- Self-heal: Don’t wait for users to come to us with problems – when possible, fix known issues automatically in Desktop Firefox!
3. Provide excellent support to all of our products and services
- Increase user satisfaction across our products and services
- Create feedback mechanisms and stand up support to serve and gain insights about users of new product and service launches.
On "moving our efforts up" in the product/user lifecycle, one analogy I've been kicking around is the idea of our team on a football field (note that this comes from someone who isn't very interested in football!).
Remember the amazing collection of circles-in-circles in part 1 of this blog post series? Now, consider those circles overlaid on a football field.
Maybe this helps illustrate how we think of the impact we have on both our products and our users. The higher up in the field we’re able to deflect issues, the lower the cost and the higher the user satisfaction.
One way of looking at this is to consider the point when a user hits a support website as a point of failure. If the midfield messes up, defense has to deal with it. And if the defense messes up, it’s up to the goalie to recover the situation. The closer you get to the goal, the more costly mistakes become and the less proactive you can be on the field.
To football fans out there, on a scale of 1 to 10, how painfully obvious is it that I know more about user happiness than the green field of chess?
Next up: User Success in 2015 – Part 3: How will we know we nailed it in 2015? (Will update this post with a link once that post is published.)
In my previous post about our new Spark infrastructure, I went into detail on how to launch a Spark cluster on AWS to perform custom analyses of Telemetry data. Sometimes, though, one needs to rerun an analysis recurrently over a certain timeframe, usually to feed data into dashboards of various kinds. We are going to roll out a new feature that allows users to upload an IPython notebook to the self-serve data analysis dashboard and run it on a scheduled basis. The notebook will be executed periodically with the chosen frequency, and the result will be made available as an updated IPython notebook.
To schedule a Spark job:
- Visit the analysis provisioning dashboard at telemetry-dash.mozilla.org and sign in using Persona with an @mozilla.com email address.
- Click “Schedule a Spark Job”.
- Enter some details:
- The “Job Name” field should be a short descriptive name, like “chromehangs analysis”.
- Upload your IPython notebook containing the analysis.
- Set the number of workers of the cluster in the “Cluster Size” field.
- Set a schedule frequency using the remaining fields.
Once a new scheduled job is created it will appear in the top listing of the scheduling dashboard. When the job is run its result will be made available as an IPython notebook visible by clicking on the “View Data” entry of your job.
As I briefly mentioned at the beginning, periodic jobs are typically used to feed data to dashboards. Writing a dashboard for a custom job isn't very pleasant, and in the past I wrote some simple tools to help with that. It turns out, though, that thanks to IPython one doesn't necessarily need to write a dashboard from scratch but can simply reuse the notebook as the dashboard itself! I mean, why not? That might not be good enough for management-facing dashboards, but it's acceptable for ones aimed at engineers.
In fact, with IPython we are not limited to matplotlib's static charts. Thanks to Plotly, it's easy to generate interactive plots that allow you to:
- Check the x and y coordinates of every point on the plot by hovering with the cursor.
- Zoom in on the plot and resize lines, points and axes by clicking and dragging the cursor over a region.
- Pan by holding the shift key while clicking and dragging.
- Zoom back out to the original view by double-clicking on the plot.
Plotly comes with its own API, but if you already have a matplotlib-based chart then it's trivial to convert it to an interactive plot. As a concrete example, I updated my Spark Hello World example with a Plotly chart:

```python
fig = plt.figure(figsize=(18, 7))
frame["WINNT"].plot(kind="hist", bins=50)
plt.title("startup distribution for Windows")
plt.ylabel("count")
plt.xlabel("log(firstPaint)")
py.iplot_mpl(fig, strip_style=True)
```
As you can see, just a single extra line of code is needed for the conversion.
As WordPress doesn’t support iframes, you are going to have to click on the image and follow the link to see the interactive plot in action.
It’s been a fantastic two months of producing and testing clubs.
Here’s a recap of what we accomplished:
- Produced and field-tested the first curriculum module in 24 cities.
- Drafted an organizing model to recruit and support club leaders regionally.
- Developed a soon-to-launch website introducing clubs as part of the larger Mozilla Learning offering.
- Finalized a partner engagement plan to spread web literacy through strategic partnerships.
- And most importantly, collaborated with an amazing community of club testers and educators who are at the forefront of teaching web literacy.
In follow-up posts, I’d like to share more about the accomplishments above, what we learned and where we struggled, as well as share ideas for where the club project can go next. If you’re interested in getting involved, drop a line in our discussion forum.
But for now, I’d like to give a huge thanks and gif high-five to the first club testers. We couldn’t have had a more knowledgeable and joyful group!
- Andrea Ellis, Kansas City Library
- Leslie Scott, aSTEAM Village
- Lina Kim, Diana Lee, and Ab Velasco, Toronto Public Library
- Justin Hoenke, Chattanooga Public Library
- Meredith Summs, MOUSE
- Gina Tesoriero, NYC Department of Education
- Maurya Couvares, ScriptED
- Nivedita Lane, Boys and Girls Clubs of Canada
- Heather Schneider, Shedd Aquarium
- Christina Cantrill, National Writing Project
- Dan Gilbert, Afterschool Alliance
- Jess Weichler, Makerbox
- Su Adams, Staplehurst School
- Steven Flower, CoderDojo Manchester
- Jonathan Prozzi, Digital Harbor
- Ellie Mitchell, Maryland Afterschool Network
- Emma Dicks, Innovate the Cape
- Mick Fuzz, Clearer Channel
- Emma Irwin, Mozilla Community Education
- Vineel Reddy, Gauthamraj Elango, Ankit Gadgil, Sayak Sarkar, Galaxy Kadiyala, Priyanka Naj, Mozilla India
- San Emmanuel James and Lawrence Kisuuki, Mozilla Uganda
- Faye Tandog, Mozilla Philippines
- Chad Sansing, BETA Academy
- Kim Wilkens, St. Anne’s-Belfield School
- Ian O’Byrne, University of New Haven
- Alvar Maciel, Buenos Aires educator
- Mikko Kontto, The English School in Helsinki
- Greg McVerry, Southern Connecticut State University
- Luis Sanchez, Mozilla Mexico
- Roz Hussin, University of Nebraska-Lincoln
- Sophia Koniarska, Think Big
- Pete O’Shea, CoderDojo
- Elijah van der Giessen, NetSquared
- Chris Mills, Mozilla Developer Network
- Kathryn Barrett, Girls Learning Code
- Elio Qoshi, Webmacademy Albania
- Andre Garzia, Rio LAN House project
- Simona Ramkisson, Amira Dhalla, Doug Belshaw, Lainie DeCoursy, Laura Hilliger, An-Me Chung and Chris Lawrence, Mozilla Foundation