We'll see | Matt Zimmerman

a potpourri of mirth and madness

Posts Tagged ‘Conferences’

Who’s afraid of Ubuntu Women?

At the recent Ubuntu Developer Summit, there were three sessions held to discuss the future of the Ubuntu Women project. Unfortunately, I was unable to attend the first two, because I didn’t realize the first one was happening, and I had a scheduling conflict for the second. The first session was video recorded, and hopefully the recording will be made available soon. While attending the third and final session, I tried to catch up on the earlier discussion as I listened to what was being said.

The first thing I noticed in the Gobby notes was a link to an existing roadmap for Ubuntu Women. I hadn’t seen that document before, and was encouraged to see that it included concrete, measurable goals for increasing the participation of women in Ubuntu. In particular, it presents a goal of increased representation in Ubuntu governing bodies, which I think is an important step in promoting gender diversity in the project. People want leaders they can identify with.

The next thing I found in the document was a list of goals. I asked about the relationship between the goals in Gobby and the ones in the wiki roadmap, and someone explained that the goals in the wiki were long term, while the ones in Gobby were short term (to be completed in the 6-month Lucid cycle).

There were about 25 people attending the session, and most of the talking was done by Amber Graner, Elizabeth Krumbach, Laura Czajkowski, Jono Bacon and Kurt von Finck. It was Friday afternoon, the last day of an intense week, and the energy level was fairly low. The focus seemed to be on reviewing the group’s objectives and agreeing who would take the next steps. The objectives were as follows:

Clarify the purpose of the #ubuntu-women channel

The group seemed to feel that there was confusion about what this IRC channel was for. A couple of men in the room said that they didn’t know whether they could or should join the channel, because it had the word “women” in the name.

The core of the issue seemed to be less about purpose than governance. The group was concerned about the fact that the channel was not publicly logged like most other Ubuntu channels, and that this gave the impression of it being a “fiefdom” within the community, or a place where people would “gossip”.

As far as I’m aware, there is at present no requirement that Ubuntu channels (official or unofficial) must be publicly logged, and there are many channels which are not. If this is considered to be a requirement for a healthy IRC community, then the Ubuntu IRC council would be in a good position to put forward such a policy. I don’t think I have enough experience in regulating IRC discussions to say whether this is the right thing to do, but it seemed a bit odd to me that this came up in the context of #ubuntu-women. It isn’t clear to me what problem this is meant to solve, and whether it is consistent with precedent (again, I’m not very familiar with IRC governance).

There was some confusion over why folks might not want the channel to be logged. Kurt suggested that if the conversation adhered to the Code of Conduct, there should be no reason not to publish it. I suggested that there were many occasions where a conversation might be appropriate to keep “off the record” while still following the code of conduct, and that these were separate issues (standards of behavior versus privacy).

The group’s agreed actions on this topic included agreeing and documenting guidelines for behavior in #ubuntu-women, and arranging for the conversations in the channel to be publicly recorded.

Create a safe space IRC channel

This objective seemed to acknowledge that something would be lost if the conversations in #ubuntu-women were made a matter of public record. The group therefore proposed the creation of a separate channel, which would still be logged, but only the Community Council would have access to the logs.

The reason for this seemed to be, again, the need to ensure regulation, and the concern that without oversight, channel participants would misbehave. While a safe space does require oversight in order to be maintained, the goal of involving the CC seemed to be general governance of behavior rather than the safety of women. The group seemed to acknowledge that this idea needed more work, and in particular wasn’t satisfied with the terminology of safe space.

The agreed actions were to create the new channel, document guidelines for behavior in it, and arrange for the conversation there to be logged for the Community Council.

Appoint a leader of the Ubuntu Women team

The group seemed to feel that, in order for the team to meet its goals, it was important to implement some form of government, and that the appropriate structure (at least initially) would be to have a single leader. They proposed to define the responsibilities of such a role, solicit nominations from the community, and ask the Community Council to appoint a leader.

I asked why the team could not appoint their own leader, and they explained that the team was not well defined enough: for example, the corresponding Launchpad team is open for anyone to join. Without explicit membership, it’s difficult to organize a fair election. They suggested that the appointed leader would go about organizing the team to the point where it could govern itself more effectively.

There seemed to be some concern that this would be controversial.

Change the perception of Ubuntu Women

After the written goals had been reviewed, Amber said that in her view, the true value of the sessions had been to change the perception of Ubuntu Women in the community, and that the perception had been very negative. All of the vocal participants agreed with this assessment, seemed to feel this was an important problem to solve, and felt that great progress had been made during the course of UDS.

I was surprised by this, because I hadn’t encountered this perception myself, and so I asked to hear more about it. Several people asserted that there was a problem, that Ubuntu Women and/or its IRC channel were perceived in a negative light. Two men in the room offered anecdotes: one didn’t think he should join the IRC channel because it had “women” in the name (which seems like a different issue), and another said that someone in his LoCo had advised him to avoid it because it was hostile.

I didn’t really understand all of this, but I didn’t want to derail the conversation, particularly as I had missed the first two thirds of it. In talking to people following the event, the issue at hand seems to be the IRC channel, #ubuntu-women, rather than Ubuntu Women itself. The channel, at one point, had become a sort of common meeting place for women in various geek communities, and was a place where they would sometimes blow off steam, or conduct broader feminist discussions beyond the scope of Ubuntu Women. This was apparently a bit off-putting to the uninitiated, as well as to some of the channel’s regular participants.

Some time ago, #ubuntu-women reverted back to its original purpose and the other discussion moved elsewhere, but it seems that this perception remained among some members of the Ubuntu community. This also may explain why I’ve been hearing that people are confused about the difference between Geek Feminism and Ubuntu Women, because some of the same people are involved in both, and discussed both on #ubuntu-women.

Hopefully that’s the end of this apparent stigma, and Ubuntu Women can get on with the business of helping the Ubuntu community to welcome more women.

Written by Matt Zimmerman

November 23, 2009 at 06:44

Ubuntu Developer Summit: 10.04 (Lucid)

This week, I’m in Dallas, Texas working at the Ubuntu Developer Summit.  Hundreds of Ubuntu developers and other community members are gathered to discuss the future of the project, particularly the 10.04 release.  Developers are engaged in technical discussions about how to implement new features for the release next April.

Obviously, a week is not enough time to decide, design and plan half a year of work, but we try to fit as much as possible into the week, because it is such a rare opportunity for us to work together face to face.  In order to make the best use of our time, there is a very full schedule of sessions, and we do a great deal of advance preparation.

There is a persistent rumor that UDS is where we decide what to do in the next cycle, but this isn’t quite accurate.  UDS is where we primarily figure out how to do what needs to be done.  Naturally, UDS is a sea of ideas, thanks to all of the creative thinking which happens among attendees, and we do dream up and decide to do new things there.  However, most of this is determined well before we all board airplanes to travel to UDS.

Brainstorm is constantly collecting and ranking suggestions from Ubuntu users.  Ubuntu development teams hold public meetings on IRC where they discuss ideas and plans.  Canonical stakeholders submit requirements for their needs.  All of this information is aggregated, sorted, evaluated and prioritized, largely by the heroic engineering managers at Canonical, who then develop the core of the agenda for UDS.  Additional sessions are then added as they come up during the week, when there is space.

At this particular UDS, I am moderating the server track, where we’re hashing out the details of our projects for Ubuntu Server Edition 10.04.  Being a UDS track moderator makes for a very busy week, with back-to-back sessions all day for five days straight.  It’s only Wednesday, and I’m feeling a bit fried already, having been away from home for over two weeks.

In each session, there is a discussion between the developers working on the project, the other UDS attendees who are interested in it, and any random folk who listen in on the audio stream and add questions or comments via IRC.  The participants take notes using Gobby and then publish them in the Ubuntu wiki, where they are developed into specification documents tracked in Launchpad.

Those specifications are further broken down into work items, which we can use to maintain a burn down chart.  Rick Spencer, our desktop engineering manager, gave a presentation this afternoon about how that process will work.  The burn down chart will give us a tool for establishing whether we are on track to complete our work, or if we are under or over committed, and make adjustments to our plans as needed.
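The arithmetic behind a burn down chart is simple: each day, subtract the completed work items from the total, and compare what remains against an ideal line that reaches zero at the end of the iteration. A minimal sketch with invented numbers (the real Lucid work items were tracked in Launchpad, so nothing here is actual data):

```python
# Hypothetical counts for a short iteration; invented for illustration.
total_items = 40
completed_per_day = [0, 3, 5, 2, 6, 4]  # items closed on each day

# Actual burn down: subtract each day's completed items from the total.
remaining = []
left = total_items
for done in completed_per_day:
    left -= done
    remaining.append(left)

# Ideal line: burn the backlog evenly, reaching zero on the last day.
days = len(completed_per_day)
ideal = [total_items - total_items * (d + 1) / days for d in range(days)]

# Comparing the two lines shows whether the team is over-committed.
for day, (actual, target) in enumerate(zip(remaining, ideal), start=1):
    status = "on track" if actual <= target else "behind"
    print(f"day {day}: {actual} items left (ideal {target:.1f}) - {status}")
```

Plotted day by day, this comparison is exactly the signal the chart provides: a widening gap between the actual and ideal lines means the plans need adjusting.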

I have a sense of tremendous momentum going into this release cycle, which will culminate in our third LTS (long-term support) release of Ubuntu.

Written by Matt Zimmerman

November 18, 2009 at 23:40

Amplify Your Effectiveness (AYE) Conference: Day 3 (afternoon)

Seeing Process: Making Process Visible (Steve Smith)

This session allowed me to practice capturing and visualizing the processes used by a team. In order to create a process to work with, we first conducted a simulation, where we (about 15 participants) organized ourselves to produce a product (sorted decks of cards) from raw materials (many cards from different decks). Volunteers took on the roles of the customer, the customer’s QA manager, the factory owner and the factory production manager, while the rest of us (myself included) acted as factory workers. As a manager in my day job, I wanted to get a different perspective in this exercise, and practice different kinds of leadership.

While the customer and management were sequestered for further instruction, the workers waited outside and talked about what would happen. There was quite a bit of talk of disrupting the activity, making things difficult for management, which I hoped wouldn’t amount to anything once we got started. The point of the exercise was to learn something, and being too disruptive would hinder that.

More interestingly, several of the workers had participated in other, similar simulations, and so were keen to avoid the obvious pitfalls. We collectively resolved to work cooperatively and treat each other well during the simulation: to act as a team. This turned out to be a defining moment in the simulation.

The appointed manager started out by briefing us on the parameters of the simulation, insofar as he was aware of them, and setting some ground rules. He explained that there was a mess inside, preparing us for what to expect. Realizing the group might be too big for the task at hand, he explained that some people might not be needed, and they should find something to do on their own without interfering with the rest of the group. This was also a key action.

The manager also made a point of wielding explicit power, which I found a bit excessive. He said there would be layoffs, holding them over the workers as a punishment. Particularly given the team dynamic, this risked uniting the team against him, which I felt was unnecessary.

When we were ready to start work, the manager gave a lot of explicit instruction, providing an algorithm for the first phase of sorting. It wasn’t a very good algorithm, and this was evident to some members of the group, but the workers didn’t seem to feel it was worth the time which would be wasted arguing. That is, all but one, who caused the group to block for a couple of minutes by refusing to cooperate or suggest an alternative.

What was missing at this point was an understanding of the goal. The team was prepared to work on the task, but without knowing what the result was expected to be, they hesitated and were confused. The manager offered a vote on a small change to the algorithm, and it was agreed. Gradually, we established that the result should be piles of cards sorted according to the pattern on the back, and the team quickly self-organized around this goal, largely disregarding the proposed algorithms. This beautifully illustrated one of the lessons of the exercise, by showing how quickly and naturally processes diverge from what is prescribed.

I took quite an active role in learning the parameters of the simulation and helping to facilitate the group’s self-organization. During the initial card-sorting phase, I established a clearinghouse for cards of all types, and invited my colleagues to bring me any type of card to add to my stacks. This simple act made it obvious how to contribute to the goal, and organized the physical materials for later use. Someone else was doing the same thing, and when all of the cards were stacked, we just combined our stacks.

The manager explained that we needed to sort the decks. I asked what ordering should be used, and he pointed to a flipchart where instructions were written. The workers took the stacks of cards off to various parts of the room to sort them. Once I had one complete, I took it to the QA manager, who said it was incorrect. I checked it against the instructions, and it was clear that they were not effective in communicating the correct ordering. I rewrote them on another flipchart, checked them with the QA manager, and took the old instructions down.

Before I went back to sorting cards, I talked to other people sorting, and made sure that they knew about the new instructions. I checked if they had any complete decks, helped them do the final (quick) sorting by suit, and carried their completed decks to QA. This simple action both boosted our immediate output (by pushing work in progress through the process) and accelerated production (by spreading information about the requirements).

The next time I handed in a batch of cards to QA, I was recruited to work at the QA table, because there was a shortage of capacity there. After checking a couple of decks, I had had experience with every stage of the process in the “factory” and thus a strong working knowledge of the overall process. For the rest of the time, I sought out work in progress on the factory floor and helped to finish it off. Wherever I found stacks of cards, I just sorted them and took them to QA.

Throughout this process, I remember hearing the disconcerted voice of the owner, who seemed to be worried about everything. She was frightened when I fixed the instructions on the flipchart, and confused when I changed roles or did anything else unexpected. She noticed when people were not working, and wanted the manager to do something about it. I’m not sure how that role was structured, or what pressures she was responding to. I had no contact at all with the customer.

Once the simulation was over, the customer checked several of the delivered products, rejecting one which had an error, but still ending up with enough to meet the requirements. All told, the simulation ran very smoothly, and I was keen to analyze why.

Next, each of us drew a diagram of the process from our point of view. I often struggle with creating diagrams by hand, because I tend to work out the structure as I go. This time, I stopped for a moment and sketched the major phases of the process in my notebook before I started drawing. This resulted in a much clearer and neater diagram than I normally produce.

We discussed the diagrams as a group, which revealed a lot about how things had really worked, more than anyone had been able to see individually. It was surprising how much information was disseminated in such a short amount of time through drawing and discussing the diagrams. It helped that the entire exercise was very fresh in our minds, but I got the impression that this would be a useful way to gather and share information from a group under normal circumstances as well.

I had to leave a bit early in order to catch my flight, so I missed the end of the discussion.

Written by Matt Zimmerman

November 12, 2009 at 01:28

Posted in Uncategorized


Amplify Your Effectiveness (AYE) Conference: Day 3 (morning)

Today is my last day at the conference, and I’ll be leaving before the welcome dinner in order to catch my flight to Dallas. It has been a pleasure meeting everyone, and an invaluable opportunity to get perspective on my work and other parts of my life.

Morning session: Retrospectives (Esther Derby)

The full title of this session was Looking Back, Moving Forward: Continuous Improvement With Effective Retrospectives. I find retrospectives to be a useful tool for process improvement, and am interested in gaining more experience with them and learning new techniques.

To start, the group (about 12 people) conducted a project together. We were provided with a specification for a 3-dimensional model of an urban development project, and some supplies with which to build it. Over the course of 30 minutes, we organized ourselves, performed the task, and successfully delivered the result to a satisfied customer.

Overall, the project went very well, so I was looking forward to the retrospective. In a retrospective, I often find myself focusing on the problems, so this would be good practice in identifying and repeating successes.

To start, we “set the stage” for the retrospective by reviewing the agenda and “working agreements” (pre-established goals and guidelines for behavior). I always try to do this in retrospectives, but the idea of agreeing the guidelines in advance was new to me. This would both help to build support in the team, and save time in the meeting.

Then, we conducted a quick poll of the group to assess the outcome of the project, using a weather metaphor: was it “partly sunny”, “cloudy”, “rain”, etc. I liked this metaphor because I found it more neutral than a rating: weather is something which just happens and is observable, while numbers, for example, seem to carry more judgement.

Next, we broke into smaller groups to collect data (our observations) on the project we had just completed. I liked the idea of doing this in smaller groups, as it promoted more points of view, and also produced some redundant data (which was useful in the analysis to follow). We wrote these observations on sticky notes.

To analyze the data, everyone came back together and plotted it on a “FRIM” chart (frequency vs. impact), where one axis indicated frequency (how often did it occur during the project?) and the other impact (how much impact did it have, and was it positive or negative?). We looked at this data from near and far, looking for patterns and considering individual notes.

The upper left quadrant (low frequency and positive impact) was dominated by things which happened near the beginning of the project: establishing roles, establishing the workspace, etc. The lower left quadrant (low frequency and negative impact) showed mostly items to do with ambiguity: lack of information, confusion and so on. The upper right quadrant (high frequency and positive impact) included ongoing teamwork: the steady progress which helped coalesce and motivate the team. There was nothing in the lower right quadrant (high frequency and negative impact), which I took as a sign of a healthy project. The fact that some of the observations were redundant helped make the patterns clearer, by showing what had been noticed most.
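This quadrant reading is mechanical enough to sketch in code. The observations, scores and frequency cutoff below are invented for illustration, not the session's actual data:

```python
# Made-up observations as (description, frequency score, impact score);
# the sign of the impact score distinguishes positive from negative,
# and the cutoff between "low" and "high" frequency is arbitrary here.
observations = [
    ("established roles early", 1, 3),
    ("set up the workspace", 1, 2),
    ("unclear instructions at the start", 1, -2),
    ("ongoing teamwork", 5, 3),
]

def quadrant(freq, impact, freq_split=3):
    """Classify one observation into a FRIM quadrant."""
    side = "high-frequency" if freq >= freq_split else "low-frequency"
    sign = "positive" if impact > 0 else "negative"
    return f"{side}/{sign}"

for desc, freq, impact in observations:
    print(f"{quadrant(freq, impact):>23}: {desc}")
```

An empty high-frequency/negative bucket, as in our session, is the pattern to hope for.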

With these observations in mind, we broke into smaller groups again, this time with the objective of generating ideas for improvement, or for repeating our successes. I thought that splitting up the group would bring more ideas out, but in fact we came up with many of the same suggestions. Still, this helped to reinforce the suggestions because they were independently proposed by different groups.

The team was reformed, and suggestions were posted on a flipchart page with a table for evaluation. The team rated the suggestions according to the amount of effort required, the anticipated impact, and their individual willingness to work on it. Finally, based on this data, Esther asked for a volunteer to implement the most promising suggestion, choosing just one to keep the amount of change manageable. She asked for a second volunteer to back up and support the first. I liked the idea of rating the suggestions together, and also having someone in a supporting role for the followup action.

We then polled the team again, reusing the weather metaphor, to predict the “weather” for the next project, based on what we learned. Most of the group was optimistic, though a couple of people predicted storms. When asked about their concerns, one of them pointed out that our decision to focus on improving one aspect of our process could cause us to waste too much time up front working on that. To mitigate this, we resolved to limit the amount of time we would spend on it, and he seemed satisfied. The other doomsayer was unrepentant, and said he suspected we would fall prey to the “second system” effect. I got the impression that he wanted to be proven right, and wasn’t interested in avoiding such a problem.

We concluded by reviewing the overall structure of the retrospective, and how the stages fit together: setting the stage, gathering data, analyzing it, selecting actions and wrapping up. The wrap-up included a retrospective on the retrospective itself, to promote improvement of that process. I was a bit concerned we would end up in infinite regress, but Esther stopped there, and didn’t do a retrospective of the retrospective of the retrospective. I don’t normally even do one meta-retrospective, but am considering trying that out now.

Written by Matt Zimmerman

November 12, 2009 at 01:01

Amplify Your Effectiveness (AYE) Conference: Day 2

Morning Session: How Do I Communicate With You? (Don Gray)

This session explored interpersonal communication, in particular how to identify communication styles and preferences and apply this knowledge to communicate more effectively.

In one of the exercises, we broke into pairs and exchanged stories from recent memory. While one person told their story, the other observed their language, noting the types of predicates used (visual, auditory, kinesthetic, olfactory/gustatory and “unspecified”) as indicators of the speaker’s representational system. We used this data as the basis for a discussion about cues which indicate a person’s preferences, ranging from word choice to eye movement.

Another exercise divided the group into two based on personality type (Myers-Briggs “N” and “S”). Both groups were given identical objects (Starbucks paper coffee cups) and asked to write down descriptive phrases. The “N” group’s descriptions were wide-ranging, exploring the possible uses of the cup and the memories and meanings it evoked in them. The “S” group focused more on the physical properties of the cup, and where there were some comments on the ideas associated with it, these were kept on a separate list.

A third exercise had us folding a piece of paper according to instructions read aloud. The instructions were ambiguous, and would be interpreted differently depending on how the person happened to be holding the paper, how they turned it as they folded it, or how they interpreted the sentences. Unsurprisingly, everyone ended up with a different pattern.

There was then some free-flowing discussion about communication styles where people in the group shared experiences of communication challenges. I learned more from these scenarios than from the exercises, which were covering familiar ground. The challenge, for me, is applying what I know about communication theory, by raising my awareness of patterns in real life situations. Practicing this through discussing examples was more helpful to me than exploring more theory.

Finally, the group was presented with a choice of whether to explore a canned scenario (a new executive looking for feedback from others in the organization) or a real-world scenario put forward by someone in the group. The latter option prevailed, but the scenario was a price negotiation with few details, which I didn’t think would be valuable for me. I excused myself and left early, purchasing a copy of Tom DeMarco’s Slack from the book table.

Afternoon session: Beyond the Org Chart Illusions (Jerry Weinberg)

This session alone would have made this conference worthwhile. In it, we explored the structure and dynamics of organizations from a first person point of view. The idea was to discover the tacit structures which make up organizational culture. Setting aside the “objective” description provided by an organizational chart or other such tool, we instead created individual representations of the organization as we experienced it.

To do this, we used an exercise based on Virginia Satir’s technique of family mapping. We each selected an organization we were familiar with, and drew a picture with symbols: first ourselves, then other people (both with no names or words), then physical objects and structures, and labels. We then overlaid the points of pain, pleasure, problems, plans, performance (high and low) and power in the organization.

Unsurprisingly, the diagrams were all vastly different in symbolism, structure, order, level and technique. Jerry focused the group on a pair of diagrams created by two different people in the same organization. The creators explained the meanings of the symbols they used, illustrating the problems faced by the organization. This spawned a discussion about how to address those problems, which occupied the rest of our time.

This discussion really got me thinking, and drawing parallels with my own experience. In particular, I identified with some of the fears and other roadblocks which prevented the people from taking action. One memorable point from Jerry was that you will never convince someone with logic to change something they did not arrive at through logic. I make that mistake a lot.

I also gained some insight on how companies can be successful despite poor management, at least for a time. For example, circumstances may favor the company, such as having little or no competition. Advantages like that don’t last, though, and when things change, the management gap can cripple the company.

I can’t describe here much of what I learned in this session, but I found it extremely valuable.

Written by Matt Zimmerman

November 11, 2009 at 06:08

Amplify Your Effectiveness (AYE) Conference: Day 1

Morning Session: Project Patterns

I chose to attend a session entitled Is this the Way We Really Want to do Things? Seeing Project Patterns and Changing What You Don’t Like (Johanna Rothman). My goal was to explore the causes of the troublesome patterns I see in projects at work. In particular, I see:

  • Too many projects starting up at once
  • Projects being instantiated without enough consideration for the probability of success (“is this a good idea?” rather than “can we realistically achieve this?”)
  • Key people finding out too late about projects which affect them

All of these patterns lead to increased project risk, communication bottlenecks, low motivation, and high stress.

In the session, we conducted a simulation of a team, with engineers, a project manager, a senior manager and a customer. I took on the role of the senior manager.

In the course of the simulation, we received requirements from the customer, implemented them, and delivered products. While the team was working on implementation, I talked with the customer about what was coming next: what would happen when we delivered, what the next project would be, and so on. Part of the simulation was that I had to be separated from the group while they were working.

When we delivered the first batch of products, and the customer was happy with them, it was time to decide what to do next. We gave the customer a choice of two projects we had discussed, one of which was similar to the previous one (but larger scale and more involved), while the other was different. Despite repeated attempts, we could not persuade the customer to prioritize one over the other.

So, I decided that we should change gears and start work on the “different” project. It seemed to be of greater economic value to the customer, and simpler to execute. One of the engineers disagreed with this decision, but didn’t explain why. The project manager seemed to agree, and I left the team to work. They produced a prototype, which the customer liked, and with a few small changes it was accepted as a finished product.

To my surprise, though, I found out later that the team was in fact working on both projects at once, delivering two different types of products. The decision hadn’t actually been made. These unexpected products were delivered to the customer, but didn’t meet the expanded requirements, and that work was wasted.

The debrief which followed was unfortunately too short, and I didn’t feel that we were able to fully explore what the simulation had revealed. The project manager indicated that he hadn’t understood the decision to have been made, pointing to a communication failure.

This reminded me that while we often think of a decision as an event which happens at a point in time, it is more commonly a process of change, which takes time and must be revisited in order to check progress and evaluate. A decision is really just an idea for a change, and there is more work to be done in order to implement that idea. This can be true even when there is a very explicit “decision point”: it still takes time for that message to be received, interpreted and accepted.

One of the tangents we followed during the debrief had to do with how humans think about numbers. Jerry asked each member of the group to write down a random number, and then we wrote them all on a flipchart. They were: 8, 75, 47, 72, 45, 44, 32, 6, 13 and 47. This reminded me of the analyses of election results which indicate fraud: numbers invented by people turn out to be detectably non-random.
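Those analyses work because human-invented numbers over-use some digits and avoid others. As a toy illustration (ten samples are far too few for a meaningful test), here is a chi-squared check of the last digits of the group's numbers against a uniform expectation:

```python
from collections import Counter

# The ten "random" numbers the group wrote down in the session.
numbers = [8, 75, 47, 72, 45, 44, 32, 6, 13, 47]

# Election-fraud analyses often examine last-digit distributions:
# genuinely random counts have roughly uniform last digits.
last_digits = [n % 10 for n in numbers]
counts = Counter(last_digits)

# Pearson chi-squared statistic against a uniform expectation of one
# occurrence per digit; this only illustrates the mechanics.
expected = len(numbers) / 10
chi_sq = sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

print(sorted(counts.items()))  # digits 2, 5 and 7 appear twice; 0, 1 and 9 never
print(chi_sq)  # 6.0
```

With enough samples, a statistic this far above the expected value would be grounds for suspicion; with ten, it mainly shows how the test is constructed.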

Afternoon Session: Saying No

After lunch, I decided to attend Jerry Weinberg’s session, Saying No That Really Means No. This was much larger than the morning session, with over 40 people sitting in a large circle.

The subject of discussion was the variety of difficulties that people face in saying “no” to things which don’t seem right for them. For example, saying “yes” to a project which is doomed to failure. This seemed like a good follow-on to the morning’s exercise.

Jerry began by asking the audience to name some of their difficulties, and tell stories of times when they had trouble saying “no”. One of these stories was role-played and analyzed as an example. Most of the time, though, was filled with storytelling and discussion.

This is a deeply complex topic, because this problem is rooted in self-image, social norms, egocentrism, misperception, and other cognitive phenomena. There was no key insight for me, just a reinforcement of the necessity of self-awareness. The only way to avoid patterns like this is to notice when they are happening, and that can be challenging, especially in a stressful situation.

Once you realize what’s happening, there are all sorts of tools which can be applied to the problem: negotiation, requests for help, problem-solving, even simple inaction.

Written by Matt Zimmerman

November 10, 2009 at 04:34

Amplify Your Effectiveness (AYE) conference: Day 0

The AYE conference is an annual conference “designed to increase your effectiveness—in leadership, coaching, managing, influencing, and working in teams, whether you work in systems development, testing, product development, quality assurance, customer service, or consulting.” It starts tomorrow, and I’ve arrived today eager to meet the other attendees and see what the conference is like.

My work involves all of those things, but the main reason I decided to attend was that I learned a lot at the Problem Solving Leadership workshop, which I attended in January and wrote about on this blog. When I heard about this event, organized in part by the same people, and realized I would be in the US that week anyway, I seized the opportunity to attend.

The 2009 program is overflowing with sessions on topics I’m interested in, some of which I have written about here previously.

The speakers are all people whose insight I’ve appreciated through their writing, speaking and teaching, so I expect I’ll have a tough time deciding where to spend my time these next few days. The sessions are said to be experiential ones rather than lectures, so I don’t think that I’ll be able to hop between them, but I’ll see tomorrow when the main program starts.

Written by Matt Zimmerman

November 9, 2009 at 01:11

Gran Canaria Desktop Summit 2009

Technical conferences are a great way for me to get myself thinking about technology in different ways. I spend my working days immersed in it, but primarily from the “inside”, thinking about what we’re doing in Ubuntu and what’s happening in free software today.

At conferences, on the other hand, I get the chance to think freely about what’s possible, and what might be coming. I find this very difficult to do at work, but a free software conference, with plenty of lively chats with colleagues and other participants, always seems to do the trick. I return home with a notebook full of ideas, including plenty of topics which aren’t directly relevant to the conference topics.

Here’s some of what I took home from GCDS 2009. The conference is still going on as I write this, but I’ve returned home after spending four days there.

The keynotes

  • r0ml is one of my favorite speakers. His sense of humor, timing, gesture, and general exuberance really bring his material to life. This particular talk got me thinking about free software communities in terms of Aristotelian epistemology. I think my relatively sparse classical education makes me a sucker for this kind of thing.
  • Walter Bender spoke about Sugar. Despite having had an XO laptop sitting on my desk for a period of months, this talk taught me most of what I know about the principles and possibilities of the Sugar platform. It really is pretty cool. My only worry is that it’s so ambitious, it might be eclipsed by less powerful, but more pragmatic systems before it reaches its potential.
  • Then, of course, there was Richard Stallman. This keynote was the least interesting of the talks I attended at GCDS, yet I feel compelled to write the most about it:
    • Much of the talk was devoted to his standard general-audience material, for example, explaining the four freedoms (to an audience of free software enthusiast developers)
    • He also took the time to explain the history of KDE and GNOME (to an audience of KDE and GNOME developers)
    • He then warned of the dangers of Mono, alienating the application developers in the crowd who happened to prefer it.
    • He did his Saint IGNUcius routine, throwing in a sexist joke for good measure.

The talks

The summit was organized into two parts: a cross-desktop segment, and a GUADEC/aKademy segment. This gave GNOME and KDE developers the chance to attend some common talks, but also have talks separately. I’m not sure how well this worked out for everyone. Superficially, it seemed like we would have benefited from more cross-pollination (GNOME folks attending KDE-oriented talks and vice versa) in addition to the cross-desktop content (which seemed to focus on common code).

  • In a cross-desktop audio talk, Lennart Poettering gave an overview of the audio stack and some of the problems that each of its layers have. I felt like this overview was a useful starting point, but there wasn’t much opportunity to discuss solutions and how we could work together, so I’m not sure what everyone took away from it.
  • The GeoClue talk got me thinking, particularly about location awareness in more sophisticated desktop applications (i.e. not just mobile phone applications):
    • We could use GeoClue in the Ubuntu installer to more intelligently guess the user’s language, time zone and keyboard layout based on their location, or at least narrow the range of options a bit. Even country-level information would be accurate enough to be useful.
    • Because some devices have more accurate location data than others, it would be interesting to share this data between them. For example, your mobile phone could tell your laptop where it is.
    • With so much location sharing going on, I started to wonder about the apparent implicit assumption in many of these systems that the user and the device are in the same location. Programs like Google’s Latitude present this assumption very strongly, with a picture of the person placed on a map wherever their location-aware device is. This will start to get confusing as people accumulate more location-aware devices.
    • Telepathy apparently supports passing location data, which opens up all sorts of interesting possibilities. It should be possible to write very straightforward applications which do disproportionately cool things using this combination of technology.
    • It should be possible to automagically select a download mirror for Ubuntu installation media or packages using GeoClue, without writing a lot of code.
  • A talk on videoconferencing centered on Farsight 2, which seems to be coming along nicely. It provides, through the familiar and flexible GStreamer API, access to a host of audio and video conferencing systems. Take your VNC screen sharing session, connect it up to a GStreamer pipeline, and have it come out the other end as an MSN webcam stream with accompanying audio. NAT traversal and all the other gory details are taken care of for you. Very cool.
  • The GNOME Shell represents a small revolution in the design of the GNOME user interface. The talk demonstrated a number of its features and general principles. I found myself wanting to understand more about why those particular features and principles were chosen, but I guess that didn’t fit into the talk slot.
  • The Zeitgeist talk was inspirational. It gave me all sorts of ideas about how I would organize my system differently if I had more information about how I was using it. By collecting data on what the user is doing and when, Zeitgeist opens up the possibility of more adaptive user interface techniques, as well as simple and useful tools like a journal of activity.
  • Rob Bradford presented Mojito, a gateway for desktop applications to access web services data from social networking sites. It takes care of managing authentication tokens, and presenting the data from different services in similar and predictable ways. I’m definitely into the idea of socializing the desktop, though I’m not yet clear on where Mojito fits in relative to Gwibber, which provides a very functional API for several of the same web services.
  • Rick Spencer, my colleague and the leader of the Ubuntu desktop team, gave a talk about the different varieties of programming, focusing on the “opportunistic” type. In this state of mind, a programmer isn’t thinking systematically, or developing for long-term goals, but just experimenting with something new or trying to solve an immediate and short-term problem. They have very different needs and expectations than someone who is approaching programming more systematically. This nicely encapsulates the subtle difference in behavior and perspective between someone who is developing the system they’re using (e.g. GNOME or Ubuntu developers) and someone who is developing on that system. The latter type of developer doesn’t care how the system itself works, and just wants to solve their problem. I don’t think they are very well served by free software yet, but Rick has created a tool called Quickly which aims to help opportunistic programmers get what they need…quickly.
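Of the GeoClue ideas above, the mirror-selection one is simple enough to sketch. Ubuntu country mirrors follow a cc.archive.ubuntu.com naming convention, so once GeoClue (or anything else) yields an ISO country code, choosing a mirror is mostly string assembly. This is my own illustration, not code from any of the talks, and the availability check is stubbed out:

```python
def mirror_for_country(country_code, known_mirrors=None):
    """Pick an Ubuntu archive mirror from a two-letter ISO country code.

    Falls back to the main archive when no country mirror is known.
    The known_mirrors set stands in for a real availability check
    (e.g. a DNS lookup or a query against the published mirror list).
    """
    if known_mirrors is None:
        # Hypothetical sample; the real list is much longer.
        known_mirrors = {"de", "fr", "gb", "jp", "us"}
    cc = (country_code or "").lower()
    if cc in known_mirrors:
        return "http://%s.archive.ubuntu.com/ubuntu" % cc
    return "http://archive.ubuntu.com/ubuntu"

print(mirror_for_country("DE"))  # http://de.archive.ubuntu.com/ubuntu
print(mirror_for_country("xx"))  # http://archive.ubuntu.com/ubuntu
```

Even this naive version would be an improvement over asking every user to pick a mirror by hand.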

The people

I was able to match a number of new faces to names, and make some new connections as well. The conference format offered ample opportunities for this, which I appreciated. Even if someone lives in my home town, I think I’m more likely to run into them at a conference on another continent, because the global community is much more cohesive.

There was a lot of Twitter and identi.ca activity during the event, which helped me to make some new connections there, as well as to involve people who couldn’t attend the event.

It was great to be able to spend some time with colleagues from Canonical in small groups and under more relaxed circumstances. Too often, I only see them all at once, and when we all have too much to do. This alone would have made it worth going.

Written by Matt Zimmerman

July 9, 2009 at 15:04

Posted in Uncategorized


UDS Karmic remote participants: we want your feedback

We are a large community, but only a small number of people can travel to attend UDS in person.  So, over the years, we’ve experimented with different approaches to enable remote participation in UDS.  If you participated in UDS remotely (for example, using the audio feed, IRC, Gobby, etc.), please tell us about your experience by filling out this survey:


If you registered to attend in person in Barcelona, you’ll be receiving an email with a (different) link to the (same) survey.  Please use that one instead, so that we can easily sort feedback from local and remote participants.

Written by Matt Zimmerman

May 29, 2009 at 03:00

Posted in Uncategorized


Plumbers Conference retrospective

The Linux Plumbers Conference has ended, and on the whole it was a productive forum despite its rocky start.

One of the reasons for this was a strong presence from the kernel community, carried over from Kernel Summit.  Since the purpose of Plumbers was to explore problems which span subsystems, having these folks in the room was a key factor.  I’m told that it’s unlikely that the two conferences will be colocated next year, but I hope that Plumbers will succeed in drawing participation from kernel developers anyway.

There was a strong sense of cooperation among the different distributions, companies and projects which were represented, though less so between the kernel developers and userspace developers.  These two groups would benefit from a better understanding of one another’s problems, and I hope that can be achieved through cross-participation in working events like Plumbers.

It’s common to picture the ecosystem as a stack or a sphere with some components at the bottom/center and others at the periphery, but these simplistic metaphors belie the complex and non-linear interdependencies which exist between projects.  The kernel, the toolchain, the “plumbing”, applications, distributions, companies, and so on, don’t form a neat diagram, and each performs an essential function in making the overall ecosystem work.

I had a chance to talk briefly with Greg KH about his concerns and the way they were expressed, and have hope that some goodwill can be fostered there.  I introduced him to Pete, who manages our kernel team, as a point of contact for a more nuanced dialog about our working relationship with the kernel community.

The discussions about the boot process were particularly interesting, as a great example of a problem which needs broad cooperation in order to solve effectively.  For example, as a result of comparing the (quite different) bootcharts between Fedora and Ubuntu, developers from both distributions identified areas where significant gains were clearly possible without deep structural changes.  Scott has isolated a long-standing issue which made our module loading sequence in Ubuntu much slower than it could be.

In between talks, I did some work on integrating apport with kerneloops.  The result is that kernel oopses can be captured as Apport problem reports with full detail, and semi-automatically filed as bugs, in addition to being counted on kerneloops.org’s statistics.  I’ve put an initial version into Ubuntu and sent the patch to Arjan for merging upstream, and we’re exploring the addition of kerneloops to our default installation to provide testing feedback to kernel developers from our users.
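For the curious, Apport extensions of this sort are plain Python hook files exposing an add_info(report) function, where the report behaves like a dictionary of string fields. The sketch below is only my illustration of the general shape, not the actual integration: the file path and field contents are hypothetical, and the real code receives the oops text from kerneloops itself.

```python
def attach_oops(report, oops_text):
    """Fill in Apport report fields from captured oops text."""
    report["ProblemType"] = "KernelOops"
    report["OopsText"] = oops_text
    report["Tags"] = "kernel-oops"
    return report

def add_info(report):
    """Apport hook entry point (called with a dict-like report)."""
    try:
        # Hypothetical location for a captured oops; the real
        # integration is wired to kerneloops directly.
        with open("/var/log/kerneloops.txt") as f:
            attach_oops(report, f.read())
    except IOError:
        pass  # no oops captured; leave the report unchanged
```

The resulting report then flows through Apport's usual machinery, which is what makes the semi-automatic bug filing possible.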

Written by Matt Zimmerman

September 20, 2008 at 18:19

Posted in Uncategorized
