Posts Tagged ‘Debian’
Ubuntu 10.10 (Maverick) Developer Summit
I spent last week at the Ubuntu Developer Summit in Belgium, where we kicked off the 10.10 development cycle.
Due to our time-boxed release cycle, not everything discussed here will necessarily appear in Ubuntu 10.10, but this should provide a reasonable overview of the direction we’re taking.
Presentations
While most of our time at UDS is spent in small group meetings to discuss specific topics, there are also a handful of presentations to convey key information and stimulate creative thinking.
A few of the more memorable ones for me were:
- Mark Shuttleworth talked about the desktop, in particular the introduction of the new Unity shell for Ubuntu Netbook Edition
- Fanny Chevalier presented Diffamation, a tool for visualizing and navigating the history of a document in a very flexible and intuitive way
- Rick Spencer talked about the development process for 10.10 and some key changes in it, including a greater focus on meeting deadlines for freezes (and making fewer exceptions)
- Stefano Zacchiroli, the current Debian project leader, gave an overview of how Ubuntu and Debian developers are working together today, and how this could be improved. He has posted a summary on the debian-project mailing list.
The talks were all recorded, though they may not all be online yet.
Foundations
The Foundations team provides essential infrastructure, tools, packages and processes which are central to the development of all Ubuntu products. They make it possible for the desktop and server teams to focus on their areas of expertise, building on a common base system and development procedures.
Highlights from their track:
- Early on in the week, they conducted a retrospective to discuss how things went during the 10.04 cycle and how we can improve in the future
- One of their major projects has been about revision control for all of Ubuntu’s source code, and they talked last week about what’s next
- We’re aiming to provide btrfs as an install-time option in 10.10
- In order to keep Ubuntu moving forward, the foundations team is always on the lookout for stale bits which we don’t need to keep around anymore. At UDS, they discussed culling unattended packages, retiring the IA64 and SPARC ports and other spring cleaning
- There was a lot of discussion about Upstart, including its further development, implications for servers, desktops and the kernel, and the migration of server init scripts to upstart jobs
- After maintaining two separate x86 boot loaders for years, it looks like we may be ready to replace isolinux with GRUB2 on Ubuntu CDs
Desktop
The desktop team manages both Desktop Edition and Netbook Edition, on a mission to provide a top-notch experience to users across a range of client computing devices.
Highlights from their track:
- A key theme for 10.10 is to help developers to create applications for Ubuntu, by providing a native development environment, improving Quickly, improving desktopcouch, making it easier to get started with desktopcouch, and enabling developers to deliver new applications to Ubuntu users continuously
- With more and more touchscreen devices appearing, Ubuntu will grow some new features to support touch-oriented applications
- The web browser is a staple application for Ubuntu, and as such we are always striving for the best experience for our users. The team is looking ahead to Chromium, using apport to improve browser bug reports, and providing a web-oriented document capability via Zoho
- Building on work done in 10.04, we will aim to make simple things simple for basic photo editing
- Security-conscious users may rest easier knowing that the X window system will run without root privileges where kernel modesetting is supported
Server/Cloud
The server team is charging ahead with making Ubuntu the premier server OS for cloud computing environments.
Highlights from their track:
- Providing more powerful tools for managing Ubuntu in EC2 and Ubuntu Enterprise Cloud infrastructure, including boot-time configuration, image and instance management, and kernel upgrades
- Improving Ubuntu Enterprise Cloud by adding new Eucalyptus features (such as LXC container support, monitoring, rapid provisioning, and load balancing). If you ever wanted to run a UEC demo from a USB stick, that’s possible too.
- Providing packaged solutions for cloud building blocks such as Hadoop and Pig, Drupal, Ehcache, Spring, various NoSQL databases, web frameworks, and more
- Providing turn-key solutions for free software applications like Alfresco and Kolab
- Making Puppet easier to deploy, easier to configure, and easier to scale in the cloud
ARM
Kiko Reis gave a talk introducing ARM and the corresponding opportunity for Ubuntu. The ARM team ran a full track during the week on all aspects of their work, from the technical details of the kernel and toolchain, to the assembly of a complete port of Netbook Edition 10.10 for several ARM platforms.
Kernel
The kernel team provided essential input and support for the above efforts, and also held their own track where they selected 2.6.35 as their target version, agreed on a variety of changes to the Ubuntu kernel configuration, and created a plan for providing backports of newer kernels to LTS releases to support deployment on newer hardware.
Security
Like the kernel team, the security team provided valuable input into the technical plans being developed by other teams, and also organized a security track to tackle some key security topics such as clarifying the duration of maintenance for various collections of packages, and the ongoing development of AppArmor and Ubuntu’s AppArmor profiles.
QA
The QA team focuses on testing, test automation and bug management throughout the project. While quality is everyone’s responsibility, the QA team helps to coordinate these activities across different teams, establish common standards, and maintain shared infrastructure and tools.
Highlights from their track include:
- There was a strong sense of broadening and deepening our testing efforts, mobilizing testers for specific testing projects, streamlining the ISO testing process by engaging Ubuntu derivatives and fine-tuning ISO test cases, and reactivating the community-based laptop testing program
- In support of this effort, there will be projects to improve test infrastructure, including enabling tests to target specific hardware and tracking test results in Launchpad
- There is a continuous effort to improve high-volume processing of bug reports, and two focus areas for this cycle will be tracking regressions (as these are among the most painful bugs for users) and improving our response to kernel bugs (as the kernel faces some special challenges in bug management)
Design
The design team organized a track at UDS for the first time this cycle, and team manager Ivanka Majic gave a presentation to help explain its purpose and scope.
Toward the end of the week, I joined in a round table discussion about some of the challenges faced by the team in engaging with the Ubuntu community and building support for their work. This is a landmark effort in bringing professional design together with a free software community, and there is still much to learn about how to do this well.
Community
The community track discussed the usual line-up of events, outreach and advocacy programs, organizational tools, and governance housekeeping for the 10.10 cycle, as well as goals for improving the translation of Ubuntu and related resources into many languages.
One notable project is an initiative to aggressively review patches submitted to the bug tracker, to show our appreciation for these contributions by processing them more quickly and completely.
Lucid ruminations
A few months ago, I wrote about changes in our development process for Ubuntu 10.04 LTS in order to meet our goals for this long-term release. So, how has it turned out?
Well, the development teams are still very busy preparing for the upcoming release, so there hasn’t been too much time for retrospection yet. Here are some of my initial thoughts, though.
- Merge from Debian testing – Martin Pitt has started a discussion on ubuntu-devel about how this went. For my part, I found that Lucid included fewer surprises than Karmic.
- Add fewer features – This is difficult to evaluate objectively, but my gut feeling is that we kept this largely under control. As usual, a few surprise desktop features were implemented that not everyone is happy about, myself included.
- Avoid major infrastructure changes – I think we did reasonably well here, though Plymouth is a notable exception. It resulted (unsurprisingly) in some nasty bugs which we’ve had to spend time dealing with.
- Extend beta testing – This will be difficult to assess, though if 10.04 beta was at least as good as 9.10 or 9.04 beta, then it will have arguably been a success.
- Freeze with Debian – Although early indications were good, this didn’t work out so well, as Debian’s freeze was delayed.
- Visualize progress – The feature status page provided a lot of visual progress information, and the system behind it allowed us to keep track of work slippage throughout the cycle, both of which seemed like a firm step in the right direction. I’m looking forward to hearing from development teams how this information helped them (or not).
A more complete set of retrospectives on Lucid should give us some good ideas for how to improve further in Maverick and beyond.
Interviewed by Ubuntu Turkey
As part of my recent Istanbul visit, I was interviewed by Ubuntu Turkey. They’ve now published the interview in Turkish. With their permission, the original English interview has been published on Ubuntu User by Amber Graner.
Ubuntu Inside Out – Free Software and Linux Days 2010 in Istanbul
In early April, I visited Istanbul to give a keynote at the Free Software and Linux Days event. This was an interesting challenge, because this was my first visit to Turkey, and my first experience presenting with simultaneous translation.
In my new talk, Ubuntu Inside Out, I spoke about:
- What Ubuntu is about, and where it came from
- Some of the challenges we face as a growing project with a large community
- Some ways in which we’re addressing those challenges
- How to get involved in Ubuntu and help
- What’s coming next in Ubuntu
The organizers have made a video available if you’d like to watch it (WordPress.com won’t let me embed it here).
Afterward, Calyx and I wandered around Istanbul, with the help of our student guide, Oğuzhan. We don’t speak any Turkish, apart from a few vocabulary words I learned on the way to Turkey, so we were glad to have his help as we visited restaurants, cafes and shops, and wandered through various neighborhoods. We enjoyed a variety of delicious food, and the unexpected company of many friendly stray cats.
It was only a brief visit, but I was grateful for the opportunity to meet the local free software community and to see some of the city.
Ada Lovelace Day 2010
Today is Ada Lovelace Day, “an international day of blogging to celebrate the achievements of women in technology and science”. By participating in this event, and blogging about women in these fields, my hopes are:
- to credit individual women for their achievements, which are often undervalued (both by others and by the women themselves)
- to raise awareness of the presence of women in technical fields, who are often presumed to be few or nonexistent, or only included by association with a man
- to remind men in technical communities that women are unquestionably capable of high achievement, and deserving of professional respect in their fields of excellence
For Ada Lovelace Day 2010, I would like to honor two women in the free software community who have made a strong impression on me in the past year: Akkana Peck and Miriam Ruiz.
Akkana Peck
I became aware of Akkana’s work through Planet Ubuntu Women, which is something of a year-round Ada Lovelace Day as it aggregates the blogs of many of the women of the Ubuntu community. As a senior engineer at Netscape, Akkana was instrumental in the development of the Linux port of Mozilla. Having developed a number of extensions to the GIMP, she went on to write the book on using it as well.
Her website, blog and Twitter feed include top-notch content on both free software and astronomy.
Miriam Ruiz
I met Miriam through the Debian Women project. Her profile there seems to be out of date, as it says she is not an official Debian developer yet, but she was officially recognized over two years ago as a trusted member of the team, through Debian’s notoriously thorough New Maintainer process. Miriam has been particularly interested in game development, and as the founder of the Debian Games Team has been responsible for uncountable hours of pleasant distraction for Debian users and developers. Thanks to her leadership, the Debian Games team has been exemplary in terms of cross-participation between Debian and Ubuntu.
Miriam blogs and tweets on free software topics in English and Spanish.
linux.conf.au 2010: Day 2 (afternoon)
James Westby: Ubuntu Distributed Development
James gave a great overview of the Ubuntu distributed development project, which has the ambitious goal of providing a homogeneous view of all of the source code for Ubuntu packages using Launchpad and Bazaar. This includes modeling the relationships between the versions of the code in Debian and further upstream, which is complicated by the use of different revision control systems, patch systems, and so on.
At this stage, most of the source code for Ubuntu and Debian is available through the system, and developers can freely branch from it and request merges. It works the same way for all packages, so developers only need to learn one toolset and workflow. We hope that this will lower the barrier to entry for contributing to Ubuntu, as well as make it easier to share patches between Ubuntu, Debian and upstream.
Timo Hoenig: Extending the scope of mobile devices
Timo reviewed how mobile devices have evolved over the past 40 years, citing dramatic improvements in compute power, memory, bandwidth and so on, but comparatively small improvement in battery life (several orders of magnitude less). Thus, he sees power management and related technologies as important to the further advancement of the category. He specifically identified network links as a key consideration, as they consume a great deal of power, and have continued to do so with newer generations of technology. Local power management, he says, is not sufficient, and we need to take a network-aware view.
He introduced the concept of an “early bird” connector, which acts as a supervisor for a mobile device. It communicates with remote network nodes on its behalf, and takes decisions about when and whether to wake up the mobile device. He estimated a 12% power savings by offloading processing to such a device, using a simple model of power consumption. The early bird would run on another system on the network, presumably without the same power constraints (like a proxy server).
Sam Vilain: Linux Containers
Sam detailed the LXC implementation of containers for Linux. In contrast with vserver, it seems to offer a much simpler interface. Because of this, it has been comparatively straightforward to merge into the Linux kernel mainline.
LXC uses existing Linux kernel facilities to group processes within containers into control groups, which can then be used to control access and scheduling of resources (network, CPU, storage, etc.). Each resource type has a namespace similar in principle to what chroot() provides for filesystems. Since all of the hardware is visible to a single kernel, there can be a great deal of flexibility in how resources are allocated. For example, a given network device and CPU can be dedicated to a container.
Usefully for system administration and diagnostics, all of these resources can be directly accessed from the host without stopping or shutting down guests.
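To make the control group mechanism a little more concrete, here is a minimal sketch of the low-level interface LXC builds on: creating a cgroup, giving it a CPU scheduling weight, and moving a process into it. This is my own illustration rather than anything from the talk; it assumes a v1 cgroup hierarchy mounted at /sys/fs/cgroup/cpu and root privileges, and the group name and weight are arbitrary examples. Real LXC combines this with the kernel namespaces described above.

```python
# Minimal sketch of the cgroup interface LXC builds on (cgroup v1, CPU controller).
# Assumes /sys/fs/cgroup/cpu is mounted and that we are running as root.
import os

CGROUP_ROOT = "/sys/fs/cgroup/cpu"  # assumed mount point for the CPU controller

def create_cgroup(name, cpu_shares=256):
    """Create a CPU cgroup and give it a relative scheduling weight
    (the default weight for a group is 1024)."""
    path = os.path.join(CGROUP_ROOT, name)
    if not os.path.isdir(path):
        os.mkdir(path)
    with open(os.path.join(path, "cpu.shares"), "w") as f:
        f.write(str(cpu_shares))
    return path

def add_process(cgroup_path, pid):
    """Move an existing process into the cgroup; the kernel then
    schedules it according to the group's weight."""
    with open(os.path.join(cgroup_path, "tasks"), "w") as f:
        f.write(str(pid))

if __name__ == "__main__":
    group = create_cgroup("demo-container", cpu_shares=256)
    add_process(group, os.getpid())  # confine this script itself
    print("now running in", group)
```

The same filesystem interface exists for the other controllers (memory, block I/O, and so on), which is part of what lets a container’s resources be tuned and inspected from the host without touching the guest.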
linux.conf.au 2010: Day 1 (afternoon)
After lunch, I had a three-way conflict, as I wanted to attend talks on SVG as an alternative to Flash (Libre Graphics Day), GnuCash, and domain-specific package management.
Sometimes, when I have a conflict like this, I try to attend the talk whose material is less familiar to me (in this case, probably the SVG/Flash one). However, since the talks are being recorded and made available on the Internet, this changes the dynamic a bit. I don’t have to miss out on watching anything, as I can download it later. So, it makes more sense for me to go where I can best participate, taking advantage of my presence at the conference.
Distro summit: Package management
So, I chose to attend the package management talk, as I might have something to contribute. It was about how to harmonize general distribution packaging mechanisms (dpkg, RPM, etc.) with special-purpose ones like those used by Ruby (gems), Lua (rocks), Perl (CPAN modules) and so on. The solution described employed a set of wrapper scripts to provide a standard API to these systems, so that they could be used by the distribution package manager to resolve dependencies.
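As a rough illustration of the idea, such a wrapper interface might look something like the sketch below. The class and method names are mine, not the speaker’s, and the RubyGems invocations are just one plausible backend.

```python
# A hypothetical sketch of a uniform wrapper interface over language-specific
# package systems (gems, rocks, CPAN modules, ...). Names are illustrative only.
import subprocess
from abc import ABC, abstractmethod

class PackageBackend(ABC):
    """The small set of operations a distribution package manager needs
    in order to resolve and satisfy dependencies through a foreign system."""

    @abstractmethod
    def is_installed(self, name: str) -> bool: ...

    @abstractmethod
    def install(self, name: str) -> None: ...

class GemBackend(PackageBackend):
    """Adapter for RubyGems; other backends (CPAN, LuaRocks) would follow
    the same shape behind the same interface."""

    def is_installed(self, name: str) -> bool:
        # `gem list -i NAME` prints "true" or "false"
        result = subprocess.run(
            ["gem", "list", "-i", name], capture_output=True, text=True
        )
        return result.stdout.strip() == "true"

    def install(self, name: str) -> None:
        subprocess.check_call(["gem", "install", name])

def ensure(backend: PackageBackend, name: str) -> None:
    """What the distribution package manager would call for a dependency
    it cannot satisfy from its own archive."""
    if not backend.is_installed(name):
        backend.install(name)
```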
Next up was Scott James Remnant’s talk on booting faster, but due to travel difficulties, he hadn’t arrived yet. Instead, we had a free-form discussion on various other challenges in the area of package management.
I took the opportunity to put forward an idea I had been thinking about for some time, which is that we may need to move beyond a “one size fits all” or “everything is a package” approach to package management. Modern package management systems are very capable, and solve a range of difficult problems with a high degree of reliability. The cost of all of this functionality is complexity: making packages is difficult. The system can be made to work for a wide range of software, but the simple case is often not very simple.
I also made the point that there are non-technical challenges as well: it is difficult for developers and ISVs to understand the ecosystem of distributions, and even more difficult to succeed in making their software available in “the right way” to Linux users. The obstacles range from procedural to cultural, and if we only approach this as a technical problem, we risk adding more complexity and making the situation worse.
The opportunity to have this kind of participatory discussion really validated my decision about how to choose which talk to attend.
Liz Henry: Code of our Own
Back in the Haecksen/LinuxChix miniconf, Liz Henry presented an action plan for increasing the involvement of women in open source, with many well-considered “dos” and “don’ts” based on observations of what has and has not worked for open source communities.
It was the first opportunity I’ve had to attend a free software conference session which went beyond the typical “yes, this is important” and “yes, there really is a problem here” content which is unfortunately as necessary as it is commonplace.
I won’t attempt to summarize it here, but I can definitely recommend Liz’ presentation to anyone who is looking for concrete, actionable methods to promote gender diversity in their technical communities.
Lucas Nussbaum: The Relationship between Debian and Ubuntu
Historically, in Lucas’ assessment, many Debian developers have been unhappy with Ubuntu, feeling that it had “stolen” from Debian, and was not “giving back”. He said that bad experiences with certain people associated with Canonical and Ubuntu reflected on the project as a whole.
However, he says, things have improved considerably, and today, most Debian developers see some good points in Ubuntu: it brings new users to free software and Debian technology, it provides a system which “just works” for their (less technical) friends and family, and brings new developers to the Debian community.
There are still some obstacles, though. Lucas says that many bugfix patches in Ubuntu are just workarounds, and so are not very helpful to Debian. He gave the example of a patch which disabled the test suite for a package because of a failure, rather than fixing the failure.
He felt that Canonical offered few “free gifts” to Debian, citing as the only example the website banner on ubuntu.com which was displayed for Debian’s 15th anniversary. I felt this was a bit unfair, as Canonical has done more than this over the years, including sponsoring DebConf every year since Canonical was founded.
It occurred to me that the distinctions between Canonical and Ubuntu are still not clear, even within the core free software community. For example, the “main” package repository for Ubuntu is often seen to be associated exclusively with Canonical, while “universe” is the opposite. In reality, Canonical works on whatever is most important to the company, and volunteers do the same. These interests often overlap, particularly in “main” (where the most essential and popular components are).
Lucas speculated that more mission-critical servers run Debian pre-releases (especially testing) than Ubuntu pre-releases. It would be interesting to test this, as it’s rare to get sufficient real-world testing for server software prior to an Ubuntu release.
Lucas presented a wishlist for Ubuntu:
- more technical discussions between Ubuntu and Debian (particularly on the ubuntu-devel and debian-devel mailing lists)
- easier access to Launchpad data
- Launchpad PPAs for Debian
The prominence of Launchpad in these discussions spawned a number of tangential discussions about Launchpad, which were eventually deferred to tomorrow’s Launchpad mini-conf. One audience member asked whether Debian would ever adopt Launchpad. The answer from Lucas and Martin Krafft was that it would definitely not adopt the Canonical instance, but that a separate, interoperating instance might eventually be palatable.
I made the point that there is no single Debian/Ubuntu “relationship”, but a large number of distinct relationships between individuals and smaller groups. Instead of focusing on large-scale questions like infrastructure, I think there would be more mileage in working to build relationships between people around their common work.
Excellent adventures in free software
After maintaining an ad hoc Linux distribution for myself for several years, I replaced it with Debian and have never looked back. One of the main reasons for this has been the mind-boggling quantity of applications and tools which are available from its repositories. Given a couple of keywords, or a good guess at the name of the application, APT fetches and installs the necessary packages in a matter of seconds. After years of compiling free software programs from source code, this profoundly changed the way I thought about finding and obtaining software.
Over 10 years later, the speed and convenience of this system still occasionally leaves me awestruck. As a typical example, on one occasion, I was using a pastebin to share the output of a program I was discussing with someone online by copying and pasting it. The output was fairly long, and it was inconvenient to copy and paste, so I wanted a tool which would read the output from a pipe and upload it directly to the pastebin, without a human in the middle.
Before I fired up an editor to write such a tool, I did a quick search to check if any such thing existed already, and found Stéphane Graber’s pastebinit, which did exactly what I wanted (and more). Not only had someone else had the idea first, they had implemented, released and packaged it over a year earlier. The end result, for me, was that within 30 seconds of discovering that I needed such a tool, I had it installed and working.
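For the curious, the core of such a tool is only a few lines. Here is a rough sketch of the idea (not pastebinit itself, whose supported services and options go well beyond this); the paste service URL and form field are placeholders for whatever service you would actually target:

```python
#!/usr/bin/env python3
# Rough sketch of a pipe-to-pastebin tool in the spirit of pastebinit.
# The endpoint and form field name are hypothetical placeholders.
import sys
import urllib.parse
import urllib.request

PASTE_URL = "https://paste.example.com/api/create"  # hypothetical service

def paste(text: str) -> str:
    data = urllib.parse.urlencode({"content": text}).encode("utf-8")
    with urllib.request.urlopen(PASTE_URL, data=data) as resp:
        # assume the service answers with the URL of the new paste
        return resp.read().decode("utf-8").strip()

if __name__ == "__main__":
    # Usage: some-command 2>&1 | ./paste.py
    print(paste(sys.stdin.read()))
```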
Experiences like the above still make me feel like I’m living a scene from the 1989 film Bill & Ted’s Excellent Adventure, where the protagonists discover that they have already traveled back in time to anticipate their own needs. They merely think about what they need, and there it is. The fact that I am still amazed by this probably makes me sound like a dinosaur to other free software enthusiasts, but this kind of instant gratification is something which is only just beginning to emerge in proprietary systems like the iPhone. The web was designed from the start to work this way, of course, but there is much I can do with free software that I can’t do with web applications (at least for now). The web also doesn’t give me that feeling of personal connection with the creator of the software, or (generally) the opportunity to tailor it for my needs.
Ubuntu 10.04 LTS: How we get there
The development of Ubuntu 10.04 has been underway for nearly two months now, and will produce our third long-term support (LTS) release in April. Rick Spencer, desktop engineering manager, summarized what’s ahead for the desktop team, and a similar update will be coming soon from Jos Boumans, our new engineering manager for the server team.
What I want to talk about, though, is not the individual projects we’re working on. I want to explain how the whole thing comes together, and what’s happening behind the scenes to make 10.04 LTS different from other Ubuntu releases.
Changing the focus
Robbie Williamson, engineering manager for the foundations team, has captured the big picture in the LTS release plan, the key elements of which are:
Merge from Debian testing
By merging from Debian testing, rather than the usual unstable, we aim to avoid regressions early in the release cycle which tend to block development work. So far, Lucid has been surprisingly usable in its first weeks, compared to previous Ubuntu releases.
Add fewer features
By starting fewer development projects, and opting for more testing projects over feature projects, we will free more time and energy for stabilization. This approach will help us to discover regressions earlier, and to fix them earlier as well. This doesn’t mean that Ubuntu 10.04 won’t have bugs (with hundreds of millions of lines of source code, there is no such thing as a bug-free system), but we believe it will help us to produce a system which is suitable for longer-term use by more risk-averse users.
Avoid major infrastructure changes
We will bring in less bleeding-edge code from upstream than usual, preferring to stay with more mature components. Where a major transition is brewing upstream, we will probably opt to defer it to the next Ubuntu cycle. While this might delay some new functionality slightly, we believe the additional stability is well worth it for an LTS release.
Extend beta testing
With less breakage early in the cycle, we plan to enter beta early. Traditionally, the beta period is when we receive the most user feedback, so we want to make the most of it. We’ll deliver a usable, beta-quality system substantially earlier than in 9.10, and our more adventurous users will be able to upgrade at that point with a reasonable expectation of stability.
Freeze with Debian
With Debian “squeeze” expected to freeze in March, Ubuntu and Debian will be stabilizing on similar timelines. This means that Debian and Ubuntu developers will be attacking the same bugs at the same time, creating more opportunities to join forces.
Staying on course
In addition, we’re rolling out some new tools and techniques to track our development work, which were pioneered by the desktop team in Ubuntu 9.10. We believe this will help us to stay on course, and make adjustments earlier when needed. Taking some pages from the Agile software development playbook, we’ll be planning in smaller increments and tracking our progress using burn-down charts. As always, we aim to make Ubuntu development as transparent as possible, so all of this information is posted publicly so that everyone can see how we’re doing.
Delivering for users
By making these changes, we aim to deliver for our users the right balance of stability and features that they expect from an Ubuntu LTS release. In particular, we want folks to feel confident deploying Ubuntu 10.04 in settings where it will be actively maintained for a period of years.
Debian is NOT switching to time-based releases
At DebConf 9 this week, the Debian release team proposed a new approach to Debian’s release cycle, which was then announced on the Debian web site. Both the DebConf presentation and the announcement were quite clear, but a number of news articles and blog posts on the subject seem to have misinterpreted them:
- Debian Adopts Two-Year Time-Based Release Cycle on OSNews
- Debian to adopt time-based releases on Linux Today
- Debian to Adopt Time-based Release Cycle on Jonathan Carter’s blog
- Debian to adopt time-based releases on a blog masquerading as a news site
Debian is not switching to time-based releases. I’m glad they aren’t, because I don’t think it would be the right choice for the project at this time. Time-based releases are the approach used by Ubuntu, where we plan to release on a specific date. Instead, they will use the same approach as in previous releases, where they set criteria for release-critical bugs, and release when all release-critical bugs are closed.
The difference is that they will schedule the freeze date in advance. This means that there is a bounded time period available for new development, where things sometimes need to be broken in order to make progress. Once the freeze point is reached, Debian developers will minimize breakage and focus on stabilization. Once the RC bug count drops to zero, they’ll release as usual. That could happen soon after the freeze, or it could take a long time, depending on how many bugs are introduced during development.
This hybrid approach seems like a good balance, giving the release team the ability to guide the project toward a more predictable release cadence, without sacrificing their uncompromising approach to quality. Having a predictable freeze date will help Debian and Ubuntu developers to work together this winter to fix many of the same bugs in Ubuntu 10.04 and Debian squeeze.