LibrePlanet 2016

March 19-20, 2016

Saturday, March 19

Keynote: The last lighthouse: free software in dark times

Daniel Kahn Gillmor interviewing Edward Snowden

Snowden: What happened in 2013 could not have happened without free software. I couldn't have used Windows, because you can't trust it. You just don't know. We have to rely on lots of software that we don't trust. Many people suspected what the NSA was doing, but didn't know for sure. Now we know. Now that we know what the NSA is doing, we can integrate that into our own threat model.

dkg: What are the tradeoffs between Apple locking their devices down and other design choices that might give users less control?

Snowden: There's a difference between security and control. Is a device trustable if you have no idea what it's doing? From free software, we know there are alternatives. FOSS strategies can be ported, so they're easier to use. This community has to grow. We can't compete with Google or Apple on resources, but we can compete on ideology, because ours is better.

With many devices - your phone, your washing machine, you have no idea what it's doing unless you hack into it yourself. Aside from software trust, we have to think about hardware trust.

dkg: How to bring people into the free software movement is a big question. I came in for technical reasons, but stayed for the freedom. What are some ways we can expand or sustain our community? Or, how can we get a free software fridge?

Snowden: Focus on community strategy. Figure out what's important to you, then figure out how to pass that on to the people who come after you. This isn't about privacy vs security; it's about power. The US spied on NGOs and UNICEF. No one was ever charged for this, because doing so would have revealed the programs. Do these people work for you, or do they work for someone else?

dkg: Cars have proprietary software and over-the-air updates. Our relationships are often mediated by software. How do we make sure our cars don't become surveillance devices? We have to do outreach to like-minded communities. Surveillance is a power dynamic. Freedom is autonomy. Can we have a regulatory requirement that all vehicles must run free software, and can we make that not a radical idea?

Snowden: We can't control telecom providers; we're very much at their mercy. The providers and the network paths are hostile and we have to treat them that way.

Encrypting everything is a good first step. Timing and mixing are good next steps. Could we use specialized chips, rather than single-purpose ones? What about tool chains, when attackers go after compilers or try to modify binaries, or track what developers are doing?

dkg: I hope we care about this. If you're a maintainer, how do you prevent an attack on you from becoming an attack on your users? We're making a lot of progress on getting to reproducible builds. The question of addressing legacy vs stability vs security (e.g., Android) is still a quandary.

Snowden: There's a real impact to supporting legacy software. Users have difficulty accepting the fact that legacy software is a security risk.

Question: Should one of our priorities be to develop a killer app for privacy or security? If so, what could it be?

Snowden: We're starting to create apps like Signal. Can we build a complete stack? Are there ways we can change our interactions? VR is a big technology that's starting to take off. Can VR be used for a social meeting space that makes it easier for people to interact over distances? Can we get there first and provide safe meeting spaces?

Question: What about communities trying to build community-controlled infrastructure? Snowden: That's a powerful idea. I'm not familiar with community infrastructure, but I think that's an awesome idea. Centralized infrastructure is what an adversary will target.

Question: Self-hosting is one response to surveillance. Does this provide a reasonable defense?

Snowden: This kind of infrastructure is important. I used to have a bunch of scheduled tasks that ran XKEYSCORE searches every day. There are limits to mass surveillance. The NSA targets telcos and routers. The more local your network is, the safer you are.

Question: What can a middle-schooler do?

Snowden: First, they care. If you care, then you'll learn. There are contests for attention and mind share. If you're interested, then you're on the right track. Learn and develop your capabilities.

Yes, the FCC might ban your operating system

Eric Schultz

The FCC would like wireless router manufacturers to prevent users from installing software that might affect the router's radio operation. The FCC specifically mentions dd-WRT as an example of such software.

TP-Link is now blocking the installation of FOSS firmware on their routers.

The FCC regulates radio spectrum. Radio spectrum is a finite resource, and we can't expand it. Different frequencies have different use cases. The radio spectrum can be broken down into three categories: (1) frequencies that anyone can use, (2) frequencies that no one can use, and (3) frequencies that can be used with a license. Use is regulated according to power, frequency, and modulation technique.

A spectrum can have primary and secondary uses. Secondary uses must always defer to primary ones.

The fines for inappropriate use (e.g., overmodulation) are very high. Unintentional violation is also illegal, and can result in punishment. When it comes to radios (including wireless networking equipment), the FCC regulates devices to prevent them from behaving badly. Devices are "regulated to best practices". Devices are regulated by use case; for example, amateur radio is governed by a different set of regulations than Bluetooth.

The FCC now considers "device" to be radio hardware, plus any software that can affect the operation of that radio hardware.

There are two ways for software to interface with radio hardware: Soft MAC and hard MAC. Soft MAC drivers use the kernel's implementation of regulatory schemes. Hard MAC drivers implement their own regulatory compliance; these drivers send commands directly to firmware and hardware. In reality, "device" includes hardware, software, firmware, and perhaps the kernel - depending on how regulatory compliance is implemented.

The FCC is currently floating two proposals: (1) U-NII rules (for unlicensed spectrum), and (2) an NPRM on modular transmitters. (NPRM means "notice of proposed rulemaking".)

U-NII currently applies to the 5 GHz wifi spectrum. The rule states that a device "must contain features to prevent unauthorized modification". In other words, this rule would prevent someone from installing unauthorized software. The FCC specifically names dd-WRT as unauthorized software.

There are two types of firmware: (1) the router operating system, and (2) firmware for the radio hardware.

802.11ac is the current standard for high-speed wireless; it works in the 5 GHz range. Doppler weather radar also uses this frequency range. To accommodate both uses of the 5 GHz spectrum, the FCC requires dynamic frequency selection, or DFS. If a router detects a Doppler radar signal, it must switch frequencies. The FCC's logic: lock down everything, so that interference can't happen.

Doppler radar interference hasn't been a huge problem. The FCC receives around 10 complaints per year, and most of them involve for-profit wireless carriers (e.g., AT&T). These companies deploy routers outdoors and disable DFS. There's never been a case of Doppler radar interference brought against an individual. AT&T flashed their routers with dd-WRT, which permits one to disable DFS through the UI. This is why the FCC specifically names dd-WRT.

The FCC could have notified the outdoor operators, and asked them to correct the problem. They didn't take that route.

The NPRM (notice of proposed rulemaking) is concerned with modular transmitters. These are transmitters that can be added to hardware, without FCC approval. The FCC wants these transmitters to be locked down. Originally, the FCC wanted all software-defined radios to be locked down. That didn't work out. So far, they've received over 4000 comments to this NPRM.

The FCC takes the position "we're not opposed to free software, as long as you lock down the radio". The problem is that you can't lock down the hardware without locking down the software too, meaning that you can't change anything on the device. Lockdowns would prevent small groups from conducting wireless research. This doesn't bother the FCC; the FCC doesn't consider individuals to be innovators or inventors.

If put into effect, the FCC rules would require different devices for different countries.

Inappropriate use of the radio spectrum by individuals is rare, but we should still campaign to discourage it. We should ask that all radio software be open software, so that it's auditable.

Hardware Reverse Engineering

Felipe Correa da Silva Sanches

MAME: the Multiple Arcade Machine Emulator. MAME works in conjunction with video game ROMs. MAME is a toolchain and framework for creating hardware emulators. The project originally focused on 1980s arcade games, like Pac-Man. Eventually, the project expanded to emulate other types of hardware.

Groups of people wanted to preserve these old arcade games. Some of the old games had "suicide boards". The board has a bit of battery-backed RAM, which holds a key to decrypt the board's ROM. When the battery dies, the decryption key dies with it, and the encrypted ROM becomes unreadable.

MAME merged with MESS, the Multiple Emulator Super System.

What functionality do you lose in Linux Libre, where all non-free firmware is removed from the Linux kernel? You wind up losing support for around 300 device drivers. If your hardware doesn't work, you can always buy different hardware. But it would be better if we could work with our own hardware.

One can reverse engineer source code from GPL'd binaries. A number of binary blobs are released under GPLv2, which gives us the ability to reverse engineer them. You can disassemble the blob, then reconstruct proper source code.

We got the assembly code for a Keyspan USB driver. We tried to compile our own driver from this assembly code, but it didn't quite match. There was a one-bit difference in the binary, which came from a toolchain bug.

Reverse engineering is time-consuming, and there's a scarcity of manpower to do it. Sometimes legal issues force you to use clean room techniques, where one group disassembles a binary and writes a specification for what it does; a different group implements the specification.

MAME emulates CPUs, auxiliary chips, drivers, and the connections between them. It's kind of like a virtual printed circuit board.

Some 1980s games had two CPUs: one for game play, and a second for audio/music playback.

How do you extract code from a ROM? You plug the chip into an extractor device, and then suck out the contents of the chip.

Bare metal software often contains information about how the hardware worked. If you're reverse engineering, it's useful to have several versions of the firmware, and to keep track of the status of each binary. This gives you more knowledge to leverage when creating free software alternatives.

Makerspaces can be great places to collaborate on reverse engineering projects.

Some CPUs have signature instructions; these allow you to identify the CPU that a given binary was targeted for. For example, ARM has conditional instruction execution. There's a distinct pattern where most instructions have the conditional bit unset, but a certain number have the bit set.
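The ARM heuristic above can be sketched in a few lines. This is my own illustration, not tooling from the talk: in 32-bit ARM code, bits 31-28 of every instruction hold a condition code, and the "always" condition (0xE) dominates, so a high fraction of words with a 0xE top nibble suggests ARM machine code. The function names and the 0.7 threshold are assumptions for the sketch.

```python
import struct

def arm_condition_ratio(blob: bytes, little_endian: bool = True) -> float:
    """Return the fraction of 32-bit words whose top nibble is 0xE."""
    fmt = "<I" if little_endian else ">I"
    n_words = len(blob) // 4
    if n_words == 0:
        return 0.0
    hits = 0
    for i in range(n_words):
        (word,) = struct.unpack_from(fmt, blob, i * 4)
        if (word >> 28) == 0xE:  # condition field == AL ("always")
            hits += 1
    return hits / n_words

def looks_like_arm(blob: bytes, threshold: float = 0.7) -> bool:
    # The threshold is a guess; a real tool would combine several signatures.
    return arm_condition_ratio(blob) >= threshold
```

A real identifier would look for several such patterns at once (alignment, branch-offset plausibility, literal pools), but the conditional-bit statistic alone already separates ARM code from random data surprisingly well.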

Sniffing messages on hardware busses is another strategy for figuring out how a system works. You can also intercept and log memory accesses.

Question: Are there resources where people can learn about the legal aspects of reverse engineering?

The EFF is a good source to learn from.

Question: Do you need a logic analyzer to observe these things?

Yes. Sometimes we have to use proprietary tools, like logic analyzers. But I'm okay with using a proprietary tool, if it enables me to make something free.

Free/Libre Alternatives to GAFAM's Internet

Marianne Corvellec and Jonathan Le Lous, April

"GAFAM" means "Google, Apple, Facebook, Amazon, Microsoft".

In the name of "fighting terrorism", France is proposing a number of policies that apply to software and the internet. We believe that these policies go in the wrong direction.

Google would like to sell information about you. See the WIRED article "Open Source Software Went Nuclear This Year".

Open source is everywhere, but here, we're going to talk about freedom and privacy.

Today, software is about control and surveillance. Software companies aren't interested in technology. They're interested in collecting and selling personal information. Framasoft is a French organization that was founded in 2004. It promotes free software, free culture, and free services.

GAFAM poses a threat to healthy competition, and goes against our notion of a free and decentralized internet. We're trying to raise awareness of how dependent we've become on a few major players.

Framasoft has a three-year plan to promote non-commercial services, promote self-hosting, and promote free software alternatives.

The centralized web is reminiscent of Big Brother. We could have citizen-driven initiatives instead. Organize locally and regionally. Let people install their own services, for their own groups and communities. This may not scale, but maybe it doesn't have to. If you're providing a service to a community, then you only need to scale enough to serve that community's needs. It's more important to build locally, and to tell others how you did it (or how others could adapt it).

There are three different "as a service" layers:

  • IaaS (infrastructure as a service) is tailored to network architects.
  • PaaS (platform as a service) is tailored to application developers.
  • SaaS (software as a service) is tailored to end users.

What about community clouds? We'd like to reclaim the cloud for ourselves, for better privacy. You can build a complete cloud system with free software. We want cloud computing options that are ethical and respect users' privacy. Tech foundations could help with technical expertise and support.

BYOCC = build your own community cloud.

Free Software Awards

Richard Stallman


  • Individual contributor: Werner Koch, for his work on GnuPG.
  • Library Freedom Project

Facebook doesn't have users, it has "useds".

It's not enough to limit how companies can use the data they collect. Our government can compel companies to hand over whatever they've got.

Cloudflare requires visitors from the Tor network to solve captchas. That shuts out people with vision disabilities. Cloudflare also requires visitors to run non-free javascript. I'm sure Cloudflare could provide a non-javascript captcha, but that still doesn't help people with vision impairment.

We published our criteria for freedom-respecting repositories. There are some repositories that force developers to distribute their software under specific non-free licenses.

There is no mobile device that fully works with free software. The device's wifi and camera require non-free drivers and firmware. The modem processor only runs non-free software; that's how phones can be turned into remote listening devices. Mobile phones are Stalin's dream. They enable the state to know where you are at all times, and to listen to you whenever they want.

Many schools force students to use non-free software: iBads and Chromebooks. These devices snoop on students.

There are schools that require students to have a Gmail account. Why can't the school use whatever email address the student gives?

We need to demand an end to digital oppression. Don't make a concession before you state your demands. Even if you don't get what you want, you still have the opportunity to state what the issues are.

I'd like to talk about the FBI vs Apple case. People are admiring Apple for defending people's privacy. But Apple designed their hardware to only run software that Apple has signed. You have no choice but to trust Apple. Apple users should hope that Apple doesn't lose its case with the FBI.

Free software is the necessary starting point for security. By making their machines into tyrants, people who use iThings will be screwed if Apple loses this case. The only way to have security is to prevent some company from taking it away from you. The US government's site for commenting on proposed regulations also requires users to run lots of non-free software. If you don't run their non-free javascript, then you can't submit opinions on proposed regulations.

Question: You addressed dystopia for mobile computers. It's hard to get replicant to work on a device. Is it worthwhile to get GNU/Linux running on mobile devices?

The problem is in the drivers, and all the "crapps" that people want to run on their mobiles. People have been led to want to run non-free software.

Question: What about reverse engineering?

It's generally okay to use a non-free program to provide a free alternative to that program. As a user of a non-free program, you're a victim, rather than a culprit. If a program has a network effect and requires others to use it, then you're a culprit.

Question: What should we do about the FCC's desire to ban free software on mobile devices?

This is a very bad thing. I understand what the FCC wants. Taking people's freedom away is not an acceptable solution.

Question: Can code signing be done in a way that respects users?

If users control the keys, then code signing is a security measure. If users can't control the keys, then it's a restriction. When you don't control the keys, code signing isn't a lock, it's a shackle.

The W3C is considering whether to incorporate DRM into WWW specifications. We have to fight DRM until it's dead. Don't use devices that shackle users. Sometimes defending freedom requires a sacrifice.

Seeing code isn't enough. If you don't have control over the code, then how can you be sure that the code you see is the code that wound up in the software? If the NSA snuck something into Microsoft's compiler, there are only a handful of people who could possibly find it.

Sunday, March 20th

Keynote: Free Software, Free Society

Allison Randall

When Stallman was modifying software in MIT's AI lab, there were no software copyrights. Congress added copyrights for software in 1980. Free Software was a direct reaction to software copyright.

In the early days - the 1940s or so - hardware was the asset, and software had zero value. In 1980 copyright for software was seen as a technical innovation. But it was really a barrier, and barriers aren't a way to improve progress. Free software is not about technical excellence, it's about ethics. Freedom is not the same thing as progress.

If you're not free to control the material environment you live in, then you're not free.

Today, free software is a significant part of our material environment. Free software is a necessary, but not a sufficient condition for free society. For example, you can't have a free society without free speech.

In the early days, software users were highly-skilled individuals. Today, users are everyone, and software is everywhere.

The more pervasive free software becomes, the more disadvantageous proprietary software becomes. A centered set is characterized by whether members are moving towards the center, or away from it. It's almost like a form of gravity.

We had the dot-com bust in the early 2000s. This left many companies unable to pay for proprietary software, and they turned to free software as a result.

Commoditization is a fact of life in any industry. Things that were once very unique become commonplace. What happens when all software is free software? Commoditization might make proprietary software irrelevant. This could mean the end of copyright or patents on software.

Many rebellions fail the moment they win. They were prepared to fight, but they weren't prepared for when the fight was over. We have to think about building a community to sustain our victories. You're empowered with the permission to make the world a better place. But you also need to be empowered with the capability to do that.

Question: Sometimes free software feels like a utopia. How do you reconcile free software with economic pressure?

A lot of pressure is toward free software. We know where we want to go, but we're not there yet. There are tensions, but we'll work through them.

Question: Sometimes free software is not the same thing as freedom. For example, Android was built on free software, but it doesn't give you freedom. How do you get the freedom?

We're not there yet. A lot of our understanding has come from copyright. We'll have to look at the other restrictions and figure out how to get freedom.

Question: You introduced the notion of bringing in community. Do you have a pitch to bring people in?

People understand human rights, and they can understand human rights violations. Trying to sell free software as a human right might be a way.

Beyond Reproducible Builds

Holger Levsen, Debian

There's a 31c3 talk on reproducible builds that describes the concepts.

There was a remote root exploit in sshd resulting from a single bit binary difference. It was essentially the difference between > and >=. There can be financial incentives to hack developer machines. The CIA has done research into hacking SDKs. Apple's SDKs were compromised.

The goal of reproducible builds: for a particular toolchain, the same source code should produce bit-for-bit identical binaries. If you perform the same build 5x, do you get the same binaries (and the same checksums) each time? You should, and this should be the norm. To us, "reproducible" means "bit for bit identical".

Timestamps are a common source of problems, as well as timezone and locale. There are numerous other small issues.

Build dates are not useful. If you'd like to identify when a build took place, use the timestamp of the last source-control checkin, in seconds since the epoch. Debian's test infrastructure builds everything twice and compares checksums. The test builds are done with varying locales, machine unames, file systems, clock times, and build user ids.
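The double-build check can be sketched as follows. This is a toy illustration of the idea, not the project's actual test code; the function names and the environment variations are mine. A "build" is modeled as a callable that produces artifact bytes from an environment, and reproducibility means every variation yields the same checksum.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def is_reproducible(build, variations) -> bool:
    """build: callable(env_dict) -> bytes of the produced artifact.
    variations: list of env dicts (locale, TZ, hostname, ...)."""
    digests = {sha256_of(build(env)) for env in variations}
    return len(digests) == 1  # bit-for-bit identical every time

# A toy "build" that leaks its timezone into the artifact is caught:
def leaky_build(env):
    return b"binary:" + env.get("TZ", "UTC").encode()

def clean_build(env):
    return b"binary:"
```

Varying locale, timezone, uname, and clock between the two builds is exactly what flushes out accidental environment leaks like the `leaky_build` case above.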

We've had to build tools that recursively unpack files and compare their contents. These tools make it easier to figure out where (and why) differences are occurring.

In Debian, 85% of source packages give reproducible builds. A public dashboard shows the reproducibility status of individual packages.

What should you do if you find the source of a non-reproducibility? A bug report would be good.

Some compilers embed paths in binaries. You have to use the same build paths to get identical results.

Embedded image resources can be a problem. For example, one project had a timestamp embedded in an image file. We've developed tools to strip out this kind of non-deterministic content.

To reproduce builds, you need an exact duplicate of the original build environment. In addition to providing a list of source files and checksums, you have to provide the list of files that make up the build environment, along with their checksums. We store this information in .buildinfo files. Eventually, we'd like .buildinfo files to be signed and distributed. We're still figuring out the best way to do this.

Tar files are another thing to be conscious of. A tar file contains timestamps and permissions. This information needs to be normalized. You can't rely on a user's umask - you have to set that explicitly. You also need to set file timestamps to the build epoch (the timestamp of the last checkin). tar --clamp-mtime can help with this.
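The tar normalization described above can be sketched with Python's tarfile module. This is my own illustration of the technique (the epoch value is an arbitrary example, and real projects typically read it from the SOURCE_DATE_EPOCH environment variable): every member gets a fixed owner, an explicit mode, a clamped mtime, and a fixed ordering.

```python
import io
import tarfile

BUILD_EPOCH = 1458432000  # example: timestamp of the last checkin

def normalize(member: tarfile.TarInfo) -> tarfile.TarInfo:
    member.uid = member.gid = 0
    member.uname = member.gname = "root"
    member.mode = 0o755 if member.isdir() else 0o644  # don't trust umask
    member.mtime = min(member.mtime, BUILD_EPOCH)     # clamp, like --clamp-mtime
    return member

def build_tar(paths) -> bytes:
    """Create a deterministic tar archive of the given paths."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tf:
        for path in sorted(paths):  # fixed ordering matters too
            tf.add(path, filter=normalize)
    return buf.getvalue()
```

With this, rebuilding the same tree under a different user, umask, or wall-clock time should produce byte-identical archives.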

Aside from Debian, Coreboot and OpenWrt are moving towards reproducible builds. NetBSD, FreeBSD, and ElectroBSD (FreeBSD with binary blobs removed) are 100% reproducible. Many other projects are working on this. Tor was the first software project to have a reproducible build (in 2013).

The rpm format is not reproducible; it includes the build timestamp and hostname. In Germany and France, the law requires software used in gambling machines to have reproducible builds.

In the future, we'd like third-party verification of reproducible builds. We need to figure out who will test whose builds.

Question: What about the mobile side of the world?

F-Droid is working on this?

Question: Is there a specific format for a .buildinfo file?

There's no unified format yet.

Take Control of your Communications with Ring

Adrien Beraud and Guillaume Roguez, Savoir-faire Linux

We've just released the first beta version of Ring. Ring is developed by Savoir-faire Linux, a company that develops free software. There are ring clients for Linux, Windows, MacOS, and Android. The iOS client is still in the works.

Ring supports video conferencing, chat, slide sharing, and video file sharing. It's completely peer-to-peer and there's no server software involved. Ring works with any SIP service. Ring is modular and scriptable. It can be used to communicate with IoT devices.

Ring has a peer-to-peer architecture. Paul Baran was a pioneer of packet-switched networking. He realized that if each node has three connections, then the (mesh) network can tolerate the loss of any node. The challenge lies in how to construct and maintain the mesh.

OpenDHT is a library that provides a distributed hash table. Ring uses the distributed hash table to find network peers. Every node has a unique identifier. To communicate, Ring starts by placing an initialization message in the distributed hash table. Nodes can ask to be notified when a specific DHT cell changes. This is how the recipient gets the initialization message. The initialization message names the parties that are interested in communicating; from there, the interested parties set up a peer-to-peer channel.
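The call-setup flow above can be sketched with a toy in-memory DHT. This is an illustration only: a dict stands in for OpenDHT, and the names (`put`, `listen`, `call_key`) are mine and do not match Ring's real API. The caller drops an initialization message in a cell derived from the callee's id; the callee, who is listening on that cell, gets notified.

```python
import hashlib
from collections import defaultdict

class ToyDHT:
    """A single-process stand-in for a distributed hash table."""
    def __init__(self):
        self.cells = defaultdict(list)
        self.listeners = defaultdict(list)

    def put(self, key, value):
        self.cells[key].append(value)
        for cb in self.listeners[key]:
            cb(value)  # notify anyone watching this cell

    def listen(self, key, callback):
        self.listeners[key].append(callback)

def call_key(node_id: str) -> str:
    # Initialization messages go in a cell derived from the callee's id.
    return hashlib.sha1(node_id.encode()).hexdigest()

def start_call(dht, caller_id, callee_id):
    # The message names both parties; ICE candidates would ride along here.
    dht.put(call_key(callee_id), {"from": caller_id, "to": callee_id})
```

In the real system the DHT is spread across untrusted nodes, so the payload would be encrypted to the recipient rather than stored in the clear as here.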

Because the DHT is shared among many nodes, it has to be treated as a hostile environment. The DHT keeps the application decentralized, but it does add communications latency.

ICE (Interactive Connectivity Establishment) is a method for exchanging connection information. An ICE message contains a party's endpoint information. The recipient sends responses back through the DHT. This allows the protocol to traverse NATs and firewalls.

The node id is actually the fingerprint of the node's public RSA key. Connections are set up with RSA and AES. Negotiations use SIP and DTLS with PFS. Ring supports certificate pinning and X509 certificate chains.
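Because the node id is a key fingerprint, pinning falls out naturally: a connection is acceptable only if the public key presented in the handshake hashes to the id you dialed. The sketch below is my own illustration (Ring's exact key encoding and hash choice are implementation details I'm not asserting).

```python
import hashlib
import hmac

def fingerprint(public_key_der: bytes) -> str:
    """Derive a node id as the hash of the peer's public key bytes."""
    return hashlib.sha1(public_key_der).hexdigest()

def key_matches_id(public_key_der: bytes, dialed_node_id: str) -> bool:
    # Constant-time compare, as one would for any credential check.
    return hmac.compare_digest(fingerprint(public_key_der), dialed_node_id)
```

The nice property is that there is no directory to trust: knowing someone's id is knowing which key to expect.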

Ring has the potential to be a universal communications solution. You can use it to communicate with a robot, or to look at pictures from a camera you've set up. libring is the core set of ring libraries. libringclient is the core set of client libraries.

The actual clients are built on top of libringclient.

In the future, we'd like to add support for UPnP, IPv6, better security, and UX improvements. We'd also like to have our code audited.

The source code is available from github, as ring-project.

Question: Suppose I have keys on my phone. Can I import these keys to my laptop?

Not yet, but that's something we're working on.

Question: What about spam?

You can accept (or reject) calls based on the caller's key signature.

Question: My grandmother will never be able to figure this out. Will you have a directory to help people find each other? Can you have multiple keychains?

This is also future work. We're collaborating with a university to develop a DHT indexing algorithm. We're also working on ways to link multiple keys to a single identity.

Question: Is the private key stored on my device?

Yes. We're also working on ways to link multiple devices to a single identity.

Question: How long have you been working on this?

The ring project started two years ago, but it's based on work we've been doing for the last ten years.

Question: Is there any intention to support non-patent-encumbered codecs (e.g., something besides H.264)?

Yes. Eventually we'll support any codec that ffmpeg can work with.

Question: What about privacy?

All calls are peer-to-peer. There's no central server involved.

Question: What about Tor?

Tor works fine for textual chatting, but it's usually got too much latency for audio and video communications.

Free Software for Sousveillance

M. C. McGrath

The presenter is behind the ICWatch project.

How do you learn about people working in government intelligence? You do web searches for the names of NSA programs, and scrape LinkedIn profiles where those terms appear.

There's a whole branch of intelligence that deals with data collection. Scraping keywords from LinkedIn isn't that different from what they do. Scraping LinkedIn profiles also turns up new programs that you've never heard of.

The intelligence community has people who don't know they're part of the intelligence community. For example, Lockheed Martin has an open-source intelligence program; Walmart used this program to figure out which workers would be likely to organize for higher wages. To the intelligence community, data collection is a form of risk management. Why don't we do the same things? We can't send warrants to telecommunications companies, but we can use the internet to collect information that people post about themselves (e.g., their resumes).

The presenter started to build free software tools to collect and dissect data, just like the intelligence community does. Let's start with a data source - LinkedIn, for example. We run a few searches using search terms (aka "selectors"). This is an iterative process, where information gathered in round N informs what you look for in round N + 1.
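The iterative loop can be sketched as follows. This is my own toy illustration, not ICWatch code: a list of strings stands in for scraped profiles, codewords are modeled as all-caps tokens, and the hypothetical program names in the test are made up.

```python
import re

def find_codewords(text):
    # Heuristic: program codewords are single all-caps tokens of 4+ letters.
    return set(re.findall(r"\b[A-Z]{4,}\b", text))

def expand_selectors(corpus, seeds, rounds=5):
    """Selectors found in round N seed the queries for round N+1."""
    known = set(seeds)
    for _ in range(rounds):
        hits = [doc for doc in corpus if known & find_codewords(doc)]
        new = set()
        for doc in hits:
            new |= find_codewords(doc)
        if new <= known:
            break  # fixed point: no new selectors discovered
        known |= new
    return known
```

Starting from one known program name, the loop pulls in profiles that mention it, harvests the other codewords those profiles contain, and repeats until nothing new turns up.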

Matching data across sources is a challenge, but you can do it successfully. Look for comma-separated lists; these are often lists of programs and terms. The context in which a term is used helps you figure out what it is.
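The comma-separated-list heuristic can be sketched with a regex. This is an illustration under my own assumptions (codewords as all-caps tokens of four or more letters; the sample names in the test are invented), not the project's actual extractor.

```python
import re

def comma_lists(text):
    """Return runs of two or more all-caps tokens separated by commas,
    e.g. 'TERMONE, TERMTWO, and TERMTHREE' -> one list of three tokens."""
    run_re = r"[A-Z]{4,}(?:,\s*(?:and\s+)?[A-Z]{4,})+"
    return [re.findall(r"[A-Z]{4,}", run) for run in re.findall(run_re, text)]
```

Runs like these are high-signal in resumes: even an unfamiliar token is probably a program name if it appears in a list alongside known ones, which is also how context helps classify a term.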

What kind of data sources are at our disposal? OSINT, hacked data, leaked data, FOIA data, interviews, and measurement data. The Hacking Team emails were a very useful data source. We're planning more data integration work in the future.

How did people react when they found out I was collecting this information? Some changed their profiles after finding themselves on ICWatch. But we can still track their changes. Some people deleted their profiles. One person sent me a DMCA takedown notice, which I didn't honor. I also got a few threats.

There are some IC mistakes we should avoid. First, we shouldn't demonize people in the IC. Humanize them instead. It's important to understand why they do what they do. If we can understand them, maybe we can push them to do something different. Second, we should share information more than the IC does.

By itself, knowledge isn't powerful. The power comes from how you use it. It would be better to have more collaboration and less competition. Collaboration will put us ahead of the IC - they're really bad at collaborating.

Comment: This project is a little like LittleSis. It would be interesting to see the two projects combined. You could build a social graph of the intelligence community.

Question: What can other people do to contribute?

Search through the data and make use of it. Write code to add data sources. Write FOIA requests based on the data we've published.

Question: Have you discovered any international relationships?

There's more open source information available about US intelligence agencies than about those of any other country. In the US, many intelligence functions are privatized. To get a job with one of these companies, it helps to list the programs on your resume. The keywords help you to get hired. Other countries don't privatize intelligence like this, so there's less open source information available.

Question: Are there things the free software community can do to spread collaboration?

Try to get people to release data behind their news stories.

Question: What about locally-collected data?

We need to collect more local data, especially about different police departments. There's probably a different set of search terms for every city and town.


Zak Rogoff, FSF

The web is global, but the W3C doesn't equally represent all of the people who use the web. DRM has been brought up by the likes of Netflix, Microsoft, the BBC, and Google. The W3C started discussing DRM in 2013. W3C standards aren't "law", but they're extremely influential.

The FSF has been happy with the W3C since the 1990s, but media companies have become very disgruntled with the web.

When you watch a movie on Netflix, Netflix installs a piece of software on your computer, and this software limits what you can do. In the early 2000s, Sony installed rootkits on people's computers. This was part of an anti-copying scheme. Beyond copy protection, DRM violates your right to control your computer.

The DMCA makes it illegal to circumvent DRM. It's not legal to "crack" DRM. It's not legal to reverse-engineer the software, to see if it's doing something harmful to your computer. DRM doesn't make exceptions for fair use. Companies want DRM in web standards, to make it easy to spread DRM throughout the web.

The DMCA is a US law, but trade agreements can spread laws to other countries. We don't have faith in the US government to write good copyright laws. It's interesting that this fight has spread to the W3C.

More than 35,000 people have signed our petition, asking the W3C to keep DRM out of HTML5. We gave the W3C an "Oscar" for the best supporting role in the Hollyweb.

We've had lots of discussions with the W3C, and lots of people in the W3C agree with us. But the top people in the W3C are less receptive.

The EFF proposed a compromise: you can't use DRM to attack people, and you can't sue security researchers. Even this would be a positive step.

The GPL is orthogonal to patents. The W3C was concerned about the use of patents in HTML. Companies that join the W3C agree not to claim patent rights over the web. There's no clear way for individuals to join the W3C; applying as an individual expert is the best you can do. Companies pay thousands of dollars per year for W3C membership. The MPAA, Google, Microsoft, and EFF are all members.

It all comes down to a free and open web, vs a web that's controlled by corporations. Victory here would show a new kind of civic engagement. If we lose, we lose one of the main bastions of a free and open internet. A loss would also open the door to adding DRM to other standards. There is a long line of lobbyists watching this. If we lose, it will affect DRM down the road.

There have been discussions of forking the W3C; we'd start using something different from what we're using now.

There are other standards bodies. The IETF had a scandal with the NSA. The use case of protected content has been accepted. Encrypted Media Extensions is a proposal, but it hasn't been accepted yet. Some kind of standardization is complementary to free software. The question is what kind of standardization.

The W3C's goal is to come to agreement on standards. Companies want legitimacy behind DRM, to make it easier to build DRM into other things. Encrypted Media Extensions (EME) is technically not a recommendation - it's not a mandatory part of the web. EME specifies APIs that a browser has to implement. Decryption still requires a black-box decryption library, which comes from some approved source.

Copyright law is not absolute. It's supposed to provide equilibrium between rights holders and the public. DRM doesn't respect the rights of the public (fair use, for example). DRM would effectively work like private law.

Some countries have copyright laws that differ from ours. At least until the TPP is passed.