Thursday, July 8, 2010
Tuesday, June 29, 2010
Monday, June 28, 2010
Digital Rights Management (DRM) is a category of systems that seek to impose restrictions on the use of digital content, often to enforce artificial scarcity. The FSF therefore calls it "Digital Restrictions Management". Some general problems with DRM are:
Dependence on servers: Many DRM systems depend on central servers, which become a single point of failure if they go down, as nearly happened to the MSN Music and Yahoo! Music license servers back in 2008.
Central control: Many DRM systems allow central control over content, as happened in mid-2009 when copies of Animal Farm and 1984 were remotely taken away from Amazon Kindle customers.
Trusted client problem: Most DRM systems suffer from the trusted client problem, which is fundamental: digital bits are easily copied, yet the system relies on the client itself to enforce the restrictions. The only way to solve this would be something like Trusted Computing.
Fair use rights: DRM systems enforce restrictions mechanically and cannot recognize uses that copyright law permits, so fair use rights are restricted as well.
Back in the olden days of books, books were not easy to copy. Before the invention of the printing press, they had to be copied by hand. The printing press made mass production of books possible, but because presses were so big and expensive, only companies could afford them, not the average person. This made copyright on books relatively easy to enforce. In fact, copyright was originally intended to prevent other publishers from reprinting a work without the permission of the original publisher.
This changed with the introduction of e-books, which, like any other digital bits, are easily copyable, threatening scarcity-based business models. E-book device vendors, like other industries affected by this, responded by creating DRM schemes. A feature of many of them is remote control over content that is not possible with real property. For example, in mid-2009, copies of the Animal Farm and 1984 e-books that Amazon Kindle customers had bought disappeared this way. The FSF has criticized the Kindle, calling it the "Swindle", for its DRM scheme.
In the late 1970s, more and more software was becoming proprietary, and Stallman suffered from it. Stallman considered agreeing not to share software with his neighbor, in order to create artificial scarcity, a threat to the community of software sharing and hacking he was part of. His personal experiences included one with the software that controlled a Xerox printer (when Stallman finally found a developer involved with it, the developer refused to give him the source because he had agreed not to give anyone else a copy), and one with Scribe (Brian Reid agreed to add a 90-day time bomb when he sold the software to Unilogic, a practice that later became common in shareware).
As more and more software became proprietary, denying users the freedom to share (in order to create artificial scarcity) and to modify, Stallman faced a stark moral choice: leave computing, accept a proprietary software system that "left users helpless", or change it. Of course, he chose to change it by starting the free software movement.
The Beginning Of The Free Software Movement
Stallman made the initial GNU announcement in September 1983 to begin the free software movement and the GNU project, asking for donations of time, money, and equipment. He quit MIT in January 1984 so he could focus on the GNU project. He first looked for a free compiler. He found VUCK, a Dutch acronym that, translated to English, stood for Free University Compiler Kit. Unfortunately, the "Free University" part referred to the Dutch name of the university and did not indicate the software was free. Later, a Pastel compiler from Lawrence Livermore National Lab was found that was free software. Unfortunately, besides being written in and for Pastel, it also kept each whole program in core memory, which was acceptable only on the mainframes it was designed for. In the end, Stallman had to give up the search for the time being and focus on something else: he decided to work on GNU Emacs, a free version of the text editor Emacs.
When he began GNU Emacs, he at first thought he could use Gosling's MOCKLISP interpreter. Unfortunately, it had been sold to UniPress, which threatened to enforce the copyright, so he had to discard that code and start from scratch. He began GNU Emacs in September 1984, and by early 1985 it was finally beginning to be usable, so people began asking for it. Stallman of course put it on the prep.ai.mit.edu FTP server, but many users did not have access to the Internet at the time. Stallman realized that money could be made by starting a business to distribute the software. Thus began the Free Software Foundation (FSF).
The GNU Manifesto
By the time the FSF started, companies (including AT&T itself) were commercializing Unix and, as part of that, closing off access to source code. Even from AT&T, source licenses were becoming more expensive, which adversely affected BSD, which was based on UNIX/32V source code. This only increased support for the GNU project. Afterward, Stallman wrote the longer GNU Manifesto, which, besides asking for donations, provides rebuttals to many of the justifications for proprietary software. A quote from it:
"Arrangements to make people pay for using a program, including licensing of copies, always incur a tremendous cost to society through the cumbersome mechanisms necessary to figure out how much (that is, which programs) a person must pay for. And only a police state can force everyone to obey them. "
This refers to the fundamental problems of enforcing artificial scarcity in a world where digital bits are easily copyable, some of which I described in other pages.
The Open Source Movement
The term "open source" was invented by Peterson as an alternative because of the problems of distinguishing free as in price from free as in freedom. Michael Tiemann also proposed another term as an alternative too, "sourceware". O'Reilly ultimately decided to put this matter up for a vote. The result was that 9 out of the 15 participants voted for "open source", so an agreement was made to use it in further discussions with the press and it stuck. Stallman considered this term, but open source was positioned as business friendly, while free software was more ideological, which ultimately separated it from the free software movement. Eric Raymond in 1998 proposed creating the Open Source Initiative that would police the use of "open source". It also provided the "Open Source Definition", which was created by Perens (who would later resign) based on the Debian Free Software Guidelines (DFSG). Both the FSF and OSI provided list of licenses that met their requirements.
Computer "user groups" began to be created in the early days of computers, with both mainframes and minicomputers having user groups. Some of these user groups include:
- SHARE for IBM mainframes
- Digital Equipment Computer Users' Society (DECUS) for DEC computers
One of the things often done at computer user groups was sharing software. Often the software shared was in the public domain. A weakness of this system was that such software could easily be made proprietary. For example, it was reported that Bill Gates copied a lot of the code of Altair BASIC from a version of BASIC from the DECUS user group. Later, the copyleft scheme implemented by the GNU General Public License was devised by the FSF to resolve this problem.
Saturday, June 26, 2010
Peer-to-peer networks are nowadays one of the most common mechanisms for distributing pirated content. One of the first peer-to-peer networks was Napster. Some common ones today are BitTorrent and LimeWire.
The RIAA was infamous for its P2P lawsuits. The first lawsuits were filed in 2003. At first, the RIAA used DMCA subpoenas and sued people directly based on them. Later, it sent threat letters that allowed defendants to settle by paying without a lawsuit. On December 19, 2003, a federal appeals court ruled that the RIAA's use of DMCA subpoenas was illegal, and the next year the RIAA began filing mass "John Doe" lawsuits instead. This was a great improvement over DMCA subpoenas in that it allowed judicial oversight and an opportunity for a defense. Even then, there were problems. One was that among the many people sued were poor people who could afford neither to settle nor to mount a defense. The RIAA even sued a dead grandmother, a 14-year-old kid (the case was dismissed for failure to pay a guardian ad litem), a disabled person, and more. Another problem was how the IP addresses were gathered, using MediaSentry. It was reported that MediaSentry lacked the private investigator license needed for this type of work, for one thing. For another, it sometimes used indirect detection, which was inaccurate: researchers managed to get a DMCA notice sent for a networked printer!
Of course, those were far from the only P2P lawsuits. When P2P technology first appeared, rights holders first tried to sue the technology itself; Napster shut down in July 2001 after it was sued. Trackers and others were sued too, like The Pirate Bay.
The most recent mass P2P lawsuit is the one by the US Copyright Group, which used joinder to sue many defendants in one lawsuit. The EFF argued in an amicus brief that this use of joinder was improper, and while that failed to convince the court, it did lead the court to order user-friendly notices for those targeted.
Friday, June 25, 2010
The WIPO Copyright Treaty, the DMCA, and ACTA are, in part, attempts at laws restricting circumvention of DRM and copy-protection technologies.
The WIPO Copyright Treaty
The WIPO Copyright Treaty of 1996 required member countries to pass laws restricting circumvention of DRM and copy-protection technologies. It says that "Contracting Parties shall provide adequate legal protection and effective legal remedies against the circumvention of effective technological measures that are used by authors in connection with the exercise of their rights under this Treaty or the Berne Convention and that restrict acts, in respect of their works, which are not authorized by the authors concerned or permitted by law."
The Digital Millennium Copyright Act (DMCA)
The Digital Millennium Copyright Act was the United States' implementation of the WIPO Copyright Treaty. Among other provisions, Title I included a broad anti-circumvention provision that applies regardless of fair use or other normal copyright exemptions, and it has been widely criticized. Title I also had a specific provision making it illegal to sell VCRs that are not affected by automatic-gain-control-based copy protection (such as Macrovision), to prevent circumvention of that system.
The Anti-Counterfeiting Trade Agreement (ACTA)
The Anti-Counterfeiting Trade Agreement (ACTA) is an in-progress treaty that was negotiated mostly in secret, with a year of leaks before a draft was officially made public in April 2010. Among other provisions, it goes a step beyond the WIPO copyright treaties by requiring countries to structure their anti-circumvention and safe harbor provisions to more closely resemble the DMCA. It has similarly been widely criticized, not only for that but also for the secrecy.
Tuesday, June 22, 2010
Most early computer software was the result of consulting work, usually done under a contract that set conditions on the software, including on copying. The first attempt at claiming copyright on software was by North American Aviation in 1961, which submitted a tape of a program to the Copyright Office for copyright registration. Later, two other short programs were submitted by a Columbia University law student. The Copyright Office eventually concluded that a program was like a how-to book and thus could be copyrighted, provided that it was original (of course), that it carried a copyright notice (then a requirement for all copyrighted works), and, if it was to be registered, that the human-readable source code be deposited.
Later, the Copyright Act of 1976 made clear that Congress intended software to be copyrightable. But because Congress did not want to further delay passage of the bill, it appointed the National Commission on New Technological Uses of Copyrighted Works (CONTU) to report on computer programs and other technologies emerging at the time. CONTU held extensive hearings and produced a final report on July 31, 1978 with several recommendations, including on computer programs. It recommended adding a definition of "computer program" and a provision permitting copies made as an essential step in running a program, as well as archival copies, provided that all copies are destroyed when the rights to the program are transferred. Congress adopted these recommendations on December 12, 1980.
As people started software companies to make money on software, copyrighting software became more common. Thus came the invention of software licensing and of software license agreements that prohibited copying, or "piracy" as it was called, interfering with the community of software sharing. Companies kept the source code secret too (IBM created its Object Code Only policy in 1983). Both of these practices made many people unhappy, including Stallman, who created the FOSS movement.
Along with software copyright came technical mechanisms to enforce copyright and software licensing restrictions.
Floppy disk copy protection
One kind of copy protection was very common on microcomputers when software was distributed on floppy disks. It took advantage of specific characteristics of the floppy disk mechanism. One common trick was punching holes in the floppy disk and having the software look for read errors in that area. On the Apple II, another common trick was to change the address marks, bit slip marks, data marks, or end-of-data marks from Apple's standard, so software designed to copy standard disks could not copy these copy-protected disks. Apple II software devised to defeat this kind of copy protection included Locksmith and Back It Up, and there was a cat-and-mouse chase between the protection schemes and the copying tools.
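The hole-punching trick can be sketched in a few lines. This is a hypothetical model, not code from any actual scheme: the disk is modeled as a dict standing in for low-level sector reads, and the damaged-sector number is invented.

```python
# Hypothetical sketch of an "intentional bad sector" check. A real
# implementation would issue low-level controller/BIOS reads; here a disk is
# a dict mapping sector number -> bytes, and a missing entry stands in for a
# sector that returns a read error.

DAMAGED_SECTOR = 720  # assumed location of the deliberately damaged sector

def read_sector(disk, number):
    """Return sector data, or raise IOError like a real read failure."""
    if number not in disk:
        raise IOError(f"read error on sector {number}")
    return disk[number]

def looks_original(disk):
    """The genuine disk must FAIL to read the damaged sector.

    A naive sector-by-sector copier fills every sector with valid data, so on
    a copy the read unexpectedly succeeds and the check detects the copy."""
    try:
        read_sector(disk, DAMAGED_SECTOR)
    except IOError:
        return True   # read error present -> genuine disk
    return False      # sector reads fine -> it's a copy

original = {n: b"data" for n in range(1440) if n != DAMAGED_SECTOR}
copy = {n: b"data" for n in range(1440)}  # copier "repaired" the bad sector
```

The cat-and-mouse aspect follows directly: a smarter copier (like Locksmith) reproduces the read errors too, defeating the check.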
Copy protection based on offline material
Another kind of copy protection took the form of software asking the user questions from offline material shipped with the game, such as a manual. Sierra's King's Quest III, for example, required lengthy passages to be copied from the manual.
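A check of this kind can be sketched as follows. The manual contents, page numbers, and prompt format are all invented for illustration, not taken from any actual game.

```python
# Hypothetical sketch of a manual-lookup copy-protection check. The printed
# manual is modeled as a dict of page -> list of words.
import random

MANUAL = {
    12: ["the", "wizard", "teleports", "you", "to", "the", "castle"],
    31: ["type", "climb", "tree", "to", "reach", "the", "eagle", "nest"],
}

def challenge(answer_fn):
    """Ask for word N on a random page; answer_fn plays the user's role."""
    page = random.choice(list(MANUAL))
    index = random.randrange(len(MANUAL[page]))
    expected = MANUAL[page][index]
    answer = answer_fn(page, index + 1)  # "enter word N on page P"
    return answer.strip().lower() == expected

# A legitimate owner looks the word up in the printed manual:
owner = lambda page, n: MANUAL[page][n - 1]
```

The weakness, of course, is that photocopying the manual (or passing around a transcript) defeats the scheme entirely.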
Dongle-based copy protection
Another kind of copy protection is to require a hardware dongle to run the software, tying the software to a scarce physical object usually supplied with it. Designs ranged from simple to complex, depending on how much protection was required. Common ports for attaching dongles included the parallel port, ADB, and USB.
CD copy protection
Back in the old days of CD-ROMs, CDs could not be easily copied. This changed with the advent of CD-RW drives. The game industry responded with copy-protection schemes like StarForce and SecuROM. Many of these schemes had compatibility issues, not to mention they were often cracked.
Online activation is becoming more common in software. Microsoft's Windows XP, Office XP, and later versions use it, as do some other vendors such as Adobe. One problem is dependence on a central server: should it be decommissioned, the software vendor may need to offer a patch to disable activation.
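The general shape of such a scheme, not any vendor's actual protocol, could be sketched as follows: the activation server binds the license key to a machine fingerprint and returns a token the client stores and re-checks. All names and the HMAC construction here are assumptions for illustration.

```python
# Hypothetical sketch of hardware-tied online activation.
import hashlib
import hmac

SERVER_SECRET = b"known-only-to-the-activation-server"  # illustrative

def server_activate(license_key: str, machine_id: str) -> bytes:
    """Server side: return an activation token tied to this machine."""
    msg = f"{license_key}:{machine_id}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).digest()

def client_is_activated(license_key: str, machine_id: str, token: bytes) -> bool:
    # In a real scheme the client would verify a public-key signature, since
    # it must not hold the server secret; simplified here for brevity.
    expected = server_activate(license_key, machine_id)
    return hmac.compare_digest(expected, token)

token = server_activate("ABCDE-12345", "machine-01")
```

The sketch also shows the dependence problem: once the server (and its secret) is gone, no new machine can ever be activated, hence the need for a deactivation patch.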
Sunday, June 20, 2010
Digital Audio Tape (DAT)
(Image: Digital Audio Tape, from Wikimedia Commons)
For recorded tapes, the move to digital sound came with the invention of Digital Audio Tape (DAT). The record industry, fearing widespread copying and lost sales, lobbied against it, which kept it off the market for several years. CBS created an early copy-protection system called CopyCode, which marked copy-protected sound by cutting a notch in the audio spectrum with a notch filter; DAT recorders were to detect the notch and refuse to copy. It had many problems and was widely opposed, as was a bill to mandate it. A compromise was finally reached in 1989: the Serial Copy Management System (SCMS), which instead signals copy permissions through bits in the subcode data of a digital link. SCMS was codified as a requirement in the Audio Home Recording Act of 1992, along with royalties on DAT devices and blank media.
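The SCMS logic can be modeled in a few lines. The three states (copy freely, copy once, copy never) are the real SCMS semantics; the two-bit encodings below are illustrative placeholders, not the actual subcode layout.

```python
# A simplified model of the Serial Copy Management System's copy logic.
COPY_FREELY, COPY_ONCE, COPY_NEVER = "00", "10", "11"  # illustrative encodings

def record(source_state):
    """What a compliant recorder does with the SCMS state of its input."""
    if source_state == COPY_FREELY:
        return COPY_FREELY        # unrestricted material stays unrestricted
    if source_state == COPY_ONCE:
        return COPY_NEVER         # first-generation copy: mark it final
    raise PermissionError("SCMS: copying prohibited")

first_copy = record(COPY_ONCE)    # allowed: one copy of the original
# record(first_copy) would raise: no second-generation digital copies
```

This also illustrates the trusted client problem from earlier: the flags only work because every licensed recorder agrees to honor them.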
Audio CD copy protection
Back in the old days, an audio CD was not easily copied or ripped. That changed with the introduction of CD-ROM drives, which allowed easy ripping of music onto a computer. Eventually the industry responded with various tricks to prevent copying, most of which were not compliant with the Red Book standard, disqualifying those discs from carrying the CD logo and causing issues with some optical drives (some discs would not even eject). Another approach was to put copy-protection software on the audio CD itself. Sony BMG's attempt to do exactly this in 2005 turned out to be a disaster: it put the XCP and MediaMax copy-protection software on some of its audio CDs, and both used a rootkit to hide themselves, a technique commonly used by malware, as Mark Russinovich exposed. I will not go into the mess that was Sony's attempts to provide removal tools, which at first had security holes of their own. The fiasco caused widespread consumer outrage that increased awareness of the harms of DRM, eventually leading to most music nowadays being DRM-free.
Digital music DRM
When Microsoft and Apple opened online digital music stores, the record companies wanted the music copy-protected because they feared widespread piracy. As a result, Microsoft and Apple created DRM-wrapped WMA and AAC formats, used for example by the iTunes Music Store. Many of these formats relied on a centralized license server. In 2008, MSN Music and Yahoo! Music were about to shut down their license servers, which would have meant that any operation on the music requiring the server would no longer work. Microsoft later relented and promised to keep the MSN Music license servers running until the end of 2011. That was after consumers caught on to the harms of DRM, partly due to the Sony BMG disaster described above, leading to DRM-free music eventually becoming the primary form of online music.
VCR AGC copy protection (Macrovision)
Macrovision was a form of copy protection for VCR tapes. It exploited the Automatic Gain Control (AGC) feature of many VCRs by adding false sync signals, which distorted the image when a Macrovision-protected tape was copied. There were several ways to bypass it, including using a time-base corrector or simply turning off AGC.
DVD and Blu-ray copy protection (CSS/AACS)
When the DVD standard was created in 1996, it came with the Content Scrambling System (CSS). The way it was supposed to work was that the DVD Copy Control Association would license the keys to manufacturers that needed them, on the condition that the rest of the system be implemented. Unfortunately, it turned out to have many flaws. For one thing, it used weak 40-bit encryption because of the US export restrictions in effect when CSS was created; there were other flaws too. DeCSS, which allowed decrypting DVD discs, was released on the LiViD mailing list in October 1999. It was authored by Jon Lech Johansen and two anonymous collaborators.
Later on, the Advanced Access Content System (AACS) was created for HD DVD and Blu-ray. It was stronger than CSS, being based on AES, with a revocation system used to revoke compromised players, among other things. The primary way it has been cracked has been to compromise the keys used to process the Media Key Block (MKB) for decryption, though other keys, such as title keys, have been compromised too. When the first such key was compromised, it was published widely, and attempts to take it down with the DMCA were to no avail. Fortunately, AACS supports MKB renewal, which was done soon afterwards, with a new MKBv2 used from then on. Of course that was eventually compromised too, then another MKB was created, and so on in a cat-and-mouse game.
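The broadcast-encryption idea behind the MKB and revocation can be illustrated with a toy model. Real AACS uses AES and a subset-difference tree over device keys; here "encryption" is a plain XOR and each player holds a single key, which is just enough to show how revocation works.

```python
# Toy model of MKB-style key distribution with revocation (NOT real AACS).

def xor(data: bytes, key: bytes) -> bytes:
    """Stand-in for real encryption/decryption (XOR is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, key))

def make_mkb(media_key, device_keys, revoked):
    """Encrypt the media key under every device key except revoked ones."""
    return {dev: xor(media_key, key)
            for dev, key in device_keys.items() if dev not in revoked}

def player_get_media_key(mkb, dev, key):
    """A player recovers the media key, or None if it has been revoked."""
    entry = mkb.get(dev)
    return xor(entry, key) if entry is not None else None

device_keys = {"player_a": b"aaaaaaaa", "player_b": b"bbbbbbbb"}
# player_b's key leaked, so new discs carry an MKB that excludes it:
mkb = make_mkb(b"mediakey", device_keys, revoked={"player_b"})
```

Renewal is then just issuing a new MKB on future discs with the compromised players' entries left out; already-pressed discs, of course, remain decryptable with the leaked keys.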
Open Letter to Hobbyists (from Wikimedia Commons)
As you can see in the letter, it treats copying as theft, a theme the copyright industries would use again and again later, including in the word "piracy".
Now, it wasn't entirely the hobbyists' fault. MITS priced Altair BASIC so that, when it was purchased with two of their 4K dynamic RAM boards, the price was only $75, while on its own the price was a whopping $500. Unfortunately, dynamic RAM boards were in general problematic on the Altair (later S-100) bus for several reasons. As a result, Robert Marsh designed a 4K static RAM board for the Altair bus and started Processor Technology to market it. Of course, those who bought that board instead of MITS's had to pay full price for BASIC; instead, many copied Altair BASIC from somebody else. Eventually, Ed Roberts acknowledged the problems in the October 1975 issue of Computer Notes: the full price of Altair BASIC was reduced to $200, the price of the memory board was reduced from $264 to $195, and existing buyers got a $50 refund.
Links to articles: