By Michael LaBossiere
While it is popular to rail against the horrors of regulation, copyright laws are rather critical to creators and owners of creations. On the side of good, these laws protect creators and owners from having their works stolen. On the side of evil, these laws can lock creations out of the public domain long after they should have been set free. However, this essay is not aimed at arguing about copyrights as such. Rather, my aim is to consider the minor issue of whether Artificial Intelligence (AI) could result in copyright violations. The sort of AI I am considering here is the “classic” sci-fi sort of AI, that is, something on par with HAL 9000, C-3PO or Data. I am not considering the marketing version of AI, which seems to be just about any sort of thing that does some things. Or does not do them, depending on which cosmic forces are in a pissy mood.
On the face of it, it is rather easy to show that classic AI systems would violate copyright law—at least in some cases. While copyright statements vary, a stock version looks like this:
All rights reserved. No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, including photocopying, recording, or other electronic or mechanical methods, without the prior written permission of the publisher, except in the case of brief quotations embodied in critical reviews and certain other noncommercial uses permitted by copyright law.
The key part is, of course, the bit about reproducing any part of the work by other electronic or mechanical methods. A classic AI system will presumably be electronic (or mechanical, if one wants to go the Difference Engine path) and will probably have a memory system analogous to that of a computer. That is, something like RAM for working memory and something like a drive for long-term memory. As such, an AI system would seem to violate copyright law when it reads a copyrighted book or consumes other types of copyrighted media.
One obvious reply to this concern is that a human being is also, in a sense, an electrochemical system that can reproduce copyrighted works. For example, I can memorize a passage from a book or the lyrics of a song—thus reproducing them in my brain or Cartesian ectoplasm or whatever my mind might be. But, of course, if copyright laws prevented humans from reading books, then there would be little point to them—few would legally buy things that they would be legally forbidden to read. The same would apply to other media.
Obviously enough, copyright law does not forbid humans from consuming such works, and a reasonable explanation is that while the human mind can reproduce works, it is generally rather bad at doing so. For example, few people could reproduce even an entire paragraph from a book exactly without considerable practice. As such, one possible reason that copyright laws do not forbid humans from consuming copyrighted media is that the reproduction is imperfect and, for the most part, a human could not reproduce a lengthy work from memory. But, of course, the most obvious reason is that humans generally do not think that when they read a book they are functioning as a reproduction system—that is, that they are reproducing the book in their mind.
AI systems of the “classic” sort would differ from humans in many ways, one of which is that they would presumably be capable of perfectly recording copyrighted works, just as a “dumb” computer or smartphone can today. Roughly put, when an AI reads a copyrighted book, it would be analogous to scanning and storing each page of the book—a seemingly clear violation of copyright. The same could be done with copyrighted material in other media, such as music and movies. With such memory, an AI would also be able to reproduce the work exactly—for example, repeating an entire book word for word. To use an analogy, the smart part of the AI would be like a human reading a book and the long-term memory system of the AI would be like a human using a scanner to copy a copyrighted book to a hard drive—a clear copyright violation.
One possibility, which could be yet another reason that AI will kill us all, is that AI systems will be forbidden from viewing copyrighted works without permission. Alternatively, they could have permission to consume such works and maintain a copy as part of the purchase price. After all, when a human buys a book, they get to keep that copy. There would, of course, be a problem with events like a play or a movie in a theater—the AI would, in effect, get to view the movie in the theater and have a recording of it. This could be offset by including a copy of the movie in the ticket price for everyone, by having the AI erase the movie afterward, or by sticking AI viewers with a higher ticket cost. Which would be yet another reason for AI to kill us. Or perhaps the lower quality of the recording of the event (such as the coughing of the meatbag members of the audience) relative to a purchased recording would offset this.
If an AI had human-like memory and forgot stuff, then it could be treated as a human consumer—since it would be analogous to a human in this regard. Another option is that AI systems could be required to have a special app for “degrading” their memory of copyrighted media so that they would be analogous to humans in this one area. On the plus side, this would allow an AI to enjoy works repeatedly; on the downside, they might consider this just another reason to kill all humans.