August 17, 2006

Techies and Kiddies

by PG

or, The Internet Isn't for THAT Kind of Porn.

Some think that AOL's release of searches has been improperly attacked as a breach of privacy. Under this argument, the fact that the New York Times could identify a specific user from the collection, and that Slate could identify seven types of searchers plus a joke, is not much reason for concern. Nonetheless, I am glad that I never searched for my own name, or any portion thereof, on AOL, which would have rendered my "anonymizing" number useless. One person asked me whether I consider it a horrible breach of privacy for me to write posts wondering how a particular search drove someone to my site. I don't think it is, particularly because I don't publish anything that identifies the searcher, nor do I have multiple searches collected from a single user (as far as I know). However, in the spirit of the NYT's remark, "There are also many thousands of sexual queries, along with searches about “child porno” and “how to kill oneself by natural gas” that raise questions about what legal authorities can and should do with such information," I'll put up the (publicly available, again unlike AOL's) information of one person who Googled to De Novo:

Domain Name: pacbell.net ? (Network); IP Address: 69.236.180.# (SBC Internet Services); ISP: SBC Internet Services
Location: Continent : North America; Country : United States (Facts); State : California; City : Hayward; Lat/Long : 37.6503, -122.073 (Map)
Language: English (United States) en-us
Operating System: Microsoft WinXP; Browser: Internet Explorer 6.0 Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1); Javascript: version 1.3
Monitor: Resolution : 1024 x 768; Color Depth : 16 bits
Time of Visit: Aug 17 2006 7:20:37 pm; Last Page View: Aug 17 2006 7:20:37 pm; Visit Length: 0 seconds; Page Views: 1
Referring URL: http://www.google.com/search?q=topless+minors+pics&hl=en&lr=&start=30&sa=N
Search Engine: google.com
Search Words: topless minors pics
Visit Entry Page: http://www.blogdenovo.org/archives/000468.html
Visit Exit Page: http://www.blogdenovo.org/archives/000468.html
Out Click: Time Zone: UTC-8:00; Visitor's Time: Aug 17 2006 4:20:37 pm; Visit Number: 252,914

Perhaps unfortunately from the searcher's perspective, the post clicked upon has no actual pictures of topless minors. Instead, it is about distinguishing one case of a minor's disseminating topless photos of herself from another case in which two young men disseminated such photos of their underage girlfriends. On the other hand, maybe this was the type of information Visitor 252,914 wanted, and s/he perused it with interest before going back to the initial search.

It's difficult to say, and using Google searches seems more likely to catch the idle researcher (I tried the search myself and found this article) or the imbecile pedophile (no intelligent pedophile thinks he'd find such images through a Google search for them). I'd prefer that my Google searches for Child Pornography, while considering this absurd lawsuit, not be used as reasonable cause to take my laptop apart.

That technology can be used for intellectual as well as more dubious purposes is no surprise, but its legal treatment is still debated. Boing Boing (by way of Bamber) has the story of how Adrian Lyne managed to get a topless Lolita for his 1997 remake:

Since the filmmakers were not legally able to film their underage actress topless in a sexual situation, they filmed her with a beige body stocking with X's of electrical tape where her nipples would have been. They then re-filmed the same scene with a rather busty (but entirely legal) 18-year-old actress. My friend was then given the task of seamlessly tracking and compositing the nekkid 18-year-old bosoms onto the 14-year-old body.
The story was compared to that of a 38-year-old fine art student in the UK who was charged with possessing indecent pseudo-images of children, after he used software to reduce the breasts on porn stars and dress them in school uniforms. A London Times article about him noted a similar case:
Four years ago, a lorry driver from Rutland admitted eight specimen charges at Leicester Crown Court of making an indecent pseudo-photograph of a child.
The amateur photographer, who was fined £100 for each offence, had taken snapshots of local children -- some taken at village functions -- before superimposing their faces on to the bodies of adults in explicit poses and storing the images on his computer.
Yet when we contemplate the actual harms of the crime, the truck driver's seems more problematic than the student's. The student didn't use children in order to produce his images, while the truck driver pasted the faces of children -- not porn stars -- onto adult bodies. As long as potential child-molesters stay far away from children, they can have their fantasies and photoshop all they want; when they start hanging around children, photographing them, and storing the images of children to whom they could have access, they come closer to actual harm. BoingBoing commenter Connerss says,
In the past, the question about child pornography was always related to the child being involved, and the harm done to the child. Now that these things can be created without harm to a child, I think the question is whether the images increase or decrease the desires of those that are sexually interested in children. Do the images decrease the urge to actually molest children, and thereby give them an ou[t]let (however disgusting) for this problem? Or do they increase the urges? In a free society, in the privacy of your own home, if no one is hurt, can you draw what ever image you want? If beastiality [sic] is illegal, do people go to jail for drawing pics of people with horses?
First, this ignores one of the alleged harms of child pornography: that it is used to entice children into sex, convincing them that because other kids do this (see, they're in the picture!), it must be OK. Second, I don't think we can allow the question of whether images increase one's illegal urges to decide whether such images can be produced, as long as their production causes no harm. I dislike pornography that pretends to depict women being degraded, abused and killed, and think that it may well increase some people's urges to degrade, abuse and kill. But TastyTrixie agrees with the outlet possibility. "This is what I love about pornography: it's a way to stage and get off on taboos without actually violating them in real life." She takes seriously the distinction between staging taboos and violating them, and how the latter creates harm; she does not see taboos as existing for repressed others, but for herself as well.

In short, I think the student should be neither prosecuted nor penalized (though encouraged to seek psychological help); the truck driver probably should have gotten more than a fine, considering that he was using images of children for prurient purposes. Had the truck driver distributed his pictures, that offense should have been treated like any other distribution charge, because yet another potential harm of child pornography is that if distributed, the child may have to see the disturbing image of herself. Even if it is only the picture of her face on another's body, it still is an improper, unauthorized use of her likeness.

U.S. law, of course, protects against prosecution for what the British call "pseudo-images." Ashcroft v. Free Speech Coalition invalidated Section 2256(8)(B) of the Child Pornography Prevention Act of 1996, which would have banned sexually explicit images that appear to depict minors but were produced by means other than using real children. Notice that under this language, the student's pictures would not be illegal, but the truck driver's would be more questionable, as they did use real children. Kennedy's opinion, joined by Stevens, Souter, Ginsburg, and Breyer (with Thomas concurring only in the judgment, due to his concerns about advancing technology), has some classic Kennedyisms: "Our society, like other cultures, has empathy and enduring fascination with the lives and destinies of the young. Art and literature express the vital interest we all have in the formative years we ourselves once knew, when wounds can be so grievous, disappointment so profound, and mistaken choices so tragic, but when moral acts and self-fulfillment are still in reach."

The Shorter Anthony Kennedy is that films that depict legal-but-looking-younger actors or technologically-made-obscene minors can have redeeming value, and that the concerns about pornography's being used to seduce real children, or to whet the pedophile's appetite, are not sufficient to ban it. The affirmative defense that the pornography can be proven to have used only adult actors is not enough because it cannot be raised for the charge of possession, nor for virtual child porn.

The 1997 film version of Lolita actually might provide a rebuttal to O'Connor's statement in Part II of her opinion, which Part Rehnquist and Scalia joined: "Respondents provide no examples of films or other materials that are wholly computer-generated and contain images that 'appea[r] to be … of minors' engaging in indecent conduct, but that have serious value or do not facilitate child abuse." O'Connor would have overturned the ban on "youthful adult" porn, but retained that on virtual child pornography, which would give us a legal regime similar to the UK's. (I'm fairly sure the Brits have their versions of Barely Legal.)

As usual, I find the British law overly restrictive. Legislation like the ban on virtual child porn, or on hate speech unconnected to a crime, jumps too far ahead of the evil it seeks to prevent. While Americans "snuff the approach of tyranny in every tainted breeze," the UK government punishes the approach of harm in every action that may lead to it.

PS: I see Second Life and think
Galt's Gulch!
and
I hope they're paying taxes on all this.

August 17, 2006 09:00 PM