The Y10K Problem -- Prepare while you have time

Fun little article on the supposed Y10K problem:
   The most common fix for the Y2K problem has been to switch to 4-digit years. This fix covers roughly the next 8,000 years (until the year 9999), by which time everyone seems convinced that all current programs will have been retired. This is exactly the faulty logic and lazy programming practice that led to the current Y2K problem! Programmers and designers always assume that their code will eventually disappear, but history suggests that code and programs are often used well past their intended circumstances.
In a similar vein, but intended to be serious: A Long, Painful History of Time.

etoy wins VIDA AWARD 2007 with MISSION ETERNITY SARCOPHAGUS

Spanish telecom giant Telefonica saves etoy.CORPORATION from bankruptcy and brings etoy's mission to the next level.

As Fundación Telefónica president Francisco Serrano announced today at the Barcelona Contemporary Art Museum (MACBA), etoy wins first prize in the VIDA AWARDS, created by Fundación Telefónica to foster artistic creation based on new technologies and artificial life.

Fundación Telefónica press release downloads:

files/vida10/press-release-vida10awards-winners2007.pdf
files/vida10/vida10awards-winners2007.pdf

Excerpt from the jury statement:

etoy launched the Mission Eternity Project in 2005, foregrounding on the one hand respect for the human longing to survive in some way after death, and on the other a sense of irony about dated sci-fi fantasies we contrive to satisfy that desire. The Sarcophagus is one materialization of this project. It is a mobile sepulcher that holds and displays portraits of those who wish to have their informational remains cross over into a digital afterlife. The size of a standard cargo container that can travel to any location in the world, the Sarcophagus has an immersive LED screen covering its walls, ceiling and floor. There, interactive digital portraits can be summoned via mobile phone or web browser from virtual capsules that are stored in the shared memory of thousands of networked electronic devices of Mission Eternity Angels (people who contribute a small part of their personal storage capacity to the mission, currently 765 of them; to date, 2 volunteers have been accepted for encapsulation). The data spectres that populate this tenuous memorial space are composed of details of lives lived, in visual, audio and text fragments. But when they are summoned in lo-res pixellated form in the Sarcophagus, they resemble one merged personality. The massing of details that we find in archives and records that keep the dead with us has a similar compositing effect, yet the Sarcophagus is also very unlike those. It gives us access to a novel social world generated among networked computer users who have a common goal of keeping something alive, which can invoke intense feelings such as care and wonder.



http://www.missioneternity.org
http://www.etoy.com
http://angelapp.missioneternity.org/
http://www.telefonica.es/vida





Press image download:
http://missioneternity.org/files/images/site/tank/06-sanjose-etoy-taasevigen2-01.jpg

_______


PRESS RELEASE (Zurich/Barcelona, 28.11.2007)


Swiss media art group etoy wins the Telefonica Foundation's "VIDA AWARD 2007", Spain's most important media art prize.

As Francisco Serrano, president of the Telefonica Foundation, announced at a press conference in Barcelona today, etoy will be honored tomorrow, Thursday, for the container sculpture "MISSION ETERNITY SARCOPHAGUS". The artwork houses the mortal remains of pioneers of the information age such as Timothy Leary, and opens up the digital legacy of these people. On a walk-in three-dimensional screen consisting of 17,000 LEDs, exhibition visitors encounter ancestors of today's media culture: images from their lives, their voices, video sequences, and more. These electronic ghosts can be summoned via web terminal or mobile phone and stored locally, for example on one's own laptop. With the help of sophisticated software, this creates a social network of many thousands of "Angels" whose task is to preserve the data of the deceased forever: the digital legacy is passed on from generation to generation and thereby embarks on an endless journey through space and time.

The exploding storage capacity available for digital data today opens up new dimensions of preservation and calls centralized knowledge repositories into question. Preserving information is a social and cultural challenge, no longer a question of storage space. etoy, itself constantly confronted with the transience of its own work, raises fundamental questions with MISSION ETERNITY: what happens to the digital traces that we all leave behind?

With its latest work, the well-known art group addresses transience, preservation, data storage, death and the afterlife. MISSION ETERNITY is a technical experiment that questions the consequences of total data capture and sketches the vision of a contemporary cult of the dead. At the same time it is, as always with etoy, about a multilayered portrayal of the virtualization of our lifeworld, mirrored in death, and about the blurring of the boundaries between author and consumer, sender and receiver, the living and the dead.

etoy.SPONSORS: Swiss Federal Office of Culture, Migros Kulturprozent, Swiss Life, Pro Helvetia, the cities and cantons of Zug and Zurich, Ernst Göhner Stiftung


ANGEL APPLICATION 0.3.2 "VIDA"


We are very pleased to announce the immediate availability of ANGEL APPLICATION version 0.3.2, a maintenance release code-named "VIDA".

This update consists of cleanup and stability fixes since 0.3.0.

Changes:

* stability fixes (resource initialization & redirect handling)
* rolling clone list
* optimizations in maintenance loop
* test cases
* minor fixes

Get it from the Developer WIKI


Tangible Functional Programming

In this beautiful Google tech talk, Conal Elliott makes a convincing case that API and UI should in fact be one and the same -- whereas today the gap between the two is one of the deepest schisms in IT. Among other things, he makes this rather outrageous statement (see 17:40):
The essence of programming has nothing to do with programs.
but of course not without thoroughly backing it up: he presents a prototype toolkit that demonstrates the idea. The talk gets a bit technical towards the end, but it should be accessible (and I recommend it) to anyone with a general interest in programming and user interface design.

ANGEL APPLICATION 0.3.0


We are very pleased to be able to announce the immediate availability of ANGEL APPLICATION version 0.3.0.


This update consists of stability fixes (see e.g. here), API cleanup work (see the current module import graph), and GUI work. See the CHANGELOG. Further information is available on the M∞ ANGEL-APPLICATION Developer Wiki.

One important thing to note: if you are upgrading from an older version, you will have to purge/empty your local repository once before being able to help safeguard MISSION ETERNITY data forever. This can be done with a single mouse-click in the File menu -> "Purge repository".

what does eternity look like?

Code-wise, eternity seems to look like a graph. At least this is what the ANGEL APPLICATION looks like in Python: a module import graph.
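For the curious, here is a minimal sketch of one way to extract such an import graph from a package's sources -- my own illustration, not the tool that produced the original image: walk the source tree and record module-to-import edges.

import os, re

def import_graph(package_dir):
    # collect (source file, imported module name) edges from all .py files
    edges = []
    for dirpath, _, filenames in os.walk(package_dir):
        for filename in filenames:
            if not filename.endswith(".py"):
                continue
            source = os.path.join(dirpath, filename)
            for line in open(source):
                match = re.match(r"\s*(?:from|import)\s+([\w.]+)", line)
                if match:
                    edges.append((source, match.group(1)))
    return edges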


Fixing urlparse: Make the simple easy, keep the complex solvable

In my previous post, I presented netaddress, an RFC 3986-compliant (I believe) URI parser (with all the shenanigans that come with it, such as numerical IP addresses). Now, while it's good to know that that's available, it has made parsing simple URIs (the most common case) more complicated than it needs to be, because it now exposes most of the complexity inherent in URIs. But this is yet another place where parser combinators really shine. Say I want to parse URIs of the simplified form $(scheme)://$(host)$(path); then this is all you need to do:

from rfc3986 import scheme, reg_name, path_abempty
from pyparsing import Literal

# reuse the rules of the full RFC 3986 grammar, keeping only what we need:
host = reg_name.setResultsName("host")      # a bare registered name, no userinfo or port
path = path_abempty.setResultsName("path")
URI = scheme + Literal("://") + host + path

And now you've got yourself a validating parser for your reduced grammar. Nice, no? I've added this as an extra module ("notQuiteURI") to netaddress, so you can use it like this:

>>> from netaddress import notQuiteURI 
>>> uri = notQuiteURI.URI.parseString("http://host.name.com/path/to/resource")
>>> uri.scheme
'http'
>>> uri.host
'host.name.com'
>>> uri.path
(['/', 'path', '/', 'to', '/', 'resource'], {})
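
And because it validates, malformed input fails loudly instead of silently producing garbage (a sketch -- the exact error message depends on your pyparsing version):

>>> notQuiteURI.URI.parseString("foo.com:6221/bar")
Traceback (most recent call last):
[snip]
pyparsing.ParseException: Expected "://" (at char 7), (line:1, col:8)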

Update: netaddress is now available through the Python cheese shop. If you're interested, you should be able to install it by simply typing:

$ easy_install netaddress

Fixing urlparse: More on pyparsing and introducing netaddress

This is the last in a series of three posts (1, 2) discussing issues with Python's urlparse module. Here, I intend to provide a solution.

In the last post, I was talking about parser combinators, and parsec in particular, mentioning pyparsing towards the end. The angel-app being a Python application, parsec, while cool, is of no immediate use. pyparsing, on the other hand, provides parsec-like functionality for Python. Consider this excerpt from the RFC 3986-compliant URI parser that I'm about to present in this post:


dec_octet = Combine(Or([
        Literal("25") + ZeroToFive,            # 250 - 255
        Literal("2") + ZeroToFour + Digit,     # 200 - 249
        Literal("1") + repeat(Digit, 2),       # 100 - 199
        OneToNine + Digit,                     # 10 - 99
        Digit                                  # 0 - 9
        ]))
IPv4address = Group(repeat(dec_octet + Literal("."), 3) + dec_octet)
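
(Digit, OneToNine, ZeroToFour, ZeroToFive and repeat are small helpers defined in netaddress. Roughly -- and this is a sketch of my own, not the actual netaddress source -- they could look like this:)

from pyparsing import And, Word

Digit      = Word("0123456789", exact=1)   # any single decimal digit
OneToNine  = Word("123456789", exact=1)
ZeroToFour = Word("01234", exact=1)
ZeroToFive = Word("012345", exact=1)

def repeat(parser, n):
    # match exactly n consecutive occurrences of parser
    return And([parser] * n)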

And now:

>>> from netaddress import IPv4address 
[snipped warning message]
>>> IPv4address.parseString("127.0.0.1")
([(['127', '.', '0', '.', '0', '.', '1'], {})], {})
>>> IPv4address.parseString("350.0.0.1")
Traceback (most recent call last):
File "", line 1, in ?
[snip]
egg/pyparsing.py", line 1244, in parseImpl
raise exc
pyparsing.ParseException: Expected "." (at char 2), (line:1, col:3)

Anyhow, what I mean to say is this: we have a validating URI parser now. Apart from the bugs that are still to be expected for a piece of code at this early stage, it should be RFC 3986 compliant. You can get either the Python package, or a tarball of the darcs repository (unfortunately my zope account chokes on the "_darcs" directory name, so I'm still looking for a good way to host the darcs repository).


This is how one would use it:

>>> from netaddress import URI
>>> uri = URI.parseString("http://localhost:6221/foo/bar")
>>> uri.port
'6221'
>>> uri.host
'localhost'
>>> uri.scheme
'http'

Or, in the case of a more complex parse:

>>> uri = URI.parseString("http://vincent@localhost:6221/foo/bar")
>>> uri.asDict().keys()
['scheme', 'hier_part']
>>> uri.hier_part.path_abempty
(['/', 'foo', '/', 'bar'], {})
>>> uri.hier_part.authority.userinfo
'vincent'
>>> uri.hier_part.authority.port
'6221'

Hope you find this useful.


Fixing urlparse: A case for Parsec and pyparsing

In a previous post, I described issues with parsing and validating URLs using the functionality provided by Python's stdlib. Let me restate clearly: all messages exchanged by angel-app nodes must be validated for the application to work properly. So what to do? First of all, I was of course not the first person to notice the module's shortcomings. However, I was surprised at the answers that popped up: it seems like no one was interested in actually coming up with a validating parser (perhaps even just for a subset of the complete URI syntax); instead, people focussed on fixing specific cases where the parser would fail -- in essence adding new features rather than putting the whole system on a solid basis. Suggestions go so far as to propose a new URI parsing module. However, the proposed new module is again based on the premise that the input represents a valid URI; the behavior in the case of invalid input is again left undefined. WTF? Have these people never looked beyond string.split() and regexes?


Dudes, writing a VALIDATING PARSER is NOT THAT HARD if you have a reasonable grammar and good libs. Why do people keep pretending that it is? Sure, you might be afraid of having to fire up lex, yacc and antlr, and for good reason. But with sufficiently dynamic languages that's usually unnecessary, provided you have a parser combinator library handy.


The key idea behind parser combinators is that you write your parser in a bottom-up fashion, in just the same way that you would define your grammar: you write a parser for a small part of the grammar, then combine these partial parsers to form a complex whole. The canonical example in this context is Haskell's parsec library. Let's start out with a simple restricted URI grammar:

module RestrictedURI where

import Text.ParserCombinators.Parsec

data URI = URI {
    host :: [String],
    port :: Int,
    path :: [String]
} deriving (Eq, Show, Read)

schemeP = string "http" <?> "scheme"
schemeSepP = string "://" <?> "scheme separator"

hostPartP = many lower <?> "part of a host name"
hostNameP = sepBy hostPartP (string ".") <?> "host name"

pathSegmentP = sepEndBy1 (many1 alphaNum) (string "/") <?> "multiple path segments"
pathP = do {
    root <- string "/" <?> "absolute path required";
    segments <- pathSegmentP;
    return (root:segments)
} <?> "an absolute path, optionally terminated by a /"

restrictedURIP :: Parser URI
restrictedURIP =
    do {
    _ <- schemeP;
    _ <- schemeSepP;
    h <- hostNameP;
    p <- pathP;
    return (URI h 80 p)
    } <?> "a subset of the full URI grammar"


parseURI :: String -> (Either ParseError URI)
parseURI = parse restrictedURIP ""

But just to illustrate:

vincent$ ghci 
GHCi, version 6.8.1: http://www.haskell.org/ghc/ :? for help
Loading package base ... linking ... done.
Prelude> :l restrictedURI
[1 of 1] Compiling RestrictedURI ( restrictedURI.hs, interpreted )
Ok, modules loaded: RestrictedURI.
*RestrictedURI> parseURI "http://localhost.com/foo/bar"
Loading package parsec-2.1.0.0 ... linking ... done.
Right (URI {host = ["localhost","com"], port = 80, path = ["/","foo","bar"]})

Plus, we get composability, validation and error messages essentially for free:

*RestrictedURI> parseURI "http://localhost2.com/foo/bar" 
Left (line 1, column 17): unexpected "2" expecting lowercase letter,
"." or an absolute path, optionally terminated by a /

Now consider the following excerpt from Haskell's Network.URI.

--  RFC3986, section 3.1
uscheme :: URIParser String
uscheme =
    do { s <- oneThenMany alphaChar (satisfy isSchemeChar)
       ; char ':'
       ; return $ s ++ ":"
       }

(The full source is available from the Haskell web site.) And compare that to the ABNF found in the corresponding section of the RFC:

scheme      = ALPHA *( ALPHA / DIGIT / "+" / "-" / "." )

Note how the complete URI grammar specification in the RFC is barely a page long. So yeah, implementing this grammar is a significant amount of work (of course you could always choose to support just a well-defined subset), but if you have a good parser combinator library, it's just a few hours of mechanically transforming the ABNF into your parser grammar. You can even watch the Simpsons while doing it (I did). In the case of Network.URI, this boils down to a line count of 1278, with about half of the lines being comments or empty lines. Not only that, but given the complete grammar specification, it's super easy to formulate a modified grammar.


As it turns out, Python has a library quite like parsec: it's called pyparsing, and I'll bore you with it in my next (and last) post on this topic.
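
As a teaser, the RFC's scheme rule above translates almost symbol for symbol. A sketch (my own transcription, not taken from any existing module):

from pyparsing import Word, alphas, alphanums

# ABNF: scheme = ALPHA *( ALPHA / DIGIT / "+" / "-" / "." )
scheme = Word(alphas, alphanums + "+-.")

Word's two-argument form takes the set of allowed first characters and the set of allowed body characters, mirroring the ABNF's ALPHA *( ... ) structure exactly.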

Think you can Trust Python's stdlib? Think again.

It's been a while since I blogged about Ken Thompson's Reflections on Trusting Trust. This week I was bitten hard by its moral:

The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.

The task seemed simple enough. We had been passing around links between clones in a URL-like format of the type ${host}:${port}/${path}, with a small custom parser (an ugly hack) for parsing and unparsing these things. As we adapted the code to support IPv6, it turned out that in many cases (i.e. unless the nodename field was configured), raw IPv6 addresses would be passed around, and the parser would of course choke on those. Fair enough, I thought, time to use the established standards and

import urlparse 

Now this is supposed to split the URI into parts corresponding to scheme, host, path etc., like so:

>>> urlparse.urlparse("http://foo.com/bar") 
('http', 'foo.com', '/bar', '', '', '')

Of course, most nodes still had the old clone links lying around, and I was surprised by the parse for these entries:

>>> urlparse.urlparse("foo.com:6221/bar") 
('foo.com', '', '6221/bar', '', '', '')
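
And for the raw IPv6 addresses that triggered this whole exercise, the result is no more useful (a sketch from memory; the exact tuple may vary across Python versions):

>>> urlparse.urlparse("::1:6221/bar")
('', '', '::1:6221/bar', '', '', '')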

Hmm. OK. Let's look at the internals of that parser, and vi urlparse.py:

def urlsplit(url, scheme='', allow_fragments=1):
    """Parse a URL into 5 components:
    <scheme>://<netloc>/<path>?<query>#<fragment>

    [snip]

    (e.g. netloc is a single string) and we don't expand % escapes."""
    key = url, scheme, allow_fragments
    cached = _parse_cache.get(key, None)
    if cached:
        return cached
    if len(_parse_cache) >= MAX_CACHE_SIZE: # avoid runaway growth
        clear_cache()
    netloc = query = fragment = ''
    i = url.find(':')
    if i > 0:
        if url[:i] == 'http': # optimize the common case
            scheme = url[:i].lower()
            url = url[i+1:]
            if url[:2] == '//':
                netloc, url = _splitnetloc(url, 2)

            [snip]

        else:
            scheme, url = url[:i].lower(), url[i+1:]

            [snip]

    return tuple

(Why do blogs always _INSIST_ on fucking up source code? But we're kind of on topic, so maybe this fits.) Anyhow, we have a fancy caching scheme, but the parser itself consists of a bunch of if statements and string splits. Talk about premature optimization. More than that, one would think that language implementors know a thing or two about parsers...

Consider: the parser is written in such a way that the result is predictable if and only if the input string represents a valid URL. But how do you find out whether a string is indeed a URL? The answer is easy: you use a parser. In other words, the urlparse module is in most cases useless, because unless you have sufficient control over the input (unlikely for networking apps), the parse result is essentially undefined.

However, the urlparse module is not only "useless", it is in fact dangerous: by using it on untrusted input, the behaviour of your app is by implication also essentially undefined (how do you handle an undefined result?). Now consider the following quick Google Code search. I don't suppose any of the following names rings a bell with you: Zope, Plone, twisted, Turbogears, mailman, django, chandler, bittorrent. Surely all of these software packages have carefully reviewed all of their uses of urlparse, and properly identify and handle all cases where an arbitrary result may be returned... Script kiddies, REJOICE!
