The OPEN SYSTEM that constitutes the Internet's internal and fundamental "user
logic" is transforming into a mechanism of restriction and surveillance,
becoming a potential network of exclusive "gated communities" within
the expanding digital universe. Instead of the Internet remaining the open
forum its users wish it to be, threats of its annihilation are being
sounded, not only by governments and their (often ill-informed)
legislators, but also by many of the major "gatekeeper" and mass-market
businesses forming the Internet backbone, many of which are on the verge
of bankruptcy or merger. How may the concepts of expressive freedom,
mobility, identity, and market (including those which are artistic,
non-commercial, and fiscally "value-less", along with those which are
genuinely "for-profit") be preserved within the current "Inter_Net"?
How do artists deal with these issues, and how do activists reveal such
impending "closures" and draconian restrictions of the Net? The seminar
will examine how the emergent fields of intelligent and generative
systems, swarm technologies, free networks, and open source development,
along with parallel and free-association models, are the Open Systems
creating our future "Post-Web" environments.
Armin Medosch's (soon to be published) book on "Free Networks" forms
the starting point for an examination of these environments of open
cultural and artistic production. Free, not in the sense that they do
not cost any money, but free in their philosophy of user/public access,
wireless structures, mobility, and open-sourcing of content. Extending
Medosch's notion of Piracy (understood as a positive term, in contrast to
"theft or burglary"), such networks, systems, and collaborative processes
begin to produce the foundations of the sustainable networks,
intelligent structures, and tactical tools the Internet has been
envisioned as being. These global structural trajectories develop
new semiotic structures which themselves provide the means and
environments for a renewed understanding of digital expression and data
perception, and which work at keeping the System Open.
Introductory position by Armin Medosch
Drazen Pantic _ Open Dialogue / Interaction Systems
Seda Guerses _ Security/Artistic expression as Open System
Julian Priest _ Wireless Community Networking / Spectrum Occupation
Respondent: Rainer Kuhlen _ Participatory Democracy, Media Models
Marco Dorigo _ Autonomous Robots / Swarm Technology
Jo Walsh _ Semantic Structures / Intelligent Interfaces
Michael Bitterman _ A.I. Collaborative Spaces
Respondent: Alexei Shulgin _ Basic Systems / Technological
Open Systems - Glossary
The term system covers both the functional elements and their interactions. The functional relations among a system's elements give rise to its specific properties and capacities: the system properties. The individual system elements can be grouped together to depict a system in a block diagram; block diagrams are instructive aids for understanding the nature of a system's elements and their combinations. They form the basis for further system-analytic treatment of the data, serving to take stock of facts and to represent the structure of effects. A whole range of structures and processes in biology can be depicted in this way.
A system consists of a number of system elements. It is important to distinguish system elements at the same level from those at higher or lower levels of hierarchy. Systems at lower levels are sub-systems of a higher system. Living systems, for example, can be arranged according to the following hierarchy:
cells - multicellular organisms - societies - ecosystems.
Complex systems involve phenomena characterized by interactions of individual agencies (sub-systems or elements) that self-organize at a higher systems level and exhibit emergent and adaptive characteristics. The complex behavior and hierarchical structures evidenced in most social organizations suggest that the relationships of sub-organizational elements are such that they perform specific autonomous functions and are nearly decomposable.
Information theory explains organized complexity in terms of the reduction of entropy (disorder) that is achieved when systems absorb energy from external sources and convert it into pattern or structure. Complex systems take in data from their environments, find regularities in the data, and compress these perceived regularities into internal models that are used to describe and predict their futures. Adaptation results from the selection pressures of specific environmental conditions, which are often composed of many degrees of uncertainty. The conditions under which complexity emerges can best be described as intermediate between chaos and order. Because social organizations can be understood as non-linear, dynamic artificial systems, theories of complexity can be used to characterize properties of theirs that admit both algorithmic and semantic description.
Complex systems (artificial or natural) are composed of very large numbers of elements that interact simultaneously and in parallel, including certain computational systems, networks, and databases. Such systems exhibit self-organizing behavior, are auto-catalytic, are nearly decomposable, and are sensitive to initial conditions when they are in the chaotic regime. A significant phenomenon observed in complex systems is their non-deterministic bifurcation, evidenced in dynamic trajectories, which emerge as higher-level processes and include adaptive properties resulting from interactions among simpler ones.
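The sensitivity to initial conditions mentioned above can be illustrated with a minimal sketch (not from the text itself): the logistic map, a standard one-line example of deterministic chaos. All names and parameter values here are illustrative.

```python
# Sensitivity to initial conditions, sketched with the logistic map
# x_{n+1} = r * x_n * (1 - x_n).  At r = 4.0 the map is in the chaotic
# regime: two trajectories starting a hair apart soon bear no resemblance.

def logistic_trajectory(x0, r=4.0, steps=30):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.300000)
b = logistic_trajectory(0.300001)   # perturbed by one part in a million

# The tiny initial gap is amplified at every step until the trajectories
# are fully decorrelated, even though the rule is strictly deterministic.
print("largest gap:", max(abs(x - y) for x, y in zip(a, b)))
```

The same behaviour, deterministic yet unpredictable, is what the entry claims for complex organizations such as networks and databases.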
A branch of non-linear science that deals with relatively complex open systems: systems that are displaced from stable equilibrium by means of a flow of energy, matter, information or, for example, money. Here the boundaries of Chaos Theory merge with those of Systems Theory, Complexity Theory, and René Thom's Catastrophe Theory.
Open systems have been defined as encapsulated, reactive, and spatially and temporally extensible systems: They are composed of possibly many encapsulated components, each of which is normally described as an object. Each component supports one or more public interfaces. The inner workings of each component are separately specified via a normally inaccessible implementation description (often a class). Open systems are reactive in that they do not just perform a single one-time service; most functionality is provided "on-demand", in reaction to a potentially endless series of requests. Open systems are spatially extensible in that components need not bear fixed connectivity relations to each other. They may interact via message-passing mechanisms ranging from local procedure calls to multi-threaded invocations to asynchronous remote communication. And open systems are temporally extensible to the extent that their components and functionality may be altered across a system's lifetime, typically by adding new components and new component types (perhaps even while the system is running) and/or updating components to support additional functionality.
For example, an open system supporting financial trading might include a set of components that "publish" streams of stock quotes, others that gather quotes to update models of individual companies, others serving as traders issuing buys and sells of financial instruments, and so on. Such a system possesses each of the above features: It relies on interfaces (e.g., QuotePublisher) with multiple implementations. It contains reactive components handling a constant influx of messages. It is intrinsically distributed in order to deal with world-wide markets. And it may evolve in several ways over time; for example, to accommodate a new company listed on an exchange, or to replace components computing taxes with interoperable versions reflecting tax-law changes.
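The features named above can be sketched in a few lines of code. Only the name QuotePublisher comes from the text's own example; every other class, method, and value here is invented for illustration.

```python
# A minimal sketch of an open system's features: a public interface with
# hidden implementations, and a reactive component acting on demand.

from abc import ABC, abstractmethod

class QuotePublisher(ABC):
    """Public interface: other components depend on this, not on internals."""
    @abstractmethod
    def next_quote(self) -> float: ...

class StaticPublisher(QuotePublisher):
    """One encapsulated implementation; its inner workings stay hidden."""
    def __init__(self, quotes):
        self._quotes = list(quotes)      # inaccessible implementation detail
    def next_quote(self) -> float:
        return self._quotes.pop(0)

class Trader:
    """Reactive component: one action per incoming request, on demand."""
    def __init__(self, publisher: QuotePublisher, threshold: float):
        self.publisher, self.threshold = publisher, threshold
        self.orders = []
    def react(self):
        quote = self.publisher.next_quote()
        self.orders.append(("buy" if quote < self.threshold else "sell", quote))

trader = Trader(StaticPublisher([98.0, 103.5]), threshold=100.0)
trader.react(); trader.react()
print(trader.orders)   # [('buy', 98.0), ('sell', 103.5)]
```

Temporal extensibility is visible in the design: the StaticPublisher could be swapped for any other QuotePublisher implementation (say, a network feed) without touching the Trader.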
Systems that live in a continuous exchange with their environment are called open systems.
Living systems require the continuous uptake of energy and nutrients from their environment, and must excrete and react in specific ways. Cells must therefore, just like all other biological systems, be regarded as open systems that are characterized by inputs and outputs and an in-between element of transition. They are never in a stationary equilibrium but always in a steady state. As long as we do not know what happens in the transitional element (in our case the cell), it can, according to systems theory, be regarded as a black box. The relation between input and output characterizes a flow of information through the system. Physical or chemical energy may influence the system through the input and thus cause certain changes that may in turn have an influence on other systems or system elements via the output. From a cybernetic point of view, neither the inner structure of the transitional element (in the case of the cell: the requirements mentioned above) nor the form of the energy is of importance. Only the time course of the input and output signals, and the connection between them, is decisive.
In the simplest case, transitional elements or systems operate linearly; the input and output signals are then proportional. But their function is usually much more complex, so that the output signal is noticeably changed. Such changes can be captured mathematically and described by formulas that are normally differential and integral equations of first or higher orders. This means that the output signal may take on any state.
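A black-box transitional element of this kind can be sketched numerically. The example below (an assumption for illustration, not taken from the text) uses the simplest non-trivial case, a first-order linear element described by the differential equation dy/dt = (x - y)/tau, integrated with Euler steps.

```python
# A black-box element connecting an input signal to an output signal.
# Its internal rule is the first-order equation dy/dt = (x - y) / tau;
# tau and dt are illustrative parameter values.

def first_order_response(inputs, tau=5.0, dt=1.0):
    """Return the output signal of a first-order element for an input signal."""
    y, outputs = 0.0, []
    for x in inputs:
        y += dt * (x - y) / tau      # Euler step of the differential equation
        outputs.append(y)
    return outputs

step_input = [1.0] * 20              # input jumps from 0 to 1 and stays there
response = first_order_response(step_input)

# The output is not proportional to the input: it rises gradually toward
# the input level instead of jumping with it.
print(round(response[0], 3), round(response[-1], 3))   # 0.2 0.988
```

From outside, only this input/output time course is observable; the internal equation plays the role of the inaccessible black box.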
Cybernetics, the science of organization.
The twin domains of cybernetics and systems science study all forms of "organized complexity", that is, different components assembled in a way that is neither random nor repetitive so as to form a "system". Organization can be defined as structure with function, that is, a system of components arranged in such a way as to fulfill a certain purpose. The main insight that cybernetics has contributed to the understanding of organization is that of the control system. A control system is involved in a negative feedback cycle, so that its output (actions) influences its input (perception), in such a way as to bring the perception as close as possible to its goal. Cybernetics has shown that all forms of goal-directed action are based on such cycles.
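The negative feedback cycle described above can be shown in miniature. The thermostat-style example below is a standard illustration of cybernetic control, not taken from this text; all values are assumptions for the sketch.

```python
# The cybernetic control cycle: the system's action (heating) feeds back
# into its perception (temperature), driving the perception toward the goal.

def control_cycle(temperature, goal=20.0, gain=0.5, steps=10):
    """Run a negative-feedback loop; return the perceived temperatures."""
    history = []
    for _ in range(steps):
        error = goal - temperature    # compare perception with the goal
        temperature += gain * error   # action proportional to the error
        history.append(temperature)   # the action changes what is perceived
    return history

history = control_cycle(temperature=10.0)
# With gain 0.5 each cycle halves the remaining error, so the perception
# converges on the goal.
print([round(t, 2) for t in history])
```

Goal-directedness here is nothing but the loop itself: output influencing input so as to bring perception as close as possible to the goal.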
After this insight was formulated in the 1940s, it led to plenty of applications. Yet the emphasis was on the organization of a given system rather than on the question of where that organization had come from. This emphasis on rigid structures led to a counter-reaction in the 1970s: second-order cybernetics. The rationale behind this approach was that systems are not given, physical entities, but models constructed by an observer. To clearly distinguish itself from the more mechanistic approaches, the movement emphasized autonomy, self-organization, cognition, and the role of the observer in modelling a system. Its proponents began with the recognition that all our knowledge of systems is mediated by our simplified representations, or models, of them, which necessarily ignore those aspects of the system that are irrelevant to the purposes for which the model is constructed. Thus the properties of the systems themselves must be distinguished from those of their models, which depend on us as their creators. An engineer working with a mechanical system, on the other hand, almost always knows its internal structure and behavior to a high degree of accuracy, and therefore tends to de-emphasize the system/model distinction, acting as if the model were the system.
Moreover, such an engineer, scientist, or "first-order" cyberneticist, will study a system as if it were a passive, objectively given "thing", that can be freely observed, manipulated, and taken apart. A second-order cyberneticist working with an organism or social system, on the other hand, recognizes that system as an agent in its own right, interacting with another agent, the observer. As quantum mechanics has taught us, observer and observed cannot be separated, and the result of observations will depend on their interaction. The observer too is a cybernetic system, trying to construct a model of another cybernetic system (see constructivism). To understand this process, we need a "cybernetics of cybernetics", i.e. a "meta" or "second-order" cybernetics.
Unavailable energy or molecular disorder. Entropy is at a maximum when the molecules in a gas are at the same energy level. Entropy should not be confused with uncertainty. Uncertainty is at a minimum when all elements are in the same category.
THERMODYNAMIC ENTROPY: the quantity of energy no longer available to do physical work. Every real process converts energy into work, or a condensed form of energy, plus waste. Some waste may be utilized in processes other than those generating it (see recycling), but the ultimate waste, which can no longer support any process, is energy in the form of dispersed heat (see second law of thermodynamics). All physical processes, despite any local and temporal concentration of energy they may achieve, contribute to the increased overall dispersion of heat. Entropy therefore irreversibly increases in the known universe.
The measure of a system's energy that is unavailable for work. Since work is obtained from order, the amount of entropy is also a measure of the disorder, or randomness, of a system.
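The information-theoretic counterpart of these definitions, Shannon entropy, makes the order/disorder intuition computable. The sketch below is an illustration added here, not part of the glossary source.

```python
# Shannon entropy H = -sum(p * log2(p)) of a discrete distribution:
# maximal when all states are equally probable (maximum disorder),
# zero when one state is certain (no uncertainty).

from math import log2

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete probability distribution."""
    return sum(-p * log2(p) for p in probabilities if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]    # maximal disorder among four states
certain = [1.0, 0.0, 0.0, 0.0]        # one state certain, no disorder
print(shannon_entropy(uniform), shannon_entropy(certain))   # 2.0 0.0
```

Note that this measures uncertainty over states, which the entry above is careful to distinguish from thermodynamic entropy proper.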
Systems in which autonomy, emergence, and distributedness replace control, preprogramming, and centralization. These designs are proving flexible and robust, able to adapt quickly to changing environments and to continue functioning even when individual elements fail.
A term developed by the biologists Humberto Maturana and Francisco Varela to denote a form of system organization in which the system as a whole produces and replaces its own components, and differentiates itself from its surrounding environment on a continual basis. It is being applied in such diverse fields as software engineering, sociology, economics, and law. The question of how autopoiesis can or should be applied to social systems is an ongoing topic of debate...
Organization science explains autopoiesis in terms of social systems as realized primarily in linguistic, consensual domains. Language, as a consensual domain, is a patterning of behavior that possesses a shared orientation. Social systems exercise influence on individual participants through allowances for, and regularities in, their interactivity. This influence is recursively employed on the emergent social system through the participants' continued interaction.
The processes utilized by systems that differentiate themselves from other systems on a continual basis through operational closure (Maturana and Varela 89), and that produce and replace their own components in the process of interaction with their environment (structural coupling, ibid. 75), take place through a membrane containing the organization of the unity in question, thus allowing distinction between it and its environment (ibid. 46). Such systems also exhibit great versatility and plasticity, allowing expansion of possible behaviors (ibid. 138), reproduction with conservation of adaptation, and motility. A basic question for any analysis (or design) based on autopoietic theory involves distinguishing the membrane, or interface, where operational closure (inside) and structural coupling with an environment (outside) are expressed, because the membrane is the plane of distinction that allows any observation of plasticity, reproduction, and motility. Until we know where the membrane is, we cannot analyze an autopoietic system, let alone design one.
Autopoiesis is a type of organisational closure; note that such closure is not the same as thermodynamic closure: the autopoietic system is open to the exchange of matter and energy with its environment, but is autonomously responsible for the way these resources are organised. Maturana and Varela have postulated autopoiesis to be the defining characteristic of living systems. Another fundamental feature of life, self-reproduction, can be seen as a special case of autopoiesis, in which the self-produced components are used not to rebuild the system but to assemble a copy of it...
The traditional notion of a database is an organized information store which can be queried and updated via a software application in a very controlled manner. At the extreme, however, a database utilizing an autopoietic conceptual scheme would not have a data store at all. It would query the environment/knowledge domain directly. There would be no map or representation of the domain beyond the domain itself. The environment surrounding the application would itself be the database, and the autopoietic application would draw from that source via its structural coupling with that environment. Queries to the application would return concerns and reflections of the ontology of the environment, rather than internally consistent conclusions (facts) generated in the domain. Such applications could be useful in evaluating systems where traditional predictive modeling and data mining techniques are initially impractical.
The nomad adheres to the skilled measure of territorial itinerancy, one that gains its meaning only from its heterogeneous relationship to other measuring machines, and by their antagonism in relation to the non-measurable Body without Organs. The nomadic itinerant measure partly involves the act of bringing something into the realm of the symbolized and known way of perceiving. But it also does not come to prioritize a particular manner of perception above all others, or above no perception at all. This is precisely the non-measurable agency initiated when the war machine refuses to take war as its direct object, and only presumes its possibility as a supplementary Idea. Direct, perceptible war must exist, for any act of warding off must anticipate to some extent the very object seeking to be averted: It is necessary to demonstrate that what does not yet exist is already in action in a different form than that of its existence (ATP 431). The overwhelming marker of the State is its incorporatizing of flows, the centripetal movement of heterogeneity into a partial homogeneity described as isomorphy. This centripetal movement must already be in existence in itinerant territories, but must cancel itself out at the point of inversion where the territory would cross the threshold into a new assemblage, into a State assemblage. It is the essence of this threshold that begins to take on significance, the degree at which, or beyond which, what has been anticipated and then prevented for so long ceases to be conjured away and consequently arrives as a direct and perceptible object. For this reason, it has been important to be wary and not blindly take for granted the efficacy of even the most marginal of anti-State forces like the war machine, and acts of deterritorialization, especially when such resistant machines begin to phenomenalize their antiproductive essences.
It is not so much a matter of unveiling the logic of a dominant center imposed upon an oppressed periphery; for this assumes that all power originates from the center and flows outward. The moment at which domination is decided is at the threshold, the matrix in which very different orders are placed in communication. A nomadology, as a warring movement with that which has come to hold dominion in the perceptible world, with the modes of production that have become available, would speak of the silence that points to everything uniquely inexplicable, the abundant interests that refuse to be contained within any isomorphic apparatus of capture. A nomadology would offer in place of the global free market and its axiomatic logic of deterritorializing singular diversities so that all may speak on a common planetary playing field, the extra-ideological democracy of singular diversity itself in the movement of an itinerant territoriality.
Conceptualizing attractors as perturbation patterns of linguistic activity suggests redirecting attention from a simple structural orientation to one in which the state transitions of an organization as a system are more fully explored. This is particularly true in complex organizations such as databases or networks, which are clearly deterministic yet unpredictable. It would seem, for example, that chaotic patterns of prehensive activity emerge in specific clustering and nearest-neighbor representations. Based on the features of deterministic chaos, a prehension-based interpretation may illuminate how an organization is bound to seek new patterns as well as sustain its tendency for adaptation.
"The French expression for an entailment mesh, which is due to Peter Burch, is 'Maille d'entrainment', or roughly 'spider's-web of entrainment'. I prefer it to the English-language original for, on the one hand, it rightly states that the web is woven by the spiders living in it and, on the other hand, that the unfoldment of a mesh is the entrainment of those dynamic conceptual and coherent entities which render it real." Indeed, the web-and-spiders metaphor is better than the skeletal-leaf metaphor because it is a more dynamic and very suggestive means for understanding the action of entailment meshes in teaching and learning conversations. The co-construction of agreed entailment meshes through conversations (like the spinning of a spider's web), and their use to help determine and inform vital actions in the life-world (like catching and devouring flies for nourishment), is quintessentially human.
The emergent result of conversations among (lower-order) P-individuals which generate agreements. The claim that "conversations" are the means by which new, higher-level "participants" (P-individuals) are procedurally constructed (caveat: see Lofgren, 1993, and below) is argued by Pask in Conversation, Cognition and Learning (p. 164, 1975). This hypothesis is ontologically similar not only to Wittgenstein's later position, but also to that of Jurgen Habermas (1981/84), who believes that our common vested interest in discursively promoting understanding, or "communicative action", is precisely what is constitutive of the human "lifeworld". More recently, one might also cite Merlin Donald's mimetic-interaction-based Origins of the Modern Mind (1991) and Kenneth Gergen's social-constructionist Realities and Relationships (1994) as well-grounded complementary theoretical constructs. How some of this may be bio-physically realised is plausibly exhibited in Gerald Edelman's Extended Theory of Neuronal Group Selection (Edelman, 1992).
The basic elements of an entailment mesh are called topics. They are "public" concepts, in the sense that their meaning is shared by a number of conversational participants. These concepts are connected through coherences. A coherence is a collection or cluster of topics which are so interrelated that the meaning of any topic of the coherence can be derived from the meaning of the other topics in the cluster. In other words, the topics in a coherence entail or define each other. A simple example of a coherence is the cluster (pen, paper, writing). This means that we can somehow start from the concepts of writing and paper and produce the concept of an instrument that allows you to write on paper: a pen. Complementarily, we can start from the notions of pen and paper and derive from them the activity you do when applying a pen to paper: writing.
An entailment mesh is then a complex of overlapping coherences, that is, a collection of topics and coherences such that every topic belongs to at least one coherence.
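These definitions map directly onto a small data structure. The sketch below uses the text's own (pen, paper, writing) coherence plus one invented overlapping coherence for illustration; the function names are assumptions.

```python
# An entailment mesh as a data structure: topics grouped into overlapping
# coherences.  Any topic can be "derived" from the remaining topics of a
# coherence it belongs to.

coherences = [
    {"pen", "paper", "writing"},
    {"paper", "pulp", "pressing"},   # invented; overlaps the first at "paper"
]

def derive(topic):
    """Return the sets of topics from which `topic` can be derived."""
    return [c - {topic} for c in coherences if topic in c]

def is_mesh(coherences, topics):
    """Mesh condition: every topic belongs to at least one coherence."""
    return all(any(t in c for c in coherences) for t in topics)

# "paper" is derivable two ways, one per coherence it belongs to:
print([sorted(c) for c in derive("paper")])   # [['pen', 'writing'], ['pressing', 'pulp']]
```

The overlap at "paper" is what makes this a mesh rather than a set of isolated clusters.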
An extension of entailment meshes: directed-graph representations governed by the 'bootstrapping axiom', which determines which concepts are to be distinguished or merged, through constant restructuring and elicitation of the conceptual network.
Entailment nets are then generalized to associative nets characterized by weighted links. Learning algorithms are presented which can adapt the link strengths, based on the frequency with which links are selected by hypertext browsers.
An agent is anything that can perceive its environment through sensors and act upon that environment through effectors...
An ideal rational agent performs the sequence of actions that maximizes its expected performance measure, given the evidence of its percept sequence and its built-in knowledge... additional vital traits: autonomy and the ability to learn.
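A minimal agent of this kind can be sketched with the two-square vacuum world, a standard textbook illustration (an assumption added here, not part of this glossary): the agent perceives its location and whether it is dirty, and acts through effectors.

```python
# A simple reflex agent: percepts in through "sensors", actions out
# through "effectors".  Locations, statuses, and actions are illustrative.

def reflex_vacuum_agent(percept):
    """Map a percept (location, status) to an action."""
    location, status = percept
    if status == "dirty":
        return "suck"
    return "right" if location == "A" else "left"

# A tiny environment loop feeding percepts to the agent.
world = {"A": "dirty", "B": "dirty"}
location, actions = "A", []
for _ in range(4):
    action = reflex_vacuum_agent((location, world[location]))
    actions.append(action)
    if action == "suck":
        world[location] = "clean"
    else:
        location = "B" if action == "right" else "A"

print(actions, world)   # ['suck', 'right', 'suck', 'left'] {'A': 'clean', 'B': 'clean'}
```

This agent is purely reactive; an ideal rational agent would additionally learn and act autonomously rather than follow a fixed rule.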
Distributed artificial intelligence
(DAI) studies problem solving by multiple interacting knowledge-based processes. DAI tackles problem complexity by dividing the problem into smaller, manageable parts; introduces fault tolerance; reduces solution uncertainty by integrating different points of view; and increases performance through concurrency of execution.
A network in which the links between nodes are given some sort of weight or priority, and in which the strengths of these links vary according to the number of times they are traversed (or associated). Classical rules for strength adjustment date from Hebb. An example of this mechanism has been demonstrated by Bollen and Heylighen at the Principia Cybernetica Project.
An evolutionary algorithm allows for the creation and destruction of nodes and links according to their success. Variation in the nodes could come from versions of the node created by the participants, whilst nodes that are not linked to within a certain time could be downgraded or eliminated. Given these two operations the nodes at the site would undergo an evolutionary process.
Self-organised networks
One might argue that it is not the WWW's goal to simulate brains or neural networks, but to provide reliable and user-friendly access to stored knowledge. But it is questionable whether the present WWW--and the hypermedia paradigm in general [Nielsen, 1990]--succeeds in this [Jonassen, 1989; 1993]. The WWW's content is presently expanding at an enormous pace, but the quality of its structure does not seem to improve. This should not surprise us, as the only mechanism for network restructuring at present is the contribution of individual web-designers, each adding their own, often poorly designed, sub-networks to the WWW. The WWW, being no more than the sum of its parts, can achieve no better quality of structure than that of these sub-networks. This causes the WWW to be, in general, very poorly organized, which in turn seriously hampers efficient and user-friendly retrieval of information [Hamond, 1993]. With an ever-expanding amount of information being added to the WWW, this problem can only be expected to worsen within the present set-up.
We believe the only solution to these practical and fundamental problems is to move beyond metaphors and implement the necessary conditions to make the WWW really function in a more "brain-like" manner [Heylighen & Bollen, 1996]. As a first step in that direction, we tried to design a mechanism for the self-organization of a hypertext network. Our goal was to develop algorithms that would allow the WWW to autonomously change its structure and organise the knowledge it contains, by "learning" the ideas and knowledge of its human users as manifested in their browsing behaviour, thus producing a more ergonomic and user-friendly network.
We used three rules which were all inspired by the Hebbian principle of learning: the link between nodes of the network that have been activated within the same interval of time is reinforced [Hull, 1952; Thorndike, 1911]. This mechanism is entirely associative in nature and claims to achieve global optimisation of network structure through the adjustment of local connections and is as such in accord with the associative nature of the WWW and the absence of centralised control. In line with the principles of evolutionary epistemology [Campbell, 1974], our learning algorithms were also based on the principles of variation and selection, which are assumed to guide the evolution of knowledge [Heylighen, 1993]. Of the three learning rules, two produce variation by introducing new candidate links to the list of 10 "actual" links, the third one produces selection by rewarding or punishing already actual connections.
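The Hebbian mechanism described above can be sketched as follows. This is a loose illustration of the general idea, not the authors' exact three-rule algorithm; the reward scheme, function names, and values are assumptions.

```python
# Hebbian link adjustment for a hypertext network: links between
# consecutively browsed nodes are reinforced (variation), and each node
# keeps only its strongest outgoing links (selection).

from collections import defaultdict

strengths = defaultdict(float)        # (from_node, to_node) -> link strength

def learn_from_path(path, reward=1.0):
    """Reinforce links along one user's browsing path (co-activation)."""
    for a, b in zip(path, path[1:]):
        strengths[(a, b)] += reward

def actual_links(node, k=10):
    """Selection: a node's k strongest outgoing links become 'actual' links."""
    out = [(b, s) for (a, b), s in strengths.items() if a == node]
    return sorted(out, key=lambda x: -x[1])[:k]

# Three browsing sessions; the most-traversed link (A -> B) wins.
for path in [["A", "B", "C"], ["A", "B"], ["A", "C"]]:
    learn_from_path(path)

print(actual_links("A"))   # [('B', 2.0), ('C', 1.0)]
```

Note that the adjustment is purely local, in line with the text's point about the absence of centralised control.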
Heterarchical structure: where concepts mutually produce each other.
Another knowledge representation scheme from AI, the semantic network (Brachman, 1977; Shastri, 1988; Sowa, 1991), is based on similar nets of interdependent concepts, but here the dependencies are classified into distinct types with specific interpretations. For example, different types of relations might specify that a concept a "causes" another concept b, that a "is a part of" b, or that a "is a special case of" b. The motivation underlying semantic networks is that concepts get their meaning through the semantic relations they have with other concepts. This is similar to the bootstrapping philosophy underlying entailment meshes and entailment nets.
Emerge when communication and computing technologies amplify human talents for cooperation. The impacts of smart mob technology already appear to be both beneficial and destructive, used by some of its earliest adopters to support democracy and by others to coordinate terrorist attacks. The technologies that are beginning to make smart mobs possible are mobile communication devices and pervasive computing - inexpensive microprocessors embedded in everyday objects and environments. Already, governments have fallen, youth subcultures have blossomed from Asia to Scandinavia, new industries have been born and older industries have launched furious counterattacks.
Inexpensive microprocessors embedded in everyday objects and environments. Characterised by being numerous, casually accessible, often invisible computing devices, frequently mobile or embedded in the environment and connected to an increasingly ubiquitous network structure.
The Internet has, in a very short period, become the indisputable arena of open and global artistic expression and cultural activity. However, the basic structural tenets of the Internet's conceptual free movement of expression are increasingly in jeopardy. As global usership and technology expand, so do external, often draconian, attempts at control, monitoring, and censorship.
Research sources and bibliography
Nomadology/ Deleuze and Guattari resources
open systems ludwig von bertalanffy
Ashby homeostat/ neural networks/ perceptron/ pattern recognition/ disturbances
self programming machines
self modifying automata/ self programming machines/ genetic programming and evolvable machines
Autopoiesis and embodied cognition
conceptual integration/ blending
glossary genetic algorithms
glossary evolutionary computing
cellular automata/ evolutionary music/ generative art
Distributed, parallel and cluster computing
Ant algorithm and swarm intelligence
Distributed neural activity
Social structures and chaos theory
Complexity and emergence
glossary on nanotechnology
Ontology of organization as system
Bootstrapping social intelligence
Open Dialog Architecture/ beyond the boundaries of art systems
Wireless clouds of free internet access over urban areas
Theorizing the Radical Potential of Location-Aware Mobiles
Peer to peer systems
free networks/ PPA (Pico Peering Agreement)
Free networks around the world
Distributed, decentralized, multi-agent systems
IRIS Infrastructure for Resilient Internet Systems
Open source architectural practice
Distributed systems, voluntary cooperation
Collaborative Networked Communication: MUDs as Systems Tools
Cartography of contemporary control system
Technology and privacy
Internet and copyright
Free software as collaborative text
Maturana, H. and Varela, F. (1980) Autopoiesis and Cognition: The Realization of the Living. D. Reidel Publishing Co., London, England
Koolhaas, Boeri, Kwinter et al. (2000) Mutations. Actar, Barcelona, Spain
Mulder, A. (1998) We Living Systems, interview with Humberto Maturana, in The Art of the Accident. NAI Publishers/V2_Organisation, Rotterdam, Holland
Mulder, A. (2003) The Deep Pattern of Life, interview with Simon Conway Morris, in Information is Alive. NAI Publishers/V2_Organisation, Rotterdam, Holland
Mulder, A. (2000) The Deep Now, interview with Francisco Varela, in Machine Times. NAI Publishers/V2_Organisation
Lovink, G. (2003) My First Recession. NAI Publishers/V2_Organisation
Segment I: Thursday, November 27, 11.00 - 13.30
Moderator: Stephen Kovats
Respondent: Prof. Rainer Kuhlen
The "Free" in Free Networks
My definition of "free" in Free Networks is in line with the way Richard Stallman speaks about Free Software. He emphasises the notion of freedom of expression over the less important financial side of the argument for Free Software. Acknowledging that there are differences between writing software and building networks, the "free" in Free Networks can be better understood with the help of a layered communication model derived from the TCP/IP protocol stack. At the bottom layer is the freedom to physically build networks. The second layer is the access layer, the ability to have unrestricted access to networks. The third layer is the communication layer, which defines how communications can be structured within those networks. The fourth layer is the layer of media freedom, the right to use networks for mass media communication. Free Networks are mostly concerned with layers 1 and 2, which are infrastructural necessities that guarantee the higher-level freedoms of layers 3 and 4. Somehow above, around and in between all those layers sits the notion of free association, supported by semantic web structures, which could turn out to be the glue that holds it all together. However, the communication model of network freedom is not a technical model. What is most important is that all those efforts by the free network, free software and free media movements to safeguard different freedoms are driven by a desire for autonomy from the dominant forces in society, the state and large corporations, whose interests now more often than not seem to coincide and to conspire against the individual. Contrary to their repressive models of social organisation, the individual seeks to establish freedom by creating self-institutions. Reading 'institutions' as manifestations of the collective imagination, self-institutions allow us to retain the full potential of individual agency while at the same time becoming part of a collective entity.
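The four-layer model above can be sketched in a few lines of code. This is a minimal, illustrative sketch only: the layer names, example initiatives, and the `infrastructural` helper are invented for the example and are not part of any Free Networks specification.

```python
# Illustrative sketch of the four-layer model of network freedom.
from dataclasses import dataclass

LAYERS = {
    1: "physical: the freedom to build networks",
    2: "access: unrestricted access to networks",
    3: "communication: how communications are structured",
    4: "media: the right to use networks for mass media",
}

@dataclass
class Initiative:
    name: str
    layers: set  # which of the four freedoms this effort safeguards

def infrastructural(initiative: Initiative) -> bool:
    """Free Networks are mostly concerned with layers 1 and 2,
    the infrastructural layers that guarantee layers 3 and 4."""
    return bool(initiative.layers & {1, 2})

community_wifi = Initiative("community WiFi", {1, 2})   # hypothetical example
open_publishing = Initiative("open publishing", {4})    # hypothetical example

print(infrastructural(community_wifi))   # True
print(infrastructural(open_publishing))  # False
```

The point of the sketch is only that the freedoms are layered: an effort at the media layer presupposes, but does not itself provide, the infrastructural layers beneath it.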
Open Dialog: Public WiFi Network 2 Public Cable Network
Computers and the Internet have penetrated widely, both horizontally and vertically, through all strata of society. Without undermining the importance of TV and radio networks, we can safely conclude that the Internet has made a significant difference to public opinion, offering a diversity and variety of views and arguments to the general public. But the Internet is not the free, open, unfettered domain many once imagined. From the proliferation of proprietary, commercial standards and networks to new layers of restrictions now being imposed by governments worldwide, we see that digital space is subject to the same restricting forces and trends as conventional media and communications. Consequently, it is our intention to help create an Internet infrastructure that will actively promote dialog and foster possible reconciliation of diversities through open, immediate and unmoderated dialog. As an example of such work, we will present a project that tries to establish procedures and criteria for broadcasting to cable or satellite TV networks directly from remote locations, using a laptop, camera and any type of available broadband Internet connection, preferably WiFi.
Breaking the dichotomies: open source as a strategy in educational software
Educational software has been playing a central role in the commodification and rationalization of universities. The main driving force of most proprietary educational software is the dichotomous thinking that splits technology from content, or technology from pedagogy: form and content are assumed to be independent of one another. In this presentation, technological determinism is called into question, and strategies for intervening in technology production and opening up existing proprietary structures using open source software and philosophies are suggested.
Open or Closed Spectrum
From the growth of community wireless networks to the rollout of the first 3G networks across Europe, the Internet is being unwired. This has the potential to alter the topography of control within communications networks in profound ways. It can alter the relationship between telecommunications provider and consumer, change the competitive and co-operative landscape for telecoms, and allow the creation of temporary local infrastructures within a shifting data network. These developments are occurring within, and being shaped by, a legislative framework of radio spectrum regulation from another century that places strong limits on electromagnetic transmission and reception. What is the future of the open system in the closed spectrum, and how are technological and political developments creating a momentum for an open spectrum?
Segment II, Thursday, November 27, 14.30 -17.00
Moderator: Stephen Kovats
Respondent: Alexei Shulgin
Ilya Eric Lee
OSSF, the Open Source Software Foundry, is a national project building common facilities and an infrastructure of communication and collaboration for Taiwanese and global OSS developers. It is also a pilot project to involve and facilitate the local OSS communities with the help of law experts and business-related operational dialogues. OSSF conducts an annual community survey and investigation, and categorises OSS communities into different active "clusters", aiming to trigger and catalyse a chemical reaction among the emerging domains in which tools like open source software play an important role. For an island that is world-renowned for its OEM efficiency in computer hardware, the software industry has not yet received enough attention, nor has consciousness of software culture and the importance of openness. OSSF would be the glue connecting OSS zealots and target application communities, bringing people into practical dialogue with the real world and practising "open and collaborative development models".
Imagine you live in a world where whatever you do, whatever you think and whatever you dream would be relatively open source. And imagine the spaces you use, the political space, the social space and your information and work-space, would not be passive but would, to an extent and relatively directly, adapt to you. Always under construction. Some parts more stable (over time) than others. Actually not just adapting to you but also stimulating you. Would that not be confusing? That depends on the amount and context of the stable versus the unstable elements. And if you think about it, the described condition is neither radical nor new, but is in some cases relatively indirect and delayed. Would that not be dangerous? A high degree of networking proves much more stable against perturbations than a single link. That is why our brain does not forget in "chunks" of information but rather in terms of decreasing resolution. The Informationizer represents the relationship of data, information, knowledge and the users as a swarm phenomenon. It explains the basic properties of any adaptive environment and gives impulse to the discussion on how the Internet may evolve as an interactive information space.
"Common sense won't tell you. We have to tell each other." -DNA
A web of machine-intelligible, interconnected information, from simple pragmatic applications like RSS feeds of sites to aggregators and reasoners, is just the start. The semantic web is a rescue system for a web that is straining against its own growth.
As the net expands and deepens, centralised search and indexing, now the source of so much power and control, becomes increasingly less useful, rendering the results compromised and unintelligible. Huge meta-organisations use all-encompassing proprietary ontologies, classification systems with rules and reasoners like CYC and SUO, backed by natural language parsing and analysis techniques: a means of abstracting and codifying 'meaning' that is written from a single worldview.
On the semantic web now, we are publishing, sharing, annotating and collaborating on 'ontologies': vocabularies for describing domains in the world, with logical rules and constraints about them which allow us to make 'meaning' from the data we find published on the web and draw new conclusions from it.
Asking the wrong questions can lead to very weird answers; it is essential, and at one with the tenets of open source, to put these tools into real people's hands. RDF and OWL provide a substrate from which we can build and subscribe to a 'worldview' and share the ideas it demonstrates with each other.
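The core idea, publishing triples under a shared vocabulary and applying rules to draw new conclusions, can be shown with a toy example. This is plain Python standing in for a real RDF store; the vocabulary (`ex:memberOf`, `ex:fundedBy`, `ex:linkedTo`), the rule, and the names are all invented for illustration, in the spirit of the institution-mapping work described below.

```python
# Toy illustration of rule-based inference over published triples.
# A real system would use RDF/OWL and a reasoner; here, plain tuples.
triples = {
    ("Alice", "ex:memberOf", "ThinkTankA"),
    ("ThinkTankA", "ex:fundedBy", "CorpX"),
    ("Bob", "ex:memberOf", "CorpX"),
}

def infer(triples):
    """Rule (invented for this example): if a person is a member of an
    organisation funded by some funder, conclude the person is linked
    to that funder."""
    new = set()
    for (person, rel1, org) in triples:
        if rel1 != "ex:memberOf":
            continue
        for (org2, rel2, funder) in triples:
            if rel2 == "ex:fundedBy" and org2 == org:
                new.add((person, "ex:linkedTo", funder))
    return triples | new

closed = infer(triples)
print(("Alice", "ex:linkedTo", "CorpX") in closed)  # True
```

The conclusion that Alice is linked to CorpX appears in no published triple; it is derived. The worldview lives in the rule, which is exactly why it matters who gets to write and share such rules.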
Jo has been working on a semantic web toolset as part of the mutemap/mapping contemporary capitalism project; infrastructure to map and describe the connection network between various government, commercial, public institutions and their leading members; and to share and merge this data with related projects like http://hierarchies.org/ and http://theyrule.net/.
Building these systems to allow open contribution to, and collaboration on, a distributed knowledge base, and the tools to make sense with it, we hope to create a consensus-building machine, something that will stop us needing to pose questions about 'after the Internet'.
Much of Jo's work revolves around 'bots', tiny dumb agents that traverse the semantic web; a conversation with an instant message bot helps us explore the technology in this session.
http://mutemap.openmute.org/ - the 'infomesh'
http://www.foaf-project.org/ - Friend of a Friend: semantic web vapourware for the masses
http://space.frot.org/ - related work using the same toolset to describe and annotate physical space and wireless networks
Prof. Marco Dorigo
Swarm Intelligence: From ant colonies to computer algorithms
Ant colonies, and more generally social insect societies, are systems that, in spite of the simplicity of their individuals, present a highly structured social organization. As a result of this organization, ant colonies can accomplish complex tasks that in some cases far exceed the individual capacities of a single ant. The study of ant colony behavior and self-organizing capacities is interesting for computer scientists because it provides organization models that can be used to solve difficult optimization and control problems. In this talk, I will focus on a particular ant colony activity, foraging for food, and I will discuss how ant colony behaviors have inspired algorithms for the solution of difficult optimization problems and for the control of robot swarms.
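The foraging mechanism described above can be sketched as a minimal ant colony optimization loop: ants build paths probabilistically, favouring short edges with strong pheromone trails; trails evaporate; and shorter completed paths receive more pheromone. The graph, parameters and constants below are made up for illustration and do not come from the talk.

```python
# Minimal ant colony optimization sketch: shortest path A -> D.
import random

random.seed(0)

graph = {  # node -> {neighbour: edge length}
    "A": {"B": 1, "C": 4},
    "B": {"C": 1, "D": 5},
    "C": {"D": 1},
    "D": {},
}
pher = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone trails

def walk(start, goal):
    """One ant builds a path, choosing each edge with probability
    proportional to pheromone / edge length (no revisits)."""
    path, node = [start], start
    while node != goal:
        options = [(v, pher[(node, v)] / d)
                   for v, d in graph[node].items() if v not in path]
        if not options:
            return None  # dead end
        total = sum(w for _, w in options)
        r, acc = random.random() * total, 0.0
        for v, w in options:
            acc += w
            if r <= acc:
                node = v
                break
        path.append(node)
    return path

def length(path):
    return sum(graph[u][v] for u, v in zip(path, path[1:]))

best = None
for _ in range(50):                              # iterations
    paths = [walk("A", "D") for _ in range(5)]   # 5 ants per iteration
    for k in pher:                               # evaporation
        pher[k] *= 0.9
    for p in paths:
        if p is None:
            continue
        if best is None or length(p) < length(best):
            best = p
        for u, v in zip(p, p[1:]):               # deposit: shorter path, more pheromone
            pher[(u, v)] += 1.0 / length(p)

print(best, length(best))
```

The positive feedback between path quality and pheromone deposit is the self-organizing mechanism: no ant knows the shortest path, yet the colony converges on it.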
ant colony optimization home page: http://www.aco-metaheuristic.org/
swarm-bot project home page: http://www.swarm-bots.org