IFIP TC9 workshop on ICT uses in war and not-peace

Yesterday I had the honour of attending the IFIP TC 9 (“Relationship between Computers and Society”) Workshop on “ICT uses in Warfare and the Safeguarding of Peace”, held at the CSIR Convention Centre in Pretoria, South Africa. It covered an array of topics, such as cyber threats, network warfare, ICT for command & control and for socio-technical aspects, and militarising the FIFA World Cup 2010, which are summarised below, sandwiched between general comments on IT and the military.

Pre-workshop event and considerations

Beforehand, there had been some attempts to ‘massage’ me: one of the organisers had already figured out that I would possibly not be interested in the second part of the workshop, which would focus on ICT for destruction and damage control, and earlier in the week I had participated in a meeting with the CSIR Defence, Peace, Safety & Security (DPSS) unit, which, they thought, might interest me; as it unfolded during that meeting, the idea was that I would contribute my knowledge to enhance the PsyOps section of Information Warfare. PsyOps is an abbreviation for Psychological Operations, which is a euphemism for (1) torturing detainees beyond breaking point in ways that do not leave physical scars, so that proving it in court is more difficult[n1], (2) mind manipulation of the masses-at-home to swallow all sorts of things that violate the country’s constitution (Magna Carta, whichever) or the UN Declaration of Human Rights, or that amount to plain propaganda to distort reality, and (3) mentally preparing and patching up soldiers for battlefield operations. Not surprisingly, a psychologist signing up for PsyOps has to de-register from the list of accredited psychologists (at least, in South Africa they have to).

Unfortunately, and also not surprisingly, some of the computer science attendees of that meeting were not against hooking up with the DPSS to enhance PsyOps; after all, PsyOps has project money to give away. Researchers and engineers hooking up with third parties, be it the military or the industry that produces for the military, is a well-known and barely debated practice in computer science and engineering, as well as, though to a lesser extent, in other disciplines, for there is plenty of money to burn, unlike with government funding (in Italy, the Berlusconi government has even reduced the amount of project funds, and Italy was already at the bottom of the OECD %GDP ranking).
Setting aside the US, where, hearsay has it, at least 80% of computer science research is funded by the military or the military-industrial complex, I can equally take an example from Europe, though not one exemplary for all European Union member states (!!): some history [1]; an analysis showing that 31% of UK government research funding goes to military-type projects [2]; more recently, Nature devoted another article to the topic whose title says quite a lot, “UK universities in bed with military” [3], where not just the amount of collaboration but also the secrecy around it raises several issues; and yet another, scholarly, paper that looks at the links between universities and the military, with its focus on destruction [4]. Regarding the latter, it is not necessarily the case that the military focuses on destruction, partially due, or thanks, to the UN Peace Support Operations (PSOs), which also include peacekeeping and peacebuilding. The above-mentioned DPSS meeting, however, was intended for the peace-making and peace-enforcement types of PSOs: “those that are actually war”, to quote organiser Leenen.

The workshop

Now, after the ‘massaging’ attempts earlier in the week, let us take a look at the IFIP workshop on war and not-peace, which, in fact, sits ill with the TC9 remit of computers and society, given that society is precisely what war destroys, among other things; hence, we would have a ‘TC9 on computers and anything as long as it is not society’.
Anyhow, the first stream in the morning was about socio-technical aspects, which was, content-wise, rather behind on the available theory of PSOs. Noteworthy was Col. Xaba’s correction of a comment in Maj. Dr. Falkson’s talk about emotional factors in PSOs (including a slide on “African warrior role exposure” and differences between “war fighting” and being a PSO-er). Falkson showed a photo of blue helmets in a crowded street, saying that this puts high stress on the soldier/peacekeeper because it is difficult to identify the enemy. Correctly so, Col. Xaba pointed out that in PSOs, and in peacekeeping and peacebuilding operations in particular, there are no enemies from the viewpoint of a blue helmet; at least, according to PSO doctrine there are none. But, as it seemed, he was the only defence person who had a notion of PSO principles. Ben du Toit of the Defence Institute talked about cultural issues, without realising he is well embedded in one himself, namely the military tunnel vision which holds that resorting to violent ‘solutions’ is unavoidable. On the flip side, he is aware of ICT4Peace, analysed the notion of the “cultural smartie box”, and added the relatively new term ethnocomputing. Prof. von Solms, the IFIP president, looked at critical information infrastructure (CII) management and tried to get the message across through scaremongering that CII failure can “cause” war. It might help in being the last straw, but the occasional temporary inconvenience of, say, denial-of-service attacks on government websites or ATMs is unlikely to cause a revolution on its own (I’m not asking you to disprove this assumption!). Simon Nare elaborated on a “Computer Security Incident Response Team”, though upon probing the defence people as to their interest, Col. Coetzee was only “taking note of the information”. A buyer’s market?

Lt. Col. Theron demonstrated his joy regarding the chaos of the operational battlespace visually, with battle photos and moving, swishing figurines in the PowerPoint presentation. His aside on the Information Warfare layers (the communications grid, physical network, command and control, and doctrine) that doctrine is often copied from the US one, which therefore makes non-US defence forces vulnerable [know thy enemy], was not lost on the audience, nor was the sarcasm of the ‘alternative’ US OODA loop (Orientate, Overreact, Destroy, and Apologise). All in all, they seem to have a rather large information-integration problem, which was discussed in more detail by Harris, who is working on an “integrated development environment” (IDE) to hook up hardware and software across Joint Operations divisions; there is still a lot of work to do in that area. A basic version of this IDE already exists as a “command and control environment”, which a gullible CSIR systems engineer, Venter, would like to test during “big events”, such as the FIFA World Cup to be held in South Africa in 2010. After all, a world cup is just like war, and one surely would need to “have a 3-block war approach”, according to Venter. He even foresaw soldiers patrolling and interacting with fans, hooligans, and more of those terror and crime suspects. So, at the end, I could not resist throwing in a few comments, first asking whether he actually realised he was trying to militarise a civilian event, upon which he drew a blank. He has been indoctrinated effectively, I suppose. The arrogant Maj. Gen. (Ret.) Dr. M. du Toit jumped in to lecture me in a condescending tone that the defence force is only part of the whole command and control operation. Uhh, not of the event management, cross-organisational coordination, and co-operation?!
It may well be that it is one of the 7 pillars (and hopefully they all did get the message that soldiers harassing soccer fans is not a good idea at all and, in fact, is asking for trouble), but my point was that the gullible systems engineer had been brainwashed. His boss, Harris, did take a more nuanced position, in a civilised, humane way. Perhaps the “Maj. Gen. (Ret.) Dr.” titles are the cause of the insistence on a unidirectional chat against a young, female, civilian researcher; either way, it does not make him suitable for non-war/defence interactions and events.

Last, Naude and Voster offered nice reflections on trying to define Information Warfare. They discussed the lack of a widely accepted definition and the “US- and UK-dominated Wikipedia definition of Information Warfare” that includes “…propaganda or disinformation…”. According to Naude, most Information Warfare material comes from the USA, whereas it is unknown what other countries are doing, if anything other than copying the US; hence, the current wiki definition might as well be recursive, in that the definition and related material may itself be disinformation. The fantastic “sense making” might as well be too, as it effectively demands, as a minimum (!), that most, if not all, artificial intelligence problems be solved to get a “sense making environment” up and running.

As for a next IFIP TC9 WG 9.4 workshop, in order to properly include the “…safeguarding of peace” part of the workshop scope, as well as to have more ‘entertainment’ during the workshop by inviting people with more diverse backgrounds, the organisers may want to consider sourcing people from, among others, ICT4Peace for the engineering aspects and the UN-mandated UPEACE for peace education and research, so as to provide the balance that, in theory, the defence forces ought to have provided already, given that “war fighting” is only one of the five pillars (in the South African constitution, at least). In addition, it would be a rather dubious ‘honour’ and legacy for the organisers and the current IFIP president (Prof. von Solms from South Africa, the first African IFIP president) to be the ones who took the lead in institutionalising an “ICT uses in warfare” working group in IFIP. Last, given that IFIP events are, as far as I know, civilian events, there ought not to be a perceived need to instruct the defence people[n2] that one of the organisers was the “commander in chief for the day”, who, if that was not clear enough already, was ‘overlooked’ by Col. Xaba as counting as a woman because “her dress was too straight”, i.e., not feminine and sexy enough; proper education, or should I say indoctrination, conformant to the now supposedly multifunctional defence forces is an area where there is still plenty of work to do.

UPDATE (Aug 11, 2008): the proceedings, presentations, etc. of the workshop are online now, in case you want to have a look at them yourself.

Last, but not least

Clearly, the whole issue of responsibility—is the scientist who discovers x responsible, or the engineer who uses it in a malicious way, or the government that deploys it, or the masses who do not complain?—and the question ‘what do you want to do with your knowledge’ is not new at all, but after the fall of the Berlin Wall it was pushed behind the scenes, whilst quietly a growing amount of money is being made available for military research. Some people do stand up, and even get organised in a collective of scientists for global responsibility, but, thus far and in the vast majority of cases, the siren of short-term research funding wins out over any moral obligation to use knowledge responsibly.
Should one reform the scientists, the engineers, the research funding policy, or wake up the masses? This ‘million-dollar question’ has been around for a long time already, but that does not mean one can stick one’s head in the sand and take blood-stained research money until there is a definitive answer. Of course, with that kind of research funding in your pocket, you could say “In research, I don’t do politics and I don’t take sides…”, but that just means you are indifferent to who gets harassed, bugged, occupied, tortured, and killed more efficiently; hence, by the same logic—however implausible it might seem now—it might just be used against you or your loved ones some day, too, and you can swallow the bitter pill of reaping what you sow.
Or you can take the other pill, and use your knowledge constructively: say, use computers to benefit the environment, to facilitate the biosciences in understanding nature, for the stability of society (of an open society), or at least for the post-war reconstruction efforts that clean up the disasters warmongers leave behind, thereby contributing to planting the seeds for, or nurturing, a budding, stable society to achieve positive peace.

[1] Book review of Surviving the Swastika: Scientific Research in Nazi Germany, by Kristie Macrakis. New York: Oxford University Press, 1993. 280 pages.
[2] Ball, Philip. (2005). Science lobby urges UK to divert funds from military fields. Nature, 433: 184.
[3] Brumfiel, Geoff. (2008). UK Universities in bed with military. Nature, 453: 967.
[4] Chris Langley. (2008). Universities, the military and the means of destruction in the United Kingdom. The Economics of Peace and Security Journal, 3(1): 49-55.
[5] Forrest, Drew. (2008). The nature of greatness. Mail & Guardian, 24(29): 14.

———-

note1: To give an example of antique, early-days PsyOps experiments: Nelson Mandela was “swinging a pick in a lime quarry, half-blinded by the glare”, for no purpose whatsoever, each day, for 13 years in a row, in an attempt to crush the morale of political prisoners [5]. Such things are calculable and can be demonstrated simply by the physical damage to the eyes, unlike, say, humiliation, severe sleep deprivation (which distorts the sense of what is real and what is not and makes you go crazy), and the like.

note2: Once a soldier, always a soldier, 24/7, regardless of whether s/he participates in non-military events?

BFO’s specific and generic dependence and generalising progress in essential and mandatory parts

The Basic Formal Ontology (BFO) version 1.1 has, compared to v1.0, two additions: SpecificallyDependentContinuant (SDC) and GenericallyDependentContinuant (GDC); at least, in the version I downloaded on 30 June. They are defined as follows (emphasis added):
SDC = “A continuant [snap:Continuant] that inheres in or is borne by other entities. Every instance of A requires some specific instance of B which must always be the same”. To note: it subsumes Quality and RealizableEntity.
GDC = “A continuant [snap:Continuant] that is dependent on one or other independent continuant [snap:IndependentContinuant] bearers. For every instance of A requires some instance of (an independent continuant [snap:IndependentContinuant] type) B but which instance of B serves can change from time to time”.
Setting aside the lack of parallelism in the formulation of the two definitions, the difference in constraints on the participating entities, the awkward English, and the absence of full formal definitions (neither in the text nor in the OWL file), the interesting bit I will zoom in on now is the mandatoriness expressed with “same” [at all times] versus “some…from time to time”. This sounds just like our work on essential versus plain mandatory parts and wholes, but then applied to an arbitrary relation relating [in]dependent continuants, as opposed to being limited to part-whole relations. For more details, see also the previous post on essential and mandatory parts, or the DL’08 paper with the technical details, and the extension that specifically deals with specific and generic dependence [1], constrained to part-whole relations due to the scope of the paper.
Put differently, one needs to represent the life-cycle semantics of the participating entities to be able to distinguish between “some instance—be it x1, or …, or xn—of type X must participate in a relational instance of relation type R” and “the same instance y of type Y must participate in a relational instance of relation type S”. More practically, and pattern-wise, given some object z that is an instance of Z (a continuant) and points in time t0, t1, …, tn, we have for GDC
r1: <x1, z> at t0 (at the start of the existence of z)
r2: <x2, z> at t1, where t1 > t0, and either x1 = x2 or x1 ≠ x2 (i.e., <x1, z> may or may not still hold)
r3: <x3, z> at t2, where t2 > t1, and so on, until z ceases to exist as an instance of Z at tn
whereas for SDC
s1: <y1, z1> at t0 (at the start of the existence of z1)
s1: <y1, z1> at t1, where t1 > t0
s1: <y1, z1> at t2, where t2 > t1, and so on, until z1 ceases to exist as an instance of Z at tn
s2: <y2, z2> at t0 (at the start of the existence of z2)
s2: <y2, z2> at t1, where t1 > t0
s2: <y2, z2> at t2, where t2 > t1, and so on, until z2 ceases to exist as an instance of Z at tn, and where it may be that z1 = z2 or y1 = y2 or both.
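The distinction in the patterns above can also be sketched operationally. The following minimal Python sketch is my own illustration, not part of BFO; the function name and the encoding of a bearer history as a list of (time point, bearer) pairs are assumptions made for the example. It classifies a dependent entity’s bearer history as the SDC pattern (the same bearer at every time point) or the GDC pattern (some bearer at every time point, possibly a different one each time):

```python
# Sketch only: classify a dependent continuant's bearer history.
# history: list of (t, bearer) pairs, one per time point at which the
# dependent entity exists; bearer is the independent continuant it
# inheres in / is borne by at that time (None = no bearer at that time).

def classify_dependence(history):
    bearers = [b for _, b in history]
    if not bearers or any(b is None for b in bearers):
        return "not dependent"   # some time point lacks a bearer
    if all(b == bearers[0] for b in bearers):
        return "specific"        # SDC pattern: always the same bearer
    return "generic"             # GDC pattern: some bearer, may change

# GDC example from the OWL file: a PDF file existing on several hard drives
pdf_history = [(0, "drive_A"), (1, "drive_B"), (2, "drive_C")]
# SDC example: the colour of this tomato inheres in this tomato throughout
colour_history = [(0, "tomato_1"), (1, "tomato_1"), (2, "tomato_1")]

print(classify_dependence(pdf_history))     # generic
print(classify_dependence(colour_history))  # specific
```

Of course, this only checks the temporal pattern of bearers, not the inherence relation itself; it is merely a way to make the “same” versus “some…from time to time” contrast concrete.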

Taking the GDC example from the OWL file, “a certain PDF file that exists in different and in several hard drives”, we have an example where x1, …, xi are the distinct hard drives and z is the PDF file (well, the elusive ‘contents’ of the file; clearly, different bits are involved on the different hard drives).

The provided SDC examples, however, are somewhat more complicated: “the mass of a cloud, the smell of mozzarella, the liquidity of blood, the color of a tomato, the disposition of fish to decay, the role of being a doctor, the function of the heart in the body: to pump blood, to receive de-oxygenated and oxygenated blood”. Obviously, each cloud must have a mass, but generally not the same mass over time, and some particular mass, say, 10 kg, is not necessarily always related to the same cloud, as clouds can grow and shrink in volume and, thus, in amount of mass. Given the example, we can only have specific dependence if a ‘grown’ cloud (> 10 kg) counts as a different cloud, which is counterintuitive. Likewise, a certain liquidity of blood can change in value (due to drinking alcoholic beverages, for instance), although blood must have some value for liquidity (which may or may not be measured, and which reaches 0 when it is dried-up blood in a healing wound). Vice versa, a certain liquidity value does not have to be related to the same blood for the duration of its existence. If, however, we consider instead, say, that the doctor takes a blood sample and measures the liquidity, and the result of that measurement is stored in a database or written in a paper-based health record, then that measured value ‘123’ is, for the duration of its existence, permanently related to ‘blood sample from patient p1 taken at time hh:mm on date dd-mm-yyyy’. But the latter reading is certainly a different case from just blood & liquidity. So, overall, this seems to contradict the SDC definition, or, at least, the examples do not quite fit the definition.

In addition, we can have variations in the life cycles of the SDC and its bearer, which I do not think are covered. Take the following figure, where there are two principal options: we fix the lifetime of the [independent] continuant and vary the SDC’s lifetime in (A), or fix the lifetime of the SDC and vary the lifetime of the [independent] continuant in (B).

In the case of the SDC definition, one has to focus on (B): the [independent] continuant bearer might have one or more SDCs but, given an SDC, it must always be related to the same Cx, which has a lifetime equal to or longer than that of the SDC, but never shorter. With our blood sample as bearer, we then have either C2 (if the sample continues to exist after the record of the measurement is destroyed) or C4 (if the sample is destroyed together with the recording of the taken measurement).
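The lifetime constraint of option (B) can be stated compactly. The sketch below is my own formalisation, not BFO’s; the interval encoding as (start, end) pairs and the function name are assumptions for illustration. It checks that the bearer’s lifetime contains the SDC’s lifetime, i.e., is equal to or longer than it, but never shorter:

```python
# Sketch only: option (B) constraint for an SDC and its bearer.
# Each span is a (start, end) pair of time points during which the
# entity exists; the bearer must already exist when the SDC starts
# and must not cease to exist before the SDC does.

def sdc_lifetime_ok(sdc_span, bearer_span):
    s_start, s_end = sdc_span
    b_start, b_end = bearer_span
    return b_start <= s_start and s_end <= b_end

# C4-like case: measurement record destroyed together with the blood sample
print(sdc_lifetime_ok((2, 8), (2, 8)))   # True
# C2-like case: the sample continues to exist after the record is destroyed
print(sdc_lifetime_ok((2, 8), (2, 10)))  # True
# disallowed: the bearer ceases to exist before the SDC does
print(sdc_lifetime_ok((2, 8), (2, 5)))   # False
```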

So, either it is me who doesn’t get it, or there is room for improvement in the SDC/GDC definitions and/or examples. Does anyone have some clarifying thoughts on this?

—–

[1] Artale, A., Guarino, N., and Keet, C.M. Formalising temporal constraints on part-whole relations. 11th International Conference on Principles of Knowledge Representation and Reasoning (KR’08). Gerhard Brewka, Jerome Lang (Eds.) AAAI Press. Sydney, Australia, September 16-19, 2008.