Publications by Dr Chris Fox

These are some of Dr Chris Fox's research outputs, including books, articles, conference papers and selected presentations, in an incrementally searchable table. Note that some of the papers linked to from here may be drafts, rather than the final published versions.


Year | Title | Author | Journal/Proceedings | BibTeX type | Field | DOI/URL
2016 ‘Ought implies Can’ in the law Feis, G. & Fox, C. Inquiry   article semantics, natural language, deontic URL  
Abstract: In this paper we will investigate the “Ought implies Can” (OIC) thesis.
We will concentrate on explanations and interpretations of OIC, clarifying
its uses and relevance. The OIC thesis is relevant to debates that
range from semantics and pragmatics to legal philosophy, deontic
logic and the philosophy of action and mind.


We state the different theses that have been proposed concerning OIC
and seek to distinguish them and classify them, trying to set up
a framework in which the different OIC proposals can be compared.
We relate this to deontic logic and legal philosophy concerning the
role of secondary rules.


Finally, we will try to clarify what is meant by ‘implies’ inside
the OIC thesis, dealing with the different notions of presupposition,
implicature and entailment, and how different characterisations of
‘implies’ can give rise to the different OIC theses and uses.
BibTeX:
@article{Feis2016,
  author = {Guglielmo Feis and Chris Fox},
  title = {‘Ought implies Can’ in the law},
  journal = {Inquiry},
  year = {2016},
  note = {Accepted for publication. Version here is a much earlier draft.},
  url = {http://chris.foxearth.org/papers/CFox-Feis-OIC-2012-draft.pdf}
}
2016 Existence and Freedom Fox, C. Gedenkschrift for Sebastian Danicic   conference ontology, semantics URL  
Abstract: This presentation sketches out how free and partial quantification can reconcile universalism with richer ontologies.
BibTeX:
@conference{Fox2016,
  author = {Chris Fox},
  title = {Existence and Freedom},
  booktitle = {Gedenkschrift for Sebastian Danicic},
  address = {Goldsmiths College, London},
  year = {2016},
  url = {http://chris.foxearth.org/papers/CFox-Slides-Gedenkschrift-September-2016-corrected.pdf}
}
2015 Experiments with query expansion for entity finding Alarfaj, F., Kruschwitz, U. & Fox, C. Proceedings of CICLing   inproceedings natural language DOIURL  
BibTeX:
@inproceedings{AlarfajKruschwitzFox2015,
  author = {Fawaz Alarfaj and Udo Kruschwitz and Chris Fox},
  title = {Experiments with query expansion for entity finding},
  booktitle = {Proceedings of CICLing},
  editor = {Gelbukh, Alexander},
  publisher = {Springer},
  series = {Lecture Notes in Computer Science},
  year = {2015},
  volume = {9042},
  pages = {417--426},
  note = {ISBN 978-3-319-18116-5},
  url = {http://dx.doi.org/10.1007/978-3-319-18117-2_31},
  doi = {10.1007/978-3-319-18117-2_31}
}
2015 Creating Language Resources for Under-resourced Languages: methodologies and experiments on Arabic El-Haj, M., Kruschwitz, U. & Fox, C. Language Resources and Evaluation Journal   article arabic, natural language processing, summarisation DOIURL  
Abstract: Language resources are important for those working on computational
methods to analyse and study languages. These resources are needed
to help advance research in fields such as natural language
processing, machine learning, information retrieval and text analysis
in general. We describe the creation of useful resources for languages
that currently lack them, taking resources for Arabic summarisation
as a case study. We illustrate three different paradigms for creating
language resources, namely: (1) using crowdsourcing to produce a
small resource rapidly and relatively cheaply; (2) translating an
existing gold-standard dataset, which is relatively easy but potentially
of lower quality; and (3) using manual effort with appropriately
skilled human participants to create a resource that is more expensive
but of high quality. The last of these was used as a test collection
for TAC-2011. An evaluation of the resources is also presented.
BibTeX:
@article{El-Haj2014,
  author = {Mahmoud El-Haj and Udo Kruschwitz and Chris Fox},
  title = {Creating Language Resources for Under-resourced Languages: methodologies and experiments on {Arabic}},
  journal = {Language Resources and Evaluation Journal},
  year = {2015},
  volume = {49},
  number = {3},
  pages = {549--580},
  note = {First online: 09 August 2014. Printed September 2015.},
  url = {http://chris.foxearth.org/papers/CFox-LREV-2014.pdf},
  doi = {10.1007/s10579-014-9274-3}
}
2015 Imperatives Fox, C. Handbook of Contemporary Semantic Theory (2nd ed)   incollection semantics, natural language URL  
Abstract: Some issues in the analysis of imperatives and a sketch of a proof-theoretic
analysis.
BibTeX:
@incollection{fox14:_imper,
  author = {Chris Fox},
  title = {Imperatives},
  booktitle = {Handbook of Contemporary Semantic Theory (2nd ed)},
  editor = {Shalom Lappin and Chris Fox},
  publisher = {Wiley-Blackwell},
  address = {Oxford and Malden MA},
  year = {2015},
  edition = {Second},
  note = {Contains original research.},
  url = {http://chris.foxearth.org/papers/Chapter_10_Handbook_of_Contemporary_Semantic_Theory_2nd_ed.pdf}
}
2015 Handbook of Contemporary Semantic Theory (2nd ed)   book semantics, natural language URL  
BibTeX:
@book{lappin14:_handb_contem_seman,
  title = {Handbook of Contemporary Semantic Theory (2nd ed)},
  editor = {Shalom Lappin and Chris Fox},
  publisher = {Wiley-Blackwell},
  address = {Oxford and Malden MA},
  year = {2015},
  edition = {Second},
  note = {755 pages},
  url = {http://eu.wiley.com/WileyCDA/WileyTitle/productCd-0470670738.html}
}
2015 The C@merata Task at MediaEval 2015: Natural Language Queries on Classical Music Scores Sutcliffe, R., Crawford, T., Fox, C., Root, D. L. & Hovy, E. Proceedings of Mediæval   inproceedings natural language, music URL  
BibTeX:
@inproceedings{RichardSutcliffe2015,
  author = {Richard Sutcliffe and Tim Crawford and Chris Fox and Deane L. Root and Eduard Hovy},
  title = {The C@merata Task at MediaEval 2015: Natural Language Queries on Classical Music Scores},
  booktitle = {Proceedings of Mediæval},
  address = {Barcelona},
  year = {2015},
  url = {http://ceur-ws.org/Vol-1436/Paper12.pdf}
}
2015 Relating Natural Language Text to Musical Passages Sutcliffe, R., Crawford, T., Fox, C., Root, D. L. & Hovy, E. Proceedings of ISMIR 2015   inproceedings natural language, music URL  
Abstract: There is a vast body of musicological literature
containing detailed analyses of musical works. These
texts make frequent references to musical passages in
scores by means of natural language phrases. Our long-
term aim is to investigate whether these phrases can be
linked automatically to the musical passages to which
they refer. As a first step, we have organised for two
years running a shared evaluation in which participants
must develop software to identify passages in a
MusicXML score based on a short noun phrase in
English. In this paper, we present the rationale for this
work, discuss the kind of references to musical passages
which can occur in actual scholarly texts, describe the
first two years of the evaluation and finally appraise the
results to establish what progress we have made.
BibTeX:
@inproceedings{RichardSutcliffe2015a,
  author = {Richard Sutcliffe and Tim Crawford and Chris Fox and Deane L. Root and Eduard Hovy},
  title = {Relating Natural Language Text to Musical Passages},
  booktitle = {Proceedings of ISMIR 2015},
  address = {Malaga, Spain},
  year = {2015},
  pages = {524--530},
  url = {http://ismir2015.uma.es/articles/263_Paper.pdf}
}
2014 Profile-based Summarisation for Web Site Navigation Alhindi, A., Kruschwitz, U., Fox, C. & Bakour, D. A. ACM Transactions on Information Systems   article natural language URL  
BibTeX:
@article{Alhindi2014,
  author = {Azhar Alhindi and Udo Kruschwitz and Chris Fox and Dyaa Al Bakour},
  title = {Profile-based Summarisation for Web Site Navigation},
  journal = {ACM Transactions on Information Systems},
  year = {2014},
  volume = {33},
  number = {1},
  note = {Special Issue on Contextual Search and Recommendation. Editors: Paul N. Bennett, Kevyn Collins-Thompson, Diane Kelly, Ryen W. White, Yi Zhang},
  url = {http://dl.acm.org/citation.cfm?doid=2737806.2699661}
}
2014 Type-Theoretic Logic with an Operational Account of Intensionality Fox, C. & Lappin, S. Synthese   article semantics, natural language DOIURL  
Abstract: A reformulation of Curry-Typed Property Theory within Typed Predicate
Logic, and some discussion of an operational interpretation of intensional
distinctions.
BibTeX:
@article{Fox2014,
  author = {Chris Fox and Shalom Lappin},
  title = {Type-Theoretic Logic with an Operational Account of Intensionality},
  journal = {Synthese},
  year = {2014},
  url = {http://chris.foxearth.org/papers/CFox-Lappin-Synthese2014.pdf},
  doi = {10.1007/s11229-013-0390-1}
}
2014 The meaning of formal semantics Fox, C. Semantics and Beyond. Philosophical and Linguistic Investigations   incollection ontology, semantics, natural language URL  
Abstract: What is it that semanticists think they are doing when using formalisation?
What kind of endeavour is the formal semantics of natural language:
scientific; linguistic; philosophical; logical; mathematical?


If formal semantics is a scientific endeavour, then there ought to
be empirical criteria for determining whether such a theory is correct,
or an improvement on an alternative account. The question then arises
as to the nature of the evidence that is being accounted for.


It could be argued that the empirical questions are little different
in kind to other aspects of linguistic analysis, involving questions
of performance and competence (Chomsky 1965; Saussure 1916). But
there are aspects of formal accounts of meaning that appear to sit
outside this scientific realm. One key issue concerns the precise
nature of the formalisation that is adopted; what criteria are to
be used to decide between accounts that are founded on different
formal systems, with different ontological assumptions? Indeed, is
it necessary to judge semantic frameworks on such grounds? In other
words, are two theoretical accounts to be treated as equivalent for
all relevant purposes if they account for exactly the same linguistic
data?


Broadly speaking, there are two related perspectives on the analysis
of propositional statements, one “truth conditional” --- reducing
sentence meaning to the conditions under which the sentence is judged
to be true (e.g. Montague 1973) --- the other “proof theoretic” ---
reducing sentence meanings to patterns of entailments that are supported
(e.g. Ranta 1994, Fox & Lappin 2005, Fox 2000). Variations of these
perspectives might be required in the case of non-assertoric utterances.
We may wonder what criteria might be used to decide between these
approaches.


This brings us back to the nature of the data itself. If the data
is (merely) about which arguments, or truth conditions, subjects
agree with, and which they disagree with, then the terms in which
the theory is expressed may be irrelevant. But it may also be legitimate
to be concerned with either (a) the intuitions that people have when
reasoning with language, or (b) some technical or philosophical issues
relating to the chosen formalism.


The truth-conditional vs proof-theoretic dichotomy might broadly be
characterised as model-theoretic vs axiomatic, where the model-theoretic
tend to be built around a pre-existing formal theory, and the axiomatic
involves formulating rules of behaviour “from scratch”.

In some sense, fitting an analysis of a new problem into an existing
framework could be described as providing some kind of “explanatory”
power, assuming that the existing framework has some salient motivation
that is independent of the specific details of the phenomena in question.
In contrast, building a new theory that captures the behaviour might
then be characterised as “descriptive”, as --- superficially at least
--- it does not show how an existing theory “already” accounts for
the data in some sense. Here we observe instead that the argument
can be run in the other direction: a reductive model-theoretic
account merely “describes” how some aspects of a problem can be reduced
to some formalisation, but may fail to capture a subject's understanding
or intuitions about meaning.


It is surely appropriate for the formal theory itself to be at least
sympathetic to the ontological concerns and intuitions of its subjects
--- if not inform them (Dummett 1991). The alternative amounts to
little more than carving out otherwise arbitrary aspects of a system
that mimics the required behaviour, without a coherent explanation
of why some aspects of a formal theory characterise, or capture,
the intended meaning, but not others (cf. Benacerraf 1965). That
seems an impoverished approach, weakening any claim to “explain”.
Any constraint this imposes on what it is to be an explanatory account
of meaning then faces the same problem as naive notions of compositionality
(Zadrozny 1994) --- that is, what appears to be a meaningful restriction
is, in reality, a mirage.
BibTeX:
@incollection{Fox2014a,
  author = {Chris Fox},
  title = {The meaning of formal semantics},
  booktitle = {Semantics and Beyond. Philosophical and Linguistic Investigations},
  editor = {Piotr Stalmaszczyk},
  publisher = {De Gruyter},
  series = {Philosophische Analyse / Philosophical Analysis},
  address = {Berlin},
  year = {2014},
  volume = {57},
  pages = {85--108},
  url = {http://chris.foxearth.org/papers/CFox-Meaning-of-Semantics-PhiLang2013-paper.pdf},
  isbn = {978-3-11-036248-0}
}
2014 Curry-Typed Semantics in Typed Predicate Logic Fox, C. Logica Yearbook 2013   incollection semantics, natural language URL  
Abstract: Various questions arise in semantic analysis concerning the nature
of types. These questions include whether we need types in a semantic
theory, and if so, whether some version of simple type theory (STT,
Church 1940) is adequate or whether a richer more flexible theory
is required to capture our semantic intuitions.

Propositions and propositional attitudes can be represented in an
essentially untyped first-order language, provided a sufficiently
rich language of terms is adopted. In the absence of rigid typing,
care needs to be taken to avoid the paradoxes, for example by constraining
what kinds of expressions are to be interpreted as propositions (Turner
1992).

But the notion of type is ontologically appealing. In some respects,
STT seems overly restrictive for natural language semantics. For
this reason it is appropriate to consider a system of types that
is more flexible than STT, such as a Curry-style typing (Curry &
Feys 1958). Care then has to be taken to avoid the logical paradoxes.
Here we show how such an account, based on the Property Theory with
Curry Types (PTCT, Fox & Lappin 2005), can be formalised within Typed
Predicate Logic (TPL, Turner 2009). This presentation provides a
clear distinction between the classes of types that are being used
to (i) avoid paradoxes (ii) allow predicative polymorphic types.
TPL itself provides a means of expressing PTCT in a uniform language.
BibTeX:
@incollection{Fox2014b,
  author = {Chris Fox},
  title = {Curry-Typed Semantics in Typed Predicate Logic},
  booktitle = {Logica Yearbook 2013},
  editor = {Vit Puncochar},
  publisher = {College Publications},
  year = {2014},
  url = {http://chris.foxearth.org/papers/CFox-Curry-TPL-Logica2013-paper.pdf}
}
2014 The C@merata Task at MediaEval 2014: Natural Language Queries on Classical Music Scores Sutcliffe, R., Crawford, T., Fox, C., Root, D. L. & Hovy, E. Proceedings of Mediæval   inproceedings natural language, music URL  
BibTeX:
@inproceedings{Sutcliffe2014,
  author = {Richard Sutcliffe and Tim Crawford and Chris Fox and Deane L. Root and Eduard Hovy},
  title = {The C@merata Task at MediaEval 2014: Natural Language Queries on Classical Music Scores},
  booktitle = {Proceedings of Mediæval},
  address = {Barcelona},
  year = {2014},
  url = {http://ceur-ws.org/Vol-1263/mediaeval2014_submission_46.pdf}
}
2013 Adaptive Window Size Selection for Proximity Search Alarfaj, F., Kruschwitz, U. & Fox, C. Proceedings of the 5th BCS-IRSG Symposium on Future Directions in Information Access   inproceedings natural language processing URL  
BibTeX:
@inproceedings{Alarfaj2013,
  author = {Fawaz Alarfaj and Udo Kruschwitz and Chris Fox},
  title = {Adaptive Window Size Selection for Proximity Search},
  booktitle = {Proceedings of the 5th BCS-IRSG Symposium on Future Directions in Information Access},
  address = {Granada, Spain},
  year = {2013},
  note = {Part of ESSIR 2013},
  url = {http://ewic.bcs.org/upload/pdf/ewic_fdia13_paper8.pdf}
}
2013 Towards Profile-Based Document Summarisation for Interactive Search Assistance Alhindi, A., Kruschwitz, U. & Fox, C. Proceedings of the 5th BCS-IRSG Symposium on Future Directions in Information Access   inproceedings natural language processing URL  
BibTeX:
@inproceedings{Alhindi2013,
  author = {Azhar Alhindi and Udo Kruschwitz and Chris Fox},
  title = {Towards Profile-Based Document Summarisation for Interactive Search Assistance},
  booktitle = {Proceedings of the 5th BCS-IRSG Symposium on Future Directions in Information Access},
  address = {Granada, Spain},
  year = {2013},
  note = {Part of ESSIR 2013},
  url = {http://ewic.bcs.org/upload/pdf/ewic_fdia13_paper1.pdf}
}
2013 A Pilot Study on Using Profile-Based Summarisation for Interactive Search Assistance Alhindi, A. H., Kruschwitz, U. & Fox, C. 34th European Conference on Information Retrieval (ECIR’13)   inproceedings natural language processing URL  
BibTeX:
@inproceedings{Alhindi2013a,
  author = {Azhar Hasan Alhindi and Udo Kruschwitz and Chris Fox},
  title = {A Pilot Study on Using Profile-Based Summarisation for Interactive Search Assistance},
  booktitle = {34th European Conference on Information Retrieval (ECIR’13)},
  year = {2013},
  url = {http://link.springer.com/chapter/10.1007/978-3-642-36973-5_57}
}
2013 Axiomatising Questions Fox, C. Logica Year Book 2012   incollection semantics, natural language URL  
Abstract: Accounts of the formal semantics of natural language often adopt a
pre-existing framework.

Such formalisations rely upon informal narrative to explain the intended
interpretation of an expression --- an expression that may have different
interpretations in different circumstances, and may support patterns
of behaviour that exceed what is intended. This ought to make us
question the sense in which such formalisations capture our intuitions
about semantic behaviour.

In the case of theories of questions and answers, a question might
be interpreted as a set (of possible propositional answers), or as
a function (that yields a proposition given a term that is intended
to be interpreted as a phrasal answer), but the formal theory itself
provides no means of distinguishing such sets and functions from
other cases where they are not intended to represent questions, or
their answers.

Here we sketch an alternative approach to formalising a theory of
questions and answers that aims to be sensitive to such ontological
considerations.
BibTeX:
@incollection{fox13:_axiom_quest,
  author = {Chris Fox},
  title = {Axiomatising Questions},
  booktitle = {Logica Year Book 2012},
  editor = {Vit Puncochar and Petr Svarny},
  publisher = {College Publications},
  year = {2013},
  pages = {23--34},
  url = {http://chris.foxearth.org/papers/CFox-Questions-Logica2012-paper.pdf}
}
2012 Axiomatising Questions Fox, C.   conference semantics, natural language URL  
Abstract: Conference presentation. The paper of the same name is based on this
talk.
BibTeX:
@conference{fox12:_axiom_quest,
  author = {Chris Fox},
  title = {Axiomatising Questions},
  address = {Hejnice, Czech Republic},
  year = {2012},
  howpublished = {Presented at Logica 2012},
  url = {http://dl.dropbox.com/u/22441432/CFox-Logica2012-slides.pdf}
}
2012 Imperatives: a Judgemental Analysis Fox, C. Studia Logica   article semantics, natural language URL  
Abstract: This paper proposes a framework for formalising intuitions about the
behaviour of imperative commands. It seeks to capture notions of
satisfaction and coherence. Rules are proposed to express key aspects
of the general logical behaviour of imperative constructions. A key
objective is for the framework to allow patterns of behaviour to
be described while avoiding making any commitments about how commands,
and their satisfaction criteria, are to be interpreted. We consider
the status of some conundrums of imperative logic in the context
of this proposal.
BibTeX:
@article{fox12:_imper,
  author = {Chris Fox},
  title = {Imperatives: a Judgemental Analysis},
  journal = {Studia Logica},
  year = {2012},
  volume = {100},
  number = {4},
  pages = {879--905},
  url = {http://chris.foxearth.org/papers/CFox-Imperatives-StudiaLogica2012.pdf}
}
2012 In Defense of Axiomatic Semantics Fox, C. & Turner, R. Philosophical and Formal Approaches to Linguistic Analysis   incollection ontology, semantics, natural language URL  
Abstract: We may wonder about the status of logical accounts of the meaning
of language. When does a particular proposal count as a theory? How
do we judge a theory to be correct? What criteria can we use to decide
whether one theory is “better” than another?


Implicitly, many accounts attribute a foundational status to set theory,
and set-theoretic characterisations of possible worlds in particular.
The goal of a semantic theory is then to find a translation of the
phenomena of interest into a set-theoretic model. Such theories may
be deemed to have “explanatory” or “predictive” power if a mapping
can be found into expressions of set-theory that have the appropriate
behaviour by virtue of the rules of set-theory (for example Montague
1973; Montague1974). This can be contrasted with an approach in which
we can help ourselves to “new” primitives and ontological categories,
and devise logical rules and axioms that capture the appropriate
inferential behaviour (as in Turner 1992). In general, this alternative
approach can be criticised as being mere “descriptivism”, lacking
predictive or explanatory power.


Here we will seek to defend the axiomatic approach. Any formal account
must assume some normative interpretation, but there is a sense in
which such theories can provide a more honest characterisation (cf.
Dummett 1991). In contrast, the set-theoretic approach tends to conflate
distinct ontological notions. Mapping a pattern of semantic behaviour
into some pre-existing set-theoretic behaviour may lead to certain
aspects of that behaviour being overlooked, or ignored (Chierchia
& Turner 1988; Bealer 1982). Arguments about the explanatory and
predictive power of set-theoretic interpretations can also be questioned
(see Benacerraf 1965, for example). We aim to provide alternative
notions for evaluating the quality of a formalisation, and the role
of formal theory. Ultimately, claims about the methodological and
conceptual inadequacies of axiomatic accounts compared to set-theoretic
reductions must rely on criteria and assumptions that lie outside
the domain of formal semantics as such.
BibTeX:
@incollection{fox12:_in_defen_axiom_seman,
  author = {Chris Fox and Raymond Turner},
  title = {In Defense of Axiomatic Semantics},
  booktitle = {Philosophical and Formal Approaches to Linguistic Analysis},
  editor = {Piotr Stalmaszczyk},
  publisher = {Ontos Verlag},
  year = {2012},
  pages = {145--160},
  note = {Based on the paper ``A Semantic Method'' presented at PhiLang 2011},
  url = {http://chris.foxearth.org/papers/CFox-Turner-PhiLang2011-paper.pdf}
}
2012 Obligations and Permissions Fox, C. Language and Linguistics Compass   article semantics, natural language, deontic URL  
Abstract: Utterances and statements that are concerned with obligations and
permissions are known as “deontic” expressions.

They can present something of a challenge when it comes to formalising
their meaning and behaviour.

The content of these expressions can appear to support entailment
relations similar to those of classical propositions, but such behaviour
can sometimes lead to counter-intuitive outcomes.

Historically, much of the descriptive work in this area has been philosophical
in outlook, concentrating on questions of morality and jurisprudence.

Some additional contributions have come from computer science, in
part due to the need to specify normative behaviour.

There are a number of formal proposals that seek to account for obligations
and permissions, such as “Standard Deontic Logic”. In the literature,
there has also been discussion of various conundrums and dilemmas
that need to be resolved, such as “the Good Samaritan”, “the Knower”,
“the Gentle Murderer”, “Contrary to Duty Obligations”, “Ross's Paradox”,
“Jørgensen's Dilemma”, “Sartre's Dilemma”, and “Plato's Dilemma”.

Despite all this work, there still appears to be no definite consensus
about how these kinds of expressions should be analysed, or how all
the deontic dilemmas should be resolved.

It is possible that obligations themselves, as opposed to their satisfaction
criteria, do not directly support a conventional logical analysis.

It is also possible that a linguistically informed analysis of obligations
and permissions may help to resolve some of the deontic dilemmas,
and clarify intuitions about how best to formulate a logic of deontic
expressions.
BibTeX:
@article{fox12:_oblig_permis,
  author = {Chris Fox},
  title = {Obligations and Permissions},
  journal = {Language and Linguistics Compass},
  year = {2012},
  volume = {6},
  number = {9},
  pages = {593--610},
  url = {http://chris.foxearth.org/papers/CFox-Obligations-Permissions-COMPASS2012.pdf}
}
2012 Ought ought to imply can Fox, C.   conference semantics, natural language, deontic URL  
Abstract: Conference slides: some thoughts on the Ought Implies Can puzzle,
and a meta-level proposal.
BibTeX:
@conference{fox12:_ought,
  author = {Chris Fox},
  title = {Ought ought to imply can},
  address = {Essex, UK},
  year = {2012},
  howpublished = {Presented at Ought and Can special philosophy workshop, 2012},
  url = {http://dl.dropbox.com/u/22441432/CFox-ought-oic-slides-2012.pdf}
}
2011 Experimenting with Automatic Text Summarization for Arabic El-Haj, M., Kruschwitz, U. & Fox, C. Human Language Technology   incollection arabic, natural language processing, summarisation URL  
BibTeX:
@incollection{el-haj11:_exper_autom_text_summar_arabic,
  author = {Mahmoud El-Haj and Udo Kruschwitz and Chris Fox},
  title = {Experimenting with Automatic Text Summarization for {Arabic}},
  booktitle = {Human Language Technology},
  editor = {Zygmunt Vetulani},
  publisher = {Springer},
  series = {Lecture Notes in Artificial Intelligence},
  year = {2011},
  number = {LNAI 6562},
  note = {Fourth Language and Technology Conference, LTC 2009, Poznań, Poland, November 2009. Revised Selected Papers},
  url = {http://link.springer.com/chapter/10.1007/978-3-642-20095-3_45}
}
2011 Multi-Document Arabic Text Summarisation El-Haj, M., Kruschwitz, U. & Fox, C. Proceedings of the third Computer science and Electronic Engineering Conference   inproceedings arabic, natural language processing, summarisation URL  
Abstract: In this paper we present our generic extractive Arabic and English multi-document summarisers. We also describe the use of machine translation for evaluating the generated Arabic multi-document summaries using English extractive gold standards. In this work we first address the lack of Arabic multi-document corpora for summarisation and the absence of automatic and manual Arabic gold-standard summaries. These are required to evaluate any automatic Arabic summarisers. Second, we demonstrate the use of Google Translate in creating an Arabic version of the DUC-2002 dataset. The parallel Arabic/English dataset is summarised using the Arabic and English summarisation systems. The automatically generated summaries are evaluated using the ROUGE metric, as well as precision and recall. The results we achieve are compared with the top five systems in the DUC-2002 multi-document summarisation task.
BibTeX:
@inproceedings{el-haj11:_multi_docum_arabic_text_summar,
  author = {Mahmoud El-Haj and Udo Kruschwitz and Chris Fox},
  title = {Multi-Document {A}rabic Text Summarisation},
  booktitle = {Proceedings of the third Computer science and Electronic Engineering Conference},
  publisher = {IEEE Xplore},
  year = {2011},
  url = {http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=5995822}
}
2011 University of Essex at the TAC 2011 Multilingual Summarisation Pilot El-Haj, M., Kruschwitz, U. & Fox, C. Proceedings of the Text Analysis Conference (TAC) 2011, MultiLing Summarisation Pilot   inproceedings arabic, natural language processing, summarisation URL  
Abstract: We present the results of our Arabic and English runs at the TAC 2011
Multilingual summarisation (MultiLing) task. We participated with
centroid-based clustering for multi-document summarisation. The
automatically generated Arabic and English summaries were evaluated
by human participants and by two automatic evaluation metrics, ROUGE
and AutoSummENG. The results are compared with the other systems
that participated in the same track on both Arabic and English languages.
Our Arabic summariser performed particularly well in the human evaluation.
BibTeX:
@inproceedings{el-haj11:_univer_essex_tac_multil_summar_pilot,
  author = {Mahmoud El-Haj and Udo Kruschwitz and Chris Fox},
  title = {{University of Essex at the TAC 2011 Multilingual Summarisation Pilot}},
  booktitle = {Proceedings of the Text Analysis Conference (TAC) 2011, MultiLing Summarisation Pilot},
  address = {Maryland, USA},
  year = {2011},
  url = {http://www.nist.gov/tac/publications/2011/papers.html}
}
2011 A Semantic Method Fox, C. & Turner, R.   conference ontology, semantics, natural language URL  
Abstract: Conference talk on which “In Defense of Axiomatic Semantics” is based.
BibTeX:
@conference{fox11:_seman_method,
  author = {Chris Fox and Raymond Turner},
  title = {A Semantic Method},
  address = {Łódź, Poland},
  year = {2011},
  howpublished = {Second conference on the Philosophy of Language and Linguistics (PhiLang). This is the basis of the paper ``In Defense of Axiomatic Semantics'', published in Philosophical and Formal Approaches to Linguistic Analysis, Ontos Verlag, 2012},
  url = {http://dl.dropbox.com/u/22441432/PhiLang2011-slides.pdf}
}
2011 Exploring Clustering for Multi-Document Arabic Summarisation El-Haj, M., Kruschwitz, U. & Fox, C. The 7th Asian Information Retrieval Societies (AIRS 2011)   inproceedings arabic, natural language processing, summarisation URL  
Abstract: In this paper we explore clustering for multi-document Arabic summarisation. For our evaluation we use an Arabic version of the DUC-2002 dataset that we previously generated using Google Translate. We explore how clustering (at the sentence level) can be applied to multi-document summarisation as well as for redundancy elimination within this process. We use different parameter settings including the cluster size and the selection model applied in the extractive summarisation process. The automatically generated summaries are evaluated using the ROUGE metric, as well as precision and recall. The results we achieve are compared with the top five systems in the DUC-2002 multi-document summarisation task.
BibTeX:
@inproceedings{mahmoud11:_explor_clust_multi_docum_arabic_summar,
  author = {Mahmoud El-Haj and Udo Kruschwitz and Chris Fox},
  title = {Exploring Clustering for Multi-Document {Arabic} Summarisation},
  booktitle = {The 7th Asian Information Retrieval Societies (AIRS 2011)},
  publisher = {Springer},
  series = {Lecture Notes in Computer Science},
  address = {Berlin/Heidelberg},
  year = {2011},
  volume = {7097},
  pages = {550--561},
  url = {http://link.springer.com/chapter/10.1007/978-3-642-25631-8_50},
  isbn = {978-3-642-25630-1}
}
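The sentence-level clustering pipeline the abstract describes can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' system: the bag-of-words vectors, the greedy single-pass clustering, and the 0.3 similarity threshold are all invented for the example, and a real system would add proper tokenisation and ROUGE-based evaluation.

```python
# Sketch of clustering-based extractive summarisation: embed sentences
# as bag-of-words vectors, group similar sentences (which also removes
# redundancy), and keep one representative sentence per cluster.
# All parameter choices here are illustrative assumptions.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster_sentences(sentences, threshold=0.3):
    """Greedy single-pass clustering: each sentence joins the first
    cluster whose centroid it resembles, else starts a new cluster."""
    clusters = []  # list of (centroid Counter, [sentence indices])
    vectors = [Counter(s.lower().split()) for s in sentences]
    for i, v in enumerate(vectors):
        for centroid, members in clusters:
            if cosine(v, centroid) >= threshold:
                centroid.update(v)  # fold the sentence into the centroid
                members.append(i)
                break
        else:
            clusters.append((Counter(v), [i]))
    return clusters

def summarise(sentences, threshold=0.3):
    """Pick the earliest sentence of each cluster as its representative."""
    clusters = cluster_sentences(sentences, threshold)
    return [sentences[members[0]] for _, members in clusters]

docs = [
    "the central bank raised interest rates today",
    "interest rates were raised by the central bank",
    "heavy rain caused flooding in the capital",
]
print(summarise(docs))
```

The two near-duplicate sentences fall into one cluster, so the summary keeps only one of them alongside the unrelated sentence.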
2010 Handbook of Computational Linguistics and Natural Language Processing   book semantics, natural language URL  
BibTeX:
@book{alexon:_handb_of_comput_linguis_and,,
  title = {Handbook of Computational Linguistics and Natural Language Processing},
  editor = {Alex Clark and Chris Fox and Shalom Lappin},
  publisher = {Wiley-Blackwell},
  year = {2010},
  url = {http://eu.wiley.com/WileyCDA/WileyTitle/productCd-1405155817.html}
}
2010 Using Mechanical Turk to Create a Corpus of Arabic Summaries El-Haj, M., Kruschwitz, U. & Fox, C. Proceedings of the International Conference on Language Resources and Evaluation (LREC)   inproceedings arabic, natural language processing, summarisation URL  
Abstract: This paper describes the creation of a human-generated corpus of extractive Arabic summaries of a selection of Wikipedia and Arabic newspaper articles using Mechanical Turk—an online workforce. The purpose of this exercise was two-fold. First, it addresses a shortage of relevant data for Arabic natural language processing. Second, it demonstrates the application of Mechanical Turk to the problem of creating natural language resources. The paper also reports on a number of evaluations we have performed to compare the collected summaries against results obtained from a variety of automatic summarisation systems.
BibTeX:
@inproceedings{el-haj10:_using_mechan_turk_creat_corpus_arabic_summar,
  author = {Mahmoud El-Haj and Udo Kruschwitz and Chris Fox},
  title = {Using {Mechanical Turk} to Create a Corpus of {Arabic} Summaries},
  booktitle = {Proceedings of the International Conference on Language Resources and Evaluation (LREC)},
  address = {Valletta, Malta},
  year = {2010},
  note = {In the Language Resources (LRs) and Human Language Technologies (HLT) for Semitic Languages workshop held in conjunction with the 7th International Language Resources and Evaluation Conference (LREC 2010)},
  url = {https://core.ac.uk/display/16387298}
}
2010 Expressiveness and Complexity in Underspecified Semantics Fox, C. & Lappin, S. Linguistic Analysis   article semantics, natural language URL  
Abstract: In this paper we address an important issue in the development of
an adequate formal theory of underspecified semantics. The tension
between expressive power and computational tractability poses an
acute problem for any such theory. Generating the full set of resolved
scope readings from an underspecified representation produces a combinatorial
explosion that undermines the efficiency of these representations.
Moreover, Ebert (2005) shows that most current theories of underspecified
semantic representation suffer from expressive incompleteness. In
previous work we present an account of underspecified scope representations
within Property Theory with Curry Typing (PTCT), an intensional first-order
theory for natural language semantics. We review this account, and
we show that filters applied to the underspecified-scope terms of
PTCT permit expressive completeness. While they do not solve the
general complexity problem, they do significantly reduce the search
space for computing the full set of resolved scope readings in non-worst
cases. We explore the role of filters in achieving expressive completeness,
and their relationship to the complexity involved in producing full
interpretations from underspecified representations.

This paper is dedicated to Jim Lambek.
BibTeX:
@article{fox10:_expres_compl_under_seman,
  author = {Chris Fox and Shalom Lappin},
  title = {Expressiveness and Complexity in Underspecified Semantics},
  journal = {Linguistic Analysis},
  year = {2010},
  volume = {36},
  pages = {385--417},
  url = {http://chris.foxearth.org/papers/fox-lappin-ecus-2010.pdf}
}
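The filtering idea in the abstract can be illustrated with a toy enumerator: an underspecified term stands for all relative scope orders of the quantifiers, and filters discard orders that violate stated constraints, shrinking the set of readings that must be produced. The quantifier labels and the pairwise-ordering constraint format below are invented for illustration; PTCT itself encodes both the underspecified terms and the filters inside the object theory.

```python
# Sketch of filtered scope enumeration: an underspecified representation
# covers every permutation of the quantifiers; a filter (a, b) requires
# a to outscope b, pruning the readings generated. Toy example only.
from itertools import permutations

def scope_readings(quantifiers, filters=()):
    """Yield each quantifier order satisfying all outscoping filters."""
    for order in permutations(quantifiers):
        if all(order.index(a) < order.index(b) for a, b in filters):
            yield order

qs = ["every-student", "some-book", "no-teacher"]
print(len(list(scope_readings(qs))))  # unfiltered: 3! = 6 readings
print(list(scope_readings(qs, [("every-student", "some-book")])))
```

With one ordering constraint the six candidate readings drop to three, mirroring the paper's point that filters reduce the search space in non-worst cases without solving the general complexity problem.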
2010 The Good Samaritan and the Hygenic Cook Fox, C. Philosophy of Language and Linguistics   inproceedings semantics, natural language, deontic URL  
Abstract: When developing formal theories of the meaning of language, it is
appropriate to consider how apparent paradoxes and conundrums of
language are best resolved. But if we base our analysis on a small
sample of data then we may fail to take into account the influence
of other aspects of meaning on our intuitions. Here we consider the
so-called Good Samaritan Paradox (Prior, 1958), where we wish to
avoid any implication that there is an obligation to rob someone
from ``You must help a robbed man''. We argue that before settling
on a formal analysis of such sentences, we should consider examples
of the same form, but with intuitively different entailments---such
as ``You must use a clean knife''---and also actively seek other
examples that exhibit similar contrasts in meaning, even if they
do not exemplify the phenomenon that is under investigation. This
can refine our intuitions and help us to attribute aspects of interpretation
to the various facets of meaning.
BibTeX:
@inproceedings{fox2010:_good_samar_hygen_cook,
  author = {Chris Fox},
  title = {The Good {S}amaritan and the Hygenic Cook},
  booktitle = {Philosophy of Language and Linguistics},
  editor = {Piotr Stalmaszczyk},
  publisher = {Ontos Verlag},
  series = {Linguistics and Philosophy},
  year = {2010},
  volume = {I: The Formal Turn},
  note = {Paper based on a talk at the conference on the Philosophy of Language and Linguistics, {\L}\'{o}d\'{z}, Poland},
  url = {http://chris.foxearth.org/papers/C-Fox-PhiLang2009-draft.pdf}
}
2010 Computational Semantics Fox, C. Handbook of Computational Linguistics and Natural Language Processing   incollection semantics, natural language URL  
Abstract: A brief introduction to Computational Semantics.
BibTeX:
@incollection{foxon:_comput_seman,
  author = {Chris Fox},
  title = {Computational Semantics},
  booktitle = {Handbook of Computational Linguistics and Natural Language Processing},
  editor = {Alex Clark and Chris Fox and Shalom Lappin},
  publisher = {Wiley-Blackwell},
  year = {2010},
  url = {http://chris.foxearth.org/papers/CFox-Semantics-2010-draft.pdf}
}
2009 The Influence of Text Pre-processing on Plagiarism Detection Češka, Z. & Fox, C. Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2009)   inproceedings natural language URL  
Abstract: This paper explores the influence of text pre-processing techniques
on plagiarism detection. We examine stop-word removal, lemmatization,
number replacement, synonymy recognition, and word generalization.
We also look into the influence of punctuation and word-order within
N-grams. All these techniques are evaluated according to their impact
on F_1-measure and speed of execution. Our experiments were performed
on a Czech corpus of plagiarized documents about politics. At the
end of this paper, we propose what we consider to be the best combination
of text pre-processing techniques.
BibTeX:
@inproceedings{ceska09:_influen_text_pre_plagiar_detec,
  author = {Zdeněk Češka and Chris Fox},
  title = {The Influence of Text Pre-processing on Plagiarism Detection},
  booktitle = {Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2009)},
  address = {Borovets, Bulgaria},
  year = {2009},
  url = {http://chris.foxearth.org/papers/C-Fox-RANLP2009-paper.pdf}
}
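The pipeline studied in the paper can be sketched to show where pre-processing steps such as stop-word removal and number replacement slot in before N-gram comparison. The stop-word list, the trigram size, and the Jaccard measure below are illustrative assumptions, not the configuration evaluated in the paper (which worked on Czech and compared several further techniques).

```python
# Sketch of N-gram overlap for plagiarism detection with two of the
# pre-processing steps examined in the paper: stop-word removal and
# number replacement. Parameters and word lists are invented.
STOPWORDS = {"the", "a", "an", "of", "and", "in", "to", "is"}

def preprocess(text: str) -> list:
    tokens = text.lower().split()
    tokens = [t for t in tokens if t not in STOPWORDS]       # stop-word removal
    tokens = ["#NUM" if t.isdigit() else t for t in tokens]  # number replacement
    return tokens

def ngrams(tokens, n=3):
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(doc_a: str, doc_b: str, n=3) -> float:
    """Jaccard overlap of word N-grams after pre-processing."""
    a, b = ngrams(preprocess(doc_a), n), ngrams(preprocess(doc_b), n)
    return len(a & b) / len(a | b) if a | b else 0.0

original = "The council approved the budget of 2004 in a late session"
suspect  = "The council approved the budget of 2005 in a late session"
print(similarity(original, suspect))
```

Because number replacement maps both years to the same token, the two documents come out identical here; without that step the changed year would break three trigrams and lower the score.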
2009 Experimenting with Automatic Text Summarization for Arabic El-Haj, M., Kruschwitz, U. & Fox, C. Proceedings of the Fourth Language and Technology Conference: Human Language Technologies as a Challenge for Computer Science and Linguistics   inproceedings arabic, natural language processing, summarisation URL  
Abstract: The volume of information available on the Web is increasing rapidly. The need for systems that can automatically summarise documents is becoming ever more desirable. For this reason, text summarisation has quickly grown into a major research area as illustrated by the DUC and TAC conference series. Summarisation systems for Arabic are however still not as sophisticated and as reliable as those developed for languages like English. In this paper we discuss two summarisation systems for Arabic and report on a large user study performed on these systems. The first system, the Arabic Query-Based Text Summarisation System (AQBTSS), uses standard retrieval methods to map a query against a document collection and to create a summary. The second system, the Arabic Concept-Based Text Summarisation System (ACBTSS), creates a query-independent document summary. Five groups of users from different ages and educational levels participated in evaluating our systems.
BibTeX:
@inproceedings{el-haj09:_exper_autom_text_summar_arabic,
  author = {Mahmoud El-Haj and Udo Kruschwitz and Chris Fox},
  title = {Experimenting with Automatic Text Summarization for {Arabic}},
  booktitle = {Proceedings of the Fourth Language and Technology Conference: Human Language Technologies as a Challenge for Computer Science and Linguistics},
  editor = {Zygmunt Vetulani},
  address = {Poznań, Poland},
  year = {2009},
  pages = {365--369},
  url = {http://link.springer.com/chapter/10.1007/978-3-642-20095-3_45},
  isbn = {978-83-7177-746-2}
}
2009 The Good Samaritan and the Hygenic Cook: a cautionary tale about linguistic data Fox, C.   conference semantics, natural language, deontic URL  
Abstract: When developing formal theories of the meaning of language, it is
appropriate to consider how apparent paradoxes and conundrums of
language are best resolved. But if we base our analysis on a small
sample of data then we may fail to take into account the influence
of other aspects of meaning on our intuitions. Here we consider the
so-called Good Samaritan Paradox (Prior, 1958), where we wish to
avoid any implication that there is an obligation to rob someone
from ``You must help a robbed man''. We argue that before settling
on a formal analysis of such sentences, we should consider examples
of the same form, but with intuitively different entailments---such
as ``You must use a clean knife''---and also actively seek other
examples that exhibit similar contrasts in meaning, even if they
do not exemplify the phenomenon that is under investigation. This
can refine our intuitions and help us to attribute aspects of interpretation
to the various facets of meaning.
BibTeX:
@conference{fox09:_good_samar_hygen_cook,
  author = {Chris Fox},
  title = {The Good Samaritan and the Hygenic Cook: a cautionary tale about linguistic data},
  year = {2009},
  howpublished = {Talk at the conference on the Philosophy of Language and Linguistics, {\L}\'{o}d\'{z}, Poland},
  url = {http://chris.foxearth.org/slides/C-Fox-PhiLang2009-slides.pdf}
}
2009 Obligations, Permissions and Transgressions: an alternative approach to deontic reasoning Fox, C. Proceedings of the Tenth Symposium on Logic and Language   inproceedings semantics, natural language, deontic URL  
Abstract: This paper proposes a logic of transgressions for obligations and
permissions. A key objective of this logic is to allow deontic conflicts
(Lemmon, 1962) but without appealing to defeasible or paraconsistent
reasoning, or multiple levels of obligation. This logic of transgressions
can be viewed as conceptually related to those approaches that formulate
obligations in terms of ``escaping'' from a sanction (Prior, 1958;
Nowell-Smith and Lemmon, 1960), and its modal variants (Anderson,
1958; Kanger, 1971), but where the notion of a transgression is more
fine-grained than a single ``sanction''.
BibTeX:
@inproceedings{fox09:_oblig_permis_trans,
  author = {Chris Fox},
  title = {Obligations, Permissions and Transgressions: an alternative approach to deontic reasoning},
  booktitle = {Proceedings of the Tenth Symposium on Logic and Language},
  publisher = {Theoretical Linguistics Program, ELTE, Budapest},
  address = {Balatonszemes, Hungary},
  year = {2009},
  pages = {81--88},
  url = {http://www.nytud.hu/lola10/proceedings/fox.pdf}
}
2009 Statechart Slicing Luangsodsai, A. & Fox, C. Proceedings of the Sixth Joint Conference on Computer Science and Software Engineering (JCSSE2009)   inproceedings software analysis, processes URL  
Abstract: The paper discusses how to reduce a statechart model by slicing. We
start with the discussion of control dependencies and data dependencies
in statecharts. The and-or dependence graph is introduced to represent
control and data dependencies for statecharts. We show how to slice
statecharts by using this dependence graph. Our slicing approach
helps systems analysts and system designers in understanding system
specifications, maintaining software systems, and reusing parts of
systems models.
BibTeX:
@inproceedings{luangsodsai09:_statec_slicin,
  author = {Arthorn Luangsodsai and Chris Fox},
  title = {Statechart Slicing},
  booktitle = {Proceedings of the Sixth Joint Conference on Computer Science and Software Engineering (JCSSE2009)},
  address = {Phuket, Thailand},
  year = {2009},
  volume = {1},
  pages = {411--416},
  url = {http://chris.foxearth.org/papers/Arthorn-JCSSE2009-paper.pdf}
}
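The core of slicing, for statecharts as for programs, is backward reachability over a dependence graph: the slice with respect to a criterion is everything the criterion transitively depends on. The sketch below uses an invented four-statement example; the hard part in practice, which the paper addresses, is constructing the control and data dependence edges (and the and-or node structure) in the first place.

```python
# Sketch of slicing as backward reachability over a dependence graph.
# The graph below is a toy example; real dependence-graph construction
# for statecharts is the subject of the paper itself.
def backward_slice(dependences: dict, criterion: str) -> set:
    """dependences maps each node to the nodes it depends on.
    The slice is everything reachable from the criterion."""
    slice_set, worklist = set(), [criterion]
    while worklist:
        node = worklist.pop()
        if node in slice_set:
            continue
        slice_set.add(node)
        worklist.extend(dependences.get(node, ()))
    return slice_set

# Toy program: s1: x = input(); s2: y = input(); s3: z = x + 1; s4: print(z)
deps = {"s4": ["s3"], "s3": ["s1"], "s2": []}
print(sorted(backward_slice(deps, "s4")))
```

Statement s2 never influences the criterion s4, so it is sliced away; the smaller the slice, the more useful it is for comprehension and maintenance.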
2008 Journal of Logic and Computation Special Issue: Lambda Calculus, Type Theory and Natural Language II   proceedings semantics, natural language URL  
Abstract: This special issue was a spin off of the second workshop on Lambda
Calculus, Type Theory and Natural Language.
BibTeX:
@proceedings{fernandez08:_special_issue_of_journ_of,
  title = {Journal of Logic and Computation},
  booktitle = {Special Issue: Lambda Calculus, Type Theory and Natural Language II},
  editor = {Maribel Fernández and Chris Fox and Shalom Lappin},
  year = {2008},
  volume = {18(2)},
  number = {2},
  pages = {203--318},
  note = {Lambda Calculus, Type Theory and Natural Language II},
  url = {http://logcom.oxfordjournals.org/content/vol18/issue2/index.dtl}
}
2008 Imperatives: a logic of satisfaction Fox, C.   unpublished semantics, natural language URL  
Abstract: The paper discusses some issues concerning the semantic behaviour
of imperatives, presents a proof-theoretic formalisation that captures
their semantic behaviour in terms of propositional satisfaction criteria,
and relates this approach to some existing proposals. It can be viewed
as an attempt to formalise a "logic of satisfaction", as described
by Hare (1967). Conditional imperatives and pseudo-imperatives are
also considered, with some consideration of the nature of "practical
inference" where propositions and imperatives appear together (Kenny,
1966). The issues that arise concerning disjunction introduction
are discussed, including Ross' Paradox (Ross, 1941). A complementary
notion of "refinement" for imperatives is introduced that captures
"validity" relationships between commands. The theory is compared
with some other proposals in the literature.
BibTeX:
@unpublished{fox08:_imper,
  author = {Chris Fox},
  title = {Imperatives: a logic of satisfaction},
  year = {2008},
  note = {Draft paper, to be revised for publication. See Imperatives: a judgemental analysis},
  url = {http://chris.foxearth.org/papers/C-Fox-Satisfaction-2008.pdf}
}
2007 Axiomatic Imperatives Fox, C.   conference semantics, natural language URL  
Abstract: In the case of indicative sentences, broadly speaking there is some
consensus about how to approximate a range of phenomena by appeal
to truth conditional semantics and various forms of predicate logic
(modulo differences in theoretical framework and philosophical taste).
Even when these approximations fall short, they can help provide
a background against which the behaviour of more recalcitrant data
can be better understood.


In the case of imperatives, there have been various proposals for
their formal semantics. Unfortunately, these theories are often presented
without all the relevant formal details, and may rely on complex
and problematic notions, such as actions, cause and effect. This
can make it difficult to compare and evaluate such theories, or discern
any general consensus about how to address a given phenomenon.


The current proposal seeks to capture the formal logical behaviour
of core imperatives by way of inference rules over propositional
satisfaction criteria. One objective is to find a level of abstraction
which avoids troublesome notions such as actions and causality, but
with sufficient expressive power to capture key intuitions about
the meaning of imperatives. In addition to giving an informative
analysis, the hope is that this will provide a baseline of clearly
formulated and generally accepted patterns of behaviour that can
be used to evaluate other proposals, and help us understand more
recalcitrant data.
BibTeX:
@conference{fox07:_axiom_imper,
  author = {Chris Fox},
  title = {Axiomatic Imperatives},
  year = {2007},
  howpublished = {Invited paper presented at the Linguistics Association of Great Britain (LAGB) workshop ``Issues in Dynamic Semantics,'' at King's College London},
  url = {http://chris.foxearth.org/slides/C-Fox-LAGB-2007.pdf}
}
2007 Expressive Completeness and Computational Efficiency for Underspecified Representations Fox, C. & Lappin, S. Festschrift for Robin Cooper   incollection semantics, natural language URL  
Abstract: Cooper (1983) pioneered underspecified scope representation in formal
and computational semantics through his introduction of quantifier
storage into Montague semantics as an alternative to the syntactic
operation of quantifying-in. In this paper we address an important
issue in the development of an adequate formal theory of underspecified
semantics. The tension between expressive power and computational
tractability poses an acute problem for any such theory. Ebert (2005)
shows that any reasonable current treatment of underspecified semantic
representation either suffers from expressive incompleteness or produces
a combinatorial explosion that is equivalent to generating the full
set of possible scope readings in the course of disambiguation. In
previous work we have presented an account of underspecified scope
representations within Property Theory with Curry Typing (PTCT),
an intensional first-order theory for natural language semantics.
Here we show how filters applied to the underspecified-scope terms
of PTCT permit both expressive completeness and the reduction of
computational complexity in a significant class of non-worst case
scenarios.
BibTeX:
@incollection{fox07:_expres_compl_and_comput_effic,
  author = {Chris Fox and Shalom Lappin},
  title = {Expressive Completeness and Computational Efficiency for Underspecified Representations},
  booktitle = {Festschrift for Robin Cooper},
  editor = {Lars Borin and Staffan Larsson},
  year = {2007},
  note = {Celebrating the occasion of Robin Cooper's 60th birthday},
  url = {http://chris.foxearth.org/papers/C-Fox-Festschrift-Cooper-2007.pdf}
}
2007 Mathematics for Computing Fox, C.   misc distance learning  
Abstract: Stand-alone ``computer aided learning'' resources, based upon an existing
printed volume.
BibTeX:
@misc{fox07:_mathem_for_comput,
  author = {Chris Fox},
  title = {Mathematics for Computing},
  year = {2007},
  howpublished = {University of London. Stand-alone ``computer aided learning'' resources, based upon an existing printed volume},
  note = {Stand-alone ``computer aided learning'' resources, based upon an existing printed volume}
}
2006 And-Or Dependence Graphs for Slicing Statecharts Fox, C. & Luangsodsai, A. Beyond Program Slicing   inproceedings software analysis URL  
Abstract: The construction of an And-Or dependence graph is illustrated, and
its use in slicing statecharts is described. The additional structure
allows for more precise slices to be constructed in the event of
additional information, such as may be provided by static analysis
and model checking, and with constraints on the global state and
external events.
BibTeX:
@inproceedings{fox_et_al:DSP:2006:493,
  author = {Chris Fox and Arthorn Luangsodsai},
  title = {And-Or Dependence Graphs for Slicing Statecharts},
  booktitle = {Beyond Program Slicing},
  editor = {David W. Binkley and Mark Harman and Jens Krinke},
  publisher = {Internationales Begegnungs- und Forschungszentrum f{\"u}r Informatik (IBFI), Schloss Dagstuhl, Germany},
  series = {Dagstuhl Seminar Proceedings},
  address = {Dagstuhl, Germany},
  year = {2006},
  number = {05451},
  url = {http://drops.dagstuhl.de/opus/volltexte/2006/493}
}
2005 Integrating Role Activity Diagrams and Hybrid IDEF for Business Process Modeling Using MDA Badica, C., Teodorescu, M., Spahiu, C., Badica, A. & Fox, C. Proceedings of the Seventh International Symposium on Symbolic and Numeric Algorithms for Scientific Computing   inproceedings processes URL  
Abstract: Business process modeling is an important phase during requirements
collection. Usually functional, dynamic and role models are needed.
We propose to integrate Role Activity Diagrams and Hybrid IDEF for
business process modeling within Model Driven Architecture. Our proposal
is demonstrated with a sample implementation.
BibTeX:
@inproceedings{badica05:_inter_role_activ_diagr_and,
  author = {Costin Badica and Maria Teodorescu and Cosmin Spahiu and Amelia Badica and Chris Fox},
  title = {Integrating {Role Activity Diagrams} and Hybrid {IDEF} for Business Process Modeling Using {MDA}},
  booktitle = {Proceedings of the Seventh International Symposium on Symbolic and Numeric Algorithms for Scientific Computing},
  publisher = {IEEE Computer Society},
  address = {Timisoara, Romania},
  year = {2005},
  url = {http://chris.foxearth.org/papers/C-Fox-synasc2005-paper.pdf}
}
2005 ConSUS: A Light-Weight Program Conditioner Danicic, S., Daoudi, M., Fox, C., Harman, M., Hierons, R., Howroyd, J., Ourabya, L. & Ward, M. Journal of Systems and Software, special issue on Software Reverse Engineering   article software analysis URL  
Abstract: Program conditioning consists of identifying and removing a set of
statements which cannot be executed when a condition of interest
holds at some point in a program. It has been applied to problems
in maintenance, testing, reuse and re-engineering. Program conditioning
relies upon both symbolic execution and reasoning about symbolic
predicates. Automation of the process therefore requires some form
of automated theorem proving. However, the use of a full-power `heavyweight'
theorem prover would impose unrealistic performance constraints.


This paper reports on a lightweight approach to theorem proving using
the FermaT simplify decision procedure. This is used as a component
to ConSUS, a program conditioning system for the Wide Spectrum Language
WSL. The paper describes the symbolic execution algorithm used by
ConSUS, which prunes as it conditions.


The paper also provides empirical evidence that conditioning produces
a significant reduction in program size and, although exponential
in the worst case, the conditioning system has low degree polynomial
behaviour in many cases, thereby making it scalable to unit level
applications of program conditioning.
BibTeX:
@article{danicic05:_consus,
  author = {Sebastian Danicic and Mohammed Daoudi and Chris Fox and Mark Harman and Rob Hierons and John Howroyd and Lahcen Ourabya and Martin Ward},
  title = {ConSUS: A Light-Weight Program Conditioner},
  journal = {Journal of Systems and Software, special issue on Software Reverse Engineering},
  year = {2005},
  volume = {77},
  number = {3},
  pages = {241--262},
  url = {http://chris.foxearth.org/papers/C-Fox-JSS-paper.pdf}
}
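The essence of conditioning, removing statements that cannot execute when the condition of interest holds, can be sketched with a deliberately tiny stand-in for the real machinery. ConSUS symbolically executes WSL and consults the FermaT simplify decision procedure; in the sketch below the "program" is an invented list of guarded statements over one integer variable, and the "theorem prover" is plain interval reasoning.

```python
# Sketch of program conditioning: given a condition assumed to hold on
# entry, drop branches whose guards it makes unsatisfiable. Guards and
# conditions are (lo, hi) intervals on a single variable x; this is an
# illustrative stand-in for a real decision procedure.
def satisfiable(guard, condition):
    """Can some x lie in both intervals?"""
    lo = max(guard[0], condition[0])
    hi = min(guard[1], condition[1])
    return lo <= hi

def condition_program(branches, condition):
    """Keep only branches whose guard is satisfiable under the condition."""
    return [stmt for guard, stmt in branches if satisfiable(guard, condition)]

program = [
    ((0, 9),     "handle_small()"),
    ((10, 99),   "handle_medium()"),
    ((100, 999), "handle_large()"),
]
# Condition of interest: x < 50, i.e. x in [0, 49]
print(condition_program(program, (0, 49)))
```

Under the condition x < 50 the large-input branch can never run and is removed, which is exactly the size reduction the paper measures at scale.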
2005 Slicing Algorithms are Minimal for Free Liberal Program Schemas Danicic, S., Fox, C., Harman, M., Howroyd, J. & Lawrence., M. L. Computer Journal   article software analysis URL  
Abstract: Program slicing is an automated source code extraction technique that
has been applied to a number of problems including testing, debugging,
maintenance, reverse engineering, program comprehension, reuse and
program integration. In all these applications the size of the slice
is crucial; the smaller the better. It is known that statement minimal
slices are not computable, but the question of dataflow minimal slicing
has remained open since Weiser posed it in 1979. This paper proves
that static slicing algorithms produce dataflow minimal end slices
for programs which can be represented as schemas which are free and
liberal.
BibTeX:
@article{danicic05:_slicin_algor_are_minim_for,
  author = {Sebastian Danicic and Chris Fox and Mark Harman and John Howroyd and Michael L. Lawrence.},
  title = {Slicing Algorithms are Minimal for Free Liberal Program Schemas},
  journal = {Computer Journal},
  year = {2005},
  volume = {48},
  pages = {737--748},
  url = {http://comjnl.oxfordjournals.org/cgi/content/abstract/bxh121?ijkey=m4zwzCzzC9J6uvW&keytype=ref}
}
2005 Proceedings of the second workshop on Lambda Calculus, Type Theory and Natural Language   proceedings semantics, natural language URL  
Abstract: This is the second Workshop on Lambda Calculus, Type Theory, and Natural
Language (LCTTNL). The first workshop was held in London in December
2003, and selected papers were published in a special issue of the
Journal of Logic and Computation, Volume 15 Number 2, April 2005.
The workshop was established with the goal of bringing together researchers
interested in functional programming, type theory, and the application
of the lambda calculus to the analysis of natural language. The communities
that have developed around each of these areas share many formal
and computational interests, but typically have little contact with
each other. LCTTNL is intended to provide a forum in which people
working on the lambda calculus from a variety of distinct perspectives
will share their research and come to appreciate new domains of application.
The success of the first workshop prompted us to hold this second
workshop.
BibTeX:
@proceedings{fernandez05:_proceed_of_secon_works_lambd,
  title = {Proceedings of the second workshop on Lambda Calculus, Type Theory and Natural Language},
  editor = {Maribel Fernández and Chris Fox and Shalom Lappin},
  address = {King's College London},
  year = {2005},
  url = {http://lcttnl.foxearth.org/}
}
2005 Journal of Logic and Computation Lambda Calculus, Type Theory and Natural Language   proceedings semantics, natural language URL  
Abstract: This special issue was a spin off of the first workshop on Lambda
Calculus, Type Theory and Natural Language.
BibTeX:
@proceedings{fernandez05:_special_issue_of_journ_of,
  title = {Journal of Logic and Computation},
  booktitle = {Lambda Calculus, Type Theory and Natural Language},
  editor = {Maribel Fernández and Chris Fox and Shalom Lappin},
  year = {2005},
  volume = {15(2)},
  number = {2},
  pages = {83--240},
  note = {Lambda Calculus, Type Theory and Natural Language},
  url = {http://logcom.oxfordjournals.org/content/vol15/issue2/index.dtl}
}
2005 Achieving Expressive Completeness and Computational Efficiency with Underspecified Scope Representations Fox, C. & Lappin, S. Proceedings of the Fifteenth Amsterdam Colloquium   inproceedings semantics, natural language URL  
Abstract: Ebert (2005) points out that most current theories of underspecified
semantic representation either suffer from expressive incompleteness
or do not avoid generating the full set of possible scope readings
in the course of disambiguation. In previous work we have presented
an account of underspecified scope representations within an intensional
first-order property theory enriched with Curry Typing for natural
language semantics. Here we show how filters applied to the underspecified
scope terms of this theory permit both expressive completeness and
the reduction of the search space of possible scope interpretations.
BibTeX:
@inproceedings{fox05:_achiev_expres_compl_and_comput,
  author = {Chris Fox and Shalom Lappin},
  title = {Achieving Expressive Completeness and Computational Efficiency with Underspecified Scope Representations},
  booktitle = {Proceedings of the Fifteenth Amsterdam Colloquium},
  year = {2005},
  url = {http://www.illc.uva.nl/AC05/uploaded_files/AC05Proceedings.pdf}
}
2005 Foundations of Intensional Semantics Fox, C. & Lappin, S.   book semantics, natural language URL  
Abstract: We present Property Theory with Curry Typing (PTCT), an intensional
first-order logic for natural language semantics. PTCT permits fine-grained
specifications of meaning. It also supports polymorphic types and
separation types. We develop an intensional number theory within
PTCT in order to represent proportional generalized quantifiers like
``most.'' We use the type system and our treatment of generalized
quantifiers in natural language to construct a type-theoretic approach
to pronominal anaphora that avoids some of the difficulties that
undermine previous type-theoretic analyses of this phenomenon.
BibTeX:
@book{fox05:_found_of_inten_seman,
  author = {Chris Fox and Shalom Lappin},
  title = {Foundations of Intensional Semantics},
  publisher = {Blackwell},
  year = {2005},
  url = {http://www.blackwellpublishing.com/book.asp?ref=063123375X&site=1}
}
2005 Program Slicing and Conditioning Fox, C.   misc software analysis URL  
Abstract: An introduction to the notions of program slicing and program conditioning.
BibTeX:
@misc{fox05:_progr_slicin_and_condit,
  author = {Chris Fox},
  title = {Program Slicing and Conditioning},
  year = {2005},
  howpublished = {Theoretical Computer Science Seminar, University of Kent},
  note = {Talk at the Theoretical Computer Science Seminar, University of Kent},
  url = {http://chris.foxearth.org/slides/C-Fox-Kent-slides-2005.pdf}
}
2005 Subject guide LaTeX classfile Fox, C.   misc distance learning  
Abstract: Subject guide LaTeX classfile for automating the production of distance
learning materials for the University of London in their house style.
BibTeX:
@misc{fox05:_subjec_guide_latex_class,
  author = {Chris Fox},
  title = {Subject guide \LaTeX\ classfile},
  year = {2005},
  howpublished = {For automating the production of distance learning materials for the University of London in their house style. First version 2003. Final version 2005.},
  note = {For automating the production of distance learning materials for the University of London in their house style}
}
2005 Underspecified Interpretations in a Curry-Typed Representation Language Fox, C. & Lappin, S. Journal of Logic and Computation   article semantics, natural language DOIURL  
Abstract: Abstract In previous work we have developed Property Theory with Curry
Typing (PTCT), an intensional first-order logic for natural language
semantics. PTCT permits fine-grained specifications of meaning. It
also supports polymorphic types and separation types. We develop
an intensional number theory within PTCT in order to represent proportional
generalized quantifiers like "most", and we suggest a dynamic type-theoretic
approach to anaphora and ellipsis resolution. Here we extend the
type system to include product types, and use these to define a permutation
function that generates underspecified scope representations within
PTCT. We indicate how filters can be added to encode constraints
on possible scope readings. Our account offers several important
advantages over other current theories of underspecification.
BibTeX:
@article{fox05:_under_inter_in_curry_typed_repres_languag,
  author = {Chris Fox and Shalom Lappin},
  title = {Underspecified Interpretations in a {C}urry-Typed Representation Language},
  journal = {Journal of Logic and Computation},
  year = {2005},
  volume = {15},
  number = {2},
  pages = {131--143},
  url = {http://chris.foxearth.org/papers/C-Fox-JLAC-2005.pdf},
  doi = {10.1093/logcom/exi006}
}
2005 Polymorphic Quantifiers and Underspecification in Natural Language Fox, C. & Lappin, S. We Will Show Them: Essays in Honour of Dov Gabbay   incollection semantics, natural language URL  
Abstract: It is reasonably well-understood that natural language displays polymorphic
behaviour in both its syntax and semantics, where various constructions
can operate on a range of syntactic categories, or semantic types.
In mathematics, logic and computer science it is appreciated that
there are various ways in which such type-general behaviours can
be formulated. It is also known that natural languages are highly
ambiguous with respect to scoping artifacts, as evident with quantifiers,
negation and certain modifier expressions. To deal with such issues,
formal frameworks have been explored in which the polymorphic nature
of natural language can be expressed, and theories of underspecified
semantics have been proposed which seek to separate the process of
pure compositional interpretation from the assignment of scope. To
date, however, there has been no work on bringing these two aspects
together; there are no semantic treatments of scope ambiguity and
underspecification that explicitly take into account the polymorphic
nature of natural language quantifiers.


In this paper, we extend an existing treatment of underspecification
and scope ambiguity in Property Theory with Curry Typing (PTCT) to
deal with arbitrary types of quantification by adopting a form of
polymorphism. In this theory of underspecification, all of the expressions
in the theory are terms of the logic; there is no ``meta-semantic''
machinery. For this reason all aspects of the theory must be able
to deal with polymorphism appropriately.
BibTeX:
@incollection{fox05:_we_will_show_them,
  author = {Chris Fox and Shalom Lappin},
  title = {Polymorphic Quantifiers and Underspecification in Natural Language},
  booktitle = {We Will Show Them: Essays in Honour of Dov Gabbay},
  editor = {S. Artemov and H. Barringer and A. S. d'Avila Garcez and L. C. Lamb and J. Woods},
  publisher = {College Publications},
  year = {2005},
  url = {http://chris.foxearth.org/papers/C-Fox-Festschrift-Gabbay-2007.pdf}
}
2005 Branch coverage preserving transformations for unstructured programs Hierons, R., Harman, M. & Fox, C. Computer Journal   article software analysis URL  
Abstract: Test data generation by hand is a tedious, expensive and error-prone
activity, yet testing is a vital part of the development process.
Several techniques have been proposed to automate the generation
of test data, but all of these are hindered by the presence of unstructured
control flow. This paper addresses the problem using testability
transformation. Testability transformation does not preserve the
traditional meaning of the program; rather, it deals with preserving
test-adequate sets of input data. This requires new equivalence relations
which, in turn, entail novel proof obligations. The paper illustrates
this using the branch coverage adequacy criterion and develops a
branch adequacy equivalence relation and a testability transformation
for restructuring. It then presents a proof that the transformation
preserves branch adequacy.
BibTeX:
@article{hierons05:_branc_cover_preser_trans_for_unstr_progr,
  author = {Rob Hierons and Mark Harman and Chris Fox},
  title = {Branch coverage preserving transformations for unstructured programs},
  journal = {Computer Journal},
  year = {2005},
  volume = {48},
  number = {4},
  pages = {421--436},
  url = {http://chris.foxearth.org/papers/C-Fox-Computer-Journal-2005.pdf}
}
2004 Hybrid IDEF0/IDEF3 modelling of business processes: syntax, semantics and expressiveness Badica, C. & Fox, C. Computer Aided Verification of Information Systems (CaVIS 2004)   inproceedings processes URL  
Abstract: A description of the process dimension of a notation for business
process modelling that integrates aspects from IDEF0 and IDEF3 in
a novel way is presented. The features of this notation include black
box modelling of activities in the style of IDEF0 and glass box refinements
of activities using connectors for specifying process branching in
the style of IDEF3. The semantics of the notation is given by a mapping
to a place/transition net. The notation is shown to be as expressive
as a Standard Workflow Model.
BibTeX:
@inproceedings{badica04:_hybrid_idef0_idef3_model_of_busin_proces,
  author = {Costin Badica and Chris Fox},
  title = {Hybrid {IDEF0/IDEF3} modelling of business processes: syntax, semantics and expressiveness},
  booktitle = {Computer Aided Verification of Information Systems (CaVIS 2004)},
  address = {Timisoara, Romania},
  year = {2004},
  pages = {20--22},
  url = {http://chris.foxearth.org/papers/C-Fox-CaVIS04-paper.pdf}
}
2004 Verification of Multiple Input/Multiple Output Business Processes Badica, C. & Fox, C. Proceedings of the 2004 IEEE International Conference in Information Reuse and Integration (IEEE IRI-2004)   inproceedings processes URL  
Abstract: In many business process modelling situations using Petri nets, the
resulting model does not have a single input place and a single output
place. Therefore, the correctness of the model cannot be assessed
within the existing frameworks, which are devised for workflow nets
--- a particular class of Petri nets with a single input place and
a single output place. Moreover, the existing approaches for tackling
this problem are rather simplistic and they do not work even for
simple examples. This paper shows that, by an appropriate reduction
of a multiple input/multiple output Petri net, it is possible to
use the existing techniques to check the correctness of the original
process. The approach is demonstrated with an appropriate example.
BibTeX:
@inproceedings{badica04:_verif_of_multip_input_multip,
  author = {Costin Badica and Chris Fox},
  title = {Verification of Multiple Input/Multiple Output Business Processes},
  booktitle = {Proceedings of the 2004 IEEE International Conference in Information Reuse and Integration (IEEE IRI-2004)},
  address = {Las Vegas, Nevada, USA},
  year = {2004},
  note = {Sponsored by IEEE Systems, Man and Cybernetics Society},
  url = {http://chris.foxearth.org/papers/C-Fox-iri2004-paper.pdf}
}
2004 ConSIT: A Fully Automated Conditioned Program Slicer Fox, C., Danicic, S., Harman, M. & Hierons, R. Software --- Practice and Experience (SPE)   article software analysis URL  
Abstract: Conditioned slicing is a source code extraction technique. The extraction
is performed with respect to a slicing criterion which contains a
set of variables and conditions of interest. Conditioned slicing
removes the parts of the original program which cannot affect the
variables at the point of interest, when the conditions are satisfied.
This produces a conditioned slice, which preserves the behaviour
of the original with respect to the slicing criterion.


Conditioned slicing has applications in source code comprehension,
reuse, restructuring and testing. Unfortunately, implementation is
not straightforward because the full exploitation of conditions requires
the combination of symbolic execution, theorem proving and traditional
static slicing. Hitherto, this difficulty has hindered the development
of fully automated conditioned slicing tools.


This paper describes the first fully automated conditioned slicing
system, ConSIT, detailing the theory that underlies it, its architecture
and the way it combines symbolic execution, theorem proving and slicing
technologies. The use of ConSIT is illustrated with respect to the
applications of testing and comprehension.
BibTeX:
@article{fox04:_consit,
  author = {Chris Fox and Sebastian Danicic and Mark Harman and Rob Hierons},
  title = {ConSIT: A Fully Automated Conditioned Program Slicer},
  journal = {Software --- Practice and Experience (SPE)},
  year = {2004},
  volume = {34},
  number = {1},
  pages = {15--46},
  note = {John Wiley \& Sons. Also published online on 26th November 2003},
  url = {http://chris.foxearth.org/papers/C-Fox-SPE-paper.pdf}
}
2004 An Expressive First-Order Logic with Flexible Typing for Natural Language Semantics Fox, C. & Lappin, S. Logic Journal of the Interest Group in Pure and Applied Logics   article semantics, natural language URL  
Abstract: We present Property Theory with Curry Typing (PTCT), an intensional
first-order logic for natural language semantics. PTCT permits fine-grained
specifications of meaning. It also supports polymorphic types and
separation types. We develop an intensional number theory within
PTCT in order to represent proportional generalized quantifiers like
``most.'' We use the type system and our treatment of generalized
quantifiers in natural language to construct a type-theoretic approach
to pronominal anaphora that avoids some of the difficulties that
undermine previous type-theoretic analyses of this phenomenon.
BibTeX:
@article{fox04:_expres_first_order_logic_with,
  author = {Chris Fox and Shalom Lappin},
  title = {An Expressive First-Order Logic with Flexible Typing for Natural Language Semantics},
  journal = {Logic Journal of the Interest Group in Pure and Applied Logics},
  year = {2004},
  volume = {12},
  number = {2},
  pages = {135--168},
  url = {http://chris.foxearth.org/papers/C-Fox-IGPL2004-paper.pdf}
}
2004 Generalized Quantifiers with Underspecified Scope Relations in a First-Order Representation Language Fox, C. & Lappin, S. Strategies of Quantification   conference semantics, natural language URL  
Abstract: In this paper we show that by adding Curry typing to a first-order
property theory it is possible to represent the full range of generalized
quantifiers (GQs) corresponding to natural language determiners.
We characterize GQs as property terms that specify cardinality relations
between properties (or separation types). We also generate underspecified
quantifier scope representations within the representation language,
rather than through meta-language devices, as in most current treatments
of underspecification (Reyle, 1993; Bos, 1995; Blackburn & Bos,
2003; Copestake, Flickinger, & Sag, 1997).
BibTeX:
@conference{fox04:_gener_quant_with_under_scope,
  author = {Chris Fox and Shalom Lappin},
  title = {Generalized Quantifiers with Underspecified Scope Relations in a First-Order Representation Language},
  booktitle = {Strategies of Quantification},
  address = {York, UK},
  year = {2004},
  url = {http://chris.foxearth.org/papers/C-Fox-York-2004.pdf}
}
2004 Generating underspecified interpretations as terms of the representation language Fox, C.   misc semantics, natural language URL  
Abstract: In previous work we have developed Property Theory with Curry Typing
(PTCT), an intensional first-order logic for natural language semantics.
PTCT permits fine-grained specifications of meaning. It also supports
polymorphic types and separation types. We develop an intensional
number theory within PTCT in order to represent proportional generalized
quantifiers like most, and we suggest a dynamic type-theoretic approach
to anaphora and ellipsis resolution. Here we extend the type system
to include product types, and use these to define a permutation function
that generates underspecified scope representations within PTCT.
We indicate how filters can be added to encode constraints on possible
scope readings. Our account offers several important advantages over
other current theories of underspecification.
BibTeX:
@misc{fox04:_gener_under_inter_as_terms,
  author = {Chris Fox},
  title = {Generating underspecified interpretations as terms of the representation language},
  year = {2004},
  howpublished = {Invited talk at the Eighth International Symposium on Logic and Language (LoLa8), Debrecen, Hungary},
  note = {Invited talk at the Eighth International Symposium on Logic and Language (LoLa8), Debrecen, Hungary. Joint work with Shalom Lappin},
  url = {http://chris.foxearth.org/slides/C-Fox-LoLa8-slides.pdf}
}
2004 Natural Language Semantics in a Flexibly Typed Intensional Logic Fox, C.   misc semantics, natural language URL  
Abstract: In this talk I shall present Property Theory with Curry Typing (PTCT),
an intensional first-order theory for natural language semantics
developed by myself and Shalom Lappin. PTCT permits fine-grained
specifications of meaning. It also supports polymorphic types and
separation types. We have developed an intensional number theory
within PTCT in order to represent proportional generalized quantifiers
like "most". We use the type system and our treatment of generalized
quantifiers in natural language to construct a type-theoretic approach
to pronominal anaphora and ellipsis. We have also developed a theory
of underspecification that is expressed within the term language
of the theory.


The talk will focus on the basics of PTCT itself, and outline the
treatment of anaphora and ellipsis. If there is time, a sketch of
our treatment of underspecification may also be given.
BibTeX:
@misc{fox04:_natur_languag_seman_in_flexib,
  author = {Chris Fox},
  title = {Natural Language Semantics in a Flexibly Typed Intensional Logic},
  year = {2004},
  howpublished = {The ITRI Seminar series, University of Brighton},
  note = {Invited talk at the ITRI Seminar series, University of Brighton. Joint work with Shalom Lappin},
  url = {http://chris.foxearth.org/slides/C-Fox-ITRI-2004-slides.pdf}
}
2004 A Type-Theoretic Approach to Anaphora and Ellipsis Resolution Fox, C.   misc semantics, natural language URL  
Abstract: We present an approach to anaphora and ellipsis resolution in which
pronouns and elided structures are interpreted by the dynamic identification
in discourse of type constraints on their semantic representations.
The content of these conditions is recovered in context from an antecedent
expression. The constraints define separation types (sub-types) in
Property Theory with Curry Typing (PTCT), an expressive first-order
logic with Curry typing that we have proposed as a formal framework
for natural language semantics.
BibTeX:
@misc{fox04:_type_theor_approac_to_anaph,
  author = {Chris Fox},
  title = {A Type-Theoretic Approach to Anaphora and Ellipsis Resolution},
  year = {2004},
  howpublished = {The Human Communications Research Centre Colloquium, Edinburgh},
  note = {Invited talk at The Human Communications Research Centre Colloquium, Edinburgh. Joint work with Shalom Lappin},
  url = {http://chris.foxearth.org/slides/C-Fox-Edinburgh-slides-2004.pdf}
}
2004 Agents Interpreting Imperative Sentences Pérez-Ramírez, M. & Fox, C. Proceedings of the Fifth International Conference on Intelligent Text Processing and Computational Linguistics (CICLing 2004)   inproceedings semantics, natural language  
Abstract: The aim of this paper is to present a model for the interpretation
of imperative sentences in which reasoning agents play the role of
speakers and hearers. A requirement is associated with both the person
who makes and the person who receives the order, which prevents the
hearer from coming to inappropriate conclusions about the actions s/he
has been commanded to do. By relating imperatives with the actions
they prescribe, the dynamic aspect of imperatives is captured. Further,
by using the idea of `encapsulation', it is possible to distinguish
what is demanded by an imperative from the inferential consequences
of the imperative. These two ingredients provide agents with the
tools to avoid inferential problems in interpretation.
BibTeX:
@inproceedings{perez-ramirez04:_agent_inter_imper_senten,
  author = {M.~P\'{e}rez-Ram\'{i}rez and C.~Fox},
  title = {Agents Interpreting Imperative Sentences},
  booktitle = {Proceedings of the Fifth International Conference on Intelligent Text Processing and Computational Linguistics (CICLing 2004)},
  series = {Lecture Notes in Computer Science (LNCS)},
  address = {Seoul, South Korea},
  year = {2004}
}
2004 The Role of Imperatives in Inference: Agents and Actions Pérez-Ramírez, M. & Fox, C. Proceedings of the Mexican International Conference on Artificial Intelligence (MICAI'04)   inproceedings semantics, natural language  
Abstract: The aim of this paper is to present a model for the interpretation
of imperative sentences in which reasoning agents play the role of
speakers and hearers. A requirement is associated with both the person
who makes and the person who receives the order, which prevents the
hearer from coming to inappropriate conclusions about the actions s/he
has been commanded to do. By relating imperatives with the actions
they prescribe, the dynamic aspect of imperatives is captured and
by using the idea of encapsulation, it is possible to distinguish
what is demanded from what is not. These two ingredients provide
agents with the tools to avoid inferential problems in interpretation.
BibTeX:
@inproceedings{perez-ramirez04:_role_of_imper_in_infer,
  author = {M.~P\'{e}rez-Ram\'{i}rez and C.~Fox},
  title = {The Role of Imperatives in Inference: Agents and Actions},
  booktitle = {Proceedings of the Mexican International Conference on Artificial Intelligence (MICAI'04)},
  address = {Mexico City, Mexico},
  year = {2004}
}
2003 Workshop on Lambda Calculus, Type Theory and Natural Language   proceedings semantics, natural language  
Abstract: The Workshop on Lambda Calculus, Type Theory, and Natural Language
(LCTTNL) is designed to bring together researchers interested in
functional programming, type theory, and the application of the lambda
calculus to the analysis of natural language. The communities that
have developed around each of these areas share many formal and computational
interests. Unfortunately, they have not had much contact with each
other. We hope that LCTTNL will provide a forum in which people working
on the lambda calculus from a variety of distinct perspectives will
share their research and come to appreciate new domains of application.
BibTeX:
@proceedings{fernandez03:_works_lambd_calcul_type_theor,
  title = {Workshop on Lambda Calculus, Type Theory and Natural Language},
  editor = {Maribel Fernández and Chris Fox and Shalom Lappin},
  address = {London, UK},
  year = {2003},
  note = {The original workshop website is at http://www.dcs.kcl.ac.uk/staff/maribel/Workshop-Kings.html}
}
2003 Doing Natural Language Semantics in An Expressive First-Order Logic with Flexible Typing Fox, C. & Lappin, S. Proceedings of the Eighth Conference on Formal Grammar 2003 (FGVienna)   inproceedings semantics, natural language URL  
Abstract: We present Property Theory with Curry Typing (PTCT), an intensional
first-order logic for natural language semantics. PTCT permits fine-grained
specifications of meaning. It also supports polymorphic types and
separation types. (Separation types are also known as sub-types.)
We develop an intensional number theory within PTCT in order to represent
proportional generalized quantifiers like most. We use the
type system and our treatment of generalized quantifiers in natural
language to construct a type-theoretic approach to pronominal anaphora
that avoids some of the difficulties that undermine previous type-theoretic
analyses of this phenomenon.
BibTeX:
@inproceedings{fox03:_doing_natur_languag_seman_in,
  author = {Chris Fox and Shalom Lappin},
  title = {Doing Natural Language Semantics in An Expressive First-Order Logic with Flexible Typing},
  booktitle = {Proceedings of the Eighth Conference on Formal Grammar 2003 (FGVienna)},
  editor = {G. Jaeger and P. Monachesi and G. Penn and S. Wintner},
  address = {Vienna, Austria},
  year = {2003},
  pages = {89--102},
  url = {http://chris.foxearth.org/papers/C-Fox-FG03-paper.pdf}
}
2003 A Fine-Grained Intensional First-Order Logic with Flexible Curry Typing Fox, C.   misc semantics, natural language URL  
Abstract: A highly intensional first-order logic will be presented which incorporates
an expressive type system, including general function spaces, separation
types and type polymorphism. Although first-order in power, the logic
is sufficiently expressive to capture aspects of natural language
semantics that are often characterised as requiring a higher-order
analysis. Aspects of the model theory for this logic will also be
discussed.
BibTeX:
@misc{fox03:_fine_grain_inten_first_order,
  author = {Chris Fox},
  title = {A Fine-Grained Intensional First-Order Logic with Flexible {C}urry Typing},
  year = {2003},
  howpublished = {The Fields Institute Workshop of Mathematical Linguistics, Ottawa},
  note = {Invited talk at The Fields Institute Workshop of Mathematical Linguistics, Ottawa. Joint work with Shalom Lappin},
  url = {http://chris.foxearth.org/slides/C-Fox-FWML-2003-slides.pdf}
}
2003 Property Theory with Curry Typing: An Intensional Logic for Natural Language Semantics Fox, C.   misc semantics, natural language URL  
Abstract: We present Property Theory with Curry Typing (PTCT), an intensional
first-order logic for natural language semantics. PTCT permits fine-grained
specifications of meaning. It also supports polymorphic, separation,
and dependent types. We develop an intensional number theory with
PTCT in order to represent proportional generalized quantifiers like
most. We use the type system and our treatment of generalized
quantifiers in natural language to construct a type-theoretic approach
to pronominal anaphora that avoids some of the difficulties that
undermine previous type-theoretic analyses of this phenomenon.
BibTeX:
@misc{fox03:_proper_theor_with_curry_typin,
  author = {Chris Fox},
  title = {Property Theory with {C}urry Typing: An Intensional Logic for Natural Language Semantics},
  year = {2003},
  howpublished = {Foundations of Computational Linguistics, Workshop at the IEEE Symposium on Logic in Computer Science (LICS), Ottawa, Ontario, Canada},
  note = {Invited talk at Foundations of Computational Linguistics, Workshop at the IEEE Symposium on Logic in Computer Science (LICS), Ottawa, Ontario, Canada. Joint work with Shalom Lappin},
  url = {http://chris.foxearth.org/slides/C-Fox-LICS-2003-slides.pdf}
}
2003 Type-Theoretic Approach to Anaphora and Ellipsis Fox, C. & Lappin, S. Proceedings of Recent Advances in Natural Language Processing (RANLP 2003)   inproceedings semantics, natural language URL  
Abstract: We present an approach to anaphora and ellipsis resolution in which
pronouns and elided structures are interpreted by the dynamic identification
in discourse of type constraints on their semantic representations.
The content of these conditions is recovered in context from an antecedent
expression. The constraints define separation types (sub-types) in
Property Theory with Curry Typing (PTCT), an expressive first-order
logic with Curry typing that we have proposed as a formal framework
for natural language semantics.
BibTeX:
@inproceedings{fox03:_type_theor_approac_to_anaph_and_ellip,
  author = {Chris Fox and Shalom Lappin},
  title = {Type-Theoretic Approach to Anaphora and Ellipsis},
  booktitle = {Proceedings of Recent Advances in Natural Language Processing (RANLP 2003)},
  address = {Borovets, Bulgaria},
  year = {2003},
  url = {http://chris.foxearth.org/papers/C-Fox-RANLP03-paper.pdf}
}
2003 An Axiomatisation of Imperatives using Hoare Logic Pérez-Ramírez, M. & Fox, C. Proceedings of the Fifth International Workshop on Computational Semantics (IWCS-5)   inproceedings semantics, natural language URL  
Abstract: This paper presents an axiomatisation of imperatives using Hoare logic.
It accounts for some inferential pragmatic aspects of imperatives.
Unlike the proposals of Jorgensen, Ross, and Chellas, rather than assigning
truth-values to imperatives, imperatives are evaluated as a relation
between the state demanded and the state or circumstances in which
the imperative is uttered.
BibTeX:
@inproceedings{perez-ramirez03:_axiom_of_imper_using_hoare_logic,
  author = {M.~P\'{e}rez-Ram\'{i}rez and C.~Fox},
  title = {An Axiomatisation of Imperatives using Hoare Logic},
  booktitle = {Proceedings of the Fifth International Workshop on Computational Semantics (IWCS-5)},
  editor = {Harry Bunt and Ielka van der Sluis and Roser Morante},
  address = {Tilburg, Netherlands},
  year = {2003},
  pages = {303--320},
  url = {http://chris.foxearth.org/papers/C-Fox-IWCS5-paper.pdf}
}
2003 Imperatives as Obligatory and Permitted Actions Pérez-Ramírez, M. & Fox, C. Proceedings of the Fourth International Conference on Intelligent Text Processing and Computational Linguistics (CICLing 2003)   inproceedings semantics, natural language URL  
Abstract: We present a dynamic deontic model for the interpretation of imperative
sentences in terms of Obligation (O) and Permission (P). Under the
view that imperatives prescribe actions and unlike the so-called
"standard solution" (Huntley, 1984), these operators act over actions
rather than over statements. Then, by distinguishing obligatory from
non-obligatory actions, we tackle the paradox of Free Choice Permission
(FCP).
BibTeX:
@inproceedings{perez-ramirez03:_imper_as_oblig_and_permit_action,
  author = {M.~P\'{e}rez-Ram\'{i}rez and C.~Fox},
  title = {Imperatives as Obligatory and Permitted Actions},
  booktitle = {Proceedings of the Fourth International Conference on Intelligent Text Processing and Computational Linguistics (CICLing 2003)},
  editor = {Alexander F. Gelbukh},
  publisher = {Springer},
  series = {Lecture Notes in Computer Science (LNCS)},
  address = {Mexico City, Mexico},
  year = {2003},
  volume = {2588},
  pages = {52--64},
  url = {http://chris.foxearth.org/papers/C-Fox-CICLing2003-paper.pdf},
  isbn = {3-540-00532-3}
}
2002 Business Process Modeling in INSPIRE Using Petri Nets Badica, C., Brezovan, M. & Fox, C. Transactions on Automatic Control and Computer Science   article processes URL  
Abstract: This paper introduces a notation for business process modeling and
shows how it can be formally interpreted in terms of Petri nets.
Petri nets have a well-established research community that is now
35 years old. However, they were only recently proposed for business
process modeling, probably because they are often
claimed to be ``too complex'' for this task. Nevertheless, they are
quite well understood and the theory behind them is well developed,
so we think they have a good potential for business process modeling,
but more work needs to be done. In this paper we show that Petri
nets can help in understanding formally the business process modeling
notation developed in the Inspire project. This understanding can
act as a basis for a future work on formal analysis of business process
models developed with the INSPIRE tool. The Inspire project (IST-10387-1999)
aims to develop an integrated tool-set to support a systematic and
more human-oriented approach to business process re-engineering.
BibTeX:
@article{badica02:_busin_proces_model_in_inspir,
  author = {C. Badica and M. Brezovan and C. Fox},
  title = {Business Process Modeling in INSPIRE Using Petri Nets},
  journal = {Transactions on Automatic Control and Computer Science},
  year = {2002},
  volume = {47},
  number = {2},
  pages = {41--46},
  note = {A version of this paper was also presented at the Fifth international Conference on Technical Informatics, 18th--19th October 2002, Timisoara, Romania},
  url = {http://chris.foxearth.org/papers/C-Fox-CONTI2002-paper.pdf}
}
2002 Design and Implementation of a Business Process Representation Module Badica, C. & Fox, C. Advances in Electrical and Computer Engineering   article processes URL  
Abstract: This paper reports on work done in the INSPIRE project on developing
the Process Representation Module (PRM). The major aim of INSPIRE
is the development of a tool for intelligent, human-orientated business
process re-engineering. Our task was to develop the PRM, a core module
of the INSPIRE tool. The main responsibility of the PRM is to provide
an all-encompassing and consistent representation of business processes.
The paper describes the architecture and data models of the system,
together with a discussion of business process modelling and the formalisms used
in INSPIRE, and the details of the design and implementation of the
PRM.
BibTeX:
@article{badica02:_desig_and_implem_of_busin,
  author = {Costin Badica and Chris Fox},
  title = {Design and Implementation of a Business Process Representation Module},
  journal = {Advances in Electrical and Computer Engineering},
  year = {2002},
  volume = {2 (2002)},
  number = {1},
  pages = {38--45},
  note = {A version of this paper was presented at \emph{the Sixth IEEE International Conference on Development and Application Systems} (DAS 2002) Suceava, Romania, May 2002},
  url = {http://chris.foxearth.org/papers/C-Fox-suceava-paper.pdf}
}
2002 Modelling and Verification of Business Processes Badica, C. & Fox, C. Applied Simulation and Modelling (ASM 2002)   inproceedings processes URL  
Abstract: This paper introduces a notation for business process modelling based
on flownominal expressions, and shows how it can be used for static
verification of business processes, under the assumption of single
instance executions, by evaluating them over boolean relations.


Its main advantage is simplicity, but it is also more restrictive
than other approaches because it can only indicate those input patterns
that can cause a process to enter an infinite loop or suffer resource starvation.
Nevertheless, this is useful because it can help isolate problems
at an early stage, prior to running any dynamic simulations.
BibTeX:
@inproceedings{badica02:_model_and_verif_of_busin_proces,
  author = {Costin Badica and Chris Fox},
  title = {Modelling and Verification of Business Processes},
  booktitle = {Applied Simulation and Modelling (ASM 2002)},
  address = {Crete, Greece},
  year = {2002},
  url = {http://chris.foxearth.org/papers/C-Fox-asm02-paper.pdf}
}
2002 First-Order Curry-typed Semantics for Natural Language Fox, C., Lappin, S. & Pollard, C. Proceedings of the Seventh International Workshop on Natural Language Understanding and Logic Programming (NLULP 2002)   inproceedings semantics, natural language URL  
Abstract: This paper presents Property Theory with Curry Typing (PTCT), in which the language
of terms and well-formed formulæ are joined by a language of types.
In addition to supporting fine-grained intensionality, the basic
theory is essentially first-order, so that implementations using
the theory can apply standard first-order theorem proving techniques.
The paper sketches a system of tableau rules that implement the theory.
Some extensions to the type theory are discussed, including type
polymorphism, which provides a useful analysis of conjunctive terms.
Such terms can be given a single polymorphic type that expresses
the fact that they can conjoin phrases of any one type, yielding
an expression of the same type.
BibTeX:
@inproceedings{chris02:_first_order_curry_typed_seman,
  author = {Chris Fox and Shalom Lappin and Carl Pollard},
  title = {First-Order {C}urry-typed Semantics for Natural Language},
  booktitle = {Proceedings of the Seventh International Workshop on Natural Language Understanding and Logic Programming (NLULP 2002)},
  editor = {S. Wintner},
  address = {Copenhagen, Denmark},
  year = {2002},
  pages = {175--192},
  note = {Also in \emph{Datalogiske Skrifter}, Volume 92, pages 87--102, 28th July 2002 (Federated Logic Conference 2002 Omnibus)},
  url = {http://chris.foxearth.org/papers/C-Fox-nlulp02-paper.pdf}
}
2002 A Higher-Order Fine-Grained Logic for Intensional Semantics Fox, C., Lappin, S. & Pollard, C. Proceedings of the Seventh International Symposium on Logic and Language (LoLa7)   inproceedings semantics, natural language URL  
Abstract: This paper describes a higher-order logic with fine-grained intensionality
(FIL). Unlike traditional Montagovian type theory, intensionality
is treated as basic, rather than derived through possible worlds.
This allows for fine-grained intensionality without impossible worlds.
Possible worlds and modalities are defined algebraically. The proof
theory for FIL is given as a set of tableau rules, and an algebraic
model theory is specified. The proof theory is shown to be sound
relative to this model theory. FIL avoids many of the problems created
by classical coarse-grained intensional logics that have been used
in formal and computational semantics.
BibTeX:
@inproceedings{chris02:_higher_order_fine_grain_logic,
  author = {Chris Fox and Shalom Lappin and Carl Pollard},
  title = {A Higher-Order Fine-Grained Logic for Intensional Semantics},
  booktitle = {Proceedings of the Seventh International Symposium on Logic and Language (LoLa7)},
  editor = {G. Alberti and K. Balough and P. Dekker},
  address = {P\'{e}cs, Hungary},
  year = {2002},
  pages = {37--46},
  url = {http://chris.foxearth.org/papers/C-Fox-lola02-fil-paper.pdf}
}
2002 Intensional First-Order Logic with Types Fox, C., Lappin, S. & Pollard, C. Proceedings of the Seventh International Symposium on Logic and Language (LoLa7)   inproceedings semantics, natural language URL  
Abstract: This paper presents Property Theory with Curry Typing (PTCT), in which the languages
of terms and well-formed formulæ are joined by a language of types.
In addition to supporting fine-grained intensionality, the basic
theory is essentially first-order, so that implementations using
the theory can apply standard first-order theorem proving techniques.
Some extensions to the type theory are discussed, including type polymorphism
and enriching the system with sufficient number theory to account
for quantifiers of number, such as ``most.''
BibTeX:
@inproceedings{chris02:_inten_first_order_logic_with_types,
  author = {Chris Fox and Shalom Lappin and Carl Pollard},
  title = {Intensional First-Order Logic with Types},
  booktitle = {Proceedings of the Seventh International Symposium on Logic and Language (LoLa7)},
  editor = {G. Alberti and K. Balough and P. Dekker},
  address = {P\'{e}cs, Hungary},
  year = {2002},
  pages = {47--56},
  url = {http://chris.foxearth.org/papers/C-Fox-lola02-ptct-paper.pdf}
}
2002 ConSUS: A Scalable Approach to Conditional Slicing Daoudi, D., Danicic, S., Howroyd, J., Harman, M., Fox, C. & Ward, M. Proceedings of the 9th IEEE Working Conference on Reverse Engineering (WCRE2002)   inproceedings software analysis URL  
Abstract: Conditioned slicing can be applied to reverse engineering problems
which involve the extraction of executable fragments of code in the
context of some criteria of interest. This paper introduces ConSUS,
a conditioner for the Wide Spectrum Language, WSL. The symbolic executor
of ConSUS prunes the symbolic execution paths, and its predicate
reasoning system uses the FermaT simplify transformation in place
of a more conventional theorem prover. We show that this combination
of pruning and simplification-as-reasoner leads to a more scalable
approach to conditioning.
BibTeX:
@inproceedings{daoudi02:_consus,
  author = {Daoudi, Dave/Mohammed and Sebastian Danicic and John Howroyd and Mark Harman and Chris Fox and Martin Ward},
  title = {ConSUS: A Scalable Approach to Conditional Slicing},
  booktitle = {Proceedings of the 9th IEEE Working Conference on Reverse Engineering (WCRE2002)},
  address = {Richmond, Virginia, USA},
  year = {2002},
  pages = {181--189},
  url = {http://chris.foxearth.org/papers/C-Fox-wcre02b-paper.pdf}
}
2002 Evolutionary Testing Supported by Slicing and Transformation Harman, M., Hu, L., Hierons, R., Fox, C., Danicic, S., Baresel, A., Sthamer, H. & Wegener, J. Proceedings of the 18th IEEE International Conference on Software Maintenance (ICSM02)   inproceedings software analysis URL  
Abstract: Evolutionary testing is a search based approach to the automated generation
of systematic test data, in which the search is guided by the test
data adequacy criterion.


Two problems for evolutionary testing are the large size of the search
space and structural impediments in the implementation of the program
which inhibit the formulation of a suitable fitness function to guide
the search.


In this paper we claim that slicing can be used to narrow the search
space and transformation can be applied to the problem of structural
impediments. The paper presents examples of how these two techniques
have been successfully employed to make evolutionary testing both
more efficient and more effective.
BibTeX:
@inproceedings{harman02:_evolut_testin_suppor_by_slicin_and_trans,
  author = {Mark Harman and Lin Hu and Rob Hierons and Chris Fox and Sebastian Danicic and Andre Baresel and Harmen Sthamer and Joachim Wegener},
  title = {Evolutionary Testing Supported by Slicing and Transformation},
  booktitle = {Proceedings of the 18th IEEE International Conference on Software Maintenance (ICSM02)},
  address = {Montreal, Canada},
  year = {2002},
  pages = {285},
  note = {Industrial Applications Track.},
  url = {http://chris.foxearth.org/papers/C-Fox-icsm02-paper.ps.gz}
}
2002 VADA: A Transformation-based System for Variable Dependence Analysis Harman, M., Fox, C., Hierons, R., Hu, L., Danicic, S. & Wegener, J. Proceedings of the Second IEEE International Workshop on Source Code Analysis and Manipulation (SCAM 2002)   inproceedings software analysis URL  
Abstract: Variable dependence is an analysis problem in which we seek to determine
the set of input variables which can affect the values stored in
a chosen set of intermediate program variables. Traditionally the
problem is studied as a dataflow analysis problem, and the answers
are computed in terms of solutions to data and control flow relations.


This paper shows the relationship between the variable dependence
analysis problem and slicing and describes a system, VADA, which
implements variable dependence analysis for C.


In order to cover the full range of C constructs and features, a transformation
to a core language is employed. Thus, the full analysis is only required
for the core language, which is relatively simple. This reduces the
overall effort required. The transformations used need only preserve
the variable dependence relation, and therefore need not be meaning
preserving in the traditional sense. We show how this relaxed meaning
further simplifies the transformation phase of the approach. Finally,
we present the results of an empirical study into the performance
of the system.
BibTeX:
@inproceedings{harman02:_vada,
  author = {Mark Harman and Chris Fox and Rob Hierons and Lin Hu and Sebastian Danicic and Joachim Wegener},
  title = {VADA: A Transformation-based System for Variable Dependence Analysis},
  booktitle = {Proceedings of the Second IEEE International Workshop on Source Code Analysis and Manipulation (SCAM 2002)},
  address = {Montreal, Canada},
  year = {2002},
  pages = {55--64},
  url = {http://chris.foxearth.org/papers/C-Fox-scam02-paper.ps.gz}
}
2002 Conditioned Slicing Supports Partition Testing Hierons, R., Harman, M., Fox, C., Ouarbya, L. & Daoudi, D. Journal of Software Testing, Verification and Reliability (STVR)   article software analysis URL  
Abstract: This paper describes the use of conditioned slicing to assist partition
testing, illustrating this with a case study. The paper shows how
a conditioned slicing tool can be used to provide confidence in the
uniformity hypothesis for correct programs, to aid fault detection
in incorrect programs and to highlight special cases.
BibTeX:
@article{hierons02:_condit_slicin_suppor_partit_testin,
  author = {Rob Hierons and Mark Harman and Chris Fox and Lahcen Ouarbya and Daoudi, Dave/Mohammed},
  title = {Conditioned Slicing Supports Partition Testing},
  journal = {Journal of Software Testing, Verification and Reliability (STVR)},
  year = {2002},
  volume = {12},
  number = {1},
  pages = {23--28},
  url = {http://chris.foxearth.org/papers/C-Fox-testing-paper.ps.gz}
}
2002 A Denotational Interprocedural Program Slicer Ouarbya, L., Danicic, S., Daoudi, D., Harman, M. & Fox, C. Proceedings of the 9th IEEE Working Conference on Reverse Engineering (WCRE2002)   inproceedings software analysis URL  
Abstract: This paper extends a previously developed intraprocedural denotational
program slicer to handle procedures. Using the denotational approach,
slices can be defined in terms of the abstract syntax of the object
language without the need of a control flow graph or similar intermediate
structure. The algorithm presented here is capable of correctly handling
the interplay between function and procedure calls, side-effects,
and short-circuit expression evaluation. The ability to deal with
these features is required in reverse engineering of legacy systems,
where code often contains side-effects.
BibTeX:
@inproceedings{ouarbya02:_denot_inter_progr_slicer,
  author = {Lahcen Ouarbya and Sebastian Danicic and Daoudi, David/Mohammed and Mark Harman and Chris Fox},
  title = {A Denotational Interprocedural Program Slicer},
  booktitle = {Proceedings of the 9th IEEE Working Conference on Reverse Engineering (WCRE2002)},
  publisher = {CSpress},
  address = {Richmond, Virginia, USA},
  year = {2002},
  pages = {109--118},
  url = {http://chris.foxearth.org/papers/C-Fox-wcre02a-paper.pdf}
}
2001 Backward Conditioning: a new program specialisation technique and its application to program comprehension Danicic, S., Fox, C., Harman, M. & Hierons, R. IEEE Proceedings of the 9th International Workshop on Program Comprehension (IWPC2001)   inproceedings software analysis URL  
Abstract: This paper introduces backward conditioning. Like forward conditioning
(used in conditioned slicing), backward conditioning consists of
specialising a program with respect to a condition inserted into
the program.


However, whereas forward conditioning deletes statements which are
not executed when the initial state satisfies the condition, backward
conditioning deletes statements which cannot cause execution to enter
a state which satisfies the condition. The relationship between backward
and forward conditioning is reminiscent of the relationship between
backward and forward slicing.


Forward conditioning addresses program comprehension questions of
the form `what happens if the program starts in a state satisfying
condition c?', whereas backward conditioning addresses questions
of the form `what parts of the program could potentially lead to
the program arriving in a state satisfying condition c?'.


The paper illustrates the use of backward conditioning as a program
comprehension assistant and presents an algorithm for constructing
backward conditioned programs.
BibTeX:
@inproceedings{danicic01:_backw_condit,
  author = {Sebastian Danicic and Chris Fox and Mark Harman and Rob Hierons},
  title = {Backward Conditioning: a new program specialisation technique and its application to program comprehension},
  booktitle = {IEEE Proceedings of the 9th International Workshop on Program Comprehension (IWPC2001)},
  address = {Toronto, Canada},
  year = {2001},
  pages = {89--97},
  url = {http://chris.foxearth.org/papers/C-Fox-IWPC2001-paper.ps.gz},
  isbn = {0-7695-1131-7}
}
2001 Book Review: Linux: The Complete Reference, Third Edition, by Richard Petersen, Osborne/McGraw-Hill Fox, C. Software Testing, Verification & Reliability (STVR)   article software analysis  
BibTeX:
@article{fox01:_book_review,
  author = {Chris Fox},
  title = {Book Review: Linux: The Complete Reference, Third Edition, by Richard Petersen, Osborne/McGraw-Hill},
  journal = {Software Testing, Verification \& Reliability (STVR)},
  year = {2001},
  volume = {11},
  number = {1},
  pages = {55--58}
}
2001 A Framework for the Hyperintensional Semantics of Natural Language with Two Implementations Fox, C. & Lappin, S. Proceedings of the Fourth International Conference on Logical Aspects of Computational Linguistics (LACL2001)   inproceedings semantics, natural language URL  
Abstract: In this paper we present a framework for constructing hyperintensional
semantics for natural language. On this approach, the axiom of extensionality
is discarded from the axiom base of a logic. Weaker conditions are
specified for the connection between equivalence and identity which
prevent the reduction of the former relation to the latter. In addition,
by axiomatising an intensional number theory we can provide an internal
account of proportional cardinality quantifiers, like ``most''.
We use a (pre-)lattice defined in terms of a (pre-)order that models
the entailment relation. Possible worlds/situations/indices are then
prime filters of propositions in the (pre-)lattice. Truth in a world/situation
is then reducible to membership of a prime filter. We show how this
approach can be implemented within (i) an intensional higher-order
type theory, and (ii) first-order property theory.
BibTeX:
@inproceedings{fox01:_framew_for_hyper_seman_of,
  author = {Chris Fox and Shalom Lappin},
  title = {A Framework for the Hyperintensional Semantics of Natural Language with Two Implementations},
  booktitle = {Proceedings of the Fourth International Conference on Logical Aspects of Computational Linguistics (LACL2001)},
  editor = {Groote, P.~de and G. Morrill and C. Retore},
  publisher = {Springer, Berlin and New York},
  series = {Lecture Notes in Computer Science (LNCS)},
  address = {Le Croisic, France},
  year = {2001},
  pages = {175--192},
  url = {http://chris.foxearth.org/papers/C-Fox-LACL2001-paper.ps.gz}
}
2001 Node Coarsening Calculi for Program Slicing Harman, M., Hierons, R., Danicic, S., Laurence, M., Howroyd, J. & Fox, C. Proceedings of the Eighth IEEE Working Conference on Reverse Engineering (WCRE2001)   inproceedings software analysis URL  
Abstract: Slicing has been shown to be a useful program abstraction technique,
with applications at many points in the software development life-cycle,
particularly as a tool to assist software evolution. Unfortunately,
slicing algorithms scale up rather poorly, diminishing the applicability
of slicing in practice.


In applications where many slices are required from a largely unchanging
system, incremental approaches to the construction of dependence
information can be used, ensuring that slices are constructed speedily.
However, for some applications, the only way to compute slices within
effective time constraints will be to trade precision for speed.
This approach has been successfully applied to a number of other
computationally expensive source code analysis techniques, most notably
points-to analysis.


This paper introduces a theory for trading precision for speed in
slicing based upon `blobbing together', or `coarsening', several
individual Control Flow Graph nodes. The theory defines the properties
which should be possessed by a logical calculus for `coarsening'
(coalescing several nodes in a region into a single representative
of the region). The theory is illustrated with a case study which
presents a calculus for R-coarsening, and a consistent and complete
set of inference rules which compromise precision for speed.
BibTeX:
@inproceedings{harman01:_node_coars_calcul_for_progr_slicin,
  author = {Mark Harman and Rob Hierons and Sebastian Danicic and Mike Laurence and John Howroyd and Chris Fox},
  title = {Node Coarsening Calculi for Program Slicing},
  booktitle = {Proceedings of the Eighth IEEE Working Conference on Reverse Engineering (WCRE2001)},
  address = {Stuttgart, Germany.},
  year = {2001},
  pages = {25--34},
  url = {http://chris.foxearth.org/papers/C-Fox-coarsening-paper.ps.gz}
}
2001 Pre/Post Conditioned Slicing Harman, M., Hierons, R., Fox, C., Danicic, S. & Howroyd, J. Proceedings of the 17th IEEE International Conference in Software Maintenance (ICSM2001)   inproceedings software analysis URL  
Abstract: This paper shows how analysis of programs in terms of pre- and post-conditions
can be improved using a generalisation of conditioned program slicing
called pre/post conditioned slicing. Such conditions play an important
role in program comprehension, reuse, verification and re-engineering.


Fully automated analysis is impossible because of the inherent undecidability
of pre- and post-conditions. The method presented here reformulates
the problem to circumvent this. The reformulation is constructed
so that programs which respect the pre- and post-conditions applied
to them have empty slices. For those which do not respect the conditions,
the slice contains statements which could potentially break the conditions.
This separates the automatable part of the analysis from the human
analysis.
BibTeX:
@inproceedings{harman01:_pre_post_condit_slicin,
  author = {Mark Harman and Rob Hierons and Chris Fox and Sebastian Danicic and John Howroyd},
  title = {Pre/Post Conditioned Slicing},
  booktitle = {Proceedings of the 17th IEEE International Conference in Software Maintenance (ICSM2001)},
  address = {Florence, Italy},
  year = {2001},
  pages = {138--147},
  url = {http://chris.foxearth.org/papers/C-Fox-prepost-paper.ps.gz}
}
2000 ConSIT: A Conditioned Program Slicer Danicic, S., Fox, C., Harman, M. & Hierons, R. IEEE Proceedings of the International Conference in Software Maintenance (ICSM2000)   inproceedings software analysis URL  
Abstract: Conditioned slicing is a powerful generalisation of static and dynamic
slicing which has applications to many problems in software maintenance
and evolution, including re-use, re-engineering and program comprehension.


However, there has been relatively little work on the implementation
of conditioned slicing. Algorithms for implementing conditioned slicing
necessarily involve reasoning about the values of program predicates
in certain sets of states derived from the conditioned slicing criterion,
making implementation particularly demanding.


This paper introduces ConSIT, a conditional slicing system which is
based upon conventional static slicing, symbolic execution and theorem
proving. ConSIT is the first fully automated implementation of conditioned
slicing.
BibTeX:
@inproceedings{danicic00:_consit,
  author = {Sebastian Danicic and Chris Fox and Mark Harman and Rob Hierons},
  title = {ConSIT: A Conditioned Program Slicer},
  booktitle = {IEEE Proceedings of the International Conference in Software Maintenance (ICSM2000)},
  address = {San Jose, California, USA},
  year = {2000},
  pages = {216--226},
  url = {http://chris.foxearth.org/papers/C-Fox-ICSM2000.ps.gz}
}
2000 Introduction to Computing Fox, C.   booklet distance learning  
Abstract: A Subject Guide for the University of London's undergraduate
programme for external students.
BibTeX:
@booklet{fox00:_introd_to_comput,
  author = {Chris Fox},
  title = {Introduction to Computing},
  year = {2000},
  howpublished = {A \emph{Subject Guide} for the University of London's undergraduate programme for external students},
  note = {A \emph{Subject Guide} for the University of London's undergraduate programme for external students}
}
2000 The Ontology of Language: properties, individuals and discourse Fox, C.   book ontology, semantics, natural language URL  
Abstract: This monograph is concerned with exploring various ontological assumptions,
and whether they can be eliminated.

It examines the basic notions of proposition and property, as adopted
by property theory, and then goes on to explore what other ontological
assumptions may be necessary for a semantic theory of natural language,
covering plurals, mass terms, intensional individuals and discourse
representation.
BibTeX:
@book{fox00:_ontol_of_languag,
  author = {Chris Fox},
  title = {The Ontology of Language: properties, individuals and discourse},
  publisher = {The Center for the Study of Language and Information (CSLI)},
  series = {Lecture Notes of The Center for the Study of Language and Information (CSLI)},
  year = {2000},
  url = {http://web.stanford.edu/group/cslipublications/cslipublications/site/1575862344.shtml}
}
1999 Program Simplification as a Means of Approximating Undecidable Propositions Harman, M., Fox, C., Hierons, R., Binkley, D. & Danicic, S. IEEE Proceedings of the Seventh International Workshop on Program Comprehension 1999 (IWPC-99)   inproceedings software analysis URL  
Abstract: In this paper, an approach is described which mixes testing, slicing,
transformation and program verification to investigate speculative
hypotheses concerning a program formulated during program comprehension
activity.


Our philosophy is that such hypotheses (which are typically undecidable)
can, in some sense, be `answered' by a partly automated system which
returns neither `true' nor `false', but a program (the `test program')
which computes the answer.


The motivation for this philosophy is the way in which, as we demonstrate,
static analysis and manipulation technology can be applied to ensure
that the resulting program is significantly simpler than the original
program, thereby simplifying the process of investigating the original
hypothesis.
BibTeX:
@inproceedings{harman99:_progr_simpl_as_means_of,
  author = {Mark Harman and Chris Fox and Rob Hierons and David Binkley and Sebastian Danicic},
  title = {Program Simplification as a Means of Approximating Undecidable Propositions},
  booktitle = {IEEE Proceedings of the Seventh International Workshop on Program Comprehension 1999 (IWPC-99)},
  address = {Pittsburgh, Pennsylvania, USA},
  year = {1999},
  pages = {208--217},
  url = {http://chris.foxearth.org/papers/C-Fox-IWPC99-paper.ps.gz},
  isbn = {0-7695-0179-6}
}
1998 Plurals and Mass Terms in Property Theory Fox, C. Plurality and Quantification   incollection ontology, semantics, natural language URL  
Abstract: This chapter is concerned with representing the semantics of natural
language plurals and mass terms in property theory; a weak first-order
theory of Truth, Propositions and Properties with fine-grained intensionality
(Turner 1990, Turner 1992, Aczel 1980).


The theory allows apparently coreferring items to corefer without
inconsistency. This is achieved by using property modifiers which
keep track of the property used to refer to a term, much like Landman's
roles (Landman 1989). We can thus predicate apparently contradictory
properties of ``the judge'' and ``the cleaner,'' for example, even
if they turn out to be the same individual.


The same device can also be used to control distribution into mereological
terms: when we say ``the dirty water is liquid,'' we can infer that
those parts that are dirty water are liquid without inferring that
the dirt is liquid.


The theory shows how we can formalise some aspects of natural language
semantics without being forced to make certain ontological commitments.
This is achieved in part by adopting an axiomatic methodology. Axioms
are proposed that are just strong enough to support intuitively acceptable
inferences, whilst being weak enough for some ontological choices
to be avoided (such as whether or not the extensions of mass terms
should be homogeneous or atomic). The axioms are deliberately incomplete,
just as in basic PT, where incomplete axioms are used to avoid the
logical paradoxes.


The axioms presented are deliberately too weak to say much about `non-denoting'
definite descriptors. For example, we cannot erroneously prove that
they are all equal. Neither can we prove that predication of such
definites results in a proposition. This means that we cannot question
the truth of sentences such as ``the present king of France is bald.''
BibTeX:
@incollection{fox98:_plural_and_mass_terms_in_proper_theor,
  author = {Chris Fox},
  title = {Plurals and Mass Terms in Property Theory},
  booktitle = {Plurality and Quantification},
  editor = {F. Hamm and E. Hinrichs},
  publisher = {Kluwer Academic Press},
  series = {Studies in Linguistics and Philosophy (SLAP)},
  address = {Dordrecht},
  year = {1998},
  pages = {113--175},
  url = {http://chris.foxearth.org/papers/C-Fox-plurals-and-mass-terms-1998.pdf},
  isbn = {0-7923-4841-9}
}
1997 Artificial Intelligence Fox, C.   booklet distance learning  
Abstract: A Subject Guide for the University of London's undergraduate
programme for external students.
BibTeX:
@booklet{fox97:_artif_intel,
  author = {Chris Fox},
  title = {Artificial Intelligence},
  year = {1997},
  howpublished = {A \emph{Subject Guide} for the University of London's undergraduate programme for external students},
  note = {A \emph{Subject Guide} for the University of London's undergraduate programme for external students}
}
1997 Plural Anaphora in a Property-theoretic Discourse Representation Theory Fox, C. The Second International Workshop on Computational Semantics   conference semantics, natural language URL  
Abstract: It is possible to use a combination of classical logic and dependent
types to represent natural language discourse and singular anaphora
(Fox 1994b). In this paper, these ideas are extended to account for
some cases of plural anaphora. In the theory described universal
quantification and conditionals give rise to a context in which singular
referents within its scope are transformed into plurals. These ideas
are implemented in axiomatic Property Theory (Turner 1992) extended
with plurals (Fox 1993), giving a treatment of some examples of singular
and plural anaphora in a highly intensional, weakly typed, classical,
first-order logic.
BibTeX:
@conference{fox97:_plural_anaph_in_proper_theor,
  author = {Chris Fox},
  title = {Plural Anaphora in a Property-theoretic Discourse Representation Theory},
  booktitle = {The Second International Workshop on Computational Semantics},
  address = {Tilburg},
  year = {1997},
  pages = {(10 pages)},
  url = {http://chris.foxearth.org/papers/C-Fox-IWCS2-paper.ps.gz}
}
1996 FraCaS (Public EU Project Reports) Cooper, R., Crouch, R., Dekker, P., van Eijck, J., Fox, C., van Genabith, J., Jaspars, J., Kamp, H., Milward, D., Pinkal, M., Poesio, M., Pulman, S., Vestre, E., Asher, N. & others   misc semantics, natural language URL  
Abstract: The FraCaS project, funded by the European Union through the LRE program,
aimed at bringing about a convergence of current efforts in computational
semantics, thus reducing duplicate work. In particular, the project
was concerned with (i) examining the needs of current work on computational
semantics, especially applications dealing with real data; (ii) comparing
current semantic approaches both with respect to their claims and
their usefulness for natural language processing applications of
the kind studied in (i); and (iii) making preliminary proposals concerning
a formal, unified framework for computational semantics that addresses
the needs discussed in (i) and facilitates the comparison in (ii).
The project began in January 1994 and ended in March 1996.
BibTeX:
@misc{fox:_public_eu_projec_repor,
  author = {R. Cooper and R. Crouch and P. Dekker and J. van Eijck and C. Fox and J. van Genabith and J. Jaspars and H. Kamp and D. Milward and M. Pinkal and M. Poesio and S. Pulman and E. Vestre and N. Asher and others},
  title = {{FraCaS} (Public {EU} Project Reports)},
  year = {1996},
  note = {Public project reports for the EU-funded project Framework for Computational Semantics (FraCaS) over the period 1994--96},
  url = {http://www.cogsci.ed.ac.uk/~fracas/}
}
1995 Online User's Guide for Squirrel Fox, C.   manual semantics, natural language URL  
Abstract: The Online User's Guide for Squirrel documents a system for
translating natural language queries into a relational query language
(SQL).


Please note that although all the documentation is still accessible,
the online demonstration of the Squirrel system is not currently
being maintained. Also, some of the contact details contained in
the online documentation for the project are out-of-date.


This documentation is also available electronically as a gzipped postscript
file from http://chris.foxearth.org/papers/C-Fox-squirrel-doc.ps.gz
BibTeX:
@manual{fox:_onlin_users_guide_for_squir,
  author = {Chris Fox},
  title = {Online User's Guide for Squirrel},
  year = {1995},
  note = {Hypertext Documentation for a Natural Language Front-end to Relational Databases},
  url = {http://cswww.essex.ac.uk/staff/foxcj/Squirrel/doc.html}
}
1994 Discourse Representation, Type Theory and Property Theory Fox, C. Proceedings of the International Workshop on Computational Semantics   inproceedings semantics, natural language URL  
Abstract: Since Aristotle, it has been accepted that the appropriate interpretation
of sentences is as propositions, and that general terms should be
interpreted as properties, distinct from propositions. Recent proposals
for natural language semantics have used constructive type theories
such as Martin-Löf's Type Theory MLTT (Martin-Löf 1982, 1984) which
treat anaphora and `donkey' sentences using dependent types (Sundholm
1989, Ranta 1991, Davila 1994). However, MLTT conflates the notions
of proposition and property. This work shows how, within Property
Theory, dependent-type expressions representing natural language
discourse can be mapped systematically into first-order expressions
with a classical notion of propositionhood, distinct from that of
properties.
BibTeX:
@inproceedings{fox94:_discour_repres_type_theor_and_proper_theor,
  author = {Chris Fox},
  title = {Discourse Representation, Type Theory and Property Theory},
  booktitle = {Proceedings of the International Workshop on Computational Semantics},
  editor = {H. Bunt and R. Muskens and G. Rentier},
  address = {Institute for Language Technology and Artificial Intelligence (ITK), Tilburg},
  year = {1994},
  pages = {71--80},
  url = {http://chris.foxearth.org/papers/C-Fox-ITK-paper.ps.gz}
}
1994 Episodes, Characterising Sentences and Causes Fox, C.   unpublished ontology, semantics, natural language URL  
Abstract: Episodic Logic (EL), as described in Chung Hee Hwang and Lenhart K.
Schubert's paper ``Episodic Logic: a Situational Logic for Natural
Language Processing'' (Hwang & Schubert1993), is a formal theory
of natural language semantics which has an extensive coverage of
phenomena. The theory has been applied effectively in various software
implementations of natural language systems.


This paper is not intended to undermine this theoretical and applied
work. It aims merely to illustrate some problems with the informal
intuitions that purport to explain and justify the formal theory
of EL. In particular, this paper criticises the view that we should
think of events as situations (episodes) which can be completely
characterised by natural language sentences. I argue that: (1) there
are no genuine natural language examples which require it; (2) it
results in a loss of expressiveness; and (3) it leads to problems
when giving the logical form of causal statements. I suggest that
the motivating example can be dealt with adequately using a
(neo-)Davidsonian approach.


That these arguments do not undermine the formal theory of EL and
its application in various systems can be seen from the fact (discussed
at the end of Section II) that the formal theory appears to make
no use of the problematic notions; they only appear in its informal
motivation. In effect, EL can be seen to provide a neo-Davidsonian
theory.


This paper is structured as follows: Section I introduces those aspects
of EL relevant for the discussion; Section II presents detailed criticisms;
Section III re-appraises the (neo-)Davidsonian approach to events,
and shows how it can cope with Hwang and Schubert's motivating example;
and Section IV makes some concluding remarks.
BibTeX:
@unpublished{fox94:_episod_charac_senten_and_causes,
  author = {Chris Fox},
  title = {Episodes, Characterising Sentences and Causes},
  year = {1994},
  note = {manuscript, Universit\"a{}t des Saarlandes, (10 pages)},
  url = {http://chris.foxearth.org/papers/C-Fox-EL-paper.ps.gz}
}
1994 Existence Presuppositions and Category Mistakes Fox, C. Acta Linguistica Hungarica   article semantics, natural language URL  
Abstract: This paper discusses a property-theoretic approach to the existence
presuppositions that occur with definite descriptors and some examples
of anaphoric reference. Property theory can avoid the logical paradoxes
(such as the Liar: ``this sentence is false.'') by taking them to
involve a category mistake, so they do not express felicitous propositions.
It is suggested that this approach might be extended to other cases
of infelicitous discourse, such as those due to a false presupposition
(as in: ``The present queen of France is bald.'') or due to a missing
antecedent (as in: ``Every man walks in. He whistles.''). These examples
may be represented by terms that embody category mistakes, so semantically
they do not express propositions.

Felicity of discourse then corresponds with the propositionhood of
the representation.
BibTeX:
@article{fox94:_exist_presup_and_categ_mistak,
  author = {Chris Fox},
  title = {Existence Presuppositions and Category Mistakes},
  journal = {Acta Linguistica Hungarica},
  year = {1994},
  volume = {42},
  number = {3/4},
  pages = {325--339},
  note = {Published 1996},
  url = {http://chris.foxearth.org/papers/C-Fox-LL5-paper.ps.gz}
}
1993 An Approach to Paraphrasing Logical Query Languages in English De Roeck, A. N., Fox, C., Lowden, B. G. T. & Walls, B. R. Journal of Database Technology   article semantics, natural language  
Abstract: This paper describes an extension to a system for parsing English
into a relational calculus (SQL), by way of Property-theoretic semantics,
that paraphrases the relational query back into English to ensure that
the query has been interpreted as intended.
BibTeX:
@article{fox92:_approac_to_parap_logic_query,
  author = {A.N. {De Roeck} and Chris Fox and B. G. T. Lowden and B. R. Walls},
  title = {An Approach to Paraphrasing Logical Query Languages in English},
  journal = {Journal of Database Technology},
  publisher = {Pergamon Press},
  year = {1993},
  volume = {4},
  number = {4},
  pages = {227--233},
  note = {Volume dated 1991--92}
}
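The back-translation strategy described in the abstract above can be sketched as follows. This is a hypothetical reconstruction for illustration only, not the paper's implementation: the query representation and the English templates are assumptions, standing in for the relational-calculus terms the system actually used.

```python
# Illustrative sketch (not the paper's code): paraphrasing a simple
# relational query back into English so the user can confirm that the
# system interpreted the question as intended. The dict structure is a
# hypothetical, much-simplified stand-in for a relational-calculus term.

def paraphrase(query):
    """Render a {select, table, where} query dict as an English sentence."""
    cols = " and ".join(query["select"])
    sentence = f"List the {cols} of every row in {query['table']}"
    conditions = [f"{col} {op} {val}" for col, op, val in query.get("where", [])]
    if conditions:
        sentence += " where " + " and ".join(conditions)
    return sentence + "."

query = {
    "select": ["name", "salary"],
    "table": "employees",
    "where": [("salary", "is greater than", "10000")],
}
```

Showing the user this paraphrase before executing the query gives a cheap check that the parse, rather than the database, is the source of any surprising answer.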
1993 Modal Reasoning in Relational Systems De Roeck, A. N., Fox, C., Lowden, B. G. T. & Walls, B. R. Journal of Database Technology   article semantics, natural language  
Abstract: This paper describes an extension to a system for parsing English
into a relational calculus (SQL), by way of Property-theoretic semantics,
that allows such a system to answer modal questions, such as ``Can
Sally earn 12,000?'' The system answers such queries by performing
a trial update to see whether any of the constraints on the database
would then be broken.
BibTeX:
@article{fox92:_modal_reason_in_relat_system,
  author = {A.N. {De Roeck} and Chris Fox and B. G. T. Lowden and B. R. Walls},
  title = {Modal Reasoning in Relational Systems},
  journal = {Journal of Database Technology},
  publisher = {Pergamon Press},
  year = {1993},
  volume = {4},
  number = {4},
  pages = {235--244},
  note = {Volume dated 1991--92}
}
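The trial-update idea in the abstract above can be sketched as follows; this is an illustrative reconstruction under assumed names and data, not the paper's system, which worked against a real relational database and its declared constraints.

```python
# Illustrative sketch (not the paper's code) of answering a modal
# question such as "Can Sally earn 12,000?" by a trial update:
# tentatively apply the change to a scratch copy, check every
# integrity constraint, and leave the real database untouched.
# The table, constraint, and names are assumptions.

import copy

def can_update(database, table, key, column, value, constraints):
    """Trial-update check: True iff setting the value breaks no constraint."""
    trial = copy.deepcopy(database)      # scratch copy: the "trial" world
    trial[table][key][column] = value    # tentative update
    return all(check(trial) for check in constraints)

db = {"employees": {"sally": {"salary": 9000}}}
# Hypothetical integrity constraint: no salary may exceed 11,000.
constraints = [
    lambda d: all(row["salary"] <= 11000 for row in d["employees"].values()),
]
```

Here `can_update(db, "employees", "sally", "salary", 12000, constraints)` answers the modal question "Can Sally earn 12,000?" without ever committing the update.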
1993 Individuals and Their Guises: a Property-theoretic Analysis Fox, C. Proceedings of the Ninth Amsterdam Colloquium   inproceedings ontology, semantics, natural language URL  
Abstract: This paper reappraises Landman's formal theory of intensional individuals---individuals
under roles, or guises (Landman 1989)---within property theory (PT)
(Turner 1992). As many of Landman's axioms exist to overcome the
strong typing of his representation, casting his ideas in weakly
typed PT produces a simpler theory. However, there is the possibility
of an even greater simplification: if roles, or guises, are represented
with property modifiers then there is no need for Landman's intensional
individuals. Landman's argument against the use of property modifiers
is re-examined, and shown to be mistaken.
BibTeX:
@inproceedings{fox93:_indiv_and_their_guises,
  author = {Chris Fox},
  title = {Individuals and Their Guises: a Property-theoretic Analysis},
  booktitle = {Proceedings of the Ninth Amsterdam Colloquium},
  editor = {P. Dekker and M. Stokhof},
  year = {1993},
  volume = {II},
  pages = {301--312},
  url = {http://chris.foxearth.org/papers/C-Fox-AC9-paper.ps.gz}
}
1993 Mass Terms and Plurals in Property Theory Fox, C. School: University of Essex   phdthesis ontology, semantics, natural language URL  
Abstract: The thesis Mass Terms and Plurals in Property Theory is concerned
with extending a weak axiomatic theory of Truth, Propositions and
Properties with fine-grained intensionality (PT) to represent the
semantics of natural language (NL) sentences involving plurals and
mass terms.


The use of PT as a semantic theory for NL eases the problem of modelling
the behaviours of these phenomena by removing the artificial burdens
imposed by strongly typed, model-theoretic semantic theories. By
deliberately using incomplete axioms, following the example set by
basic PT, it is possible to remain uncommitted about: the existence
of atomic mass terms; the existence of a `bottom element' (a part
of all terms) as a denotation of NL nominals; and the propositionhood
(and hence truth) of sentences such as ``the present King of France
is bald.''


Landman's theory concerning the representation of individuals under
roles, or guises, is reappraised in terms of property modifiers. This
is used to offer a solution to the problem of distinguishing predication
of equal fusions of distinct properties, and the control of distribution
into mereological terms when used to represent NL mass nominals,
without assuming a homogeneous extension.


The final theory provides a uniform framework for representing sentences
involving both mass and count nominals.
BibTeX:
@phdthesis{fox93:_mass_terms_and_plural_in_proper_theor,
  author = {Chris Fox},
  title = {Mass Terms and Plurals in Property Theory},
  school = {University of Essex},
  year = {1993},
  url = {http://chris.foxearth.org/papers/phd/}
}
1991 A Formal Approach to Translating English into SQL De Roeck, A. N., Fox, C., Lowden, B. G. T., Turner, R. & Walls, B. R. Aspects of Databases, Proceedings of the Ninth British National Conference on Databases (BNCOD 9)   inproceedings semantics, natural language  
Abstract: This paper describes a system for parsing English into Property-theoretic
semantics, using an attribute-value grammar and a bi-directional
chart parser, which then translates this representation into a relational
calculus (SQL) query that can be presented to an INGRES database
for evaluation. The query is optimised using techniques from resolution
theorem proving.
BibTeX:
@inproceedings{fox91:_formal_approac_to_trans_englis_into_sql,
  author = {A. N. {De Roeck} and Chris Fox and B. G. T. Lowden and R. Turner and B. R. Walls},
  title = {A Formal Approach to Translating English into SQL},
  booktitle = {Aspects of Databases, Proceedings of the Ninth British National Conference on Databases (BNCOD 9)},
  editor = {M. S. Jackson and A. E. Robinson},
  publisher = {Butterworth-Heinmann},
  year = {1991},
  pages = {110--125},
  isbn = {0-7506-1525-7}
}
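The final step of the pipeline described above can be sketched as follows. The intermediate representation here is a hypothetical simplification of the property-theoretic terms the paper uses, and the parsing stage (attribute-value grammar plus bi-directional chart parser) is omitted entirely.

```python
# Illustrative sketch (not the paper's system): mapping a toy semantic
# representation of a parsed question to an SQL query string. The
# {predicate, entity, restriction} form is a hypothetical stand-in
# for the system's property-theoretic representation.

def to_sql(sem):
    """Translate a {predicate, entity, restriction} term into SQL."""
    sql = f"SELECT {sem['entity']} FROM {sem['predicate']}"
    if sem.get("restriction"):
        col, op, val = sem["restriction"]
        sql += f" WHERE {col} {op} {val}"
    return sql + ";"

# A toy term for "Which employees earn more than 10,000?"
sem = {"predicate": "employees", "entity": "name",
       "restriction": ("salary", ">", "10000")}
```

The point of routing through an intermediate semantic term, rather than translating English to SQL directly, is that the same representation can feed both query generation and (as in the companion paper above) paraphrase back into English.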
1991 Helpful Answers to Modal and Hypothetical Questions Fox, C. (with R. A. J. Ball, E. K. Brown, A. N. De Roeck, M. Groefsema, N. Obeid & R. Turner) Proceedings of the European Association for Computational Linguistics (EACL)   inproceedings semantics, natural language  
Abstract: The paper describes a system in which a chart parser translates a
question in English into a propositional representation in a non-monotonic
logic. A ``context machine'' uses this representation to extract
salient statements from a knowledge base. A tableau theorem prover
then takes these statements and attempts to prove the proposition
associated with the original question. If the proof fails, the reason
for failure can be used to provide a relevant helpful answer.
BibTeX:
@inproceedings{fox91:_helpf_answer_to_modal_and_hypot_quest,
  author = {Chris Fox and {(with R. A. J. Ball and E. K. Brown and A. N. De Roeck and M. Groefsema and N. Obeid and R. Turner)}},
  title = {Helpful Answers to Modal and Hypothetical Questions},
  booktitle = {Proceedings of the European Association for Computational Linguistics (EACL)},
  year = {1991},
  pages = {257--262}
}
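The failure-driven answering strategy in the abstract above can be sketched as follows. This is an illustrative reconstruction only: the "prover" is a trivial stand-in for the tableau theorem prover, and the propositions, preconditions, and knowledge base are invented for the example.

```python
# Illustrative sketch (not the paper's system): turning a failed proof
# attempt into a helpful answer. A goal counts as "provable" here only
# if every one of its (hypothetical) preconditions is in the knowledge
# base; anything missing becomes the explanatory part of the answer.

def answer(goal, preconditions, knowledge_base):
    """Answer 'Yes.' if all preconditions hold; otherwise explain the failure."""
    missing = [p for p in preconditions.get(goal, []) if p not in knowledge_base]
    if not missing:
        return "Yes."
    # The reason the proof failed supplies the helpful part of the answer.
    return "No, because it is not known that: " + " and ".join(missing) + "."

preconditions = {"sally_can_drive": ["sally_has_licence", "sally_is_insured"]}
kb = {"sally_has_licence"}
```

A bare "no" from a theorem prover is rarely useful to a questioner; inspecting why the proof failed is what turns the system's answer into a cooperative one.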
1991 A Natural Language System Based on Formal Semantics Fox, C. (with A. N. De Roeck, B. G. T. Lowden, R. Turner & B. R. Walls) Proceedings of the International Conference on Current Issues in Computational Linguistics   inproceedings semantics, natural language  
Abstract: This paper describes a system for parsing English into Property-theoretic
semantics, using an attribute-value grammar and a bi-directional
chart parser, which then translates this representation into a relational
calculus (SQL) query which can be presented to a database (Ingres)
for evaluation.


The Property Theory used is a highly intensional first-order theory,
which avoids some of the problems of higher-order intensional logics.
BibTeX:
@inproceedings{fox91:_natur_languag_system_based_formal_seman,
  author = {Chris Fox and {(with A. N. {De Roeck} and B. G. T. Lowden and R. Turner and B. R. Walls)}},
  title = {A Natural Language System Based on Formal Semantics},
  booktitle = {Proceedings of the International Conference on Current Issues in Computational Linguistics},
  address = {Penang, Malaysia},
  year = {1991},
  pages = {221--234}
}


Dr Chris Fox · University of Essex · Wivenhoe Park · Colchester · CO4 3SQ · United Kingdom
· ☏ +44 (0)1206 87 2576 · ⚷ OpenPGP key
Created by JabRef on 2016-09-15