My Academic Publications
Last updated on 29 November 2023
This is a list of my academic publications, where possible with links to the published and archived versions. My publication list can also be found on my Google Scholar profile. Most of my publications in philosophy are available on my PhilPapers profile and the publications since 2015 are also available via the KU Leuven repository Lirias.
A list of my fiction and poetry can be found here.
2023
-
S. Wenmackers, “Uniform probability in cosmology,” Studies in History and Philosophy of Science, vol. 101, pp. 48–60, 2023.
[Abstract]
Problems with uniform probabilities on an infinite support show up in contemporary cosmology. This paper focuses on the context of inflation theory, where it complicates the assignment of a probability measure over pocket universes. The measure problem in cosmology, whereby it seems impossible to pick out a uniquely well-motivated measure, is associated with a paradox that occurs in standard probability theory and crucially involves uniformity on an infinite sample space. This problem has been discussed by physicists, albeit without reference to earlier work on this topic. The aim of this article is both to introduce philosophers of probability to these recent discussions in cosmology and to familiarize physicists and philosophers working on cosmology with relevant foundational work by Kolmogorov, de Finetti, Jaynes, and other probabilists. As such, the main goal is not to solve the measure problem, but to clarify the exact origin of some of the current obstacles. The analysis of the assumptions going into the paradox indicates that there exist multiple ways of dealing consistently with uniform probabilities on infinite sample spaces. Taking a pluralist stance towards the mathematical methods used in cosmology shows there is some room for progress with assigning probabilities in cosmological theories.
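One way to see why uniformity on an infinite sample space is so delicate (a small illustration of my own, not taken from the paper): the limiting relative frequency of a set of naturals depends on the order in which the naturals are enumerated, so "uniform" is not uniquely defined without further choices.

```python
from fractions import Fraction

def density(sequence, predicate, n):
    """Relative frequency of elements satisfying `predicate` among the first n terms."""
    count = sum(1 for k in sequence[:n] if predicate(k))
    return Fraction(count, n)

# Natural ordering of the naturals: 0, 1, 2, 3, ...
natural = list(range(30000))

# A reordering that interleaves two odd numbers per even number:
# 1, 3, 0, 5, 7, 2, 9, 11, 4, ...
reordered = []
odd, even = 1, 0
while len(reordered) < 30000:
    reordered += [odd, odd + 2, even]
    odd += 4
    even += 2

is_even = lambda k: k % 2 == 0

print(density(natural, is_even, 30000))    # 1/2
print(density(reordered, is_even, 30000))  # 1/3
```

The same set of numbers receives "uniform" weight 1/2 or 1/3 depending on the enumeration, which is one root of the measure problem discussed above.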
-
S. Friederich and S. Wenmackers, “The future of intelligence in the universe: A call for humility,” International Journal of Astrobiology, vol. 22, pp. 414–427, 2023.
[Abstract]
Recent astrophysical findings suggest that the era during which the universe is habitable has just begun. This raises the question whether the entire universe may at some point in the future be filled with intelligent life. Hanson et al. (2021) argued that we can be confident that the universe will, by cosmic standards, soon be dominated by imperialist civilizations which expand rapidly, persist long, and make drastic changes to the volumes they control. The main motivation for this “grabby civilizations” hypothesis is that it supposedly provides a good explanation of why we are so early in cosmic history. In this paper, we criticise this motivation and suggest that it fails, for reasons analogous to why the notorious Doomsday argument fails. In the last part of the paper we broaden our discussion and argue that it may be rational to assign a rather low prior probability to the grabby civilizations hypothesis. For instance, if there are any civilizations that expand rapidly and indefinitely, they may well not make any drastic changes to the volumes they inhabit, potentially for strategic reasons. Hence, we call for epistemic caution and humility regarding the question of the long-term evolution of intelligence in the universe.
-
I. Douven, F. Hindriks, and S. Wenmackers, “Moral bookkeeping,” Ergo, vol. 10, article number 15, 2023.
[Abstract]
There is widespread agreement among philosophers about the Mens Rea Asymmetry (MRA), according to which praise requires intent, whereas blame does not. However, there is evidence showing that MRA is descriptively inadequate. We hypothesize that the violations of MRA found in the experimental literature are due to what we call “moral compositionality,” by which we mean that people evaluate the component parts of a moral problem separately and then reach an overall verdict by aggregating the verdicts on the component parts. We have subjected this hypothesis to the test and here report the results of our experiment. We explore several explanations of the experimental findings and conclude that they present a puzzle to moral theory.
-
S. Wenmackers, “Of demons, game shows, and baguettes: Tracing the nineteenth century origins of Seldon’s psychohistory,” in Asimov’s Foundation and Philosophy: Psychohistory and its Discontents, J. Heter and J. T. Simpson, Eds., Hidden Universe, pp. 39–48, 2023.
[Abstract]
Isaac Asimov (1920–1992) wrote Foundation, a science-fiction series about a galactic empire. The books are now being televised by Apple TV+: the first season premiered in 2021 and a second season is planned. The plot of Foundation crucially revolves around a fictional science that is supposed to predict the future course of large populations.
Would psychohistory have been developed if Hari Seldon hadn’t existed? Pondering psychohistory easily prompts the old question: how much of history is contingent and how much of it is inevitable? This chapter applies the question to the mathematics underlying the Foundation itself and offers reasons to think the answer may be affirmative. The analysis also helps us to understand why Seldon postulated that the predictions of psychohistory should be kept a secret and why it only works at the level of large groups of people, such as the Galactic Empire.
The purview of mathematics and statistics has profoundly changed throughout history. Ancient thinkers, such as Plato and Aristotle, believed that mathematics could only describe and predict heavenly motions. This belief continued throughout medieval times. It was only in the seventeenth century that natural philosophers such as Galileo and Newton formulated mathematical laws that apply to objects on Earth as well as elsewhere in the universe.
Newtonian laws are deterministic, which means that an exact specification of the world at one time in principle allows us to compute the situation at all other times, past and future. Laplace explained this deterministic worldview using a thought experiment, now known as Laplace’s demon. Newcomb’s paradox explores one perplexing consequence of this idea. It involves a television show that employs a very accurate predictor of human behaviour, much like Laplace’s demon.
Yet, determinism does not automatically lead to predictability in practice. Poincaré was among the first to study chaotic systems, in which small changes in initial conditions may blow up to gigantic differences in the long run. Meanwhile, it was discovered that indeterminism may give rise to very stable and predictable patterns. Allegedly, Poincaré was able to prove his baker was committing fraud with the weight of his baguettes by looking at their weight distribution.
The nineteenth century also gave rise to statistical mechanics, which studies collections of particles of which the positions and velocities aren’t known exactly, but their probability distributions are. As a result, the particles do exhibit lawlike behaviour at the collective level. Another watershed occurred when Quetelet realized that not just particles and baguettes, but also people can be characterized by statistical distributions.
Both in fiction and in reality, the notion of statistical determinism makes us wonder how much an individual can really change the course of history. It invites further reflection not merely on Salvor Hardin or the Mule, but also on how free Seldon could have been in coming up with psychohistory and in designing his Plan in the first place.
Link to the book on the publisher’s website.
2022
-
S. Wenmackers, “Book review: Vieri Benci and Mauro Di Nasso. How to Measure the Infinite: Mathematics with Infinite and Infinitesimal Numbers,” Philosophia Mathematica, vol. 30, pp. 130–137, 2022.
[Abstract]
Review of the book by Vieri Benci and Mauro Di Nasso (2019, World Scientific).
-
I contributed to the following team-publication of the Coronavirus Pandemic Preparedness Team (shared authorship):
T. Nguyen, M. Ronse, A. Kiekens, P. Thyssen, J. R. Nova Blanco, N. Van den Cruyce, M. Craps, Coronavirus Pandemic Preparedness Team, and A.-M. Vandamme, “Learning for the future: A case study of transdisciplinary collaboration to improve pandemic preparedness,” Transdisciplinary Insights, vol. 5, pp. 41–54, 2022.
[Abstract]
Since the World Health Organization (WHO) announced the COVID-19 pandemic, attention has turned to the impact of societal initiatives and what can be learned from them for the future beyond COVID-19. Little attention has been paid, however, to how ‘learning for the future’, as an organizational process, is concretely accomplished. This paper offers a collaborative autoethnography of our team’s project to ‘learn for the future’ through transdisciplinary collaboration during the first year of the COVID-19 pandemic, where our broader goal was to help improve future pandemic preparedness for Belgium and beyond. We engage practice theory, with its processual, relational ontology, to understand the empirical phenomenon of ‘learning for the future’ as a practice or set of relational activities and artifacts that constituted our experience and collective sense that we were ‘learning for the future’ in a transdisciplinary way. Our interpretive analysis uncovered three relational activities: inclusively broad sharing, participatory concretizing, and collective suspending of sense. The analysis further revealed that at the same time, these activities were the means through which the tension our team repeatedly experienced between the present and future (i.e. making an impact on the present pandemic versus taking a step back from the present to ‘learn for the future’) was being reproduced. This explains why our team’s repeated attempts to clarify priorities and reestablish the focus on the future did not simply resolve the tension. From a processual, relational perspective, ‘learning for the future’ emerged through ongoing efforts that relate to making a difference in the present. We discuss what our theoretical perspective and findings may mean for organizing for a more resilient society and future directions for research.
2021
-
D. E. P. Vanpoucke and S. Wenmackers, “Assigning probabilities to non-Lipschitz mechanical systems,” Chaos, vol. 31, pp. 1–14, 2021.
[Abstract]
We present a method for assigning probabilities to the solutions of initial value problems that have a Lipschitz singularity. To illustrate the method, we focus on the following toy example: d²r(t)/dt² = r^a, with r(t=0) = 0 and dr/dt|_(t=0) = 0, where a ∈ ]0,1[. This example has a physical interpretation as a mass in a uniform gravitational field on a frictionless, rigid dome of a particular shape; the case with a=1/2 is known as Norton’s dome. Our approach is based on (1) finite difference equations, which are deterministic; (2) elementary techniques from alpha-theory, a simplified framework for non-standard analysis that allows us to study infinitesimal perturbations; and (3) a uniform prior on the canonical phase space. Our deterministic, hyperfinite grid model allows us to assign probabilities to the solutions of the initial value problem in the original, indeterministic model.
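The indeterminism at issue can be felt in a toy discretization (a forward-Euler sketch of my own, not the hyperfinite grid model of the paper): for a = 1/2, the exact initial condition keeps the mass at the apex forever, while an arbitrarily small perturbation of the initial position yields a solution that slides off.

```python
def simulate(r0, v0, a=0.5, dt=1e-3, steps=5000):
    """Forward-Euler integration of d^2 r/dt^2 = r**a (toy discretization)."""
    r, v = r0, v0
    for _ in range(steps):
        v += (r ** a) * dt  # acceleration r**a
        r += v * dt
    return r

# Unperturbed initial condition: the mass stays at the apex forever.
at_rest = simulate(0.0, 0.0)

# A tiny perturbation of the initial position suffices for the mass to leave:
perturbed = simulate(1e-12, 0.0)

print(at_rest)          # 0.0
print(perturbed > 0.0)  # True
```

The continuum model admits both behaviours from the very same initial data; the discretized model is deterministic and distinguishes them via the (infinitesimal) perturbation.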
-
L. Vignero and S. Wenmackers, “Degrees of riskiness, falsifiability, and truthlikeness. A neo-Popperian account applicable to probabilistic theories,” Synthese, vol. 199, pp. 11729–11764, 2021.
[Abstract]
In this paper, we take a fresh look at three Popperian concepts: riskiness, falsifiability, and truthlikeness (or verisimilitude) of scientific hypotheses or theories. First, we make explicit the dimensions that underlie the notion of riskiness. Secondly, we examine if and how degrees of falsifiability can be defined, and how they are related to various dimensions of the concept of riskiness as well as the experimental context. Thirdly, we consider the relation of riskiness to (expected degrees of) truthlikeness. Throughout, we pay special attention to probabilistic theories and we offer a tentative, quantitative account of verisimilitude for probabilistic theories.
-
P. Thyssen and S. Wenmackers, “Degrees of freedom,” Synthese, vol. 198, pp. 10207–10235, 2021.
[Abstract]
Human freedom is in tension with nomological determinism and with statistical determinism. The goal of this paper is to answer both challenges. Four contributions are made to the free-will debate. First, we propose a classification of scientific theories based on how much freedom they allow. We take into account that indeterminism comes in different degrees and that both the laws and the auxiliary conditions can place constraints. A scientific worldview pulls towards one end of this classification, while libertarianism pulls towards the other end of the spectrum. Second, inspired by Hoefer, we argue that an interval of auxiliary conditions corresponds to a region in phase space, and to a bundle of possible block universes. We thus make room for a form of non-nomological indeterminism. Third, we combine crucial elements from the works of Hoefer and List; we attempt to give a libertarian reading of this combination. On our proposal, throughout spacetime, there is a certain amount of freedom (equivalent to setting the initial, intermediate, or final conditions) that can be interpreted as the result of agential choices. Fourth, we focus on the principle of alternative possibilities throughout and propose three ways of strengthening it.
2019
-
S. Wenmackers, “Infinitesimal probabilities,” in Open Handbook of Formal Epistemology, J. Weisberg and R. Pettigrew, Eds., PhilPapers Foundation, 2019, pp. 199–265.
[Abstract]
Even taken separately, both infinitesimals and probabilities constitute major topics in philosophy and related fields. Infinitesimals are numbers that are infinitely small or extremely minute. The history of non-zero infinitesimals is a troubled one: although they played a crucial role in the development of the calculus by Leibniz and – to a lesser extent – by Newton, they were long believed to be based on an inconsistent concept. For probabilities, the interplay between objective and subjective aspects of the concept has led to many puzzles and paradoxes. Viewed in this way, considering infinitesimal probabilities combines two possible sources of complications. This chapter aims to elucidate the concept of infinitesimal probabilities, covering philosophical discussions and mathematical developments (in as far as they are relevant for the former). The introduction briefly describes what it means for a number to be infinitesimal or infinitely small and it addresses some key notions in the foundations of probability theory. The remainder of the chapter is devoted to interactions between these two notions. It is divided into three parts, dealing with the history, the mathematical framework, and the philosophical discussion on this topic, followed by a brief epilogue on methodological pluralism. The appendices introduce some technical concepts from non-standard analysis and relevant definitions of filters.
-
S. Wenmackers, “Lost in Space and Time: A Quest for Conceptual Spaces in Physics,” in Concepts and Their Applications, M. Kaipainen, A. Hautamäki, P. Gärdenfors, and F. Zenker, Eds., Springer, 2019, vol. 405, pp. 127–149.
[Abstract]
In this chapter, I investigate whether dimensions in physics are analogous to quality dimensions (in the sense of Gärdenfors, 2000; 2004) and whether phase spaces are to be considered as conceptual spaces (as proposed by Masterton et al., 2017). To this end, I focus on the domain of force in classical physics and on the dimension of time from classical to relativistic physics. Meanwhile, I comment on the development of abstract spaces with non-spatial dimensions, such as conceptual spaces, which is itself part of a long history of conceptual development.
-
S. Wenmackers, “The Snow White problem,” Synthese, vol. 196, pp. 4137–4153, 2019.
[Abstract]
The Snow White problem is introduced to demonstrate how learning something of which one could not have learnt the opposite (due to observer selection bias) can change an agent’s probability assignment. This helps us to analyse the Sleeping Beauty problem, which is deconstructed as a combinatorial engine and a subjective wrapper. The combinatorial engine of the problem is analogous to Bertrand’s boxes paradox and can be solved with standard probability theory. The subjective wrapper is clarified using the Snow White problem. Sample spaces for all three problems are presented. The conclusion is that subjectivity plays no irreducible role in solving the Sleeping Beauty problem and that no reference to centered worlds is required to provide the answer.
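The combinatorial engine mentioned above, Bertrand’s boxes paradox, can indeed be handled by standard probability theory; a minimal sketch (my own, by explicit enumeration of an equiprobable sample space) gives the classic answer:

```python
from fractions import Fraction
from itertools import product

# Bertrand's boxes: each box holds two coins; a box and a coin are picked uniformly.
boxes = [("gold", "gold"), ("gold", "silver"), ("silver", "silver")]

# Equiprobable sample points: (box, index of the coin drawn from it).
outcomes = list(product(boxes, (0, 1)))

drew_gold = [(box, i) for box, i in outcomes if box[i] == "gold"]
other_gold = [(box, i) for box, i in drew_gold if box[1 - i] == "gold"]

p = Fraction(len(other_gold), len(drew_gold))
print(p)  # 2/3: probability that the other coin in the box is also gold
```

Conditioning on the drawn coin being gold is an ordinary restriction of the sample space; no subjective or centered-worlds machinery is needed at this stage.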
-
S. Wenmackers, “Demystifying the Mystery Room,” Thought, vol. 8, pp. 86–95, 2019.
[Abstract]
The Mystery Room problem is a close variant of the Mystery Bag scenario (due to Titelbaum). It is argued here that dealing with this problem requires no revision of the Bayesian formalism, since there exists a solution for it in which indexicals or demonstratives play no essential role. The solution does require labels, which are internal to the probabilistic model. While there needs to be a connection between at least one label and one indexical or demonstrative, that connection is external to the probabilistic model that is used to determine the relevant conditional probability; hence, it does not complicate the update procedure.
-
R. Meester, B. Preneel, and S. Wenmackers, “Reply to Lucas & Henneberg: Are human faces unique?,” Forensic Science International, vol. 297, pp. 217–220, 2019.
[Abstract]
This paper offers a response to the 2015 article of Lucas and Henneberg “Are human faces unique?” (Forensic Sci. Int. 257, 514.e1–514.e6), as well as to the subsequent popularization thereof. In the first part, we assess the probabilistic claims made by the authors and find crucial parts to be unsupported. In particular, the authors offer no probabilistic model on which to base their conclusions. Even if we disregard the errors identified in the first part, we find a troubling discrepancy between the published findings and the popular summary.
2018
-
V. Benci, L. Horsten, and S. Wenmackers, “Infinitesimal probabilities,” British Journal for the Philosophy of Science, vol. 69, pp. 509–552, 2018.
[Abstract]
Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general.
-
S. Wenmackers, “Do Infinitesimal Probabilities Neutralize the Infinite Utility in Pascal’s Wager?,” in Classic Arguments in the History of Philosophy: Pascal’s Wager, P. Bartha and L. Pasternack, Eds., Cambridge: Cambridge University Press, 2018.
[Abstract]
In the “Infinity – nothing” passage of Pensées (1966 [1670], L418/S680), Pascal considers wagering for or against the existence of God, taking into account both probability (“chance”) and utility (“happiness,” “gain”). Jordan (2006) reconstructs the passage as presenting four related arguments: the first is based on weak dominance, the second is based on maximizing expected utility assuming the same probability for or against the existence of God, the third allows these probabilities to be different, and the fourth is based on strong dominance. Jordan calls the third part, which is the most discussed in contemporary philosophy, the “Canonical Wager.” It is only in the context of this Canonical Wager (henceforth simply referred to as “the Wager”) that bringing up infinitesimal probabilities is relevant. Pascal explicitly excluded this possibility, assuming that there is “one chance of winning against a finite number of chances of losing,” so that “there are not infinite chances of losing against that of winning” (1966 [1670], L418/S680). In this chapter, we investigate whether Pascal was right in excluding infinitesimal probabilities. In other words: can infinitesimal probabilities be used to neutralize the infinite utility in the Wager, blocking the conclusion stating that it’s prudent to wager for the existence of God? We will study this using a formal framework that enables us to represent both infinitesimal probabilities and infinite utilities and to combine them algebraically.
-
S. Wenmackers, “Herkansing voor infinitesimalen?,” Algemeen Nederlands Tijdschrift voor Wijsbegeerte, vol. 110, pp. 491–510, 2018.
[Abstract]
This article discusses the connection between the Zenonian paradox of magnitude and probability on infinite sample spaces. Two important premises in the Zenonian argument are: the Archimedean axiom, which excludes infinitesimal magnitudes, and perfect additivity. Standard probability theory uses real numbers that satisfy the Archimedean axiom, but it rejects perfect additivity. The additivity requirement for real-valued probabilities is limited to countably infinite collections of mutually incompatible events. A consequence of this is that there exists no standard probability function that describes a fair lottery on the natural numbers. If we reject the Archimedean axiom, allowing infinitesimal probability values, we can retain perfect additivity and describe a fair, countably infinite lottery. The article gives a historical overview to understand how the first option has become the current standard, whereas the latter remains ‘non-standard’.
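The dilemma for a real-valued fair lottery on the naturals can be made concrete with a two-line check (an illustration of my own): any constant ticket probability p > 0 already exceeds total mass 1 on finitely many tickets, while p = 0 never accumulates any mass at all.

```python
from fractions import Fraction

def total_mass(p, n):
    """Total probability mass of the first n tickets, each with probability p."""
    return n * p

# Constant real probability p > 0: finitely many tickets overshoot mass 1.
p = Fraction(1, 10**6)
print(total_mass(p, 10**6 + 1) > 1)  # True

# Constant probability 0: no collection of tickets ever reaches mass 1.
print(total_mass(Fraction(0), 10**9))  # 0
```

Infinitesimal ticket probabilities are precisely the non-Archimedean escape between these two horns.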
-
S. Wenmackers, “On infinitesimals and the world below,” in Laat ons niet ernstig blijven: Huldeboek voor Jean Paul Van Bendegem, V. B. Kerkhove, K. François, S. Ducheyne, and P. Allo, Eds., Ghent, Belgium: Academia Press, 2018, pp. 321–326.
[Abstract]
This short paper comments on Van Bendegem’s (1987) proposal for a discrete and finitary structure underlying Euclidean geometry. The first two sections contrast his guiding idea with a Leibnizean heuristic and liken it to renormalization group methods in physics. The last two sections show how infinitesimals, from non-standard models of the reals, may help to flesh out the proposal and offer a viewpoint that may make this approach more acceptable to a finitist.
-
S. Wenmackers, “Book review: Ten great ideas about chance,” Tijdschrift voor Filosofie, vol. 80, pp. 398–400, 2018.
[Abstract]
Review in Dutch of the book by Persi Diaconis and Brian Skyrms (2017, Princeton UP).
2017
-
I. Douven and S. Wenmackers, “Inference to the best explanation versus Bayes’s rule in a social setting,” British Journal for the Philosophy of Science, vol. 68, pp. 535–570, 2017.
[Abstract]
This article compares inference to the best explanation with Bayes’s rule in a social setting, specifically, in the context of a variant of the Hegselmann–Krause model in which agents not only update their belief states on the basis of evidence they receive directly from the world, but also take into account the belief states of their fellow agents. So far, the update rules mentioned have been studied only in an individualistic setting, and it is known that in such a setting both have their strengths as well as their weaknesses. It is shown here that in a social setting, inference to the best explanation outperforms Bayes’s rule according to every desirable criterion.
-
I. Douven, S. Wenmackers, Y. Jraissati, and L. Decock, “Measuring graded membership: the case of color,” Cognitive Science, vol. 41, pp. 686–722, 2017.
[Abstract]
This paper considers Kamp and Partee’s account of graded membership within a conceptual spaces framework and puts the account to the test in the domain of colors. Three experiments are reported that are meant to determine, on the one hand, the regions in color space where the typical instances of blue and green are located and, on the other hand, the degrees of blueness/greenness of various shades in the blue–green region as judged by human observers. From the locations of the typical blue and typical green regions in conjunction with Kamp and Partee’s account follow degrees of blueness/greenness for the color shades we are interested in. These predicted degrees are compared with the judged degrees, as obtained in the experiments. The results of the comparison support the account of graded membership at issue.
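A simple relative-distance rule conveys the flavour of deriving graded membership from prototype locations in a conceptual space (the coordinates and the formula below are illustrative simplifications of mine, not the Kamp and Partee construction itself):

```python
from math import dist

# Hypothetical prototype locations in a 3-d color space:
proto_blue = (0.0, 0.0, 0.0)
proto_green = (1.0, 0.0, 0.0)

def blueness(shade):
    """Graded membership in 'blue' from relative distance to the two prototypes."""
    d_blue = dist(shade, proto_blue)
    d_green = dist(shade, proto_green)
    return d_green / (d_blue + d_green)

# A shade a quarter of the way from blue to green:
print(blueness((0.25, 0.0, 0.0)))  # 0.75
```

The experiments reported in the paper compare degrees predicted from such spatial representations against degrees judged by human observers.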
2016
-
S. Wenmackers, “Children of the Cosmos,” in Trick or Truth?, A. Aguirre, B. Foster, and Z. Merali, Eds., Springer, 2016, pp. 5–20.
[Abstract]
Mathematics may seem unreasonably effective in the natural sciences, in particular in physics. In this essay, I argue that this judgment can be attributed, at least in part, to selection effects. In support of this central claim, I offer four elements. The first element is that we are creatures that evolved within this Universe, and that our pattern finding abilities are selected by this very environment. The second element is that our mathematics – although not fully constrained by the natural world – is strongly inspired by our perception of it. Related to this, the third element finds fault with the usual assessment of the efficiency of mathematics: our focus on the rare successes leaves us blind to the ubiquitous failures (selection bias). The fourth element is that the act of applying mathematics provides many more degrees of freedom than those internal to mathematics. This final element will be illustrated by the usage of ‘infinitesimals’ in the context of mathematics and that of physics. In 1960, Wigner wrote an article on this topic, and many (but not all) later authors have echoed his assessment that the success of mathematics in physics is a mystery.
-
S. Wenmackers and J.-W. Romeijn, “New theory about old evidence; A framework for open-minded Bayesianism,” Synthese, vol. 193, pp. 1225–1250, 2016.
[Abstract]
We present a conservative extension of a Bayesian account of confirmation that can deal with the problem of old evidence and new theories. So-called open-minded Bayesianism challenges the assumption – implicit in standard Bayesianism – that the correct empirical hypothesis is among the ones currently under consideration. It requires the inclusion of a catch-all hypothesis, which is characterized by means of sets of probability assignments. Upon the introduction of a new theory, the former catch-all is decomposed into a new empirical hypothesis and a new catch-all. As will be seen, this motivates a second update rule, besides Bayes' rule, for updating probabilities in light of a new theory. This rule conserves probability ratios among the old hypotheses. This framework allows for old evidence to confirm a new hypothesis due to a shift in the theoretical context. The result is a version of Bayesianism that, in the words of Earman, “keep[s] an open mind, but not so open that your brain falls out”.
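The second update rule described above, which decomposes the catch-all while conserving probability ratios among the old hypotheses, can be sketched with hypothetical numbers (my own toy illustration of that rule, not code from the paper):

```python
def introduce_theory(old_probs, catch_all, share):
    """Decompose the catch-all when a new theory is formulated.
    The explicit old hypotheses keep their probabilities, so all ratios
    among them are conserved; the catch-all's mass is split."""
    p_new = catch_all * share
    return dict(old_probs), p_new, catch_all - p_new

# Hypothetical prior over two explicit hypotheses plus a catch-all:
old = {"H1": 0.3, "H2": 0.5}
old, p_new, catch_all = introduce_theory(old, 0.2, share=0.5)

print(old["H1"] / old["H2"])  # 0.6: the ratio among old hypotheses is unchanged
print(p_new, catch_all)       # 0.1 0.1: the former catch-all is decomposed
print(old["H1"] + old["H2"] + p_new + catch_all)  # total mass still 1 (up to rounding)
```

The `share` parameter is hypothetical; in the framework it would be fixed by the sets of probability assignments characterizing the catch-all, not by a single number.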
-
S. Wenmackers, “Ballonnen boven de filosofische freesmachine,” Algemeen Nederlands Tijdschrift voor Wijsbegeerte, vol. 108, pp. 145–149, 2016.
[Abstract]
In her discussion article, Annemarie Kalis examines why public philosophy has a bad reputation among many academic philosophers. In her analysis, she considers which goals public philosophy should serve. On that basis, we can determine quality criteria, which may differ (and sometimes even conflict) across different goals. One possible answer to the opening question is that public philosophy typically pursues goals that do not match the criteria its critics apply. Kalis further calls on philosophers to direct philosophy more towards the world. While reading the focus article, I was struck by how many parallels there are with public communication in the exact sciences – something to which I, as a philosopher of science, also try to contribute. In this reply, I supplement Kalis’s analysis with a reflection from that context.
-
S. Wenmackers, “ ‘Dat kan geen toeval zijn!’ Waarschijnlijkheid: over objectieve kansen en subjectieve graden van overtuiging,” in In verscheidenheid verenigd, P. d’Hoine and B. Pattyn, Eds., Leuven, Belgium: UP Leuven, 2016, vol. 22, pp. 255–286.
2015
-
S. Wenmackers, “Zekerheid in de waarschijnlijkheidsleer,” Algemeen Nederlands Tijdschrift voor Wijsbegeerte, vol. 107, pp. 167–172, 2015.
[Abstract]
In the focus article, Jeanne Peijnenburg discusses a debate between C.I. Lewis and Hans Reichenbach on the question of whether meaningful statements about probability presuppose certainty. Lewis thought they do; Reichenbach thought they do not. Peijnenburg shows how Reichenbach’s position can be given a formal underpinning. She examines the situation of a countable chain of mutually supporting propositions, in which the founding proposition (p, at the start of the chain) has a probability smaller than one. Her argument shows that this does not necessarily imply that the probability of the ultimately derived target proposition (r, at the end of the chain) is undefined or zero in the limit. In this reply, I start again from some quotations from Lewis. I then turn to the general question of whether assigning a probability presupposes some certainty. Based on this discussion, I indicate the role of certainty in Peijnenburg’s argument.
2014
-
S. Wenmackers, D. E. P. Vanpoucke, and I. Douven, “Rationality: a social epistemology perspective,” Frontiers in Psychology, vol. 5, 2014.
[Abstract]
Both in philosophy and in psychology, human rationality has traditionally been studied from an “individualistic” perspective. Recently, social epistemologists have drawn attention to the fact that epistemic interactions among agents also give rise to important questions concerning rationality. In previous work, we have used a formal model to assess the risk that a particular type of social-epistemic interaction leads agents with initially consistent belief states into inconsistent belief states. Here, we continue this work by investigating the dynamics to which these interactions may give rise in the population as a whole.
-
L. Decock, I. Douven, C. Kelp, and S. Wenmackers, “Knowledge and approximate knowledge,” Erkenntnis, vol. 79, pp. 1129–1150, 2014.
[Abstract]
Traditionally, epistemologists have held that only truth-related factors matter in the question of whether a subject can be said to know a proposition. Various philosophers have recently departed from this doctrine by claiming that the answer to this question also depends on practical concerns. They take this move to be warranted by the fact that people’s knowledge attributions appear sensitive to contextual variation, in particular variation due to differing stakes. This paper proposes an alternative explanation of the aforementioned fact, one that allows us to stick to the orthodoxy. The alternative applies the conceptual spaces approach to the concept of knowledge. With knowledge conceived of spatially, the variability in knowledge attributions follows from recent work on identity, according to which our standards for judging things (including concepts) to be identical are context-dependent. On the proposal to be made, it depends on what is at stake in a context whether it is worth distinguishing between knowing and being at least close to knowing.
-
K. Krzyżanowska, S. Wenmackers, and I. Douven, “Rethinking Gibbard’s riverboat argument,” Studia Logica, vol. 102, pp. 771–792, 2014.
[Abstract]
According to the Principle of Conditional Non-Contradiction (CNC), conditionals of the form “If p, q” and “If p, not q” cannot both be true, unless p is inconsistent. This principle is widely regarded as an adequacy constraint on any semantics that attributes truth conditions to conditionals. Gibbard has presented an example of a pair of conditionals that, in the context he describes, appear to violate CNC. He concluded from this that conditionals lack truth conditions. We argue that this conclusion is rash by proposing a new diagnosis of what is going on in Gibbard’s argument. We also provide empirical evidence in support of our proposal.
-
J. Peijnenburg and S. Wenmackers, “Infinite regress in decision theory, philosophy of science, and formal epistemology,” Synthese, vol. 191, pp. 627–628, 2014.
[Abstract]
Infinite regresses have had a mixed reception in philosophy. On the one hand, they are often quickly brushed aside as being unreasonable and even preposterous. If a position or a theory is shown to engender an infinite regress, that pretty well means the end of it: an argumentum ad infinitum is as good as a reductio ad absurdum. On the other hand, the alleged absurdity of infinite regresses has been a source of inspiration and the ground for far-reaching conclusions. It led to the requirement that a regress should come to a stop and thence to the verdict that there exist such things as a Prime Mover, a First Cause, a Highest Good, or a Causa Sui. Infinite regresses have been discussed in practically all branches of philosophy, not only in metaphysics and epistemology, but also in ethics, philosophy of mind, logic, and argumentation theory. In recent years, they have spurred a lively debate in traditional epistemology, where ‘infinitism’ (the idea that an infinite chain of epistemic justification is not prima facie absurd) is on its way to becoming a mature alternative to the traditional positions, foundationalism and coherentism. The papers in the present volume are about infinite regress in decision theory, philosophy of science, and formal epistemology. We are fortunate to have brought together a number of renowned philosophers and promising young scholars.
-
S. D. Pop, K. Hinrichs, N. Esser, S. Wenmackers, C. Cobet, and D. R. T. Zahn, “DNA structures on silicon and diamond,” in Ellipsometry of Functional Organic Surfaces and Films, K. Hinrichs and K.-J. Eichhorn, Eds., Berlin, Germany: Springer, 2014, vol. 52, pp. 47–59.
[Abstract]
In the design of DNA-based hybrid devices, it is essential to have knowledge of the structural, electronic and optical properties of these biomolecular films. Spectroscopic ellipsometry is a powerful technique to probe and assess these properties. In this chapter, we review its application to biomolecular films of single DNA bases and molecules on silicon and diamond surfaces characterized in the spectral range from the near-infrared (NIR) through the visible (Vis) and toward the vacuum ultraviolet (VUV). The reported optical constants of various DNA structures are of great interest, particularly in the development of biosensors.
2013
-
S. Wenmackers, “Ultralarge lotteries: Analyzing the lottery paradox using non-standard analysis,” Journal of Applied Logic, vol. 11, pp. 452–467, 2013.
[Abstract]
A popular way to relate probabilistic information to binary rational beliefs is the Lockean Thesis, which is usually formalized in terms of thresholds. This approach seems far from satisfactory: the value of the thresholds is not well-specified and the Lottery Paradox shows that the model violates the Conjunction Principle. We argue that the Lottery Paradox is a symptom of a more fundamental and general problem, shared by all threshold-models that attempt to put an exact border on something that is intrinsically vague. We propose application of the language of relative analysis — a type of non-standard analysis — to formulate a new model for rational belief, called Stratified Belief. This contextualist model seems well-suited to deal with a concept of beliefs based on probabilities ‘sufficiently close to unity’ and satisfies a moderately weakened form of the Conjunction Principle. We also propose an adaptation of the model that is able to deal with beliefs that are less firm than ‘almost certainty’. The adapted version is also of interest for the epistemicist account of vagueness.
-
S. Wenmackers and L. Horsten, “Fair infinite lotteries,” Synthese, vol. 190, pp. 37–61, 2013.
[Abstract]
This article discusses how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. Techniques and ideas from non-standard analysis are brought to bear on the problem.
-
V. Benci, L. Horsten, and S. Wenmackers, “Non-Archimedean probability,” Milan Journal of Mathematics, vol. 81, pp. 121–151, 2013.
[Abstract]
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero (in other words: the probability functions are regular). We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov’s axiomatization of probability is replaced by a different type of infinite additivity.
-
K. Krzyżanowska, S. Wenmackers, and I. Douven, “Inferential conditionals and evidentiality,” Journal of Logic, Language and Information, vol. 22, pp. 315–334, 2013.
[Abstract]
Many conditionals seem to convey the existence of a link between their antecedent and consequent. We draw on a recently proposed typology of conditionals to argue for an old philosophical idea according to which the link is inferential in nature. We show that the proposal has explanatory force by presenting empirical results on the evidential meaning of certain English and Dutch modal expressions.
2012
-
S. Wenmackers, D. E. P. Vanpoucke, and I. Douven, “Probability of inconsistencies in theory revision; a multi-agent model for updating logically interconnected beliefs under bounded confidence,” European Physical Journal B, vol. 85, 2012.
[Abstract]
We present a model for studying communities of epistemically interacting agents who update their belief states by averaging (in a specified way) the belief states of other agents in the community. The agents in our model have a rich belief state, involving multiple independent issues which are interrelated in such a way that they form a theory of the world. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating (in the given way). To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. It is shown that, under the assumptions of our model, an agent always has a probability of less than 2% of ending up in an inconsistent belief state. Moreover, this probability can be made arbitrarily small by increasing the number of independent issues the agents have to judge or by increasing the group size. A real-world situation to which this model applies is a group of experts participating in a Delphi-study.
-
S. Wenmackers and D. E. P. Vanpoucke, “Models and simulations in material science: two cases without error bars,” Statistica Neerlandica, vol. 66, pp. 339–355, 2012.
[Abstract]
We discuss two research projects in material science in which the results cannot be stated with an estimation of the error: a spectroscopic ellipsometry study aimed at determining the orientation of DNA molecules on diamond and a scanning tunneling microscopy study of platinum-induced nanowires on germanium. To investigate the reliability of the results, we apply ideas from the philosophy of models in science. Even if the studies had reported an error value, the trustworthiness of the result would not depend on that value alone.
-
V. Benci, L. Horsten, and S. Wenmackers, “Axioms for Non-Archimedean Probability (NAP),” in Future Directions for Logic; Proceedings of PhDs in Logic III, J. De Vuyst and L. Demey, Eds., London, UK: College Publications, 2012, vol. 2.
[Abstract]
In this contribution, we focus on probabilistic problems with a denumerably or non-denumerably infinite number of possible outcomes. Kolmogorov (1933) provided an axiomatic basis for probability theory, presented as a part of measure theory, which is a branch of standard analysis or calculus. Since standard analysis does not allow for non-Archimedean quantities (i.e. infinitesimals), we may call Kolmogorov’s approach ‘Archimedean probability theory’. We show that allowing non-Archimedean probability values may have considerable epistemological advantages in the infinite case.
-
S. Wenmackers, “Ultralarge and infinite lotteries,” in Logic, Philosophy and History of Science in Belgium II; Proceedings of the Young Researchers Days 2010, B. Van Kerkhove, T. Libert, G. Vanpaemel, and P. Marage, Eds., Brussels, Belgium: Koninklijke Vlaamse Academie van België voor Wetenschappen en Kunsten, 2012, pp. 59–66.
-
K. Krzyżanowska, S. Wenmackers, I. Douven, and S. Verbrugge, “Conditionals, inference, and evidentiality,” in Proceedings of the Logic & Cognition Workshop at ESSLLI 2012; Opole, Poland, 13–17 August, 2012, J. Szymanik and R. Verbrugge, Eds., 2012, vol. 883, pp. 38–47.
[Abstract]
At least many conditionals seem to convey the existence of a link between their antecedent and consequent. We draw on a recently proposed typology of conditionals to revive an old philosophical idea according to which the link is inferential in nature. We show that the proposal has explanatory force by presenting empirical results on two Dutch linguistic markers.
2011
-
S. Wenmackers, “Philosophy of Probability: Foundations, Epistemology, and Computation,” 2011.
-
S. Wenmackers, “How to track reality,” in Inception and Philosophy; Ideas to die for, T. Botz-Bornstein, Ed., Chicago, IL: Open Court, 2011, vol. 62, pp. 3–23.
-
D. T. Tran, V. Vermeeren, L. Grieten, S. Wenmackers, P. Wagner, J. Pollet, K. P. F. Janssen, L. Michiels, and J. Lammertyn, “Nanocrystalline diamond impedimetric aptasensor for the label-free detection of human IgE,” Biosensors and Bioelectronics, vol. 26, pp. 2987–2993, 2011.
[Abstract]
Like antibodies, aptamers are highly valuable as bioreceptor molecules for protein biomarkers because of their excellent selectivity, specificity and stability. The integration of aptamers with semiconducting materials offers great potential for the development of reliable aptasensors. In this paper we present an aptamer-based impedimetric biosensor using a nanocrystalline diamond (NCD) film as a working electrode for the direct and label-free detection of human immunoglobulin E (IgE). Amino (NH2)-terminated IgE aptamers were covalently attached to carboxyl (COOH)-modified NCD surfaces using carbodiimide chemistry. Electrochemical impedance spectroscopy (EIS) was applied to measure the changes in interfacial electrical properties that arise when the aptamer-functionalized diamond surface was exposed to IgE solutions. During incubation, the formation of aptamer–IgE complexes caused a significant change in the capacitance of the double-layer, in good correspondence with the IgE concentration. The linear dynamic range of IgE detection was from 0.03 µg/mL to 42.8 µg/mL. The detection limit of the aptasensor reached physiologically relevant concentrations (0.03 µg/mL). The NCD-based aptasensor was demonstrated to be highly selective even in the presence of a large excess of IgG. In addition, the aptasensor provided reproducible signals during six regeneration cycles. The impedimetric aptasensor was successfully tested on human serum samples, which opens up the potential of using EIS for direct and label-free detection of IgE levels in blood serum.
-
V. Vermeeren, L. Grieten, N. Vanden Bon, N. Bijnens, S. Wenmackers, S. D. Janssens, K. Haenen, P. Wagner, and L. Michiels, “Impedimetric, diamond-based immunosensor for the detection of C-Reactive Protein,” Sensors and Actuators B: Chemical, vol. 157, pp. 130–138, 2011.
[Abstract]
The high prevalence of cardiovascular diseases (CVD) demands a reliable and sensitive risk assessment technique. In order to develop a fast and label-free immunosensor for C-reactive protein (CRP), a risk factor for this condition, anti-CRP antibodies were physically adsorbed to the hydrogen (H)-terminated surface of nanocrystalline diamond (NCD). An Enzyme-Linked ImmunoSorbent Assay (ELISA) reference technique showed that this was a suitable substrate for antibody–antigen recognition reactions. Electrochemical Impedance Spectroscopy (EIS) was used to electronically detect CRP recognition. The specificity of the immunosensor was demonstrated by incubation with CRP and plasminogen as reference molecule. A different impedance behavior was observed in real-time after CRP addition as compared to plasminogen addition: the impedance increased only during CRP incubation. Fitting the data showed that this corresponded with a decrease in capacitance of the molecular layer due to its increased thickness by specific CRP recognition. Sensitivity experiments in real-time showed a clear discrimination between 1 µM, 100 nM, and 10 nM of CRP after 10 min at 100 Hz. Since 10 nM of CRP was still clearly distinguishable from buffer solution, our CRP-directed immunosensor prototype reaches a sensitivity that is within the physiologically relevant concentration range of this biomarker in healthy controls and CVD patients. Moreover, this prototype displayed real-time discriminating power between spiked and unspiked serum, and thus also shows its applicability in this biological matrix.
2010
-
V. Vermeeren, N. Bijnens, L. Grieten, S. Wenmackers, N. Vanden Bon, K. Haenen, P. Wagner, and L. Michiels, “Diamond-based biosensors with an impedimetric and label-free read-out,” in Nanotechnology 2010: Bio Sensors, Instruments, Medical, Environment and Energy, NSTI, 2010, vol. 3, pp. 23–26.
[Abstract]
Healthcare and diagnostics are slowly moving towards molecular medicine and point-of-care diagnosis. Electronic biosensors can offer opportunities in this area that the current state-of-the-art, such as denaturing gradient gel electrophoresis (DGGE) for mutation analysis and ELISA for abnormal protein detection, is unable to fulfill. We developed an electronic DNA- and immunosensor based on nanocrystalline diamond (NCD) and electrochemical impedance spectroscopy (EIS). During DNA hybridization and denaturation, a difference is observed for 1-mismatch target DNA and complementary target DNA in real-time. Our immunosensor was made to detect C-reactive protein (CRP), a risk marker for cardiovascular diseases. The specificity of the immunosensor was demonstrated by the incubation with CRP and the non-specific plasminogen. Since 10 nM of CRP was still clearly distinguishable from buffer, our prototype reached a sensitivity that is in the clinically relevant concentration ranges.
-
M. Bäcker, A. Poghossian, M. H. Abouzar, S. Wenmackers, S. D. Janssens, K. Haenen, P. Wagner, and M. J. Schöning, “Capacitive field-effect (bio-)chemical sensors based on nanocrystalline diamond films,” Mater. Res. Soc. Symp. Proc., vol. 1203, pp. J17-31, 2010.
[Abstract]
Capacitive field-effect electrolyte-diamond-insulator-semiconductor (EDIS) structures with O-terminated nanocrystalline diamond (NCD) as sensitive gate material have been realized and investigated for the detection of pH, penicillin concentration, and layer-by-layer adsorption of polyelectrolytes. The surface oxidizing procedure of NCD thin films as well as the seeding and NCD growth process on a Si-SiO2 substrate have been improved to provide highly pH-sensitive, non-porous thin films without damaging the underlying SiO2 layer and with a high coverage of O-terminated sites. The NCD surface topography, roughness, and coverage of the surface groups have been characterized by SEM, AFM and XPS methods. The EDIS sensors with O-terminated NCD film treated in oxidizing boiling mixture for 45 min show a pH sensitivity of about 50 mV/pH. The pH-sensitive properties of the NCD have been used to develop an EDIS-based penicillin biosensor with high sensitivity (65-70 mV/decade in the concentration range of 0.25-2.5 mM penicillin G) and low detection limit (5 µM). The results of label-free electrical detection of layer-by-layer adsorption of charged polyelectrolytes are presented, too.
-
R. Vansweevelt, A. Malesevic, M. Van Gompel, A. Vanhulsel, S. Wenmackers, J. D’Haen, V. Vermeeren, M. Ameloot, L. Michiels, C. Van Haesendonck, and P. Wagner, “Biological modification of carbon nanowalls with DNA strands and hybridization experiments with complementary and mismatched DNA,” Chemical Physics Letters, vol. 485, pp. 196–201, 2010.
[Abstract]
Carbon nanowalls (CNWs) are functionalized with short DNA strands. CNWs are a network of upstanding carbon flakes consisting of 4–6 graphene layers packed together. The biological modification of these structures is carried out via the photochemical attachment of an unsaturated fatty acid linker molecule, followed by an EDC-mediated reaction to bind amino-modified DNA strands to the linker. Hybridization experiments are done using complementary and mismatch DNA strands. Fluorescence microscopy shows a clear difference between EDC-negative and EDC-positive samples, supporting the covalent nature of the DNA attachment. Furthermore, we demonstrate that the hybridization is highly selective and reproducible.
2009
-
S. Wenmackers, V. Vermeeren, M. vandeVen, M. Ameloot, N. Bijnens, K. Haenen, L. Michiels, and P. Wagner, “Diamond-based DNA sensors: surface functionalization and read-out strategies,” physica status solidi (a), vol. 206, pp. 391–408, 2009.
[Abstract]
This article reviews the current state-of-the-art of diamond-based DNA sensors. Some general concepts involved in biosensors are introduced and applied to DNA sensors. The properties of chemically vapor deposited (CVD) diamond relevant for this application are summed up, with special attention to the stability and bio-compatibility of the material. Several routes to functionalize the diamond surface are considered. The physical properties of the obtained DNA layers are discussed in terms of surface density and molecular conformation. Possible read-out strategies are evaluated, including optical and electronic sensing. With diamond-based DNA sensors, real-time and label-free sensing is achieved.
-
N. Bijnens, V. Vermeeren, M. Daenen, L. Grieten, K. Haenen, S. Wenmackers, O. A. Williams, M. Ameloot, M. vandeVen, L. Michiels, and P. Wagner, “Synthetic diamond films as a platform material for label-free protein sensors,” physica status solidi (a), vol. 206, pp. 520–526, 2009.
[Abstract]
In the framework of developing a fast and label-free immunosensor for C-reactive protein (CRP) detection, H-terminated nanocrystalline diamond (NCD) was functionalised with anti-CRP antibodies that were physically adsorbed to the surface. Impedance spectroscopy was used to electronically detect real-time CRP recognition. Different impedance behaviours were observed after CRP addition as compared to after FITC-labelled ssDNA addition at low (100 Hz) as well as at high frequencies (1 MHz). Physical interpretations of the observed impedance changes were obtained by fitting the data to an equivalent electrical circuit. Concentrations of 1 µM CRP were recognised with a reaction time of 30 minutes.
-
Z. Remes, A. Kromka, H. Kozak, M. Vanecek, K. Haenen, and S. Wenmackers, “The infrared optical absorption spectra of the functionalized nanocrystalline diamond surface,” Diamond and Related Materials, vol. 18, pp. 772–775, 2009.
[Abstract]
We propose a new method of detecting the functional groups at the NCD surface based on the interference-free infrared reflection absorption spectroscopy of p-polarized IR light at Brewster’s angle of incidence (BA-IRRAS). We report IR absorbance spectra of a linker molecule monolayer (10-undecenoic acid) covalently bonded to the NCD surface with and without DNA fragments coupled to it, and IR spectra of organosilane polymer coatings deposited on NCD surface by the spin coating technology. The homogeneity of the surface coatings was monitored by fluorescence microscopy.
-
V. Vermeeren, S. Wenmackers, P. Wagner, and L. Michiels, “DNA sensors with diamond as a promising alternative transducer material,” Sensors, vol. 9, pp. 5600–5636, 2009.
[Abstract]
Bio-electronics is a scientific field coupling the achievements in biology with electronics to obtain higher sensitivity, specificity and speed. Biosensors have played a pivotal role, and many have become established in the clinical and scientific world. They need to be sensitive, specific, fast and cheap. Electrochemical biosensors are most frequently cited in literature, often in the context of DNA sensing and mutation analysis. However, many popular electrochemical transduction materials, such as silicon, are susceptible to hydrolysis, leading to loss of bioreceptor molecules from the surface. Hence, increased attention has been shifted towards diamond, which surpasses silicon on many levels.
-
N. Smisdom, I. Smets, O. A. Williams, M. Daenen, S. Wenmackers, K. Haenen, M. Nesládek, J. D’Haen, P. Wagner, J.-M. Rigo, M. Ameloot, and M. vandeVen, “Chinese hamster ovary cell viability on hydrogen and oxygen terminated nano- and microcrystalline diamond surfaces,” physica status solidi (a), vol. 206, pp. 2042–2047, 2009.
[Abstract]
Transfected Chinese hamster ovary cells were cultured on bare uncoated chemical vapor deposited thin nano- and microcrystalline diamond surfaces, hydrophobic hydrogen- and hydrophilic oxygen-terminated. Optical and biochemical analyses show that compared to glass controls, growth and viability were not significantly affected (one-way analysis of variance, ANOVA). Based on two-way ANOVA analyses, neither grain size nor surface termination had a significant influence until 5 days post-seeding.
2008
-
S. Wenmackers, S. D. Pop, K. Roodenko, V. Vermeeren, O. A. Williams, M. Daenen, O. Douhéret, J. D’Haen, A. Hardy, V. M. K. Bael, K. Hinrichs, C. Cobet, M. vandeVen, M. Ameloot, K. Haenen, L. Michiels, N. Esser, and P. Wagner, “Structural and optical properties of DNA layers covalently attached to diamond surfaces,” Langmuir, vol. 24, pp. 7269–7299, 2008.
[Abstract]
Label-free detection of DNA molecules on chemically vapor-deposited diamond surfaces is achieved with spectroscopic ellipsometry in the infrared and vacuum ultraviolet range. This nondestructive method has the potential to yield information on the average orientation of single as well as double-stranded DNA molecules, without restricting the strand length to the persistence length. The orientational analysis based on electronic excitations in combination with information from layer thicknesses provides a deeper understanding of biological layers on diamond. The π–π* transition dipole moments, corresponding to a transition at 4.74 eV, originate from the individual bases. They are in a plane perpendicular to the DNA backbone with an associated n–π* transition at 4.47 eV. For 8–36 bases of single- and double-stranded DNA covalently attached to ultra-nanocrystalline diamond, the ratio between in- and out-of-plane components in the best fit simulations to the ellipsometric spectra yields an average tilt angle of the DNA backbone with respect to the surface plane ranging from 45° to 52°. We comment on the physical meaning of the calculated tilt angles. Additional information is gathered from atomic force microscopy, fluorescence imaging, and wetting experiments. The results reported here are of value in understanding and optimizing the performance of the electronic readout of a diamond-based label-free DNA hybridization sensor.
-
V. Vermeeren, S. Wenmackers, M. Daenen, K. Haenen, O. A. Williams, M. Ameloot, M. vandeVen, P. Wagner, and L. Michiels, “Topographical and functional characterisation of the ssDNA probe layer generated through EDC-mediated covalent attachment to nanocrystalline diamond using fluorescence microscopy,” Langmuir, vol. 24, pp. 9125–9134, 2008.
[Abstract]
The covalent attachment method for DNA on nanocrystalline diamond (NCD), involving the introduction of COOH functionalities on the surface by photoattachment of 10-undecenoic acid (10-UDA), followed by the 1-ethyl-3-(3-dimethylaminopropyl)-carbodiimide (EDC)-mediated coupling to NH2-labeled ssDNA, is evaluated in terms of stability, density, and functionality of the resulting biological interface. This is of crucial importance in DNA biosensor development. The covalent nature of DNA attachment will infer the necessary stability and favorable orientation to the ssDNA probe molecules. Using confocal fluorescence microscopy, the influence of buffer type for the removal of excess 10-UDA and ssDNA, the probe ssDNA length, the probe ssDNA concentration, and the presence of the COOH-linker on the density and functionality of the ssDNA probe layer were investigated. It was determined that the most homogeneously dense and functional DNA layer was obtained when 300 pmol of short ssDNA was applied to COOH-modified NCD samples, while H-terminated NCD was resistant to DNA attachment. Exploiting this surface functionality dependence of the DNA attachment efficiency, a shadow mask was applied during the photochemical introduction of the COOH-functionalities, leaving certain regions on the NCD H-terminated. The subsequent DNA attachment resulted in a fluorescence pattern corresponding to the negative of the shadow mask. Finally, NCD surfaces covered with mixtures of the 10-UDA linker molecule and a similar molecule lacking the COOH functionality, functioning as a lateral spacer, were examined for their suitability in preventing nonspecific adsorption to the surface and in decreasing steric hindrance. However, purely COOH-modified NCD samples, patterned with H-terminated regions and treated with a controlled amount of probe DNA, proved the most efficient in fulfilling these tasks.
-
S. Wenmackers, “Morphology, functionality and molecular conformation study of CVD diamond surfaces functionalised with organic linkers and DNA,” 2008.
2007
-
V. Vermeeren, N. Bijnens, S. Wenmackers, M. Daenen, K. Haenen, O. A. Williams, M. Ameloot, M. vandeVen, P. Wagner, and L. Michiels, “Towards a real-time, label-free diamond-based DNA sensor,” Langmuir, vol. 23, pp. 13193–13202, 2007.
[Abstract]
Most challenging in the development of DNA sensors is the ability to distinguish between fully complementary target ssDNA (single-strand DNA) and 1-mismatch ssDNA. To deal with this problem, we performed impedance spectroscopy on DNA-functionalized nanocrystalline diamond (NCD) layers during hybridization and denaturation. In both reactions, a difference in behavior was observed for 1-mismatch target DNA and complementary target DNA in real-time. During real-time hybridization, a decrease of the impedance was observed at lower frequencies when the complementary target DNA was added, while the addition of 1-mismatch target ssDNA caused no significant change. Fitting these results to an electrical circuit demonstrates that this is correlated with a decrease of the depletion zone in the space charge region of the diamond. During real-time denaturation, differentiation between 1-mismatch and complementary target DNA was possible at higher frequencies. Denaturation of complementary DNA showed the longest exponential decay time of the impedance, while the decay time during 1-mismatch denaturation was the shortest. The real-time hybridization and denaturation experiments were carried out on different NCD samples in various buffer solutions at temperatures between 20 and 80°C. It was revealed that the best results were obtained using a Microhyb hybridization buffer at 80°C and 10× PCR buffer at 30°C for hybridization and 0.1 M NaOH at temperatures above 40°C for denaturation. We demonstrate that the combination of real-time hybridization spectra and real-time denaturation spectra yield important information on the type of target. This approach may allow a reliable identification of the mismatch sequence, which is the most biologically relevant.
2006
-
P. Christiaens, V. Vermeeren, S. Wenmackers, M. Daenen, K. Haenen, M. Nesládek, M. vandeVen, M. Ameloot, L. Michiels, and P. Wagner, “EDC-mediated DNA attachment to nanocrystalline CVD diamond films,” Biosensors and Bioelectronics, vol. 22, pp. 170–177, 2006.
[Abstract]
Chemical vapour deposited (CVD) diamond is a very promising material for biosensor fabrication owing both to its chemical inertness and the possibility of making it electrically semiconducting, which allows for connection with integrated circuits. For biosensor construction, a biochemical method to immobilize nucleic acids to a diamond surface has been developed. Nanocrystalline diamond is grown using microwave plasma-enhanced chemical vapour deposition (MPECVD). After hydrogenation of the surface, 10-undecenoic acid, an ω-unsaturated fatty acid, is tethered by 254 nm photochemical attachment. This is followed by 1-ethyl-3-[3-dimethylaminopropyl]carbodiimide (EDC)-mediated attachment of amino (NH2)-modified dsDNA. The functionality of the covalently bound dsDNA molecules is confirmed by fluorescence measurements, PCR and gel electrophoresis during 35 denaturation and rehybridisation steps. The linking method after the fatty acid attachment can easily be applied to other biomolecules like antibodies and enzymes.
2005
-
S. Wenmackers, P. Christiaens, M. Daenen, K. Haenen, M. Nesládek, M. vandeVen, V. Vermeeren, L. Michiels, M. Ameloot, and P. Wagner, “DNA attachment to nanocrystalline diamond films,” physica status solidi (a), vol. 202, pp. 2212–2216, 2005.
[Abstract]
A biochemical method to immobilize DNA on synthetic diamond for biosensor applications is developed. Nanocrystalline diamond is grown using microwave plasma-enhanced chemical vapour deposition. On the hydrogen-terminated surface 10-undecenoic acid is tethered photochemically under 254 nm illumination, followed by 1-ethyl-3-[3-dimethylaminopropyl]carbodiimide crosslinker-mediated attachment of amino modified DNA. The attachment is functionally confirmed by comparison of supernatant fluorescence and gel electrophoresis. The linking procedure allowed for 35 denaturation and rehybridisation steps.
-
S. Wenmackers, P. Christiaens, W. Deferme, M. Daenen, K. Haenen, M. Nesládek, P. Wagner, V. Vermeeren, L. Michiels, M. vandeVen, M. Ameloot, J. Wouters, L. Naelaerts, and Z. Mekhalif, “Head-on immobilization of DNA fragments on CVD-diamond layers,” Materials Science Forum, vol. 492–493, pp. 267–272, 2005.
[Abstract]
Synthetic diamond is regarded as a promising material for biosensors: it forms a stable platform for genetic assays and its biocompatibility opens the possibility for in vivo sensing. In this study the use of a thymidine linker for covalent DNA attachment was evaluated. Contact angle measurements provided a qualitative test of the initially oxidized surface. X-ray photoemission spectroscopy was used for further analysis of the oxides and for monitoring the effect of subsequent chemical treatments. The presence of FITC-labelled DNA was confirmed by confocal fluorescence microscopy. Enzyme linked immunosorbent assays indicated that this DNA was merely adsorbed on the diamond surface instead of covalently bound.
2004
-
M. vandeVen, S. Wenmackers, K. Haenen, M. Nesládek, P. Wagner, L. Michiels, and M. Ameloot, “Spectral confocal imaging and patterning effects of surface modified polycrystalline diamond films,” Biophysical Journal, vol. 86, pp. 608A–609A, 2004.
2003
-
S. Wenmackers, K. Haenen, M. Nesládek, P. Wagner, L. Michiels, M. vandeVen, and M. Ameloot, “Covalent immobilization of DNA on CVD diamond films,” physica status solidi (a), vol. 199, pp. 44–48, 2003.
[Abstract]
Chemical vapour deposited (CVD) diamond is used in DNA immobilization experiments for its high stability and versatility. In this study a double-stranded, 250 base pairs long DNA fragment of the human PKU gene was covalently bound to CVD diamond films. Thymidine was used as a linker molecule for the binding in head-on configuration. The presence of surface-bound FITC-labelled DNA was confirmed by confocal fluorescence microscopy. In denaturation experiments the second, unlabelled strand of each pair was removed resulting in a loss of fluorescence. The possibility of non-specific binding of DNA to diamond can be excluded.