theory

Optimal Regulation of E-cigarettes: Theory and Evidence -- by Hunt Allcott, Charlie Rafkin

We model optimal e-cigarette regulation and estimate key sufficient statistics. Using tax changes and scanner data, we estimate relatively elastic demand and limited substitution between e-cigarettes and combustible cigarettes. In sample surveys, historical smoking declines for high- and low-vaping demographics were unchanged after e-cigarettes were introduced; this demographic shift-share identification also suggests limited substitution. We field a new survey of experts, who report that vaping is almost as harmful as smoking cigarettes. In our model, these results imply that current e-cigarette taxes are far below the social optimum, but Monte Carlo simulations highlight substantial uncertainty.




theory

Game of Thrones theory: Why Jaime and Cersei will NOT die together in A Dream of Spring



GAME OF THRONES saw Jaime and Cersei Lannister perish together, but could George RR Martin's A Dream of Spring see a different ending?




theory

Winds of Winter theory: Expect BIG focus on Sansa, Arya and future King Bran Stark



THE WINDS OF WINTER will have a big focus on the Stark children's POV, argues a new fan theory.




theory

Coronavirus: Chinese state media take aim at US 'lab theory'

State media says US claims that the coronavirus originated in a research laboratory are "absurd".




theory

Coronavirus: Is there any evidence for lab release theory?

BBC News examines allegations that the coronavirus was accidentally released from a lab.




theory

Rethinking youth bulge theory in policy and scholarship: incorporating critical gender analysis

7 May 2020 , Volume 96, Number 3

Lesley Pruitt

For decades ‘youth bulge’ theory has dominated understandings of youth in mainstream International Relations. Youth bulge theory has also become part of some public media analyses, mainstream political rhetoric, and even officially enshrined in the foreign policy of some states. Through the ‘youth bulge’ lens, youth—especially males—have been presented as current or future perpetrators of violence. However, this article argues that the youth bulge thesis postulated in mainstream IR is based on flawed theoretical assumptions. In particular, supporters of youth bulge theory fail to engage with existing research by feminist IR scholars and thus take on a biological essentialist approach. This has led to theoretical and practical misunderstandings of the roles youth play in relation to conflict, peace and security. These partial and biased understandings have also resulted in less effective policy-making. In critically reflecting on the ‘youth bulge’ thesis, this article argues that applying gender analysis is crucial to understanding the involvement of young people in general—and young men in particular—in conflict. Doing so will contribute to advancing more accurate analysis in scholarship and policy-making.




theory

Probabilistic Methods in Geometry, Topology and Spectral Theory

Yaiza Canzani, Linan Chen and Dmitry Jakobson, editors. American Mathematical Society | Centre de Recherches Mathematiques, 2019, CONM, volume 739, approx. 208 pp. ISBN: 978-1-4704-4145-6 (print), 978-1-4704-5599-6 (online).

This volume contains the proceedings of the CRM Workshops on Probabilistic Methods in Spectral Geometry and PDE, held from August 22–26, 2016 and...




theory

Complex Analysis and Spectral Theory

H. Garth Dales, Dmitry Khavinson and Javad Mashreghi, editors. American Mathematical Society | Centre de Recherches Mathematiques, 2020, CONM, volume 743, approx. 296 pp. ISBN: 978-1-4704-4692-5 (print), 978-1-4704-5453-1 (online).

This volume contains the proceedings of the Conference on Complex Analysis and Spectral Theory, in celebration of Thomas Ransford's 60th birthday, held...




theory

Motivic Homotopy Theory and Refined Enumerative Geometry

Federico Binda, Marc Levine, Manh Toan Nguyen and Oliver Röndigs, editors. American Mathematical Society, 2020, CONM, volume 745, approx. 286 pp. ISBN: 978-1-4704-4898-1 (print), 978-1-4704-5455-5 (online).

This volume contains the proceedings of the Workshop on Motivic Homotopy Theory and Refined Enumerative Geometry, held from May 14–18, 2018, at...




theory

K-theory in Algebra, Analysis and Topology

Guillermo Cortiñas and Charles A. Weibel, editors. American Mathematical Society, 2020, CONM, volume 749, approx. 398 pp. ISBN: 978-1-4704-5026-7 (print), 978-1-4704-5594-1 (online).

This volume contains the proceedings of the ICM 2018 satellite school and workshop on K-theory, held in Argentina. The school was held...





theory

Topology and Elementary Electric Circuit Theory, I




theory

Topology and Elementary Electric Circuit Theory, II: Duality




theory

Topological Quantum Field Theory for Vampires




theory

Advances in Representation Theory of Algebras

David J. Benson, University of Aberdeen, Henning Krause, University of Bielefeld, and Andrzej Skowronski, Nicolaus Copernicus University, Editors - A publication of the European Mathematical Society, 2013, 378 pp., Hardcover, ISBN-13: 978-3-03719-125-5, List: US$98, Institutional Member: US$78.40, All Individuals: US$78.40, EMSSCR/9

This volume presents a collection of articles devoted to representations of algebras and related topics. Distinguished experts in this field...




theory

Capacity Theory with Local Rationality: The Strong Fekete-Szego Theorem on Curves

Robert Rumely, University of Georgia - AMS, 2013, 437 pp., Hardcover, ISBN-13: 978-1-4704-0980-7, List: US$119, All AMS Members: US$95.20, SURV/193

This book is devoted to the proof of a deep theorem in arithmetic geometry, the Fekete-Szegö theorem with local rationality conditions. The...




theory

Hodge Theory, Complex Geometry, and Representation Theory

Robert S. Doran, Greg Friedman, and Scott Nollet, Texas Christian University, Editors - AMS, 2014, approx. 318 pp., Softcover, ISBN-13: 978-0-8218-9415-6, List: US$113, All AMS Members: US$90.40, CONM/608

This volume contains the proceedings of an NSF/Conference Board of the Mathematical Sciences (CBMS) regional conference on Hodge theory, complex...




theory

Ring Theory and Its Applications

Dinh Van Huynh, S. K. Jain, and Sergio R. Lopez-Permouth, Ohio University, and S. Tariq Rizvi and Cosmin S. Roman, Ohio State University, Editors - AMS, 2014, 311 pp., Softcover, ISBN-13: 978-0-8218-8797-4, List: US$113, All AMS Members: US$90.40, CONM/609

This volume contains the proceedings of the Ring Theory Session in honor of T. Y. Lam's 70th birthday, at the 31st Ohio State-Denison Mathematics...




theory

Perspectives in Representation Theory

Pavel Etingof, Massachusetts Institute of Technology, Mikhail Khovanov, Columbia University, and Alistair Savage, University of Ottawa, Editors - AMS, 2014, 370 pp., Softcover, ISBN-13: 978-0-8218-9170-4, List: US$126, All AMS Members: US$100.80, CONM/610

This volume contains the proceedings of the conference Perspectives in Representation Theory, held from May 12-17, 2012, at Yale University, in honor...




theory

Group Theory, Combinatorics, and Computing

Robert Fitzgerald Morse, University of Evansville, Daniela Nikolova-Popova, Florida Atlantic University, and Sarah Witherspoon, Texas A & M University, Editors - AMS, 2014, 187 pp., Softcover, ISBN-13: 978-0-8218-9435-4, List: US$78, All AMS Members: US$62.40, CONM/611

This volume contains the proceedings of the International Conference on Group Theory, Combinatorics and Computing held from October 3-8, 2012, in Boca...




theory

Operator-Valued Measures, Dilations, and the Theory of Frames

Deguang Han, University of Central Florida, David R. Larson, Texas A&M University, Bei Liu, Tianjin University of Technology, and Rui Liu, Nankai University - AMS, 2013, 84 pp., Softcover, ISBN-13: 978-0-8218-9172-8, List: US$65, All AMS Members: US$52, MEMO/229/1075

The authors develop elements of a general dilation theory for operator-valued measures. Hilbert space operator-valued measures are closely related to...





theory

A Social Theory of Violence Looks Beyond the Shooter

Like most people in Virginia, Donald Black was horrified by Seung Hui Cho's shooting rampage last week that left 33 people dead, including the shooter.




theory

Picking a Theory is Like Building a Boat at Sea


"We are like sailors who on the open sea must reconstruct their ship
 but are never able to start afresh from the bottom." 
Otto Neurath's analogy in the words of Willard V. Quine

Engineers, economists, social planners, security strategists, and others base their plans and decisions on theories. They often argue long and hard over which theory to use. Is it ever right to use a theory that we know is empirically wrong, especially if a true (or truer) theory is available? Why is it so difficult to pick a theory?

Let's consider two introductory examples.

You are an engineer designing a robot. You must calculate the forces needed to achieve specified motions of the robotic arms. You can base these calculations on either of two theories. One theory assumes that an object comes to rest unless a force acts upon it. Let's call this axiom A. The other theory assumes that an object moves at constant speed unless a force acts upon it. Let's call this axiom G. Axiom A agrees with observation: Nothing moves continuously without the exertion of force; an object will come to rest unless you keep pushing it. Axiom G contradicts all observation; no experiment illustrates the perpetual motion postulated by the axiom. If all else is the same, which theory should you choose?

Axiom A is Aristotle's law of inertia, which contributed little to the development of mechanical dynamics. Axiom G is Galileo's law of inertia: one of the most fruitful scientific ideas of all time. Why is an undemonstrable assertion - axiom G - a good starting point for a theory?

Consider another example.

You are an economist designing a market-based policy to induce firms to reduce pollution. You will use an economic theory to choose between policies. One theory assumes that firms face pure competition, meaning that no single firm can influence market prices. Another theory provides an agent-based, game-theoretic characterization of how firms interact (without colluding) by observing and responding to the price behavior of other firms and of consumers.

Pure competition is a stylized idealization (like axiom G). Game theory is much more realistic (like axiom A), but may obscure essential patterns in its massive detail. Which theory should you use?

We will not address the question of how to choose a theory upon which to base a decision. We will focus on the question: why is theory selection so difficult? We will discuss four trade-offs.

"Thanks to the negation sign, there are as many truths as falsehoods;
we just can't always be sure which are which." Willard V. Quine

The tension between right and right. The number of possible theories is infinite, and sometimes it's hard to separate the wheat from the chaff, as suggested by the quote from Quine. As an example, I have a book called A Modern Guide to Macroeconomics: An Introduction to Competing Schools of Thought by Snowdon, Vane and Wynarczyk. It's a wonderful overview of about a dozen theories developed by leading economic scholars, many of them Nobel Prize Laureates. The theories are all fundamentally different. They use different axioms and concepts and they compete for adoption by economists. These theories have been studied and tested upside down and backwards. However, economic processes are very complex and variable, and the various theories succeed in different ways or in different situations, so the jury is still out. The choice of a theory is no simple matter because many different theories can all seem right in one way or another.

"The fox knows many things, but the hedgehog knows one big thing." Archilochus

The fox-hedgehog tension. This aphorism by Archilochus metaphorically describes two types of theories (and two types of people). Fox-like theories are comprehensive and include all relevant aspects of the problem. Hedgehog-like theories, in contrast, skip the details and focus on essentials. Axiom A is fox-like because the complications of friction are acknowledged from the start. Axiom G is hedgehog-like because inertial resistance to change is acknowledged but the complications of friction are left for later. It is difficult to choose between these types of theories because it is difficult to balance comprehensiveness against essentialism. On the one hand, all relevant aspects of the problem should be considered. On the other hand, don't get bogged down in endless details. This fox-hedgehog tension can be managed by weighing the context, goals and implications of the decision. We won't expand on this idea since we're not considering how to choose a theory; we're only examining why it's a difficult choice. However, the idea of resolving this tension by goal-directed choice motivates the third tension.

"Beyond this island of meanings which in their own nature are true or false
lies the ocean of meanings to which truth and falsity are irrelevant." John Dewey

The truth-meaning tension. Theories are collections of statements like axioms A and G in our first example. Statements carry meaning, and statements can be either true or false. Truth and meaning are different. For instance, "Archilochus was a Japanese belly dancer" has meaning, but is not true. The quote from Dewey expresses the idea that "meaning" is a broader description of statements than "truth". All true statements mean something, but not all meaningful statements are true. That does not imply, however, that all untrue meaningful statements are false, as we will see.

We know the meanings of words and sentences from experience with language and life. A child learns the meanings of words - chair, mom, love, good, bad - by experience. Meanings are learned by pointing - this is a chair - and also by experiencing what it means to love or to be good or bad.

Truth is a different concept. John Dewey wrote that

"truths are but one class of meanings, namely, those in which a claim to verifiability by their consequences is an intrinsic part of their meaning. Beyond this island of meanings which in their own nature are true or false lies the ocean of meanings to which truth and falsity are irrelevant. We do not inquire whether Greek civilization was true or false, but we are immensely concerned to penetrate its meaning."

A true statement, in Dewey's sense, is one that can be confirmed by experience. Many statements are meaningful, even important and useful, but neither true nor false in this experimental sense. Axiom G is an example.

Our quest is to understand why the selection of a theory is difficult. Part of the challenge derives from the tension between meaning and truth. We select a theory for use in formulating and evaluating a plan or decision. The decision has implications: what would it mean to do this rather than that? Hence it is important that the meaning of the theory fit the context of the decision. Indeed, hedgehogs would say that getting the meaning and implication right is the essence of good decision making.

But what if a relevantly meaningful theory is unprovable or even false? Should we use a theory that is meaningful but not verifiable by experience? Should we use a meaningful theory that is even wrong? This quandary is related to the fox-hedgehog tension because the fox's theory is so full of true statements that its meaning may be obscured, while the hedgehog's bare-bones theory has clear relevance to the decision to be made, but may be either false or too idealized to be tested.

Galileo's axiom of inertia is an idealization that is unsupported by experience because friction can never be avoided. Axiom G assumes conditions that cannot be realized so the axiom can never be tested. Likewise, pure competition is an idealization that is rarely if ever encountered in practice. But these theories capture the essence of many situations. In practical terms, what it means to get the robotic arm from here to there is to apply net forces that overcome Galilean inertia. But actually designing a robot requires considering details of dissipative forces like friction. What it means to be a small business is that the market price of your product is beyond your control. But actually running a business requires following and reacting to prices in the store next door.

It is difficult to choose between a relevantly meaningful but unverifiable theory, and a true theory that is perhaps not quite what we mean.

The knowledge-ignorance tension. Recall that we are discussing theories in the service of decision-making by engineers, social scientists and others. A theory should facilitate the use of our knowledge and understanding. However, in some situations our ignorance is vast and our knowledge will grow. Hence a theory should also account for ignorance and be able to accommodate new knowledge.

Let's take an example from theories of decision. The independence axiom is fundamental in various decision theories, for instance in von Neumann-Morgenstern expected utility theory. It says that one's choices should be independent of irrelevant alternatives. Suppose you are offered the dinner choice between chicken and fish, and you choose chicken. The server returns a few minutes later saying that beef is also available. If you switch your choice from chicken to fish you are violating the independence axiom. You prefer beef least of the three, so the beef option shouldn't alter the fish-chicken preference.

But let's suppose that when the server returned and mentioned beef, your physician advised you to reduce your cholesterol intake (so your preference for beef is lowest) which prompted your wife to say that you should eat fish at least twice a week because of vitamins in the oil. So you switch from chicken to fish. Beef is not chosen, but new information that resulted from introducing the irrelevant alternative has altered the chicken-fish preference.

One could argue for the independence axiom by saying that it applies only when all relevant information (like considerations of cholesterol and fish oil) is taken into account. On the other hand, one can argue against the independence axiom by saying that new relevant information quite often surfaces unexpectedly. The difficulty is to judge the extent to which ignorance and the emergence of new knowledge should be central in a decision theory.
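The dinner story can be sketched in a few lines of code; the menu items and utility numbers below are illustrative inventions, not part of any formal decision theory. The point the sketch makes is that the choice rule itself never violates the axiom - it is the information arriving alongside the third option that re-ranks the original two:

```python
# A toy model of the dinner story (names and utilities are illustrative,
# not drawn from any formal decision theory).

def choose(menu, utility):
    """Pick the highest-utility option on the menu."""
    return max(menu, key=lambda option: utility[option])

# Initial preferences: chicken > fish > beef.
utility = {"chicken": 2, "fish": 1, "beef": 0}
first_choice = choose({"chicken", "fish"}, utility)          # chicken

# Adding beef alone changes nothing -- the axiom holds.
with_beef = choose({"chicken", "fish", "beef"}, utility)     # still chicken

# But the mention of beef triggered new advice (cholesterol, fish oil)
# that re-ranks chicken and fish themselves.
utility = {"chicken": 1, "fish": 2, "beef": 0}
second_choice = choose({"chicken", "fish", "beef"}, utility)  # now fish
```

The switch to fish looks like an axiom violation, but in the sketch the preference ordering itself changed between the two choices.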

Wrapping up. Theories express our knowledge and understanding about the unknown and confusing world. Knowledge begets knowledge. We use knowledge and understanding - that is, theory - in choosing a theory. The process is difficult because, as Otto Neurath said, it's like rebuilding a boat on the open sea.




theory

Jabberwocky. Or: Grand Unified Theory of Uncertainty???


Jabberwocky, Lewis Carroll's whimsical nonsense poem, uses made-up words to create an atmosphere and to tell a story. "Brillig", "frumious", "vorpal" and "uffish" have no lexical meaning, but they could have. The poem demonstrates that the realm of imagination exceeds the bounds of reality just as the set of possible words and meanings exceeds its real lexical counterpart.

Uncertainty thrives in the realm of imagination, incongruity, and contradiction. Uncertainty falls in the realm of science fiction as much as in the realm of science. People have struggled with uncertainty for ages and many theories of uncertainty have appeared over time. How many uncertainty theories do we need? Lots, and forever. Would we say that of physics? No, at least not forever.

Can you think inconsistent, incoherent, or erroneous thoughts? I can. (I do it quite often, usually without noticing.) For those unaccustomed to thinking incongruous thoughts, and who need a bit of help to get started, I can recommend thinking of "two meanings packed into one word like a portmanteau," like 'fuming' and 'furious' to get 'frumious' or 'snake' and 'shark' to get 'snark'.

Portmanteau words are a start. Our task now is portmanteau thoughts. Take for instance the idea of a 'thingk':

When I think a thing I've thought,
I have often felt I ought
To call this thing I think a "Thingk",
Which ought to save a lot of ink.

The participle is written "thingking",
(Which is where we save on inking,)
Because "thingking" says in just one word:
"Thinking of a thought thing." Absurd!

All this shows high-power abstraction.
(That highly touted human contraption.)
Using symbols with subtle feint,
To stand for something which they ain't.

Now that wasn't difficult: two thoughts at once. Now let those thoughts be contradictory. To use a prosaic example: thinking the unthinkable, which I suppose is 'unthingkable'. There! You did it. You are on your way to a rich and full life of thinking incongruities, fallacies and contradictions. We can hold in our minds thoughts of 4-sided triangles, parallel lines that intersect, and endless other seeming impossibilities from super-girls like Pippi Longstocking to life on Mars (some of which may actually be true, or at least possible).

Scientists, logicians, and saints are in the business of dispelling all such incongruities, errors and contradictions. Banishing inconsistency is possible in science because (or if) there is only one coherent world. Belief in one coherent world and one grand unified theory is the modern secular version of the ancient monotheistic intuition of one universal God (in which saints tend to believe). Uncertainty thrives in the realm in which scientists and saints have not yet completed their tasks (perhaps because they are incompletable). For instance, we must entertain a wide range of conflicting conceptions when we do not yet know how (or whether) quantum mechanics can be reconciled with general relativity, or Pippi's strength reconciled with the limitations of physiology. As Henry Adams wrote:

"Images are not arguments, rarely even lead to proof, but the mind craves them, and, of late more than ever, the keenest experimenters find twenty images better than one, especially if contradictory; since the human mind has already learned to deal in contradictions."

The very idea of a rigorously logical theory of uncertainty is startling and implausible because the realm of the uncertain is inherently incoherent and contradictory. Indeed, the first uncertainty theory - probability - emerged many centuries after the invention of the axiomatic method in mathematics. Today we have many theories of uncertainty: probability, imprecise probability, information theory, generalized information theory, fuzzy logic, Dempster-Shafer theory, info-gap theory, and more (the list is a bit uncertain). Why such a long and diverse list? It seems that in constructing a logically consistent theory of the logically inconsistent domain of uncertainty, one cannot capture the whole beast all at once (though I'm uncertain about this).

A theory, in order to be scientific, must exclude something. A scientific theory makes statements such as "This happens; that doesn't happen." Karl Popper explained that a scientific theory must contain statements that are at risk of being wrong, statements that could be falsified. Deborah Mayo demonstrated how science grows by discovering and recovering from error.

The realm of uncertainty contains contradictions (ostensible or real) such as the pair of statements: "Nine year old girls can lift horses" and "Muscle fiber generates tension through the action of actin and myosin cross-bridge cycling". A logically consistent theory of uncertainty can handle improbabilities, as can scientific theories like quantum mechanics. But a logical theory cannot encompass outright contradictions. Science investigates a domain: the natural and physical worlds. Those worlds, by virtue of their existence, are perhaps coherent in a way that can be reflected in a unified logical theory. Theories of uncertainty are directed at a larger domain: the natural and physical worlds and all imaginable (and unimaginable) other worlds. That larger domain is definitely not coherent, and a unified logical theory would seem to be unattainable. Hence many theories of uncertainty are needed.

Scientific theories are good to have, and we do well to encourage the scientists. But it is a mistake to think that the scientific paradigm is suitable to all domains, in particular, to the study of uncertainty. Logic is a powerful tool and the axiomatic method assures the logical consistency of a theory. For instance, Leonard Savage argued that personal probability is a "code of consistency" for choosing one's behavior. Jim March compares the rigorous logic of mathematical theories of decision to strict religious morality. Consistency between values and actions is commendable says March, but he notes that one sometimes needs to deviate from perfect morality. While "[s]tandard notions of intelligent choice are theories of strict morality ... saints are a luxury to be encouraged only in small numbers." Logical consistency is a merit of any single theory, including a theory of uncertainty. However, insisting that the same logical consistency apply over the entire domain of uncertainty is like asking reality and saintliness to make peace.




theory

New Theory & Psychology: Early Critical Theory and Beck’s Cognitive Theory

Two articles in the most recent issue of Theory & Psychology may interest AHP readers. Full details below. “How lost and accomplished revolutions shaped psychology: Early Critical Theory (Frankfurt School), Wilhelm Reich, and Vygotsky,” by Gordana Jovanović. Abstract: On the occasion of recent centenaries of revolutions in Europe (1917, 1918–19), this article examines, within a …




theory

Prospect theory in the Moneyball movie

“I hate losing more than I even wanna win.” – Oakland Athletics General Manager Billy Beane (or some creative Hollywood writer channeling Billy Beane) Around 40 seconds in.




theory

Knowledge sharing for the development of learning resources : theory, method, process and application for schools, communities and the workplace : a UNESCO-PNIEVE resource / by John E. Harrington, Professor Emeritus.

The Knowledge Sharing for the Development of Learning Resources tutorial provides a professional step forward: a learning experience that leads to recognition that your leadership is well founded, while ensuring that participants in the development of learning resources recognize they are contributing to an exceptional achievement.




theory

Forum 2019 : 3A Parliamentary committees : in theory and practice : slides / presented by Iain Evans, Former Member of Parliament.




theory

Animal Nutrition : From Theory to Practice.

Nutrition is the key driver of animal health, welfare and production. In agriculture, nutrition is crucial to meet increasing global demands for animal protein and consumer demands for cheaper meat, milk and eggs and higher standards of animal welfare. For companion animals, good nutrition is essential for quality and length of life. Animal Nutrition examines the science behind the nutrition and feeding of the major domesticated animal species: sheep, beef cattle, dairy cattle, deer, goats, pigs, poultry, camelids, horses, dogs and cats. It includes introductory chapters on digestion and feeding standards, followed by chapters on each animal, containing information on digestive anatomy and physiology, evidence-based nutrition and feeding requirements, and common nutritional and metabolic diseases. Clear diagrams, tables and breakout boxes make this text readily understandable and it will be of value to tertiary students and to practising veterinarians, livestock consultants, producers and nutritionists.




theory

Elements of the theory and practice of medicine : designed for the use of students and junior practitioners / by George Gregory.

London : H. Renshaw, 1839.




theory

Elements of the theory and practice of physic : designed for the use of students / by George Gregory.

London : printed for Burgess and Hill, 1825.




theory

Skill Rating for Multiplayer Games. Introducing Hypernode Graphs and their Spectral Theory

We consider the skill rating problem for multiplayer games, that is, how to infer player skills from game outcomes in multiplayer games. We formulate the problem as a minimization problem $\arg\min_{s} s^T \Delta s$ where $\Delta$ is a positive semidefinite matrix and $s$ a real-valued function, of which some entries are the skill values to be inferred and other entries are constrained by the game outcomes. We leverage graph-based semi-supervised learning (SSL) algorithms for this problem. We apply our algorithms on several data sets of multiplayer games and obtain very promising results compared to Elo Duelling (see Elo, 1978) and TrueSkill (see Herbrich et al., 2006). As we leverage graph-based SSL algorithms, and because games can be seen as relations between sets of players, we then generalize the approach. To this end, we introduce a new finite model, called a hypernode graph, defined to be a set of weighted binary relations between sets of nodes. We define Laplacians of hypernode graphs. Then, we show that the skill rating problem for multiplayer games can be formulated as $\arg\min_{s} s^T \Delta s$ where $\Delta$ is the Laplacian of a hypernode graph constructed from a set of games. From a fundamental perspective, we show that hypernode graph Laplacians are symmetric positive semidefinite matrices with constant functions in their null space. We show that problems on hypernode graphs cannot be solved with graph constructions and graph kernels. We relate hypernode graphs to signed graphs, showing that positive relations between groups can lead to negative relations between individuals.
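A rough illustration of the graph-based SSL idea behind this formulation (using a plain graph rather than the paper's hypernode construction, and toy weights of my own): skill values pinned by game outcomes propagate to the remaining players by minimizing $s^T L s$ over the free entries, the classic harmonic-function solution.

```python
import numpy as np

def infer_skills(W, labeled, values):
    """Minimize s^T L s over the unlabeled entries of s, with the
    labeled entries pinned (the harmonic-function SSL solution)."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W                      # graph Laplacian
    unlabeled = [i for i in range(n) if i not in labeled]
    s = np.zeros(n)
    s[labeled] = values
    # First-order condition on the free entries: L_uu s_u = -L_ul s_l
    s[unlabeled] = np.linalg.solve(
        L[np.ix_(unlabeled, unlabeled)],
        -L[np.ix_(unlabeled, labeled)] @ np.asarray(values))
    return s

# Toy chain of three players; the end players' skills are fixed by
# game outcomes, and the middle player's skill is inferred.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
skills = infer_skills(W, [0, 2], [0.0, 2.0])   # middle player gets 1.0
```

On the chain, the inferred skill is the average of the pinned neighbors, which is exactly what minimizing the Laplacian quadratic form enforces.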




theory

The theory and application of penalized methods or Reproducing Kernel Hilbert Spaces made easy

Nancy Heckman

Source: Statist. Surv., Volume 6, 113--141.

Abstract:
The popular cubic smoothing spline estimate of a regression function arises as the minimizer of the penalized sum of squares $\sum_{j}(Y_{j}-\mu(t_{j}))^{2}+\lambda \int_{a}^{b}[\mu''(t)]^{2}\,dt$, where the data are $(t_{j},Y_{j})$, $j=1,\ldots,n$. The minimization is taken over an infinite-dimensional function space, the space of all functions with square integrable second derivatives. But the calculations can be carried out in a finite-dimensional space. The reduction from minimizing over an infinite-dimensional space to minimizing over a finite-dimensional space occurs for more general objective functions: the data may be related to the function $\mu$ in another way, the sum of squares may be replaced by a more suitable expression, or the penalty, $\int_{a}^{b}[\mu''(t)]^{2}\,dt$, might take a different form. This paper reviews the Reproducing Kernel Hilbert Space structure that provides a finite-dimensional solution for a general minimization problem. Particular attention is paid to the construction and study of the Reproducing Kernel Hilbert Space corresponding to a penalty based on a linear differential operator. In this case, one can often calculate the minimizer explicitly, using Green’s functions.
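A discretized stand-in (my simplification, not the paper's RKHS machinery) makes the finite-dimensional reduction concrete: restricted to the data grid, the roughness penalty becomes a second-difference matrix and the minimizer solves a single linear system.

```python
import numpy as np

def penalized_fit(y, lam):
    """Minimize ||y - mu||^2 + lam * ||D2 mu||^2 over mu on the data grid,
    where D2 is the second-difference matrix standing in for mu''."""
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)    # rows of the form (1, -2, 1)
    # Normal equations: (I + lam * D2^T D2) mu = y
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

y = np.array([0.0, 1.1, 1.9, 3.2, 4.0])
interpolant = penalized_fit(y, 0.0)   # lam = 0: reproduce the data exactly
smoothest = penalized_fit(y, 1e6)     # huge lam: essentially a straight line
```

The two extremes bracket the smoothing-spline trade-off: no penalty interpolates the data, while a very large penalty drives the discrete second derivative to zero.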




theory

The complexity of bird behaviour : a facet theory approach

Hackett, Paul, 1960- author
9783030121921 (electronic bk.)




theory

Models of tree and stand dynamics : theory, formulation and application

Mäkelä, Annikki, author
9783030357610




theory

Handbook of geotechnical testing : basic theory, procedures and comparison of standards

Li, Yanrong (Writer on geology), author.
0429323743 electronic book




theory

Semi-supervised inference: General theory and estimation of means

Anru Zhang, Lawrence D. Brown, T. Tony Cai.

Source: The Annals of Statistics, Volume 47, Number 5, 2538--2566.

Abstract:
We propose a general semi-supervised inference framework focused on the estimation of the population mean. As usual in semi-supervised settings, there exists an unlabeled sample of covariate vectors and a labeled sample consisting of covariate vectors along with real-valued responses (“labels”). Otherwise, the formulation is “assumption-lean” in that no major conditions are imposed on the statistical or functional form of the data. We consider both the ideal semi-supervised setting where infinitely many unlabeled samples are available, as well as the ordinary semi-supervised setting in which only a finite number of unlabeled samples is available. Estimators are proposed along with corresponding confidence intervals for the population mean. Theoretical analysis of both the asymptotic distribution and the $\ell_{2}$-risk of the proposed procedures is given. Surprisingly, the proposed estimators, based on a simple form of the least squares method, outperform the ordinary sample mean. The simple, transparent form of the estimator lends confidence to the perception that its asymptotic improvement over the ordinary sample mean also nearly holds even for moderate-size samples. The method is further extended to a nonparametric setting, in which the oracle rate can be achieved asymptotically. The proposed estimators are further illustrated by simulation studies and a real data example involving estimation of the homeless population.
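The regression-adjustment idea behind a least-squares semi-supervised mean estimator can be sketched on simulated data (an illustrative version of the adjustment, not necessarily the paper's exact estimator):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated data (illustrative): response depends linearly on covariates plus noise
n_lab, n_unl = 200, 5000
x_lab = rng.standard_normal((n_lab, 2))
y_lab = 1.0 + x_lab @ np.array([2.0, -1.0]) + rng.standard_normal(n_lab)
x_unl = rng.standard_normal((n_unl, 2))  # unlabeled: covariates only

# Ordinary sample mean uses only the labeled responses
mean_lab = y_lab.mean()

# Least-squares adjustment: shift the labeled mean toward the covariate mean
# of the full (labeled + unlabeled) sample, using fitted OLS slopes
X = np.column_stack([np.ones(n_lab), x_lab])
beta = np.linalg.lstsq(X, y_lab, rcond=None)[0]
x_all_mean = np.vstack([x_lab, x_unl]).mean(axis=0)
mean_ssl = mean_lab + beta[1:] @ (x_all_mean - x_lab.mean(axis=0))
print(mean_lab, mean_ssl)
```

When the covariates explain part of the response variance, the adjusted estimator has smaller asymptotic variance than the labeled-sample mean, even though no model is assumed to be correct.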




theory

A general theory for preferential sampling in environmental networks

Joe Watson, James V. Zidek, Gavin Shaddick.

Source: The Annals of Applied Statistics, Volume 13, Number 4, 2662--2700.

Abstract:
This paper presents a general model framework for detecting the preferential sampling of environmental monitors recording an environmental process across space and/or time. This is achieved by considering the joint distribution of an environmental process with a site-selection process that considers where and when sites are placed to measure the process. The environmental process may be spatial, temporal or spatio-temporal in nature. By sharing random effects between the two processes, the joint model is able to establish whether site placement was stochastically dependent on the environmental process under study. Furthermore, if stochastic dependence is identified between the two processes, then inferences about the probability distribution of the spatio-temporal process will change, as will predictions made of the process across space and time. The embedding into a spatio-temporal framework also allows for the modelling of the dynamic site-selection process itself. Real-world factors affecting both the size and location of the network can be easily modelled and quantified. Depending upon the choice of the population of locations considered for selection across space and time under the site-selection process, different insights about the precise nature of preferential sampling can be obtained. The general framework developed in the paper is designed to be easily and quickly fitted using the R-INLA package. We apply this framework to a case study involving particulate air pollution over the UK, where a major reduction in the size of a monitoring network occurred over time. It is demonstrated that a significant response-biased reduction in the air quality monitoring network occurred, namely the relocation of monitoring sites to locations with the highest pollution levels, and the routine removal of sites at locations with the lowest. We also show that the network was consistently unrepresentative of the levels of particulate matter seen across much of GB throughout its operating life. Finally, we show that this may have led to a severe overreporting of the population-average exposure levels experienced across GB, which could have great impacts on estimates of the health effects of black smoke levels.




theory

Frequency domain theory for functional time series: Variance decomposition and an invariance principle

Piotr Kokoszka, Neda Mohammadi Jouzdani.

Source: Bernoulli, Volume 26, Number 3, 2383--2399.

Abstract:
This paper is concerned with frequency domain theory for functional time series, which are temporally dependent sequences of functions in a Hilbert space. We consider a variance decomposition which is more suitable for such a data structure than the variance decomposition based on the Karhunen–Loève expansion. The decomposition we study uses eigenvalues of spectral density operators, which are functional analogs of the spectral density of a stationary scalar time series. We propose estimators of the variance components and derive convergence rates for their mean square error as well as their asymptotic normality. The latter is derived from a frequency domain invariance principle for the estimators of the spectral density operators. This principle, which is established for a broad class of linear time series models, is a main contribution of the paper.
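A discretized sketch of the frequency-domain objects involved (toy data on a finite grid; the lag-window estimator with Bartlett weights is an illustrative choice, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(2)
# Discretized functional time series: n curves observed on a p-point grid
n, p = 500, 20
X = np.zeros((n, p))
for s in range(1, n):
    X[s] = 0.5 * X[s - 1] + rng.standard_normal(p)  # toy autoregressive dependence
X -= X.mean(axis=0)

def autocov(h):
    # lag-h autocovariance matrix C(h) of the discretized curves
    return X[h:].T @ X[:n - h] / n

# Lag-window estimate of the spectral density operator F(theta), Bartlett weights
q = 10  # bandwidth, an illustrative choice
thetas = np.linspace(-np.pi, np.pi, 201)
trace_spec = []
for theta in thetas:
    F = autocov(0).astype(complex)
    for h in range(1, q + 1):
        w = 1 - h / (q + 1)
        C = autocov(h)
        F += w * (C * np.exp(-1j * h * theta) + C.T * np.exp(1j * h * theta))
    F /= 2 * np.pi
    # the eigenvalues of F(theta) are the frequency-specific variance components
    trace_spec.append(float(np.linalg.eigvalsh(F).sum()))

# Sanity check: integrating the trace over frequencies recovers the total variance
dtheta = thetas[1] - thetas[0]
integral = dtheta * (np.sum(trace_spec) - 0.5 * (trace_spec[0] + trace_spec[-1]))
total = float(np.trace(autocov(0)))
```

At each frequency the spectral density operator is Hermitian and positive semidefinite, so its eigenvalues decompose the variance contributed at that frequency; integrating over frequencies returns the total variance.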




theory

A unified principled framework for resampling based on pseudo-populations: Asymptotic theory

Pier Luigi Conti, Daniela Marella, Fulvia Mecatti, Federico Andreis.

Source: Bernoulli, Volume 26, Number 2, 1044--1069.

Abstract:
In this paper, a class of resampling techniques for finite populations under $\pi$ps sampling designs is introduced. The basic idea on which they rest is a two-step procedure consisting of: (i) constructing a “pseudo-population” on the basis of sample data; (ii) drawing a sample from the pseudo-population according to an appropriate resampling design. From a logical point of view, this approach is essentially based on Efron's plug-in principle, applied at the “sampling design level”. Theoretical justifications based on large sample theory are provided. New approaches to constructing pseudo-populations based on various forms of calibration are proposed. Finally, a simulation study is performed.
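The two-step procedure can be sketched for Poisson $\pi$ps sampling (all population values, design parameters and the replication rule below are illustrative, not the calibrated constructions of the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy finite population; inclusion probabilities roughly proportional to y (pi-ps)
N = 1000
y_pop = rng.gamma(2.0, 5.0, N)
pi = np.clip(0.2 * y_pop / y_pop.mean(), 0.02, 0.9)

# Draw one Poisson pi-ps sample; Horvitz-Thompson estimate of the population total
sampled = rng.random(N) < pi
y_s, pi_s = y_pop[sampled], pi[sampled]
ht_total = np.sum(y_s / pi_s)

# Step (i): build a pseudo-population by replicating unit i about 1/pi_i times
reps = np.floor(1.0 / pi_s).astype(int)
y_pseudo = np.repeat(y_s, reps)
pi_pseudo = np.repeat(pi_s, reps)

# Step (ii): resample from the pseudo-population under the same design; the spread
# of the replicated HT totals estimates the sampling variability of the estimator
boot = []
for _ in range(500):
    take = rng.random(y_pseudo.size) < pi_pseudo
    boot.append(np.sum(y_pseudo[take] / pi_pseudo[take]))
boot_se = float(np.std(boot))
```

The pseudo-population plays the role of the plug-in estimate of the unknown finite population, and the resampling design mimics the original $\pi$ps design applied to it.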




theory

Statistical Theory Powering Data Science

Junhui Cai, Avishai Mandelbaum, Chaitra H. Nagaraja, Haipeng Shen, Linda Zhao.

Source: Statistical Science, Volume 34, Number 4, 669--691.

Abstract:
Statisticians are finding their place in the emerging field of data science. However, many issues considered “new” in data science have long histories in statistics. Examples of using statistical thinking are illustrated, which range from exploratory data analysis to measuring uncertainty to accommodating nonrandom samples. These examples are then applied to service networks, baseball predictions and official statistics.




theory

Larry Brown’s Contributions to Parametric Inference, Decision Theory and Foundations: A Survey

James O. Berger, Anirban DasGupta.

Source: Statistical Science, Volume 34, Number 4, 621--634.

Abstract:
This article gives a panoramic survey of the general area of parametric statistical inference, decision theory and foundations of statistics for the period 1965–2010 through the lens of Larry Brown’s contributions to varied aspects of this massive area. The article goes over sufficiency, shrinkage estimation, admissibility, minimaxity, complete class theorems, estimated confidence, conditional confidence procedures, Edgeworth and higher order asymptotic expansions, variational Bayes, Stein’s SURE, differential inequalities, geometrization of convergence rates, asymptotic equivalence, aspects of empirical process theory, inference after model selection, unified frequentist and Bayesian testing, and Wald’s sequential theory. A reasonably comprehensive bibliography is provided.




theory

Models as Approximations II: A Model-Free Theory of Parametric Regression

Andreas Buja, Lawrence Brown, Arun Kumar Kuchibhotla, Richard Berk, Edward George, Linda Zhao.

Source: Statistical Science, Volume 34, Number 4, 545--565.

Abstract:
We develop a model-free theory of general types of parametric regression for i.i.d. observations. The theory replaces the parameters of parametric models with statistical functionals, to be called “regression functionals,” defined on large nonparametric classes of joint $x\textrm{-}y$ distributions, without assuming a correct model. Parametric models are reduced to heuristics to suggest plausible objective functions. An example of a regression functional is the vector of slopes of linear equations fitted by OLS to largely arbitrary $x\textrm{-}y$ distributions, without assuming a linear model (see Part I). More generally, regression functionals can be defined by minimizing objective functions, solving estimating equations, or with ad hoc constructions. In this framework, it is possible to achieve the following: (1) define a notion of “well-specification” for regression functionals that replaces the notion of correct specification of models, (2) propose a well-specification diagnostic for regression functionals based on reweighting distributions and data, (3) decompose sampling variability of regression functionals into two sources, one due to the conditional response distribution and another due to the regressor distribution interacting with misspecification, both of order $N^{-1/2}$, (4) exhibit plug-in/sandwich estimators of standard error as limit cases of $x\textrm{-}y$ bootstrap estimators, and (5) provide theoretical heuristics to indicate that $x\textrm{-}y$ bootstrap standard errors may generally be preferred over sandwich estimators.
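The connection in point (4) can be sketched by comparing the two standard-error estimates for the OLS slope functional on deliberately misspecified simulated data (an illustrative setup):

```python
import numpy as np

rng = np.random.default_rng(4)
# Misspecified setting: fit a linear model to nonlinear, heteroskedastic data
n = 400
x = rng.uniform(-1, 1, n)
y = x + 0.5 * x ** 2 + (0.5 + np.abs(x)) * rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])

beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Sandwich (robust) standard errors: inv(X'X) X' diag(r^2) X inv(X'X)
bread = np.linalg.inv(X.T @ X)
meat = X.T @ (X * resid[:, None] ** 2)
se_sandwich = np.sqrt(np.diag(bread @ meat @ bread))

# x-y (pairs) bootstrap: resample whole (x, y) pairs, refit, take the SD of slopes
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    boot.append(np.linalg.lstsq(X[idx], y[idx], rcond=None)[0])
se_boot = np.array(boot).std(axis=0)
```

Both target the same asymptotic variance of the slope functional, so the two estimates should agree closely at this sample size; the sandwich estimator can be viewed as a limit case of the pairs bootstrap.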




theory

Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex

EL Bienenstock
Jan 1, 1982; 2:32-48
Articles




theory

The Wham-O Pudding Essay Contest Theory of Educational Innovation

I regularly receive invitations to participate in essay contests devoted to rethinking American education. These competitions, I fear, are the worst way to spur real change.




theory

RHSU Classic: The Wham-O Pudding Essay Contest Theory of Educational Innovation

If I've learned anything after hanging out at a think tank for close to two decades, it's that dreaming up education innovations is easy. Number 12 in our countdown is my take on the goofy contests that talkers seem to be so fond of.




theory

Fin24.com | Trump offers 'rogue killer' theory, sends Pompeo to Saudi Arabia

US President Donald Trump has suggested that 'rogue killers' may be behind the disappearance of journalist Jamal Khashoggi in Turkey.




theory

Transitioning Patients With Complex Health Care Needs to Adult Practices: Theory Versus Reality




theory

Novak Djokovic blasted for bizarre 'mind power' theory

After the backlash over his anti-vaxxer stance, Novak Djokovic has faced fresh criticism for suggesting that people can alter the make-up of food and water by using mind power and emotions.
Read Full Article at RT.com