Monday 29 June 2009
Digitising is NOT measuring
Digitising is the capture of digital data from a sensor. It is a new and low-cost way of generating petabytes of data. However, simply taking an analogue input and transforming it into a digital image or signal does not mean it is a measurement.
There are millions and millions of digital images captured every day (perhaps billions). The vast majority of these are NOT measurements; they are 'snapshots'. In order to use digital imagery as a measurement modality for science one needs to take care of magnification issues (not always equal in X-Y), linearity of grey-scale and/or colour response, bit depth, the illumination set-up used to highlight features of interest, the effect of image compression, the frequency response of the lenses used, and so on. (A quick illustrative check of grey-scale linearity is sketched below.)
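As a rough illustration of just one of these checks, here is a minimal sketch (with invented calibration numbers, not a validated protocol) of testing grey-scale linearity by regressing measured grey values against the known relative intensities of a calibrated step target:

```python
import numpy as np

# Hypothetical calibration data: known relative intensities of a step target
# and the grey values a camera reported for each step (invented for illustration).
known_intensity = np.array([0.05, 0.10, 0.20, 0.40, 0.60, 0.80, 1.00])
measured_grey   = np.array([14.0, 26.0, 52.0, 101.0, 150.0, 196.0, 243.0])

# Fit a straight line and look at the largest departure from it.
slope, intercept = np.polyfit(known_intensity, measured_grey, 1)
fitted = slope * known_intensity + intercept
max_departure = np.abs(measured_grey - fitted).max()

print(f"slope = {slope:.1f}, intercept = {intercept:.1f}")
print(f"largest departure from linearity = {max_departure:.1f} grey levels")
```

If the departures are large relative to the grey-level range, the images are snapshots, not measurements, until the response is corrected.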
Matt
Saturday 27 June 2009
Science Studies the Sardine (Ed Ricketts 1947)
Unfortunately for the sardine, and the extended marine food chain that it forms part of, Ricketts was killed in May 1948 and his deep ecological insight and campaigning capability never helped save the sardine. Much later, in 1998, the well-known fisheries ecologist Dr Daniel Pauly published a seminal paper in Science called 'Fishing Down Marine Food Webs'. This is both a citation classic (with over 1,000 citations) and a highly influential paper. Pauly was not aware of Ricketts's work when his paper was published.
Science Studies the Sardine
Mysterious Disappearance Focuses Attention on Woeful Lack of Information Regarding Billion Dollar Fish
By EDWARD F. RICKETTS, Pacific Biological Laboratories
The Herald has undoubtedly reported waterfront opinion correctly in stating that the shortage of sardines is being attributed to a change in the currents. I doubt very much if we can rely on such a simple explanation. I am reminded instead of an old nursery rhyme. Like the "farmer-in-the-Dell" there's a long chain of events involved. And you have to be familiar with all of them in order to know any one very clearly.
This is complicated still further by the fact that scientists haven't yet unravelled it all completely. But already quite a lot of information is available, and from most of the parts we can piece together some sort of picture puzzle. The result may be a labyrinth, but there's no way to avoid it. We just have to be patient and try to follow it through. Because I've done this myself after a fashion, perhaps I can be better than no guide at all.
By means of a very efficient straining apparatus in the gills, sardines are able to feed directly on the most primitive foodstuffs in the ocean. This so-called "plankton," chiefly diatoms (free floating microscopic plants) is the product of oceanic pastures, and like grain and grass and root crops, bears a direct relation to sunlight and fertilizer. Except that in the oceans of course you don't have to depend on the rains for moisture.
Some of these ocean pastures produce more per acre than the others. This is due to variations in the amount of fertilizer brought to the surface by a process called "upwelling."
DUST TO DUST
Only the upper layers of water receive enough light to permit the development of plants. In most places, when the sunlight starts to increase in spring, these plants grow so rapidly that they deplete the fertilizer and die out as a result. In the familiar pattern of dust to dust and ashes to ashes, their bodies disintegrate and release the chemical elements. These, in the form of a rain of particles, constantly enrich the dark, deeper layers.
On the California coast - one of the few places in the world, incidentally, where this happens - winds from the land blow the surface waters of the shore far out to sea. To replace these waters, vertical currents are formed which bring up cool, fertilizer laden waters from the depths to enrich the surface layers, maintaining a high standard of fertility during the critical summer months. Since the seeds of the minute plants are everywhere ready and waiting to take advantage of favorable conditions, these waters are always blooming. It's a biological rule that where there's food, there's likely to be animals to make use of it. Foremost among such animals is the lowly sardine.
FISH "FACTORY"
In the long chain, this is the first link. Reproductive peculiarities of the sardine itself provide the second. Above their immediate needs, adult sardines store up food in the form of fat. During the breeding season, ALL this fat is converted into eggs and sperm. The adult sardine is nothing but a factory for sexual products.
The production of plankton is known to vary enormously from year to year. If feeding conditions are good, the sardines will all be fat, and each will produce great quantities of eggs. If the usual small percentage of eggs survive the normal hazards of their enemies, a very large year class will be hatched.
ABUNDANT ENEMIES
If then feeding conditions are still good at the time and place of hatching (spring in southern California and Northern Mexico), huge quantities of young sardines will be strewn along the Mexican coast by south-flowing currents. Here their enemies are also abundant. Including man - chiefly in the form of bait fishermen from the tuna clippers. Those surviving this hazard move back into California waters, extending each year further and further up the coast, the oldest finally migrating each summer clear up to Vancouver Island. All return during the winter to the breeding ground. This process builds up until a balance is established. Or until man, concluding that the supply is in fact endless, builds too many reduction plants on that assumption.
On the other hand, the small year classes resulting from decreased plankton production will in themselves deplete the total sardine population. If then the enemies - again chiefly man - try desperately to take the usual amounts, by means for instance of more and larger boats, greater cruising radius, increased fishing skills, etc., sardine resources can be reduced to the danger point.
FORECAST FEASIBLE
All this being true, we might be able to forecast the year classes if we could get to know something about the conditions to which the adults are subjected at a given time. Their eggs are produced only from stored fat. The fat comes only from plankton. And there are ways of estimating the amount of plankton produced.
An investigator at the Scripps Institution of Oceanography long ago realized the fundamental importance of plankton. He has been counting the number of cells in daily water samples for more than 20 years. This scientist, now retiring, perhaps discouraged because the university recently hasn't seen fit even to publish his not-very-spectacular figures, ought to be better known and more honored. He deals with information of prime importance to the fisheries. But in order to get this information in sufficient detail, you have to know the man, and get him to write you about it personally.
FIGURES NEEDED
Figures from only a single point have very limited value. We should have data of this sort for a number of places up and down the coast so as to equalize the variations; and for many years. But we're lucky to have them even for La Jolla. The graph shows 1926, 1931, 1934 to have been poor years, but during those times the total sardine landings were still apparently well within the margin of safety. Years 1941-42 were also poor. But during this time the fishery was bringing in really large quantities. The evidence is that very lean adults resulted from these lean years, producing few eggs. And we would have done well to take only a few of them and their progeny. A glance at the chart labelled "California Sardine Landings" will show what actually happened instead.
There's still another index of sardine production. This time the cat of the wife of the farmer-in-the-dell catches herself a fine succulent mouse.
TEMPERATURE KNOWN
We know that the food of the sardine depends directly on fertility. And that fertility depends on upwelling. Obviously, the water recently brought up from depths is colder than at the surface. When upwelling is active, surface temperatures will be low. Now we can very easily calculate the mean annual sea water temperatures from the readings taken daily at Hopkins Marine Station and elsewhere. The chart which has been prepared to show these figures bears out again the fact that 1941 and '42 must have been unfortunate years for the sardines. Following those years we should have taken fewer so as not to deplete the breeding stock.
FISHING Vs. FISH
Instead, each year the number of canneries increased. Each year we expended more fishing energy pursuing fewer fish. Until in the 1944-46 seasons we reached the peak of effort, but with fewer and fewer results. Each year we've been digging a little further into the breeding stock.
A study of the tonnage chart will show all this, and more. The total figures, including Canada, tell us pretty plainly that the initial damage was done in 1936-37, when the offshore reduction plants were being operated beyond regulation. And the subsequent needs of the war years made it difficult then also for us to heed the warnings of the scientists of the Fish and Game Division.
THE ANSWER
The answer to the question "Where are the sardines?" becomes quite obvious in this light. They're in the cans! The parents of the sardines we need so badly now were being ground up then into fish meal, were extracted for oil, were being canned; too many of them, far too many.
But the same line of reasoning shows that even the present small breeding stock, given a decent break, will stage a slow comeback. This year's figures from San Pedro, however, indicate no such decent break.
During this time of low population pressure, the migrating few started late, went not far, and came back earlier than usual. Actually we had our winter run during August and September when the fishermen were striking for higher rates. By the time we had put our local house back in order, the fish had gone on south. Many had failed to migrate in the first place, and were milling around their birthplace in the crowded Southern California waters. And the San Pedro fleet, augmented by out-of-work boats from the northern ports, has been making further serious inroads into the already depleted breeding stock.
WE WILL FORGET
My own personal belief is nevertheless optimistic. Next fall I expect to see the fish arrive, early again, and in somewhat greater quantity. But a really good year will be an evil thing for the industry. And still worse for Monterey. Because we'll forget our fears of the moment, queer misguided mortals that we are! We'll disregard conservation proposals as we have in the past; we'll sabotage those already enacted. And the next time this happens we'll be really sunk. Monterey will have lost its chief industry. And this time for good!
If on the other hand, next year is moderately bad - not as bad as this year, Heaven forbid, but let's hope it's decently bad! - maybe we'll go along with such conservation measures as will have been suggested by a Fish and Game Commission which in the past has shown itself far, far too deferent to the wishes of the operators - or maybe to their lobby!
SWEDEN'S EXAMPLE
We have before us our own depleting forests. We have Sweden's example; now producing more forest products every year than it did a hundred years back when the promise of depletion forced the adoption of conservation measures. We have this year's report on our own halibut landings, like the old days - the result solely of conservation instituted by a United States-Canadian commission which took over in the face of obvious depletion.
If we be hoggish, if we fail to cooperate in working this thing out, Monterey COULD go the way of Nootka, Fort Ross, Notley's Landing, or communities in the Mother Lode, ghost towns that faded when the sea otter or lumber or the gold mining failed. If we'll harvest each year only that year's fair proportion (and it'll take probably an international commission to implement such a plan!) there's no reason why we shouldn't go on indefinitely profiting by this effortless production of sea and sun and fertilizer. The farmer in the dell can go on with his harvesting.
This text is from material reprinted for educational purposes by Roy van de Hoek, in its entirety, from the newspaper:
MONTEREY PENINSULA HERALD, 12th Annual Sardine Edition, pp. 1, 3, March 7, 1947.
Thursday 18 June 2009
"Fact free science"
"I discuss below a particular example of a dynamic system 'Turing's morphogenetic waves' which gives rise to just the kind of structure that, as a biologist, I want to see. But first I must explain why I have a general feeling of unease when contemplating complex systems dynamics. Its devotees are practicing fact-free science. A fact for them is, at best, the output of a computer simulation: it is rarely a fact about the world.
John Maynard Smith, "Life at the Edge of Chaos?," The New York Review of Books, March 2, 1995
Tuesday 16 June 2009
Microscopy Site
The humble light microscope is an icon of research science. It has a practical 'resolution' that on a good day almost achieves the theoretical resolution calculated by Abbe in the 1880s. HERE is a great and very large website dedicated to microscopy. Enjoy.
Ancient Geometry, Stereology & Modern Medics
This is a paper, HERE, that Vyvyan Howard and I had fun writing a few years back and published in Chance magazine. It's a popular-science article on “Ancient Geometry, Stereology & Modern Medics” that was originally written in 2002 and which I have recently tidied up into a PDF. It links ancient ideas in geometry with random sampling and state-of-the-art scientific imaging.
One of the key characters is the famous mathematician Zu Gengzhi, whose father, Zu Chongzhi (in the picture), was also a famous mathematician.
Zu Gengzhi was the first to describe what we now refer to as Cavalieri's theorem (a small numerical check of the principle follows the translation below):
"The volumes of two solids of the same height are equal if the areas of the plane sections at equal heights are the same."
The translation by Wagner (HERE) is given as
If blocks are piled up to form volumes,
And corresponding areas are equal,
Then the volumes cannot be unequal.
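To make the principle concrete, here is a toy numerical sketch of my own (not from the Chance article): a hemisphere of radius r, and a cylinder of radius r and height r with an inverted cone removed, have the same cross-sectional area pi*(r^2 - z^2) at every height z, so their volumes must agree.

```python
import numpy as np

r = 1.0
n = 200_000
dz = r / n
z = (np.arange(n) + 0.5) * dz                     # midpoints of thin slabs

area_hemisphere     = np.pi * (r**2 - z**2)        # disc of radius sqrt(r^2 - z^2)
area_cyl_minus_cone = np.pi * r**2 - np.pi * z**2  # annulus between cylinder and cone

# Sum area x thickness over the slabs: equal sections at every height
# give equal volumes, here (2/3) * pi * r^3.
vol_hemisphere = (area_hemisphere * dz).sum()
vol_complement = (area_cyl_minus_cone * dz).sum()

print(vol_hemisphere, vol_complement, 2.0 / 3.0 * np.pi * r**3)
```

This is essentially the argument Zu Gengzhi used to obtain the volume of the sphere, restated as a sum over thin slabs.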
PhenoQuant™
Inspired by a 2004 paper by Vyvyan Howard et al. (HERE), I have been trying to develop a business model whereby high-quality quantification of the phenotypic impact of specific gene knock-outs in well-defined strains of transgenic mice is provided as a service.
The quantification is achieved with protocolised stereological techniques, and clients get the specific data they need on time and in full. At the same time the parent business builds a proprietary database that can be data-mined to provide new insights and discoveries.
I worked with a Master's student, James Stone, at the Manchester Science Enterprise Centre in 2007 to develop the concept and try to understand the market positioning for such a business, tentatively called PhenoQuant™. James and I did some outline work on a business plan and he explored routes for funding the opportunity. The plan is not mature and it is currently dormant, but this is a real opportunity to bring quantification to one key aspect of modern gene-based biomedical research.
Monday 15 June 2009
The Art of the Soluble
Galileo the quantifier
Galileo during his period in Padua before moving to Florence in 1610.
Cited in: The Science of Measurement: A Historical Survey, Herbert Arthur Klein, Dover, 1988, p. 509.
The Data Deluge - 1987 Style
This article describes how mad baseball fans can get "...the ultimate service for the most avid baseball addict: daily, detailed reports on 16 minor leagues encompassing 154 teams and more than 3,000 players". This data deluge is made possible by facsimile machines installed in more than 150 minor league ball parks.
'Official scorers or a club official are supposed to file the information within an hour after the game and then the phone-line cost to the team is minimal since the fax transmissions take about 40 seconds each.'
Triple-Point Careers™ in Science and Technology
Using the phase diagram of matter as an analogy, we can define the space of opportunities for science knowledge workers who want to remain in science as the “phase diagram” of science careers. The solid state = Corporate R&D, the liquid state = Academic Research and the gaseous state = SME/start-ups.
Many people occupy positions fully within one of the three discrete phases; these people manage their careers by following the norms of the profession. They understand the rules of the game for progression and can see how to piece together a career development plan for themselves. Their managers understand how to help them develop and have many points of comparison by which they can judge performance and progression.
Many science careers also include a single, or small number, of transitions from one phase to another. Typically, for example, an academic may act as a consultant to a corporate R&D organization and then leave academia to take up a full time senior R&D manager position in the corporate. It is also quite common for a senior R&D manager, perhaps later in their career, to move from corporate R&D into a senior management position in academia. There are a number of national and international assistance schemes around to help scientists make these phase transitions at some stage in their career.
However, in addition to more mainstream careers and “phase transitions”, there are a number of interesting career opportunities at the boundaries between the different states.
Solid-Liquid boundary = Corporate R&D staff who are successful in the corporate and manage to develop a successful part-time or visiting academic position (e.g. professorship).
Liquid-Gas boundary = Academic scientists who have spun out a company and act as CTO, or more rarely CEO.
Solid-Gas boundary = Corporate R&D staff who have a spin-out company (e.g. from UV type activity).
Triple Point = people who successfully manage a corporate R&D career with parallel activity in an academic position and spin-out.
Staff who are occupying one of these boundary positions require specialized career advice and management. Their career development is non-trivial and not catered for by the mainstream career development methods adopted in the Solid, Liquid or Gas phases.
The triple-point career is a relatively rare combination today but is likely to be more common in the future. This is what is known as a Portfolio Career in other professions (e.g. banking/company directorship = see Financial Times 24/1/2006).
It is likely that in the knowledge economy people who are at, or who gravitate towards, a boundary or triple-point career will be uniquely skilled. It may be that at any one time a triple-point career person is flexing two of the three positions they occupy, i.e. flipping at boundaries. However, over a period of time (2-3 years, say) they will flex all three components of their career portfolio.
Issues
Trying to successfully develop and sustain a triple point career is difficult. There are few guidelines for how to get into one of these positions or manage the psychological pressures of being successful there.
Tools
- Develop a core intellectual skill set that is deployed into maximally different applications within each “phase”. This is cross-disciplinary working, and it lowers psychological stress.
- Find a number of mentors (at least one) who have a triple-point career.
- Develop tools for personal “Imagination Management”.
- Develop tools for IP and idea management.
- Hone to a high level the art of handling contradictory and possibly “conflicting” confidential information. Never waver from this.
- Build a network of people who are also triple point career makers but in particular people at the other interfacial positions (Solid-Liquid etc).
Bayesian Networks
Sunday 14 June 2009
Above all else do No harm
Saturday 13 June 2009
Basic Ideas of Scientific Sampling
It has a good description of the 'Central Paradox of Sampling' = "Once a sample has been obtained it is impossible to tell by inspection of the sample whether it has been obtained by a simple random sampling mechanism or not". (A small simulated illustration of this is sketched below.)
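Here is a small simulated sketch of my own (invented numbers, purely illustrative) of the paradox: one sample is a genuine simple random sample, the other is deliberately selected, and nothing in the samples themselves tells you which is which.

```python
import numpy as np

rng = np.random.default_rng(1)
population = rng.normal(loc=100.0, scale=15.0, size=10_000)

# Sample A: a genuine simple random sample of 50 units.
sample_random = rng.choice(population, size=50, replace=False)

# Sample B: 50 units picked deliberately (every 200th unit of the sorted
# population) -- a purposive, non-random selection.
sample_selected = np.sort(population)[::200]

for name, s in [("random  ", sample_random), ("selected", sample_selected)]:
    print(name, "n =", s.size, " mean =", round(s.mean(), 1),
          " sd =", round(s.std(ddof=1), 1))

# Both summaries look equally plausible; only knowledge of the sampling
# mechanism, not inspection of the sample, distinguishes them.
```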
IOT: Baconian Science
Bacon was a Jacobean lawyer and politician turned advocate for science - famously writing a book, Novum Organum, which described how to use data collection, observation and experiment to do science.
Good material on Wikipedia about Bacon and in Oxford.
Quantification: A History of the Meaning of Measurement in the Natural and Social Sciences
There are some great articles, including one by the famous philosopher and historian of science Thomas Kuhn (he of 'Paradigm Shift' fame). There is also a fine article by Wilks, which describes the role that individual measurements play in large systems of measurement and as inputs to quantitative models.
As Wilks puts it: "The subject of quantification in science is an enormous one with many aspects. The foundation of quantification is measurement, and any discussion of the nature of quantification must necessarily begin with a discussion of the nature of measurement."
Enjoy
Friday 12 June 2009
When Summary Measures Aren't Enough
Another great Rafe Donahue post - HERE. This is a simple but well-explained page that shows how powerful it is to show all of the data (what he calls the 'data atoms').
The figure shows how a plot can be constructed that shows all of the 'data atoms', together with summary measures of the distributions of the data. (A rough simulated sketch of this kind of plot follows the quote below.)
As Donahue puts it;
"Analysis, from the Greek, implies breaking things down into component parts so as to understand the whole. Its opposite is synthesis, bringing together the parts to construct the whole. If we are going to do data analysis, then we must make attempts to break the data down to their component parts, their atoms. Computing summary measures like means and medians and percentiles and standard deviations and even F and chi-square and t statistics and P values, is not analysis; it is synthesis! And, worse than playing games with word meaning, data synthesis often obscures understanding the data. "
Matt
[Figure Copyright Rafe Donahue]
Fundamental Statistical Concepts in Presenting Data: Principles for Constructing Better Graphics
Norvig's More Ways to be Wrong than Right
More ways to be wrong than right.
Here is a nice meta-list of all the mistakes one can make in experimental design and interpretation, at Peter Norvig's website (http://norvig.com/experiment-design.html).
Comic Powers of Ten
One of my favourite web-based comic artists, xkcd, has recently completed two sketches that, when combined, cover (almost) the 40+ orders of magnitude referred to by Edward Tufte, and which are comic versions of the Eames Powers of Ten video and book.
From 46 billion light years up to folks (Height = http://xkcd.com/482/)
From people at a PC to a Proton (Depth = http://xkcd.com/485/)
Matt
Quantifying in Scientific Images
I regularly teach histologists and biologists how to extract quantitative 3D information from 2D images, and the issue of image dequantification is a key part of the program.
There are a number of linked issues:
(1) Often in experimentally derived images the x and y magnifications can be different due to non-square CCD pixels, optical distortions, etc. This can be overcome by taking images of known graticules in both directions to check magnification.
(2) Using calculated "magnifications" derived from the nominal magnification on the side of the objective is often not accurate.
(3) Most histological images are 2D projections of a 3D space (admittedly thin) that has been subjected to considerable tissue processing, shrinkage, etc. since it was a live piece of tissue. This should be experimentally investigated and accounted for by the scientists. Note that relatively modest linear shrinkage becomes significant 3D volumetric shrinkage (a small numerical illustration follows this list). Furthermore, not all tissue shrinks by the same amount. Famously, the German anatomist Herbert Haug found in the 1960s that the brain tissue of young humans shrank more than the brain tissue of old humans. This led to an apparent "loss" of neurons with age (the old brains had a lower numerical density but, after correcting for differential shrinkage, the same total number).
(4) The act of taking a thin histological (or optical) section leads to an observed reduction in feature dimension. This is not widely reflected upon and there are no accepted ways to indicate this in standard histological images.
(5) Often the histology stains used do not stain tissue compartments uniformly.
(6) Histological images are in fact 2D samples (real Flatland stuff) from 3-space, and in common with all statistical sampling they suffer from the "Central Paradox of Sampling" = simply by looking at the picture you have no idea whether it has been randomly sampled or carefully selected (i.e. whether it is an unbiased or a biased sample).
(7) The short advice I give for all quantitative image analysis tasks is always to think Outside-In, not Inside-Out; i.e. think about what was done to obtain the image in front of you and whether this is appropriate for the scientific task at hand, and do NOT get obsessed by the image details, some of which are misleading.
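Here is a tiny numerical illustration (hypothetical numbers of my own, not from any particular study) of point (3) above: a modest linear shrinkage becomes a large volumetric shrinkage, and differential shrinkage distorts numerical densities unless reference volumes are corrected.

```python
# 15% shrinkage in each linear dimension ...
linear_shrinkage = 0.15
volume_factor = (1.0 - linear_shrinkage) ** 3
print(f"remaining volume fraction: {volume_factor:.3f}")   # about 0.61, i.e. ~39% of volume lost

# ... and the effect on numerical density: the same (hypothetical) number of
# cells in a shrunken reference volume gives an inflated density.
n_cells = 1.0e6
fresh_volume = 100.0                      # arbitrary units
processed_volume = fresh_volume * volume_factor

print(f"density in fresh tissue:     {n_cells / fresh_volume:.0f} per unit volume")
print(f"density in processed tissue: {n_cells / processed_volume:.0f} per unit volume")
# If two groups shrink by different amounts (Haug's young vs old brain tissue),
# comparing raw densities without this correction produces an apparent
# difference in total cell number where none exists.
```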
Matt
How science works
There are extensive chapters on science, stats and engineering and how evidence from these different fields of endeavour should be considered by lawyers and judges.
I had a good, detailed read of the chapter on 'How science works' and found it an excellent, informed and irreverent description of what actually happens in science rather than what theoretically happens.
Matt
March of Resolution
This relationship between science and resolution is not just nostalgic: there is an integral connection between the growth of science and the development of measurement resolution, as Edward Tufte describes it (Tufte 2003):
"The history of science over the centuries can be written in terms of improvements in resolution. From the beginning and all the way up to 1609, when Galileo's telescope first assisted human vision, scientific knowledge consisted of making descriptions and comparisons for events taking place at measurement scales accessible to the human eye, from about 10-3 (a tiny speck) and up to 10+7 meters (the Milky Way), some 11 orders of magnitude. Now, 400 years later, scientific descriptions and comparisons take place at scales from 10-18 and up to 10+25 meters, some 44 orders of magnitude. That is, from 1609 to 2003, scientific resolution improved an average of about 8 orders of magnitude per century (or 100 million-fold per century). Scientific resolution has increased an average 10,000,000 to 100,000,000 times per century in each of the 4 centuries since Galileo."
Although Tufte makes the argument with respect to the resolution of distances, a similar argument can be made about improvements in temporal resolution (from sundials with rough “hourly” resolution in antiquity to modern atomic clocks, which can maintain an accuracy of about 10^-9 seconds per day), chemical resolution or mass resolution.
Matt
Cramming more components onto integrated circuits
The End of Science
In mid-2008 the editor-in-chief of Wired magazine, Chris Anderson, proclaimed “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete”. This claim was backed up by 12 case studies that he uses to illustrate his point: Google-quality data analysis has made the search for causation obsolete, and “correlation is enough”. There will be many working scientists who will not lose much sleep over this claim. In the end the Supercrunchers have to have data to work on, and in science good data is still not trivial to obtain; nevertheless this article and its premise should provoke active scientists and statisticians to think deeply about the implications.
In contrast to Wired, we believe that the core tools of science and scientific inference are as valid today as they ever were. But the pitfalls of a lack of rigour are worse: there are now even more ways to do it wrong than right. We believe that, more than ever, scientists need to be skilled in the art of Integrated Quantification (IQ): the ability to take into account all aspects of quantification in a holistic manner to achieve high-quality scientific research.
This skillset is not yet taught in current university courses; neither science degrees nor statistics degrees deal with post-data-deluge quantification in a holistic manner. The ability to critically appraise the output of a Google algorithm that has found a putative correlation will be more important than ever. In business it may be enough to say that the correlation gives you an edge when interacting with billions of humans, but in science this has yet to be proven. If we don't build this skillset, and build methods to utilise it in both academic and commercial research, we will have a new generation of scientists who believe the Wired view: that the scientific method is obsolete.
Matt
Experimental Control Vs Statistical Control
This paper HERE is a very thought-provoking old paper dealing with which statistical procedures are suitable for single-subject studies.
One of the interesting thoughts is the role of Experimental Control Vs Statistical Control of variability.
Measurement Scientists build expertise in Experimental Control – they build equipment, modify instruments, write software and understand their system. They do not undertake hypothesis-driven activities using “traditional” statistical inference approaches such as Null Hypotheses (H0) and Alternative Hypotheses (HA). They are mainly physicists. This skillset is very specific to a given experimental technique and is generally NOT transferable.
Data Scientists build expertise in Statistical Control. They design experiments and calculate the power of an experiment (a minimal power calculation is sketched below). They undertake hypothesis-driven activities using “traditional” statistical inference approaches. They are mainly mathematicians or statisticians. This skillset is very generic and transferable.
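As a minimal sketch of this kind of 'statistical control' calculation, here is the usual normal approximation to the power of a two-sided, two-sample comparison; the effect size and sample size below are illustrative, not taken from the paper.

```python
from scipy import stats

def two_sample_power(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test (normal approximation)."""
    z_crit = stats.norm.ppf(1.0 - alpha / 2.0)
    z_effect = effect_size * (n_per_group / 2.0) ** 0.5
    return stats.norm.cdf(z_effect - z_crit) + stats.norm.cdf(-z_effect - z_crit)

# Classic textbook case: a medium effect (d = 0.5) with 64 subjects per group
# gives roughly 80% power at alpha = 0.05.
print(round(two_sample_power(effect_size=0.5, n_per_group=64), 2))
```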
Ed Jaynes
Ed Jaynes was an American physicist who was born in 1922 and who carried on working pretty much until his death in 1998. Jaynes was one of the first people to fully explain that the classical probability theory developed by scientists such as Laplace (1749–1827) can be considered a generalization of Aristotelian logic. In the case of reasoning about uncertainty, our logic is that of probability theory; in the special case that our hypotheses are either true or false, we can reason using classical deductive logic (a toy illustration is sketched below).
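Here is a small toy example of my own (not from Jaynes's book) of that last point: Bayes' rule applied with probabilities of 0 and 1 reproduces classical deductive logic, in this case modus tollens ("A implies B; not B; therefore not A").

```python
def prob_a_given_not_b(prior_a, p_b_given_a, p_b_given_not_a):
    """P(A | not B) via Bayes' rule."""
    p_notb_given_a = 1.0 - p_b_given_a
    p_notb_given_not_a = 1.0 - p_b_given_not_a
    evidence = p_notb_given_a * prior_a + p_notb_given_not_a * (1.0 - prior_a)
    return p_notb_given_a * prior_a / evidence

# Deductive limit: "A implies B" means P(B|A) = 1. Having observed not-B,
# the posterior probability of A is forced to 0, whatever the prior was.
print(prob_a_given_not_b(prior_a=0.7, p_b_given_a=1.0, p_b_given_not_a=0.4))   # -> 0.0
```

With probabilities strictly between 0 and 1 the same rule gives graded plausibilities instead of certainties, which is Jaynes's point about probability theory extending logic.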
The distinctive approach of Ed Jaynes is not so widely known as it should be. Jaynes's life's work is summarised in a book called Probability Theory: The Logic of Science. This was published by Cambridge University Press and is HERE. Reviews include the following: ‘This is not an ordinary text. It is an unabashed, hard sell of the Bayesian approach to statistics. It is wonderfully down to earth, with hundreds of telling examples. Everyone who is interested in the problems or applications of statistics should have a serious look.’ SIAM News.
The statement that ‘this is not an ordinary text’ is a serious understatement. At one level the book is easy to read and has numerous real insights about the problems a quantitative scientist faces when trying to reason about an unknown reality based on partial information – the situation most natural and social scientists face during their working lives. At a deeper level this is a difficult and very ambitious book. The aim of the book is no less than an explication of the foundations of applied scientific inference. It combines physical insight, philosophical challenges and mathematical difficulty. It will stretch your conceptions about how and where probability theory can and should be applied. In the final analysis much of the book's technical material may well be shown to be wrong. However, simply trying to understand the book and its expansive vision of what scientific inference should be may well lead to a productive and creative life's worth of scientific work.
Jaynes is a leading thinker in the field of Objective Bayesian statistics (see Wikipedia HERE).
Other bits and pieces about Jaynes, including a short bio, are HERE.
Tufte Principles of Analytical Design
Edward Tufte describes 6 fundamental principles for analytical design (that he claims are merely mirrors of 6 principles of analytical thinking).
In brief, these are:
(1) Show comparisons, contrasts, differences.
(2) Show causality, mechanism, explanation, systematic structure.
(3) Show multivariate data; that is show more than 1 or 2 variables.
(4) Completely integrate words, numbers, images, diagrams.
(5) Thoroughly describe the evidence. Provide a detailed title, indicate the authors and sponsors, document the data sources, show complete measurement scales, point out relevant issues.
(6) Analytical presentations ultimately stand or fall depending on the quality, relevance, and integrity of their content.
Matt
Antediluvian
Return of the Max
Recently I have been digging back in time to re-acquaint myself with Maximum Entropy data processing (MaxEnt). It's a full 11 years since I last earned a living by applying MaxEnt, and trying to get back into the area has been a good experience – I have been pleasantly surprised by how things are in MaxEnt land.
Last I knew, MaxEnt and Bayes were officially two distinct things. There was an understanding that MaxEnt could be thought of as a Bayesian prior probability distribution for positive, additive distributed quantities (e.g. images and spectra). Now it seems that they are deeply and fundamentally connected and that there is a single underlying theory (confusingly called Maximum Relative Entropy, or MrE) which shows that Bayesian updating and MaxEnt are both special cases of a unifying inference framework. I do not pretend to fully understand all of the nuances, either philosophical or mathematical, but it appears that Ariel Caticha and his PhD student Adom Giffin have been having great fun setting up the new formalism and applying it (see particularly Caticha's lecture notes and Giffin's thesis on arXiv.org, which I have been reading). My experience is that even the best ideas take time to catch on – so many people have to unlearn so much – but somehow I suspect that in another 11 years or so we may all be maximising our relative entropies! (A toy MaxEnt calculation is sketched below.)
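For anyone who has never tried plain MaxEnt, here is a toy sketch of my own (illustrative code, not Caticha's or Giffin's) of Jaynes's Brandeis dice problem: find the distribution over die faces 1..6 that maximises entropy subject to a measured mean of 4.5. The solution has the form p_i proportional to exp(lam * i), and the multiplier lam can be found by simple bisection.

```python
import numpy as np

faces = np.arange(1, 7)
target_mean = 4.5

def mean_for(lam):
    """Mean of the exponential-family distribution p_i ~ exp(lam * i)."""
    w = np.exp(lam * faces)
    p = w / w.sum()
    return float((p * faces).sum())

lo, hi = -5.0, 5.0            # bracket for the Lagrange multiplier
for _ in range(100):          # bisection; mean_for is monotone in lam
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = 0.5 * (lo + hi)
w = np.exp(lam * faces)
p = w / w.sum()
print("lambda =", round(lam, 4))
print("MaxEnt probabilities:", np.round(p, 4), " mean =", round(float((p * faces).sum()), 4))
```

The resulting distribution leans towards the higher faces just enough to hit the constrained mean, and no more – which is exactly the MaxEnt idea of being maximally non-committal beyond the stated information.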
Matt
Wigmore Evidence Charts
I have been reading a fascinating book by two probabilists, Joseph Kadane and David Schum, called “A Probabilistic Analysis of the Sacco and Vanzetti Evidence”, about the famous US murder case from the 1920s.
One interesting feature of the book is their use of a particular type of graphic which they call a ‘Wigmore Evidence Chart’. They also show a complete appendix of these charts summarising the Sacco and Vanzetti evidence. Wigmore was a US jurist who developed a probabilistic approach to proof (he published this approach from 1913 onwards) and is seen by the authors as the earliest exponent of what today are called ‘inference networks’ (Wikipedia HERE).
I have not seen this type of chart before. They are interesting but visually a bit ropey; they could really do with improvements in clarity, line weights, text integration, colours, etc.
However, they are not dead; a quick search on Google took me to this 2008 paper by Bruce Hay of Harvard Law School (HERE). The paper is from the journal “Law, Probability and Risk” and the Abstract says the following;
“Wigmore’s `The Problem of Proof’, published in 1913, was a path-breaking attempt to systematize the process of drawing inferences from trial evidence. In this paper, written for a conference on visual approaches to evidence, I look at the Wigmore article in relation to cubist art, which coincidentally made its American debut in New York and Chicago the same spring that the article appeared. The point of the paper is to encourage greater attention to the complex meanings embedded in visual diagrams, meanings overlooked by the prevailing cognitive scientific approaches to the Wigmore method.”
I will be ordering this paper and reading around it, but in the meantime you may find this interesting.
Just found more on Wigmore – a whole book by Anderson, Schum & Twining HERE that has a long explanation of how to apply the method.
Matt