Advanced education tends to focus on narrowly defined areas of expertise. It is all too common for experts in very narrowly defined areas to be woefully uninformed in broader areas of knowledge.
I have a master's degree in Computer Science and Engineering. My "expert" area is embedded software.
At first glance, this might seem like a very narrowly defined area. However, "my" software is currently running bakery ovens, protein separation systems, smart card readers, allergy diagnosis systems, wind shear observation stations, and so on. And before writing the software for those systems I had to gain at least some knowledge of the problem domain. I do not claim to know how to bake the perfect bread, but I do claim that the oven software is doing exactly what the bakery needs in order to make the perfect bread.
During those years I had colleagues doing mechanical and electrical design as well, and they too needed to know what the end product was supposed to do.
I have met plenty of so-called experts, but so far none who were "woefully uninformed" in broader areas of knowledge. In my experience, people with advanced education are usually very well informed in many areas outside their own area of expertise. Except maybe economists.
Grubbegrabben,
My comment is not that it is universal, nor even the majority, but that it is unfortunately common. And that has very much been my experience, particularly in working with the national labs. A very high percentage of the national lab scientists are nothing shy of brilliant. An unfortunately large number miss absolutely basic logical issues and errors, or are very narrowly focused in their expertise.
I hesitate to use specific examples. However, even when the science is clear, all too often the researchers I have worked with have continued using methods and models that demonstrably cannot work correctly, or have failed to apply lessons from even fairly closely related fields.
E.g. chemists working at the micro scale in hydrogeology who are unable to apply lessons from modeling at scales ranging from molecular to millimeter to centimeter to kilometer. Or physicists and chemists doing surface interaction modeling without applying bulk chemistry related to solubility products and reaction kinetics. Or physicists and chemists knowing nothing about parametric pumping. Or corrosion scientists knowing nothing about unintentional heat exchange systems resulting in condensation cells, or inadvertent electrochemical cells. Or radiation scientists entirely unfamiliar with the diurnal variation in background gamma associated with the Crab Nebula and the black hole at the center of the Milky Way galaxy.
Or scientists proposing studies where the opening line of the proposal reads "We need to show that ...". Anything after that is a statement of belief, not of science, which renders the proposed study fatally flawed before it starts. Or failure to recognize Type I and Type II errors, other error types, and logical fallacies of a hundred kinds; and failure to recognize the existence, let alone the importance, of the unstated assumptions and presumptions biasing experiments and analyses.
One of the most fundamental of these is the failure to understand that science is the search for truth in the unknown, and NOT the end result of testing the null hypothesis with statistical measures. The p <= 0.05 threshold is one of the worst offenders. A huge number of scientists I have worked with fail to understand that, even when there is no real effect at all, an experiment tested at p = 0.05 will come up statistically significant about one time in twenty purely by chance. That is not a malfunction; it is the expected behavior of the threshold itself. This also forms much of the basis of p-hacking.
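To make that concrete, here is a minimal sketch (my own hypothetical illustration, not anything from the original discussion) in which both samples are drawn from the same distribution, so any "significant" result is a false positive by construction:

```python
# Hypothetical illustration: false positives under a true null hypothesis.
# Both samples come from the same normal distribution, so every
# "significant" t-test result here is a Type I error by construction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)
alpha = 0.05
n_experiments = 10_000
false_positives = 0

for _ in range(n_experiments):
    a = rng.normal(loc=0.0, scale=1.0, size=30)
    b = rng.normal(loc=0.0, scale=1.0, size=30)  # same distribution as a
    _, p = stats.ttest_ind(a, b)
    if p <= alpha:
        false_positives += 1

print(f"False positive rate: {false_positives / n_experiments:.3f}")
# Prints approximately 0.050 -- about 1 in 20 -- even though
# no real effect exists anywhere in the data.
```

Run enough null experiments, or enough variations of a single analysis, and significant-looking results are guaranteed to appear; that is exactly the lever that p-hacking pulls.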
Worse than most of these is the tendency to continue using methods and approaches that can be and have been shown not to apply and/or not to work.
Most often these were not intentional errors. Most often they are errors and biases built into the history of the field of study. The deference given to precedent, even when the precedent is wrong, is unfortunately common. E.g. defining the uncertainty in an analysis of a real-world system as the mathematical sensitivity of the results to variation in the parameters deemed most important in one implementation of one mathematical model of one chosen conceptualization of the system. Clearly, the true uncertainty is vastly larger: it includes the uncertainty in the choice of models to represent the system, and even the absence of a full understanding of the system. Yet still, even today, it is common to find uncertainty in modeling defined as the sensitivity of the model to variation of its parameters.
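A toy example of the difference (entirely hypothetical models and numbers, chosen only to illustrate the point):

```python
# Hypothetical example: parameter sensitivity vs. model (structural) uncertainty.
# Two equally plausible conceptualizations of transport time over a 100 m
# distance; neither is asserted to be a real field model.
import numpy as np

distance_m = 100.0

def advection_model(velocity_m_per_day):
    # Conceptualization A: plug flow at a constant velocity.
    return distance_m / velocity_m_per_day

def retarded_model(velocity_m_per_day, retardation=3.0):
    # Conceptualization B: same flow, but sorption retards transport.
    return retardation * distance_m / velocity_m_per_day

# "Uncertainty" as parameter sensitivity: sweep velocity within model A only.
velocities = np.linspace(0.8, 1.2, 50)  # +/- 20% around 1 m/day
within_model = [advection_model(v) for v in velocities]
print(f"Model A, parameter sweep: {min(within_model):.0f}-{max(within_model):.0f} days")

# Structural uncertainty: the same sweep across both conceptualizations.
across_models = within_model + [retarded_model(v) for v in velocities]
print(f"Both models together:    {min(across_models):.0f}-{max(across_models):.0f} days")
# The parameter-only band (roughly 83-125 days) badly understates the spread
# once model choice is included (roughly 83-375 days), and even that excludes
# whatever neither conceptualization captures.
```

The parameter sweep within one model reports a tidy, narrow band; admitting a second, equally plausible conceptualization triples the spread, and whatever neither model captures appears in neither number.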
Occasionally the errors are intentional. They are chosen to maintain funding and careers in an environment where others (government or corporate leaders) want a particular outcome.
My point is not to chastise or criticize. My point is that all of us must always be on guard, looking for the errors in thought or approach that each of us may (and will) make. That is not criticism; that is correction and improvement for all of us. And yes, each of us will stumble from time to time and commit all sorts of errors. The sooner we identify and fix them, the better off we all are.
But this all goes far afield of the original query: how smart are we?
My point in answer to that was that a single parameter, such as level of educational attainment or IQ, is a very crude and often misleading way to think about it.
Said as a positive statement: it is clear to me from the depth and breadth of the discussions here that the quality of thought of those contributing to this forum is extremely high.
Quite often the discussions here have raised issues that have been underappreciated by the experts in the relevant fields. And because experts in those fields take part in the discussion, everyone has benefited. More than that, collectively we are vastly smarter than the simple sum of our members.
Often we are individually and collectively stumbling in the dark. That happens at the frontier of knowledge.