The Knowledge Problem (Part 1)

If you've waded through How Do You Splint a Broken Paradigm? and Let's Shift That Paradigm a Bit More (God bless you), then I hope you've understood my situation as a specific example of a generalizable and, I believe, common problem. Broadly, I've begun to regard most sources of knowledge as highly questionable.

While I am far, far more skeptical than I used to be, I wouldn't say I've become a true Skeptic. I do believe there are some demonstrable scientific truths. I believe journalists and historians and social scientists get some things right. I believe my lying eyes (ears, nose, etc.), most of the time. I don't believe there's a Big Conspiracy to keep The Truth from us, though there are probably lots of small, unconnected conspiracies to keep certain bits of information from certain people.

In other words, I believe knowledge is possible, and that we can fairly easily get our brains around some of it. The real problem is a bit more treacherous. We are deluged with information from a great variety of sources, but in most cases we don't know which bits of it are to be trusted and which aren't. Worse, we're mortal, and we have many demands on our meager allotment of time in this world besides the sifting and sorting of information.

And yet, we must make decisions based on information. Not only big things like whom to support for president, but everyday ones: what to eat for breakfast (low-fat? low-carb? anything -- just hot and now?), whether to take up or remain in a religion, when to devote time and money (and how much) to social or political causes, what to say to our co-workers at lunch when they ask what we think about current events, what advice to give to children and people we mentor -- these sorts of things.

The knowledge problem, then, as best as I've been able to define it, goes like this: Time is really short; we have vital decisions to make; we need to get verified information out of an ocean of unverified data; that's very hard.

Are there any changes or refinements to the problem you would like to offer?

UPDATE 8/24/13: Due to the discussion in the comments, I've revised my formulation of the problem. I've given the new formulation in the post, Defining the Problem, Part 2: Knowledge, or Information?

32 comments:

Grim said...

Yes, one at least.

This problem is the one von Clausewitz talks about in the context of war as "the fog of war." Modern information technology has extended the problem beyond the fog of war: it now fogs everyday interactions of many different kinds.

That means that everyday interactions need to be planned with a similar allowance for error. You keep back a reserve force in maneuver warfare because you realize that you may need a capacity to attack in an unexpected direction (or directions) once you have committed your main forces and they are fixed by the battle already joined.

Similarly, recognizing the importance of community support for military efforts, you budget a certain amount of resources for reconciliation efforts for when your judgment goes wrong. This is an important part of your strategy any time you are not just 'fighting through' a country but intending to remain engaged with a community for a long time. If they're not to support your enemies (political or social or religious or corporate competitors, here), you need to give them reasons to trust that you'll make it right when you get something wrong.

So it's vital to get it right, but it's not possible in many cases. Of course, if you're being misled by bad paradigms, you can end up screwing up reconciliation too: enriching or empowering enemies you had mistaken for potential allies, for example. So it is wise to plan margins for error in your margins for error: doing reconciliation in ways that are hard to turn against you, or being careful to retain a capacity to destroy someone you are helping. You can imagine Google or Microsoft paying a generous settlement for an error, but on terms that leave them a legal backdoor to eviscerate the recipient for violating certain conditions. The settlement is generous enough that you'd want to accept it, but the corporation is guarded against treachery.

The question quickly becomes something like, "Where can we be sure enough of good faith and certain knowledge that we can afford to treat people as friends rather than (at least potential) enemies?" And the answer is somewhat depressing. But there are many lessons here.

Tom said...

I agree with everything you say, but I'm not sure how it modifies the formulation of the knowledge problem. It adds one decision (where to put our faith) that we need information to make, but the problem itself seems unchanged: how to get good information on which to base decisions in a limited amount of time. Am I missing something?

Grim said...

I take it to modify both the desire for "verified information" and the final clause "that's very hard." It is not just very hard; it is so often impossible in the time allocated that we should ordinarily build in plans for failure.

Likewise, we must often act without verified information. So we don't really need verified information in order to act: rather, we want the optimum balance between verity and timeliness.

Tom said...

Yes, I think you're right.

I've been thinking the answer to the problem is a particular set of principles (as yet unknown to me) we could use to gather and analyze information, and now I see those principles need to include a way of handling being wrong.

Tom said...

Let's see ... It's a bit clunky, but how's this as a reformulation?

1. Time is really short.

2. We have vital decisions to make.

3. It is impossible to get enough verified information out of an ocean of unverified data to make the best possible decisions, and sometimes the information we need just isn't available.

4. We need sufficient information that is good enough to allow us to generally make good decisions and to minimize the harm when we make bad decisions.

Grim said...

Well, let's challenge that.

1) Time is short, but art is long. One of the ways in which we approach this problem is to learn what those before us knew. This not only helps us by teaching us how to recognize where they went wrong, but it provides us with a platform from which to criticize our own paradigms. Without an alternative, as you said, we cannot do that.

2) We have vital decisions to make, but urgency and importance are two different axes. Some decisions are more vital, but there is time to consider them carefully; others are more urgent, but not so important. One way of approaching the problem is to make sure we are making this distinction, so that we focus our short time first on problems that are both urgent and important; then on problems that are urgent but only somewhat important; and then on problems that are important but not urgent, generally leaving the unimportant problems to slide. (A small sketch of this triage follows at the end of this comment.)

3) All you say in principle 3 is true, but we must still decide and act. One way to act is to learn to recognize areas in which the best available information is more likely to be wrong -- or, areas where being wrong is more likely to be disastrous. I am thinking here of Taleb's "The Fourth Quadrant," which is a typology of problems that lets you know that you can proceed without too much fear in some areas, but need to be very cautious about taking risks in others. So that is an aspect of your problem: developing similar typologies of kinds of problems, and also of kinds of "knowledge" that are more likely to be wrong.

A good example: a case in which all living experts generally agree is one in which we are very likely to be wrong about what almost all firmly believe. There is nearly always ground to be gained here by checking your assumptions, checking your foundations, looking again at how we came to believe something so firmly, and asking whether that belief is still justified -- or if it ever was.

This is the work of philosophy.

4. We don't need sufficient information that is good enough: that is what we want. What we need is what we have at the moment when we must choose and do. That is what must be evaluated for its quality, and on the strength of that we must choose not only what to do, but plan how to prepare for being wrong.
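
To make point 2 concrete, here is a minimal sketch of that triage in Python. Everything in it -- the decision names, the urgent/important flags, the priority ordering -- is an illustrative assumption layered onto the comment above, not anything from the original discussion.

```python
# A minimal sketch of the urgency/importance triage from point 2.
# The decisions and their flags are illustrative assumptions.

def triage(decisions):
    """Order decisions: urgent+important first, then merely urgent,
    then important-but-not-urgent, letting the rest slide."""
    def priority(d):
        if d["urgent"] and d["important"]:
            return 0
        if d["urgent"]:
            return 1
        if d["important"]:
            return 2
        return 3  # neither urgent nor important: let it slide
    return sorted(decisions, key=priority)

decisions = [
    {"name": "answer a co-worker about the news", "urgent": True, "important": False},
    {"name": "decide whom to support for president", "urgent": False, "important": True},
    {"name": "choose breakfast", "urgent": True, "important": True},
    {"name": "idle channel-surfing", "urgent": False, "important": False},
]

for d in triage(decisions):
    print(d["name"])
```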

Tom said...

For 1-3, we're moving into solutions, which is good. Taleb is on my reading list for this project, but I haven't gotten there yet. Maybe he should be higher up on the list.

For 4, my wording is squishy. One thing I wanted to imply with it was the need for research in the little time we have.

Your definition of need doesn't imply any kind of information gathering. In fact, it doesn't necessarily include information. If we are forced to make a decision with no information, we don't need it.

Grim said...

Right, it doesn't. What I mean by "need" is necessity: is accurate information necessary for action, or is it merely desirable?

Well, it is certainly helpful. But take (for example) two people making decisions about what to eat so as to maximize their health. One is a rich white woman living today, who is fascinated with the question and keeps up with all the latest studies on how and what to eat. The other is an ancient Jewish man, familiar with the oral law of his people.

Presumably the first person has far more accurate and detailed information than the second. Furthermore, it changes all the time as science improves, questioning old insights or advancing the ball on others. She makes her decisions in a way that changes, and sometimes she finds herself correcting what she comes to believe were earlier mistakes based on bad information.

The second person never alters his eating patterns, obtains no new information, and is operating on an understanding that lacks thousands of years of scientific input.

So -- whose outcomes will be better, ceteris paribus? If it's too hard to hold everything equal given the difference in the medicine available to the two, consider a modern ultra-Orthodox Jew who rigidly adheres to the ancient interpretations of the law. It's true that he gets to eat fewer things, and it's true that he understands much less about the processes involved. It's true that he simply lacks access to the best available information, and that causes him to make decisions some would consider bad (e.g., never in his life having the pleasure of eating delicious oysters).

On the other hand, the woman is spending a ton of time on matters that are important but not urgent; and that time is time that could be spent on matters either more urgent (but also important) or more important. She could probably have nearly as good an outcome by following the old laws like he does: unless she has some very strange food allergies, it might even be exactly as good an outcome, except for missing out on certain kinds of foods.

So: which action is best? How much is the 'accurate information' necessary to a good action? Art is long. That's part of the answer.

Grim said...

Notice I am for this purpose entirely avoiding the question of religious values. The Orthodox man is eating in accord with the law only to maximize his health. Insofar as he obtains additional religious benefits (which may more than offset the cost of not eating oysters, for example), that's set to the side. We're just talking about deciding what to eat in order to maximize health.

Tom said...

I think the more important point is that, under your definition, we don't need ANY information at all. We can always just make a random decision.

A better comparison would be the woman from your example and a man faced with choices presented in a language he can't understand, where he may end up getting food, or he may end up getting nothing or something harmful.

If the point is just to make decisions, that's right. But if the point is to "generally make good decisions and to minimize the harm when we make bad decisions", I think we do need some information.

Grim said...

There's still some information: the man knows what his options are.

But the point I wanted to make was a distinction between "information", or even "accurate information" or "verified information", and knowledge. You said you wanted to talk about the knowledge problem, and knowledge is not just information.

The woman in my example has "accurate information" in a far greater degree, and "verified information" insofar as science can verify it. The man -- especially if we take the modern Orthodox Jew, who has thousands of years more tradition to draw upon -- has a kind of knowledge. It's not the same thing as 'verified/accurate information' because he doesn't have that -- he doesn't know all the things the woman knows. His decisions, however, will bring him reliably to outcomes as good as hers will, without his spending nearly as much time or attention on it.

Clearly, then, one solution to the problem you're having is to prefer knowledge to information.

So let me play Socrates for a moment, and ask you: what, then, is knowledge?

Cass said...

I tend to think the idea that our decisions are (or will ever be) based on anything but gut instinct is wishful thinking.

Most of the time, we react and then rationalize afterwards when we have time. This is what bothers me about political theories that presuppose a rational and informed electorate: that's wishful thinking. Most voters are neither particularly rational nor particularly well informed.

Most of life is trial and error; action and reaction. Thinking, like knowledge, is that rarest of luxuries. When we can indulge, we should.

Tradition or culture used to take a lot of these decisions out of our hands. We were taught, "do this, do that" - don't think about it, just do what you are told or what you've been taught to do. Don't question. Don't rebel. Don't buck the system.

That's not a foolproof framework, either. And it continues to amaze me that so many people look back to that and think it's better than what we have now. It's different, certainly. It feels more "certain" because it was a far more confining code, and left us far fewer options to choose from.

And this is the lost age of freedom we're mourning?

I see the value of tradition, but I also see the downside of it. It's really a matter of tradeoffs: this for that; freedom for security; awareness for peace of mind; a sense of belonging or larger purpose for individualism.

Which is better? That's a bit of a value judgment, and one few want forced on them. How many variables do we really need to consider when making minor decisions? I'm guessing not all that many. The older I get, the more inclined I am to believe that we spend far too much time worrying about things that, if we were not so fortunate, would be far down on the priorities list.

Tom said...

Grim: There's still some information: the man knows what his options are.

No, the options are unintelligible to him, so he knows he has a decision to make, but not what his options are. The point I'm trying to make is, under your definition of "need," there is never a genuine need for information.

The point I would make to both of you is that people have to start with something. Even if it's the gut that makes the call, it has something to go on, some kind of information about the situation. The information is almost certainly incomplete, and it may well be at least partly wrong, but there's some information to start with. Otherwise, our decision maker is insane.

Tom said...

Grim, as for your example, I don't think I would make your distinction between the quality of information the two people have. I believe successful traditions can be a form of verification, so the man's dietary regime is verified by millennia of people who have followed it and lived healthy lives. Science is great at what it does, but it isn't the only source of accurate information.

But, what is knowledge? Yes, that is a truly essential question. I was hoping you wouldn't ask it. :-D

I guess we can start with Plato's 'justified true belief.' However, while justification is a big topic all on its own, the biggest problem is truth. I think that in many, many cases, we can only have an approximate idea of what the truth is. So, we may have knowledge, we may not, but either way, much of the time, we probably don't know if we know something.

I know, I know. Not only do we want to have knowledge, we want to know we have it. Often, I believe, that's just not possible.

Consider the Ptolemaic astronomers at the time of Galileo. They had 1400 years of observation, detailed and sophisticated mathematical descriptions of the phenomena, the ability to accurately predict the movement of astronomical bodies and events, all kinds of justification. It certainly seemed true. They died believing it. But, well, later we found out it wasn't true. They believed they had knowledge, but we don't believe they did.

Where does this lead? Probably to some form of pragmatism and a high degree of negative capability, as Texan99 might call it. Or, to take Cass's lead, to training the gut to make better decisions.

At least, that's what I think now.

Cass said...

Well, I've spent a goodly part of my lifetime relying on my intuition.

I haven't really decided that that's the same as knowledge - it's really more akin to a sense that may reflect some limited knowledge, or may include things that are ultimately unknowable (like what goes on in my husband's noggin) in the sense that there's absolutely no way for me to confirm them! :)

I think what I'm pushing back against is the idea that there's some way to be sure of our knowledge, or that our internal filters/biases are any less insidious than other people's. I like to think about my moral intuitions because it's an (imperfect) way of testing them against something a tad more objective than my ineffable sense of rectitude. Going to church does something similar - it forces my mind into uncomfortable channels I might not wade into voluntarily otherwise.

But either way, I'm still not *sure* I'm right, nor will I ever be.

I hear so many folks talk as though they have no doubt they're right about pretty much everything, or as though everything is cut and dried and simple. Shoot - I feel that way too, a lot of the time. "Of course I'm right, because ... well, I just AM, durnitall!" :)

But I don't think that way, when I take the time to really think things over.

Cass said...

So, we may have knowledge, we may not, but either way, much of the time, we probably don't know if we know something.

Or as some wag once said, there are the things we know we know, and the things we don't know we don't know - the unknown unknowns :p

Of course the sophisticated people found the idea of admitting you don't know something (and that that's a problem) to be funny for some reason.

Grim said...

Having spent a largely wasted evening participating in a community government activity, I'm inclined to agree that most people's capacity to evaluate decisions, and vote wisely, is quite small.

However, the one thing we aren't talking about right now is the restrictive function of tradition (about which you and I have long and frequently-elaborated arguments, Cass, that would only distract us here). The ancient Jew I posited may in some sense have been forced to adhere to tradition; but the modern Orthodox Jew I also posited isn't forced to do so.

The reason what he has is knowledge and not "information" has to do with the reliability of the tradition in producing good results. There's a relationship between what he knows and what is true. And it isn't, perhaps surprisingly, a scientific relationship. It's not even really 'information' in the sense Tom is struggling with. After all, it's not scientific or empirical information about the environment -- it's a law, passed down for thousands of years, in a form that changes very little. It's not very responsive to change, and for reasons that have nothing to do with the question of what is healthy to eat. So it's kind of surprising that it proves to be a highly reliable form of knowledge about what is good to eat.

Now I assume (Tom) that you're familiar with the usual reasons for rejecting 'justified true belief' (JTB) as a formula -- the Gettier cases. I'm not very impressed by them, myself, but I happen to like one of the solutions that they provoked, which is known as 'externalism.' I assume you know the distinction at least in passing?

This seems like a good case of externalism as a justifier (or 'truthmaker') of knowledge. We have knowledge about a religious law code that alleges itself to be God's rules about what His chosen people should eat. It's not really about healthy eating. But it relates to the world in such a way that its rules are true -- the world makes them true. If you live that way, you will eat well and prosper.

This is true even though it isn't really what the rules are about. They're about ritual cleanliness and uncleanliness. It's a kind of knowledge that is highly reliable even though it isn't actually 'information' about the problem of how to eat best.

So again, exactly how does information differ from knowledge? Well, 'verified information' really could be JTB. It is going to be justified in the right way (or at least what most people take to be the right way), i.e., it'll be justification that is directly relevant to the question.

The kosher rules are justification of the wrong sort: this is justified as good to eat because it keeps you ritually clean. That might even be said to be irrelevant to the question of whether it is the best way to eat for your health.

I think it can still be knowledge about what is good to eat, because it relates to the truth. The justification step is disposable, if the relationship to the truth is really there. And that means that knowledge isn't JTB, but (as the externalists say) a relationship with the truth.

Tom said...

Cass,

Tradition or culture used to take a lot of these decisions out of our hands. We were taught, "do this, do that" - don't think about it, just do what you are told or what you've been taught to do. Don't question. Don't rebel. Don't buck the system.

Well, and why shouldn't it? You yourself said most people are neither informed nor rational; wouldn't their decisions be better made according to tried and true principles? Things that have worked for a long, long time?

That's not a foolproof framework, either.

Nothing is, but it's a system built up of a lot of trial and error over many generations. It's better than every generation starting over with nothing and having to figure everything out itself.

And it continues to amaze me that so many people look back to that and think it's better than what we have now.

Well, the people who look back and say traditional ways were better simply don't define tradition the way you do.

An example disagreement on definition would be that I don't believe tradition was nearly the straitjacket it's often portrayed as. When we go to the historical sources, we find it used as a guideline, and people were always renegotiating the boundaries. If they weren't, societies would never change. So I see tradition as a flexible set of guides for a society. Some of it is determined by the past, but some of it is always redefined by the present generation to suit its ideas, needs, and tastes.

To give a specific example, giving women the right to vote was simply a renegotiation of American tradition. It was a natural extension of well-established, traditional principles. By the time it happened, in fact, our tradition demanded it. To not have given them the vote would have been anti-traditional.

Grim said...

Well, and why shouldn't it? You yourself said most people are neither informed nor rational; wouldn't their decisions be better made according to tried and true principles?

*Sigh.* I guess that is what we're talking about, now. Nevermind. :)

Tom said...

:-D

I'm going to have to pick this up tomorrow. Great fun, though. Thanks!

Tom said...

Of course the sophisticated people found the idea of admitting you don't know something (and that that's a problem) to be funny for some reason.

Well, because they know everything, of course. Knowledge problems, like following the law, are for the little people.

Tom said...

Now I assume (Tom) that you're familiar with the usual reasons for rejecting 'justified true belief' (JTB) as a formula -- the Gettier cases. I'm not very impressed by them, myself, but I happen to like one of the solutions that they provoked, which is known as 'externalism.' I assume you know the distinction at least in passing?

*looks around shiftily, quickly reads a couple of Wikipedia articles*

Why, yes, of course. Please continue.

...

Grim, something that bothers me about your example of the Jewish man's knowledge is that he isn't really eating that way to be healthy. He is eating that way to avoid pollution. His traditions constitute reliable information on how to avoid pollution, so to me this is JTB.

Now, the fact that the woman in the example could also adopt the diet for purely health reasons and be pretty successful is interesting. But, again, the diet has kept people healthy over thousands of years, which is empirical data that it is a healthy diet, even if it's never been scientifically studied. If she looks at that history and makes her decision, that's also JTB.

Is it just that you want to get rid of justification? Do things we believe that are accidentally true still count as knowledge?

Tom said...

Even if we say the Jewish man keeps to the regime to avoid pollution, not to maintain health, he still knows it's a healthy diet. He doesn't know why it's healthy, which seems to be a challenge to JTB.

However, JTB (as I understand it) doesn't require knowing causes, per se. He knows it's a healthy way to eat because of personal experience, and presumably the experience of other Orthodox Jews; he has empirical information that justifies his belief.

Grim said...

It's not that I want to get rid of justification. I think what the woman is doing is fine.

What I want to do is to draw a distinction between two different ways of approaching problems, because it seems important to me given the way you've structured your problem. You're thinking of this as a "knowledge problem," but you've set it up as an information problem.

So I think it's important to distinguish between information and knowledge. Once we can say exactly what the difference is, we can see that knowledge is a partial solution to the information problem. That's true even when the knowledge actually doesn't provide any information about the problem -- when it isn't a 'justification of the right type' for a JTB kind of knowledge.

So, let's try again. What exactly is knowledge, if it isn't (or isn't just) JTB? Now that you've read up on the issues a bit, can you say what is wrong with Gettier's cases? Most contemporary analytic philosophers think he's right. If you think he's wrong, can you say why? (My reason for thinking so can be worked out from the posts on Modern Logic I have already given you.)

Grim said...

However, JTB (as I understand it) doesn't require knowing causes, per se.

Well, that's the justification step. For the justification to be 'of the right type,' it needs to be directly relevant information.

So, for example, another kind of justification that wouldn't be acceptable for knowledge is, "Because Mom told me to eat this way." If all you know is that she said this was the right way to do it, and no more about her reasons than that, then you're committing an informal fallacy (Argumentum ad Verecundiam, or 'appeal to questionable authority').

Mom may have had good reasons, of course. And you could find that you are healthy. So in a sense -- in my sense -- you do have knowledge that it is a good way to eat. And (toward a problem you raised) you know that you know that it is a good way to eat. You get what Timothy Williamson calls 'second-order knowledge' (which he agrees you can't always have).

But you don't have JTB. You have a belief, and it happens to be true, but it isn't justified in the right way on a JTB account of knowledge. It's justified by an informal fallacy, an error in logic.

Grim said...

Now, if I understand you, you want to say, "Well after a while, empirical evidence tells you that Mama was right." And that may be true, after a while. But in order for it to be true, you have to have been actually acting on this knowledge for a long time.

So the fact that we may eventually get empirical knowledge proves to be about the second-order claim: about knowing that you know. When you were acting, you just knew. You knew what was right, because Mama told you.

And I think you'll see that this is really how most people learn to eat. It's not pure theory. This is how we obtain knowledge about what to eat.

Tom said...

Now that you've read up on the issues a bit, can you say what is wrong with Gettier's cases?

Well, it's obvious what's wrong with them, so I'm probably dead wrong. They seem to hinge on superfluous terms and equivocation, or, if not equivocation, then improper transfers of justification.

Let's go back to Smith's article, "Aristotle's Logic."

Aristotle's definition of a syllogism (as opposed to the modern definition) is:

A deduction is speech (logos) in which, certain things having been supposed, something different from those supposed results of necessity because of their being so. (Prior Analytics I.2, 24b18-20)

Smith draws out three differences with the modern syllogism from this, the third one being:

The force of the qualification “because of their being so” has sometimes been seen as ruling out arguments in which the conclusion is not ‘relevant’ to the premises, e.g., arguments in which the premises are inconsistent, arguments with conclusions that would follow from any premises whatsoever, or arguments with superfluous premises.

That seems right. I agree with Aristotle here.

So, let's look at Gettier's first case:

Smith has applied for a job, but, it is claimed, has a justified belief that "Jones will get the job". He also has a justified belief that "Jones has 10 coins in his pocket". Smith therefore (justifiably) concludes (by the rule of the transitivity of identity) that "the man who will get the job has 10 coins in his pocket".

In fact, Jones does not get the job. Instead, Smith does. However, as it happens, Smith (unknowingly and by sheer chance) also had 10 coins in his pocket. So his belief that "the man who will get the job has 10 coins in his pocket" was justified and true. But it does not appear to be knowledge.

There are two issues that I see.

First, in line with what I think Aristotle would say, is that Smith's belief seems irrelevant to what actually happened, though I feel a bit of slippage here somewhere.

Second, and a bit firmer in my mind, it also seems like a problem of equivocation: 'the man who will get the job' and 'Jones' are convertible, so Smith's belief that 'the man who will get the job ...' still refers to 'Jones'; it was never 'any man who gets the job,' so it could not have meant Smith himself. (I.e., if a = b, when we say b, we are still saying a.)
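
To pin down that substitution point, here is a toy sketch in Lean; the names are my illustrative stand-ins ('a' for 'the man who will get the job', 'b' for Jones), not anything from Gettier's text. If a = b, then a claim proved about b is thereby a claim about a:

```lean
-- Toy sketch: if a = b, a claim about b is a claim about a.
-- 'a' plays 'the man who will get the job', 'b' plays Jones;
-- the type and predicate are illustrative assumptions.
example (Person : Type) (a b : Person) (P : Person → Prop)
    (h : a = b) (hb : P b) : P a := by
  rw [h]
  exact hb
```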

Let's look at Gettier's second case.

Smith, it is claimed by the hidden interlocutor, has a justified belief that "Jones owns a Ford". Smith therefore (justifiably) concludes (by the rule of disjunction introduction) that "Jones owns a Ford, or Brown is in Barcelona", even though Smith has no knowledge whatsoever about the location of Brown.

In fact, Jones does not own a Ford, but by sheer coincidence, Brown really is in Barcelona. Again, Smith had a belief that was true and justified, but not knowledge.

'Brown is in Barcelona' seems irrelevant to Jones having a Ford, so even though the form is correct, I don't have to acquiesce to the transfer of justification.
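
And the inferential step really is that thin. A minimal Lean sketch (with the propositions named illustratively) shows the second disjunct doing no work at all in the proof:

```lean
-- Disjunction introduction: a proof of p alone yields p ∨ q.
-- Here q ('Brown is in Barcelona') contributes nothing.
example (p q : Prop) (hp : p) : p ∨ q := Or.inl hp
```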

I think this is one advantage of Aristotle; the forms are his tools, not vice versa.

Of course, I feel like I'm in way over my head at this point. I think I need to back out slowly and take another run at the difference between knowledge and information.

Grim said...

Good. That wasn't at all bad for a first try. Come back to the knowledge v. information question, and we'll talk about Gettier again soon.

Tom said...

One problem with my formulation of 'the knowledge problem' is that I'm fairly skeptical about how much we can know, or at least how much we can know we know. Maybe that's why I use information and knowledge fairly interchangeably.

Something you seem to be pointing to is the idea that we don't have to work everything out for ourselves. Knowledge is transferable, much like my explanation of tradition above. Sometimes, tradition is right, but no one really knows why until we break it.

Well, my brain is getting hazy. I'll be back tomorrow.

Grim said...

This aspect you're talking about has to do with externalist accounts of knowledge. If knowledge is a relationship to the truth, it's not that hard to transfer: all I have to do is put you in a similar relationship. It's something like making an introduction, as you would to a friend.

Something to think about, as you work on the problem.

Tom said...

Checking my dictionary and Wikipedia, I find the chief difference between knowledge and information is that knowledge must be true, while information can be true or false.

Accepting that, it seems that maybe it is more of an information problem, and that, as you suggest, knowledge is part of the answer. I've opened this topic up in a new post.

To go back to my definition of the problem, we never worked out the fourth point to my satisfaction. Here it is again:

4. We need sufficient information that is good enough to allow us to generally make good decisions and to minimize the harm when we make bad decisions.

And your initial reply:

4. We don't need sufficient information that is good enough: that is what we want. What we need is what we have at the moment when we must choose and do. That is what must be evaluated for its quality, and on the strength of that we must choose not only what to do, but plan how to prepare for being wrong.

I can see that your objection implies that we have some information, but the formulation would allow for a case where we had no information. I.e., if we have to decide right now, and we have no information to go on, that's enough. However, that doesn't fulfill the needs of my qualifiers: good decisions that minimize harm from errors.

Somehow, I want to close that lacuna. I believe we do need some information, and that research is part of the solution to the problem. I'm just not sure how to phrase what I intend.

Tom said...

I do see your point, though. When time for research runs out, we have to decide based on what we have, not what we wish we had.

Also, research alone isn't enough; we must allot time for evaluation. Time allocation, though, is a whole other can of worms, one I think we can tackle later.