Defining the Problem, Part 2: Knowledge, or Information?

Plato discussed the idea that knowledge is justified true belief (JTB), and historically a lot of philosophers have accepted this definition. With JTB, we only know something if it is true, we believe it, and we have good reasons for believing it. There are some serious challenges to this idea of knowledge (Gettier's counterexamples, most famously), but generally philosophers agree that a belief must be true to be considered knowledge; there is no such thing as false knowledge.

Information, on the other hand, can be simply a collection of data, whether true or not. Knowledge, then, is information that is true, justified, and believed.

The most recent (though probably not final) formulation of what I have called The Knowledge Problem goes like this:

1. Time is really short.
2. We have vital decisions to make.
3. It is impossible to get enough verified information out of an ocean of unverified data to make the best possible decisions, and sometimes the information we need just isn't available.
4. We need sufficient information that is good enough to allow us to generally make good decisions and to minimize the harm when we make bad decisions.

But is that really a knowledge problem? Or is it an information problem? After discussing the issue in that previous post, I'm beginning to think it's more of an information problem. Building up a body of knowledge is probably one of the answers to the problem.

What do you think? Knowledge Problem, or Information Problem? Naturally, feel free to help refine my four points or bring up related issues.

12 comments:

Grim said...

If it's an information problem to which knowledge is a partial solution, it's also a knowledge problem. So what you've gotten out of your first go at it is your problem, plus another very serious problem. :)

Grim said...

By the way, one of the conditions that anti-JTB epistemologists often talk about is "safety." It's worth thinking about the issue, in terms of some of the other ideas about knowledge.

http://www.iep.utm.edu/safety-c/

The idea here is that JTB would give you knowledge if you:

1) Believe something,
2) That is true,
3) On the right kind of justification (e.g., your justification is directly relevant and rationally valid).

But sometimes you could have a justified belief that turns out to be true about something that could very easily have been wrong (lotteries are common examples). Does this count as knowledge in the same way that the kosher tradition counts as knowledge about what is safe and healthy to eat? The justification is of the right type, whereas with kosher it was of the wrong type; but one seems very sketchy, and the other very safe.

For your purposes, if knowledge is to be a partial answer to the information problem, safety may be a necessary condition on knowledge. Otherwise you can't adequately rely on knowledge to dispose of the need to process the information.

Tom said...

So what you've gotten out of your first go at it is your problem, plus another very serious problem. :)

Hah. Serves me right for going to the elves -- er, philosophers -- for advice.

Tom said...

By the way, I continued part of our discussion in The Knowledge Problem. I'll copy my comments from there to here.

Tom said...

Tom said...

...

To go back to my definition of the problem, we never worked out the fourth point to my satisfaction. Here it is again:

4. We need sufficient information that is good enough to allow us to generally make good decisions and to minimize the harm when we make bad decisions.

And your initial reply:

4. We don't need sufficient information that is good enough: that is what we want. What we need is what we have at the moment when we must choose and do. That is what must be evaluated for its quality, and on the strength of that we must choose not only what to do, but plan how to prepare for being wrong.

I can see that your objection implies that we have some information, but the formulation would allow for a case where we had no information. I.e., if we have to decide right now, and we have no information to go on, that's enough. However, that doesn't fulfill the needs of my qualifiers: good decisions that minimize harm from errors.

Somehow, I want to close that lacuna. I believe we do need some information, and that research is part of the solution to the problem. I'm just not sure how to phrase what I intend.

11:55 AM
Anonymous Tom said...

I do see your point, though. When time for research runs out, we have to decide based on what we have, not what we wish we had.

Also, research alone isn't enough; we must allot time for evaluation. Time allocation is a whole other can of worms, though, that I think we can tackle later.

11:58 AM

Grim said...

Serves me right for going to the elves -- er, philosophers -- for advice.

You'll find few enough elves among the living philosophers. But do you know what it means? Tolkien did.

I can see that your objection implies that we have some information, but the formulation would allow for a case where we had no information. I.e., if we have to decide right now, and we have no information to go on, that's enough. However, that doesn't fulfill the needs of my qualifiers: good decisions that minimize harm from errors. Somehow, I want to close that lacuna. I believe we do need some information, and that research is part of the solution to the problem. I'm just not sure how to phrase what I intend.

The formula allows for the fact that the deciding capacity is either reason or the will. So which is it? Cassandra was arguing it was the latter; so does almost the whole of modern and contemporary philosophy. Aristotle argues otherwise: that it is reason, though something other than will (i.e., desire) normally sets the ends.

There's a significant question here, and how you answer it will inform every aspect of the solution you come to. How do the rational and nonrational parts of the soul decide? Does desire set the ends, and reason determine (and decide upon) the means? Or is there The Will, acting and deciding independently of Reason and Desire?

Tom said...

The formula allows for the fact that the deciding capacity is either reason or the will. So which is it?

Hm, that looks like an inclusive 'or' to me ...

As I brought up in the earlier thread, you must have some information to start with, otherwise the decision-maker is insane. There is all the difference in the world between zero information and one piece of bad information. Knowing your options, for example, is information. Knowing a set of principles that guides you to making a decision in the absence of external information is still having information.

If will or desire makes our decisions, it has to have information first. In that case, my question is, can we better inform the will / desire? Can we train the gut to make better calls?

Tom said...

The article on safety looks pretty interesting. I'll dig deeper into it as I have time.

I'm hoping we'll also get back to externalism and justification as well here.

Grim said...

Well, they are related things. Externalism doesn't worry about the justification step in the same way, but it often has conditions (like, sometimes, safety) that it insists upon instead.

It is contrasted with internalism, in which your beliefs exist in a way that is fundamentally disconnected from the outside world. The world is whatever it is, and (following some concerns raised by Hume and Kant) you can't really know for sure what it is like. Your ideas about the world are created by your mind (for some internalists, your brain), and are composites of many different sensations which are put into a kind of order by YOU. Thus, they aren't the world outside you at all: they're your ideas, existing nowhere except in your mind. There is actually no reason to believe that the outside world (if it exists! we get to skepticism pretty fast here) is really ordered in neither more nor less than three dimensions, with only the colors we perceive, or indeed that there is such a thing as color, etc.

Thus, justification is very important to the internalist because there is a serious disconnect between the mind and the world. Having the right kind of rational justification is the only way you can be sure (as sure as possible) that your belief preserves the truths on which it is based. For the internalist, justification is much like the truth-preservation sought by modern logical forms.

The externalist isn't blind to the problems of things like color, but holds that they relate to real external things. (Indeed, the danger is that they lose the internal things: at some point it's hard for them to talk about your having a mind to which you have privileged access, and Williamson in fact denies it -- see his chapter on 'Burning down our cognitive homes' in Knowledge and its Limits. This isn't quite as nonsensical as it sounds -- it may well be that there's a lot more bleed between our minds and the environment than we realize, and that a sufficiently advanced computer with proper sensory equipment might be able to understand your thoughts better than you could.)

But you can come to know some things, like what is good to eat, and that it is warm outside. What makes this true is some facts about how your body relates to certain kinds of foods, or how a body structured like yours relates to an ambient temperature.

If that's right, there's a certain fit between our minds and the world -- and small wonder, since our minds are part of the world. Instead of justification, knowledge exists when you have a relationship with something that is true about the world. You can have a proper justification or an improper one, but you still have knowledge as long as the relationship to the truth is there.

What "safety" holds is that you can't just have any such relationship count as knowledge: it has to be one that couldn't easily be false. Thus, your belief that the lake you see is water is safe in Minnesota in a way that it isn't on Mars. Even if the lake on Mars should prove to be water (and not salt, or some frozen gas), it was not so likely to be true that it could count as knowledge. You just got lucky.

Tom said...

Externalism makes sense in that I agree the outside and inside have an exchange going on. From a chemical point of view, all sensory input changes our minds quite literally; the only way we can know we sense something is through a chemical change. I'm not really sure what a relationship with the truth is, though.

Anyway, in JTB, empirical / sensory evidence can count as justification, which seems like external justification. Some Gettier problems rely on the senses providing justification (but being wrong, of course).

It seems like justification can work with both internal (logic) and external (senses) evidence.

Also, the idea that the body and mind are partly constituted by the environment is an old Hippocratic idea as well. Read Airs, Waters, Places for an ancient description of how the environment influences our minds and temperaments. Interestingly, the ancient Chinese had similar ideas; the medical idea of Qi is all about harmonizing your mind & body with the environment.

Tom said...

Of course, you may already be aware of those things.

Also, just to clarify what I said about Qi, they see it as something the body and environment share, which is where I saw the connection w/ some of what you're saying about externalism.

Grim said...

Those are good reasons to favor externalist approaches. I think internalism, though still the dominant school in philosophy, is increasingly hard to maintain.

Justification is still important for your problem -- the information problem -- but this is why knowledge can partially solve it. There are areas where knowledge is safe enough that we don't have to worry about the justification step being exactly the right type. It doesn't have to be fully logical, or fully vetted by research, because the relationship is sure enough that we know that we know some things.