
What does it mean to understand RED?

In a previous section, I suggested the ``asking obvious questions'' test as a measure of understanding. A system that understands the word ``enter'', for example, should know that when you enter a room, you are ``inside'' that room.

However, it might seem unnatural to talk about understanding without perception. Is it OK to say that an agent understands ``red'' when it does not have a ``red sensor''? It is one thing to know which things are red, that red is a color, and that color is a physical phenomenon related to the spectrum of light. It is another thing to be able to tell a red thing from a green thing. This latter type of ``understanding'' is considered an important requirement by advocates of ``embodiment'' and ``grounding''.

Other people consider this a ``trivial'' requirement, i.e. it says nothing about whether the lack of grounding makes achieving human-level intelligence impossible, or even difficult. It can even be argued that the lack of grounding makes the problem easier by eliminating irrelevant detail. I find arguments like the following not adequately answered by the proponents of embodiment:




