A word can have several word senses; this is polysemy. [3] Polysemy is distinct from monosemy, where a word has a single meaning. [3] It is also distinct from homonymy (or homophony), an accidental similarity between two or more words (such as bear the animal and the verb bear); whereas homonymy is a mere linguistic coincidence, polysemy ...
However, polysemy means that the audience may create new meanings out of the text. The audience's perceived meanings may not be intended by the producers. Therefore, 'polysemy' and 'opposition' should be seen as two analytically distinct processes, although they do interconnect in the overall reading process.
For example, finger describes any digit on a hand, but the existence of the word thumb for the first digit means that finger can also be used for "non-thumb digits on a hand". [13] Autohyponymy is also called "vertical polysemy". [a] [14] Horn called this "licensed polysemy", but found that autohyponyms also formed even when there is no ...
Polysemy is the phenomenon where the same word has multiple meanings. So a search may retrieve irrelevant documents containing the desired words in the wrong meaning. For example, a botanist and a computer scientist looking for the word "tree" probably desire different sets of documents.
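The retrieval problem above can be sketched with a toy keyword search. The documents and ids below are hypothetical; the point is only that a bag-of-words match for "tree" retrieves both the botanical and the computer-science sense, so one of the two users gets an irrelevant result.

```python
# Hypothetical document collection: "tree" appears in two unrelated senses.
docs = {
    "botany": "An oak tree grows from an acorn and sheds leaves in autumn.",
    "cs": "A binary search tree stores keys in sorted order for fast lookup.",
    "cooking": "Simmer the sauce and season with basil.",
}

def keyword_search(query, docs):
    """Return ids of documents containing the query term, ignoring sense."""
    q = query.lower()
    return [doc_id for doc_id, text in docs.items() if q in text.lower()]

print(keyword_search("tree", docs))  # matches both 'botany' and 'cs'
```

A sense-aware system would need some disambiguation step before or after matching to separate the two result sets.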
Examples include the pair stalk (part of a plant) and stalk (follow/harass a person) and the pair left (past tense of leave) and left (opposite of right). A distinction is sometimes made between true homonyms, which are unrelated in origin, such as skate (glide on ice) and skate (the fish), and polysemous homonyms, or polysemes, which have a ...
Most people can agree on distinctions at the coarse-grained homograph level (e.g., pen as a writing instrument or as an enclosure), but one level down, at fine-grained polysemy, disagreements arise. For example, in Senseval-2, which used fine-grained sense distinctions, human annotators agreed on only 85% of word occurrences. [14]
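The agreement figure cited above is a raw percentage: the fraction of items on which annotators chose the same sense label. A minimal sketch of that calculation, using made-up annotations (not the Senseval-2 data):

```python
def observed_agreement(labels_a, labels_b):
    """Fraction of items on which two annotators chose the same sense."""
    assert len(labels_a) == len(labels_b)
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return matches / len(labels_a)

# Hypothetical sense labels for four occurrences of "pen".
a = ["pen/instrument", "pen/enclosure", "pen/instrument", "pen/instrument"]
b = ["pen/instrument", "pen/enclosure", "pen/enclosure", "pen/instrument"]
print(observed_agreement(a, b))  # 0.75
```

Evaluations often also report chance-corrected measures such as Cohen's kappa, since raw agreement is inflated when one sense dominates.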
Polysemy is a major obstacle for all computer systems that attempt to deal with human language. In English, the most frequently used terms have several common meanings. For example, the word fire can mean: a combustion activity; to terminate employment; to launch; or to excite (as in fire up).
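One classic way systems pick among senses like those of fire is gloss overlap, in the spirit of the Lesk algorithm: choose the sense whose dictionary gloss shares the most words with the surrounding context. The glosses below are hand-written stand-ins, not entries from any real lexicon.

```python
# Hypothetical sense glosses for the word "fire".
SENSES = {
    "combustion": "flames burning heat smoke combustion",
    "dismiss": "terminate employment job dismiss boss",
    "shoot": "launch gun weapon shoot bullet projectile",
}

def disambiguate(context):
    """Pick the sense whose gloss overlaps most with the context words."""
    words = set(context.lower().split())
    return max(SENSES, key=lambda s: len(words & set(SENSES[s].split())))

print(disambiguate("the boss decided to terminate his employment"))  # dismiss
```

Real word-sense-disambiguation systems use far richer signals, but the overlap heuristic illustrates why context is the key to resolving polysemy.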
In natural language processing, a word embedding is a representation of a word, used in text analysis. Typically, the representation is a real-valued vector that encodes the meaning of the word in such a way that words closer together in the vector space are expected to be similar in meaning. [1]
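"Closer in the vector space" is usually measured with cosine similarity. The sketch below uses tiny hand-picked 3-dimensional vectors (real embeddings are learned and have hundreds of dimensions) to show the comparison:

```python
import math

# Toy hand-picked "embeddings"; learned vectors would be much larger.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.85, 0.75, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# "king" should be closer to "queen" than to "apple".
print(cosine(vectors["king"], vectors["queen"]) >
      cosine(vectors["king"], vectors["apple"]))  # True
```

Note that a single vector per word collapses all of a polysemous word's senses into one point, which is exactly the limitation that sense-aware and contextual embeddings aim to address.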