Painless Unsupervised Learning with Features, by Taylor Berg-Kirkpatrick, Alexandre Bouchard-Côté, John DeNero, and Dan Klein, NAACL 2010.
I was at NAACL 2010 and saw this paper presented, and a lot of it went over my head. Now, having taken a course and read a textbook on structured prediction, I found it a lot easier than I remembered. The beginning is a little math-heavy, understandably, but the entire second half of the paper is just a list of models you can apply this technique to, with results for each one.
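To remind myself of the core idea (written from memory, in my own notation, so treat it as a sketch rather than the paper's exact presentation): every multinomial in the generative model gets replaced by a log-linear distribution over hand-designed features:

```latex
% Each local multinomial \theta_{z \mid y} becomes a logistic
% (log-linear) function of a feature vector f(z, y) with shared
% weights w:
\theta_{z \mid y}(w)
  = \frac{\exp\left( w^{\top} f(z, y) \right)}
         {\sum_{z'} \exp\left( w^{\top} f(z', y) \right)}
```

Training then stays EM-shaped: the E-step computes expected counts exactly as before, while the M-step fits w with a gradient-based optimizer on the expected complete log-likelihood instead of the usual count-and-normalize update (and, if I remember right, they also try taking gradient steps on the log marginal likelihood directly).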
Learning Arguments and Supertypes of Semantic Relations using Recursive Patterns, by Zornitsa Kozareva and Eduard Hovy, ACL 2010.
This is essentially a follow-on to the last paper I posted. Last time they used the phrase “such as” to build a kind of ontology of things starting from a single seed (like “animals such as lions and _”). This time they expand the work to 14 other phrases, aiming to show that you can learn arbitrary semantic relations using the same basic methodology. For instance, they use “_ and _ fly to _” to learn selectional preferences for what things tend to fly and where they fly to, and “_ and nice dress” to learn adjectives that describe dresses. It’s an interesting approach, and they have pretty good success at getting a general idea of what arguments tend to go with what words. I wonder about its scalability, though: there are thousands of verbs in English, not counting all of the particles and prepositions they can take, and their methodology seemed pretty intensive for each phrase they test. But it does lend some credence to something we’ve been thinking about recently, that you can use simple verbs to approximate many or most of the relations people are interested in.
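To make the pattern idea concrete, here is a toy version of how one of these queries might be harvested. Everything here (the snippet list, the {seed} and * notation, the harvest helper) is my invention for illustration; the paper runs its patterns against web-scale search results instead:

```python
import re

# Toy stand-in for web-search snippets.
SNIPPETS = [
    "storks and swallows fly to Africa every winter",
    "geese and cranes fly to warmer climates in the fall",
]

def harvest(pattern, seed):
    """Fill the {seed} slot of a pattern and collect whatever
    single words land in the remaining * slots."""
    regex = re.escape(pattern).replace(r"\*", r"(\w+)")
    regex = regex.replace(re.escape("{seed}"), re.escape(seed))
    found = set()
    for snippet in SNIPPETS:
        for m in re.finditer(regex, snippet, re.IGNORECASE):
            found.update(m.groups())
    return found

# What flies alongside swallows, and where does it fly to?
print(harvest("* and {seed} fly to *", "swallows"))
# e.g. {'storks', 'Africa'}
```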
Toward Completeness in Concept Extraction and Classification, by Eduard Hovy, Zornitsa Kozareva, and Ellen Riloff, EMNLP 2009.
This paper deals with automatically constructing an ontology, or hierarchy, of concepts from a corpus. Their whole approach is essentially to use the phrase “such as” in two directions to find hyponyms and hypernyms, bootstrapping from two seed words. They start with “animals such as lions and _” and go from there to get other instances of animals. Once they’ve done that, they take all pairs of the animals they have found and look for the phrase “_ such as lions and [some other animal, like tigers].” Maybe that returns “felines” or “cats,” so you use that as a new initial search term, as in “felines such as tigers and _.” If you do this enough times, you get some idea of what kinds of groupings there are among words and what things fit into each grouping.
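Here is a minimal sketch of that bootstrapping loop as I read it. The complete function is a hypothetical stand-in for however they actually query the corpus and extract the terms that fill the blank:

```python
from itertools import permutations

def bootstrap(seed_class, seed_instance, complete, rounds=2):
    """Alternate two uses of 'such as':
      '<class> such as <instance> and *' -> new instances (hyponyms)
      '* such as <x> and <y>'            -> new class terms (hypernyms)
    complete(query) is assumed to return the terms filling '*'.
    """
    classes, instances = {seed_class}, {seed_instance}
    for _ in range(rounds):
        # Downward: grow the instance set under each known class.
        for cls in list(classes):
            for inst in list(instances):
                instances |= set(complete(f"{cls} such as {inst} and *"))
        # Upward: find class terms covering pairs of known instances.
        for x, y in permutations(sorted(instances), 2):
            classes |= set(complete(f"* such as {x} and {y}"))
    return classes, instances

# e.g. bootstrap("animals", "lions", complete=my_search_wrapper)
```

The real system presumably has to filter the harvested terms for reliability, but the alternation between those two query shapes is the heart of the method.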
A Typology of Near-Identity Relations for Coreference (NIDENT), by Marta Recasens, Eduard Hovy, and M. Antònia Martí, LREC 2010.
I thought this paper was interesting. They argue that many cases in coreference resolution are more complicated than a simple binary decision, so they introduce “near-identity” relations, somewhere in between saying two entities are coreferent and saying they are not. They had some good examples, but in the end I didn’t find their typology all that persuasive; I would have said most of the examples were simply not coreferent and dealt with the subtlety another way. I’ll give some of their examples below, with the coreferent or near-identical entities in italics.
Integrating Semantic Frames from Multiple Sources, by Namhee Kwon and Eduard Hovy, CICLing 2006
I had never heard of this conference before; incidentally, the paper was presented as a keynote talk by Ed Hovy that year.