Tree Adjoining Grammars
CIS 530 - Intro to NLP

Context Free Grammars: Derivations
[Tree diagram: CFG derivation of "Who does Bill think Harry likes?", with "who" attached at the top of the tree.]

Context Free Grammars: Semantics
- The meaning relations of the predicate/argument structures are lost in the tree: likes(Harry, who).

Context Free Grammars: Complexity
- CFGs can be parsed in time proportional to n^3, where n is the length of the input in words, by algorithms like CKY.

Transformational Grammars
- Context-free deep structure plus movement transformations.
[Tree diagram: deep structure of "Who does Bill think Harry likes?", with "who" originating in the object position of "likes".]

Transformational Grammars: Complexity
- TGs can be parsed in exponential time, 2^n, where n is the length of the input in words.
- Exponential time is intractable, because exponentials grow so quickly.

Lexicalized TAG (LTAG)
- Finite set of elementary trees anchored on lexical items; encapsulates syntactic and semantic dependencies.
- Elementary trees: initial and auxiliary.

LTAG: A Set of Elementary Trees
[Tree diagrams: a1, a transitive tree for "likes" (S over NP and VP, VP over V and NP); a2, an object-extraction tree for "likes", with a fronted NP and an empty object position.]
- Some other trees for "likes": subject extraction, topicalization, subject relative, object relative, passive, etc.

LTAG: Examples
[Further example trees.]

Lexicalized TAG (LTAG)
- Finite set of elementary trees anchored on lexical items; encapsulates syntactic and semantic dependencies.
- Elementary trees: initial and auxiliary.
- Operations: substitution and adjoining.

Substitution
[Diagram: initial tree b substituted into tree a at a frontier node labeled X, yielding derived tree g.]
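The n^3 claim on the Complexity slide can be illustrated with a minimal CKY recognizer. This is a sketch assuming a toy CFG in Chomsky Normal Form; the grammar and sentence are invented for illustration, not taken from the slides.

```python
from itertools import product

# Toy CFG in Chomsky Normal Form: every rule is A -> B C or A -> word.
# Grammar and sentence are invented for illustration.
BINARY = {
    ("NP", "VP"): {"S"},
    ("V", "NP"): {"VP"},
    ("Det", "N"): {"NP"},
}
LEXICAL = {"the": {"Det"}, "cat": {"N"}, "mouse": {"N"}, "chased": {"V"}}

def cky_recognize(words, start="S"):
    """CKY recognizer: three nested loops over spans, hence O(n^3) in the
    number of words (for a fixed grammar)."""
    n = len(words)
    # chart[i][j] = set of nonterminals deriving words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICAL.get(w, ()))
    for width in range(2, n + 1):          # span width
        for i in range(n - width + 1):     # left edge of the span
            j = i + width
            for k in range(i + 1, j):      # split point
                for b, c in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= BINARY.get((b, c), set())
    return start in chart[0][n]
```

With this grammar, `cky_recognize("the cat chased the mouse".split())` succeeds, while a subjectless string like "chased the cat" is rejected.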

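The two LTAG operations, substitution (above) and adjoining (next), can be sketched as tree transformations. The Tree class, its flags, and the example trees below are illustrative assumptions; a real implementation would also handle feature structures and adjoining constraints.

```python
from copy import deepcopy

class Tree:
    """A node in an elementary or derived tree (illustrative sketch)."""
    def __init__(self, label, children=None, subst=False, foot=False):
        self.label = label
        self.children = children if children is not None else []
        self.subst = subst   # frontier substitution site, e.g. NP (down arrow)
        self.foot = foot     # foot node of an auxiliary tree, e.g. VP*

    def leaves(self):
        if not self.children:
            return [self.label]
        return [w for c in self.children for w in c.leaves()]

def substitute(tree, label, initial):
    """Replace the first substitution site with this label by a copy of
    the initial tree."""
    for i, child in enumerate(tree.children):
        if child.subst and child.label == label:
            tree.children[i] = deepcopy(initial)
            return True
        if substitute(child, label, initial):
            return True
    return False

def adjoin(tree, label, auxiliary):
    """Adjoin an auxiliary tree at the first internal node with this
    label: that node's subtree moves down to the auxiliary tree's foot."""
    for i, child in enumerate(tree.children):
        if child.label == label and child.children:
            aux = deepcopy(auxiliary)
            _replace_foot(aux, child)
            tree.children[i] = aux
            return True
        if adjoin(child, label, auxiliary):
            return True
    return False

def _replace_foot(node, subtree):
    for i, child in enumerate(node.children):
        if child.foot:
            node.children[i] = subtree
            return True
        if _replace_foot(child, subtree):
            return True
    return False

# Derive "Harry really likes Mary" (trees invented for illustration;
# the slides' elementary trees for "likes" are analogous):
likes = Tree("S", [Tree("NP", subst=True),
                   Tree("VP", [Tree("V", [Tree("likes")]),
                               Tree("NP", subst=True)])])
really = Tree("VP", [Tree("Adv", [Tree("really")]),
                     Tree("VP", foot=True)])
substitute(likes, "NP", Tree("NP", [Tree("Harry")]))  # fills subject slot
substitute(likes, "NP", Tree("NP", [Tree("Mary")]))   # fills object slot
adjoin(likes, "VP", really)                           # adjoins the adverb
sentence = likes.leaves()   # ["Harry", "really", "likes", "Mary"]
```

Note how adjoining inserts material inside an already-built tree, which is exactly what the derivations on the following slides rely on.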
Adjoining
[Diagram: auxiliary tree b, with root X and foot node X*, adjoined to tree a at a node labeled X, yielding derived tree g.]
- Tree b is adjoined to tree a at the node labeled X in the tree a.

LTAG: A Derivation
[Sequence of diagrams over several slides: the elementary trees for "who does Bill think Harry likes" are combined step by step by substitution and adjoining.]

LTAG: Semantics
[Tree diagram: derived tree for "who does Bill think Harry likes".]
- The meaning relations of the predicate/argument structures are clear in the original base trees!

LTAG: A Derivation
[Tree diagrams: a2, an object-extraction tree for "likes" with an empty object position; b1, an auxiliary tree for "think" (VP over V and S*); b2, an auxiliary tree for "does" (S over V and S*); a3 "who"; a4 "Harry"; a5 "Bill". Substitution and adjoining combine them to derive "who does Bill think Harry likes".]

LTAG: Derivation Tree
[Diagram: derivation tree for "who does Bill think Harry likes", with a2 (likes) at the root; a3 (who), a4 (Harry), and a5 (Bill) attached by substitution; b1 (think) and b2 (does) attached by adjoining.]
- Compositional semantics can be computed on this derivation structure.
- Related to dependency diagrams.

TAGs: Complexity
- TAGs can be parsed in polynomial time: n^5, rather than n^3 for CFGs.
- TAGs are a prime example of mildly context sensitive grammars (MCSGs).
- Plausible: MCSGs are sufficient to capture the grammars of all human languages; e.g., they can parse Swiss German.

Adequacy vs. Complexity
- Context Free Grammars: structure doesn't represent well the "domains of locality" reflecting meaning; parsed in polynomial time n^3 (n is the length of the input).
- Transformational Grammars: capture domains of locality, accounting for surface word order by "movement"; parsing is intractable, requiring 2^n time.

- Tree Adjoining Grammars: capture domains of locality, with surface discontiguities the result of adjunction; parsed in polynomial time n^5 (rather than n^3 for CFGs).

TAGs and Mildly Context Sensitive Languages: Swiss German

English relative clauses are nested
- NP1 [The mouse] VP1 [ate the cheese]. Form: NP1 VP1.
- NP1 [The mouse] NP2 [the cat] VP2 [chased] VP1 [ate the cheese]. Form: NP1 NP2 VP2 VP1.
- Theorem: languages of the form w w^R (a string followed by its reversal) are context free.

CFG trees naturally nest structure
[Tree diagram: nested S structure for the relative-clause sentence, with VP2 embedded inside the subject NP and VP1 ("ate the cheese") at the top level.]

Swiss German sentences are harder
- In English: NP1 [Claudia] VP1 [watched] NP2 [Eva] VP2 [make] NP3 [Ulrich] VP3 [work]. Form: NP1 VP1 NP2 VP2 NP3 VP3. Not hard.
- In Swiss German: NP1 [Claudia] NP2 [Eva] NP3 [Ulrich] VP1 [watched] VP2 [make] VP3 [work]. Form: NP1 NP2 NP3 VP1 VP2 VP3.
- Theorem: languages of the form w w (the copy language) cannot be generated by context free grammars.

Scrambling: N1 N2 N3 V1 V2 V3
[Tree diagrams over three slides: auxiliary VP trees, one per Ni/Vi pair, adjoined into one another to produce the cross-serial order N1 N2 N3 V1 V2 V3.]

A Simple Synchronous TAG Translator
[Diagram.]

Substituting in "John" and "Mary"
[Diagram.]

Substituting "Apparently"
[Diagram.]

Parsing TAGs by "Supertagging": Reducing Parsing to POS Tagging+

Supertag disambiguation: supertagging
- Given a corpus parsed by an LTAG grammar, we have statistics of supertags: unigram, bigram, trigram, etc.
- These statistics combine the lexical statistics as well as the statistics of the constructions in which the lexical items appear.

Supertagging
[Figure: the sentence "the purchase price includes two ancillary companies", with each word listed above its candidate supertags (a1 through a13, b1 through b4, ...).]
- On average, a lexical item has about 8 to 10 supertags.

Supertagging
[Figure: the same sentence, with the correct supertag for each word highlighted in blue.]
- Select the correct supertag for each word.
- The correct supertag for a word is the supertag that corresponds to that word in the correct parse of the sentence.

Supertagging: Performance
- Performance of a trigram supertagger on the WSJ corpus (Srinivas 1997):

  Training corpus    Test corpus (words)    Words correctly supertagged    % correct
  baseline           47,000                 35,391                         75.3%
  1 million words    47,000                 43,334                         92.2%

Abstract character of supertagging
- Complex (richer) descriptions of primitives.

- Contrary to the standard mathematical convention, in which descriptions of primitives are simple and complex descriptions are built from simple ones.
- Associate with each primitive all the information associated with it.

Complex descriptions of primitives
- Making descriptions of primitives more complex increases local ambiguity: there are more descriptions for each primitive.
- However, these richer descriptions of primitives locally constrain each other.
- Analogy to a jigsaw puzzle: the richer the description of each primitive, the better.

Complex descriptions of primitives
- Making the descriptions of primitives more complex allows statistics to be computed over these complex descriptions.
- These statistics are more meaningful.
- Local statistical computations over these complex descriptions lead to robust and efficient processing.

A different perspective on LTAG
- Treat the elementary trees associated with a lexical item as a super part of speech (super-POS, or supertags).
- Local statistical techniques have been remarkably successful in disambiguating standard POS.
- Apply these techniques to disambiguating supertags: "almost parsing".
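The supertagging pipeline described on the last slides can be sketched as n-gram tagging over each word's candidate supertags. For brevity this sketch uses bigram scores rather than the trigram model of Srinivas (1997); the lexicon, supertag names, and counts are all invented for illustration.

```python
import math
from collections import defaultdict

# Invented supertag lexicon for part of the slide example
# "the purchase price includes ...": each word lists candidate supertags.
LEXICON = {
    "the": ["a_det"],
    "purchase": ["a_noun", "a_nounmod", "a_verb"],
    "price": ["a_noun", "a_nounmod"],
    "includes": ["a_trans"],
}
# Smoothed bigram pseudo-counts over adjacent supertags (invented).
BIGRAM = defaultdict(lambda: 0.1, {
    ("<s>", "a_det"): 9, ("a_det", "a_nounmod"): 4, ("a_det", "a_noun"): 4,
    ("a_nounmod", "a_noun"): 8, ("a_noun", "a_trans"): 7,
    ("a_trans", "</s>"): 5,
})

def supertag(words):
    """Bigram Viterbi over the supertag lattice: for each word, keep the
    best-scoring path ending in each candidate supertag."""
    score = {"<s>": (0.0, [])}          # tag -> (log score, path so far)
    for w in words:
        new = {}
        for tag in LEXICON[w]:
            prev = max(score,
                       key=lambda p: score[p][0] + math.log(BIGRAM[p, tag]))
            s, path = score[prev]
            new[tag] = (s + math.log(BIGRAM[prev, tag]), path + [tag])
        score = new
    last = max(score, key=lambda p: score[p][0] + math.log(BIGRAM[p, "</s>"]))
    return score[last][1]
```

Here "purchase" is three-ways ambiguous on its own, but the bigram context selects its noun-modifier supertag, mirroring how context disambiguates supertags in the slides' example.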

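The jigsaw analogy from the "complex descriptions" slides can be made concrete: richer labels create more local ambiguity but also more local constraints, so adjacency checks alone eliminate most candidate combinations. The categories and the compatibility table below are invented for illustration.

```python
from itertools import product

# Invented supertag candidates for "the purchase price": richer categories
# (nounmod vs plain noun) mean more options per word ...
CANDIDATES = [
    ["det"],                        # the
    ["noun", "verb", "nounmod"],    # purchase
    ["noun", "verb"],               # price
]
# ... but each supertag also constrains its neighbours, like the shape of
# a jigsaw piece (invented compatibility table).
COMPAT = {("det", "noun"), ("det", "nounmod"), ("nounmod", "noun")}

def prune(candidates, compat):
    """Keep only supertag sequences whose adjacent pairs fit together."""
    return [seq for seq in product(*candidates)
            if all(pair in compat for pair in zip(seq, seq[1:]))]

total = len(list(product(*CANDIDATES)))   # 6 combinations before pruning
surviving = prune(CANDIDATES, COMPAT)     # only ("det", "nounmod", "noun")
```

Six candidate sequences collapse to one, which is the sense in which supertag disambiguation is "almost parsing".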