Jakob Prange

Ph.D. student in the Department of Computer Science at Georgetown University

Email:
(Sometimes that pretty alias doesn't work; in that case, replace the part before the AT with "jp1724" and omit the "cs DOT" part.)

Affiliations | Papers | Presentations

I am a Ph.D. student in Computer Science at Georgetown University. My advisor is Prof. Nathan Schneider.

With a background in Computational Linguistics, I want to explore how human language works and how it can be formalized effectively and efficiently. I try to accomplish this by combining methods and concepts from formal and distributional semantics, meaning and knowledge representation, deep learning, and old-school AI. I mostly rely on other people to apply my rather abstract research to something that's actually useful.

I received a Bachelor of Science in Computational Linguistics from Saarland University under the supervision of Prof. Dr. Manfred Pinkal and Dr. Stefan Thater. The topic of my thesis project was "POS-Tagging of Internet Texts Using Information about Distributional Similarity".

ACL Anthology | GitHub | Google Scholar | Semantic Scholar | ResearchGate | LinkedIn


Groups and projects I have been involved in

Current

Past



Publications

Tutorial

Refereed Journal Articles

Refereed Conference and Workshop Papers


Presentations

  • Supertagging the Long Tail with Tree-Structured Decoding of Complex Categories.
    (Alt title: CCG Supertagging as Top-down Tree Generation.)
    EACL, April 2021; SCiL, February 2021.
    [full paper], [abstract], [UMass ScholarWorks]

  • Cross-linguistic Multilayer Semantic Annotation and Parsing with UCCA.
    U of Utah NLP group, August 2019.
    [abstract]

  • Corpus Linguistics: What is it and what can it do for me?
    1st IDHN Conference, May 2019.
    [slides]

  • Preposition Supersenses in German-English Parallel Data.
    MASC-SLL 2018; Google Assistant and Dialog Workshop 2018.
    [abstract], [poster]

  • The UdS POS Tagging Systems @ EmpiriST 2015.
    Roundtable discussion, NLP4CMC 2016.
    [slides]