Conversation

@duncanka (Contributor) commented Sep 17, 2017

  • Improved the code structure to make it easier to adapt to other neural transition-tagging tasks
  • Made it possible for the CNN library to release memory, which is necessary when repeatedly creating networks in the same run (see the sketch below)
  • Fixed the command-line documentation

I've confirmed that the pretrained model still gets a UAS of 93.55% / LAS of 92.16% on the test set.
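For context on the memory-release point: it matters when a single process constructs and discards networks more than once, e.g. repeated experiments in one run. Below is a minimal C++ sketch of that usage pattern against the cnn library; the `cnn::Cleanup()` hook (and the exact `Initialize` signature) are illustrative assumptions, not necessarily the API this PR adds.

```cpp
// Sketch: build and tear down a network repeatedly in one process.
// The Initialize/Cleanup pair shown here is an assumed pattern; the actual
// cleanup entry point added by this change may be named differently.
#include "cnn/cnn.h"
#include "cnn/model.h"

void run_once(int argc, char** argv) {
  cnn::Initialize(argc, argv);   // allocates the library's global memory pools
  {
    cnn::Model model;            // parameters for one transition-based tagger
    // ... build the network, train or decode ...
  }                              // model is destroyed before the pools are released
  cnn::Cleanup();                // assumed hook: frees the pools so the next run starts fresh
}

int main(int argc, char** argv) {
  for (int i = 0; i < 3; ++i) {  // e.g. several folds or configurations in the same run
    run_once(argc, argv);
  }
  return 0;
}
```

Without a cleanup step, the library's arena allocations persist for the lifetime of the process, so creating networks in a loop would keep growing memory.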

duncanka added 30 commits starting March 18, 2017 22:59. Excerpts from the commit messages:

  • Shortens the function; also allows it to be reused for other transition-based NLP systems. Also some corrections for consistent variable naming conventions.
  • Makes the numbers in logging accurate (in preparation for reusing the same architecture for another tagger that is not a parser).