martinm07/tokenization-layer-docs

description: An NLP tokenization algorithm that is a trainable layer for neural networks.

Introduction

This is the documentation for a package implementing an experimental "tokenization layer": a tokenization algorithm built as a neural network layer that trains alongside a model solving some NLP task, learning to produce the tokens best suited to that task.

Note, however, that this is only a proof of concept; in its current state, the layer should not be used for any real tasks.
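To make the idea concrete, here is a minimal sketch of what a trainable tokenizer could look like. This is not the package's API; every name (`tokenization_layer`, `char_emb`, `w_boundary`) is illustrative. The key property is that token boundaries come from differentiable operations, so a downstream task loss could backpropagate into the tokenizer's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical character vocabulary and learnable parameters:
# character embeddings plus a linear boundary scorer.
vocab = {c: i for i, c in enumerate("abcdefghijklmnopqrstuvwxyz ")}
embed_dim = 8
char_emb = rng.normal(size=(len(vocab), embed_dim))
w_boundary = rng.normal(size=embed_dim)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tokenization_layer(text):
    """Return a soft token-boundary probability for each character.

    Because the scores are produced by differentiable ops (embedding
    lookup, dot product, sigmoid), gradients from a downstream task loss
    could flow back into `char_emb` and `w_boundary` -- the core idea of
    a tokenizer that trains as part of the model.
    """
    ids = np.array([vocab[c] for c in text])
    x = char_emb[ids]                # (seq_len, embed_dim)
    return sigmoid(x @ w_boundary)   # (seq_len,) boundary probabilities

probs = tokenization_layer("hello world")
print(probs.shape)  # one probability per input character
```

A real layer would go further (e.g. pooling characters between boundaries into token embeddings), but the sketch shows the essential shift: segmentation decisions become continuous, learnable quantities rather than a fixed preprocessing step.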

About

Documentation of the tokenization-layer Python package.
