I think discussions of tokenizing unicode have often gotten caught up in trying to design nicer APIs for it. I am taking the much lazier approach of promoting the existing but undocumented `generate_tokens()` API (the str-based counterpart of `tokenize()`) to a public API.

Why should we do this? See https://bugs.python.org/issue12486.
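For context, a minimal sketch of the API being promoted: `generate_tokens()` takes a readline callable that returns `str`, whereas `tokenize()` requires one that returns `bytes`. (The sample source string here is illustrative, not from the PR.)

```python
import io
import tokenize

# generate_tokens() accepts a readline callable returning str,
# so plain in-memory source text works directly.
source = "x = 1 + 2\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

for tok in tokens:
    # Each token is a TokenInfo named tuple: (type, string, start, end, line)
    print(tokenize.tok_name[tok.type], repr(tok.string))
```

This prints NAME/OP/NUMBER tokens for the expression, followed by NEWLINE and ENDMARKER.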
- Document tokenize.generate_tokens() (12d956b)
- Add news file (8f1ac2f)
- Add test for generate_tokens (00a10f0)
- Document behaviour around ENCODING token (d18583d)
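The ENCODING-token behaviour mentioned in the last commit can be illustrated with a short sketch: `tokenize()` reads bytes and emits an ENCODING token first, while `generate_tokens()` reads str and does not. (The sample source is illustrative only.)

```python
import io
import tokenize

source = "x = 1\n"

# tokenize() consumes bytes and yields an ENCODING token first.
byte_tokens = list(tokenize.tokenize(io.BytesIO(source.encode()).readline))

# generate_tokens() consumes str and starts directly with the source tokens.
str_tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

print(tokenize.tok_name[byte_tokens[0].type])  # ENCODING
print(tokenize.tok_name[str_tokens[0].type])   # NAME
```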
It's also widely used:
https://github.com/search?l=Python&q=generate_tokens&type=Code
- Add generate_tokens to __all__ (de72759)
The test failures appear to be unrelated to this pull request (there's a `ConnectionRefusedError` in the asyncio tests). Could someone prod them to rerun?
Close/reopen, hopefully that will rerun tests.
Looks good. Thanks @takluyver
c56b17b
@willingc: Please replace `#` with `GH-` in the commit message next time. Thanks!
Thanks!
willingc approved these changes
Carreau approved these changes
Successfully merging this pull request may close these issues.