Commit Graph

7 Commits

Author SHA1 Message Date
Łukasz Langa
9b6df1d6bc [tokenize.open] Accept PathLike filename (#1921)
* [tokenize.open] Accept PathLike filename

* Still accept str, bytes and int on Python 3.6+
2018-02-26 11:01:48 -08:00
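
The commit above widens the argument type accepted by `tokenize.open` in the stub. A minimal sketch of what this allows at a call site, assuming an `example.py` file exists in the working directory:

```python
# Sketch: passing a PathLike object to tokenize.open, which the updated stub
# now accepts; str, bytes and int file descriptors remain valid on Python 3.6+.
from pathlib import Path
import tokenize

# pathlib.Path implements os.PathLike, so this call type-checks against the stub.
with tokenize.open(Path("example.py")) as f:
    print(f.encoding)    # encoding detected from the coding cookie or BOM
    print(f.readline())  # first source line, decoded with that encoding
```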
Jelle Zijlstra
e980c8987b tokenize: add generate_tokens in py3 (#1449)
Fixes #1433

This is undocumented but somebody is asking for it to be included.
2017-07-04 19:17:39 -07:00
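
`generate_tokens` is the str-based counterpart of `tokenize.tokenize`: it takes a `readline` callable that yields text lines and produces `TokenInfo` tuples. A small usage sketch:

```python
# Sketch: tokenizing a source string with tokenize.generate_tokens, which
# works on str lines (unlike tokenize.tokenize, which expects bytes).
import io
import tokenize

source = "x = 1 + 2\n"
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    print(tok.type, tok.string)
```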
Emily Morehouse
b6d08b81a3 #1286 Remove header comments from stubs (#1292)
- Updates documentation related to previously required comment headers.
- Removes all comment headers from stubs
- Occasionally keeps a header for stubs that were noted to be incomplete or that contained TODOs.
2017-05-22 15:14:15 -07:00
Martijn Pieters
3f0eb995aa Complete the tokenize module type hints (#984)
* Complete the tokenize module type hints
* Add missing import for Optional
* Use a 3.5-style named tuple; untokenize's return type is inconsistent, so use Any
* Use explicit types for fields
2017-03-15 09:57:17 -07:00
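
For context, a 3.5-style (functional-syntax) named tuple with explicit field types, roughly as a stub might declare `tokenize.TokenInfo`; this is an illustrative sketch, not the exact typeshed content:

```python
# Illustrative sketch only, not the exact stub from the commit: a 3.5-style
# functional NamedTuple with explicit field types for tokenize.TokenInfo.
from typing import NamedTuple, Tuple

TokenInfo = NamedTuple("TokenInfo", [
    ("type", int),               # token type, e.g. tokenize.NAME
    ("string", str),             # the token text
    ("start", Tuple[int, int]),  # (row, col) where the token begins
    ("end", Tuple[int, int]),    # (row, col) where the token ends
    ("line", str),               # the physical line containing the token
])
```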
Lukasz Langa
82b2d8e3bc Fixing flake8 F403, F405 errors
2016-12-20 02:28:12 -08:00
Lukasz Langa
fe0e3744cc Fixing flake8 E261 errors
2016-12-19 22:09:35 -08:00
Ben Darnell
088fd393b8 Add stdlib/3/tokenize.pyi (#151)
Mostly stubgen, with open() added by hand because stubgen somehow
doesn't see it.
2016-04-17 19:59:47 -07:00
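
A minimal sketch of the kind of hand-written entry the commit describes for `tokenize.open`; the exact signature in the stub may differ:

```python
# Illustrative sketch, not the commit's exact content: a hand-written stub
# entry for tokenize.open, which stubgen did not generate on its own.
from typing import TextIO

def open(filename: str) -> TextIO: ...
```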