Rebecca Chen
8366aa44bb
Add mistakenly removed constants back to tokenize. ( #4030 )
These constants were removed in
https://github.com/python/typeshed/pull/3839 because they are imported
from `token`. However, that is only true in Python 3.7+.
2020-05-17 18:05:07 -07:00
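The constraint described above is easy to check at runtime (a minimal sketch; COMMENT, NL, and ENCODING are among the constants in question):

```python
import sys
import token
import tokenize

# COMMENT, NL, and ENCODING are always defined in `tokenize`; only on
# Python 3.7+ are they also defined in `token` and re-exported, which is
# why dropping them from the tokenize stub was wrong for 3.6.
assert hasattr(tokenize, "COMMENT")
if sys.version_info >= (3, 7):
    assert token.COMMENT == tokenize.COMMENT
    assert token.NL == tokenize.NL
```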
Tim Hatch
508fd84499
Expand tokenize stub to include Intnumber etc ( #3839 )
The all-uppercase tokens, as well as the tok_name mentioned in the comment,
actually come from the `from token import *`.
2020-03-11 10:54:19 -07:00
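That claim can be verified directly (a sketch; NUMBER is just a representative constant):

```python
import token
import tokenize

# tokenize does `from token import *`, so the all-uppercase constants
# and tok_name that it exposes originate in the `token` module.
assert tokenize.NUMBER == token.NUMBER
assert tokenize.tok_name[token.NUMBER] == "NUMBER"
```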
Anthony Sottile
9ec0bcf7e4
Add cookie_re / blank_re to py3 tokenize ( #3745 )
2020-02-20 23:10:30 +01:00
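For reference, cookie_re matches a PEP 263 encoding declaration ("coding cookie") and blank_re matches blank or comment-only lines; both are undocumented module-level regexes that this change adds to the stub. A quick sketch:

```python
import tokenize

# cookie_re extracts the encoding name from a PEP 263 coding cookie.
match = tokenize.cookie_re.match("# -*- coding: utf-8 -*-\n")
assert match is not None and match.group(1) == "utf-8"

# blank_re is a bytes pattern matching blank or comment-only lines.
assert tokenize.blank_re.match(b"   \n") is not None
```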
Sebastian Rittau
c0d46a2035
Mostly undo #3372 ( #3481 )
readline callback must return str, not bytes.
2019-11-23 20:17:14 +01:00
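The distinction being restored here (a minimal sketch): generate_tokens() takes a str-returning readline, while tokenize.tokenize() takes a bytes-returning one.

```python
import io
import tokenize

source = "x = 1\n"

# generate_tokens() consumes a readline callback that returns str ...
str_tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

# ... whereas tokenize.tokenize() consumes one that returns bytes and
# yields an extra ENCODING token first.
byte_tokens = list(tokenize.tokenize(io.BytesIO(source.encode("utf-8")).readline))
assert byte_tokens[0].type == tokenize.ENCODING
```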
Sebastian Rittau
ec7960a8cb
Convert namedtuples to class syntax ( #3321 )
2019-10-20 10:37:33 +02:00
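For TokenInfo, the class-based syntax looks roughly like this (a sketch of the stub shape; the field names match the runtime TokenInfo, though the exact annotations in the stub may differ):

```python
from typing import NamedTuple, Tuple

class TokenInfo(NamedTuple):
    type: int               # token type, e.g. token.NAME
    string: str             # the token text itself
    start: Tuple[int, int]  # (row, column) where the token begins
    end: Tuple[int, int]    # (row, column) where the token ends
    line: str               # the physical line containing the token
```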
Sebastian Rittau
299d89ab76
generate_tokens(readline) must return bytes ( #3372 )
2019-10-16 08:55:23 -07:00
Sebastian Rittau
c32e1e2280
Enable --disallow-any-generics for stubs ( #3288 )
2019-10-01 05:31:34 -07:00
Rebecca Chen
6f01493edc
Move some constants from tokenize to token in Python 3.7+. ( #3175 )
2019-08-08 08:45:08 +02:00
Michael Lee
efb67946f8
Use variable annotations everywhere ( #2909 )
2019-04-13 10:40:52 +02:00
Łukasz Langa
9b6df1d6bc
[tokenize.open] Accept PathLike filename ( #1921 )
* [tokenize.open] Accept PathLike filename
* Still accept str, bytes and int on Python 3.6+
2018-02-26 11:01:48 -08:00
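The behavior after this change, sketched below (the temp-file setup is only for illustration): tokenize.open() detects the file's source encoding per PEP 263, returns a text-mode file object, and accepts os.PathLike paths as well as str, bytes, and int.

```python
import pathlib
import tempfile
import tokenize

with tempfile.TemporaryDirectory() as tmp:
    path = pathlib.Path(tmp) / "mod.py"          # an os.PathLike argument
    path.write_text("x = 1\n", encoding="utf-8")
    with tokenize.open(path) as f:               # opened in text mode
        source = f.read()
```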
Jelle Zijlstra
e980c8987b
tokenize: add generate_tokens in py3 ( #1449 )
Fixes #1433
This is undocumented but somebody is asking for it to be included.
2017-07-04 19:17:39 -07:00
Emily Morehouse
b6d08b81a3
#1286 Remove header comments from stubs ( #1292 )
- Updates documentation related to previously required comment headers.
- Removes all comment headers from stubs
- Occasionally kept a header for stubs that were noted to be incomplete or contained TODOs.
2017-05-22 15:14:15 -07:00
Martijn Pieters
3f0eb995aa
Complete the tokenize module type hints ( #984 )
* Complete the tokenize module type hints
* Add missing import for Optional
* Use a 3.5-style named tuple; untokenize speaks with forked tongue, so use Any
* Use explicit types for fields
2017-03-15 09:57:17 -07:00
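The "forked tongue" remark: untokenize() returns str or bytes depending on the token stream it is fed, which is why the stub falls back to Any. A minimal round trip (sketch):

```python
import io
import tokenize

source = "x = 1\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

# Fed full 5-tuples from generate_tokens (no ENCODING token present),
# untokenize() returns str and reproduces this simple source exactly.
assert tokenize.untokenize(tokens) == source
```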
Lukasz Langa
82b2d8e3bc
Fixing flake8 F403, F405 errors
2016-12-20 02:28:12 -08:00
Lukasz Langa
fe0e3744cc
Fixing flake8 E261 errors
2016-12-19 22:09:35 -08:00
Ben Darnell
088fd393b8
Add stdlib/3/tokenize.pyi ( #151 )
Mostly stubgen, with open() added by hand because stubgen somehow
doesn't see it.
2016-04-17 19:59:47 -07:00