Compare commits
No commits in common. 'master' and 'gh-pages' have entirely different histories.
205 changed files with 59963 additions and 11155 deletions
@ -1,8 +0,0 @@ |
|||
name: Disco |
|||
|
|||
rules: |
|||
disco/*.py: |
|||
parser: numpy |
|||
|
|||
outputs: |
|||
- {type: markdown, path: docs/api/, title: 'Disco Documentation'} |
@ -1,3 +0,0 @@ |
|||
[flake8] |
|||
max-line-length = 120 |
|||
ignore=C408,C815,A003,A002,W504 |
@ -1,24 +0,0 @@ |
|||
build/ |
|||
dist/ |
|||
disco*.egg-info/ |
|||
storage.db |
|||
storage.json |
|||
*.dca |
|||
*.pyc |
|||
.eggs/ |
|||
.cache/ |
|||
.benchmarks/ |
|||
__pycache__ |
|||
.venv |
|||
|
|||
# Documentation stuff |
|||
docs/api/ |
|||
docs/_build |
|||
_book/ |
|||
node_modules/ |
|||
|
|||
# JetBrains IDE |
|||
.idea/ |
|||
|
|||
# macOS |
|||
.DS_Store |
@ -1,20 +0,0 @@ |
|||
language: python |
|||
|
|||
cache: pip |
|||
|
|||
python: |
|||
- '2.7' |
|||
- '3.6' |
|||
|
|||
matrix: |
|||
include: |
|||
- python: 3.7 |
|||
dist: xenial |
|||
sudo: true |
|||
|
|||
install: |
|||
- pip install -U pip setuptools flake8 |
|||
|
|||
script: |
|||
- python setup.py test |
|||
- flake8 disco/ |
@ -1,118 +0,0 @@ |
|||
# CHANGELOG |
|||
|
|||
## v0.0.12 |
|||
|
|||
### Additions |
|||
|
|||
- **MAJOR** Added voice gateway v3 support. This will result in increased stability for voice connections |
|||
- **BREAKING** Updated holster to v2.0.0 which changes the way emitters work (and removes the previous priorities). A migration guide will be provided post-RC cycle. |
|||
- Added support for ETF on Python 3.x via `earl-etf` (@GiovanniMCMXCIX) |
|||
- Supported detecting dead/inactive/zombied Gateway websocket connections via tracking `HEARTBEAT_ACK` (@PixeLInc) |
|||
- Added support for animated emoji (@Seklfreak) |
|||
- Added support for `LISTENING` and `WATCHING` game statuses (@PixeLInc) |
|||
- Added `wsaccel` package within the `performance` pack, should improve websocket performance |
|||
- Added the concept of a `shared_config` which propgates its options to all plugin configs (@enkoder) |
|||
- Added support for streaming zlib compression to our gateway socket. This is enabled by default and provides significant performance improvements on startup and overall bandwidth usage |
|||
- Added support for `Guild.system_channel_id` and `GUILD_MEMBER_JOIN` system message |
|||
- Added `Guild.create_category`, `Guild.create_text_channel` and `Guild.create_voice_channel` |
|||
- Added `Channel.create_text_channel` and `Channel.create_voice_channel` which can be called only on category channels to add sub-channels |
|||
|
|||
### Fixes |
|||
|
|||
- Fixed 'Invalid token passed' errors from showing up (via removal of token validation) |
|||
- Fixed `IndexError` being raised when `MessageIterator` was done iterating (@Majora320) |
|||
- Fixed overwrite calculations in `Channel.get_permissions` (@cookkkie) |
|||
- A plethora of PEP8 and general syntax changes have been made to cleanup the code |
|||
- Fixed a bug with `Emoji.custom` |
|||
- Fixed a bug in the typing system that would not allow Field's to have a `default` of `None` |
|||
- Fixed the `__str__` method for Channel's displaying (useless) unset data for DMs |
|||
- Fixed a bug with `MessageIterator` related to iterating before or after an ID of 0 |
|||
- Fixed incorrect field name (`icon_proxy_url` vs `proxy_icon_url`) in MessageEmbedAuthor model |
|||
- Fixed bugs related to creating and deleting pinned messages |
|||
- Fixed `GuildBan.reason` incorrectly handling unicode reasons |
|||
- Fixed `Paginator` throwing an exception when reaching the end of pagination, instead of just ending its iteration |
|||
- Fixed `Paginator` defaulting to start at 0 for all iterations |
|||
|
|||
### Etc |
|||
|
|||
- **BREAKING** Refactor the way Role's are managed and updated. You should update your code to use `Role.update` |
|||
- **BREAKING** Renamed `Model.update` to `Model.inplace_update`. You should not have to worry about this change unless you explicitly call that method |
|||
- **DEPRECATION** Deprecated the use of `Guild.create_channel`. You should use the explicit channel type creation methods added in this release |
|||
- Cleaned up various documentation |
|||
- Removed some outdated storage/etc examples |
|||
- Expanded `APIClient.guilds_roles_create` to handle more attributes |
|||
- Bumped various requirement versions |
|||
|
|||
## v0.0.11 |
|||
|
|||
### Additions |
|||
|
|||
- Added support for Guild audit logs, exposed via `Guild.get_audit_log_entries`, `Guild.audit_log` and `Guild.audit_log_iter`. For more information see the `AuditLogEntry` model |
|||
- Added built-in Flask HTTP server which can be enabled via `http_enabled` and configured via `http_host`/`http_port` config options. The server allows plugins to define routes which can be called externally. |
|||
- Added support for capturing the raw responses returned from API requests via the `APIClient.capture` contextmanager |
|||
- Added support for NSFW channels via `Channel.nsfw` and `Channel.is_nsfw` |
|||
- Added initial support for channel categories via `Channel.parent_id` and `Channel.parent` |
|||
- Added various setters for updating Channel properties, e.g. `Channel.set_topic` |
|||
- Added support for audit log reasons, accessible through passing `reason` to various methods |
|||
- Added `disco.util.snowflake.from_timestamp_ms` |
|||
- Added support for `on_complete` callback within DCADOpusEncoderPlayable |
|||
- **BREAKING** Added new custom queue types `BaseQueue`/`PlayableQueue` for use w/ `Player`. |
|||
- `queue` can be passed when creating a `Player`, should inherit from BaseQueue |
|||
- Users who previously utilized the `put` method of the old `Player.queue` must move to using `Player.queue.append`, or providing a custom queue implementation. |
|||
- Added `Emoji.custom` property |
|||
|
|||
### Fixes |
|||
|
|||
- Fixed GuildRoleCreate missing guild\_id, resulting in incorrect state |
|||
- Fixed SimpleLimiter behaving incorrectly (causing GW socket to be ratelimited in some cases) |
|||
- Fixed the shortest possible match for a single command being an empty string |
|||
- Fixed group matching being overly greedy, which allowed for extra characters to be allowed at the end of a group match |
|||
- Fixed errors thrown when not enabling manhole via cli |
|||
- Fixed various warnings emitted due to useage of StopIteration |
|||
- Fixed warnings about missing voice libs when importing `disco.types.channel` |
|||
- Fixed `Bot.get_commands_for_message` returning None (instead of empty list) in some cases |
|||
|
|||
### Etc |
|||
|
|||
- Greatly imrpoved the performance of `HashMap` |
|||
- **BREAKING** Increased the weight of group matches over command argument matches, and limited the number of commands executed per message to one. |
|||
- Reuse a buffer in voice code to slightly improve performance |
|||
|
|||
## v0.0.11-rc.8 |
|||
|
|||
### Additions |
|||
|
|||
- Added support for capturing the raw responses returned from the API via `APIClient.capture` contextmanager |
|||
- Added various pieces of documentation |
|||
|
|||
### Fixes |
|||
|
|||
- Fixed Python 3 errors and Python 2 deprecation warnings for CommandError using `.message` attribute |
|||
|
|||
### ETC |
|||
|
|||
- Grealty improved the performance of the custom HashMap |
|||
- Moved tests around and added pytest as the testing framework of choice |
|||
|
|||
|
|||
## v0.0.11-rc.7 |
|||
|
|||
### Additions |
|||
|
|||
- Added support for new NSFW attribute of channels |
|||
- `Channel.nsfw` |
|||
- `Channel.set_nsfw` |
|||
- `Channel.is_nsfw` behaves correctly, checking both the deprecated `nsfw-` prefix and the new attribute |
|||
- Added support for `on_complete` callback within DCADOpusEncoderPlayable |
|||
- **BREAKING** Added new custom queue types `BaseQueue`/`PlayableQueue` for use w/ `Player`. |
|||
- `queue` can be passed when creating a `Player`, should inherit from BaseQueue |
|||
- Users who previously utilized the `put` method of the old `Player.queue` must move to using `Player.queue.append`, or providing a custom queue implementation. |
|||
|
|||
### Fixes |
|||
|
|||
- Fixed bug within SimpleLimiter which would cause all events after a quiescent period to be immedietly dispatched. This would cause gateway disconnects w/ RATE\_LIMITED on clients with many Guilds and member sync enabled. |
|||
|
|||
### ETC |
|||
|
|||
- Improved log messages within GatewayClient |
|||
- Log voice endpoint within VoiceClient |
@ -1,2 +0,0 @@ |
|||
include README.md |
|||
include requirements.txt |
@ -1,57 +0,0 @@ |
|||
# disco |
|||
|
|||
[](https://pypi.python.org/pypi/disco-py/) |
|||
[](https://pypi.python.org/pypi/disco-py/) |
|||
[](https://travis-ci.org/b1naryth1ef/disco/) |
|||
|
|||
Disco is an extensive and extendable Python 2.x/3.x library for the [Discord API](https://discord.com/developers/docs/intro). Disco boasts the following major features: |
|||
|
|||
- Expressive, functional interface that gets out of the way |
|||
- Built for high-performance and efficiency |
|||
- Configurable and modular, take the bits you need |
|||
- Full support for Python 2.x/3.x |
|||
- Evented networking and IO using Gevent |
|||
|
|||
## Installation |
|||
|
|||
Disco was built to run both as a generic-use library, and a standalone bot toolkit. Installing disco is as easy as running `pip install disco-py`, however some extra packages are recommended for power-users, namely: |
|||
|
|||
|Name|Reason| |
|||
|----|------| |
|||
|requests[security]|adds packages for a proper SSL implementation| |
|||
|ujson|faster json parser, improves performance| |
|||
|erlpack (2.x), earl-etf (3.x)|ETF parser run with the --encoder=etf flag| |
|||
|gipc|Gevent IPC, required for autosharding| |
|||
|
|||
## Examples |
|||
|
|||
Simple bot using the builtin bot authoring tools: |
|||
|
|||
```python |
|||
from disco.bot import Bot, Plugin |
|||
|
|||
|
|||
class SimplePlugin(Plugin): |
|||
# Plugins provide an easy interface for listening to Discord events |
|||
@Plugin.listen('ChannelCreate') |
|||
def on_channel_create(self, event): |
|||
event.channel.send_message('Woah, a new channel huh!') |
|||
|
|||
# They also provide an easy-to-use command component |
|||
@Plugin.command('ping') |
|||
def on_ping_command(self, event): |
|||
event.msg.reply('Pong!') |
|||
|
|||
# Which includes command argument parsing |
|||
@Plugin.command('echo', '<content:str...>') |
|||
def on_echo_command(self, event, content): |
|||
event.msg.reply(content) |
|||
``` |
|||
|
|||
Using the default bot configuration, we can now run this script like so: |
|||
|
|||
`python -m disco.cli --token="MY_DISCORD_TOKEN" --run-bot --plugin simpleplugin` |
|||
|
|||
And commands can be triggered by mentioning the bot (configured by the BotConfig.command\_require\_mention flag): |
|||
|
|||
 |
@ -0,0 +1,18 @@ |
|||
# disco |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Constants |
|||
|
|||
|
|||
{'type': 'assign', 'targets': ['VERSION'], 'value': '0.0.12'} |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,11 @@ |
|||
# disco.api |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -0,0 +1,18 @@ |
|||
# disco.bot |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Constants |
|||
|
|||
|
|||
{'type': 'assign', 'targets': ['__all__'], 'value': {'elts': ['Bot', 'BotConfig', 'Plugin', 'Config', 'CommandLevels'], 'type': 'list'}} |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -0,0 +1,25 @@ |
|||
# disco.bot.providers |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Functions |
|||
|
|||
#### load_provider(name) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,280 @@ |
|||
# disco.bot.providers.base |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Constants |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### BaseProvider |
|||
|
|||
|
|||
_Inherits From _ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
#### __init__(self, config) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### exists(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### keys(self, other) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### get_many(self, keys) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### get(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### set(self, key, value) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### delete(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### load(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### save(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### root(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
### StorageDict |
|||
|
|||
|
|||
_Inherits From `UserDict`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
#### __init__(self, parent_or_provider, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### keys(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### values(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### items(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### ensure(self, key, typ) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### update(self, obj) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### data(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### key(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __setitem__(self, key, value) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __getitem__(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __delitem__(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __contains__(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Functions |
|||
|
|||
#### join_key(`*args`) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### true_key(key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,96 @@ |
|||
# disco.bot.providers.disk |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### DiskProvider |
|||
|
|||
|
|||
_Inherits From `BaseProvider`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
#### __init__(self, config) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### autosave_loop(self, interval) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### _on_change(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### load(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### save(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### set(self, key, value) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### delete(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,30 @@ |
|||
# disco.bot.providers.memory |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### MemoryProvider |
|||
|
|||
|
|||
_Inherits From `BaseProvider`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,105 @@ |
|||
# disco.bot.providers.redis |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### RedisProvider |
|||
|
|||
|
|||
_Inherits From `BaseProvider`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
#### __init__(self, config) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### load(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### exists(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### keys(self, other) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### get_many(self, keys) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### get(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### set(self, key, value) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### delete(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,114 @@ |
|||
# disco.bot.providers.rocksdb |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### RocksDBProvider |
|||
|
|||
|
|||
_Inherits From `BaseProvider`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
#### __init__(self, config) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### k(k) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### load(self) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### exists(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### keys(self, other) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### get_many(self, keys) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### get(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### set(self, key, value) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### delete(self, key) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because one or more lines are too long
@ -0,0 +1,11 @@ |
|||
# disco.gateway |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
@ -0,0 +1,18 @@ |
|||
# disco.gateway.encoding |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Constants |
|||
|
|||
|
|||
{'type': 'assign', 'targets': ['ENCODERS'], 'value': {'keys': ['json'], 'values': ['JSONEncoder'], 'type': 'dict'}} |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,48 @@ |
|||
# disco.gateway.encoding.base |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### BaseEncoder |
|||
|
|||
|
|||
_Inherits From `Interface`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### encode(<code>obj</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### decode(<code>obj</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,48 @@ |
|||
# disco.gateway.encoding.etf |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### ETFEncoder |
|||
|
|||
|
|||
_Inherits From `BaseEncoder`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### encode(<code>obj</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### decode(<code>obj</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,48 @@ |
|||
# disco.gateway.encoding.json |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### JSONEncoder |
|||
|
|||
|
|||
_Inherits From `BaseEncoder`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### encode(<code>obj</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### decode(<code>obj</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -0,0 +1,18 @@ |
|||
# disco.types |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Constants |
|||
|
|||
|
|||
{'type': 'assign', 'targets': ['__all__'], 'value': {'elts': ['UNSET', 'Channel', 'Guild', 'GuildMember', 'Role', 'User', 'Message', 'VoiceState'], 'type': 'set'}} |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -0,0 +1,74 @@ |
|||
# disco.util.paginator |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### Paginator |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
Implements a class which provides paginated iteration over an endpoint. |
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### __init__(<code>self, <code>func, <code>sort_key,\*args,\*\*kwargs</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### fill(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### next(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __iter__(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __next__(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
@ -0,0 +1,121 @@ |
|||
# disco.util.serializer |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### Serializer |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### check_format(<code>cls, <code>fmt</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### json(</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### yaml(</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### pickle(</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### loads(<code>cls, <code>fmt, <code>raw</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### dumps(<code>cls, <code>fmt, <code>raw</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Functions |
|||
|
|||
|
|||
|
|||
#### dump_cell(<code>cell</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### load_cell(<code>cell</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### dump_function(<code>func</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### load_function(<code>args</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
@ -0,0 +1,23 @@ |
|||
# disco.util.string |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Functions |
|||
|
|||
|
|||
|
|||
#### underscore(<code>word</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -0,0 +1,11 @@ |
|||
# disco.voice |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -0,0 +1,134 @@ |
|||
# disco.voice.queue |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### BaseQueue |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### get(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
### PlayableQueue |
|||
|
|||
|
|||
_Inherits From `BaseQueue`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### __init__(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### append(<code>self, <code>item</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### _get(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### get(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### shuffle(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### clear(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __len__(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __iter__(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### __nonzero__(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
@ -0,0 +1,113 @@ |
|||
# disco.voice.udp |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Constants |
|||
|
|||
|
|||
{'type': 'assign', 'targets': ['MAX_TIMESTAMP'], 'value': 4294967295} |
|||
|
|||
|
|||
|
|||
{'type': 'assign', 'targets': ['MAX_SEQUENCE'], 'value': 65535} |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
## Classes |
|||
|
|||
### UDPVoiceClient |
|||
|
|||
|
|||
_Inherits From `LoggingClass`_ |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### Functions |
|||
|
|||
|
|||
|
|||
#### __init__(<code>self, <code>vc</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### increment_timestamp(<code>self, <code>by</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### setup_encryption(<code>self, <code>encryption_key</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### send_frame(<code>self, <code>frame, <code>sequence, <code>timestamp=None, <code>incr_timestamp=None</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### run(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### send(<code>self, <code>data</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### disconnect(<code>self</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
#### connect(<code>self, <code>host, <code>port, <code>timeout, <code>addrinfo=None</code>) |
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
|||
|
File diff suppressed because one or more lines are too long
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -1 +0,0 @@ |
|||
VERSION = '0.0.14-rc.1' |
@ -1,562 +0,0 @@ |
|||
import six |
|||
import json |
|||
import warnings |
|||
|
|||
from contextlib import contextmanager |
|||
from gevent.local import local |
|||
from six.moves.urllib.parse import quote |
|||
|
|||
from holster.enum import EnumAttr |
|||
from disco.api.http import Routes, HTTPClient, to_bytes |
|||
from disco.util.logging import LoggingClass |
|||
from disco.util.sanitize import S |
|||
from disco.types.user import User |
|||
from disco.types.message import Message |
|||
from disco.types.guild import Guild, GuildMember, GuildBan, Role, GuildEmoji, AuditLogEntry |
|||
from disco.types.channel import Channel |
|||
from disco.types.invite import Invite |
|||
from disco.types.webhook import Webhook |
|||
|
|||
|
|||
def optional(**kwargs): |
|||
""" |
|||
Takes a set of keyword arguments, creating a dictionary with only the non- |
|||
null values. |
|||
|
|||
:returns: dict |
|||
""" |
|||
return {k: v for k, v in six.iteritems(kwargs) if v is not None} |
|||
|
|||
|
|||
def _reason_header(value): |
|||
return optional(**{'X-Audit-Log-Reason': quote(to_bytes(value)) if value else None}) |
|||
|
|||
|
|||
class Responses(list): |
|||
def rate_limited_duration(self): |
|||
return sum(i.rate_limited_duration for i in self) |
|||
|
|||
@property |
|||
def rate_limited(self): |
|||
return self.rate_limited_duration() != 0 |
|||
|
|||
|
|||
class APIClient(LoggingClass): |
|||
""" |
|||
An abstraction over a :class:`disco.api.http.HTTPClient`, which composes |
|||
requests from provided data, and fits models with the returned data. The APIClient |
|||
is the only path to the API used within models/other interfaces, and it's |
|||
the recommended path for all third-party users/implementations. |
|||
|
|||
Args |
|||
---- |
|||
token : str |
|||
The Discord authentication token (without prefixes) to be used for all |
|||
HTTP requests. |
|||
client : Optional[:class:`disco.client.Client`] |
|||
The Disco client this APIClient is a member of. This is used when constructing |
|||
and fitting models from response data. |
|||
|
|||
Attributes |
|||
---------- |
|||
client : Optional[:class:`disco.client.Client`] |
|||
The Disco client this APIClient is a member of. |
|||
http : :class:`disco.http.HTTPClient` |
|||
The HTTPClient this APIClient uses for all requests. |
|||
""" |
|||
def __init__(self, token, client=None): |
|||
super(APIClient, self).__init__() |
|||
|
|||
self.client = client |
|||
self.http = HTTPClient(token, self._after_requests) |
|||
|
|||
self._captures = local() |
|||
|
|||
def _after_requests(self, response): |
|||
if not hasattr(self._captures, 'responses'): |
|||
return |
|||
|
|||
self._captures.responses.append(response) |
|||
|
|||
@contextmanager |
|||
def capture(self): |
|||
""" |
|||
Context manager which captures all requests made, returning a special |
|||
`Responses` list, which can be used to introspect raw API responses. This |
|||
method is a low-level utility which should only be used by experienced users. |
|||
""" |
|||
responses = Responses() |
|||
self._captures.responses = responses |
|||
|
|||
try: |
|||
yield responses |
|||
finally: |
|||
delattr(self._captures, 'responses') |
|||
|
|||
def gateway_get(self): |
|||
data = self.http(Routes.GATEWAY_GET).json() |
|||
return data |
|||
|
|||
def gateway_bot_get(self): |
|||
data = self.http(Routes.GATEWAY_BOT_GET).json() |
|||
return data |
|||
|
|||
def channels_get(self, channel): |
|||
r = self.http(Routes.CHANNELS_GET, dict(channel=channel)) |
|||
return Channel.create(self.client, r.json()) |
|||
|
|||
def channels_modify(self, channel, reason=None, **kwargs): |
|||
r = self.http( |
|||
Routes.CHANNELS_MODIFY, |
|||
dict(channel=channel), |
|||
json=kwargs, |
|||
headers=_reason_header(reason)) |
|||
return Channel.create(self.client, r.json()) |
|||
|
|||
def channels_delete(self, channel, reason=None): |
|||
r = self.http( |
|||
Routes.CHANNELS_DELETE, |
|||
dict(channel=channel), |
|||
headers=_reason_header(reason)) |
|||
return Channel.create(self.client, r.json()) |
|||
|
|||
def channels_typing(self, channel): |
|||
self.http(Routes.CHANNELS_TYPING, dict(channel=channel)) |
|||
|
|||
def channels_messages_list(self, channel, around=None, before=None, after=None, limit=50): |
|||
r = self.http(Routes.CHANNELS_MESSAGES_LIST, dict(channel=channel), params=optional( |
|||
around=around, |
|||
before=before, |
|||
after=after, |
|||
limit=limit, |
|||
)) |
|||
|
|||
return Message.create_map(self.client, r.json()) |
|||
|
|||
def channels_messages_get(self, channel, message): |
|||
r = self.http(Routes.CHANNELS_MESSAGES_GET, dict(channel=channel, message=message)) |
|||
return Message.create(self.client, r.json()) |
|||
|
|||
def channels_messages_create( |
|||
self, |
|||
channel, |
|||
content=None, |
|||
nonce=None, |
|||
tts=False, |
|||
attachment=None, |
|||
attachments=[], |
|||
embed=None, |
|||
sanitize=False): |
|||
|
|||
payload = { |
|||
'nonce': nonce, |
|||
'tts': tts, |
|||
} |
|||
|
|||
if attachment: |
|||
attachments = [attachment] |
|||
warnings.warn( |
|||
'attachment kwarg has been deprecated, switch to using attachments with a list', |
|||
DeprecationWarning) |
|||
|
|||
if content: |
|||
if sanitize: |
|||
content = S(content) |
|||
payload['content'] = content |
|||
|
|||
if embed: |
|||
payload['embed'] = embed.to_dict() |
|||
|
|||
if attachments: |
|||
if len(attachments) > 1: |
|||
files = { |
|||
'file{}'.format(idx): tuple(i) for idx, i in enumerate(attachments) |
|||
} |
|||
else: |
|||
files = { |
|||
'file': tuple(attachments[0]), |
|||
} |
|||
|
|||
r = self.http( |
|||
Routes.CHANNELS_MESSAGES_CREATE, |
|||
dict(channel=channel), |
|||
data={'payload_json': json.dumps(payload)}, |
|||
files=files, |
|||
) |
|||
else: |
|||
r = self.http(Routes.CHANNELS_MESSAGES_CREATE, dict(channel=channel), json=payload) |
|||
|
|||
return Message.create(self.client, r.json()) |
|||
|
|||
def channels_messages_modify(self, channel, message, content=None, embed=None, sanitize=False): |
|||
payload = {} |
|||
|
|||
if content is not None: |
|||
if sanitize: |
|||
content = S(content) |
|||
payload['content'] = content |
|||
|
|||
if embed: |
|||
payload['embed'] = embed.to_dict() |
|||
|
|||
r = self.http(Routes.CHANNELS_MESSAGES_MODIFY, |
|||
dict(channel=channel, message=message), |
|||
json=payload) |
|||
return Message.create(self.client, r.json()) |
|||
|
|||
def channels_messages_delete(self, channel, message): |
|||
self.http(Routes.CHANNELS_MESSAGES_DELETE, dict(channel=channel, message=message)) |
|||
|
|||
def channels_messages_delete_bulk(self, channel, messages): |
|||
self.http(Routes.CHANNELS_MESSAGES_DELETE_BULK, dict(channel=channel), json={'messages': messages}) |
|||
|
|||
def channels_messages_reactions_get(self, channel, message, emoji, after=None, limit=100): |
|||
r = self.http( |
|||
Routes.CHANNELS_MESSAGES_REACTIONS_GET, |
|||
dict(channel=channel, message=message, emoji=emoji), |
|||
params={'after': after, 'limit': limit}) |
|||
return User.create_map(self.client, r.json()) |
|||
|
|||
def channels_messages_reactions_create(self, channel, message, emoji): |
|||
self.http(Routes.CHANNELS_MESSAGES_REACTIONS_CREATE, dict(channel=channel, message=message, emoji=emoji)) |
|||
|
|||
def channels_messages_reactions_delete(self, channel, message, emoji, user=None): |
|||
route = Routes.CHANNELS_MESSAGES_REACTIONS_DELETE_ME |
|||
obj = dict(channel=channel, message=message, emoji=emoji) |
|||
|
|||
if user: |
|||
route = Routes.CHANNELS_MESSAGES_REACTIONS_DELETE_USER |
|||
obj['user'] = user |
|||
|
|||
self.http(route, obj) |
|||
|
|||
def channels_messages_reactions_delete_emoji(self, channel, message, emoji): |
|||
self.http(Routes.CHANNELS_MESSAGES_REACTIONS_DELETE_EMOJI, dict(channel=channel, message=message, emoji=emoji)) |
|||
|
|||
def channels_permissions_modify(self, channel, permission, allow, deny, typ, reason=None): |
|||
self.http(Routes.CHANNELS_PERMISSIONS_MODIFY, dict(channel=channel, permission=permission), json={ |
|||
'allow': allow, |
|||
'deny': deny, |
|||
'type': typ, |
|||
}, headers=_reason_header(reason)) |
|||
|
|||
def channels_permissions_delete(self, channel, permission, reason=None): |
|||
self.http( |
|||
Routes.CHANNELS_PERMISSIONS_DELETE, |
|||
dict(channel=channel, permission=permission), headers=_reason_header(reason), |
|||
) |
|||
|
|||
def channels_invites_list(self, channel): |
|||
r = self.http(Routes.CHANNELS_INVITES_LIST, dict(channel=channel)) |
|||
return Invite.create_map(self.client, r.json()) |
|||
|
|||
def channels_invites_create(self, channel, max_age=86400, max_uses=0, temporary=False, unique=False, reason=None): |
|||
r = self.http(Routes.CHANNELS_INVITES_CREATE, dict(channel=channel), json={ |
|||
'max_age': max_age, |
|||
'max_uses': max_uses, |
|||
'temporary': temporary, |
|||
'unique': unique, |
|||
}, headers=_reason_header(reason)) |
|||
return Invite.create(self.client, r.json()) |
|||
|
|||
def channels_pins_list(self, channel): |
|||
r = self.http(Routes.CHANNELS_PINS_LIST, dict(channel=channel)) |
|||
return Message.create_map(self.client, r.json()) |
|||
|
|||
def channels_pins_create(self, channel, message): |
|||
self.http(Routes.CHANNELS_PINS_CREATE, dict(channel=channel, message=message)) |
|||
|
|||
def channels_pins_delete(self, channel, message): |
|||
self.http(Routes.CHANNELS_PINS_DELETE, dict(channel=channel, message=message)) |
|||
|
|||
def channels_webhooks_create(self, channel, name=None, avatar=None): |
|||
r = self.http(Routes.CHANNELS_WEBHOOKS_CREATE, dict(channel=channel), json=optional( |
|||
name=name, |
|||
avatar=avatar, |
|||
)) |
|||
return Webhook.create(self.client, r.json()) |
|||
|
|||
def channels_webhooks_list(self, channel): |
|||
r = self.http(Routes.CHANNELS_WEBHOOKS_LIST, dict(channel=channel)) |
|||
return Webhook.create_map(self.client, r.json()) |
|||
|
|||
def guilds_get(self, guild): |
|||
r = self.http(Routes.GUILDS_GET, dict(guild=guild)) |
|||
return Guild.create(self.client, r.json()) |
|||
|
|||
def guilds_modify(self, guild, reason=None, **kwargs): |
|||
r = self.http(Routes.GUILDS_MODIFY, dict(guild=guild), json=kwargs, headers=_reason_header(reason)) |
|||
return Guild.create(self.client, r.json()) |
|||
|
|||
def guilds_delete(self, guild): |
|||
r = self.http(Routes.GUILDS_DELETE, dict(guild=guild)) |
|||
return Guild.create(self.client, r.json()) |
|||
|
|||
def guilds_channels_list(self, guild): |
|||
r = self.http(Routes.GUILDS_CHANNELS_LIST, dict(guild=guild)) |
|||
return Channel.create_hash(self.client, 'id', r.json(), guild_id=guild) |
|||
|
|||
def guilds_channels_create( |
|||
self, |
|||
guild, |
|||
channel_type, |
|||
name, |
|||
bitrate=None, |
|||
user_limit=None, |
|||
permission_overwrites=[], |
|||
nsfw=None, |
|||
parent_id=None, |
|||
position=None, |
|||
reason=None): |
|||
|
|||
payload = { |
|||
'name': name, |
|||
'type': channel_type.value if isinstance(channel_type, EnumAttr) else channel_type, |
|||
'permission_overwrites': [i.to_dict() for i in permission_overwrites], |
|||
'parent_id': parent_id, |
|||
} |
|||
|
|||
payload.update(optional( |
|||
nsfw=nsfw, |
|||
bitrate=bitrate, |
|||
user_limit=user_limit, |
|||
position=position, |
|||
)) |
|||
|
|||
r = self.http( |
|||
Routes.GUILDS_CHANNELS_CREATE, |
|||
dict(guild=guild), |
|||
json=payload, |
|||
headers=_reason_header(reason)) |
|||
return Channel.create(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_channels_modify(self, guild, channel, position, reason=None): |
|||
self.http(Routes.GUILDS_CHANNELS_MODIFY, dict(guild=guild), json={ |
|||
'id': channel, |
|||
'position': position, |
|||
}, headers=_reason_header(reason)) |
|||
|
|||
def guilds_members_list(self, guild, limit=1000, after=None): |
|||
r = self.http(Routes.GUILDS_MEMBERS_LIST, dict(guild=guild), params=optional( |
|||
limit=limit, |
|||
after=after, |
|||
)) |
|||
return GuildMember.create_hash(self.client, 'id', r.json(), guild_id=guild) |
|||
|
|||
def guilds_members_get(self, guild, member): |
|||
r = self.http(Routes.GUILDS_MEMBERS_GET, dict(guild=guild, member=member)) |
|||
return GuildMember.create(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_members_modify(self, guild, member, reason=None, **kwargs): |
|||
self.http( |
|||
Routes.GUILDS_MEMBERS_MODIFY, |
|||
dict(guild=guild, member=member), |
|||
json=kwargs, |
|||
headers=_reason_header(reason)) |
|||
|
|||
def guilds_members_roles_add(self, guild, member, role, reason=None): |
|||
self.http( |
|||
Routes.GUILDS_MEMBERS_ROLES_ADD, |
|||
dict(guild=guild, member=member, role=role), |
|||
headers=_reason_header(reason)) |
|||
|
|||
def guilds_members_roles_remove(self, guild, member, role, reason=None): |
|||
self.http( |
|||
Routes.GUILDS_MEMBERS_ROLES_REMOVE, |
|||
dict(guild=guild, member=member, role=role), |
|||
headers=_reason_header(reason)) |
|||
|
|||
def guilds_members_me_nick(self, guild, nick): |
|||
self.http(Routes.GUILDS_MEMBERS_ME_NICK, dict(guild=guild), json={'nick': nick}) |
|||
|
|||
def guilds_members_kick(self, guild, member, reason=None): |
|||
self.http(Routes.GUILDS_MEMBERS_KICK, dict(guild=guild, member=member), headers=_reason_header(reason)) |
|||
|
|||
def guilds_bans_list(self, guild): |
|||
r = self.http(Routes.GUILDS_BANS_LIST, dict(guild=guild)) |
|||
return GuildBan.create_hash(self.client, 'user.id', r.json()) |
|||
|
|||
def guilds_bans_create(self, guild, user, delete_message_days=0, reason=None): |
|||
self.http(Routes.GUILDS_BANS_CREATE, dict(guild=guild, user=user), params={ |
|||
'delete-message-days': delete_message_days, |
|||
'reason': reason, |
|||
}, headers=_reason_header(reason)) |
|||
|
|||
def guilds_bans_delete(self, guild, user, reason=None): |
|||
self.http( |
|||
Routes.GUILDS_BANS_DELETE, |
|||
dict(guild=guild, user=user), |
|||
headers=_reason_header(reason)) |
|||
|
|||
def guilds_roles_list(self, guild): |
|||
r = self.http(Routes.GUILDS_ROLES_LIST, dict(guild=guild)) |
|||
return Role.create_map(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_roles_create( |
|||
self, |
|||
guild, |
|||
name=None, |
|||
permissions=None, |
|||
color=None, |
|||
hoist=None, |
|||
mentionable=None, |
|||
reason=None): |
|||
|
|||
r = self.http( |
|||
Routes.GUILDS_ROLES_CREATE, |
|||
dict(guild=guild), |
|||
json=optional( |
|||
name=name, |
|||
permissions=permissions, |
|||
color=color, |
|||
hoist=hoist, |
|||
mentionable=mentionable, |
|||
), |
|||
headers=_reason_header(reason)) |
|||
return Role.create(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_roles_modify_batch(self, guild, roles, reason=None): |
|||
r = self.http(Routes.GUILDS_ROLES_MODIFY_BATCH, dict(guild=guild), json=roles, headers=_reason_header(reason)) |
|||
return Role.create_map(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_roles_modify( |
|||
self, |
|||
guild, |
|||
role, |
|||
name=None, |
|||
hoist=None, |
|||
color=None, |
|||
permissions=None, |
|||
position=None, |
|||
mentionable=None, |
|||
reason=None): |
|||
|
|||
r = self.http( |
|||
Routes.GUILDS_ROLES_MODIFY, |
|||
dict(guild=guild, role=role), |
|||
json=optional( |
|||
name=name, |
|||
hoist=hoist, |
|||
color=color, |
|||
permissions=permissions, |
|||
position=position, |
|||
mentionable=mentionable, |
|||
), |
|||
headers=_reason_header(reason)) |
|||
return Role.create(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_roles_delete(self, guild, role, reason=None): |
|||
self.http(Routes.GUILDS_ROLES_DELETE, dict(guild=guild, role=role), headers=_reason_header(reason)) |
|||
|
|||
def guilds_invites_list(self, guild): |
|||
r = self.http(Routes.GUILDS_INVITES_LIST, dict(guild=guild)) |
|||
return Invite.create_map(self.client, r.json()) |
|||
|
|||
def guilds_webhooks_list(self, guild): |
|||
r = self.http(Routes.GUILDS_WEBHOOKS_LIST, dict(guild=guild)) |
|||
return Webhook.create_map(self.client, r.json()) |
|||
|
|||
def guilds_emojis_list(self, guild): |
|||
r = self.http(Routes.GUILDS_EMOJIS_LIST, dict(guild=guild)) |
|||
return GuildEmoji.create_map(self.client, r.json()) |
|||
|
|||
def guilds_emojis_create(self, guild, reason=None, **kwargs): |
|||
r = self.http( |
|||
Routes.GUILDS_EMOJIS_CREATE, |
|||
dict(guild=guild), |
|||
json=kwargs, |
|||
headers=_reason_header(reason)) |
|||
return GuildEmoji.create(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_emojis_modify(self, guild, emoji, reason=None, **kwargs): |
|||
r = self.http( |
|||
Routes.GUILDS_EMOJIS_MODIFY, |
|||
dict(guild=guild, emoji=emoji), |
|||
json=kwargs, |
|||
headers=_reason_header(reason)) |
|||
return GuildEmoji.create(self.client, r.json(), guild_id=guild) |
|||
|
|||
def guilds_emojis_delete(self, guild, emoji, reason=None): |
|||
self.http( |
|||
Routes.GUILDS_EMOJIS_DELETE, |
|||
dict(guild=guild, emoji=emoji), |
|||
headers=_reason_header(reason)) |
|||
|
|||
def guilds_auditlogs_list(self, guild, before=None, user_id=None, action_type=None, limit=50): |
|||
r = self.http(Routes.GUILDS_AUDITLOGS_LIST, dict(guild=guild), params=optional( |
|||
before=before, |
|||
user_id=user_id, |
|||
action_type=int(action_type) if action_type else None, |
|||
limit=limit, |
|||
)) |
|||
|
|||
data = r.json() |
|||
|
|||
users = User.create_hash(self.client, 'id', data['users']) |
|||
webhooks = Webhook.create_hash(self.client, 'id', data['webhooks']) |
|||
return AuditLogEntry.create_map(self.client, r.json()['audit_log_entries'], users, webhooks, guild_id=guild) |
|||
|
|||
def users_get(self, user): |
|||
r = self.http(Routes.USERS_GET, dict(user=user)) |
|||
return User.create(self.client, r.json()) |
|||
|
|||
def users_me_get(self): |
|||
return User.create(self.client, self.http(Routes.USERS_ME_GET).json()) |
|||
|
|||
def users_me_patch(self, payload): |
|||
r = self.http(Routes.USERS_ME_PATCH, json=payload) |
|||
return User.create(self.client, r.json()) |
|||
|
|||
def users_me_guilds_delete(self, guild): |
|||
self.http(Routes.USERS_ME_GUILDS_DELETE, dict(guild=guild)) |
|||
|
|||
def users_me_dms_create(self, recipient_id): |
|||
r = self.http(Routes.USERS_ME_DMS_CREATE, json={ |
|||
'recipient_id': recipient_id, |
|||
}) |
|||
return Channel.create(self.client, r.json()) |
|||
|
|||
def invites_get(self, invite): |
|||
r = self.http(Routes.INVITES_GET, dict(invite=invite)) |
|||
return Invite.create(self.client, r.json()) |
|||
|
|||
def invites_delete(self, invite, reason=None): |
|||
r = self.http(Routes.INVITES_DELETE, dict(invite=invite), headers=_reason_header(reason)) |
|||
return Invite.create(self.client, r.json()) |
|||
|
|||
def webhooks_get(self, webhook): |
|||
r = self.http(Routes.WEBHOOKS_GET, dict(webhook=webhook)) |
|||
return Webhook.create(self.client, r.json()) |
|||
|
|||
def webhooks_modify(self, webhook, name=None, avatar=None, reason=None): |
|||
r = self.http(Routes.WEBHOOKS_MODIFY, dict(webhook=webhook), json=optional( |
|||
name=name, |
|||
avatar=avatar, |
|||
), headers=_reason_header(reason)) |
|||
return Webhook.create(self.client, r.json()) |
|||
|
|||
def webhooks_delete(self, webhook, reason=None): |
|||
self.http(Routes.WEBHOOKS_DELETE, dict(webhook=webhook), headers=_reason_header(reason)) |
|||
|
|||
def webhooks_token_get(self, webhook, token): |
|||
r = self.http(Routes.WEBHOOKS_TOKEN_GET, dict(webhook=webhook, token=token)) |
|||
return Webhook.create(self.client, r.json()) |
|||
|
|||
def webhooks_token_modify(self, webhook, token, name=None, avatar=None): |
|||
r = self.http(Routes.WEBHOOKS_TOKEN_MODIFY, dict(webhook=webhook, token=token), json=optional( |
|||
name=name, |
|||
avatar=avatar, |
|||
)) |
|||
return Webhook.create(self.client, r.json()) |
|||
|
|||
def webhooks_token_delete(self, webhook, token): |
|||
self.http(Routes.WEBHOOKS_TOKEN_DELETE, dict(webhook=webhook, token=token)) |
|||
|
|||
def webhooks_token_execute(self, webhook, token, data, wait=False): |
|||
obj = self.http( |
|||
Routes.WEBHOOKS_TOKEN_EXECUTE, |
|||
dict(webhook=webhook, token=token), |
|||
json=optional(**data), params={'wait': int(wait)}) |
|||
|
|||
if wait: |
|||
return Message.create(self.client, obj.json()) |
@ -1,320 +0,0 @@ |
|||
import requests |
|||
import random |
|||
import gevent |
|||
import six |
|||
import sys |
|||
|
|||
from holster.enum import Enum |
|||
|
|||
from disco import VERSION as disco_version |
|||
from requests import __version__ as requests_version |
|||
from disco.util.logging import LoggingClass |
|||
from disco.api.ratelimit import RateLimiter |
|||
|
|||
# Enum of all HTTP methods used |
|||
HTTPMethod = Enum( |
|||
GET='GET', |
|||
POST='POST', |
|||
PUT='PUT', |
|||
PATCH='PATCH', |
|||
DELETE='DELETE', |
|||
) |
|||
|
|||
|
|||
def to_bytes(obj): |
|||
if six.PY2: |
|||
if isinstance(obj, six.text_type): |
|||
return obj.encode('utf-8') |
|||
return obj |
|||
|
|||
|
|||
class Routes(object): |
|||
""" |
|||
Simple Python object-enum of all method/url route combinations available to |
|||
this client. |
|||
""" |
|||
# Gateway |
|||
GATEWAY_GET = (HTTPMethod.GET, '/gateway') |
|||
GATEWAY_BOT_GET = (HTTPMethod.GET, '/gateway/bot') |
|||
|
|||
# Channels |
|||
CHANNELS = '/channels/{channel}' |
|||
CHANNELS_GET = (HTTPMethod.GET, CHANNELS) |
|||
CHANNELS_MODIFY = (HTTPMethod.PATCH, CHANNELS) |
|||
CHANNELS_DELETE = (HTTPMethod.DELETE, CHANNELS) |
|||
CHANNELS_TYPING = (HTTPMethod.POST, CHANNELS + '/typing') |
|||
CHANNELS_MESSAGES_LIST = (HTTPMethod.GET, CHANNELS + '/messages') |
|||
CHANNELS_MESSAGES_GET = (HTTPMethod.GET, CHANNELS + '/messages/{message}') |
|||
CHANNELS_MESSAGES_CREATE = (HTTPMethod.POST, CHANNELS + '/messages') |
|||
CHANNELS_MESSAGES_MODIFY = (HTTPMethod.PATCH, CHANNELS + '/messages/{message}') |
|||
CHANNELS_MESSAGES_DELETE = (HTTPMethod.DELETE, CHANNELS + '/messages/{message}') |
|||
CHANNELS_MESSAGES_DELETE_BULK = (HTTPMethod.POST, CHANNELS + '/messages/bulk_delete') |
|||
CHANNELS_MESSAGES_REACTIONS_GET = (HTTPMethod.GET, CHANNELS + '/messages/{message}/reactions/{emoji}') |
|||
CHANNELS_MESSAGES_REACTIONS_CREATE = (HTTPMethod.PUT, CHANNELS + '/messages/{message}/reactions/{emoji}/@me') |
|||
CHANNELS_MESSAGES_REACTIONS_DELETE_ME = (HTTPMethod.DELETE, CHANNELS + '/messages/{message}/reactions/{emoji}/@me') |
|||
CHANNELS_MESSAGES_REACTIONS_DELETE_USER = (HTTPMethod.DELETE, |
|||
CHANNELS + '/messages/{message}/reactions/{emoji}/{user}') |
|||
CHANNELS_MESSAGES_REACTIONS_DELETE_EMOJI = (HTTPMethod.DELETE, |
|||
CHANNELS + '/messages/{message}/reactions/{emoji}') |
|||
CHANNELS_PERMISSIONS_MODIFY = (HTTPMethod.PUT, CHANNELS + '/permissions/{permission}') |
|||
CHANNELS_PERMISSIONS_DELETE = (HTTPMethod.DELETE, CHANNELS + '/permissions/{permission}') |
|||
CHANNELS_INVITES_LIST = (HTTPMethod.GET, CHANNELS + '/invites') |
|||
CHANNELS_INVITES_CREATE = (HTTPMethod.POST, CHANNELS + '/invites') |
|||
CHANNELS_PINS_LIST = (HTTPMethod.GET, CHANNELS + '/pins') |
|||
CHANNELS_PINS_CREATE = (HTTPMethod.PUT, CHANNELS + '/pins/{message}') |
|||
CHANNELS_PINS_DELETE = (HTTPMethod.DELETE, CHANNELS + '/pins/{message}') |
|||
CHANNELS_WEBHOOKS_CREATE = (HTTPMethod.POST, CHANNELS + '/webhooks') |
|||
CHANNELS_WEBHOOKS_LIST = (HTTPMethod.GET, CHANNELS + '/webhooks') |
|||
|
|||
# Guilds |
|||
GUILDS = '/guilds/{guild}' |
|||
GUILDS_GET = (HTTPMethod.GET, GUILDS) |
|||
GUILDS_MODIFY = (HTTPMethod.PATCH, GUILDS) |
|||
GUILDS_DELETE = (HTTPMethod.DELETE, GUILDS) |
|||
GUILDS_CHANNELS_LIST = (HTTPMethod.GET, GUILDS + '/channels') |
|||
GUILDS_CHANNELS_CREATE = (HTTPMethod.POST, GUILDS + '/channels') |
|||
GUILDS_CHANNELS_MODIFY = (HTTPMethod.PATCH, GUILDS + '/channels') |
|||
GUILDS_MEMBERS_LIST = (HTTPMethod.GET, GUILDS + '/members') |
|||
GUILDS_MEMBERS_GET = (HTTPMethod.GET, GUILDS + '/members/{member}') |
|||
GUILDS_MEMBERS_MODIFY = (HTTPMethod.PATCH, GUILDS + '/members/{member}') |
|||
GUILDS_MEMBERS_ROLES_ADD = (HTTPMethod.PUT, GUILDS + '/members/{member}/roles/{role}') |
|||
GUILDS_MEMBERS_ROLES_REMOVE = (HTTPMethod.DELETE, GUILDS + '/members/{member}/roles/{role}') |
|||
GUILDS_MEMBERS_ME_NICK = (HTTPMethod.PATCH, GUILDS + '/members/@me/nick') |
|||
GUILDS_MEMBERS_KICK = (HTTPMethod.DELETE, GUILDS + '/members/{member}') |
|||
GUILDS_BANS_LIST = (HTTPMethod.GET, GUILDS + '/bans') |
|||
GUILDS_BANS_CREATE = (HTTPMethod.PUT, GUILDS + '/bans/{user}') |
|||
GUILDS_BANS_DELETE = (HTTPMethod.DELETE, GUILDS + '/bans/{user}') |
|||
GUILDS_ROLES_LIST = (HTTPMethod.GET, GUILDS + '/roles') |
|||
GUILDS_ROLES_CREATE = (HTTPMethod.POST, GUILDS + '/roles') |
|||
GUILDS_ROLES_MODIFY_BATCH = (HTTPMethod.PATCH, GUILDS + '/roles') |
|||
GUILDS_ROLES_MODIFY = (HTTPMethod.PATCH, GUILDS + '/roles/{role}') |
|||
GUILDS_ROLES_DELETE = (HTTPMethod.DELETE, GUILDS + '/roles/{role}') |
|||
GUILDS_PRUNE_COUNT = (HTTPMethod.GET, GUILDS + '/prune') |
|||
GUILDS_PRUNE_BEGIN = (HTTPMethod.POST, GUILDS + '/prune') |
|||
GUILDS_VOICE_REGIONS_LIST = (HTTPMethod.GET, GUILDS + '/regions') |
|||
GUILDS_INVITES_LIST = (HTTPMethod.GET, GUILDS + '/invites') |
|||
GUILDS_INTEGRATIONS_LIST = (HTTPMethod.GET, GUILDS + '/integrations') |
|||
GUILDS_INTEGRATIONS_CREATE = (HTTPMethod.POST, GUILDS + '/integrations') |
|||
GUILDS_INTEGRATIONS_MODIFY = (HTTPMethod.PATCH, GUILDS + '/integrations/{integration}') |
|||
GUILDS_INTEGRATIONS_DELETE = (HTTPMethod.DELETE, GUILDS + '/integrations/{integration}') |
|||
GUILDS_INTEGRATIONS_SYNC = (HTTPMethod.POST, GUILDS + '/integrations/{integration}/sync') |
|||
GUILDS_EMBED_GET = (HTTPMethod.GET, GUILDS + '/embed') |
|||
GUILDS_EMBED_MODIFY = (HTTPMethod.PATCH, GUILDS + '/embed') |
|||
GUILDS_WEBHOOKS_LIST = (HTTPMethod.GET, GUILDS + '/webhooks') |
|||
GUILDS_EMOJIS_LIST = (HTTPMethod.GET, GUILDS + '/emojis') |
|||
GUILDS_EMOJIS_CREATE = (HTTPMethod.POST, GUILDS + '/emojis') |
|||
GUILDS_EMOJIS_MODIFY = (HTTPMethod.PATCH, GUILDS + '/emojis/{emoji}') |
|||
GUILDS_EMOJIS_DELETE = (HTTPMethod.DELETE, GUILDS + '/emojis/{emoji}') |
|||
GUILDS_AUDITLOGS_LIST = (HTTPMethod.GET, GUILDS + '/audit-logs') |
|||
|
|||
# Users |
|||
USERS = '/users' |
|||
USERS_ME_GET = (HTTPMethod.GET, USERS + '/@me') |
|||
USERS_ME_PATCH = (HTTPMethod.PATCH, USERS + '/@me') |
|||
USERS_ME_GUILDS_LIST = (HTTPMethod.GET, USERS + '/@me/guilds') |
|||
USERS_ME_GUILDS_DELETE = (HTTPMethod.DELETE, USERS + '/@me/guilds/{guild}') |
|||
USERS_ME_DMS_LIST = (HTTPMethod.GET, USERS + '/@me/channels') |
|||
USERS_ME_DMS_CREATE = (HTTPMethod.POST, USERS + '/@me/channels') |
|||
USERS_ME_CONNECTIONS_LIST = (HTTPMethod.GET, USERS + '/@me/connections') |
|||
USERS_GET = (HTTPMethod.GET, USERS + '/{user}') |
|||
|
|||
# Invites |
|||
INVITES = '/invites' |
|||
INVITES_GET = (HTTPMethod.GET, INVITES + '/{invite}') |
|||
INVITES_DELETE = (HTTPMethod.DELETE, INVITES + '/{invite}') |
|||
|
|||
# Webhooks |
|||
WEBHOOKS = '/webhooks/{webhook}' |
|||
WEBHOOKS_GET = (HTTPMethod.GET, WEBHOOKS) |
|||
WEBHOOKS_MODIFY = (HTTPMethod.PATCH, WEBHOOKS) |
|||
WEBHOOKS_DELETE = (HTTPMethod.DELETE, WEBHOOKS) |
|||
WEBHOOKS_TOKEN_GET = (HTTPMethod.GET, WEBHOOKS + '/{token}') |
|||
WEBHOOKS_TOKEN_MODIFY = (HTTPMethod.PATCH, WEBHOOKS + '/{token}') |
|||
WEBHOOKS_TOKEN_DELETE = (HTTPMethod.DELETE, WEBHOOKS + '/{token}') |
|||
WEBHOOKS_TOKEN_EXECUTE = (HTTPMethod.POST, WEBHOOKS + '/{token}') |
|||
|
|||
|
|||
class APIResponse(object): |
|||
def __init__(self): |
|||
self.response = None |
|||
self.exception = None |
|||
self.rate_limited_duration = 0 |
|||
|
|||
|
|||
class APIException(Exception): |
|||
""" |
|||
Exception thrown when an HTTP-client level error occurs. Usually this will |
|||
be a non-success status-code, or a transient network issue. |
|||
|
|||
Attributes |
|||
---------- |
|||
status_code : int |
|||
The status code returned by the API for the request that triggered this |
|||
error. |
|||
""" |
|||
def __init__(self, response, retries=None): |
|||
self.response = response |
|||
self.retries = retries |
|||
|
|||
self.code = 0 |
|||
self.msg = 'Request Failed ({})'.format(response.status_code) |
|||
self.errors = {} |
|||
|
|||
if self.retries: |
|||
self.msg += ' after {} retries'.format(self.retries) |
|||
|
|||
# Try to decode JSON, and extract params |
|||
try: |
|||
data = self.response.json() |
|||
|
|||
if 'code' in data: |
|||
self.code = data['code'] |
|||
self.errors = data.get('errors', {}) |
|||
self.msg = '{} ({} - {})'.format(data['message'], self.code, self.errors) |
|||
elif len(data) == 1: |
|||
key, value = list(data.items())[0] |
|||
self.msg = 'Request Failed: {}: {}'.format(key, ', '.join(value)) |
|||
except ValueError: |
|||
pass |
|||
|
|||
# DEPRECATED: left for backwards compat |
|||
self.status_code = response.status_code |
|||
self.content = response.content |
|||
|
|||
super(APIException, self).__init__(self.msg) |
|||
|
|||
|
|||
class HTTPClient(LoggingClass): |
|||
""" |
|||
A simple HTTP client which wraps the requests library, adding support for |
|||
Discords rate-limit headers, authorization, and request/response validation. |
|||
""" |
|||
BASE_URL = 'https://discord.com/api/v7' |
|||
MAX_RETRIES = 5 |
|||
|
|||
def __init__(self, token, after_request=None): |
|||
super(HTTPClient, self).__init__() |
|||
|
|||
py_version = '{}.{}.{}'.format( |
|||
sys.version_info.major, |
|||
sys.version_info.minor, |
|||
sys.version_info.micro) |
|||
|
|||
self.limiter = RateLimiter() |
|||
self.headers = { |
|||
'User-Agent': 'DiscordBot (https://github.com/b1naryth1ef/disco {}) Python/{} requests/{}'.format( |
|||
disco_version, |
|||
py_version, |
|||
requests_version), |
|||
} |
|||
|
|||
if token: |
|||
self.headers['Authorization'] = 'Bot ' + token |
|||
|
|||
self.after_request = after_request |
|||
self.session = requests.Session() |
|||
|
|||
def __call__(self, route, args=None, **kwargs): |
|||
return self.call(route, args, **kwargs) |
|||
|
|||
def call(self, route, args=None, **kwargs): |
|||
""" |
|||
Makes a request to the given route (as specified in |
|||
:class:`disco.api.http.Routes`) with a set of URL arguments, and keyword |
|||
arguments passed to requests. |
|||
|
|||
Parameters |
|||
---------- |
|||
route : tuple(:class:`HTTPMethod`, str) |
|||
The method.URL combination that when compiled with URL arguments |
|||
creates a requestable route which the HTTPClient will make the |
|||
request too. |
|||
args : dict(str, str) |
|||
A dictionary of URL arguments that will be compiled with the raw URL |
|||
to create the requestable route. The HTTPClient uses this to track |
|||
rate limits as well. |
|||
kwargs : dict |
|||
Keyword arguments that will be passed along to the requests library |
|||
|
|||
Raises |
|||
------ |
|||
APIException |
|||
Raised when an unrecoverable error occurs, or when we've exhausted |
|||
the number of retries. |
|||
|
|||
Returns |
|||
------- |
|||
:class:`requests.Response` |
|||
The response object for the request |
|||
""" |
|||
args = args or {} |
|||
retry = kwargs.pop('retry_number', 0) |
|||
|
|||
# Merge or set headers |
|||
if 'headers' in kwargs: |
|||
kwargs['headers'].update(self.headers) |
|||
else: |
|||
kwargs['headers'] = self.headers |
|||
|
|||
# Build the bucket URL |
|||
args = {k: to_bytes(v) for k, v in six.iteritems(args)} |
|||
filtered = {k: (v if k in ('guild', 'channel') else '') for k, v in six.iteritems(args)} |
|||
bucket = (route[0].value, route[1].format(**filtered)) |
|||
|
|||
response = APIResponse() |
|||
|
|||
# Possibly wait if we're rate limited |
|||
response.rate_limited_duration = self.limiter.check(bucket) |
|||
|
|||
self.log.debug('KW: %s', kwargs) |
|||
|
|||
# Make the actual request |
|||
url = self.BASE_URL + route[1].format(**args) |
|||
self.log.info('%s %s (%s)', route[0].value, url, kwargs.get('params')) |
|||
r = self.session.request(route[0].value, url, **kwargs) |
|||
|
|||
if self.after_request: |
|||
response.response = r |
|||
self.after_request(response) |
|||
|
|||
# Update rate limiter |
|||
self.limiter.update(bucket, r) |
|||
|
|||
# If we got a success status code, just return the data |
|||
if r.status_code < 400: |
|||
return r |
|||
elif r.status_code != 429 and 400 <= r.status_code < 500: |
|||
self.log.warning('Request failed with code %s: %s', r.status_code, r.content) |
|||
response.exception = APIException(r) |
|||
raise response.exception |
|||
else: |
|||
if r.status_code == 429: |
|||
self.log.warning( |
|||
'Request responded w/ 429, retrying (but this should not happen, check your clock sync)')
|||
|
|||
# If we hit the max retries, throw an error |
|||
retry += 1 |
|||
if retry > self.MAX_RETRIES: |
|||
self.log.error('Failing request, hit max retries') |
|||
raise APIException(r, retries=self.MAX_RETRIES) |
|||
|
|||
backoff = self.random_backoff() |
|||
self.log.warning('Request to `{}` failed with code {}, retrying after {}s ({})'.format( |
|||
url, r.status_code, backoff, r.content, |
|||
)) |
|||
gevent.sleep(backoff) |
|||
|
|||
# Otherwise just recurse and try again |
|||
return self(route, args, retry_number=retry, **kwargs) |
|||
|
|||
@staticmethod |
|||
def random_backoff(): |
|||
""" |
|||
Returns a random backoff (in seconds) to be used for any error the
|||
client suspects is transient. Will always return a value between 0.5 and
|||
5 seconds.
|||
|
|||
:returns: a random backoff in seconds
|||
:rtype: float |
|||
""" |
|||
return random.randint(500, 5000) / 1000.0 |
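# A minimal usage sketch for HTTPClient, assuming a valid bot token and an existing
# webhook. The 'webhook' URL-argument name is an assumption (the WEBHOOKS template
# itself is defined earlier in this file and is not shown here); any extra keyword
# arguments such as `json` are passed straight through to requests.
from disco.api.http import HTTPClient, Routes, APIException

http = HTTPClient('MY_BOT_TOKEN')  # placeholder token

try:
    # POST /webhooks/{webhook}/{token} via Routes.WEBHOOKS_TOKEN_EXECUTE
    r = http(
        Routes.WEBHOOKS_TOKEN_EXECUTE,
        dict(webhook=123456789012345678, token='my-webhook-token'),
        json={'content': 'hello from disco'},
    )
    print(r.status_code)
except APIException as e:
    # Raised for non-success status codes (429s are retried with a random backoff)
    print('request failed: {} {}'.format(e.code, e.msg))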
@ -1,173 +0,0 @@ |
|||
import time |
|||
import gevent |
|||
|
|||
|
|||
from disco.util.logging import LoggingClass |
|||
|
|||
|
|||
class RouteState(LoggingClass): |
|||
""" |
|||
An object which stores ratelimit state for a given method/url route |
|||
combination (as specified in :class:`disco.api.http.Routes`). |
|||
|
|||
Parameters |
|||
---------- |
|||
route : tuple(HTTPMethod, str) |
|||
The route which this RouteState is for. |
|||
response : :class:`requests.Response` |
|||
The response object for the last request made to the route, should contain |
|||
the standard rate limit headers. |
|||
|
|||
Attributes |
|||
--------- |
|||
route : tuple(HTTPMethod, str) |
|||
The route which this RouteState is for. |
|||
remaining : int |
|||
The number of remaining requests to the route before the rate limit will |
|||
be hit, triggering a 429 response. |
|||
reset_time : int |
|||
A unix epoch timestamp (in seconds) after which this rate limit is reset |
|||
event : :class:`gevent.event.Event` |
|||
An event that is used to block all requests while a route is in the |
|||
cooldown stage. |
|||
""" |
|||
def __init__(self, route, response): |
|||
self.route = route |
|||
self.remaining = 0 |
|||
self.reset_time = 0 |
|||
self.event = None |
|||
|
|||
self.update(response) |
|||
|
|||
def __repr__(self): |
|||
return '<RouteState {}>'.format(' '.join(self.route)) |
|||
|
|||
@property |
|||
def chilled(self): |
|||
""" |
|||
Whether this route is currently being cooled down (i.e. waiting until reset_time).
|||
""" |
|||
return self.event is not None |
|||
|
|||
@property |
|||
def next_will_ratelimit(self): |
|||
""" |
|||
Whether the next request to the route (at this moment in time) will |
|||
trigger the rate limit. |
|||
""" |
|||
|
|||
if self.remaining - 1 < 0 and time.time() <= self.reset_time: |
|||
return True |
|||
|
|||
return False |
|||
|
|||
def update(self, response): |
|||
""" |
|||
Updates this route with a given requests response object. It's expected
|||
that the response has the required headers; in the case it doesn't,
|||
this function has no effect. |
|||
""" |
|||
if 'X-RateLimit-Remaining' not in response.headers: |
|||
return |
|||
|
|||
self.remaining = int(float(response.headers.get('X-RateLimit-Remaining'))) |
|||
self.reset_time = int(float(response.headers.get('X-RateLimit-Reset'))) |
|||
|
|||
def wait(self, timeout=None): |
|||
""" |
|||
Waits until this route is no longer under a cooldown. |
|||
|
|||
Returns |
|||
------- |
|||
float |
|||
The duration we waited for, in seconds or zero if we didn't have to |
|||
wait at all. |
|||
""" |
|||
if self.event.is_set(): |
|||
return 0 |
|||
|
|||
start = time.time() |
|||
self.event.wait() |
|||
return time.time() - start |
|||
|
|||
def cooldown(self): |
|||
""" |
|||
Waits for the current route to be cooled-down (aka waiting until reset time). |
|||
""" |
|||
if self.reset_time - time.time() < 0: |
|||
raise Exception('Cannot cooldown for negative time period; check clock sync') |
|||
|
|||
self.event = gevent.event.Event() |
|||
delay = (self.reset_time - time.time()) + .5 |
|||
self.log.debug('Cooling down bucket %s for %s seconds', self, delay) |
|||
gevent.sleep(delay) |
|||
self.event.set() |
|||
self.event = None |
|||
return delay |
|||
|
|||
|
|||
class RateLimiter(LoggingClass): |
|||
""" |
|||
An in-memory store of rate-limit states for all routes we've ever called.
|||
|
|||
Attributes |
|||
---------- |
|||
states : dict(tuple(HTTPMethod, str), :class:`RouteState`) |
|||
Contains a :class:`RouteState` for each route the RateLimiter is currently |
|||
tracking. |
|||
""" |
|||
def __init__(self): |
|||
self.states = {} |
|||
|
|||
def check(self, route): |
|||
""" |
|||
Checks whether a given route can be called. This function will return |
|||
immediately if no rate-limit cooldown is being imposed for the given |
|||
route, or will wait indefinitely until the route is finished being |
|||
cooled down. This function should be called before making a request to |
|||
the specified route. |
|||
|
|||
Parameters |
|||
---------- |
|||
route : tuple(HTTPMethod, str) |
|||
The route that will be checked. |
|||
|
|||
Returns |
|||
------- |
|||
float |
|||
The number of seconds we had to wait for this rate limit, or zero |
|||
if no time was waited. |
|||
""" |
|||
return self._check(None) + self._check(route) |
|||
|
|||
def _check(self, route): |
|||
if route in self.states: |
|||
# If the route is being cooled off, we need to wait until it's ready
|||
if self.states[route].chilled: |
|||
return self.states[route].wait() |
|||
|
|||
if self.states[route].next_will_ratelimit: |
|||
return gevent.spawn(self.states[route].cooldown).get() |
|||
|
|||
return 0 |
|||
|
|||
def update(self, route, response): |
|||
""" |
|||
Updates the given route's state with the rate-limit headers inside the
|||
response from a previous call to the route. |
|||
|
|||
Parameters |
|||
--------- |
|||
route : tuple(HTTPMethod, str) |
|||
The route that will be updated. |
|||
response : :class:`requests.Response` |
|||
The response object for the last request to the route, whose headers |
|||
will be used to update the route's rate-limit state.
|||
""" |
|||
if 'X-RateLimit-Global' in response.headers: |
|||
route = None |
|||
|
|||
if route in self.states: |
|||
self.states[route].update(response) |
|||
else: |
|||
self.states[route] = RouteState(route, response) |
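# A minimal sketch of how the limiter above is driven (mirroring HTTPClient.call):
# check() may block on a gevent event until the bucket's cooldown expires, and
# update() feeds the X-RateLimit-* headers from every response back into the state.
# The bucket tuple, URL, and the disco.api.ratelimit module path are illustrative.
import requests

from disco.api.ratelimit import RateLimiter

limiter = RateLimiter()
bucket = ('GET', '/channels/1234')  # (method, partially-expanded URL), as built by HTTPClient

waited = limiter.check(bucket)      # seconds spent waiting, 0 if the bucket is not limited
r = requests.get('https://discord.com/api/v7/channels/1234',
                 headers={'Authorization': 'Bot MY_BOT_TOKEN'})
limiter.update(bucket, r)           # records X-RateLimit-Remaining / X-RateLimit-Reset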
@ -1,6 +0,0 @@ |
|||
from disco.bot.bot import Bot, BotConfig |
|||
from disco.bot.plugin import Plugin |
|||
from disco.bot.command import CommandLevels |
|||
from disco.util.config import Config |
|||
|
|||
__all__ = ['Bot', 'BotConfig', 'Plugin', 'Config', 'CommandLevels'] |
@ -1,534 +0,0 @@ |
|||
import re |
|||
import os |
|||
import six |
|||
import gevent |
|||
import inspect |
|||
import importlib |
|||
|
|||
from six.moves import reload_module |
|||
from holster.threadlocal import ThreadLocal |
|||
from gevent.pywsgi import WSGIServer |
|||
|
|||
from disco.types.guild import GuildMember |
|||
from disco.bot.plugin import find_loadable_plugins |
|||
from disco.bot.command import CommandEvent, CommandLevels |
|||
from disco.bot.storage import Storage |
|||
from disco.util.config import Config |
|||
from disco.util.logging import LoggingClass |
|||
from disco.util.serializer import Serializer |
|||
|
|||
|
|||
class BotConfig(Config): |
|||
""" |
|||
An object which is used to define the runtime configuration for
|||
a bot. |
|||
|
|||
Attributes |
|||
---------- |
|||
levels : dict(snowflake, str) |
|||
Mapping of user IDs/role IDs to :class:`disco.bot.command.CommandLevels`
|||
which is used for the default commands_level_getter. |
|||
plugins : list[string] |
|||
List of plugin modules to load. |
|||
commands_enabled : bool |
|||
Whether this bot instance should utilize command parsing. Generally this |
|||
should be true, unless your bot is only handling events and has no user |
|||
interaction. |
|||
commands_require_mention : bool |
|||
Whether messages must mention the bot to be considered for command parsing. |
|||
commands_mention_rules : dict(str, bool) |
|||
A dictionary describing what mention types can be considered a mention |
|||
of the bot when using :attr:`commands_require_mention`. This dictionary |
|||
can contain the following keys: `here`, `everyone`, `role`, `user`. When |
|||
a keys value is set to true, the mention type will be considered for |
|||
command parsing. |
|||
commands_prefix : str |
|||
A string prefix that is required for a message to be considered for |
|||
command parsing. |
|||
commands_allow_edit : bool |
|||
If true, the bot will reparse an edited message if it was the last sent |
|||
message in a channel, and did not previously trigger a command. This is |
|||
helpful for allowing edits to mistyped commands.
|||
commands_level_getter : function |
|||
If set, a function which when given a GuildMember or User, returns the |
|||
relevant :class:`disco.bot.commands.CommandLevels`. |
|||
commands_group_abbrev : bool
|||
If true, command groups may be abbreviated to their shortest unambiguous prefix.
|||
E.g. the grouping 'test' may be abbreviated down to 't', unless 'tag' exists, |
|||
in which case it may be abbreviated down to 'te'. |
|||
plugin_config_provider : Optional[function] |
|||
If set, this function will replace the default configuration loading |
|||
function, which normally attempts to load a file located at config/plugin_name.fmt |
|||
where fmt is the plugin_config_format. The function here should return |
|||
a valid configuration object which the plugin understands. |
|||
plugin_config_format : str |
|||
The serialization format plugin configuration files are in. |
|||
plugin_config_dir : str |
|||
The directory plugin configuration is located within. |
|||
http_enabled : bool |
|||
Whether to enable the built-in Flask server which allows plugins to handle |
|||
and route HTTP requests. |
|||
http_host : str |
|||
The host string for the HTTP Flask server (if enabled) |
|||
http_port : int |
|||
The port for the HTTP Flask server (if enabled) |
|||
""" |
|||
levels = {} |
|||
plugins = [] |
|||
plugin_config = {} |
|||
shared_config = {} |
|||
|
|||
commands_enabled = True |
|||
commands_require_mention = True |
|||
commands_mention_rules = { |
|||
# 'here': False, |
|||
'everyone': False, |
|||
'role': True, |
|||
'user': True, |
|||
} |
|||
commands_prefix = '' |
|||
commands_allow_edit = True |
|||
commands_level_getter = None |
|||
commands_group_abbrev = True |
|||
|
|||
plugin_config_provider = None |
|||
plugin_config_format = 'json' |
|||
plugin_config_dir = 'config' |
|||
|
|||
storage_enabled = True |
|||
storage_fsync = True |
|||
storage_serializer = 'json' |
|||
storage_path = 'storage.json' |
|||
|
|||
http_enabled = False |
|||
http_host = '0.0.0.0' |
|||
http_port = 7575 |
|||
|
|||
|
|||
class Bot(LoggingClass): |
|||
""" |
|||
Disco's implementation of a simple but extendable Discord bot. Bots consist |
|||
of a set of plugins, and a Disco client. |
|||
|
|||
Parameters |
|||
---------- |
|||
client : :class:`disco.client.Client` |
|||
The client this bot should utilize for its connection. |
|||
config : Optional[:class:`BotConfig`] |
|||
The configuration to use for this bot. If not provided will use the defaults |
|||
inside of :class:`BotConfig`. |
|||
|
|||
Attributes |
|||
---------- |
|||
client : `disco.client.Client` |
|||
The client instance for this bot. |
|||
config : `BotConfig` |
|||
The bot configuration instance for this bot. |
|||
plugins : dict(str, :class:`disco.bot.plugin.Plugin`) |
|||
Any plugins this bot has loaded |
|||
""" |
|||
def __init__(self, client, config=None): |
|||
self.client = client |
|||
self.config = config or BotConfig() |
|||
|
|||
# Shard manager |
|||
self.shards = None |
|||
|
|||
# The context carries information about events in a threadlocal storage |
|||
self.ctx = ThreadLocal() |
|||
|
|||
# The storage object acts as a dynamic, context-aware store
|||
self.storage = None |
|||
if self.config.storage_enabled: |
|||
self.storage = Storage(self.ctx, self.config.from_prefix('storage')) |
|||
|
|||
# If the manhole is enabled, add this bot as a local |
|||
if self.client.config.manhole_enable: |
|||
self.client.manhole_locals['bot'] = self |
|||
|
|||
# Setup HTTP server (Flask app) if enabled |
|||
self.http = None |
|||
if self.config.http_enabled: |
|||
try: |
|||
from flask import Flask |
|||
except ImportError: |
|||
self.log.warning('Failed to enable HTTP server, Flask is not installed') |
|||
else: |
|||
self.log.info('Starting HTTP server bound to %s:%s', self.config.http_host, self.config.http_port) |
|||
self.http = Flask('disco') |
|||
self.http_server = WSGIServer((self.config.http_host, self.config.http_port), self.http) |
|||
self.http_server_greenlet = gevent.spawn(self.http_server.serve_forever) |
|||
|
|||
self.plugins = {} |
|||
self.group_abbrev = {} |
|||
|
|||
# Only bind event listeners if we're going to parse commands |
|||
if self.config.commands_enabled: |
|||
self.client.events.on('MessageCreate', self.on_message_create) |
|||
|
|||
if self.config.commands_allow_edit: |
|||
self.client.events.on('MessageUpdate', self.on_message_update) |
|||
|
|||
# If we have a level getter and it's a string, try to load it
|||
if isinstance(self.config.commands_level_getter, six.string_types): |
|||
mod, func = self.config.commands_level_getter.rsplit('.', 1) |
|||
mod = importlib.import_module(mod) |
|||
self.config.commands_level_getter = getattr(mod, func) |
|||
|
|||
# Stores the last message for every single channel |
|||
self.last_message_cache = {} |
|||
|
|||
# Stores a giant regex matcher for all commands |
|||
self.command_matches_re = None |
|||
|
|||
# Finally, load all the plugin modules that were passed with the config
|||
for plugin_mod in self.config.plugins: |
|||
self.add_plugin_module(plugin_mod) |
|||
|
|||
# Convert level mapping |
|||
for k, v in list(six.iteritems(self.config.levels)): |
|||
self.config.levels[int(k) if k.isdigit() else k] = CommandLevels.get(v) |
|||
|
|||
@classmethod |
|||
def from_cli(cls, *plugins): |
|||
""" |
|||
Creates a new instance of the bot using the utilities inside of the |
|||
:mod:`disco.cli` module. Allows passing in a set of uninitialized |
|||
plugin classes to load. |
|||
|
|||
Parameters |
|||
--------- |
|||
plugins : Optional[list(:class:`disco.bot.plugin.Plugin`)] |
|||
Any plugins to load after creating the new bot instance |
|||
|
|||
""" |
|||
from disco.cli import disco_main |
|||
inst = cls(disco_main()) |
|||
|
|||
for plugin in plugins: |
|||
inst.add_plugin(plugin) |
|||
|
|||
return inst |
|||
|
|||
@property |
|||
def commands(self): |
|||
""" |
|||
Generator of all commands this bot's plugins have defined.
|||
""" |
|||
for plugin in six.itervalues(self.plugins): |
|||
for command in plugin.commands: |
|||
yield command |
|||
|
|||
def recompute(self): |
|||
""" |
|||
Called when a plugin is loaded/unloaded to recompute internal state. |
|||
""" |
|||
if self.config.commands_group_abbrev: |
|||
groups = {command.group for command in self.commands if command.group} |
|||
self.group_abbrev = self.compute_group_abbrev(groups) |
|||
|
|||
self.compute_command_matches_re() |
|||
|
|||
def compute_group_abbrev(self, groups): |
|||
""" |
|||
Computes all possible abbreviations for a command grouping. |
|||
""" |
|||
# For the first pass, we just want to compute each group's possible
|||
# abbreviations that don't conflict with each other.
|||
possible = {} |
|||
for group in groups: |
|||
for index in range(1, len(group)): |
|||
current = group[:index] |
|||
if current in possible: |
|||
possible[current] = None |
|||
else: |
|||
possible[current] = group |
|||
|
|||
# Now, we want to compute the actual shortest abbreviation out of the
|||
# possible ones |
|||
result = {} |
|||
for abbrev, group in six.iteritems(possible): |
|||
if not group: |
|||
continue |
|||
|
|||
if group in result: |
|||
if len(abbrev) < len(result[group]): |
|||
result[group] = abbrev |
|||
else: |
|||
result[group] = abbrev |
|||
|
|||
return result |
|||
|
|||
def compute_command_matches_re(self): |
|||
""" |
|||
Computes a single regex which matches all possible command combinations. |
|||
""" |
|||
commands = list(self.commands) |
|||
re_str = '|'.join(command.regex(grouped=False) for command in commands) |
|||
if re_str: |
|||
self.command_matches_re = re.compile(re_str, re.I) |
|||
else: |
|||
self.command_matches_re = None |
|||
|
|||
def get_commands_for_message(self, require_mention, mention_rules, prefix, msg): |
|||
""" |
|||
Generator of all commands that a given message object triggers, based on |
|||
the bot's plugins and configuration.
|||
|
|||
Parameters |
|||
--------- |
|||
msg : :class:`disco.types.message.Message` |
|||
The message object to parse and find matching commands for |
|||
|
|||
Yields |
|||
------- |
|||
tuple(:class:`disco.bot.command.Command`, `re.MatchObject`) |
|||
All commands the message triggers |
|||
""" |
|||
content = msg.content |
|||
|
|||
if require_mention: |
|||
mention_direct = msg.is_mentioned(self.client.state.me) |
|||
mention_everyone = msg.mention_everyone |
|||
|
|||
mention_roles = [] |
|||
if msg.guild: |
|||
mention_roles = list(filter(lambda r: msg.is_mentioned(r), |
|||
msg.guild.get_member(self.client.state.me).roles)) |
|||
|
|||
if not any(( |
|||
mention_rules.get('user', True) and mention_direct, |
|||
mention_rules.get('everyone', False) and mention_everyone, |
|||
mention_rules.get('role', False) and any(mention_roles), |
|||
msg.channel.is_dm, |
|||
)): |
|||
return [] |
|||
|
|||
if mention_direct: |
|||
if msg.guild: |
|||
member = msg.guild.get_member(self.client.state.me) |
|||
if member: |
|||
# Filter both the normal and nick mentions |
|||
content = content.replace(member.user.mention, '', 1) |
|||
content = content.replace(member.user.mention_nickname, '', 1) |
|||
else: |
|||
content = content.replace(self.client.state.me.mention, '', 1) |
|||
elif mention_everyone: |
|||
content = content.replace('@everyone', '', 1) |
|||
else: |
|||
for role in mention_roles: |
|||
content = content.replace('<@&{}>'.format(role), '', 1)
|||
|
|||
content = content.lstrip() |
|||
|
|||
if prefix and not content.startswith(prefix): |
|||
return [] |
|||
else: |
|||
content = content[len(prefix):] |
|||
|
|||
if not self.command_matches_re or not self.command_matches_re.match(content): |
|||
return [] |
|||
|
|||
options = [] |
|||
for command in self.commands: |
|||
match = command.compiled_regex.match(content) |
|||
if match: |
|||
options.append((command, match)) |
|||
return sorted(options, key=lambda obj: obj[0].group is None) |
|||
|
|||
def get_level(self, actor): |
|||
level = CommandLevels.DEFAULT |
|||
|
|||
if callable(self.config.commands_level_getter): |
|||
level = self.config.commands_level_getter(self, actor) |
|||
else: |
|||
if actor.id in self.config.levels: |
|||
level = self.config.levels[actor.id] |
|||
|
|||
if isinstance(actor, GuildMember): |
|||
for rid in actor.roles: |
|||
if rid in self.config.levels and self.config.levels[rid] > level: |
|||
level = self.config.levels[rid] |
|||
|
|||
return level |
|||
|
|||
def check_command_permissions(self, command, msg): |
|||
if not command.level: |
|||
return True |
|||
|
|||
level = self.get_level(msg.author if not msg.guild else msg.guild.get_member(msg.author)) |
|||
|
|||
if level >= command.level: |
|||
return True |
|||
return False |
|||
|
|||
def handle_message(self, msg): |
|||
""" |
|||
Attempts to handle a newly created or edited message in the context of |
|||
command parsing/triggering. Calls all relevant commands the message triggers. |
|||
|
|||
Parameters |
|||
--------- |
|||
msg : :class:`disco.types.message.Message` |
|||
The newly created or updated message object to parse/handle. |
|||
|
|||
Returns |
|||
------- |
|||
bool |
|||
Whether any commands were successfully triggered by the message.
|||
""" |
|||
commands = list(self.get_commands_for_message( |
|||
self.config.commands_require_mention, |
|||
self.config.commands_mention_rules, |
|||
self.config.commands_prefix, |
|||
msg, |
|||
)) |
|||
|
|||
if not len(commands): |
|||
return False |
|||
|
|||
for command, match in commands: |
|||
if not self.check_command_permissions(command, msg): |
|||
continue |
|||
|
|||
if command.plugin.execute(CommandEvent(command, msg, match)): |
|||
return True |
|||
return False |
|||
|
|||
def on_message_create(self, event): |
|||
if event.message.author.id == self.client.state.me.id: |
|||
return |
|||
|
|||
result = self.handle_message(event.message) |
|||
|
|||
if self.config.commands_allow_edit: |
|||
self.last_message_cache[event.message.channel_id] = (event.message, result) |
|||
|
|||
def on_message_update(self, event): |
|||
if not self.config.commands_allow_edit: |
|||
return |
|||
|
|||
# Ignore messages that do not have content; these can happen when only
|||
# some message fields are updated. |
|||
if not event.message.content: |
|||
return |
|||
|
|||
obj = self.last_message_cache.get(event.message.channel_id) |
|||
if not obj: |
|||
return |
|||
|
|||
msg, triggered = obj |
|||
if msg.id == event.message.id and not triggered: |
|||
msg.inplace_update(event.message) |
|||
triggered = self.handle_message(msg) |
|||
|
|||
self.last_message_cache[msg.channel_id] = (msg, triggered) |
|||
|
|||
def add_plugin(self, inst, config=None, ctx=None): |
|||
""" |
|||
Adds and loads a plugin, based on its class. |
|||
|
|||
Parameters |
|||
---------- |
|||
inst : subclass (or instance thereof) of `disco.bot.plugin.Plugin`
|||
Plugin class to initialize and load. |
|||
config : Optional |
|||
The configuration to load the plugin with. |
|||
ctx : Optional[dict] |
|||
Context (previous state) to pass to the plugin. Usually used along with
|||
unload. |
|||
""" |
|||
if inspect.isclass(inst): |
|||
if not config: |
|||
if callable(self.config.plugin_config_provider): |
|||
config = self.config.plugin_config_provider(inst) |
|||
else: |
|||
config = self.load_plugin_config(inst) |
|||
|
|||
inst = inst(self, config) |
|||
|
|||
if inst.__class__.__name__ in self.plugins: |
|||
self.log.warning('Attempted to add already added plugin %s', inst.__class__.__name__) |
|||
raise Exception('Cannot add already added plugin: {}'.format(inst.__class__.__name__)) |
|||
|
|||
self.ctx['plugin'] = self.plugins[inst.__class__.__name__] = inst |
|||
self.plugins[inst.__class__.__name__].load(ctx or {}) |
|||
self.recompute() |
|||
self.ctx.drop() |
|||
|
|||
def rmv_plugin(self, cls): |
|||
""" |
|||
Unloads and removes a plugin based on its class. |
|||
|
|||
Parameters |
|||
---------- |
|||
cls : subclass of :class:`disco.bot.plugin.Plugin` |
|||
Plugin class to unload and remove. |
|||
""" |
|||
if cls.__name__ not in self.plugins: |
|||
raise Exception('Cannot remove non-existent plugin: {}'.format(cls.__name__))
|||
|
|||
ctx = {} |
|||
self.plugins[cls.__name__].unload(ctx) |
|||
del self.plugins[cls.__name__] |
|||
self.recompute() |
|||
return ctx |
|||
|
|||
def reload_plugin(self, cls): |
|||
""" |
|||
Reloads a plugin. |
|||
""" |
|||
config = self.plugins[cls.__name__].config |
|||
|
|||
ctx = self.rmv_plugin(cls) |
|||
module = reload_module(inspect.getmodule(cls)) |
|||
self.add_plugin(getattr(module, cls.__name__), config, ctx) |
|||
|
|||
def run_forever(self): |
|||
""" |
|||
Runs this bot's core loop forever.
|||
""" |
|||
self.client.run_forever() |
|||
|
|||
def add_plugin_module(self, path, config=None): |
|||
""" |
|||
Adds and loads a plugin, based on its module path. |
|||
""" |
|||
self.log.info('Adding plugin module at path "%s"', path) |
|||
mod = importlib.import_module(path) |
|||
loaded = False |
|||
|
|||
plugins = find_loadable_plugins(mod) |
|||
for plugin in plugins: |
|||
loaded = True |
|||
self.add_plugin(plugin, config) |
|||
|
|||
if not loaded: |
|||
raise Exception('Could not find any plugins to load within module {}'.format(path)) |
|||
|
|||
def load_plugin_config(self, cls): |
|||
name = cls.__name__.lower() |
|||
if name.endswith('plugin'): |
|||
name = name[:-6] |
|||
|
|||
path = os.path.join( |
|||
self.config.plugin_config_dir, name) + '.' + self.config.plugin_config_format |
|||
|
|||
data = {} |
|||
if self.config.shared_config: |
|||
data.update(self.config.shared_config) |
|||
|
|||
if name in self.config.plugin_config: |
|||
data.update(self.config.plugin_config[name]) |
|||
|
|||
if os.path.exists(path): |
|||
with open(path, 'r') as f: |
|||
data.update(Serializer.loads(self.config.plugin_config_format, f.read())) |
|||
|
|||
if hasattr(cls, 'config_cls'): |
|||
inst = cls.config_cls() |
|||
if data: |
|||
inst.update(data) |
|||
return inst |
|||
|
|||
return data |
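# A minimal sketch of wiring a Bot together by hand. The plugin module name below is
# hypothetical; ClientConfig.from_file is the same loader the CLI uses, and the
# BotConfig fields are the class attributes documented above.
from disco.client import Client, ClientConfig
from disco.bot import Bot, BotConfig

client_config = ClientConfig.from_file('config.json')   # must contain at least the token
client = Client(client_config)

bot_config = BotConfig()
bot_config.commands_prefix = '!'
bot_config.commands_require_mention = False
bot_config.plugins = ['myproject.plugins.basic']        # loaded via add_plugin_module()

bot = Bot(client, bot_config)
bot.run_forever()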
@ -1,306 +0,0 @@ |
|||
import re |
|||
import argparse |
|||
|
|||
from holster.enum import Enum |
|||
|
|||
from six import integer_types |
|||
|
|||
from disco.bot.parser import ArgumentSet, ArgumentError |
|||
from disco.util.functional import simple_cached_property |
|||
|
|||
ARGS_REGEX = '(?: ((?:\n|.)*)$|$)' |
|||
ARGS_UNGROUPED_REGEX = '(?: (?:\n|.)*$|$)' |
|||
SPLIT_SPACES_NO_QUOTE = re.compile(r'["|\']([^"\']+)["|\']|(\S+)') |
|||
|
|||
USER_MENTION_RE = re.compile('<@!?([0-9]+)>') |
|||
ROLE_MENTION_RE = re.compile('<@&([0-9]+)>') |
|||
CHANNEL_MENTION_RE = re.compile('<#([0-9]+)>') |
|||
|
|||
CommandLevels = Enum( |
|||
DEFAULT=0, |
|||
TRUSTED=10, |
|||
MOD=50, |
|||
ADMIN=100, |
|||
OWNER=500, |
|||
) |
|||
|
|||
|
|||
class PluginArgumentParser(argparse.ArgumentParser): |
|||
def error(self, message): |
|||
raise CommandError(message) |
|||
|
|||
|
|||
class CommandEvent(object): |
|||
""" |
|||
An event which is created when a command is triggered. Contains information |
|||
about the message, command, and parsed arguments (along with shortcuts to |
|||
message information). |
|||
|
|||
Attributes |
|||
--------- |
|||
command : :class:`Command` |
|||
The command this event was created for (aka the triggered command). |
|||
msg : :class:`disco.types.message.Message` |
|||
The message object which triggered this command. |
|||
match : :class:`re.MatchObject` |
|||
The regex match object for the command. |
|||
name : str |
|||
The command name (or alias) which was used to trigger the command.
|||
args : list(str) |
|||
Arguments passed to the command |
|||
""" |
|||
|
|||
def __init__(self, command, msg, match): |
|||
self.command = command |
|||
self.msg = msg |
|||
self.match = match |
|||
self.name = self.match.group(1).strip() |
|||
self.args = [] |
|||
|
|||
if self.match.group(2): |
|||
self.args = [i for i in self.match.group(2).strip().split(' ') if i] |
|||
|
|||
@property |
|||
def codeblock(self): |
|||
if '`' not in self.msg.content: |
|||
return ' '.join(self.args) |
|||
|
|||
_, src = self.msg.content.split('`', 1) |
|||
src = '`' + src |
|||
|
|||
if src.startswith('```') and src.endswith('```'): |
|||
src = src[3:-3] |
|||
elif src.startswith('`') and src.endswith('`'): |
|||
src = src[1:-1] |
|||
|
|||
return src |
|||
|
|||
@simple_cached_property |
|||
def member(self): |
|||
""" |
|||
Guild member (if relevant) for the user that created the message. |
|||
""" |
|||
return self.guild.get_member(self.author) |
|||
|
|||
@simple_cached_property |
|||
def channel(self): |
|||
""" |
|||
Channel the message was created in. |
|||
""" |
|||
return self.msg.channel |
|||
|
|||
@simple_cached_property |
|||
def guild(self): |
|||
""" |
|||
Guild (if relevant) the message was created in. |
|||
""" |
|||
return self.msg.guild |
|||
|
|||
@simple_cached_property |
|||
def author(self): |
|||
""" |
|||
Author of the message. |
|||
""" |
|||
return self.msg.author |
|||
|
|||
|
|||
class CommandError(Exception): |
|||
""" |
|||
An exception which is thrown when the arguments for a command are invalid, |
|||
or don't match the command's specifications. |
|||
""" |
|||
def __init__(self, msg): |
|||
self.msg = msg |
|||
|
|||
|
|||
class Command(object): |
|||
""" |
|||
An object which defines and handles the triggering of a function based on |
|||
user input (aka a command). |
|||
|
|||
Attributes |
|||
---------- |
|||
plugin : :class:`disco.bot.plugin.Plugin` |
|||
The plugin this command is a member of. |
|||
func : function |
|||
The function which is called when this command is triggered. |
|||
trigger : str |
|||
The primary trigger (aka name). |
|||
args : Optional[str] |
|||
The argument format specification. |
|||
aliases : Optional[list(str)] |
|||
List of trigger aliases. |
|||
group : Optional[str] |
|||
The group this command is a member of. |
|||
is_regex : Optional[bool] |
|||
Whether the triggers for this command should be treated as raw regex. |
|||
""" |
|||
def __init__(self, plugin, func, trigger, *args, **kwargs): |
|||
self.plugin = plugin |
|||
self.func = func |
|||
self.triggers = [trigger] |
|||
|
|||
self.dispatch_func = None |
|||
self.raw_args = None |
|||
self.args = None |
|||
self.level = None |
|||
self.group = None |
|||
self.is_regex = None |
|||
self.oob = False |
|||
self.context = {} |
|||
self.metadata = {} |
|||
self.parser = None |
|||
|
|||
self.update(*args, **kwargs) |
|||
|
|||
@property |
|||
def name(self): |
|||
return self.triggers[0] |
|||
|
|||
def __call__(self, *args, **kwargs): |
|||
return self.func(*args, **kwargs) |
|||
|
|||
def get_docstring(self): |
|||
return (self.func.__doc__ or '').format(**self.context) |
|||
|
|||
def update( |
|||
self, |
|||
args=None, |
|||
level=None, |
|||
aliases=None, |
|||
group=None, |
|||
is_regex=None, |
|||
oob=False, |
|||
context=None, |
|||
parser=False, |
|||
**kwargs): |
|||
self.triggers += aliases or [] |
|||
|
|||
def resolve_role(ctx, rid): |
|||
return ctx.msg.guild.roles.get(rid) |
|||
|
|||
def resolve_user(ctx, uid): |
|||
if isinstance(uid, integer_types):
|||
if uid in ctx.msg.mentions: |
|||
return ctx.msg.mentions.get(uid) |
|||
else: |
|||
return ctx.msg.client.state.users.get(uid) |
|||
else: |
|||
return ctx.msg.client.state.users.select_one(username=uid[0], discriminator=uid[1]) |
|||
|
|||
def resolve_channel(ctx, cid): |
|||
if isinstance(cid, integer_types): |
|||
return ctx.msg.guild.channels.get(cid) |
|||
else: |
|||
return ctx.msg.guild.channels.select_one(name=cid) |
|||
|
|||
def resolve_guild(ctx, gid): |
|||
return ctx.msg.client.state.guilds.get(gid) |
|||
|
|||
if args: |
|||
self.raw_args = args |
|||
self.args = ArgumentSet.from_string(args, { |
|||
'user': self.mention_type([resolve_user], USER_MENTION_RE, user=True), |
|||
'role': self.mention_type([resolve_role], ROLE_MENTION_RE), |
|||
'channel': self.mention_type([resolve_channel], CHANNEL_MENTION_RE, allow_plain=True), |
|||
'guild': self.mention_type([resolve_guild]), |
|||
}) |
|||
|
|||
self.level = level |
|||
self.group = group |
|||
self.is_regex = is_regex |
|||
self.oob = oob |
|||
self.context = context or {} |
|||
self.metadata = kwargs |
|||
|
|||
if parser: |
|||
self.parser = PluginArgumentParser(prog=self.name, add_help=False) |
|||
|
|||
@staticmethod |
|||
def mention_type(getters, reg=None, user=False, allow_plain=False): |
|||
def _f(ctx, raw): |
|||
if raw.isdigit(): |
|||
resolved = int(raw) |
|||
elif user and raw.count('#') == 1 and raw.split('#')[-1].isdigit(): |
|||
username, discrim = raw.split('#') |
|||
resolved = (username, int(discrim)) |
|||
elif reg: |
|||
res = reg.match(raw) |
|||
if res: |
|||
resolved = int(res.group(1)) |
|||
else: |
|||
if allow_plain: |
|||
resolved = raw |
|||
else: |
|||
raise TypeError('Invalid mention: {}'.format(raw)) |
|||
else: |
|||
raise TypeError('Invalid mention: {}'.format(raw)) |
|||
|
|||
for getter in getters: |
|||
obj = getter(ctx, resolved) |
|||
if obj: |
|||
return obj |
|||
|
|||
raise TypeError('Cannot resolve mention: {}'.format(raw)) |
|||
return _f |
|||
|
|||
@simple_cached_property |
|||
def compiled_regex(self): |
|||
""" |
|||
A compiled version of this command's regex. |
|||
""" |
|||
return re.compile(self.regex(), re.I) |
|||
|
|||
def regex(self, grouped=True): |
|||
""" |
|||
The regex string that defines/triggers this command. |
|||
""" |
|||
if self.is_regex: |
|||
return '|'.join(self.triggers) |
|||
else: |
|||
group = '' |
|||
if self.group: |
|||
if self.group in self.plugin.bot.group_abbrev: |
|||
rest = self.plugin.bot.group_abbrev[self.group] |
|||
group = '{}(?:{}) '.format(rest, ''.join(c + u'?' for c in self.group[len(rest):])) |
|||
else: |
|||
group = self.group + ' ' |
|||
return ('^{}({})' if grouped else '^{}(?:{})').format( |
|||
group, |
|||
'|'.join(self.triggers), |
|||
) + (ARGS_REGEX if grouped else ARGS_UNGROUPED_REGEX) |
|||
|
|||
def execute(self, event): |
|||
""" |
|||
Handles the execution of this command given a :class:`CommandEvent` |
|||
object. |
|||
|
|||
Returns |
|||
------- |
|||
bool |
|||
Whether this command was successful |
|||
""" |
|||
parsed_kwargs = {} |
|||
|
|||
if self.args: |
|||
if len(event.args) < self.args.required_length: |
|||
raise CommandError(u'Command {} requires {} argument(s) (`{}`), but was passed {}'.format(
|||
event.name, |
|||
self.args.required_length, |
|||
self.raw_args, |
|||
len(event.args), |
|||
)) |
|||
|
|||
try: |
|||
parsed_kwargs = self.args.parse(event.args, ctx=event) |
|||
except ArgumentError as e: |
|||
raise CommandError(e.args[0]) |
|||
elif self.parser: |
|||
event.parser = self.parser |
|||
parsed_kwargs['args'] = self.parser.parse_args( |
|||
[i[0] or i[1] for i in SPLIT_SPACES_NO_QUOTE.findall(' '.join(event.args))]) |
|||
|
|||
kwargs = {} |
|||
kwargs.update(self.context) |
|||
kwargs.update(parsed_kwargs) |
|||
return self.plugin.dispatch('command', self, event, **kwargs) |
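# A minimal sketch of how commands defined by this module are declared from a plugin
# (see disco/bot/plugin.py below). The argument specification string is parsed by
# ArgumentSet, and the 'user' type resolves mentions/IDs through resolve_user above.
from disco.bot import Plugin, CommandLevels


class ModerationPlugin(Plugin):
    @Plugin.command('kick', '<user:user> [reason:str...]', level=CommandLevels.MOD)
    def on_kick(self, event, user, reason=None):
        # `user` arrives already resolved to a user object, not as a raw mention string
        event.msg.reply(u'would kick {} ({})'.format(user, reason or 'no reason given'))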
@ -1,231 +0,0 @@ |
|||
import re |
|||
import six |
|||
import copy |
|||
from disco.util.sanitize import S |
|||
|
|||
# Regex which splits out argument parts |
|||
PARTS_RE = re.compile(r'(\<|\[|\{)((?:\w+|\:|\||\.\.\.| (?:[0-9]+))+)(?:\>|\]|\})') |
|||
|
|||
BOOL_OPTS = { |
|||
'yes': True, |
|||
'no': False, |
|||
'true': True, |
|||
'false': False,
|||
'1': True, |
|||
'0': False, |
|||
'on': True, |
|||
'off': False, |
|||
} |
|||
|
|||
# Mapping of types |
|||
TYPE_MAP = { |
|||
'str': lambda ctx, data: six.text_type(data), |
|||
'int': lambda ctx, data: int(data), |
|||
'float': lambda ctx, data: float(data), |
|||
'snowflake': lambda ctx, data: int(data), |
|||
} |
|||
|
|||
|
|||
def to_bool(ctx, data): |
|||
if data in BOOL_OPTS: |
|||
return BOOL_OPTS[data] |
|||
raise TypeError |
|||
|
|||
|
|||
TYPE_MAP['bool'] = to_bool |
|||
|
|||
|
|||
class ArgumentError(Exception): |
|||
""" |
|||
An error thrown when passed-in arguments cannot be converted/cast to the
|||
argument specification. |
|||
""" |
|||
|
|||
|
|||
class Argument(object): |
|||
""" |
|||
A single argument, which is normally a member of an :class:`ArgumentSet`.
|||
|
|||
Attributes |
|||
---------- |
|||
name : str |
|||
The name of this argument. |
|||
count : int |
|||
The number of raw arguments that compose this argument. |
|||
required : bool |
|||
Whether this is a required argument. |
|||
types : list(type) |
|||
Types this argument supports. |
|||
""" |
|||
def __init__(self, raw): |
|||
self.name = None |
|||
self.count = 1 |
|||
self.required = False |
|||
self.flag = False |
|||
self.types = None |
|||
self.parse(raw) |
|||
|
|||
@property |
|||
def true_count(self): |
|||
""" |
|||
The true number of raw arguments this argument takes. |
|||
""" |
|||
return self.count or 1 |
|||
|
|||
def parse(self, raw): |
|||
""" |
|||
Attempts to parse arguments from their raw form. |
|||
""" |
|||
prefix, part = raw |
|||
|
|||
if prefix == '<': |
|||
self.required = True |
|||
else: |
|||
self.required = False |
|||
|
|||
# Whether this is a flag |
|||
self.flag = (prefix == '{') |
|||
|
|||
if not self.flag: |
|||
if part.endswith('...'): |
|||
part = part[:-3] |
|||
self.count = 0 |
|||
elif ' ' in part: |
|||
split = part.split(' ', 1) |
|||
part, self.count = split[0], int(split[1]) |
|||
|
|||
if ':' in part: |
|||
part, typeinfo = part.split(':') |
|||
self.types = typeinfo.split('|') |
|||
|
|||
self.name = part.strip() |
|||
|
|||
|
|||
class ArgumentSet(object): |
|||
""" |
|||
A set of :class:`Argument` instances which forms a larger argument specification. |
|||
|
|||
Attributes |
|||
---------- |
|||
args : list(:class:`Argument`) |
|||
All arguments that are a member of this set. |
|||
types : dict(str, type) |
|||
All types supported by this ArgumentSet. |
|||
""" |
|||
def __init__(self, args=None, custom_types=None): |
|||
self.args = args or [] |
|||
self.types = copy.copy(TYPE_MAP) |
|||
self.types.update(custom_types or {}) |
|||
|
|||
@classmethod |
|||
def from_string(cls, line, custom_types=None): |
|||
""" |
|||
Creates a new :class:`ArgumentSet` from a given argument string specification. |
|||
""" |
|||
args = cls(custom_types=custom_types) |
|||
|
|||
data = PARTS_RE.findall(line) |
|||
if len(data): |
|||
for item in data: |
|||
args.append(Argument(item)) |
|||
|
|||
return args |
|||
|
|||
def convert(self, ctx, types, value): |
|||
""" |
|||
Attempts to convert a value to one or more types. |
|||
|
|||
Parameters |
|||
---------- |
|||
types : list(type) |
|||
List of types to attempt conversion with. |
|||
value : str |
|||
The string value to attempt conversion on. |
|||
""" |
|||
exc = None |
|||
for typ_name in types: |
|||
typ = self.types.get(typ_name) |
|||
if not typ: |
|||
raise Exception('Unknown type {}'.format(typ_name)) |
|||
|
|||
try: |
|||
return typ(ctx, value) |
|||
except Exception as e: |
|||
exc = e |
|||
continue |
|||
|
|||
raise exc |
|||
|
|||
def append(self, arg): |
|||
""" |
|||
Add a new :class:`Argument` to this argument specification/set. |
|||
""" |
|||
if self.args and not self.args[-1].required and arg.required: |
|||
raise Exception('Required argument cannot come after an optional argument') |
|||
|
|||
if self.args and not self.args[-1].count: |
|||
raise Exception('No arguments can come after a catch-all') |
|||
|
|||
self.args.append(arg) |
|||
|
|||
def parse(self, rawargs, ctx=None): |
|||
""" |
|||
Parse a string of raw arguments into this argument specification. |
|||
""" |
|||
parsed = {} |
|||
|
|||
flags = {i.name: i for i in self.args if i.flag} |
|||
if flags: |
|||
new_rawargs = [] |
|||
|
|||
for offset, raw in enumerate(rawargs): |
|||
if raw.startswith('-'): |
|||
raw = raw.lstrip('-') |
|||
if raw in flags: |
|||
parsed[raw] = True |
|||
continue |
|||
new_rawargs.append(raw) |
|||
|
|||
rawargs = new_rawargs |
|||
|
|||
for index, arg in enumerate((arg for arg in self.args if not arg.flag)): |
|||
if not arg.required and index + arg.true_count > len(rawargs): |
|||
continue |
|||
|
|||
if arg.count == 0: |
|||
raw = rawargs[index:] |
|||
else: |
|||
raw = rawargs[index:index + arg.true_count] |
|||
|
|||
if arg.types: |
|||
for idx, r in enumerate(raw): |
|||
try: |
|||
raw[idx] = self.convert(ctx, arg.types, r) |
|||
except Exception: |
|||
raise ArgumentError(u'cannot convert `{}` to `{}`'.format( |
|||
S(r), ', '.join(arg.types), |
|||
)) |
|||
|
|||
if arg.count == 1: |
|||
raw = raw[0] |
|||
|
|||
if (not arg.types or arg.types == ['str']) and isinstance(raw, list): |
|||
raw = ' '.join(raw) |
|||
|
|||
parsed[arg.name] = raw |
|||
|
|||
return parsed |
|||
|
|||
@property |
|||
def length(self): |
|||
""" |
|||
The number of arguments in this set/specification. |
|||
""" |
|||
return len(self.args) |
|||
|
|||
@property |
|||
def required_length(self): |
|||
""" |
|||
The number of required arguments to compile this set/specification.
|||
""" |
|||
return sum(i.true_count for i in self.args if i.required) |
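# A minimal sketch of the argument parser in isolation. Custom types are callables of
# (ctx, raw_value), exactly like the entries in TYPE_MAP above; the 'upper' type is an
# illustrative addition.
from disco.bot.parser import ArgumentSet


def to_upper(ctx, data):
    return data.upper()


spec = ArgumentSet.from_string(
    '<name:str> <count:int> [tags:upper...]',
    custom_types={'upper': to_upper},
)

print(spec.required_length)                    # 2
print(spec.parse(['widget', '3', 'a', 'b']))   # {'name': 'widget', 'count': 3, 'tags': ['A', 'B']}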
@ -1,488 +0,0 @@ |
|||
import six |
|||
import types |
|||
import gevent |
|||
import inspect |
|||
import weakref |
|||
import warnings |
|||
import functools |
|||
|
|||
from gevent.event import AsyncResult |
|||
from holster.emitter import Priority |
|||
|
|||
from disco.util.logging import LoggingClass |
|||
from disco.bot.command import Command, CommandError |
|||
|
|||
|
|||
# Contains a list of classes which will be excluded when auto discovering plugins |
|||
# to load. This allows anyone to create subclasses of Plugin that act as a base |
|||
# plugin class within their project/bot. |
|||
_plugin_base_classes = set() |
|||
|
|||
|
|||
def register_plugin_base_class(cls): |
|||
""" |
|||
This function registers the given class under an internal registry of plugin |
|||
base classes. This causes the passed class to be treated exactly like the
|||
builtin `Plugin` class (i.e. it is skipped by automatic plugin discovery).
|||
|
|||
This is particularly useful if you wish to subclass `Plugin` to create a new |
|||
base class that other plugins in your project inherit from, but do not want |
|||
the automatic plugin loading to consider the class for loading. |
|||
""" |
|||
if not inspect.isclass(cls): |
|||
raise TypeError('cls must be a class') |
|||
|
|||
_plugin_base_classes.add(cls) |
|||
return cls |
|||
|
|||
|
|||
def find_loadable_plugins(mod): |
|||
""" |
|||
Generator which yields the loadable plugins within a given Python module. This
|||
function will exclude any plugins which are registered as a plugin base class |
|||
via the `register_plugin_base_class` function. |
|||
""" |
|||
module_attributes = (getattr(mod, attr) for attr in dir(mod)) |
|||
for modattr in module_attributes: |
|||
if not inspect.isclass(modattr): |
|||
continue |
|||
|
|||
if not issubclass(modattr, Plugin): |
|||
continue |
|||
|
|||
if modattr in _plugin_base_classes: |
|||
continue |
|||
|
|||
if getattr(modattr, '_shallow', False) and Plugin in modattr.__bases__: |
|||
warnings.warn( |
|||
'Setting _shallow to avoid plugin loading has been deprecated, see `register_plugin_base_class`', |
|||
DeprecationWarning, |
|||
) |
|||
continue |
|||
|
|||
yield modattr |
|||
|
|||
|
|||
class BasePluginDeco(object): |
|||
Prio = Priority |
|||
|
|||
# TODO: don't smash class methods |
|||
@classmethod |
|||
def add_meta_deco(cls, meta): |
|||
def deco(f): |
|||
if not hasattr(f, 'meta'): |
|||
f.meta = [] |
|||
|
|||
f.meta.append(meta) |
|||
|
|||
return f |
|||
return deco |
|||
|
|||
@classmethod |
|||
def with_config(cls, config_cls): |
|||
""" |
|||
Sets the plugin's config class to the specified config class.
|||
""" |
|||
def deco(plugin_cls): |
|||
plugin_cls.config_cls = config_cls |
|||
return plugin_cls |
|||
return deco |
|||
|
|||
@classmethod |
|||
def listen(cls, *args, **kwargs): |
|||
""" |
|||
Binds the function to listen for a given event name. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'listener', |
|||
'what': 'event', |
|||
'args': args, |
|||
'kwargs': kwargs, |
|||
}) |
|||
|
|||
@classmethod |
|||
def listen_packet(cls, *args, **kwargs): |
|||
""" |
|||
Binds the function to listen for a given gateway op code. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'listener', |
|||
'what': 'packet', |
|||
'args': args, |
|||
'kwargs': kwargs, |
|||
}) |
|||
|
|||
@classmethod |
|||
def command(cls, *args, **kwargs): |
|||
""" |
|||
Creates a new command attached to the function. |
|||
""" |
|||
|
|||
return cls.add_meta_deco({ |
|||
'type': 'command', |
|||
'args': args, |
|||
'kwargs': kwargs, |
|||
}) |
|||
|
|||
@classmethod |
|||
def pre_command(cls): |
|||
""" |
|||
Runs a function before a command is triggered. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'pre_command', |
|||
}) |
|||
|
|||
@classmethod |
|||
def post_command(cls): |
|||
""" |
|||
Runs a function after a command is triggered. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'post_command', |
|||
}) |
|||
|
|||
@classmethod |
|||
def pre_listener(cls): |
|||
""" |
|||
Runs a function before a listener is triggered. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'pre_listener', |
|||
}) |
|||
|
|||
@classmethod |
|||
def post_listener(cls): |
|||
""" |
|||
Runs a function after a listener is triggered. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'post_listener', |
|||
}) |
|||
|
|||
@classmethod |
|||
def schedule(cls, *args, **kwargs): |
|||
""" |
|||
Runs a function repeatedly, waiting for a specified interval. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'schedule', |
|||
'args': args, |
|||
'kwargs': kwargs, |
|||
}) |
|||
|
|||
@classmethod |
|||
def add_argument(cls, *args, **kwargs): |
|||
""" |
|||
Adds an argument to the argument parser. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'parser.add_argument', |
|||
'args': args, |
|||
'kwargs': kwargs, |
|||
}) |
|||
|
|||
@classmethod |
|||
def route(cls, *args, **kwargs): |
|||
""" |
|||
Adds an HTTP route. |
|||
""" |
|||
return cls.add_meta_deco({ |
|||
'type': 'http.add_route', |
|||
'args': args, |
|||
'kwargs': kwargs, |
|||
}) |
|||
|
|||
|
|||
class PluginDeco(BasePluginDeco): |
|||
""" |
|||
A utility mixin which provides various function decorators that a plugin |
|||
author can use to create bound event/command handlers. |
|||
""" |
|||
parser = BasePluginDeco |
|||
|
|||
|
|||
@register_plugin_base_class |
|||
class Plugin(LoggingClass, PluginDeco): |
|||
""" |
|||
A plugin is a set of listeners/commands which can be loaded/unloaded by a bot. |
|||
|
|||
Parameters |
|||
---------- |
|||
bot : :class:`disco.bot.Bot` |
|||
The bot this plugin is a member of. |
|||
config : any |
|||
The configuration data for this plugin. |
|||
|
|||
Attributes |
|||
---------- |
|||
client : :class:`disco.client.Client` |
|||
An alias to the client the bot is running with. |
|||
state : :class:`disco.state.State` |
|||
An alias to the state object for the client. |
|||
listeners : list |
|||
List of all bound listeners this plugin owns. |
|||
commands : list(:class:`disco.bot.command.Command`) |
|||
List of all commands this plugin owns. |
|||
""" |
|||
def __init__(self, bot, config): |
|||
super(Plugin, self).__init__() |
|||
self.bot = bot |
|||
self.client = bot.client |
|||
self.state = bot.client.state |
|||
self.ctx = bot.ctx |
|||
self.storage = bot.storage |
|||
self.config = config |
|||
|
|||
# General declarations |
|||
self.listeners = [] |
|||
self.commands = [] |
|||
self.schedules = {} |
|||
self.greenlets = weakref.WeakSet() |
|||
self._pre = {} |
|||
self._post = {} |
|||
|
|||
# This is an array of all meta functions we sniff at init |
|||
self.meta_funcs = [] |
|||
|
|||
for name, member in inspect.getmembers(self, predicate=inspect.ismethod): |
|||
if hasattr(member, 'meta'): |
|||
self.meta_funcs.append(member) |
|||
|
|||
# Unsmash local functions |
|||
if hasattr(Plugin, name): |
|||
method = types.MethodType(getattr(Plugin, name), self, self.__class__) |
|||
setattr(self, name, method) |
|||
|
|||
self.bind_all() |
|||
|
|||
@property |
|||
def name(self): |
|||
return self.__class__.__name__ |
|||
|
|||
def bind_all(self): |
|||
self.listeners = [] |
|||
self.commands = [] |
|||
self.schedules = {} |
|||
self.greenlets = weakref.WeakSet() |
|||
|
|||
self._pre = {'command': [], 'listener': []} |
|||
self._post = {'command': [], 'listener': []} |
|||
|
|||
for member in self.meta_funcs: |
|||
for meta in reversed(member.meta): |
|||
self.bind_meta(member, meta) |
|||
|
|||
def bind_meta(self, member, meta): |
|||
if meta['type'] == 'listener': |
|||
self.register_listener(member, meta['what'], *meta['args'], **meta['kwargs']) |
|||
elif meta['type'] == 'command': |
|||
# meta['kwargs']['update'] = True |
|||
self.register_command(member, *meta['args'], **meta['kwargs']) |
|||
elif meta['type'] == 'schedule': |
|||
self.register_schedule(member, *meta['args'], **meta['kwargs']) |
|||
elif meta['type'].startswith('pre_') or meta['type'].startswith('post_'): |
|||
when, typ = meta['type'].split('_', 1) |
|||
self.register_trigger(typ, when, member) |
|||
elif meta['type'].startswith('parser.'): |
|||
for command in self.commands: |
|||
if command.func == member: |
|||
getattr(command.parser, meta['type'].split('.', 1)[-1])( |
|||
*meta['args'], |
|||
**meta['kwargs']) |
|||
elif meta['type'] == 'http.add_route': |
|||
meta['kwargs']['view_func'] = member |
|||
self.bot.http.add_url_rule(*meta['args'], **meta['kwargs']) |
|||
else: |
|||
raise Exception('unhandled meta type {}'.format(meta)) |
|||
|
|||
def handle_exception(self, greenlet, event): |
|||
pass |
|||
|
|||
def wait_for_event(self, event_name, conditional=None, **kwargs): |
|||
result = AsyncResult() |
|||
listener = None |
|||
|
|||
def _event_callback(event): |
|||
for k, v in kwargs.items(): |
|||
obj = event |
|||
for inst in k.split('__'): |
|||
obj = getattr(obj, inst) |
|||
|
|||
if obj != v: |
|||
break |
|||
else: |
|||
if conditional and not conditional(event): |
|||
return |
|||
|
|||
listener.remove() |
|||
return result.set(event) |
|||
|
|||
listener = self.bot.client.events.on(event_name, _event_callback) |
|||
|
|||
return result |
|||
|
|||
def spawn_wrap(self, spawner, method, *args, **kwargs): |
|||
def wrapped(*args, **kwargs): |
|||
self.ctx['plugin'] = self |
|||
try: |
|||
res = method(*args, **kwargs) |
|||
return res |
|||
finally: |
|||
self.ctx.drop() |
|||
|
|||
obj = spawner(wrapped, *args, **kwargs) |
|||
self.greenlets.add(obj) |
|||
return obj |
|||
|
|||
def spawn(self, *args, **kwargs): |
|||
return self.spawn_wrap(gevent.spawn, *args, **kwargs) |
|||
|
|||
def spawn_later(self, delay, *args, **kwargs): |
|||
return self.spawn_wrap(functools.partial(gevent.spawn_later, delay), *args, **kwargs) |
|||
|
|||
def execute(self, event): |
|||
""" |
|||
Executes a CommandEvent this plugin owns. |
|||
""" |
|||
if not event.command.oob: |
|||
self.greenlets.add(gevent.getcurrent()) |
|||
try: |
|||
return event.command.execute(event) |
|||
except CommandError as e: |
|||
event.msg.reply(e.msg) |
|||
return False |
|||
finally: |
|||
self.ctx.drop() |
|||
|
|||
def register_trigger(self, typ, when, func): |
|||
""" |
|||
Registers a trigger. |
|||
""" |
|||
getattr(self, '_' + when)[typ].append(func) |
|||
|
|||
def dispatch(self, typ, func, event, *args, **kwargs): |
|||
# Link the greenlet with our exception handler |
|||
gevent.getcurrent().link_exception(lambda g: self.handle_exception(g, event)) |
|||
|
|||
# TODO: this is ugly |
|||
if typ != 'command': |
|||
self.greenlets.add(gevent.getcurrent()) |
|||
|
|||
self.ctx['plugin'] = self |
|||
|
|||
if hasattr(event, 'guild'): |
|||
self.ctx['guild'] = event.guild |
|||
if hasattr(event, 'channel'): |
|||
self.ctx['channel'] = event.channel |
|||
if hasattr(event, 'author'): |
|||
self.ctx['user'] = event.author |
|||
|
|||
for pre in self._pre[typ]: |
|||
event = pre(func, event, args, kwargs) |
|||
|
|||
if event is None: |
|||
return False |
|||
|
|||
result = func(event, *args, **kwargs) |
|||
|
|||
for post in self._post[typ]: |
|||
post(func, event, args, kwargs, result) |
|||
|
|||
return True |
|||
|
|||
def register_listener(self, func, what, *args, **kwargs): |
|||
""" |
|||
Registers a listener. |
|||
|
|||
Parameters |
|||
---------- |
|||
what : str |
|||
What the listener is for (event, packet) |
|||
func : function |
|||
The function to be registered. |
|||
args
|||
The event/packet descriptor and any additional arguments passed to the emitter.
|||
""" |
|||
args = list(args) + [functools.partial(self.dispatch, 'listener', func)] |
|||
|
|||
if what == 'event': |
|||
li = self.bot.client.events.on(*args, **kwargs) |
|||
elif what == 'packet': |
|||
li = self.bot.client.packets.on(*args, **kwargs) |
|||
else: |
|||
raise Exception('Invalid listener what: {}'.format(what)) |
|||
|
|||
self.listeners.append(li) |
|||
|
|||
def register_command(self, func, *args, **kwargs): |
|||
""" |
|||
Registers a command. |
|||
|
|||
Parameters |
|||
---------- |
|||
func : function |
|||
The function to be registered. |
|||
args |
|||
Arguments to pass onto the :class:`disco.bot.command.Command` object. |
|||
kwargs |
|||
Keyword arguments to pass onto the :class:`disco.bot.command.Command` |
|||
object. |
|||
""" |
|||
self.commands.append(Command(self, func, *args, **kwargs)) |
|||
|
|||
def register_schedule(self, func, interval, repeat=True, init=True, kwargs=None): |
|||
""" |
|||
Registers a function to be called repeatedly, waiting for an interval |
|||
duration. |
|||
|
|||
Parameters
|||
----------
|||
func : function |
|||
The function to be registered. |
|||
interval : int |
|||
Interval (in seconds) to repeat the function on. |
|||
repeat : bool |
|||
Whether this schedule is repeating (or one time). |
|||
init : bool |
|||
Whether to run this schedule once immediately, or wait for the first |
|||
scheduled iteration. |
|||
kwargs: dict |
|||
Keyword arguments which will be passed to `func` when it is executed.
|||
""" |
|||
if kwargs is None: |
|||
kwargs = {} |
|||
|
|||
def repeat_func(): |
|||
if init: |
|||
func(**kwargs) |
|||
|
|||
while True: |
|||
gevent.sleep(interval) |
|||
func(**kwargs) |
|||
if not repeat: |
|||
break |
|||
|
|||
self.schedules[func.__name__] = self.spawn(repeat_func) |
|||
|
|||
def load(self, ctx): |
|||
""" |
|||
Called when the plugin is loaded. |
|||
""" |
|||
pass |
|||
|
|||
def unload(self, ctx): |
|||
""" |
|||
Called when the plugin is unloaded. |
|||
""" |
|||
for greenlet in self.greenlets: |
|||
greenlet.kill() |
|||
|
|||
for listener in self.listeners: |
|||
listener.remove() |
|||
|
|||
for schedule in six.itervalues(self.schedules): |
|||
schedule.kill() |
|||
|
|||
def reload(self): |
|||
self.bot.reload_plugin(self.__class__) |
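# A minimal sketch of a plugin built with the decorators above. The event name
# ('MessageCreate') and the message attributes mirror how Bot.on_message_create uses
# them; reply() on message objects is assumed from disco.types.message.
import gevent

from disco.bot import Plugin
from disco.bot.plugin import register_plugin_base_class


@register_plugin_base_class
class MyBasePlugin(Plugin):
    # Shared helpers for a project's plugins; excluded from automatic plugin discovery.
    def log_usage(self, event):
        self.log.info('command used in channel %s', event.channel.id)


class GreetPlugin(MyBasePlugin):
    @Plugin.listen('MessageCreate')
    def on_message_create(self, event):
        if event.message.content == 'ping':
            event.message.reply('pong')

    @Plugin.command('confirm')
    def on_confirm(self, event):
        self.log_usage(event)
        event.msg.reply('say "yes" within 30 seconds')
        try:
            self.wait_for_event(
                'MessageCreate',
                conditional=lambda e: e.message.content == 'yes',
                message__author__id=event.msg.author.id,
            ).get(timeout=30)
            event.msg.reply('confirmed!')
        except gevent.Timeout:
            event.msg.reply('timed out')

    @Plugin.schedule(300, init=False)
    def every_five_minutes(self):
        self.log.info('still alive')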
@ -1,89 +0,0 @@ |
|||
import os |
|||
|
|||
from six.moves import UserDict |
|||
|
|||
from disco.util.hashmap import HashMap |
|||
from disco.util.serializer import Serializer |
|||
|
|||
|
|||
class StorageHashMap(HashMap): |
|||
def __init__(self, data): |
|||
self.data = data |
|||
|
|||
|
|||
class ContextAwareProxy(UserDict): |
|||
def __init__(self, ctx): |
|||
self.ctx = ctx |
|||
|
|||
@property |
|||
def data(self): |
|||
return self.ctx() |
|||
|
|||
|
|||
class StorageDict(UserDict): |
|||
def __init__(self, parent, data): |
|||
self._parent = parent |
|||
self.data = data |
|||
|
|||
def update(self, other): |
|||
self.data.update(other) |
|||
self._parent._update() |
|||
|
|||
def __setitem__(self, key, value): |
|||
self.data[key] = value |
|||
self._parent._update() |
|||
|
|||
def __delitem__(self, key): |
|||
del self.data[key] |
|||
self._parent._update() |
|||
|
|||
|
|||
class Storage(object): |
|||
def __init__(self, ctx, config): |
|||
self._ctx = ctx |
|||
self._path = config.path |
|||
self._serializer = config.serializer |
|||
self._fsync = config.fsync |
|||
self._data = {} |
|||
|
|||
if os.path.exists(self._path): |
|||
with open(self._path, 'r') as f: |
|||
self._data = Serializer.loads(self._serializer, f.read()) |
|||
if not self._data: |
|||
self._data = {} |
|||
|
|||
def __getitem__(self, key): |
|||
if key not in self._data: |
|||
self._data[key] = {} |
|||
return StorageHashMap(StorageDict(self, self._data[key])) |
|||
|
|||
def _update(self): |
|||
if self._fsync: |
|||
self.save() |
|||
|
|||
def save(self): |
|||
if not self._path: |
|||
return |
|||
|
|||
with open(self._path, 'w') as f: |
|||
f.write(Serializer.dumps(self._serializer, self._data)) |
|||
|
|||
def guild(self, key): |
|||
return ContextAwareProxy( |
|||
lambda: self['_g{}:{}'.format(self._ctx['guild'].id, key)], |
|||
) |
|||
|
|||
def channel(self, key): |
|||
return ContextAwareProxy( |
|||
lambda: self['_c{}:{}'.format(self._ctx['channel'].id, key)], |
|||
) |
|||
|
|||
def plugin(self, key): |
|||
return ContextAwareProxy( |
|||
lambda: self['_p{}:{}'.format(self._ctx['plugin'].name, key)], |
|||
) |
|||
|
|||
def user(self, key): |
|||
return ContextAwareProxy( |
|||
lambda: self['_u{}:{}'.format(self._ctx['user'].id, key)], |
|||
) |
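A rough standalone illustration of the write-through pattern above (every mutation on a `StorageDict` calls back into `Storage._update`, which saves when `fsync` is enabled); this sketch assumes a plain-JSON serializer and hypothetical class names:

import json

class MiniStorage(object):
    def __init__(self, path):
        self._path = path
        self._data = {}

    def __getitem__(self, key):
        bucket = self._data.setdefault(key, {})
        return MiniBucket(self, bucket)

    def _update(self):
        # Persist the whole store on every change, like fsync=True above.
        with open(self._path, 'w') as f:
            f.write(json.dumps(self._data))

class MiniBucket(object):
    def __init__(self, parent, data):
        self._parent = parent
        self._data = data

    def __setitem__(self, key, value):
        self._data[key] = value
        self._parent._update()

storage = MiniStorage('storage.json')
storage['counts']['pings'] = 1   # written straight to disk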
@ -1,110 +0,0 @@ |
|||
""" |
|||
The CLI module is a small utility that can be used as an easy entry point for |
|||
creating and running bots/clients. |
|||
""" |
|||
from __future__ import print_function |
|||
|
|||
import os |
|||
import six |
|||
import logging |
|||
import argparse |
|||
|
|||
from gevent import monkey |
|||
|
|||
monkey.patch_all() |
|||
|
|||
parser = argparse.ArgumentParser() |
|||
|
|||
# Command line specific arguments |
|||
parser.add_argument('--run-bot', help='run a disco bot on this client', action='store_true', default=False) |
|||
parser.add_argument('--plugin', help='load plugins into the bot', nargs='*', default=[]) |
|||
parser.add_argument('--config', help='Configuration file', default=None) |
|||
parser.add_argument('--shard-auto', help='Automatically run all shards', action='store_true', default=False) |
|||
|
|||
# Configuration overrides |
|||
parser.add_argument('--token', help='Bot Authentication Token', default=None) |
|||
parser.add_argument('--shard-id', help='Current shard number/id', default=None) |
|||
parser.add_argument('--shard-count', help='Total number of shards', default=None) |
|||
parser.add_argument('--max-reconnects', help='Maximum reconnect attempts', default=None) |
|||
parser.add_argument('--log-level', help='log level', default=None) |
|||
parser.add_argument('--manhole', action='store_true', help='Enable the manhole', default=None) |
|||
parser.add_argument('--manhole-bind', help='host:port for the manhole to bind to', default=None) |
|||
parser.add_argument('--encoder', help='encoder for gateway data', default=None) |
|||
|
|||
|
|||
# Mapping of argument names to configuration overrides |
|||
CONFIG_OVERRIDE_MAPPING = { |
|||
'token': 'token', |
|||
'shard_id': 'shard_id', |
|||
'shard_count': 'shard_count', |
|||
'max_reconnects': 'max_reconnects', |
|||
'log_level': 'log_level', |
|||
'manhole': 'manhole_enable', |
|||
'manhole_bind': 'manhole_bind', |
|||
'encoder': 'encoder', |
|||
} |
|||
|
|||
|
|||
def disco_main(run=False): |
|||
""" |
|||
Creates an argument parser and parses a standard set of command line arguments, |
|||
creating a new :class:`Client`. |
|||
|
|||
Returns |
|||
------- |
|||
:class:`Client` |
|||
A new Client from the provided command line arguments |
|||
""" |
|||
from disco.client import Client, ClientConfig |
|||
from disco.bot import Bot, BotConfig |
|||
from disco.util.logging import setup_logging |
|||
|
|||
# Parse out all our command line arguments |
|||
args = parser.parse_args() |
|||
|
|||
# Create the base configuration object |
|||
if args.config: |
|||
config = ClientConfig.from_file(args.config) |
|||
else: |
|||
if os.path.exists('config.json'): |
|||
config = ClientConfig.from_file('config.json') |
|||
elif os.path.exists('config.yaml'): |
|||
config = ClientConfig.from_file('config.yaml') |
|||
else: |
|||
config = ClientConfig() |
|||
|
|||
for arg_key, config_key in six.iteritems(CONFIG_OVERRIDE_MAPPING): |
|||
if getattr(args, arg_key) is not None: |
|||
setattr(config, config_key, getattr(args, arg_key)) |
|||
|
|||
# Setup the auto-sharder |
|||
if args.shard_auto: |
|||
from disco.gateway.sharder import AutoSharder |
|||
AutoSharder(config).run() |
|||
return |
|||
|
|||
# Setup logging based on the configured level |
|||
setup_logging(level=getattr(logging, config.log_level.upper())) |
|||
|
|||
# Build out client object |
|||
client = Client(config) |
|||
|
|||
# If applicable, build the bot and load plugins |
|||
bot = None |
|||
if args.run_bot or hasattr(config, 'bot'): |
|||
bot_config = BotConfig(config.bot) if hasattr(config, 'bot') else BotConfig() |
|||
if not hasattr(bot_config, 'plugins'): |
|||
bot_config.plugins = args.plugin |
|||
else: |
|||
bot_config.plugins += args.plugin |
|||
|
|||
bot = Bot(client, bot_config) |
|||
|
|||
if run: |
|||
(bot or client).run_forever() |
|||
|
|||
return (bot or client) |
|||
|
|||
|
|||
if __name__ == '__main__': |
|||
disco_main(True) |
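The same flow can be driven programmatically; a minimal sketch mirroring what `disco_main` does above (the token value is a placeholder and would normally come from config.json/config.yaml or the CLI overrides):

from disco.client import Client, ClientConfig

config = ClientConfig()
config.token = 'YOUR_BOT_TOKEN'  # placeholder

client = Client(config)
client.run_forever()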
@ -1,154 +0,0 @@ |
|||
import time |
|||
import gevent |
|||
|
|||
from holster.emitter import Emitter |
|||
|
|||
from disco.state import State, StateConfig |
|||
from disco.api.client import APIClient |
|||
from disco.gateway.client import GatewayClient |
|||
from disco.gateway.packets import OPCode |
|||
from disco.types.user import Status, Game |
|||
from disco.util.config import Config |
|||
from disco.util.logging import LoggingClass |
|||
from disco.util.backdoor import DiscoBackdoorServer |
|||
|
|||
|
|||
class ClientConfig(Config): |
|||
""" |
|||
Configuration for the `Client`. |
|||
|
|||
Attributes |
|||
---------- |
|||
token : str |
|||
Discord authentication token, can be validated using the |
|||
`disco.util.token.is_valid_token` function. |
|||
shard_id : int |
|||
The shard ID for the current client instance. |
|||
shard_count : int |
|||
The total count of shards running. |
|||
max_reconnects : int |
|||
The maximum number of connection retries to make before giving up (0 = never give up). |
|||
log_level: str |
|||
The logging level to use. |
|||
manhole_enable : bool |
|||
Whether to enable the manhole (e.g. console backdoor server) utility. |
|||
manhole_bind : tuple(str, int) |
|||
A (host, port) combination which the manhole server will bind to (if it's |
|||
enabled using :attr:`manhole_enable`). |
|||
encoder : str |
|||
The type of encoding to use for encoding/decoding data from websockets, |
|||
should be either 'json' or 'etf'. |
|||
""" |
|||
|
|||
token = '' |
|||
shard_id = 0 |
|||
shard_count = 1 |
|||
max_reconnects = 5 |
|||
log_level = 'info' |
|||
|
|||
manhole_enable = False |
|||
manhole_bind = ('127.0.0.1', 8484) |
|||
|
|||
encoder = 'json' |
|||
|
|||
|
|||
class Client(LoggingClass): |
|||
""" |
|||
Class representing the base entry point that should be used in almost all |
|||
implementation cases. This class wraps the functionality of both the REST API |
|||
(`disco.api.client.APIClient`) and the realtime gateway API |
|||
(`disco.gateway.client.GatewayClient`). |
|||
|
|||
Parameters |
|||
---------- |
|||
config : `ClientConfig` |
|||
Configuration for this client instance. |
|||
|
|||
Attributes |
|||
---------- |
|||
config : `ClientConfig` |
|||
The runtime configuration for this client. |
|||
events : `Emitter` |
|||
An emitter which emits Gateway events. |
|||
packets : `Emitter` |
|||
An emitter which emits Gateway packets. |
|||
state : `State` |
|||
The state tracking object. |
|||
api : `APIClient` |
|||
The API client. |
|||
gw : `GatewayClient` |
|||
The gateway client. |
|||
manhole_locals : dict |
|||
Dictionary of local variables for each manhole connection. This can be |
|||
modified to add/modify local variables. |
|||
manhole : Optional[`BackdoorServer`] |
|||
Gevent backdoor server (if the manhole is enabled). |
|||
""" |
|||
def __init__(self, config): |
|||
super(Client, self).__init__() |
|||
self.config = config |
|||
|
|||
self.events = Emitter() |
|||
self.packets = Emitter() |
|||
|
|||
self.api = APIClient(self.config.token, self) |
|||
self.gw = GatewayClient(self, self.config.max_reconnects, self.config.encoder) |
|||
self.state = State(self, StateConfig(self.config.get('state', {}))) |
|||
|
|||
if self.config.manhole_enable: |
|||
self.manhole_locals = { |
|||
'client': self, |
|||
'state': self.state, |
|||
'api': self.api, |
|||
'gw': self.gw, |
|||
} |
|||
|
|||
self.manhole = DiscoBackdoorServer(self.config.manhole_bind, |
|||
banner='Disco Manhole', |
|||
localf=lambda: self.manhole_locals) |
|||
self.manhole.start() |
|||
|
|||
def update_presence(self, status, game=None, afk=False, since=0.0): |
|||
""" |
|||
Updates the current client's presence. |
|||
|
|||
Params |
|||
------ |
|||
status : `user.Status` |
|||
The client's current status. |
|||
game : `user.Game` |
|||
If passed, the game object to set for the user's presence. |
|||
afk : bool |
|||
Whether the client is currently afk. |
|||
since : float |
|||
The Unix timestamp (in milliseconds) at which the client went AFK. |
|||
""" |
|||
if game and not isinstance(game, Game): |
|||
raise TypeError('Game must be a Game model') |
|||
|
|||
if status is Status.IDLE and not since: |
|||
since = int(time.time() * 1000) |
|||
|
|||
payload = { |
|||
'afk': afk, |
|||
'since': since, |
|||
'status': status.value.lower(), |
|||
'game': None, |
|||
} |
|||
|
|||
if game: |
|||
payload['game'] = game.to_dict() |
|||
|
|||
self.gw.send(OPCode.STATUS_UPDATE, payload) |
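For example, a call like the one below (hypothetical game name, and assuming `Game` accepts its fields as keyword arguments) would result in a `STATUS_UPDATE` payload along the lines of `{'afk': False, 'since': 0.0, 'status': 'online', 'game': {'name': 'with disco', ...}}`:

client.update_presence(Status.ONLINE, Game(name='with disco'))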
|||
|
|||
def run(self): |
|||
""" |
|||
Run the client (e.g. the `GatewayClient`) in a new greenlet. |
|||
""" |
|||
return gevent.spawn(self.gw.run) |
|||
|
|||
def run_forever(self): |
|||
""" |
|||
Run the client (e.g. the `GatewayClient`) in the current greenlet. |
|||
""" |
|||
return self.gw.run() |
@ -1,277 +0,0 @@ |
|||
import gevent |
|||
import zlib |
|||
import six |
|||
import ssl |
|||
|
|||
from websocket import ABNF |
|||
|
|||
from disco.gateway.packets import OPCode, RECV, SEND |
|||
from disco.gateway.events import GatewayEvent |
|||
from disco.gateway.encoding import ENCODERS |
|||
from disco.util.websocket import Websocket |
|||
from disco.util.logging import LoggingClass |
|||
from disco.util.limiter import SimpleLimiter |
|||
|
|||
TEN_MEGABYTES = 10490000 |
|||
ZLIB_SUFFIX = b'\x00\x00\xff\xff' |
|||
|
|||
|
|||
class GatewayClient(LoggingClass): |
|||
GATEWAY_VERSION = 6 |
|||
|
|||
def __init__(self, client, max_reconnects=5, encoder='json', zlib_stream_enabled=True, ipc=None): |
|||
super(GatewayClient, self).__init__() |
|||
self.client = client |
|||
self.max_reconnects = max_reconnects |
|||
self.encoder = ENCODERS[encoder] |
|||
self.zlib_stream_enabled = zlib_stream_enabled |
|||
|
|||
self.events = client.events |
|||
self.packets = client.packets |
|||
|
|||
# IPC for shards |
|||
if ipc: |
|||
self.shards = ipc.get_shards() |
|||
self.ipc = ipc |
|||
|
|||
# It's actually 60, 120 but let's give ourselves a buffer |
|||
self.limiter = SimpleLimiter(60, 130) |
|||
|
|||
# Create emitter and bind to gateway payloads |
|||
self.packets.on((RECV, OPCode.DISPATCH), self.handle_dispatch) |
|||
self.packets.on((RECV, OPCode.HEARTBEAT), self.handle_heartbeat) |
|||
self.packets.on((RECV, OPCode.HEARTBEAT_ACK), self.handle_heartbeat_acknowledge) |
|||
self.packets.on((RECV, OPCode.RECONNECT), self.handle_reconnect) |
|||
self.packets.on((RECV, OPCode.INVALID_SESSION), self.handle_invalid_session) |
|||
self.packets.on((RECV, OPCode.HELLO), self.handle_hello) |
|||
|
|||
# Bind to ready payload |
|||
self.events.on('Ready', self.on_ready) |
|||
self.events.on('Resumed', self.on_resumed) |
|||
|
|||
# Websocket connection |
|||
self.ws = None |
|||
self.ws_event = gevent.event.Event() |
|||
self._zlib = None |
|||
self._buffer = None |
|||
|
|||
# State |
|||
self.seq = 0 |
|||
self.session_id = None |
|||
self.reconnects = 0 |
|||
self.shutting_down = False |
|||
self.replaying = False |
|||
self.replayed_events = 0 |
|||
|
|||
# Cached gateway URL |
|||
self._cached_gateway_url = None |
|||
|
|||
# Heartbeat |
|||
self._heartbeat_task = None |
|||
self._heartbeat_acknowledged = True |
|||
|
|||
def send(self, op, data): |
|||
self.limiter.check() |
|||
return self._send(op, data) |
|||
|
|||
def _send(self, op, data): |
|||
self.log.debug('GatewayClient.send %s', op) |
|||
self.packets.emit((SEND, op), data) |
|||
self.ws.send(self.encoder.encode({ |
|||
'op': op.value, |
|||
'd': data, |
|||
}), self.encoder.OPCODE) |
|||
|
|||
def heartbeat_task(self, interval): |
|||
while True: |
|||
if not self._heartbeat_acknowledged: |
|||
self.log.warning('Received HEARTBEAT without HEARTBEAT_ACK, forcing a fresh reconnect') |
|||
self._heartbeat_acknowledged = True |
|||
self.ws.close(status=4000) |
|||
return |
|||
|
|||
self._send(OPCode.HEARTBEAT, self.seq) |
|||
self._heartbeat_acknowledged = False |
|||
gevent.sleep(interval / 1000) |
|||
|
|||
def handle_dispatch(self, packet): |
|||
obj = GatewayEvent.from_dispatch(self.client, packet) |
|||
self.log.debug('GatewayClient.handle_dispatch %s', obj.__class__.__name__) |
|||
self.client.events.emit(obj.__class__.__name__, obj) |
|||
if self.replaying: |
|||
self.replayed_events += 1 |
|||
|
|||
def handle_heartbeat(self, _): |
|||
self._send(OPCode.HEARTBEAT, self.seq) |
|||
|
|||
def handle_heartbeat_acknowledge(self, _): |
|||
self.log.debug('Received HEARTBEAT_ACK') |
|||
self._heartbeat_acknowledged = True |
|||
|
|||
def handle_reconnect(self, _): |
|||
self.log.warning('Received RECONNECT request, forcing a fresh reconnect') |
|||
self.session_id = None |
|||
self.ws.close() |
|||
|
|||
def handle_invalid_session(self, _): |
|||
self.log.warning('Received INVALID_SESSION, forcing a fresh reconnect') |
|||
self.session_id = None |
|||
self.ws.close() |
|||
|
|||
def handle_hello(self, packet): |
|||
self.log.info('Received HELLO, starting heartbeater...') |
|||
self._heartbeat_task = gevent.spawn(self.heartbeat_task, packet['d']['heartbeat_interval']) |
|||
|
|||
def on_ready(self, ready): |
|||
self.log.info('Received READY') |
|||
self.session_id = ready.session_id |
|||
self.reconnects = 0 |
|||
|
|||
def on_resumed(self, _): |
|||
self.log.info('RESUME completed, replayed %s events', self.replayed_events) |
|||
self.reconnects = 0 |
|||
self.replaying = False |
|||
|
|||
def connect_and_run(self, gateway_url=None): |
|||
if not gateway_url: |
|||
if not self._cached_gateway_url: |
|||
self._cached_gateway_url = self.client.api.gateway_get()['url'] |
|||
|
|||
gateway_url = self._cached_gateway_url |
|||
|
|||
gateway_url += '?v={}&encoding={}'.format(self.GATEWAY_VERSION, self.encoder.TYPE) |
|||
|
|||
if self.zlib_stream_enabled: |
|||
gateway_url += '&compress=zlib-stream' |
|||
|
|||
self.log.info('Opening websocket connection to URL `%s`', gateway_url) |
|||
self.ws = Websocket(gateway_url) |
|||
self.ws.emitter.on('on_open', self.on_open) |
|||
self.ws.emitter.on('on_error', self.on_error) |
|||
self.ws.emitter.on('on_close', self.on_close) |
|||
self.ws.emitter.on('on_message', self.on_message) |
|||
|
|||
self.ws.run_forever(sslopt={'cert_reqs': ssl.CERT_NONE}) |
|||
|
|||
def on_message(self, msg): |
|||
if self.zlib_stream_enabled: |
|||
if not self._buffer: |
|||
self._buffer = bytearray() |
|||
|
|||
self._buffer.extend(msg) |
|||
|
|||
if len(msg) < 4: |
|||
return |
|||
|
|||
if msg[-4:] != ZLIB_SUFFIX: |
|||
return |
|||
|
|||
msg = self._zlib.decompress(self._buffer if six.PY3 else str(self._buffer)) |
|||
# If this encoder is text based, we want to decode the data as utf8 |
|||
if self.encoder.OPCODE == ABNF.OPCODE_TEXT: |
|||
msg = msg.decode('utf-8') |
|||
self._buffer = None |
|||
else: |
|||
# Detect zlib and decompress |
|||
is_erlpack = ((six.PY2 and ord(msg[0]) == 131) or (six.PY3 and msg[0] == 131)) |
|||
if msg[0] != '{' and not is_erlpack: |
|||
msg = zlib.decompress(msg, 15, TEN_MEGABYTES).decode('utf-8') |
|||
|
|||
try: |
|||
data = self.encoder.decode(msg) |
|||
except Exception: |
|||
self.log.exception('Failed to parse gateway message: ') |
|||
return |
|||
|
|||
# Update sequence |
|||
if data['s'] and data['s'] > self.seq: |
|||
self.seq = data['s'] |
|||
|
|||
# Emit packet |
|||
self.packets.emit((RECV, OPCode[data['op']]), data) |
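A self-contained sketch of the zlib-stream handling above: the sender flushes each frame with `Z_SYNC_FLUSH` (which appends the `\x00\x00\xff\xff` suffix), and the receiver buffers fragments until that suffix arrives before decompressing. The payloads here are illustrative only:

import zlib

ZLIB_SUFFIX = b'\x00\x00\xff\xff'
compressor = zlib.compressobj()
decompressor = zlib.decompressobj()
buffer_ = bytearray()

for payload in (b'{"op": 10}', b'{"op": 11}'):
    frame = compressor.compress(payload) + compressor.flush(zlib.Z_SYNC_FLUSH)
    buffer_.extend(frame)
    # Only decompress once a complete, suffix-terminated frame has arrived.
    if len(frame) < 4 or frame[-4:] != ZLIB_SUFFIX:
        continue
    print(decompressor.decompress(bytes(buffer_)).decode('utf-8'))
    buffer_ = bytearray()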
|||
|
|||
def on_error(self, error): |
|||
if isinstance(error, KeyboardInterrupt): |
|||
self.shutting_down = True |
|||
self.ws_event.set() |
|||
raise Exception('WS received error: {}'.format(error)) |
|||
|
|||
def on_open(self): |
|||
if self.zlib_stream_enabled: |
|||
self._zlib = zlib.decompressobj() |
|||
|
|||
if self.seq and self.session_id: |
|||
self.log.info('WS Opened: attempting resume w/ SID: %s SEQ: %s', self.session_id, self.seq) |
|||
self.replaying = True |
|||
self.send(OPCode.RESUME, { |
|||
'token': self.client.config.token, |
|||
'session_id': self.session_id, |
|||
'seq': self.seq, |
|||
}) |
|||
else: |
|||
self.log.info('WS Opened: sending identify payload') |
|||
self.send(OPCode.IDENTIFY, { |
|||
'token': self.client.config.token, |
|||
'compress': True, |
|||
'large_threshold': 250, |
|||
'shard': [ |
|||
int(self.client.config.shard_id), |
|||
int(self.client.config.shard_count), |
|||
], |
|||
'properties': { |
|||
'$os': 'linux', |
|||
'$browser': 'disco', |
|||
'$device': 'disco', |
|||
'$referrer': '', |
|||
}, |
|||
}) |
|||
|
|||
def on_close(self, code, reason): |
|||
# Make sure we cleanup any old data |
|||
self._buffer = None |
|||
|
|||
# Kill heartbeater, a reconnect/resume will trigger a HELLO which will |
|||
# respawn it |
|||
if self._heartbeat_task: |
|||
self._heartbeat_task.kill() |
|||
|
|||
# If we're quitting, just break out of here |
|||
if self.shutting_down: |
|||
self.log.info('WS Closed: shutting down') |
|||
return |
|||
|
|||
self.replaying = False |
|||
|
|||
# Track reconnect attempts |
|||
self.reconnects += 1 |
|||
self.log.info('WS Closed: [%s] %s (%s)', code, reason, self.reconnects) |
|||
|
|||
if self.max_reconnects and self.reconnects > self.max_reconnects: |
|||
raise Exception('Failed to reconnect after {} attempts, giving up'.format(self.max_reconnects)) |
|||
|
|||
# Don't resume for these error codes |
|||
if code and 4000 <= code <= 4010: |
|||
self.session_id = None |
|||
|
|||
wait_time = self.reconnects * 5 |
|||
self.log.info('Will attempt to %s after %s seconds', 'resume' if self.session_id else 'reconnect', wait_time) |
|||
gevent.sleep(wait_time) |
|||
|
|||
# Reconnect |
|||
self.connect_and_run() |
|||
|
|||
def run(self): |
|||
gevent.spawn(self.connect_and_run) |
|||
self.ws_event.wait() |
|||
|
|||
def request_guild_members(self, guild_id_or_ids, query=None, limit=0): |
|||
""" |
|||
Request a batch of Guild members from Discord. Generally this function |
|||
can be called when initially loading Guilds to fill the local member state. |
|||
""" |
|||
self.send(OPCode.REQUEST_GUILD_MEMBERS, { |
|||
# This is simply unfortunate naming on the part of Discord... |
|||
'guild_id': guild_id_or_ids, |
|||
'query': query or '', |
|||
'limit': limit, |
|||
}) |
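For instance, a bot wanting to warm the member cache for a large guild could call this through the client's gateway handle; the guild ID and prefix below are placeholders:

# Request up to 50 members whose usernames start with 'dis'.
client.gw.request_guild_members(290923757399834624, query='dis', limit=50)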
@ -1,11 +0,0 @@ |
|||
from .json import JSONEncoder |
|||
|
|||
ENCODERS = { |
|||
'json': JSONEncoder, |
|||
} |
|||
|
|||
try: |
|||
from .etf import ETFEncoder |
|||
ENCODERS['etf'] = ETFEncoder |
|||
except ImportError: |
|||
pass |
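Selecting and using an encoder then looks roughly like this (the 'etf' entry only exists when the optional dependency imported above is installed):

encoder = ENCODERS['json']
raw = encoder.encode({'op': 1, 'd': None})
assert encoder.decode(raw) == {'op': 1, 'd': None}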
@ -1,16 +0,0 @@ |
|||
from websocket import ABNF |
|||
|
|||
from holster.interface import Interface |
|||
|
|||
|
|||
class BaseEncoder(Interface): |
|||
TYPE = None |
|||
OPCODE = ABNF.OPCODE_TEXT |
|||
|
|||
@staticmethod |
|||
def encode(obj): |
|||
pass |
|||
|
|||
@staticmethod |
|||
def decode(obj): |
|||
pass |
@ -1,24 +0,0 @@ |
|||
import six |
|||
from websocket import ABNF |
|||
|
|||
from disco.gateway.encoding.base import BaseEncoder |
|||
|
|||
if six.PY3: |
|||
from earl import unpack, pack |
|||
else: |
|||
from erlpack import unpack, pack |
|||
|
|||
|
|||
class ETFEncoder(BaseEncoder): |
|||
TYPE = 'etf' |
|||
OPCODE = ABNF.OPCODE_BINARY |
|||
|
|||
@staticmethod |
|||
def encode(obj): |
|||
return pack(obj) |
|||
|
|||
@staticmethod |
|||
def decode(obj): |
|||
if six.PY3: |
|||
return unpack(obj, encoding='utf-8', encode_binary_ext=True) |
|||
return unpack(obj) |
@ -1,20 +0,0 @@ |
|||
from __future__ import absolute_import, print_function |
|||
|
|||
try: |
|||
import ujson as json |
|||
except ImportError: |
|||
import json |
|||
|
|||
from disco.gateway.encoding.base import BaseEncoder |
|||
|
|||
|
|||
class JSONEncoder(BaseEncoder): |
|||
TYPE = 'json' |
|||
|
|||
@staticmethod |
|||
def encode(obj): |
|||
return json.dumps(obj) |
|||
|
|||
@staticmethod |
|||
def decode(obj): |
|||
return json.loads(obj) |
@ -1,746 +0,0 @@ |
|||
from __future__ import print_function |
|||
|
|||
import six |
|||
|
|||
from disco.types.user import User, Presence |
|||
from disco.types.channel import Channel, PermissionOverwrite |
|||
from disco.types.message import Message, MessageReactionEmoji |
|||
from disco.types.voice import VoiceState |
|||
from disco.types.guild import Guild, GuildMember, Role, GuildEmoji |
|||
from disco.types.base import Model, ModelMeta, Field, ListField, AutoDictField, snowflake, datetime |
|||
from disco.util.string import underscore |
|||
|
|||
# Mapping of discords event name to our event classes |
|||
EVENTS_MAP = {} |
|||
|
|||
|
|||
class GatewayEventMeta(ModelMeta): |
|||
def __new__(mcs, name, parents, dct): |
|||
obj = super(GatewayEventMeta, mcs).__new__(mcs, name, parents, dct) |
|||
|
|||
if name != 'GatewayEvent': |
|||
EVENTS_MAP[underscore(name).upper()] = obj |
|||
|
|||
return obj |
|||
|
|||
|
|||
class GatewayEvent(six.with_metaclass(GatewayEventMeta, Model)): |
|||
""" |
|||
The GatewayEvent class wraps various functionality for events passed to us |
|||
over the gateway websocket, and serves as a simple proxy to inner values for |
|||
some wrapped event-types (e.g. MessageCreate only contains a message, so we |
|||
proxy all attributes to the inner message object). |
|||
""" |
|||
|
|||
@staticmethod |
|||
def from_dispatch(client, data): |
|||
""" |
|||
Create a new GatewayEvent instance based on event data. |
|||
""" |
|||
cls = EVENTS_MAP.get(data['t']) |
|||
if not cls: |
|||
raise Exception('Could not find cls for {} ({})'.format(data['t'], data)) |
|||
|
|||
return cls.create(data['d'], client) |
|||
|
|||
@classmethod |
|||
def create(cls, obj, client): |
|||
""" |
|||
Create this GatewayEvent class from data and the client. |
|||
""" |
|||
cls.raw_data = obj |
|||
|
|||
# If this event is wrapping a model, pull its fields |
|||
if hasattr(cls, '_wraps_model'): |
|||
alias, model = cls._wraps_model |
|||
|
|||
data = { |
|||
k: obj.pop(k) for k in six.iterkeys(model._fields) if k in obj |
|||
} |
|||
|
|||
obj[alias] = data |
|||
|
|||
obj = cls(obj, client) |
|||
|
|||
if hasattr(cls, '_attach'): |
|||
field, to = cls._attach |
|||
setattr(getattr(obj, to[0]), to[1], getattr(obj, field)) |
|||
|
|||
return obj |
|||
|
|||
def __getattr__(self, name): |
|||
try: |
|||
_proxy = object.__getattribute__(self, '_proxy') |
|||
except AttributeError: |
|||
return object.__getattribute__(self, name) |
|||
|
|||
return getattr(getattr(self, _proxy), name) |
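To make the registry concrete: a raw dispatch packet such as the one sketched below (values illustrative) is routed through `EVENTS_MAP` by `from_dispatch`, since `underscore('MessageCreate').upper()` yields the Discord-side event name:

packet = {'op': 0, 's': 42, 't': 'MESSAGE_CREATE', 'd': {'id': 0, 'channel_id': 0}}
event_cls = EVENTS_MAP[packet['t']]          # -> MessageCreate (defined below)
# event = event_cls.create(packet['d'], client)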
|||
|
|||
|
|||
def debug(func=None, match=None): |
|||
def deco(cls): |
|||
old_init = cls.__init__ |
|||
|
|||
def new_init(self, obj, *args, **kwargs): |
|||
if not match or match(obj): |
|||
if func: |
|||
print(func(obj)) |
|||
else: |
|||
print(obj) |
|||
|
|||
old_init(self, obj, *args, **kwargs) |
|||
|
|||
cls.__init__ = new_init |
|||
return cls |
|||
return deco |
|||
|
|||
|
|||
def wraps_model(model, alias=None): |
|||
alias = alias or model.__name__.lower() |
|||
|
|||
def deco(cls): |
|||
cls._fields[alias] = Field(model) |
|||
cls._fields[alias].name = alias |
|||
cls._wraps_model = (alias, model) |
|||
cls._proxy = alias |
|||
return cls |
|||
return deco |
|||
|
|||
|
|||
def proxy(field): |
|||
def deco(cls): |
|||
cls._proxy = field |
|||
return cls |
|||
return deco |
|||
|
|||
|
|||
def attach(field, to=None): |
|||
def deco(cls): |
|||
cls._attach = (field, to) |
|||
return cls |
|||
return deco |
|||
|
|||
|
|||
class Ready(GatewayEvent): |
|||
""" |
|||
Sent after the initial gateway handshake is complete. Contains data required |
|||
for bootstrapping the client's states. |
|||
|
|||
Attributes |
|||
----- |
|||
version : int |
|||
The gateway version. |
|||
session_id : str |
|||
The session ID. |
|||
user : :class:`disco.types.user.User` |
|||
The user object for the authed account. |
|||
guilds : list[:class:`disco.types.guild.Guild`] |
|||
All guilds this account is a member of. These are shallow guild objects. |
|||
private_channels : list[:class:`disco.types.channel.Channel`] |
|||
All private channels (DMs) open for this account. |
|||
""" |
|||
version = Field(int, alias='v') |
|||
session_id = Field(str) |
|||
user = Field(User) |
|||
guilds = ListField(Guild) |
|||
private_channels = ListField(Channel) |
|||
trace = ListField(str, alias='_trace') |
|||
|
|||
|
|||
class Resumed(GatewayEvent): |
|||
""" |
|||
Sent after a resume completes. |
|||
""" |
|||
trace = ListField(str, alias='_trace') |
|||
|
|||
|
|||
@wraps_model(Guild) |
|||
class GuildCreate(GatewayEvent): |
|||
""" |
|||
Sent when a guild is joined, or becomes available. |
|||
|
|||
Attributes |
|||
----- |
|||
guild : :class:`disco.types.guild.Guild` |
|||
The guild being created (e.g. joined) |
|||
unavailable : bool |
|||
If false, this guild is coming online from a previously unavailable state, |
|||
and if None, this is a normal guild join event. |
|||
""" |
|||
unavailable = Field(bool) |
|||
presences = ListField(Presence) |
|||
|
|||
@property |
|||
def created(self): |
|||
""" |
|||
Shortcut property which is true when we actually joined the guild. |
|||
""" |
|||
return self.unavailable is None |
|||
|
|||
|
|||
@wraps_model(Guild) |
|||
class GuildUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a guild is updated. |
|||
|
|||
Attributes |
|||
----- |
|||
guild : :class:`disco.types.guild.Guild` |
|||
The updated guild object. |
|||
""" |
|||
|
|||
|
|||
class GuildDelete(GatewayEvent): |
|||
""" |
|||
Sent when a guild is deleted, left, or becomes unavailable. |
|||
|
|||
Attributes |
|||
----- |
|||
id : snowflake |
|||
The ID of the guild being deleted. |
|||
unavailable : bool |
|||
If true, this guild is becoming unavailable, if None this is a normal |
|||
guild leave event. |
|||
""" |
|||
id = Field(snowflake) |
|||
unavailable = Field(bool) |
|||
|
|||
@property |
|||
def deleted(self): |
|||
""" |
|||
Shortcut property which is true when we actually have left the guild. |
|||
""" |
|||
return self.unavailable is None |
|||
|
|||
|
|||
@wraps_model(Channel) |
|||
class ChannelCreate(GatewayEvent): |
|||
""" |
|||
Sent when a channel is created. |
|||
|
|||
Attributes |
|||
----- |
|||
channel : :class:`disco.types.channel.Channel` |
|||
The channel which was created. |
|||
""" |
|||
|
|||
|
|||
@wraps_model(Channel) |
|||
class ChannelUpdate(ChannelCreate): |
|||
""" |
|||
Sent when a channel is updated. |
|||
|
|||
Attributes |
|||
----- |
|||
channel : :class:`disco.types.channel.Channel` |
|||
The channel which was updated. |
|||
""" |
|||
overwrites = AutoDictField(PermissionOverwrite, 'id', alias='permission_overwrites') |
|||
|
|||
|
|||
@wraps_model(Channel) |
|||
class ChannelDelete(ChannelCreate): |
|||
""" |
|||
Sent when a channel is deleted. |
|||
|
|||
Attributes |
|||
----- |
|||
channel : :class:`disco.types.channel.Channel` |
|||
The channel being deleted. |
|||
""" |
|||
|
|||
|
|||
class ChannelPinsUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a channel's pins are updated. |
|||
|
|||
Attributes |
|||
----- |
|||
channel_id : snowflake |
|||
The ID of the channel where the pins were updated. |
|||
last_pin_timestamp : datetime |
|||
The time the last message was pinned. |
|||
""" |
|||
channel_id = Field(snowflake) |
|||
last_pin_timestamp = Field(datetime) |
|||
|
|||
|
|||
@proxy(User) |
|||
class GuildBanAdd(GatewayEvent): |
|||
""" |
|||
Sent when a user is banned from a guild. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild the user is being banned from. |
|||
user : :class:`disco.types.user.User` |
|||
The user being banned from the guild. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
user = Field(User) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
@proxy(User) |
|||
class GuildBanRemove(GuildBanAdd): |
|||
""" |
|||
Sent when a user is unbanned from a guild. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild the user is being unbanned from. |
|||
user : :class:`disco.types.user.User` |
|||
The user being unbanned from the guild. |
|||
""" |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
class GuildEmojisUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a guild's emojis are updated. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild the emojis are being updated in. |
|||
emojis : list[:class:`disco.types.guild.GuildEmoji`] |
|||
The new set of emojis for the guild. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
emojis = ListField(GuildEmoji) |
|||
|
|||
|
|||
class GuildIntegrationsUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a guild's integrations are updated. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild the integrations were updated in. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
|
|||
|
|||
class GuildMembersChunk(GatewayEvent): |
|||
""" |
|||
Sent in response to a guild members request, as a chunk of members. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild this member chunk is for. |
|||
members : list[:class:`disco.types.guild.GuildMember`] |
|||
The chunk of members. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
members = ListField(GuildMember) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
@wraps_model(GuildMember, alias='member') |
|||
class GuildMemberAdd(GatewayEvent): |
|||
""" |
|||
Sent when a user joins a guild. |
|||
|
|||
Attributes |
|||
----- |
|||
member : :class:`disco.types.guild.GuildMember` |
|||
The member that has joined the guild. |
|||
""" |
|||
|
|||
|
|||
@proxy('user') |
|||
class GuildMemberRemove(GatewayEvent): |
|||
""" |
|||
Sent when a user leaves a guild (by leaving, being kicked, or being banned). |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild the member left from. |
|||
user : :class:`disco.types.user.User` |
|||
The user who was removed from the guild. |
|||
""" |
|||
user = Field(User) |
|||
guild_id = Field(snowflake) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
@wraps_model(GuildMember, alias='member') |
|||
class GuildMemberUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a guild's member is updated. |
|||
|
|||
Attributes |
|||
----- |
|||
member : :class:`disco.types.guild.GuildMember` |
|||
The member being updated. |
|||
""" |
|||
|
|||
|
|||
@proxy('role') |
|||
@attach('guild_id', to=('role', 'guild_id')) |
|||
class GuildRoleCreate(GatewayEvent): |
|||
""" |
|||
Sent when a role is created. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild where the role was created. |
|||
role : :class:`disco.types.guild.Role` |
|||
The role that was created. |
|||
""" |
|||
role = Field(Role) |
|||
guild_id = Field(snowflake) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
class GuildRoleUpdate(GuildRoleCreate): |
|||
""" |
|||
Sent when a role is updated. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild where the role was updated. |
|||
role : :class:`disco.types.guild.Role` |
|||
The role that was updated. |
|||
""" |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
class GuildRoleDelete(GatewayEvent): |
|||
""" |
|||
Sent when a role is deleted. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild where the role is being deleted. |
|||
role_id : snowflake |
|||
The ID of the role being deleted. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
role_id = Field(snowflake) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
@wraps_model(Message) |
|||
class MessageCreate(GatewayEvent): |
|||
""" |
|||
Sent when a message is created. |
|||
|
|||
Attributes |
|||
----- |
|||
message : :class:`disco.types.message.Message` |
|||
The message being created. |
|||
guild_id : snowflake |
|||
The ID of the guild this message comes from. |
|||
""" |
|||
guild_id = Field(snowflake) |
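Since `handle_dispatch` in the gateway client above re-emits each event under its class name, subscribing to this event from a bare `Client` looks roughly like the following (handler body illustrative, and `client` is assumed to be an existing Client instance):

def on_message_create(event):
    # Attribute access is proxied through to the wrapped Message model.
    print(event.author, event.content)

client.events.on('MessageCreate', on_message_create)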
|||
|
|||
|
|||
@wraps_model(Message) |
|||
class MessageUpdate(MessageCreate): |
|||
""" |
|||
Sent when a message is updated/edited. |
|||
|
|||
Attributes |
|||
----- |
|||
message : :class:`disco.types.message.Message` |
|||
The message being updated. |
|||
guild_id : snowflake |
|||
The ID of the guild this message exists in. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
|
|||
|
|||
class MessageDelete(GatewayEvent): |
|||
""" |
|||
Sent when a message is deleted. |
|||
|
|||
Attributes |
|||
----- |
|||
id : snowflake |
|||
The ID of the message being deleted. |
|||
channel_id : snowflake |
|||
The ID of the channel the message was deleted in. |
|||
guild_id : snowflake |
|||
The ID of the guild this message existed in. |
|||
""" |
|||
id = Field(snowflake) |
|||
channel_id = Field(snowflake) |
|||
guild_id = Field(snowflake) |
|||
|
|||
@property |
|||
def channel(self): |
|||
return self.client.state.channels.get(self.channel_id) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.channel.guild |
|||
|
|||
|
|||
class MessageDeleteBulk(GatewayEvent): |
|||
""" |
|||
Sent when multiple messages are deleted from a channel. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The guild the messages are being deleted in. |
|||
channel_id : snowflake |
|||
The channel the messages are being deleted in. |
|||
ids : list[snowflake] |
|||
List of the IDs of the messages being deleted in the channel. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
channel_id = Field(snowflake) |
|||
ids = ListField(snowflake) |
|||
|
|||
@property |
|||
def channel(self): |
|||
return self.client.state.channels.get(self.channel_id) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.channel.guild |
|||
|
|||
|
|||
@wraps_model(Presence) |
|||
class PresenceUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a user's presence is updated. |
|||
|
|||
Attributes |
|||
----- |
|||
presence : :class:`disco.types.user.Presence` |
|||
The updated presence object. |
|||
guild_id : snowflake |
|||
The guild this presence update is for. |
|||
roles : list[snowflake] |
|||
List of role IDs belonging to the user this presence update is for. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
roles = ListField(snowflake) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.client.state.guilds.get(self.guild_id) |
|||
|
|||
|
|||
class TypingStart(GatewayEvent): |
|||
""" |
|||
Sent when a user begins typing in a channel. |
|||
|
|||
Attributes |
|||
----- |
|||
guild_id : snowflake |
|||
The ID of the guild where the user is typing. |
|||
channel_id : snowflake |
|||
The ID of the channel where the user is typing. |
|||
user_id : snowflake |
|||
The ID of the user who is typing. |
|||
timestamp : datetime |
|||
When the user started typing. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
channel_id = Field(snowflake) |
|||
user_id = Field(snowflake) |
|||
timestamp = Field(datetime) |
|||
|
|||
|
|||
@wraps_model(VoiceState, alias='state') |
|||
class VoiceStateUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a user's voice state changes. |
|||
|
|||
Attributes |
|||
----- |
|||
state : :class:`disco.types.voice.VoiceState` |
|||
The voice state which was updated. |
|||
""" |
|||
|
|||
|
|||
class VoiceServerUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a voice server is updated. |
|||
|
|||
Attributes |
|||
----- |
|||
token : str |
|||
The token for the voice server. |
|||
endpoint : str |
|||
The endpoint for the voice server. |
|||
guild_id : snowflake |
|||
The guild ID this voice server update is for. |
|||
""" |
|||
token = Field(str) |
|||
endpoint = Field(str) |
|||
guild_id = Field(snowflake) |
|||
|
|||
|
|||
class WebhooksUpdate(GatewayEvent): |
|||
""" |
|||
Sent when a channel's webhooks are updated. |
|||
|
|||
Attributes |
|||
----- |
|||
channel_id : snowflake |
|||
The channel ID this webhooks update is for. |
|||
guild_id : snowflake |
|||
The guild ID this webhooks update is for. |
|||
""" |
|||
channel_id = Field(snowflake) |
|||
guild_id = Field(snowflake) |
|||
|
|||
|
|||
class MessageReactionAdd(GatewayEvent): |
|||
""" |
|||
Sent when a reaction is added to a message. |
|||
|
|||
Attributes |
|||
---------- |
|||
guild_id : snowflake |
|||
The guild ID the message is in. |
|||
channel_id : snowflake |
|||
The channel ID the message is in. |
|||
message_id : snowflake |
|||
The ID of the message the reaction was added to. |
|||
user_id : snowflake |
|||
The ID of the user who added the reaction. |
|||
emoji : :class:`disco.types.message.MessageReactionEmoji` |
|||
The emoji which was added. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
channel_id = Field(snowflake) |
|||
message_id = Field(snowflake) |
|||
user_id = Field(snowflake) |
|||
emoji = Field(MessageReactionEmoji) |
|||
|
|||
def delete(self): |
|||
self.client.api.channels_messages_reactions_delete( |
|||
self.channel_id, |
|||
self.message_id, |
|||
self.emoji.to_string() if self.emoji.id else self.emoji.name, |
|||
self.user_id, |
|||
) |
|||
|
|||
@property |
|||
def channel(self): |
|||
return self.client.state.channels.get(self.channel_id) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.channel.guild |
|||
|
|||
|
|||
class MessageReactionRemove(GatewayEvent): |
|||
""" |
|||
Sent when a reaction is removed from a message. |
|||
|
|||
Attributes |
|||
---------- |
|||
guild_id : snowflake |
|||
The guild ID the message is in. |
|||
channel_id : snowflake |
|||
The channel ID the message is in. |
|||
message_id : snowflake |
|||
The ID of the message the reaction was removed from. |
|||
user_id : snowflake |
|||
The ID of the user who originally added the reaction. |
|||
emoji : :class:`disco.types.message.MessageReactionEmoji` |
|||
The emoji which was removed. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
channel_id = Field(snowflake) |
|||
message_id = Field(snowflake) |
|||
user_id = Field(snowflake) |
|||
emoji = Field(MessageReactionEmoji) |
|||
|
|||
@property |
|||
def channel(self): |
|||
return self.client.state.channels.get(self.channel_id) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.channel.guild |
|||
|
|||
|
|||
class MessageReactionRemoveAll(GatewayEvent): |
|||
""" |
|||
Sent when all reactions are removed from a message. |
|||
|
|||
Attributes |
|||
---------- |
|||
guild_id : snowflake |
|||
The guild ID the message is in. |
|||
channel_id : snowflake |
|||
The channel ID the message is in. |
|||
message_id : snowflake |
|||
The ID of the message the reactions were removed from. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
channel_id = Field(snowflake) |
|||
message_id = Field(snowflake) |
|||
|
|||
@property |
|||
def channel(self): |
|||
return self.client.state.channels.get(self.channel_id) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.channel.guild |
|||
|
|||
|
|||
class MessageReactionRemoveEmoji(GatewayEvent): |
|||
""" |
|||
Sent when all reactions of a single emoji are removed from a message. |
|||
Attributes |
|||
---------- |
|||
guild_id : snowflake |
|||
The guild ID the message is in. |
|||
channel_id : snowflake |
|||
The channel ID the message is in. |
|||
message_id : snowflake |
|||
The ID of the message the reaction was removed from. |
|||
emoji : :class:`disco.types.message.MessageReactionEmoji` |
|||
The emoji that was removed. |
|||
""" |
|||
guild_id = Field(snowflake) |
|||
channel_id = Field(snowflake) |
|||
message_id = Field(snowflake) |
|||
emoji = Field(MessageReactionEmoji) |
|||
|
|||
@property |
|||
def channel(self): |
|||
return self.client.state.channels.get(self.channel_id) |
|||
|
|||
@property |
|||
def guild(self): |
|||
return self.channel.guild |
@ -1,91 +0,0 @@ |
|||
import random |
|||
import gevent |
|||
import string |
|||
import weakref |
|||
|
|||
from holster.enum import Enum |
|||
|
|||
from disco.util.logging import LoggingClass |
|||
from disco.util.serializer import dump_function, load_function |
|||
|
|||
|
|||
def get_random_str(size): |
|||
return ''.join([random.choice(string.printable) for _ in range(size)]) |
|||
|
|||
|
|||
IPCMessageType = Enum( |
|||
'CALL_FUNC', |
|||
'GET_ATTR', |
|||
'EXECUTE', |
|||
'RESPONSE', |
|||
) |
|||
|
|||
|
|||
class GIPCProxy(LoggingClass): |
|||
def __init__(self, obj, pipe): |
|||
super(GIPCProxy, self).__init__() |
|||
self.obj = obj |
|||
self.pipe = pipe |
|||
self.results = weakref.WeakValueDictionary() |
|||
gevent.spawn(self.read_loop) |
|||
|
|||
def resolve(self, parts): |
|||
base = self.obj |
|||
for part in parts: |
|||
base = getattr(base, part) |
|||
|
|||
return base |
|||
|
|||
def send(self, typ, data): |
|||
self.pipe.put((typ.value, data)) |
|||
|
|||
def handle(self, mtype, data): |
|||
if mtype == IPCMessageType.CALL_FUNC: |
|||
nonce, func, args, kwargs = data |
|||
res = self.resolve(func)(*args, **kwargs) |
|||
self.send(IPCMessageType.RESPONSE, (nonce, res)) |
|||
elif mtype == IPCMessageType.GET_ATTR: |
|||
nonce, path = data |
|||
self.send(IPCMessageType.RESPONSE, (nonce, self.resolve(path))) |
|||
elif mtype == IPCMessageType.EXECUTE: |
|||
nonce, raw = data |
|||
func = load_function(raw) |
|||
try: |
|||
result = func(self.obj) |
|||
except Exception: |
|||
self.log.exception('Failed to EXECUTE: ') |
|||
result = None |
|||
|
|||
self.send(IPCMessageType.RESPONSE, (nonce, result)) |
|||
elif mtype == IPCMessageType.RESPONSE: |
|||
nonce, res = data |
|||
if nonce in self.results: |
|||
self.results[nonce].set(res) |
|||
|
|||
def read_loop(self): |
|||
while True: |
|||
mtype, data = self.pipe.get() |
|||
|
|||
try: |
|||
self.handle(mtype, data) |
|||
except Exception: |
|||
self.log.exception('Error in GIPCProxy:') |
|||
|
|||
def execute(self, func): |
|||
nonce = get_random_str(32) |
|||
raw = dump_function(func) |
|||
self.results[nonce] = result = gevent.event.AsyncResult() |
|||
self.pipe.put((IPCMessageType.EXECUTE.value, (nonce, raw))) |
|||
return result |
|||
|
|||
def get(self, path): |
|||
nonce = get_random_str(32) |
|||
self.results[nonce] = result = gevent.event.AsyncResult() |
|||
self.pipe.put((IPCMessageType.GET_ATTR.value, (nonce, path))) |
|||
return result |
|||
|
|||
def call(self, path, *args, **kwargs): |
|||
nonce = get_random_str(32) |
|||
self.results[nonce] = result = gevent.event.AsyncResult() |
|||
self.pipe.put((IPCMessageType.CALL_FUNC.value, (nonce, path, args, kwargs))) |
|||
return result |
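All three request types resolve asynchronously: the caller gets a gevent `AsyncResult` back and blocks on `.wait()` only when the answer is needed. A hedged sketch, where `bot` and `pipe` (one end of a gipc duplex pipe) are assumed to already exist:

proxy = GIPCProxy(bot, pipe)
# Ship a function to the process on the other end; it runs against that side's
# wrapped object and the return value comes back through the AsyncResult.
result = proxy.execute(lambda remote_bot: len(remote_bot.client.state.guilds))
print(result.wait(timeout=15))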
@ -1,20 +0,0 @@ |
|||
from holster.enum import Enum |
|||
|
|||
SEND = 1 |
|||
RECV = 2 |
|||
|
|||
OPCode = Enum( |
|||
DISPATCH=0, |
|||
HEARTBEAT=1, |
|||
IDENTIFY=2, |
|||
STATUS_UPDATE=3, |
|||
VOICE_STATE_UPDATE=4, |
|||
VOICE_SERVER_PING=5, |
|||
RESUME=6, |
|||
RECONNECT=7, |
|||
REQUEST_GUILD_MEMBERS=8, |
|||
INVALID_SESSION=9, |
|||
HELLO=10, |
|||
HEARTBEAT_ACK=11, |
|||
GUILD_SYNC=12, |
|||
) |
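The holster `Enum` used here can be looked up by attribute or by value, which is how the gateway client above resolves incoming packets (`OPCode[data['op']]`); for example:

assert OPCode.HEARTBEAT.value == 1
assert OPCode[11] == OPCode.HEARTBEAT_ACK   # value-based lookup, as used by the gateway client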
@ -1,111 +0,0 @@ |
|||
from __future__ import absolute_import |
|||
|
|||
import gipc |
|||
import gevent |
|||
import pickle |
|||
import logging |
|||
import marshal |
|||
|
|||
from six import integer_types, string_types |
|||
from six.moves import range |
|||
|
|||
from disco.client import Client |
|||
from disco.bot import Bot, BotConfig |
|||
from disco.api.client import APIClient |
|||
from disco.gateway.ipc import GIPCProxy |
|||
from disco.util.logging import setup_logging |
|||
from disco.util.snowflake import calculate_shard |
|||
from disco.util.serializer import dump_function, load_function |
|||
|
|||
|
|||
def run_shard(config, shard_id, pipe): |
|||
setup_logging( |
|||
level=logging.INFO, |
|||
format='{} [%(levelname)s] %(asctime)s - %(name)s:%(lineno)d - %(message)s'.format(shard_id), |
|||
) |
|||
|
|||
config.shard_id = shard_id |
|||
client = Client(config) |
|||
bot = Bot(client, BotConfig(config.bot)) |
|||
bot.sharder = GIPCProxy(bot, pipe) |
|||
bot.shards = ShardHelper(config.shard_count, bot) |
|||
bot.run_forever() |
|||
|
|||
|
|||
class ShardHelper(object): |
|||
def __init__(self, count, bot): |
|||
self.count = count |
|||
self.bot = bot |
|||
|
|||
def keys(self): |
|||
for sid in range(self.count): |
|||
yield sid |
|||
|
|||
def on(self, sid, func): |
|||
if sid == self.bot.client.config.shard_id: |
|||
result = gevent.event.AsyncResult() |
|||
result.set(func(self.bot)) |
|||
return result |
|||
|
|||
return self.bot.sharder.call(('run_on', ), sid, dump_function(func)) |
|||
|
|||
def all(self, func, timeout=None): |
|||
pool = gevent.pool.Pool(self.count) |
|||
return dict(zip( |
|||
range(self.count), |
|||
pool.imap( |
|||
lambda i: self.on(i, func).wait(timeout=timeout), |
|||
range(self.count), |
|||
), |
|||
)) |
|||
|
|||
def for_id(self, sid, func): |
|||
shard = calculate_shard(self.count, sid) |
|||
return self.on(shard, func) |
|||
|
|||
|
|||
class AutoSharder(object): |
|||
def __init__(self, config): |
|||
self.config = config |
|||
self.client = APIClient(config.token) |
|||
self.shards = {} |
|||
self.config.shard_count = self.client.gateway_bot_get()['shards'] |
|||
|
|||
def run_on(self, sid, raw): |
|||
func = load_function(raw) |
|||
return self.shards[sid].execute(func).wait(timeout=15) |
|||
|
|||
def run(self): |
|||
for shard in range(self.config.shard_count): |
|||
if self.config.manhole_enable and shard != 0: |
|||
self.config.manhole_enable = False |
|||
|
|||
self.start_shard(shard) |
|||
gevent.sleep(6) |
|||
|
|||
logging.basicConfig( |
|||
level=logging.INFO, |
|||
format='{} [%(levelname)s] %(asctime)s - %(name)s:%(lineno)d - %(message)s'.format(id), |
|||
) |
|||
|
|||
@staticmethod |
|||
def dumps(data): |
|||
if isinstance(data, (string_types, integer_types, bool, list, set, dict)): |
|||
return '\x01' + marshal.dumps(data) |
|||
elif isinstance(data, object) and data.__class__.__name__ == 'code': |
|||
return '\x01' + marshal.dumps(data) |
|||
else: |
|||
return '\x02' + pickle.dumps(data) |
|||
|
|||
@staticmethod |
|||
def loads(data): |
|||
enc_type = data[0] |
|||
if enc_type == '\x01': |
|||
return marshal.loads(data[1:]) |
|||
elif enc_type == '\x02': |
|||
return pickle.loads(data[1:]) |
|||
|
|||
def start_shard(self, sid): |
|||
cpipe, ppipe = gipc.pipe(duplex=True, encoder=self.dumps, decoder=self.loads) |
|||
gipc.start_process(run_shard, (self.config, sid, cpipe)) |
|||
self.shards[sid] = GIPCProxy(self, ppipe) |
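With the sharder running, a plugin can fan a function out to every shard through `bot.shards` (the `ShardHelper` attached in `run_shard` above); a minimal sketch, assuming `bot` is the current Bot instance:

# Count guilds across all shards; each shard runs the lambda against its own bot.
guild_counts = bot.shards.all(lambda bot: len(bot.client.state.guilds), timeout=10)
print(sum(guild_counts.values()))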
Some files were not shown because too many files changed in this diff