Initial commit: Knowledge Collection & Analysis Agent project

- Django backend API service
- Vue frontend interface
- Integration with the Wang Pu intelligent analysis API
- Task management and report generation
- Element Plus UI components
- Responsive layout design

This commit is contained in:
commit f57e8494f4

@@ -0,0 +1,173 @@
# External API Specification

## 1. Teacher Liu's Crawler API

### Endpoint

```
POST http://liu-teacher-api/crawl-data
```

### Request Parameters

```json
{
  "task_id": 123,
  "sources_config": {
    "presetSources": [
      {
        "id": "wechat-group-001",
        "name": "AI技术讨论群",
        "category": "wechat",
        "type": "微信|企微"
      },
      {
        "id": "official-account-002",
        "name": "科技前沿公众号",
        "category": "wechatOfficial",
        "type": "公众号"
      }
    ],
    "customSources": [
      {
        "id": "custom-web-001",
        "name": "自定义网站",
        "url": "https://example.com",
        "category": "web",
        "type": "网页"
      }
    ],
    "webSearchEnabled": true
  },
  "current_execution_time": "2025-09-19T17:42:08.123456+08:00",
  "last_execution_time": "2025-09-18T17:42:08.123456+08:00",
  "web_search_enabled": true,
  "requirement": "用户输入的需求描述"
}
```

### Response Format

```json
{
  "code": 200,
  "message": "爬取成功",
  "data": {
    "crawler_task_id": "crawler-task-123-1726756928",
    "crawled_data": [
      {
        "source_id": "wechat-group-001",
        "source_name": "AI技术讨论群",
        "source_type": "wechat",
        "content": "爬取到的完整文本内容...",
        "metadata": {
          "crawl_time": "2025-09-19T17:42:08+08:00",
          "data_count": 150,
          "time_range": {
            "start": "2025-09-18T17:42:08+08:00",
            "end": "2025-09-19T17:42:08+08:00"
          }
        }
      }
    ]
  }
}
```

### Notes

- **Purpose**: crawl data incrementally based on timestamps, so nothing is missed between runs
- **Time window**: all data between `last_execution_time` and `current_execution_time`
- **Returned data**: the full crawled text, used as the input for report generation
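The incremental-crawl request above can be sketched in backend code as follows. This is a minimal sketch, not the project's actual client: the helper name `build_crawl_payload`, the 24-hour first-run look-back, and the fixed +08:00 offset are illustrative assumptions; only the field names come from the spec.

```python
from datetime import datetime, timedelta, timezone

CST = timezone(timedelta(hours=8))  # the +08:00 offset used in the spec's examples


def build_crawl_payload(task_id, sources_config, requirement,
                        last_execution_time=None, web_search_enabled=True):
    """Build the request body for POST /crawl-data.

    `last_execution_time` should be the previous run's
    `current_execution_time`; on a first run this sketch assumes a
    24-hour look-back window.
    """
    now = datetime.now(CST)
    last = last_execution_time or now - timedelta(hours=24)
    return {
        "task_id": task_id,
        "sources_config": sources_config,
        "current_execution_time": now.isoformat(),
        "last_execution_time": last.isoformat(),
        "web_search_enabled": web_search_enabled,
        "requirement": requirement,
    }


payload = build_crawl_payload(
    123,
    {"presetSources": [], "customSources": [], "webSearchEnabled": True},
    "用户输入的需求描述",
)
# The payload would then be POSTed, e.g.:
# requests.post("http://liu-teacher-api/crawl-data", json=payload, timeout=30)
```

Because both timestamps carry the same offset, the next run can reuse the previous `current_execution_time` as its `last_execution_time`, and no records fall between consecutive windows.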
---

## 2. Wang Pu Report Generation API

### Endpoint

```
POST http://wangpu-api/generate-report
```

### Request Parameters

```json
{
  "task_id": 123,
  "requirement": "用户输入的需求描述",
  "crawler_data": [
    {
      "source_id": "wechat-group-001",
      "source_name": "AI技术讨论群",
      "source_type": "wechat",
      "content": "完整的爬虫数据内容..."
    }
  ],
  "web_search_enabled": true
}
```

### Response Format

```json
{
  "code": 200,
  "message": "报告生成任务已创建",
  "data": {
    "report_task_id": "report-task-123-1726756928",
    "status": "processing"
  }
}
```

### Notes

- **Purpose**: generate an intelligent report from the crawled data and the user's requirement
- **Input**: the crawler data returned by Teacher Liu's API, plus the user's requirement
- **Output**: a report-task ID; the finished report is delivered later via the callback
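Shaping section 1's response into this request is a pure transformation: each `crawled_data` item maps onto a `crawler_data` item with its `metadata` block dropped. A minimal sketch (the function name is illustrative and not part of either spec):

```python
def to_report_request(task_id, requirement, crawled_data, web_search_enabled=True):
    """Convert crawler output (section 1) into the report-generation
    request body (section 2), keeping only the four content fields."""
    return {
        "task_id": task_id,
        "requirement": requirement,
        "crawler_data": [
            {
                "source_id": item["source_id"],
                "source_name": item["source_name"],
                "source_type": item["source_type"],
                "content": item["content"],
            }
            for item in crawled_data
        ],
        "web_search_enabled": web_search_enabled,
    }
```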
---

## 3. Report Completion Callback API (provided by us to Wang Pu)

### Endpoint

```
POST http://our-api/report-callback
```

### Request Parameters (sent by Wang Pu)

```json
{
  "report_task_id": "report-task-123-1726756928",
  "task_id": 123,
  "status": "completed",
  "report_data": {
    "title": "AI技术发展趋势报告",
    "content": "完整的报告内容...",
    "summary": "报告摘要",
    "generated_time": "2025-09-19T17:45:08+08:00"
  }
}
```

### Response Format (returned by us)

```json
{
  "code": 200,
  "message": "报告接收成功"
}
```
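On our side, the callback receiver mostly needs to validate this payload and answer with the fixed-format response. A framework-agnostic sketch (inside the Django backend this logic would run in a POST view; the 400 responses for malformed payloads are an assumption, since the spec only defines the success response):

```python
import json

REQUIRED_KEYS = {"report_task_id", "task_id", "status", "report_data"}


def handle_report_callback(body: bytes) -> dict:
    """Validate the callback body and return the JSON response to send back.

    Persisting `report_data` to the Report model is omitted in this sketch.
    """
    try:
        payload = json.loads(body)
    except (UnicodeDecodeError, json.JSONDecodeError):
        return {"code": 400, "message": "invalid JSON"}
    if not isinstance(payload, dict):
        return {"code": 400, "message": "payload must be a JSON object"}
    missing = REQUIRED_KEYS - payload.keys()
    if missing:
        return {"code": 400, "message": f"missing fields: {sorted(missing)}"}
    # payload["report_data"] would be saved against payload["task_id"] here.
    return {"code": 200, "message": "报告接收成功"}
```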
---

## 4. Implementation Status

### Current Status (mocked)

- ✅ Interface specifications defined
- ✅ Timestamp logic implemented
- ✅ Data formats standardized
- 🔄 Development proceeding on mock data

### Pending Integration

- ⏳ Teacher Liu's crawler API implementation
- ⏳ Wang Pu's report-generation API implementation
- ⏳ Report-callback receiver implementation

### Next Steps

1. Finish the scheduled-task system
2. Implement the report-callback receiver endpoint
3. Improve error handling and retry logic
4. Prepare integration tests for the interfaces

@@ -0,0 +1,5 @@
DEBUG=True
SECRET_KEY=your-secret-key-here-change-in-production
DATABASE_URL=sqlite:///db.sqlite3
REDIS_URL=redis://localhost:6379/0
OPENAI_API_KEY=your-openai-api-key-here

@@ -0,0 +1,16 @@
"""
ASGI config for info_reporter_backend project.

It exposes the ASGI callable as a module-level variable named ``application``.

For more information on this file, see
https://docs.djangoproject.com/en/4.2/howto/deployment/asgi/
"""

import os

from django.core.asgi import get_asgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'info_reporter_backend.settings')

application = get_asgi_application()

@@ -0,0 +1,133 @@
"""
Django settings for info_reporter_backend project.
"""

from pathlib import Path
from decouple import config

# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent

# Security settings
SECRET_KEY = config('SECRET_KEY', default='django-insecure-change-me-in-production')
DEBUG = config('DEBUG', default=True, cast=bool)
ALLOWED_HOSTS = ['localhost', '127.0.0.1', '0.0.0.0']

# Application definition
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',

    # Third party apps
    'rest_framework',
    'corsheaders',

    # Local apps
    'tasks',
]

MIDDLEWARE = [
    'corsheaders.middleware.CorsMiddleware',
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]

ROOT_URLCONF = 'info_reporter_backend.urls'

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]

WSGI_APPLICATION = 'info_reporter_backend.wsgi.application'

# Database
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}

# Password validation
AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]

# Internationalization
LANGUAGE_CODE = 'zh-hans'
TIME_ZONE = 'Asia/Shanghai'
USE_I18N = True
USE_TZ = True

# Static files
STATIC_URL = 'static/'

# Default primary key field type
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'

# Django REST Framework
REST_FRAMEWORK = {
    'DEFAULT_RENDERER_CLASSES': [
        'rest_framework.renderers.JSONRenderer',
    ],
    'DEFAULT_PARSER_CLASSES': [
        'rest_framework.parsers.JSONParser',
    ],
    'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.PageNumberPagination',
    'PAGE_SIZE': 10,
    'DATETIME_FORMAT': '%Y-%m-%d %H:%M:%S',
}

# CORS settings
CORS_ALLOWED_ORIGINS = [
    "http://localhost:3000",
    "http://127.0.0.1:3000",
]

CORS_ALLOW_CREDENTIALS = True

# OpenAI settings
OPENAI_API_KEY = config('OPENAI_API_KEY', default='')

# Redis settings
REDIS_URL = config('REDIS_URL', default='redis://localhost:6379/0')

# Celery settings
CELERY_BROKER_URL = REDIS_URL
CELERY_RESULT_BACKEND = REDIS_URL
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE

@@ -0,0 +1,10 @@
"""
URL configuration for info_reporter_backend project.
"""
from django.contrib import admin
from django.urls import path, include

urlpatterns = [
    path('admin/', admin.site.urls),
    path('', include('tasks.urls')),
]

@@ -0,0 +1,16 @@
"""
WSGI config for info_reporter_backend project.

It exposes the WSGI callable as a module-level variable named ``application``.

For more information on this file, see
https://docs.djangoproject.com/en/4.2/howto/deployment/wsgi/
"""

import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'info_reporter_backend.settings')

application = get_wsgi_application()

@@ -0,0 +1,22 @@
#!/usr/bin/env python
"""Django's command-line utility for administrative tasks."""
import os
import sys


def main():
    """Run administrative tasks."""
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'info_reporter_backend.settings')
    try:
        from django.core.management import execute_from_command_line
    except ImportError as exc:
        raise ImportError(
            "Couldn't import Django. Are you sure it's installed and "
            "available on your PYTHONPATH environment variable? Did you "
            "forget to activate a virtual environment?"
        ) from exc
    execute_from_command_line(sys.argv)


if __name__ == '__main__':
    main()

@@ -0,0 +1,9 @@
Django==4.2.7
djangorestframework==3.14.0
django-cors-headers==4.3.1
python-decouple==3.8
psycopg2-binary==2.9.9
celery==5.3.4
redis==5.0.1
requests==2.31.0
openai==1.3.7

@@ -0,0 +1,3 @@
from django.contrib import admin

# Register your models here.

@@ -0,0 +1,6 @@
from django.apps import AppConfig


class TasksConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'tasks'

@@ -0,0 +1,136 @@
# Generated by Django 4.2.7 on 2025-09-19 02:19

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Report',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=255, verbose_name='报告标题')),
                ('summary', models.TextField(blank=True, verbose_name='报告摘要')),
                ('content', models.TextField(verbose_name='报告内容')),
                ('source_tag', models.CharField(blank=True, max_length=100, verbose_name='信息来源标签')),
                ('word_count', models.IntegerField(default=0, verbose_name='字数统计')),
                ('status', models.CharField(choices=[('generating', '生成中'), ('completed', '已完成'), ('failed', '失败')], default='generating', max_length=20, verbose_name='生成状态')),
                ('generated_at', models.DateTimeField(blank=True, null=True, verbose_name='生成完成时间')),
                ('created_at', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
                ('updated_at', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
            ],
            options={
                'verbose_name': '报告',
                'verbose_name_plural': '报告',
                'db_table': 'reports',
                'ordering': ['-generated_at', '-created_at'],
            },
        ),
        migrations.CreateModel(
            name='Task',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=255, verbose_name='任务标题')),
                ('description', models.TextField(blank=True, verbose_name='任务描述')),
                ('requirement', models.TextField(verbose_name='用户需求描述')),
                ('type', models.CharField(choices=[('single', '单次任务'), ('scheduled', '周期任务')], max_length=20, verbose_name='任务类型')),
                ('status', models.CharField(choices=[('generating', '生成中'), ('running', '运行中'), ('paused', '暂停'), ('error', '异常'), ('completed', '已完成')], default='generating', max_length=20, verbose_name='任务状态')),
                ('schedule_config', models.JSONField(blank=True, null=True, verbose_name='周期任务配置')),
                ('sources_config', models.JSONField(default=dict, verbose_name='信息源配置')),
                ('web_search_enabled', models.BooleanField(default=True, verbose_name='是否启用联网搜索')),
                ('last_report_time', models.DateTimeField(blank=True, null=True, verbose_name='最新报告生成时间')),
                ('next_run_time', models.DateTimeField(blank=True, null=True, verbose_name='下次执行时间')),
                ('created_at', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
                ('updated_at', models.DateTimeField(auto_now=True, verbose_name='更新时间')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL, verbose_name='用户')),
            ],
            options={
                'verbose_name': '任务',
                'verbose_name_plural': '任务',
                'db_table': 'tasks',
                'ordering': ['-last_report_time', '-created_at'],
            },
        ),
        migrations.CreateModel(
            name='ReportSource',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('source_type', models.CharField(choices=[('wechat', '微信'), ('official-account', '公众号'), ('feishu', '飞书'), ('dingtalk', '钉钉'), ('email', '邮箱'), ('website', '网页'), ('baidu-pan', '百度网盘'), ('web-search', '联网搜索')], max_length=20, verbose_name='信息源类型')),
                ('source_name', models.CharField(max_length=255, verbose_name='信息源名称')),
                ('source_url', models.URLField(blank=True, verbose_name='原始链接')),
                ('raw_content', models.TextField(blank=True, verbose_name='原始内容')),
                ('extracted_content', models.TextField(blank=True, verbose_name='提取的关键内容')),
                ('collected_at', models.DateTimeField(auto_now_add=True, verbose_name='采集时间')),
                ('report', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='sources', to='tasks.report', verbose_name='报告')),
            ],
            options={
                'verbose_name': '报告来源数据',
                'verbose_name_plural': '报告来源数据',
                'db_table': 'report_sources',
            },
        ),
        migrations.AddField(
            model_name='report',
            name='task',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='reports', to='tasks.task', verbose_name='任务'),
        ),
        migrations.CreateModel(
            name='TaskLog',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('action_type', models.CharField(choices=[('created', '创建'), ('started', '开始'), ('paused', '暂停'), ('resumed', '恢复'), ('completed', '完成'), ('error', '错误')], max_length=20, verbose_name='操作类型')),
                ('message', models.TextField(blank=True, verbose_name='日志信息')),
                ('error_details', models.JSONField(blank=True, null=True, verbose_name='错误详情')),
                ('created_at', models.DateTimeField(auto_now_add=True, verbose_name='创建时间')),
                ('task', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='logs', to='tasks.task', verbose_name='任务')),
            ],
            options={
                'verbose_name': '任务执行日志',
                'verbose_name_plural': '任务执行日志',
                'db_table': 'task_logs',
                'ordering': ['-created_at'],
                'indexes': [models.Index(fields=['task', 'created_at'], name='task_logs_task_id_e5a91c_idx'), models.Index(fields=['action_type'], name='task_logs_action__459c46_idx')],
            },
        ),
        migrations.AddIndex(
            model_name='task',
            index=models.Index(fields=['user', 'status'], name='tasks_user_id_a53e17_idx'),
        ),
        migrations.AddIndex(
            model_name='task',
            index=models.Index(fields=['last_report_time'], name='tasks_last_re_0965d4_idx'),
        ),
        migrations.AddIndex(
            model_name='task',
            index=models.Index(fields=['status'], name='tasks_status_031d4c_idx'),
        ),
        migrations.AddIndex(
            model_name='reportsource',
            index=models.Index(fields=['report'], name='report_sour_report__67ea05_idx'),
        ),
        migrations.AddIndex(
            model_name='reportsource',
            index=models.Index(fields=['source_type'], name='report_sour_source__1d8840_idx'),
        ),
        migrations.AddIndex(
            model_name='report',
            index=models.Index(fields=['task', 'generated_at'], name='reports_task_id_ff8229_idx'),
        ),
        migrations.AddIndex(
            model_name='report',
            index=models.Index(fields=['status'], name='reports_status_e83c1d_idx'),
        ),
        migrations.AddIndex(
            model_name='report',
            index=models.Index(fields=['generated_at'], name='reports_generat_a38578_idx'),
        ),
    ]

@@ -0,0 +1,18 @@
# Generated by Django 4.2.7 on 2025-09-19 08:41

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('tasks', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='task',
            name='status',
            field=models.CharField(choices=[('running', '运行中'), ('generating', '生成中'), ('error', '异常'), ('paused', '暂停')], default='running', max_length=20, verbose_name='任务状态'),
        ),
    ]

@@ -0,0 +1,184 @@
from django.db import models
from django.contrib.auth.models import User


class Task(models.Model):
    """Task model."""

    TASK_TYPES = [
        ('single', '单次任务'),
        ('scheduled', '周期任务'),
    ]

    TASK_STATUSES = [
        ('running', '运行中'),      # the task is active
        ('generating', '生成中'),   # a report is currently being generated
        ('error', '异常'),          # report generation failed
        ('paused', '暂停'),         # the task is paused
    ]

    title = models.CharField(max_length=255, verbose_name='任务标题')
    description = models.TextField(blank=True, verbose_name='任务描述')
    requirement = models.TextField(verbose_name='用户需求描述')
    type = models.CharField(max_length=20, choices=TASK_TYPES, verbose_name='任务类型')
    status = models.CharField(max_length=20, choices=TASK_STATUSES, default='running', verbose_name='任务状态')

    # Configuration stored in JSON fields
    schedule_config = models.JSONField(null=True, blank=True, verbose_name='周期任务配置')
    sources_config = models.JSONField(default=dict, verbose_name='信息源配置')

    web_search_enabled = models.BooleanField(default=True, verbose_name='是否启用联网搜索')
    user = models.ForeignKey(User, on_delete=models.CASCADE, verbose_name='用户')

    # Timestamp fields
    last_report_time = models.DateTimeField(null=True, blank=True, verbose_name='最新报告生成时间')
    next_run_time = models.DateTimeField(null=True, blank=True, verbose_name='下次执行时间')
    created_at = models.DateTimeField(auto_now_add=True, verbose_name='创建时间')
    updated_at = models.DateTimeField(auto_now=True, verbose_name='更新时间')

    class Meta:
        db_table = 'tasks'
        verbose_name = '任务'
        verbose_name_plural = '任务'
        ordering = ['-last_report_time', '-created_at']
        indexes = [
            models.Index(fields=['user', 'status']),
            models.Index(fields=['last_report_time']),
            models.Index(fields=['status']),
        ]

    def __str__(self):
        return f"{self.title} ({self.get_type_display()})"

    @property
    def report_count(self):
        """Number of reports generated for this task."""
        return self.reports.count()

    @property
    def has_new_report(self):
        """Whether the task has a fresh report (used for sorting)."""
        if not self.last_report_time:
            return False
        # A report generated within the last 24 hours counts as new.
        from django.utils import timezone
        from datetime import timedelta
        return timezone.now() - self.last_report_time < timedelta(hours=24)


class Report(models.Model):
    """Report model."""

    REPORT_STATUSES = [
        ('generating', '生成中'),
        ('completed', '已完成'),
        ('failed', '失败'),
    ]

    task = models.ForeignKey(Task, on_delete=models.CASCADE, related_name='reports', verbose_name='任务')
    title = models.CharField(max_length=255, verbose_name='报告标题')
    summary = models.TextField(blank=True, verbose_name='报告摘要')
    content = models.TextField(verbose_name='报告内容')
    source_tag = models.CharField(max_length=100, blank=True, verbose_name='信息来源标签')
    word_count = models.IntegerField(default=0, verbose_name='字数统计')
    status = models.CharField(max_length=20, choices=REPORT_STATUSES, default='generating', verbose_name='生成状态')

    generated_at = models.DateTimeField(null=True, blank=True, verbose_name='生成完成时间')
    created_at = models.DateTimeField(auto_now_add=True, verbose_name='创建时间')
    updated_at = models.DateTimeField(auto_now=True, verbose_name='更新时间')

    class Meta:
        db_table = 'reports'
        verbose_name = '报告'
        verbose_name_plural = '报告'
        ordering = ['-generated_at', '-created_at']
        indexes = [
            models.Index(fields=['task', 'generated_at']),
            models.Index(fields=['status']),
            models.Index(fields=['generated_at']),
        ]

    def __str__(self):
        return f"{self.title} - {self.task.title}"

    def save(self, *args, **kwargs):
        # Recompute the word count from the content.
        if self.content:
            self.word_count = len(self.content.replace(' ', '').replace('\n', ''))

        # When the report completes, stamp generated_at and update the
        # task's latest report time.
        if self.status == 'completed' and not self.generated_at:
            from django.utils import timezone
            self.generated_at = timezone.now()
            self.task.last_report_time = self.generated_at
            self.task.save(update_fields=['last_report_time'])

        super().save(*args, **kwargs)


class ReportSource(models.Model):
    """Source data attached to a report."""

    SOURCE_TYPES = [
        ('wechat', '微信'),
        ('official-account', '公众号'),
        ('feishu', '飞书'),
        ('dingtalk', '钉钉'),
        ('email', '邮箱'),
        ('website', '网页'),
        ('baidu-pan', '百度网盘'),
        ('web-search', '联网搜索'),
    ]

    report = models.ForeignKey(Report, on_delete=models.CASCADE, related_name='sources', verbose_name='报告')
    source_type = models.CharField(max_length=20, choices=SOURCE_TYPES, verbose_name='信息源类型')
    source_name = models.CharField(max_length=255, verbose_name='信息源名称')
    source_url = models.URLField(blank=True, verbose_name='原始链接')
    raw_content = models.TextField(blank=True, verbose_name='原始内容')
    extracted_content = models.TextField(blank=True, verbose_name='提取的关键内容')
    collected_at = models.DateTimeField(auto_now_add=True, verbose_name='采集时间')

    class Meta:
        db_table = 'report_sources'
        verbose_name = '报告来源数据'
        verbose_name_plural = '报告来源数据'
        indexes = [
            models.Index(fields=['report']),
            models.Index(fields=['source_type']),
        ]

    def __str__(self):
        return f"{self.source_name} - {self.report.title}"


class TaskLog(models.Model):
    """Task execution log."""

    ACTION_TYPES = [
        ('created', '创建'),
        ('started', '开始'),
        ('paused', '暂停'),
        ('resumed', '恢复'),
        ('completed', '完成'),
        ('error', '错误'),
    ]

    task = models.ForeignKey(Task, on_delete=models.CASCADE, related_name='logs', verbose_name='任务')
    action_type = models.CharField(max_length=20, choices=ACTION_TYPES, verbose_name='操作类型')
    message = models.TextField(blank=True, verbose_name='日志信息')
    error_details = models.JSONField(null=True, blank=True, verbose_name='错误详情')
    created_at = models.DateTimeField(auto_now_add=True, verbose_name='创建时间')

    class Meta:
        db_table = 'task_logs'
        verbose_name = '任务执行日志'
        verbose_name_plural = '任务执行日志'
        ordering = ['-created_at']
        indexes = [
            models.Index(fields=['task', 'created_at']),
            models.Index(fields=['action_type']),
        ]

    def __str__(self):
        return f"{self.task.title} - {self.get_action_type_display()}"

@@ -0,0 +1,154 @@
from rest_framework import serializers
from .models import Task, Report, ReportSource, TaskLog


class TaskSerializer(serializers.ModelSerializer):
    """Task serializer."""

    report_count = serializers.ReadOnlyField()
    has_new_report = serializers.ReadOnlyField()

    class Meta:
        model = Task
        fields = [
            'id', 'title', 'description', 'requirement', 'type', 'status',
            'schedule_config', 'sources_config', 'web_search_enabled',
            'last_report_time', 'next_run_time', 'created_at', 'updated_at',
            'report_count', 'has_new_report'
        ]
        read_only_fields = ['id', 'created_at', 'updated_at', 'last_report_time']


class CreateTaskSerializer(serializers.Serializer):
    """Serializer for creating a task."""

    requirement = serializers.CharField(max_length=5000, help_text='用户需求描述')
    type = serializers.ChoiceField(choices=['single', 'scheduled'], help_text='任务类型')
    schedule = serializers.JSONField(required=False, allow_null=True, help_text='周期任务配置')
    presetSources = serializers.ListField(
        child=serializers.DictField(),
        required=False,
        default=list,
        help_text='预设信息源'
    )
    customSources = serializers.ListField(
        child=serializers.DictField(),
        required=False,
        default=list,
        help_text='自定义信息源'
    )
    webSearchEnabled = serializers.BooleanField(default=True, help_text='是否启用联网搜索')

    def validate_schedule(self, value):
        """Validate the schedule configuration."""
        task_type = self.initial_data.get('type')

        # Single-run tasks may omit the schedule (None or null).
        if task_type == 'single':
            return None

        # Scheduled tasks must provide a schedule configuration.
        if task_type == 'scheduled':
            if not value:
                raise serializers.ValidationError('周期任务必须提供执行周期配置')

            required_fields = ['frequency', 'time']
            for field in required_fields:
                if field not in value:
                    raise serializers.ValidationError(f'周期任务配置缺少必要字段: {field}')

        return value


class ReportSourceSerializer(serializers.ModelSerializer):
    """Report source serializer."""

    class Meta:
        model = ReportSource
        fields = [
            'id', 'source_type', 'source_name', 'source_url',
            'extracted_content', 'collected_at'
        ]


class ReportSerializer(serializers.ModelSerializer):
    """Report serializer."""

    task_title = serializers.CharField(source='task.title', read_only=True)
    sources = ReportSourceSerializer(many=True, read_only=True)

    class Meta:
        model = Report
        fields = [
            'id', 'task_id', 'task_title', 'title', 'summary', 'content',
            'source_tag', 'word_count', 'status', 'generated_at',
            'created_at', 'updated_at', 'sources'
        ]
        read_only_fields = ['id', 'created_at', 'updated_at', 'word_count']


class ReportListSerializer(serializers.ModelSerializer):
    """Report list serializer (omits the content field)."""

    task_title = serializers.CharField(source='task.title', read_only=True)

    class Meta:
        model = Report
        fields = [
            'id', 'task_id', 'task_title', 'title', 'summary',
            'source_tag', 'word_count', 'status', 'generated_at'
        ]


class TaskLogSerializer(serializers.ModelSerializer):
    """Task log serializer."""

    class Meta:
        model = TaskLog
        fields = [
            'id', 'action_type', 'message', 'error_details', 'created_at'
        ]
        read_only_fields = ['id', 'created_at']


class TaskDetailSerializer(TaskSerializer):
    """Task detail serializer."""

    logs = TaskLogSerializer(many=True, read_only=True)
    recent_reports = serializers.SerializerMethodField()

    class Meta(TaskSerializer.Meta):
        fields = TaskSerializer.Meta.fields + ['logs', 'recent_reports']

    def get_recent_reports(self, obj):
        """Return the three most recent completed reports."""
        recent_reports = obj.reports.filter(status='completed')[:3]
        return ReportListSerializer(recent_reports, many=True).data


class UpdateTaskStatusSerializer(serializers.Serializer):
    """Serializer for updating the task status."""

    status = serializers.ChoiceField(choices=['running', 'paused'], help_text='任务状态')


class TaskStatisticsSerializer(serializers.Serializer):
    """Task statistics serializer."""

    total_tasks = serializers.IntegerField()
    running_tasks = serializers.IntegerField()
    paused_tasks = serializers.IntegerField()
    error_tasks = serializers.IntegerField()
    completed_tasks = serializers.IntegerField()
    today_reports = serializers.IntegerField()
    weekly_reports = serializers.IntegerField()


class ReportStatisticsSerializer(serializers.Serializer):
    """Report statistics serializer."""

    total_reports = serializers.IntegerField()
    generated_today = serializers.IntegerField()
    average_word_count = serializers.FloatField()
    source_distribution = serializers.DictField()
daily_generation = serializers.ListField(child=serializers.DictField())
|
||||||
|
|
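The schedule validation above (single tasks may omit a schedule; scheduled tasks must supply `frequency` and `time`) can be sketched as a framework-free function for quick reasoning. This is a sketch, not part of the commit: `validate_schedule` is a hypothetical standalone name, and `ValueError` stands in for `serializers.ValidationError`.

```python
def validate_schedule(task_type, schedule):
    """Mirror of the serializer rules: single tasks ignore the schedule,
    scheduled tasks must supply 'frequency' and 'time'."""
    if task_type == 'single':
        return None  # schedule is allowed to be None/null
    if task_type == 'scheduled':
        if not schedule:
            raise ValueError('scheduled tasks require a schedule config')
        for field in ('frequency', 'time'):
            if field not in schedule:
                raise ValueError(f'schedule config missing required field: {field}')
    return schedule
```

For example, `validate_schedule('single', None)` returns `None`, while a scheduled task missing `time` raises.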
@@ -0,0 +1,298 @@
"""
Business-logic services for tasks.
"""
import re
from typing import Dict, Tuple
from django.conf import settings


class AITitleGenerator:
    """AI title-generation service."""

    @staticmethod
    def generate_title_and_description(requirement: str) -> Tuple[str, str]:
        """
        Generate a task title and description from the user's requirement.

        Args:
            requirement: the user's requirement text

        Returns:
            tuple: (title, description)
        """
        # Simplified keyword matching (a full implementation would call the OpenAI API)
        keywords = requirement.lower()

        # Defaults
        title = '信息监控任务'
        description = '智能监控和分析任务'

        # Keyword-matching rules
        if any(keyword in keywords for keyword in ['openai', 'ai', '人工智能', '机器学习']):
            title = 'AI技术动态监控'
            description = '实时监控AI技术发展动态和产品更新'
        elif any(keyword in keywords for keyword in ['产品', '功能', '发布', '更新']):
            title = '产品功能监控'
            description = '跟踪产品功能更新和发布动态'
        elif any(keyword in keywords for keyword in ['新闻', '资讯', '动态', '消息']):
            title = '新闻资讯监控'
            description = '收集和分析相关新闻资讯'
        elif any(keyword in keywords for keyword in ['市场', '行业', '竞争', '分析']):
            title = '市场行业分析'
            description = '监控市场动态和行业发展趋势'
        elif any(keyword in keywords for keyword in ['股票', '金融', '投资', '财经']):
            title = '金融投资监控'
            description = '跟踪金融市场和投资机会'
        elif any(keyword in keywords for keyword in ['政策', '法规', '监管', '政府']):
            title = '政策法规监控'
            description = '监控政策变化和法规更新'

        # Extract the monitored target ("关注对象") from the requirement and prepend it to the title
        object_match = re.search(r'关注对象[::]\s*([^\n\r,,。.]+)', requirement)
        if object_match:
            focus_object = object_match.group(1).strip()
            if focus_object and len(focus_object) < 20:
                title = f'{focus_object} - {title}'

        return title, description

    @staticmethod
    def generate_with_openai(requirement: str) -> Tuple[str, str]:
        """
        Generate the title and description via the OpenAI API (alternative path).

        Args:
            requirement: the user's requirement text

        Returns:
            tuple: (title, description)
        """
        try:
            import openai

            if not settings.OPENAI_API_KEY:
                return AITitleGenerator.generate_title_and_description(requirement)

            openai.api_key = settings.OPENAI_API_KEY

            prompt = f"""
根据以下用户需求,生成一个简洁的任务标题和一句话描述:

用户需求:
{requirement}

请返回JSON格式:
{{
    "title": "任务标题(不超过20字)",
    "description": "任务描述(不超过50字)"
}}
"""

            response = openai.ChatCompletion.create(
                model="gpt-3.5-turbo",
                messages=[
                    {"role": "system", "content": "你是一个专业的任务管理助手,擅长根据用户需求生成简洁明确的任务标题和描述。"},
                    {"role": "user", "content": prompt}
                ],
                max_tokens=200,
                temperature=0.7
            )

            result = response.choices[0].message.content

            # Parse the JSON response
            import json
            parsed = json.loads(result)
            return parsed.get('title', '信息监控任务'), parsed.get('description', '智能监控和分析任务')

        except Exception as e:
            print(f"OpenAI API调用失败: {e}")
            # Fall back to local generation
            return AITitleGenerator.generate_title_and_description(requirement)


class TaskService:
    """Task business-logic service."""

    @staticmethod
    def create_task(user, task_data: Dict):
        """
        Create a new task.

        Args:
            user: the user object
            task_data: the task payload

        Returns:
            Task: the created task
        """
        # Generate the AI title and description
        title, description = AITitleGenerator.generate_title_and_description(
            task_data['requirement']
        )

        # Build the sources configuration
        sources_config = {
            'presetSources': task_data.get('presetSources', []),
            'customSources': task_data.get('customSources', []),
            'webSearchEnabled': task_data.get('webSearchEnabled', True)
        }

        # Import models lazily to avoid circular imports
        from .models import Task, TaskLog

        # Create the task
        task = Task.objects.create(
            title=title,
            description=description,
            requirement=task_data['requirement'],
            type=task_data['type'],
            schedule_config=task_data.get('schedule'),
            sources_config=sources_config,
            web_search_enabled=task_data.get('webSearchEnabled', True),
            user=user
        )

        # Log the creation
        TaskLog.objects.create(
            task=task,
            action_type='created',
            message=f'任务创建成功: {title}'
        )

        # Single-run tasks start executing immediately
        if task.type == 'single':
            TaskService.start_task_execution(task)

        return task

    @staticmethod
    def start_task_execution(task):
        """
        Start executing a task.

        Args:
            task: the task object
        """
        # Update the task status
        task.status = 'running'
        task.save(update_fields=['status'])

        from .models import TaskLog

        # Log the start of execution
        TaskLog.objects.create(
            task=task,
            action_type='started',
            message='任务开始执行'
        )

        # A Celery async task could be dispatched here:
        # from .tasks import execute_task
        # execute_task.delay(task.id)

    @staticmethod
    def pause_task(task):
        """Pause a task."""
        from .models import TaskLog

        task.status = 'paused'
        task.save(update_fields=['status'])

        TaskLog.objects.create(
            task=task,
            action_type='paused',
            message='任务已暂停'
        )

    @staticmethod
    def resume_task(task):
        """Resume a task."""
        from .models import TaskLog

        task.status = 'running'
        task.save(update_fields=['status'])

        TaskLog.objects.create(
            task=task,
            action_type='resumed',
            message='任务已恢复'
        )


class ReportService:
    """Report business-logic service."""

    @staticmethod
    def create_mock_report(task):
        """
        Create a mock report (for demos).

        Args:
            task: the task object

        Returns:
            Report: the created report
        """
        # Generate mock content based on the task requirement
        requirement = task.requirement.lower()

        if 'openai' in requirement or 'ai' in requirement:
            title = 'OpenAI GPT-4 Turbo重大更新'
            summary = 'OpenAI发布GPT-4 Turbo新版本,支持更长上下文,性能显著提升...'
            content = """# OpenAI GPT-4 Turbo重大更新

## 主要更新内容
1. **上下文长度提升**: 支持128K tokens的上下文长度
2. **性能优化**: 推理速度提升2倍
3. **成本降低**: API调用成本降低50%

## 详细分析
这次更新标志着AI技术的重要进展...
"""
            source_tag = 'OpenAI官网'
        else:
            title = f'{task.title} - 最新动态报告'
            summary = f'基于您的需求「{task.requirement[:50]}...」生成的分析报告'
            content = f"""# {task.title}

## 监控概要
根据您的需求进行信息收集和分析。

## 关键发现
1. 相关信息已收集完成
2. 数据分析正在进行中
3. 将持续监控相关动态

## 详细内容
{task.requirement}

## 下一步计划
继续监控相关信息源,及时更新分析结果。
"""
            source_tag = '联网搜索'

        from .models import Report, ReportSource

        report = Report.objects.create(
            task=task,
            title=title,
            summary=summary,
            content=content,
            source_tag=source_tag,
            status='completed'
        )

        # Record the report's source
        ReportSource.objects.create(
            report=report,
            source_type='web-search',
            source_name='联网搜索',
            extracted_content=summary
        )

        return report
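The keyword-matching fallback above can be exercised without Django. A minimal sketch of just the title branch plus the "关注对象" extraction (keyword lists and regex copied from the service; `fallback_title` is a hypothetical standalone name, and only two of the six branches are reproduced):

```python
import re

def fallback_title(requirement):
    """Pick a title by keyword, then prefix the '关注对象: ...' target if present."""
    keywords = requirement.lower()
    title = '信息监控任务'  # default, as in the service
    if any(k in keywords for k in ('openai', 'ai', '人工智能', '机器学习')):
        title = 'AI技术动态监控'
    elif any(k in keywords for k in ('新闻', '资讯', '动态', '消息')):
        title = '新闻资讯监控'
    # Same extraction regex as the service: stop at newlines, commas, or periods
    m = re.search(r'关注对象[::]\s*([^\n\r,,。.]+)', requirement)
    if m:
        focus = m.group(1).strip()
        if focus and len(focus) < 20:
            title = f'{focus} - {title}'
    return title
```

Note that branch order matters: a requirement containing both "openai" and "动态" hits the AI branch because the `elif` chain stops at the first match.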
@@ -0,0 +1,3 @@
from django.test import TestCase

# Create your tests here.
@@ -0,0 +1,15 @@
from django.urls import path, include
from rest_framework.routers import DefaultRouter
from .views import TaskViewSet, ReportViewSet, StatisticsViewSet, report_callback, generate_report

# Create the router
router = DefaultRouter()
router.register(r'tasks', TaskViewSet, basename='task')
router.register(r'reports', ReportViewSet, basename='report')
router.register(r'statistics', StatisticsViewSet, basename='statistics')

urlpatterns = [
    path('api/', include(router.urls)),
    path('api/report-callback/', report_callback, name='report-callback'),
    path('generate_report', generate_report, name='generate-report'),
]
File diff suppressed because it is too large
@@ -0,0 +1,241 @@
<#
.Synopsis
Activate a Python virtual environment for the current PowerShell session.

.Description
Pushes the python executable for a virtual environment to the front of the
$Env:PATH environment variable and sets the prompt to signify that you are
in a Python virtual environment. Makes use of the command line switches as
well as the `pyvenv.cfg` file values present in the virtual environment.

.Parameter VenvDir
Path to the directory that contains the virtual environment to activate. The
default value for this is the parent of the directory that the Activate.ps1
script is located within.

.Parameter Prompt
The prompt prefix to display when this virtual environment is activated. By
default, this prompt is the name of the virtual environment folder (VenvDir)
surrounded by parentheses and followed by a single space (ie. '(.venv) ').

.Example
Activate.ps1
Activates the Python virtual environment that contains the Activate.ps1 script.

.Example
Activate.ps1 -Verbose
Activates the Python virtual environment that contains the Activate.ps1 script,
and shows extra information about the activation as it executes.

.Example
Activate.ps1 -VenvDir C:\Users\MyUser\Common\.venv
Activates the Python virtual environment located in the specified location.

.Example
Activate.ps1 -Prompt "MyPython"
Activates the Python virtual environment that contains the Activate.ps1 script,
and prefixes the current prompt with the specified string (surrounded in
parentheses) while the virtual environment is active.

.Notes
On Windows, it may be required to enable this Activate.ps1 script by setting the
execution policy for the user. You can do this by issuing the following PowerShell
command:

PS C:\> Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

For more information on Execution Policies:
https://go.microsoft.com/fwlink/?LinkID=135170

#>
Param(
    [Parameter(Mandatory = $false)]
    [String]
    $VenvDir,
    [Parameter(Mandatory = $false)]
    [String]
    $Prompt
)

<# Function declarations --------------------------------------------------- #>

<#
.Synopsis
Remove all shell session elements added by the Activate script, including the
addition of the virtual environment's Python executable from the beginning of
the PATH variable.

.Parameter NonDestructive
If present, do not remove this function from the global namespace for the
session.

#>
function global:deactivate ([switch]$NonDestructive) {
    # Revert to original values

    # The prior prompt:
    if (Test-Path -Path Function:_OLD_VIRTUAL_PROMPT) {
        Copy-Item -Path Function:_OLD_VIRTUAL_PROMPT -Destination Function:prompt
        Remove-Item -Path Function:_OLD_VIRTUAL_PROMPT
    }

    # The prior PYTHONHOME:
    if (Test-Path -Path Env:_OLD_VIRTUAL_PYTHONHOME) {
        Copy-Item -Path Env:_OLD_VIRTUAL_PYTHONHOME -Destination Env:PYTHONHOME
        Remove-Item -Path Env:_OLD_VIRTUAL_PYTHONHOME
    }

    # The prior PATH:
    if (Test-Path -Path Env:_OLD_VIRTUAL_PATH) {
        Copy-Item -Path Env:_OLD_VIRTUAL_PATH -Destination Env:PATH
        Remove-Item -Path Env:_OLD_VIRTUAL_PATH
    }

    # Just remove the VIRTUAL_ENV altogether:
    if (Test-Path -Path Env:VIRTUAL_ENV) {
        Remove-Item -Path env:VIRTUAL_ENV
    }

    # Just remove the _PYTHON_VENV_PROMPT_PREFIX altogether:
    if (Get-Variable -Name "_PYTHON_VENV_PROMPT_PREFIX" -ErrorAction SilentlyContinue) {
        Remove-Variable -Name _PYTHON_VENV_PROMPT_PREFIX -Scope Global -Force
    }

    # Leave deactivate function in the global namespace if requested:
    if (-not $NonDestructive) {
        Remove-Item -Path function:deactivate
    }
}

<#
.Description
Get-PyVenvConfig parses the values from the pyvenv.cfg file located in the
given folder, and returns them in a map.

For each line in the pyvenv.cfg file, if that line can be parsed into exactly
two strings separated by `=` (with any amount of whitespace surrounding the =)
then it is considered a `key = value` line. The left hand string is the key,
the right hand is the value.

If the value starts with a `'` or a `"` then the first and last character is
stripped from the value before being captured.

.Parameter ConfigDir
Path to the directory that contains the `pyvenv.cfg` file.
#>
function Get-PyVenvConfig(
    [String]
    $ConfigDir
) {
    Write-Verbose "Given ConfigDir=$ConfigDir, obtain values in pyvenv.cfg"

    # Ensure the file exists, and issue a warning if it doesn't (but still allow the function to continue).
    $pyvenvConfigPath = Join-Path -Resolve -Path $ConfigDir -ChildPath 'pyvenv.cfg' -ErrorAction Continue

    # An empty map will be returned if no config file is found.
    $pyvenvConfig = @{ }

    if ($pyvenvConfigPath) {

        Write-Verbose "File exists, parse `key = value` lines"
        $pyvenvConfigContent = Get-Content -Path $pyvenvConfigPath

        $pyvenvConfigContent | ForEach-Object {
            $keyval = $PSItem -split "\s*=\s*", 2
            if ($keyval[0] -and $keyval[1]) {
                $val = $keyval[1]

                # Remove extraneous quotations around a string value.
                if ("'""".Contains($val.Substring(0, 1))) {
                    $val = $val.Substring(1, $val.Length - 2)
                }

                $pyvenvConfig[$keyval[0]] = $val
                Write-Verbose "Adding Key: '$($keyval[0])'='$val'"
            }
        }
    }
    return $pyvenvConfig
}


<# Begin Activate script --------------------------------------------------- #>

# Determine the containing directory of this script
$VenvExecPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$VenvExecDir = Get-Item -Path $VenvExecPath

Write-Verbose "Activation script is located in path: '$VenvExecPath'"
Write-Verbose "VenvExecDir Fullname: '$($VenvExecDir.FullName)"
Write-Verbose "VenvExecDir Name: '$($VenvExecDir.Name)"

# Set values required in priority: CmdLine, ConfigFile, Default
# First, get the location of the virtual environment, it might not be
# VenvExecDir if specified on the command line.
if ($VenvDir) {
    Write-Verbose "VenvDir given as parameter, using '$VenvDir' to determine values"
}
else {
    Write-Verbose "VenvDir not given as a parameter, using parent directory name as VenvDir."
    $VenvDir = $VenvExecDir.Parent.FullName.TrimEnd("\\/")
    Write-Verbose "VenvDir=$VenvDir"
}

# Next, read the `pyvenv.cfg` file to determine any required value such
# as `prompt`.
$pyvenvCfg = Get-PyVenvConfig -ConfigDir $VenvDir

# Next, set the prompt from the command line, or the config file, or
# just use the name of the virtual environment folder.
if ($Prompt) {
    Write-Verbose "Prompt specified as argument, using '$Prompt'"
}
else {
    Write-Verbose "Prompt not specified as argument to script, checking pyvenv.cfg value"
    if ($pyvenvCfg -and $pyvenvCfg['prompt']) {
        Write-Verbose "  Setting based on value in pyvenv.cfg='$($pyvenvCfg['prompt'])'"
        $Prompt = $pyvenvCfg['prompt'];
    }
    else {
        Write-Verbose "  Setting prompt based on parent's directory's name. (Is the directory name passed to venv module when creating the virtual environment)"
        Write-Verbose "  Got leaf-name of $VenvDir='$(Split-Path -Path $venvDir -Leaf)'"
        $Prompt = Split-Path -Path $venvDir -Leaf
    }
}

Write-Verbose "Prompt = '$Prompt'"
Write-Verbose "VenvDir='$VenvDir'"

# Deactivate any currently active virtual environment, but leave the
# deactivate function in place.
deactivate -nondestructive

# Now set the environment variable VIRTUAL_ENV, used by many tools to determine
# that there is an activated venv.
$env:VIRTUAL_ENV = $VenvDir

if (-not $Env:VIRTUAL_ENV_DISABLE_PROMPT) {

    Write-Verbose "Setting prompt to '$Prompt'"

    # Set the prompt to include the env name
    # Make sure _OLD_VIRTUAL_PROMPT is global
    function global:_OLD_VIRTUAL_PROMPT { "" }
    Copy-Item -Path function:prompt -Destination function:_OLD_VIRTUAL_PROMPT
    New-Variable -Name _PYTHON_VENV_PROMPT_PREFIX -Description "Python virtual environment prompt prefix" -Scope Global -Option ReadOnly -Visibility Public -Value $Prompt

    function global:prompt {
        Write-Host -NoNewline -ForegroundColor Green "($_PYTHON_VENV_PROMPT_PREFIX) "
        _OLD_VIRTUAL_PROMPT
    }
}

# Clear PYTHONHOME
if (Test-Path -Path Env:PYTHONHOME) {
    Copy-Item -Path Env:PYTHONHOME -Destination Env:_OLD_VIRTUAL_PYTHONHOME
    Remove-Item -Path Env:PYTHONHOME
}

# Add the venv to the PATH
Copy-Item -Path Env:PATH -Destination Env:_OLD_VIRTUAL_PATH
$Env:PATH = "$VenvExecDir$([System.IO.Path]::PathSeparator)$Env:PATH"
@@ -0,0 +1,66 @@
# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly

deactivate () {
    # reset old environment variables
    if [ -n "${_OLD_VIRTUAL_PATH:-}" ] ; then
        PATH="${_OLD_VIRTUAL_PATH:-}"
        export PATH
        unset _OLD_VIRTUAL_PATH
    fi
    if [ -n "${_OLD_VIRTUAL_PYTHONHOME:-}" ] ; then
        PYTHONHOME="${_OLD_VIRTUAL_PYTHONHOME:-}"
        export PYTHONHOME
        unset _OLD_VIRTUAL_PYTHONHOME
    fi

    # This should detect bash and zsh, which have a hash command that must
    # be called to get it to forget past commands.  Without forgetting
    # past commands the $PATH changes we made may not be respected
    if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
        hash -r 2> /dev/null
    fi

    if [ -n "${_OLD_VIRTUAL_PS1:-}" ] ; then
        PS1="${_OLD_VIRTUAL_PS1:-}"
        export PS1
        unset _OLD_VIRTUAL_PS1
    fi

    unset VIRTUAL_ENV
    if [ ! "${1:-}" = "nondestructive" ] ; then
    # Self destruct!
        unset -f deactivate
    fi
}

# unset irrelevant variables
deactivate nondestructive

VIRTUAL_ENV="/Users/natalie/Documents/知识采集分析Agent/django-backend/venv"
export VIRTUAL_ENV

_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/bin:$PATH"
export PATH

# unset PYTHONHOME if set
# this will fail if PYTHONHOME is set to the empty string (which is bad anyway)
# could use `if (set -u; : $PYTHONHOME) ;` in bash
if [ -n "${PYTHONHOME:-}" ] ; then
    _OLD_VIRTUAL_PYTHONHOME="${PYTHONHOME:-}"
    unset PYTHONHOME
fi

if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT:-}" ] ; then
    _OLD_VIRTUAL_PS1="${PS1:-}"
    PS1="(venv) ${PS1:-}"
    export PS1
fi

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands.  Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
    hash -r 2> /dev/null
fi
@@ -0,0 +1,25 @@
# This file must be used with "source bin/activate.csh" *from csh*.
# You cannot run it directly.
# Created by Davide Di Blasi <davidedb@gmail.com>.
# Ported to Python 3.3 venv by Andrew Svetlov <andrew.svetlov@gmail.com>

alias deactivate 'test $?_OLD_VIRTUAL_PATH != 0 && setenv PATH "$_OLD_VIRTUAL_PATH" && unset _OLD_VIRTUAL_PATH; rehash; test $?_OLD_VIRTUAL_PROMPT != 0 && set prompt="$_OLD_VIRTUAL_PROMPT" && unset _OLD_VIRTUAL_PROMPT; unsetenv VIRTUAL_ENV; test "\!:*" != "nondestructive" && unalias deactivate'

# Unset irrelevant variables.
deactivate nondestructive

setenv VIRTUAL_ENV "/Users/natalie/Documents/知识采集分析Agent/django-backend/venv"

set _OLD_VIRTUAL_PATH="$PATH"
setenv PATH "$VIRTUAL_ENV/bin:$PATH"


set _OLD_VIRTUAL_PROMPT="$prompt"

if (! "$?VIRTUAL_ENV_DISABLE_PROMPT") then
    set prompt = "(venv) $prompt"
endif

alias pydoc python -m pydoc

rehash
@@ -0,0 +1,64 @@
# This file must be used with "source <venv>/bin/activate.fish" *from fish*
# (https://fishshell.com/); you cannot run it directly.

function deactivate -d "Exit virtual environment and return to normal shell environment"
    # reset old environment variables
    if test -n "$_OLD_VIRTUAL_PATH"
        set -gx PATH $_OLD_VIRTUAL_PATH
        set -e _OLD_VIRTUAL_PATH
    end
    if test -n "$_OLD_VIRTUAL_PYTHONHOME"
        set -gx PYTHONHOME $_OLD_VIRTUAL_PYTHONHOME
        set -e _OLD_VIRTUAL_PYTHONHOME
    end

    if test -n "$_OLD_FISH_PROMPT_OVERRIDE"
        functions -e fish_prompt
        set -e _OLD_FISH_PROMPT_OVERRIDE
        functions -c _old_fish_prompt fish_prompt
        functions -e _old_fish_prompt
    end

    set -e VIRTUAL_ENV
    if test "$argv[1]" != "nondestructive"
        # Self-destruct!
        functions -e deactivate
    end
end

# Unset irrelevant variables.
deactivate nondestructive

set -gx VIRTUAL_ENV "/Users/natalie/Documents/知识采集分析Agent/django-backend/venv"

set -gx _OLD_VIRTUAL_PATH $PATH
set -gx PATH "$VIRTUAL_ENV/bin" $PATH

# Unset PYTHONHOME if set.
if set -q PYTHONHOME
    set -gx _OLD_VIRTUAL_PYTHONHOME $PYTHONHOME
    set -e PYTHONHOME
end

if test -z "$VIRTUAL_ENV_DISABLE_PROMPT"
    # fish uses a function instead of an env var to generate the prompt.

    # Save the current fish_prompt function as the function _old_fish_prompt.
    functions -c fish_prompt _old_fish_prompt

    # With the original prompt function renamed, we can override with our own.
    function fish_prompt
        # Save the return status of the last command.
        set -l old_status $status

        # Output the venv prompt; color taken from the blue of the Python logo.
        printf "%s%s%s" (set_color 4B8BBE) "(venv) " (set_color normal)

        # Restore the return status of the previous command.
        echo "exit $old_status" | .
        # Output the original/"old" prompt.
        _old_fish_prompt
    end

    set -gx _OLD_FISH_PROMPT_OVERRIDE "$VIRTUAL_ENV"
end
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from celery.__main__ import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from distro.distro import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from django.core.management import execute_from_command_line
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(execute_from_command_line())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from httpx import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from charset_normalizer.cli import cli_detect
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(cli_detect())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from openai.cli import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1 @@
python3
@@ -0,0 +1 @@
/Library/Developer/CommandLineTools/usr/bin/python3
@@ -0,0 +1 @@
python3
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from sqlparse.__main__ import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
@@ -0,0 +1,8 @@
#!/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/bin/python3
# -*- coding: utf-8 -*-
import re
import sys
from tqdm.cli import main
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
    sys.exit(main())
File diff suppressed because it is too large
@@ -0,0 +1 @@
pip
@@ -0,0 +1,27 @@
Copyright (c) Django Software Foundation and individual contributors.
All rights reserved.

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

    1. Redistributions of source code must retain the above copyright notice,
       this list of conditions and the following disclaimer.

    2. Redistributions in binary form must reproduce the above copyright
       notice, this list of conditions and the following disclaimer in the
       documentation and/or other materials provided with the distribution.

    3. Neither the name of Django nor the names of its contributors may be used
       to endorse or promote products derived from this software without
       specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
@@ -0,0 +1,290 @@
Django is licensed under the three-clause BSD license; see the file
LICENSE for details.

Django includes code from the Python standard library, which is licensed under
the Python license, a permissive open source license. The copyright and license
is included below for compliance with Python's terms.

----------------------------------------------------------------------

Copyright (c) 2001-present Python Software Foundation; All Rights Reserved

A. HISTORY OF THE SOFTWARE
==========================

Python was created in the early 1990s by Guido van Rossum at Stichting
Mathematisch Centrum (CWI, see http://www.cwi.nl) in the Netherlands
as a successor of a language called ABC.  Guido remains Python's
principal author, although it includes many contributions from others.

In 1995, Guido continued his work on Python at the Corporation for
National Research Initiatives (CNRI, see http://www.cnri.reston.va.us)
in Reston, Virginia where he released several versions of the
software.

In May 2000, Guido and the Python core development team moved to
BeOpen.com to form the BeOpen PythonLabs team.  In October of the same
year, the PythonLabs team moved to Digital Creations, which became
Zope Corporation.  In 2001, the Python Software Foundation (PSF, see
https://www.python.org/psf/) was formed, a non-profit organization
created specifically to own Python-related Intellectual Property.
Zope Corporation was a sponsoring member of the PSF.

All Python releases are Open Source (see http://www.opensource.org for
the Open Source Definition).  Historically, most, but not all, Python
releases have also been GPL-compatible; the table below summarizes
the various releases.

    Release         Derived     Year        Owner       GPL-
                    from                                compatible? (1)

    0.9.0 thru 1.2              1991-1995   CWI         yes
    1.3 thru 1.5.2  1.2         1995-1999   CNRI        yes
    1.6             1.5.2       2000        CNRI        no
    2.0             1.6         2000        BeOpen.com  no
    1.6.1           1.6         2001        CNRI        yes (2)
    2.1             2.0+1.6.1   2001        PSF         no
    2.0.1           2.0+1.6.1   2001        PSF         yes
    2.1.1           2.1+2.0.1   2001        PSF         yes
    2.1.2           2.1.1       2002        PSF         yes
    2.1.3           2.1.2       2002        PSF         yes
    2.2 and above   2.1.1       2001-now    PSF         yes

Footnotes:

(1) GPL-compatible doesn't mean that we're distributing Python under
    the GPL.  All Python licenses, unlike the GPL, let you distribute
    a modified version without making your changes open source.  The
    GPL-compatible licenses make it possible to combine Python with
    other software that is released under the GPL; the others don't.

(2) According to Richard Stallman, 1.6.1 is not GPL-compatible,
    because its license has a choice of law clause.  According to
    CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1
    is "not incompatible" with the GPL.

Thanks to the many outside volunteers who have worked under Guido's
direction to make these releases possible.


B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON
===============================================================

Python software and documentation are licensed under the
Python Software Foundation License Version 2.

Starting with Python 3.8.6, examples, recipes, and other code in
the documentation are dual licensed under the PSF License Version 2
and the Zero-Clause BSD license.

Some software incorporated into Python is under different licenses.
The licenses are listed with code falling under that license.


PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
--------------------------------------------

1. This LICENSE AGREEMENT is between the Python Software Foundation
("PSF"), and the Individual or Organization ("Licensee") accessing and
otherwise using this software ("Python") in source or binary form and
its associated documentation.

2. Subject to the terms and conditions of this License Agreement, PSF hereby
grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative version
prepared by Licensee.

3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python.

4. PSF is making Python available to Licensee on an "AS IS"
basis.  PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED.  BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.

5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.

7. Nothing in this License Agreement shall be deemed to create any
relationship of agency, partnership, or joint venture between PSF and
Licensee.  This License Agreement does not grant permission to use PSF
trademarks or trade name in a trademark sense to endorse or promote
products or services of Licensee, or any third party.

8. By copying, installing or otherwise using Python, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.


BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
-------------------------------------------

BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1

1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
Individual or Organization ("Licensee") accessing and otherwise using
this software in source or binary form and its associated
documentation ("the Software").

2. Subject to the terms and conditions of this BeOpen Python License
Agreement, BeOpen hereby grants Licensee a non-exclusive,
royalty-free, world-wide license to reproduce, analyze, test, perform
and/or display publicly, prepare derivative works, distribute, and
otherwise use the Software alone or in any derivative version,
provided, however, that the BeOpen Python License is retained in the
Software, alone or in any derivative version prepared by Licensee.

3. BeOpen is making the Software available to Licensee on an "AS IS"
basis.  BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED.  BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.

4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

5. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.

6. This License Agreement shall be governed by and interpreted in all
respects by the law of the State of California, excluding conflict of
law provisions.  Nothing in this License Agreement shall be deemed to
create any relationship of agency, partnership, or joint venture
between BeOpen and Licensee.  This License Agreement does not grant
permission to use BeOpen trademarks or trade names in a trademark
sense to endorse or promote products or services of Licensee, or any
third party.  As an exception, the "BeOpen Python" logos available at
http://www.pythonlabs.com/logos.html may be used according to the
permissions granted on that web page.

7. By copying, installing or otherwise using the software, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.


CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
---------------------------------------

1. This LICENSE AGREEMENT is between the Corporation for National
Research Initiatives, having an office at 1895 Preston White Drive,
Reston, VA 20191 ("CNRI"), and the Individual or Organization
("Licensee") accessing and otherwise using Python 1.6.1 software in
source or binary form and its associated documentation.

2. Subject to the terms and conditions of this License Agreement, CNRI
hereby grants Licensee a nonexclusive, royalty-free, world-wide
license to reproduce, analyze, test, perform and/or display publicly,
prepare derivative works, distribute, and otherwise use Python 1.6.1
alone or in any derivative version, provided, however, that CNRI's
License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
1995-2001 Corporation for National Research Initiatives; All Rights
Reserved" are retained in Python 1.6.1 alone or in any derivative
version prepared by Licensee.  Alternately, in lieu of CNRI's License
Agreement, Licensee may substitute the following text (omitting the
quotes): "Python 1.6.1 is made available subject to the terms and
conditions in CNRI's License Agreement.  This Agreement together with
Python 1.6.1 may be located on the internet using the following
unique, persistent identifier (known as a handle): 1895.22/1013.  This
Agreement may also be obtained from a proxy server on the internet
using the following URL: http://hdl.handle.net/1895.22/1013".

3. In the event Licensee prepares a derivative work that is based on
or incorporates Python 1.6.1 or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python 1.6.1.

4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
basis.  CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED.  BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.

5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.

7. This License Agreement shall be governed by the federal
intellectual property law of the United States, including without
limitation the federal copyright law, and, to the extent such
U.S. federal law does not apply, by the law of the Commonwealth of
Virginia, excluding Virginia's conflict of law provisions.
Notwithstanding the foregoing, with regard to derivative works based
on Python 1.6.1 that incorporate non-separable material that was
previously distributed under the GNU General Public License (GPL), the
law of the Commonwealth of Virginia shall govern this License
Agreement only as to issues arising under or with respect to
Paragraphs 4, 5, and 7 of this License Agreement.  Nothing in this
License Agreement shall be deemed to create any relationship of
agency, partnership, or joint venture between CNRI and Licensee.  This
License Agreement does not grant permission to use CNRI trademarks or
trade name in a trademark sense to endorse or promote products or
services of Licensee, or any third party.

8. By clicking on the "ACCEPT" button where indicated, or by copying,
installing or otherwise using Python 1.6.1, Licensee agrees to be
bound by the terms and conditions of this License Agreement.

        ACCEPT


CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
--------------------------------------------------

Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
The Netherlands.  All rights reserved.

Permission to use, copy, modify, and distribute this software and its
documentation for any purpose and without fee is hereby granted,
provided that the above copyright notice appear in all copies and that
both that copyright notice and this permission notice appear in
supporting documentation, and that the name of Stichting Mathematisch
Centrum or CWI not be used in advertising or publicity pertaining to
distribution of the software without specific, written prior
permission.

STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

ZERO-CLAUSE BSD LICENSE FOR CODE IN THE PYTHON DOCUMENTATION
----------------------------------------------------------------------

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY
AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
PERFORMANCE OF THIS SOFTWARE.
@@ -0,0 +1,101 @@
Metadata-Version: 2.1
Name: Django
Version: 4.2.7
Summary: A high-level Python web framework that encourages rapid development and clean, pragmatic design.
Home-page: https://www.djangoproject.com/
Author: Django Software Foundation
Author-email: foundation@djangoproject.com
License: BSD-3-Clause
Project-URL: Documentation, https://docs.djangoproject.com/
Project-URL: Release notes, https://docs.djangoproject.com/en/stable/releases/
Project-URL: Funding, https://www.djangoproject.com/fundraising/
Project-URL: Source, https://github.com/django/django
Project-URL: Tracker, https://code.djangoproject.com/
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Web Environment
Classifier: Framework :: Django
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Internet :: WWW/HTTP :: WSGI
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.8
License-File: LICENSE
License-File: LICENSE.python
License-File: AUTHORS
Requires-Dist: asgiref (<4,>=3.6.0)
Requires-Dist: sqlparse (>=0.3.1)
Requires-Dist: backports.zoneinfo ; python_version < "3.9"
Requires-Dist: tzdata ; sys_platform == "win32"
Provides-Extra: argon2
Requires-Dist: argon2-cffi (>=19.1.0) ; extra == 'argon2'
Provides-Extra: bcrypt
Requires-Dist: bcrypt ; extra == 'bcrypt'

======
Django
======

Django is a high-level Python web framework that encourages rapid development
and clean, pragmatic design. Thanks for checking it out.

All documentation is in the "``docs``" directory and online at
https://docs.djangoproject.com/en/stable/. If you're just getting started,
here's how we recommend you read the docs:

* First, read ``docs/intro/install.txt`` for instructions on installing Django.

* Next, work through the tutorials in order (``docs/intro/tutorial01.txt``,
  ``docs/intro/tutorial02.txt``, etc.).

* If you want to set up an actual deployment server, read
  ``docs/howto/deployment/index.txt`` for instructions.

* You'll probably want to read through the topical guides (in ``docs/topics``)
  next; from there you can jump to the HOWTOs (in ``docs/howto``) for specific
  problems, and check out the reference (``docs/ref``) for gory details.

* See ``docs/README`` for instructions on building an HTML version of the docs.

Docs are updated rigorously. If you find any problems in the docs, or think
they should be clarified in any way, please take 30 seconds to fill out a
ticket here: https://code.djangoproject.com/newticket

To get more help:

* Join the ``#django`` channel on ``irc.libera.chat``. Lots of helpful people
  hang out there. See https://web.libera.chat if you're new to IRC.

* Join the django-users mailing list, or read the archives, at
  https://groups.google.com/group/django-users.

To contribute to Django:

* Check out https://docs.djangoproject.com/en/dev/internals/contributing/ for
  information about getting involved.

To run Django's test suite:

* Follow the instructions in the "Unit tests" section of
  ``docs/internals/contributing/writing-code/unit-tests.txt``, published online at
  https://docs.djangoproject.com/en/dev/internals/contributing/writing-code/unit-tests/#running-the-unit-tests

Supporting the Development of Django
====================================

Django's development depends on your contributions.

If you depend on Django, remember to support the Django Software Foundation: https://www.djangoproject.com/fundraising/
File diff suppressed because it is too large
@@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.37.1)
Root-Is-Purelib: true
Tag: py3-none-any

@@ -0,0 +1,3 @@
[console_scripts]
django-admin = django.core.management:execute_from_command_line

@@ -0,0 +1 @@
django
@@ -0,0 +1,128 @@
import sys
import os
import re
import importlib
import warnings


is_pypy = '__pypy__' in sys.builtin_module_names


warnings.filterwarnings('ignore',
                        r'.+ distutils\b.+ deprecated',
                        DeprecationWarning)


def warn_distutils_present():
    if 'distutils' not in sys.modules:
        return
    if is_pypy and sys.version_info < (3, 7):
        # PyPy for 3.6 unconditionally imports distutils, so bypass the warning
        # https://foss.heptapod.net/pypy/pypy/-/blob/be829135bc0d758997b3566062999ee8b23872b4/lib-python/3/site.py#L250
        return
    warnings.warn(
        "Distutils was imported before Setuptools, but importing Setuptools "
        "also replaces the `distutils` module in `sys.modules`. This may lead "
        "to undesirable behaviors or errors. To avoid these issues, avoid "
        "using distutils directly, ensure that setuptools is installed in the "
        "traditional way (e.g. not an editable install), and/or make sure "
        "that setuptools is always imported before distutils.")


def clear_distutils():
    if 'distutils' not in sys.modules:
        return
    warnings.warn("Setuptools is replacing distutils.")
    mods = [name for name in sys.modules if re.match(r'distutils\b', name)]
    for name in mods:
        del sys.modules[name]


def enabled():
    """
    Allow selection of distutils by environment variable.
    """
    which = os.environ.get('SETUPTOOLS_USE_DISTUTILS', 'stdlib')
    return which == 'local'


def ensure_local_distutils():
    clear_distutils()
    distutils = importlib.import_module('setuptools._distutils')
    distutils.__name__ = 'distutils'
    sys.modules['distutils'] = distutils

    # sanity check that submodules load as expected
    core = importlib.import_module('distutils.core')
    assert '_distutils' in core.__file__, core.__file__


def do_override():
    """
    Ensure that the local copy of distutils is preferred over stdlib.

    See https://github.com/pypa/setuptools/issues/417#issuecomment-392298401
    for more motivation.
    """
    if enabled():
        warn_distutils_present()
        ensure_local_distutils()


class DistutilsMetaFinder:
    def find_spec(self, fullname, path, target=None):
        if path is not None:
            return

        method_name = 'spec_for_{fullname}'.format(**locals())
        method = getattr(self, method_name, lambda: None)
        return method()

    def spec_for_distutils(self):
        import importlib.abc
        import importlib.util

        class DistutilsLoader(importlib.abc.Loader):

            def create_module(self, spec):
                return importlib.import_module('setuptools._distutils')

            def exec_module(self, module):
                pass

        return importlib.util.spec_from_loader('distutils', DistutilsLoader())

    def spec_for_pip(self):
        """
        Ensure stdlib distutils when running under pip.
        See pypa/pip#8761 for rationale.
        """
        if self.pip_imported_during_build():
            return
        clear_distutils()
        self.spec_for_distutils = lambda: None

    @staticmethod
    def pip_imported_during_build():
        """
        Detect if pip is being imported in a build script. Ref #2355.
        """
        import traceback
        return any(
            frame.f_globals['__file__'].endswith('setup.py')
            for frame, line in traceback.walk_stack(None)
        )


DISTUTILS_FINDER = DistutilsMetaFinder()


def add_shim():
    sys.meta_path.insert(0, DISTUTILS_FINDER)


def remove_shim():
    try:
        sys.meta_path.remove(DISTUTILS_FINDER)
    except ValueError:
        pass
@@ -0,0 +1 @@
__import__('_distutils_hack').do_override()
@@ -0,0 +1 @@
pip
@@ -0,0 +1,47 @@
Copyright (c) 2015-2016 Ask Solem & contributors.  All rights reserved.
Copyright (c) 2012-2014 GoPivotal, Inc.  All rights reserved.
Copyright (c) 2009, 2010, 2011, 2012 Ask Solem, and individual contributors.  All rights reserved.
Copyright (C) 2007-2008 Barry Pederson <bp@barryp.org>.  All rights reserved.

py-amqp is licensed under The BSD License (3 Clause, also known as
the new BSD license).  The license is an OSI approved Open Source
license and is GPL-compatible(1).

The license text can also be found here:
http://www.opensource.org/licenses/BSD-3-Clause

License
=======

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
    * Redistributions of source code must retain the above copyright
      notice, this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright
      notice, this list of conditions and the following disclaimer in the
      documentation and/or other materials provided with the distribution.
    * Neither the name of Ask Solem, nor the
      names of its contributors may be used to endorse or promote products
      derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL Ask Solem OR CONTRIBUTORS
BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.


Footnotes
=========
(1) A GPL-compatible license makes it possible to
    combine Celery with other software that is released
    under the GPL, it does not mean that we're distributing
    Celery under the GPL license.  The BSD license, unlike the GPL,
    let you distribute a modified version without making your
    changes open source.
@@ -0,0 +1,239 @@
Metadata-Version: 2.1
Name: amqp
Version: 5.3.1
Summary: Low-level AMQP client for Python (fork of amqplib).
Home-page: http://github.com/celery/py-amqp
Author: Barry Pederson
Author-email: auvipy@gmail.com
Maintainer: Asif Saif Uddin, Matus Valo
License: BSD
Keywords: amqp rabbitmq cloudamqp messaging
Platform: any
Classifier: Development Status :: 5 - Production/Stable
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: License :: OSI Approved :: BSD License
Classifier: Intended Audience :: Developers
Classifier: Operating System :: OS Independent
Requires-Python: >=3.6
Description-Content-Type: text/x-rst
License-File: LICENSE
Requires-Dist: vine<6.0.0,>=5.0.0

=====================================================================
 Python AMQP 0.9.1 client library
=====================================================================

|build-status| |coverage| |license| |wheel| |pyversion| |pyimp|

:Version: 5.3.1
:Web: https://amqp.readthedocs.io/
:Download: https://pypi.org/project/amqp/
:Source: http://github.com/celery/py-amqp/
:Keywords: amqp, rabbitmq

About
=====

This is a fork of amqplib_ which was originally written by Barry Pederson.
It is maintained by the Celery_ project, and used by `kombu`_ as a pure python
alternative when `librabbitmq`_ is not available.

This library should be API compatible with `librabbitmq`_.

.. _amqplib: https://pypi.org/project/amqplib/
.. _Celery: http://celeryproject.org/
.. _kombu: https://kombu.readthedocs.io/
.. _librabbitmq: https://pypi.org/project/librabbitmq/

Differences from `amqplib`_
===========================

- Supports draining events from multiple channels (``Connection.drain_events``)
- Support for timeouts
- Channels are restored after channel error, instead of having to close the
  connection.
- Support for heartbeats

    - ``Connection.heartbeat_tick(rate=2)`` must be called at regular intervals
      (half of the heartbeat value if rate is 2).
    - Or some other scheme by using ``Connection.send_heartbeat``.
- Supports RabbitMQ extensions:

    - Consumer Cancel Notifications

        - by default a cancel results in ``ChannelError`` being raised
        - but not if an ``on_cancel`` callback is passed to ``basic_consume``.
    - Publisher confirms

        - ``Channel.confirm_select()`` enables publisher confirms.
        - ``Channel.events['basic_ack'].append(my_callback)`` adds a callback
          to be called when a message is confirmed. This callback is then
          called with the signature ``(delivery_tag, multiple)``.
    - Exchange-to-exchange bindings: ``exchange_bind`` / ``exchange_unbind``.
    - Authentication Failure Notifications

        Instead of just closing the connection abruptly on invalid
        credentials, py-amqp will raise an ``AccessRefused`` error
        when connected to rabbitmq-server 3.2.0 or greater.
- Support for ``basic_return``
- Uses AMQP 0-9-1 instead of 0-8.

    - ``Channel.access_request`` and ``ticket`` arguments to methods
      **removed**.
    - Supports the ``arguments`` argument to ``basic_consume``.
    - ``internal`` argument to ``exchange_declare`` removed.
    - ``auto_delete`` argument to ``exchange_declare`` deprecated.
    - ``insist`` argument to ``Connection`` removed.
    - ``Channel.alerts`` has been removed.
    - Support for ``Channel.basic_recover_async``.
    - ``Channel.basic_recover`` deprecated.
- Exceptions renamed to have idiomatic names:

    - ``AMQPException`` -> ``AMQPError``
    - ``AMQPConnectionException`` -> ``ConnectionError``
    - ``AMQPChannelException`` -> ``ChannelError``
- ``Connection.known_hosts`` removed.
- ``Connection`` no longer supports redirects.
- ``exchange`` argument to ``queue_bind`` can now be empty
  to use the "default exchange".
- Adds ``Connection.is_alive`` that tries to detect
  whether the connection can still be used.
- Adds ``Connection.connection_errors`` and ``.channel_errors``,
  a list of recoverable errors.
- Exposes the underlying socket as ``Connection.sock``.
- Adds ``Channel.no_ack_consumers`` to keep track of consumer tags
  that set the no_ack flag.
- Slightly better at error recovery.

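The publisher-confirm callback protocol described above can be illustrated without a live broker. In this sketch, the ``events`` dict is only a stand-in for ``Channel.events`` — on a real channel, the broker triggers these callbacks after ``Channel.confirm_select()``:

```python
# Stand-in for Channel.events: maps event name -> list of callbacks.
# A real py-amqp Channel invokes the 'basic_ack' entries when the broker
# confirms a publish; here we call them by hand to show the shape.
log = []

def on_basic_ack(delivery_tag, multiple):
    # Signature per the README: (delivery_tag, multiple).
    log.append((delivery_tag, multiple))

events = {'basic_ack': []}
events['basic_ack'].append(on_basic_ack)

# Simulate the broker confirming delivery tag 1 (a single message).
for callback in events['basic_ack']:
    callback(1, False)

print(log)  # [(1, False)]
```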
Quick overview
==============

Simple producer publishing messages to ``test`` queue using default exchange:

.. code:: python

    import amqp

    with amqp.Connection('broker.example.com') as c:
        ch = c.channel()
        ch.basic_publish(amqp.Message('Hello World'), routing_key='test')

Producer publishing to ``test_exchange`` exchange with publisher confirms enabled and using virtual_host ``test_vhost``:

.. code:: python

    import amqp

    with amqp.Connection(
        'broker.example.com', exchange='test_exchange',
        confirm_publish=True, virtual_host='test_vhost'
    ) as c:
        ch = c.channel()
        ch.basic_publish(amqp.Message('Hello World'), routing_key='test')

Consumer with acknowledgments enabled:

.. code:: python

    import amqp

    with amqp.Connection('broker.example.com') as c:
        ch = c.channel()
        def on_message(message):
            print('Received message (delivery tag: {}): {}'.format(message.delivery_tag, message.body))
            ch.basic_ack(message.delivery_tag)
        ch.basic_consume(queue='test', callback=on_message)
        while True:
            c.drain_events()

Consumer with acknowledgments disabled:

.. code:: python

    import amqp

    with amqp.Connection('broker.example.com') as c:
        ch = c.channel()
        def on_message(message):
            print('Received message (delivery tag: {}): {}'.format(message.delivery_tag, message.body))
        ch.basic_consume(queue='test', callback=on_message, no_ack=True)
        while True:
            c.drain_events()

Speedups
========

This library has **experimental** support for speedups. Speedups are implemented using Cython. To enable speedups, the ``CELERY_ENABLE_SPEEDUPS`` environment variable must be set during building/installation.
Currently speedups can be installed:

1. using the source package (using the ``--no-binary`` switch):

.. code:: shell

    CELERY_ENABLE_SPEEDUPS=true pip install --no-binary :all: amqp

2. building directly from source code:

.. code:: shell

    CELERY_ENABLE_SPEEDUPS=true python setup.py install

Further
=======

- Differences between AMQP 0.8 and 0.9.1

    http://www.rabbitmq.com/amqp-0-8-to-0-9-1.html

- AMQP 0.9.1 Quick Reference

    http://www.rabbitmq.com/amqp-0-9-1-quickref.html

- RabbitMQ Extensions

    http://www.rabbitmq.com/extensions.html

- For more information about AMQP, visit

    http://www.amqp.org

- For other Python client libraries see:

    http://www.rabbitmq.com/devtools.html#python-dev

.. |build-status| image:: https://github.com/celery/py-amqp/actions/workflows/ci.yaml/badge.svg
    :alt: Build status
    :target: https://github.com/celery/py-amqp/actions/workflows/ci.yaml

.. |coverage| image:: https://codecov.io/github/celery/py-amqp/coverage.svg?branch=main
    :target: https://codecov.io/github/celery/py-amqp?branch=main

.. |license| image:: https://img.shields.io/pypi/l/amqp.svg
    :alt: BSD License
    :target: https://opensource.org/licenses/BSD-3-Clause

.. |wheel| image:: https://img.shields.io/pypi/wheel/amqp.svg
    :alt: Python AMQP can be installed via wheel
    :target: https://pypi.org/project/amqp/

.. |pyversion| image:: https://img.shields.io/pypi/pyversions/amqp.svg
    :alt: Supported Python versions.
    :target: https://pypi.org/project/amqp/

.. |pyimp| image:: https://img.shields.io/pypi/implementation/amqp.svg
    :alt: Supported Python implementations.
    :target: https://pypi.org/project/amqp/

py-amqp as part of the Tidelift Subscription
============================================

The maintainers of py-amqp and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. `Learn more <https://tidelift.com/subscription/pkg/pypi-amqp?utm_source=pypi-amqp&utm_medium=referral&utm_campaign=readme&utm_term=repo>`_.

@@ -0,0 +1,34 @@
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/__init__.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/abstract_channel.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/basic_message.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/channel.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/connection.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/exceptions.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/method_framing.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/platform.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/protocol.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/sasl.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/serialization.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/spec.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/transport.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/amqp/utils.cpython-39.pyc,,
amqp-5.3.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
amqp-5.3.1.dist-info/LICENSE,sha256=9e9fEoLq4ZMcdGRfhxm2xps9aizyd7_aJJqCcM1HOvM,2372
amqp-5.3.1.dist-info/METADATA,sha256=sv93q3ZseR0T9pcxMMq8Jt_pxL0PNI_cbKA48tbprNM,8887
amqp-5.3.1.dist-info/RECORD,,
amqp-5.3.1.dist-info/WHEEL,sha256=a7TGlA-5DaHMRrarXjVbQagU3Man_dCnGIWMJr5kRWo,91
amqp-5.3.1.dist-info/top_level.txt,sha256=tWQNmFVhU4UtDgB6Yy2lKqRz7LtOrRcN8_bPFVcVVR8,5
amqp/__init__.py,sha256=QvARRZLvrDJRy_JCybG6TmprblyQPyF1pzIgR3fNRv4,2357
amqp/abstract_channel.py,sha256=D_OEWvX48yKUzMYm_sN-IDRQmqIGvegi9KlJriqttBc,4941
amqp/basic_message.py,sha256=Q8DV31tuuphloTETPHiJFwNg6b5M6pccJ0InJ4MZUz8,3357
amqp/channel.py,sha256=XzCuKPy9qFMiTsnqksKpFIh9PUcKZm3uIXm1RFCeZQs,74475
amqp/connection.py,sha256=8vsfpVTsTJBS-uu_SEEEuT-RXMk_IX_jCldOHP-oDlo,27541
amqp/exceptions.py,sha256=yqjoFIRue2rvK7kMdvkKsGOD6dMOzzzT3ZzBwoGWAe4,7166
amqp/method_framing.py,sha256=avnw90X9t4995HpHoZV4-1V73UEbzUKJ83pHEicAqWY,6734
amqp/platform.py,sha256=cyLevv6E15P9zhMo_fV84p67Q_A8fdsTq9amjvlUwqE,2379
amqp/protocol.py,sha256=Di3y6qqhnOV4QtkeYKO-zryfWqwl3F1zUxDOmVSsAp0,291
amqp/sasl.py,sha256=6AbsnxlbAyoiYxDezoQTfm-E0t_TJyHXpqGJ0KlPkI4,5986
amqp/serialization.py,sha256=xzzXmmQ45fGUuSCxGTEMizmRQTmzaz3Z7YYfpxmfXuY,17162
amqp/spec.py,sha256=2ZjbL4FR4Fv67HA7HUI9hLUIvAv3A4ZH6GRPzrMRyWg,2121
amqp/transport.py,sha256=tG50r-ybeXGwe3SoA5BacNY9BzRJnRn7BZs3XBuKwO0,23046
amqp/utils.py,sha256=JjjY040LwsDUc1zmKP2VTzXBioVXy48DUZtWB8PaPy0,1456
@@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: setuptools (75.4.0)
Root-Is-Purelib: true
Tag: py3-none-any

@@ -0,0 +1 @@
amqp
@@ -0,0 +1,75 @@
"""Low-level AMQP client for Python (fork of amqplib)."""
# Copyright (C) 2007-2008 Barry Pederson <bp@barryp.org>

import re
from collections import namedtuple

__version__ = '5.3.1'
__author__ = 'Barry Pederson'
__maintainer__ = 'Asif Saif Uddin, Matus Valo'
__contact__ = 'auvipy@gmail.com'
__homepage__ = 'http://github.com/celery/py-amqp'
__docformat__ = 'restructuredtext'

# -eof meta-

version_info_t = namedtuple('version_info_t', (
    'major', 'minor', 'micro', 'releaselevel', 'serial',
))

# bumpversion can only search for {current_version}
# so we have to parse the version here.
_temp = re.match(
    r'(\d+)\.(\d+).(\d+)(.+)?', __version__).groups()
VERSION = version_info = version_info_t(
    int(_temp[0]), int(_temp[1]), int(_temp[2]), _temp[3] or '', '')
del(_temp)
del(re)

from .basic_message import Message  # noqa
from .channel import Channel  # noqa
from .connection import Connection  # noqa
from .exceptions import (AccessRefused, AMQPError,  # noqa
                         AMQPNotImplementedError, ChannelError, ChannelNotOpen,
                         ConnectionError, ConnectionForced, ConsumerCancelled,
                         ContentTooLarge, FrameError, FrameSyntaxError,
                         InternalError, InvalidCommand, InvalidPath,
                         IrrecoverableChannelError,
                         IrrecoverableConnectionError, NoConsumers, NotAllowed,
                         NotFound, PreconditionFailed, RecoverableChannelError,
                         RecoverableConnectionError, ResourceError,
                         ResourceLocked, UnexpectedFrame, error_for_code)
from .utils import promise  # noqa

__all__ = (
    'Connection',
    'Channel',
    'Message',
    'promise',
    'AMQPError',
    'ConnectionError',
    'RecoverableConnectionError',
    'IrrecoverableConnectionError',
    'ChannelError',
    'RecoverableChannelError',
    'IrrecoverableChannelError',
    'ConsumerCancelled',
    'ContentTooLarge',
    'NoConsumers',
    'ConnectionForced',
    'InvalidPath',
    'AccessRefused',
    'NotFound',
    'ResourceLocked',
    'PreconditionFailed',
    'FrameError',
    'FrameSyntaxError',
    'InvalidCommand',
    'ChannelNotOpen',
    'UnexpectedFrame',
    'ResourceError',
    'NotAllowed',
    'AMQPNotImplementedError',
    'InternalError',
    'error_for_code',
)
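The version-parsing snippet at the top of ``amqp/__init__.py`` can be exercised on its own. This sketch wraps the same regex logic in a hypothetical ``parse_version`` helper (with the second dot escaped, which the module's pattern leaves bare):

```python
import re
from collections import namedtuple

version_info_t = namedtuple(
    'version_info_t', ('major', 'minor', 'micro', 'releaselevel', 'serial'))

def parse_version(version):
    # Same parse as amqp/__init__.py, made reusable: major.minor.micro
    # plus an optional trailing release-level suffix.
    groups = re.match(r'(\d+)\.(\d+)\.(\d+)(.+)?', version).groups()
    return version_info_t(
        int(groups[0]), int(groups[1]), int(groups[2]), groups[3] or '', '')

print(parse_version('5.3.1'))    # version_info_t(major=5, minor=3, micro=1, releaselevel='', serial='')
print(parse_version('5.3.1a1'))  # releaselevel == 'a1'
```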
@@ -0,0 +1,163 @@
"""Code common to Connection and Channel objects."""
# Copyright (C) 2007-2008 Barry Pederson <bp@barryp.org>

import logging

from vine import ensure_promise, promise

from .exceptions import AMQPNotImplementedError, RecoverableConnectionError
from .serialization import dumps, loads

__all__ = ('AbstractChannel',)

AMQP_LOGGER = logging.getLogger('amqp')

IGNORED_METHOD_DURING_CHANNEL_CLOSE = """\
Received method %s during closing channel %s. This method will be ignored\
"""


class AbstractChannel:
    """Superclass for Connection and Channel.

    The connection is treated as channel 0, then comes
    user-created channel objects.

    The subclasses must have a _METHOD_MAP class property, mapping
    between AMQP method signatures and Python methods.
    """

    def __init__(self, connection, channel_id):
        self.is_closing = False
        self.connection = connection
        self.channel_id = channel_id
        connection.channels[channel_id] = self
        self.method_queue = []  # Higher level queue for methods
        self.auto_decode = False
        self._pending = {}
        self._callbacks = {}

        self._setup_listeners()

    __slots__ = (
        "is_closing",
        "connection",
        "channel_id",
        "method_queue",
        "auto_decode",
        "_pending",
        "_callbacks",
        # adding '__dict__' to get dynamic assignment
        "__dict__",
        "__weakref__",
    )

    def __enter__(self):
        return self

    def __exit__(self, *exc_info):
        self.close()

    def send_method(self, sig,
                    format=None, args=None, content=None,
                    wait=None, callback=None, returns_tuple=False):
        p = promise()
        conn = self.connection
        if conn is None:
            raise RecoverableConnectionError('connection already closed')
        args = dumps(format, args) if format else ''
        try:
            conn.frame_writer(1, self.channel_id, sig, args, content)
        except StopIteration:
            raise RecoverableConnectionError('connection already closed')

        # TODO temp: callback should be after write_method ... ;)
        if callback:
            p.then(callback)
        p()
        if wait:
            return self.wait(wait, returns_tuple=returns_tuple)
        return p

    def close(self):
        """Close this Channel or Connection."""
        raise NotImplementedError('Must be overridden in subclass')

    def wait(self, method, callback=None, timeout=None, returns_tuple=False):
        p = ensure_promise(callback)
        pending = self._pending
        prev_p = []
        if not isinstance(method, list):
            method = [method]

        for m in method:
            prev_p.append(pending.get(m))
            pending[m] = p

        try:
            while not p.ready:
                self.connection.drain_events(timeout=timeout)

            if p.value:
                args, kwargs = p.value
                args = args[1:]  # We are not returning method back
                return args if returns_tuple else (args and args[0])
        finally:
            for i, m in enumerate(method):
                if prev_p[i] is not None:
                    pending[m] = prev_p[i]
                else:
                    pending.pop(m, None)

    def dispatch_method(self, method_sig, payload, content):
        if self.is_closing and method_sig not in (
                self._ALLOWED_METHODS_WHEN_CLOSING
        ):
            # When channel.close() was called we must ignore all methods except
            # Channel.close and Channel.CloseOk
            AMQP_LOGGER.warning(
                IGNORED_METHOD_DURING_CHANNEL_CLOSE,
                method_sig, self.channel_id
            )
            return

        if content and \
                self.auto_decode and \
                hasattr(content, 'content_encoding'):
            try:
                content.body = content.body.decode(content.content_encoding)
            except Exception:
                pass

        try:
            amqp_method = self._METHODS[method_sig]
        except KeyError:
            raise AMQPNotImplementedError(
                f'Unknown AMQP method {method_sig!r}')

        try:
            listeners = [self._callbacks[method_sig]]
        except KeyError:
            listeners = []
        one_shot = None
        try:
            one_shot = self._pending.pop(method_sig)
        except KeyError:
            if not listeners:
                return

        args = []
        if amqp_method.args:
            args, _ = loads(amqp_method.args, payload, 4)
        if amqp_method.content:
            args.append(content)

        for listener in listeners:
            listener(*args)

        if one_shot:
            one_shot(method_sig, *args)

    #: Placeholder, the concrete implementations will have to
    #: supply their own versions of _METHOD_MAP
    _METHODS = {}

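The listener/one-shot split in ``dispatch_method`` above can be sketched with plain dicts. The names here are illustrative, not the class's real attributes; in the real class the one-shot entry is a ``vine`` promise set up by ``wait()``:

```python
# Minimal model of AbstractChannel.dispatch_method's callback routing:
# entries in `callbacks` fire on every matching method signature; a
# matching entry in `pending` fires once and is consumed.
log = []
callbacks = {(60, 50): [lambda tag: log.append(('listener', tag))]}
pending = {(60, 50): lambda sig, tag: log.append(('one_shot', tag))}

def dispatch(method_sig, arg):
    listeners = callbacks.get(method_sig, [])
    one_shot = pending.pop(method_sig, None)  # consumed on first dispatch
    if one_shot is None and not listeners:
        return  # nobody is waiting for this method
    for listener in listeners:
        listener(arg)
    if one_shot:
        one_shot(method_sig, arg)

dispatch((60, 50), 'tag1')
dispatch((60, 50), 'tag2')  # one-shot already consumed
print(log)  # [('listener', 'tag1'), ('one_shot', 'tag1'), ('listener', 'tag2')]
```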
@@ -0,0 +1,122 @@
"""AMQP Messages."""
# Copyright (C) 2007-2008 Barry Pederson <bp@barryp.org>
from .serialization import GenericContent
# Intended to fix #85: ImportError: cannot import name spec
# Encountered on python 2.7.3
# "The submodules often need to refer to each other. For example, the
# surround [sic] module might use the echo module. In fact, such
# references are so common that the import statement first looks in
# the containing package before looking in the standard module search
# path."
# Source:
# http://stackoverflow.com/a/14216937/4982251
from .spec import Basic

__all__ = ('Message',)


class Message(GenericContent):
    """A Message for use with the Channel.basic_* methods.

    Expected arg types

        body: string
        children: (not supported)

    Keyword properties may include:

        content_type: shortstr
            MIME content type

        content_encoding: shortstr
            MIME content encoding

        application_headers: table
            Message header field table, a dict with string keys,
            and string | int | Decimal | datetime | dict values.

        delivery_mode: octet
            Non-persistent (1) or persistent (2)

        priority: octet
            The message priority, 0 to 9

        correlation_id: shortstr
            The application correlation identifier

        reply_to: shortstr
            The destination to reply to

        expiration: shortstr
            Message expiration specification

        message_id: shortstr
            The application message identifier

        timestamp: unsigned long
            The message timestamp

        type: shortstr
            The message type name

        user_id: shortstr
            The creating user id

        app_id: shortstr
            The creating application id

        cluster_id: shortstr
            Intra-cluster routing identifier

    Unicode bodies are encoded according to the 'content_encoding'
    argument. If that's None, it's set to 'UTF-8' automatically.

    Example::

        msg = Message('hello world',
                      content_type='text/plain',
                      application_headers={'foo': 7})
    """

    CLASS_ID = Basic.CLASS_ID

    #: Instances of this class have these attributes, which
    #: are passed back and forth as message properties between
    #: client and server
    PROPERTIES = [
        ('content_type', 's'),
        ('content_encoding', 's'),
        ('application_headers', 'F'),
        ('delivery_mode', 'o'),
        ('priority', 'o'),
        ('correlation_id', 's'),
        ('reply_to', 's'),
        ('expiration', 's'),
        ('message_id', 's'),
        ('timestamp', 'L'),
        ('type', 's'),
        ('user_id', 's'),
        ('app_id', 's'),
        ('cluster_id', 's')
    ]

    def __init__(self, body='', children=None, channel=None, **properties):
        super().__init__(**properties)
        #: set by basic_consume/basic_get
        self.delivery_info = None
        self.body = body
        self.channel = channel

    __slots__ = (
        "delivery_info",
        "body",
        "channel",
    )

    @property
    def headers(self):
        return self.properties.get('application_headers')

    @property
    def delivery_tag(self):
        return self.delivery_info.get('delivery_tag')
File diff suppressed because it is too large
@ -0,0 +1,784 @@
|
||||||
|
"""AMQP Connections."""
|
||||||
|
# Copyright (C) 2007-2008 Barry Pederson <bp@barryp.org>
|
||||||
|
|
||||||
|
import logging
|
||||||
|
import socket
|
||||||
|
import uuid
|
||||||
|
import warnings
|
||||||
|
from array import array
|
||||||
|
from time import monotonic
|
||||||
|
|
||||||
|
from vine import ensure_promise
|
||||||
|
|
||||||
|
from . import __version__, sasl, spec
|
||||||
|
from .abstract_channel import AbstractChannel
|
||||||
|
from .channel import Channel
|
||||||
|
from .exceptions import (AMQPDeprecationWarning, ChannelError, ConnectionError,
|
||||||
|
ConnectionForced, MessageNacked, RecoverableChannelError,
|
||||||
|
RecoverableConnectionError, ResourceError,
|
||||||
|
error_for_code)
|
||||||
|
from .method_framing import frame_handler, frame_writer
|
||||||
|
from .transport import Transport
|
||||||
|
|
||||||
|
try:
|
||||||
|
from ssl import SSLError
|
||||||
|
except ImportError: # pragma: no cover
|
||||||
|
class SSLError(Exception): # noqa
|
||||||
|
pass
|
||||||
|
|
||||||
|
W_FORCE_CONNECT = """\
|
||||||
|
The .{attr} attribute on the connection was accessed before
|
||||||
|
the connection was established. This is supported for now, but will
|
||||||
|
be deprecated in amqp 2.2.0.
|
||||||
|
|
||||||
|
Since amqp 2.0 you have to explicitly call Connection.connect()
|
||||||
|
before using the connection.
|
||||||
|
"""
|
||||||
|
|
||||||
|
START_DEBUG_FMT = """
|
||||||
|
Start from server, version: %d.%d, properties: %s, mechanisms: %s, locales: %s
|
||||||
|
""".strip()
|
||||||
|
|
||||||
|
__all__ = ('Connection',)
|
||||||
|
|
||||||
|
AMQP_LOGGER = logging.getLogger('amqp')
|
||||||
|
AMQP_HEARTBEAT_LOGGER = logging.getLogger(
|
||||||
|
'amqp.connection.Connection.heartbeat_tick'
|
||||||
|
)
|
||||||
|
|
||||||
|
#: Default map for :attr:`Connection.library_properties`
|
||||||
|
LIBRARY_PROPERTIES = {
|
||||||
|
'product': 'py-amqp',
|
||||||
|
'product_version': __version__,
|
||||||
|
}
|
||||||
|
|
||||||
|
#: Default map for :attr:`Connection.negotiate_capabilities`
|
||||||
|
NEGOTIATE_CAPABILITIES = {
|
||||||
|
'consumer_cancel_notify': True,
|
||||||
|
'connection.blocked': True,
|
||||||
|
'authentication_failure_close': True,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class Connection(AbstractChannel):
    """AMQP Connection.

    The connection class provides methods for a client to establish a
    network connection to a server, and for both peers to operate the
    connection thereafter.

    GRAMMAR::

        connection          = open-connection *use-connection close-connection
        open-connection     = C:protocol-header
                              S:START C:START-OK
                              *challenge
                              S:TUNE C:TUNE-OK
                              C:OPEN S:OPEN-OK
        challenge           = S:SECURE C:SECURE-OK
        use-connection      = *channel
        close-connection    = C:CLOSE S:CLOSE-OK
                            / S:CLOSE C:CLOSE-OK

    Create a connection to the specified host, which should be
    a 'host[:port]', such as 'localhost', or '1.2.3.4:5672'
    (defaults to 'localhost'; if a port is not specified then
    5672 is used).

    Authentication can be controlled by passing one or more
    `amqp.sasl.SASL` instances as the `authentication` parameter, or
    setting the `login_method` string to one of the supported methods:
    'GSSAPI', 'EXTERNAL', 'AMQPLAIN', or 'PLAIN'.
    Otherwise authentication will be performed using any supported method
    preferred by the server.  Userid and passwords apply to AMQPLAIN and
    PLAIN authentication, whereas on GSSAPI only userid will be used as the
    client name.  For EXTERNAL authentication both userid and password are
    ignored.

    The 'ssl' parameter may be simply True/False, or
    a dictionary of options to pass to :class:`ssl.SSLContext` such as
    requiring certain certificates.  For details, refer to the ``ssl``
    parameter of :class:`~amqp.transport.SSLTransport`.

    The "socket_settings" parameter is a dictionary defining tcp
    settings which will be applied as socket options.

    When "confirm_publish" is set to True, the channel is put into
    confirm mode.  In this mode, each published message is
    confirmed using the Publisher Confirms RabbitMQ extension.
    """

    Channel = Channel

    #: Mapping of protocol extensions to enable.
    #: The server will report these in server_properties[capabilities],
    #: and if a key in this map is present the client will tell the
    #: server to either enable or disable the capability depending
    #: on the value set in this map.
    #: For example with:
    #:     negotiate_capabilities = {
    #:         'consumer_cancel_notify': True,
    #:     }
    #: the client will enable this capability if the server reports
    #: support for it, but if the value is False the client will
    #: disable the capability.
    negotiate_capabilities = NEGOTIATE_CAPABILITIES

    #: These are sent to the server to announce what features
    #: we support, type of client etc.
    library_properties = LIBRARY_PROPERTIES

    #: Final heartbeat interval value (in float seconds) after negotiation
    heartbeat = None

    #: Original heartbeat interval value proposed by client.
    client_heartbeat = None

    #: Original heartbeat interval proposed by server.
    server_heartbeat = None

    #: Time of last heartbeat sent (in monotonic time, if available).
    last_heartbeat_sent = 0

    #: Time of last heartbeat received (in monotonic time, if available).
    last_heartbeat_received = 0

    #: Number of successful writes to socket.
    bytes_sent = 0

    #: Number of successful reads from socket.
    bytes_recv = 0

    #: Number of bytes sent to socket at the last heartbeat check.
    prev_sent = None

    #: Number of bytes received from socket at the last heartbeat check.
    prev_recv = None

    _METHODS = {
        spec.method(spec.Connection.Start, 'ooFSS'),
        spec.method(spec.Connection.OpenOk),
        spec.method(spec.Connection.Secure, 's'),
        spec.method(spec.Connection.Tune, 'BlB'),
        spec.method(spec.Connection.Close, 'BsBB'),
        spec.method(spec.Connection.Blocked),
        spec.method(spec.Connection.Unblocked),
        spec.method(spec.Connection.CloseOk),
    }
    _METHODS = {m.method_sig: m for m in _METHODS}

    _ALLOWED_METHODS_WHEN_CLOSING = (
        spec.Connection.Close, spec.Connection.CloseOk
    )

    connection_errors = (
        ConnectionError,
        socket.error,
        IOError,
        OSError,
    )
    channel_errors = (ChannelError,)
    recoverable_connection_errors = (
        RecoverableConnectionError,
        MessageNacked,
        socket.error,
        IOError,
        OSError,
    )
    recoverable_channel_errors = (
        RecoverableChannelError,
    )

    def __init__(self, host='localhost:5672', userid='guest', password='guest',
                 login_method=None, login_response=None,
                 authentication=(),
                 virtual_host='/', locale='en_US', client_properties=None,
                 ssl=False, connect_timeout=None, channel_max=None,
                 frame_max=None, heartbeat=0, on_open=None, on_blocked=None,
                 on_unblocked=None, confirm_publish=False,
                 on_tune_ok=None, read_timeout=None, write_timeout=None,
                 socket_settings=None, frame_handler=frame_handler,
                 frame_writer=frame_writer, **kwargs):
        self._connection_id = uuid.uuid4().hex
        channel_max = channel_max or 65535
        frame_max = frame_max or 131072
        if authentication:
            if isinstance(authentication, sasl.SASL):
                authentication = (authentication,)
            self.authentication = authentication
        elif login_method is not None:
            if login_method == 'GSSAPI':
                auth = sasl.GSSAPI(userid)
            elif login_method == 'EXTERNAL':
                auth = sasl.EXTERNAL()
            elif login_method == 'AMQPLAIN':
                if userid is None or password is None:
                    raise ValueError(
                        "Must supply authentication or userid/password")
                auth = sasl.AMQPLAIN(userid, password)
            elif login_method == 'PLAIN':
                if userid is None or password is None:
                    raise ValueError(
                        "Must supply authentication or userid/password")
                auth = sasl.PLAIN(userid, password)
            elif login_response is not None:
                auth = sasl.RAW(login_method, login_response)
            else:
                raise ValueError("Invalid login method", login_method)
            self.authentication = (auth,)
        else:
            self.authentication = (sasl.GSSAPI(userid, fail_soft=True),
                                   sasl.EXTERNAL(),
                                   sasl.AMQPLAIN(userid, password),
                                   sasl.PLAIN(userid, password))

        self.client_properties = dict(
            self.library_properties, **client_properties or {}
        )
        self.locale = locale
        self.host = host
        self.virtual_host = virtual_host
        self.on_tune_ok = ensure_promise(on_tune_ok)

        self.frame_handler_cls = frame_handler
        self.frame_writer_cls = frame_writer

        self._handshake_complete = False

        self.channels = {}
        # The connection object itself is treated as channel 0
        super().__init__(self, 0)

        self._frame_writer = None
        self._on_inbound_frame = None
        self._transport = None

        # Properties set in the Tune method
        self.channel_max = channel_max
        self.frame_max = frame_max
        self.client_heartbeat = heartbeat

        self.confirm_publish = confirm_publish
        self.ssl = ssl
        self.read_timeout = read_timeout
        self.write_timeout = write_timeout
        self.socket_settings = socket_settings

        # Callbacks
        self.on_blocked = on_blocked
        self.on_unblocked = on_unblocked
        self.on_open = ensure_promise(on_open)

        self._used_channel_ids = array('H')

        # Properties set in the Start method
        self.version_major = 0
        self.version_minor = 0
        self.server_properties = {}
        self.mechanisms = []
        self.locales = []

        self.connect_timeout = connect_timeout

    def __repr__(self):
        if self._transport:
            return f'<AMQP Connection: {self.host}/{self.virtual_host} ' \
                f'using {self._transport} at {id(self):#x}>'
        else:
            return f'<AMQP Connection: {self.host}/{self.virtual_host} ' \
                f'(disconnected) at {id(self):#x}>'

    def __enter__(self):
        self.connect()
        return self

    def __exit__(self, *eargs):
        self.close()

    def then(self, on_success, on_error=None):
        return self.on_open.then(on_success, on_error)

    def _setup_listeners(self):
        self._callbacks.update({
            spec.Connection.Start: self._on_start,
            spec.Connection.OpenOk: self._on_open_ok,
            spec.Connection.Secure: self._on_secure,
            spec.Connection.Tune: self._on_tune,
            spec.Connection.Close: self._on_close,
            spec.Connection.Blocked: self._on_blocked,
            spec.Connection.Unblocked: self._on_unblocked,
            spec.Connection.CloseOk: self._on_close_ok,
        })

    def connect(self, callback=None):
        # Let the transport.py module setup the actual
        # socket connection to the broker.
        #
        if self.connected:
            return callback() if callback else None
        try:
            self.transport = self.Transport(
                self.host, self.connect_timeout, self.ssl,
                self.read_timeout, self.write_timeout,
                socket_settings=self.socket_settings,
            )
            self.transport.connect()
            self.on_inbound_frame = self.frame_handler_cls(
                self, self.on_inbound_method)
            self.frame_writer = self.frame_writer_cls(self, self.transport)

            while not self._handshake_complete:
                self.drain_events(timeout=self.connect_timeout)

        except (OSError, SSLError):
            self.collect()
            raise

    def _warn_force_connect(self, attr):
        warnings.warn(AMQPDeprecationWarning(
            W_FORCE_CONNECT.format(attr=attr)))

    @property
    def transport(self):
        if self._transport is None:
            self._warn_force_connect('transport')
            self.connect()
        return self._transport

    @transport.setter
    def transport(self, transport):
        self._transport = transport

    @property
    def on_inbound_frame(self):
        if self._on_inbound_frame is None:
            self._warn_force_connect('on_inbound_frame')
            self.connect()
        return self._on_inbound_frame

    @on_inbound_frame.setter
    def on_inbound_frame(self, on_inbound_frame):
        self._on_inbound_frame = on_inbound_frame

    @property
    def frame_writer(self):
        if self._frame_writer is None:
            self._warn_force_connect('frame_writer')
            self.connect()
        return self._frame_writer

    @frame_writer.setter
    def frame_writer(self, frame_writer):
        self._frame_writer = frame_writer

    def _on_start(self, version_major, version_minor, server_properties,
                  mechanisms, locales, argsig='FsSs'):
        client_properties = self.client_properties
        self.version_major = version_major
        self.version_minor = version_minor
        self.server_properties = server_properties
        if isinstance(mechanisms, str):
            mechanisms = mechanisms.encode('utf-8')
        self.mechanisms = mechanisms.split(b' ')
        self.locales = locales.split(' ')
        AMQP_LOGGER.debug(
            START_DEBUG_FMT,
            self.version_major, self.version_minor,
            self.server_properties, self.mechanisms, self.locales,
        )

        # Negotiate protocol extensions (capabilities)
        scap = server_properties.get('capabilities') or {}
        cap = client_properties.setdefault('capabilities', {})
        cap.update({
            wanted_cap: enable_cap
            for wanted_cap, enable_cap in self.negotiate_capabilities.items()
            if scap.get(wanted_cap)
        })
        if not cap:
            # no capabilities, server may not react well to having
            # this key present in client_properties, so we remove it.
            client_properties.pop('capabilities', None)

        for authentication in self.authentication:
            if authentication.mechanism in self.mechanisms:
                login_response = authentication.start(self)
                if login_response is not NotImplemented:
                    break
        else:
            raise ConnectionError(
                "Couldn't find appropriate auth mechanism "
                "(can offer: {}; available: {})".format(
                    b", ".join(m.mechanism
                               for m in self.authentication
                               if m.mechanism).decode(),
                    b", ".join(self.mechanisms).decode()))

        self.send_method(
            spec.Connection.StartOk, argsig,
            (client_properties, authentication.mechanism,
             login_response, self.locale),
        )

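The capability negotiation in `_on_start` is a plain dict filter: the client only advertises capabilities that the server itself reports. A standalone sketch of that rule (the free function `negotiate_capabilities` is illustrative, not part of the module):

```python
def negotiate_capabilities(wanted, server_caps):
    # Keep only the capabilities the server reports as available;
    # mirrors the cap.update({...}) comprehension in Connection._on_start.
    return {
        cap: enable
        for cap, enable in wanted.items()
        if server_caps.get(cap)
    }


wanted = {'consumer_cancel_notify': True, 'connection.blocked': True}
server = {'consumer_cancel_notify': True}
print(negotiate_capabilities(wanted, server))
# -> {'consumer_cancel_notify': True}
```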
    def _on_secure(self, challenge):
        pass

    def _on_tune(self, channel_max, frame_max, server_heartbeat, argsig='BlB'):
        client_heartbeat = self.client_heartbeat or 0
        self.channel_max = channel_max or self.channel_max
        self.frame_max = frame_max or self.frame_max
        self.server_heartbeat = server_heartbeat or 0

        # negotiate the heartbeat interval to the smaller of the
        # specified values
        if self.server_heartbeat == 0 or client_heartbeat == 0:
            self.heartbeat = max(self.server_heartbeat, client_heartbeat)
        else:
            self.heartbeat = min(self.server_heartbeat, client_heartbeat)

        # Ignore server heartbeat if client_heartbeat is disabled
        if not self.client_heartbeat:
            self.heartbeat = 0

        self.send_method(
            spec.Connection.TuneOk, argsig,
            (self.channel_max, self.frame_max, self.heartbeat),
            callback=self._on_tune_sent,
        )

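The heartbeat negotiation in `_on_tune` can be exercised in isolation. A self-contained sketch, assuming only the rule shown above (the free function is hypothetical, for illustration):

```python
def negotiate_heartbeat(client_heartbeat, server_heartbeat):
    # Both peers propose an interval; 0 means heartbeats disabled on
    # that side.  If either side proposed 0, the larger (nonzero)
    # proposal wins; otherwise the smaller interval is used.  A client
    # that proposed 0 forces heartbeats off regardless of the server.
    if server_heartbeat == 0 or client_heartbeat == 0:
        heartbeat = max(server_heartbeat, client_heartbeat)
    else:
        heartbeat = min(server_heartbeat, client_heartbeat)
    if not client_heartbeat:
        heartbeat = 0
    return heartbeat


print(negotiate_heartbeat(10, 60))  # both proposed -> min wins: 10
print(negotiate_heartbeat(0, 60))   # client disabled -> 0
print(negotiate_heartbeat(10, 0))   # server disabled -> client value: 10
```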
    def _on_tune_sent(self, argsig='ssb'):
        self.send_method(
            spec.Connection.Open, argsig, (self.virtual_host, '', False),
        )

    def _on_open_ok(self):
        self._handshake_complete = True
        self.on_open(self)

    def Transport(self, host, connect_timeout,
                  ssl=False, read_timeout=None, write_timeout=None,
                  socket_settings=None, **kwargs):
        return Transport(
            host, connect_timeout=connect_timeout, ssl=ssl,
            read_timeout=read_timeout, write_timeout=write_timeout,
            socket_settings=socket_settings, **kwargs)

    @property
    def connected(self):
        return self._transport and self._transport.connected

    def collect(self):
        if self._transport:
            self._transport.close()

        if self.channels:
            # Copy all the channels except self since the channels
            # dictionary changes during the collection process.
            channels = [
                ch for ch in self.channels.values()
                if ch is not self
            ]

            for ch in channels:
                ch.collect()
        self._transport = self.connection = self.channels = None

    def _get_free_channel_id(self):
        # Cast to a set for fast lookups, and keep stored as an array
        # for lower memory usage.
        used_channel_ids = set(self._used_channel_ids)

        for channel_id in range(1, self.channel_max + 1):
            if channel_id not in used_channel_ids:
                self._used_channel_ids.append(channel_id)
                return channel_id

        raise ResourceError(
            'No free channel ids, current={}, channel_max={}'.format(
                len(self.channels), self.channel_max), spec.Channel.Open)

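The allocator above returns the lowest unused id, trading one set copy per call for the compact `array('H')` storage. A standalone sketch of the same strategy (function and exception names are illustrative only):

```python
from array import array


def get_free_channel_id(used_channel_ids, channel_max):
    # Cast to a set for O(1) membership tests while the ids stay stored
    # in a compact unsigned-short array, as the Connection class does.
    used = set(used_channel_ids)
    for channel_id in range(1, channel_max + 1):
        if channel_id not in used:
            used_channel_ids.append(channel_id)
            return channel_id
    raise RuntimeError('No free channel ids')


ids = array('H', [1, 2, 4])
print(get_free_channel_id(ids, 65535))  # -> 3 (lowest unused id)
```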
    def _claim_channel_id(self, channel_id):
        if channel_id in self._used_channel_ids:
            raise ConnectionError(f'Channel {channel_id!r} already open')
        else:
            self._used_channel_ids.append(channel_id)
            return channel_id

    def channel(self, channel_id=None, callback=None):
        """Create new channel.

        Fetch a Channel object identified by the numeric channel_id, or
        create that object if it doesn't already exist.
        """
        if self.channels is None:
            raise RecoverableConnectionError('Connection already closed.')

        try:
            return self.channels[channel_id]
        except KeyError:
            channel = self.Channel(self, channel_id, on_open=callback)
            channel.open()
            return channel

    def is_alive(self):
        raise NotImplementedError('Use AMQP heartbeats')

    def drain_events(self, timeout=None):
        # read until message is ready
        while not self.blocking_read(timeout):
            pass

    def blocking_read(self, timeout=None):
        with self.transport.having_timeout(timeout):
            frame = self.transport.read_frame()
        return self.on_inbound_frame(frame)

    def on_inbound_method(self, channel_id, method_sig, payload, content):
        if self.channels is None:
            raise RecoverableConnectionError('Connection already closed')

        return self.channels[channel_id].dispatch_method(
            method_sig, payload, content,
        )

    def close(self, reply_code=0, reply_text='', method_sig=(0, 0),
              argsig='BsBB'):
        """Request a connection close.

        This method indicates that the sender wants to close the
        connection.  This may be due to internal conditions (e.g. a
        forced shut-down) or due to an error handling a specific
        method, i.e. an exception.  When a close is due to an
        exception, the sender provides the class and method id of the
        method which caused the exception.

        RULE:

            After sending this method any received method except the
            Close-OK method MUST be discarded.

        RULE:

            The peer sending this method MAY use a counter or timeout
            to detect failure of the other peer to respond correctly
            with the Close-OK method.

        RULE:

            When a server receives the Close method from a client it
            MUST delete all server-side resources associated with the
            client's context.  A client CANNOT reconnect to a context
            after sending or receiving a Close method.

        PARAMETERS:
            reply_code: short

                The reply code.  The AMQ reply codes are defined in AMQ
                RFC 011.

            reply_text: shortstr

                The localised reply text.  This text can be logged as an
                aid to resolving issues.

            class_id: short

                failing method class

                When the close is provoked by a method exception, this
                is the class of the method.

            method_id: short

                failing method ID

                When the close is provoked by a method exception, this
                is the ID of the method.
        """
        if self._transport is None:
            # already closed
            return

        try:
            self.is_closing = True
            return self.send_method(
                spec.Connection.Close, argsig,
                (reply_code, reply_text, method_sig[0], method_sig[1]),
                wait=spec.Connection.CloseOk,
            )
        except (OSError, SSLError):
            # close connection
            self.collect()
            raise
        finally:
            self.is_closing = False

    def _on_close(self, reply_code, reply_text, class_id, method_id):
        """Request a connection close.

        This method indicates that the sender wants to close the
        connection.  This may be due to internal conditions (e.g. a
        forced shut-down) or due to an error handling a specific
        method, i.e. an exception.  When a close is due to an
        exception, the sender provides the class and method id of the
        method which caused the exception.

        RULE:

            After sending this method any received method except the
            Close-OK method MUST be discarded.

        RULE:

            The peer sending this method MAY use a counter or timeout
            to detect failure of the other peer to respond correctly
            with the Close-OK method.

        RULE:

            When a server receives the Close method from a client it
            MUST delete all server-side resources associated with the
            client's context.  A client CANNOT reconnect to a context
            after sending or receiving a Close method.

        PARAMETERS:
            reply_code: short

                The reply code.  The AMQ reply codes are defined in AMQ
                RFC 011.

            reply_text: shortstr

                The localised reply text.  This text can be logged as an
                aid to resolving issues.

            class_id: short

                failing method class

                When the close is provoked by a method exception, this
                is the class of the method.

            method_id: short

                failing method ID

                When the close is provoked by a method exception, this
                is the ID of the method.
        """
        self._x_close_ok()
        raise error_for_code(reply_code, reply_text,
                             (class_id, method_id), ConnectionError)

    def _x_close_ok(self):
        """Confirm a connection close.

        This method confirms a Connection.Close method and tells the
        recipient that it is safe to release resources for the
        connection and close the socket.

        RULE:
            A peer that detects a socket closure without having
            received a Close-Ok handshake method SHOULD log the error.
        """
        self.send_method(spec.Connection.CloseOk, callback=self._on_close_ok)

    def _on_close_ok(self):
        """Confirm a connection close.

        This method confirms a Connection.Close method and tells the
        recipient that it is safe to release resources for the
        connection and close the socket.

        RULE:

            A peer that detects a socket closure without having
            received a Close-Ok handshake method SHOULD log the error.
        """
        self.collect()

    def _on_blocked(self):
        """Callback called when connection blocked.

        Notes:
            This is a RabbitMQ Extension.
        """
        reason = 'connection blocked, see broker logs'
        if self.on_blocked:
            return self.on_blocked(reason)

    def _on_unblocked(self):
        if self.on_unblocked:
            return self.on_unblocked()

    def send_heartbeat(self):
        self.frame_writer(8, 0, None, None, None)

    def heartbeat_tick(self, rate=2):
        """Send heartbeat packets if necessary.

        Raises:
            ~amqp.exceptions.ConnectionForced: if none have been
                received recently.

        Note:
            This should be called frequently, on the order of
            once per second.

        Keyword Arguments:
            rate (int): Number of heartbeat frames to send during the
                heartbeat timeout
        """
        AMQP_HEARTBEAT_LOGGER.debug('heartbeat_tick : for connection %s',
                                    self._connection_id)
        if not self.heartbeat:
            return

        # If rate is wrong, let's use 2 as default
        if rate <= 0:
            rate = 2

        # treat actual data exchange in either direction as a heartbeat
        sent_now = self.bytes_sent
        recv_now = self.bytes_recv
        if self.prev_sent is None or self.prev_sent != sent_now:
            self.last_heartbeat_sent = monotonic()
        if self.prev_recv is None or self.prev_recv != recv_now:
            self.last_heartbeat_received = monotonic()

        now = monotonic()
        AMQP_HEARTBEAT_LOGGER.debug(
            'heartbeat_tick : Prev sent/recv: %s/%s, '
            'now - %s/%s, monotonic - %s, '
            'last_heartbeat_sent - %s, heartbeat int. - %s '
            'for connection %s',
            self.prev_sent, self.prev_recv,
            sent_now, recv_now, now,
            self.last_heartbeat_sent,
            self.heartbeat,
            self._connection_id,
        )

        self.prev_sent, self.prev_recv = sent_now, recv_now

        # send a heartbeat if it's time to do so
        if now > self.last_heartbeat_sent + self.heartbeat / rate:
            AMQP_HEARTBEAT_LOGGER.debug(
                'heartbeat_tick: sending heartbeat for connection %s',
                self._connection_id)
            self.send_heartbeat()
            self.last_heartbeat_sent = monotonic()

        # if we've missed two intervals' heartbeats, fail; this gives the
        # server enough time to send heartbeats a little late
        two_heartbeats = 2 * self.heartbeat
        two_heartbeats_interval = self.last_heartbeat_received + two_heartbeats
        heartbeats_missed = two_heartbeats_interval < monotonic()
        if self.last_heartbeat_received and heartbeats_missed:
            raise ConnectionForced('Too many heartbeats missed')

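The failure condition at the end of `heartbeat_tick` only fires after two full intervals pass with no inbound traffic. That check, isolated into a hypothetical helper (not part of the module):

```python
def heartbeats_missed(last_heartbeat_received, heartbeat, now):
    # Fail only once two full heartbeat intervals have elapsed since the
    # last inbound traffic, giving the peer room to be a little late.
    # A zero last_heartbeat_received means nothing was received yet, so
    # nothing can have been "missed".
    two_heartbeats = 2 * heartbeat
    return bool(last_heartbeat_received) and (
        last_heartbeat_received + two_heartbeats < now)


print(heartbeats_missed(100.0, 10, 121.0))  # -> True (120 < 121)
print(heartbeats_missed(100.0, 10, 115.0))  # -> False, still in window
print(heartbeats_missed(0, 10, 1000.0))     # -> False, nothing received yet
```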
    @property
    def sock(self):
        return self.transport.sock

    @property
    def server_capabilities(self):
        return self.server_properties.get('capabilities') or {}

@@ -0,0 +1,288 @@
"""Exceptions used by amqp."""
# Copyright (C) 2007-2008 Barry Pederson <bp@barryp.org>

from struct import pack, unpack

__all__ = (
    'AMQPError',
    'ConnectionError', 'ChannelError',
    'RecoverableConnectionError', 'IrrecoverableConnectionError',
    'RecoverableChannelError', 'IrrecoverableChannelError',
    'ConsumerCancelled', 'ContentTooLarge', 'NoConsumers',
    'ConnectionForced', 'InvalidPath', 'AccessRefused', 'NotFound',
    'ResourceLocked', 'PreconditionFailed', 'FrameError', 'FrameSyntaxError',
    'InvalidCommand', 'ChannelNotOpen', 'UnexpectedFrame', 'ResourceError',
    'NotAllowed', 'AMQPNotImplementedError', 'InternalError',
    'MessageNacked',
    'AMQPDeprecationWarning',
)


class AMQPDeprecationWarning(UserWarning):
    """Warning for deprecated things."""


class MessageNacked(Exception):
    """Message was nacked by broker."""


class AMQPError(Exception):
    """Base class for all AMQP exceptions."""

    code = 0

    def __init__(self, reply_text=None, method_sig=None,
                 method_name=None, reply_code=None):
        self.message = reply_text
        self.reply_code = reply_code or self.code
        self.reply_text = reply_text
        self.method_sig = method_sig
        self.method_name = method_name or ''
        if method_sig and not self.method_name:
            self.method_name = METHOD_NAME_MAP.get(method_sig, '')
        Exception.__init__(self, reply_code,
                           reply_text, method_sig, self.method_name)

    def __str__(self):
        if self.method:
            return '{0.method}: ({0.reply_code}) {0.reply_text}'.format(self)
        return self.reply_text or '<{}: unknown error>'.format(
            type(self).__name__
        )

    @property
    def method(self):
        return self.method_name or self.method_sig


class ConnectionError(AMQPError):
    """AMQP Connection Error."""


class ChannelError(AMQPError):
    """AMQP Channel Error."""


class RecoverableChannelError(ChannelError):
    """Exception class for recoverable channel errors."""


class IrrecoverableChannelError(ChannelError):
    """Exception class for irrecoverable channel errors."""


class RecoverableConnectionError(ConnectionError):
    """Exception class for recoverable connection errors."""


class IrrecoverableConnectionError(ConnectionError):
    """Exception class for irrecoverable connection errors."""


class Blocked(RecoverableConnectionError):
    """AMQP Connection Blocked Predicate."""


class ConsumerCancelled(RecoverableConnectionError):
    """AMQP Consumer Cancelled Predicate."""


class ContentTooLarge(RecoverableChannelError):
    """AMQP Content Too Large Error."""

    code = 311


class NoConsumers(RecoverableChannelError):
    """AMQP No Consumers Error."""

    code = 313


class ConnectionForced(RecoverableConnectionError):
    """AMQP Connection Forced Error."""

    code = 320


class InvalidPath(IrrecoverableConnectionError):
    """AMQP Invalid Path Error."""

    code = 402


class AccessRefused(IrrecoverableChannelError):
    """AMQP Access Refused Error."""

    code = 403


class NotFound(IrrecoverableChannelError):
    """AMQP Not Found Error."""

    code = 404


class ResourceLocked(RecoverableChannelError):
    """AMQP Resource Locked Error."""

    code = 405


class PreconditionFailed(IrrecoverableChannelError):
    """AMQP Precondition Failed Error."""

    code = 406


class FrameError(IrrecoverableConnectionError):
    """AMQP Frame Error."""

    code = 501


class FrameSyntaxError(IrrecoverableConnectionError):
    """AMQP Frame Syntax Error."""

    code = 502


class InvalidCommand(IrrecoverableConnectionError):
    """AMQP Invalid Command Error."""

    code = 503


class ChannelNotOpen(IrrecoverableConnectionError):
    """AMQP Channel Not Open Error."""

    code = 504


class UnexpectedFrame(IrrecoverableConnectionError):
    """AMQP Unexpected Frame."""

    code = 505


class ResourceError(RecoverableConnectionError):
    """AMQP Resource Error."""

    code = 506


class NotAllowed(IrrecoverableConnectionError):
    """AMQP Not Allowed Error."""

    code = 530


class AMQPNotImplementedError(IrrecoverableConnectionError):
    """AMQP Not Implemented Error."""

    code = 540


class InternalError(IrrecoverableConnectionError):
    """AMQP Internal Error."""

    code = 541


ERROR_MAP = {
    311: ContentTooLarge,
    313: NoConsumers,
    320: ConnectionForced,
    402: InvalidPath,
    403: AccessRefused,
    404: NotFound,
    405: ResourceLocked,
    406: PreconditionFailed,
    501: FrameError,
    502: FrameSyntaxError,
    503: InvalidCommand,
    504: ChannelNotOpen,
    505: UnexpectedFrame,
    506: ResourceError,
    530: NotAllowed,
    540: AMQPNotImplementedError,
    541: InternalError,
}


def error_for_code(code, text, method, default):
    try:
        return ERROR_MAP[code](text, method, reply_code=code)
    except KeyError:
        return default(text, method, reply_code=code)

METHOD_NAME_MAP = {
|
||||||
|
(10, 10): 'Connection.start',
|
||||||
|
(10, 11): 'Connection.start_ok',
|
||||||
|
(10, 20): 'Connection.secure',
|
||||||
|
(10, 21): 'Connection.secure_ok',
|
||||||
|
(10, 30): 'Connection.tune',
|
||||||
|
(10, 31): 'Connection.tune_ok',
|
||||||
|
(10, 40): 'Connection.open',
|
||||||
|
(10, 41): 'Connection.open_ok',
|
||||||
|
(10, 50): 'Connection.close',
|
||||||
|
(10, 51): 'Connection.close_ok',
|
||||||
|
(20, 10): 'Channel.open',
|
||||||
|
(20, 11): 'Channel.open_ok',
|
||||||
|
(20, 20): 'Channel.flow',
|
||||||
|
(20, 21): 'Channel.flow_ok',
|
||||||
|
(20, 40): 'Channel.close',
|
||||||
|
(20, 41): 'Channel.close_ok',
|
||||||
|
(30, 10): 'Access.request',
|
||||||
|
(30, 11): 'Access.request_ok',
|
||||||
|
(40, 10): 'Exchange.declare',
|
||||||
|
(40, 11): 'Exchange.declare_ok',
|
||||||
|
(40, 20): 'Exchange.delete',
|
||||||
|
(40, 21): 'Exchange.delete_ok',
|
||||||
|
(40, 30): 'Exchange.bind',
|
||||||
|
(40, 31): 'Exchange.bind_ok',
|
||||||
|
(40, 40): 'Exchange.unbind',
|
||||||
|
(40, 41): 'Exchange.unbind_ok',
|
||||||
|
(50, 10): 'Queue.declare',
|
||||||
|
(50, 11): 'Queue.declare_ok',
|
||||||
|
(50, 20): 'Queue.bind',
|
||||||
|
(50, 21): 'Queue.bind_ok',
|
||||||
|
(50, 30): 'Queue.purge',
|
||||||
|
(50, 31): 'Queue.purge_ok',
|
||||||
|
(50, 40): 'Queue.delete',
|
||||||
|
(50, 41): 'Queue.delete_ok',
|
||||||
|
(50, 50): 'Queue.unbind',
|
||||||
|
(50, 51): 'Queue.unbind_ok',
|
||||||
|
(60, 10): 'Basic.qos',
|
||||||
|
(60, 11): 'Basic.qos_ok',
|
||||||
|
(60, 20): 'Basic.consume',
|
||||||
|
(60, 21): 'Basic.consume_ok',
|
||||||
|
(60, 30): 'Basic.cancel',
|
||||||
|
(60, 31): 'Basic.cancel_ok',
|
||||||
|
(60, 40): 'Basic.publish',
|
||||||
|
(60, 50): 'Basic.return',
|
||||||
|
(60, 60): 'Basic.deliver',
|
||||||
|
(60, 70): 'Basic.get',
|
||||||
|
(60, 71): 'Basic.get_ok',
|
||||||
|
(60, 72): 'Basic.get_empty',
|
||||||
|
(60, 80): 'Basic.ack',
|
||||||
|
(60, 90): 'Basic.reject',
|
||||||
|
(60, 100): 'Basic.recover_async',
|
||||||
|
(60, 110): 'Basic.recover',
|
||||||
|
(60, 111): 'Basic.recover_ok',
|
||||||
|
(60, 120): 'Basic.nack',
|
||||||
|
(90, 10): 'Tx.select',
|
||||||
|
(90, 11): 'Tx.select_ok',
|
||||||
|
(90, 20): 'Tx.commit',
|
||||||
|
(90, 21): 'Tx.commit_ok',
|
||||||
|
(90, 30): 'Tx.rollback',
|
||||||
|
(90, 31): 'Tx.rollback_ok',
|
||||||
|
(85, 10): 'Confirm.select',
|
||||||
|
(85, 11): 'Confirm.select_ok',
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
for _method_id, _method_name in list(METHOD_NAME_MAP.items()):
|
||||||
|
METHOD_NAME_MAP[unpack('>I', pack('>HH', *_method_id))[0]] = \
|
||||||
|
_method_name
|
||||||
|
|
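The `ERROR_MAP` dispatch above turns a numeric AMQP reply code into the matching exception class, falling back to a caller-supplied default. A stand-alone sketch of that pattern (the stub classes here are illustrative, not the real `amqp.exceptions` hierarchy):

```python
# Minimal stand-alone sketch of the ERROR_MAP dispatch above.
# The real classes live in amqp.exceptions; these stubs only mirror the shape.
class AMQPError(Exception):
    def __init__(self, reply_text=None, method_sig=None, reply_code=None):
        self.reply_text = reply_text
        self.method_sig = method_sig
        self.reply_code = reply_code
        super().__init__(reply_text)


class NotFound(AMQPError):
    code = 404


class ChannelError(AMQPError):
    """Fallback class when the reply code is not in the map."""


ERROR_MAP = {404: NotFound}


def error_for_code(code, text, method, default):
    # Look the code up in the map; unknown codes use the default class.
    try:
        return ERROR_MAP[code](text, method, reply_code=code)
    except KeyError:
        return default(text, method, reply_code=code)


exc = error_for_code(404, 'no queue named foo', (50, 10), ChannelError)
print(type(exc).__name__, exc.reply_code)   # NotFound 404
unknown = error_for_code(999, 'mystery', None, ChannelError)
print(type(unknown).__name__)               # ChannelError
```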
@@ -0,0 +1,189 @@
"""Convert between frames and higher-level AMQP methods."""
# Copyright (C) 2007-2008 Barry Pederson <bp@barryp.org>

from collections import defaultdict
from struct import pack, pack_into, unpack_from

from . import spec
from .basic_message import Message
from .exceptions import UnexpectedFrame
from .utils import str_to_bytes

__all__ = ('frame_handler', 'frame_writer')

#: Set of methods that require both a content frame and a body frame.
_CONTENT_METHODS = frozenset([
    spec.Basic.Return,
    spec.Basic.Deliver,
    spec.Basic.GetOk,
])


#: Number of bytes reserved for protocol in a content frame.
#: We use this to calculate when a frame exceeds the max frame size,
#: and if it does not the message will fit into the preallocated buffer.
FRAME_OVERHEAD = 40


def frame_handler(connection, callback,
                  unpack_from=unpack_from, content_methods=_CONTENT_METHODS):
    """Create closure that reads frames."""
    expected_types = defaultdict(lambda: 1)
    partial_messages = {}

    def on_frame(frame):
        frame_type, channel, buf = frame
        connection.bytes_recv += 1
        if frame_type not in (expected_types[channel], 8):
            raise UnexpectedFrame(
                'Received frame {} while expecting type: {}'.format(
                    frame_type, expected_types[channel]),
            )
        elif frame_type == 1:
            method_sig = unpack_from('>HH', buf, 0)

            if method_sig in content_methods:
                # Save what we've got so far and wait for the content-header
                partial_messages[channel] = Message(
                    frame_method=method_sig, frame_args=buf,
                )
                expected_types[channel] = 2
                return False

            callback(channel, method_sig, buf, None)

        elif frame_type == 2:
            msg = partial_messages[channel]
            msg.inbound_header(buf)

            if not msg.ready:
                # wait for the content-body
                expected_types[channel] = 3
                return False

            # bodyless message, we're done
            expected_types[channel] = 1
            partial_messages.pop(channel, None)
            callback(channel, msg.frame_method, msg.frame_args, msg)

        elif frame_type == 3:
            msg = partial_messages[channel]
            msg.inbound_body(buf)
            if not msg.ready:
                # wait for the rest of the content-body
                return False
            expected_types[channel] = 1
            partial_messages.pop(channel, None)
            callback(channel, msg.frame_method, msg.frame_args, msg)
        elif frame_type == 8:
            # bytes_recv already updated
            return False
        return True

    return on_frame


class Buffer:
    def __init__(self, buf):
        self.buf = buf

    @property
    def buf(self):
        return self._buf

    @buf.setter
    def buf(self, buf):
        self._buf = buf
        # Using a memoryview allows slicing without copying underlying data.
        # Slicing this is much faster than slicing the bytearray directly.
        # More details: https://stackoverflow.com/a/34257357
        self.view = memoryview(buf)


def frame_writer(connection, transport,
                 pack=pack, pack_into=pack_into, range=range, len=len,
                 bytes=bytes, str_to_bytes=str_to_bytes, text_t=str):
    """Create closure that writes frames."""
    write = transport.write

    buffer_store = Buffer(bytearray(connection.frame_max - 8))

    def write_frame(type_, channel, method_sig, args, content):
        chunk_size = connection.frame_max - 8
        offset = 0
        properties = None
        args = str_to_bytes(args)
        if content:
            body = content.body
            if isinstance(body, str):
                encoding = content.properties.setdefault(
                    'content_encoding', 'utf-8')
                body = body.encode(encoding)
            properties = content._serialize_properties()
            bodylen = len(body)
            properties_len = len(properties) or 0
            framelen = len(args) + properties_len + bodylen + FRAME_OVERHEAD
            bigbody = framelen > chunk_size
        else:
            body, bodylen, bigbody = None, 0, 0

        if bigbody:
            # ## SLOW: string copy and write for every frame
            frame = (b''.join([pack('>HH', *method_sig), args])
                     if type_ == 1 else b'')  # encode method frame
            framelen = len(frame)
            write(pack('>BHI%dsB' % framelen,
                       type_, channel, framelen, frame, 0xce))
            if body:
                frame = b''.join([
                    pack('>HHQ', method_sig[0], 0, len(body)),
                    properties,
                ])
                framelen = len(frame)
                write(pack('>BHI%dsB' % framelen,
                           2, channel, framelen, frame, 0xce))

                for i in range(0, bodylen, chunk_size):
                    frame = body[i:i + chunk_size]
                    framelen = len(frame)
                    write(pack('>BHI%dsB' % framelen,
                               3, channel, framelen,
                               frame, 0xce))

        else:
            # frame_max can be updated via connection._on_tune. If
            # it became larger, then we need to resize the buffer
            # to prevent overflow.
            if chunk_size > len(buffer_store.buf):
                buffer_store.buf = bytearray(chunk_size)
            buf = buffer_store.buf

            # ## FAST: pack into buffer and single write
            frame = (b''.join([pack('>HH', *method_sig), args])
                     if type_ == 1 else b'')
            framelen = len(frame)
            pack_into('>BHI%dsB' % framelen, buf, offset,
                      type_, channel, framelen, frame, 0xce)
            offset += 8 + framelen
            if body is not None:
                frame = b''.join([
                    pack('>HHQ', method_sig[0], 0, len(body)),
                    properties,
                ])
                framelen = len(frame)

                pack_into('>BHI%dsB' % framelen, buf, offset,
                          2, channel, framelen, frame, 0xce)
                offset += 8 + framelen

                bodylen = len(body)
                if bodylen > 0:
                    framelen = bodylen
                    pack_into('>BHI%dsB' % framelen, buf, offset,
                              3, channel, framelen, body, 0xce)
                    offset += 8 + framelen

            write(buffer_store.view[:offset])

        connection.bytes_sent += 1
    return write_frame
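The repeated `pack('>BHI%dsB' % framelen, ...)` calls in `write_frame` above all produce the same wire layout: a 1-octet frame type, 2-octet channel, 4-octet payload size, the payload itself, and the `0xCE` frame-end octet. A stand-alone sketch of that encoding (the `encode_frame` helper name is ours, not part of the library):

```python
# Sketch of the low-level frame layout written by write_frame above:
# 1-byte type, 2-byte channel, 4-byte payload size, payload, 0xCE terminator.
from struct import pack, unpack_from


def encode_frame(frame_type, channel, payload):
    return pack('>BHI%dsB' % len(payload),
                frame_type, channel, len(payload), payload, 0xce)


heartbeat = encode_frame(8, 0, b'')   # heartbeat frames carry no payload
print(heartbeat.hex())                # 08000000000000ce

# The receiving side peels the header off the same way frame_handler's
# transport layer does before dispatching on frame_type:
frame_type, channel, size = unpack_from('>BHI', heartbeat, 0)
print(frame_type, channel, size)      # 8 0 0
```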
@@ -0,0 +1,79 @@
"""Platform compatibility."""

import platform
import re
import sys
# Jython does not have this attribute
import typing

try:
    from socket import SOL_TCP
except ImportError:  # pragma: no cover
    from socket import IPPROTO_TCP as SOL_TCP  # noqa


RE_NUM = re.compile(r'(\d+).+')


def _linux_version_to_tuple(s: str) -> typing.Tuple[int, int, int]:
    return tuple(map(_versionatom, s.split('.')[:3]))


def _versionatom(s: str) -> int:
    if s.isdigit():
        return int(s)
    match = RE_NUM.match(s)
    return int(match.groups()[0]) if match else 0


# available socket options for TCP level
KNOWN_TCP_OPTS = {
    'TCP_CORK', 'TCP_DEFER_ACCEPT', 'TCP_KEEPCNT',
    'TCP_KEEPIDLE', 'TCP_KEEPINTVL', 'TCP_LINGER2',
    'TCP_MAXSEG', 'TCP_NODELAY', 'TCP_QUICKACK',
    'TCP_SYNCNT', 'TCP_USER_TIMEOUT', 'TCP_WINDOW_CLAMP',
}

LINUX_VERSION = None
if sys.platform.startswith('linux'):
    LINUX_VERSION = _linux_version_to_tuple(platform.release())
    if LINUX_VERSION < (2, 6, 37):
        KNOWN_TCP_OPTS.remove('TCP_USER_TIMEOUT')

    # Windows Subsystem for Linux is an edge-case: the Python socket library
    # returns most TCP_* enums, but they aren't actually supported
    if platform.release().endswith("Microsoft"):
        KNOWN_TCP_OPTS = {'TCP_NODELAY', 'TCP_KEEPIDLE', 'TCP_KEEPINTVL',
                          'TCP_KEEPCNT'}

elif sys.platform.startswith('darwin'):
    KNOWN_TCP_OPTS.remove('TCP_USER_TIMEOUT')

elif 'bsd' in sys.platform:
    KNOWN_TCP_OPTS.remove('TCP_USER_TIMEOUT')

# According to MSDN, Windows platforms support getsockopt(TCP_MAXSEG) but not
# setsockopt(TCP_MAXSEG) on IPPROTO_TCP sockets.
elif sys.platform.startswith('win'):
    KNOWN_TCP_OPTS = {'TCP_NODELAY'}

elif sys.platform.startswith('cygwin'):
    KNOWN_TCP_OPTS = {'TCP_NODELAY'}

# illumos does not allow setting the TCP_MAXSEG socket option,
# even if the Oracle documentation says otherwise.
# TCP_USER_TIMEOUT does not exist on Solaris 11.4
elif sys.platform.startswith('sunos'):
    KNOWN_TCP_OPTS.remove('TCP_MAXSEG')
    KNOWN_TCP_OPTS.remove('TCP_USER_TIMEOUT')

# aix does not allow setting the TCP_MAXSEG
# or the TCP_USER_TIMEOUT socket options.
elif sys.platform.startswith('aix'):
    KNOWN_TCP_OPTS.remove('TCP_MAXSEG')
    KNOWN_TCP_OPTS.remove('TCP_USER_TIMEOUT')

__all__ = (
    'LINUX_VERSION',
    'SOL_TCP',
    'KNOWN_TCP_OPTS',
)
@@ -0,0 +1,12 @@
"""Protocol data."""

from collections import namedtuple

queue_declare_ok_t = namedtuple(
    'queue_declare_ok_t', ('queue', 'message_count', 'consumer_count'),
)

basic_return_t = namedtuple(
    'basic_return_t',
    ('reply_code', 'reply_text', 'exchange', 'routing_key', 'message'),
)
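These namedtuples are the shapes handed back for `Queue.declare_ok` and `Basic.return`; fields are reachable by name rather than index. A quick sketch of how the declare-ok result is consumed (the field values here are made up for illustration):

```python
# Sketch: consuming the queue_declare_ok_t result type defined above.
from collections import namedtuple

queue_declare_ok_t = namedtuple(
    'queue_declare_ok_t', ('queue', 'message_count', 'consumer_count'),
)

# Hypothetical values, as a server might report for a declared queue:
ok = queue_declare_ok_t('tasks', 12, 3)
print(ok.queue, ok.message_count, ok.consumer_count)   # tasks 12 3
print(ok == ('tasks', 12, 3))                          # True (still a tuple)
```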
@@ -0,0 +1,191 @@
"""SASL mechanisms for AMQP authentication."""

import socket
import warnings
from io import BytesIO

from amqp.serialization import _write_table


class SASL:
    """The base class for all amqp SASL authentication mechanisms.

    You should sub-class this if you're implementing your own authentication.
    """

    @property
    def mechanism(self):
        """Return the SASL mechanism name as bytes."""
        raise NotImplementedError

    def start(self, connection):
        """Return the first response to a SASL challenge as a bytes object."""
        raise NotImplementedError


class PLAIN(SASL):
    """PLAIN SASL authentication mechanism.

    See https://tools.ietf.org/html/rfc4616 for details
    """

    mechanism = b'PLAIN'

    def __init__(self, username, password):
        self.username, self.password = username, password

    __slots__ = (
        "username",
        "password",
    )

    def start(self, connection):
        if self.username is None or self.password is None:
            return NotImplemented
        login_response = BytesIO()
        login_response.write(b'\0')
        login_response.write(self.username.encode('utf-8'))
        login_response.write(b'\0')
        login_response.write(self.password.encode('utf-8'))
        return login_response.getvalue()


class AMQPLAIN(SASL):
    """AMQPLAIN SASL authentication mechanism.

    This is a non-standard mechanism used by AMQP servers.
    """

    mechanism = b'AMQPLAIN'

    def __init__(self, username, password):
        self.username, self.password = username, password

    __slots__ = (
        "username",
        "password",
    )

    def start(self, connection):
        if self.username is None or self.password is None:
            return NotImplemented
        login_response = BytesIO()
        _write_table({b'LOGIN': self.username, b'PASSWORD': self.password},
                     login_response.write, [])
        # Skip the length at the beginning
        return login_response.getvalue()[4:]


def _get_gssapi_mechanism():
    try:
        import gssapi
        import gssapi.raw.misc  # Fail if the old python-gssapi is installed
    except ImportError:
        class FakeGSSAPI(SASL):
            """A no-op SASL mechanism for when gssapi isn't available."""

            mechanism = None

            def __init__(self, client_name=None, service=b'amqp',
                         rdns=False, fail_soft=False):
                if not fail_soft:
                    raise NotImplementedError(
                        "You need to install the `gssapi` module for GSSAPI "
                        "SASL support")

            def start(self):  # pragma: no cover
                return NotImplemented
        return FakeGSSAPI
    else:
        class GSSAPI(SASL):
            """GSSAPI SASL authentication mechanism.

            See https://tools.ietf.org/html/rfc4752 for details
            """

            mechanism = b'GSSAPI'

            def __init__(self, client_name=None, service=b'amqp',
                         rdns=False, fail_soft=False):
                if client_name and not isinstance(client_name, bytes):
                    client_name = client_name.encode('ascii')
                self.client_name = client_name
                self.fail_soft = fail_soft
                self.service = service
                self.rdns = rdns

            __slots__ = (
                "client_name",
                "fail_soft",
                "service",
                "rdns"
            )

            def get_hostname(self, connection):
                sock = connection.transport.sock
                if self.rdns and sock.family in (socket.AF_INET,
                                                 socket.AF_INET6):
                    peer = sock.getpeername()
                    hostname, _, _ = socket.gethostbyaddr(peer[0])
                else:
                    hostname = connection.transport.host
                if not isinstance(hostname, bytes):
                    hostname = hostname.encode('ascii')
                return hostname

            def start(self, connection):
                try:
                    if self.client_name:
                        creds = gssapi.Credentials(
                            name=gssapi.Name(self.client_name))
                    else:
                        creds = None
                    hostname = self.get_hostname(connection)
                    name = gssapi.Name(b'@'.join([self.service, hostname]),
                                       gssapi.NameType.hostbased_service)
                    context = gssapi.SecurityContext(name=name, creds=creds)
                    return context.step(None)
                except gssapi.raw.misc.GSSError:
                    if self.fail_soft:
                        return NotImplemented
                    else:
                        raise
        return GSSAPI


GSSAPI = _get_gssapi_mechanism()


class EXTERNAL(SASL):
    """EXTERNAL SASL mechanism.

    Enables external authentication, i.e. not handled through this protocol.
    Only passes 'EXTERNAL' as authentication mechanism, but no further
    authentication data.
    """

    mechanism = b'EXTERNAL'

    def start(self, connection):
        return b''


class RAW(SASL):
    """A generic custom SASL mechanism.

    This mechanism takes a mechanism name and response to send to the server,
    so can be used for simple custom authentication schemes.
    """

    mechanism = None

    def __init__(self, mechanism, response):
        assert isinstance(mechanism, bytes)
        assert isinstance(response, bytes)
        self.mechanism, self.response = mechanism, response
        warnings.warn("Passing login_method and login_response to Connection "
                      "is deprecated. Please implement a SASL subclass "
                      "instead.", DeprecationWarning)

    def start(self, connection):
        return self.response
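The `PLAIN.start()` method above builds the RFC 4616 initial response: a NUL byte, the username, another NUL byte, then the password. A stand-alone sketch of just that byte layout (the `plain_response` helper name is ours):

```python
# Stand-alone sketch of the PLAIN start() response built above: the SASL
# PLAIN initial response is NUL + username + NUL + password (RFC 4616).
from io import BytesIO


def plain_response(username, password):
    out = BytesIO()
    out.write(b'\0')
    out.write(username.encode('utf-8'))
    out.write(b'\0')
    out.write(password.encode('utf-8'))
    return out.getvalue()


print(plain_response('guest', 'guest'))   # b'\x00guest\x00guest'
```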
@ -0,0 +1,582 @@
|
||||||
|
"""Convert between bytestreams and higher-level AMQP types.
|
||||||
|
|
||||||
|
2007-11-05 Barry Pederson <bp@barryp.org>
|
||||||
|
|
||||||
|
"""
|
||||||
|
# Copyright (C) 2007 Barry Pederson <bp@barryp.org>
|
||||||
|
|
||||||
|
import calendar
|
||||||
|
from datetime import datetime
|
||||||
|
from decimal import Decimal
|
||||||
|
from io import BytesIO
|
||||||
|
from struct import pack, unpack_from
|
||||||
|
|
||||||
|
from .exceptions import FrameSyntaxError
|
||||||
|
from .spec import Basic
|
||||||
|
from .utils import bytes_to_str as pstr_t
|
||||||
|
from .utils import str_to_bytes
|
||||||
|
|
||||||
|
ILLEGAL_TABLE_TYPE = """\
|
||||||
|
Table type {0!r} not handled by amqp.
|
||||||
|
"""
|
||||||
|
|
||||||
|
ILLEGAL_TABLE_TYPE_WITH_KEY = """\
|
||||||
|
Table type {0!r} for key {1!r} not handled by amqp. [value: {2!r}]
|
||||||
|
"""
|
||||||
|
|
||||||
|
ILLEGAL_TABLE_TYPE_WITH_VALUE = """\
|
||||||
|
Table type {0!r} not handled by amqp. [value: {1!r}]
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
def _read_item(buf, offset):
|
||||||
|
ftype = chr(buf[offset])
|
||||||
|
offset += 1
|
||||||
|
|
||||||
|
# 'S': long string
|
||||||
|
if ftype == 'S':
|
||||||
|
slen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
try:
|
||||||
|
val = pstr_t(buf[offset:offset + slen])
|
||||||
|
except UnicodeDecodeError:
|
||||||
|
val = buf[offset:offset + slen]
|
||||||
|
|
||||||
|
offset += slen
|
||||||
|
# 's': short string
|
||||||
|
elif ftype == 's':
|
||||||
|
slen, = unpack_from('>B', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
val = pstr_t(buf[offset:offset + slen])
|
||||||
|
offset += slen
|
||||||
|
# 'x': Bytes Array
|
||||||
|
elif ftype == 'x':
|
||||||
|
blen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
val = buf[offset:offset + blen]
|
||||||
|
offset += blen
|
||||||
|
# 'b': short-short int
|
||||||
|
elif ftype == 'b':
|
||||||
|
val, = unpack_from('>B', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
# 'B': short-short unsigned int
|
||||||
|
elif ftype == 'B':
|
||||||
|
val, = unpack_from('>b', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
# 'U': short int
|
||||||
|
elif ftype == 'U':
|
||||||
|
val, = unpack_from('>h', buf, offset)
|
||||||
|
offset += 2
|
||||||
|
# 'u': short unsigned int
|
||||||
|
elif ftype == 'u':
|
||||||
|
val, = unpack_from('>H', buf, offset)
|
||||||
|
offset += 2
|
||||||
|
# 'I': long int
|
||||||
|
elif ftype == 'I':
|
||||||
|
val, = unpack_from('>i', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
# 'i': long unsigned int
|
||||||
|
elif ftype == 'i':
|
||||||
|
val, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
# 'L': long long int
|
||||||
|
elif ftype == 'L':
|
||||||
|
val, = unpack_from('>q', buf, offset)
|
||||||
|
offset += 8
|
||||||
|
# 'l': long long unsigned int
|
||||||
|
elif ftype == 'l':
|
||||||
|
val, = unpack_from('>Q', buf, offset)
|
||||||
|
offset += 8
|
||||||
|
# 'f': float
|
||||||
|
elif ftype == 'f':
|
||||||
|
val, = unpack_from('>f', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
# 'd': double
|
||||||
|
elif ftype == 'd':
|
||||||
|
val, = unpack_from('>d', buf, offset)
|
||||||
|
offset += 8
|
||||||
|
# 'D': decimal
|
||||||
|
elif ftype == 'D':
|
||||||
|
d, = unpack_from('>B', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
n, = unpack_from('>i', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
val = Decimal(n) / Decimal(10 ** d)
|
||||||
|
# 'F': table
|
||||||
|
elif ftype == 'F':
|
||||||
|
tlen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
limit = offset + tlen
|
||||||
|
val = {}
|
||||||
|
while offset < limit:
|
||||||
|
keylen, = unpack_from('>B', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
key = pstr_t(buf[offset:offset + keylen])
|
||||||
|
offset += keylen
|
||||||
|
val[key], offset = _read_item(buf, offset)
|
||||||
|
# 'A': array
|
||||||
|
elif ftype == 'A':
|
||||||
|
alen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
limit = offset + alen
|
||||||
|
val = []
|
||||||
|
while offset < limit:
|
||||||
|
v, offset = _read_item(buf, offset)
|
||||||
|
val.append(v)
|
||||||
|
# 't' (bool)
|
||||||
|
elif ftype == 't':
|
||||||
|
val, = unpack_from('>B', buf, offset)
|
||||||
|
val = bool(val)
|
||||||
|
offset += 1
|
||||||
|
# 'T': timestamp
|
||||||
|
elif ftype == 'T':
|
||||||
|
val, = unpack_from('>Q', buf, offset)
|
||||||
|
offset += 8
|
||||||
|
val = datetime.utcfromtimestamp(val)
|
||||||
|
# 'V': void
|
||||||
|
elif ftype == 'V':
|
||||||
|
val = None
|
||||||
|
else:
|
||||||
|
raise FrameSyntaxError(
|
||||||
|
'Unknown value in table: {!r} ({!r})'.format(
|
||||||
|
ftype, type(ftype)))
|
||||||
|
return val, offset
|
||||||
|
|
||||||
|
|
||||||
|
def loads(format, buf, offset):
|
||||||
|
"""Deserialize amqp format.
|
||||||
|
|
||||||
|
bit = b
|
||||||
|
octet = o
|
||||||
|
short = B
|
||||||
|
long = l
|
||||||
|
long long = L
|
||||||
|
float = f
|
||||||
|
shortstr = s
|
||||||
|
longstr = S
|
||||||
|
table = F
|
||||||
|
array = A
|
||||||
|
timestamp = T
|
||||||
|
"""
|
||||||
|
bitcount = bits = 0
|
||||||
|
|
||||||
|
values = []
|
||||||
|
append = values.append
|
||||||
|
format = pstr_t(format)
|
||||||
|
|
||||||
|
for p in format:
|
||||||
|
if p == 'b':
|
||||||
|
if not bitcount:
|
||||||
|
bits = ord(buf[offset:offset + 1])
|
||||||
|
offset += 1
|
||||||
|
bitcount = 8
|
||||||
|
val = (bits & 1) == 1
|
||||||
|
bits >>= 1
|
||||||
|
bitcount -= 1
|
||||||
|
elif p == 'o':
|
||||||
|
bitcount = bits = 0
|
||||||
|
val, = unpack_from('>B', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
elif p == 'B':
|
||||||
|
bitcount = bits = 0
|
||||||
|
val, = unpack_from('>H', buf, offset)
|
||||||
|
offset += 2
|
||||||
|
elif p == 'l':
|
||||||
|
bitcount = bits = 0
|
||||||
|
val, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
elif p == 'L':
|
||||||
|
bitcount = bits = 0
|
||||||
|
val, = unpack_from('>Q', buf, offset)
|
||||||
|
offset += 8
|
||||||
|
elif p == 'f':
|
||||||
|
bitcount = bits = 0
|
||||||
|
val, = unpack_from('>f', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
elif p == 's':
|
||||||
|
bitcount = bits = 0
|
||||||
|
slen, = unpack_from('B', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
val = buf[offset:offset + slen].decode('utf-8', 'surrogatepass')
|
||||||
|
offset += slen
|
||||||
|
elif p == 'S':
|
||||||
|
bitcount = bits = 0
|
||||||
|
slen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
val = buf[offset:offset + slen].decode('utf-8', 'surrogatepass')
|
||||||
|
offset += slen
|
||||||
|
elif p == 'x':
|
||||||
|
blen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
val = buf[offset:offset + blen]
|
||||||
|
offset += blen
|
||||||
|
elif p == 'F':
|
||||||
|
bitcount = bits = 0
|
||||||
|
tlen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
limit = offset + tlen
|
||||||
|
val = {}
|
||||||
|
while offset < limit:
|
||||||
|
keylen, = unpack_from('>B', buf, offset)
|
||||||
|
offset += 1
|
||||||
|
key = pstr_t(buf[offset:offset + keylen])
|
||||||
|
offset += keylen
|
||||||
|
val[key], offset = _read_item(buf, offset)
|
||||||
|
elif p == 'A':
|
||||||
|
bitcount = bits = 0
|
||||||
|
alen, = unpack_from('>I', buf, offset)
|
||||||
|
offset += 4
|
||||||
|
limit = offset + alen
|
||||||
|
val = []
|
||||||
|
while offset < limit:
|
||||||
|
aval, offset = _read_item(buf, offset)
|
||||||
|
val.append(aval)
|
||||||
|
elif p == 'T':
|
||||||
|
bitcount = bits = 0
|
||||||
|
val, = unpack_from('>Q', buf, offset)
|
||||||
|
offset += 8
|
||||||
|
val = datetime.utcfromtimestamp(val)
|
||||||
|
else:
|
||||||
|
raise FrameSyntaxError(ILLEGAL_TABLE_TYPE.format(p))
|
||||||
|
append(val)
|
||||||
|
return values, offset
|
||||||
|
|
||||||
|
|
||||||
|
def _flushbits(bits, write):
|
||||||
|
if bits:
|
||||||
|
write(pack('B' * len(bits), *bits))
|
||||||
|
bits[:] = []
|
||||||
|
return 0
|
||||||
|
|
||||||
|
|
||||||
|
def dumps(format, values):
|
||||||
|
"""Serialize AMQP arguments.
|
||||||
|
|
||||||
|
Notes:
|
||||||
|
bit = b
|
||||||
|
octet = o
|
||||||
|
short = B
|
||||||
|
long = l
|
||||||
|
long long = L
|
||||||
|
shortstr = s
|
||||||
|
longstr = S
|
||||||
|
byte array = x
|
||||||
|
table = F
|
||||||
|
array = A
|
||||||
|
"""
|
||||||
|
bitcount = 0
|
||||||
|
bits = []
|
||||||
|
out = BytesIO()
|
||||||
|
write = out.write
|
||||||
|
|
||||||
|
format = pstr_t(format)
|
||||||
|
|
||||||
|
for i, val in enumerate(values):
|
||||||
|
p = format[i]
|
||||||
|
if p == 'b':
|
||||||
|
val = 1 if val else 0
|
||||||
|
shift = bitcount % 8
|
||||||
|
if shift == 0:
|
||||||
|
bits.append(0)
|
||||||
|
bits[-1] |= (val << shift)
|
||||||
|
bitcount += 1
|
||||||
|
elif p == 'o':
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
write(pack('B', val))
|
||||||
|
elif p == 'B':
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
write(pack('>H', int(val)))
|
||||||
|
elif p == 'l':
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
write(pack('>I', val))
|
||||||
|
elif p == 'L':
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
write(pack('>Q', val))
|
||||||
|
elif p == 'f':
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
write(pack('>f', val))
|
||||||
|
elif p == 's':
|
||||||
|
val = val or ''
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
if isinstance(val, str):
|
||||||
|
val = val.encode('utf-8', 'surrogatepass')
|
||||||
|
write(pack('B', len(val)))
|
||||||
|
write(val)
|
||||||
|
elif p == 'S' or p == 'x':
|
||||||
|
val = val or ''
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
if isinstance(val, str):
|
||||||
|
val = val.encode('utf-8', 'surrogatepass')
|
||||||
|
write(pack('>I', len(val)))
|
||||||
|
write(val)
|
||||||
|
elif p == 'F':
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
_write_table(val or {}, write, bits)
|
||||||
|
elif p == 'A':
|
||||||
|
bitcount = _flushbits(bits, write)
|
||||||
|
_write_array(val or [], write, bits)
|
||||||
|
elif p == 'T':
|
||||||
|
write(pack('>Q', int(calendar.timegm(val.utctimetuple()))))
|
||||||
|
_flushbits(bits, write)
|
||||||
|
|
||||||
|
return out.getvalue()
|
||||||
|
|
||||||
|
|
||||||
|
def _write_table(d, write, bits):
    out = BytesIO()
    twrite = out.write
    for k, v in d.items():
        if isinstance(k, str):
            k = k.encode('utf-8', 'surrogatepass')
        twrite(pack('B', len(k)))
        twrite(k)
        try:
            _write_item(v, twrite, bits)
        except ValueError:
            raise FrameSyntaxError(
                ILLEGAL_TABLE_TYPE_WITH_KEY.format(type(v), k, v))
    table_data = out.getvalue()
    write(pack('>I', len(table_data)))
    write(table_data)

def _write_array(list_, write, bits):
    out = BytesIO()
    awrite = out.write
    for v in list_:
        try:
            _write_item(v, awrite, bits)
        except ValueError:
            raise FrameSyntaxError(
                ILLEGAL_TABLE_TYPE_WITH_VALUE.format(type(v), v))
    array_data = out.getvalue()
    write(pack('>I', len(array_data)))
    write(array_data)

def _write_item(v, write, bits):
    if isinstance(v, (str, bytes)):
        if isinstance(v, str):
            v = v.encode('utf-8', 'surrogatepass')
        write(pack('>cI', b'S', len(v)))
        write(v)
    elif isinstance(v, bool):
        write(pack('>cB', b't', int(v)))
    elif isinstance(v, float):
        write(pack('>cd', b'd', v))
    elif isinstance(v, int):
        if v > 2147483647 or v < -2147483647:
            write(pack('>cq', b'L', v))
        else:
            write(pack('>ci', b'I', v))
    elif isinstance(v, Decimal):
        sign, digits, exponent = v.as_tuple()
        v = 0
        for d in digits:
            v = (v * 10) + d
        if sign:
            v = -v
        write(pack('>cBi', b'D', -exponent, v))
    elif isinstance(v, datetime):
        write(
            pack('>cQ', b'T', int(calendar.timegm(v.utctimetuple()))))
    elif isinstance(v, dict):
        write(b'F')
        _write_table(v, write, bits)
    elif isinstance(v, (list, tuple)):
        write(b'A')
        _write_array(v, write, bits)
    elif v is None:
        write(b'V')
    else:
        raise ValueError()

def decode_properties_basic(buf, offset):
    """Decode basic properties."""
    properties = {}

    flags, = unpack_from('>H', buf, offset)
    offset += 2

    if flags & 0x8000:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['content_type'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x4000:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['content_encoding'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x2000:
        _f, offset = loads('F', buf, offset)
        properties['application_headers'], = _f
    if flags & 0x1000:
        properties['delivery_mode'], = unpack_from('>B', buf, offset)
        offset += 1
    if flags & 0x0800:
        properties['priority'], = unpack_from('>B', buf, offset)
        offset += 1
    if flags & 0x0400:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['correlation_id'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x0200:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['reply_to'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x0100:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['expiration'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x0080:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['message_id'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x0040:
        properties['timestamp'], = unpack_from('>Q', buf, offset)
        offset += 8
    if flags & 0x0020:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['type'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x0010:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['user_id'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x0008:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['app_id'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    if flags & 0x0004:
        slen, = unpack_from('>B', buf, offset)
        offset += 1
        properties['cluster_id'] = pstr_t(buf[offset:offset + slen])
        offset += slen
    return properties, offset

PROPERTY_CLASSES = {
    Basic.CLASS_ID: decode_properties_basic,
}

class GenericContent:
    """Abstract base class for AMQP content.

    Subclasses should override the PROPERTIES attribute.
    """

    CLASS_ID = None
    PROPERTIES = [('dummy', 's')]

    def __init__(self, frame_method=None, frame_args=None, **props):
        self.frame_method = frame_method
        self.frame_args = frame_args

        self.properties = props
        self._pending_chunks = []
        self.body_received = 0
        self.body_size = 0
        self.ready = False

    __slots__ = (
        "frame_method",
        "frame_args",
        "properties",
        "_pending_chunks",
        "body_received",
        "body_size",
        "ready",
        # adding '__dict__' to get dynamic assignment
        "__dict__",
        "__weakref__",
    )

    def __getattr__(self, name):
        # Look for additional properties in the 'properties'
        # dictionary, and if present - the 'delivery_info' dictionary.
        if name == '__setstate__':
            # Allows pickling/unpickling to work
            raise AttributeError('__setstate__')

        if name in self.properties:
            return self.properties[name]
        raise AttributeError(name)

    def _load_properties(self, class_id, buf, offset):
        """Load AMQP properties.

        Given the raw bytes containing the property-flags and property-list
        from a content-frame-header, parse and insert into a dictionary
        stored in this object as an attribute named 'properties'.
        """
        # Read 16-bit shorts until we get one with a low bit set to zero
        props, offset = PROPERTY_CLASSES[class_id](buf, offset)
        self.properties = props
        return offset

    def _serialize_properties(self):
        """Serialize AMQP properties.

        Serialize the 'properties' attribute (a dictionary) into
        the raw bytes making up a set of property flags and a
        property list, suitable for putting into a content frame header.
        """
        shift = 15
        flag_bits = 0
        flags = []
        sformat, svalues = [], []
        props = self.properties
        for key, proptype in self.PROPERTIES:
            val = props.get(key, None)
            if val is not None:
                if shift == 0:
                    flags.append(flag_bits)
                    flag_bits = 0
                    shift = 15

                flag_bits |= (1 << shift)
                if proptype != 'bit':
                    sformat.append(str_to_bytes(proptype))
                    svalues.append(val)

            shift -= 1
        flags.append(flag_bits)
        result = BytesIO()
        write = result.write
        for flag_bits in flags:
            write(pack('>H', flag_bits))
        write(dumps(b''.join(sformat), svalues))

        return result.getvalue()

    def inbound_header(self, buf, offset=0):
        class_id, self.body_size = unpack_from('>HxxQ', buf, offset)
        offset += 12
        self._load_properties(class_id, buf, offset)
        if not self.body_size:
            self.ready = True
        return offset

    def inbound_body(self, buf):
        chunks = self._pending_chunks
        self.body_received += len(buf)
        if self.body_received >= self.body_size:
            if chunks:
                chunks.append(buf)
                self.body = bytes().join(chunks)
                chunks[:] = []
            else:
                self.body = buf
            self.ready = True
        else:
            chunks.append(buf)
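The `'b'` branch of `dumps()` above packs consecutive boolean arguments into shared octets, low bit first, flushing them via `_flushbits()` when a non-bit field follows. A minimal, self-contained sketch of that bit-accumulation technique (`pack_bits` is an illustrative helper, not part of the library):

```python
from io import BytesIO
from struct import pack

def pack_bits(flags):
    # Accumulate consecutive booleans into octets, low bit first,
    # the same way the 'b' branch of dumps() fills the `bits` list.
    bits = []
    bitcount = 0
    for val in flags:
        shift = bitcount % 8
        if shift == 0:
            bits.append(0)          # start a new octet every 8 bits
        bits[-1] |= ((1 if val else 0) << shift)
        bitcount += 1
    out = BytesIO()
    for octet in bits:
        out.write(pack('B', octet))  # emit each accumulated octet
    return out.getvalue()

print(pack_bits([True, False, True]))  # b'\x05'
```

Three flags fit in one octet (bits 0 and 2 set gives `0x05`); a ninth flag spills into a second octet, mirroring how `dumps()` only appends a new element to `bits` when `bitcount % 8 == 0`.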
@@ -0,0 +1,121 @@
"""AMQP Spec."""

from collections import namedtuple

method_t = namedtuple('method_t', ('method_sig', 'args', 'content'))


def method(method_sig, args=None, content=False):
    """Create amqp method specification tuple."""
    return method_t(method_sig, args, content)


class Connection:
    """AMQ Connection class."""

    CLASS_ID = 10

    Start = (10, 10)
    StartOk = (10, 11)
    Secure = (10, 20)
    SecureOk = (10, 21)
    Tune = (10, 30)
    TuneOk = (10, 31)
    Open = (10, 40)
    OpenOk = (10, 41)
    Close = (10, 50)
    CloseOk = (10, 51)
    Blocked = (10, 60)
    Unblocked = (10, 61)


class Channel:
    """AMQ Channel class."""

    CLASS_ID = 20

    Open = (20, 10)
    OpenOk = (20, 11)
    Flow = (20, 20)
    FlowOk = (20, 21)
    Close = (20, 40)
    CloseOk = (20, 41)


class Exchange:
    """AMQ Exchange class."""

    CLASS_ID = 40

    Declare = (40, 10)
    DeclareOk = (40, 11)
    Delete = (40, 20)
    DeleteOk = (40, 21)
    Bind = (40, 30)
    BindOk = (40, 31)
    Unbind = (40, 40)
    UnbindOk = (40, 51)


class Queue:
    """AMQ Queue class."""

    CLASS_ID = 50

    Declare = (50, 10)
    DeclareOk = (50, 11)
    Bind = (50, 20)
    BindOk = (50, 21)
    Purge = (50, 30)
    PurgeOk = (50, 31)
    Delete = (50, 40)
    DeleteOk = (50, 41)
    Unbind = (50, 50)
    UnbindOk = (50, 51)


class Basic:
    """AMQ Basic class."""

    CLASS_ID = 60

    Qos = (60, 10)
    QosOk = (60, 11)
    Consume = (60, 20)
    ConsumeOk = (60, 21)
    Cancel = (60, 30)
    CancelOk = (60, 31)
    Publish = (60, 40)
    Return = (60, 50)
    Deliver = (60, 60)
    Get = (60, 70)
    GetOk = (60, 71)
    GetEmpty = (60, 72)
    Ack = (60, 80)
    Nack = (60, 120)
    Reject = (60, 90)
    RecoverAsync = (60, 100)
    Recover = (60, 110)
    RecoverOk = (60, 111)


class Confirm:
    """AMQ Confirm class."""

    CLASS_ID = 85

    Select = (85, 10)
    SelectOk = (85, 11)


class Tx:
    """AMQ Tx class."""

    CLASS_ID = 90

    Select = (90, 10)
    SelectOk = (90, 11)
    Commit = (90, 20)
    CommitOk = (90, 21)
    Rollback = (90, 30)
    RollbackOk = (90, 31)
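The `(class-id, method-id)` pairs above travel inside frame payloads; the transport below wraps each payload in the framing described by `read_frame`'s docstring (type octet, channel short, size long, payload, then the `0xCE` frame-end octet). A minimal sketch of that framing, assuming the layout stated there (`build_frame`/`parse_frame` are illustrative helpers, not part of the library):

```python
from struct import pack, unpack

def build_frame(frame_type, channel, payload):
    # type (octet) + channel (short) + size (long) + payload + frame-end 0xCE
    return pack('>BHI', frame_type, channel, len(payload)) + payload + b'\xce'

def parse_frame(data):
    # Mirror read_frame(): 7-byte header, 'size' payload octets, end octet.
    frame_type, channel, size = unpack('>BHI', data[:7])
    payload, frame_end = data[7:7 + size], data[7 + size]
    if frame_end != 0xCE:
        raise ValueError('bad frame-end octet')
    return frame_type, channel, payload

# A method frame on channel 1 whose payload names Connection.StartOk (10, 11)
payload = pack('>HH', 10, 11)
frame = build_frame(1, 1, payload)
print(parse_frame(frame) == (1, 1, payload))  # True
```

The round trip shows why `read_frame` can dispatch on just the first seven bytes: the header fully determines how many payload octets to read before checking the frame-end marker.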
@@ -0,0 +1,679 @@
"""Transport implementation."""
# Copyright (C) 2009 Barry Pederson <bp@barryp.org>

import errno
import os
import re
import socket
import ssl
from contextlib import contextmanager
from ssl import SSLError
from struct import pack, unpack

from .exceptions import UnexpectedFrame
from .platform import KNOWN_TCP_OPTS, SOL_TCP
from .utils import set_cloexec

_UNAVAIL = {errno.EAGAIN, errno.EINTR, errno.ENOENT, errno.EWOULDBLOCK}

AMQP_PORT = 5672

EMPTY_BUFFER = bytes()

SIGNED_INT_MAX = 0x7FFFFFFF

# Yes, Advanced Message Queuing Protocol Protocol is redundant
AMQP_PROTOCOL_HEADER = b'AMQP\x00\x00\x09\x01'

# Match things like: [fe80::1]:5432, from RFC 2732
IPV6_LITERAL = re.compile(r'\[([\.0-9a-f:]+)\](?::(\d+))?')

DEFAULT_SOCKET_SETTINGS = {
    'TCP_NODELAY': 1,
    'TCP_USER_TIMEOUT': 1000,
    'TCP_KEEPIDLE': 60,
    'TCP_KEEPINTVL': 10,
    'TCP_KEEPCNT': 9,
}


def to_host_port(host, default=AMQP_PORT):
    """Convert hostname:port string to host, port tuple."""
    port = default
    m = IPV6_LITERAL.match(host)
    if m:
        host = m.group(1)
        if m.group(2):
            port = int(m.group(2))
    else:
        if ':' in host:
            host, port = host.rsplit(':', 1)
            port = int(port)
    return host, port


class _AbstractTransport:
    """Common superclass for TCP and SSL transports.

    PARAMETERS:
        host: str

            Broker address in format ``HOSTNAME:PORT``.

        connect_timeout: int

            Timeout of creating new connection.

        read_timeout: int

            Sets ``SO_RCVTIMEO`` parameter of socket.

        write_timeout: int

            Sets ``SO_SNDTIMEO`` parameter of socket.

        socket_settings: dict

            Dictionary containing ``optname`` and ``optval`` passed to
            ``setsockopt(2)``.

        raise_on_initial_eintr: bool

            When True, ``socket.timeout`` is raised
            when exception is received during first read. See ``_read()`` for
            details.
    """

    def __init__(self, host, connect_timeout=None,
                 read_timeout=None, write_timeout=None,
                 socket_settings=None, raise_on_initial_eintr=True, **kwargs):
        self.connected = False
        self.sock = None
        self.raise_on_initial_eintr = raise_on_initial_eintr
        self._read_buffer = EMPTY_BUFFER
        self.host, self.port = to_host_port(host)
        self.connect_timeout = connect_timeout
        self.read_timeout = read_timeout
        self.write_timeout = write_timeout
        self.socket_settings = socket_settings

    __slots__ = (
        "connection",
        "sock",
        "raise_on_initial_eintr",
        "_read_buffer",
        "host",
        "port",
        "connect_timeout",
        "read_timeout",
        "write_timeout",
        "socket_settings",
        # adding '__dict__' to get dynamic assignment
        "__dict__",
        "__weakref__",
    )

    def __repr__(self):
        if self.sock:
            src = f'{self.sock.getsockname()[0]}:{self.sock.getsockname()[1]}'
            try:
                dst = f'{self.sock.getpeername()[0]}:{self.sock.getpeername()[1]}'
            except socket.error as e:
                dst = f'ERROR: {e}'
            return f'<{type(self).__name__}: {src} -> {dst} at {id(self):#x}>'
        else:
            return f'<{type(self).__name__}: (disconnected) at {id(self):#x}>'

    def connect(self):
        try:
            # are we already connected?
            if self.connected:
                return
            self._connect(self.host, self.port, self.connect_timeout)
            self._init_socket(
                self.socket_settings, self.read_timeout, self.write_timeout,
            )
            # we've sent the banner; signal connect
            # EINTR, EAGAIN, EWOULDBLOCK would signal that the banner
            # has _not_ been sent
            self.connected = True
        except (OSError, SSLError):
            # if not fully connected, close socket, and reraise error
            if self.sock and not self.connected:
                self.sock.close()
                self.sock = None
            raise

    @contextmanager
    def having_timeout(self, timeout):
        if timeout is None:
            yield self.sock
        else:
            sock = self.sock
            prev = sock.gettimeout()
            if prev != timeout:
                sock.settimeout(timeout)
            try:
                yield self.sock
            except SSLError as exc:
                if 'timed out' in str(exc):
                    # http://bugs.python.org/issue10272
                    raise socket.timeout()
                elif 'The operation did not complete' in str(exc):
                    # Non-blocking SSL sockets can throw SSLError
                    raise socket.timeout()
                raise
            except OSError as exc:
                if exc.errno == errno.EWOULDBLOCK:
                    raise socket.timeout()
                raise
            finally:
                if timeout != prev:
                    sock.settimeout(prev)

    def _connect(self, host, port, timeout):
        entries = socket.getaddrinfo(
            host, port, socket.AF_UNSPEC, socket.SOCK_STREAM, SOL_TCP,
        )
        for i, res in enumerate(entries):
            af, socktype, proto, canonname, sa = res
            try:
                self.sock = socket.socket(af, socktype, proto)
                try:
                    set_cloexec(self.sock, True)
                except NotImplementedError:
                    pass
                self.sock.settimeout(timeout)
                self.sock.connect(sa)
            except socket.error:
                if self.sock:
                    self.sock.close()
                self.sock = None
                if i + 1 >= len(entries):
                    raise
            else:
                break

    def _init_socket(self, socket_settings, read_timeout, write_timeout):
        self.sock.settimeout(None)  # set socket back to blocking mode
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
        self._set_socket_options(socket_settings)

        # set socket timeouts
        for timeout, interval in ((socket.SO_SNDTIMEO, write_timeout),
                                  (socket.SO_RCVTIMEO, read_timeout)):
            if interval is not None:
                sec = int(interval)
                usec = int((interval - sec) * 1000000)
                self.sock.setsockopt(
                    socket.SOL_SOCKET, timeout,
                    pack('ll', sec, usec),
                )
        self._setup_transport()

        self._write(AMQP_PROTOCOL_HEADER)

    def _get_tcp_socket_defaults(self, sock):
        tcp_opts = {}
        for opt in KNOWN_TCP_OPTS:
            enum = None
            if opt == 'TCP_USER_TIMEOUT':
                try:
                    from socket import TCP_USER_TIMEOUT as enum
                except ImportError:
                    # should be in Python 3.6+ on Linux.
                    enum = 18
            elif hasattr(socket, opt):
                enum = getattr(socket, opt)

            if enum:
                if opt in DEFAULT_SOCKET_SETTINGS:
                    tcp_opts[enum] = DEFAULT_SOCKET_SETTINGS[opt]
                elif hasattr(socket, opt):
                    tcp_opts[enum] = sock.getsockopt(
                        SOL_TCP, getattr(socket, opt))
        return tcp_opts

    def _set_socket_options(self, socket_settings):
        tcp_opts = self._get_tcp_socket_defaults(self.sock)
        if socket_settings:
            tcp_opts.update(socket_settings)
        for opt, val in tcp_opts.items():
            self.sock.setsockopt(SOL_TCP, opt, val)

    def _read(self, n, initial=False):
        """Read exactly n bytes from the peer."""
        raise NotImplementedError('Must be overridden in subclass')

    def _setup_transport(self):
        """Do any additional initialization of the class."""
        pass

    def _shutdown_transport(self):
        """Do any preliminary work in shutting down the connection."""
        pass

    def _write(self, s):
        """Completely write a string to the peer."""
        raise NotImplementedError('Must be overridden in subclass')

    def close(self):
        if self.sock is not None:
            try:
                self._shutdown_transport()
            except OSError:
                pass

            # Call shutdown first to make sure that pending messages
            # reach the AMQP broker if the program exits after
            # calling this method.
            try:
                self.sock.shutdown(socket.SHUT_RDWR)
            except OSError:
                pass

            try:
                self.sock.close()
            except OSError:
                pass
            self.sock = None
        self.connected = False

    def read_frame(self, unpack=unpack):
        """Parse AMQP frame.

        Frame has following format::

            0      1         3         7                 size+7      size+8
            +------+---------+---------+   +-------------+   +-----------+
            | type | channel |  size   |   |   payload   |   | frame-end |
            +------+---------+---------+   +-------------+   +-----------+
             octet   short      long        'size' octets        octet

        """
        read = self._read
        read_frame_buffer = EMPTY_BUFFER
        try:
            frame_header = read(7, True)
            read_frame_buffer += frame_header
            frame_type, channel, size = unpack('>BHI', frame_header)
            # >I is an unsigned int, but the argument to sock.recv is signed,
            # so we know the size can be at most 2 * SIGNED_INT_MAX
            if size > SIGNED_INT_MAX:
                part1 = read(SIGNED_INT_MAX)

                try:
                    part2 = read(size - SIGNED_INT_MAX)
                except (socket.timeout, OSError, SSLError):
                    # In case this read times out, we need to make sure to not
                    # lose part1 when we retry the read
                    read_frame_buffer += part1
                    raise

                payload = b''.join([part1, part2])
            else:
                payload = read(size)
                read_frame_buffer += payload
            frame_end = ord(read(1))
        except socket.timeout:
            self._read_buffer = read_frame_buffer + self._read_buffer
            raise
        except (OSError, SSLError) as exc:
            if (
                isinstance(exc, socket.error) and os.name == 'nt'
                and exc.errno == errno.EWOULDBLOCK  # noqa
            ):
                # On windows we can get a read timeout with a winsock error
                # code instead of a proper socket.timeout() error, see
                # https://github.com/celery/py-amqp/issues/320
                self._read_buffer = read_frame_buffer + self._read_buffer
                raise socket.timeout()

            if isinstance(exc, SSLError) and 'timed out' in str(exc):
                # Don't disconnect for ssl read time outs
                # http://bugs.python.org/issue10272
                self._read_buffer = read_frame_buffer + self._read_buffer
                raise socket.timeout()

            if exc.errno not in _UNAVAIL:
                self.connected = False
            raise
        # frame-end octet must contain '\xce' value
        if frame_end == 206:
            return frame_type, channel, payload
        else:
            raise UnexpectedFrame(
                f'Received frame_end {frame_end:#04x} while expecting 0xce')

    def write(self, s):
        try:
            self._write(s)
        except socket.timeout:
            raise
        except OSError as exc:
            if exc.errno not in _UNAVAIL:
                self.connected = False
            raise

class SSLTransport(_AbstractTransport):
|
||||||
|
"""Transport that works over SSL.
|
||||||
|
|
||||||
|
PARAMETERS:
|
||||||
|
host: str
|
||||||
|
|
||||||
|
Broker address in format ``HOSTNAME:PORT``.
|
||||||
|
|
||||||
|
connect_timeout: int
|
||||||
|
|
||||||
|
Timeout of creating new connection.
|
||||||
|
|
||||||
|
ssl: bool|dict
|
||||||
|
|
||||||
|
parameters of TLS subsystem.
|
||||||
|
- when ``ssl`` is not dictionary, defaults of TLS are used
|
||||||
|
- otherwise:
|
||||||
|
- if ``ssl`` dictionary contains ``context`` key,
|
||||||
|
:attr:`~SSLTransport._wrap_context` is used for wrapping
|
||||||
|
socket. ``context`` is a dictionary passed to
|
||||||
|
:attr:`~SSLTransport._wrap_context` as context parameter.
|
||||||
|
All others items from ``ssl`` argument are passed as
|
||||||
|
``sslopts``.
|
||||||
|
- if ``ssl`` dictionary does not contain ``context`` key,
|
||||||
|
:attr:`~SSLTransport._wrap_socket_sni` is used for
|
||||||
|
wrapping socket. All items in ``ssl`` argument are
|
||||||
|
passed to :attr:`~SSLTransport._wrap_socket_sni` as
|
||||||
|
parameters.
|
||||||
|
|
||||||
|
kwargs:
|
||||||
|
|
||||||
|
additional arguments of
|
||||||
|
:class:`~amqp.transport._AbstractTransport` class
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, host, connect_timeout=None, ssl=None, **kwargs):
|
||||||
|
self.sslopts = ssl if isinstance(ssl, dict) else {}
|
||||||
|
self._read_buffer = EMPTY_BUFFER
|
||||||
|
super().__init__(
|
||||||
|
host, connect_timeout=connect_timeout, **kwargs)
|
||||||
|
|
||||||
|
__slots__ = (
|
||||||
|
"sslopts",
|
||||||
|
)
|
||||||
|
|
||||||
|
def _setup_transport(self):
|
||||||
|
"""Wrap the socket in an SSL object."""
|
||||||
|
self.sock = self._wrap_socket(self.sock, **self.sslopts)
|
||||||
|
# Explicitly set a timeout here to stop any hangs on handshake.
|
||||||
|
self.sock.settimeout(self.connect_timeout)
|
||||||
|
self.sock.do_handshake()
|
||||||
|
self._quick_recv = self.sock.read
|
||||||
|
|
||||||
|
def _wrap_socket(self, sock, context=None, **sslopts):
|
||||||
|
if context:
|
||||||
|
return self._wrap_context(sock, sslopts, **context)
|
||||||
|
return self._wrap_socket_sni(sock, **sslopts)
|
||||||
|
|
||||||
|
def _wrap_context(self, sock, sslopts, check_hostname=None, **ctx_options):
|
||||||
|
"""Wrap socket without SNI headers.
|
||||||
|
|
||||||
|
PARAMETERS:
|
||||||
|
sock: socket.socket
|
||||||
|
|
||||||
|
Socket to be wrapped.
|
||||||
|
|
||||||
|
sslopts: dict
|
||||||
|
|
||||||
|
Parameters of :attr:`ssl.SSLContext.wrap_socket`.
|
||||||
|
|
||||||
|
check_hostname
|
||||||
|
|
||||||
|
Whether to match the peer cert’s hostname. See
|
||||||
|
:attr:`ssl.SSLContext.check_hostname` for details.
|
||||||
|
|
||||||
|
ctx_options
|
||||||
|
|
||||||
|
Parameters of :attr:`ssl.create_default_context`.
|
||||||
|
"""
|
||||||
|
ctx = ssl.create_default_context(**ctx_options)
|
||||||
|
ctx.check_hostname = check_hostname
|
||||||
|
return ctx.wrap_socket(sock, **sslopts)
|
||||||
|
|
||||||
|
def _wrap_socket_sni(self, sock, keyfile=None, certfile=None,
|
||||||
|
server_side=False, cert_reqs=None,
|
||||||
|
ca_certs=None, do_handshake_on_connect=False,
|
||||||
|
suppress_ragged_eofs=True, server_hostname=None,
|
||||||
|
ciphers=None, ssl_version=None):
|
||||||
|
"""Socket wrap with SNI headers.
|
||||||
|
|
||||||
|
stdlib :attr:`ssl.SSLContext.wrap_socket` method augmented with support
|
||||||
|
for setting the server_hostname field required for SNI hostname header.
|
||||||
|
|
||||||
|
PARAMETERS:
|
||||||
|
sock: socket.socket
|
||||||
|
|
||||||
|
Socket to be wrapped.
|
||||||
|
|
||||||
|
keyfile: str
|
||||||
|
|
||||||
|
Path to the private key
|
||||||
|
|
||||||
|
certfile: str
|
||||||
|
|
||||||
|
Path to the certificate
|
||||||
|
|
||||||
|
server_side: bool
|
||||||
|
|
||||||
|
Identifies whether server-side or client-side
|
||||||
|
behavior is desired from this socket. See
|
||||||
|
:attr:`~ssl.SSLContext.wrap_socket` for details.
|
||||||
|
|
||||||
|
cert_reqs: ssl.VerifyMode
|
||||||
|
|
||||||
|
When set to other than :attr:`ssl.CERT_NONE`, peers certificate
|
||||||
|
is checked. Possible values are :attr:`ssl.CERT_NONE`,
|
||||||
|
:attr:`ssl.CERT_OPTIONAL` and :attr:`ssl.CERT_REQUIRED`.
|
||||||
|
|
||||||
|
ca_certs: str
|
||||||
|
|
||||||
|
Path to “certification authority” (CA) certificates
|
||||||
|
used to validate other peers’ certificates when ``cert_reqs``
|
||||||
|
is other than :attr:`ssl.CERT_NONE`.
|
||||||
|
|
||||||
|
do_handshake_on_connect: bool
|
||||||
|
|
||||||
|
Specifies whether to do the SSL
|
||||||
|
handshake automatically. See
|
||||||
|
:attr:`~ssl.SSLContext.wrap_socket` for details.
|
||||||
|
|
||||||
|
suppress_ragged_eofs (bool):
|
||||||
|
|
||||||
|
See :attr:`~ssl.SSLContext.wrap_socket` for details.
|
||||||
|
|
||||||
|
server_hostname: str
|
||||||
|
|
||||||
|
Specifies the hostname of the service which
|
||||||
|
we are connecting to. See :attr:`~ssl.SSLContext.wrap_socket`
|
||||||
|
for details.
|
||||||
|
|
||||||
|
ciphers: str
|
||||||
|
|
||||||
|
Available ciphers for sockets created with this
|
||||||
|
context. See :attr:`ssl.SSLContext.set_ciphers`
|
||||||
|
|
||||||
|
ssl_version:
|
||||||
|
|
||||||
|
Protocol of the SSL Context. The value is one of
|
||||||
|
``ssl.PROTOCOL_*`` constants.
|
||||||
|
"""
|
||||||
|
opts = {
|
||||||
|
'sock': sock,
|
||||||
|
'server_side': server_side,
|
||||||
|
'do_handshake_on_connect': do_handshake_on_connect,
|
||||||
|
'suppress_ragged_eofs': suppress_ragged_eofs,
|
||||||
|
'server_hostname': server_hostname,
|
||||||
|
}
|
||||||
|
|
||||||
|
if ssl_version is None:
|
||||||
|
ssl_version = (
|
||||||
|
ssl.PROTOCOL_TLS_SERVER
|
||||||
|
if server_side
|
||||||
|
else ssl.PROTOCOL_TLS_CLIENT
|
||||||
|
)
|
||||||
|
|
||||||
|
context = ssl.SSLContext(ssl_version)
|
||||||
|
|
||||||
|
if certfile is not None:
|
||||||
|
context.load_cert_chain(certfile, keyfile)
|
||||||
|
if ca_certs is not None:
|
||||||
|
context.load_verify_locations(ca_certs)
|
||||||
|
if ciphers is not None:
|
||||||
|
context.set_ciphers(ciphers)
|
||||||
|
# Set SNI headers if supported.
|
||||||
|
# Must set context.check_hostname before setting context.verify_mode
|
||||||
|
# to avoid setting context.verify_mode=ssl.CERT_NONE while
|
||||||
|
# context.check_hostname is still True (the default value in context
|
||||||
|
# if client-side) which results in the following exception:
|
||||||
|
# ValueError: Cannot set verify_mode to CERT_NONE when check_hostname
|
||||||
|
# is enabled.
|
||||||
|
try:
|
||||||
|
context.check_hostname = (
|
||||||
|
ssl.HAS_SNI and server_hostname is not None
|
||||||
|
)
|
||||||
|
except AttributeError:
|
||||||
|
pass # ask forgiveness not permission
|
||||||
|
|
||||||
|
# See note above re: ordering for context.check_hostname and
|
||||||
|
# context.verify_mode assignments.
|
||||||
|
if cert_reqs is not None:
|
||||||
|
context.verify_mode = cert_reqs
|
||||||
|
|
||||||
|
if ca_certs is None and context.verify_mode != ssl.CERT_NONE:
|
||||||
|
purpose = (
|
||||||
|
ssl.Purpose.CLIENT_AUTH
|
||||||
|
if server_side
|
||||||
|
else ssl.Purpose.SERVER_AUTH
|
||||||
|
)
|
||||||
|
context.load_default_certs(purpose)
|
||||||
|
|
||||||
|
sock = context.wrap_socket(**opts)
|
||||||
|
return sock
|
||||||
|
|
||||||
|
def _shutdown_transport(self):
|
||||||
|
"""Unwrap a SSL socket, so we can call shutdown()."""
|
||||||
|
if self.sock is not None:
|
||||||
|
self.sock = self.sock.unwrap()
|
||||||
|
|
||||||
|
def _read(self, n, initial=False,
|
||||||
|
_errnos=(errno.ENOENT, errno.EAGAIN, errno.EINTR)):
|
||||||
|
# According to SSL_read(3), it can at most return 16kb of data.
|
||||||
|
# Thus, we use an internal read buffer like TCPTransport._read
|
||||||
|
# to get the exact number of bytes wanted.
|
||||||
|
recv = self._quick_recv
|
||||||
|
rbuf = self._read_buffer
|
||||||
|
try:
|
||||||
|
while len(rbuf) < n:
|
||||||
|
try:
|
||||||
|
s = recv(n - len(rbuf)) # see note above
|
||||||
|
except OSError as exc:
|
||||||
|
# ssl.sock.read may cause ENOENT if the
|
||||||
|
# operation couldn't be performed (Issue celery#1414).
|
||||||
|
if exc.errno in _errnos:
|
||||||
|
if initial and self.raise_on_initial_eintr:
|
||||||
|
raise socket.timeout()
|
||||||
|
continue
|
||||||
|
raise
|
||||||
|
if not s:
|
||||||
|
raise OSError('Server unexpectedly closed connection')
|
||||||
|
rbuf += s
|
||||||
|
except: # noqa
|
||||||
|
self._read_buffer = rbuf
|
||||||
|
raise
|
||||||
|
result, self._read_buffer = rbuf[:n], rbuf[n:]
|
||||||
|
return result
|
||||||
|
|
||||||
|
def _write(self, s):
|
||||||
|
"""Write a string out to the SSL socket fully."""
|
||||||
|
write = self.sock.write
|
||||||
|
while s:
|
||||||
|
try:
|
||||||
|
n = write(s)
|
||||||
|
except ValueError:
|
||||||
|
# AG: sock._sslobj might become null in the meantime if the
|
||||||
|
# remote connection has hung up.
|
||||||
|
# In Python 3.4, a ValueError is raised if self._sslobj is
|
||||||
|
# None.
|
||||||
|
n = 0
|
||||||
|
if not n:
|
||||||
|
raise OSError('Socket closed')
|
||||||
|
s = s[n:]
|
||||||
|
|
||||||
|
|
||||||
|
class TCPTransport(_AbstractTransport):
|
||||||
|
"""Transport that deals directly with TCP socket.
|
||||||
|
|
||||||
|
All parameters are as for the :class:`~amqp.transport._AbstractTransport` class.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def _setup_transport(self):
|
||||||
|
# Setup to _write() directly to the socket, and
|
||||||
|
# do our own buffered reads.
|
||||||
|
self._write = self.sock.sendall
|
||||||
|
self._read_buffer = EMPTY_BUFFER
|
||||||
|
self._quick_recv = self.sock.recv
|
||||||
|
|
||||||
|
def _read(self, n, initial=False, _errnos=(errno.EAGAIN, errno.EINTR)):
|
||||||
|
"""Read exactly n bytes from the socket."""
|
||||||
|
recv = self._quick_recv
|
||||||
|
rbuf = self._read_buffer
|
||||||
|
try:
|
||||||
|
while len(rbuf) < n:
|
||||||
|
try:
|
||||||
|
s = recv(n - len(rbuf))
|
||||||
|
except OSError as exc:
|
||||||
|
if exc.errno in _errnos:
|
||||||
|
if initial and self.raise_on_initial_eintr:
|
||||||
|
raise socket.timeout()
|
||||||
|
continue
|
||||||
|
raise
|
||||||
|
if not s:
|
||||||
|
raise OSError('Server unexpectedly closed connection')
|
||||||
|
rbuf += s
|
||||||
|
except: # noqa
|
||||||
|
self._read_buffer = rbuf
|
||||||
|
raise
|
||||||
|
|
||||||
|
result, self._read_buffer = rbuf[:n], rbuf[n:]
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
def Transport(host, connect_timeout=None, ssl=False, **kwargs):
|
||||||
|
"""Create transport.
|
||||||
|
|
||||||
|
Given a few parameters from the Connection constructor,
|
||||||
|
select and create a subclass of
|
||||||
|
:class:`~amqp.transport._AbstractTransport`.
|
||||||
|
|
||||||
|
PARAMETERS:
|
||||||
|
|
||||||
|
host: str
|
||||||
|
|
||||||
|
Broker address in format ``HOSTNAME:PORT``.
|
||||||
|
|
||||||
|
connect_timeout: int
|
||||||
|
|
||||||
|
Timeout of creating new connection.
|
||||||
|
|
||||||
|
ssl: bool|dict
|
||||||
|
|
||||||
|
If set, :class:`~amqp.transport.SSLTransport` is used
|
||||||
|
and ``ssl`` parameter is passed to it. Otherwise
|
||||||
|
:class:`~amqp.transport.TCPTransport` is used.
|
||||||
|
|
||||||
|
kwargs:
|
||||||
|
|
||||||
|
additional arguments of the :class:`~amqp.transport._AbstractTransport`
class.
|
||||||
|
"""
|
||||||
|
transport = SSLTransport if ssl else TCPTransport
|
||||||
|
return transport(host, connect_timeout=connect_timeout, ssl=ssl, **kwargs)
|
||||||
|
|
@ -0,0 +1,64 @@
|
||||||
|
"""Compatibility utilities."""
|
||||||
|
import logging
|
||||||
|
from logging import NullHandler
|
||||||
|
|
||||||
|
# enables celery 3.1.23 to start again
|
||||||
|
from vine import promise # noqa
|
||||||
|
from vine.utils import wraps
|
||||||
|
|
||||||
|
try:
|
||||||
|
import fcntl
|
||||||
|
except ImportError: # pragma: no cover
|
||||||
|
fcntl = None # noqa
|
||||||
|
|
||||||
|
|
||||||
|
def set_cloexec(fd, cloexec):
|
||||||
|
"""Set flag to close fd after exec."""
|
||||||
|
if fcntl is None:
|
||||||
|
return
|
||||||
|
try:
|
||||||
|
FD_CLOEXEC = fcntl.FD_CLOEXEC
|
||||||
|
except AttributeError:
|
||||||
|
raise NotImplementedError(
|
||||||
|
'close-on-exec flag not supported on this platform',
|
||||||
|
)
|
||||||
|
flags = fcntl.fcntl(fd, fcntl.F_GETFD)
|
||||||
|
if cloexec:
|
||||||
|
flags |= FD_CLOEXEC
|
||||||
|
else:
|
||||||
|
flags &= ~FD_CLOEXEC
|
||||||
|
return fcntl.fcntl(fd, fcntl.F_SETFD, flags)
|
||||||
|
|
||||||
|
|
||||||
|
def coro(gen):
|
||||||
|
"""Decorator to mark generator as a co-routine."""
|
||||||
|
@wraps(gen)
|
||||||
|
def _boot(*args, **kwargs):
|
||||||
|
co = gen(*args, **kwargs)
|
||||||
|
next(co)
|
||||||
|
return co
|
||||||
|
|
||||||
|
return _boot
|
||||||
|
|
||||||
|
|
||||||
|
def str_to_bytes(s):
|
||||||
|
"""Convert str to bytes."""
|
||||||
|
if isinstance(s, str):
|
||||||
|
return s.encode('utf-8', 'surrogatepass')
|
||||||
|
return s
|
||||||
|
|
||||||
|
|
||||||
|
def bytes_to_str(s):
|
||||||
|
"""Convert bytes to str."""
|
||||||
|
if isinstance(s, bytes):
|
||||||
|
return s.decode('utf-8', 'surrogatepass')
|
||||||
|
return s
|
||||||
|
|
||||||
|
|
||||||
|
def get_logger(logger):
|
||||||
|
"""Get logger by name."""
|
||||||
|
if isinstance(logger, str):
|
||||||
|
logger = logging.getLogger(logger)
|
||||||
|
if not logger.handlers:
|
||||||
|
logger.addHandler(NullHandler())
|
||||||
|
return logger
|
||||||
|
|
@ -0,0 +1 @@
|
||||||
|
pip
|
||||||
|
|
@ -0,0 +1,295 @@
|
||||||
|
Metadata-Version: 2.3
|
||||||
|
Name: annotated-types
|
||||||
|
Version: 0.7.0
|
||||||
|
Summary: Reusable constraint types to use with typing.Annotated
|
||||||
|
Project-URL: Homepage, https://github.com/annotated-types/annotated-types
|
||||||
|
Project-URL: Source, https://github.com/annotated-types/annotated-types
|
||||||
|
Project-URL: Changelog, https://github.com/annotated-types/annotated-types/releases
|
||||||
|
Author-email: Adrian Garcia Badaracco <1755071+adriangb@users.noreply.github.com>, Samuel Colvin <s@muelcolvin.com>, Zac Hatfield-Dodds <zac@zhd.dev>
|
||||||
|
License-File: LICENSE
|
||||||
|
Classifier: Development Status :: 4 - Beta
|
||||||
|
Classifier: Environment :: Console
|
||||||
|
Classifier: Environment :: MacOS X
|
||||||
|
Classifier: Intended Audience :: Developers
|
||||||
|
Classifier: Intended Audience :: Information Technology
|
||||||
|
Classifier: License :: OSI Approved :: MIT License
|
||||||
|
Classifier: Operating System :: POSIX :: Linux
|
||||||
|
Classifier: Operating System :: Unix
|
||||||
|
Classifier: Programming Language :: Python :: 3 :: Only
|
||||||
|
Classifier: Programming Language :: Python :: 3.8
|
||||||
|
Classifier: Programming Language :: Python :: 3.9
|
||||||
|
Classifier: Programming Language :: Python :: 3.10
|
||||||
|
Classifier: Programming Language :: Python :: 3.11
|
||||||
|
Classifier: Programming Language :: Python :: 3.12
|
||||||
|
Classifier: Topic :: Software Development :: Libraries :: Python Modules
|
||||||
|
Classifier: Typing :: Typed
|
||||||
|
Requires-Python: >=3.8
|
||||||
|
Requires-Dist: typing-extensions>=4.0.0; python_version < '3.9'
|
||||||
|
Description-Content-Type: text/markdown
|
||||||
|
|
||||||
|
# annotated-types
|
||||||
|
|
||||||
|
[](https://github.com/annotated-types/annotated-types/actions?query=event%3Apush+branch%3Amain+workflow%3ACI)
|
||||||
|
[](https://pypi.python.org/pypi/annotated-types)
|
||||||
|
[](https://github.com/annotated-types/annotated-types)
|
||||||
|
[](https://github.com/annotated-types/annotated-types/blob/main/LICENSE)
|
||||||
|
|
||||||
|
[PEP-593](https://peps.python.org/pep-0593/) added `typing.Annotated` as a way of
|
||||||
|
adding context-specific metadata to existing types, and specifies that
|
||||||
|
`Annotated[T, x]` _should_ be treated as `T` by any tool or library without special
|
||||||
|
logic for `x`.
|
||||||
|
|
||||||
|
This package provides metadata objects which can be used to represent common
|
||||||
|
constraints such as upper and lower bounds on scalar values and collection sizes,
|
||||||
|
a `Predicate` marker for runtime checks, and
|
||||||
|
descriptions of how we intend these metadata to be interpreted. In some cases,
|
||||||
|
we also note alternative representations which do not require this package.
|
||||||
|
|
||||||
|
## Install
|
||||||
|
|
||||||
|
```bash
|
||||||
|
pip install annotated-types
|
||||||
|
```
|
||||||
|
|
||||||
|
## Examples
|
||||||
|
|
||||||
|
```python
|
||||||
|
from typing import Annotated
|
||||||
|
from annotated_types import Gt, Len, Predicate
|
||||||
|
|
||||||
|
class MyClass:
|
||||||
|
age: Annotated[int, Gt(18)] # Valid: 19, 20, ...
|
||||||
|
# Invalid: 17, 18, "19", 19.0, ...
|
||||||
|
factors: list[Annotated[int, Predicate(is_prime)]] # Valid: 2, 3, 5, 7, 11, ...
|
||||||
|
# Invalid: 4, 8, -2, 5.0, "prime", ...
|
||||||
|
|
||||||
|
my_list: Annotated[list[int], Len(0, 10)] # Valid: [], [10, 20, 30, 40, 50]
|
||||||
|
# Invalid: (1, 2), ["abc"], [0] * 20
|
||||||
|
```
|
||||||
|
|
||||||
|
## Documentation
|
||||||
|
|
||||||
|
_While `annotated-types` avoids runtime checks for performance, users should not
|
||||||
|
construct invalid combinations such as `MultipleOf("non-numeric")` or `Annotated[int, Len(3)]`.
|
||||||
|
Downstream implementors may choose to raise an error, emit a warning, silently ignore
|
||||||
|
a metadata item, etc., if the metadata objects described below are used with an
|
||||||
|
incompatible type - or for any other reason!_
|
||||||
|
|
||||||
|
### Gt, Ge, Lt, Le
|
||||||
|
|
||||||
|
Express inclusive and/or exclusive bounds on orderable values - which may be numbers,
|
||||||
|
dates, times, strings, sets, etc. Note that the boundary value need not be of the
|
||||||
|
same type that was annotated, so long as they can be compared: `Annotated[int, Gt(1.5)]`
|
||||||
|
is fine, for example, and implies that the value is an integer x such that `x > 1.5`.
|
||||||
|
|
||||||
|
We suggest that implementors may also interpret `functools.partial(operator.lt, 1.5)`
as being equivalent to `Gt(1.5)`, for users who wish to avoid a runtime dependency on
|
||||||
|
the `annotated-types` package.
|
||||||
|
|
||||||
|
To be explicit, these types have the following meanings:
|
||||||
|
|
||||||
|
* `Gt(x)` - value must be "Greater Than" `x` - equivalent to exclusive minimum
|
||||||
|
* `Ge(x)` - value must be "Greater than or Equal" to `x` - equivalent to inclusive minimum
|
||||||
|
* `Lt(x)` - value must be "Less Than" `x` - equivalent to exclusive maximum
|
||||||
|
* `Le(x)` - value must be "Less than or Equal" to `x` - equivalent to inclusive maximum
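
The semantics above can be sketched as a small runtime check. This is a hedged illustration only: the `Gt`/`Le` classes below are stand-ins mirroring the real `annotated_types` markers, and `check` is a hypothetical consumer-side helper, not part of the package.

```python
from dataclasses import dataclass
from typing import Annotated, get_args

# Stand-ins for the real annotated_types markers (illustration only).
@dataclass(frozen=True)
class Gt:
    gt: object

@dataclass(frozen=True)
class Le:
    le: object

Age = Annotated[int, Gt(17), Le(120)]

def check(value, tp):
    # Enforce every bound found in the Annotated metadata.
    base, *meta = get_args(tp)
    if not isinstance(value, base):
        return False
    for m in meta:
        if isinstance(m, Gt) and not value > m.gt:
            return False
        if isinstance(m, Le) and not value <= m.le:
            return False
    return True

assert check(18, Age)        # 18 > 17 and 18 <= 120
assert not check(17, Age)    # Gt is an exclusive minimum
assert not check("19", Age)  # wrong type, as noted above
```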
|
||||||
|
|
||||||
|
### Interval
|
||||||
|
|
||||||
|
`Interval(gt, ge, lt, le)` allows you to specify an upper and lower bound with a single
|
||||||
|
metadata object. `None` attributes should be ignored, and non-`None` attributes
|
||||||
|
treated as per the single bounds above.
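
A minimal sketch of that behaviour, using stand-in classes rather than the library's own implementation (only two of the four bounds are shown):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Ge:
    ge: object

@dataclass(frozen=True)
class Lt:
    lt: object

@dataclass(frozen=True)
class Interval:
    # None attributes are ignored; non-None ones yield single bounds.
    gt: object = None
    ge: object = None
    lt: object = None
    le: object = None

    def __iter__(self):
        if self.ge is not None:
            yield Ge(self.ge)
        if self.lt is not None:
            yield Lt(self.lt)
        # gt/le are handled the same way; omitted for brevity.

# An Interval unpacks into the equivalent single-bound constraints.
assert list(Interval(ge=0, lt=10)) == [Ge(0), Lt(10)]
```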
|
||||||
|
|
||||||
|
### MultipleOf
|
||||||
|
|
||||||
|
`MultipleOf(multiple_of=x)` might be interpreted in two ways:
|
||||||
|
|
||||||
|
1. Python semantics, implying `value % multiple_of == 0`, or
|
||||||
|
2. [JSONschema semantics](https://json-schema.org/draft/2020-12/json-schema-validation.html#rfc.section.6.2.1),
|
||||||
|
where `int(value / multiple_of) == value / multiple_of`.
|
||||||
|
|
||||||
|
We encourage users to be aware of these two common interpretations and their
|
||||||
|
distinct behaviours, especially since very large or non-integer numbers make
|
||||||
|
it easy to cause silent data corruption due to floating-point imprecision.
|
||||||
|
|
||||||
|
We encourage libraries to carefully document which interpretation they implement.
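
The two interpretations can disagree for floats. A small illustration with plain functions (no library dependency):

```python
def multiple_of_python(value, x):
    # Interpretation 1: Python semantics.
    return value % x == 0

def multiple_of_jsonschema(value, x):
    # Interpretation 2: JSON Schema semantics.
    return int(value / x) == value / x

# Integers behave identically under both:
assert multiple_of_python(10, 5) and multiple_of_jsonschema(10, 5)

# Floats can diverge: 1.0 % 0.1 leaves a tiny nonzero remainder,
# while 1.0 / 0.1 rounds to exactly 10.0.
assert not multiple_of_python(1.0, 0.1)
assert multiple_of_jsonschema(1.0, 0.1)
```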
|
||||||
|
|
||||||
|
### MinLen, MaxLen, Len
|
||||||
|
|
||||||
|
`Len()` implies that `min_length <= len(value) <= max_length` - lower and upper bounds are inclusive.
|
||||||
|
|
||||||
|
As well as `Len()` which can optionally include upper and lower bounds, we also
|
||||||
|
provide `MinLen(x)` and `MaxLen(y)` which are equivalent to `Len(min_length=x)`
|
||||||
|
and `Len(max_length=y)` respectively.
|
||||||
|
|
||||||
|
`Len`, `MinLen`, and `MaxLen` may be used with any type which supports `len(value)`.
|
||||||
|
|
||||||
|
Examples of usage:
|
||||||
|
|
||||||
|
* `Annotated[list, MaxLen(10)]` (or `Annotated[list, Len(max_length=10)]`) - list must have a length of 10 or less
|
||||||
|
* `Annotated[str, MaxLen(10)]` - string must have a length of 10 or less
|
||||||
|
* `Annotated[list, MinLen(3)]` (or `Annotated[list, Len(min_length=3)]`) - list must have a length of 3 or more
|
||||||
|
* `Annotated[list, Len(4, 6)]` - list must have a length of 4, 5, or 6
|
||||||
|
* `Annotated[list, Len(8, 8)]` - list must have a length of exactly 8
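
The examples above reduce to one inclusive range check. A hedged sketch (`check_len` is a hypothetical helper, not part of the package):

```python
def check_len(value, min_length=0, max_length=None):
    # Len() semantics: min_length <= len(value) <= max_length, both inclusive.
    n = len(value)
    return n >= min_length and (max_length is None or n <= max_length)

assert check_len([1, 2, 3, 4, 5], 4, 6)   # Len(4, 6)
assert check_len("abcdefgh", 8, 8)        # Len(8, 8): exactly 8
assert not check_len([0] * 20, 0, 10)     # MaxLen(10) violated
assert check_len((1, 2, 3), 3)            # MinLen(3); works on any sized type
```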
|
||||||
|
|
||||||
|
#### Changed in v0.4.0
|
||||||
|
|
||||||
|
* `min_inclusive` has been renamed to `min_length`, no change in meaning
|
||||||
|
* `max_exclusive` has been renamed to `max_length`, upper bound is now **inclusive** instead of **exclusive**
|
||||||
|
* The recommendation that slices are interpreted as `Len` has been removed due to ambiguity and different semantic
|
||||||
|
meaning of the upper bound in slices vs. `Len`
|
||||||
|
|
||||||
|
See [issue #23](https://github.com/annotated-types/annotated-types/issues/23) for discussion.
|
||||||
|
|
||||||
|
### Timezone
|
||||||
|
|
||||||
|
`Timezone` can be used with a `datetime` or a `time` to express which timezones
|
||||||
|
are allowed. `Annotated[datetime, Timezone(None)]` must be a naive datetime.
|
||||||
|
`Timezone[...]` ([literal ellipsis](https://docs.python.org/3/library/constants.html#Ellipsis))
|
||||||
|
expresses that any timezone-aware datetime is allowed. You may also pass a specific
|
||||||
|
timezone string or [`tzinfo`](https://docs.python.org/3/library/datetime.html#tzinfo-objects)
|
||||||
|
object such as `Timezone(timezone.utc)` or `Timezone("Africa/Abidjan")` to express that you only
|
||||||
|
allow a specific timezone, though we note that this is often a symptom of fragile design.
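
The three cases can be sketched as follows. `check_timezone` is a hypothetical consumer-side check, and comparing a specific timezone by UTC offset is a simplifying assumption; real implementations may compare `tzinfo` objects differently.

```python
from datetime import datetime, timezone

def check_timezone(dt, tz_spec):
    if tz_spec is None:    # Timezone(None): naive datetimes only
        return dt.tzinfo is None
    if tz_spec is ...:     # Timezone[...]: any timezone-aware datetime
        return dt.tzinfo is not None
    # Specific tzinfo: aware and matching offset (simplifying assumption).
    return dt.tzinfo is not None and dt.utcoffset() == dt.astimezone(tz_spec).utcoffset()

naive = datetime(2024, 1, 1, 12, 0)
aware = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)

assert check_timezone(naive, None)
assert not check_timezone(naive, ...)
assert check_timezone(aware, ...)
assert check_timezone(aware, timezone.utc)
```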
|
||||||
|
|
||||||
|
#### Changed in v0.x.x
|
||||||
|
|
||||||
|
* `Timezone` accepts [`tzinfo`](https://docs.python.org/3/library/datetime.html#tzinfo-objects) objects instead of
|
||||||
|
`timezone`, extending compatibility to [`zoneinfo`](https://docs.python.org/3/library/zoneinfo.html) and third party libraries.
|
||||||
|
|
||||||
|
### Unit
|
||||||
|
|
||||||
|
`Unit(unit: str)` expresses that the annotated numeric value is the magnitude of
|
||||||
|
a quantity with the specified unit. For example, `Annotated[float, Unit("m/s")]`
|
||||||
|
would be a float representing a velocity in meters per second.
|
||||||
|
|
||||||
|
Please note that `annotated_types` itself makes no attempt to parse or validate
|
||||||
|
the unit string in any way. That is left entirely to downstream libraries,
|
||||||
|
such as [`pint`](https://pint.readthedocs.io) or
|
||||||
|
[`astropy.units`](https://docs.astropy.org/en/stable/units/).
|
||||||
|
|
||||||
|
An example of how a library might use this metadata:
|
||||||
|
|
||||||
|
```python
|
||||||
|
from annotated_types import Unit
|
||||||
|
from typing import Annotated, TypeVar, Callable, Any, get_origin, get_args
|
||||||
|
|
||||||
|
# given a type annotated with a unit:
|
||||||
|
Meters = Annotated[float, Unit("m")]
|
||||||
|
|
||||||
|
|
||||||
|
# you can cast the annotation to a specific unit type with any
|
||||||
|
# callable that accepts a string and returns the desired type
|
||||||
|
T = TypeVar("T")
|
||||||
|
def cast_unit(tp: Any, unit_cls: Callable[[str], T]) -> T | None:
|
||||||
|
if get_origin(tp) is Annotated:
|
||||||
|
for arg in get_args(tp):
|
||||||
|
if isinstance(arg, Unit):
|
||||||
|
return unit_cls(arg.unit)
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
# using `pint`
|
||||||
|
import pint
|
||||||
|
pint_unit = cast_unit(Meters, pint.Unit)
|
||||||
|
|
||||||
|
|
||||||
|
# using `astropy.units`
|
||||||
|
import astropy.units as u
|
||||||
|
astropy_unit = cast_unit(Meters, u.Unit)
|
||||||
|
```
|
||||||
|
|
||||||
|
### Predicate
|
||||||
|
|
||||||
|
`Predicate(func: Callable)` expresses that `func(value)` is truthy for valid values.
|
||||||
|
Users should prefer the statically inspectable metadata above, but if you need
|
||||||
|
the full power and flexibility of arbitrary runtime predicates... here it is.
|
||||||
|
|
||||||
|
For some common constraints, we provide generic types:
|
||||||
|
|
||||||
|
* `IsLower = Annotated[T, Predicate(str.islower)]`
|
||||||
|
* `IsUpper = Annotated[T, Predicate(str.isupper)]`
|
||||||
|
* `IsDigit = Annotated[T, Predicate(str.isdigit)]`
|
||||||
|
* `IsFinite = Annotated[T, Predicate(math.isfinite)]`
|
||||||
|
* `IsNotFinite = Annotated[T, Predicate(Not(math.isfinite))]`
|
||||||
|
* `IsNan = Annotated[T, Predicate(math.isnan)]`
|
||||||
|
* `IsNotNan = Annotated[T, Predicate(Not(math.isnan))]`
|
||||||
|
* `IsInfinite = Annotated[T, Predicate(math.isinf)]`
|
||||||
|
* `IsNotInfinite = Annotated[T, Predicate(Not(math.isinf))]`
|
||||||
|
|
||||||
|
so that you can write e.g. `x: IsFinite[float] = 2.0` instead of the longer
|
||||||
|
(but exactly equivalent) `x: Annotated[float, Predicate(math.isfinite)] = 2.0`.
|
||||||
|
|
||||||
|
Some libraries might have special logic to handle known or understandable predicates,
|
||||||
|
for example by checking for `str.isdigit` and using its presence to both call custom
|
||||||
|
logic to enforce digit-only strings, and customise some generated external schema.
|
||||||
|
Users are therefore encouraged to avoid indirection like `lambda s: s.lower()`, in
|
||||||
|
favor of introspectable methods such as `str.lower` or `re.compile("pattern").search`.
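
A minimal sketch of applying `Predicate` metadata at runtime. The `Predicate` class here is a stand-in for the real marker, and `validate` is a hypothetical consumer helper.

```python
from dataclasses import dataclass
from typing import Annotated, get_args

@dataclass(frozen=True)
class Predicate:
    func: object

# An introspectable bound method, per the recommendation above.
LowerStr = Annotated[str, Predicate(str.islower)]

def validate(value, tp):
    # Value must be the base type and satisfy every predicate.
    base, *meta = get_args(tp)
    return isinstance(value, base) and all(
        m.func(value) for m in meta if isinstance(m, Predicate)
    )

assert validate("hello", LowerStr)
assert not validate("Hello", LowerStr)
```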
|
||||||
|
|
||||||
|
To enable basic negation of commonly used predicates like `math.isnan` without resorting to lambdas (which would make it impossible for implementers to introspect the predicate), we provide a `Not` wrapper that simply negates the predicate in an introspectable manner. Several of the predicates listed above are created in this manner.
|
||||||
|
|
||||||
|
We do not specify what behaviour should be expected for predicates that raise
|
||||||
|
an exception. For example `Annotated[int, Predicate(str.isdigit)]` might silently
|
||||||
|
skip invalid constraints, or statically raise an error; or it might try calling it
|
||||||
|
and then propagate or discard the resulting
|
||||||
|
`TypeError: descriptor 'isdigit' for 'str' objects doesn't apply to a 'int' object`
|
||||||
|
exception. We encourage libraries to document the behaviour they choose.
|
||||||
|
|
||||||
|
### Doc
|
||||||
|
|
||||||
|
`doc()` can be used to add documentation information in `Annotated`, for function and method parameters, variables, class attributes, return types, and any place where `Annotated` can be used.
|
||||||
|
|
||||||
|
It expects a value that can be statically analyzed, as the main use case is for static analysis, editors, documentation generators, and similar tools.
|
||||||
|
|
||||||
|
It returns a `DocInfo` class with a single attribute `documentation` containing the value passed to `doc()`.
|
||||||
|
|
||||||
|
This is the early adopter's alternative form of the [`typing-doc` proposal](https://github.com/tiangolo/fastapi/blob/typing-doc/typing_doc.md).
|
||||||
|
|
||||||
|
### Integrating downstream types with `GroupedMetadata`
|
||||||
|
|
||||||
|
Implementers may choose to provide a convenience wrapper that groups multiple pieces of metadata.
|
||||||
|
This can help reduce verbosity and cognitive overhead for users.
|
||||||
|
For example, an implementer like Pydantic might provide a `Field` or `Meta` type that accepts keyword arguments and transforms these into low-level metadata:
|
||||||
|
|
||||||
|
```python
|
||||||
|
from dataclasses import dataclass
|
||||||
|
from typing import Iterator
|
||||||
|
from annotated_types import GroupedMetadata, Ge
|
||||||
|
|
||||||
|
@dataclass
|
||||||
|
class Field(GroupedMetadata):
|
||||||
|
ge: int | None = None
|
||||||
|
description: str | None = None
|
||||||
|
|
||||||
|
def __iter__(self) -> Iterator[object]:
|
||||||
|
# Iterating over a GroupedMetadata object should yield annotated-types
|
||||||
|
# constraint metadata objects which describe it as fully as possible,
|
||||||
|
# and may include other unknown objects too.
|
||||||
|
if self.ge is not None:
|
||||||
|
yield Ge(self.ge)
|
||||||
|
if self.description is not None:
|
||||||
|
yield Description(self.description)
|
||||||
|
```
|
||||||
|
|
||||||
|
Libraries consuming annotated-types constraints should check for `GroupedMetadata` and unpack it by iterating over the object and treating the results as if they had been "unpacked" in the `Annotated` type. The same logic should be applied to the [PEP 646 `Unpack` type](https://peps.python.org/pep-0646/), so that `Annotated[T, Field(...)]`, `Annotated[T, Unpack[Field(...)]]` and `Annotated[T, *Field(...)]` are all treated consistently.
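
A sketch of that flattening step on the consumer side. The `Ge`/`Field` classes are stand-ins, and checking bare iterability is a simplification; real consumers would check for the `GroupedMetadata` protocol instead.

```python
from dataclasses import dataclass
from typing import Annotated, Optional, get_args

@dataclass(frozen=True)
class Ge:
    ge: int

@dataclass(frozen=True)
class Field:
    ge: Optional[int] = None
    description: Optional[str] = None

    def __iter__(self):
        # Yield the low-level constraints this grouped object represents.
        if self.ge is not None:
            yield Ge(self.ge)

def collect(tp):
    # Flatten grouped metadata as if it had been unpacked inside Annotated.
    out = []
    for meta in get_args(tp)[1:]:
        if hasattr(meta, "__iter__") and not isinstance(meta, (str, bytes)):
            out.extend(meta)
        else:
            out.append(meta)
    return out

# Annotated[int, Field(ge=0)] is treated like Annotated[int, Ge(0)].
assert collect(Annotated[int, Field(ge=0)]) == [Ge(0)]
assert collect(Annotated[int, Ge(1)]) == [Ge(1)]
```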
|
||||||
|
|
||||||
|
Libraries consuming annotated-types should also ignore any metadata they do not recognize that came from unpacking a `GroupedMetadata`, just like they ignore unrecognized metadata in `Annotated` itself.
|
||||||
|
|
||||||
|
Our own `annotated_types.Interval` class is a `GroupedMetadata` which unpacks itself into `Gt`, `Lt`, etc., so this is not an abstract concern. Similarly, `annotated_types.Len` is a `GroupedMetadata` which unpacks itself into `MinLen` (optionally) and `MaxLen`.
|
||||||
|
|
||||||
|
### Consuming metadata
|
||||||
|
|
||||||
|
We intend to not be prescriptive as to _how_ the metadata and constraints are used, but as an example of how one might parse constraints from types annotations see our [implementation in `test_main.py`](https://github.com/annotated-types/annotated-types/blob/f59cf6d1b5255a0fe359b93896759a180bec30ae/tests/test_main.py#L94-L103).
|
||||||
|
|
||||||
|
It is up to the implementer to determine how this metadata is used.
|
||||||
|
You could use the metadata for runtime type checking, for generating schemas or to generate example data, amongst other use cases.
|
||||||
|
|
||||||
|
## Design & History
|
||||||
|
|
||||||
|
This package was designed at the PyCon 2022 sprints by the maintainers of Pydantic
|
||||||
|
and Hypothesis, with the goal of making it as easy as possible for end-users to
|
||||||
|
provide more informative annotations for use by runtime libraries.
|
||||||
|
|
||||||
|
It is deliberately minimal, and following PEP-593 allows considerable downstream
|
||||||
|
discretion in what (if anything!) they choose to support. Nonetheless, we expect
|
||||||
|
that staying simple and covering _only_ the most common use-cases will give users
|
||||||
|
and maintainers the best experience we can. If you'd like more constraints for your
|
||||||
|
types - follow our lead, by defining them and documenting them downstream!
|
||||||
|
|
@ -0,0 +1,10 @@
|
||||||
|
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/annotated_types/__init__.cpython-39.pyc,,
|
||||||
|
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/annotated_types/test_cases.cpython-39.pyc,,
|
||||||
|
annotated_types-0.7.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
|
||||||
|
annotated_types-0.7.0.dist-info/METADATA,sha256=7ltqxksJJ0wCYFGBNIQCWTlWQGeAH0hRFdnK3CB895E,15046
|
||||||
|
annotated_types-0.7.0.dist-info/RECORD,,
|
||||||
|
annotated_types-0.7.0.dist-info/WHEEL,sha256=zEMcRr9Kr03x1ozGwg5v9NQBKn3kndp6LSoSlVg-jhU,87
|
||||||
|
annotated_types-0.7.0.dist-info/licenses/LICENSE,sha256=_hBJiEsaDZNCkB6I4H8ykl0ksxIdmXK2poBfuYJLCV0,1083
|
||||||
|
annotated_types/__init__.py,sha256=RynLsRKUEGI0KimXydlD1fZEfEzWwDo0Uon3zOKhG1Q,13819
|
||||||
|
annotated_types/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
|
||||||
|
annotated_types/test_cases.py,sha256=zHFX6EpcMbGJ8FzBYDbO56bPwx_DYIVSKbZM-4B3_lg,6421
|
||||||
|
|
@ -0,0 +1,4 @@
|
||||||
|
Wheel-Version: 1.0
|
||||||
|
Generator: hatchling 1.24.2
|
||||||
|
Root-Is-Purelib: true
|
||||||
|
Tag: py3-none-any
|
||||||
|
|
@ -0,0 +1,21 @@
|
||||||
|
The MIT License (MIT)
|
||||||
|
|
||||||
|
Copyright (c) 2022 the contributors
|
||||||
|
|
||||||
|
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
of this software and associated documentation files (the "Software"), to deal
|
||||||
|
in the Software without restriction, including without limitation the rights
|
||||||
|
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
copies of the Software, and to permit persons to whom the Software is
|
||||||
|
furnished to do so, subject to the following conditions:
|
||||||
|
|
||||||
|
The above copyright notice and this permission notice shall be included in all
|
||||||
|
copies or substantial portions of the Software.
|
||||||
|
|
||||||
|
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||||
|
SOFTWARE.
|
||||||
|
|
@ -0,0 +1,432 @@
|
||||||
|
import math
|
||||||
|
import sys
|
||||||
|
import types
|
||||||
|
from dataclasses import dataclass
|
||||||
|
from datetime import tzinfo
|
||||||
|
from typing import TYPE_CHECKING, Any, Callable, Iterator, Optional, SupportsFloat, SupportsIndex, TypeVar, Union
|
||||||
|
|
||||||
|
if sys.version_info < (3, 8):
|
||||||
|
from typing_extensions import Protocol, runtime_checkable
|
||||||
|
else:
|
||||||
|
from typing import Protocol, runtime_checkable
|
||||||
|
|
||||||
|
if sys.version_info < (3, 9):
|
||||||
|
from typing_extensions import Annotated, Literal
|
||||||
|
else:
|
||||||
|
from typing import Annotated, Literal
|
||||||
|
|
||||||
|
if sys.version_info < (3, 10):
|
||||||
|
EllipsisType = type(Ellipsis)
|
||||||
|
KW_ONLY = {}
|
||||||
|
SLOTS = {}
|
||||||
|
else:
|
||||||
|
from types import EllipsisType
|
||||||
|
|
||||||
|
KW_ONLY = {"kw_only": True}
|
||||||
|
SLOTS = {"slots": True}
|
||||||
|
|
||||||
|
|
||||||
|
__all__ = (
|
||||||
|
'BaseMetadata',
|
||||||
|
'GroupedMetadata',
|
||||||
|
'Gt',
|
||||||
|
'Ge',
|
||||||
|
'Lt',
|
||||||
|
'Le',
|
||||||
|
'Interval',
|
||||||
|
'MultipleOf',
|
||||||
|
'MinLen',
|
||||||
|
'MaxLen',
|
||||||
|
'Len',
|
||||||
|
'Timezone',
|
||||||
|
'Predicate',
|
||||||
|
'LowerCase',
|
||||||
|
'UpperCase',
|
||||||
|
'IsDigits',
|
||||||
|
'IsFinite',
|
||||||
|
'IsNotFinite',
|
||||||
|
'IsNan',
|
||||||
|
'IsNotNan',
|
||||||
|
'IsInfinite',
|
||||||
|
'IsNotInfinite',
|
||||||
|
'doc',
|
||||||
|
'DocInfo',
|
||||||
|
'__version__',
|
||||||
|
)
|
||||||
|
|
||||||
|
__version__ = '0.7.0'
|
||||||
|
|
||||||
|
|
||||||
|
T = TypeVar('T')
|
||||||
|
|
||||||
|
|
||||||
|
# arguments that start with __ are considered
|
||||||
|
# positional only
|
||||||
|
# see https://peps.python.org/pep-0484/#positional-only-arguments
|
||||||
|
|
||||||
|
|
||||||
|
class SupportsGt(Protocol):
|
||||||
|
def __gt__(self: T, __other: T) -> bool:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class SupportsGe(Protocol):
|
||||||
|
def __ge__(self: T, __other: T) -> bool:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class SupportsLt(Protocol):
|
||||||
|
def __lt__(self: T, __other: T) -> bool:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class SupportsLe(Protocol):
|
||||||
|
def __le__(self: T, __other: T) -> bool:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class SupportsMod(Protocol):
|
||||||
|
def __mod__(self: T, __other: T) -> T:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class SupportsDiv(Protocol):
|
||||||
|
def __div__(self: T, __other: T) -> T:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class BaseMetadata:
|
||||||
|
"""Base class for all metadata.
|
||||||
|
|
||||||
|
This exists mainly so that implementers
|
||||||
|
can do `isinstance(..., BaseMetadata)` while traversing field annotations.
|
||||||
|
"""
|
||||||
|
|
||||||
|
__slots__ = ()
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, **SLOTS)
|
||||||
|
class Gt(BaseMetadata):
|
||||||
|
"""Gt(gt=x) implies that the value must be greater than x.
|
||||||
|
|
||||||
|
It can be used with any type that supports the ``>`` operator,
|
||||||
|
including numbers, dates and times, strings, sets, and so on.
|
||||||
|
"""
|
||||||
|
|
||||||
|
gt: SupportsGt
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, **SLOTS)
|
||||||
|
class Ge(BaseMetadata):
|
||||||
|
"""Ge(ge=x) implies that the value must be greater than or equal to x.
|
||||||
|
|
||||||
|
It can be used with any type that supports the ``>=`` operator,
|
||||||
|
including numbers, dates and times, strings, sets, and so on.
|
||||||
|
"""
|
||||||
|
|
||||||
|
ge: SupportsGe
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, **SLOTS)
|
||||||
|
class Lt(BaseMetadata):
|
||||||
|
"""Lt(lt=x) implies that the value must be less than x.
|
||||||
|
|
||||||
|
It can be used with any type that supports the ``<`` operator,
|
||||||
|
including numbers, dates and times, strings, sets, and so on.
|
||||||
|
"""
|
||||||
|
|
||||||
|
lt: SupportsLt
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True, **SLOTS)
|
||||||
|
class Le(BaseMetadata):
|
||||||
|
"""Le(le=x) implies that the value must be less than or equal to x.
|
||||||
|
|
||||||
|
It can be used with any type that supports the ``<=`` operator,
|
||||||
|
including numbers, dates and times, strings, sets, and so on.
|
||||||
|
"""
|
||||||
|
|
||||||
|
le: SupportsLe
|
||||||
|
|
||||||
|
|
||||||
|
@runtime_checkable
|
||||||
|
class GroupedMetadata(Protocol):
|
||||||
|
"""A grouping of multiple objects, like typing.Unpack.
|
||||||
|
|
||||||
|
`GroupedMetadata` on its own is not metadata and has no meaning.
|
||||||
|
All of the constraints and metadata should be fully expressible
|
||||||
|
in terms of the `BaseMetadata`'s returned by `GroupedMetadata.__iter__()`.
|
||||||
|
|
||||||
|
Concrete implementations should override `GroupedMetadata.__iter__()`
|
||||||
|
to add their own metadata.
|
||||||
|
For example:
|
||||||
|
|
||||||
|
>>> @dataclass
|
||||||
|
>>> class Field(GroupedMetadata):
|
||||||
|
>>> gt: float | None = None
|
||||||
|
>>> description: str | None = None
|
||||||
|
...
|
||||||
|
>>> def __iter__(self) -> Iterable[object]:
|
||||||
|
>>> if self.gt is not None:
|
||||||
|
>>> yield Gt(self.gt)
|
||||||
|
>>> if self.description is not None:
|
||||||
|
>>>         yield Description(self.description)
|
||||||
|
|
||||||
|
Also see the implementation of `Interval` below for an example.
|
||||||
|
|
||||||
|
Parsers should recognize this and unpack it so that it can be used
|
||||||
|
both with and without unpacking:
|
||||||
|
|
||||||
|
- `Annotated[int, Field(...)]` (parser must unpack Field)
|
||||||
|
- `Annotated[int, *Field(...)]` (PEP-646)
|
||||||
|
""" # noqa: trailing-whitespace
|
||||||
|
|
||||||
|
@property
|
||||||
|
def __is_annotated_types_grouped_metadata__(self) -> Literal[True]:
|
||||||
|
return True
|
||||||
|
|
||||||
|
def __iter__(self) -> Iterator[object]:
|
||||||
|
...
|
||||||
|
|
||||||
|
if not TYPE_CHECKING:
|
||||||
|
__slots__ = () # allow subclasses to use slots
|
||||||
|
|
||||||
|
def __init_subclass__(cls, *args: Any, **kwargs: Any) -> None:
|
||||||
|
# Basic ABC like functionality without the complexity of an ABC
|
||||||
|
super().__init_subclass__(*args, **kwargs)
|
||||||
|
if cls.__iter__ is GroupedMetadata.__iter__:
|
||||||
|
raise TypeError("Can't subclass GroupedMetadata without implementing __iter__")
|
||||||
|
|
||||||
|
def __iter__(self) -> Iterator[object]: # noqa: F811
|
||||||
|
raise NotImplementedError # more helpful than "None has no attribute..." type errors
|
||||||
|
|
||||||
|
|
||||||
|
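# The GroupedMetadata protocol is easiest to see with a runnable sketch. The
# `Field` container and the tiny `Gt`/`MinLen` stand-ins below are hypothetical,
# defined here only so the snippet is self-contained; the real classes are the
# ones in this module.

```python
from dataclasses import dataclass
from typing import Any, Iterator, Optional


@dataclass(frozen=True)
class Gt:
    gt: Any


@dataclass(frozen=True)
class MinLen:
    min_length: int


# A grouped-metadata container: it is not a constraint itself, but unpacks
# into the single constraints it represents via __iter__.
@dataclass
class Field:
    gt: Optional[Any] = None
    min_length: Optional[int] = None

    def __iter__(self) -> Iterator[object]:
        if self.gt is not None:
            yield Gt(self.gt)
        if self.min_length is not None:
            yield MinLen(self.min_length)


print(list(Field(gt=0, min_length=3)))  # [Gt(gt=0), MinLen(min_length=3)]
print(list(Field()))                    # []
```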
@dataclass(frozen=True, **KW_ONLY, **SLOTS)
class Interval(GroupedMetadata):
    """Interval can express inclusive or exclusive bounds with a single object.

    It accepts keyword arguments ``gt``, ``ge``, ``lt``, and/or ``le``, which
    are interpreted the same way as the single-bound constraints.
    """

    gt: Union[SupportsGt, None] = None
    ge: Union[SupportsGe, None] = None
    lt: Union[SupportsLt, None] = None
    le: Union[SupportsLe, None] = None

    def __iter__(self) -> Iterator[BaseMetadata]:
        """Unpack an Interval into zero or more single-bounds."""
        if self.gt is not None:
            yield Gt(self.gt)
        if self.ge is not None:
            yield Ge(self.ge)
        if self.lt is not None:
            yield Lt(self.lt)
        if self.le is not None:
            yield Le(self.le)
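# A consumer can treat grouped metadata uniformly by unpacking it before
# checking individual bounds. A minimal sketch, using hypothetical local
# stand-ins for Gt/Lt/Interval (not the real classes in this module):

```python
from dataclasses import dataclass
from typing import Any, Iterator, Optional


@dataclass(frozen=True)
class Gt:
    gt: Any


@dataclass(frozen=True)
class Lt:
    lt: Any


@dataclass(frozen=True)
class Interval:
    gt: Optional[Any] = None
    lt: Optional[Any] = None

    def __iter__(self) -> Iterator[object]:
        if self.gt is not None:
            yield Gt(self.gt)
        if self.lt is not None:
            yield Lt(self.lt)


def satisfies(value: Any, metadata: object) -> bool:
    # Grouped metadata is unpacked recursively into single bounds.
    if isinstance(metadata, Interval):
        return all(satisfies(value, m) for m in metadata)
    if isinstance(metadata, Gt):
        return value > metadata.gt
    if isinstance(metadata, Lt):
        return value < metadata.lt
    return True


print(satisfies(5, Interval(gt=0, lt=10)))   # True
print(satisfies(10, Interval(gt=0, lt=10)))  # False
```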
@dataclass(frozen=True, **SLOTS)
class MultipleOf(BaseMetadata):
    """MultipleOf(multiple_of=x) might be interpreted in two ways:

    1. Python semantics, implying ``value % multiple_of == 0``, or
    2. JSONschema semantics, where ``int(value / multiple_of) == value / multiple_of``

    We encourage users to be aware of these two common interpretations,
    and libraries to carefully document which they implement.
    """

    multiple_of: Union[SupportsDiv, SupportsMod]
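# The two interpretations named in the MultipleOf docstring can genuinely
# disagree for floats because of binary rounding; the values below are chosen
# purely for illustration.

```python
value, multiple_of = 1.0, 0.1

python_semantics = value % multiple_of == 0
jsonschema_semantics = int(value / multiple_of) == value / multiple_of

print(python_semantics)      # False: 1.0 % 0.1 leaves a tiny remainder
print(jsonschema_semantics)  # True: 1.0 / 0.1 rounds to exactly 10.0
```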
@dataclass(frozen=True, **SLOTS)
class MinLen(BaseMetadata):
    """
    MinLen() implies minimum inclusive length,
    e.g. ``len(value) >= min_length``.
    """

    min_length: Annotated[int, Ge(0)]


@dataclass(frozen=True, **SLOTS)
class MaxLen(BaseMetadata):
    """
    MaxLen() implies maximum inclusive length,
    e.g. ``len(value) <= max_length``.
    """

    max_length: Annotated[int, Ge(0)]


@dataclass(frozen=True, **SLOTS)
class Len(GroupedMetadata):
    """
    Len() implies that ``min_length <= len(value) <= max_length``.

    Upper bound may be omitted or ``None`` to indicate no upper length bound.
    """

    min_length: Annotated[int, Ge(0)] = 0
    max_length: Optional[Annotated[int, Ge(0)]] = None

    def __iter__(self) -> Iterator[BaseMetadata]:
        """Unpack a Len into zero or more single-bounds."""
        if self.min_length > 0:
            yield MinLen(self.min_length)
        if self.max_length is not None:
            yield MaxLen(self.max_length)
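# Len only yields the bounds that actually constrain anything: a zero minimum
# and a missing maximum both disappear on unpacking. A self-contained sketch
# with hypothetical local stand-ins (not the real classes in this module):

```python
from dataclasses import dataclass
from typing import Iterator, Optional


@dataclass(frozen=True)
class MinLen:
    min_length: int


@dataclass(frozen=True)
class MaxLen:
    max_length: int


@dataclass(frozen=True)
class Len:
    min_length: int = 0
    max_length: Optional[int] = None

    def __iter__(self) -> Iterator[object]:
        if self.min_length > 0:
            yield MinLen(self.min_length)
        if self.max_length is not None:
            yield MaxLen(self.max_length)


print(list(Len(3, 5)))  # [MinLen(min_length=3), MaxLen(max_length=5)]
print(list(Len()))      # []  (no effective constraints)
```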
@dataclass(frozen=True, **SLOTS)
class Timezone(BaseMetadata):
    """Timezone(tz=...) requires a datetime to be aware (or ``tz=None``, naive).

    ``Annotated[datetime, Timezone(None)]`` must be a naive datetime.
    ``Timezone[...]`` (the ellipsis literal) expresses that the datetime must be
    tz-aware but any timezone is allowed.

    You may also pass a specific timezone string or tzinfo object such as
    ``Timezone(timezone.utc)`` or ``Timezone("Africa/Abidjan")`` to express that
    you only allow a specific timezone, though we note that this is often
    a symptom of poor design.
    """

    tz: Union[str, tzinfo, EllipsisType, None]
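# A sketch of how a validator might interpret the three shapes of Timezone's
# tz field (None / ellipsis / specific tzinfo). The `timezone_ok` helper is
# hypothetical; real libraries may compare named zones and offsets more
# carefully than plain tzinfo equality.

```python
from datetime import datetime, timezone


def timezone_ok(value: datetime, tz: object) -> bool:
    if tz is None:
        return value.tzinfo is None      # must be naive
    if tz is ...:
        return value.tzinfo is not None  # must be aware, any zone
    return value.tzinfo == tz            # must match the given tzinfo


print(timezone_ok(datetime(2000, 1, 1), None))                      # True
print(timezone_ok(datetime(2000, 1, 1, tzinfo=timezone.utc), ...))  # True
print(timezone_ok(datetime(2000, 1, 1), timezone.utc))              # False
```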
@dataclass(frozen=True, **SLOTS)
class Unit(BaseMetadata):
    """Indicates that the value is a physical quantity with the specified unit.

    It is intended for usage with numeric types, where the value represents the
    magnitude of the quantity. For example, ``distance: Annotated[float, Unit('m')]``
    or ``speed: Annotated[float, Unit('m/s')]``.

    Interpretation of the unit string is left to the discretion of the consumer.
    It is suggested to follow conventions established by python libraries that work
    with physical quantities, such as

    - ``pint`` : <https://pint.readthedocs.io/en/stable/>
    - ``astropy.units``: <https://docs.astropy.org/en/stable/units/>

    For indicating a quantity with a certain dimensionality but without a specific unit
    it is recommended to use square brackets, e.g. `Annotated[float, Unit('[time]')]`.
    Note, however, ``annotated_types`` itself makes no use of the unit string.
    """

    unit: str
@dataclass(frozen=True, **SLOTS)
class Predicate(BaseMetadata):
    """``Predicate(func: Callable)`` implies `func(value)` is truthy for valid values.

    Users should prefer statically inspectable metadata, but if you need the full
    power and flexibility of arbitrary runtime predicates... here it is.

    We provide a few predefined predicates for common string constraints:
    ``IsLower = Predicate(str.islower)``, ``IsUpper = Predicate(str.isupper)``, and
    ``IsDigits = Predicate(str.isdigit)``. Users are encouraged to use methods which
    can be given special handling, and avoid indirection like ``lambda s: s.lower()``.

    Some libraries might have special logic to handle certain predicates, e.g. by
    checking for `str.isdigit` and using its presence to both call custom logic to
    enforce digit-only strings, and customise some generated external schema.

    We do not specify what behaviour should be expected for predicates that raise
    an exception. For example `Annotated[int, Predicate(str.isdigit)]` might silently
    skip invalid constraints, or statically raise an error; or it might try calling it
    and then propagate or discard the resulting exception.
    """

    func: Callable[[Any], bool]

    def __repr__(self) -> str:
        if getattr(self.func, "__name__", "<lambda>") == "<lambda>":
            return f"{self.__class__.__name__}({self.func!r})"
        if isinstance(self.func, (types.MethodType, types.BuiltinMethodType)) and (
            namespace := getattr(self.func.__self__, "__name__", None)
        ):
            return f"{self.__class__.__name__}({namespace}.{self.func.__name__})"
        if isinstance(self.func, type(str.isascii)):  # method descriptor
            return f"{self.__class__.__name__}({self.func.__qualname__})"
        return f"{self.__class__.__name__}({self.func.__name__})"
@dataclass
class Not:
    func: Callable[[Any], bool]

    def __call__(self, __v: Any) -> bool:
        return not self.func(__v)
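# Predicate wraps any callable, and Not composes with it by negating the
# wrapped callable. A self-contained sketch with hypothetical local stand-ins
# mirroring the two classes above (not the real ones in this module):

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass(frozen=True)
class Predicate:
    func: Callable[[Any], bool]


@dataclass
class Not:
    func: Callable[[Any], bool]

    def __call__(self, v: Any) -> bool:
        return not self.func(v)


is_digit = Predicate(str.isdigit)
print(is_digit.func("123"))                     # True
print(is_digit.func("12a"))                     # False
print(Predicate(Not(str.isdigit)).func("abc"))  # True: composed negation
```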
_StrType = TypeVar("_StrType", bound=str)

LowerCase = Annotated[_StrType, Predicate(str.islower)]
"""
Return True if the string is a lowercase string, False otherwise.

A string is lowercase if all cased characters in the string are lowercase and there is at least one cased character in the string.
"""  # noqa: E501
UpperCase = Annotated[_StrType, Predicate(str.isupper)]
"""
Return True if the string is an uppercase string, False otherwise.

A string is uppercase if all cased characters in the string are uppercase and there is at least one cased character in the string.
"""  # noqa: E501
IsDigit = Annotated[_StrType, Predicate(str.isdigit)]
IsDigits = IsDigit  # type: ignore  # plural for backwards compatibility, see #63
"""
Return True if the string is a digit string, False otherwise.

A string is a digit string if all characters in the string are digits and there is at least one character in the string.
"""  # noqa: E501
IsAscii = Annotated[_StrType, Predicate(str.isascii)]
"""
Return True if all characters in the string are ASCII, False otherwise.

ASCII characters have code points in the range U+0000-U+007F. Empty string is ASCII too.
"""

_NumericType = TypeVar('_NumericType', bound=Union[SupportsFloat, SupportsIndex])
IsFinite = Annotated[_NumericType, Predicate(math.isfinite)]
"""Return True if x is neither an infinity nor a NaN, and False otherwise."""
IsNotFinite = Annotated[_NumericType, Predicate(Not(math.isfinite))]
"""Return True if x is one of infinity or NaN, and False otherwise"""
IsNan = Annotated[_NumericType, Predicate(math.isnan)]
"""Return True if x is a NaN (not a number), and False otherwise."""
IsNotNan = Annotated[_NumericType, Predicate(Not(math.isnan))]
"""Return True if x is anything but NaN (not a number), and False otherwise."""
IsInfinite = Annotated[_NumericType, Predicate(math.isinf)]
"""Return True if x is a positive or negative infinity, and False otherwise."""
IsNotInfinite = Annotated[_NumericType, Predicate(Not(math.isinf))]
"""Return True if x is neither a positive or negative infinity, and False otherwise."""
try:
    from typing_extensions import DocInfo, doc  # type: ignore [attr-defined]
except ImportError:

    @dataclass(frozen=True, **SLOTS)
    class DocInfo:  # type: ignore [no-redef]
        """
        The return value of doc(), mainly to be used by tools that want to extract the
        Annotated documentation at runtime.
        """

        documentation: str
        """The documentation string passed to doc()."""

    def doc(
        documentation: str,
    ) -> DocInfo:
        """
        Add documentation to a type annotation inside of Annotated.

        For example:

        >>> def hi(name: Annotated[int, doc("The name of the user")]) -> None: ...
        """
        return DocInfo(documentation)
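# Tools can recover doc() metadata from an annotation at runtime via
# typing.get_type_hints(include_extras=True). A self-contained sketch using
# minimal DocInfo/doc stand-ins (the real ones come from typing_extensions
# when available, with the fallback defined above); requires Python 3.9+ for
# typing.Annotated.

```python
from dataclasses import dataclass
from typing import Annotated, get_type_hints


@dataclass(frozen=True)
class DocInfo:
    documentation: str


def doc(documentation: str) -> DocInfo:
    return DocInfo(documentation)


def hi(name: Annotated[str, doc("The name of the user")]) -> None: ...


# include_extras=True keeps the Annotated wrapper and its metadata tuple.
hints = get_type_hints(hi, include_extras=True)
print(hints["name"].__metadata__[0].documentation)  # The name of the user
```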
@ -0,0 +1,151 @@
import math
import sys
from datetime import date, datetime, timedelta, timezone
from decimal import Decimal
from typing import Any, Dict, Iterable, Iterator, List, NamedTuple, Set, Tuple

if sys.version_info < (3, 9):
    from typing_extensions import Annotated
else:
    from typing import Annotated

import annotated_types as at


class Case(NamedTuple):
    """
    A test case for `annotated_types`.
    """

    annotation: Any
    valid_cases: Iterable[Any]
    invalid_cases: Iterable[Any]
def cases() -> Iterable[Case]:
    # Gt, Ge, Lt, Le
    yield Case(Annotated[int, at.Gt(4)], (5, 6, 1000), (4, 0, -1))
    yield Case(Annotated[float, at.Gt(0.5)], (0.6, 0.7, 0.8, 0.9), (0.5, 0.0, -0.1))
    yield Case(
        Annotated[datetime, at.Gt(datetime(2000, 1, 1))],
        [datetime(2000, 1, 2), datetime(2000, 1, 3)],
        [datetime(2000, 1, 1), datetime(1999, 12, 31)],
    )
    yield Case(
        Annotated[datetime, at.Gt(date(2000, 1, 1))],
        [date(2000, 1, 2), date(2000, 1, 3)],
        [date(2000, 1, 1), date(1999, 12, 31)],
    )
    yield Case(
        Annotated[datetime, at.Gt(Decimal('1.123'))],
        [Decimal('1.1231'), Decimal('123')],
        [Decimal('1.123'), Decimal('0')],
    )

    yield Case(Annotated[int, at.Ge(4)], (4, 5, 6, 1000, 4), (0, -1))
    yield Case(Annotated[float, at.Ge(0.5)], (0.5, 0.6, 0.7, 0.8, 0.9), (0.4, 0.0, -0.1))
    yield Case(
        Annotated[datetime, at.Ge(datetime(2000, 1, 1))],
        [datetime(2000, 1, 2), datetime(2000, 1, 3)],
        [datetime(1998, 1, 1), datetime(1999, 12, 31)],
    )

    yield Case(Annotated[int, at.Lt(4)], (0, -1), (4, 5, 6, 1000, 4))
    yield Case(Annotated[float, at.Lt(0.5)], (0.4, 0.0, -0.1), (0.5, 0.6, 0.7, 0.8, 0.9))
    yield Case(
        Annotated[datetime, at.Lt(datetime(2000, 1, 1))],
        [datetime(1999, 12, 31), datetime(1999, 12, 31)],
        [datetime(2000, 1, 2), datetime(2000, 1, 3)],
    )

    yield Case(Annotated[int, at.Le(4)], (4, 0, -1), (5, 6, 1000))
    yield Case(Annotated[float, at.Le(0.5)], (0.5, 0.0, -0.1), (0.6, 0.7, 0.8, 0.9))
    yield Case(
        Annotated[datetime, at.Le(datetime(2000, 1, 1))],
        [datetime(2000, 1, 1), datetime(1999, 12, 31)],
        [datetime(2000, 1, 2), datetime(2000, 1, 3)],
    )

    # Interval
    yield Case(Annotated[int, at.Interval(gt=4)], (5, 6, 1000), (4, 0, -1))
    yield Case(Annotated[int, at.Interval(gt=4, lt=10)], (5, 6), (4, 10, 1000, 0, -1))
    yield Case(Annotated[float, at.Interval(ge=0.5, le=1)], (0.5, 0.9, 1), (0.49, 1.1))
    yield Case(
        Annotated[datetime, at.Interval(gt=datetime(2000, 1, 1), le=datetime(2000, 1, 3))],
        [datetime(2000, 1, 2), datetime(2000, 1, 3)],
        [datetime(2000, 1, 1), datetime(2000, 1, 4)],
    )

    yield Case(Annotated[int, at.MultipleOf(multiple_of=3)], (0, 3, 9), (1, 2, 4))
    yield Case(Annotated[float, at.MultipleOf(multiple_of=0.5)], (0, 0.5, 1, 1.5), (0.4, 1.1))

    # lengths

    yield Case(Annotated[str, at.MinLen(3)], ('123', '1234', 'x' * 10), ('', '1', '12'))
    yield Case(Annotated[str, at.Len(3)], ('123', '1234', 'x' * 10), ('', '1', '12'))
    yield Case(Annotated[List[int], at.MinLen(3)], ([1, 2, 3], [1, 2, 3, 4], [1] * 10), ([], [1], [1, 2]))
    yield Case(Annotated[List[int], at.Len(3)], ([1, 2, 3], [1, 2, 3, 4], [1] * 10), ([], [1], [1, 2]))

    yield Case(Annotated[str, at.MaxLen(4)], ('', '1234'), ('12345', 'x' * 10))
    yield Case(Annotated[str, at.Len(0, 4)], ('', '1234'), ('12345', 'x' * 10))
    yield Case(Annotated[List[str], at.MaxLen(4)], ([], ['a', 'bcdef'], ['a', 'b', 'c']), (['a'] * 5, ['b'] * 10))
    yield Case(Annotated[List[str], at.Len(0, 4)], ([], ['a', 'bcdef'], ['a', 'b', 'c']), (['a'] * 5, ['b'] * 10))

    yield Case(Annotated[str, at.Len(3, 5)], ('123', '12345'), ('', '1', '12', '123456', 'x' * 10))
    yield Case(Annotated[str, at.Len(3, 3)], ('123',), ('12', '1234'))

    yield Case(Annotated[Dict[int, int], at.Len(2, 3)], [{1: 1, 2: 2}], [{}, {1: 1}, {1: 1, 2: 2, 3: 3, 4: 4}])
    yield Case(Annotated[Set[int], at.Len(2, 3)], ({1, 2}, {1, 2, 3}), (set(), {1}, {1, 2, 3, 4}))
    yield Case(Annotated[Tuple[int, ...], at.Len(2, 3)], ((1, 2), (1, 2, 3)), ((), (1,), (1, 2, 3, 4)))

    # Timezone

    yield Case(
        Annotated[datetime, at.Timezone(None)], [datetime(2000, 1, 1)], [datetime(2000, 1, 1, tzinfo=timezone.utc)]
    )
    yield Case(
        Annotated[datetime, at.Timezone(...)], [datetime(2000, 1, 1, tzinfo=timezone.utc)], [datetime(2000, 1, 1)]
    )
    yield Case(
        Annotated[datetime, at.Timezone(timezone.utc)],
        [datetime(2000, 1, 1, tzinfo=timezone.utc)],
        [datetime(2000, 1, 1), datetime(2000, 1, 1, tzinfo=timezone(timedelta(hours=6)))],
    )
    yield Case(
        Annotated[datetime, at.Timezone('Europe/London')],
        [datetime(2000, 1, 1, tzinfo=timezone(timedelta(0), name='Europe/London'))],
        [datetime(2000, 1, 1), datetime(2000, 1, 1, tzinfo=timezone(timedelta(hours=6)))],
    )

    # Quantity

    yield Case(Annotated[float, at.Unit(unit='m')], (5, 4.2), ('5m', '4.2m'))

    # predicate types

    yield Case(at.LowerCase[str], ['abc', 'foobar'], ['', 'A', 'Boom'])
    yield Case(at.UpperCase[str], ['ABC', 'DEFO'], ['', 'a', 'abc', 'AbC'])
    yield Case(at.IsDigit[str], ['123'], ['', 'ab', 'a1b2'])
    yield Case(at.IsAscii[str], ['123', 'foo bar'], ['£100', '😊', 'whatever 👀'])

    yield Case(Annotated[int, at.Predicate(lambda x: x % 2 == 0)], [0, 2, 4], [1, 3, 5])

    yield Case(at.IsFinite[float], [1.23], [math.nan, math.inf, -math.inf])
    yield Case(at.IsNotFinite[float], [math.nan, math.inf], [1.23])
    yield Case(at.IsNan[float], [math.nan], [1.23, math.inf])
    yield Case(at.IsNotNan[float], [1.23, math.inf], [math.nan])
    yield Case(at.IsInfinite[float], [math.inf], [math.nan, 1.23])
    yield Case(at.IsNotInfinite[float], [math.nan, 1.23], [math.inf])

    # check stacked predicates
    yield Case(at.IsInfinite[Annotated[float, at.Predicate(lambda x: x > 0)]], [math.inf], [-math.inf, 1.23, math.nan])

    # doc
    yield Case(Annotated[int, at.doc("A number")], [1, 2], [])

    # custom GroupedMetadata
    class MyCustomGroupedMetadata(at.GroupedMetadata):
        def __iter__(self) -> Iterator[at.Predicate]:
            yield at.Predicate(lambda x: float(x).is_integer())

    yield Case(Annotated[float, MyCustomGroupedMetadata()], [0, 2.0], [0.01, 1.5])
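# A downstream library's test suite would typically drive Case tuples like
# the ones yielded by cases() through its own validator. A minimal sketch of
# that loop; the `Gt` stand-in and the toy `is_valid` checker are hypothetical
# and cover only one constraint kind, purely for illustration.

```python
from dataclasses import dataclass
from typing import Any, Iterable, NamedTuple


@dataclass(frozen=True)
class Gt:
    gt: Any


class Case(NamedTuple):
    annotation: Any  # here simply a list of metadata, to keep the sketch small
    valid_cases: Iterable[Any]
    invalid_cases: Iterable[Any]


def is_valid(value: Any, metadata: Iterable[Any]) -> bool:
    # Toy validator: only enforces Gt; a real one would cover all BaseMetadata.
    return all(value > m.gt for m in metadata if isinstance(m, Gt))


def run(case: Case) -> None:
    for v in case.valid_cases:
        assert is_valid(v, case.annotation), v
    for v in case.invalid_cases:
        assert not is_valid(v, case.annotation), v


run(Case([Gt(4)], valid_cases=(5, 6, 1000), invalid_cases=(4, 0, -1)))
print("ok")
```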
@ -0,0 +1 @@
pip
@ -0,0 +1,20 @@
The MIT License (MIT)

Copyright (c) 2018 Alex Grönholm

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@ -0,0 +1,105 @@
Metadata-Version: 2.1
Name: anyio
Version: 3.7.1
Summary: High level compatibility layer for multiple asynchronous event loop implementations
Author-email: Alex Grönholm <alex.gronholm@nextday.fi>
License: MIT
Project-URL: Documentation, https://anyio.readthedocs.io/en/latest/
Project-URL: Changelog, https://anyio.readthedocs.io/en/stable/versionhistory.html
Project-URL: Source code, https://github.com/agronholm/anyio
Project-URL: Issue tracker, https://github.com/agronholm/anyio/issues
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Framework :: AnyIO
Classifier: Typing :: Typed
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Requires-Python: >=3.7
Description-Content-Type: text/x-rst
License-File: LICENSE
Requires-Dist: idna (>=2.8)
Requires-Dist: sniffio (>=1.1)
Requires-Dist: exceptiongroup ; python_version < "3.11"
Requires-Dist: typing-extensions ; python_version < "3.8"
Provides-Extra: doc
Requires-Dist: packaging ; extra == 'doc'
Requires-Dist: Sphinx ; extra == 'doc'
Requires-Dist: sphinx-rtd-theme (>=1.2.2) ; extra == 'doc'
Requires-Dist: sphinxcontrib-jquery ; extra == 'doc'
Requires-Dist: sphinx-autodoc-typehints (>=1.2.0) ; extra == 'doc'
Provides-Extra: test
Requires-Dist: anyio[trio] ; extra == 'test'
Requires-Dist: coverage[toml] (>=4.5) ; extra == 'test'
Requires-Dist: hypothesis (>=4.0) ; extra == 'test'
Requires-Dist: psutil (>=5.9) ; extra == 'test'
Requires-Dist: pytest (>=7.0) ; extra == 'test'
Requires-Dist: pytest-mock (>=3.6.1) ; extra == 'test'
Requires-Dist: trustme ; extra == 'test'
Requires-Dist: uvloop (>=0.17) ; (python_version < "3.12" and platform_python_implementation == "CPython" and platform_system != "Windows") and extra == 'test'
Requires-Dist: mock (>=4) ; (python_version < "3.8") and extra == 'test'
Provides-Extra: trio
Requires-Dist: trio (<0.22) ; extra == 'trio'

.. image:: https://github.com/agronholm/anyio/actions/workflows/test.yml/badge.svg
   :target: https://github.com/agronholm/anyio/actions/workflows/test.yml
   :alt: Build Status
.. image:: https://coveralls.io/repos/github/agronholm/anyio/badge.svg?branch=master
   :target: https://coveralls.io/github/agronholm/anyio?branch=master
   :alt: Code Coverage
.. image:: https://readthedocs.org/projects/anyio/badge/?version=latest
   :target: https://anyio.readthedocs.io/en/latest/?badge=latest
   :alt: Documentation
.. image:: https://badges.gitter.im/gitterHQ/gitter.svg
   :target: https://gitter.im/python-trio/AnyIO
   :alt: Gitter chat

AnyIO is an asynchronous networking and concurrency library that works on top of either asyncio_ or
trio_. It implements trio-like `structured concurrency`_ (SC) on top of asyncio and works in harmony
with the native SC of trio itself.

Applications and libraries written against AnyIO's API will run unmodified on either asyncio_ or
trio_. AnyIO can also be adopted into a library or application incrementally – bit by bit, no full
refactoring necessary. It will blend in with the native libraries of your chosen backend.

Documentation
-------------

View full documentation at: https://anyio.readthedocs.io/

Features
--------

AnyIO offers the following functionality:

* Task groups (nurseries_ in trio terminology)
* High-level networking (TCP, UDP and UNIX sockets)

  * `Happy eyeballs`_ algorithm for TCP connections (more robust than that of asyncio on Python
    3.8)
  * async/await style UDP sockets (unlike asyncio where you still have to use Transports and
    Protocols)

* A versatile API for byte streams and object streams
* Inter-task synchronization and communication (locks, conditions, events, semaphores, object
  streams)
* Worker threads
* Subprocesses
* Asynchronous file I/O (using worker threads)
* Signal handling

AnyIO also comes with its own pytest_ plugin which also supports asynchronous fixtures.
It even works with the popular Hypothesis_ library.

.. _asyncio: https://docs.python.org/3/library/asyncio.html
.. _trio: https://github.com/python-trio/trio
.. _structured concurrency: https://en.wikipedia.org/wiki/Structured_concurrency
.. _nurseries: https://trio.readthedocs.io/en/stable/reference-core.html#nurseries-and-spawning
.. _Happy eyeballs: https://en.wikipedia.org/wiki/Happy_Eyeballs
.. _pytest: https://docs.pytest.org/en/latest/
.. _Hypothesis: https://hypothesis.works/
@ -0,0 +1,82 @@
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/__init__.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_backends/__init__.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_backends/_asyncio.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_backends/_trio.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/__init__.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_compat.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_eventloop.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_exceptions.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_fileio.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_resources.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_signals.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_sockets.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_streams.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_subprocesses.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_synchronization.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_tasks.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_testing.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/_core/_typedattr.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/abc/__init__.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/abc/_resources.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/abc/_sockets.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/abc/_streams.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/abc/_subprocesses.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/abc/_tasks.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/abc/_testing.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/from_thread.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/lowlevel.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/pytest_plugin.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/streams/__init__.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/streams/buffered.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/streams/file.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/streams/memory.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/streams/stapled.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/streams/text.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/streams/tls.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/to_process.cpython-39.pyc,,
../../../../../../../Library/Caches/com.apple.python/Users/natalie/Documents/知识采集分析Agent/django-backend/venv/lib/python3.9/site-packages/anyio/to_thread.cpython-39.pyc,,
anyio-3.7.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
anyio-3.7.1.dist-info/LICENSE,sha256=U2GsncWPLvX9LpsJxoKXwX8ElQkJu8gCO9uC6s8iwrA,1081
anyio-3.7.1.dist-info/METADATA,sha256=mOhfXPB7qKVQh3dUtp2NgLysa10jHWeDBNnRg-93A_c,4708
anyio-3.7.1.dist-info/RECORD,,
anyio-3.7.1.dist-info/WHEEL,sha256=pkctZYzUS4AYVn6dJ-7367OJZivF2e8RA9b_ZBjif18,92
anyio-3.7.1.dist-info/entry_points.txt,sha256=_d6Yu6uiaZmNe0CydowirE9Cmg7zUL2g08tQpoS3Qvc,39
anyio-3.7.1.dist-info/top_level.txt,sha256=QglSMiWX8_5dpoVAEIHdEYzvqFMdSYWmCj6tYw2ITkQ,6
anyio/__init__.py,sha256=Pq9lO03Zm5ynIPlhkquaOuIc1dTTeLGNUQ5HT5qwYMI,4073
|
||||||
|
anyio/_backends/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
|
||||||
|
anyio/_backends/_asyncio.py,sha256=fgwZmYnGOxT_pX0OZTPPgRdFqKLjnKvQUk7tsfuNmfM,67056
|
||||||
|
anyio/_backends/_trio.py,sha256=EJAj0tNi0JRM2y3QWP7oS4ct7wnjMSYDG8IZUWMta-E,30035
|
||||||
|
anyio/_core/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
|
||||||
|
anyio/_core/_compat.py,sha256=XZfBUInEt7jaiTBI2Qbul7EpJdngbwTtG4Qj26un1YE,5726
|
||||||
|
anyio/_core/_eventloop.py,sha256=xJ8KflV1bJ9GAuQRr4o1ojv8wWya4nt_XARta8uLPwc,4083
|
||||||
|
anyio/_core/_exceptions.py,sha256=uOrN5l98o6UrOU6O3kPf0VCDl_zPP-kgZs4IyaLVgwU,2916
|
||||||
|
anyio/_core/_fileio.py,sha256=DWuIul5izCocmJpgqDDNKc_GhMUwayHKdM5R-sbT_A8,18026
|
||||||
|
anyio/_core/_resources.py,sha256=NbmU5O5UX3xEyACnkmYX28Fmwdl-f-ny0tHym26e0w0,435
|
||||||
|
anyio/_core/_signals.py,sha256=KKkZAYL08auydjZnK9S4FQsxx555jT4gXAMcTXdNaok,863
|
||||||
|
anyio/_core/_sockets.py,sha256=szcPd7kKBmlHnx8g_KJWZo2k6syouRNF2614ZrtqiV0,20667
|
||||||
|
anyio/_core/_streams.py,sha256=5gryxQiUisED8uFUAHje5O44RL9wyndNMANzzQWUn1U,1518
|
||||||
|
anyio/_core/_subprocesses.py,sha256=OSAcLAsjfCplXlRyTjWonfS1xU8d5MaZblXYqqY-BM4,4977
|
||||||
|
anyio/_core/_synchronization.py,sha256=Uquo_52vZ7iZzDDoaN_j-N7jeyAlefzOZ8Pxt9mU6gY,16747
|
||||||
|
anyio/_core/_tasks.py,sha256=1wZZWlpDkr6w3kMD629vzJDkPselDvx4XVElgTCVwyM,5316
|
||||||
|
anyio/_core/_testing.py,sha256=7Yll-DOI0uIlIF5VHLUpGGyDPWtDEjFZ85-6ZniwIJU,2217
|
||||||
|
anyio/_core/_typedattr.py,sha256=8o0gwQYSl04zlO9uHqcHu1T6hOw7peY9NW1mOX5DKnY,2551
|
||||||
|
anyio/abc/__init__.py,sha256=UkC-KDbyIoKeDUDhJciwANSoyzz_qaFh4Fb7_AvwjZc,2159
|
||||||
|
anyio/abc/_resources.py,sha256=h1rkzr3E0MFqdXLh9aLLXe-A5W7k_Jc-5XzNr6SJ4w4,763
|
||||||
|
anyio/abc/_sockets.py,sha256=WWYJ6HndKCEuvobAPDkmX0tjwN2FOxf3eTGb1DB7wHE,5243
|
||||||
|
anyio/abc/_streams.py,sha256=yGhOmlVI3W9whmzPuewwYQ2BrKhrUFuWZ4zpVLWOK84,6584
|
||||||
|
anyio/abc/_subprocesses.py,sha256=r-totaRbFX6kKV-4WTeuswz8n01aap8cvkYVQCRKN0M,2067
|
||||||
|
anyio/abc/_tasks.py,sha256=a_5DLyiCbp0K57LJPOyF-PZyXmUcv_p9VRXPFj_K03M,3413
|
||||||
|
anyio/abc/_testing.py,sha256=Eub7gXJ0tVPo_WN5iJAw10FrvC7C1uaL3b2neGr_pfs,1924
|
||||||
|
anyio/from_thread.py,sha256=aUVKXctPgZ5wK3p5VTyrtjDj9tSQSrH6xCjBuo-hv3A,16563
|
||||||
|
anyio/lowlevel.py,sha256=cOTncxRW5KeswqYQQdp0pfAw6OFWXius1SPhCYwHZL4,4647
|
||||||
|
anyio/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
|
||||||
|
anyio/pytest_plugin.py,sha256=_Txgl0-I3kO1rk_KATXmIUV57C34hajcJCGcgV26CU0,5022
|
||||||
|
anyio/streams/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
|
||||||
|
anyio/streams/buffered.py,sha256=2ifplNLwT73d1UKBxrkFdlC9wTAze9LhPL7pt_7cYgY,4473
|
||||||
|
anyio/streams/file.py,sha256=-NP6jMcUd2f1VJwgcxgiRHdEsNnhE0lANl0ov_i7FrE,4356
|
||||||
|
anyio/streams/memory.py,sha256=QZhc5qdomBpGCgrUVWAaqEBxI0oklVxK_62atW6tnNk,9274
|
||||||
|
anyio/streams/stapled.py,sha256=9u2GxpiOPsGtgO1qsj2tVoW4b8bgiwp5rSDs1BFKkLM,4275
|
||||||
|
anyio/streams/text.py,sha256=1K4ZCLKl2b7yywrW6wKEeMu3xyQHE_T0aU5_oC9GPTE,5043
|
||||||
|
anyio/streams/tls.py,sha256=TbdCz1KtfEnp3mxHvkROXRefhE6S1LHiwgWiJX8zYaU,12099
|
||||||
|
anyio/to_process.py,sha256=_RSsG8UME2nGxeFEdg3OEfv9XshSQwrMU7DAbwWGx9U,9242
|
||||||
|
anyio/to_thread.py,sha256=HVpTvBei2sSXgJJeNKdwhJwQaW76LDbb1htQ-Mc6zDs,2146
|
||||||
|
|
@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.40.0)
Root-Is-Purelib: true
Tag: py3-none-any
@ -0,0 +1,2 @@
[pytest11]
anyio = anyio.pytest_plugin
@ -0,0 +1 @@
anyio
@ -0,0 +1,169 @@
from __future__ import annotations

__all__ = (
    "maybe_async",
    "maybe_async_cm",
    "run",
    "sleep",
    "sleep_forever",
    "sleep_until",
    "current_time",
    "get_all_backends",
    "get_cancelled_exc_class",
    "BrokenResourceError",
    "BrokenWorkerProcess",
    "BusyResourceError",
    "ClosedResourceError",
    "DelimiterNotFound",
    "EndOfStream",
    "ExceptionGroup",
    "IncompleteRead",
    "TypedAttributeLookupError",
    "WouldBlock",
    "AsyncFile",
    "Path",
    "open_file",
    "wrap_file",
    "aclose_forcefully",
    "open_signal_receiver",
    "connect_tcp",
    "connect_unix",
    "create_tcp_listener",
    "create_unix_listener",
    "create_udp_socket",
    "create_connected_udp_socket",
    "getaddrinfo",
    "getnameinfo",
    "wait_socket_readable",
    "wait_socket_writable",
    "create_memory_object_stream",
    "run_process",
    "open_process",
    "create_lock",
    "CapacityLimiter",
    "CapacityLimiterStatistics",
    "Condition",
    "ConditionStatistics",
    "Event",
    "EventStatistics",
    "Lock",
    "LockStatistics",
    "Semaphore",
    "SemaphoreStatistics",
    "create_condition",
    "create_event",
    "create_semaphore",
    "create_capacity_limiter",
    "open_cancel_scope",
    "fail_after",
    "move_on_after",
    "current_effective_deadline",
    "TASK_STATUS_IGNORED",
    "CancelScope",
    "create_task_group",
    "TaskInfo",
    "get_current_task",
    "get_running_tasks",
    "wait_all_tasks_blocked",
    "run_sync_in_worker_thread",
    "run_async_from_thread",
    "run_sync_from_thread",
    "current_default_worker_thread_limiter",
    "create_blocking_portal",
    "start_blocking_portal",
    "typed_attribute",
    "TypedAttributeSet",
    "TypedAttributeProvider",
)

from typing import Any

from ._core._compat import maybe_async, maybe_async_cm
from ._core._eventloop import (
    current_time,
    get_all_backends,
    get_cancelled_exc_class,
    run,
    sleep,
    sleep_forever,
    sleep_until,
)
from ._core._exceptions import (
    BrokenResourceError,
    BrokenWorkerProcess,
    BusyResourceError,
    ClosedResourceError,
    DelimiterNotFound,
    EndOfStream,
    ExceptionGroup,
    IncompleteRead,
    TypedAttributeLookupError,
    WouldBlock,
)
from ._core._fileio import AsyncFile, Path, open_file, wrap_file
from ._core._resources import aclose_forcefully
from ._core._signals import open_signal_receiver
from ._core._sockets import (
    connect_tcp,
    connect_unix,
    create_connected_udp_socket,
    create_tcp_listener,
    create_udp_socket,
    create_unix_listener,
    getaddrinfo,
    getnameinfo,
    wait_socket_readable,
    wait_socket_writable,
)
from ._core._streams import create_memory_object_stream
from ._core._subprocesses import open_process, run_process
from ._core._synchronization import (
    CapacityLimiter,
    CapacityLimiterStatistics,
    Condition,
    ConditionStatistics,
    Event,
    EventStatistics,
    Lock,
    LockStatistics,
    Semaphore,
    SemaphoreStatistics,
    create_capacity_limiter,
    create_condition,
    create_event,
    create_lock,
    create_semaphore,
)
from ._core._tasks import (
    TASK_STATUS_IGNORED,
    CancelScope,
    create_task_group,
    current_effective_deadline,
    fail_after,
    move_on_after,
    open_cancel_scope,
)
from ._core._testing import (
    TaskInfo,
    get_current_task,
    get_running_tasks,
    wait_all_tasks_blocked,
)
from ._core._typedattr import TypedAttributeProvider, TypedAttributeSet, typed_attribute

# Re-exported here, for backwards compatibility
# isort: off
from .to_thread import current_default_worker_thread_limiter, run_sync_in_worker_thread
from .from_thread import (
    create_blocking_portal,
    run_async_from_thread,
    run_sync_from_thread,
    start_blocking_portal,
)

# Re-export imports so they look like they live directly in this package
key: str
value: Any
for key, value in list(locals().items()):
    if getattr(value, "__module__", "").startswith("anyio."):
        value.__module__ = __name__
File diff suppressed because it is too large
@ -0,0 +1,996 @@
from __future__ import annotations

import array
import math
import socket
from concurrent.futures import Future
from contextvars import copy_context
from dataclasses import dataclass
from functools import partial
from io import IOBase
from os import PathLike
from signal import Signals
from types import TracebackType
from typing import (
    IO,
    TYPE_CHECKING,
    Any,
    AsyncGenerator,
    AsyncIterator,
    Awaitable,
    Callable,
    Collection,
    Coroutine,
    Generic,
    Iterable,
    Mapping,
    NoReturn,
    Sequence,
    TypeVar,
    cast,
)

import sniffio
import trio.from_thread
from outcome import Error, Outcome, Value
from trio.socket import SocketType as TrioSocketType
from trio.to_thread import run_sync

from .. import CapacityLimiterStatistics, EventStatistics, TaskInfo, abc
from .._core._compat import DeprecatedAsyncContextManager, DeprecatedAwaitable
from .._core._eventloop import claim_worker_thread
from .._core._exceptions import (
    BrokenResourceError,
    BusyResourceError,
    ClosedResourceError,
    EndOfStream,
)
from .._core._exceptions import ExceptionGroup as BaseExceptionGroup
from .._core._sockets import convert_ipv6_sockaddr
from .._core._synchronization import CapacityLimiter as BaseCapacityLimiter
from .._core._synchronization import Event as BaseEvent
from .._core._synchronization import ResourceGuard
from .._core._tasks import CancelScope as BaseCancelScope
from ..abc import IPSockAddrType, UDPPacketType

if TYPE_CHECKING:
    from trio_typing import TaskStatus

try:
    from trio import lowlevel as trio_lowlevel
except ImportError:
    from trio import hazmat as trio_lowlevel  # type: ignore[no-redef]
    from trio.hazmat import wait_readable, wait_writable
else:
    from trio.lowlevel import wait_readable, wait_writable

try:
    trio_open_process = trio_lowlevel.open_process
except AttributeError:
    # isort: off
    from trio import (  # type: ignore[attr-defined, no-redef]
        open_process as trio_open_process,
    )

T_Retval = TypeVar("T_Retval")
T_SockAddr = TypeVar("T_SockAddr", str, IPSockAddrType)


#
# Event loop
#

run = trio.run
current_token = trio.lowlevel.current_trio_token
RunVar = trio.lowlevel.RunVar


#
# Miscellaneous
#

sleep = trio.sleep


#
# Timeouts and cancellation
#


class CancelScope(BaseCancelScope):
    def __new__(
        cls, original: trio.CancelScope | None = None, **kwargs: object
    ) -> CancelScope:
        return object.__new__(cls)

    def __init__(self, original: trio.CancelScope | None = None, **kwargs: Any) -> None:
        self.__original = original or trio.CancelScope(**kwargs)

    def __enter__(self) -> CancelScope:
        self.__original.__enter__()
        return self

    def __exit__(
        self,
        exc_type: type[BaseException] | None,
        exc_val: BaseException | None,
        exc_tb: TracebackType | None,
    ) -> bool | None:
        # https://github.com/python-trio/trio-typing/pull/79
        return self.__original.__exit__(  # type: ignore[func-returns-value]
            exc_type, exc_val, exc_tb
        )

    def cancel(self) -> DeprecatedAwaitable:
        self.__original.cancel()
        return DeprecatedAwaitable(self.cancel)

    @property
    def deadline(self) -> float:
        return self.__original.deadline

    @deadline.setter
    def deadline(self, value: float) -> None:
        self.__original.deadline = value

    @property
    def cancel_called(self) -> bool:
        return self.__original.cancel_called

    @property
    def shield(self) -> bool:
        return self.__original.shield

    @shield.setter
    def shield(self, value: bool) -> None:
        self.__original.shield = value


CancelledError = trio.Cancelled
checkpoint = trio.lowlevel.checkpoint
checkpoint_if_cancelled = trio.lowlevel.checkpoint_if_cancelled
cancel_shielded_checkpoint = trio.lowlevel.cancel_shielded_checkpoint
current_effective_deadline = trio.current_effective_deadline
current_time = trio.current_time


#
# Task groups
#


class ExceptionGroup(BaseExceptionGroup, trio.MultiError):
    pass


class TaskGroup(abc.TaskGroup):
    def __init__(self) -> None:
        self._active = False
        self._nursery_manager = trio.open_nursery()
        self.cancel_scope = None  # type: ignore[assignment]

    async def __aenter__(self) -> TaskGroup:
        self._active = True
        self._nursery = await self._nursery_manager.__aenter__()
        self.cancel_scope = CancelScope(self._nursery.cancel_scope)
        return self

    async def __aexit__(
        self,
        exc_type: type[BaseException] | None,
        exc_val: BaseException | None,
        exc_tb: TracebackType | None,
    ) -> bool | None:
        try:
            return await self._nursery_manager.__aexit__(exc_type, exc_val, exc_tb)
        except trio.MultiError as exc:
            raise ExceptionGroup(exc.exceptions) from None
        finally:
            self._active = False

    def start_soon(
        self, func: Callable[..., Awaitable[Any]], *args: object, name: object = None
    ) -> None:
        if not self._active:
            raise RuntimeError(
                "This task group is not active; no new tasks can be started."
            )

        self._nursery.start_soon(func, *args, name=name)

    async def start(
        self, func: Callable[..., Awaitable[Any]], *args: object, name: object = None
    ) -> object:
        if not self._active:
            raise RuntimeError(
                "This task group is not active; no new tasks can be started."
            )

        return await self._nursery.start(func, *args, name=name)


#
# Threads
#


async def run_sync_in_worker_thread(
    func: Callable[..., T_Retval],
    *args: object,
    cancellable: bool = False,
    limiter: trio.CapacityLimiter | None = None,
) -> T_Retval:
    def wrapper() -> T_Retval:
        with claim_worker_thread("trio"):
            return func(*args)

    # TODO: remove explicit context copying when trio 0.20 is the minimum requirement
    context = copy_context()
    context.run(sniffio.current_async_library_cvar.set, None)
    return await run_sync(
        context.run, wrapper, cancellable=cancellable, limiter=limiter
    )


# TODO: remove this workaround when trio 0.20 is the minimum requirement
def run_async_from_thread(
    fn: Callable[..., Awaitable[T_Retval]], *args: Any
) -> T_Retval:
    async def wrapper() -> T_Retval:
        retval: T_Retval

        async def inner() -> None:
            nonlocal retval
            __tracebackhide__ = True
            retval = await fn(*args)

        async with trio.open_nursery() as n:
            context.run(n.start_soon, inner)

        __tracebackhide__ = True
        return retval  # noqa: F821

    context = copy_context()
    context.run(sniffio.current_async_library_cvar.set, "trio")
    return trio.from_thread.run(wrapper)


def run_sync_from_thread(fn: Callable[..., T_Retval], *args: Any) -> T_Retval:
    # TODO: remove explicit context copying when trio 0.20 is the minimum requirement
    retval = trio.from_thread.run_sync(copy_context().run, fn, *args)
    return cast(T_Retval, retval)


class BlockingPortal(abc.BlockingPortal):
    def __new__(cls) -> BlockingPortal:
        return object.__new__(cls)

    def __init__(self) -> None:
        super().__init__()
        self._token = trio.lowlevel.current_trio_token()

    def _spawn_task_from_thread(
        self,
        func: Callable,
        args: tuple,
        kwargs: dict[str, Any],
        name: object,
        future: Future,
    ) -> None:
        context = copy_context()
        context.run(sniffio.current_async_library_cvar.set, "trio")
        trio.from_thread.run_sync(
            context.run,
            partial(self._task_group.start_soon, name=name),
            self._call_func,
            func,
            args,
            kwargs,
            future,
            trio_token=self._token,
        )


#
# Subprocesses
#


@dataclass(eq=False)
class ReceiveStreamWrapper(abc.ByteReceiveStream):
    _stream: trio.abc.ReceiveStream

    async def receive(self, max_bytes: int | None = None) -> bytes:
        try:
            data = await self._stream.receive_some(max_bytes)
        except trio.ClosedResourceError as exc:
            raise ClosedResourceError from exc.__cause__
        except trio.BrokenResourceError as exc:
            raise BrokenResourceError from exc.__cause__

        if data:
            return data
        else:
            raise EndOfStream

    async def aclose(self) -> None:
        await self._stream.aclose()


@dataclass(eq=False)
class SendStreamWrapper(abc.ByteSendStream):
    _stream: trio.abc.SendStream

    async def send(self, item: bytes) -> None:
        try:
            await self._stream.send_all(item)
        except trio.ClosedResourceError as exc:
            raise ClosedResourceError from exc.__cause__
        except trio.BrokenResourceError as exc:
            raise BrokenResourceError from exc.__cause__

    async def aclose(self) -> None:
        await self._stream.aclose()


@dataclass(eq=False)
class Process(abc.Process):
    _process: trio.Process
    _stdin: abc.ByteSendStream | None
    _stdout: abc.ByteReceiveStream | None
    _stderr: abc.ByteReceiveStream | None

    async def aclose(self) -> None:
        if self._stdin:
            await self._stdin.aclose()
        if self._stdout:
            await self._stdout.aclose()
        if self._stderr:
            await self._stderr.aclose()

        await self.wait()

    async def wait(self) -> int:
        return await self._process.wait()

    def terminate(self) -> None:
        self._process.terminate()

    def kill(self) -> None:
        self._process.kill()

    def send_signal(self, signal: Signals) -> None:
        self._process.send_signal(signal)

    @property
    def pid(self) -> int:
        return self._process.pid

    @property
    def returncode(self) -> int | None:
        return self._process.returncode

    @property
    def stdin(self) -> abc.ByteSendStream | None:
        return self._stdin

    @property
    def stdout(self) -> abc.ByteReceiveStream | None:
        return self._stdout

    @property
    def stderr(self) -> abc.ByteReceiveStream | None:
        return self._stderr


async def open_process(
    command: str | bytes | Sequence[str | bytes],
    *,
    shell: bool,
    stdin: int | IO[Any] | None,
    stdout: int | IO[Any] | None,
    stderr: int | IO[Any] | None,
    cwd: str | bytes | PathLike | None = None,
    env: Mapping[str, str] | None = None,
    start_new_session: bool = False,
) -> Process:
    process = await trio_open_process(  # type: ignore[misc]
        command,  # type: ignore[arg-type]
        stdin=stdin,
        stdout=stdout,
        stderr=stderr,
        shell=shell,
        cwd=cwd,
        env=env,
        start_new_session=start_new_session,
    )
    stdin_stream = SendStreamWrapper(process.stdin) if process.stdin else None
    stdout_stream = ReceiveStreamWrapper(process.stdout) if process.stdout else None
    stderr_stream = ReceiveStreamWrapper(process.stderr) if process.stderr else None
    return Process(process, stdin_stream, stdout_stream, stderr_stream)


class _ProcessPoolShutdownInstrument(trio.abc.Instrument):
    def after_run(self) -> None:
        super().after_run()


current_default_worker_process_limiter: RunVar = RunVar(
    "current_default_worker_process_limiter"
)


async def _shutdown_process_pool(workers: set[Process]) -> None:
    process: Process
    try:
        await sleep(math.inf)
    except trio.Cancelled:
        for process in workers:
            if process.returncode is None:
                process.kill()

        with CancelScope(shield=True):
            for process in workers:
                await process.aclose()


def setup_process_pool_exit_at_shutdown(workers: set[Process]) -> None:
    trio.lowlevel.spawn_system_task(_shutdown_process_pool, workers)


#
# Sockets and networking
#


class _TrioSocketMixin(Generic[T_SockAddr]):
    def __init__(self, trio_socket: TrioSocketType) -> None:
        self._trio_socket = trio_socket
        self._closed = False

    def _check_closed(self) -> None:
        if self._closed:
            raise ClosedResourceError
        if self._trio_socket.fileno() < 0:
            raise BrokenResourceError

    @property
    def _raw_socket(self) -> socket.socket:
        return self._trio_socket._sock  # type: ignore[attr-defined]

    async def aclose(self) -> None:
        if self._trio_socket.fileno() >= 0:
            self._closed = True
            self._trio_socket.close()

    def _convert_socket_error(self, exc: BaseException) -> NoReturn:
        if isinstance(exc, trio.ClosedResourceError):
            raise ClosedResourceError from exc
        elif self._trio_socket.fileno() < 0 and self._closed:
            raise ClosedResourceError from None
        elif isinstance(exc, OSError):
            raise BrokenResourceError from exc
        else:
            raise exc


class SocketStream(_TrioSocketMixin, abc.SocketStream):
    def __init__(self, trio_socket: TrioSocketType) -> None:
        super().__init__(trio_socket)
        self._receive_guard = ResourceGuard("reading from")
        self._send_guard = ResourceGuard("writing to")

    async def receive(self, max_bytes: int = 65536) -> bytes:
        with self._receive_guard:
            try:
                data = await self._trio_socket.recv(max_bytes)
            except BaseException as exc:
                self._convert_socket_error(exc)

            if data:
                return data
            else:
                raise EndOfStream

    async def send(self, item: bytes) -> None:
        with self._send_guard:
            view = memoryview(item)
            while view:
                try:
                    bytes_sent = await self._trio_socket.send(view)
                except BaseException as exc:
                    self._convert_socket_error(exc)

                view = view[bytes_sent:]

    async def send_eof(self) -> None:
        self._trio_socket.shutdown(socket.SHUT_WR)


class UNIXSocketStream(SocketStream, abc.UNIXSocketStream):
    async def receive_fds(self, msglen: int, maxfds: int) -> tuple[bytes, list[int]]:
        if not isinstance(msglen, int) or msglen < 0:
            raise ValueError("msglen must be a non-negative integer")
        if not isinstance(maxfds, int) or maxfds < 1:
            raise ValueError("maxfds must be a positive integer")

        fds = array.array("i")
        await checkpoint()
        with self._receive_guard:
            while True:
                try:
                    message, ancdata, flags, addr = await self._trio_socket.recvmsg(
                        msglen, socket.CMSG_LEN(maxfds * fds.itemsize)
                    )
                except BaseException as exc:
                    self._convert_socket_error(exc)
                else:
                    if not message and not ancdata:
                        raise EndOfStream

                    break

        for cmsg_level, cmsg_type, cmsg_data in ancdata:
            if cmsg_level != socket.SOL_SOCKET or cmsg_type != socket.SCM_RIGHTS:
                raise RuntimeError(
                    f"Received unexpected ancillary data; message = {message!r}, "
                    f"cmsg_level = {cmsg_level}, cmsg_type = {cmsg_type}"
                )

            fds.frombytes(cmsg_data[: len(cmsg_data) - (len(cmsg_data) % fds.itemsize)])

        return message, list(fds)

    async def send_fds(self, message: bytes, fds: Collection[int | IOBase]) -> None:
        if not message:
            raise ValueError("message must not be empty")
        if not fds:
            raise ValueError("fds must not be empty")

        filenos: list[int] = []
        for fd in fds:
            if isinstance(fd, int):
                filenos.append(fd)
            elif isinstance(fd, IOBase):
                filenos.append(fd.fileno())

        fdarray = array.array("i", filenos)
        await checkpoint()
        with self._send_guard:
            while True:
                try:
                    await self._trio_socket.sendmsg(
                        [message],
                        [
                            (
                                socket.SOL_SOCKET,
                                socket.SCM_RIGHTS,  # type: ignore[list-item]
                                fdarray,
                            )
                        ],
                    )
                    break
                except BaseException as exc:
                    self._convert_socket_error(exc)


class TCPSocketListener(_TrioSocketMixin, abc.SocketListener):
    def __init__(self, raw_socket: socket.socket):
        super().__init__(trio.socket.from_stdlib_socket(raw_socket))
        self._accept_guard = ResourceGuard("accepting connections from")

    async def accept(self) -> SocketStream:
        with self._accept_guard:
            try:
                trio_socket, _addr = await self._trio_socket.accept()
            except BaseException as exc:
                self._convert_socket_error(exc)

        trio_socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        return SocketStream(trio_socket)


class UNIXSocketListener(_TrioSocketMixin, abc.SocketListener):
    def __init__(self, raw_socket: socket.socket):
        super().__init__(trio.socket.from_stdlib_socket(raw_socket))
        self._accept_guard = ResourceGuard("accepting connections from")

    async def accept(self) -> UNIXSocketStream:
        with self._accept_guard:
            try:
                trio_socket, _addr = await self._trio_socket.accept()
            except BaseException as exc:
                self._convert_socket_error(exc)

        return UNIXSocketStream(trio_socket)


class UDPSocket(_TrioSocketMixin[IPSockAddrType], abc.UDPSocket):
    def __init__(self, trio_socket: TrioSocketType) -> None:
        super().__init__(trio_socket)
        self._receive_guard = ResourceGuard("reading from")
        self._send_guard = ResourceGuard("writing to")

    async def receive(self) -> tuple[bytes, IPSockAddrType]:
        with self._receive_guard:
            try:
                data, addr = await self._trio_socket.recvfrom(65536)
                return data, convert_ipv6_sockaddr(addr)
            except BaseException as exc:
                self._convert_socket_error(exc)

    async def send(self, item: UDPPacketType) -> None:
        with self._send_guard:
            try:
                await self._trio_socket.sendto(*item)
            except BaseException as exc:
                self._convert_socket_error(exc)


class ConnectedUDPSocket(_TrioSocketMixin[IPSockAddrType], abc.ConnectedUDPSocket):
    def __init__(self, trio_socket: TrioSocketType) -> None:
        super().__init__(trio_socket)
        self._receive_guard = ResourceGuard("reading from")
        self._send_guard = ResourceGuard("writing to")

    async def receive(self) -> bytes:
        with self._receive_guard:
            try:
                return await self._trio_socket.recv(65536)
            except BaseException as exc:
                self._convert_socket_error(exc)

    async def send(self, item: bytes) -> None:
        with self._send_guard:
            try:
                await self._trio_socket.send(item)
            except BaseException as exc:
                self._convert_socket_error(exc)
|
||||||
|
|
||||||
|
|
||||||
|
async def connect_tcp(
|
||||||
|
host: str, port: int, local_address: IPSockAddrType | None = None
|
||||||
|
) -> SocketStream:
|
||||||
|
family = socket.AF_INET6 if ":" in host else socket.AF_INET
|
||||||
|
trio_socket = trio.socket.socket(family)
|
||||||
|
trio_socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
|
||||||
|
if local_address:
|
||||||
|
await trio_socket.bind(local_address)
|
||||||
|
|
||||||
|
try:
|
||||||
|
await trio_socket.connect((host, port))
|
||||||
|
except BaseException:
|
||||||
|
trio_socket.close()
|
||||||
|
raise
|
||||||
|
|
||||||
|
return SocketStream(trio_socket)
|
||||||
|
|
||||||
|
|
||||||
|
async def connect_unix(path: str) -> UNIXSocketStream:
|
||||||
|
trio_socket = trio.socket.socket(socket.AF_UNIX)
|
||||||
|
try:
|
||||||
|
await trio_socket.connect(path)
|
||||||
|
except BaseException:
|
||||||
|
trio_socket.close()
|
||||||
|
raise
|
||||||
|
|
||||||
|
return UNIXSocketStream(trio_socket)
|
||||||
|
|
||||||
|
|
||||||
|
async def create_udp_socket(
|
||||||
|
family: socket.AddressFamily,
|
||||||
|
local_address: IPSockAddrType | None,
|
||||||
|
remote_address: IPSockAddrType | None,
|
||||||
|
reuse_port: bool,
|
||||||
|
) -> UDPSocket | ConnectedUDPSocket:
|
||||||
|
trio_socket = trio.socket.socket(family=family, type=socket.SOCK_DGRAM)
|
||||||
|
|
||||||
|
if reuse_port:
|
||||||
|
trio_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
|
||||||
|
|
||||||
|
if local_address:
|
||||||
|
await trio_socket.bind(local_address)
|
||||||
|
|
||||||
|
if remote_address:
|
||||||
|
await trio_socket.connect(remote_address)
|
||||||
|
return ConnectedUDPSocket(trio_socket)
|
||||||
|
else:
|
||||||
|
return UDPSocket(trio_socket)
|
||||||
|
|
||||||
|
|
||||||
|
getaddrinfo = trio.socket.getaddrinfo
|
||||||
|
getnameinfo = trio.socket.getnameinfo
|
||||||
|
|
||||||
|
|
||||||
|
async def wait_socket_readable(sock: socket.socket) -> None:
|
||||||
|
try:
|
||||||
|
await wait_readable(sock)
|
||||||
|
except trio.ClosedResourceError as exc:
|
||||||
|
raise ClosedResourceError().with_traceback(exc.__traceback__) from None
|
||||||
|
except trio.BusyResourceError:
|
||||||
|
raise BusyResourceError("reading from") from None
|
||||||
|
|
||||||
|
|
||||||
|
async def wait_socket_writable(sock: socket.socket) -> None:
|
||||||
|
try:
|
||||||
|
await wait_writable(sock)
|
||||||
|
except trio.ClosedResourceError as exc:
|
||||||
|
raise ClosedResourceError().with_traceback(exc.__traceback__) from None
|
||||||
|
except trio.BusyResourceError:
|
||||||
|
raise BusyResourceError("writing to") from None
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Synchronization
|
||||||
|
#
|
||||||
|
|
||||||
|
|
||||||
|
class Event(BaseEvent):
|
||||||
|
def __new__(cls) -> Event:
|
||||||
|
return object.__new__(cls)
|
||||||
|
|
||||||
|
def __init__(self) -> None:
|
||||||
|
self.__original = trio.Event()
|
||||||
|
|
||||||
|
def is_set(self) -> bool:
|
||||||
|
return self.__original.is_set()
|
||||||
|
|
||||||
|
async def wait(self) -> None:
|
||||||
|
return await self.__original.wait()
|
||||||
|
|
||||||
|
def statistics(self) -> EventStatistics:
|
||||||
|
orig_statistics = self.__original.statistics()
|
||||||
|
return EventStatistics(tasks_waiting=orig_statistics.tasks_waiting)
|
||||||
|
|
||||||
|
def set(self) -> DeprecatedAwaitable:
|
||||||
|
self.__original.set()
|
||||||
|
return DeprecatedAwaitable(self.set)
|
||||||
|
|
||||||
|
|
||||||
|
class CapacityLimiter(BaseCapacityLimiter):
|
||||||
|
def __new__(cls, *args: object, **kwargs: object) -> CapacityLimiter:
|
||||||
|
return object.__new__(cls)
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self, *args: Any, original: trio.CapacityLimiter | None = None
|
||||||
|
) -> None:
|
||||||
|
self.__original = original or trio.CapacityLimiter(*args)
|
||||||
|
|
||||||
|
async def __aenter__(self) -> None:
|
||||||
|
return await self.__original.__aenter__()
|
||||||
|
|
||||||
|
async def __aexit__(
|
||||||
|
self,
|
||||||
|
exc_type: type[BaseException] | None,
|
||||||
|
exc_val: BaseException | None,
|
||||||
|
exc_tb: TracebackType | None,
|
||||||
|
) -> None:
|
||||||
|
await self.__original.__aexit__(exc_type, exc_val, exc_tb)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def total_tokens(self) -> float:
|
||||||
|
return self.__original.total_tokens
|
||||||
|
|
||||||
|
@total_tokens.setter
|
||||||
|
def total_tokens(self, value: float) -> None:
|
||||||
|
self.__original.total_tokens = value
|
||||||
|
|
||||||
|
@property
|
||||||
|
def borrowed_tokens(self) -> int:
|
||||||
|
return self.__original.borrowed_tokens
|
||||||
|
|
||||||
|
@property
|
||||||
|
def available_tokens(self) -> float:
|
||||||
|
return self.__original.available_tokens
|
||||||
|
|
||||||
|
def acquire_nowait(self) -> DeprecatedAwaitable:
|
||||||
|
self.__original.acquire_nowait()
|
||||||
|
return DeprecatedAwaitable(self.acquire_nowait)
|
||||||
|
|
||||||
|
def acquire_on_behalf_of_nowait(self, borrower: object) -> DeprecatedAwaitable:
|
||||||
|
self.__original.acquire_on_behalf_of_nowait(borrower)
|
||||||
|
return DeprecatedAwaitable(self.acquire_on_behalf_of_nowait)
|
||||||
|
|
||||||
|
async def acquire(self) -> None:
|
||||||
|
await self.__original.acquire()
|
||||||
|
|
||||||
|
async def acquire_on_behalf_of(self, borrower: object) -> None:
|
||||||
|
await self.__original.acquire_on_behalf_of(borrower)
|
||||||
|
|
||||||
|
def release(self) -> None:
|
||||||
|
return self.__original.release()
|
||||||
|
|
||||||
|
def release_on_behalf_of(self, borrower: object) -> None:
|
||||||
|
return self.__original.release_on_behalf_of(borrower)
|
||||||
|
|
||||||
|
def statistics(self) -> CapacityLimiterStatistics:
|
||||||
|
orig = self.__original.statistics()
|
||||||
|
return CapacityLimiterStatistics(
|
||||||
|
borrowed_tokens=orig.borrowed_tokens,
|
||||||
|
total_tokens=orig.total_tokens,
|
||||||
|
borrowers=orig.borrowers,
|
||||||
|
tasks_waiting=orig.tasks_waiting,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
_capacity_limiter_wrapper: RunVar = RunVar("_capacity_limiter_wrapper")
|
||||||
|
|
||||||
|
|
||||||
|
def current_default_thread_limiter() -> CapacityLimiter:
|
||||||
|
try:
|
||||||
|
return _capacity_limiter_wrapper.get()
|
||||||
|
except LookupError:
|
||||||
|
limiter = CapacityLimiter(
|
||||||
|
original=trio.to_thread.current_default_thread_limiter()
|
||||||
|
)
|
||||||
|
_capacity_limiter_wrapper.set(limiter)
|
||||||
|
return limiter
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Signal handling
|
||||||
|
#
|
||||||
|
|
||||||
|
|
||||||
|
class _SignalReceiver(DeprecatedAsyncContextManager["_SignalReceiver"]):
|
||||||
|
_iterator: AsyncIterator[int]
|
||||||
|
|
||||||
|
def __init__(self, signals: tuple[Signals, ...]):
|
||||||
|
self._signals = signals
|
||||||
|
|
||||||
|
def __enter__(self) -> _SignalReceiver:
|
||||||
|
self._cm = trio.open_signal_receiver(*self._signals)
|
||||||
|
self._iterator = self._cm.__enter__()
|
||||||
|
return self
|
||||||
|
|
||||||
|
def __exit__(
|
||||||
|
self,
|
||||||
|
exc_type: type[BaseException] | None,
|
||||||
|
exc_val: BaseException | None,
|
||||||
|
exc_tb: TracebackType | None,
|
||||||
|
) -> bool | None:
|
||||||
|
return self._cm.__exit__(exc_type, exc_val, exc_tb)
|
||||||
|
|
||||||
|
def __aiter__(self) -> _SignalReceiver:
|
||||||
|
return self
|
||||||
|
|
||||||
|
async def __anext__(self) -> Signals:
|
||||||
|
signum = await self._iterator.__anext__()
|
||||||
|
return Signals(signum)
|
||||||
|
|
||||||
|
|
||||||
|
def open_signal_receiver(*signals: Signals) -> _SignalReceiver:
|
||||||
|
return _SignalReceiver(signals)
|
||||||
|
|
||||||
|
|
||||||
|
#
|
||||||
|
# Testing and debugging
|
||||||
|
#
|
||||||
|
|
||||||
|
|
||||||
|
def get_current_task() -> TaskInfo:
|
||||||
|
task = trio_lowlevel.current_task()
|
||||||
|
|
||||||
|
parent_id = None
|
||||||
|
if task.parent_nursery and task.parent_nursery.parent_task:
|
||||||
|
parent_id = id(task.parent_nursery.parent_task)
|
||||||
|
|
||||||
|
return TaskInfo(id(task), parent_id, task.name, task.coro)
|
||||||
|
|
||||||
|
|
||||||
|
def get_running_tasks() -> list[TaskInfo]:
|
||||||
|
root_task = trio_lowlevel.current_root_task()
|
||||||
|
task_infos = [TaskInfo(id(root_task), None, root_task.name, root_task.coro)]
|
||||||
|
nurseries = root_task.child_nurseries
|
||||||
|
while nurseries:
|
||||||
|
new_nurseries: list[trio.Nursery] = []
|
||||||
|
for nursery in nurseries:
|
||||||
|
for task in nursery.child_tasks:
|
||||||
|
task_infos.append(
|
||||||
|
TaskInfo(id(task), id(nursery.parent_task), task.name, task.coro)
|
||||||
|
)
|
||||||
|
new_nurseries.extend(task.child_nurseries)
|
||||||
|
|
||||||
|
nurseries = new_nurseries
|
||||||
|
|
||||||
|
return task_infos
|
||||||
|
|
||||||
|
|
||||||
|
def wait_all_tasks_blocked() -> Awaitable[None]:
|
||||||
|
import trio.testing
|
||||||
|
|
||||||
|
return trio.testing.wait_all_tasks_blocked()
|
||||||
|
|
||||||
|
|
||||||
|
class TestRunner(abc.TestRunner):
|
||||||
|
def __init__(self, **options: Any) -> None:
|
||||||
|
from collections import deque
|
||||||
|
from queue import Queue
|
||||||
|
|
||||||
|
self._call_queue: Queue[Callable[..., object]] = Queue()
|
||||||
|
self._result_queue: deque[Outcome] = deque()
|
||||||
|
self._stop_event: trio.Event | None = None
|
||||||
|
self._nursery: trio.Nursery | None = None
|
||||||
|
self._options = options
|
||||||
|
|
||||||
|
async def _trio_main(self) -> None:
|
||||||
|
self._stop_event = trio.Event()
|
||||||
|
async with trio.open_nursery() as self._nursery:
|
||||||
|
await self._stop_event.wait()
|
||||||
|
|
||||||
|
async def _call_func(
|
||||||
|
self, func: Callable[..., Awaitable[object]], args: tuple, kwargs: dict
|
||||||
|
) -> None:
|
||||||
|
try:
|
||||||
|
retval = await func(*args, **kwargs)
|
||||||
|
except BaseException as exc:
|
||||||
|
self._result_queue.append(Error(exc))
|
||||||
|
else:
|
||||||
|
self._result_queue.append(Value(retval))
|
||||||
|
|
||||||
|
def _main_task_finished(self, outcome: object) -> None:
|
||||||
|
self._nursery = None
|
||||||
|
|
||||||
|
def _get_nursery(self) -> trio.Nursery:
|
||||||
|
if self._nursery is None:
|
||||||
|
trio.lowlevel.start_guest_run(
|
||||||
|
self._trio_main,
|
||||||
|
run_sync_soon_threadsafe=self._call_queue.put,
|
||||||
|
done_callback=self._main_task_finished,
|
||||||
|
**self._options,
|
||||||
|
)
|
||||||
|
while self._nursery is None:
|
||||||
|
self._call_queue.get()()
|
||||||
|
|
||||||
|
return self._nursery
|
||||||
|
|
||||||
|
def _call(
|
||||||
|
self, func: Callable[..., Awaitable[T_Retval]], *args: object, **kwargs: object
|
||||||
|
) -> T_Retval:
|
||||||
|
self._get_nursery().start_soon(self._call_func, func, args, kwargs)
|
||||||
|
while not self._result_queue:
|
||||||
|
self._call_queue.get()()
|
||||||
|
|
||||||
|
outcome = self._result_queue.pop()
|
||||||
|
return outcome.unwrap()
|
||||||
|
|
||||||
|
def close(self) -> None:
|
||||||
|
if self._stop_event:
|
||||||
|
self._stop_event.set()
|
||||||
|
while self._nursery is not None:
|
||||||
|
self._call_queue.get()()
|
||||||
|
|
||||||
|
def run_asyncgen_fixture(
|
||||||
|
self,
|
||||||
|
fixture_func: Callable[..., AsyncGenerator[T_Retval, Any]],
|
||||||
|
kwargs: dict[str, Any],
|
||||||
|
) -> Iterable[T_Retval]:
|
||||||
|
async def fixture_runner(*, task_status: TaskStatus[T_Retval]) -> None:
|
||||||
|
agen = fixture_func(**kwargs)
|
||||||
|
retval = await agen.asend(None)
|
||||||
|
task_status.started(retval)
|
||||||
|
await teardown_event.wait()
|
||||||
|
try:
|
||||||
|
await agen.asend(None)
|
||||||
|
except StopAsyncIteration:
|
||||||
|
pass
|
||||||
|
else:
|
||||||
|
await agen.aclose()
|
||||||
|
raise RuntimeError("Async generator fixture did not stop")
|
||||||
|
|
||||||
|
teardown_event = trio.Event()
|
||||||
|
fixture_value = self._call(lambda: self._get_nursery().start(fixture_runner))
|
||||||
|
yield fixture_value
|
||||||
|
teardown_event.set()
|
||||||
|
|
||||||
|
def run_fixture(
|
||||||
|
self,
|
||||||
|
fixture_func: Callable[..., Coroutine[Any, Any, T_Retval]],
|
||||||
|
kwargs: dict[str, Any],
|
||||||
|
) -> T_Retval:
|
||||||
|
return self._call(fixture_func, **kwargs)
|
||||||
|
|
||||||
|
def run_test(
|
||||||
|
self, test_func: Callable[..., Coroutine[Any, Any, Any]], kwargs: dict[str, Any]
|
||||||
|
) -> None:
|
||||||
|
self._call(test_func, **kwargs)
|
||||||
|
|
@@ -0,0 +1,217 @@
from __future__ import annotations

from abc import ABCMeta, abstractmethod
from contextlib import AbstractContextManager
from types import TracebackType
from typing import (
    TYPE_CHECKING,
    Any,
    AsyncContextManager,
    Callable,
    ContextManager,
    Generator,
    Generic,
    Iterable,
    List,
    TypeVar,
    Union,
    overload,
)
from warnings import warn

if TYPE_CHECKING:
    from ._testing import TaskInfo
else:
    TaskInfo = object

T = TypeVar("T")
AnyDeprecatedAwaitable = Union[
    "DeprecatedAwaitable",
    "DeprecatedAwaitableFloat",
    "DeprecatedAwaitableList[T]",
    TaskInfo,
]


@overload
async def maybe_async(__obj: TaskInfo) -> TaskInfo:
    ...


@overload
async def maybe_async(__obj: DeprecatedAwaitableFloat) -> float:
    ...


@overload
async def maybe_async(__obj: DeprecatedAwaitableList[T]) -> list[T]:
    ...


@overload
async def maybe_async(__obj: DeprecatedAwaitable) -> None:
    ...


async def maybe_async(
    __obj: AnyDeprecatedAwaitable[T],
) -> TaskInfo | float | list[T] | None:
    """
    Await on the given object if necessary.

    This function is intended to bridge the gap between AnyIO 2.x and 3.x where some functions and
    methods were converted from coroutine functions into regular functions.

    Do **not** try to use this for any other purpose!

    :return: the result of awaiting on the object if coroutine, or the object itself otherwise

    .. versionadded:: 2.2

    """
    return __obj._unwrap()


class _ContextManagerWrapper:
    def __init__(self, cm: ContextManager[T]):
        self._cm = cm

    async def __aenter__(self) -> T:
        return self._cm.__enter__()

    async def __aexit__(
        self,
        exc_type: type[BaseException] | None,
        exc_val: BaseException | None,
        exc_tb: TracebackType | None,
    ) -> bool | None:
        return self._cm.__exit__(exc_type, exc_val, exc_tb)


def maybe_async_cm(
    cm: ContextManager[T] | AsyncContextManager[T],
) -> AsyncContextManager[T]:
    """
    Wrap a regular context manager as an async one if necessary.

    This function is intended to bridge the gap between AnyIO 2.x and 3.x where some functions and
    methods were changed to return regular context managers instead of async ones.

    :param cm: a regular or async context manager
    :return: an async context manager

    .. versionadded:: 2.2

    """
    if not isinstance(cm, AbstractContextManager):
        raise TypeError("Given object is not a context manager")

    return _ContextManagerWrapper(cm)


def _warn_deprecation(
    awaitable: AnyDeprecatedAwaitable[Any], stacklevel: int = 1
) -> None:
    warn(
        f'Awaiting on {awaitable._name}() is deprecated. Use "await '
        f'anyio.maybe_async({awaitable._name}(...))" if you have to support both AnyIO 2.x '
        f'and 3.x, or just remove the "await" if you are completely migrating to AnyIO 3+.',
        DeprecationWarning,
        stacklevel=stacklevel + 1,
    )


class DeprecatedAwaitable:
    def __init__(self, func: Callable[..., DeprecatedAwaitable]):
        self._name = f"{func.__module__}.{func.__qualname__}"

    def __await__(self) -> Generator[None, None, None]:
        _warn_deprecation(self)
        if False:
            yield

    def __reduce__(self) -> tuple[type[None], tuple[()]]:
        return type(None), ()

    def _unwrap(self) -> None:
        return None


class DeprecatedAwaitableFloat(float):
    def __new__(
        cls, x: float, func: Callable[..., DeprecatedAwaitableFloat]
    ) -> DeprecatedAwaitableFloat:
        return super().__new__(cls, x)

    def __init__(self, x: float, func: Callable[..., DeprecatedAwaitableFloat]):
        self._name = f"{func.__module__}.{func.__qualname__}"

    def __await__(self) -> Generator[None, None, float]:
        _warn_deprecation(self)
        if False:
            yield

        return float(self)

    def __reduce__(self) -> tuple[type[float], tuple[float]]:
        return float, (float(self),)

    def _unwrap(self) -> float:
        return float(self)


class DeprecatedAwaitableList(List[T]):
    def __init__(
        self,
        iterable: Iterable[T] = (),
        *,
        func: Callable[..., DeprecatedAwaitableList[T]],
    ):
        super().__init__(iterable)
        self._name = f"{func.__module__}.{func.__qualname__}"

    def __await__(self) -> Generator[None, None, list[T]]:
        _warn_deprecation(self)
        if False:
            yield

        return list(self)

    def __reduce__(self) -> tuple[type[list[T]], tuple[list[T]]]:
        return list, (list(self),)

    def _unwrap(self) -> list[T]:
        return list(self)


class DeprecatedAsyncContextManager(Generic[T], metaclass=ABCMeta):
    @abstractmethod
    def __enter__(self) -> T:
        pass

    @abstractmethod
    def __exit__(
        self,
        exc_type: type[BaseException] | None,
        exc_val: BaseException | None,
        exc_tb: TracebackType | None,
    ) -> bool | None:
        pass

    async def __aenter__(self) -> T:
        warn(
            f"Using {self.__class__.__name__} as an async context manager has been deprecated. "
            f'Use "async with anyio.maybe_async_cm(yourcontextmanager) as foo:" if you have to '
            f'support both AnyIO 2.x and 3.x, or just remove the "async" from "async with" if '
            f"you are completely migrating to AnyIO 3+.",
            DeprecationWarning,
        )
        return self.__enter__()

    async def __aexit__(
        self,
        exc_type: type[BaseException] | None,
        exc_val: BaseException | None,
        exc_tb: TracebackType | None,
    ) -> bool | None:
        return self.__exit__(exc_type, exc_val, exc_tb)
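The `if False: yield` idiom above turns `__await__` into a generator that never actually suspends, so the same object can be used both with and without `await`. A minimal stdlib-only sketch of that pattern (the class and function names here are illustrative, not part of this codebase):

```python
import asyncio
import warnings


class AwaitableFloat(float):
    """A float that may also be awaited, mirroring DeprecatedAwaitableFloat."""

    def __await__(self):
        # Warn like _warn_deprecation() does, then finish without suspending.
        warnings.warn("awaiting this value is deprecated", DeprecationWarning)
        if False:
            yield  # makes __await__ a generator without ever yielding
        return float(self)


async def demo() -> float:
    value = AwaitableFloat(1.5)
    return await value  # works, but emits a DeprecationWarning


print(asyncio.run(demo()))  # prints 1.5
```

Because the generator raises `StopIteration(1.5)` on its first step, the `await` completes immediately with the float value.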
@@ -0,0 +1,153 @@
from __future__ import annotations

import math
import sys
import threading
from contextlib import contextmanager
from importlib import import_module
from typing import (
    Any,
    Awaitable,
    Callable,
    Generator,
    TypeVar,
)

import sniffio

# This must be updated when new backends are introduced
from ._compat import DeprecatedAwaitableFloat

BACKENDS = "asyncio", "trio"

T_Retval = TypeVar("T_Retval")
threadlocals = threading.local()


def run(
    func: Callable[..., Awaitable[T_Retval]],
    *args: object,
    backend: str = "asyncio",
    backend_options: dict[str, Any] | None = None,
) -> T_Retval:
    """
    Run the given coroutine function in an asynchronous event loop.

    The current thread must not be already running an event loop.

    :param func: a coroutine function
    :param args: positional arguments to ``func``
    :param backend: name of the asynchronous event loop implementation – currently either
        ``asyncio`` or ``trio``
    :param backend_options: keyword arguments to call the backend ``run()`` implementation with
        (documented :ref:`here <backend options>`)
    :return: the return value of the coroutine function
    :raises RuntimeError: if an asynchronous event loop is already running in this thread
    :raises LookupError: if the named backend is not found

    """
    try:
        asynclib_name = sniffio.current_async_library()
    except sniffio.AsyncLibraryNotFoundError:
        pass
    else:
        raise RuntimeError(f"Already running {asynclib_name} in this thread")

    try:
        asynclib = import_module(f"..._backends._{backend}", package=__name__)
    except ImportError as exc:
        raise LookupError(f"No such backend: {backend}") from exc

    token = None
    if sniffio.current_async_library_cvar.get(None) is None:
        # Since we're in control of the event loop, we can cache the name of the async library
        token = sniffio.current_async_library_cvar.set(backend)

    try:
        backend_options = backend_options or {}
        return asynclib.run(func, *args, **backend_options)
    finally:
        if token:
            sniffio.current_async_library_cvar.reset(token)


async def sleep(delay: float) -> None:
    """
    Pause the current task for the specified duration.

    :param delay: the duration, in seconds

    """
    return await get_asynclib().sleep(delay)


async def sleep_forever() -> None:
    """
    Pause the current task until it's cancelled.

    This is a shortcut for ``sleep(math.inf)``.

    .. versionadded:: 3.1

    """
    await sleep(math.inf)


async def sleep_until(deadline: float) -> None:
    """
    Pause the current task until the given time.

    :param deadline: the absolute time to wake up at (according to the internal monotonic clock of
        the event loop)

    .. versionadded:: 3.1

    """
    now = current_time()
    await sleep(max(deadline - now, 0))


def current_time() -> DeprecatedAwaitableFloat:
    """
    Return the current value of the event loop's internal clock.

    :return: the clock value (seconds)

    """
    return DeprecatedAwaitableFloat(get_asynclib().current_time(), current_time)


def get_all_backends() -> tuple[str, ...]:
    """Return a tuple of the names of all built-in backends."""
    return BACKENDS


def get_cancelled_exc_class() -> type[BaseException]:
    """Return the current async library's cancellation exception class."""
    return get_asynclib().CancelledError


#
# Private API
#


@contextmanager
def claim_worker_thread(backend: str) -> Generator[Any, None, None]:
    module = sys.modules["anyio._backends._" + backend]
    threadlocals.current_async_module = module
    try:
        yield
    finally:
        del threadlocals.current_async_module


def get_asynclib(asynclib_name: str | None = None) -> Any:
    if asynclib_name is None:
        asynclib_name = sniffio.current_async_library()

    modulename = "anyio._backends._" + asynclib_name
    try:
        return sys.modules[modulename]
    except KeyError:
        return import_module(modulename)
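`run()` and `get_asynclib()` above resolve the backend module by name at call time rather than importing it eagerly. A stdlib-only sketch of that dispatch pattern, using `asyncio` in place of the `anyio._backends` package (the helper and function names here are illustrative):

```python
import importlib
import sys


def load_backend(modulename: str):
    # Same lookup order as get_asynclib(): sys.modules first, then import.
    try:
        return sys.modules[modulename]
    except KeyError:
        return importlib.import_module(modulename)


backend = load_backend("asyncio")


async def main() -> int:
    await backend.sleep(0)  # yield control to the event loop once
    return 42


print(backend.run(main()))  # prints 42
```

Checking `sys.modules` first avoids re-running module import machinery on every call, which matters for a function invoked on each `sleep()`.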
@@ -0,0 +1,94 @@
from __future__ import annotations

from traceback import format_exception


class BrokenResourceError(Exception):
    """
    Raised when trying to use a resource that has been rendered unusable due to external causes
    (e.g. a send stream whose peer has disconnected).
    """


class BrokenWorkerProcess(Exception):
    """
    Raised by :func:`run_sync_in_process` if the worker process terminates abruptly or otherwise
    misbehaves.
    """


class BusyResourceError(Exception):
    """Raised when two tasks are trying to read from or write to the same resource concurrently."""

    def __init__(self, action: str):
        super().__init__(f"Another task is already {action} this resource")


class ClosedResourceError(Exception):
    """Raised when trying to use a resource that has been closed."""


class DelimiterNotFound(Exception):
    """
    Raised during :meth:`~anyio.streams.buffered.BufferedByteReceiveStream.receive_until` if the
    maximum number of bytes has been read without the delimiter being found.
    """

    def __init__(self, max_bytes: int) -> None:
        super().__init__(
            f"The delimiter was not found among the first {max_bytes} bytes"
        )


class EndOfStream(Exception):
    """Raised when trying to read from a stream that has been closed from the other end."""


class ExceptionGroup(BaseException):
    """
    Raised when multiple exceptions have been raised in a task group.

    :var ~typing.Sequence[BaseException] exceptions: the sequence of exceptions raised together
    """

    SEPARATOR = "----------------------------\n"

    exceptions: list[BaseException]

    def __str__(self) -> str:
        tracebacks = [
            "".join(format_exception(type(exc), exc, exc.__traceback__))
            for exc in self.exceptions
        ]
        return (
            f"{len(self.exceptions)} exceptions were raised in the task group:\n"
            f"{self.SEPARATOR}{self.SEPARATOR.join(tracebacks)}"
        )

    def __repr__(self) -> str:
        exception_reprs = ", ".join(repr(exc) for exc in self.exceptions)
        return f"<{self.__class__.__name__}: {exception_reprs}>"


class IncompleteRead(Exception):
    """
    Raised during :meth:`~anyio.streams.buffered.BufferedByteReceiveStream.receive_exactly` or
    :meth:`~anyio.streams.buffered.BufferedByteReceiveStream.receive_until` if the
    connection is closed before the requested amount of bytes has been read.
    """

    def __init__(self) -> None:
        super().__init__(
            "The stream was closed before the read operation could be completed"
        )


class TypedAttributeLookupError(LookupError):
    """
    Raised by :meth:`~anyio.TypedAttributeProvider.extra` when the given typed attribute is not
    found and no default value has been given.
    """


class WouldBlock(Exception):
    """Raised by ``X_nowait`` functions if ``X()`` would block."""
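`BusyResourceError` above is what the socket classes raise through their `ResourceGuard` context managers (`ResourceGuard("reading from")` and so on). A minimal sketch of that guard pattern, under the assumption that the guard simply marks the resource busy for the duration of one operation (this is a simplified stand-in, not the library's implementation):

```python
class BusyResourceError(Exception):
    """Raised when two tasks use the same resource concurrently."""

    def __init__(self, action: str):
        super().__init__(f"Another task is already {action} this resource")


class ResourceGuard:
    """Sketch of the guard entered as `with self._receive_guard:` above."""

    def __init__(self, action: str):
        self._action = action
        self._busy = False

    def __enter__(self) -> None:
        if self._busy:
            raise BusyResourceError(self._action)
        self._busy = True

    def __exit__(self, *exc_info) -> None:
        self._busy = False


guard = ResourceGuard("reading from")
with guard:
    try:
        with guard:  # simulates a second task entering concurrently
            pass
    except BusyResourceError as exc:
        print(exc)  # prints: Another task is already reading from this resource
```

Because the flag is cleared in `__exit__`, the guard is reusable once the first operation finishes; it only rejects genuinely concurrent use.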
@@ -0,0 +1,603 @@
from __future__ import annotations

import os
import pathlib
import sys
from dataclasses import dataclass
from functools import partial
from os import PathLike
from typing import (
    IO,
    TYPE_CHECKING,
    Any,
    AnyStr,
    AsyncIterator,
    Callable,
    Generic,
    Iterable,
    Iterator,
    Sequence,
    cast,
    overload,
)

from .. import to_thread
from ..abc import AsyncResource

if sys.version_info >= (3, 8):
    from typing import Final
else:
    from typing_extensions import Final

if TYPE_CHECKING:
    from _typeshed import OpenBinaryMode, OpenTextMode, ReadableBuffer, WriteableBuffer
else:
    ReadableBuffer = OpenBinaryMode = OpenTextMode = WriteableBuffer = object


class AsyncFile(AsyncResource, Generic[AnyStr]):
    """
    An asynchronous file object.

    This class wraps a standard file object and provides async friendly versions of the following
    blocking methods (where available on the original file object):

    * read
    * read1
    * readline
    * readlines
    * readinto
    * readinto1
    * write
    * writelines
    * truncate
    * seek
    * tell
    * flush

    All other methods are directly passed through.

    This class supports the asynchronous context manager protocol which closes the underlying file
    at the end of the context block.

    This class also supports asynchronous iteration::

        async with await open_file(...) as f:
            async for line in f:
                print(line)
    """

    def __init__(self, fp: IO[AnyStr]) -> None:
        self._fp: Any = fp

    def __getattr__(self, name: str) -> object:
        return getattr(self._fp, name)

    @property
    def wrapped(self) -> IO[AnyStr]:
        """The wrapped file object."""
        return self._fp

    async def __aiter__(self) -> AsyncIterator[AnyStr]:
        while True:
            line = await self.readline()
            if line:
                yield line
            else:
                break

    async def aclose(self) -> None:
        return await to_thread.run_sync(self._fp.close)

    async def read(self, size: int = -1) -> AnyStr:
        return await to_thread.run_sync(self._fp.read, size)

    async def read1(self: AsyncFile[bytes], size: int = -1) -> bytes:
        return await to_thread.run_sync(self._fp.read1, size)

    async def readline(self) -> AnyStr:
        return await to_thread.run_sync(self._fp.readline)

    async def readlines(self) -> list[AnyStr]:
|
||||||
|
return await to_thread.run_sync(self._fp.readlines)
|
||||||
|
|
||||||
|
async def readinto(self: AsyncFile[bytes], b: WriteableBuffer) -> bytes:
|
||||||
|
return await to_thread.run_sync(self._fp.readinto, b)
|
||||||
|
|
||||||
|
async def readinto1(self: AsyncFile[bytes], b: WriteableBuffer) -> bytes:
|
||||||
|
return await to_thread.run_sync(self._fp.readinto1, b)
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def write(self: AsyncFile[bytes], b: ReadableBuffer) -> int:
|
||||||
|
...
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def write(self: AsyncFile[str], b: str) -> int:
|
||||||
|
...
|
||||||
|
|
||||||
|
async def write(self, b: ReadableBuffer | str) -> int:
|
||||||
|
return await to_thread.run_sync(self._fp.write, b)
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def writelines(
|
||||||
|
self: AsyncFile[bytes], lines: Iterable[ReadableBuffer]
|
||||||
|
) -> None:
|
||||||
|
...
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def writelines(self: AsyncFile[str], lines: Iterable[str]) -> None:
|
||||||
|
...
|
||||||
|
|
||||||
|
async def writelines(self, lines: Iterable[ReadableBuffer] | Iterable[str]) -> None:
|
||||||
|
return await to_thread.run_sync(self._fp.writelines, lines)
|
||||||
|
|
||||||
|
async def truncate(self, size: int | None = None) -> int:
|
||||||
|
return await to_thread.run_sync(self._fp.truncate, size)
|
||||||
|
|
||||||
|
async def seek(self, offset: int, whence: int | None = os.SEEK_SET) -> int:
|
||||||
|
return await to_thread.run_sync(self._fp.seek, offset, whence)
|
||||||
|
|
||||||
|
async def tell(self) -> int:
|
||||||
|
return await to_thread.run_sync(self._fp.tell)
|
||||||
|
|
||||||
|
async def flush(self) -> None:
|
||||||
|
return await to_thread.run_sync(self._fp.flush)
|
||||||
|
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def open_file(
|
||||||
|
file: str | PathLike[str] | int,
|
||||||
|
mode: OpenBinaryMode,
|
||||||
|
buffering: int = ...,
|
||||||
|
encoding: str | None = ...,
|
||||||
|
errors: str | None = ...,
|
||||||
|
newline: str | None = ...,
|
||||||
|
closefd: bool = ...,
|
||||||
|
opener: Callable[[str, int], int] | None = ...,
|
||||||
|
) -> AsyncFile[bytes]:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def open_file(
|
||||||
|
file: str | PathLike[str] | int,
|
||||||
|
mode: OpenTextMode = ...,
|
||||||
|
buffering: int = ...,
|
||||||
|
encoding: str | None = ...,
|
||||||
|
errors: str | None = ...,
|
||||||
|
newline: str | None = ...,
|
||||||
|
closefd: bool = ...,
|
||||||
|
opener: Callable[[str, int], int] | None = ...,
|
||||||
|
) -> AsyncFile[str]:
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
async def open_file(
|
||||||
|
file: str | PathLike[str] | int,
|
||||||
|
mode: str = "r",
|
||||||
|
buffering: int = -1,
|
||||||
|
encoding: str | None = None,
|
||||||
|
errors: str | None = None,
|
||||||
|
newline: str | None = None,
|
||||||
|
closefd: bool = True,
|
||||||
|
opener: Callable[[str, int], int] | None = None,
|
||||||
|
) -> AsyncFile[Any]:
|
||||||
|
"""
|
||||||
|
Open a file asynchronously.
|
||||||
|
|
||||||
|
The arguments are exactly the same as for the builtin :func:`open`.
|
||||||
|
|
||||||
|
:return: an asynchronous file object
|
||||||
|
|
||||||
|
"""
|
||||||
|
fp = await to_thread.run_sync(
|
||||||
|
open, file, mode, buffering, encoding, errors, newline, closefd, opener
|
||||||
|
)
|
||||||
|
return AsyncFile(fp)
|
||||||
|
|
||||||
|
|
||||||
|
def wrap_file(file: IO[AnyStr]) -> AsyncFile[AnyStr]:
|
||||||
|
"""
|
||||||
|
Wrap an existing file as an asynchronous file.
|
||||||
|
|
||||||
|
:param file: an existing file-like object
|
||||||
|
:return: an asynchronous file object
|
||||||
|
|
||||||
|
"""
|
||||||
|
return AsyncFile(file)
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(eq=False)
|
||||||
|
class _PathIterator(AsyncIterator["Path"]):
|
||||||
|
iterator: Iterator[PathLike[str]]
|
||||||
|
|
||||||
|
async def __anext__(self) -> Path:
|
||||||
|
nextval = await to_thread.run_sync(next, self.iterator, None, cancellable=True)
|
||||||
|
if nextval is None:
|
||||||
|
raise StopAsyncIteration from None
|
||||||
|
|
||||||
|
return Path(cast("PathLike[str]", nextval))
|
||||||
|
|
||||||
|
|
||||||
|
class Path:
|
||||||
|
"""
|
||||||
|
An asynchronous version of :class:`pathlib.Path`.
|
||||||
|
|
||||||
|
This class cannot be substituted for :class:`pathlib.Path` or :class:`pathlib.PurePath`, but
|
||||||
|
it is compatible with the :class:`os.PathLike` interface.
|
||||||
|
|
||||||
|
It implements the Python 3.10 version of :class:`pathlib.Path` interface, except for the
|
||||||
|
deprecated :meth:`~pathlib.Path.link_to` method.
|
||||||
|
|
||||||
|
Any methods that do disk I/O need to be awaited on. These methods are:
|
||||||
|
|
||||||
|
* :meth:`~pathlib.Path.absolute`
|
||||||
|
* :meth:`~pathlib.Path.chmod`
|
||||||
|
* :meth:`~pathlib.Path.cwd`
|
||||||
|
* :meth:`~pathlib.Path.exists`
|
||||||
|
* :meth:`~pathlib.Path.expanduser`
|
||||||
|
* :meth:`~pathlib.Path.group`
|
||||||
|
* :meth:`~pathlib.Path.hardlink_to`
|
||||||
|
* :meth:`~pathlib.Path.home`
|
||||||
|
* :meth:`~pathlib.Path.is_block_device`
|
||||||
|
* :meth:`~pathlib.Path.is_char_device`
|
||||||
|
* :meth:`~pathlib.Path.is_dir`
|
||||||
|
* :meth:`~pathlib.Path.is_fifo`
|
||||||
|
* :meth:`~pathlib.Path.is_file`
|
||||||
|
* :meth:`~pathlib.Path.is_mount`
|
||||||
|
* :meth:`~pathlib.Path.lchmod`
|
||||||
|
* :meth:`~pathlib.Path.lstat`
|
||||||
|
* :meth:`~pathlib.Path.mkdir`
|
||||||
|
* :meth:`~pathlib.Path.open`
|
||||||
|
* :meth:`~pathlib.Path.owner`
|
||||||
|
* :meth:`~pathlib.Path.read_bytes`
|
||||||
|
* :meth:`~pathlib.Path.read_text`
|
||||||
|
* :meth:`~pathlib.Path.readlink`
|
||||||
|
* :meth:`~pathlib.Path.rename`
|
||||||
|
* :meth:`~pathlib.Path.replace`
|
||||||
|
* :meth:`~pathlib.Path.rmdir`
|
||||||
|
* :meth:`~pathlib.Path.samefile`
|
||||||
|
* :meth:`~pathlib.Path.stat`
|
||||||
|
* :meth:`~pathlib.Path.touch`
|
||||||
|
* :meth:`~pathlib.Path.unlink`
|
||||||
|
* :meth:`~pathlib.Path.write_bytes`
|
||||||
|
* :meth:`~pathlib.Path.write_text`
|
||||||
|
|
||||||
|
Additionally, the following methods return an async iterator yielding :class:`~.Path` objects:
|
||||||
|
|
||||||
|
* :meth:`~pathlib.Path.glob`
|
||||||
|
* :meth:`~pathlib.Path.iterdir`
|
||||||
|
* :meth:`~pathlib.Path.rglob`
|
||||||
|
"""
|
||||||
|
|
||||||
|
__slots__ = "_path", "__weakref__"
|
||||||
|
|
||||||
|
__weakref__: Any
|
||||||
|
|
||||||
|
def __init__(self, *args: str | PathLike[str]) -> None:
|
||||||
|
self._path: Final[pathlib.Path] = pathlib.Path(*args)
|
||||||
|
|
||||||
|
def __fspath__(self) -> str:
|
||||||
|
return self._path.__fspath__()
|
||||||
|
|
||||||
|
def __str__(self) -> str:
|
||||||
|
return self._path.__str__()
|
||||||
|
|
||||||
|
def __repr__(self) -> str:
|
||||||
|
return f"{self.__class__.__name__}({self.as_posix()!r})"
|
||||||
|
|
||||||
|
def __bytes__(self) -> bytes:
|
||||||
|
return self._path.__bytes__()
|
||||||
|
|
||||||
|
def __hash__(self) -> int:
|
||||||
|
return self._path.__hash__()
|
||||||
|
|
||||||
|
def __eq__(self, other: object) -> bool:
|
||||||
|
target = other._path if isinstance(other, Path) else other
|
||||||
|
return self._path.__eq__(target)
|
||||||
|
|
||||||
|
def __lt__(self, other: Path) -> bool:
|
||||||
|
target = other._path if isinstance(other, Path) else other
|
||||||
|
return self._path.__lt__(target)
|
||||||
|
|
||||||
|
def __le__(self, other: Path) -> bool:
|
||||||
|
target = other._path if isinstance(other, Path) else other
|
||||||
|
return self._path.__le__(target)
|
||||||
|
|
||||||
|
def __gt__(self, other: Path) -> bool:
|
||||||
|
target = other._path if isinstance(other, Path) else other
|
||||||
|
return self._path.__gt__(target)
|
||||||
|
|
||||||
|
def __ge__(self, other: Path) -> bool:
|
||||||
|
target = other._path if isinstance(other, Path) else other
|
||||||
|
return self._path.__ge__(target)
|
||||||
|
|
||||||
|
def __truediv__(self, other: Any) -> Path:
|
||||||
|
return Path(self._path / other)
|
||||||
|
|
||||||
|
def __rtruediv__(self, other: Any) -> Path:
|
||||||
|
return Path(other) / self
|
||||||
|
|
||||||
|
@property
|
||||||
|
def parts(self) -> tuple[str, ...]:
|
||||||
|
return self._path.parts
|
||||||
|
|
||||||
|
@property
|
||||||
|
def drive(self) -> str:
|
||||||
|
return self._path.drive
|
||||||
|
|
||||||
|
@property
|
||||||
|
def root(self) -> str:
|
||||||
|
return self._path.root
|
||||||
|
|
||||||
|
@property
|
||||||
|
def anchor(self) -> str:
|
||||||
|
return self._path.anchor
|
||||||
|
|
||||||
|
@property
|
||||||
|
def parents(self) -> Sequence[Path]:
|
||||||
|
return tuple(Path(p) for p in self._path.parents)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def parent(self) -> Path:
|
||||||
|
return Path(self._path.parent)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def name(self) -> str:
|
||||||
|
return self._path.name
|
||||||
|
|
||||||
|
@property
|
||||||
|
def suffix(self) -> str:
|
||||||
|
return self._path.suffix
|
||||||
|
|
||||||
|
@property
|
||||||
|
def suffixes(self) -> list[str]:
|
||||||
|
return self._path.suffixes
|
||||||
|
|
||||||
|
@property
|
||||||
|
def stem(self) -> str:
|
||||||
|
return self._path.stem
|
||||||
|
|
||||||
|
async def absolute(self) -> Path:
|
||||||
|
path = await to_thread.run_sync(self._path.absolute)
|
||||||
|
return Path(path)
|
||||||
|
|
||||||
|
def as_posix(self) -> str:
|
||||||
|
return self._path.as_posix()
|
||||||
|
|
||||||
|
def as_uri(self) -> str:
|
||||||
|
return self._path.as_uri()
|
||||||
|
|
||||||
|
def match(self, path_pattern: str) -> bool:
|
||||||
|
return self._path.match(path_pattern)
|
||||||
|
|
||||||
|
def is_relative_to(self, *other: str | PathLike[str]) -> bool:
|
||||||
|
try:
|
||||||
|
self.relative_to(*other)
|
||||||
|
return True
|
||||||
|
except ValueError:
|
||||||
|
return False
|
||||||
|
|
||||||
|
async def chmod(self, mode: int, *, follow_symlinks: bool = True) -> None:
|
||||||
|
func = partial(os.chmod, follow_symlinks=follow_symlinks)
|
||||||
|
return await to_thread.run_sync(func, self._path, mode)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
async def cwd(cls) -> Path:
|
||||||
|
path = await to_thread.run_sync(pathlib.Path.cwd)
|
||||||
|
return cls(path)
|
||||||
|
|
||||||
|
async def exists(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.exists, cancellable=True)
|
||||||
|
|
||||||
|
async def expanduser(self) -> Path:
|
||||||
|
return Path(await to_thread.run_sync(self._path.expanduser, cancellable=True))
|
||||||
|
|
||||||
|
def glob(self, pattern: str) -> AsyncIterator[Path]:
|
||||||
|
gen = self._path.glob(pattern)
|
||||||
|
return _PathIterator(gen)
|
||||||
|
|
||||||
|
async def group(self) -> str:
|
||||||
|
return await to_thread.run_sync(self._path.group, cancellable=True)
|
||||||
|
|
||||||
|
async def hardlink_to(self, target: str | pathlib.Path | Path) -> None:
|
||||||
|
if isinstance(target, Path):
|
||||||
|
target = target._path
|
||||||
|
|
||||||
|
await to_thread.run_sync(os.link, target, self)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
async def home(cls) -> Path:
|
||||||
|
home_path = await to_thread.run_sync(pathlib.Path.home)
|
||||||
|
return cls(home_path)
|
||||||
|
|
||||||
|
def is_absolute(self) -> bool:
|
||||||
|
return self._path.is_absolute()
|
||||||
|
|
||||||
|
async def is_block_device(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.is_block_device, cancellable=True)
|
||||||
|
|
||||||
|
async def is_char_device(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.is_char_device, cancellable=True)
|
||||||
|
|
||||||
|
async def is_dir(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.is_dir, cancellable=True)
|
||||||
|
|
||||||
|
async def is_fifo(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.is_fifo, cancellable=True)
|
||||||
|
|
||||||
|
async def is_file(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.is_file, cancellable=True)
|
||||||
|
|
||||||
|
async def is_mount(self) -> bool:
|
||||||
|
return await to_thread.run_sync(os.path.ismount, self._path, cancellable=True)
|
||||||
|
|
||||||
|
def is_reserved(self) -> bool:
|
||||||
|
return self._path.is_reserved()
|
||||||
|
|
||||||
|
async def is_socket(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.is_socket, cancellable=True)
|
||||||
|
|
||||||
|
async def is_symlink(self) -> bool:
|
||||||
|
return await to_thread.run_sync(self._path.is_symlink, cancellable=True)
|
||||||
|
|
||||||
|
def iterdir(self) -> AsyncIterator[Path]:
|
||||||
|
gen = self._path.iterdir()
|
||||||
|
return _PathIterator(gen)
|
||||||
|
|
||||||
|
def joinpath(self, *args: str | PathLike[str]) -> Path:
|
||||||
|
return Path(self._path.joinpath(*args))
|
||||||
|
|
||||||
|
async def lchmod(self, mode: int) -> None:
|
||||||
|
await to_thread.run_sync(self._path.lchmod, mode)
|
||||||
|
|
||||||
|
async def lstat(self) -> os.stat_result:
|
||||||
|
return await to_thread.run_sync(self._path.lstat, cancellable=True)
|
||||||
|
|
||||||
|
async def mkdir(
|
||||||
|
self, mode: int = 0o777, parents: bool = False, exist_ok: bool = False
|
||||||
|
) -> None:
|
||||||
|
await to_thread.run_sync(self._path.mkdir, mode, parents, exist_ok)
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def open(
|
||||||
|
self,
|
||||||
|
mode: OpenBinaryMode,
|
||||||
|
buffering: int = ...,
|
||||||
|
encoding: str | None = ...,
|
||||||
|
errors: str | None = ...,
|
||||||
|
newline: str | None = ...,
|
||||||
|
) -> AsyncFile[bytes]:
|
||||||
|
...
|
||||||
|
|
||||||
|
@overload
|
||||||
|
async def open(
|
||||||
|
self,
|
||||||
|
mode: OpenTextMode = ...,
|
||||||
|
buffering: int = ...,
|
||||||
|
encoding: str | None = ...,
|
||||||
|
errors: str | None = ...,
|
||||||
|
newline: str | None = ...,
|
||||||
|
) -> AsyncFile[str]:
|
||||||
|
...
|
||||||
|
|
||||||
|
async def open(
|
||||||
|
self,
|
||||||
|
mode: str = "r",
|
||||||
|
buffering: int = -1,
|
||||||
|
encoding: str | None = None,
|
||||||
|
errors: str | None = None,
|
||||||
|
newline: str | None = None,
|
||||||
|
) -> AsyncFile[Any]:
|
||||||
|
fp = await to_thread.run_sync(
|
||||||
|
self._path.open, mode, buffering, encoding, errors, newline
|
||||||
|
)
|
||||||
|
return AsyncFile(fp)
|
||||||
|
|
||||||
|
async def owner(self) -> str:
|
||||||
|
return await to_thread.run_sync(self._path.owner, cancellable=True)
|
||||||
|
|
||||||
|
async def read_bytes(self) -> bytes:
|
||||||
|
return await to_thread.run_sync(self._path.read_bytes)
|
||||||
|
|
||||||
|
async def read_text(
|
||||||
|
self, encoding: str | None = None, errors: str | None = None
|
||||||
|
) -> str:
|
||||||
|
return await to_thread.run_sync(self._path.read_text, encoding, errors)
|
||||||
|
|
||||||
|
def relative_to(self, *other: str | PathLike[str]) -> Path:
|
||||||
|
return Path(self._path.relative_to(*other))
|
||||||
|
|
||||||
|
async def readlink(self) -> Path:
|
||||||
|
target = await to_thread.run_sync(os.readlink, self._path)
|
||||||
|
return Path(cast(str, target))
|
||||||
|
|
||||||
|
async def rename(self, target: str | pathlib.PurePath | Path) -> Path:
|
||||||
|
if isinstance(target, Path):
|
||||||
|
target = target._path
|
||||||
|
|
||||||
|
await to_thread.run_sync(self._path.rename, target)
|
||||||
|
return Path(target)
|
||||||
|
|
||||||
|
async def replace(self, target: str | pathlib.PurePath | Path) -> Path:
|
||||||
|
if isinstance(target, Path):
|
||||||
|
target = target._path
|
||||||
|
|
||||||
|
await to_thread.run_sync(self._path.replace, target)
|
||||||
|
return Path(target)
|
||||||
|
|
||||||
|
async def resolve(self, strict: bool = False) -> Path:
|
||||||
|
func = partial(self._path.resolve, strict=strict)
|
||||||
|
return Path(await to_thread.run_sync(func, cancellable=True))
|
||||||
|
|
||||||
|
def rglob(self, pattern: str) -> AsyncIterator[Path]:
|
||||||
|
gen = self._path.rglob(pattern)
|
||||||
|
return _PathIterator(gen)
|
||||||
|
|
||||||
|
async def rmdir(self) -> None:
|
||||||
|
await to_thread.run_sync(self._path.rmdir)
|
||||||
|
|
||||||
|
async def samefile(
|
||||||
|
self, other_path: str | bytes | int | pathlib.Path | Path
|
||||||
|
) -> bool:
|
||||||
|
if isinstance(other_path, Path):
|
||||||
|
other_path = other_path._path
|
||||||
|
|
||||||
|
return await to_thread.run_sync(
|
||||||
|
self._path.samefile, other_path, cancellable=True
|
||||||
|
)
|
||||||
|
|
||||||
|
async def stat(self, *, follow_symlinks: bool = True) -> os.stat_result:
|
||||||
|
func = partial(os.stat, follow_symlinks=follow_symlinks)
|
||||||
|
return await to_thread.run_sync(func, self._path, cancellable=True)
|
||||||
|
|
||||||
|
async def symlink_to(
|
||||||
|
self,
|
||||||
|
target: str | pathlib.Path | Path,
|
||||||
|
target_is_directory: bool = False,
|
||||||
|
) -> None:
|
||||||
|
if isinstance(target, Path):
|
||||||
|
target = target._path
|
||||||
|
|
||||||
|
await to_thread.run_sync(self._path.symlink_to, target, target_is_directory)
|
||||||
|
|
||||||
|
async def touch(self, mode: int = 0o666, exist_ok: bool = True) -> None:
|
||||||
|
await to_thread.run_sync(self._path.touch, mode, exist_ok)
|
||||||
|
|
||||||
|
async def unlink(self, missing_ok: bool = False) -> None:
|
||||||
|
try:
|
||||||
|
await to_thread.run_sync(self._path.unlink)
|
||||||
|
except FileNotFoundError:
|
||||||
|
if not missing_ok:
|
||||||
|
raise
|
||||||
|
|
||||||
|
def with_name(self, name: str) -> Path:
|
||||||
|
return Path(self._path.with_name(name))
|
||||||
|
|
||||||
|
def with_stem(self, stem: str) -> Path:
|
||||||
|
return Path(self._path.with_name(stem + self._path.suffix))
|
||||||
|
|
||||||
|
def with_suffix(self, suffix: str) -> Path:
|
||||||
|
return Path(self._path.with_suffix(suffix))
|
||||||
|
|
||||||
|
async def write_bytes(self, data: bytes) -> int:
|
||||||
|
return await to_thread.run_sync(self._path.write_bytes, data)
|
||||||
|
|
||||||
|
async def write_text(
|
||||||
|
self,
|
||||||
|
data: str,
|
||||||
|
encoding: str | None = None,
|
||||||
|
errors: str | None = None,
|
||||||
|
newline: str | None = None,
|
||||||
|
) -> int:
|
||||||
|
# Path.write_text() does not support the "newline" parameter before Python 3.10
|
||||||
|
def sync_write_text() -> int:
|
||||||
|
with self._path.open(
|
||||||
|
"w", encoding=encoding, errors=errors, newline=newline
|
||||||
|
) as fp:
|
||||||
|
return fp.write(data)
|
||||||
|
|
||||||
|
return await to_thread.run_sync(sync_write_text)
|
||||||
|
|
||||||
|
|
||||||
|
PathLike.register(Path)
|
||||||
|
|
@@ -0,0 +1,18 @@
from __future__ import annotations

from ..abc import AsyncResource
from ._tasks import CancelScope


async def aclose_forcefully(resource: AsyncResource) -> None:
    """
    Close an asynchronous resource in a cancelled scope.

    Doing this closes the resource without waiting on anything.

    :param resource: the resource to close

    """
    with CancelScope() as scope:
        scope.cancel()
        await resource.aclose()
@@ -0,0 +1,26 @@
from __future__ import annotations

from typing import AsyncIterator

from ._compat import DeprecatedAsyncContextManager
from ._eventloop import get_asynclib


def open_signal_receiver(
    *signals: int,
) -> DeprecatedAsyncContextManager[AsyncIterator[int]]:
    """
    Start receiving operating system signals.

    :param signals: signals to receive (e.g. ``signal.SIGINT``)
    :return: an asynchronous context manager for an asynchronous iterator which yields signal
        numbers

    .. warning:: Windows does not support signals natively so it is best to avoid relying on this
        in cross-platform applications.

    .. warning:: On asyncio, this permanently replaces any previous signal handler for the given
        signals, as set via :meth:`~asyncio.loop.add_signal_handler`.

    """
    return get_asynclib().open_signal_receiver(*signals)
Some files were not shown because too many files have changed in this diff.