python-3.x How to get the response time and response size when using aiohttp

lndjwyie · posted 2023-01-03 · Python

Is it possible to get the response time and response size of every request made with aiohttp?
These attributes don't seem to be in the documentation.
Thanks!


62lalag41#

len(await response.text()) returns the size of the decompressed response. If you want the size of the original, compressed response, you need to set auto_decompress=False when creating the aiohttp.ClientSession. After that you can get it with len(await response.read()). But this makes response.text() unusable, because it needs the decompressed body. To make it usable again you have to decompress the body manually:

import asyncio
import time
import zlib

import aiohttp
import brotli

async def main():
    # auto_decompress=False keeps the raw (still compressed) body, so its
    # length is the number of bytes actually received.
    async with aiohttp.ClientSession(auto_decompress=False) as session:
        start = time.monotonic()
        response = await session.get(url='http://www.test.com')
        response_time = time.monotonic() - start
        response_size = len(await response.read())

        # Decompress the body manually so that response.text() works again.
        encoding = response.headers.get('Content-Encoding', '')
        if encoding == 'gzip':
            response._body = zlib.decompress(response._body, 16 + zlib.MAX_WBITS)
        elif encoding == 'deflate':
            response._body = zlib.decompress(response._body, -zlib.MAX_WBITS)
        elif encoding == 'br':
            response._body = brotli.decompress(response._body)

        response_text = await response.text()
        print(response_time, response_size, len(response_text))

asyncio.run(main())

On time.time(), from pymotw.com:
Because time.time() looks at the system clock, and the system clock can be changed by the user or by system services that synchronize clocks across multiple computers, repeated calls to time.time() may produce values that go forwards and backwards. This can result in unexpected behavior when trying to measure durations or otherwise use those times for computation. Avoid those situations by using time.monotonic(), which always returns values that go forward.
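As a quick, self-contained sketch of that advice (not part of the original answer; the sleep just stands in for an HTTP request):

import time

start = time.monotonic()
time.sleep(0.5)  # stand-in for awaiting a request
elapsed = time.monotonic() - start
# elapsed is immune to system clock adjustments, unlike a time.time() delta
print(f"elapsed: {elapsed:.3f}s")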
The aiohttp documentation suggests using loop.time() (which is also monotonic):

import asyncio

import aiohttp

async def on_request_start(session, trace_config_ctx, params):
    trace_config_ctx.start = asyncio.get_event_loop().time()

async def on_request_end(session, trace_config_ctx, params):
    elapsed = asyncio.get_event_loop().time() - trace_config_ctx.start
    print("Request took {}".format(elapsed))

async def main():
    trace_config = aiohttp.TraceConfig()
    trace_config.on_request_start.append(on_request_start)
    trace_config.on_request_end.append(on_request_end)
    async with aiohttp.ClientSession(trace_configs=[trace_config]) as client:
        await client.get('http://example.com/some/redirect/')

asyncio.run(main())
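If you also want the size in the same trace callbacks, here is a sketch of one way to do it: aiohttp passes the finished ClientResponse as params.response to on_request_end, so its Content-Length header (when the server sends one) can be reported alongside the elapsed time. The URL below is only a placeholder.

import asyncio

import aiohttp

async def on_request_start(session, trace_config_ctx, params):
    trace_config_ctx.start = asyncio.get_event_loop().time()

async def on_request_end(session, trace_config_ctx, params):
    elapsed = asyncio.get_event_loop().time() - trace_config_ctx.start
    # params.response is the ClientResponse; Content-Length may be missing
    # (e.g. for chunked responses), so fall back to "unknown".
    size = params.response.headers.get('Content-Length', 'unknown')
    print(f"{params.url} took {elapsed:.3f}s, Content-Length: {size}")

async def main():
    trace_config = aiohttp.TraceConfig()
    trace_config.on_request_start.append(on_request_start)
    trace_config.on_request_end.append(on_request_end)
    async with aiohttp.ClientSession(trace_configs=[trace_config]) as client:
        async with client.get('https://example.com/') as response:
            await response.read()

asyncio.run(main())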

k4ymrczo2#

One possibility is to:

  • measure a time point before the request
  • measure a time point after the request
  • take the difference between the two as the response time
  • get the response with response.text() and determine its length with len()

A small, self-contained example looks like this:

import time
import asyncio
from aiohttp import ClientSession

async def fetch(session, url):
    start = time.time()
    async with session.get(url) as response:
        result = await response.text()
        end = time.time()
        print(url, ": ", end - start, "response length:", len(result))
        return result

async def crawl(urls: set):
    async with ClientSession() as session:
        tasks = []
        for url in urls:
            tasks.append(
                fetch(session, url)
            )
        await asyncio.gather(*tasks)

if __name__ == "__main__":
    urlSet = {"https://www.software7.biz/tst/number.php",
              "https://www.software7.biz/tst/number1.php",
              "https://www.software7.biz"}
    asyncio.run(crawl(urlSet))

Test

The two endpoints number.php and number1.php have a server-side delay of 3 seconds and 1 second respectively, and each returns a two-digit number.
The output in the debug console then looks like this:

https://www.software7.biz :  0.16438698768615723 response length: 4431
https://www.software7.biz/tst/number1.php :  1.249755859375 response length: 2
https://www.software7.biz/tst/number.php :  3.214473009109497 response length: 2

7dl7o3gd3#

You can get the size of the response content from the headers:

response.headers['content-length']
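
Note that a server is not required to send Content-Length (for example with chunked transfer encoding), so the header can be missing. A minimal sketch with a fallback to measuring the body yourself might look like this (the URL is only a placeholder):

import asyncio

import aiohttp

async def main():
    async with aiohttp.ClientSession() as session:
        async with session.get('https://example.com/') as response:
            # Prefer the header if the server sent one...
            content_length = response.headers.get('Content-Length')
            if content_length is not None:
                size = int(content_length)
            else:
                # ...otherwise fall back to the length of the (decompressed) body.
                size = len(await response.read())
            print('response size:', size)

asyncio.run(main())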
