vllm: 'SamplingParams' object has no attribute 'update'

gmxoilav · posted 2 months ago · in Other

Hi, I ran into this error after updating vllm and transformers:

Traceback (most recent call last):
File "/user/work/ad20999/1lm-rephrase/evaluate_gpt_rephrase_vllm.py", line 576, in <module>
    main()
File "/user/work/ad20999/1lm-rephrase/evaluate_gpt_rephrase_vllm.py", line 480, in main
    outputs = model.generate(inputs, sampling_params)
File "/user/work/ad20999/anaconda3/envs/vltm/lib/python3.10/site-packages/awq/models/base.py", line 110, in generate
    return self.model.generate(*args, **kwargs)
File "/user/work/ad20999/anaconda3/envs/vltm/lib/python3.10/site-packages/torch/utils/contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
File "/user/work/ad20999/anaconda3/envs/vltm/lib/python3.10/site-packages/transformers/generation/utils.py", line 1349, in generate
    model_kwargs = generation_config.update(**kwargs) # All unused kwargs must be model kwargs
AttributeError: 'SamplingParams' object has no attribute 'update'

ws51t4hk1#

model.generate(inputs, sampling_params)

Judging by the traceback, model appears to be a huggingface model rather than an LLM object from vLLM, while sampling_params is likely the vLLM type, so the call falls through to transformers' generate(), which does not understand SamplingParams.
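The mismatch can be illustrated with a minimal, self-contained sketch. The stub classes below are hypothetical stand-ins (not the real transformers/vllm classes) that only mimic the one attribute that matters here: transformers' GenerationConfig has an update() method that generate() calls on its config argument, while vLLM's SamplingParams is a plain parameter object with no such method.

```python
class GenerationConfigStub:
    """Hypothetical stand-in for transformers' GenerationConfig: has update()."""
    def update(self, **kwargs):
        # transformers' generate() calls update() to absorb extra kwargs
        for key, value in kwargs.items():
            setattr(self, key, value)
        return kwargs


class SamplingParamsStub:
    """Hypothetical stand-in for vLLM's SamplingParams: plain fields, no update()."""
    def __init__(self, temperature=0.8, top_p=0.95):
        self.temperature = temperature
        self.top_p = top_p


def hf_style_generate(inputs, generation_config):
    """Sketch of the code path inside transformers' generate() that fails:
    it assumes its config argument supports update()."""
    return generation_config.update()


hf_style_generate("prompt", GenerationConfigStub())       # fine
try:
    hf_style_generate("prompt", SamplingParamsStub())     # the reported failure
except AttributeError as exc:
    print(exc)  # 'SamplingParamsStub' object has no attribute 'update'
```

So passing a vLLM SamplingParams into a huggingface-style generate() reproduces exactly this AttributeError; the fix is to call generate() on a vLLM LLM object instead.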


y4ekin9u2#

Hi Simon, thanks for the quick reply! I loaded a quantized model with model = LLM(model="TheBloke/Llama-2-7B-chat-AWQ", quantization="awq"), and the same code worked before the update.
