I have a very simple Python script that reads stock tickers from a list (6k+ long) and pulls some data to flag unusual trading volume for the day.
If I just run a loop over every line in the ticker file, it takes hours to finish.
Based on some googling I found a rough example of multiprocessing and decided to try implementing it.
When I run the script it is much faster, but it also causes some strange issues I can't figure out. Sometimes I get Redis circuit-breaker errors, and sometimes it just stops and hangs near the end of the ticker file.
Any ideas?
import yfinance as yf
import multiprocessing
import time
import logging

file = open("C:\\Users\\miner\\Desktop\\unusual.txt", 'w')


def main():
    read_ticker_file()


def read_ticker_file():
    file1 = open("C:\\Users\\miner\\Desktop\\tickers.txt", 'r')
    lines = file1.readlines()
    count = 0
    ticker_arr = []
    for line in lines:
        count += 1
        line = line.strip('\n')
        line = line.strip()
        ticker_arr.append(line)
    return ticker_arr


def get_historical_data(symbol):
    yahoo_ticker = yf.Ticker(symbol)
    historical = yf.download(symbol, period="max", interval="1d")
    average_volume_arr = historical['Volume']
    try:
        current_volume = yahoo_ticker.info['volume']
        sum_volume = 0
        for volume in average_volume_arr:
            sum_volume += volume
        average_volume = sum_volume / len(average_volume_arr)
        if current_volume > average_volume:
            volume_over_average = (current_volume - average_volume) / average_volume
            volume_over_average = "{:.2%}".format(volume_over_average)
            unusual_volume = (symbol + " - " + str(volume_over_average))
            print(unusual_volume)
            write_to_file(unusual_volume)
    except Exception as e:
        print(e)


def write_to_file(data):
    file.writelines(data + "\n")


if __name__ == '__main__':
    # start = time.time()
    inputs = read_ticker_file()
    pool = multiprocessing.Pool(processes=20)
    pool.map(get_historical_data, inputs)
    pool.close()
    pool.join()
    # end = time.time()
    # print(start - end)
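In case it helps to isolate the multiprocessing part from the yfinance calls, here is a stripped-down sketch of the same structure I'm using, with the Yahoo lookup replaced by a dummy result (the paths and the pool size of 20 match the full script above; check_symbol is just a placeholder name):

import multiprocessing

# Same pattern as the full script: the output file is opened at module level
# and every worker process writes to it directly.
out_file = open("C:\\Users\\miner\\Desktop\\unusual.txt", 'w')


def check_symbol(symbol):
    # Stand-in for the yfinance volume check done in get_historical_data.
    result = symbol + " - 0.00%"
    out_file.writelines(result + "\n")


if __name__ == '__main__':
    with open("C:\\Users\\miner\\Desktop\\tickers.txt", 'r') as f:
        symbols = [line.strip() for line in f]
    pool = multiprocessing.Pool(processes=20)
    pool.map(check_symbol, symbols)
    pool.close()
    pool.join()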