How do I add a new column when writing to a Delta table with Apache Spark?

Asked by bkhjykvo on 2023-05-23 in Apache

I am using delta-rs to write to a Delta table in Delta Lake. Here is my code:

import time
import numpy as np
import pandas as pd
import pyarrow as pa
from deltalake.writer import write_deltalake

num_rows = 10
timestamp = np.array([time.time() + i * 0.01 for i in range(num_rows)])
current = np.random.rand(num_rows) * 10
voltage = np.random.rand(num_rows) * 100
temperature = np.random.rand(num_rows) * 50
data = {
    "timestamp": timestamp,
    "current": current,
    "voltage": voltage,
    "temperature": temperature,
}
df = pd.DataFrame(data)
storage_options = {
    "AWS_DEFAULT_REGION": "us-west-2",
    "AWS_ACCESS_KEY_ID": "xxx",
    "AWS_SECRET_ACCESS_KEY": "xxx",
    "AWS_S3_ALLOW_UNSAFE_RENAME": "true",
}
schema = pa.schema(
    [
        ("timestamp", pa.float64()),
        ("current", pa.float64()),
        ("voltage", pa.float64()),
        ("temperature", pa.float64()),
    ]
)
write_deltalake(
    "s3a://my-bucket/delta-tables/motor",
    df,
    mode="append",
    schema=schema,
    storage_options=storage_options,
)

The code above successfully writes data with 4 columns to the Delta table, which I can confirm via Spark SQL:

spark-sql> describe table delta.`s3a://my-bucket/delta-tables/motor`;
23/05/22 06:38:51 WARN ObjectStore: Failed to get database delta, returning NoSuchObjectException
timestamp             double
current               double
voltage               double
temperature           double

# Partitioning
Not partitioned
Time taken: 0.39 seconds, Fetched 7 row(s)

spark-sql> select * from delta.`s3a://my-bucket/delta-tables/motor` limit 10;
23/05/22 07:01:50 WARN ObjectStore: Failed to get database delta, returning NoSuchObjectException
1.683746477029865E9 7.604250297497938 9.421758439102415 72.1927369069416
1.683746477039865E9 0.09092487512480374 17.989035574705202  35.350210012093214
1.683746477049866E9 7.493128659573002 9.390891728445448 48.541259705334625
1.683746477059866E9 2.717780962917138 0.9268887657049119  59.10566692023579
1.683746477069866E9 2.57300442470119  17.486083607683693  47.23521355609355
1.683746477079866E9 2.09432242350117  14.945888123248054  47.125030870747715
1.683746477089866E9 4.136491853926207 16.52334128991138 27.544656909406505
1.6837464770998669E9  1.1299759566741152  5.539831633892187 52.50892511866684
1.6837464771098669E9  0.9626607062002979  8.400536671329352 72.49131313291358
1.6837464771198668E9  7.6866231204656446  4.033915109232906 48.900631068812075
Time taken: 5.925 seconds, Fetched 10 row(s)

Now I try to write to the Delta table with a new column, pressure:

import time
import numpy as np
import pandas as pd
import pyarrow as pa
from deltalake.writer import write_deltalake

num_rows = 10
timestamp = np.array([time.time() + i * 0.01 for i in range(num_rows)])
current = np.random.rand(num_rows) * 10
voltage = np.random.rand(num_rows) * 100
temperature = np.random.rand(num_rows) * 50
pressure = np.random.rand(num_rows) * 1000
data = {
    "timestamp": timestamp,
    "current": current,
    "voltage": voltage,
    "temperature": temperature,
    "pressure": pressure,
}
df = pd.DataFrame(data)
storage_options = {
    "AWS_DEFAULT_REGION": "us-west-2",
    "AWS_ACCESS_KEY_ID": "xxx",
    "AWS_SECRET_ACCESS_KEY": "xxx",
    "AWS_S3_ALLOW_UNSAFE_RENAME": "true",
}
schema = pa.schema(
    [
        ("timestamp", pa.float64()),
        ("current", pa.float64()),
        ("voltage", pa.float64()),
        ("temperature", pa.float64()),
        ("pressure", pa.float64()), # <- I added this line
    ]
)
write_deltalake(
    "s3a://my-bucket/delta-tables/motor",
    df,
    mode="append",
    schema=schema,
    storage_options=storage_options,
    overwrite_schema=True, # <- with or without this, the same error is returned
)

Note that whether or not I pass overwrite_schema=True to write_deltalake makes no difference to the result.
It throws this error:

...

Traceback (most recent call last):
  File "python3.11/site-packages/deltalake/writer.py", line 180, in write_deltalake
    raise ValueError(
ValueError: Schema of data does not match table schema

Table schema:
timestamp: double
current: double
voltage: double
temperature: double
pressure: double

Data Schema:
timestamp: double
current: double
voltage: double
temperature: double

This error confuses me: my existing Delta table schema should have 4 columns and the new data I want to write has 5, yet the error message reports the opposite.
How do I add a new column to the Delta table? Thanks!

yhived7q1#

It looks like you need mode='overwrite' in order to use overwrite_schema=True (see the source code).
This does not seem to be well documented. If you want to add a column while appending, you first need to overwrite the existing data with the new column added, and only then run your append statements. A sketch of that workflow is shown below.
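A minimal sketch of that overwrite-then-append workflow might look like the following. This is not from the original answer; it reuses the storage_options and the 5-column df defined in the question, and assumes it is acceptable to backfill the new pressure column with NaN for the rows that already exist in the table:

import numpy as np
from deltalake import DeltaTable
from deltalake.writer import write_deltalake

table_uri = "s3a://my-bucket/delta-tables/motor"

# 1. Read the existing 4-column table back into pandas.
dt = DeltaTable(table_uri, storage_options=storage_options)
existing = dt.to_pandas()

# 2. Add the new column, backfilling old rows (here with NaN).
existing["pressure"] = np.nan

# 3. Overwrite the table so its schema now includes "pressure".
write_deltalake(
    table_uri,
    existing,
    mode="overwrite",
    overwrite_schema=True,
    storage_options=storage_options,
)

# 4. Afterwards, 5-column batches such as `df` above append normally.
write_deltalake(
    table_uri,
    df,
    mode="append",
    storage_options=storage_options,
)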
