SQL Server sp_executesql not working from Databricks - pyodbc

noj0wjuj · asked on 2023-03-07

Below is my sample Databricks code, where ACTUAL_TABLE and TEST_TABLE are SQL Server tables.

I am trying to insert into ACTUAL_TABLE from TEST_TABLE in Databricks using a pyodbc connection.

mssql_db.sql is my connection helper, and the connection itself works as expected. The code executes without any error, but no data is inserted into the table. It looks like sp_executesql is not working from Databricks.

V_SQL = 'INSERT INTO dbo.ACTUAL_TABLE(NAME,GENDER) SELECT ZNAME,GENDER from dbo.TEST_TABLE'
mssql_db.sql(f"""begin
    declare @V_SQL NVARCHAR(MAX);
    begin
        set @V_SQL = N'{V_SQL}';
        EXECUTE sp_executesql @V_SQL;
    end
end""")

Is there any other way to fulfil this requirement? I want to do the insert from Databricks only.

I also tried creating a stored procedure in SQL Server and calling that procedure from Databricks, but even that didn't work.
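For concreteness, the procedure was created on the SQL Server side (the name dbo.usp_LoadActualTable below is a placeholder, not the real object name; its body just wraps the same INSERT ... SELECT), and the call from Databricks looked roughly like this:

# Placeholder procedure name; the procedure body wraps the same INSERT ... SELECT.
# It was created on the SQL Server side as something like:
#   CREATE PROCEDURE dbo.usp_LoadActualTable AS
#       INSERT INTO dbo.ACTUAL_TABLE (NAME, GENDER)
#       SELECT ZNAME, GENDER FROM dbo.TEST_TABLE;
#
# Calling it from Databricks also completed without error but inserted no rows.
mssql_db.sql("EXEC dbo.usp_LoadActualTable")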


5ssjco0h1#

Follow the approach below; I reproduced the same scenario in my environment as per your requirement.

This is my demo table, ACTUAL_TABLE1.

Now you can check that Azure Databricks is connected to the SQL Server.
Code:

import pyodbc

# Connection details for the Azure SQL database (credentials are placeholders).
server = 'dem123.database.windows.net'
database = 'pool123'
username = 'xxxx'
password = 'xxxx'

conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=' + server + ';DATABASE=' + database + ';UID=' + username + ';PWD=' + password)

cursor = conn.cursor()

# Plain INSERT ... SELECT executed directly through the pyodbc cursor.
sql = 'INSERT INTO dbo.ACTUAL_TABLE1 (NAME, GENDER) SELECT ZNAME, GENDER FROM dbo.TEST_TABLE'
cursor.execute(sql)

# Commit explicitly; pyodbc connections have autocommit off by default.
conn.commit()
conn.close()

The above code executed successfully and inserted the data into ACTUAL_TABLE1.
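If you specifically want to keep the sp_executesql wrapper from your question, the same T-SQL batch can also be submitted through pyodbc. A minimal sketch, reusing the same placeholder connection details (the explicit commit is the important part):

import pyodbc

# Re-open the connection with the same placeholder credentials as above.
conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=dem123.database.windows.net;DATABASE=pool123;UID=xxxx;PWD=xxxx')
cursor = conn.cursor()

# Submit the same dynamic-SQL batch from the question, just through pyodbc.
cursor.execute("""
DECLARE @V_SQL NVARCHAR(MAX);
SET @V_SQL = N'INSERT INTO dbo.ACTUAL_TABLE1 (NAME, GENDER) SELECT ZNAME, GENDER FROM dbo.TEST_TABLE';
EXECUTE sp_executesql @V_SQL;
""")

# Without an explicit commit the insert is rolled back when the connection
# closes, which would look like "runs without error but inserts nothing".
conn.commit()
conn.close()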
