I am using Microsoft SQL Server and SQLAlchemy.
I have a callback in a Dash app that connects to the SQL database, but it never closes the SPID connections I see in Activity Monitor in SQL Server, so connections just accumulate. Locally it works as expected: I see the connection open and then close a few seconds later. In production, deployed on Azure App Services, it does not. Any idea why?
import pandas as pd
import pyodbc
from sqlalchemy import create_engine

@app.callback(ServersideOutput("store", "data"),
              Input(component_id='my_interval', component_property='n_intervals'))
def query(n_interval):
    try:
        if n_interval > 0:
            SERVER = 'DEVSQL'
            DATABASE = 'mydb'
            USERNAME = 'user'
            PASSWORD = 'password'
            DATA_CONNECTION = f"mssql+pyodbc://{USERNAME}:{PASSWORD}@{SERVER}/{DATABASE}?driver=ODBC+Driver+17+for+SQL+Server"
            engine = create_engine(DATA_CONNECTION)
            pyodbc.pooling = False
            with engine.connect() as connection:
                chosen_date_instance = pd.read_sql_query('''
                    SELECT *
                    FROM inspector.AUDIT_CHECK_INSTANCE
                    ORDER BY RUN_DT DESC
                ''', connection)
                chosen_date_instance.to_csv('././datasets/audit_check_instance_data.csv')
                audit_item_instance = pd.read_sql_query('''
                    SELECT *
                    FROM inspector.audit_item
                ''', connection)
                audit_item_instance.to_csv('././datasets/audit_item_instance.csv')
                audit_checklet_instance = pd.read_sql_query('''
                    SELECT *
                    FROM inspector.audit_checklet_instance
                ''', connection)
                audit_checklet_instance.to_csv('././datasets/audit_checklet_instance.csv')
                connection.close()
            engine.dispose()
            stored_df = chosen_date_instance.copy()
            return stored_df.to_dict('records')
    finally:
        pass
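One pattern worth trying: the callback above builds a brand-new engine on every interval tick, and each engine carries its own connection pool, so `engine.dispose()` has to run on every path or the pooled DBAPI connections (the SPIDs) linger. Setting `pyodbc.pooling = False` inside the callback is also too late if any connection was already opened elsewhere in the process. A common alternative is a single module-level engine, optionally with SQLAlchemy's `NullPool` so every checked-in connection is physically closed. This is a minimal sketch of that pattern, not the asker's code; it uses an in-memory SQLite URL purely so it runs standalone, and you would substitute the `mssql+pyodbc` URL from the question:

    # Sketch: one engine for the whole process, created at import time.
    # NullPool means "returning" a connection to the pool actually closes it,
    # so no idle SPIDs should survive between callback invocations.
    from sqlalchemy import create_engine, text
    from sqlalchemy.pool import NullPool

    # Assumption: SQLite stands in for the mssql+pyodbc URL in the question.
    engine = create_engine("sqlite://", poolclass=NullPool)

    def query_once():
        # The context manager returns the connection on exit; with NullPool,
        # that return closes the underlying DBAPI connection immediately.
        with engine.connect() as connection:
            return connection.execute(text("SELECT 1")).scalar()

The callback would then use the shared `engine` instead of calling `create_engine` per invocation, and `engine.dispose()` is no longer needed on every request.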
5 posts - 2 participants






