
I tried to run SQL with pyodbc but was not able to convert the results into a DataFrame. I got this error message:
ValueError: Shape of passed values is (47, 1), indices imply (47, 3).

For names, I got a list of strings: ['ID', 'Country', 'Date']

For rows, I got a list of tuples like [('A1', 'USA', datetime.date(2011, 12, 31)), ('A1', 'USA', datetime.date(2020, 4, 21)), ... ('A3', 'Canada', datetime.date(2001, 2, 11))]

The same query on the same table works fine if I use pandas.read_sql: I get 47 rows and 3 columns instead of 1 column. However, I need to use cursor.execute because I have to pass a local list into the SQL, and the actual list is much longer, containing hundreds of IDs. Any ideas on how to convert the results?

import pyodbc
import pandas as pd

conn = pyodbc.connect("DSN=dsn")
conn.setdecoding(pyodbc.SQL_CHAR, encoding="utf-8")
conn.setencoding(encoding="utf-8")
cursor = conn.cursor()

list_id = ["A1", "A2", "A3"]
placeholders = ",".join("?" * len(list_id))
sql = "select * from table where ID in (%s)" % placeholders
cursor.execute(sql, list_id)

names = [x[0] for x in cursor.description]
rows = cursor.fetchall()
cursor.close()
conn.close()

df = pd.DataFrame(rows, columns=names)
df

1 Answer

OK, I figured it out. You only need to reshape the rows.

import pyodbc
import numpy as np
import pandas as pd

conn = pyodbc.connect("DSN=dsn")
conn.setdecoding(pyodbc.SQL_CHAR, encoding="utf-8")
conn.setencoding(encoding="utf-8")
cursor = conn.cursor()

pid = ["P077856", "P079661"]
placeholders = ",".join("?" * len(pid))
sql = "select PROJ_ID, ROLE_CODE from PROJECT.PROJECT_TEAM_V2 where PROJ_ID in (%s)" % placeholders
cursor.execute(sql, pid)

names = [x[0] for x in cursor.description]
rows = cursor.fetchall()
cursor.close()
conn.close()

# fetchall() returns pyodbc Row objects, which pandas does not unpack
# into columns; converting through a NumPy array and reshaping to
# (rows, columns) fixes the shape mismatch without hardcoding sizes.
rows = np.array(rows).reshape(len(rows), len(names))
df = pd.DataFrame(rows, columns=names)
df
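An even simpler alternative, a sketch that avoids NumPy entirely: convert each pyodbc Row to a plain tuple before handing the list to the DataFrame constructor. Plain tuples stand in here for the Row objects that cursor.fetchall() would return; the column names are the ones from the question.

```python
import pandas as pd

# In real use, rows would come from cursor.fetchall(); pandas fails to
# unpack pyodbc Row objects, but it handles plain tuples fine.
rows = [("A1", "USA"), ("A2", "Canada"), ("A3", "USA")]
names = ["ID", "Country"]

# tuple(r) is a no-op on real tuples and converts pyodbc Rows,
# so this works in both cases and never hardcodes the shape.
df = pd.DataFrame([tuple(r) for r in rows], columns=names)
print(df.shape)  # (3, 2)
```

This also sidesteps the dtype coercion that np.array() applies (it would turn mixed columns into a single object array), so each column keeps its own type.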

