Importing a JSON data file into PostgreSQL using Python and Psycopg2


I'm having trouble getting this query to work. I have a JSON file with 80k lines of data. Since I have been having so many problems, I cut the document down to 3 lines to see if I can get the data in before attempting the full 80k lines:

import psycopg2
import io

readtest1 = io.open(r"c:\users\samuel\dropbox\work\python and postgres\test1.json", encoding="utf-8")
readall = readtest1.readlines()

I have seen online that using readlines is not the best method, but it is the only method I know. This method does read the 3 lines in the file. I'm also not sure whether it was supposed to make an array.
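For comparison, here is a minimal sketch of reading the same file by iterating over the file object instead of calling readlines() (the path and the one-JSON-value-per-line layout are assumed from the description above):

import io

# Iterate over the file object instead of calling readlines(); each line is
# read in turn and strip() drops the trailing "\n" that readlines() keeps.
path = r"c:\users\samuel\dropbox\work\python and postgres\test1.json"  # path taken from the question
with io.open(path, encoding="utf-8") as f:
    readall = [line.strip() for line in f if line.strip()]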

conn = psycopg2.connect("dbname = python_trial user = postgres")
cur = conn.cursor()
cur.execute("create table test4 (data json);")

With the table created to hold JSON data, I try to insert the lines:

cur.executemany("insert test4 values (%s)", readall) 

The error:

Traceback (most recent call last):
  File "<pyshell#13>", line 1, in <module>
    cur.executemany("insert into test4 values (%s)", readall)
TypeError: not all arguments converted during string formatting

I'm not sure what I'm doing incorrectly. I'm seeing "\n" when I print(readall). I think that is caused by using the readlines method, and I'm not sure whether it is messing up the query as well.
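One quick way to check whether the trailing "\n" matters (a small sketch, not part of the original code): parse each line with Python's json module before sending it to PostgreSQL. json.loads ignores surrounding whitespace, so anything that fails here is a problem with the data itself rather than with readlines().

import json

# Sanity check: every element of readall should be a complete JSON value.
# json.loads tolerates the trailing "\n", so an error here points to bad data.
for lineno, line in enumerate(readall, start=1):
    try:
        json.loads(line)
    except ValueError as exc:
        print("line", lineno, "is not valid JSON:", exc)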

The parameter list you pass to executemany() has to contain one sequence of values per row, so either format the value into the statement yourself (only safe here because you control the file; see the warning in the docs below):

for line in readall:
    cur.execute("insert into test4 values ('{0}')".format(line))

or, better, keep the %s placeholder and wrap each line in a one-element tuple:

cur.executemany("insert into test4 values (%s)", [(line,) for line in readall])

See: http://initd.org/psycopg/docs/usage.html#passing-parameters-to-sql-queries
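Putting it together, an end-to-end sketch under the same assumptions (one JSON value per line, database python_trial, table test4). Note the conn.commit() at the end; without it the inserted rows are never made permanent.

import io
import psycopg2

# Read the file, strip trailing newlines, insert each line into the json
# column, and commit the transaction.
path = r"c:\users\samuel\dropbox\work\python and postgres\test1.json"  # path taken from the question
with io.open(path, encoding="utf-8") as f:
    rows = [(line.strip(),) for line in f if line.strip()]

conn = psycopg2.connect("dbname=python_trial user=postgres")
cur = conn.cursor()
cur.execute("create table if not exists test4 (data json);")  # "if not exists" added so the sketch can be re-run
cur.executemany("insert into test4 values (%s)", rows)
conn.commit()   # psycopg2 does not autocommit; without this the rows are rolled back
cur.close()
conn.close()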