Autodesk ImageModeler 2009 SP1 Build


Q: Python multiprocessing Pool for iterating over large files

I'm trying to run the following code, where I have a list of files that I need to process, adding the results to a MySQL database. The files I'm looking at are often 1 GB or larger, and the code below takes hours (sometimes days) to complete. Am I going about this the right way? Are there faster options for this task? I've read that Celery is a good option.

```python
import glob
import multiprocessing
import time

import MySQLdb

def add_to_database(file_path, user_id):
    # Configure the MySQL client
    mysql_db = MySQLdb.connect(
        host='localhost',
        user='user',
        passwd='pass',
        db='db',
        charset="utf8"
    )
    cursor = mysql_db.cursor()

    # Create the table if it does not exist yet
    cursor.execute(
        "CREATE TABLE IF NOT EXISTS tasks ("
        "id int(11), task_id varchar(255), user_id int(11), "
        "entry_timestamp datetime, last_view timestamp)"
    )

    # Insert one record per file
    cursor.execute(
        "INSERT INTO tasks (task_id, user_id, entry_timestamp, last_view) "
        "VALUES (%s, %s, %s, %s)",
        (file_path, user_id, time.time(), time.time())
    )

    # Commit on the connection (MySQLdb cursors have no commit())
    mysql_db.commit()
    mysql_db.close()

user_id = 1  # example value
pool = multiprocessing.Pool()
files = glob.glob("/*.csv")
for f in files:
    pool.apply_async(add_to_database, (f, user_id))
pool.close()
pool.join()
```
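One common restructuring of the pattern in the question is to keep database access out of the workers entirely: the pool workers only parse files, and the parent collects their summaries and inserts them in a single `executemany` batch, so each worker no longer pays for its own connection and per-row commit. The sketch below is illustrative, not the asker's code: it uses `sqlite3` instead of MySQLdb so it is self-contained, and the `summarize` helper and `tasks` schema are simplified stand-ins.

```python
import csv
import multiprocessing
import sqlite3

def summarize(file_path):
    """Worker: parse one CSV and return (path, row_count). No DB access here."""
    with open(file_path, newline="") as f:
        return file_path, sum(1 for _ in csv.reader(f))

def load_all(file_paths, db_path):
    """Parent: fan parsing out to a Pool, then insert all results in one batch."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS tasks (file_path TEXT, row_count INTEGER)"
    )
    with multiprocessing.Pool() as pool:
        # imap_unordered streams results back as each worker finishes
        rows = list(pool.imap_unordered(summarize, file_paths))
    conn.executemany(
        "INSERT INTO tasks (file_path, row_count) VALUES (?, ?)", rows
    )
    conn.commit()
    conn.close()
    return len(rows)
```

The same shape works with MySQLdb by swapping the connection and using `cursor.executemany` with `%s` placeholders; the point is that the per-row connection setup and commit move out of the hot loop.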



