multithreading - uploading records from a list of files in parallel to a DB using Python -


I have a list of files. Each file contains records, one per line (\n). I have to read those records and upload them to a SQL Server database in parallel. What is the best way to do this with Python?
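Something along these lines is what I have in mind; a minimal sketch, assuming pyodbc as the driver and a single-column records table (the connection string, file names, and table are placeholders):

```python
import concurrent.futures

import pyodbc  # assumption: pyodbc with a SQL Server ODBC driver installed

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
)
FILES = ["data1.txt", "data2.txt", "data3.txt"]  # hypothetical input files

def upload_file(path):
    # One connection per worker: pyodbc connections should not be
    # shared between threads.
    conn = pyodbc.connect(CONN_STR)
    try:
        cur = conn.cursor()
        cur.fast_executemany = True  # send the INSERTs in batches
        with open(path) as f:
            rows = [(line.rstrip("\n"),) for line in f]
        cur.executemany("INSERT INTO records (value) VALUES (?)", rows)
        conn.commit()
    finally:
        conn.close()
    return path

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    for done in pool.map(upload_file, FILES):
        print("uploaded", done)
```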

The best way may not be to upload in parallel at all, but to use SQL Server's bulk import mechanisms instead - for example BULK INSERT or the bcp utility.
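A minimal sketch of driving BULK INSERT from Python, assuming pyodbc; BULK INSERT reads the file on the server side, so the path must be visible to the SQL Server process (the connection string, table, and file path are placeholders):

```python
import pyodbc  # assumption: pyodbc with a SQL Server ODBC driver installed

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
)

# The file path is resolved by the server, not the client, so use a
# path on the server's disk or a UNC share the server can reach.
sql = r"""
BULK INSERT records
FROM '\\fileserver\share\data1.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);
"""

conn = pyodbc.connect(CONN_STR, autocommit=True)
try:
    conn.cursor().execute(sql)
finally:
    conn.close()
```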

Edit:

One approach I have used frequently is: 1) bulk load the data into a staging table, 2) process the data on the database, 3) insert into the main table.

Steps 2 and 3 can be combined if the processing is of a suitable kind, as in the sketch below.
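A sketch of the whole staging-table flow, again assuming pyodbc; the staging_records and main_records tables and the cleanup expression are hypothetical:

```python
import pyodbc  # assumption: same pyodbc setup as above

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
)

conn = pyodbc.connect(CONN_STR, autocommit=True)
try:
    cur = conn.cursor()

    # 1) bulk load the raw lines into the staging table
    cur.execute(r"""
        BULK INSERT staging_records
        FROM '\\fileserver\share\data1.txt'
        WITH (ROWTERMINATOR = '\n', TABLOCK);
    """)

    # 2) and 3) combined: transform and insert into the main table in
    # one set-based statement, then clear the staging table
    cur.execute("""
        INSERT INTO main_records (value)
        SELECT UPPER(LTRIM(RTRIM(raw_line)))  -- hypothetical cleanup
        FROM staging_records;
    """)
    cur.execute("TRUNCATE TABLE staging_records;")
finally:
    conn.close()
```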

This tends to be faster because there are fewer round trips to the server, and the database processes the data set-based rather than row by row.

On top of that, I believe SQL Server will use more than one CPU for this processing, so you get your parallel processing for free.

