I have a 1 GB .txt file containing hundreds of thousands of lines of SQL INSERT commands that I am trying to execute against a SQL database.
Because the file is so big, processing it all in one go uses about 13 GB of memory and usually produces lots of errors.
To limit the memory, I am using Get-Content with -ReadCount 5000, but on occasions when I am doing the foreach I also need line 5001, because it is part of the same command.
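Here's roughly what I'm doing now (the file path and the actual execution step are placeholders):

```powershell
# Rough sketch of the current approach - path and execution step are placeholders
Get-Content -Path 'C:\data\inserts.txt' -ReadCount 5000 | ForEach-Object {
    foreach ($line in $_) {
        # build the insert command from the lines in this 5000-line batch
        # and run it against the database; a command that spans the
        # batch boundary gets cut in half here
    }
}
```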
I know that each command will always start with "insert into", can span 3 or 4 lines, and will always end with ");".
What is the best way to read the file, limit the memory used, and make sure I don't miss any commands?
Here is an example of what is happening:
line no 4999: insert into table(1,2,3,4)
line no 5000: values
line no 5001: (6,7,8,9);   (this then ends up as line 1 in the next foreach)
When the command is built, lines 4999 and 5000 only form half of it, i.e. insert into table(1,2,3,4) values
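I was thinking something along these lines might work: keep a buffer that carries the unfinished command over to the next batch and only execute it once the terminating ");" is seen (the file path and the database call are placeholders):

```powershell
# Idea: stream in 5000-line batches to keep memory low, but carry a buffer
# across batches so a command split over the boundary isn't lost.
$buffer = New-Object System.Text.StringBuilder

Get-Content -Path 'C:\data\inserts.txt' -ReadCount 5000 | ForEach-Object {
    foreach ($line in $_) {
        [void]$buffer.AppendLine($line)

        # a command is complete once the closing ");" is reached
        if ($line.TrimEnd().EndsWith(');')) {
            $cmd = $buffer.ToString()
            # execute $cmd against the database here
            # (e.g. Invoke-Sqlcmd or a SqlCommand - placeholder)
            [void]$buffer.Clear()
        }
    }
}
```

Is this a reasonable approach, or is there a better way to handle the boundary?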