arrays - Import XML objects in batches


I'm working on a PowerShell script that deals with a large dataset. I have found that it runs until all available memory is consumed. Because of how large the dataset is, and what the script does, it has two arrays that become very large. The original array is around half a gig, and the final object is 6 or 7 gigs in memory. My idea is that it should work better if I'm able to release rows that are done and run the script in increments.

i able split imported xml using function i've found , tweaked, i'm not able change data contained in array.

This is the script I'm currently using to split the array into batches: https://gallery.technet.microsoft.com/scriptcenter/split-an-array-into-parts-4357dcc1
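Since the TechNet Gallery has been retired, that link may no longer resolve. Below is a minimal sketch of what such a Split-Array function typically looks like; the parameter names -inArray, -size, and -parts are taken from the call in the question, everything else is an assumption.

function Split-Array {
    param(
        [object[]]$inArray,   # the array to split
        [int]$parts,          # number of roughly equal chunks to produce
        [int]$size            # or: fixed number of items per chunk
    )
    # Derive the chunk size from the requested number of parts, if given.
    if ($parts) { $size = [Math]::Ceiling($inArray.Count / $parts) }
    $chunks = @()
    for ($i = 0; $i -lt $inArray.Count; $i += $size) {
        $end = [Math]::Min($i + $size - 1, $inArray.Count - 1)
        # The leading comma keeps each slice as a nested array element.
        $chunks += ,($inArray[$i..$end])
    }
    return $chunks
}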

And here is the code used to import and split the results:

# Import the object that should have been prepared beforehand by the query
# script. (queryforcombos.ps1)
$saveobj = "\\server\share$\me\global\scripts\resultant sets\latestquery.xml"
$result_table_import = Import-Clixml $saveobj
# Use -gt, not >, for the comparison; > is a redirection operator in
# PowerShell. Also test the imported array, which is what exists here.
if ($result_table_import.Count -gt 100000) {
    $result_tables = Split-Array -inArray $result_table_import -size 30000
} else {
    $result_tables = Split-Array -inArray $result_table_import -parts 6
}

And of course there is a processing script that uses the data and converts it as desired.
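For the incremental idea described above, the processing loop would work through the batches one at a time and drop each batch once it has been written out, so only the current chunk stays alive. This is only a hypothetical sketch; Convert-Row and the output file name are placeholder names, not anything from the original script.

$batchIndex = 0
foreach ($batch in $result_tables) {
    # Convert-Row is a placeholder for whatever per-row conversion the
    # real processing script performs.
    $processed = $batch | ForEach-Object { Convert-Row $_ }
    $processed | Export-Clixml ("batch_{0}.xml" -f $batchIndex)
    $batchIndex++
    # Release the finished batch and hint the runtime to reclaim it.
    $processed = $null
    [GC]::Collect()
}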

For large XML files, I don't think you want to read the whole thing into memory, as XmlDocument or Import-Clixml require. You should look at XmlTextReader as one way to process the XML file a bit at a time.
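As an illustration, here is a minimal sketch of that approach in PowerShell using the .NET XmlReader (the modern entry point to XmlTextReader). It assumes the file was produced by Export-Clixml, which wraps each exported object in a top-level <Obj> element; adjust the element name for other schemas.

$path = '\\server\share$\me\global\scripts\resultant sets\latestquery.xml'
$settings = New-Object System.Xml.XmlReaderSettings
$settings.IgnoreWhitespace = $true
$reader = [System.Xml.XmlReader]::Create($path, $settings)
try {
    while ($reader.Read()) {
        # Only react to the start tag of each top-level record.
        if ($reader.NodeType -eq [System.Xml.XmlNodeType]::Element -and
            $reader.Name -eq 'Obj' -and $reader.Depth -eq 1) {
            # ReadOuterXml returns this record as a string and advances
            # past it, so only one record is held in memory at a time.
            $recordXml = $reader.ReadOuterXml()
            # ... process $recordXml here, then let it go out of scope ...
        }
    }
}
finally {
    $reader.Close()
}

This keeps the peak footprint at roughly one record rather than the full 6-7 GB object graph, at the cost of parsing the Clixml structure yourself.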

