Optimizing PowerShell script to find old files and delete them on DFS replicated folders


Here's the story: I have a file share replicated via DFS between two servers located in different parts of the world. DFS does not replicate a file just because it has been viewed (so LastAccessTime is not kept in sync between the replicas), and I don't want to delete a file or folder that has been used within the time period I have set (7 days). To make sure I don't remove files that are still in use, I have to check the LastAccessTime in both locations.

Currently I have this:

Set-ExecutionPolicy RemoteSigned
$limit = (Get-Date).AddDays(-7)
$pathOne = "firstpath"
$pathTwo = "secondpath"
$toBeDeletedPathOne = Get-ChildItem -Path $pathOne -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit }
$toBeDeletedPathTwo = Get-ChildItem -Path $pathTwo -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit }
$diffObjects = Compare-Object -ReferenceObject $toBeDeletedPathOne -DifferenceObject $toBeDeletedPathTwo -IncludeEqual
$toBeDeletedOverall = $diffObjects | Where-Object { $_.SideIndicator -eq "==" }

After this, I loop through the results and delete the files that are marked for deletion from both locations.
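A minimal sketch of that loop, assuming the $toBeDeletedOverall, $pathOne and $pathTwo variables from the script above and that both shares have the same folder structure, could look like this (run with -WhatIf first to verify what would be removed):

foreach ($entry in $toBeDeletedOverall) {
    $file = $entry.InputObject                                        # Compare-Object wraps the original FileInfo object
    $relative = $file.FullName.Substring($pathOne.Length).TrimStart('\')
    Remove-Item -Path $file.FullName -Force -WhatIf                   # delete from the first location
    Remove-Item -Path (Join-Path $pathTwo $relative) -Force -WhatIf   # delete the matching file from the second location
}

Drop -WhatIf once the output looks right.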

Part of the problem is that there is a tremendous amount of files, so this can take a long time, and I wanted to make it better/faster. The idea I have is for the script to kick off a separate scan script on each file server and wait for them to return their output. That way each scan runs on the local machine, which is much faster than scanning remotely, and both locations are scanned simultaneously; a rough sketch of the idea is below.
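For example, with PowerShell remoting (only a sketch, and it assumes WinRM is enabled; the server names FS01/FS02 and the local path D:\Share are placeholders, not values from my environment), both scans can be started at the same time and collected afterwards:

$limit = (Get-Date).AddDays(-7)
$scan = {
    param($root, $limit)
    # runs locally on each file server and returns the full paths of old files
    Get-ChildItem -Path $root -Recurse -Force |
        Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $limit } |
        Select-Object -ExpandProperty FullName
}
$jobOne = Invoke-Command -ComputerName FS01 -ScriptBlock $scan -ArgumentList 'D:\Share', $limit -AsJob
$jobTwo = Invoke-Command -ComputerName FS02 -ScriptBlock $scan -ArgumentList 'D:\Share', $limit -AsJob
# both scans run at the same time; wait for them and collect the results
$oldOnOne = Receive-Job -Job $jobOne -Wait
$oldOnTwo = Receive-Job -Job $jobTwo -Wait

The two path lists can then be compared the same way as before to find the files that are old in both replicas.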

The other part of the problem is that I have no idea how to do this. I will continue to work on it, and if I solve it, I will post the solution here in case someone finds it useful in the future.

You can run it locally. Copy the script to the machines you want (or make the script copy it for you if need be) and use PsTools to kick it off on the local machines. That should run the script simultaneously on both machines, along the lines of the sketch below.
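A rough sketch of what that could look like with PsExec from the PsTools suite (the server names and the script path are placeholders, and the scan script itself would have to write its results somewhere the controlling machine can read them):

psexec \\FS01 -d powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Scan-OldFiles.ps1
psexec \\FS02 -d powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Scan-OldFiles.ps1

The -d switch tells PsExec not to wait for the process to finish, so both servers start scanning at the same time; the controlling script can then wait for the output files and do the comparison once both are done.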

