I would like to know if there are limits when using --args-file. I.e., may I use a file with 1 million rows of /home locations? Thank you.
Could you please be more specific when you say "1 million rows of /home locations"? Are you pointing at the files themselves, or at a file containing 1 million rows of text?
Jasmin | Community Support Engineer | Sophos Support
I have about five hundred users. Instead of executing

savscan /home/*/public_html

I wrote a PHP script which creates a file (e.g. /tmp/myfile) listing the files modified in the last 24 hours under /home/*/public_html, and I want to check those files, with the -f and -archive options applied to each file in the list:

savscan --args-file=/tmp/myfile
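As a sketch, the workflow described above could look like the following (assumptions: savscan is the Sophos Anti-Virus for Linux command-line scanner, and the -f and -archive options behave as described in this thread; a temporary sandbox directory stands in for /home here so the snippet is self-contained and safe to run):

```shell
#!/bin/sh
# Sandbox stand-in for /home (assumption: in production you would use /home directly).
BASE=$(mktemp -d)
mkdir -p "$BASE/user1/public_html"
touch "$BASE/user1/public_html/index.html"

# Collect files modified in the last 24 hours into the args file.
find "$BASE"/*/public_html -type f -mtime -1 > /tmp/myfile

# Show how many files will be scanned.
wc -l < /tmp/myfile

# In production you would then run (requires Sophos Anti-Virus installed):
#   savscan -f -archive --args-file=/tmp/myfile
```

The find-based list replaces the PHP generator only for illustration; any process that writes one path per line to /tmp/myfile would work the same way.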
Since various users are using webmail or other dynamic scripts which constantly add new files, the generated file /tmp/myfile is huge.
I tested this approach with only a subset of these users, and I noticed that as the number of files listed in the args file grows, savscan execution also seems to get slower. Is that so? The larger the file, the slower savscan --args-file=myfile runs? It becomes very, very slow above 25,000 files, so I am considering using the Linux split command.
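Splitting the args file into chunks, as considered above, might look like this (a sketch: the 25,000-line threshold comes from this thread, the file names are the ones used in the post, and the savscan invocation is only echoed since it requires Sophos to be installed):

```shell
#!/bin/sh
# Stand-in for the real file list: 60,000 hypothetical paths, one per line.
seq 1 60000 | sed 's|^|/tmp/demo/file|' > /tmp/myfile

# Split into chunks of at most 25,000 lines each (removes any stale parts first).
rm -f /tmp/myfile.part.*
split -l 25000 /tmp/myfile /tmp/myfile.part.

# Scan each chunk separately; here we only print the command that would run.
for part in /tmp/myfile.part.*; do
    echo "would run: savscan -f -archive --args-file=$part"
done
```

With 60,000 input lines, split produces three parts (25,000 + 25,000 + 10,000), each small enough to stay under the slowdown threshold observed above.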
I have contacted our support specialist team to get an exact answer on this.
I'll update you once I hear back from them.
Here is the article for
Please search for --args-file= to see more details.