Hi,
I’m responsible for Avast on the servers of a large French company, and we have some issues with it. I hope you can help me fix some strange behaviors, or at least explain them.
The main problem concerns compressed files. Some of our compressed backups are huge (hundreds of gigabytes) and some are very small. We would like to tell Avast not to waste time analyzing the huge ones, so we hoped MAX_FILE_SIZE_TO_EXTRACT_MB would do the job, but it doesn’t…
For example, with MAX_FILE_SIZE_TO_EXTRACT_MB = 10 (for the sake of the example), if I run scan Ubuntu20.04.iso, the whole ISO is still analyzed and the variable is only applied to the files inside it. I get lines like this one:
Ubuntu20.04.iso |>pool/extra/packages/pool/main/s/systemd/systemd_245.4-4ubuntu3.19_amd64.deb|>data.tar.xz|>data.tar 42057: Compressed file is too big to be processed
whereas I expected:
Ubuntu20.04.iso 42057: Compressed file is too big to be processed
Please explain more about the usage of MAX_FILE_SIZE_TO_EXTRACT_MB, and advise how to tell Avast to ignore compressed files that are too big (let’s say > 3 GB).
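For completeness, here is roughly how we set the option and run the test. This is just how things look on our machines; the config path /etc/avast/avast.conf, the KEY=VALUE line, and the service name reflect our setup and may differ on yours:

# /etc/avast/avast.conf (excerpt) -- cap extraction of embedded files at 10 MB
MAX_FILE_SIZE_TO_EXTRACT_MB=10

# restart the scanner service so the option is picked up, then scan
sudo systemctl restart avast
scan Ubuntu20.04.iso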
My secondary question is about avast-fss. Is there a way to analyze files “on the fly”? With fss properly started, I can still run cp /media/virus.exe /home/user without any difficulty.
Is there a way to make it analyze each file created on the filesystem?
Regarding compressed files: as you correctly stated, MAX_FILE_SIZE_TO_EXTRACT_MB applies to the files inside archives, so it stops the scanner from extracting big files found inside compressed archives, but it won’t stop the scanning of many small files in a big archive. It is meant to defend against out-of-memory conditions and decompression bombs, which could crash the scanner. A big archive containing many small files is not a problem in that sense: it can be scanned safely, even though it can take some time.
We currently don’t have any configuration option for excluding big archives completely. Could you maybe pre-filter the files before passing them to the scanner? If all the files are backups and you just want to skip the big ones, you can make that decision before sending them to scan, right?
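A minimal sketch of such a pre-filter, assuming the scan CLI is on the PATH; the backup directory and the 3 GB cut-off are example values:

#!/bin/sh
# Scan only the backups smaller than 3 GB; list the skipped ones.
BACKUP_DIR=/srv/backups            # example path, adjust to your setup
LIMIT_KB=$((3 * 1024 * 1024))      # 3 GB cut-off, expressed in KB for find

find "$BACKUP_DIR" -type f -size -"${LIMIT_KB}k" -print0 | xargs -0 -r scan
find "$BACKUP_DIR" -type f -size +"${LIMIT_KB}k" \
    -printf '%p: skipped, larger than 3 GB\n'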
Regarding avast-fss: yes, it is quite limited. A scan is triggered only by the FAN_CLOSE_WRITE fanotify event, and it does not block access to files, so it does not work as user protection. It could surely be made much better. May I ask how you use it? So far there hasn’t seemed to be much reason to invest developer time into FSS; we are more focused on direct scanning of files via REST etc.
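If you need close-after-write scanning in the meantime, you can approximate it in user space. A minimal sketch with inotifywait (from the inotify-tools package, independent of Avast), assuming the scan CLI and /home as the watched tree:

#!/bin/sh
# React to every file that is closed after being written, roughly the
# same trigger (FAN_CLOSE_WRITE) that avast-fss listens for, and scan it.
WATCH_DIR=/home                    # example path, adjust to your setup

inotifywait -m -r -e close_write --format '%w%f' "$WATCH_DIR" |
while IFS= read -r file; do
    scan "$file"
done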
Indeed, there are always ways to ignore files, but we count on Avast to ensure the files have not been altered by a malicious user (our users have the right to update the files, so SHA checksums are useless).
However, concerning MAX_FILE_SIZE_TO_EXTRACT_MB, we still have issues with “Russian dolls”, i.e. an ISO containing tar files containing RPM files, etc. When these nested files are quite big, Avast sometimes uses a lot of RAM and even swap, despite an appropriate MAX_FILE_SIZE_TO_EXTRACT_MB value.
It would be great to have an option forbidding Avast to use more than a specified amount of RAM (if the limit is reached, report the root file as an error), and also a per-root-file timeout with the same error behavior.
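In the meantime we are experimenting with capping the scanner from the outside. A sketch of the idea, assuming systemd with cgroup v2 and the coreutils timeout command; the 4 GB cap and the 30-minute timeout are example values:

#!/bin/sh
# Run one scan under an external memory cap and wall-clock timeout;
# if either limit is hit, report the root file as a scan error.
FILE=$1
systemd-run --scope --quiet -p MemoryMax=4G -p MemorySwapMax=0 \
    timeout 30m scan "$FILE"
rc=$?
# 124 = killed by timeout, 137 usually means OOM-killed (SIGKILL)
if [ "$rc" -ne 0 ]; then
    echo "$FILE: scan failed or was aborted (rc=$rc)" >&2
fi
exit "$rc"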
OK for avast-fss; too bad, but OK. We’ll create something ourselves to trigger Avast on files created or altered under certain folders and by certain UIDs/GIDs; not a big deal. But the problem with big files described above is a real one for us, and I hope you’ll help on this matter.