I’m not sure because I’m not a programmer, but I think not. Even memory (RAM) must be ‘unpacked’ before program files and memory blocks can be scanned. You can improve performance by excluding these files from scanning (if you are sure they are not infected) or by lowering the scanning sensitivity.
If there is enough free memory, we unpack the file directly into memory and scanning is faster. After scanning, we free that memory buffer (or delete the temp file).
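A minimal sketch of the strategy described above: unpack into RAM when the uncompressed data fits a memory budget, otherwise fall back to a temporary file that is deleted afterwards. The names (`scan_buffer`, `scan_unpacked`) and the 64 MB budget are illustrative assumptions, not avast's real API.

```python
import os
import tempfile

MEMORY_BUDGET = 64 * 1024 * 1024  # assumed limit for in-memory unpacking

def scan_buffer(data: bytes) -> bool:
    """Placeholder scanner: report clean unless a marker string appears."""
    return b"EICAR" not in data

def scan_unpacked(unpacked: bytes) -> bool:
    if len(unpacked) <= MEMORY_BUDGET:
        # Fast path: scan straight from the memory buffer,
        # which is freed automatically after use.
        return scan_buffer(unpacked)
    # Slow path: write to a temp file, scan it, then delete it.
    fd, path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(unpacked)
        with open(path, "rb") as f:
            return scan_buffer(f.read())
    finally:
        os.remove(path)
```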
Some archive formats (ACE, CAB, ARC, ZOO) have to be unpacked completely to disk before they can be scanned.
Sorry, NY Eve’s party: I understood your question differently from how you meant it. Each file in a given archive is unpacked, scanned, and deleted (the temp file is removed or the memory buffer freed), and then avast seeks to the next file in the archive (this works for all archives except the four formats listed above). The whole file has to be unpacked before it can be scanned; it is not possible to scan the file during unpacking. So avast can’t scan a file if, for example, you don’t have enough space to unpack a 300 MB setup.exe.
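The unpack-scan-delete cycle described above can be sketched like this, using a ZIP archive as an example (ZIP allows member-by-member access, unlike the four formats mentioned). `scan_buffer` is a hypothetical stand-in for the real engine:

```python
import io
import zipfile

def scan_buffer(data: bytes) -> bool:
    """Placeholder scanner: report clean unless a marker string appears."""
    return b"EICAR" not in data

def scan_archive(archive_bytes: bytes) -> dict:
    """Unpack one member at a time, scan it, free it, move to the next."""
    results = {}
    with zipfile.ZipFile(io.BytesIO(archive_bytes)) as zf:
        for name in zf.namelist():
            # Unpack ONE member completely, scan it, then let the
            # buffer be freed before seeking to the next member.
            data = zf.read(name)
            results[name] = scan_buffer(data)
    return results
```

Note that `zf.read(name)` still materializes each member in full before scanning, matching the "whole file has to be unpacked" constraint from the post.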
But back to business: does this mean you cannot scan files that will not fit into memory (or onto disk)?
I was hoping that the stream of data coming from a file on disk or in memory could be replaced by a stream of data coming from the unpacker, thus enabling scanning of archives containing large files on systems with small disks.
I am not talking about performance in this case, that’s not the problem.
No, it’s not possible. Our scanner works with the whole file: even in a 100 MB file it reads only a few KB from certain file positions (avast does not scan the whole file unless this is enabled in the avast options). And if a file is compressed twice with UPX, the second UPX decompression has to work with the whole static file anyway.
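A small sketch of why this rules out streaming: sampling a few KB at specific offsets requires random access (`seek`), which a one-way decompression stream cannot provide. The offsets and read size here are illustrative, not avast's actual values.

```python
import os

def sample_positions(path: str,
                     offsets=(0, 0x1000, 0x40000),
                     size=4096) -> list:
    """Read `size` bytes at each offset that lies inside the file."""
    samples = []
    file_size = os.path.getsize(path)
    with open(path, "rb") as f:
        for off in offsets:
            if off >= file_size:
                continue
            f.seek(off)  # random access: impossible on a pure stream
            samples.append(f.read(size))
    return samples
```

Because the engine may jump backwards as well as forwards between these positions, the fully unpacked file must exist somewhere (RAM or disk) before scanning starts.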